  What I Learned About Technology From My Dog

    Originally published on the Kicker Studio blog. 

    Dogs understand, communicate, and serve like any good robot should. By observing my dog, I came to appreciate how to design interfaces that can properly communicate with people. Dogs are man's best friend, right? That's because they listen to us and respond appropriately, especially when we're feeling stressed, down, or otherwise screwed up… hell, they affirm our very existence with their penetrating eye contact and tail wagging. Here's a bit about my dog.

    • Expresses empathy: He has eyebrows. This is important. He is able to mimic my mood and let me feel as if he understands my emotional state. When I'm happy and excited, he jumps around, happy and excited. When I'm sad, his face mimics mine and he tries his best to cuddle up next to me and give me a bunch of dog kisses. When I'm mad at him, he shows he's sorry and listens very closely to my instructions. When I'm scared, he jumps to attention and barks at anything that moves. He understands my emotions and responds to me with empathy.

    • Pays attention to context: When I start getting dressed, my dog gets upset. He knows this means I'm heading to work, and right away he starts pleading his case to come along. He reads my steps toward leaving the house as an indication of my intentions, and works to let me know he understands what's happening. However, if I do these same steps individually, or at other times of day, my dog does not run around the house dramatically letting me know he's extremely interested in accompanying me somewhere. He understands the context of my actions, and emphatically (hysterically?) communicates possible functions in that context. If I'm outside and something scares me, he stays by my side and barks like crazy, but he doesn't do that every time we go outside. He infers what his behavior should be from what is happening around him… around us.

    • Reads my gestures even better than my words: And he does best of all when I pair voice with gesture. I can communicate with my dog without making a sound, using simple gestures alone. The more I repeat them, the more he learns what I mean when I make them. He lets me know he understands by following my gestures with his eyes and then carrying out my request.

    • Productive feedback: Just as important, if he doesn't understand, he communicates that too, by tilting his head in a questioning manner. Signaling when he doesn't understand matters as much as signaling when he does; otherwise, it's just frustrating. He's honest. He doesn't try to bullshit me by pretending he understands while he's really just running on some irrelevant script, no way! He's the real deal. He feels me.

    So see? My dog and I have quite the functional relationship happening. Clearly, my dog's got my back. He aims to please. We've got it goin' on. We understand one another. Well, what if I could train my devices to learn my commands, gestures, and feeling states? What if my devices and I could get along as well as my dog and I do?

    It turns out that we can actually do much of this now with technology. We can design technology that responds just the way a dog does.

    Check this out:

    • Voice Technology: Emotion detection (or affective computing) technology already exists. Studies in automotive safety and voice emotion recognition suggest that acknowledging a user's emotional state can meaningfully improve how well he or she functions. Smile/frown recognition is already an available feature on many devices made by companies like Samsung and Microsoft. If I'm angry, put a little empathy in the voice (not too much, because that's annoying), just a little cream and sugar. It might actually stop me from spiking my phone on the pavement. There's a rough sketch of the idea right after this list.
    • Context: The Google interface does a good job of surfacing my most relevant data, based on my calendar and emails. It infers what information I will most likely need at any given moment. Pushing this context-based inference further will mean interfaces that recognize our patterns and needs and offer productive feedback, in the form of applicable options and solutions that feel authentically helpful instead of disconnected and, well… infuriating. A toy version of that inference appears in the second sketch below.
    • Gesture: There are any number of gesture recognition products on the market. Most are used for novelty, as in immersive video games, but they're increasingly being used for function. Here are just a few cool examples: the hands-free liftgate on the Ford C-Max opens when I swipe a foot under the rear bumper, so I never have to put down whatever is filling my arms. Gesture tech is also currently being used to assist stroke patients in robot-assisted rehabilitation, and there's even gesture recognition software that can transcribe sign language signs into text. The third sketch below shows the template-matching idea behind simple gesture recognizers.
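
    Here's the first sketch, for the voice bullet: a deliberately crude "is the speaker worked up?" check in Python. It assumes the librosa audio library, and the energy and pitch-variability thresholds are invented for illustration; real affective computing relies on trained models, not two hand-picked numbers.

```python
# Toy arousal detector: loud speech with big pitch swings ~ "worked up".
# Assumes librosa is installed; the thresholds below are made up for illustration.
import librosa
import numpy as np

def sounds_worked_up(wav_path: str) -> bool:
    y, sr = librosa.load(wav_path, sr=None)   # load the recording
    energy = librosa.feature.rms(y=y).mean()  # rough overall loudness
    f0, _, _ = librosa.pyin(                  # frame-by-frame pitch estimate
        y, sr=sr,
        fmin=librosa.note_to_hz("C2"),
        fmax=librosa.note_to_hz("C7"),
    )
    pitch_spread = np.nanstd(f0)              # pitch variability, in Hz
    return energy > 0.05 and pitch_spread > 40.0

def respond(text: str, wav_path: str) -> str:
    if sounds_worked_up(wav_path):
        # The "cream and sugar": soften the reply a little, not a lot.
        return "No rush. " + text
    return text
```

    The point isn't the thresholds; it's the shape of the interaction: the device adjusts its tone based on mine, the way my dog does.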
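
    The second sketch, for the context bullet, is a toy calendar-watcher that only pipes up when it's actually time to get moving. The Event shape and the suggest function are hypothetical, made up for this example rather than taken from any real assistant's API.

```python
# Toy context engine: surface the next calendar event only when it's actionable.
# Event and suggest() are invented for illustration, not a real assistant API.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Event:
    start: datetime
    title: str
    travel_minutes: int  # estimated travel time to get there

def suggest(now: datetime, events: list[Event]) -> str:
    upcoming = sorted((e for e in events if e.start > now), key=lambda e: e.start)
    for e in upcoming:
        leave_by = e.start - timedelta(minutes=e.travel_minutes)
        # Speak up only inside the "getting dressed" window before departure.
        if now >= leave_by - timedelta(minutes=15):
            return f"Leave by {leave_by:%H:%M} for '{e.title}'."
    return "Nothing pressing right now."

events = [Event(datetime(2013, 6, 3, 9, 30), "Design review", travel_minutes=25)]
print(suggest(datetime(2013, 6, 3, 8, 55), events))  # Leave by 09:05 for 'Design review'.
```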
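
    And the third sketch, for the gesture bullet: the classic template-matching recipe behind many simple gesture recognizers. Record a few examples per gesture, then match a new trace to the nearest stored example using dynamic time warping (DTW). This is one standard approach, not how any of the particular products above works.

```python
# Toy gesture learner: store example traces per gesture, then label a new
# trace by its nearest template under dynamic time warping (DTW).
import numpy as np

def dtw(a: np.ndarray, b: np.ndarray) -> float:
    # Classic O(len(a) * len(b)) DTW distance between two 1-D signals.
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])

class GestureMatcher:
    def __init__(self):
        self.templates = {}  # gesture name -> list of example traces

    def teach(self, name: str, trace: np.ndarray) -> None:
        # Each repetition of a gesture adds another template for it.
        self.templates.setdefault(name, []).append(trace)

    def recognize(self, trace: np.ndarray) -> str:
        best_name, best_dist = "unknown", np.inf
        for name, examples in self.templates.items():
            for example in examples:
                d = dtw(trace, example)
                if d < best_dist:
                    best_name, best_dist = name, d
        return best_name

m = GestureMatcher()
m.teach("wave", np.sin(np.linspace(0, 6, 30)))     # pretend accelerometer trace
m.teach("beckon", np.linspace(0, 1, 30))
print(m.recognize(np.sin(np.linspace(0, 6, 28))))  # -> "wave"
```

    Notice teach(): every repetition adds another template, which is more or less how my dog learns a gesture too.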

    In Conclusion…
    My dog is an amazing companion. We spend lots of time together, and our interactions have revolutionized my idea of what's possible in personal devices, especially how they could serve me better through enhanced context awareness, gesture recognition, and productive feedback. That's convenient, because I probably spend about as much time with my devices as I do with my dog. Jeez… did I just say that? Well, yeah, I did, and if that means we live in a world that relies too much on devices… well, I'm not even going there, except to say that I want my devices to be devoted to my happiness, just like my dog. That's the mission: at Kicker we're all about providing that extra kick that'll make the world a happier place.

    Written by Jody Medich and Wendy Rolon