Our Talking, Walking Objects


Meeting Simon for the first time was one of the most sublime experiences I’ve had. With every coy head nod, casual hand wave and deep eye gaze, I felt he already knew me.

Simon is a humanoid robot being developed at the Georgia Institute of Technology to explore intuitive ways for people and machines to live and work alongside one another. I had designed the robot’s shell — its outward appearance — so I knew exactly what to expect, but interacting with it as a programmed and somewhat sentient creature still surprised me.


Simon understood spoken sentences and used social skills to respond appropriately. If it didn’t understand a certain request, it raised its arms in an apparent plea for forgiveness or tilted its head to express confusion. Its ears lit up when it recognized a color, and it spoke back to me when I was finished talking.

Simon, a research effort and not a product for sale, is part of a growing collection of social robots that can essentially see, hear, feel and react through humanlike sound and movement. Our future may not match our sci-fi fantasies of androids with limbs, torsos and expressive faces meandering around our rooms to pick up clothes and mix cocktails, but robots are entering our homes in subtle ways, through countertop appliances, hand-held tools and wearable gadgets that display specialized and isolated robotic behaviors.

Your coffee maker or camera may already have some of these elements, responding automatically to shut off or follow a sequence of timed commands (wait 30 seconds, then take a picture; at 6 a.m., start brewing). The next generation of products will be only more sophisticated in this regard.

Whereas designers typically use form, color and materials to make an object express some human element (a drill handle may have a pattern that looks aggressive, a toaster might have knobs and dials that seem friendly), we’re entering a time when sound, light and movement are equally important parts of the creative palette. Everyday objects whose expressive elements have long been static will now glow, sing, vibrate and change position at the drop of a hat.

The behaviors of these future robotic objects may be utilitarian, like a lamp that bends to follow the items you reposition on a desk, or they may provide feedback, like a fork that vibrates when you’re eating too fast. They won’t require you to press a sequence of buttons to make things happen, but they will work alongside you in unobtrusive ways, responding to caresses, waves or verbal commands. Some might just sit back and observe you to understand what you need.

A robotic cutting board might guide you while you cook and offer helpful tips when your knife skills aren’t up to par. Many products will be connected to the Internet, with access to what’s happening in the larger world. A raincoat might glow or whistle when it knows you’ll need it on your commute. Many webcams used for video conferencing already raise their heads to let you know when someone is ready on the other side of a conversation; soon, during a chat, they will mimic your body’s movements to help express your point to a friend across the globe. A medicine bottle might open its lid to offer you a precise dose or automatically call your pharmacist when it’s empty.

Washing machines will text or call you when your laundry is done. Robotic appliances will become ever more energy-conscious — many automated thermostats already adjust themselves appropriately, combining knowledge of their users’ patterns with information about how to conserve energy.

As products become smarter, their behaviors will mean they essentially have continuing conversations with us, whether those include verbal exchanges or not. Just as we read subtle cues from our pets (we see a dog’s ears and believe that he feels sad, guilty or excited), we’ll read emotion from our products, perceiving nuances of dialogue and a sense that the object is “alive.” For example, colored lights on a robotic vacuum cleaner will tell us what’s going on inside: a slow green pulse indicates “All systems go!”; rapid red flashing pleads “Help! Something is amiss here.” A jubilant melody at the end of a washing machine cycle says, “Everything went well and your clothes are ready!” When a video conferencing webcam in an office lowers its head, it’s saying: “Bye! Going to sleep now.” These animated behaviors blend together, and it’s human nature to read them as emanating from a living entity.

Through their personalities, these objects will offer us emotional value along with other features. Siri, the iPhone’s voice-controlled assistant, has already won the hearts of many by displaying a consistent, witty personality with which people can converse. Autom, a new countertop weight-loss product, was created by researchers who learned that a robotic coach with expressive eyes and face was more effective at keeping dieters eating better and exercising more because of the emotional bond it created.

Even robots with minimal expressions, like the Roomba floor cleaner, have led people to treat them as living entities, bestowing names on them and perceiving moments of celebration and even guilt. Researchers at Georgia Tech have found that this emotional connection leads people to feel empathy for the product, making them much more accepting of mechanical flaws that would otherwise be seen as a nuisance.

Browsing Amazon.com reviews of the Roomba reveals just how compelling this sense of life is: “We have named our new Roomba Rosie. She is my new best friend. I vacuumed with my 15-year-old Kirby before letting Rosie do her thing…She is wonderful. Most of my furniture is too heavy to move each week. She cleaned under those pieces and I was amazed at what she picked up.”

While working at the firm Smart Design on a floor-cleaning robot for a company called Neato Robotics, I studied research that had been conducted on the Roomba and knew that shaping the vacuum’s personality would be as important to building a positive relationship between product and owner as any other element. To do that, my team and I dissected typical service personalities: the hotel maid you never see, the chatty bartender, the proper nanny. “Which of these should this vacuum be?” we wondered.

I also asked the team to break down the product’s behavior into critical “moments” for personality definition. With people, we know that a person’s true character is revealed during moments of extremes — both negative (stress, anger, fear) and positive (pride, jubilation, satisfaction). To define the robot’s character, we identified the extremes and then created a language of expression that combined light patterns, a sound palette and choreographed movements.

For example, getting stuck under the couch would be a moment of distress, as would its batteries running low. Completely cleaning a room’s carpet might be a moment of jubilation. We wanted to define exactly what the robot would do at these moments. What sounds should it make? How should it move? Just as we might work with a color or material specialist in a traditional product, here we worked with a music composer to create a palette of sounds.

While the idea of charming entities all around us, guiding us through everyday tasks and reminding us to do things throughout the day, may seem benign, some experts raise concerns about the implications of robots in our daily lives. Once designers perfect the art of manipulation, manufacturers can use this power to entice people to adopt behaviors that may not be in their best interest. There could be, for instance, a robot that tempts you to buy unhealthy food or lottery tickets.

In addition, there are many security concerns around robotic objects, especially when they are mobile and in the intimate space of the home. For example, what happens when your products are hacked in a nefarious scheme to let a criminal see inside your house or, worse, to manipulate you into behaving in a way you otherwise wouldn’t?

Despite the potential risks, the future will be rich with sensor-based, animated objects using expressive sound, light, motion and screens to praise, encourage, advise and comfort us. We’ll have fun with TVs and music players that reposition themselves in response to our dance moves, we’ll be relieved of chores by robots that pick up our crumbs and scrub the floors, and we’ll stay fit with robotic parasites that curl around our wrists to take health stats and exhort us to get off the couch.

With this throng of sentient objects in our lives, we’ll have to negotiate a whole new set of relationships. Will we adore our new products as if they were pets, doting on them and anticipating their greetings? Or will all this lively communication create an annoying cacophony of gadgets? My hope is for the former, but it will depend on the designers’ ability to devise interactions that consider emotional value as important as any other product attribute. If designed well, these mechanical creatures, like my robotic pal Simon, can tug at our heartstrings in a new way.

Originally printed in The New York Times.

Photo Credit: Daniel Borris for The New York Times. Simon, a humanoid robot, sits for a photograph.

Filed under: Digital Experiences, Carla Diana, Business Design, Emotions