How social design can make new products more human


Connected devices are playing an ever-greater role in our daily lives

They are evolving into trusted and familiar entities we’ve come to depend on to motivate us to exercise, make sure we take our medications on time and help us achieve both our work and personal goals. These devices function best when they go beyond “smart” to truly understand not only our needs but also the social and cultural context in which we live. Yet authentic human interaction with devices will only be achieved through better design—not more technology. So how are designers going to take this to the next level to enhance our lives?

For our third Smart Salon of 2021, we invited Carla Diana, author of the new book My Robot Gets Me, to sit down with technology journalist and podcaster Stacey Higginbotham and discuss the evolving field of social robotics and its growing impact on design and how we live. Carla talked about ways to make social interactions with such devices feel more natural, why dumb objects are really scary and the importance of being vigilant about ethical concerns.

Watch the entire session featuring (clockwise from the left) Smart Design host John Anderson with Stacey Higginbotham and Carla Diana.


Our Panelists

Carla Diana is a designer, author and educator who explores the impact of future technologies on everything from robots to connected home appliances. She developed the 4D Design program at the Cranbrook Academy of Art, serving as its first Designer in Residence, and is also Head of Design for Diligent Robotics, an AI company developing robotic assistants for healthcare settings.

Stacey Higginbotham has been covering technology for 20 years for major publications including Fortune and MIT Technology Review, and is the creator and co-host of the Internet of Things podcast.

Here are the key takeaways from the evening’s conversation:

Design social interactions to feel more natural and intuitive—more human

Social robotics is about the interaction of the physical and the digital, and how technology can humanize our interactions with products and devices. As Carla explained, the premise is that “instead of having to type code, or using a keyboard or press buttons, we could interact with computing devices by bringing to the table what we know about being a human being.”

For example, Moxi, the healthcare robot made by Diligent Robotics, where Carla leads design, can speak, gesture, respond and even navigate its hectic hospital environment. Speaking about the new technology behind social robotics, Carla added, “We have the ability to interact with products as entities—and actually have a conversation that is evolving.” That entity can be a robot or, say, a vacuum cleaner.

Use this core framework to build better-connected devices

Carla outlined a series of steps to reach this goal. The first is recognition of the physicality of the object and its relationship to your body, face and hands. Next is expression, or how the device communicates with you through light, sound and movement. Then comes interaction: how the device (or, as in one example she mentioned, the Stanford Center for Design Research’s Mechanical Ottoman) takes cues and responds. Context is also critical: the device needs to understand both you (the user) and the environment in which it will function (e.g., the rugged, waterproof Hammerhead bicycle navigator). Finally, create an ecosystem, or family of devices, that can work together, communicate and scale, thereby providing the context to evolve from “smart” to “social.”
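As a purely illustrative aside (not anything shown in the salon or used by Diligent Robotics), here is a minimal sketch of how those five elements (physicality, expression, interaction, context and ecosystem) might be modeled in code; every class, method and device name below is a hypothetical placeholder.

```python
# Hypothetical sketch only: the five framework elements expressed as a
# minimal device model. Names and structure are illustrative placeholders.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Context:
    """What the device knows about the user and the environment."""
    user_present: bool = False
    location: str = "unknown"


class SocialDevice:
    """One connected object: physicality, expression and interaction."""

    def __init__(self, name: str, form_factor: str):
        self.name = name
        # Physicality: the object's form and its relationship to the body
        self.form_factor = form_factor

    def express(self, channel: str, signal: str) -> None:
        # Expression: communicate through light, sound or movement
        print(f"{self.name} -> {channel}: {signal}")

    def respond(self, cue: str, context: Context) -> None:
        # Interaction: take a cue from the person and respond in context
        if context.user_present:
            self.express("sound", f"acknowledging '{cue}' in {context.location}")
        else:
            self.express("light", "idle pulse")


@dataclass
class Ecosystem:
    """A family of devices that share context and work together."""
    devices: List[SocialDevice] = field(default_factory=list)

    def broadcast(self, cue: str, context: Context) -> None:
        for device in self.devices:
            device.respond(cue, context)


# Usage: two hypothetical devices reacting to the same cue.
home = Ecosystem([SocialDevice("lamp", "tabletop"), SocialDevice("vacuum", "floor")])
home.broadcast("good morning", Context(user_present=True, location="kitchen"))
```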

Maintain a North Star of social appropriateness

During the design process, Carla recommended that teams think about “the social exchange between the person, the product, the people and the product, and the people in the products.” They should use techniques such as playacting and visioning experiences to map out key moments of interaction; for instance, while designing Moxi, a person pretended to be the robot to help determine its points of contact with a human and how it should react. “This highlights the nuances of the social interaction that might not come through in a rendering or drawing,” noted Carla.

Don’t fear scary robot overlords

Thanks to popular media and Hollywood movies, there’s an ever-present angst about machines taking over the human race (I, Robot comes to mind). But what Carla finds particularly frightening, she admitted, is today’s “socially insensitive [devices that are always] beeping and screaming at us [and] misbehaving.” That happens when machines are detached and operating on their own, outside of what they are meant to be doing.

One solution is to better define “what it means to be a social product as opposed to a smart product.” And we need to inform consumers about these new devices, recognizing that, whereas young people may be more used to them—“having grown up with cameras on every street corner”—the older generation may feel uncomfortable.

Be hyper-aware of ethical concerns

Super-powerful sensors—what Carla referred to as the “everything sensor”—have the potential to violate privacy. Companies and brands must therefore carefully balance the benefits and potential risks of social interaction.

For example, while camera-based vision technology can read a person’s heart rate just by looking at their face, it can also “expose things that we don’t want them to necessarily expose,” she warned. The same technology that helps people with disabilities may at the same time feel intrusive, “like someone lurking in the corner of a room and listening and not letting you know they are there.”

Devices that can read our emotions could be used to manipulate and persuade users to do things, and their data could be hacked. Designers must be transparent about what the devices do, especially with vulnerable populations such as kids, she advised. And they should question whether “bad habits and bad behaviors” might result.

Stop thinking only about the technology

Avoid using technology for technology’s sake, as this can lead to embedding unnecessary features that confuse the user. “There may be enormous technology challenges for us as designers, and how we use these complex tools, but to the person experiencing [technology] it should feel like it is just not in the way,” Carla stated. In other words, function trumps functionality. A good example, she mentioned, is Smart Design’s Flip Mino camera, an easy-to-use, point-and-click “record” device that fits in your pocket, unlike cumbersome old camcorders with a million settings. And there are other aspects of social interaction to consider: movement, light, and animation, as well as the fourth dimension—time—“and how things change over time,” she pointed out.

Conclusion

For all the field’s recent advances, “there are lots of very wicked, gnarly technical challenges at this moment in time,” Carla said of social robotics. “We are still in what I would call the awkward teenage years.” How far into the future will it be before we truly embrace this design philosophy and are able to interact with socially savvy robots? Perhaps a better question, Carla averred, is: “Are these robots showing up well, or are they failing us?”

Smart starters

Discover the human (and robot) brain
By understanding our social interaction with robots, we can translate what’s happening in the robot brain (or machine brain) into human-ese. Employ playacting and visioning experiences to map the key moments and nuances of social interaction—and then design them to feel more natural and intuitive.

Imagine a socially savvy product
To go beyond a “smart” product, consider the “social exchange” that takes place between the product and the person. And always be aware of the social “appropriateness” of this interaction.

Reduce the fear factor
Assess the level of consumer concern about technology and devices that has been fueled by Hollywood and popular media. Be upfront with consumers about what these devices can—and cannot—do.

Balance benefits and risks
Highlight the ethical concerns of more social interaction and the possible downsides—intrusion, manipulation, and violations of personal privacy. Make sure these interactions do not lead to bad habits and bad behaviors, especially for vulnerable populations, such as young people.

Achieve authentic human interaction
Avoid deploying technology for technology’s sake, so that features don’t get in the way of social interaction and confuse users. And always ask whether the connected device or robot is achieving its goal—or failing us.
