The evolution of sensors: Charting the path of services passively improved by AI

Executive Technology Director

The house where I grew up, on Cape Cod, has a wraparound deck that my father built.

It also has a simple, low-tech sensor that he installed during the construction: a small switch that flipped when someone opened the front door, ringing a bell. Nothing about this was digital, and we certainly didn’t call it a “sensor”. But in computing terms, the ringing of that bell created a single bit—on or off, yes or no—indicating an event to a group of people. And it was a useful bit, because it was attached to a context that we all understood. Not just a bell ring, but a bell ring for our door, in real time.

In the decades since, our ability to gather data through sensors has exploded

But the underlying goal is the same: to provide people with a piece of information that they find useful and meaningful. The tools we have for collecting, comparing, and analyzing data have advanced immeasurably, and in many ways this makes the challenge of drawing useful insights from it that much harder.

Consider how much value we can extract from a single measurement, tracked over time. The price of a stock or the number of steps taken in a day can be examined for trends, to serve as encouragement, track progress, or give us a clue as to what might happen next. The big improvements in this type of analysis come from Machine Learning (ML) algorithms, which have become extremely sophisticated at spotting patterns and comparing them with the earlier patterns on which they were trained. With each improvement, the world becomes a bit more predictable. And as Artificial Intelligence (AI) improves, more decisions can be made directly from these patterns, and actions taken, making the world a bit more effortless.
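
As a deliberately simple sketch of the idea, here’s what pulling a trend (and a naive forecast) out of one value tracked over time might look like, using made-up daily step counts; real ML models are vastly more sophisticated, but the goal is the same.

```python
# A minimal sketch of the "single value tracked over time" idea:
# fit a linear trend to a week of (hypothetical) daily step counts,
# then use it to flag progress and make a naive next-day estimate.
import numpy as np

daily_steps = np.array([6200, 6800, 7100, 6900, 7600, 8000, 8300])  # hypothetical data
days = np.arange(len(daily_steps))

slope, intercept = np.polyfit(days, daily_steps, 1)  # least-squares linear trend

print(f"Average change: {slope:+.0f} steps/day")  # encouragement / progress
print(f"Naive estimate for tomorrow: {slope * len(daily_steps) + intercept:.0f} steps")
```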

Putting one or two types of data into a specific context multiplies the value of the insight. The Gx Sweat Patch that Smart Design helped develop for Gatorade captures just two data points: the volume of sweat produced while it’s being used, and the sweat sodium content. These two data points are placed into a rich context, though, created by details provided by the athlete about their body and their activity. This allows a customized algorithm to generate crucial insights that direct the athlete’s hydration strategy, leading to clear improvements in performance.
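
To make that concrete, here’s a purely hypothetical sketch, with invented numbers and logic that are not Gatorade’s actual algorithm, of how two sweat readings plus athlete context could turn into an actionable recommendation.

```python
# Hypothetical illustration only -- not the actual Gx algorithm.
# Two sensor readings become useful once they're placed in the athlete's context.

def hydration_suggestion(sweat_volume_l: float,
                         sodium_mg_per_l: float,
                         body_mass_kg: float,
                         duration_hr: float) -> str:
    """Turn raw sweat data plus athlete context into a simple recommendation."""
    sweat_rate = sweat_volume_l / duration_hr                  # L/hr, personalized by the workout
    sodium_loss = sweat_volume_l * sodium_mg_per_l             # total mg of sodium lost
    fluid_pct_of_mass = 100 * sweat_volume_l / body_mass_kg    # rough dehydration proxy

    return (f"Sweat rate {sweat_rate:.1f} L/hr; replace about {sweat_volume_l:.1f} L of fluid "
            f"and {sodium_loss:.0f} mg of sodium "
            f"({fluid_pct_of_mass:.1f}% of body mass lost as sweat).")

print(hydration_suggestion(sweat_volume_l=1.4, sodium_mg_per_l=950,
                           body_mass_kg=70, duration_hr=1.5))
```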

Of course, even the measurements made by a disposable skin patch are benefitting from a leap in sensor technology. In the case of Gx, innovative microfluidic sensors, developed by a biotech company called Epicore, are used to capture sweat data—but this is just a tiny part of an explosion in sensor types over the past decade, allowing us to measure literally thousands of different events, often at minimal cost. 

This leads to the next step in data-analysis complexity

Correlating multiple data streams. Early fitness trackers like the FitBit used an accelerometer to quantify activity; more recent ones added GPS to infer the type of activity; and now platforms like Apple HealthKit add heart rate data to the mix. This represents not just an improvement in tracking, but a fundamental shift. A well-trained algorithm that looks at all three of those data streams can tell you not just what you did, but how it affected you, and by extension, what you should be doing (and avoiding) for maximum benefit. In some situations, it can even warn you of impending cardiac events.
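
Here’s a toy illustration of that correlation step, using hypothetical per-minute samples rather than any real tracker’s data: align the streams on a shared clock, then ask questions no single stream could answer.

```python
# A simplified sketch of correlating three streams (hypothetical per-minute samples):
# an accelerometer activity count, a GPS-derived speed, and a heart rate.
import pandas as pd

minutes = pd.date_range("2024-05-01 07:00", periods=4, freq="min")
activity = pd.DataFrame({"time": minutes, "accel_counts": [30, 180, 195, 40]})
gps      = pd.DataFrame({"time": minutes, "speed_kmh":   [0.0, 9.5, 10.1, 1.2]})
heart    = pd.DataFrame({"time": minutes, "bpm":         [72, 138, 151, 110]})

# Align the streams on a shared clock -- the step that turns three signals into one picture.
fused = activity.merge(gps, on="time").merge(heart, on="time")

# Naive interpretation: moving fast + high accel counts = running; then ask how hard it was.
fused["running"] = (fused["speed_kmh"] > 7) & (fused["accel_counts"] > 100)
effort = fused.loc[fused["running"], "bpm"].mean()

print(fused)
print(f"Average heart rate while running: {effort:.0f} bpm")
```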

It’s debatable whether the expansion of sensor types or the improvement in ML algorithms will ultimately have a greater impact. But there’s not much question as to which technology is most mature in combining the two: video tracking. So much money and effort has been thrown at video data at this point that facial recognition is the subject of legislation, and just about any moving object can be a candidate for self-driving. 

The ubiquity of video tracking evolves data analysis even further, by allowing us to derive information from a camera and correlate it with data from other sensors. Peloton’s new “Peloton Guide” platform boasts camera-based motion tracking to guide users through strength-building workouts, correlate the collected data with heart rate through a wearable, and offer suggestions on what workouts to do next. 

That’s just scratching the surface too: imagine what could be achieved in a retail space, factory, or home by identifying and tracking the movements of people (or pets) and correlating them with sales, production rate, time of day, or dozens of other metrics. The introduction of 5G and more powerful embedded hardware (think Nvidia) further accelerates this trend, by allowing much of the ML analysis to be performed locally.
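
As a rough sketch of what that local analysis can look like, the snippet below uses OpenCV’s built-in HOG person detector to count people in webcam frames on the device itself and tally the counts by hour of day; a production system would use far more capable models and hardware, but the pattern is the same.

```python
# A minimal sketch of on-device ("edge") analysis: detect people in camera frames locally
# with OpenCV's built-in HOG person detector and tally detections by hour of day.
# (Assumes a webcam at index 0; real deployments would use much stronger models.)
import cv2
from collections import Counter
from datetime import datetime

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

counts_by_hour = Counter()
cap = cv2.VideoCapture(0)

for _ in range(100):                       # sample 100 frames, then stop
    ok, frame = cap.read()
    if not ok:
        break
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    counts_by_hour[datetime.now().hour] += len(boxes)

cap.release()
print(dict(counts_by_hour))                # people detected, grouped by hour of day
```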

But remember, for all the promise of this new IoT and ML tech, we’re still looking for the same thing that the door switch on my childhood home offered: a simple insight that we can use to make a decision. Is there an intruder in my house (Y/N)? Should I buy this stock (Y/N)? Which workout should I do next?

This realization might bring some lofty predictions about the future of tech crashing back to earth, but in fact, the challenge of sifting useful results from mountains of input has been around for decades. It’s certainly something Smart has been helping our clients with since the 1980s, and many of the tools for tackling it haven’t changed all that much.

To produce an outcome that’s useful and meaningful to people, you still have to build empathy with the users and understand the context (human, commercial, and technological). You still have to experiment with the available technology, to understand what it’s capable of and how it might integrate into existing products and experiences. And you still have to run pilots—lots of them—in order to observe whether all this technology is actually delivering value to the users.

More importantly—and this is where many tech-minded organizations drop the ball—you have to maintain an obsessive human focus throughout your testing and learning process. Cool tech, even if successfully implemented, isn’t successful if it doesn’t produce value or meaning…and users are the final arbiters of that. It might sound odd to say that empathy and iteration are the crucial tools for optimizing new advances in IoT and ML, but this is still tech in the service of people, so it has to be human-centered.

What has changed is the range of personalized services and products made possible by these advances, and this is an area where Smart Design has invested heavily in building our expertise. Plenty of companies understand the tech, and any design studio worth its salt knows how to understand and respond to human needs. But the overlap between these two groups is still far too small, and if we’re going to get full value out of this new dawn in intuitive, highly customized user experiences, it’s going to need to grow.

In the longer term, we can also expect more complex systems, where the insights developed by your data-powered algorithms aren’t delivered to people, but to other devices or platforms. That certainly changes the constraints of your output design, but in the long run it’s just an additional layer on a familiar cake. No matter how many algorithms are handing off insights to devices that are acting on a person’s behalf, the person’s experience is still what defines success. That’s a truth that hasn’t changed, whether the data is being gathered by a multi-million dollar video tracker, or a switch on the door of a house on Cape Cod.

Examples of sensors at work in your daily life

Gx Sweat Patch

Volume of sweat produced + sweat sodium content + activity + user data.

FitBit

An accelerometer is used to develop an algorithmic score called “steps.”
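
As a toy illustration (not Fitbit’s actual algorithm), a bare-bones step counter might just count threshold crossings in the acceleration magnitude; the values and threshold below are hypothetical.

```python
# A toy sketch of how an accelerometer stream might become a "steps" score:
# count rising crossings of a threshold in the acceleration magnitude (hypothetical values).
import numpy as np

accel_magnitude = np.array([9.8, 10.1, 12.4, 9.6, 9.9, 12.9, 9.7, 10.0, 12.6, 9.8])  # m/s^2

THRESHOLD = 11.5  # above resting gravity -- tuned per device in practice
steps = 0
for prev, curr in zip(accel_magnitude[:-1], accel_magnitude[1:]):
    if prev < THRESHOLD <= curr:   # rising edge through the threshold = one step
        steps += 1

print(f"Estimated steps: {steps}")
```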

Apple HealthKit

Pairs GPS (for aerobic exercises such as long-distance running) with heart rate to determine the total workout and calories burned. It also tracks rest, such as sleep. HealthKit also works as a partner platform, allowing the Apple Watch to be compatible with products like Strava.

Tonal

An AI-powered, Nautilus-style resistance machine. The difference between Tonal and most smart equipment is that it adjusts mid-workout based on performance. Because the AI learns what you’re capable of, it can tune the fitness plan to deliver a better workout than you might choose for yourself. It’s a fascinating product, and it points to the future of fitness: AI-powered resistance.

Adaptive Cruise Control 

Cruise control that uses radar/lidar sensors to recognize surrounding cars and automatically adjust speed, keeping the vehicle in the flow of traffic. It’s a great example of a car using complex sensor technology but ultimately delivering the output as a simple experience for the driver. This will figure heavily in the next development phase of autonomous vehicles.
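
As a loose illustration (not any manufacturer’s actual control logic), the core idea can be sketched as a simple proportional controller that nudges the set speed to hold a target gap.

```python
# A highly simplified sketch of the adaptive-cruise-control idea: read a distance-to-lead-car
# value from radar/lidar (here, hypothetical numbers) and nudge the set speed to keep a safe gap.

TARGET_GAP_M = 40.0      # desired following distance
GAIN = 0.5               # how aggressively to correct (illustrative value)

def adjust_speed(current_speed_kmh: float, gap_to_lead_m: float) -> float:
    """Return a new set speed that closes or opens the gap toward the target."""
    error = gap_to_lead_m - TARGET_GAP_M          # positive = too far back, so speed up
    return max(0.0, current_speed_kmh + GAIN * error)

speed = 100.0
for gap in [55.0, 42.0, 30.0, 25.0, 38.0]:        # radar readings over time (hypothetical)
    speed = adjust_speed(speed, gap)
    print(f"gap {gap:5.1f} m -> set speed {speed:5.1f} km/h")
```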

Amazon Just Walk Out

With wider integration planned for Whole Foods, Amazon’s Just Walk Out system consists of hundreds of cameras with a God’s-eye view of customers. Sensors are placed under each apple, carton of oatmeal, and boule of multigrain bread. Behind the scenes, deep-learning software analyzes the shopping activity to detect patterns and increase the accuracy of its charges.

Nest Cam IQ Indoor

The Nest indoor camera uses facial recognition to tell you who is in your house, based on who passes through the camera’s field of view.

Technology

Explore our work in technology

Sensor Technology, Machine Learning and AI, Data science, and more.


About John Anderson

John’s background is in software development, hardware engineering, data science and product design. Recently, he has been the project director on the award-winning Gatorade Gx Sweat Patch that mixes software, IoT, data science and a sprinkle of biomedical engineering. Additionally, John oversees a product and engineering team that’s building an IoT platform using machine learning, sensing and computer vision. John is also a technical advisor at NYU Stern and speaks regularly on how designers and engineers can create experiences for people with accessibility needs.

Let’s design a smarter world together