The Life Robotic: When Autonomous Machines Can Do More Than Just Vacuum

A leading technologist imagines what will happen when home robots take on a larger role as domestic servants

Roomba, the robotic vacuum cleaner.

This article was published in Scientific American's former blog network and reflects the views of the author, not necessarily those of Scientific American.


There are already more robot vacuum cleaners in people's homes than there are individuals of many intelligent animal species. Although these machines are effective at simple tasks, they are not yet as intelligent as any of our pets. They have only a few sensors and understand very little about the world they live in.

Still, lots and lots of start-ups and established companies want to build more sophisticated domestic robots, in the belief that home automatons are going to be a big growth area. I believe this trend will continue, driven largely by rapidly aging populations in Japan, Korea, Europe, North America, and even China. To help us as we grow old, though, the next generation of domestic robots is going to have a much richer sensory world than existing home robots do.

Rosie the robot maid from The Jetsons; cosplay at San Diego Comic-Con 2012. Credit: Pat Loika, Flickr (CC BY-SA 2.0)


We humans know how we sense the world, what we are aware of, and what catches our attention. Our sensory world is very different from that of our dogs. We know that, compared with us, our dogs have an exquisite sense of smell, and that they are motivated by smells in ways we never could be. Although their noses are perhaps a million times more sensitive than ours, they love to shove them in places we would never want ours to be placed. They do that because they can learn so much about the world that we never could from our relatively unsophisticated sense of smell. In contrast, our eyesight is much better than a dog's; we have much better acuity, and our range of color vision includes red, something completely lost on dogs, who just see our vivid reds as shades of grey. We are visual creatures; dogs are smelling (and smelly) creatures.

So how will future domestic robots sense the world, and how will their worldview be different from ours? What will it be like to be a robot in human homes? What are robot creatures going to be like?

A Phone’s Worldview

The new sensor world of domestic robots will be driven largely by the success of smart phones over the last 10 years, by the dropping price and increasing performance of certain sensors, and by the availability of low-power, physically small computation units. Let's try a thought experiment about what it is like to see the world as a smart phone does.

The innards of a modern smart phone, absent the actual cameras, the expensive touch screen, the speakers and microphone, the high-performance batteries, and the beautifully machined metal case, are worth about $100 to $200 at retail. Both Samsung and Qualcomm, two of the biggest manufacturers of chips for phones, sell circuit boards at retail, without discounts for a large order, that contain most of the rest of a modern cell phone for about $100. This includes eight high-performance 64-bit processors, internal circuitry for two high-definition cameras, the GPU (graphics processing unit) used to render photos and videos, the hardware circuits that drive the screen, special-purpose silicon that finds faces and focuses on them, special-purpose speech-processing and sound-generating silicon, vast quantities of computer memory, cryptographic hardware to protect code and data from external attacks, and wi-fi and Bluetooth drivers and antennas. Missing are the GPS system, the motion sensors, near-field communication for services such as Apple Pay, and the radio-frequency hardware to connect to cellular networks. Rounding up to $200 to include everything in a unit would be a safe bet.

Anyone considering building a domestic robot in the next five years would be crazy not to take advantage of phone technology and the manufacturing infrastructure for producing those devices. The $200 for a single board provides a vast leg up on building the guts of a domestic robot. So this phone-related technology is what will largely drive the sensory world of our coming domestic robots over the next decade or so, and phones may continue to set the pace more widely for many technologies, given how rapidly phones themselves keep changing. Don't be surprised to see more silicon in phones over the next few years testing out AI technologies such as deep learning. Their current GPUs are already well suited to that, but smarter and smarter phones will compete on how well machine learning can be applied to practical problems of convenience and enjoyment for the phone owner.

There may well be other sensors added to domestic robots that are not in phones; they will be connected to the machines' processors and their data folded into the same processing, adding to the robots' sensory world. Even so, a lot of what robots sense is going to come from the cameras, microphones, and radio-spectrum transceivers (transmitting and receiving units) built for mobile phones. In the forthcoming 5G chip sets for high-end smart phones, there will be a total of nine different radio systems.

Even if equipped only with the $100 stripped-down phone systems, our domestic robots will be able to "smell" our Bluetooth devices, our wi-fi access points, and any other devices that use wi-fi. As I look around my house I see wi-fi printers, laptops, tablets, smart phones, and a Roku device attached to my TV (more recent "smart TVs" have wi-fi connections directly). I already own an array of active Bluetooth devices, including computer mice, keyboards, scales, Fitbits, watches, speakers, and many more. The sensory world of our domestic robots will include all these devices; the robots will notice them as they drive by, just as we notice when we are in a green bedroom or a blue dining room. Some of these Bluetooth devices will be in fixed locations; some will move around with people. Some will be linked to a particular person, who might also be located by cameras on the robot. But the robot will have eyes in the back of its head: without pointing its cameras in some direction, it may well know when a particular person is approaching just from their Bluetooth signature.
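
As a concrete illustration of this kind of radio "smell," here is a minimal sketch, assuming a robot running Python with the third-party bleak library installed; everything else, including the idea that a robot would log devices this way, is my own illustrative assumption. It passively collects Bluetooth Low Energy advertisements for a few seconds and prints each device's address, name, and signal strength:

```python
# A minimal sketch, not production code: log every Bluetooth Low Energy
# advertisement in range, the robot's "smell" of nearby devices.
import asyncio
from bleak import BleakScanner  # third-party BLE library (pip install bleak)

async def sniff_ble(duration_s: float = 5.0) -> None:
    # Passively collect advertisements for the given window; with
    # return_adv=True we get the advertisement data alongside each device.
    found = await BleakScanner.discover(timeout=duration_s, return_adv=True)
    for address, (device, adv) in found.items():
        # adv.rssi is received signal strength, a rough proxy for distance.
        print(f"{address}  name={device.name!r}  rssi={adv.rssi} dBm")

asyncio.run(sniff_ble())
```

Signal strength is only a crude distance proxy, but a recurring address-plus-strength signature is exactly the sort of cue a robot could use to notice a familiar person approaching.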

With this sensory world and just a little processing of the Bluetooth signals, our domestic robots can build a totally different view of the world than the one we have when we use our phones. To us humans, our devices are a means to communicate; to our robots they will be geographic and personal tags, decipherable from even a scant part of the information within the signals.

Depending on who builds the robots and who programs them, they might be able to extract much more information about us than just distinguishing us as recognizable persons. Perhaps they will have the ability to listen in on the content of our communications, be it how many steps we have taken today, or the words we are typing into our computer, or our e-mails coming in over the wi-fi in our homes.

In recent years Dina Katabi and her students at M.I.T. CSAIL (the Computer Science and Artificial Intelligence Laboratory, where I was the director until 2007) have been experimenting with the radio signals used in wi-fi—but not the actual digital content carried by those signals. Every phone has to perform this kind of low-level signal processing, but mostly phones just want to get at the content, and perhaps maintain the quality of the connections. If the wi-fi signal is not strong, or if there are too many other devices trying to use the same frequency band, the phone may choose to switch to another radio frequency that is clearer.

Katabi and her students have taken advantage of this embedded processing power to determine how the timing varies subtly across the multiple paths a radio wave may take to reach a device. They have used it to detect how people's heartbeats and breathing very slightly alter the timing of the signals; the altered signals may even reveal fluctuations in a person's emotional state. (Note that this does not require any sensors attached to someone's body—the radio transceivers just detect how the person's physical presence changes the behavior of wi-fi signals.) Our future domestic robots may be able to get a direct read on us in ways that other people never could. They will detect every one of us within a living space and constantly sense measures such as heart rate and breathing.
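
To make the signal-processing idea concrete, here is an illustrative sketch only: real wi-fi sensing works on channel measurements from radio hardware, which I fake here with a synthetic, noisy oscillation. The sampling rate, noise level, and breathing band are all assumptions for demonstration; the point is that a simple Fourier transform can pull a breathing rate out of a slowly varying radio measurement:

```python
# Illustrative only: recover a breathing rate from a noisy, slowly varying
# "channel measurement" using a Fourier transform.
import numpy as np

fs = 20.0                                  # measurements per second (assumed)
t = np.arange(0, 60, 1 / fs)               # one minute of samples
breaths_per_min = 14                       # hidden "ground truth"
breathing = 0.05 * np.sin(2 * np.pi * (breaths_per_min / 60) * t)
measurement = breathing + 0.02 * np.random.randn(t.size)  # add channel noise

# Peak of the spectrum within a plausible human breathing band (6-30/min).
spectrum = np.abs(np.fft.rfft(measurement - measurement.mean()))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
band = (freqs > 0.1) & (freqs < 0.5)
est = freqs[band][np.argmax(spectrum[band])] * 60
print(f"estimated breathing rate: {est:.1f} breaths/min")
```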

If the chip sets use the full complement of communications channels that a smart phone has, there will be much more richness to the sensory world of the robot. Using GPS, even indoors, the robot will roughly know its global coordinates. And it will have access to all sorts of services any smart phone uses, including the time of day, the date, and the current weather both locally and elsewhere in the world. It will know that it is dark outside because of the time of day and the time of year, rather than knowing the time of day from how long it has been dark or light. Humans get that sort of information in other ways. We know whether we are inside or outside not from GPS coordinates and detailed maps, but from the perception of light, the way the air circulates, and how sounds reverberate. Those sensory inputs are part of our subjective experiences.
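
A small sketch of that inversion, assuming the third-party astral package for sunrise and sunset arithmetic (the location and the framing as a robot's "is it dark?" check are my own illustrative choices): the robot derives darkness from the clock and the calendar rather than from any light sensor:

```python
# Illustrative only: infer "it is dark outside" from clock + calendar,
# the reverse of how a human would infer the time from the darkness.
import datetime
from astral import LocationInfo            # third-party: pip install astral
from astral.sun import sun

home = LocationInfo("Boston", "USA", "America/New_York", 42.36, -71.06)
today = sun(home.observer, date=datetime.date.today())   # aware UTC datetimes
now = datetime.datetime.now(datetime.timezone.utc)

is_dark = now < today["sunrise"] or now > today["sunset"]
print("dark outside" if is_dark else "light outside")
```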

If our robots bypass the way we experience the world, with direct technological access to information, it will be hard for them to understand our limitations. If, way in the future, they ever achieve their own levels of consciousness, they may not have much empathy for us. And should there be a horrible infrastructure-destroying event, such as an attack on satellites or a nuclear war, our robots will be left incapacitated just when we need them the most—a scenario we should ponder when we design them.

With the additional radio frequencies of 5G, it will be possible to compare the arrival times of different signals in many wireless domains. That development will allow the machine to begin to understand things about the built environment: whether a wall is made of concrete or of wood, or whether a large moving metal object nearby is a truck or a bus. If the builders of our domestic robots decide to program all this in, the machines will be imbued with a kind of super-sense of the world around them.
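
The underlying physics is simple enough to show in a toy calculation (this is not any real 5G API, and the nanosecond figures are just examples): differences in arrival time convert directly into differences in path length, which is the raw material for reasoning about walls and large reflectors:

```python
# Toy calculation: convert radio arrival-time differences into the extra
# distance a delayed (e.g., reflected) signal must have traveled.
C = 299_792_458.0  # speed of light, m/s

def path_length_difference(delta_t_ns: float) -> float:
    """Extra distance traveled by a signal arriving delta_t_ns later."""
    return C * delta_t_ns * 1e-9

# A reflection arriving 10 ns after the direct path traveled ~3 m farther,
# hinting at a reflecting surface roughly 1.5 m off the direct line.
for dt in (1, 10, 100):
    print(f"{dt:>3} ns late -> {path_length_difference(dt):6.2f} m extra path")
```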

The cameras on our robots could readily be chosen to extend the range of light they see into the ultraviolet and infrared spectra. In the latter case, they will be able to see where we have recently been sitting, and perhaps which way we walked, by following a heat trail. That capability would be a demonstration of yet another weird sense that they would have and we don't.

Robots might also tap into the devices that are starting to populate our homes, our Amazon Echos and our Google Homes, and even our smart TVs, which can listen to us at all times. A home robot—in its terms of service, which we will likely agree to sign without reading—might well have access to what those devices are hearing. It might also link to what the cameras on our smart smoke detectors are seeing, what went on in our car right before we got home, where we just used our electronic wallet to buy something, and perhaps what it was that we bought. Domestic robots may have an ability to sense things about our lives that the most controlling parent of teenage children could only imagine.

What They Know

So the sensor world may be very different for our domestic robots than for us. They will know things we can't know, but they also may lack understanding of our subjective experiences. How well they are able to relate to us will depend on how well their perceptual processing aligns with ours. And this is where thinking about the sensory world of future robots eventually becomes very murky indeed.

How we interpret the world we sense has been informed by a lifetime of experience that involves both sensing and doing. Our new home robots are likely to interpret their sensors through deep-learning networks, providing a label for what they see (or radio-sense), such as "oven," "fridge," or "toilet." Deep learning can easily provide that level of understanding. But the robots will not have the more incisive understanding that we possess. We know that when we put something into an oven and take it out later, its visual appearance may have completely changed—just one example of our wealth of common-sense knowledge. Current machine-learning techniques do not handle this sort of knowledge, so our robots are going to be quite naive about how the world really works for some time to come.
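
For a sense of what that labeling layer looks like in practice, here is a minimal sketch, assuming the PyTorch and torchvision libraries; the function name and the framing as a robot's camera pipeline are my own illustration. A pretrained ImageNet classifier can name what is in a frame, yet it carries no knowledge of what an oven does to the things placed inside it:

```python
# A minimal sketch: map a camera frame to a single object label using a
# pretrained classifier. It names what it sees; it knows nothing of what
# the named object does, which is the gap discussed above.
import torch
from torchvision import models
from torchvision.models import ResNet50_Weights

weights = ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()  # matching resize/crop/normalize pipeline

def label_frame(image) -> str:
    """Return the classifier's best-guess label for a PIL image."""
    batch = preprocess(image).unsqueeze(0)   # add batch dimension
    with torch.no_grad():
        logits = model(batch)
    return weights.meta["categories"][logits.argmax().item()]
```

The mismatch is exactly the one described above: the label comes cheaply, while the common-sense model of how labeled objects behave does not.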

We and our robots will have different sensory worlds in our shared homes. The better we understand the worlds of our robot helpers, the better we will be able to hold realistic expectations of what they should be able to do, what we should delegate and entrust to them, and what aspects of our privacy we give up by having them around. If we build into them a reasonable model of the human sensory world, they will better anticipate what we can know and will do, and so smooth our interactions.

For the next couple of decades, at least, our robots will lack a sense of self, subjective experiences, episodic memory blended into those subjective experiences, or any semblance of consciousness of the sort we might grant that our dogs have. The first few generations of domestic robots (robot vacuum cleaners are the first generation) will be much more insect-like than mammal-like in their behavior and in their interactions with us. They will not be like us at all.

Perhaps as more people start researching how to imbue robots with subjective experiences and consciousness-like experiences, we will begin to establish a sense of empathy with our machines. Or perhaps we and our robots will always be more like an octopus and scuba diver: one is not completely comfortable in the other's world, not as agile and aware of the details of that world. The two beings are aware of each other, but never truly engage.