September 19, 2011
Will we recognize our robot overlords when we meet them?
The burst of light to my right made me pause: my photo had just been taken. Sure, the sign at the Microsoft Maker Faire tent said entering the area gave them permission to use my image, but I hadn’t quite expected anything so sudden. Or bright. I turned to find the paparazzo, and saw no one. And then, dropping my gaze some two feet, I met EDDIE for the first time.
We regarded each other—to the extent that we could—for a few seconds before EDDIE determined that there were more suitable subjects nearby. EDDIE rolled off to the center of a more photo-friendly group. Delighted, they mugged for the camera.
Maker Faire, a two-day event showcasing the best of DIY technologies spanning everything from sustainability to reimagining healthcare to bots, was the perfect venue for EDDIE to meet and engage with the public, particularly a public with a leaning toward innovation, design, and technology. EDDIE is a robot, after all—one that is grounded in Microsoft’s Robotics Developer Studio (RDS) and uses a Kinect-based platform to push the traditional boundaries of robotics and deliver practical applications to a broader audience. EDDIE, short for Expandable Development Discs for Innovation and Experimentation, provides a template for developers, sharing a standardized platform from which virtually everyone can build a robot. Essentially, EDDIE can wear many hats—and today, armed with a DSLR, EDDIE was playing event photographer.
A Challenge Defined
In 2006 Bill Gates asked us to imagine being present at the birth of a new industry, where the PC would allow us to experience and manipulate objects beyond our reach in the form of robots. In A Robot in Every Home (which is unfortunately behind the Scientific American paywall), Gates outlined the future of robotics, issuing a challenge that would form the Microsoft Robotics Team:
I can envision a future in which robotic devices will become a nearly ubiquitous part of our day-to-day lives. I believe that technologies such as distributed computing, voice and visual recognition, and wireless broadband connectivity will open the door to a new generation of autonomous devices that enable computers to perform tasks in the physical world on our behalf.
Gates believed that we would find ways to apply robotic technologies beyond industrialized labor, the entertainment industry, and military duties. He cited our long fascination with mechanics—from Greek and Roman metalwork to the fantastic imaginings of science fiction and popular culture—as an indication that we were receptive to robot partners who could conceivably perform mundane activities like mowing the lawn or helping with the housework. Gates believed that the robotic revolution—culminating in consumer-centric robot products—was well within reach and could happen fairly quickly if the right community could be mobilized.
What community? The hobbyists. The enthusiasts. The tinkerers. The makers.
Gates saw parallels between the computer industry in the 1970s and the then nascent robotic industry, which just a few years ago was characterized by high operating costs, extremely specialized and proprietary development, and painfully slow progress. Establishing development standards and inviting the creativity of the interested community helped propel computing forward, and its growth continues at a staggering pace. A similar community exists for robots; Gates believed that by providing its members with a shared set of tools, they could help each other and the industry would grow exponentially.
As the story goes, Gates formed a robotics team under the leadership of Tandy Trower, now of Hoaloha Robotics, which drew on the resources at hand to tackle two large challenges facing designers. Makers will always create what they need; when it comes to robots, that meant very few robots were alike. With different, often specialized operating systems built to tackle specific tasks in specific ways, a broken part could send a developer back to the beginning. Designers also struggled with concurrency: analyzing sensory data from multiple sources at once and producing the appropriate commands in response. Without it, a robot busy processing data about how fast it was traveling could roll right over the side of a cliff—leaving you short of a robot. The concurrency problem hampered the functionality and finesse of robots, and it would need to be managed if robots were to find a place in daily activities.
Microsoft’s Robotics Developer Studio answered with a CCR/DSS programming model, which addressed both issues. The CCR (Concurrency and Coordination Runtime) is a library that helps coordinate simultaneous actions: robots could calculate speed and track distance at once, minimizing the chance of a crash. Decentralized Software Services (DSS) help sync the separate processes so that these calculations happen more quickly—enhancing sensitivity and finesse. Now anyone with basic programming skills could write applications using this model, which created a standardized base from which developers could share ideas. What could the future hold?
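RDS itself is built on .NET, but the coordination idea can be sketched in plain Python: each sensor handler runs concurrently (loosely, a CCR task), and a coordinator merges their interleaved readings to decide each command (loosely, DSS syncing separate services). The sensor names, readings, and helper functions below are illustrative inventions, not part of the RDS API.

```python
import queue
import threading

def sensor(name, readings, out):
    # Each sensor pushes readings from its own thread,
    # standing in for a concurrently running CCR task.
    for value in readings:
        out.put((name, value))

def coordinate(out, n_readings):
    # Merge interleaved readings and pick a command after each one,
    # loosely like DSS keeping separate services in sync.
    state, commands = {}, []
    for _ in range(n_readings):
        name, value = out.get()
        state[name] = value
        if state.get("cliff") == "edge detected":
            commands.append("stop")  # safety overrides speed tracking
        else:
            commands.append("drive at " + state.get("speed", "0.0 m/s"))
    return commands

def run_demo():
    out = queue.Queue()
    threads = [
        threading.Thread(target=sensor, args=("speed", ["0.5 m/s", "0.6 m/s"], out)),
        threading.Thread(target=sensor, args=("cliff", ["clear", "edge detected"], out)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return coordinate(out, 4)

print(run_demo())
```

Whatever order the readings arrive in, the cliff warning wins once it appears—the point of coordinating concurrent streams rather than polling one sensor at a time.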
Calling All Makers
“He likes you!” Stathis Papaefstathiou, the general manager for Microsoft Robotics, announced with a hearty laugh as he clasped my hand in his and gave it a warm shake. Photo-EDDIE rounded on us, pausing to frame and shoot. EDDIE’s aim was a bit high, and only the top of my head showed up on the display tablet, but given the height discrepancy between me and Stathis, I could hardly blame the robot. EDDIE fired off another shot before being ushered away to be recharged, and Stathis and I sat down to chat.
We watched as another team member helped young visitors control a second EDDIE with airplane-like arm motions. The directions were simple enough: arms up moves you forward, arms down takes you backward, and tilting one arm up and the other down (airplane-style) turns you. The Kinect captures and interprets these motions to control the robot. It delighted young and old participants alike to no end.
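The gesture mapping the visitors were using can be sketched as a simple function of skeleton joint heights. This is a hedged illustration only: the thresholds, the coordinate convention (y grows upward), and which tilt turns which way are all assumptions, not the Kinect SDK or the demo's actual code.

```python
def gesture_command(left_hand_y, right_hand_y, shoulder_y, threshold=0.2):
    """Map arm positions to drive commands (illustrative sketch).

    Heights are in arbitrary units with y increasing upward; a hand
    counts as "up" or "down" only if it clears the threshold, so a
    neutral pose maps to "stop".
    """
    left_up = left_hand_y > shoulder_y + threshold
    right_up = right_hand_y > shoulder_y + threshold
    left_down = left_hand_y < shoulder_y - threshold
    right_down = right_hand_y < shoulder_y - threshold

    if left_up and right_up:
        return "forward"      # both arms raised
    if left_down and right_down:
        return "backward"     # both arms lowered
    if left_up and right_down:
        return "turn right"   # airplane tilt; direction mapping assumed
    if right_up and left_down:
        return "turn left"
    return "stop"             # neutral or ambiguous pose
```

A real implementation would read these joint positions from the Kinect skeleton stream each frame and smooth them over time; the deadband around the shoulder line is what keeps the robot from twitching at every small movement.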
Stathis’ excitement about EDDIE matched that of the spectators. He smiled at the wonder of the youngsters leaning from side to side to make the non-photo EDDIE turn in circles. “We’ve defined a hardware reference platform with Kinect,” he said. “When you take RDS and use EDDIE, everything just works. Here all you do is add your laptop, and it works. It’s seamless.”
EDDIE packages the CCR/DSS programming model, so the basics are already in place for would-be robot builders. And Stathis sees it as an important way of unleashing the creativity of robotics enthusiasts because it frees them to focus on the challenge of applications—of creating robots that are relevant to their contexts, or “scenarios” as Stathis prefers to call them. Designers have the opportunity to test ideas in the RDS simulator and tweak as needed before publishing an application to run on an actual robot. The possibilities seem endless in this environment.
Photo-EDDIE was built in three days from scratch using the RDS suite. His algorithm helps him find people he “likes,” and if he likes you, he takes your picture. How does he decide who he likes? Photo-EDDIE’s team member wasn’t telling—hardly a surprise, but some general hints would have been more in keeping with the Maker Faire atmosphere, particularly given the overall friendliness of the group. Well, we can guess: EDDIE uses a Kinect-based platform, so perhaps a facial recognition program? One coded for smiles? The world may never know.
The Revolution Is Increasingly DIY
Last year, the predominant topic at Maker Faire was the DIY boom and the impact it would have on industry. With EDDIE, another layer of DIY seems to have been breached—one that Microsoft is working hard to connect with, as evidenced by the Robotics @ Home Competition, which invites robot designs from the public for a chance to be funded for production.
“We think we are getting very close to an inflection point, where robotics will become more relevant to consumers,” said Stathis. “Scenarios will be important. The experiences will be important. Here we have a Kinect on wheels, but at home we’re mobile – we have smart phones, tablets – how do we utilize these devices to generate cool experiences?” How do we tie these devices to the robotics industry, and what will it mean for robotics?
Stathis believes that input from the community is important, but he cautioned that we’re still at the beginning of defining that community and helping members find each other. “At some point, cloud computing will play a role in robotics. How do we integrate that? The community might have the answer.” It was also apparent that the Robotics team enjoyed these interactions themselves. They broke down the mechanics in simple terms for younger visitors and didn’t shy away from harder questions from seasoned enthusiasts. They were actively engaged with the community they wanted to mobilize.
A Robot in Every Home?
A few weeks ago, I mentioned the tech-savvy Jetsons, whose world was completely wired, in a discussion about smart cities. They seem an appropriate example as we consider what it means to have robots in our home, or what we might still imagine that means. The Jetsons had food preparation robots, robots that cleaned, robots that helped them get dressed, and of course, Rosie, a fair representation of the sci-fi image of a human-based bot.
The anthropomorphic designs of Rosie and C-3PO created a sense of familiarity, connecting us with devices programmed to be our intimate companions. However, our relationship with technology has changed in the last few years. As we grow increasingly comfortable with mobile computing capabilities, our perception of what we can share with technology is also changing. Essentially, if we’re comfortable letting our smart devices track our friends and our locations, share our photos, help us bank, and find recipes for dinner, we’re moving beyond the need to disguise our mechanics in humanoid shapes.
In 2006, Gates cited data from the International Federation of Robotics (IFR) projecting the installation of 9 million personal-use robots by 2009. Updated information from IFR reports that 8.7 million units were sold that year alone! These robots include household robots (vacuum-cleaning and lawn-mowing robots) and entertainment and leisure robots (toys, hobby systems, and educational robots). The new projections suggest that by 2014, more than 11 million units will be sold.
EDDIE and his many hats are still a ways from covering the activities the Jetsonian robots did on a daily basis, but perhaps we aren’t too far off.
All photos by KDCosta.
Stay tuned for more Maker Faire highlights, including photos and videos of the day’s events. In the meantime, you might enjoy the following coverage of last year’s Maker Faire:
Secrets of the Universe: Past, Present, Future