
Do you know where you are? Your cell phone does






A team of Duke University researchers in Durham, N.C., is studying new ways to use the abundance of sensors in most smart phones (including the camera, accelerometer, microphone, GPS and Wi-Fi radio) to determine mobile users' precise locations and thereby deliver hyper-localized services. This could enable a business such as Starbucks to text-message a coupon to a person's phone as he or she enters the coffee shop, or it could allow Wal-Mart to send shoppers a list of sale items as soon as the store's doors slide open. Another option could be to provide blind mobile subscribers with information about where they are as they move from store to store within a mall.

The researchers argue in a paper presented today at the ACM MobiCom 2009 conference in Beijing that the increasing number of sensors on mobile phones presents new opportunities for logical localization, which is more useful to people than simply representing their location as a set of latitude and longitude coordinates.

To test their argument, the researchers wrote a software application called SurroundSense, which uses optical, acoustic, motion and other data captured by mobile phones to create a "fingerprint" of a given location, says lead researcher Romit Roy Choudhury, a Duke assistant professor of electrical and computer engineering. For the study, the researchers visited 51 different stores and restaurants and used their mobile phones to gather data about each place's sights, sounds and layout (the last inferred from users' movement, via the accelerometer) at various times of day. The phones sent this data back to the SurroundSense server, which compiled a database of location fingerprints.

Once a fingerprint was made for each place, the researchers sent students into the same stores and restaurants to see whether their mobile phones (communicating with SurroundSense) could correctly deduce their locations. "We achieved an average accuracy of over 85 percent when all sensors were employed for localization," Choudhury says.
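The paper details its own feature-extraction and matching pipeline; as a rough illustration of the general idea, the Python sketch below builds a toy fingerprint from summary statistics of sound, light and motion readings and matches a fresh reading to its nearest stored neighbor. The feature choices, function names and matching rule are all assumptions for illustration, not the researchers' actual method.

```python
# A minimal, hypothetical sketch of sensor fingerprinting in the spirit of
# SurroundSense. Features and matching rule are illustrative assumptions.
import math

def make_fingerprint(sound_samples, light_levels, accel_magnitudes):
    """Summarize raw sensor readings as a small feature vector."""
    def mean(xs):
        return sum(xs) / len(xs)
    def std(xs):
        m = mean(xs)
        return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))
    return [
        mean(sound_samples), std(sound_samples),  # ambient-sound profile
        mean(light_levels),  std(light_levels),   # lighting profile
        std(accel_magnitudes),                    # motion (browsing vs. queueing)
    ]

def match(query, database):
    """Return the stored place whose fingerprint is nearest to the query."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(database, key=lambda place: dist(query, database[place]))

# Toy usage: two stored places, one fresh reading.
db = {
    "coffee shop": make_fingerprint([0.6, 0.7, 0.65], [120, 130, 125], [0.1, 0.2, 0.15]),
    "bookstore":   make_fingerprint([0.2, 0.25, 0.2], [300, 310, 305], [0.4, 0.5, 0.45]),
}
reading = make_fingerprint([0.62, 0.68, 0.66], [122, 128, 126], [0.12, 0.18, 0.16])
print(match(reading, db))  # -> "coffee shop"
```

A real system would normalize the features to a common scale (here the raw light values would dominate the distance) and would likely prune candidates with coarser signals, such as Wi-Fi, before comparing fingerprints.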

SurroundSense alone might not work as a service that could be sold to mobile users, Choudhury acknowledges, but it could be combined with a global positioning service such as Microsoft's prototype GeoLife to enable location-specific advertising on cell phones, or locator services that convey information about a blind person's surroundings in real time. Scaling SurroundSense could also prove to be a significant challenge, given the volume of information flowing into the system from mobile users (whether from a single Wal-Mart location or an entire city).

Choudhury is working on several other mobile-phone projects as well, including the PhonePoint Pen application, which can capture a mobile phone user's hand movements, whether drawing a picture, jotting down numbers or writing a sentence. The software can translate numbers and capital letters written in English (adding other languages is on the team's to-do list) into a message that can be e-mailed and read on either a computer or a mobile phone. Pictures are saved as .jpg attachments that can be sent along with e-mails.
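The article doesn't describe the pen's internals, but a standard way to recover a trajectory from accelerometer readings is to integrate acceleration twice over time. The Python sketch below shows that bare idea under strong simplifying assumptions; gravity compensation, drift correction and character recognition, all of which a working system needs, are omitted.

```python
# A hypothetical sketch of the core idea behind an accelerometer "pen":
# double-integrate acceleration over time to recover a rough 2-D path.
def integrate_trajectory(accel_xy, dt):
    """accel_xy: list of (ax, ay) readings; dt: sampling interval in seconds."""
    vx = vy = x = y = 0.0
    path = [(0.0, 0.0)]
    for ax, ay in accel_xy:
        vx += ax * dt   # acceleration -> velocity
        vy += ay * dt
        x += vx * dt    # velocity -> position
        y += vy * dt
        path.append((x, y))
    return path

# Toy usage: a brief push to the right, then coasting.
print(integrate_trajectory([(1.0, 0.0), (1.0, 0.0), (0.0, 0.0)], dt=0.1))
```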
Image: ©iStockphoto.com/Josh Hodge

Larry Greenemeier is the associate editor of technology for Scientific American, covering a variety of tech-related topics, including biotech, computers, military tech, nanotech and robots.
