Google's Wireless Sensors: Big Data or Big Brother?
May 22, 2013
Perhaps the most intriguing—and potentially frightening—technology on display at Google’s recent I/O developers conference was a collection of networked wireless sensors that were deployed inside San Francisco’s Moscone Center.
Rather than just let attendees soak up the atmosphere at I/O, Google decided to measure, analyze and report on that atmosphere. It used 525 wireless devices that detected noise levels, humidity, temperature and other variables. The network, which comes out of Google’s Data Sensing Lab, was made up of cell-phone-sized circuit boards connected by a ZigBee wireless network managed by Etherios.
The network is an expansion of one designed by O'Reilly Media for its own shows focused on machine-to-machine (M2M) data networking.
The sensors are built on Arduino's open-source electronics platform, which supplies both the hardware designs and the open-source software used to network and control the devices.
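To make the setup concrete, here is a minimal sketch of what one such sensor node's reporting loop might look like. The node ID, field names and value ranges are illustrative assumptions, not details from Google's deployment, and the readings are simulated rather than taken from real hardware.

```python
import json
import random
import time


def read_sensors():
    """Simulate one polling cycle of a conference-floor sensor board.

    A real board would sample noise, humidity and temperature from
    attached hardware; random stand-in values are used here so the
    packaging logic can run anywhere.
    """
    return {
        "noise_db": round(random.uniform(40.0, 90.0), 1),
        "humidity_pct": round(random.uniform(20.0, 60.0), 1),
        "temperature_c": round(random.uniform(18.0, 28.0), 1),
    }


def package_reading(node_id, readings, timestamp=None):
    """Wrap raw readings in the JSON envelope a collector might expect."""
    return json.dumps({
        "node": node_id,
        "ts": timestamp if timestamp is not None else time.time(),
        "readings": readings,
    })


# One reading from a hypothetical node, with a fixed timestamp for clarity.
payload = package_reading("moscone-west-042", read_sensors(), timestamp=1369180800)
print(payload)
```

In practice each board would emit such payloads over the ZigBee mesh at a regular interval, with the collector side responsible for timest 	amp normalization and deduplication.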
The network feeds data in 4,000 continuous streams to the Google Cloud Platform, where the App Engine Datastore and Cloud Endpoints provide an entry point for the data, which is then processed using Google Compute Engine, analyzed using Google BigQuery and presented through an interactive Web application.
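The analysis stage of such a pipeline boils down to grouping raw events by sensor and metric and rolling them up, which is the kind of aggregation a BigQuery query over the sensor table would express in SQL. The following is a minimal in-memory sketch of that roll-up; the event shape and node names are hypothetical, not Google's actual schema.

```python
from collections import defaultdict
from statistics import mean


def aggregate_streams(events):
    """Group events by (node, metric) and average the values.

    Mirrors a simple GROUP BY roll-up: each event carries a node ID
    and a dict of metric readings; the result maps each (node, metric)
    pair to the mean of its observed values.
    """
    buckets = defaultdict(list)
    for event in events:
        for metric, value in event["readings"].items():
            buckets[(event["node"], metric)].append(value)
    return {key: round(mean(values), 2) for key, values in buckets.items()}


# A few illustrative events from two hypothetical nodes.
events = [
    {"node": "n1", "readings": {"temperature_c": 22.0, "noise_db": 70.0}},
    {"node": "n1", "readings": {"temperature_c": 24.0, "noise_db": 74.0}},
    {"node": "n2", "readings": {"temperature_c": 21.0}},
]
print(aggregate_streams(events))
# {('n1', 'temperature_c'): 23.0, ('n1', 'noise_db'): 72.0, ('n2', 'temperature_c'): 21.0}
```

At the scale of 4,000 continuous streams the same grouping would run as a SQL query against the warehouse rather than in application memory, but the shape of the computation is the same.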
To Google, the project was an exercise in data gathering and analysis--something "a bit different, kind of futuristic and maybe a little crazy," according to Michael Manoochehri, the Google Cloud Platform developer programs engineer in charge of it.
"We think about data problems all the time and this looked like an interesting big data challenge that we could try to solve," he wrote in a blog introducing the network.
But what’s interesting to Google may be worrying to others. Replace the atmospheric sensors with cameras and motion detectors and the same network could become a far more intrusive and detailed version of the closed-circuit security camera networks that already monitor most public buildings. The potential, for both good and bad, is in Google's relentless pursuit of every shred of data it can gather about its users and customers.
Google’s data sources already include dozens of services through which flow details of all the public and personal lives of millions of subscribers. Add to that a host of devices like those demonstrated by the Data Sensing Lab, and new projects like Google Glass, and suddenly Google is gathering reams of data in the physical world as well as online.
It’s possible that devices such as Google Glass will eventually become the standard way to interface with smartphones, computers, networks and each other.
And there are upsides to this approach. A system fully integrated with your knowledge and activities would be able to tell you the correct date, the correct pronunciation of a troublesome word, or translate in real time a conversation or text. It could use its camera and your contact list to put a name you can't remember to a face coming toward you through a crowd. It could show you turn-by-turn directions in an unobtrusive display, rather than make you glance away from the road at your smartphone.
That’s a compelling dream for the techno-centric. But having Google as the sole back end to all those personal interfaces is a nightmare for anyone who values privacy. Privacy has been a point of contention between Google and everyone else almost from Google's beginning; the ubiquity and personal intrusion possible with Google Glass and networks of remote wireless sensors exacerbate that contention.
"Obviously with any new technology there is the potential for misuse--and Google Glass is, of course, a little problematic when it comes to privacy," wrote TechRadar's James Rivington, in a review of Google Glass in April.
Google insists it is less interested in personal details than in anonymized trends that could identify new services customers want, new information they need, or new social or political movements whose members may not yet have realized they're part of a crowd.
Google developers scoff at privacy paranoia and insist neither they nor Google is interested in Big Brothering anyone. Most are probably telling the truth.
The amount of power and potential for abuse inherent in access to that much data is so obvious, however, that it has become a standard trope in action movies and TV cop shows.
By developing the networks and software to gather and make use of so much data, Google is advancing an inevitable change in the dynamic of human societies--from one in which most of our activities are unobserved by all but a few, to one in which all our activities, in both the physical and virtual worlds, are recorded and available online.
We haven't even begun to work out the attitudes and rules that will have to change to accommodate that level of visibility. It will take a decade even to make a start; we may never get all the rules down pat.
Google's Data Sensing Lab project in the Moscone Center demonstrates more than that systems exist to provide detailed, real-time monitoring of a physical environment. It also demonstrates that we're much closer than we thought to the time that someone could be watching over our shoulders no matter where we are or what we're doing.
If we don't create social and legal rules to establish how that knowledge can be used and by whom, the job will be done for us by default. Or by Google.
Kevin Fogarty is a freelance writer covering networking, security, virtualization, cloud computing, big data and IT innovation. His byline has appeared in The New York Times, The Boston Globe, CNN.com, CIO, Computerworld, Network World and other leading IT publications.