Citizen Sensors: Individuals’ Mobile Updates Contribute to a Bigger Picture

Citizen sensing applications range from public health to disaster relief.

In 1999, before the advent of Foursquare, mobile Twitter clients or sensor-enabled phones, a somewhat prescient Neil Gross wrote in BusinessWeek: “In the next century, planet earth will don an electronic skin. It will use the Internet as a scaffold to support and transmit its sensations.” Since then, a new area of research called “citizen sensing” has emerged that aims to derive collective knowledge from the actions and reactions of individuals armed with Internet-enabled mobile devices.

In the past ten years, we have seen the growth of online social networks, but there has been a parallel surge in sensor networks, many of which are also connected to the Internet. These usually consist of multiple fixed sensors that capture readings from their environment at programmed intervals. In addition, many people are now carrying some form of sensor-laden device (a mobile phone, a tablet, a fitness device) from which sensor readings can also be retrieved. This is sometimes called ‘human-in-the-loop sensing’, although sensors are also being carried by cars, animals and other moving entities.

There are various advantages to human-in-the-loop sensing. Building large networks of fixed sensors to collect data in large urban areas (for example, for environmental or traffic monitoring purposes) can be both expensive and time-consuming. Having people walk around with sensor-enabled devices makes sense, given the high population densities of urban areas and people’s willingness to contribute sensor data if it will have an eventual positive impact on their lives.

Five years ago, a team at UCLA wrote a research paper on ‘participatory sensing’, which uses “mobile devices to form interactive, participatory sensor networks that enable public and professional users to gather, analyze and share local knowledge.” Applications were described in the areas of public health, urban planning and even creative expression. In 2007, Michael Goodchild described citizens as sensors in the field of volunteered geography, writing of “[humans] equipped with some working subset of the five senses and with the intelligence to compile and interpret what they sense, and each free to rove the surface of the planet.”

More recently, Amit Sheth, a professor at Wright State University in Ohio, outlined the notion of ‘citizen sensing’, whereby people are “acting as sensors and sharing their observations and views using mobile devices and Web 2.0 services.” A citizen sensor network is “an interconnected network of people who actively observe, report, collect, analyze, and disseminate information via text, audio or video messages.” In particular, Sheth presented work in which semantic annotations were applied to Twitter microblog posts from ‘citizen sensors’ in order to provide situational awareness, e.g. during the 2008 Mumbai terrorist attacks.

If interpreted correctly, the data available from citizen sensor networks can have a wide variety of applications. Some of these include: earthquake sensing (people interested in acting as citizen seismologists can apply to Stanford for a tiny seismic sensor for their computer); disaster relief (there are various platforms available from Ushahidi for disaster response); traffic monitoring (Dr. Liam Kilmartin at NUI Galway is leading a project that uses mobile apps to monitor and reduce traffic congestion in Galway); and environmental data analysis (UC Berkeley and Intel provided personal air quality sensors to community members in California as part of their Common Sense project).

In a previous article (“What If Your Car Could Tweet?”), we briefly talked about how sensor readings could be attached to microblog posts through the Twitter Annotations extension. Twitter Annotations will allow arbitrary metadata to be attached to any tweet. There is an overall limit of 512 bytes for this metadata ‘payload’, and each metadata item is expressed in the form "type":{"attribute":"value"}, e.g. "movie":{"title":"Planet of the Apes"}. Inspired by Twitter Annotations, work is ongoing with David Crowley at DERI, NUI Galway to attach mobile sensor data to Social Web content, to develop mobile sensor-specific extensions to the SIOC de facto standard developed at DERI, and to build Android apps that use this data model. The next step is then to provide novel methods for interpreting and visualising the data for different domains.
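To make the payload format concrete, here is a minimal Python sketch of how sensor readings might be packed into that annotation form and checked against the 512-byte limit. The `build_annotations` helper and the ‘sensor’ type name are our own illustrative assumptions, not part of the Twitter Annotations specification:

```python
import json

MAX_PAYLOAD_BYTES = 512  # overall limit for the annotations payload

def build_annotations(sensor_readings):
    """Pack sensor readings as Twitter-Annotations-style metadata.

    Each item takes the form {"type": {"attribute": "value"}},
    e.g. {"sensor": {"temperature": "21.5"}}. The 'sensor' type
    name here is a hypothetical extension, not an official one.
    """
    annotations = [{"sensor": {name: str(value)}}
                   for name, value in sensor_readings.items()]
    payload = json.dumps(annotations)
    # Enforce the documented 512-byte limit on the serialised payload
    if len(payload.encode("utf-8")) > MAX_PAYLOAD_BYTES:
        raise ValueError("annotations payload exceeds 512 bytes")
    return payload

# Hypothetical readings from a sensor-enabled phone
payload = build_annotations({"temperature": 21.5, "air_pressure": 1013.25})
```

The byte check matters because the limit applies to the serialised payload as a whole, so a client would need to trim or prioritise readings before posting.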

At the moment, you can attach geolocation information to a tweet, and every tweet is timestamped, but what if you could also append temperatures, air pressures or other contextual readings? Combined with the text of the tweets themselves, this mix of human-contributed and machine-contributed data could potentially be very useful.
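As a sketch of what such an enriched tweet record might look like, the snippet below combines the pieces already available today (text, timestamp, geolocation) with the hypothetical sensor annotations discussed above. The `enrich_tweet` helper and its field names are assumptions for illustration only:

```python
from datetime import datetime, timezone

def enrich_tweet(text, lat, lon, readings):
    """Combine human-contributed text with machine-contributed context.

    Geolocation and timestamps are already supported by Twitter;
    the 'annotations' field sketches the extra sensor payload.
    """
    return {
        "text": text,  # human-contributed observation
        "created_at": datetime.now(timezone.utc).isoformat(),
        "geo": {"lat": lat, "lon": lon},
        "annotations": [{"sensor": {name: str(value)}}
                        for name, value in readings.items()],
    }

tweet = enrich_tweet("Smoky air near the park this morning",
                     53.29, -9.07, {"temperature": 14.2})
```

An application consuming many such records could then correlate what people say with what their devices measure, e.g. matching complaints about air quality against pressure and temperature readings in the same area.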

