Sensors, big data and code
Control your data
Three things you often come across these days are “sensors”, “big data” and “computer code”. Even if you have never heard these words, the objects and ideas they refer to are slowly starting to dominate your life.
Sensors have been around the longest. They date back to the pre-electronic age. A glass tube filled with mercury attached to the right scale allows us to measure temperature. Built into a thermostat, the same thermometer in a different shape effortlessly takes over decisions about when to switch heating and cooling on and off.
Yet it is not their electronic upgrade that is game changing. The magazine Popular Mechanics advertised an electronic thermometer in 1954. It is not even their omnipresence that makes sensors so important. Admittedly, there are so many of them around that it is hard to keep track of when we activate them. Some require a specific action as an input (think of buttons on keyboards, touch-screens, switches or card readers); others come hidden as microphones, speakers, cameras or even laser beams. Although some are now embedded in smart phones, automobiles, shoes and even glasses, most of the actual measuring devices have been around for decades.
It is, however, the combination of these familiar devices with faster computers and cheaper data storage that will become an essential part of your life. Imagine you measured a temperature once per minute and stored the readings on a hard drive. After a year this would generate roughly 5 megabytes of data. Storing these data in 1981 would have cost you roughly 1,100 Euro, or the price of a small family car at the time. In 2010 the cost of storing 5 megabytes of data had fallen to 0.04 eurocent, or 5% of a bumper sticker for a car. Similarly, calculating the average of these temperature data acquired over one year would have taken the IBM personal computer in 1980 about 5 seconds. In 2012 an iPhone would take about a millisecond, less time than light needs to travel from London to Brussels.
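The 5-megabyte figure above is easy to check for yourself. A minimal back-of-envelope sketch, assuming roughly 10 bytes on disk per stored reading (that byte count is our assumption, not a figure from the article):

```python
# Back-of-envelope check of the storage figure in the text:
# one temperature reading per minute, stored for a full year.
BYTES_PER_READING = 10  # hypothetical record size (timestamp + value)

readings_per_year = 60 * 24 * 365  # minutes in a year
bytes_per_year = readings_per_year * BYTES_PER_READING
megabytes_per_year = bytes_per_year / (1024 * 1024)

print(f"{readings_per_year} readings is about {megabytes_per_year:.1f} MB per year")
```

With that assumed record size, 525,600 readings come out at about 5 MB, matching the figure in the text.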
Cheaper storage and faster processing make it sensible to preserve every measurement a sensor makes. Instead of measuring temperature once per minute, it becomes affordable to measure every millisecond (1,000 times per second) and still compute averages across all those measurements.
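What "measure every millisecond and average" could look like in practice can be sketched in a few lines. This is an illustrative example, not part of the Pacco-test itself: it collapses 1,000 samples per second into one average value per second.

```python
def per_second_averages(samples, rate_hz=1000):
    """Collapse a flat list of sensor readings into one average per second.

    `samples` is assumed to be taken at `rate_hz` readings per second.
    """
    return [
        sum(samples[i:i + rate_hz]) / rate_hz
        for i in range(0, len(samples) - rate_hz + 1, rate_hz)
    ]

# Two seconds of fake temperature readings: 20.0 degrees, then 21.0 degrees.
samples = [20.0] * 1000 + [21.0] * 1000
print(per_second_averages(samples))  # -> [20.0, 21.0]
```

The same idea scales up: whether the interval is a second, a minute or a day is just a choice of `rate_hz` and slice size, which is exactly why cheap storage makes it attractive to keep every raw measurement.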
And not only our active inputs are stored; the data we generate without noticing is also attracting interest. How we surf the internet, what we buy and what we eat are all captured, stored and processed. These large stores of data, and the processing of them, are what is known as big data.
The communication between sensors, computer, memory and possible outputs happens in a specific language, called code. Smart phone and web based applications that consumers can build themselves (and, while at it, become super-rich) have pulled computer code out of the maths and engineering departments of universities into the realm of popular imagination. Newspapers are full of stories of five-year-olds who are more fluent in code than in their mother tongue. Although such stories might be slightly exaggerated, with the proliferation of sensors and the processing of big data, computer code will become increasingly important.
Progress has always been a two-sided story. No less so in this case. The data gathered about our consumer behaviour is used to spam us with unwanted advertisements, and the detailed storage of our habits in order to anticipate our future behaviour leads to a conformity that blanks out everything unusual. On the other hand, storing and predicting our energy and water consumption may help us reduce our environmental impact, and might even help us reduce the time we spend on boring household chores that machines can do for us (the reason why thermostats were invented in the first place).
What we can do is stay curious but critical. Use what helps us, block what bothers us. But how to choose when the options present themselves as highly technical or hidden in a black box?
City Mine(d) is currently opening black boxes. The physical ones: sensors, computers and outputs; but also the mental ones: unfolding the mysteries that surround computer code, big data and the science that interprets them.
How? By building a machine ourselves. From scratch. The Pacco-test aims at measuring the quality of surface water. Over the coming months a test-kit will be built, consisting of 5 probes, a computer, a screen and a printer, which will generate data made available both online and to interested scientists. Control your data, join the Pacco-test! The journey starts here.