Feb 4th (Mashable) – Sometimes, a system as sophisticated as Google Maps can be tricked in the simplest of ways.
This was recently demonstrated by artist Simon Weckert, who hauled 99 smartphones around in a handcart, generating a virtual traffic jam on Google Maps, which mistook the cluster of devices for a crowd of cars gathered at the same location.
In reality, as you can see in Weckert’s YouTube video below, the streets were quite empty, but they lit up red on Maps as he passed through them with his cart.
This may seem innocuous, but if you were driving near these locations, you’d likely be tempted to choose a different route to avoid the “traffic jam.”
Weckert, a Berlin-based artist, “seeks to assess the value of technology, not in terms of actual utility, but from the perspective of future generations. He wants to raise awareness of the privileged state in which people live within Western civilization and remind them of the obligations attached to this privilege,” according to his website.
The artist doesn’t give many technical details about the experiment, so we have to take his word that it actually worked. But Torrey Hoffman, a tech lead and engineering manager for Google Maps, said on Twitter that he believes this is possible to do.
Weckert’s experiment raises an interesting point. It is reminiscent of a Sybil attack, a well-known problem in computer science, in which an attacker creates a large number of fake identities to influence a network service. Two aspects of the experiment are particularly troubling: first, the attack is extremely simple to perform, and second, it can negatively affect real-world infrastructure such as traffic.
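To see why this kind of Sybil attack is so simple, consider a toy model of crowd-sourced congestion detection. The sketch below is purely illustrative and assumes a naive system that flags a road segment as congested once enough distinct devices report from it; Google’s real pipeline is far more sophisticated (weighing speeds, history, and other signals), and the function and segment names here are hypothetical.

```python
from collections import Counter

def congestion_level(pings, threshold=30):
    """Toy congestion estimate: count distinct device reports per road
    segment and flag a segment as 'jammed' once the count crosses a
    threshold. A hypothetical model, not Google's actual algorithm."""
    counts = Counter(segment for device_id, segment in pings)
    return {seg: ("jammed" if n >= threshold else "clear")
            for seg, n in counts.items()}

# A Sybil-style spoof: 99 distinct device IDs, all reporting from the
# same (actually empty) street segment -- like 99 phones in a handcart.
fake_pings = [(f"phone-{i}", "empty-street") for i in range(99)]
real_pings = [("car-1", "busy-street"), ("car-2", "busy-street")]

print(congestion_level(fake_pings + real_pings))
```

Because the system has no way to tell 99 phones in a cart from 99 cars, the spoofed segment crosses the threshold while the street with two real cars reads as clear.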
In his post describing the experiment, Weckert touches on the latter point. “Maps, which themselves are the product of a combination of states of knowledge and states of power, have an inscribed power dispositive. Google’s simulation-based map and world models determine the actuality and perception of physical spaces and the development of action models,” he writes.
I’ve asked Google whether this type of attack is possible, and what it plans to do to prevent it in the future, and I will update the article when I hear back.