I’m not an urbanist, however that’s defined. I don’t even play one on TV. But I am curious about how technologies define the space we inhabit. With the Internet and mobile, the effect felt more subtractive or redistributive than redirective: physical stores on high streets and in malls disappearing, in some cases replaced by reconfigured supply chains.
But in the era of physical computing, at whose beginning we now stand, the use of space is being reauthored: some space is being erased, other space invented. Decades of fights over rights of way, zoning and, more recently, air rights will soon give way to property and personal rights in digital spaces created out of pixels and map coordinates. The single two-dimensional plane of groundspace is becoming infinite planes, to be developed and owned.
As I’ve pointed out before, the scramble to develop and expand the logistics services on which connected populations increasingly rely (Amazon, Uber, and I’d count Google here too, through the dominance of Maps) and the boom in imaging technologies like lidar have been a prelude, not an endgame.
The Gibson quote from BURNING CHROME that I’ve appropriated for this edition, “The street finds its own uses for things,” was a brilliant distillation of the importance of human appropriation of technologies for unanticipated uses. It is as relevant today as it was when Gibson dropped it into the public mind in 1982, a moment when the idea of personal technology in the digital sense was still relatively new. But given where we are, and where we’ll shortly be, in machine learning and vision, autonomous mobility, and the reperception and new monetization of space, I wonder if the phrase shouldn’t be inverted.
Things, from autonomous vehicles for transport, delivery and maintenance, to aerial transport, mapping and surveillance, to digital objects and phenomena, increasingly see space, and will use space, in ways that deviate from, and are independent of, human rationalization of that same space. I include the NVIDIA blog post below for that very reason. Until quite recently, most self-driving car research has been constructed around infrastructure, rules, and signals designed for human use. We train cars to drive as if a human were driving them, recognizing roads as we recognize them: “see what I would see, do what I would do” (this is massively simplified, but you get the point). New training techniques being developed by NVIDIA and others allow vehicle systems to understand and interpret roads independently of the systems, uses and markings developed for humans. In short, the vehicle is developing a different understanding of “road.”
As we progress to a point where fewer people are needed to pilot vehicles, and more roads become “robot readable,” we will inevitably see new uses found for roads, and road infrastructure changing to optimize for machine, not human, legibility and use. Roads shape our social and commercial environments much as rivers and rails did before them, so streets, buildings, towns and cities will gradually reshape to reflect machine uses.
Think about how machine optimization will affect interior space as well. In my opinion, the sameness of the international Airbnb standard of interior living space, described below by friend Igor Schwarzmann, will intensify as Airbnb’s Aerosolve works its algorithmic way on price optimization and the characteristics of desirable and profitable living space are further quantified. Difference is inefficient, and can be found more readily in local knowledge and cataloged experience than in habitation packaging and services. And besides, how will the automated cleaners find their way around a million different floorplans? :)
Anyway, these are some Saturday afternoon thoughts about how space may change around us as we increasingly share it with silicon. Now to enjoy a ride down a local forest path before it’s fully mapped and cataloged.