Quick, random thoughts tonight on Google (and a between-the-lines love letter to Guy Debord):
Reflecting a bit on “Don’t Be Evil,” Google’s famous old “informal corporate motto,” one is tempted at first to laud it as a pithy, hip commitment to integrity, transparency and corporate citizenship.
Yet, anyone who has given the thing an ounce of thought is also tempted to wonder why the lady doth protest so much. In other words, why go on about evil unless you’re doing it? Well, there are plenty who think Google knows too much about us, and Google probably does. But I don’t think they were ever necessarily evil or interested in becoming evil.
Instead, I think they’re extremely smart and generally know what they’re doing. For one, they know they’re always on the edge of any number of minor Faustian bargains. More than that, I believe Brin and Page saw the psycho-cultural-epistemological magnitude of their search engine project pretty early on. I think the “Don’t Be Evil” thing was a rather genuine ethical response. For dramatic purposes, I imagine a moment when they saw the raw corrupting power of their creation, looked up at each other, had an oh-shit moment, then nervously scrawled the motto on a conference room whiteboard, where none dared erase it.
I know it probably didn’t happen that way, but hey, why not imagine it?
Anyway, reverse-engineering that “Don’t Be Evil” scrawl, here’s what I think Brin and Page saw:
1) The web is nothing more than the first iteration of a future world that is pure datasphere. We will all live in that world. It will shape us. It will teach us. It will define what is possible and what is not. It’s already happening, but it’s just begun.
2) PageRank and algorithms like it will be primary forces of nature that will do nothing less than shape and define the world we live in. And who we are, what we can be. It’s already happening, but it’s just begun.
3) Specifically, the search engine sculpts the psychogeography of the datasphere in which we all live. By favoring some data and starving out other data, Google and things like Google passively and actively delimit what data exists in the world, according to their own logic and judgment, and thereby (I repeat) define the world we live in, shape what we can think and who we can be.
4) And thus, Google is nothing less than a kind of demiurge cartographer of a living world of data: they are mapping (and through mapping, creating) our digital world for us. They are creating/mapping our intellectual, social and cultural possibilities, no less, and it’s no surprise that they long ago set about to map (or re-map) the physical world too. It’s all the same project of psychogeographic engineering.
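An aside for the curious: the mechanism underneath all this world-shaping is surprisingly compact. Here’s a minimal sketch of the PageRank idea as originally published by Brin and Page (power iteration with a damping factor on a toy link graph); the graph, names, and the 0.85 damping value are illustrative, not Google’s actual production system:

```python
# Minimal PageRank sketch: power iteration with damping.
# The link graph below is a toy example, not real data.

def pagerank(links, damping=0.85, iterations=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with rank spread evenly
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Rank flowing into p: each page q that links to p
            # passes along an equal share of its own rank.
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links[q])
            # Every page also gets a small baseline share (the "random surfer").
            new_rank[p] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

# Toy web: three pages linking to one another.
toy_web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
ranks = pagerank(toy_web)
# "c" collects links from both "a" and "b", so it ends up ranked highest.
```

The point of showing it at all: a few lines of arithmetic, iterated over the whole web, is the “force of nature” in question. Which data gets favored and which gets starved falls out of this loop.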
So that’s the beautiful terror in the algorithm: world-creator power of many magnitudes.
Any distrust we might have of Google, then, comes from our intuitive sense of that power. The Chinese government exercises a similar power over its citizens, perhaps. That’s the closest analogy I can think of. But I think in Google’s case, to this point, it amounts more to a tremendous ability to be evil than to actually being evil. Either way, that’s how big the thing is that Google holds in its hands. If you don’t believe me, just consider the sway Google has over internet marketers. There is something akin to deep moral import in every tweak of PageRank, and the faithful tremble. Just read the SEO blogs each time PR changes—those folks rattle about like ancient oracles attempting to extract an omen from an animal carcass.
So there it is. Maybe it’s all obvious to you. Or maybe I’m wrong.
Regardless, I always liked the ideas of the Situationist movement, particularly the idea of the dérive: an aimless wandering through the city that subverts its intended psychogeographical discourse, that allows you to think and be in ways that aren’t programmed by the structures around you.
And I wonder what a datasphere equivalent of the dérive would look like.