Where did this culture of disruption first emerge? As we’ve explored previously in our series, the origin of Silicon Valley’s zeitgeist goes back to the 1960s and the rise of the counterculture. The key similarity between then and today is a persistent anti-establishment and anti-elitist attitude. But while the historical roots of disruption were a logical outgrowth of the Vietnam War and the Jim Crow era, one wonders whether anti-establishment fetishism has now gone too far.
Disruption is like a weapon, and you can’t always know which way it points. Maybe that wouldn’t be so dangerous if it were forged in an ideology that promoted greater freedom and equality for all Americans. And maybe once upon a time such idealism did fuel anti-establishment action. The concern, though, is that disruption now seems driven not by ideology but by the market. It threatens the institutions whose dismantling will yield the most profit, not those that stand in the way of a more just and free society. Moreover, the information provided “for free” by Silicon Valley’s gatekeepers comes at a price.
When it comes to the information ecosystem and the whole notion that “information wants to be free,” we shouldn’t forget that Stewart Brand had more to say on the topic during his famous exchange with Steve Wozniak at the first Hackers Conference. Here is his original quote in its entirety:
“On the one hand you have — the point you’re making Woz — is that information sort of wants to be expensive because it is so valuable — the right information in the right place just changes your life. On the other hand, information almost wants to be free because the costs of getting it out is getting lower and lower all of the time. So you have these two things fighting against each other.”
Information – and maybe more importantly, data – does not actually “want” anything. But tech firms have taken Brand’s quote and run with the idea that online information should be free – or at least that it should come at the cost of our personal data. Along the way, Google and Facebook have played a growing role in reshaping our entire information ecosystem. Their algorithms make decisions for us, but with no transparency about the criteria used. And while it’s true that, at least for now, humans do train the algorithms, these are largely humans who just write code. Now more than ever, we need people involved who acknowledge their responsibility for framing and presenting different perspectives, for policing bias and veracity, and for weighing ethical dilemmas. In a complex world, we need information that enables us to make our own informed choices.
So one of the lingering questions is whether the institutions that enable us to make our informed choices – even when the process is messy and inefficient – can be rebuilt once they’ve been destroyed.