
The Web We Lost

As I reflect on all the things that have happened over the last few years, specifically in the realm of technology as it pertains to human society, I can't help but wonder how I would describe the world we've built to the 12-year-old version of myself, who was first discovering BBSes and the Internet.

We lost the sense that an open, efficient, safe communications network is a Commons for every person. There were actual online communities that thrived on the basis of humans building trust with each other. One could stumble into them, learn their values and rules, and become enriched. We could actually learn to build trust with total strangers.

The "End-to-end" nature of the internet got replaced. It's not actually a network anymore, it's just a centralized place for networked apps.

We've been disabused of the naive and mistaken notion that the Internet is resilient. Geeks used to proudly proclaim, "The Internet views censorship as damage and routes around it!" Sadly, they were incorrect, and to some extent it was never true. The Internet can only route around broken IP links. It cannot distinguish the source of the breakage: carefully crafted rules in Cisco routers at the Great Firewall of China, or a careless backhoe driver. The Internet cannot route around censorship because the concept of "censorship" does not exist at the Internet level.

Censorship is a concept in the plane of human relationships. That plane is inhabited by individuals and by organizations that aggregate, curate, and amplify communications. Because the modulation of human communications is a kind of power, the communication plane is a battle space for other human organizations: churches, corporations, governments.

We currently use an infrastructure that is both centralized and vertically integrated: just a handful of companies control everything from the chips and memory, through the operating system, up to the browser standards. Every pixel on every screen becomes an explicitly managed, tiny television set, and every interaction and non-interaction becomes a surveillance dataset. We are rapidly becoming a society where all significant interactions between individuals are mediated by a private infrastructure that is up for sale to the highest bidder.


Walter Isaacson writes about a couple of the core tech issues:

"There is a bug in its original design that at first seemed like a feature but has gradually, and now rapidly, been exploited by hackers and trolls and malevolent actors: its packets are encoded with the address of their destination but not of their authentic origin. With a circuit-switched network, you can track or trace back the origins of the information, but that’s not true with the packet-switched design of the internet.

Compounding this was the architecture that Tim Berners-Lee and the inventors of the early browsers created for the World Wide Web. It brilliantly allowed the whole of the earth’s computers to be webbed together and navigated through hyperlinks. But the links were one-way. You knew where the links took you. But if you had a webpage or piece of content, you didn’t exactly know who was linking to you or coming to use your content.

This has poisoned civil discourse, enabled hacking, permitted cyberbullying, and made email a risk. Its inherent lack of security has allowed Russian actors to screw with our democratic process."

Isaacson’s ideas are not far off the mark, but they are not sufficient, because they do not consider the social impact of the communications infrastructure, only the security and provenance of single interactions. Even if we were to fix these specific technical shortcomings, we would still end up in a situation where a site that satisfies humans’ desire for socialization (sharing, group chat) gains a user community, and if certain design constraints are not respected, the same kinds of interaction anti-patterns will manifest.

More deeply, cyberbullying is only one aspect of the overall problem. The aggregation of human attention into centralized infrastructure creates a situation that is ripe for exploitation, manipulation, and mass control. It condenses the fluidity of social communications into a more interlocked, crystalline form that can be used to amplify and broadcast messages. Goebbels understood the potential of newspaper and radio to perform this function, and he exploited it. Putin's Russia has done the same.


There's a famous science fiction story from the 1950s that compared humanity's development of atomic weapons to giving a child a loaded gun. [[LINK]] It asked the moral question of whether we are ready for the technology we have created, and of the terrible consequences that may await us.

Social media and the evolution of Internet-as-medium-of-social-validation is potentially a similarly disastrous invention. But rather than vaporizing cities in a mushroom cloud, it takes root in everyone's brain and lets them hear only what they want to hear.

Other terms have been used for this, such as "information diabetes", but they focus on the quantity of electronic content people consume rather than its actual nature. We shouldn't be too alarmed if people are just loading up on Plato and Shakespeare. Instead, what they are consuming is a viral stream of clickbait designed to appeal to the worst in each of us.

This is the most difficult thing to acknowledge: Technologists are culpable here. The creators of software systems, network protocols, application interaction patterns, monetization schemes, etc. are all directly or indirectly responsible.

And if we are to go one step further, and fully embrace Voltaire's maxim that "Man is guilty of all the good he does not do", then the rest of us who could build alternative, open networks that promote positive human engagement are guilty of not doing so.


As an optimist and an idealist, I would like to think that this is because most of us haven't realized just how fundamental a problem this is. We laugh over beers about the crazy people we know on Facebook who post insanely right-wing and bigoted things, not realizing that those people are basically eating such hatred and venom for breakfast, lunch, and dinner. The social impact of that information diet is real. We have now discovered that if you take a population the size of the United States and link up all of its ignorance, fear, hatred, and desperation on a computer network, the resonant frequency of that nasty medium is a shrill tone we can define as "Trump".

There's a recent story about alt-right people creating a conspiracy theory that antifa was planning mass decapitations of white parents. We have created broadcast platforms that can be targeted at selected groups of the population, and we give anyone the ability to broadcast on them without recourse. When someone falsely yells "Fire!" in a crowded theater, we hold them culpable. But when someone targets millions of right-wing racists, riles them up with false conspiracy theories, and then organizes them to go out in armed protest, we shrug our shoulders.

Because after all, what technical fix or legal solution can we point to that structurally solves this problem without violating the business rights of companies like Facebook and Twitter to attract users to their websites and facilitate communications among them?

The answer lies in Lawrence Lessig's brilliant observation that when it comes to technology, Code Is Law. We cannot simply legislate things like fact-checking or try to create slow-moving government commissions that dictate the features of private web sites. Even if such things were legally feasible, they would never be effective. We have to solve this problem via code, by building a better alternative mechanism that cannot be exploited in this way.

My belief is that by constructing a system of communications that is architected and engineered to facilitate real human trust and conviviality, our users will be able to achieve a depth of connection and experience that is simply not possible with centralized attention farms.

In order to build the right system, we have to focus on solving the right problems, and understand them at their most fundamental level.

Further Reading

  • From a post about a model (Emily R.) getting nasty responses on Twitter after tweeting that she supports Bernie:

    "You may be wondering what the significance of all this is, but there is none. Social media is often a toxic place, even more so for women. The companies running these websites do little or nothing to change the environment, and we continue to use their products in spite of that. So it goes."

    The makers of tools of communication must recognize their power and take responsibility for the culture that emerges within the fabric of their tool. This is especially true for tools that facilitate communication among the mass of people, who are typically not excellent at self-reflection and empathy.

  • Back in 2012 Anil Dash wrote a piece called "The Web We Lost", but he could not see what the advertising-based business models would do to these "great sites" (as he calls them). His main point was:

    The primary fallacy that underpins many of their mistakes is that user flexibility and control necessarily lead to a user experience complexity that hurts growth. And the second, more grave fallacy, is the thinking that exerting extreme control over users is the best way to maximize the profitability and sustainability of their networks.

    The first step to disabusing them of this notion is for the people creating the next generation of social applications to learn a little bit of history, to know your shit, whether that's about Twitter's business model or Google's social features or anything else. We have to know what's been tried and failed, what good ideas were simply ahead of their time, and what opportunities have been lost in the current generation of dominant social networks.