In the last entry I looked at how the concept of ‘disruption’ is used to obscure the unequal distribution of the benefits and harms of technological change. To follow it up, I’d like to explore the way that technological development is also a deeply personal and political process - one where responsibility can be selectively embraced or abdicated depending on the product’s wider social effects.
Tech: It’s personal
Whilst Forest may claim to only be observing, it is clear that he is acting, and his primary motivation is a highly personal one - the death of his wife and his daughter, Amaya. The Devs project is, first and foremost, an instrumental, tech-driven ‘solution’ to death. By perfectly simulating the past he hopes in some way to bring his family back. Whenever left alone in the Devs lab, Forest can be found replaying simulations of his daughter’s life. With Forest and others, such as his Devs engineer Stewart, fraying the boundaries of reality by suggesting that the simulation is equivalent to reality itself, the Devs project presents an opportunity to recreate the family he has lost. Forest’s insistence that there is only one tramline, one world, is driven by this very personal need to know that the simulation of his daughter is his daughter - not a hair out of place. Not only the goals of the project but its fundamental design are deeply personally motivated. The Devs project is not an objectively rational next step in technological evolution. It is entirely driven by Forest’s own trauma and personal needs.
In the flashback scene depicting their deaths, we see Forest standing outside his house speaking on the phone to his wife who is driving their daughter home. As they mundanely argue about dinner she crosses an intersection and is hit by another car, killing them both. Later on, we are shown the same scene, but from the many-worlds perspective, layer upon layer of potential outcomes where one or other of the cars was slightly faster, slightly slower, swerved differently, resulting in a street covered in different outcomes of the same event.
As the show depicts, a multi-world model could be comforting for Forest - out there somehow in a different timeline, his family are alive, another Forest lives a different life with his family. They are not gone, they are just elsewhere. However, a multi-world model would also mean that because of his choices, he has ended up in the timeline where they are not.
Forest’s obsession with the tramline deterministic model is about denying that possibility, denying any responsibility, no matter how tenuous, for his family’s death. Whether it was because Forest distracted his wife with the phone call, or because of something seemingly indirect, minute and arbitrary, under many-worlds Forest made a choice that resulted in their deaths.
For Forest, his absolution can only come through predetermination. If their deaths, his role, those choices and those consequences were pre-ordained the moment the universe kicked into action then Forest is not responsible.
The purported apolitical nature of technology de-personalises it. If all technology is an inevitability, and it is developed from an apolitical mindset of rational action, it is politically neutral, its design and the motivations behind it pure. Indeed it is a common view that technology is inherently neutral, that politics only come in through human use.
Technologies have politics, if not in the figurative nuts and bolts of their design¹, at least in the opportunity and decision to develop them, in the inequity of whose voice counts when it is being produced. The politics, whether acknowledged or not, are there in the limited experiences of the decision makers who are unable to imagine that others may wish to, or have to, live differently.
The closest we get to recognising that new technologies may be brought about by fallible humans is a process that simultaneously erases those flaws, lionising heroic individuals for their unique qualities and brilliance in bringing about transformative tech. Only through the innate qualities of these unique, distinctive geniuses could we have achieved such technological heights - but of course nothing about them would in any way shape the design, deployment or outcome. That was always an inevitability. In this framing, tech culture is not responsible for the consequences of the products it develops, but it can take credit for pushing us forward nonetheless.
Agency without responsibility
This abdication of personal agency or responsibility comes in a key scene where Katie and Lyndon are standing on a dam talking about the Devs project. By this point in the series the system can project forward as well as backward in time, allowing Katie and Forest to see what will happen days into the future. Through hints in the dialogue between Forest and Katie, we get the sense that they have played out the next few days repeatedly, and are now experiencing life on the tramlines, all their actions and words performed as rehearsed based on what they’ve seen. Lyndon questions why Katie, if she has already watched this moment, continues to perform it: ‘why ask a question when you’ve already heard the answer?’.
Katie admits that Forest’s single-world interpretation is wrong. The only reason they have been able to accelerate the project to this point is that Lyndon’s changes, which relied on a many-worlds interpretation, reflected the truth rather than Forest’s truth.
Katie tells Lyndon that at this moment she has seen him climb over the railing and stand precariously on the edge of the dam. She frames it as a demonstration of faith in the many-worlds model: whatever happens in that moment, there will be worlds where he falls and worlds where he doesn’t, and he will only remain conscious in the worlds where he survives.
When Lyndon asks whether he would climb over if Katie hadn’t told him that he does, she dismisses him. “I did tell you”, she says, closing down alternate framings of the situation. Katie is both recognising her role in the moment and abdicating responsibility for it. Events would not proceed if she did not act, but when she does, she is absolved of the consequences that result.
Lyndon asks if she saw him fall and Katie dodges the question, instead saying that in all the times she’s watched this scenario play out, she never answers that question, because answering would negate the act of faith. Lyndon is then told that this is the moment he climbs over, balances on the edge and tests his faith. As he does, we see a layered montage of all the slight variations of this moment that Katie has already seen, and in every one of them, Lyndon falls.
Whilst Katie knew the consequences of her performance, she continued to pursue it. The technology had predetermined the outcome, and believing it to be objectively true, Katie brought about that outcome whilst remaining, in her mind, not responsible for it.
When Steven Levy wrote his book Hackers: Heroes of the Computer Revolution (1984), he laid out a number of principles common within tech culture that he called the ‘Hacker Ethic’. One of these principles was that “information should be free”. In the same year the book was published, the first Hackers Conference was organised by Stewart Brand, of Whole Earth Catalog fame, and Kevin Kelly, founding executive editor of Wired. In an address to the conference Brand invoked Levy’s principle when he talked about the tension between information being both expensive and free - information can be incredibly valuable, and yet it is becoming cheaper and cheaper to distribute.

Notwithstanding Brand’s potential role in redirecting the assembled hackers’ idealism towards a more capital-friendly approach to information, what is critical here is Brand’s move to essentialise these qualities of information. ‘Information’, he said, ‘wants to be expensive, because it’s so valuable’, but ‘[o]n the other hand, information wants to be free…’. This subtle reframing was, I’m sure, unintentional, but it is illustrative of the underlying assumptions at play amongst cyberculture’s greatest proponents. However the capitalisation of information played out, tech culture would be there to profit and to bring forth human progress, but ultimately the consequences were inherent to information itself. This is no longer a normative political claim that information ‘should’ be a certain way, but an essentialist, apolitical claim that information ‘is’ a certain way. Luckily enough, the way information ‘is’ just happened to align very well with the motivations of those looking to capitalise on it.
It is implied that, in Katie’s view, she was performing her predetermined role. Whilst there were subtle variations in the Devs predictions, she had never seen herself tell Lyndon the fall would be deadly, never seen herself fail to instigate the fall, and never seen Lyndon survive it. From Katie’s perspective she was not responsible for the outcome because it was so tightly pre-ordained. If, however, we deny the predetermination and instead place the agency back with Katie, we can reframe the Devs simulations not as a statement of what must happen, but as a projection of what does happen. In this moment it is not that Katie must tell Lyndon to climb over the rails, but that she always chooses to, because getting rid of Lyndon is in her and Forest’s interest.
Technological determinism, the strategic relinquishing of some degree of agency, allows tech as a culture to pursue its own interests whilst selectively claiming responsibility for the consequences. The political, ideological and personal motivations, the traumas, assumptions and structural inequities that contribute to the development of technology within tech culture can be hidden away by a determinist narrative. Critically, this determinist narrative also means that whatever technology costs the wider population - whatever it costs the rest of us when the technologies being developed coincidentally align with the interests of tech culture - is reframed as simply the cost of progress, not a choice wrought on the rest of us for their benefit.
In the final piece of this series, via the final episode of Devs, I’ll look at the costs of progress as defined by tech culture - beyond the practical and material, and towards what it costs us through a process of dehumanisation, a reframing of life for the benefit of tech.
¹ Though the idea that politics is ingrained into material design choices is a growing consensus within recent Science and Technology Studies work. I once wrote a piece on the role of software design in shaping people’s interpretation of texts - because that’s how I get wild on Friday nights.