On Indirection & the Implications of Moral Profiteering

Sun, 29 Mar 2020 18:32:56 GMT

Originally published on March 22, 2017

Indirection and abstraction are the tools which blur corporate accountability and provide corporations and their workers with the plausible deniability required to equivocate and to violate social and ethical contracts.

Code (i.e. a process abstracted and codified, whether as software or hardware) is one such form of indirection, as Lawrence Lessig noted in his claim that "code is law". The suggestion is this: automation takes a responsibility (e.g. advertising) traditionally carried out by a person -- the case that present-day law predominantly considers and governs -- and indirectly shifts that responsibility (and, as a corollary, blame) away from the programmer to an unthinking counterpart which just does what it's "told". In this way, corporations can freely optimize processes beyond the limits of human bandwidth to achieve greater scale (e.g. reach a greater volume of people, conduct business faster). An important trade-off (read: liability) of shifting responsibility from people to technology is that when an unplanned-for edge case arises, the automated program must either make a potentially difficult decision (in a way which may not be informed by social contract) or malfunction. In either case, this indirection affords latitude to the programmer, who can't reasonably be held responsible for the outcome -- they may have behaved entirely differently had they intervened directly. Within these decisions lies the "law" of which Lawrence Lessig speaks. If a self-driving car must crash and hit either a granny or a toddler, which should it choose?
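To make Lessig's point concrete, here is a toy sketch. Everything in it is hypothetical (the function, the rule, the names are mine, not drawn from any real autonomous-vehicle system), but it shows where the "law" lives: in a decision the programmer committed to code long before any particular crash occurred.

```python
# Purely illustrative: a hypothetical policy a programmer might freeze
# into code, enacting a moral judgment ahead of any real-world event.
from dataclasses import dataclass


@dataclass
class Pedestrian:
    label: str
    age: int


def choose_collision_target(candidates):
    """When a crash is unavoidable, pick whom to hit.

    The sort key below IS the policy: a moral judgment made once, at
    programming time, then applied automatically to every future crash.
    """
    # Hypothetical rule: swerve toward the older pedestrian.
    return max(candidates, key=lambda p: p.age)


granny = Pedestrian("granny", 80)
toddler = Pedestrian("toddler", 2)
victim = choose_collision_target([granny, toddler])
print(victim.label)  # the hypothetical rule above selects the granny
```

The point is not the particular rule, which is arbitrary here, but that some rule must exist in the code, chosen by a programmer who will never face the situation directly.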

This scenario raises another interesting question, spurred by the reality that the legislative concerns of our government (with its checks and balances) almost necessarily lag behind those of fast-moving corporations and their incentives. As a result, there is often misalignment between the focus of government legislation and the multitude of needs and necessary exception handling brought on by the rise of technology. If the United States of America had to wait for the government to answer the question posed above about self-driving cars, and the multitude of questions like it which emerge alongside automation, it would come at the expense of our competitive advantage. And so, to a great extent, technologists and their managers (being in the fastest driver's seat) enact policy every day, unchecked, often without sufficient due diligence or accountability to protect the people's interests.

The challenge is complicated further by the fact that technological methods are advancing faster than our (anyone's) ability to legislate. Deep learning can evolve solutions whose structure lacks transparency, making it hard (sometimes impossible) to know how a system will act when a certain edge case emerges. I view this problem as more imminent and likely than a "Singularity" (i.e. artificial intelligence surpassing our cognitive intelligence and imprisoning us), and one that could lead to chaotic economic fluctuation and distribution of wealth.

The solution to these challenges is left as an exercise to the reader. It starts with spending time to think about and acknowledge the problem: that corporations, their hierarchies, and their practices are becoming increasingly separated, by way of indirection and abstraction, from the problems they are solving. The challenge is difficult to generalize across industries, and there is no Hippocratic oath for programmers and managers.


These opinions were shaped by the following books:
https://openlibrary.org/books/OL21399542M/Moral_mazes Moral Mazes by Robert Jackall
https://openlibrary.org/books/OL3947191M/Understanding_power Understanding Power by Noam Chomsky
https://openlibrary.org/works/OL16801714W/Who_Owns_the_Future Who Owns the Future? by Jaron Lanier
https://openlibrary.org/works/OL6037025W/Code Code by Lawrence Lessig

Tags: social contract, legislate, Lawrence Lessig, ethical contract, Hippocratic oath, Who Owns the Future?, corporate accountability, Jaron Lanier, abstraction, Indirection, plausible deniability, artificial intelligence surpassing our cognitive intelligence and imprisoning us, United States of America, Moral Mazes, Singularity, corporations, liability, Deep Learning, due diligence, code is law, Understanding Power by Noam Chomsky, trade-off