Virtuous Tools are Tangled Hierarchies

When you have a hammer...

Some time ago, my friend Otto gifted me a patent-pending state-of-the-art Smart Hammer. It was a product of his own design.

I asked the reasonable question. "Why do I need a Smart Hammer?"

"Well," Otto replied "Regular hammers are pretty dumb. They're so hard to aim! And it's so easy to hit your own thumb! I need my thumbs for texting, so I designed the Smart Hammer to aim itself. You just swing it in the vicinity of a nail and the hammer recognizes the nail and does some complicated mass redistribution to adjust its trajectory and hit the nail exactly on the head."

I blinked. "Wow."

"Your days of bruised thumbs are over," said Otto, with a smile, as he ambled off to build the future.

Over the next few weeks, I used the Smart Hammer more and more. It was astonishingly helpful. With a regular hammer, my aim was usually off by a few centimeters, oftentimes resulting in a frustratingly bent nail -- but the Smart Hammer guided itself to hit the nail dead on target every time.

Then something curious happened. The more I used the Smart Hammer, the worse I became at aiming it. At first, I took pains to be as accurate as I could, and my swing required only minimal correction. But gradually, some part of me realized that I didn't need to be that accurate. I began to slip. My swings relaxed to lie whole inches away from the nail. Still the Smart Hammer corrected course. I slipped further. Soon my arm flailed as much as a foot away from its target, with the Smart Hammer adjusting wildly in flight like an errant magic wand. Eventually, I fell completely, taking a swing so wild that the Smart Hammer failed to compensate, and instead punched a hole in my wall.

This was frustrating. With most tools, the more you use them, the better you become at using them. When I first started playing viola, I was probably mistaken for a feral cat. When I first started writing Java, I had errors on every line; gradually, these diminished, and playing with the tool became more pleasurable. I was able to make things -- pleasant sounds! working programs! And, in some vague way, becoming better at the tool made me better -- at critical thinking, perhaps even at creativity. (A disproportionate number of Nobel prize winners are musicians.)

When you have a hammer, you see everything as a nail. But when you have a Smart Hammer, you start to see Smart Hammers in everything. Discovering a text from Otto, I tried to distill my experience with his invention:

"its funy," I wrote "teh more I use it, the wrse I get at usign it".

Or this is what I would have written, had the autocorrect not swept in and cleaned up my messy spelling -- rather like the Smart Hammer.

This reminded me of something from Nicholas Carr's The Glass Cage. I went fishing for the quote in the usual fashion, with a vaguely worded Google query. "Carr glass cage IDE dulling," I instructed the search engine, which dutifully found this quote:

"Modern IDEs are getting 'helpful' enough that at times I feel like an IDE operator rather than a programmer," writes Vivek Haldar, a veteran software developer with Google. "The behavior all these tools encourage is not 'think deeply about your code and write it carefully,' but 'just write a crappy first draft of your code, and then the tools will tell you not just what's wrong with it, but also how to make it better.'" His verdict: "Sharp tools, dull minds." (Carr 2010)

Just below this, another phrase leapt out.

"Presumably," the journalist remarked, "we have got more precise in our search terms the more we have used Google." Singhal sighed and, "somewhat wearily," corrected the reporter: "Actually, it works the other way. The more accurate the machine gets, the lazier the questions become."(Carr 2010)

Smart Hammers are everywhere.

Corporations also wield Smart Hammers

There are three types of companies, but only one of them ever manages to put much of a dent in the universe. Where do the others go wrong? There's a telling parallel with my experience with the Smart Hammer.

First there are the Top Down Corps, those companies with an explicitly hierarchical management structure through which information usually flows in only one direction: from the top down. The CEO, playing the role of God, hallucinates a strategic vision. He hands this commandment down to senior management, who divide and delegate it to the mid-level managers, who tell the engineers to build such and such and the marketers to advertise such and such. The lowest levels, where the actual work is carried out, have the greatest contact with reality. Woe betide them if reality does not support the CEO's commandment! -- there's no sending messages backwards into the managerial heaven.

Top Down Corp manages some projects tolerably. Others just never seem to get done, for reasons everyone except upper management understands (I think Dilbert works at one of these). And sometimes, the communication asymmetry causes Top Down Corp to flub something so spectacularly the rest of the world takes notice.

An example: The CEO of a large social media company says "We're going to let users pay to get a 'verified' checkmark next to their name!" The people on the implementation level, who readily spot the potential for abuse, sound the alarm -- but the sound is muffled by clouds of obeisance, and soon verified imposters are claiming that insulin is free.

Another example: the head of a large country invents a "zero covid" policy. Down it goes through the levels of implementation, where people realize its horrific effect on the citizens but don't manage to convey this to their superiors -- until widespread protests engulf the streets.

The CEOs of these Top Down Corps are a little like me with my Smart Hammer. Absent the crucial feedback loop between their vision of reality and reality itself, their directives become as erratic as my hammer's swings. Eventually they take a swing so wild that something breaks.

There are two ways around this. One, the "Bottom Up Coop", flips the paradigm on its head by eliminating the role of CEO-Prophet and managing everything from the implementation level. This can work, in a plodding sort of way, on a certain kind of well-defined problem. Crowdsourced open-source projects, for example, work best building clones of existing software. But in any endeavor with some ideological daring, the hive mind is prone to fracture.

The other, more elusive, corporate structure is found in what I call, after Douglas Hofstadter's phrase, "Tangled Hierarchy & Co". This miraculous company has different levels of implementation and abstraction, but cultivates a feedback loop between them so tangled that no one can say which level is on top.

Think of the team behind some scrappy startup: one fellow may be listed as CEO, and another as the lead programmer, but the CEO never makes business decisions without a long discussion with his colleagues, and the programmer values the CEO's opinion on technical matters. There is a strategic vision, and someone responsible for it -- but that vision is in constant conversation with the implementation. The CEO may want to introduce the app on every platform simultaneously -- a wild swing of the hammer! -- but the programmer provides that crucial contact with reality by insisting that doing so would sacrifice quality. Better a bruised thumb than a busted wall.

In the same situation, the CEO of Top Down Corp would have demanded that her vision be treated like gospel. Meanwhile, in Bottom Up Coop, there would be no vision, and half of the community might splinter off into a "fork" building a "high-quality" native app, while the other half pursued their "non-elitist" strategy of cross-platform development.

To a management theorist, these are banal observations, amply covered by the peculiar language of business jargon. Ah, you're speaking about the importance of synergy between the strategic vision and the customer desires. To you, they likely sound obvious. Of course communication is important! Indeed, they are obvious -- which is why they make an excellent metaphor.

Virtuous Tools are Tangled Hierarchies

You and I are each, increasingly, the CEO of our own armada of tools. As I write this, my smart tea machine is magnetically lowering a basket of loose-leaf tea into 205°F water. My computer's processor is running voice transcription on an audio recording of some musings from my morning walk. Somewhere in a cabinet, my Smart Hammer is ready to take instruction as to which nails to drive. This proliferation of powerful tools unloads a lot of responsibility on each of us. No wonder we've seen a parallel expansion of productivity content. A lot of personal productivity advice is management advice: how to manage our tools.

This mirrors the trend of human history, which is, on one level, a story of people using ever more tools: fire, agriculture, the printing press -- to make even more tools: the steam engine, vaccines, the microchip. As Reid Hoffman, serial entrepreneur and co-founder of Inflection AI, has quipped:

If we merely lived up to our scientific classification---Homo sapiens---and just sat around thinking all day, we'd be much different creatures than we actually are. A more accurate name for us is Homo techne: humans as toolmakers and tool users. The story of humanity is the story of technology. (Hoffman 2023)

Here, then, is the vital question. All tools, if they work at all, improve the user's capabilities. But some tools (like violas, or programming) also improve the user, while other tools (like my Smart Hammer, or Google) somehow coddle the user into a state of mild atrophy. In other words, some tools are virtuous; some tools are not. How can we tell them apart? And how can we make more virtuous tools?

Virtuous Tool: a tool whose use improves the user.

Here's a hypothesis: the relationship between a user and a tool is like the relationship between a CEO and her company. She has something she'd like to do with the tool, and some way of communicating this to the tool, either by pushing buttons or issuing directions. The tool, in turn, has some way of converting these instructions into the real world. My viola turns the motions of my hand and placement of my fingers into sound waves. A search engine turns a vague query into a list of helpful links.

And here's the crux: some tools, like my Smart Hammer, behave like Top Down Corp, silently correcting the user's errors until she drifts out of touch with reality. Other tools, like my viola, enact the Tangled Hierarchy. On one level, it corrects me. If I hold it wrong or press too hard, it screeches in protest. But it does so much more: a musician doesn't know what sort of sound he wants to produce until he plays. The music emerges out of the conversation between the player and his instrument -- a "strange loop" of improvisation, imagination, and exploration, in which both the creative vision and its construction take place simultaneously.

I believe the best tools are like this: instruments with which to perceive and play with some previously inaccessible part of reality.

What to Automate? It's the wrong question.

My phone dinged. Otto had responded to my texted critique of the Smart Hammer. In conversation, Otto spoke like an engineer. But in text he, or his thumbs, always struck me as poetic.

it's always the question isn't it,

he said.

what to automate, and what not?..
the more you outsource, the more you forget
but maybe forgetting how to hammer lets you focus on being human

Up until now, we've skirted around an entire canon of well-trodden talking points from that immortal debate between Luddites and cyborgs. Will technology take our jobs, or will it create new ones? Will human ingenuity wither under mechanically induced neglect, or will it thrive as never before when unleashed to focus exclusively on creative pursuits? The Luddites fixate on the darkest interpretation of the present -- of a world in intellectual decline, where bodies and minds have been coddled by an over-reliance on technological aids. The cyborgs focus on the positives: accelerating research productivity, vaccine discovery, enhanced convenience, and supercharged connection!

The usual recourse is a kind of managerial pragmatism that takes Otto's question very seriously: What to automate, and what not? What to delegate, and what to reserve for ourselves? Underlying this is the assumption that we, as humans, have some "core competency" to which we should devote most of our time. The details, like driving cars or spelling correctly or putting nails in the walls, are distractions. We should automate or delegate them, and focus on the important stuff. As Alfred North Whitehead put it,

Civilization advances by extending the number of important operations which we can perform without thinking of them.

So, to advance civilization, we should make as many important operations automatic as possible. Speech transcription! Spell checking! Web searching! Hammering!

This managerial framing makes sense in the simplest cases. Sure - hire a mechanic, get a secretary to help you schedule meetings, get a librarian to help you find relevant research papers. But as the complexity of automation creeps upward, viewing automation like delegation recreates the fallacy of Top Down Corp.

Take the deluge of AI "assistants". Would you like ChatGPT to write your book chapter for you? GitHub Copilot to program your next Java package? VIOL-E to play your viola better than you can?

These tools, in effect, offer to promote you to CEO of Top Down Corp. Your wish will become their commandment, handed down from your own managerial heaven to be realized by your own AI genie in the "cloud". Alas, this genie is no proponent of the Tangled Hierarchy.

As an illustration, let's zoom in on the process of writing. Suppose I have a brilliant idea for a blog post that expounds on my new theory about toothbrushing and economic security. As every writer knows, what follows is a frustrating and sacred process. I struggle to express these ideas in words. Doing so exposes gaps in my thinking, such that in the process of writing, I gain a clearer picture of the ideas. Oftentimes, the gaps cannot be closed, and I discover that, alas, my ideas don't make any sense. Writing, in other words, is like playing viola: it's a tangled hierarchy, an instrument through which we examine and play with the space of ideas. It is a very virtuous tool.

Now suppose that I attempt to shortcut the process. I fire up ChatGPT and instruct it, in a few sentences, to write about toothbrushing and economic security while arguing that chocolate milk is like a condominium. I push enter. The CEO has issued his mandate! Down it goes, through the levels of implementation -- and out comes a coherent block of prose that confirms my point while exposing none of its limitations. I have swung wildly, and the Smart Hammer has hit the nail, leaving me with the continued impression that my ideas are coherent. By bypassing the struggle of writing, I have forfeited its value.

In all of these tools, there is an assumption -- "just tell it what you want, and it will make it so" -- and a big problem: I don't know what I want! I may have some vague ideas -- but without a good tool's hard, tight interface with reality, they are at best shallow, incomplete, and very likely utterly fallacious!

This assumption is one of the (many) symptoms of the "philosophical poverty" of Silicon Valley, and calls for something of a Copernican revolution in how we think about technology. People once imagined the earth as the center of the cosmos. People (especially tech companies) are now prone to imagine the user at the center of his tools, issuing commands like the CEO of Top Down Corp.

In reality, there is no center - just a string of relationships with our violas, hammers, text editors. Meaning emerges only through the strange loop of writing, rewriting, and making continual contact with reality.

Making more Virtuous Tools

Some days after our text exchange, Otto issued a wireless firmware update for my Smart Hammer.

"I heard what you were saying," he said
"and I realized that my aim has also gotten worse with my Smart Hammer. So now it vibrates to let you know when you're aiming wrong. You get to keep your hammering skills up, if that's so important to you, and you can spare your thumbs the negative feedback!"

It worked. Feeling the hammer buzzing angrily with each errant swing, my arm soon relearned its aim.
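Otto's fix swaps silent correction for feedback, and the difference can be caricatured in a few lines of code. The toy simulation below is entirely my own sketch -- the function name, the numbers, and the drift model are all illustrative assumptions, not anything from the essay -- but it captures the shape of the two feedback regimes:

```python
import random

def aim_error_after(swings: int, feedback: bool) -> float:
    """Toy model of a user's aim over repeated swings.

    feedback=True:  the hammer buzzes on each error; the user corrects
                    half of what she feels, so error stays bounded.
    feedback=False: the hammer silently fixes the outcome; the user
                    never notices her drift, so error accumulates.
    """
    random.seed(0)            # deterministic for the example
    error = 10.0              # initial aim error, arbitrary units
    for _ in range(swings):
        error += abs(random.gauss(0, 1))  # sloppiness creeps in
        if feedback:
            error -= 0.5 * error          # user feels it and corrects
    return error

print(aim_error_after(50, feedback=True))   # stays small: aim is maintained
print(aim_error_after(50, feedback=False))  # grows steadily: aim degrades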

The difference between a tool and a virtuous tool can be subtle. Boeing, for example, has been widely praised for adding tactile feedback to its electronic flight controls, mimicking the feel of analog controls connected directly to the control surfaces (Maaz 2022). Some experts think that this small change could have prevented Air France's 2009 crash of an Airbus A330, in which two pilots lost control of the plane in part because each had no feel for how the other was controlling it (Carr 2014).

Similarly, there are many small re-framings we can apply to tools like ChatGPT or DALL-E to treat them less like employees and more like violas, principally by giving the user more direct control of their mathematical underpinnings. Imagine you could take a piece of AI-generated text and drag a slider to increase the amount of sarcasm as easily as an audio engineer tweaks the treble and bass in a recording. DALL-E and Stable Diffusion already expose some of this power through prompt engineering. Or imagine if, on being fed my wild ideas about toothbrushing and economic security, the writing assistant asked pointed clarifying questions that accelerated my ideas' contact with reality.
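To make the slider idea concrete, here is a minimal sketch. Everything in it is hypothetical -- `GenerationControls` and its knobs correspond to no real product API -- but it shows how exposed parameters, folded into a prompt the way prompt engineers already do by hand, could turn vague wishes into an instrument-like mixing board:

```python
from dataclasses import dataclass

@dataclass
class GenerationControls:
    """Hypothetical mixing board for a text generator.

    Each field is a slider the user can drag, like treble and bass
    on an audio console. None of these names are from a real API.
    """
    sarcasm: float = 0.0    # 0 = earnest, 1 = dripping
    formality: float = 0.5
    brevity: float = 0.5

    def to_prompt_suffix(self) -> str:
        # Crude realization: fold slider positions into the prompt,
        # the way prompt engineering already does by hand.
        knobs = {"sarcasm": self.sarcasm,
                 "formality": self.formality,
                 "brevity": self.brevity}
        return "; ".join(f"{k}={v:.1f}" for k, v in sorted(knobs.items()))

controls = GenerationControls(sarcasm=0.8)
print(controls.to_prompt_suffix())  # brevity=0.5; formality=0.5; sarcasm=0.8
```

The design point is that the knobs are continuous and reversible: the user nudges, observes, and nudges again -- a feedback loop rather than a one-shot commandment.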

The question is not "what to automate and what not," nor even "what should we use tools for and what is exclusively human," but rather: Are the tools corrupting, or virtuous? Are they instruments that lessen our contact with reality, or do they strengthen it, helping us perceive and play with the world in previously unimaginable ways?

References

Carr, N. G. (2010). The Shallows: What the Internet Is Doing to Our Brains. W.W. Norton.

Carr, N. G. (2014). The Glass Cage: Automation and Us. W.W. Norton & Company.

Hoffman, R. (2023, January 28). Technology Makes Us More Human. The Atlantic. https://www.theatlantic.com/ideas/archive/2023/01/chatgpt-ai-technology-techo-humanism-reid-hoffman/672872/

Maaz, M. A. (2022, August 8). How Do Airbus & Boeing Aircraft Differ On A Technical Level? Simple Flying. https://simpleflying.com/airbus-boeing-aircraft-technical-differences/