Not Written By AI
- Susan Loucks
- Oct 10
- 2 min read
Conversations about artificial intelligence have been heating up in my own practice. A new client checked in to make sure promised deliverables would be the product of my brainpower, as opposed to large language models. Peers in other circles are requesting that every meeting come with transcript-level notes – only possible with this new tool. I'm making choices. If I’m dedicated to building a world where all species have potential to thrive together, how does AI fit in?
Alarm generated by emerging technologies follows predictable patterns. Lightning rods, with their ability to divert destruction, were once seen as potentially interfering with Divine intention. An 1889 magazine warned that “the universal result of the use of the telegraph is to overfill the ordinary mind with undigested and indigestible scraps of information…its tendency must be to weaken and ultimately to paralyze reflective power”. Organizational consultants are all too familiar with systemic resistance to change. And yet, alarm over AI is more than blind resistance: the technology carries real potential for mass manipulation and a booming demand for energy.
A couple of weeks ago, a conversation partner brought up The Axemaker’s Gift, a book that outlines the changes in power and social dynamics that have followed new technologies over the centuries. Each gift brought obvious advantages, but also the potential to separate us from ourselves, each other, and the world. For example, the printing press democratized the flow of information. It also led to the consolidation of dialects, linguistic identities, and (as a result) political boundaries. Agricultural developments enabled huge production increases while gradually sacrificing soil health. The polycrisis we face is due, in part, to the long-term costs of those near-term benefits.
What might an appropriate balance of the costs and benefits of this new technology look like? Here’s my evolving guidance for myself:
I support uses that connect us, illuminating a bigger picture that we can all see ourselves in.
I support distributed power to manage AI’s place in the world, held by many (not a few large corporations). Use of this technology compels me to advocate for regulation and opportunities for public input on its place in society. I work to keep data controlled by its owners.
I support distributed gifts of this tool. I work towards AI’s advantages being accessible for people who think and create in all languages, for example.
I give equal weight to AI’s advantages and costs. Because I’m concerned about the environmental impacts of data centers, I avoid using AI for recreation, for low-value tasks, or to overproduce.
I pledge myself to ongoing learning about benefits and impacts.
The Axemaker’s Gift ends with a recommended path out of our cycle of embracing technology's gifts while downplaying its costs. I was initially surprised – and then not surprised at all – that it hinges on small-scale participatory democratic systems. These systems are harder to control, more connected, and able to incorporate gifts from all sources (including the environment), rather than inevitably prioritizing technology.
It seems that wise use of gifts correlates with good, shared decision-making and robust participation. Let’s recommit to modeling that, using whatever tools we choose, wisely and together.