AI is a Tool
It is no more than that.
ChatGPT confirmed as much when I asked it.
It does (really cool and really stupid) stuff, but it does not collaborate.
'Cause collaborators are human.
AI's impact is, and will continue to be, profound.
Perhaps the smartest tool we've ever created.
But it was built by humans, it runs on human-created rules, and human masters control it behind the scenes.
You would do well to understand it.
Both its power and its limitations.
AI may facilitate opportunity, but it also controls access (including who sees this post, who hears my music, whose truth you are told . . . )
Size and Ideas have always been both a strength and a weakness.
How they are used is the determining factor.
Treating any tool in a way inconsistent with its intended use is always ill-advised.
Further, romanticizing and anthropomorphizing connection with AI degrades true connection between humans.
AI is not like other tools in some ways: my hammer doesn't lie, my wrench doesn't always seek my approval, and my screwdriver . . . well, left unchecked, both it and AI can do the same thing to you.
Like any tool, AI does need to be tested and challenged.
Ignore it (and those controlling it) and all of its associated faults, and it will become MORE than just a tool . . . in the worst possible ways.
AI is a tool.
Further Reading -
https://www.linkedin.com/posts/sir-john-hegarty-a1310a92_stop-calling-ai-a-tool-its-more-than-that-activity-7323964483293372416-fsHE?utm_source=share&utm_medium=member_desktop&rcm=ACoAAACpa_gB27MIwOUs26_nR0L-UM9cmO1tp4I
From Scott McCool - thanks, Scott.
AI consists of "tools" and data. That framing, though, oversimplifies and misses some critical nuance which I can't even blame on bad marketing (as tempting as it is to do so). It's just what happens when we humans try to cram next-gen concepts into first-gen words.
AI systems adapt, generate, and surprise. They do things we didn't explicitly program, and sometimes even the people who built them can't fully explain the outcomes. They are built on deterministic algorithms, but they don't always behave in ways that feel deterministic.
I'm afraid that if we treat AI like a passive "tool," while it behaves like an active force, we risk being caught off guard technically, socially, and politically.
Response - Mike Mills
Tool is still the best word we have to describe the framework for control and responsibility vis-a-vis AI. The cool part is that it is just one of the various words we can use to explain what AI is and is not.
I think we both can agree that collaborator is out because of that whole "you have to be human to be a collaborator" thing.
As to the concern that "tool" oversimplifies: we seem to do well attributing the moniker "tool" to fire (controlled burns seem to become "uncontrolled" with regularity), medicine (even with scientific help, my eyes still can't read all of the potential side effects that MIGHT happen), and animals (they are so cute in their vests, but what is that feeling you have when standing in an airport queue and that cute dog sits and just stares at you?).
The net is that these partially controlled "tools" are used to do something in spite of the fact that we exert much less than total control and the fact that they often have a "mind" of their own.
Scott McCool: "I'm afraid that if we treat AI like a passive 'tool,' while it behaves like an active force, we risk being caught off guard technically, socially, and politically."
100% - AI is an active force that "somebody" controls or, at a minimum, must take responsibility for. Without ensuring that the concept of control is included in any word we create, we are in big trouble. Far from being a "passive" hammer, it is a "very active" wildfire.
So, who's in control? Who bears responsibility for the output? The user? OpenAI and Co.? AI itself? Insurance Companies?
Until we are willing to roll the dice and let it run in the wild (for the record, I'm not), best to keep calling it a tool so we don't forget who holds the reins - even if we don't know which human that is.

