What is intelligence?
If the erudite and respected literary scholar fails to grasp even the most rudimentary functions of smartphone use, can we call them “intelligent”? What about the mathematician whose groundbreaking theories revolutionize mankind’s understanding of the universe, but who’s incapable of passing a fourth-grade geography test?
Intelligence is a word wielded in bias — assigned in accordance with the adjudicator’s own prejudice.
Dictionaries often define intelligence as an accumulation of knowledge — a definition that my own bias rejects absolutely. Suggesting that intelligence equals knowledge creates a class structure that excludes anyone without education. That makes absolutely no sense — one should not be able to buy their way into an “intelligent” diagnosis. An educated dolt is still a dolt; and an uneducated genius is still a genius. Besides, everyone has knowledge of something others do not, so the standard definition degrades to utter nonsense.
My personal belief is that intelligence and knowledge have little to do with one another — a belief that’s no doubt prejudiced by my own proclivity to calculate and derive, rather than draw from a database of knowledge. Since I don’t believe the possession of knowledge implies intelligence, I certainly don’t believe possessing more knowledge implies even more intelligence. Rather, I believe intelligence relates to how one applies whatever knowledge they do have. Remembering a million bits of trivia is not intelligence. Going on Jeopardy so you can leverage all that trivia to win a stack of money — that’s intelligence.
And don’t even get me started on the whole anthropomorphic aspect of the definition, in which humans have deemed themselves the most intelligent species on earth. We pants-donning primates are so arrogant, we even rate animal intelligence by how closely their cognition resembles our own. But I bet you Cousin Tammy, unlike an octopus, doesn’t have two-thirds of her brain’s neurons dispersed throughout her limbs. Sever an octopus’ arm and that arm will continue to function in support of the creature — finding food and feeding it. Cousin Tammy? Her severed arm isn’t going to do a damn thing but lie motionless in a pool of gore. Sure, the octopus hasn’t a clue how to colonize Mars, but neither does Cousin Tammy.
Troubled as I am by the word “intelligence,” you can imagine how I feel about appending the word “artificial” to the front of it. It’s like using one totally arbitrary word to modify a second totally arbitrary word.
Yet here we are — living in a world where Artificial Intelligence (AI) is all the rage. As best I can ascertain, all these newfangled AI products are really just LE products — but it’s a whole lot harder to market “Laziness Enhancement” software than it is to woo customers with the notion of “Artificial Intelligence.”
Now don’t get me wrong — I’m all for software that lets me be as lazy as possible. The less time I have to spend preparing my taxes, buying socks, or transferring files to someone, the better. The greater the number of rote functions a machine can learn, the happier I am. The problem is that companies are developing AI to make our creative and aesthetic decisions for us. The whole reason I want my software doing the rote stuff is so that I have more time to do what really matters — the creative and aesthetic stuff.
Think about it. If you want to create something that you actually care about, are you going to turn over the decision-making process to an algorithm? Particularly an algorithm that’s designed to produce a result in the most pedestrian, middle-of-the-road, unimaginative way possible? A way that looks or sounds the most like everyone else’s AI-derived results?
The only time you would want your software making aesthetic decisions is when your end goal isn’t the creation itself, but to do as little as possible to create it. In which case, why create it at all? Why contribute more noise to a noisy world?
AI is designed to produce an outcome that will generate the widest appeal amongst the lowest common denominator — and it’s very good at doing that. It can mix and master your music so that it sounds like the mixing and mastering of the most popular songs; and it can process your photos and videos so they match the appearance of the most widely liked photos and videos on social media.
AI is an ouroboros — a snake that eats itself. The more frequently people rely on AI to make creative decisions, the more the algorithms will feed into themselves — further narrowing the scope of acceptability and further precluding true creativity and self-expression.
Some have opined that AI technology is a dangerous path — one that will take us toward a dawn in which machines become more “intelligent” than man, and thus our overlords. I fear the exact opposite — that AI will take us toward a dawn in which machines become so narrow-mindedly unintelligent and prejudiced that mankind’s reliance on them will render us incapable of expansive thought. Sadly, both theories achieve the same end — mankind is toast. So, you might want to stock up on your favourite jam.
Call me unfashionable, but I believe one’s creative output should be an expression of oneself, not the algorithm they purchased. Otherwise, the ‘likes’ that someone gathers on social media are every bit as artificial as their intelligence — however you choose to define it.
©2021 grEGORy simpson
ABOUT THE ARTICLE: In case you can’t tell, I’ve been messing around with the newly released Luminar AI photo editing software. For now at least, my belief is that mankind is safe, since the product seems to be more a collection of ‘presets’ than it is actual AI. But there’s still an insidiousness to some of those presets — particularly all the ones designed to sham your Instagram feed with altered faces, idealized bodies, and stock-imaged skies. But if you stay away from that kind of crap, Luminar AI does have some decent (albeit typical) editing tools hiding just below the surface.
In any event, no Luminar software (nor any AI) was employed in the creation of the hodgepodge of film and digital AI metaphors included with this article. I have no doubt my webstats will suffer accordingly…
REMINDER: If you’ve managed to extract a modicum of enjoyment from the plethora of material contained on this site, please consider making a DONATION to its continuing evolution. As you’ve likely realized, ULTRAsomething is not an aggregator site. Serious time and effort go into developing the original content contained within these virtual walls — even the silly stuff.