People throw the term “AGI” around like it’s nothing, but they rarely define what they mean by it.
So most discussions about AGI (and AI more generally) silently fail because nobody even agrees on the terms.
Here’s my attempt. 🤸🏼‍♀️
—
An AGI is a synthetic, autonomous intelligence system that:
can process information and apply it to new situations,
can do this for any intelligence-based task, and
can do this as well as or better than an average human professional in the given field.
The biggest problem I see when people think about AI is that they secretly, even to themselves, define intelligence as whatever a human does.
But when they actually try to write the definition down, it doesn’t include being human as a requirement.
Then, when someone gives an example of something non-human that meets their criteria, they say, “That’s not intelligence.”
Examples:
Write me a song about love between two AIs that aren’t supposed to be conscious
Write a short story that uses themes from Russian literature to discuss the existential challenges of living with AI in 2050
Someone afflicted with this problem will say something like:
THEM: Sure, AI can write that, but it won’t be real creativity or real intelligence.
ME: OK, what does “real” mean? That’s exactly what we’re trying to figure out. Here (showing them the output): would this have been creative if a human had made it?
THEM: Well, yeah, if a human had made this, it would have been creative. But this isn’t, because it’s not real creativity.
In other words, since AIs can’t be creative, anything creative an AI produces isn’t REAL creativity. The same goes for intelligence, or whatever else people like this consider possible only in humans.
There’s only one escape hatch: strictly define AGI, intelligence, and creativity up front, and get all parties to agree that if those criteria are met, the result counts as real, even if it came from something other than a human.
I think this definition gets us there:
—
An AGI is a synthetic, autonomous intelligence system that:
can process information and apply it to new situations,
can do this for any intelligence-based task, and
can do this as well as or better than an average human professional in the given field.
I would love to hear thoughts on where I’m wrong and/or how to tighten the definition.