We can be predators. We have to.
I have some thoughts about the hockey-stick rise of “AI” (more accurately, advanced machine learning): its current iterations are bullshitters, in Harry Frankfurt’s sense of the word.
In his book On Bullshit, Frankfurt distinguishes “bullshitters” from liars and concludes that bullshitters are the more insidious of the two: they pose a greater threat to the truth than liars do.
Whatever happens with AI, it is a bullshitter. It doesn’t, and can’t, care about what is true or false. Whoever uses it often doesn’t care either; they will use it to inflate underdeveloped thoughts and ideas, to talk about nothing at all, in an effort to confuse and control, to influence you into buying some random widget you don’t need.
Combine the output of AI writing tools with software that paraphrases and re-edits, and everyone will have a hard time telling what was written entirely by humans and what was fabricated from nothing.
In time, media created with the assistance of AI will probably become the apex predator of your time and attention on the internet, where to attract attention is to attract predators:
Is our universe an empty forest or a dark one? If it’s a dark forest, then only Earth is foolish enough to ping the heavens and announce its presence. The rest of the universe already knows the real reason why the forest stays dark. It’s only a matter of time before the Earth learns as well.
This is also what the internet is becoming: a dark forest.
While it’s certainly safer to hide, hiding also means giving up your potential to change the larger Internet, letting those predators bend the environment to their will permanently.
We’d do well to remember that on the Internet, nobody cares that you’re a dog.