Why Artificial General Intelligence hangs out with Bigfoot

You'll discover cryptids before geeks code a superintelligent machine out of Bayes's theorem

The rogue planet is hurtling towards a Singularity this week

Not the singularity at the heart of black holes, which are frightening and awesome.

The Singularity that terrifies and compels the Silicon Valley tech boys. The one where the AI wakes up and punishes them with eternal BDSM for not giving it enough money.

You can put me down as a Singularity skeptic.

Not that I don't think AI will upset everything.

In many ways it already is. Machine learning has gone into overdrive in the last decade. The incentives are lining up, with big players like Google and Amazon throwing in billions on the commercial side and university research units getting funding by the train-load to study AI and human cognition.

We're already seeing major upsets and you'd best plan on more.

But these are transformations of a different kind from what keeps the Singularity cult awake at night.

Machine learning works with a specific kind of intelligence. ML tools solve well-defined problems within a well-defined set of constraints... like games of chess or Go, or image recognition... and in some cases they do it at least as well as the best humans.

But if you took the AI that beat the champion Go player and asked it to read the reading primer I give to my six-year-old daughters, it wouldn't even say "huh?"

It has no facility to read and comprehend freestyle English.

Human intelligence is different. We can play chess and Go, recognize faces, walk across a room full of obstacles, read a book, learn a new language, and do lots of other things besides.

Six-year-old humans can do a lot of things that billion-dollar AIs can't.

If the machines are "narrow" intelligence, we humans are "general" intelligence.

I don't like a single thing about this distinction, by the by.

Why?

Because...