We’re still a long way away from AI that can think for itself

This article was first featured in Yahoo Finance Tech, a weekly newsletter highlighting our original content on the industry.

Wednesday, Feb. 15, 2023

Don't expect human-like AI anytime soon

ChatGPT, Microsoft’s (MSFT) Bing, and Google’s (GOOG, GOOGL) Bard have the world talking about AI. Whether it’s how generative AI, artificial intelligence that creates content, will change art or help people more efficiently browse the web, the new crop of AI offerings is generating tons of buzz.

What makes ChatGPT and its cohort so intriguing is that they provide the illusion of an artificial intelligence that can think like a human. After all, if it’s able to write like one of us, it’s surely doing some kind of thinking, right? The reality, however, is that AI that can truly think like a person, referred to as artificial general intelligence, is potentially still decades away, if it ever arrives at all.

“In the last few years, we've made pretty dramatic leaps,” Rayid Ghani, professor of AI at Carnegie Mellon University’s Heinz College, told Yahoo Finance. “But we’re still several leaps away from something that's general purpose that can be used for critical applications.”

Still, some experts say that we’re already on the way to reaching the kind of sci-fi-inspired AI seen in works like “Her” and “Star Trek.”

“If you asked me…five years ago, I think I would have said no,” explained Yoon Kim, MIT assistant professor of electrical engineering and computer science. “Now I'm less sure. It might be one of the most pivotal moments in human history…and it will obviously raise a lot of philosophical questions and have a massive impact on society.”

ChatGPT, Bing, and Bard are smart, but not human-like

Ask ChatGPT to write a story about a girl who becomes a powerful mage who can summon storms and control dragons, and it will do just that. But that doesn’t mean that the platform is thinking like a person.

Instead, it has been trained, using feedback from human trainers and data pulled from the web, to recognize how certain words tend to follow one another, and it strings them together in an order that makes it seem like a person is writing. It’s more complicated than that behind the scenes, but that’s the gist. This process is called generative artificial intelligence because, well, it generates something new from content it already recognizes.
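To make that gist concrete, here is a deliberately simplified sketch of the idea in Python: a toy model that learns which word tends to follow which in a tiny training text, then chains those predictions together. Everything here, the corpus, the function names, the parameters, is invented for illustration; real systems like ChatGPT use large neural networks trained on vastly more data and far longer context, not simple word-pair counts.

```python
import random
from collections import defaultdict

# A toy illustration of "predict the next word from what came before."
# This is NOT how ChatGPT works internally; it only conveys the flavor.

corpus = (
    "the girl summons a storm and the storm obeys the girl "
    "and the dragons obey the girl"
).split()

# Count, for each word, which words follow it in the training text.
next_words = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    next_words[current].append(nxt)

def generate(start, length=8, seed=0):
    """Chain observed next words together, starting from `start`."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length - 1):
        candidates = next_words.get(words[-1])
        if not candidates:  # dead end: no observed continuation
            break
        words.append(rng.choice(candidates))
    return " ".join(words)

print(generate("the"))
```

Scaled up by many orders of magnitude, with neural networks in place of the word-pair table, this "pick a plausible next word, repeat" loop is roughly why the output reads as fluent prose without any human-like thinking behind it.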

ChatGPT can come up with stories on the fly based on prompts you provide it with. (Image: ChatGPT)

But that doesn’t mean these platforms are thinking like you and me.

“I don't think these models can think in a human sense,” explained Qian Yang, an assistant professor in information science at Cornell University. “There is some level of reasoning ability there, but the mechanisms by which these models produce these logical judgments are apparently very different from how humans think.”