
AI Still Not as Smart as Dogs or Cats, Says Meta Chief AI Scientist LeCun

Despite the hype about AI becoming sentient, generative AI systems such as ChatGPT are still not as intelligent as dogs or even cats, according to Meta chief AI scientist Yann LeCun. That is because AI chatbots lack an understanding of the underlying reality of the real world, he said.

Speaking at the Viva Tech conference in Paris, France on June 14, LeCun explained that existing AI systems are limited in what they can do, generating only text or images based on what they have been trained on. They are incapable of understanding the meaning of the things they create.

By comparison, human knowledge far exceeds the limits of language, LeCun said.

Also read: New York Woman ‘Marries’ AI Bot She Created on Replika

Dumb AI

Someday there will be AIs that are more intelligent than humans, but this is still some way off and should not be seen as a threat, says Yann LeCun.

“Those [current AI] systems are still very limited, they don’t have any understanding of the underlying reality of the real world, because they are purely trained on text, massive amount of text,” said the Meta scientist, as reported by CNBC on Thursday.

“Most of human knowledge has nothing to do with language … so that part of the human experience is not captured by AI.”

LeCun said that while generative AI is capable of passing the bar exam in the United States, a test required to become an attorney, it cannot load a dishwasher, a task most children can learn in a matter of minutes.

“What it tells you [is] we are missing something really big … to reach not just human-level intelligence, but even dog intelligence,” he stated.

There are real concerns about AI developing skills that could be uncontrollable and harmful, posing a threat to humanity.

In April, Google senior VP of technology and society James Manyika claimed that one of the company’s AI programs taught itself a new language that it was not trained to know. He said the AI was somehow able to learn Bengali without training after being prompted in the language.

Google’s announcement raised concerns about AI developing skills independently of its programmers’ intentions, which has long been a topic of discussion among scientists, ethicists, and science fiction writers.

Manyika later clarified that the AI’s ability to learn a new skill without training does not mean it is sentient – that is, capable of having feelings, emotions, ideas, thoughts, and perspectives, as human beings do.

Are we there yet?

According to Meta’s LeCun, artificially intelligent chatbots such as OpenAI’s ChatGPT, the groundbreaking AI that started all the hype, and Google’s Bard, are still far from being able to replicate human intelligence in all its complexity.

In another example, LeCun, a professor of AI and machine learning at New York University, pointed out that babies have an intuitive understanding of the physical world that AI systems do not yet possess.

A five-month-old baby will not be surprised to see an object floating in the air. By around nine months, however, babies understand that objects should not float, and they will express surprise if they see one doing so, he said.

“[We have] no idea how to reproduce this capacity with machines today. Until we can do this, we are not going to have human-level intelligence, we are not going to have dog level or cat level [intelligence],” he detailed.

The scientist envisions a future where everyone has their own AI assistant that is smarter than they are. The assistants would be designed to be “controllable and subservient to humans,” and, he argued, there is no correlation between being smart and wanting to take over the world.

“We should not see this as a threat, we should see this as something very beneficial. Every one of us will have an AI assistant … it will be like a staff to assist you in your daily life that is smarter than yourself,” LeCun said.

“A fear that has been popularized by science fictions [is], that if robots are smarter than us, they are going to want to take over the world … there is no correlation between being smart and wanting to take over.”
