
Tue, 02 August 2022

The New Generation of AI Applications is Still Far From Gaining Consciousness

It is too early to say that they never will be. But the evolution of their language abilities over the past decade is astonishing.

Recently, Blake Lemoine, an engineer at Google, attracted worldwide attention when he claimed that the artificial intelligence of the LaMDA project, Google's most advanced conversational AI, is sentient, meaning it has the ability to be positively or negatively affected by external or internal experiences. According to Lemoine, the technology has acquired perception and the ability to express thoughts and feelings at the level of a human child.

Lemoine was removed for violating confidentiality policies. While the engineer's statement caused anger, fear, and a certain fascination among the lay public, among professionals the reaction was shock. After all, how could an engineer who knows the techniques involved speak of sentient artificial intelligence? Importantly, Lemoine is not a machine learning expert but a professional dedicated to testing a technology whose inner workings he likely did not know in depth.

In response to Lemoine's statement, Google stated that LaMDA was designed precisely to mimic feelings. Solutions that simulate feeling or consciousness are nothing new. Social bots that engage users by simulating friendship, and even proactively start conversations, are popular in Asian countries. Since 2014, Microsoft has been developing XiaoIce, which has nearly 700 million active users and an average of 23 bot-human interactions per conversation. XiaoIce aims to simulate a personality and consciousness tailored to the user's taste.

Despite achieving this level of engagement, the AI is not actually sentient; it has mechanisms designed to simulate sentience. Being able to fool Lemoine only proves that the technology is doing its job.

Although not intelligent, Google's new generation of AI solutions is disruptive. It is the result of work started in 2013 with the publication of "Efficient Estimation of Word Representations in Vector Space", the word2vec paper on word embeddings. In 2017, Google introduced the transformer, a deep neural network architecture that processes sequences of text and learns how language is constructed. The primary application of this type of model is to predict the most likely continuation of a piece of text.
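To make this idea of "predicting the most likely continuation" concrete, the sketch below uses the open-source Hugging Face transformers library and the small, publicly available GPT-2 checkpoint to continue a prompt. The library, model, and generation settings are illustrative assumptions, not the Google systems discussed in this article.

```python
# Minimal sketch of next-token prediction with a pretrained transformer.
# Assumes the open-source Hugging Face `transformers` package and the small,
# publicly available GPT-2 checkpoint; illustrative only, not LaMDA or PaLM.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Artificial intelligence is"
# The model repeatedly predicts the most likely next token given the text so far.
result = generator(prompt, max_new_tokens=20, do_sample=False)
print(result[0]["generated_text"])
```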

Transformer-based solutions are trained with self-supervised learning. Learning happens through exposure to large volumes of language data, whether from books, websites, or videos. Because these models manage to absorb the complexity of language, their evolution has been exponential. In 2018, Google released BERT, the first widely known transformer model, with 110 million parameters. In 2020, OpenAI released GPT-3, with 175 billion parameters. And in April 2022, Google published the PaLM model, with 540 billion parameters. The greater the number of parameters, the greater the model's capacity to acquire knowledge and understand language.
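"Self-supervised" means the training signal comes from the text itself: words are hidden and the model learns to recover them from the surrounding context, so no human labeling is required. The sketch below illustrates that masked-word objective at inference time; the Hugging Face transformers library and the public bert-base-uncased checkpoint are assumptions chosen for illustration.

```python
# Minimal sketch of the self-supervised objective behind BERT-style models:
# the model predicts a word that has been masked out of ordinary text, so the
# text itself provides the labels and no human annotation is needed.
# Assumes the Hugging Face `transformers` package and the public
# bert-base-uncased checkpoint (illustrative choices, not from the article).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# During training, random words are replaced by [MASK] and the model
# learns to recover them from the surrounding context.
for prediction in fill_mask("Artificial intelligence is still far from [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```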

The latest model released by Google, PaLM, is the one that best represents the current state of the art in natural language processing. This is not just because of its parameter count, but because the model was trained with innovative techniques that minimize the number of network neurons activated for each input, so that similar languages activate similar areas of the neural network. This reduces both processing time and energy consumption.

The PaLM model is multipurpose and can be used in different situations. It can, for example, answer a question about a reference text using that text, and interestingly, the answer may include information from sources beyond the text itself. It can also explain the logic of a joke, as well as solve a math problem and explain the solution step by step.
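As an illustration of that last capability, the sketch below builds the kind of few-shot prompt commonly used to get a large language model to reason through a word problem step by step. The prompt format and the example questions are assumptions for illustration; PaLM itself is not publicly available through such an interface.

```python
# Minimal sketch of a prompt that asks a large language model to solve a math
# word problem and explain its reasoning. The worked example nudges the model
# to answer the new question in the same step-by-step style.
# The prompt wording is an illustrative assumption, not taken from PaLM itself.
prompt = (
    "Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls. "
    "Each can has 3 tennis balls. How many tennis balls does he have now?\n"
    "A: Roger started with 5 balls. 2 cans of 3 balls each is 6 balls. "
    "5 + 6 = 11. The answer is 11.\n\n"
    "Q: A library had 120 books and lent out 45 of them. How many are left?\n"
    "A:"
)

# In practice this prompt would be sent to a large model behind a hosted API,
# which would then complete the text with its own reasoning and final answer.
print(prompt)
```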

We are still far from seeing intelligent AI applications, and it is too early to say they never will be. But the evolution of language abilities in the past decade is astonishing. There is little doubt that software coding will soon be done in natural language, without the need to use a programming language. At NTT DATA, we have solutions for code generation using OpenAI's GPT-3 model. We will also use AI for complex text analysis, such as automating the processing of legal documents.
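As a rough illustration of natural-language-to-code generation, the sketch below calls the OpenAI completion API using the classic pre-1.0 Python SDK. It is not NTT DATA's actual solution; the model name, prompt, and parameters are assumptions for illustration.

```python
# Minimal sketch of natural-language-to-code generation through the OpenAI
# completion API (classic pre-1.0 Python SDK). Not NTT DATA's solution;
# the model name and parameters are illustrative assumptions.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Completion.create(
    model="code-davinci-002",  # assumed Codex-style completion model
    prompt="# Python function that returns the n-th Fibonacci number\n",
    max_tokens=150,
    temperature=0,
)

# The completion contains the code the model wrote for the natural-language request.
print(response["choices"][0]["text"])
```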

 
