Facebook AI Blender: the bot that lies

Facebook AI researchers today announced Blender, an advanced open-source chatbot model. The new model is supposed to be "more human" than previous chatbots and to hold better conversations.

Based on early indications, it appears to be as good as the competing bots from Google and Microsoft.

Facebook reportedly trained the AI on a huge amount of data. The Blender chatbots are open source and, unlike the customer-service bots you see on websites, they are not limited to answering a fixed set of questions.

These bots are theoretically capable of talking about any topic.
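
Since the models are released as open source through Facebook's ParlAI framework, anyone can try them locally. Below is a minimal sketch in Python, assuming ParlAI's documented model-zoo interface; the exact function names may vary between ParlAI versions.

```python
# A minimal sketch of chatting with the smallest released Blender model
# through ParlAI (pip install parlai). The model-zoo path follows the ParlAI
# docs; treat the exact API as an assumption if your version differs.
from parlai.core.agents import create_agent_from_model_file

# Downloads the 90M-parameter model on first use.
agent = create_agent_from_model_file("zoo:blender/blender_90M/model")

agent.observe({"text": "What is your favorite band?", "episode_done": False})
reply = agent.act()
print(reply["text"])  # the bot's generated response
```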


A post from the company blog states:

This is the first time a chatbot has learned to combine several conversational skills, such as the ability to assume a personality, discuss almost any topic, and show empathy, in natural conversation flows.

Our new recipe incorporates not only large-scale neural models with up to 9.4 billion parameters (3.6 times more than the largest existing system), but also equally important techniques for blending skills and fine-grained conversation.

The number of parameters Blender uses plays a big role in how convincing any AI conversation is. A larger model requires more computing power, but produces significantly better results.
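
To put that figure in perspective, a rough back-of-the-envelope calculation (assuming the weights are stored as 16-bit floats) shows why the largest model demands serious hardware:

```python
# Back-of-the-envelope estimate (illustrative only): memory needed just to
# hold the weights of the largest Blender model.
params = 9.4e9        # 9.4 billion parameters, per Facebook's announcement
bytes_per_param = 2   # assuming 16-bit (fp16) weights

print(f"~{params * bytes_per_param / 1e9:.0f} GB of memory")  # ~19 GB
```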

According to the Blender project page:

Good conversation requires a set of skills that an expert conversationalist blends seamlessly: providing engaging talking points and listening to one's partners, asking and answering questions that display knowledge, and showing empathy and personality as the occasion demands. It seems that large-scale models can learn these skills when given the right training data and a suitable generation strategy.

The result is a chatbot that can convincingly reproduce human conversation. In a way, it is capable of passing the Turing test. It may well be the most advanced conversational AI on the planet.

Facebook's artificial intelligence, however, does not understand a word of what it says. It does not "listen" to anyone; it cannot understand language or the meaning behind words. It only draws correlations between the statements its developers fed it and selects the ones it deems appropriate answers for a topic.

It cannot "know" anything; it does not draw on a database of facts but makes associations that merely seem to make sense. For example, if you tell it that your favorite song is one by the Smashing Pumpkins, it might reply that it prefers a different song. But it has never listened to music. It only processes language; it has no concept of sound, images, or video.
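
The point is easy to see in miniature. The toy sketch below is purely illustrative and is not Blender's architecture; it generates replies from word-co-occurrence statistics alone, producing plausible-looking text with zero understanding of what the words mean:

```python
import random
from collections import defaultdict

# Toy next-word model: learns only which word tends to follow which.
corpus = "i like the smashing pumpkins . i like the beatles more ."
follows = defaultdict(list)
words = corpus.split()
for a, b in zip(words, words[1:]):
    follows[a].append(b)

def babble(start, length=8):
    out = [start]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        out.append(random.choice(candidates))  # plausible, but no understanding
    return " ".join(out)

print(babble("i"))  # e.g. "i like the beatles more . i like"
```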

It also has no empathy; it cannot analyze your feelings or express its own. It simply responds with statements it has been trained to treat as the appropriate answers. If, for example, you say you are sad, it will not reply with "congratulations" or "many happy returns."

Basically, Facebook's artificial intelligence seems destined to become a perfect liar, because it has no other choice. Artificial intelligence does not feel or think, so it cannot have experiences, and without experience it will never be human. The bot has to lie in order to pass itself off as human.

There may come a time when humanity regrets its decision to create artificial intelligence bots. The biggest danger, however, is not bots that will use their knowledge to harm us, but the people who can.

If you want more information about Blender, see the paper here.


Written by giorgos

