A few years ago, Business Insider predicted that 80% of business applications would use chatbots by 2020. Today, despite a multitude of artificial intelligence agents designed to interact with users, reality shows that only a few of them are actually used by people. In other words, we still lack a fundamental understanding of the mechanisms that shape our experience of them: why, for example, does Microsoft’s Chinese chatbot Xiaoice accumulate millions of monthly users, while Microsoft’s English-language counterpart, Tay, has been discontinued?

Research from Stanford University investigated the effects of the language used to introduce these conversational artificial intelligences. The study, “How to Build a Likable Chatbot,” was presented at the ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW) and is based on three studies in which nearly 300 participants interacted with an artificial intelligence agent.

Today, artificial intelligence agents are often associated with some kind of metaphor. Some, like Siri and Alexa, are presented as administrative assistants; Xiaoice is proposed as a friend, almost a virtual boyfriend; and Woebot as a psychotherapist. These metaphors are intended to help us understand and predict how these AI agents should be used and how they will behave.

Ranjay Krishna, one of the authors of the study, explains: “If, for example, the metaphor prompts people to expect a highly competent artificial intelligence capable of understanding complex commands, they evaluate the same interaction with the agent differently than if they expect to be confronted with an intelligence capable of understanding only simple commands. Similarly, if users expect a warm and welcoming experience, they evaluate it differently than if they expect a colder, more professional one.”

The experiment involved using different metaphors to introduce people to the various chatbots they would interact with.
The results: “Low-proficiency metaphors (for example, ‘This chatbot is like a child’) led to increased usability and a greater willingness on the part of the user to cooperate. Conversely, high-proficiency metaphors, such as ‘This chatbot is of a professional level,’ led users to a negative view of the experience.” The descriptions are powerful: “Our analysis suggests that designers should carefully consider the effects of the metaphors they associate with the artificial intelligence systems they create, especially if those metaphors communicate expectations of high competence.” Mitsuku, for example, presented as the “five-time winner of the Turing test,” is a project that has been abandoned.