This New Tech Puts AI In Touch With Its Emotions—and Yours


A new "empathic voice interface" launched today by Hume AI, a New York-based startup, makes it possible to add a range of emotionally expressive voices, plus an emotionally attuned ear, to large language models from Anthropic, Google, Meta, Mistral, and OpenAI, portending an era when AI helpers may more routinely get all gushy on us.

"We specialize in building empathic personalities that speak in ways people would speak, rather than stereotypes of AI assistants," says Hume AI cofounder Alan Cowen, a psychologist who has coauthored a number of research papers on AI and emotion, and who previously worked on emotional technologies at Google and Facebook.

WIRED tested Hume's latest voice technology, called EVI 2, and found its output to be similar to that developed by OpenAI for ChatGPT.

OpenAI has not said how much its voice interface tries to measure the emotions of users, but Hume's is expressly designed to do that.

During interactions, Hume's developer interface will show values indicating a measure of things like determination, anxiety, and happiness in the user's voice.
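To give a sense of what consuming such per-utterance emotion scores might look like, here is a minimal Python sketch of a client that listens on a streaming connection and prints the strongest detected emotions. The WebSocket endpoint, the API-key query parameter, and the message schema (a nested "models.prosody.scores" mapping of emotion names to values) are assumptions for illustration only, not confirmed details of Hume's API; consult Hume's official documentation for the actual contract.

```python
# Minimal sketch: read emotion scores from a streaming voice interface.
# Endpoint URL, auth scheme, and message schema below are assumptions.
import asyncio
import json
import os

import websockets  # pip install websockets

EVI_URL = "wss://api.hume.ai/v0/evi/chat"  # assumed endpoint


async def listen_for_emotions() -> None:
    api_key = os.environ["HUME_API_KEY"]  # hypothetical API-key auth
    async with websockets.connect(f"{EVI_URL}?api_key={api_key}") as ws:
        async for raw in ws:
            message = json.loads(raw)
            # Assumed schema: each user utterance carries prosody scores
            # keyed by emotion name, e.g. "Determination", "Anxiety".
            scores = (
                message.get("models", {})
                .get("prosody", {})
                .get("scores", {})
            )
            if scores:
                top3 = sorted(
                    scores.items(), key=lambda kv: kv[1], reverse=True
                )[:3]
                print("Top emotions:", top3)


if __name__ == "__main__":
    asyncio.run(listen_for_emotions())
```

Under these assumptions, a developer dashboard like the one WIRED describes would simply render these per-utterance scores as they stream in, rather than waiting for the conversation to finish.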