At the 2009 TED conference, Hanson Robotics showcased our collaboration with the UCSD Machine Perception Lab (Javier Movellan, Marian Bartlett, Nick Butko, Jake Whitehill, Paul Ruvolo), with work supported by Stuart Baurmann and David Hanson on our side. The robot tracks faces and sound, perceives facial expressions, and mimics the user’s facial expressions. Our belief is that understanding [...]
Hanson Robotics’ Einstein learns about emotions at the UCSD Machine Perception Lab. This hyper-realistic Einstein robot at the University of California, San Diego learned to smile and make realistic facial expressions through a process of self-guided machine learning.
The android portrait of Philip K. Dick: an intelligent, evolving robotic recreation of the sci-fi writer who authored VALIS, Do Androids Dream of Electric Sheep?, Ubik, and many other masterworks. By resurrecting PKD as an android, we seek to realize genius-level AI with compassion and creativity. While we have a long way to go, even the [...]
This robot serves cognitive robotics research at the renowned MIRAlab at the University of Geneva, beginning as part of the INDIGO cognitive robotics consortium, of which Hanson Robotics was a founding member. The INDIGO consortium won over €6 million in EU funding, and its collaborative research resulted in ground-breaking robots, open-source software, and numerous [...]
Hanson Robotics’ Jules robot, shown at WIRED NextFest 2006. Jules featured naturalistic eye saccades, conversational interactivity, and a full range of humanlike facial expressions. We built Jules for the University of the West of England, testing the robot with our own software before shipping. Since then, Jules has served numerous science and development [...]