Jibo: The first social robot with interactive skills
In less than 10 years, robotics markets around the world will roughly double to $67 billion, according to the Boston Consulting Group. In 2015, robotics startups raised about $1.2 billion, three times the 2014 figure.
During this robotics boom, social robotics is becoming a specialized business and technology niche in which robots act as human companions, interacting and communicating through social behavior patterns and deep speech recognition.
Some engineers also see social robotics as a new kind of experience that will free users from constant, direct engagement with the many devices we juggle every day. Instead, it aims to change technology's role in our daily habits and make our lives more enriching.
One of the latest and most anticipated projects in social robotics is Jibo, which takes social engagement with individuals to a new level. Jibo will respond to your needs with emotion and vocal expression, and this could be a game-changer.
What is Jibo, anyway?
We are living in an era where the dream of a home robot is finally within reach, thanks to inexpensive, high-end technologies: powerful low-power microprocessors, 3-D sensors, gyroscopes, display and sensing technologies, lightweight lithium batteries, machine learning, and cloud computing.
Together, these make a robot both autonomous and aware of the objects and people around it. Such robots can inform their owners, express human-like emotions, entertain, and support everyday needs.
For Jibo, the goal of Cynthia Breazeal's team is to create a robot with a personality and a wide range of skills that make interaction between Jibo and humans easy and helpful. The team assembled to build Jibo includes specialists in animation, speech recognition, and human-machine interaction. Jibo is a family-oriented robot, designed to assist each family member in its own way.
Jibo’s SDK features and natural skills
Jibo's engineers developed a three-cylinder robot body. Thanks to the careful placement of electrical and mechanical components, the body sections are joined at an angle that allows natural rotation, making Jibo's motions smoother and more expressive. In conjunction with its voice, Jibo will use body language to add an emotional dimension to its spoken words.
Jibo's ability to communicate rests on a concept far more complex than simply hearing and responding. The robot's speech algorithm differs from Siri's or Google's, which send speech signals to cloud-based computers for processing.
Jibo's conversational skills take a different approach to ensure quick responses: speech processing happens on the device, without round-trip delays. Jibo will communicate using a natural-language model, so he can talk to people in an engaging manner. His speech engine is built on roughly 14,000 pre-recorded phrases, which the speech algorithm assembles into intelligent utterances.
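To make the phrase-assembly idea concrete, here is a minimal sketch of how an utterance might be composed from a fixed inventory of pre-recorded clips. This is purely illustrative: the `PHRASES` inventory, the greedy matching, and the TTS fallback are our assumptions, not Jibo's actual speech engine.

```python
# Hypothetical sketch: assembling an utterance from pre-recorded phrase clips.
# The phrase inventory and fallback logic are illustrative assumptions.

PHRASES = {
    "good morning": "clip_0041.wav",
    "the weather today is": "clip_0102.wav",
    "sunny": "clip_0217.wav",
    "rainy": "clip_0218.wav",
}

def compose_utterance(text):
    """Greedily match the longest known phrase at each position,
    returning the ordered list of audio clips to play."""
    words = text.lower().split()
    clips, i = [], 0
    while i < len(words):
        # Try the longest candidate phrase starting at position i.
        for j in range(len(words), i, -1):
            candidate = " ".join(words[i:j])
            if candidate in PHRASES:
                clips.append(PHRASES[candidate])
                i = j
                break
        else:
            # Word not in the inventory: would fall back to on-device TTS.
            clips.append(f"tts:{words[i]}")
            i += 1
    return clips

print(compose_utterance("Good morning the weather today is sunny"))
# → ['clip_0041.wav', 'clip_0102.wav', 'clip_0217.wav']
```

Working from a library of recorded phrases rather than synthesizing every word is one plausible way to keep responses fast and natural-sounding without a round trip to the cloud.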
Jibo will launch with a limited set of skills, but thanks to Jibo's open platform, professional developers will be able to extend Jibo's capabilities by creating new apps. Users will be able to visit the Jibo Store to purchase and download apps that expand their own Jibo's skill set.
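The skill/app model described above can be sketched in miniature: a skill registers trigger phrases, and the robot dispatches recognized speech to the first skill that can handle it. The `Skill` and `Robot` classes and their method names below are our own illustrative assumptions, not the actual Jibo SDK.

```python
# Hypothetical sketch of a skill/app model for an open robot platform.
# Class and method names are illustrative assumptions, not the Jibo SDK.

class Skill:
    def __init__(self, name):
        self.name = name
        self.handlers = {}  # spoken trigger phrase -> callback

    def on_phrase(self, trigger, callback):
        self.handlers[trigger.lower()] = callback

    def handle(self, utterance):
        callback = self.handlers.get(utterance.lower())
        return callback(utterance) if callback else None

class Robot:
    """Dispatches recognized speech to the first installed skill
    that knows how to respond."""
    def __init__(self):
        self.skills = []

    def install(self, skill):
        self.skills.append(skill)

    def hear(self, utterance):
        for skill in self.skills:
            response = skill.handle(utterance)
            if response is not None:
                return response
        return "Sorry, I don't know how to help with that yet."

# A third-party "weather" skill installed from a hypothetical app store.
weather = Skill("weather")
weather.on_phrase("what's the weather", lambda _: "It looks sunny today.")

robot = Robot()
robot.install(weather)
print(robot.hear("What's the weather"))  # → It looks sunny today.
```

Installing a skill here simply adds a new dispatch target, which mirrors the idea that downloading an app from the Jibo Store enlarges the robot's repertoire without changing its core.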
Early purchasers of the Indiegogo third-party developer package will be the first to develop with and test the new Jibo SDK. Our team at Onix is an early supporter of the Jibo developer package. Interested developers who wish to have their skill released first, please contact us.