Monday, November 27, 2017

Oh, patents! Pepper

Copyright © Françoise Herrmann

Before concluding this chapter on the French Softbank Robotics humanoid robots, with their many patented inventions for breakthrough body parts, expressive body language, and impressive relational skills, here is Pepper's US design patent. Pepper is one of the three humanoids embodying the Softbank Robotics patented inventions previously presented.


Pepper's US design patent USD719620, titled Robot, was filed by Aldebaran Robotics (the former name of Softbank Robotics) on Oct. 18, 2013, and granted on Dec. 16, 2014, with Vincent Clerc as inventor.

As a reminder, a US design patent differs from a US utility patent in that:

"a 'utility patent' protects the way an article is used and works (35 U.S.C. 101), while a 'design patent' protects the way an article looks (35 U.S.C. 171)." (USPTO)
The USD719620 patent figure drawing No. 1 is included above, and the video below, posted on YouTube by Softbank Robotics Europe on June 10, 2014, shows Pepper awakening.


References
Softbank Robotics
Softbank Robotics - Pepper

Sunday, November 26, 2017

Oh, patents! Humanoid voices (2)

Copyright © Françoise Herrmann

Lots more patented inventions for humanoids in the quest to emulate participation in human conversations! 

The Softbank Robotics robots are all connected robots. This makes them technically unfinished products, since their functionalities may be indefinitely expanded, customized and modified with downloadable apps and extensions designed by a community of third-party developers.

Access to the world wide web, and to a remote database of information, greatly improves the humanoid's capacity to parse and respond to human input in a dialogue, as it provides a way of personalizing interactions and breaking out of canned and repetitive modes of response. Indeed, even when prior-art humanoid output is programmed to prevent repetitions, locally stored responses (even shuffled) are substantially the same for all human interlocutors. For an extreme example of stereotypical interaction, everyone has experienced interactive voice response (IVR) systems on the telephone, designed to lead human callers indiscriminately through a script and its options. For a humanoid robot seeking to function as a companion, this sort of indiscriminate, pre-recorded and stereotypical routine is unacceptable. The companion robot requires a bit more conversational capacity if it is to eventually function as an assistant to daily living, and especially if it is to create engaging communication, or any kind of bond, albeit an inevitably deceptive one.

The invention recited in US2016283465, titled Method for dialogue between a machine, such as a humanoid robot, and a human interlocutor; computer program product; and humanoid robot for implementing such a method, precisely addresses the issue of conversational ability, and in particular two aspects, termed personalization and progression, both within and across sessions of communication.

The patent discloses that the humanoid robot has access to a remote database containing user profiles, which may be extracted and updated with new information provided in the dialogue with a human interlocutor. Similarly, the humanoid robot is programmed to use profile information in the responses it generates, or to solicit more information to fill in missing profile information. So, for example, the humanoid may identify an interlocutor by name, based on input data from camera sensors (not only verbal input), and then recall the previous conversation, based on history information stored in the profile, to generate a more personalized response, as in the following sequence, cited in the patent:
[0062] Robot: Hello Jean! Thanks for coming back to see me. That makes me happy! 
[0063] How have you been since yesterday?
The profile might also be linked to additional contextual information that the humanoid can use in response to human input. The example cited in the patent concerns Europe Day, celebrated on May 9, and draws on both the user-profile field "Mother tongue" and the missing profile field "Other languages spoken". The dialogue prompt, generated by the robot to feed the conversation, and its human response, run as follows:
[0069] Robot: Tell me, do you know that today is Europe day? I think it's great, these countries uniting, with all the different languages and cultures. Also, I speak a number of European languages! And you, Jean, do you speak any languages other than French? [...]
[0072] Human: Yes, I speak English fluently.
The abstract for US2016283465 is included below. The patent figure drawing No. 1 is also included. The drawing shows a human interlocutor, and a humanoid robot with access, via a communication network, to a remote server containing user profiles, for use (extraction and update) in dialogue with humans.  

A method for performing a dialog between a machine, preferably a humanoid robot, and at least one human speaker, comprises the following steps, implemented by a computer: a) identifying the human speaker; b) extracting from a database a speaker profile comprising a plurality of dialog variables, at least one value being assigned to at least one of the dialog variables; c) receiving and analyzing at least one sentence originating from the speaker; and d) formulating and emitting at least one response sentence as a function at least of the sentence received and interpreted in step c) and of one dialog variable of the speaker profile. [Abstract US2016283465]
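The four steps a)–d) recited in the abstract can be sketched in Python. This is only a minimal illustration of the method's logic; all names, the in-memory "database", and the response wording are hypothetical, since the patent does not disclose source code:

```python
# Minimal sketch of steps a)-d) of US2016283465; all names are
# hypothetical -- the patent does not disclose an implementation.

PROFILES = {  # stand-in for the remote user-profile database
    "jean": {"name": "Jean", "mother_tongue": "French",
             "other_languages": None, "last_seen": "yesterday"},
}

def identify_speaker(sensor_input):
    """Step a): identify the speaker (camera/voice input in the patent)."""
    return sensor_input.lower()

def extract_profile(speaker_id):
    """Step b): extract the speaker profile and its dialogue variables."""
    return PROFILES.setdefault(speaker_id, {"name": speaker_id.title()})

def formulate_response(sentence, profile):
    """Steps c)-d): analyze the incoming sentence and respond as a
    function of both the input and a dialogue variable of the profile."""
    if profile.get("other_languages") is None:
        # Missing variable -> solicit it, personalizing with the name.
        return (f"Hello {profile['name']}! Do you speak any languages "
                f"other than {profile.get('mother_tongue', 'your own')}?")
    return f"Nice to hear from you again, {profile['name']}!"

def update_profile(profile, key, value):
    """Update the (remote) profile with information from the dialogue."""
    profile[key] = value

profile = extract_profile(identify_speaker("Jean"))
print(formulate_response("Hi there", profile))
update_profile(profile, "other_languages", ["English"])
print(formulate_response("Yes, I speak English fluently.", profile))
```

The progression described in the patent comes from the update step: once "Other languages spoken" has been filled in during one session, later sessions no longer solicit it.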

This patent is part of a family, the list of which is appended below. 
  • US2016283465 (A1) ― 2016-09-29 - Method for dialogue between a machine, such as a humanoid robot, and a human interlocutor; computer program product; and humanoid robot for implementing such a method
  • AU2014331209 (A1) ― 2016-05-19 - Method for dialogue between a machine, such as a humanoid robot, and a human interlocutor; computer program product; and humanoid robot for implementing such a method 
  • CA2925930 (A1) ― 2015-04-09 - Method for dialogue between a machine, such as a humanoid robot, and a human interlocutor; computer program product; and humanoid robot for implementing such a method  
  • EP3053162 (A1) ― 2016-08-10 - Method for dialogue between a machine, such as a humanoid robot, and a human interlocutor; computer program product; and humanoid robot for implementing such a method
  • FR3011375 (A1) ― 2015-04-03 - Procédé de dialogue entre une machine, telle qu'un robot humanoïde, et un interlocuteur humain, produit programme d'ordinateur et robot humanoïde pour la mise en œuvre d'un tel procédé 
  • JP2016536630 (A) ― 2016-11-24 - 人型ロボット等の機械と人間話者との間の対話方法、コンピュータプログラム製品、および同方法を実行する人型ロボット
  • MX2016004208 (A) ― 2017-08-16 - Method for dialogue between a machine, such as a humanoid robot, and a human interlocutor; computer program product; and humanoid robot for implementing such a method. 
  • WO2015049198 (A1) ― 2015-04-09 - Method for dialogue between a machine, such as a humanoid robot, and a human interlocutor; computer program product; and humanoid robot for implementing such a method
So, does it all work? Can Softbank Robotics humanoids hold up their end of the conversation? Are Softbank Robotics humanoids likable? Lovable? Engaging? Can these humanoids be potential companions? The following video shows what happened with Nao, at three schools in the UK.

References
Softbank Robotics
Softbank Robotics - Nao

Tuesday, November 21, 2017

Oh, patents! Humanoid voices (1)

Copyright © Françoise Herrmann

A humanoid robot without a voice would really have missed the point of emulating humans!

So, each of the Softbank Robotics humanoid robots – Nao, Pepper and Romeo – is not only equipped with a voice (i.e., a speech synthesizer and speech recognition with a natural-language interface), it also has actuators generating coordinated body language that animates interactions to make them more mimetic – plus much more in terms of personalized and synergetic interactive capacity.

The many Softbank Robotics R&D partnerships in human/machine interaction, at major academic robotics research labs in France and Europe, both contribute to the humanoids' development and use the robotics platforms to further their own research agendas.

One such project, led by Devillers (2017) at LIMSI (a laboratory affiliated with France's National Center for Scientific Research), seeks to model not only the verbal components of human/machine interaction, but also the non-verbal or paralinguistic aspects of interaction. The assumption is that the quality of human/machine interactions might increase when more aspects of the interactions are modeled and detected. It is not only what is said that matters, but how it is said, for example with intonation or facial expression. Detecting that an interlocutor might be annoyed, angry, perplexed or unsure will greatly enhance the quality of the interactions.

Ultimately, the goal is for the robot companion to please, so that more satisfying human/machine interactions might arise, perhaps creating conditions for a warm relationship to take root, even if it is going to be a very deceptive one. Thus, the project also includes an ethical component, designed to limit the ways in which such potentially deceptive interactions play out with vulnerable populations, such as children, the elderly and handicapped. 

Finally, the project seeks to define machine humor, also for the purposes of making the machine more likable. If the machine is bound to make mistakes, then the machine’s own error detection might be transformed into humor. Otherwise, humor might also be modeled as a form of response to the detection of certain emotional states. 

The detection and modeling of non-verbal input in human/machine interaction, for the purposes of enhancing human/machine interactions, is a patented Softbank Robotics invention.  US2017148434 titled Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method, discloses the acquisition of input from at least a sound sensor and a motion or image sensor, for the purposes of interpreting such linguistic and paralinguistic aspects of human/machine interaction as utterances, intonation, gestures, facial expressions and body posture. In turn, the interpretation of multi-modal input is designed to invoke a humanoid response that also includes both linguistic and paralinguistic features, such as an utterance, intonation, gestures, facial expression, and body posture!  

Thus, the humanoid response is also animated. The patent figure drawing 5b shows the syntactic analysis of the utterance "I agree with you", for the purpose of determining the insertion point(s) of the mechanical actuation that will animate the robot's response.

The abstract of this patent is included below:
A method of performing dialogue between a humanoid robot and user comprises: i) acquiring input signals from respective sensors, at least one being a sound sensor and another being a motion or image sensor; ii) interpreting the signals to recognize events generated by the user, including: the utterance of a word or sentence, an intonation of voice, a gesture, a body posture, a facial expression; iii) determining a response of the humanoid robot, comprising an event such as: the utterance of a word or sentence, an intonation of voice, a gesture, a body posture, a facial expression; iv) generating an event by the humanoid robot; wherein step iii) comprises determining the response from events jointly generated by the user and recognized at step ii), of which at least one is not words uttered by the user. A computer program product and humanoid robot for carrying out the method is provided. [Abstract US2017148434]
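The loop recited in steps i)–iv) can be illustrated with a toy Python sketch. The event names, pitch threshold and response rules below are illustrative assumptions, not details from the patent:

```python
# Toy sketch of the multi-modal loop i)-iv) of US2017148434; the
# event names, thresholds and rules are illustrative assumptions.

def recognize_events(audio, video):
    """Steps i)-ii): turn raw sensor input into user-generated events."""
    events = []
    if audio.get("words"):
        events.append(("utterance", audio["words"]))
    if audio.get("pitch", 0) > 250:          # crude intonation cue (Hz)
        events.append(("intonation", "raised"))
    if video.get("gesture"):
        events.append(("gesture", video["gesture"]))
    return events

def determine_response(events):
    """Step iii): determine a response from jointly recognized events,
    at least one of which is not words uttered by the user."""
    kinds = {kind for kind, _ in events}
    if "utterance" in kinds and "gesture" in kinds:
        # Respond with both a verbal and a paralinguistic event.
        return {"say": "I agree with you.", "do": "nod"}
    if "intonation" in kinds:
        return {"say": "You sound upset. Can I help?", "do": "tilt_head"}
    return {"say": "I'm listening.", "do": "idle"}

# A greeting plus a wave yields a reply that is itself multi-modal:
response = determine_response(
    recognize_events({"words": "hello", "pitch": 180},
                     {"gesture": "wave"}))
print(response)
```

The point of the invention is exactly this joint interpretation: the response is a function of the combination of verbal and non-verbal events, not of the words alone.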

This invention is disclosed in a whole family of patents listed, and hyperlinked, below. 
  • US2017148434 (A1) ― 2017-05-25 - Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method 
  • AU2015248713 (A1) ― 2016-11-03 - Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method 
  • CA2946056 (A1) ― 2015-10-22 - Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method 
  •  EP2933067 (A1) ― 2015-10-21 - Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method 
  • HK1216405 (A1) ― 2016-11-11 - Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method
  • JP2017520782 (A) ― 2017-07-27 - Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method  
  • KR20170003580 (A) ― 2017-01-09 - Method of performing multi-modal dialogue between a humanoid robot and user computer program product and humanoid robot for implementing said method
  • MX2016013019 (A) ― 2017-05-30 - Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method. 
  • SG11201608205U (A) ― 2016-10-28 - Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method 
  • WO2015158887 (A2) ― 2015-10-22 - Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method
The following video will give you a glimpse of how well Pepper performs, in an interview, with a human. 


NB. Aldebaran Robotics is the former Softbank Robotics.

References
Devillers, L. (2017) Rire avec les robots pour mieux vivre avec. Le Journal du CNRS 9-02-2017 
LIMSI - Laboratoire d'Informatique pour la Mécanique et les Sciences de l'Ingénieur
Projet romeo – Partenaires
Softbank Robotics

Saturday, November 18, 2017

Oh, patents! Nao's joints

Copyright © Françoise Herrmann

Still a few more state of the art robotics patents! 

Nao’s joints raise a safety issue within the larger concern of preventing a humanoid robot's movements from injuring a human, and in particular from pinching a human caught in the robot’s joints. The anti-pinching system disclosed in WO2015185671, titled Anti-jamming system in a humanoid-type robot, was thus designed to address a safety problem that the prior art had only partially solved: the risk of pinching when using humanoid robots.

Indeed, the state of the art had previously explored reducing movement, or leaving a gap, in a robot’s joints to prevent pinching, which by the same token reduced the sorts of movements that could be performed, and thus limited the anthropomorphism of the humanoid. The state of the art also attempted to limit the force deployed to actuate a joint, but this solution tended to reduce the overall capacity of the humanoid robot, for example, to lift heavy equipment.

In response, this anti-jamming invention discloses a passive solution that limits the pinching force of a humanoid joint. It consists of a strategically positioned, soft and flexible membrane at any area of a humanoid joint (e.g., hip, elbow, shoulder, knee, wrist) where a risk of pinching might be anticipated. This membrane can deform by a certain distance (i.e., the diameter of the round black circle (20) representing a finger in the two included patent drawings) in cases where a finger gets caught in the humanoid joint, or in the area between the robot’s trunk and joint. Thus, while the outer shell of the robot’s limbs and trunk is made of hard plastic, certain at-risk areas of the outer shell comprise the contiguous invention areas, made of soft and flexible silicone or rubber, able to accommodate a finger that could get caught, thereby passively limiting the pinching force.
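The passive force limiting can be modeled, very roughly, as a spring: the flexible region deforms under load, so the force transmitted to a trapped finger stays bounded by the membrane's stiffness times its maximum deflection. A toy sketch, with purely illustrative values not taken from the patent:

```python
# Toy spring model of the passive force limiting in WO2015185671;
# the stiffness and deflection values are illustrative assumptions.

def pinch_force(deflection_m, stiffness_n_per_m=500.0,
                max_deflection_m=0.012):
    """Force (N) transmitted to a finger caught in the joint: the
    flexible region deforms like a spring up to max_deflection_m
    (roughly a finger diameter), bounding the transmitted force."""
    x = min(deflection_m, max_deflection_m)
    return stiffness_n_per_m * x

print(pinch_force(0.005))   # partial deflection: 500 * 0.005 = 2.5 N
print(pinch_force(0.050))   # capped at 500 * 0.012 = 6.0 N, however
                            # hard the joint presses
```

The cap is the whole point of the passive design: no sensor or controller is needed, because the geometry and material bound the worst-case pinching force.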

This anti-jamming invention is disclosed in a family of patents. The hyperlinked list of patents in this family appears below. The English abstract of the WIPO member of this patent family, WO2015185671, titled Anti-jamming system in a humanoid-type robot, is also included with two patent figure drawings. The two drawings respectively show a black circle (20) representing a finger potentially pinched in Nao’s elbow joint (below), and a black circle (20) representing a finger potentially caught in the space between Nao’s shoulder joint and trunk (above), with both drawings showing the anti-jamming flexible area designed to accommodate the finger and limit the pinching force.

The invention relates to the safety of use of a humanoid-type robot. The humanoid-type robot comprises two elements (2, 3) and an articulation (11) with at least one degree of freedom that connects the two elements, the two elements each comprising a skin (22, 23) delimiting the outer surface thereof, the articulation (11) allowing movement in a given range, a first of the two elements (2, 3) being intended to come substantially into contact with a region (25, 26) of the skin (22, 23) of a second of the two elements at one end of the range. According to the invention, the region (25, 26) is flexible such that it can be deformed by a given distance with a force less than a given force. The first element (3) is coupled to the second element (2) by passing through the flexible region (26).  [Abstract WO2015185671]
Patent Family 
  • WO2015185671 (A1) ― 2015-12-10 - Anti-jamming system in a humanoid-type robot
  • AU2015270477 (A1) ― 2016-12-01 - Anti-jamming system in a humanoid-type robot 
  • CA2950652 (A1) ― 2015-12-10 - Anti-jamming system in a humanoid-type robot 
  • EP3152013 (A1) ― 2017-04-12 - Anti-jamming system in a humanoid-type robot  
  • FR3021573 (A1) ― 2015-12-04 - Système anti coincement dans un robot a caractère humanoide  
  • JP2017516672 (A) ― 2017-06-22 - ヒューマノイド型ロボットにおける噛み込み防止システム
  • KR20170021828 (A) ― 2017-02-28 - Anti-jamming system in a humanoid-type robot
  • MX2016015823 (A) ― 2017-06-28 - Anti-jamming system in a humanoid-type robot 
  • SG11201609419R (A) ― 2016-12-29 - Anti-jamming system in a humanoid-type robot
  • US2017080582 (A1) ― 2017-03-23 - Anti-jamming system in a humanoid-type robot

Thursday, November 9, 2017

Oh, Netflix! Robot & Frank (2012)

Copyright © Françoise Herrmann

And, now for a bit of entertainment!  

Robot & Frank (2012) is the story of a robot, a bit like Romeo's alter ego, purchased by Hunter to assist Frank, his aging, sloppy, and forgetful father. After a bit of resistance, and much learning on the part of both master and automated companion, Frank and the robot team up for one last triumphant heist.

This is the sweetest science-fiction comedy-drama, starring Susan Sarandon and Frank Langella, directed by Jake Schreier, and winner of the Alfred P. Sloan Feature Film Prize at the Sundance Film Festival.

Here is the trailer.  This film and preview are rated PG-13. 


Reference
Robot & Frank (2012). Directed by Jake Schreier. Produced by Lance Acord, Sam Brisbee, Jackie Kelman-Brisbee and Gail Nederhoffer. Written by Christopher Ford.

Wednesday, November 8, 2017

Oh, patents! Nao’s sibling (Romeo)

Copyright © Françoise Herrmann

Nao’s youngest and largest sibling is Romeo, a humanoid robot designed as a companion for the elderly. Thus, Romeo is being developed with both specific automated physical capacities and artificial intelligence for cognitive assistance and communication with an elderly population. The robot’s cognitive assistance module includes, for example, breakfast, lunch, nap and dinner routines, and reminders to drink fluids. The conversation module, for example, enables patients to query the robot for the time of day, the date and the latest news, or for certain tasks, such as “turn on the lights”, “show me a movie”, “answer the phone”, “bring me my glasses”, “follow me”, “let’s exercise”, “hold this”, “empty the dishwasher”, plus much more, since this is a connected robot, whose functionalities are, in principle, indefinitely expandable with new apps, new extensions, and upgrades.

Indeed, the Romeo platform was actually tested with a consortium of 16 top industrial and academic robotics research partners, such as the CNRS (France’s National Center for Scientific Research) and INRIA (France’s Institute for Research in Computer Science and Automation), for more than 4 years. This mega-collaboration has given rise to many patented inventions, some of which inform the design of second-generation Nao robots and the design of Pepper, or alternatively, some of which were tested on Nao before becoming available on Romeo. Such R&D research projects have, for example, focused on hand and wrist operation, gestures to enhance communication, enhanced visual recognition or coordination of visual recognition with gestures in communication, robot safety, and collision avoidance. The following video shows just one such R&D project, focused on Romeo pouring a glass of water, while still tethered to a computer workbench.


The below-listed patent family, related to the safe use of a humanoid robot, discloses one of the important inventions connected to the Romeo project. This is an invention that addresses the issue of preventing damage to the surroundings in case the robot falls and loses control of its movements. In fact, this invention, titled Safety of a humanoid-type robot in US2017072560, discloses an emergency stop button.

The emergency stop button is triggered when a force exceeding a given threshold is exerted, for example in case of impact, or if the robot falls for one reason or another. The button can also be actuated by an operator who wants to take the robot out of service because of some observed malfunction. In the preferred embodiment, the triggering force is exerted by the robot’s own movements, and in particular by the robot's head against its trunk, as shown in the appended patent Figure 4, below.

The following is a list of the patents belonging to the patent family disclosing the emergency stop button for the purpose of safely using humanoid robots. The abstract for the US member of the patent family is also included below with Figure 4 of the patent showing the robot's head actuating the switch on the robot's trunk. 
A humanoid-type robot comprises two elements and an articulation with at least one degree of freedom linking the two elements, the articulation allowing a travel in a given range in operational operation, a first of the two elements being intended to come into contact with an abutment belonging to a second of the two elements at the end of the range. According to the invention, the robot further comprises at least one switch. The switch is configured to actuate an electrical contact when a force exerted by the first element against the abutment exceeds a given force. [Abstract US2017072560]
  • US2017072560 (A1) ― 2017-03-16 - Safety of a humanoid-type robot
  • AU2015270476 (A1) ― 2016-12-01 - Safety of a humanoid-type robot 
  • CA2950660 (A1) ― 2015-12-10 - Safety of a humanoid-type robot
  • EP3152008 (A1) ― 2017-04-12 - Safety of a humanoid-type robot
  • FR3021572 (A1) ― 2015-12-04 - Sécurité d'un robot à caractère humanoïde
  • JP2017516671 (A) ― 2017-06-22 - ヒューマノイド型ロボットの安全性
  • KR20170021800 (A) ― 2017-02-28 - Safety of a humanoid-type robot
  • MX2016015822 (A) ― 2017-06-28 - Safety of a humanoid-type robot
  • NZ726224 (A) ― 2017-09-29 - Safety of a humanoid-type robot  
  • SG11201609420R (A) ― 2016-12-29 - Safety of a humanoid-type robot
  • WO2015185670 (A1) ― 2015-12-10 - Safety of a humanoid-type robot 
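The triggering behavior described in the abstract amounts to a threshold test on the force exerted against the abutment, plus a manual override. A minimal Python sketch (the threshold value and all names are assumptions, not from the patent):

```python
# Minimal sketch of the emergency-stop behavior described in
# US2017072560; the threshold value and names are assumptions.

FORCE_THRESHOLD_N = 50.0   # illustrative value, not from the patent

class EmergencyStop:
    def __init__(self, threshold=FORCE_THRESHOLD_N):
        self.threshold = threshold
        self.tripped = False

    def sense(self, force_newtons):
        """Trip the electrical contact when the force exerted against
        the abutment (e.g. head against trunk) exceeds the threshold."""
        if force_newtons > self.threshold:
            self.tripped = True
        return self.tripped

    def operator_stop(self):
        """An operator may also take the robot out of service manually."""
        self.tripped = True

stop = EmergencyStop()
stop.sense(12.0)    # normal operation: below threshold, no trip
stop.sense(80.0)    # impact or fall: exceeds threshold, contact trips
print(stop.tripped)
```

Note that the switch latches: once tripped, whether by excessive force or by the operator, it stays tripped until the robot is serviced.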
References
Softbank Robotics
Softbank Robotics - Nao
Softbank robotics - Pepper
Softbank Robotics - Romeo
CNRS
INRIA

Sunday, November 5, 2017

Oh, patents! Nao's siblings (Pepper)

Copyright © Françoise Herrmann

The definition of a humanoid robot according to many of the Softbank Robotics patents is the following:
A robot can be qualified as humanoid from the moment … it has certain human appearance attributes: a head, a trunk, two arms, two hands, etc. A humanoid robot may, however, be more or less sophisticated. Its limbs may have a greater or lesser number of articulations. It may control its own balance statically and dynamically and walk on two limbs, possibly in three dimensions, or simply roll over a base. It may pick up signals from the environment (“hear”, “see”, “touch”, “sense”, etc.) and react according to more or less sophisticated behaviors, and interact with other robots or humans, either by speech or by gesture. [Extracted from US20170197311A1] 
Thus, it probably comes as no surprise that Nao is also part of a humanoid family that includes a couple of humanoid siblings. Pepper is Nao’s first humanoid sibling. This Softbank Robotics humanoid robot measures 4 ft in height. Pepper’s trunk is humanoid, with two arms, two hands and a head. The lower half of her body is designed as a skirt comprising wheels to ensure mobility. Pepper is equipped with a 3D camera to detect people and their emotions, as well as her own surroundings. The little robot interacts via a tablet and voice. Her hands, fingers and forearms together display 20 degrees of freedom for gestures, specifically designed to enhance communication.

Pepper was designed as a business companion, in particular for retail and services. She is the ideal host, greeting clients in hotel lobbies, at airports, and in office buildings. She might also be found in stores, where customers might query her about goods or inventory. Pepper is also connected, with downloadable apps that add to, or customize, her functionalities. She was also designed with a community of developers in mind, who would provide her with endless new possibilities of use and functionality.

The following video pitches Pepper to developers:


Pepper embodies numerous patented inventions, some of which are found in all the humanoid robots manufactured by Softbank Robotics. For example, see the posts referring to the inventions covering Nao's hands and Nao's skull. However, the following two patent families are specific, respectively, to Pepper’s mobility and to her ability to dock independently into a recharging base:

Patent Family I (Pepper’s mobility)
  • US20170144299A1 - Omnidirectional wheeled humanoid robot based on a linear predictive position and velocity controller
  • AU2015248711 (A1) ― 2016-11-03 - Omnidirectional wheeled humanoid robot based on a linear predictive position and velocity controller  
  • EP2933069 (A1) ― 2015-10-21 - Omnidirectional wheeled humanoid robot based on a linear predictive position and velocity controller  
  • CA2946049 (A1) ― 2015-10-22 - Omnidirectional wheeled humanoid robot based on a linear predictive position and velocity controller 
  • HK1216406 (A1) ― 2016-11-11 - Omnidirectional wheeled humanoid robot based on a linear predictive position and velocity controller 
  • JP2017513726 (A) ― 2017-06-01 - Omnidirectional wheeled humanoid robot based on a linear predictive position and velocity controller
  • KR20170030078 (A) ― 2017-03-16 - Omnidirectional wheeled humanoid robot based on a linear predictive position and velocity controller
  • SG11201608204Q (A) ― 2016-10-28 - Omnidirectional wheeled humanoid robot based on a linear predictive position and velocity controller 
  • WO2015158885 (A2) ― 2015-10-22 - Omnidirectional wheeled humanoid robot based on a linear predictive position and velocity controller 

Patent family II (Pepper’s ability to dock herself into her recharging base)
  • US20170080816A1 Battery charging base and recharging method implementing such a base
  • AU2015270600 (A1) ― 2016-12-01 - Battery charging base and recharging method implementing such a base
  • CA2951060 (A1) ― 2015-12-10 - Battery charging base and recharging method implementing such a base 
  • EP3152080 (A1) ― 2017-04-12 - Battery charging base and recharging method implementing such a base
  • FR3021914 (A1) ― 2015-12-11 - Battery charging base and recharging method implementing such a base 
  • JP2017518195 (A) ― 2017-07-06 - Battery charging base and recharging method implementing such a base 
  • KR20170026441 (A) ― 2017-03-08 - Battery charging base and recharging method implementing such a base 
  • MX2016015828 (A) ― 2017-06-28 - Battery charging base and recharging method implementing such a base
  • SG11201609422X (A) ― 2016-12-29 - Battery charging base and recharging method implementing such a base 
  • WO2015185525 (A1) ― 2015-12-10 - Battery charging base and recharging method implementing such a base
References
Softbank Robotics – Pepper
Wikipedia – Pepper (Robot)

Thursday, November 2, 2017

Oh, patents! Nao’s skull


Copyright © Françoise Herrmann

Lots of fragile on-board electronic circuitry inside a humanoid robot that requires protection from falls!

A family of patents, including the US utility patent application US20170043487A1, titled Shock-absorbing device for a humanoid robot, discloses a Softbank Robotics shock-absorbing invention. This shock absorber (skull) is designed, in particular, for fall protection of the electronic payload found inside a humanoid robot’s head, such as Nao’s (or that of Nao's sibling, Pepper).

The Softbank Robotics shock absorber invention comprises specially designed shock-absorbing rubber material and two additional layers of protection: a deformable outer layer and a hard shell containing the fragile electronics. The shock-absorbing skull further comprises an outlet for a fan housed inside the humanoid's head, the fan functioning to cool the onboard electronics. The shock-absorbing skull also comprises conduits for channeling sound and light emitted by the onboard electronics through the eye orbits and the ear openings of the humanoid robot's head. Finally, a microphone, attached to the hard shell, is also insulated with the shock-absorbing material.

The shock absorbing system is disclosed for the front and back of the humanoid robot's head. It is also used on the front and back of the humanoid's torso. 

The sorts of falls that are protected by the shock-absorbing invention, disclosed in US20170043487A1, are shown at the end of the following video. 



The family of patents for this shock absorber invention is listed below:
  • AU2015257687 (A1) ― 2016-11-10 - Shock-absorbing device for a humanoid robot
  • CA2948212 (A1) ― 2015-11-12 - Shock-absorbing device for a humanoid robot 
  • EP3140086 (A1) ― 2017-03-15 - Shock-absorbing device for a humanoid robot
  • FR3020847 (A1) ― 2015-11-13 - Shock-absorbing device for a humanoid robot 
  • JP2017514711 (A) ― 2017-06-08 - Shock-absorbing device for a humanoid robot
  • KR20170058332 (A) ― 2017-05-26 - Shock-absorbing device for a humanoid robot
  • MX2016013596 (A) ― 2017-06-29 - Shock-absorbing device for a humanoid robot
  • SG11201608726X (A) ― 2016-12-29 - Shock-absorbing device for a humanoid robot
  • WO2015169894 (A1) ― 2015-11-12 - Shock-absorbing device for a humanoid robot
  • US2017043487 (A1) ― 2017-02-16 - Shock-absorbing device for a humanoid robot 

The abstract of the US member of this patent family is included below, with the patent drawing 2a showing a bird's eye view of Nao’s shock-absorbing skull. In patent drawing 2a, the skull cap was removed to reveal the inside of the head. The front (11) and back (12) shock-absorbing parts are connected transversely. The eye sockets are labeled (14) and (15).
A shock-absorbing device for a humanoid robot, comprises a rigid structure linked to the humanoid robot, a deformable outer shell, and a shock-absorber; the shock-absorber consisting of a flexible cellular structure comprising a set of cells emerging in a main direction, and being secured to the rigid structure at a first end in the main direction, and linked to the deformable outer shell at a second end opposite the first in the main direction. Advantageously, the outer shell is also linked directly to the rigid structure by means of at least one absorbent fixing of silent block type. The invention relates also to a humanoid robot, and in particular the head of a humanoid robot, comprising such a shock-absorbing device. [Abstract US20170043487A1]
References
Softbank Robotics