
US20020081937A1 - Electronic toy - Google Patents

Electronic toy

Info

Publication number
US20020081937A1
US20020081937A1 (application US09/985,909; US98590901A)
Authority
US
United States
Prior art keywords
electronic toy
sound
toy according
robot
movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/985,909
Inventor
Satoshi Yamada
Hirohiko Atobe
Kaoru Igarashi
Ryotaro Saji
Tetsuya Hayakawa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sega Toys Co Ltd
Original Assignee
Sega Toys Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sega Toys Co Ltd
Assigned to SEGA TOYS, LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MORIOKA, SEISUKI, OKUBO, JUN, YASUI, KEISUKE
Publication of US20020081937A1
Status: Abandoned

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H11/00: Self-movable toy figures
    • A63H3/00: Dolls
    • A63H3/36: Details; Accessories
    • A63H3/48: Mounting of parts within dolls, e.g. automatic eyes or parts for animation
    • A63H2200/00: Computerized interactive toys, e.g. dolls

Definitions

  • the present invention relates to an electronic toy (including adult “toys” and “playthings”, and domestic robots) that performs control so as to move arbitrarily in accordance with external sound and contact.
  • Stuffed animals such as dogs, cats and bears have long been widely used as animal toys. There are also animal toys in which a motor and a speaker are housed inside the body of a stuffed animal or a synthetic resin body shaped like an animal; for example, when the head portion is touched and pressed, such a toy performs prescribed movements such as moving its legs or mouth and also generates a prescribed cry.
  • an object of the present invention is to provide an electronic toy that is automatically activated when the user exists nearby.
  • Another object of the present invention is to provide an electronic toy having elements for seeking communication with the user.
  • the electronic toy (i.e., domestic robot) of the present invention is an electronic toy controlled so as to react to external information, comprising: a movement mechanism structuring the mechanical movement of the toy; an input means for obtaining external information; a distinction means for distinguishing whether an object body exists in the periphery; and a control means for selecting, among a plurality of control parameters, a control parameter for controlling the movement mechanism in correspondence with the external information based on the distinction result, and controlling the movement of the movement mechanism.
  • the electronic toy further comprises an information display means for externally displaying information
  • the control means further selects, among a plurality of control parameters, a control parameter for controlling the information display means in correspondence with the external information, and controls the operation of the information display means.
  • Obtained thereby is an electronic toy capable of reacting in correspondence with external information pursuant to the mechanism movement and visual display.
  • the electronic toy further comprises a sound generation means for externally outputting sound
  • the control means further selects, among a plurality of control parameters, a control parameter for controlling the sound generation means in correspondence with the external information, and controls the operation of the sound generation means.
  • Obtained thereby is an electronic toy capable of reacting in correspondence with external information pursuant to the mechanism movement, visual display, and sound output.
  • the electronic toy further comprises: a means for calculating the lifestyle rhythm of a specific person; and an event detection means for detecting the occurrence of an event during such lifestyle rhythm; wherein the control means further selects the control parameter of at least the movement mechanism, the information display means and the sound generation means in correspondence with the event.
  • an electronic toy which makes communication in correspondence with the lifestyle rhythm (e.g., biorhythm) of the user.
  • the electronic toy further comprises: a clock means for detecting the present time; and a detection means for detecting the occurrence of an event planned on a time base in advance; wherein the control means further selects the control parameter of at least the movement mechanism, the information display means and the sound generation means in correspondence with the event.
  • the distinction means detects peripheral sound and/or movement.
  • the distinction means detects peripheral sound and/or brightness.
  • the distinction means comprises a microphone for collecting sound and/or a camera for photographing the periphery.
  • the movement mechanism is structured in the shape of a humanoid robot, and the movement thereof is controlled so as to express at least one of a human emotion of “delight”, “anger”, “sorrow” and “pleasure”.
  • control means selects the control parameter for a predetermined solo play operation when it is judged that a person does not exist in the periphery.
  • the operation of a solo play is activated irrespective of the input made by the user, and, for example, includes the display of a solo play game on the display unit of the electronic toy.
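The parameter-selection behavior summarized above can be illustrated with a short sketch. This is a hypothetical Python example, not code from the patent; the names (ControlParameter, PARAMETERS, "solo_play" and the patterns) are invented for illustration only.

```python
# Minimal sketch (assumed names and values): selecting a control parameter set
# depending on whether a user is judged to exist in the periphery.
from dataclasses import dataclass

@dataclass
class ControlParameter:
    motor_pattern: str      # which posture/movement pattern to drive
    display_pattern: str    # what to show on the face display
    sound_pattern: str      # which voice/melody to output

PARAMETERS = {
    "greeting":  ControlParameter("wave_arms", "smile", "hello_voice"),
    "reaction":  ControlParameter("nod_head", "blink", "chirp"),
    "solo_play": ControlParameter("idle_sway", "mini_game", "humming"),
}

def select_parameter(user_nearby: bool, external_input: str | None) -> ControlParameter:
    """Pick a control parameter in correspondence with external information."""
    if not user_nearby:
        # No person in the periphery: fall back to a predetermined solo play.
        return PARAMETERS["solo_play"]
    if external_input == "voice":
        return PARAMETERS["greeting"]
    return PARAMETERS["reaction"]

print(select_parameter(user_nearby=False, external_input=None))
```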
  • the electronic toy is in the shape of a human, and the information display means is provided to a part corresponding to the face and displays expressions of the face and symbols such as text.
  • the electronic toy further comprises a storage means for recording the voice of a person. This will enable voice memos and impersonation (voice imitation).
  • the input means includes at least one of a touch sensor, microphone, light sensor, camera, ⁇ switch and condition sensor.
  • the electronic toy further comprises a means for detecting the output of a battery, which is the source of power of the movement mechanism; and wherein the control means further generates a warning with the information display means for externally displaying information or the sound generation means for externally outputting sound when the output of the battery becomes weak.
  • the electronic toy of the present invention is an electronic toy controlled so as to react to external information, comprising: a human-shaped structure; a control means for controlling the movement of the structure in correspondence with external information; a miniature camera provided to the structure and for photographing the external situation; and a communication means for externally transmitting the photographed image.
  • As the communication means, for example, infrared (IR) communication, PHS, mobile phones, wired communication, general telephone circuits and so on may be used.
  • the electronic toy of the present invention comprises: a basic frame disposed at the torso portion of the human-shaped toy; first and second sub-frames respectively provided at both sides of the basic frame and rotatably mounted on the basic frame; first and second rotational axes respectively provided to the first and second sub-frames; a cam mechanism provided to a third rotational axis driven by a first motor; a link for connecting the cam mechanism between the first and second sub-frames and oscillating both sub-frames; a gear mechanism driven by a second motor; and a transmission mechanism disposed across the basic frame and between the first and second sub-frames and for transmitting the output of the gear mechanism to the first and second rotational axes.
  • the shoulder, arm and wrist become movable, and simulated humanlike movement can be represented.
  • the transmission mechanism is structured of a gear train formed of a plurality of gears, and each of the gears on both ends are respectively disposed in the first and second sub-frames and respectively engage with the first and second rotational axes via a bevel gear.
  • the arm can be rotated simultaneously together with the rotation of the shoulder.
  • a first clutch mechanism for protecting the member from an overload is provided between the first motor and third rotational axis.
  • a second clutch mechanism for protecting the member from an overload is provided to the gear mechanism.
  • the electronic toy of the present invention is an electronic toy in the shape of a human or an animal comprising a display unit at a face-corresponding portion of the head portion capable of displaying text and symbols, and which is structured such that the information input pursuant to the operation of the input unit, which is formed of a plurality of input switches provided on the body, can be visually confirmed with the display unit provided on the face-corresponding portion.
  • the electronic toy of the present invention is an electronic toy in the shape of a human or an animal having a head portion and a torso portion and comprising a display unit at a face-corresponding portion of the head portion capable of displaying text and symbols, an input unit formed of a plurality of input switches is provided to the torso portion, and which is structured such that the operation results of the input unit can be visually confirmed with the display unit provided on the face-corresponding portion.
  • the electronic robot of the present invention is an electronic robot in the shape of a human or an animal comprising a display unit at a face-corresponding portion of the head portion capable of displaying text and symbols, and which is structured such that the information input by the operator operating the input unit provided to the body of the robot can be visually confirmed with the display unit provided on the face-corresponding portion.
  • the electronic robot of the present invention is an electronic robot in the shape of a human or an animal comprising a display unit at a face-corresponding portion of the head capable of displaying text and symbols, and which is structured such that the information input by the operator operating the input unit provided to the body of the robot is displayed on the display unit provided on the face-corresponding portion so as to form the expression of the robot.
  • an emotion parameter is included in the control parameter, and the emotion parameter is represented as the biorhythm of a specific person or the biorhythm of the robot. Expressions based on the emotions of the robot itself are thereby realized.
  • the emotion parameter is affected by the occurrence of an event. Emotions thereby change on a case-by-case basis pursuant to the situation.
  • the response to a question inquired by the electronic toy to the user is included in the event. Emotions can be changed depending on the response to the question.
  • Changes in the emotion parameter against the anticipated responses to the question are defined beforehand in the question. The influence of the respective responses to the question can thereby be made to differ.
  • control unit further selects the information to be externally displayed and/or selects the sound to be externally output based on the emotion parameter. Obtained thereby is information and sound to be externally output based on emotions.
  • control means further stores the response to the question, and forms a standard sentence by using data relating to the response. That is, the response results are used (reflected) in the control.
  • the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion; an input means provided on the body for performing input operations; a storage means for storing a plurality of words; and a control means having a function for outputting emotion parameter values representing self emotions, and which selects the words based on the emotion parameter value and displays the words on the display unit.
  • the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a vocalization means for outputting sound data as a voice; an input means provided to the body for performing input operations; storage means for storing a plurality of sound data; and a control means having a function for outputting emotion parameter values representing self emotions, and which selects the sound data based on the emotion parameter value and makes the vocalization means vocalize the sound data.
  • the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion; a vocalization means for outputting sound data as a voice; an input means provided on the body for performing input operations; a storage means for storing a plurality of words and a plurality of sound data; and a control means having a function for outputting emotion parameter values representing self emotions, and which selects the words and the sound data based on the emotion parameter value and supplies the words and the sound data to the display unit and the vocalization means, respectively.
  • the emotion parameter value temporally changes between the maximum value and minimum value.
  • control means further inquires questions with text or sound, and changes the value of the emotion parameter in accordance with the input operation in response thereto.
  • the emotion of the electronic toy will thereby change depending on the user's response to the question.
  • a plurality of questions are stored in advance, and changes in the emotion parameter are defined against the anticipated response to the respective questions. This is amusing in that the degree of change in the emotion per question will differ.
  • a plurality of questions are stored in advance, and the degree of intimacy between the electronic toy and user is defined against the anticipated response to the respective questions.
  • control means further stores the response to the question, and forms a standard sentence by using data relating to the response.
  • control unit further accumulates the degree of intimacy obtained pursuant to the respective questions and, when this exceeds a prescribed value, supplies data for expressing a specific emotion to the display unit and/or the vocalization means.
  • the electronic toy is thereby able to make affectionate expressions to its user.
  • the aforementioned questions include questions that will affect and questions that will not affect the emotion parameter.
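As a rough illustration of how predefined per-answer changes and a degree of intimacy could be handled, the following hypothetical Python sketch stores a few made-up questions with emotion and intimacy deltas and accumulates them. The question texts, delta values and the intimacy threshold are assumptions, not values from the patent.

```python
# Sketch only: each stored question carries predefined changes to the emotion
# parameter and the degree of intimacy for each anticipated answer.
QUESTIONS = [
    {"text": "Do you like me?",
     "effects": {"yes": {"emotion": +10, "intimacy": +2},
                 "no":  {"emotion": -15, "intimacy": -1}}},
    {"text": "Is it sunny today?",   # a question that does not affect the emotion
     "effects": {"yes": {"emotion": 0, "intimacy": 0},
                 "no":  {"emotion": 0, "intimacy": 0}}},
]

class EmotionState:
    def __init__(self) -> None:
        self.emotion = 0         # bounded emotion parameter
        self.intimacy = 0        # accumulated degree of intimacy
        self.answers = []        # stored responses, reusable in standard sentences

    def apply_answer(self, question: dict, answer: str) -> None:
        effect = question["effects"][answer]
        self.emotion = max(-100, min(100, self.emotion + effect["emotion"]))
        self.intimacy += effect["intimacy"]
        self.answers.append((question["text"], answer))
        if self.intimacy > 20:   # arbitrary threshold for an affectionate reaction
            print("display/voice: affectionate expression")

state = EmotionState()
state.apply_answer(QUESTIONS[0], "yes")
print(state.emotion, state.intimacy)
```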
  • a plurality of zones are defined in advance between the maximum value and minimum value of the emotion parameter and the words and the voice data are distributed in the respective zones, and wherein the control means selects words and sound data of the corresponding zone depending on to which zone the current emotion parameter value belongs.
  • in a special zone, the control means further selects control for performing a special movement accompanying the mechanical movement of the components structuring the human shape or animal shape.
  • the overall movement will have a large impact on the user.
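The zone-based selection described above might look like the following hypothetical sketch, in which the range of the emotion parameter is divided into assumed zones with made-up word lists; the boundaries and words are illustrative, not taken from the patent.

```python
# Sketch with assumed values: zones between the minimum and maximum of the
# emotion parameter, each holding candidate words; a "special" zone could also
# trigger a special mechanical movement.
ZONES = [
    (-100, -60, "special_low",  ["leave me alone..."]),
    (-60,  -20, "unpleasant",   ["hmph.", "not now."]),
    (-20,   20, "normal",       ["hello!", "nice day."]),
    (20,    60, "pleasant",     ["lets dance!"]),
    (60,   101, "special_high", ["best friends!"]),
]

def words_for(emotion_value: int) -> tuple[str, list[str]]:
    for low, high, name, words in ZONES:
        if low <= emotion_value < high:
            return name, words
    return "normal", ["..."]

print(words_for(42))   # -> ('pleasant', ['lets dance!'])  (example values)
```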
  • control means further comprises an exhibition mode for changing the emotion parameter value between the maximum value and minimum value thereof in short cycles.
  • the characteristics of the electronic toy in the show window can thereby be introduced in a short period of time.
  • the electronic toy further comprises a connection means for connecting the electronic toy to a network, and wherein at least one of the words and sound data is downloaded to the storage means from a server device connected to the network. Data of words and sound as well as control data can thereby be updated.
  • the downloaded words and sound data are current affair terms. This is amusing in that the electronic toy will be contemporary.
  • downloaded words and sound data are terms corresponding to the characteristics of the user. Words befitting the user can thereby be selected.
  • the electronic toy further comprises a connection means for connecting two electronic toys, and wherein at least one of the words and sound data stored in the connected electronic toy of the opponent is received by the storage means. Data exchange between toys is thereby possible.
  • connection means includes at least one of a communication cable, PHS, mobile telephone and personal computer.
  • text data is exchanged between the electronic toys, and simulated conversation is conducted by incorporating the exchanged data in standard sentences. It is thereby possible to make it look like the electronic toys are having a conversation.
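A minimal sketch of the simulated conversation idea follows: a word received from the connected partner toy is slotted into a stored standard sentence. The templates and function names are hypothetical.

```python
# Hypothetical sketch: incorporate an exchanged word into standard sentences so
# the two toys appear to hold a conversation.
STANDARD_SENTENCES = [
    "I heard about {word} today.",
    "Do you also like {word}?",
]

def compose_reply(received_word: str, turn: int) -> str:
    template = STANDARD_SENTENCES[turn % len(STANDARD_SENTENCES)]
    return template.format(word=received_word)

print(compose_reply("soccer", 0))   # "I heard about soccer today."
```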
  • the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a sound detection means for detecting peripheral sound; a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion; a storage means for storing a plurality of expressions; and a control means having a function for outputting emotion parameter values representing self emotions, and which selects the expression based on the emotion parameter value and displays the expression on the display unit, and sets the emotion parameter to an unpleasant state when the sound exceeds a prescribed level and continues beyond a prescribed time.
  • the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion; a storage means for storing a plurality of expressions; an input means provided on the body for performing input operations; and a control means having a function for outputting emotion parameter values representing self emotions, and which selects the expression based on the emotion parameter value and displays the expression on the display unit, and selects an expression corresponding to the emotion parameter when the input means is continuously operated for a prescribed time or a prescribed number of times.
  • an expression of anger is displayed on the display unit during the unpleasant state.
  • the expression selected in correspondence with the continuous operation is a painful expression upon being pounded, or a pleasant expression upon being patted.
  • the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion; a storage means for storing a plurality of expressions; a light sensor for detecting the peripheral brightness; and a control means for selecting an expression corresponding to the self emotion and selecting the expression of closing the eyes when the light sensor detects a dark state beyond a prescribed time.
  • the state of falling asleep can be represented.
  • control means further moves the mechanical components structuring the human shape or animal shape so as to express reluctance to go to sleep.
  • the initial value of the function for outputting the emotion parameter value expressing the emotions is set randomly.
  • the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion; movably structured mechanical components structuring a human shape or an animal shape; and a control unit for distinguishing the message and control information from a file attached to an e-mail, displaying the message on the display unit, and moving the mechanical components in correspondence with the control information.
  • the attachment file is a sound file.
  • the sound file is reproduced as a sound signal by a computer, and the sound signal is supplied to the control unit.
  • control information designates the movement stored beforehand in the control unit.
  • control information designates to the control unit a series of control procedures of the mechanical components.
  • control unit selects adequate movement of the mechanical components.
  • control information expresses emotions such as delight, anger, sorrow and pleasure of the robot.
  • the e-mail method of the present invention comprises: a step of converting the input message to be displayed on the electronic toy of the receiving side and the movement to be made by the electronic toy into a sound signal; a step of converting the sound signal into a sound file and making this an attachment file of an e-mail; a step of transmitting the e-mail with the sound attachment file from the terminal device of the transmitting side to the terminal device of the receiving side; a step of forwarding the reproduced sound signal from the terminal device of the receiving side to the electronic toy; and a step of making the electronic toy display the message and perform the movement.
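As a rough sketch of the "action mail" flow, the example below pairs a message to display with control information naming one of the movements stored in the toy. It is illustrative only: the actual conversion to and from a sound file attachment is not shown, and a JSON payload merely stands in for the encoded data.

```python
# Conceptual sketch only (assumed encoding): an action-mail payload carrying a
# message and control information for one of the stored movements.
import json

MOVEMENTS = {"delight", "anger", "sorrow", "pleasure"}

def make_action_mail(message: str, movement: str) -> bytes:
    """Sender side: build the payload that would be converted into a sound file
    and attached to an e-mail."""
    assert movement in MOVEMENTS
    return json.dumps({"message": message, "movement": movement}).encode()

def handle_action_mail(payload: bytes) -> None:
    """Toy side: after the terminal device reproduces and forwards the
    attachment, display the message and perform the designated movement."""
    data = json.loads(payload.decode())
    print("display:", data["message"])
    print("perform movement:", data["movement"])

handle_action_mail(make_action_mail("Happy birthday!", "delight"))
```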
  • the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a leg structure structuring a pair of movable legs in the shape of a human or an animal; and a control unit for controlling the movement of the legs in correspondence with sound to be output.
  • control unit sets the speed of movement of the legs in correspondence with the volume of the sound and rhythm.
  • the movement of the pair of legs is a movement of opening/closing the legs in the horizontal direction.
  • a slide prevention means is provided to the sole of one of the legs, and sliding means is provided to the sole of the other of the legs.
  • the leg structure comprises: a waist portion frame provided with a pair of hip joint portions rotatable at least in one direction; a pair of leg portions respectively connected to the pair of hip joint portions; a pair of drive shafts in which one end portion thereof is mounted on the leg portion and the other end portion thereof extends inside the waist portion frame beyond the hip joint portion of the leg portion; a link member for mutually connecting the respective other end portions of the respective drive shafts; a cam mechanism interjacent between the other end portion of at least one of the drive shafts and the link member, and for changing the respective one end portions of the drive shafts to become wide or narrow; and a motor built in one of the leg portions and for rotatably driving one of drive shafts.
  • the other end portion of the drive shaft and the link member, or the cam and the link member are connected via a spherical engagement member.
  • the electronic toy further comprises a sliding means provided on one end portion of the other drive shaft among the pair of drive shafts so as to slide on the ground surface or floor surface.
  • the other of the leg portions comprises an above-knee portion connected rotatably in the cross direction to the hip joint portion, a below-knee portion connected rotatably in the cross direction with the above-knee portion, and a ground portion connected rotatably in the horizontal direction with one end portion of the other drive shaft among the pair of drive shafts; and wherein the electronic toy has a structure in which a protrusion is formed at the lower face of the below-knee portion, an inclined face to which the protrusion contacts is formed on the upper face of the ground portion, the protrusion is pushed up pursuant to the opening/closing movement of the leg portions, and the connection of the above-knee portion and the below-knee portion bends thereby.
  • the sliding means is a roller.
  • the degree of opening/closing of the pair of legs is adjustable pursuant to the position in which the engagement member is mounted on the cam mechanism.
  • the electronic toy of the present invention is an electronic toy comprising a walking mechanism for performing bipedal locomotion by moving both legs back and forth, wherein the movement mechanism of one leg comprises: a waist portion frame; an above-knee portion connected rotatably to the waist portion frame; a below-knee portion connected rotatably to the above-knee portion; a ground portion connected rotatably to the below-knee portion; a rotatably driven cam pulley provided to the waist portion frame; a first cam provided to the cam pulley; a second cam provided to the cam pulley; a long member for vertically oscillating the ground portion with the first cam; and a short member for oscillating the below-knee portion with the second cam in the cross direction.
  • the electronic toy of the present invention is an electronic toy comprising a walking mechanism for performing bipedal locomotion by moving both legs back and forth, wherein the movement mechanism of one leg comprises: a waist portion frame; an above-knee portion connected rotatably to the waist portion frame; a below-knee portion connected rotatably to the above-knee portion; a ground portion connected rotatably to the below-knee portion; a rotatably driven cam provided to the waist portion frame; a long member for vertically oscillating the ground portion with the cam; and a short member for oscillating the below-knee portion with the cam in the cross direction.
  • the long member comprises a guide hole to be engaged with a guide member and a pushdown plate in contact with the cam.
  • the pushdown plate pushes down the ground portion and sets the upper limit of the long member. It is thereby possible to prevent a larger inclination and excess lifting of the ground portion.
  • the electronic toy further comprises an energization means for energizing the tip of the ground portion in the pushdown direction.
  • the size of the electronic toy is approximately 30 cm.
  • an oblique direction drive means is provided to the ground portion for driving the electronic toy in an oblique direction against the advancing direction by the bipedal locomotion mechanism.
  • the oblique direction drive means is structured by comprising a rotatably driven drive roller or drive belt.
  • a plurality of drive rollers or drive belts are provided.
  • the oblique direction drive means is respectively provided to the respective ground portions of both legs, and the respective drive directions of each of the oblique direction drive means exist on the circumference of an approximate curvature.
  • the present invention is an electronic toy comprising a walking mechanism for performing bipedal locomotion by moving both legs back and forth, and is provided with an oblique direction drive mechanism for driving the electronic toy in an oblique direction against the advancing direction by the bipedal locomotion mechanism.
  • the oblique direction drive mechanism is structured by comprising a rotatably driven drive roller or drive belt.
  • drive rollers or drive belts are provided.
  • the oblique direction drive mechanism is respectively provided to the respective sole portions of both feet, and the respective drive directions of each of the oblique direction drive means exist on the circumference of an approximate curvature.
  • the oblique direction drive mechanism is provided at the toe side of the sole portion of the feet, and a sliding roller is provided at the heel side of the sole portion of the feet.
  • FIG. 1 is a front view for explaining the robot as the electronic toy (domestic robot);
  • FIG. 2 is a rear view for explaining the robot as the electronic toy
  • FIG. 3 is a top view for explaining the robot as the electronic toy
  • FIG. 4 is a side view for explaining the robot as the electronic toy
  • FIG. 5 is an explanatory diagram for explaining the mechanism enabling the rotation of the arm, shoulder, neck, and so on of the robot;
  • FIG. 6 is a perspective view for explaining the aforementioned mechanism
  • FIG. 7 is an explanatory diagram showing the mechanism enabling the rotation of the neck portion and rotation of the shoulder portion of the robot;
  • FIG. 8 is an explanatory diagram showing the mechanism enabling the rotation of the arm portion of the robot
  • FIG. 10 is a block diagram explaining the schematic structure of the control unit 60 ;
  • FIG. 11 is a flowchart explaining an example of inputting the “date of birth” into the robot for calculating the biorhythm
  • FIG. 12 is a flowchart explaining an example of enabling the determination of the existence of a user by collecting ambient sound
  • FIG. 13 is a flowchart explaining an example of recognizing the voice (order, etc.) of the user and the robot making movements corresponding thereto;
  • FIG. 14 is a flowchart explaining an example of detecting the movement of the object
  • FIG. 15 is a flowchart explaining an example of distinguishing the existence of the user based on the switch operation, movement of the object, and existence of sound;
  • FIG. 16 is a flowchart explaining an example for distinguishing the existence of the user based on the switch operation, peripheral brightness, and existence of sound;
  • FIG. 17 is a flowchart explaining an example of the control movement in consideration of the biorhythm
  • FIG. 18 is an explanatory diagram explaining the biorhythm
  • FIG. 19 is an explanatory diagram explaining an example of the facial eye expressions and the text (symbol) scroll displayed on the display screen;
  • FIG. 20 is a flowchart explaining an example of movement control of the robot pursuant to the lapse in time
  • FIG. 21 is a flowchart explaining the execution of a control program pursuant to the control unit (CPU);
  • FIG. 22 is an explanatory diagram explaining an example of the posture expressing a feeling of “delight” of the robot.
  • FIG. 23 is an explanatory diagram explaining an example of the posture expressing a feeling of “pleasure” of the robot
  • FIG. 24 is an explanatory diagram explaining an example of the posture expressing a feeling of “sorrow” of the robot
  • FIG. 25 is an explanatory diagram explaining an example of the posture expressing a feeling of “affection” of the robot.
  • FIG. 26 is a front view explaining an example of another robot as the electronic toy
  • FIG. 27 is a side view explaining an example of another robot as the electronic toy
  • FIG. 28( a ) to FIG. 28( d ) are explanatory diagrams explaining an example of the various expressions corresponding to the delight, anger, sorrow and pleasure of the robot;
  • FIG. 29( a ) and FIG. 29( b ) are explanatory diagrams explaining an example of the various expressions corresponding to the emotions of the robot;
  • FIG. 30 is an explanatory diagram showing another example of a different biorhythm (biorhythm of robot).
  • FIG. 31 is a flowchart explaining the operation of an example of displaying words influenced by the current feelings on the display screen of the robot
  • FIG. 32 is an explanatory diagram explaining an example of displaying words (anger mode of biorhythm) on the display screen of the robot;
  • FIG. 33 is an explanatory diagram explaining an example of displaying words (normal emotion mode) on the display screen of the robot;
  • FIG. 34 is an explanatory diagram explaining an example of displaying words (haiku style) on the display screen of the robot;
  • FIG. 35 is a flowchart explaining an example of the feeling of the robot changing pursuant to the response to the question asked by the robot;
  • FIG. 36 is an explanatory diagram showing an example of a question in which the response thereof will influence the biorhythm
  • FIG. 37 is an explanatory diagram showing an example of a question in which the response thereof will not influence the biorhythm
  • FIG. 38 is an explanatory diagram explaining an example of questions that will influence the biorhythm (feelings) of the robot;
  • FIG. 39 is an explanatory diagram explaining an example of questions that will influence the biorhythm (feelings) of the robot;
  • FIG. 40 is an explanatory diagram explaining an example of the feelings (biorhythm) becoming worse pursuant to the response to the result of the question;
  • FIG. 41 is an explanatory diagram explaining an example of two robots being connected via a cable for exchanging data in order to communicate with each other;
  • FIG. 42 is an explanatory diagram showing an example where a robot is connected to a PHS or mobile phone in order to acquire data by communicating with another robot or a server device so as to make conversation or movement;
  • FIG. 43 is a block diagram explaining an example of connecting the communication interfaces of robots via a cable in order to conduct communication;
  • FIG. 44 is a block diagram explaining an example of communicating by using a terminal device connectable to a communication network such as a PHS or mobile phone;
  • FIG. 45 is a block diagram explaining an example of enabling communication between robots by using the Internet.
  • FIG. 46 is an explanatory diagram explaining an example of downloading data from a server device to the robot.
  • FIG. 47 is a communication diagram explaining a procedural example upon conducting data communication via a connection cable
  • FIG. 48 is a communication diagram explaining a procedural example upon conducting data communication with a mobile phone or PHS;
  • FIG. 49 is a communication diagram explaining a procedural example upon conducting data communication when obtaining data from the server device
  • FIG. 50 is an explanatory diagram explaining an example of “current affairs” and user adaptive data provided by the server device;
  • FIG. 51 is a block diagram explaining the operation of an action mail
  • FIG. 52 is an explanatory diagram explaining the contents (format) of the action mail
  • FIG. 53 is an explanatory diagram explaining an example of the robot making a movement of “delight” upon receiving the action mail;
  • FIG. 54 is an explanatory diagram explaining an example of the robot making a movement of “anger” upon receiving the action mail;
  • FIG. 55 is an explanatory diagram explaining an example of the robot making a movement of “sorrow” upon receiving the action mail;
  • FIG. 56 is an explanatory diagram explaining an example of the robot making a movement of “pleasure” upon receiving the action mail;
  • FIG. 57 is a perspective view explaining the first posture (legs closed) of the dance robot.
  • FIG. 58 is a perspective view explaining the second posture (legs opened) of the dance robot.
  • FIG. 59 is a perspective view explaining the open/close mechanism of the legs (posture of legs closed);
  • FIG. 60 is a perspective view explaining the open/close mechanism of the legs (posture of legs opened);
  • FIG. 61 is a perspective view explaining a structural example of the right leg
  • FIG. 62 is an explanatory diagram explaining the operation of bending the knee of the right leg
  • FIG. 63 is a perspective view explaining a structural example of the left leg
  • FIG. 64 is an explanatory diagram explaining the adjustment of synchronization of the left and right legs by a cam
  • FIG. 65 is a block diagram explaining the control system of the dance robot.
  • FIG. 66 is a block diagram explaining the control system of another dance robot.
  • FIG. 67 is a perspective view explaining the bipedal robot
  • FIG. 68 is a perspective view explaining the bipedal robot
  • FIG. 69 is a perspective view explaining the bipedal robot
  • FIG. 70 is an explanatory diagram explaining the bipedal robot mechanism
  • FIG. 71 is an explanatory diagram explaining the waist portion frame
  • FIG. 72 is an explanatory diagram explaining the upper knee portion
  • FIG. 73 is an explanatory diagram explaining the lower knee portion
  • FIG. 74 is an explanatory diagram explaining the ground portion
  • FIG. 75 is an explanatory diagram explaining the cam pulley, long rod and short rod;
  • FIG. 76( 1 ) to FIG. 76( 4 ) are explanatory diagrams explaining the movement of the bipedal mechanism corresponding to the rotation of the camshaft;
  • FIG. 77( 5 ) to FIG. 77( 8 ) are explanatory diagrams explaining the movement of the bipedal mechanism corresponding to the rotation of the camshaft;
  • FIG. 78 is an explanatory diagram explaining the movement of the leg of the robot.
  • FIG. 79 is an explanatory diagram explaining another bipedal mechanism
  • FIG. 80 is an explanatory diagram explaining the cam pulley, long rod, short rod and spring of an example of another bipedal mechanism
  • FIG. 81 is an explanatory diagram explaining the turnabout mechanism of the robot.
  • FIG. 82 is an explanatory diagram explaining the turnabout mechanism of the robot.
  • FIG. 83 is an explanatory diagram explaining the turnabout mechanism of another robot.
  • FIG. 1 through FIG. 4 show examples of a humanoid robot (pet robot) as the electronic toy (domestic robot), and the diagrams respectively illustrate the front view, back view, top view and side view of the robot.
  • the robot 1 is structured by comprising a head portion 10 , a torso portion 20 , left and right arm portions 30 , and left and right leg portions 40 .
  • the head portion 10 and the torso portion 20 are rotatably connected via a neck joint K 6 .
  • the torso portion 20 and the arm portion 30 are rotatably connected via a shoulder joint K 1 .
  • An elbow joint K 2 and a wrist joint K 3 are provided to the arm portion 30 so as to realize the free bending of the arm portion 30 .
  • the torso portion 20 and the leg portion 40 are rotatably connected via a hip joint K 4 .
  • a knee joint K 5 is provided to the leg portion 40 .
  • an after-mentioned microcomputer system for controlling the robot, a window display unit for enabling communication between the user and robot, a sound sensor for collecting sound, a light sensor (or camera) for acquiring peripheral information, a touch sensor, a speaker for generating the sound of the robot, and so on.
  • an after-mentioned waggle mechanism for rotating the internal frame (not shown) of the head portion 10 and a nodding mechanism (not shown) for moving the head portion 10 back and forth.
  • the neck joint K 6 corresponds to the above.
  • the torso portion 20 comprises a motor as the source of power, an arm opening/closing mechanism for rotating the left and right arm portions 30 around the Z axis (vertical direction in FIG. 1) of the shoulder joint K 1 , an arm rotating mechanism for rotating the left and right arm portions 30 around the X axis (horizontal direction in FIG. 1) of the shoulder joint K 1 , and a neck rotating mechanism for rotating the head portion 10 around the Z axis.
  • “ ⁇ ” and “ ⁇ ” switches 54 as the detection switch are provided to the torso portion 20 .
  • a battery as the power source for activating the aforementioned motor and microcomputer system or the like is disposed inside the left and right leg portions 40 .
  • the battery may also be disposed in the torso portion 20 or the arm portion 30 . When disposing the battery in the torso portion 20 , bending of the knee joint K 5 becomes possible.
  • the bending of arms and legs can be realized by disposing an actuator such as an electromagnet or micro motor inside the respective arm portions or respective leg portions, thereby enabling more humanlike movements.
  • the aforementioned electronic robot is in the shape of a human, this electronic robot may also be in the shape of an animal.
  • the display unit 71 capable of displaying text and symbols on the face-corresponding portion of the head portion is structured such that the information, which is input by the operator who operates the input unit formed of a plurality of input switches 51 , 54 and so on provided to the body of the aforementioned robot, can be visually confirmed with the display unit 71 provided on the face-corresponding portion described above.
  • FIG. 5 through FIG. 8 are diagrams for explaining the mechanical structure built in the torso portion 20 .
  • FIG. 5 is a front view of the mechanical structure
  • FIG. 6 is the perspective view thereof.
  • FIG. 7 is an explanatory diagram illustrating the components corresponding to the aforementioned arm opening/closing mechanism and neck rotating mechanism in the mechanical structure.
  • FIG. 8 is an explanatory diagram illustrating the components corresponding to the aforementioned arm rotating mechanism in the mechanical structure.
  • the mechanical structure 200 is structured by comprising a basic frame 201 , a sub frame 202 , a neck (head portion) rotating mechanism 210 (c.f. FIG. 7), an arm (or shoulder) opening/closing mechanism 220 (c.f. FIG. 7), an arm rotating mechanism 230 (c.f. FIG. 8), a neck (head portion) rotational axis 203 , an arm rotational axis 204 , a first motor 205 , a second motor 206 , a motor mounting plate 207 for fixing the respective motors to the frame 201 , and so on.
  • the sub frame 202 is formed in an approximate horseshoe shape, and is respectively provided to both left and right sides of the basic frame 201 so as to be freely rotatable around the Z axis against the frame 201 .
  • a bevel gear mechanism for changing the transmission direction of the power is provided inside the sub frame, and power is thereby transmitted to the arm rotational axis 204 even when the sub frame 202 rotates around the Z axis.
  • the neck rotating mechanism 210 and the arm opening/closing mechanism 220 are driven with the first motor 205 .
  • the rotational axis of the motor 205 is connected to a worm gear mechanism 211 for changing the power transmission direction and converting the torque, and rotates the head portion rotational axis 203 via a spring clutch mechanism 212 as the safety device.
  • a frame (not shown) of the head portion 10 is connected to the upper end of the head portion rotational axis 203 and rotates the head portion 10 around the Z axis.
  • a worm gear mechanism may be provided at the upper end of the head portion rotational axis 203 in order to realize a movement in which the head portion nods back and forth by obtaining rotation around the X axis.
  • An arm opening/closing mechanism 220 is connected to the lower end of the head portion rotational axis 203 .
  • the spring clutch mechanism 212 prevents the breakage of components by sliding when there is an overload in the head portion rotational axis 203 or the sub frame (arm opening/closing mechanism).
  • a cam mechanism 221 is provided to the lower end of the head portion rotational axis 203 .
  • the cam mechanism 221 is structured by comprising a plate 222 fixed to the axis 203 , two arm mounting pins 223 provided to the plate 222 , pins 224 respectively mounted on the two sub frames 202 , and two links 225 for rotatably connecting one of the arm mounting pins 223 and the pin 224 of one of the sub frames, and the other arm mounting pin 223 and the pin 224 of the other sub frame 202 , respectively.
  • Each of the sub frames 202 is rotatably retained with the basic frame 201 with the pin 226 .
  • the head portion rotational axis 203 rotates in correspondence with the forward/reverse rotational direction of the motor 205 , thereby rotating the head portion 10 .
  • the plate 222 rotates pursuant thereto, thereby moving the links 225 and moving the sub frames 202 around the Z axis.
  • This enables the movement of opening and closing the arm portion 30 (e.g., movement of hugging).
  • the motor 205 is controlled by the microcomputer.
  • the rotational amount of the axis 203 , and hence the movement posture, is grasped, for example, by reading the codes of a sensor disk (not shown) provided to the tip portion of the axis 203 , or with a combination of a cam and switch (not shown) provided to the tip portion of the axis 203 .
  • the arm rotating mechanism 230 is driven with the second motor 206 .
  • the pinion gear mounted on the rotational axis of the motor 206 drives a gear mechanism 231 formed of a plurality of gears.
  • This gear mechanism 231 further drives the gear train 232 , which propagates the driving force in the longitudinal (horizontal) direction at the upper part inside the basic frame 201 .
  • a clutch mechanism 233 as the protection mechanism for preventing the breakage of components due to an overload is provided between the gear mechanism 231 and the gear train 232 .
  • the clutch mechanism 233 , for example, generates a sliding movement on the rubber surface of a rubber friction plate sandwiched between the gears in the case of an overload.
  • the likes of the aforementioned spring system or a flexible concave/convex plate may also be combined therewith.
  • the gear train 232 , for example, is structured of six gears, and the gears on both ends are provided within the sub frames 202 . These gears on both ends engage with the bevel gear fixed to one end of the arm rotational axis 204 retained rotatably by the sub frame 202 .
  • An arm portion 30 (not shown) is mounted to the other end side of the arm rotational axis 204 via a collar 234 fixed to the axis 204 . Therefore, the driving force of the motor 206 rotates the arm rotational axis 204 via the gear mechanism 231 , clutch mechanism 233 and gear train 232 , and rotates the arm portion 30 mounted on this rotational axis 204 .
  • a sensor is provided to an appropriate position, to the collar 234 for example, for detecting the rotational position of this arm and controlling the motor.
  • FIG. 9 is a block diagram for explaining the control system of a robot as the electronic toy.
  • the robot comprises, as means for detecting the peripheral situation and inputs, a touch sensor 51 , a microphone (sound sensor) 52 , a light sensor (e.g., CCD camera) 53 , “ ⁇ ” and “ ⁇ ” switches 54 for generating output corresponding to the operation of the “ ⁇ ” button and the “ ⁇ ” button, a status (posture) sensor 55 and a battery voltage detection sensor 56 .
  • the touch sensor 51 for example, is provided to the upper surface of the head portion 10 of the robot (c.f. FIG. 3) and is capable of detecting the patting (contact) on the head by the user.
  • the touch sensor for instance, is a micro switch or a capacitance-detecting contact detection switch.
  • the status sensor 55 detects the posture of the robot. Outputs from these respective sensors are supplied to the control unit 60 . Based on these inputs, the control unit 60 controls the motors 205 and 206 , the window display unit 71 of the head portion, the speaker 72 , and the joint actuator group 73 . When performing a simpler posture control in which detailed movements of the arms and legs of the robot are not made, the joint actuator group 73 may be omitted. Further, by internally providing a USB terminal or an infrared interface, it is possible to incorporate a function of forwarding the image read by the light sensor to a personal computer, PHS or mobile phone.
  • a function may be provided where additional name data is prepared in advance in the homepage server or in accordance with the request from the user, such that the user may connect his/her PC, PHS or mobile phone and the like to the USB terminal or infrared interface of the robot in order to download and use the desired name information from such homepage.
  • the USB terminal or infrared interface may be disposed in the back of the head portion where the CPU is built in.
  • the control unit 60 comprises a CPU 61 as the central processing unit, a ROM 62 (storage means), a RAM 63 and a timer (time and calendar function).
  • Stored in the ROM 62 are a movement control program for driving and controlling the display unit 71 , speaker 72 , motors 205 and 206 , and actuator group 73 ; posture control data for switching a plurality of movement postures by controlling the rotational direction and rotational amount of the motors 205 and 206 (and actuator group 73 ) in accordance with the posture of the robot to be set; sound control data for generating voices and melodies to be output from the speaker 72 ; display control data for making the display unit 71 display information to be displayed on the robot; program data for calculating the biorhythm of the user; a sound/image processing program for judging the peripheral situation, for example, the existence of the user, based on the sound input or image input of the CCD camera; a communication program (not shown) for externally conducting data communication via a PHS, a mobile phone or the like; and so on.
  • the sound/image program includes a sound processing program for performing the likes of filter processing, discrimination processing and modulation processing of input sound, and an image processing program for detecting the peripheral brightness and detecting the movement of the subject.
  • the movement control program includes the likes of a movement selection program for selecting the movement pattern and display pattern corresponding to the situation among a plurality of movement patterns based on the judgment result of the peripheral situation pursuant to sound and/or images, and a posture control program for performing control so as to move the head portion 10 , arm portion 30 , joints and the like in the selected movement pattern.
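The overall flow just described, from sensor judgment to pattern selection and actuation, can be sketched as a simple control loop. This is an illustrative Python skeleton with invented function names and thresholds, not the actual firmware of the robot.

```python
# Skeleton of the control flow (assumed names): read sensors, judge the
# peripheral situation, select movement/display/sound patterns, drive actuators.
import time

def read_sensors() -> dict:
    # stand-ins for touch sensor 51, microphone 52, light sensor 53, switches 54
    return {"touch": False, "sound_level": 0.0, "brightness": 0.5, "switch": None}

def judge_situation(sensors: dict) -> dict:
    # crude presence judgment from touch and sound (threshold is assumed)
    return {"user_nearby": sensors["touch"] or sensors["sound_level"] > 0.3}

def select_patterns(situation: dict) -> dict:
    # pick patterns in correspondence with the judged situation
    if situation["user_nearby"]:
        return {"motion": "greet", "display": "smile", "sound": "hello"}
    return {"motion": "idle", "display": "sleepy", "sound": None}

def drive_actuators(patterns: dict) -> None:
    # on real hardware this would drive motors 205/206, display unit 71, speaker 72
    print(patterns)

for _ in range(3):                       # a few iterations of the control cycle
    drive_actuators(select_patterns(judge_situation(read_sensors())))
    time.sleep(0.1)
```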
  • Stored in the RAM 63 are the output data of the microphone 52 and the output data of the light sensor (camera) 53 , pursuant to the DMA operation via the interface of the microcomputer (not shown).
  • the sound signal output by the microphone 52 is A/D converted with the interface, low-pass filter processed so as to eliminate noise and only extract the voice range of a person, and retained as sound data by the RAM 63 .
  • Sound data is subjected to the sound processing program. This data is stored for a fixed period of time and subjected to sound recognition processing.
  • the method of sound recognition may either be recognition of general speakers or recognition of a specific speaker.
  • a command corresponding to the words communicated by the voice of the user is output.
  • this command is communicated to the movement control program to enable the corresponding movement control, thereby allowing the robot to make movements, displays and vocalizations corresponding to the voice.
  • the sound processing program, which includes storage processing for storing the sound in the memory 63 , may also be used as a so-called voice memo for storing the voice of the user. Further, impersonation (voice imitation) is also possible by performing tone and pitch conversion processing on the stored sound data and forwarding it to the speaker 72 for vocalization.
  • the output signal corresponding to one frame output from the CCD camera as the light sensor is converted into image data with the interface, and retained in the image storage region of the RAM 63 .
  • Image data is subjected to the image processing program. For example, in the standby state, the image is periodically sampled, and changes in the image (movement of the subject) are read based on the difference between the image data of the previous frame and the image data of the present frame. The existence of the user is distinguished (or presumed) from the movement of the subject of the camera. Moreover, there is no need to compare each and every frame, and image data in a plurality of sections within the frame may be compared.
  • the peripheral brightness of the robot can be distinguished with the average value of the image data (luminance).
  • Upon distinguishing the peripheral brightness, a CCD camera is not essential, and a light detection element such as an SPD or phototransistor may also be used.
  • the existence of the user may be distinguished by recognizing that it is bright during night hours by combining the time and brightness. It is also possible to distinguish the existence of the user by distinguishing the existence of voices (or lifestyle sound) and the brightness in the room. The existence or non-existence of the user is displayed in the flag area of the RAM 63 .
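A hedged sketch of this presence heuristic follows: brightness during night hours, or the combination of room brightness and detected voices or lifestyle sound, suggests that the user is in the room. The thresholds and hour range are assumptions for illustration.

```python
# Sketch with assumed thresholds: infer user presence from brightness, sound
# and the time of day.
from datetime import datetime

def user_probably_present(brightness: float, sound_detected: bool,
                          now: datetime | None = None) -> bool:
    now = now or datetime.now()
    night = now.hour >= 22 or now.hour < 6
    if night and brightness > 0.5:        # lights on late at night
        return True
    return brightness > 0.5 and sound_detected

print(user_probably_present(0.8, True))
```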
  • the robot as the electronic toy of the present invention moves in conformity with the biorhythm, which is a parameter for representing the physical condition (behavior) of the user, and moves so as to exhibit a so-called healing atmosphere.
  • FIG. 11 is a flowchart explaining the input processing for acquiring the birthday necessary for calculating the biorhythm of the user.
  • When the user simultaneously presses both the “ ⁇ ” and “ ⁇ ” buttons 54 provided to the torso portion 20 , the robot enters a mode selection state (not shown). In this state, various modes are sequentially displayed at prescribed time intervals on the display unit 71 . Included in the modes are “calendar date setting”, “clock time setting”, “user name input”, “user birthday input”, “user gender input”, “voice memo input”, “voice sample input”, “external (mobile phone) forwarding setting”, “energy saving setting” and so on.
  • When the “ ⁇ ” button is pressed while “user birthday input” is displayed on the screen, the birthday input program is activated and the processing proceeds to the present routine.
  • The control unit (CPU) 60 displays “Please input your birth date” and “Please input in the order of year, month and day” on the liquid crystal panel or LED matrix of the display unit 71 .
  • The letter string is displayed so as to move (scroll display) in the horizontal or vertical direction of the screen (S 22 ).
  • The last two digits of the calendar year, from “40” to “00” (the current year), corresponding to the range of age of the target users, are sequentially displayed on the display unit 71 at prescribed time intervals (S 24 ).
  • the user presses the ⁇ button to select such year.
  • the operation of the ⁇ button or the ⁇ button is distinguished by the setting of the corresponding flag within the RAM 63 .
  • the control unit 60 distinguishes whether a selection has been made or not (S 26 ). When a selection is not made even upon a prescribed time elapsing (S 26 ; No), the displayed year is repeatedly increased in increments of “1” (S 24 and S 26 ). When selected (S 26 ; Yes), the selected “year” is retained. Moreover, after having pressed the ⁇ button, the user may cancel such input by pressing the ⁇ button if it is within a prescribed time.
  • the routine proceeds to the input of “month”.
  • the control unit 60 after having displayed “Please input the month”, sequentially displays “1” to “12” on the display unit 71 in prescribed time intervals (S 28 ).
  • the user presses the ⁇ button to select such month.
  • the control unit 60 distinguishes whether a selection has been made or not (S 30 ).
  • When a selection is not made even upon a prescribed time elapsing (S 30 ; No), the displayed month is repeatedly increased in increments of “1” (S 28 and S 30 ). When selected (S 30 ; Yes), the selected “month” is retained.
  • the routine proceeds to the input of “day”.
  • the control unit 60 after having displayed “Please input the day”, sequentially displays “1” to “31” on the display unit 71 in prescribed time intervals (S 32 ).
  • the user presses the ⁇ button to select such day.
  • The control unit 60 distinguishes whether a selection has been made or not (S 34). When a selection is not made even upon a prescribed time elapsing (S 34; No), the displayed day is repeatedly increased in increments of “1” (S 32 and S 34). When selected (S 34; Yes), the selected “day” is retained.
  • the control unit 60 When the input of “year”, “month” and “day” is completed, the control unit 60 writes the user's “year”, “month” and “day” in the user biorhythm data area of the ROM 62 .
  • the user's biorhythm calculation is thereby possible.
  • it is also possible to set the biorhythm of the robot such that the robot will exert itself under its own biorhythm.
  • FIG. 12 illustrates an example of the aforementioned sound processing (sound volume detection) of the control unit 60 .
  • the control unit (CPU) 60 performs computing processing equivalent to a low-pass filter for eliminating a high frequency noise component from the sound data stored in the RAM 63 (S 42 ).
  • The average value is sought by averaging the amplitude level of the sound data over a prescribed time frame of the processed sound data (S 44).
  • the control unit 60 stores this average value (S 46 ).
  • The control unit 60 further judges the location of the user by continuously observing the average value of the sound level and distinguishing whether the sound level increased rapidly (S 48). When it is judged that the user exists (is located) in the room, the (sound) flag representing the aforementioned location is set (S 50).
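  • As a rough illustration of this sound-volume detection (FIG. 12), the following sketch applies a simple first-order low-pass filter, averages the amplitude over a window, and sets a presence flag when the level jumps; the filter coefficient, window handling, jump threshold and all function names are assumptions, not values taken from the embodiment.

```python
# Illustrative sketch of the FIG. 12 sound-volume detection (not the actual firmware).

def low_pass(samples, alpha=0.2):
    """First-order low-pass filter to suppress the high-frequency noise component (S 42)."""
    filtered, prev = [], 0.0
    for s in samples:
        prev = prev + alpha * (s - prev)
        filtered.append(prev)
    return filtered

def window_average(samples):
    """Average amplitude level over a prescribed time frame (S 44)."""
    return sum(abs(s) for s in samples) / len(samples)

class VolumeDetector:
    def __init__(self, jump_ratio=3.0):
        self.history = []              # stored average values (S 46)
        self.jump_ratio = jump_ratio
        self.user_present_flag = False

    def process_frame(self, raw_samples):
        avg = window_average(low_pass(raw_samples))
        if self.history and avg > self.jump_ratio * (sum(self.history) / len(self.history)):
            # The sound level increased rapidly, so the user is judged to be in the room (S 48, S 50).
            self.user_present_flag = True
        self.history.append(avg)
        return self.user_present_flag

# Example: a quiet room followed by a sudden loud sound.
detector = VolumeDetector()
detector.process_frame([0.01, -0.02, 0.015, -0.01])
print(detector.process_frame([0.6, -0.7, 0.65, -0.55]))   # True: the location flag is set
```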
  • FIG. 13 illustrates an example of the second sound processing (sound recognition).
  • The control unit (CPU) 60 performs normalization processing in order to match the time axis and signal level of the sound data stored in the RAM 63 with the contrast data (S 62). Characteristic parameters of the sound are extracted from the normalized data (S 64). The vocalization is distinguished based on the extracted characteristic parameters, and the movement command of the robot corresponding to the subject matter (meaning) of the vocalization is output (S 66). The flag representing this command is set in the RAM 63 (S 68). The control unit 60 thereby reads the vocalization control data, display control data and posture control data corresponding to the command and controls the movement of the robot as described later.
  • FIG. 14 illustrates an example of the image processing of the control unit 60 .
  • The control unit 60 compares the image data from the CCD camera 53 previously stored in the RAM 63 at a prescribed sampling frequency (S 72) with the currently stored image data (S 74), and distinguishes changes in the image data. For example, the difference in data of the respective pixels at corresponding positions in both frames is sought and accumulated. When the subject is moving, this accumulated value changes significantly. Further, in order to reduce the operational load, changes in data at prescribed positions on the screen, for example the center and four corners of the screen, may be compared (S 76).
  • Whether the subject moved (or changed) on the CCD screen (image) is judged based on such difference (S 78).
  • When movement is judged, a flag representing movement detection (user location) is set (S 80). Further, the brightness inside the room can also be distinguished from the average value of the luminance of the image data.
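  • The frame-difference judgement of FIG. 14 can be pictured with the sketch below; the frame size, the sampled positions (center and four corners) and both thresholds are assumed values used only for illustration.

```python
# Illustrative sketch of the FIG. 14 image processing (not the actual firmware).

SAMPLE_POINTS = [(0, 0), (0, 9), (9, 0), (9, 9), (5, 5)]   # four corners and the center
MOVE_THRESHOLD = 50      # accumulated pixel difference that counts as movement
BRIGHT_THRESHOLD = 80    # average luminance that counts as a bright room

def subject_moved(prev_frame, cur_frame):
    """Accumulate pixel differences at prescribed positions (S 72 to S 78)."""
    diff = sum(abs(cur_frame[y][x] - prev_frame[y][x]) for y, x in SAMPLE_POINTS)
    return diff > MOVE_THRESHOLD

def room_is_bright(frame):
    """Judge the brightness of the room from the average luminance of the image."""
    total = sum(sum(row) for row in frame)
    return total / (len(frame) * len(frame[0])) > BRIGHT_THRESHOLD

# Example with two 10x10 grey-level frames; the subject appears near the center.
prev = [[30] * 10 for _ in range(10)]
cur = [row[:] for row in prev]
cur[5][5] = 200
print(subject_moved(prev, cur))   # True -> movement-detection (user location) flag set (S 80)
print(room_is_bright(cur))        # False -> the room is judged to be dark
```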
  • FIG. 15 is a flowchart explaining an example of judging whether a user is located (or exists) based on the switches, sound, movement of the subject, and so on.
  • the control unit 60 repeats this routine in prescribed cycles during the standby state. Foremost, the control unit 60 distinguishes whether the user directly operated the switches with the likes of the touch sensor 51 or ⁇ switch 54 by checking the relevant flag (S 102 ). If the switches have been operated (S 102 ; Yes), since this means nothing less than that the user exists, the flag representing the location of the user is set (S 112 ), and this routine is ended.
  • FIG. 16 is a flowchart explaining another example of judging the location (or existence) of the user based on the switches, sound, movement of the subject, and so on.
  • In this example, the brightness of the room is detected instead of the movement of the subject, and the example differs from the case depicted in FIG. 15 in that the user is considered to exist nearby when the room is bright.
  • It is distinguished whether the flag representing that the room is bright, set pursuant to the results of the aforementioned image processing or based on a phototransistor, and the sound detection flag (S 50), set pursuant to the results of the sound processing, have both been turned on (S 120).
  • the other routines are the same as the case in FIG. 15 and explanation thereof is omitted.
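  • A compact way to picture the presence judgement of FIG. 15 and FIG. 16 is sketched below; apart from the direct switch check (S 102, S 112) and the brightness-plus-sound check (S 120), the exact combination of flags is an assumption made for illustration.

```python
# Illustrative sketch of the user-location judgement (assumed flag combinations).

def user_is_located(switch_flag, sound_flag, movement_flag=False, bright_flag=False):
    """Return True when the user is judged to be located nearby."""
    if switch_flag:                    # direct operation means nothing less than that the user exists
        return True
    if sound_flag and movement_flag:   # FIG. 15 style: sound plus movement of the subject
        return True
    if sound_flag and bright_flag:     # FIG. 16 style: sound plus a bright room (S 120)
        return True
    return False

print(user_is_located(False, True, movement_flag=True))   # True
print(user_is_located(False, True, bright_flag=False))    # False
```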
  • FIG. 17 illustrates an example where the robot reacts in correspondence with the biorhythm of the user.
  • The control unit 60 executes this routine when it is activated in the morning. Foremost, it is judged whether the user exists in (or near) the room (S 132). When the user does not exist (S 132; No), this routine is ended. When the user does exist (S 132; Yes), the built-in calendar is read (S 134). The biorhythm of the user is calculated as depicted in FIG. 18 based on today's date and the birth date of the user (S 136). Event occurrence dates are set in this biorhythm beforehand. For example, the event occurrence dates shall be the turnabout points E 1 and E 3 at which the behavior switches between positive and negative, the optimum point E 2, and the worst point E 4. It is then judged whether today corresponds to an event occurrence date set in advance (S 138). When it is not an event occurrence date (S 138; No), this routine is ended.
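  • The biorhythm calculation itself can be sketched as below. A single sinusoidal rhythm with an assumed 28-day period is used here, and the event points are simply taken as the two turnabout points, the optimum point and the worst point of that sinusoid; the actual cycle length and event definitions of FIG. 18 may differ.

```python
# Illustrative sketch of the biorhythm calculation (assumed 28-day period).

import math
from datetime import date, timedelta

PERIOD = 28   # assumed cycle length in days

def biorhythm(birthday, today):
    """Amplitude of the rhythm (-1.0 .. +1.0) for the given day."""
    days = (today - birthday).days
    return math.sin(2 * math.pi * days / PERIOD)

def event_today(birthday, today):
    """Return an event name when today is a turnabout, optimum or worst point."""
    phase = ((today - birthday).days % PERIOD) / PERIOD
    events = {0.0: "turnabout (rising)", 0.25: "optimum",
              0.5: "turnabout (falling)", 0.75: "worst"}
    return events.get(phase)

user_birthday = date(1975, 4, 1)                 # hypothetical value entered via FIG. 11
print(round(biorhythm(user_birthday, date(2001, 11, 5)), 2))
print(event_today(user_birthday, user_birthday + timedelta(days=7)))   # "optimum"
```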
  • FIG. 20 is a flowchart showing an example of controlling the movement such that the movement of the robot changes with time.
  • The control unit (CPU) 60 foremost distinguishes whether the user exists nearby with the setting of the aforementioned flags (e.g., S 126) (S 152).
  • When the user does not exist nearby (S 152; No), solo play is implemented from time to time.
  • Solo play, for example, is represented with a play state by displaying a one-person game on the display unit 71.
  • Random numbers are thereby generated (S 154 ) in order to judge whether the number for solo play has been output (S 156 ). When such number is not output, this routine is ended (S 156 ; No).
  • When such number is output (S 156; Yes), solo play data is extracted from the posture control data, sound control data and display control data and set in the movement control program (S 158).
  • When the user exists nearby (S 152; Yes), the control unit 60 reads the present time from the internal clock (S 160), and judges whether this time is a time to wake up (S 162).
  • If it is a time to wake up (S 162; Yes), the control unit 60 sets a movement control program by extracting (wakeup) data for waking the robot up from the posture control data, sound control data and display control data (S 164).
  • The robot thereby performs wakeup operations such as “Good morning”, “I'm awake” and the like. If it is not a time to wake up (S 162; No), it is subsequently distinguished whether it is a time to send off the user (S 166).
  • If it is a time to send off the user (S 166; Yes), the control unit 60 sets a movement control program by extracting data for sending the user off from the posture control data, sound control data and display control data (S 168).
  • the robot thereby performs sendoff operations such as “It's time to go”, “Have a good day” and the like. If it is not a time to send the user off (S 166 ; No), it is subsequently distinguished whether it is a predetermined time for the user to come home (S 170 ).
  • If it is a predetermined time for the user to come home (S 170; Yes), the control unit 60 sets a movement control program by extracting data for welcoming the user home from the posture control data, sound control data and display control data (S 172).
  • the robot thereby performs welcome operations such as “Welcome home”, “Good to see you” and the like. If it is not a time to welcome the user home (S 170 ; No), it is subsequently distinguished whether it is a predetermined time for the user to go to sleep (S 174 ).
  • If it is a predetermined time for the user to go to sleep (S 174; Yes), the control unit 60 sets a movement control program by extracting operational data for sleeping from the posture control data, sound control data and display control data (S 176).
  • The robot thereby performs goodnight operations such as “Good night”, “See you tomorrow” and the like, and thereafter enters a power saving mode (sleep mode). If it is not a time to go to sleep (S 174; No), it is subsequently distinguished whether it is the predetermined time of the user's alarm setting (S 178).
  • If it is the time of the alarm setting (S 178; Yes), the control unit 60 sets a movement control program by extracting alarm data from the posture control data, sound control data and display control data (S 180).
  • the robot thereby performs operations for informing the time such as “It's time”, “Wake up” and “It's (present time)”. If it is not a time of the alarm setting (S 178 ; No), this routine is ended.
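  • The overall time-dependent branching of FIG. 20 can be pictured with the following scheduler sketch; the hours, the time windows and the solo-play probability are assumed example values, not settings taken from the embodiment.

```python
# Illustrative sketch of the FIG. 20 time-dependent behaviour selection.

import random
from datetime import time

SCHEDULE = [                        # (start, end, behaviour), checked in order (S 162 to S 176)
    (time(7, 0),  time(7, 30),  "wakeup"),     # "Good morning", "I'm awake"
    (time(8, 0),  time(8, 30),  "sendoff"),    # "It's time to go", "Have a good day"
    (time(18, 0), time(18, 30), "welcome"),    # "Welcome home", "Good to see you"
    (time(23, 0), time(23, 30), "goodnight"),  # "Good night", then power-saving mode
]

def select_behaviour(now, user_nearby, alarm=None):
    if not user_nearby:
        # Solo play is implemented only from time to time (S 154 to S 158).
        return "solo_play" if random.random() < 0.1 else None
    for start, end, behaviour in SCHEDULE:
        if start <= now <= end:
            return behaviour
    if alarm and (now.hour, now.minute) == (alarm.hour, alarm.minute):
        return "alarm"              # "It's time", "Wake up" (S 178, S 180)
    return None

print(select_behaviour(time(8, 10), user_nearby=True))                      # "sendoff"
print(select_behaviour(time(12, 0), user_nearby=True, alarm=time(12, 0)))   # "alarm"
```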
  • The control unit 60 performs display control of the display unit 71 pursuant to the display control data set in the control program (S 202).
  • the posture of the robot is controlled by controlling the motors 205 and 206 pursuant to the posture control data set in the control program (S 204 ). Further, sound is output from the speaker 72 with the vocalization mechanism (synthesizer, sound data reproduction) pursuant to the sound control data set in the control program (S 206 ).
  • FIG. 22 illustrates an operational example of the robot when the operational data of “delight” is set in the control program.
  • FIG. 23 illustrates an operational example of the robot when the operational data of “pleasure” is set in the control program.
  • FIG. 24 illustrates an operational example of the robot when the operational data of “sorrow” is set in the control program.
  • FIG. 25 illustrates an operational example of the robot when the operational data of “affection” is set in the control program.
  • the electronic toy of the present invention can be connected to the likes of a PHS, mobile phone and standard circuit, and the user may view the situation inside the house by forwarding the image obtained by the robot to himself/herself.
  • When the remaining battery level is low, the control unit 60 yields expressions of vocalized words such as “I'm going to sleep to save the batteries, okay?” pursuant to the battery voltage detection sensor 56.
  • The robot as the electronic toy depicted in the present embodiment expresses emotions with its entire body, and it is thereby possible to yield a so-called healing element in the toy, since operations as though seeking communication with the user are enabled. Moreover, various conversations are also realized.
  • FIG. 26 and FIG. 27 illustrate examples of another robot as the electronic toy.
  • the components in FIG. 26 and FIG. 27 corresponding with FIG. 1 have the same reference numerals, and the explanation of such components is omitted.
  • The robot of this example comprises the same structure and functions as the robot illustrated in FIG. 1, but has a display unit 71 which covers approximately the entire front part (face) of the head portion 10.
  • the display unit 71 may employ an LCD display unit, but is not limited thereto.
  • the ⁇ switch 54 is disposed on the upper surface of the head portion.
  • FIG. 28( a ) expresses the facial state of “pleasure”, FIG. 28( b ) “dizziness”, FIG. 28( c ) “anger”, and FIG. 28( d ) “sentimentality”, respectively.
  • FIG. 29( a ) expresses the state of “sadness” and FIG. 29( b ) “sleep”.
  • the “sleep” state is the power saving mode, and is similar to the power saving mode of a personal computer.
  • The control unit 60 stores approximately 300 facial display animations for changing the facial expression.
  • three basic facial patterns are prepared for the respective modes of “delight”, “anger”, “sorrow” and “pleasure”, and sound and movement are additionally combined in correspondence with the respective modes.
  • FIG. 30 is a diagram explaining an example wherein the robot illustrated in FIG. 1 or FIG. 26 has its own biorhythm.
  • the user biorhythm data of ROM 62 described above can be replaced with the biorhythm function program of the robot.
  • For the personal biorhythm of the robot, random numbers are generated when the insulation paper is removed from the battery housing and power is supplied, a random start position (initial value) as depicted with the plurality of points on the sinusoidal wave of FIG. 30 is selected based on the results thereof, and the biorhythm is accordingly made to differ per robot.
  • The spread (variation) in the switching operation when a switch (not shown) is pressed by the mechanical movement upon activating the motor may also be used as such random numbers for setting the initial value.
  • the amplitude value of the function created by the biorhythm is utilized as one of the emotion parameters of the control parameter (c.f. FIG. 18).
  • Five operational modes are set in accordance with the value of the emotion parameter.
  • the first range containing the center of the amplitude is the “normal mode”, and a “pleasure mode” in which the robot is of a happy feeling is defined in a prescribed range thereabove, and a “delight mode” in which the robot is full of delight is defined in a prescribed range further thereabove.
  • A “sorrow mode” in which the robot is sad is defined in a prescribed range below the “normal mode”, and an “anger mode” in which the robot is angry is defined in a prescribed range further therebelow.
  • a biorhythm with a short cycle may be set for demonstration exhibitions in front of the store by performing specific switching operations. For instance, one cycle can be set to 5 minutes. Expressive changes and gestures accompanying the emotional changes of the robot can thereby be shown to the audience in a short time span in order to introduce the capabilities and characteristics of this robot.
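  • The robot's own biorhythm and the five operational modes can be pictured with the sketch below; the threshold values, the normal cycle length and the way the 5-minute demonstration cycle is switched in are assumptions chosen only to illustrate the idea of a per-robot random initial value and amplitude-based mode selection.

```python
# Illustrative sketch of the robot's personal biorhythm (FIG. 30) and mode selection.

import math, random

class RobotBiorhythm:
    def __init__(self, period_days=28.0):
        # A random start position (initial value) is chosen when power is first
        # supplied, so that the biorhythm differs per robot.
        self.phase = random.uniform(0.0, 2.0 * math.pi)
        self.period = period_days

    def emotion_parameter(self, elapsed_days):
        return math.sin(self.phase + 2.0 * math.pi * elapsed_days / self.period)

    def set_demonstration_cycle(self):
        """Short cycle (one cycle = 5 minutes) for demonstrations in front of the store."""
        self.period = 5.0 / (24.0 * 60.0)

def operational_mode(value):
    """Map the emotion parameter onto the five modes described above (assumed thresholds)."""
    if value > 0.8:
        return "delight"
    if value > 0.3:
        return "pleasure"
    if value >= -0.3:
        return "normal"
    if value >= -0.8:
        return "sorrow"
    return "anger"

robot = RobotBiorhythm()
print(operational_mode(robot.emotion_parameter(elapsed_days=3)))
```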
  • FIG. 29( a ) shows an expression saying, “Stop it!” when a prank is played on the robot. In order to perform this type of gesture in a timely manner, it would be amusing if this kind of expression is displayed on the display unit when a high level of sound is continuously provided to the robot.
  • The output of the microphone 5 as the sound detection means is monitored by the control unit 60 via the low-pass filter for eliminating noise, and it is determined whether a sound signal exceeding a prescribed level continues for a prescribed time; for example, beyond 10 seconds. If such sound signal continues, since it is “noisy”, the control unit selects the expression of the robot illustrated in FIG. 29( a ) from the storage means ( 62 , 63 ) and displays this on the display unit. Moreover, since the selective operation of the expression is conducted pursuant to the value of the aforementioned emotion parameter, the same results can be obtained even if the emotion parameter value is changed to an “unpleasant” level.
  • FIG. 29( b ) The expression of FIG. 29( b ) represents a “sleep” state. It would be amusing if the robot would express this type of sleeping expression when a blanket is placed over the robot or when the periphery becomes dark.
  • The light sensor 53 (e.g., a CCD, photodiode, phototransistor, etc.) detects the light intensity of the periphery.
  • the control unit 60 monitors this light intensity and judges whether a dark state continues beyond a prescribed time; for example, beyond 10 seconds. If such dark state continues, since the “periphery is dark”, the control unit selects the “sleeping” expression of the robot illustrated in FIG. 29( b ) from the storage means ( 62 , 63 ) and displays this on the display unit.
  • the mechanical components of the arm 30 or the like may also be made to move for a prescribed time.
  • the control unit 60 monitors the output of the switch 54 or touch sensor 51 , and distinguishes whether the operation is continued for a prescribed time; for example, 10 seconds.
  • If the operation continues, the expression, words, sound, etc. in accordance with the emotion of the robot at such time are output. When the emotion parameter is in an “unpleasant” state, an unpleasant expression such as the “painful” expression shown in FIG. 29( a ) or an “angry” expression is displayed. When the emotion parameter is in a state of “delight”, the “sentimental” expression shown in FIG. 28( d ) is displayed.
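  • These duration-based reactions (a loud sound, darkness or a pressed switch continuing beyond about 10 seconds) can be pictured with the sketch below; the sensor polling, the levels and the reaction strings are assumptions used only to show the timing logic.

```python
# Illustrative sketch of reacting to conditions that continue for a prescribed time.

import time

PRESCRIBED_TIME = 10.0   # seconds, as in the example above

class ContinuousCondition:
    """Track how long a condition (noisy / dark / pressed) has been continuously true."""
    def __init__(self):
        self.since = None

    def update(self, active, now):
        if not active:
            self.since = None
            return False
        if self.since is None:
            self.since = now
        return (now - self.since) >= PRESCRIBED_TIME

noisy, dark = ContinuousCondition(), ContinuousCondition()

def react(sound_level, light_level, now):
    if noisy.update(sound_level > 0.8, now):
        return 'display the "Stop it!" expression of FIG. 29(a)'
    if dark.update(light_level < 0.1, now):
        return 'display the "sleeping" expression of FIG. 29(b) and enter the power-saving mode'
    return None

t0 = time.time()
print(react(0.9, 0.5, t0))           # None: the loud sound has only just started
print(react(0.9, 0.5, t0 + 11.0))    # the "Stop it!" reaction after 10 seconds have passed
```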
  • FIG. 31 is a flowchart for explaining an example of a “soliloquy mode” of the robot reflecting the aforementioned biorhythm.
  • the control unit (CPU) 60 executes the soliloquy mode when corresponding to a soliloquy start condition; for example, the condition falling under “user absent” and “generation of prescribed random numbers” (S 270 ; Yes). Foremost, the emotion parameter representing the biorhythm amplitude, which is one type of control parameter, is read (S 272 ). It is then judged which of the foregoing 5 modes corresponds thereto from this value (S 274 ). Mode judgment is conducted by comparison with the threshold values of the respective modes, and the result thereof is output (S 274 to S 284 ). Display of the expression corresponding to each of the judged modes and, as necessary, robot control accompanying the movement and sound is additionally performed (S 286 ).
  • FIG. 33 and FIG. 34 illustrate display examples of words in the normal mode.
  • a sentence is created using the term “IT (information technology)” stored in advance or input by the user. “Know what?”, “IT is” and “the trend” are sequentially displayed on the screen.
  • a sentence is created in a haiku-like format (Japanese unrhymed verse form having three lines containing 5, 7 and 5 syllables, respectively).
  • the robot may perform a “single-action performance” of delight or anger corresponding thereto. For instance, while playing background music, the robot can say, “I feel good! I will now impersonate (so and so) !” and do an impersonation, or rotate its arms or display “question mark (??)” eyes and so on.
  • FIG. 35 is a flowchart explaining the text communication mode described below.
  • the text communication mode is for seeking communication with the user by the robot displaying text on the display unit.
  • the control unit 60 selects a question from pre-stored question data (S 242 ).
  • the respective questions are made to be distinguishable in advance; namely, those that change the emotion of the robot depending on the response as depicted in FIG. 38, and those that do not affect the emotion of the robot irrespective of the response as depicted in FIG. 39.
  • the control unit 60 displays the selected question on the display unit (S 244 )
  • It is then distinguished whether the ⁇ button is operated (S 245).
  • This processing is used, for instance, by storing the response when the user presses “ ⁇ ” to the question of “Do you like”, “cars?”, and remembering in the modes described later that the user “likes cars”.
  • The robot expresses its sadness, for example, by making the “sorrowful” pose shown in FIG. 24 and displaying the “dislike” expression, and lowers the emotion parameter in the minus direction (S 252). This, as shown in FIG. 40, shifts the biorhythm toward the state of grief. The robot thereby moves to a mode for expressing a sorrowful expression.
  • the favorable image is a parameter corresponding to the robot's feelings toward the user.
  • n points are added when a response is made which makes the robot happy.
  • minus m points are added when a response is made which makes the robot sad.
  • The values of n and m differ depending on the respective questions.
  • the favorable image is determined based on such integrated values (S 254 ).
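  • The favorable-image bookkeeping can be pictured as below; the questions, the point values n and m and the ranking thresholds are made-up examples, since the embodiment only states that n points are added for pleasing responses and minus m points for saddening ones.

```python
# Illustrative sketch of the favorable-image parameter.

QUESTIONS = {
    # question: (n points when the answer pleases the robot, m points when it saddens it)
    "Do you like robots?": (5, 3),
    "Do you like cars?":   (2, 1),
}

class FavorableImage:
    def __init__(self):
        self.total = 0

    def record_answer(self, question, pleased):
        n, m = QUESTIONS[question]
        self.total += n if pleased else -m     # minus m points for a saddening response

    def rank(self):
        if self.total >= 5:
            return "good friend"
        if self.total <= -5:
            return "estranged"
        return "neutral"

favorable = FavorableImage()
favorable.record_answer("Do you like robots?", pleased=True)
favorable.record_answer("Do you like cars?", pleased=False)
print(favorable.total, favorable.rank())   # 4 neutral
```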
  • FIG. 41 shows an example of conducting data exchange by connecting the robots via a communication cable 741.
  • the communication interfaces 74 of the control unit as shown in FIG. 43 are connected via the connector (not shown) provided on the back face of the robot.
  • FIG. 42 shows an example of connecting a PHS or mobile phone 742 to the communication interface 74 of the robot and conducting data exchange between the PHS or mobile phone 742 in a distant place and the connected robot via a mobile communication network as shown in FIG. 44.
  • a card module of the PHS or mobile phone may be integrated in the back face of the robot.
  • the connection example of the communication interface 74 and the PHS or mobile phone 742 , 743 , etc. described in this embodiment of the present invention includes cases where a telephone communication function itself is installed in the robot.
  • FIG. 45 shows an example of connecting the communication interface 74 of the robot with the personal computer 743 connected to the Internet 745 as the communication network, and, similarly, of conducting data communication with other robots connected to the Internet 745 . Further, the description of FIG. 45 is omitting providers and the like that provide Internet connection services.
  • the composition shown in FIG. 46 shows a system capable of obtaining robot data by communication with the server device.
  • the communication interface 74 of the robot is connected to the server device 750 of such robot via a communication means such as the PHS or mobile phone 743 , Internet 745 or telephone communication network.
  • Via a communication network such as the Internet 745, the server device provides data such as words and current-affairs terms corresponding to the user characteristics or attributes described later, as well as control data for controlling the robot gestures.
  • FIG. 47 explains an example of data exchange upon connecting robot A and robot B with a communication cable 741.
  • the robots are foremost connected with the cable.
  • The mode selection status is entered by simultaneously operating the “ ⁇ ” and “ ⁇ ” buttons (switch 54) of the respective robots, for example, and the “communication” mode is selected thereby.
  • communication parameters are exchanged between the robots, communication conditions and so on are set, and communication is started.
  • Robot A transmits the user name of robot A, terms and so forth which it stores.
  • the user name for instance, is stored by the user inputting his/her name through sequential selection of the corresponding character displayed on the display unit.
  • Stored terms for example, can be obtained by storing the response to the questions inquired by the robot as described above (S 256 ). This includes various terms such as the likes and dislikes of the user, age, male/female, personality, etc.
  • Data is forwarded from robot A to robot B and, when robot B confirms such data, an ACK signal representing the reception of data is transmitted. When data reception ends in failure, a NACK signal is transmitted.
  • When robot A receives the NACK signal, it retransmits the data. When robot A receives the ACK signal, it distinguishes the success of the data transmission, enters the standby state, and awaits the signal from robot B.
  • robot B transmits to robot A data on the user name of robot B and stored terms which it retains.
  • When robot A confirms such data, an ACK signal representing the reception of data is transmitted. When data reception ends in failure, robot A transmits a NACK signal, and robot B receiving such signal retransmits the data.
  • These words are applied to a standard sentence selected from a plurality of standard sentences stored in advance, and output by at least one of a sound and screen display of text.
  • the output timing of sound and display and selection of the standard sentence may be set in accordance with the initial exchange of communication parameters.
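  • The ACK/NACK exchange of FIG. 47 can be pictured with the sketch below; the framing, the checksum and the simulated noisy cable are assumptions, since the embodiment does not specify the frame format.

```python
# Illustrative sketch of the robot-to-robot data exchange with ACK/NACK retransmission.

import random

def checksum(payload: bytes) -> int:
    return sum(payload) & 0xFF

def send_over_cable(frame: bytes) -> bytes:
    """Simulated cable that occasionally corrupts a byte."""
    data = bytearray(frame)
    if data and random.random() < 0.3:
        data[0] ^= 0xFF
    return bytes(data)

def robot_b_receive(frame: bytes) -> str:
    payload, received_sum = frame[:-1], frame[-1]
    return "ACK" if checksum(payload) == received_sum else "NACK"

def robot_a_transmit(record: str, max_retries: int = 5) -> bool:
    payload = record.encode("utf-8")
    frame = payload + bytes([checksum(payload)])
    for _ in range(max_retries):
        if robot_b_receive(send_over_cable(frame)) == "ACK":
            return True            # success: robot A enters the standby state
        # NACK received: retransmit the data
    return False

print(robot_a_transmit("user=Hanako;likes=chocolate"))
```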
  • In place of the connection cable 741, an infrared communication interface employed in remote controllers and portable terminals may also be used.
  • FIG. 48 is a communication diagram explaining the procedures in the case of conducting data communication between robots with a portable telephone.
  • The respective users of robot A and robot B connect their robots to a PHS or mobile phone and make a call to the telephone of the other party.
  • communication parameters are exchanged, and, for example, the mutual data communication speed is set to the slower speed among the telephones.
  • When the communication parameters are set, data is transmitted from one of the robots (robot A) to the other robot (robot B).
  • Robot B transmits the ACK signal when there are no abnormalities in the received data, and transmits the NACK signal when there are abnormalities.
  • When robot A receives the ACK signal representing reception from robot B, it enters a standby state. Robot A retransmits the data upon receiving the NACK signal.
  • robot B transmits data which it stores to robot A.
  • robot B transmits words such as “Hanako” (user name), “Chocolate” (personal favorite), “F-1 racers” (personal favorite), “tea mushroom” (personal favorite), Pico (personal favorite), and the like.
  • Robot A and robot B respectively select a standard sentence stored in advance, complete the sentence by applying the received data in the blank space in the standard sentence, and output communication results by conducting at least one of a vocalization or text display. Attributes of the word to be filled in such blank space should be predetermined; for example, the user's name, personal favorite, personal dislike, age, weather, and so on.
  • robot A would vocalize “Yeah, I received data from ‘Hanako’ who likes ‘chocolate’”, “Hey, are ‘F1 racers’ delicious?”, “Hanako taught me ‘tea mushrooms,’ but I don't know what they are . . . ”, “Are ‘Hanako's’ pants cool?”, “Maybe ‘Hanako’ is a “Pico” mania”, and so on.
  • robot B would vocalize “Let's see. I received data from ‘Taro’ who likes to take ‘naps’”, “Are MDs the thang with young hipsters?”, “I love ‘pachinko’”, “Is ‘Sazaesan’ really smart?”, “‘Taro’ said this year's ‘Pochi’ is well made.”, “Wouldn't it be scary if they had a ‘Thunderbird’ ramen?”, and so on.
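  • Filling the received words into the blank spaces of a standard sentence amounts to simple template substitution, as sketched below; the templates and attribute names are made-up examples in the spirit of the vocalizations quoted above.

```python
# Illustrative sketch of applying received words to pre-stored standard sentences.

STANDARD_SENTENCES = [
    "Yeah, I received data from '{name}' who likes '{favorite}'.",
    "Hey, are '{favorite}' delicious?",
    "'{name}' taught me '{favorite}', but I don't know what they are...",
]

def compose(received, template):
    """Fill the blank spaces of a standard sentence with the received words."""
    return template.format(**received)

received_from_robot_b = {"name": "Hanako", "favorite": "chocolate"}
for sentence in STANDARD_SENTENCES:
    print(compose(received_from_robot_b, sentence))
```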
  • FIG. 49 and FIG. 50 show examples of renewing the data retained by the robot with the server device 750 illustrated in FIG. 46.
  • The aforementioned function is provided inexpensively by adequately providing, from the server device, data such as requisite words as well as data (a control program) for controlling the operation of the robot upon speaking such words.
  • the control program may be used for controlling the series of operations pursuant to such program, or may be used for designating the operation of one among a plurality of operational control programs such as “delight”, “anger”, “sorrow” and “pleasure” pre-stored in the robot.
  • the communication interface 74 of the robot is connected to a PHS, mobile phone or personal computer 743 , and then connected to the server device 750 via a communication network, the Internet 745 for example, in order to establish the circuit for conducting data transmission.
  • Communication parameters required for establishing communication such as the communication speed, specification of electronic toy, ID, password, and so on are transmitted from robot A to the server device 750 .
  • the server device 750 conducts authentication on whether to permit the connection, and thereby permits access to robot A.
  • the robots request the transmission of updated data.
  • The server 750 transmits the requested designated data; for example, only the required number of words.
  • “divine nation comment”, “train accident”, “New Years”, “Christmas” and so on are transmitted. It is also possible to forward a new standard sentence suitable for such words.
  • Also transmitted are control program data 1, program data 2 and program data 3 for controlling the robot operation upon vocalizing standard sentences employing the aforementioned words.
  • Control program data 41, 42 is also transmitted together with the word data.
  • When robot A receives the data, it stores this in the memory 63.
  • the ACK signal is transmitted to the server, the circuit is opened, and the update is completed.
  • When data reception ends in failure, the NACK signal is transmitted to the server, and retransmission of the data is requested.
  • When the server device receives the ACK signal from robot A, or when the circuit is opened, it ends the communication with robot A.
  • Robot A applies the acquired words in the standard sentences and conducts at least one of a vocalization or text display (sentence display). Further, although the robot has a function of converting text data into sound, sound data of words and standard sentences may be received from the server device and vocalized by decoding the same.
  • Action mail is for displaying or reading the contents of the e-mail received by the robot as well as to perform prescribed actions corresponding thereto; for example, the movement of the arms and legs and representation of expressions.
  • FIG. 51 and FIG. 52 show structural examples in the case of conducting action mail.
  • the e-mail sender downloads beforehand the action mail software, which can be downloaded via the Internet, in his/her personal computer 743 a .
  • The personal computer 743 a is in an environment connected to a communication network such as the Internet 745 and capable of e-mail communication.
  • the sender creates an e-mail message by operating an input device such as a keyboard device.
  • the aforementioned software downloaded into the personal computer includes a message/operation editing program for conducting text input, message editing, control movement input, and so on; a data/sound conversion program for converting the message into sound data; and an e-mail program with a data file attachment function capable of transmitting sound data as the attachment file.
  • the sender creates an e-mail by utilizing control information and the message/operation editing program.
  • E-mail designates the name of sender (4 letters for example), message (44 letters for example), and operation of the robot. These can be assembled with text data.
  • the text code is converted into a sound signal, an FM modulation signal for example, with the data/sound conversion program.
  • Information on the name, message and robot operation, for example, may be separated by three-second silent intervals as shown in FIG. 52. Further, headers and footers (not shown) may also be suitably added.
  • Such FM sound is converted into sound data; for example, sound data formats such as WAV, MP3, ram, and so on.
  • the e-mail program transmits this sound data file, upon attaching it to the e-mail, to the party on the other side of the line using the robot.
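  • The conversion of the text message into an attachable sound file can be pictured with the sketch below. Simple binary FSK is used here in place of the FM modulation named in the text, the fields are separated by a "|" character instead of the three-second silent intervals of FIG. 52, and the frequencies, bit rate and framing are all assumptions.

```python
# Illustrative sketch of turning an action-mail message into a WAV attachment.

import math, struct, wave

SAMPLE_RATE = 8000
BIT_DURATION = 0.01                 # seconds per bit (assumed)
FREQ_ZERO, FREQ_ONE = 1200.0, 2200.0

def modulate(text: str) -> bytes:
    """Encode each bit of the message as a short tone (binary FSK)."""
    samples = []
    for byte in text.encode("utf-8"):
        for bit in range(8):
            freq = FREQ_ONE if (byte >> bit) & 1 else FREQ_ZERO
            for n in range(int(SAMPLE_RATE * BIT_DURATION)):
                value = math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
                samples.append(struct.pack("<h", int(value * 32767)))
    return b"".join(samples)

def write_wav(path: str, audio: bytes) -> None:
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)          # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(audio)

# Sender name, message and an operation code, ready to be attached to the e-mail.
write_wav("action_mail.wav", modulate("TARO|Good morning!|OP:delight"))
```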
  • E-mail is transmitted to the e-mail server device of the other party via the Internet 745 .
  • various server devices containing a communication circuit and e-mail server as well as connection service providers are included in the Internet 745 .
  • the receiver downloads beforehand into his/her personal computer the communication software having an action mail reception function obtainable via the Internet. Decoding of the sound file is included in the reception function.
  • the receiver connects the robot to his/her personal computer 743 b .
  • the receiver accesses the e-mail server (not shown) with the personal computer 743 b and downloads the e-mail sent to such receiver.
  • the attached sound file is reproduced with the aforementioned communication software in order to demodulate the sound signal.
  • This sound signal is supplied to the control unit 60 of the robot via the communication interface 74 .
  • the control unit 60 demodulates the FM signal and converts this into digital data. Control information such as the name of sender, message and movement is distinguished from the data.
  • the control unit 60 converts the text data into image data and displays the same on the display unit 71 .
  • the name of the sender is displayed, and a long message can be scroll displayed thereafter on a small display unit screen.
  • the text data may also be read aloud. This is repeated a prescribed number of times. Needless to say, the entire message may be displayed when employing a large display unit.
  • the control unit 60 controls the motors 205 and 206 based on the operational control information and makes the robot perform operations corresponding to the message.
  • the control of the action operation may also be performed pursuant to the control code stored beforehand in the ROM of the robot, or by the sender designating a control program formed from a series of control codes.
  • the sender may program, to his/her liking, the series of movements of the robot by assembling control codes corresponding to the individual operations.
  • the message display and action movement corresponding to the reception of action e-mail may be performed simultaneously. Further, the action may be performed first, and the message displayed thereafter. Or, the message may be displayed first, and the action movement performed thereafter. These may be repeated or combined. Moreover, the sender may create a voice message and transmit such message as an attachment file, and the robot may reproduce this from the speaker as a voice message.
  • The control unit 60 may download the action mail software via the communication function and itself possess an e-mail receiving function.
  • In that case, the control unit 60 of the robot receives the e-mail and may perform the conversion of the sound file otherwise implemented by the personal computer 743 b, and the personal computer will no longer be necessary.
  • the structures illustrated in FIG. 44 to FIG. 46 are also able to conduct action mail.
  • the server device may also be made to be the sender of the action mail.
  • the robot may speak a one-word message, today's fortune, shopping information, weather forecast, current affairs, and the like together with action in accordance with the characteristics and attributes of the user. For instance, if it snowed the previous day, the server device will transmit “It snowed a lot yesterday. And boy was it cold!” (message) and a “jerky move” (movement+facial expression).
  • FIG. 53 to FIG. 56 show examples of movements (action movements) accompanying the facial expressions displayed together with the message.
  • FIG. 53 represents “delight” by raising both arms upward at an angle and displaying hearts on the face.
  • FIG. 54 represents “anger” by raising the hands near the head and displaying slanted eyes on the face.
  • FIG. 55 represents “sorrow” by lowering both hands and displaying tear-filled eyes.
  • FIG. 56 represents “pleasure” by placing both arms forward and displaying a smiling expression on the face.
  • FIG. 57 and FIG. 58 are perspective views showing an example of a robot structured so as to change the movement of its lower body pursuant to the “volume”, “speed”, “rhythm” and so on of sound such as music.
  • the robot makes gestures as though of dancing by opening and closing its legs left and right in accordance with the music.
  • FIG. 57 shows the first state where the robot is standing upright with both feet together.
  • FIG. 58 shows the second state where the robot is opening its legs left and right. The robot consecutively moves from the first state to the second state, and consecutively moves from the second state to the first state.
  • the robot is made to bend the knees pursuant to the mechanism described later so as to simulate the movement of a human.
  • FIG. 59 and FIG. 60 are perspective views showing the drive portions of the leg opening/closing mechanism 300 , and FIG. 59 shows the closed state of the legs and FIG. 60 shows the open state of the legs.
  • a motor 301 is built in the lower part of the left leg of the robot, and the driving force is increased by the gear mechanism 302 .
  • The driving force passes through the hip joint portion 305 of the waist portion frame 304 via the drive shaft 303, and rotates and drives the left leg cam mechanism 306 inside the waist portion frame.
  • One end of the link 308 is rotatably connected to the cam 307 of this mechanism via a roller bearing.
  • the other end of the link 308 is rotatably mounted on the roller bearing 311 on the upper end of the right leg axis 310 .
  • The left leg axis 303 and the right leg axis 310 are able to move symmetrically in the left and right directions, based on a (virtual) central axis (line) extending in the upward and downward directions at the central portion of the torso, pursuant to the cam mechanism 306, link 308, hip joints 305, 313, and so on.
  • FIG. 61 is a perspective view showing a structural example of the right leg 320 .
  • a hip joint portion 313 is mounted on the upper part of the right leg axis 310 .
  • The upper part of this hip joint portion 313 is rotatably mounted against the waist portion frame 304 with a pin (not shown) so that the right leg can move in the left and right directions.
  • the lower part of the hip joint portion 313 is mounted, with a pin 322 (not shown), on the concave portion of the upper part of the above-knee portion 321 of the leg so as to be rotatable in the front and back directions of the robot.
  • the concave portion of the lower end portion of the above-knee portion 321 is rotatably mounted in the front and back directions with the protrusion of the below-knee portion front cover 323 and the pin 324 .
  • The lower central part 323 b of the below-knee portion front cover 323 is opened in a reverse V-shape.
  • The roller portion 312 at the lower end portion of the right leg axis 310 is made to contact the ground surface or floor surface (the mounting face of the robot; not shown) by being positioned in the center penetration hole 325 a of the ground portion 325, which extends in the front and back directions of the robot, and is rotatably mounted with a pair of approximately reverse V-shaped protrusions of the ground portion and pins 326 respectively disposed on both sides of such penetration hole 325 a.
  • the below-knee portion back cover 327 engages with the below-knee front cover 323 while sandwiching the right leg axis 310 .
  • A U-shaped opening 327 a in which the right leg axis is positioned is provided to the upper face portion of the below-knee back cover 327.
  • A long hole 327 b is provided in the below-knee portion back cover 327 at a position opposite the reverse V-shaped opening 323 b of the below-knee portion front cover.
  • Pins 326 that rotatably connect the roller portion 312 with the ground portion 325 are positioned in the reverse V-shaped opening 323 b and the long hole 327 b.
  • The below-knee portion U-shaped opening 327 a, reverse V-shaped opening 323 b and long hole 327 b are structured such that the right leg axis 310 and connection pin 326 do not interfere with (do not contact) the covers 323, 327 when the knee portion, which is the portion connecting the above-knee portion and the below-knee portion, is bent.
  • a protrusion 327 c (c.f. FIG. 57, FIG. 58) is formed on the inside bottom portion of the below-knee back cover 327 .
  • This protrusion 327 c contacts the inclined face 325 b formed on the ground 325 .
  • the protrusion 327 c of the below-knee portion contacts the upper part of the inclined face 325 b of the ground portion 325 , and pushes the below-knee portion 327 upward.
  • Since the position of the hip joint 313 does not change, the lower part of the above-knee portion 321 and the upper part of the below-knee portion 323 are pushed forward, thereby bending the knee of the leg.
  • FIG. 63 shows the appearance of the left leg 330 having a built-in motor 301 .
  • An eccentric cam 307 is mounted on the upper end portion of the left leg axis 303 , and a spherical engagement member 309 for engaging with the link 308 is mounted on the cam 307 with a screw (c.f. FIG. 64).
  • the motor is built in the below-knee portion 331 , and the below-knee portion 331 and the ground portion 332 are rotatably connected with a pin.
  • a friction member (not shown) such as rubber for preventing sliding is adhered to the bottom of the ground portion 332 of the left foot.
  • the mechanical portion only occupies the lower part of the torso, and it is possible to make the bulk of the robot torso internally empty. This is convenient in that the inside of the torso may be used for electric circuits or an upper body mechanism. Moreover, since a relatively heavy motor is disposed in the below-knee portion of the leg, it is easy to stabilize the robot. Further, with the aforementioned mechanism, although the knee of the right leg will bend, the knee of the left leg is fixed, and, by providing a friction member at the bottom portion, it is possible to prevent the unstable posture of the robot, movement and rotation of the robot, and so on.
  • FIG. 64 shows an example of the cam 307 of the left leg axis. As shown in FIG. 64, it is possible to adjust the degree of opening the legs in the left and right directions by changing the mounting position of the engagement member 309 of the cam 307 . Since the link 308 and engagement member 309 (and 311 ) have a spherical shape, unnecessary force is not inflicted between the link and engagement member (or cam) even when the legs are opened. Adjustment can be made by providing beforehand a plurality of screw holes in the cam and mounting the engagement member in an appropriate screw hole, or by changing the cam.
  • FIG. 65 is a block diagram explaining an example of synchronizing (corresponding) music or sound with the movement of the robot.
  • In place of the ROM 62 of the control unit, or in addition to the ROM 62, used is a square chip card (micro IC card) 621, one side of which is approximately 2 cm, having recorded thereon music information and control data. Exchange of songs is thereby facilitated. Needless to say, music information and control data may be recorded on the ROM 62.
  • the control unit 60 reads the sound data (information) from the chip card 621 , converts this into a sound signal with the sound reproduction processing function 601 of the control unit 60 , and supplies this to the speaker 72 at an appropriate level.
  • a song with a prescribed rhythm is thereby played from the speaker 72 .
  • the control unit 60 reads control data from the chip card 621 and controls the motor 301 with the rhythm control function 602 of the control unit 60 .
  • the motor 301 is able to control the rotation speed, normal/reverse rotation, length of step and so on pursuant to PWM control and level control of the supplied voltage.
  • FIG. 66 is a block diagram explaining an example of making the robot move in correspondence with music or sound.
  • Sound data is at least previously stored in the chip card 621 .
  • the control unit 60 reads the sound data from the chip card 621 , converts this into a sound signal with the sound reproduction processing function 601 of the control unit 60 , and supplies this to the speaker 72 at an appropriate level. A song with a prescribed rhythm is thereby played from the speaker 72 . Further, the control unit 60 samples the sound signal with its sampling function 603 and extracts the rhythm (cycle of accents) of the song from such sound signal with the rhythm extracting function 604 .
  • a rotation corresponding to the rhythm of this song is set in the motor control function 605 .
  • The motor control function 605 sets the rotation speed, normal/reverse rotation, length of step and so on by performing PWM control and level control of the supplied voltage.
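  • The rhythm extraction of FIG. 66 can be pictured with the sketch below, which detects accents from the short-time energy of the signal and maps the average accent interval to a motor duty ratio; the envelope method, the thresholds and the duty mapping are assumptions.

```python
# Illustrative sketch of extracting the rhythm (cycle of accents) and driving the motor.

def accent_times(samples, sample_rate, window=0.05, threshold=0.5):
    """Return the times (in seconds) at which the short-time energy jumps above the threshold."""
    size = int(sample_rate * window)
    times, prev_loud = [], False
    for i in range(0, len(samples) - size, size):
        energy = sum(abs(s) for s in samples[i:i + size]) / size
        loud = energy > threshold
        if loud and not prev_loud:
            times.append(i / sample_rate)
        prev_loud = loud
    return times

def motor_duty_from_rhythm(times):
    """Map the average accent interval to a PWM duty ratio: faster songs give faster steps."""
    if len(times) < 2:
        return 0.0
    intervals = [b - a for a, b in zip(times, times[1:])]
    beat = sum(intervals) / len(intervals)
    return max(0.2, min(1.0, 0.5 / beat))   # clamp the duty ratio between 20% and 100%

# A toy signal with an accent every 0.5 seconds.
rate = 1000
signal = [1.0 if (i % 500) < 50 else 0.0 for i in range(5000)]
print(motor_duty_from_rhythm(accent_times(signal, rate)))   # 1.0: a fast song, full duty
```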
  • FIG. 67 to FIG. 75 are diagrams explaining the bipedal locomotion state.
  • FIG. 67 is a perspective view showing the state where the left leg is positioned backward and the right leg is positioned forward.
  • FIG. 68 is a perspective view showing the state where the right leg and left leg are approximately together.
  • FIG. 69 is a perspective view showing the state where the left leg is positioned forward and the right leg is positioned backward.
  • FIG. 70 is a side view showing the leg mechanism of the left leg. The operation of the leg mechanism will be described in detail after the explanation of the respective components thereof.
  • FIG. 71 shows the waist portion frame 401 .
  • the waist portion frame 401 comprises a stopper 401 a for stopping the long rod 410 from being raised excessively, a drive shaft hole 401 b of the cam pulley, a connection axis 401 d for rotatably mounting the above-knee portion 402 , and a guide pin 401 c for engaging with the long hole 411 b of the short rod 411 .
  • FIG. 72 shows the above-knee portion 402 rotatably connected to the waist portion frame 401 .
  • a connection portion 402 a to be connected to the connection axis 401 d of the waist frame 401 is provided to the upper part of the above-knee portion 402 .
  • a connection portion 402 b to be connected to the below-knee portion 403 is provided to the lower part of the above-knee portion 402 .
  • FIG. 73 shows the below-knee portion 403 .
  • a connection portion 403 a to be connected to the connection portion 402 b of the above-knee portion 402 and a connection portion 403 b to be connected to the short rod 411 are provided to the upper part of the below-knee portion 403 .
  • a connection portion 403 c to be rotatably connected to the connection portion 404 a of the ground portion 404 is provided to the lower part of the below-knee portion 403 .
  • FIG. 74 shows the ground portion 404 .
  • a connection portion 404 a to be connected to the connection portion 403 c of the below-knee portion 403 and a connection portion 404 b to be connected to the long rod 410 are provided to the upper part of the ground portion 404 .
  • FIG. 75 shows the cam pulley 420 , long rod 410 and short rod 411 .
  • The cam pulley 420 is connected to the axis to be rotatively driven by a motor (not shown).
  • a drive pin 420 a is provided in an eccentric position from the drive shaft (not shown) of the pulley at the outside of the cam pulley 420 .
  • a cylindrical cam 420 b is provided in an eccentric position from the drive shaft (not shown) of the pulley at the inside of the cam pulley 420 .
  • a long hole 410 a is provided to the upper part of the “dogleg” shaped rod, and a pin 420 a is inserted in this long hole 410 a so as to rotatably engage such rod and long hole.
  • the long hole 410 a prevents the toe of the foot of the robot from lowering excessively.
  • a connection portion 410 b to be connected with the connection portion 404 b of the ground portion 404 is provided to the lower part of the “dogleg” shaped rod.
  • An annular engagement portion 411 a for engaging with the cam 420 b is provided to the upper part of the short rod 411 .
  • a long hole 411 b for engaging with the guide pin of the frame 401 c is provided to the center portion of the short rod 411 .
  • a connection portion 411 c to be connected with the connection portion 403 b of the below-knee portion 403 is provided to the lower part of the short rod 411 .
  • the waist portion frame 401 and the above-knee portion 402 are rotatably connected via the connection portions 401 d and 402 a
  • the above-knee portion 402 and the below-knee portion 403 are rotatably connected via the connection portions 402 b and 403 a
  • the below-knee portion 403 is rotatably connected to the ground portion 404 via the connection portions 403 c and 404 a
  • The short rod connects the cam 420 b and the below-knee portion 403 via the connection portions 403 b, 411 c.
  • the eccentric cam 420 b swings the below-knee portion 403 with the short rod 411 and lifts the below-knee portion 403 .
  • The above-knee portion 402 also oscillates pursuant thereto.
  • the long rod 410 connects the drive pin (cam) 420 a and the ground portion 404 via the connection portions 410 b , 404 b .
  • the drive pin 420 a lifts the ground portion 404 with the long rod 410 .
  • the lifting and lowering of the toe of the foot upon moving the legs is set.
  • This mechanism lifts the toe of the left foot while maintaining the balance with the right foot on the ground, and advances the legs by moving the left foot from the backward to the forward position with the heel on the ground.
  • Similarly, the right foot is moved to the forward position, and these movements are repeated to realize the walking motion.
  • FIG. 76 and FIG. 77 show movements of the left leg pursuant to the rotation of the drive shaft.
  • In these diagrams, the ground portion is not in contact with the floor, and the movement is shown in a state where the foot is hanging in the air.
  • FIG. 76( 1 ) to FIG. 76( 4 ) and FIG. 77( 5 ) to FIG. 77( 8 ) show movements of the respective legs when the drive shaft of the cam is rotated in increments of 45 degrees.
  • FIG. 76( 1 ) shows the state where the rotational angle of the cam axis is 0 degrees (basic position). In this state, the short rod 411 is swung forward by the cam 420 b with the guide pin 401 c as the fulcrum, and the leg is thereby moved forward.
  • FIG. 76( 3 ) shows a state where the cam axis rotated 90 degrees.
  • FIG. 77( 5 ) shows a state where the cam axis rotated 180 degrees. In this state, the short rod 411 is swung backward by the cam 420 b with the guide pin 401 c as the fulcrum, and the leg is thereby moved backward.
  • FIG. 77( 7 ) shows a state where the cam axis rotated 270 degrees. This state corresponds to the state where the legs are together. Nevertheless, this state is different from the case of FIG.
  • the toe is raised, the foot is moved forward in a state where the heel is in contact with the ground, and, when the foot moves forward, the entire sole is made to contact the ground.
  • the toe side of the sole has a mechanism built therein for changing the direction of the overall robot.
  • a roller is built in the heel side of the sole for sliding across the ground.
  • A metal roller may be employed so that it concurrently acts as a weight, whereby the stability of the posture of the overall robot can be sought.
  • the mode of leg movement of the robot as described above is advantageous for this type of mechanism.
  • FIG. 79 and FIG. 80 show structural examples (leg mechanism of left leg) of another leg mechanism. Components in these diagrams corresponding with those illustrated in FIG. 70 are given the same reference numerals.
  • a pressing plate 410 c is integrally mounted on the long rod 410 .
  • the inclination (posture) and lifting/lowering of the toe (or heel) of the feet of the robot are set by pressing the connection portion (rear axis) 404 b of the ground portion 404 in a set timing as a result of pushing this pressing plate with the cam. Further, the shape of the cam and hole is adjusted such that the toe of the foot (ground portion) of the robot can be lifted higher.
  • The advancing power (or retreating power) of the robot is increased by increasing the driving force of the toes, by pressing the front end side (toe side) of the ground portion 404 onto the ground, or by actively applying a ground-direction energization (biasing) force with a spring.
  • the waist portion frame 401 and the above-knee portion 402 are rotatably connected via the connection portions 401 d and 402 a
  • the above-knee portion 402 and below-knee portion 403 are rotatably connected via the connection portions 402 b and 403 a
  • the below-knee portion 403 is rotatably connected to the ground portion 404 via the connection portions 403 c and 404 a .
  • a spring SP as the energization means is mounted on a part of the ground portion 404 , for example, between the connection portion 404 b and a part of the case of the below-knee portion 403 so as to apply force to continuously lift the back part (heel of foot) of the ground portion 404 .
  • This spring SP operates in the direction of continuously keeping the eccentric cam 420 b and the pressing plate 410 in contact. Moreover, the spring SP only needs to operate so as to lift the heel of the ground portion 404 in the upward direction, and the mounting position thereof may be selected accordingly.
  • the short rod 411 connects the eccentric cam 420 b and the below-knee portion 403 with the connection portion 403 b , 411 c .
  • the eccentric cam 420 b swings the below-knee portion 403 back and forth with the short rod 411 and lifts the below-knee portion (leg) 403 .
  • The above-knee portion 402 also oscillates so as to bend the knee pursuant thereto.
  • the long rod 410 is guided by the guide pin 420 c , and connects the eccentric cam 420 b and the ground portion 404 via the connection portions 404 b , 410 b .
  • The pressing plate 410 c is moved by the eccentric cam 420 b with the pin 420 c as the guide, and thereby lowers the engagement portion (rear axis) 404 b of the ground portion 404 with the long rod 410.
  • the posture (inclination) of the robot during the walking motion, or the lifting and lowering of the toe of the foot upon moving the legs is set.
  • the toe of the stepping foot may be lifted in a timing of moving the stepping foot forward, and the toe of the stepping foot may be lowered in a timing of moving the stepping foot backward. It is thereby possible to improve the running performance of a robot capable of walking without falling down.
  • FIG. 80 shows the cam pulley 420, long rod 410, short rod 411 and spring SP of the other mechanical example described above.
  • The cam pulley 420 is connected to the drive shaft (not shown) to be rotatively driven by a motor.
  • a guide pin 420 c is provided in a concentric position to the drive shaft of the pulley at the outside of the cam pulley 420 .
  • a cylindrical cam 420 b is provided to the cam pulley 420 in an eccentric position from the drive shaft of the pulley.
  • A long hole 410 a is provided to the upper part of the approximate “dogleg” shaped rod 410, and the guide pin 420 c is movably engaged with this long hole 410 a so as to guide the rod.
  • the pressing plate 410 c is formed at the lower part of the long hole 410 a .
  • The upper surface of the pressing plate 410 c contacts the eccentric cam 420 b and moves the long rod in the upward and downward directions in accordance with the movement of the cam 420 b.
  • a connection portion 410 b to be connected with the connection portion 404 b of the ground portion 404 is provided to the lower part of the “dogleg” shaped rod 410 .
  • An annular engagement portion 411 a for rotatably engaging with the cam 420 b is provided to the upper part of the short rod 411 .
  • a long hole 411 b for engaging with the guide pin of the frame 401 c is provided to the center portion of the short rod 411 .
  • a connection portion 411 c to be connected with the connection portion 403 b of the below-knee portion 403 is provided to the lower part of the short rod 411 .
  • FIG. 81 and FIG. 82 show the mechanism for changing the direction of the robot.
  • FIG. 81 is a side view of the ground portion, and a drive roller 404 c is disposed at the toe side and a sliding roller 404 d is disposed at the heel side thereof.
  • FIG. 82( a ) is a diagram of the ground portions 404 of the left and right legs viewed from the front of the robot, and
  • FIG. 82( b ) is a diagram of the ground portions 404 of the left and right legs viewed from the bottom.
  • Disposed in the front part within the ground portion 404 are a motor 404 e, a gear mechanism 404 f for increasing the rotational torque of this motor, and a drive roller 404 c rotated by this gear mechanism 404 f.
  • a plurality of drive rollers 404 c may be provided, and, in this example, two drive rollers 404 c , 404 c are provided and connected additionally with a drive belt 404 g .
  • The drive direction pursuant to the drive roller 404 c and the drive belt 404 g is set obliquely against the front and back direction of the robot.
  • Although the motor and gear mechanism are also disposed obliquely in correspondence thereto, these may be set appropriately.
  • The ground contact area thereby increases, and the stability of the robot increases. It is also possible to increase the speed of turning.
  • the left and right drive rollers and the drive direction by the drive belt 404 g are in an “inverted V-shape” positioned on the approximately identical circumference.
  • A freely rotatable sliding roller 404 d is positioned at the rear part (heel side) within the ground portion 404.
  • this roller By structuring this roller with a relatively heavy material, metal for example, this will concurrently act as the weight for adjusting the balance of the robot. Needless to say, an item corresponding to a weight for maintaining the balance may also be separately provided to the ground portion 404 .
  • FIG. 83 shows another example of a mechanism for changing the direction of the robot.
  • the components shown in FIG. 83 that correspond to those illustrated in FIG. 82( b ) are given the same reference numerals, and the explanation thereof is omitted.
  • This example only shows the left leg side of the robot; the right leg side (not shown) is structured symmetrically to the left leg side.
  • in this example, the drive roller 404c and the drive belt 404g depicted in FIG. 82 are replaced with a drive rubber roller 404h.
  • the drive rubber roller 404h, for example, is structured by covering the periphery of a plastic pulley with a high-friction rubber.
  • the aforementioned bipedal locomotion mechanism of the legs walks in a state where the heel is constantly contacting the ground.
  • Providing the drive roller or the drive belt at the toe side is suitable for this walking structure.
  • otherwise, the robot may turn while the toe is lifted, so the posture of the robot becomes unstable. Further, such a turn is an unnatural movement for a robot simulating a person.
  • with the present structure, the posture is stabilized since the turn is made with the entire sole of the foot in contact with the ground, and the movement looks natural. Turning during a walk is particularly stable.
  • the control unit 60 is able to change the direction of the robot in order to avoid obstacles by activating the aforementioned turnabout mechanism upon detecting an obstacle ahead of the robot with the light sensor 53 or the like.
  • the sensor for detecting obstacles may also be positioned at the tip of the ground portion. In such a case, the sensor may be a switch, an ultrasonic sensor, or the like.
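  • By way of illustration, the following is a minimal sketch of how such obstacle avoidance could be organized in the firmware; the function names, the sensor used and the turn duration are assumptions made for this sketch and are not taken from the embodiment.

```c
/* Minimal sketch (not the actual firmware of the embodiment): turn away from
 * an obstacle reported by a forward-facing sensor such as light sensor 53.
 * All helper functions below are hypothetical stand-ins for real drivers. */
#include <stdbool.h>
#include <stdio.h>

static bool obstacle_ahead(void)       { return true; }   /* e.g. light sensor 53 or a toe switch */
static void drive_turn_rollers(int ms) { printf("turn rollers on for %d ms\n", ms); }
static void resume_walking(void)       { printf("resume bipedal walking\n"); }

void avoid_obstacle_step(void)
{
    if (obstacle_ahead()) {
        /* Both soles stay on the ground, so the oblique drive rollers
         * can rotate the robot in place before walking continues. */
        drive_turn_rollers(800);
    }
    resume_walking();
}

int main(void) { avoid_obstacle_step(); return 0; }
```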
  • the electronic toy is not limited to being battery powered, and power may also be supplied via an AC power supply or an AC adapter.
  • the robot of the embodiments stores a program for deciding on its own actions, and self-activates various operations in accordance with the time.
  • the subsequent action is decided with respect to whether there was a reaction; for example, whether a sound could be heard or whether a switch was touched.
  • wasteful movements are thereby largely avoided, while whether the user is nearby is confirmed periodically (at fixed time intervals).
  • the user will view this as though the robot is moving on its own at all times.
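  • As one possible sketch of this periodic confirmation, the pseudo-firmware below wakes at a fixed interval, checks for a reaction (a sound or a touched switch), and only then activates a movement; the interval, thresholds and helper names are assumptions made for illustration.

```c
/* Sketch of the periodic "is the user nearby?" confirmation described above.
 * Interval, sensor helpers and the greeting action are assumptions. */
#include <stdbool.h>
#include <stdio.h>

#define CHECK_INTERVAL_S 300               /* confirm every 5 minutes (assumed) */

static bool heard_sound(void)    { return false; } /* microphone 52 above a threshold */
static bool switch_touched(void) { return true;  } /* touch sensor 51 or the o/x switches 54 */
static void perform_greeting(void) { puts("activate: greet the user"); }
static void sleep_seconds(int s)   { (void)s; }    /* stand-in for a low-power wait */

void presence_loop(int cycles)
{
    for (int i = 0; i < cycles; i++) {
        if (heard_sound() || switch_touched())
            perform_greeting();            /* the user seems to be nearby: self-activate */
        /* otherwise stay quiet so the battery is not wasted */
        sleep_seconds(CHECK_INTERVAL_S);
    }
}

int main(void) { presence_loop(3); return 0; }
```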
  • the robot of the embodiments comprehends the biorhythm of the user in order to presume the health and mood of the user, and, when it is presumed that the user is not feeling well, it takes (is programmed to take) humane action of cheering the user up and so on.
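  • The embodiments do not fix a particular formula, but a biorhythm is commonly computed from the number of days elapsed since the date of birth using 23/28/33-day sine curves; the sketch below uses that classical formula purely as an illustration.

```c
/* Sketch of a biorhythm computation from the user's date of birth.
 * The classical 23/28/33-day sine curves are assumed here; the embodiment
 * only states that a biorhythm is calculated, not this exact formula. */
#include <math.h>
#include <stdio.h>

static const double PI = 3.14159265358979323846;

typedef struct { double physical, emotional, intellectual; } Biorhythm;

Biorhythm biorhythm(long days_since_birth)
{
    Biorhythm b;
    b.physical     = sin(2.0 * PI * days_since_birth / 23.0);   /* 23-day cycle */
    b.emotional    = sin(2.0 * PI * days_since_birth / 28.0);   /* 28-day cycle */
    b.intellectual = sin(2.0 * PI * days_since_birth / 33.0);   /* 33-day cycle */
    return b;                                /* each value lies in [-1, +1] */
}

int main(void)
{
    Biorhythm b = biorhythm(9000L);          /* roughly a 24-year-old user */
    printf("physical %.2f  emotional %.2f  intellectual %.2f\n",
           b.physical, b.emotional, b.intellectual);
    return 0;
}
```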
  • the robot of the embodiments possesses a self-emotion parameter, and vocalizes or displays words corresponding to the present emotion. This is amusing since it would seem as though the robot has emotions.
  • the robot of the embodiments reacts to pranks such as continuously talking loudly near the robot, covering the robot with a cloth, or hitting the robot repeatedly. Thus, this is also amusing.
  • the robot of the embodiments conducts communication by text. For example, this is amusing in that the robot asks the user questions or talks to itself.
  • the emotion of the robot is affected by the response to such questions, and its mood will become good or bad. As this is humanlike, it is amusing that such emotion is displayed on the display unit or represented by movement.
  • the mechanical structure shown in the embodiments is able to obtain two degrees of freedom of the arm, one degree of freedom of the neck, and expressions of the face (eyes) with a minimal structure, thereby realizing emotional movements and expressions of the robot.
  • the electronic toy and the electronic robot of the present invention are also applicable to so-called pet robots, therapy products (e.g., healing robots), domestic robots comprising a function of monitoring patients and elderly persons, and so on, and may be enjoyed by adults and elderly persons without being limited to use as toys for children. Needless to say, the present invention may also be applied in adult toys and playthings.
  • the walking robot as the electronic toy of the embodiments is able to lift and move the tip (toe) or the back end (heel) of the ground portion (foot) at a larger angle upon advancing or retreating by alternately moving both legs.
  • the driving force (or friction) to the toe is increased.
  • the running performance in places with relatively bad foothold is improved, and the falling of the robot is thereby reduced.
  • the electronic toy of the present invention is able to communicate with the user from the electronic toy side since it automatically activates when the user is nearby. It is also possible to suppress wasteful consumption of the power source.

Abstract

Provided is an electronic toy which automatically activates when the user is nearby. An electronic toy controlled so as to react to external information has a movement mechanism structuring the mechanical movement of the toy (FIG. 5); an input element for obtaining external information (51-55); a distinction element for distinguishing whether an object body exists in the periphery (S152); and a control element for selecting, among a plurality of control parameters, a control parameter for controlling the movement mechanism in correspondence with the external information based on the distinction result, and controlling the movement of the movement mechanism, and activates when a person exists in the periphery (S162-180).

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates to an electronic toy (including adult “toys” and “playthings”, and domestic robots) that performs control so as to move arbitrarily in accordance with external sound and contact. [0002]
  • 2. Description of the Related Art [0003]
  • Stuffed animals of dogs, cats, bears and so on have been widely used as animal toys from the past. Further, there are animal toys structured by housing a motor and speaker inside the body of a stuffed animal or a synthetic resin body in the shape of an animal, and, for example, by contacting and pressing the head portion, such animal toy would perform prescribed movements such as moving its legs or mouth, as well as generate a prescribed cry. [0004]
  • With this type of animal toy, since it repeats the same movement and also repeatedly generates the same cry, the user often loses interest quickly. [0005]
  • Contrarily, if the movement is selected randomly, the movement expected by the user will not occur, and, again, the user may lose interest soon. [0006]
  • In view of such conventional animal toys, development of electronic toys equipped with a microcomputer for controlling various movements so as to keep the interest of users is being promoted. [0007]
  • As such electronic toy, there are those which are structured to make a certain movement (e.g., generating words stored beforehand from the speaker, making a movement of swaying its body) pursuant to the command of the microcomputer when the user, for example, pats the head of the electronic toy, lifts it up, or speaks to it. Moreover, with this type of electronic toy, the number of times the head was patted, the number of times it was lifted up, and the number of times it was spoken to are counted, and, for example, the electronic toy is controlled such that the words generated from the speaker gradually change to adorable phrasing pursuant to the increase in the count value. [0008]
  • SUMMARY OF THE INVENTION
  • In the aforementioned conventional electronic toy, since the user would play with such toy after he/she turns on the power, it does not move by automatically selecting a movement in correspondence with the existence of the user (person). Further, the electronic toy itself does not determine the movement by judging the surrounding circumstances. When envisioning an electronic toy for seeking communication with the user, it is desirable that the electronic toy automatically makes movement in correspondence with the user from its activation. [0009]
  • Thus, an object of the present invention is to provide an electronic toy that is automatically activated when the user exists nearby. [0010]
  • Another object of the present invention is to provide an electronic toy having elements for seeking communication with the user. [0011]
  • In order to achieve the foregoing objects, the electronic toy (i.e., domestic robot) of the present invention is an electronic toy controlled so as to react to external information, comprising: a movement mechanism structuring the mechanical movement of the toy; an input means for obtaining external information; a distinction means for distinguishing whether an object body exists in the periphery; and a control means for selecting, among a plurality of control parameters, a control parameter for controlling the movement mechanism in correspondence with the external information based on the distinction result, and controlling the movement of the movement mechanism. [0012]
  • According to the aforementioned structure, obtained is an electronic toy that will move in correspondence with external information when a person or the like (object body) exists in the periphery. It is thereby possible for the electronic toy side to communicate to the user. This is also effective for conserving batteries (power source). [0013]
  • Preferably, the electronic toy further comprises an information display means for externally displaying information, and wherein the control means further selects, among a plurality of control parameters, a control parameter for controlling the information display means in correspondence with the external information, and controls the operation of the information display means. [0014]
  • Obtained thereby is an electronic toy capable of reacting in correspondence with external information pursuant to the mechanism movement and visual display. [0015]
  • Preferably, the electronic toy further comprises a sound generation means for externally outputting sound, and wherein the control means further selects, among a plurality of control parameters, a control parameter for controlling the sound generation means in correspondence with the external information, and controls the operation of the sound generation means. [0016]
  • Obtained thereby is an electronic toy capable of reacting in correspondence with external information pursuant to the mechanism movement, visual display, and sound output. [0017]
  • Preferably, the electronic toy further comprises: a means for calculating the lifestyle rhythm of a specific person; and an event detection means for detecting the occurrence of an event during such lifestyle rhythm; wherein the control means further selects the control parameter of at least the movement mechanism, the information display means and the sound generation means in correspondence with the event. [0018]
  • According to the foregoing structure, obtained is an electronic toy which makes communication in correspondence with the lifestyle rhythm (e.g., biorhythm) of the user. [0019]
  • Preferably, the electronic toy further comprises: a clock means for detecting the present time; and a detection means for detecting the occurrence of an event planned on a time base in advance; wherein the control means further selects the control parameter of at least the movement mechanism, the information display means and the sound generation means in correspondence with the event. [0020]
  • According to the foregoing structure, obtained is an electronic toy which makes communication in correspondence with the temporal lifestyle activity pattern of the user. [0021]
  • Preferably, the distinction means detects peripheral sound and/or movement. [0022]
  • Preferably, the distinction means detects peripheral sound and/or brightness. [0023]
  • Preferably, the distinction means comprises a microphone for collecting sound and/or a camera for photographing the periphery. [0024]
  • According to the foregoing structure, it is possible to sense that the user is near the electronic toy by detecting the peripheral sound or brightness, existence of a moving body, or the like. [0025]
  • Preferably, the movement mechanism is structured in the shape of a humanoid robot, and the movement thereof is controlled so as to express at least one of a human emotion of “delight”, “anger”, “sorrow” and “pleasure”. [0026]
  • Preferably, the control means selects the control parameter for a predetermined solo play operation when it is judged that a person does not exist in the periphery. The operation of a solo play is activated irrespective of the input made by the user, and, for example, includes the display of a solo play game on the display unit of the electronic toy. [0027]
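  • A minimal sketch of this selection logic is given below; the table contents, the input types and the "solo play" behavior are illustrative assumptions, not the actual control parameters of the embodiment.

```c
/* Sketch of the control-parameter selection: if the distinction element
 * decides nobody is nearby, a "solo play" parameter set is chosen; otherwise
 * the parameter matching the external input is used. Table contents are
 * illustrative only. */
#include <stdbool.h>
#include <stdio.h>

typedef enum { INPUT_NONE, INPUT_VOICE, INPUT_TOUCH } ExternalInput;
typedef struct { const char *name; int motor_pattern; } ControlParam;

static const ControlParam SOLO_PLAY = { "solo play game on the display unit", 0 };
static const ControlParam REACTIONS[] = {
    { "idle gesture",           1 },   /* INPUT_NONE  */
    { "reply with words/sound", 2 },   /* INPUT_VOICE */
    { "delighted arm movement", 3 },   /* INPUT_TOUCH */
};

ControlParam select_param(bool person_nearby, ExternalInput in)
{
    if (!person_nearby)
        return SOLO_PLAY;              /* play by itself, saving big movements */
    return REACTIONS[in];
}

int main(void)
{
    ControlParam p = select_param(true, INPUT_VOICE);
    printf("selected: %s (pattern %d)\n", p.name, p.motor_pattern);
    return 0;
}
```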
  • Preferably, the electronic toy is in the shape of a human, and the information display means is provided to a part corresponding to the face and displays expressions of the face and symbols such as text. [0028]
  • Preferably, the electronic toy further comprises a storage means for recording the voice of a person. This will enable voice memos and impersonation (voice imitation). [0029]
  • Preferably, the input means includes at least one of a touch sensor, microphone, light sensor, camera, ∘× switch and condition sensor. [0030]
  • Preferably, the electronic toy further comprises a means for detecting the output of a battery, which is the source of power of the movement mechanism; and wherein the control means further generates a warning with the information display means for externally displaying information or the sound generation means for externally outputting sound when the output of the battery becomes weak. [0031]
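  • For illustration, a sketch of such a low-battery warning is shown below; the voltage threshold and the output messages are assumptions.

```c
/* Sketch of the low-battery warning: when the battery voltage detection
 * sensor reports a weak output, a warning is issued on the display and/or
 * the speaker. The threshold is an assumed value. */
#include <stdio.h>

#define LOW_BATTERY_MV 4200               /* assumed threshold */

void check_battery(int battery_mv)
{
    if (battery_mv < LOW_BATTERY_MV) {
        puts("display: 'battery low'");   /* information display means */
        puts("speaker: warning melody");  /* sound generation means */
    }
}

int main(void) { check_battery(4000); return 0; }
```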
  • Further, the electronic toy of the present invention is an electronic toy controlled so as to react to external information, comprising: a human-shaped structure; a control means for controlling the movement of the structure in correspondence with external information; a miniature camera provided to the structure and for photographing the external situation; and a communication means for externally transmitting the photographed image. [0032]
  • According to the foregoing structure, it is possible to comprehend the peripheral situation as image data, and grasp the existence of the user (person) from the movement of the subject. [0033]
  • As the communication means, for example, infrared (IR) communication, PHS, mobile phone, wired communication, general telephone circuits and so on may be used. [0034]
  • Moreover, the electronic toy of the present invention comprises: a basic frame disposed at the torso portion of the human-shaped toy; first and second sub-frames respectively provided at both sides of the basic frame and rotatably mounted on the basic frame; first and second rotational axes respectively provided to the first and second sub-frames; a cam mechanism provided to a third rotational axis driven by a first motor; a link for connecting the cam mechanism between the first and second sub-frames and oscillating both sub-frames; a gear mechanism driven by a second motor; and a transmission mechanism disposed across the basic frame and between the first and second sub-frames and for transmitting the output of the gear mechanism to the first and second rotational axes. [0035]
  • According to the foregoing structure, the shoulder, arm and wrist become movable, and simulated humanlike movement can be represented. [0036]
  • Preferably, the transmission mechanism is structured of a gear train formed of a plurality of gears, and each of the gears on both ends are respectively disposed in the first and second sub-frames and respectively engage with the first and second rotational axes via a bevel gear. [0037]
  • According to the foregoing structure, the arm can be rotated simultaneously together with the rotation of the shoulder. [0038]
  • Preferably, a first clutch mechanism for protecting the member from an overload is provided between the first motor and third rotational axis. [0039]
  • Preferably, a second clutch mechanism for protecting the member from an overload is provided to the gear mechanism. [0040]
  • Further, the electronic toy of the present invention is an electronic toy in the shape of a human or an animal comprising a display unit at a face-corresponding portion of the head portion capable of displaying text and symbols, and which is structured such that the information input pursuant to the operation of the input unit, which is formed of a plurality of input switches provided on the body, can be visually confirmed with the display unit provided on the face-corresponding portion. [0041]
  • Moreover, the electronic toy of the present invention is an electronic toy in the shape of a human or an animal having a head portion and a torso portion and comprising a display unit at a face-corresponding portion of the head portion capable of displaying text and symbols, an input unit formed of a plurality of input switches is provided to the torso portion, and which is structured such that the operation results of the input unit can be visually confirmed with the display unit provided on the face-corresponding portion. [0042]
  • Further, the electronic robot of the present invention is an electronic robot in the shape of a human or an animal comprising a display unit at a face-corresponding portion of the head portion capable of displaying text and symbols, and which is structured such that the information input by the operator operating the input unit provided to the body of the robot can be visually confirmed with the display unit provided on the face-corresponding portion. [0043]
  • Moreover, the electronic robot of the present invention is an electronic robot in the shape of a human or an animal comprising a display unit at a face-corresponding portion of the head capable of displaying text and symbols, and which is structured such that the information input by the operator operating the input unit provided to the body of the robot is displayed on the display unit provided on the face-corresponding portion so as to form the expression of the robot. [0044]
  • Preferably, an emotion parameter is included in the control parameter, and the emotion parameter is represented as the biorhythm of a specific person or the biorhythm of the robot. Expressions based on the emotions of the robot itself are thereby realized. [0045]
  • Preferably, the emotion parameter is affected by the occurrence of an event. Emotions thereby change on a case-by-case basis pursuant to the situation. [0046]
  • Preferably, the response to a question inquired by the electronic toy to the user is included in the event. Emotions can be changed depending on the response to the question. [0047]
  • Preferably, changes in the emotion parameter are defined beforehand for each anticipated response to the question. The influence of the respective responses to the question may thereby be made to differ. [0048]
  • Preferably, the control unit further selects the information to be externally displayed and/or selects the sound to be externally output based on the emotion parameter. Obtained thereby is information and sound to be externally output based on emotions. [0049]
  • Preferably, the control means further stores the response to the question, and forms a standard sentence by using data relating to the response. That is, the response results are used (reflected) in the control. [0050]
  • Further, the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion; an input means provided on the body for performing input operations; a storage means for storing a plurality of words; and a control means having a function for outputting emotion parameter values representing self emotions, and which selects the words based on the emotion parameter value and displays the words on the display unit. [0051]
  • According to the foregoing structure, it is possible to output words based on the emotions of the robot. [0052]
  • Moreover, the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a vocalization means for outputting sound data as a voice; an input means provided to the body for performing input operations; storage means for storing a plurality of sound data; and a control means having a function for outputting emotion parameter values representing self emotions, and which selects the sound data based on the emotion parameter value and makes the vocalization means vocalize the sound data. [0053]
  • According to the foregoing structure, it is possible to output words based on the emotions of the robot. [0054]
  • Further, the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion; a vocalization means for outputting sound data as a voice; an input means provided on the body for performing input operations; a storage means for storing a plurality of words and a plurality of sound data; and a control means having a function for outputting emotion parameter values representing self emotions, and which selects the words and the sound data based on the emotion parameter value and supplies the words and the sound data to the display unit and the vocalization means, respectively. [0055]
  • According to the foregoing structure, it is possible to output words and sound based on the emotions of the robot. [0056]
  • Preferably, the emotion parameter value temporally changes between the maximum value and minimum value. [0057]
  • Preferably, the control means further inquires questions with text or sound, and changes the value of the emotion parameter in accordance with the input operation in response thereto. The emotion of the electronic toy will thereby change depending on the user's response to the question. [0058]
  • Preferably, a plurality of questions are stored in advance, and changes in the emotion parameter are defined against the anticipated response to the respective questions. This is amusing in that the degree of change in the emotion per question will differ. [0059]
  • Preferably, a plurality of questions are stored in advance, and the degree of intimacy between the electronic toy and user is defined against the anticipated response to the respective questions. [0060]
  • Preferably, the control means further stores the response to the question, and forms a standard sentence by using data relating to the response. [0061]
  • Preferably, the control unit further accumulates the degree of intimacy obtained pursuant to the respective questions and, when this exceeds a prescribed value, supplies data for expressing a specific emotion to the display unit and/or the vocalization means. The electronic toy is thereby able to make affectionate expressions to its user. [0062]
  • Preferably, the aforementioned questions include questions that will affect and questions that will not affect the emotion parameter. [0063]
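  • The sketch below illustrates one way such question tables could be arranged, including a question that does not affect the emotion parameter; all numeric values and the intimacy threshold are assumptions made for this illustration.

```c
/* Sketch of question handling: each question predefines how the "o" and "x"
 * answers move the emotion parameter and the degree of intimacy. When the
 * accumulated intimacy passes a threshold, an affectionate expression is
 * selected. All numbers are illustrative. */
#include <stdio.h>

typedef struct {
    const char *text;
    int emotion_delta_o, emotion_delta_x;   /* change per anticipated answer */
    int intimacy_o,      intimacy_x;
} Question;

static const Question QUESTIONS[] = {
    { "Do you like me?",          +10, -10, +3, 0 },
    { "Shall we play together?",   +5,  -5, +2, 0 },
    { "Is it raining today?",       0,   0,  0, 0 },  /* does not affect emotion */
};

int emotion = 0, intimacy = 0;

void answer(const Question *q, int pressed_o)
{
    emotion  += pressed_o ? q->emotion_delta_o : q->emotion_delta_x;
    intimacy += pressed_o ? q->intimacy_o      : q->intimacy_x;
    if (intimacy > 4)
        puts("display/speaker: affectionate expression");
}

int main(void)
{
    answer(&QUESTIONS[0], 1);
    answer(&QUESTIONS[1], 1);
    printf("emotion %d, intimacy %d\n", emotion, intimacy);
    return 0;
}
```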
  • Preferably, a plurality of zones are defined in advance between the maximum value and minimum value of the emotion parameter and the words and the sound data are distributed among the respective zones, and wherein the control means selects words and sound data of the corresponding zone depending on which zone the current emotion parameter value belongs to. [0064]
  • Preferably, the control means, in a special zone, further selects the control for performing a special movement accompanying the mechanical movement of the components structuring the human shape or animal shape. The overall movement will have a large impact on the user. [0065]
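  • A minimal sketch of this zone-based selection, including a special zone that also triggers a whole-body movement, is shown below; the zone boundaries and phrases are assumptions.

```c
/* Sketch of zone-based selection: the range of the emotion parameter is split
 * into zones, each holding its own words, and one "special" zone also triggers
 * a whole-body movement. Boundaries and phrases are assumed values. */
#include <stdio.h>

typedef struct { int lo, hi; const char *word; int special; } Zone;

static const Zone ZONES[] = {
    { -100, -50, "Leave me alone...", 0 },
    {  -50,   0, "Hmm.",              0 },
    {    0,  50, "Nice to see you!",  0 },
    {   50, 100, "I love you!",       1 },   /* special zone */
};

void express(int emotion)
{
    for (unsigned i = 0; i < sizeof ZONES / sizeof ZONES[0]; i++) {
        if (emotion >= ZONES[i].lo && emotion <= ZONES[i].hi) {
            printf("display: %s\n", ZONES[i].word);
            if (ZONES[i].special)
                puts("also: raise both arms and waggle the head");
            return;
        }
    }
}

int main(void) { express(72); return 0; }
```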
  • Preferably, the control means further comprises an exhibition mode for changing the emotion parameter value between the maximum value and minimum value thereof in short cycles. The characteristics of the electronic toy in the show window can thereby be introduced in a short period of time. [0066]
  • Preferably, the electronic toy further comprises a connection means for connecting the electronic toy to a network, and wherein at least one of the words and sound data is downloaded to the storage means from a server device connected to the network. Data of words and sound as well as control data can thereby be updated. [0067]
  • Preferably, the downloaded words and sound data are current affair terms. This is amusing in that the electronic toy will be contemporary. [0068]
  • Preferably, downloaded words and sound data are terms corresponding to the characteristics of the user. Words befitting the user can thereby be selected. [0069]
  • Preferably, the electronic toy further comprises a connection means for connecting two electronic toys, and wherein at least one of the words and sound data stored in the connected electronic toy of the opponent is received by the storage means. Data exchange between toys is thereby possible. [0070]
  • Preferably, the connection means includes at least one of a communication cable, PHS, mobile telephone and personal computer. [0071]
  • Preferably, text data is exchanged between the electronic toys, and simulated conversation is conducted by incorporating the exchanged data in standard sentences. It is thereby possible to make it look like the electronic toys are having a conversation. [0072]
  • Moreover, the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a sound detection means for detecting peripheral sound; a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion; a storage means for storing a plurality of expressions; and a control means having a function for outputting emotion parameter values representing self emotions, and which selects the expression based on the emotion parameter value and displays the expression on the display unit, and sets the emotion parameter to an unpleasant state when the sound exceeds a prescribed level and continues beyond a prescribed time. [0073]
  • According to the foregoing structure, reluctant expressions and gestures are made if a loud sound is continuously provided to the electronic toy. [0074]
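  • As an illustration, the sketch below keeps a running count of how long the sound has stayed above a prescribed level and switches to an unpleasant (angry) expression once a prescribed time is exceeded; both thresholds are assumptions.

```c
/* Sketch of the "loud noise" reaction: when the microphone level stays above
 * a prescribed level longer than a prescribed time, the emotion parameter is
 * forced into an unpleasant (angry) state. Thresholds are assumed values. */
#include <stdio.h>

#define NOISE_LEVEL   80     /* assumed loudness threshold */
#define NOISE_SECONDS  5     /* assumed duration threshold */

static int loud_for = 0;     /* seconds the sound has stayed loud */

const char *sound_tick(int level)        /* called once per second */
{
    loud_for = (level > NOISE_LEVEL) ? loud_for + 1 : 0;
    return (loud_for > NOISE_SECONDS) ? "angry face on the display unit"
                                      : "normal expression";
}

int main(void)
{
    for (int t = 0; t < 8; t++)
        printf("t=%d: %s\n", t, sound_tick(90));
    return 0;
}
```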
  • Further, the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion; a storage means for storing a plurality of expressions; an input means provided on the body for performing input operations; and a control means having a function for outputting emotion parameter values representing self emotions, and which selects the expression based on the emotion parameter value and displays the expression on the display unit, and selects an expression corresponding to the emotion parameter when the input means is continuously operated for a prescribed time or a prescribed number of times. [0075]
  • According to the foregoing structure, when pounding or patting the electronic toy, expressions and movements corresponding to the emotion at such time can be expected. [0076]
  • Preferably, an expression of anger is displayed on the display unit during the unpleasant state. [0077]
  • Preferably, the expression selected in correspondence with the continuous operation is a painful expression upon being pounded, or a delightful expression upon being patted. [0078]
  • Moreover, the electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion; a storage means for storing a plurality of expressions; a light sensor for detecting the peripheral brightness; and a control means for selecting an expression corresponding to the self emotion and selecting the expression of closing the eyes when the light sensor detects a dark state beyond a prescribed time. [0079]
  • According to the foregoing structure, the state of falling asleep can be represented. [0080]
  • Preferably, the control means further moves the mechanical components structuring the human shape or animal shape so as to express a reluctant expression of going to sleep. [0081]
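  • A sketch of this falling-asleep behavior is given below; the duration constant and the particular reluctant gesture are assumptions.

```c
/* Sketch of the falling-asleep behavior: when the light sensor reports a dark
 * state beyond a prescribed time, a reluctant gesture is made and the
 * closed-eyes expression is selected. The timing constant is assumed. */
#include <stdio.h>

#define DARK_SECONDS 30        /* assumed "lights out" duration */

static int dark_for = 0;

void light_tick(int is_dark)   /* called once per second with the light sensor state */
{
    dark_for = is_dark ? dark_for + 1 : 0;
    if (dark_for == DARK_SECONDS) {
        puts("arms: small reluctant gesture");   /* "I don't want to sleep yet" */
        puts("display: eyes closed");
    }
}

int main(void)
{
    for (int t = 0; t < 35; t++) light_tick(1);
    return 0;
}
```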
  • Preferably, the initial value of the function for outputting the emotion parameter value expressing the emotions is set randomly. [0082]
  • According to the foregoing structure, the state of commencing movement in the respective electronic toys will differ. This is amusing in that each electronic toy will be individualized. [0083]
  • The electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion; movably structured mechanical components structuring a human shape or an animal shape; and a control unit for distinguishing the message and control information from a file attached to an e-mail, displaying the message on the display unit, and moving the mechanical components in correspondence with the control information. [0084]
  • Preferably, the attachment file is a sound file. [0085]
  • Preferably, the sound file is reproduced as a sound signal by a computer, and the sound signal is supplied to the control unit. [0086]
  • Preferably, the control information designates the movement stored beforehand in the control unit. [0087]
  • Preferably, the control information designates to the control unit a series of control procedures of the mechanical components. [0088]
  • Preferably, when the control information is not attached, the control unit selects adequate movement of the mechanical components. [0089]
  • Preferably, the control information expresses emotions such as delight, anger, sorrow and pleasure of the robot. [0090]
  • The e-mail method of the present invention comprises: a step of converting the input message to be displayed on the electronic toy of the receiving side and the movement to be made by the electronic toy into a sound signal; a step of converting the sound signal into a sound file and making this an attachment file of an e-mail; a step of transmitting the e-mail with the sound attachment file from the terminal device of the transmitting side to the terminal device of the receiving side; a step of forwarding the reproduced sound signal from the terminal device of the receiving side to the electronic toy; and a step of making the electronic toy display the message and perform the movement. [0091]
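  • The sketch below illustrates only the final step on the robot side, after the attached sound file has been reproduced and demodulated by the receiving terminal; the payload layout (a text message plus a one-byte action code) is an assumption made for this illustration.

```c
/* Sketch of handling a received "action mail" payload on the robot side.
 * The message/action-code layout and the gesture names are assumptions. */
#include <stdio.h>

enum { ACT_DELIGHT = 1, ACT_ANGER, ACT_SORROW, ACT_PLEASURE };

void handle_action_mail(const char *message, int action_code)
{
    printf("display unit: %s\n", message);        /* scroll the text message */
    switch (action_code) {
    case ACT_DELIGHT:  puts("movement: raise both arms");         break;
    case ACT_ANGER:    puts("movement: shake the head");          break;
    case ACT_SORROW:   puts("movement: lower the head");          break;
    case ACT_PLEASURE: puts("movement: open and close the arms"); break;
    default:           puts("movement: pick a suitable gesture"); break;  /* no control info attached */
    }
}

int main(void)
{
    handle_action_mail("Happy birthday!", ACT_DELIGHT);
    return 0;
}
```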
  • The electronic toy of the present invention is an electronic toy in the shape of a human or an animal, comprising: a leg structure structuring a pair of movable legs in the shape of a human or an animal; and a control unit for controlling the movement of the legs in correspondence with sound to be output. [0092]
  • Preferably, the control unit sets the speed of movement of the legs in correspondence with the volume of the sound and rhythm. [0093]
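  • As an illustration, the sketch below maps the detected volume linearly onto a leg-motor speed; the speed range and the linear mapping are assumptions.

```c
/* Sketch of the dance control: the open/close speed of the legs follows the
 * detected sound volume (and could likewise follow the beat). The mapping
 * and motor speed range are assumed values. */
#include <stdio.h>

int leg_motor_speed(int volume)          /* volume 0..100 from the microphone */
{
    int min_rpm = 20, max_rpm = 120;     /* assumed motor speed range */
    if (volume < 0)   volume = 0;
    if (volume > 100) volume = 100;
    return min_rpm + (max_rpm - min_rpm) * volume / 100;
}

int main(void)
{
    printf("quiet music : %d rpm\n", leg_motor_speed(20));
    printf("loud music  : %d rpm\n", leg_motor_speed(90));
    return 0;
}
```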
  • Preferably, the movement of the pair of legs is a movement of opening/closing the legs in the horizontal direction. [0094]
  • Preferably, a slide prevention means is provided to the sole of one of the legs, and sliding means is provided to the sole of the other of the legs. [0095]
  • Preferably, the leg structure comprises: a waist portion frame provided with a pair of hip joint portions rotatable at least in one direction; a pair of leg portions respectively connected to the pair of hip joint portions; a pair of drive shafts in which one end portion thereof is mounted on the leg portion and the other end portion thereof extends inside the waist portion frame beyond the hip joint portion of the leg portion; a link member for mutually connecting the respective other end portions of the respective drive shafts; a cam mechanism interjacent between the other end portion of at least one of the drive shafts and the link member, and for changing the respective one end portions of the drive shafts to become wide or narrow; and a motor built in one of the leg portions and for rotatably driving one of drive shafts. [0096]
  • Preferably, the other end portion of the drive shaft and the link member, or the cam and the link member, are connected via a spherical engagement member. [0097]
  • Preferably, the electronic toy further comprises a sliding means provided on one end portion of the other drive shaft among the pair of drive shafts so as to slide on the ground surface or floor surface. [0098]
  • Preferably, the other of the leg portions comprises an above-knee portion connected rotatably in the cross direction to the hip joint portion, a below-knee portion connected rotatably in the cross direction with the above-knee portion, and a ground portion connected rotatably in the horizontal direction with one end portion of the other drive shaft among the pair of drive shafts; and wherein the electronic toy has a structure in which a protrusion is formed at the lower face of the below-knee portion, an inclined face to which the protrusion contacts is formed on the upper face of the ground portion, the protrusion is pushed up pursuant to the opening/closing movement of the leg portions, and the connection of the above-knee portion and the below-knee portion bends thereby. [0099]
  • Preferably, the sliding means is a roller. [0100]
  • Preferably, the degree of opening/closing of the pair of legs is adjustable pursuant to the position in which the engagement member is mounted on the cam mechanism. [0101]
  • The electronic toy of the present invention is an electronic toy comprising a walking mechanism for performing bipedal locomotion by moving both legs back and forth, wherein the movement mechanism of one leg comprises: a waist portion frame; an above-knee portion connected rotatably to the waist portion frame; a below-knee portion connected rotatably to the above-knee portion; a ground portion connected rotatably to the below-knee portion; a rotatably driven cam pulley provided to the waist portion frame; a first cam provided to the cam pulley; a second cam provided to the cam pulley; a long member for vertically oscillating the ground portion with the first cam; and a short member for oscillating the below-knee portion with the second cam in the cross direction. [0102]
  • According to the foregoing structure, it is possible to lift and move the tip (toe) or back end (heel) of the ground portion (feet) at an appropriate angle upon moving both feet alternately and advancing forward or retreating backward. [0103]
  • Further, the electronic toy of the present invention is an electronic toy comprising a walking mechanism for performing bipedal locomotion by moving both legs back and forth, wherein the movement mechanism of one leg comprises: a waist portion frame; an above-knee portion connected rotatably to the waist portion frame; a below-knee portion connected rotatably to the above-knee portion; a ground portion connected rotatably to the below-knee portion; a rotatably driven cam provided to the waist portion frame; a long member for vertically oscillating the ground portion with the cam; and a short member for oscillating the below-knee portion with the cam in the cross direction. [0104]
  • According to the foregoing structure, it is possible to lift and move the tip (toe) or back end (heel) of the ground portion (feet) at an even larger angle upon moving both feet alternately and advancing forward or retreating backward. [0105]
  • Preferably, the long member comprises a guide hole to be engaged with a guide member and a pushdown plate in contact with the cam. The pushdown plate pushes down the ground portion and sets the upper limit of the long member. It is thereby possible to prevent a larger inclination and excess lifting of the ground portion. [0106]
  • Preferably, the electronic toy further comprises an energization means for energizing the tip of the ground portion in the pushdown direction. [0107]
  • Preferably, the size of the electronic toy is approximately 30 cm. [0108]
  • Preferably, an oblique direction drive means is provided to the ground portion for driving the electronic toy in an oblique direction against the advancing direction by the bipedal locomotion mechanism. [0109]
  • Preferably, the oblique direction drive means is structured by comprising a rotatably driven drive roller or drive belt. [0110]
  • Preferably, a plurality of drive rollers or drive belts are provided. [0111]
  • Preferably, the oblique direction drive means is respectively provided to the respective ground portions of both legs, and the respective drive directions of each of the oblique direction drive means lie on approximately the same circumference. [0112]
  • Preferably, the oblique direction drive means is provided at the toe side of the ground portion, and a sliding roller is provided at the heel side of the ground portion. [0113]
  • The present invention is an electronic toy comprising a walking mechanism for performing bipedal locomotion by moving both legs back and forth, and is provided with an oblique direction drive mechanism for driving the electronic toy in an oblique direction against the advancing direction by the bipedal locomotion mechanism. [0114]
  • Preferably, the oblique direction drive mechanism is structured by comprising a rotatably driven drive roller or drive belt. [0115]
  • Preferably, a plurality of drive rollers or drive belts are provided. [0116]
  • Preferably, the oblique direction drive mechanism is respectively provided to the respective sole portions of both feet, and the respective drive directions of each of the oblique direction drive means lie on approximately the same circumference. [0117]
  • Preferably, the oblique direction drive mechanism is provided at the toe side of the sole portion of the feet, and a sliding roller is provided at the heel side of the sole portion of the feet.[0118]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a front view for explaining the robot as the electronic toy (domestic robot); [0119]
  • FIG. 2 is a rear view for explaining the robot as the electronic toy; [0120]
  • FIG. 3 is a top view for explaining the robot as the electronic toy; [0121]
  • FIG. 4 is a side view for explaining the robot as the electronic toy; [0122]
  • FIG. 5 is an explanatory diagram for explaining the mechanism enabling the rotation of the arm, shoulder, neck, and so on of the robot; [0123]
  • FIG. 6 is a perspective view for explaining the aforementioned mechanism; [0124]
  • FIG. 7 is an explanatory diagram showing the mechanism enabling the rotation of the neck portion and rotation of the shoulder portion of the robot; [0125]
  • FIG. 8 is an explanatory diagram showing the mechanism enabling the rotation of the arm portion of the robot; [0126]
  • FIG. 9 is a block diagram for explaining the structure of the control system; [0127]
  • FIG. 10 is a block diagram explaining the schematic structure of the control unit 60; [0128]
  • FIG. 11 is a flowchart explaining an example of inputting the “date of birth” into the robot for calculating the biorhythm; [0129]
  • FIG. 12 is a flowchart explaining an example of enabling the determination of the existence of a user by collecting ambient sound; [0130]
  • FIG. 13 is a flowchart explaining an example of recognizing the voice (order, etc.) of the user and the robot making movements corresponding thereto; [0131]
  • FIG. 14 is a flowchart explaining an example of detecting the movement of the object; [0132]
  • FIG. 15 is a flowchart explaining an example of distinguishing the existence of the user based on the switch operation, movement of the object, and existence of sound; [0133]
  • FIG. 16 is a flowchart explaining an example for distinguishing the existence of the user based on the switch operation, peripheral brightness, and existence of sound; [0134]
  • FIG. 17 is a flowchart explaining an example of the control movement in consideration of the biorhythm; [0135]
  • FIG. 18 is an explanatory diagram explaining the biorhythm; [0136]
  • FIG. 19 is an explanatory diagram explaining an example of the facial eye expressions and the text (symbol) scroll displayed on the display screen; [0137]
  • FIG. 20 is a flowchart explaining an example of movement control of the robot pursuant to the lapse in time; [0138]
  • FIG. 21 is a flowchart explaining the execution of a control program pursuant to the control unit (CPU); [0139]
  • FIG. 22 is an explanatory diagram explaining an example of the posture expressing a feeling of “delight” of the robot; [0140]
  • FIG. 23 is an explanatory diagram explaining an example of the posture expressing a feeling of “pleasure” of the robot; [0141]
  • FIG. 24 is an explanatory diagram explaining an example of the posture expressing a feeling of “sorrow” of the robot; [0142]
  • FIG. 25 is an explanatory diagram explaining an example of the posture expressing a feeling of “affection” of the robot; [0143]
  • FIG. 26 is a front view explaining an example of another robot as the electronic toy; [0144]
  • FIG. 27 is a side view explaining an example of another robot as the electronic toy; [0145]
  • FIG. 28(a) to FIG. 28(d) are explanatory diagrams explaining an example of the various expressions corresponding to the delight, anger, sorrow and pleasure of the robot; [0146]
  • FIG. 29(a) and FIG. 29(b) are explanatory diagrams explaining an example of the various expressions corresponding to the emotions of the robot; [0147]
  • FIG. 30 is an explanatory diagram showing another example of a different biorhythm (biorhythm of robot); [0148]
  • FIG. 31 is a flowchart explaining an example of the operation of displaying, on the display screen of the robot, words reflecting the current feelings; [0149]
  • FIG. 32 is an explanatory diagram explaining an example of displaying words (anger mode of biorhythm) on the display screen of the robot; [0150]
  • FIG. 33 is an explanatory diagram explaining an example of displaying words (normal emotion mode) on the display screen of the robot; [0151]
  • FIG. 34 is an explanatory diagram explaining an example of displaying words (haiku style) on the display screen of the robot; [0152]
  • FIG. 35 is a flowchart explaining an example of the feeling of the robot changing pursuant to the response to the question asked by the robot; [0153]
  • FIG. 36 is an explanatory diagram showing an example of a question in which the response thereof will influence the biorhythm; [0154]
  • FIG. 37 is an explanatory diagram showing an example of a question in which the response thereof will not influence the biorhythm; [0155]
  • FIG. 38 is an explanatory diagram explaining an example of questions that will influence the biorhythm (feelings) of the robot; [0156]
  • FIG. 39 is an explanatory diagram explaining an example of questions that will influence the biorhythm (feelings) of the robot; [0157]
  • FIG. 40 is an explanatory diagram explaining an example of the feelings (biorhythm) becoming worse pursuant to the response to the result of the question; [0158]
  • FIG. 41 is an explanatory diagram explaining an example of two robots being connected via a cable for exchanging data in order to communicate with each other; [0159]
  • FIG. 42 is an explanatory diagram showing an example where a robot is connected to a PHS or mobile phone in order to acquire data by communicating with another robot or a server device so as to make conversation or movement; [0160]
  • FIG. 43 is a block diagram explaining an example of connecting the communication interfaces of robots via a cable in order to conduct communication; [0161]
  • FIG. 44 is a block diagram explaining an example of communicating by using a terminal device connectable to a communication network such as a PHS or mobile phone; [0162]
  • FIG. 45 is a block diagram explaining an example of enabling communication between robots by using the Internet; [0163]
  • FIG. 46 is an explanatory diagram explaining an example of downloading data from a server device to the robot; [0164]
  • FIG. 47 is a communication diagram explaining a procedural example upon conducting data communication via a connection cable; [0165]
  • FIG. 48 is a communication diagram explaining a procedural example upon conducting data communication with a mobile phone or PHS; [0166]
  • FIG. 49 is a communication diagram explaining a procedural example upon conducting data communication when obtaining data from the server device; [0167]
  • FIG. 50 is an explanatory diagram explaining an example of “current affairs” and user adaptive data provided by the server device; [0168]
  • FIG. 51 is a block diagram explaining the operation of an action mail; [0169]
  • FIG. 52 is an explanatory diagram explaining the contents (format) of the action mail; [0170]
  • FIG. 53 is an explanatory diagram explaining an example of the robot making a movement of “delight” upon receiving the action mail; [0171]
  • FIG. 54 is an explanatory diagram explaining an example of the robot making a movement of “anger” upon receiving the action mail; [0172]
  • FIG. 55 is an explanatory diagram explaining an example of the robot making a movement of “sorrow” upon receiving the action mail; [0173]
  • FIG. 56 is an explanatory diagram explaining an example of the robot making a movement of “pleasure” upon receiving the action mail; [0174]
  • FIG. 57 is a perspective view explaining the first posture (legs closed) of the dance robot; [0175]
  • FIG. 58 is a perspective view explaining the second posture (legs opened) of the dance robot; [0176]
  • FIG. 59 is a perspective view explaining the open/close mechanism of the legs (posture with legs closed); [0177]
  • FIG. 60 is a perspective view explaining the open/close mechanism of the legs (posture with legs opened); [0178]
  • FIG. 61 is a perspective view explaining a structural example of the right leg; [0179]
  • FIG. 62 is an explanatory diagram explaining the operation of bending the knee of the right leg; [0180]
  • FIG. 63 is a perspective view explaining a structural example of the left leg; [0181]
  • FIG. 64 is an explanatory diagram explaining the adjustment of synchronization of the left and right legs by a cam; [0182]
  • FIG. 65 is a block diagram explaining the control system of the dance robot; [0183]
  • FIG. 66 is a block diagram explaining the control system of another dance robot; [0184]
  • FIG. 67 is a perspective view explaining the bipedal robot; [0185]
  • FIG. 68 is a perspective view explaining the bipedal robot; [0186]
  • FIG. 69 is a perspective view explaining the bipedal robot; [0187]
  • FIG. 70 is an explanatory diagram explaining the bipedal robot mechanism; [0188]
  • FIG. 71 is an explanatory diagram explaining the waist portion frame; [0189]
  • FIG. 72 is an explanatory diagram explaining the above-knee portion; [0190]
  • FIG. 73 is an explanatory diagram explaining the below-knee portion; [0191]
  • FIG. 74 is an explanatory diagram explaining the ground portion; [0192]
  • FIG. 75 is an explanatory diagram explaining the cam pulley, long rod and short rod; [0193]
  • FIG. 76(1) to FIG. 76(4) are explanatory diagrams explaining the movement of the bipedal mechanism corresponding to the rotation of the camshaft; [0194]
  • FIG. 77(5) to FIG. 77(8) are explanatory diagrams explaining the movement of the bipedal mechanism corresponding to the rotation of the camshaft; [0195]
  • FIG. 78 is an explanatory diagram explaining the movement of the leg of the robot; [0196]
  • FIG. 79 is an explanatory diagram explaining another bipedal mechanism; [0197]
  • FIG. 80 is an explanatory diagram explaining the cam pulley, long rod, short rod and spring of an example of another bipedal mechanism; [0198]
  • FIG. 81 is an explanatory diagram explaining the turnabout mechanism of the robot; [0199]
  • FIG. 82 is an explanatory diagram explaining the turnabout mechanism of the robot; [0200]
  • FIG. 83 is an explanatory diagram explaining the turnabout mechanism of another robot; [0201]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Embodiments of the present invention are now described with reference to the attached drawings. [0202]
  • FIG. 1 through FIG. 4 show examples of a humanoid robot (pet robot) as the electronic toy (domestic robot), and the diagrams respectively illustrate the front view, back view, top view and side view of the robot. [0203]
  • The robot 1 is structured by comprising a head portion 10, a torso portion 20, left and right arm portions 30, and left and right leg portions 40. The head portion 10 and the torso portion 20 are rotatably connected via a neck joint K6. The torso portion 20 and the arm portion 30 are rotatably connected via a shoulder joint K1. An elbow joint K2 and a wrist joint K3 are provided to the arm portion 30 so as to realize the free bending of the arm portion 30. The torso portion 20 and the leg portion 40 are rotatably connected via a hip joint K4. Moreover, a knee joint K5 is provided to the leg portion 40. [0204]
  • Provided to the head portion 10 are an after-mentioned microcomputer system for controlling the robot, a window display unit for enabling communication between the user and robot, a sound sensor for collecting sound, a light sensor (or camera) for acquiring peripheral information, a touch sensor, a speaker for generating the sound of the robot, and so on. Further provided to the head portion 10 are an after-mentioned waggle mechanism for rotating the internal frame (not shown) of the head portion 10 and a nodding mechanism (not shown) for moving the head portion 10 back and forth. The neck joint K6 corresponds to these mechanisms. [0205]
  • The torso portion 20 comprises a motor as the source of power, an arm opening/closing mechanism for rotating the left and right arm portions 30 around the Z axis (vertical direction in FIG. 1) of the shoulder joint K1, an arm rotating mechanism for rotating the left and right arm portions 30 around the X axis (horizontal direction in FIG. 1) of the shoulder joint K1, and a neck rotating mechanism for rotating the head portion 10 around the Z axis. Moreover, “∘” and “×” switches 54 as the detection switch are provided to the torso portion 20. [0206]
  • A battery as the power source for activating the aforementioned motor and microcomputer system or the like is disposed inside the left and right leg portions 40. The battery may also be disposed in the torso portion 20 or the arm portion 30. When disposing the battery in the torso portion 20, bending of the knee joint K5 becomes possible. [0207]
  • In addition, the bending of arms and legs can be realized by disposing an actuator such as an electromagnet or micro motor inside the respective arm portions or respective leg portions, thereby enabling more humanlike movements. [0208]
  • Although the aforementioned electronic robot is in the shape of a human, this electronic robot may also be in the shape of an animal. Further, the display unit 71 capable of displaying text and symbols on the face-corresponding portion of the head portion is structured such that the information, which is input by the operator who operates the input unit formed of a plurality of input switches 51, 54 and so on provided to the body of the aforementioned robot, can be visually confirmed with the display unit 71 provided on the face-corresponding portion described above. [0209]
  • FIG. 5 through FIG. 8 are diagrams for explaining the mechanical structure built in the [0210] torso portion 20. FIG. 5 is a front view of the mechanical structure, and FIG. 6 is the perspective view thereof. FIG. 7 is an explanatory diagram illustrating the components corresponding to the aforementioned arm opening/closing mechanism and neck rotating mechanism in the mechanical structure. FIG. 8 is an explanatory diagram illustrating the components corresponding to the aforementioned arm rotating mechanism in the mechanical structure.
  • As shown in FIG. 5 and FIG. 6, the mechanical structure 200 is structured by comprising a basic frame 201, a sub frame 202, a neck (head portion) rotating mechanism 210 (c.f. FIG. 7), an arm (or shoulder) opening/closing mechanism 220 (c.f. FIG. 7), an arm rotating mechanism 230 (c.f. FIG. 8), a neck (head portion) rotational axis 203, an arm rotational axis 204, a first motor 205, a second motor 206, a motor mounting plate 207 for fixing the respective motors to the frame 201, and so on. [0211]
  • The sub frame 202 is formed in an approximate horseshoe shape, and is respectively provided to both left and right sides of the basic frame 201 so as to be freely rotatable around the Z axis against the frame 201. A bevel gear mechanism for changing the transmission direction of the power is provided inside the sub frame, and power is thereby transmitted to the arm rotational axis 204 even when the sub frame 202 rotates around the Z axis. [0212]
  • As shown in FIG. 7, the neck rotating mechanism 210 and the arm opening/closing mechanism 220 are driven with the first motor 205. The rotational axis of the motor 205 is connected to a worm gear mechanism 211 for changing the power transmission direction and converting the torque, and rotates the head portion rotational axis 203 via a spring clutch mechanism 212 as the safety device. A frame (not shown) of the head portion 10 is connected to the upper end of the head portion rotational axis 203 and rotates the head portion 10 around the Z axis. Alternatively, a worm gear mechanism may be provided at the upper end of the head portion rotational axis 203 in order to realize a back-and-forth nodding movement of the head portion by obtaining rotation around the X axis. An arm opening/closing mechanism 220 is connected to the lower end of the head portion rotational axis 203. The spring clutch mechanism 212 prevents the breakage of components by sliding when there is an overload on the head portion rotational axis 203 or the sub frame (arm opening/closing mechanism). [0213]
  • A cam mechanism 221 is provided to the lower end of the head portion rotational axis 203. The cam mechanism 221 is structured by comprising a plate 222 fixed to the axis 203, two arm mounting pins 223 provided to the plate 222, pins 224 respectively mounted on the two sub frames 202, and two links 225 for rotatably connecting one of the arm mounting pins 223 and the pin 224 of one of the sub frames, and the other arm mounting pin 223 and the pin 224 of the other sub frame 202, respectively. Each of the sub frames 202 is rotatably retained with the basic frame 201 with the pin 226. [0214]
  • Thus, when the motor 205 rotates, the head portion rotational axis 203 rotates in the corresponding direction, thereby rotating the head portion 10. The plate 222 rotates pursuant thereto, thereby moving the links 225 and moving the sub frames 202 around the Z axis. This enables the movement of opening and closing the arm portions 30 (e.g., a movement of hugging). The motor 205 is controlled by the microcomputer. The rotational quantity of the axis 203, or the current movement posture, is grasped, for example, by reading the codes of a sensor disk (not shown) provided to the tip portion of the axis 203, or with a combination of a cam and switch (not shown) provided to the tip portion of the axis 203. [0215]
  • As depicted in FIG. 8, the arm rotating mechanism 230 is driven with the second motor 206. The pinion gear mounted on the rotational axis of the motor 206 drives a gear mechanism 231 formed of a plurality of gears. This gear mechanism 231 further drives the gear train 232, which propagates the driving force in the longitudinal (horizontal) direction at the upper part inside the basic frame 201. A clutch mechanism 233 as the protection mechanism for preventing the breakage of components due to an overload is provided between the gear mechanism 231 and the gear train 232. The clutch mechanism 233, for example, slides on the surface of a rubber friction plate sandwiched between the gears in the case of an overload. Further, the likes of the aforementioned spring system or a flexible concave/convex plate may also be combined therewith. [0216]
  • The gear train 232, for example, is structured of six gears, and the gears on both ends are provided within the sub frames 202. These gears on both ends engage with the bevel gears fixed to one end of the arm rotational axes 204 retained rotatably by the sub frames 202. An arm portion 30 (not shown) is mounted to the other end side of the arm rotational axis 204 via a collar 234 fixed to the axis 204. Therefore, the driving force of the motor 206 rotates the arm rotational axis 204 via the gear mechanism 231, clutch mechanism 233 and gear train 232, and rotates the arm portion 30 mounted on this rotational axis 204. A sensor is provided to an appropriate position, to the collar 234 for example, for detecting the rotational position of this arm and controlling the motor. [0217]
  • FIG. 9 is a block diagram for explaining the control system of a robot as the electronic toy. The robot comprises, as means for detecting the peripheral situation and inputs, a [0218] touch sensor 51, a microphone (sound sensor) 52, a light sensor (e.g., CCD camera) 53, a “∘”·“×” switch 54 for generating output corresponding to the operation of the “∘” button and “×” button, a status (posture) sensor 55 and a battery voltage detection sensor 56. The touch sensor 51, for example, is provided to the upper surface of the head portion 10 of the robot (cf. FIG. 3) and is capable of detecting patting (contact) on the head by the user. The touch sensor, for instance, is a micro switch or a capacitance-detecting contact detection switch. The status sensor 55 detects the posture of the robot. Outputs from these respective sensors are supplied to the control unit 60. Based on these inputs, the control unit 60 controls the motors 205 and 206, the window display unit 71 of the head portion, the speaker 72, and the joint actuator group 73. When performing a simpler posture control in which detailed movements of the arms and legs of the robot are not made, the joint actuator group 73 may be omitted. Further, by internally providing a USB terminal or an infrared interface, it is possible to incorporate a function of forwarding the image read by the light sensor to a personal computer, PHS or mobile phone. Moreover, when incorporating a function of storing the user's name and calling such user's name in the robot, it is impossible to store each and every name at the initial stage of shipment. A function may therefore be provided where additional name data is prepared in advance on the homepage server, or in accordance with a request from the user, such that the user may connect his/her PC, PHS, mobile phone or the like to the USB terminal or infrared interface of the robot in order to download and use the desired name information from such homepage. The USB terminal or infrared interface, for example, may be disposed in the back of the head portion where the CPU is built in.
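The control system of FIG. 9 can be understood as a simple loop in which sensor readings are gathered, a set of control parameters is selected, and the display, speaker and motors are driven accordingly. The following Python sketch is purely illustrative; the class names, the sensor dictionary and the selection rule are assumptions made for the example and are not part of the embodiment.

    # Illustrative sketch of the FIG. 9 control flow: poll the sensors,
    # select a set of control parameters, and drive the outputs.
    class ControlUnit:
        def __init__(self, sensors, motors, display, speaker):
            self.sensors = sensors      # dict: name -> callable returning a reading
            self.motors = motors        # e.g. {"neck": motor_205, "arm": motor_206}
            self.display = display
            self.speaker = speaker

        def step(self):
            readings = {name: read() for name, read in self.sensors.items()}
            action = self.select_action(readings)
            self.display.show(action["face"])
            if action["sound"]:
                self.speaker.play(action["sound"])
            for motor_name, target in action["posture"].items():
                self.motors[motor_name].move_to(target)

        def select_action(self, readings):
            # Assumed selection rule for illustration only: react to a pat
            # on the head (touch sensor), otherwise remain idle.
            if readings.get("touch"):
                return {"face": "happy", "sound": "giggle",
                        "posture": {"neck": 15, "arm": 30}}
            return {"face": "neutral", "sound": None, "posture": {}}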
  • As illustrated in FIG. 10, the [0219] control unit 60 comprises a CPU 61 as the central processing unit, a ROM 62 (storage means), a RAM 63 and a timer (time and calendar function). Stored in the ROM 62 are a movement control program for driving and controlling the display unit 71, speaker 72, motors 205 and 206, and actuator group 73; posture control data for switching a plurality of movement postures by controlling the rotational direction and rotational quantum of the motors 205 and 206 (and actuator group 73) in accordance with the posture of the robot to be set; sound control data for generating voices and melodies to be output from the speaker 72; display control data for making the display unit 71 display information to be displayed on the robot; program data for calculating the biorhythm of the user; a sound/image processing program for judging the peripheral situation (for example, the existence of the user) based on the sound input or the image input of the CCD camera; and a communication program (not shown) for externally conducting data communication via a PHS and the like.
  • The sound/image processing program includes a sound processing program for performing the likes of filter processing, discrimination processing and modulation processing of input sound, and an image processing program for detecting the peripheral brightness and detecting the movement of the subject. Further, the movement control program includes the likes of a movement selection program for selecting, from among a plurality of movement patterns, the movement pattern and display pattern corresponding to the situation based on the judgment result of the peripheral situation pursuant to sound and/or images, and a posture control program for performing control so as to move the [0220] head portion 10, arm portion 30, joints and the like in the selected movement pattern.
  • Stored in the RAM [0221] 63 are the output data of the microphone 52 and the output data of the light sensor (camera) 53 pursuant to the DMA operation via the interface of the microcomputer (not shown).
  • The sound signal output by the [0222] microphone 52 is A/D converted with the interface, low-pass filter processed so as to eliminate noise and extract only the voice range of a person, and retained as sound data in the RAM 63. Sound data is subjected to the sound processing program. This data is stored for a fixed period of time and subjected to sound recognition processing. The method of sound recognition may either be recognition of general speakers or recognition of a specific speaker. As a result of this sound recognition processing, a command corresponding to the words communicated by the voice of the user is output. The corresponding movement control is enabled by this command being communicated to the movement control program, thereby enabling the robot to make movements, displays and vocalizations corresponding to the voice.
  • Moreover, in a standby state where the robot is not moving, lifestyle sounds are collected and the average level of the sound data is observed over time in order to distinguish whether the user is near the robot. [0223]
  • The sound processing program, which includes the storage processing of storing the sound in the [0224] memory 63, may also be used as a so-called voice memo for storing the voice of the user. Further, impersonation (voice imitation) is also possible by performing tone and pitch conversion processing on the stored sound data and forwarding the result to the speaker 72 for vocalization.
  • The output signal corresponding to one frame output from the CCD camera as the light sensor is converted into image data with the interface, and retained in the image storage region of the [0225] RAM 63. Image data is subjected to the image processing program. For example, in the standby state, the image is periodically sampled, and changes in the image (movement of the subject) are read based on the difference between the image data of the previous frame and the image data of the present frame. The existence of the user is distinguished (or presumed) from the movement of the subject of the camera. Moreover, there is no need to compare the entire frame; image data in a plurality of sections within the frame may be compared instead. The peripheral brightness of the robot can be distinguished from the average value of the image data (luminance). For distinguishing the peripheral brightness, a CCD camera is not essential, and a light detection element such as an SPD or phototransistor may also be used. In such a case, for example, the existence of the user may be distinguished by recognizing that it is bright during night hours, i.e. by combining the time and brightness. It is also possible to distinguish the existence of the user by distinguishing the existence of voices (or lifestyle sound) and the brightness in the room. The existence or non-existence of the user is indicated in the flag area of the RAM 63.
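As a rough illustration of the brightness-based presumption described above, the following sketch combines the average luminance of a frame (or a photodetector reading) with the time of day; the threshold value and the night-hour window are assumptions for the example only, not values taken from the embodiment.

    from datetime import datetime

    def user_probably_present(average_luminance, now=None,
                              bright_threshold=80, night_start=22, night_end=6):
        """Presume the user is nearby if the room is lit during night hours.
        average_luminance: mean pixel value (0-255) of the sampled frame, or a
        scaled photodetector reading.  All thresholds are illustrative."""
        now = now or datetime.now()
        is_night = now.hour >= night_start or now.hour < night_end
        is_bright = average_luminance > bright_threshold
        return is_bright and is_night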
  • In addition, it is possible to externally forward the image data read by the [0226] CCD camera 53 according to an external request via a communication interface 74; for instance, the condition of the room could be transmitted to the user's mobile phone in correspondence with access from that mobile phone.
  • The respective outputs of the [0227] touch sensor 51, ∘× switch 54, status sensor 55 and the like set flags in the flag area of the RAM 63 assigned to the respective switches, via the interface. An interrupt is generated pursuant to this setting of flags, and event processing is performed thereby.
  • Next, the operation of the [0228] control unit 60 is explained. The robot as the electronic toy of the present invention moves in conformity with the biorhythm, which is a parameter representing the physical condition (behavior) of the user, so as to exhibit a so-called healing atmosphere.
  • FIG. 11 is a flowchart explaining the input processing for acquiring the birthday necessary for calculating the biorhythm of the user. [0229]
  • For example, when the user simultaneously presses both the “∘” and “×” [0230] buttons 54 provided to the torso portion 20, it becomes a mode selection state (not shown). In this state, various modes are sequentially displayed in prescribed time intervals on the display unit 71. Included in the modes are “calendar date setting”, “clock time setting”, “user name input”, “user birthday input”, “user gender input”, “voice memo input”, “voice sample input”, “external (mobile phone) forwarding setting”, “energy saving setting” and so on. When the user presses the ∘ button when the “user birthday input” is displayed on the screen, the birthday input program is activated and proceeds to the present routine.
  • The control unit (CPU) [0231] 60 displays “Please input your birth date” and “Please input in the order of year, month and day” on the liquid crystal panel or LED matrix of the display unit 71. When the letter string does not fit in the size of the screen of the display unit, the letter string is displayed so as to move (scroll display) in the horizontal or vertical direction of the screen (S22). After the display of “Please input your birth date”, for example, the last two digits of the Christian year, “40” to “00 (current Christian year)”, corresponding to the range of age of target users, are sequentially displayed on the display unit 71 in prescribed time intervals (S24). When the year in which the user was born is displayed, the user presses the ∘ button to select such year. The operation of the ∘ button or the × button is distinguished by the setting of the corresponding flag within the RAM 63. The control unit 60 distinguishes whether a selection has been made or not (S26). When a selection is not made even upon a prescribed time elapsing (S26; No), the displayed year is repeatedly increased in increments of “1” (S24 and S26). When selected (S26; Yes), the selected “year” is retained. Moreover, after having pressed the ∘ button, the user may cancel such input by pressing the × button if it is within a prescribed time.
  • When the “year” is selected, the routine proceeds to the input of “month”. The [0232] control unit 60, after having displayed “Please input the month”, sequentially displays “1” to “12” on the display unit 71 in prescribed time intervals (S28). When the month in which the user was born is displayed, the user presses the ∘ button to select such month. The control unit 60 distinguishes whether a selection has been made or not (S30). When a selection is not made even upon a prescribed time elapsing (S30; No), the displayed month is repeatedly increased in increments of “1” (S28 and S30). When selected (S30; Yes), the selected “month” is retained.
  • When the “month” is selected, the routine proceeds to the input of “day”. The [0233] control unit 60, after having displayed “Please input the day”, sequentially displays “1” to “31” on the display unit 71 in prescribed time intervals (S32). When the day on which the user was born is displayed, the user presses the ∘ button to select such day. The control unit 60 distinguishes whether a selection has been made or not (S34). When a selection is not made even upon a prescribed time elapsing (S34; No), the displayed day is repeatedly increased in increments of “1” (S32 and S34). When selected (S34; Yes), the selected “day” is retained. When the input of “year”, “month” and “day” is completed, the control unit 60 writes the user's “year”, “month” and “day” in the user biorhythm data area of the ROM 62. The user's biorhythm calculation is thereby possible. Moreover, as described later, it is also possible to set the biorhythm of the robot such that the robot behaves in accordance with its own biorhythm.
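The year/month/day input of FIG. 11 amounts to cycling through candidate values at a fixed interval until the “∘” button confirms the value currently shown. A minimal sketch of such a loop follows; the `display` and `button_pressed` callables, the interval and the timeout are assumptions made for the example.

    import time

    def cycle_select(values, display, button_pressed, interval=1.0, timeout=120):
        """Scroll through candidate values until the confirm button is pressed;
        return the value shown at that moment, or None on timeout."""
        start = time.monotonic()
        i = 0
        while time.monotonic() - start < timeout:
            display(values[i % len(values)])
            shown_at = time.monotonic()
            while time.monotonic() - shown_at < interval:
                if button_pressed():           # the "o" button confirms
                    return values[i % len(values)]
                time.sleep(0.02)
            i += 1                             # advance by "1" when no selection
        return None

    # Illustrative usage: year, month and day are selected in turn.
    # year  = cycle_select(list(range(1940, 2001)), display, o_pressed)
    # month = cycle_select(list(range(1, 13)),      display, o_pressed)
    # day   = cycle_select(list(range(1, 32)),      display, o_pressed)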
  • Similarly, the user sets the “calendar date”, sets the “clock time” built in the robot, inputs the “user's name”, inputs the “user's gender” and so on. [0234]
  • FIG. 12 illustrates an example of the aforementioned sound processing (sound volume detection) of the [0235] control unit 60. The control unit (CPU) 60 performs computing processing equivalent to a low-pass filter for eliminating a high frequency noise component from the sound data stored in the RAM 63 (S42). The average value of the amplitude level of the processed sound data in a prescribed time frame is then sought (S44). The control unit 60 stores this average value (S46). The control unit 60 further judges the location of the user by continuously observing the average value of the sound level and distinguishing whether the sound level increased rapidly (S48). When it is judged that the user exists (is located) in the room, the (sound) flag representing the aforementioned location is set (S50).
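The processing of FIG. 12 can be sketched as smoothing the sampled amplitudes, averaging them over a window, and flagging the user's presence when the level rises sharply relative to the running average. The filter form, the smoothing constant and the jump ratio below are assumptions made for the example.

    def detect_presence_by_sound(samples, previous_average, alpha=0.1, jump_ratio=3.0):
        """Return (window_average, user_flag) for one window of sound samples.
        A crude one-pole low-pass filter stands in for the filter step (S42);
        the flag is set when the level jumps well above the previous average."""
        filtered, y = [], 0.0
        for x in samples:
            y = y + alpha * (x - y)
            filtered.append(y)
        window_average = sum(abs(v) for v in filtered) / len(filtered)
        user_flag = previous_average > 0 and window_average > jump_ratio * previous_average
        return window_average, user_flag

    # Usage: call repeatedly with successive windows, feeding the returned
    # average back in as previous_average.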
  • FIG. 13 illustrates an example of the second sound processing (sound recognition). The control unit (CPU) [0236] 60 performs normalization processing in order to match the time axis and signal level of the sound data stored in the RAM 63 with the contrast data (S62). Characteristic parameters of the sound are extracted from the normalized data (S64). The vocalization is distinguished based on the extracted characteristic parameters, and the movement command of the robot corresponding to the subject matter (meaning) of the vocalization is output (S66). The flag representing this command is set in the RAM 63 (S68). The control unit 60 thereby reads the vocalization control data, display control data and posture control data corresponding to the command and controls the movement of the robot as described later.
  • FIG. 14 illustrates an example of the image processing of the [0237] control unit 60. The control unit 60 compares the previously stored image data from the CCD camera 53, stored in the RAM 63 at a prescribed sampling frequency (S72), with the currently stored image data (S74), and distinguishes changes in the image data. For example, the difference in data of the respective pixels in positions corresponding to both frames is sought and accumulated. When the subject is moving, this accumulated value changes significantly. Further, in order to reduce the operational load, changes in data at prescribed positions on the screen, for example the center and four corners of the screen, may be compared (S76). Whether the subject moved (or changed) in the CCD screen (image) is judged based on such difference (S78). When a moving body exists, a flag representing movement detection (user location) is set (S80). Further, the brightness inside the room can also be distinguished from the average value of the luminance of the image data.
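A minimal sketch of the FIG. 14 comparison is given below: instead of every pixel, only a few fixed positions (for example the centre and the four corners) are compared between the previous and present frames, and movement is reported when enough of them changed. Frames are assumed to be two-dimensional lists of luminance values, and the thresholds are illustrative.

    def subject_moved(previous_frame, current_frame, sample_points,
                      diff_threshold=30, count_threshold=3):
        """Report movement when enough sampled positions changed between frames."""
        changed = 0
        for x, y in sample_points:
            if abs(current_frame[y][x] - previous_frame[y][x]) > diff_threshold:
                changed += 1
        return changed >= count_threshold

    def room_brightness(frame):
        """Average luminance of a frame, used to judge whether the room is bright."""
        return sum(sum(row) for row in frame) / (len(frame) * len(frame[0]))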
  • FIG. 15 is a flowchart explaining an example of judging whether a user is located (or exists) based on the switches, sound, movement of the subject, and so on. [0238]
  • In FIG. 15, the [0239] control unit 60 repeats this routine in prescribed cycles during the standby state. Foremost, the control unit 60 distinguishes whether the user directly operated the switches with the likes of the touch sensor 51 or ∘× switch 54 by checking the relevant flag (S102). If the switches have been operated (S102; Yes), since this means nothing less than that the user exists, the flag representing the location of the user is set (S112), and this routine is ended.
  • When the switches have not been operated (S[0240] 102; No), it is distinguished whether both the movement detection flag (S80) based on the results of the aforementioned image processing and the sound detection flag (S50) based on the results of the sound processing have been turned on (S104). When both flags have been turned on (S104; Yes), since the probability of the user's location is high, the flag representing the location of the user is set (S112), and this routine is ended.
  • When both flags have not been turned on (S[0241] 104; No), it is distinguished whether one of the flags has been turned on (S106). When neither flag has been turned on (S106; No), since the possibility of the user being in the room is low, the flag representing the user's location is turned off or reset (S110), and this routine is ended. When one of the flags has been turned on (S106; Yes), it is judged whether the present time is within the movement prohibition time frame preset by the user or the factory in advance (S108). For example, this enables the prevention of annoyance caused by movements in the middle of the night as well as the prevention of wasteful movements during times of absence. When it is outside the movement prohibition time frame (S108; No), the flag representing the existence of the user is turned on (S112), and this routine is ended. When it is within the movement prohibition time frame (S108; Yes), the user flag is turned off or reset (S110), and this routine is ended.
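The decision of FIG. 15 reduces to a few flag tests plus the movement prohibition time frame. A sketch under assumed names follows; the quiet-hours window is an illustrative value, since the actual prohibition time frame is preset by the user or the factory.

    def judge_user_location(switch_flag, movement_flag, sound_flag, hour,
                            prohibited_hours=range(0, 6)):
        """A direct switch operation always means the user is present; both
        sensing flags together also mean present; a single flag counts only
        outside the movement prohibition time frame."""
        if switch_flag:
            return True
        if movement_flag and sound_flag:
            return True
        if movement_flag or sound_flag:
            return hour not in prohibited_hours
        return False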
  • FIG. 16 is a flowchart explaining another example of judging the location (or existence) of the user based on the switches, sound, movement of the subject, and so on. [0242]
  • In this example, the brightness of the room is detected instead of the detection of movement, and this differs from the case depicted in FIG. 15 in that the user is considered to exist nearby when the room is bright. In other words, it is distinguished whether the flag representing that the room is bright (set pursuant to the results of the aforementioned image processing or based on a phototransistor) and the sound detection flag (S[0243] 50) set pursuant to the results of the sound processing have been turned on (S120). The other routines are the same as the case in FIG. 15 and explanation thereof is omitted.
  • Next, an example of the robot movement control is explained. The example shown in FIG. 17 illustrates an example where the robot reacts in correspondence with the biorhythm of the user. [0244]
  • The [0245] control unit 60, for instance, executes this routine when it is activated in the morning. Foremost, it is judged whether the user exists in (or near) the room (S132). When the user does not exist (S132; No), this routine is ended. When the user does exist (S132; Yes), the built-in calendar is read (S134). The biorhythm of the user is calculated as depicted in FIG. 18 based on today's date and the birth date of the user (S136). Event occurrence dates are set in this biorhythm beforehand. For example, the event occurrence dates shall be the turnabout points E1 and E3 at which the behavior switches between positive and negative, the optimum point E2 and the worst point E4. It is then judged whether today corresponds to an event occurrence date set in advance (S138). When it is not an event occurrence date (S138; No), this routine is ended.
  • When it is an event occurrence date (S[0246] 138; Yes), it is judged whether it is a preset time; for example, a time for the user to start work (S140). When it becomes the set time (S140; Yes), processing (robot movement) corresponding to the biorhythm of the event occurrence date is selected. For example, when it is event E1, the “happy eyes” as illustrated in FIG. 19(A) and the text (scroll display) of “You'll start feeling better” as illustrated in FIG. 19(F) are displayed on the display unit 21. Moreover, the likes of “good luck” are output from the speaker 72. When it is event E2, the “heart eyes” as illustrated in FIG. 19(C) and the text of “It's a perfect day for you” are displayed on the display unit 21. Moreover, the likes of “Don't get too excited” are output from the speaker 72. When it is event E3, the “disappointed eyes” as illustrated in FIG. 19(D) and the text of “Please take care of your health” are displayed on the display unit 21. Moreover, the likes of “Don't work too hard” are output from the speaker 72. When it is event E4, the “round eyes” as illustrated in FIG. 19(E) and the text of “Please watch out for accidents” are displayed on the display unit 21. Moreover, the likes of “Today is not your lucky day” are output from the speaker 72.
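The patent does not fix the biorhythm formula used in S136, but a common formulation uses sinusoids over the days elapsed since birth (23-day physical, 28-day emotional and 33-day intellectual cycles). The sketch below assumes that formulation and detects the turnabout points and extrema used as event occurrence dates; it is an illustration, not the claimed calculation.

    import math
    from datetime import date, timedelta

    CYCLES = {"physical": 23, "emotional": 28, "intellectual": 33}

    def biorhythm(birthday, day, cycle_days):
        t = (day - birthday).days
        return math.sin(2 * math.pi * t / cycle_days)

    def is_event_day(birthday, today, cycle_days):
        """Flag turnabout points (sign change from the previous day) and the
        days closest to the optimum (+1) and worst (-1) points."""
        prev = biorhythm(birthday, today - timedelta(days=1), cycle_days)
        cur = biorhythm(birthday, today, cycle_days)
        nxt = biorhythm(birthday, today + timedelta(days=1), cycle_days)
        turnabout = (prev <= 0 < cur) or (prev >= 0 > cur)
        extremum = abs(cur) > abs(prev) and abs(cur) >= abs(nxt)
        return turnabout or extremum

    # Example: is_event_day(date(1975, 4, 1), date.today(), CYCLES["physical"])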
  • FIG. 20 is a flowchart showing an example of controlling the movement such that the movement of the robot changes with time. When entering this movement mode, the control unit (CPU) [0247] 60 foremost distinguishes whether the user exists nearby based on the setting of the aforementioned flags (e.g., S126) (S152). When the user does not exist nearby (S152; No), solo play is implemented from time to time. Solo play, for example, is represented with a play state by displaying a one-person game on the display unit 71. Random numbers are thereby generated (S154) in order to judge whether the number for solo play has been output (S156). When such number is not output, this routine is ended (S156; No). When such number is output, solo play data is extracted from the posture control data, sound control data and display control data and set in the movement control program (S158).
  • When the user does exist (S[0248] 152; Yes), the control unit 60 reads the present time from the internal clock (S160), and judges whether this time is a time to wake up (S162).
  • When it is a time to wake up (S[0249] 162; Yes), the control unit 60 sets a movement control program by extracting (wakeup) data for waking the robot up from the posture control data, sound control data and display control data (S164). The robot thereby performs wakeup operations such as “Good morning”, “I'm awake” and the like. If it is not a time to wake up (S162; No), it is subsequently distinguished whether it is a time to send off the user (S166).
  • When it is a time to send the user off (S[0250] 166; Yes), the control unit 60 sets a movement control program by extracting data for sending the user off from the posture control data, sound control data and display control data (S168). The robot thereby performs sendoff operations such as “It's time to go”, “Have a good day” and the like. If it is not a time to send the user off (S166; No), it is subsequently distinguished whether it is a predetermined time for the user to come home (S170).
  • When it is a time for the user to come home (S[0251] 170; Yes), the control unit 60 sets a movement control program by extracting data for welcoming the user home from the posture control data, sound control data and display control data (S172). The robot thereby performs welcome operations such as “Welcome home”, “Good to see you” and the like. If it is not a time to welcome the user home (S170; No), it is subsequently distinguished whether it is a predetermined time for the user to go to sleep (S174).
  • When it is a time for the user to go to sleep (S[0252] 174; Yes), the control unit 60 sets a movement control program by extracting operational data for sleeping from the posture control data, sound control data and display control data (S176). The robot thereby performs goodnight operations such as “Good night”, “See you tomorrow” and the like, and thereafter enters a power saving mode (sleep mode). If it is not a time to go to sleep (S174; No), it is subsequently distinguished whether it is a predetermined time of the user's alarm setting (S178).
  • When it is a time of the user's alarm setting (S[0253] 178; Yes), the control unit 60 sets a movement control program by extracting alarm data from the posture control data, sound control data and display control data (S180). The robot thereby performs operations for informing the time such as “It's time”, “Wake up” and “It's (present time)”. If it is not a time of the alarm setting (S178; No), this routine is ended.
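The time-driven branching of FIG. 20 can be sketched as a small schedule table consulted against the internal clock; the times and phrases below are illustrative stand-ins for the user-set values and the stored control data.

    from datetime import time as dtime

    # Illustrative schedule; the actual times are set by the user.
    SCHEDULE = [
        (dtime(7, 0),  "wakeup",  ["Good morning", "I'm awake"]),
        (dtime(8, 0),  "sendoff", ["It's time to go", "Have a good day"]),
        (dtime(18, 0), "welcome", ["Welcome home", "Good to see you"]),
        (dtime(23, 0), "sleep",   ["Good night", "See you tomorrow"]),
    ]

    def behavior_for(now, window_minutes=5):
        """Return the (name, phrases) entry whose scheduled time matches `now`
        within a small window; None selects the solo-play / idle path instead."""
        minutes_now = now.hour * 60 + now.minute
        for scheduled, name, phrases in SCHEDULE:
            if abs(minutes_now - (scheduled.hour * 60 + scheduled.minute)) <= window_minutes:
                return name, phrases
        return None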
  • As shown in FIG. 21, the [0254] control unit 60 performs display control of the display unit 21 pursuant to the display control data set in the control program (S202). The posture of the robot is controlled by controlling the motors 205 and 206 pursuant to the posture control data set in the control program (S204). Further, sound is output from the speaker 72 with the vocalization mechanism (synthesizer, sound data reproduction) pursuant to the sound control data set in the control program (S206).
  • FIG. 22 illustrates an operational example of the robot when the operational data of “delight” is set in the control program. [0255]
  • FIG. 23 illustrates an operational example of the robot when the operational data of “pleasure” is set in the control program. [0256]
  • FIG. 24 illustrates an operational example of the robot when the operational data of “sorrow” is set in the control program. [0257]
  • FIG. 25 illustrates an operational example of the robot when the operational data of “affection” is set in the control program. [0258]
  • As described above, the electronic toy of the present invention can be connected to the likes of a PHS, mobile phone and standard circuit, and the user may view the situation inside the house by forwarding the image obtained by the robot to himself/herself. [0259]
  • When the remaining battery level is low, the [0260] control unit 60 yields expressions of vocalized words such as “I'm going to sleep to save the batteries, okay?” pursuant to the battery voltage detection sensor 56.
  • As described above, the robot as the electronic toy depicted in the present embodiment expresses emotions with its entire body, and it is thereby possible to yield a so-called healing element in the toy since operations as though seeking communication with the user are enabled. Moreover, various conversations are also realized. [0261]
  • FIG. 26 and FIG. 27 illustrate examples of another robot as the electronic toy. The components in FIG. 26 and FIG. 27 corresponding with FIG. 1 have the same reference numerals, and the explanation of such components is omitted. [0262]
  • The robot of this example comprises the same structure and functions as the robot illustrated in FIG. 1, but has a [0263] display unit 71 which covers approximately the entire front part (face) of the head portion 10. The display unit 71, for example, may employ an LCD display unit, but is not limited thereto. Moreover, the ∘× switch 54 is disposed on the upper surface of the head portion.
  • As previously depicted in FIG. 19, the [0264] display unit 71, as illustrated in FIG. 28 and FIG. 29, expresses various expressions (emotions) of the robot. The robot is able to decide these expressions in correspondence with the various modes described later. FIG. 28(a) is expressing the facial state of “pleasure”, FIG. 28(b) “dizziness”, FIG. 28(c) “anger”, and FIG. 28(d) “sentimentality”, respectively. Further, FIG. 29(a) is expressing the state of “sadness” and FIG. 29(b) “sleep”. The “sleep” state is the power saving mode, and is similar to the power saving mode of a personal computer. In addition, the control unit 60 stores approximately 300 facial display animations for changing the facial expression. For example, three basic facial patterns are prepared for the respective modes of “delight”, “anger”, “sorrow” and “pleasure”, and sound and movement are additionally combined in correspondence with the respective modes.
  • FIG. 30 is a diagram explaining an example wherein the robot illustrated in FIG. 1 or FIG. 26 has its own biorhythm. The user biorhythm data of [0265] ROM 62 described above can be replaced with the biorhythm function program of the robot. Moreover, for example, it would also be possible to integrate a function that changes sinusoidally and to make this the function for representing emotions. For the robot's personal biorhythm, random numbers are generated when the insulation paper is removed from the battery housing and power is supplied, a random start position (initial value) as depicted with the plurality of points on the sinusoidal wave of FIG. 30 is selected based on the result thereof, and the biorhythm is accordingly made to differ per robot. Incidentally, the spread in the timing of the switching operation when a switch (not shown) is pressed by the mechanical movement upon activating the motor may also be used as the random numbers for setting the initial value.
  • The amplitude value of the function created by the biorhythm is utilized as one of the emotion parameters of the control parameters (cf. FIG. 18). Five operational modes are set in accordance with the value of the emotion parameter. The first range containing the center of the amplitude is the “normal mode”; a “pleasure mode” in which the robot is of a happy feeling is defined in a prescribed range thereabove, and a “delight mode” in which the robot is full of delight is defined in a prescribed range further thereabove. Moreover, a “sorrow mode” in which the robot is sad is defined in a prescribed range below the “normal mode”, and an “anger mode” in which the robot is angry is defined in a prescribed range further therebelow. Although the robot repeats these modes periodically, the time in which the robot exists in the “delight mode” and “anger mode” is relatively short. [0266]
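The division of the emotion parameter into the five modes can be sketched as a simple threshold mapping. The numeric thresholds below are assumptions chosen so that the “delight” and “anger” ranges are narrow, as described above; the embodiment only requires the ordering of the ranges.

    def mode_from_emotion(value):
        """Map the biorhythm amplitude (assumed normalized to -1.0 .. +1.0)
        onto the five operational modes.  Thresholds are illustrative."""
        if value > 0.8:
            return "delight"
        if value > 0.3:
            return "pleasure"
        if value >= -0.3:
            return "normal"
        if value >= -0.8:
            return "sorrow"
        return "anger"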
  • Further, a biorhythm with a short cycle may be set for demonstration exhibitions in front of the store by performing specific switching operations. For instance, one cycle can be set to 5 minutes. Expressive changes and gestures accompanying the emotional changes of the robot can thereby be shown to the audience in a short time span in order to introduce the capabilities and characteristics of this robot. [0267]
  • Control examples of the robot employing the expressions illustrated in FIG. 28 and FIG. 29 are now explained. [0268]
  • FIG. 29([0269] a) shows an expression saying, “Stop it!” when a prank is played on the robot. In order to perform this type of gesture in a timely manner, it would be amusing if this kind of expression is displayed on the display unit when a high level of sound is continuously provided to the robot.
  • Thus, in a mode for performing this type of operation, the output of the [0270] microphone 52 as the sound detection means is monitored with the control unit 60 via the low pass filter for eliminating noise, and it is determined whether a sound signal exceeding a prescribed level continues for a prescribed time; for example, beyond 10 seconds. If such sound signal continues, since it is “noisy”, the control unit selects the expression of the robot illustrated in FIG. 29(a) from the storage means (62, 63) and displays this on the display unit. Moreover, since the selective operation of the expression is conducted pursuant to the value of the aforementioned emotion parameter, the same results can be obtained even if the emotion parameter value is changed to an “unpleasant” level.
  • The expression of FIG. 29([0271] b) represents a “sleep” state. It would be amusing if the robot would express this type of sleeping expression when a blanket is placed over the robot or when the periphery becomes dark.
  • Thus, in a mode for performing this type of operation, the light sensor [0272] 53 (e.g., CCD, photodiode, phototransistor, etc.) as the light detection means detects the peripheral light intensity. The control unit 60 monitors this light intensity and judges whether a dark state continues beyond a prescribed time; for example, beyond 10 seconds. If such dark state continues, since the “periphery is dark”, the control unit selects the “sleeping” expression of the robot illustrated in FIG. 29(b) from the storage means (62, 63) and displays this on the display unit. Moreover, since it would be amusing if the robot exhibits a reluctant gesture when a blanket is placed over the robot, the mechanical components of the arm 30 or the like may also be made to move for a prescribed time.
  • When the [0273] switch 54 of the head portion is continuously or intermittently (repeated tapping) operated for a prescribed time (or a prescribed number of times), this can be considered as the user tapping or patting the head of the robot. It would be amusing if the robot reacts to this type of operation.
  • Thus, the [0274] control unit 60 monitors the output of the switch 54 or touch sensor 51, and distinguishes whether the operation is continued for a prescribed time; for example, 10 seconds. When operation is being made, the expression, words, sound, etc. in accordance with the emotion of the robot at such time are output. For example, if the emotion parameter is in an “unpleasant” state, the unpleasant expressions such as the “painful” expression shown in FIG. 29(a) or an “angry” expression is displayed. When the emotion parameter is in a state of “delight”, the “sentimental” expression shown in FIG. 28(d) is displayed.
  • FIG. 31 is a flowchart for explaining an example of a “soliloquy mode” of the robot reflecting the aforementioned biorhythm. [0275]
  • The control unit (CPU) [0276] 60 executes the soliloquy mode when corresponding to a soliloquy start condition; for example, the condition falling under “user absent” and “generation of prescribed random numbers” (S270; Yes). Foremost, the emotion parameter representing the biorhythm amplitude, which is one type of control parameter, is read (S272). It is then judged which of the foregoing 5 modes corresponds thereto from this value (S274). Mode judgment is conducted by comparison with the threshold values of the respective modes, and the result thereof is output (S274 to S284). Display of the expression corresponding to each of the judged modes and, as necessary, robot control accompanying the movement and sound is additionally performed (S286).
  • For instance, when it is judged as the “anger mode”, as shown in FIG. 32, “I'm mad”, “I quit” and “No more robot” are sequentially displayed on the [0277] display unit 71, which is capable of displaying 8 characters. This display is repeated for a prescribed time. In addition, an angry pose (not shown) of the robot can also be made.
  • Similarly, when it is judged as other modes, a display corresponding to such operational mode is selected and, as necessary, the corresponding movement is made. FIG. 33 and FIG. 34 illustrate display examples of words in the normal mode. In the former example, a sentence is created using the term “IT (information technology)” stored in advance or input by the user. “Know what?”, “IT is” and “the trend” are sequentially displayed on the screen. In the latter example, a sentence is created in a haiku-like format (a Japanese unrhymed verse form having three lines containing 5, 7 and 5 syllables, respectively). [0278]
  • When the value of the emotion parameter exists in the range of the delight mode or anger mode, the robot may perform a “single-action performance” of delight or anger corresponding thereto. For instance, while playing background music, the robot can say, “I feel good! I will now impersonate (so and so)!” and do an impersonation, or rotate its arms or display “question mark (??)” eyes and so on. [0279]
  • Next, the text communication mode for the robot to seek simulated communication with the user through Q&A is explained. [0280]
  • FIG. 35 is a flowchart explaining this mode. For example, when a certain condition such as the user existing nearby is satisfied by sound, movement, switch operation, light or the like (S[0281] 240; Yes), the text communication mode is commenced. The text communication mode is for seeking communication with the user by the robot displaying text on the display unit. The control unit 60, as illustrated in FIG. 38 and FIG. 39, selects a question from pre-stored question data (S242). The respective questions are made to be distinguishable in advance; namely, those that change the emotion of the robot depending on the response as depicted in FIG. 38, and those that do not affect the emotion of the robot irrespective of the response as depicted in FIG. 39. The control unit 60 displays the selected question on the display unit (S244). When the ∘× button is operated (S245), it is judged whether the question will affect the emotion (S246). If the question will not affect the emotion (S246; No), response storage processing is performed as necessary (S256). This processing is used, for instance, when the user presses “∘” in response to the question “Do you like”, “cars?”; by storing such response, the robot remembers that the user “likes cars” for use in the modes described later.
  • When a question that will affect the emotion is inquired (S[0282] 246; Yes), and an answer of “∘” is given to, for example, a question shown in FIG. 36 asking, “Are”, “we”, “friends?” (S248; Yes), the “∘” corresponding processing is performed. In the case of this example, the robot expresses its delight, for example, by making the “affectionate” pose shown in FIG. 25 and displaying the “affectionate” expression, and raises the emotion parameter in the plus direction (S250). Meanwhile, when the answer is “×” (S248; No), the “×” corresponding processing is performed. In the case of this example, the robot expresses its sadness, for example, by making the “sorrowful” pose shown in FIG. 24 and displaying the “dislike” expression, and lowers the emotion parameter in the minus direction (S252). This, as shown in FIG. 40, shifts the biorhythm toward the state of sorrow. The robot thereby moves to a mode for expressing a sorrowful expression.
  • Next, the favorable image calculation is conducted. The favorable image is a parameter corresponding to the robot's feelings toward the user. In the aforementioned question, n points are added when a response is made which makes the robot happy. Moreover, m points are subtracted when a response is made which makes the robot sad. The values of n and m differ depending on the respective questions. The favorable image is determined based on such integrated values (S[0283] 254).
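A minimal sketch of the favorable-image update follows; the point values are placeholders, since n and m differ per question in the embodiment.

    def update_favorable_image(score, answer_pleases_robot, n=2, m=1):
        """Add n points for a response that pleases the robot, subtract m
        points for one that saddens it; the accumulated score determines the
        favorable image."""
        return score + n if answer_pleases_robot else score - m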
  • Simulated communication (transmission) between the robots is now explained with reference to FIG. 41 to FIG. 45. [0284]
  • FIG. 41 shows an example of conducting data exchange by connecting the robots via a [0285] communication cable 741. The communication interfaces 74 of the control unit as shown in FIG. 43 are connected via the connector (not shown) provided on the back face of the robot.
  • FIG. 42 shows an example of connecting a PHS or [0286] mobile phone 742 to the communication interface 74 of the robot and conducting data exchange between the PHS or mobile phone 742 in a distant place and the connected robot via a mobile communication network as shown in FIG. 44. Moreover, as shown in FIG. 42, a card module of the PHS or mobile phone may be integrated in the back face of the robot. The connection example of the communication interface 74 and the PHS or mobile phone 742, 743, etc. described in this embodiment of the present invention includes cases where a telephone communication function itself is installed in the robot.
  • FIG. 45 shows an example of connecting the [0287] communication interface 74 of the robot with the personal computer 743 connected to the Internet 745 as the communication network, and, similarly, of conducting data communication with other robots connected to the Internet 745. Further, the description of FIG. 45 omits providers and the like that provide Internet connection services.
  • The composition shown in FIG. 46 (as well as in FIG. 49 and FIG. 50 described later) shows a system capable of obtaining robot data by communication with the server device. Thus, the [0288] communication interface 74 of the robot is connected to the server device 750 of such robot via a communication means such as the PHS or mobile phone 743, Internet 745 or telephone communication network. Supplied from the server device 750 via a communication network such as the Internet 745 are data such as words and current affair terms corresponding to the user characteristics or attributes described later and control data for controlling the robot gesture.
  • FIG. 47 explains an example of data exchange upon connecting robot A and robot B with a [0289] communication cable 741. The robots are foremost connected with the cable. Next, the mode selection status is entered by simultaneously operating the “∘” and “×” buttons (switch 54) of the respective robots, for example, and the “communication” mode is selected thereby. When both robots enter the communication mode, communication parameters are exchanged between the robots, communication conditions and so on are set, and communication is started.
  • Robot A transmits the user name of robot A, terms and so forth which it stores. The user name, for instance, is stored by the user inputting his/her name through sequential selection of the corresponding character displayed on the display unit. Stored terms, for example, can be obtained by storing the response to the questions inquired by the robot as described above (S[0290] 256). This includes various terms such as the likes and dislikes of the user, age, male/female, personality, etc. Data is forwarded from robot A to robot B and, when robot B confirms such data, an ACK signal representing the reception of data is transmitted. When data reception ends in failure, a NACK signal is transmitted. When robot A receives the NACK signal, robot A retransmits the data. When robot A receives the ACK signal, it distinguishes the success of data transmission, enters the standby state, and awaits the signal from robot B.
  • Subsequent to the transmission of the ACK signal, robot B transmits to robot A data on the user name of robot B and stored terms which it retains. When robot A confirms the reception of such data, an ACK signal representing the reception of data is transmitted. When data reception ends in failure, robot A transmits a NACK signal, and robot B receiving such signal retransmits the data. [0291]
  • Pursuant to such data exchange procedures, transferred from robot A to robot B are, for example, “ΔΔ” (user name), “□□□□” (word 1), “××××” (word 2), and so on, and transferred from robot B to robot A are the likes of “∘∘” (user name), “□□□□” (word 1), and so on. [0292]
  • These words are applied to a standard sentence selected from a plurality of standard sentences stored in advance, and output by at least one of a sound and screen display of text. The output timing of sound and display and selection of the standard sentence, for example, may be set in accordance with the initial exchange of communication parameters. [0293]
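Applying the exchanged words to a pre-stored standard sentence can be sketched as a simple template fill. The placeholder names and the sample sentence are illustrative, not the stored data of the embodiment.

    def fill_standard_sentence(template, words):
        """Apply exchanged words to a pre-stored standard sentence."""
        return template.format(**words)

    # Illustrative usage with assumed placeholder names:
    # received = {"user_name": "Hanako", "favorite": "chocolate"}
    # fill_standard_sentence(
    #     "Yeah, I received data from '{user_name}' who likes '{favorite}'",
    #     received)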
  • For example, as shown in FIG. 47, when robot A vocalizes “Hi! Is (name of user) taking good care of you?”, robot B thereafter vocalizes “Uh-huh. But (he/she) is mean sometimes, and makes a (adjective; “scary” for example) face”. Then, robot A vocalizes “(Adjective; “Scary” for example)? Well, (name of user) looks (adjective; “freaky” for example), too”, robot B thereafter vocalizes “(Adjective; “Freaky” for example)!! It's not easy being a robot. See you later!”, and robot A finally vocalizes “I know what you mean. Bye-bye!” When the user of the robot hears such vocalization nearby, he/she will receive the impression as though the robots are having a conversation. [0294]
  • Further, instead of the [0295] connection cable 741, an infrared communication interface employed in remote controllers and portable terminals may also be used.
  • FIG. 48 is a communication diagram explaining the procedures in the case of conducting data communication between robots with a portable telephone. [0296]
  • Here, since robot A and robot B are distant from each other, the vocalization or display of words will be as though a soliloquy. [0297]
  • Foremost, the respective users of robot A and robot B connect their robots to a PHS or mobile phone and make a call to the telephone of the other party. When the communication line between the telephones is connected, communication parameters are exchanged, and, for example, the mutual data communication speed is set to the slower speed among the telephones. For example, in the case of a PHS (communication speed of 64 kbit/second) and a mobile phone (9600 bit/second), data communication is conducted at 9600 bit/second. When the communication parameters are set, data is transmitted from one of the robots (robot A) to the other robot (robot B). For example, words such as “Taro” (user name), “nap” (personal favorite), “MD” (personal favorite), “pachinko” (personal favorite), “Sazaesan (character of cartoon)” (personal favorite), “Pochi (name of dog)” (personal favorite), “Thunderbird” (personal favorite) and so on are transmitted. Robot B transmits the ACK signal when there are no abnormalities in the received data, and transmits the NACK signal when there are abnormalities. When robot A receives the ACK signal representing reception from robot B, it enters a standby state. Robot A retransmits the data upon receiving the NACK signal. [0298]
  • Subsequent to the transmission of the ACK signal, robot B transmits data which it stores to robot A. For example, robot B transmits words such as “Hanako” (user name), “Chocolate” (personal favorite), “F-1 racers” (personal favorite), “tea mushroom” (personal favorite), “Pico” (personal favorite), and the like. [0299]
  • Robot A and Robot B respectively select a standard sentence stored in advance, complete the sentence by applying the received data in the blank space in the standard sentence, and output communication results by conducting at least one of a vocalization or text display. Attributes of the word to be filled in such blank space should be predetermined; for example, the user's name, personal favorite, personal dislike, age, weather, and so on. [0300]
  • For instance, robot A would vocalize “Yeah, I received data from ‘Hanako’ who likes ‘chocolate’”, “Hey, are ‘F1 racers’ delicious?”, “Hanako taught me ‘tea mushrooms,’ but I don't know what they are . . . ”, “Are ‘Hanako's’ pants cool?”, “Maybe ‘Hanako’ is a “Pico” mania”, and so on. [0301]
  • And, for example, robot B would vocalize “Let's see. I received data from ‘Taro’ who likes to take ‘naps’”, “Are MDs the thang with young hipsters?”, “I love ‘pachinko’”, “Is ‘Sazaesan’ really smart?”, “‘Taro’ said this year's ‘Pochi’ is well made.”, “Wouldn't it be scary if they had a ‘Thunderbird’ ramen?”, and so on. [0302]
  • Since simulated conversation is created by exchanging data that is mutually retained by the robots as described above, users in distant places can also enjoy themselves. [0303]
  • FIG. 49 and FIG. 50 show examples of renewing the data retained by the robot with the [0304] server device 750 illustrated in FIG. 46.
  • It would be amusing if the robot could speak words in view of the times. It would also be amusing if the robot is able to speak words corresponding to the individual characteristics of the user such as age, gender, hobbies, and so on. Nevertheless, it is difficult cost-wise to realize such functions in an electronic toy. [0305]
  • Thus, as shown in FIG. 50, the aforementioned function is provided inexpensively by suitably providing, from the server device, data such as the requisite words and data (a control program) for controlling the operation of the robot upon speaking such words. The control program may be used for controlling a series of operations pursuant to such program, or may be used for designating the operation of one among a plurality of operational control programs such as “delight”, “anger”, “sorrow” and “pleasure” pre-stored in the robot. [0306]
  • The data exchange procedures in the aforementioned case are now explained with reference to FIG. 49. Foremost, as shown in FIG. 49, the [0307] communication interface 74 of the robot is connected to a PHS, mobile phone or personal computer 743, and then connected to the server device 750 via a communication network, the Internet 745 for example, in order to establish the circuit for conducting data transmission. Communication parameters required for establishing communication, such as the communication speed, specification of the electronic toy, ID, password, and so on, are transmitted from robot A to the server device 750. The server device 750 conducts authentication on whether to permit the connection, and thereby permits access to robot A. The robot requests the transmission of updated data. Here, it is possible to designate the likes of topics (topical terms) and user-adaptable data. The server 750 transmits words for the requested topics, for example, limited to a required number of words. In the example illustrated in the diagram, “divine nation comment”, “train accident”, “New Years”, “Christmas” and so on are transmitted. It is also possible to forward a new standard sentence suitable for such words. Moreover, as necessary, it is also possible to provide control program data 1, program data 2, program data 3 and so on for controlling the robot operation upon vocalizing standard sentences employing the aforementioned words. Similarly, it is also possible to select and transmit user-corresponding words among the word groups prepared beforehand in correspondence with a plurality of user characteristics. For instance, when the user characteristic is a businessman, “convertible bonds”, “clear note”, “monster” and the like are transmitted. Here, it is also possible to define the operation of the robot with respect to a specific word. In such a case, control program data (41, 42) is also transmitted together with the word data.
  • When robot A receives the data, it stores this in the [0308] memory 63. The ACK signal is transmitted to the server, the circuit is opened, and the update is completed. When data reception ends in failure, the NACK signal is transmitted to the server, and retransmission of the data is requested. When the server device receives the ACK signal from robot A, or when the circuit is opened, it ends the communication with robot A.
  • Robot A applies the acquired words in the standard sentences and conducts at least one of a vocalization or text display (sentence display). Further, although the robot has a function of converting text data into sound, sound data of words and standard sentences may also be received from the server device and vocalized by decoding the same. [0309]
  • “Action mail” is now explained. Action mail is for displaying or reading the contents of the e-mail received by the robot as well as to perform prescribed actions corresponding thereto; for example, the movement of the arms and legs and representation of expressions. [0310]
  • FIG. 51 and FIG. 52 show structural examples in the case of conducting action mail. The e-mail sender downloads beforehand the action mail software, which can be downloaded via the Internet, into his/her [0311] personal computer 743 a. The personal computer 743 a is in an environment connected to a communication network such as the Internet 745 capable of e-mail communication. The sender creates an e-mail message by operating an input device such as a keyboard device. The aforementioned software downloaded into the personal computer includes a message/operation editing program for conducting text input, message editing, control movement input, and so on; a data/sound conversion program for converting the message into sound data; and an e-mail program with a data file attachment function capable of transmitting sound data as an attachment file.
  • The sender creates an e-mail by utilizing control information and the message/operation editing program. The e-mail, for example, as shown in FIG. 52, designates the name of the sender (4 letters for example), the message (44 letters for example), and the operation of the robot. These can be assembled as text data. Next, the text code is converted into a sound signal, an FM modulation signal for example, with the data/sound conversion program. Information on the name, message and robot operation, for example, may be delimited by three-second silent intervals as shown in FIG. 52. Further, headers and footers (not shown) may also be suitably added. Such FM sound is converted into sound data; for example, sound data formats such as WAV, MP3, ram, and so on. The e-mail program transmits this sound data file, upon attaching it to the e-mail, to the party on the other side of the line using the robot. [0312]
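The framing of the name, message and operation fields separated by silent intervals, followed by modulation into a sound file, can be sketched as below. The per-byte tone used here is only a toy stand-in for the FM modulation of the embodiment, and all constants (sample rate, tone length, gap length) are assumptions for the example.

    import math, struct, wave

    RATE = 8000
    GAP_SECONDS = 3            # silent interval separating name / message / operation

    def tone_for_byte(b, seconds=0.05):
        """Toy stand-in for the modulation step: one short tone per byte,
        with the frequency offset by the byte value."""
        freq = 800 + 4 * b
        n = int(RATE * seconds)
        return [math.sin(2 * math.pi * freq * i / RATE) for i in range(n)]

    def encode_action_mail(sender, message, operation):
        samples = []
        for field in (sender, message, operation):
            for byte in field.encode("utf-8"):
                samples += tone_for_byte(byte)
            samples += [0.0] * (RATE * GAP_SECONDS)   # silence delimiter
        return samples

    def write_wav(path, samples):
        with wave.open(path, "wb") as w:
            w.setnchannels(1)
            w.setsampwidth(2)
            w.setframerate(RATE)
            w.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))

    # write_wav("action_mail.wav", encode_action_mail("Taro", "Hello!", "OP:delight"))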
  • E-mail is transmitted to the e-mail server device of the other party via the [0313] Internet 745. Moreover, although simplified in the diagram, various server devices containing a communication circuit and e-mail server as well as connection service providers are included in the Internet 745.
  • The receiver downloads beforehand into his/her personal computer the communication software having an action mail reception function obtainable via the Internet. Decoding of the sound file is included in the reception function. The receiver connects the robot to his/her [0314] personal computer 743 b. The receiver accesses the e-mail server (not shown) with the personal computer 743 b and downloads the e-mail sent to such receiver. When the e-mail requires the use of the robot, the attached sound file is reproduced with the aforementioned communication software in order to demodulate the sound signal. This sound signal is supplied to the control unit 60 of the robot via the communication interface 74. The control unit 60 demodulates the FM signal and converts this into digital data. Control information such as the name of sender, message and movement is distinguished from the data. As described above, this is distinguishable with the blank space between the data. The control unit 60 converts the text data into image data and displays the same on the display unit 71. Here, foremost, the name of the sender is displayed, and a long message can be scroll displayed thereafter on a small display unit screen. The text data may also be read aloud. This is repeated a prescribed number of times. Needless to say, the entire message may be displayed when employing a large display unit. Further, the control unit 60 controls the motors 205 and 206 based on the operational control information and makes the robot perform operations corresponding to the message. The control of the action operation may also be performed pursuant to the control code stored beforehand in the ROM of the robot, or by the sender designating a control program formed from a series of control codes. Moreover, the sender may program, to his/her liking, the series of movements of the robot by assembling control codes corresponding to the individual operations.
  • The message display and action movement corresponding to the reception of action e-mail may be performed simultaneously. Further, the action may be performed first, and the message displayed thereafter. Or, the message may be displayed first, and the action movement performed thereafter. These may be repeated or combined. Moreover, the sender may create a voice message and transmit such message as an attachment file, and the robot may reproduce this from the speaker as a voice message. [0315]
  • When the robot has a built-in telephone function of a PHS or mobile phone, the [0316] control unit 60 downloads the action mail software via the communication function and may possess an e-mail receiving function. In such a case, the control unit 60 of the robot receives the e-mail and may perform the conversion of the sound file otherwise implemented by the personal computer 743 b, and the personal computer will no longer be necessary. Further, the structures illustrated in FIG. 44 to FIG. 46 are also able to conduct action mail.
  • The server device may also be made to be the sender of the action mail. For example, the robot may speak a one-word message, today's fortune, shopping information, weather forecast, current affairs, and the like together with action in accordance with the characteristics and attributes of the user. For instance, if it snowed the previous day, the server device will transmit “It snowed a lot yesterday. And boy was it cold!” (message) and a “jerky move” (movement+facial expression). [0317]
  • FIG. 53 to FIG. 56 show examples of movements (action movements) accompanying the facial expressions displayed together with the message. FIG. 53 is representing “delight” by raising both arms upward at an angle and displaying hearts on the face. FIG. 54 is representing “anger” by raising the hands near the head and displaying slant eyes on the face. FIG. 55 is representing “sorrow” by lowering both hands and displaying tear-filled eyes. FIG. 56 is representing “pleasure” by placing both arms forward and displaying a smiling expression on the face. [0318]
  • Although a structural example of the movement mechanism of the lower body of the robot was exemplified in FIG. 5 to FIG. 8, another structural example of the movement mechanism of the lower body of the robot is now explained. [0319]
  • FIG. 57 and FIG. 58 are perspective views showing an example of a robot structured so as to change the movement of its lower body pursuant to the “volume”, “speed”, “rhythm” and so on of sound such as music. [0320]
  • In this example, the robot makes gestures as though of dancing by opening and closing its legs left and right in accordance with the music. Among these movements, FIG. 57 shows the first state where the robot is standing upright with both feet together. FIG. 58 shows the second state where the robot is opening its legs left and right. The robot consecutively moves from the first state to the second state, and consecutively moves from the second state to the first state. Upon opening its legs left and right, as shown in FIG. 58, the robot is made to bend the knees pursuant to the mechanism described later so as to simulate the movement of a human. [0321]
  • FIG. 59 and FIG. 60 are perspective views showing the drive portions of the leg opening/[0322] closing mechanism 300; FIG. 59 shows the closed state of the legs and FIG. 60 shows the open state of the legs. A motor 301 is built in the lower part of the left leg of the robot, and the driving force is increased by the gear mechanism 302. The driving force passes through the hip joint portion 305 of the waist portion frame 304 via the drive shaft 303 and rotatively drives the left leg cam mechanism 306 inside the waist portion frame. One end of the link 308 is rotatably connected to the cam 307 of this mechanism via a roller bearing. The other end of the link 308 is rotatably mounted on the roller bearing 311 on the upper end of the right leg axis 310. Mounted on the bottom end of the right leg axis 310 is a roller portion 312 for sliding on the floor surface, and the right leg axis oscillates in the left and right directions about the hip joint portion 313 mounted on the waist portion frame 304. As a result, the left leg axis 303 and right leg axis 310 are able to move symmetrically in the left and right directions about a (virtual) central axis (line) extending in the upward and downward directions through the central portion of the torso, pursuant to the cam mechanism 306, link 308, hip joints 305, 313, and so on.
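Purely as an idealized illustration (the actual cam eccentricity and link geometry are not given in the description), the conversion from cam rotation to symmetric leg opening can be sketched as follows.

    # Idealized sketch: the eccentric cam converts motor rotation into a lateral
    # link stroke, which swings the leg axes symmetrically about the central
    # axis of the torso. The dimensions below are placeholders.
    import math

    def leg_opening_angle_deg(cam_angle_deg, eccentricity_mm=5.0, leg_length_mm=60.0):
        stroke = eccentricity_mm * math.sin(math.radians(cam_angle_deg))  # link travel
        return math.degrees(math.asin(stroke / leg_length_mm))           # per-leg swing

    for angle in (0, 90, 180, 270):
        print(angle, round(leg_opening_angle_deg(angle), 2))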
  • FIG. 61 is a perspective view showing a structural example of the [0323] right leg 320. A hip joint portion 313 is mounted on the upper part of the right leg axis 310. A right leg is rotatably mounted against the waist portion frame 304 with a pin (not shown) on the upper part of this hip joint portion 313 so as to move such right leg in the left and right directions. The lower part of the hip joint portion 313 is mounted, with a pin 322 (not shown), on the concave portion of the upper part of the above-knee portion 321 of the leg so as to be rotatable in the front and back directions of the robot. The concave portion of the lower end portion of the above-knee portion 321 is rotatably mounted in the front and back directions with the protrusion of the below-knee portion front cover 323 and the pin 324. The lower central part 323 b of the below-knee portion front cover 323 is opened in a reverse V-shape. The roller portion 312 of the lower end portion of the right leg axis 310 is made to contact the ground surface or floor surface (or the mounting face of the robot) (not shown) by being positioned in the center penetration hole 325 a of the ground portion extending in the front and back directions of the robot, and is mounted in a rotatable manner with a pair of protrusions 325 in the shape of an approximate reverse V-shape and pins 326 respectively disposed on both sides of such penetration hole 325 a. The below-knee portion back cover 327 engages with the below-knee front cover 323 while sandwiching the right leg axis 310. A U-shaped opening 327 a in which the right leg axis is positioned is provided to the upper face portion of the below-knee back cover 327. Moreover, a long hole 327 b is provided in the below-knee portion back cover 327 at a position opposite the reverse V-shaped opening 323 b of the below-knee portion front cover. The pins 326 that rotatably connect the roller portion 312 with the ground portion 325 are positioned in the reverse V-shaped opening 323 b and the long hole 327 b. The below-knee portion U-shaped opening 327 a, reverse V-shaped opening 323 b and long hole 327 b are structured such that the right leg axis 310 and connection pin 326 do not interfere with (do not contact) the covers 323, 327 when the knee portion, which is the portion connecting the above-knee portion and below-knee portion, is bent.
  • A [0324] protrusion 327 c (c.f. FIG. 57, FIG. 58) is formed on the inside bottom portion of the below-knee back cover 327. This protrusion 327 c contacts the inclined face 325 b formed on the ground portion 325. When the right leg axis 310 opens toward the right side of the robot as shown in FIG. 60 pursuant to the rotation of the motor 301, the right ankle slides from the state shown in FIG. 62(a) across the floor surface of the ground portion 325, and, as shown in FIG. 62(b), the below-knee portion 327 rotates relatively in the clockwise direction about the connection pin 326 of the roller portion/ground portion. Thereby, the protrusion 327 c of the below-knee portion contacts the upper part of the inclined face 325 b of the ground portion 325 and pushes the below-knee portion 327 upward. Here, since the position of the hip joint 313 does not change, the lower part of the above-knee portion 321 and the upper part of the below-knee portion 323 are pushed forward, thereby bending the knee of the leg.
  • FIG. 63 shows the appearance of the [0325] left leg 330 having a built-in motor 301. An eccentric cam 307 is mounted on the upper end portion of the left leg axis 303, and a spherical engagement member 309 for engaging with the link 308 is mounted on the cam 307 with a screw (c.f. FIG. 64). The motor is built in the below-knee portion 331, and the below-knee portion 331 and the ground portion 332 are rotatably connected with a pin. A friction member (not shown) such as rubber for preventing sliding is adhered to the bottom of the ground portion 332 of the left foot. Although a knee-bending mechanism as with the right leg portion 320 is not provided to the left leg portion 330, a knee-bending mechanism similar to the right leg may also be provided to the left leg.
  • With the aforementioned [0326] lower body mechanism 300, the mechanical portion occupies only the lower part of the torso, and it is possible to make the bulk of the robot torso internally empty. This is convenient in that the inside of the torso may be used for electric circuits or an upper body mechanism. Moreover, since the relatively heavy motor is disposed in the below-knee portion of the leg, it is easy to stabilize the robot. Further, with the aforementioned mechanism, although the knee of the right leg will bend, the knee of the left leg is fixed, and, by providing a friction member at the bottom portion, it is possible to prevent unstable posture of the robot, unintended movement and rotation of the robot, and so on.
  • FIG. 64 shows an example of the [0327] cam 307 of the left leg axis. As shown in FIG. 64, it is possible to adjust the degree of opening the legs in the left and right directions by changing the mounting position of the engagement member 309 of the cam 307. Since the link 308 and engagement member 309 (and 311) have a spherical shape, unnecessary force is not inflicted between the link and engagement member (or cam) even when the legs are opened. Adjustment can be made by providing beforehand a plurality of screw holes in the cam and mounting the engagement member in an appropriate screw hole, or by changing the cam.
  • FIG. 65 is a block diagram explaining an example of synchronizing (corresponding) music or sound with the movement of the robot. [0328]
  • In this example, in place of the [0329] ROM 62 of the control unit, or in addition to the ROM 62, used is a square chip card (micro IC card) 621, wherein one side thereof is approximately 2 cm, having recorded thereon music information and control data. Exchange of songs is thereby facilitated. Needless to say, music information and control data may be recorded on the ROM 62. When the user inserts the chip card 621 in the robot and orders a movement by operating a switch (not shown), the control unit 60 reads the sound data (information) from the chip card 621, converts this into a sound signal with the sound reproduction processing function 601 of the control unit 60, and supplies this to the speaker 72 at an appropriate level. A song with a prescribed rhythm is thereby played from the speaker 72. Further, the control unit 60 reads control data from the chip card 621 and controls the motor 301 with the rhythm control function 602 of the control unit 60. The motor 301 is able to control the rotation speed, normal/reverse rotation, length of step and so on pursuant to PWM control and level control of the supplied voltage. By previously storing data representing the rhythm of the song in the control data, movement of the legs matching the performance of the song is enabled, and it is thereby possible to make the robot move as though it is dancing to the song.
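The division of labour of FIG. 65 (sound reproduction from the chip card, motor control from pre-stored rhythm data) might be sketched as below; the data layout on the chip card 621 is an assumption.

    # Sketch of FIG. 65: play the song while stepping the leg motor from rhythm
    # data stored with it. The {"bpm", "steps"} layout is only an assumption.
    import time

    control_data = {"bpm": 120, "steps": 8}      # pre-stored rhythm description

    def dance_to_stored_rhythm(control, motor_step=lambda d: print("motor", d)):
        beat_period = 60.0 / control["bpm"]      # seconds per beat of the song
        direction = 1
        for _ in range(control["steps"]):
            motor_step(direction)                # e.g. one PWM burst per beat
            direction = -direction               # alternate normal/reverse rotation
            time.sleep(beat_period)

    dance_to_stored_rhythm(control_data)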
  • FIG. 66 is a block diagram explaining an example of making the robot move in correspondence with music or sound. At least sound data is previously stored in the [0330] chip card 621. When the user inserts the chip card 621 in the robot and orders a movement by operating a switch, the control unit 60 reads the sound data from the chip card 621, converts this into a sound signal with the sound reproduction processing function 601 of the control unit 60, and supplies this to the speaker 72 at an appropriate level. A song with a prescribed rhythm is thereby played from the speaker 72. Further, the control unit 60 samples the sound signal with its sampling function 603 and extracts the rhythm (cycle of accents) of the song from such sound signal with the rhythm extracting function 604. A rotation corresponding to the rhythm of this song is set in the motor control function 605. The motor control function 605 sets the rotation speed, normal/reverse rotation, length of step and so on by performing PWM control and level control of the supplied voltage. When performing this type of control, movement of the legs matching the performance of the song is enabled without having to previously record data representing the rhythm of the song in the control data, and it is thereby possible to make the robot move as though it is dancing to the song.
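The rhythm extracting function 604 is described only as finding the cycle of accents; one simple way to sketch it (a batch amplitude-peak detector rather than real streaming firmware) is shown below.

    # Illustrative rhythm extraction: detect accents as rising edges above a
    # fraction of the peak amplitude and average the gaps between them.
    def extract_beat_period_s(samples, sample_rate, threshold=0.6):
        peak = max(abs(s) for s in samples) or 1.0
        accents, above = [], False
        for i, s in enumerate(samples):
            loud = abs(s) >= threshold * peak
            if loud and not above:               # rising edge of an accent
                accents.append(i)
            above = loud
        if len(accents) < 2:
            return None
        gaps = [b - a for a, b in zip(accents, accents[1:])]
        return (sum(gaps) / len(gaps)) / sample_rate    # seconds per accent

    # Example: accents every 0.5 s in an 8 kHz signal
    signal = [1.0 if i % 4000 == 0 else 0.1 for i in range(32000)]
    print(extract_beat_period_s(signal, 8000))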
  • It is also possible to move the legs of the robot in concert with the sound collected by the [0331] microphone 52. For example, when the user claps his/her hands, speaks or sings near the microphone 52, such sounds are sampled with the sampling function 603 and the rhythm is extracted from the sound signal with the rhythm extracting function 604. A rotation corresponding to this rhythm is set in the motor control function 605. This is also amusing in that the robot moves in correspondence with such sounds.
  • Next, an example of providing a bipedal locomotion mechanism to the lower body of the robot is explained. [0332]
  • FIG. 67 to FIG. 75 are diagrams explaining the bipedal locomotion state. FIG. 67 is a perspective view showing the state where the left leg is positioned backward and the right leg is positioned forward. FIG. 68 is a perspective view showing the state where the right leg and left leg are approximately together. FIG. 69 is a perspective view showing the state where the left leg is positioned forward and the right leg is positioned backward. FIG. 70 is a side view showing the leg mechanism of the left leg. The operation of the leg mechanism will be described in detail after the explanation of the respective components thereof. FIG. 71 shows the [0333] waist portion frame 401. The waist portion frame 401 comprises a stopper 401 a for stopping the long rod 410 from being raised excessively, a drive shaft hole 401 b of the cam pulley, a connection axis 401 d for rotatably mounting the above-knee portion 402, and a guide pin 401 c for engaging with the long hole 411 b of the short rod 411.
  • FIG. 72 shows the above-[0334] knee portion 402 rotatably connected to the waist portion frame 401. A connection portion 402 a to be connected to the connection axis 401 d of the waist frame 401 is provided to the upper part of the above-knee portion 402. A connection portion 402 b to be connected to the below-knee portion 403 is provided to the lower part of the above-knee portion 402.
  • FIG. 73 shows the below-[0335] knee portion 403. A connection portion 403 a to be connected to the connection portion 402 b of the above-knee portion 402 and a connection portion 403 b to be connected to the short rod 411 are provided to the upper part of the below-knee portion 403. A connection portion 403 c to be rotatably connected to the connection portion 404 a of the ground portion 404 is provided to the lower part of the below-knee portion 403.
  • FIG. 74 shows the [0336] ground portion 404. A connection portion 404 a to be connected to the connection portion 403 c of the below-knee portion 403 and a connection portion 404 b to be connected to the long rod 410 are provided to the upper part of the ground portion 404.
  • FIG. 75 shows the [0337] cam pulley 420, long rod 410 and short rod 411. The cam pulley 420 is connected to the axis to be rotatively driven by a motor (not shown). A drive pin 420 a is provided in an eccentric position from the drive shaft (not shown) of the pulley at the outside of the cam pulley 420. A cylindrical cam 420 b is provided in an eccentric position from the drive shaft (not shown) of the pulley at the inside of the cam pulley 420. A long hole 410 a is provided to the upper part of the “dogleg” shaped long rod 410, and the pin 420 a is inserted in this long hole 410 a so as to rotatably engage the rod and the long hole. The long hole 410 a prevents the toe of the foot of the robot from lowering excessively. A connection portion 410 b to be connected with the connection portion 404 b of the ground portion 404 is provided to the lower part of the “dogleg” shaped rod. An annular engagement portion 411 a for engaging with the cam 420 b is provided to the upper part of the short rod 411. A long hole 411 b for engaging with the guide pin 401 c of the frame is provided to the center portion of the short rod 411. A connection portion 411 c to be connected with the connection portion 403 b of the below-knee portion 403 is provided to the lower part of the short rod 411.
  • According to the foregoing structure, as shown in FIG. 70, the [0338] waist portion frame 401 and the above-knee portion 402 are rotatably connected via the connection portions 401 d and 402 a, and the above-knee portion 402 and the below-knee portion 403 are rotatably connected via the connection portions 402 b and 403 a. Further, the below-knee portion 403 is rotatably connected to the ground portion 404 via the connection portions 403 c and 404 a. The short rod 411 connects the cam 420 b and the below-knee portion 403 via the connection portions 403 b, 411 c. When the cam pulley 420 is rotatively driven, the eccentric cam 420 b swings the below-knee portion 403 with the short rod 411 and lifts the below-knee portion 403. The above-knee portion 402 also oscillates pursuant thereto. The long rod 410 connects the drive pin (cam) 420 a and the ground portion 404 via the connection portions 410 b, 404 b. When the cam pulley 420 is rotatively driven, the drive pin 420 a lifts the ground portion 404 with the long rod 410. The lifting and lowering of the toe of the foot upon moving the legs is thereby set.
  • This mechanism, as shown in FIG. 67 to FIG. 69, lifts the toe of the left foot while maintaining balance with the right foot on the ground, and advances the legs by moving the left foot from the backward to the forward position with the heel on the ground. When the entire left foot is contacting the ground, the right foot is similarly moved to the forward position; repeating this realizes the walking motion. [0339]
  • FIG. 76 and FIG. 77 show movements of the left leg pursuant to the rotation of the drive shaft. In this example, the ground portion is not in contact with the floor, and the movement is shown in a state where the foot is hanging in the air. [0340]
  • The respective diagrams of FIG. 76([0341] 1) to FIG. 76(4) and FIG. 77(5) to FIG. 77(8) show movements of the respective legs when the drive shaft of the cam is rotated in increments of 45 degrees. FIG. 76(1) shows the state where the rotational angle of the cam axis is 0 degrees (basic position). In this state, the short rod 411 is swung forward by the cam 420 b with the guide pin 401 c as the fulcrum, and the leg is thereby moved forward. FIG. 76(3) shows a state where the cam axis has rotated 90 degrees. In this state, the short rod 411 is in the approximate center position of its oscillation pursuant to the cam 420 b, and the legs are together. FIG. 77(5) shows a state where the cam axis has rotated 180 degrees. In this state, the short rod 411 is swung backward by the cam 420 b with the guide pin 401 c as the fulcrum, and the leg is thereby moved backward. FIG. 77(7) shows a state where the cam axis has rotated 270 degrees. This state also corresponds to the state where the legs are together. Nevertheless, this state is different from the case of FIG. 76(3) above in that the upper end of the long rod 410 is not in contact with the stopper, and the degree of rotational freedom of the ground portion 404 about the connection portion 404 a is large. As shown in FIG. 76(1) to FIG. 77(5), the lifting of the long rod 410 is prevented by the upper end of the long rod 410 contacting the stopper 401 a, thereby preventing the tip of the foot (ground portion) from lowering excessively. Further, the weight of the frame of the robot when the ground portion is in contact with the ground is conveyed to the backside of the ground portion 404 in order to stabilize the posture.
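For reference, the sequence of states just described can be summarized as a simple lookup of walking phase by cam-shaft angle (degrees); the intermediate 45-degree states are omitted.

    # Walking phases of the leg versus cam-shaft rotation, summarizing
    # FIG. 76(1) to FIG. 77(7).
    leg_phase = {
        0:   "leg swung forward (short rod forward about guide pin 401c)",
        90:  "legs together (long rod 410 held against stopper 401a)",
        180: "leg swung backward",
        270: "legs together (ground portion 404 free to rotate about 404a)",
    }
    for angle in sorted(leg_phase):
        print(angle, "->", leg_phase[angle])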
  • Pursuant to the series of operations described, as shown in FIG. 78([0342] a) and FIG. 78(b), the toe is raised, the foot is moved forward in a state where the heel is in contact with the ground, and, when the foot has moved forward, the entire sole is made to contact the ground. As described later, the toe side of the sole has a mechanism built therein for changing the direction of the overall robot. Moreover, a roller is built in the heel side of the sole for sliding across the ground. As this roller, a metal roller, for example, may be employed such that it concurrently acts as a weight, thereby enhancing the stability of the posture of the overall robot. The mode of leg movement of the robot as described above is advantageous for this type of mechanism.
  • The robot will move in reverse by counter-rotating the cam axis. [0343]
  • FIG. 79 and FIG. 80 show structural examples (leg mechanism of left leg) of another leg mechanism. Components in these diagrams corresponding with those illustrated in FIG. 70 are given the same reference numerals. [0344]
  • In this example, a [0345] pressing plate 410 c is integrally mounted on the long rod 410. The inclination (posture) of the robot and the lifting/lowering of the toe (or heel) of the feet are set by pressing the connection portion (rear axis) 404 b of the ground portion 404 at a set timing as a result of the cam pushing this pressing plate. Further, the shape of the cam and hole is adjusted such that the toe of the foot (ground portion) of the robot can be lifted higher. Moreover, in this example, the advancing power (or retreating power) of the robot is increased by increasing the driving force of the toes by pressing the front end side (toe side) of the ground portion 404 onto the ground, or by actively applying an energization force toward the ground with a spring.
  • In the leg mechanism of the left leg shown in FIG. 79, the [0346] waist portion frame 401 and the above-knee portion 402 are rotatably connected via the connection portions 401 d and 402 a, and the above-knee portion 402 and below-knee portion 403 are rotatably connected via the connection portions 402 b and 403 a. Further, the below-knee portion 403 is rotatably connected to the ground portion 404 via the connection portions 403 c and 404 a. A spring SP as the energization means is mounted on a part of the ground portion 404, for example between the connection portion 404 b and a part of the case of the below-knee portion 403, so as to apply force to continuously lift the back part (heel of the foot) of the ground portion 404. This spring SP operates in the direction of continuously keeping the eccentric cam 420 b and the pressing plate 410 c in contact. Moreover, the spring SP only needs to operate so as to lift the heel of the ground portion 404 in the upward direction, and the mounting position thereof may be selected accordingly.
  • The [0347] short rod 411 connects the eccentric cam 420 b and the below-knee portion 403 via the connection portions 403 b, 411 c. When the cam pulley is rotatively driven, the eccentric cam 420 b swings the below-knee portion 403 back and forth with the short rod 411 and lifts the below-knee portion (leg) 403. The above-knee portion 402 also oscillates so as to bend the knee pursuant thereto. The long rod 410 is guided by the guide pin 420 c, and connects the eccentric cam 420 b and the ground portion 404 via the connection portions 404 b, 410 b. When the cam pulley 420 is rotatively driven, the pressing plate 410 c is pushed by the eccentric cam 420 b with the pin 420 c as the guide, and thereby lowers the engagement portion (rear axis) 404 b of the ground portion 404 with the long rod 410. The posture (inclination) of the robot during the walking motion, and the lifting and lowering of the toe of the foot upon moving the legs, are thereby set.
  • In this type of mechanism, the toe of the stepping foot may be lifted at the timing of moving the stepping foot forward, and lowered at the timing of moving the stepping foot backward. It is thereby possible to improve the running performance of the robot so that it is capable of walking without falling down. [0348]
  • FIG. 80 shows the [0349] cam pulley 420, long rod 410, short rod 411 and spring SP of the other mechanical example described above. The cam pulley 420 is connected to the drive shaft (not shown) to be rotatively driven by a motor. A guide pin 420 c is provided in a position concentric to the drive shaft of the pulley at the outside of the cam pulley 420. A cylindrical cam 420 b is provided to the cam pulley 420 in an eccentric position from the drive shaft of the pulley. A long hole 410 a is provided to the upper part of the approximate “dogleg” shaped rod, and a pin 420 a is inserted in this long hole 410 a so as to rotatably engage the rod and the long hole. The guide pin 420 c is movably engaged with the long hole 410 a. The pressing plate 410 c is formed at the lower part of the long hole 410 a. The upper surface of the pressing plate 410 c contacts the eccentric cam 420 b and moves the long rod in the upward and downward direction in accordance with the movement of the cam 420 b. A connection portion 410 b to be connected with the connection portion 404 b of the ground portion 404 is provided to the lower part of the “dogleg” shaped rod 410. An annular engagement portion 411 a for rotatably engaging with the cam 420 b is provided to the upper part of the short rod 411. A long hole 411 b for engaging with the guide pin 401 c of the frame is provided to the center portion of the short rod 411. A connection portion 411 c to be connected with the connection portion 403 b of the below-knee portion 403 is provided to the lower part of the short rod 411.
  • Further, the above-[0350] knee portion 402, below-knee portion 403 and ground portion 404 in the other embodiment described above are structured in a similar manner as with the first embodiment.
  • FIG. 81 and FIG. 82 show the mechanism for changing the direction of the robot. FIG. 81 is a side view of the ground portion, and a [0351] drive roller 404 c is disposed at the toe side and a sliding roller 404 d is disposed at the heel side thereof. FIG. 82(a) is a diagram of the ground portions 404 of the left and right legs viewed from the front of the robot, and FIG. 82(b) is a diagram of the ground portions 404 of the left and right legs viewed from the bottom.
  • As shown in FIG. 82([0352] b), disposed in the front part within the ground portion 404 are a motor 404 e, a gear mechanism 404 f for increasing the rotational force of this motor, and a drive roller 404 c rotated by this gear mechanism 404 f. A plurality of drive rollers 404 c may be provided; in this example, two drive rollers 404 c, 404 c are provided and additionally connected with a drive belt 404 g. The drive direction pursuant to the drive roller 404 c and the drive belt 404 g is set obliquely against the front and back direction of the robot. Although the motor and gear mechanism are also disposed obliquely in correspondence thereto, these may be set appropriately. When the number of drive rollers 404 c is increased, the ground contact area increases, and the stability of the robot increases. It is also possible to increase the speed of turning.
  • Preferably, as shown in FIG. 82([0353] b), when both legs are together, the left and right drive rollers and the drive directions by the drive belts 404 g form an “inverted V-shape” positioned on approximately the same circumference. A freely rotatable sliding roller 404 d is positioned at the back part within the ground portion 404. By structuring this roller with a relatively heavy material, metal for example, it will concurrently act as a weight for adjusting the balance of the robot. Needless to say, an item corresponding to a weight for maintaining the balance may also be separately provided to the ground portion 404.
  • FIG. 83 shows another example of a mechanism for changing the direction of the robot. The components shown in FIG. 83 that correspond to those illustrated in FIG. 82([0354] b) are given the same reference numerals, and the explanation thereof is omitted. This example only shows the left leg side of the robot, and the right leg side (not shown) is structured symmetrically to the left leg side. The drive roller 404 c and the drive belt 404 g depicted in FIG. 82 are replaced with a rubber drive roller 404 h. The rubber drive roller 404 h is structured, for example, by covering the periphery of a plastic pulley with high-friction rubber. In a state where both legs are together, the drive directions by the left and right rubber drive rollers 404 h form an “inverted V-shape” positioned on approximately the same circumference. A freely rotatable sliding roller 404 d is positioned at the back part within the ground portion 404. This structure also operates similarly to the example shown in FIG. 81 and FIG. 82 above.
  • By providing a drive roller or a drive belt on the sole of the foot of the robot, it becomes possible to change (turn) direction while performing bipedal locomotion, which is otherwise technically difficult with a bipedal locomotion mechanism. Needless to say, the robot can turn even in a non-walking state. Further, by adopting a structure in which the drive roller or drive belt is disposed obliquely against the front and back direction of the robot, the posture of the robot is more stable and the directional change of the robot can be performed in a shorter time span in comparison to driving the drive roller in a direction perpendicular to the advancing direction in order to make the turn. [0355]
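As a rough geometric illustration only (no dimensions are given), if the left and right drive directions lie on a common circle of radius r, driving the rollers at surface speed v turns the robot in place at roughly v / r radians per second.

    # Rough turning-rate estimate for oblique sole rollers whose drive
    # directions lie on a common circle (the numbers are placeholders).
    import math

    def yaw_rate_rad_s(roller_surface_speed_mm_s, circle_radius_mm):
        return roller_surface_speed_mm_s / circle_radius_mm

    print(round(math.degrees(yaw_rate_rad_s(30.0, 60.0)), 1), "deg/s")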
  • Moreover, the aforementioned bipedal locomotion mechanism of the legs walks in a state where the heel is constantly contacting the ground. Providing the drive roller or the drive belt at the toe side is suitable for this walking structure. In other words, if the drive roller were provided at the heel side, the robot would turn while the toe is lifted, and the posture of the robot would become unstable. Further, this would be an unnatural movement for a robot simulating the movement of a person. With respect to this point, when the drive belt or the like is provided at the toe side, the posture is stabilized since the turn is made with the entire sole of the foot in contact with the ground, and the movement looks natural. The turn during walking in particular is stable. [0356]
  • The [0357] control unit 60 is able to change the direction of the robot in order to avoid obstacles by activating the aforementioned turning mechanism upon detecting an obstacle in the forward direction of the robot with the likes of the light sensor 53. The sensor for detecting obstacles may also be positioned at the tip of the ground portion. In such a case, the sensor may be a switch, an ultrasonic sensor, or the like.
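A minimal control sketch of this obstacle-avoidance behaviour (the sensor and actuator callables are hypothetical stand-ins) could be the following.

    # Sketch: when the forward sensor reports an obstacle, run the turning
    # mechanism for a while instead of taking the next walking step.
    def step_or_avoid(obstacle_ahead, walk_step, turn_step, turn_steps=10):
        if obstacle_ahead():
            for _ in range(turn_steps):
                turn_step()        # drive the oblique sole rollers to change heading
        else:
            walk_step()            # continue bipedal locomotion

    # Example with stand-in callables:
    step_or_avoid(lambda: True, lambda: print("walk"), lambda: print("turn"))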
  • As described above, in the embodiments of the present invention, it is possible to reduce the consumption of the battery since the robot is activated upon previously distinguishing that the user is nearby. [0358]
  • Further, the electronic toy is not limited to being battery powered, and the power supply may also be via an AC power supply or AC power supply adaptor. [0359]
  • Moreover, the robot of the embodiments stores a program for deciding on its own actions, and self-activates various operations in accordance with the time. The subsequent action is decided with respect to whether there was a reaction; for example, whether a sound could be heard or whether a switch was touched. When no one is watching, wasteful movements are largely avoided, but whether the user is nearby is periodically (at fixed time intervals) confirmed. When someone is nearby, by taking an even larger action, the robot appears to the user as though it is moving on its own at all times. [0360]
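The self-activation behaviour described here might be sketched as a periodic loop of the following kind; the interval, probability and presence test are assumptions.

    # Sketch of the periodic self-activation loop: mostly quiet when alone
    # (with occasional solo play), a larger action when someone is nearby.
    import random, time

    def activity_loop(user_nearby, small_action, large_action,
                      check_interval_s=1, cycles=3):
        for _ in range(cycles):
            if user_nearby():
                large_action()               # visible "acting on its own"
            elif random.random() < 0.1:
                small_action()               # occasional solo play while alone
            time.sleep(check_interval_s)     # fixed-interval presence check

    activity_loop(lambda: random.random() < 0.5,
                  lambda: print("small solo play"),
                  lambda: print("big action for the user"))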
  • Further, the robot of the embodiments comprehends the biorhythm of the user in order to presume the health and mood of the user, and, when it is presumed that the user is not feeling well, it takes (is programmed to take) humane actions such as cheering the user up. [0361]
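The description does not spell out the biorhythm formula; one commonly used model (23/28/33-day sine cycles counted from the birth date) could serve as a sketch for presuming the user's condition.

    # Assumed biorhythm model (classic 23/28/33-day cycles); a low emotional
    # value is taken here as "the user is presumably not feeling well".
    import math
    from datetime import date

    def biorhythm(birth: date, today: date) -> dict:
        t = (today - birth).days
        return {"physical":     math.sin(2 * math.pi * t / 23),
                "emotional":    math.sin(2 * math.pi * t / 28),
                "intellectual": math.sin(2 * math.pi * t / 33)}

    def should_cheer_up(birth: date, today: date, threshold=-0.5) -> bool:
        return biorhythm(birth, today)["emotional"] < threshold

    print(should_cheer_up(date(1990, 5, 1), date(2001, 11, 6)))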
  • Moreover, since the robot periodically performs solo play, this would be amusing for the user when he/she discovers such solo play. [0362]
  • The robot of the embodiments possesses a self-emotion parameter, and vocalizes or displays words corresponding to the present emotion. This is amusing since it would seem as though the robot has emotions. [0363]
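A sketch of such emotion-dependent word selection (the parameter range, zone boundaries and words below are purely illustrative) might be the following.

    # Illustrative zone-based selection: the current emotion parameter value
    # picks a zone, and an utterance is chosen from that zone's word list.
    import random

    EMOTION_ZONES = [
        (0, 33, ["Leave me alone...", "Sigh."]),
        (34, 66, ["Hello!", "What shall we do today?"]),
        (67, 100, ["I feel great!", "Let's dance!"]),
    ]

    def pick_utterance(emotion_value: int) -> str:
        for low, high, words in EMOTION_ZONES:
            if low <= emotion_value <= high:
                return random.choice(words)
        return ""

    print(pick_utterance(80))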
  • The robot of the embodiments reacts to pranks such as continuously talking loudly near the robot, covering the robot with a cloth, or hitting the robot repeatedly. Thus, this is also amusing. [0364]
  • The robot of the embodiments conducts communication by text. For example, this is amusing in that the robot asks the user questions or soliloquizes. [0365]
  • Further, the emotion of the robot is affected by the response to such questions, and the mood will become good or bad. As this is humanlike, this is amusing in that such emotion is displayed on the display unit or represented by movement. [0366]
  • Moreover, when robots are connected to each other, standard sentences are formed so as to realize conversation by data exchange. This is amusing in that, by outputting these sentences by sound or on the display unit, it seems as though the robots are having a conversation. [0367]
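A sketch of this template-filling exchange (the standard sentence and the exchanged items are illustrative) might be as follows.

    # Illustrative "conversation" by template filling: data received from the
    # connected robot is inserted into a standard sentence and then output.
    def build_line(template: str, exchanged: dict) -> str:
        return template.format(**exchanged)

    received = {"owner": "Hanako", "hobby": "dancing"}   # data from the other robot
    print(build_line("Nice to meet you, {owner}! I hear you like {hobby}.", received))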
  • Further, the mechanical structure shown in the embodiments is able to obtain two degrees of freedom of the arm, one degree of freedom of the neck, and expressions of the face (eyes) with a minimal structure, thereby realizing emotional movements and expressions of the robot. [0368]
  • Moreover, the electronic toy and the electronic robot of the present invention are also applicable to so-called pet robots, therapy products (e.g., healing robots), domestic robots comprising a function of monitoring patients and elderly persons, and so on, and may be enjoyed by adults and elderly persons without being limited to use as toys for children. Needless to say, the present invention may also be applied in adult toys and playthings. [0369]
  • Further, the walking robot as the electronic toy of the embodiments is able to lift and move the tip (toe) or the back end (heel) of the ground portion (foot) at a larger angle upon advancing or retreating by alternately moving both legs. In addition, the driving force (or friction) to the toe is increased. Thus, the running performance in places with relatively bad foothold is improved, and the falling of the robot is thereby reduced. [0370]
  • Moreover, it is also possible to combine the respective embodiments described above. For example, the mechanism of the upper body of the robot shown in FIG. 5 can be suitably combined with the mechanism of the lower body shown in FIG. 59 or FIG. 70. In addition, it is also possible to combine the various control modes explained in the embodiments, those in FIG. 11 to FIG. 56 for example, to the robot structured as described above. [0371]
  • As described above, the electronic toy of the present invention is able to communicate with the user from the electronic toy side since it automatically activates when the user is nearby. It is also possible to suppress the wasteful consumption of the power source. [0372]
  • Further, it is amusing in that the robot behaves so as to communicate with the user by employing text. In addition, the amusement is increased since the robot selects words and movements to be output pursuant to its emotions, which is humanlike. [0373]
  • Moreover, obtained is an electronic toy (robot toy) with favorable walking performance. [0374]

Claims (89)

What is claimed is:
1. An electronic toy controlled so as to react to external information, comprising:
a movement mechanism structuring the mechanical movement of the toy;
input means for obtaining external information;
distinction means for distinguishing whether an object body exists in the periphery; and
control means for selecting, among a plurality of control parameters, a control parameter for controlling said movement mechanism in correspondence with said external information based on said distinction result, and controlling the movement of said movement mechanism.
2. An electronic toy according to claim 1 further comprising information display means for externally displaying information,
wherein said control means further selects, among a plurality of control parameters, a control parameter for controlling said information display means in correspondence with said external information, and controls the operation of said information display means.
3. An electronic toy according to claim 1 or claim 2 further comprising sound generation means for externally outputting sound,
wherein said control means further selects, among a plurality of control parameters, a control parameter for controlling said sound generation means in correspondence with said external information, and controls the operation of said sound generation means.
4. An electronic toy according to claim 1, further comprising:
information display means for externally displaying information;
sound generation means for externally outputting sound;
means for calculating the lifestyle rhythm of a specific person; and
event detection means for detecting the occurrence of an event during said lifestyle rhythm;
wherein said control means further selects the control parameter of at least said movement mechanism, said information display means and said sound generation means in correspondence with said event.
5. An electronic toy according to claim 1, further comprising:
information display means for externally displaying information;
sound generation means for externally outputting sound;
clock means for detecting the present time; and
detection means for detecting the occurrence of an event planned on a time base in advance;
wherein said control means further selects the control parameter of at least said movement mechanism, said information display means and said sound generation means in correspondence with said event.
6. An electronic toy according to claim 1, wherein said distinction means detects peripheral sound and/or movement.
7. An electronic toy according to claim 1, wherein said distinction means detects peripheral sound and/or brightness.
8. An electronic toy according to claim 1, wherein said distinction means comprises a microphone for collecting sound and/or a camera for photographing the periphery.
9. An electronic toy according to claim 1, wherein said movement mechanism is structured in the shape of a humanoid robot, and the movement thereof is controlled so as to express at least one of a human emotion of “delight”, “anger”, “sorrow” and “pleasure”.
10. An electronic toy according to claim 1, wherein said control means selects the control parameter for a predetermined solo play operation when it is judged that a person does not exist in the periphery.
11. An electronic toy according to claim 2, wherein said electronic toy is in the shape of a human, and said information display means is provided to a part corresponding to the face and displays expressions of the face and symbols such as text.
12. An electronic toy according to claim 1 further comprising storage means for recording the voice of a person.
13. An electronic toy according to claim 1, wherein said input means includes at least one of a touch sensor, microphone, light sensor, camera, ○× switch and condition sensor.
14. An electronic toy according to claim 1 further comprising means for detecting the output of a battery, which is the source of power of said movement mechanism;
wherein said control means further generates a warning with the information display means for externally displaying information or the sound generation means for externally outputting sound when the output of said battery becomes weak.
15. An electronic toy controlled so as to react to external information, comprising:
a human-shaped structure;
control means for controlling the movement of said structure in correspondence with external information;
a miniature camera provided to said structure and for photographing the external situation; and
communication means for externally transmitting the photographed image.
16. A toy, comprising:
a basic frame disposed at the torso portion of the human-shaped toy;
first and second sub-frames respectively provided at both sides of said basic frame and rotatably mounted on said basic frame;
first and second rotational axes respectively provided to said first and second sub-frames;
a cam mechanism provided to a third rotational axis driven by a first motor;
a link for connecting said cam mechanism between said first and second sub-frames and oscillating both sub-frames;
a gear mechanism driven by a second motor; and
a transmission mechanism disposed across said basic frame and between said first and second sub-frames and for transmitting the output of said gear mechanism to said first and second rotational axes.
17. A toy according to claim 16, wherein said transmission mechanism is structured of a gear train formed of a plurality of gears, and each of the gears on both ends are respectively disposed in said first and second sub-frames and respectively engage with said first and second rotational axes via a bevel gear.
18. An electronic toy in the shape of a human or an animal comprising a display unit at a face-corresponding portion of the head capable of displaying text and symbols, and which is structured such that the information input pursuant to the operation of the input unit, which is formed of a plurality of input switches provided on the body, can be visually confirmed with the display unit provided on said face-corresponding portion.
19. An electronic toy in the shape of a human or an animal having a head portion and a torso portion and comprising a display unit at a face-corresponding portion of the head capable of displaying text and symbols, an input unit formed of a plurality of input switches is provided to the torso portion, and which is structured such that the operation results of said input unit can be visually confirmed with the display unit provided on said face-corresponding portion.
20. An electronic robot in the shape of a human or an animal comprising a display unit at a face-corresponding portion of the head capable of displaying text and symbols, and which is structured such that the information input by the operator operating the input unit provided to the body of said robot can be visually confirmed with the display unit provided on said face-corresponding portion.
21. An electronic robot in the shape of a human or an animal comprising a display unit at a face-corresponding portion of the head capable of displaying text and symbols, and which is structured such that the information input by the operator operating the input unit provided to the body of said robot is displayed on the display unit provided on said face-corresponding portion so as to form the expression of said robot.
22. An electronic toy according to claim 1, wherein an emotion parameter is included in said control parameter, and said emotion parameter is represented as the biorhythm of a specific person or the biorhythm of the robot.
23. An electronic toy according to claim 22, wherein said emotion parameter is affected by the occurrence of an event.
24. An electronic toy according to claim 23, wherein the response to a question inquired by the electronic toy to the user is included in said event.
25. An electronic toy according to claim 24, wherein defined beforehand in said question are changes in the emotion parameter against the anticipated response to the question.
26. An electronic toy according to claim 22, further comprising:
information display means for externally displaying information; and
sound generation means for externally outputting sound;
wherein said control unit further selects at least one of the information to be externally displayed or the sound to be externally output based on said emotion parameter, and respectively supplies this to the corresponding means among said information display means and said sound generation means.
27. An electronic toy according to claim 24, wherein said control means further stores the response to said question, and forms a standard sentence by using data relating to said response.
28. An electronic toy in the shape of a human or an animal, comprising:
a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion;
input means provided on the body for performing input operations;
storage means for storing a plurality of words; and
control means having a function for outputting emotion parameter values representing self emotions, and which selects said words based on said emotion parameter value and displays said words on said display unit.
29. An electronic toy in the shape of a human or an animal, comprising:
vocalization means for outputting sound data as a voice;
input means provided to the body for performing input operations;
storage means for storing a plurality of sound data; and
control means having a function for outputting emotion parameter values representing self emotions, and which selects said sound data based on said emotion parameter value and makes said vocalization means vocalize said sound data.
30. An electronic toy in the shape of a human or an animal, comprising:
a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion;
vocalization means for outputting sound data as a voice;
input means provided on the body for performing input operations;
storage means for storing a plurality of words and a plurality of sound data; and
control means having a function for outputting emotion parameter values representing self emotions, and which selects said words and said sound data based on said emotion parameter value and supplies said words and said sound data to said display unit and said vocalization means, respectively.
31. An electronic toy according to any one of claims 28 to 30, wherein said emotion parameter value temporally changes between the maximum value and minimum value.
32. An electronic toy according to claim 28, wherein said control means further inquires questions with said text or sound, and changes the value of said emotion parameter in accordance with the input operation in response thereto.
33. An electronic toy according to claim 32, wherein a plurality of said questions are stored in advance, and changes in said emotion parameter are defined against the anticipated response to the respective questions.
34. An electronic toy according to claim 32, wherein a plurality of said questions are stored in advance, and the degree of intimacy between the electronic toy and user is defined against the anticipated response to the respective questions.
35. An electronic toy according to claim 32, wherein said control means further stores the response to said question, and forms a standard sentence by using data relating to said response.
36. An electronic toy according to claim 34, wherein said control unit further accumulates the degree of intimacy obtained pursuant to the respective questions and, when this exceeds a prescribed value, supplies data for expressing a specific emotion to said display unit and/or said vocalization means.
37. An electronic toy according to claim 32, wherein said questions include questions that will affect and questions that will not affect said emotion parameter.
38. An electronic toy according to claim 31, wherein a plurality of zones are defined in advance between the maximum value and minimum value of said emotion parameter and said words and said voice data are distributed in the respective zones,
and wherein said control means selects words and sound data of the corresponding zone depending on to which zone the current emotion parameter value belongs.
39. An electronic toy according to claim 38, wherein said control means, in a special zone, further selects the control for performing a special movement accompanying the mechanical movement of the components structuring the human shape or animal shape.
40. An electronic toy according to claim 28, wherein said control means further comprises an exhibition mode for changing said emotion parameter value between the maximum value and minimum value thereof in short cycles.
41. An electronic toy according to claim 30 further comprising connection means for connecting the electronic toy to a network,
wherein at least one of the words and sound data is downloaded to said storage means from a server device connected to said network.
42. An electronic toy according to claim 41, wherein said downloaded words and sound data are current affair terms.
43. An electronic toy according to claim 41, wherein said downloaded words and sound data are terms corresponding to the characteristics of the user.
44. An electronic toy according to claim 30 further comprising connection means for connecting two electronic toys,
wherein at least one of the words and sound data stored in the connected electronic toy of the opponent is received by said storage means.
45. An electronic toy according to claim 41 or claim 44, wherein said connection means includes at least one of a communication cable, PHS, mobile telephone and personal computer.
46. An electronic toy according to claim 44, wherein text data is exchanged between the electronic toys, and simulated conversation is conducted by incorporating the exchanged data in standard sentences.
47. An electronic toy in the shape of a human or an animal, comprising:
sound detection means for detecting peripheral sound;
a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion;
storage means for storing a plurality of expressions; and
control means having a function for outputting emotion parameter values representing self emotions, and which selects said expression based on said emotion parameter value and displays said expression on said display unit, and sets said emotion parameter to an unpleasant state when said sound exceeds a prescribed level and continues beyond a prescribed time.
48. An electronic toy in the shape of a human or an animal, comprising:
a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion;
storage means for storing a plurality of expressions;
input means provided on the body for performing input operations; and
control means having a function for outputting emotion parameter values representing self emotions, and which selects said expression based on said emotion parameter value and displays said expression on said display unit, and selects an expression corresponding to said emotion parameter when said input means is continuously operated for a prescribed time or a prescribed number of times.
49. An electronic toy according to claim 47, wherein an expression of anger is displayed on said display unit during said unpleasant state.
50. An electronic toy according to claim 48, wherein the expression selected in correspondence with said continuous operation is a painful expression upon being pounded, or a delightful expression upon being patted.
51. An electronic toy in the shape of a human or an animal, comprising:
a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion;
storage means for storing a plurality of expressions;
a light sensor for detecting the peripheral brightness; and
control means for selecting an expression corresponding to the self emotion and selecting the expression of closing the eyes when said light sensor detects a dark state beyond a prescribed time.
52. An electronic toy according to claim 51, wherein said control means further moves the mechanical components structuring the human shape or animal shape so as to express a reluctant expression.
53. An electronic toy according to claim 28, wherein the initial value of the function for outputting the emotion parameter value expressing said emotions is set randomly.
54. An electronic toy in the shape of a human or an animal, comprising:
a display unit capable of displaying text and symbols on the face-corresponding portion of the head or torso-corresponding portion;
movably structured mechanical components structuring a human shape or an animal shape; and
a control unit for distinguishing the message and control information from a file attached to an e-mail, displaying said message on said display unit, and moving said mechanical components in correspondence with said control information.
55. An electronic toy according to claim 54, wherein said attachment file is a sound file.
56. An electronic toy according to claim 55, wherein said sound file is reproduced as a sound signal by a computer, and said sound signal is supplied to said control unit.
57. An electronic toy according to claim 54, wherein said control information designates the movement stored beforehand in said control unit.
58. An electronic toy according to claim 54, wherein said control information designates to said control unit a series of control procedures of said mechanical components.
59. An electronic toy according to claim 54, wherein, when said control information is not attached, said control unit selects adequate movement of said mechanical components.
60. An electronic toy according to claim 54, wherein said control information expresses emotions such as delight, anger, sorrow and pleasure of the robot.
61. A method of exchanging e-mail, comprising:
a step of converting the message to be displayed on the electronic toy of the receiving side and the movement to be made by said electronic toy into a sound signal;
a step of converting said sound signal into a sound file and making this an attachment file of an e-mail; and
a step of transmitting the e-mail with said sound attachment file from the terminal device of the transmitting side to the terminal device of the receiving side.
62. A method of exchanging e-mail, comprising:
a step of receiving e-mail with a sound attachment file in which said attachment file is a sound file where a sound signal bears the message to be displayed on an electronic toy and the movement to be made by said electronic toy, and obtaining said sound signal by reproducing said sound file;
a step of forwarding said reproduced sound signal from the terminal device of the receiving side to the electronic toy of the receiving side; and
a step of making said electronic toy display said message and perform said movement.
63. A method of exchanging e-mail, comprising:
a step of converting the message to be displayed on the electronic toy of the receiving side and the movement to be made by said electronic toy into a sound signal;
a step of converting said sound signal into a sound file and making this an attachment file of an e-mail;
a step of transmitting the e-mail with said sound attachment file from the terminal device of the transmitting side to the terminal device of the receiving side;
a step of forwarding said reproduced sound signal from the terminal device of said receiving side to said electronic toy; and
a step of making said electronic toy display said message and perform said movement.
64. An electronic toy in the shape of a human or an animal, comprising:
a leg structure structuring a pair of movable legs in the shape of a human or an animal; and
a control unit for controlling the movement of said legs in correspondence with sound to be output.
65. An electronic toy according to claim 64, wherein said control unit sets the speed of movement of said legs in correspondence with the volume of said sound and rhythm.
66. An electronic toy according to claim 64, wherein the movement of said pair of legs is a movement of opening/closing said legs in the horizontal direction.
67. An electronic toy according to claim 64, wherein slide prevention means is provided to the sole of one of said legs, and sliding means is provided to the sole of the other of said legs.
68. An electronic toy according to claim 64, wherein said leg structure comprises:
a waist portion frame provided with a pair of hip joint portions rotatable at least in one direction;
a pair of leg portions respectively connected to said pair of hip joint portions;
a pair of drive shafts in which one end portion thereof is mounted on said leg portion and the other end portion thereof extends inside said waist portion frame beyond the hip joint portion of said leg portion;
a link member for mutually connecting the respective other end portions of the respective drive shafts;
a cam mechanism interjacent between said other end portion of at least one of said drive shafts and said link member, and for changing the respective one end portions of said drive shafts to become wide or narrow; and
a motor built in one of said leg portions and for rotatably driving said one of drive shafts.
69. An electronic toy according to claim 68, wherein the other end portion of said drive shaft and said link member, or said cam and said link member, are connected via a spherical engagement member.
70. An electronic toy according to claim 68 further comprising a sliding means provided on one end portion of the other drive shaft among said pair of drive shafts so as to slide on the ground surface or floor surface.
71. An electronic toy according to claim 68, wherein the other of said leg portions comprises an above-knee portion connected rotatably in the cross direction to said hip joint portion, a below-knee portion connected rotatably in the cross direction with said above-knee portion, and a ground portion connected rotatably in the horizontal direction with one end portion of the other drive shaft among said pair of drive shafts;
and said electronic toy having a structure wherein a protrusion is formed at the lower face of said below-knee portion, an inclined face to which said protrusion contacts is formed on the upper face of said ground portion, said protrusion is pushed up pursuant to the opening/closing movement of said leg portions, and the connection of said above-knee portion and said below-knee portion bends thereby.
72. An electronic toy according to claim 67, wherein said sliding means is a roller.
73. An electronic toy according to claim 69, wherein the degree of opening/closing of said pair of legs is adjustable pursuant to the position in which said engagement member is mounted on said cam mechanism.
74. An electronic toy comprising a walking mechanism for performing bipedal locomotion by moving both legs back and forth, wherein the movement mechanism of one leg comprises:
a waist portion frame;
an above-knee portion connected rotatably to said waist portion frame;
a below-knee portion connected rotatably to said above-knee portion;
a ground portion connected rotatably to said below-knee portion;
a rotatably driven cam pulley provided to said waist portion frame;
a first cam provided to said cam pulley;
a second cam provided to said cam pulley;
a long member for vertically oscillating said ground portion with said first cam; and
a short member for oscillating said below-knee portion with said second cam in the cross direction.
75. An electronic toy comprising a walking mechanism for performing bipedal locomotion by moving both legs back and forth, wherein the movement mechanism of one leg comprises:
a waist portion frame;
an above-knee portion connected rotatably to said waist portion frame;
a below-knee portion connected rotatably to said above-knee portion;
a ground portion connected rotatably to said below-knee portion;
a rotatably driven cam provided to said waist portion frame;
a long member for vertically oscillating said ground portion with said cam; and
a short member for oscillating said below-knee portion with said cam in the cross direction.
76. An electronic toy according to claim 75, wherein said long member comprises a guide hole to be engaged with a guide member and a pushdown plate in contact with said cam.
77. An electronic toy comprising a walking mechanism for performing bipedal locomotion by moving both legs back and forth, wherein the movement mechanism of the respective legs comprises:
an above-knee portion connected rotatably to a waist portion frame;
a below-knee portion connected rotatably to said above-knee portion;
a ground portion connected rotatably to said below-knee portion; and
energization means for energizing the tip of said ground portion in the pushdown direction.
78. An electronic toy according to claim 77, wherein the size of said electronic toy is approximately 30 cm.
79. An electronic toy comprising a walking mechanism for performing bipedal locomotion by moving both legs back and forth, wherein the movement mechanism of the respective legs comprises:
an above-knee portion connected rotatably to a waist portion frame;
a below-knee portion connected rotatably to said above-knee portion;
a ground portion connected rotatably to said below-knee portion; and
oblique direction drive means provided to said ground portion for driving said electronic toy in an oblique direction relative to the advancing direction of said bipedal locomotion mechanism.
80. An electronic toy according to claim 79, wherein said oblique direction drive means comprises a rotatably driven drive roller or drive belt.
81. An electronic toy according to claim 80, wherein a plurality of said drive rollers or drive belts are provided.
82. An electronic toy according to claim 79, wherein said oblique direction drive means is respectively provided to the respective ground portions of both legs, and the respective drive directions of each of said oblique direction drive means exist on the circumference of an approximate curvature.
83. An electronic toy according to claim 79, wherein said oblique direction drive means is provided at the toe side of said ground portion, and a sliding roller is provided at the heel side of said ground portion.
84. An electronic toy comprising a walking mechanism for performing bipedal locomotion by moving both legs back and forth, provided with an oblique direction drive mechanism for driving said electronic toy in an oblique direction relative to the advancing direction of said bipedal locomotion mechanism.
85. An electronic toy according to claim 84, wherein said oblique direction drive mechanism comprises a rotatably driven drive roller or drive belt.
86. An electronic toy according to claim 85, wherein a plurality of said drive rollers or drive belts are provided.
87. An electronic toy according to claim 84, wherein said oblique direction drive mechanism is respectively provided to the respective sole portions of both feet, and the respective drive directions of each of said oblique direction drive means exist on the circumference of an approximate curvature.
88. An electronic toy according to claim 84, wherein said oblique direction drive mechanism is provided at the toe side of the sole portion of said feet, and a sliding roller is provided at the heel side of the sole portion of said feet.
89. An electronic toy according to claim 84 further comprising an energization means for pushing down the tip side of said feet in the sole direction.
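The cam-and-link leg-spreading mechanism recited in claims 66-73 above can be pictured, purely as an illustrative aid and not as part of the claimed subject matter, with a minimal kinematic sketch: a motor rotates one drive shaft, an eccentric cam on that shaft displaces the link member connecting the two shafts, and the displacement opens or closes the pair of legs in the horizontal direction. The cam eccentricity and maximum spread angle used below are invented placeholder values, not figures taken from the specification.

```python
import math

def leg_spread_angle(cam_angle_deg, cam_offset_mm=4.0, max_spread_deg=25.0):
    """Hypothetical model of the opening/closing movement: an eccentric cam on the
    motor-driven shaft displaces the link member, and that displacement is mapped
    onto the horizontal spread angle between the pair of legs."""
    # Link displacement produced by the cam over one revolution (0 .. cam_offset_mm .. 0).
    displacement_mm = cam_offset_mm * (1.0 - math.cos(math.radians(cam_angle_deg))) / 2.0
    # Map the displacement linearly onto the spread angle (closed .. fully open).
    return (displacement_mm / cam_offset_mm) * max_spread_deg

if __name__ == "__main__":
    # One full motor revolution: the legs open once and close again.
    for cam_deg in range(0, 361, 45):
        print(f"cam {cam_deg:3d} deg -> leg spread {leg_spread_angle(cam_deg):5.1f} deg")
```

With a slide-prevention sole on one foot and a roller on the other (claims 67 and 72), repeating this open/close cycle would let the toy pivot about the non-sliding foot; adjusting the engagement-member mounting position of claim 73 would correspond to changing max_spread_deg in the sketch.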
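For the single-cam-pulley leg of claims 74-76 above, a rough, non-authoritative way to visualise the gait is to treat the long member (which vertically oscillates the ground portion) and the short member (which oscillates the below-knee portion in the cross direction) as two motions derived from the same cam angle, with the two legs driven half a revolution apart. The sinusoidal profiles, the 6 mm lift, the 20-degree swing, and the 90-degree phase lead are assumptions made only for this sketch.

```python
import math

def leg_state(cam_angle_deg, lift_mm=6.0, swing_deg=20.0, leg_phase_deg=0.0):
    """Hypothetical gait model: one cam (or cam pulley) drives both members of a leg.
    Returns (foot_lift_mm, knee_swing_deg) for the given cam angle."""
    phase = math.radians(cam_angle_deg + leg_phase_deg)
    foot_lift_mm = lift_mm * max(0.0, math.sin(phase))   # long member: foot leaves the floor for half the cycle
    knee_swing_deg = swing_deg * math.cos(phase)          # short member: swing leads the lift by 90 degrees
    return foot_lift_mm, knee_swing_deg

if __name__ == "__main__":
    # Left and right legs driven from the same shaft, half a revolution out of phase.
    for cam_deg in range(0, 361, 60):
        left = leg_state(cam_deg, leg_phase_deg=0.0)
        right = leg_state(cam_deg, leg_phase_deg=180.0)
        print(f"cam {cam_deg:3d} deg  left(lift={left[0]:4.1f} mm, swing={left[1]:6.1f} deg)"
              f"  right(lift={right[0]:4.1f} mm, swing={right[1]:6.1f} deg)")
```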
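Claims 79-89 above describe toe-side rollers or belts whose drive directions are oblique to the walking direction, optionally laid out on a common arc across both soles. The following is a minimal sketch of the resulting planar motion under an assumed rigid-body model with forward as +x and left as +y: equal roller headings slide the toy obliquely to its advancing direction, while headings arranged on an arc additionally make it yaw. The roller speed, headings, and foot half-span are hypothetical values chosen only for illustration.

```python
import math

def oblique_roller(speed_mm_s, heading_deg):
    """Velocity vector imposed by one sole roller, heading_deg away from straight ahead."""
    a = math.radians(heading_deg)
    return (speed_mm_s * math.cos(a), speed_mm_s * math.sin(a))

def body_twist(v_left, v_right, half_span_mm=25.0):
    """Hypothetical planar rigid-body model: the feet sit +-half_span_mm from the
    centreline and are driven at the given velocities; returns the body's forward
    speed, lateral (oblique) speed, and yaw rate."""
    forward_mm_s = (v_left[0] + v_right[0]) / 2.0
    lateral_mm_s = (v_left[1] + v_right[1]) / 2.0
    yaw_rad_s = (v_right[0] - v_left[0]) / (2.0 * half_span_mm)
    return forward_mm_s, lateral_mm_s, yaw_rad_s

if __name__ == "__main__":
    # Both rollers aimed 30 deg to the left: the toy translates obliquely to its walking direction.
    print("equal headings:", body_twist(oblique_roller(40, 30), oblique_roller(40, 30)))
    # Headings laid out on a common arc (slightly different per foot): the toy also turns.
    print("arc headings  :", body_twist(oblique_roller(40, 25), oblique_roller(40, 35)))
```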
US09/985,909 2000-11-07 2001-11-06 Electronic toy Abandoned US20020081937A1 (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
JP2000339744 2000-11-07
JP2000-339744 2000-11-07
JP2001-9555 2001-01-17
JP2001009555 2001-01-17
JP2001-79425 2001-02-12
JP2001079425 2001-02-12
JP2001170342A JP2002307354A (en) 2000-11-07 2001-06-05 Electronic toy
JP2001-170342 2001-06-05

Publications (1)

Publication Number Publication Date
US20020081937A1 true US20020081937A1 (en) 2002-06-27

Family

ID=27481758

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/985,909 Abandoned US20020081937A1 (en) 2000-11-07 2001-11-06 Electronic toy

Country Status (2)

Country Link
US (1) US20020081937A1 (en)
JP (1) JP2002307354A (en)

Cited By (120)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030067486A1 (en) * 2001-10-06 2003-04-10 Samsung Electronics Co., Ltd. Apparatus and method for synthesizing emotions based on the human nervous system
US20030171846A1 (en) * 2001-11-28 2003-09-11 Murray Thomas J. Sensor and actuator abstraction and aggregation in a hardware abstraction layer for a robot
US20040141620A1 (en) * 2003-01-17 2004-07-22 Mattel, Inc. Audible sound detection control circuits for toys and other amusement devices
US20040210345A1 (en) * 2003-02-05 2004-10-21 Kuniaki Noda Buffer mechanism and recording and/or reproducing apparatus
US20050136901A1 (en) * 2003-12-22 2005-06-23 Younghee Jung System and method for assigning contact information to an external device for communication purposes using a mobile device
US20050233675A1 (en) * 2002-09-27 2005-10-20 Mattel, Inc. Animated multi-persona toy
US20060003664A1 (en) * 2004-06-09 2006-01-05 Ming-Hsiang Yeh Interactive toy
US20060009879A1 (en) * 2004-06-24 2006-01-12 Lynch James K Programming and diagnostic tool for a mobile robot
FR2872714A1 (en) * 2004-07-08 2006-01-13 Corolle TOY SENSITIVE TO HUMAN TOUCH
US20060277249A1 (en) * 2002-12-16 2006-12-07 Koninklijke Philips Electronics N.V. Robotic web browser
US20060286895A1 (en) * 2005-06-17 2006-12-21 Paul Thomson Talking doll
US20070072511A1 (en) * 2005-09-26 2007-03-29 M-Systems Flash Disk Pioneers Ltd. USB desktop toy
US20070166004A1 (en) * 2006-01-10 2007-07-19 Io.Tek Co., Ltd Robot system using menu selection card having printed menu codes and pictorial symbols
US20080162648A1 (en) * 2006-12-29 2008-07-03 Wang Kai Benson Leung Device and method of expressing information contained in a communication message sent through a network
US20080168143A1 (en) * 2007-01-05 2008-07-10 Allgates Semiconductor Inc. Control system of interactive toy set that responds to network real-time communication messages
WO2008096134A2 (en) * 2007-02-08 2008-08-14 Genie Toys Plc Toy in the form of a doll
US20080234862A1 (en) * 2004-05-19 2008-09-25 Nec Corporation User Preference Interring Apparatus, User Profile Interring Apparatus, and Robot
US20080287033A1 (en) * 2007-05-17 2008-11-20 Wendy Steinberg Personalizable Doll
US20080293324A1 (en) * 2007-05-22 2008-11-27 Winway Corporation Ltd. Toy doll system
US20090005167A1 (en) * 2004-11-29 2009-01-01 Juha Arrasvuori Mobile Gaming with External Devices in Single and Multiplayer Games
US20090007366A1 (en) * 2005-12-02 2009-01-08 Irobot Corporation Coverage Robot Mobility
EP2031481A1 (en) * 2007-08-29 2009-03-04 Industrial Technology Research Institut Information communication and interaction device and method for the same
FR2921008A1 (en) * 2007-09-19 2009-03-20 Aldebaran Robotics Soc Par Act IMPROVING A ROBOT THAT CAN REMOVE ITS LEGS FROM THE VERTICAL
US20090091470A1 (en) * 2007-08-29 2009-04-09 Industrial Technology Research Institute Information communication and interaction device and method for the same
KR100894569B1 (en) 2007-11-20 2009-04-24 전인자 Humanized doll with adaptive function
US20090305603A1 (en) * 2008-06-04 2009-12-10 Hon Hai Precision Industry Co., Ltd. Interactive toy system
US20100147879A1 (en) * 2007-06-18 2010-06-17 Heiner Ophardt Photochromic optically keyed dispenser
US20100185326A1 (en) * 2009-01-22 2010-07-22 Samsung Electronics Co., Ltd. Robot
WO2011014263A1 (en) * 2009-07-30 2011-02-03 While We're Apart, Llc Article for upholding personal affinity
US20110131165A1 (en) * 2009-12-02 2011-06-02 Phison Electronics Corp. Emotion engine, emotion engine system and electronic device control method
US20110151746A1 (en) * 2009-12-18 2011-06-23 Austin Rucker Interactive toy for audio output
US8000837B2 (en) 2004-10-05 2011-08-16 J&L Group International, Llc Programmable load forming system, components thereof, and methods of use
WO2012000927A1 (en) * 2010-07-02 2012-01-05 Aldebaran Robotics Humanoid game-playing robot, method and system for using said robot
US8239992B2 (en) 2007-05-09 2012-08-14 Irobot Corporation Compact autonomous coverage robot
US8253368B2 (en) 2004-01-28 2012-08-28 Irobot Corporation Debris sensor for cleaning apparatus
CN102772901A (en) * 2011-08-12 2012-11-14 万代股份有限公司 Action body toy and control method
US8368339B2 (en) 2001-01-24 2013-02-05 Irobot Corporation Robot confinement
US8374721B2 (en) 2005-12-02 2013-02-12 Irobot Corporation Robot system
US8380350B2 (en) 2005-12-02 2013-02-19 Irobot Corporation Autonomous coverage robot navigation system
US8382906B2 (en) 2005-02-18 2013-02-26 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8386081B2 (en) 2002-09-13 2013-02-26 Irobot Corporation Navigational control system for a robotic device
US8387193B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8390251B2 (en) 2004-01-21 2013-03-05 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8396592B2 (en) 2001-06-12 2013-03-12 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US20130073087A1 (en) * 2011-09-20 2013-03-21 Disney Enterprises, Inc. System for controlling robotic characters to enhance photographic results
US8412377B2 (en) 2000-01-24 2013-04-02 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8417383B2 (en) 2006-05-31 2013-04-09 Irobot Corporation Detecting robot stasis
US8418303B2 (en) 2006-05-19 2013-04-16 Irobot Corporation Cleaning robot roller processing
US8428778B2 (en) 2002-09-13 2013-04-23 Irobot Corporation Navigational control system for a robotic device
US20130122777A1 (en) * 2011-08-04 2013-05-16 Chris Scheppegrell Communications and monitoring using a toy
US8463438B2 (en) 2001-06-12 2013-06-11 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US20130158748A1 (en) * 2010-09-03 2013-06-20 Aldebaran Robotics Mobile robot
US20130165014A1 (en) * 2011-12-26 2013-06-27 Sam Yang Interactive electronic toy
US8474090B2 (en) 2002-01-03 2013-07-02 Irobot Corporation Autonomous floor-cleaning robot
US8515578B2 (en) 2002-09-13 2013-08-20 Irobot Corporation Navigational control system for a robotic device
US8584305B2 (en) 2005-12-02 2013-11-19 Irobot Corporation Modular robot
US8594840B1 (en) 2004-07-07 2013-11-26 Irobot Corporation Celestial navigation system for an autonomous robot
USD700250S1 (en) 2011-07-21 2014-02-25 Mattel, Inc. Toy vehicle
US20140099613A1 (en) * 2012-10-02 2014-04-10 Gavriel Yaacov Krauss Methods circuits, devices and systems for personality interpretation and expression
US8698965B2 (en) 2009-01-22 2014-04-15 Samsung Electronics Co., Ltd. Robot
USD703275S1 (en) 2011-07-21 2014-04-22 Mattel, Inc. Toy vehicle housing
US8739355B2 (en) 2005-02-18 2014-06-03 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8788092B2 (en) 2000-01-24 2014-07-22 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US20140220855A1 (en) * 2010-10-19 2014-08-07 I-Star Entertainment Llc Illumination display and communication device and method
US8800107B2 (en) 2010-02-16 2014-08-12 Irobot Corporation Vacuum brush
US20140303982A1 (en) * 2013-04-09 2014-10-09 Yally Inc. Phonetic conversation method and device using wired and wiress communication
US8881339B2 (en) 2011-04-29 2014-11-11 Irobot Corporation Robotic vacuum
US8930023B2 (en) 2009-11-06 2015-01-06 Irobot Corporation Localization by learning of wave-signal distributions
US8972052B2 (en) 2004-07-07 2015-03-03 Irobot Corporation Celestial navigation system for an autonomous vehicle
CN104815445A (en) * 2014-01-22 2015-08-05 广东奥飞动漫文化股份有限公司 Inductive control system for electrical scooter
US20150224640A1 (en) * 2005-09-30 2015-08-13 Irobot Corporation Companion robot for personal interaction
US20150375115A1 (en) * 2014-06-30 2015-12-31 Microsoft Corporation Interacting with a story through physical pieces
US9320398B2 (en) 2005-12-02 2016-04-26 Irobot Corporation Autonomous coverage robots
US20160136534A1 (en) * 2014-11-13 2016-05-19 Robert A. EARL-OCRAN Programmable Interactive Toy
US9364950B2 (en) * 2014-03-13 2016-06-14 Brain Corporation Trainable modular robotic methods
US9426946B2 (en) 2014-12-02 2016-08-30 Brain Corporation Computerized learning landscaping apparatus and methods
US20160261313A1 (en) * 2015-03-06 2016-09-08 Nxp B.V. Toy, method for controlling a toy, and computer program product
US9446515B1 (en) 2012-08-31 2016-09-20 Brain Corporation Apparatus and methods for controlling attention of a robot
US9533413B2 (en) 2014-03-13 2017-01-03 Brain Corporation Trainable modular robotic apparatus and methods
US20170169203A1 (en) * 2015-12-14 2017-06-15 Casio Computer Co., Ltd. Robot-human interactive device, robot, interaction method, and recording medium storing program
US9840003B2 (en) 2015-06-24 2017-12-12 Brain Corporation Apparatus and methods for safe navigation of robotic devices
CN107538497A (en) * 2016-06-23 2018-01-05 卡西欧计算机株式会社 Robot, robot control system, robot control method and storage medium
US9919231B1 (en) * 2017-07-10 2018-03-20 Peter Chin Cuboid or spherical head figurine
US20180147728A1 (en) * 2016-11-30 2018-05-31 Universal City Studios Llc Animated character head systems and methods
US9987743B2 (en) 2014-03-13 2018-06-05 Brain Corporation Trainable modular robotic apparatus and methods
WO2018127863A3 (en) * 2018-03-15 2018-09-20 Ingeniería Aplicada, S.A. System for monitoring power consumption
EP3392003A1 (en) * 2017-04-19 2018-10-24 Panasonic Intellectual Property Management Co., Ltd. Interaction apparatus, interaction method, non-transitory recording medium, and robot
EP3392004A1 (en) * 2017-04-19 2018-10-24 Panasonic Intellectual Property Management Co., Ltd. Interaction apparatus, interaction method, non-transitory recording medium, and robot
EP3392005A1 (en) * 2017-04-19 2018-10-24 Panasonic Intellectual Property Management Co., Ltd. Interaction apparatus, interaction method, non-transitory recording medium, and robot
US20190030424A1 (en) * 2016-07-05 2019-01-31 Fujian Blue Hat Interactive Entertainment Technology Ltd. Interactive system based on light intensity recognition
US10242666B2 (en) * 2014-04-17 2019-03-26 Softbank Robotics Europe Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method
CN109732612A (en) * 2019-03-12 2019-05-10 苏州工业职业技术学院 Intelligent interactive toy robot control system based on STM32 microcontroller
US10307911B2 (en) * 2017-08-30 2019-06-04 Panasonic Inttellectual Property Management Co., Ltd. Robot
US10369477B2 (en) 2014-10-08 2019-08-06 Microsoft Technology Licensing, Llc Management of resources within a virtual world
CN110152314A (en) * 2018-02-13 2019-08-23 卡西欧计算机株式会社 Session output system, session export server, session output method and storage medium
CN110268468A (en) * 2017-02-24 2019-09-20 索尼移动通信株式会社 Information processing equipment, information processing method and computer program
US20190308327A1 (en) * 2018-04-06 2019-10-10 Anki, Inc. Condition-Based Robot Audio Techniques
US10500497B2 (en) 2014-10-08 2019-12-10 Microsoft Corporation Transfer of attributes between generations of characters
RU2708697C2 (en) * 2012-12-12 2019-12-11 Арам Акопян Exercise demonstration devices and systems
US10512384B2 (en) 2016-12-15 2019-12-24 Irobot Corporation Cleaning roller for cleaning robots
US10595624B2 (en) 2017-07-25 2020-03-24 Irobot Corporation Cleaning roller for cleaning robots
US20200164519A1 (en) * 2018-11-22 2020-05-28 Lg Electronics Inc. Motion control apparatus of action robot and motion generation and control system including the same
US10707457B2 (en) * 2017-10-26 2020-07-07 UBTECH Robotics Corp. Battery case and robot having the same
USD916928S1 (en) * 2018-12-20 2021-04-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD916925S1 (en) * 2018-12-20 2021-04-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD916926S1 (en) * 2018-12-20 2021-04-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD916927S1 (en) * 2018-12-20 2021-04-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
CN112717433A (en) * 2020-11-16 2021-04-30 北京六十六号互动科技有限公司 Toy control device
US11000952B2 (en) 2017-06-23 2021-05-11 Casio Computer Co., Ltd. More endearing robot, method of controlling the same, and non-transitory recording medium
US11099807B2 (en) 2019-02-14 2021-08-24 Sharp Kabushiki Kaisha Electronic apparatus, control device, control method, and recording medium
US20210260773A1 (en) * 2016-06-15 2021-08-26 Irobot Corporation Systems and methods to control an autonomous mobile robot
US11103800B1 (en) * 2017-02-17 2021-08-31 Hasbro, Inc. Toy robot with programmable and movable appendages
US11109727B2 (en) 2019-02-28 2021-09-07 Irobot Corporation Cleaning rollers for cleaning robots
US11192257B2 (en) 2016-04-08 2021-12-07 Groove X, Inc. Autonomously acting robot exhibiting shyness
US20210402589A1 (en) * 2019-06-17 2021-12-30 Lg Electronics Inc. Artificial intelligence (ai) robot and control method thereof
US11285611B2 (en) * 2018-10-18 2022-03-29 Lg Electronics Inc. Robot and method of controlling thereof
US11471020B2 (en) 2011-04-29 2022-10-18 Irobot Corporation Robotic vacuum cleaning system
US20230018066A1 (en) * 2020-11-20 2023-01-19 Aurora World Corporation Apparatus and system for growth type smart toy
US11579617B2 (en) 2016-07-11 2023-02-14 Groove X, Inc. Autonomously acting robot whose activity amount is controlled
US11831955B2 (en) 2010-07-12 2023-11-28 Time Warner Cable Enterprises Llc Apparatus and methods for content management and account linking across multiple content delivery networks

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004318862A (en) * 2003-03-28 2004-11-11 Sony Corp Information providing device and method, and information providing system
JP3958253B2 (en) * 2003-06-09 2007-08-15 株式会社シーエーアイメディア共同開発 Dialog system
JP4247149B2 (en) * 2004-03-30 2009-04-02 株式会社国際電気通信基礎技術研究所 robot
JP2005342862A (en) * 2004-06-04 2005-12-15 Nec Corp Robot
JP4582449B2 (en) * 2005-01-31 2010-11-17 日本ビクター株式会社 Communication device
JP5162852B2 (en) * 2006-07-18 2013-03-13 株式会社国際電気通信基礎技術研究所 Android control system
JP2008090013A (en) * 2006-10-02 2008-04-17 Sony Corp Robot device, music output method, and music output program
JP5187563B2 (en) * 2007-01-22 2013-04-24 株式会社ゼットエムピー Sound reproduction robot
WO2010007336A1 (en) * 2008-07-18 2010-01-21 Steven Lipman Interacting toys
KR101399601B1 (en) * 2011-10-27 2014-05-27 허헌 Emotion expresion robot using user data
JP5753865B2 (en) * 2013-02-28 2015-07-22 東芝テック株式会社 Assist robot and its control program
JP6208102B2 (en) * 2014-09-04 2017-10-04 富士ソフト株式会社 Humanoid robot
JP2016081190A (en) * 2014-10-14 2016-05-16 株式会社リラク Physical condition predicting system and program
JP6530906B2 (en) * 2014-11-28 2019-06-12 マッスル株式会社 Partner robot and its remote control system
WO2016104193A1 (en) * 2014-12-26 2016-06-30 シャープ株式会社 Response determination device, speech interaction system, method for controlling response determination device, and speech interaction device
JP6583765B2 (en) * 2015-01-16 2019-10-02 国立大学法人大阪大学 Agent dialogue system and program
US10713006B2 (en) 2016-07-19 2020-07-14 Gatebox Inc. Image display apparatus, topic selection method, topic selection program, image display method, and image display program
JP2018117821A (en) * 2017-01-25 2018-08-02 群馬電機株式会社 Stuffed animal for welfare nursing
US11709476B2 (en) 2017-10-30 2023-07-25 Sony Corporation Information processing apparatus, information processing method and program
CN108214510A (en) * 2018-01-17 2018-06-29 深圳市共进电子股份有限公司 Children accompany robot
JP2019123055A (en) 2018-01-18 2019-07-25 株式会社ユピテル apparatus
JP7130201B2 (en) * 2018-01-18 2022-09-05 株式会社ユピテル Equipment and programs, etc.
KR102063389B1 (en) * 2018-03-16 2020-02-11 숙명여자대학교산학협력단 Character display device based the artificial intelligent and the display method thereof
JP6800183B2 (en) * 2018-07-13 2020-12-16 本田技研工業株式会社 Communication device
JP6707701B1 (en) * 2019-09-03 2020-06-10 株式会社タカラトミー Action robot toys
JP7287411B2 (en) * 2021-03-16 2023-06-06 カシオ計算機株式会社 Equipment control device, equipment control method and program
JP7102575B1 (en) * 2021-04-14 2022-07-19 株式会社バンダイ Directing output toys
JP2023092204A (en) 2021-12-21 2023-07-03 カシオ計算機株式会社 robot

Cited By (267)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8788092B2 (en) 2000-01-24 2014-07-22 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8565920B2 (en) 2000-01-24 2013-10-22 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8761935B2 (en) 2000-01-24 2014-06-24 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US9446521B2 (en) 2000-01-24 2016-09-20 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8412377B2 (en) 2000-01-24 2013-04-02 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8478442B2 (en) 2000-01-24 2013-07-02 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US9144361B2 (en) 2000-04-04 2015-09-29 Irobot Corporation Debris sensor for cleaning apparatus
US9582005B2 (en) 2001-01-24 2017-02-28 Irobot Corporation Robot confinement
US9038233B2 (en) 2001-01-24 2015-05-26 Irobot Corporation Autonomous floor-cleaning robot
US9167946B2 (en) 2001-01-24 2015-10-27 Irobot Corporation Autonomous floor cleaning robot
US9622635B2 (en) 2001-01-24 2017-04-18 Irobot Corporation Autonomous floor-cleaning robot
US8659256B2 (en) 2001-01-24 2014-02-25 Irobot Corporation Robot confinement
US8659255B2 (en) 2001-01-24 2014-02-25 Irobot Corporation Robot confinement
US8368339B2 (en) 2001-01-24 2013-02-05 Irobot Corporation Robot confinement
US8686679B2 (en) 2001-01-24 2014-04-01 Irobot Corporation Robot confinement
US8396592B2 (en) 2001-06-12 2013-03-12 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US8463438B2 (en) 2001-06-12 2013-06-11 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US9104204B2 (en) 2001-06-12 2015-08-11 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US7333969B2 (en) * 2001-10-06 2008-02-19 Samsung Electronics Co., Ltd. Apparatus and method for synthesizing emotions based on the human nervous system
US20030067486A1 (en) * 2001-10-06 2003-04-10 Samsung Electronics Co., Ltd. Apparatus and method for synthesizing emotions based on the human nervous system
US20030171846A1 (en) * 2001-11-28 2003-09-11 Murray Thomas J. Sensor and actuator abstraction and aggregation in a hardware abstraction layer for a robot
US7302312B2 (en) 2001-11-28 2007-11-27 Evolution Robotics, Inc. Hardware abstraction layer (HAL) for a robot
US7076336B2 (en) 2001-11-28 2006-07-11 Evolution Robotics, Inc. Hardware abstraction layer (HAL) for a robot
US20080071423A1 (en) * 2001-11-28 2008-03-20 Evolution Robotics, Inc. Hardware abstraction layer (hal) for a robot
US6889118B2 (en) * 2001-11-28 2005-05-03 Evolution Robotics, Inc. Hardware abstraction layer for a robot
US7925381B2 (en) 2001-11-28 2011-04-12 Evolution Robotics, Inc. Hardware abstraction layer (HAL) for a robot
US8996168B2 (en) 2001-11-28 2015-03-31 Irobot Corporation Hardware abstraction layer (HAL) for a robot
US20070050088A1 (en) * 2001-11-28 2007-03-01 Murray Thomas J Iv Hardware abstraction layer (HAL) for a robot
US20050021186A1 (en) * 2001-11-28 2005-01-27 Murray Thomas J. Hardware abstraction layer (HAL) for a robot
US8516651B2 (en) 2002-01-03 2013-08-27 Irobot Corporation Autonomous floor-cleaning robot
US8474090B2 (en) 2002-01-03 2013-07-02 Irobot Corporation Autonomous floor-cleaning robot
US9128486B2 (en) 2002-01-24 2015-09-08 Irobot Corporation Navigational control system for a robotic device
US8793020B2 (en) 2002-09-13 2014-07-29 Irobot Corporation Navigational control system for a robotic device
US8386081B2 (en) 2002-09-13 2013-02-26 Irobot Corporation Navigational control system for a robotic device
US8515578B2 (en) 2002-09-13 2013-08-20 Irobot Corporation Navigational control system for a robotic device
US8781626B2 (en) 2002-09-13 2014-07-15 Irobot Corporation Navigational control system for a robotic device
US8428778B2 (en) 2002-09-13 2013-04-23 Irobot Corporation Navigational control system for a robotic device
US9949608B2 (en) 2002-09-13 2018-04-24 Irobot Corporation Navigational control system for a robotic device
US20050233675A1 (en) * 2002-09-27 2005-10-20 Mattel, Inc. Animated multi-persona toy
US20060277249A1 (en) * 2002-12-16 2006-12-07 Koninklijke Philips Electronics N.V. Robotic web browser
US7120257B2 (en) 2003-01-17 2006-10-10 Mattel, Inc. Audible sound detection control circuits for toys and other amusement devices
US20040141620A1 (en) * 2003-01-17 2004-07-22 Mattel, Inc. Audible sound detection control circuits for toys and other amusement devices
US20040210345A1 (en) * 2003-02-05 2004-10-21 Kuniaki Noda Buffer mechanism and recording and/or reproducing apparatus
US7363108B2 (en) * 2003-02-05 2008-04-22 Sony Corporation Robot and control method for controlling robot expressions
US20050136901A1 (en) * 2003-12-22 2005-06-23 Younghee Jung System and method for assigning contact information to an external device for communication purposes using a mobile device
US7177597B2 (en) 2003-12-22 2007-02-13 Nokia Corporation System and method for assigning contact information to an external device for communication purposes using a mobile device
WO2005062700A2 (en) 2003-12-22 2005-07-14 Nokia Corporation A system and method for assigning contact information to an external device for communication purposes using a mobile device
US8854001B2 (en) 2004-01-21 2014-10-07 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US9215957B2 (en) 2004-01-21 2015-12-22 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8390251B2 (en) 2004-01-21 2013-03-05 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8461803B2 (en) 2004-01-21 2013-06-11 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8749196B2 (en) 2004-01-21 2014-06-10 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8253368B2 (en) 2004-01-28 2012-08-28 Irobot Corporation Debris sensor for cleaning apparatus
US8456125B2 (en) 2004-01-28 2013-06-04 Irobot Corporation Debris sensor for cleaning apparatus
US8378613B2 (en) 2004-01-28 2013-02-19 Irobot Corporation Debris sensor for cleaning apparatus
US8598829B2 (en) 2004-01-28 2013-12-03 Irobot Corporation Debris sensor for cleaning apparatus
US20080234862A1 (en) * 2004-05-19 2008-09-25 Nec Corporation User Preference Interring Apparatus, User Profile Interring Apparatus, and Robot
US8340816B2 (en) * 2004-05-19 2012-12-25 Nec Corporation User preference inferring apparatus, user profile inferring apparatus, and robot
US20060003664A1 (en) * 2004-06-09 2006-01-05 Ming-Hsiang Yeh Interactive toy
US20060009879A1 (en) * 2004-06-24 2006-01-12 Lynch James K Programming and diagnostic tool for a mobile robot
US9008835B2 (en) 2004-06-24 2015-04-14 Irobot Corporation Remote control scheduler and method for autonomous robotic device
US9486924B2 (en) 2004-06-24 2016-11-08 Irobot Corporation Remote control scheduler and method for autonomous robotic device
US8634956B1 (en) 2004-07-07 2014-01-21 Irobot Corporation Celestial navigation system for an autonomous robot
US9229454B1 (en) 2004-07-07 2016-01-05 Irobot Corporation Autonomous mobile robot system
US8874264B1 (en) 2004-07-07 2014-10-28 Irobot Corporation Celestial navigation system for an autonomous robot
US9223749B2 (en) 2004-07-07 2015-12-29 Irobot Corporation Celestial navigation system for an autonomous vehicle
US8972052B2 (en) 2004-07-07 2015-03-03 Irobot Corporation Celestial navigation system for an autonomous vehicle
US8594840B1 (en) 2004-07-07 2013-11-26 Irobot Corporation Celestial navigation system for an autonomous robot
WO2006005635A1 (en) * 2004-07-08 2006-01-19 Corolle A toy sensitive to human touch
US20080305709A1 (en) * 2004-07-08 2008-12-11 Corolle Toy Sensitive to Human Touch
FR2872714A1 (en) * 2004-07-08 2006-01-13 Corolle TOY SENSITIVE TO HUMAN TOUCH
US8000837B2 (en) 2004-10-05 2011-08-16 J&L Group International, Llc Programmable load forming system, components thereof, and methods of use
US20090005167A1 (en) * 2004-11-29 2009-01-01 Juha Arrasvuori Mobile Gaming with External Devices in Single and Multiplayer Games
US8739355B2 (en) 2005-02-18 2014-06-03 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8670866B2 (en) 2005-02-18 2014-03-11 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8392021B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8382906B2 (en) 2005-02-18 2013-02-26 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US9445702B2 (en) 2005-02-18 2016-09-20 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8966707B2 (en) 2005-02-18 2015-03-03 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8985127B2 (en) 2005-02-18 2015-03-24 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8855813B2 (en) 2005-02-18 2014-10-07 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8774966B2 (en) 2005-02-18 2014-07-08 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US10470629B2 (en) 2005-02-18 2019-11-12 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8387193B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8782848B2 (en) 2005-02-18 2014-07-22 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US20060286895A1 (en) * 2005-06-17 2006-12-21 Paul Thomson Talking doll
US20070072511A1 (en) * 2005-09-26 2007-03-29 M-Systems Flash Disk Pioneers Ltd. USB desktop toy
US20150224640A1 (en) * 2005-09-30 2015-08-13 Irobot Corporation Companion robot for personal interaction
US9446510B2 (en) * 2005-09-30 2016-09-20 Irobot Corporation Companion robot for personal interaction
US9392920B2 (en) * 2005-12-02 2016-07-19 Irobot Corporation Robot system
US8954192B2 (en) 2005-12-02 2015-02-10 Irobot Corporation Navigating autonomous coverage robots
US8584305B2 (en) 2005-12-02 2013-11-19 Irobot Corporation Modular robot
US10182695B2 (en) 2005-12-02 2019-01-22 Irobot Corporation Robot system
US9144360B2 (en) 2005-12-02 2015-09-29 Irobot Corporation Autonomous coverage robot navigation system
US8600553B2 (en) 2005-12-02 2013-12-03 Irobot Corporation Coverage robot mobility
US8978196B2 (en) 2005-12-02 2015-03-17 Irobot Corporation Coverage robot mobility
US8606401B2 (en) 2005-12-02 2013-12-10 Irobot Corporation Autonomous coverage robot navigation system
US20090007366A1 (en) * 2005-12-02 2009-01-08 Irobot Corporation Coverage Robot Mobility
US9599990B2 (en) 2005-12-02 2017-03-21 Irobot Corporation Robot system
US8950038B2 (en) 2005-12-02 2015-02-10 Irobot Corporation Modular robot
US8374721B2 (en) 2005-12-02 2013-02-12 Irobot Corporation Robot system
US9901236B2 (en) 2005-12-02 2018-02-27 Irobot Corporation Robot system
US8661605B2 (en) 2005-12-02 2014-03-04 Irobot Corporation Coverage robot mobility
US9149170B2 (en) 2005-12-02 2015-10-06 Irobot Corporation Navigating autonomous coverage robots
US20140249671A1 (en) * 2005-12-02 2014-09-04 Irobot Corporation Robot System
US9320398B2 (en) 2005-12-02 2016-04-26 Irobot Corporation Autonomous coverage robots
US8380350B2 (en) 2005-12-02 2013-02-19 Irobot Corporation Autonomous coverage robot navigation system
US10524629B2 (en) 2005-12-02 2020-01-07 Irobot Corporation Modular Robot
US8761931B2 (en) 2005-12-02 2014-06-24 Irobot Corporation Robot system
US20070166004A1 (en) * 2006-01-10 2007-07-19 Io.Tek Co., Ltd Robot system using menu selection card having printed menu codes and pictorial symbols
US9955841B2 (en) 2006-05-19 2018-05-01 Irobot Corporation Removing debris from cleaning robots
US8528157B2 (en) 2006-05-19 2013-09-10 Irobot Corporation Coverage robots and associated cleaning bins
US10244915B2 (en) 2006-05-19 2019-04-02 Irobot Corporation Coverage robots and associated cleaning bins
US8418303B2 (en) 2006-05-19 2013-04-16 Irobot Corporation Cleaning robot roller processing
US9492048B2 (en) 2006-05-19 2016-11-15 Irobot Corporation Removing debris from cleaning robots
US8572799B2 (en) 2006-05-19 2013-11-05 Irobot Corporation Removing debris from cleaning robots
US8417383B2 (en) 2006-05-31 2013-04-09 Irobot Corporation Detecting robot stasis
US9317038B2 (en) 2006-05-31 2016-04-19 Irobot Corporation Detecting robot stasis
US20080162648A1 (en) * 2006-12-29 2008-07-03 Wang Kai Benson Leung Device and method of expressing information contained in a communication message sent through a network
US20080168143A1 (en) * 2007-01-05 2008-07-10 Allgates Semiconductor Inc. Control system of interactive toy set that responds to network real-time communication messages
WO2008096134A3 (en) * 2007-02-08 2008-11-06 Genie Toys Plc Toy in the form of a doll
WO2008096134A2 (en) * 2007-02-08 2008-08-14 Genie Toys Plc Toy in the form of a doll
US8839477B2 (en) 2007-05-09 2014-09-23 Irobot Corporation Compact autonomous coverage robot
US8239992B2 (en) 2007-05-09 2012-08-14 Irobot Corporation Compact autonomous coverage robot
US11498438B2 (en) 2007-05-09 2022-11-15 Irobot Corporation Autonomous coverage robot
US9480381B2 (en) 2007-05-09 2016-11-01 Irobot Corporation Compact autonomous coverage robot
US11072250B2 (en) 2007-05-09 2021-07-27 Irobot Corporation Autonomous coverage robot sensing
US10299652B2 (en) 2007-05-09 2019-05-28 Irobot Corporation Autonomous coverage robot
US10070764B2 (en) 2007-05-09 2018-09-11 Irobot Corporation Compact autonomous coverage robot
US8726454B2 (en) 2007-05-09 2014-05-20 Irobot Corporation Autonomous coverage robot
US8438695B2 (en) 2007-05-09 2013-05-14 Irobot Corporation Autonomous coverage robot sensing
US20080287033A1 (en) * 2007-05-17 2008-11-20 Wendy Steinberg Personalizable Doll
US20080293324A1 (en) * 2007-05-22 2008-11-27 Winway Corporation Ltd. Toy doll system
US20100147879A1 (en) * 2007-06-18 2010-06-17 Heiner Ophardt Photochromic optically keyed dispenser
EP2031481A1 (en) * 2007-08-29 2009-03-04 Industrial Technology Research Institut Information communication and interaction device and method for the same
US20090058673A1 (en) * 2007-08-29 2009-03-05 Industrial Technology Research Institute Information communication and interaction device and method for the same
TWI421767B (en) * 2007-08-29 2014-01-01 Ind Tech Res Inst Device for information communication and interaction and method for the same
US20090091470A1 (en) * 2007-08-29 2009-04-09 Industrial Technology Research Institute Information communication and interaction device and method for the same
FR2921008A1 (en) * 2007-09-19 2009-03-20 Aldebaran Robotics Soc Par Act IMPROVING A ROBOT THAT CAN REMOVE ITS LEGS FROM THE VERTICAL
WO2009037149A1 (en) * 2007-09-19 2009-03-26 Aldebaran Robotics Improvement in a robot capable of spreading legs relative to the vertical
KR100894569B1 (en) 2007-11-20 2009-04-24 전인자 Humanized doll with adaptive function
US20090305603A1 (en) * 2008-06-04 2009-12-10 Hon Hai Precision Industry Co., Ltd. Interactive toy system
US20100185326A1 (en) * 2009-01-22 2010-07-22 Samsung Electronics Co., Ltd. Robot
US8698965B2 (en) 2009-01-22 2014-04-15 Samsung Electronics Co., Ltd. Robot
US8588978B2 (en) * 2009-01-22 2013-11-19 Samsung Electronics Co., Ltd. Robot
WO2011014263A1 (en) * 2009-07-30 2011-02-03 While We're Apart, Llc Article for upholding personal affinity
US8930023B2 (en) 2009-11-06 2015-01-06 Irobot Corporation Localization by learning of wave-signal distributions
US20110131165A1 (en) * 2009-12-02 2011-06-02 Phison Electronics Corp. Emotion engine, emotion engine system and electronic device control method
TWI413938B (en) * 2009-12-02 2013-11-01 Phison Electronics Corp Emotion engine, emotion engine system and electronic device control method
US8306929B2 (en) * 2009-12-02 2012-11-06 Phison Electronics Corp. Emotion engine, emotion engine system and electronic device control method
US8515092B2 (en) * 2009-12-18 2013-08-20 Mattel, Inc. Interactive toy for audio output
US20110151746A1 (en) * 2009-12-18 2011-06-23 Austin Rucker Interactive toy for audio output
US8800107B2 (en) 2010-02-16 2014-08-12 Irobot Corporation Vacuum brush
US11058271B2 (en) 2010-02-16 2021-07-13 Irobot Corporation Vacuum brush
US10314449B2 (en) 2010-02-16 2019-06-11 Irobot Corporation Vacuum brush
WO2012000927A1 (en) * 2010-07-02 2012-01-05 Aldebaran Robotics Humanoid game-playing robot, method and system for using said robot
US20130103196A1 (en) * 2010-07-02 2013-04-25 Aldebaran Robotics Humanoid game-playing robot, method and system for using said robot
CN103079657A (en) * 2010-07-02 2013-05-01 奥尔德巴伦机器人公司 Humanoid game-playing robot, method and system for using said robot
US9950421B2 (en) * 2010-07-02 2018-04-24 Softbank Robotics Europe Humanoid game-playing robot, method and system for using said robot
FR2962048A1 (en) * 2010-07-02 2012-01-06 Aldebaran Robotics S A HUMANOID ROBOT PLAYER, METHOD AND SYSTEM FOR USING THE SAME
US11831955B2 (en) 2010-07-12 2023-11-28 Time Warner Cable Enterprises Llc Apparatus and methods for content management and account linking across multiple content delivery networks
US9400504B2 (en) * 2010-09-03 2016-07-26 Aldebaran Robotics Mobile robot
US20130158748A1 (en) * 2010-09-03 2013-06-20 Aldebaran Robotics Mobile robot
US20140220855A1 (en) * 2010-10-19 2014-08-07 I-Star Entertainment Llc Illumination display and communication device and method
US8955192B2 (en) 2011-04-29 2015-02-17 Irobot Corporation Robotic vacuum cleaning system
US8910342B2 (en) 2011-04-29 2014-12-16 Irobot Corporation Robotic vacuum cleaning system
US9220386B2 (en) 2011-04-29 2015-12-29 Irobot Corporation Robotic vacuum
US10433696B2 (en) 2011-04-29 2019-10-08 Irobot Corporation Robotic vacuum cleaning system
US8881339B2 (en) 2011-04-29 2014-11-11 Irobot Corporation Robotic vacuum
US11471020B2 (en) 2011-04-29 2022-10-18 Irobot Corporation Robotic vacuum cleaning system
US9320400B2 (en) 2011-04-29 2016-04-26 Irobot Corporation Robotic vacuum cleaning system
US9675224B2 (en) 2011-04-29 2017-06-13 Irobot Corporation Robotic vacuum cleaning system
USD709139S1 (en) 2011-07-21 2014-07-15 Mattel, Inc. Wheel
USD700250S1 (en) 2011-07-21 2014-02-25 Mattel, Inc. Toy vehicle
USD701578S1 (en) 2011-07-21 2014-03-25 Mattel, Inc. Toy vehicle
USD703766S1 (en) 2011-07-21 2014-04-29 Mattel, Inc. Toy vehicle housing
USD703275S1 (en) 2011-07-21 2014-04-22 Mattel, Inc. Toy vehicle housing
US20130122777A1 (en) * 2011-08-04 2013-05-16 Chris Scheppegrell Communications and monitoring using a toy
EP2556869A1 (en) * 2011-08-12 2013-02-13 Bandai Co., Ltd. Operable toy
US20130040530A1 (en) * 2011-08-12 2013-02-14 Bandai Co., Ltd. Operable toy
CN102772901A (en) * 2011-08-12 2012-11-14 万代股份有限公司 Action body toy and control method
US20130073087A1 (en) * 2011-09-20 2013-03-21 Disney Enterprises, Inc. System for controlling robotic characters to enhance photographic results
US9656392B2 (en) * 2011-09-20 2017-05-23 Disney Enterprises, Inc. System for controlling robotic characters to enhance photographic results
US20130165014A1 (en) * 2011-12-26 2013-06-27 Sam Yang Interactive electronic toy
US8808052B2 (en) * 2011-12-26 2014-08-19 Sap Link Technology Corp. Interactive electronic toy
US10213921B2 (en) 2012-08-31 2019-02-26 Gopro, Inc. Apparatus and methods for controlling attention of a robot
US11867599B2 (en) 2012-08-31 2024-01-09 Gopro, Inc. Apparatus and methods for controlling attention of a robot
US9446515B1 (en) 2012-08-31 2016-09-20 Brain Corporation Apparatus and methods for controlling attention of a robot
US11360003B2 (en) 2012-08-31 2022-06-14 Gopro, Inc. Apparatus and methods for controlling attention of a robot
US10545074B2 (en) 2012-08-31 2020-01-28 Gopro, Inc. Apparatus and methods for controlling attention of a robot
US20140099613A1 (en) * 2012-10-02 2014-04-10 Gavriel Yaacov Krauss Methods circuits, devices and systems for personality interpretation and expression
US9569976B2 (en) * 2012-10-02 2017-02-14 Gavriel Yaacov Krauss Methods circuits, devices and systems for personality interpretation and expression
RU2708697C2 (en) * 2012-12-12 2019-12-11 Арам Акопян Exercise demonstration devices and systems
US20140303982A1 (en) * 2013-04-09 2014-10-09 Yally Inc. Phonetic conversation method and device using wired and wiress communication
US9636598B2 (en) * 2014-01-22 2017-05-02 Guangdong Alpha Animation & Culture Co., Ltd. Sensing control system for electric toy
AU2014340680B2 (en) * 2014-01-22 2016-10-06 Guangdong Alpha Animation & Culture Co. , Ltd. A sensing control system for electric toy
US20160045836A1 (en) * 2014-01-22 2016-02-18 Guangdong Alpha Animation & Culture Co., Ltd. A Sensing Control System For Electric Toy
CN104815445A (en) * 2014-01-22 2015-08-05 广东奥飞动漫文化股份有限公司 Inductive control system for electrical scooter
US9987743B2 (en) 2014-03-13 2018-06-05 Brain Corporation Trainable modular robotic apparatus and methods
US10391628B2 (en) 2014-03-13 2019-08-27 Brain Corporation Trainable modular robotic apparatus and methods
US9364950B2 (en) * 2014-03-13 2016-06-14 Brain Corporation Trainable modular robotic methods
US10166675B2 (en) 2014-03-13 2019-01-01 Brain Corporation Trainable modular robotic apparatus
US9533413B2 (en) 2014-03-13 2017-01-03 Brain Corporation Trainable modular robotic apparatus and methods
US9862092B2 (en) 2014-03-13 2018-01-09 Brain Corporation Interface for use with trainable modular robotic apparatus
US10242666B2 (en) * 2014-04-17 2019-03-26 Softbank Robotics Europe Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method
US20190172448A1 (en) * 2014-04-17 2019-06-06 Softbank Robotics Europe Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method
US20150375115A1 (en) * 2014-06-30 2015-12-31 Microsoft Corporation Interacting with a story through physical pieces
US10500497B2 (en) 2014-10-08 2019-12-10 Microsoft Corporation Transfer of attributes between generations of characters
US10369477B2 (en) 2014-10-08 2019-08-06 Microsoft Technology Licensing, Llc Management of resources within a virtual world
US20160136534A1 (en) * 2014-11-13 2016-05-19 Robert A. EARL-OCRAN Programmable Interactive Toy
US9426946B2 (en) 2014-12-02 2016-08-30 Brain Corporation Computerized learning landscaping apparatus and methods
US20160261313A1 (en) * 2015-03-06 2016-09-08 Nxp B.V. Toy, method for controlling a toy, and computer program product
CN105935497A (en) * 2015-03-06 2016-09-14 恩智浦有限公司 Toy, method for controlling a toy, and computer program product
US11050461B2 (en) * 2015-03-06 2021-06-29 Nxp B.V. Toy, method for controlling a toy, and computer program product
US9840003B2 (en) 2015-06-24 2017-12-12 Brain Corporation Apparatus and methods for safe navigation of robotic devices
US10807230B2 (en) 2015-06-24 2020-10-20 Brain Corporation Bistatic object detection apparatus and methods
US9873196B2 (en) 2015-06-24 2018-01-23 Brain Corporation Bistatic object detection apparatus and methods
US10614203B2 (en) * 2015-12-14 2020-04-07 Casio Computer Co., Ltd. Robot-human interactive device which performs control for authenticating a user, robot, interaction method, and recording medium storing program
US20170169203A1 (en) * 2015-12-14 2017-06-15 Casio Computer Co., Ltd. Robot-human interactive device, robot, interaction method, and recording medium storing program
US11192257B2 (en) 2016-04-08 2021-12-07 Groove X, Inc. Autonomously acting robot exhibiting shyness
US20210260773A1 (en) * 2016-06-15 2021-08-26 Irobot Corporation Systems and methods to control an autonomous mobile robot
US10350760B2 (en) * 2016-06-23 2019-07-16 Casio Computer Co., Ltd. Robot, robot control system, robot control method, and non-transitory computer-readable recording medium
CN107538497A (en) * 2016-06-23 2018-01-05 卡西欧计算机株式会社 Robot, robot control system, robot control method and storage medium
US20190030424A1 (en) * 2016-07-05 2019-01-31 Fujian Blue Hat Interactive Entertainment Technology Ltd. Interactive system based on light intensity recognition
US10512836B2 (en) * 2016-07-05 2019-12-24 Fujian Blue Hat Interactive Entertainment Technology Ltd. Interactive system based on light intensity recognition
US11809192B2 (en) 2016-07-11 2023-11-07 Groove X, Inc. Autonomously acting robot whose activity amount is controlled
US11579617B2 (en) 2016-07-11 2023-02-14 Groove X, Inc. Autonomously acting robot whose activity amount is controlled
US20180147728A1 (en) * 2016-11-30 2018-05-31 Universal City Studios Llc Animated character head systems and methods
US10775880B2 (en) * 2016-11-30 2020-09-15 Universal City Studios Llc Animated character head systems and methods
US10512384B2 (en) 2016-12-15 2019-12-24 Irobot Corporation Cleaning roller for cleaning robots
US11998151B2 (en) 2016-12-15 2024-06-04 Irobot Corporation Cleaning roller for cleaning robots
US11284769B2 (en) 2016-12-15 2022-03-29 Irobot Corporation Cleaning roller for cleaning robots
US11103800B1 (en) * 2017-02-17 2021-08-31 Hasbro, Inc. Toy robot with programmable and movable appendages
US11380332B2 (en) * 2017-02-24 2022-07-05 Sony Mobile Communications Inc. Information processing apparatus, information processing method, and computer program
CN110268468A (en) * 2017-02-24 2019-09-20 索尼移动通信株式会社 Information processing equipment, information processing method and computer program
US10719758B2 (en) 2017-04-19 2020-07-21 Panasonic Intellectual Property Management Co., Ltd. Interaction apparatus, interaction method, non-transitory recording medium, and robot
US10803376B2 (en) 2017-04-19 2020-10-13 Panasonic Intellectual Property Management Co., Ltd. Interaction apparatus, interaction method, non-transitory recording medium, and robot
US11007647B2 (en) 2017-04-19 2021-05-18 Panasonic Intellectual Property Management Co., Ltd. Interaction apparatus, interaction method, non-transitory recording medium, and robot
EP3392005A1 (en) * 2017-04-19 2018-10-24 Panasonic Intellectual Property Management Co., Ltd. Interaction apparatus, interaction method, non-transitory recording medium, and robot
EP3392004A1 (en) * 2017-04-19 2018-10-24 Panasonic Intellectual Property Management Co., Ltd. Interaction apparatus, interaction method, non-transitory recording medium, and robot
CN108724205A (en) * 2017-04-19 2018-11-02 松下知识产权经营株式会社 Interactive device, interactive approach, interactive process and robot
EP3392003A1 (en) * 2017-04-19 2018-10-24 Panasonic Intellectual Property Management Co., Ltd. Interaction apparatus, interaction method, non-transitory recording medium, and robot
US11000952B2 (en) 2017-06-23 2021-05-11 Casio Computer Co., Ltd. More endearing robot, method of controlling the same, and non-transitory recording medium
US9919231B1 (en) * 2017-07-10 2018-03-20 Peter Chin Cuboid or spherical head figurine
US10595624B2 (en) 2017-07-25 2020-03-24 Irobot Corporation Cleaning roller for cleaning robots
US11241082B2 (en) 2017-07-25 2022-02-08 Irobot Corporation Cleaning roller for cleaning robots
US10307911B2 (en) * 2017-08-30 2019-06-04 Panasonic Inttellectual Property Management Co., Ltd. Robot
US10707457B2 (en) * 2017-10-26 2020-07-07 UBTECH Robotics Corp. Battery case and robot having the same
US11267121B2 (en) * 2018-02-13 2022-03-08 Casio Computer Co., Ltd. Conversation output system, conversation output method, and non-transitory recording medium
CN110152314A (en) * 2018-02-13 2019-08-23 卡西欧计算机株式会社 Session output system, session export server, session output method and storage medium
WO2018127863A3 (en) * 2018-03-15 2018-09-20 Ingeniería Aplicada, S.A. System for monitoring power consumption
US20190308327A1 (en) * 2018-04-06 2019-10-10 Anki, Inc. Condition-Based Robot Audio Techniques
US11633863B2 (en) * 2018-04-06 2023-04-25 Digital Dream Labs, Llc Condition-based robot audio techniques
US11285611B2 (en) * 2018-10-18 2022-03-29 Lg Electronics Inc. Robot and method of controlling thereof
US20200164519A1 (en) * 2018-11-22 2020-05-28 Lg Electronics Inc. Motion control apparatus of action robot and motion generation and control system including the same
USD916928S1 (en) * 2018-12-20 2021-04-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD916925S1 (en) * 2018-12-20 2021-04-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD916926S1 (en) * 2018-12-20 2021-04-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD916927S1 (en) * 2018-12-20 2021-04-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
US11099807B2 (en) 2019-02-14 2021-08-24 Sharp Kabushiki Kaisha Electronic apparatus, control device, control method, and recording medium
US11109727B2 (en) 2019-02-28 2021-09-07 Irobot Corporation Cleaning rollers for cleaning robots
US11871888B2 (en) 2019-02-28 2024-01-16 Irobot Corporation Cleaning rollers for cleaning robots
CN109732612A (en) * 2019-03-12 2019-05-10 苏州工业职业技术学院 Intelligent interactive toy robot control system based on STM32 microcontroller
US20210402589A1 (en) * 2019-06-17 2021-12-30 Lg Electronics Inc. Artificial intelligence (ai) robot and control method thereof
US11511410B2 (en) * 2019-06-17 2022-11-29 Lg Electronics Inc. Artificial intelligence (AI) robot and control method thereof
CN112717433A (en) * 2020-11-16 2021-04-30 北京六十六号互动科技有限公司 Toy control device
US20230018066A1 (en) * 2020-11-20 2023-01-19 Aurora World Corporation Apparatus and system for growth type smart toy

Also Published As

Publication number Publication date
JP2002307354A (en) 2002-10-23

Similar Documents

Publication Publication Date Title
US20020081937A1 (en) Electronic toy
TWI658377B (en) Robot assisted interaction system and method thereof
JP2002532169A (en) Interactive toys
US8135128B2 (en) Animatronic creatures that act as intermediaries between human users and a telephone system
KR101008085B1 (en) Robot system and robot apparatus control method
US6319010B1 (en) PC peripheral interactive doll
US8121653B2 (en) Methods and apparatus for autonomously managing communications using an intelligent intermediary
US5746602A (en) PC peripheral interactive doll
US20070128979A1 (en) Interactive Hi-Tech doll
US20040103222A1 (en) Interactive three-dimensional multimedia i/o device for a computer
WO2001012285A9 (en) Networked toys
JP2002369974A (en) Pseudo living thing apparatus
JP2013099823A (en) Robot device, robot control method, robot control program and robot system
US6547631B1 (en) Prayer doll
JP2003071763A (en) Leg type mobile robot
KR20060079832A (en) Humanoid robot using emotion expression based on the embedded system
JP2002205291A (en) Leg type robot and movement control method for leg type robot, as well as recording medium
US20050148283A1 (en) Interactive display
JP2003108362A (en) Communication supporting device and system thereof
US20060154560A1 (en) Communication device
JP2003305677A (en) Robot device, robot control method, recording medium and program
JP3066762U (en) Conversation toys
EP1547260A1 (en) Communication device
WO2023037608A1 (en) Autonomous mobile body, information processing method, and program
WO2023037609A1 (en) Autonomous mobile body, information processing method, and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: SEGA TOYS, LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YASUI, KEISUKE;MORIOKA, SEISUKI;OKUBO, JUN;REEL/FRAME:012650/0146

Effective date: 20020125

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION