US20170189803A1 - Task-oriented feedback using a modular sensing device - Google Patents
- Publication number: US20170189803A1
- Application number: US 15/253,785
- Authority: US (United States)
- Prior art keywords: wearable device, user, mode, controller, feedback
- Prior art date
- Legal status: Abandoned (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/211—Input arrangements using inertial sensors, e.g. accelerometers or gyroscopes
- A63F13/212—Input arrangements using sensors worn by the player, e.g. for measuring heart beat or leg activity
- A63F13/235—Input arrangements for interfacing with the game device using a wireless connection, e.g. infrared or piconet
- A63F13/28—Output arrangements responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats
- A63F13/32—Interconnection arrangements between game servers and game devices using local area network [LAN] connections
- A63H17/26—Toy vehicles; details; accessories
- A63H23/04—Self-propelled toy boats, ships or submarines
- A63H27/00—Toy aircraft; other flying toys
- A63H27/12—Helicopters; flying tops
- A63H30/04—Remote-control arrangements for toys using wireless transmission
- A63H33/005—Motorised rolling toys
- G05D1/0016—Control of position, course, altitude or attitude of land, water, air or space vehicles associated with a remote control arrangement characterised by the operator's input device
- G06F1/163—Wearable computers, e.g. on a belt
- G06F1/1694—Integrated I/O peripherals being a single or a set of motion sensors for pointer control or gesture input
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/0346—Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/038—Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
- G06F3/0485—Scrolling or panning
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
- G08B7/06—Signalling systems using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources
- G08C17/02—Arrangements for transmitting signals using a wireless electrical link via a radio link
- H04W4/80—Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
- H04W8/005—Discovery of network devices, e.g. terminals
- A63F2300/405—Platform network being a wireless ad hoc network connection, e.g. Bluetooth, Wi-Fi, Pico net
- A63F2300/8082—Virtual reality
- G06F3/04883—Gesture input using a touch-screen or digitiser, e.g. input of commands through traced gestures or handwriting
- G08C2201/32—Remote control based on movements, attitude of remote control device
- G08C2201/92—Universal remote control
- H04L12/2803—Home automation networks
- H04W84/18—Self-organising networks, e.g. ad-hoc networks or sensor networks
Definitions
- wireless networks typically utilize protocols that enable wireless devices to detect signal sources from other devices for initiating data and communication links.
- Such networks are typically implemented using networking hardware, which may be incorporated in various wireless network devices, such as access points (APs), peer-to-peer (P2P) devices, wireless local area network (LAN) equipped devices, and the like—each advertising a unique identity (e.g., a media access control (MAC) address) indiscriminately to devices within range. Connections may be established with such devices to transmit and receive data.
- FIG. 1A illustrates a block diagram of an example modular sensing device for use in controlling a remotely operated device, according to examples described herein;
- FIG. 1B is a block diagram illustrating an example modular sensing device, as shown and described herein;
- FIG. 2 is a flow chart describing an example method of generating commands by a modular sensing device, in accordance with examples described herein;
- FIG. 3 illustrates an example modular sensing device insertable into a plurality of compatible apparatuses;
- FIG. 4 is a block diagram illustrating an example modular sensing device for acquiring virtual or digital resources;
- FIG. 5 is a block diagram illustrating an example virtual resource management system utilized by a wearable device;
- FIG. 6 is a flow chart describing an example method of acquiring virtual resources using a wearable device;
- FIGS. 7A and 7B are flow charts describing example methods of acquiring and utilizing virtual resources in connection with gameplay;
- FIGS. 8A, 8B, and 8C illustrate unique identifier logs and association tables utilized in connection with virtual resource association, acquisition, and allocation;
- FIG. 9 illustrates a wearable device pairing triggering virtual resource data sharing in connection with gameplay;
- FIGS. 10A and 10B illustrate a wearable device pairing triggering an interactive mode between a pair of proximate users;
- FIG. 11 is a flow chart describing an example method of implementing an interactive mode between a pair of users utilizing a corresponding pair of wearable devices;
- FIG. 12 is a flow chart describing an example method of initiating a training mode on a wearable device in connection with a self-propelled device;
- FIG. 13 is a flow chart describing an example method of implementing a wearable device in a sword mode;
- FIG. 14 is a block diagram of an example computer system upon which examples described herein may be implemented;
- FIG. 15 is a block diagram of a mobile computing device upon which examples described herein may be implemented;
- FIG. 16 is a block diagram of an example portable sensing device upon which examples described herein may be implemented;
- FIG. 17 illustrates an embodiment of multiple sensing devices that concurrently provide input for a program or application which utilizes the inputs, along with inferences which can be made about a person or object that carries the devices, according to one or more examples;
- FIG. 18 illustrates a system which concurrently utilizes input from multiple modular sensing devices in connection with execution of an application or program;
- FIG. 19 illustrates an example of a modular sensing device insertable into a wrist-worn apparatus; and
- FIG. 20 illustrates an implementation of the modularized sensing device, in accordance with examples described herein.
- Examples described herein relate to a multi-modal modular sensing device, worn or carried by a user (e.g., as a wrist-worn device), to enable a variety of interactions with other devices through sensed movement of the modular sensing device.
- examples include a modular sensing device that can individually, or in combination with another device (e.g., a controller device, such as a mobile computing device), control other devices, interact with compatible devices of other users, and/or operate in connection with task-oriented activities (e.g., gameplay).
- the modular sensing device can correspond to a wearable device (e.g., a watch, a pendant for a necklace, a hat, glasses) that can be placed in multiple modes to, for example, control the characteristics of movement of another device, or interact with the real world.
- the modular sensing device can control acceleration and maneuvering of a self-propelled device.
- a portable modular sensing device can include an inertial measurement unit (IMU), and can be operated in multiple modes in which gestures (e.g., arm gestures) may be translated based on the particular mode of the wearable device.
- the modular sensing device can be insertable into multiple devices, such as a wrist-worn device, clothing, an accessory device (e.g., a toy sabre or sword), a wearable pendant, ring, hat, and the like.
- the modular sensing device can further include output devices, such as lighting elements (e.g., one or more LEDs), an audio device (e.g., a speaker), and/or a haptic feedback system.
- the modular sensing device further includes a charging port (e.g., a mini-universal serial bus port) to enable charging of a power source included in the device.
- a modular sensing device is operable to detect its own movement in three-dimensional space, using the IMU.
- the IMU can be an integrated device.
- the IMU can be implemented through a combination of sensors, such as a three-dimensional accelerometer or gyroscope.
- the modular sensing device can include a processor and memory to interpret the sensor data, and to communicate interpreted sensor data to another device (e.g., mobile computing device) using a wireless connection (e.g., BLUETOOTH).
- the modular sensing device can generate data that is either raw or processed, depending on the processing resources that are selected for the portable device, and the processing load which is implemented for the portable device.
- the modular sensing device can generate output (e.g., haptic, audio, and/or visual output) or control commands for operating a remote controlled device by utilizing state machines in memory that correspond to individual gestures.
- processing resources of the modular sensing device can monitor logical state machines or automatons that correspond to specified sensor data combinations (e.g., based on gestures performed by a user).
- a single state machine can correspond to an initial arm raising gesture performed by the user wearing a wrist-worn device having the modular sensing device.
- a sensor profile from a single sensor or multiple sensors included in the IMU can indicate the initial arm raising gesture, which can trigger a state transition for the state machine.
- This state transition can cause processing resources of the modular sensing device to automatically generate a control command to cause the remote controlled device to accelerate.
- a second state machine can correspond to an increased angle in the arm raising gesture.
- a state transition can be triggered for the second state machine, which can trigger an increased acceleration command to be generated automatically.
- Several state machines can be stored in memory of the modular sensing device, and can each correlate to a specified sensor data profile, ranging from simple single-action gestures to complex multiple-motion gestures.
- an interpretation engine (e.g., a processor) of the modular sensing device can monitor these state machines to determine when a corresponding gesture has been completed. A brief sketch of this arrangement follows.
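As an illustration only (not the patent's firmware), the per-gesture state machines described above might be sketched as follows. The gesture names, the pitch-angle helper, and the command strings are assumptions made for this sketch:

```python
def arm_angle(sample):
    # Hypothetical helper: derive an arm elevation angle (degrees) from
    # a single IMU sample; a real device would fuse accelerometer and
    # gyroscope data rather than read a precomputed pitch.
    return sample["pitch_deg"]

class GestureStateMachine:
    """Advances through a fixed string of states; reaching the final
    state means the associated gesture has been performed."""
    def __init__(self, name, predicates, command):
        self.name = name
        self.predicates = predicates   # one predicate per state
        self.command = command         # emitted on the final state
        self.state = 0

    def feed(self, sample):
        if self.predicates[self.state](sample):
            self.state += 1
            if self.state == len(self.predicates):
                self.state = 0
                return self.command    # final state: emit the command
        else:
            self.state = 0             # gesture diverged: terminate
        return None

# One machine per gesture, as in the description: raising the arm
# triggers acceleration; raising it further increases acceleration.
machines = [
    GestureStateMachine("arm_raise",
                        [lambda s: arm_angle(s) > 5], "ACCELERATE"),
    GestureStateMachine("arm_raise_high",
                        [lambda s: arm_angle(s) > 5,
                         lambda s: arm_angle(s) > 25], "ACCELERATE_MORE"),
]

for sample in [{"pitch_deg": 10}, {"pitch_deg": 30}]:
    for m in machines:
        cmd = m.feed(sample)
        if cmd:
            print(m.name, "->", cmd)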
- a “modular sensing device” can comprise an electronic device that includes sensor resources for detecting its own movement, and of dimension and form factor suitable for being insertable into any number of devices configured for receiving the device.
- the modular sensing device may, for example, be inserted into a “wearable device” such as a wrist-worn device (e.g., watch, watch band, bracelet).
- the type and form factor of a wearable device can vary significantly, encompassing, for example, eyeglasses, hats, pendants, armbands, and various other form factors. While many examples describe functionality in the context of a wearable device, embodiments extend such examples to other form factors in which a modular sensing device may be inserted, such as wands, fobs, wielded toys, or mobile communication devices.
- the wearable device can include one or more sensors to detect the device's own movements.
- a wearable device includes an accelerometer and/or a gyroscopic sensor.
- sensor data corresponding to user gestures performed by the user wearing the wearable device, can be translated into control commands or data packets to be transmitted and implemented based on the selected mode of the wearable device.
- the wearable device can include an inductive interface to inductively pair with other devices, which can trigger a specified mode on the wearable device.
- an inductive pairing between the wearable device and a self-propelled device can trigger a drive mode in which the wearable device can be utilized by the user to operate the self-propelled device.
- the wearable device can include an input mechanism, such as an analog or digital button, that enables the user to select a particular mode and/or scroll through a series of modes for the wearable device.
- some examples described herein provide for alternative modes of operation, including, for example (i) a “drive mode” in which the wearable device is utilized to control a self-propelled device; (ii) a “control mode” in which the wearable device is utilized in connection with smart home devices; (iii) a “finding mode” or “finder mode” in which the wearable device is utilized to detect virtual or digital resources; (iv) a “mining mode” which can be initiated by the user to collect virtual resources when they are detected in the finder mode; (v) a “training mode” in which the wearable device is utilized in connection with a self-propelled device to assist the user in training for certain achievements or for increasing the user's abilities to perform task-oriented activities (e.g., increasing skills for a subsequent game or sporting activity); (vi) a “sword mode” in which the wearable device provides feedback (e.g., haptic, audio, and/or visual feedback) when the user performs actions while holding an object; and (vii) additional modes described below, such as a default mode, an interactive or battle mode, a sharing mode, and a gaming mode. A sketch of this mode list follows.
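A rough sketch of the mode list above, together with the button-driven mode scrolling described later with respect to FIG. 1A. The enum names mirror the modes named in the text, but the cycling behavior is an assumption for illustration:

```python
from enum import Enum

class Mode(Enum):
    DRIVE = 1
    CONTROL = 2
    FINDER = 3
    MINING = 4
    TRAINING = 5
    SWORD = 6
    DEFAULT = 7
    INTERACTIVE = 8   # interactive or battle mode
    SHARING = 9
    GAMING = 10

class ModeSelector:
    def __init__(self):
        self._modes = list(Mode)
        self._index = 0            # start in drive mode (assumed)

    def press(self):
        # Each press on the (analog or digital) mode selector scrolls
        # to the next available mode.
        self._index = (self._index + 1) % len(self._modes)
        return self._modes[self._index]

selector = ModeSelector()
print(selector.press())  # Mode.CONTROL
print(selector.press())  # Mode.FINDER
```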
- a self-propelled device can include, for example, a device that can be wirelessly and remotely controlled in its movement, whether the movement is on ground, water, or air.
- a self-propelled device can include a wirelessly and remotely controlled drone, car, airplane, helicopter, boat, etc.
- while conventional examples enable control of a self-propelled device, conventional approaches generally utilize a perspective of the device being controlled.
- while some conventional devices, for example, enable a computing device held by the user to project a perspective of the device under control, examples described herein enable control of such devices to utilize an orientation of the user.
- some examples include a modular sensing device that can determine an orientation of the user, and further enable control of the self-propelled device through an environment that accommodates or is in the perspective of the user, based on the orientation of the user (as determined by the modular sensing device).
- the control of a self-propelled device can be projected through an orientation or perspective of the user for purpose of a virtual environment.
- Some examples include a modular sensing device having a wireless communication module (e.g., a BLUETOOTH low energy module) that enables communication of sensor data (e.g., raw sensor data from the accelerometer or gyroscopic sensor), or translated data (i.e., translations of the sensor data based on the selected mode of the wearable device).
- the sensor data may be relayed for translation by a mobile computing device before being transmitted to another device (e.g., a paired wearable device or a paired self-propelled device).
- processing resources of the wearable device can execute mode instructions, based on the selected mode, to translate the sensor data (e.g., via use of state machines that trigger control commands or feedback based on a selected mode of the modular sensing device) for direct transmission to one or more other devices, as described herein.
- body part gestures or “user gestures” include gestures performed by a user while utilizing the wearable device.
- the wearable device may be a wrist-worn device, in which case the body part or user gestures may comprise arm gestures, and can include any number of physical movements or actions that affect the sensors of the wearable device when it is worn on the wrist.
- Such movements and actions can include shaking, arm movements (e.g., raising, lowering, pointing, twisting, and any combination thereof), wrist movements, hand actions (such as grasping or grabbing), and the like.
- the wearable device is not limited to wrist-worn devices, but may be utilized as a ring (e.g., a finger-worn device), an ankle-worn device, a neck-worn device, a head-worn device, a belt (e.g., a waist-worn device), etc.
- user gestures performed using the wearable device can be any actions or movements in which correlated sensor data from sensors of the device can be translated into commands, instructions, feedback, etc. depending on the mode of the wearable device.
- examples described herein achieve a technical effect of enhancing user interactivity with other devices and other users.
- Such interactivity may include utilizing the modular sensing device to control a self-propelled device, interact with other users of wearable devices, collect and share data, control smart home devices, interact with the real world via a gaming interface through the modular sensing device, and the like.
- One or more examples described herein provide that methods, techniques, and actions performed by a computing device, a modular sensing device, or a self-propelled device are performed programmatically, or as a computer-implemented method.
- Programmatically means through the use of code or computer-executable instructions. These instructions can be stored in one or more memory resources of the computing device.
- a programmatically performed step may or may not be automatic.
- a programmatic module or component can include a program, a sub-routine, a portion of a program, or a software component or a hardware component capable of performing one or more stated tasks or functions.
- a module or component can exist on a hardware component independently of other modules or components.
- a module or component can be a shared element or process of other modules, programs or machines.
- Some examples described herein can generally require the use of computing devices, including processing and memory resources.
- one or more examples described herein can be implemented, in whole or in part, on computing devices such as digital cameras, digital camcorders, desktop computers, cellular or smart phones, personal digital assistants (PDAs), virtual reality (VR), augmented reality (AR), or mixed reality (MR) headsets, laptop computers, printers, digital picture frames, and tablet devices.
- Memory, processing, and network resources may all be used in connection with the establishment, use, or performance of any example described herein (including with the performance of any method or with the implementation of any system).
- one or more examples described herein may be implemented through the use of instructions that are executable by one or more processors. These instructions may be carried on a computer-readable medium.
- Machines shown or described with figures below provide examples of processing resources and computer-readable mediums on which instructions for implementing examples can be carried and/or executed.
- the numerous machines shown with examples include processor(s) and various forms of memory for holding data and instructions.
- Examples of computer-readable mediums include permanent memory storage devices, such as hard drives on personal computers or servers.
- Other examples of computer storage mediums include portable storage units, such as CD or DVD units, flash memory (such as carried on smart phones, multifunctional devices or tablets), and magnetic memory.
- Computers, terminals, and network-enabled devices are all examples of machines and devices that utilize processors, memory, and instructions stored on computer-readable mediums. Additionally, examples may be implemented in the form of computer programs, or a non-transitory computer usable carrier medium capable of carrying such a program.
- FIG. 1A illustrates a block diagram of an example modular sensing device for use in controlling a remotely operated device, according to examples described herein.
- the modular sensing device 100 shown in FIG. 1A may be integrated with a wearable device 102 (e.g., a wrist-worn device, pendant, clothing, a hat, eyeglasses, an ankle-worn device, a ring, etc.), or can be inserted into a compartment or otherwise included as a part of the wearable device 102 .
- the wearable device 102 , or modular sensing device 100 disposed therein, can include a mode selector 110 , such as an analog button that enables the user to select a mode of the device.
- repeated user input 111 on the mode selector 110 enables the user to scroll through a list of available modes.
- modes include, but are not limited to, a drive mode, a control mode, a finder mode, a mining mode, a training mode, a sword mode, a default mode, an interactive or battle mode, a sharing mode, and a gaming mode.
- a user input 111 on the mode selector 110 can cause a processor 120 of the modular sensing device 100 to generate an output 132 that indicates the particular mode selected.
- the modular sensing device 100 can include a number of output devices 130 , such as an audio device, a haptic system, and/or visual elements (e.g., an LED array or display).
- Each user input 111 on the mode selector 110 can trigger an audio, haptic, and/or visual output 132 , indicating the particular mode (e.g., a drive mode).
- the output 132 can comprise a voice output that states the selected mode, or a combination of voice, visual, and haptic output.
- the user may connect the wearable device 102 with a mobile computing device, such as the user's smart phone or tablet computer.
- Mode selection may be performed automatically by the user initiating a designated application of the mobile computing device, such as a smart home application, a controller application (e.g., to control a self-propelled device), a gaming application, and the like.
- the user can execute a designated application in connection with the wearable device 102 that enables the user to scroll through the various modes. The user may scroll through the modes on the mobile computing device, or via successive selection inputs on the mode selector 110 , which can trigger the mobile computing device to display a selectable mode.
- the mode selector 110 can be an input mechanism such as an analog or digital button, a touch panel such as a track pad, or a miniature touch-sensitive display device.
- the modular sensing device 100 can include a memory 115 that stores mode instruction sets executable by the processor 120 based on the mode selected by the user. Each mode instruction set can cause the processor 120 to interpret sensor data 137 in a particular manner. Along these lines, same or similar gestures 106 performed by the user can result in different generated outputs 132 or commands 108 depending on the mode selected by the user. In some implementations, selection of a particular mode can cause the processor 120 to initiate a communications module 125 of the modular sensing device 100 to establish a wireless connection with another device.
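As a minimal sketch of this mode-dependent interpretation, assuming a simple per-mode lookup table (the gesture names and outputs below are illustrative, not taken from the specification):

```python
# The same completed gesture yields a different result depending on the
# selected mode: a control command in drive mode, device feedback in
# sword mode.
MODE_TABLES = {
    "drive": {
        "arm_raise": ("command", "ACCELERATE"),
        "arm_left":  ("command", "TURN_LEFT"),
    },
    "sword": {
        "arm_raise": ("feedback", "haptic_pulse"),
        "arm_left":  ("feedback", "audio_chime"),
    },
}

def interpret(mode, gesture):
    """Translate a completed gesture into a control command or a
    haptic/audio/visual output, per the selected mode."""
    return MODE_TABLES[mode].get(gesture, ("none", None))

print(interpret("drive", "arm_raise"))  # ('command', 'ACCELERATE')
print(interpret("sword", "arm_raise"))  # ('feedback', 'haptic_pulse')
```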
- the modular sensing device 100 can include a BLUETOOTH low energy module to enable communications with one or more other devices, such as a second modular sensing device or a remotely operated device 140 .
- the modular sensing device 100 can be relatively small in size compared to current computing devices, and in many implementations, the modular sensing device 100 does not include power-intensive memory and computational resources.
- the memory and/or memory controller can comprise or implement a series of state machines that, when transitioned, can trigger a particular output automatically. Further description of the state machine implementations is provided below with respect to FIG. 1B.
- examples described herein with respect to the drive mode of the modular sensing device 100 can also be implemented in the state machine examples described herein.
- the memory 115 can include a drive mode instruction set 117 executable by the processor 120 in response to a user input 111 on the mode selector 110 .
- execution of the drive mode instructions 117 can cause the processor 120 to initiate the communications module 125 to establish a communications link 104 with a proximate remotely operated device 140 .
- the modular sensing device 100 can include an induction interface 127 which can trigger the communication link 104 between the modular sensing device 100 and the remotely operated device 140 .
- the user can place the wearable device 102 within inductive range (e.g., approximately 2-5 cm) of the remotely operated device 140, which can include a corresponding induction interface.
- the inductive transfer of communication information can enable the modular sensing device 100 to establish the communication link accordingly.
- the modular sensing device 100 can further include an inertial measurement unit (IMU) 135 that can comprise a gyroscope and an accelerometer for accurate measurement of the modular sensing device's 100 movement and orientation.
- IMU 135 can further include a magnetometer to, for example, assist in calibration based on the orientation.
- the processor 120 can monitor the sensor data 137 for the particular gestures 106 being performed by the user.
- the gestures 106 can correspond to the user raising or lowering an arm, and/or performing additional arm motions.
- the sensor data 137 from the IMU 135 can comprise movement, position, and/or orientation information that the processor 120 can interpret in accordance with the drive mode.
- gestures 106 performed by the user can be detected by the processor 120 via sensor data 137 from the IMU 135 .
- Each of the gestures 106 can be interpreted by the processor 120 as one or more control commands 108 to be executed by the remotely operated device 140 .
- the drive mode can be automatically initiated in response to a particular detected gesture, regardless of the current mode of the modular sensing device 100 .
- This gesture can correspond to a distinct sensor data signature that, when detected by the processor 120 in executing any mode, overrides that mode and initiates the drive mode automatically.
- the processor 120 can automatically initiate the communications module 125 , establish the communications link 104 with the remotely operated device 140 , and generate control commands 108 based on the detected gestures 106 performed by the user in the sensor data 137 .
- the modular sensing device 100 may then dynamically transmit such control commands 108 to the remotely operated device 140 for execution as they are generated by the processor 120.
- the specific gesture corresponds to a pushing motion with an arm wearing the wearable device 102 performed by the user.
- this pushing motion can correspond to a specified sensor data signature not used for any other mode, and is therefore distinct to enable the processor 120 to identify it irrespective of the current mode of the modular sensing device 100 .
- gestures 106 such as raising an arm can cause the processor 120 to generate acceleration commands to drive away from the user. Lowering the arm can cause the processor to generate deceleration and/or reverse commands. Further, moving the arm from side to side can cause the processor 120 to generate steering or directional commands. For example, moving the arm left can cause the remotely operated device 140 to turn left, and moving the arm right can cause the device 140 to turn right as the device 140 travels away from the user.
- Such control commands 108 may be processed by a controller of the remotely operated device 140 , or may be directly executed on the drive motors of the device 140 in order to accelerate and maneuver the device 140 in accordance with the gestures 106 performed by the user.
- angular thresholds can be established in the drive mode instructions 117 that can determine the manner in which the processor 120 interprets the sensor data 137 .
- the processor 120 can alter interpretation of the sensor data 137 into alternative commands 108 . For example, as the user raises the arm above an angular threshold (e.g., 45 degrees), and/or changes an orientation of the arm (e.g., palm down to palm up), the processor 120 can alter the interpretation of the sensor data 137 such that remotely operated device 140 drives towards the user as the arm is raised.
- the directional interpretation of the sensor data 137 can be reversed such that moving the arm left can cause the remotely operated device 140 to turn right, and moving the arm right can cause the remotely operated device 140 to turn left.
- This directional reversal triggered by the angular threshold, and in combination with the change in orientation of the user's arm, can create a palm control illusion of the remotely operated device 140 by the user.
- specified gestures detected in the sensor data 137 (e.g., the user's arm rotating or crossing an angular threshold) can trigger the processor 120 to interpret the sensor data 137 differently, or inversely from an initial interpretation, in order to produce the illusion. A sketch of this threshold behavior follows.
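A minimal sketch of this palm-control behavior, assuming a 45-degree threshold and invented field names; the mapping and values are illustrative only:

```python
ANGLE_THRESHOLD_DEG = 45

def drive_command(arm_angle_deg, lateral, palm_up):
    """Map arm pose to a drive command.

    lateral: -1.0 (arm fully left) .. +1.0 (arm fully right)
    """
    # Once the arm crosses the angular threshold and the palm turns up,
    # directional interpretation reverses so the device appears to
    # follow the user's palm back toward the user.
    reversed_mode = arm_angle_deg > ANGLE_THRESHOLD_DEG and palm_up
    throttle = arm_angle_deg / 90.0          # raising the arm accelerates
    steering = -lateral if reversed_mode else lateral
    direction = "toward_user" if reversed_mode else "away_from_user"
    return direction, round(throttle, 2), round(steering, 2)

print(drive_command(30, 0.5, palm_up=False))  # ('away_from_user', 0.33, 0.5)
print(drive_command(60, 0.5, palm_up=True))   # ('toward_user', 0.67, -0.5)
```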
- FIG. 1B is a block diagram illustrating an example modular sensing device, as shown and described herein.
- the modular sensing device 150 can be space limited, and may include only a limited amount of memory and computational resources.
- the modular sensing device 150 can represent each possible gesture that can be performed by a user as a state machine.
- a state machine corresponding to that gesture can either positively identify its gesture or negatively identify its gesture.
- the state machine triggers a state transition which can cause an output generator 160 to generate a particular output accordingly.
- the output generator 160 and memory 180 can comprise a hardware controller implementing the various state machines to generate the output commands 164 .
- the modular sensing device 150 can include a memory 180 implementing a number of state machines (e.g., SM 1 181 , SM 2 183 , SM 3 185 , SM 4 187 , . . . , SMN 189 ), each being associated with a particular gesture.
- SM 1 181 can be associated with the user raising an arm
- SM 2 183 can be associated with the user lowering an arm
- SM 3 185 can be associated with the user pointing an arm to the right
- SM 4 187 can be associated with the user pointing an arm to the left.
- any number of state machines may be implemented in the memory 180 representing any number of gestures.
- the state machines can be instantiated for each gesture type, and each state machine can continuously inspect the instantaneous sensor data 167 from the IMU 165 in order to initialize or instantiate, transition individual states along a state string, terminate a current state string, or trigger an accept state or final state.
- when the final state is triggered, this means that the particular gesture corresponding to that state machine has been performed by the user.
- each state machine can consist of a finite set of states (a fixed string of states), one or more inputs, one or more outputs, a transition function, and an initial state.
- the state machine can be linked to a particular gesture by way of a sensor data signature, which can comprise an accelerometer data signature, gyroscope data signature, or a combination of both.
- the state machine can be linear and directional, with each state having a particular error tolerance in its sensor data signature. A final state of a state machine can thus be triggered when the full sensor data signature of a particular gesture is matched to the sensor data 167 generated by the IMU 165 .
- if at any time after instantiating, an associated gesture corresponding to a respective state machine is not being performed, the input string for that state machine, and for that particular instantiation, automatically terminates.
- the state machine either terminates from the outset (e.g., an initial aspect of the sensor data signature for the gesture is not matched), or instantiates the input string towards the final state.
- the state machine can terminate if the gesture being performed diverges from the error tolerances built into the state machine. If, however, each state along the input string is transitioned accordingly (i.e., the sensor data 167 from the IMU 165 matches the sensor data signature for that state machine within its error tolerances), the final state is triggered for that state machine.
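The linear, tolerance-bounded matching described above might be sketched as follows; the signature values and tolerance are invented for illustration and are not from the specification:

```python
class SignatureStateMachine:
    """A linear, directional state machine: each state holds one slice
    of the gesture's sensor data signature plus an error tolerance, and
    the input string terminates as soon as the live data diverges."""
    def __init__(self, signature, tolerance):
        self.signature = signature   # expected sensor values, per state
        self.tolerance = tolerance
        self.pos = None              # None = not instantiated

    def feed(self, reading):
        expected = self.signature[0 if self.pos is None else self.pos + 1]
        if abs(reading - expected) > self.tolerance:
            self.pos = None          # diverged: terminate the input string
            return False
        self.pos = 0 if self.pos is None else self.pos + 1
        if self.pos == len(self.signature) - 1:
            self.pos = None          # accept (final) state reached
            return True
        return False

# Usage: an "arm raise" signature matched within its error tolerance.
raise_sig = SignatureStateMachine([0.2, 0.6, 1.0], tolerance=0.15)
for reading in [0.25, 0.55, 1.05]:
    done = raise_sig.feed(reading)
print(done)  # True -> would produce a final state report
```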
- the memory 180 can include a state machine reporter 182 that can report such final state transitions 188 to an output generator 160 of the modular sensing device 150 .
- the output generator 160 can be configured based on the particular mode of the modular sensing device 150 . Accordingly, final state reports 188 from individual state machines can be interpreted differently, or can cause a particular output, depending on the mode. In other words, the output from the modular sensing device 150 for a particular gesture (e.g., a backhanded swinging motion) can be different depending on the mode initiated via the mode selector 155 .
- certain final state reports 188 from the state machine reporter 182 can correspond to sub-mode triggers 186 for the particular mode. Such sub-mode triggers 186 may not trigger an output, but rather trigger the output generator 160 to alter interpretation of certain final state reports 188 in order to generate an alternative output.
- Such outputs generated by the output generator 160 can comprise control commands 162 to operate a remotely operated device, such as acceleration, steering, and deceleration commands.
- the output generator 160 can generate output commands 164 for the modular sensing device's 150 haptic system 192 , visual system 194 , and/or audio system 196 .
- the output generator 160 can cause the haptic 192 , visual 194 , and audio 196 systems of the modular sensing device 150 to produce individual or combined outputs.
- outputs can include vibration feedback or guidance, colored lights or display output, tonal outputs such as audible chimes that indicate positive and/or negative feedback, and voice output.
- each completed arm gesture can correspond to a final state machine report 188 , and can cause the output generator 160 to generate a control command 162 accordingly.
- when the state machine correlated with the user raising an arm above a certain angle (e.g., five degrees) transitions to its final state, the state machine reporter 182 can report the final state 188 of that state machine to the output generator 160, which, based on the drive mode interpretation, can generate a control command 162 to initiate acceleration of the remotely operated device.
- This control command 162 may then be transmitted to the remotely operated device via the communications module 175 of the modular sensing device 150 .
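A sketch of this reporting path, with all class and method names assumed for illustration; it wires a final state report through the output generator to a communications module:

```python
class CommsModule:
    def transmit(self, command):
        # Stand-in for the wireless transmission (e.g., over BLE).
        print("-> remotely operated device:", command)

class OutputGenerator:
    DRIVE_MAP = {"arm_raise_5deg": "ACCELERATE"}  # illustrative mapping

    def __init__(self, comms):
        self.comms = comms

    def on_final_state(self, gesture_name):
        # Drive-mode interpretation of a final state report.
        command = self.DRIVE_MAP.get(gesture_name)
        if command:
            self.comms.transmit(command)

class StateMachineReporter:
    def __init__(self, output_generator):
        self.output_generator = output_generator

    def report_final_state(self, gesture_name):
        self.output_generator.on_final_state(gesture_name)

reporter = StateMachineReporter(OutputGenerator(CommsModule()))
reporter.report_final_state("arm_raise_5deg")  # -> ... ACCELERATE
```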
- the communications module 125 , 175 can act as a signal or beacon sensor which can provide signal data to the processor 120 or output generator 160 .
- the processor 120 or output generator 160 can generate a specified response based on the signal data.
- the signal data can indicate a unique identifier of the signal source.
- the memory 115 , 180 can include a UID log of specified signal sources that allow the user to interact with such sources based on the mode of the modular sensing device 100 , 150 .
- the user can be enabled to select a mode in which the user can interact with the other user (e.g., play a physical game involving gestures).
- the interpretation of the sensor data 137 can be altered based on a particular gesture trigger, such as the user's arm exceeding an angular threshold.
- the analog with respect to FIG. 1B comprises a state machine in the memory 180 that correlates to the particular threshold gesture.
- the correlated state machine can transition to its final state, which can be reported to the output generator 160 accordingly.
- Such a final state report 188 can comprise a sub-mode trigger 186 , in that the sub-mode trigger 186 causes the output generator 160 to alter interpretation of certain final state reports 188 while remaining in the same mode.
- a sub-mode trigger 186 can correspond to the user raising an arm above a certain threshold (e.g., 45 degrees), which can cause the output generator 160 to, for example, reverse the interpretation of the final state reports 188 corresponding to the directional commands for the remotely operated device.
- each mode of the modular sensing device 150 can include any number of sub-mode triggers 186 that cause the output generator 160 to alter an interpretation of or disregard a particular final state report 188 corresponding to a specified sensor data signature.
- the drive mode can include a set of angular gesture thresholds (e.g., raising an arm beyond 45 degrees, lowering the arm below 45 degrees, turning the arm from palm down to palm up).
- a state machine can be dedicated—within the specified mode—to a sensor data signature indicating the user gesture crossing a gesture threshold.
- the dedicated state machine can transition to its final state, which, when reported to the output generator 160 (i.e., as a sub-mode trigger 186 ), can cause the output generator 160 to alter interpretation of certain final state reports 188 within that mode.
- final state reports 188 that were disregarded prior to the sub-mode trigger can now trigger a specified output (e.g., an audio, haptic, and/or visual output, or a particular control command 162 ).
- a specific complex gesture (represented by a particular state machine in the memory 180 ) can cause the output generator 160 to reconfigure its interpretation of certain final state reports 188 , execute a sub-mode, or automatically initialize a different mode for the modular sensing device 150 .
- sensor data 167 from the IMU 165 can continuously cause the various state machines to instantiate, terminate, instigate state transitions, and/or transition to a final state.
- only when a state machine transitions to its final state does the output generator 160 generate output commands 164 and/or control commands 162 to provide feedback, operative control over a remotely operated device, guidance via the output devices, and/or task-based instructions (e.g., in accordance with a particular game).
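- as a minimal illustrative sketch of this flow (not the patent's implementation; the class names, gesture labels, and command mapping below are assumptions), a final state report can be mapped to a control command 162 , while a sub-mode trigger 186 reverses the interpretation of directional reports without itself producing an output:

```python
# Hedged sketch: gesture state machines feeding an output generator.
# All names (GestureStateMachine, OutputGenerator, etc.) are illustrative.

class GestureStateMachine:
    """Transitions through states as sensor samples match a signature."""
    def __init__(self, name, signature):
        self.name = name
        self.signature = signature  # ordered list of predicates on samples
        self.index = 0

    def feed(self, sample):
        """Advance on a matching sample; return the machine's name as a
        final state report once the whole signature has been observed."""
        if self.signature[self.index](sample):
            self.index += 1
            if self.index == len(self.signature):
                self.index = 0
                return self.name          # final state report (188)
        return None

class OutputGenerator:
    """Maps final state reports to control commands per the active mode."""
    SUB_MODE_TRIGGER = "arm_above_45"     # e.g., arm raised beyond 45 degrees

    def __init__(self):
        self.reversed = False             # sub-mode: reverse directionals

    def handle(self, report):
        if report == self.SUB_MODE_TRIGGER:
            self.reversed = not self.reversed   # alter interpretation only
            return None                         # no direct output
        commands = {"arm_left": "steer_left", "arm_right": "steer_right"}
        if self.reversed:
            commands = {"arm_left": "steer_right", "arm_right": "steer_left"}
        return commands.get(report)

gen = OutputGenerator()
fsm = GestureStateMachine("arm_left", [lambda s: s["roll"] < -30])
report = fsm.feed({"roll": -40})      # -> "arm_left"
print(gen.handle(report))             # "steer_left"
gen.handle("arm_above_45")            # sub-mode trigger: reverse mapping
print(gen.handle(report))             # "steer_right"
```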
- the various aspects performed by the modular sensing device 100 described with respect to FIG. 1A may also be performed by example modular sensing devices 150 as shown and described with respect to FIG. 1B .
- the execution of designated modal instruction sets by the modular sensing device 100 of FIG. 1A in which the processor 120 directly interprets the sensor data 137 based on the executing instruction set—may be substituted by the state machine examples as described with respect to FIG. 1B .
- the limited memory and computational resources of the modular sensing device 150 of FIG. 1B may be compensated for by attributing sensor data signatures to state machines, requiring less memory and processing without sacrificing functionality.
- FIG. 2 is a flow chart describing an example method of generating output commands by a modular sensing device, in accordance with examples described herein.
- the methods and processes described with respect to FIG. 2 may be performed by an example modular sensing device 100 , 150 as shown and described with respect to FIGS. 1A and 1B .
- the modular sensing device 150 can receive a mode selection input ( 200 ).
- the mode selection input can comprise the user physically pushing an analog mode selector 155 on the modular sensing device 150 .
- the mode selection input can comprise a specified gesture performed while using the modular sensing device 150 .
- the modular sensing device 150 can interpret the final state machine reports 188 in accordance with the mode selected ( 205 ).
- the modular sensing device 150 can operate in any number of modes, with each mode corresponding to controlling a remote device (e.g., drive mode), generating user tasks and feedback in connection with a remote device (e.g., training mode), generating user tasks and feedback in conjunction with another modular sensing device (e.g., playing a real-world game with another user), and/or generating standalone user tasks and feedback.
- user tasks can comprise instructions or suggestions to the user via the output devices of the modular sensing device in accordance with the selected mode.
- Such instructions or suggestions can be generated based on a programmed game, in which the user is to utilize the modular sensing device 150 to perform gestures and actions, search for a certain signal source, cause the remotely operated device 140 to perform a set of actions, and the like.
- feedback can comprise reactive output to the tasked actions to be performed by the user.
- the feedback can comprise audio, visual, and/or haptic responses to actions indicating affirmative or negative completion of such tasks.
- the feedback indicates to the user whether an instructed task or gesture (e.g., indicated by the sensor data 167 and correlated state machine) has successfully been performed.
- the final state machine reports 188 can be correlated to a specified output.
- the modular sensing device 150 can generate commands in accordance with the mode and, when relevant, the sub-mode of the device 150 ( 220 ).
- such generated commands can include output commands 164 to output audio ( 222 ), visual ( 226 ), and/or haptic ( 224 ) output on the modular sensing device 150 .
- the modular sensing device 150 can generate control commands 162 ( 228 ) for operating a remotely operated device 140 . In either case, the modular sensing device 150 can transmit or execute the commands accordingly ( 225 ).
- the modular sensing device 150 can identify a sub-mode trigger 186 in the final state machine reports 188 ( 210 ). In response to the sub-mode trigger 186 , the modular sensing device 150 can reconfigure interpretation of one or more final reports from one or more corresponding state machines ( 215 ). Based on each final state machine report 188 , the modular sensing device 150 can generate commands in accordance with the mode and the sub-mode of the device 150 ( 220 ), including the altered interpretations based on the sub-mode trigger(s) 186 . As discussed above, such generated commands can include output commands 164 to output audio ( 222 ), visual ( 226 ), and/or haptic ( 224 ) output on the modular sensing device 150 . Additionally or alternatively, the modular sensing device 150 can generate control commands 162 ( 228 ) for operating a remotely operated device 140 . In either case, the modular sensing device 150 can transmit or execute the commands accordingly ( 225 ).
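- the FIG. 2 flow can be summarized in a short sketch (the device interface below is assumed for illustration only; step numbers in the comments mirror the reference numerals above):

```python
# Illustrative sketch of the FIG. 2 method; all method names on the
# hypothetical `device` object are assumptions, not the patent's API.

def run_selected_mode(device, mode):                    # (200) mode selected
    for report in device.final_state_reports():        # (205) interpret reports
        if device.is_sub_mode_trigger(report, mode):    # (210) sub-mode trigger?
            device.reconfigure_interpretation(report)   # (215)
            continue
        command = device.generate_command(report, mode)     # (220)
        if command.kind in ("audio", "haptic", "visual"):   # (222/224/226)
            device.execute_output(command)
        else:                                           # control command (228)
            device.transmit(command)                    # (225)
```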
- FIG. 3 illustrates an example modular sensing device insertable into a plurality of compatible apparatuses.
- the modular sensing device 300 shown in FIG. 3 can comprise various components and modules of the modular sensing devices 100 , 150 shown in the block diagrams of FIGS. 1A and 1B .
- the modular sensing device 300 can include a number of output devices, such as an LED array 310 , an audio output device 320 (e.g., a speaker), and a haptic driver 360 (included within the device).
- the modular sensing device 300 can include a mode selector 330 , which can comprise an analog or digital button to enable the user to select a particular mode of the device 300 by, for example, scrolling through a stored series of modes.
- the modular sensing device 300 can further include memory and processing resources 365 that can execute the selected mode (either in the state machine implementation ( FIG. 1B ) or the executed instruction set implementation ( FIG. 1A ) described herein).
- the modular sensing device 300 also includes a communications interface 370 (e.g., a BLUETOOTH low energy, WiFi, WiGig, WiMAX, or cellular radio interface), and an IMU 340 to provide the memory and processing resources 365 with sensor data for detecting gestures performed by the user.
- the memory and processing resources 365 interpret the sensor data to generate outputs via the output devices 310 , 320 , 360 and/or commands or responses to be output to a connected device via the communications interface 370 (e.g., a remotely operated device or another modular sensing device).
- the modular sensing device 300 can include an input interface 350 (e.g., a mini-USB port) to enable charging of one or more batteries and/or uploading of additional mode instructions.
- the modular sensing device 300 can include an induction interface to charge one or more batteries and/or to enable inductive pairing with a second device to establish a wireless communications link.
- the modular sensing device 300 can be insertable into or otherwise attachable to any number of compatible apparatuses, such as wearable devices 395 (wrist devices, rings, pendants, hats, glasses, etc.), wielded devices 385 , companion toys or dolls, and the like.
- the modular sensing device 300 can be implemented in various other form factors, can be sewn into clothing, or can be mounted, glued, or otherwise attached to various apparatuses.
- Such apparatuses can each include a module holder 387 , 397 into which the modular sensing device 300 may be inserted or otherwise mounted or attached.
- the user can utilize the apparatuses into which the modular sensing device 300 has been inserted or attached, to perform various gestures in accordance with a selected mode of the modular sensing device 300 .
- FIG. 4 is a block diagram illustrating an example portable sensing device operable as a signal detection system for acquiring virtual or digital resources.
- a wearable device 400 may be a standalone device or may be incorporated with a mobile computing device running an application specific to detecting and recording signals from network devices, and searching for the virtual resources associated with those network devices.
- Standalone devices may include a pocket device, such as a key fob, a wrist-worn device or other wearable device (as described herein), and the like.
- the wearable device 400 may utilize mobile computing device resources, such as device hardware/firmware, or operate utilizing a combination of hardware and software.
- the wearable device 400 may be enabled upon initiation of an application on the mobile computing device.
- the wearable device 400 may continue to run during hibernation or sleep mode of the application using background or standby resources.
- the wearable device 400 may be enabled in the finder mode 406 and mining mode 407 upon user input on a mode selector 405 of the wearable device 400 , as shown and described with respect to FIG. 1 .
- Selection and execution of the finder and/or mining mode can generate an execution command 408 , which can enable the processing resources of the wearable device 400 to execute the respective mode instructions for translating sensor data and providing feedback responses.
- the wearable device 400 can include a signal interface 410 to detect emitted signals from any number of network-enabled devices 460 .
- emitted signals may be background signals advertising the presence and media access control (MAC) addresses of such devices 460 .
- the wearable device 400 may be within wireless range of a laptop computing device 462 having a unique identifier UID 1 .
- the UID of the laptop computing device 462 (i.e., UID 1 ) may correspond to the MAC address of the laptop computing device 462 .
- the wearable device 400 can include a UID recorder 420 , which can receive UIDs 412 from the signal interface 410 and record the respective UIDs 412 in a UID log 432 of a local memory resource 430 . Accordingly, the UID recorder 420 can log the laptop computing device's 462 UID (i.e., UID 1 ) in the UID log 432 .
- the wearable device 400 may come within wireless range of an access point 463 having a MAC address UID 2 .
- the signal interface 410 can receive the MAC address of the access point 463 .
- the signal interface 410 can receive a beacon from the access point 463 which includes the access point's 463 MAC address (UID 2 ).
- the signal interface 410 can communicate UID 2 to the UID recorder 420 , which can log UID 2 in the UID log 432 .
- the wearable device 400 can establish a network connection, e.g., via network 480 , with a host server 470 .
- the wearable device 400 can transmit UID log data 482 from the local UID log 432 to the host server 470 .
- the host server 470 can compile the UID log data 482 and associate the respective UIDs 412 from the log data 482 with a user account 476 associated with a user of the wearable device 400 .
- the remote host server 470 can perform any number of functions described herein.
- the host server 470 can utilize the UID log data 482 from the wearable device 400 to identify the network devices (i.e., the laptop computing device 462 and the access point 463 ) with which the wearable device 400 came within wireless range.
- the host server 470 can store a universal association list 472 that may list associations between, for example, UID 1 of the laptop computing device 462 and a specified virtual resource. Accordingly, the host server 470 can attribute an amount of the specified virtual resource associated with UID 1 to the user account 476 associated with the wearable device 400 .
- the host server 470 could attribute a predetermined amount of virtual ore to the user account 476 .
- the host server 470 may perform a lookup in the universal associations list 472 and identify that UID 2 (corresponding to the access point 463 ) does not yet have an associated virtual resource. Thus, the host server 470 may select, either sequentially or randomly, a virtual resource from a virtual resource catalog 474 and log a new association between the selected virtual resource and UID 2 in the universal associations list 472 . Accordingly, an amount of the new selected resource can be attributed to the user account 476 , in which respective virtual resources may be accumulated based on wireless detection events between the wearable device 400 and the network devices 460 .
- the wearable device 400 can include an association engine 440 which, upon detection of a wireless signal from a network device 460 , can determine whether that device is locally associated with a virtual resource.
- the memory resource 430 of the wearable device 400 can include an association table 434 which may include predetermined associations 435 between the network devices 460 and virtual resources.
- the wearable device 400 can come within wireless range of a mobile device 464 , which can announce its presence using MAC address UID 3 .
- the signal interface 410 can receive UID 3 , and the association engine 440 can perform a lookup in the association table 434 to determine whether UID 3 is associated with a particular virtual resource (e.g., virtual oil). If so, the wearable device 400 may flag the association between UID 3 and virtual oil until a connection with the host server 470 is established to attribute the virtual oil to the user account 476 .
- the wearable device 400 can include a resource allocator 450 and the memory resource 430 can include a virtual resource log 436 , comprising an accumulation of the virtual resources collected by the user.
- the association engine 440 can transmit the associations 435 to the resource allocator 450 , which in turn, can select and compile respective amounts of accumulated virtual resources (e.g., five units of virtual oil corresponding to the UID 3 detection event) in the virtual resource log 436 .
- the wearable device 400 may come within range of a tablet computing device 465 .
- UID 4 associated with the MAC address of the tablet computing device 465 may be received by the signal interface 410 and communicated to the association engine 440 , which can determine from the association table 434 that UID 4 is associated with, for example, virtual coal.
- the association engine 440 can communicate the UID 4 association to the resource allocator 450 , which can log an amount of virtual coal in the virtual resource log 436 .
- the signal detection system can flush the UID log 432 , since the UIDs of the network devices 460 are already known and associated in the association table 434 .
- the signal detection system can transmit virtual resource log data 484 to the host server 470 to enable the host server 470 to attribute collected virtual resources to the user account 476 associated with the user of the wearable device 400 .
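- a minimal sketch of this local detect-and-allocate path, assuming dictionary-backed tables and an illustrative five-unit allocation per detection event:

```python
# Hedged sketch of the local association/allocation path (FIG. 4);
# table contents, amounts, and names are illustrative assumptions.

UNIT_AMOUNT = 5  # e.g., five units per detection event

association_table = {"UID3": "virtual_oil", "UID4": "virtual_coal"}
uid_log = []            # UID log 432
resource_log = {}       # virtual resource log 436

def on_beacon(uid):
    """Handle a detected advertising beacon carrying a device UID."""
    uid_log.append(uid)                       # UID recorder 420
    resource = association_table.get(uid)     # association engine 440
    if resource is not None:
        # resource allocator 450: credit a predetermined amount
        resource_log[resource] = resource_log.get(resource, 0) + UNIT_AMOUNT

on_beacon("UID3")
on_beacon("UID4")
print(resource_log)  # {'virtual_oil': 5, 'virtual_coal': 5}
```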
- the wearable device 400 can communicatively couple with a user's computing device, such as a smartphone or tablet computer.
- the virtual resource log data 484 may then be communicated to the user's computing device which can, in turn, attribute the collected virtual resources to the user's account locally or via network connection to the host server 470 .
- the collected virtual resources may be expended as a result of gameplay, or via various incentive-based programs increasingly prevalent in the mobile application marketplace to induce users to interact with stipulated content.
- the consumption, accumulation, and/or expenditure of collected virtual resources may be performed dynamically in connection with task-oriented operations (e.g., gameplay in the game mode) on a mobile computing device 495 running a respective application utilizing such virtual resources.
- a standalone wearable device 400 may incorporate near field communication (NFC) technology such that the supplementary task of collecting virtual resources may be performed during a user's routine daily activities, and the user can compile the virtual resources via NFC link with the user's computing device 495 .
- the wearable device 400 may come within range of a mobile device 466 having a unique identifier (e.g., MAC address) UID 5 .
- the signal interface 410 can communicate the mobile device's 466 identifier to the UID recorder 420 which can record UID 5 in the UID log 432 of the local memory resource 430 .
- the mobile device's 466 identifier may be further communicated to the association engine 440 which can perform a lookup in the association table 434 to determine whether UID 5 is associated with a virtual resource.
- the wearable device 400 can subsequently utilize the UID log 432 to communicate UID log data 482 to the host server 470 to receive an association between UID 5 and a virtual resource.
- the wearable device 400 can include a local virtual resource catalog, similar to the virtual resource catalog 474 of the host server 470 , and can perform a sequential or randomized selection of a virtual resource (e.g., virtual wood) to associate with UID 5 .
- the association 435 can be communicated to the resource allocator 450 , which can allocate a predetermined amount of virtual wood in the virtual resource log 436 .
- the virtual resource log data 484 which can comprise the accumulated virtual resources since the last established connection, can be communicated to the host server 470 for attribution to the user's account 476 .
- once the virtual resource log data 484 is communicated, the virtual resource log 436 in the wearable device 400 can be flushed.
- the signal interface 410 of the wearable device 400 may be restricted to receive virtual resources associated with network devices 460 only upon establishing a connection with the respective device, as opposed to merely detecting the network device 460 .
- the allocated amounts of a particular virtual resource associated with multiple network devices 460 may be diverse.
- the resource allocator 450 may allocate more virtual resources when the wearable device 400 comes within range of a less accessible access point (e.g., an access point at a remote location).
- the UID log 432 , the association table 434 , and/or the virtual resource log 436 may be reset or otherwise reconfigured.
- Examples described with respect to FIG. 4 illustrate resource association and allocation being performed by the wearable device 400 .
- the mobile computing device 495 can perform various processes as described with regard to the wearable device 400 .
- the mobile computing device 495 can receive the UID data 484 for each detected network device 460 , and can locally perform lookups for the associations and allocations of the virtual resources.
- the memory 430 , shown as a component of the wearable device 400 , may be included on the mobile computing device 495 running a designated application, and/or may be accessible by the mobile computing device 495 via the host server 470 .
- the wearable device 400 can initiate the feedback mechanism to notify the user that virtual resources are proximate to the user's current location.
- the feedback output can include haptic feedback in combination with lighting and/or audio feedback.
- the user can be notified and enabled to select the mining mode, in which the user can “mine” the virtual resource from the network device 460 .
- the feedback mechanism can be initiated and adjusted dynamically to enable the user to search for a direction towards the virtual resource to acquire a certain allocation, and/or to locate the exact position of the network device to acquire a certain allocation, as described below with respect to FIG. 6 .
- the wearable device 400 can utilize location-based resources (not shown) in order to identify virtual resources associated with a particular waypoint. For example, a gameplay environment may preconfigure virtual resources in GPS locations in the real world. The wearable device 400 can identify such locations (e.g., when the wearable device 400 is within a certain proximity of a particular waypoint) and provide feedback via a feedback mechanism that enables the user to search for and mine the virtual resource associated with the waypoint.
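- such waypoint-based triggering might be sketched as follows, assuming GPS coordinates and a haversine radius check (the waypoint list, radius, and names are illustrative, not specified by the patent):

```python
# Hedged sketch: trigger the feedback mechanism when the device is
# within a configured radius of a preconfigured resource waypoint.

import math

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters (haversine)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

WAYPOINTS = [(37.7749, -122.4194, "virtual_gold")]  # illustrative only
RADIUS_M = 50.0

def check_waypoints(lat, lon, notify):
    """Notify (e.g., haptic + light + audio feedback) when a waypoint
    carrying a virtual resource is within the configured radius."""
    for wlat, wlon, resource in WAYPOINTS:
        if distance_m(lat, lon, wlat, wlon) <= RADIUS_M:
            notify(resource)
```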
- FIG. 5 is a block diagram illustrating an example virtual resource management system utilized by a portable sensing device (e.g., wearable device 400 ), or a mobile computing device 495 executing an application in connection with a wearable device 400 .
- the virtual resource management system 500 can include features from the wearable device 400 described with respect to FIG. 4 .
- one or more components of the virtual resource management system 500 may be comprised in any number of electronic devices, such as smartphones, tablet computers, personal computers, laptop devices, wearable computing systems, and the like.
- the virtual resource management system 500 can comprise an application-based program running on any of such devices, enabling a user to collect, accumulate, expend, and consume virtual resources in connection with task-oriented operations performed on the device.
- the user's device may run an application specific to virtual resource gameplay, rendering a gaming environment 578 on a display 590 .
- the user may perform interactions 594 using the display 590 in order to utilize compiled virtual resources in the virtual resource log 565 , as shown in FIG. 5 .
- the virtual resource management system 500 shown and described with respect to FIG. 5 can be provided with UIDs 512 from, for example, a wearable device in a finder mode that operates to locate virtual resources as a user walks or otherwise travels across various networks and comes within network range of various network devices.
- the wearable device can transmit UIDs of various detected network devices to the mobile computing device, which can connect with the host server 570 to determine whether the detected network devices are associated with virtual resources. If so, the mobile computing device can transmit a confirmation signal to the wearable device which can provide feedback to the user (e.g., audio, haptic, and/or visual feedback) indicating that virtual resources are nearby.
- the mobile computing device or the wearable device itself can determine a direction towards the network device associated with the virtual resource. Furthermore, the generated feedback on the wearable device can enable the user to select the mining mode, allowing the user to first identify the direction towards the virtual resource(s) (e.g., via dynamically provided feedback indicating whether the user is “hot” or “cold” depending on the direction the user points the wearable device), and optionally locate the actual position of the network resource device in order to be allocated with a certain amount of the virtual resource.
- the wearable device can include an IMU to determine the user's orientation and a direction in which the user is facing in order to provide the feedback.
- the mining mode may terminate automatically upon acquiring the virtual resources, and the wearable device may automatically revert back to the finder mode.
- the virtual resource management system 500 can include a signal detector 510 (e.g., the wearable device) to detect UIDs 512 of various network devices, as discussed above with respect to FIG. 4 .
- the UIDs 512 may be logged in a UID Log 530 by a UID module 520 .
- various UIDs (i.e., UID XYZ 532 , UID EFG 534 , UID NOP 536 , UID TUV 538 , and UID IJK 539 ), each associated with a respective network device, have been logged in the UID log 530 .
- the UIDs 512 may be communicated to an association engine 540 , which can perform lookups in an association table 560 to determine whether a respective UID is associated with a respective virtual resource.
- in the example shown, UID XYZ 532 is associated with virtual wood 562 , UID EFG 534 is associated with virtual ore 564 , UID NOP 536 is associated with virtual food items 566 , and UID TUV 538 is associated with virtual workers 568 .
- Virtual resources can further include various forms of digital currency or money, or images, video clips, three-dimensional (3D) interactive models, sound clips, a mini-game, or other content that may be presented to the user in response to finding the virtual resource.
- the association table 560 can be provided with any type of virtual resource described herein.
- virtual resources may be stored for subsequent task-oriented activities (e.g., virtual gold or virtual money to be expended in a subsequent game), or may require immediate consumption (e.g., a “mined” video clip discovered by the user which can be viewed once and deleted).
- UID NOP 536 may be communicated to the association engine 540 , which, upon performing a lookup in the association table 560 , can determine that UID NOP 536 is associated with virtual food items 566 .
- the association engine 540 can communicate this association 542 to a resource engine 550 , which can log allocations 552 of virtual resources in a virtual resource log 565 , as shown in FIG. 5 .
- the resource engine 550 can allocate a predetermined amount of virtual food 566 in the virtual resource log 565 .
- the user has accumulated 1777 units of virtual food 566 , in addition to 765 units of virtual wood 562 , 925 units of virtual ore 564 , and 25 units of virtual workers 568 .
- the user has yet to accumulate any virtual crystal, virtual coal, or virtual oil, which may be essential items in connection with the rendered gameplay 592 on the display 590 .
- the signal detector 510 can communicate UID IJK 539 to the association engine 540 , which upon performing a lookup in the association table 560 , can determine that UID IJK 539 is not yet associated with a virtual resource (i.e., unknown 569 ).
- the association engine 540 may attribute UID IJK 539 with a particular virtual resource (e.g., virtual oil) by referencing a local virtual resource catalog, similar to the virtual resource catalog 574 of the host server 570 .
- the association engine 540 may compile an association call 544 , which can be communicated to the host server 570 upon establishing a network connection via network 580 .
- the host server 570 can sequentially or randomly select a virtual resource (e.g., virtual oil), from the virtual resource catalog 574 , and log the new association between UID IJK 539 and virtual oil in a universal associations list 572 .
- the host server 570 can respond to association calls 544 with new associations 596 logged in the universal associations list 572 .
- the new associations 596 can be communicated to the virtual resource management system 500 over the network 580 via a communication interface 595 of the virtual resource management system 500 . Accordingly, the new associations 596 can be received by the association engine 540 , which can log the new associations 596 in the association table 560 .
- the association between UID IJK 539 and virtual oil may be transmitted through the communication interface 595 of the virtual resource management system 500 to the association engine 540 , which can update the association table 560 to replace “unknown 569 ” with virtual oil.
- the resource engine 550 can manage the resource log 565 based on both user interactions 594 performed via the rendered gameplay 592 on the display 590 , and detected UIDs 512 and received associations 542 based on such UIDs 512 .
- the user interactions 594 in connection with the rendered gameplay 592 can cause virtual resources stored in the virtual resource log 565 to be expended. Such expenditures 567 in the rendered gameplay 592 can correlate to a depletion of the relevant virtual resource in the virtual resource log 565 .
- the user interactions 594 can cause respective virtual resources to be traded, consumed, accumulated, invested, or expended in accordance with the rendered gameplay 592 .
- the detection of UID IJK 539 can cause the resource engine 550 to attribute a predetermined amount of virtual oil (e.g., five units) in the virtual resource log 565 .
- user interactions 594 in connection with the rendered gameplay 592 can cause dynamic updates, by the resource engine 550 , to the virtual resource log 565 .
- the virtual resource log 565 can be dynamic in nature, in connection with the rendered gameplay 592 and the detection of UIDs 512 .
- the resource engine 550 can continuously perform allocations 552 and expenditures 567 of virtual resources in the virtual resource log 565 .
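- a minimal sketch of this dual allocate/expend behavior of the resource engine 550 , with illustrative names and amounts:

```python
# Hedged sketch: the resource engine credits allocations from detection
# events and debits expenditures from gameplay. Names are illustrative.

class ResourceEngine:
    def __init__(self):
        self.log = {}   # virtual resource log 565

    def allocate(self, resource, amount):
        """Credit resources for a UID detection/association event."""
        self.log[resource] = self.log.get(resource, 0) + amount

    def expend(self, resource, amount):
        """Debit resources for a gameplay interaction; refuse overdrafts."""
        if self.log.get(resource, 0) < amount:
            return False
        self.log[resource] -= amount
        return True

engine = ResourceEngine()
engine.allocate("virtual_oil", 5)     # e.g., detection of UID IJK
engine.expend("virtual_oil", 3)       # spent via the rendered gameplay
print(engine.log)                     # {'virtual_oil': 2}
```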
- the resource engine 550 can communicate accumulated virtual resources 597 to the host server 570 , which may attribute the accumulated virtual resources 597 to the user's account 576 to save progress data corresponding to the rendered gameplay 592 .
- the rendered gameplay 592 may be application or software-based utilizing any number of resources of the user's computing device.
- the rendered gameplay 592 may be provided in connection with augmented reality, virtual reality, or a virtually generated gaming environment 578 provided by the host server 570 .
- the gaming environment 578 provided can comprise virtual features rendered in a real world environment.
- a camera included on the user's computing device may be utilized to capture real-world images or video, and the gaming environment 578 may be rendered thereon.
- the rendered gameplay 592 may be incorporated in conjunction with the use of a remotely operated self-propelled device. In such implementations, the rendered gameplay may include virtual controls to remotely control the self-propelled device.
- the virtual resource management system 500 can operate in connection with the self-propelled device, which can incorporate one or more features of the virtual resource management system 500 , such as the signal detection features (i.e., signal detector 510 , UID module 520 , and UID log 530 ).
- FIG. 6 is a flow chart describing an example method of acquiring virtual resources using a wearable device.
- the method may be performed by, for example, the wearable device 400 as illustrated in FIG. 4 , or by a wearable device 400 in communication with a mobile computing device 495 running a designated application, as shown in FIG. 4 .
- the wearable device 400 can initiate a finder mode and initially detect a signal, such as an advertising beacon, from a network device, such as an access point ( 600 ).
- the beacon can include the network device's MAC address, or other UID representing the network device.
- the wearable device 400 can then perform a lookup in an association table 434 to determine whether the UID is associated with a given virtual resource ( 605 ). In some aspects, the wearable device 400 can transmit the UID to the mobile computing device 495 , which can make the determination accordingly.
- the wearable device 400 (or mobile computing device 495 ) identifies from the association table 434 whether the UID is associated. If the UID is not associated with a virtual resource ( 614 ), then the wearable device 400 can continue to monitor and detect signals from network devices ( 600 ), and/or generate a request for a UID association and retrieve the virtual resource association.
- the wearable device 400 , or a user utilizing the wearable device 400 , can be allocated a predetermined amount of the given virtual resource in the virtual resource log 436 in the following manner.
- the wearable device 400 can generate feedback to the user indicating the proximate virtual resource ( 615 ).
- the wearable device 400 may then receive a user input (e.g., on an input mechanism or a mode selector on the wearable device) to search for the virtual resource, thereby initiating the mining mode ( 620 ).
- the wearable device 400 can determine a direction towards the network signal (e.g., the advertising beacon having the UID associated with the virtual resource) ( 625 ).
- the wearable device 400 can monitor the sensor data, from one or more sensors of the wearable device 400 , to determine whether the user points the wearable device towards or away from the virtual resource signal ( 630 ). While the user points the wearable device 400 in differing directions, the wearable device 400 can adjust the feedback dynamically ( 635 ).
- the feedback, which can comprise a combination of haptic, audio, and visual feedback, can increase in intensity as the user points the wearable device 400 towards the network device, and decrease in intensity as the user points the wearable device 400 away from the network device. Accordingly, once the user has pointed the wearable device 400 in the correct direction for a predetermined time period (e.g., three seconds), the wearable device 400 can acquire a certain allocation of the virtual resource, and store or otherwise enable the user to consume or expend the acquired resource ( 640 ).
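- the direction-seeking feedback loop might be sketched as follows (the alignment tolerance, dwell time, and callback interface are assumptions, not specified by the patent):

```python
# Hedged sketch of the mining-mode feedback loop (FIG. 6): feedback
# intensity scales with how directly the device points at the signal
# source, and the resource is acquired after a sustained dwell.

import time

DWELL_S = 3.0          # e.g., three seconds pointed correctly
ALIGN_DEG = 15.0       # tolerance for "pointing at" the source

def mine(bearing_to_source, read_heading, set_intensity):
    """Run until the user holds the correct heading long enough."""
    held_since = None
    while True:
        # signed heading error folded into [-180, 180), then magnitude
        error = abs((read_heading() - bearing_to_source + 180) % 360 - 180)
        set_intensity(max(0.0, 1.0 - error / 180.0))  # "hotter" when aligned
        if error <= ALIGN_DEG:
            held_since = held_since or time.monotonic()
            if time.monotonic() - held_since >= DWELL_S:
                return True           # acquire the allocation (640)
        else:
            held_since = None
        time.sleep(0.05)
```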
- FIGS. 7A and 7B are low level flow charts describing example processes for managing virtual resources in connection with signal detection and gameplay.
- the low level method described in connection with FIGS. 7A and 7B may be performed by, for example, the virtual resource management system 500 as illustrated in FIG. 5 .
- the virtual resource management system 500 can detect advertising signals (e.g., beacons) from network devices ( 700 ).
- advertising signals may include UIDs, such as MAC addresses for the detected network devices.
- the virtual resource management system 500 can then record the UIDs of the network devices in the UID log 530 ( 705 ).
- the virtual resource management system 500 may perform a lookup in the association table 560 ( 710 ) to determine whether a respective UID is associated with a respective virtual resource ( 715 ). If the respective UID is associated with a respective virtual resource ( 717 ), then the virtual resource management system 500 can allocate a predetermined amount of the respective virtual resource in the virtual resource log 565 ( 720 ). Thereafter, the process may begin again with the detection of advertising signals from network devices ( 700 ). However, if the respective UID is not associated with a respective virtual resource ( 719 ), the virtual resource management system 500 can generate a request, or an association call 544 , to create a new association 596 for the respective UID ( 725 ).
- the virtual resource management system 500 can transmit the request to the host server 570 when a network connection is established ( 730 ).
- the virtual resource management system 500 can locally select a respective virtual resource from a local virtual resource catalog ( 740 ). Such a selection may be made sequentially ( 743 ), for example, if the virtual resource catalog is a sequential list of virtual resources. Or, the selection may be made randomly ( 741 ) from the virtual resource catalog by way of a random selection technique.
- the virtual resource management system 500 can receive, remotely or locally, associated virtual resources for unassociated UIDs ( 745 ).
- the respective unassociated UID may be associated with a respective virtual resource, and thereafter the virtual resource management system 500 can allocate a predetermined amount of the respective virtual resource in the virtual resource log 565 ( 720 ). After all detected UIDs are associated with their respective virtual resources, the virtual resource management system 500 can flush the UID log ( 747 ), since storing the UIDs may no longer be necessary.
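- a short sketch of the new-association path for unassociated UIDs, assuming a local catalog and the sequential ( 743 ) or random ( 741 ) selection described above (catalog contents and names are illustrative):

```python
# Hedged sketch: create a new UID-to-resource association locally.

import random

CATALOG = ["virtual_wood", "virtual_ore", "virtual_coal", "virtual_oil"]
_next = 0

def select_sequential():
    """Walk the catalog in order, wrapping around (743)."""
    global _next
    resource = CATALOG[_next % len(CATALOG)]
    _next += 1
    return resource

def select_random():
    """Pick uniformly at random from the catalog (741)."""
    return random.choice(CATALOG)

def associate(uid, table, sequential=True):
    """Log a new association for an unassociated UID and return it."""
    if uid not in table:
        table[uid] = select_sequential() if sequential else select_random()
    return table[uid]
```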
- the virtual resource management system 500 may receive a user input (e.g., via a touch input on a touch-sensitive display) to launch a gaming application associated with trading, consuming, accumulating, earning, investing, or otherwise expending virtual resources. Based on the user input, the virtual resource management system 500 can initiate the gameplay application to place the wearable device in game mode ( 750 ). In doing so, the gaming environment may be rendered on the display 590 ( 755 ) (e.g., on the display of the mobile computing device).
- the virtual resource management system 500 can receive various user interactions 594 in connection with the gameplay ( 760 ). Such user interactions 594 may be performed by way of touch inputs, mouse interactions, keyboard interactions, interactions using a game controller such as a joystick or a specialized controller device, or a combination of the above. Furthermore, such user interactions 594 may be performed in connection with remote operation of a remotely operated device.
- the gameplay can incorporate a virtual or augmented reality environment in which the user can utilize collected virtual resources. For example, collected virtual resources may be consumed by a virtual character under operative control of the user. Additionally or alternatively, selected virtual resources can be expended to build a virtual building or town. Collected virtual food items may be utilized to feed a virtual colony of gameplay characters. Virtual oil may be utilized by the user during gameplay to modernize a primitive society. Virtual workers may be employed for production or to build infrastructure. Various alternatives and additions in connection with task-oriented operations and gameplay are contemplated.
- the resource engine 550 of the virtual resource management system 500 can dynamically modify or update the virtual resource log 565 to enable the user to expend, trade, accumulate, invest, earn, etc. virtual resources from the virtual resource log 565 ( 765 ).
- the virtual resource management system 500 may receive a user input to deactivate the gaming application, and thus terminate the gaming mode ( 770 ).
- the resource engine 550 of the virtual resource management system 500 can compile the virtual resources left in the virtual resource log 565 , and transmit a saved list of virtual resources to the host server 570 ( 775 ). Thereafter, the virtual resource management system 500 may flush the virtual resource log ( 777 ), since the progress is remotely saved.
- FIGS. 8A through 8C illustrate unique identifier logs and association tables utilized in connection with virtual resource association, acquisition, and allocation.
- the detected MAC address of various network devices may be associated with any number of plausible items. Such items may be individuals associated with the network device (e.g., an owner of the network device), or such items may be associated with persons of interest (e.g., characters in connection with a gaming environment), as illustrated in FIG. 8A . Alternatively, such items may be locations of the respective network devices, such that a user is enabled to identify where he/she has traveled, as illustrated in FIG. 8B .
- such items may be associated with a time stamp, thereby enabling a user to determine that an interaction has taken place with a particular network device and a time in which the interaction took place, as illustrated in FIG. 8C .
- the examples provided with respect to FIGS. 8A through 8C may be implemented using the wearable device 400 as described with respect to FIG. 4 , and/or using a modified virtual resource management system 500 as discussed with respect to FIG. 5 .
- a UID Log 810 of a signal detection system can compile UIDs (e.g., UID XYZ 812 , UID EFG 814 , UID NOP 816 , UID TUV 818 , UID IJK 819 , etc.) associated with respective network devices.
- the respective network devices may be any one of an access point, a mobile computing device, a tablet computer, a personal computer, a BLUETOOTH-enabled device, a radio-frequency identification device, a local area network enabled device, etc.
- a user may carry the wrist-worn device 800 , which can come within wireless detection range of any number of the foregoing network devices.
- the wrist-worn device 800 can compile UIDs corresponding to the network devices in a UID Log 810 .
- the wrist-worn device 800 only compiles UIDs associated with network devices with which the user comes within wireless range.
- the user may enable a function on a computing device (e.g., a tablet computer, smart phone, PC, etc.), which can pull the UIDs from the wrist-worn device 800 and make associations according to a local association table.
- the wrist-worn device 800 can include connectivity functions to establish a connection with the computing device (e.g., Wi-Fi, BLUETOOTH, etc.).
- the wrist-worn device 800 can include inductive data communication capabilities in order to transmit the UIDs to the computing device over an inductive link.
- the wrist-worn device 800 can include a local association table 820 in order to make a given association when the wrist-worn device 800 comes within wireless range of a given network device.
- the association table 820 of the wrist-worn device 800 can log an association. For example, as the user carries the wrist-worn device 800 within wireless range of an access point, the wrist-worn device 800 can receive the access point's advertising beacon, which can include the access point's unique identifier, UID XYZ 812 .
- the association table 820 of the wrist-worn device 800 can be referenced to identify that UID XYZ is associated with Ricky H. 822 , who may be an owner of the access point.
- the user may then journey within range of a wireless device with UID EFG 814 , which the association table 820 may associate with Jose C. 824 —an owner of the wireless device.
- the user may thereafter review the association table/log to identify the individuals (i.e., Ricky H., Jose C., etc.) whose devices the user encountered during the excursion.
- the association table may associate such UIDs with characters or places in a gaming environment.
- each detection event can correspond to a meeting or visitation in which the user meets with a character or visits a point of interest.
- a task-oriented application may require the user to meet a certain character or visit a certain place before a next achievement is reached.
- the wrist-worn device 800 may detect advertising beacons for respective devices associated with UID NOP 816 , UID TUV 818 , and UID IJK 819 .
- the association table 820 may be referenced to identify characters and/or places (real or virtual) with which the respective network devices are associated.
- the association table/log 820 can identify Mark M. 826 as being associated with UID NOP 816 , Dave S. 828 as being associated with UID TUV 818 , and the Coliseum 829 as being associated with UID IJK 819 .
- the task-oriented application can register such meetings and visits and record a number of respective achievements.
- FIG. 8B illustrates another usage scenario in which the UIDs of network devices may be associated with a real-world or mock-world environment.
- the user may carry the signal detection device 800 to within wireless range of an access point of a coffee shop, where the access point has a unique identifier UID XYZ 832 .
- the wrist-worn device 800 can log UID XYZ in the UID Log 830 and reference the association table/log 840 to determine that UID XYZ 832 is associated with the coffee shop 842 . After an excursion of passing through any number of wireless beacons, the user can map the excursion using the physical locations, logged in the association table/log 840 , of the network devices along the way.
- the physical locations of the network devices may be determined via the detected beacon, which may include location information.
- the wrist-worn device 800 or mobile computing device to which the wrist-worn device 800 is connected, can include location-based functionality (e.g., GPS resources) to log the location in the association table/log 840 . Accordingly, in response to a detection event, the wrist-worn device 800 can be triggered to pinpoint the physical location of the detection event—which can be logged along with the UID of the network device, as shown in FIG. 8B .
- the location associated with a UID may be based on a mock-world environment corresponding to, for example, a gaming environment.
- a user may interact with a task-oriented application on a computing device, which may require the user to visit El Dorado 844 .
- the user may then physically search for or journey to a specified network device having UID EFG 834 , which, in accordance with the association table, is associated with El Dorado 844 .
- the user may be enabled to reference the association table 840 , which may provide a physical location in the real-world of the network device associated with El Dorado 844 .
- the association table/log 840 can log the detection event, and the user can accomplish a next achievement in the task-oriented application.
- the wrist-worn device 800 may also include a timer or clock to enter a timestamp as triggered by a detection event, as illustrated in FIG. 8C .
- a user may further review a time in which the wrist-worn device 800 detected the wireless signal of a respective wireless device.
- the user may come within wireless range of a network device having UID XYZ 852 , which can be logged as being associated with Item 1 862 .
- the detection event of UID XYZ 852 can further trigger a clock or timer to log a timestamp associated with the detection event.
- the time may reflect a local or universal time, or may reflect an elapsed time from, say, the beginning of the excursion or a start time of the task-oriented application.
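- a compact sketch of such an association log with per-event timestamps, in the spirit of FIGS. 8A through 8C (the UIDs and items below are placeholders):

```python
# Hedged sketch: an association log keyed by UID, recording the
# associated item (person or place) and a timestamp per detection event.

from datetime import datetime, timezone

association_table = {
    "UID_XYZ": "Ricky H.",      # owner of an access point (FIG. 8A)
    "UID_EFG": "El Dorado",     # mock-world place (FIG. 8B)
}

detection_log = []  # (uid, item, timestamp) tuples (FIG. 8C)

def on_detection(uid):
    """Record a detection event with its associated item and a timestamp."""
    item = association_table.get(uid, "unknown")
    stamp = datetime.now(timezone.utc).isoformat()
    detection_log.append((uid, item, stamp))

on_detection("UID_XYZ")
print(detection_log[-1])
```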
- FIG. 9 illustrates a wearable device pairing triggering virtual resource data sharing in connection with gameplay, as described herein.
- the wearable device 900 may be carried or worn by a user to detect advertising beacons or wireless signals from various network devices.
- upon linking with a computing device 905 , the wearable device 900 can transmit the UIDs 904 from the UID Log 901 to the computing device 905 .
- the link may be any data connection.
- the wearable device 900 can include functionality corresponding to Wi-Fi, radio-frequency, infrared, BLUETOOTH, near-field communication (NFC), and the like.
- the UIDs 904 may be transmitted to the computing device 905 over such a communication link.
- the wearable device 900 can collect UIDs corresponding to various network devices, which may be associated with any number of items.
- an association table of either the computing device 905 or the wearable device 900 can include associations between UIDs of network devices and the registered owners of those network devices.
- the registered owner may be logged in an association log 903 .
- a user may scroll through the association log 903 to identify individuals with whom the user came into wireless contact.
- the association table of either the wearable device 900 or the computing device 905 can associate various network devices with real-world or mock-world characters, landmarks, or other places. Such associations may be made in connection with a task-oriented application 902 , such as a gaming application providing a gaming environment.
- the wearable device 900 can itself include an association table/log and/or a virtual resource log to perform and log associations and collect virtual resources.
- logged association items 906 and/or collected virtual resources 908 may also be transmitted to the computing device over the communication link.
- timestamps 909 correlated to the detection events and logged associated items 906 may also be transmitted to the computing device 905 .
- the computing device 905 may run a task-oriented application 902 , which can trigger the communication link with the wearable device 900 .
- Execution of the task-oriented application 902 can correspond to running a game providing a gameplay environment which utilizes items associated with the UIDs 904 of network devices.
- the task-oriented application 902 can cause the computing device 905 to receive the UIDs 904 from the wearable device 900 and reference an association log 903 to determine whether a given UID is associated with a given association item. If the given UID is not associated, the computing device 905 can create an association or retrieve an association from a host server. Given an association, the computing device 905 can log a specified amount of collected virtual resources in a local virtual resource log 907 for use in the task-oriented application 902 .
- the wearable device 900 may operate in a default mode in which the wearable device 900 monitors for wireless signals indicating that another wearable device (e.g., wearable device 930 ) is nearby.
- the wearable devices 900 , 930 can each detect wireless signals from each other (e.g., beacon signals) and can generate feedback (e.g., a haptic and audio response) indicating such.
- the users can visually locate each other and have the option of pairing their wearable devices 900 , 930 .
- the default mode may be executing as a background mode while the wearable device 900 operates in another mode, such as finder mode.
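- the background peer-discovery check might be sketched as follows (the beacon name prefix and feedback interface are assumptions for illustration):

```python
# Hedged sketch: while a foreground mode runs, periodically check
# radio scan results for beacons from other wearable devices and
# generate feedback when one is nearby.

PEER_PREFIX = "WEARABLE-"   # illustrative advertisement name prefix

def background_peer_check(scan_results, feedback):
    """Call periodically with current scan results from the radio.
    Returns the peer's identifier when one is found, as a candidate
    for pairing; otherwise None."""
    for beacon in scan_results:
        if beacon["name"].startswith(PEER_PREFIX):
            feedback(haptic=True, audio=True)   # e.g., buzz + chime
            return beacon["id"]
    return None
```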
- the wearable devices 900 , 930 can each initiate a sharing mode that enables the users to view and/or share acquired virtual resources.
- the computing devices 905 , 910 of the users can each run a task-oriented application, or gaming application 912 , that enables the users to share information regarding the virtual resources collected, and trade virtual resources accordingly.
- the user of the wearable device 930 can receive virtual resource data 927 from the wearable device 900 (or the mobile computing device 905 ), and the user's mobile computing device 910 can acquire “unlocked” virtual resources 929 based on a trade with the wearable device 900 .
- the user may then run the gaming application 912 on the computing device 910 and interact with the computing device 910 via the gaming application 912 , which can provide a gaming environment 920 that requires the use, directly or indirectly, of an association table 914 in connection with the wearable device 930 and the gaming application 912 .
- the gaming environment 920 may require user interactions with real-world or mock-world characters and/or places, which may be accomplished by coming within wireless range of specified network devices associated with such real-world or mock-world characters and/or places in an association table 914 of the computing device 910 .
- the computing device 910 can log associated items (e.g., real-world characters, mock-world characters, real-world locations, mock-world locations, etc.). In accordance with the gaming environment 920 , upon logging such associations, various tasks may be achieved.
- the gaming environment 920 may require the collection and use of virtual resources, as described above.
- the user may run the signal detector/gaming application 912 and interact with the gaming environment 920 , which may require the user to collect a number and amount of virtual resources.
- the wearable device 930 enables the user to acquire unlocked virtual resources 929 from another user, which can be compiled in a virtual resource log 918 associated with the gaming application 912 .
- the user can perform a physical excursion to enable the computing device 910 to wirelessly interact or otherwise detect various network devices to collect such virtual resources for utilization in the gaming environment 920 .
- the gaming environment 920 can provide the virtual resource log 918 which can inform the user of which virtual resources and how many of each virtual resource the user has collected.
- FIGS. 10A and 10B illustrate a wearable device pairing triggering an interactive mode, or battle mode, between a pair of proximate users 1002 , 1012 as shown in FIG. 10B .
- the interactive mode can be triggered upon selection of the mode on each of a pair of wearable devices 1010 , 1030 .
- an inductive link 1025 between the devices 1010 , 1030 can enable the users to select the interactive mode, as shown in FIG. 10A .
- the devices 1010 , 1030 may each be connected to a respective mobile computing device, such as mobile computing device 1050 and mobile computing device 1060 shown in FIG. 10B .
- the users 1002 , 1012 can each select the interactive mode to perform a series of actions and alternate between offensive and defensive sub-modes in the interactive mode.
- users 1002 and 1012 operate their wearable devices 1010 , 1030 in interactive mode, which transmits action data 1048 between the devices 1010 , 1030 .
- the action data 1048 can correspond to user gestures 1041 performed by the users 1002 , 1012 while the devices 1010 , 1030 are in interactive mode.
- one device may be in an offensive sub-mode in which the user 1002 can perform a number of offensive actions 1042 , such as attack actions, using the wearable device 1010 .
- Action data 1048 corresponding to the offensive action 1042 can be generated by the wearable device 1010 , and transmitted to the wearable device 1030 .
- the wearable device 1010 can generate feedback to the user indicating whether or not the offensive action 1042 was successful.
- the proximate user 1012 can perform user gestures 1041 corresponding to a defensive action 1044 based on the offensive action 1042 performed by the user 1002 .
- based on whether the defensive action 1044 was successful, the wearable device 1030 can generate feedback reflecting the outcome.
- the wearable devices 1010 , 1030 can alternate between offensive and defensive sub-modes, giving each user successive opportunities to perform offensive 1042 and defensive actions 1044 .
- sensor patterns can be preconfigured corresponding to any number of offensive actions 1042 and defensive actions 1044 . Still further, only certain sensor patterns for defensive actions 1044 (e.g., a specified type of blocking action) may be effective against a given offensive action 1042 .
- Each sensor pattern for each offensive 1042 or defensive action 1044 can be preconfigured in accordance with a set of rules (e.g., gaming rules) that utilize the wearable devices 1010 , 1030 in connection with such physical gameplay.
- the users 1002 , 1012 may have previously “trained” to perfect the defensive actions 1044 in light of the offensive actions 1042 , and vice versa, in a training mode described herein.
- the mobile computing devices 1050 , 1060 connected to each wearable device 1010 , 1030 can maintain a tally or score based on any number of metrics in light of the offensive and defensive interactions between the users 1002 , 1012 .
- Successful or partially successful offensive 1042 or defensive actions 1044 may be given respective scores, and unsuccessful actions 1042 , 1044 may be given other scores, or negative scores.
- a predetermined threshold score 1052 may be set by the interactive session, for example, as agreed upon by the users 1002 , 1012 , or in accordance with a particular game selected in the interactive mode. Once reached, the interactive session may end and a winner may be declared.
- a final result can be displayed on the respective mobile computing devices 1050 , 1060 , and the users 1002 , 1012 may then choose to initiate another interactive session or conclude the experience.
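- a minimal sketch of such session scoring, with illustrative score values and threshold (none of these values are specified by the patent):

```python
# Hedged sketch: actions earn positive, partial, or negative scores,
# and the session ends when either user reaches the agreed threshold.

SCORES = {"success": 10, "partial": 5, "failure": -5}
THRESHOLD = 100   # threshold score 1052, agreed upon by the users

def record_action(tally, user, outcome):
    """Update a user's tally; return the winner if the threshold is hit."""
    tally[user] = tally.get(user, 0) + SCORES[outcome]
    if tally[user] >= THRESHOLD:
        return user       # declare a winner and end the session
    return None

tally = {}
record_action(tally, "user_1002", "success")
record_action(tally, "user_1012", "partial")
print(tally)              # {'user_1002': 10, 'user_1012': 5}
```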
- FIG. 11 is a flow chart describing an example method of implementing an interactive mode between a pair of users utilizing a corresponding pair of wearable devices.
- the wearable device 1010 can detect an inductive pairing 1025 with a proximate wearable device 1030 ( 1100 ).
- the wearable device 1010 can initiate an interactive mode ( 1105 ).
- the wearable device 1010 can determine an ordering or sequence between offensive and defensive sub-modes ( 1110 ), and then monitor sensor data for user gestures 1041 performed depending on a present sub-mode.
- the wearable device 1010 can identify user gestures 1041 indicating an offensive attack 1042 ( 1120 ). The wearable device 1010 can do so by active monitoring or via state machine monitoring, as described herein. Sensor patterns corresponding to each of any number of offensive attack actions can be detected, and cause the wearable device 1010 to generate and transmit action data 1048 indicating the particular offensive action 1042 performed by the user 1002 to the proximate device 1030 ( 1125 ). The wearable device 1010 may thereafter receive data from the proximate wearable device 1030 indicating whether or not the offensive action was successful ( 1130 ) and generate a feedback output accordingly ( 1135 ).
- the wearable device 1010 can generate positive feedback (e.g., audio, visual, etc.) ( 1137 ). If the offensive action 1042 was unsuccessful or a failure, then the wearable device 1010 can generate negative feedback ( 1139 ). Upon generating the feedback, the wearable device 1010 can switch to the defensive sub-mode ( 1180 ).
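- As a rough illustration of the offensive pass through FIG. 11 (steps 1120 through 1180), the sketch below uses a stub device object; match_gesture, transmit, await_result, and feedback are invented stand-ins for the device's sensor-pattern matching, radio, and feedback layers, which the disclosure does not specify at this level.

```python
import random

class StubDevice:
    """Stand-in for wearable device 1010; real sensing/radio layers omitted."""
    sub_mode = "offensive"
    def read_sensors(self): return [0.1, 0.9, 0.2]      # fake IMU samples
    def match_gesture(self, samples): return "overhead_strike"
    def transmit(self, payload): print("tx:", payload)   # to device 1030
    def await_result(self): return {"success": random.random() > 0.5}
    def feedback(self, kind): print("feedback:", kind)

def offensive_turn(device):
    # Identify an offensive gesture (1120), send action data to the
    # proximate device (1125), receive the outcome (1130), output
    # feedback (1135-1139), then switch to the defensive sub-mode (1180).
    action = device.match_gesture(device.read_sensors())
    device.transmit({"action": action})
    result = device.await_result()
    device.feedback("positive" if result["success"] else "negative")
    device.sub_mode = "defensive"

offensive_turn(StubDevice())
```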
- the wearable device 1010 can receive data indicating an offensive attack action 1042 from the proximate wearable device 1030 ( 1150 ).
- the wearable device 1010 can monitor sensor data for sensor patterns corresponding to user gestures 1041 indicating a defensive action 1044 , or blocking move, performed by the user 1002 ( 1155 ).
- the wearable device 1010 can further determine, using timing and sensor data, whether the defensive action 1044 was successful ( 1160 ).
- the user 1002 may be given a predetermined amount of time to perform a defensive action 1044 in response to an offensive action 1042 once the action data 1048 (indicating the offensive action 1042 ) is received. Accordingly, reception of the action data 1048 can trigger a timer on the wearable device 1010 giving the user limited time to perform an appropriate defensive action 1044 .
- the wearable device 1010 can generate a feedback output ( 1165 ). If the defensive action 1044 is successful and performed within the time threshold, the wearable device 1010 can generate positive feedback ( 1167 ). However, if the defensive action 1044 was unsuccessful, the wearable device 1010 can generate negative feedback ( 1169 ), and then transmit successful/unsuccessful data to the proximate device 1030 ( 1170 ), and tally the score for the interactive session.
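- The timed defensive pass (steps 1150 through 1170) can be sketched as a polling loop against a deadline. The two-second window, the pattern names, and the counter table below are invented values standing in for the predetermined amount of time and the preconfigured sensor patterns.

```python
import time

DEFENSE_WINDOW_S = 2.0   # invented "predetermined amount of time"
COUNTERS = {"overhead_strike": "high_block"}  # which block beats which attack

def defensive_turn(incoming_action, read_block_gesture):
    # Reception of the action data triggers the timer (see above).
    deadline = time.monotonic() + DEFENSE_WINDOW_S
    while time.monotonic() < deadline:
        block = read_block_gesture()          # poll sensor patterns (1155)
        if block == COUNTERS.get(incoming_action):
            return True                       # effective block in time (1167)
    return False                              # too slow or wrong block (1169)

# Example: a reader that immediately produces the correct blocking move.
print(defensive_turn("overhead_strike", lambda: "high_block"))
```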
- the wearable device 1010 can determine whether the threshold score has been achieved by either the user 1002 or the proximate user 1012 ( 1185 ). If not ( 1187 ), then the users 1002 , 1012 can continue in the interactive mode, alternating between offensive ( 1175 ) and defensive ( 1180 ) sub-modes. However, if the threshold score has been achieved by one of the users 1002 , 1012 ( 1189 ), then the wearable device 1010 can exit the interactive mode and transmit the final scores to the connected mobile computing device 1050 ( 1195 ).
- Proximity sensors can include components of an inertial measurement unit (IMU) incorporated into each wearable device 1010 , 1030 (e.g., an infrared sensor). Additionally or alternatively, GPS resources on the wearable devices 1010 , 1030 themselves or on connected mobile computing devices 1050 , 1060 can be utilized for proximity-based activities. Such activities can include sporting activities to determine, for example, finish line positions of a running race or physical game activities for children (e.g., hide-and-seek or capture the flag). Feedback can be provided to each user in the interactive session, providing information connected to the proximity-based activity (e.g., information indicating a team-leader, a ranking, a team score, positional information, and the like).
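- For the GPS-based variant, a finish-line ranking could be computed from position fixes as sketched below. The haversine helper and all coordinates are invented for illustration; the disclosure does not specify how positions are compared.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two GPS fixes.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

finish_line = (37.7749, -122.4194)
runners = {
    "user_1002": (37.7750, -122.4194),
    "user_1012": (37.7755, -122.4198),
}
# Rank users by remaining distance; each wearable could then output
# feedback such as a ranking or positional information.
ranking = sorted(runners, key=lambda u: haversine_m(*runners[u], *finish_line))
print(ranking)
```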
- FIG. 12 is a flow chart describing an example method of initiating a training mode on a wearable device in connection with a self-propelled device.
- the wearable device 102 can detect a user input placing the wearable device 102 in training mode ( 1200 ).
- the user input can be detected via a mode selector on the wearable device 102 ( 1202 ), or via launch of a designated application on a connected mobile computing device ( 1204 ).
- the training mode can be initiated via a combination of user inputs on the mode selector and an inductive link with a remotely operated device 140 .
- the wearable device 102 may also detect an inductive pairing with a remotely operated device 140 ( 1205 ).
- the wearable device 102 can transmit data to initiate the training mode on the remotely operated device 140 as well ( 1210 ).
- the transmitted data can cause the remotely operated device 140 to execute instructions to aid the user in training for a series of actions.
- the series of actions can correspond to offensive and defensive actions that the user can implement when the wearable device 102 is in interactive or battle mode. Additionally or alternatively, the series of actions can get progressively more difficult as the user successively accomplishes each action.
- the wearable device 102 can synchronize directionally with the remotely operated device 140 ( 1215 ).
- the user can manually synchronize the gyroscopic sensors of the devices 102 , 140 by physically pointing a forward operational direction of the remotely operated device 140 away from the wearable device 102 and providing a calibration input (e.g., an input on the mode selector of the wearable device 102 ) ( 1217 ).
- the gyro synchronization may be performed automatically ( 1219 ).
- Automatic synchronization can be initiated by the wearable device 102 by generating and transmitting a spin command to the remotely operated device 140 , which can execute a spin accordingly ( 1220 ).
- the wearable device 102 can detect an asymmetry in the radiation pattern of the remotely operated device 140 as it spins, indicating the direction towards the remotely operated device 140 ( 1225 ).
- the wearable device 102 can transmit a direction calibration command to the remotely operated device 140 indicating the direction, which the remotely operated device 140 can process to align its internal drive system accordingly ( 1230 ).
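- One way to realize the automatic pass (steps 1220 through 1230) is to sample received signal strength while the remotely operated device spins and take the heading at the peak as the direction toward the device. The sketch below is illustrative only; send_command and sample_rssi are invented stand-ins for radio and command layers that the disclosure does not detail.

```python
import math

def calibrate_heading(send_command, sample_rssi, steps=36):
    send_command("spin")                # 1220: device executes a spin
    samples = []
    for i in range(steps):
        heading_deg = i * (360 / steps)
        samples.append((sample_rssi(), heading_deg))
    _, best_heading = max(samples)      # 1225: peak of the asymmetric pattern
    # 1230: direction calibration command aligns the internal drive system.
    send_command(f"align_drive_system {best_heading:.0f}")
    return best_heading

# Example with a fake radiation pattern peaking at 90 degrees:
fake = iter(math.cos(math.radians(h - 90)) for h in range(0, 360, 10))
print(calibrate_heading(lambda cmd: None, lambda: next(fake)))  # -> 90.0
```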
- the wearable device 102 can track the location of the remotely operated device 140 as it traverses and maneuvers ( 1240 ).
- the remotely operated device 140 can include surface features or an accessory (e.g., a magnetically coupled attachment) that indicates a forward “looking” direction of the remotely operated device 140 .
- the user is instructed to walk or run around in a circle until the user is directly facing the forward looking direction of the remotely operated device 140 .
- the wearable device 102 can include sensors to determine an orientation of the user. For example, the wearable device 102 can determine whether the user is facing an instructed direction in connection with the training mode, such as facing the remotely operated device 140 .
- the wearable device 102 can generate an output, via the feedback mechanism, instructing the user to perform a set of actions ( 1245 ).
- the output may be in the form of audio instructions, and can be based on data received from the remotely operated device 140 ( 1247 ), or from the wearable device 102 utilizing a local routine set ( 1249 ), which may be randomized or sequenced in accordance with the executing training mode instructions.
- the wearable device 102 can initiate a timer ( 1250 ).
- the timer can be initiated for each instruction outputted to the user, and a threshold time limit can be set for each instruction.
- the wearable device 102 can monitor the sensor data to determine whether the user successfully performs the set of actions ( 1255 ). Specifically, for each instruction output, the wearable device 102 can determine whether the user has performed the instructed set of actions within the established threshold time limit ( 1260 ). If so ( 1261 ), then the wearable device 102 can generate another output instructing the user to perform another set of actions ( 1245 ). However, if the user fails to perform the set of actions ( 1263 ), then the wearable device 102 can terminate the training session and generate a final score ( 1265 ), which may be displayed on the user's mobile computing device.
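- The instruct-time-verify loop of FIG. 12 (steps 1245 through 1265) reduces to the sketch below. The routine, the five-second limit, and detect_action are invented; the real routine set may be randomized, sequenced, or supplied by the remotely operated device 140.

```python
import time

ROUTINE = ["raise_arm", "high_block", "side_swipe"]  # invented instruction names
TIME_LIMIT_S = 5.0                                   # invented threshold time limit

def run_training(detect_action):
    score = 0
    for instruction in ROUTINE:
        print("instruct:", instruction)              # feedback output (1245)
        deadline = time.monotonic() + TIME_LIMIT_S   # per-instruction timer (1250)
        while time.monotonic() < deadline:
            if detect_action() == instruction:       # sensor match (1255-1260)
                score += 1
                break
        else:
            break  # not performed within the limit: terminate session (1263)
    return score   # final score for display on the mobile device (1265)

# Example: a detector that performs each instructed action correctly in turn.
perfect = iter(ROUTINE)
print(run_training(lambda: next(perfect)))  # -> 3
```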
- FIG. 13 is a flow chart describing an example method of implementing a wearable device in a sword mode.
- the wearable device 102 can detect a user input placing the wearable device 102 in a sword mode ( 1300 ).
- the wearable device 102 can initially monitor sensor data 137 to determine whether the user has grabbed or grasped an object ( 1305 ).
- a state machine in the memory 180 of the wearable device 102 specific to the grabbing/grasping action can provide a state machine report to the output generator 160 indicating that the user is holding an object.
- the wearable device 102 can determine a series of actions performed by the user with the object from the sensor data 137 ( 1310 ). Such actions can be actively determined by the output generator 160 in the sword mode, or via state machine reports from respective state machines that indicate each specific action. In any case, in response to each action, the wearable device 102 can generate a feedback response or output 132 using a feedback mechanism ( 1315 ).
- the feedback output 132 for each action can be distinct audio responses 196 ( 1317 ), haptic responses 192 ( 1316 ), and/or visual responses 194 ( 1318 ).
- the feedback mechanism can generate a corresponding output, such as sword fighting sounds, or other weapon-like or wand-like sounds based on the executing sword mode instructions.
- the feedback output 132 based on the sensor patterns detected when the user is grasping an object may be any such sounds corresponding to the actions performed by the user.
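- A feedback table keyed by detected action captures the idea; the action names and response identifiers below are invented placeholders for the preconfigured sensor patterns and the haptic 192 , visual 194 , and audio 196 responses.

```python
# Hypothetical sword-mode output table: each detected action maps to
# distinct audio (1317), haptic (1316), and/or visual (1318) responses.
SWORD_FEEDBACK = {
    "swing":  {"audio": "whoosh", "haptic": "short_buzz"},
    "clash":  {"audio": "clang", "haptic": "strong_buzz", "visual": "flash_red"},
    "thrust": {"audio": "swish"},
}

def on_action(action):
    for channel, response in SWORD_FEEDBACK.get(action, {}).items():
        print(f"{channel} -> {response}")  # drive the feedback mechanism

on_action("clash")
```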
- the “sword mode” may rather be a “wand mode,” a “magic mode,” a “conductor mode,” a “sporting mode” (e.g., for a tennis-like game), and the like.
- the wearable device 102 can detect a user input, such as an input on the mode selector 110 ( 1320 ). In response to the input, the wearable device 102 can deactivate or terminate the sword mode ( 1325 ).
- FIG. 14 is a block diagram that illustrates a computer system upon which examples described herein may be implemented. For example, one or more components discussed with respect to the remotely operated device of FIG. 1A , and the methods described herein, may be implemented by the system 1400 of FIG. 14 .
- the computer system 1400 includes processing resources 1410 , a main memory 1420 , ROM 1430 , a storage device 1440 , and a communication interface 1450 .
- the computer system 1400 includes at least one processor 1410 for processing information and a main memory 1420 , such as a random access memory (RAM) or other dynamic storage device, for storing information and instructions to be executed by the processor 1410 .
- the main memory 1420 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 1410 .
- the computer system 1400 may also include a read only memory (ROM) 1430 or other static storage device for storing static information and instructions for the processor 1410 .
- a storage device 1440 such as a magnetic disk or optical disk, is provided for storing information and instructions.
- the storage device 1440 can correspond to a computer-readable medium that stores instructions for performing sensor data processing and translation operations as discussed herein.
- the communication interface 1450 can enable computer system 1400 to communicate with a computing device and/or wearable device (e.g., via a cellular or Wi-Fi network) through use of a network link (wireless or wired). Using the network link, the computer system 1400 can communicate with a plurality of devices, such as the wearable device, a mobile computing device, and/or other self-propelled devices.
- the main memory 1420 of the computer system 1400 can further store the drive instructions 1424 , which can be initiated by the processor 1410 .
- the computer system 1400 can receive control commands 1462 from the wearable device and/or mobile computing device.
- the processor 1410 can execute the drive instructions 1424 to process and/or translate the control commands 1462 —corresponding to user gestures performed by the user—and implement the control commands 1462 on the drive system of the self-propelled device.
- main memory 1420 can further include mode instructions 1422 , which the processor 1410 can execute to place the self-propelled device in one or multiple modes to interact with the wearable device.
- execution of the mode instructions 1422 can place the self-propelled device in an operational mode that provides feedback 1452 and/or instructions 1454 to the wearable device over the network 1480 (e.g., in training mode).
- Examples described herein are related to the use of computer system 1400 for implementing the techniques described herein. According to one example, those techniques are performed by computer system 1400 in response to processor 1410 executing one or more sequences of one or more instructions contained in main memory 1420 . Such instructions may be read into main memory 1420 from another machine-readable medium, such as storage device 1440 . Execution of the sequences of instructions contained in main memory 1420 causes processor 1410 to perform the process steps described herein. In alternative implementations, hard-wired circuitry and/or hardware may be used in place of or in combination with software instructions to implement examples described herein. Thus, the examples described are not limited to any specific combination of hardware circuitry and software.
- FIG. 15 is a block diagram that illustrates a mobile computing device upon which examples described herein may be implemented, such as the mobile computing device 495 of FIG. 4 .
- the computing device 1500 may correspond to, for example, a cellular communication device (e.g., feature phone, smartphone, etc.) that is capable of telephony, messaging, and/or data services.
- the computing device 1500 can correspond to, for example, a tablet or wearable computing device.
- the computing device 1500 includes a processor 1510 , memory resources 1520 , a display device 1530 (e.g., a touch-sensitive display device), one or more communication sub-systems 1540 (including wireless communication sub-systems), input mechanisms 1550 (e.g., an input mechanism can include or be part of the touch-sensitive display device), and one or more location detection mechanisms (e.g., a GPS component) 1560 .
- at least one of the communication sub-systems 1540 sends and receives cellular data over data channels and voice channels.
- the memory resources 1520 can store a designated control application 1522 , as one of multiple applications, to initiate the communication sub-system 1540 to establish one or more wireless communication links with the self-propelled device and/or a wearable device.
- Execution of the control application 1522 by the processor 1510 may cause a specified graphical user interface (GUI) 1535 to be generated on the display 1530 .
- Interaction with the GUI 1535 can enable the user to calibrate the forward directional alignment between the self-propelled device and the computing device 1500 .
- the GUI 1535 can allow the user to initiate a task-oriented operation (e.g., a game) to be performed by the user in conjunction with operating the self-propelled device with user gestures using the wearable device, as described herein.
- FIG. 16 is a block diagram of an example portable sensing device upon which examples described herein may be implemented, such as the wearable device 102 of FIG. 1A .
- the portable sensing device 1600 includes a processor 1610 , memory resources 1620 , a feedback mechanism 1630 (e.g., audio 1632 , haptic 1633 , and visual 1631 devices), one or more communication sub-systems 1640 (e.g., wireless communication sub-systems such as BLUETOOTH low energy), one or more sensors 1660 (e.g., a gyroscopic sensor or accelerometer), and an input mechanism 1650 (e.g., an analog or digital mode selector).
- the communication sub-system 1640 sends and receives data over one or more channels.
- the memory resources 1620 can store mode instructions 1623 corresponding to a plurality of control modes 1622 , as described herein, which can be executed by the processor 1610 to initiate a particular mode. Certain executing mode instructions 1623 can initiate the communication sub-system 1640 to establish one or more wireless communication links with the self-propelled device and/or the mobile computing device. Execution of a control mode 1622 by the processor 1610 may cause the processor 1610 to generate distinct feedback responses using the feedback mechanism 1630 based on sensor data from the sensor(s) 1660 indicating user gestures performed by the user.
- the memory resources 1620 can comprise a number of state machines 1624 which can provide state machine reports 1627 to the processor 1610 when specified sensor patterns are identified by the respective state machines 1624 .
- Each state machine 1624 may monitor for a single sensor pattern which, if identified by that state machine 1624 , can cause the state machine 1624 to transition states, thereby providing a state machine report 1627 to the processor 1610 identifying the user gesture performed.
- the processor 1610 can translate the state machine reports 1627 —which indicate the user gestures—in accordance with an executing set of mode instructions 1623 in order to generate a corresponding output via the feedback mechanism 1630 and/or control commands 1612 to be communicated to the self-propelled device via the communication sub-system 1640 .
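- The per-gesture state machines 1624 can be pictured as small pattern matchers: each watches for a single sensor pattern and, on completing its transitions, emits a report 1627 for the processor 1610 to translate under the executing mode. The sketch below is a minimal illustration; the pitch thresholds and the sample format are invented.

```python
class GestureStateMachine:
    """One machine per sensor pattern; a completed pattern yields a report."""
    def __init__(self, name, pattern):
        self.name, self.pattern, self.index = name, pattern, 0

    def feed(self, sample):
        # Advance one state when the next expected condition is met.
        if self.pattern[self.index](sample):
            self.index += 1
            if self.index == len(self.pattern):
                self.index = 0
                return {"report": self.name}   # state machine report 1627
        return None

arm_raise = GestureStateMachine(
    "arm_raise",
    [lambda s: s["pitch"] > 30, lambda s: s["pitch"] > 60],  # two transitions
)
for sample in ({"pitch": 10}, {"pitch": 40}, {"pitch": 70}):
    report = arm_raise.feed(sample)
    if report:
        print(report)  # processor 1610 maps this to feedback or a command 1612
```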
- While FIG. 14 , FIG. 15 , and FIG. 16 provide a computer system 1400 , a computing device 1500 , and a portable sensing device 1600 for implementing aspects described, in some variations any of the three devices can be arranged to implement some or all of the functionality described with respect to the processing resources of the remotely operated device 140 of FIG. 1A , the mobile computing device 495 of FIG. 4 , or the wearable device 102 of FIG. 1A , as shown and described throughout.
- some examples include functionality for projecting an orientation and/or perspective of a user onto a gaming environment via sensing output of the portable sensing device 1600 .
- the orientation and perspective of the user can be inferred from sensors 1660 (e.g., IMU), and this sensor information can be virtualized for the gaming environment.
- the gaming environment can be shown on a computing device (e.g., display screen of a computer, mobile computing device etc.).
- the gaming environment can present a perspective that is based on the orientation of the user (e.g., user is standing north), as determined by the portable sensing device 1600 .
- the perspective can change as the user changes orientation, moves in a particular direction etc.
- the portable sensing device 1600 can be used to control a virtual or actual object (e.g., self-propelled device or remotely operated device), and the orientation and direction of the controlled object may be with reference to a reference frame of the user.
- a reference frame of the self-propelled device may be used, and the user's orientation can be used to influence control of the virtual or actual device in motion.
- the user's movement or motion can influence a change of direction.
- both orientations can be used concurrently.
- in some examples, the device under control is a virtual vehicle that carries the user. In such cases, the user may turn his head (e.g., when wearing a necklace carrying the portable sensing device 2000 ) to see a view to a particular side while the orientation of the vehicle is used for the motion of the vehicle.
- FIG. 17 illustrates an embodiment of multiple sensing devices that concurrently provide input for a program or application which utilizes the inputs, along with inferences which can be made about a person or object that carries the devices, according to one or more examples.
- an example such as shown enables input from multiple sensing devices to be used for purpose of enabling inferences of movement and pose from two relevant sources of user motion.
- a user 1701 carries wearable devices in the form of a wrist device 1710 and pendant 1712 .
- one or both of the wrist device 1710 and pendant 1712 can be in the form of an alternative form factor or device type.
- the combination of sensing devices can include a hat, a ring, eyeglasses or a device which the user can carry in his or her hand (e.g., FOB, mobile computing device).
- more than two wearable devices can be employed by one user.
- FIG. 18 illustrates a system which concurrently utilizes input from multiple modular sensing devices in connection with execution of an application or program.
- a multi-device system 1800 includes a first modular sensing device 1810 , a second modular sensing device 1820 , and a controller 1830 .
- Each of the first and second modular sensing devices 1810 , 1820 includes a respective inertial measurement unit (IMU) 1812 , 1822 , a processor 1814 , 1824 and memory 1816 , 1826 .
- the IMU 1812 , 1822 of each modular sensing device 1810 , 1820 can include sensors such as an accelerometer 1815 , 1825 and gyroscopic sensor 1817 , 1827 .
- the first and second modular sensing devices 1810 , 1820 may also include additional sensing resources, such as a magnetometer and/or proximity sensor.
- the controller 1830 can include a processor 1832 and a memory 1834 .
- the processor 1832 can execute instructions 1845 for a program or application that can execute and process inputs 1811 , 1813 from each of the respective modular sensing devices 1810 , 1820 .
- the controller 1830 is a mobile computing device, such as a multi-purpose wireless communication device which can wirelessly communicate with each of the first and second modular sensing devices 1810 , 1820 .
- While FIG. 18 illustrates the controller 1830 as a separate device from the first and second modular sensing devices 1810 , 1820 , in variations the controller 1830 can be integrated or otherwise combined with at least one of the first or second modular sensing devices 1810 , 1820 .
- the controller 1830 can include a multi-purpose wireless communication device that is equipped with a gyroscopic sensor and accelerometer.
- variations can provide the second modular sensing device 1820 to be a local resource of the controller 1830 , which communicates with the first modular sensing device 1810 .
- the controller 1830 can receive inputs 1811 , 1813 from respective first and second modular sensing devices 1810 , 1820 .
- the inputs 1811 , 1813 can be received in connection with an application 1839 or program that is executed by the processor 1832 of the controller 1830 .
- the processor 1832 can execute the instructions 1845 in order to implement an inference engine 1835 for determining inferences about the person or object carrying one or both of the modular sensing devices 1810 , 1820 .
- the application 1839 can correspond to a game or simulation, and the inference engine 1835 can be specific to the application 1839 .
- the inference engine 1835 can be used to determine when the motions of two modular sensing devices 1810 , 1820 are separate and distinct from one another, or continuous and/or part of the same input motion.
- each input 1811 , 1813 can correspond to one or more of a position, height, orientation, velocity, linear and/or rotational acceleration.
- Each of the first and second sensing devices 1810 , 1820 generates a set of measured (or sensed) data corresponding to, for example, a movement (e.g., gesture) made with the respective sensing device 1810 , 1820 .
- the controller 1830 can process input 1811 , 1813 corresponding to each of the respective data sets in order to determine a third data set of inferences. In this way, the inferences reflect information determined from sensed data, rather than directly measured data.
- the inferences which can be output from the inference engine 1835 can be determinative or probabilistic, depending on implementation.
- user 1701 can wear two modular sensing devices, and the inference engine 1835 can assume some inferences based on anatomical constraints and/or context (e.g., such as provided from execution of the application 1839 ).
- each of the first and second modular sensing devices 1810 , 1820 can correspond to a wearable wrist device.
- the second modular sensing device 1820 can correspond to the pendant 1712 or neck-worn device.
- the inference engine 1835 can be used to determine additional position data for the movement of that device along a third axis based on the orientation, position, or context of the second modular sensing device 1820 (a second wrist device or the pendant 1712 ). For example, if the first modular sensing device 1810 (wrist device 1710 ) measures arc motion, and the second modular sensing device 1820 is the pendant, then the orientation of the second modular sensing device can indicate whether, for example, the arc motion is in front of the user or to the user's side.
- if the second modular sensing device 1820 is a second wrist device, the information sensed from the second wrist device can identify the corresponding hand or device as being in front of the body.
- the inference engine 1835 can determine the inference to be that the user is making the arc of motion in front of his body.
- if the height of the second sensing device 1820 is determined to be belt high and the device is held by the user, the orientation of the user's torso can be inferred (along with the direction of the arc).
- if the second modular sensing device 1820 is a pocket device (e.g., a mobile computing device or FOB), information can be determined from, for example, the height of the device (e.g., user standing, crouching, or jumping) and the rotation of the device.
- if the second modular sensing device 1820 is pocket worn, a change in the orientation of the device from vertical to horizontal, in combination with a downward acceleration, can indicate the user is crouching.
- the type of motion that is likely by the first modular sensing device 1810 may be limited (e.g., motion of the wrist device 1710 is likely in front of user when user is moving up or down).
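- A few of these anatomical-constraint rules can be written down directly, as sketched below. The field names, thresholds, and inference labels are invented; as noted above, a real inference engine 1835 may be determinative or probabilistic and specific to the application 1839 .

```python
def infer_pose(first_input, second_input):
    """Combine inputs 1811/1813 into a third data set of inferences."""
    inferences = []
    if second_input.get("kind") == "pendant":
        # Pendant orientation disambiguates whether an arc traced by the
        # wrist device is in front of the user or to the user's side.
        if first_input.get("motion") == "arc":
            side = "front" if second_input.get("facing") == "forward" else "side"
            inferences.append(f"arc_{side}_of_body")
    if second_input.get("kind") == "pocket":
        # Vertical-to-horizontal rotation plus downward acceleration
        # suggests the user is crouching.
        if second_input.get("rotated_horizontal") and second_input.get("accel_z", 0) < -0.5:
            inferences.append("crouching")
    return inferences

print(infer_pose({"motion": "arc"}, {"kind": "pendant", "facing": "forward"}))
print(infer_pose({}, {"kind": "pocket", "rotated_horizontal": True, "accel_z": -1.2}))
```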
- The examples of FIGS. 17 and 18 can enable the user to utilize the modular sensing device(s) in connection with a real-world gameplay environment (or other task-oriented activities) executed by one or more of the modular sensing devices 1810 , 1820 , to control a remotely operated device using gestures sensed by the modular sensing devices 1810 , 1820 , to interact with other users, and to perform various tasks in which the modular sensing devices 1810 , 1820 can provide feedback and response output.
- FIG. 19 illustrates an example of a modular sensing device insertable into a wrist worn apparatus.
- a modular sensing device 1900 can be constructed in accordance with examples provided herein, in order to implement operations and functionality such as provided by any of the examples described.
- the modular sensing device 1900 can include a housing 1910 for containing a processor, memory (e.g., a controller implementing a plurality of state machines), and one or more sensors (e.g., IMU, gyroscope, accelerometer, proximity sensor, magnetometer, etc.).
- the modular sensing device 1900 can also include a wireless communication resource for communicating with other devices, including devices which may be controlled in movement (e.g., self-propelled device) or other processes.
- the modular sensing device 1900 can communicate sensor data, including output from the IMU, to another device for purpose of controlling movement of the other device.
- the modular sensing device 1900 can include processing capabilities to process raw sensor data into higher data forms of communication.
- the modular sensing device 1900 can generate output in the form of commands, or input for command selection from a receiving device.
- the housing 1910 of the modular sensing device 1900 can include securement features for enabling the modular sensing device 1900 to fasten onto another compatible structure 1920 .
- the compatible structure 1920 can include an opening that is shaped to receive and secure the modular sensing device 1900 .
- the securement features can include, for example, structural or shaped features of the housing 1910 .
- the housing 1910 can be dimensioned and/or structured (e.g., housing may be biased) to snap-fit into the compatible structure 1920 .
- at least one of the housing 1910 or compatible structure 1920 can include an integrated and/or mechanical fastener to secure the modular sensing device 1900 .
- FIG. 20 illustrates an implementation of the modularized sensing device 2000 .
- the sensing device 2000 can be retained by the compatible structure 2020 (e.g., wrist-worn strap), and then removed and placed in an opening of a wielded device 2010 (e.g., play sword).
- the placement of the modular sensing device 2000 in different compatible structures 2020 , 2010 for retention and use can be coordinated with different functionality being enabled through the sensing device.
- the modular sensing device 2000 in the wrist-worn strap 2020 can be used in conjunction with a first program running on a mobile computing device (controller), self-propelled device and/or other computer system (e.g., virtual gaming system).
- When placed in the wielded device 2010 (e.g., a wand), the modular sensing device 2000 can be operated in conjunction with a mobile computing device, self-propelled device, and/or other computer system (e.g., a virtual gaming system) which executes a second program or application.
- the orientation of the modular sensing device 2000 can be used to determine a perspective, such as a virtual field of view for gameplay.
- the perspective can refer to the orientation, direction and/or position of the user, and/or of the user's body part with respect to the sensing device.
- the orientation and direction of the sensing device can be used to project a corresponding virtual object in a virtual environment (e.g., sword).
- the modular sensing device 2000 may also be able to read an identifier of the compatible structure 2020 , 2010 in order to determine information about the structure, such as its dimensions, and whether the structure is worn or carried. Based on known information, inferences can be determined for purposes of virtualization (e.g., the length of a sword).
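- Reading a structure identifier could then drive a simple lookup, as in the sketch below. The identifiers, the read mechanism, and the dimension fields are invented; the disclosure does not specify how the identifier is encoded.

```python
# Hypothetical table keyed by an identifier read from the compatible structure.
STRUCTURES = {
    0x01: {"type": "wrist_strap", "worn": True},
    0x02: {"type": "play_sword", "worn": False, "length_cm": 75},
}

def configure_for(structure_id):
    info = STRUCTURES.get(structure_id)
    if info and "length_cm" in info:
        # A known length lets the virtual environment render a matching sword.
        print("virtualize blade of", info["length_cm"], "cm")
    return info

configure_for(0x02)
```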
Description
- This application claims the benefit of priority to U.S. Provisional Patent Application Ser. No. 62/274,514, entitled “PORTABLE SENSING DEVICE FOR PROCESSING GESTURES AS INPUT,” and filed on Jan. 4, 2016; and U.S. Provisional Application Ser. No. 62/346,216, entitled “MODULAR SENSING DEVICE FOR PROCESSING GESTURES AS INPUT,” and filed on Jun. 6, 2016; the aforementioned priority applications being hereby incorporated by reference in their respective entireties.
- Wearable device technology in consumer electronics is rapidly being integrated into routine user activities, such as sporting activities, content viewing or browsing, and task-oriented activities (e.g., gaming). Furthermore, wireless networks typically utilize protocols that enable wireless devices to detect signal sources from other devices for initiating data and communication links. Such networks are typically implemented using networking hardware, which may be incorporated in various wireless network devices, such as access points (APs), peer-to-peer (P2P) devices, wireless local area network (LAN) equipped devices, and the like—each advertising a unique identity (e.g., a media access control (MAC) address) indiscriminately to devices within range. Connections may be established with such devices to transmit and receive data.
- The disclosure herein is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements, and in which:
- FIG. 1A illustrates a block diagram of an example modular sensing device for use in controlling a remotely operated device, according to examples described herein;
- FIG. 1B is a block diagram illustrating an example modular sensing device, as shown and described herein;
- FIG. 2 is a flow chart describing an example method of generating commands by a modular sensing device, in accordance with examples described herein;
- FIG. 3 illustrates an example modular sensing device insertable into a plurality of compatible apparatuses;
- FIG. 4 is a block diagram illustrating an example modular sensing device for acquiring virtual or digital resources;
- FIG. 5 is a block diagram illustrating an example virtual resource management system utilized by a wearable device;
- FIG. 6 is a flow chart describing an example method of acquiring virtual resources using a wearable device;
- FIGS. 7A and 7B are flow charts describing example methods of acquiring and utilizing virtual resources in connection with gameplay;
- FIGS. 8A, 8B, and 8C illustrate unique identifier logs and association tables utilized in connection with virtual resource association, acquisition, and allocation;
- FIG. 9 illustrates a wearable device pairing triggering virtual resource data sharing in connection with gameplay;
- FIGS. 10A and 10B illustrate a wearable device pairing triggering an interactive mode between a pair of proximate users;
- FIG. 11 is a flow chart describing an example method of implementing an interactive mode between a pair of users utilizing a corresponding pair of wearable devices;
- FIG. 12 is a flow chart describing an example method of initiating a training mode on a wearable device in connection with a self-propelled device;
- FIG. 13 is a flow chart describing an example method of implementing a wearable device in a sword mode;
- FIG. 14 is a block diagram of an example computer system upon which examples described herein may be implemented;
- FIG. 15 is a block diagram of a mobile computing device upon which examples described herein may be implemented;
- FIG. 16 is a block diagram of an example portable sensing device upon which examples described herein may be implemented;
- FIG. 17 illustrates an embodiment of multiple sensing devices that concurrently provide input for a program or application which utilizes the inputs, along with inferences which can be made about a person or object that carries the devices, according to one or more examples;
- FIG. 18 illustrates a system which concurrently utilizes input from multiple modular sensing devices in connection with execution of an application or program;
- FIG. 19 illustrates an example of a modular sensing device insertable into a wrist worn apparatus; and
- FIG. 20 illustrates an implementation of the modularized sensing device, in accordance with examples described herein.
- Examples described herein relate to a multi-modal modular sensing device, worn or carried by a user (e.g., as a wrist-worn device), to enable a variety of interactions with other devices through sensed movement of the modular sensing device. Among other activities, examples provide for a modular sensing device that can individually, or in combination with another device (e.g., a controller device, such as a mobile computing device), control other devices, interact with compatible devices of other users, and/or operate in connection with task-oriented activities (e.g., gameplay). In some examples, the modular sensing device corresponds to a wearable device (e.g., a watch, a pendant for a necklace, a hat, glasses) that can be placed in multiple modes to, for example, control the characteristics of movement of another device, or interact with the real world. For example, the modular sensing device can control acceleration and maneuvering of a self-propelled device.
- In certain aspects, a portable modular sensing device can include an inertial measurement unit (IMU), and can be operated in multiple modes in which gestures (e.g., arm gestures) may be translated based on the particular mode of the wearable device. In one example, the modular device can be insertable into multiple devices, such as a wrist worn device, clothing, an accessory device (e.g., a toy sabre or sword), a wearable pendant, ring, hat, and the like. The modular sensing device can further include output devices, such as lighting elements (e.g., one or more LEDs), an audio device (e.g., a speaker), and/or a haptic feedback system. In one example, the modular sensing device further includes a charging port (e.g., a mini-universal serial bus port) to enable charging of a power source included in the device.
- According to some examples, a modular sensing device is operable to detect its own movement in three-dimensional space, using the IMU. In some implementations, the IMU can be an integrated device. Alternatively, the IMU can be implemented through a combination of sensors, such as a three-dimensional accelerometer or gyroscope. In certain examples, the modular sensing device can include a processor and memory to interpret the sensor data, and to communicate interpreted sensor data to another device (e.g., mobile computing device) using a wireless connection (e.g., BLUETOOTH). In variations, the modular sensing device can generate data that is either raw or processed, depending on the processing resources that are selected for the portable device, and the processing load which is implemented for the portable device.
- In many implementations, the modular sensing device can generate output (e.g., haptic, audio, and/or visual output) or control commands for operating a remote controlled device by utilizing state machines in memory that correspond to individual gestures. Depending on a selected mode of the modular sensing device, processing resources of the modular sensing device can monitor logical state machines or automatons that correspond to specified sensor data combinations (e.g., based on gestures performed by a user). For example, a single state machine can correspond to an initial arm raising gesture performed by the user wearing a wrist-worn device having the modular sensing device. A sensor profile from a single sensor or multiple sensors included in the IMU (e.g., gyroscopic sensor and/or accelerometer) can indicate the initial arm raising gesture, which can trigger a state transition for the state machine. This state transition can cause processing resources of the modular sensing device to automatically generate a control command to cause the remote controlled device to accelerate.
- Additionally, a second state machine can correspond to an increased angle in the arm raising gesture. When the sensor profile of the IMU indicates the gesture above a certain predetermined angle, a state transition can be triggered for the second state machine, which can trigger an increased acceleration command to be generated automatically. Several state machines can be included in memory of the modular sensing device, and can each correlate to a specified sensor data profile, from simple single-action gestures to complex multiple-motion gestures. Furthermore, an interpretation engine (e.g., a processor) can interpret each state transition for each state machine based on a particular mode and/or sub-mode of the modular sensing device.
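- As a minimal sketch of the two-machine arm-raising example, the mapping from arm angle to acceleration commands might look like the following; the 20- and 60-degree thresholds and the command names are invented for illustration.

```python
def drive_command(pitch_deg):
    # Two state transitions: the initial arm raise triggers acceleration,
    # and raising past a predetermined angle triggers increased acceleration.
    if pitch_deg >= 60:
        return "accelerate_fast"   # second state machine transition
    if pitch_deg >= 20:
        return "accelerate"        # initial arm-raise transition
    return "idle"

for angle in (5, 30, 75):
    print(angle, "->", drive_command(angle))
```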
- As used herein, a “modular sensing device” can comprise an electronic device that includes sensor resources for detecting its own movement, and of dimension and form factor suitable for being insertable into any number of devices configured for receiving the device. Numerous examples of modular sensing devices are provided in the context of a “wearable device,” such as a wrist worn device (e.g., watch, watch band, bracelet). But as noted by other examples, variations to the type and form factor of a wearable device can vary significantly, encompassing, for example, eyeglasses, hats, pendants, armbands, and various other form factors. While many examples describe functionality in the context of a wearable device, embodiments extend such examples to other form factors in which a modular sensing device may be inserted, such as wands, fobs, wielded toys, or mobile communication devices.
- In many examples, the wearable device can include one or more sensors to detect the device's own movements. In particular, a wearable device includes an accelerometer and/or a gyroscopic sensor. In some examples, sensor data, corresponding to user gestures performed by the user wearing the wearable device, can be translated into control commands or data packets to be transmitted and implemented based on the selected mode of the wearable device. According to many examples, the wearable device can include an inductive interface to inductively pair with other devices, which can trigger a specified mode on the wearable device. For example, an inductive pairing between the wearable device and a self-propelled device can trigger a drive mode in which the wearable device can be utilized by the user to operate the self-propelled device. Additionally or alternatively, the wearable device can include an input mechanism, such as an analog or digital button, that enables the user to select a particular mode and/or scroll through a series of modes for the wearable device.
- Among other functionality, some examples described herein provide for alternative modes of operation, including, for example: (i) a “drive mode” in which the wearable device is utilized to control a self-propelled device; (ii) a “control mode” in which the wearable device is utilized in connection with smart home devices; (iii) a “finding mode” or “finder mode” in which the wearable device is utilized to detect virtual or digital resources; (iv) a “mining mode” which can be initiated by the user to collect virtual resources when they are detected in the finder mode; (v) a “training mode” in which the wearable device is utilized in connection with a self-propelled device to assist the user in training for certain achievements or for increasing the user's abilities to perform task-oriented activities (e.g., increasing skills for a subsequent game or sporting activity); (vi) a “sword mode” in which the wearable device provides feedback (e.g., haptic, audio, and/or visual feedback) when the user performs actions while holding an object; (vii) a “default mode” in which the device monitors for and detects other proximate wearable devices (e.g., wearable devices of the same type) which enables the users to pair with each other's wearable devices; (viii) an “interactive mode” or “battle mode” selectable in response to two or more device pairings in which users are able to interact with each other with predetermined sets of actions (e.g., offensive and defensive actions learned and perfected by users practicing in the training mode); (ix) a “sharing mode” selectable in response to two or more device pairings in which users can share information stored in each other's wearable devices, or user accounts associated with the wearable devices (e.g., sharing collected virtual resources discovered and mined in the finder and mining modes to be expended or consumed in a gameplay environment); and (x) a “gaming mode” in which the wearable device can be utilized in connection with a game.
- Still further, numerous examples make reference to a “self-propelled” device. A self-propelled device can include, for example, a device that can be wirelessly and remotely controlled in its movement, whether the movement is on ground, water, or air. For example, a self-propelled device can include a wirelessly and remotely controlled drone, car, airplane, helicopter, boat, etc. While conventional examples enable control of a self-propelled device, conventional approaches generally utilize a perspective of the device being controlled. While some conventional devices, for example, enable a computing device held by the user to project a perspective of the device under control, examples described herein enable control of such devices to utilize an orientation of the user. Specifically, some examples include a modular sensing device that can determine an orientation of the user, and further enable control of the self-propelled device through an environment that accommodates or is in the perspective of the user, based on the orientation of the user (as determined by the modular sensing device). By way of example, the control of a self-propelled device can be projected through an orientation or perspective of the user for purpose of a virtual environment.
- Some examples include a modular sensing device having a wireless communication module (e.g., a BLUETOOTH low energy module) that enables communication of sensor data (e.g., raw sensor data from the accelerometer or gyroscopic sensor), or translated data (i.e., translations of the sensor data based on the selected mode of the wearable device). In some examples, the sensor data may be relayed for translation by a mobile computing device before being transmitted to another device (e.g., a paired wearable device or a paired self-propelled device). In other examples, processing resources of the wearable device can execute mode instructions, based on the selected mode, to translate the sensor data (e.g., via use of state machines that trigger control commands or feedback based on a selected mode of the modular sensing device) for direct transmission to one or more other devices, as described herein.
- As used herein, “body part gestures” or “user gestures” include gestures performed by a user while utilizing the wearable device. For example, the wearable device may be a wrist-worn device, in which case the body part or user gestures may comprise arm gestures, and can include any number of physical movements or actions that affect the sensors of the wearable device when it is worn on the wrist. Such movements and actions can include shaking, arm movements (e.g., raising, lowering, pointing, twisting, and any combination thereof), wrist movements, hand actions (such as grasping or grabbing), and the like. However, the wearable device is not limited to wrist-worn devices, but may be utilized as a ring (e.g., a finger-worn device), an ankle-worn device, a neck-worn device, a head-worn device, a belt (e.g., a waist-worn device), etc. Thus, user gestures performed using the wearable device can be any actions or movements in which correlated sensor data from sensors of the device can be translated into commands, instructions, feedback, etc. depending on the mode of the wearable device.
- Among other benefits, examples described herein achieve a technical effect of enhancing user interactivity with other devices and other users. Such interactivity may include utilizing the modular sensing device to control a self-propelled device, interact with other users of wearable devices, collect and share data, control smart home devices, interact with the real-world via a gaming interface through the modular sensing device, and the like.
- One or more examples described herein provide that methods, techniques, and actions performed by a computing device, a modular sensing device, or a self-propelled device are performed programmatically, or as a computer-implemented method. Programmatically, as used herein, means through the use of code or computer-executable instructions. These instructions can be stored in one or more memory resources of the computing device. A programmatically performed step may or may not be automatic.
- One or more examples described herein can be implemented using programmatic modules or components of a system. A programmatic module or component can include a program, a sub-routine, a portion of a program, or a software component or a hardware component capable of performing one or more stated tasks or functions. As used herein, a module or component can exist on a hardware component independently of other modules or components. Alternatively, a module or component can be a shared element or process of other modules, programs or machines.
- Some examples described herein can generally require the use of computing devices, including processing and memory resources. For example, one or more examples described herein can be implemented, in whole or in part, on computing devices such as digital cameras, digital camcorders, desktop computers, cellular or smart phones, personal digital assistants (PDAs), virtual reality (VR), augmented reality (AR), or mixed reality (MR) headsets, laptop computers, printers, digital picture frames, and tablet devices. Memory, processing, and network resources may all be used in connection with the establishment, use, or performance of any example described herein (including with the performance of any method or with the implementation of any system).
- Furthermore, one or more examples described herein may be implemented through the use of instructions that are executable by one or more processors. These instructions may be carried on a computer-readable medium. Machines shown or described with figures below provide examples of processing resources and computer-readable mediums on which instructions for implementing examples can be carried and/or executed. In particular, the numerous machines shown with examples include processor(s) and various forms of memory for holding data and instructions. Examples of computer-readable mediums include permanent memory storage devices, such as hard drives on personal computers or servers. Other examples of computer storage mediums include portable storage units, such as CD or DVD units, flash memory (such as carried on smart phones, multifunctional devices or tablets), and magnetic memory. Computers, terminals, network enabled devices (e.g., mobile devices, such as cell phones) are all examples of machines and devices that utilize processors, memory, and instructions stored on computer-readable mediums. Additionally, examples may be implemented in the form of computer-programs, or a non-transitory computer usable carrier medium capable of carrying such a program.
- System Description
-
FIG. 1A illustrates a block diagram of an example modular sensing device for use in controlling a remotely operated device, according to examples described herein. Themodular sensing device 100 shown inFIG. 1A may be integrated with a wearable device 102 (e.g., a wrist-worn device, pendant, clothing, a hat, eyeglasses, an ankle-worn device, a ring, etc.), or can be inserted into a compartment or otherwise included as a part of thewearable device 102. Thewearable device 102, ormodular sensing device 100 disposed therein, can include amode selector 110, such as an analog button that enables the user to select a mode of the device. In one example, repeateduser input 111 on themode selector 110 enables the user to scroll through a list of available modes. Such modes include, but are not limited to a drive mode, a control mode, a finder mode, a mining mode, a training mode, a sword mode, a default mode, an interactive or battle mode, a sharing mode, and a gaming mode. As shown in the block diagram, auser input 111 on themode selector 110 can cause aprocessor 120 of themodular sensing device 100 to generate anoutput 132 that indicates the particular mode selected. For example, themodular sensing device 100 can include a number ofoutput devices 130, such as an audio device, a haptic system, and/or visual elements (e.g., an LED array or display). Eachuser input 111 on themode selector 110 can trigger an audio, haptic, and/orvisual output 132, indicating the particular mode (e.g., a drive mode). As an example, theoutput 132 can comprise a voice output that states the selected mode, or a combination of voice, visual, and haptic output. - Additionally or alternatively, the user may connect the
wearable device 102 with a mobile computing device, such as the user's smart phone or tablet computer. Mode selection may be performed automatically by the user initiating a designated application of the mobile computing device, such as a smart home application, a controller application (e.g., to control a self-propelled device), a gaming application, and the like. In variations, the user can execute a designated application in connection with thewearable device 102 that enables the user to scroll through the various modes. The user may scroll through the modes on the mobile computing device, or via successive selection inputs on themode selector 110, which can trigger the mobile computing device to display a selectable mode. In other variations, multiple types of inputs can be performed in themode selector 110, such as tap gestures and tap and hold gestures, which can correlate to scrolling through the modes and selecting a particular mode respectively. As provided herein, themode selector 110 can be an input mechanism such as an analog or digital button, a touch panel such as a track pad, or a miniature touch-sensitive display device. - According to examples provided herein, the
modular sensing device 100 can include amemory 115 that stores mode instruction sets executable by theprocessor 120 based on the mode selected by the user. Each mode instruction set can cause theprocessor 120 to interpretsensor data 137 in a particular manner. Along these lines, same orsimilar gestures 106 performed by the user can result in different generatedoutputs 132 or commands 108 depending on the mode selected by the user. In some implementations, selection of a particular mode can cause theprocessor 120 to initiate a communications module 125 of themodular sensing device 100 to establish a wireless connection with another device. In one example, themodular sensing device 100 can include a BLUETOOTH low energy module to enable communications with one or more other devices, such as a second modular sensing device or a remotely operateddevice 140. - The
modular sensing device 100 can be relatively small in size compared to current computing devices, and in many implementations, themodular sensing device 100 does not include a power-intensive memory and computational resources. In such implementations, the memory and/or memory controller can be comprised of or implement a series of state machines that, when transitioned, can trigger a particular output automatically. Further description of the state machine implementations is provided below with respect toFIG. 1B . Furthermore, examples described herein with respect to the drive mode of themodular sensing device 100 can also be implemented in the state machine examples described herein. - As provided herein, the
memory 115 can include a drive mode instruction set 117 executable by the processor 120 in response to a user input 111 on the mode selector 110. In some aspects, execution of the drive mode instructions 117 can cause the processor 120 to initiate the communications module 125 to establish a communications link 104 with a proximate remotely operated device 140. In variations, the modular sensing device 100 can include an induction interface 127 which can trigger the communication link 104 between the modular sensing device 100 and the remotely operated device 140. For example, upon selecting the drive mode, the user can place the wearable device 102 within inductive range (e.g., ~2-5 cm) of the remotely operated device 140, which can include a corresponding induction interface. The inductive transfer of communication information can enable the modular sensing device 100 to establish the communication link accordingly. - According to examples described, the
modular sensing device 100 can further include an inertial measurement unit (IMU) 135 that can comprise a gyroscope and an accelerometer for accurate measurement of the modular sensing device's 100 movement and orientation. In variations, the IMU 135 can further include a magnetometer to, for example, assist in calibration based on the orientation. Once the communication link 104 is established in the drive mode, the processor 120 can monitor the sensor data 137 for the particular gestures 106 being performed by the user. In several examples, the gestures 106 can correspond to the user raising or lowering an arm, and/or performing additional arm motions. The sensor data 137 from the IMU 135 can comprise movement, position, and/or orientation information that the processor 120 can interpret in accordance with the drive mode. For example, gestures 106 performed by the user can be detected by the processor 120 via sensor data 137 from the IMU 135. Each of the gestures 106 can be interpreted by the processor 120 as one or more control commands 108 to be executed by the remotely operated device 140. - In one example, the drive mode can be automatically initiated in response to a particular detected gesture, regardless of the current mode of the
modular sensing device 100. This gesture can correspond to a distinct sensor data signature that, when detected by the processor 120 in executing any mode, overrides that mode and initiates the drive mode automatically. Thus, upon detection of the distinct sensor data signature, the processor 120 can automatically initiate the communications module 125, establish the communications link 104 with the remotely operated device 140, and generate control commands 108 based on the detected gestures 106 performed by the user in the sensor data 137. The modular sensing device 100 may then dynamically transmit such control commands 108 to the remotely operated device 140 for execution as they are generated by the processor 120. In one example, the specific gesture corresponds to a pushing motion, performed by the user, with the arm wearing the wearable device 102. As provided herein, this pushing motion can correspond to a specified sensor data signature not used for any other mode, and is therefore distinct enough to enable the processor 120 to identify it irrespective of the current mode of the modular sensing device 100. - In certain examples, gestures 106 such as raising an arm can cause the
processor 120 to generate acceleration commands to drive away from the user. Lowering the arm can cause the processor to generate deceleration and/or reverse commands. Further, moving the arm from side to side can cause the processor 120 to generate steering or directional commands. For example, moving the arm left can cause the remotely operated device 140 to turn left, and moving the arm right can cause the device 140 to turn right as the device 140 travels away from the user. Such control commands 108 may be processed by a controller of the remotely operated device 140, or may be directly executed on the drive motors of the device 140 in order to accelerate and maneuver the device 140 in accordance with the gestures 106 performed by the user. - Furthermore, in the drive mode, angular thresholds can be established in the
drive mode instructions 117 that can determine the manner in which the processor 120 interprets the sensor data 137. When such thresholds are crossed, the processor 120 can alter interpretation of the sensor data 137 into alternative commands 108. For example, as the user raises the arm above an angular threshold (e.g., 45 degrees), and/or changes an orientation of the arm (e.g., palm down to palm up), the processor 120 can alter the interpretation of the sensor data 137 such that the remotely operated device 140 drives towards the user as the arm is raised. Furthermore, in such a state, the directional interpretation of the sensor data 137 can be reversed such that moving the arm left can cause the remotely operated device 140 to turn right, and moving the arm right can cause the remotely operated device 140 to turn left. This directional reversal, triggered by the angular threshold in combination with the change in orientation of the user's arm, can create a palm-control illusion of the remotely operated device 140 by the user. Thus, in the drive mode, specified gestures detected in the sensor data 137 (e.g., the user's arm rotating or crossing an angular threshold) can trigger the processor 120 to interpret the sensor data 137 differently, or inversely from an initial interpretation, in order to produce the illusion.
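- As an illustration of this threshold-based reinterpretation, the following hedged sketch shows one way a processor such as processor 120 might map arm pitch and roll angles to drive commands, reversing the steering and throttle sense once a 45-degree threshold and palm-up orientation are crossed. The angle values, scaling, and function names are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch of drive-mode interpretation with an angular threshold.

def drive_command(pitch_deg: float, roll_deg: float, palm_up: bool) -> dict:
    """Translate arm pose (from IMU-like data) into a drive command."""
    # Sub-mode: arm above 45 degrees with palm up drives toward the user
    # and reverses the steering sense (the "palm control illusion").
    inverted = pitch_deg > 45.0 and palm_up

    throttle = pitch_deg / 90.0          # raise arm -> accelerate
    steering = roll_deg / 45.0           # tilt arm left/right -> steer
    if inverted:
        steering = -steering             # left now turns right, and vice versa
        throttle = -throttle             # drive toward, not away from, the user

    return {"throttle": max(-1.0, min(1.0, throttle)),
            "steering": max(-1.0, min(1.0, steering))}

# Below the threshold a left tilt steers left; above it, the same tilt steers right.
assert drive_command(30.0, -20.0, palm_up=False)["steering"] < 0
assert drive_command(60.0, -20.0, palm_up=True)["steering"] > 0
```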
- FIG. 1B is a block diagram illustrating an example modular sensing device, as shown and described herein. As provided herein, the modular sensing device 150 can be space limited, and may only include a limited amount of memory and computational resources. In such implementations, the modular sensing device 150 can represent each possible gesture that can be performed by a user as a state machine. Thus, for each gesture, a state machine corresponding to that gesture can either positively identify its gesture or negatively identify its gesture. When a positive gesture is identified, the state machine triggers a state transition which can cause an output generator 160 to generate a particular output accordingly. As provided herein, the output generator 160 and memory 180 can comprise a hardware controller implementing the various state machines to generate the output commands 164. - As an illustration, the
modular sensing device 150 can include a memory 180 implementing a number of state machines (e.g., SM1 181, SM2 183, SM3 185, SM4 187, . . . , SMN 189), each being associated with a particular gesture. For example, SM1 181 can be associated with the user raising an arm, SM2 183 can be associated with the user lowering an arm, SM3 185 can be associated with the user pointing an arm to the right, and SM4 187 can be associated with the user pointing an arm to the left. Furthermore, any number of state machines may be implemented in the memory 180, representing any number of gestures. At any given time step, the state machines can be instantiated for each gesture type, and each state machine can continuously inspect the instantaneous sensor data 167 from the IMU 165 in order to initialize or instantiate, transition individual states along a state string, terminate a current state string, or trigger an accept state or final state. When the final state is triggered, this means that the particular gesture corresponding to that state machine has been performed by the user. - According to examples, each state machine can consist of a finite set of states (a fixed string of states), one or more inputs, one or more outputs, a transition function, and an initial state. The state machine can be linked to a particular gesture by way of a sensor data signature, which can comprise an accelerometer data signature, a gyroscope data signature, or a combination of both. Furthermore, the state machine can be linear and directional, with each state having a particular error tolerance in its sensor data signature. A final state of a state machine can thus be triggered when the full sensor data signature of a particular gesture is matched to the
sensor data 167 generated by the IMU 165. - In some aspects, if at any time after instantiating, an associated gesture corresponding to a respective state machine is not being performed, the input string for that state machine, and for that particular instantiation, automatically terminates. At any given instance, the state machine either terminates from the outset (e.g., an initial aspect of the sensor data signature for the gesture is not matched), or instantiates the input string towards the final state. At any given state along the input string, the state machine can terminate if the gesture being performed diverges from the error tolerances built into the state machine. If, however, each state along the input string is transitioned accordingly (i.e., the
sensor data 167 from the IMU 165 matches the sensor data signature for that state machine within its error tolerances), the final state is triggered for that state machine.
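- A compact sketch of such a linear, directional state machine follows. It is an illustrative assumption about the structure (per-state target values with tolerances, advancing on match, terminating on divergence), not the device's circuit-level design; the class and field names are hypothetical.

```python
# Hypothetical sketch: a linear gesture state machine with error tolerances.
# Each state expects a target sensor reading within a tolerance band; the
# machine advances on a match, terminates on divergence, and reports a
# final (accept) state once the whole signature string has been matched.

class GestureStateMachine:
    def __init__(self, name: str, signature: list[tuple[float, float]]):
        self.name = name
        # signature: ordered (target_value, tolerance) pairs, one per state.
        self.signature = signature
        self.index = 0            # current position along the state string
        self.terminated = False

    def step(self, reading: float) -> bool:
        """Feed one sensor sample; return True when the final state triggers."""
        if self.terminated or self.index >= len(self.signature):
            return False
        target, tol = self.signature[self.index]
        if abs(reading - target) <= tol:
            self.index += 1       # state transition along the string
            return self.index == len(self.signature)  # accept/final state
        self.terminated = True    # diverged beyond tolerance: terminate
        return False

# Example: a three-state "raise arm" signature over pitch samples.
sm_raise = GestureStateMachine("raise_arm", [(10.0, 8.0), (35.0, 10.0), (60.0, 12.0)])
for sample in (9.0, 33.0, 62.0):
    done = sm_raise.step(sample)
print(done)  # True: the full signature matched within tolerances
```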
- The memory 180 can include a state machine reporter 182 that can report such final state transitions 188 to an output generator 160 of the modular sensing device 150. The output generator 160 can be configured based on the particular mode of the modular sensing device 150. Accordingly, final state reports 188 from individual state machines can be interpreted differently, or can cause a particular output, depending on the mode. In other words, the output from the modular sensing device 150 for a particular gesture (e.g., a backhanded swinging motion) can be different depending on the mode initiated via the mode selector 155. Furthermore, certain final state reports 188 from the state machine reporter 182 can correspond to sub-mode triggers 186 for the particular mode. Such sub-mode triggers 186 may not trigger an output, but rather trigger the output generator 160 to alter interpretation of certain final state reports 188 in order to generate an alternative output. - Such outputs generated by the
output generator 160 can comprise control commands 162 to operate a remotely operated device, such as acceleration, steering, and deceleration commands. In some aspects, the output generator 160 can generate output commands 164 for the modular sensing device's 150 haptic system 192, visual system 194, and/or audio system 196. Thus, as final state reports 188 are received, the output generator 160 can cause the haptic 192, visual 194, and audio 196 systems of the modular sensing device 150 to produce individual or combined outputs. As described herein, such outputs can include vibration feedback or guidance, colored lights or display output, tonal outputs such as audible chimes that indicate positive and/or negative feedback, and voice output. - In the example of the drive mode described with respect to
FIG. 1A, each completed arm gesture can correspond to a final state machine report 188, and can cause the output generator 160 to generate a control command 162 accordingly. In the example shown in FIG. 1B, the user raising an arm above a certain angle (e.g., five degrees) can cause a state machine corresponding to that gesture to transition to a final state. The state machine reporter 182 can report the final state 188 of that state machine to the output generator 160, which, based on the drive mode interpretation, can generate a control command 162 to initiate acceleration of the remotely operated device. This control command 162 may then be transmitted to the remotely operated device via the communications module 175 of the modular sensing device 150. - Referring to both
FIGS. 1A and 1B, in certain modes, the communications module 125, 175 can act as a signal or beacon sensor which can provide signal data to the processor 120 or output generator 160. Based on the mode, the processor 120 or output generator 160 can generate a specified response based on the signal data. For example, the signal data can indicate a unique identifier of the signal source, and the memory 115, 180 of the modular sensing device 100, 150 can correlate that identifier with the specified response for the selected mode. - As described with respect to
FIG. 1A, the interpretation of the sensor data 137 can be altered based on a particular gesture trigger, such as the user's arm exceeding an angular threshold. The analog with respect to FIG. 1B comprises a state machine in the memory 180 that correlates to the particular threshold gesture. Thus, when the sensor data 167 indicates the specified threshold gesture, the correlated state machine can transition to its final state, which can be reported to the output generator 160 accordingly. Such a final state report 188 can comprise a sub-mode trigger 186, in that the sub-mode trigger 186 causes the output generator 160 to alter interpretation of certain final state reports 188 while remaining in the same mode. In the drive mode example, a sub-mode trigger 186 can correspond to the user raising an arm above a certain threshold (e.g., 45 degrees), which can cause the output generator 160 to, for example, reverse the interpretation of the final state reports 188 corresponding to the directional commands for the remotely operated device. - Additionally, each mode of the
modular sensing device 150 can include any number of sub-mode triggers 186 that cause the output generator 160 to alter an interpretation of, or disregard, a particular final state report 188 corresponding to a specified sensor data signature. For example, the drive mode can include a set of angular gesture thresholds (e.g., raising an arm beyond 45 degrees, lowering the arm below 45 degrees, turning the arm from palm down to palm up). A state machine can be dedicated—within the specified mode—to a sensor data signature indicating the user gesture crossing a gesture threshold. Thus, when the user gesture crosses the gesture threshold, the dedicated state machine can transition to its final state, which, when reported to the output generator 160 (i.e., as a sub-mode trigger 186), can cause the output generator 160 to alter interpretation of certain final state reports 188 within that mode. In some aspects, final state reports disregarded prior to the sub-mode trigger can now trigger a specified output (e.g., an audio, haptic, and/or visual output, or a particular control command 162).
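- The following sketch, an assumption-laden illustration rather than the device's controller logic, shows how an output generator such as output generator 160 might consume final state reports 188, treating one report as a sub-mode trigger 186 that flips how subsequent directional reports are interpreted.

```python
# Hypothetical sketch: an output generator that reinterprets final state
# reports after a sub-mode trigger, while remaining in the same mode.

class OutputGenerator:
    def __init__(self):
        self.inverted = False  # sub-mode flag set by the trigger report

    def on_final_state(self, report: str) -> str | None:
        if report == "arm_above_45_palm_up":   # sub-mode trigger 186
            self.inverted = not self.inverted
            return None                        # the trigger itself emits no output
        mapping = {"point_left": "turn_left", "point_right": "turn_right"}
        if self.inverted:                      # altered interpretation
            mapping = {"point_left": "turn_right", "point_right": "turn_left"}
        return mapping.get(report)

gen = OutputGenerator()
print(gen.on_final_state("point_left"))        # turn_left
gen.on_final_state("arm_above_45_palm_up")     # sub-mode trigger: no output
print(gen.on_final_state("point_left"))        # turn_right
```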
- In further examples, when the modular sensing device 150 is in a particular mode, a specific complex gesture—represented by a particular state machine in the memory 180—can cause the output generator 160 to reconfigure its interpretation of certain final state reports 188, execute a sub-mode, or automatically initialize a different mode for the modular sensing device 150. Accordingly, for any given instance, sensor data 167 from the IMU 165 can continuously cause the various state machines to instantiate, terminate, instigate state transitions, and/or transition to a final state. In aspects described, only when a state machine transitions to its final state does the output generator 160 generate output commands 164 and/or control commands 162 to provide feedback, operative control over a remotely operated device, guidance via the output devices, and/or task-based instructions (e.g., in accordance with a particular game). - Furthermore, the various aspects performed by the
modular sensing device 100 described with respect to FIG. 1A may also be performed by example modular sensing devices 150 as shown and described with respect to FIG. 1B. Thus, the execution of designated modal instruction sets by the modular sensing device 100 of FIG. 1A—in which the processor 120 directly interprets the sensor data 137 based on the executing instruction set—may be substituted by the state machine examples described with respect to FIG. 1B. In other words, the limited memory and computational resources of the modular sensing device 150 of FIG. 1B may be compensated for by attributing sensor data signatures to state machines, requiring less memory and processing without sacrificing functionality. - Methodology
-
FIG. 2 is a flow chart describing an example method of generating output commands by a modular sensing device, in accordance with examples described herein. In the examples described with respect to FIG. 2, reference may be made to reference characters representing like features as shown and described with respect to FIGS. 1A and 1B. Furthermore, the methods and processes described with respect to FIG. 2 may be performed by an example modular sensing device 100, 150 of FIGS. 1A and 1B. Referring to FIG. 2, the modular sensing device 150 can receive a mode selection input (200). In some examples, the mode selection input can comprise the user physically pushing an analog mode selector 155 on the modular sensing device 150. Additionally or alternatively, the mode selection input can comprise a specified gesture performed while using the modular sensing device 150. - In response to the mode selection input, the
modular sensing device 150 can interpret the final state machine reports 188 in accordance with the mode selected (205). As provided herein, the modular sensing device 150 can operate in any number of modes, with each mode corresponding to controlling a remote device (e.g., drive mode), generating user tasks and feedback in connection with a remote device (e.g., training mode), generating user tasks and feedback in conjunction with another modular sensing device (e.g., playing a real-world game with another user), and/or generating standalone user tasks and feedback. As provided herein, user tasks can comprise instructions or suggestions to the user via the output devices of the modular sensing device in accordance with the selected mode. Such instructions or suggestions can be generated based on a programmed game, in which the user is to utilize the modular sensing device 150 to perform gestures and actions, search for a certain signal source, cause the remotely operated device 140 to perform a set of actions, and the like. Furthermore, feedback can comprise reactive output to the tasked actions to be performed by the user. For example, the feedback can comprise audio, visual, and/or haptic responses to actions indicating affirmative or negative completion of such tasks. In one example, the feedback indicates to the user whether an instructed task or gesture (e.g., indicated by the sensor data 167 and correlated state machine) has successfully been performed. - According to several examples, the final state machine reports 188 can be correlated to a specified output. Thus, based on each final
state machine report 188, the modular sensing device 150 can generate commands in accordance with the mode and, when relevant, the sub-mode of the device 150 (220). As provided herein, such generated commands can include output commands 164 to output audio (222), visual (226), and/or haptic (224) output on the modular sensing device 150. Additionally or alternatively, the modular sensing device 150 can generate control commands 162 (228) for operating a remotely operated device 140. In either case, the modular sensing device 150 can transmit or execute the commands accordingly (225). - At any given time in any given mode, the
modular sensing device 150 can identify a sub-mode trigger 186 in the final state machine reports 188 (210). In response to the sub-mode trigger 186, the modular sensing device 150 can reconfigure interpretation of one or more final reports from one or more corresponding state machines (215). Based on each final state machine report 188, the modular sensing device 150 can generate commands in accordance with the mode and the sub-mode of the device 150 (220), including the altered interpretations based on the sub-mode trigger(s) 186. As discussed above, such generated commands can include output commands 164 to output audio (222), visual (226), and/or haptic (224) output on the modular sensing device 150. Additionally or alternatively, the modular sensing device 150 can generate control commands 162 (228) for operating a remotely operated device 140. In either case, the modular sensing device 150 can transmit or execute the commands accordingly (225).
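- Read as a loop, the method of FIG. 2 can be sketched as follows. This is a hedged restatement of steps (200) through (225) in code form, reusing the OutputGenerator sketched earlier; the report source and command sink are hypothetical stand-ins.

```python
# Hypothetical sketch of the FIG. 2 flow: the mode selection (200) configures
# the generator; reports are interpreted per mode/sub-mode (205/210/215);
# resulting commands are generated (220) and transmitted or executed (225).

def run_device(reports, generator, transmit):
    """reports: iterable of final state machine reports 188."""
    for report in reports:
        command = generator.on_final_state(report)  # (205)/(215) interpretation
        if command is None:
            continue      # e.g., a sub-mode trigger 186 with no direct output
        transmit(command)  # (225): a control command 162 or output command 164

# Example wiring with the OutputGenerator sketched earlier.
run_device(["point_left", "arm_above_45_palm_up", "point_left"],
           OutputGenerator(),
           transmit=print)   # prints: turn_left, then turn_right
```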
- Modular Sensing Device
-
FIG. 3 illustrates an example modular sensing device insertable into a plurality of compatible apparatuses. The modular sensing device 300 shown in FIG. 3 can comprise various components and modules of the modular sensing devices 100, 150 of FIGS. 1A and 1B. Referring to FIG. 3, the modular sensing device 300 can include a number of output devices, such as an LED array 310, an audio output device 320 (e.g., a speaker), and a haptic driver 360 (included within the device). Furthermore, the modular sensing device 300 can include a mode selector 330, which can comprise an analog or digital button to enable the user to select a particular mode of the device 300 by, for example, scrolling through a stored series of modes. The modular sensing device 300 can further include memory and processing resources 365 that can execute the selected mode (either in the state machine implementation (FIG. 1B) or the executed instruction set implementation (FIG. 1A) described herein). - In various aspects, the
modular sensing device 300 also includes a communications interface 370 (e.g., a BLUETOOTH low energy, WiFi, WiGig, WiMAX, or cellular radio interface), and an IMU 340 to provide the memory and processing resources 365 with sensor data for detecting gestures performed by the user. As described herein, depending on the mode and sub-mode of the device 300, the memory and processing resources 365 interpret the sensor data to generate outputs via the output devices (e.g., the LED array 310, audio output device 320, and haptic driver 360). Furthermore, the modular sensing device 300 can include an input interface 350 (e.g., a mini-USB port) to enable charging of one or more batteries and/or uploading of additional mode instructions. In variations, the modular sensing device 300 can include an induction interface to charge one or more batteries and/or to enable inductive pairing with a second device to establish a wireless communications link. - In the various examples described herein, the
modular sensing device 300 can be insertable into or otherwise attachable to any number of compatible apparatuses, such as wearable devices 395 (wrist devices, rings, pendants, hats, glasses, etc.), wielded devices 385, companion toys or dolls, and the like. Furthermore, the modular sensing device 300 can be implemented in various other form factors, can be sewn into clothing, or can be mounted, glued, or otherwise attached to various apparatuses. Such apparatuses can each include a module holder into which the modular sensing device 300 may be inserted or otherwise mounted or attached. Thus, according to examples provided herein, the user can utilize the apparatuses into which the modular sensing device 300 has been inserted or attached, to perform various gestures in accordance with a selected mode of the modular sensing device 300. - Finder and Mining Mode
-
FIG. 4 is a block diagram illustrating an example portable sensing device operable as a signal detection system for acquiring virtual or digital resources. In more detail, a wearable device 400 may be a standalone device or may be incorporated with a mobile computing device running an application specific to detecting and recording signals from network devices, and searching for the virtual resources associated with those network devices. Standalone devices may include a pocket device, such as a key fob, a wrist-worn device or other wearable device (as described herein), and the like. For mobile computing environments, the wearable device 400 may utilize mobile computing device resources, such as device hardware/firmware, or operate utilizing a combination of hardware and software. For example, the wearable device 400 may be enabled upon initiation of an application on the mobile computing device. Additionally or alternatively, the wearable device 400 may continue to run during hibernation or sleep mode of the application using background or standby resources. Alternatively, the wearable device 400 may be enabled in the finder mode 406 and mining mode 407 upon user input on a mode selector 405 of the wearable device 400, as shown and described with respect to FIG. 1. Selection and execution of the finder and/or mining mode can generate an execution command 408, which can enable the processing resources of the wearable device 400 to execute the respective mode instructions for translating sensor data and providing feedback responses. - Referring to
FIG. 4, the wearable device 400 can include a signal interface 410 to detect emitted signals from any number of network-enabled devices 460. Such emitted signals may be background signals advertising the presence and media access control (MAC) addresses of such devices 460. For example, at time t1, the wearable device 400 may be within wireless range of a laptop computing device 462 having a unique identifier UID 1. The UID of the laptop computing device 462 (i.e., UID 1) may correspond to the MAC address of the laptop computing device 462. The wearable device 400 can include a UID recorder 420, which can receive UIDs 412 from the signal interface 410 and record the respective UIDs 412 in a UID log 432 of a local memory resource 430. Accordingly, the UID recorder 420 can log the laptop computing device's 462 UID (i.e., UID 1) in the UID log 432.
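- A minimal sketch of such a UID recorder follows; it assumes, hypothetically, that the signal interface surfaces advertisements as MAC-address strings and simply deduplicates them into a log, in the spirit of UID recorder 420 and UID log 432.

```python
# Hypothetical sketch: recording advertised MAC addresses into a UID log.

import time

class UIDRecorder:
    def __init__(self):
        self.uid_log: dict[str, float] = {}   # UID -> first-seen timestamp

    def on_advertisement(self, mac_address: str) -> None:
        """Record a detected device UID the first time it is seen."""
        if mac_address not in self.uid_log:
            self.uid_log[mac_address] = time.time()

recorder = UIDRecorder()
recorder.on_advertisement("AA:BB:CC:DD:EE:01")   # e.g., laptop at time t1
recorder.on_advertisement("AA:BB:CC:DD:EE:02")   # e.g., access point at t2
recorder.on_advertisement("AA:BB:CC:DD:EE:01")   # duplicate: ignored
print(sorted(recorder.uid_log))                  # two unique UIDs logged
```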
- As a further example, at time t2, the wearable device 400 may come within wireless range of an access point 463 having a MAC address UID 2. According to examples described herein, the signal interface 410 can receive the MAC address of the access point 463. For example, the signal interface 410 can receive a beacon from the access point 463 which includes the access point's 463 MAC address (UID 2). The signal interface 410 can communicate UID 2 to the UID recorder 420, which can log UID 2 in the UID log 432. - According to some examples, the
wearable device 400 can establish a network connection, e.g., via network 480, with a host server 470. Upon connecting with the host server 470, the wearable device 400 can transmit UID log data 482 from the local UID log 432 to the host server 470. Accordingly, the host server 470 can compile the UID log data 482 and associate the respective UIDs 412 from the log data 482 with a user account 476 associated with a user of the wearable device 400. In such examples, the remote host server 470 can perform any number of functions described herein. For example, the host server 470 can utilize the UID log data 482 from the wearable device 400 to identify the network devices (i.e., the laptop computing device 462 and the access point 463) with which the wearable device 400 came within wireless range. The host server 470 can store a universal association list 472 that may list associations between, for example, UID 1 of the laptop computing device 462 and a specified virtual resource. Accordingly, the host server 470 can attribute an amount of the specified virtual resource associated with UID 1 to the user account 476 associated with the wearable device 400. Thus, if the specified virtual resource associated with UID 1 was, for example, a virtual ore, then the host server 470 could attribute a predetermined amount of virtual ore to the user account 476.
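- On the server side, the attribution step might look like the following hedged sketch, where the universal association list is modeled as a dictionary and unknown UIDs are assigned a resource drawn from a catalog. Both structures, and the grant amount, are assumptions for illustration.

```python
# Hypothetical sketch: host-side attribution of virtual resources for UIDs.

import random

universal_associations = {"AA:BB:CC:DD:EE:01": "virtual_ore"}  # UID 1 example
resource_catalog = ["virtual_wood", "virtual_ore", "virtual_oil", "virtual_coal"]
GRANT_AMOUNT = 5  # predetermined allocation per detection event

def attribute(uid_log_data: list[str], user_account: dict[str, int]) -> None:
    """Attribute resources to a user account for each UID in the log data."""
    for uid in uid_log_data:
        if uid not in universal_associations:
            # New UID: select a resource (randomly here) and log the association.
            universal_associations[uid] = random.choice(resource_catalog)
        resource = universal_associations[uid]
        user_account[resource] = user_account.get(resource, 0) + GRANT_AMOUNT

account: dict[str, int] = {}
attribute(["AA:BB:CC:DD:EE:01", "AA:BB:CC:DD:EE:02"], account)
print(account)  # e.g., {'virtual_ore': 5, 'virtual_coal': 5}
```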
- Additionally or alternatively, the host server 470 may perform a lookup in the universal associations list 472 and identify that UID 2 (corresponding to the access point 463) does not yet have an associated virtual resource. Thus, the host server 470 may select, either sequentially or randomly, a virtual resource from a virtual resource catalog 474 and log a new association between the selected virtual resource and UID 2 in the universal associations list 472. Accordingly, an amount of the newly selected resource can be attributed to the user account 476, in which respective virtual resources may be accumulated based on wireless detection events between the wearable device 400 and the network devices 460. - Additionally or alternatively, one or more operations described with respect to the
host server 470 may be performed by the wearable device 400 itself. For example, the wearable device 400 can include an association engine 440 which, upon detection of a wireless signal from a network device 460, can determine whether that device is locally associated with a virtual resource. Thus, the memory resource 430 of the wearable device 400 can include an association table 434 which may include predetermined associations 435 between the network devices 460 and virtual resources. As an example, at time t3, the wearable device 400 can come within wireless range of a mobile device 464, which can announce its presence using MAC address UID 3. The signal interface 410 can receive UID 3, and the association engine 440 can perform a lookup in the association table 434 to determine whether UID 3 is associated with a particular virtual resource (e.g., virtual oil). Thus, the wearable device 400 may flag the association between UID 3 and virtual oil until a connection with the host server 470 is established to attribute the virtual oil to the user account 476. - Alternatively, the
wearable device 400 can include a resource allocator 450, and the memory resource 430 can include a virtual resource log 436, comprising an accumulation of the virtual resources collected by the user. The association engine 440 can transmit the associations 435 to the resource allocator 450, which, in turn, can select and compile respective amounts of accumulated virtual resources (e.g., five units of virtual oil corresponding to the UID 3 detection event) in the virtual resource log 436.
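- A device-local version of the association and allocation steps might look like this sketch, which combines an association-table lookup with a resource log in the spirit of association engine 440 and resource allocator 450. The table contents and unit amounts are illustrative assumptions.

```python
# Hypothetical sketch: on-device association lookup and resource allocation.

association_table = {                      # association table 434 (predetermined)
    "AA:BB:CC:DD:EE:03": "virtual_oil",    # e.g., mobile device at time t3
    "AA:BB:CC:DD:EE:04": "virtual_coal",   # e.g., tablet at time t4
}
virtual_resource_log: dict[str, int] = {}  # virtual resource log 436
pending_uids: list[str] = []               # UIDs awaiting a host association

def on_detection(uid: str, units: int = 5) -> None:
    """Allocate units for a known UID; queue unknown UIDs for the host."""
    resource = association_table.get(uid)
    if resource is None:
        pending_uids.append(uid)   # resolve later via the host server
        return
    virtual_resource_log[resource] = virtual_resource_log.get(resource, 0) + units

on_detection("AA:BB:CC:DD:EE:03")
on_detection("AA:BB:CC:DD:EE:05")          # unknown: queued for the host
print(virtual_resource_log, pending_uids)  # {'virtual_oil': 5} ['AA:BB:CC:DD:EE:05']
```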
- According to one or more examples, at time t4, the wearable device 400 may come within range of a tablet computing device 465. UID 4, associated with the MAC address of the tablet computing device 465, may be received by the signal interface 410 and communicated to the association engine 440, which can determine from the association table 434 that UID 4 is associated with, for example, virtual coal. The association engine 440 can communicate the UID 4 association to the resource allocator 450, which can log an amount of virtual coal in the virtual resource log 436. Additionally or alternatively, if the virtual resources (e.g., virtual coal) of all recently detected devices 460 are logged in the virtual resource log 436, the signal detection system can flush the UID log 432, since the UIDs of the network devices 460 are already known and associated in the association table 434. - Upon network connectivity with the
host server 470 over the network 480, the signal detection system can transmit virtual resource log data 484 to the host server 470 to enable the host server 470 to attribute collected virtual resources to the user account 476 associated with the user of the wearable device 400. Various alternatives are contemplated. For standalone devices, the wearable device 400 can communicatively couple with a user's computing device, such as a smartphone or tablet computer. The virtual resource log data 484 may then be communicated to the user's computing device which can, in turn, attribute the collected virtual resources to the user's account locally or via network connection to the host server 470. The collected virtual resources may be expended as a result of gameplay, or via various incentive-based programs increasingly prevalent in the mobile application marketplace to induce users to interact with stipulated content. - Additionally or alternatively, the consumption, accumulation, and/or expenditure of collected virtual resources may be performed dynamically in connection with task-oriented operations (e.g., gameplay in the game mode) on a
mobile computing device 495 running a respective application utilizing such virtual resources. Further, a standalone wearable device 400 may incorporate near field communication (NFC) technology such that the supplementary task of collecting virtual resources may be performed during a user's routine daily activities, and the user can compile the virtual resources via NFC link with the user's computing device 495. - In accordance with the above examples, at time t5, the
wearable device 400 may come within range of a mobile device 466 having a unique identifier (e.g., MAC address) UID 5. The signal interface 410 can communicate the mobile device's 466 identifier to the UID recorder 420, which can record UID 5 in the UID log 432 of the local memory resource 430. The mobile device's 466 identifier may be further communicated to the association engine 440, which can perform a lookup in the association table 434 to determine whether UID 5 is associated with a virtual resource. - If
UID 5 is not associated with a virtual resource, the wearable device 400 can subsequently utilize the UID log 432 to communicate UID log data 482 to the host server 470 to receive an association between UID 5 and a virtual resource. Alternatively, the wearable device 400 can include a local virtual resource catalog, similar to the virtual resource catalog 474 of the host server 470, and can perform a sequential or randomized selection of a virtual resource (e.g., virtual wood) to associate with UID 5. - If
UID 5 has already been associated with a virtual resource (e.g., virtual wood), as determined by the lookup performed by the association engine 440, then the association 435 can be communicated to the resource allocator 450, which can allocate a predetermined amount of virtual wood in the virtual resource log 436. Once a connection between the wearable device 400 and the host server 470 is established, the virtual resource log data 484, which can comprise the virtual resources accumulated since the last established connection, can be communicated to the host server 470 for attribution to the user's account 476. Once the virtual resource log data 484 is communicated, the virtual resource log 436 in the wearable device 400 can be flushed. - Various limitations on resource allocation are contemplated. For example, the
signal interface 410 of the wearable device 400 may be restricted to only receive virtual resources associated with network devices 460 upon establishing a connection with the respective device, as opposed to merely detecting the network device 460. Furthermore, the allocated amounts of a particular virtual resource associated with multiple network devices 460 may be diverse. For example, the resource allocator 450 may allocate more virtual resources when the wearable device 400 comes within range of a less accessible access point (e.g., an access point at a remote location). Furthermore, at any given expiration time (e.g., after an expiration period or a time-limited gaming session), the UID log 432, the association table 434, and/or the virtual resource log 436 may be reset or otherwise reconfigured. - Examples described with respect to
FIG. 4 illustrate resource association and allocation being performed by the wearable device 400. However, according to some aspects, the mobile computing device 495 can perform various processes as described with regard to the wearable device 400. For example, the mobile computing device 495 can receive the UID log data 482 for each detected network device 460, and can locally perform lookups for the associations and allocations of the virtual resources. Accordingly, the memory 430, shown as a component of the wearable device 400, may be included on the mobile computing device 495, and/or accessible by the mobile computing device 495, running a designated application, in the host server 470. - As further described herein, upon detecting a
particular network device 460 associated with a particular virtual resource in the finder mode, the wearable device 400 can initiate the feedback mechanism to notify the user that virtual resources are proximate to the user's current location. The feedback output can include haptic feedback in combination with lighting and/or audio feedback. In many aspects, the user can be notified and enabled to select the mining mode, in which the user can "mine" the virtual resource from the network device 460. In the mining mode, the feedback mechanism can be initiated and adjusted dynamically to enable the user to search for a direction towards the virtual resource to acquire a certain allocation, and/or to locate the exact position of the network device to acquire a certain allocation, as described below with respect to FIG. 6. - In still further implementations, the
wearable device 400 can utilize location-based resources (not shown) in order to identify virtual resources associated with a particular waypoint. For example, a gameplay environment may preconfigure virtual resources in GPS locations in the real world. The wearable device 400 can identify such locations (e.g., when the wearable device 400 is within a certain proximity of a particular waypoint) and provide feedback via a feedback mechanism that enables the user to search for and mine the virtual resource associated with the waypoint. -
FIG. 5 is a block diagram illustrating an example virtual resource management system utilized by a portable sensing device (e.g., wearable device 400), or a mobile computing device 495 executing an application in connection with a wearable device 400. The virtual resource management system 500 can include features from the wearable device 400 described with respect to FIG. 4. Furthermore, one or more components of the virtual resource management system 500 may be comprised in any number of electronic devices, such as smartphones, tablet computers, personal computers, laptop devices, wearable computing systems, and the like. Accordingly, the virtual resource management system 500 can comprise an application-based program running on any of such devices, enabling a user to collect, accumulate, expend, and consume virtual resources in connection with task-oriented operations performed on the device. For example, the user's device may run an application specific to virtual resource gameplay, rendering a gaming environment 578 on a display 590. The user may perform interactions 594 using the display 590 in order to utilize compiled virtual resources in the virtual resource log 565, as shown in FIG. 5. - Additionally or alternatively, the virtual
resource management system 500 shown and described with respect to FIG. 5 can be provided with UIDs 512 from, for example, a wearable device in a finder mode that operates to locate virtual resources as a user walks or otherwise travels across various networks and comes within network range of various network devices. In some aspects, the wearable device can transmit UIDs of various detected network devices to the mobile computing device, which can connect with the host server 570 to determine whether the detected network devices are associated with virtual resources. If so, the mobile computing device can transmit a confirmation signal to the wearable device, which can provide feedback to the user (e.g., audio, haptic, and/or visual feedback) indicating that virtual resources are nearby. The mobile computing device or the wearable device itself can determine a direction towards the network device associated with the virtual resource. Furthermore, the generated feedback on the wearable device can enable the user to select the mining mode, allowing the user to first identify the direction towards the virtual resource(s) (e.g., via dynamically provided feedback indicating whether the user is "hot" or "cold" depending on the direction the user points the wearable device), and optionally locate the actual position of the network resource device in order to be allocated a certain amount of the virtual resource. In some aspects, the wearable device can include an IMU to determine the user's orientation and a direction in which the user is facing in order to provide the feedback. The mining mode may terminate automatically upon acquiring the virtual resources, and the wearable device may revert, automatically, back to the finder mode. - Referring to
FIG. 5, the virtual resource management system 500 can include a signal detector 510 (e.g., the wearable device) to detect UIDs 512 of various network devices, as discussed above with respect to FIG. 4. The UIDs 512 may be logged in a UID log 530 by a UID module 520. As shown in FIG. 5, various UIDs (i.e., UID XYZ 532, UID EFG 534, UID NOP 536, UID TUV 538, and UID IJK 539), each associated with a respective network device, have been logged in the UID log 530. The UIDs 512 may be communicated to an association engine 540, which can perform lookups in an association table 560 to determine whether a respective UID is associated with a respective virtual resource. Thus, in the example shown in FIG. 5, UID XYZ 532 is associated with the virtual resource virtual wood 562, UID EFG 534 is associated with virtual ore 564, UID NOP 536 is associated with virtual food items 566, and UID TUV 538 is associated with virtual workers 568. - Virtual resources can further include various forms of digital currency or money, or images, video clips, three-dimensional (3D) interactive models, sound clips, a mini-game, or other content that may be presented to the user in response to finding the virtual resource. Thus, the association table 560 can be provided with any type of virtual resource described herein. Furthermore, virtual resources may be stored for subsequent task-oriented activities (e.g., virtual gold or virtual money to be expended in a subsequent game), or may require immediate consumption (e.g., a "mined" video clip discovered by the user which can be viewed once and deleted).
- In accordance with examples described herein, when the
signal detector 510 detects say, an access point corresponding toUID NOP 536,UID NOP 536 may be communicated to theassociation engine 540, which, upon performing a lookup in the association table 560, can determine thatUID NOP 536 is associated withvirtual food items 566. Theassociation engine 540 can communicate thisassociation 542 to aresource engine 550, which can logallocations 552 of virtual resources in avirtual resource log 565, as shown inFIG. 5 . Thus, once theassociation engine 540 communicates the association ofUID NOP 536 withvirtual food items 566 to theresource engine 550, theresource engine 550 can allocate a predetermined amount ofvirtual food 566 in thevirtual resource log 565. As shown in thevirtual resource log 565 ofFIG. 5 , the user has accumulated 1777 units ofvirtual food 566, in addition to 765 units ofvirtual wood virtual ore virtual workers 568. However, the user has yet to accumulate any virtual crystal, virtual coal, or virtual oil, which may be essential items in connection with the renderedgameplay 592 on thedisplay 590. - Continuing with examples provided herein, as the user comes within range of say, a remote access point associated with
UID IJK 539, the signal detector 510 can communicate UID IJK 539 to the association engine 540, which, upon performing a lookup in the association table 560, can determine that UID IJK 539 is not yet associated with a virtual resource (i.e., unknown 569). In accordance with examples described herein, the association engine 540 may attribute UID IJK 539 with a particular virtual resource (e.g., virtual oil) by referencing a local virtual resource catalog, similar to the virtual resource catalog 574 of the host server 570. Alternatively, the association engine 540 may compile an association call 544, which can be communicated to the host server 570 upon establishing a network connection via network 580. The host server 570 can sequentially or randomly select a virtual resource (e.g., virtual oil) from the virtual resource catalog 574, and log the new association between UID IJK 539 and virtual oil in a universal associations list 572. The host server 570 can respond to association calls 544 with new associations 596 logged in the universal associations list 572. The new associations 596 can be communicated to the virtual resource management system 500 over the network 580 via a communication interface 595 of the virtual resource management system 500. Accordingly, the new associations 596 can be received by the association engine 540, which can log the new associations 596 in the association table 560.
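- The association-call round trip can be sketched as follows, with the sequential-versus-random catalog selection made explicit. The queue-and-reply shape is an assumption about one reasonable implementation, not a description of an actual protocol.

```python
# Hypothetical sketch: resolving an unassociated UID via an association call.

import itertools
import random

resource_catalog = ["virtual_oil", "virtual_crystal", "virtual_coal"]
_sequential = itertools.cycle(resource_catalog)   # sequential selection
universal_associations: dict[str, str] = {}       # universal associations list

def handle_association_call(uid: str, randomized: bool = False) -> str:
    """Host-side handler: pick a resource, log and return the new association."""
    if uid not in universal_associations:
        pick = random.choice(resource_catalog) if randomized else next(_sequential)
        universal_associations[uid] = pick
    return universal_associations[uid]

# Client side: an unknown UID triggers a call; the reply updates the local table.
association_table: dict[str, str] = {"UID_IJK": "unknown"}
association_table["UID_IJK"] = handle_association_call("UID_IJK")
print(association_table)   # e.g., {'UID_IJK': 'virtual_oil'}
```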
- Thus, in the example described, the association between UID IJK 539 and virtual oil may be transmitted through the communication interface 595 of the virtual resource management system 500 to the association engine 540, which can update the association table 560 to replace "unknown 569" with virtual oil. - Furthermore, the
resource engine 550 can manage the resource log 565 based on both user interactions 594 performed via the rendered gameplay 592 on the display 590, and detected UIDs 512 and received associations 542 based on such UIDs 512. In various implementations, the user interactions 594 in connection with the rendered gameplay 592 can cause virtual resources stored in the virtual resource log 565 to be expended. Such expenditures 567 in the rendered gameplay 592 can correlate to a depletion of the relevant virtual resource in the virtual resource log 565. Various related examples are contemplated. For example, the user interactions 594 can cause respective virtual resources to be traded, consumed, accumulated, invested, or expended in accordance with the rendered gameplay 592. - In the provided example above, the detection of
UID IJK 539—now associated with virtual oil—can cause the resource engine 550 to attribute a predetermined amount of virtual oil (e.g., five units) in the virtual resource log 565. Furthermore, as shown in FIG. 5, user interactions 594 in connection with the rendered gameplay 592 can cause dynamic updates, by the resource engine 550, to the virtual resource log 565. The virtual resource log 565 can be dynamic in nature, in connection with the rendered gameplay 592 and the detection of UIDs 512. The resource engine 550 can continuously perform allocations 552 and expenditures 567 of virtual resources in the virtual resource log 565. Additionally or alternatively, the resource engine 550 can communicate accumulated virtual resources 597 to the host server 570, which may attribute the accumulated virtual resources 597 to the user's account 576 to save progress data corresponding to the rendered gameplay 592. - The rendered
gameplay 592 may be application- or software-based, utilizing any number of resources of the user's computing device. The rendered gameplay 592 may be provided in connection with augmented reality, virtual reality, or a virtually generated gaming environment 578 provided by the host server 570. For augmented reality implementations, the gaming environment 578 provided can comprise virtual features rendered in a real-world environment. For example, a camera included on the user's computing device may be utilized to capture real-world images or video, and the gaming environment 578 may be rendered thereon. Furthermore, the rendered gameplay 592 may be incorporated in conjunction with the use of a remotely operated self-propelled device. In such implementations, the rendered gameplay may include virtual controls to remotely control the self-propelled device. Accordingly, the virtual resource management system 500 can operate in connection with the self-propelled device, which can incorporate one or more features of the virtual resource management system 500, such as the signal detection features (i.e., signal detector 510, UID module 520, and UID log 530). - Finder and Mining Mode Methodology
-
FIG. 6 is a flow chart describing an example method of acquiring virtual resources using a wearable device. In the below discussion of FIG. 6, reference may be made to like reference characters representing various features of FIG. 4 for illustrative purposes. Furthermore, the method described in connection with FIG. 6 may be performed by, for example, the wearable device 400 as illustrated in FIG. 4, or a wearable device 400 in communication with a mobile computing device 495 running a designated application, as shown in FIG. 4. Referring to FIG. 6, the wearable device 400 can initiate a finder mode and initially detect a signal, such as an advertising beacon, from a network device, such as an access point (600). The beacon can include the network device's MAC address, or other UID representing the network device. - The
wearable device 400 can then perform a lookup in an association table 434 to determine whether the UID is associated with a given virtual resource (605). In some aspects, the wearable device 400 can transmit the UID to the mobile computing device 495, which can make the determination accordingly. At decision block (610), the wearable device 400 (or mobile computing device 495) identifies from the association table 434 whether the UID is associated. If the UID is not associated with a virtual resource (614), then the wearable device 400 can continue to monitor and detect signals from network devices (600), and/or generate a request for a UID association and retrieve the virtual resource association. However, if the UID is associated with a given virtual resource (612), then the wearable device 400, or the user utilizing the wearable device 400, can be allocated a predetermined amount of the given virtual resource in the virtual resource log 436 in the following manner. - Upon determining that the network device is associated with a virtual resource, the
wearable device 400 can generate feedback to the user indicating the proximate virtual resource (615). The wearable device 400 may then receive a user input (e.g., on an input mechanism or a mode selector on the wearable device) to search for the virtual resource, thereby initiating the mining mode (620). In the mining mode, the wearable device 400 can determine a direction towards the network signal (e.g., the advertising beacon having the UID associated with the virtual resource) (625). Then the wearable device 400 can monitor the sensor data, from one or more sensors of the wearable device 400, to determine whether the user points the wearable device towards or away from the virtual resource signal (630). While the user points the wearable device 400 in differing directions, the wearable device 400 can adjust the feedback dynamically (635). - For example, the feedback, which can comprise a combination of haptic, audio, and visual feedback, can increase in intensity as the user points the
wearable device 400 towards the network device, and decrease in intensity as the user points the wearable device 400 away from the network device. Accordingly, once the user has pointed the wearable device 400 in the correct direction for a predetermined time period (e.g., three seconds), the wearable device 400 can acquire a certain allocation of the virtual resource, and store or otherwise enable the user to consume or expend the acquired resource (640).
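- The "hot/cold" feedback loop of steps (630) through (640) can be sketched as below. The intensity shape and the three-second dwell requirement follow the description; treating headings as 2-D angles and the specific tolerance are simplifying assumptions.

```python
# Hypothetical sketch of mining-mode feedback: intensity rises as the device
# points toward the resource bearing, and a sustained correct heading (e.g.,
# three seconds) completes the acquisition.

import math

DWELL_SECONDS = 3.0
ALIGN_TOLERANCE_DEG = 15.0

def feedback_intensity(heading_deg: float, bearing_deg: float) -> float:
    """0.0 (pointing away) .. 1.0 (pointing directly at the resource)."""
    diff = math.radians(heading_deg - bearing_deg)
    return max(0.0, math.cos(diff))

def mining_loop(samples, bearing_deg: float, set_feedback, dt: float = 0.5) -> bool:
    """samples: successive device headings from the IMU; True once acquired."""
    held = 0.0
    for heading in samples:
        set_feedback(feedback_intensity(heading, bearing_deg))  # step (635)
        if abs(heading - bearing_deg) <= ALIGN_TOLERANCE_DEG:
            held += dt
            if held >= DWELL_SECONDS:
                return True        # step (640): allocation acquired
        else:
            held = 0.0             # reset the dwell timer on wandering aim
    return False

# Six half-second samples within tolerance satisfy the three-second dwell.
print(mining_loop([90, 92, 88, 91, 90, 89, 90], 90, set_feedback=lambda lvl: None))
```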
- FIGS. 7A and 7B are low-level flow charts describing example processes for managing virtual resources in connection with signal detection and gameplay. In the below discussion of FIGS. 7A and 7B, reference may be made to like reference characters representing various features of FIG. 5 for illustrative purposes. Furthermore, the low-level method described in connection with FIGS. 7A and 7B may be performed by, for example, the virtual resource management system 500 as illustrated in FIG. 5. Referring to FIG. 7A, the virtual resource management system 500 can detect advertising signals (e.g., beacons) from network devices (700). Such advertising signals may include UIDs, such as MAC addresses for the detected network devices. The virtual resource management system 500 can then record the UIDs of the network devices in the UID log 530 (705). - The virtual
resource management system 500 may perform a lookup in the association table 560 (710) to determine whether a respective UID is associated with a respective virtual resource (715). If the respective UID is associated with a respective virtual resource (717), then the virtual resource management system 500 can allocate a predetermined amount of the respective virtual resource in the virtual resource log 565 (720). Thereafter, the process may begin again with the detection of advertising signals from network devices (700). However, if the respective UID is not associated with a respective virtual resource (719), the virtual resource management system 500 can generate a request, or an association call 544, to create a new association 596 for the respective UID (725). - The virtual
resource management system 500 can transmit the request to the host server 570 when a network connection is established (730). Alternatively, the virtual resource management system 500 can locally select a respective virtual resource from a local virtual resource catalog (740). Such a selection may be made sequentially (743), for example, if the virtual resource catalog is a sequential list of virtual resources. Or, the selection may be made randomly (741) from the virtual resource catalog by way of a random selection technique. - In various implementations, the virtual
resource management system 500 can receive, remotely or locally, associated virtual resources for unassociated UIDs (745). Thus, the respective unassociated UID may be associated with a respective virtual resource, and thereafter the virtual resource management system 500 can allocate a predetermined amount of the respective virtual resource in the virtual resource log 565 (720). After all detected UIDs are associated with their respective virtual resources, the virtual resource management system 500 can flush the UID log (747), since storing the UIDs may no longer be necessary. - The process as described in connection with
FIG. 7B may be performed before, after, or in conjunction with the process as described with respect to FIG. 7A. Referring to FIG. 7B, the virtual resource management system 500 may receive a user input (e.g., via a touch input on a touch-sensitive display) to launch a gaming application associated with trading, consuming, accumulating, earning, investing, or otherwise expending virtual resources. Based on the user input, the virtual resource management system 500 can initiate the gameplay application to place the wearable device in game mode (750). In doing so, the gaming environment may be rendered on the display 590 (755) (e.g., on the display of the mobile computing device). - During gameplay, the virtual
resource management system 500 can receive various user interactions 594 in connection with the gameplay (760). Such user interactions 594 may be performed by way of touch inputs, mouse interactions, keyboard interactions, interactions using a game controller such as a joystick or a specialized controller device, or a combination of the above. Furthermore, such user interactions 594 may be performed in connection with remote operation of a remotely operated device. In accordance with examples described herein, the gameplay can incorporate a virtual or augmented reality environment in which the user can utilize collected virtual resources. For example, collected virtual resources may be consumed by a virtual character under operative control of the user. Additionally or alternatively, selected virtual resources can be expended to build a virtual building or town. Collected virtual food items may be utilized to feed a virtual colony of gameplay characters. Virtual oil may be utilized by the user during gameplay to modernize a primitive society. Virtual workers may be employed for production or to build infrastructure. Various alternatives and additions in connection with task-oriented operations and gameplay are contemplated. - Accordingly, based on the
user interactions 594, the resource engine 550 of the virtual resource management system 500 can dynamically modify or update the virtual resource log 565 to enable the user to expend, trade, accumulate, invest, earn, etc. virtual resources from the virtual resource log 565 (765). - When the user wishes to end a gameplay session, the virtual
resource management system 500 may receive a user input to deactivate the gaming application, and thus terminate the gaming mode (770). In order to save gameplay progress, the resource engine 550 of the virtual resource management system 500 can compile the virtual resources left in the virtual resource log 565, and transmit a saved list of virtual resources to the host server 570 (775). Thereafter, the virtual resource management system 500 may flush the virtual resource log (777), since the progress is remotely saved. - Use Case Scenarios
-
FIGS. 8A through 8C illustrate unique identifier logs and association tables utilized in connection with virtual resource association, acquisition, and allocation. Various implementations of the disclosed systems and methods described herein are contemplated. For example, the detected MAC address of various network devices may be associated with any number of plausible items. Such items may be individuals associated with the network device (e.g., an owner of the network device), or such items may be associated with persons of interest (e.g., characters in connection with a gaming environment), as illustrated in FIG. 8A. Alternatively, such items may be locations of the respective network devices, such that a user is enabled to identify where he/she has traveled, as illustrated in FIG. 8B. Additionally or alternatively, such items may be associated with a time stamp, thereby enabling a user to determine that an interaction has taken place with a particular network device and a time at which the interaction took place, as illustrated in FIG. 8C. Furthermore, the examples provided with respect to FIGS. 8A through 8C may be implemented using the wearable device 400 as described with respect to FIG. 4, and/or using a modified virtual resource management system 500 as discussed with respect to FIG. 5. - Referring to
FIG. 8A, a UID Log 810 of a signal detection system (e.g., a wearable device or wrist-worn device 800 operating in finder mode) can compile UIDs (e.g., UID XYZ 812, UID EFG 814, UID NOP 816, UID TUV 818, UID IJK 819, etc.) associated with respective network devices. The respective network devices may be any one of an access point, a mobile computing device, a tablet computer, a personal computer, a BLUETOOTH-enabled device, a radio-frequency identification device, a local area network enabled device, etc. Furthermore, a user may carry the wrist-worn device 800, which can come within wireless detection range of any number of the foregoing network devices. In accordance with examples described herein, the wrist-worn device 800 can compile UIDs corresponding to the network devices in a UID Log 810. - In many examples, the wrist-worn
device 800 only compiles UIDs associated with network devices with which the user comes within wireless range. In such examples, the user may enable a function on a computing device (e.g., a tablet computer, smart phone, PC, etc.), which can pull the UIDs from the wrist-worn device 800 and make associations according to a local association table. In order to transmit the UIDs to the user's computing device, the wrist-worn device 800 can include connectivity functions to establish a connection with the computing device (e.g., Wi-Fi, BLUETOOTH, etc.). Alternatively, the wrist-worn device 800 can include inductive data communication capabilities in order to transmit the UIDs to the computing device over an inductive link. - Alternatively, the wrist-worn
device 800 can include a local association table 820 in order to make a given association when the wrist-worn device 800 comes within wireless range of a given network device. Upon each detection event, the association table 820 of the wrist-worn device 800 can log an association. For example, as the user carries the wrist-worn device 800 within wireless range of an access point, the wrist-worn device 800 can receive the access point's advertising beacon, which can include the access point's unique identifier, UID XYZ 812. The association table 820 of the wrist-worn device 800 can be referenced to identify that UID XYZ is associated with Ricky H. 822, who may be an owner of the access point. The user may then journey within range of a wireless device with UID EFG 814, which the association table 820 may associate with Jose C. 824, an owner of the wireless device. In such implementations, after an excursion in which the user encounters any number of network devices, the user may thereafter review the association table/log to identify the individuals (i.e., Ricky H., Jose C., etc.) whose devices the user encountered during the excursion. - Alternatively, for task-oriented implementations (e.g., gaming), the association table may associate such UIDs with characters or places in a gaming environment. As contemplated in one or more examples, each detection event can correspond to a meeting or visitation in which the user meets with a character or visits a point of interest. A task-oriented application may require the user to meet a certain character or visit a certain place before a next achievement is reached.
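- As a non-limiting sketch of the association flow described above (all UIDs and items here are hypothetical placeholders), a detection event might be resolved against a local association table as follows:

```python
# Hypothetical association table mapping detected UIDs (e.g., MAC
# addresses) to items: owners, characters, or points of interest.
ASSOCIATION_TABLE = {
    "UID_XYZ": "Ricky H.",   # owner of an access point
    "UID_EFG": "Jose C.",    # owner of a wireless device
    "UID_IJK": "Coliseum",   # point of interest in a gaming environment
}

uid_log: list[str] = []          # compiled UIDs (cf. UID Log 810)
association_log: list[str] = []  # items resolved from detection events

def on_detection_event(uid: str) -> None:
    """Log the UID and, if the association table knows it, the item."""
    uid_log.append(uid)
    item = ASSOCIATION_TABLE.get(uid)
    if item is not None:
        association_log.append(item)

on_detection_event("UID_XYZ")
print(association_log)  # ['Ricky H.']
```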
Accordingly, along a physical excursion by the user, the wrist-worn device 800 may detect advertising beacons for respective devices associated with UID NOP 816, UID TUV 818, and UID IJK 819. The association table 820 may be referenced to identify characters and/or places (real or virtual) with which the respective network devices are associated. As an example, the association table/log 820 can identify Mark M. 826 as being associated with UID NOP 816, Dave S. 828 as being associated with UID TUV 818, and the Coliseum 829 as being associated with UID IJK 819. Upon transmitting such characters and/or places to the user's computing device, the task-oriented application can input such meetings and visits and record a number of respective achievements. -
FIG. 8B illustrates another usage scenario in which the UIDs of network devices may be associated with a real-world or mock-world environment. As an example, the user may carry the signal detection device 800 to within wireless range of an access point of a coffee shop, where the access point has a unique identifier UID XYZ 832. The wrist-worn device 800 can log UID XYZ in the UID Log 830 and reference the association table/log 840 to determine that UID XYZ 832 is associated with the coffee shop 842. After an excursion of passing through any number of wireless beacons, the user can map the excursion using the physical locations, logged in the association table/log 840, of the network devices along the way. - In some examples, the physical locations of the network devices may be determined via the detected beacon, which may include location information. Alternatively, the wrist-worn
device 800, or the mobile computing device to which the wrist-worn device 800 is connected, can include location-based functionality (e.g., GPS resources) to log the location in the association table/log 840. Accordingly, in response to a detection event, the wrist-worn device 800 can be triggered to pinpoint the physical location of the detection event, which can be logged along with the UID of the network device, as shown in FIG. 8B. - In other examples, the location associated with a UID may be based on a mock-world environment corresponding to, for example, a gaming environment. For example, a user may interact with a task-oriented application on a computing device, which may require the user to visit
El Dorado 844. The user may then physically search for or journey to a specified network device having UID EFG 834, which, in accordance with the association table, is associated with El Dorado 844. In such examples, the user may be enabled to reference the association table 840, which may provide a physical location in the real world of the network device associated with El Dorado 844. Thus, upon wirelessly contacting the network device having UID EFG 834, the association table/log 840 can log the detection event, and the user can accomplish a next achievement in the task-oriented application.
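- Purely as an illustrative sketch of the logging described for FIG. 8B, and of the timestamps described below for FIG. 8C, a detection event might produce a log entry such as the following; the field names are assumptions for illustration:

```python
import time
from dataclasses import dataclass, field

@dataclass
class DetectionEvent:
    """One row of an association table/log (cf. log 840)."""
    uid: str                       # UID from the detected beacon
    item: str                      # associated place or character
    location: tuple | None = None  # (lat, lon) from GPS resources, if any
    timestamp: float = field(default_factory=time.time)

# A detection event near the coffee shop's access point might be logged as:
entry = DetectionEvent(uid="UID_XYZ", item="coffee shop",
                       location=(37.7749, -122.4194))
print(entry)
```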
- Additionally, in combination with the above examples with respect to FIGS. 8A and 8B, the wrist-worn device 800 may also include a timer or clock to enter a timestamp as triggered by a detection event, as illustrated in FIG. 8C. Thus, a user may further review a time at which the wrist-worn device 800 detected the wireless signal of a respective wireless device. For example, during a given excursion, the user may come within wireless range of a network device having UID XYZ 852, which can be logged as being associated with Item 1 862. The detection event of UID XYZ 852 can further trigger a clock or timer to log a timestamp associated with the detection event. The time may reflect a local or universal time, or may reflect an elapsed time from, say, the beginning of the excursion or a start time of the task-oriented application. - Default and Sharing Mode
-
FIG. 9 illustrates a wearable device pairing triggering virtual resource data sharing in connection with gameplay, as described herein. The wearable device 900 may be carried or worn by a user to detect advertising beacons or wireless signals from various network devices. The wearable device 900 (e.g., a wrist-worn device) can include a UID Log 901 and automatically log each UID (e.g., MAC address) corresponding to each wireless device detected. According to many examples, upon linking with a computing device 905, the wearable device 900 can transmit the UIDs 904 from the UID Log 901 to the computing device 905. The link may be any data connection. For example, the wearable device 900 can include functionality corresponding to Wi-Fi, radio-frequency, infrared, BLUETOOTH, near-field communication (NFC), and the like. Thus, the UIDs 904 may be transmitted to the computing device 905 over such a communication link. - In accordance with examples discussed with respect to
FIGS. 8A through 8C, the wearable device 900 can collect UIDs corresponding to various network devices, which may be associated with any number of items. For example, an association table of either the computing device 905 or the wearable device 900 can include associations between UIDs of network devices and the registered owners of those network devices. Thus, upon detection of such network devices, the registered owner may be logged in an association log 903. In such examples, a user may scroll through the association log 903 to identify individuals with whom the user came into wireless contact. - In variations, the association table of either the
wearable device 900 or the computing device 905 can associate various network devices with real-world or mock-world characters, landmarks, or other places. Such associations may be made in connection with a task-oriented application 902, such as a gaming application providing a gaming environment. - In some examples, the
wearable device 900 can itself include an association table/log and/or a virtual resource log to perform and log associations and collect virtual resources. In such examples, logged association items 906 and/or collected virtual resources 908 may also be transmitted to the computing device over the communication link. Furthermore, timestamps 909 correlated to the detection events and logged associated items 906 may also be transmitted to the computing device 905. - In similar implementations, the
computing device 905 may run a task-oriented application 902, which can trigger the communication link with the wearable device 900. Execution of the task-oriented application 902 can correspond to running a game providing a gameplay environment which utilizes items associated with the UIDs 904 of network devices. The task-oriented application 902 can cause the computing device 905 to receive the UIDs 904 from the wearable device 900 and reference an association log 903 to determine whether a given UID is associated with a given association item. If the given UID is not associated, the computing device 905 can create an association or retrieve an association from a host server. Given an association, the computing device 905 can log a specified amount of collected virtual resources in a local virtual resource log 907 for use in the task-oriented application 902.
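- The following is a rough sketch, under assumed data structures, of how a task-oriented application might resolve received UIDs and credit virtual resources as described above; the retrieve_association helper stands in for a host-server lookup and is hypothetical:

```python
association_log = {"UID_XYZ": "coffee shop"}   # known associations
virtual_resource_log = {}                      # local log (cf. log 907)

def retrieve_association(uid: str) -> str:
    # Placeholder for creating an association locally or retrieving
    # one from a host server; returns the associated item.
    return f"item-for-{uid}"

def process_uids(uids: list[str], reward: int = 1) -> None:
    for uid in uids:
        item = association_log.get(uid)
        if item is None:
            item = retrieve_association(uid)
            association_log[uid] = item
        # Credit a specified amount of virtual resources for the item.
        virtual_resource_log[item] = virtual_resource_log.get(item, 0) + reward

process_uids(["UID_XYZ", "UID_EFG"])
print(virtual_resource_log)  # {'coffee shop': 1, 'item-for-UID_EFG': 1}
```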
- In certain aspects, the wearable device 900 may operate in a default mode in which the wearable device 900 monitors for wireless signals indicating that another wearable device (e.g., wearable device 930) is nearby. In the default mode, the wearable devices 900, 930 can detect one another when brought within wireless range, even while the wearable device 900 otherwise operates in another mode, such as finder mode.
- When the devices 900, 930 establish an inductive link 925, the wearable devices 900, 930 can trigger a sharing mode in conjunction with their respective computing devices 905, 910. In the sharing mode, the wearable device 930 can receive virtual resource data 927 from the wearable device 900 (or the mobile computing device 905), and the user's mobile computing device 910 can acquire "unlocked" virtual resources 929 based on a trade with the wearable device 900.
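- As a loose illustration of such a sharing-mode trade (the function and log names are assumptions, not part of this disclosure), the exchange might be modeled as:

```python
def share_resources(sender_log: dict, receiver_log: dict,
                    resource: str, amount: int) -> bool:
    """Transfer "unlocked" virtual resources from one user's log to
    another's once an inductive pairing places both devices in
    sharing mode. Returns False if the sender lacks the resources."""
    if sender_log.get(resource, 0) < amount:
        return False
    sender_log[resource] -= amount
    receiver_log[resource] = receiver_log.get(resource, 0) + amount
    return True

log_900 = {"gems": 5}   # virtual resources held via wearable device 900
log_930 = {}            # proximate user's log via wearable device 930
share_resources(log_900, log_930, "gems", 2)
print(log_900, log_930)  # {'gems': 3} {'gems': 2}
```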
- According to one or more examples, the user may then run the gaming application 912 on the computing device 910 and interact with the computing device 910 via the gaming application 912, which can provide a gaming environment 920 that requires the use, directly or indirectly, of an association table 914 in connection with the wearable device 930 and the gaming application 912. For example, the gaming environment 920 may require user interactions with real-world or mock-world characters and/or places, which may be accomplished by coming within wireless range of specified network devices associated with such real-world or mock-world characters and/or places in an association table 914 of the computing device 910. Upon detecting such network devices, the computing device 910 can log associated items (e.g., real-world characters, mock-world characters, real-world locations, mock-world locations, etc.). In accordance with the gaming environment 920, upon logging such associations, various tasks may be achieved. - In similar examples, the
gaming environment 920 may require the collection and use of virtual resources, as described above. Thus, the user may run the signal detector/gaming application 912 and interact with the gaming environment 920, which may require the user to collect a number and amount of virtual resources. In the sharing mode, the wearable device 930 enables the user to acquire unlocked virtual resources 929 from another user, which can be compiled in a virtual resource log 918 associated with the gaming application 912. Accordingly, the user can perform a physical excursion to enable the computing device 910 to wirelessly interact with or otherwise detect various network devices to collect such virtual resources for utilization in the gaming environment 920. The gaming environment 920 can provide the virtual resource log 918, which can inform the user of which virtual resources, and how many of each virtual resource, the user has collected. - Interactive or Battle Mode
-
FIGS. 10A and 10B illustrate a wearable device pairing triggering an interactive mode, or battle mode, between a pair of proximate users 1002, 1012, as shown in FIG. 10B. The interactive mode can be triggered upon selection of the mode on each of a pair of wearable devices 1010, 1030, and upon establishment of an inductive link 1025 between the devices 1010, 1030, as shown in FIG. 10A. For example, the devices 1010, 1030 may each be connected to a respective mobile computing device, such as mobile computing device 1050 and mobile computing device 1060 shown in FIG. 10B. Upon performing the inductive link 1025, the users 1002, 1012 can each select the interactive mode to perform a series of actions and alternate between offensive and defensive sub-modes in the interactive mode. - Referring to
FIG. 10B, users 1002 and 1012 operate their wearable devices 1010, 1030 to exchange action data 1048 between the devices 1010, 1030. The action data 1048 can correspond to user gestures 1041 performed by the users 1002, 1012 while the devices 1010, 1030 are in the interactive mode. For example, the user 1002 can perform offensive actions 1042, such as attack actions, using the wearable device 1010. Action data 1048 corresponding to the offensive action 1042 can be generated by the wearable device 1010, and transmitted to the wearable device 1030. The wearable device 1010 can generate feedback to the user indicating whether or not the offensive action 1042 was successful. - At the same time, the proximate user 1012 can perform
user gestures 1041 corresponding to a defensive action 1044 based on the offensive action 1042 performed by the user 1002. Whether the defensive action 1044 is effective or ineffective against the offensive action 1042 (e.g., a blocking action), the wearable device 1030 can generate feedback reflecting such. The wearable devices 1010, 1030 can alternate between offensive and defensive sub-modes, giving each user successive opportunities to perform offensive 1042 and defensive actions 1044. Furthermore, sensor patterns can be preconfigured corresponding to any number of offensive actions 1042 and defensive actions 1044. Still further, only certain sensor patterns for defensive actions 1044 (e.g., a specified type of blocking action) may be effective against a given offensive action 1042. Each sensor pattern for each offensive 1042 or defensive action 1044 can be preconfigured in accordance with a set of rules (e.g., gaming rules) that utilize the wearable devices 1010, 1030. Thus, users can learn defensive actions 1044 in light of the offensive actions 1042, and vice versa, in a training mode described herein. - During an interactive or battle session, the
mobile computing devices 1050, 1060 can tally scores based on the action data 1048 from each wearable device 1010, 1030. For example, successful offensive 1042 and defensive actions 1044 may be given respective scores, and unsuccessful actions may be scored against the respective user, until a predetermined threshold score 1052 is reached. The predetermined threshold score 1052 may be set by the interactive session, for example, as agreed upon by the users 1002, 1012, or in accordance with a particular game selected in the interactive mode. Once reached, the interactive session may end and a winner may be declared. A final result can be displayed on the respective mobile computing devices 1050, 1060.
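- A minimal sketch of such session scoring, assuming a simple point-per-successful-action rule (the rule and the names are illustrative only):

```python
THRESHOLD_SCORE = 10  # e.g., a predetermined threshold score (cf. 1052)

scores = {"user_1002": 0, "user_1012": 0}

def record_action(user: str, successful: bool, points: int = 1) -> str | None:
    """Tally a scored action; return the winner once the threshold is hit."""
    if successful:
        scores[user] += points
    for player, score in scores.items():
        if score >= THRESHOLD_SCORE:
            return player  # interactive session ends; winner declared
    return None

winner = record_action("user_1002", successful=True)
print(winner, scores)  # None {'user_1002': 1, 'user_1012': 0}
```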
- FIG. 11 is a flow chart describing an example method of implementing an interactive mode between a pair of users utilizing a corresponding pair of wearable devices. In the example described with respect to FIG. 11, reference may be made to various features shown and described in FIG. 10A and FIG. 10B for illustrative purposes. Referring to FIG. 11, the wearable device 1010 can detect an inductive pairing 1025 with a proximate wearable device 1030 (1100). In response to a user selection, the wearable device 1010 can initiate an interactive mode (1105). Initially, the wearable device 1010 can determine an ordering or sequence between offensive and defensive sub-modes (1110), and then monitor sensor data for user gestures 1041 performed depending on a present sub-mode. - In the offensive sub-mode (1175), the
wearable device 1010 can identify user gestures 1041 indicating an offensive attack 1042 (1120). The wearable device 1010 can do so by active monitoring or via state machine monitoring, as described herein. Sensor patterns corresponding to each of any number of offensive attack actions can be detected, and can cause the wearable device 1010 to generate and transmit action data 1048 indicating the particular offensive action 1042 performed by the user 1002 to the proximate device 1030 (1125). The wearable device 1010 may thereafter receive data from the proximate wearable device 1030 indicating whether or not the offensive action was successful (1130) and generate a feedback output accordingly (1135). If the offensive action 1042 was successful, then the wearable device 1010 can generate positive feedback (e.g., audio, visual, etc.) (1137). If the offensive action 1042 was unsuccessful or a failure, then the wearable device 1010 can generate negative feedback (1139). Upon generating the feedback, the wearable device 1010 can switch to the defensive sub-mode (1180). - In the defensive sub-mode (1180), the
wearable device 1010 can receive data indicating an offensive attack action 1042 from the proximate wearable device 1030 (1150). The wearable device 1010 can monitor sensor data for sensor patterns corresponding to user gestures 1041 indicating a defensive action 1044, or blocking move, performed by the user 1002 (1155). The wearable device 1010 can further determine, using timing and sensor data, whether the defensive action 1044 was successful (1160). For example, the user 1002 may be given a predetermined amount of time to perform a defensive action 1044 in response to an offensive action 1042 once the action data 1048 (indicating the offensive action 1042) is received. Accordingly, reception of the action data 1048 can trigger a timer on the wearable device 1010, giving the user limited time to perform an appropriate defensive action 1044.
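- To illustrate the timing element, the following sketches the defense-window check; the window length and the gesture-matching table are assumptions for illustration:

```python
import time

DEFENSE_WINDOW_S = 1.5  # assumed predetermined response time

def defense_succeeded(attack: str, defense: str, elapsed_s: float) -> bool:
    """A defense counts only if it matches the attack's effective block
    and lands inside the timing window started when action data arrived."""
    effective_blocks = {"overhead_strike": "high_block",
                        "side_strike": "side_block"}
    return elapsed_s <= DEFENSE_WINDOW_S and \
        effective_blocks.get(attack) == defense

t0 = time.monotonic()  # timer triggered by incoming action data (cf. 1048)
# ... user performs a gesture; the sensor pipeline classifies it ...
elapsed = time.monotonic() - t0
print(defense_succeeded("overhead_strike", "high_block", elapsed))
```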
- After transmitting action data 1048 corresponding to the defensive action 1044, the wearable device 1010 can generate a feedback output (1165). If the defensive action 1044 is successful and performed within the time threshold, the wearable device 1010 can generate positive feedback (1167). However, if the defensive action 1044 was unsuccessful, the wearable device 1010 can generate negative feedback (1169). The wearable device 1010 can then transmit success/failure data to the proximate device 1030 (1170), and tally the score for the interactive session. - During the interactive session, the
wearable device 1010 can determine whether the threshold score has been achieved by either the user 1002 or the proximate user 1012 (1185). If not (1187), then the users 1002, 1012 can continue in the interactive mode, alternating between the offensive (1175) and defensive (1180) sub-modes. However, if the threshold score has been achieved by one of the users 1002, 1012 (1189), then the wearable device 1010 can exit the interactive mode and transmit the final scores to the connected mobile computing device 1050 (1195). - Use of multiple
wearable devices 1010, 1030 in close proximity may be facilitated by proximity sensing resources of each wearable device 1010, 1030 (e.g., an infrared sensor). Additionally or alternatively, GPS resources on the wearable devices 1010, 1030, or on the connected mobile computing devices 1050, 1060, may be utilized to determine that the users are proximate to one another. - Training Mode
-
FIG. 12 is a flow chart describing an example method of initiating a training mode on a wearable device in connection with a self-propelled device. In the below discussion of FIG. 12, reference may be made to like reference characters representing various features described with respect to FIG. 1A for illustrative purposes. Referring to FIG. 12, the wearable device 102 can detect a user input placing the wearable device 102 in training mode (1200). The user input can be detected via a mode selector on the wearable device 102 (1202), or via launch of a designated application on a connected mobile computing device (1204). Alternatively, the training mode can be initiated via a combination of user inputs on the mode selector and an inductive link with a remotely operated device 140. Accordingly, the wearable device 102 may also detect an inductive pairing with a remotely operated device 140 (1205). In response to the inductive pairing, the wearable device 102 can transmit data to initiate the training mode on the remotely operated device 140 as well (1210). The transmitted data can cause the remotely operated device 140 to execute instructions to aid the user in training for a series of actions. For example, the series of actions can correspond to offensive and defensive actions that the user can implement when the wearable device 102 is in the interactive or battle mode. Additionally or alternatively, the series of actions can get progressively more difficult as the user successively accomplishes each action. - Initially, the
wearable device 102 can synchronize directionally with the remotely operated device 140 (1215). In some aspects, the user can manually synchronize the gyroscopic sensors of the devices 102, 140, for example, by aiming the remotely operated device 140 away from the wearable device 102 and providing a calibration input (e.g., an input on the mode selector of the wearable device 102) (1217). In other aspects, the gyro synchronization may be performed automatically (1219). - Automatic synchronization can be initiated by the
wearable device 102 by generating and transmitting a spin command to the remotely operated device 140, which can execute a spin accordingly (1220). Using a signal detector, the wearable device 102 can detect an asymmetry in the radiation pattern of the remotely operated device 140 as it spins, indicating the direction towards the remotely operated device 140 (1225). With a known direction, the wearable device 102 can transmit a direction calibration command to the remotely operated device 140 indicating the direction, which the remotely operated device 140 can process to align its internal drive system accordingly (1230).
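- A simplified sketch of this automatic synchronization step, assuming the signal detector reports signal-strength samples as the remotely operated device spins (the peak-strength heuristic is an illustrative assumption):

```python
def direction_from_spin(samples: list[tuple[float, float]]) -> float:
    """Estimate the bearing toward the spinning device from
    (heading_degrees, signal_strength) samples: the asymmetry in the
    radiation pattern peaks roughly when the device faces the detector."""
    heading, _strength = max(samples, key=lambda s: s[1])
    return heading

# (heading, RSSI) samples collected over one spin of the device
samples = [(0.0, -62.0), (90.0, -55.0), (180.0, -70.0), (270.0, -66.0)]
bearing = direction_from_spin(samples)
print(bearing)  # 90.0 -> used in the direction calibration command
```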
- In many aspects, the wearable device 102 can track the location of the remotely operated device 140 as it traverses and maneuvers (1240). The remotely operated device 140 can include surface features or an accessory (e.g., a magnetically coupled attachment) that indicates a forward "looking" direction of the remotely operated device 140. In certain examples, the user is instructed to walk or run around in a circle until the user is directly facing the forward-looking direction of the remotely operated device 140. In many aspects, the wearable device 102 can include sensors to determine an orientation of the user. For example, the wearable device 102 can determine whether the user is facing an instructed direction in connection with the training mode, such as facing the remotely operated device 140. Additionally or alternatively, the wearable device 102 can generate an output, via the feedback mechanism, instructing the user to perform a set of actions (1245). The output may be in the form of audio instructions, and can be based on data received from the remotely operated device 140 (1247), or from the wearable device 102 utilizing a local routine set (1249), which may be randomized or sequenced in accordance with the executing training mode instructions. - Once the instructions are outputted to the user, the
wearable device 102 can initiate a timer (1250). The timer can be initiated for each instruction outputted to the user, and a threshold time limit can be set for each instruction. Thus, the user can be instructed to perform the set of actions within the predetermined time period. The wearable device 102 can monitor the sensor data to determine whether the user successfully performs the set of actions (1255). Specifically, for each instruction output, the wearable device 102 can determine whether the user has performed the instructed set of actions within the established threshold time limit (1260). If so (1261), then the wearable device 102 can generate another output instructing the user to perform another set of actions (1245). However, if the user fails to perform the set of actions (1263), then the wearable device 102 can terminate the training session and generate a final score (1265), which may be displayed on the user's mobile computing device.
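- Sketched loosely in code, the training loop described above might resemble the following; the instruction set, the scoring, and the gesture-check callback are all assumptions:

```python
import time

def run_training(instructions: list[str], detect_gesture,
                 limit_s: float = 5.0) -> int:
    """Issue instructions one at a time; end the session on the first
    instruction not performed within the threshold time limit."""
    score = 0
    for instruction in instructions:
        print(f"Perform: {instruction}")       # e.g., audio output in practice
        deadline = time.monotonic() + limit_s  # per-instruction timer
        while time.monotonic() < deadline:
            if detect_gesture(instruction):    # sensor patterns matched?
                score += 1
                break
        else:
            break  # timed out: terminate the training session
    return score   # final score, e.g., shown on the mobile computing device

final = run_training(["high_block", "side_strike"], lambda g: True)
print(final)  # 2
```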
- Sword or Wield Mode
-
FIG. 13 is a flow chart describing an example method of implementing a wearable device in a sword mode. In the below description of FIG. 13, reference may be made to like reference characters representing various features of FIG. 1A and FIG. 1B. Referring to FIG. 13, the wearable device 102 can detect a user input placing the wearable device 102 in a sword mode (1300). When in the sword mode, the wearable device 102 can initially monitor sensor data 137 to determine whether the user has grabbed or grasped an object (1305). In some implementations, a state machine in the memory 180 of the wearable device 102 specific to the grabbing/grasping action can provide a state machine report to the output generator 160 indicating that the user is holding an object. - In many examples, the
wearable device 102 can determine a series of actions performed by the user with the object from the sensor data 137 (1310). Such actions can be actively determined by the output generator 160 in the sword mode, or via state machine reports from respective state machines that indicate each specific action. In any case, in response to each action, the wearable device 102 can generate a feedback response or output 132 using a feedback mechanism (1315). The feedback output 132 for each action can be distinct audio responses 196 (1317), haptic responses 192 (1316), and/or visual responses 194 (1318). - In many examples, based on the action performed by the user, the feedback mechanism can generate a corresponding output, such as sword fighting sounds, or other weapon-like or wand-like sounds, based on the executing sword mode instructions. Thus, the
feedback output 132 based on the sensor patterns detected while the user is grasping an object may be any such sounds corresponding to the actions performed by the user. In this sense, the "sword mode" may rather be a "wand mode," a "magic mode," a "conductor mode," a "sporting mode" (e.g., for a tennis-like game), and the like. When the user wishes to end the sword mode, the wearable device 102 can detect a user input, such as an input on the mode selector 110 (1320). In response to the input, the wearable device 102 can deactivate or terminate the sword mode (1325).
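- For illustration, a sketch of mapping classified sword-mode actions to distinct feedback outputs (the action names and output channels are hypothetical placeholders):

```python
# Hypothetical mapping from a classified sword-mode action to the
# audio/haptic/visual feedback the device should emit.
FEEDBACK_MAP = {
    "swing":  {"audio": "whoosh.wav", "haptic": "pulse_short"},
    "clash":  {"audio": "clang.wav",  "haptic": "pulse_long",
               "visual": "flash_red"},
    "thrust": {"audio": "jab.wav"},
}

def on_action(action: str) -> None:
    for channel, effect in FEEDBACK_MAP.get(action, {}).items():
        print(f"{channel}: {effect}")  # stand-in for driving the hardware

on_action("clash")
```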
- Hardware Diagrams
-
FIG. 14 is a block diagram that illustrates a computer system upon which examples described herein may be implemented. For example, one or more components discussed with respect to the remotely operated device of FIG. 1A, and the methods described herein, may be implemented by the system 1400 of FIG. 14. - In one implementation, the
computer system 1400 includes processing resources 1410, a main memory 1420, ROM 1430, a storage device 1440, and a communication interface 1450. The computer system 1400 includes at least one processor 1410 for processing information and a main memory 1420, such as a random access memory (RAM) or other dynamic storage device, for storing information and instructions to be executed by the processor 1410. The main memory 1420 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 1410. The computer system 1400 may also include a read-only memory (ROM) 1430 or other static storage device for storing static information and instructions for the processor 1410. A storage device 1440, such as a magnetic disk or optical disk, is provided for storing information and instructions. For example, the storage device 1440 can correspond to a computer-readable medium that stores instructions for performing sensor data processing and translation operations as discussed herein. - The
communication interface 1450 can enable the computer system 1400 to communicate with a computing device and/or wearable device (e.g., via a cellular or Wi-Fi network) through use of a network link (wireless or wired). Using the network link, the computer system 1400 can communicate with a plurality of devices, such as the wearable device, a mobile computing device, and/or other self-propelled devices. The main memory 1420 of the computer system 1400 can further store the drive instructions 1424, which can be initiated by the processor 1410. Furthermore, the computer system 1400 can receive control commands 1462 from the wearable device and/or mobile computing device. The processor 1410 can execute the drive instructions 1424 to process and/or translate the control commands 1462, corresponding to user gestures performed by the user, and implement the control commands 1462 on the drive system of the self-propelled device. - Additionally, the
main memory 1420 can further include mode instructions 1422, which the processor 1410 can execute to place the self-propelled device in one or multiple modes to interact with the wearable device. In some examples, execution of the mode instructions 1422 can place the self-propelled device in an operational mode that provides feedback 1452 and/or instructions 1454 to the wearable device over the network 1480 (e.g., in training mode). - Examples described herein are related to the use of
computer system 1400 for implementing the techniques described herein. According to one example, those techniques are performed by the computer system 1400 in response to the processor 1410 executing one or more sequences of one or more instructions contained in the main memory 1420. Such instructions may be read into the main memory 1420 from another machine-readable medium, such as the storage device 1440. Execution of the sequences of instructions contained in the main memory 1420 causes the processor 1410 to perform the process steps described herein. In alternative implementations, hard-wired circuitry and/or hardware may be used in place of or in combination with software instructions to implement examples described herein. Thus, the examples described are not limited to any specific combination of hardware circuitry and software. -
FIG. 15 is a block diagram that illustrates a mobile computing device upon which examples described herein may be implemented, such as the mobile computing device 495 of FIG. 4. In one example, the computing device 1500 may correspond to, for example, a cellular communication device (e.g., feature phone, smartphone, etc.) that is capable of telephony, messaging, and/or data services. In variations, the computing device 1500 can correspond to, for example, a tablet or wearable computing device. - In an example of
FIG. 15, the computing device 1500 includes a processor 1510, memory resources 1520, a display device 1530 (e.g., a touch-sensitive display device), one or more communication sub-systems 1540 (including wireless communication sub-systems), input mechanisms 1550 (e.g., an input mechanism can include or be part of the touch-sensitive display device), and one or more location detection mechanisms (e.g., GPS component) 1560. In one example, at least one of the communication sub-systems 1540 sends and receives cellular data over data channels and voice channels. - The
memory resources 1520 can store a designated control application 1522, as one of multiple applications, to initiate the communication sub-system 1540 to establish one or more wireless communication links with the self-propelled device and/or a wearable device. Execution of the control application 1522 by the processor 1510 may cause a specified graphical user interface (GUI) 1535 to be generated on the display 1530. Interaction with the GUI 1535 can enable the user to calibrate the forward directional alignment between the self-propelled device and the computing device 1500. Furthermore, the GUI 1535 can allow the user to initiate a task-oriented operation (e.g., a game) to be performed by the user in conjunction with operating the self-propelled device with user gestures using the wearable device, as described herein. -
FIG. 16 is a block diagram of an example portable sensing device upon which examples described herein may be implemented, such as the wearable device 102 of FIG. 1A. - In an example of
FIG. 16, the portable sensing device 1600 includes a processor 1610, memory resources 1620, a feedback mechanism 1630 (e.g., audio 1632, haptic 1633, and visual 1631 devices), a communication sub-system 1640 (e.g., wireless communication sub-systems such as BLUETOOTH low energy), one or more sensors 1660 (e.g., a gyroscopic sensor or accelerometer), and an input mechanism 1650 (e.g., an analog or digital mode selector). In one example, the communication sub-system 1640 sends and receives data over one or more channels. - The
memory resources 1620 can store mode instructions 1623 corresponding to a plurality of control modes 1622, as described herein, which can be executed by the processor 1610 to initiate a particular mode. Certain executing mode instructions 1623 can initiate the communication sub-system 1640 to establish one or more wireless communication links with the self-propelled device and/or the mobile computing device. Execution of a control mode 1622 by the processor 1610 may cause the processor 1610 to generate distinct feedback responses using the feedback mechanism 1630 based on sensor data from the sensor(s) 1660 indicating user gestures performed by the user. - In some examples, the
memory resources 1620 can comprise a number of state machines 1624, which can provide state machine reports 1627 to the processor 1610 when specified sensor patterns are identified by respective state machines 1624. Each state machine 1624 may monitor for a single sensor pattern which, if identified by that state machine 1624, can cause the state machine 1624 to transition states, thereby providing a state machine report 1627 to the processor 1610 identifying the user gesture performed. The processor 1610 can translate the state machine reports 1627, which indicate the user gestures, in accordance with an executing set of mode instructions 1623 in order to generate a corresponding output via the feedback mechanism 1630 and/or control commands 1612 to be communicated to the self-propelled device via the communication sub-system 1640.
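- As a minimal sketch of one such per-gesture state machine (the thresholds and the two-step raise-then-swing pattern are invented for illustration):

```python
class SwingStateMachine:
    """Watches one sensor pattern (a raise followed by a fast swing)
    and reports when the full pattern has been observed."""

    def __init__(self, reporter):
        self.state = "idle"
        self.reporter = reporter  # e.g., queues a report to the processor

    def feed(self, accel_g: float, pitch_deg: float) -> None:
        if self.state == "idle" and pitch_deg > 60.0:
            self.state = "raised"            # arm lifted
        elif self.state == "raised" and accel_g > 2.5:
            self.state = "idle"              # pattern complete; reset
            self.reporter("swing")           # state machine report

sm = SwingStateMachine(lambda gesture: print("report:", gesture))
for accel, pitch in [(1.0, 70.0), (3.0, 40.0)]:  # raise, then swing
    sm.feed(accel, pitch)
```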
- While the examples of FIG. 14, FIG. 15, and FIG. 16 provide for a computer system 1400, a computing device 1500, and a portable sensing device 1600 for implementing aspects described, in some variations, other devices of the three can be arranged to implement some or all of the functionality described with the processing resources of the remotely operated device 140 of FIG. 1A, the mobile computing device 495 of FIG. 4, or the wearable device 102 of FIG. 1A, as shown and described throughout. - With further reference to examples of
FIG. 14, FIG. 15, and FIG. 16, some examples include functionality for projecting an orientation and/or perspective of a user onto a gaming environment via sensing output of the portable sensing device 1600. For example, when the portable sensing device 1600 is worn, the orientation and perspective of the user can be inferred from the sensors 1660 (e.g., IMU), and this sensor information can be virtualized for the gaming environment. For example, the gaming environment can be shown on a computing device (e.g., display screen of a computer, mobile computing device, etc.). The gaming environment can provide a perspective that is based on the orientation of the user (e.g., the user is standing north), as determined by the portable sensing device 1600. The perspective can change as the user changes orientation, moves in a particular direction, etc. In some examples, the portable sensing device 1600 can be used to control a virtual or actual object (e.g., self-propelled device or remotely operated device), and the orientation and direction of the controlled object may be with reference to a reference frame of the user. - In variations, a reference frame of the self-propelled device may be used, and the user's orientation can be used to influence control of the virtual or actual device in motion. For example, the user's movement or motion can influence a change of direction. Alternatively, both orientations can be used concurrently. For example, if the device under control is a virtual vehicle that carries the user, the user may turn his head (e.g., when wearing a necklace carrying the portable sensing device 1600) to see a view to a particular side while the orientation of the vehicle is used for the motion of the vehicle.
- Multi-Device Usage
-
FIG. 17 illustrates an embodiment of multiple sensing devices that concurrently provide input for a program or application which utilizes the inputs, along with inferences which can be made about a person or object that carries the devices, according to one or more examples. In particular, an example such as shown enables input from multiple sensing devices to be used for the purpose of enabling inferences of movement and pose from two relevant sources of user motion. For example, in FIG. 17, a user 1701 carries wearable devices in the form of a wrist device 1710 and a pendant 1712. In other examples, one or both of the wrist device 1710 and pendant 1712 can be in the form of an alternative form factor or device type. For example, the combination of sensing devices can include a hat, a ring, eyeglasses, or a device which the user can carry in his or her hand (e.g., FOB, mobile computing device). In variations, more than two wearable devices can be employed by one user. -
FIG. 18 illustrates a system which concurrently utilizes input from multiple modular sensing devices in connection with execution of an application or program. With reference to an example of FIG. 18, a multi-device system 1800 includes a first modular sensing device 1810, a second modular sensing device 1820, and a controller 1830. Each of the first and second modular sensing devices 1810, 1820 includes a respective inertial measurement unit (IMU) 1812, 1822, a processor, and memory. The IMU 1812, 1822 of each modular sensing device 1810, 1820 can include sensors such as an accelerometer and a gyroscopic sensor. The modular sensing devices 1810, 1820 may also include additional sensing resources, such as a magnetometer and/or proximity sensor. - The
controller 1830 can include a processor 1832 and a memory 1834. The processor 1832 can execute instructions 1835 for a program or application that can execute and process inputs from the modular sensing devices 1810, 1820. In some variations, the controller 1830 is a mobile computing device, such as a multi-purpose wireless communication device which can wirelessly communicate with each of the first and second modular sensing devices 1810, 1820. - While an example of
FIG. 18 illustrates the controller 1830 as a separate device from the first and second modular sensing devices 1810, 1820, variations provide that the controller 1830 is integrated or otherwise combined with at least one of the first or second modular sensing devices 1810, 1820. For example, the controller 1830 can include a multi-purpose wireless communication device that is equipped with a gyroscopic sensor and accelerometer. Thus, for example, variations can provide for the second modular sensing device 1820 to be a local resource of the controller 1830, which communicates with the first modular sensing device 1810. - With further reference to
FIG. 18, the controller 1830 can receive inputs from each of the first and second modular sensing devices 1810, 1820. The inputs can be processed in connection with an application 1839 or program that is executed by the processor 1832 of the controller 1830. The processor 1832 can execute the instructions 1845 in order to implement an inference engine 1835 for determining inferences about the person or object carrying one or both of the modular sensing devices 1810, 1820. For example, the application 1839 can correspond to a game or simulation, and the inference engine 1835 can be specific to the application 1839. Among other applications, the inference engine 1835 can be used to determine when the motions of the two modular sensing devices 1810, 1820 are separate and distinct from one another, or continuous and/or part of the same input motion.
- According to one implementation, the first and second sensing devices 1810, 1820 each generate a set of measured (or sensed) data corresponding to, for example, a movement (e.g., gesture) made with the respective sensing device 1810, 1820. Additionally, the controller 1830 can process the input received from each sensing device 1810, 1820.
- With reference to an example of FIG. 17, a user 1701 can wear two modular sensing devices, and the inference engine 1835 can assume some inferences based on anatomical constraints and/or context (e.g., such as provided from execution of the application 1839). For example, each of the first and second modular sensing devices 1810, 1820 can correspond to a wearable wrist device. Alternatively, the second modular sensing device 1820 can correspond to the pendant 1712 or a neck-worn device. By way of example, if the first modular sensing device 1810 (wrist device 1710) is detected to be in motion, the inference engine 1835 can be used to determine additional position data for the movement of that device along a third axis based on the orientation, position, or context of the second modular sensing device 1820 (a second wrist device or the pendant device 1712). For example, if the first modular sensing device 1810 (wrist device 1710) measures an arc motion, and the second modular sensing device 1820 is the pendant, then the orientation of the second modular sensing device can indicate whether, for example, the arc motion is in front of the user or to the user's side. Alternatively, if the second modular sensing device 1820 is a second wrist device, the information sensed from the second wrist device can identify the corresponding hand or device as being in front of the body. In such an orientation, the inference engine 1835 can determine the inference to be that the user is making the arc of motion in front of his body. Similarly, if the height of the second sensing device 1820 is determined to be belt high and the device is held by the user, the orientation of the user's torso can be inferred (along with the direction of the arc).
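- A toy sketch of this kind of cross-device inference (the rules and thresholds are illustrative assumptions, not the disclosed inference engine):

```python
def infer_arc_context(first_moving: bool, second_kind: str,
                      second_pitch_deg: float) -> str | None:
    """Combine motion from a first (wrist) device with the orientation
    of a second device to infer where an arc gesture is performed."""
    if not first_moving:
        return None
    if second_kind == "pendant":
        # A forward-tilted pendant suggests the torso faces the gesture.
        return "arc in front of user" if second_pitch_deg > 20.0 \
            else "arc to user's side"
    if second_kind == "wrist":
        # Second hand sensed in front of the body implies a frontal arc.
        return "arc in front of user"
    return "arc (context unknown)"

print(infer_arc_context(True, "pendant", 35.0))  # arc in front of user
```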
- In examples in which the second modular sensing device 1820 is a pocket device (e.g., mobile computing device, FOB), information can be determined from, for example, the height of the device (e.g., user standing, crouching, or jumping) and the rotation of the device. For example, if the second modular sensing device 1820 is pocket-worn, a change in the orientation of the device from vertical to horizontal, in combination with a downward acceleration, can indicate the user is crouching. If the user is crouching, for example, the type of motion that is likely by the first modular sensing device 1810 may be limited (e.g., motion of the wrist device 1710 is likely in front of the user when the user is moving up or down). The examples described with respect to FIGS. 17 and 18 can enable the user to utilize the modular sensing device(s) in connection with a real-world gameplay environment (or other task-oriented activities) executed by one or more of the modular sensing devices 1810, 1820, control a remotely operated device using gestures sensed by the modular sensing devices 1810, 1820, interact with other users, and perform various tasks in which the modular sensing devices 1810, 1820 can provide feedback and response output. - Modular Sensing Device Implementations
-
FIG. 19 illustrates an example of a modular sensing device insertable into a wrist-worn apparatus. In particular, a modular sensing device 1900 can be constructed in accordance with examples provided herein, in order to implement operations and functionality such as provided by any of the examples described. The modular sensing device 1900 can include a housing 1910 for containing a processor, memory (e.g., a controller implementing a plurality of state machines), and one or more sensors (e.g., IMU, gyroscope, accelerometer, proximity sensor, magnetometer, etc.). The modular sensing device 1900 can also include a wireless communication resource for communicating with other devices, including devices which may be controlled in movement (e.g., self-propelled device) or other processes. - In some examples, the
modular sensing device 1900 can communicate sensor data, including output from the IMU, to another device for the purpose of controlling movement of the other device. In some variations, the modular sensing device 1900 can include processing capabilities to process raw sensor data into higher data forms of communication. For example, the modular sensing device 1900 can generate output in the form of commands, or input for command selection from a receiving device. - According to some examples, the
housing 1910 of the modular sensing device 1900 can include securement features for enabling the modular sensing device 1900 to fasten onto another compatible structure 1920. The compatible structure 1920 can include an opening that is shaped to receive and secure the modular sensing device 1900. As shown with an example of FIG. 19, the securement features can include, for example, structural or shaped features of the housing 1910. For example, the housing 1910 can be dimensioned and/or structured (e.g., the housing may be biased) to snap-fit into the compatible structure 1920. Alternatively, at least one of the housing 1910 or the compatible structure 1920 can include an integrated and/or mechanical fastener to secure the modular sensing device 1900. -
FIG. 20 illustrates an implementation of the modularized sensing device 2000. As shown, the sensing device 2000 can be retained by the compatible structure 2020 (e.g., wrist-worn strap), and then removed and placed in an opening of a wielded device 2010 (e.g., play sword). The placement of the modular sensing device 2000 in different compatible structures 2010, 2020 can correspond to different uses. For example, the modular sensing device 2000 in the wrist-worn strap 2020 can be used in conjunction with a first program running on a mobile computing device (controller), self-propelled device, and/or other computer system (e.g., virtual gaming system). When placed in the wielded device 2010 (e.g., a wand), the modular sensing device 2000 can be operated in conjunction with a mobile computing device, self-propelled device, and/or other computer system (e.g., virtual gaming system) which executes a second program or application. In each context, the orientation of the modular sensing device 2000 can be used to determine a perspective, such as a virtual field of view for gameplay. The perspective can refer to the orientation, direction, and/or position of the user, and/or of the user's body part with respect to the sensing device. With the wand, the orientation and direction of the sensing device can be used to project a corresponding virtual object in a virtual environment (e.g., a sword). The modular sensing device 2000 may also be able to read an identifier of the compatible structure 2010, 2020 in which it is retained. - Conclusion
- It is contemplated for examples described herein to extend to individual elements and concepts described herein, independently of other concepts, ideas, or systems, as well as for examples to include combinations of elements recited anywhere in this application. Although examples are described in detail herein with reference to the accompanying drawings, it is to be understood that this disclosure is not limited to those precise examples. As such, many modifications and variations will be apparent to practitioners skilled in this art. Accordingly, it is intended that the scope of this disclosure be defined by the following claims and their equivalents. Furthermore, it is contemplated that a particular feature described either individually or as part of an example can be combined with other individually described features, or parts of other examples, even if the other features and examples make no mention of the particular feature. Thus, the absence of describing combinations should not preclude the inventor from claiming rights to such combinations.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/253,785 US20170189803A1 (en) | 2016-01-04 | 2016-08-31 | Task-oriented feedback using a modular sensing device |
PCT/US2017/020775 WO2017120624A1 (en) | 2016-01-04 | 2017-03-03 | Task-oriented feedback using a modular sensing device |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662274514P | 2016-01-04 | 2016-01-04 | |
US201662346216P | 2016-06-06 | 2016-06-06 | |
US15/253,785 US20170189803A1 (en) | 2016-01-04 | 2016-08-31 | Task-oriented feedback using a modular sensing device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170189803A1 true US20170189803A1 (en) | 2017-07-06 |
Family
ID=59226338
Family Applications (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/253,797 Abandoned US20170189824A1 (en) | 2016-01-04 | 2016-08-31 | Modular sensing device utilized with autonomous self-propelled device |
US15/253,763 Expired - Fee Related US10534437B2 (en) | 2016-01-04 | 2016-08-31 | Modular sensing device for processing gestures |
US15/253,790 Active US9939913B2 (en) | 2016-01-04 | 2016-08-31 | Smart home control using modular sensing device |
US15/253,785 Abandoned US20170189803A1 (en) | 2016-01-04 | 2016-08-31 | Task-oriented feedback using a modular sensing device |
US15/253,799 Active US10001843B2 (en) | 2016-01-04 | 2016-08-31 | Modular sensing device implementing state machine gesture interpretation |
US15/253,778 Active US10275036B2 (en) | 2016-01-04 | 2016-08-31 | Modular sensing device for controlling a self-propelled device |
US16/010,839 Abandoned US20190171295A1 (en) | 2016-01-04 | 2018-06-18 | Modular sensing device implementing state machine gesture interpretation |
Family Applications Before (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/253,797 Abandoned US20170189824A1 (en) | 2016-01-04 | 2016-08-31 | Modular sensing device utilized with autonomous self-propelled device |
US15/253,763 Expired - Fee Related US10534437B2 (en) | 2016-01-04 | 2016-08-31 | Modular sensing device for processing gestures |
US15/253,790 Active US9939913B2 (en) | 2016-01-04 | 2016-08-31 | Smart home control using modular sensing device |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/253,799 Active US10001843B2 (en) | 2016-01-04 | 2016-08-31 | Modular sensing device implementing state machine gesture interpretation |
US15/253,778 Active US10275036B2 (en) | 2016-01-04 | 2016-08-31 | Modular sensing device for controlling a self-propelled device |
US16/010,839 Abandoned US20190171295A1 (en) | 2016-01-04 | 2018-06-18 | Modular sensing device implementing state machine gesture interpretation |
Country Status (2)
Country | Link |
---|---|
US (7) | US20170189824A1 (en) |
WO (6) | WO2017120624A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9939913B2 (en) | 2016-01-04 | 2018-04-10 | Sphero, Inc. | Smart home control using modular sensing device |
US20180136642A1 (en) * | 2016-12-30 | 2018-05-17 | Haoxiang Electric Energy (Kunshan) Co., Ltd. | Control method, control device and control system for unmanned aerial vehicle |
US20180307909A1 (en) * | 2017-04-21 | 2018-10-25 | Walmart Apollo, Llc | Virtual reality network management user interface |
US20200275280A1 (en) * | 2017-09-08 | 2020-08-27 | Telefonaktiebolaget Lm Ericsson (Publ) | Re-Establishing a Connection Between a User Controller Device and a Wireless Device |
US10939159B1 (en) * | 2020-07-31 | 2021-03-02 | Arkade, Inc. | Systems and methods for enhanced remote control |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8572513B2 (en) | 2009-03-16 | 2013-10-29 | Apple Inc. | Device, method, and graphical user interface for moving a current position in content at a variable scrubbing rate |
US10706096B2 (en) | 2011-08-18 | 2020-07-07 | Apple Inc. | Management of local and remote media items |
US9002322B2 (en) | 2011-09-29 | 2015-04-07 | Apple Inc. | Authentication with secondary approver |
EP3149554B1 (en) | 2014-05-30 | 2024-05-01 | Apple Inc. | Continuity |
WO2016036510A1 (en) | 2014-09-02 | 2016-03-10 | Apple Inc. | Music user interface |
KR101653146B1 (en) | 2015-09-04 | 2016-09-01 | 홍유정 | Drone controller |
KR101921376B1 (en) * | 2016-02-24 | 2018-11-22 | 홍유정 | Object controller |
JP2017196691A (en) * | 2016-04-27 | 2017-11-02 | パナソニックIpマネジメント株式会社 | robot |
JP2017205324A (en) * | 2016-05-19 | 2017-11-24 | パナソニックIpマネジメント株式会社 | robot |
US20220201191A1 (en) * | 2016-06-10 | 2022-06-23 | Gopro, Inc. | Systems and methods for sharing communications with a multi-purpose device |
DK201670622A1 (en) | 2016-06-12 | 2018-02-12 | Apple Inc | User interfaces for transactions |
US20180129254A1 (en) * | 2016-11-07 | 2018-05-10 | Toyota Motor Engineering & Manufacturing North Ame rica, Inc. | Wearable device programmed to record messages and moments in time |
US20220279063A1 (en) | 2017-05-16 | 2022-09-01 | Apple Inc. | Methods and interfaces for home media control |
CN111343060B (en) | 2017-05-16 | 2022-02-11 | 苹果公司 | Method and interface for home media control |
US10108272B1 (en) * | 2017-05-30 | 2018-10-23 | Motorola Mobility Llc | Wearable device with gesture recognition module |
TWI653550B (en) * | 2017-07-06 | 2019-03-11 | 鴻海精密工業股份有限公司 | Electronic device and display control method thereof |
WO2019028858A1 (en) * | 2017-08-11 | 2019-02-14 | Lenovo (Beijing) Limited | Aerial vehicle state transition |
US11156457B2 (en) * | 2017-12-14 | 2021-10-26 | Sphero, Inc. | Systems and methods for device detection of presence and manipulations |
US12045050B2 (en) * | 2018-01-12 | 2024-07-23 | Superior Marine LLC | Gesturing for control input for a vehicle |
GB2574886A (en) * | 2018-06-22 | 2019-12-25 | Ecole Polytechnique Fed Lausanne Epfl | Teleoperation with a wearable sensor system |
CN109143875B (en) * | 2018-06-29 | 2021-06-15 | 广州市得腾技术服务有限责任公司 | Gesture control smart home method and system |
DK201970533A1 (en) | 2019-05-31 | 2021-02-15 | Apple Inc | Methods and user interfaces for sharing audio |
US11620103B2 (en) | 2019-05-31 | 2023-04-04 | Apple Inc. | User interfaces for audio media control |
US10904029B2 (en) | 2019-05-31 | 2021-01-26 | Apple Inc. | User interfaces for managing controllable external devices |
US10996917B2 (en) | 2019-05-31 | 2021-05-04 | Apple Inc. | User interfaces for audio media control |
CN110515307B (en) * | 2019-08-27 | 2020-11-20 | 珠海格力电器股份有限公司 | Method for controlling intelligent household equipment and network equipment |
CN111586714B (en) * | 2020-04-21 | 2021-07-20 | 珠海格力电器股份有限公司 | Network port allocation method, device, electronic equipment and computer usable medium |
US11392291B2 (en) * | 2020-09-25 | 2022-07-19 | Apple Inc. | Methods and interfaces for media control with dynamic feedback |
KR102372040B1 (en) * | 2021-05-10 | 2022-03-10 | (주)에이티로봇 | Control device for human type toy robot |
US11847378B2 (en) | 2021-06-06 | 2023-12-19 | Apple Inc. | User interfaces for audio routing |
US20230102929A1 (en) * | 2021-09-24 | 2023-03-30 | Embark Trucks, Inc. | Autonomous vehicle automated scenario characterization |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7355A (en) * | 1850-05-14 | Oratia p | ||
US6183365B1 (en) * | 1996-06-05 | 2001-02-06 | Casio Computer Co., Ltd. | Movement measuring device, electronic game machine including movement measuring device, and method of playing game machine |
US20080300055A1 (en) * | 2007-05-29 | 2008-12-04 | Lutnick Howard W | Game with hand motion control |
US8089458B2 (en) * | 2000-02-22 | 2012-01-03 | Creative Kingdoms, Llc | Toy devices and methods for providing an interactive play experience |
US20130190903A1 (en) * | 2012-01-19 | 2013-07-25 | Nike, Inc. | Action Detection and Activity Classification |
US20140347265A1 (en) * | 2013-03-15 | 2014-11-27 | Interaxon Inc. | Wearable computing apparatus and method |
US9125015B2 (en) * | 2013-06-28 | 2015-09-01 | Facebook, Inc. | User activity tracking system and device |
US9418342B2 (en) * | 2013-12-06 | 2016-08-16 | At&T Intellectual Property I, L.P. | Method and apparatus for detecting mode of motion with principal component analysis and hidden markov model |
Family Cites Families (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE60319638T2 (en) * | 2002-08-07 | 2009-04-02 | Seiko Epson Corp. | Portable information device |
US8684839B2 (en) | 2004-06-18 | 2014-04-01 | Igt | Control of wager-based game using gesture recognition |
US8214098B2 (en) | 2008-02-28 | 2012-07-03 | The Boeing Company | System and method for controlling swarm of remote unmanned vehicles through human gestures |
EP2347321B1 (en) | 2008-09-25 | 2013-09-18 | Movea S.A. | Command by gesture interface |
US8503932B2 (en) | 2008-11-14 | 2013-08-06 | Sony Mobile Communications AB | Portable communication device and remote motion input device |
US8289162B2 (en) * | 2008-12-22 | 2012-10-16 | Wimm Labs, Inc. | Gesture-based user interface for a wearable portable device |
US8593576B2 (en) | 2009-10-15 | 2013-11-26 | At&T Intellectual Property I, L.P. | Gesture-based remote control |
US20110159939A1 (en) | 2009-12-24 | 2011-06-30 | Jason McCarthy | Fight analysis system |
US8150384B2 (en) | 2010-06-16 | 2012-04-03 | Qualcomm Incorporated | Methods and apparatuses for gesture based remote control |
KR101117207B1 (en) | 2010-07-12 | 2012-03-16 | Korea Aerospace University Industry-Academic Cooperation Foundation | Auto and manual control system for unmanned aerial vehicle via smart phone |
US9141150B1 (en) | 2010-09-15 | 2015-09-22 | Alarm.Com Incorporated | Authentication and control interface of a security system |
US20120173050A1 (en) | 2011-01-05 | 2012-07-05 | Bernstein Ian H | System and method for controlling a self-propelled device in connection with a virtual environment |
US8717165B2 (en) | 2011-03-22 | 2014-05-06 | Tassilo Gernandt | Apparatus and method for locating, tracking, controlling and recognizing tagged objects using RFID technology |
US20140008496A1 (en) | 2012-07-05 | 2014-01-09 | Zhou Ye | Using handheld device to control flying object |
US11237719B2 (en) | 2012-11-20 | 2022-02-01 | Samsung Electronics Company, Ltd. | Controlling remote electronic device with wearable electronic device |
US20140180582A1 (en) * | 2012-12-21 | 2014-06-26 | Mark C. Pontarelli | Apparatus, method and techniques for wearable navigation device |
US9221170B2 (en) | 2013-06-13 | 2015-12-29 | GM Global Technology Operations LLC | Method and apparatus for controlling a robotic device via wearable sensors |
US9606721B2 (en) | 2013-07-22 | 2017-03-28 | Lg Electronics Inc. | Mobile terminal and control method thereof |
JP6275839B2 (en) | 2013-08-23 | 2018-02-07 | Huawei Technologies Co., Ltd. | Remote control device, information processing method and system |
US20150062086A1 (en) | 2013-08-29 | 2015-03-05 | Rohildev Nattukallingal | Method and system of a wearable ring device for management of another computing device |
KR102088018B1 (en) * | 2013-09-03 | 2020-05-27 | Samsung Electronics Co., Ltd. | Apparatus and method for interworking among electronic devices |
US10139914B2 (en) | 2013-09-13 | 2018-11-27 | Nod, Inc. | Methods and apparatus for using the human body as an input device |
US20160299570A1 (en) | 2013-10-24 | 2016-10-13 | Apple Inc. | Wristband device input using wrist movement |
US20150145653A1 (en) | 2013-11-25 | 2015-05-28 | Invensense, Inc. | Device control using a wearable device |
JP6366730B2 (en) | 2013-12-13 | 2018-08-01 | SZ DJI Technology Co., Ltd. | Method for launching and landing a drone |
EP3069546A1 (en) | 2013-12-18 | 2016-09-21 | Apple Inc. | Gesture-based information exchange between devices in proximity |
WO2015102467A1 (en) | 2014-01-06 | 2015-07-09 | Samsung Electronics Co., Ltd. | Home device control apparatus and control method using wearable device |
US9883301B2 (en) | 2014-04-22 | 2018-01-30 | Google Technology Holdings LLC | Portable electronic device with acoustic and/or proximity sensors and methods therefor |
WO2015179838A2 (en) * | 2014-05-23 | 2015-11-26 | Sphero, Inc. | Causing gesture responses on connected devices |
US20160101856A1 (en) | 2014-06-23 | 2016-04-14 | Nixie Labs, Inc. | Wearable unmanned aerial vehicles, and associated systems and methods |
US9720515B2 (en) | 2015-01-02 | 2017-08-01 | Wearable Devices Ltd. | Method and apparatus for a gesture controlled interface for wearable devices |
US10099608B2 (en) * | 2015-01-16 | 2018-10-16 | Ford Global Technologies, Llc | Haptic vehicle alert based on wearable device |
US9721566B2 (en) | 2015-03-08 | 2017-08-01 | Apple Inc. | Competing devices responding to voice triggers |
US10222870B2 (en) * | 2015-04-07 | 2019-03-05 | Santa Clara University | Reminder device wearable by a user |
US10459524B2 (en) | 2015-04-14 | 2019-10-29 | Northrop Grumman Systems Corporation | Multi-sensor control system and method for remote signaling control of unmanned vehicles |
KR20150063998A (en) | 2015-05-21 | 2015-06-10 | LG Innotek Co., Ltd. | Mobile communication terminal with remote control function and control method thereof |
CN105138126B (en) | 2015-08-26 | 2018-04-13 | Xiaomi Technology Co., Ltd. | Shooting control method and device for an unmanned aerial vehicle, and electronic equipment |
KR102570068B1 (en) | 2015-11-20 | 2023-08-23 | Samsung Electronics Co., Ltd. | Gesture recognition method, gesture recognition apparatus, wearable device |
US9663227B1 (en) | 2015-12-22 | 2017-05-30 | Gopro, Inc. | Systems and methods for controlling an unmanned aerial vehicle |
US20170189824A1 (en) | 2016-01-04 | 2017-07-06 | Sphero, Inc. | Modular sensing device utilized with autonomous self-propelled device |
- 2016
- 2016-08-31 US US15/253,797 patent/US20170189824A1/en not_active Abandoned
- 2016-08-31 US US15/253,763 patent/US10534437B2/en not_active Expired - Fee Related
- 2016-08-31 US US15/253,790 patent/US9939913B2/en active Active
- 2016-08-31 US US15/253,785 patent/US20170189803A1/en not_active Abandoned
- 2016-08-31 US US15/253,799 patent/US10001843B2/en active Active
- 2016-08-31 US US15/253,778 patent/US10275036B2/en active Active
- 2017
- 2017-03-03 WO PCT/US2017/020775 patent/WO2017120624A1/en active Application Filing
- 2017-03-03 WO PCT/US2017/020786 patent/WO2017120626A1/en active Application Filing
- 2017-03-03 WO PCT/US2017/020771 patent/WO2017120623A2/en active Application Filing
- 2017-03-03 WO PCT/US2017/020762 patent/WO2017120622A1/en active Application Filing
- 2017-03-03 WO PCT/US2017/020779 patent/WO2017120625A1/en active Application Filing
- 2017-03-03 WO PCT/US2017/020790 patent/WO2017139812A2/en active Application Filing
- 2018
- 2018-06-18 US US16/010,839 patent/US20190171295A1/en not_active Abandoned
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9939913B2 (en) | 2016-01-04 | 2018-04-10 | Sphero, Inc. | Smart home control using modular sensing device |
US10001843B2 (en) | 2016-01-04 | 2018-06-19 | Sphero, Inc. | Modular sensing device implementing state machine gesture interpretation |
US10275036B2 (en) | 2016-01-04 | 2019-04-30 | Sphero, Inc. | Modular sensing device for controlling a self-propelled device |
US10534437B2 (en) | 2016-01-04 | 2020-01-14 | Sphero, Inc. | Modular sensing device for processing gestures |
US20180136642A1 (en) * | 2016-12-30 | 2018-05-17 | Haoxiang Electric Energy (Kunshan) Co., Ltd. | Control method, control device and control system for unmanned aerial vehicle |
US20180307909A1 (en) * | 2017-04-21 | 2018-10-25 | Walmart Apollo, Llc | Virtual reality network management user interface |
US20180307908A1 (en) * | 2017-04-21 | 2018-10-25 | Walmart Apollo, Llc | Virtual reality appliance management user interface |
US20200275280A1 (en) * | 2017-09-08 | 2020-08-27 | Telefonaktiebolaget Lm Ericsson (Publ) | Re-Establishing a Connection Between a User Controller Device and a Wireless Device |
US10939159B1 (en) * | 2020-07-31 | 2021-03-02 | Arkade, Inc. | Systems and methods for enhanced remote control |
US11445236B2 (en) | 2020-07-31 | 2022-09-13 | Arkade, Inc. | Systems and methods for enhanced remote control |
Also Published As
Publication number | Publication date |
---|---|
US10001843B2 (en) | 2018-06-19 |
WO2017120622A1 (en) | 2017-07-13 |
US20170192516A1 (en) | 2017-07-06 |
WO2017139812A3 (en) | 2017-10-05 |
WO2017120624A1 (en) | 2017-07-13 |
WO2017120626A1 (en) | 2017-07-13 |
WO2017139812A2 (en) | 2017-08-17 |
US20170189824A1 (en) | 2017-07-06 |
US20190171295A1 (en) | 2019-06-06 |
US20170192517A1 (en) | 2017-07-06 |
US9939913B2 (en) | 2018-04-10 |
US20170192518A1 (en) | 2017-07-06 |
WO2017120623A2 (en) | 2017-07-13 |
US10275036B2 (en) | 2019-04-30 |
WO2017120625A1 (en) | 2017-07-13 |
WO2017120623A3 (en) | 2017-08-10 |
US10534437B2 (en) | 2020-01-14 |
US20170193813A1 (en) | 2017-07-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20170189803A1 (en) | Task-oriented feedback using a modular sensing device | |
CN111095150B (en) | Robot as personal trainer | |
US11460837B2 (en) | Self-propelled device with actively engaged drive system | |
US10222868B2 (en) | Wearable device and control method using gestures | |
US9361067B1 (en) | System and method for providing a software development kit to enable configuration of virtual counterparts of action figures or action figure accessories | |
US10661148B2 (en) | Dual motion sensor bands for real time gesture tracking and interactive gaming | |
EP3007030B1 (en) | Portable device and control method via gestures | |
US20150258458A1 (en) | Interactive smart beads | |
KR102376816B1 (en) | Unlock augmented reality experience with target image detection | |
US20220100281A1 (en) | Managing states of a gesture recognition device and an interactive casing | |
CN110123333A (en) | Method for assisting exercise with a wearable device, wearable device, and storage medium
US20150273321A1 (en) | Interactive Module | |
CN112915541A (en) | Jump point search method and apparatus, device, and storage medium
WO2018057044A1 (en) | Dual motion sensor bands for real time gesture tracking and interactive gaming | |
CN116251343A (en) | Motion-sensing game method based on a throwing action
JP2022156374A (en) | Program, method, and information processing device | |
WO2021181851A1 (en) | Information processing device, method, and program | |
CN115282589B (en) | Motion-sensing game method based on a rope-skipping action
CN115337627A (en) | Boxing motion-sensing game method and device, and computer-readable storage medium
JP2022156128A (en) | Program, method, and information processing device | |
CN117991901A (en) | Movement-based motion-sensing game method
JP2022156375A (en) | Program, method, and information processing device | |
CN116173492A (en) | Motion-sensing game method based on boxing movements
Shen | Application of Depth Sensor in the Design of Hybrid Robotic Gaming Environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment |
Owner name: SPHERO, INC., COLORADO
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ATWELL, JAMES;WIENCROT, JEFFREY;CARROLL, JONATHAN;AND OTHERS;SIGNING DATES FROM 20170124 TO 20170126;REEL/FRAME:041254/0645 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
AS | Assignment |
Owner name: SILICON VALLEY BANK, ILLINOIS
Free format text: SECURITY INTEREST;ASSIGNOR:SPHERO, INC.;REEL/FRAME:052623/0705
Effective date: 20200501 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |