Montanini et al., 2015 - Google Patents
Low complexity head tracking on portable android devices for real time message composition
- Document ID
- 4832163407452534898
- Author
- Montanini L
- Cippitelli E
- Gambi E
- Spinsante S
- Publication year
- 2015
- Publication venue
- Journal on Multimodal User Interfaces
Snippet
For the people who are totally or partially unable to move or control their limbs and cannot rely on verbal communication, it is very important to obtain an interface capable of interpreting their limited voluntary movements, in order to allow communications with friends …
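The snippet covers only the motivation; per the title, the paper tracks the user's head through the camera of a portable Android device to drive real-time message composition. As a rough, hypothetical illustration of what a low-complexity on-device face localisation step can look like (not the authors' actual pipeline), the sketch below uses the stock android.media.FaceDetector API, which reports the mid-point between the eyes of each detected face; frame capture and the mapping from head position to message composition are assumed and omitted.

```java
import android.graphics.Bitmap;
import android.graphics.PointF;
import android.media.FaceDetector;

// Hypothetical helper: locates the dominant face in a camera frame and
// reports the eye mid-point, which a head-tracking UI could map to a cursor.
// Illustrative sketch only; not the method described in the paper.
public class HeadPositionEstimator {

    private final FaceDetector detector;
    private final FaceDetector.Face[] faces = new FaceDetector.Face[1];

    public HeadPositionEstimator(int frameWidth, int frameHeight) {
        // android.media.FaceDetector expects RGB_565 bitmaps of a fixed size.
        detector = new FaceDetector(frameWidth, frameHeight, faces.length);
    }

    /** Returns the eye mid-point of the first detected face, or null if none. */
    public PointF estimate(Bitmap rgb565Frame) {
        if (detector.findFaces(rgb565Frame, faces) == 0) {
            return null; // no face in this frame
        }
        PointF midPoint = new PointF();
        faces[0].getMidPoint(midPoint);
        return midPoint;
    }
}
```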
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/00221—Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
- G06K9/00268—Feature extraction; Face representation
- G06K9/00281—Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/00335—Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11580711B2 (en) | | Systems and methods for controlling virtual scene perspective via physical touch input |
| CN114341779B (en) | | Systems, methods, and interfaces for performing input based on neuromuscular control |
| US20220229534A1 (en) | | Coordinating cursor movement between a physical surface and a virtual surface |
| US20170083086A1 (en) | | Human-Computer Interface |
| US20070074114A1 (en) | | Automated dialogue interface |
| Ding et al. | | Service robot system with integration of wearable Myo armband for specialized hand gesture human–computer interfaces for people with disabilities with mobility problems |
| US20220291753A1 (en) | | Spatial Gesture Recognition using Inputs from Different Devices to Control a Computing Device |
| KR20190030140A (en) | | Method for eye-tracking and user terminal for executing the same |
| US20120268359A1 (en) | | Control of electronic device using nerve analysis |
| US20220253146A1 (en) | | Combine Inputs from Different Devices to Control a Computing Device |
| US20220236801A1 (en) | | Method, computer program and head-mounted device for triggering an action, method and computer program for a computing device and computing device |
| KR20240053070A (en) | | Touchless image-based input interface |
| Montanini et al. | | Low complexity head tracking on portable android devices for real time message composition |
| Chhimpa et al. | | Development of a real-time eye movement-based computer interface for communication with improved accuracy for disabled people under natural head movements |
| US11789530B2 (en) | | Gaze-based user interface with assistant features for smart glasses in immersive reality applications |
| Esiyok et al. | | Novel hands-free interaction techniques based on the software switch approach for computer access with head movements |
| Shree et al. | | A Virtual Assistor for Impaired People by using Gestures and Voice |
| Ding et al. | | A design on recommendations of sensor development platforms with different sensor modalities for making gesture biometrics-based service applications of the specific group |
| Biswas | | Inclusive Human Machine Interaction for India: A Case Study of Developing Inclusive Applications for the Indian Population |
| Kreiensieck et al. | | A Comprehensive Evaluation of OpenFace 2.0 Gaze Tracking |
| Abe et al. | | Communication-Aid System Using Eye-Gaze and Blink Information |
| Tektonidis et al. | | Intuitive user interfaces to help boost adoption of internet-of-things and internet-of-content services for all |
| Umut et al. | | Novel Wearable System to Recognize Sign Language in Real Time |
| Arai | | Mobile phone operations just by sight and its applications |
| Kowalczyk et al. | | Wink Detection on the Eye Image as a Control Tool in Multimodal Interaction |