
US9110495B2 - Combined surface user interface - Google Patents


Info

Publication number
US9110495B2
Authority
US
United States
Prior art keywords
computing device
mobile computing
neighboring
computing devices
projection
Prior art date
Legal status
Expired - Fee Related, expires
Application number
US12/699,706
Other versions
US20110191690A1 (en)
Inventor
Chunhui Zhang
Ji Zhao
Min Wang
Rui Gao
Xiong-Fei Cai
Chunshui Zhao
Current Assignee
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date
Filing date
Publication date
Application filed by Microsoft Technology Licensing LLC
Priority to US12/699,706
Assigned to MICROSOFT CORPORATION (assignment of assignors interest). Assignors: CAI, XIONG-FEI; GAO, RUI; WANG, MIN; ZHANG, CHUNHUI; ZHAO, CHUNSHUI; ZHAO, JI
Publication of US20110191690A1
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC (assignment of assignors interest). Assignor: MICROSOFT CORPORATION
Priority to US14/825,711 (published as US10452203B2)
Application granted
Publication of US9110495B2
Status: Expired - Fee Related
Adjusted expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0425: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423: Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80: Camera processing pipelines; Components thereof
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3141: Constructional details thereof
    • H04N 9/3147: Multi-projection systems
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/041: Indexing scheme relating to G06F 3/041 - G06F 3/045
    • G06F 2203/04104: Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger

Definitions

  • Collaboration logic 126 of mobile device 102(a) can collaborate with mobile device 102(b) to create the combined seamless user interface 128.
  • Collaboration logic 126 of mobile device 102(a) communicates with mobile device 102(b) to negotiate how each device will clip portions of its projection area 114 to avoid overlap.
  • Collaboration logic 126 communicates coordinates allowing the remaining projection areas of the two devices to be graphically stitched to each other to create the appearance of a single projected image or user interface utilizing at least portions of the projection areas 114(a) and 114(b).
  • FIG. 2 shows an example of how the projection areas of three devices might be clipped and stitched to form a combined interface.
  • FIG. 2 shows three overlapping projection areas: projection area 114(a) produced by mobile device 102(a) of FIG. 1, projection area 114(b) produced by neighboring device 102(b), and another projection area 114(c) that might be produced by another neighboring device having capabilities similar to those of mobile device 102(a).
  • Although FIG. 2 shows only three overlapping projection areas, the techniques described herein can be used to create a combined seamless user interface from any number of overlapping projection areas.
  • FIG. 2 also shows input area 116(a). Although other input areas are not shown in FIG. 2, it should be understood that each projection device might define its own input area, corresponding to its projection area.
  • The projection area 114(a) is represented by a polygon defined by (P4, B, C, D).
  • The projection area 114(b) is represented by a polygon defined by (P1, P2, P3, A).
  • The projection area 114(c) is represented by a polygon defined by (E, F, P5, P6). All of the projection areas and input areas are located on surface 112.
  • In FIG. 2 there is an overlapping portion between projection area 114(a) and projection area 114(b), forming a polygon defined by (C1, B, C2, A). The cross points between the two projection areas are C1 and C2. There is also an overlapping portion between projection area 114(a) and projection area 114(c), and the cross points between those two projection areas are C3 and C4.
  • Camera 110(a) of mobile device 102(a) can detect the overlapping portions and cross points.
  • Mobile device 102(a) is configured to monitor its input area 116(a) with camera 110(a) and to detect any portions of projection area 114(a) that are significantly brighter than other parts of projection area 114(a), after correcting for the localized brightness of the image being projected by mobile device 102(a) itself in projection area 114(a).
  • Collaboration logic 126 of mobile device 102(a) can communicate with the neighboring devices, including neighboring device 102(b), to define a new projection area for each device in the combined seamless user interface. Any one of the devices can detect an overlapping portion of its projection area with a projection area of another mobile device and calculate the cross points. In addition, any one of the devices can transmit this information to a neighboring device, or receive this information from a neighboring device. Once the cross points of the projection areas are determined, each mobile device can graphically stitch its projection area with the projection areas of the neighboring mobile devices to create a combined user interface.
  • The mobile devices communicate with each other to coordinate clipping at least part of the detected overlapping portions from the projection areas and stitching the remaining projection areas.
  • Mobile device 102(b) clips an overlapping portion defined by (C1, A, C2) from its projection area, leaving a remaining projection area of mobile device 102(b) defined by (P1, P2, P3, C2, C1).
  • Similarly, an overlapping portion defined by the polygon (E, F, C3, C4) is clipped from projection area 114(c), leaving a remaining projection area defined by (C3, C4, P5, P6).
  • Mobile device 102(a) clips an overlapping portion defined by (C1, B, C2) and another overlapping portion defined by (C3, C4, C, D) from its projection area 114(a), leaving a remaining projection area (P4, C1, C2, C4, C3).
  • Some portions of the projection areas, such as the area defined by (F, G, C3), are also clipped from projection area 114(c) even though they do not overlap another projection area, in order to preserve a uniform remaining projection area.
  • Otherwise, the content displayed in projection area 114(c) would be intersected by the remaining projection area of a neighboring device into multiple portions: a polygon defined by (C3, C4, P5, P6) and a polygon defined by (F, G, C3).
  • The mobile devices communicate with each other regarding the clipped portions and the remaining projection areas, as well as any calculated cross points.
  • Each mobile device (e.g., mobile device 102(b)) stitches its remaining projection area with those of its neighbors, and a single display combining the three projection areas 114(a), 114(b), and 114(c) is thus created, defined by (P1, P2, P3, C2, C4, P5, P6, C3, P4, C1).
  • The devices then communicate in real time to create a combined seamless user interface combining projection areas 114(a), 114(b), and 114(c).
  • The combined user interface occupies at least a portion of the single display. In the example of FIG. 2, the combined seamless user interface fully occupies the single display.
  • The shapes of the remaining projection areas, and of the combined seamless user interface in FIG. 2 (the single display represented by the polygon (P1, P2, P3, C2, C4, P5, P6, C3, P4, C1)), are irregular and possibly distorted because of the angle of surface 112 relative to the projectors of the various devices.
  • Algorithms within the collaboration logic 126 of each device calculate the shape of each of the clipped or remaining projection areas and map the rectangular coordinates of each projection system to the distorted and possibly non-rectangular coordinates of the combined user interface.
  • The collaboration logic of a mobile device can also be configured to perform automatic recalibration.
  • If a neighboring device such as mobile device 102(b) is moved, mobile device 102(a) can discover this through its input sensor, and the collaboration procedure starts again to generate a new display combining the new projection area of mobile device 102(b) with the other projection areas.
  • Each mobile device routinely checks the integrity of the combined seamless user interface. When one projection area is lost, the interconnection procedure starts again and a new connection is generated. To improve performance, it is not necessary to rebuild all of the connections when there are more than three devices; only the mobile devices near the lost projection area are involved.
  • When the mobile devices have projection systems capable of panning and zooming, collaboration logic 126 of mobile device 102(a) can communicate with mobile device 102(b) to adjust the size or location of projection area 114(a) and/or projection area 114(b) to generate a seamless single display combining the projection areas of all of the mobile devices.
  • For example, mobile device 102(a) can act as a master device that controls projector 106(b) of mobile device 102(b) after the interconnection.
  • Mobile device 102(b) in this example acts as a slave device that follows instructions from the master device 102(a) to adjust the size or shape of its projection area 114(b).
  • If there is a gap or an overlap between the projection areas, the master device 102(a) can detect it through its input sensor and control its own projector 106(a) and/or projector 106(b) of mobile device 102(b) to adjust the locations of projection areas 114(a) and 114(b) to remove the gap or overlap.
  • The master device can also authorize one or more other mobile devices to control certain projection areas that the master device cannot directly detect with its own sensing techniques, such as its camera.
  • If the combined seamless user interface cannot be created because one mobile device is located too far away from the other mobile devices, that device can make signals, or be controlled by the master device to make signals, such as a warning sound, to remind the user to move it closer to the other mobile devices so that the seamless single display can be achieved.
  • Users of the multiple mobile devices can use the combined seamless user interface to display content, such as a movie or a slide show, or to enter a group discussion mode and interact with each other on the combined seamless user interface.
  • For example, a mobile device can use the combined seamless interface to display content such as a movie, video, document, drawing, picture, or similar accessible resource.
  • In one example, a projector of the mobile device (e.g., projector 106(a)) projects a portion of the content on the device's own remaining projection area, and mobile device 102(a) acts as a master device to collaborate with the neighboring devices, such as mobile device 102(b).
  • Mobile device 102(a) transmits the content to the neighboring devices and controls them to project portions of the content on their respective clipped or remaining projection areas.
  • Collaboration logic 126 of mobile device 102(a) collaborates with the neighboring devices to ensure that the combination of the content portions displayed at each remaining projection area presents a complete representation of the content.
  • Users of the multiple mobile devices can also use the combined seamless user interface to interact and collaborate with each other.
  • FIG. 3 shows one collaboration example using a combined user interface 128.
  • The collaboration logic 126 visually represents computing device resources of the respective computing devices on their corresponding user interface areas.
  • Files or other resources can be transferred between neighboring devices by dragging their visual representations to different areas of the combined user interface 128.
  • The files or other resources can comprise documents, projects, pictures, videos, shortcuts, or any other resources that can be represented graphically on a user display.
  • A graphical representation 302 represents a resource, such as an object containing contact information for a person.
  • The graphical representation is shown as initially being in projection area 114(a), the home area corresponding to mobile device 102(a).
  • The user of mobile device 102(a) can move graphical representation 302 from projection area 114(a), considered the home area in this example, to the neighboring projection area 114(c) by “dragging” it with a finger or stylus.
  • Collaboration logic within the two devices 102(a) and 102(b) then causes the resource represented by graphical representation 302 to be moved or copied from mobile device 102(a) to neighboring device 102(b).
  • The physical screen 104(a) of mobile device 102(a) can be used in conjunction with this technique.
  • For example, private resources can be displayed on physical screen 104(a).
  • The resource can then be dragged or moved onto combined user interface 128, where it becomes visible and accessible to other users.
  • FIG. 4 illustrates another collaboration example in which users of devices 102(a) and 102(b) enter a discussion mode to do some cooperative work. For instance, the users can finish a painting together.
  • A drawing canvas is presented on combined user interface 128, and one or more users can interact with the user interface to add features to the drawing canvas.
  • Each user can paint directly on the combined seamless user interface.
  • In this example, users have added features 402 and 404.
  • Features can span projection areas 114(a) and 114(b), as illustrated by feature 404.
  • Features or objects can also be moved between the projection areas that form the combined user interface. In one embodiment, any changes to the canvas are saved on all devices that are participating in the collaboration.
  • A common canvas or textual input area can also be used to record the events of a meeting.
  • In one example, one mobile device is selected or designated by the users to keep meeting notes of the multiple users' inputs on the combined seamless user interface.
  • The meeting notes can also track a history of the users' inputs (which may or may not have been removed from the combined seamless user interface during the discussion), including which user added a particular input at a particular time.
  • Any user can also add a to-do list, and even the next meeting's time and location, on the combined seamless user interface to be included in the meeting notes.
  • In one embodiment, the meeting notes are saved on all or selected ones of the devices participating in the meeting or collaboration.
  • Alternatively, the meeting record can be saved by one device and automatically emailed to the others.
  • A user can exit a group collaboration simply by moving their mobile device away so that its projection area is separate from the combined seamless user interface. That device then drops out of the shared collaboration and projection sharing. Alternatively, a user can exit by explicit command, or by pressing a dedicated button on the mobile device.
  • FIG. 5 illustrates how neighboring devices might automatically recognize each other for purposes of automatic coupling.
  • Mobile device 102(a) uses its camera to inspect the projection area of a neighboring device. During this inspection, it searches for any identifying information that might be projected by the neighboring device.
  • In this example, each device projects its own identifying information within its projection area.
  • The information can be in a machine-readable format such as a barcode, examples of which are shown in FIG. 5 as barcode 502 and barcode 504.
  • The identifying information can include whatever parameters might be needed for connecting to the device.
  • For example, the information can include a device address, an authentication code, a user's name, and/or other information.
  • Once identified and connected in this way, the multiple mobile devices collaborate to generate a combined seamless user interface.
  • FIG. 6 shows an exemplary procedure 600 of collaboration between first and second mobile computing devices. (A high-level sketch of this flow appears immediately after this list.)
  • Procedure 600 is described in the context of the physical environment of FIG. 1, in which first and second computing devices 102(a) and 102(b) have user interfaces that are projected in respective projection areas 114(a) and 114(b) on a common projection surface 112.
  • The procedure is described as being performed by first mobile device 102(a), although in many embodiments it will be performed concurrently by both device 102(a) and device 102(b), and possibly by one or more additional devices having similar capabilities.
  • An action 602 comprises projecting a user interface or image in projection area 114(a) on surface 112.
  • An action 604 comprises illuminating projection area 114(a) with infrared light, using illuminator 108(a), which is fixed within computing device 102(a).
  • An action 606 comprises optically monitoring projection area 114(a) with camera 110(a), both to detect overlapping parts of projection area 114(a) and to detect user interactions within projection area 114(a).
  • An action 608 comprises detecting or identifying a neighboring projection device. As discussed above, this can be accomplished automatically or by requesting user input. In some embodiments, this comprises capturing identifying information (such as a barcode) projected by neighboring device 102(b). An action 610 comprises establishing communications with neighboring device 102(b) and communicating with neighboring device 102(b) based on the captured identifying information.
  • An action 612 comprises detecting an overlapping portion of the projection area 114(a) of first computing device 102(a) that overlaps with the projection area 114(b) of second mobile device 102(b). This can be done by sensing or evaluating the brightness of the infrared illumination or of the projected user interface within projection area 114(a) of computing device 102(a). In the embodiment described herein, it is performed by detecting overlapping areas of infrared illumination.
  • An action 614 comprises communicating with second mobile device 102(b) and with other neighboring devices to coordinate clipping and stitching of their respective projection areas. This action involves communicating and negotiating the various coordinates that define the overlaps, cross points, and clipping areas of each projection area.
  • An action 616 comprises clipping at least part of the detected overlapping portion from the projection area 114(a) of the first mobile computing device 102(a) to leave a remaining projection area.
  • An action 618 comprises graphically stitching the remaining projection area of the first mobile device with the remaining projection areas of neighboring devices, such as device 102(b).
  • An action 620 comprises coordinating with the neighboring devices to create the seamless user interface 128 that includes at least portions of the projection areas 114(a) and 114(b) of first and second computing devices 102(a) and 102(b) and any other neighboring devices.
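As referenced above, the overall flow of actions 602 through 620 can be outlined in code. The sketch below is purely illustrative: every method it calls is a hypothetical placeholder standing in for the corresponding action, not an API defined by the patent.

```python
def run_combined_surface_procedure(device):
    """Illustrative outline of procedure 600 (FIG. 6) for one participating device.

    `device` and every method called on it are hypothetical placeholders for
    the actions described above; they are not APIs defined by the patent.
    """
    device.project_user_interface()              # action 602: project the UI onto the surface
    device.illuminate_input_area()               # action 604: flood the input area with IR light

    while device.collaborating:
        frame = device.capture_frame()           # action 606: optically monitor the projection area

        neighbor = device.identify_neighbor(frame)       # action 608: e.g., read a projected barcode
        if neighbor is not None and not device.is_coupled(neighbor):
            device.establish_communications(neighbor)    # action 610: connect using the captured identity

        overlap = device.detect_overlap(frame)           # action 612: find the overly bright region
        if overlap is not None:
            plan = device.negotiate_clipping(overlap)    # action 614: agree on cross points and clipping
            device.clip_projection(plan)                 # action 616: clip own projection area
            device.stitch_with_neighbors(plan)           # actions 618-620: stitch remaining areas into
                                                         # the combined seamless user interface 128
```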

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Techniques for utilizing two or more mobile devices equipped with projectors to generate a combined seamless user interface by stitching together the projection areas generated by the projectors.

Description

BACKGROUND
Hand-held mobile devices such as mobile phones have become small and powerful, and they continue to develop at a rapid pace. Pocket-sized, hand-held mobile devices now have the computing power to do many things that previously required large personal computers. However, small screen sizes and input methods of hand-held mobile devices are still challenging to users, and detract from the user experience. Users desire larger display screens to display more information and to interact more easily with both the user's device itself and with other devices that may be nearby.
SUMMARY
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The term “techniques,” for instance, may refer to device(s), system(s), method(s) and/or computer-readable instructions as permitted by the context above and throughout the document.
The Detailed Description describes a mobile device that uses a projector to illuminate a projection area external to the mobile device, and uses the projection area as a user interface. The user interface can function as a touch screen or multi-touch screen and a user's interaction with the projection area can be captured by a camera of the mobile device. The mobile device can interconnect and collaborate with one or more neighboring mobile devices to stitch their projection areas and to thereby create a combined seamless user interface that utilizes the combined projection areas of the mobile device and its neighboring devices. Users of the multiple mobile devices can interact with each other on the combined seamless user interface.
BRIEF DESCRIPTION OF THE DRAWINGS
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and components.
FIG. 1 is a block diagram showing an example of how multiple mobile devices can be used together to form a combined seamless user interface.
FIG. 2 is a block diagram showing an exemplary creation of a combined seamless user interface by combining multiple projection areas of multiple mobile devices.
FIG. 3 is a block diagram showing an exemplary resource sharing scenario among multiple mobile devices based on the combined seamless user interface.
FIG. 4 is a block diagram showing an exemplary cooperating work scenario among multiple mobile devices based on the combined seamless user interface.
FIG. 5 is a block diagram showing an exemplary automatic identification of neighboring devices for purposes of coupling.
FIG. 6 is a flowchart showing an exemplary procedure of stitching overlapping projection areas of multiple mobile devices.
DETAILED DESCRIPTION
This disclosure describes techniques for using a projector of a mobile device to project or illuminate a projection area external to the mobile device and to use the projection area as a display and user interface. The mobile device can use one or more sensing mechanisms, such as an infrared illuminator and camera, to detect a user's interaction with the projection area. For example, the infrared camera might be used to detect movement of a stylus or finger relative to the projection area, such as when a user touches or nearly touches the surface of the projection area. This allows the user interface to act as a “touch” or touch-type screen, where the user touches or nearly touches areas of the projection area to interact with the mobile device. Both single-touch and multi-touch inputs can be detected.
In addition, two or more mobile devices can cooperate to create a combined or integrated user interface by stitching projection areas of the two or more mobile devices into an integrated and seamless display. The combined user interface has a larger combined display area compared to the LCDs or other native displays of the individual mobile devices. Different users can use the combined seamless user interface to perform interactive operations such as exchanging data and working collaboratively on a common project, document, or other resource.
This brief introduction is provided for the reader's convenience and is not intended to limit the scope of the claims.
General Environment
FIG. 1 shows an example of how multiple mobile devices can be used together to form a combined user interface. This example includes two computing devices, designated by reference numerals 102(a) and 102(b) and referred to as first mobile device 102(a) and second or neighboring device 102(b). Although the example of FIG. 1 shows only two devices, the techniques described herein can also be used with more than two devices, to create a combined user interface using projection components of all such devices. One or more of the devices may also be non-mobile.
For purposes of this discussion, first mobile device 102(a) will be described in some detail. Second device 102(b) and any other neighboring devices are understood to have similar components and functionality, although they may differ significantly in some respects.
Generally, first mobile device 102(a) can be a mobile phone, a PDA, a mobile internet device, a netbook, a personal media player, a laptop, a hand-held mobile device, or any other portable, mobile, computing or communications device.
Mobile device 102(a) can be equipped with a physical screen 104(a), a projector 106(a), an illuminator 108(a), and one or more image or touch sensors such as a camera 110(a).
The physical screen 104(a) displays graphics to a user and can be used as part of a default or primary graphical user interface. Screen 104(a) can be touch-sensitive to accept input from a user. Alternatively or additionally, keys or buttons (not shown) can be utilized for user input and interaction. The size of the physical screen 104(a) will often be quite small, limited by the small size of the mobile device 102(a).
Projector 106(a) can take many forms, including that of a so-called “pico” projector, which is small in size and has modest power requirements. The projector 106(a) displays a user interface on a surface 112 external to mobile device 102(a). The projected user interface occupies or defines a projection area 114(a) on surface 112. The projected image in this embodiment can display a secondary graphical user interface occupying at least a portion of projection area 114(a). The user can physically interact with this secondary graphical user interface to control or interact with the mobile device 102(a). In the example of FIG. 1, the secondary graphical user interface fully occupies projection area 114(a). In many scenarios, the secondary user interface provided by projector 106(a) will be much larger than the primary user interface formed by physical screen 104(a).
Mobile device 102(a) can coordinate its physical screen 104(a) and its external projection area 114(a) in different ways. In one example, physical screen 104(a) and projection area 114(a) can show the same content. In another example, physical screen 104(a) only shows simple content, such as a reminder or a clock. When the user wants a large display to perform more complex or detailed operations, such as reading a document, surfing the internet, or composing an email, the user can display a relevant application on projection area 114(a) and interact with the application by pointing at or touching surface 112 within projection area 114(a).
Illuminator 108(a) and camera 110(a) are used in combination to sense user interaction with the projected user interface, together forming what will be referred to herein as an input sensor. For example, illuminator 108(a) can be an infrared emitter that illuminates projection area 114(a) with non-visible infrared light. More generally, illuminator 108(a) illuminates an input area 116(a) that is at least as large as projection area 114(a) and that encompasses projection area 114(a).
Camera 110(a) can be an infrared camera, sensitive to non-visible infrared light incident on input area 116(a). Camera 110(a) monitors the infrared illumination of the projection area to detect touch or touch-like interaction by a user with the displayed user interface. Furthermore, as will be described in more detail below, camera 110(a) detects portions of projection area 114(a) that overlap with projection areas of one or more neighboring computing devices.
There can be many different embodiments of mobile device 102(a). In one embodiment, projector 106(a), illuminator 108(a), and camera 110(a) are built into the mobile device 102(a), as shown in FIG. 1. One or more of projector 106(a), illuminator 108(a), and camera 110(a) can also be modularly integrated with each other. For example, illuminator 108(a) and camera 110(a) can be integrated as a single unit or module within mobile device 102(a) or connected to mobile device 102(a).
Input area 116(a) and projection area 114(a) may or may not be exactly the same as each other. In the example of FIG. 1, input area 116(a) is larger than and includes projection area 114(a) in order to detect user interaction across the entire projection area 114(a). In the embodiment described here, projector 106(a), illuminator 108(a), and camera 110(a) are preconfigured and mounted relative to each other so that when mobile device 102(a) is placed upright on surface 112, input area 116(a) and projection area 114(a) are properly focused and sized on surface 112, and properly aligned with each other as shown.
In the example of FIG. 1, the secondary user interface displayed by the projector 106(a) acts as a touch-sensitive display; the input sensor consisting of illuminator 108(a) and camera 110(a) is able to detect when and where a user touches surface 112 within input area 116(a). In the illustrated example, the camera 110(a) senses the infrared illumination from illuminator 108(a). User interactions relative to surface 112 cause shadows in the IR illumination, which mobile device 102(a) interprets to determine placement of fingers or styli. In other embodiments, camera 110(a) may be sensitive to visible or projected light to optically capture the user's interactions within the input area 116(a).
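To make the sensing step above concrete, the following sketch shows one way the shadow interpretation could be implemented. It is a hypothetical illustration only, not the patent's implementation: the function name, thresholds, and the use of NumPy/SciPy are assumptions, and the patent itself does not specify an algorithm.

```python
import numpy as np
from scipy import ndimage


def find_touch_points(ir_frame, baseline, shadow_threshold=40.0, min_blob_pixels=25):
    """Return (row, col) centroids of shadow blobs that look like touches.

    ir_frame  -- current infrared image of the input area (2-D grayscale array).
    baseline  -- reference image of the illuminated surface with no user present.
    A finger or stylus at or near the surface blocks the IR illumination, so it
    shows up as a region noticeably darker than the baseline.
    """
    diff = baseline.astype(np.float32) - ir_frame.astype(np.float32)
    shadow_mask = diff > shadow_threshold          # pixels much darker than expected

    # Group adjacent shadow pixels into blobs; small blobs are treated as noise.
    labels, num_blobs = ndimage.label(shadow_mask)
    touch_points = []
    for blob_id in range(1, num_blobs + 1):
        blob = labels == blob_id
        if blob.sum() >= min_blob_pixels:
            touch_points.append(ndimage.center_of_mass(blob))
    return touch_points
```

Each sufficiently large shadow blob is reported as its own contact, which is how both single-touch and multi-touch input could be recovered before mapping camera coordinates back into projection-area coordinates.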
Block 118 shows internal or logical components of first mobile device 102(a). Second mobile device 102(b) has similar components and functionality. The components of mobile devices 102(a) and 102(b) include one or more processors 120, a communication system 122, and memory 124.
Generally, memory 124 contains computer-readable instructions that are accessible and executable by processor 120. Memory 124 may comprise a variety of computer-readable storage media. Such media can be any available media, including both volatile and non-volatile storage media, removable and non-removable media, local media, remote media, optical memory, magnetic memory, electronic memory, etc.
Any number of program modules can be stored in the memory, including, by way of example, an operating system, one or more applications, other program modules, and program data. Each of such program modules and program data (or some combination thereof) may implement all or part of the resident components that support the techniques described herein.
Communication system 122 is configured to allow the first mobile computing device 102(a) to communicate with one or more neighboring computing devices. The communication system 122 can use wired or wireless techniques for communication. The neighboring computing devices can be other mobile devices or any other computing devices, such as digital cameras or cell phones. In order to produce a combined user interface, neighboring devices are logically coupled with each other or otherwise connected with each other for communication, collaboration, and display coordination. Devices can be coupled either automatically, in response to physical proximity; or manually, in response to explicit user commands.
Different techniques can be used by the mobile device 102(a) to automatically couple physically adjacent devices. For example, mobile device 102(a) can use wireless or Bluetooth searching, ultrasonic techniques, or other techniques to sense the physical nearness of another device. Alternatively, mobile device 102(a) can use its camera 110(a) to detect the existence of a projection area of a neighboring device that overlaps projection area 114(a).
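The camera-based option in the preceding paragraph relies on noticing light that the device did not project itself. A minimal sketch of such a check, assuming a grayscale camera frame and a rendered estimate of the device's own output (both invented names, not from the patent), might look like this:

```python
import numpy as np


def detect_neighbor_overlap(camera_frame, expected_brightness, gain=1.5, min_overlap_pixels=500):
    """Return a boolean mask of pixels lit by a neighboring projector, or None.

    camera_frame        -- image of the device's own projection area.
    expected_brightness -- brightness the device expects at each pixel from the
                           image it is itself projecting (its localized output).
    Pixels far brighter than expected are assumed to be receiving additional
    light from an overlapping neighbor projection.
    """
    overlap_mask = camera_frame.astype(np.float32) > gain * expected_brightness.astype(np.float32)
    if overlap_mask.sum() < min_overlap_pixels:
        return None            # too small to be a neighboring projection area
    return overlap_mask
```

A brightness test of this kind, corrected for the device's own projected image, is also what the FIG. 2 discussion above relies on for locating overlapping portions and cross points.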
In one example, the user can manually build a connection or coupling between two devices such as mobile device 102(a) and neighboring device 102(b). The user can input information into mobile device 102(a) that identifies a neighboring device to be connected. To obtain this information from the user, the projector 106(a) might project a user input field (not shown in FIG. 1) on the projection area 114(a), along with a projected keyboard and a prompt instructing the user to enter information about neighboring device 102(b). Such information, to be provided by the user, might include a device name, a user name/password, and/or the position of the neighboring device 102(b) relative to the mobile device 102(a). The user enters this information using the touch-sensitive or touch-like features implemented by mobile device 102(a) in conjunction with its projector 106(a), and mobile device 102(a) attempts to find neighboring device 102(b) according to the information entered by the user.
In another example, the connection among multiple mobile devices can be built automatically, without user action. For instance, mobile device 102(a) can detect or find neighboring device 102(b) by wireless searching. The wireless searching can use Wi-Fi, infrared, Bluetooth, or other wireless techniques.
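One hedged way to sketch such automatic discovery is a plain UDP broadcast on the local network, rather than any particular Wi-Fi or Bluetooth stack; the port number and message format below are assumptions of the sketch, not part of this description:

```python
import json
import socket

DISCOVERY_PORT = 50505   # hypothetical port, not specified in this description
HELLO_MESSAGE = {"type": "combined-surface-hello", "device": "102a"}

def broadcast_presence() -> None:
    """Announce this device to neighbors on the local network segment."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(json.dumps(HELLO_MESSAGE).encode(), ("<broadcast>", DISCOVERY_PORT))

def listen_for_neighbors(timeout: float = 2.0):
    """Collect hello messages from neighboring devices for a short window."""
    neighbors = []
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.bind(("", DISCOVERY_PORT))
        s.settimeout(timeout)
        try:
            while True:
                data, addr = s.recvfrom(1024)
                neighbors.append((addr[0], json.loads(data)))
        except socket.timeout:
            pass
    return neighbors
```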
As yet another example, camera 110(a) can be monitored to detect a projection area of another device, such as a projection area 114(b) of neighboring device 102(b). Having sensed the existence of a neighboring device 102(b) in this manner, mobile device 102(a) can then attempt to connect or couple with it.
A mobile device may request identity information of a neighboring device, or vice versa, for security purposes before the two mobile devices are interconnected or coupled for collaboration. A potential coupling can be allowed or denied based on the identity information. In one embodiment, mobile device 102(a) is configured with a “white list” of other devices with which coupling is allowed. This list may be configured by the user of mobile device 102(a). In another embodiment, one or more such white lists can be maintained by a third party to which mobile device 102(a) has access. Alternatively, the user of mobile device 102(a) might be asked for confirmation prior to implementing any potential coupling.
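A minimal sketch of such a white-list check follows; the device identifiers and the confirmation callback are hypothetical stand-ins for the identity information and user prompt described above:

```python
WHITE_LIST = {"device-102b", "device-102c"}   # hypothetical identifiers configured by the user

def allow_coupling(device_id: str, confirm_with_user=None) -> bool:
    """Permit coupling with a white-listed neighbor; otherwise defer to the user."""
    if device_id in WHITE_LIST:
        return True
    if confirm_with_user is not None:
        return bool(confirm_with_user(f"Couple with unknown device {device_id}?"))
    return False

# Example: with no confirmation hook, unknown devices are denied.
print(allow_coupling("device-102b"))   # True
print(allow_coupling("device-102x"))   # False
```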
Mobile device 102(a) also has collaboration logic 126, which in this example comprises computer-executable programs, routines, or instructions that are stored within memory 124 and are executed by processor 120. The collaboration logic 126 communicates with physically neighboring mobile devices using communication system 122, and coordinates with those devices to graphically stitch the projection area 114(a) of mobile computing device 102(a) with projection areas of one or more neighboring computing devices. For example, as shown in FIG. 1, collaboration logic 126 stitches projection area 114(a) with projection area 114(b) of neighboring mobile device 102(b). Furthermore, collaboration logic 126 creates a combined seamless user interface 128 utilizing a portion of projection area 114(a) of mobile computing device 102(a) and a portion of projection area 114(b) of neighboring device 102(b). The combined user interface 128 can be any shape or size within a boundary of both projection area 114(a) and projection area 114(b). The projection areas 114(a) and 114(b) are potentially clipped at their intersection so that the resulting clipped projection areas do not overlap each other, but are immediately adjacent to each other. This avoids an abnormally bright area in the combined user interface that would otherwise result from the overlapped illumination of the two projectors.
As shown in FIG. 1, combined user interface 128 can be larger than the user interface that any single device might be able to display. In some usage scenarios, the collaboration logic 126 might coordinate with other devices to create a single user interface that is primarily for interaction with the single mobile device 102(a), with other devices acting in a slave mode to expand the user interface of mobile device 102(a). In other usage scenarios, the combined user interface may allow concurrent interaction with all of the different devices, and interactions with the combined user interface may cause actions or results in one or more of the multiple devices.
The combined seamless user interface 128 allows one or more graphical representations 130 to span and move seamlessly between the projection areas of the mobile computing device and the neighboring computing devices. As an example, FIG. 1 shows a graphical representation 130 (in this case an icon) that spans projection area 114(a) and projection area 114(b). In some embodiments, graphical representation 130 can be seamlessly dragged between projection area 114(a) and projection area 114(b).
Graphical representation 130 can correspond to device resources such as files, shortcuts, programs, documents, etc., similar to “icons” used in many graphical operating systems to represent various resources. Graphical representation 130 might alternatively comprise a displayed resource, menu, window, pane, document, picture, or similar visual representation that concurrently spans the projection areas of mobile device 102(a) and neighboring device 102(b).
After mobile device 102(a) is interconnected or logically coupled with neighboring device 102(b), collaboration logic 126 of mobile device 102(a) can collaborate with mobile device 102(b) to create the combined seamless user interface 128. In the example of FIG. 1 where the projection areas 114(a) and 114(b) are overlapping, collaboration logic 126 of mobile device 102(a) communicates with mobile device 102(b) to negotiate how each device will clip portions of its projection area 114 to avoid overlap. In addition, collaboration logic 126 communicates coordinates allowing the remaining projection areas of the two devices to be graphically stitched to each other to create the appearance of a single projected image or user interface utilizing at least portions of the projection areas 114(a) and 114(b).
FIG. 2 shows an example of how the projection areas of three devices might be clipped and stitched to form a combined interface. FIG. 2 shows three overlapping projection areas: projection area 114(a) produced by mobile device 102(a) of FIG. 1, projection area 114(b) produced by neighboring device 102(b), and another projection area 114(c) that might be produced by another neighboring device having capabilities similar to those of mobile device 102(a). Although FIG. 2 only shows three overlapping projection areas, the techniques described herein can be used for creation of a combined seamless user interface by combining any number of overlapping projection areas.
In addition to the projection areas 114(a), 114(b), and 114(c), FIG. 2 shows input area 116(a). Although other input areas are not shown in FIG. 2, it should be understood that each projection device might define its own input area, corresponding to its projection area.
The projection area 114(a) is represented by a polygon defined by (P4, B, C, D). The projection area 114(b) is represented by a polygon defined by (P1, P2, P3, A). The projection area 114(c) is represented by a polygon defined by (E, F, P5, P6). All of the projection areas and input areas are located on surface 112.
As shown in FIG. 2, there is an overlapping portion between projection area 114(a) and projection area 114(b), forming a polygon defined by (C1, B, C2, A). Cross points between the two projection areas are C1 and C2. There is also an overlapping portion between projection area 114(a) and projection area 114(c), and cross points between the two projection areas are C3 and C4.
Because these overlapping portions and cross points are within input area 116(a), camera 110(a) of mobile device 102(a) can detect the overlapping portions and cross points. Specifically, mobile device 102(a) is configured to monitor its input area 116(a) with camera 110(a) and to detect any portions of projection area 114(a) that are significantly brighter than other parts of the projection area 114(a), after correcting for the localized brightness of the image being projected by mobile device 102(a) itself in projection area 114(a).
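A simplified sketch of that brightness test follows, assuming the camera frame and the device's own expected projection brightness have already been registered to common coordinates; the margin value is illustrative:

```python
import numpy as np

def detect_overlap_mask(camera_frame: np.ndarray,
                        expected_brightness: np.ndarray,
                        margin: float = 30.0) -> np.ndarray:
    """Flag pixels that are significantly brighter than the device's own
    projection; such pixels also receive light from a neighboring projector."""
    excess = camera_frame.astype(np.float32) - expected_brightness.astype(np.float32)
    return excess > margin   # boolean mask covering the overlapping portion

# Example with synthetic 4x4 frames: the bottom-right corner is over-lit.
own = np.full((4, 4), 100, dtype=np.uint8)
seen = own.copy()
seen[2:, 2:] = 180
print(detect_overlap_mask(seen, own))
```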
Collaboration logic 126 of mobile device 102(a) can communicate with the neighboring devices, including neighboring device 102(b), to define a new projection area of each device in the combined seamless user interface. Any one of the devices can detect an overlapping portion of its projection area with a projection area of another mobile device and calculate the cross points. In addition, any one of the devices can transmit this information to a neighboring device, or receive this information from a neighboring device. Once the cross points of the projection areas are determined, each mobile device can graphically stitch its projection area with the projection areas of the neighboring mobile devices to create a combined user interface.
The mobile devices communicate with each other to coordinate clipping at least part of the detected overlapping portions from the projection areas and stitching of the remaining projection areas. In this example, mobile device 102(b) clips an overlapping portion defined by (C1, A, C2) from its projection area to leave a remaining projection area of the mobile device 102(b), which is defined by (P1, P2, P3, C2, C1). By the same techniques, an overlapping portion defined by polygon (E, F, C3, C4) is clipped from projection area 114(c) to leave a remaining projection area defined by (C3, C4, P5, P6). Mobile device 102(a) clips an overlapping portion defined by (C1, B, C2) and another overlapping portion defined by (C3, C4, C, D) from its projection area 114(a) to leave a remaining projection area (P4, C1, C2, C4, C3). As shown in FIG. 2, some portions of the projection areas, such as an area defined by (F, G, C3), are also clipped from projection area 114(c) even though they do not overlap another projection area, in order to preserve a uniform remaining projection area. Otherwise, content displayed at projection area 114(c) would be divided by the remaining projection area of mobile device 102(a) into multiple portions: a polygon defined by (C3, C4, P5, P6) and a polygon defined by (F, G, C3).
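The clipping geometry above can be sketched with a general-purpose polygon library; the coordinates below are placeholders standing in for the corner points of FIG. 2, and the use of the third-party shapely package is an assumption of the sketch:

```python
from shapely.geometry import Polygon

# Placeholder coordinates standing in for the corners of FIG. 2.
area_a = Polygon([(0, 0), (10, 0), (10, 8), (0, 8)])   # projection area 114(a)
area_b = Polygon([(8, 2), (18, 2), (18, 9), (8, 9)])   # projection area 114(b)

overlap = area_a.intersection(area_b)        # the doubly lit region to be clipped
remaining_b = area_b.difference(area_a)      # 114(b) with the overlap removed

# Cross points are where the two boundaries intersect (analogues of C1 and C2).
cross_points = area_a.exterior.intersection(area_b.exterior)

print(list(remaining_b.exterior.coords))
print(cross_points)
```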
The mobile devices communicate with each other regarding the clipped portions and the remaining projection areas, as well as any calculated cross points. Each mobile device (e.g., mobile device 102(b)) projects graphic representations only in its remaining projection area, thereby avoiding overlap.
A single display combining the three projection areas 114(a), 114(b), and 114(c) is thus created, defined by (P1, P2, P3, C2, C4, P5, P6, C3, P4, C1).
The devices then communicate in real time to create a combined seamless user interface combining projection areas 114(a), 114(b), and 114(c). The combined user interface occupies at least a portion of the single display. In the example of FIG. 2, the combined seamless user interface fully occupies the single display.
The shapes of the remaining projection areas, and of the combined seamless user interface in FIG. 2 (the single display represented by the polygon defined by (P1, P2, P3, C2, C4, P5, P6, C3, P4, C1)), are irregular, and possibly distorted because of the angle of surface 112 relative to the projectors of the various devices. Algorithms within the collaboration logic 126 of each device calculate the shape of each of the clipped or remaining projection areas and map the rectangular coordinates of each projection system to the distorted and possibly non-rectangular coordinates of the combined user interface.
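One conventional way to compute such a mapping is a planar homography between a projector's rectangular framebuffer corners and their observed positions on surface 112; the resolution and corner coordinates below are illustrative values assumed for this sketch:

```python
import cv2
import numpy as np

# Rectangular framebuffer corners of a projector (assumed 800x600 output).
src = np.float32([[0, 0], [800, 0], [800, 600], [0, 600]])
# Where those corners land on surface 112 as observed by the camera
# (an illustrative, keystone-distorted quadrilateral).
dst = np.float32([[12, 20], [780, 5], [810, 590], [30, 615]])

H = cv2.getPerspectiveTransform(src, dst)

# Map a point from projector coordinates into combined-display coordinates.
point = np.float32([[[400, 300]]])
print(cv2.perspectiveTransform(point, H))
```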
The collaboration logic of a mobile device (e.g., collaboration logic 126) can also be configured to perform automatic recalibration. In one example, if the user moves mobile device 102(b) away from mobile device 102(a), mobile device 102(a) can discover this through its input sensor, and the collaboration procedure starts again to generate a new display combining the new projection area of mobile device 102(b) with the other projection areas. In another example, each mobile device routinely checks the integrity of the combined seamless user interface. When one projection area is lost, the interconnection procedure starts again and a new connection is generated. To improve performance, it is not necessary to rebuild all of the connections when there are more than three devices; only the mobile devices near the lost projection area are involved.
Although the projection areas are overlapping in FIG. 1 and FIG. 2, such overlapping is not necessary for the creation of the combined seamless user interface. In one embodiment, the mobile devices have a projection system that is capable of panning and zooming. Once mobile devices 102(a) and 102(b) are logically coupled, collaboration logic 126 of mobile device 102(a) can communicate with mobile device 102(b) to adjust a size or location of projection area 114(a) and/or projection area 114(b) to generate a seamless single display by combining the projection areas of all of the mobile devices. For example, mobile device 102(a) can act as a master device that controls projector 106(b) of mobile device 102(b) after the interconnection. Mobile device 102(b) in this example acts as a slave device that follows instructions from the master device 102(a) to adjust the size or shape of its projection area 114(b). Once there is a gap or an overlap between two or more projection areas, the master device 102(a) can detect it through its input sensor and control its own projector 106(a) and/or projector 106(b) of mobile device 102(b) to adjust the locations of projection areas 114(a) and 114(b) to remove the gap or the overlap. If there are three or more projection areas to be combined, the master device can also authorize one or more other mobile devices to control certain projection areas that the master device cannot directly detect with its own sensing techniques, such as a camera. In the case that the combined seamless user interface cannot be created because one mobile device is located too far away from the other mobile devices, that mobile device can emit a signal, such as a warning sound, on its own or under the control of the master device, to remind the user to move it closer to the other mobile devices so that the seamless single display can be achieved.
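A sketch of how the master device might instruct a slave projector to pan or zoom appears below; the port, message format, and adjustment units are assumptions of this sketch rather than any protocol defined here:

```python
import json
import socket

ADJUST_PORT = 50506   # hypothetical control port on the slave device

def send_adjust_command(slave_addr: str, pan_dx: float, pan_dy: float, zoom: float) -> None:
    """Ask a slave projector to shift and scale its projection area."""
    command = {"type": "adjust-projection", "pan": [pan_dx, pan_dy], "zoom": zoom}
    with socket.create_connection((slave_addr, ADJUST_PORT), timeout=1.0) as conn:
        conn.sendall(json.dumps(command).encode())

# Example: nudge projection area 114(b) to the left and shrink it slightly.
# send_adjust_command("192.168.1.42", -15.0, 0.0, 0.95)
```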
After the interconnection among multiple mobile devices is established and the combined seamless user interface is created, users of the multiple mobile devices can use the combined seamless user interface to display content, such as a movie or a slide show, or enter into a group discussion mode to interact with each other on the combined seamless user interface.
In one embodiment, a mobile device can use the combined seamless interface to display content such as a movie, video, document, drawing, picture, or similar accessible resource. For example, in FIG. 2, because a projector of the mobile device (e.g., projector 106(a)) can only project content such as a picture onto the remaining projection area (P4, C1, C2, C4, C3) of its own projection area 114(a), mobile device 102(a) can act as a master device to collaborate with the neighboring devices such as mobile device 102(b). In this situation, mobile device 102(a) transmits the content to the neighboring devices and controls them to project a portion of the content on their respective clipped or remaining projection areas. Collaboration logic 126 of mobile device 102(a) collaborates with the neighboring devices to ensure that a combination of the portions of the content displayed at each remaining projection area presents a complete representation of the content.
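A sketch of how the master device might carve the shared content into per-device portions follows, assuming rectangular remaining areas expressed in combined-display coordinates (irregular areas would additionally need masking, which is omitted); the Pillow imaging library is an assumption of the sketch:

```python
from PIL import Image

def crop_for_device(content: Image.Image, remaining_area, display_area) -> Image.Image:
    """Return the slice of the shared content a device should project.

    Both areas are (left, top, right, bottom) boxes in combined-display coordinates.
    """
    dl, dt, dr, db = display_area
    rl, rt, rr, rb = remaining_area
    width, height = content.size
    sx, sy = width / (dr - dl), height / (db - dt)   # display -> image scale
    box = (int((rl - dl) * sx), int((rt - dt) * sy),
           int((rr - dl) * sx), int((rb - dt) * sy))
    return content.crop(box)

# Example: a neighboring device covers the right half of a 2000x800 display.
picture = Image.new("RGB", (1000, 400), "gray")
portion = crop_for_device(picture, remaining_area=(1000, 0, 2000, 800),
                          display_area=(0, 0, 2000, 800))
print(portion.size)   # (500, 400)
```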
In another embodiment, users of the multiple mobile devices can use the combined seamless user interface to interact with each other and collaborate with each other. Several exemplary scenarios of such group collaboration are described in the following section.
Collaboration Examples
FIG. 3 shows one collaboration example using a combined user interface 128. In this example, the collaboration logic 126 visually represents computing device resources of the respective computing devices on their corresponding user interface areas. Files or other resources can be transferred between neighboring devices by dragging their visual representations to different areas of the combined user interface 128. The files or other resources can comprise documents, projects, pictures, videos, shortcuts, or any other resources that can be represented graphically on a user display.
In the example of FIG. 3, a graphical representation 302 represents a resource, such as an object containing contact information for a person. The graphical representation is shown as initially being in projection area 114(a), or a home area corresponding to mobile device 102(a). The user of the mobile device 102(a) can move graphical representation 302 from the projection area 114(a), considered the home area in this example, to the neighboring projection area 114(b) by “dragging” it with a finger or stylus. When graphical representation 302 crosses from projection area 114(a) to projection area 114(b), collaboration logic within the two devices 102(a) and 102(b) causes the resource represented by graphical representation 302 to be moved or copied from mobile device 102(a) to neighboring device 102(b).
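A sketch of the transfer trigger follows: once the dragged icon's position enters the neighboring device's area, the underlying resource is sent across the previously established connection. The Rect helper and send callback are hypothetical stand-ins for device functionality, not an API defined here:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def on_drag_update(x, y, neighbor_area: Rect, drag_state: dict, send_resource) -> None:
    """Copy the dragged resource the first time its icon enters the neighboring area."""
    if neighbor_area.contains(x, y) and not drag_state["transferred"]:
        send_resource(drag_state["resource"])   # transfer over communication system 122
        drag_state["transferred"] = True

# Example: an icon dragged from area 114(a) into area 114(b).
neighbor = Rect(500, 0, 1000, 600)
state = {"resource": "contact_card.vcf", "transferred": False}
on_drag_update(720, 180, neighbor, state, send_resource=lambda r: print("sending", r))
```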
The physical screen 104(a) of mobile device 102(a) can be used in conjunction with this technique. For example, private resources can be displayed on physical screen 104(a). When a user desires to share a resource, the resource can be dragged or moved onto combined user interface 128, where it becomes visible and accessible to other users.
FIG. 4 illustrates another collaboration example in which users of devices 102(a) and 102(b) enter into a discussion mode to do cooperative work. For instance, the users can finish a painting together.
In this example, a drawing canvas is presented on combined user interface 128 and one or more users can interact with the user interface to add features to the drawing canvas. Each user can directly paint on the combined seamless user interface. In the illustrated example, users have added features 402 and 404. Features can span projection areas 114(a) and 114(b), as illustrated by feature 404. Features or objects can also be moved between projection areas that form the combined user interface. In one embodiment, any changes to the canvas are saved in all devices that are participating in the collaboration.
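A sketch of how a freshly drawn stroke might be shared so every coupled device renders and saves the same canvas; the peer list and send callback abstract the communication system and are assumptions of this sketch:

```python
import json

def broadcast_stroke(points, color, peers, send) -> None:
    """Send one stroke (a list of (x, y) points) to every coupled device."""
    message = json.dumps({"type": "stroke", "points": points, "color": color})
    for peer in peers:
        send(peer, message)

# Example with a stand-in transport that just prints the outgoing message.
broadcast_stroke([(10, 10), (40, 25), (90, 30)], "#ff0000",
                 peers=["device-102b"], send=lambda p, m: print(p, m))
```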
Similarly, a common canvas or textual input area can be used to record the events of a meeting. In this example, one mobile device is selected or designated by the users to keep meeting notes of the multiple users' inputs on the combined seamless user interface. In addition to the final inputs on the combined seamless user interface, the meeting notes can also track a history of the users' inputs (including inputs that may be removed from the combined seamless user interface during the discussion), along with information about which user added a particular input at a particular time, and so forth.
Any user can also add a to-do list, and even the next meeting's time and location, on the combined seamless user interface to be included in the meeting notes. When the meeting is over, the meeting notes are saved on all or selected ones of the devices participating in the meeting or collaboration. Alternatively, the meeting record can be saved by one device and automatically emailed to the others.
A user can exit a group collaboration simply by removing their mobile device so that its projection area is separate from the combined seamless user interface. This device then drops out of the shared collaboration and projection sharing. Alternatively, a user can exit by explicit command, or by a dedicated button on the mobile device.
Optical Neighbor Recognition
FIG. 5 illustrates how neighboring devices might automatically recognize each other for purposes of automatic coupling. In this example, after mobile device 102(a) has determined or detected the presence of a neighboring device by optically sensing one or more overlapping projection areas, it uses its camera to inspect the projection area of the neighboring device. During this inspection, it searches for any identifying information that might be projected by the neighboring device. When attempting to connect to an adjacent device, each device projects its own identifying information within its projection area. The information can be in a machine-readable format such as a barcode, examples of which are shown in FIG. 5 as barcode 502 and barcode 504. The identifying information can include whatever parameters might be needed for connecting to the device. Specifically, the information can include a device address, an authentication code, a user's name, and/or other information.
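A sketch of encoding and decoding such identifying information as a two-dimensional barcode, using the third-party qrcode and pyzbar packages; their availability, and the field names shown, are assumptions of this sketch:

```python
import json

import qrcode                       # third-party: generates QR codes
from PIL import Image
from pyzbar.pyzbar import decode    # third-party: reads barcodes from images

# Identifying information a neighboring device might project within its area.
identity = {"device": "102b", "address": "192.168.1.42", "auth": "abc123"}

# The neighboring device encodes its connection parameters as a barcode image.
badge = qrcode.make(json.dumps(identity))
badge.save("identity_badge.png")

# The observing device decodes the barcode from a captured frame
# (here the saved image stands in for a frame from camera 110(a)).
frame = Image.open("identity_badge.png")
decoded = json.loads(decode(frame)[0].data)
print("connect to", decoded["address"])
```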
After the multiple mobile devices are interconnected, the multiple mobile devices collaborate to generate a combined seamless user interface.
Exemplary Procedural Details
FIG. 6 shows an exemplary procedure 600 of collaborating between first and second mobile computing devices. Procedure 600 is described in the context of the physical environment of FIG. 1, in which first and second computing devices 102(a) and 102(b) have user interfaces that are projected in respective projection areas 114(a) and 114(b) on a common projection surface 112. The procedure is described as being performed by first mobile device 102(a), although in many embodiments it will be performed concurrently by both device 102(a) and device 102(b), and possibly by one or more additional devices having similar capabilities.
An action 602 comprises projecting a user interface or image in projection area 114(a) on surface 112. An action 604 comprises illuminating projection area 114(a) with infrared light, using illuminator 108(a) that is fixed within computing device 102(a). An action 606 comprises optically monitoring projection area 114(a), by monitoring camera 110(a), to detect overlapping parts of projection area 114(a) and to detect user interactions within it.
An action 608 comprises detecting or identifying a neighboring projection device. As discussed above, this can be accomplished automatically or by requesting user input. In some embodiments, this comprises capturing identifying information (such as a barcode) projected by neighboring device 102(b). An action 610 comprises establishing communications with neighboring device 102(b) and communicating with neighboring device 102(b) based on the captured identifying information.
An action 612 comprises detecting an overlapping portion of the projection area 114(a) of first computing device 102(a) that overlaps with the projection area 114(b) of second mobile device 102(b). This can be done by sensing or evaluating the brightness of the infrared illumination or of the projected user interface within projection area 114(a) of computing device 102(a). In the embodiment described herein, it is performed by detecting overlapping areas of infrared illumination.
An action 614 comprises communicating with second mobile device 102(b) and with other neighboring devices to coordinate clipping and stitching of their respective projection areas. This action involves communicating and negotiating various coordinates that define overlaps, cross points, and the clipping areas of each projection area.
An action 616 comprises clipping at least part of the detected overlapping portion from the projection area 114(a) of the first mobile computing device 102(a) to leave a remaining projection area.
An action 618 comprises graphically stitching the remaining projection area of the first mobile device with the remaining projection areas of neighboring devices, such as device 102(b). An action 620 comprises coordinating with the neighboring devices to create seamless user interface 128 that includes at least portions of the projection areas 114(a) and 114(b) of first and second computing devices 102(a) and 102(b) and any other neighboring devices.
Conclusion
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the claims.

Claims (20)

We claim:
1. A mobile computing device comprising:
a processor;
a first projector that projects to a first projection area on a surface external to the mobile computing device, wherein the first projector projects a user interface within the first projection area;
an illuminator that illuminates an input area with non-visible light, the input area including at least the first projection area projected by the first projector and first portions of projection areas projected from projectors of one or more neighboring computing devices, wherein the first projector is distinct from the projectors of the one or more neighboring computing devices;
a camera that detects portions of the first projection area that overlap with second portions of the projection areas on the surface projected from the projectors of the one or more neighboring computing devices and monitors the non-visible illumination of the input area to detect touch-type interaction with the first projection area projected by the first projector and the first portions of the projection areas projected from the projectors of the one or more neighboring computing devices;
a communication system allowing the mobile computing device to communicate with the one or more neighboring computing devices; and
collaboration logic executed by the processor that interacts with the one or more neighboring computing devices through the communication system to (a) graphically stitch the first projection area of the mobile computing device with the projection areas projected from the projectors of the one or more neighboring computing devices based on the detection of the portions of the first projection area that overlap with the second portions of the projection areas projected from the projectors of the one or more neighboring computing devices and (b) create a combined seamless user interface utilizing the first projection area of the mobile computing device and the projection areas projected from the projectors of the one or more neighboring computing devices;
wherein the combined seamless user interface allows graphical representations to span the first projection area projected from the first projector of the mobile computing device and the projection areas projected from the projectors of the one or more neighboring computing devices in response to the touch-type interaction that occurs within the combined seamless user interface and within the input area illuminated from the mobile computing device.
2. A mobile computing device as recited in claim 1, wherein the combined seamless user interface allows the graphical representations to move seamlessly between the first projection area from the first projector of the mobile computing device and the projection areas of the one or more neighboring computing devices in response to the touch-type interaction with the combined seamless user interface.
3. A mobile computing device as recited in claim 1, wherein the collaboration logic is configured to graphically stitch the first projection area with the projection areas projected from projectors of the one or more neighboring computing devices in response to physical proximity of the one or more neighboring computing devices with the mobile computing device.
4. A mobile computing device as recited in claim 1, wherein:
the combined seamless user interface includes a home area corresponding to the mobile computing device and one or more neighboring areas corresponding to the neighboring computing devices, respectively;
the collaboration logic visually represents a computing device resource of the mobile computing device on the home area corresponding to the mobile computing device; and
in response to a user moving a visual representation of the computing device resource from the home area corresponding to the mobile computing device to a neighboring area corresponding to one of the neighboring computing devices, the collaboration logic copies the computing device resource from the mobile computing device to the one of the neighboring computing devices.
5. A mobile computing device as recited in claim 1, wherein:
the camera is configured to capture identifying information projected by the projectors of the one or more neighboring computing devices within their respective projection areas; and
the collaboration logic is configured to recognize the captured identifying information and in response to recognizing the captured identifying information, to communicate with the one or more neighboring computing devices and to include the projection areas projected from the projectors of the one or more neighboring computing devices in the combined user interface.
6. A mobile computing device as recited in claim 1, wherein the projection areas projected from the projectors of the neighboring computing devices expand the user interface of the mobile computing device.
7. A method of collaborating between first and second mobile computing devices having user interfaces that are projected from respective projectors having respective projection areas on a common projection surface, comprising:
capturing, by the first mobile computing device, identifying information projected in the projection area of the second mobile computing device;
communicating between the first and the second mobile computing devices based at least in part on the captured identifying information;
optically detecting an overlapping portion of the projection area of the projector of the first mobile computing device that overlaps with at least a portion of the projection area of the projector of the second mobile computing device; and
communicating between the first and the second mobile computing devices to:
clip at least part of the detected overlapping portion from the projection area of the projector of the first mobile computing device to leave a remaining projection area of the projector of the first mobile computing device; and
graphically stitch the remaining projection area of the projector of the first mobile computing device with the projection area of the projector of the second mobile computing device to create a seamless user interface that includes at least portions of the projection areas of the respective projectors of the first and second mobile computing devices.
8. A method as recited in claim 7,
wherein the captured identifying information comprises a barcode.
9. A method as recited in claim 7, wherein the captured identifying information includes parameters used for communicating with the second mobile computing device, the parameters including at least one of a device code of the second mobile computing device, an authentication code for the second mobile computing device, or a username.
10. A method as recited in claim 7, wherein the first and second mobile computing devices illuminate the projection areas of their respective projectors with infrared light to detect user interaction, and wherein optically detecting the overlapping portion comprises detecting overlapping areas of infrared light illumination.
11. A method as recited in claim 7, wherein optically detecting the overlapping portion comprises sensing brightness of the projection area of the projector of the first mobile computing device.
12. A method as recited in claim 7, further comprising:
illuminating the projection area of the projector of the first mobile computing device with infrared light; and
monitoring an infrared camera to detect user interaction with the projected user interface of the first computing device.
13. A method as recited in claim 7, further comprising:
optically monitoring the projected user interface of the first computing device to detect user interaction and to detect the overlapping portion.
14. A mobile computing device comprising:
a processor;
a projector having an associated projection area on a surface external to the mobile computing device, wherein the projector projects a user interface within the projection area on the surface;
an input sensor that senses user interaction with the projected user interface and captures identifying information projected in projection areas of one or more neighboring computing devices, and that detects an overlapping portion of projection area of the mobile computing device with at least a portion of projection areas of the one or more neighboring computing devices;
a communication system allowing the mobile computing device to communicate with the one or more neighboring computing devices based at least in part on the captured identifying information;
an illuminator that illuminates the projection area of the mobile computing device and portions of the projection areas of the one or more neighboring computing devices with non-visible light; and
collaboration logic executed by the processor that graphically stitches the detected overlapping portion of the projection area of the mobile computing device with the projection areas of the one or more neighboring computing devices to create a combined user interface utilizing the projection area of the mobile computing device and the projection areas of the one or more neighboring computing devices, wherein the projector of the mobile computing device is distinct from projectors of the one or more neighboring computing devices.
15. A mobile computing device as recited in claim 14, wherein the collaboration logic is configured to graphically stitch the projection areas in response to physical proximity of the one or more neighboring computing devices with the mobile computing device.
16. A mobile computing device as recited in claim 14, wherein:
the combined user interface has a home area corresponding to the mobile computing device and one or more neighboring areas corresponding to the neighboring computing devices, respectively;
the collaboration logic visually represents computing device resources of the respective computing devices on their corresponding user interface areas; and
in response to a user moving a visual representation of a particular device resource from the user interface area corresponding to a first of the computing devices to the user interface area corresponding to a second of the computing devices, the collaboration logic copies the particular device resource from the first of the computing devices to the second of the computing devices.
17. A mobile computing device as recited in claim 14, wherein:
the input sensor comprises a camera configured to capture the identifying information projected by the projectors of the one or more neighboring computing devices within their respective projection areas; and
the collaboration logic is configured to recognize the captured identifying information and in response to recognizing the captured identifying information, to communicate with the one or more neighboring computing devices and to include the projection areas of the projectors of the one or more neighboring computing devices in the combined user interface.
18. A mobile computing device as recited in claim 14, wherein the projection areas of the projectors of the neighboring computing devices expand the user interface of the mobile computing device.
19. A mobile computing device as recited in claim 14, wherein:
the input sensor comprises a camera configured to detect physical interaction by a user; and
the collaboration logic is responsive to the camera to detect overlapping projection areas and to stitch the projection area of the projector of the mobile computing device with the projection areas of the projectors of the one or more neighboring computing devices based on the detection of the overlapping projection areas.
20. A mobile computing device as recited in claim 14, wherein:
the non-visible light includes infrared light; and
the input sensor comprises a camera that monitors the infrared illumination of the projection area of the mobile computing device and the portions of the projection areas of the one or more neighboring computing devices to detect physical interaction by a user.
US12/699,706 2010-02-03 2010-02-03 Combined surface user interface Expired - Fee Related US9110495B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/699,706 US9110495B2 (en) 2010-02-03 2010-02-03 Combined surface user interface
US14/825,711 US10452203B2 (en) 2010-02-03 2015-08-13 Combined surface user interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/699,706 US9110495B2 (en) 2010-02-03 2010-02-03 Combined surface user interface

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/825,711 Continuation US10452203B2 (en) 2010-02-03 2015-08-13 Combined surface user interface

Publications (2)

Publication Number Publication Date
US20110191690A1 US20110191690A1 (en) 2011-08-04
US9110495B2 true US9110495B2 (en) 2015-08-18

Family

ID=44342714

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/699,706 Expired - Fee Related US9110495B2 (en) 2010-02-03 2010-02-03 Combined surface user interface
US14/825,711 Active 2031-01-22 US10452203B2 (en) 2010-02-03 2015-08-13 Combined surface user interface

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/825,711 Active 2031-01-22 US10452203B2 (en) 2010-02-03 2015-08-13 Combined surface user interface

Country Status (1)

Country Link
US (2) US9110495B2 (en)

Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130076592A1 (en) * 2011-09-27 2013-03-28 Paul E. Reeves Unified desktop docking behavior for visible-to-visible extension
US9268518B2 (en) 2011-09-27 2016-02-23 Z124 Unified desktop docking rules
US9715252B2 (en) 2011-08-24 2017-07-25 Z124 Unified desktop docking behavior for window stickiness
US9405459B2 (en) * 2011-08-24 2016-08-02 Z124 Unified desktop laptop dock software operation
JP5795582B2 (en) * 2009-07-31 2015-10-14 サムスン エレクトロニクス カンパニー リミテッド Integrated user interface generation method and apparatus for performing the same
GB0920754D0 (en) * 2009-11-27 2010-01-13 Compurants Ltd Inamo big book 1
US9110495B2 (en) * 2010-02-03 2015-08-18 Microsoft Technology Licensing, Llc Combined surface user interface
US9134799B2 (en) * 2010-07-16 2015-09-15 Qualcomm Incorporated Interacting with a projected user interface using orientation sensors
US20120026088A1 (en) * 2010-08-01 2012-02-02 T-Mobile Usa, Inc. Handheld device with projected user interface and interactive image
JP2012163796A (en) * 2011-02-08 2012-08-30 Seiko Epson Corp Projector and authentication method
WO2013020206A1 (en) 2011-08-08 2013-02-14 Research In Motion Limited Methods and apparatus to obtain and present information
JP6180072B2 (en) * 2011-08-24 2017-08-16 サターン ライセンシング エルエルシーSaturn Licensing LLC Display device, display system, and display method
US8878794B2 (en) 2011-09-27 2014-11-04 Z124 State of screen info: easel
US9164544B2 (en) 2011-12-09 2015-10-20 Z124 Unified desktop: laptop dock, hardware configuration
JP6083172B2 (en) * 2011-12-26 2017-02-22 セイコーエプソン株式会社 Lighting device
JP5879536B2 (en) * 2012-01-18 2016-03-08 パナソニックIpマネジメント株式会社 Display device and display method
CN202565359U (en) * 2012-05-11 2012-11-28 广东欧珀移动通信有限公司 Micro projector interaction mobile phone
KR101965740B1 (en) * 2012-09-03 2019-04-05 엘지전자 주식회사 Mobile terminal and data communication system using the mobile terminla
US8531519B1 (en) * 2012-09-06 2013-09-10 Google Inc. Automatic multi-device localization and collaboration using cameras
DE202012103472U1 (en) * 2012-09-12 2013-12-17 Zumtobel Lighting Gmbh Lighting system with integrated projection unit
EP2818986A1 (en) * 2013-06-28 2014-12-31 Nokia Corporation A hovering field
US20150187357A1 (en) * 2013-12-30 2015-07-02 Samsung Electronics Co., Ltd. Natural input based virtual ui system for mobile devices
EP3163421A4 (en) * 2014-06-24 2018-06-13 Sony Corporation Information processing device, information processing method, and program
US9465486B2 (en) * 2014-07-14 2016-10-11 Hong Kong Applied Science and Technology Research Institute Company Limited Portable interactive whiteboard module
EP3176636B1 (en) 2014-07-29 2020-01-01 Sony Corporation Projection-type display device
JP6372266B2 (en) * 2014-09-09 2018-08-15 ソニー株式会社 Projection type display device and function control method
CN105898168A (en) * 2014-11-14 2016-08-24 王殿龙 Cellular mobile phone projector
KR102393297B1 (en) * 2015-04-22 2022-05-02 삼성전자주식회사 A eletronic device and a method
JP6544073B2 (en) * 2015-06-22 2019-07-17 セイコーエプソン株式会社 Image display system and image display method
JP6798108B2 (en) * 2016-01-20 2020-12-09 セイコーエプソン株式会社 Image projection system and control method of image projection system
US10223060B2 (en) * 2016-08-22 2019-03-05 Google Llc Interactive video multi-screen experience on mobile phones
US10656777B1 (en) 2017-06-29 2020-05-19 Apple Inc. Concealed user interfaces
US11561406B2 (en) * 2017-12-10 2023-01-24 Lumus Ltd. Image projector
EP4339656A3 (en) 2018-05-14 2024-06-05 Lumus Ltd. Projector configuration with subdivided optical aperture for near-eye displays, and corresponding optical systems
CN108600641A (en) * 2018-07-05 2018-09-28 维沃移动通信有限公司 Photographic method and mobile terminal
CN110928473A (en) * 2018-09-20 2020-03-27 中兴通讯股份有限公司 Display control method, device and equipment of display screen and readable storage medium
KR20220024410A (en) 2019-06-27 2022-03-03 루머스 리미티드 Gaze tracking device and method based on eye imaging through a light guide optical element
US11528678B2 (en) * 2019-12-20 2022-12-13 EMC IP Holding Company LLC Crowdsourcing and organizing multiple devices to perform an activity
EP4115271A1 (en) * 2020-03-04 2023-01-11 Abusizz AG Interactive display apparatus and method for operating the same
CN117396792A (en) 2021-07-04 2024-01-12 鲁姆斯有限公司 Display with stacked light guide elements providing different portions of field of view

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5831601A (en) * 1995-06-07 1998-11-03 Nview Corporation Stylus position sensing and digital camera with a digital micromirror device
US5933132A (en) * 1989-11-07 1999-08-03 Proxima Corporation Method and apparatus for calibrating geometrically an optical computer input system
US20020024612A1 (en) * 2000-08-30 2002-02-28 Takaaki Gyoten Video projecting system
US6733138B2 (en) 2001-08-15 2004-05-11 Mitsubishi Electric Research Laboratories, Inc. Multi-projector mosaic with automatic registration
US20050073508A1 (en) * 1998-08-18 2005-04-07 Digital Ink, Inc., A Massachusetts Corporation Tracking motion of a writing instrument
US7002589B2 (en) 2000-03-17 2006-02-21 Sun Microsystems, Inc. Blending the edges of multiple overlapping screen images
US20060077188A1 (en) * 2004-09-25 2006-04-13 Samsung Electronics Co., Ltd. Device and method for inputting characters or drawings in a mobile terminal using a virtual screen
US20070070066A1 (en) * 2005-09-13 2007-03-29 Bakhash E E System and method for providing three-dimensional graphical user interface
US20070124503A1 (en) * 2005-10-31 2007-05-31 Microsoft Corporation Distributed sensing techniques for mobile devices
US20080018591A1 (en) * 2006-07-20 2008-01-24 Arkady Pittel User Interfacing
US20080143969A1 (en) * 2006-12-15 2008-06-19 Richard Aufranc Dynamic superposition system and method for multi-projection display
US20090284555A1 (en) * 2008-05-16 2009-11-19 Mersive Technologies, Inc. Systems and methods for generating images using radiometric response characterizations
US20100017745A1 (en) * 2008-07-16 2010-01-21 Seiko Epson Corporation Image display system, image supply device, image display device, image display method, and computer program product
US20100123545A1 (en) * 2008-11-17 2010-05-20 Seiko Epson Corporation Projection system, screen, and projector
US20100201702A1 (en) * 2009-02-03 2010-08-12 Robe Lighting S.R.O. Digital image projection luminaire systems
US20100299627A1 (en) * 2009-05-20 2010-11-25 Qualcomm Incorporated Method and apparatus for content boundary detection and scaling
US20110055729A1 (en) * 2009-09-03 2011-03-03 Steven Mason User Interface for a Large Scale Multi-User, Multi-Touch System
US20110050595A1 (en) * 2009-09-03 2011-03-03 Niklas Lundback Large Scale Multi-User, Multi-Touch System
US20110119638A1 (en) * 2009-11-17 2011-05-19 Babak Forutanpour User interface methods and systems for providing gesturing on projected images
US20110248963A1 (en) * 2008-12-24 2011-10-13 Lawrence Nicholas A Touch Sensitive Image Display

Family Cites Families (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8190695B2 (en) * 2001-08-02 2012-05-29 Sony Corporation Remote control system and remote control method, device for performing remote control operation and control method therefor, device operable by remote control operation and control method therefor, and storage medium
JP2004221908A (en) * 2003-01-14 2004-08-05 Sanyo Electric Co Ltd Method for controlling display, photographed image output sysetm capable of using the method, display control device, liquid crystal projector, and digital camera
JP3849654B2 (en) * 2003-02-21 2006-11-22 株式会社日立製作所 Projection display
WO2004079558A1 (en) * 2003-03-03 2004-09-16 Matsushita Electric Industrial Co., Ltd. Projector system
US7120816B2 (en) * 2003-04-17 2006-10-10 Nvidia Corporation Method for testing synchronization and connection status of a graphics processing unit module
US6935754B2 (en) * 2003-05-14 2005-08-30 In Focus Corporation User-interface for a projection device
JP4641374B2 (en) * 2003-10-14 2011-03-02 キヤノン株式会社 Projection optical system and projection display device using the same
US7338175B2 (en) * 2003-12-01 2008-03-04 Seiko Epson Corporation Front projection type multi-projection display
JP2005227480A (en) * 2004-02-12 2005-08-25 Seiko Epson Corp Multi-projection display and projector
WO2005088602A1 (en) * 2004-03-10 2005-09-22 Matsushita Electric Industrial Co., Ltd. Image transmission system and image transmission method
US8251512B2 (en) * 2004-07-08 2012-08-28 Imax Corporation Equipment and methods for the display of high resolution images using multiple projection displays
JP2006091112A (en) * 2004-09-21 2006-04-06 Nikon Corp Electronic equipment
JP4736436B2 (en) * 2005-01-20 2011-07-27 株式会社日立製作所 Projection type display device and multi-screen display device
JP4026649B2 (en) * 2005-02-16 2007-12-26 セイコーエプソン株式会社 Projector, projector control method, projector control program, and storage medium storing the program
US7765231B2 (en) * 2005-04-08 2010-07-27 Rathus Spencer A System and method for accessing electronic data via an image search engine
EP3169059B1 (en) * 2005-04-26 2018-06-13 Imax Corporation Electronic projection systems and methods
JP4626416B2 (en) * 2005-06-20 2011-02-09 セイコーエプソン株式会社 projector
JP2007079028A (en) * 2005-09-13 2007-03-29 Canon Inc Projection type image display apparatus and multi-projection system
US20070160373A1 (en) * 2005-12-22 2007-07-12 Palo Alto Research Center Incorporated Distributed illumination and sensing system
US20070264976A1 (en) * 2006-03-30 2007-11-15 Sony Ericsson Mobile Communication Ab Portable device with short range communication function
US7755026B2 (en) * 2006-05-04 2010-07-13 CandleDragon Inc. Generating signals representative of sensed light that is associated with writing being done by a user
JP4301262B2 (en) * 2006-08-08 2009-07-22 セイコーエプソン株式会社 Multi-display system and display method
JP4211815B2 (en) * 2006-08-08 2009-01-21 セイコーエプソン株式会社 Display device, multi-display system, image information generation method, image information generation program, and recording medium
US7677737B2 (en) * 2006-08-17 2010-03-16 Sony Ericsson Mobile Communications Ab Projector adaptation for self-calibration
JP4258540B2 (en) * 2006-09-01 2009-04-30 セイコーエプソン株式会社 Information processing apparatus, information processing program, and recording medium thereof
JP4811203B2 (en) * 2006-09-06 2011-11-09 沖電気工業株式会社 Projector, terminal, and image communication system
JP4274217B2 (en) * 2006-09-21 2009-06-03 セイコーエプソン株式会社 Image display device, image display system, and network connection method
JP4908265B2 (en) * 2007-03-01 2012-04-04 京セラ株式会社 Information processing apparatus, projection method, and projection program
EP1975697B1 (en) * 2007-03-26 2017-03-15 Seiko Epson Corporation Image display system and method, and screen device
JP4407841B2 (en) * 2007-07-13 2010-02-03 セイコーエプソン株式会社 Display system and display device
JP2009134069A (en) * 2007-11-30 2009-06-18 Seiko Epson Corp Projection image display position control device, projection image display control method, and projection system
US20090140986A1 (en) * 2007-11-30 2009-06-04 Nokia Corporation Method, apparatus and computer program product for transferring files between devices via drag and drop
JP4448179B2 (en) * 2008-01-31 2010-04-07 キヤノン株式会社 Projection device
US8077157B2 (en) * 2008-03-31 2011-12-13 Intel Corporation Device, system, and method of wireless transfer of files
KR101526995B1 (en) * 2008-10-15 2015-06-11 엘지전자 주식회사 Mobile terminal and method for controlling display thereof
US8255808B2 (en) * 2008-12-12 2012-08-28 Nokia Corporation Controlling data transfer between devices
US20100149096A1 (en) * 2008-12-17 2010-06-17 Migos Charles J Network management using interaction with display surface
JP2010152646A (en) * 2008-12-25 2010-07-08 Fujitsu Ltd Information providing method, and information providing system
US9110495B2 (en) * 2010-02-03 2015-08-18 Microsoft Technology Licensing, Llc Combined surface user interface

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5933132A (en) * 1989-11-07 1999-08-03 Proxima Corporation Method and apparatus for calibrating geometrically an optical computer input system
US5831601A (en) * 1995-06-07 1998-11-03 Nview Corporation Stylus position sensing and digital camera with a digital micromirror device
US20050073508A1 (en) * 1998-08-18 2005-04-07 Digital Ink, Inc., A Massachusetts Corporation Tracking motion of a writing instrument
US7002589B2 (en) 2000-03-17 2006-02-21 Sun Microsystems, Inc. Blending the edges of multiple overlapping screen images
US20020024612A1 (en) * 2000-08-30 2002-02-28 Takaaki Gyoten Video projecting system
US6733138B2 (en) 2001-08-15 2004-05-11 Mitsubishi Electric Research Laboratories, Inc. Multi-projector mosaic with automatic registration
US20060077188A1 (en) * 2004-09-25 2006-04-13 Samsung Electronics Co., Ltd. Device and method for inputting characters or drawings in a mobile terminal using a virtual screen
US7735018B2 (en) * 2005-09-13 2010-06-08 Spacetime3D, Inc. System and method for providing three-dimensional graphical user interface
US20070070066A1 (en) * 2005-09-13 2007-03-29 Bakhash E E System and method for providing three-dimensional graphical user interface
US20070124503A1 (en) * 2005-10-31 2007-05-31 Microsoft Corporation Distributed sensing techniques for mobile devices
US20080018591A1 (en) * 2006-07-20 2008-01-24 Arkady Pittel User Interfacing
US20080143969A1 (en) * 2006-12-15 2008-06-19 Richard Aufranc Dynamic superposition system and method for multi-projection display
US20090284555A1 (en) * 2008-05-16 2009-11-19 Mersive Technologies, Inc. Systems and methods for generating images using radiometric response characterizations
US20100017745A1 (en) * 2008-07-16 2010-01-21 Seiko Epson Corporation Image display system, image supply device, image display device, image display method, and computer program product
US20100123545A1 (en) * 2008-11-17 2010-05-20 Seiko Epson Corporation Projection system, screen, and projector
US20110248963A1 (en) * 2008-12-24 2011-10-13 Lawrence Nicholas A Touch Sensitive Image Display
US20100201702A1 (en) * 2009-02-03 2010-08-12 Robe Lighting S.R.O. Digital image projection luminaire systems
US20100299627A1 (en) * 2009-05-20 2010-11-25 Qualcomm Incorporated Method and apparatus for content boundary detection and scaling
US20110055729A1 (en) * 2009-09-03 2011-03-03 Steven Mason User Interface for a Large Scale Multi-User, Multi-Touch System
US20110050595A1 (en) * 2009-09-03 2011-03-03 Niklas Lundback Large Scale Multi-User, Multi-Touch System
US20110119638A1 (en) * 2009-11-17 2011-05-19 Babak Forutanpour User interface methods and systems for providing gesturing on projected images

Non-Patent Citations (14)

* Cited by examiner, † Cited by third party
Title
"Least Squares Fitting", Wolfram Mathworld, http://mathworld.wolfram.com/LeastSquaresFitting.html, retrieved on Dec. 10, 2009.
Cao, et al., "Multi-User Interaction Using Handheld Projectors", In the Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology, Session: Novel Interaction, 2007, pp. 43-52.
D. Oberkampf, D. DeMenthon and L.S. Davis, "Iterative Pose Estimation using Coplanar Feature Points", CVIU (was called CVGIP: Image Understanding at the time), vol. 63, No. 3, May 1996.
Fernandes, L. A. and Oliveira, M. M. 2008. "Real-time line detection through an improved Hough transform voting scheme", Pattern Recogn. 41, 1 (Jan. 2008), 299-314.
Haliburton, et al., "Pico Projectors, ProCams, and a New Kind of Interaction", TAT Discussion Paper, Retrieved on May 27, 2010 at <<http://www.tat.se/site/media/downloads/TAT-procams%20paper%20251108.pdf>> 4 pgs.
Haliburton, et al., "Pico Projectors, ProCams, and a New Kind of Interaction", TAT Discussion Paper, Retrieved on May 27, 2010 at > 4 pgs.
Hartley, Richard and Zisserman, Andrew, "Multiple View Geometry in Computer Vision", 2nd Edition, Cambridge University Press, 2003.
Miyahara, et al., "Intuitive Manipulation Techniques for Projected Displays of Mobile Devices", ACM, Conference on Human Factors in Computing Systems, Session: Late Breaking Results: Short Papers, 2005, pp. 1657-1660.
Raskar, et al., "Intelligent Clusters and Collaborative Projector-Based Displays", NSF Workshop on Collaborative Virtual Reality and Visualization, Oct. 2003, 7 pgs.
Raskar, et al., "Multi-Projector Imagery on Curved Surfaces", Mitsubishi Electric Research Labs, Nov. 1998, Available at <<http://web.media.mit.edu˜raskar/CurvedScreen/curvedScreen.pdf>> 8 pgs.
Raskar, et al., "Multi-Projector Imagery on Curved Surfaces", Mitsubishi Electric Research Labs, Nov. 1998, Available at > 8 pgs.
Raskar, et al., "Seamless Projection Overlaps Using Image Warping and Intensity Blending", In the Fourth International Conference on Virtual Systems and Multimedia, Nov. 1998, Available at <<http://cs.unc.edu/Research/stc/publications/Raskar-VSMM98.pdf>> pp. 1-5.
Raskar, et al., "Seamless Projection Overlaps Using Image Warping and Intensity Blending", In the Fourth International Conference on Virtual Systems and Multimedia, Nov. 1998, Available at > pp. 1-5.
Z. Zhang, "A flexible new technique for camera calibration", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, No. 11, pp. 1330-1334, 2000.

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001066484A1 (en) 2000-03-05 2001-09-13 3M Innovative Properties Company Radiation-transmissive films on glass articles
US20140218624A1 (en) * 2007-08-07 2014-08-07 Seiko Epson Corporation Graphical user interface device
US12099196B2 (en) 2018-05-17 2024-09-24 Lumus Ltd. Near-eye display having overlapping projector assemblies
US11243674B2 (en) * 2018-07-10 2022-02-08 Seiko Epson Corporation Display apparatus and image processing method

Also Published As

Publication number Publication date
US10452203B2 (en) 2019-10-22
US20150346857A1 (en) 2015-12-03
US20110191690A1 (en) 2011-08-04

Similar Documents

Publication Publication Date Title
US10452203B2 (en) Combined surface user interface
US10528124B2 (en) Information processing apparatus, information processing method, and computer-readable recording medium
US9262117B2 (en) Image capture modes for self portraits
Huang et al. Ubii: Towards seamless interaction between digital and physical worlds
US20120290943A1 (en) Method and apparatus for distributively managing content between multiple users
US20120081317A1 (en) Method and system for performing copy-paste operations on a device via user gestures
US9086840B2 (en) RSID proximity peripheral interconnection
US10276133B2 (en) Projector and display control method for displaying split images
US9377901B2 (en) Display method, a display control method and electric device
JP2014026602A (en) Display terminal device and program
US20130222241A1 (en) Apparatus and method for managing motion recognition operation
US10013595B2 (en) Correlating fingerprints to pointing input device actions
US11733857B2 (en) Graphical user interface (GUI) for controlling virtual workspaces produced across information handling systems (IHSs)
US11150860B1 (en) Dynamic virtual workspace with contextual control of input/output (I/O) devices
US11221760B2 (en) Information processing apparatus, information processing method, and storage medium
CN107230240B (en) Shooting method and mobile terminal
JP5558899B2 (en) Information processing apparatus, processing method thereof, and program
JP2015018426A (en) Information display device
JP2014204169A (en) Display control device and control method thereof
JP2018050285A (en) Communication terminal, communication system, output method, and program
US11928382B2 (en) Contextual intelligence for virtual workspaces produced across information handling systems (IHSs)
TW201142466A (en) Interactive projection system and system control method thereof
JP7287156B2 (en) Display device, display method, program
WO2022037247A1 (en) Device, method and system for operating device
JP6213597B2 (en) Display control apparatus, display control method, and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZHANG, CHUNHUI;ZHAO, JI;WANG, MIN;AND OTHERS;SIGNING DATES FROM 20091217 TO 20091218;REEL/FRAME:023894/0520

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034564/0001

Effective date: 20141014

ZAAA Notice of allowance and fees due

Free format text: ORIGINAL CODE: NOA

ZAAB Notice of allowance mailed

Free format text: ORIGINAL CODE: MN/=.

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20230818