
USRE44258E1 - Apparatus and method for manipulating a touch-sensitive display panel - Google Patents

Apparatus and method for manipulating a touch-sensitive display panel

Info

Publication number
USRE44258E1
USRE44258E1 (application US 12/412,806)
Authority
US
United States
Prior art keywords
point
touch
contacted
movement distance
display panel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime, expires
Application number
US12/412,806
Inventor
Nobuyuki Matsushita
Yuji Ayatsuka
Junichi Rekimoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Priority to US12/412,806 priority Critical patent/USRE44258E1/en
Application granted granted Critical
Publication of USRE44258E1 publication Critical patent/USRE44258E1/en
Adjusted expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/045 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using resistive elements, e.g. a single continuous surface or two parallel surfaces put in contact
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 - Control or interface arrangements specially adapted for digitisers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 - Indexing scheme relating to G06F3/048
    • G06F 2203/04808 - Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention makes it possible to easily perform graphic processing even when a touch panel is used. When a resistance film unit is pressed with a pen or a finger, the output voltages associated with the X coordinate and the Y coordinate change, and these output voltages are transmitted as X coordinate data and Y coordinate data to a touch panel driver. According to the output from the resistance film unit, the touch panel driver generates an event and supplies it to a GUI handler. The touch panel driver includes a two-point specification detector which detects that two points are specified and calculates the coordinates of the two points. The GUI handler generates a message corresponding to the GUI according to the event and supplies the message to an application. The GUI handler includes a processing mode modification block which interprets the event differently when a single point is specified and when two points are specified, thereby modifying the graphic processing mode.

Description

Notice: More than one reissue application has been filed for the reissue of U.S. Pat. No. 6,958,749. The reissue applications are application Ser. Nos. 12/412,806 (the present application) and 11/862,943 (the Parent reissue application) filed Sep. 27, 2007, both of which are divisional reissues of U.S. Pat. No. 6,958,749.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a graphic processing apparatus and in particular to an apparatus capable of easily performing graphic processing even when a touch panel is used.
2. Description of the Prior Art
With increases in computer performance and advances in size reduction, various portable computers (personal digital assistants, PDAs) have come into wide use. Most conventional PDAs employ an interface in which almost all operations are performed with a single pen. This is based on the metaphor of a notebook and a pencil.
Meanwhile, graphic operations are widely performed with graphic creation software through operation of a keyboard and a mouse. When such a graphic editing operation is to be performed on the aforementioned PDA touch panel using a pen or finger, only one point on the panel can be specified at a time, and complicated processing must be performed repeatedly. For example, an operation type (such as move) is selected through a menu and then a graphic object is moved with the pen. This must be repeated for each edit, making the process cumbersome.
Recently, as disclosed in Japanese Patent Publication 9-34625, a technique for simultaneously pressing two points on a touch panel has been suggested. This technique is known to be used on a touch panel in the same way as on a keyboard, for example for an operation combining the Shift key and a letter key.
SUMMARY OF THE INVENTION
It is therefore an object of the present invention to provide an apparatus capable of easily performing graphic processing on a touch panel, using the technique of simultaneously entering two points on the touch panel.
That is, the present invention provides a graphic processing apparatus including: a touch panel; means for deciding whether a single point or two points are specified on the touch panel; means for performing a graphic processing in a first graphic processing mode when the single point is specified; and means for performing a graphic processing in a second graphic processing mode when the two points are specified.
With this configuration, a graphic processing mode can be selected according to the number of points specified, and accordingly a predetermined graphic processing operation can be selected with a small number of operation steps. For example, when a single point is specified, a graphic object is moved or a segment is drawn point by point, and when two points are specified, editing such as enlargement, reduction, and rotation can be performed. In this case, the editing type may be identified from the movement of the specified positions: for example, when a first point is held fixed and a second point is moved away from the first point, enlargement or reduction is performed in that direction, and rotation is performed around the fixed point, as sketched below.
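As a loose illustration of this idea (the function name and the decomposition into radial and angular components are assumptions, not taken from the patent text), the editing type could be derived from how the second point moves relative to the fixed first point. A minimal Python sketch:

    import math

    def interpret_second_point(fixed, prev, curr):
        # Radial change maps to enlargement/reduction; angular change maps
        # to rotation around the fixed first point.
        r_prev = math.hypot(prev[0] - fixed[0], prev[1] - fixed[1])
        r_curr = math.hypot(curr[0] - fixed[0], curr[1] - fixed[1])
        ang_prev = math.atan2(prev[1] - fixed[1], prev[0] - fixed[0])
        ang_curr = math.atan2(curr[1] - fixed[1], curr[0] - fixed[0])
        scale = r_curr / r_prev if r_prev else 1.0   # > 1 enlarges, < 1 reduces
        angle = ang_curr - ang_prev                  # rotation in radians
        return scale, angle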
Moreover, the present invention provides a portable computer including: a frame which can be grasped by a user's hand; a touch panel formed on the upper surface of the frame; detection means for detecting specification of a predetermined area on the touch panel in the vicinity of a region where a user's thumb is positioned when he/she grasps the portable computer; interpretation means for interpreting another point specification on the touch panel in a corresponding interpretation mode according to a detection output from the detection means while the predetermined area is specified; and execution means for executing a predetermined processing according to a result of the interpretation.
With this configuration, a point on the touch panel can be specified with a pen or a finger while a predetermined area on the touch panel is specified using the thumb of the hand grasping the portable computer body. In the conventional example, one hand is used to grasp the portable terminal and the other hand is used to specify a position on the touch panel. In the present invention, the thumb, which has conventionally gone unused, can be used to select a menu or an operation mode.
Furthermore, the present invention provides a coordinate position input apparatus including: a touch panel for outputting a coordinate data of a middle point when two points are simultaneously touched; storage means for retaining coordinate position of the two points detected previously; detection means for detecting a coordinate position of a current middle point; and calculation means for calculating a coordinate of one of the two touch points assumed to be a moving point by subtracting a coordinate position of a previous fixed point from a current middle point coordinate multiplied by 2.
With this configuration, by employing a user interface that assumes one of the two touch points to be fixed, a coordinate position can be calculated easily and correctly even when the other touch point is moved.
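As a worked example of this calculation (the numbers are invented for illustration): if the previously detected fixed point is A = (100, 100) and the panel currently reports the middle point M = (150, 120), the moving point is B = 2M - A = (200, 140). A one-line Python sketch of the calculation means:

    def moving_point(middle, fixed):
        # B = 2 * M - A: double the middle-point coordinate, then subtract
        # the coordinate of the point assumed to be fixed.
        return (2 * middle[0] - fixed[0], 2 * middle[1] - fixed[1])

    print(moving_point((150, 120), (100, 100)))   # -> (200, 140)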
It should be noted that at least a part of the present invention can be realized as computer software, and can be implemented as a computer program package (recording medium).
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 shows a portable computer according to an embodiment of the present invention.
FIG. 2 is a block diagram showing a functional configuration of the aforementioned embodiment.
FIG. 3 is a block diagram explaining an essential portion of a touch panel driver in the aforementioned embodiment.
FIG. 4 explains a mode modification block in the aforementioned embodiment.
FIGS. 5A, 5B, 5C, 5D, 5E and 5F show an operation state in the aforementioned embodiment.
FIG. 6 explains a control operation in the aforementioned embodiment.
FIG. 7 explains a mode modification block in a modified example of the aforementioned embodiment.
FIGS. 8A, 8B, 8C, 8D, 8E and 8F show an operation state of the modified example of FIG. 7.
FIG. 9 is a flowchart explaining a control operation in the modified example of FIG. 7.
FIG. 10 explains a mode modification block in another modified example of the aforementioned embodiment.
FIGS. 11A, 11B, 11C, 11D, 11E and 11F explain an operation state of the modified example of FIG. 10.
FIG. 12 is a flowchart explaining a control operation in the modified example of FIG. 10.
FIG. 13 is a flowchart explaining coordinate position calculation processing.
FIGS. 14A, 14B and 14C are additional explanations of the coordinate position calculation processing of FIG. 13.
DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT
Description will now be directed to a preferred embodiment of the present invention with reference to the attached drawings.
FIG. 1 is an external view of a portable computer according to the embodiment. In this figure, the portable computer 1 has a flattened cubic configuration of a size that can be grasped in one hand by an adult. The portable computer 1 has on its upper side a pressure-sensitive (resistance type) touch panel 2. The touch panel is an ordinary pressure-sensitive type: when it is pressed with a pen (not depicted) or a finger, a change in the inter-terminal voltage is detected so as to enter coordinates. In this embodiment, by properly designing the size of the portable computer 1, the user can freely move his/her thumb while grasping the portable computer 1. As shown in the figure, buttons 2a are arranged in the vicinity of the user's thumb, so that the user can specify the buttons 2a while grasping the portable computer 1. The buttons 2a may or may not be displayed in a predetermined mode.
FIG. 2 shows functional blocks realized by the internal circuits and the touch panel 2 of the portable computer 1. The functional blocks realized by the portable computer 1 are a touch panel driver 3, a display driver 4, a graphical user interface (GUI) handler 5, an application 6, and the like. Moreover, the touch panel 2 includes a liquid crystal display unit 7 and a resistance film unit 8. It should be noted that components not related to the present invention will not be explained. Moreover, the hardware (CPU, recording apparatus, and the like) constituting the aforementioned functional blocks is identical to that of an ordinary portable terminal, and its explanation is omitted.
The application 6 includes a database application for managing personal information, a mail application, a browser, an image creation application, and the like. The application 6 can be selected through a menu, and some applications 6, such as the mail application, may be selected by a push button (a mechanical component). The application 6 creates a message related to display and supplies the message to the GUI handler 5. Upon reception of this message, the GUI handler 5 creates display image information and transfers it to the display driver 4. The display driver 4, according to the display data, drives the liquid crystal display unit 7 to display information for the user.
When the resistance film unit 8 is pressed by a pen or a finger, the output voltages associated with the X coordinate and the Y coordinate change, and these output voltages are transmitted as X coordinate data and Y coordinate data to the touch panel driver 3. The touch panel driver 3, according to the outputs from the resistance film unit 8, generates an event including information such as touch panel depression, depression release, finger position, and the like, and supplies the event to the GUI handler 5. The GUI handler 5, according to the event, generates a message corresponding to the GUI and supplies it to the application 6.
FIG. 3 shows a configuration example associated with the specified position detection of the touch panel driver 3. In this figure, the touch panel driver 3 includes a two-point specification detector 31, an inhibit circuit 32, and a two-point position calculator 33. The two-point specification detector 31 detects that two points are specified; the specific method will be explained later with reference to FIG. 13 and FIG. 14. Specified coordinate data (X, Y) is entered from an input block 30. When only one point is specified on the touch panel 2, the coordinate data (X, Y) from the touch panel 2 is output as detected coordinate data (X1, Y1). When two points are specified on the touch panel 2, the coordinates of an intermediate point between them are output as the coordinate data (X, Y). When the two-point specification detector 31 decides that two points are specified, it drives the inhibit circuit 32 so as to inhibit output of the input data as it is. Moreover, upon detecting that two points are specified, the two-point specification detector 31 uses the input data latched at the preceding sampling timing (the coordinate data (X1, Y1) from when one point was specified) and the current input data (X, Y) to calculate the new specified position coordinates (X2, Y2) by extrapolation, and outputs the coordinate data of the two points (X1, Y1) and (X2, Y2). When the two-point specification detector 31 detects that the two-point specification is released, it disables the inhibit circuit 32 so as to output the input data as it is.
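A minimal sketch of this driver behavior (the class and method names are assumptions; the patent describes the blocks of FIG. 3, not this code). While one point is held, the reading passes through as (X1, Y1); once a two-point specification is detected, the latched one-point reading and the current middle-point reading are used to extrapolate (X2, Y2):

    class TwoPointDetector:
        def __init__(self):
            self.latched = None          # (X1, Y1) latched while one point is held

        def feed(self, x, y, two_points):
            if not two_points:
                self.latched = (x, y)    # output the input data as it is
                return [(x, y)]
            x1, y1 = self.latched        # inhibit raw output; extrapolate instead
            return [(x1, y1), (2 * x - x1, 2 * y - y1)]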
Thus, an event can be generated both when a single point is specified and when two points are specified.
FIG. 4 explains a configuration of a processing mode modification block 50. The processing mode modification block 50 is arranged, for example, in the GUI handler 5. In FIG. 4, the processing mode modification block 50 receives a control data input (event) and an operation data input (event). In the example of FIG. 4, the supplied control data indicates whether a single point or two points have been specified, and different mode processes are performed accordingly. For example, in the case of the graphic processing application, when the control data indicates a single-point specification, the operation data is interpreted as a command to move the object to be operated, and the corresponding move message is supplied to the application 6. On the other hand, when the control data indicates a two-point specification, the operation data is interpreted as a command to rotate the object to be operated, and a rotation message is supplied to the application 6.
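A hedged sketch of this dispatch (the event values and message names are assumptions):

    def modify_processing_mode(control_event, operation_event, application):
        # Single-point specification -> move mode; two-point specification
        # -> rotation mode, as in the graphic processing example above.
        if control_event == "single_point":
            application.send(("move", operation_event))
        else:
            application.send(("rotate", operation_event))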
FIGS. 5A, 5B, 5C, 5D, 5E and 5F show an operation example of processing a graphic object using such a processing mode modification block 50. It should be noted that in this example, the graphic processing application is assumed to be executing. In FIG. 5A, at an initial stage, it is assumed that a rectangular object is displayed. This can be created by the application 6 or selected through a menu. Next, this rectangular object is touched (pressed) by a finger, as shown in FIG. 5B, and when the finger is moved while pressing the rectangular object, the rectangular object is also moved, as shown in FIG. 5C. Next, the rectangular object is pressed at two points, as shown in FIG. 5D. When one of the fingers is rotated around the other while pressing the rectangular object, the rectangular object is rotated, as shown in FIGS. 5E and 5F.
FIG. 6 explains the operation of a control block for executing the operation of FIG. 5. The control block executing this process includes the GUI handler 5 and the application 6. In FIG. 6, no operation is performed in state S1. When a first finger touches the panel, state S2 is entered and the graphic object moves according to the finger position. In state S2, if the first finger is released, state S1 is entered again. Moreover, in state S2, if a second finger touches the panel, the position of the first finger is stored as point A (S3) and state S4 is entered, in which the second finger can rotate the graphic object around point A. In state S4, if one of the fingers is released and the remaining finger stays in the touch state, control returns to state S2 so that the graphic object can be moved.
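The transitions of FIG. 6 can be summarized in a small sketch (the state names follow the figure; the event names and the code itself are assumptions):

    def next_state(state, event):
        if state == "S1" and event == "first_touch":
            return "S2"                  # move the object with the finger
        if state == "S2" and event == "release":
            return "S1"
        if state == "S2" and event == "second_touch":
            return "S4"                  # point A is stored on the way (S3)
        if state == "S4" and event == "one_finger_released":
            return "S2"                  # the remaining finger moves the object
        return state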
As has been described above, the processing mode can be switched between the move mode and the rotation mode depending on whether a single point or two points are pressed on the touch panel 2. Thus, a graphic object can easily be operated. It should be noted that the mode can also be switched by specifying three positions.
Next, explanation will be given on a modified example of the aforementioned embodiment. FIG. 7 explains the processing mode modification block 50 in the modified example. In this figure, data (an event) indicating whether a predetermined button is pressed is entered as the control data. The buttons 2a are arranged in a straight line, as shown in FIG. 8, so as to be in the vicinity of the user's thumb. Each of the buttons can be specified by slightly moving the thumb. When the control data indicates a predetermined button, the operation data is processed in the corresponding mode.
FIGS. 8A, 8B, 8C, 8D, 8E and 8F show an operation example using the processing mode modification block 50 of FIG. 7. In this example also, the graphic processing application is assumed to be executing. When no buttons 2a are specified, as shown in FIG. 8A, it is possible to specify and move a graphic object, as shown in FIGS. 8B and 8C. In this example, a heart-shaped object is moved toward the lower left. Next, when the second button 2a from the top (the enlarge/reduce button) is pressed, as shown in FIG. 8D, the enlarge/reduce mode is selected, so that the graphic object can be enlarged or reduced by specifying it with a pen or finger. In this example, the pressing position is moved upward so as to enlarge the graphic object, as shown in FIGS. 8E and 8F. On the other hand, when the pressing position is moved downward, reduction is performed. Processes other than enlarge/reduce can also be performed by pressing a corresponding button. The buttons are arranged at the left side of the touch panel in this example, but they may be arranged at the right side instead. It is also possible to configure the apparatus so that the arrangement of the buttons can be switched. In such a case, the portable computer 1 may be grasped by the user's right hand or left hand.
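One plausible mapping for this gesture (the gain and the linear form are assumptions; the patent states only that upward movement enlarges and downward movement reduces):

    def scale_from_drag(dy_pixels, gain=0.005):
        # Screen y grows downward, so an upward drag (dy < 0) enlarges.
        return max(0.1, 1.0 - gain * dy_pixels)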
FIG. 9 is a flowchart explaining the process of FIG. 8. Initially, in state S11, nothing is performed. Next, when an area other than the enlarge/reduce button is pressed (S12), control is passed to state S13 where an object is moved together with the position of the pen. When the enlarge/reduce button is pressed (S12), control is passed to state S14 to wait for a second pen (or finger) touch in the enlarge/reduce mode. If a second pen (finger) touch is performed in state S14, control is passed to state S15 where enlargement/reduction is performed in accordance with the pen position. Moreover, if the touch is released in state S13 or S14, control is returned to state S11 where nothing is performed. When the touch of the enlarge/reduce button is released in state S15, control is passed to state S13 where the object is moved. Moreover, if a touch other than the touch of the enlarge/reduce button is released in state S15, control is returned to state S14 to wait for a touch specifying enlargement or reduction.
It should be noted that while explanation has been given on the enlarge/reduce button in FIG. 9, the other button functions are performed in the same way.
Next, explanation will be given on another modified example of the aforementioned embodiment.
FIG. 10 explains the processing mode modification block 50 of this modified example. In this figure also, data indicating whether a button is pressed is entered as the control data (event). This data is also entered as operation data, and a corresponding menu is displayed. With the menu displayed, if data is entered to operate an item selected in the menu, a predetermined processing is performed.
FIG. 11 shows a processing state in the modified example of FIG. 10. In this example, an application in which a processing operation is selected through a predetermined icon is executed. In FIG. 11A, buttons 2a are displayed in a vertical straight line at the left side of the touch panel 2, in the same way as in the example of FIG. 8. If a graphic object is specified without specifying any of the buttons, the move processing is executed so that the object is moved together with the specification point, as shown in FIGS. 11B and 11C. Next, when a predetermined button 2a is pressed, a corresponding menu (a plurality of objects) is displayed, as shown in FIGS. 11D and 11E. Here, the other buttons disappear. When the remaining button and one of the icons (displayed objects) are simultaneously touched, a corresponding processing is performed, as shown in FIG. 11F. In this example, an icon group corresponding to the button 2a is displayed. It should be noted that in this example, two fingers of the right hand are used for operation, but it is also possible to operate using the thumb of the left hand and one finger of the right hand or a pen. Moreover, the buttons 2a arranged at the left side of the touch panel 2 may instead be arranged at the right side of the touch panel 2. It is also possible to configure the apparatus so that the arrangement of the buttons 2a can be switched between the right side and the left side of the touch panel 2.
FIG. 12 is a flowchart explaining the control operation of FIG. 10. In FIG. 12, firstly, nothing is performed in state S21. In state S21, if a first touch specifies a graphic object without specifying any of the menu buttons 2a (S22), control is passed to state S23 where the graphic object is moved together with the movement of the pen. In state S21, if the first touch specifies the menu button 2a (S22), a corresponding menu pops up and control is passed to state S24 where the touch state is monitored. In state S24, if a second touch selects an icon, a selected command is executed (S25), the menu is pulled down, and control is passed to state S26 where the touch state is monitored. In state S26, when the touch of the menu button is released, control is passed to state S23 where the object is moved. In state S26, when the touch of the icon is released, control is returned to state S24 where the menu pops up. Moreover, in state S23 and state S24, when the other touch is also released, control is returned to state S21.
Next, explanation will be given on the two-point specification detection and the coordinate data calculation in the aforementioned embodiment. FIG. 13 shows an operation of the two-point specification detection and the coordinate data calculation. It should be noted that the symbols used have the meanings shown in the figure. Moreover, FIGS. 14A, 14B and 14C explain a scheme employed by the GUI: FIG. 14A shows that nothing is performed; FIG. 14B assumes that a first touch point A is moved; and FIG. 14C assumes that a second touch point B is moved. It is determined in advance whether to employ FIG. 14B or FIG. 14C. It is also possible to switch between FIG. 14B and FIG. 14C through a button operation according to whether the user is right-handed or left-handed.
In FIG. 13, firstly nothing is performed in state S31. In state S31, if a first touch is performed, control is passed to a first-touch coordinate calculation mode, state S32. In state S32, a detected coordinate position N of the touch panel 2 is received and entered as the current first touch position coordinate A_n. In state S32, it is decided at a predetermined time interval whether the touch has been released or the touch point has moved (S33). When the touch is released, control is returned to state S31. When the touch point has moved, it is determined whether the movement distance is within a threshold value (S34). If the movement distance exceeds the threshold value, it is determined that two points are touched and control is passed to a two-point touch coordinate position calculation mode, state S35. That is, the previous first coordinate A_(n-1) is kept as the current first coordinate A_n, and the previous first coordinate A_(n-1) is subtracted from twice the current coordinate data N so as to obtain the current second coordinate B_n. That is, B_n = 2N - A_(n-1). If the movement distance is within the threshold value, it is determined that only one touch has been made, as before, and control is returned to state S32. Normally, when the specified position is moved continuously using a pen or finger, the movement distance per unit time is not very great. In contrast, when a second touch is performed, the apparent coordinate position jumps stepwise to the middle point. Accordingly, it is possible to detect such a sudden movement and thereby identify a two-point specification.
Next, in state S35 (two-point mode), the movement is monitored to determine whether the movement distance is within the threshold value (S36, S37). If it is within the threshold value, the two-point mode is maintained. As has been described above, it is determined in advance, for each GUI, which of the touch points is moved. As shown in FIG. 14B, if the first touch position is moved according to the GUI design (S38), the first touch position coordinate A_n is calculated by A_n = 2N - B_(n-1) (S39) while the second touch position remains unchanged (B_n = B_(n-1)). On the contrary, as shown in FIG. 14C, when the GUI used is such that the second touch position is moved (S38), the touch position coordinates are calculated as A_n = A_(n-1) and B_n = 2N - A_(n-1) (S40). After states S39 and S40, control is returned to state S36. If the movement distance exceeds the threshold value, it is determined that one of the touches has been released and control is returned to state S32 (S37).
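Putting the two stages together, here is a compact sketch of the FIG. 13 procedure (the threshold value, function names, and simplified release handling are assumptions; only the step-detection idea and the 2N - A extrapolation come from the patent):

    import math

    THRESHOLD = 20.0   # assumed units: panel coordinates per sampling interval

    def track(samples, moving="B"):
        # samples: iterable of (x, y) panel readings; yields (A, B) estimates.
        # moving="B" follows FIG. 14C (B moves, A fixed);
        # moving="A" follows FIG. 14B (A moves, B fixed).
        a = b = prev = None
        two_point = False
        for n in samples:
            if prev is None:
                a, prev = n, n                    # first touch: S31 -> S32
                yield a, None
                continue
            jump = math.dist(prev, n)
            if not two_point:
                if jump <= THRESHOLD:
                    a = n                         # still a single point (S32)
                else:
                    b = (2 * n[0] - a[0], 2 * n[1] - a[1])   # S35: B = 2N - A
                    two_point = True
            else:
                if jump <= THRESHOLD:
                    if moving == "B":
                        b = (2 * n[0] - a[0], 2 * n[1] - a[1])   # S40
                    else:
                        a = (2 * n[0] - b[0], 2 * n[1] - b[1])   # S39
                else:
                    two_point = False             # one touch released (S37)
                    a, b = n, None
            prev = n
            yield a, b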
As has been described above, in this embodiment of the present invention, graphic processing can easily be performed with a small number of operations even when using a touch panel. Moreover, a user can use the thumb of the hand grasping the portable computer for input operations. Moreover, even when two points are simultaneously touched, the user interface can be set so that one of the two points is regarded as fixed while the movement coordinates of the other point can easily be calculated. This significantly simplifies command creation by coordinate movement.
As has been described above, according to the present invention, it is possible to easily perform graphic processing even when using a touch panel. Moreover, the thumb of the hand grasping the portable computer body can be used as an input means. Moreover, even in the case of a pressure-sensitive (resistance film type) touch panel, it is possible to detect a movement of one of two touched points, thereby making it possible to create a command by a two-point touch movement.

Claims (14)

What is claimed is:
1. A coordinate position input apparatus comprising: a touch panel for outputting a coordinate data of a middle point when two points are simultaneously touched; storage means for retaining coordinate position of the two points detected previously; detection means for detecting a coordinate position of a current middle point; and calculation means for calculating a coordinate of one of the two touch points assumed to be a moving point by subtracting a coordinate position of a previous fixed point from a current middle point coordinate multiplied by 2.
2. The coordinate input apparatus as claimed in claim 1, wherein when a second point is touched while a first point is touched, the touch point of the second point is calculated according to a current middle point coordinate position and a previous first point touch position coordinate position.
3. A hand-held portable computer comprising:
a touch-sensitive display panel; and
a processing apparatus, coupled to the touch-sensitive display panel, wherein the processing apparatus:
(a) executes a graphical processing application;
(b) determines a threshold movement distance value;
(c) detects a first point contacted on a surface of the touch-sensitive display panel, wherein the first point corresponds to a graphic object;
(d) determines a value of a movement distance of the first point contacted;
(e) if the determined value of the movement distance of the first point contacted is less than the threshold movement distance value, executes a first processing mode in the graphical processing application, said first processing mode being executed independent of any detection of any additional point contacted on the surface of the touch-sensitive display;
(f) if the determined value of the movement distance of the first point contacted exceeds the threshold movement distance value, determines that a second point is contacted on the surface of the touch-sensitive display panel in addition to the contacted first point, wherein the second point corresponds to the graphic object; and
(g) in response to the determination that the second point is contacted in addition to the contacted first point, executes a second, different processing mode in the graphical processing application.
4. The hand-held portable computer of claim 3, wherein the first and second processing modes perform at least one of enlargement, reduction, and rotation.
5. A hand-held portable computer comprising:
a touch-sensitive display panel; and
a processing apparatus, coupled to the touch-sensitive display panel, wherein the processing apparatus:
(a) executes a graphical processing application;
(b) communicates with the touch-sensitive display panel to display a plurality of selection items and a graphic object on the touch panel;
(c) determines a threshold movement distance value;
(d) detects a first point contacted on a surface of the touch-sensitive display panel that corresponds to one of the plurality of selection items;
(e) determines a value of a movement distance of the first point contacted;
(f) if the determined value of the movement distance of the first point contacted is less than the threshold movement distance value, executes a first processing mode in the graphical processing application;
(g) if the determined value of the movement distance of the first point contacted exceeds the threshold movement distance value, determines that a second point is contacted on the surface of the display panel in addition to the contacted first point, wherein the second point corresponds to a graphic object; and
(h) in response to the determination that the second point is contacted in addition to the contacted first point, executes a second, different processing mode in the graphical processing application.
6. A hand-held portable information processing apparatus, comprising:
a touch-sensitive display panel; and
a processing apparatus, coupled to the touch-sensitive display panel, wherein the processing apparatus
(a) executes a graphical processing application;
(b) determines a threshold movement distance value;
(c) detects a first point contacted on the surface of the touch-sensitive display panel;
(d) determines a value of a movement distance of the first point contacted;
(e) if the determined value of the movement distance of the first point contacted is less than the threshold movement distance value, executes a first process in the graphical processing application, wherein the first point corresponds to a graphic object indicative of the first process;
(f) if the determined value of the movement distance of the first point contacted exceeds the threshold movement distance value, determines that a second point is contacted on the surface of the touch-sensitive display panel in addition to the contacted first point; and
(g) in response to the determination that the second point is contacted in addition to the contacted first point, executes a second, different process in the graphical processing application.
7. The hand-held portable information processing apparatus of claim 6, wherein the first process relates to moving a predetermined object along a trace associated with the detected first point.
8. The hand-held portable information processing apparatus of claim 6, wherein the second process performs at least one of enlargement, reduction, and rotation.
9. The hand-held portable information processing apparatus of claim 6, wherein the first process comprises shifting from a first operation mode to a second operation mode.
10. The hand-held portable information processing apparatus of claim 9, wherein the second process comprises an operation indicated on the touch-sensitive display panel as a result of execution of the first operation mode to a second operation mode.
11. A method of operating a portable information processing apparatus, wherein the portable information processing apparatus includes a touch-sensitive display panel, the method comprising:
executing a graphical processing application;
determining a threshold movement distance value;
detecting a first touch point contacted on the surface of the touch-sensitive display panel;
determining a value of a movement distance of the first touch point contacted;
if the determined value of the movement distance of the first touch point contacted is less than the threshold movement distance value, executing a first process in the graphical processing application;
if the determined value of the movement distance of the first touch point contacted exceeds the threshold movement distance value, determining that a second touch point is contacted on the surface of the touch-sensitive display panel in addition to the contacted first touch point, wherein the second touch point corresponds to a graphic object; and
in response to the determination that the second touch point is contacted in addition to the contacted first touch point, executing a second, different process in the graphical processing application, wherein execution of the second process is dependent on execution of the first process.
12. The method of claim 11, wherein the first process comprises shifting from a first operation mode to a second operation mode.
13. The method of claim 12, wherein the second process comprises an operation indicated on the touch-sensitive display panel as a result of the shift from the first operation mode to the second operation mode.
14. The hand-held portable computer of claim 3, wherein the graphic object is rotated around the first point contacted on the surface of the touch-sensitive display panel in the second processing mode.
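The threshold logic recited in claims 5, 6, and 11 amounts to one control flow: track the point first contacted, compare how far its reported position moves between samples against the threshold movement distance value, and treat a super-threshold jump as the determination that a second point is contacted (a resistive panel reports roughly the midpoint of two simultaneous touches, so a sudden large displacement implies a second finger). The following Python sketch is a minimal illustrative reading of the claims, not the patent's implementation; the names THRESHOLD, first_process, and second_process are assumptions introduced here.

    import math

    # Assumed threshold movement distance value, in panel coordinate units;
    # the claims leave the concrete value to the implementation.
    THRESHOLD = 40.0

    def movement(p, q):
        """Euclidean distance between two reported touch coordinates."""
        return math.hypot(q[0] - p[0], q[1] - p[1])

    def dispatch(samples, first_process, second_process):
        """Route touch samples according to the claimed threshold logic.

        samples is an iterable of (x, y) positions reported by the panel
        while contact is maintained.
        """
        samples = iter(samples)
        prev = next(samples, None)      # first point contacted, if any
        if prev is None:
            return
        for point in samples:
            if movement(prev, point) < THRESHOLD:
                # Gradual movement: still a single touch, so execute the
                # first process (e.g. moving an object along the traced path).
                first_process(point)
            else:
                # Movement exceeds the threshold: determine that a second
                # point is contacted in addition to the first, and execute
                # the second, different process (e.g. enlargement, reduction,
                # or rotation about the first point, per claims 8 and 14).
                second_process(prev, point)
            prev = point

For example, dispatch(panel_samples, drag_object, rotate_object) would keep dragging while the reported point moves smoothly, and would switch to the rotation mode the moment a second finger lands and the reported position jumps; panel_samples, drag_object, and rotate_object are likewise hypothetical names.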
US12/412,806 1999-11-04 2009-03-27 Apparatus and method for manipulating a touch-sensitive display panel Expired - Lifetime USRE44258E1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/412,806 USRE44258E1 (en) 1999-11-04 2009-03-27 Apparatus and method for manipulating a touch-sensitive display panel

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JPP11-313536 1999-11-04
JP31353699A JP2001134382A (en) 1999-11-04 1999-11-04 Graphic processor
US09/699,757 US6958749B1 (en) 1999-11-04 2000-10-30 Apparatus and method for manipulating a touch-sensitive display panel
US86294307A 2007-09-27 2007-09-27
US12/412,806 USRE44258E1 (en) 1999-11-04 2009-03-27 Apparatus and method for manipulating a touch-sensitive display panel

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/699,757 Reissue US6958749B1 (en) 1999-11-04 2000-10-30 Apparatus and method for manipulating a touch-sensitive display panel

Publications (1)

Publication Number Publication Date
USRE44258E1 true USRE44258E1 (en) 2013-06-04

Family

ID=18042511

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/699,757 Ceased US6958749B1 (en) 1999-11-04 2000-10-30 Apparatus and method for manipulating a touch-sensitive display panel
US12/412,806 Expired - Lifetime USRE44258E1 (en) 1999-11-04 2009-03-27 Apparatus and method for manipulating a touch-sensitive display panel

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/699,757 Ceased US6958749B1 (en) 1999-11-04 2000-10-30 Apparatus and method for manipulating a touch-sensitive display panel

Country Status (2)

Country Link
US (2) US6958749B1 (en)
JP (1) JP2001134382A (en)

Families Citing this family (175)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9722766D0 (en) 1997-10-28 1997-12-24 British Telecomm Portable computers
US7614008B2 (en) 2004-07-30 2009-11-03 Apple Inc. Operation of a computer with touch screen interface
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US7834855B2 (en) 2004-08-25 2010-11-16 Apple Inc. Wide touchpad on a portable computer
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US20070177804A1 (en) * 2006-01-30 2007-08-02 Apple Computer, Inc. Multi-touch gesture dictionary
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US7469381B2 (en) 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
US8339379B2 (en) * 2004-04-29 2012-12-25 Neonode Inc. Light-based touch screen
US8674966B2 (en) 2001-11-02 2014-03-18 Neonode Inc. ASIC controller for light-based touch screen
US9052777B2 (en) 2001-11-02 2015-06-09 Neonode Inc. Optical elements with alternating reflective lens facets
US9778794B2 (en) 2001-11-02 2017-10-03 Neonode Inc. Light-based touch screen
US7250939B2 (en) 2002-03-19 2007-07-31 Aol Llc Display motion multiplier
US8416217B1 (en) 2002-11-04 2013-04-09 Neonode Inc. Light-based finger gesture user interface
JP4100195B2 (en) 2003-02-26 2008-06-11 ソニー株式会社 Three-dimensional object display processing apparatus, display processing method, and computer program
CN100412766C (en) * 2003-08-29 2008-08-20 诺基亚公司 Method and device for recognizing dual point user input on touch based user input device
JP4148187B2 (en) * 2004-06-03 2008-09-10 ソニー株式会社 Portable electronic device, input operation control method and program thereof
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
KR101270847B1 (en) * 2004-07-30 2013-06-05 애플 인크. Gestures for touch sensitive input devices
US7653883B2 (en) 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device
US7561146B1 (en) 2004-08-25 2009-07-14 Apple Inc. Method and apparatus to reject accidental contact on a touchpad
DE102005038161A1 (en) 2004-12-30 2006-07-13 Volkswagen Ag Input device for cockpit of land vehicle, controls e.g. brushless DC actuator as function of speed of touching movement over operating surface or quantity derived from speed
EP1834229A2 (en) 2004-12-30 2007-09-19 Volkswagen AG Input device and method for the operation thereof
EP1677180A1 (en) * 2004-12-30 2006-07-05 Volkswagen Aktiengesellschaft Touchscreen capable of detecting two simultaneous touch locations
US7760189B2 (en) * 2005-01-21 2010-07-20 Lenovo Singapore Pte. Ltd Touchpad diagonal scrolling
JP4201775B2 (en) 2005-03-02 2008-12-24 株式会社コナミデジタルエンタテインメント Information processing apparatus, information processing apparatus control method, and program
AU2006201734A1 (en) 2005-04-27 2006-11-16 Aruze Corp. Gaming machine
JP2007058513A (en) 2005-08-24 2007-03-08 Sony Corp Controller and method, and program
WO2007079425A2 (en) * 2005-12-30 2007-07-12 Apple Inc. Portable electronic device with multi-touch input
US20070152983A1 (en) 2005-12-30 2007-07-05 Apple Computer, Inc. Touch pad with symbols based on mode
JP2007188233A (en) * 2006-01-12 2007-07-26 Victor Co Of Japan Ltd Touch panel input device
JP2007241410A (en) * 2006-03-06 2007-09-20 Pioneer Electronic Corp Display device and display control method
TWI328185B (en) * 2006-04-19 2010-08-01 Lg Electronics Inc Touch screen device for portable terminal and method of displaying and selecting menus thereon
US20090213086A1 (en) * 2006-04-19 2009-08-27 Ji Suk Chae Touch screen device and operating method thereof
KR20070113018A (en) * 2006-05-24 2007-11-28 엘지전자 주식회사 Apparatus and operating method of touch screen
KR101269375B1 (en) * 2006-05-24 2013-05-29 엘지전자 주식회사 Touch screen apparatus and image displaying method of touch screen
KR20070113022A (en) * 2006-05-24 2007-11-28 엘지전자 주식회사 Apparatus and operating method of touch screen responds to user input
KR20070113025A (en) * 2006-05-24 2007-11-28 엘지전자 주식회사 Apparatus and operating method of touch screen
KR101327581B1 (en) * 2006-05-24 2013-11-12 엘지전자 주식회사 Apparatus and Operating method of touch screen
US8683362B2 (en) 2008-05-23 2014-03-25 Qualcomm Incorporated Card metaphor for activities in a computing device
US20090278806A1 (en) * 2008-05-06 2009-11-12 Matias Gonzalo Duarte Extended touch-sensitive control area for electronic device
US8296684B2 (en) 2008-05-23 2012-10-23 Hewlett-Packard Development Company, L.P. Navigating among activities in a computing device
KR100858014B1 (en) 2006-04-21 2008-09-11 이-리드 일렉트로닉 코포레이션, 리미티드 Composite cursor input method
TW200805131A (en) * 2006-05-24 2008-01-16 Lg Electronics Inc Touch screen device and method of selecting files thereon
US8022935B2 (en) 2006-07-06 2011-09-20 Apple Inc. Capacitance sensing electrode with integrated I/O mechanism
JP2008012199A (en) * 2006-07-10 2008-01-24 Aruze Corp Game system and image display control method thereof
US7870508B1 (en) 2006-08-17 2011-01-11 Cypress Semiconductor Corporation Method and apparatus for controlling display of data on a display screen
US8284165B2 (en) 2006-10-13 2012-10-09 Sony Corporation Information display apparatus with proximity detection performance and information display method using the same
US20080158171A1 (en) * 2006-12-29 2008-07-03 Wong Hong W Digitizer for flexible display
US7872652B2 (en) * 2007-01-07 2011-01-18 Apple Inc. Application programming interfaces for synchronization
US20080168478A1 (en) * 2007-01-07 2008-07-10 Andrew Platzer Application Programming Interfaces for Scrolling
US7903115B2 (en) * 2007-01-07 2011-03-08 Apple Inc. Animations
US8813100B1 (en) 2007-01-07 2014-08-19 Apple Inc. Memory management
US7844915B2 (en) * 2007-01-07 2010-11-30 Apple Inc. Application programming interfaces for scrolling operations
US20080168402A1 (en) 2007-01-07 2008-07-10 Christopher Blumenberg Application Programming Interfaces for Gesture Operations
US8656311B1 (en) 2007-01-07 2014-02-18 Apple Inc. Method and apparatus for compositing various types of content
JP2008176351A (en) * 2007-01-16 2008-07-31 Seiko Epson Corp Image printing apparatus, and method for executing processing in image printing apparatus
US7884805B2 (en) 2007-04-17 2011-02-08 Sony Ericsson Mobile Communications Ab Using touches to transfer information between devices
US8797272B2 (en) * 2007-05-15 2014-08-05 Chih-Feng Hsu Electronic devices with preselected operational characteristics, and associated methods
TWI357012B (en) * 2007-05-15 2012-01-21 Htc Corp Method for operating user interface and recording
US8436815B2 (en) * 2007-05-25 2013-05-07 Microsoft Corporation Selective enabling of multi-input controls
KR101403839B1 (en) * 2007-08-16 2014-06-03 엘지전자 주식회사 Mobile communication terminal with touchscreen and display control method thereof
DE102007039444A1 (en) 2007-08-21 2009-02-26 Volkswagen Ag Method for displaying information in a motor vehicle and display device for a motor vehicle
DE102007039446A1 (en) * 2007-08-21 2009-02-26 Volkswagen Ag A method of displaying information in a variable scale vehicle and display device
JP2009059141A (en) * 2007-08-31 2009-03-19 J Touch Corp Resistance type touch panel controller structure and method for discrimination and arithmetic operation of multi-point coordinate
CN101382851A (en) * 2007-09-06 2009-03-11 鸿富锦精密工业(深圳)有限公司 Computer system
CN101399897B (en) * 2007-09-30 2010-12-29 宏达国际电子股份有限公司 Image processing method
EP2232355B1 (en) * 2007-11-07 2012-08-29 N-Trig Ltd. Multi-point detection on a single-point detection digitizer
US20090122018A1 (en) * 2007-11-12 2009-05-14 Leonid Vymenets User Interface for Touchscreen Device
KR20090058073A (en) * 2007-12-04 2009-06-09 삼성전자주식회사 Terminal and method for performing function thereof
TW200925966A (en) * 2007-12-11 2009-06-16 J Touch Corp Method of controlling multi-point controlled controller
US20090174679A1 (en) 2008-01-04 2009-07-09 Wayne Carl Westerman Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface
TWI460621B (en) * 2008-01-21 2014-11-11 Elan Microelectronics Corp Touch pad for processing a multi-object operation and method using in the same
US9024895B2 (en) 2008-01-21 2015-05-05 Elan Microelectronics Corporation Touch pad operable with multi-objects and method of operating same
US20090184939A1 (en) * 2008-01-23 2009-07-23 N-Trig Ltd. Graphical object manipulation with a touch sensitive screen
US8446373B2 (en) 2008-02-08 2013-05-21 Synaptics Incorporated Method and apparatus for extended adjustment based on relative positioning of multiple objects contemporaneously in a sensing region
JP4646991B2 (en) 2008-02-14 2011-03-09 株式会社コナミデジタルエンタテインメント Selection determination apparatus, selection determination method, and program
US20090207140A1 (en) * 2008-02-19 2009-08-20 Sony Ericsson Mobile Communications Ab Identifying and responding to multiple time-overlapping touches on a touch panel
US8645827B2 (en) 2008-03-04 2014-02-04 Apple Inc. Touch event model
US8717305B2 (en) 2008-03-04 2014-05-06 Apple Inc. Touch event model for web pages
US8416196B2 (en) 2008-03-04 2013-04-09 Apple Inc. Touch event model programming interface
US8174502B2 (en) * 2008-03-04 2012-05-08 Apple Inc. Touch event processing for web pages
US8788967B2 (en) * 2008-04-10 2014-07-22 Perceptive Pixel, Inc. Methods of interfacing with multi-input devices and multi-input display systems employing interfacing techniques
US8745514B1 (en) 2008-04-11 2014-06-03 Perceptive Pixel, Inc. Pressure-sensitive layering of displayed objects
US10180714B1 (en) * 2008-04-24 2019-01-15 Pixar Two-handed multi-stroke marking menus for multi-touch devices
US8799821B1 (en) 2008-04-24 2014-08-05 Pixar Method and apparatus for user inputs for three-dimensional animation
JP2009276819A (en) * 2008-05-12 2009-11-26 Fujitsu Ltd Method for controlling pointing device, pointing device and computer program
SG157240A1 (en) * 2008-05-14 2009-12-29 Pratt & Whitney Services Pte Ltd Compressor stator chord restoration repair method and apparatus
JP5164675B2 (en) * 2008-06-04 2013-03-21 キヤノン株式会社 User interface control method, information processing apparatus, and program
KR101498623B1 (en) * 2008-06-25 2015-03-04 엘지전자 주식회사 Mobile Terminal Capable of Previewing Different Channel
US20090322700A1 (en) * 2008-06-30 2009-12-31 Tyco Electronics Corporation Method and apparatus for detecting two simultaneous touches and gestures on a resistive touchscreen
US20090322701A1 (en) * 2008-06-30 2009-12-31 Tyco Electronics Corporation Method and apparatus for detecting two simultaneous touches and gestures on a resistive touchscreen
CN102099775B (en) * 2008-07-17 2014-09-24 日本电气株式会社 Information processing apparatus, storage medium on which program has been recorded, and object shifting method
KR101009881B1 (en) * 2008-07-30 2011-01-19 삼성전자주식회사 Apparatus and method for zoom display of target area from reproducing image
US20100073303A1 (en) * 2008-09-24 2010-03-25 Compal Electronics, Inc. Method of operating a user interface
JP2010086230A (en) * 2008-09-30 2010-04-15 Sony Corp Information processing apparatus, information processing method and program
TW201013485A (en) 2008-09-30 2010-04-01 Tpk Touch Solutions Inc Touch-control position sensing method for a touch-control device
US8174504B2 (en) 2008-10-21 2012-05-08 Synaptics Incorporated Input device and method for adjusting a parameter of an electronic system
US20100194701A1 (en) * 2008-10-28 2010-08-05 Hill Jared C Method of recognizing a multi-touch area rotation gesture
TWI397852B (en) * 2008-11-12 2013-06-01 Htc Corp Function selection systems and methods, and machine readable medium thereof
US8294047B2 (en) 2008-12-08 2012-10-23 Apple Inc. Selective input signal rejection and modification
EP2370878B8 (en) 2008-12-29 2019-06-19 Hewlett-Packard Development Company, L.P. Gesture detection zones
KR20100078295A (en) * 2008-12-30 2010-07-08 삼성전자주식회사 Apparatus and method for controlling operation of portable terminal using different touch zone
JP4913834B2 (en) * 2009-01-23 2012-04-11 シャープ株式会社 Information processing apparatus, control method, and program
KR101544364B1 (en) * 2009-01-23 2015-08-17 삼성전자주식회사 Mobile terminal having dual touch screen and method for controlling contents thereof
US8775023B2 (en) 2009-02-15 2014-07-08 Neonode Inc. Light-based touch controls on a steering wheel and dashboard
US8345019B2 (en) * 2009-02-20 2013-01-01 Elo Touch Solutions, Inc. Method and apparatus for two-finger touch coordinate recognition and rotation gesture recognition
CN101833388B (en) * 2009-03-13 2012-02-29 北京京东方光电科技有限公司 Touch display and method for determining positions of touch points
US8285499B2 (en) 2009-03-16 2012-10-09 Apple Inc. Event recognition
US8566045B2 (en) 2009-03-16 2013-10-22 Apple Inc. Event recognition
US9684521B2 (en) 2010-01-26 2017-06-20 Apple Inc. Systems having discrete and continuous gesture recognizers
US8566044B2 (en) * 2009-03-16 2013-10-22 Apple Inc. Event recognition
US9311112B2 (en) 2009-03-16 2016-04-12 Apple Inc. Event recognition
KR101666995B1 (en) * 2009-03-23 2016-10-17 삼성전자주식회사 Multi-telepointer, virtual object display device, and virtual object control method
US20100251112A1 (en) * 2009-03-24 2010-09-30 Microsoft Corporation Bimodal touch sensitive digital notebook
KR101510484B1 (en) 2009-03-31 2015-04-08 엘지전자 주식회사 Mobile Terminal And Method Of Controlling Mobile Terminal
KR101553629B1 (en) * 2009-05-06 2015-09-17 삼성전자주식회사 Method of Providing Interface
JP5141984B2 (en) * 2009-05-11 2013-02-13 ソニー株式会社 Information processing apparatus and method
US8355007B2 (en) * 2009-05-11 2013-01-15 Adobe Systems Incorporated Methods for use with multi-touch displays for determining when a touch is processed as a mouse event
US8677282B2 (en) * 2009-05-13 2014-03-18 International Business Machines Corporation Multi-finger touch adaptations for medical imaging systems
US8375295B2 (en) * 2009-05-21 2013-02-12 Sony Computer Entertainment Inc. Customization of GUI layout based on history of use
KR101597553B1 (en) * 2009-05-25 2016-02-25 엘지전자 주식회사 Function execution method and apparatus thereof
US8359544B2 (en) * 2009-05-28 2013-01-22 Microsoft Corporation Automated content submission to a share site
JP4798268B2 (en) 2009-07-17 2011-10-19 カシオ計算機株式会社 Electronic equipment and programs
JP5325060B2 (en) * 2009-09-18 2013-10-23 株式会社バンダイナムコゲームス Program, information storage medium and image control system
KR101446644B1 (en) * 2009-10-30 2014-10-01 삼성전자 주식회사 Image forming apparatus and menu selectㆍdisplay method thereof
US8957918B2 (en) * 2009-11-03 2015-02-17 Qualcomm Incorporated Methods for implementing multi-touch gestures on a single-touch touch surface
US20110138284A1 (en) * 2009-12-03 2011-06-09 Microsoft Corporation Three-state touch input system
US20110148786A1 (en) * 2009-12-18 2011-06-23 Synaptics Incorporated Method and apparatus for changing operating modes
US9465532B2 (en) 2009-12-18 2016-10-11 Synaptics Incorporated Method and apparatus for operating in pointing and enhanced gesturing modes
US8416215B2 (en) 2010-02-07 2013-04-09 Itay Sherman Implementation of multi-touch gestures using a resistive touch display
JP2011197848A (en) * 2010-03-18 2011-10-06 Rohm Co Ltd Touch-panel input device
US9250800B2 (en) 2010-02-18 2016-02-02 Rohm Co., Ltd. Touch-panel input device
JP2011227703A (en) * 2010-04-20 2011-11-10 Rohm Co Ltd Touch panel input device capable of two-point detection
JP2011180843A (en) 2010-03-01 2011-09-15 Sony Corp Apparatus and method for processing information, and program
US20110216095A1 (en) * 2010-03-04 2011-09-08 Tobias Rydenhag Methods, Devices, and Computer Program Products Providing Multi-Touch Drag and Drop Operations for Touch-Sensitive User Interfaces
JP5477108B2 (en) * 2010-03-29 2014-04-23 日本電気株式会社 Information processing apparatus, control method therefor, and program
WO2011135944A1 (en) * 2010-04-30 2011-11-03 日本電気株式会社 Information processing terminal and operation control method for same
KR101675597B1 (en) * 2010-06-08 2016-11-11 현대모비스 주식회사 System and method for assistant parking with improved hmi in setting up target of parking space
US10216408B2 (en) 2010-06-14 2019-02-26 Apple Inc. Devices and methods for identifying user interface objects based on view hierarchy
US8922499B2 (en) 2010-07-26 2014-12-30 Apple Inc. Touch input transitions
US8543942B1 (en) * 2010-08-13 2013-09-24 Adobe Systems Incorporated Method and system for touch-friendly user interfaces
KR101657122B1 (en) * 2010-09-15 2016-09-30 엘지전자 주식회사 Mobile terminal and method for controlling the same
JP2012088762A (en) 2010-10-15 2012-05-10 Touch Panel Systems Kk Touch panel input device and gesture detection method
TWI441052B (en) * 2011-02-24 2014-06-11 Avermedia Tech Inc Gesture manipulation method and multimedia display apparatus
US9547428B2 (en) 2011-03-01 2017-01-17 Apple Inc. System and method for touchscreen knob control
US9298363B2 (en) 2011-04-11 2016-03-29 Apple Inc. Region activation for touch sensitive surface
JP5304849B2 (en) * 2011-06-07 2013-10-02 カシオ計算機株式会社 Electronic equipment and programs
JP5360140B2 (en) * 2011-06-17 2013-12-04 コニカミノルタ株式会社 Information browsing apparatus, control program, and control method
JP5618926B2 (en) * 2011-07-11 2014-11-05 株式会社セルシス Multipointing device control method and program
JP5374564B2 (en) 2011-10-18 2013-12-25 株式会社ソニー・コンピュータエンタテインメント Drawing apparatus, drawing control method, and drawing control program
KR101885132B1 (en) * 2011-11-23 2018-09-11 삼성전자주식회사 Apparatus and method for input by touch in user equipment
KR20130061993A (en) * 2011-12-02 2013-06-12 (주) 지.티 텔레콤 The operating method of touch screen
DE102011056940A1 (en) 2011-12-22 2013-06-27 Bauhaus Universität Weimar A method of operating a multi-touch display and device having a multi-touch display
US8963867B2 (en) 2012-01-27 2015-02-24 Panasonic Intellectual Property Management Co., Ltd. Display device and display method
KR101952219B1 (en) 2012-04-04 2019-02-26 삼성전자 주식회사 Operating Method For Icon displaying on the Electronic Device And Electronic Device thereof
JP5634442B2 (en) * 2012-06-26 2014-12-03 京セラドキュメントソリューションズ株式会社 Display input device and image forming apparatus
US9507513B2 (en) 2012-08-17 2016-11-29 Google Inc. Displaced double tap gesture
JP6025482B2 (en) 2012-09-28 2016-11-16 富士ゼロックス株式会社 Display control device, image display device, and program
US9092093B2 (en) 2012-11-27 2015-07-28 Neonode Inc. Steering wheel user interface
US12032817B2 (en) 2012-11-27 2024-07-09 Neonode Inc. Vehicle user interface
TWI478005B (en) * 2012-12-19 2015-03-21 Inventec Corp Protecting system for application of handheld device and method thereof
KR101984592B1 (en) 2013-01-04 2019-05-31 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN104981758B (en) 2013-01-15 2018-10-02 瑟克公司 It is searched for using the more finger-shaped materials of multidimensional with range over-sampling climbing method and down-hill method
JP5611380B2 (en) * 2013-01-24 2014-10-22 エヌ・ティ・ティ・コミュニケーションズ株式会社 Terminal device, input control method, and program
JP5742870B2 (en) * 2013-04-17 2015-07-01 カシオ計算機株式会社 Electronic equipment and programs
US9733716B2 (en) 2013-06-09 2017-08-15 Apple Inc. Proxy gesture recognizer
EP2816460A1 (en) * 2013-06-21 2014-12-24 BlackBerry Limited Keyboard and touch screen gesture system
US20160202865A1 (en) 2015-01-08 2016-07-14 Apple Inc. Coordination of static backgrounds and rubberbanding
JP2016015126A (en) * 2015-05-29 2016-01-28 利仁 曽根 Resize request determination method
KR102464280B1 (en) * 2015-11-06 2022-11-08 삼성전자주식회사 Input processing method and device
JP6230136B2 (en) * 2016-07-27 2017-11-15 株式会社スクウェア・エニックス Information processing apparatus, information processing method, and game apparatus
CN113165515B (en) 2018-11-28 2021-11-02 内奥诺德公司 Driver user interface sensor
CA3071758A1 (en) 2019-02-07 2020-08-07 1004335 Ontario Inc. Methods for two-touch detection with resistive touch sensor and related apparatuses and systems
US11669210B2 (en) 2020-09-30 2023-06-06 Neonode Inc. Optical touch sensor

Patent Citations (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4703316A (en) 1984-10-18 1987-10-27 Tektronix, Inc. Touch panel input apparatus
US4758690A (en) 1986-07-12 1988-07-19 Alps Electric Co., Ltd. Coordinate detecting method
US5016008A (en) 1987-05-25 1991-05-14 Sextant Avionique Device for detecting the position of a control member on a touch-sensitive pad
US4914624A (en) 1988-05-06 1990-04-03 Dunthorn David I Virtual button for touch screen
US5844547A (en) * 1991-10-07 1998-12-01 Fujitsu Limited Apparatus for manipulating an object displayed on a display device by using a touch screen
US7345675B1 (en) * 1991-10-07 2008-03-18 Fujitsu Limited Apparatus for manipulating an object displayed on a display device by using a touch screen
US6597347B1 (en) * 1991-11-26 2003-07-22 Itu Research Inc. Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom
US6034672A (en) * 1992-01-17 2000-03-07 Sextant Avionique Device for multimode management of a cursor on the screen of a display device
US5483261A (en) * 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
US5241139A (en) 1992-03-25 1993-08-31 International Business Machines Corporation Method and apparatus for determining the position of a member contacting a touch screen
US6414671B1 (en) * 1992-06-08 2002-07-02 Synaptics Incorporated Object position detector with edge motion feature and gesture recognition
US5821930A (en) * 1992-08-23 1998-10-13 U S West, Inc. Method and system for generating a working window in a computer system
US6008800A (en) * 1992-09-18 1999-12-28 Pryor; Timothy R. Man machine interfaces for entering data into a computer
US5796406A (en) * 1992-10-21 1998-08-18 Sharp Kabushiki Kaisha Gesture-based input information processing apparatus
US5345543A (en) * 1992-11-16 1994-09-06 Apple Computer, Inc. Method for manipulating objects on a computer display
US5563632A (en) * 1993-04-30 1996-10-08 Microtouch Systems, Inc. Method of and apparatus for the elimination of the effects of internal interference in force measurement systems, including touch-input computer and related displays employing touch force location measurement techniques
JPH07230352A (en) 1993-09-16 1995-08-29 Hitachi Ltd Touch position detecting device and touch instruction processor
US5670987A (en) * 1993-09-21 1997-09-23 Kabushiki Kaisha Toshiba Virtual manipulating apparatus and method
US5638093A (en) * 1993-12-07 1997-06-10 Seiko Epson Corporation Touch panel input device and control method thereof
US5500935A (en) * 1993-12-30 1996-03-19 Xerox Corporation Apparatus and method for translating graphic objects and commands with direct touch input in a touch based input system
JPH0854976A (en) 1994-08-10 1996-02-27 Matsushita Electric Ind Co Ltd Resistance film system touch panel
US5880743A (en) * 1995-01-24 1999-03-09 Xerox Corporation Apparatus and method for implementing visual animation illustrating results of interactive editing operations
US6255604B1 (en) 1995-05-31 2001-07-03 Canon Kabushiki Kaisha Coordinate detecting device for outputting coordinate data when two points are simultaneously depressed, method therefor and computer control device
JPH0934625A (en) 1995-07-20 1997-02-07 Canon Inc Method and device for coordinate detection and computer controller
JPH0934626A (en) 1995-07-21 1997-02-07 Alps Electric Co Ltd Coordinate input device
US5943043A (en) 1995-11-09 1999-08-24 International Business Machines Corporation Touch panel "double-touch" input method and detection apparatus
US5825352A (en) 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
US5861886A (en) * 1996-06-26 1999-01-19 Xerox Corporation Method and apparatus for grouping graphic objects on a computer based system having a graphical user interface
US6323847B1 (en) * 1997-12-24 2001-11-27 Fujitsu Limited Correction of view-angle-dependent characteristics for display device
US6392638B2 (en) * 1998-01-16 2002-05-21 Sony Corporation Information processing apparatus and display control method of the same information processing apparatus
US6421042B1 (en) * 1998-06-09 2002-07-16 Ricoh Company, Ltd. Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system
US6347290B1 (en) * 1998-06-24 2002-02-12 Compaq Information Technologies Group, L.P. Apparatus and method for detecting and executing positional and gesture commands corresponding to movement of handheld computing device
US6518960B2 (en) * 1998-07-30 2003-02-11 Ricoh Company, Ltd. Electronic blackboard system
JP2000163193A (en) 1998-11-25 2000-06-16 Seiko Epson Corp Portable information equipment and information storage medium
US6400376B1 (en) * 1998-12-21 2002-06-04 Ericsson Inc. Display control for hand-held data processing device
US6466198B1 (en) * 1999-11-05 2002-10-15 Innoventions, Inc. View navigation and magnification of a hand-held device with a display

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Office Action dated Oct. 30, 2008 for U.S. Appl. No. 11/862,943.

Also Published As

Publication number Publication date
JP2001134382A (en) 2001-05-18
US6958749B1 (en) 2005-10-25

Similar Documents

Publication Publication Date Title
USRE44258E1 (en) Apparatus and method for manipulating a touch-sensitive display panel
KR101085603B1 (en) Gesturing with a multipoint sensing device
JP4295280B2 (en) Method and apparatus for recognizing two-point user input with a touch-based user input device
US7091954B2 (en) Computer keyboard and cursor control system and method with keyboard map switching
CN106909305B (en) Method and apparatus for displaying graphical user interface
US7602382B2 (en) Method for displaying information responsive to sensing a physical presence proximate to a computer input device
KR100636184B1 (en) Location control method and apparatus therefor of display window displayed in display screen of information processing device
US5724531A (en) Method and apparatus of manipulating an object on a display
US20060114225A1 (en) Cursor function switching method
JP2010517197A (en) Gestures with multipoint sensing devices
US20060119588A1 (en) Apparatus and method of processing information input using a touchpad
JP2011028524A (en) Information processing apparatus, program and pointing method
EP1241558A2 (en) Method for providing feedback responsive to sensing a physical presence proximate to a control of an electronic device
US20050190147A1 (en) Pointing device for a terminal having a touch screen and method for using the same
WO1998000775A9 (en) Touchpad with scroll and pan regions
KR100381583B1 (en) Method for transmitting a user data in personal digital assistant
JP6293209B2 (en) Information processing apparatus, erroneous operation suppression method, and program
JPH10228350A (en) Input device
JP5275429B2 (en) Information processing apparatus, program, and pointing method
JP2000181617A (en) Touch pad and scroll control method by touch pad
JP2000194493A (en) Pointing device
US20210064229A1 (en) Control method of user interface and electronic device
JPH09167058A (en) Information processor
JP5330175B2 (en) Touchpad, information processing terminal, touchpad control method, and program
JP3197764B2 (en) Input device

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 12