US5859686A - Eye finding and tracking system - Google Patents
- Publication number
- US5859686A (application US08/858,841)
- Authority
- US
- United States
- Prior art keywords
- subject
- pixel
- matrix
- location
- pixel block
- Prior art date: 1997-05-19
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
Description
- This invention relates to a system and method for finding and tracking the location of a subject's eyes.
- Eye finding and tracking devices are employed for many purposes. For example, such devices are often used in vehicle operator drowsiness and intoxication detection systems.
- When a person is responsible for operating a vehicle, such as an automobile, it is critical that the person be capable of demonstrating the basic cognitive and motor skills that assure the safe operation of the vehicle. Lack of sleep, boredom, or consumption of drugs or alcohol can impair a vehicle operator's ability to operate the vehicle safely. Therefore, it is important when designing an impaired driver detection system to continuously evaluate an operator's ability to control the vehicle. Impaired driver detection systems are useful because they avoid or reduce personal injury and property damage by preventing accidents.
- One known apparatus for eye finding and tracking, employed in a vehicle operator drowsiness detection system, uses two images of a driver's face. One of the images is obtained by illuminating the driver's face from a first direction, and the other is obtained by illuminating the driver's face from another direction. The two images are processed to detect three-dimensional positions of the driver's eyes.
- In another known system, an image of a vehicle driver's head is processed to determine the widest portion of the driver's face. From this determination, calculations based on an analysis of the image and assumed geometric relationships between the eyes and the widest portion of the operator's face define two rectangular regions which should include the operator's eyes.
- Eye finding and tracking devices are also useful for other applications. For instance, these devices can be employed in systems designed to monitor the attentiveness of a machine operator, or a person working at a control console such as an air traffic controller. Another example where eye finding and tracking can be useful is in conjunction with an identification system which employs iris pattern matching techniques.
- The above-described objectives are realized with embodiments of the present invention directed to a system and method for finding and tracking the location of a subject's eyes.
- The system and method employ an imaging apparatus which produces digital image frames including the face and eyes of a subject.
- Each digital image frame comprises an array of pixel values representing the intensity of light reflected from the face of the subject. These intensity representing pixel values are located at positions in the array specified by x and y coordinates.
- An eye position finding and tracking apparatus is used to average the intensity representing pixel values within respective Mx by My pixel blocks of a digitized image frame to create the elements of plural output matrices. Elements of the output matrices are then compared to various threshold values. These threshold values are chosen so as to identify which matrix elements correspond to an Mx by My pixel block whose pixel values potentially represent an image of the subject's pupil and at least a portion of the subject's iris.
- The aforementioned averaging process involves respectively averaging the intensity representing pixel values within three vertically arranged Mx by My pixel blocks in the upper left-hand corner of a first digitized image frame to create the first three-element column of a first output matrix.
- Next, the intensity representing pixel values within three vertically arranged Mx by My pixel blocks, each of which is offset to the right by one column of the array in relation to the last averaged pixel blocks, are respectively averaged to create a new column of the first output matrix.
- This last step is repeated until the three vertically arranged Mx by My pixel blocks which include pixel values from the last column of the array have been averaged to create the last column of the first output matrix.
- The intensity representing pixel values within three more vertically arranged Mx by My pixel blocks are then respectively averaged, starting at a position on the left-hand side of the array which is offset downward by one row in relation to the previously averaged pixel blocks. This averaging creates the first three-element column of a new output matrix.
- The intensity representing pixel values within three vertically arranged Mx by My pixel blocks, each of which is offset to the right by one column of the array in relation to the last averaged pixel blocks, are respectively averaged to create a new column of the new output matrix.
- This step is then repeated until the three vertically arranged Mx by My pixel blocks which include pixel values from the last column of the array have been averaged to create the last column of the new output matrix.
- The above-described process of averaging pixel blocks offset downward by one row is continued until the three vertically arranged Mx by My pixel blocks which include pixel values from the last column and row of the array have been averaged to create the last column of a last output matrix.
- The aforementioned comparing process specifically includes comparing each element of each output matrix to a threshold range.
- The threshold range has a lower limit representing the lowest expected average of the intensity representing pixel values within an Mx by My pixel block corresponding to an image of the subject's pupil and at least a portion of the subject's iris for the particular illumination condition present at the time the image was produced.
- The upper limit of the threshold range represents the highest expected average of the intensity representing pixel values within the Mx by My pixel block corresponding to the image of the subject's pupil and at least the portion of the subject's iris for the particular illumination condition present at the time the image was produced.
- Any output matrix element which both exceeds the lower limit of the threshold range and is less than the upper limit of the threshold range is flagged.
- Threshold values are then compared to the average of the intensity representing pixel values within each Mx by My pixel block immediately surrounding any pixel block corresponding to a flagged output matrix element. Any previously flagged output matrix element which has a surrounding pixel block that fails to meet the threshold criteria is de-flagged (i.e. de-selected).
- The system and method of the present invention can also include provisions for identifying groups of flagged matrix elements whose associated Mx by My pixel blocks share pixel values with the Mx by My pixel block corresponding to a different flagged matrix element. It is then determined which of these flagged matrix elements has the smallest value (i.e. the darkest block average), or is positioned in the center of a group of elements each having identical values. Any of the flagged matrix elements not so determined are de-flagged.
- The system and method can include a provision for determining if an Mx by My pixel block corresponding to a flagged matrix element is within prescribed horizontal and vertical distance ranges from an Mx by My pixel block corresponding to another flagged matrix element.
- The prescribed horizontal distance range has an upper limit corresponding to the maximum expected horizontal eye separation of a subject, and a lower limit corresponding to the minimum expected horizontal eye separation of a subject.
- The prescribed vertical distance range has an upper limit corresponding to the maximum expected vertical eye separation of a subject, and a lower limit corresponding to the minimum expected vertical eye separation of a subject.
- Any Mx by My pixel block corresponding to a flagged matrix element which satisfies these ranges can be designated as an actual subject eye location.
- It is preferred, however, that further steps be taken to confirm the actual eye location, as well as to track the changing position of the subject's eye.
- To this end, the system and method of the present invention include a provision for tracking the location of the center of each Mx by My pixel block potentially representing the image of the subject's pupil and at least a portion of the subject's iris in each subsequent image frame produced by the imaging apparatus.
- A provision is also preferably included to detect a blink at the location. Such a blink confirms that the potential eye location is an actual eye location.
- The aforementioned preferred provision for tracking and detecting a blink involves extracting a rectangular block of pixels centered at a current potential eye location.
- The current rectangular block is correlated with a block extracted during the aforementioned eye finding process. If the computed correlation coefficient fails to exceed a threshold for a brief period consistent with a blink duration, and then exceeds it once again, a blink has been detected.
- The center of the Mx by My pixel block corresponding to the location where a blink has been detected is then confirmed as an actual eye location.
- This preferred tracking and blink detecting provision involves first determining the center of the Mx by My pixel block associated with each previously-identified potential eye location.
- Next, cut-out pixel blocks are selected in the next consecutive image frame produced by the imaging apparatus.
- Each of these cut-out pixel blocks respectively corresponds to the location of an Mx by My pixel block in the immediately preceding image frame which was identified as a potential eye location (i.e. a pixel block potentially representing the image of the subject's pupil and at least a portion of the subject's iris), and includes a number of pixels which surround the identified pixel block.
- This number of pixels equates to the pixels contained in all surrounding Mx by My pixel blocks which are adjacent to the identified pixel block.
- In addition, respective matrices of intensity representing pixel values are formed from an area surrounding the center of each Mx by My pixel block identified as a potential eye location in the last preceding image frame in which such a center was determined.
- This area encompasses the identified pixel block and all surrounding adjacent pixel blocks of the same size.
- The appropriate one of these matrices is then correlated with each element of the associated cut-out block. This correlation is performed by sequentially overlaying the center element of the matrix onto each pixel value of the associated cut-out block, starting with the upper left-hand corner, and performing a correlation between the overlaid matrix and the cut-out block for each overlaid cut-out block pixel location.
- The result of this correlation is a matrix of correlation coefficients associated with each cut-out block. These correlation coefficient matrices are compared to a correlation threshold value, and any element which exceeds the correlation threshold value in each of the correlation coefficient matrices is flagged.
- The correlation threshold value is chosen so as to ensure a substantial degree of correlation between an overlaid matrix and an associated cut-out block. If more than one element of a correlation coefficient matrix is flagged, it is determined which of the flagged elements has the greatest value. The element having the greatest value corresponds to the center pixel value of the Mx by My pixel block of the potential eye location in the image frame currently being processed.
- The preferred provision for detecting that a blink has occurred at a location identified as a potential eye is accomplished by monitoring the number of consecutive image frames which exhibit the aforementioned lack-of-flagged-elements condition for a particular location. If this number does not exceed a prescribed threshold typical of a blink duration (for example, approximately seven frames in the tested embodiment) before elements of the correlation coefficient matrix corresponding to the location are once again flagged, then a blink has been detected. The center of the Mx by My pixel block corresponding to the location where the blink has been detected is then confirmed as an actual subject eye location.
- The system and method may also have a provision for assigning a low confidence status to a location identified as a potential eye location whenever the number of consecutive times the aforementioned lack-of-flagged-elements condition is detected for the correlation coefficient matrix associated with the location exceeds a prescribed threshold value (for example, 150 times in the tested embodiment).
- This low confidence status indicates the location is not likely to be an actual subject's eye.
- Similarly, a low confidence status is assigned to a potential eye location whenever a blink is not detected at the location for a prescribed number of consecutive image frames (e.g. 150 frames in the tested embodiment).
- A low confidence status is also assigned to a location previously identified as an actual eye location whenever the number of consecutive times the lack-of-flagged-elements condition associated with this location exceeds a prescribed threshold (for example, 150 times in the tested embodiment) and there is no other location identified as an actual eye location within the previously-described horizontal and vertical distance ranges of it.
- FIG. 1 is a schematic diagram showing one embodiment of an eye finding and tracking system in accordance with the present invention.
- FIG. 2 is a preferred overall flow diagram of the process used in the eye finding and tracking unit of FIG. 1.
- FIG. 3 is a flow diagram of a process for identifying potential eye locations (and optionally actual eye locations) within an image frame produced by the imaging apparatus of FIG. 1.
- FIG. 4 is an idealized diagram of the pixels in an image frame including various exemplary pixel block designations applicable to the process of FIG. 3.
- FIG. 5 is a flow diagram of a process for tracking eye locations in successive image frames produced by the imaging apparatus of FIG. 1, as well as a process of detecting a blink at a potential eye location to identify it as an actual eye location.
- FIG. 6 is a diagram showing a cut-out block of an image frame applicable to the process of FIG. 5.
- FIG. 7 is a flow diagram of a process for monitoring potential and actual eye locations, and for reinitializing the eye finding and tracking system if all monitored eye locations are deemed low confidence locations.
- FIG. 1 depicts an eye finding system embodying the present invention.
- The system includes an imaging apparatus 10, which may be a digital camera or a television camera connected to a frame grabber device, as is known in the art.
- The imaging apparatus 10 is located in front of a subject 12, so as to image his or her face.
- The output of the imaging apparatus 10 is a signal representing digitized images of the subject's face.
- The digitized images are provided at a rate of about 30 frames per second.
- Each frame preferably consists of a 640 by 480 array of pixels, each having one of 256 (i.e. 0 to 255) gray tones representative of the intensity of light reflected from a portion of the subject's face.
- The output signal from the imaging apparatus is fed into an eye finding and tracking unit 14.
- The unit 14 processes each image frame produced by the imaging apparatus 10 to detect the positions of the subject's eyes and to track these eye positions over time.
- The eye finding and tracking unit 14 can employ a digital computer to accomplish the image processing task, or alternately, the processing could be performed by logic circuitry specifically designed for the task.
- The system can also include an infrared light source 16, and the eye finding and tracking unit 14 would be used to control this light source 16.
- The infrared light source 16 is activated by the unit 14 whenever it is needed to effectively image the subject's face. Specifically, the light source would be activated to illuminate the subject's face at night, or when the ambient lighting conditions are too low to obtain an image.
- To this end, the unit 14 includes a sensor capable of determining when the ambient lighting conditions are inadequate.
- The light source would also be employed when the subject 12 is wearing non-reflective sunglasses, as these types of sunglasses are transparent to infrared light.
- The subject could indicate that sunglasses are being worn, such as by depressing a control switch on the eye finding and tracking unit 14, thereby causing the infrared light source 16 to be activated.
- Alternately, the infrared light source 16 could be activated automatically by the unit 14, for example, when the subject's eyes cannot be found otherwise.
- In either case, the imaging apparatus 10 would be of the type capable of sensing infrared light.
- FIG. 2 is an overall flow diagram of the preferred process used to find and track the location of a subject's eyes.
- First, an image frame of the subject's face is inputted from the imaging apparatus to the eye finding and tracking unit.
- The inputted image frame is then processed to identify potential eye locations. This is accomplished, as will be explained in detail later, by identifying features within the image frame which exhibit attributes consistent with those associated with the appearance of a subject's eye.
- Next, a determination is made as to which of the potential eye locations are actual eyes of the subject. This is generally accomplished by monitoring successive image frames to detect a blink. If a blink is detected at a potential eye location, it is deemed an actual eye location.
- In step 208, the now-determined actual eye locations are continuously tracked and updated using successive image frames.
- If the tracked eye locations are lost, the process is reinitialized by returning to step 202 and repeating the eye finding procedure.
- FIG. 3 is a flow diagram of the process used to identify potential eye locations in the initial image frame.
- The first step 302 involves averaging the digitized image values which are representative of the pixel intensities of a first Mx by My block of pixels for each of three My-high rows of the digitized image, starting in the upper left-hand corner of the image frame, as depicted by the solid line boxes 17 in FIG. 4.
- The three averages obtained in step 302 are used to form the first column of an output matrix.
- The Mx variable represents a number of pixels in the horizontal direction of the image frame, and the My variable represents a number of pixels in the vertical direction of the image frame.
- The resulting Mx by My pixel block has a size which just encompasses the minimum expected size of the iris and pupil portions of a subject's eye.
- Accordingly, the pixel block would contain an image of the pupil and at least a part of the iris of any subject's eye.
- The next step 304 is to create the next column of the output matrix. This is accomplished by averaging the intensity representing values of an Mx by My pixel block which is offset horizontally to the right by one pixel column from the first pixel block, for each of the three aforementioned My-high rows, as shown by the broken line boxes 18 in FIG. 4. This process is repeated, moving one pixel column to the right during each iteration, until the ends of the three My-high rows in the upper portion of the image frame are reached. The result is one completed output matrix.
- The next step 306 in the process is to repeat steps 302 and 304, except that the Mx by My pixel blocks being averaged are offset vertically downward from the previous pixel blocks by one pixel row, as depicted by the dashed and dotted line boxes 19 in FIG. 4. This produces a second complete output matrix.
- This process of offsetting the blocks vertically downward by one pixel row is then continued until the bottom of the image frame is reached, thereby forming a group of output matrices.
- The purpose of averaging three My-high blocks at once is to compensate for any geometrical distortion caused by, for example, head movements due to vibration of the vehicle.
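The sliding block-averaging of steps 302 through 306 lends itself to a compact summary. What follows is a minimal NumPy sketch of the scheme; the function name is illustrative, and the integral-image shortcut for computing the block averages is an implementation choice rather than anything the patent prescribes:

```python
import numpy as np

def block_average_matrices(frame: np.ndarray, Mx: int, My: int):
    """Average every Mx-by-My pixel block of a grayscale frame at every
    one-pixel offset, then group the averages into 3-row output matrices,
    one matrix per one-row downward offset (steps 302-306)."""
    H, W = frame.shape
    # Integral image makes every block sum an O(1) lookup.
    ii = np.pad(np.cumsum(np.cumsum(frame.astype(np.float64), axis=0), axis=1),
                ((1, 0), (1, 0)))
    sums = ii[My:, Mx:] - ii[:-My, Mx:] - ii[My:, :-Mx] + ii[:-My, :-Mx]
    means = sums / (Mx * My)          # means[r, c]: block with top-left (r, c)

    matrices = []
    for top in range(H - 3 * My + 1):  # one output matrix per one-row offset
        # Rows are the three vertically arranged blocks, My rows apart;
        # columns are the one-pixel-column horizontal offsets.
        matrices.append(np.stack([means[top], means[top + My], means[top + 2 * My]]))
    return matrices                    # each matrix has shape (3, W - Mx + 1)
```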
- Next, each element of each output matrix in the group of generated output matrices is compared with a threshold range. Those matrix elements which exceed the lower limit of the threshold range and are less than the upper limit of this range are flagged (step 310).
- The upper limit of the threshold range corresponds to a value which represents the maximum expected average intensity of an Mx by My pixel block containing an image of the iris and pupil of a subject's eye for the illumination conditions present at the time the image was captured.
- The maximum average intensity of a block containing the image of the subject's pupil and at least a portion of the iris will be lower than that of a same-size portion of most other areas of the subject's face, because the pupil absorbs a substantial portion of the light impinging upon it.
- Thus, the upper threshold limit is a good way of eliminating portions of the image frame which cannot be the subject's eye.
- Similarly, the lower threshold limit is employed to eliminate those portions of the image frame which are too dark to be the subject's eye.
- The lower limit corresponds to a value which represents the minimum expected average intensity of an Mx by My pixel block containing an image of the pupil and at least a portion of the subject's iris.
- As with the upper limit, this minimum is based on the illumination conditions present at the time the image is captured.
- In step 312, the average intensity value of each Mx by My pixel block which surrounds the Mx by My pixel block associated with a flagged output matrix element is compared to an output matrix threshold value.
- This threshold value represents the lowest expected average intensity possible for the pixel-block-sized areas immediately adjacent to the portion of an image frame containing the subject's pupil and iris.
- If the surrounding pixel blocks meet this threshold criteria, the pixel block associated with the flagged element is designated a potential eye location (step 314).
- If they do not, the flagged block is eliminated as a potential eye location (step 316).
- This comparison concept is taken further in a preferred embodiment of the present invention, where a separate threshold value is applied to each of the surrounding pixel block averages.
- This has particular utility because some of the areas immediately surrounding the iris and pupil exhibit unique average intensity values which can be used to increase the confidence that the flagged pixel block is a good prospect for a potential eye location. For example, the areas immediately to the left and right of the iris and pupil include the white parts of the eye. Thus, these areas tend to exhibit a greater average intensity than most other areas of the face. Further, it has been found that the areas directly above and below the iris and pupil are often in shadow.
- Accordingly, the average intensity of these areas is expected to be less than that of many other areas of the face, although greater than the average intensity of the portion of the image containing the iris and pupil.
- The threshold value applied to the average intensity of the pixel blocks directly to the left and right of the flagged block would therefore be set just below the minimum expected average intensity for these relatively light areas of the face, and the threshold value applied to the average intensities of the pixel blocks directly above and below the flagged block would be set just above the maximum expected average intensity for these relatively dark regions of the face.
- The pixel blocks diagonal to the flagged block would be assigned threshold values which are just below the minimum expected average intensity whenever the block is generally lighter than the rest of the face, and just above the maximum expected average intensity whenever the block is generally darker than the rest of the face. If the average intensity of each "lighter" block exceeds its assigned threshold value, and each "darker" block is less than its assigned threshold value, then the flagged pixel block is deemed a potential eye location. If any of the surrounding pixel blocks do not meet this thresholding criteria, then the flagged pixel block is eliminated as a potential eye location.
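A rough sketch of this refined neighbor test appears below. The threshold values are illustrative assumptions, the diagonal blocks are omitted for brevity, and `means` is the array of block averages from the earlier sketch; the caller is assumed to have ensured the flagged block is at least one full block away from the frame border:

```python
def passes_directional_check(means, r, c, Mx, My,
                             bright_min=140.0, dark_max=100.0):
    """Refined neighbor thresholding: the sclera blocks to the left and
    right of a flagged block should be relatively bright, while the
    lid-shadow blocks directly above and below should be relatively dark.
    Diagonal blocks would be tested one way or the other in the same way."""
    left, right = means[r, c - Mx], means[r, c + Mx]
    above, below = means[r - My, c], means[r + My, c]
    if left <= bright_min or right <= bright_min:
        return False        # sclera neighbors not bright enough
    if above >= dark_max or below >= dark_max:
        return False        # lid-shadow neighbors not dark enough
    return True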
- Because the output matrices were generated using the previously-described "one pixel column and one pixel row offset" approach, some of the matrices will contain rows having elements identical to those of other matrices, because they characterize the same pixels of the image frame. This does not present a problem in identifying the pixel block locations associated with potential eye locations, as elements flagged by the above-described thresholding process in multiple matrices which correspond to the same pixels of the image frame will be identified as a single location. In fact, this multiplicity serves to add redundancy to the identification process. However, it is preferred that the pixel block associated with a flagged matrix element correspond to the portion of the image centered on the subject's pupil.
- Because pixel blocks which overlap the pupil without being centered on it can exhibit similar average intensities, the matrix elements representing these blocks may also be identified as potential eye locations via the above-described thresholding process.
- Thus, the next step 318 in the process of identifying potential eye locations is to examine flagged matrix elements associated with the previously-designated potential eye locations which correspond to blocks having pixels in common with pixel blocks associated with other flagged elements. Only the matrix element representing the block having the minimum average intensity among the examined group of elements, or which is centered within the group, remains flagged. The others are de-selected and are no longer considered potential eye locations (step 320).
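One way to realize steps 318 and 320 is sketched below, again over the `means` array of block averages; the tie-break for a group of elements with identical values (keeping the centered one) is omitted for brevity:

```python
import numpy as np

def resolve_overlaps(flagged: np.ndarray, means: np.ndarray, Mx: int, My: int):
    """Among flagged elements whose Mx-by-My blocks share pixels (offsets
    closer than one full block in each direction), keep only the element
    with the minimum average intensity; the rest are de-selected."""
    keep = flagged.copy()
    H, W = means.shape
    for r, c in zip(*np.nonzero(flagged)):
        r0, r1 = max(r - My + 1, 0), min(r + My, H)
        c0, c1 = max(c - Mx + 1, 0), min(c + Mx, W)
        region = np.where(flagged[r0:r1, c0:c1], means[r0:r1, c0:c1], np.inf)
        if means[r, c] > region.min():   # a darker overlapping block exists
            keep[r, c] = False
    return keep
```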
- Any remaining potential eye locations identified in the above process could be considered actual eye locations with a reasonable amount of confidence, especially if the more sophisticated and preferred thresholding processes are employed (step 322). If, however, no potential eye locations remain, the system is reinitialized in step 324 by inputting a next image frame and starting over at step 302.
- The confidence level could optionally be increased further by determining whether two of the remaining potential eye locations have a certain geometric relationship to each other (step 326). If so, these two potential eye locations are designated as actual eye locations (step 328).
- An example of a process for determining whether two potential eye locations have the desired geometric relationship involves determining if the locations are within prescribed horizontal and vertical (i.e. to allow for the head being cocked) distance ranges of one another. These distance ranges represent the difference between the minimum and maximum expected horizontal and vertical eye separations typical of the subject being monitored (e.g. an adult vehicle operator). If the potential eyes are within these distance ranges, then the chance that these locations are actual eyes is increased. Performing the required distance measurements on a digitized image can be accomplished using any appropriate conventional imaging technique. In the case where none of the remaining potential eye locations are within the aforementioned geometric relationship to one another, the system is reinitialized in step 330 by inputting a next image frame and starting over at step 302.
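A compact sketch of such a pairing test follows; the separation ranges shown are placeholders, since the patent leaves the actual values to the expected geometry of the monitored subject:

```python
def plausible_eye_pair(loc_a, loc_b, h_range=(40, 90), v_range=(0, 20)):
    """Two candidate locations, given as (x, y) pixel centers, qualify as
    an eye pair when their horizontal and vertical separations both fall
    inside the expected eye-separation ranges (in pixels)."""
    dx = abs(loc_a[0] - loc_b[0])
    dy = abs(loc_a[1] - loc_b[1])
    return h_range[0] <= dx <= h_range[1] and v_range[0] <= dy <= v_range[1]
```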
- Preferably, however, a blink is detected at a potential eye location before it is deemed an actual eye location.
- A preliminary determination in this blink detecting process is to identify the image pixel in the original image frame which constitutes the center of the pupil of each identified potential eye location. As the pixel block associated with the identified potential eye location should be centered on the pupil, finding the center of the pupil can be approximated by simply selecting the pixel representing the center of the pixel block. Alternately, a more intensive process can be employed to ensure the accuracy of the identified pupil center location.
- In such a process, a threshold value is applied to the pixels of the block, the purpose being to identify those pixels of the image which correspond to the pupil of the eye. As the pixels associated with the pupil image will have a lower intensity than the surrounding iris, the threshold value is chosen to approximate the highest intensity expected from the pupil image for the illumination conditions present at the time the image was captured. This ensures that only the darker pupil pixels are selected, and not the pixels imaging the relatively lighter surrounding iris structures. Once the pixels associated with the pupil are flagged, the next step is to determine the geographic center of the selected pixels. This geographic center will be the pixel of the image which represents the center of the pupil, as the pupil is circular in shape.
- Determining the geographic center of the selected pixels can be accomplished in a variety of ways.
- For example, the pixel block associated with the potential eye location can be scanned horizontally, column by column, until one of the selected pixels is detected within a column. This column location is noted, and the horizontal scan is continued until a column containing no selected pixels is found. This second column location is also noted.
- A similar scanning process is then conducted vertically, so as to identify the first row in the block containing a selected pixel and the next subsequent row containing no selected pixels.
- The center of the pupil is chosen as the pixel having a column location in-between the noted columns and a row location in-between the noted rows.
- Of course, any noise in the image, or spots in the iris which are dark enough to be selected in the aforementioned thresholding step, can skew the results of the just-described process.
- This possibility can be mitigated in a number of ways, for example by requiring that a prescribed number of consecutive empty pixel columns or rows follow the first empty column or row detected before that column or row is noted as the outside edge of the pupil.
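A minimal sketch of this scan-based center estimate follows; the pupil intensity threshold and the number of empty lines required before an edge is accepted (`min_gap`, the noise guard just described) are illustrative assumptions:

```python
import numpy as np

def pupil_center(block: np.ndarray, pupil_max: int = 60, min_gap: int = 2):
    """Estimate the pupil center inside a candidate block: threshold to
    keep only pupil-dark pixels, scan columns then rows for the first
    occupied line and the last line before a sustained empty run, and
    take the midpoints. Returns (row, column), or None if no dark pixels."""
    dark = block < pupil_max

    def edges(occupied):
        # occupied[i] is True when line i contains at least one dark pixel.
        first = last = None
        empty_run = 0
        for i, occ in enumerate(occupied):
            if occ:
                if first is None:
                    first = i
                last, empty_run = i, 0
            elif first is not None:
                empty_run += 1
                if empty_run >= min_gap:   # sustained gap: edge passed
                    break
        return first, last

    c0, c1 = edges(dark.any(axis=0))       # column scan, left to right
    r0, r1 = edges(dark.any(axis=1))       # row scan, top to bottom
    if c0 is None or r0 is None:
        return None
    return (r0 + r1) // 2, (c0 + c1) // 2
```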
- A blink at a potential eye location manifests itself as a brief period during which the eyelid is closed, e.g. about 2-3 image frames in length for an imaging system producing about 30 frames per second. This appears as a "disappearance" of a potential eye at an identified location for a few successive frames, followed by its "reappearance" in a subsequent frame.
- The eye "disappears" from an image frame during the blink because the eyelid which covers the iris and pupil exhibits a much greater average pixel intensity.
- Thus, the closed eye will not be detected by the previously-described thresholding process.
- It is assumed that the eye will not move significantly during a blink, provided a reasonable frame speed is employed by the imaging system. For example, a 30 frames per second rate is adequate to ensure the eye has not moved significantly in the 2-3 frames it takes to blink. Any slight movement of the eye is detected and compensated for by a correlation procedure to be described shortly.
- FIG. 5 is a flow diagram of the preferred eye location tracking and blink detection process used primarily to identify and track actual eye locations among the potential eye locations identified previously (i.e. in steps 302 through 320 of FIG. 3).
- This preferred process uses cut-out blocks in the subsequent frames which are correlated with the potential eye locations in the previous frame to determine a new eye location. Processing just the cut-out blocks, rather than the entire image, saves considerable processing resources.
- The first step 502 in the process involves identifying the aforementioned cut-out blocks within the second image frame produced by the imaging system. This is preferably accomplished by identifying cut-out pixel blocks 20 in the second frame, each of which includes the pixel block 22 corresponding to the location of the block identified as a potential eye location in the previous image frame, plus all adjacent Mx by My pixel blocks 24, as shown in FIG. 6.
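A minimal sketch of this cut-out bookkeeping follows, assuming the previous center is known as a (row, column) pair; the function name and the clipping at the frame border are illustrative:

```python
def cutout_bounds(center, Mx, My, H, W):
    """The cut-out block spans the potential-eye block plus all adjacent
    blocks, i.e. a 3*Mx-wide by 3*My-high window centered on the previous
    eye location, clipped to the H-by-W frame.
    Returns (top, bottom, left, right) as half-open bounds."""
    cy, cx = center                     # (row, col) of the previous center
    top = max(cy - (3 * My) // 2, 0)
    left = max(cx - (3 * Mx) // 2, 0)
    return top, min(top + 3 * My, H), left, min(left + 3 * Mx, W)
```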
- Next, a matrix is created from the first image for each potential eye location. This matrix includes all the represented pixel intensities in an area surrounding the determined center of the potential eye location. Preferably, this area is bigger than the cut-out block employed in the second image. For example, an area having a size of 100 by 50 pixels could be employed.
- Each matrix (which is centered on the determined center of the pupil of the potential eye) is then "overlaid" in step 506 on each pixel in the associated cut-out block in the second image frame, starting with the pixel in the upper left-hand corner.
- A correlation procedure is then performed between each matrix and the overlaid pixels of its associated cut-out block. This correlation is accomplished using any appropriate conventional matrix correlation process. As these correlation processes are known in the art, no further detail will be provided herein.
- The result of each correlation is a correlation coefficient representing the degree to which the pixel matrix from the first image frame corresponds to the overlaid position in the associated cut-out block. This process is repeated for all the pixel locations in each cut-out block to produce a correlation coefficient matrix for each potential eye location.
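As one concrete (and assumed) choice of correlation measure, the sketch below centers the stored pixel matrix on each pixel of the cut-out block and records a plain Pearson correlation coefficient against the underlying window of the new frame; positions where the centered template would run off the frame are simply skipped here, whereas a full implementation might pad or truncate instead:

```python
import numpy as np

def correlation_matrix(template: np.ndarray, frame: np.ndarray,
                       cut_top: int, cut_left: int, cut_h: int, cut_w: int):
    """For each pixel of the cut-out block (a cut_h-by-cut_w window of
    `frame` with top-left corner at (cut_top, cut_left)), center `template`
    on that pixel and record a normalized correlation coefficient."""
    th, tw = template.shape
    oy, ox = th // 2, tw // 2            # offsets from center to corner
    H, W = frame.shape
    out = np.zeros((cut_h, cut_w))
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    for r in range(cut_h):
        for c in range(cut_w):
            y, x = cut_top + r, cut_left + c     # overlay center in frame
            y0, x0 = y - oy, x - ox
            if y0 < 0 or x0 < 0 or y0 + th > H or x0 + tw > W:
                continue                         # template off the frame
            w = frame[y0:y0 + th, x0:x0 + tw]
            wz = w - w.mean()
            denom = t_norm * np.sqrt((wz * wz).sum())
            out[r, c] = (t * wz).sum() / denom if denom > 0 else 0.0
    return out
```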
- Next, a threshold value is compared to each element in the correlation coefficient matrices, and those elements which exceed the threshold are flagged.
- The flagged element in each correlation coefficient matrix which is larger than the rest of the elements corresponds to the pixel location in the second image which most closely matches the intensity profile of the associated potential eye location identified in the first image, and it represents the center of the updated potential eye location in the second image frame. If such a maximum value is found, the corresponding pixel location in the second image is designated as the new center of the potential eye location (step 510).
- The threshold value is applied to ensure the pixel intensity values in the second frame are at least "in line" with those of the corresponding potential eye location in the first image.
- Accordingly, the threshold is chosen so as to ensure that a relatively high degree of correlation is observed. For example, a threshold value of at least 0.5 could be employed.
- In step 512, the number of consecutive times the "no-correlation" condition occurs is calculated. Whenever a no-correlation condition exists for a period of 2-3 frames and the potential eye is then detected once again, this is indicative of a blink. If a blink is so detected, the status of the potential eye location is upgraded to a high confidence actual eye location (step 514). This upgrade is possible because an eye will always exhibit this blink response, and so the location can be deemed that of an actual eye with a high degree of confidence.
- The eye tracking and blink detection process of FIG. 5 is repeated for each successive frame generated by the imaging apparatus, with the addition that actual eye locations are tracked as well as the remaining potential eye locations (step 516). This allows the positions of the actual and potential eye locations to be continuously updated. It is noted that the pixel matrix from the immediately preceding frame is used for the aforementioned correlation procedure whenever possible. However, where a no-correlation condition exists in any iteration of the tracking process, the current image is correlated using the pixel matrix from the last image frame in which the affected eye location was updated.
- Referring now to FIG. 7, if a potential eye location does not exhibit a blink response within 150 image frames, it is still tracked but is assigned a low confidence status (i.e. a low probability that it is an actual eye location) at step 702.
- Similarly, if a potential eye location becomes "lost," in that there is a no-correlation condition for more than 150 frames, the location is assigned a low confidence status (step 704).
- If a blink has been detected at a potential eye location and its status upgraded to an actual eye location, but this location is then "lost," its status will depend on a secondary factor. This secondary factor is the presence of a second actual eye location having the previously-described geometric relationship to the first.
- If such a second eye location exists, the high confidence status of the "lost" actual eye does not change. If, however, there is no second eye location, then the "lost" actual eye is downgraded to a low confidence potential eye location (step 706).
- The determination of high and low confidence is important because the tracking process continues for all potential or actual eye locations only for as long as there is at least one remaining high confidence actual eye location or an un-designated potential eye location (i.e. a potential eye location which has not been assigned a low confidence status) being monitored (step 708). However, if only low confidence locations exist, the system is re-initialized and the entire eye finding and tracking process starts over (step 710).
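The per-location bookkeeping behind FIGS. 5 and 7 can be summarized in a small state holder like the sketch below. The frame counts follow the tested embodiment (approximately seven frames for a blink, 150 frames for low confidence); the class and field names are illustrative, and the paired-eye exception that preserves a lost actual eye's high confidence status is omitted for brevity:

```python
from dataclasses import dataclass

@dataclass
class TrackedLocation:
    is_actual_eye: bool = False     # upgraded once a blink is detected
    low_confidence: bool = False    # set when the location is likely not an eye
    no_corr_run: int = 0            # consecutive no-correlation frames
    frames_without_blink: int = 0

    def update(self, correlated: bool,
               blink_max_frames: int = 7, lost_limit: int = 150):
        if correlated:
            # Reappearance after a short disappearance counts as a blink.
            if 0 < self.no_corr_run <= blink_max_frames:
                self.is_actual_eye = True
                self.frames_without_blink = 0
            self.no_corr_run = 0
        else:
            self.no_corr_run += 1
        self.frames_without_blink += 1
        # Lost for too long, or never blinking, downgrades the location.
        if self.no_corr_run > lost_limit or (
                not self.is_actual_eye and
                self.frames_without_blink > lost_limit):
            self.low_confidence = True
```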
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/858,841 US5859686A (en) | 1997-05-19 | 1997-05-19 | Eye finding and tracking system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/858,841 US5859686A (en) | 1997-05-19 | 1997-05-19 | Eye finding and tracking system |
Publications (1)
Publication Number | Publication Date |
---|---|
US5859686A true US5859686A (en) | 1999-01-12 |
Family
ID=25329338
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/858,841 Expired - Lifetime US5859686A (en) | 1997-05-19 | 1997-05-19 | Eye finding and tracking system |
Country Status (1)
Country | Link |
---|---|
US (1) | US5859686A (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3462604A (en) * | 1967-08-23 | 1969-08-19 | Honeywell Inc | Control apparatus sensitive to eye movement |
US5231674A (en) * | 1989-06-09 | 1993-07-27 | Lc Technologies, Inc. | Eye tracking method and apparatus |
US5093567A (en) * | 1989-07-14 | 1992-03-03 | Gec-Marconi Limited | Helmet systems with eyepiece and eye position sensing means |
US5196873A (en) * | 1990-05-08 | 1993-03-23 | Nihon Kohden Corporation | Eye movement analysis system |
US5218387A (en) * | 1990-05-21 | 1993-06-08 | Nissan Motor Co., Ltd. | Eye position detecting apparatus |
US5106184A (en) * | 1990-08-13 | 1992-04-21 | Eye Research Institute Of Retina Foundation | Retinal laser doppler apparatus having eye tracking system |
Cited By (95)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6091334A (en) * | 1998-09-04 | 2000-07-18 | Massachusetts Institute Of Technology | Drowsiness/alertness monitor |
US20030149549A1 (en) * | 2002-01-23 | 2003-08-07 | Radica China Ltd. | Optical controller |
US6836751B2 (en) | 2002-01-23 | 2004-12-28 | Radica China Ltd. | Optical controller |
US20060290884A1 (en) * | 2002-05-24 | 2006-12-28 | Resmed Limited | Method and apparatus for testing sleepiness |
US7783347B2 (en) * | 2002-05-24 | 2010-08-24 | Resmed Limited | Method and apparatus for testing sleepiness |
US9596983B2 (en) | 2002-05-30 | 2017-03-21 | Amo Manufacturing Usa, Llc | Methods and systems for tracking a torsional orientation and position of an eye |
US8740385B2 (en) | 2002-05-30 | 2014-06-03 | Amo Manufacturing Usa, Llc | Methods and systems for tracking a torsional orientation and position of an eye |
US7044602B2 (en) * | 2002-05-30 | 2006-05-16 | Visx, Incorporated | Methods and systems for tracking a torsional orientation and position of an eye |
US20060161141A1 (en) * | 2002-05-30 | 2006-07-20 | Visx, Incorporated | Methods and Systems for Tracking a Torsional Orientation and Position of an Eye |
US20030223037A1 (en) * | 2002-05-30 | 2003-12-04 | Visx, Incorporated | Methods and systems for tracking a torsional orientation and position of an eye |
US20090012505A1 (en) * | 2002-05-30 | 2009-01-08 | Amo Manufacturing Usa, Llc | Methods and Systems for Tracking a Torsional Orientation and Position of an Eye |
US7431457B2 (en) | 2002-05-30 | 2008-10-07 | Amo Manufacturing Usa, Llc | Methods and systems for tracking a torsional orientation and position of an eye |
US10251783B2 (en) | 2002-05-30 | 2019-04-09 | Amo Manufacturing Usa, Llc | Methods and systems for tracking a torsional orientation and position of an eye |
US7261415B2 (en) | 2002-05-30 | 2007-08-28 | Visx, Incorporated | Methods and systems for tracking a torsional orientation and position of an eye |
US20040234103A1 (en) * | 2002-10-28 | 2004-11-25 | Morris Steffein | Method and apparatus for detection of drowsiness and quantitative control of biological processes |
US7680302B2 (en) | 2002-10-28 | 2010-03-16 | Morris Steffin | Method and apparatus for detection of drowsiness and quantitative control of biological processes |
US20080192983A1 (en) * | 2002-10-28 | 2008-08-14 | Morris Steffin | Method and apparatus for detection of drowsiness and quantitative control of biological processes |
US7336804B2 (en) | 2002-10-28 | 2008-02-26 | Morris Steffin | Method and apparatus for detection of drowsiness and quantitative control of biological processes |
US8705808B2 (en) | 2003-09-05 | 2014-04-22 | Honeywell International Inc. | Combined face and iris recognition system |
US20050213792A1 (en) * | 2004-03-29 | 2005-09-29 | Hammoud Riad I | Eye tracking method based on correlation and detected eye movement |
US7331671B2 (en) * | 2004-03-29 | 2008-02-19 | Delphi Technologies, Inc. | Eye tracking method based on correlation and detected eye movement |
US7362885B2 (en) * | 2004-04-20 | 2008-04-22 | Delphi Technologies, Inc. | Object tracking and eye state identification method |
US20050232461A1 (en) * | 2004-04-20 | 2005-10-20 | Hammoud Riad I | Object tracking and eye state identification method |
US20070036397A1 (en) * | 2005-01-26 | 2007-02-15 | Honeywell International Inc. | A distance iris recognition |
US8098901B2 (en) | 2005-01-26 | 2012-01-17 | Honeywell International Inc. | Standoff iris recognition system |
US20070276853A1 (en) * | 2005-01-26 | 2007-11-29 | Honeywell International Inc. | Indexing and database search system |
US20070189582A1 (en) * | 2005-01-26 | 2007-08-16 | Honeywell International Inc. | Approaches and apparatus for eye detection in a digital image |
US8488846B2 (en) | 2005-01-26 | 2013-07-16 | Honeywell International Inc. | Expedient encoding system |
US8285005B2 (en) | 2005-01-26 | 2012-10-09 | Honeywell International Inc. | Distance iris recognition |
US20070274570A1 (en) * | 2005-01-26 | 2007-11-29 | Honeywell International Inc. | Iris recognition system having image quality metrics |
US7593550B2 (en) | 2005-01-26 | 2009-09-22 | Honeywell International Inc. | Distance iris recognition |
US20100002913A1 (en) * | 2005-01-26 | 2010-01-07 | Honeywell International Inc. | distance iris recognition |
US8045764B2 (en) | 2005-01-26 | 2011-10-25 | Honeywell International Inc. | Expedient encoding system |
US20070274571A1 (en) * | 2005-01-26 | 2007-11-29 | Honeywell International Inc. | Expedient encoding system |
US7761453B2 (en) | 2005-01-26 | 2010-07-20 | Honeywell International Inc. | Method and system for indexing and searching an iris image database |
US8090157B2 (en) | 2005-01-26 | 2012-01-03 | Honeywell International Inc. | Approaches and apparatus for eye detection in a digital image |
US20070140531A1 (en) * | 2005-01-26 | 2007-06-21 | Honeywell International Inc. | standoff iris recognition system |
US8050463B2 (en) | 2005-01-26 | 2011-11-01 | Honeywell International Inc. | Iris recognition system having image quality metrics |
US7835834B2 (en) * | 2005-05-16 | 2010-11-16 | Delphi Technologies, Inc. | Method of mitigating driver distraction |
US20060287779A1 (en) * | 2005-05-16 | 2006-12-21 | Smith Matthew R | Method of mitigating driver distraction |
US8049812B2 (en) | 2006-03-03 | 2011-11-01 | Honeywell International Inc. | Camera with auto focus capability |
US7933507B2 (en) | 2006-03-03 | 2011-04-26 | Honeywell International Inc. | Single lens splitter camera |
US8442276B2 (en) | 2006-03-03 | 2013-05-14 | Honeywell International Inc. | Invariant radial iris segmentation |
US20110187845A1 (en) * | 2006-03-03 | 2011-08-04 | Honeywell International Inc. | System for iris detection, tracking and recognition at a distance |
US20080075441A1 (en) * | 2006-03-03 | 2008-03-27 | Honeywell International Inc. | Single lens splitter camera |
US20070211924A1 (en) * | 2006-03-03 | 2007-09-13 | Honeywell International Inc. | Invariant radial iris segmentation |
US8064647B2 (en) | 2006-03-03 | 2011-11-22 | Honeywell International Inc. | System for iris detection tracking and recognition at a distance |
US8761458B2 (en) | 2006-03-03 | 2014-06-24 | Honeywell International Inc. | System for iris detection, tracking and recognition at a distance |
US8085993B2 (en) | 2006-03-03 | 2011-12-27 | Honeywell International Inc. | Modular biometrics collection system architecture |
US8014571B2 (en) | 2006-05-15 | 2011-09-06 | Identix Incorporated | Multimodal ocular biometric system |
US20080044063A1 (en) * | 2006-05-15 | 2008-02-21 | Retica Systems, Inc. | Multimodal ocular biometric system |
US8391567B2 (en) | 2006-05-15 | 2013-03-05 | Identix Incorporated | Multimodal ocular biometric system |
US8983146B2 (en) | 2006-05-15 | 2015-03-17 | Morphotrust Usa, Llc | Multimodal ocular biometric system |
US20080069411A1 (en) * | 2006-09-15 | 2008-03-20 | Friedman Marc D | Long distance multimodal biometric system and method |
US8577093B2 (en) | 2006-09-15 | 2013-11-05 | Identix Incorporated | Long distance multimodal biometric system and method |
US8121356B2 (en) | 2006-09-15 | 2012-02-21 | Identix Incorporated | Long distance multimodal biometric system and method |
US20080253622A1 (en) * | 2006-09-15 | 2008-10-16 | Retica Systems, Inc. | Multimodal ocular biometric system and methods |
US8170293B2 (en) | 2006-09-15 | 2012-05-01 | Identix Incorporated | Multimodal ocular biometric system and methods |
US8644562B2 (en) | 2006-09-15 | 2014-02-04 | Morphotrust Usa, Inc. | Multimodal ocular biometric system and methods |
US8433103B2 (en) | 2006-09-15 | 2013-04-30 | Identix Incorporated | Long distance multimodal biometric system and method |
US20100284576A1 (en) * | 2006-09-25 | 2010-11-11 | Yasunari Tosa | Iris data extraction |
US8340364B2 (en) | 2006-09-25 | 2012-12-25 | Identix Incorporated | Iris data extraction |
US7970179B2 (en) | 2006-09-25 | 2011-06-28 | Identix Incorporated | Iris data extraction |
US20110200235A1 (en) * | 2006-09-25 | 2011-08-18 | Identix Incorporated | Iris Data Extraction |
US9235762B2 (en) | 2006-09-25 | 2016-01-12 | Morphotrust Usa, Llc | Iris data extraction |
US20080267456A1 (en) * | 2007-04-25 | 2008-10-30 | Honeywell International Inc. | Biometric data collection system |
US8063889B2 (en) | 2007-04-25 | 2011-11-22 | Honeywell International Inc. | Biometric data collection system |
US8436907B2 (en) | 2008-05-09 | 2013-05-07 | Honeywell International Inc. | Heterogeneous video capturing system |
US20100182440A1 (en) * | 2008-05-09 | 2010-07-22 | Honeywell International Inc. | Heterogeneous video capturing system |
US8213782B2 (en) | 2008-08-07 | 2012-07-03 | Honeywell International Inc. | Predictive autofocusing system |
US8090246B2 (en) | 2008-08-08 | 2012-01-03 | Honeywell International Inc. | Image acquisition system |
US20100033677A1 (en) * | 2008-08-08 | 2010-02-11 | Honeywell International Inc. | Image acquisition system |
US8280119B2 (en) | 2008-12-05 | 2012-10-02 | Honeywell International Inc. | Iris recognition system using quality metrics |
US8630464B2 (en) | 2009-06-15 | 2014-01-14 | Honeywell International Inc. | Adaptive iris matching using database indexing |
US8472681B2 (en) | 2009-06-15 | 2013-06-25 | Honeywell International Inc. | Iris and ocular recognition system using trace transforms |
EP2275020A1 (en) | 2009-07-16 | 2011-01-19 | Tobil Technology AB | Eye detection unit using sequential data flow |
EP3338621A1 (en) | 2009-07-16 | 2018-06-27 | Tobii AB | Eye detection unit using parallel data flow |
WO2011006760A1 (en) | 2009-07-16 | 2011-01-20 | Tobii Technology Ab | Eye detection unit using sequential data flow |
US8742887B2 (en) | 2010-09-03 | 2014-06-03 | Honeywell International Inc. | Biometric visitor check system |
WO2012047221A1 (en) | 2010-10-07 | 2012-04-12 | Sony Computer Entertainment Inc. | 3-d glasses with camera based head tracking |
EP2486441A4 (en) * | 2010-10-07 | 2016-05-25 | Sony Computer Entertainment Inc | 3-d glasses with camera based head tracking |
US9237846B2 (en) | 2011-02-17 | 2016-01-19 | Welch Allyn, Inc. | Photorefraction ocular screening device and methods |
US9402538B2 (en) | 2011-02-17 | 2016-08-02 | Welch Allyn, Inc. | Photorefraction ocular screening device and methods |
US9408535B2 (en) | 2011-02-17 | 2016-08-09 | Welch Allyn, Inc. | Photorefraction ocular screening device and methods |
US20130033524A1 (en) * | 2011-08-02 | 2013-02-07 | Chin-Han Wang | Method for performing display control in response to eye activities of a user, and associated apparatus |
US20130090562A1 (en) * | 2011-10-07 | 2013-04-11 | Baycrest Centre For Geriatric Care | Methods and systems for assessing cognitive function |
US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics |
US10506165B2 (en) | 2015-10-29 | 2019-12-10 | Welch Allyn, Inc. | Concussion screening system |
CN105205481A (en) * | 2015-11-03 | 2015-12-30 | 浙江中烟工业有限责任公司 | Iris recognition equipment |
US10624538B2 (en) * | 2017-01-17 | 2020-04-21 | Mitsubishi Electric Corporation | Eyelid detection device, drowsiness determination device, and eyelid detection method |
US10928894B2 (en) | 2017-02-27 | 2021-02-23 | Nokia Technologies Oy | Eye tracking |
US10713483B2 (en) * | 2018-03-20 | 2020-07-14 | Welch Allyn, Inc. | Pupil edge detection in digital imaging |
US20220189132A1 (en) * | 2019-03-26 | 2022-06-16 | Nec Corporation | Interest determination apparatus, interest determination system, interest determination method, and non-transitory computer readable medium storing program |
US11887349B2 (en) * | 2019-03-26 | 2024-01-30 | Nec Corporation | Interest determination apparatus, interest determination system, interest determination method, and non-transitory computer readable medium storing program |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5859686A (en) | Eye finding and tracking system | |
US5867587A (en) | Impaired operator detection and warning system employing eyeblink analysis | |
US8102417B2 (en) | Eye closure recognition system and method | |
EP1701288A1 (en) | System and method of detecting eye closure based on edge lines | |
US7253739B2 (en) | System and method for determining eye closure state | |
US10521683B2 (en) | Glare reduction | |
EP2074550B1 (en) | Eye opening detection system and method of detecting eye opening | |
US5878156A (en) | Detection of the open/closed state of eyes based on analysis of relation between eye and eyebrow images in input face images | |
EP1701289A1 (en) | System and method of detecting eye closure based on line angles | |
EP1732028B1 (en) | System and method for detecting an eye | |
JP3143819B2 (en) | Eyelid opening detector | |
EP2060993B1 (en) | An awareness detection system and method | |
US7650034B2 (en) | Method of locating a human eye in a video image | |
JPH04216402A (en) | Detecting apparatus of position of eye | |
GB2328504A (en) | Eye position detector | |
US20060109422A1 (en) | Pupilometer | |
JP3116638B2 (en) | Awake state detection device | |
JP3444115B2 (en) | Dozing state detection device | |
JPH1044824A (en) | Driver's-eye open/closed state determination device for vehicle | |
KR101122513B1 (en) | Assuming system of eyeball position using 3-dimension position information and assuming method of eyeball position | |
JPH04174309A (en) | Driver's eye position detecting apparatus and condition detecting apparatus | |
JPH03202045A (en) | Detecting device for state of driver | |
JP2677010B2 (en) | Eye position detection device | |
JP4447464B2 (en) | Method for determining regions of interest in skin-pattern images | |
JPH10236181A (en) | Dozing condition detection device |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: NORTHROP GRUMMAN CORPORATION, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: ABOUTALIB, OMAR; RAMROTH, RICHARD ROY; REEL/FRAME: 008570/0771. Effective date: 19970508
| STCF | Information on status: patent grant | Free format text: PATENTED CASE
| AS | Assignment | Owner name: INTEGRATED MEDICAL SYSTEMS, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: NORTHROP GRUMMAN CORPORATION; REEL/FRAME: 010776/0831. Effective date: 19991005
| FEPP | Fee payment procedure | Free format text: PAT HOLDER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: LTOS); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY
| FPAY | Fee payment | Year of fee payment: 4
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY
| FPAY | Fee payment | Year of fee payment: 8
| FPAY | Fee payment | Year of fee payment: 12
| AS | Assignment | Owner name: MEDFLEX, LLC, GEORGIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: INTEGRATED MEDICAL SYSTEMS, INC; REEL/FRAME: 032697/0230. Effective date: 20140408