
US20050276454A1 - System and methods for transforming biometric image data to a consistent angle of inclination - Google Patents


Info

Publication number
US20050276454A1
US20050276454A1 (Application US11/151,412; US15141205A)
Authority
US
United States
Prior art keywords
image
biometric data
biometric
sample
transformed
Prior art date
Legal status
Abandoned
Application number
US11/151,412
Inventor
Rodney Beatson
Mark Kelty
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Priority to US11/151,412 (US20050276454A1)
Application filed by Individual
Publication of US20050276454A1
Priority to US12/627,413 (US7916907B2)
Priority to US12/931,340 (US8842887B2)
Priority to US13/072,398 (US8885894B2)
Priority to US14/198,695 (US9286457B2)
Priority to US14/998,574 (US9665704B2)
Priority to US15/731,069 (US9940453B2)
Priority to US15/909,218 (US10515204B2)
Priority to US16/724,214 (US10824714B2)
Priority to US17/082,743 (US11449598B2)
Priority to US17/947,930 (US11803633B1)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/30: Writer recognition; Reading and verifying signatures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00: Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10: Character recognition
    • G06V30/14: Image acquisition
    • G06V30/146: Aligning or centring of the image pick-up or image-field
    • G06V30/1475: Inclination or skew detection or correction of characters or of image to be recognised
    • G06V30/1478: Inclination or skew detection or correction of characters or of image to be recognised, of characters or character lines

Definitions

  • the system 100 includes a biometric input device 120 for obtaining a biometric data sample from a user.
  • Biometric input devices are well known in the art for obtaining a sample of the user's handwriting, a user fingerprint, iris and facial images, and other biometric data.
  • Biometric input device 120 may include a transducer which converts a measurable biometric characteristic into an electrical signal for further processing. Such transducers include means for converting stylus position, pressure, temperature, optical radiance, and other physical qualities into a corresponding electrical signal. In certain implementations, the transducer produces an electrical signal which is proportional to the physical quality being measured.
  • Biometric input device 120 is coupled to a pixelator 130 which converts the signal from biometric input device 120 into discretely valued quantities corresponding to a coordinate in an image.
  • a “pixel” as used herein will refer to one or more values, which may be, for example, real, imaginary, or complex numbers, assigned to a coordinate in an image.
  • a coordinate may be of a spatial coordinate system, a temporal coordinate system, or any other coordinate system, and an image will be used herein to refer to an ordered collection of such pixels. As such, it should be clear that an image need not be a planar array of pixels containing graphic data, which is a common usage of the term.
  • An image may be formed from, for example, stylus position and/or stylus pressure data in a temporal coordinate system, such as would be produced by a handwriting digitizer known in the art.
  • an image could be for example, a spatial array of feature data, such as whorl inclination angle of a fingerprint. It should be apparent to those skilled in the art upon reading the following descriptions that several image types may be used with the invention.
  • the pixelation process, i.e., forming the image of pixels in accordance with the descriptions above, may be accomplished in a variety of ways known in the art.
  • the electrical signal produced by the biometric input device may be sampled and converted into a digital number in a digital data stream.
  • the numbers of the data stream may then be assigned to pixels of a digital image to produce a digitized copy of the original biometric data supplied by the user through biometric input device 120 .
  • pixelator 130 may collect quanta and store an electrical representation of a collected number thereof at individual locations of an array.
  • An example of such a pixelator may be a charge coupled device (CCD).
  • the biometric input device 120 and pixelator 130 are combined into a single physical unit.
  • many digitizers used for obtaining handwriting samples are configured to determine the position of a stylus, such as through electromagnetic sensors, when it is in contact with the digitizer pad.
  • the data are presented at an output terminal of the digitizer as a time ordered stream of stylus position data, i.e., the coordinates of the stylus when it is in contact with the pad.
  • the image produced is a stream of stylus position data in a temporal coordinate system. It should be clear to the skilled artisan that such system configurations and images produced thereby fall within the scope of the invention.
  • the description of digitizers above provides illustration of a concept which requires elaboration. It is to be noted that the same data used to form one image may be presented in an alternative image.
  • the temporally ordered positional data described above may be alternatively presented in a spatially ordered image of pixels containing data corresponding to “stylus up”, i.e., not in contact with the pad, or “stylus down”, i.e., in physical contact with the pad, in a spatial coordinate system corresponding to the known physical layout of the pad.
  • the values, “stylus up” or “stylus down”, may be assigned to a pixel corresponding to the coordinate established by the same stylus position data provided as the pixel value in the temporally ordered image.
  • the particular image representations used will vary across the many different possible implementations of the present invention.
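As a concrete illustration of the two representations just described, the following sketch converts a temporally ordered stream of stylus positions into a spatially ordered "stylus up"/"stylus down" image. It is a minimal sketch, not the patent's implementation: the function name, the (x, y) sample format, and the pad dimensions are illustrative assumptions.

```python
def stylus_stream_to_spatial_image(samples, width, height):
    """Convert a temporally ordered stream of stylus-down positions
    into a spatially ordered image whose pixel values mean
    0 = "stylus up" and 1 = "stylus down".

    `samples` is an illustrative list of (x, y) pad coordinates
    reported while the stylus is in contact with the pad; samples
    falling outside the pad extent are ignored.
    """
    image = [[0] * width for _ in range(height)]  # all pixels "stylus up"
    for x, y in samples:
        if 0 <= x < width and 0 <= y < height:
            image[y][x] = 1                       # mark "stylus down"
    return image
```

The same stylus data thus yields either a temporal image (the stream itself) or this spatial image, as discussed above.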
  • System 100 of FIG. 1 includes a processing unit 110 which is coupled to a data storage unit 140 and a code storage unit 150 .
  • data storage unit 140 is primarily used for the storage of biometric data and code storage unit 150 is used to store program instruction sequences that, when executed by processing unit 110 , implement various procedures for carrying out aspects of the present invention.
  • data storage unit 140 and code storage unit 150 may actually reside in a single device such as a random access memory (RAM) or hard disk drive.
  • data storage unit 140 and code storage unit 150 may be distributed across several devices.
  • code storage unit 150 may be implemented in a read only memory (ROM) device while data storage unit 140 may be implemented in a hard disk drive.
  • data storage unit 140 or code storage unit 150 may be physically removed from processing unit 110 while still being electrically coupled thereto via a communications network.
  • present invention may be implemented through numerous physical configurations of system 100 .
  • System 100, in the exemplary embodiment shown, is operable to obtain biometric image data and extract pertinent features therefrom.
  • the features extracted are particular to the type of biometric data utilized. These features are well known in the art and new features and feature extraction techniques are continuously being developed. Among its many beneficial features, the present invention affords consistent feature extraction that is invariant to the angle of submission of the biometric data.
  • the biometric data 220 are in the form of a handwriting sample. More specifically, the handwriting sample 220 is a signature of the user. It is to be noted, however, that many other handwriting samples which are not signatures may be used for purposes of validating a particular user. As such, the signature 220 will be referred to as a sign 220 or handwriting sample 220 to emphasize that the handwriting sample 220 need not be a signature.
  • sign 220 exhibits an inclination with respect to the X axis of the image 210 .
  • This inclination will be referred to herein as the angle of submission θS or, alternatively, the submission angle θS.
  • a non-zero angle of submission is encountered often in biometric data collection.
  • a user cannot be expected to submit a handwriting sample, place a finger on a touchpad, or hold his or her head at a particular angle for a retinal scan in exactly the same way every time biometric data are to be provided for validation. As such, feature extraction for purposes of biometric data validation becomes more difficult.
  • the term "universal" angle of rotation is used herein to describe an angle through which the biometric data in the image are rotated by a transform such that the resulting angle of inclination of the biometric data is always the same.
  • the desired inclination is determined by a statistical relationship between pixels of the transformed image.
  • the universal angle of rotation may be determined by the relationship where the variance in pixel values of pixels along the X axis is a scalar multiple of the variance in pixel values of those pixels in the Y direction.
  • the covariance of pixel values in the X and Y directions is set to a specific value, such as zero.
  • Eq. (10) produces two solutions, one for each of the operations defined by the ± operator.
  • the two solutions provide angles of rotation that are 90° apart, i.e., perpendicular to each other.
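Since Eq. (10) itself is not reproduced in this text, the following sketch assumes the standard zero-covariance (principal-axis) form, tan(2θ) = 2·s_xy / (s_xx - s_yy), which is consistent with the two perpendicular solutions described above; the function names are illustrative.

```python
import math

def universal_rotation_angles(xs, ys):
    """Return the two candidate rotation angles (in radians, 90 degrees
    apart) at which the covariance of the rotated coordinates is zero.

    Sketch only: Eq. (10) is not reproduced here, so the standard
    zero-covariance form tan(2*theta) = 2*s_xy / (s_xx - s_yy) is
    assumed; atan2 resolves the quadrant.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    s_xx = sum((x - mx) ** 2 for x in xs)
    s_yy = sum((y - my) ** 2 for y in ys)
    s_xy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    theta = 0.5 * math.atan2(2 * s_xy, s_xx - s_yy)
    return theta, theta + math.pi / 2  # the two solutions are perpendicular

def rotate_about_centroid(xs, ys, theta):
    """Relocate the pixels by rotating them through -theta about the
    centroid, bringing the data to the chosen angle of inclination."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    c, s = math.cos(-theta), math.sin(-theta)
    new_xs = [c * (x - mx) - s * (y - my) for x, y in zip(xs, ys)]
    new_ys = [s * (x - mx) + c * (y - my) for x, y in zip(xs, ys)]
    return new_xs, new_ys
```

After rotating through either returned angle, the covariance of the new coordinates vanishes, which is the selected statistical relationship between pixels.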
  • Referring now to FIG. 2B, there is shown a rotated image 210′, in which the user's sign 220′ is at the inclination corresponding to the angle of rotation, θ. It is to be noted that this inclination of the sign 220′ will be produced by embodiments of the present invention regardless of the angle of submission θS of the user's original sign. Certain embodiments of the present invention analyze the original image of the biometric data 210 to determine the statistical properties thereof, and these properties are then utilized to determine the angle of rotation that achieves the desired statistical relationship between pixels in the rotated image 210′. From the rotated image, features may be extracted for purposes of validation, and because the sign 220′ is at an inclination consistent with that of the sign from which template features were extracted, feature recognition and correspondence are improved.
  • the relationship between pixels of the rotated image is that where the covariance of the pixels thereof is to be set to some constant value, such as zero.
  • the rotation defined by Eq. (10) may, depending upon the inclination of the biometric data in the original image, produce an inverted image, i.e., one rotated 180° from the expected image.
  • many key features of biometric data may be defined which are not affected by such an inversion. If features are included which are sensitive to a 180° rotation, those features may be extracted for both the image after applying the universal angle of rotation and its equivalent after rotating the image further through another 180°.
  • the closest match to the data in a biometric template stored in data storage unit 140 can be used for validation purposes.
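The closest-match selection between the two 180°-apart candidates can be sketched as follows. This is an illustrative fragment, not the patent's implementation: the feature vectors and the Euclidean distance are assumptions, standing in for whatever orientation-sensitive features and matching score a particular system uses.

```python
def closest_orientation_distance(features_0, features_180, template):
    """Given feature vectors extracted from the transformed image and
    from the same image rotated a further 180 degrees, return the
    smaller distance to the stored template; the closer match is the
    one used for validation.

    Euclidean distance is an illustrative stand-in for a real
    matching score.
    """
    def dist(features):
        return sum((a - b) ** 2 for a, b in zip(features, template)) ** 0.5
    return min(dist(features_0), dist(features_180))
```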
  • a further rotation may be imposed on the biometric data after the data has been transformed while maintaining the statistical relationship between pixels.
  • it may be desired to rotate the biometric data to a “natural” state, e.g., one that appears visually natural to an observer.
  • a specific example of this may be where it is desired that the handwriting sample be rotated such that it is parallel to a longitudinal axis of the enclosing image.
  • the additional rotation which retains the previously established statistical relationship may be accomplished through aspects of the invention discussed in paragraphs that follow.
  • Referring now to FIG. 3A, there is shown the image 210′ which has been transformed by the universal rotation angle in accordance with the present invention.
  • the biometric data 220′ are illustrated in the Figure as discrete points to emphasize the fact that the data are discrete.
  • a line-fitting technique, to be described presently, has produced a line 330 fitted to the data which makes an angle θR with the X axis of the image. It is important to note that θR may not be the same as the inclination angle produced by the transformation which achieves the relationship between pixels of the image.
  • the inclination θR of the line 330 is dependent upon the actual line-fitting technique utilized.
  • the angle of inclination of any one image is consistently defined by the chosen statistical relationship between pixels.
  • a further rotation may be applied based upon defining a line of regression (LR) on the rotated data.
  • the resultant angle of rotation i.e., the universal rotation plus the line of regression rotation, will still preserve the property that the image, irrespective of the initial angle of submission of the biometric data, can be rotated to a consistent angle of inclination.
  • Applying the LR rotation after the universal rotation will generally restore the biometric data to the desired aspect ratio, such as the horizontal aspect ratio of the user sign illustrated in FIG. 3B .
  • the resultant angle of inclination of the biometric data in the image will be dependent upon the equation used for the line of regression and there are numerous methods known in the art.
  • the choice between using M or M1 depends upon the actual implementation. Using M1 will generate images with a horizontal aspect ratio and using M will generate images with a square aspect ratio.
  • the values of a and r calculated from the transformed image when the rotation is defined by M1 may be denoted as a2 and r2.
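Since the patent's slope definitions M and M1 are not reproduced in this text, the following sketch uses a plain least-squares line of regression of Y on X as a stand-in; rotating the transformed data through the negative of the returned angle aligns the fitted line with the X axis, restoring a horizontal aspect as in FIG. 3B.

```python
import math

def regression_line_angle(xs, ys):
    """Angle (in radians) of an ordinary least-squares line of
    regression of Y on X fitted to the transformed data.

    Illustrative stand-in: the patent's M and M1 slope definitions are
    not reproduced here, and a different regression equation would give
    a different (but still consistent) resultant inclination.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    s_xy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    s_xx = sum((x - mx) ** 2 for x in xs)
    return math.atan(s_xy / s_xx)  # arctangent of the fitted slope
```

Because the universal rotation already places the data at a consistent inclination, applying this further deterministic rotation preserves the property that the resultant angle is independent of the original submission angle.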
  • FIG. 4 illustrates via a flow diagram fundamental method steps of certain embodiments of the present invention.
  • the method is entered at start block 405 and flow is transferred to block 410 , where a sample of biometric data is obtained via biometric input device 120 .
  • the biometric data sample may be a handwriting sample, facial, iris or fingerprint features, or many other biometric quantities.
  • the sample is then pixelated at block 415 and an image is formed from the pixelated data.
  • at block 420, the relationship between the rotated image pixels is selected. Exemplary embodiments previously described have included where the selected relationship is that of equivalent variance of pixel values in orthogonal directions and a zero coefficient of correlation between pixels, but many other relationships exist and may be used with the present invention.
  • an angle of rotation may be determined, as shown at block 425 , and the image is then transformed to rotate the biometric data to that desired angle of rotation at block 430 .
  • once the transform of block 430 has been applied, the relationship selected at block 420 is in effect.
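The flow of FIG. 4 can be sketched end to end for point-based biometric data. This is a minimal illustration, not the patent's implementation: the zero-covariance relationship stands in for the selection at block 420, and the helper names and the simulated second submission are assumptions.

```python
import math

def normalize_sample(points):
    """Blocks 415-430 of FIG. 4 in miniature: compute the rotation
    angle that zeroes the coordinate covariance (the relationship
    selected at block 420, computed at block 425), then relocate every
    point through that angle about the centroid (block 430)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    s_xx = sum((x - mx) ** 2 for x, _ in points)
    s_yy = sum((y - my) ** 2 for _, y in points)
    s_xy = sum((x - mx) * (y - my) for x, y in points)
    theta = 0.5 * math.atan2(2 * s_xy, s_xx - s_yy)  # block 425
    c, s = math.cos(-theta), math.sin(-theta)        # block 430
    return [(c * (x - mx) - s * (y - my),
             s * (x - mx) + c * (y - my)) for x, y in points]

def submit_at_angle(points, submission_angle):
    """Simulate a user submitting the same sample at a different
    submission angle (illustrative helper, not from the patent)."""
    c, s = math.cos(submission_angle), math.sin(submission_angle)
    return [(c * x - s * y, s * x + c * y) for x, y in points]
```

For modest submission angles (those that do not cross the 180° ambiguity discussed earlier), two submissions of the same sample normalize to the same coordinates, so the features subsequently extracted from them agree.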

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Collating Specific Patterns (AREA)

Abstract

Biometric data are obtained through a biometric input device (120) and subsequently pixelated via a pixelator (130). The pixelator (130) creates an image of the biometric data. Via a processing unit (110), a relationship between pixels of a transformed version of the image is asserted. Thus, the biometric data is rotated to a consistent inclination based on the relationship between pixels regardless of an orientation in which the biometric data were captured in the original image. Once the image has been transformed, features of the biometric data may be extracted and either stored in a data storage unit (140) or compared with previously stored feature values for validation of the biometric data.

Description

    RELATED APPLICATION DATA
  • This Application is based on U.S. Provisional Patent Application Ser. No. 60/579,422, filed on 14 Jun. 2004.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The invention described herein is related to image processing of biometric data to compensate for image skew. More particularly, the invention is related to methods and associated systems for transforming biometric data so that features may be extracted consistently therefrom regardless of the original orientation of the biometric data in the image.
  • 2. Description of the Prior Art
  • In the past decade, vast resources have been put into motion towards improving systems and methods for authenticating persons by automatic means. Machine authentication of persons spans several fields of endeavor, including network security, financial transaction authorization and the electronic execution of binding agreements and legal documents.
  • The field of biometrics, which utilizes physiological or behavioral phenomena that are particularly unique to an individual, has introduced many systems and methods for authentication, many of which are enjoying some popularity. Biometric data includes fingerprint, iris and face images, and voice and handwriting samples. But, while biometric systems and methods are becoming more prevalent, a greater reliance on the technology has been hampered due to the complex nature of the data metric itself.
  • One particular problem is that of the inconsistencies in biometric data. Indeed, extracting features from a biometric sample comes with its own complications in implementation, but when variability in the way an individual submits a biometric sample is introduced, consistent extraction of features becomes much more difficult. For example, in handwriting analysis, it is often the case that a signer will enter a sample at a different angle with respect to a previous sample upon which a feature template has been processed and stored for validation purposes. Whereas, to the trained human eye, it is apparent that a slanted version of a handwriting sample originates from the same person that signed a non-slanted version of the sample, automating the extraction of handwriting features to accomplish the recognition task by machine is complicated when the sample is skewed from the orientation on which a template was based.
  • Previous attempts to rectify images to a common orientation have primarily involved regression techniques to determine a regression line, which is then used as a reference for locating pertinent features for extraction. Examples of this type of technique are disclosed in U.S. Pat. No. 5,563,403 issued to Bessho, et al., U.S. Pat. No. 6,084,985 issued to Dolfing, et al., and U.S. Pat. No. 5,892,824 issued to Beatson, et al., the latter Patent having common inventorship with the present invention. These methods suffer from the dependency of the extraction process on the angle at which the data was submitted. Thus, key features extracted from the image for biometric verification are more variable and therefore less useful than if the image were rotated to a consistent and repeatable angle of inclination prior to feature extraction.
  • U.S. Pat. No. 5,828,772, issued to Kashi, et al., discloses normalization of signature data by certain ones of the signature's Fourier descriptors to reestablish the signature data in a common orientation. The normalization disclosed in Kashi, et al. translates the signature by the value of the zero-th, or "D.C." descriptor and scales and rotates the signature according to the first Fourier descriptor. However, the normalization described in Kashi, et al. is highly dependent on the sequential ordering of data points in the signature. For example, the rotation angle of the normalization depends on the location of the first point in the sequence with respect to its centroid. This sequential dependency severely limits the applicability of the normalization technique in that not all biometric data are arranged in a sequence. Many types of biometric data, such as that found in fingerprint and iris images, do not have among their properties a start and end point. The Fourier normalization disclosed in Kashi, et al. could not be applied to such non-serialized data. Moreover, because the normalization of Kashi, et al. requires Fourier analysis, it may be too costly to implement on certain platforms.
  • In light of the prior art, there is an apparent need for a system and associated methods for transforming biometric image data so that features can be consistently extracted therefrom regardless of the angle at which the data was submitted.
  • SUMMARY OF THE INVENTION
  • It is an object of the present invention to provide a method for spatially transforming biometric data for submission angle invariant feature extraction. Biometric data are obtained and pixelated to form a biometric image. The biometric data in the image are oriented at a submission angle with respect to a predetermined axis of the image. A statistical relationship between pixels of a transformed biometric image is selected. A transform is applied to the biometric image to relocate the pixels thereof so as to form a transformed biometric image, where the biometric image is rotated to an angle of inclination corresponding to the statistical relationship regardless of the submission angle.
  • It is a further object of the present invention to provide a method for verifying the validity of biometric data through invariant feature extraction. An input device is provided to obtain biometric data from a user and a storage unit is provided for storing features of the biometric data. A statistical relationship between pixels of a transformed image is selected. A first sample of biometric data is obtained from the user through the input device and the first sample is pixelated to form a first sample image. The first sample image includes a plurality of pixels at a corresponding plurality of pixel coordinates. The first sample is oriented in the image at a first submission angle with respect to a predetermined axis of the first sample image. A transform is applied to the first sample image to produce a first transformed biometric data image, where the first sample is oriented in the first transformed biometric data image at an angle of inclination corresponding to the selected relationship between pixels. Biometric features are extracted from the first transformed biometric image and stored in the storage unit. Subsequently, a second sample of biometric data is received from the user. The second sample is pixelated to form a second sample image. The transform is applied to the second sample image to produce a second transformed biometric data image so that the second sample is oriented in the second transformed biometric data image at the angle of inclination corresponding to the statistical relationship between pixels. Biometric features are extracted from the second transformed biometric data image and compared with the corresponding biometric data stored in the storage unit.
  • It is yet another object of the present invention to provide a system for spatially transforming biometric data for invariant feature extraction. The system includes an input device operable to obtain biometric data for the user, a pixelator operable to pixelate the biometric data into an image thereof, where the biometric data are oriented in the image at a submission angle with respect to a predetermined axis of the image. The system further includes a data storage unit operable to store features of the biometric data and a code storage unit operable to store sequences of program instructions that, when executed by a processing unit, cause the processor to execute a transformation process to relocate the pixels of the biometric data image to achieve an angle of inclination of biometric data corresponding to a statistical relationship between pixels thereof. The system further includes a processing unit coupled to the input device and the code and data storage units. The processing unit is operable to execute the transformation process.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic block diagram of an exemplary system configuration for carrying out aspects of the present invention;
  • FIG. 2A illustrates a biometric data image of a handwriting sample, or sign, which is oriented in the image at a submission angle;
  • FIG. 2B is an illustration of the user's sign after the transformation in accordance with the present invention;
  • FIG. 3A is an illustration of a pixelated sign after image transformation in which a regression line has been determined;
  • FIG. 3B illustrates a rotated image after a post transformation rotation process; and
  • FIG. 4 is a flow diagram of fundamental method steps of an exemplary embodiment of the present invention.
  • DESCRIPTION OF PREFERRED EMBODIMENTS
  • In the following description, reference will be made to particular types of biometric data for purposes of describing the invention to the skilled artisan. It is to be understood, however, that the present invention may be used with many different biometric data types, which are not specifically listed or described herein. It should be clear from the following description that among the beneficial features of the present invention is that any biometric data image may be transformed in accordance therewith so as to extract significant features consistently and regardless of the angle at which the biometric data are oriented in the image.
  • Referring now to FIG. 1, there is shown an exemplary system configuration, in simplified form, for implementing the present invention. It is to be understood that the components illustrated may be distributed across system boundaries or may be contained in a single unit. For example, the system 100 illustrated in FIG. 1 may be entirely implemented on a personal digital assistant (PDA). Alternatively, functions of processing unit 110, described in more detail below, may be distributed across many processing units interconnected by a communications network. Alternative implementations of the present invention will be clear to those of skill in the art upon reading the disclosure.
  • As shown in the Figure, the system 100 includes a biometric input device 120 for obtaining a biometric data sample from a user. Biometric input devices are well known in the art for obtaining a sample of the user's handwriting, a user fingerprint, iris and facial images, and other biometric data. Biometric input device 120 may include a transducer which converts a measurable biometric characteristic into an electrical signal for further processing. Such transducers include means for converting stylus position, pressure, temperature, optical radiance, and other physical qualities into a corresponding electrical signal. In certain implementations, the transducer produces an electrical signal which is proportional to the physical quality being measured.
  • Biometric input device 120 is coupled to a pixelator 130, which converts the signal from biometric input device 120 into discretely valued quantities corresponding to a coordinate in an image. For purposes of description, a “pixel,” as used herein, will refer to one or more values, which may be, for example, real, imaginary, or complex numbers, assigned to a coordinate in an image. A coordinate may be of a spatial coordinate system, a temporal coordinate system, or any other coordinate system, and an image will be used herein to refer to an ordered collection of such pixels. As such, it should be clear that an image need not be a planar array of pixels containing graphic data, which is a common usage of the term. An image, as used herein, may be formed from, for example, stylus position and/or stylus pressure data in a temporal coordinate system, such as would be produced by a handwriting digitizer known in the art. Alternatively, an image could be, for example, a spatial array of feature data, such as the whorl inclination angle of a fingerprint. It should be apparent to those skilled in the art upon reading the following descriptions that several image types may be used with the invention.
  • The pixelation process, i.e., forming the image of pixels in accordance with the descriptions above, may be accomplished in a variety of ways known in the art. For example, the electrical signal produced by the biometric input device may be sampled and converted into a digital number in a digital data stream. The numbers of the data stream may then be assigned to pixels of a digital image to produce a digitized copy of the original biometric data supplied by the user through biometric input device 120. Alternatively, pixelator 130 may collect quanta and store an electrical representation of a collected number thereof at individual locations of an array. An example of such a pixelator may be a charge coupled device (CCD). The output of pixelator 130 is then a pixelated image of the biometric data provided by the user.
  • In certain embodiments of the present invention, the biometric input device 120 and pixelator 130 are combined into a single physical unit. For example, many digitizers used for obtaining handwriting samples are configured to determine the position of a stylus, such as through electromagnetic sensors, when it is in contact with the digitizer pad. The data are presented at an output terminal of the digitizer as a time ordered stream of stylus position data, i.e., the coordinates of the stylus when it is in contact with the pad. The image produced is a stream of stylus position data in a temporal coordinate system. It should be clear to the skilled artisan that such system configurations and images produced thereby fall within the scope of the invention.
  • The description of digitizers above provides illustration of a concept which requires elaboration. It is to be noted that the same data used to form one image may be presented in an alternative image. For example, the temporally ordered positional data described above may alternatively be presented in a spatially ordered image of pixels containing data corresponding to “stylus up,” i.e., not in contact with the pad, or “stylus down,” i.e., in physical contact with the pad, in a spatial coordinate system corresponding to the known physical layout of the pad. The values “stylus up” or “stylus down” may be assigned to a pixel corresponding to the coordinate established by the same stylus position data provided as the pixel value in the temporally ordered image. Thus, it may be beneficial to practice portions of the invention while the data are formed in one image representation and other portions while the data are formed in an alternative image representation. The image representations actually employed will vary across the many different possible implementations of the present invention.
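The temporal-to-spatial re-representation described above can be sketched in a few lines of Python. This is an illustrative sketch only, not part of the disclosure; the helper name `to_spatial_image`, the pad dimensions, and the 0/1 pixel encoding are assumptions for demonstration.

```python
def to_spatial_image(samples, width, height):
    """samples: time-ordered (x, y, pen_down) tuples from a digitizer pad."""
    image = [[0] * width for _ in range(height)]   # 0 = "stylus up"
    for x, y, pen_down in samples:
        if pen_down and 0 <= x < width and 0 <= y < height:
            image[y][x] = 1                        # 1 = "stylus down" at this pad coordinate
    return image

# A short stroke in temporal order; the third sample lifts the stylus.
stroke = [(0, 0, True), (1, 1, True), (2, 1, False), (3, 2, True)]
img = to_spatial_image(stroke, width=4, height=3)
```

The same stylus samples could equally well be kept in their temporal ordering; which representation is used depends on which portion of the processing is being performed.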
  • Whereas, it should be clear to the skilled artisan that many image types of many different biometric data may be obtained by the system of FIG. 1, a handwriting sample will be used herein to facilitate the description of various aspects of the present invention. It is to be fully understood, however, that the present invention may be used with many different types of biometric data, and the handwriting sample described should not be construed to limit the scope of the present invention.
  • System 100 of FIG. 1 includes a processing unit 110 which is coupled to a data storage unit 140 and a code storage unit 150. As their respective descriptions suggest, data storage unit 140 is primarily used for the storage of biometric data and code storage unit 150 is used to store program instruction sequences that, when executed by processing unit 110, implement various procedures for carrying out aspects of the present invention. As is known in the art, data storage unit 140 and code storage unit 150 may actually reside in a single device such as a random access memory (RAM) or hard disk drive. Alternatively, data storage unit 140 and code storage unit 150 may be distributed across several devices. For example, code storage unit 150 may be implemented in a read only memory (ROM) device while data storage unit 140 may be implemented in a hard disk drive. Moreover, either or both of data storage unit 140 or code storage unit 150 may be physically removed from processing unit 110 while still being electrically coupled thereto via a communications network. Thus, as will be clear from the discussion below, the present invention may be implemented through numerous physical configurations of system 100.
  • System 100, in the exemplary embodiment shown, is operable to obtain biometric image data and extract pertinent features therefrom. The features extracted are particular to the type of biometric data utilized. These features are well known in the art and new features and feature extraction techniques are continuously being developed. Among its many beneficial features, the present invention affords consistent feature extraction that is invariant to the angle of submission of the biometric data.
  • Referring now to FIG. 2A, there is shown an image 210 of biometric data 220. In the exemplary embodiment shown in FIG. 2A, the biometric data 220 are in the form of a handwriting sample. More specifically, the handwriting sample 220 is a signature of the user. It is to be noted, however, that many other handwriting samples qualify for purposes of validating a particular user. As such, the signature 220 will be referred to as a sign 220 or handwriting sample 220 to emphasize that the handwriting sample 220 need not be a signature.
  • As is shown in the Figure, sign 220 exhibits an inclination with respect to the X axis of the image 210. This inclination will be referred to herein as the angle of submission θS or, alternatively, the submission angle θS. As previously discussed, a non-zero angle of submission is often encountered in biometric data collection. Clearly, a user cannot be expected to submit a handwriting sample, place a finger on a touchpad, or hold his head at a particular angle for purposes of a retinal scan in exactly the same orientation every time the individual is to provide biometric data for validation. As such, feature extraction for purposes of biometric data validation becomes more difficult.
  • Among the beneficial features of the present invention is the application of a transform through which a “universal” angle of rotation is applied to the biometric data in the image. The term “universal angle of rotation” is used herein to describe an angle of rotation which, when applied to the image data, always produces the same resulting angle of inclination of the biometric data. In certain embodiments of the invention, the desired inclination is determined by a statistical relationship between pixels of the transformed image. For example, the universal angle of rotation may be determined by the relationship where the variance in pixel values along the X axis is a scalar multiple of the variance in pixel values along the Y axis. In another embodiment, the covariance of pixel values in the X and Y directions is set to a specific value, such as zero. Many other relationships exist and will become apparent to the skilled artisan as the present invention is described, and all such relationships are considered to fall within the scope of the present invention.
  • Consider first an angle of rotation which brings about the relationship var(x) = k·var(y), where x, y are coordinates of the rotated image and X, Y are the pixel coordinates of the image prior to rotation. Further, assume k = 1, i.e., var(x) = var(y). The present invention then applies a universal angle of rotation to the image data such that, thereafter, the statistical relationship var(x) = var(y) between pixels of the transformed biometric data image will hold, regardless of the angle of submission θS.
  • In the exemplary embodiment of FIG. 2A, the biometric data image 210 is composed of a rectangular array of pixels distributed in the X and Y directions. It is desired to derive an angle of rotation through which the pixels of the biometric data image are to be rotated such that var(x) = var(y), where (x, y) is the set of transformed coordinates and var(x), var(y) are estimates of the variances of the pixel values in the transformed coordinate system. The desired angle of rotation θ = tan⁻¹M will achieve the design goal when M is appropriately chosen.
  • It is well known that rotation of axes in a Cartesian system is brought about through the relationships
    xᵢ = (Xᵢ + M·Yᵢ) / √(1 + M²)   (1)
    yᵢ = (Yᵢ − M·Xᵢ) / √(1 + M²).   (2)
    Thus, it is a design goal to find M such that
    var(x) = var((Xᵢ + M·Yᵢ)/√(1 + M²)) = var(y) = var((Yᵢ − M·Xᵢ)/√(1 + M²)).   (3)
    Simplifying, it is observed that M must provide a solution to
    var(Xᵢ + M·Yᵢ) = var(Yᵢ − M·Xᵢ).   (4)
  • It is a well known statistical identity that
    var(Xᵢ + M·Yᵢ) = var(X) + 2M·cov(X, Y) + M²·var(Y)   (5)
    and
    var(Yᵢ − M·Xᵢ) = var(Y) − 2M·cov(X, Y) + M²·var(X).   (6)
    Thus,
    var(X) + 2M·cov(X, Y) + M²·var(Y) = var(Y) − 2M·cov(X, Y) + M²·var(X).   (7)
    By gathering terms, a quadratic equation in M is formed:
    M²·{var(Y) − var(X)} + 4M·cov(X, Y) − {var(Y) − var(X)} = 0.   (8)
    Dividing through by √(var(X)·var(Y)) and defining a = √(var(X)/var(Y)) and r = cov(X, Y)/√(var(X)·var(Y)):
    M²·(a⁻¹ − a) + 4M·r − (a⁻¹ − a) = 0.   (9)
  • Thus, the universal angle of rotation that ensures the rotated image has the desired statistical relationship between pixels, i.e., var(x) = var(y), is defined by the relationship θ = tan⁻¹M, where
    M = 0, if a = 1;
    M = [2ra ± √(4r²a² + a⁴ − 2a² + 1)] / (a² − 1), if a ≠ 1.   (10)
    As is typical with solutions to quadratic equations, Eq. (10) produces two solutions, one for each of the operations defined by the ± operator. The two solutions provide angles of rotation that are 90° apart, i.e., perpendicular to each other.
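The derivation culminating in Eq. (10) can be exercised numerically. The sketch below is illustrative only; the helper names and the toy coordinates are assumptions. It estimates a and r from a sample, solves Eq. (10) for M taking one of the two ± branches, and confirms that the coordinates rotated per Eqs. (1) and (2) satisfy var(x) = var(y).

```python
import math

def var(v):
    m = sum(v) / len(v)
    return sum((t - m) ** 2 for t in v) / len(v)

def cov(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((p - mu) * (q - mv) for p, q in zip(u, v)) / len(u)

def universal_slope(X, Y):
    """Eq. (10): M = tan(theta); here the '+' branch of the two +/- roots."""
    a = math.sqrt(var(X) / var(Y))
    r = cov(X, Y) / math.sqrt(var(X) * var(Y))
    if a == 1:   # exact float equality suffices for a sketch
        return 0.0
    return (2 * r * a + math.sqrt(4 * r**2 * a**2 + a**4 - 2 * a**2 + 1)) / (a**2 - 1)

def rotate(X, Y, M):
    """Eqs. (1)-(2): rotate pixel coordinates through theta = arctan(M)."""
    d = math.sqrt(1 + M**2)
    x = [(Xi + M * Yi) / d for Xi, Yi in zip(X, Y)]
    y = [(Yi - M * Xi) / d for Xi, Yi in zip(X, Y)]
    return x, y

# Toy "sign" coordinates submitted at an arbitrary slant (illustrative data).
X = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
Y = [0.1, 0.9, 2.2, 2.8, 4.1, 5.2]
M = universal_slope(X, Y)
x, y = rotate(X, Y, M)
assert abs(var(x) - var(y)) < 1e-9   # the selected relationship now holds
```

Either ± branch of Eq. (10) satisfies Eq. (8); as the text notes, the two branches differ by a 90° rotation.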
  • Referring now to FIG. 2B, there is shown a rotated image 210′, in which the user's sign 220′ is at the inclination corresponding to the angle of rotation θ. It is to be noted that this inclination of the sign 220′ will be produced by embodiments of the present invention regardless of the angle of submission θS of the user's original sign. Certain embodiments of the present invention analyze the original image of the biometric data 210 to determine the statistical properties thereof, and these properties are then utilized to determine the angle of rotation that achieves the desired statistical relationship between pixels in the rotated image 210′. From the rotated image, features may be extracted for purposes of validation, and because the sign 220′ is at a consistent inclination with a sign from which template features are extracted, feature recognition and correspondence are improved.
  • In certain embodiments of the present invention, the relationship between pixels of the rotated image is that where the covariance of the pixels thereof is to be set to some constant value, such as zero. Using the same coordinate transformation described in Eqs. (1) and (2), the angle θ = tan⁻¹M that ensures that cov(x, y) = 0, i.e.,
    cov(x, y) = (1/N)·Σᵢ xᵢyᵢ − (1/N²)·(Σᵢ xᵢ)·(Σᵢ yᵢ) = 0,   (11)
    where the sums are taken over i = 1, …, N, leads to
    (1/N)·Σᵢ {(Xᵢ + M·Yᵢ)(Yᵢ − M·Xᵢ)} − (1/N²)·Σᵢ (Xᵢ + M·Yᵢ)·Σᵢ (Yᵢ − M·Xᵢ) = 0.   (12)
    Eq. (12) may be reduced to
    (1 − M²)·{N·Σᵢ XᵢYᵢ − Σᵢ Xᵢ·Σᵢ Yᵢ} + M·{N·Σᵢ Yᵢ² − (Σᵢ Yᵢ)²} − M·{N·Σᵢ Xᵢ² − (Σᵢ Xᵢ)²} = 0.   (13)
    If Eq. (13) is divided through by N²·√(var(X)·var(Y)), the equation may be reduced to:
    (1 − M²)·r + M·√(var(Y)/var(X)) − M·√(var(X)/var(Y)) = 0.   (14)
    Setting a = √(var(X)/var(Y)),
    r·M² + M·(a − a⁻¹) − r = 0.   (15)
    Thus, setting cov(x, y) = 0 yields two perpendicular solutions,
    M = 0, if r = 0;
    M = [(1 − a²) ± √(4r²a² + a⁴ − 2a² + 1)] / (2ra), if r ≠ 0.   (16)
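The zero-covariance variant admits the same kind of numerical check as Eq. (10). In this sketch (helper names and data are illustrative assumptions), M is taken from one ± branch of Eq. (16) and the rotated coordinates are confirmed to be uncorrelated.

```python
import math

def stats(X, Y):
    """Return (var(X), var(Y), cov(X, Y)) of two coordinate lists."""
    n = len(X)
    mx, my = sum(X) / n, sum(Y) / n
    vx = sum((v - mx) ** 2 for v in X) / n
    vy = sum((v - my) ** 2 for v in Y) / n
    cxy = sum((p - mx) * (q - my) for p, q in zip(X, Y)) / n
    return vx, vy, cxy

def decorrelating_slope(X, Y):
    """Eq. (16): M = tan(theta) driving cov(x, y) to zero ('+' branch)."""
    vx, vy, cxy = stats(X, Y)
    a = math.sqrt(vx / vy)
    r = cxy / math.sqrt(vx * vy)
    if r == 0:
        return 0.0
    return ((1 - a**2) + math.sqrt(4 * r**2 * a**2 + a**4 - 2 * a**2 + 1)) / (2 * r * a)

X = [0.0, 1.0, 2.0, 3.0, 4.0]
Y = [0.2, 1.1, 1.9, 3.2, 3.9]
M = decorrelating_slope(X, Y)
d = math.sqrt(1 + M**2)
x = [(Xi + M * Yi) / d for Xi, Yi in zip(X, Y)]   # Eq. (1)
y = [(Yi - M * Xi) / d for Xi, Yi in zip(X, Y)]   # Eq. (2)
assert abs(stats(x, y)[2]) < 1e-9   # cov(x, y) is now zero
```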
  • The rotation defined by Eq. (10) may, depending upon the inclination of the biometric data in the original image, produce an inverted image, i.e., an image rotated 180° from the expected image. However, many key features of biometric data may be defined which are not affected by such an inversion. If features are included which are sensitive to a 180° rotation, those features may be extracted both for the image after applying the universal angle of rotation and for its equivalent after rotating the image through a further 180°. The closest match to the data in a biometric template stored in data storage unit 140 can be used for validation purposes.
  • In accordance with yet another of the beneficial features of the present invention, a further rotation may be imposed on the biometric data after the data has been transformed while maintaining the statistical relationship between pixels. For example, in certain applications, it may be desired to rotate the biometric data to a “natural” state, e.g., one that appears visually natural to an observer. A specific example of this may be where it is desired that the handwriting sample be rotated such that it is parallel to a longitudinal axis of the enclosing image. The additional rotation which retains the previously established statistical relationship may be accomplished through aspects of the invention discussed in paragraphs that follow.
  • Referring to FIG. 3A, there is shown the image 210′ which has been transformed by the universal rotation angle in accordance with the present invention. The biometric data 220′ is illustrated in the Figure as discrete points to emphasize the fact that the data are discrete. A line fitting technique, to be described presently, has produced a fitted line 330 to the data which makes an angle θR with the X axis of the image. It is important to note that θR may not be the same as the inclination angle produced by the transformation which achieves the relationship between pixels of the image. The inclination of the line 330, θR, is dependent upon the actual line fitting technique utilized.
  • After applying the universal angle of rotation according to Eq. (10), the set of pixels at the coordinates (x, y) has the property that var(x) = var(y). A new correlation coefficient for the transformed data set, r₁, may be defined in terms of the original values of r and a. From the definition of the correlation coefficient,
    r₁ = cov(x, y) / √(var(x)·var(y)),   (17)
    and since var(x) = var(y),
    r₁ = cov(x, y)/var(x) = cov(x, y)/var(y).   (18)
    Since
    var(x) = [var(X) + 2M·cov(X, Y) + M²·var(Y)] / (1 + M²)   (19)
    and
    xᵢ = (Xᵢ + M·Yᵢ) / √(1 + M²)   (20)
    yᵢ = (Yᵢ − M·Xᵢ) / √(1 + M²),   (21)
    and using known identities for cov(x, y),
    r₁ = [(1/N)·Σᵢ (Xᵢ + M·Yᵢ)(Yᵢ − M·Xᵢ) − (1/N²)·Σᵢ (Xᵢ + M·Yᵢ)·Σᵢ (Yᵢ − M·Xᵢ)] / [var(X) + 2M·cov(X, Y) + M²·var(Y)].   (22)
    Eq. (22) reduces to
    r₁ = [(1 − M²)·cov(X, Y) − M·var(X) + M·var(Y)] / [var(X) + 2M·cov(X, Y) + M²·var(Y)].   (23)
    Dividing the numerator and denominator of Eq. (23) by √(var(X)·var(Y)) results in
    r₁ = [r·(1 − M²) − M·a + M·a⁻¹] / [a + 2M·r + M²·a⁻¹],   (24)
    where a = √(var(X)/var(Y)). Multiplying the numerator and denominator of Eq. (24) by a results in
    r₁ = [a·r·(1 − M²) − M·a² + M] / [a² + 2M·r·a + M²],   (25)
    or
    r₁ = [M·(1 − a²) + (1 − M²)·r·a] / [a² + 2M·r·a + M²].   (26)
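Eq. (26) can be checked by computing r₁ two ways: once from the closed form in a, r and M, and once directly from the rotated coordinates. The sketch below uses illustrative toy data and assumes the '+' branch of Eq. (10).

```python
import math

def moments(X, Y):
    """Return (var(X), var(Y), cov(X, Y)) of two coordinate lists."""
    n = len(X)
    mx, my = sum(X) / n, sum(Y) / n
    vx = sum((v - mx) ** 2 for v in X) / n
    vy = sum((v - my) ** 2 for v in Y) / n
    cxy = sum((p - mx) * (q - my) for p, q in zip(X, Y)) / n
    return vx, vy, cxy

X = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
Y = [0.3, 0.8, 2.4, 2.6, 4.2, 5.0]
vx, vy, cxy = moments(X, Y)
a = math.sqrt(vx / vy)
r = cxy / math.sqrt(vx * vy)

# Universal slope from Eq. (10) (a != 1 for this data; '+' branch).
M = (2 * r * a + math.sqrt(4 * r**2 * a**2 + a**4 - 2 * a**2 + 1)) / (a**2 - 1)

# Closed form for the post-rotation correlation, Eq. (26).
r1_closed = (M * (1 - a**2) + (1 - M**2) * r * a) / (a**2 + 2 * M * r * a + M**2)

# Direct computation on the rotated coordinates, Eqs. (20)-(21).
d = math.sqrt(1 + M**2)
x = [(Xi + M * Yi) / d for Xi, Yi in zip(X, Y)]
y = [(Yi - M * Xi) / d for Xi, Yi in zip(X, Y)]
vx2, vy2, cxy2 = moments(x, y)
r1_direct = cxy2 / math.sqrt(vx2 * vy2)
assert abs(r1_closed - r1_direct) < 1e-9
```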
  • The angle of inclination of any one image is consistently defined by the chosen statistical relationship between pixels. A further rotation may be applied based upon defining a line of regression (LR) on the rotated data. The resultant angle of rotation, i.e., the universal rotation plus the line of regression rotation, will still preserve the property that the image, irrespective of the initial angle of submission of the biometric data, can be rotated to a consistent angle of inclination. Applying the LR rotation after the universal rotation will generally restore the biometric data to the desired aspect ratio, such as the horizontal aspect ratio of the user sign illustrated in FIG. 3B. As previously stated, the resultant angle of inclination of the biometric data in the image will be dependent upon the equation used for the line of regression and there are numerous methods known in the art.
  • To illustrate the additional rotation concept, the following least squares estimator minimizes the sum of squared errors of the perpendicular distance from the X, Y pixels onto the line of regression. With var(x) = var(y), the property instilled onto the image after universal rotation, an exemplary LR rotation is defined by the angle tan⁻¹m, where m is the solution to
    2m³ − m²·(r₁ + 1) + 2m·(r₁ + 1) − (2r₁ + 1) = 0.   (27)
    It is a property of Eq. (27) that −1 < m < 1 for all r₁, −1 ≤ r₁ ≤ 1. Thus, the LR rotation is between ±45°.
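Eq. (27) is a cubic in m whose left-hand side is negative at m = −1 and positive at m = +1 for every r₁ in [−1, 1], so a root always lies in (−1, 1) and plain bisection recovers it. The helper below is an illustrative sketch, not the patent's own solver.

```python
def lr_slope(r1, tol=1e-12):
    """Solve Eq. (27) for m by bisection; the root lies in (-1, 1)."""
    f = lambda m: 2 * m**3 - m**2 * (r1 + 1) + 2 * m * (r1 + 1) - (2 * r1 + 1)
    lo, hi = -1.0, 1.0   # f(-1) < 0 and f(1) > 0 for all -1 <= r1 <= 1
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

m = lr_slope(0.5)    # LR slope for a post-rotation correlation r1 of 0.5
assert -1 < m < 1    # i.e., the LR rotation stays within +/- 45 degrees
```

Bisection is safe here because the cubic's derivative, 6m² − 2m(r₁+1) + 2(r₁+1), has negative discriminant for r₁ in [−1, 1], so f is strictly increasing and the root is unique.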
  • The overall angle of rotation may be given by tan⁻¹M₁, where
    M₁ = (M + m) / (1 − M·m).   (28)
    The original X, Y axes may then be rotated through an angle θ, generating a new set of pixels in (x, y), using the transformation:
    xᵢ = Xᵢ·cos θ + Yᵢ·sin θ   (29)
    yᵢ = Yᵢ·cos θ − Xᵢ·sin θ,   (30)
    where θ = tan⁻¹M or θ = tan⁻¹M₁, and the image will always be rotated to a consistent angle of inclination. The choice between using M or M₁ depends upon the actual implementation. Using M₁ will generate images with a horizontal aspect ratio and using M will generate images with a square aspect ratio. The values of a and r calculated from the transformed image when the rotation is defined by M₁ may be denoted as a₂ and r₂.
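The composition in Eq. (28) is the tangent-addition identity, and Eqs. (29)-(30) then apply the combined angle. A small sketch, with angles chosen purely for illustration:

```python
import math

def combine_slopes(M, m):
    """Eq. (28): tangent-addition of the universal and LR rotation slopes."""
    return (M + m) / (1 - M * m)

def rotate_through(X, Y, M1):
    """Eqs. (29)-(30): rotate the axes through theta = arctan(M1)."""
    theta = math.atan(M1)
    c, s = math.cos(theta), math.sin(theta)
    x = [Xi * c + Yi * s for Xi, Yi in zip(X, Y)]
    y = [Yi * c - Xi * s for Xi, Yi in zip(X, Y)]
    return x, y

# Illustrative slopes: a 30 degree universal rotation followed by a
# 15 degree line-of-regression correction should combine to 45 degrees.
M, m = math.tan(math.radians(30)), math.tan(math.radians(15))
M1 = combine_slopes(M, m)
assert abs(math.degrees(math.atan(M1)) - 45.0) < 1e-9

x, y = rotate_through([1.0], [0.0], M1)   # unit vector along the X axis
assert abs(x[0] - math.cos(math.radians(45))) < 1e-9
```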
  • To illustrate the effectiveness of the present invention, five renditions of a sign that were submitted at a significant submission angle (Slanted Sign) were processed by the invention as well as five renditions of the same sign that were not at a significant submission angle (Horizontal Sign). The following table illustrates comparisons through respective correlation coefficients under the following conditions: (1) the image was not rotated, (2) the image was processed by a line of regression technique of the prior art, (3) the image was subjected to the universal rotation transformation of the present invention, and (4) the image was transformed to a universal rotation and then rotated by the LR rotation described above.
    TABLE 1
    Image Rotation Performance Comparison
    Type of Rotation        Horizontal Sign           Slanted Sign              Difference
                            Correlation Coefficient   Correlation Coefficient
    None                    −0.242                     0.016                    0.258
    Line of Regression      −0.182                     0.015                    0.197
    Universal Rotation       0.303                     0.221                    0.082
    Universal Rotation/LR    0.121                     0.081                    0.040
  • FIG. 4 illustrates via a flow diagram fundamental method steps of certain embodiments of the present invention. The method is entered at start block 405 and flow is transferred to block 410, where a sample of biometric data is obtained via biometric input device 120. As previously stated, the biometric data sample may be a handwriting sample, facial, iris or fingerprint features, or many other biometric quantities. The sample is then pixelated at block 415 and an image is formed from the pixelated data. At block 420, the relationship between the rotated image pixels is selected. Exemplary embodiments previously described have included the relationships of equivalent variance of pixel values in orthogonal directions and of a zero coefficient of correlation between pixels, but many other relationships exist and may be used with the present invention. From the selected relationship, an angle of rotation may be determined, as shown at block 425, and the image is then transformed to rotate the biometric data to that desired angle of rotation at block 430. When the image has been transformed, the relationship selected at block 420 is in effect.
  • Once the image of the biometric data has been transformed, it is then decided whether or not further rotation of the image is desired, as shown at decision block 435. If it is decided that the image is to be further rotated, flow transfers to block 440, where a line is fit to the rotated image. At block 445, the fitted line is used as a reference to rotate the image until the desired resulting angle is achieved. Once this has been accomplished, or if further rotation is not necessary, flow is transferred to block 450, at which features of the biometric data are extracted, for example, for purposes of identity validation.
  • When the pertinent features have been extracted at block 450, it is then decided, at decision block 460, whether these features are to be stored in a template for use as reference against subsequent data in validation processes, or if the extracted features are those to be compared with previously stored features. If the features extracted are to be stored as a template for future validation processes, flow is transferred to block 455, where the features are stored, and the process is terminated at block 480. If, at block 460, it is determined that the features are to be compared with stored values, the stored template is retrieved at block 465 and features stored therein compared with the extracted features at block 470. Flow is then transferred to block 475 where the biometric data are validated against the stored values in accordance with the appropriate validation techniques known in the art. The process is then terminated at end block 480.
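The FIG. 4 flow reduces, at its core, to: pixelate, rotate to the consistent inclination, extract features, then either store a template or compare against one. The end-to-end sketch below is illustrative only: the dictionary template store, the `moment_features` extractor, and the Euclidean comparison are stand-ins, not the validation techniques the patent contemplates.

```python
import math

templates = {}   # user -> stored feature vector (stand-in for storage unit 140)

def rotate_to_consistent_angle(points):
    """Universal rotation per Eqs. (1)-(2) and (10): equalize var(x), var(y)."""
    X, Y = [p[0] for p in points], [p[1] for p in points]
    n = len(points)
    mx, my = sum(X) / n, sum(Y) / n
    vx = sum((v - mx) ** 2 for v in X) / n
    vy = sum((v - my) ** 2 for v in Y) / n
    c = sum((p - mx) * (q - my) for p, q in zip(X, Y)) / n
    a, r = math.sqrt(vx / vy), c / math.sqrt(vx * vy)
    M = 0.0 if a == 1 else (
        2 * r * a + math.sqrt(4 * r**2 * a**2 + a**4 - 2 * a**2 + 1)) / (a**2 - 1)
    d = math.sqrt(1 + M**2)
    return [((Xi + M * Yi) / d, (Yi - M * Xi) / d) for Xi, Yi in points]

def moment_features(points):
    # Placeholder extractor: per-axis variances of the rotated sample.
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((v - mx) ** 2 for v in xs) / n,
            sum((v - my) ** 2 for v in ys) / n)

def process(user, points, extract=moment_features):
    """FIG. 4 flow: rotate, extract, then enroll (first visit) or verify."""
    features = extract(rotate_to_consistent_angle(points))
    if user not in templates:              # block 455: store as template
        templates[user] = features
        return None
    return math.dist(features, templates[user])   # blocks 465-470: compare

pts = [(0.0, 0.0), (1.0, 2.0), (2.0, 1.0), (3.0, 3.0), (4.0, 2.0)]
process("alice", pts)                      # first call enrolls a template
```

A later submission of the same sample at a different slant rotates to the same inclination, so the placeholder features match closely regardless of the submission angle.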
  • The descriptions above are intended to illustrate possible implementations of the present invention and are not restrictive. Many variations, modifications and alternatives will become apparent to those skilled in the art upon review of this disclosure. For example, components equivalent to those shown and described may be substituted therefor, elements and methods individually described may be combined, and elements described as discrete may be distributed across many components. The scope of the invention should therefore be determined not with reference to the description above, but with reference to the appended Claims, along with their full range of equivalents.

Claims (20)

1. A method for spatially transforming biometric data for invariant feature extraction comprising the steps of:
selecting a statistical relationship between pixels of a transformed biometric data image;
pixelating the biometric data to form a biometric image thereof, said image including a plurality of pixels at a corresponding plurality of pixel coordinates, said biometric data oriented in said image at a submission angle with respect to a predetermined axis of said image; and
applying a transform to said biometric image to form said transformed biometric image, said biometric data being thereby rotated to an angle of inclination corresponding to said statistical relationship regardless of said submission angle.
2. The method for spatially transforming biometric data for invariant feature extraction as recited in claim 1, where said statistical relationship selecting step includes the step of selecting as said statistical relationship that a variance of pixel values in a first direction of said transformed biometric image is a scalar multiple of a variance of pixel values in a second direction of said transformed biometric image.
3. The method for spatially transforming biometric data for invariant feature extraction as recited in claim 2, where said statistical relationship selecting step includes the step of establishing that said second direction is perpendicular to said first direction.
4. The method for spatially transforming biometric data for invariant feature extraction as recited in claim 1, where said statistical relationship selecting step includes the step of selecting as said statistical relationship that a covariance of pixel values at each of said pixel coordinates of said transformed biometric image is a constant value.
5. The method for spatially transforming biometric data for invariant feature extraction as recited in claim 4, where said statistical relationship selecting step includes the step of establishing that said constant value is zero.
6. The method for spatially transforming biometric data for invariant feature extraction as recited in claim 1 further including the steps of:
fitting a line to said pixelated biometric data of said transformed biometric image; and
rotating said transformed biometric image so that said fitted line is oriented in said image at a predetermined angle with respect to a predetermined axis of said transformed biometric image.
7. The method for spatially transforming biometric data for invariant feature extraction as recited in claim 6 where said line fitting step includes the step of fitting said line using a regression procedure.
8. The method for spatially transforming biometric data for invariant feature extraction as recited in claim 7 where said line fitting step includes the step of establishing a least square estimation as said regression procedure.
9. The method for spatially transforming biometric data for invariant feature extraction as recited in claim 1, where said pixelating step includes the step of providing a handwriting sample as the biometric data.
10. A method for verifying the validity of biometric data by invariant feature extraction, the method comprising the steps of:
selecting a statistical relationship between pixels of a transformed biometric data image;
providing an input device for obtaining biometric data from a user;
providing a storage unit for storing features of said biometric data;
receiving a first sample of said biometric data from said user;
pixelating said first sample to form a first sample image, said first sample image including a plurality of pixels at a corresponding plurality of pixel coordinates, said first sample being oriented in said first sample image at a first submission angle with respect to a predetermined axis of said first sample image;
applying a transform to said first sample image to produce a first transformed biometric data image, said first sample being oriented in said first transformed biometric data image at an angle of inclination corresponding to said statistical relationship;
extracting biometric features from said first transformed biometric image and storing said features in said storage unit;
receiving a second sample of said biometric data from said user;
pixelating said second sample to form a second sample image, said second sample image including a plurality of pixels at a corresponding plurality of pixel coordinates, said second sample being oriented in said second sample image at a second submission angle with respect to a predetermined axis of said second sample image;
applying said transform to said second sample image to produce a second transformed biometric data image, said second sample being oriented in said second transformed biometric data image at said angle of inclination;
extracting said biometric features from said second transformed biometric data image; and
comparing said biometric features of said second transformed biometric data image to corresponding ones of said features stored in said storage unit.
11. The method for verifying the validity of biometric data by invariant feature extraction as recited in claim 10, where said statistical relationship selecting step includes the step of selecting as said statistical relationship that a variance of pixel values in a first direction of said biometric image is a scalar multiple of a variance of pixel values in a second direction of said biometric image.
12. The method for verifying the validity of biometric data by invariant feature extraction as recited in claim 11, where said statistical relationship selecting step includes the step of establishing that said second direction is perpendicular to said first direction.
13. The method for verifying the validity of biometric data by invariant feature extraction as recited in claim 10, where said statistical relationship selecting step includes the step of selecting as said statistical relationship that a covariance of pixel values at each of said pixel coordinates of said biometric image is a constant value.
14. The method for verifying the validity of biometric data by invariant feature extraction as recited in claim 13, where said statistical relationship selecting step includes the step of establishing that said constant value is zero.
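The statistical relationship of claims 13–14 (x–y covariance of the pixel coordinates equal to zero) pins down a unique angle of inclination via the second-order central moments: tan(2θ) = 2μ₁₁ / (μ₂₀ − μ₀₂). The following sketch computes that angle and verifies the covariance vanishes after rotation; the helper names are illustrative, not from the patent:

```python
import math

def central_moments(pixels):
    """Second-order central moments of a set of (x, y) pixel coordinates."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    mu20 = sum((x - cx) ** 2 for x, _ in pixels) / n
    mu02 = sum((y - cy) ** 2 for _, y in pixels) / n
    mu11 = sum((x - cx) * (y - cy) for x, y in pixels) / n
    return cx, cy, mu20, mu02, mu11

def decorrelating_angle(pixels):
    """Angle of inclination at which the x-y covariance becomes zero
    (the constant value of claims 13-14 chosen as zero):
    tan(2*theta) = 2*mu11 / (mu20 - mu02)."""
    _, _, mu20, mu02, mu11 = central_moments(pixels)
    return 0.5 * math.atan2(2.0 * mu11, mu20 - mu02)

def rotate_about_centroid(pixels, theta):
    """Rotate the coordinates by -theta about their centroid."""
    cx, cy, *_ = central_moments(pixels)
    c, s = math.cos(-theta), math.sin(-theta)
    return [(cx + c * (x - cx) - s * (y - cy),
             cy + s * (x - cx) + c * (y - cy)) for x, y in pixels]
```

Using `atan2` rather than a plain division keeps the computation well defined when μ₂₀ = μ₀₂, and returns the principal-axis angle in (−π/2, π/2].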
15. The method for verifying the validity of biometric data by invariant feature extraction as recited in claim 10 further including the steps of:
fitting a line to said pixelated first biometric sample in said first transformed biometric data image and to said pixelated second biometric sample in said second transformed biometric data image;
rotating said first transformed biometric data image so that said respective fitted line is oriented in said first transformed biometric data image at an angle with respect to a predetermined axis thereof prior to said first transformed biometric data image feature extraction step; and
rotating said second transformed biometric data image so that said respective fitted line is oriented in said second transformed biometric data image at an angle with respect to a predetermined axis thereof prior to said second transformed biometric data image feature extraction step.
16. The method for verifying the validity of biometric data by invariant feature extraction as recited in claim 15 where said line fitting step includes the step of fitting each said respective line using a regression procedure.
17. The method for verifying the validity of biometric data by invariant feature extraction as recited in claim 16 where said line fitting step includes the step of establishing a least squares estimation as said regression procedure.
18. The method for verifying the validity of biometric data by invariant feature extraction as recited in claim 10, where said first sample pixelating step and said second sample pixelating step each includes the step of providing a handwriting sample as said first sample and said second sample, respectively.
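For a handwriting sample (claim 18), the line fitting of claims 15–17 reduces to ordinary least-squares regression of y on x over the sample's pixel coordinates, followed by a rotation that places the fitted line at the chosen angle. A sketch under those assumptions, with hypothetical helper names:

```python
import math

def fit_line_angle(pixels):
    """Least-squares regression of y on x over the pixel coordinates
    (claims 16-17), returned as the angle of the fitted line."""
    n = len(pixels)
    mx = sum(x for x, _ in pixels) / n
    my = sum(y for _, y in pixels) / n
    sxy = sum((x - mx) * (y - my) for x, y in pixels)
    sxx = sum((x - mx) ** 2 for x, _ in pixels)
    return math.atan2(sxy, sxx)

def align_to_axis(pixels, target=0.0):
    """Rotate the image points about their centroid so that the fitted
    line sits at `target` radians with respect to the x axis."""
    theta = fit_line_angle(pixels) - target
    n = len(pixels)
    mx = sum(x for x, _ in pixels) / n
    my = sum(y for _, y in pixels) / n
    c, s = math.cos(-theta), math.sin(-theta)
    return [(mx + c * (x - mx) - s * (y - my),
             my + s * (x - mx) + c * (y - my)) for x, y in pixels]
```

For a signature, this has the effect of levelling the writing baseline before features are extracted, so the same strokes yield the same features regardless of how the signature slanted at capture.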
19. A system for spatially transforming biometric data for invariant feature extraction, the system comprising:
an input device operable to obtain biometric data from a user;
a pixelator operable to pixelate said biometric data into an image thereof, said image including a plurality of pixels at a corresponding plurality of pixel coordinates, said biometric data oriented in said image at a submission angle with respect to a predetermined axis of said image;
a storage unit operable to store sequences of program instructions that, when executed by a processing unit, cause said processing unit to execute a transformation process for transforming said image into a transformed image such that said biometric data are rotated to an angle of inclination in said transformed image corresponding to a predetermined statistical relationship between pixels thereof regardless of said submission angle; and
a processing unit coupled to said input device and said storage unit, said processing unit operable to execute said transformation process.
20. The system for spatially transforming biometric data for invariant feature extraction as recited in claim 19, wherein said storage unit further includes sequences of program instructions that, when executed by a processing unit, cause said processing unit to execute:
a line fitting process for fitting a line to said pixelated biometric data of said transformed image; and
an image rotation process for rotating said transformed image so that said fitted line is oriented in said image at an angle with respect to a predetermined axis of said rotated transformed image.
US11/151,412 2004-06-14 2005-06-14 System and methods for transforming biometric image data to a consistent angle of inclination Abandoned US20050276454A1 (en)

Priority Applications (11)

Application Number Priority Date Filing Date Title
US11/151,412 US20050276454A1 (en) 2004-06-14 2005-06-14 System and methods for transforming biometric image data to a consistent angle of inclination
US12/627,413 US7916907B2 (en) 2004-06-14 2009-11-30 System and methods for transforming biometric image data to a consistent angle of inclination
US12/931,340 US8842887B2 (en) 2004-06-14 2011-01-31 Method and system for combining a PIN and a biometric sample to provide template encryption and a trusted stand-alone computing device
US13/072,398 US8885894B2 (en) 2004-06-14 2011-03-25 Reduction of transaction fraud through the use of automatic centralized signature/sign verification combined with credit and fraud scoring during real-time payment card authorization processes
US14/198,695 US9286457B2 (en) 2004-06-14 2014-03-06 Method and system for providing password-free, hardware-rooted, ASIC-based authentication of a human to a mobile device using biometrics with a protected, local template to release trusted credentials to relying parties
US14/998,574 US9665704B2 (en) 2004-06-14 2016-01-21 Method and system for providing password-free, hardware-rooted, ASIC-based, authentication of human to a stand-alone computing device using biometrics with a protected local template to release trusted credentials to relying parties
US15/731,069 US9940453B2 (en) 2004-06-14 2017-04-14 Method and system for securing user access, data at rest and sensitive transactions using biometrics for mobile devices with protected, local templates
US15/909,218 US10515204B2 (en) 2004-06-14 2018-03-01 Method and system for securing user access, data at rest and sensitive transactions using biometrics for mobile devices with protected, local templates
US16/724,214 US10824714B2 (en) 2004-06-14 2019-12-21 Method and system for securing user access, data at rest, and sensitive transactions using biometrics for mobile devices with protected local templates
US17/082,743 US11449598B2 (en) 2004-06-14 2020-10-28 Method and system for securing user access, data at rest, and sensitive transactions using biometrics for mobile devices with protected local templates
US17/947,930 US11803633B1 (en) 2004-06-14 2022-09-19 Method and system for securing user access, data at rest and sensitive transactions using biometrics for mobile devices with protected, local templates

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US57942204P 2004-06-14 2004-06-14
US11/151,412 US20050276454A1 (en) 2004-06-14 2005-06-14 System and methods for transforming biometric image data to a consistent angle of inclination

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/627,413 Continuation US7916907B2 (en) 2004-06-14 2009-11-30 System and methods for transforming biometric image data to a consistent angle of inclination

Publications (1)

Publication Number Publication Date
US20050276454A1 true US20050276454A1 (en) 2005-12-15

Family

ID=35460561

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/151,412 Abandoned US20050276454A1 (en) 2004-06-14 2005-06-14 System and methods for transforming biometric image data to a consistent angle of inclination
US12/627,413 Expired - Fee Related US7916907B2 (en) 2004-06-14 2009-11-30 System and methods for transforming biometric image data to a consistent angle of inclination

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/627,413 Expired - Fee Related US7916907B2 (en) 2004-06-14 2009-11-30 System and methods for transforming biometric image data to a consistent angle of inclination

Country Status (1)

Country Link
US (2) US20050276454A1 (en)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8122259B2 (en) * 2005-09-01 2012-02-21 Bricom Technologies Ltd Systems and algorithms for stateless biometric recognition
JP5012092B2 (en) * 2007-03-02 2012-08-29 富士通株式会社 Biometric authentication device, biometric authentication program, and combined biometric authentication method
US8977013B2 (en) 2010-07-12 2015-03-10 The Institute For Diagnostic Imaging Research, University Of Windsor Biometric sensor and method for generating a three-dimensional representation of a portion of a finger
US8542889B2 (en) * 2010-10-19 2013-09-24 Apple Inc. Systems, methods, and computer-readable media for capturing a signature for use in a document
FR3028980B1 (en) * 2014-11-20 2017-01-13 Oberthur Technologies METHOD AND DEVICE FOR AUTHENTICATING A USER
US11651060B2 (en) * 2020-11-18 2023-05-16 International Business Machines Corporation Multi-factor fingerprint authenticator


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2909899A1 (en) 1979-03-14 1980-09-25 Dieter Dipl Phys Dr Philipp Fast data acquisition system for cash desk computer - has autonomous portable data acquisition units connectable to computer for data transfer
JP2603946B2 (en) * 1986-09-26 1997-04-23 オリンパス光学工業株式会社 Apparatus for detecting corresponding areas between images
EP0461779B1 (en) * 1990-06-15 1995-08-16 Yusuke Izumi Reaction accelerator for rearrangement of oxime to amide and process for producing amides by rearrangement of oximes
US6285802B1 (en) * 1999-04-08 2001-09-04 Litton Systems, Inc. Rotational correction and duplicate image identification by fourier transform correlation

Patent Citations (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3624293A (en) * 1970-03-19 1971-11-30 Shintron Co Inc Electrical inscribing
US3873770A (en) * 1974-03-21 1975-03-25 Bendix Corp Digital position measurement system with stylus tilt error compensation
US4028674A (en) * 1976-06-04 1977-06-07 Recognition Equipment Incorporated Automated signature verification system
US4202626A (en) * 1978-02-24 1980-05-13 A Patent Partnership Signature verification and authentication systems
US4240065A (en) * 1978-12-13 1980-12-16 Wigmore Professional Data Services Ltd. Position sensing apparatus
US4308522A (en) * 1979-03-19 1981-12-29 Ncr Corporation Identity verification apparatus and method
US4396902A (en) * 1980-07-07 1983-08-02 Recognition Equipment Incorporated OCR/Variable head slot reader
US4475235A (en) * 1982-01-04 1984-10-02 Rolm Corporation Signature verification sensor
US4672182A (en) * 1983-10-17 1987-06-09 Kabushiki Kaisha Toshiba Memory card
US4771460A (en) * 1984-02-09 1988-09-13 Kabushiki Kaishia Toshiba Data processing terminal device which stores a modified decrypted data in a programmable read only memory in order to detect alteration thereof
US5150420A (en) * 1985-10-21 1992-09-22 Omron Tateisi Electronics Co. Signature identification system
US4803351A (en) * 1986-03-12 1989-02-07 Casio Computer Co., Ltd. IC card system with control of data-writing process
US5054088A (en) * 1989-09-20 1991-10-01 International Business Machines Corporation Signature verification data compression for storage on an identification card
US5048085A (en) * 1989-10-06 1991-09-10 International Business Machines Corporation Transaction system security method and apparatus
US5115107A (en) * 1991-01-11 1992-05-19 Ncr Corporation Method of correcting skew between a digitizer and a digital display
US5195133A (en) * 1991-01-11 1993-03-16 Ncr Corporation Apparatus and method for producing a digitized transaction record including an encrypted signature
US5414441A (en) * 1991-01-11 1995-05-09 Ncr Corporation Temperature compensation apparatus for liquid crystal display
US5297202A (en) * 1991-01-11 1994-03-22 Ncr Corporation Apparatus and method for producing a digitized transaction record including an encrypted signature
US5101437A (en) * 1991-02-11 1992-03-31 Ecole Polytechnique Method and apparatus for comparing a test handwritten signature with a reference signature by using information relative to curvilinear and angular velocities of the signature
US5539159A (en) * 1991-05-17 1996-07-23 Ncr Corporation Handwriting capture device
US5272469A (en) * 1991-07-01 1993-12-21 Ncr Corporation Process for mapping high resolution data into a lower resolution depiction
US5140107A (en) * 1991-07-02 1992-08-18 Ncr Corporation Digitizer screen and method of making
US5283557A (en) * 1991-07-05 1994-02-01 Ncr Corporation Method for converting high resolution data into lower resolution data
US5191175A (en) * 1991-07-29 1993-03-02 Ncr Corporation Self-tuning digitizer control circuit and method
US5335230A (en) * 1991-08-12 1994-08-02 Ncr Corporation Apparatus and method for automatic digitizer fault detection
US5223677A (en) * 1991-09-09 1993-06-29 Ncr Corporation Handwriting capture device with insertable form interface
US5233547A (en) * 1991-11-12 1993-08-03 Ncr Corporation Electronic checking account apparatus and method having a digitizer to receive information as a check is being written
US5636291A (en) * 1992-01-08 1997-06-03 International Business Machines Corporation Continuous parameter hidden Markov model approach to automatic handwriting recognition
US5245139A (en) * 1992-02-18 1993-09-14 Ncr Corporation Apparatus and method for digitizer sampled point validation
US5225636A (en) * 1992-02-21 1993-07-06 Ncr Corporation Apparatus and method for digitizer point sampling and validation
US5373117A (en) * 1992-08-10 1994-12-13 Ncr Corporation Method for reducing errors in a digitizer
US5387765A (en) * 1992-09-02 1995-02-07 Ncr Corporation Data secure digitizer control circuit and method
US5479280A (en) * 1992-12-30 1995-12-26 Goldstar Co., Ltd. Active matrix for liquid crystal displays having two switching means and discharging means per pixel
US5563381A (en) * 1993-06-21 1996-10-08 Ncr Corporation Handwriting capture system with segmented digitizer
US5604802A (en) * 1993-10-29 1997-02-18 International Business Machines Corporation Transaction processing system
US5434928A (en) * 1993-12-06 1995-07-18 At&T Global Information Solutions Company Method for verifying a handwritten signature entered into a digitizer
US5680470A (en) * 1993-12-17 1997-10-21 Moussa; Ali Mohammed Method of automated signature verification
US5563403A (en) * 1993-12-27 1996-10-08 Ricoh Co., Ltd. Method and apparatus for detection of a skew angle of a document image using a regression coefficient
US5745598A (en) * 1994-03-11 1998-04-28 Shaw; Venson Ming Heng Statistics based segmentation and parameterization method for dynamic processing, identification, and verification of binary contour image
US5825906A (en) * 1994-11-30 1998-10-20 Nippondenso Co., Ltd. Signature recognition system
US5828772A (en) * 1995-12-27 1998-10-27 Lucent Technologies Inc. Method and apparatus for parametric signature verification using global features and stroke-direction codes
US5892824A (en) * 1996-01-12 1999-04-06 International Verifact Inc. Signature capture/verification systems and methods
US6084985A (en) * 1996-10-04 2000-07-04 U.S. Philips Corporation Method and apparatus for on-line handwriting recognition based on feature vectors that use aggregated observations derived from time-sequential frames
US6226417B1 (en) * 1997-08-04 2001-05-01 Ricoh Company, Ltd. Method and system for recognizing a rotated image pattern with reduced processing time and memory space
US6157731A (en) * 1998-07-01 2000-12-05 Lucent Technologies Inc. Signature verification method using hidden markov models
US6882746B1 (en) * 1999-02-01 2005-04-19 Thomson Licensing S.A. Normalized bitmap representation of visual object's shape for search/query/filtering applications
US6571002B1 (en) * 1999-05-13 2003-05-27 Mitsubishi Denki Kabushiki Kaisha Eye open/close detection through correlation
US6734998B2 (en) * 2000-02-18 2004-05-11 Mustek Systems Inc. Method for determining scan line misalignments
US20020136469A1 (en) * 2001-03-23 2002-09-26 Fujitsu Limited Image processing apparatus, image processing method and computer-readable medium on which image processing program is recorded
US20030007691A1 (en) * 2001-07-06 2003-01-09 Glory Ltd. Signature collation apparatus, signature collation method, and program which causes computer to execute the method
US20030126448A1 (en) * 2001-07-12 2003-07-03 Russo Anthony P. Method and system for biometric image assembly from multiple partial biometric frame scans
US20030210817A1 (en) * 2002-05-10 2003-11-13 Microsoft Corporation Preprocessing of multi-line rotated electronic ink
US20050089248A1 (en) * 2003-01-28 2005-04-28 Konstantin Zuev Adjustment method of a machine-readable form model and a filled form scanned image thereof in the presence of distortion
US20040170318A1 (en) * 2003-02-28 2004-09-02 Eastman Kodak Company Method for detecting color objects in digital images

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110135203A1 (en) * 2009-01-29 2011-06-09 Nec Corporation Feature selection device
US8620087B2 (en) * 2009-01-29 2013-12-31 Nec Corporation Feature selection device
TWI564821B (en) * 2015-11-10 2017-01-01 財團法人工業技術研究院 Organic object classification method and organic object classification device
CN113553558A (en) * 2016-06-30 2021-10-26 微软技术许可有限责任公司 Detecting attacks using leaked credentials via internal network monitoring
CN109409143A (en) * 2018-12-21 2019-03-01 北京思源互联科技有限公司 A kind of safety keyboard system and method

Also Published As

Publication number Publication date
US7916907B2 (en) 2011-03-29
US20100142763A1 (en) 2010-06-10

Similar Documents

Publication Publication Date Title
US7916907B2 (en) System and methods for transforming biometric image data to a consistent angle of inclination
US6810480B1 (en) Verification of identity and continued presence of computer users
US7274807B2 (en) Method and apparatus for supporting a biometric registration performed on a card
US7142699B2 (en) Fingerprint matching using ridge feature maps
US7391891B2 (en) Method and apparatus for supporting a biometric registration performed on an authentication server
US8005277B2 (en) Secure fingerprint matching by hashing localized information
CN100382093C (en) Registration method for biometrics authentication system, biometrics authentication system, and program for same
Paul et al. Multimodal cancelable biometrics
US8538096B2 (en) Methods and apparatus for generation of cancelable fingerprint template
US8908934B2 (en) Fingerprint recognition for low computing power applications
JP2012519928A (en) Fingerprint template synthesis and fingerprint mosaicking method using point matching algorithm
EP1280094B1 (en) Biometric identification method and apparatus using one
Hooda ATM security
EP1385118A2 (en) Method and apparatus for supporting a biometric registration performed on a card
CN108573212B (en) Palm feature identity authentication method and device
Krivokuća et al. Fast fingerprint alignment method based on minutiae orientation histograms
KR100919486B1 (en) Method for aligning concealed fingerprint data using partial geometric hashing, Method for authenticating fingerprint data using partial geometric hashing, Apparatus and System thereof
Sanchez-Reillo et al. Improving access control security using iris identification
Csongrády et al. Spectral biometrical recognition of fingerprints
Seto et al. Standardization of accuracy evaluation for biometric authentication in Japan
Xia et al. Fingerprint liveness detection using difference co-occurrence matrix based texture features
Chadha et al. Rotation, Scaling and Translation Analysis of Biometric Signature Templates
Rithvik et al. Fingerprint Password Method Provides Improved Accuracy over Token-based Authentication for Efficient and Secure File Transfers
Hamouda et al. Innovative Hetero-Associative Memory Encoder (HAMTE) for Palmprint Template Protection.
Uprety et al. Polar Harmonic Transform for Fingerprint Recognition

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION