
US20030081141A1 - Brightness adjustment method - Google Patents

Brightness adjustment method

Info

Publication number
US20030081141A1
Authority
US
United States
Prior art keywords
brightness
area
transition
image
pixels
Prior art date
Legal status
Abandoned
Application number
US10/309,448
Inventor
Douglas Mazzapica
Current Assignee
Individual
Original Assignee
Individual
Application filed by Individual
Priority to US10/309,448
Publication of US20030081141A1
Status: Abandoned

Classifications

    • H04N 1/40093: Modification of content of picture, e.g. retouching
    • H04N 1/407: Control or modification of tonal gradation or of extreme levels, e.g. background level
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/75: Circuitry for compensating brightness variation in the scene by influencing optical camera components
    • H04N 25/61: Noise processing, the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"

Definitions

  • In the above embodiment, the same adjustment coefficient is applied to the pixels of all three primary colors in the same area.
  • Alternatively, the neutral density adjustment need not be neutral. That is, rather than adjusting the brightness of the pixels of all three primary colors by the same adjustment coefficient, separate adjustment coefficients may be applied to the pixels of different colors.
  • For example, the second adjustment coefficient may be applied only to the blue pixels of the corresponding area, while the red and green pixels of the corresponding area are adjusted by coefficients different from the second adjustment coefficient.
  • In another example, the second adjustment coefficient is applied to the blue pixels only, while the coefficients applied to the red and green pixels are zero.
  • Software is stored in the digital camera for segmenting the image captured by the image capture device into a plurality of areas. All the areas are active and ready to accept a brightness adjustment of neutral density or color.
  • The above method can be applied to digital cameras, digital video cameras, film scanners, and other digital image processing systems.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)
  • Facsimile Image Signal Circuits (AREA)

Abstract

A brightness adjustment method. An image of a scene is provided. The image is segmented into a plurality of areas, where each area has a plurality of pixels. One of the areas is selected as a transition area, and a transition brightness of the transition area is defined. The brightness of each remaining area of the image is also defined and compared to the transition brightness. According to the comparison result, the remaining areas are grouped into a plurality of regions, and a coefficient is determined for each region. The brightness of each pixel of the areas in each region is then adjusted by the corresponding coefficient.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This invention is a continuation-in-part of the previously filed application Ser. No. 09/954,326. [0001]
  • STATEMENT RE: FEDERALLY SPONSORED RESEARCH/DEVELOPMENT
  • (Not Applicable) [0002]
  • BACKGROUND OF THE INVENTION
  • The present invention relates generally to a brightness adjustment method, and more particularly, to a method modifying the relationship between the actual brightness of a scene and the brightness of the image recorded in a photograph, such that the scene can be reproduced with enhanced fidelity. [0003]
  • A digital camera has an image capturing device such as a charge-coupled device (CCD) to capture an image and save it to a memory. The exposure step exposes the image capture device to the image of the scene, which is then recorded in a medium such as a film, a disk, or a display screen. Modern image capturing devices are sensitive to a broad range of brightness, or light levels, reflected or emitted from the scene. However, when the image is recorded in the medium, overexposure and underexposure frequently occur in some parts of the image due to excessive and insufficient brightness, respectively. This is because most currently available media for recording the image accept a relatively small range of light levels (brightness) compared to the image capture device. For many media, a black image can be rendered as black as it is, but an image with a very high light level, such as the sun, can hardly be resolved. [0004]
  • To obtain a properly exposed image recorded by the digital camera, particularly when the scene from which the image is captured contains an element much brighter than the others, the exposure value has to be controlled or adjusted. In many applications, such as digital still cameras, digital video cameras, film and print scanners, and motion picture transfer systems that convert a light image into a recordable image, overall exposure adjustment and color shift can be achieved. That is, the exposure value of the whole scene is adjusted either to render the image of the much brighter element or to resolve the remaining darker elements of the scene. When the exposure value is adjusted (normally reduced) to adequately reproduce the brighter element, the remaining elements are very likely to be underexposed. In contrast, by adjusting the exposure value to resolve the remaining darker elements, the details of the brighter element are blown out of the acceptable range, that is, overexposed. [0005]
  • FIG. 1 shows a graph of how a digital camera translates the input image brightness into the light intensity rendered in the picture processed by the digital camera. In FIG. 1, the dashed line shows the exposure adjustment to render the brighter element of the scene, and the dotted line shows the exposure adjustment for reproducing the remaining objects darker than the brighter element of the scene. In either case, a part of the scene will be lost in the picture. [0006]
  • An alternative way to photograph a scene with an element much brighter than the remaining elements is to manually filter the brighter element. That is, a neutral density filter is disposed between the digital camera and either the brighter element of the scene or an illumination source of the brighter element. In this way, the brightness of the brighter element can be effectively reduced to within the acceptable range of the digital camera. [0007]
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention provides a method of brightness adjustment similar in concept to the manual filtering mentioned above. An image capture device is used to capture an image of a scene. The image capture device includes a charge-coupled device, for example. The image is segmented into a plurality of areas, and each area comprises a plurality of pixels. One of the areas is selected as a transition area, and a transition brightness of the transition area is defined. The brightness of the remaining areas of the image is then defined and compared to the transition brightness. According to the comparison result, the remaining areas are grouped into various regions, and a coefficient is determined for each region. The brightness of the pixels in the areas of each region is then adjusted by the corresponding coefficient. [0008]
  • In the above method, the transition area may be selected as an area that maps to a particular element of the scene, or as an area whose brightness is closest to a predetermined value. The transition area may also be selected as an area located at a specific position. The transition brightness may be defined by computing an average brightness of the pixels of the transition area, or as a predetermined percentage of a peak brightness among the pixels of the transition area. Alternatively, the transition brightness may be defined as the median brightness among the pixels of the transition area. [0009]
  • In one embodiment of the present invention, the remaining areas are grouped into a lower region and a higher region. The areas grouped into the lower region have brightness lower than the transition brightness, while the areas grouped into the higher region have brightness higher than the transition brightness. Consequently, a first coefficient and a second coefficient are determined for the areas in the lower and higher regions, respectively. The brightness of the pixels of each area in the lower region is adjusted by multiplying by the first coefficient, and the brightness of the pixels of each area in the higher region is adjusted by multiplying by the second coefficient and adding a constant. Preferably, the first coefficient is larger than the second coefficient. [0010]
  • The present invention further provides a brightness adjustment method, in which an image of a scene is captured by an image capture device. The image is segmented into a plurality of areas, where each area comprises a plurality of pixels in three primary colors. One of the areas is selected as a transition area, and a transition brightness for the transition area is defined. The brightness of each of the remaining areas is compared to the transition brightness, and a coefficient is determined for each of the remaining areas. The brightness of the pixels of at least one color in at least one area is adjusted by the corresponding adjustment coefficient. [0011]
  • In one embodiment of the present invention, when the brightness of pixels of one color in at least one area is adjusted by the determined coefficient, the brightness of the pixels of the other colors may be filtered according to specific requirements. For example, the brightness of the pixels of the other colors may be filtered by a percentage of the adjustment coefficient of the adjusted color, or only the adjusted color is displayed. [0012]
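A rough sketch of this per-color filtering follows; the function name, parameter names, and the choice of blue as the adjusted channel are illustrative assumptions, not part of the claimed method:

```python
def adjust_colors(red, green, blue, coeff, ratio_red=0.0, ratio_green=0.0):
    """Non-neutral brightness adjustment of one area's pixel values.

    The adjustment coefficient is applied to the blue pixels, while the
    red and green pixels are filtered by a percentage (ratio) of that
    coefficient. With the default ratios of zero, only the adjusted
    color is retained.
    """
    return (red * coeff * ratio_red,
            green * coeff * ratio_green,
            blue * coeff)
```

Setting both ratios to zero reproduces the case where only the adjusted color is displayed; a nonzero ratio filters the other colors by a percentage of the adjustment coefficient.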
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These, as well as other features of the present invention, will become more apparent upon reference to the drawings wherein: [0013]
  • FIG. 1 shows a graph of conventional exposure adjustments for a digital camera; [0014]
  • FIG. 2 shows a flow chart of a brightness adjustment method in one embodiment of the present invention; [0015]
  • FIG. 3 shows the process flow for selecting a transition area under an automatic selection mode; [0016]
  • FIG. 4 shows the process flow for selecting a transition area under a manual selection mode; and [0017]
  • FIG. 5 is a graph illustrating the relationship between the input brightness of the captured image and the output intensity of the adjusted image.[0018]
  • DETAILED DESCRIPTION OF THE INVENTION
  • The present invention provides a method for adjusting the brightness of an image, similar in concept to the manual filtering mentioned above. In essence, the brightness adjustment method provided by the present invention acts as if a neutral density filter were placed over an element of the scene to be photographed when the brightness of that element is beyond the acceptable range of the recording medium. A fluid electronic masking grid is applied in the brightness adjustment method. A detailed description of the fluid electronic masking grid can be found in the previously filed application Ser. No. 09/954,326. [0019]
  • To adequately reproduce an image of a scene or an object, in the present invention, the image of the scene is captured by an image capture device and converted into a digital format. In the image to be captured, an area is selected as a reference area, or transition area. The brightness of the remaining areas of the image is then compared to the brightness of the reference area. Depending on specific requirements, the brightness or exposure values of the other areas of the image to be captured are adjusted (or filtered) with reference to the brightness of the reference area. Therefore, an element with brightness beyond the acceptable range can be adjusted to render its details without underexposing the remaining elements of the scene. [0020]
  • FIG. 2 shows a flow chart of an embodiment in which an area of an image to be captured is selected as a transition area, and the brightness of the remaining areas of the image is filtered in response to the brightness of the transition area. In step 200, an image is made ready to be captured by an image sensor such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor. The to-be-captured image is then partitioned, or segmented, into a plurality of areas in step 202, and each of the areas comprises a plurality of pixels. Among the areas of the to-be-captured image, a transition area is selected either manually or automatically in step M204 or A204, respectively. [0021]
  • When the transition area is selected, the brightness of the transition area is defined as the transition brightness in step 206. According to specific requirements, the step of defining the transition brightness can be performed in various ways. For example, the transition brightness may be defined by computing an average brightness of the pixels of the transition area. The transition brightness may also be represented by a percentage of a peak brightness among the pixels of the transition area. Alternatively, the median brightness among the pixels of the transition area may be selected as the transition brightness. [0022]
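The three ways of defining the transition brightness described above (average, percentage of peak, and median) can be sketched as a small helper; the function name, parameters, and the 0.9 default percentage are illustrative assumptions, not values from the patent:

```python
def transition_brightness(pixels, mode="average", percentage=0.9):
    """Define the transition brightness of the selected transition area.

    pixels: flat list of per-pixel brightness values in the area.
    mode: "average" (mean of the pixels), "peak" (a percentage of the
    peak brightness among the pixels), or "median" (the middle
    brightness among the pixels).
    """
    if mode == "average":
        return sum(pixels) / len(pixels)
    if mode == "peak":
        return percentage * max(pixels)
    if mode == "median":
        ordered = sorted(pixels)
        return ordered[len(ordered) // 2]
    raise ValueError("unknown mode: " + mode)
```

Whichever mode is chosen, the result is a single reference value against which the brightness of the remaining areas is later compared.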
  • Similarly, the brightness of the remaining areas of the to-be-captured image is also defined. The way of defining the brightness of the remaining areas is similar to that of defining the transition brightness. After the brightness of the remaining areas is defined, it is compared to the transition brightness in step 208. According to the comparison result, the remaining areas are grouped into a plurality of regions in step 210. In response to the comparison result and the acceptable brightness range of the recording medium, a coefficient is determined for each region in step 212. In step 214, the brightness of each pixel in each of the remaining areas is then adjusted by the corresponding coefficient, allowing all elements of the scene from which the image is captured to be adequately photographed. [0023]
  • As mentioned above, the transition area can be selected either manually or automatically. FIG. 3 shows the process flow of the automatic selection mode, and FIG. 4 shows the process flow of the manual selection mode. As shown in FIG. 3, when the automatic selection mode is selected, step A204 further comprises sub-steps A300 and A302. In sub-step A300, the brightness of each area is defined. As with the methods for defining the transition brightness, the brightness of each area can be defined by an average brightness, a percentage of a peak brightness, or a median brightness of the pixels in the corresponding area. In step A302, the area whose brightness is closest to a predetermined brightness is selected as the transition area. The predetermined brightness can be preprogrammed in the digital camera, or input by the operator or user according to specific requirements. [0024]
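Sub-steps A300 and A302 amount to a nearest-value search over the areas. A minimal sketch, assuming the average brightness is used for step A300 (the names are illustrative):

```python
def select_transition_area(areas, target_brightness):
    """Automatic selection of the transition area.

    areas: list of areas, each a flat list of pixel brightness values.
    Step A300: define each area's brightness (average used here).
    Step A302: return the index of the area whose brightness is
    closest to the predetermined target brightness.
    """
    def area_brightness(pixels):
        return sum(pixels) / len(pixels)

    return min(range(len(areas)),
               key=lambda i: abs(area_brightness(areas[i]) - target_brightness))
```

The target brightness could equally be a value preprogrammed in the camera or one supplied by the operator, as described above.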
  • In the manual selection mode, the operator or user may select the transition area according to various factors, as shown in FIG. 4. In step M400, a user interface is provided to display the raw image captured by the image capture device. In step M402, the operator may select an element as a reference element for adjusting the images of the other elements of the scene; that is, the area of the raw image mapping to the reference element of the scene is selected as the transition area. In step M404, the operator may simply observe the raw image and decide which area of the image is selected as the transition area. Alternatively, the operator may select an area at a particular position of the image as the transition area in step M406. Rather than displaying a raw image of the scene, the brightness of each area may be defined and shown by the user interface in step M410, such that the operator can select the transition area based on the brightness of the areas in step M412. As mentioned above, the brightness of each area is defined according to the brightness of every pixel of the corresponding area. For example, an average brightness of the pixels can be computed and defined as the brightness of the corresponding area. A predetermined percentage of a peak brightness of the pixels can also be defined as the brightness of the corresponding area. Alternatively, the median brightness of the pixels can be taken as the brightness of the corresponding area. [0025]
  • FIG. 5 shows a graph of the relationship between the brightness of the to-be-captured image and the output intensity of the translated image. In FIG. 5, the remaining areas of the captured image are grouped into two regions: a lower region and a higher region. The areas in the lower region have brightness lower than the transition brightness, and the areas in the higher region have brightness higher than the transition brightness. First and second adjustment coefficients C1 and C2 are determined for the areas in the lower and higher regions, respectively. Again, the first and second adjustment coefficients are determined based on the brightness difference between the remaining areas and the transition area, and on the acceptable brightness range, in other words, the maximum resolvable brightness of the recording medium. As shown in FIG. 5, the relationship between the brightness of the captured image Bi and the output intensity of the translated image Bo for the lower region is: [0026]
  • Bo=C1*Bi;
  • and the relationship for the higher region is: [0027]
  • Bo=C2*Bi+Const,
  • where Const is a constant. According to FIG. 5, C1 is larger than C2, so that the brightness of the areas in the lower region is adjusted relatively higher, while the brightness of the areas in the higher region is adjusted relatively lower. [0028]
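A minimal sketch of this two-segment mapping, assuming the two segments meet at the transition brightness — consistent with FIG. 5, though the specification states only that Const is a constant:

```python
def make_piecewise_mapping(b_t, c1, c2):
    # Build the FIG. 5 style mapping: Bo = C1*Bi below the transition
    # brightness b_t, and Bo = C2*Bi + Const above it.  Choosing
    # Const = (C1 - C2) * b_t makes the two segments continuous at b_t
    # (an assumption, not stated in the text).
    const = (c1 - c2) * b_t

    def mapping(b_i):
        if b_i <= b_t:
            return c1 * b_i          # lower region: boosted, since C1 > C2
        return c2 * b_i + const      # higher region: compressed
    return mapping

# Hypothetical coefficients with C1 > C2, as in FIG. 5:
m = make_piecewise_mapping(b_t=100.0, c1=1.5, c2=0.5)
```

With these values a captured brightness of 50 is raised to 75, while 200 maps to 200, so the dark areas gain relatively more output intensity than the bright ones.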
  • In the above method, the brightness, that is, the light level or exposure value, of each pixel adjusted by the adjustment coefficient can be represented by a number of f-stops. As mentioned above, current image sensors accept a wider range of brightness than the recording media. For example, an image sensor may have a resolution of about 10 bits to about 16 bits, with each bit equivalent to one f-stop, while most recording media have a resolution of only 8 bits per color. This indicates that about 2 to 8 f-stops of the captured image will be lost in the image recorded by the medium under conventional image processing. By the above method provided by the present invention, the f-stops of the captured image beyond the acceptable range of the medium are scaled into the acceptable range from two ends with reference to a selected number of f-stops. Therefore, the captured image can be adequately translated to render every element of the scene. [0029]
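The scaling "from two ends" can be sketched as a piecewise-linear remapping of f-stops around the selected pivot stop; the function, its parameters, and the proportional interpolation are illustrative assumptions, not taken from the specification:

```python
def scale_stops(stop, pivot_in, pivot_out, in_range, out_range):
    # Map f-stops from a wide sensor range into a narrower medium range
    # "from two ends": stops below the selected pivot and stops above it
    # are compressed with separate slopes, so no stop is clipped.
    # `in_range` and `out_range` are (lo, hi) tuples; `pivot_out` is
    # where the pivot stop lands in the output range.
    in_lo, in_hi = in_range
    out_lo, out_hi = out_range
    if stop <= pivot_in:
        return out_lo + (pivot_out - out_lo) * (stop - in_lo) / (pivot_in - in_lo)
    return pivot_out + (out_hi - pivot_out) * (stop - pivot_in) / (in_hi - pivot_in)

# A 12-stop capture squeezed into an 8-stop medium around pivot stop 6:
# stop 0 -> 0, stop 6 -> 5, stop 12 -> 8.  Conventional clipping would
# instead discard the 4 stops that exceed the medium's range.
```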
  • Further, in the above brightness adjustment method, the same adjustment coefficient is applied to the pixels of all three primary colors in the same area. In one embodiment of the present invention, to make the image more visible or to obtain a specific color effect, the neutral density adjustment is no longer neutral. That is, rather than adjusting the brightness of the pixels of all three primary colors by the same adjustment coefficient, separate adjustment coefficients may be applied to the pixels of different colors. For example, in FIG. 5, the second adjustment coefficient is applied only to the blue pixels of the corresponding area, while the red and green pixels of the corresponding area are adjusted by coefficients different from the second adjustment coefficient. When the areas in the higher region are required to be displayed in monochrome blue, the second adjustment coefficient is applied to the blue pixels only, while the coefficients applied to the red and green pixels are zero. [0030]
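A sketch of applying separate coefficients per primary color, including the monochrome-blue case described above; the data layout and function name are illustrative assumptions:

```python
def adjust_area_rgb(pixels, coeffs):
    # Apply a separate adjustment coefficient to each primary color of
    # an area's pixels.  `pixels` is a list of (r, g, b) brightness
    # tuples and `coeffs` is an (r, g, b) tuple of coefficients.
    cr, cg, cb = coeffs
    return [(r * cr, g * cg, b * cb) for r, g, b in pixels]

# Monochrome-blue rendering of a higher-region area: the second
# adjustment coefficient (0.5 here, a hypothetical value) is applied to
# the blue pixels only, while the red and green coefficients are zero.
blue_only = adjust_area_rgb([(200, 180, 160)], (0.0, 0.0, 0.5))
```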
  • Software stored in the digital camera segments the image captured by the image capture device into a plurality of areas. All the areas are active and ready to accept a brightness adjustment of neutral density or color. The above method can be applied to digital cameras, digital video cameras, film scanners, and other digital image processing systems. [0031]
  • Indeed, each of the features and embodiments described herein can be used by itself or in combination with one or more of the other features and embodiments. Thus, the invention is not limited by the illustrated embodiments but is to be defined by the following claims when read in the broadest reasonable manner to preserve their validity. [0032]

Claims (22)

What is claimed is:
1. A brightness adjustment method, comprising:
providing a scene of which an image is to be captured;
segmenting the image into a plurality of areas, each area comprising a plurality of pixels;
selecting one of the areas as a transition area and defining a transition brightness of the transition area;
defining brightness of each remaining area of the image;
comparing the brightness of the remaining areas to the transition brightness;
grouping the remaining areas into a plurality of regions in response to the results of the comparing step;
determining a coefficient for each region; and
adjusting brightness of each pixel of the remaining areas in each region by the corresponding coefficient.
2. The method according to claim 1, further comprising using an image capture device to capture the image.
3. The method according to claim 1, further comprising using a charge-coupled device to capture the image.
4. The method according to claim 1, wherein the step of selecting the transition area further comprises selecting the transition area from one area of the image mapping a predetermined element of the scene.
5. The method according to claim 1, wherein the step of selecting the transition area further comprises selecting the transition area from one area of the image located in a predetermined position.
6. The method according to claim 1, wherein the step of selecting the transition area further comprises selecting the transition area from one area of the image with a brightness closest to a predetermined brightness value.
7. The method according to claim 1, wherein the step of defining the transition brightness further comprises computing an average brightness of the pixels of the transition area.
8. The method according to claim 1, wherein the step of defining the transition brightness further comprises determining a percentage of a peak brightness among the pixels of the transition area.
9. The method according to claim 1, wherein the step of defining the transition brightness further comprises selecting a medium brightness among the pixels of the transition area.
10. The method according to claim 1, wherein the step of grouping the remaining areas further comprises grouping the remaining areas into a higher region and a lower region, wherein the areas in the higher region have brightness higher than the transition brightness, and the areas in the lower region have brightness lower than the transition brightness.
11. The method according to claim 10, wherein the step of determining the coefficient further comprises determining a first coefficient for the lower region and a second coefficient for the higher region.
12. The method according to claim 11, wherein the brightness of each pixel of each area in the lower region is adjusted by multiplying by the first coefficient.
13. The method according to claim 11, wherein the brightness of each pixel of each area in the higher region is adjusted by multiplying by the second coefficient and then adding a constant.
14. A brightness adjustment method, comprising:
providing an image of a scene;
segmenting the image into a plurality of areas, each area comprising a plurality of pixels of three primary colors;
selecting one of the areas as a transition area, and defining a transition brightness of the transition area;
comparing brightness of each area to the transition brightness to determine an adjustment coefficient for each area; and
adjusting brightness of the pixels of at least one color in at least one area by the corresponding adjustment coefficient.
15. The method according to claim 14, wherein the step of selecting the transition area further comprises selecting the transition area from one area of the image mapping a predetermined element of the scene.
16. The method according to claim 14, wherein the step of selecting the transition area further comprises selecting the transition area from one area of the image with a brightness closest to a predetermined brightness value.
17. The method according to claim 14, wherein the step of defining the transition brightness further comprises computing an average brightness of the pixels of the transition area.
18. The method according to claim 14, wherein the step of defining the transition brightness further comprises determining a percentage of a peak brightness among the pixels of the transition area.
19. The method according to claim 14, further comprising the step of filtering brightness of the pixels of the remaining colors in the area adjusted by the adjustment coefficient.
20. The method according to claim 19, wherein the step of filtering brightness of the pixels of the remaining colors includes multiplying the brightness of the pixels of the remaining colors by another coefficient.
21. The method according to claim 20, wherein the step of filtering brightness of the pixels of the remaining colors includes displaying the adjusted pixels only.
22. A brightness adjustment method, comprising:
providing an image of a scene;
segmenting the image into a plurality of areas, each area comprising a plurality of pixels of three primary colors;
selecting one of the areas as a transition area, and defining a transition brightness of the transition area;
comparing brightness of the remaining areas to the transition brightness to determine one adjustment coefficient for the pixels of each color in each of the remaining areas; and
adjusting brightness of the pixels of each color in at least one area by the corresponding adjustment coefficient.
US10/309,448 2001-09-17 2002-12-04 Brightness adjustment method Abandoned US20030081141A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/309,448 US20030081141A1 (en) 2001-09-17 2002-12-04 Brightness adjustment method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/954,326 US6999126B2 (en) 2001-09-17 2001-09-17 Method of eliminating hot spot in digital photograph
US10/309,448 US20030081141A1 (en) 2001-09-17 2002-12-04 Brightness adjustment method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/954,326 Continuation-In-Part US6999126B2 (en) 2001-09-17 2001-09-17 Method of eliminating hot spot in digital photograph

Publications (1)

Publication Number Publication Date
US20030081141A1 true US20030081141A1 (en) 2003-05-01

Family

ID=25495265

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/954,326 Expired - Fee Related US6999126B2 (en) 2001-09-17 2001-09-17 Method of eliminating hot spot in digital photograph
US10/309,448 Abandoned US20030081141A1 (en) 2001-09-17 2002-12-04 Brightness adjustment method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/954,326 Expired - Fee Related US6999126B2 (en) 2001-09-17 2001-09-17 Method of eliminating hot spot in digital photograph

Country Status (2)

Country Link
US (2) US6999126B2 (en)
WO (1) WO2003025667A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006350017A (en) * 2005-06-16 2006-12-28 Olympus Corp Imaging apparatus

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4695888A (en) * 1986-11-13 1987-09-22 Eastman Kodak Company Video camera with automatically variable diaphragm and shutter speed control
US5065247A (en) * 1988-01-12 1991-11-12 Sanyo Electric Co., Ltd. Automatic iris correction apparatus for use in automatically adjusting exposure in response to a video signal
US5075778A (en) * 1988-01-07 1991-12-24 Fuji Photo Film Co., Ltd. Backlight correction system
US5099334A (en) * 1988-09-14 1992-03-24 Fuji Photo Film Co., Ltd. Electronic still camera
US5347320A (en) * 1991-11-30 1994-09-13 Samsung Electronics Co., Ltd. Circuit for preventing automatic white balance error operation
US5353058A (en) * 1990-10-31 1994-10-04 Canon Kabushiki Kaisha Automatic exposure control apparatus
US5414487A (en) * 1991-02-13 1995-05-09 Nikon Corporation Light metering calculation apparatus
US5559555A (en) * 1993-06-17 1996-09-24 Sony Corporation Apparatus for performing exposure control pertaining to the luminance level of an object
US5592256A (en) * 1993-06-08 1997-01-07 Nikon Corporation Photometry device for a camera
US5923372A (en) * 1995-08-23 1999-07-13 Samsung Electronics Co., Ltd. Apparatus and method for controlling an iris according to brightness variation of input signal
US6064433A (en) * 1995-03-30 2000-05-16 Sony Corporation Video filming method and apparatus
US20040175054A1 (en) * 1998-11-13 2004-09-09 Masami Ogata Image processing apparatus and image processing method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5638118A (en) * 1987-06-09 1997-06-10 Canon Kabushiki Kaisha Image sensing device with diverse storage times used in picture composition
JPH0370274A (en) * 1989-08-09 1991-03-26 Sanyo Electric Co Ltd Image pickup device
US5831676A (en) * 1992-08-19 1998-11-03 Canon Kabushiki Kaisha Image pickup device using plural control parameters for exposure control
JP3170722B2 (en) * 1991-09-03 2001-05-28 株式会社ニコン Exposure calculation device
JP3666900B2 (en) * 1994-04-25 2005-06-29 キヤノン株式会社 Imaging apparatus and imaging method
US6091908A (en) * 1994-07-18 2000-07-18 Nikon Corporation Photometric device and method for a camera
JPH08186761A (en) * 1994-12-30 1996-07-16 Sony Corp Video camera device and video camera exposure control method
CN1228850A (en) * 1996-09-02 1999-09-15 株式会社Snk Shooting apparatus
US6765619B1 (en) * 2000-04-04 2004-07-20 Pixim, Inc. Method and apparatus for optimizing exposure time in image acquisitions


Cited By (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100428786C (en) * 2005-04-25 2008-10-22 三星电子株式会社 Method and apparatus for adjusting brightness of image
US20060268149A1 (en) * 2005-05-25 2006-11-30 I-Chen Teng Method for adjusting exposure of a digital image
US20070291033A1 (en) * 2006-06-08 2007-12-20 Nicholas Phelps Method for producing three-dimensional views using a brightness control
US7742055B2 (en) * 2006-06-08 2010-06-22 E-On Software Method for producing three-dimensional views using a brightness control
US10565784B2 (en) 2012-01-17 2020-02-18 Ultrahaptics IP Two Limited Systems and methods for authenticating a user according to a hand of the user moving in a three-dimensional (3D) space
US10691219B2 (en) 2012-01-17 2020-06-23 Ultrahaptics IP Two Limited Systems and methods for machine control
US10410411B2 (en) 2012-01-17 2019-09-10 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US11720180B2 (en) 2012-01-17 2023-08-08 Ultrahaptics IP Two Limited Systems and methods for machine control
US11308711B2 (en) 2012-01-17 2022-04-19 Ultrahaptics IP Two Limited Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US10366308B2 (en) 2012-01-17 2019-07-30 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US10699155B2 (en) 2012-01-17 2020-06-30 Ultrahaptics IP Two Limited Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9679215B2 (en) 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
US9697643B2 (en) 2012-01-17 2017-07-04 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US9934580B2 (en) 2012-01-17 2018-04-03 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9741136B2 (en) 2012-01-17 2017-08-22 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US11994377B2 (en) 2012-01-17 2024-05-28 Ultrahaptics IP Two Limited Systems and methods of locating a control object appendage in three dimensional (3D) space
US9778752B2 (en) 2012-01-17 2017-10-03 Leap Motion, Inc. Systems and methods for machine control
US9501152B2 (en) 2013-01-15 2016-11-22 Leap Motion, Inc. Free-space user interface and control using virtual constructs
US10241639B2 (en) 2013-01-15 2019-03-26 Leap Motion, Inc. Dynamic user interactions for display control and manipulation of display objects
US11353962B2 (en) 2013-01-15 2022-06-07 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US10042510B2 (en) 2013-01-15 2018-08-07 Leap Motion, Inc. Dynamic user interactions for display control and measuring degree of completeness of user gestures
US10042430B2 (en) 2013-01-15 2018-08-07 Leap Motion, Inc. Free-space user interface and control using virtual constructs
US10139918B2 (en) 2013-01-15 2018-11-27 Leap Motion, Inc. Dynamic, free-space user interactions for machine control
US11243612B2 (en) 2013-01-15 2022-02-08 Ultrahaptics IP Two Limited Dynamic, free-space user interactions for machine control
US11269481B2 (en) 2013-01-15 2022-03-08 Ultrahaptics IP Two Limited Dynamic user interactions for display control and measuring degree of completeness of user gestures
US10782847B2 (en) 2013-01-15 2020-09-22 Ultrahaptics IP Two Limited Dynamic user interactions for display control and scaling responsiveness of display objects
US10739862B2 (en) 2013-01-15 2020-08-11 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US9632658B2 (en) 2013-01-15 2017-04-25 Leap Motion, Inc. Dynamic user interactions for display control and scaling responsiveness of display objects
US11874970B2 (en) 2013-01-15 2024-01-16 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US11740705B2 (en) 2013-01-15 2023-08-29 Ultrahaptics IP Two Limited Method and system for controlling a machine according to a characteristic of a control object
US11693115B2 (en) 2013-03-15 2023-07-04 Ultrahaptics IP Two Limited Determining positional information of an object in space
US10585193B2 (en) 2013-03-15 2020-03-10 Ultrahaptics IP Two Limited Determining positional information of an object in space
US11347317B2 (en) 2013-04-05 2022-05-31 Ultrahaptics IP Two Limited Customized gesture interpretation
US10620709B2 (en) 2013-04-05 2020-04-14 Ultrahaptics IP Two Limited Customized gesture interpretation
US11099653B2 (en) 2013-04-26 2021-08-24 Ultrahaptics IP Two Limited Machine responsiveness to dynamic user movements and gestures
US20140320408A1 (en) * 2013-04-26 2014-10-30 Leap Motion, Inc. Non-tactile interface systems and methods
US10452151B2 (en) * 2013-04-26 2019-10-22 Ultrahaptics IP Two Limited Non-tactile interface systems and methods
US9916009B2 (en) * 2013-04-26 2018-03-13 Leap Motion, Inc. Non-tactile interface systems and methods
US20190018495A1 (en) * 2013-04-26 2019-01-17 Leap Motion, Inc. Non-tactile interface systems and methods
US10254849B2 (en) 2013-05-17 2019-04-09 Leap Motion, Inc. Cursor mode switching
US11429194B2 (en) 2013-05-17 2022-08-30 Ultrahaptics IP Two Limited Cursor mode switching
US10901519B2 (en) 2013-05-17 2021-01-26 Ultrahaptics IP Two Limited Cursor mode switching
US10936145B2 (en) 2013-05-17 2021-03-02 Ultrahaptics IP Two Limited Dynamic interactive objects
US10459530B2 (en) 2013-05-17 2019-10-29 Ultrahaptics IP Two Limited Cursor mode switching
US11194404B2 (en) 2013-05-17 2021-12-07 Ultrahaptics IP Two Limited Cursor mode switching
US10620775B2 (en) 2013-05-17 2020-04-14 Ultrahaptics IP Two Limited Dynamic interactive objects
US9927880B2 (en) 2013-05-17 2018-03-27 Leap Motion, Inc. Cursor mode switching
US11275480B2 (en) 2013-05-17 2022-03-15 Ultrahaptics IP Two Limited Dynamic interactive objects
US11720181B2 (en) 2013-05-17 2023-08-08 Ultrahaptics IP Two Limited Cursor mode switching
US9436288B2 (en) 2013-05-17 2016-09-06 Leap Motion, Inc. Cursor mode switching
US9747696B2 (en) 2013-05-17 2017-08-29 Leap Motion, Inc. Systems and methods for providing normalized parameters of motions of objects in three-dimensional space
US12045394B2 (en) 2013-05-17 2024-07-23 Ultrahaptics IP Two Limited Cursor mode switching
US9552075B2 (en) 2013-05-17 2017-01-24 Leap Motion, Inc. Cursor mode switching
US11567578B2 (en) 2013-08-09 2023-01-31 Ultrahaptics IP Two Limited Systems and methods of free-space gestural interaction
US10831281B2 (en) 2013-08-09 2020-11-10 Ultrahaptics IP Two Limited Systems and methods of free-space gestural interaction
US10281987B1 (en) 2013-08-09 2019-05-07 Leap Motion, Inc. Systems and methods of free-space gestural interaction
US12086935B2 (en) 2013-08-29 2024-09-10 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US11282273B2 (en) 2013-08-29 2022-03-22 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US11776208B2 (en) 2013-08-29 2023-10-03 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US11775033B2 (en) 2013-10-03 2023-10-03 Ultrahaptics IP Two Limited Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US11868687B2 (en) 2013-10-31 2024-01-09 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US20150287216A1 (en) * 2014-04-04 2015-10-08 Tektronix, Inc. F-stop weighted waveform with picture monitor markers
US9317931B2 (en) * 2014-04-04 2016-04-19 Tektronix, Inc. F-stop weighted waveform with picture monitor markers
US9721354B2 (en) 2014-04-04 2017-08-01 Tektronix, Inc. Stop weighted waveform
US11778159B2 (en) 2014-08-08 2023-10-03 Ultrahaptics IP Two Limited Augmented reality with motion sensing
US12095969B2 (en) 2014-08-08 2024-09-17 Ultrahaptics IP Two Limited Augmented reality with motion sensing
US12032746B2 (en) 2015-02-13 2024-07-09 Ultrahaptics IP Two Limited Systems and methods of creating a realistic displacement of a virtual object in virtual reality/augmented reality environments
US12118134B2 (en) 2015-02-13 2024-10-15 Ultrahaptics IP Two Limited Interaction engine for creating a realistic experience in virtual reality/augmented reality environments
US10319341B2 (en) * 2015-11-17 2019-06-11 Samsung Electronics Co., Ltd. Electronic device and method for displaying content thereof
US11875012B2 (en) 2018-05-25 2024-01-16 Ultrahaptics IP Two Limited Throwable interface for augmented reality and virtual reality environments
US12131011B2 (en) 2020-07-28 2024-10-29 Ultrahaptics IP Two Limited Virtual interactions for machine control

Also Published As

Publication number Publication date
US6999126B2 (en) 2006-02-14
US20030052990A1 (en) 2003-03-20
WO2003025667A1 (en) 2003-03-27

Similar Documents

Publication Publication Date Title
US20030081141A1 (en) Brightness adjustment method
US10412296B2 (en) Camera using preview image to select exposure
US8687087B2 (en) Digital camera with selectively increased dynamic range by control of parameters during image acquisition
US7057653B1 (en) Apparatus capable of image capturing
US7944485B2 (en) Method, apparatus and system for dynamic range estimation of imaged scenes
US6806903B1 (en) Image capturing apparatus having a γ-characteristic corrector and/or image geometric distortion correction
US7030911B1 (en) Digital camera and exposure control method of digital camera
CN100550990C (en) Image correction apparatus and method for correcting image
EP0877524B1 (en) Digital photography apparatus with an image-processing unit
JP3706708B2 (en) Image forming system and image forming method
US8334912B2 (en) Image processing apparatus, imaging apparatus, image processing method, and computer readable recording medium storing image processing program
JP3643203B2 (en) Digital camera
JPH10210360A (en) Digital camera
JP4307862B2 (en) Signal processing method, signal processing circuit, and imaging apparatus
JP2004120511A (en) Imaging apparatus
JPH10210355A (en) Digital camera
Allen et al. Digital cameras and scanners
KR101595888B1 (en) Photographing apparatus controlling method of photographing apparatus and recording medium storing program to implement the controlling method
JP3631577B2 (en) Digital camera
JP3641537B2 (en) Digital camera
JP2004096444A (en) Image processor and method thereof
JP3763555B2 (en) Electronic still camera
JPH10210354A (en) Digital camera
JP3643202B2 (en) Digital camera
JPH10210358A (en) Digital camera

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION