
USRE48444E1 - High resolution thin multi-aperture imaging systems - Google Patents

High resolution thin multi-aperture imaging systems Download PDF

Info

Publication number
USRE48444E1
Authority
US
United States
Prior art keywords
image
camera
sensor
imaging system
color filter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active - Reinstated
Application number
US16/383,618
Inventor
Gal Shabtay
Noy Cohen
Oded Gigushinski
Ephraim Goldenberg
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Corephotonics Ltd
Original Assignee
Corephotonics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed. “Global patent litigation dataset” by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License. https://patents.darts-ip.com/?family=50827245&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=USRE48444(E1)
Application filed by Corephotonics Ltd filed Critical Corephotonics Ltd
Priority to US16/383,618 (USRE48444E1)
Priority to US16/384,197 (USRE48477E1)
Priority to US16/384,244 (USRE48697E1)
Priority to US16/384,140 (USRE48945E1)
Priority to US16/419,604 (USRE49256E1)
Application granted
Publication of USRE48444E1
Legal status: Active - Reinstated
Anticipated expiration

Classifications

    • H04N23/45: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N5/232
    • G01J3/0208: Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows, using focussing or collimating elements, e.g. lenses or mirrors; performing aberration correction
    • G01J3/0229: Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows, using masks, aperture plates, spatial light modulators or spatial filters, e.g. reflective filters
    • G01J3/0248: Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows, using a sighting port, e.g. camera or human eye
    • G01J3/18: Generating the spectrum; Monochromators using diffraction elements, e.g. grating
    • G01J3/2823: Imaging spectrometer
    • G01J3/36: Investigating two or more bands of a spectrum by separate detectors
    • G02B5/1814: Diffraction gratings structurally combined with one or more further optical elements, e.g. lenses, mirrors, prisms or other diffraction gratings
    • G02B5/1842: Gratings for image generation
    • G06T11/60: Editing figures and text; Combining figures or text
    • G06T5/20: Image enhancement or restoration using local operators
    • G06T7/00: Image analysis
    • G06T7/33: Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
    • G06T7/337: Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods involving reference images or patches
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/11: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from visible and infrared light wavelengths
    • H04N23/13: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with multiple sensors
    • H04N23/16: Optical arrangements associated therewith, e.g. for beam-splitting or for colour correction
    • H04N23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/60: Control of cameras or camera modules
    • H04N23/69: Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • H04N23/698: Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N23/843: Demosaicing, e.g. interpolating colour pixel values
    • H04N23/88: Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • H04N23/951: Computational photography systems, e.g. light-field imaging systems, by using two or more images to influence resolution, frame rate or aspect ratio
    • H04N25/133: Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements, including elements passing panchromatic light, e.g. filters passing white light
    • H04N25/134: Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements, based on three different wavelength filter elements
    • H04N25/135: Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements, based on four or more different wavelength filter elements
    • H04N5/225
    • H04N9/04
    • H04N9/09
    • G06T2207/20221: Image fusion; Image merging
    • H04N2209/045: Picture signal generators using solid-state devices having a single pick-up sensor using mosaic colour filter

Definitions

  • Embodiments disclosed herein relate in general to multi-aperture imaging (“MAI”) systems (where “multi” refers to two or more apertures) and more specifically to thin MAI systems with high color resolution and/or optical zoom.
  • MAI: multi-aperture imaging
  • Optical Zoom is a primary feature of many digital still cameras but one that mobile phone cameras usually lack, mainly due to camera height constraints in mobile imaging devices, cost and mechanical reliability.
  • One way of implementing zoom in mobile cameras is by over-sampling the image and cropping and interpolating it in accordance with the desired ZF. While this method is mechanically reliable, it results in thick optics and in an expensive image sensor due to the large number of pixels associated therewith. As an example, if one is interested in implementing a 12 Megapixel camera with X3 ZF, one needs a sensor of 108 Megapixels.
  • a DAI system includes two optical apertures which may be formed by one or two optical modules, and one or two image sensors (e.g., CMOS or CCD) that grab the optical image or images and convert the data into the electronic domain, where the image can be processed and stored.
  • CMOS: complementary metal-oxide-semiconductor
  • CCD: charge-coupled device
  • Embodiments disclosed herein teach the use of multi-aperture imaging systems to implement thin cameras (with short optical paths of less than about 9 mm) and/or to realize optical zoom systems in such thin cameras. Embodiments disclosed herein further teach new color filter arrays that optimize the color information which may be achieved in a multi-aperture imaging system with or without zoom.
  • a MAI system disclosed herein includes at least two sensors or a single sensor divided into at least two areas. Hereinafter, the description refers to “two sensors”, with the understanding that they may represent sections of a single physical sensor (imager chip).
  • a left sensor (or left side of a single sensor) captures an image coming from a first aperture while a right sensor (or right side of a single sensor) captures an image coming from a second aperture.
  • one sensor is a “Wide” sensor while another sensor is a “Tele” sensor, see e.g. FIG. 1A .
  • the Wide sensor includes either a single standard CFA or two different CFAs: a non-standard CFA with higher color sampling rate positioned in an “overlap area” of the sensor (see below description of FIG. 1B ) and a standard CFA with a lower color sampling rate surrounding the overlap area.
  • When including a single standard CFA, the CFA may cover the entire Wide sensor area.
  • a “standard CFA” may include a RGB (Bayer) pattern or a non-Bayer pattern such as RGBE, CYYM, CYGM, RGBW#1, RGBW#2 or RGBW#3.
  • a “non-standard CFA” refers to a CFA that is different in its pattern from the CFAs listed above as “standard”.
  • the Tele sensor may be a Clear sensor (i.e. a sensor without color filters) or a standard CFA sensor.
  • Each sensor provides a separate image (referred to respectively as a Wide image and a Tele image), except for the case of a single sensor, where two images are captured (grabbed) by the single sensor (example above).
  • zoom is achieved by fusing the two images, resulting in higher color resolution that approaches that of a high quality dual-aperture zoom camera.
  • a different magnification image of the same scene is grabbed by each subset, resulting in field of view (FOV) overlap between the two subsets.
  • the two subsets have the same zoom (i.e. same FOV).
  • the Tele subset is the higher zoom subset and the Wide subset is the lower zoom subset.
  • Post processing is applied on the two images grabbed by the MAI system to fuse and output one fused (combined) output zoom image processed according to a user ZF input request.
  • the resolution of the fused image may be higher than the resolution of the Wide/Tele sensors.
  • up-sampling may be applied on the Wide image to scale it to the Tele image.
  • a multi-aperture imaging system comprising a first camera subset that provides a first image, the first camera subset having a first sensor with a first plurality of sensor pixels covered at least in part with a non-standard CFA, the non-standard CFA used to increase a specific color sampling rate relative to a same color sampling rate in a standard CFA; a second camera subset that provides a second image, the second camera subset having a second sensor with a second plurality of sensor pixels either Clear or covered with a standard CFA; and a processor configured to process the first and second images into a combined output image.
  • the first and the second camera subsets have identical FOVs and the non-standard CFA may cover an overlap area that includes all the pixels of the first sensor, thereby providing increased color resolution.
  • the processor is further configured to, during the processing of the first and second images into a combined output image, register respective first and second Luma images obtained from the first and second images, the registered first and second Luma images used together with color information to form the combined output image.
  • the registration includes finding a corresponding pixel in the second Luma image for each pixel in the first Luma image, whereby the output image is formed by transferring information from the second image to the first image.
  • the registration includes finding a corresponding pixel in the first Luma image for each pixel in the second Luma image, whereby the output image is formed by transferring information from the first image to the second image.
  • the first camera subset has a first FOV
  • the second camera subset has a second, smaller FOV than the first FOV
  • the non-standard CFA covers an overlap area on the first sensor that captures the second FOV, thereby providing both optical zoom and increased color resolution.
  • the processor is further configured to, during the processing of the first and second images into a combined output image and based on a ZF input, register respective first and second Luma images obtained from the first and second images, the registered first and second Luma images used together with color information to form the combined output image.
  • For a ZF input that defines an FOV greater than the second FOV, the registration includes finding a corresponding pixel in the second Luma image for each pixel in the first Luma image and the processing includes forming the output image by transferring information from the second image to the first image.
  • For a ZF input that defines an FOV smaller than or equal to the second FOV, the registration includes finding a corresponding pixel in the first Luma image for each pixel in the second Luma image, and the processing includes forming the output image by transferring information from the first image to the second image.
  • a multi-aperture imaging system comprising a first camera subset that provides a first image, the first camera subset having a first sensor with a first plurality of sensor pixels covered at least in part with a standard CFA; a second camera subset that provides a second image, the second camera subset having a second sensor with a second plurality of sensor pixels either Clear or covered with a standard CFA; and a processor configured to register first and second Luma images obtained respectively from the first and second images and to process the registered first and second Luma images together with color information into a combined output image.
  • the first and the second camera subsets have identical first and second FOVs.
  • the registration includes finding a corresponding pixel in the second Luma image for each pixel in the first Luma image and the processing includes forming the output image by transferring information from the second image to the first image.
  • the registration includes finding a corresponding pixel in the first Luma image for each pixel in the second Luma image and the processing includes forming the output image by transferring information from the first image to the second image.
  • the first camera subset has a first FOV
  • the second camera subset has a second, smaller FOV than the first FOV
  • the processor is further configured to register the first and second Luma images based on a ZF input.
  • the registration includes finding a corresponding pixel in the second Luma image for each pixel in the first Luma image and the processing includes forming the output image by transferring information from the second image to the first image.
  • the registration includes finding a corresponding pixel in the first Luma image for each pixel in the second Luma image, and the processing includes forming the output image by transferring information from the first image to the second image.
  • FIG. 1A shows schematically a block diagram illustrating a dual-aperture zoom imaging system disclosed herein;
  • FIG. 1B shows an example of an image captured by the Wide sensor and the Tele sensor while illustrating the overlap area on the Wide sensor
  • FIG. 2 shows schematically an embodiment of a Wide sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
  • FIG. 3 shows schematically another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
  • FIG. 4 shows schematically yet another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
  • FIG. 5 shows schematically yet another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
  • FIG. 6 shows schematically yet another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
  • FIG. 7 shows schematically yet another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
  • FIG. 8 shows schematically yet another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
  • FIG. 9 shows schematically yet another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
  • FIG. 10 shows schematically, in a flow chart, an embodiment of a method disclosed herein for acquiring and outputting a zoom image;
  • FIG. 11A shows exemplary images captured by a triple aperture zoom imaging system disclosed herein;
  • FIG. 11B illustrates schematically the three sensors of the triple aperture imaging system of FIG. 11A .
  • Embodiments disclosed herein relate to multi-aperture imaging systems that include at least one Wide sensor with a single CFA or with two different CFAs and at least one Tele sensor.
  • the description continues with particular reference to dual-aperture imaging systems that include two (Wide and Tele) subsets with respective sensors.
  • a three-aperture imaging system is described later with reference to FIGS. 11A-11B .
  • the Wide sensor includes an overlap area (see description of FIG. 1B ) that captures the Tele FOV.
  • the overlap area may cover the entire Wide sensor or only part of the sensor.
  • the overlap area may include a standard CFA or a non-standard CFA. Since the Tele image is optically magnified compared to the Wide image, the effective sampling rate of the Tele image is higher than that of the Wide image. Thus, the effective color sampling rate in the Wide sensor is much lower than the Clear sampling rate in the Tele sensor.
  • the Tele and Wide images fusion procedure requires up-scaling of the color data from the Wide sensor. Up-scaling will not improve color resolution.
  • the Wide sensor may have a Bayer CFA in the overlap area.
  • color resolution improvement depends on using color information from the Tele sensor in the fused output image.
  • FIG. 1A shows schematically a block diagram illustrating a dual-aperture zoom imaging (“DAZI”) system 100 disclosed herein.
  • System 100 includes a dual-aperture camera 102 with a Wide subset 104 and a Tele subset 106 (each subset having a respective sensor), and a processor 108 that fuses two images, a Wide image obtained with the Wide subset and a Tele image obtained with the Tele subset, into a single fused output image according to a user-defined “applied” ZF input or request.
  • the ZF is input to processor 108 .
  • the Wide sensor may include a non-standard CFA in an overlap area illustrated by 110 in FIG. 1B .
  • Overlap area 110 is surrounded by a non-overlap area 112 with a standard CFA (for example a Bayer pattern).
  • FIG. 1B also shows an example of an image captured by both Wide and Tele sensors. Note that “overlap” and “non-overlap” areas refer to parts of the Wide image as well as to the CFA arrangements of the Wide sensor.
  • the overlap area may cover different portions of a Wide sensor, for example half the sensor area, a third of the sensor area, a quarter of the sensor area, etc. A number of such Wide sensor CFA arrangements are described in more detail with reference to FIGS. 2-9 .
  • the non-standard CFA pattern increases the color resolution of the DAZI system.
  • the Tele sensor may be Clear (providing a Tele Clear image scaled relative to the Wide image) or may include a standard (Bayer or non-Bayer) CFA. In the latter case, it is desirable to define primary and auxiliary sensors based on the applied ZF. If the ZF is such that the output FOV is larger than the Tele FOV, the primary sensor is the Wide sensor and the auxiliary sensor is the Tele sensor. If the ZF is such that the output FOV is equal to, or smaller than the Tele FOV, the primary sensor is the Tele sensor and the auxiliary sensor is the Wide sensor. The point of view defined by the output image is that of the primary sensor.
  • FIG. 2 shows schematically an embodiment of a Wide sensor 200 that may be implemented in a DAZI system such as system 100 .
  • Sensor 200 has a non-overlap area 202 with a Bayer CFA and an overlap area 204 covered by a non-standard CFA with a repetition of a 4×4 micro-cell in which the color filter order is BBRR-RBBR-RRBB-BRRB.
  • “Width1” and “Height1” refer to the full Wide sensor dimensions.
  • “Width2” and “Height2” refer to the dimensions of the Wide sensor overlap area. Note that in FIG. 2 (as in the following figures) the empty row and column to the left and top of the overlap area are for clarity purposes only, and the sensor pixels there follow the pattern of the non-overlap area (as shown in FIG. 6).
  • R and B are sampled at 1/2^0.5 Nyquist frequency in the diagonal (left to right) direction with 2 pixel intervals, instead of at 1/2 Nyquist frequency as in a standard Bayer pattern.
  • FIG. 3 shows schematically an embodiment of a Wide sensor 300 that may be implemented in a DAZI system such as system 100 .
  • Sensor 300 has a non-overlap area 302 with a Bayer CFA and an overlap area 304 covered by a non-standard CFA with a repetition of a 2×2 micro-cell in which the color filter order is BR-RB.
  • R and B are sampled at 1/2^0.5 Nyquist frequency in both diagonal directions.
  • FIG. 4 shows schematically an embodiment of a Wide sensor 400 that may be implemented in a DAZI system such as system 100 .
  • Sensor 400 has a non-overlap area with a Bayer CFA and an overlap area covered by a non-standard CFA with a repetition of a 2×2 micro-cell in which the color filter order is YC-CY, where Y=Yellow=Green+Red and C=Cyan=Green+Blue.
  • FIG. 5 shows schematically an embodiment of a Wide sensor 500 that may be implemented in a DAZI system such as system 100 .
  • Sensor 500 has a non-overlap area 502 with a Bayer CFA and an overlap area 504 covered by a non-standard CFA with a repetition of a 6×6 micro-cell in which the color filter order is RBBRRB-RWRBWB-BBRBRR-RRBRBB-BWBRWR-BRRBBR, where “W” represents White or Clear pixels.
  • R and B are sampled at a higher frequency than in a standard CFA. For example, in a Bayer pixel order, the Red average sampling rate (“Rs”) is 0.25 (sampled once for every 4 pixels). In the overlap area pattern, Rs is 0.44.
  • FIG. 6 shows schematically an embodiment of a Wide sensor 600 that may be implemented in a DAZI system such as system 100 .
  • Sensor 600 has a non-overlap area 602 with a Bayer CFA and an overlap area 604 covered by a non-standard CFA with a repetition of a 6×6 micro-cell in which the color filter order is BBGRRG-RGRBGB-GBRGRB-RRGBBG-BGBRGR-GRBGBR.
  • R and B are sampled at a higher frequency than in a standard CFA.
  • Rs is 0.33 vs. 0.25 in a Bayer pixel order.
  • FIG. 7 shows schematically an embodiment of a Wide sensor 700 that may be implemented in a DAZI system such as system 100 .
  • Sensor 700 has a non-overlap area 702 with a Bayer CFA and an overlap area 704 covered by a non-standard CFA with a repetition of a 3×3 micro-cell in which the color filter order is GBR-RGB-BRG.
  • R and B are sampled at a higher frequency than in a standard CFA.
  • Rs is 0.33 vs. 0.25 in a Bayer pixel order.
  • FIG. 8 shows schematically an embodiment of a Wide sensor 800 that may be implemented in a DAZI system such as system 100 .
  • Sensor 800 has a non-overlap area 802 with a Bayer CFA and an overlap area 804 covered by a non-standard CFA with a repetition of a 6×6 micro-cell in which the color filter order is RBBRRB-RGRBGB-BBRBRR-RRBRBB-BGBRGR-BRRBBR.
  • R and B are sampled at a higher frequency than in a standard CFA.
  • Rs is 0.44 vs. 0.25 in a Bayer pixel order.
  • FIG. 9 shows schematically an embodiment of a Wide sensor 900 that may be implemented in a DAZI system such as system 100 .
  • Sensor 900 has a non-overlap area 902 with a Bayer CFA and an overlap area 904 covered by a non-standard CFA with a repetition of a 6×6 micro-cell in which the color filter order is RBRBRB-BGBRGR-RBRBRB-BRBRBR-RGRBGB-BRBRBR.
  • R and B are sampled at a higher frequency than in a standard CFA.
  • Rs is 0.44 vs. 0.25 in a Bayer pixel order.
  • an image is acquired with imaging system 100 and is processed according to steps illustrated in a flowchart shown in FIG. 10 .
  • demosaicing is performed on the Wide overlap area pixels (which refer to the Tele image FOV) according to the specific CFA pattern. If the CFA in the Wide overlap area is a standard CFA, a standard demosaicing process may be applied to it. If the CFA in the Wide overlap area is non-standard CFA, the overlap and non-overlap subsets of pixels may need different demosaicing processes. That is, the Wide overlap area may need a non-standard demosaicing process and the Wide non-overlap area may need a standard demosaicing process.
  • demosaicing interpolations for the overlap area of each of the Wide sensors shown in FIGS. 2-9 are given in detail below.
  • the aim of the demosaicing is to reconstruct missing colors in each pixel.
  • Demosaicing is applied also to the Tele sensor pixels if the Tele sensor is not a Clear only sensor. This will result in a Wide subset color image where the colors (in the overlap area) hold higher resolution than those of a standard CFA pattern.
  • the Tele image is registered (mapped) into the Wide image.
  • the mapping includes finding correspondences between pixels in the two images.
  • In step 1002, actual registration is performed on luminance Tele and Wide images (respectively Luma_Tele and Luma_Wide) calculated from the pixel information of the Tele and Wide cameras.
  • These luminance images are estimates for the scene luminance as captured by each camera and do not include any color information.
  • If the Wide or Tele sensors have CFAs, the calculation of the luminance images is performed on the respective demosaiced images.
  • the calculation of the Wide luminance image varies according to the type of non-standard CFA used in the Wide overlap area. If the CFA permits calculation of a full RGB demosaiced image, the luminance image calculation is straightforward. If the CFA is such that it does not permit calculation of a full RGB demosaiced image, the luminance image is estimated from the available color channels.
  • the Tele luminance image is just the pixel information. Performing the registration on luminance images has the advantage of enabling registration between images captured by sensors with different CFAs or between images captured by a standard CFA or non-standard CFA sensor and a standard CFA or Clear sensor and avoiding color artifacts that may arise from erroneous registration.
  • In step 1004, the data from the Wide and Tele images is processed together with the registration information from step 1002 to form a high quality output zoom image.
  • In cases where the Tele sensor is a Clear only sensor, the high resolution luminance component is taken from the Tele sensor and color resolution is taken from the Wide sensor.
  • In cases where the Tele sensor includes a CFA, both color and luminance data are taken from the Tele subset to form the high quality zoom image.
  • color and luminance data is taken from the Wide subset.
  • the Wide image is interpolated to reconstruct the missing pixel values.
  • Standard demosaicing is applied in the non-overlap area. If the overlap area includes a standard CFA, standard demosaicing is applied there as well. If the overlap area includes a non-standard CFA, a special demosaicing algorithm is applied, depending on the CFA pattern used. In addition, in case the Tele sensor has a CFA, standard demosaicing is applied to reconstruct the missing pixel values in each pixel location and to generate a full RGB color image.
  • This step of the algorithm calculates the mapping between the overlap areas in the two luminance images.
  • the registration step does not depend on the type of CFA used (or the lack thereof), as it is applied on luminance images.
  • the same registration step can therefore be applied on Wide and Tele images captured by standard CFA sensors, as well as by any combination of CFAs or Clear sensor pixels disclosed herein.
  • the registration process chooses either the Wide image or the Tele image to be a primary image.
  • the other image is defined as an auxiliary image.
  • the registration process considers the primary image as the baseline image and registers the overlap area in the auxiliary image to it, by finding for each pixel in the overlap area of the primary image its corresponding pixel in the auxiliary image.
  • the output image point of view is determined according to the primary image point of view (camera angle).
  • Various correspondence metrics could be used for this purpose, among which are a sum of absolute differences and correlation.
  • the choice of the Wide image or the Tele image as the primary and auxiliary images is based on the ZF chosen for the output image. If the chosen ZF is larger than the ratio between the focal-lengths of the Tele and Wide cameras, the Tele image is set to be the primary image and the Wide image is set to be the auxiliary image. If the chosen ZF is smaller than or equal to the ratio between the focal-lengths of the Tele and Wide cameras, the Wide image is set to be the primary image and the Tele image is set to be the auxiliary image. In another embodiment independent of a zoom factor, the Wide image is always the primary image and the Tele image is always the auxiliary image.
  • the output of the registration stage is a map relating Wide image pixels indices to matching Tele image pixels indices.
  • the primary and auxiliary images are used to produce a high resolution image.
  • the values of the weights c1 and c2 used in forming Luma_Out may change between different pixels in the image.
  • RGB values of the output are calculated from Luma_Out and R_Wide, G_Wide and B_Wide.
  • If the Tele image is the primary image and is generated from a Clear sensor, the RGB values of the output are calculated from the Luma_Tele image and R_Wide, G_Wide and B_Wide (matching pixels according to the registration map).
  • If the Tele image is the primary image and is generated from a CFA sensor, the RGB values of the output are calculated either by using only the Tele image data, or by also combining data from the Wide image; the choice depends on the zoom factor.
  • Certain portions of the registered Wide and Tele images are used to generate the output image based on the ZF of the output image.
  • the ZF of the output image defines a FOV smaller than the Tele FOV
  • the fused high resolution image is cropped to the required field of view and digital interpolation is applied to scale up the image to the required output image resolution.
  • R22 = (R21 + R23)/2
  • G22 = W22 - R22 - B22 (assuming that W includes the same amount of R, G and B colors)
  • W22 = (2*W21 + W24)/3
  • G22 = W22 - R22 - B22 (assuming that W contains the same amount of R, G and B colors); the same operation is performed for Blue as the center pixel.
  • R22 = (R21 + R23)/2
  • G32 = (2*G31 + 2*G22 + G43)/5
  • R32 = (R41 + 2*R42 + 2*R33 + R23 + R21)/7
  • R22 = (2*R21 + 2*R32 + R13)/5
  • R22 = (2*R21 + 2*R23 + R11)/5
  • G32 = (2*G22 + G52)/3
  • R32 = (2*R33 + 2*R42 + R41 + R21 + R23)/7
  • B22 = (B12 + B32 + B23 + B21)/4
  • R22 = (R11 + R13 + R31 + R33)/4
  • G32 = (2*G22 + G52)/3
  • R32 = (R42 + R31 + R33)/3
  • (A minimal code sketch of this kind of same-color neighbor averaging appears directly after this list.)
  • FIGS. 11A-11B A non-limiting and exemplary embodiment 1100 of a triple-aperture imaging system is shown in FIGS. 11A-11B .
  • System 1100 includes a first Wide subset camera 1102 (with exemplarily X1), a second Wide subset camera (with exemplarily X1.5, and referred to as a “Wide-Tele” subset) and a Tele subset camera (with exemplarily X2).
  • FIG. 11A shows exemplary images captured by imaging system 1100
  • FIG. 11B illustrates schematically three sensors marked 1102 , 1104 and 1106 , which belong respectively to the Wide, Wide-Tele and Tele subsets.
  • FIG. 11B also shows the CFA arrangements in each sensor: sensors 1102 and 1104 are similar to Wide sensors described above with reference to any of FIGS. 2-9 , in the sense that they include an overlap area and a non-overlap area.
  • the overlap area includes a non-standard CFA.
  • the non-overlap area may have a Clear pattern or a standard CFA.
  • neither Wide subset is solely a Clear channel camera.
  • the Tele sensor may be Clear or have a standard Bayer CFA or a standard non-Bayer CFA.
  • an image is acquired with imaging system 1100 and processed as follows: demosaicing is performed on the overlap area pixels of the Wide and Wide-Tele sensors according to the specific CFA pattern in each overlap area.
  • the overlap and non-overlap subsets of pixels in each of these sensors may need different demosaicing processes.
  • Exemplary and non-limiting demosaicing specifications for the overlap area for Wide sensors shown in FIGS. 2-9 are given above.
  • the aim is to reconstruct the missing colors in each and every pixel.
  • demosaicing is performed as well.
  • the Wide and Wide-Tele subset color images acquired this way will have colors (in the overlap area) holding higher resolution than that of a standard CFA pattern.
  • the Tele image acquired with the Tele sensor is registered (mapped) into the respective Wide image.
  • the data from the Wide, Wide-Tele and Tele images is then processed to form a high quality zoom image.
  • high Luma resolution is taken from the Tele sensor and color resolution is taken from the Wide sensor.
  • color resolution is taken from the Tele subset.
  • color resolution is taken from the Wide sensor. The resolution of the fused image may be higher than the resolution of both sensors.
  • multi-aperture imaging systems with more than two Wide or Wide-Tele subsets (and sensors) or with more than one Tele subset (and sensor) may be constructed and used according to principles set forth herein.
  • non-zoom multi-aperture imaging systems with more than two sensors, at least one of which has a non-standard CFA, may be constructed and used according to principles set forth herein.
  • the disclosure is to be understood as not limited by the specific embodiments described herein, but only by the scope of the appended claims.
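The interpolation formulas listed above are weighted averages over nearby same-color samples, plus a G = W - R - B step at Clear (White) pixels. The sketch below is only an illustration of that idea and not the patent's algorithm: it averages same-color neighbours in a small window with uniform weights (the patent's formulas use pattern-specific weighted neighbourhoods), and the function names, window size and NumPy label-map representation are assumptions introduced here.

    # Illustrative only: uniform same-color neighbour averaging, standing in for the
    # pattern-specific weighted interpolations listed above (e.g. R22 = (R21 + R23)/2).
    import numpy as np

    def demosaic_average(raw: np.ndarray, cfa: np.ndarray, window: int = 1) -> dict:
        """Return {'R', 'G', 'B'} planes from one raw mosaic and its CFA label map."""
        h, w = raw.shape
        planes = {}
        for colour in ("R", "G", "B"):
            plane = np.zeros((h, w), dtype=np.float64)
            for y in range(h):
                for x in range(w):
                    if cfa[y, x] == colour:
                        plane[y, x] = raw[y, x]            # keep the measured sample
                        continue
                    y0, y1 = max(0, y - window), min(h, y + window + 1)
                    x0, x1 = max(0, x - window), min(w, x + window + 1)
                    patch = raw[y0:y1, x0:x1]
                    mask = cfa[y0:y1, x0:x1] == colour
                    plane[y, x] = patch[mask].mean() if mask.any() else 0.0
            planes[colour] = plane
        return planes

    def fill_green_from_white(planes: dict, raw: np.ndarray, cfa: np.ndarray) -> None:
        """At White/Clear pixels use G = W - R - B (W assumed to hold equal R, G, B)."""
        white = cfa == "W"
        planes["G"][white] = raw[white] - planes["R"][white] - planes["B"][white]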

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Optics & Photonics (AREA)
  • Color Television Image Signal Generators (AREA)
  • Studio Devices (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Spectrometry And Color Measurement (AREA)
  • Cameras In General (AREA)

Abstract

A multi-aperture imaging system comprising a first camera with a first sensor that captures a first image and a second camera with a second sensor that captures a second image, the two cameras having either identical or different FOVs. The first sensor may have a standard color filter array (CFA) covering one sensor section and a non-standard color CFA covering another. The second sensor may have either Clear or standard CFA covered sections. Either image may be chosen to be a primary or an auxiliary image, based on a zoom factor. An output image with a point of view determined by the primary image is obtained by registering the auxiliary image to the primary image.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
This application is a Continuation application of U.S. patent application Ser. No. 14/386,823 (now allowed), which was a National Phase application from PCT patent application PCT/IB2013/060356 which claimed priority from U.S. Provisional Patent Application No. 61/730,570 having the same title and filed Nov. 28, 2012, the latter incorporated herein by reference in its entirety. This patent application is a reissue application of Ser. No. 15/375,090, filed Dec. 11, 2016, now U.S. Pat. No. 9,876,952, which is a continuation of U.S. patent application Ser. No. 14/386,823, filed Apr. 22, 2014, now U.S. Pat. No. 9,538,152, which was a National Phase application from PCT application PCT/IB2013/060356 which claimed priority from U.S. Provisional Patent Application No. 61/730,570 having the same title and filed Nov. 28, 2012, the latter incorporated herein by reference in its entirety. This broadening reissue application is a parent to each of the following four co-pending continuation reissue applications: U.S. patent application Ser. No. 16/384,140 (filed Apr. 15, 2019), U.S. patent application Ser. No. 16/384,197 (filed Apr. 15, 2019), U.S. patent application Ser. No. 16/384,244 (filed Apr. 15, 2019) and U.S. patent application Ser. No. 16/419,604 (filed May 22, 2019).
FIELD
Embodiments disclosed herein relate in general to multi-aperture imaging (“MAI”) systems (where “multi” refers to two or more apertures) and more specifically to thin MAI systems with high color resolution and/or optical zoom.
BACKGROUND
Small digital cameras integrated into mobile (cell) phones, personal digital assistants and music players are becoming ubiquitous. Each year, mobile phone manufacturers add more imaging features to their handsets, causing these mobile imaging devices to converge towards feature sets and image quality that customers expect from standalone digital still cameras. Concurrently, the size of these handsets is shrinking, making it necessary to reduce the total size of the camera accordingly while adding more imaging features. Optical Zoom is a primary feature of many digital still cameras but one that mobile phone cameras usually lack, mainly due to camera height constraints in mobile imaging devices, cost and mechanical reliability.
Mechanical zoom solutions are common in digital still cameras but are typically too thick for most camera phones. Furthermore, the F/# ("F number") in such systems typically increases with the zoom factor (ZF), resulting in poor light sensitivity and higher noise (especially in low-light scenarios). In mobile cameras, this also results in resolution compromise, due to the small pixel size of their image sensors and the diffraction limit optics associated with the F/#.
One way of implementing zoom in mobile cameras is by over-sampling the image and cropping and interpolating it in accordance with the desired ZF. While this method is mechanically reliable, it results in thick optics and in an expensive image sensor due to the large number of pixels associated therewith. As an example, if one is interested in implementing a 12 Megapixel camera with X3 ZF, one needs a sensor of 108 Megapixels.
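The arithmetic behind this example is simply that a crop ZF times narrower must still contain the full output pixel count, so the sensor needs ZF² times as many pixels. The small Python sketch below is an illustration added here, not part of the patent text; it reproduces the 12 Megapixel, X3 example.

    # Over-sampling zoom: a 1/ZF crop must still hold the full output resolution,
    # so the sensor needs ZF^2 times the output pixel count.
    def oversampled_sensor_megapixels(output_mp: float, zoom_factor: float) -> float:
        return output_mp * zoom_factor ** 2

    print(oversampled_sensor_megapixels(12, 3))  # 108.0, matching the example above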
Another way of implementing zoom, as well as increasing the output resolution, is by using a dual-aperture imaging (“DAI”) system. In its basic form, a DAI system includes two optical apertures which may be formed by one or two optical modules, and one or two image sensors (e.g., CMOS or CCD) that grab the optical image or images and convert the data into the electronic domain, where the image can be processed and stored.
The design of a thin MAI system with improved resolution requires a careful choice of parameters coupled with advanced signal processing algorithms to support the output of a high quality image. Known MAI systems, in particular ones with short optical paths, often trade-off functionalities and properties, for example zoom and color resolution, or image resolution and quality for camera module height. Therefore, there is a need for, and it would be advantageous to have thin MAI systems that produce an image with high resolution (and specifically high color resolution) together with zoom functionality.
Moreover, known signal processing algorithms used together with existing MAI systems often further degrade the output image quality by introducing artifacts when combining information from different apertures. A primary source of these artifacts is the image registration process, which has to find correspondences between the different images that are often captured by different sensors with different color filter arrays (CFAs). There is therefore a need for, and it would be advantageous to have an image registration algorithm that is more robust to the type of CFA used by the cameras and which can produce better correspondence between images captured by a multi-aperture system.
SUMMARY
Embodiments disclosed herein teach the use of multi-aperture imaging systems to implement thin cameras (with short optical paths of less than about 9 mm) and/or to realize optical zoom systems in such thin cameras. Embodiments disclosed herein further teach new color filter arrays that optimize the color information which may be achieved in a multi-aperture imaging system with or without zoom. In various embodiments, a MAI system disclosed herein includes at least two sensors or a single sensor divided into at least two areas. Hereinafter, the description refers to “two sensors”, with the understanding that they may represent sections of a single physical sensor (imager chip). Exemplarily, in a dual-aperture imaging system, a left sensor (or left side of a single sensor) captures an image coming from a first aperture while a right sensor (or right side of a single sensor) captures an image coming from a second aperture. In various embodiments disclosed herein, one sensor is a “Wide” sensor while another sensor is a “Tele” sensor, see e.g. FIG. 1A. The Wide sensor includes either a single standard CFA or two different CFAs: a non-standard CFA with higher color sampling rate positioned in an “overlap area” of the sensor (see below description of FIG. 1B) and a standard CFA with a lower color sampling rate surrounding the overlap area. When including a single standard CFA, the CFA may cover the entire Wide sensor area. A “standard CFA” may include a RGB (Bayer) pattern or a non-Bayer pattern such as RGBE, CYYM, CYGM, RGBW#1, RGBW#2 or RGBW#3. Thus, reference may be made to “standard Bayer” or “standard non-Bayer” patterns or filters. As used herein, “non-standard CFA” refers to a CFA that is different in its pattern from the CFAs listed above as “standard”. Exemplary non-standard CFA patterns may include repetitions of a 2×2 micro-cell in which the color filter order is RR-BB, RB-BR or YC-CY where Y=Yellow=Green+Red, C=Cyan=Green+Blue; repetitions of a 3×3 micro-cell in which the color filter order is GBR-RGB-BRG; and repetitions of a 6×6 micro-cell in which the color filter order is RBBRRB-RWRBWB-BBRBRR-RRBRBB-BWBRWR-BRRBBR, or BBGRRG-RGRBGB-GBRGRB-RRGBBG-BGBRGR-GRBGBR, or RBBRRB-RGRBGB-BBRBRR-RRBRBB-BGBRGR-BRRBBR, or RBRBRB-BGBRGR-RBRBRB-BRBRBR-RGRBGB-BRBRBR.
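The micro-cell patterns above can be tiled programmatically, which also makes it easy to check the average color sampling rates quoted later in the text (Rs = 0.25 for Bayer and 0.44 for the 6×6 cell of FIG. 5). The sketch below is an illustration added here, not part of the patent; the parsing helpers and the NumPy character-array representation are assumptions.

    # Tile a CFA micro-cell over a sensor (overlap) area and measure the red sampling rate Rs.
    import numpy as np

    def micro_cell(spec: str) -> np.ndarray:
        """Parse a dash-separated micro-cell spec such as 'BR-RB' or a 6x6 pattern."""
        return np.array([list(row) for row in spec.split("-")])

    def tile_cfa(cell: np.ndarray, rows: int, cols: int) -> np.ndarray:
        """Repeat the micro-cell to cover a rows x cols area."""
        h, w = cell.shape
        reps = (-(-rows // h), -(-cols // w))          # ceiling division
        return np.tile(cell, reps)[:rows, :cols]

    def sampling_rate(cfa: np.ndarray, colour: str) -> float:
        """Fraction of pixels carrying the given colour filter."""
        return float(np.mean(cfa == colour))

    bayer = micro_cell("RG-GB")
    fig5_cell = micro_cell("RBBRRB-RWRBWB-BBRBRR-RRBRBB-BWBRWR-BRRBBR")
    print(round(sampling_rate(tile_cfa(bayer, 12, 12), "R"), 2))      # 0.25 (Bayer)
    print(round(sampling_rate(tile_cfa(fig5_cell, 12, 12), "R"), 2))  # 0.44 (FIG. 5 cell)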
The Tele sensor may be a Clear sensor (i.e. a sensor without color filters) or a standard CFA sensor. These two (or more than two) sensors, each with its respective aperture and optics, form the two (or more than two) Wide and Tele “subset cameras” (or simply “subsets”). Each sensor provides a separate image (referred to respectively as a Wide image and a Tele image), except for the case of a single sensor, where two images are captured (grabbed) by the single sensor (example above). In some embodiments, zoom is achieved by fusing the two images, resulting in higher color resolution that approaches that of a high quality dual-aperture zoom camera. Some thin MAI systems disclosed herein therefore provide zoom, super-resolution, high dynamic range and enhanced user experience.
In some embodiments, in order to reach optical zoom capabilities, a different magnification image of the same scene is grabbed by each subset, resulting in field of view (FOV) overlap between the two subsets. In some embodiments, the two subsets have the same zoom (i.e. same FOV). In some embodiments, the Tele subset is the higher zoom subset and the Wide subset is the lower zoom subset. Post processing is applied on the two images grabbed by the MAI system to fuse and output one fused (combined) output zoom image processed according to a user ZF input request. In some embodiments, the resolution of the fused image may be higher than the resolution of the Wide/Tele sensors. As part of the fusion procedure, up-sampling may be applied on the Wide image to scale it to the Tele image.
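The fusion step just described (up-sample the Wide image to the Tele scale, then combine luminance and color information) can be sketched as below. This is only a schematic illustration under stated assumptions: nearest-neighbour up-sampling, a Rec.601-style luma estimate, a single global pair of weights c1/c2 (the text later only says such weights may vary per pixel), and already-registered inputs; none of these specifics come from the patent.

    # Schematic Wide/Tele fusion over the overlap area (inputs assumed registered).
    import numpy as np

    def luma(rgb: np.ndarray) -> np.ndarray:
        """Rec.601-style luminance estimate from an (H, W, 3) RGB image."""
        return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

    def upsample(img: np.ndarray, factor: int) -> np.ndarray:
        """Nearest-neighbour up-scaling, standing in for the unspecified scaler."""
        return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

    def fuse(wide_rgb: np.ndarray, tele_luma: np.ndarray, factor: int,
             c1: float = 0.7, c2: float = 0.3) -> np.ndarray:
        """Combine Tele luminance with Wide colour; c1, c2 are illustrative weights."""
        wide_up = upsample(wide_rgb, factor)[:tele_luma.shape[0], :tele_luma.shape[1]]
        luma_out = c1 * tele_luma + c2 * luma(wide_up)       # fused luminance
        chroma = wide_up - luma(wide_up)[..., None]          # colour offsets from Wide
        return np.clip(luma_out[..., None] + chroma, 0.0, 255.0)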
In an embodiment there is provided a multi-aperture imaging system comprising a first camera subset that provides a first image, the first camera subset having a first sensor with a first plurality of sensor pixels covered at least in part with a non-standard CFA, the non-standard CFA used to increase a specific color sampling rate relative to a same color sampling rate in a standard CFA; a second camera subset that provides a second image, the second camera subset having a second sensor with a second plurality of sensor pixels either Clear or covered with a standard CFA; and a processor configured to process the first and second images into a combined output image.
In some embodiments, the first and the second camera subsets have identical FOVs and the non-standard CFA may cover an overlap area that includes all the pixels of the first sensor, thereby providing increased color resolution. In some such embodiments, the processor is further configured to, during the processing of the first and second images into a combined output image, register respective first and second Luma images obtained from the first and second images, the registered first and second Luma images used together with color information to form the combined output image. In an embodiment, the registration includes finding a corresponding pixel in the second Luma image for each pixel in the first Luma image, whereby the output image is formed by transferring information from the second image to the first image. In another embodiment, the registration includes finding a corresponding pixel in the first Luma image for each pixel in the second Luma image, whereby the output image is formed by transferring information from the first image to the second image.
In some embodiments, the first camera subset has a first FOV, the second camera subset has a second, smaller FOV than the first FOV, and the non-standard CFA covers an overlap area on the first sensor that captures the second FOV, thereby providing both optical zoom and increased color resolution. In some such embodiments, the processor is further configured to, during the processing of the first and second images into a combined output image and based on a ZF input, register respective first and second Luma images obtained from the first and second images, the registered first and second Luma images used together with color information to form the combined output image. For a ZF input that defines an FOV greater than the second FOV, the registration includes finding a corresponding pixel in the second Luma image for each pixel in the first Luma image and the processing includes forming the output image by transferring information from the second image to the first image. For a ZF input that defines an FOV smaller than or equal to the second FOV, the registration includes finding a corresponding pixel in the first Luma image for each pixel in the second Luma image, and the processing includes forming the output image by transferring information from the first image to the second image.
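The primary/auxiliary decision implied here reduces to comparing the requested ZF with the Tele/Wide focal-length ratio (equivalently, with the Tele FOV), a rule also stated later in the registration discussion. The sketch below encodes only that rule; the class, function name and focal-length values are illustrative assumptions.

    # Choose primary/auxiliary images from the requested zoom factor (ZF).
    from dataclasses import dataclass

    @dataclass
    class Subset:
        name: str
        focal_length_mm: float

    def choose_primary(wide: Subset, tele: Subset, zoom_factor: float):
        """Tele is primary when ZF exceeds the Tele/Wide focal-length ratio."""
        ratio = tele.focal_length_mm / wide.focal_length_mm
        if zoom_factor > ratio:           # output FOV narrower than the Tele FOV
            return tele, wide
        return wide, tele                 # output FOV wider than or equal to Tele FOV

    wide, tele = Subset("Wide", 3.5), Subset("Tele", 7.0)   # illustrative values only
    primary, auxiliary = choose_primary(wide, tele, zoom_factor=2.5)
    print(primary.name)                   # 'Tele' for a ZF above the X2 ratio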
In an embodiment there is provided a multi-aperture imaging system comprising a first camera subset that provides a first image, the first camera subset having a first sensor with a first plurality of sensor pixels covered at least in part with a standard CFA; a second camera subset that provides a second image, the second camera subset having a second sensor with a second plurality of sensor pixels either Clear or covered with a standard CFA; and a processor configured to register first and second Luma images obtained respectively from the first and second images and to process the registered first and second Luma images together with color information into a combined output image.
In some embodiments, the first and the second camera subsets have identical first and second FOVs. In some such embodiments, the registration includes finding a corresponding pixel in the second Luma image for each pixel in the first Luma image and the processing includes forming the output image by transferring information from the second image to the first image. In other such embodiments, the registration includes finding a corresponding pixel in the first Luma image for each pixel in the second Luma image and the processing includes forming the output image by transferring information from the first image to the second image.
In some embodiments, the first camera subset has a first FOV, the second camera subset has a second, smaller FOV than the first FOV, and the processor is further configured to register the first and second Luma images based on a ZF input. For a ZF input that defines an FOV greater than the second FOV, the registration includes finding a corresponding pixel in the second Luma image for each pixel in the first Luma image and the processing includes forming the output image by transferring information from the second image to the first image. For a ZF input that defines an FOV smaller than or equal to the second FOV, the registration includes finding a corresponding pixel in the first Luma image for each pixel in the second Luma image, and the processing includes forming the output image by transferring information from the first image to the second image.
BRIEF DESCRIPTION OF THE DRAWINGS
Non-limiting examples of embodiments disclosed herein are described below with reference to figures attached hereto that are listed following this paragraph. The drawings and descriptions are meant to illuminate and clarify embodiments disclosed herein, and should not be considered limiting in any way.
FIG. 1A shows schematically a block diagram illustrating a dual-aperture zoom imaging system disclosed herein;
FIG. 1B shows an example of an image captured by the Wide sensor and the Tele sensor while illustrating the overlap area on the Wide sensor;
FIG. 2 shows schematically an embodiment of a Wide sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
FIG. 3 shows schematically another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
FIG. 4 shows schematically yet another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
FIG. 5 shows schematically yet another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
FIG. 6 shows schematically yet another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
FIG. 7 shows schematically yet another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
FIG. 8 shows schematically yet another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
FIG. 9 shows schematically yet another embodiment of a Wide camera sensor that may be implemented in a dual-aperture zoom imaging system disclosed herein;
FIG. 10 shows schematically, in a flow chart, an embodiment of a method disclosed herein for acquiring and outputting a zoom image;
FIG. 11A shows exemplary images captured by a triple aperture zoom imaging system disclosed herein;
FIG. 11B illustrates schematically the three sensors of the triple aperture imaging system of FIG. 11A.
DETAILED DESCRIPTION
Embodiments disclosed herein relate to multi-aperture imaging systems that include at least one Wide sensor with a single CFA or with two different CFAs and at least one Tele sensor. The description continues with particular reference to dual-aperture imaging systems that include two (Wide and Tele) subsets with respective sensors. A three-aperture imaging system is described later with reference to FIGS. 11A-11B.
The Wide sensor includes an overlap area (see description of FIG. 1B) that captures the Tele FOV. The overlap area may cover the entire Wide sensor or only part of the sensor. The overlap area may include a standard CFA or a non-standard CFA. Since the Tele image is optically magnified compared to the Wide image, the effective sampling rate of the Tele image is higher than that of the Wide image. Thus, the effective color sampling rate in the Wide sensor is much lower than the Clear sampling rate in the Tele sensor. In addition, the Tele and Wide images fusion procedure (see below) requires up-scaling of the color data from the Wide sensor. Up-scaling will not improve color resolution. In some applications, it is therefore advantageous to use a non-standard CFA in the Wide overlap area that increases color resolution for cases in which the Tele sensor includes only Clear pixels. In some embodiments in which the Tele sensor includes a Bayer CFA, the Wide sensor may have a Bayer CFA in the overlap area. In such embodiments, color resolution improvement depends on using color information from the Tele sensor in the fused output image.
FIG. 1A shows schematically a block diagram illustrating a dual-aperture zoom imaging (“DAZI”) system 100 disclosed herein. System 100 includes a dual-aperture camera 102 with a Wide subset 104 and a Tele subset 106 (each subset having a respective sensor), and a processor 108 that fuses two images, a Wide image obtained with the Wide subset and a Tele image obtained with the Tele subset, into a single fused output image according to a user-defined “applied” ZF input or request. The ZF is input to processor 108. The Wide sensor may include a non-standard CFA in an overlap area illustrated by 110 in FIG. 1B. Overlap area 110 is surrounded by a non-overlap area 112 with a standard CFA (for example a Bayer pattern). FIG. 1B also shows an example of an image captured by both Wide and Tele sensors. Note that “overlap” and “non-overlap” areas refer to parts of the Wide image as well as to the CFA arrangements of the Wide sensor. The overlap area may cover different portions of a Wide sensor, for example half the sensor area, a third of the sensor area, a quarter of the sensor area, etc. A number of such Wide sensor CFA arrangements are described in more detail with reference to FIGS. 2-9. The non-standard CFA pattern increases the color resolution of the DAZI system.
The Tele sensor may be Clear (providing a Tele Clear image scaled relative to the Wide image) or may include a standard (Bayer or non-Bayer) CFA. In the latter case, it is desirable to define primary and auxiliary sensors based on the applied ZF. If the ZF is such that the output FOV is larger than the Tele FOV, the primary sensor is the Wide sensor and the auxiliary sensor is the Tele sensor. If the ZF is such that the output FOV is equal to or smaller than the Tele FOV, the primary sensor is the Tele sensor and the auxiliary sensor is the Wide sensor. The point of view defined by the output image is that of the primary sensor.
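For illustration only and not as part of the disclosed method, the primary/auxiliary selection described above may be sketched in Python as follows; the function and variable names are hypothetical, and the applied ZF is assumed to have already been converted into an output FOV:

def choose_primary(output_fov, tele_fov):
    # Convention of the preceding paragraph: an output FOV equal to or smaller
    # than the Tele FOV makes the Tele sensor primary; otherwise the Wide sensor
    # is primary and the Tele sensor is auxiliary.
    if output_fov <= tele_fov:
        return "tele", "wide"
    return "wide", "tele"

print(choose_primary(40.0, 35.0))  # ('wide', 'tele'): output FOV wider than the Tele FOV
print(choose_primary(30.0, 35.0))  # ('tele', 'wide'): output FOV inside the Tele FOV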
FIG. 2 shows schematically an embodiment of a Wide sensor 200 that may be implemented in a DAZI system such as system 100. Sensor 200 has a non-overlap area 202 with a Bayer CFA and an overlap area 204 covered by a non-standard CFA with a repetition of a 4×4 micro-cell in which the color filter order is BBRR-RBBR-RRBB-BRRB. In this figure, as well as in FIGS. 3-9, “Width 1” and “Height 1” refer to the full Wide sensor dimension. “Width 2” and “Height 2” refer to the dimensions of the Wide sensor overlap area. Note that in FIG. 2 (as in following FIGS. 3-5 and 7, 8) the empty row and column to the left and top of the overlap area are for clarity purposes only, and that the sensor pixels there follow the pattern of the non-overlap area (as shown in FIG. 6). In overlap area 204, R and B are sampled at 1/2^0.5 Nyquist frequency in the diagonal (left to right) direction with 2-pixel intervals, instead of at 1/2 Nyquist frequency as in a standard Bayer pattern.
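For illustration only, the 4×4 micro-cell of FIG. 2 may be tiled over an overlap area of arbitrary size with a short Python (numpy) sketch such as the one below; the names are hypothetical and the snippet is not part of the patent:

import numpy as np

MICRO_CELL = np.array([list("BBRR"), list("RBBR"), list("RRBB"), list("BRRB")])

def tile_overlap_cfa(height, width):
    reps = (-(-height // 4), -(-width // 4))           # ceiling division
    return np.tile(MICRO_CELL, reps)[:height, :width]  # crop to the overlap size

mask = tile_overlap_cfa(8, 8)
print(mask[:4, :4])
# Along each left-to-right diagonal of the micro-cell, R and B alternate, so each
# of R and B is sampled at 2-pixel intervals in that direction, as noted above.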
FIG. 3 shows schematically an embodiment of a Wide sensor 300 that may be implemented in a DAZI system such as system 100. Sensor 300 has a non-overlap area 302 with a Bayer CFA and an overlap area 304 covered by a non-standard CFA with a repetition of a 2×2 micro-cell in which the color filter order is BR-RB. In the overlap area, R and B are sampled at 1/2^0.5 Nyquist frequency in both diagonal directions.
FIG. 4 shows schematically an embodiment of a Wide sensor 400 that may be implemented in a DAZI system such as system 100. Sensor 400 has a non-overlap area 402 with a Bayer CFA and an overlap area 404 covered by a non-standard CFA with a repetition of a 2×2 micro-cell in which the color filter order is YC-CY, where Y=Yellow=Green+Red and C=Cyan=Green+Blue. As a result, in the overlap area, R and B are sampled at 1/2^0.5 Nyquist frequency in a diagonal direction. The non-standard CFA includes green information for registration purposes. This allows, for example, registration between the two images in areas where the object is green, since there is green information in both sensor images.
FIG. 5 shows schematically an embodiment of a Wide sensor 500 that may be implemented in a DAZI system such as system 100. Sensor 500 has a non-overlap area 502 with a Bayer CFA and an overlap area 504 covered by a non-standard CFA with a repetition of a 6×6 micro-cell in which the color filter order is RBBRRB-RWRBWB-BBRBRR-RRBRBB-BWBRWR-BRRBBR, where “W” represents White or Clear pixels. In the overlap area, R and B are sampled at a higher frequency than in a standard CFA. For example, in a Bayer pixel order, the Red average sampling rate (“Rs”) is 0.25 (sampled once for every 4 pixels). In the overlap area pattern, Rs is 0.44.
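The quoted average sampling rate can be verified directly from the micro-cell string; the following Python snippet (illustrative only, not part of the patent) counts the filters of the 6×6 cell of FIG. 5:

pattern = "RBBRRB-RWRBWB-BBRBRR-RRBRBB-BWBRWR-BRRBBR"
rows = pattern.split("-")
total = sum(len(r) for r in rows)             # 36 pixels per micro-cell
rs = sum(r.count("R") for r in rows) / total  # Red average sampling rate
bs = sum(r.count("B") for r in rows) / total  # Blue average sampling rate
print(round(rs, 2), round(bs, 2))             # 0.44 0.44, vs. 0.25 in a Bayer cell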
FIG. 6 shows schematically an embodiment of a Wide sensor 600 that may be implemented in a DAZI system such as system 100. Sensor 600 has a non-overlap area 602 with a Bayer CFA and an overlap area 604 covered by a non-standard CFA with a repetition of a 6×6 micro-cell in which the color filter order is BBGRRG-RGRBGB-GBRGRB-RRGBBG-BGBRGR-GRBGBR. In the overlap area, R and B are sampled at a higher frequency than in a standard CFA. For example, in the overlap area pattern, Rs is 0.33 vs. 0.25 in a Bayer pixel order.
FIG. 7 shows schematically an embodiment of a Wide sensor 700 that may be implemented in a DAZI system such as system 100. Sensor 700 has a non-overlap area 702 with a Bayer CFA and an overlap area 704 covered by a non-standard CFA with a repetition of a 3×3 micro-cell in which the color filter order is GBR-RGB-BRG. In the overlap area, R and B are sampled at a higher frequency than in a standard CFA. For example, in the overlap area pattern, Rs is 0.33 vs. 0.25 in a Bayer pixel order.
FIG. 8 shows schematically an embodiment of a Wide sensor 800 that may be implemented in a DAZI system such as system 100. Sensor 800 has a non-overlap area 802 with a Bayer CFA and an overlap area 804 covered by a non-standard CFA with a repetition of a 6×6 micro-cell in which the color filter order is RBBRRB-RGRBGB-BBRBRR-RRBRBB-BGBRGR-BRRBBR. In the overlap area, R and B are sampled at a higher frequency than in a standard CFA. For example, in the overlap area pattern, Rs is 0.44 vs. 0.25 in a Bayer pixel order.
FIG. 9 shows schematically an embodiment of a Wide sensor 900 that may be implemented in a DAZI system such as system 100. Sensor 900 has a non-overlap area 902 with a Bayer CFA and an overlap area 904 covered by a non-standard CFA with a repetition of a 6×6 micro-cell in which the color filter order is RBRBRB-BGBRGR-RBRBRB-BRBRBR-RGRBGB-BRBRBR. In the overlap area, R and B are sampled at a higher frequency than in a standard CFA. For example, in the overlap area pattern, Rs is 0.44 vs. 0.25 in a Bayer pixel order.
Processing Flow
In use, an image is acquired with imaging system 100 and is processed according to steps illustrated in a flowchart shown in FIG. 10. In step 1000, demosaicing is performed on the Wide overlap area pixels (which refer to the Tele image FOV) according to the specific CFA pattern. If the CFA in the Wide overlap area is a standard CFA, a standard demosaicing process may be applied to it. If the CFA in the Wide overlap area is a non-standard CFA, the overlap and non-overlap subsets of pixels may need different demosaicing processes. That is, the Wide overlap area may need a non-standard demosaicing process and the Wide non-overlap area may need a standard demosaicing process. Exemplary and non-limiting non-standard demosaicing interpolations for the overlap area of each of the Wide sensors shown in FIGS. 2-9 are given in detail below. The aim of the demosaicing is to reconstruct missing colors in each pixel. Demosaicing is applied also to the Tele sensor pixels if the Tele sensor is not a Clear only sensor. This will result in a Wide subset color image where the colors (in the overlap area) hold higher resolution than those of a standard CFA pattern. In step 1002, the Tele image is registered (mapped) into the Wide image. The mapping includes finding correspondences between pixels in the two images. The actual registration is performed on luminance Tele and Wide images (LumaTele and LumaWide, respectively) calculated from the pixel information of the Tele and Wide cameras. These luminance images are estimates for the scene luminance as captured by each camera and do not include any color information. If the Wide or Tele sensors have CFAs, the calculation of the luminance images is performed on the respective demosaiced images. The calculation of the Wide luminance image varies according to the type of non-standard CFA used in the Wide overlap area. If the CFA permits calculation of a full RGB demosaiced image, the luminance image calculation is straightforward. If the CFA is such that it does not permit calculation of a full RGB demosaiced image, the luminance image is estimated from the available color channels. If the Tele sensor is a Clear sensor, the Tele luminance image is just the pixel information. Performing the registration on luminance images has the advantages of enabling registration between images captured by sensors with different CFAs, or between images captured by a standard or non-standard CFA sensor and a standard CFA or Clear sensor, and of avoiding color artifacts that may arise from erroneous registration.
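Because the text does not fix a particular Luma formula, the following Python sketch is only one possible way to form the luminance estimates used for registration; the Rec. 601 weights are an assumption made purely for illustration:

import numpy as np

def luma_from_rgb(rgb):
    # Assumed Rec. 601 weighting of a demosaiced H x W x 3 image; any other
    # reasonable luminance estimate could be substituted here.
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

def tele_luma(tele_data, tele_is_clear):
    # For a Clear Tele sensor the luminance image is just the pixel information.
    return tele_data if tele_is_clear else luma_from_rgb(tele_data)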
In step 1004, the data from the Wide and Tele images is processed together with the registration information from step 1002 to form a high quality output zoom image. In cases where the Tele sensor is a Clear only sensor, the high resolution luminance component is taken from the Tele sensor and color resolution is taken from the Wide sensor. In cases where the Tele sensor includes a CFA, both color and luminance data are taken from the Tele subset to form the high quality zoom image. In addition, color and luminance data is taken from the Wide subset.
Exemplary Process for Fusing a Zoom Image
1. Special Demosaicing
In this step, the Wide image is interpolated to reconstruct the missing pixel values. Standard demosaicing is applied in the non-overlap area. If the overlap area includes a standard CFA, standard demosaicing is applied there as well. If the overlap area includes a non-standard CFA, a special demosaicing algorithm is applied, depending on the CFA pattern used. In addition, in case the Tele sensor has a CFA, standard demosaicing is applied to reconstruct the missing pixel values in each pixel location and to generate a full RGB color image.
2. Registration Preparation
    • Tele image: a luminance image LumaTele is calculated from the Tele sensor pixels. If the Tele subset has a Clear sensor, LumaTele is simply the sensor pixel data. If the Tele subset has a standard CFA, LumaTele is calculated from the demosaiced Tele image.
    • Wide image: as a first step, in case the Wide overlap CFA permits estimating the luminance component of the image, the luminance component LumaWide is calculated from the demosaiced Wide image. If the CFA is one of those depicted in FIGS. 4-9, a luminance image is calculated first. If the CFA is one of the CFAs depicted in FIG. 2 or FIG. 3, a luminance image is not calculated. Instead, the following registration step is performed between a weighted average of the demosaiced channels of the Wide image and LumaTele. For convenience, this weighted average image is also denoted LumaWide. For example, if the Wide sensor CFA in the overlap region is as shown in FIG. 2, the demosaiced channels RWide and BWide are averaged to create LumaWide according to LumaWide=(f1*RWide+f2*BWide)/(f1+f2), where, for example, f1=1 and f2=1.
    • Low-pass filtering is applied on the Tele luminance image in order to match its spatial frequency content to that of the LumaWide image. This improves the registration performance, as after low-pass filtering the luminance images become more similar. The calculation is LumaTele→Low pass filter→LumaTele LP, where “LP” denotes an image after low pass filtering.
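A minimal sketch of the two preparation steps above, assuming f1=f2=1 and a small separable smoothing kernel (both choices are illustrative, not taken from the patent):

import numpy as np

def luma_wide_from_rb(r_wide, b_wide, f1=1.0, f2=1.0):
    # Weighted average of the demosaiced R and B channels for the FIG. 2/FIG. 3 CFAs.
    return (f1 * r_wide + f2 * b_wide) / (f1 + f2)

def low_pass(luma_tele, kernel=(0.25, 0.5, 0.25)):
    # Separable low-pass filter applied to LumaTele so that its spatial frequency
    # content roughly matches that of LumaWide before registration.
    k = np.asarray(kernel, dtype=float)
    out = np.apply_along_axis(lambda col: np.convolve(col, k, mode="same"), 0, luma_tele)
    return np.apply_along_axis(lambda row: np.convolve(row, k, mode="same"), 1, out)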
      3. Registration of LumaWide and LumaTele LP
This step of the algorithm calculates the mapping between the overlap areas in the two luminance images. The registration step does not depend on the type of CFA used (or the lack thereof), as it is applied on luminance images. The same registration step can therefore be applied on Wide and Tele images captured by standard CFA sensors, as well as by any combination of CFAs or Clear sensor pixels disclosed herein. The registration process chooses either the Wide image or the Tele image to be a primary image. The other image is defined as an auxiliary image. The registration process considers the primary image as the baseline image and registers the overlap area in the auxiliary image to it, by finding for each pixel in the overlap area of the primary image its corresponding pixel in the auxiliary image. The output image point of view is determined according to the primary image point of view (camera angle). Various correspondence metrics could be used for this purpose, among which are a sum of absolute differences and correlation.
In an embodiment, the choice of the Wide image or the Tele image as the primary and auxiliary images is based on the ZF chosen for the output image. If the chosen ZF is larger than the ratio between the focal lengths of the Tele and Wide cameras, the Tele image is set to be the primary image and the Wide image is set to be the auxiliary image. If the chosen ZF is smaller than or equal to the ratio between the focal lengths of the Tele and Wide cameras, the Wide image is set to be the primary image and the Tele image is set to be the auxiliary image. In another embodiment, independent of the zoom factor, the Wide image is always the primary image and the Tele image is always the auxiliary image. The output of the registration stage is a map relating Wide image pixel indices to matching Tele image pixel indices.
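As a rough, non-authoritative sketch of such a correspondence search, the Python snippet below finds one entry of the registration map with a sum-of-absolute-differences criterion; the window and search sizes are arbitrary illustration values, and a practical implementation would also account for the scale difference between the Wide and Tele images:

import numpy as np

def register_pixel_sad(primary_luma, auxiliary_luma, y, x, win=4, search=8):
    # Returns the auxiliary-image coordinates best matching primary pixel (y, x);
    # assumes (y, x) lies far enough from the image borders.
    patch = primary_luma[y - win:y + win + 1, x - win:x + win + 1]
    best_sad, best_pos = np.inf, (y, x)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            cand = auxiliary_luma[yy - win:yy + win + 1, xx - win:xx + win + 1]
            if cand.shape != patch.shape:
                continue  # candidate window falls outside the auxiliary image
            sad = np.abs(patch - cand).sum()
            if sad < best_sad:
                best_sad, best_pos = sad, (yy, xx)
    return best_pos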
4. Combination into a High Resolution Image
In this final step, the primary and auxiliary images are used to produce a high resolution image. One can distinguish between several cases:
a. If the Wide image is the primary image and the Tele image was generated from a Clear sensor, LumaWide is calculated and replaced or averaged with LumaTele in the overlap area between the two images to create a luminance output image, matching corresponding pixels according to the registration map: LumaOut=c1*LumaWide+c2*LumaTele. The values of c1 and c2 may change between different pixels in the image. Then, RGB values of the output are calculated from LumaOut and RWide, GWide and BWide.
b. If the Wide image is the primary image and the Tele image was generated from a CFA sensor, LumaTele is calculated and is combined with LumaWide in the overlap area between the two images, according to the flow described in 4a.
c. If the Tele image is the primary image generated from a Clear sensor, the RGB values of the output are calculated from the LumaTele image and RWide, GWide, and BWide (matching pixels according to the registration map).
d. If the Tele image is the primary image generated from a CFA sensor, the RGB values of the output (matching pixels according to the registration map) are calculated either by using only the Tele image data, or by also combining data from the Wide image. The choice depends on the zoom factor.
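A hedged sketch of case (a): the luminance blend is given explicitly above, whereas the way RGB output values are derived from LumaOut and the Wide color channels is not specified, so the recolor step below is only one possible assumption (scaling each Wide channel by the local luminance gain):

import numpy as np

def blend_luma(luma_wide, luma_tele_registered, c1=0.5, c2=0.5):
    # LumaOut = c1*LumaWide + c2*LumaTele; constant weights are used here only
    # for illustration, since c1 and c2 may vary across the image.
    return c1 * luma_wide + c2 * luma_tele_registered

def recolor(luma_out, luma_wide, r_wide, g_wide, b_wide, eps=1e-6):
    # Assumed recoloring: scale the Wide RGB channels by the luminance gain.
    gain = luma_out / (luma_wide + eps)
    return r_wide * gain, g_wide * gain, b_wide * gain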
Certain portions of the registered Wide and Tele images are used to generate the output image based on the ZF of the output image. In an embodiment, if the ZF of the output image defines a FOV smaller than the Tele FOV, the fused high resolution image is cropped to the required field of view and digital interpolation is applied to scale up the image to the required output image resolution.
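For illustration, the final crop-and-upscale may look as follows, where zf_extra (>1) is the assumed remaining zoom beyond the Tele FOV and nearest-neighbor resampling stands in for the unspecified digital interpolation:

import numpy as np

def crop_and_zoom(fused, zf_extra):
    h, w = fused.shape[:2]
    ch = max(1, int(round(h / zf_extra)))   # cropped height for the requested FOV
    cw = max(1, int(round(w / zf_extra)))   # cropped width for the requested FOV
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = fused[y0:y0 + ch, x0:x0 + cw]
    yy = np.arange(h) * ch // h             # nearest-neighbor row indices
    xx = np.arange(w) * cw // w             # nearest-neighbor column indices
    return crop[yy][:, xx]                  # scaled back up to the original h x w grid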
Exemplary and Non-Limiting Pixel Interpolations Specifications for the Overlap Area
FIG. 2
B11 B12 R13
R21 B22 B23
R31 R32 B33

In order to reconstruct the missing R22 pixel, we perform R22=(R31+R13)/2. The same operation is performed for all missing Blue pixels.
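Expressed as code (an illustrative Python helper, not part of the patent), with channel holding the sparse samples of the color being reconstructed and (y, x) the missing location:

def diagonal_average(channel, y, x):
    # Average of the two diagonal neighbors carrying the missing color; for the
    # table above, the value at position 22 is the average of positions 31 and 13.
    return 0.5 * (channel[y + 1][x - 1] + channel[y - 1][x + 1])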
FIG. 3
R11 B12 R13
B21 R22 B23
R31 B32 R33

In order to reconstruct the missing B22 pixel, we perform B22=(B12+B21+B32+B23)/4. The same operation is performed for all missing Red pixels.
FIG. 4
Y11 C12 Y13
C21 Y22 C23
Y31 C32 Y33

In order to reconstruct the missing C22 pixel, we perform C22=(C12+C21+C32+C23)/4. The same operation is performed for all missing Yellow pixels.
FIG. 5
Case 1: W is Center Pixel
R11 B12 B13
R21 W22 R23
B31 B32 R33

In order to reconstruct the missing 22 pixels, we perform the following:
B22=(B12+B32)/2
R22=(R21+R23)/2
G22=(W22−R22−B22) (assuming that W includes the same amount of R, G and B colors).
Case 2: R22 is Center Pixel
B11 B12 R13 R14
W21 R22 B23 W24
B31 R32 B33 R34
In order to reconstruct the missing 22 pixels, we perform the following:
B22=(B11+B33)/2
W22=(2*W21+W24)/3
G22=(W22−R22−B22) (assuming that W contains the same amount of R, G and B colors). The same operation is performed for Blue as the center pixel.
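The FIG. 5 reconstruction around a White (Clear) pixel can be summarized in a short helper (illustrative Python only; the equal R/G/B contribution of W is the assumption stated above):

def reconstruct_at_clear_pixel(w22, r21, r23, b12, b32):
    r22 = 0.5 * (r21 + r23)   # horizontal R neighbors
    b22 = 0.5 * (b12 + b32)   # vertical B neighbors
    g22 = w22 - r22 - b22     # G inferred from the Clear sample
    return r22, g22, b22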
FIG. 6
B11 B12 G13 R14
R21 G22 R23 B24
G31 B32 R33 G34
R41 R42 G43 B44

In order to reconstruct the missing 22 pixels, we perform the following:
B22=(B12+B32)/2
R22=(R21+R23)/2.
In order to reconstruct the missing 32 pixels, we perform the following:
G32=(2*G31+2*G22+G43)/5
R32=(R41+2*R42+2*R33+R23+R21)/7.
FIG. 7
G11 B12 R13 G14
R21 G22 B23 R24
B31 R32 G33 B34
G41 B42 R43 G44

In order to reconstruct the missing 22 pixels, we perform the following:
B22=(2*B12+2*B23+B31)/5
R22=(2*R21+2*R32+R13)/5
and similarly for all other missing pixels.
FIG. 8
R11 B12 B13 R14
R21 G22 R23 B24
B31 B32 R33 B34
R41 R42 B43 R44
B51 G52 B53 R54

In order to reconstruct the missing 22 pixels, we perform the following:
B22=(2*B12+2*B32+B13)/5
R22=(2*R21+2*R23+R11)/5.
In order to reconstruct the missing 32 pixels, we perform the following:
G32=(2*G22+G52)/3
R32=(2*R33+2*R42+R41+R21+R23)/7.
FIG. 9
R11 B12 R13 B14
B21 G22 B23 R24
R31 B32 R33 B34
B41 R42 B43 R44
R51 G52 R53 B54

In order to reconstruct the missing 22 pixels, we perform the following:
B22=(B12+B32+B23+B21)/4
R22=(R11+R13+R31+R33)/4.
In order to reconstruct the missing 32 pixels, we perform the following:
G32=(2*G22+G52)/3
R32=(R42+R31+R33)/3.
Triple-Aperture Zoom Imaging System with Improved Color Resolution
As mentioned, a multi-aperture zoom or non-zoom imaging system disclosed herein may include more than two apertures. A non-limiting and exemplary embodiment 1100 of a triple-aperture imaging system is shown in FIGS. 11A-11B. System 1100 includes a first Wide subset camera 1102 (with exemplarily X1), a second Wide subset camera (with exemplarily X1.5, and referred to as a “Wide-Tele” subset) and a Tele subset camera (with exemplarily X2). FIG. 11A shows exemplary images captured by imaging system 1100, while FIG. 11B illustrates schematically three sensors marked 1102, 1104 and 1106, which belong respectively to the Wide, Wide-Tele and Tele subsets. FIG. 11B also shows the CFA arrangements in each sensor: sensors 1102 and 1104 are similar to Wide sensors described above with reference to any of FIGS. 2-9, in the sense that they include an overlap area and a non-overlap area. The overlap area includes a non-standard CFA. In both Wide sensors, the non-overlap area may have a Clear pattern or a standard CFA. Thus, neither Wide subset is solely a Clear channel camera. The Tele sensor may be Clear or have a standard Bayer CFA or a standard non-Bayer CFA. In use, an image is acquired with imaging system 1100 and processed as follows: demosaicing is performed on the overlap area pixels of the Wide and Wide-Tele sensors according to the specific CFA pattern in each overlap area. The overlap and non-overlap subsets of pixels in each of these sensors may need different demosaicing processes. Exemplary and non-limiting demosaicing specifications for the overlap area of the Wide sensors shown in FIGS. 2-9 are given above. The aim is to reconstruct the missing colors in each and every pixel. In cases in which the Tele subset sensor is not Clear only, demosaicing is performed as well. The Wide and Wide-Tele subset color images acquired this way will have colors (in the overlap area) holding higher resolution than that of a standard CFA pattern. Then, the Tele image acquired with the Tele sensor is registered (mapped) into the respective Wide image. The data from the Wide, Wide-Tele and Tele images is then processed to form a high quality zoom image. In cases where the Tele subset is Clear only, high Luma resolution is taken from the Tele sensor and color resolution is taken from the Wide sensor. In cases where the Tele subset includes a CFA, both color and Luma resolution are taken from the Tele subset. In addition, color resolution is taken from the Wide sensor. The resolution of the fused image may be higher than the resolution of both sensors.
While this disclosure has been described in terms of certain embodiments and generally associated methods, alterations and permutations of the embodiments and methods will be apparent to those skilled in the art. For example, multi-aperture imaging systems with more than two Wide or Wide-Tele subsets (and sensors) or with more than one Tele subset (and sensor) may be constructed and used according to principles set forth herein. Similarly, non-zoom multi-aperture imaging systems with more than two sensors, at least one of which has a non-standard CFA, may be constructed and used according to principles set forth herein. The disclosure is to be understood as not limited by the specific embodiments described herein, but only by the scope of the appended claims.

Claims (24)

What is claimed is:
1. A multi-aperture imaging system comprising:
a) a first camera that provides a first camera image, the first camera having a first sensor with a first plurality of sensor pixels covered at least in part with a non-standard first color filter array (CFA) used to increase a specific color sampling rate relative to a same color sampling rate in a standard CFA, wherein the non-standard first CFA includes a repetition of a n×n micro-cell where n=4 and wherein each micro-cell includes a BBRR-RBBR-RRBB-BRRB color filter order;
b) a second camera that provides a second camera image, the second camera having a second sensor with a second plurality of sensor pixels, the second plurality of sensor pixels being either Clear or covered with a standard second CFA, wherein the second camera image has an overlap area with the first camera image, wherein the second CFA is one of RGB (Bayer), RGBE, CYYM, CYGM, RGBW#1, RGBW#2 or RGBW#3; and
c) a processor configured to process the first and second camera images into a fused output image, wherein in the overlap area pixels of the second camera image are registered with corresponding pixels of the first camera image.
2. A multi-aperture imaging system comprising:
a) a first camera that provides a first camera image, the first camera having a first sensor with a first plurality of sensor pixels covered at least in part with a non-standard first color filter array (CFA) used to increase a specific color sampling rate relative to a same color sampling rate in a standard CFA, wherein the non-standard first CFA includes a repetition of a n×n micro-cell where n=6 and wherein each micro-cell includes a color filter order selected from the group consisting of RBBRRB-RWRBWB-BBRBRR-RRBRBB-BWBRWR-BRRBBR, BBGRRG-RGRBGB-GBRGRB-RRGBBG-BGBRGR-GRBGBR, RBBRRB-RGRBGB-BBRBRR-RRBRBB-BGBRGR-BRRBBR and RBRBRB-BGBRGR-RBRBRB-BRBRBR-RGRBGB-BRBRBR;
b) a second camera that provides a second camera image, the second camera having a second sensor with a second plurality of sensor pixels, the second plurality of sensor pixels being either Clear or covered with a standard second CFA, wherein the second camera image has an overlap area with the first camera image, wherein the second CFA is one of RGB (Bayer), RGBE, CYYM, CYGM, RGBW#1, RGBW#2 or RGBW#3; and
c) a processor configured to process the first and second camera images into a fused output image, wherein in the overlap area pixels of the second camera image are registered with corresponding pixels of the first camera image.
3. The multi-aperture imaging system of claim 1, wherein the first camera is a Wide camera with a field of view FOVw and wherein the second camera is a Tele camera with a field of view FOVT smaller than FOVw.
4. A method of acquiring images by a multi-aperture imaging system, the method comprising:
a) providing a first image generated by a first camera of the imaging system, the first camera having a first field of view (FOV1);
b) providing a second image generated by a second camera of the imaging system, the second camera having a second field of view (FOV2) such that FOV2<FOV1, the second image having an overlap area with the first image; and
c) fusing the first and second images into a fused image, wherein the fusing includes applying a registration process between the first and second images, the registration process including:
i. extracting a first Luma image from the first image
ii. extracting a second Luma image from the second image,
iii. applying low-pass filtering on the second Luma image in order to match its spatial frequency content to that of the first Luma image and to generate a low-pass second Luma image, and
iv. applying registration on the low-pass second Luma image and the first Luma image,
wherein a CFA includes a repetition of a n×n micro-cell where n=4 and wherein each micro-cell includes a BBRR-RBBR-RRBB-BRRB color filter order.
5. The method of claim 4, wherein n=6 instead of n=4 and wherein instead of each micro-cell including a BBRR-RBBR-RRBB-BRRB color filter order, each micro-cell includes a color filter order selected from the group consisting of RBBRRB-RWRBWB-BBRBRR-RRBRBB-BWBRWR-BRRBBR, BBGRRG-RGRBGB-GBRGRB-RRGBBG-BGBRGR-GRBGBR, RBBRRB-RGRBGB-BBRBRR-RRBRBB-BGBRGR-BRRBBR and RBRBRB-BGBRGR-RBRBRB-BRBRBR-RGRBGB-BRBRBR.
6. A multi-aperture imaging system comprising:
a) a first camera that provides a first image, the first camera having a first field of view (FOV1) and a first sensor with a first color filter array; and
b) a second camera that provides a second image, the second camera having a second field of view (FOV2) such that FOV2<FOV1 and a second sensor with a second color filter array different from the first color filter array;
wherein the first color filter array has a first red color pixel and a second red color pixel adjacent to the first red color pixel, and a first blue color pixel and a second blue color pixel adjacent to the first blue color pixel.
7. The multi-aperture imaging system of claim 6, wherein the first red color pixel is disposed adjacent to the first blue color pixel in a diagonal direction.
8. The multi-aperture imaging system of claim 7, wherein the second color filter array is one of an RGB (Bayer), RGBE, CYYM, CYGM, RGBW#1, RGBW#2 or RGBW#3 color filter array.
9. The multi-aperture imaging system of claim 7, wherein the first color filter array is one of an RGBE, CYYM, CYGM, RGBW#1, RGBW#2 or RGBW#3 color filter array.
10. The multi-aperture imaging system of claim 9, wherein the first color filter array has a first 2×2 pixel group with three blue color pixels and a fourth pixel that is not a blue color pixel.
11. The multi-aperture imaging system of claim 10, wherein the first color filter array has a second 2×2 pixel group with three red color pixels and a fourth pixel that is not a red color pixel.
12. A multi-aperture imaging system comprising:
a) a first camera that provides a first image, the first camera having a first sensor with a first color filter array; and
b) a second camera that provides a second image, the second camera having a second sensor with a second color filter array different from the first color filter array,
wherein the first color filter array has a first red color pixel group comprising a first red color pixel and a second red color pixel adjacent to the first red color pixel in a first direction, and a first blue color pixel group comprising a first blue color pixel and a second blue color pixel adjacent to the first blue color pixel in a second direction perpendicular to the first direction.
13. The multi-aperture imaging system of claim 12, wherein the first camera has a first field of view (FOV1) and wherein the second camera has a second field of view (FOV2) such that FOV2<FOV1.
14. The multi-aperture imaging system of claim 12, wherein the first red color pixel is disposed adjacent to the first blue color pixel in a diagonal direction.
15. The multi-aperture imaging system of claim 12, wherein the first color filter array has a first 2×2 pixel group with three blue color pixels and a fourth pixel that is not a blue pixel.
16. The multi-aperture imaging system of claim 12, wherein the first color filter array has a second 2×2 pixel group with three red color pixels and a fourth pixel that is not a red color pixel.
17. The multi-aperture imaging system of claim 12, wherein the first color filter array is a non-Bayer color filter array.
18. The multi-aperture imaging system of claim 17, wherein the second color filter array is a standard color filter array.
19. The multi-aperture imaging system of claim 12, wherein the first color filter array has a first red and green color pixel group with a first red color pixel and a first green color pixel adjacent to the first red color pixel in the first direction.
20. A multi-aperture imaging system comprising:
a) a first camera that provides a first image, the first camera having a sensor with a first color filter array and a first resolution; and
b) a second camera that provides a second image, the second camera having a second sensor with a second color filter array and a second resolution,
wherein the first color filter array has a first red color pixel and a second red color pixel adjacent to the first red color pixel, and wherein the output image has a third resolution higher than the first resolution.
21. The multi-aperture imaging system of claim 20, wherein the first sensor has a first blue color pixel group with a first blue color pixel and a second blue color pixel adjacent to the first blue color pixel.
22. The multi-aperture imaging system of claim 21, wherein the first red color pixel is disposed adjacent to the first blue color pixel in a diagonal direction.
23. The multi-aperture imaging system of claim 22, wherein the first camera has a first field of view (FOV1) and the second camera has a second field of view (FOV2) different from FOV1.
24. The multi-aperture imaging system of claim 23, wherein the FOV1 is greater than the FOV2.
US16/383,618 2012-11-28 2019-04-14 High resolution thin multi-aperture imaging systems Active - Reinstated USRE48444E1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US16/383,618 USRE48444E1 (en) 2012-11-28 2019-04-14 High resolution thin multi-aperture imaging systems
US16/384,197 USRE48477E1 (en) 2012-11-28 2019-04-15 High resolution thin multi-aperture imaging systems
US16/384,244 USRE48697E1 (en) 2012-11-28 2019-04-15 High resolution thin multi-aperture imaging systems
US16/384,140 USRE48945E1 (en) 2012-11-28 2019-04-15 High resolution thin multi-aperture imaging systems
US16/419,604 USRE49256E1 (en) 2012-11-28 2019-05-22 High resolution thin multi-aperture imaging systems

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201261730570P 2012-11-28 2012-11-28
US14/386,823 US9538152B2 (en) 2012-11-28 2013-11-23 High resolution thin multi-aperture imaging systems
PCT/IB2013/060356 WO2014083489A1 (en) 2012-11-28 2013-11-23 High-resolution thin multi-aperture imaging systems
US15/375,090 US9876952B2 (en) 2012-11-28 2016-12-11 High resolution thin multi-aperture imaging systems
US16/383,618 USRE48444E1 (en) 2012-11-28 2019-04-14 High resolution thin multi-aperture imaging systems

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/375,090 Reissue US9876952B2 (en) 2012-11-28 2016-12-11 High resolution thin multi-aperture imaging systems

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/375,090 Continuation US9876952B2 (en) 2012-11-28 2016-12-11 High resolution thin multi-aperture imaging systems

Publications (1)

Publication Number Publication Date
USRE48444E1 true USRE48444E1 (en) 2021-02-16

Family

ID=50827245

Family Applications (10)

Application Number Title Priority Date Filing Date
US14/386,823 Active 2034-05-19 US9538152B2 (en) 2012-11-28 2013-11-23 High resolution thin multi-aperture imaging systems
US15/278,046 Active US9581496B2 (en) 2012-01-29 2016-09-28 Snapshot spectral imaging based on digital cameras
US15/375,090 Ceased US9876952B2 (en) 2012-11-28 2016-12-11 High resolution thin multi-aperture imaging systems
US15/439,091 Expired - Fee Related US9927300B2 (en) 2012-01-29 2017-02-22 Snapshot spectral imaging based on digital cameras
US15/878,939 Abandoned US20180160040A1 (en) 2012-11-28 2018-01-24 High resolution thin multi-aperture imaging systems
US16/383,618 Active - Reinstated USRE48444E1 (en) 2012-11-28 2019-04-14 High resolution thin multi-aperture imaging systems
US16/384,140 Active USRE48945E1 (en) 2012-11-28 2019-04-15 High resolution thin multi-aperture imaging systems
US16/384,197 Active - Reinstated USRE48477E1 (en) 2012-11-28 2019-04-15 High resolution thin multi-aperture imaging systems
US16/384,244 Active USRE48697E1 (en) 2012-11-28 2019-04-15 High resolution thin multi-aperture imaging systems
US16/419,604 Active USRE49256E1 (en) 2012-11-28 2019-05-22 High resolution thin multi-aperture imaging systems

Family Applications Before (5)

Application Number Title Priority Date Filing Date
US14/386,823 Active 2034-05-19 US9538152B2 (en) 2012-11-28 2013-11-23 High resolution thin multi-aperture imaging systems
US15/278,046 Active US9581496B2 (en) 2012-01-29 2016-09-28 Snapshot spectral imaging based on digital cameras
US15/375,090 Ceased US9876952B2 (en) 2012-11-28 2016-12-11 High resolution thin multi-aperture imaging systems
US15/439,091 Expired - Fee Related US9927300B2 (en) 2012-01-29 2017-02-22 Snapshot spectral imaging based on digital cameras
US15/878,939 Abandoned US20180160040A1 (en) 2012-11-28 2018-01-24 High resolution thin multi-aperture imaging systems

Family Applications After (4)

Application Number Title Priority Date Filing Date
US16/384,140 Active USRE48945E1 (en) 2012-11-28 2019-04-15 High resolution thin multi-aperture imaging systems
US16/384,197 Active - Reinstated USRE48477E1 (en) 2012-11-28 2019-04-15 High resolution thin multi-aperture imaging systems
US16/384,244 Active USRE48697E1 (en) 2012-11-28 2019-04-15 High resolution thin multi-aperture imaging systems
US16/419,604 Active USRE49256E1 (en) 2012-11-28 2019-05-22 High resolution thin multi-aperture imaging systems

Country Status (4)

Country Link
US (10) US9538152B2 (en)
CN (6) CN113259565B (en)
IL (4) IL238900B (en)
WO (1) WO2014083489A1 (en)

Families Citing this family (156)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11792538B2 (en) 2008-05-20 2023-10-17 Adeia Imaging Llc Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
US8866920B2 (en) 2008-05-20 2014-10-21 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
KR101588877B1 (en) 2008-05-20 2016-01-26 펠리칸 이매징 코포레이션 Capturing and processing of images using monolithic camera array with heterogeneous imagers
US8514491B2 (en) 2009-11-20 2013-08-20 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
WO2011143501A1 (en) 2010-05-12 2011-11-17 Pelican Imaging Corporation Architectures for imager arrays and array cameras
US8878950B2 (en) 2010-12-14 2014-11-04 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using super-resolution processes
EP2708019B1 (en) 2011-05-11 2019-10-16 FotoNation Limited Systems and methods for transmitting and receiving array camera image data
WO2013043761A1 (en) 2011-09-19 2013-03-28 Pelican Imaging Corporation Determining depth from multiple views of a scene that include aliasing using hypothesized fusion
WO2013049699A1 (en) 2011-09-28 2013-04-04 Pelican Imaging Corporation Systems and methods for encoding and decoding light field image files
WO2013126578A1 (en) 2012-02-21 2013-08-29 Pelican Imaging Corporation Systems and methods for the manipulation of captured light field image data
US9210392B2 (en) 2012-05-01 2015-12-08 Pelican Imaging Coporation Camera modules patterned with pi filter groups
WO2014005123A1 (en) 2012-06-28 2014-01-03 Pelican Imaging Corporation Systems and methods for detecting defective camera arrays, optic arrays, and sensors
US20140002674A1 (en) 2012-06-30 2014-01-02 Pelican Imaging Corporation Systems and Methods for Manufacturing Camera Modules Using Active Alignment of Lens Stack Arrays and Sensors
EP3869797B1 (en) 2012-08-21 2023-07-19 Adeia Imaging LLC Method for depth detection in images captured using array cameras
WO2014032020A2 (en) 2012-08-23 2014-02-27 Pelican Imaging Corporation Feature based high resolution motion estimation from low resolution images captured using an array source
US20140092281A1 (en) 2012-09-28 2014-04-03 Pelican Imaging Corporation Generating Images from Light Fields Utilizing Virtual Viewpoints
US9143711B2 (en) 2012-11-13 2015-09-22 Pelican Imaging Corporation Systems and methods for array camera focal plane control
WO2014130849A1 (en) 2013-02-21 2014-08-28 Pelican Imaging Corporation Generating compressed light field representation data
US9374512B2 (en) 2013-02-24 2016-06-21 Pelican Imaging Corporation Thin form factor computational array cameras and modular array cameras
WO2014138695A1 (en) 2013-03-08 2014-09-12 Pelican Imaging Corporation Systems and methods for measuring scene information while capturing images using array cameras
US8866912B2 (en) 2013-03-10 2014-10-21 Pelican Imaging Corporation System and methods for calibration of an array camera using a single captured image
US9519972B2 (en) 2013-03-13 2016-12-13 Kip Peli P1 Lp Systems and methods for synthesizing images from image data captured by an array camera using restricted depth of field depth maps in which depth estimation precision varies
WO2014164550A2 (en) 2013-03-13 2014-10-09 Pelican Imaging Corporation System and methods for calibration of an array camera
WO2014164909A1 (en) 2013-03-13 2014-10-09 Pelican Imaging Corporation Array camera architecture implementing quantum film sensors
US9106784B2 (en) 2013-03-13 2015-08-11 Pelican Imaging Corporation Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing
WO2014153098A1 (en) 2013-03-14 2014-09-25 Pelican Imaging Corporation Photmetric normalization in array cameras
US9578259B2 (en) 2013-03-14 2017-02-21 Fotonation Cayman Limited Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
EP2973476A4 (en) 2013-03-15 2017-01-18 Pelican Imaging Corporation Systems and methods for stereo imaging with camera arrays
US10223838B2 (en) * 2013-03-15 2019-03-05 Derek A. Devries Method and system of mobile-device control with a plurality of fixed-gradient focused digital cameras
US10122993B2 (en) 2013-03-15 2018-11-06 Fotonation Limited Autofocus system for a conventional camera that uses depth information from an array camera
US9497429B2 (en) 2013-03-15 2016-11-15 Pelican Imaging Corporation Extended color processing on pelican array cameras
US9445003B1 (en) 2013-03-15 2016-09-13 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information
CN108234851B (en) * 2013-06-13 2019-08-16 核心光电有限公司 Based on Dual-Aperture zoom digital camera
KR102081087B1 (en) 2013-06-17 2020-02-25 삼성전자주식회사 Image adjustment apparatus and image sensor for synchronous image and non-synchronous image
CN108388005A (en) 2013-07-04 2018-08-10 核心光电有限公司 Small-sized focal length lens external member
US9857568B2 (en) 2013-07-04 2018-01-02 Corephotonics Ltd. Miniature telephoto lens assembly
US9880054B2 (en) * 2013-07-26 2018-01-30 Inview Technology Corporation Simplified compressive sensing spectral imager
CN109120823B (en) 2013-08-01 2020-07-14 核心光电有限公司 Thin multi-aperture imaging system with auto-focus and method of use thereof
US9473708B1 (en) 2013-08-07 2016-10-18 Google Inc. Devices and methods for an imaging system with a dual camera architecture
US9898856B2 (en) 2013-09-27 2018-02-20 Fotonation Cayman Limited Systems and methods for depth-assisted perspective distortion correction
CN105492954B (en) 2013-11-06 2018-04-27 核心光电有限公司 Electromagnetic actuators for digital camera
US9264592B2 (en) 2013-11-07 2016-02-16 Pelican Imaging Corporation Array camera modules incorporating independently aligned lens stacks
US10119808B2 (en) 2013-11-18 2018-11-06 Fotonation Limited Systems and methods for estimating depth from projected texture using camera arrays
US9426361B2 (en) 2013-11-26 2016-08-23 Pelican Imaging Corporation Array camera configurations incorporating multiple constituent array cameras
WO2015134996A1 (en) 2014-03-07 2015-09-11 Pelican Imaging Corporation System and methods for depth regularization and semiautomatic interactive matting using rgb-d images
CN103986867B (en) * 2014-04-24 2017-04-05 宇龙计算机通信科技(深圳)有限公司 A kind of image taking terminal and image capturing method
WO2016004115A1 (en) 2014-07-01 2016-01-07 Apple Inc. Mobile camera system
EP3172700B1 (en) * 2014-07-21 2021-04-28 Politecnico Di Torino Improved method for fingerprint matching and camera identification, device and system
KR102157675B1 (en) * 2014-07-25 2020-09-18 삼성전자주식회사 Image photographing apparatus and methods for photographing image thereof
US9225889B1 (en) 2014-08-18 2015-12-29 Entropix, Inc. Photographic image acquisition device and method
CN106605196B (en) 2014-09-02 2018-11-09 苹果公司 remote camera user interface
CN107077743B (en) 2014-09-29 2021-03-23 快图有限公司 System and method for dynamic calibration of an array camera
EP3148177A4 (en) * 2014-10-22 2018-01-24 Yulong Computer Telecommunication Technologies (Shenzhen) Co., Ltd. Image generation method based on dual camera module and dual camera module
CN112433331B (en) 2015-01-03 2022-07-08 核心光电有限公司 Miniature telephoto lens module and camera using the same
US9992396B1 (en) 2015-02-02 2018-06-05 Apple Inc. Focusing lighting module
US9781345B1 (en) 2015-02-13 2017-10-03 Apple Inc. Dual camera magnet arrangement
US11381747B2 (en) 2015-02-13 2022-07-05 Apple Inc. Dual camera magnet arrangement
US9846919B2 (en) 2015-02-16 2017-12-19 Samsung Electronics Co., Ltd. Data processing device for processing multiple sensor data and system including the same
US9942474B2 (en) 2015-04-17 2018-04-10 Fotonation Cayman Limited Systems and methods for performing high speed video capture and depth estimation using array cameras
CN108353126B (en) 2015-04-23 2019-08-23 苹果公司 Handle method, electronic equipment and the computer readable storage medium of the content of camera
JP6281730B2 (en) 2015-07-10 2018-02-21 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Image acquisition system and unmanned aerial vehicle
CN112672024B (en) * 2015-08-13 2022-08-02 核心光电有限公司 Dual aperture zoom camera with video support and switching/non-switching dynamic control
US9998666B2 (en) * 2015-08-26 2018-06-12 Duke University Systems and methods for burst image deblurring
DE102015217253A1 (en) * 2015-09-10 2017-03-16 Robert Bosch Gmbh Environment detecting device for a vehicle and method for capturing an image by means of an environment detecting device
US9769419B2 (en) 2015-09-30 2017-09-19 Cisco Technology, Inc. Camera system for video conference endpoints
US9769389B2 (en) 2015-09-30 2017-09-19 Apple Inc. Mobile zoom using multiple optical image stabilization cameras
US10264188B2 (en) 2015-09-30 2019-04-16 Apple Inc. Mobile zoom using multiple optical image stabilization cameras
US9774787B2 (en) 2015-09-30 2017-09-26 Apple Inc. Mobile zoom using multiple optical image stabilization cameras
US10382698B2 (en) 2015-09-30 2019-08-13 Apple Inc. Mobile zoom using multiple optical image stabilization cameras
US10063783B2 (en) * 2015-09-30 2018-08-28 Apple Inc. Mobile zoom using multiple optical image stabilization cameras
KR102480600B1 (en) * 2015-10-21 2022-12-23 삼성전자주식회사 Method for low-light image quality enhancement of image processing devices and method of operating an image processing system for performing the method
KR102446442B1 (en) * 2015-11-24 2022-09-23 삼성전자주식회사 Digital photographing apparatus and the operating method for the same
JP6711612B2 (en) * 2015-12-21 2020-06-17 キヤノン株式会社 Image processing apparatus, image processing method, and imaging apparatus
TWI592646B (en) * 2015-12-23 2017-07-21 高準精密工業股份有限公司 Optical device
US10359618B2 (en) 2016-01-11 2019-07-23 Nikon Corporation Multispectral stereoscopic endoscope system and use of same
CN106990646A (en) 2016-01-20 2017-07-28 深圳富泰宏精密工业有限公司 Many lens systems, its method of work and portable electron device
US10194089B2 (en) * 2016-02-08 2019-01-29 Qualcomm Incorporated Systems and methods for implementing seamless zoom function using multiple cameras
JP7290907B2 (en) * 2016-03-10 2023-06-14 シスメックス株式会社 Optical instrument and image formation method
JP2017169111A (en) * 2016-03-17 2017-09-21 ソニー株式会社 Imaging control apparatus, imaging control method, and imaging apparatus
CN108781278A (en) * 2016-03-30 2018-11-09 Lg 电子株式会社 Image processing apparatus and mobile terminal
US10539763B2 (en) 2016-03-31 2020-01-21 Sony Corporation Optical system, electronic device, camera, method and computer program
US20170318273A1 (en) 2016-04-28 2017-11-02 Qualcomm Incorporated Shift-and-match fusion of color and mono images
US9912860B2 (en) 2016-06-12 2018-03-06 Apple Inc. User interface for camera effects
US9936129B2 (en) * 2016-06-15 2018-04-03 Obsidian Sensors, Inc. Generating high resolution images
US10290111B2 (en) 2016-07-26 2019-05-14 Qualcomm Incorporated Systems and methods for compositing images
KR102255789B1 (en) 2016-08-30 2021-05-26 삼성전자주식회사 Optical Module and Optical device Using the same
EP3497928B1 (en) * 2016-08-31 2020-11-18 Huawei Technologies Co., Ltd. Multi camera system for zoom
KR102547104B1 (en) * 2016-09-06 2023-06-23 삼성전자주식회사 Electronic device and method for processing plural images
US10297034B2 (en) 2016-09-30 2019-05-21 Qualcomm Incorporated Systems and methods for fusing images
KR102229811B1 (en) 2016-10-28 2021-03-18 후아웨이 테크놀러지 컴퍼니 리미티드 Filming method and terminal for terminal
KR102156597B1 (en) 2016-11-03 2020-09-16 후아웨이 테크놀러지 컴퍼니 리미티드 Optical imaging method and apparatus
US9860456B1 (en) * 2016-11-11 2018-01-02 Motorola Mobility Llc Bayer-clear image fusion for dual camera
EP3551997A1 (en) * 2016-12-08 2019-10-16 Koninklijke Philips N.V. Apparatus and method for determining a refractive index
TWI626620B (en) * 2016-12-20 2018-06-11 廣東歐珀移動通訊有限公司 Image processing method and device, electronic device, and computer readable storage medium
DE102017204035B3 (en) 2017-03-10 2018-09-13 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. A multi-aperture imaging apparatus, imaging system, and method of providing a multi-aperture imaging apparatus
DE102017206429A1 (en) 2017-04-13 2018-10-18 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. A multi-aperture imaging apparatus, imaging system, and method of providing a multi-aperture imaging apparatus
DE102017206442B4 (en) 2017-04-13 2021-01-28 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device for imaging partial fields of view, multi-aperture imaging device and method for providing the same
KR102204596B1 (en) * 2017-06-02 2021-01-19 삼성전자주식회사 Processor, image processing device comprising the same, and method for image processing
DK180859B1 (en) 2017-06-04 2022-05-23 Apple Inc USER INTERFACE CAMERA EFFECTS
US10972672B2 (en) * 2017-06-05 2021-04-06 Samsung Electronics Co., Ltd. Device having cameras with different focal lengths and a method of implementing cameras with different focal lengths
US10482618B2 (en) 2017-08-21 2019-11-19 Fotonation Limited Systems and methods for hybrid depth regularization
WO2019059632A1 (en) * 2017-09-25 2019-03-28 한국과학기술원 Method and system for reconstructing hyperspectral image by using prism
US10462370B2 (en) 2017-10-03 2019-10-29 Google Llc Video stabilization
KR102318013B1 (en) 2017-10-13 2021-10-27 삼성전자 주식회사 Electronic device composing a plurality of images and method
US10337857B2 (en) * 2017-10-17 2019-07-02 Raytheon Company Multi-spectral boresight alignment methods and systems
US11112964B2 (en) 2018-02-09 2021-09-07 Apple Inc. Media capture lock affordance for graphical user interface
KR102418852B1 (en) * 2018-02-14 2022-07-11 삼성전자주식회사 Electronic device and method for controlling an image display
US12067650B2 (en) * 2018-03-20 2024-08-20 Nec Corporation Imaging apparatus and imaging method
CN111885294B (en) * 2018-03-26 2022-04-22 华为技术有限公司 Shooting method, device and equipment
US10171738B1 (en) 2018-05-04 2019-01-01 Google Llc Stabilizing video to reduce camera and face movement
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
US10375313B1 (en) 2018-05-07 2019-08-06 Apple Inc. Creative camera
TWI693828B (en) * 2018-06-28 2020-05-11 圓展科技股份有限公司 Image-capturing device and method for operating the same
CN108900772A (en) * 2018-07-19 2018-11-27 维沃移动通信有限公司 A kind of mobile terminal and image capturing method
DK201870623A1 (en) 2018-09-11 2020-04-15 Apple Inc. User interfaces for simulated depth effects
CN110896444B (en) * 2018-09-13 2022-01-04 深圳市鸿合创新信息技术有限责任公司 Double-camera switching method and equipment
CN109163809B (en) * 2018-09-25 2020-10-13 北京理工大学 Multi-aperture view field partially overlapped dual-band thermal imaging method and device
US11770601B2 (en) 2019-05-06 2023-09-26 Apple Inc. User interfaces for capturing and managing visual media
US10674072B1 (en) 2019-05-06 2020-06-02 Apple Inc. User interfaces for capturing and managing visual media
US11128792B2 (en) 2018-09-28 2021-09-21 Apple Inc. Capturing and displaying images with multiple focal planes
US11321857B2 (en) 2018-09-28 2022-05-03 Apple Inc. Displaying and editing images with depth information
CN109587455B (en) * 2019-02-01 2024-05-03 思特威(上海)电子科技股份有限公司 Intelligent zooming image sensor
US11706521B2 (en) 2019-05-06 2023-07-18 Apple Inc. User interfaces for capturing and managing visual media
CN114223192A (en) 2019-08-26 2022-03-22 三星电子株式会社 System and method for content enhancement using four-color filtered array sensors
US11685016B2 (en) 2019-08-26 2023-06-27 Lake Country Tool, Llc Cooling device for a rotating polishing disk
WO2021055585A1 (en) 2019-09-17 2021-03-25 Boston Polarimetrics, Inc. Systems and methods for surface modeling using polarization cues
KR102680342B1 (en) * 2019-09-23 2024-07-03 삼성전자주식회사 Electronic device for performing video hdr process based on image data obtained by plurality of image sensors
MX2022004162A (en) 2019-10-07 2022-07-12 Boston Polarimetrics Inc Systems and methods for augmentation of sensor systems and imaging systems with polarization.
KR102625261B1 (en) * 2019-10-21 2024-01-12 삼성전자주식회사 Image device
CN110855883B (en) * 2019-11-05 2021-07-20 浙江大华技术股份有限公司 Image processing system, method, device equipment and storage medium
CN112839215B (en) * 2019-11-22 2022-05-13 华为技术有限公司 Camera module, camera, terminal device, image information determination method and storage medium
KR20230116068A (en) 2019-11-30 2023-08-03 보스턴 폴라리메트릭스, 인크. System and method for segmenting transparent objects using polarization signals
CN115552486A (en) 2020-01-29 2022-12-30 因思创新有限责任公司 System and method for characterizing an object pose detection and measurement system
WO2021154459A1 (en) 2020-01-30 2021-08-05 Boston Polarimetrics, Inc. Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarized images
US11509837B2 (en) 2020-05-12 2022-11-22 Qualcomm Incorporated Camera transition blending
US11953700B2 (en) 2020-05-27 2024-04-09 Intrinsic Innovation Llc Multi-aperture polarization optical systems using beam splitters
US11151736B1 (en) * 2020-05-30 2021-10-19 Center For Quantitative Cytometry Apparatus and method to obtain unprocessed intrinsic data cubes for generating intrinsic hyper-spectral data cubes
US11039074B1 (en) 2020-06-01 2021-06-15 Apple Inc. User interfaces for managing media
CN111683234B (en) * 2020-06-04 2022-05-31 深圳开立生物医疗科技股份有限公司 Endoscope imaging method and device and related equipment
US11190689B1 (en) 2020-07-29 2021-11-30 Google Llc Multi-camera video stabilization
JP7477158B2 (en) * 2020-07-31 2024-05-01 i-PRO株式会社 3-chip camera
JP2022029026A (en) * 2020-08-04 2022-02-17 キヤノン株式会社 Imaging apparatus
US11212449B1 (en) 2020-09-25 2021-12-28 Apple Inc. User interfaces for media capture and management
KR20220075028A (en) * 2020-11-26 2022-06-07 삼성전자주식회사 Electronic device including image sensor having multi-crop function
TWI831078B (en) * 2020-12-11 2024-02-01 國立中央大學 Optical system and optical image processing method by applying image restoration
US12020455B2 (en) 2021-03-10 2024-06-25 Intrinsic Innovation Llc Systems and methods for high dynamic range image reconstruction
US12069227B2 (en) 2021-03-10 2024-08-20 Intrinsic Innovation Llc Multi-modal and multi-spectral stereo camera arrays
US11954886B2 (en) 2021-04-15 2024-04-09 Intrinsic Innovation Llc Systems and methods for six-degree of freedom pose estimation of deformable objects
US11290658B1 (en) 2021-04-15 2022-03-29 Boston Polarimetrics, Inc. Systems and methods for camera exposure control
US11539876B2 (en) 2021-04-30 2022-12-27 Apple Inc. User interfaces for altering visual media
US11778339B2 (en) 2021-04-30 2023-10-03 Apple Inc. User interfaces for altering visual media
US12067746B2 (en) 2021-05-07 2024-08-20 Intrinsic Innovation Llc Systems and methods for using computer vision to pick up small objects
US12112024B2 (en) 2021-06-01 2024-10-08 Apple Inc. User interfaces for managing media styles
US11689813B2 (en) 2021-07-01 2023-06-27 Intrinsic Innovation Llc Systems and methods for high dynamic range imaging using crossed polarizers
US11676306B1 (en) * 2022-11-05 2023-06-13 Center For Quantitative Cytometry Enhancing and mapping the multi-dimensional color differentiation of intrinsic images
CN115880198B (en) * 2023-02-01 2023-07-07 荣耀终端有限公司 Image processing method and device

Citations (287)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4199785A (en) 1979-01-05 1980-04-22 Honeywell Inc. Electronic zoom system
JPS59191146A (en) 1983-04-13 1984-10-30 Hitachi Ltd Optical scanner
US5005083A (en) 1988-05-19 1991-04-02 Siemens Aktiengesellschaft FLIR system with two optical channels for observing a wide and a narrow field of view
US5032917A (en) 1990-03-12 1991-07-16 Rca Licensing Corporation Video signal blending apparatus
US5041852A (en) 1990-10-18 1991-08-20 Fuji Photo Film Co., Ltd. Camera shake correction system
US5051830A (en) 1989-08-18 1991-09-24 Messerschmitt-Bolkow-Blohm Gmbh Dual lens system for electronic camera
US5099263A (en) 1984-11-10 1992-03-24 Minolta Camera Kabushiki Kaisha Variable focal length camera
JPH04211230A (en) 1989-10-20 1992-08-03 Fuji Photo Film Co Ltd Compensator for camera shake by hand
US5248971A (en) 1992-05-19 1993-09-28 Mandl William J Method and apparatus for multiplexed oversampled analog to digital modulation
US5287093A (en) 1990-06-11 1994-02-15 Matsushita Electric Industrial Co., Ltd. Image processor for producing cross-faded image from first and second image data
US5394520A (en) 1991-09-26 1995-02-28 Hughes Aircraft Company Imaging apparatus for providing a composite digital representation of a scene within a field of regard
US5436660A (en) 1991-03-13 1995-07-25 Sharp Kabushiki Kaisha Image sensing apparatus having plurality of optical systems and method of operating such apparatus
US5444478A (en) 1992-12-29 1995-08-22 U.S. Philips Corporation Image processing method and device for constructing an image from adjacent images
US5459520A (en) 1992-12-08 1995-10-17 Sony Corporation Electronic camera with over-sampling filter and method for over-sampling and interpolating electronic camera image data
JPH07318864A (en) 1994-05-20 1995-12-08 Sony Corp Optical axis correcting mechanism
JPH08271976A (en) 1995-03-29 1996-10-18 Canon Inc Camera
US5657402A (en) 1991-11-01 1997-08-12 Massachusetts Institute Of Technology Method of creating a high resolution still image using a plurality of images and apparatus for practice of the method
US5682198A (en) 1993-06-28 1997-10-28 Canon Kabushiki Kaisha Double eye image pickup apparatus
US5768443A (en) 1995-12-19 1998-06-16 Cognex Corporation Method for coordinating multiple fields of view in multi-camera
US5926190A (en) 1996-08-21 1999-07-20 Apple Computer, Inc. Method and system for simulating motion in a computer graphics application using image registration and view interpolation
US5940641A (en) 1997-07-10 1999-08-17 Eastman Kodak Company Extending panoramic images
US5982951A (en) 1996-05-28 1999-11-09 Canon Kabushiki Kaisha Apparatus and method for combining a plurality of images
US6101334A (en) 1997-02-18 2000-08-08 Mobi Corporation Dual focal length camera
US6128416A (en) 1993-09-10 2000-10-03 Olympus Optical Co., Ltd. Image composing technique for optimally composing a single image from a plurality of digital images
US6148120A (en) 1997-10-30 2000-11-14 Cognex Corporation Warping of focal images to correct correspondence error
US6208765B1 (en) 1998-06-19 2001-03-27 Sarnoff Corporation Method and apparatus for improving image resolution
US6268611B1 (en) 1997-12-18 2001-07-31 Cellavision Ab Feature-free registration of dissimilar images using a robust similarity metric
US20020005902A1 (en) 2000-06-02 2002-01-17 Yuen Henry C. Automatic video recording system using wide- and narrow-field cameras
US20020030163A1 (en) 2000-08-09 2002-03-14 Zhang Evan Y.W. Image intensifier and LWIR fusion/combination system
US20020063711A1 (en) 1999-05-12 2002-05-30 Imove Inc. Camera system with high resolution image inside a wide angle view
US20020075258A1 (en) 1999-05-12 2002-06-20 Imove Inc. Camera system with high resolution image inside a wide angle view
US20020122113A1 (en) 1999-08-09 2002-09-05 Foote Jonathan T. Method and system for compensating for parallax in multiple camera systems
US20020167741A1 (en) 2001-05-14 2002-11-14 Olympus Optical Co., Ltd. Optical apparatus including lens
US20030030729A1 (en) 1996-09-12 2003-02-13 Prentice Wayne E. Dual mode digital imaging and camera system
US6549215B2 (en) 1999-05-20 2003-04-15 Compaq Computer Corporation System and method for displaying images using anamorphic video
US20030093805A1 (en) 2001-11-15 2003-05-15 Gin J.M. Jack Dual camera surveillance and control system
US6611289B1 (en) 1999-01-15 2003-08-26 Yanbin Yu Digital cameras using multiple sensors with multiple lenses
US20030160886A1 (en) 2002-02-22 2003-08-28 Fuji Photo Film Co., Ltd. Digital camera
JP2003298920A (en) 2002-03-29 2003-10-17 Fuji Photo Film Co Ltd Digital camera
US20030202113A1 (en) 2002-04-30 2003-10-30 Eastman Kodak Company Electronic still camera and image processing method
US6643416B1 (en) 1999-11-30 2003-11-04 Eastman Kodak Company Method for determining necessary resolution for zoom and crop images
US6650368B1 (en) 1999-10-26 2003-11-18 Hewlett-Packard Development Company, L.P. Digital camera and method of enhancing zoom effects
US20040008773A1 (en) 2002-06-14 2004-01-15 Canon Kabushiki Kaisha Multiple image processing and synthesis using background image extraction
US6680748B1 (en) 2001-09-27 2004-01-20 Pixim, Inc. Multi-mode camera and method therefor
US20040012683A1 (en) 2001-01-23 2004-01-22 Masafumi Yamasaki Shake compensating device for optical devices
US20040017386A1 (en) 2002-07-26 2004-01-29 Qiong Liu Capturing and producing shared multi-resolution video
US20040027367A1 (en) 2002-04-30 2004-02-12 Maurizio Pilu Method of and apparatus for processing zoomed sequential images
US6714665B1 (en) 1994-09-02 2004-03-30 Sarnoff Corporation Fully automated iris recognition system utilizing wide and narrow fields of view
US20040061788A1 (en) 2002-09-26 2004-04-01 Logitech Europe S.A. Multiple mode capture button for a digital camera
US6724421B1 (en) 1994-11-22 2004-04-20 Sensormatic Electronics Corporation Video surveillance system with pilot and slave cameras
JP2004133054A (en) 2002-10-08 2004-04-30 Olympus Corp Lens barrel
US6738073B2 (en) 1999-05-12 2004-05-18 Imove, Inc. Camera system with both a wide angle view and a high resolution view
US6741250B1 (en) 2001-02-09 2004-05-25 Be Here Corporation Method and system for generation of multiple viewpoints into a scene viewed by motionless cameras and for presentation of a view path
US6750903B1 (en) 1998-03-05 2004-06-15 Hitachi, Ltd. Super high resolution camera
US20040141086A1 (en) 2003-01-10 2004-07-22 Olympus Corporation Electronic imaging apparatus
US6778207B1 (en) 2000-08-07 2004-08-17 Koninklijke Philips Electronics N.V. Fast digital pan tilt zoom video
JP2004245982A (en) 2003-02-13 2004-09-02 Minolta Co Ltd Imaging lens device and electronic equipment equipped with the same
US20040240052A1 (en) 2003-06-02 2004-12-02 Pentax Corporation Multiple-focal imaging device, and a mobile device having the multiple-focal-length imaging device
US20050013509A1 (en) 2003-07-16 2005-01-20 Ramin Samadani High resolution image reconstruction
US20050046740A1 (en) 2003-08-29 2005-03-03 Davis Raymond A. Apparatus including a dual camera module and method of using the same
JP2005099265A (en) 2003-09-24 2005-04-14 Fujinon Corp Imaging apparatus, imaging method, and range finding method
EP1536633A1 (en) 2003-11-27 2005-06-01 Sony Corporation Photographing apparatus and method, supervising system, program and recording medium
US20050157184A1 (en) 2004-01-21 2005-07-21 Konica Minolta Photo Imaging, Inc. Image capturing apparatus
US20050168834A1 (en) 2002-10-08 2005-08-04 Olympus Corporation Camera
US20050200718A1 (en) 2004-03-10 2005-09-15 Samsung Electronics Co., Ltd. Image photographing apparatus and method
US7002583B2 (en) 2000-08-03 2006-02-21 Stono Technologies, Llc Display of images and image transitions
US20060054782A1 (en) 2004-08-25 2006-03-16 Olsen Richard I Apparatus for multiple camera devices and method of operating same
US20060056056A1 (en) 2004-07-19 2006-03-16 Grandeye Ltd. Automatically expanding the zoom capability of a wide-angle video camera
US7038716B2 (en) 1999-07-30 2006-05-02 Pixim, Inc. Mobile device equipped with digital image sensor
US20060102907A1 (en) 2004-11-17 2006-05-18 Samsung Electronics Co., Ltd. Thin film transistor array panel and method for manufacturing the same
US20060125937A1 (en) 2004-12-10 2006-06-15 Ambarella, Inc. High resolution zoom: a novel digital zoom for digital video camera
US20060170793A1 (en) 2005-02-03 2006-08-03 Eastman Kodak Company Digital imaging system with digital zoom warning
US20060175549A1 (en) 2005-02-09 2006-08-10 Miller John L High and low resolution camera systems and methods
US20060187310A1 (en) 2005-02-18 2006-08-24 Janson Wilbert F Jr Digital camera using an express zooming mode to provide expedited operation over an extended zoom range
US20060187338A1 (en) 2005-02-18 2006-08-24 May Michael J Camera phone using multiple lenses and image sensors to provide an extended zoom range
US20060187322A1 (en) 2005-02-18 2006-08-24 Janson Wilbert F Jr Digital camera using multiple fixed focal length lenses and multiple image sensors to provide an extended zoom range
JP2006238325A (en) 2005-02-28 2006-09-07 Canon Inc Camera system
US20070024737A1 (en) 2005-08-01 2007-02-01 Hideo Nakamura Image capturing device having multiple optical systems
US7206136B2 (en) 2005-02-18 2007-04-17 Eastman Kodak Company Digital camera using multiple lenses and image sensors to provide an extended zoom range
EP1780567A1 (en) 2004-07-20 2007-05-02 Five Dimension Co., Ltd. Electronic imaging device
US20070126911A1 (en) 2005-11-16 2007-06-07 Sony Corporation Image capture apparatus and zoom lens
US7248294B2 (en) 2001-07-10 2007-07-24 Hewlett-Packard Development Company, L.P. Intelligent feature selection and pan zoom control
US20070177025A1 (en) 2006-02-01 2007-08-02 Micron Technology, Inc. Method and apparatus minimizing die area and module size for a dual-camera mobile device
US7256944B2 (en) 2005-02-18 2007-08-14 Eastman Kodak Company Compact image capture assembly using multiple lenses and image sensors to provide an extended zoom range
US20070189386A1 (en) 2005-06-22 2007-08-16 Taro Imagawa Image generation apparatus and image generation method
US20070188653A1 (en) 2006-02-13 2007-08-16 Pollock David B Multi-lens array system and method
JP2007228006A (en) 2006-02-21 2007-09-06 Casio Comput Co Ltd Digital camera
US20070257184A1 (en) 2005-08-25 2007-11-08 Olsen Richard I Large dynamic range cameras
JP2007306282A (en) 2006-05-11 2007-11-22 Citizen Electronics Co Ltd Camera module
US20070285550A1 (en) 2006-06-13 2007-12-13 Samsung Electronics Co. Ltd. Method and apparatus for taking images using mobile communication terminal with plurality of camera lenses
US20080017557A1 (en) 2006-07-19 2008-01-24 Witdouck Calvin J System and Method for Sorting Larvae Cocoons
US20080024614A1 (en) 2006-07-25 2008-01-31 Hsiang-Tsun Li Mobile device with dual digital camera sensors and methods of using the same
US20080025634A1 (en) 2006-07-27 2008-01-31 Eastman Kodak Company Producing an extended dynamic range digital image
US20080030592A1 (en) 2006-08-01 2008-02-07 Eastman Kodak Company Producing digital image with different resolution portions
US20080030611A1 (en) 2006-08-01 2008-02-07 Jenkins Michael V Dual Sensor Video Camera
US7339621B2 (en) 2001-12-13 2008-03-04 Psion Teklogix Systems, Inc. Imager output signal processing
US7346217B1 (en) 2001-04-25 2008-03-18 Lockheed Martin Corporation Digital image enhancement using successive zoom images
JP2008076485A (en) 2006-09-19 2008-04-03 Konica Minolta Opto Inc Lens barrel and imaging apparatus
US20080084484A1 (en) 2006-10-10 2008-04-10 Nikon Corporation Camera
US7365793B2 (en) 2002-10-31 2008-04-29 Hewlett-Packard Development Company, L.P. Image capture system and method
US20080117316A1 (en) 2006-11-22 2008-05-22 Fujifilm Corporation Multi-eye image pickup device
US7411610B2 (en) 2002-05-15 2008-08-12 Idelix Software Inc. Method and system for generating detail-in-context video presentations using a graphical user interface
US7424218B2 (en) 2005-07-28 2008-09-09 Microsoft Corporation Real-time preview for panoramic images
US20080219654A1 (en) 2007-03-09 2008-09-11 Border John N Camera using multiple lenses and image sensors to provide improved focusing capability
US20080218611A1 (en) 2007-03-09 2008-09-11 Parulski Kenneth A Method and apparatus for operating a dual lens camera to augment an image
US20080218613A1 (en) 2007-03-09 2008-09-11 Janson Wilbert F Camera using multiple lenses and image sensors operable in a default imaging mode
US20080218612A1 (en) 2007-03-09 2008-09-11 Border John N Camera using multiple lenses and image sensors in a rangefinder configuration to provide a range map
CN101276415A (en) 2008-03-03 2008-10-01 北京航空航天大学 Apparatus and method for realizing multi-resolutions image acquisition with multi-focusing video camera
US7509041B2 (en) 2005-08-01 2009-03-24 Eastman Kodak Company Image-capturing device having multiple optical systems
US20090086074A1 (en) 2007-09-27 2009-04-02 Omnivision Technologies, Inc. Dual mode camera solution apparatus, system, and method
US20090109556A1 (en) 2007-10-31 2009-04-30 Sony Corporation Lens barrel and imaging apparatus
US20090122195A1 (en) 2007-11-09 2009-05-14 Van Baar Jeroen System and Method for Combining Image Sequences
US20090122406A1 (en) 2006-02-06 2009-05-14 Jarkko Rouvinen Optical Image Stabilizer Using Gimballed Prism
US7533819B2 (en) 2007-01-31 2009-05-19 Symbol Technologies, Inc. Dual camera assembly for an imaging-based bar code reader
US20090128644A1 (en) 2007-11-15 2009-05-21 Camp Jr William O System and method for generating a photograph
KR20090058229A (en) 2007-12-04 2009-06-09 삼성전기주식회사 Dual camera module
WO2009097552A1 (en) 2008-02-01 2009-08-06 Omnivision Cdm Optics, Inc. Image data fusion systems and methods
US20090219547A1 (en) 2006-02-06 2009-09-03 Petteri Kauhanen Method and Device for Position Sensing in an Imaging System
US20090252484A1 (en) 2005-11-14 2009-10-08 Nikon Corporation Image Blur Correction Device and Camera
US20090295949A1 (en) 2008-05-28 2009-12-03 Valtion Teknillinen Tutkimuskeskus Zoom camera arrangement comprising multiple sub-cameras
US20090324135A1 (en) 2008-06-27 2009-12-31 Sony Corporation Image processing apparatus, image processing method, program and recording medium
US20100013906A1 (en) 2008-07-17 2010-01-21 Border John N Zoom by multiple image capture
KR20100008936A (en) 2008-07-17 2010-01-27 삼성전자주식회사 Portable terminal having dual camera and photographing method using the same
US20100020221A1 (en) 2008-07-24 2010-01-28 David John Tupman Camera Interface in a Portable Handheld Electronic Device
US20100060746A9 (en) 2004-08-25 2010-03-11 Richard Ian Olsen Simultaneous multiple field of view digital cameras
US20100097444A1 (en) 2008-10-16 2010-04-22 Peter Lablans Camera System for Creating an Image From a Plurality of Images
US20100103194A1 (en) 2008-10-27 2010-04-29 Huawei Technologies Co., Ltd. Method and system for fusing images
US7738016B2 (en) 2006-02-06 2010-06-15 Eastman Kodak Company Digital camera with dual optical systems
US20100165131A1 (en) 2008-12-25 2010-07-01 Fujifilm Corporation Image stabilizer and optical instrument therewith
US7773121B1 (en) 2006-05-03 2010-08-10 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration High-resolution, continuous field-of-view (FOV), non-rotating imaging system
US20100238327A1 (en) 2009-03-19 2010-09-23 Griffith John D Dual Sensor Camera
US7809256B2 (en) 2005-07-27 2010-10-05 Sony Corporation Imaging lens device and imaging apparatus
WO2010122841A1 (en) 2009-04-22 2010-10-28 コニカミノルタオプト株式会社 Mirror-lens barrel, image pickup device and method for manufacturing a mirror-lens barrel
US20100277619A1 (en) 2009-05-04 2010-11-04 Lawrence Scarff Dual Lens Digital Zoom
US20100283842A1 (en) 2007-04-19 2010-11-11 Dvp Technologies Ltd. Imaging system and method for use in monitoring a field of regard
US20100321494A1 (en) 2009-06-18 2010-12-23 Theia Technologies, Llc Compact dome camera
US20110058320A1 (en) 2009-09-09 2011-03-10 Lg Electronics Inc. Mobile terminal
US7918398B2 (en) 2007-06-04 2011-04-05 Hand Held Products, Inc. Indicia reading terminal having multiple setting imaging lens
US20110080487A1 (en) 2008-05-20 2011-04-07 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
JP2011085666A (en) 2009-10-13 2011-04-28 Tdk Taiwan Corp Lens driving device
US20110121421A1 (en) 2008-05-09 2011-05-26 Ecole Polytechnique Federale de Lausanne (EPFL) Image sensor having nonlinear response
US20110128288A1 (en) 2009-12-02 2011-06-02 David Petrou Region of Interest Selector for Visual Queries
US7964835B2 (en) 2005-08-25 2011-06-21 Protarius Filo Ag, L.L.C. Digital cameras with direct luminance and chrominance detection
US20110164172A1 (en) 2008-09-10 2011-07-07 Panasonic Corporation Camera body and imaging device
US7978239B2 (en) 2007-03-01 2011-07-12 Eastman Kodak Company Digital camera using multiple image sensors to provide improved temporal sampling
US20110216228A1 (en) * 2009-09-14 2011-09-08 Fujifilm Corporation Solid-state image sensing element, method for driving solid-state image sensing element and image pickup device
US20110229054A1 (en) 2008-07-23 2011-09-22 Snell Limited Processing of images to represent a transition in viewpoint
US20110234881A1 (en) 2010-03-25 2011-09-29 Fujifilm Corporation Display apparatus
US20110234853A1 (en) 2010-03-26 2011-09-29 Fujifilm Corporation Imaging apparatus and display apparatus
US20110242355A1 (en) 2010-04-05 2011-10-06 Qualcomm Incorporated Combining data from multiple image sensors
US20110242286A1 (en) 2010-03-31 2011-10-06 Vincent Pace Stereoscopic Camera With Automatic Obstruction Removal
US20110285730A1 (en) 2010-05-21 2011-11-24 Jimmy Kwok Lap Lai Controlling Display Updates For Electro-Optic Displays
US20110292258A1 (en) 2010-05-28 2011-12-01 C2Cure, Inc. Two sensor imaging systems
US20110298966A1 (en) 2010-05-21 2011-12-08 Jena Optronik Gmbh Camera having multiple focal lengths
US8094208B2 (en) * 2009-07-17 2012-01-10 The Invention Science Fund I, LLC Color filters and demosaicing techniques for digital imaging
US20120026366A1 (en) 2009-04-07 2012-02-02 Nextvision Stabilized Systems Ltd. Continuous electronic zoom for an imaging system with multiple imaging devices having different fixed fov
US8115825B2 (en) 2008-02-20 2012-02-14 Apple Inc. Electronic device with two image sensors
US8134115B2 (en) 2009-06-23 2012-03-13 Nokia Corporation Color filters for sub-diffraction limit-sized light sensors
US20120062780A1 (en) 2010-09-15 2012-03-15 Morihisa Taijiro Imaging apparatus and image capturing method
US20120069235A1 (en) 2010-09-20 2012-03-22 Canon Kabushiki Kaisha Image capture with focus adjustment
US20120075489A1 (en) 2010-09-24 2012-03-29 Nishihara H Keith Zoom camera image blending technique
US8149327B2 (en) 2009-03-13 2012-04-03 Hon Hai Precision Industry Co., Ltd. Camera module with dual lens modules and image sensors
US20120081566A1 (en) * 2010-09-30 2012-04-05 Apple Inc. Flash synchronization using image sensor interface timing signal
US8154610B2 (en) 2004-12-30 2012-04-10 Intellectual Ventures Ii Llc Image sensor with built-in ISP and dual camera system
US20120105579A1 (en) 2010-11-01 2012-05-03 Lg Electronics Inc. Mobile terminal and method of controlling an image photographing therein
US8179457B2 (en) 2009-06-23 2012-05-15 Nokia Corporation Gradient color filters for sub-diffraction limit sensors
US20120154614A1 (en) 2009-08-21 2012-06-21 Akihiro Moriya Camera-shake correction device
US20120196648A1 (en) 2011-01-31 2012-08-02 Havens William H Apparatus, system, and method of use of imaging assembly on mobile terminal
US8238695B1 (en) 2005-12-15 2012-08-07 Grandeye, Ltd. Data reduction techniques for processing wide-angle video
US20120229663A1 (en) 2011-03-08 2012-09-13 Spectral Instruments Imaging , Llc Imaging system having primary and auxiliary camera systems
US8274552B2 (en) 2010-12-27 2012-09-25 3Dmedia Corporation Primary and auxiliary image capture devices for image processing and related methods
US20120249815A1 (en) 2011-03-29 2012-10-04 Microsoft Corporation Folded imaging path camera
CN102739949A (en) 2011-04-01 2012-10-17 张可伦 Control method for multi-lens camera and multi-lens device
EP2523450A1 (en) 2011-05-10 2012-11-14 HTC Corporation Handheld electronic device with dual image capturing method and computer program product
US20120287315A1 (en) 2011-05-10 2012-11-15 Htc Corporation Handheld Electronic Device, Dual Image Capturing Method Applying for Thereof, and Computer Program Production for Load into Thereof
US20120320467A1 (en) 2011-06-14 2012-12-20 Samsung Electro-Mechanics Co., Ltd. Image photographing device
US20130002928A1 (en) 2011-06-28 2013-01-03 Canon Kabushiki Kaisha Adjustment of imaging properties for an imaging assembly having light-field optics
US20130016427A1 (en) 2011-07-15 2013-01-17 Mitsumi Electric Co., Ltd Lens holder driving device capable of avoiding deleterious effect on hall elements
US8390729B2 (en) 2007-09-05 2013-03-05 International Business Machines Corporation Method and apparatus for providing a video image having multiple focal lengths
US8391697B2 (en) 2009-09-30 2013-03-05 Lg Electronics Inc. Mobile terminal and method of controlling the operation of the mobile terminal
US8400555B1 (en) 2009-12-01 2013-03-19 Adobe Systems Incorporated Focused plenoptic camera employing microlenses with different focal lengths
US20130076922A1 (en) 2011-07-28 2013-03-28 Canon Kabushiki Kaisha Correcting optical device and image pickup apparatus
CN103024272A (en) 2012-12-14 2013-04-03 广东欧珀移动通信有限公司 Double camera control device, method and system of mobile terminal and mobile terminal
US20130093842A1 (en) 2011-10-12 2013-04-18 Canon Kabushiki Kaisha Image-capturing device
US20130113894A1 (en) 2010-07-13 2013-05-09 Ram Srikanth Mirlay Variable 3-d camera assembly for still photography
US8439265B2 (en) 2009-06-16 2013-05-14 Intel Corporation Camera applications in a handheld device
US8446484B2 (en) 2010-04-21 2013-05-21 Nokia Corporation Image processing architecture with pre-scaler
JP2013106289A (en) 2011-11-16 2013-05-30 Konica Minolta Advanced Layers Inc Imaging apparatus
US20130135445A1 (en) 2010-12-27 2013-05-30 3Dmedia Corporation Primary and auxiliary image capture devices for image processing and related methods
US20130136355A1 (en) * 2011-11-29 2013-05-30 Microsoft Corporation Automatic Estimation and Correction of Vignetting
US8483452B2 (en) 2010-03-09 2013-07-09 Sony Corporation Image processing apparatus, image processing method, and program
US20130182150A1 (en) 2012-01-12 2013-07-18 Olympus Corporation Image Pickup Apparatus
US20130201360A1 (en) 2012-02-03 2013-08-08 Samsung Electronics Co., Ltd. Method of changing an operation mode of a camera image sensor
US20130202273A1 (en) 2012-02-07 2013-08-08 Canon Kabushiki Kaisha Method and device for transitioning between an image of a first video sequence and an image of a second video sequence
US8514491B2 (en) 2009-11-20 2013-08-20 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
US20130235224A1 (en) 2012-03-09 2013-09-12 Minwoo Park Video camera providing a composite video sequence
US20130250150A1 (en) 2010-05-03 2013-09-26 Michael R. Malone Devices and methods for high-resolution image and video capture
US8547389B2 (en) 2010-04-05 2013-10-01 Microsoft Corporation Capturing image structure detail from a first image and color from a second image
US20130258044A1 (en) 2012-03-30 2013-10-03 Zetta Research And Development Llc - Forc Series Multi-lens camera
US20130270419A1 (en) 2012-04-12 2013-10-17 Digitaloptics Corporation Compact Camera Module
US20130278785A1 (en) 2012-04-20 2013-10-24 Hoya Corporation Imaging apparatus
US8587691B2 (en) 2008-11-28 2013-11-19 Samsung Electronics Co., Ltd. Photographing apparatus and method for dynamic range adjustment and stereography
US20130321668A1 (en) 2012-05-30 2013-12-05 Ajith Kamath Plural Focal-Plane Imaging
US8619148B1 (en) 2012-01-04 2013-12-31 Audience, Inc. Image correction after combining images from multiple cameras
US20140009631A1 (en) 2012-07-06 2014-01-09 Apple Inc. Vcm ois actuator module
KR20140014787A (en) 2012-07-26 2014-02-06 엘지이노텍 주식회사 Camera module
US20140049615A1 (en) 2010-12-28 2014-02-20 Sony Corporation Lens protection device, lens unit and image capture device
US8660420B2 (en) 2011-12-13 2014-02-25 Hon Hai Precision Industry Co., Ltd. Adjustable dual lens camera
US20140118584A1 (en) 2012-10-31 2014-05-01 Jess Jan Young Lee Devices, methods, and systems for expanded-field-of-view image and video capture
WO2014072818A2 (en) 2012-11-08 2014-05-15 Dynaoptics Pte Ltd. Miniature optical zoom lens
CN103841404A (en) 2014-03-18 2014-06-04 江西省一元数码科技有限公司 Novel three-dimensional image shooting module
US20140192253A1 (en) 2013-01-05 2014-07-10 Tinz Optics, Inc. Methods and apparatus for capturing and/or processing images
US20140192238A1 (en) 2010-10-24 2014-07-10 Linx Computational Imaging Ltd. System and Method for Imaging and Image Processing
US20140218587A1 (en) 2013-02-07 2014-08-07 Motorola Mobility Llc Double sided camera module
US8803990B2 (en) 2011-01-25 2014-08-12 Aptina Imaging Corporation Imaging system with multiple sensors for producing high-dynamic-range images
US20140313316A1 (en) 2013-01-30 2014-10-23 SeeScan, Inc. Adjustable variable resolution inspection systems and methods using multiple image sensors
US8896655B2 (en) 2010-08-31 2014-11-25 Cisco Technology, Inc. System and method for providing depth adaptive video conferencing
US20140362242A1 (en) 2012-11-16 2014-12-11 Panasonic Intellectual Property Corporation Of America Camera drive device
US20150002683A1 (en) 2013-07-01 2015-01-01 Tdk Taiwan Corp. Optical Anti-Shake Apparatus with Switchable Light Path
US20150042870A1 (en) 2013-08-08 2015-02-12 Apple Inc. Mirror tilt actuation
US8976255B2 (en) 2011-02-28 2015-03-10 Olympus Imaging Corp. Imaging apparatus
US20150070781A1 (en) 2013-09-12 2015-03-12 Hong Kong Applied Science and Technology Research Institute, Co. Multi-lens imaging module and actuator with auto-focus adjustment
US20150092066A1 (en) 2013-09-30 2015-04-02 Google Inc. Using a Second Camera to Adjust Settings of First Camera
US9019387B2 (en) 2011-03-18 2015-04-28 Ricoh Company, Ltd. Imaging device and method of obtaining image
US9025073B2 (en) 2007-12-04 2015-05-05 Nan Chang O-Film Optoelectronics Technology Ltd Compact camera optics
US20150138381A1 (en) 2012-06-29 2015-05-21 Lg Innotek Co., Ltd. Camera module
US9041835B2 (en) 2010-11-10 2015-05-26 Canon Kabushiki Kaisha Selective combining of image data
US20150154776A1 (en) 2013-12-03 2015-06-04 Huawei Technologies Co., Ltd. Image splicing method and apparatus
US20150162048A1 (en) 2012-06-11 2015-06-11 Sony Computer Entertainment Inc. Image generation device and image generation method
US20150195458A1 (en) 2012-07-12 2015-07-09 Sony Corporation Image shake correction device and image shake correction method and image pickup device
US20150215516A1 (en) 2014-01-27 2015-07-30 Raytheon Company Imaging system and methods with variable lateral magnification
US20150237280A1 (en) 2014-02-19 2015-08-20 Samsung Electronics Co., Ltd. Image processing device with multiple image signal processors and image processing method
US20150242994A1 (en) 2010-01-28 2015-08-27 Pathway Innovations And Technologies, Inc. Method and system for accelerating video preview digital camera
US20150253543A1 (en) 2014-03-07 2015-09-10 Apple Inc. Folded telephoto camera lens system
US20150253647A1 (en) 2014-03-07 2015-09-10 Apple Inc. Folded camera lens systems
US9137447B2 (en) 2013-07-31 2015-09-15 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus that generates an image including an emphasized in-focus part of a captured image
US20150271471A1 (en) 2014-03-19 2015-09-24 Htc Corporation Blocking detection method for camera and electronic apparatus with cameras
US20150286033A1 (en) 2014-04-04 2015-10-08 Qualcomm Incorporated Auto-focus in low-profile folded optics multi-camera system
KR20150118012A (en) 2014-04-11 2015-10-21 삼성전기주식회사 Camera module
US20150316744A1 (en) 2014-04-30 2015-11-05 Lite-On Electronics (Guangzhou) Limited Voice coil motor array module
US9185291B1 (en) 2013-06-13 2015-11-10 Corephotonics Ltd. Dual aperture zoom digital camera
US20150334309A1 (en) 2014-05-16 2015-11-19 Htc Corporation Handheld electronic apparatus, image capturing apparatus and image capturing method thereof
US9215385B2 (en) 2009-06-22 2015-12-15 Omnivision Technologies, Inc. System and method for an image sensor operable in multiple video standards
US9215377B2 (en) 2013-12-04 2015-12-15 Nokia Technologies Oy Digital zoom with sensor mode change
US20160044250A1 (en) 2014-08-10 2016-02-11 Corephotonics Ltd. Zoom dual-aperture camera with folded lens
US9270875B2 (en) 2011-07-20 2016-02-23 Broadcom Corporation Dual image capture processing
US20160070088A1 (en) 2014-09-10 2016-03-10 Hoya Corporation Imaging apparatus having bending optical element
US9286680B1 (en) 2014-12-23 2016-03-15 Futurewei Technologies, Inc. Computational multi-camera adjustment for smooth view switching and zooming
US9344626B2 (en) 2013-11-18 2016-05-17 Apple Inc. Modeless video and still frame capture using interleaved frames of video and still resolutions
US20160154204A1 (en) 2014-11-28 2016-06-02 Samsung Electro-Mechanics Co., Ltd. Camera module
US20160154202A1 (en) 2013-05-27 2016-06-02 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Optical structure with ridges arranged at the same and method for producing the same
US9360671B1 (en) 2014-06-09 2016-06-07 Google Inc. Systems and methods for image zoom
US9369621B2 (en) 2010-05-03 2016-06-14 Invisage Technologies, Inc. Devices and methods for high-resolution image and video capture
US20160212358A1 (en) 2011-11-14 2016-07-21 Sony Corporation Information processing apparatus, method, and non-transitory computer-readable medium
US9413930B2 (en) 2013-03-14 2016-08-09 Joergen Geerds Camera system
US9420180B2 (en) 2012-05-22 2016-08-16 Zte Corporation Method and device for switching between double cameras
US20160241751A1 (en) 2013-09-23 2016-08-18 Lg Innotek Co., Ltd. Camera Module and Manufacturing Method for Same
US9438792B2 (en) 2013-05-17 2016-09-06 Canon Kabushiki Kaisha Image-processing apparatus and image-processing method for generating a virtual angle of view
US20160295112A1 (en) 2012-10-19 2016-10-06 Qualcomm Incorporated Multi-camera system using folded optics
US20160301840A1 (en) 2013-12-06 2016-10-13 Huawei Device Co., Ltd. Photographing Method for Dual-Lens Device and Dual-Lens Device
US9485432B1 (en) 2015-04-29 2016-11-01 Uurmi Systems Private Limited Methods, systems and apparatuses for dual-camera based zooming
US20160353012A1 (en) 2015-05-25 2016-12-01 Htc Corporation Zooming control method for camera and electronic apparatus with camera
US20170019616A1 (en) 2014-05-15 2017-01-19 Huawei Technologies Co., Ltd. Multi-frame noise reduction method, and terminal
WO2017025822A1 (en) 2015-08-13 2017-02-16 Corephotonics Ltd. Dual aperture zoom camera with video support and switching / non-switching dynamic control
WO2017037688A1 (en) 2015-09-06 2017-03-09 Corephotonics Ltd. Auto focus and optical image stabilization with roll compensation in a compact folded camera
US9618748B2 (en) 2008-04-02 2017-04-11 Esight Corp. Apparatus and method for a dynamic “region of interest” in a display system
US20170187962A1 (en) 2015-12-23 2017-06-29 Samsung Electronics Co., Ltd. Imaging device module, user terminal apparatus including the imaging device module, and a method of operating the imaging device module
US20170214846A1 (en) 2014-09-30 2017-07-27 Huawei Technologies Co., Ltd. Auto-Focus Method and Apparatus and Electronic Device
US20170214866A1 (en) 2013-12-06 2017-07-27 Huawei Device Co., Ltd. Image Generating Method and Dual-Lens Device
US9723220B2 (en) 2013-05-13 2017-08-01 Canon Kabushiki Kaisha Imaging apparatus, control method, and program
US9736365B2 (en) 2013-10-26 2017-08-15 Light Labs Inc. Zoom related methods and apparatus
US9736391B2 (en) 2013-12-06 2017-08-15 Huawei Device Co., Ltd. Photographing method of dual-lens device, and dual-lens device
US20170242225A1 (en) 2014-11-19 2017-08-24 Orlo James Fiske Thin optical system and camera
US9768310B2 (en) 2014-11-25 2017-09-19 Samsung Display Co., Ltd. Thin film transistor, organic light-emitting diode display including the same, and manufacturing method thereof
US20170289458A1 (en) 2016-03-31 2017-10-05 Lg Electronics Inc. Mobile terminal and method for controlling the same
US9800798B2 (en) 2015-02-13 2017-10-24 Qualcomm Incorporated Systems and methods for power optimization for imaging devices with dual cameras
US9851803B2 (en) 2013-03-15 2017-12-26 Eyecam, LLC Autonomous computing and telecommunications head-up displays glasses
US20180017844A1 (en) 2016-07-12 2018-01-18 Tdk Taiwan Corp. Lens driving module
US20180024329A1 (en) 2015-04-16 2018-01-25 Corephotonics Ltd. Auto focus and optical image stabilization in a compact folded camera
US9894287B2 (en) 2013-12-06 2018-02-13 Huawei Device (Dongguan) Co., Ltd. Method and apparatus for acquiring a high dynamic image using multiple cameras
US9900522B2 (en) 2010-12-01 2018-02-20 Magna Electronics Inc. System and method of establishing a multi-camera image using pixel remapping
US20180059379A1 (en) 2016-08-26 2018-03-01 Largan Precision Co., Ltd. Optical path folding element, imaging lens module and electronic device
US20180120674A1 (en) 2015-06-24 2018-05-03 Corephotonics Ltd. Low profile tri-axis actuator for folded lens camera
US20180150973A1 (en) 2015-07-15 2018-05-31 Huawei Technologies Co., Ltd. Method and Apparatus for Calculating Dual-Camera Relative Position, and Device
WO2018130898A1 (en) 2017-01-12 2018-07-19 Corephotonics Ltd. Compact folded camera
US20180241922A1 (en) 2017-02-23 2018-08-23 Qualcomm Incorporated Adjustment for cameras for low power mode operation
US20180295292A1 (en) 2017-04-10 2018-10-11 Samsung Electronics Co., Ltd Method and electronic device for focus control

Family Cites Families (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4898467A (en) * 1988-11-07 1990-02-06 Eastman Kodak Company Spectrometer apparatus for self-calibrating color imaging apparatus
KR0120397B1 (en) * 1992-04-28 1997-10-22 나카무라 히사오 Image display apparatus
US5374956A (en) * 1992-05-29 1994-12-20 Eastman Kodak Company Electronic imaging apparatus with dithered color filter array
US5303028A (en) * 1992-08-24 1994-04-12 Eastman Kodak Company Spectrometer apparatus for calibrating color imaging apparatus
US5760955A (en) * 1995-04-06 1998-06-02 Philips Electronics North America Corporation Rear projection screen with reduced speckle
US5629764A (en) * 1995-07-07 1997-05-13 Advanced Precision Technology, Inc. Prism fingerprint sensor using a holographic optical element
GB9823689D0 (en) 1998-10-30 1998-12-23 Greenagate Limited Improved methods and apparatus for 3-D imaging
JP3777893B2 (en) * 1999-08-05 2006-05-24 セイコーエプソン株式会社 Liquid crystal display
JP2002010276A (en) * 2000-06-22 2002-01-11 Olympus Optical Co Ltd Imaging apparatus
JP4501239B2 (en) 2000-07-13 2010-07-14 ソニー株式会社 Camera calibration apparatus and method, and storage medium
JP4068869B2 (en) * 2002-03-29 2008-03-26 富士フイルム株式会社 Digital camera
CN1666229A (en) * 2002-07-04 2005-09-07 皇家飞利浦电子股份有限公司 Method and apparatus for signal processing, computer program product, computing system and camera
JP3861815B2 (en) 2003-01-17 2006-12-27 コニカミノルタフォトイメージング株式会社 Camera with image stabilization function
AU2003236050A1 (en) 2003-03-20 2004-10-11 Seijiro Tomita Panoramic picture creating method and device, and monitor system using the method and device
US7688511B2 (en) * 2004-07-23 2010-03-30 Hitachi Chemical Company, Ltd. Diffraction type light-condensing film and planar light source device using the same
US7465107B2 (en) 2004-09-21 2008-12-16 Canon Kabushiki Kaisha Photographing apparatus and control method therefor
KR100658150B1 (en) 2005-04-08 2006-12-15 삼성전기주식회사 Camera module and method of manufacturing the same
KR20070005946A (en) 2005-07-05 2007-01-11 엘지전자 주식회사 Position detection apparatus of camera lens in the mobile terminal
EP1961214A4 (en) * 2005-10-13 2011-11-16 Rjs Technology Inc System and method for a high performance color filter mosaic array
US7456881B2 (en) * 2006-01-12 2008-11-25 Aptina Imaging Corporation Method and apparatus for producing Bayer color mosaic interpolation for imagers
US7708478B2 (en) 2006-04-13 2010-05-04 Nokia Corporation Actuator mechanism and a shutter mechanism
US7697053B2 (en) 2006-11-02 2010-04-13 Eastman Kodak Company Integrated display having multiple capture devices
KR100871566B1 (en) 2006-12-04 2008-12-02 삼성전자주식회사 Apparatus and method for preventing shaking of image photographing device
CN101573731B (en) * 2007-01-04 2015-07-22 皇家飞利浦电子股份有限公司 Apparatus and method and for producing a corrected image of a region of interest from acquired projection data
US20100133424A1 (en) * 2007-05-26 2010-06-03 Norman Matheson Lindsay Electro-optical sensors
US7745779B2 (en) * 2008-02-08 2010-06-29 Aptina Imaging Corporation Color pixel arrays having common color filters for multiple adjacent pixels for use in CMOS imagers
JP2010204341A (en) 2009-03-03 2010-09-16 Nikon Corp Camera
KR101552481B1 (en) 2009-04-13 2015-09-21 삼성전자 주식회사 Zoom lens module
WO2011009108A2 (en) 2009-07-17 2011-01-20 Universal Robotics, Inc. System and method for automatic calibration of stereo images
EP2284800B1 (en) * 2009-07-23 2018-09-05 Samsung Electronics Co., Ltd. Method and system for creating an image
CN201514511U (en) 2009-09-08 2010-06-23 华晶科技股份有限公司 Periscopic lens structure
EP2478464B1 (en) 2009-09-14 2019-05-08 VIION Systems Inc. Saccadic dual-resolution video analytics camera
KR20110029217A (en) * 2009-09-15 2011-03-23 삼성전자주식회사 Image sensor for outputting rgb bayer signals through internal conversion, and image processing apparatus including the same
TWI478828B (en) 2010-03-26 2015-04-01 Hon Hai Prec Ind Co Ltd Vehicle imaging system and vehicle with same
US8565522B2 (en) * 2010-05-21 2013-10-22 Seiko Epson Corporation Enhancing color images
JP2012027263A (en) 2010-07-23 2012-02-09 Sony Corp Imaging apparatus, control method and program thereof
US8493482B2 (en) 2010-08-18 2013-07-23 Apple Inc. Dual image sensor image processing system and method
KR101731346B1 (en) 2010-11-12 2017-04-28 엘지전자 주식회사 Method for providing display image in multimedia device and thereof
US8988564B2 (en) 2011-09-09 2015-03-24 Apple Inc. Digital camera with light splitter
US8947627B2 (en) 2011-10-14 2015-02-03 Apple Inc. Electronic devices having displays with openings
US8970655B2 (en) 2011-12-16 2015-03-03 Polycom, Inc. Reflective and refractive solutions to providing direct eye contact videoconferencing
EP2872966A1 (en) 2012-07-12 2015-05-20 Dual Aperture International Co. Ltd. Gesture-based user interface
KR102166262B1 (en) 2013-06-10 2020-10-15 삼성전자주식회사 Camera lens assembly
TW201515433A (en) 2013-10-14 2015-04-16 Etron Technology Inc Image calibration system and calibration method of a stereo camera
TW201533514A (en) 2014-02-27 2015-09-01 Tdk Taiwan Corp Reflector structure and photographic device
KR102214193B1 (en) 2014-03-25 2021-02-09 삼성전자 주식회사 Depth camera device, 3d image display system having the same and control methods thereof
US11019330B2 (en) 2015-01-19 2021-05-25 Aquifi, Inc. Multiple camera system with auto recalibration
US20170070731A1 (en) 2015-09-04 2017-03-09 Apple Inc. Single And Multi-Camera Calibration
US9843736B2 (en) 2016-02-26 2017-12-12 Essential Products, Inc. Image capture with a camera integrated display
EP3758356B1 (en) 2016-05-30 2021-10-20 Corephotonics Ltd. Actuator
CN106603765B (en) 2016-12-20 2020-03-17 Oppo广东移动通信有限公司 Bracket component and mobile terminal
CN106534655B (en) 2017-01-11 2019-04-16 Oppo广东移动通信有限公司 Camera module and mobile terminal
JP6967715B2 (en) 2017-04-18 2021-11-17 パナソニックIpマネジメント株式会社 Camera calibration method, camera calibration program and camera calibration device

Patent Citations (305)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4199785A (en) 1979-01-05 1980-04-22 Honeywell Inc. Electronic zoom system
JPS59191146A (en) 1983-04-13 1984-10-30 Hitachi Ltd Optical scanner
US5099263A (en) 1984-11-10 1992-03-24 Minolta Camera Kabushiki Kaisha Variable focal length camera
US5005083A (en) 1988-05-19 1991-04-02 Siemens Aktiengesellschaft FLIR system with two optical channels for observing a wide and a narrow field of view
US5051830A (en) 1989-08-18 1991-09-24 Messerschmitt-Bolkow-Blohm Gmbh Dual lens system for electronic camera
JPH04211230A (en) 1989-10-20 1992-08-03 Fuji Photo Film Co Ltd Compensator for camera shake by hand
US5032917A (en) 1990-03-12 1991-07-16 Rca Licensing Corporation Video signal blending apparatus
US5287093A (en) 1990-06-11 1994-02-15 Matsushita Electric Industrial Co., Ltd. Image processor for producing cross-faded image from first and second image data
US5041852A (en) 1990-10-18 1991-08-20 Fuji Photo Film Co., Ltd. Camera shake correction system
US5436660A (en) 1991-03-13 1995-07-25 Sharp Kabushiki Kaisha Image sensing apparatus having plurality of optical systems and method of operating such apparatus
US5394520A (en) 1991-09-26 1995-02-28 Hughes Aircraft Company Imaging apparatus for providing a composite digital representation of a scene within a field of regard
US5657402A (en) 1991-11-01 1997-08-12 Massachusetts Institute Of Technology Method of creating a high resolution still image using a plurality of images and apparatus for practice of the method
US5248971A (en) 1992-05-19 1993-09-28 Mandl William J Method and apparatus for multiplexed oversampled analog to digital modulation
US5459520A (en) 1992-12-08 1995-10-17 Sony Corporation Electronic camera with over-sampling filter and method for over-sampling and interpolating electronic camera image data
US5444478A (en) 1992-12-29 1995-08-22 U.S. Philips Corporation Image processing method and device for constructing an image from adjacent images
US5682198A (en) 1993-06-28 1997-10-28 Canon Kabushiki Kaisha Double eye image pickup apparatus
US6128416A (en) 1993-09-10 2000-10-03 Olympus Optical Co., Ltd. Image composing technique for optimally composing a single image from a plurality of digital images
JPH07318864A (en) 1994-05-20 1995-12-08 Sony Corp Optical axis correcting mechanism
US6714665B1 (en) 1994-09-02 2004-03-30 Sarnoff Corporation Fully automated iris recognition system utilizing wide and narrow fields of view
US6724421B1 (en) 1994-11-22 2004-04-20 Sensormatic Electronics Corporation Video surveillance system with pilot and slave cameras
JPH08271976A (en) 1995-03-29 1996-10-18 Canon Inc Camera
US5768443A (en) 1995-12-19 1998-06-16 Cognex Corporation Method for coordinating multiple fields of view in multi-camera
US5982951A (en) 1996-05-28 1999-11-09 Canon Kabushiki Kaisha Apparatus and method for combining a plurality of images
US5926190A (en) 1996-08-21 1999-07-20 Apple Computer, Inc. Method and system for simulating motion in a computer graphics application using image registration and view interpolation
US20030030729A1 (en) 1996-09-12 2003-02-13 Prentice Wayne E. Dual mode digital imaging and camera system
US6101334A (en) 1997-02-18 2000-08-08 Mobi Corporation Dual focal length camera
US5940641A (en) 1997-07-10 1999-08-17 Eastman Kodak Company Extending panoramic images
US6148120A (en) 1997-10-30 2000-11-14 Cognex Corporation Warping of focal images to correct correspondence error
US6268611B1 (en) 1997-12-18 2001-07-31 Cellavision Ab Feature-free registration of dissimilar images using a robust similarity metric
US6750903B1 (en) 1998-03-05 2004-06-15 Hitachi, Ltd. Super high resolution camera
US6208765B1 (en) 1998-06-19 2001-03-27 Sarnoff Corporation Method and apparatus for improving image resolution
US6611289B1 (en) 1999-01-15 2003-08-26 Yanbin Yu Digital cameras using multiple sensors with multiple lenses
US20020063711A1 (en) 1999-05-12 2002-05-30 Imove Inc. Camera system with high resolution image inside a wide angle view
US20020075258A1 (en) 1999-05-12 2002-06-20 Imove Inc. Camera system with high resolution image inside a wide angle view
US6738073B2 (en) 1999-05-12 2004-05-18 Imove, Inc. Camera system with both a wide angle view and a high resolution view
US6549215B2 (en) 1999-05-20 2003-04-15 Compaq Computer Corporation System and method for displaying images using anamorphic video
US7038716B2 (en) 1999-07-30 2006-05-02 Pixim, Inc. Mobile device equipped with digital image sensor
US20020122113A1 (en) 1999-08-09 2002-09-05 Foote Jonathan T. Method and system for compensating for parallax in multiple camera systems
US7015954B1 (en) 1999-08-09 2006-03-21 Fuji Xerox Co., Ltd. Automatic video system using multiple cameras
US6650368B1 (en) 1999-10-26 2003-11-18 Hewlett-Packard Development Company, L.P. Digital camera and method of enhancing zoom effects
US6643416B1 (en) 1999-11-30 2003-11-04 Eastman Kodak Company Method for determining necessary resolution for zoom and crop images
US20020005902A1 (en) 2000-06-02 2002-01-17 Yuen Henry C. Automatic video recording system using wide- and narrow-field cameras
US7002583B2 (en) 2000-08-03 2006-02-21 Stono Technologies, Llc Display of images and image transitions
US6778207B1 (en) 2000-08-07 2004-08-17 Koninklijke Philips Electronics N.V. Fast digital pan tilt zoom video
US20020030163A1 (en) 2000-08-09 2002-03-14 Zhang Evan Y.W. Image intensifier and LWIR fusion/combination system
US20040012683A1 (en) 2001-01-23 2004-01-22 Masafumi Yamasaki Shake compensating device for optical devices
US6741250B1 (en) 2001-02-09 2004-05-25 Be Here Corporation Method and system for generation of multiple viewpoints into a scene viewed by motionless cameras and for presentation of a view path
US7346217B1 (en) 2001-04-25 2008-03-18 Lockheed Martin Corporation Digital image enhancement using successive zoom images
US20020167741A1 (en) 2001-05-14 2002-11-14 Olympus Optical Co., Ltd. Optical apparatus including lens
US7248294B2 (en) 2001-07-10 2007-07-24 Hewlett-Packard Development Company, L.P. Intelligent feature selection and pan zoom control
US6680748B1 (en) 2001-09-27 2004-01-20 Pixim, Inc. Multi-mode camera and method therefor
US20030093805A1 (en) 2001-11-15 2003-05-15 Gin J.M. Jack Dual camera surveillance and control system
US7339621B2 (en) 2001-12-13 2008-03-04 Psion Teklogix Systems, Inc. Imager output signal processing
US20030160886A1 (en) 2002-02-22 2003-08-28 Fuji Photo Film Co., Ltd. Digital camera
JP2003298920A (en) 2002-03-29 2003-10-17 Fuji Photo Film Co Ltd Digital camera
US20040027367A1 (en) 2002-04-30 2004-02-12 Maurizio Pilu Method of and apparatus for processing zoomed sequential images
US20030202113A1 (en) 2002-04-30 2003-10-30 Eastman Kodak Company Electronic still camera and image processing method
US7411610B2 (en) 2002-05-15 2008-08-12 Idelix Software Inc. Method and system for generating detail-in-context video presentations using a graphical user interface
US20040008773A1 (en) 2002-06-14 2004-01-15 Canon Kabushiki Kaisha Multiple image processing and synthesis using background image extraction
US20040017386A1 (en) 2002-07-26 2004-01-29 Qiong Liu Capturing and producing shared multi-resolution video
US20040061788A1 (en) 2002-09-26 2004-04-01 Logitech Europe S.A. Multiple mode capture button for a digital camera
JP2004133054A (en) 2002-10-08 2004-04-30 Olympus Corp Lens barrel
US20050168834A1 (en) 2002-10-08 2005-08-04 Olympus Corporation Camera
US7365793B2 (en) 2002-10-31 2008-04-29 Hewlett-Packard Development Company, L.P. Image capture system and method
US20040141086A1 (en) 2003-01-10 2004-07-22 Olympus Corporation Electronic imaging apparatus
JP2004245982A (en) 2003-02-13 2004-09-02 Minolta Co Ltd Imaging lens device and electronic equipment equipped with the same
US20040240052A1 (en) 2003-06-02 2004-12-02 Pentax Corporation Multiple-focal imaging device, and a mobile device having the multiple-focal-length imaging device
US20050013509A1 (en) 2003-07-16 2005-01-20 Ramin Samadani High resolution image reconstruction
US20050046740A1 (en) 2003-08-29 2005-03-03 Davis Raymond A. Apparatus including a dual camera module and method of using the same
US7619683B2 (en) 2003-08-29 2009-11-17 Aptina Imaging Corporation Apparatus including a dual camera module and method of using the same
JP2005099265A (en) 2003-09-24 2005-04-14 Fujinon Corp Imaging apparatus, imaging method, and range finding method
EP1536633A1 (en) 2003-11-27 2005-06-01 Sony Corporation Photographing apparatus and method, supervising system, program and recording medium
US20050157184A1 (en) 2004-01-21 2005-07-21 Konica Minolta Photo Imaging, Inc. Image capturing apparatus
US20050200718A1 (en) 2004-03-10 2005-09-15 Samsung Electronics Co., Ltd. Image photographing apparatus and method
US20060056056A1 (en) 2004-07-19 2006-03-16 Grandeye Ltd. Automatically expanding the zoom capability of a wide-angle video camera
EP1780567A1 (en) 2004-07-20 2007-05-02 Five Dimension Co., Ltd. Electronic imaging device
US20100060746A9 (en) 2004-08-25 2010-03-11 Richard Ian Olsen Simultaneous multiple field of view digital cameras
US7199348B2 (en) 2004-08-25 2007-04-03 Newport Imaging Corporation Apparatus for multiple camera devices and method of operating same
US20060054782A1 (en) 2004-08-25 2006-03-16 Olsen Richard I Apparatus for multiple camera devices and method of operating same
US20060102907A1 (en) 2004-11-17 2006-05-18 Samsung Electronics Co., Ltd. Thin film transistor array panel and method for manufacturing the same
US7880776B2 (en) 2004-12-10 2011-02-01 Ambarella, Inc. High resolution zoom: a novel digital zoom for digital video camera
US20060125937A1 (en) 2004-12-10 2006-06-15 Ambarella, Inc. High resolution zoom: a novel digital zoom for digital video camera
US8154610B2 (en) 2004-12-30 2012-04-10 Intellectual Ventures Ii Llc Image sensor with built-in ISP and dual camera system
US20060170793A1 (en) 2005-02-03 2006-08-03 Eastman Kodak Company Digital imaging system with digital zoom warning
US20060175549A1 (en) 2005-02-09 2006-08-10 Miller John L High and low resolution camera systems and methods
US20060187310A1 (en) 2005-02-18 2006-08-24 Janson Wilbert F Jr Digital camera using an express zooming mode to provide expedited operation over an extended zoom range
US20060187322A1 (en) 2005-02-18 2006-08-24 Janson Wilbert F Jr Digital camera using multiple fixed focal length lenses and multiple image sensors to provide an extended zoom range
US20060187338A1 (en) 2005-02-18 2006-08-24 May Michael J Camera phone using multiple lenses and image sensors to provide an extended zoom range
US7206136B2 (en) 2005-02-18 2007-04-17 Eastman Kodak Company Digital camera using multiple lenses and image sensors to provide an extended zoom range
US7561191B2 (en) 2005-02-18 2009-07-14 Eastman Kodak Company Camera phone using multiple lenses and image sensors to provide an extended zoom range
US7305180B2 (en) 2005-02-18 2007-12-04 Eastman Kodak Company Digital camera using multiple lenses and image sensors to provide an extended zoom range
US7256944B2 (en) 2005-02-18 2007-08-14 Eastman Kodak Company Compact image capture assembly using multiple lenses and image sensors to provide an extended zoom range
JP2006238325A (en) 2005-02-28 2006-09-07 Canon Inc Camera system
US20070189386A1 (en) 2005-06-22 2007-08-16 Taro Imagawa Image generation apparatus and image generation method
US7809256B2 (en) 2005-07-27 2010-10-05 Sony Corporation Imaging lens device and imaging apparatus
US7424218B2 (en) 2005-07-28 2008-09-09 Microsoft Corporation Real-time preview for panoramic images
US20070024737A1 (en) 2005-08-01 2007-02-01 Hideo Nakamura Image capturing device having multiple optical systems
US7509041B2 (en) 2005-08-01 2009-03-24 Eastman Kodak Company Image-capturing device having multiple optical systems
US7964835B2 (en) 2005-08-25 2011-06-21 Protarius Filo Ag, L.L.C. Digital cameras with direct luminance and chrominance detection
US20070257184A1 (en) 2005-08-25 2007-11-08 Olsen Richard I Large dynamic range cameras
US20090252484A1 (en) 2005-11-14 2009-10-08 Nikon Corporation Image Blur Correction Device and Camera
US20070126911A1 (en) 2005-11-16 2007-06-07 Sony Corporation Image capture apparatus and zoom lens
US8238695B1 (en) 2005-12-15 2012-08-07 Grandeye, Ltd. Data reduction techniques for processing wide-angle video
US20070177025A1 (en) 2006-02-01 2007-08-02 Micron Technology, Inc. Method and apparatus minimizing die area and module size for a dual-camera mobile device
US20090219547A1 (en) 2006-02-06 2009-09-03 Petteri Kauhanen Method and Device for Position Sensing in an Imaging System
US7738016B2 (en) 2006-02-06 2010-06-15 Eastman Kodak Company Digital camera with dual optical systems
US20090122406A1 (en) 2006-02-06 2009-05-14 Jarkko Rouvinen Optical Image Stabilizer Using Gimballed Prism
US20070188653A1 (en) 2006-02-13 2007-08-16 Pollock David B Multi-lens array system and method
JP2007228006A (en) 2006-02-21 2007-09-06 Casio Comput Co Ltd Digital camera
US7773121B1 (en) 2006-05-03 2010-08-10 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration High-resolution, continuous field-of-view (FOV), non-rotating imaging system
JP2007306282A (en) 2006-05-11 2007-11-22 Citizen Electronics Co Ltd Camera module
US20070285550A1 (en) 2006-06-13 2007-12-13 Samsung Electronics Co. Ltd. Method and apparatus for taking images using mobile communication terminal with plurality of camera lenses
US20080017557A1 (en) 2006-07-19 2008-01-24 Witdouck Calvin J System and Method for Sorting Larvae Cocoons
US20080024614A1 (en) 2006-07-25 2008-01-31 Hsiang-Tsun Li Mobile device with dual digital camera sensors and methods of using the same
US20080025634A1 (en) 2006-07-27 2008-01-31 Eastman Kodak Company Producing an extended dynamic range digital image
US20080030611A1 (en) 2006-08-01 2008-02-07 Jenkins Michael V Dual Sensor Video Camera
US20080030592A1 (en) 2006-08-01 2008-02-07 Eastman Kodak Company Producing digital image with different resolution portions
JP2008076485A (en) 2006-09-19 2008-04-03 Konica Minolta Opto Inc Lens barrel and imaging apparatus
US20080084484A1 (en) 2006-10-10 2008-04-10 Nikon Corporation Camera
US20080117316A1 (en) 2006-11-22 2008-05-22 Fujifilm Corporation Multi-eye image pickup device
US7533819B2 (en) 2007-01-31 2009-05-19 Symbol Technologies, Inc. Dual camera assembly for an imaging-based bar code reader
US7978239B2 (en) 2007-03-01 2011-07-12 Eastman Kodak Company Digital camera using multiple image sensors to provide improved temporal sampling
US20080219654A1 (en) 2007-03-09 2008-09-11 Border John N Camera using multiple lenses and image sensors to provide improved focusing capability
US20080218611A1 (en) 2007-03-09 2008-09-11 Parulski Kenneth A Method and apparatus for operating a dual lens camera to augment an image
US20080218613A1 (en) 2007-03-09 2008-09-11 Janson Wilbert F Camera using multiple lenses and image sensors operable in a default imaging mode
US20080218612A1 (en) 2007-03-09 2008-09-11 Border John N Camera using multiple lenses and image sensors in a rangefinder configuration to provide a range map
US7676146B2 (en) 2007-03-09 2010-03-09 Eastman Kodak Company Camera using multiple lenses and image sensors to provide improved focusing capability
US20100283842A1 (en) 2007-04-19 2010-11-11 Dvp Technologies Ltd. Imaging system and method for use in monitoring a field of regard
US7918398B2 (en) 2007-06-04 2011-04-05 Hand Held Products, Inc. Indicia reading terminal having multiple setting imaging lens
US8390729B2 (en) 2007-09-05 2013-03-05 International Business Machines Corporation Method and apparatus for providing a video image having multiple focal lengths
US20090086074A1 (en) 2007-09-27 2009-04-02 Omnivision Technologies, Inc. Dual mode camera solution apparatus, system, and method
US20090109556A1 (en) 2007-10-31 2009-04-30 Sony Corporation Lens barrel and imaging apparatus
US20090122195A1 (en) 2007-11-09 2009-05-14 Van Baar Jeroen System and Method for Combining Image Sequences
US20090128644A1 (en) 2007-11-15 2009-05-21 Camp Jr William O System and method for generating a photograph
US9025073B2 (en) 2007-12-04 2015-05-05 Nan Chang O-Film Optoelectronics Technology Ltd Compact camera optics
KR20090058229A (en) 2007-12-04 2009-06-09 삼성전기주식회사 Dual camera module
WO2009097552A1 (en) 2008-02-01 2009-08-06 Omnivision Cdm Optics, Inc. Image data fusion systems and methods
US20110064327A1 (en) * 2008-02-01 2011-03-17 Dagher Joseph C Image Data Fusion Systems And Methods
US8115825B2 (en) 2008-02-20 2012-02-14 Apple Inc. Electronic device with two image sensors
CN101276415A (en) 2008-03-03 2008-10-01 北京航空航天大学 Apparatus and method for realizing multi-resolutions image acquisition with multi-focusing video camera
US9618748B2 (en) 2008-04-02 2017-04-11 Esight Corp. Apparatus and method for a dynamic “region of interest” in a display system
US20110121421A1 (en) 2008-05-09 2011-05-26 Ecole Polytechnique Fédérale de Lausanne (EPFL) Image sensor having nonlinear response
US20110080487A1 (en) 2008-05-20 2011-04-07 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
US20090295949A1 (en) 2008-05-28 2009-12-03 Valtion Teknillinen Tutkimuskeskus Zoom camera arrangement comprising multiple sub-cameras
US20090324135A1 (en) 2008-06-27 2009-12-31 Sony Corporation Image processing apparatus, image processing method, program and recording medium
US20100013906A1 (en) 2008-07-17 2010-01-21 Border John N Zoom by multiple image capture
KR20100008936A (en) 2008-07-17 2010-01-27 삼성전자주식회사 Portable terminal having dual camera and photographing method using the same
KR101477178B1 (en) 2008-07-17 2014-12-29 삼성전자주식회사 Portable terminal having dual camera and photographing method using the same
US20110229054A1 (en) 2008-07-23 2011-09-22 Snell Limited Processing of images to represent a transition in viewpoint
US20100020221A1 (en) 2008-07-24 2010-01-28 David John Tupman Camera Interface in a Portable Handheld Electronic Device
US20110164172A1 (en) 2008-09-10 2011-07-07 Panasonic Corporation Camera body and imaging device
US20100097444A1 (en) 2008-10-16 2010-04-22 Peter Lablans Camera System for Creating an Image From a Plurality of Images
US20100103194A1 (en) 2008-10-27 2010-04-29 Huawei Technologies Co., Ltd. Method and system for fusing images
US8587691B2 (en) 2008-11-28 2013-11-19 Samsung Electronics Co., Ltd. Photographing apparatus and method for dynamic range adjustment and stereography
US20100165131A1 (en) 2008-12-25 2010-07-01 Fujifilm Corporation Image stabilizer and optical instrument therewith
US8149327B2 (en) 2009-03-13 2012-04-03 Hon Hai Precision Industry Co., Ltd. Camera module with dual lens modules and image sensors
US20100238327A1 (en) 2009-03-19 2010-09-23 Griffith John D Dual Sensor Camera
US8542287B2 (en) 2009-03-19 2013-09-24 Digitaloptics Corporation Dual sensor camera
US20120026366A1 (en) 2009-04-07 2012-02-02 Nextvision Stabilized Systems Ltd. Continuous electronic zoom for an imaging system with multiple imaging devices having different fixed FOV
WO2010122841A1 (en) 2009-04-22 2010-10-28 Konica Minolta Opto, Inc. Mirror-lens barrel, image pickup device and method for manufacturing a mirror-lens barrel
US8553106B2 (en) 2009-05-04 2013-10-08 Digitaloptics Corporation Dual lens digital zoom
US20100277619A1 (en) 2009-05-04 2010-11-04 Lawrence Scarff Dual Lens Digital Zoom
US8439265B2 (en) 2009-06-16 2013-05-14 Intel Corporation Camera applications in a handheld device
US20100321494A1 (en) 2009-06-18 2010-12-23 Theia Technologies, Llc Compact dome camera
US9215385B2 (en) 2009-06-22 2015-12-15 Omnivision Technologies, Inc. System and method for an image sensor operable in multiple video standards
US8179457B2 (en) 2009-06-23 2012-05-15 Nokia Corporation Gradient color filters for sub-diffraction limit sensors
US8134115B2 (en) 2009-06-23 2012-03-13 Nokia Corporation Color filters for sub-diffraction limit-sized light sensors
US8094208B2 (en) * 2009-07-17 2012-01-10 The Invention Science Fund I, LLC Color filters and demosaicing techniques for digital imaging
US20120154614A1 (en) 2009-08-21 2012-06-21 Akihiro Moriya Camera-shake correction device
US20110058320A1 (en) 2009-09-09 2011-03-10 Lg Electronics Inc. Mobile terminal
US20110216228A1 (en) * 2009-09-14 2011-09-08 Fujifilm Corporation Solid-state image sensing element, method for driving solid-state image sensing element and image pickup device
US8391697B2 (en) 2009-09-30 2013-03-05 Lg Electronics Inc. Mobile terminal and method of controlling the operation of the mobile terminal
JP2011085666A (en) 2009-10-13 2011-04-28 Tdk Taiwan Corp Lens driving device
US8514491B2 (en) 2009-11-20 2013-08-20 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
US8400555B1 (en) 2009-12-01 2013-03-19 Adobe Systems Incorporated Focused plenoptic camera employing microlenses with different focal lengths
US20110128288A1 (en) 2009-12-02 2011-06-02 David Petrou Region of Interest Selector for Visual Queries
US20150242994A1 (en) 2010-01-28 2015-08-27 Pathway Innovations And Technologies, Inc. Method and system for accelerating video preview digital camera
US8483452B2 (en) 2010-03-09 2013-07-09 Sony Corporation Image processing apparatus, image processing method, and program
US20110234881A1 (en) 2010-03-25 2011-09-29 Fujifilm Corporation Display apparatus
US20110234853A1 (en) 2010-03-26 2011-09-29 Fujifilm Corporation Imaging apparatus and display apparatus
US20110242286A1 (en) 2010-03-31 2011-10-06 Vincent Pace Stereoscopic Camera With Automatic Obstruction Removal
US20110242355A1 (en) 2010-04-05 2011-10-06 Qualcomm Incorporated Combining data from multiple image sensors
US8547389B2 (en) 2010-04-05 2013-10-01 Microsoft Corporation Capturing image structure detail from a first image and color from a second image
US8446484B2 (en) 2010-04-21 2013-05-21 Nokia Corporation Image processing architecture with pre-scaler
US9369621B2 (en) 2010-05-03 2016-06-14 Invisage Technologies, Inc. Devices and methods for high-resolution image and video capture
US20130250150A1 (en) 2010-05-03 2013-09-26 Michael R. Malone Devices and methods for high-resolution image and video capture
US20110285730A1 (en) 2010-05-21 2011-11-24 Jimmy Kwok Lap Lai Controlling Display Updates For Electro-Optic Displays
US20110298966A1 (en) 2010-05-21 2011-12-08 Jena Optronik Gmbh Camera having multiple focal lengths
US20110292258A1 (en) 2010-05-28 2011-12-01 C2Cure, Inc. Two sensor imaging systems
US20130113894A1 (en) 2010-07-13 2013-05-09 Ram Srikanth Mirlay Variable 3-d camera assembly for still photography
US8896655B2 (en) 2010-08-31 2014-11-25 Cisco Technology, Inc. System and method for providing depth adaptive video conferencing
US20120062780A1 (en) 2010-09-15 2012-03-15 Morihisa Taijiro Imaging apparatus and image capturing method
US20120069235A1 (en) 2010-09-20 2012-03-22 Canon Kabushiki Kaisha Image capture with focus adjustment
US20120075489A1 (en) 2010-09-24 2012-03-29 Nishihara H Keith Zoom camera image blending technique
US20120081566A1 (en) * 2010-09-30 2012-04-05 Apple Inc. Flash synchronization using image sensor interface timing signal
US9413984B2 (en) 2010-10-24 2016-08-09 Linx Computational Imaging Ltd. Luminance source selection in a multi-lens camera
US20140192238A1 (en) 2010-10-24 2014-07-10 Linx Computational Imaging Ltd. System and Method for Imaging and Image Processing
US9578257B2 (en) 2010-10-24 2017-02-21 Linx Computational Imaging Ltd. Geometrically distorted luminance in a multi-lens camera
US9681057B2 (en) 2010-10-24 2017-06-13 Linx Computational Imaging Ltd. Exposure timing manipulation in a multi-lens camera
US9025077B2 (en) 2010-10-24 2015-05-05 Linx Computational Imaging Ltd. Geometrically distorted luminance in a multi-lens camera
US20120105579A1 (en) 2010-11-01 2012-05-03 Lg Electronics Inc. Mobile terminal and method of controlling an image photographing therein
US9041835B2 (en) 2010-11-10 2015-05-26 Canon Kabushiki Kaisha Selective combining of image data
US9900522B2 (en) 2010-12-01 2018-02-20 Magna Electronics Inc. System and method of establishing a multi-camera image using pixel remapping
US20130135445A1 (en) 2010-12-27 2013-05-30 3Dmedia Corporation Primary and auxiliary image capture devices for image processing and related methods
US8274552B2 (en) 2010-12-27 2012-09-25 3Dmedia Corporation Primary and auxiliary image capture devices for image processing and related methods
US20140049615A1 (en) 2010-12-28 2014-02-20 Sony Corporation Lens protection device, lens unit and image capture device
US8803990B2 (en) 2011-01-25 2014-08-12 Aptina Imaging Corporation Imaging system with multiple sensors for producing high-dynamic-range images
US20120196648A1 (en) 2011-01-31 2012-08-02 Havens William H Apparatus, system, and method of use of imaging assembly on mobile terminal
US8976255B2 (en) 2011-02-28 2015-03-10 Olympus Imaging Corp. Imaging apparatus
US20120229663A1 (en) 2011-03-08 2012-09-13 Spectral Instruments Imaging , Llc Imaging system having primary and auxiliary camera systems
US9019387B2 (en) 2011-03-18 2015-04-28 Ricoh Company, Ltd. Imaging device and method of obtaining image
US20120249815A1 (en) 2011-03-29 2012-10-04 Microsoft Corporation Folded imaging path camera
CN102739949A (en) 2011-04-01 2012-10-17 张可伦 Control method for multi-lens camera and multi-lens device
EP2523450A1 (en) 2011-05-10 2012-11-14 HTC Corporation Handheld electronic device with dual image capturing method and computer program product
US20120287315A1 (en) 2011-05-10 2012-11-15 Htc Corporation Handheld Electronic Device, Dual Image Capturing Method Applying for Thereof, and Computer Program Production for Load into Thereof
US20120320467A1 (en) 2011-06-14 2012-12-20 Samsung Electro-Mechanics Co., Ltd. Image photographing device
US20130002928A1 (en) 2011-06-28 2013-01-03 Canon Kabushiki Kaisha Adjustment of imaging properties for an imaging assembly having light-field optics
US20130016427A1 (en) 2011-07-15 2013-01-17 Mitsumi Electric Co., Ltd Lens holder driving device capable of avoiding deleterious effect on hall elements
US9270875B2 (en) 2011-07-20 2016-02-23 Broadcom Corporation Dual image capture processing
US20130076922A1 (en) 2011-07-28 2013-03-28 Canon Kabushiki Kaisha Correcting optical device and image pickup apparatus
US20130093842A1 (en) 2011-10-12 2013-04-18 Canon Kabushiki Kaisha Image-capturing device
US20160212358A1 (en) 2011-11-14 2016-07-21 Sony Corporation Information processing apparatus, method, and non-transitory computer-readable medium
JP2013106289A (en) 2011-11-16 2013-05-30 Konica Minolta Advanced Layers Inc Imaging apparatus
US20130136355A1 (en) * 2011-11-29 2013-05-30 Microsoft Corporation Automatic Estimation and Correction of Vignetting
US8660420B2 (en) 2011-12-13 2014-02-25 Hon Hai Precision Industry Co., Ltd. Adjustable dual lens camera
US8619148B1 (en) 2012-01-04 2013-12-31 Audience, Inc. Image correction after combining images from multiple cameras
US20130182150A1 (en) 2012-01-12 2013-07-18 Olympus Corporation Image Pickup Apparatus
US20130201360A1 (en) 2012-02-03 2013-08-08 Samsung Electronics Co., Ltd. Method of changing an operation mode of a camera image sensor
US20130202273A1 (en) 2012-02-07 2013-08-08 Canon Kabushiki Kaisha Method and device for transitioning between an image of a first video sequence and an image of a second video sequence
US20130235224A1 (en) 2012-03-09 2013-09-12 Minwoo Park Video camera providing a composite video sequence
US20130258044A1 (en) 2012-03-30 2013-10-03 Zetta Research And Development Llc - Forc Series Multi-lens camera
US20130270419A1 (en) 2012-04-12 2013-10-17 Digitaloptics Corporation Compact Camera Module
US20130278785A1 (en) 2012-04-20 2013-10-24 Hoya Corporation Imaging apparatus
US9420180B2 (en) 2012-05-22 2016-08-16 Zte Corporation Method and device for switching between double cameras
US20130321668A1 (en) 2012-05-30 2013-12-05 Ajith Kamath Plural Focal-Plane Imaging
US20150162048A1 (en) 2012-06-11 2015-06-11 Sony Computer Entertainment Inc. Image generation device and image generation method
US20150138381A1 (en) 2012-06-29 2015-05-21 Lg Innotek Co., Ltd. Camera module
US20140009631A1 (en) 2012-07-06 2014-01-09 Apple Inc. Vcm ois actuator module
US20150195458A1 (en) 2012-07-12 2015-07-09 Sony Corporation Image shake correction device and image shake correction method and image pickup device
KR20140014787A (en) 2012-07-26 2014-02-06 엘지이노텍 주식회사 Camera module
US20160295112A1 (en) 2012-10-19 2016-10-06 Qualcomm Incorporated Multi-camera system using folded optics
US20140118584A1 (en) 2012-10-31 2014-05-01 Jess Jan Young Lee Devices, methods, and systems for expanded-field-of-view image and video capture
WO2014072818A2 (en) 2012-11-08 2014-05-15 Dynaoptics Pte Ltd. Miniature optical zoom lens
US20140362242A1 (en) 2012-11-16 2014-12-11 Panasonic Intellectual Property Corporation Of America Camera drive device
CN103024272A (en) 2012-12-14 2013-04-03 广东欧珀移动通信有限公司 Double camera control device, method and system of mobile terminal and mobile terminal
US20140192253A1 (en) 2013-01-05 2014-07-10 Tinz Optics, Inc. Methods and apparatus for capturing and/or processing images
US20140313316A1 (en) 2013-01-30 2014-10-23 SeeScan, Inc. Adjustable variable resolution inspection systems and methods using multiple image sensors
US20140218587A1 (en) 2013-02-07 2014-08-07 Motorola Mobility Llc Double sided camera module
US9413930B2 (en) 2013-03-14 2016-08-09 Joergen Geerds Camera system
US9851803B2 (en) 2013-03-15 2017-12-26 Eyecam, LLC Autonomous computing and telecommunications head-up displays glasses
US9723220B2 (en) 2013-05-13 2017-08-01 Canon Kabushiki Kaisha Imaging apparatus, control method, and program
US9438792B2 (en) 2013-05-17 2016-09-06 Canon Kabushiki Kaisha Image-processing apparatus and image-processing method for generating a virtual angle of view
US20160154202A1 (en) 2013-05-27 2016-06-02 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Optical structure with ridges arranged at the same and method for producing the same
US9185291B1 (en) 2013-06-13 2015-11-10 Corephotonics Ltd. Dual aperture zoom digital camera
US20150002683A1 (en) 2013-07-01 2015-01-01 Tdk Taiwan Corp. Optical Anti-Shake Apparatus with Switchable Light Path
US9137447B2 (en) 2013-07-31 2015-09-15 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus that generates an image including an emphasized in-focus part of a captured image
US20150042870A1 (en) 2013-08-08 2015-02-12 Apple Inc. Mirror tilt actuation
US20150070781A1 (en) 2013-09-12 2015-03-12 Hong Kong Applied Science and Technology Research Institute, Co. Multi-lens imaging module and actuator with auto-focus adjustment
US20160241751A1 (en) 2013-09-23 2016-08-18 Lg Innotek Co., Ltd. Camera Module and Manufacturing Method for Same
US20150092066A1 (en) 2013-09-30 2015-04-02 Google Inc. Using a Second Camera to Adjust Settings of First Camera
US9736365B2 (en) 2013-10-26 2017-08-15 Light Labs Inc. Zoom related methods and apparatus
US9344626B2 (en) 2013-11-18 2016-05-17 Apple Inc. Modeless video and still frame capture using interleaved frames of video and still resolutions
US20150154776A1 (en) 2013-12-03 2015-06-04 Huawei Technologies Co., Ltd. Image splicing method and apparatus
US9215377B2 (en) 2013-12-04 2015-12-15 Nokia Technologies Oy Digital zoom with sensor mode change
US9736391B2 (en) 2013-12-06 2017-08-15 Huawei Device Co., Ltd. Photographing method of dual-lens device, and dual-lens device
US20170214866A1 (en) 2013-12-06 2017-07-27 Huawei Device Co., Ltd. Image Generating Method and Dual-Lens Device
US9894287B2 (en) 2013-12-06 2018-02-13 Huawei Device (Dongguan) Co., Ltd. Method and apparatus for acquiring a high dynamic image using multiple cameras
US20160301840A1 (en) 2013-12-06 2016-10-13 Huawei Device Co., Ltd. Photographing Method for Dual-Lens Device and Dual-Lens Device
US20150215516A1 (en) 2014-01-27 2015-07-30 Raytheon Company Imaging system and methods with variable lateral magnification
US20150237280A1 (en) 2014-02-19 2015-08-20 Samsung Electronics Co., Ltd. Image processing device with multiple image signal processors and image processing method
US20150253543A1 (en) 2014-03-07 2015-09-10 Apple Inc. Folded telephoto camera lens system
US20150253647A1 (en) 2014-03-07 2015-09-10 Apple Inc. Folded camera lens systems
CN103841404A (en) 2014-03-18 2014-06-04 江西省一元数码科技有限公司 Novel three-dimensional image shooting module
US20150271471A1 (en) 2014-03-19 2015-09-24 Htc Corporation Blocking detection method for camera and electronic apparatus with cameras
US20150286033A1 (en) 2014-04-04 2015-10-08 Qualcomm Incorporated Auto-focus in low-profile folded optics multi-camera system
US20160353008A1 (en) 2014-04-04 2016-12-01 Qualcomm Incorporated Auto-focus in low-profile folded optics multi-camera system
KR20150118012A (en) 2014-04-11 2015-10-21 삼성전기주식회사 Camera module
US20150316744A1 (en) 2014-04-30 2015-11-05 Lite-On Electronics (Guangzhou) Limited Voice coil motor array module
US20170019616A1 (en) 2014-05-15 2017-01-19 Huawei Technologies Co., Ltd. Multi-frame noise reduction method, and terminal
US20150334309A1 (en) 2014-05-16 2015-11-19 Htc Corporation Handheld electronic apparatus, image capturing apparatus and image capturing method thereof
US9360671B1 (en) 2014-06-09 2016-06-07 Google Inc. Systems and methods for image zoom
US20160044250A1 (en) 2014-08-10 2016-02-11 Corephotonics Ltd. Zoom dual-aperture camera with folded lens
US20160291295A1 (en) 2014-08-10 2016-10-06 Corephotonics Ltd. Zoom dual-aperture camera with folded lens
US20160070088A1 (en) 2014-09-10 2016-03-10 Hoya Corporation Imaging apparatus having bending optical element
US20170214846A1 (en) 2014-09-30 2017-07-27 Huawei Technologies Co., Ltd. Auto-Focus Method and Apparatus and Electronic Device
US20170242225A1 (en) 2014-11-19 2017-08-24 Orlo James Fiske Thin optical system and camera
US9768310B2 (en) 2014-11-25 2017-09-19 Samsung Display Co., Ltd. Thin film transistor, organic light-emitting diode display including the same, and manufacturing method thereof
US20160154204A1 (en) 2014-11-28 2016-06-02 Samsung Electro-Mechanics Co., Ltd. Camera module
US9286680B1 (en) 2014-12-23 2016-03-15 Futurewei Technologies, Inc. Computational multi-camera adjustment for smooth view switching and zooming
US9800798B2 (en) 2015-02-13 2017-10-24 Qualcomm Incorporated Systems and methods for power optimization for imaging devices with dual cameras
US9927600B2 (en) 2015-04-16 2018-03-27 Corephotonics Ltd Method and system for providing auto focus and optical image stabilization in a compact folded camera
US20180024329A1 (en) 2015-04-16 2018-01-25 Corephotonics Ltd. Auto focus and optical image stabilization in a compact folded camera
US9485432B1 (en) 2015-04-29 2016-11-01 Uurmi Systems Private Limited Methods, systems and apparatuses for dual-camera based zooming
US20160353012A1 (en) 2015-05-25 2016-12-01 Htc Corporation Zooming control method for camera and electronic apparatus with camera
US20180120674A1 (en) 2015-06-24 2018-05-03 Corephotonics Ltd. Low profile tri-axis actuator for folded lens camera
US20180150973A1 (en) 2015-07-15 2018-05-31 Huawei Technologies Co., Ltd. Method and Apparatus for Calculating Dual-Camera Relative Position, and Device
WO2017025822A1 (en) 2015-08-13 2017-02-16 Corephotonics Ltd. Dual aperture zoom camera with video support and switching / non-switching dynamic control
WO2017037688A1 (en) 2015-09-06 2017-03-09 Corephotonics Ltd. Auto focus and optical image stabilization with roll compensation in a compact folded camera
US20170187962A1 (en) 2015-12-23 2017-06-29 Samsung Electronics Co., Ltd. Imaging device module, user terminal apparatus including the imaging device module, and a method of operating the imaging device module
US20170289458A1 (en) 2016-03-31 2017-10-05 Lg Electronics Inc. Mobile terminal and method for controlling the same
US20180017844A1 (en) 2016-07-12 2018-01-18 Tdk Taiwan Corp. Lens driving module
US20180059379A1 (en) 2016-08-26 2018-03-01 Largan Precision Co., Ltd. Optical path folding element, imaging lens module and electronic device
WO2018130898A1 (en) 2017-01-12 2018-07-19 Corephotonics Ltd. Compact folded camera
US20180241922A1 (en) 2017-02-23 2018-08-23 Qualcomm Incorporated Adjustment for cameras for low power mode operation
US20180295292A1 (en) 2017-04-10 2018-10-11 Samsung Electronics Co., Ltd Method and electronic device for focus control

Non-Patent Citations (17)

* Cited by examiner, † Cited by third party
Title
A 3MPixel Multi-Aperture Image Sensor with 0.7 μm Pixels in 0.11 μm CMOS, Fife et al., Stanford University, 2008, 3 pages.
Compact multi-aperture imaging with high angular resolution, Santacana et al., Publisher: Optical Society of America, 2015, 10 pages.
Defocus Video Matting, McGuire et al., Publisher: ACM SIGGRAPH, Jul. 31, 2005, 11 pages.
Dual camera intelligent sensor for high definition 360 degrees surveillance, Scotti et al., Publisher: IET, May 9, 2000, 8 pages.
Dual-Camera System for Multi-Level Activity Recognition, Bodor et al., Publisher: IEEE, Oct. 2014, 6 pages.
Dual-sensor foveated imaging system, Hua et al., Publisher: Optical Society of America, Jan. 14, 2008, 11 pages.
Engineered to the task: Why camera-phone cameras are different, Giles Humpston, Publisher: Solid State Technology, Jun. 2009, 3 pages.
High Performance Imaging Using Large Camera Arrays, Wilburn et al., Publisher: Association for Computing Machinery, Inc., 2005, 12 pages.
International Search Report and Written Opinion issued in related PCT patent application PCT/IB2013/060356, dated Apr. 17, 2014, 15 pages.
Multi-Aperture Photography, Green et al., Publisher: Mitsubishi Electric Research Laboratories, Inc., Jul. 2007, 10 pages.
Multispectral Bilateral Video Fusion, Bennett et al., Publisher: IEEE, May 2007, 10 pages.
Optical Splitting Trees for High-Precision Monocular Imaging, McGuire et al., Publisher: IEEE, 2007, 11 pages.
Real-time Edge-Aware Image Processing with the Bilateral Grid, Chen et al., Publisher: ACM SIGGRAPH, 2007, 9 pages.
Statistical Modeling and Performance Characterization of a Real-Time Dual Camera Surveillance System, Greiffenhagen et al., Publisher: IEEE, 2000, 8 pages.
Superimposed multi-resolution imaging, Carles et al., Publisher: Optical Society of America, 2017, 13 pages.
Super-resolution imaging using a camera array, Santacana et al., Publisher: Optical Society of America, 2014, 6 pages.
Viewfinder Alignment, Adams et al., Publisher: EUROGRAPHICS, 2008, 10 pages.

Also Published As

Publication number Publication date
US20170094164A1 (en) 2017-03-30
IL260978B1 (en) 2024-06-01
IL315343A (en) 2024-10-01
CN112911252A (en) 2021-06-04
US9876952B2 (en) 2018-01-23
USRE48697E1 (en) 2021-08-17
IL260978B2 (en) 2024-10-01
USRE48945E1 (en) 2022-02-22
IL238900B (en) 2018-08-30
CN105556944B (en) 2019-03-08
CN116405747A (en) 2023-07-07
CN105556944A (en) 2016-05-04
CN109963059A (en) 2019-07-02
CN112911252B (en) 2023-07-04
US20150085174A1 (en) 2015-03-26
WO2014083489A1 (en) 2014-06-05
US9538152B2 (en) 2017-01-03
IL312771B1 (en) 2024-10-01
USRE49256E1 (en) 2022-10-18
USRE48477E1 (en) 2021-03-16
CN113259565B (en) 2023-05-19
CN109963059B (en) 2021-07-27
US20170160135A1 (en) 2017-06-08
US20170016768A1 (en) 2017-01-19
CN113472989A (en) 2021-10-01
IL260978A (en) 2019-01-31
CN113259565A (en) 2021-08-13
US20180160040A1 (en) 2018-06-07
US9927300B2 (en) 2018-03-27
US9581496B2 (en) 2017-02-28
IL312771A (en) 2024-07-01

Similar Documents

Publication Publication Date Title
USRE48444E1 (en) High resolution thin multi-aperture imaging systems
US10091405B2 (en) Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
CN108141571B (en) Maskless phase detection autofocus
JP5151075B2 (en) Image processing apparatus, image processing method, imaging apparatus, and computer program
US8339483B2 (en) Image processing device, solid-state imaging device, and camera module
US9319585B1 (en) High resolution array camera
EP2728545B1 (en) Image processing method and device based on bayer format
JP5404376B2 (en) Camera module and image processing apparatus
US20110141321A1 (en) Method and apparatus for transforming a lens-distorted image to a perspective image in bayer space
KR102619738B1 (en) Signal processing devices and imaging devices
US10616493B2 (en) Multi camera system for zoom
JP4962293B2 (en) Image processing apparatus, image processing method, and program
WO2019167571A1 (en) Image processing device and image processing method
CN114080795A (en) Image sensor and electronic device

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PTGR); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP Fee payment procedure

Free format text: PETITION RELATED TO MAINTENANCE FEES FILED (ORIGINAL EVENT CODE: PMFP); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP Fee payment procedure

Free format text: PETITION RELATED TO MAINTENANCE FEES DISMISSED (ORIGINAL EVENT CODE: PMFS); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP Fee payment procedure

Free format text: PETITION RELATED TO MAINTENANCE FEES FILED (ORIGINAL EVENT CODE: PMFP); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY