
US20040249291A1 - Image display apparatus, image display method, and computer program - Google Patents

Image display apparatus, image display method, and computer program

Info

Publication number
US20040249291A1
US20040249291A1 US10/830,790 US83079004A US2004249291A1 US 20040249291 A1 US20040249291 A1 US 20040249291A1 US 83079004 A US83079004 A US 83079004A US 2004249291 A1 US2004249291 A1 US 2004249291A1
Authority
US
United States
Prior art keywords
image
unit
color information
input
scale
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/830,790
Inventor
Takemitsu Honda
Tetsuo Minai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Assigned to OLYMPUS CORPORATION reassignment OLYMPUS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HONDA, TAKEMITSU, MINAI, TETSUO
Publication of US20040249291A1 publication Critical patent/US20040249291A1/en
Priority to US11/493,722 priority Critical patent/US20060270783A1/en
Priority to US13/229,309 priority patent/US8620044B2/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00004Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/0002Operational features of endoscopes provided with data storages
    • A61B1/00022Operational features of endoscopes provided with data storages removable
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00039Operational features of endoscopes provided with input arrangements for the user
    • A61B1/0004Operational features of endoscopes provided with input arrangements for the user for electronic operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00043Operational features of endoscopes provided with output arrangements
    • A61B1/00045Display arrangement
    • A61B1/0005Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/041Capsule endoscopes for imaging
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6804Garments; Clothes
    • A61B5/6805Vests
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00011Operational features of endoscopes characterised by signal transmission
    • A61B1/00016Operational features of endoscopes characterised by signal transmission using wireless means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/07Endoradiosondes
    • A61B5/073Intestinal transmitters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10068Endoscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30028Colon; Small intestine
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L1/00Arrangements for detecting or preventing errors in the information received
    • H04L1/004Arrangements for detecting or preventing errors in the information received by using forward error control

Definitions

  • the present invention relates to an image display apparatus, an image display method, and an image display program.
  • swallowable capsule endoscopes have been produced as a type of endoscopes.
  • the capsule endoscopes are provided with an imaging capability and a radio capability.
  • a capsule endoscope is configured to sequentially take images of organs such as the stomach and the small intestine within an observation period from the time it has been swallowed through the mouth of a patient for observation (examination) to its natural excretion from the human body (see Japanese Patent Application Laid-open No. H11-225996 Publication);
  • image data taken in a body by the capsule endoscope is sequentially transmitted outside through radio communication and is stored in a memory. Since a patient carries around a receiver having a radio communication capability and a memory capability, the patient can freely perform normal actions during the observation period from swallowing of the capsule endoscope to its excretion. After observation, a doctor or a nurse can display the images of organs on a display based on the image data stored in the memory and use it to make a diagnosis.
  • the capsule endoscope described above takes images of each organ within a period from the time a subject swallows it to its natural excretion, meaning an extended period of observation (examination), for example, more than ten hours. Therefore, the number of images taken in time sequence is correspondingly huge.
  • the image display apparatus includes an input unit that inputs image data taken in time sequence by an in-vivo imaging device, a scale display control unit that controls to display a scale indicating an overall imaging period of input image data taken in time sequence and input by the input unit, a color information detecting unit that detects color information of a screen of the image data input by the input unit, a color display control unit that controls to display a color corresponding to the color information detected by the color information detecting unit at a time-corresponding position on the scale, an image display control unit that controls to display an image corresponding to the image data input by the input unit, an image designation unit that designates the image subjected to be displayed by the image display control unit, and an index display control unit that controls to display, on the scale, an index indicating a position corresponding to an imaging time of the image designated by the image designation unit.
  • the image display method includes inputting image data taken in time sequence by an in-vivo imaging device, displaying a scale indicating an overall imaging period of input image data taken in time sequence and input by the input unit, detecting color information of a screen of the image data input by the input unit, displaying a color corresponding to the color information detected by the color information detecting unit at a time-corresponding position on the scale, displaying an image corresponding to the image data input by the input unit, designating the image subjected to be displayed by the image display control unit, and displaying, on the scale, an index indicating a position corresponding to an imaging time of the image designated by the image designation unit.
  • the image display program according to still another aspect of the present invention realizes the method according to the above aspect on a computer.
  • FIG. 1 is a schematic of a capsule endoscope according to an embodiment of the present invention
  • FIG. 2 is a schematic of a capsule endoscope system according to the embodiment
  • FIG. 3 is a block diagram of an example of the capsule endoscope system according to the embodiment.
  • FIG. 4A and FIG. 4B are schematics of an example of screen transition (screen 1 and 2 ) associated with the observation procedures according to the embodiment;
  • FIG. 6A to FIG. 6C are schematics of an example of screen transition (screen for acquisition of data 1 to 3 ) associated with the observation procedures according to the embodiment;
  • FIG. 7 is a schematic of an example of screen transition associated with the diagnosis procedures according to the embodiment.
  • FIG. 8 is a schematic of an example of screen transition associated with the diagnosis procedures according to the embodiment.
  • FIG. 9 is a flowchart of the operation for average color bar display according to the embodiment.
  • FIG. 10 is a schematic of an example of a display screen associated with a diagnosis process according to a modification of the embodiment
  • FIG. 11 is a set of graphs illustrating the principle of automatic discrimination of organ names according to the modification of the embodiment.
  • FIG. 12 is a flowchart of the procedures of discriminating the organ names according to the modification of the embodiment.
  • FIG. 13 is a graph for illustrating an example of application of the modification shown in FIG. 11;
  • FIG. 14 is a schematic of an example of screen transition associated with the diagnosis procedures according to the embodiment.
  • FIG. 15 is a flowchart of an operation for displaying the imaging time of a designated image according to the embodiment.
  • FIG. 1 is a schematic of a capsule endoscope according to an embodiment of the present invention.
  • a capsule endoscope 10 includes an imaging unit 111 that can take the internal image of a celom, illumination units 112 a and 112 b that illuminate the interior of the celom, a power supply unit 13 that supplies them with power, and a capsule housing 14 that has at least the imaging unit 111 , the illumination units 112 and the power supply unit 13 disposed inside.
  • the capsule housing 14 includes a distal-end cover 120 that covers the imaging unit 111 and the illumination units 112 a, 112 b, and a capsule body 122 that is provided in a water-proof state with respect to the distal-end cover 120 via a seal member 121 and has the imaging unit 111 , etc. disposed therein.
  • a rear-end cover 123 may be provided as separate from the capsule body 122 as needed. Although the rear-end cover 123 is provided integrally with the capsule body and has a flat shape in the present embodiment, the shape is not limited and may be, for example, a dome shape.
  • the distal-end cover 120 may clearly separate an illumination window 120 a, that transmits illumination light L from the illumination unit 112 a, 112 b, and an imaging window 120 b, that performs imaging in the illumination range, from each other.
  • the entire distal-end cover 120 is transparent and the areas of the illumination window 120 a and the imaging window 120 b partly overlap each other.
  • the imaging unit 111 is provided on an imaging board 124 and includes a solid-state imaging device 125 formed of, for example, a CCD, which performs imaging in the range that is illuminated with the illumination light L from the illumination units 112 a, 112 b, and an image forming lens 126 that includes a fixed lens 126 a and a movable lens 126 b and forms the image of a subject on the solid-state imaging device 125 . Sharp image forming is executed by a focus adjusting unit 128 having a fixed frame 128 a that secures the fixed lens 126 a and a movable frame 128 b that secures the movable lens 126 b.
  • the imaging unit 111 is not limited to the CCD, but an imaging unit such as CMOS, may be used.
  • the illumination units 112 a, 112 b are provided on an illumination board 130 and are comprised of, for example, a light-emitting diode (LED), and a plurality of illumination units 112 a, 112 b (four in the present embodiment as one example) are laid out around the image forming lens 126 that constitutes the imaging unit 111 .
  • the illumination units 112 a, 112 b are not limited to the LED but other illumination units may be used as well.
  • the power supply unit 13 is provided on a power supply board 132 provided with an internal switch 131 and uses, for example, a button type battery as a power supply 133 . While a silver oxide cell, for example, is used as the battery in the present invention, the invention is not limited to it and may use a chargeable battery, a dynamo type battery or the like.
  • the present invention is not limited to this type and other switch units can be also exemplified.
  • a radio unit 142 comprising an antenna or the like for radio communication with outside is provided on a radio board 141 and communication with outside is carried out as needed.
  • a signal processing/control unit 143 for processing or controlling the individual units is provided on an imaging board 124 and executes various processes in the capsule endoscope 10 .
  • the signal processing/control unit 143 is comprised of a video signal processing function for image data generation, a transmission signal generating function that performs mixing of a video signal and a sync signal, affixing of an error correction code, etc., a modulation function that performs conversion to, for example, a PSK, MSK, GMSK, QMSK, ASK, AM, or FM system in cooperation with a modulator, a power supply control function that controls power supply with ON-OFF of a switch, driver circuits such as an LED driver circuit, a timing generator (TG) function that controls the number of imaging shots, and a memory function that stores various data, such as parameters for a line frame.
  • the signal processing/control unit 143 executes various signal processes/controls.
  • the video signal processing function performs processes, such as image data correction (e.g., white balance (WB) correction, ⁇ correction, color processing, correlation double sampling (CDS), and automatic gain control (AGC)), and analog-digital conversion (ADC) and an auto exposure function (AE), in addition to, for example, image data generation.
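The patent does not disclose the actual correction algorithms or parameters. Purely as an illustration, the Python sketch below applies per-channel white-balance gains and gamma correction to one RGB frame; the gain and gamma values are made up, not those used by the capsule endoscope's signal processing.

```python
import numpy as np

def correct_frame(frame_rgb, wb_gains=(1.0, 1.0, 1.0), gamma=2.2):
    """Apply illustrative white-balance gains and gamma correction.

    frame_rgb: (H, W, 3) uint8 image; wb_gains and gamma are hypothetical
    values used only to show the shape of such a correction step.
    """
    img = frame_rgb.astype(np.float32) / 255.0
    img = np.clip(img * np.asarray(wb_gains, dtype=np.float32), 0.0, 1.0)
    img = img ** (1.0 / gamma)                      # gamma correction
    return (img * 255.0 + 0.5).astype(np.uint8)

# Example: tone down the red channel of a synthetic reddish frame.
frame = np.full((8, 8, 3), (180, 90, 60), dtype=np.uint8)
corrected = correct_frame(frame, wb_gains=(0.9, 1.1, 1.2))
```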
  • besides the communication unit 142 , for example, information collecting units such as various sensors, a chemical releasing unit that releases chemicals, a tissue collecting unit that cuts tissues in a celom and collects them, etc. may be disposed in the capsule endoscope 10 as needed.
  • FIG. 2 is a schematic of a capsule endoscope system according to the embodiment. At the time of performing examination using the capsule endoscope 10 , the capsule endoscope system as shown in FIG. 2 is used.
  • the capsule endoscope system comprises the capsule endoscope 10 and its package 50 , a jacket 3 that a patient or a subject 2 wears, a receiver 4 attachable to/detachable from the jacket 3 , a work station 5 , a CF (compact flash (registered trademark)) memory reader/writer 6 , a label printer 7 , a database 8 , and a network 9 , as shown in FIG. 2, for example.
  • the jacket 3 is provided with antennas 31 , 32 , 33 , and 34 that catch radio waves of taken images to be sent from the radio unit 142 of the capsule endoscope 10 so that the jacket 3 can communicate with the receiver 4 wirelessly or by a cable.
  • the number of antennas is not particularly limited to four but should be plural, so that radio waves can be received properly according to the position to which the capsule endoscope 10 has moved.
  • the receiver 4 is provided with an antenna 41 that is used when directly receiving taken images through radio waves, a display unit 42 that displays information necessary for observation (examination) and an input unit 43 that inputs information necessary for observation (examination).
  • a CF memory 44 that stores received taken image data can be detachably attached to the receiver 4 .
  • the receiver 4 is provided with a power supply unit 45 capable of supplying power even at the time of portable usage and a signal processing/control unit 46 that performs processes needed for observation (examination).
  • a dry cell, a Li-ion secondary battery, a Ni-hydride battery, or the like can be exemplified as the power supply, and a rechargeable type may also be used.
  • The work station 5 has a processing function for a doctor or a nurse to perform a diagnosis based on images of organs or the like in a patient, taken by the capsule endoscope 10 .
  • This work station 5 has interfaces, though not shown, which connect to the receiver 4 , the CF memory reader/writer 6 , and the label printer 7 in a communicable manner and executes read/write of the CF memory 44 , chart printing, etc.
  • the work station 5 has a communication function for connecting to the network 9 and stores the diagnosis results of a patient into the database 8 via the network 9 . Further, the work station 5 has a display unit 51 , and receives taken image data of the inside of a patient from the receiver 4 and displays the images of organs or the like on the display unit 51 .
  • as the capsule endoscope 10 is taken out of the package 50 and is swallowed by the subject 2 through the mouth prior to the initiation of examination, it passes through the esophagus, moves inside the celom by peristalsis of the digestive tracts, and takes images inside the celom one after another.
  • the radio waves of taken images are output via the radio unit 142 as needed or for each imaging result and are caught by the antennas 31 , 32 , 33 , and 34 of the jacket 3 .
  • a signal from the antenna with the highest received radio wave intensity is sent to the receiver 4 outside.
  • taken image data received one after another is stored in the CF memory 44 .
  • the receiver 4 is not synchronized with the start of imaging of the capsule endoscope 10 and the initiation of reception and end of reception are controlled by manipulation of the input unit 43 .
  • the taken image data may be still picture data taken by plural frames per second for dynamic display or ordinary moving picture data.
  • the taken image data stored in the CF memory 44 is transferred to the work station 5 via a cable.
  • the work station 5 stores the transferred taken image data in association with individual patients.
  • the taken image data inside the celom taken by the capsule endoscope 10 and stored in the receiver 4 in this manner is displayed by the display unit 51 of the work station 5 . Accordingly, acquisition of effective data for physiological study and diagnosis of lesion can be carried out over the entire digestive tracts of a human body including the deep body portion (small intestine, etc.) that cannot be reached by an ultrasonic probe, endoscope, etc.
  • FIG. 3 is a block diagram of an example of the capsule endoscope system according to the embodiment. The description is given on only the essential structures of the individual units.
  • the capsule endoscope 10 has the structure to take the image of an internal target (organs, etc.) with the imaging unit 111 from reflection of light illuminated from the illumination units 112 a and 112 b and send the taken image from the radio unit 142 in the form of a radio signal.
  • the jacket 3 has a structure in which a selector 35 is connected to the four antennas 31 , 32 , 33 , and 34 , and an I/F 36 , to which a cable for connecting to the receiver 4 is attached, is connected to the selector 35 .
  • the jacket 3 receives radio signals sent from the capsule endoscope 10 at the four antennas 31 , 32 , 33 , and 34 , selects a received signal according to the radio wave intensity with the selector 35 , and transfers it to the receiver 4 via the I/F 36 .
  • the jacket 3 is not provided with a large-capacity memory and taken images received via the antennas 31 , 32 , 33 , and 34 are transferred one after another to the receiver 4 at the subsequent stage.
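As an illustration of selecting the signal from the antenna with the strongest received radio waves, a minimal Python sketch follows; the data layout, antenna identifiers, and RSSI figures are assumptions, not the receiver's actual interface.

```python
def select_antenna(readings):
    """Pick the payload from the antenna with the strongest signal.

    readings: list of (antenna_id, rssi_dbm, payload_bytes) tuples,
    one per antenna 31-34; all values here are illustrative.
    """
    antenna_id, _, payload = max(readings, key=lambda r: r[1])
    return antenna_id, payload

readings = [
    (31, -72.5, b"frame-a"),
    (32, -58.1, b"frame-a"),   # strongest signal -> forwarded to the receiver
    (33, -80.3, b"frame-a"),
    (34, -66.9, b"frame-a"),
]
best_antenna, frame = select_antenna(readings)
print(best_antenna)            # -> 32
```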
  • the receiver 4 has, as the internal structure, an I/F 45 for communication to the I/F 36 of the jacket 3 via a cable, a CPU 46 that controls the entire receiver 4 according to a program prepared beforehand, a CF memory I/F 47 that performs data communication with the attached CF memory 44 , and an I/F 48 that performs communication with the work station 5 by a cable.
  • the receiver 4 is always attached to the subject 2 during observation of the inside of a body by the capsule endoscope 10 . During observation, therefore, taken images are received one after another from the jacket 3 and the received images are stored in the CF memory 44 via the CF memory I/F 47 one after another. During observation, the receiver 4 is not connected to the work station 5 , so the subject 2 is not restricted to a hospital or the like and can move freely.
  • the CF memory reader/writer 6 has, as the internal structure, a CPU 61 that controls the entire reader/writer according to a program prepared beforehand, a CF memory I/F 62 that performs data communication with the attached CF memory 44 , and an I/F 63 that performs communication with the work station 5 by a cable.
  • the CF memory reader/writer 6 , with the CF memory 44 attached, is connected to the work station 5 via the I/F 63 ; it performs formatting of taken information for diagnosis according to the present embodiment with respect to the CF memory 44 , or reads stored taken image data from the CF memory 44 and transfers the data to the work station 5 .
  • the taken image data here is in the form of JPEG or the like.
  • the work station 5 has the display unit 51 that displays images of organs, etc. according to the present embodiment, an I/F 52 that manages communication with the I/F 48 of the receiver 4 via a cable and with the I/F 63 of the CF memory reader/writer 6 via a cable, a large-capacity memory 53 that stores data to be handled in various processes, a CPU 54 that controls the entire work station 5 according to a program prepared beforehand, an input unit 55 that inputs various kinds of operations, and an output unit 56 that is connected to the label printer 7 , the database 8 , or other printers over the network 9 for performing various kinds of output processes.
  • taken image data stored in the CF memory 44 is transferred from the receiver 4 to the work station 5 and stored in the memory 53 .
  • the display of an average color slider to be discussed later, the locus of the capsule endoscope 10 , etc. are displayed at the time of a diagnosis.
  • the diagnosis results are output as a chart from the printer and stored in the database 8 patient by patient.
  • FIG. 4A and FIG. 4B, FIG. 5A and FIG. 5B, and FIG. 6A to FIG. 6C are schematics of an example of screen transition associated with the observation procedures according to the present embodiment.
  • FIG. 7 and FIG. 8 are schematics of an example of screen transition associated with the diagnosis procedures according to the present embodiment.
  • FIG. 9 is a flowchart of the operation for average color bar display according to the embodiment.
  • a program for displaying an average color slider is installed directly from a recording medium such as a CD-ROM, or is downloaded from outside, such as over a network, and then installed, and is stored in the memory 53 of the work station 5 .
  • a doctor formats the CF memory 44 using the work station 5 and the CF memory reader/writer 6 .
  • the CF memory 44 is inserted into the CF memory reader/writer 6 and a guidance screen prompting connection of the CF memory reader/writer 6 to the work station 5 is displayed on the display unit 51 of the work station 5 (FIG. 4A).
  • when the doctor performs a menu operation for “NEXT”, the process proceeds to the next guidance screen display. It is assumed that the doctor has made preparations according to the guidance at this time. If the preparation is inadequate and the menu operation for “NEXT” is done in that state, a message indicating non-insertion of the CF memory, non-connection of the CF memory reader/writer, or the like may be displayed.
  • the next guidance screen displays a guidance screen prompting entry of diagnosis information and patient information (FIG. 4B).
  • as diagnosis information, there are input items of, for example, a hospital name, the name of the capsule-administering doctor (nurse), the date/time of capsule administration, a capsule serial number, and a receiver serial number.
  • as patient information, there are input items of, for example, a patient ID, the name of the patient, the gender of the patient, the age of the patient, and the birth date of the patient.
  • a confirmation screen for the entered items is displayed (FIG. 5A). The screen may go back to the previous screen through a menu operation for “BACK”.
  • when the next guidance screen shows a confirmation of the items entered on the previous screen and the doctor further performs the menu operation for “NEXT”, it is considered that nothing is wrong with the input information and the display screen goes to the next screen (FIG. 5B). At this time, information on the input items is written in the CF memory 44 . When the menu operation for “BACK” is done, the items entered previously can be corrected.
  • the next guidance screen (FIG. 5B) shows a message with an instruction to remove the CF memory 44 , an instruction to put labels, on which necessary ID information is printed according to the input items confirmed on the previous screen, on the receiver 4 and the CF memory 44 , and an instruction to insert the CF memory 44 into the receiver 4 .
  • when the doctor performs a menu operation for “COMPLETED”, preparation before administration of the capsule endoscope 10 into the subject is completed.
  • when the administration of the capsule endoscope 10 into the subject 2 is completed, observation of the interior of the body is started and storage of taken image data into the CF memory 44 is started by the operation of the receiver 4 .
  • the doctor receives guidance from the work station 5 again.
  • the CF memory 44 is removed from the receiver 4 and a guidance screen prompting its insertion into the CF memory reader/writer 6 is displayed (FIG. 6A). After preparation takes place according to the message, when the doctor performs the menu operation for “NEXT”, the display screen goes to the next (FIG. 6B).
  • the diagnosis information and patient information recorded in the CF memory 44 are read from the memory and displayed.
  • the information of the displayed contents, i.e., information (taken image data, etc.) acquired through observation is acquired by the work station 5 .
  • a list of diagnosis information and patient information of individual patients saved in the memory 53 of the work station 5 is displayed (FIG. 7). Accordingly, the doctor can select, with a cursor for example, the patient on whom a diagnosis is to be done. The selected state may simply be shown in inverted display. When a menu operation for “OBSERVATION” is done in the cursor-selected state, the patient to be diagnosed is decided. With regard to diagnosed patients, affixing “DONE” on the displayed list as shown in FIG. 7 ensures easy confirmation of whether a diagnosis has been made.
  • a diagnosis procedure screen is displayed as shown in FIG. 8.
  • This diagnosis procedure screen shows information necessary for diagnosis.
  • 501 and 502 are respectively patient information and diagnosis information of the associated patient
  • 503 is an image display field illustrating one of taken images.
  • 504 A shows a checked-image display field giving a list of taken images of interest that have been arbitrarily checked (selected) by a doctor by operating a software-based check button CHK.
  • 505 shows a 3D (three dimensional) position display field showing an imaging position (position inside a body) of the taken image, displayed in the image display field 503 , in a 3D manner
  • 506 shows a playback operation field 506 for performing a playback operation for a taken image to be displayed in the image display field 503
  • 507 shows an average color bar colored in time sequence with average colors according to the organs for taken-images from the start point of reception by the receiver to the end point of reception.
  • the average color bar 507 serves as a scale indicating the passing time during the observation period.
  • the display screen further displays individual menus for “HELP”, “BACK”, “CANCEL”, and “END DIAGNOSIS/PRINT CHART”.
  • the average color bar 507 presents average colors acquired from the individual frames of the taken images, colored in time sequence, exploiting the characteristic that colors differ from one organ to another. In the average color bar 507 , therefore, the average color of the taken images becomes nearly uniform while the capsule endoscope 10 is moving through the region of one organ. Even if an image taken while moving through the same organ contains noise, a nearly uniform color for each organ can be acquired by obtaining the average color of a single screen frame by frame.
  • a slider S is shown movable in the direction of the time axis.
  • the slider S serves as an index to indicate the position of a taken image to be displayed in the image display field 503 , at a position on the average color bar 507 . Therefore, moving/display control of the slider S is carried out according to the operation of the playback operation field 506 .
  • a checked image distinguished from other images can be extracted at the doctor's discretion.
  • the doctor operates the check button CHK.
  • the checked image is additionally displayed as a thumbnail image in the checked-image display field 504 A. Due to the restriction of the display area, the checked-image display field 504 A can display up to a predetermined number of images. In the present embodiment, as shown in FIG. 8, for example, up to five images can be displayed and for other checked images, display images are switched by scrolling.
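A minimal sketch of a checked-image field that shows at most five thumbnails and scrolls through the rest, as described above; the class and method names are hypothetical.

```python
class CheckedImageStrip:
    """Keep all checked frame indices but show at most `visible` thumbnails."""

    def __init__(self, visible=5):
        self.visible = visible
        self.checked = []   # frame indices of checked images, in check order
        self.offset = 0     # index of the first thumbnail currently shown

    def check(self, frame_index):
        if frame_index not in self.checked:
            self.checked.append(frame_index)

    def scroll(self, delta):
        max_offset = max(0, len(self.checked) - self.visible)
        self.offset = min(max(self.offset + delta, 0), max_offset)

    def shown(self):
        return self.checked[self.offset:self.offset + self.visible]

strip = CheckedImageStrip()
for idx in (12, 48, 130, 245, 400, 512, 770):
    strip.check(idx)
strip.scroll(+2)          # reveal the later thumbnails
print(strip.shown())      # -> [130, 245, 400, 512, 770]
```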
  • the doctor can intuitively and quickly move the display image to the position of the taken image associated with the desired organ referring to the average color bar 507 .
  • the slider S of the average color bar 507 is moved by using the mouse (not shown).
  • a process of sequentially changing the image to the one at the position indicated by the slider S following the movement is executed in the image display field 503 .
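As an illustration of this slider-to-image mapping, the sketch below converts a slider position on the average color bar into a frame index; the bar width in pixels, the frame count, and the function name are assumptions.

```python
def image_index_for_slider(slider_x, bar_width, num_frames):
    """Map a slider position on the average color bar to a frame index.

    slider_x: pixel offset of the slider from the left end of the bar;
    bar_width and num_frames are illustrative parameters.
    """
    if bar_width <= 0 or num_frames <= 0:
        raise ValueError("bar_width and num_frames must be positive")
    fraction = min(max(slider_x / bar_width, 0.0), 1.0)
    return min(int(fraction * num_frames), num_frames - 1)

# A slider three quarters of the way along a 600-px bar over 40,000 frames
# selects frame 30,000 for the image display field.
print(image_index_for_slider(450, 600, 40_000))  # -> 30000
```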
  • a flag as a bleeding part can be affixed to each taken image.
  • a sub menu is displayed with the current state displayed in the image display field 503 to manually set the flag of the bleeding part. Accordingly, display can be made in association with the positions on the average color bar 507 , such as bleeding parts V 1 , V 2 , as shown in FIG. 8, for example.
  • a bleeding part can be automatically extracted through image processing, in which case an AUTO-RETRIEVE BLEEDING PART button as indicated by 508 is operated.
  • the operation of the AUTO-RETRIEVE BLEEDING PART button 508 may be done for the image currently displayed in the image display field 503 or for all the images.
  • a flag is put in association with each image as done in the case of manual operation.
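The embodiment only states that a bleeding part can be extracted automatically through image processing; as a stand-in illustration (not the patent's algorithm), the sketch below flags frames whose strongly red pixels exceed a small coverage threshold. The ratio and coverage values are invented.

```python
import numpy as np

def looks_like_bleeding(frame_rgb, red_ratio=1.8, coverage=0.02):
    """Flag a frame whose strongly red pixels exceed a coverage threshold.

    A plausible stand-in only for the unspecified "auto-retrieve bleeding
    part" processing; the thresholds are hypothetical.
    """
    img = frame_rgb.astype(np.float32)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    red_dominant = (r > red_ratio * g) & (r > red_ratio * b)
    return float(red_dominant.mean()) >= coverage

frame = np.full((64, 64, 3), (150, 110, 100), dtype=np.uint8)
frame[20:30, 20:30] = (220, 40, 30)            # a small bright-red patch
print(looks_like_bleeding(frame))               # -> True for this example
```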
  • the diagnosis by a doctor can be terminated by a menu operation for “END DIAGNOSIS/PRINT CHART”.
  • the diagnosis results are made into a chart and printed through a printer (not shown) from the work station 5 or via the database 8 .
  • a process is executed as shown in FIG. 9. That is, when a patient to be diagnosed is decided from the list shown in FIG. 7, a file of imaging information corresponding to that patient is designated. Then, one frame of the image files is read from the memory 53 and opened (step S 1 ), and the average color of the taken image is measured frame by frame (step S 2 ).
  • when the average color is measured and average color data is acquired, the average color data for the frame is stored in the memory 53 (step S 3 ). Then, the processed image file is closed, an image file located next in time sequence is read out and opened, and a similar process is repeatedly executed thereafter (NO route of step S 5 ).
  • when the average colors for all the imaging information of the patient to be diagnosed have been obtained (step S 5 ), the average color bar 507 is displayed and controlled as shown in FIG. 8 using the average color data stored in the memory 53 (step S 6 ). In this manner, the display of the average color bar 507 is completed. At this time, the initial position of the slider S is the left end (start position) of the average color bar 507 , but this is not restrictive.
  • the average color may be acquired while thinning out several frames for efficiency.
  • while the acquired average color itself is displayed on the average color bar 507 in the present embodiment, this is not restrictive; a color corresponding to the average color has only to be displayed on the average color bar 507 .
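A minimal sketch of the per-frame averaging behind the average color bar (steps S 1 to S 6 ), assuming the taken images are available as RGB arrays in imaging order; the optional thinning parameter reflects the note above about skipping frames for efficiency, and all values are illustrative.

```python
import numpy as np

def average_color_bar(frames, step=1):
    """Compute one average RGB color per frame for the average color bar.

    frames: iterable of (H, W, 3) uint8 arrays in imaging order; `step`
    allows thinning out frames. Returns an (n, 3) array laid out
    left-to-right along the time axis.
    """
    colors = []
    for i, frame in enumerate(frames):
        if i % step:
            continue
        colors.append(frame.reshape(-1, 3).mean(axis=0))
    return np.asarray(colors, dtype=np.float32)

# Three synthetic frames: one reddish, then two yellowish ones.
frames = [np.full((4, 4, 3), c, dtype=np.uint8)
          for c in ((170, 60, 50), (190, 150, 60), (195, 160, 70))]
bar = average_color_bar(frames)
print(bar.shape)   # -> (3, 3): three segments of the color bar
```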
  • a scale indicating the overall imaging period of input image data taken in time sequence by the capsule endoscope is displayed, a movable slider is shown on the scale, an image at the imaging time corresponding to the position of the slider is displayed in response to the movement of the slider on the scale, and a color corresponding to average color information for one screen of input image data is displayed at the time-associated position on the scale, so that distinguishing coloring is carried out according to the taken part and an organ in the body can easily be determined from the distinguished colors. Accordingly, the ability to retrieve the image is improved and it is possible to easily recognize the organ depicted in each image.
  • the present invention is not limited to this type and an additional function of displaying the name of an organ in association with the average color may be provided as in a modification to be discussed below. As the modification to be discussed below is the same in the structure and functions described above, only what is added is discussed.
  • FIG. 10 is a schematic of an example of a display screen associated with a diagnosis process according to a modification of the embodiment.
  • FIG. 11 is a set of graphs illustrating the principle of automatic discrimination of organ names according to the modification of the embodiment.
  • FIG. 12 is a flowchart of the procedures of discriminating the organ names according to the modification of the embodiment.
  • the organ names are displayed in association with each average color on the average color bar 507 .
  • Average colors are lined up on the average color bar 507 in the order of the esophagus, the stomach, the small intestine, and the large intestine, following the time sequence in which imaging is done in the body by the capsule endoscope 10 . Therefore, the average color bar 507 shows organ names 509 in the order of the esophagus, the stomach, the small intestine, and the large intestine in association with the average colors of the individual organs.
  • the first discoloration edge ( 1 ) is a transitional portion from the esophagus to the stomach
  • ( 2 ) is a transitional portion from the stomach to the small intestine
  • ( 3 ) is a transitional portion from the small intestine to the large intestine.
  • the order of the organ names is based on the layout of the organs to be taken by the capsule endoscope 10 in the direction of the time axis.
  • the red level and blue level are computed (step S 21 )
  • the LPF process in the direction of the time axis is performed on the red level and blue level (step S 22 ) and the discoloration edges ( 1 ), ( 2 ), and ( 3 ) are detected (step S 23 ).
  • automatic discrimination of the ranges of the organs is carried out from the time-associated positions of the discoloration edges ( 1 ), ( 2 ), and ( 3 ) and the organ names are displayed in association with the individual average colors on the average color bar 507 (step S 24 ).
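The patent does not give filter or threshold details; the following sketch is one plausible reading of steps S 21 to S 23, where per-frame red and blue levels are low-pass filtered along the time axis and large jumps in the smoothed levels are reported as candidate discoloration edges. The window size and threshold are assumptions.

```python
import numpy as np

def lowpass(x, window):
    """Moving-average LPF along the time axis, edge-padded to avoid end artifacts."""
    pad = window // 2
    xp = np.pad(np.asarray(x, dtype=float), pad, mode="edge")
    return np.convolve(xp, np.ones(window) / window, mode="valid")

def discoloration_edges(avg_colors, window=5, threshold=15.0):
    """Report candidate organ boundaries from per-frame average colors.

    avg_colors: (n, 3) per-frame average RGB values. In practice the
    clusters of indices around each true boundary would be merged into
    the discoloration edges (1)-(3).
    """
    red = lowpass(avg_colors[:, 0], window)
    blue = lowpass(avg_colors[:, 2], window)
    jump = np.abs(np.diff(red)) + np.abs(np.diff(blue))
    return np.nonzero(jump > threshold)[0] + 1

# Synthetic sequence: 100 whitish frames, then 100 redder frames.
colors = np.vstack([np.tile([140.0, 120.0, 110.0], (100, 1)),
                    np.tile([190.0, 90.0, 60.0], (100, 1))])
print(discoloration_edges(colors))   # -> [ 98  99 100 101 102], near frame 100
```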
  • a scale indicating the overall imaging period of input image data taken in time sequence by the capsule endoscope is displayed, a movable slider is shown on the scale, an image at the imaging time corresponding to the position of the slider is displayed in response to the movement of the slider on the scale, and organs are discriminated based on color information for one screen of input image data and organ names are displayed in association with the scale, so that organs in the body can easily be determined from the displayed organ names. This also improves the ability to retrieve images and makes it possible to easily recognize the organ depicted in each image.
  • the present invention is not limited to this type and a pH sensor may be provided in the capsule endoscope 10 so that the ranges of the organs are specified more accurately using the measured pH values.
  • the pH values are measured by the pH sensor during the observation period and like taken images, the pH values are measured in time sequence and are stored in the receiver 4 . At that time, the taken images and pH values are recorded in association with each other, such as coexisting in each frame (image file).
  • FIG. 13 is a graph for illustrating an example of application of the modification shown in FIG. 11.
  • an acidic part is compared with the discoloration edges ( 1 ) and ( 2 ) to discriminate the stomach part, thereby further increasing the discrimination precision.
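A minimal sketch of how the recorded pH values could be cross-checked against the discoloration edges ( 1 ) and ( 2 ) to refine the stomach range; the acidity threshold and the simple intersection rule are assumptions, not the patent's method.

```python
def refine_stomach_range(edge1, edge2, ph_values, acid_threshold=4.0):
    """Cross-check the stomach range against the recorded pH values.

    edge1/edge2: frame indices of discoloration edges (1) and (2);
    ph_values: per-frame pH readings stored with the images.
    """
    acidic = [i for i, ph in enumerate(ph_values) if ph < acid_threshold]
    if not acidic:
        return edge1, edge2                 # fall back to color edges only
    start = max(edge1, acidic[0])
    end = min(edge2, acidic[-1] + 1)
    return start, end

ph = [6.8] * 50 + [2.0] * 80 + [7.2] * 70   # acidic stretch around frames 50-129
print(refine_stomach_range(45, 140, ph))    # -> (50, 130)
```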
  • FIG. 14 is a schematic of an example of screen transition associated with the diagnosis procedures according to the embodiment.
  • FIG. 15 is a flowchart of an operation for displaying the imaging time of a designated image according to the embodiment. While a diagnosis by a doctor can be terminated through the menu operation for “END DIAGNOSIS/PRINT CHART”, further transition to the chart creating procedures can be made.
  • 504 B in FIG. 14 indicates a checked-image display field, set larger than the checked-image display field 504 A and provided at the lower portion of the screen.
  • numbers ( 1 ) to ( 10 ) are given to individual taken images and displayed.
  • the checked-image display field 504 B has the same function as the checked-image display field 504 A.
  • 510 is a comment input field where opinions (comments) of a doctor are input and displayed. The results of a diagnosis by a doctor are input as comments in the comment input field 510 .
  • an imaging time display mark is displayed on the average color bar 505 , indicating which taken image, at which elapsed time, each checked image displayed in the checked-image display field 504 B is.
  • as the imaging time display mark, a downward arrow serving as an index indicating the imaging time of a checked image, and the aforementioned number given to the checked image as a relative display indicating the correlation with that checked image, are displayed on the average color bar 505 .
  • FIG. 14 displays ten checked images.
  • average colors are distinguished on the average color bar 507 in the order of the esophagus, the stomach, the small intestine, and the large intestine.
  • a mark ( 1 ) for a checked image is present in the range of the esophagus
  • marks ( 2 ), ( 3 ), and ( 4 ) for a checked image are present in the range of the stomach.
  • marks ( 5 ), ( 6 ), ( 7 ), ( 8 ), ( 9 ), and ( 10 ) for checked images are present in the range of the small intestine.
  • the presence of images checked by a doctor is identified in the esophagus, the stomach, and the small intestine from the example in FIG. 14, and marks are displayed in association with the times at which the individual checked images were taken, so that the doctor can easily confirm at which parts of the organs the checked images were taken.
  • while the imaging time display mark is displayed on the average color bar 505 showing the organ names in FIG. 14, it may be displayed on an average color bar that does not show the organ names as in FIG. 8.
  • while a correlation indication (number) indicating the correlation with a checked image is displayed as the imaging time display mark in FIG. 14, it may instead be an index (downward arrow) indicating the position of the imaging time.
  • to display the imaging time of a designated image, the date/time of creation of the file of the designated image is acquired from the memory 53 (step S 31 ), and the time elapsed since the date/time of the initiation of imaging is computed (step S 32 ). Then, a mark display as shown in FIG. 14 is controlled on the scale of the average color bar 507 at the position corresponding to the elapsed time (step S 33 ). Thereafter, when chart printing is manipulated, output for the chart printing is executed.
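A minimal sketch of steps S 31 to S 33, assuming the imaging start/end times and the pixel width of the average color bar are known; the names, dates, and width are illustrative.

```python
from datetime import datetime

def mark_position(image_time, start_time, end_time, bar_width):
    """Map an image's capture time to an x offset on the average color bar.

    The elapsed time since the start of imaging is computed and scaled to
    the bar width; out-of-range times are clamped to the bar ends.
    """
    total = (end_time - start_time).total_seconds()
    elapsed = (image_time - start_time).total_seconds()
    fraction = min(max(elapsed / total, 0.0), 1.0)
    return int(round(fraction * bar_width))

start = datetime(2004, 4, 23, 8, 0, 0)
end = datetime(2004, 4, 23, 18, 0, 0)        # a ten-hour observation
checked = datetime(2004, 4, 23, 10, 30, 0)   # checked image taken at 10:30
print(mark_position(checked, start, end, bar_width=600))   # -> 150
```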
  • a scale indicating the overall imaging period of input image data taken in time sequence by the capsule endoscope is displayed, a color corresponding to average color information for one screen of input image data is displayed at a time-associated position on the scale, an image corresponding to the input image data is displayed, and an index indicating a position corresponding to an imaging time of a designated image is displayed, so that it is possible to visually and easily recognize how many designated images are present and in which time band.
  • since organs can easily be determined from the colors that distinguish one taken part from another, it is possible to easily recognize which part of which organ has more designated images.
  • a scale indicating the overall imaging period of input image data taken in time sequence by the capsule endoscope is displayed, organs are discriminated based on color information of one screen of input image data, the names of the discriminated organ are displayed in association with the scale, images corresponding to the input image data are displayed and an index indicating the position corresponding to the imaging time of the designated image is displayed on the scale, so that organs in the body can easily be determined from the displayed organ names. This also makes it possible to easily recognize which part of which organ has more designated images.
  • an image display apparatus is constructed in such a way that a scale indicating the overall imaging period of input image data taken in time sequence by an internal imaging device is displayed, a color corresponding to average color information for one screen of input image data is displayed at a time-associated position on the scale, an image corresponding to the input image data is displayed, and an index indicating a position corresponding to an imaging time of a designated image is displayed, so that it is possible to visually and easily recognize how many designated images are present and in which time band, and organs can easily be determined from the colors that distinguish one taken part from another, thus making it possible to easily recognize which part of which organ has more designated images.
  • an image display apparatus is constructed in such a way that a scale indicating the overall imaging period of input image data taken in time sequence by an internal imaging device is displayed, organs are discriminated based on color information of one screen of input image data, names of the discriminated organs are displayed in association with the scale, images corresponding to the input image data are displayed, and an index indicating the position corresponding to the imaging time of the designated image is displayed on the scale, so that organs in the body can easily be determined from the displayed organ names, whereby it is possible to easily recognize which part of which organ has more designated images.
  • an image display method is configured to have steps of displaying a scale indicating the overall imaging period of input image data taken in time sequence by an internal imaging device, displaying a color corresponding to average color information for one screen of input image data at a time-associated position on the scale, displaying an image corresponding to the input image data, and displaying an index indicating a position corresponding to an imaging time of a designated image, so that it is possible to visually and easily recognize how many designated images are present and in which time band, and organs can easily be determined from the colors that distinguish one taken part from another, thus making it possible to easily recognize which part of which organ has more designated images.
  • an image display method is configured to have steps of displaying a scale indicating the overall imaging period of input image data taken in time sequence by an internal imaging device, discriminating organs based on color information of one screen of input image data, displaying names of the discriminated organs in association with the scale, displaying images corresponding to the input image data, and displaying an index indicating the position corresponding to the imaging time of the designated image on the scale, so that organs in the body can easily be determined from the displayed organ names, whereby it is possible to easily recognize which part of which organ has more designated images.
  • an image display program allows a computer to execute processes of displaying a scale indicating the overall imaging period of input image data taken in time sequence by an internal imaging device, displaying a color corresponding to average color information for one screen of input image data at a time-associated position on the scale, displaying an image corresponding to the input image data, and displaying an index indicating a position corresponding to an imaging time of a designated image, so that it is possible to visually and easily recognize how many designated images are present and in which time band, and organs can easily be determined from the colors that distinguish one taken part from another, thus making it possible to easily recognize which part of which organ has more designated images.
  • an image display program allows a computer to execute processes of displaying a scale indicating the overall imaging period of input image data taken in time sequence by an internal imaging device, discriminating organs based on color information of one screen of input image data, displaying names of the discriminated organs in association with the scale, displaying images corresponding to the input image data, and displaying an index indicating the position corresponding to the imaging time of the designated image on the scale, so that organs in the body can easily be determined from the displayed organ names, whereby it is possible to easily recognize which part of which organ has more designated images.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Endoscopes (AREA)
  • Studio Devices (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

An average color bar indicating the overall imaging period of images taken in time sequence by a capsule endoscope is displayed. A list of checked images among all the taken images is displayed in a checked-image display field, the time during the observation period to which each checked image corresponds is computed, and a mark with a number corresponding to each checked image is displayed on the scale of the average color bar.

Description

    BACKGROUND OF THE INVENTION
  • 1) Field of the Invention [0001]
  • The present invention relates to an image display apparatus, an image display method, and an image display program. [0002]
  • 2) Description of the Related Art [0003]
  • Recently, swallowable capsule endoscopes have been produced as a type of endoscopes. The capsule endoscopes are provided with an imaging capability and a radio capability. A capsule endoscope is configured to sequentially take images of organs such as the stomach and the small intestine within an observation period from the time it has been swallowed through the mouth of a patient for observation (examination) to its natural excretion from the human body (see Japanese Patent Application Laid-open No. H11-225996 Publication); [0004]
  • During the observation period, image data taken in a body by the capsule endoscope is sequentially transmitted outside through radio communication and is stored in a memory. Since a patient carries around a receiver having a radio communication capability and a memory capability, the patient can freely perform normal actions during the observation period from swallowing of the capsule endoscope to its excretion. After observation, a doctor or a nurse can display the images of organs on a display based on the image data stored in the memory and use it to make a diagnosis. [0005]
  • As the above type of capsule endoscope, “M2A (registered trademark)” by Given Imaging Ltd. of Israel, and “NORIKA (registered trademark)” by RF SYSTEM lab. of Japan are presently available, and they have already come to practical applications. [0006]
  • However, unlike an ordinary endoscope, the capsule endoscope described above takes images of each organ within a period from the time a subject swallows it to its natural excretion, meaning an extended period of observation (examination), for example, more than ten hours. Therefore, the number of images taken in time sequence is correspondingly huge. [0007]
  • At the stage of diagnosis or the like, no particular consideration is given to improving the ability to retrieve a desired image from the vast amount of images taken over a long period of time, or providing a display screen allowing easy recognition of what time in the overall imaging period the displayed image was taken, of which organ is being shown, and the like. [0008]
  • SUMMARY OF THE INVENTION
  • It is an object of the present invention to solve at least the problems in the conventional technology. [0009]
  • The image display apparatus according to one aspect of the present invention includes an input unit that inputs image data taken in time sequence by an in-vivo imaging device, a scale display control unit that controls to display a scale indicating an overall imaging period of input image data taken in time sequence and input by the input unit, a color information detecting unit that detects color information of a screen of the image data input by the input unit, a color display control unit that controls to display a color corresponding to the color information detected by the color information detecting unit at a time-corresponding position on the scale, an image display control unit that controls to display an image corresponding to the image data input by the input unit, an image designation unit that designates the image subjected to be displayed by the image display control unit, and an index display control unit that controls to display, on the scale, an index indicating a position corresponding to an imaging time of the image designated by the image designation unit. [0010]
  • The image display method according to another aspect of the present invention includes inputting image data taken in time sequence by an in-vivo imaging device, displaying a scale indicating an overall imaging period of input image data taken in time sequence and input by the input unit, detecting color information of a screen of the image data input by the input unit, displaying a color corresponding to the color information detected by the color information detecting unit at a time-corresponding position on the scale, displaying an image corresponding to the image data input by the input unit, designating the image subjected to be displayed by the image display control unit, and displaying, on the scale, an index indicating a position corresponding to an imaging time of the image designated by the image designation unit. [0011]
  • The image display program according to still another aspect of the present invention realizes the method according to the above aspect on a computer. [0012]
  • The other objects, features, and advantages of the present invention are specifically set forth in or will become apparent from the following detailed description of the invention when read in conjunction with the accompanying drawings.[0013]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic of a capsule endoscope according to an embodiment of the present invention; [0014]
  • FIG. 2 is a schematic of a capsule endoscope system according to the embodiment; [0015]
  • FIG. 3 is a block diagram of an example of the capsule endoscope system according to the embodiment; [0016]
  • FIG. 4A and FIG. 4B are schematics of an example of screen transition (screen 1 and 2) associated with the observation procedures according to the embodiment; [0017]
  • FIG. 5A and FIG. 5B are schematics of an example of screen transition (screen 3 and 4) associated with the observation procedures according to the embodiment; [0018]
  • FIG. 6A to FIG. 6C are schematics of an example of screen transition (screen for acquisition of data 1 to 3) associated with the observation procedures according to the embodiment; [0019]
  • FIG. 7 is a schematic of an example of screen transition associated with the diagnosis procedures according to the embodiment; [0020]
  • FIG. 8 is a schematic of an example of screen transition associated with the diagnosis procedures according to the embodiment; [0021]
  • FIG. 9 is a flowchart of the operation for average color bar display according to the embodiment; [0022]
  • FIG. 10 is a schematic of an example of a display screen associated with a diagnosis process according to a modification of the embodiment; [0023]
  • FIG. 11 depicts graphs for illustrating the principle of automatic discrimination of organ names according to the modification of the embodiment; [0024]
  • FIG. 12 is a flowchart of the procedures of discriminating the organ names according to the modification of the embodiment; [0025]
  • FIG. 13 is a graph for illustrating an example of application of the modification shown in FIG. 11; [0026]
  • FIG. 14 is a schematic of an example of screen transition associated with the diagnosis procedures according to the embodiment; and [0027]
  • FIG. 15 is a flowchart of an operation for displaying the imaging time of a designated image according to the embodiment.[0028]
  • DETAILED DESCRIPTION
  • Exemplary embodiments of an image display apparatus, an image display method, and a computer program according to the present invention are described below in detail with reference to the accompanying drawings. [0029]
  • FIG. 1 is a schematic of a capsule endoscope according to an embodiment of the present invention. A capsule endoscope 10 includes an imaging unit 111 that can take the internal image of a celom, illumination units 112 a and 112 b that illuminate the interior of the celom, a power supply unit 13 that supplies them with power, and a capsule housing 14 that has at least the imaging unit 111, the illumination units 112, and the power supply unit 13 disposed inside. [0030]
  • The capsule housing 14 according to the present embodiment includes a distal-end cover 120 that covers the imaging unit 111 and the illumination units 112 a, 112 b, and a capsule body 122 that is provided in a water-proof state with respect to the distal-end cover 120 via a seal member 121 and has the imaging unit 111, etc. disposed therein. A rear-end cover 123 may be provided as separate from the capsule body 122 as needed. Although the rear-end cover 123 is provided integrally with the capsule body and has a flat shape in the present embodiment, the shape is not limited and may be, for example, a dome shape. [0031]
  • The distal-end cover 120 may clearly separate an illumination window 120 a, which transmits illumination light L from the illumination units 112 a, 112 b, and an imaging window 120 b, which performs imaging in the illumination range, from each other. In the present embodiment, the entire distal-end cover 120 is transparent and the areas of the illumination window 120 a and the imaging window 120 b partly overlap each other. [0032]
  • The imaging unit 111 is provided on an imaging board 124 and includes a solid-state imaging device 125 formed of, for example, a CCD, which performs imaging in the range that is illuminated with the illumination light L from the illumination units 112 a, 112 b, and an image forming lens 126 that includes a fixed lens 126 a and a movable lens 126 b and forms the image of a subject on the solid-state imaging device 125. Sharp image forming is executed with a focus adjusting unit 128 having a fixed frame 128 a that secures the fixed lens 126 a and a movable frame 128 b that secures the movable lens 126 b. In the present invention, the imaging unit 111 is not limited to the CCD, and an imaging device such as a CMOS may be used. [0033]
  • The illumination units 112 a, 112 b are provided on an illumination board 130 and are comprised of, for example, light-emitting diodes (LEDs); a plurality of illumination units 112 a, 112 b (four in the present embodiment, as one example) are laid out around the image forming lens 126 that constitutes the imaging unit 111. In the present invention, the illumination units 112 a, 112 b are not limited to LEDs, and other illumination units may be used as well. [0034]
  • The power supply unit 13 is provided on a power supply board 132 provided with an internal switch 131 and uses, for example, a button type battery as a power supply 133. While a silver oxide cell, for example, is used as the battery in the present invention, the invention is not limited to it and may use a chargeable battery, a dynamo type battery or the like. [0035]
  • Although one that can perform an ON operation by, for example, the oppositional action of magnets is used as the internal switch 131, the present invention is not limited to this type and other switch units can also be exemplified. [0036]
  • In the present embodiment, besides the individual units described above, a radio unit 142 comprising an antenna or the like for radio communication with outside is provided on a radio board 141 and communication with outside is carried out as needed. [0037]
  • A signal processing/control unit 143 for processing or controlling the individual units is provided on the imaging board 124 and executes various processes in the capsule endoscope 10. [0038]
  • The signal processing/control unit 143 is comprised of a video signal processing function for image data generation, a transmission signal generating function that performs mixing of a video signal and a sync signal, affixing of an error correction code, etc., a modulation function that performs conversion to, for example, a PSK, MSK, GMSK, QMSK, ASK, AM, or FM system in cooperation with a modulator, a power supply control function that controls power supply with ON-OFF of a switch, driver circuits such as an LED driver circuit, a timing generator (TG) function that controls the number of imaging shots, and a memory function that stores various data, such as parameters for a line frame. The signal processing/control unit 143 executes various signal processes/controls. [0039]
  • The video signal processing function performs processes, such as image data correction (e.g., white balance (WB) correction, γ correction, color processing, correlation double sampling (CDS), and automatic gain control (AGC)), and analog-digital conversion (ADC) and an auto exposure function (AE), in addition to, for example, image data generation. [0040]
  • Besides the communication unit 142, for example, information collecting units, such as various sensors, a chemical releasing unit that releases chemicals, a tissue collecting unit that cuts tissues in a celom and collects them, etc. may be disposed in the capsule endoscope 10 as needed. [0041]
  • FIG. 2 is a schematic of a capsule endoscope system according to the embodiment. At the time of performing examination using the capsule endoscope 10, the capsule endoscope system as shown in FIG. 2 is used. [0042]
  • The capsule endoscope system according to the present embodiment comprises the capsule endoscope 10 and its package 50, a jacket 3 that a patient or a subject 2 wears, a receiver 4 attachable to/detachable from the jacket 3, a work station 5, a CF (compact flash (registered trademark)) memory reader/writer 6, a label printer 7, a database 8, and a network 9, as shown in FIG. 2, for example. [0043]
  • The jacket 3 is provided with antennas 31, 32, 33, and 34 that catch radio waves of taken images sent from the radio unit 142 of the capsule endoscope 10 so that the jacket 3 can communicate with the receiver 4 wirelessly or by a cable. The number of antennas is not particularly limited to four but should be plural, so that radio waves can be received properly according to the positions to which the capsule endoscope 10 moves. [0044]
  • The receiver 4 is provided with an antenna 41 that is used when directly receiving taken images through radio waves, a display unit 42 that displays information necessary for observation (examination), and an input unit 43 that inputs information necessary for observation (examination). A CF memory 44 that stores received taken image data can be detachably attached to the receiver 4. Further, the receiver 4 is provided with a power supply unit 45 capable of supplying power even at the time of portable usage and a signal processing/control unit 46 that performs processes needed for observation (examination). As the power supply unit 45, for example, a dry cell, an Li ion secondary battery, and an Ni hydrogen battery can be exemplified, and a chargeable type may also be used. [0045]
  • The work station 5 has a processing function with which a doctor or a nurse performs a diagnosis based on images of organs or the like in a patient taken by the capsule endoscope 10. This work station 5 has interfaces, though not shown, which connect to the receiver 4, the CF memory reader/writer 6, and the label printer 7 in a communicable manner and executes read/write of the CF memory 44, chart printing, etc. [0046]
  • The work station 5 has a communication function for connecting to the network 9 and stores the diagnosis results of a patient into the database 8 via the network 9. Further, the work station 5 has a display unit 51, and receives taken image data of the inside of a patient from the receiver 4 and displays the images of organs or the like on the display unit 51. [0047]
  • As the capsule endoscope 10 is taken out of the package 50 and is swallowed by the subject 2 through the mouth prior to the initiation of examination, it passes through the esophagus, moves inside the celom by peristalsis of the digestive tracts, and takes images inside the celom one after another. [0048]
  • The radio waves of taken images are output via the radio unit 142 as needed or according to the imaging results and are caught by the antennas 31, 32, 33, and 34 of the jacket 3. A signal from the antenna whose received radio wave intensity is the highest is sent to the receiver 4 outside. [0049]
  • In the receiver 4, taken image data received one after another is stored in the CF memory 44. The receiver 4 is not synchronized with the start of imaging of the capsule endoscope 10, and the initiation of reception and the end of reception are controlled by manipulation of the input unit 43. The taken image data may be still picture data taken at plural frames per second for dynamic display, or ordinary moving picture data. [0050]
  • When observation (examination) of the subject 2 by the capsule endoscope 10 is finished, the taken image data stored in the CF memory 44 is transferred to the work station 5 via a cable. The work station 5 stores the transferred taken image data in association with individual patients. [0051]
  • The taken image data inside the celom taken by the capsule endoscope 10 and stored in the receiver 4 in this manner is displayed by the display unit 51 of the work station 5. Accordingly, acquisition of effective data for physiological study and diagnosis of lesions can be carried out over the entire digestive tracts of a human body including the deep body portion (small intestine, etc.) that cannot be reached by an ultrasonic probe, endoscope, etc. [0052]
  • FIG. 3 is a block diagram of an example of the capsule endoscope system according to the embodiment. The description covers only the essential structures of the individual units. [0053]
  • The capsule endoscope 10 has the structure to take the image of an internal target (organs, etc.) with the imaging unit 111 from reflection of light illuminated from the illumination units 112 a and 112 b and send the taken image from the radio unit 142 in the form of a radio signal. [0054]
  • The jacket 3 has a structure such that a selector 35 is connected to the four antennas 31, 32, 33, 34, and an I/F 36, to which a cable to connect to the receiver 4 is connected, is connected to the selector 35. The jacket 3 receives radio signals sent from the capsule endoscope 10 at the four antennas 31, 32, 33, and 34, selects a received signal according to the radio wave intensity with the selector 35, and transfers it to the receiver 4 via the I/F 36. The jacket 3 is not provided with a large-capacity memory, and taken images received via the antennas 31, 32, 33, and 34 are transferred one after another to the receiver 4 at the subsequent stage. [0055]
  • The receiver 4 has, as the internal structure, an I/F 45 for communication with the I/F 36 of the jacket 3 via a cable, a CPU 46 that controls the entire receiver 4 according to a program prepared beforehand, a CF memory I/F 47 that performs data communication with the attached CF memory 44, and an I/F 48 that performs communication with the work station 5 by a cable. [0056]
  • To secure the state of being capable of receiving taken images from the jacket 3 at any time, the receiver 4 is always attached to the subject 2 during observation of the inside of a body by the capsule endoscope 10. During observation, therefore, taken images are received one after another from the jacket 3 and the received images are stored in the CF memory 44 via the CF memory I/F 47 one after another. During observation, the receiver 4 is not connected to the work station 5, and the subject 2 is not restricted to a hospital or the like and can move freely. [0057]
  • The CF memory reader/writer 6 has, as the internal structure, a CPU 61 that controls the entire reader/writer according to a program prepared beforehand, a CF memory I/F 62 that performs data communication with the attached CF memory 44, and an I/F 63 that performs communication with the work station 5 by a cable. [0058]
  • The CF memory reader/writer 6, with the CF memory 44 attached and connected to the work station 5 via the I/F 63, performs formatting of taken information for diagnosis according to the present embodiment with respect to the CF memory 44, or reads stored taken image data from the CF memory 44 and transfers the data to the work station 5. The taken image data here is in the form of JPEG or the like. [0059]
  • According to the present embodiment, it is possible to arbitrarily select direct transfer of taken image data to the work station 5 from the receiver 4 or moving the CF memory 44 to the CF memory reader/writer 6 to transfer taken image data to the work station 5. [0060]
  • The work station 5 has the display unit 51 that displays images of organs, etc. according to the present embodiment, an I/F 52 that manages communication with the I/F 48 of the receiver 4 via a cable and with the I/F 63 of the CF memory reader/writer 6 via a cable, a large-capacity memory 53 that stores data to be handled in various processes, a CPU 54 that controls the entire work station 5 according to a program prepared beforehand, an input unit 55 that inputs various kinds of operations, and an output unit 56 that is connected to the label printer 7, the database 8, or other printers over the network 9 for performing various kinds of output processes. [0061]
  • When the observation period ends and the receiver 4 is connected to the work station 5 in a communicable manner, taken image data stored in the CF memory 44 is transferred from the receiver 4 to the work station 5 and stored in the memory 53. In the work station 5, taken images from the capsule endoscope 10 according to the present embodiment, the display of an average color slider to be discussed later, the locus of the capsule endoscope 10, etc. are displayed at the time of a diagnosis. The diagnosis results are output as a chart from the printer and stored in the database 8 patient by patient. [0062]
  • FIG. 4A and FIG. 4B, FIG. 5A and FIG. 5B, and FIG. 6A to FIG. 6C are schematics of an example of screen transition associated with the observation procedures according to the present embodiment. FIG. 7 and FIG. 8 are schematics of an example of screen transition associated with the diagnosis procedures according to the present embodiment. FIG. 9 is a flowchart of the operation for average color bar display according to the embodiment. A program for displaying an average color slider is installed directly from a recording medium such as a CD-ROM, or is downloaded from outside such as over a network and then installed, and is stored in the memory 53 of the work station 5. [0063]
  • First, a doctor (or a nurse) formats the CF memory 44 using the work station 5 and the CF memory reader/writer 6. In this case, as procedures prior to observation, the CF memory 44 is inserted into the CF memory reader/writer 6 and a guidance screen prompting connection of the CF memory reader/writer 6 to the work station 5 is displayed on the display unit 51 of the work station 5 (FIG. 4A). When the doctor performs a menu operation for "NEXT", the process proceeds to the next guidance screen display. It is assumed that the doctor has prepared according to the guidance at this time. If the preparation is inadequate and the menu operation for "NEXT" is done in that state, a message of non-insertion of the CF memory, non-connection of the CF memory reader/writer or the like may be displayed. [0064]
  • The next guidance screen displays a guidance screen prompting entry of diagnosis information and patient information (FIG. 4B). As the diagnosis information, there are input items of, for example, a hospital name, the name of capsule-administering doctor (nurse), the date/time of capsule administration, a capsule serial number and a receiver serial number. As the patient information, there are input items of, for example, a patient ID, the name of a patient, gender of the patient, the age of the patient and the birth date of the patient. When the input operation for various input items is completed and the menu operation for “NEXT” is done, a confirmation screen for the entered items is displayed (FIG. 5A). The screen may go back to the previous screen through a menu operation for “BACK”. [0065]
  • As the next guidance screen (FIG. 5A) shows a confirmation of the items entered on the previous screen and the doctor further performs the menu operation for "NEXT", it is considered that nothing is wrong with the input information and the display screen goes to the next screen (FIG. 5B). At this time, information on the input items is written in the CF memory 44. When the menu operation for "BACK" is done, the items entered previously can be corrected. [0066]
  • The next guidance screen (FIG. 5B) shows a message of an instruction to remove the CF memory 44, an instruction to put labels, on which necessary ID information is printed according to the input items confirmed on the previous screen, on the receiver 4 and the CF memory 44, and an instruction to insert the CF memory 44 into the receiver 4. When the doctor performs a menu operation for "COMPLETED", preparation before administration of the capsule endoscope 10 into the subject is completed. [0067]
  • Then, the administration of the capsule endoscope 10 into the subject 2 is completed, observation of the interior of the body is started, and storage of taken image data into the CF memory 44 is started by the operation of the receiver 4. When the observation period ends and storage into the CF memory 44 is finished, the doctor receives guidance from the work station 5 again. [0068]
  • First, the CF memory 44 is removed from the receiver 4 and a guidance screen prompting its insertion into the CF memory reader/writer 6 is displayed (FIG. 6A). After preparation takes place according to the message, when the doctor performs the menu operation for "NEXT", the display screen goes to the next (FIG. 6B). [0069]
  • In the next guidance screen (FIG. 6B), the diagnosis information and patient information recorded in the CF memory 44 are read from the memory and displayed. The information of the displayed contents, i.e., information (taken image data, etc.) acquired through observation, is acquired by the work station 5. [0070]
  • When the doctor performs the menu operation for "NEXT" upon completion of acquisition of the information in that manner, a process of acquiring data from the CF memory 44 is carried out. When the data acquisition process is finished, a guidance screen indicating completion of data acquisition from the CF memory 44, prompting removal of the CF memory 44 from the CF memory reader/writer 6, and giving an instruction for initiation of diagnosis is displayed (FIG. 6C). When the doctor performs the menu operation for "COMPLETED", a sequence of guidance associated with the observation procedures is completed. [0071]
  • In the transition of a series of screens, there are icons of CANCEL and HELP that the doctor can arbitrarily select and operate. When the CANCEL is operated, the inputs so far are initialized. [0072]
  • At the stage of the diagnosis process, first, a list of diagnosis information and patient information of individual patients saved in the memory 53 of the work station 5 is displayed (FIG. 7). Accordingly, the doctor can select, with a cursor for example, the patient on whom a diagnosis is to be done. The selected state has only to be given in inverted display. When a menu operation for "OBSERVATION" is done in the cursor-selecting state, a patient to be diagnosed is decided. With regard to diagnosed patients, affixing "DONE" on the displayed list as shown in FIG. 7 can ensure an easy confirmation of whether a diagnosis has been made. [0073]
  • As a patient to be diagnosed is decided in this manner, a diagnosis procedure screen is displayed as shown in FIG. 8. This diagnosis procedure screen shows information necessary for diagnosis. 501 and 502 are respectively patient information and diagnosis information of the associated patient, and 503 is an image display field illustrating one of the taken images. 504A shows a checked-image display field giving a list of taken images of interest that have been arbitrarily checked (selected) by a doctor by operating a software-based check button CHK. [0074]
  • 505 shows a 3D (three dimensional) position display field showing the imaging position (position inside a body) of the taken image displayed in the image display field 503 in a 3D manner, 506 shows a playback operation field for performing a playback operation for a taken image to be displayed in the image display field 503, and 507 shows an average color bar colored in time sequence with average colors according to the organs for taken images from the start point of reception by the receiver to the end point of reception. The average color bar 507 serves as a scale indicating the passing time during the observation period. The display screen further displays individual menus for "HELP", "BACK", "CANCEL", and "END DIAGNOSIS/PRINT CHART". [0075]
  • The average color bar 507 shows, in time sequence, average colors acquired from the individual frames of the taken images, colored using the characteristics of colors that differ from one organ to another. On the average color bar 507, therefore, the average color of the taken images while the capsule endoscope 10 is moving through the region of each organ becomes nearly uniform. Even if an image taken during movement within the same organ contains noise, a nearly uniform color for each organ can be acquired by obtaining the average color of a single screen frame by frame. [0076]
  • On the average color bar 507, a slider S is shown movable in the direction of the time axis. The slider S serves as an index that indicates, by its position on the average color bar 507, the position of the taken image to be displayed in the image display field 503. Therefore, moving/display control of the slider S is carried out according to the operation of the playback operation field 506. [0077]
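The correspondence between the slider S and the displayed frame amounts to a proportional mapping between a position on the scale and an index into the time-sequential image data. The following is a minimal sketch of that mapping, not code from the patent; the function names, the pixel width of the bar, and the frame count are illustrative assumptions.

```python
# Hypothetical helpers illustrating the slider/frame correspondence on the scale.

def slider_to_frame_index(slider_x: int, bar_width_px: int, total_frames: int) -> int:
    """Map a slider pixel position on the scale to a frame index in time sequence."""
    if bar_width_px <= 0 or total_frames <= 0:
        raise ValueError("bar width and frame count must be positive")
    fraction = min(max(slider_x / bar_width_px, 0.0), 1.0)
    return min(int(fraction * total_frames), total_frames - 1)


def frame_index_to_slider_x(frame_index: int, bar_width_px: int, total_frames: int) -> int:
    """Inverse mapping, used to keep the slider in step with the displayed image."""
    fraction = frame_index / max(total_frames - 1, 1)
    return round(fraction * bar_width_px)


if __name__ == "__main__":
    # Example: a 600-pixel bar standing for 40 000 frames taken over the period.
    print(slider_to_frame_index(300, 600, 40_000))       # -> 20000 (middle of the period)
    print(frame_index_to_slider_x(20_000, 600, 40_000))  # -> 300
```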
  • The movement of the slider S on the average color bar 507 and changing of the taken image to be displayed in the image display field 503 are synchronized. That is, a software-based FRAME PLAYBACK button, PLAYBACK button, and FAST PLAYBACK (FP) button for operations in the forward playback direction along the time-sequential direction and a software-based REVERSE FRAME PLAYBACK button, REVERSE PLAYBACK button, and FAST REVERSE PLAYBACK (FR) button for operations in the reverse playback direction along the time-sequential direction are displayed and controlled. Further, a STOP button is displayed and controlled in the playback operation field 506. [0078]
  • When a doctor clicks the PLAYBACK button with a mouse (not shown) by operating the input unit 55, an image based on taken image data is displayed in the image display field 503 in time sequence in the forward playback direction. When the FRAME PLAYBACK button is clicked, a next image in the forward playback direction is displayed, and when the FAST PLAYBACK button is clicked, images are reproduced and displayed faster than the playback done by the PLAYBACK button in the forward playback direction. When the STOP button is clicked during playback or during fast playback, changing of the displayed image is stopped while an image at the time the clicking was made is displayed. [0079]
  • When the doctor clicks the REVERSE PLAYBACK button with the mouse (not shown) by operating the input unit 55, an image based on taken image data is displayed in the image display field 503 in the reverse playback direction with respect to the time-sequential direction. When the REVERSE FRAME PLAYBACK button is clicked, an image previous by one in the forward playback direction is displayed, and when the FAST REVERSE PLAYBACK button is clicked, images are reproduced and displayed faster than the playback done by the REVERSE PLAYBACK button in the reverse playback direction. When the STOP button is clicked during reverse playback or during fast reverse playback, changing of the displayed image is stopped while an image at the time the clicking was made is displayed. [0080]
  • When a diseased part such as a bleeding part is found at the time of image playback or reverse playback in the image display field 503, a checked image distinguished from other images can be extracted at the doctor's discretion. When such checking is desired, the doctor operates the check button CHK. The checked image is additionally displayed as a thumbnail image in the checked-image display field 504A. Due to the restriction of the display area, the checked-image display field 504A can display up to a predetermined number of images. In the present embodiment, as shown in FIG. 8, for example, up to five images can be displayed, and for other checked images, display images are switched by scrolling. [0081]
  • As the average color bar 507 is segmented by the average colors according to the types of the organs, the doctor can intuitively and quickly move the display image to the position of the taken image associated with the desired organ referring to the average color bar 507. At this time, the slider S of the average color bar 507 is moved by using the mouse (not shown). As the slider S is operated to move on the average color bar 507, a process of sequentially changing the image to the one at the position indicated by the slider S following the movement is executed in the image display field 503. [0082]
  • In the present embodiment, when the doctor finds a bleeding part from the display image, a flag as a bleeding part can be affixed to each taken image. In this case, though not shown, a sub menu is displayed with the current state displayed in the image display field 503 to manually set the flag of the bleeding part. Accordingly, display can be made in association with the positions on the average color bar 507, such as bleeding parts V1, V2, as shown in FIG. 8, for example. [0083]
  • A bleeding part can be automatically extracted through image processing, in which case an AUTO-RETRIEVE BLEEDING PART button as indicated by 508 is operated. The operation of the AUTO-RETRIEVE BLEEDING PART button 508 may be done for the image currently displayed in the image display field 503 or for all the images. When a bleeding part is found in automatic retrieval, a flag is put in association with the image as done in the case of manual operation. [0084]
  • The diagnosis by a doctor can be terminated by a menu operation for "END DIAGNOSIS/PRINT CHART". The diagnosis results are made into a chart and printed through a printer (not shown) from the work station 5 or via the database 8. [0085]
  • In the display of the average color bar 507, a process is executed as shown in FIG. 9. That is, when a patient to be diagnosed is decided from the list shown in FIG. 7, a file of imaging information corresponding to that patient is designated. Then, one frame of the image files is read from the memory 53 and opened (step S1), and the average color of the taken image is measured frame by frame (step S2). [0086]
  • When the average color is measured and average color data is acquired, the average color data for the first frame is stored in the memory 53 (step S3). Then, the processed image file is closed, an image file located next in time sequence is read out and opened, and a similar process is repeatedly executed thereafter (NO route of step S5). [0087]
  • When the average colors for all the imaging information of the patient to be diagnosed are obtained (step S5), the average color bar 507 is displayed and controlled as shown in FIG. 8 using the average color data stored in the memory 53 (step S6). In this manner, the display of the average color bar 507 is completed. At this time, the initial position of the slider S is the left end (start position) of the average color bar 507, but this is not restrictive. [0088]
  • Because the amount of the imaging information including taken image data is huge, it is unnecessary to open all the image files and acquire the average colors for all the frames, and the average color may be acquired while efficiently thinning out frames. Although the acquired average color itself is displayed on the average color bar 507 in the present embodiment, this is not restrictive and a color corresponding to this average color has only to be displayed on the average color bar 507. [0089]
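As a rough illustration of steps S1 through S6, the per-frame average color computation can be sketched as follows. This is a hedged example, not the work station 5 implementation: frames are assumed to be already decoded into RGB arrays (for example, from the JPEG files), numpy is assumed to be available, and the thinning step reflects the note above that frames may be thinned out.

```python
import numpy as np


def average_color(frame: np.ndarray) -> tuple:
    """Average color of one screen (frame) as an (R, G, B) tuple (steps S1-S2)."""
    return tuple(int(frame[..., ch].mean()) for ch in range(3))


def build_average_color_bar(frames, thin_step: int = 1) -> list:
    """Collect per-frame average colors in time sequence (steps S3-S5).

    thin_step > 1 skips frames, since it is unnecessary to open every file
    to obtain the average colors.
    """
    return [average_color(f) for f in frames[::thin_step]]


if __name__ == "__main__":
    # Synthetic stand-in for taken image data: 100 frames of 8x8 RGB pixels.
    rng = np.random.default_rng(0)
    frames = [rng.integers(0, 256, size=(8, 8, 3), dtype=np.uint8) for _ in range(100)]
    bar = build_average_color_bar(frames, thin_step=5)
    print(len(bar), bar[0])  # 20 average colors, drawn in time order on the bar (step S6)
```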
  • According to the present embodiment, as described above, a scale indicating the overall imaging period of input image data taken in time sequence by the capsule endoscope (internal imaging device) is displayed, a movable slider is shown on the scale, an image at the imaging time corresponding to the position of the slider is displayed in response to the movement of the slider on the scale, and a color corresponding to average color information for one screen of input image data is displayed at the time-associated position on the scale, so that distinguishing coloring is carried out according to the taken part and an organ in the body can easily be determined from the distinguished colors. Accordingly, the ability to retrieve the image is improved and it is possible to easily recognize the organ depicted in each image. [0090]
  • Although the position of an organ is identified using the average colors arranged on the average color bar as an index in the embodiment described above, the present invention is not limited to this type and an additional function of displaying the name of an organ in association with the average color may be provided as in a modification to be discussed below. As the modification to be discussed below is the same in the structure and functions described above, only what is added is discussed. [0091]
  • FIG. 10 is a schematic of an example of a display screen associated with a diagnosis process according to a modification of the embodiment. FIG. 11 depicts graphs for illustrating the principle of automatic discrimination of organ names according to the modification of the embodiment. FIG. 12 is a flowchart of the procedures of discriminating the organ names according to the modification of the embodiment. [0092]
  • The organ names are displayed in association with each average color on the average color bar 507. Average colors are lined on the average color bar 507 in the order of the esophagus, the stomach, the small intestine, and the large intestine, in the order of imaging done in a body by the capsule endoscope 10 in time sequence. Therefore, the average color bar 507 shows organ names 509 in the order of the esophagus, the stomach, the small intestine, and the large intestine in association with the average colors of the individual organs. [0093]
  • The automatic discrimination of organ names is performed as automatic discrimination of the ranges of the organs. The level of red and the level of blue of the individual taken images at elapsed times have the characteristics shown in FIG. 11. As an actual image contains a noise component, the levels of red and blue having these characteristics are subjected to a low-pass filter (LPF) process in the direction of the time axis to remove noise. Then, edge portions (discoloration edges) that the levels of red and blue after the LPF process have in common in the direction of the time axis are extracted. [0094]
  • In the example in FIG. 11, there are three discoloration edges, (1), (2), and (3), extracted in the above manner. Therefore, automatic discrimination is done such that, from the positions of the discoloration edges (1), (2), and (3) in the direction of the time axis, the first discoloration edge (1) is a transitional portion from the esophagus to the stomach, (2) is a transitional portion from the stomach to the small intestine, and (3) is a transitional portion from the small intestine to the large intestine. At this time, the order of the organ names is based on the layout of the organs to be imaged by the capsule endoscope 10 in the direction of the time axis. [0095]
  • As the processing based on the principle described above, first, the red level and blue level are computed (step S21), the LPF process in the direction of the time axis is performed on the red level and blue level (step S22), and the discoloration edges (1), (2), and (3) are detected (step S23). Then, automatic discrimination of the ranges of the organs is carried out from the time-associated positions of the discoloration edges (1), (2), and (3), and the organ names are displayed in association with the individual average colors on the average color bar 507 (step S24). [0096]
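A compact sketch of steps S21 through S24 is given below. It is an assumed illustration rather than the patent's algorithm: the red and blue levels are taken to be the per-frame mean R and B values, the moving-average window, the edge threshold, and the merging of nearby detections are arbitrary choices, and numpy is assumed.

```python
import numpy as np


def low_pass(series: np.ndarray, window: int = 15) -> np.ndarray:
    """Moving-average LPF along the time axis (step S22)."""
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="same")


def discoloration_edges(red: np.ndarray, blue: np.ndarray,
                        window: int = 15, threshold: float = 10.0) -> list:
    """Frame indices where the smoothed red and blue levels both change sharply (step S23)."""
    dr = np.abs(np.diff(low_pass(red, window)))
    db = np.abs(np.diff(low_pass(blue, window)))
    candidates = np.flatnonzero((dr > threshold) & (db > threshold))
    edges = []
    for i in candidates:              # merge detections closer together than one window
        if not edges or i - edges[-1] > window:
            edges.append(int(i))
    return edges


def label_organ_ranges(edges: list, n_frames: int) -> dict:
    """Assign organ names to the ranges delimited by the first three edges (step S24)."""
    names = ["esophagus", "stomach", "small intestine", "large intestine"]
    bounds = [0] + edges[:3] + [n_frames]
    return {names[i]: (bounds[i], bounds[i + 1]) for i in range(len(bounds) - 1)}
```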
  • In the above manner, a scale indicating the overall imaging period of input image data taken in time sequence by the capsule endoscope is displayed, a movable slider is shown on the scale, an image at the imaging time corresponding to the position of the slider is displayed in response to the movement of the slider on the scale, and organs are discriminated based on color information for one screen of input image data and organ names are displayed in association with the scale, so that organs in the body can easily be determined from the displayed organ names. This also improves the ability to retrieve images and makes it possible to easily recognize the organ depicted in each image. [0097]
  • Although the ranges of the organs on the average color bar are automatically discriminated from the discoloration edges in the modification described above, the present invention is not limited to this type and a pH sensor may be provided in the capsule endoscope 10 so that the ranges of the organs are specified more accurately using the measured pH values. In this case, the pH values are measured by the pH sensor during the observation period and like taken images, the pH values are measured in time sequence and are stored in the receiver 4. At that time, the taken images and pH values are recorded in association with each other, such as coexisting in each frame (image file). [0098]
  • FIG. 13 is a graph for illustrating an example of application of the modification shown in FIG. 11. In the automatic discrimination with pH values added, as shown in FIG. 13, using the fact that the stomach is in an acidic state, an acidic part is compared with the discoloration edges (1) and (2) to discriminate the stomach part, thereby further increasing the discrimination precision. [0099]
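One way to combine the measured pH values with the color-based result, in the spirit of FIG. 13, is sketched below. The acidity threshold and the reconciliation rule are assumptions for illustration only, not values from the patent.

```python
import numpy as np


def acidic_range(ph: np.ndarray, threshold: float = 4.0):
    """(start, end) frame indices of the acidic segment taken to correspond to the stomach."""
    acidic = np.flatnonzero(ph < threshold)
    if acidic.size == 0:
        return None
    return int(acidic[0]), int(acidic[-1])


def refine_stomach_range(edge1: int, edge2: int, ph: np.ndarray):
    """Compare the acidic segment with discoloration edges (1) and (2) and reconcile them."""
    acid = acidic_range(ph)
    if acid is None:
        return edge1, edge2          # fall back to the color-based estimate
    start = (edge1 + acid[0]) // 2   # one simple reconciliation: average the two boundaries
    end = (edge2 + acid[1]) // 2
    return start, end
```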
  • FIG. 14 is a schematic of an example of screen transition associated with the diagnosis procedures according to the embodiment. FIG. 15 is a flowchart of an operation for displaying the imaging time of a designated image according to the embodiment. While a diagnosis by a doctor can be terminated through the menu operation for “END DIAGNOSIS/PRINT CHART”, further transition to the chart creating procedures can be made. [0100]
  • When the process is shifted from the display screen in FIG. 8 to the display screen in FIG. 14, comments of a doctor are entered and a mark indicating to which elapsed time on the average color bar 507 each checked image corresponds is displayed. [0101]
  • That is, 504B in FIG. 14 indicates a checked-image display field, set larger than the checked-image display field 504A and provided at the lower portion of the screen. As a difference from the checked-image display field 504A, numbers (1) to (10) are given to the individual taken images and displayed. The checked-image display field 504B has the same function as the checked-image display field 504A. 510 is a comment input field where opinions (comments) of a doctor are input and displayed. The results of a diagnosis by a doctor are input as comments in the comment input field 510. 511 indicates an imaging time display mark that is displayed as a mark on the average color bar 507 and indicates at which elapsed time each checked image to be displayed in the checked-image display field 504B was taken. As the imaging time display mark, a downward arrow serving as an index of the imaging time of a checked image and the aforementioned number given to the checked image, as a relative display indicating the correlation with the checked image, are displayed on the average color bar 507. [0102]
  • FIG. 14 displays ten checked images. In this example, average colors are distinguished on the average color bar 507 in the order of the esophagus, the stomach, the small intestine, and the large intestine. As apparent from the ranges of the organs of the organ names 509, therefore, a mark (1) for a checked image is present in the range of the esophagus, and marks (2), (3), and (4) for checked images are present in the range of the stomach. Further, marks (5), (6), (7), (8), (9), and (10) for checked images are present in the range of the small intestine. [0103]
  • Therefore, the presence of images checked by a doctor is identified in the esophagus, the stomach, and the small intestine from the example in FIG. 14, and marks are displayed in association with the times at which the individual checked images have been taken, so that the doctor can easily confirm at which parts of the organs the checked images have been taken. Although the imaging time display mark is displayed on the average color bar 507 showing the organ names in FIG. 14, it may be displayed on the average color bar that does not show the organ names as in FIG. 8. Although a correlation indication (number) indicating the correlation with a checked image is displayed as the imaging time display mark in FIG. 14, it may be an index (downward arrow) indicating the position of the imaging time. [0104]
  • The process for the above mark display is described with reference to FIG. 15. In the imaging time display of a checked image or a designated image, first, the date/time of creating the file of the designated image is acquired from the memory 53 (step S31), and the time elapsed since the date/time of the initiation of imaging is computed (step S32). Then, a mark display as shown in FIG. 14 is controlled on the scale of the average color bar 507 at the position corresponding to the elapsed time on the average color bar 507 (step S33). Thereafter, when chart printing is manipulated, outputting for the chart printing is executed. [0105]
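Steps S31 through S33 reduce to computing an elapsed time and scaling it onto the bar. A minimal sketch follows, assuming each checked image carries its file creation date/time and that the bar width and overall period are known; the names and values are illustrative.

```python
from datetime import datetime


def mark_position(image_created: datetime, imaging_start: datetime,
                  overall_period_s: float, bar_width_px: int) -> int:
    """Pixel position of the imaging time display mark on the scale (steps S32-S33)."""
    elapsed = (image_created - imaging_start).total_seconds()
    fraction = min(max(elapsed / overall_period_s, 0.0), 1.0)
    return round(fraction * bar_width_px)


if __name__ == "__main__":
    start = datetime(2004, 4, 23, 9, 0, 0)             # hypothetical start of imaging
    checked = datetime(2004, 4, 23, 13, 30, 0)         # checked image taken 4.5 hours later
    print(mark_position(checked, start, 10 * 3600, 600))  # -> 270 on a 600-pixel bar
```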
  • According to the present embodiment, as described above, a scale indicating the overall imaging period of input image data taken in time sequence by the capsule endoscope (internal imaging device) is displayed, a color corresponding to average color information for one screen of input image data is displayed at a time-associated position on the scale, an image corresponding to the input image data is displayed, and an index indicating a position corresponding to an imaging time of a designated image is displayed, so that it is possible to visually and easily recognize how many designated images are present and in which time band. As organs can easily be determined from the colors distinguished from one imaged part to another, it is possible to easily recognize which part of which organ has more designated images. [0106]
  • Furthermore, a scale indicating the overall imaging period of input image data taken in time sequence by the capsule endoscope is displayed, organs are discriminated based on color information of one screen of input image data, the names of the discriminated organ are displayed in association with the scale, images corresponding to the input image data are displayed and an index indicating the position corresponding to the imaging time of the designated image is displayed on the scale, so that organs in the body can easily be determined from the displayed organ names. This also makes it possible to easily recognize which part of which organ has more designated images. [0107]
  • The present invention is not limited to the above embodiments, and various modifications can be made without departing from the spirit of the present invention. [0108]
  • As explained above, according to the present invention, it is possible to provide an image display apparatus constructed in such a way that a scale indicating the overall imaging period of input image data taken in time sequence by an internal imaging device is displayed, a color corresponding to average color information for one screen of input image data is displayed at a time-associated position on the scale, an image corresponding to the input image data is displayed, and an index indicating a position corresponding to an imaging time of a designated image is displayed, so that it is possible to visually and easily recognize how many and in which time band designated images are present and easily determine organs from the colors distinguished from one taken part from another one, thus making it possible to easily recognize which part of which organ has more designated images. [0109]
  • Furthermore, according to the present invention, it is possible to provide an image display apparatus constructed in such a way that a scale indicating the overall imaging period of input image data taken in time sequence by an internal imaging device is displayed, organs are discriminated based on color information of one screen of input image data, names of the discriminated organ are displayed in association with the scale, images corresponding to the input image data are displayed and an index indicating the position corresponding to the imaging time of the designated image is displayed on the scale, so that organs in the body can easily be determined from the displayed organ names, whereby it is possible to easily recognize which part of which organ has more designated images. [0110]
  • Moreover, according to the present invention, it is possible to provide an image display method configured to have steps of displaying a scale indicating the overall imaging period of input image data taken in time sequence by an internal imaging device, displaying a color corresponding to average color information for one screen of input image data at a time-associated position on the scale, displaying an image corresponding to the input image data, and displaying an index indicating a position corresponding to an imaging time of a designated image, so that it is possible to visually and easily recognize how many and in which time band designated images are present and easily determine organs from the colors distinguished from one taken part from another one, thus making it possible to easily recognize which part of which organ has more designated images. [0111]
  • Furthermore, according to the present invention, it is possible to provide an image display method configured to have steps of displaying a scale indicating the overall imaging period of input image data taken in time sequence by an internal imaging device, discriminating organs based on color information of one screen of input image data, displaying names of the discriminated organ in association with the scale, displaying images corresponding to the input image data and displaying an index indicating the position corresponding to the imaging time of the designated image on the scale, so that organs in the body can easily be determined from the displayed organ names, whereby it is possible to easily recognize which part of which organ has more designated images. [0112]
  • Moreover, according to the present invention, it is possible to provide an image display program that allows a computer to execute processes of displaying a scale indicating the overall imaging period of input image data taken in time sequence by an internal imaging device, displaying a color corresponding to average color information for one screen of input image data at a time-associated position on the scale, displaying an image corresponding to the input image data, and displaying an index indicating a position corresponding to an imaging time of a designated image, so that it is possible to visually and easily recognize how many and in which time band designated images are present and easily determine organs from the colors distinguished from one taken part from another one, thus making it possible to easily recognize which part of which organ has more designated images. [0113]
  • Furthermore, according to the present invention, it is possible to provide an image display program that allows a computer to execute processes of displaying a scale indicating the overall imaging period of input image data taken in time sequence by an internal imaging device, discriminating organs based on color information of one screen of input image data, displaying names of the discriminated organ in association with the scale, displaying images corresponding to the input image data and displaying an index indicating the position corresponding to the imaging time of the designated image on the scale, so that organs in the body can easily be determined from the displayed organ names, whereby it is possible to easily recognize which part of which organ has more designated images. [0114]
  • Although the invention has been described with respect to a specific embodiment for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art which fairly fall within the basic teaching herein set forth. [0115]

Claims (15)

What is claimed is:
1. An image display apparatus comprising:
an input unit that inputs image data taken in time sequence by an in-vivo imaging device;
a scale display control unit that controls to display a scale indicating an overall imaging period of input image data taken in time sequence and input by the input unit;
a color information detecting unit that detects color information of a screen of the image data input by the input unit;
a color display control unit that controls to display a color corresponding to the color information detected by the color information detecting unit at a time-corresponding position on the scale;
an image display control unit that controls to display an image corresponding to the image data input by the input unit;
an image designation unit that designates the image subjected to be displayed by the image display control unit; and
an index display control unit that controls to display, on the scale, an index indicating a position corresponding to an imaging time of the image designated by the image designation unit.
2. The image display apparatus according to claim 1, wherein the color information detecting unit includes an average color detecting unit that detects color information on average color from color information of a screen of the image data input by the input unit.
3. The image display apparatus according to claim 1, further comprising a designated image display control unit that controls to display the image designated by the image designation unit, wherein
the index display control unit and the designated image display control unit are configured to make correlation display indicating a correlation between the index displayed on the scale and the displayed designated image.
4. The image display apparatus according to claim 1, further comprising:
an organ discriminating unit that discriminates an organ based on the color information detected by the color information detecting unit; and
an organ name display control unit that controls to display a name of the organ discriminated by the organ discriminating unit in association with the scale.
5. The image display apparatus according to claim 4, wherein the organ discriminating unit includes a discoloration edge detecting unit that detects a discoloration edge from the color information detected by the color information detecting unit.
6. An image display method comprising:
inputting image data taken in time sequence by an in-vivo imaging device;
displaying a scale indicating an overall imaging period of input image data taken in time sequence and input by the input unit;
detecting color information of a screen of the image data input by the input unit;
displaying a color corresponding to the color information detected by the color information detecting unit at a time-corresponding position on the scale;
displaying an image corresponding to the image data input by the input unit;
designating the image subjected to be displayed by the image display control unit; and
displaying, on the scale, an index indicating a position corresponding to an imaging time of the image designated by the image designation unit.
7. The image display method according to claim 6, wherein the detecting includes detecting color information on average color from color information of a screen of the image data input by the input unit.
8. The image display method according to claim 6, further comprising displaying the image designated by the image designation unit, wherein
the displaying the index and the displaying the image designated are configured to make correlation display indicating a correlation between the index displayed on the scale and the displayed designated image.
9. The image display method according to claim 6, further comprising:
discriminating an organ based on the color information detected by the color information detecting unit; and
displaying a name of the organ discriminated by the organ discriminating unit in association with the scale.
10. The image display method according to claim 9, wherein the discriminating includes detecting a discoloration edge from the color information detected.
11. An image display program making a computer execute:
inputting image data taken in time sequence by an in-vivo imaging device;
displaying a scale indicating an overall imaging period of input image data taken in time sequence and input by the input unit;
detecting color information of a screen of the image data input by the input unit;
displaying a color corresponding to the color information detected by the color information detecting unit at a time-corresponding position on the scale;
displaying an image corresponding to the image data input by the input unit;
designating the image subjected to be displayed by the image display control unit; and
displaying, on the scale, an index indicating a position corresponding to an imaging time of the image designated by the image designation unit.
12. The image display program according to claim 11, wherein the detecting includes detecting color information on average color from color information of a screen of the image data input by the input unit.
13. The image display program according to claim 11, making the computer further execute displaying the image designated by the image designation unit, wherein
the displaying the index and the displaying the image designated are configured to make correlation display indicating a correlation between the index displayed on the scale and the displayed designated image.
14. The image display program according to claim 11, making the computer further execute:
discriminating an organ based on the color information detected by the color information detecting unit; and
displaying a name of the organ discriminated by the organ discriminating unit in association with the scale.
15. The image display program according to claim 14, wherein the discriminating includes detecting a discoloration edge from the color information detected.
US10/830,790 2003-04-25 2004-04-23 Image display apparatus, image display method, and computer program Abandoned US20040249291A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US11/493,722 US20060270783A1 (en) 2003-09-15 2006-07-26 Elastomer compositions for use in a hydrocarbon resistant hose
US13/229,309 US8620044B2 (en) 2003-04-25 2011-09-09 Image display apparatus, image display method, and computer program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2003-122805 2003-04-25
JP2003122805A JP3810381B2 (en) 2003-04-25 2003-04-25 Image display device, image display method, and image display program

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/663,324 Continuation-In-Part US20050058795A1 (en) 2003-09-15 2003-09-15 Vinyl ester hose and method for manufacture of such hose

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US11/493,722 Continuation-In-Part US20060270783A1 (en) 2003-09-15 2006-07-26 Elastomer compositions for use in a hydrocarbon resistant hose
US13/229,309 Division US8620044B2 (en) 2003-04-25 2011-09-09 Image display apparatus, image display method, and computer program

Publications (1)

Publication Number Publication Date
US20040249291A1 true US20040249291A1 (en) 2004-12-09

Family

ID=33410094

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/830,790 Abandoned US20040249291A1 (en) 2003-04-25 2004-04-23 Image display apparatus, image display method, and computer program
US13/229,309 Expired - Fee Related US8620044B2 (en) 2003-04-25 2011-09-09 Image display apparatus, image display method, and computer program

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/229,309 Expired - Fee Related US8620044B2 (en) 2003-04-25 2011-09-09 Image display apparatus, image display method, and computer program

Country Status (8)

Country Link
US (2) US20040249291A1 (en)
EP (4) EP1891887A3 (en)
JP (1) JP3810381B2 (en)
KR (1) KR100746422B1 (en)
CN (4) CN101133935B (en)
AU (1) AU2004233673C1 (en)
CA (1) CA2523304A1 (en)
WO (1) WO2004096027A1 (en)

Cited By (75)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030151661A1 (en) * 2002-02-12 2003-08-14 Tal Davidson System and method for displaying an image stream
US20040027500A1 (en) * 2002-02-12 2004-02-12 Tal Davidson System and method for displaying an image stream
US20050075551A1 (en) * 2003-10-02 2005-04-07 Eli Horn System and method for presentation of data streams
US20060020202A1 (en) * 2004-07-06 2006-01-26 Mathew Prakash P Method and appartus for controlling ultrasound system display
US20060034521A1 (en) * 2004-07-16 2006-02-16 Sectra Imtec Ab Computer program product and method for analysis of medical image data in a medical imaging system
US20060074275A1 (en) * 2004-09-27 2006-04-06 Tal Davidson System and method for editing an image stream captured in vivo
US20060106318A1 (en) * 2004-11-15 2006-05-18 Tal Davidson System and method for displaying an image stream
US20060164511A1 (en) * 2003-12-31 2006-07-27 Hagal Krupnik System and method for displaying an image stream
US20070060798A1 (en) * 2005-09-15 2007-03-15 Hagai Krupnik System and method for presentation of data streams
US20070066875A1 (en) * 2005-09-18 2007-03-22 Eli Horn System and method for identification of images in an image database
US20070167715A1 (en) * 2005-12-05 2007-07-19 Toshiaki Shigemori Receiving device
US20070173714A1 (en) * 2005-11-24 2007-07-26 Katsumi Hirakawa In vivo image display apparatus, receiving apparatus, and image display system using same and image display method thereof
US20070191677A1 (en) * 2004-10-29 2007-08-16 Olympus Corporation Image processing method and capsule type endoscope device
EP1834568A1 (en) * 2005-01-05 2007-09-19 Olympus Corporation Encapsulated endoscope case
US20070237375A1 (en) * 2006-03-23 2007-10-11 Hiromasa Yamagishi Apparatus and method of displaying image scanning report
US20070239004A1 (en) * 2006-01-19 2007-10-11 Kabushiki Kaisha Toshiba Apparatus and method for indicating locus of an ultrasonic probe, ultrasonic diagnostic apparatus and method
US20070268280A1 (en) * 2004-08-23 2007-11-22 Manabu Fujita Image Display Apparatus, Image Display Method, and Image Display Program
EP1867280A1 (en) * 2005-04-07 2007-12-19 Olympus Medical Systems Corp. System for acquiring information on inside of subject
US20070292011A1 (en) * 2005-04-13 2007-12-20 Hirokazu Nishimura Image Processing Apparatus and Image Processing Method
EP1870012A1 (en) * 2005-04-14 2007-12-26 Olympus Medical Systems Corp. Simple image display apparatus and receiving system
US20080001971A1 (en) * 2006-06-29 2008-01-03 Scientific-Atlanta, Inc. Filling Blank Spaces of a Display Screen
US20080051659A1 (en) * 2004-06-18 2008-02-28 Koji Waki Ultrasonic Diagnostic Apparatus
US20080103359A1 (en) * 2006-10-26 2008-05-01 Tah-Yoong Lin Capsule-type endoscopic system with real-time image display
EP1922977A1 (en) * 2005-09-09 2008-05-21 Olympus Medical Systems Corp. Image display device
EP1922983A1 (en) * 2005-09-09 2008-05-21 Olympus Corporation Intra-lumen image viewer
US20080119691A1 (en) * 2005-03-22 2008-05-22 Yasushi Yagi Capsule Endoscope Image Display Controller
US20080161639A1 (en) * 2006-12-28 2008-07-03 Olympus Medical Systems Corporation Capsule medical apparatus and body-cavity observation method
US20080172255A1 (en) * 2005-08-22 2008-07-17 Katsumi Hirakawa Image display apparatus
US20080262304A1 (en) * 2004-06-30 2008-10-23 Micha Nisani In-Vivo Sensing System Device and Method for Real Time Viewing
US20090003732A1 (en) * 2007-06-27 2009-01-01 Olympus Medical Systems Corp. Display processing apparatus for image information
US20090005639A1 (en) * 2007-01-12 2009-01-01 Olympus Medical Systems Corp. Capsule medical apparatus
US20090027486A1 (en) * 2005-08-22 2009-01-29 Katsumi Hirakawa Image display apparatus
US20090124853A1 (en) * 2007-11-08 2009-05-14 Kazuhiro Gono Capsule Blood Detection System and Method
US20090123043A1 (en) * 2007-11-08 2009-05-14 Kazuhiro Gono Method and System for Correlating Image and Tissue Characteristic Data
US20090252390A1 (en) * 2006-10-02 2009-10-08 Olympus Corporation Image processing apparatus and image processing method
US20090259096A1 (en) * 2005-09-09 2009-10-15 Toshiaki Shigemori Body-cavity image obeservation apparatus
US20090306522A1 (en) * 2007-11-08 2009-12-10 Kazuhiro Gono Method and Apparatus for Detecting Abnormal Living Tissue
US20100061597A1 (en) * 2007-06-05 2010-03-11 Olympus Corporation Image processing device, image processing program and image processing method
US20100086286A1 (en) * 2008-10-07 2010-04-08 Intromedic Method of displaying image taken by capsule endoscope and record media of storing program for carrying out that method
US20100119110A1 (en) * 2008-11-07 2010-05-13 Olympus Corporation Image display device, computer readable storage medium storing image processing program, and image processing method
US20100166272A1 (en) * 2003-06-12 2010-07-01 Eli Horn System and method to detect a transition in an image stream
US20100269064A1 (en) * 2007-12-13 2010-10-21 Koninklijke Philips Electronics N.V. Navigation in a series of images
US20110032259A1 (en) * 2009-06-09 2011-02-10 Intromedic Co., Ltd. Method of displaying images obtained from an in-vivo imaging device and apparatus using same
US20110196201A1 (en) * 2009-03-11 2011-08-11 Olympus Medical Systems Corp. Image processing system, external device, and image processing method
US20110224490A1 (en) * 2009-04-20 2011-09-15 Olympus Medical Systems Corp. In-vivo examination system
US8022980B2 (en) 2002-02-12 2011-09-20 Given Imaging Ltd. System and method for displaying an image stream
US8133169B2 (en) 2007-09-19 2012-03-13 Olympus Medical Systems Corp. In-vivo image acquiring system capable of controlling illuminating unit and determining whether to wirelessly transmit image information based on estimated distance
US20120207369A1 (en) * 2011-02-14 2012-08-16 Siemens Medical Solutions Usa, Inc. Presentation of Locations in Medical Diagnosis
US20130152020A1 (en) * 2011-03-30 2013-06-13 Olympus Medical Systems Corp. Image management apparatus, method, and computer-readable recording medium and capsule endoscope system
US8682142B1 (en) 2010-03-18 2014-03-25 Given Imaging Ltd. System and method for editing an image stream captured in-vivo
US8768024B1 (en) * 2010-06-01 2014-07-01 Given Imaging Ltd. System and method for real time detection of villi texture in an image stream of the gastrointestinal tract
US8873816B1 (en) 2011-04-06 2014-10-28 Given Imaging Ltd. Method and system for identification of red colored pathologies in vivo
US8900124B2 (en) 2006-08-03 2014-12-02 Olympus Medical Systems Corp. Image display device
US8922633B1 (en) 2010-09-27 2014-12-30 Given Imaging Ltd. Detection of gastrointestinal sections and transition of an in-vivo device there between
US8965079B1 (en) 2010-09-28 2015-02-24 Given Imaging Ltd. Real time detection of gastrointestinal sections and transitions of an in-vivo device therebetween
US9060673B2 (en) 2010-04-28 2015-06-23 Given Imaging Ltd. System and method for displaying portions of in-vivo images
US9113846B2 (en) 2001-07-26 2015-08-25 Given Imaging Ltd. In-vivo imaging device providing data compression
US9324145B1 (en) 2013-08-08 2016-04-26 Given Imaging Ltd. System and method for detection of transitions in an image stream of the gastrointestinal tract
EP1882440A4 (en) * 2005-05-20 2016-06-15 Olympus Corp Image display device
US9538937B2 (en) 2008-06-18 2017-01-10 Covidien Lp System and method of evaluating a subject with an ingestible capsule
US9545192B2 (en) 2012-05-04 2017-01-17 Given Imaging Ltd. System and method for automatic navigation of a capsule based on image stream captured in-vivo
US20170083791A1 (en) * 2014-06-24 2017-03-23 Olympus Corporation Image processing device, endoscope system, and image processing method
EP3040016A4 (en) * 2013-08-30 2017-04-19 Olympus Corporation Image management device
US20170337683A1 (en) * 2016-05-18 2017-11-23 Olympus Corporation Image file creation method, medium with image file creation program recorded and image file creation apparatus
WO2019133496A3 (en) * 2017-12-28 2019-08-08 Wisconsin Alumni Research Foundation Otoscope providing low obstruction electronic display
US10405734B2 (en) 2012-06-29 2019-09-10 Given Imaging Ltd. System and method for displaying an image stream
JP2020014866A (en) * 2011-10-13 2020-01-30 マシモ・コーポレイション Medical monitoring hub
WO2021041980A1 (en) * 2019-08-28 2021-03-04 Trustees Of Tufts College Systems and methods for high-throughput screening and analysis of drug delivery systems in vitro
US11083397B2 (en) 2012-02-09 2021-08-10 Masimo Corporation Wireless patient monitoring device
US11241199B2 (en) 2011-10-13 2022-02-08 Masimo Corporation System for displaying medical monitoring data
US11484205B2 (en) 2002-03-25 2022-11-01 Masimo Corporation Physiological measurement device
USD991279S1 (en) * 2019-12-09 2023-07-04 Ankon Technologies Co., Ltd Display screen or portion thereof with transitional graphical user interface
USD991278S1 (en) * 2019-12-09 2023-07-04 Ankon Technologies Co., Ltd Display screen or portion thereof with transitional graphical user interface for auxiliary reading
US11900775B2 (en) 2009-12-21 2024-02-13 Masimo Corporation Modular patient monitor
US11963736B2 (en) 2009-07-20 2024-04-23 Masimo Corporation Wireless patient monitoring system

Families Citing this family (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4009581B2 (en) 2003-11-18 2007-11-14 オリンパス株式会社 Capsule medical system
JP4885432B2 (en) * 2004-08-18 2012-02-29 オリンパス株式会社 Image display device, image display method, and image display program
JP2006141898A (en) * 2004-11-24 2006-06-08 Olympus Corp Image display system
JP4699014B2 (en) * 2004-11-29 2011-06-08 オリンパス株式会社 Image display device, image display method, and image display program
JP4575124B2 (en) 2004-11-29 2010-11-04 オリンパス株式会社 Image display device
JP2006212051A (en) * 2005-02-01 2006-08-17 Yamaha Corp Capsule type imaging device, in vivo imaging system and in vivo imaging method
JP5057651B2 (en) * 2005-02-15 2012-10-24 オリンパス株式会社 Lumen image processing apparatus, lumen image processing method, and program therefor
EP1849402B1 (en) 2005-02-15 2018-05-16 Olympus Corporation Medical image processing device, lumen image processing device, lumen image processing method, and programs for them
JP4575216B2 (en) * 2005-04-08 2010-11-04 オリンパス株式会社 Medical image display device
JP4624842B2 (en) * 2005-04-13 2011-02-02 オリンパスメディカルシステムズ株式会社 Image processing method, image processing apparatus, and program
JP4624841B2 (en) * 2005-04-13 2011-02-02 オリンパスメディカルシステムズ株式会社 Image processing apparatus and image processing method in the image processing apparatus
US7308292B2 (en) * 2005-04-15 2007-12-11 Sensors For Medicine And Science, Inc. Optical-based sensing devices
JP2006296882A (en) * 2005-04-22 2006-11-02 Olympus Medical Systems Corp Image display device
JP2006345929A (en) * 2005-06-13 2006-12-28 Olympus Medical Systems Corp Image display device
JP4758720B2 (en) * 2005-09-29 2011-08-31 富士フイルム株式会社 Electronic endoscope system
JP2007097653A (en) * 2005-09-30 2007-04-19 Fujinon Corp Endoscopic diagnosis system
JP4885518B2 (en) * 2005-11-10 2012-02-29 オリンパス株式会社 Capsule type endoscope system and capsule type endoscope information processing method
JP2007312810A (en) 2006-05-23 2007-12-06 Olympus Corp Image processing device
JP5368668B2 (en) * 2006-06-01 2013-12-18 富士フイルム株式会社 MEDICAL IMAGE DISPLAY DEVICE, MEDICAL IMAGE DISPLAY SYSTEM, AND METHOD FOR OPERATING MEDICAL IMAGE DISPLAY SYSTEM
CN102178506A (en) * 2006-09-12 2011-09-14 奥林巴斯医疗株式会社 In-vivo information acquisition device, and capsule endoscope
EP2101630B1 (en) * 2006-12-28 2017-12-13 Olympus Corporation Capsule medical apparatus and body-cavity observation method
JP2008200344A (en) * 2007-02-21 2008-09-04 Hoya Corp Electronic endoscope and endoscope processor
JP4939326B2 (en) * 2007-07-10 2012-05-23 オリンパスメディカルシステムズ株式会社 Image display device, image display method, and image display program
JP5259141B2 (en) * 2007-08-31 2013-08-07 オリンパスメディカルシステムズ株式会社 In-subject image acquisition system, in-subject image processing method, and in-subject introduction device
KR100868339B1 (en) * 2007-11-15 2008-11-12 주식회사 인트로메딕 Method for displaying the medical image and system and method for providing captured image by the medical image
JP4789961B2 (en) * 2008-02-08 2011-10-12 オリンパス株式会社 Image display device
JP5374078B2 (en) 2008-06-16 2013-12-25 オリンパス株式会社 Image processing apparatus, image processing method, and image processing program
JP2010035637A (en) * 2008-07-31 2010-02-18 Olympus Medical Systems Corp Image display apparatus and endoscope system using the same
KR100900533B1 (en) 2008-10-07 2009-06-02 주식회사 인트로메딕 Method for displaying screen of capsule endoscope image, and record media recoded program for implement thereof
KR100931947B1 (en) * 2009-06-10 2009-12-15 주식회사 인트로메딕 Method of display processing images in digest organs obtained by capsule endoscope
JP5460488B2 (en) * 2010-06-29 2014-04-02 富士フイルム株式会社 Electronic endoscope system, processor device for electronic endoscope, image retrieval system, and method of operating electronic endoscope system
CN102843950B (en) * 2010-11-08 2015-02-04 奥林巴斯医疗株式会社 Image display device and capsule endoscope system
US20130002842A1 (en) * 2011-04-26 2013-01-03 Ikona Medical Corporation Systems and Methods for Motion and Distance Measurement in Gastrointestinal Endoscopy
JP5242731B2 (en) * 2011-04-28 2013-07-24 オリンパス株式会社 Image display system and image display terminal device
WO2012165298A1 (en) * 2011-06-01 2012-12-06 オリンパスメディカルシステムズ株式会社 Receiving device and capsule-type endoscope system
JP5846841B2 (en) * 2011-10-14 2016-01-20 株式会社東芝 Medical image display device
JP6596203B2 (en) 2012-02-23 2019-10-23 スミス アンド ネフュー インコーポレイテッド Video endoscope system
CN104640496A (en) * 2013-05-31 2015-05-20 奥林巴斯医疗株式会社 Medical device
US10204411B2 (en) 2014-05-09 2019-02-12 Given Imaging Ltd. System and method for sequential image analysis of an in vivo image stream
JP6548730B2 (en) * 2014-08-23 2019-07-24 インテュイティブ サージカル オペレーションズ, インコーポレイテッド System and method for display of pathological data in image guided procedures
WO2016084779A1 (en) * 2014-11-27 2016-06-02 オリンパス株式会社 Image playback apparatus and image playback program
JPWO2016110993A1 (en) * 2015-01-09 2017-10-19 オリンパス株式会社 Endoscope system, endoscope apparatus, and control method for endoscope system
US10334445B2 (en) * 2015-10-14 2019-06-25 Cisco Technology, Inc. Accurate detection of rogue wireless access points
CN107037952B (en) * 2016-02-03 2020-01-24 腾讯科技(深圳)有限公司 Resource display control method and device
WO2019064704A1 (en) 2017-09-29 2019-04-04 オリンパス株式会社 Endoscopic image observation assistance system, endoscopic image observation assistance device, and endoscopic image observation assistance method
TWI646939B (en) * 2017-11-10 2019-01-11 沅聖科技股份有限公司 Micro endoscope device
CN112584738B (en) * 2018-08-30 2024-04-23 奥林巴斯株式会社 Recording device, image observation device, observation system, control method for observation system, and storage medium
JP7392521B2 (en) * 2020-03-04 2023-12-06 コニカミノルタ株式会社 Image display device, image display program, and method of operating the image display device
CN116721175B (en) * 2023-08-09 2023-10-10 安翰科技(武汉)股份有限公司 Image display method, image display device and capsule endoscope system

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5010412A (en) * 1988-12-27 1991-04-23 The Boeing Company High frequency, low power light source for video camera
US5032913A (en) * 1989-02-28 1991-07-16 Olympus Optical Co., Ltd. Electronic endoscope system equipped with color smear reducing means
US5604531A (en) * 1994-01-17 1997-02-18 State Of Israel, Ministry Of Defense, Armament Development Authority In vivo video camera system
US6217519B1 (en) * 1997-03-25 2001-04-17 Dwl Elektronische Systeme Gmbh Device and method for observing vessels, specially blood vessels
US6422994B1 (en) * 1997-09-24 2002-07-23 Olympus Optical Co., Ltd. Fluorescent diagnostic system and method providing color discrimination enhancement
US20020171669A1 (en) * 2001-05-18 2002-11-21 Gavriel Meron System and method for annotation on a moving image
US20020193669A1 (en) * 2000-05-31 2002-12-19 Arkady Glukhovsky Method for measurement of electrical characteristics of tissue
US20030063130A1 (en) * 2000-09-08 2003-04-03 Mauro Barbieri Reproducing apparatus providing a colored slider bar
US20030077223A1 (en) * 2001-06-20 2003-04-24 Arkady Glukhovsky Motility analysis within a gastrointestinal tract
US20030151661A1 (en) * 2002-02-12 2003-08-14 Tal Davidson System and method for displaying an image stream
US6934093B2 (en) * 1999-06-15 2005-08-23 Given Imaging Ltd Optical system
US7022067B2 (en) * 2000-05-15 2006-04-04 Given Imaging Ltd. System and method for controlling in vivo camera capture and display rate
US7118529B2 (en) * 2002-11-29 2006-10-10 Given Imaging, Ltd. Method and apparatus for transmitting non-image information via an image sensor in an in vivo imaging system

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3683389A (en) 1971-01-20 1972-08-08 Corning Glass Works Omnidirectional loop antenna array
JPS5519124A (en) 1978-07-27 1980-02-09 Olympus Optical Co Camera system for medical treatment
JPS5745833A (en) 1980-09-01 1982-03-16 Taeko Nakagawa Stomack camera
US4741317A (en) 1987-06-12 1988-05-03 General Motors Corporation Vapor recovery system with variable delay purge
JP2579372B2 (en) 1989-12-04 1997-02-05 日本テキサス・インスツルメンツ株式会社 Low power imaging device
JPH04109927A (en) 1990-08-31 1992-04-10 Toshiba Corp Electronic endoscope apparatus
JP2948900B2 (en) 1990-11-16 1999-09-13 オリンパス光学工業株式会社 Medical capsule
US5279607A (en) 1991-05-30 1994-01-18 The State University Of New York Telemetry capsule and process
US5392072A (en) 1992-10-23 1995-02-21 International Business Machines Inc. Hybrid video compression system and method capable of software-only decompression in selected multimedia systems
JP3402659B2 (en) 1993-05-13 2003-05-06 オリンパス光学工業株式会社 Image handling equipment
JP3544557B2 (en) 1994-04-08 2004-07-21 オリンパス株式会社 Image file device
US5970173A (en) 1995-10-05 1999-10-19 Microsoft Corporation Image compression and affine transformation for image motion compensation
US5971767A (en) * 1996-09-16 1999-10-26 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual examination
JP3767033B2 (en) 1996-09-20 2006-04-19 ソニー株式会社 Image editing device
US6032678A (en) 1997-03-14 2000-03-07 Shraga Rottem Adjunct to diagnostic imaging systems for analysis of images of an object or a body part or organ
JP3736706B2 (en) 1997-04-06 2006-01-18 ソニー株式会社 Image display apparatus and method
IL123073A0 (en) * 1998-01-26 1998-09-24 Simbionix Ltd Endoscopic tutorial system
JPH11225996A (en) * 1998-02-19 1999-08-24 Olympus Optical Co Ltd Capsule type in vivo information detector
EP1968067A1 (en) 1999-03-30 2008-09-10 Tivo, Inc. Multimedia program bookmarking system
JP2001143005A (en) 1999-11-16 2001-05-25 Nippon Koden Corp Medical image display system
US20020177779A1 (en) * 2001-03-14 2002-11-28 Doron Adler Method and system for detecting colorimetric abnormalities in vivo
JP2002290783A (en) * 2001-03-28 2002-10-04 Fuji Photo Optical Co Ltd Electronic endoscope device
US6939292B2 (en) 2001-06-20 2005-09-06 Olympus Corporation Capsule type endoscope
KR100411631B1 (en) 2001-10-18 2003-12-18 주식회사 메디미르 Fluorescence endoscope apparatus and a method for imaging tissue within a body using the same
WO2005031650A1 (en) * 2003-10-02 2005-04-07 Given Imaging Ltd. System and method for presentation of data streams

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5010412A (en) * 1988-12-27 1991-04-23 The Boeing Company High frequency, low power light source for video camera
US5032913A (en) * 1989-02-28 1991-07-16 Olympus Optical Co., Ltd. Electronic endoscope system equipped with color smear reducing means
US5604531A (en) * 1994-01-17 1997-02-18 State Of Israel, Ministry Of Defense, Armament Development Authority In vivo video camera system
US6217519B1 (en) * 1997-03-25 2001-04-17 Dwl Elektronische Systeme Gmbh Device and method for observing vessels, specially blood vessels
US6422994B1 (en) * 1997-09-24 2002-07-23 Olympus Optical Co., Ltd. Fluorescent diagnostic system and method providing color discrimination enhancement
US6934093B2 (en) * 1999-06-15 2005-08-23 Given Imaging Ltd Optical system
US7022067B2 (en) * 2000-05-15 2006-04-04 Given Imaging Ltd. System and method for controlling in vivo camera capture and display rate
US20020193669A1 (en) * 2000-05-31 2002-12-19 Arkady Glukhovsky Method for measurement of electrical characteristics of tissue
US20030063130A1 (en) * 2000-09-08 2003-04-03 Mauro Barbieri Reproducing apparatus providing a colored slider bar
US20020171669A1 (en) * 2001-05-18 2002-11-21 Gavriel Meron System and method for annotation on a moving image
US7119814B2 (en) * 2001-05-18 2006-10-10 Given Imaging Ltd. System and method for annotation on a moving image
US20030077223A1 (en) * 2001-06-20 2003-04-24 Arkady Glukhovsky Motility analysis within a gastrointestinal tract
US20050281446A1 (en) * 2001-06-20 2005-12-22 Given Imaging Ltd. Motility analysis within a gastrointestinal tract
US6944316B2 (en) * 2001-06-20 2005-09-13 Given Imaging Ltd Motility analysis within a gastrointestinal tract
US20030151661A1 (en) * 2002-02-12 2003-08-14 Tal Davidson System and method for displaying an image stream
US7118529B2 (en) * 2002-11-29 2006-10-10 Given Imaging, Ltd. Method and apparatus for transmitting non-image information via an image sensor in an in vivo imaging system

Cited By (139)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9113846B2 (en) 2001-07-26 2015-08-25 Given Imaging Ltd. In-vivo imaging device providing data compression
US20040027500A1 (en) * 2002-02-12 2004-02-12 Tal Davidson System and method for displaying an image stream
US7505062B2 (en) 2002-02-12 2009-03-17 Given Imaging Ltd. System and method for displaying an image stream
US7474327B2 (en) 2002-02-12 2009-01-06 Given Imaging Ltd. System and method for displaying an image stream
US20030151661A1 (en) * 2002-02-12 2003-08-14 Tal Davidson System and method for displaying an image stream
US8022980B2 (en) 2002-02-12 2011-09-20 Given Imaging Ltd. System and method for displaying an image stream
US11484205B2 (en) 2002-03-25 2022-11-01 Masimo Corporation Physiological measurement device
US7885446B2 (en) * 2003-06-12 2011-02-08 Given Imaging Ltd. System and method to detect a transition in an image stream
US20100166272A1 (en) * 2003-06-12 2010-07-01 Eli Horn System and method to detect a transition in an image stream
US7215338B2 (en) 2003-10-02 2007-05-08 Given Imaging Ltd. System and method for presentation of data streams
US8144152B2 (en) * 2003-10-02 2012-03-27 Given Imaging, Ltd. System and method for presentation of data streams
US20100053313A1 (en) * 2003-10-02 2010-03-04 Eli Horn System and method for presentation of data streams
US20050075551A1 (en) * 2003-10-02 2005-04-07 Eli Horn System and method for presentation of data streams
US20070230893A1 (en) * 2003-12-31 2007-10-04 Gavriel Meron System and Method for Displaying an Image Stream
US9072442B2 (en) 2003-12-31 2015-07-07 Given Imaging Ltd. System and method for displaying an image stream
US8164672B2 (en) 2003-12-31 2012-04-24 Given Imaging Ltd. System and method for displaying an image stream
US20060164511A1 (en) * 2003-12-31 2006-07-27 Hagal Krupnik System and method for displaying an image stream
US20080051659A1 (en) * 2004-06-18 2008-02-28 Koji Waki Ultrasonic Diagnostic Apparatus
US20080262304A1 (en) * 2004-06-30 2008-10-23 Micha Nisani In-Vivo Sensing System Device and Method for Real Time Viewing
US20060020202A1 (en) * 2004-07-06 2006-01-26 Mathew Prakash P Method and appartus for controlling ultrasound system display
US7717849B2 (en) * 2004-07-06 2010-05-18 Gerneral Electric Company Method and apparatus for controlling ultrasound system display
US20060034521A1 (en) * 2004-07-16 2006-02-16 Sectra Imtec Ab Computer program product and method for analysis of medical image data in a medical imaging system
US20070268280A1 (en) * 2004-08-23 2007-11-22 Manabu Fujita Image Display Apparatus, Image Display Method, and Image Display Program
AU2005214199B8 (en) * 2004-09-27 2011-09-29 Given Imaging Ltd System and method for editing an image stream captured in vivo
AU2005214199B2 (en) * 2004-09-27 2011-06-09 Given Imaging Ltd System and method for editing an image stream captured in vivo
AU2005214199A8 (en) * 2004-09-27 2011-09-29 Given Imaging Ltd System and method for editing an image stream captured in vivo
US7986337B2 (en) * 2004-09-27 2011-07-26 Given Imaging Ltd. System and method for editing an image stream captured in vivo
US20060074275A1 (en) * 2004-09-27 2006-04-06 Tal Davidson System and method for editing an image stream captured in vivo
US20070191677A1 (en) * 2004-10-29 2007-08-16 Olympus Corporation Image processing method and capsule type endoscope device
US20060106318A1 (en) * 2004-11-15 2006-05-18 Tal Davidson System and method for displaying an image stream
US7486981B2 (en) * 2004-11-15 2009-02-03 Given Imaging Ltd. System and method for displaying an image stream
US7770725B2 (en) 2005-01-05 2010-08-10 Olympus Corporation Capsule endoscope storage case
US20080027267A1 (en) * 2005-01-05 2008-01-31 Hidetake Segawa Capsule Endoscope Storage Case
EP1834568A4 (en) * 2005-01-05 2009-07-08 Olympus Corp Encapsulated endoscope case
EP1834568A1 (en) * 2005-01-05 2007-09-19 Olympus Corporation Encapsulated endoscope case
US20080119691A1 (en) * 2005-03-22 2008-05-22 Yasushi Yagi Capsule Endoscope Image Display Controller
US8005279B2 (en) * 2005-03-22 2011-08-23 Osaka University Capsule endoscope image display controller
US20090076320A1 (en) * 2005-04-07 2009-03-19 Toshiaki Shigemori Intra-subject information acquiring system
EP1867280A4 (en) * 2005-04-07 2010-11-17 Olympus Medical Systems Corp System for acquiring information on inside of subject
EP1867280A1 (en) * 2005-04-07 2007-12-19 Olympus Medical Systems Corp. System for acquiring information on inside of subject
US8348832B2 (en) 2005-04-07 2013-01-08 Olympus Medical Systems Corp. Intra-subject information acquiring system
US7953261B2 (en) * 2005-04-13 2011-05-31 Olympus Medical Systems Corporation Image processing apparatus and image processing method
US20070292011A1 (en) * 2005-04-13 2007-12-20 Hirokazu Nishimura Image Processing Apparatus and Image Processing Method
US20080004494A1 (en) * 2005-04-14 2008-01-03 Akira Matsui Simplified Image Display Apparatus and Receiving System
US8144192B2 (en) 2005-04-14 2012-03-27 Olympus Medical Systems Corp. Simplified image display apparatus and receiving system
EP1870012A1 (en) * 2005-04-14 2007-12-26 Olympus Medical Systems Corp. Simple image display apparatus and receiving system
EP1870012A4 (en) * 2005-04-14 2009-11-18 Olympus Medical Systems Corp Simple image display apparatus and receiving system
EP1882440A4 (en) * 2005-05-20 2016-06-15 Olympus Corp Image display device
US20080172255A1 (en) * 2005-08-22 2008-07-17 Katsumi Hirakawa Image display apparatus
US20090027486A1 (en) * 2005-08-22 2009-01-29 Katsumi Hirakawa Image display apparatus
US8169472B2 (en) 2005-08-22 2012-05-01 Olympus Corporation Image display apparatus with interactive database
EP1922983A1 (en) * 2005-09-09 2008-05-21 Olympus Corporation Intra-lumen image viewer
EP1922977A4 (en) * 2005-09-09 2012-06-20 Olympus Medical Systems Corp Image display device
US8038608B2 (en) 2005-09-09 2011-10-18 Olympus Corporation Body-cavity image observation apparatus
US20090259096A1 (en) * 2005-09-09 2009-10-15 Toshiaki Shigemori Body-cavity image obeservation apparatus
EP1922977A1 (en) * 2005-09-09 2008-05-21 Olympus Medical Systems Corp. Image display device
EP1922983A4 (en) * 2005-09-09 2010-06-23 Olympus Corp Intra-lumen image viewer
US20070060798A1 (en) * 2005-09-15 2007-03-15 Hagai Krupnik System and method for presentation of data streams
US20070066875A1 (en) * 2005-09-18 2007-03-22 Eli Horn System and method for identification of images in an image database
US8175347B2 (en) * 2005-11-24 2012-05-08 Olympus Medical Systems Corp. In vivo image display apparatus, receiving apparatus, and image display system using same and image display method thereof
US20070173714A1 (en) * 2005-11-24 2007-07-26 Katsumi Hirakawa In vivo image display apparatus, receiving apparatus, and image display system using same and image display method thereof
US20070167715A1 (en) * 2005-12-05 2007-07-19 Toshiaki Shigemori Receiving device
US9084556B2 (en) * 2006-01-19 2015-07-21 Toshiba Medical Systems Corporation Apparatus for indicating locus of an ultrasonic probe, ultrasonic diagnostic apparatus
US20070239004A1 (en) * 2006-01-19 2007-10-11 Kabushiki Kaisha Toshiba Apparatus and method for indicating locus of an ultrasonic probe, ultrasonic diagnostic apparatus and method
US20070237375A1 (en) * 2006-03-23 2007-10-11 Hiromasa Yamagishi Apparatus and method of displaying image scanning report
US7978890B2 (en) * 2006-03-23 2011-07-12 Kabushiki Kaisha Toshiba Apparatus and method of displaying image scanning report
US7742059B2 (en) 2006-06-29 2010-06-22 Scientific-Atlanta, Llc Filling blank spaces of a display screen
US20080001971A1 (en) * 2006-06-29 2008-01-03 Scientific-Atlanta, Inc. Filling Blank Spaces of a Display Screen
WO2008002786A2 (en) * 2006-06-29 2008-01-03 Scientific-Atlanta, Inc. Filling blank spaces of a display screen
WO2008002786A3 (en) * 2006-06-29 2008-04-10 Scientific Atlanta Filling blank spaces of a display screen
US8900124B2 (en) 2006-08-03 2014-12-02 Olympus Medical Systems Corp. Image display device
US20090252390A1 (en) * 2006-10-02 2009-10-08 Olympus Corporation Image processing apparatus and image processing method
US8135192B2 (en) * 2006-10-02 2012-03-13 Olympus Corporation Image processing apparatus and image processing method
US20080103359A1 (en) * 2006-10-26 2008-05-01 Tah-Yoong Lin Capsule-type endoscopic system with real-time image display
US20080161639A1 (en) * 2006-12-28 2008-07-03 Olympus Medical Systems Corporation Capsule medical apparatus and body-cavity observation method
US20090005639A1 (en) * 2007-01-12 2009-01-01 Olympus Medical Systems Corp. Capsule medical apparatus
US8702591B2 (en) 2007-01-12 2014-04-22 Olympus Medical Systems Corp. Capsule medical apparatus
US20100061597A1 (en) * 2007-06-05 2010-03-11 Olympus Corporation Image processing device, image processing program and image processing method
US8107704B2 (en) * 2007-06-05 2012-01-31 Olympus Corporation Image processing device, image processing program and image processing method
EP2158835A4 (en) * 2007-06-27 2012-06-13 Olympus Medical Systems Corp Image information display processing device
US20090003732A1 (en) * 2007-06-27 2009-01-01 Olympus Medical Systems Corp. Display processing apparatus for image information
EP2158835A1 (en) * 2007-06-27 2010-03-03 Olympus Medical Systems Corp. Image information display processing device
US8413079B2 (en) * 2007-06-27 2013-04-02 Olympus Medical Systems Corp. Display processing apparatus for image information
US8133169B2 (en) 2007-09-19 2012-03-13 Olympus Medical Systems Corp. In-vivo image acquiring system capable of controlling illuminating unit and determining whether to wirelessly transmit image information based on estimated distance
US20100329520A2 (en) * 2007-11-08 2010-12-30 Olympus Medical Systems Corp. Method and System for Correlating Image and Tissue Characteristic Data
US20090124853A1 (en) * 2007-11-08 2009-05-14 Kazuhiro Gono Capsule Blood Detection System and Method
US20090306522A1 (en) * 2007-11-08 2009-12-10 Kazuhiro Gono Method and Apparatus for Detecting Abnormal Living Tissue
US9017248B2 (en) 2007-11-08 2015-04-28 Olympus Medical Systems Corp. Capsule blood detection system and method
US9131847B2 (en) 2007-11-08 2015-09-15 Olympus Corporation Method and apparatus for detecting abnormal living tissue
US20090123043A1 (en) * 2007-11-08 2009-05-14 Kazuhiro Gono Method and System for Correlating Image and Tissue Characteristic Data
US20100269064A1 (en) * 2007-12-13 2010-10-21 Koninklijke Philips Electronics N.V. Navigation in a series of images
EP2235652B2 (en) 2007-12-13 2022-06-15 Koninklijke Philips N.V. Navigation in a series of images
EP2235652B1 (en) 2007-12-13 2018-10-17 Koninklijke Philips N.V. Navigation in a series of images
CN109584996A (en) * 2007-12-13 2019-04-05 皇家飞利浦电子股份有限公司 Navigation in a series of images
US9538937B2 (en) 2008-06-18 2017-01-10 Covidien Lp System and method of evaluating a subject with an ingestible capsule
US8744231B2 (en) * 2008-10-07 2014-06-03 Intromedic Method of displaying image taken by capsule endoscope and record media of storing program for carrying out that method
US20100086286A1 (en) * 2008-10-07 2010-04-08 Intromedic Method of displaying image taken by capsule endoscope and record media of storing program for carrying out that method
US8768017B2 (en) * 2008-11-07 2014-07-01 Olympus Corporation Image processing apparatus, computer readable recording medium storing therein image processing program, and image processing method
US20100119110A1 (en) * 2008-11-07 2010-05-13 Olympus Corporation Image display device, computer readable storage medium storing image processing program, and image processing method
US20110196201A1 (en) * 2009-03-11 2011-08-11 Olympus Medical Systems Corp. Image processing system, external device, and image processing method
US8167789B2 (en) * 2009-03-11 2012-05-01 Olympus Medical Systems Corp. Image processing system and method for body-insertable apparatus
US8298136B2 (en) * 2009-04-20 2012-10-30 Olympus Medical Systems Corp. In-vivo examination system
US20110224490A1 (en) * 2009-04-20 2011-09-15 Olympus Medical Systems Corp. In-vivo examination system
US20110032259A1 (en) * 2009-06-09 2011-02-10 Intromedic Co., Ltd. Method of displaying images obtained from an in-vivo imaging device and apparatus using same
US11963736B2 (en) 2009-07-20 2024-04-23 Masimo Corporation Wireless patient monitoring system
US11900775B2 (en) 2009-12-21 2024-02-13 Masimo Corporation Modular patient monitor
US8682142B1 (en) 2010-03-18 2014-03-25 Given Imaging Ltd. System and method for editing an image stream captured in-vivo
US10101890B2 (en) 2010-04-28 2018-10-16 Given Imaging Ltd. System and method for displaying portions of in-vivo images
US9060673B2 (en) 2010-04-28 2015-06-23 Given Imaging Ltd. System and method for displaying portions of in-vivo images
US8768024B1 (en) * 2010-06-01 2014-07-01 Given Imaging Ltd. System and method for real time detection of villi texture in an image stream of the gastrointestinal tract
US8922633B1 (en) 2010-09-27 2014-12-30 Given Imaging Ltd. Detection of gastrointestinal sections and transition of an in-vivo device there between
US8965079B1 (en) 2010-09-28 2015-02-24 Given Imaging Ltd. Real time detection of gastrointestinal sections and transitions of an in-vivo device therebetween
US8655036B2 (en) * 2011-02-14 2014-02-18 Siemens Medical Solutions Usa, Inc. Presentation of locations in medical diagnosis
DE112011104887B4 (en) * 2011-02-14 2016-12-22 Siemens Medical Solutions Usa, Inc. Non-transitory computer-readable storage medium for displaying markings of computer-aided detection in the computed tomography diagnosis of the colon
US20120207369A1 (en) * 2011-02-14 2012-08-16 Siemens Medical Solutions Usa, Inc. Presentation of Locations in Medical Diagnosis
WO2012112182A1 (en) * 2011-02-14 2012-08-23 Siemens Medical Solutions Usa, Inc. Presentation of locations in medical diagnosis
US8918740B2 (en) * 2011-03-30 2014-12-23 Olympus Medical Systems Corp. Image management apparatus, method, and computer-readable recording medium and capsule endoscope system
US20130152020A1 (en) * 2011-03-30 2013-06-13 Olympus Medical Systems Corp. Image management apparatus, method, and computer-readable recording medium and capsule endoscope system
US8873816B1 (en) 2011-04-06 2014-10-28 Given Imaging Ltd. Method and system for identification of red colored pathologies in vivo
US11786183B2 (en) 2011-10-13 2023-10-17 Masimo Corporation Medical monitoring hub
JP2020014866A (en) * 2011-10-13 2020-01-30 マシモ・コーポレイション Medical monitoring hub
US11241199B2 (en) 2011-10-13 2022-02-08 Masimo Corporation System for displaying medical monitoring data
US11179114B2 (en) 2011-10-13 2021-11-23 Masimo Corporation Medical monitoring hub
US11918353B2 (en) 2012-02-09 2024-03-05 Masimo Corporation Wireless patient monitoring device
US11083397B2 (en) 2012-02-09 2021-08-10 Masimo Corporation Wireless patient monitoring device
US12109022B2 (en) 2012-02-09 2024-10-08 Masimo Corporation Wireless patient monitoring device
US9545192B2 (en) 2012-05-04 2017-01-17 Given Imaging Ltd. System and method for automatic navigation of a capsule based on image stream captured in-vivo
US10405734B2 (en) 2012-06-29 2019-09-10 Given Imaging Ltd. System and method for displaying an image stream
US9324145B1 (en) 2013-08-08 2016-04-26 Given Imaging Ltd. System and method for detection of transitions in an image stream of the gastrointestinal tract
EP3040016A4 (en) * 2013-08-30 2017-04-19 Olympus Corporation Image management device
US10360474B2 (en) * 2014-06-24 2019-07-23 Olympus Corporation Image processing device, endoscope system, and image processing method
US20170083791A1 (en) * 2014-06-24 2017-03-23 Olympus Corporation Image processing device, endoscope system, and image processing method
US10271828B2 (en) * 2016-05-18 2019-04-30 Olympus Corporation Image file creation method, medium with image file creation program recorded and image file creation apparatus
US20170337683A1 (en) * 2016-05-18 2017-11-23 Olympus Corporation Image file creation method, medium with image file creation program recorded and image file creation apparatus
CN111565622A (en) * 2017-12-28 2020-08-21 威斯康星校友研究基金会 Otoscope provided with low-obstruction electronic display
WO2019133496A3 (en) * 2017-12-28 2019-08-08 Wisconsin Alumni Research Foundation Otoscope providing low obstruction electronic display
WO2021041980A1 (en) * 2019-08-28 2021-03-04 Trustees Of Tufts College Systems and methods for high-throughput screening and analysis of drug delivery systems in vitro
USD991279S1 (en) * 2019-12-09 2023-07-04 Ankon Technologies Co., Ltd Display screen or portion thereof with transitional graphical user interface
USD991278S1 (en) * 2019-12-09 2023-07-04 Ankon Technologies Co., Ltd Display screen or portion thereof with transitional graphical user interface for auxiliary reading

Also Published As

Publication number Publication date
AU2004233673A1 (en) 2004-11-11
EP1618832A1 (en) 2006-01-25
CN101133935A (en) 2008-03-05
EP1891888B1 (en) 2013-04-03
JP2004321603A (en) 2004-11-18
CN101133935B (en) 2012-03-21
AU2004233673B9 (en) 2004-11-11
CN101133937A (en) 2008-03-05
AU2004233673C1 (en) 2008-06-05
JP3810381B2 (en) 2006-08-16
EP1891887A2 (en) 2008-02-27
WO2004096027A1 (en) 2004-11-11
EP1891888A3 (en) 2009-08-05
US20120002026A1 (en) 2012-01-05
KR100746422B1 (en) 2007-08-03
AU2004233673B2 (en) 2007-10-25
EP1891888A2 (en) 2008-02-27
EP1891886A3 (en) 2009-08-05
EP1891886B1 (en) 2013-04-03
CN101133936B (en) 2010-12-15
EP1618832B1 (en) 2012-12-12
US8620044B2 (en) 2013-12-31
EP1891886A2 (en) 2008-02-27
CN1777391A (en) 2006-05-24
KR20060013518A (en) 2006-02-10
EP1891887A3 (en) 2009-08-05
CN101133936A (en) 2008-03-05
CN100486514C (en) 2009-05-13
EP1891886A8 (en) 2008-04-23
CA2523304A1 (en) 2004-11-11
EP1618832A4 (en) 2009-08-05

Similar Documents

Publication Publication Date Title
US8620044B2 (en) Image display apparatus, image display method, and computer program
AU2004233674B2 (en) Image display apparatus, image display method and image display program
US20040225223A1 (en) Image display apparatus, image display method, and computer program
JP4554647B2 (en) Image display device, image display method, and image display program
JP4547401B2 (en) Image display device, image display method, and image display program
JP4547402B2 (en) Image display device, image display method, and image display program
AU2008200088B2 (en) Image Display Apparatus, Image Display Method and Image Display Program
AU2007221808C1 (en) Image display unit, image display method and image display program
CA2614635C (en) Image display apparatus, image display method, and image display program

Legal Events

Date Code Title Description
AS Assignment

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HONDA, TAKEMITSU;MINAI, TETSUO;REEL/FRAME:015574/0803

Effective date: 20040602

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION