
WO2002023403A2 - System and method for obtaining and utilizing maintenance information - Google Patents

System and method for obtaining and utilizing maintenance information

Info

Publication number
WO2002023403A2
Authority
WO
WIPO (PCT)
Prior art keywords
image
computer
act
camera
coupled
Prior art date
Application number
PCT/US2001/028587
Other languages
French (fr)
Other versions
WO2002023403A3 (en)
Inventor
Robert Lee Thompson
Original Assignee
Pinotage, Llc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pinotage, Llc. filed Critical Pinotage, Llc.
Priority to AU2001289056A priority Critical patent/AU2001289056A1/en
Priority to EP01968842A priority patent/EP1332443A2/en
Publication of WO2002023403A2 publication Critical patent/WO2002023403A2/en
Publication of WO2002023403A3 publication Critical patent/WO2002023403A3/en

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 5/00 Registering or indicating the working of vehicles
    • G07C 5/008 Registering or indicating the working of vehicles communicating information to a remotely located station
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64F GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
    • B64F 5/00 Designing, manufacturing, assembling, cleaning, maintaining or repairing aircraft, not otherwise provided for; Handling, transporting, testing or inspecting aircraft components, not otherwise provided for
    • B64F 5/60 Testing or inspecting aircraft components or systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/001 Industrial image inspection using an image reference approach

Definitions

  • the present invention relates to maintenance systems and, more particularly, to systems and methods for obtaining and utilizing maintenance information.
  • Maintenance logs are used to record maintenance information by personnel performing maintenance and inspection on objects, such as motors, aircraft, boats, machines, structures and buildings. These maintenance logs typically include information regarding the condition of the object and/or the work being performed on the object, and provide an historical record of such information. Typical logs take the form of notebooks, whereby the person performing the maintenance can write descriptions of the condition of the object and/or the work performed. The log can be maintained as a reference point for future maintenance and performance information regarding the object.
  • a method of maintaining an object comprises the acts of storing, in digital format, a first image of the object at a first time, obtaining a second image of the object at a second time, comparing the first image to the second image, and determining whether to perform maintenance on the object based, at least in part, on the act of comparing.
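For illustration only, the comparison step described above might be realized along the lines of the following Python sketch; the file names, the grayscale difference metric and the threshold are assumptions of this example rather than anything specified by the application.

```python
# Hypothetical sketch: compare a stored baseline image of a component with a
# current image and flag the component for maintenance if they differ enough.
import numpy as np
from PIL import Image

def mean_pixel_difference(baseline_path: str, current_path: str) -> float:
    """Return the mean absolute per-pixel difference between two grayscale images."""
    baseline = np.asarray(Image.open(baseline_path).convert("L"), dtype=np.float32)
    current_img = Image.open(current_path).convert("L")
    if current_img.size != (baseline.shape[1], baseline.shape[0]):
        # Resize the current image to the baseline's size so pixels line up.
        current_img = current_img.resize((baseline.shape[1], baseline.shape[0]))
    current = np.asarray(current_img, dtype=np.float32)
    return float(np.mean(np.abs(baseline - current)))

# Example decision rule; the threshold value is purely illustrative.
WEAR_THRESHOLD = 12.0
if mean_pixel_difference("engine_t0.png", "engine_t1.png") > WEAR_THRESHOLD:
    print("Difference exceeds threshold -- schedule maintenance inspection.")
else:
    print("No significant change detected.")
```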
  • a method of inspecting an object from a remote location comprises the acts of obtaining a digital image of the object at a first location, electronically transmitting the digital image to a second location remote from the first location, viewing the digital image at the second location, transmitting instructions to the first location, and performing an act on the object in response to the instructions.
  • an electronic inspection apparatus is provided.
  • the apparatus is adapted to communicate with a camera to obtain an image of an object.
  • the apparatus comprises a casing, a computer disposed within the casing, and a camera control unit disposed within the casing and coupled to the computer.
  • the camera control unit is adapted to receive electronic images from the camera, reformat the electronic images into digital format and pass the digitally formatted images to the computer.
  • the apparatus also includes an input device, coupled to the computer, that is adapted to allow a user to input full text data relating to the image.
  • an electronic inspection apparatus is provided.
  • the apparatus is adapted to communicate with a camera to obtain an image of an object.
  • the apparatus comprises a casing, a computer disposed within the casing, and a camera control unit disposed within the casing and coupled to the computer.
  • the camera control unit is adapted to receive electronic images from the camera, reformat the electronic images into digital format and pass the digitally formatted images to the computer.
  • the apparatus further includes a computer readable storage medium, coupled to the computer, having an executable code stored thereon. The code allows the computer to execute at least two processes in a multitask fashion.
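The multitasking behaviour mentioned above is not tied to any particular mechanism in the application; as a minimal sketch, assuming an ordinary threading facility, image capture and note taking could run concurrently as follows.

```python
# Hypothetical sketch: run image capture and note taking concurrently,
# as a stand-in for the multitasking behaviour described above.
import queue
import threading
import time

captured = queue.Queue()

def capture_images() -> None:
    """Pretend to grab a frame from the camera control unit once per second."""
    for frame_number in range(3):
        time.sleep(1.0)                      # stand-in for waiting on the CCU
        captured.put(f"frame-{frame_number}")

def take_notes() -> None:
    """Pretend to record inspector notes while capture runs in the background."""
    for note in ("check seal", "photograph turbine blade"):
        time.sleep(0.5)
        print(f"note recorded: {note}")

threads = [threading.Thread(target=capture_images), threading.Thread(target=take_notes)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(f"{captured.qsize()} frames captured while notes were taken")
```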
  • an electronic inspection apparatus is provided. The apparatus is adapted to communicate with a camera for obtaining an image of an object.
  • the apparatus comprises a casing, a computer disposed within the casing, and a control unit disposed within the casing and coupled to the computer.
  • the control unit is adapted to communicate with the camera.
  • the apparatus further includes an input device coupled to the computer and the control unit.
  • the input device is adapted to receive an input command from a user.
  • the control unit is adapted to receive the command and signal at least portions of the camera to react as commanded.
  • an aircraft inspection system, in another embodiment, includes a camera adapted to view a component of the aircraft, and a portable electronic apparatus communicating with the camera.
  • the apparatus includes a casing, a computer disposed within the casing, and a camera control unit coupled to the computer and disposed within the casing.
  • the camera control unit is adapted to receive an image from the camera and pass the image to the computer.
  • the apparatus also includes a display coupled to the computer that is adapted to display the image.
  • An input device is coupled to the computer and is adapted to allow a user to input maintenance data relating to the component.
  • the apparatus further includes a storage medium communicating with the computer. The storage medium is adapted to store the image and related data.
  • an electronic maintenance apparatus is provided.
  • the apparatus is adapted to communicate with a camera to obtain an image of an object.
  • the apparatus comprises a casing, a computer disposed within the casing, and a storage medium communicating with the computer.
  • the storage medium includes maintenance information regarding the object being imaged.
  • Figure 1 is a schematic representation of a maintenance system according to one aspect of the invention.
  • Figure 2 is an illustration of an exemplary use of the system of Figure 1;
  • Figure 3 is a perspective view of a maintenance apparatus for use with the system according to one embodiment of the invention;
  • Figure 4 is an exploded perspective view of the maintenance apparatus of Figure 3;
  • Figure 5 is a view of the maintenance apparatus of Figure 3 showing an example of a display provided by the maintenance apparatus;
  • Figure 6 is a partially cut away perspective view of an imaging system for use with the maintenance apparatus of Figures 3-5;
  • Figure 7 is a partially cut away perspective view of the imaging system shown in Figure 6;
  • Figures 8a and 8b are partially cut away perspective views of an illustrative focusing mechanism employed in the system of Figures 6-7;
  • Figure 9 is a partially cut away perspective view of an alternative embodiment of the imaging system including an adapter that adapts a standard camera head to be mated with a coupler shown in the system of Figures 6-7; and
  • Figure 10 is a partially cut away perspective view of the adapter shown in Figure 9.
  • a system for obtaining and storing maintenance information in electronic format includes an apparatus having an LCD, a touch panel, a camera connector, camera adjustments and a flashcard port.
  • the apparatus houses a camera control unit (CCU) and a computer, which are used to receive and process images from an imager which is attached to the apparatus at the camera connector.
  • This CCU and computer are also used to process images and data and place these images and data on a storage media such as a flashcard, which may be removably placed in the flashcard port.
  • the apparatus also has attachment connectors for an external keyboard if one is desired by the user, external computer display video OUT and IN connectors as well as battery and external power connectors.
  • the apparatus may be used by maintenance personnel to capture images of the equipment or objects they are inspecting or maintaining as well as enter notes or detailed descriptions in writing or voice recording as adjuncts to the aforementioned images.
  • the apparatus may also be wearable, battery powered, voice or touch activated. Once the pictures and data are captured and stored, they may be downloaded to other computers and/or transmitted via the Internet or other transport methods.
  • the storage media may be maintained with the apparatus in a separate housing carrying/storage case for permanent records that may stay with the apparatus for further reference.
  • the apparatus may use storage media which has been preformatted with desired maintenance programs that could contain parts lists, training material, instructions for use, instructions on how to accomplish a job at hand, checklists, operations manuals and other material not limited to the aforementioned.
  • the apparatus will enable the user to keep and maintain a wear history on mechanical objects (e.g., engine components), thus enabling the user to make judgments on when a part might fail prior to the part actually failing.
  • Another embodiment of the present invention is directed to a method of maintaining a digital maintenance file.
  • One embodiment of the present invention relates to a method of maintaining a digital maintenance file that includes pictures and/or text concerning the system being maintained.
  • the use of pictures is particularly powerful, as it enables one viewing the maintenance apparatus to compare and contrast the manner in which a component of the system has worn over time. It should be appreciated that any suitable type of camera can be used to take such pictures.
  • a set of pictures can be taken of key components of a system before the system is sent to the customer. Thereafter, during periodic maintenance checks, additional pictures can be taken, which can enable one viewing the maintenance apparatus to compare the way the parts have worn.
  • a computer readable medium can be installed on the system to be maintained, so that the maintenance file can be stored therein.
  • the storage medium provided with the system can include pictures of certain components of the system when initially shipped to the customer, although the aspect of the present invention related to installing the digital maintenance file on the system to be maintained is not limited in this respect.
  • the embodiment of the present invention relating to installing the storage medium that stores the digital maintenance file on the system to be maintained is not limited to the use of a photographic maintenance file, as embodiments of the present invention contemplate that merely a text maintenance file can be employed.
  • the digital maintenance file is mounted to the system to be maintained, such that the maintenance file always stays with the system and can be accessed by maintenance personnel wherever the system is present, and further, cannot be lost.
  • the maintenance file can be backed up and stored away from the system to be maintained to enhance the security of the data that comprises the digital maintenance file.
  • the apparatus can be provided with a video output, such that videotapes can be made of the digital pictures taken.
  • maintenance personnel can be provided with a remote system for recording digital information (photographic and/or text) into a computer readable medium that they can carry around with them while inspecting the system.
  • This remote system can be cordless for ease of use (e.g., it can be battery powered). Once the inspection is complete, the remote system can be coupled to the storage medium installed on the system to be maintained and the information from the maintenance inspection can be downloaded into the digital maintenance file on the system.
  • Such a maintenance apparatus can be used with numerous types of systems, including aircraft (e.g., airplanes and helicopters), boats, automobiles, trucks, military equipment (e.g., tanks, etc.) and other systems as will be explained below.
  • One embodiment is directed to a method and apparatus for obtaining, recording, displaying, storing, transmitting and/or receiving maintenance and other information electronically, allowing a user to capture and store images, sound, error codes, related text or voice data and/or other information concerning the system or object being maintained.
  • the information can be stored locally and/or transmitted to remote locations. Retrieval of the images and other information at a later date provides an historical perspective of the object, enabling one using the maintenance apparatus to compare and contrast the condition of the object over time.
  • diagnostic information and/or support information may also be transmitted to and from the maintenance apparatus. Such information may alternatively be pre-stored for later retrieval.
  • the maintenance apparatus may be used as an interface between the object to be inspected and the person performing the inspection.
  • the apparatus allows a user to receive maintenance information, such as historical and/or real-time information regarding the object, and determine a course for corrective action to be performed on the object as necessary. In this manner, a user may make maintenance judgments, such as, for example, whether the object needs maintenance or when the object might fail prior to the object actually failing.
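The application does not prescribe how such a failure judgment is made; one hedged illustration is to fit a trend line to a recorded wear history and project when an assumed service limit will be reached, as in the sketch below (all measurements and the limit are invented for the example).

```python
# Hypothetical sketch: extrapolate a recorded wear history to estimate when a
# part will reach its service limit, so maintenance can be scheduled beforehand.
import numpy as np

inspection_hours = np.array([0, 250, 500, 750, 1000], dtype=float)   # operating hours
wear_mm = np.array([0.00, 0.05, 0.11, 0.18, 0.24])                   # measured wear
SERVICE_LIMIT_MM = 0.40                                              # assumed limit

slope, intercept = np.polyfit(inspection_hours, wear_mm, deg=1)
if slope > 0:
    hours_at_limit = (SERVICE_LIMIT_MM - intercept) / slope
    print(f"Projected to reach the service limit near {hours_at_limit:.0f} operating hours.")
else:
    print("No measurable wear trend; keep monitoring.")
```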
  • a maintenance system 10 includes a maintenance apparatus 20 that receives real-time or current data 22a concerning the condition of one or more objects 24, such as a mechanical component, being inspected.
  • the data 22a concerning the object may relate to physical characteristics of the object 24, the interaction of two or more physical components, the operation of any object, such as the operating characteristics of any physical or electronic component, or any other characteristic of the object, as the present invention is not limited to receiving any particular types of data.
  • the data 22a may be in the form of one or more images 26, audio 28 (e.g., the sound of the object as it functions), error codes 30, any suitable combination thereof, or any other data, as the present invention is not limited in this respect.
  • the image 26 of the object may be generated by any image producing device, as the present invention is not limited in this respect.
  • audio 28 may be obtained with the use of any suitable device (e.g., a microphone), and the error code 30 may be obtained with any suitable interface.
  • Notes or detailed descriptions in text format 32 or voice recording 34 may be input into the apparatus 20 as adjuncts to the aforementioned data 22a and may be inputted using a user interface 36.
  • the data 22a may be presented to a user using one or more suitable output devices 38.
  • the maintenance apparatus 20 may store the data (labeled as 22b in Figure 1) locally (e.g., in a storage medium of the apparatus 20) or remotely (e.g., at a central maintenance facility).
  • the local storage medium may be internal or external to the apparatus 20 (e.g., in a separate housing carrying/storage case (not shown)), thereby providing a record that may stay with the apparatus 20 for further reference.
  • the apparatus may provide access to maintenance information that may include, in addition to the present data 22b concerning the object, any one or more of the following: information regarding the initial condition 39 of the object; historical information 40 of the object; diagnostic information 42; instructional information 44 (e.g., parts list, training materials, instructions for use, instructions on how to accomplish a job at hand, check lists, operations manuals, layout information, schematic and parts diagrams, object location diagrams, etc.); and support 46 (e.g., help menu and/or real time technical assistance from technical support personnel when the apparatus is communicating with a maintenance facility or manufacturer/provider of the object 24).
  • Such additional information may be stored locally (e.g., within the apparatus 20) or remotely, with the apparatus 20 having the capability to communicate with the remote location. Any of the above described information can be employed with the apparatus.
  • the historical information 40 may be provided using any suitable technique.
  • the historical information 40 may include a compilation of maintenance and inspection data 22b previously obtained by the user or users. Data concerning the initial condition 39 of an object may be provided to a customer of the system for subsequent comparison with real time information. For example, a set of images can be taken of key components of a system before the system is sent to a customer. During periodic maintenance checks, additional images can be taken, which can enable one viewing the maintenance apparatus to compare the current data with the initial condition information or historical information to determine the way the parts have worn.
  • the system can communicate with a remote facility.
  • This provides a number of advantages. For example, as may be the case with aircraft, maintenance for certain objects may be performed at different locations. Using the remote communication ability, an inspector at a first location may record his or her observations and upload the data 22b to a central database, so that an inspector at a second location may download that data prior to performing a subsequent inspection on the same aircraft.
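As a sketch of the upload/download exchange just described, assuming a plain HTTP service at a hypothetical endpoint (the application specifies no particular transport, URL or record format):

```python
# Hypothetical sketch: upload inspection data to a central facility and fetch
# the previous record for the same aircraft before a follow-on inspection.
import json
import requests

BASE_URL = "https://maintenance.example.com/api"   # assumed endpoint, for illustration

def upload_inspection(tail_number: str, record: dict, image_path: str) -> None:
    with open(image_path, "rb") as image_file:
        response = requests.post(
            f"{BASE_URL}/aircraft/{tail_number}/inspections",
            data={"record": json.dumps(record)},
            files={"image": image_file},
            timeout=30,
        )
    response.raise_for_status()

def latest_inspection(tail_number: str) -> dict:
    response = requests.get(
        f"{BASE_URL}/aircraft/{tail_number}/inspections/latest", timeout=30)
    response.raise_for_status()
    return response.json()

# Illustrative values; the tail number, notes and file name are invented.
upload_inspection("N12345", {"inspector": "R. Lee", "notes": "blade erosion"}, "blade.jpg")
print(latest_inspection("N12345")["notes"])
```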
  • a computer readable medium can be installed on the object to be maintained (e.g., installed on an aircraft), so that the maintenance information can be stored therein.
  • the storage medium provided with the object can include any of the types of data described above, including pictures of certain components of the object when initially shipped to the customer, although the aspect of the invention related to installing the maintenance information on the system to be maintained is not limited in this respect.
  • the embodiment of the present invention relating to installing the storage medium that stores the maintenance information on the object to be maintained is not limited to the use of image data, as embodiments of the present invention contemplate that text, audio, error code and/or other data can be employed.
  • An advantage of installing the maintenance information on the object to be maintained is that the maintenance information always stays with the object and can be accessed by maintenance personnel wherever the object is present, and cannot be lost.
  • the apparatus can be coupled to the storage medium installed on the object to be maintained and the information from the maintenance inspection can be downloaded into the file stored on the object.
  • the maintenance information can be backed up and stored away from the object to enhance the security of the data that comprises the maintenance information.
  • a maintenance worker or inspector 50 inspects an engine 52 of an airplane using the maintenance apparatus 20 according to one embodiment of the present invention.
  • the inspector 50 probes into the engine compartment 53 using a suitable data input device (such as a camera, scope, microphone, etc. (not shown)) coupled to the apparatus 20 via a link 54.
  • An inspection port 55 formed on the engine housing 53 may be used to facilitate inserting the input device to enable the user to obtain the desired data.
  • Data 22b (Fig. 1) is captured by the apparatus 20 for subsequent processing and analysis.
  • the inspector 50 inserts a camera into the engine compartment to obtain an image of the engine.
  • the inspector 50 may record additional data, such as notes regarding the condition of the engine, the serial number of the engine, the date of inspection, the aircraft tail number or other identifier, the inspector's name, etc. This can be performed using a user interface 36 (Fig. 1) of the apparatus 20, which can be a keyboard, touch screen or any other suitable interface, as will be described below.
  • the inspector 50 may also recall previously stored information regarding the engine, such as the aforementioned initial condition 39, historical information 40, diagnostic information 42 or instructional information 44, and determine a course of action.
  • the apparatus 20 may communicate with a remote facility through a suitable communications link (shown as 56 in Figure 2).
  • Link 56 can be any suitable communication medium, including wireless communication.
  • the remote facility may include a computer 57 storing a database (not shown) capable of storing any of the above mentioned information concerning the object being inspected.
  • Technicians at the remote facility may be able to remotely obtain and analyze the information obtained by the apparatus 20 to provide guidance to the inspector 50 regarding any action necessary.
  • the communication of the apparatus with the remote facility enables technicians at a remote site to obtain the data in real time, thereby enhancing maintenance efficiency. Alternatively, the technician at the remote facility may view and analyze the maintenance information at a later time.
  • the maintenance apparatus 20 may also be used as a communication interface between an inspection facility and the object 24 being inspected. In this manner, an inspector can be posted at the remote location while a helper is located on site to manipulate the apparatus 20 and/or its associated data gathering device(s). This enables the remote inspector to obtain real time data and render a maintenance decision from a remote location without the need for a skilled technician on site with the object being inspected.
  • one or more data gathering devices may be installed on the object to be inspected, with the apparatus 20 being capable of communicating with these devices.
  • an aircraft, ship or other object may be outfitted with several cameras capable of viewing certain areas within the object. The apparatus 20 may communicate with each of these cameras, via hardwire or wireless connection, to receive an image of the area to be inspected. Multiple views may also be generated to view an area from different locations and/or to view the interaction of multiple components.
  • the maintenance apparatus 20 may be implemented in any suitable manner, as the present invention is not limited in this respect.
  • the maintenance apparatus 20 is implemented as a portable hand-held digital computer/camera assembly.
  • the assembly may be housed within a casing, resulting in the approximate size and weight of a laptop computer.
  • the hand-held apparatus may be up to about ten to fourteen inches long, up to about eight to twelve inches wide, and up to about one to four inches thick.
  • the apparatus 20 may include or otherwise communicate with a storage medium and may also include a power source (e.g., a battery pack) that renders the apparatus cordless and easily transportable. In one embodiment, the apparatus 20 is less than about ten pounds.
  • the apparatus 20 is less than about five pounds, and most preferably, less than about three pounds. It should be appreciated that the power pack may comprise a large percentage of the weight. Thus, the weight of the apparatus 20 depends upon the size of the power pack included within the apparatus 20. With such a hand-held apparatus, increased portability and ease of use may be attained.
  • the illustrative embodiment of the apparatus 20 shown in Figures 3-5 includes several main components, including input devices 70a-70f, output devices 80a, 80b, 70b, 70c, a motherboard 90, a camera control unit 100, a video chip 110, and a casing 130, each of which will be discussed in more detail below.
  • the data input devices and the data output devices may be any number of devices, either internal to the apparatus or connected externally via any number of techniques, and in some instances, the input and output devices may be part of the same device.
  • the data being inputted to or outputted from the apparatus 20 may be in any format, including but not limited to, still image data, streaming video images, text and audio, and may be sent to or received by the apparatus as desired.
  • the motherboard 90 includes a central processing unit (CPU) 92, a computer readable storage medium 94 coupled to the CPU 92 (e.g., via a bus (not shown)), and at least one input/output (I/O) connection 95 coupled to the CPU 92.
  • the motherboard can be custom designed or can be any of a number of standard devices.
  • the motherboard 90 controls data flow and storage, and works in conjunction with the video chip 110 and camera control unit 100 (CCU) to facilitate image processing and display.
  • the input devices 70a-70f provide the apparatus 20 with data. At least one of the devices provides a user interface.
  • a user may be human or non-human, as in the case of an application program or another device. Any of a number of input devices may be employed.
  • the apparatus 20 may have any number of internal input devices, disposed within the confines of the casing of the apparatus, as well as any number of external devices through suitable connections.
  • the input devices can include control units, such as buttons, knobs or switches, keypads and touch screens, used to control various aspects of the apparatus, including the other input devices and the output devices.
  • Human user input can also be obtained from an externally connected mouse, keyboard, joystick, glove, headset, microphone or any other manually controlled devices.
  • a touch screen 70a is employed for human user input.
  • a touch screen controller 72 is connected to the touch screen 70a and the motherboard 90 and transfers the data from the touch screen 70a to the motherboard 90 for further processing and storage.
  • Any of the aforementioned external input or output devices may be attached to the apparatus 20 in numerous ways, via, for example, a connection port 74.
  • the apparatus may also include voice recognition software, so that data may be input or the system may be controlled by voice. Voice recordings may also be stored in the apparatus 20.
  • a flashcard 70b may be employed as a storage medium and may be installed through a PCMCIA (Personal Computer Memory Card International Association) card port 76.
  • the flashcard 70b may be in addition to the memory already present on the motherboard 90.
  • the flashcard 70b may be removable through the slot, or permanently attached to the apparatus 20 and contained within the device via a detachable, protective, screw-on covering 78. The card can be used to store pre-configured data.
  • Information stored on other devices can also be transmitted to the apparatus 20 via any of numerous communication mediums 70c, including but not limited to wireless communication media, such as cellular, satellite or infrared communication, modem connections, Ethernet connections, etc., which may be made through the PCMCIA port 76. Hardware enabling these communication mechanisms may be internal to the apparatus 20 in some embodiments and connected externally in others. Additionally, information may be transferred into the apparatus 20 via any of numerous devices, for example: magnetic media (e.g., videotapes, audiotapes or floppy disks), optical media (e.g., CDs, DVDs or laser disks), and electronic media (e.g., EPROM).
  • One method of connection for any video input is an S-Video (Super-Video) connection port 79 hardwired to an S-Video-compatible device capable of reading the product.
  • the present invention is not limited to this type of connection, as ports and devices formatted for other types of video signals may be employed, including, for example, a composite signal.
  • the apparatus 20 is capable of receiving images from a camera, such as camera 70d shown in Figure 3. Any suitable camera or cameras may be used, as the present invention is not limited in this respect.
  • the camera 70d is NTSC (National Television Standards Committee) compatible. NTSC is one of several camera standards used in the United States.
  • cameras compatible with other television broadcast standards may be used, including those compatible with the PAL (Phase Alternate Line) or SECAM (Systeme Electronique Couleur Avec Memoire) systems, or any other type of camera.
  • the camera may be connected to the apparatus 20 in any suitable manner, as the present invention is not limited in this respect.
  • the camera 70d is connected to the apparatus 20 through port 78 on the apparatus 20 via an electronic cable 79.
  • the camera may include an image sensor (e.g., a charge-coupled device, also referred to as a CCD), or a fiber optic cable extending from the camera may be employed.
  • a fiber optic cable may also be used to transmit digital code representative of the image viewed by the camera to the apparatus 20, even where the camera includes a CCD.
  • Wireless, Ethernet or modem connections enabling data and image transfer from remote cameras or other sources may also be employed, as the present invention is not limited to the use of any particular connection technique.
  • Audio signals from the object being inspected may also be stored and/or transmitted via the apparatus 20.
  • the camera 70d may include a microphone 70e to pick up such audio.
  • a separate probe including the microphone 70e or other such sound or vibration receiving device may be employed.
  • Error code signals may also be received by the apparatus 20 using a suitable connection 70f.
  • some of the input devices 70a-70f may be controlled by the apparatus 20, rather than by independent device controls.
  • one or more camera control buttons or other interfaces may be provided on the apparatus and coupled, through the apparatus, to the camera to allow a user to operate and maneuver the camera 70d.
  • Camera control may be made via a Motion Control Card (MCC) 97 that is hardwired to the camera 70d or otherwise communicates with the camera 70d via a wireless communication.
  • Camera maneuvering may be made using any of the foregoing input devices that may communicate with the MCC.
  • Control and/or maneuvering of the camera includes at least focusing, zooming, changing the viewing axis, etc., as the present invention is not limited in this respect.
  • Control of the camera can occur because, in one embodiment, the camera includes a stepper motor coupled to various components of the camera, e.g., a gimbal for moving the camera head.
  • the MCC can control the stepper motor as desired.
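As a schematic illustration of the stepper-motor control described above, the sketch below converts a requested pan angle into step pulses; the step angle, pulse timing and send_pulse stand-in are assumptions of this example, not the actual Motion Control Card interface.

```python
# Hypothetical sketch: translate a requested pan angle into stepper-motor pulses,
# standing in for commands a motion control card might send to the camera gimbal.
import time

STEP_ANGLE_DEG = 1.8          # assumed full-step angle of the gimbal motor
PULSE_SECONDS = 0.002         # assumed pulse width / inter-step delay

def send_pulse(direction: int) -> None:
    """Stand-in for the MCC's electrical step pulse (+1 clockwise, -1 counterclockwise)."""
    print(f"step {'+' if direction > 0 else '-'}")
    time.sleep(PULSE_SECONDS)

def pan_camera(delta_degrees: float) -> None:
    """Issue enough step pulses to rotate the camera head by delta_degrees."""
    steps = round(abs(delta_degrees) / STEP_ANGLE_DEG)
    direction = 1 if delta_degrees >= 0 else -1
    for _ in range(steps):
        send_pulse(direction)

pan_camera(9.0)    # roughly five 1.8-degree steps clockwise
```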
  • the camera 70d may be manipulated by hand, as the present invention is not limited in this respect.
  • a white balance control button 77, intended to compensate for the amount of ambient light coming into the camera 70d, may be employed. Control button 77 is internally connected to the CCU.
  • the apparatus 20 has at least one output device used to display and/or store images and data.
  • an LCD (Liquid Crystal Display) screen 80a is coupled internally to the motherboard 90 and is visible to the user through a cut-out in the casing 130.
  • An LCD back light inverter 82 may be employed to control the illumination of the screen 80a.
  • the LCD 80a works in conjunction with the aforementioned touch screen 70a to act as both an input and an output device.
  • the LCD is one example of a display and other suitable displays can be used.
  • This LCD 80a may be configured to display image data, video data and text data in any number of display patterns 84, as shown in Figure 5.
  • the display 84 includes a split screen comprising an image of keys, such as a typical keyboard setup 85, enabling a user to type on the touch screen 70a using his or her fingers or other such probe, and an image display region 86 for displaying the imaged component with related text, if included.
  • the orientation (landscape or portrait) of images in region 86 can be manipulated, as will be discussed below. These images may be still or streaming video, as the present invention is not limited to any particular convention.
  • the image display region may also include a split screen, wherein images and text data from two or more cameras, each viewing a component, may be displayed.
  • the split screen may display stored or historical images and/or text of one or more components as well as real time data.
  • the split screen may also be used to display any of the other aforementioned data. Additional electronic hardware and software may be necessary to view images in a split screen mode.
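One way the split-screen display described above could be composed, sketched here with Pillow as an assumed stand-in for the apparatus's display software (image names and layout are illustrative only):

```python
# Hypothetical sketch: place a stored (historical) image and a current image
# side by side, as one way to realize the split-screen display described above.
from PIL import Image

def side_by_side(left_path: str, right_path: str, height: int = 480) -> Image.Image:
    left = Image.open(left_path)
    right = Image.open(right_path)
    # Scale both images to a common height, preserving aspect ratio.
    left = left.resize((int(left.width * height / left.height), height))
    right = right.resize((int(right.width * height / right.height), height))
    canvas = Image.new("RGB", (left.width + right.width, height))
    canvas.paste(left, (0, 0))
    canvas.paste(right, (left.width, 0))
    return canvas

side_by_side("component_initial.png", "component_today.png").save("split_view.png")
```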
  • An external monitor or television may also be attached to the apparatus 20 and configured as a display in any of the manners disclosed above.
  • the external monitor is connected to the apparatus 20 via a hardwire connection to a VGA (Video Graphics Array) port 87.
  • a television is connected to the apparatus 20 via a hardwire connection to the aforementioned S-video port.
  • data output is made through the communication medium 70c, such as a modem, Ethernet or wireless devices.
  • Data may also be outputted to memory, including the aforementioned flashcard 70b, the motherboard's internal memory, or any other memory device known to those in the art, internal or external to the apparatus 20, such as the aforementioned magnetic media, optical media, or electronic media.
  • a speaker 80b may optionally be coupled to the apparatus 20 or otherwise included therein for presenting audio picked up by the microphone 70e, whether real-time or previously stored, regarding the object being inspected as well as previously recorded or real time voice transmission. It is to be appreciated, however, that the use of audio data and the speaker are not required for all embodiments.
  • the motherboard 90 controls data flowing in and out of the device and internal device activity.
  • the motherboard contains the CPU 92, memory, buses, and I/O connection sockets.
  • the CPU can be any suitable processor (e.g., such as a Mobile P3, available from the Intel Corporation, Santa Clara, CA).
  • the motherboard 90 can be custom designed, or can be any of numerous commercially available motherboards.
  • One such motherboard 90 that may be employed is the Microbus MPX-233111 , manufactured by Microbus Inc. of Houston, Texas.
  • the Microbus MPX-233111 contains a video chip 110 coupled to the motherboard 90 through a COM (serial communications) port.
  • This motherboard may be used with a Philips 69000 video chip, manufactured by Philips Semiconductors of Eindhoven, The Netherlands, as the video chip 110.
  • the CCU 100 is also coupled to the motherboard 90 and is used to control and receive images from one or more of the external cameras 70d described above.
  • a CCU 100 that may be used is the Panasonic GP-KS 162CBP WNTCE manufactured by the Panasonic Systems Company of Elgin, Illinois. Both the video chip 110 and the CCU 100 aid in manipulating and displaying graphics data. It should be appreciated that the name brand and type of components described are exemplary, as the present invention is not limited in this respect.
  • Most incoming data flows through the motherboard 90 upon entering the apparatus 20. Input data received via the camera 70d may be received by the CCU 100 before being processed by the motherboard 90.
  • the CCU 100 is capable of controlling one or more parameters of camera generated images including gain and white light balance and controlling an electronic iris for contrast.
  • the aforementioned white balance control button 77 is connected to the CCU 100 so that an initial white balance reading may be obtained. To take such a reading, the user places a piece of white paper in front of the camera 70d and depresses the white balance control button 77. The CCU 100 uses this reading to measure the amount of ambient light. Then, the CCU 100 uses the reading to adjust the color data in all subsequent camera shots, compensating for the ambient light.
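A generic per-channel gain calculation, shown below as a sketch, captures the idea of the white-balance procedure just described; it is an assumption about how such a correction might be computed, not the CCU's actual algorithm.

```python
# Hypothetical sketch: derive per-channel white-balance gains from a frame of
# white paper, then apply them to subsequent frames.
import numpy as np

def white_balance_gains(white_frame: np.ndarray) -> np.ndarray:
    """white_frame: HxWx3 RGB array captured while viewing white paper."""
    channel_means = white_frame.reshape(-1, 3).mean(axis=0)
    return channel_means.mean() / channel_means        # one gain per R, G, B channel

def apply_gains(frame: np.ndarray, gains: np.ndarray) -> np.ndarray:
    corrected = frame.astype(np.float32) * gains
    return np.clip(corrected, 0, 255).astype(np.uint8)

reference = np.full((480, 640, 3), (200, 190, 170), dtype=np.uint8)   # simulated white-paper frame
gains = white_balance_gains(reference)
balanced = apply_gains(reference, gains)      # channel averages now roughly equal
print(gains, balanced.reshape(-1, 3).mean(axis=0))
```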
  • the CCU can also perform analog to digital (A/D) conversion.
  • the CCU may receive images in any electronic format from the camera and reformat the images into digital format.
  • the CCU then passes the digitally formatted image to the CPU.
  • the video chip 110 can perform a variety of image manipulations on any image, and is not limited to manipulating solely camera generated images.
  • the video chip 110 is capable of A/D conversion, as well as formatting the image into known image formats, such as JPEG (Joint Photographic Experts Group).
  • the CPU 92 retrieves any requested data and sends it to the proper output device as requested.
  • the CPU 92 also processes, stores or sends any inputted data as directed.
  • Software used in the apparatus 20 may be run by and controlled by the CPU 92.
  • Such software may be custom software or commercially available software, such as XFREE86 provided by The XFree86 Project, Inc. (available from the University of Sydney, Australia) that runs on UNIX ® and compatible (e.g., Linux, BSD, Mac OS X and Solaris x86 series) operating systems and OS/2, and a suitable window manager.
  • This or other software may be used so that the CPU can perform concurrent operations of two or more processes in a multitask fashion.
  • the Linux operating system, available from Linux.com, is run on the apparatus.
  • Word processing or other text processing software may be employed to handle partial or full text inputs by a user.
  • any text information that a user desires may be inputted, not merely pre-programmed information.
  • preprogrammed information, such as checklists, may also be employed.
  • the images or audio data may be attached as a file to the resulting text file.
  • Additional software may include an image manipulation package, enabling the data to be formatted according to certain display constraints. Some possible manipulations may include image rotation, image sizing and choosing between landscape and portrait display options.
  • the CPU 92 may employ any of a number of algorithms to handle these tasks, as will be explained below.
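The manipulations mentioned above (rotation, sizing and landscape/portrait selection) might look like the following sketch, with Pillow assumed as a stand-in for the image manipulation package.

```python
# Hypothetical sketch of the display-oriented manipulations mentioned above:
# rotation, resizing, and switching between landscape and portrait orientation.
from PIL import Image

def fit_for_display(path: str, display_size=(640, 480), portrait=False) -> Image.Image:
    image = Image.open(path)
    if portrait:
        image = image.rotate(90, expand=True)    # rotate into portrait orientation
    image.thumbnail(display_size)                # resize in place, preserving aspect ratio
    return image

fit_for_display("inspection_photo.jpg", portrait=True).save("inspection_display.jpg")
```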
  • the memory 94 is used to buffer several frames of incoming streaming video such that the images can be processed frame by frame and then displayed to the user at a rate comparable to that of real time, but several microseconds later. This process improves display quality and facilitates image manipulation. For example, each frame in the buffer may be rotated prior to being displayed to the user.
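A minimal sketch of the frame-buffering scheme just described, with the buffer depth, frame source and per-frame rotation chosen arbitrarily for illustration:

```python
# Hypothetical sketch: buffer a few frames of incoming video, process each one
# (here, a 90-degree rotation), and emit it a few frames later for display.
from collections import deque

import numpy as np

BUFFER_DEPTH = 4                      # assumed number of buffered frames

def rotate_frame(frame: np.ndarray) -> np.ndarray:
    return np.rot90(frame)            # stand-in for per-frame image manipulation

def stream(frames):
    """Yield processed frames, delayed by BUFFER_DEPTH relative to the input."""
    buffer = deque()
    for frame in frames:
        buffer.append(rotate_frame(frame))
        if len(buffer) >= BUFFER_DEPTH:
            yield buffer.popleft()
    while buffer:                     # flush remaining frames at end of stream
        yield buffer.popleft()

incoming = (np.zeros((480, 640, 3), dtype=np.uint8) for _ in range(10))
for display_frame in stream(incoming):
    pass                              # a real display loop would draw each frame here
```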
  • the CPU 92 can interface with the motherboard's memory 94 in any of numerous ways, e.g., through various busses.
  • the motherboard 90 contains 64 MB of RAM (Random Access Memory).
  • the present invention is not limited by the type or amount of storage placed on the motherboard 90, as additional types or amounts may be coupled to the motherboard 90.
  • both the memory 94 and the CPU 92 interface with the I/O devices through the I/O connection.
  • a power supply interface is provided by a port 112 capable of hardwire connection to an external power supply.
  • the power supply level may be about 12 V, or other levels may be employed.
  • the apparatus 20 can include an on-board power source, such as a battery 114 (Fig. 4), which may be rechargeable and housed within the casing, thereby rendering the apparatus 20 cordless.
  • the various components forming the apparatus 20 may be housed within a casing 130.
  • the casing 130 includes a front casing 130a and a back casing 130b that interconnect to form an enclosure.
  • the front casing 130a contains a cutout 132 for the display screen 80a and touch screen 70a.
  • the back casing 130b is substantially rectangular and may also include one or more cutouts 134 for ports to external devices and/or control buttons, knobs, switches or other interfaces.
  • the front and back casings 130a, 130b may be secured together using any suitable technique, such as with the use of screws.
  • the casing 130 may contain various bosses to support and secure the various electronic and mechanical components of the apparatus 20.
  • the casing 130 also contains two sets of four curved finger grooves 136 on the external side to aid in handling the apparatus 20. Handles 138 are attached to the casing 130 over these grooves, leaving about a one to two inch space for a user's hands. A hook 140 may be mounted to the case to allow the apparatus 20 to be hung for hands-free use. It should also be recognized that the casing for the apparatus 20 can take many other shapes and configurations, as the present invention is not limited in this respect.
  • the casing 130 of the apparatus 20 may be manufactured out of many types of material in order to satisfy the needs of the user. For example, the apparatus 20 may be ruggedized and/or waterproofed.
  • any suitable type of imaging unit or camera can be used with the apparatus 20 to provide images of the object 24.
  • an imaging system including a camera assembly and a scope, with which the apparatus 20 of the present invention can be used, will now be described with reference to Figures 6-10.
  • Figure 6 is a partially cut away perspective view of an example of an imaging system that may be used with the apparatus 20.
  • the imaging system includes four primary components, i.e., a scope 150, such as an endoscope, an imaging unit or camera assembly 152, a coupler 154, which couples the scope 150 to the imaging unit 152, and a condom-like drape 400, which prevents the imaging unit 152 from contaminating a sterile operating field should the system be used in a medical environment, a clean room environment for the manufacture of, e.g., silicon wafers, or other sterile environments.
  • the condom-like drape 400 need not be employed when inspecting components, such as aircraft engines.
  • the imaging system can be employed with any type of image-producing scope, and is not limited to use with any particular type of scope.
  • the condom-like drape 400 does not intercept the optical viewing axis of the system.
  • the condom-like drape 400 does not cover a focusing mechanism 480 of the imaging system, making it easier to focus the system and lessening the likelihood that the drape 400 will be damaged due to manipulation of the focusing mechanism.
  • the lens for focusing the image from the endoscope to the imaging unit may be provided in the imaging unit 152, rather than in the coupler 154. This is particularly advantageous because, as discussed in more detail below, in the exemplary embodiment shown, a portion of the coupler 154 is not separated from the scope 150 by the condom-like drape 400, and therefore, is sterile in use.
  • the coupler 154 can be made significantly less expensively, thereby enabling the coupler 154 to be provided as a disposable part that need not be sterilized between uses. This is advantageous because the sterilization of the devices can be inconvenient and time consuming.
  • the imaging unit 152 includes an image sensor 156 that senses an image along an imaging axis (not shown).
  • the coupler 154 is coupled between the eyepiece 158 of the scope 150 and a distal end 660 of the imaging unit 152 such that the lens 200 is disposed between the image sensor 156 and the eyepiece 158 to focus an image produced by the scope 150 onto the image sensor 156.
  • the refractive lens 200 may be provided in the imaging unit 152, rather than in the coupler 154.
  • the coupler can be therefore made significantly less expensively, thereby enabling the coupler to be provided as a disposable part that need not be sterilized between uses.
  • the image sensor 156 may, for example, include a charge-coupled device (CCD) as discussed above, or a metal-oxide semiconductor (MOS) sensor. It should be appreciated, however, that the present invention is not limited in this respect, and can be employed with any type of image sensor 156.
  • the image generated by the image sensor 156 can be conveyed to the maintenance apparatus 20 or a monitor 460 in any of numerous ways, and the present invention is not limited to any particular implementation.
  • the image sensor 156 may be coupled to circuitry 560 which can assist in converting an image sensed by the image sensor 156 into an electrical signal.
  • This electrical signal then may be transmitted (e.g., via cable 260) to the monitor 460, maintenance apparatus 20 or elsewhere for display to a user or may be otherwise processed and/or recorded on a suitable medium.
  • the image sensor 156 may comprise a bundle of fiber optic cables which optically transmit an image from the lens 200 to the apparatus 20 or another viewing device for display to a user.
  • the image sensor 156 need not necessarily convert the image from scope 150 into an electrical signal.
  • the imaging unit 152 is releasably mated with the coupler 154. This mating may be accomplished using any of a number of techniques. Figures 6 and 7 illustrate one technique that may be used to mate these two components.
  • a distal end 660 of the imaging unit 152 is inserted into an opening 880 at a proximal end 1100 of the coupler 154.
  • the imaging unit 152 includes a button 580 which is pivotally connected, via a pin 820, to a body portion 180 of the imaging unit 152.
  • the imaging unit 152 has a cavity 810 formed underneath the button 580 and a spring 900, disposed in the cavity 810.
  • Spring 900 biases the button 580 (in a clockwise direction in Figure 6) about pin 820 so that locking member 600 is biased away from a surface 860 of body portion 180.
  • When the button 580 is depressed, spring 900 is compressed so that button 580 moves in a counterclockwise direction in Figure 6 about pin 820 and locking member 600 moves toward surface 860.
  • Thus, when the button 580 is depressed and the distal end 660 of the imaging unit is inserted into the opening 880 in the coupler 154, the locking member 600 moves toward surface 860 so that it can slide over edge 1180 of the coupler 154.
  • When the button 580 is released, the locking member 600 is biased (by spring 900) away from surface 860 and into a notch 620 in the coupler 154, and a shoulder 1160 of imaging unit 152 contacts a shoulder 1140 of the coupler 154, thereby interlocking the imaging unit 152 and the coupler 154.
  • An indication that the distal end 660 of the imaging unit 152 is fully inserted into the opening 880 is provided by the distal end 660 contacting a shoulder 1120 of coupler 154.
  • the imaging unit 152 and coupler 154 can be separated by pushing button 580, which moves the locking member 600 out of the notch 620, and pulling the imaging unit 152 away from the coupler 154.
  • Figures 6 and 7 illustrate only one example of the many ways that the imaging unit 152 and coupler 154 may be mated together.
  • the imaging unit 152 also includes a handle 780 proximal to the body portion 180.
  • the handle 780 may include grooves 800 to make it easier for a user to grip the imaging unit 152 through the drape 400 that can be extended over the imaging unit 152 in a manner described below.
  • the image sensor 156 and circuitry 560 may be mounted in the body portion 180 of the imaging unit 152 in any of a number of ways.
  • the image sensor 156 may be mounted via pins or screws 840a and 840b, and circuitry 560 may be mounted on a circuit board supported within body portion 180.
  • One or more wires may be used to interconnect the circuitry 560 with the cable 260.
  • the focal length between the image sensor 156 and the lens 200 of imaging unit 152 may be adjusted. In the system shown in Figures 6-7, this is accomplished via a mechanism that is not covered by the condom-like drape 400, thereby making it easier to focus the system and lessening the likelihood that the drape 400 will be damaged due to manipulation of the focusing mechanism. It should be appreciated, however, that the focal length adjustment can be accomplished in any number of ways.
  • the refractive lens 200 is disposed in the imaging unit 152, rather than in the coupler 154.
  • the focusing mechanism includes elements disposed in the imaging unit 152, as well as in the coupler 154.
  • placement of the lens 200 within the imaging unit 152, rather than in the coupler 154 provides at least one significant advantage. That is, the cost of the coupler 154 may be reduced significantly below the cost of coupling devices that include lenses, thereby making it commercially practicable to use a new, sterile coupler each time the imaging system is used, rather than repeatedly sterilizing and reusing the same coupling device should sterilization be required.
  • the distal end 660 of the imaging unit 152 includes a primary cylinder 760, in which a spring 680 and a cylindrical lens holder 220 are disposed.
  • Lens holder 220 supports the lens 200 in front of an imaging axis of image sensor 156.
  • Lens holder 220 (and lens 200) can be moved within primary cylinder 760 either toward or away from distal end 660 of the imaging unit 152 so as to adjust the focal length between the image sensor 156 and the lens 200.
  • Spring 680 biases lens holder 220 toward distal end 660.
  • the position of lens holder 220 within primary cylinder 760 can be adjusted, however, through manipulation of a focusing mechanism on the coupler 154 as discussed below. It should be appreciated that the present invention is not limited in this respect and that a camera including a lens that does not require focusing may be employed.
  • the imaging unit 152 further includes an outer cylinder 720, including a spirally ramped upper edge 960, which surrounds the primary cylinder 760.
  • Outer cylinder 720 is movable with respect to primary cylinder 760 either toward or away from the distal end 660 of imaging unit 152.
  • Outer cylinder 720 is connected to the lens holder 220 via a pin 700.
  • Pin 700 extends through a slot 920 which extends a short distance along a length of the primary cylinder 760.
  • lens holder 220, outer cylinder 720 and pin 700 move as a single unit, with respect to primary cylinder 760, either toward or away from the distal end 660 of imaging unit 152.
  • the manner in which this unit interacts with the focusing mechanism disposed on coupler 154 is described below in connection with Figures 8a-8b.
  • Figures 6 and 7 show an exemplary implementation of the coupler 154.
  • the coupler 154 can be constructed in any of a number of ways to achieve the desired goal of enabling the imaging unit 152 to be coupled to the scope 150.
  • the coupler 154 includes a main body 500 (including a proximal portion 500a and a distal portion 500b), a focusing ring 480, a light-penetrable window 940, a scope mounting portion 420 (including inner ring 420a and outer ring 420b) and the condom-like drape 400.
  • the components constituting the main body 500, focusing ring 480 and scope-mounting portion 420 may be made of any suitable material and may be affixed together in any suitable manner.
  • Because the coupler 154 is a disposable device, it is preferably formed from inexpensive components.
  • the main body 500 may be formed by inserting the distal portion 500b within the focusing ring 480, and then affixing together the proximal and distal portions 500a and 500b. Scope mounting portion 420 may be affixed to distal portion 500b.
  • Main body 500 has an outer surface 520 between a distal end 1080 and a proximal end 1100 of the coupler 154.
  • a channel 440 extends about a perimeter of the outer surface 520 between the focusing ring 480 and the proximal end 1100.
  • a sterile barrier may be established between the sterile operating environment including the scope 150, and a non-sterile environment including the imaging unit 152.
  • a sterile barrier is established by coupling the distal end 660 of the imaging unit 152 to the coupler 154, and providing a hermetic seal between the components of the coupler 120 that separate the sterile and non-sterile environments.
  • a light-penetrable window 940 is hermetically sealed between the distal end 1080 and the proximal end 1100 of the coupler 154 to establish a sterile barrier therebetween.
  • Window 940 may be made of glass, plastic, or any other suitable material through which light can pass from the scope 150 to the image sensor 156 (via lens 200) to generate a suitable image.
  • the coupler 154 also includes the condom-like drape 400.
  • the condom-like drape 400 may be made of any material that is suitable for creating a sterile barrier between a sterile environment and a non-sterile environment.
  • the condom-like drape may be made of a non-porous latex or plastic material.
  • the drape 400 may be extended to cover some or all of imaging unit 152 and cable 260.
  • the condom-like drape 400 may be hermetically sealed to the outer surface 520 of coupler 154.
  • the condom-like drape 400 does not intercept the optical viewing axis 190 of the imaging system. As mentioned above, this is advantageous in that the drape 400 need not be provided with a window that must be aligned with the optical viewing axis 190, and the drape 400 does not interfere with the quality of the image presented on the monitor 460. It should be appreciated that the function performed by the condom-like drape 400 can be achieved in any of numerous ways. For example, a protective drape can be provided that is more rigid than the condom-like drape 400 depicted in the drawings.
  • the condom-like drape 400 is substantially tubular in form and is open on its distal and proximal ends.
  • the distal end 210 of the condom-like drape 400 is attached to the outer surface 520 (within channel 440) of the coupler 120.
  • this attachment can be accomplished using a hermetic seal (e.g., via an O-ring 540) to maintain the separation between the sterile and non- sterile environments.
  • the condom-like drape 400 can be provided in a rolled-up form attached to the coupler 154. After the coupler 154 is mated with the imaging unit 152 as described above, the condom-like drape 400 can be unrolled to cover the non-sterile imaging unit 152.
  • the drape 400 can be used in conjunction with coupler 154 without requiring the user to align the drape 400, or a window portion thereof, between the eyepiece 158 of the scope 150 and the coupler 154, and without having the drape 400 intercept the optical viewing axis 190 of the imaging system.
  • a drape is optional.
  • Figures 6 and 7 illustrate one example of a technique that may be used to mate the scope 150 with the coupler 154. It should be appreciated that numerous other suitable mating techniques can be employed.
  • the scope 150 is mated with the coupler 154 by inserting the eyepiece 158 into an opening 380 at the distal end 1080 of the coupler 154. Opening 380 may be formed by the inner and outer rings 420a-420b of the scope mounting portion 420.
  • the inner and outer rings 420a-420b form equal diameter openings, and inner ring 420a is movable with respect to outer ring 420b.
  • a spring biases the inner ring 420a so that its center is forced to be offset from the center of the outer ring 420b unless a user activates a lever (not shown) to cause the centers of the two rings to align with one another.
  • the user activates the lever so that the centers of the rings 420a-420b align with one another and inserts the eyepiece 158 through both rings.
  • the user then can release the lever so that the spring (not shown) causes the center of ring 420a to become offset from the center of ring 420b.
  • because the diameter of the eyepiece 158 is only slightly smaller than the diameter of each of rings 420a and 420b, when the centers of the rings are offset from one another, the eyepiece 158 will be locked within the scope mounting portion 420 of the coupler 154.
  • the eyepiece 158 may be separated from the scope mounting portion 420 by pressing the lever to realign the centers of rings 420a and 420b and pulling the scope 150 away from the coupler 154.
  • the coupler 154 is shown as being mated directly with the eyepiece 158 of the scope 150.
  • the scope 150 (or other image-producing scope) may alternatively be mated indirectly with the coupler 154.
  • the scope 150 may be mated with the coupler 154 via one or more additional coupling devices.
  • a focusing mechanism can be employed that serves to adjust the focal length between the lens 200 and image sensor 156 in the imaging unit 152.
  • a focusing ring 480 is provided on the coupler 154 to perform this focal length adjustment.
  • the focusing ring 480 is disposed distally of the distal end 210 of the condom-like drape 400, so that after the drape 400 is extended to cover some or all of the imaging unit 152 and cable 260, the focusing ring 480 is not covered by the drape 400 and may be manipulated by a user to adjust the focal length between the lens 200 and the image sensor 156 without also having to manipulate the drape 400.
  • this feature makes focusing ring 480 relatively easy for the user to manipulate to achieve sharp focusing, and reduces the risk of damage to drape 400.
  • An illustrative example of a linkage assembly for mechanically coupling the focusing ring 480 on the coupler 154 to the imaging unit 152 to adjust the focal length between the lens 200 and image sensor 156 is shown in Figures 7, 8a and 8b. It should be appreciated that numerous other implementations are possible.
  • the distal portion 500b of the main body portion 500 of coupler 154 has an annular groove 1000. Annular groove 1000 may be covered by the focusing ring 480, so that it is not visible from the outside of coupler 154.
  • a finger 980 extends inwardly from the focusing ring 480 through the annular groove 1000, so that when the focusing ring 480 is rotated about the main body portion 500, finger 980 slides within the annular groove 1000.
  • a lower surface 1200 of finger 980 contacts a portion of a spiraling ramp surface 960 on the outer cylinder 720.
  • pin 700 may be connected between the outer cylinder 720 and the cylindrical lens holder 220 through the slot 920, which extends along the length of the primary cylinder 760, so that the outer cylinder 720 and lens holder 220 do not rotate with respect to the primary cylinder 760.
  • the focusing ring 480 can rotate freely about the primary cylinder 760, limited only by the movement of the finger 980 within the annular groove 1000.
  • Figures 8a and 8b illustrate the focusing mechanism at its two extreme focusing positions, with Figure 8a illustrating the lens 200 at its closest position to the image sensor 156 and Figure 8b illustrating the lens 200 at its furthest position from the image sensor 156.
  • Figure 8a when the lens 200 is at its closest position to the image sensor 156, the spring 680 is fully compressed, bottom surface 1200 of finger 980 is in contact with a point 1060 near the top of the spiraling ramped surface 960, and the finger 980 is in a first position with respect to the primary cylinder 760.
  • the imaging unit 152 includes a single body portion 180 in which both the image sensor 156 (and associated circuitry 560) and the refractive lens 200 (and associated components such as the lens holder 220, the spring 680, and the cylinders 720 and 760) are disposed. It should be appreciated, however, that various components of the imaging unit 152 may alternatively be distributed among two or more separate housings that may be mated together to form the imaging unit 152. An illustrative example of an imaging system configured in this manner is shown in Figures 9 and 10.
  • the imaging unit 152 to be mated with the coupler 154 may include a first housing 180a in which the refractive lens (and associated components) is disposed, and a second housing 180b in which the image sensor 156 (and associated circuitry (not shown)) is disposed.
  • the second housing 180b is the housing of a camera head 152b (e.g., a standard C-mount camera head), and the first housing 180a is the housing of an adapter 152a for adapting the camera head 152b for use with the coupler 154.
  • the adapter 152a is mated with the camera head 152b (as discussed below), the adapter 152a and the camera head 152b together form a composite imaging unit 152 which is similar to the imaging unit 152 described above in connection with Figures 6-7.
  • each of the housings 180a-180b may take on any of a number of alternative forms.
  • the housing 180b may alternatively be the housing of a standard V-mount camera head, or any other device in which an image sensor is disposed, and the housing 180a may be configured to be mated with the same.
  • the imaging unit 152 is not limited to one or two housings, and may further include additional housings.
  • the imaging unit 152 may further include one or more housings disposed between the housings 180a and 180b or between the housing 180a and the coupler 154.
  • Such an additional housing may exist, for example, in the form of a coupling device that couples together the housings 180a and 180b or the housing 180a and the coupler 154.
  • the imaging unit actually employed may be any of numerous devices or combinations of devices capable of receiving an optical image along an imaging axis.
  • the term "imaging unit" is not intended to be limiting. Rather, it is intended to refer to any device or combination of devices capable of performing an imaging function.
  • the coupler 154 is shown as being mated directly with the distal end 660 of the imaging unit 152, it should be appreciated that the imaging unit 152 may alternatively be mated indirectly with the coupler 154.
  • the imaging unit 152 in whatever form, may be mated with the coupler 154 via one or more additional coupling devices.
  • the operational interface between the adapter 152a and the coupler 154 is identical in most respects to the operational interface between the imaging unit 152 and the coupler 154 described above in connection with Figures 6-8.
  • Corresponding components in the two embodiments have therefore been labeled with identical reference numerals, and reference may be made to the description of the embodiment of Figures 6-8 for an in-depth understanding of the operational interface between the adapter 152a and the coupler 154 of the embodiment of Figures 9-10.
  • the camera head 152b may, for example, be a standard C- mount camera head.
  • the camera head 152b may include a threaded, female connector 1280 formed at a distal end 1320 thereof.
  • the adapter 152a may include a threaded, male connector 1260 formed at a proximal end 1360 thereof.
  • the image sensor 156 may be disposed adjacent the distal end 1320 of the camera head 152b so that, when the male connector 1260 of the adapter 152a is threaded into the female connector 1280 of the camera head 152b, the image sensor 156 is disposed adjacent an opening 1380 at the proximal end 1360 of the adapter 152a.
  • the image sensor 156 is therefore disposed further from the distal end 660 of the imaging unit 152 than it is in the system of Figures 6-7.
  • an annular cavity 1220 is formed within the housing 180a to provide an optical pathway between the refractive lens 200 and the image sensor 156 along which an image produced by the scope 150 can be focused onto the image sensor 156 via the lens 200.
  • the cavity 1220 may be formed, for example, by reducing a width of an annular shoulder 1340 (Figure 10) supporting one end of the spring 680 to be narrower than in the embodiment of Figures 6-7.
  • the button 580 is disposed on the adapter 152a of the imaging unit 152, and is therefore disposed distally of the image sensor 156 in this system, rather than proximally of the image sensor 156 as in the system of Figures 6-7.
  • the button 580 may be shortened as compared to the system of Figures 6-7.
  • the pin 820 about which the button 580 pivots may be disposed within a small cavity 1240 adjacent the proximal end 1360 of the adapter 152a, rather than being disposed proximally of the image sensor 156 as in the system of Figures 6-7.
  • the button 580 and locking member 600 represent only one example of numerous mechanisms that can be used to interconnect the imaging unit 152 with the coupler 154, and that the imaging unit 152 may be mated with the coupler 154 in different ways.
  • the imaging unit 152 may not include a button such as the button 580 or a locking member such as the locking member 600 at all, and may instead provide a different mechanism for mating the imaging unit 152 with the coupler 154.
  • the imaging unit 152 that is formed when the adapter 152a is mated with the camera head 152b can be made identical in all respects to the imaging unit 152 of the embodiment of Figures 6-8. Additionally, by properly adjusting the refractive index of the lens 200 to account for the increased distance between the distal end 660 and the image sensor 156 in the embodiment of Figures 9-10 as compared to the embodiment of Figures 6-8, the imaging unit 152 of Figures 9-10 can also be made to mimic the functional characteristics of the imaging unit 152 of Figures 6-8.
  • the adapter 152a of Figures 9-10 therefore enables a standard camera head (e.g., the camera head 152b) to be adapted for use with the inventive coupler 154 described herein in the same manner as in the embodiment of the imaging unit 152 described in connection with Figures 6-8. Therefore, one already in possession of a camera head 152b (e.g., a standard C-mount or V-mount camera head) may simply purchase the adapter 152a (which does not include an image sensor) for use with the coupler 154, rather than purchasing the imaging unit 152 of Figures 6-8 (which additionally includes an image sensor) for use therewith.
  • the adapter 152a described herein is configured for use with a specific type of coupler (i.e., the coupler 154). However, it should be appreciated that the adapter 152a may alternatively be configured for use with other types of devices or couplers.
  • any suitable type of camera can be used to take such images, as the present invention is not limited to the above-described examples. Additional examples of cameras that can be suitable for use in such a system are described in a series of Applicant's earlier-filed U.S. patent applications, including provisional applications 60/054,197; 60/054,198; and 60/121,382, as well as regular U.S. patent applications nos. 09/126,368; 09/382,496; and 09/513,673, each of which is incorporated herein by reference. However, the present invention is not limited to using such camera systems.
  • the apparatus 20 and method of use described herein can be used in connection with inspection and/or maintenance of numerous types of objects, as the present invention is not limited in this respect.
  • the apparatus 20 and method of use described herein can be used in connection with inspection and/or maintenance of: aircraft (e.g., airplanes and helicopters), boats, automobiles, trucks, military equipment (e.g., tanks, weapons, etc.) and space vehicles; engines and related components, including aircraft engines, ship engines, motor vehicle engines and turbine engines; structural components of vehicles, such as airframes, hulls, chassis and automobile frames and other such components; structures such as buildings, roads, bridges, tunnels, etc.; facilities such as manufacturing plants and power plants including the components or objects relating to such facilities; mechanical components; systems; parts; inventory; products; processes; fluids and flows; and chemicals.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Quality & Reliability (AREA)
  • Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Marketing (AREA)
  • Game Theory and Decision Science (AREA)
  • General Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Manufacturing & Machinery (AREA)
  • Transportation (AREA)
  • Operations Research (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Testing And Monitoring For Control Systems (AREA)

Abstract

A system for obtaining, recording, displaying, storing, transmitting and receiving maintenance and other information is provided. The system, which may include an electronic maintenance apparatus that may be in the form of a hand-held digital computer, allows a user to capture and store images, sound, and/or error codes and related text or voice data and other information concerning the system or object being maintained. The information can be stored locally and/or transmitted to remote locations. Retrieval of the images and other data at a later date provides an historical perspective of the object, enabling one using the maintenance apparatus to compare and contrast the condition of the object over time. Instruction on how to accomplish a job at hand, diagnostic information and/or support information may also be transmitted to and from the maintenance apparatus.

Description

SYSTEM AND METHOD FOR OBTAINING AND UTILIZING
MAINTENANCE INFORMATION
Cross-Reference to Related Applications
This application claims the benefit of U.S. Provisional Patent Application No. 60/231,913, filed September 11, 2000.
Background
Field of the Invention
The present invention relates to maintenance systems and, more particularly, to systems and methods for obtaining and utilizing maintenance information.
Related Art
Maintenance logs are used to record maintenance information by personnel performing maintenance and inspection on objects, such as motors, aircraft, boats, machines, structures and buildings. These maintenance logs typically include information regarding the condition of the object and/or the work being performed on the object, and provide an historical record of such information. Typical logs take the form of notebooks, whereby the person performing the maintenance can write descriptions of the condition of the object and/or the work performed. The log can be maintained as a reference point for future maintenance and performance information regarding the object.
Summary of the Invention
In one embodiment, a method of maintaining an object is provided. The method comprises the acts of storing, in digital format, a first image of the object at a first time, obtaining a second image of the object at a second time, comparing the first image to the second image, and determining whether to perform maintenance on the object based, at least in part, on the act of comparing.
In another embodiment, a method of inspecting an object from a remote location is provided. The method comprises the acts of obtaining a digital image of the object at a first location, electronically transmitting the digital image to a second location remote from the first location, viewing the digital image at the second location, transmitting instructions to the first location, and performing an act on the object in response to the instructions.
In yet another embodiment, an electronic inspection apparatus is provided. The apparatus is adapted to communicate with a camera to obtain an image of an object. The apparatus comprises a casing, a computer disposed within the casing, and a camera control unit disposed within the casing and coupled to the computer. The camera control unit is adapted to receive electronic images from the camera, reformat the electronic images into digital format and pass the digitally formatted images to the computer. The apparatus also includes an input device, coupled to the computer, that is adapted to allow a user to input full text data relating to the image.
In still another embodiment, an electronic inspection apparatus is provided. The apparatus is adapted to communicate with a camera to obtain an image of an object. The apparatus comprises a casing, a computer disposed within the casing, and a camera control unit disposed within the casing and coupled to the computer. The camera control unit is adapted to receive electronic images from the camera, reformat the electronic images into digital format and pass the digitally formatted images to the computer. The apparatus further includes a computer readable storage medium, coupled to the computer, having an executable code stored thereon. The code allows the computer to execute at least two processes in a multitask fashion.
In another embodiment, an electronic inspection apparatus is provided. The apparatus is adapted to communicate with a camera for obtaining an image of an object. The apparatus comprises a casing, a computer disposed within the casing, and a control unit disposed within the casing and coupled to the computer. The control unit is adapted to communicate with the camera. The apparatus further includes an input device coupled to the computer and the control unit. The input device is adapted to receive an input command from a user. The control unit is adapted to receive the command and signal at least portions of the camera to react as commanded.
In another embodiment, an aircraft inspection system is provided. The system includes a camera adapted to view a component of the aircraft, and a portable electronic apparatus communicating with the camera. The apparatus includes a casing, a computer disposed within the casing, and a camera control unit coupled to the computer and disposed within the casing. The camera control unit is adapted to receive an image from the camera and pass the image to the computer. The apparatus also includes a display coupled to the computer that is adapted to display the image. An input device is coupled to the computer and is adapted to allow a user to input maintenance data relating to the component. The apparatus further includes a storage medium communicating with the computer. The storage medium is adapted to store the image and related data.
In yet another embodiment, an electronic maintenance apparatus is provided. The apparatus is adapted to communicate with a camera to obtain an image of an object. The apparatus comprises a casing, a computer disposed within the casing, and a storage medium communicating with the computer. The storage medium includes maintenance information regarding the object being imaged.
Brief Description of the Drawings
Various embodiments of the invention will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 is a schematic representation of a maintenance system according to one aspect of the invention;
Figure 2 is an illustration of an exemplary use of the system of Figure 1;
Figure 3 is a perspective view of a maintenance apparatus for use with the system according to one embodiment of the invention;
Figure 4 is an exploded perspective view of the maintenance apparatus of Figure 3;
Figure 5 is a view of the maintenance apparatus of Figure 3 showing an example of a display provided by the maintenance apparatus;
Figure 6 is a partially cut away perspective view of an imaging system for use with the maintenance apparatus of Figures 3-5;
Figure 7 is a partially cut away perspective view of the imaging system shown in Figure 6;
Figures 8a and 8b are partially cut away perspective views of an illustrative focusing mechanism employed in the system of Figures 6-7;
Figure 9 is a partially cut away perspective view of an alternative embodiment of the imaging system including an adapter that adapts a standard camera head to be mated with a coupler shown in the system of Figures 6-7; and
Figure 10 is a partially cut away perspective view of the adapter shown in Figure 9.
Detailed Description
Applicant has appreciated that, after a short period of time, conventional log notebooks can become voluminous, torn, dirty, lost or destroyed. And, if they are to be read by people in places other than where they are stored, they must be copied and shipped, faxed, or transported in some manner to the desired location. Maintaining these notebooks is time consuming, costly and antiquated at best.
In one embodiment, a system for obtaining and storing maintenance information in electronic format is provided. The system includes an apparatus having an LCD, a touch panel, a camera connector, camera adjustments and a flashcard port. The apparatus houses a camera control unit (CCU) and a computer, which are used to receive and process images from an imager which is attached to the apparatus at the camera connector. This CCU and computer are also used to process images and data and place these images and data on a storage medium such as a flashcard, which may be removably placed in the flashcard port. The apparatus also has attachment connectors for an external keyboard if one is desired by the user, external computer display video OUT and IN connectors as well as battery and external power connectors.
The apparatus may be used by maintenance personnel to capture images of the equipment or objects they are inspecting or maintaining as well as enter notes or detailed descriptions in writing or voice recording as adjuncts to the aforementioned images. The apparatus may also be wearable, battery powered, voice or touch activated. Once the pictures and data are captured and stored, they may be downloaded to other computers and/or transmitted via the Internet or other transport methods. The storage media may be maintained with the apparatus in a separate housing carrying/storage case for permanent records that may stay with the apparatus for further reference.
It should be appreciated that the apparatus may use storage media which has been preformatted with desired maintenance programs that could contain parts lists, training material, instructions for use, instructions on how to accomplish a job at hand, checklists, operations manuals and other material not limited to the aforementioned.
Another feature is that the apparatus will enable the user to keep and maintain a wear history on mechanical objects (e.g., engine components), thus enabling the user to make judgments on when a part might fail prior to the part actually failing.
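The application does not prescribe how such a judgment is reached, so the following is only a minimal sketch of one possible approach: fitting a linear trend to wear measurements logged at successive inspections and projecting when a wear limit would be crossed. The function name, the bearing-clearance example and the wear-limit figure are all hypothetical.

```python
# Minimal sketch: estimating when a part may reach its wear limit from logged
# inspection measurements. The linear-trend assumption and all numbers are
# illustrative only; no particular method is prescribed by the application.

def estimate_failure_hour(history, wear_limit):
    """history: list of (operating_hours, measured_wear) tuples from past inspections."""
    if len(history) < 2:
        return None  # not enough data to establish a trend
    # Least-squares slope/intercept of wear vs. operating hours.
    n = len(history)
    sx = sum(h for h, _ in history)
    sy = sum(w for _, w in history)
    sxx = sum(h * h for h, _ in history)
    sxy = sum(h * w for h, w in history)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    if slope <= 0:
        return None  # no measurable wear trend
    return (wear_limit - intercept) / slope  # projected hours at which the limit is reached

# Example: bearing clearance logged at three inspections, wear limit of 0.50 mm.
history = [(100, 0.12), (400, 0.21), (900, 0.36)]
print(estimate_failure_hour(history, wear_limit=0.50))
```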
Another embodiment of the present invention is directed to a method of maintaining a digital maintenance file. One embodiment of the present invention relates to a method of maintaining a digital maintenance file that includes pictures and/or text concerning the system being maintained. The use of pictures is particularly powerful, as it enables one viewing the maintenance apparatus to compare and contrast the manner in which a component of the system has worn over time. It should be appreciated that any suitable type of camera can be used to take such pictures.
In one embodiment of the present invention, a set of pictures can be taken of key components of a system before the system is sent to the customer. Thereafter, during periodic maintenance checks, additional pictures can be taken, which can enable one viewing the maintenance apparatus to compare the way the parts have worn.
In one embodiment of the invention, a computer readable medium can be installed on the system to be maintained, so that the maintenance file can be stored therein. Optionally, the storage medium provided with the system can include pictures of certain components of the system when initially shipped to the customer, although the aspect of the present invention related to installing the digital maintenance file on the system to be maintained is not limited in this respect. Also, it should be appreciated that the embodiment of the present invention relating to installing the storage medium that stores the digital maintenance file on the system to be maintained is not limited to the use of a photographic maintenance file, as embodiments of the present invention contemplate that merely a text maintenance file can be employed.
It should be appreciated that it is an advantage of one embodiment of the present invention that the digital maintenance file is mounted to the system to be maintained, such that the maintenance file always stays with the system and can be accessed by maintenance personnel wherever the system is present, and further, cannot be lost. In addition, the maintenance file can be backed up and stored away from the system to be maintained to enhance the security of the data that comprises the digital maintenance file.
In another embodiment of the present invention, the apparatus can be provided with a video output, such that videotapes can be made of the digital pictures taken. In another embodiment of the present invention, maintenance personnel can be provided with a remote system for recording digital information (photographic and/or text), while inspecting the system, into a computer readable medium that they can carry around with them. This remote system can be cordless for ease of use (e.g., it can be battery powered). Once the inspection is complete, the remote system can be coupled to the storage medium installed on the system to be maintained and the information from the maintenance inspection can be downloaded into the digital maintenance file on the system.
Such a maintenance apparatus can be used with numerous types of systems, including aircraft (e.g., airplanes and helicopters), boats, automobiles, trucks, military equipment (e.g., tanks, etc.) and other systems as will be explained below.
One embodiment is directed to a method and apparatus for obtaining, recording, displaying, storing, transmitting and/or receiving maintenance and other information electronically, allowing a user to capture and store images, sound, error codes, related text or voice data and/or other information concerning the system or object being maintained. The information can be stored locally and/or transmitted to remote locations. Retrieval of the images and other information at a later date provides an historical perspective of the object, enabling one using the maintenance apparatus to compare and contrast the condition of the object over time. Instruction on how to accomplish a job at hand, diagnostic information and/or support information may also be transmitted to and from the maintenance apparatus. Such information may alternatively be pre-stored for later retrieval.
In one embodiment, the maintenance apparatus may be used as an interface between the object to be inspected and the person performing the inspection. The apparatus allows a user to receive maintenance information, such as historical and/or real-time information regarding the object, and determine a course for corrective action to be performed on the object as necessary. In this manner, a user may make maintenance judgments, such as, for example, whether the object needs maintenance or when the object might fail prior to the object actually failing.
In one embodiment shown in Figure 1, a maintenance system 10 includes a maintenance apparatus 20 that receives real-time or current data 22a concerning the condition of one or more objects 24, such as a mechanical component, being inspected. The data 22a concerning the object may relate to physical characteristics of the object 24, the interaction of two or more physical components, the operation of any object, such as the operating characteristics of any physical or electronic component, or any other characteristic of the object, as the present invention is not limited to receiving any particular types of data. The data 22a may be in the form of one or more images 26, audio 28 (e.g., the sound of the object as it functions), error codes 30, any suitable combination thereof, or any other data, as the present invention is not limited in this respect. The image 26 of the object may be generated by any image producing device, as the invention is not limited in this respect. Similarly, audio 28 may be obtained with the use of any suitable device (e.g., a microphone), and the error code 30 may be obtained with any suitable interface. Notes or detailed descriptions in text format 32 or voice recording 34 may be input into the apparatus 20 as adjuncts to the aforementioned data 22a and may be inputted using a user interface 36. The data 22a may be presented to a user using one or more suitable output devices 38. The maintenance apparatus 20 may store the data (labeled as 22b in Figure 1) locally (e.g., in a storage medium of the apparatus 20) or remotely (e.g., at a central maintenance facility). The local storage medium may be internal or external to the apparatus 20 (e.g., in a separate housing carrying/storage case (not shown)), thereby providing a record that may stay with the apparatus 20 for further reference. In one embodiment, the apparatus may provide access to maintenance information that may include, in addition to the present data 22b concerning the object, any one or more of the following: information regarding the initial condition 39 of the object; historical information 40 of the object; diagnostic information 42; instructional information 44 (e.g., parts lists, training materials, instructions for use, instructions on how to accomplish a job at hand, checklists, operations manuals, layout information, schematic and parts diagrams, object location diagrams, etc.); and support 46 (e.g., help menu and/or real time technical assistance from technical support personnel when the apparatus is communicating with a maintenance facility or manufacturer/provider of the object 24). Such additional information may be stored locally (e.g., within the apparatus 20) or remotely, with the apparatus 20 having the capability to communicate with the remote location. Any of the above described information can be employed with the apparatus in any suitable combination.
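As a rough illustration of how the several kinds of data handled by the apparatus (images 26, audio 28, error codes 30, text notes 32 and voice recordings 34) might be grouped into a single inspection record, a minimal sketch follows. The field names and types are assumptions made for illustration only; the application does not define a storage schema.

```python
# Illustrative data structure only; field names and types are assumptions,
# not a schema defined by the application.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class InspectionRecord:
    object_id: str                      # e.g., engine serial number or aircraft tail number
    inspector: str
    timestamp: datetime
    images: List[bytes] = field(default_factory=list)       # captured still images (e.g., JPEG data)
    audio_clips: List[bytes] = field(default_factory=list)  # recorded sound of the object or voice notes
    error_codes: List[str] = field(default_factory=list)    # codes read from the object's electronics
    text_notes: str = ""                                     # free-form notes typed by the inspector
    baseline_ref: Optional[str] = None                       # pointer to initial-condition data for comparison

record = InspectionRecord(
    object_id="ENG-4711",
    inspector="J. Smith",
    timestamp=datetime.now(),
    text_notes="Slight scoring on second-stage blades.",
)
```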
The historical information 40 may be provided using any suitable technique. In one embodiment, the historical information 40 may include a compilation of maintenance and inspection data 22b previously obtained by the user or users. Data concerning the initial condition 39 of an object may be provided to a customer of the system for subsequent comparison with real time information. For example, a set of images can be taken of key components of a system before the system is sent to a customer. During periodic maintenance checks, additional images can be taken, which can enable one to view the maintenance apparatus to compare the current data with the initial condition information or historical information to determine the way the parts have worn.
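A minimal sketch of comparing a current inspection image against the stored initial-condition image is shown below, using the Pillow imaging library. The mean-pixel-difference metric, the threshold and the file paths are illustrative assumptions; the application leaves the manner of comparison unspecified.

```python
# Minimal comparison sketch; the mean-difference metric, threshold, and file
# paths are assumptions for illustration, not part of the described method.
from PIL import Image, ImageChops, ImageStat

def wear_score(baseline_path, current_path):
    baseline = Image.open(baseline_path).convert("L")
    current = Image.open(current_path).convert("L").resize(baseline.size)
    diff = ImageChops.difference(baseline, current)
    return ImageStat.Stat(diff).mean[0]  # average per-pixel change, 0-255

score = wear_score("engine_blade_initial.jpg", "engine_blade_today.jpg")
if score > 12.0:  # arbitrary example threshold
    print("Noticeable change since baseline -- flag for maintenance review.")
```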
As discussed above, in one embodiment, the system can communicate with a remote facility. This provides a number of advantages. For example, as may be the case with aircraft, maintenance for certain objects may be performed at different locations. Using the remote communication ability, an inspector at a first location may record his or her observations and upload the data 22b to a central database, so that an inspector at a second location may download that data prior to performing a subsequent inspection on the same aircraft.
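A sketch of the upload/download exchange with such a central database follows. The server address, endpoints and JSON layout are entirely hypothetical; the application only states that data may be transmitted to and retrieved from a remote facility, without specifying a protocol.

```python
# Hypothetical sketch of syncing inspection data with a central facility.
# The server URL, endpoints, and JSON layout are assumptions; no transport
# protocol is specified in the application.
import json
import urllib.request

BASE_URL = "https://maintenance.example.com"  # placeholder address

def upload_inspection(record: dict) -> None:
    req = urllib.request.Request(
        f"{BASE_URL}/aircraft/{record['tail_number']}/inspections",
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)

def download_history(tail_number: str) -> list:
    with urllib.request.urlopen(f"{BASE_URL}/aircraft/{tail_number}/inspections") as resp:
        return json.loads(resp.read())

# First-location inspector uploads; second-location inspector later downloads.
upload_inspection({"tail_number": "N12345", "notes": "Oil seep at aft flange."})
print(len(download_history("N12345")), "prior inspection records")
```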
In other embodiments of the invention, other techniques for providing a user with the most current data may be employed. For example, in one embodiment, a computer readable medium can be installed on the object to be maintained (e.g., installed on an aircraft), so that the maintenance information can be stored therein. Optionally, the storage medium provided with the object can include any of the types of data described above, including pictures of certain components of the object when initially shipped to the customer, although the aspect of the invention related to installing the maintenance information on the system to be maintained is not limited in this respect. Also, it should be appreciated that the embodiment of the present invention relating to installing the storage medium that stores the maintenance information on the object to be maintained is not limited to the use of image data, as embodiments of the present invention contemplate that text, audio, error code and/or other data can be employed.
An advantage of installing the maintenance information on the object to be maintained is that the maintenance information always stays with the object and can be accessed by maintenance personnel wherever the object is present, and cannot be lost. Once the inspection is complete, the apparatus can be coupled to the storage medium installed on the object to be maintained and the information from the maintenance inspection can be downloaded into the file stored on the object. In addition, the maintenance information can be backed up and stored away from the object to enhance the security of the data that comprises the maintenance information.
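One way this step could look in practice is sketched below: the completed inspection entry is appended to a maintenance file kept on the storage medium installed on the object, and a copy is mirrored to a backup location away from the object. The JSON format, the file paths and the backup location are assumptions; no file format is prescribed.

```python
# Illustrative only: the JSON storage format, paths, and backup location are
# assumptions, not requirements of the described system.
import json
import shutil
from pathlib import Path

ON_OBJECT_FILE = Path("/mnt/aircraft_storage/maintenance_log.json")  # medium installed on the object
BACKUP_FILE = Path("/mnt/flashcard/maintenance_log_backup.json")     # copy kept away from the object

def append_inspection(entry: dict) -> None:
    log = json.loads(ON_OBJECT_FILE.read_text()) if ON_OBJECT_FILE.exists() else []
    log.append(entry)
    ON_OBJECT_FILE.write_text(json.dumps(log, indent=2))
    shutil.copyfile(ON_OBJECT_FILE, BACKUP_FILE)  # back up the file away from the object

append_inspection({"date": "2001-09-11", "inspector": "J. Smith",
                   "notes": "Compressor blades within wear limits."})
```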
Referring to Figure 2, an exemplary use of the maintenance system 10 in relation to aircraft will be described. A maintenance worker or inspector 50 inspects an engine 52 of an airplane using the maintenance apparatus 20 according to one embodiment of the present invention. The inspector 50 probes into the engine compartment 53 using a suitable data input device (such as a camera, scope, microphone, etc., (not shown)) coupled to the apparatus 20 via a link 54. An inspection port 55 formed on the engine housing 53 may be used to facilitate inserting the input device to enable the user to obtain the desired data. Data 22b (Fig. 1) is captured by the apparatus 20 for subsequent processing and analysis. In one embodiment, the inspector 50 inserts a camera into the engine compartment to obtain an image of the engine.
The inspector 50, after obtaining the data, may record additional data, such as notes regarding the condition of the engine, the serial number of the engine, the date of inspection, the aircraft tail number or other identifier, the inspector's name, etc. This can be performed using a user interface 36 (Fig. 1) of the apparatus 20, which can be a keyboard, touch screen or any other suitable interface as will be described below. The inspector 50 may also recall previously stored information regarding the engine, such as the aforementioned initial condition 39, historical information 40, diagnostic information 42 or instructional information 44, and determine a course of action.
As discussed above, in one embodiment, the apparatus 20 may communicate with a remote facility through a suitable communications link (shown as 56 in Figure 2). Link 56 can be any suitable communication medium, including wireless communication. The remote facility may include a computer 57 storing a database (not shown) capable of storing any of the above mentioned information concerning the object being inspected. Technicians at the remote facility may be able to remotely obtain and analyze the information obtained by the apparatus 20 to provide guidance to the inspector 50 regarding any action necessary. The communication of the apparatus with the remote facility enables technicians at a remote site to obtain the data in real time, thereby enhancing maintenance efficiency. Alternatively, the technician at the remote facility may view and analyze the maintenance information at a later time.
The maintenance apparatus 20 may also be used as a communication interface between an inspection facility and the object 24 being inspected. In this manner, an inspector can be posted at the remote location while a helper is located on site to manipulate the apparatus 20 and/or its associated data gathering device(s). This enables the remote inspector to obtain real time data and render a maintenance decision from a remote location without the need for a skilled technician on site with the object being inspected. Rather than probe the object 24 to be inspected with a data gathering device coupled to the apparatus 20, one or more data gathering devices may be installed on the object to be inspected, with the apparatus 20 being capable of communicating with these devices. For example, an aircraft, ship or other object may be outfitted with several cameras capable of viewing certain areas within the object. The apparatus 20 may communicate with each of these cameras, via hardwire or wireless connection, to receive an image of the area to be inspected. Multiple views may also be generated to view an area from different locations and/or to view the interaction of multiple components.
The maintenance apparatus 20 may be implemented in any suitable manner, as the present invention is not limited in this respect. In one embodiment, the maintenance apparatus 20 is implemented as a portable hand-held digital computer/camera assembly. As is explained more fully below with reference to Figures 3-5, the assembly may be housed within a casing, resulting in the approximate size and weight of a laptop computer. For example, the hand-held apparatus may be up to about ten to fourteen inches long, up to about eight to twelve inches wide, and up to about one to four inches thick. The apparatus 20 may include or otherwise communicate with a storage medium and may also include a power source (e.g., a battery pack) that renders the apparatus cordless and easily transportable. In one embodiment, the apparatus 20 is less than about ten pounds. More preferably, the apparatus 20 is less than about five pounds, and most preferably, less than about three pounds. It should be appreciated that the power pack may comprise a large percentage of the weight. Thus, the weight of the apparatus 20 depends upon the size of the power pack included within the apparatus 20. With such a hand-held apparatus, increased portability and ease of use may be attained.
The illustrative embodiment of the apparatus 20 shown in Figures 3-5 includes several main components, including input devices 70a-70f, output devices 80a, 80b, 70b, 70c, a motherboard 90, a camera control unit 100, a video chip 110, and a casing 130, each of which will be discussed in more detail below. As discussed above, the data input devices and the data output devices may be any number of devices, either internal to the apparatus or connected externally via any number of techniques, and in some instances, the input and output devices may be part of the same device. The data being inputted to or outputted from the apparatus 20 may be in any format, including but not limited to, still image data, streaming video images, text and audio, and may be sent to or received by the apparatus as desired.
The motherboard 90 includes a central processing unit (CPU) 92, a computer readable storage medium 94 coupled to the CPU 92 (e.g., via a bus (not shown)), and at least one input/output (I/O) connection 95 coupled to the CPU 92. The motherboard can be custom designed or can be any of a number of standard devices. The motherboard 90 controls data flow and storage, and works in conjunction with the video chip 110 and camera control unit 100 (CCU) to facilitate image processing and display.
The input devices 70a-70f provide the apparatus 20 with data. At least one of the devices provides a user interface. A user may be human or non-human, as in the case of an application program or another device. Any of a number of input devices may be employed. The apparatus 20 may have any number of internal input devices, disposed within the confines of the casing of the apparatus, as well as any number of external devices through suitable connections. The input devices can include control units, such as buttons, knobs or switches, keypads, a touch screen, and the other input and output devices described herein, to control various aspects of the apparatus. Human user input can also be obtained from an externally connected mouse, keyboard, joystick, glove, headset, microphone or any other manually controlled devices. In one embodiment, a touch screen 70a is employed for human user input. In this embodiment, a touch screen controller 72 is connected to the touch screen 70a and the motherboard 90 and transfers the data from the touch screen 70a to the motherboard 90 for further processing and storage. Any of the aforementioned external input or output devices may be attached to the apparatus 20 in numerous ways, via, for example, a connection port 74. The apparatus may also include voice recognition software, so that data may be input or the system may be controlled by voice. Voice recordings may also be stored in the apparatus 20.
Maintenance information previously stored on internal or external storage devices may also be inputted to the apparatus 20. Any suitable storage device may be employed, including the internal memory of the motherboard 90, hard drives or other storage media. In one embodiment, a flashcard 70b may be employed as a storage medium and may be installed through a PCMCIA (Personal Computer Memory Card International Association) card port 76. The flashcard 70b may be in addition to the memory already present on the motherboard 90. The flashcard 70b may be removable through the slot, or permanently attached to the apparatus 20 and contained within the device via a detachable, protective, screw-on covering 78. The card can be used to store pre-configured data.
Information stored on other devices can also be transmitted to the apparatus 20 via any of numerous communication mediums 70c, including but not limited to wireless communication media (such as cellular, satellite or infrared communication), modem connections, Ethernet connections, etc., which may be made through the PCMCIA port 76. Hardware enabling these communication mechanisms may be internal to the apparatus 20 in some embodiments and connected externally in others. Additionally, information may be transferred into the apparatus 20 via any of numerous devices, for example: magnetic media (e.g., videotapes, audiotapes or floppy disks), optical media (e.g., CDs, DVDs or laser disks), and electronic media (e.g., EPROM). One method of connection for any video input is an S-Video (Super-Video) connection port 79 hardwired to an S-Video-compatible device capable of reading the product. However, the present invention is not limited to this type of connection, as ports and devices formatted for other types of video signals may be employed, including, for example, a composite signal. As discussed above, in one embodiment the apparatus 20 is capable of receiving images from a camera, such as camera 70d shown in Figure 3. Any suitable camera or cameras may be used, as the present invention is not limited in this respect. In one embodiment, the camera 70d is NTSC (National Television Standards Committee) compatible. NTSC is one of several camera standards used in the United States. Examples of cameras that may be used with the apparatus 20 include the BoreCam™, the PeriCam™, the TeleCam™, and the ToolCam™, each available from Vision Technologies of Rogers, AR. Alternatively, cameras compatible with other television broadcast standards may be used, including those compatible with the PAL (Phase Alternate Line) or SECAM (Systeme Electronique Couleur Avec Memoire) systems, or any other type of camera.
The camera may be connected to the apparatus 20 in any suitable manner, as the present invention is not limited in this respect. In one embodiment, the camera 70d is connected to the apparatus 20 through port 78 on the apparatus 20 via an electronic cable 79. In another embodiment, an image sensor (e.g., a charge-coupled device, also referred to as a CCD) is incorporated into the apparatus 20 rather than within the camera 70d, and a fiber optic cable extending from the camera may be employed. Further, a fiber optic cable may also be used to transmit digital code representative of the image viewed by the camera to the apparatus 20, even where the camera includes a CCD. Wireless, Ethernet or modem connections enabling data and image transfer from remote cameras or other sources may also be employed, as the present invention is not limited to the use of any particular connection technique.
Audio signals from the object being inspected may also be stored and/or transmitted via the apparatus 20. In one embodiment, the camera 70d may include a microphone 70e to pick up such audio. Alternatively, a separate probe including the microphone 70e or other such sound or vibration receiving device may be employed. Error code signals may also be received by the apparatus 20 using a suitable connection 70f.
In one embodiment, some of the input devices 70a-70f may be controlled by the apparatus 20, rather than by independent device controls. For example, one or more camera control buttons or other interfaces may be provided on the apparatus and coupled, through the apparatus, to the camera to allow a user to operate and maneuver the camera 70d. Camera control may be made via a Motion Control Card (MCC) 97 that is hardwired to the camera 70d or otherwise communicates with the camera 70d via a wireless communication. Camera maneuvering may be made using any of the foregoing input devices that may communicate with the MCC. Control and/or maneuvering of the camera includes at least focusing, zooming, changing the viewing axis, etc., as the present invention is not limited in this respect. Control of the camera can occur because, in one embodiment, the camera includes a stepper motor coupled to various components of the camera, e.g., a gimbal for moving the camera head. The MCC can control the stepper motor as desired. Alternatively, the camera 70d may be manipulated by hand, as the present invention is not limited in this respect. Further, a white balance control button 77, intended to compensate for the amount of ambient light coming into the camera 70d, may be employed. Control button 77 is internally connected to the CCU.
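A sketch of how user commands might be relayed through the apparatus to a motion control card driving the camera's stepper motor follows. The MotionControlCard class, its command names and the step counts are hypothetical; the text names the card and the stepper-driven gimbal but does not define a software interface.

```python
# Hypothetical control-path sketch. The MotionControlCard class and its command
# mapping are assumptions; only the idea of relaying user input to a
# stepper-driven gimbal and focus mechanism comes from the text.

class MotionControlCard:
    """Stand-in for the MCC that drives the camera's stepper motor(s)."""
    def __init__(self, port: str):
        self.port = port  # e.g., a serial or wireless link to the card

    def step(self, axis: str, steps: int) -> None:
        # A real card would be sent a vendor-specific command here.
        print(f"MCC on {self.port}: move {axis} axis by {steps} steps")

def handle_user_input(mcc: MotionControlCard, command: str) -> None:
    """Map touch-screen / button commands to gimbal or focus movements."""
    mapping = {
        "pan_left": ("pan", -10),
        "pan_right": ("pan", +10),
        "tilt_up": ("tilt", +10),
        "tilt_down": ("tilt", -10),
        "focus_in": ("focus", +2),
        "focus_out": ("focus", -2),
    }
    axis, steps = mapping[command]
    mcc.step(axis, steps)

mcc = MotionControlCard(port="COM1")
handle_user_input(mcc, "pan_left")
```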
In one embodiment, the apparatus 20 has at least one output device used to display and/or store images and data. In one embodiment, an LCD (Liquid Crystal Display) screen 80a is coupled internally to the motherboard 90 and is visible to the user through a cut-out in the casing 130. An LCD back light inverter 82 may be employed to control the illumination of the screen 80a. In one embodiment, the LCD 80a works in conjunction with the aforementioned touch screen 70a to act as both an input and an output device. Of course, the LCD is one example of a display and other suitable displays can be used.
This LCD 80a may be configured to display image data, video data and text data in any number of display patterns 84, as shown in Figure 5. In one embodiment, the display 84 includes a split screen comprising an image of keys, such as a typical keyboard setup 85, enabling a user to type on the touch screen 80a using his or her fingers or other such probe, and an image display region 86 for displaying the imaged component with related text, if included. In one embodiment, the orientation (landscape or portrait) of images in region 86 can be manipulated, as will be discussed below. These images may be still or streaming video, as the present invention is not limited to any particular convention. In another embodiment, although not shown, the image display region may also include a split screen, wherein images and text data from two or more cameras, each viewing a component, may be displayed. Alternatively, the split screen may display stored or historical images and/or text of one or more components as well as real time data. The split screen may also be used to display any of the other aforementioned data. Additional electronic hardware and software may be necessary to view images in a split screen mode.
An external monitor or television (not shown) may also be attached to the apparatus 20 and configured as a display in any of the manners disclosed above. In one embodiment, the external monitor is connected to the apparatus 20 via a hardwire connection to a VGA (Video Graphics Array) port 87. VGA is one of several standards for color monitors. However, it is to be appreciated that other techniques for outputting video may be employed, as the present invention is not limited in this respect. In one embodiment, a television is connected to the apparatus 20 via a hardwire connection to the aforementioned S-video port.
Additionally, many of the external communication mediums provided as input devices may also be used as output devices. For example, in one embodiment, data output is made through the communication medium 70c, such as a modem, Ethernet or wireless devices. Data may also be outputted to memory, including the aforementioned flashcard 70b, the motherboard's internal memory, or any other memory device known to those in the art, internal or external to the apparatus 20, such as the aforementioned magnetic media, optical media, or electronic media.
In one embodiment, a speaker 80b may optionally be coupled to the apparatus 20 or otherwise included therein for presenting audio picked up by the microphone 70e, whether real-time or previously stored, regarding the object being inspected as well as previously recorded or real time voice transmission. It is to be appreciated, however, that the use of audio data and the speaker are not required for all embodiments.
The motherboard 90 controls data flowing in and out of the device and internal device activity. The motherboard contains the CPU 92, memory, buses, and I/O connection sockets. The CPU can be any suitable processor (e.g., such as a Mobile P3, available from the Intel Corporation, Santa Clara, CA). The motherboard 90 can be custom designed, or can be any of numerous commercially available motherboards. One such motherboard 90 that may be employed is the Microbus MPX-233111, manufactured by Microbus Inc. of Houston, Texas. The Microbus MPX-233111 contains a video chip 110 coupled to the motherboard 90 through a COM (serial communications) port. This motherboard may be used with a Philips 69000 video chip, manufactured by Philips Semiconductors of Eindhoven, The Netherlands, as the video chip 110. Any other suitable video chip may be employed. In one embodiment, the CCU 100 is also coupled to the motherboard 90 and is used to control and receive images from one or more of the external cameras 70d described above. One example of a CCU 100 that may be used is the Panasonic GP-KS 162CBP WNTCE manufactured by the Panasonic Systems Company of Elgin, Illinois. Both the video chip 110 and the CCU 100 aid in manipulating and displaying graphics data. It should be appreciated that the name brand and type of components described are exemplary, as the present invention is not limited in this respect. Most incoming data flows through the motherboard 90 upon entering the apparatus 20. Input data received via the camera 70d may be received by the CCU 100 before being processed by the motherboard 90. The CCU 100 is capable of controlling one or more parameters of camera generated images including gain and white light balance and controlling an electronic iris for contrast. In one embodiment, the aforementioned white balance control button 77 is connected to the CCU 100 so that an initial white balance reading may be obtained. To take such a reading, the user places a piece of white paper in front of the camera 70d and depresses the white balance control button 77. The CCU 100 uses this reading to measure the amount of ambient light. Then, the CCU 100 uses the reading to adjust the color data in all subsequent camera shots, compensating for the ambient light.
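The white-balance step can be illustrated with the short sketch below: per-channel gains are derived from a frame captured while the white sheet fills the view, then applied to later frames. The particular gain formula (scaling each channel toward the reference frame's grey average) is an assumption; the text only states that the reading is used to compensate for ambient light.

```python
# Illustrative white-balance sketch using per-channel gains derived from a
# white-paper reference frame. The gain formula is an assumption; the text
# only states that the reference reading compensates for ambient light.
import numpy as np

def white_balance_gains(reference_frame: np.ndarray) -> np.ndarray:
    """reference_frame: HxWx3 array captured while a white sheet fills the view."""
    channel_means = reference_frame.reshape(-1, 3).mean(axis=0)
    grey = channel_means.mean()
    return grey / channel_means  # per-channel gain that neutralizes the color cast

def apply_gains(frame: np.ndarray, gains: np.ndarray) -> np.ndarray:
    return np.clip(frame * gains, 0, 255).astype(np.uint8)

# Example with a synthetic, slightly bluish reference frame.
reference = np.ones((4, 4, 3)) * np.array([200.0, 205.0, 230.0])
gains = white_balance_gains(reference)
scene = np.ones((4, 4, 3)) * np.array([120.0, 130.0, 160.0])
print(apply_gains(scene, gains))
```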
The CCU can also perform analog to digital (A/D) conversion. For example, the CCU may receive images in any electronic format from the camera and reformat the images into digital format. The CCU then passes the digitally formatted image to the CPU. The video chip 110 can perform a variety of image manipulations on any image, and is not limited to manipulating solely camera generated images. In some embodiments, the video chip 110 is capable of A/D conversion, as well as formatting the image into known image formats, such as JPEG (Joint Photographic Experts Group). Once formatted by either or both of the CCU 100 and the video chip 110, the data may be passed to the CPU 92 for further processing, storing and/or transmitting.
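For illustration only, the digitize-and-encode step is sketched below, with a numpy array standing in for the digitized sensor output and Pillow performing the JPEG formatting. In the apparatus this work is done in hardware by the CCU and video chip; the synthetic frame and the quality setting are assumptions.

```python
# Sketch of the digitize-and-encode path. In the apparatus the CCU and video
# chip do this in hardware; Pillow is used here only to illustrate the JPEG
# formatting step, and the frame content is synthetic.
import io
import numpy as np
from PIL import Image

def encode_frame_to_jpeg(frame: np.ndarray, quality: int = 85) -> bytes:
    """frame: HxWx3 uint8 array representing the digitized camera frame."""
    buffer = io.BytesIO()
    Image.fromarray(frame, mode="RGB").save(buffer, format="JPEG", quality=quality)
    return buffer.getvalue()

frame = (np.random.default_rng(0).random((480, 640, 3)) * 255).astype(np.uint8)
jpeg_bytes = encode_frame_to_jpeg(frame)
print(f"{len(jpeg_bytes)} bytes of JPEG data ready to store or transmit")
```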
The CPU 92 retrieves any requested data and sends it to the proper output device as requested. The CPU 92 also processes, stores or sends any inputted data as directed. Software used in the apparatus 20 may be run by and controlled by the CPU 92. Such software may be custom software or commercially available software, such as XFREE86 provided by The XFree86 Project, Inc. (available from the University of Sydney, Australia) that runs on UNIX® and compatible (e.g., Linux, BSD, Mac OS X and Solaris x86 series) operating systems and OS/2 and a suitable windows manager. This or other software may be used so that the CPU can perform concurrent operations of two or more processes in a multitask fashion. In one embodiment, the Linux operating system, available from Linux.com, is run on the apparatus. Other suitable operating systems may be employed, as the present invention is not limited in this respect. Word processing or other text processing software may be employed to handle partial or full text inputs by a user. In this respect, any text information that a user desires may be inputted, not merely pre-programmed information. Of course, preprogrammed information, such as checklists, may also be employed. The images or audio data may be attached as a file to the resulting text file. Additional software may include an image manipulation package, enabling the data to be formatted according to certain display constraints. Some possible manipulations may include image rotation, image sizing and choosing between landscape and portrait display options. The CPU 92 may employ any of a number of algorithms to handle these tasks, as will be explained below. In one embodiment, the memory 94 is used to buffer several frames of incoming streaming video such that the images can be processed frame by frame and then displayed to the user at a rate comparable to that of real time, but several microseconds later. This process improves display quality and facilitates image manipulation. For example, each frame in the buffer may be rotated prior to being displayed to the user. The CPU 92 can interface with the motherboard's memory 94 in any of numerous ways, e.g., through various busses. In one embodiment, the motherboard 90 contains 64 MB of RAM (Random Access Memory). However, the present invention is not limited by the type or amount of storage placed on the motherboard 90, as additional types or amounts may be coupled to the motherboard 90. In the embodiment shown, both the memory 94 and the CPU 92 interface with the I/O devices through the I/O connection.
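The buffered-display idea can be sketched as follows: incoming frames are queued, each is manipulated (here, rotated, one of the manipulations mentioned above) and then displayed a few frames later. The queue depth, the use of numpy for rotation and the print-based display stand-in are illustrative assumptions.

```python
# Sketch of the buffer-then-manipulate display pipeline. Queue depth, the use
# of numpy rotation, and the display stand-in are illustrative assumptions.
from collections import deque
import numpy as np

BUFFER_DEPTH = 4  # a few frames of latency, per the buffered-display approach

def display(frame: np.ndarray) -> None:
    print("displaying frame of shape", frame.shape)  # stand-in for the LCD output

def run_pipeline(frame_source, rotate_quarter_turns: int = 1) -> None:
    buffer = deque(maxlen=BUFFER_DEPTH)
    for frame in frame_source:
        buffer.append(frame)
        if len(buffer) == BUFFER_DEPTH:
            oldest = buffer.popleft()
            display(np.rot90(oldest, k=rotate_quarter_turns))  # per-frame manipulation

# Example with synthetic 480x640 RGB frames.
frames = (np.zeros((480, 640, 3), dtype=np.uint8) for _ in range(10))
run_pipeline(frames)
```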
In one embodiment, a power supply interface is provided by a port 112 capable of hardwire connection to an external power supply. The power supply level may be about 12 V, or other levels may be employed. The apparatus 20 can include an on-board power source, such as a battery 114 (Fig. 4), which may be rechargeable and housed within the casing, thereby rendering the apparatus 20 cordless.
As discussed above, the various components forming the apparatus 20 may be housed within a casing 130. In one embodiment, the casing 130 includes a front casing 130a and a back casing 130b that interconnect to form an enclosure. The front casing 130a contains a cutout 132 for the display screen 80a and touch screen 70a. The back casing 130b is substantially rectangular and may also include one or more cutouts 134 for ports to external devices and/or control buttons, knobs, switches or other interfaces. The front and back casings 130a, 130b may be secured together using any suitable technique, such as with the use of screws. In addition, the casing 130 may contain various bosses to support and secure the various electronic and mechanical components of the apparatus 20.
In one embodiment, the casing 130 also contains two sets of four curved finger grooves 136 on the external side to aid in handling the apparatus 20. Handles 138 are attached to the casing 130 over these grooves, leaving about a one to two inch space for a user's hands. A hook 140 may be mounted to the case to allow the apparatus 20 to be hung for hands-free use. It should also be recognized that the casing for the apparatus 20 can take many other shapes and configurations, as the present invention is not limited in this respect. The casing 130 of the apparatus 20 may be manufactured out of many types of material in order to satisfy the needs of the user. For example, the apparatus 20 may be ruggedized and/or waterproofed.
In addition, it should be appreciated that various aspects of the present invention are not limited to the use of this or any particular hardware particularly adapted for use as a digital maintenance apparatus. For example, many of the above-described methods may be programmed into any suitable computer.
As discussed above, any suitable type of imaging unit or camera can be used with the apparatus 20 to provide images of the object 24. One example of an imaging system, including a camera assembly and a scope, with which the apparatus 20 of the present invention can be used, will now be described with reference to Figures 6-10. However, it is to be appreciated that the apparatus 20 is not limited to use with this or any other particular imaging system. Figure 6 is a partially cut away perspective view of an example of an imaging system that may be used with the apparatus 20. As shown, the imaging system includes four primary components, i.e., a scope 150, such as an endoscope, an imaging unit or camera assembly 152, a coupler 154, which couples the scope 150 to the imaging unit 152, and a condom-like drape 400, which prevents the imaging unit 152 from contaminating a sterile operating field should the system be used in a medical environment, a clean room environment for the manufacture of, e.g., silicon wafers, or other sterile environments. The condom-like drape 400 need not be employed when inspecting components, such as aircraft engines. The imaging system can be employed with any type of image-producing scope, and is not limited to use with any particular type of scope.
As discussed in more detail below, in the exemplary imaging system shown in Figures 6-7, the condom-like drape 400 does not intercept the optical viewing axis of the system. In addition, the condom-like drape 400 does not cover a focusing mechanism 480 of the imaging system, making it easier to focus the system and lessening the likelihood that the drape 400 will be damaged due to manipulation of the focusing mechanism.
The lens for focusing the image from the endoscope to the imaging unit may be provided in the imaging unit 152, rather than in the coupler 154. This is particularly advantageous because, as discussed in more detail below, in the exemplary embodiment shown, a portion of the coupler 154 is not separated from the scope 150 by the condom-like drape 400, and therefore is sterile in use. By removing the refractive lens 200 from the coupler 154, the coupler 154 can be made significantly less expensively, thereby enabling the coupler 154 to be provided as a disposable part that need not be sterilized between uses. This is advantageous because the sterilization of the devices can be inconvenient and time consuming.
The imaging unit 152 includes an image sensor 156 that senses an image along an imaging axis (not shown). When the imaging system is used, the coupler 154 is coupled between the eyepiece 158 of the scope 150 and a distal end 660 of the imaging unit 152 such that the lens 200 is disposed between the image sensor 156 and the eyepiece 158 to focus an image produced by the scope 150 onto the image sensor 156. The refractive lens 200 may be provided in the imaging unit 152, rather than in the coupler 154. The coupler can therefore be made significantly less expensively, thereby enabling the coupler to be provided as a disposable part that need not be sterilized between uses.
The image sensor 156 may, for example, include a charge-coupled device (CCD) as discussed above, or a metal-oxide semiconductor (MOS) sensor. It should be appreciated, however, that the present invention is not limited in this respect, and can be employed with any type of image sensor 156. The image generated by the image sensor 156 can be conveyed to the maintenance apparatus 20 or a monitor 460 in any of numerous ways, and the present invention is not limited to any particular implementation. For example, the image sensor 156 may be coupled to circuitry 560 which can assist in converting an image sensed by the image sensor 156 into an electrical signal. This electrical signal then may be transmitted (e.g., via cable 260) to the monitor 460, maintenance apparatus 20 or elsewhere for display to a user, or may be otherwise processed and/or recorded on a suitable medium. Alternatively, the image sensor 156 may comprise a bundle of fiber optic cables which optically transmit an image from the lens 200 to the apparatus 20 or other viewing device for display to a user. Thus, the image sensor 156 need not necessarily convert the image from scope 150 into an electrical signal.
The imaging unit 152 is releasably mated with the coupler 154. This mating may be accomplished using any of a number of techniques. Figures 6 and 7 illustrate one technique that may be used to mate these two components. In the particular implementation shown, to mate imaging unit 152 with coupler 154, a distal end 660 of the imaging unit 152 is inserted into an opening 880 at a proximal end 1100 of the coupler 154. As shown, the imaging unit 152 includes a button 580 which is pivotally connected, via a pin 820, to a body portion 180 of the imaging unit 152. The imaging unit 152 has a cavity 810 formed underneath the button 580 and a spring 900, disposed in the cavity 810. Spring 900 biases the button 580 (in a clockwise direction in Figure 6) about pin 820 so that locking member 600 is biased away from a surface 860 of body portion 180. When a user pushes button 580 toward surface 860, however, spring 900 is compressed so that button 580 moves in a counterclockwise direction in Figure 6 about pin 820 and locking member 600 moves toward surface 860. Thus, when the button 580 is depressed and the distal end 660 of the imaging unit is inserted into the opening 880 in the coupler 154, the locking member 600 moves toward surface 860 so that it can slide over edge 1180 of the coupler 154. When the button 580 is released, the locking member 600 is biased (by spring 900) away from surface 860 and into a notch 620 in the coupler 154, and a shoulder 1160 of imaging unit 152 contacts a shoulder 1140 of the coupler 154, thereby interlocking the imaging unit 152 and the coupler 154. An indication that the distal end 660 of the imaging unit 152 is fully inserted into the opening 880 is provided by the distal end 660 contacting a shoulder 1120 of coupler 154. The imaging unit 152 and coupler 154 can be separated by pushing button 580, which moves the locking member 600 out of the notch 620, and pulling the imaging unit 152 away from the coupler 154. As mentioned above, Figures 6 and 7 illustrate only one example of the many ways that the imaging unit 152 and coupler 154 may be mated together.
As shown in Figures 6 and 7, the imaging unit 152 also includes a handle 780 proximal to the body portion 180. The handle 780 may include grooves 800 to make it easier for a user to grip the imaging unit 152 through the drape 400 that can be extended over the imaging unit 152 in a manner described below.
The image sensor 156 and circuitry 560 may be mounted in the body portion 180 of the imaging unit 152 in any of a number of ways. For example, the image sensor 156 may be mounted via pins or screws 840a and 840b, and circuitry 560 may be mounted on a circuit board supported within body portion 180. One or more wires (not shown) may be used to interconnect the circuitry 560 with the cable 260.
It may be useful to enable the focal length between the image sensor 156 and the lens 200 of imaging unit 152 to be adjusted. In the system shown in Figures 6-7, this is accomplished via a mechanism that is not covered by the condom-like drape 400, thereby making it easier to focus the system and lessening the likelihood that the drape 400 will be damaged due to manipulation of the focusing mechanism. It should be appreciated, however, that the focal length adjustment can be accomplished in any number of ways.
One example of a technique that is useful to perform the focal length adjustment is illustrated in Figures 6-8. In the embodiment shown, the refractive lens 200 is disposed in the imaging unit 152, rather than in the coupler 154. Thus, the focusing mechanism includes elements disposed in the imaging unit 152, as well as in the coupler 154. As mentioned above, placement of the lens 200 within the imaging unit 152, rather than in the coupler 154, provides at least one significant advantage. That is, the cost of the coupler 154 may be reduced significantly below the cost of coupling devices that include lenses, thereby making it commercially practicable to use a new, sterile coupler each time the imaging system is used, rather than repeatedly sterilizing and reusing the same coupling device should sterilization be required.
The distal end 660 of the imaging unit 152 includes a primary cylinder 760, in which a spring 680 and a cylindrical lens holder 220 are disposed. Lens holder 220 supports the lens 200 in front of an imaging axis of image sensor 156. Lens holder 220 (and lens 200) can be moved within primary cylinder 760 either toward or away from distal end 660 of the imaging unit 152 so as to adjust the focal length between the image sensor 156 and the lens 200. Spring 680 biases lens holder 220 toward distal end 660. The position of lens holder 220 within primary cylinder 760 can be adjusted, however, through manipulation of a focusing mechanism on the coupler 154 as discussed below. It should be appreciated that the present invention is not limited in this respect and that a camera including a lens that does not require focusing may be employed.
The imaging unit 152 further includes an outer cylinder 720, including a spirally ramped upper edge 960, which surrounds the primary cylinder 760. Outer cylinder 720 is movable with respect to primary cylinder 760 either toward or away from the distal end 660 of imaging unit 152. Outer cylinder 720 is connected to the lens holder 220 via a pin 700. Pin 700 extends through a slot 920 which extends a short distance along a length of the primary cylinder 760. Thus, lens holder 220, outer cylinder 720 and pin 700 move as a single unit, with respect to primary cylinder 760, either toward or away from the distal end 660 of imaging unit 152. The manner in which this unit interacts with the focusing mechanism disposed on coupler 154 is described below in connection with Figures 8a-8b.
Figures 6 and 7 show an exemplary implementation of the coupler 154. The coupler 154 can be constructed in any of a number of ways to achieve the desired goal of enabling the imaging unit 152 to be coupled to the scope 150. In the implementation shown, the coupler 154 includes a main body 500 (including a proximal portion 500a and a distal portion 500b), a focusing ring 480, a light-penetrable window 940, a scope mounting portion 420 (including inner ring 420a and outer ring 420b) and the condom-like drape 400. The components constituting the main body 500, focusing ring 480 and scope-mounting portion 420 may be made of any suitable material and may be affixed together in any suitable manner. For example, they may be plastic molded components affixed together using an epoxy-based adhesive. When the coupler 154 is a disposable device, the coupler 154 is preferably formed from inexpensive components. The main body 500 may be formed by inserting the distal portion 500b within the focusing ring 480, and then affixing together the proximal and distal portions 500a and 500b. Scope mounting portion 420 may be affixed to distal portion 500b. Main body 500 has an outer surface 520 between a distal end 1080 and a proximal end 1100 of the coupler 154. A channel 440 extends about a perimeter of the outer surface 520 between the focusing ring 480 and the proximal end 1100.
When the coupler 154 is used in a medical or clean room application, it is desirable to not have to sterilize the imaging unit 152, thereby saving the time and expense of sterilization, and avoiding restrictions on the manner in which the imaging unit is formed, since it need not be sterilizable. Therefore, a sterile barrier may be established between the sterile operating environment including the scope 150, and a non-sterile environment including the imaging unit 152. In the system shown in Figures 6-7, such a sterile barrier is established by coupling the distal end 660 of the imaging unit 152 to the coupler 154, and providing a hermetic seal between the components of the coupler 154 that separate the sterile and non-sterile environments. A light-penetrable window 940 is hermetically sealed between the distal end 1080 and the proximal end 1100 of the coupler 154 to establish a sterile barrier therebetween. Window 940 may be made of glass, plastic, or any other suitable material through which light can pass from the scope 150 to the image sensor 156 (via lens 200) to generate a suitable image.
As mentioned above, the coupler 154 also includes the condom-like drape 400. The condom-like drape 400 may be made of any material that is suitable for creating a sterile barrier between a sterile environment and a non-sterile environment. For example, the condom-like drape may be made of a non-porous latex or plastic material. When the imaging unit 152 is mated with the coupler 154, the drape 400 may be extended to cover some or all of the imaging unit 152 and cable 260. The condom-like drape 400 may be hermetically sealed to the outer surface 520 of coupler 154. It should be appreciated that in the implementation shown in the figures, when each of the components of the coupler 154 is sterile, the hermetic seals between the main body portion 500 and the window 940 and drape 400 establish a sterile barrier between the scope 150 and the imaging unit 152, with the main body portion 500 of the coupler 154 itself forming a part of this sterile barrier. As compared to other systems, in which a sterile barrier is formed only with a drape and a window portion thereof and in which a coupling device is located entirely on the non-sterile side of this barrier, the system shown in Figures 8 and 9 is superior because scope 150 can mate directly with body portion 500 rather than requiring the drape to be interposed between the coupling device and the endoscope.
In the system shown in the figures, the condom-like drape 400 does not intercept the optical viewing axis 190 of the imaging system. As mentioned above, this is advantageous in that the drape 400 need not be provided with a window that must be aligned with the optical viewing axis 190, and the drape 400 does not interfere with the quality of the image presented on the monitor 460. It should be appreciated that the function performed by the condom-like drape 400 can be achieved in any of numerous ways. For example, a protective drape can be provided that is more rigid than the condom-like drape 400 depicted in the drawings.
In the system shown in the drawings, the condom-like drape 400 is substantially tubular in form and is open on its distal and proximal ends. The distal end 210 of the condom-like drape 400 is attached to the outer surface 520 (within channel 440) of the coupler 154. As discussed above, this attachment can be accomplished using a hermetic seal (e.g., via an O-ring 540) to maintain the separation between the sterile and non-sterile environments. The condom-like drape 400 can be provided in a rolled-up form attached to the coupler 154. After the coupler 154 is mated with the imaging unit 152 as described above, the condom-like drape 400 can be unrolled to cover the non-sterile imaging unit 152. By encompassing the outer surface 520 of coupler 154 with the opening at the distal end 210 of the drape 400, the drape 400 can be used in conjunction with coupler 154 without requiring the user to align the drape 400, or a window portion thereof, between the eyepiece 158 of the scope 150 and the coupler 154, and without having the drape 400 intercept the optical viewing axis 190 of the imaging system. As discussed above, it is to be appreciated that the use of a drape is optional.
Figures 6 and 7 illustrate one example of a technique that may be used to mate the scope 150 with the coupler 154. It should be appreciated that numerous other suitable mating techniques can be employed. In the system shown in Figures 6 and 7, the scope 150 is mated with the coupler 154 by inserting the eyepiece 158 into an opening 380 at the distal end 1080 of the coupler 154. Opening 380 may be formed by the inner and outer rings 420a-420b of the scope mounting portion 420. The inner and outer rings 420a-420b form equal diameter openings, and inner ring 420a is movable with respect to outer ring 420b. A spring biases the inner ring 420a so that its center is forced to be offset from the center of the outer ring 420b unless a user activates a lever (not shown) to cause the centers of the two rings to align with one another.
To mate the scope 150 with the coupler 154, the user activates the lever so that the centers of the rings 420a-420b align with one another and inserts the eyepiece 158 through both rings. The user then can release the lever so that the spring (not shown) causes the center of ring 420a to become offset from the center of ring 420b. Because the diameter of the eyepiece 158 is only slightly smaller than the diameter of each of rings 420a and 420b, when the centers of the rings are offset from one another, the eyepiece 158 will be locked within the scope mounting portion 420 of the coupler 154. The eyepiece 158 may be separated from the scope mounting portion 420 by pressing the lever to realign the centers of rings 420a and 420b and pulling the scope 150 away from the coupler 154.
In the system of Figure 6, the coupler 154 is shown as being mated directly with the eyepiece 158 of the scope 150. However, it should be appreciated that the scope 150 (or other image-producing scope) may alternatively be mated indirectly with the coupler 154. For example, the scope 150 may be mated with the coupler 154 via one or more additional coupling devices.
As discussed above, using the system of Figures 6-8, the user can directly manipulate a focusing mechanism without having to do so through a portion of a protective drape such as condom-like drape 400. Any focusing mechanism can be employed that serves to adjust the focal length between the lens 200 and image sensor 156 in the imaging unit 152. In the exemplary system shown in Figures 6-8, a focusing ring 480 is provided on the coupler 154 to perform this focal length adjustment. The focusing ring 480 is disposed distally of the distal end 210 of the condom-like drape 400, so that after the drape 400 is extended to cover some or all of the imaging unit 152 and cable 260, the focusing ring 480 is not covered by the drape 400 and may be manipulated by a user to adjust the focal length between the lens 200 and the image sensor 156 without also having to manipulate the drape 400. Hence, this feature makes focusing ring 480 relatively easy for the user to manipulate to achieve sharp focusing, and reduces the risk of damage to drape 400. An illustrative example of a linkage assembly for mechanically coupling the focusing ring 480 on the coupler 154 to the imaging unit 152 to adjust the focal length between the lens 200 and image sensor 156 is shown in Figures 7, 8a and 8b. It should be appreciated that numerous other implementations are possible. In the system shown, the distal portion 500b of the main body portion 500 of coupler 154 has an annular groove 1000. Annular groove 1000 may be covered by the focusing ring 480, so that it is not visible from the outside of coupler 154. A finger 980 extends inwardly from the focusing ring 480 through the annular groove 1000, so that when the focusing ring 480 is rotated about the main body portion 500, finger 980 slides within the annular groove 1000. As shown in Figures 8a and 8b, when the imaging unit 152 is mated with the coupler 154, a lower surface 1200 of finger 980 contacts a portion of a spiraling ramp surface 960 on the outer cylinder 720. As mentioned above, pin 700 may be connected between the outer cylinder 720 and the cylindrical lens holder 220 through the slot 920, which extends along the length of the primary cylinder 760, so that the outer cylinder 720 and lens holder 220 do not rotate with respect to the primary cylinder 760. The focusing ring 480, however, can rotate freely about the primary cylinder 760, limited only by the movement of the finger 980 within the annular groove 1000.
As the focusing ring 480 rotates with respect to the primary cylinder 760, a bottom surface 1200 of the finger 980 slides along the spiraling ramped surface 960. The spring 680 pushes upwardly on outer cylinder 720 to keep a portion of the spiraling ramped upper surface 960 in contact with bottom surface 1200 of the finger 980 at all times. Enough friction exists between the focusing ring 480 and the main body 500 of the coupler 154 to prevent the spring 680 from rotating the focusing ring 480 when it is not being manipulated by a user. This friction makes the fine tuning of the focal length between the lens 200 and image sensor 156 (using focusing ring 480) relatively easy to accomplish. Figures 8a and 8b illustrate the focusing mechanism at its two extreme focusing positions, with Figure 8a illustrating the lens 200 at its closest position to the image sensor 156 and Figure 8b illustrating the lens 200 at its furthest position from the image sensor 156. As shown in Figure 8a, when the lens 200 is at its closest position to the image sensor 156, the spring 680 is fully compressed, bottom surface 1200 of finger 980 is in contact with a point 1060 near the top of the spiraling ramped surface 960, and the finger 980 is in a first position with respect to the primary cylinder 760. In contrast, as shown in Figure 8b, when the lens 200 is at its furthest position from the image sensor 156, the spring 680 is fully extended, the bottom surface 1200 of finger 980 is in contact with a point 1040 near the bottom of the spiraling ramped surface 960, and the finger 980 is in a second position with respect to the primary cylinder 760, which is on an opposite side from the first position (Figure 8a).
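The focusing geometry can be approximated with a simple constant-pitch model relating rotation of the focusing ring 480 to axial travel of the lens 200. The function below is a hypothetical sketch; the travel and sweep values are illustrative assumptions, not dimensions taken from the figures.

```python
# Hypothetical constant-pitch model of the spiral-ramp focusing linkage: rotating
# the ring slides the finger along the ramp, moving the lens axially in proportion.
def lens_travel_mm(angle_deg: float,
                   total_travel_mm: float = 2.0,    # assumed full lens travel
                   total_sweep_deg: float = 300.0   # assumed full ring rotation
                   ) -> float:
    """Axial displacement of the lens for a given focusing-ring rotation."""
    angle_deg = max(0.0, min(angle_deg, total_sweep_deg))  # travel is limited by the groove
    return total_travel_mm * (angle_deg / total_sweep_deg)
```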
It should be appreciated that the above-described system for adjusting the focal length between the image sensor 156 and the lens 200 is only one example of the many possible systems that can achieve this result, as other implementations can alternatively be employed.
In the illustrative embodiment of Figures 6-7, the imaging unit 152 includes a single body portion 180 in which both the image sensor 156 (and associated circuitry 560) and the refractive lens 200 (and associated components such as the lens holder 220, the spring 680, and the cylinders 720 and 760) are disposed. It should be appreciated, however, that various components of the imaging unit 152 may alternatively be distributed among two or more separate housings that may be mated together to form the imaging unit 152. An illustrative example of an imaging system configured in this manner is shown in Figures 9 and 10. As shown in Figure 9, the imaging unit 152 to be mated with the coupler 154 may include a first housing 180a in which the refractive lens (and associated components) is disposed, and a second housing 180b in which the image sensor 156 (and associated circuitry (not shown)) is disposed.
In the illustrative embodiment shown in Figures 9 and 10, the second housing 180b is the housing of a camera head 152b (e.g., a standard C-mount camera head), and the first housing 180a is the housing of an adapter 152a for adapting the camera head 152b for use with the coupler 154. When the adapter 152a is mated with the camera head 152b (as discussed below), the adapter 152a and the camera head 152b together form a composite imaging unit 152 which is similar to the imaging unit 152 described above in connection with Figures 6-7. Although the example shown in Figures 9-10 includes a C-mount camera head and adapter therefor, it should be appreciated that each of the housings 180a-180b may take on any of a number of alternative forms. For example, the housing 180b may alternatively be the housing of a standard V-mount camera head, or any other device in which an image sensor is disposed, and the housing 180a, may be configured to be mated with the same.
It should also be appreciated that the imaging unit 152 may include additional housings beyond the two described, rather than only one or two housings. For example, referring to the Figure 9 system, the imaging unit 152 may further include one or more housings disposed between the housings 180a and 180b or between the housing 180a and the coupler 154. Such an additional housing may exist, for example, in the form of a coupling device that couples together the housings 180a and 180b or the housing 180a and the coupler 154. It should be appreciated that the imaging unit actually employed may be any of numerous devices or combinations of devices capable of receiving an optical image along an imaging axis. As used herein, the term "imaging unit" is not intended to be limiting. Rather, it is intended to refer to any device or combination of devices capable of performing an imaging function.
Further, while in the systems of Figures 6-9 the coupler 154 is shown as being mated directly with the distal end 660 of the imaging unit 152, it should be appreciated that the imaging unit 152 may alternatively be mated indirectly with the coupler 154. For example, the imaging unit 152, in whatever form, may be mated with the coupler 154 via one or more additional coupling devices.
In the illustrative system shown in Figures 9-10, the operational interface between the adapter 152a and the coupler 154 is identical in most respects to the operational interface between the imaging unit 152 and the coupler 154 described above in connection with Figures 6-8. Corresponding components in the two embodiments have therefore been labeled with identical reference numerals, and reference may be made to the description of the embodiment of Figures 6-8 for an in-depth understanding of the operational interface between the adapter 152a and the coupler 154 of the embodiment of Figures 9-10. As mentioned above, the camera head 152b may, for example, be a standard C-mount camera head. Therefore, as shown in Figure 9, the camera head 152b may include a threaded, female connector 1280 formed at a distal end 1320 thereof. To permit the adapter 152a to mate with the connector 1280 of the camera head 152b, the adapter 152a may include a threaded, male connector 1260 formed at a proximal end 1360 thereof. As shown in Figure 9, the image sensor 156 may be disposed adjacent the distal end 1320 of the camera head 152b so that, when the male connector 1260 of the adapter 152a is threaded into the female connector 1280 of the camera head 152b, the image sensor 156 is disposed adjacent an opening 1380 at the proximal end 1360 of the adapter 152a. In the system of Figures 9-10, the image sensor 156 is therefore disposed further from the distal end 660 of the imaging unit 152 than it is in the system of Figures 6-7. For this reason, in the system of Figures 9-10, an annular cavity 1220 is formed within the housing 180a to provide an optical pathway between the refractive lens 200 and the image sensor 156 along which an image produced by the scope 150 can be focused onto the image sensor 156 via the lens 200. The cavity 1220 may be formed, for example, by reducing a width of an annular shoulder 1340 (Figure 10) supporting one end of the spring 680 to be narrower than in the embodiment of Figures 6-7.
In addition, in the system of Figures 9-10, the button 580 is disposed on the adapter 152a of the imaging unit 152, and is therefore disposed distally of the image sensor 156 in this system, rather than proximally of the image sensor 156 as in the system of Figures 6-7. As shown, to make the button 580 fit on the adapter 152a, the button 580 may be shortened as compared to the system of Figures 6-7. Additionally, the pin 820 about which the button 580 pivots may be disposed within a small cavity 1240 adjacent the proximal end 1360 of the adapter 152a, rather than being disposed proximally of the image sensor 156 as in the system of Figures 6-7. It should be appreciated, of course, that the button 580 and locking member 600 represent only one example of numerous mechanisms that can be used to interconnect the imaging unit 152 with the coupler 154, and that the imaging unit 152 may be mated with the coupler 154 in different ways. For example, the imaging unit 152 may not include a button such as the button 580 or a locking member such as the locking member 600 at all, and may instead provide a different mechanism for mating the imaging unit 152 with the coupler 154. In light of the above description, it should be appreciated that, as far as the physical interface between the imaging unit 152 and the coupler 154 is concerned, the imaging unit 152 that is formed when the adapter 152a is mated with the camera head 152b can be made identical in all respects to the imaging unit 152 of the embodiment of Figures 6-8. Additionally, by properly adjusting the refractive index of the lens 200 to account for the increased distance between the distal end 660 and the image sensor 156 in the embodiment of Figures 9-10 as compared to the embodiment of Figures 6-8, the imaging unit 152 of Figures 9-10 can also be made to mimic the functional characteristics of the imaging unit 152 of Figures 6-8 as well. The use of the adapter 152a of Figures 9-10 therefore enables a standard camera head (e.g., the camera head 152b) to be adapted for use with the inventive coupler 154 described herein in the same manner as in the embodiment of the imaging unit 152 described in connection with Figures 6-8. Therefore, one already in possession of a camera head 152b (e.g., a standard C-mount or V-mount camera head) may simply purchase the adapter 152a (which does not include an image sensor) for use with the coupler 154, rather than purchasing the imaging unit 152 of Figures 6-8 (which additionally includes an image sensor) for use therewith.
The adapter 152a described herein is configured for use with a specific type of coupler (i.e., the coupler 154). However, it should be appreciated that the adapter 152a may alternatively be configured for use with other types of devices or couplers.
It should be appreciated that any suitable type of camera can be used to take such images, as the present invention is not limited to the above-described examples. Additional examples of cameras that can be suitable for use in such a system are described in a series of Applicant's earlier-filed U.S. patent applications, including provisional applications 60/054,197; 60/054,198; and 60/121,382, as well as regular U.S. patent application nos. 09/126,368; 09/382,496; and 09/513,673, each of which is incorporated herein by reference. However, the present invention is not limited to using such camera systems.
The apparatus 20 and method of use described herein can be used in connection with inspection and/or maintenance of numerous types of objects, as the present invention is not limited in this respect. For example, the apparatus 20 and method of use described herein can be used in connection with inspection and/or maintenance of: aircraft (e.g., airplanes and helicopters), boats, automobiles, trucks, military equipment (e.g., tanks, weapons, etc.) and space vehicles; engines and related components, including aircraft engines, ship engines, motor vehicle engines and turbine engines; structural components of vehicles, such as airframes, hulls, chassis and automobile frames and other such components; structures such as buildings, roads, bridges, tunnels, etc.; facilities such as manufacturing plants and power plants, including the components or objects relating to such facilities; mechanical components; systems; parts; inventory; products; processes; fluids and flows; and chemicals. Other applications for the apparatus include, but are not limited to, capturing, storing and retrieving information, such as maintenance and/or inspection information, regarding: process control; inventory management and control; cargo inspection by customs agents; searches conducted by law enforcement officials; surveillance; and obtaining diagnostic and other information by doctors and other medical professionals. Other applications will be readily apparent to those of skill in the art. It should be appreciated that various combinations of the above-described embodiments of the present invention can be employed together, but each aspect of the present invention can be used separately. Therefore, although the specific embodiments disclosed in the figures and described in detail employ particular combinations of the above-discussed features of the present invention, it should be appreciated that the present invention is not limited in this respect, as the various aspects of the present invention can be employed separately, or in different combinations. Thus, the particular embodiments described in detail are provided for illustrative purposes only.

What is claimed is:

Claims

1. A method of maintaining an object, the method comprising acts of: storing, in digital format, a first image of the object at a first time; obtaining a second image of the object at a second time; comparing the first image to the second image; and determining whether to perform maintenance on the object based, at least in part, on the act of comparing.
2. The method according to claim 1, further comprising an act of annotating the first image with data input from a user.
3. The method according to claim 2, wherein the act of determining whether to perform maintenance on the object comprises an act of reviewing the annotations regarding the first image.
4. The method according to claim 1, further comprising an act of displaying at least one of the first or second images.
5. The method according to claim 1, further comprising an act of displaying both the first image and the second image.
6. The method according to claim 1, further comprising an act of transmitting the first image to a remote location.
7. The method according to claim 1, further comprising an act of retrieving the first image from the remote location.
8. The method according to claim 1, further comprising an act of storing at least one of an image of the initial condition of the object, maintenance history of the object, diagnostic information regarding the object and instructional information regarding the object.
9. The method according to claim 1, further comprising an act of obtaining audio data regarding the object.
10. The method according to claim 1, further comprising an act of obtaining an error code regarding the object.
11. The method according to claim 2, wherein the act of annotating the first image with data comprises an act of inputting at least one of text and voice.
12. The method according to claim 6, wherein the act of transmitting the first image comprises an act of transmitting the first image via a wireless connection.
13. The method according to claim 6, wherein the act of determining whether to perform maintenance on the object occurs at the remote location.
14. The method according to claim 1, wherein the act of imaging the object comprises an act of imaging the object with a still image.
15. The method according to claim 1, wherein the act of imaging the object comprises an act of imaging the object with streaming video.
16. The method according to claim 1, further comprising an act of performing maintenance on the object.
17. The method according to claim 1, wherein the act of imaging the object comprises an act of imaging a component of an aircraft.
18. The method according to claim 6, wherein the act of transmitting the first image comprises an act of transmitting the first image via the Internet.
19. A method of inspecting an object from a remote location, the method comprising acts of: obtaining a digital image of the object at a first location; electronically transmitting the digital image to a second location remote from the first location; viewing the digital image at the second location; transmitting instructions to the first location; and performing an act on the object in response to the instructions.
20. The method according to claim 19, further comprising an act of annotating the image with data input from a user.
21. The method according to claim 20, further comprising an act of electronically transmitting the annotations to the second location.
22. The method according to claim 21, further comprising an act of reviewing the annotations.
23. The method according to claim 19, further comprising an act of storing at least one of an image of the initial condition of the object, maintenance history of the object, diagnostic information regarding the object and instructional information regarding the object.
24. The method according to claim 19, further comprising an act of obtaining audio data regarding the object.
25. The method according to claim 24, further comprising an act of electronically transmitting the audio data to the second location.
26. The method according to claim 19, further comprising an act of obtaining an error code regarding the object.
27. The method according to claim 26, further comprising an act of electronically transmitting the error code to the second location.
28. The method according to claim 20, wherein the act of annotating the image with data comprises an act of inputting at least one of text and voice.
29. The method according to claim 19, wherein the act of transmitting the image comprises an act of transmitting the image via a wireless connection.
30. The method according to claim 19, wherein the act of transmitting the image comprises an act of transmitting the image via the Internet.
31. The method according to claim 19, wherein the act of performing an act on the object comprises an act of performing maintenance on the object.
32. The method according to claim 19, wherein the act of imaging the object comprises an act of imaging the object with a still image.
33. The method according to claim 19, wherein the act of imaging the object comprises imaging the object with streaming video.
34. The method according to claim 19, wherein the act of imaging the object comprises an act of imaging a component of an aircraft.
35. The method according to claim 19, wherein the act of imaging the object comprises an act of imaging a component of an aircraft and wherein the act of transmitting the image comprises an act of transmitting the image to a manufacturer of the aircraft or a manufacturer of the component.
36. The method according to claim 19, wherein the act of imaging the object comprises an act of imaging a component of an aircraft and wherein the act of transmitting the image comprises an act of transmitting the image to an aircraft maintenance facility.
37. The method according to claim 19, wherein the act of transmitting instructions to the first location comprises an act of transmitting maintenance instructions to the first location.
38. An electronic inspection apparatus adapted to communicate with a camera to obtain an image of an object, the apparatus comprising: a casing; a computer disposed within the casing; a camera control unit disposed within the casing and coupled to the computer, the camera control unit being adapted to receive electronic images from the camera, reformat the electronic images into digital format and pass the digitally formatted images to the computer; and an input device, coupled to the computer, adapted to allow a user to input full text data relating to the image.
39. The apparatus according to claim 38, further comprising a display coupled to the computer and adapted to display at least the digitally formatted images.
40. The apparatus according to claim 38, further comprising a computer readable storage medium communicating with the computer, the storage medium adapted to store the image and related voice or text data regarding the object.
41. The apparatus according to claim 40, further comprising a computer readable storage medium communicating with the computer, the storage medium adapted to store additional maintenance information regarding the object.
42. The apparatus according to claim 38, further comprising a display coupled to the computer for displaying at least two images.
43. The apparatus according to claim 42, wherein the at least two images comprise a first image representative of a current condition of the object and a second image representative of an historical condition of the object.
44. The apparatus according to claim 43, wherein the historical condition of the object is an initial condition of the object.
45. The apparatus according to claim 38, further comprising a device to transmit at least the image to a remote location.
46. The apparatus according to claim 38, further comprising a storage medium coupled to the computer, the storage medium storing at least one of an image of the initial condition of the object, maintenance history of the object, diagnostic information regarding the object and instructional information regarding the object.
47. The apparatus according to claim 38, further comprising a microphone coupled to the computer and adapted to obtain audio data regarding the object.
48. The apparatus according to claim 38, further comprising an input device coupled to the computer and adapted to obtain an error code regarding the object.
49. The apparatus according to claim 38, further comprising a microphone coupled to the computer and adapted to allow a user to input voice data.
50. The apparatus according to claim 45, wherein the device is a wireless communication device.
51. The apparatus according to claim 38, wherein the camera control unit is adapted to receive at least one of still images and streaming video.
52. The apparatus according to claim 45, further comprising an Internet connection coupled to the computer for transmitting the image to the remote location.
53. The apparatus according to claim 38, in combination with the camera.
54. The apparatus according to claim 38, wherein the input device is a touch screen.
55. The apparatus according to claim 38, wherein the apparatus is a hand-held apparatus.
56. The apparatus according to claim 38, wherein the apparatus is cordless.
57. The apparatus according to claim 38, wherein the camera control unit receives images from the camera via a wireless connection.
58. The apparatus according to claim 38, wherein the camera control unit receives images from the camera via a hardwire connection.
59. The combination according to claim 53, wherein the apparatus controls at least one function of the camera.
60. The apparatus according to claim 39, wherein a first portion of the display displays the image and a second portion of the display displays input keys.
61. The combination according to claim 53, wherein the camera is NTSC compatible.
62. An electronic inspection apparatus adapted to communicate with a camera to obtain an image of an object, the apparatus comprising: a casing; a computer disposed within the casing; a camera control unit disposed within the casing and coupled to the computer, the camera control unit being adapted to receive electronic images from the camera, reformat the electronic images into digital format and pass the digitally formatted images to the computer; and a computer readable storage medium, coupled to the computer, having an executable code stored thereon that allows the computer to execute at least two processes in a multitask fashion.
63. The apparatus according to claim 62, further comprising a display coupled to the computer and adapted to display at least the digitally formatted images.
64. The apparatus according to claim 62, further comprising a storage medium communicating with the computer, the storage medium adapted to store the image and related voice or text data regarding the object.
65. The apparatus according to claim 64, wherein the storage medium is adapted to store additional maintenance information regarding the object.
66. The apparatus according to claim 62, further comprising a display coupled to the computer for displaying at least two images.
67. The apparatus according to claim 66, wherein the at least two images comprise a first image representative of a current condition of the object and a second image representative of an historical condition of the object.
68. The apparatus according to claim 67, wherein the historical condition of the object is an initial condition of the object.
69. The apparatus according to claim 62, further comprising a device coupled to the computer to transmit at least the image to a remote location.
70. The apparatus according to claim 62, further comprising a storage medium communicating with the computer for storing at least one of an image of the initial condition of the object, maintenance history of the object, diagnostic information regarding the object and instructional information regarding the object.
71. The apparatus according to claim 62, further comprising a microphone coupled to the computer and adapted to obtain audio data regarding the object.
72. The apparatus according to claim 62, further comprising an input device coupled to the computer and adapted to obtain an error code regarding the object.
73. The apparatus according to claim 62, further comprising a microphone coupled to the computer and adapted to allow a user to input voice data.
74. The apparatus according to claim 69, wherein the device is a wireless communication device.
75. The apparatus according to claim 62, wherein the camera control unit is adapted to receive at least one of a still image and streaming video.
76. The apparatus according to claim 62, further comprising an Internet connection coupled to the computer for transmitting the image to a remote location.
77. The apparatus according to claim 62, in combination with the camera.
78. The apparatus according to claim 62, further comprising a touch screen coupled to the computer and adapted to allow a user to input text data relating to the image.
79. The apparatus according to claim 62, wherein the apparatus is a hand-held apparatus.
80. The apparatus according to claim 62, wherein the apparatus is cordless.
81. The apparatus according to claim 62, wherein the camera control unit receives images from the camera via a wireless connection.
82. The apparatus according to claim 62, wherein the camera control unit receives images from the camera via a hardwire connection.
83. The combination according to claim 77, wherein the apparatus controls at least one function of the camera.
84. The apparatus according to claim 63, wherein a first portion of the display displays the image and a second portion of the display displays input keys.
85. The combination according to claim 77, wherein the camera is NTSC compatible.
86. The apparatus according to claim 62, wherein the executable code allows the computer to execute the at least two processes simultaneously.
87. An electronic inspection apparatus adapted to communicate with a camera for obtaining an image of an object, the apparatus comprising: a casing; a computer disposed within the casing; a control unit disposed within the casing and coupled to the computer, the control unit being adapted to communicate with the camera; and an input device coupled to the computer and the control unit, the input device adapted to receive an input command from a user, the control unit adapted to receive the command and signal at least portions of the camera to react as commanded.
88. The apparatus according to claim 87, further comprising a display coupled to the computer and adapted to display at least the image.
89. The apparatus according to claim 87, further comprising a storage medium communicating with the computer that is adapted to store the image and related voice or text data regarding the object.
90. The apparatus according to claim 89, further comprising a storage medium communicating with the computer that is adapted to store additional maintenance information regarding the object.
91. The apparatus according to claim 87, further comprising a display coupled to the computer for displaying at least two images.
92. The apparatus according to claim 91, wherein the at least two images comprise a first image representative of a current condition of the object and a second image representative of an historical condition of the object.
93. The apparatus according to claim 92, wherein the historical condition of the object is an initial condition of the object.
94. The apparatus according to claim 87, further comprising a device coupled to the computer to transmit at least the image to a remote location.
95. The apparatus according to claim 87, further comprising a storage medium communicating with the computer for storing at least one of an image of the initial condition of the object, maintenance history of the object, diagnostic information regarding the object and instructional information regarding the object.
96. The apparatus according to claim 87, further comprising a microphone coupled to the computer and adapted to obtain audio data regarding the object.
97. The apparatus according to claim 87, further comprising an input device coupled to the computer and adapted to obtain an error code regarding the object.
98. The apparatus according to claim 87, further comprising a microphone coupled to the computer and adapted to allow a user to input voice data.
99. The apparatus according to claim 94, wherein the device is a wireless communication device.
100. The apparatus according to claim 87, further comprising a camera control unit coupled to the computer, the camera control unit adapted to receive at least one of a still image and streaming video.
101. The apparatus according to claim 87, further comprising an Internet connection coupled to the computer for transmitting the image to a remote location.
102. The apparatus according to claim 87, in combination with the camera.
103. The apparatus according to claim 87, further comprising a touch screen coupled to the computer and adapted to allow a user to input the commands.
104. The apparatus according to claim 87, further comprising a touch screen coupled to the computer and adapted to allow a user to input text data relating to the image.
105. The apparatus according to claim 87, wherein the apparatus is a hand-held apparatus.
106. The apparatus according to claim 87, wherein the apparatus is cordless.
107. The apparatus according to claim 87, further comprising a camera control unit coupled to the computer that is adapted to receive images from the camera via a wireless connection.
108. The apparatus according to claim 87, further comprising a camera control unit coupled to the computer that is adapted to receive images from the camera via a hardwire connection.
109. The apparatus according to claim 98, wherein a first portion of the display displays the image and a second portion of the display displays input keys.
110. The combination according to claim 102, wherein the camera is NTSC compatible.
111. The apparatus according to claim 87, wherein the control unit is adapted to control at least one of a viewing axis of the camera, a zoom position of the camera, and the focus of the camera.
112. An aircraft inspection system comprising: a camera adapted to view a component of the aircraft; and a portable electronic apparatus communicating with the camera, the apparatus comprising: a casing; a computer disposed within the casing; a camera control unit coupled to the computer and disposed within the casing, the camera control unit adapted to receive an image from the camera and pass the image to the computer; a display coupled to the computer and adapted to display the image; an input device coupled to the computer and adapted to allow a user to input maintenance data relating to the component; and a storage medium communicating with the computer, the storage medium adapted to store the image and related data.
113. The system according to claim 112, further comprising a device coupled to the computer to transmit the image and maintenance data to a remote location.
114. The system according to claim 112, further comprising an Internet connection coupled to the computer for transmitting the image to a remote location.
115. The system according to claim 112, wherein a first portion of the display displays the image and a second portion of the display displays input keys.
116. An electronic maintenance apparatus adapted to communicate with a camera to obtain an image of an object, the apparatus comprising: a casing; a computer disposed within the casing; and a storage medium communicating with the computer, the storage medium including maintenance information regarding the object being imaged.
117. The apparatus according to claim 116, wherein the maintenance information includes at least one of the image of the object, related voice or text data regarding the object, an image of the initial condition of the object, maintenance history of the object, diagnostic information regarding the object and instructional information regarding the object.
118. The apparatus according to claim 117, further comprising an input device coupled to the computer and adapted to allow a user to input text data relating to the image.
119. The apparatus according to claim 116, further comprising a display coupled to the computer and adapted to display at least a portion of the maintenance information.
120. The apparatus according to claim 116, further comprising an Internet connection coupled to the computer for transmitting at least a portion of the maintenance information to a remote location.
121. The apparatus according to claim 116, in combination with the camera.
122. The apparatus according to claim 116, wherein the apparatus is a hand-held apparatus.
PCT/US2001/028587 2000-09-11 2001-09-12 System and method for obtaining and utilizing maintenance information WO2002023403A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2001289056A AU2001289056A1 (en) 2000-09-11 2001-09-12 System and method for obtaining and utilizing maintenance information
EP01968842A EP1332443A2 (en) 2000-09-11 2001-09-12 System and method for obtaining and utilizing maintenance information

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US23191300P 2000-09-11 2000-09-11
US60/231,913 2000-09-11

Publications (2)

Publication Number Publication Date
WO2002023403A2 true WO2002023403A2 (en) 2002-03-21
WO2002023403A3 WO2002023403A3 (en) 2003-03-13

Family

ID=22871121

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/028587 WO2002023403A2 (en) 2000-09-11 2001-09-12 System and method for obtaining and utilizing maintenance information

Country Status (4)

Country Link
US (2) US6529620B2 (en)
EP (1) EP1332443A2 (en)
AU (1) AU2001289056A1 (en)
WO (1) WO2002023403A2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2394808A (en) * 2002-11-01 2004-05-05 Canon Europa Nv E-Maintenance System
WO2004066606A2 (en) * 2003-01-24 2004-08-05 Jarvis Facilities Ltd Work site monitoring
GB2417091A (en) * 2004-08-03 2006-02-15 Advanced Analysis And Integrat Aircraft test and measuring instruments
EP2388742A3 (en) * 2004-11-05 2012-03-28 Hitachi Ltd. Remote maintenance system, monitoring center computer used for the same, monitoring system and method of communication for maintenance

Families Citing this family (171)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7337389B1 (en) 1999-12-07 2008-02-26 Microsoft Corporation System and method for annotating an electronic document independently of its content
EP1332443A2 (en) * 2000-09-11 2003-08-06 Pinotage, LLC System and method for obtaining and utilizing maintenance information
US7765082B2 (en) * 2000-09-11 2010-07-27 Axiam, Incorporated System for optimal alignment of a shaft of a gas turbine
US7565257B2 (en) * 2000-09-11 2009-07-21 Axiam, Incorporated System for optimal alignment of a bearing seal on a shaft of a gas turbine
US6898547B1 (en) * 2000-09-11 2005-05-24 Axiam, Incorporated Rotor assembly system and method
US20030215128A1 (en) * 2001-09-12 2003-11-20 Pinotage Llc System and method for obtaining and utilizing maintenance information
US7050041B1 (en) * 2000-10-30 2006-05-23 Hewlett-Packard Development Company, L.P. Pointing device with a cable storage winding mechanism
US20020087319A1 (en) * 2001-01-04 2002-07-04 Stephenson Marc C. Portable electronic voice recognition device capable of executing various voice activated commands and calculations associated with aircraft operation by means of synthesized voice response
US8564417B2 (en) 2001-01-15 2013-10-22 Ron Craik System and method for storing and retrieving equipment inspection and maintenance data
US8198986B2 (en) * 2001-11-13 2012-06-12 Ron Craik System and method for storing and retrieving equipment inspection and maintenance data
US7076532B2 (en) * 2001-01-15 2006-07-11 Ron Craik System and method for storing and retrieving equipment inspection and maintenance data
US6574537B2 (en) * 2001-02-05 2003-06-03 The Boeing Company Diagnostic system and method
US7350159B2 (en) * 2001-05-08 2008-03-25 Snap-On Incorporated Integrated diagnostic system
US6951536B2 (en) * 2001-07-30 2005-10-04 Olympus Corporation Capsule-type medical device and medical system
US10185455B2 (en) 2012-10-04 2019-01-22 Zonar Systems, Inc. Mobile computing device for fleet telematics
US8810385B2 (en) 2001-09-11 2014-08-19 Zonar Systems, Inc. System and method to improve the efficiency of vehicle inspections by enabling remote actuation of vehicle components
US20150170521A1 (en) 2001-09-11 2015-06-18 Zonar Systems, Inc. System and method to enhance the utility of vehicle inspection records by including route identification data in each vehicle inspection record
US7557696B2 (en) 2001-09-11 2009-07-07 Zonar Systems, Inc. System and process to record inspection compliance data
US8400296B2 (en) 2001-09-11 2013-03-19 Zonar Systems, Inc. Method and apparatus to automate data collection during a mandatory inspection
US9563869B2 (en) 2010-09-14 2017-02-07 Zonar Systems, Inc. Automatic incorporation of vehicle data into documents captured at a vehicle using a mobile computing device
US20110068954A1 (en) 2006-06-20 2011-03-24 Zonar Systems, Inc. Method and apparatus to collect object identification data during operation of a vehicle and analysis of such data
US8972179B2 (en) 2006-06-20 2015-03-03 Brett Brinton Method and apparatus to analyze GPS data to determine if a vehicle has adhered to a predetermined route
US6671646B2 (en) * 2001-09-11 2003-12-30 Zonar Compliance Systems, Llc System and process to ensure performance of mandated safety and maintenance inspections
US11341853B2 (en) 2001-09-11 2022-05-24 Zonar Systems, Inc. System and method to enhance the utility of vehicle inspection records by including route identification data in each vehicle inspection record
DE10153151A1 (en) * 2001-10-27 2003-05-15 Airbus Gmbh Diagnostic system and diagnostic procedures to support aircraft maintenance
US20040206818A1 (en) * 2001-12-03 2004-10-21 Loda David C. Engine-mounted microserver
US8082317B2 (en) * 2002-02-26 2011-12-20 United Technologies Corporation Remote tablet-based internet inspection system
WO2003077073A2 (en) * 2002-03-08 2003-09-18 Fleettrakker, L.L.C. Equipment tracking system and method
US6885921B1 (en) * 2002-05-09 2005-04-26 Grace H. Farmer Method and apparatus for managing aircraft maintenance records
US20140207514A1 (en) * 2013-01-22 2014-07-24 General Electric Company Inspection data provision
US6925357B2 (en) * 2002-07-25 2005-08-02 Intouch Health, Inc. Medical tele-robotic system
US20040162637A1 (en) 2002-07-25 2004-08-19 Yulun Wang Medical tele-robotic system with a master remote station with an arbitrator
JP4323149B2 (en) * 2002-09-30 2009-09-02 オリンパス株式会社 Electric bending endoscope
AU2003284970A1 (en) * 2002-10-25 2004-05-13 J. Bruce Cantrell Jr. Digital diagnostic video system for manufacturing and industrial process
DE20220652U1 (en) * 2002-11-05 2004-04-22 Quiss Gmbh Device for recognizing a structure to be applied to a substrate
AU2003286856A1 (en) * 2002-11-07 2004-06-03 Snap-On Technologies, Inc. Vehicle data stream pause on data trigger value
US6751536B1 (en) * 2002-12-04 2004-06-15 The Boeing Company Diagnostic system and method for enabling multistage decision optimization for aircraft preflight dispatch
EP1618736A2 (en) * 2003-01-29 2006-01-25 Everest-VIT, Inc. Remote video inspection system
US6842713B1 (en) * 2003-02-24 2005-01-11 The United States Of America As Represented By The Secretary Of The Navy Rapid diagnostic multi data retrieval apparatus and method for using the same
US7668744B2 (en) * 2003-07-31 2010-02-23 The Boeing Company Method and system for conducting fleet operations
US20050043870A1 (en) * 2003-08-22 2005-02-24 General Electric Company Method and apparatus for recording and retrieving maintenance, operating and repair data for turbine engine components
US6940426B1 (en) * 2003-09-05 2005-09-06 Ridgeback Systems Llc Aircraft flight risk measuring system and method of operation
SE527004C2 (en) * 2003-11-26 2005-12-06 Kvaser Consultant Ab Arrangement of distributed for simulation in distributed control systems eg in vehicles
US7813836B2 (en) 2003-12-09 2010-10-12 Intouch Technologies, Inc. Protocol for a remotely controlled videoconferencing robot
SE528072C2 (en) * 2004-01-16 2006-08-29 Kvaser Consultant Ab Device, unit and arrangement of one or more distributed systems for collecting operation or fault information
US7844385B2 (en) * 2004-01-28 2010-11-30 United Technologies Corporation Microserver engine control card
US7167788B2 (en) * 2004-01-30 2007-01-23 United Technologies Corporation Dual-architecture microserver card
US20050223288A1 (en) * 2004-02-12 2005-10-06 Lockheed Martin Corporation Diagnostic fault detection and isolation
US20050240555A1 (en) * 2004-02-12 2005-10-27 Lockheed Martin Corporation Interactive electronic technical manual system integrated with the system under test
US7801702B2 (en) * 2004-02-12 2010-09-21 Lockheed Martin Corporation Enhanced diagnostic fault detection and isolation
US7584420B2 (en) * 2004-02-12 2009-09-01 Lockheed Martin Corporation Graphical authoring and editing of mark-up language sequences
JP4306510B2 (en) * 2004-03-29 2009-08-05 三菱自動車エンジニアリング株式会社 Vehicle inspection management system
JP4270017B2 (en) * 2004-04-15 2009-05-27 三菱自動車工業株式会社 Vehicle inspection management system
US8077963B2 (en) 2004-07-13 2011-12-13 Yulun Wang Mobile robot with a head-based movement mapping scheme
US7617029B2 (en) * 2004-07-19 2009-11-10 United Technologies Corporation System and method for fault code driven maintenance system
US20060120181A1 (en) * 2004-10-05 2006-06-08 Lockheed Martin Corp. Fault detection and isolation with analysis of built-in-test results
US20060085692A1 (en) * 2004-10-06 2006-04-20 Lockheed Martin Corp. Bus fault detection and isolation
US20060132291A1 (en) * 2004-11-17 2006-06-22 Dourney Charles Jr Automated vehicle check-in inspection method and system with digital image archiving
US20080052281A1 (en) * 2006-08-23 2008-02-28 Lockheed Martin Corporation Database insertion and retrieval system and method
US7427025B2 (en) * 2005-07-08 2008-09-23 Lockheed Martin Corp. Automated postal voting system and method
WO2007033326A2 (en) * 2005-09-14 2007-03-22 Welch Allyn, Inc. Medical apparatus comprising and adaptive lens
US20070078618A1 (en) * 2005-09-30 2007-04-05 Honeywell International, Inc. Method and system for enabling automated data analysis of multiple commensurate nondestructive test measurements
US9198728B2 (en) 2005-09-30 2015-12-01 Intouch Technologies, Inc. Multi-camera mobile teleconferencing platform
US8027095B2 (en) * 2005-10-11 2011-09-27 Hand Held Products, Inc. Control systems for adaptive lens
US20070156496A1 (en) * 2005-12-02 2007-07-05 Avery Robert L Methods and systems for managing aircraft maintenance and material supply
US7769499B2 (en) * 2006-04-05 2010-08-03 Zonar Systems Inc. Generating a numerical ranking of driver performance based on a plurality of metrics
DE202006006268U1 (en) * 2006-04-12 2006-06-14 Branofilter Gmbh Device for detachable fastening of dust filter bag in dust evacuation equipment has flange part which is pluggable to adaptor plate radially outside of annular seal and is pivotally connected to adaptor plate
US8849679B2 (en) 2006-06-15 2014-09-30 Intouch Technologies, Inc. Remote controlled robot system that provides medical images
US9230437B2 (en) 2006-06-20 2016-01-05 Zonar Systems, Inc. Method and apparatus to encode fuel use data with GPS data and to analyze such data
US20130164713A1 (en) 2011-12-23 2013-06-27 Zonar Systems, Inc. Method and apparatus for gps based slope determination, real-time vehicle mass determination, and vehicle efficiency analysis
US10056008B1 (en) 2006-06-20 2018-08-21 Zonar Systems, Inc. Using telematics data including position data and vehicle analytics to train drivers to improve efficiency of vehicle use
US9412282B2 (en) 2011-12-24 2016-08-09 Zonar Systems, Inc. Using social networking to improve driver performance based on industry sharing of driver performance data
US20080046167A1 (en) * 2006-07-10 2008-02-21 Small Gregory J Methods and systems for providing a resource management view for airline operations
US20080010107A1 (en) * 2006-07-10 2008-01-10 Small Gregory J Methods and systems for providing a global view of airline operations
US7747382B2 (en) * 2006-07-10 2010-06-29 The Boeing Company Methods and systems for real-time enhanced situational awareness
US7539594B2 (en) * 2006-09-26 2009-05-26 Axiam, Incorporated Method and apparatus for geometric rotor stacking and balancing
US20080114507A1 (en) * 2006-11-10 2008-05-15 Ruth Robert S System and method for situational control of mobile platform maintenance and operation
US8027096B2 (en) * 2006-12-15 2011-09-27 Hand Held Products, Inc. Focus module and components with actuator polymer control
US7813047B2 (en) * 2006-12-15 2010-10-12 Hand Held Products, Inc. Apparatus and method comprising deformable lens element
US8645148B2 (en) * 2006-12-29 2014-02-04 The Boeing Company Methods and apparatus providing an E-enabled ground architecture
US7337058B1 (en) * 2007-02-12 2008-02-26 Honeywell International, Inc. Engine wear characterizing and quantifying method
US8396571B2 (en) * 2007-03-19 2013-03-12 United Technologies Corporation Process and system for multi-objective global optimization of maintenance schedules
US9160783B2 (en) 2007-05-09 2015-10-13 Intouch Technologies, Inc. Robot system that operates through a network firewall
US8824731B2 (en) * 2007-10-31 2014-09-02 The Boeing Company Image processing of apparatus condition
US8571747B2 (en) * 2007-12-06 2013-10-29 The Boeing Company System and method for managing aircraft maintenance
US10875182B2 (en) 2008-03-20 2020-12-29 Teladoc Health, Inc. Remote presence system mounted to operating room hardware
US8131509B2 (en) * 2008-03-23 2012-03-06 United Technologies Corporation Method of system design for failure detectability
US8179418B2 (en) 2008-04-14 2012-05-15 Intouch Technologies, Inc. Robotic based health care system
US8054182B2 (en) * 2008-04-16 2011-11-08 The Johns Hopkins University Remotely directed vehicle inspection method and apparatus
US8170241B2 (en) 2008-04-17 2012-05-01 Intouch Technologies, Inc. Mobile tele-presence system with a microphone system
US20090266150A1 (en) * 2008-04-23 2009-10-29 Ari Novis Sensor criticality determination process
US9193065B2 (en) 2008-07-10 2015-11-24 Intouch Technologies, Inc. Docking system for a tele-presence robot
US9842192B2 (en) 2008-07-11 2017-12-12 Intouch Technologies, Inc. Tele-presence robot system with multi-cast features
US20100048202A1 (en) * 2008-08-25 2010-02-25 Beacham Jr William H Method of communicating with an avionics box via text messaging
US8340819B2 (en) 2008-09-18 2012-12-25 Intouch Technologies, Inc. Mobile videoconferencing robot system with network adaptive driving
US8095265B2 (en) * 2008-10-06 2012-01-10 International Business Machines Corporation Recording, storing, and retrieving vehicle maintenance records
US8996165B2 (en) 2008-10-21 2015-03-31 Intouch Technologies, Inc. Telepresence robot with a camera boom
FR2938088B1 (en) * 2008-11-03 2010-11-12 Eurocopter France METHOD FOR SECURING FLIGHT DATA AND SYSTEM FOR CARRYING OUT SAID METHOD
US8463435B2 (en) 2008-11-25 2013-06-11 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
US9138891B2 (en) 2008-11-25 2015-09-22 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
JP2010140256A (en) * 2008-12-11 2010-06-24 Toshiba Corp Information processor and diagnostic result notification method
US8073294B2 (en) * 2008-12-29 2011-12-06 At&T Intellectual Property I, L.P. Remote optical fiber surveillance system and method
US9992227B2 (en) * 2009-01-07 2018-06-05 Ncr Corporation Secure remote maintenance and support system, method, network entity and computer program product
US8849680B2 (en) 2009-01-29 2014-09-30 Intouch Technologies, Inc. Documentation through a remote presence robot
US9157723B2 (en) 2009-01-30 2015-10-13 Axiam, Inc. Absolute diameter measurement arm
US8219353B2 (en) * 2009-01-30 2012-07-10 Axiam, Inc. Absolute diameter measurement arm
US8897920B2 (en) 2009-04-17 2014-11-25 Intouch Technologies, Inc. Tele-presence robot system with software modularity, projector and laser pointer
US8509963B1 (en) 2009-07-23 2013-08-13 Rockwell Collins, Inc. Remote management of aircraft computer systems
US11399153B2 (en) 2009-08-26 2022-07-26 Teladoc Health, Inc. Portable telepresence apparatus
US8384755B2 (en) 2009-08-26 2013-02-26 Intouch Technologies, Inc. Portable remote presence robot
EP2378468A1 (en) * 2009-11-10 2011-10-19 Airbus Operations GmbH Platform for aircraft maintenance services and asset management
US11154981B2 (en) 2010-02-04 2021-10-26 Teladoc Health, Inc. Robot user interface for telepresence robot system
US8670017B2 (en) 2010-03-04 2014-03-11 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
US10343283B2 (en) 2010-05-24 2019-07-09 Intouch Technologies, Inc. Telepresence robot system that can be accessed by a cellular phone
US10808882B2 (en) 2010-05-26 2020-10-20 Intouch Technologies, Inc. Tele-robotic system with a robot face placed on a chair
US10665040B2 (en) 2010-08-27 2020-05-26 Zonar Systems, Inc. Method and apparatus for remote vehicle diagnosis
US10600096B2 (en) 2010-11-30 2020-03-24 Zonar Systems, Inc. System and method for obtaining competitive pricing for vehicle services
TW201216913A (en) * 2010-10-22 2012-05-01 Three In One Ent Co Ltd An endoscope with acoustic wave detection and voiceprint comparison
US10706647B2 (en) 2010-12-02 2020-07-07 Zonar Systems, Inc. Method and apparatus for implementing a vehicle inspection waiver program
US10431020B2 (en) 2010-12-02 2019-10-01 Zonar Systems, Inc. Method and apparatus for implementing a vehicle inspection waiver program
US8914184B2 (en) 2012-04-01 2014-12-16 Zonar Systems, Inc. Method and apparatus for matching vehicle ECU programming to current vehicle operating conditions
US8736419B2 (en) 2010-12-02 2014-05-27 Zonar Systems Method and apparatus for implementing a vehicle inspection waiver program
US9527515B2 (en) 2011-12-23 2016-12-27 Zonar Systems, Inc. Vehicle performance based on analysis of drive data
US9264664B2 (en) 2010-12-03 2016-02-16 Intouch Technologies, Inc. Systems and methods for dynamic bandwidth allocation
US9272796B1 (en) 2011-01-11 2016-03-01 Chudy Group, LLC Automatic drug packaging machine and package-less verification system
US12093036B2 (en) 2011-01-21 2024-09-17 Teladoc Health, Inc. Telerobotic system with a dual application screen presentation
US9323250B2 (en) 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
CN104898652B (en) 2011-01-28 2018-03-13 英塔茨科技公司 Mutually exchanged with a moveable tele-robotic
US10769739B2 (en) 2011-04-25 2020-09-08 Intouch Technologies, Inc. Systems and methods for management of information among medical providers and facilities
US9098611B2 (en) 2012-11-26 2015-08-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US20140139616A1 (en) 2012-01-27 2014-05-22 Intouch Technologies, Inc. Enhanced Diagnostics for a Telepresence Robot
US20120308984A1 (en) * 2011-06-06 2012-12-06 Paramit Corporation Interface method and system for use with computer directed assembly and manufacturing
GB201109858D0 (en) 2011-06-13 2011-07-27 Lewis Terry Method and system for aircraft inspections
CN102343983A (en) * 2011-07-07 2012-02-08 中国国际航空股份有限公司 Airplane APU (Auxiliary Power Unit) performance detecting method
CN102320382A (en) * 2011-07-07 2012-01-18 中国国际航空股份有限公司 Aircraft performance detection method
US11401045B2 (en) 2011-08-29 2022-08-02 Aerovironment, Inc. Camera ball turret having high bandwidth data transmission to external image processor
US9288513B2 (en) 2011-08-29 2016-03-15 Aerovironment, Inc. System and method of high-resolution digital data image transmission
US8836751B2 (en) 2011-11-08 2014-09-16 Intouch Technologies, Inc. Tele-presence system with a user interface that displays different communication links
FR2987443B1 (en) * 2012-02-24 2014-03-07 Snecma DEVICE FOR DETECTING ANOMALIES BY ACOUSTIC ANALYSIS OF AN AIRCRAFT TURBOMACHINE
US8849475B1 (en) * 2012-04-04 2014-09-30 The Boeing Company Systems and method for managing sensors in a vehicle
US9251313B2 (en) 2012-04-11 2016-02-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US8902278B2 (en) 2012-04-11 2014-12-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US20130288210A1 (en) * 2012-04-30 2013-10-31 Andrew James Stewart Integrated maintenance management system
WO2013176758A1 (en) 2012-05-22 2013-11-28 Intouch Technologies, Inc. Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices
US9361021B2 (en) 2012-05-22 2016-06-07 Irobot Corporation Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US9002571B1 (en) * 2012-08-23 2015-04-07 Rockwell Collins, Inc. Automated preflight walk around tool
US9424696B2 (en) 2012-10-04 2016-08-23 Zonar Systems, Inc. Virtual trainer for in vehicle driver coaching and to collect metrics to improve driver performance
US8930068B1 (en) * 2013-07-15 2015-01-06 American Airlines, Inc. System and method for managing instances of damage within a transportation system
US9530057B2 (en) 2013-11-26 2016-12-27 Honeywell International Inc. Maintenance assistant system
US9558547B2 (en) 2014-01-09 2017-01-31 The Boeing Company System and method for determining whether an apparatus or an assembly process is acceptable
FR3019898B1 (en) * 2014-04-11 2018-06-01 Safran Aircraft Engines METHOD AND DEVICE FOR ENDOSCOPY OF A REMOTE AIRCRAFT ENGINE
US9911251B2 (en) * 2014-12-15 2018-03-06 Bosch Automotive Service Solutions Inc. Vehicle diagnostic system and method
US10139795B2 (en) * 2015-10-19 2018-11-27 The Boeing Company System and method for environmental control system diagnosis and prognosis
US9785919B2 (en) 2015-12-10 2017-10-10 General Electric Company Automatic classification of aircraft component distress
JP6493264B2 (en) * 2016-03-23 2019-04-03 横河電機株式会社 Maintenance information sharing apparatus, maintenance information sharing method, maintenance information sharing program, and recording medium
JP2018106654A (en) * 2016-12-28 2018-07-05 横河電機株式会社 Maintenance management device, maintenance management method, maintenance management program, and recording medium
US11862302B2 (en) 2017-04-24 2024-01-02 Teladoc Health, Inc. Automated transcription and documentation of tele-health encounters
FR3068098B1 (en) * 2017-06-26 2019-08-23 Safran Landing Systems METHOD FOR MEASURING BRAKE DISC WEAR OF AN AIRCRAFT
US10483007B2 (en) 2017-07-25 2019-11-19 Intouch Technologies, Inc. Modular telehealth cart with thermal imaging and touch screen user interface
US11636944B2 (en) 2017-08-25 2023-04-25 Teladoc Health, Inc. Connectivity infrastructure for a telehealth platform
US10650340B2 (en) * 2017-09-08 2020-05-12 Accenture Global Solutions Limited Tracking and/or analyzing facility-related activities
JP6917844B2 (en) * 2017-09-20 2021-08-11 株式会社東芝 Work support system, work support method and work support program
US10847048B2 (en) * 2018-02-23 2020-11-24 Frontis Corp. Server, method and wearable device for supporting maintenance of military apparatus based on augmented reality using correlation rule mining
US10617299B2 (en) 2018-04-27 2020-04-14 Intouch Technologies, Inc. Telehealth cart that supports a removable tablet with seamless audio/video switching
FR3087755B1 (en) * 2018-10-30 2022-03-04 Safran AIRCRAFT STRUCTURE DEFECT MONITORING METHOD AND SYSTEM
CN111626147A (en) * 2020-05-09 2020-09-04 西藏电建成勘院工程有限公司 Geotechnical engineering drilling information acquisition and processing method and system
CN112084430A (en) * 2020-08-21 2020-12-15 广州汽车集团股份有限公司 Dynamic geographical human information broadcasting device, system and method
WO2022167971A1 (en) * 2021-02-03 2022-08-11 Engifab Srl Device and method for inspecting an industrial vehicle
IT202100002309A1 (en) * 2021-02-03 2022-08-03 Kiwitron S R L DEVICE AND METHOD FOR INSPECTING AN INDUSTRIAL VEHICLE, FOR EXAMPLE A LIFT BASKET.
US11860060B2 (en) 2022-04-05 2024-01-02 Rtx Corporation Integrally bladed rotor analysis and repair systems and methods
US12037918B2 (en) 2022-04-05 2024-07-16 Rtx Corporation Systems and methods for parameterization of inspected bladed rotor analysis

Family Cites Families (129)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR1126036A (en) 1955-05-10 1956-11-13 Colposcope
US4644845A (en) 1972-05-18 1987-02-24 Garehime Jacob W Jr Surveillance and weapon system
US4046140A (en) 1972-06-02 1977-09-06 Born Grant R Cervix photographic method
CH573026A5 (en) 1974-06-11 1976-02-27 Kaiser Josef Ag Fahrzeugwerk
US4210133A (en) 1975-10-21 1980-07-01 Consejo Nacional De Ciencia Y Tecnologia Vaginal microscope
DE7833379U1 (en) 1978-11-10 1979-02-15 Storz, Karl, 7200 Tuttlingen
IL58599A0 (en) 1978-12-04 1980-02-29 United Technologies Corp Method and apparatus for inspecting stator components of gas turbine engines
US4759348A (en) 1981-09-28 1988-07-26 Cawood Charles David Endoscope assembly and surgical instrument for use therewith
US4575185A (en) 1983-08-01 1986-03-11 Combustion Engineering, Inc. System for a fiber optic cable for remote inspection of internal structure of a nuclear steam generator
US4611888A (en) 1983-10-17 1986-09-16 Mp Video, Inc. Coupler for surgical endoscope and video camera
JPH0221041Y2 (en) 1983-11-08 1990-06-07
US4627436A (en) 1984-03-01 1986-12-09 Innoventions Biomedical Inc. Angioplasty catheter and method for use thereof
JPH0646977B2 (en) 1984-06-09 1994-06-22 オリンパス光学工業株式会社 Measuring endoscope
US4573452A (en) 1984-07-12 1986-03-04 Greenberg I Melvin Surgical holder for a laparoscope or the like
JPS6150546A (en) 1984-08-20 1986-03-12 富士写真光機株式会社 Endoscope
JPH0644105B2 (en) 1985-01-14 1994-06-08 オリンパス光学工業株式会社 Endoscope
US4718417A (en) 1985-03-22 1988-01-12 Massachusetts Institute Of Technology Visible fluorescence spectral diagnostic for laser angiosurgery
US4816828A (en) 1986-03-27 1989-03-28 Feher Kornel J Aircraft damage assessment and surveillance system
DE3715417A1 (en) 1986-05-13 1987-11-19 Olympus Optical Co SEMICONDUCTOR IMAGE GENERATION DEVICE, AND ENDOSCOPE HERE EQUIPPED WITH IT
US4791479A (en) 1986-06-04 1988-12-13 Olympus Optical Co., Ltd. Color-image sensing apparatus
US4807025A (en) 1986-10-23 1989-02-21 Teruo Eino Electronic endoscope apparatus
US4738526A (en) 1986-11-21 1988-04-19 Autostudio Corporation Auto-portrait photo studio
JP2735101B2 (en) 1986-12-08 1998-04-02 オリンパス光学工業株式会社 Imaging device
JPS63197431A (en) 1987-02-10 1988-08-16 オリンパス光学工業株式会社 Image pickup apparatus for endoscope
US4736733A (en) 1987-02-25 1988-04-12 Medical Dynamics, Inc. Endoscope with removable eyepiece
US5016098A (en) 1987-03-05 1991-05-14 Fuji Optical Systems, Incorporated Electronic video dental camera
US5115307A (en) 1987-03-05 1992-05-19 Fuji Optical Systems Electronic video dental camera
JP2572394B2 (en) 1987-03-19 1997-01-16 オリンパス光学工業株式会社 Electronic endoscope
US4905082A (en) 1987-05-06 1990-02-27 Olympus Optical Co., Ltd. Rigid video endoscope having a detachable imaging unit
US4867138A (en) 1987-05-13 1989-09-19 Olympus Optical Co., Ltd. Rigid electronic endoscope
US4888639A (en) 1987-05-22 1989-12-19 Olympous Optical Co., Ltd. Endoscope apparatus having integrated disconnectable light transmitting and image signal transmitting cord
US4878113A (en) 1987-08-11 1989-10-31 Olympus Optical Co., Ltd. Endoscope apparatus
JPH0824668B2 (en) 1987-09-14 1996-03-13 オリンパス光学工業株式会社 Electronic endoscopic device
US4858001A (en) 1987-10-08 1989-08-15 High-Tech Medical Instrumentation, Inc. Modular endoscopic apparatus with image rotation
US5172225A (en) 1987-11-25 1992-12-15 Olympus Optical Co., Ltd. Endoscope system
US4893613A (en) 1987-11-25 1990-01-16 Hake Lawrence W Endoscope construction with means for controlling rigidity and curvature of flexible endoscope tube
US5021888A (en) 1987-12-18 1991-06-04 Kabushiki Kaisha Toshiba Miniaturized solid state imaging device
JPH0673517B2 (en) 1988-02-04 1994-09-21 オリンパス光学工業株式会社 Electronic endoscope system
US5111288A (en) 1988-03-02 1992-05-05 Diamond Electronics, Inc. Surveillance camera system
US4852131A (en) * 1988-05-13 1989-07-25 Advanced Research & Applications Corporation Computed tomography inspection of electronic devices
US5026368A (en) 1988-12-28 1991-06-25 Adair Edwin Lloyd Method for cervical videoscopy
US4905670A (en) 1988-12-28 1990-03-06 Adair Edwin Lloyd Apparatus for cervical videoscopy
US5143054A (en) 1988-12-28 1992-09-01 Adair Edwin Lloyd Cervical videoscope with detachable camera unit
JP2981556B2 (en) 1989-02-28 1999-11-22 旭光学工業株式会社 Endoscope tip
GB8904535D0 (en) 1989-02-28 1989-04-12 Barcrest Ltd Automatic picture taking machine
DE3921233A1 (en) 1989-06-28 1991-02-14 Storz Karl Gmbh & Co ENDOSCOPE WITH A VIDEO DEVICE AT THE DISTAL END
JPH0327204U (en) 1989-07-21 1991-03-19
US4979498A (en) 1989-10-30 1990-12-25 Machida Incorporated Video cervicoscope system
DE8914215U1 (en) 1989-11-29 1991-01-03 Effner GmbH, 1000 Berlin endoscope
US5196876A (en) 1989-11-20 1993-03-23 Thayer Donald O Photography booth and method
GB9001993D0 (en) 1990-01-29 1990-03-28 Toy Of The Year Toy Dreams Lim Photobooth
JPH0412727A (en) 1990-05-02 1992-01-17 Olympus Optical Co Ltd Endoscope
US5431645A (en) 1990-05-10 1995-07-11 Symbiosis Corporation Remotely activated endoscopic tools such as endoscopic biopsy forceps
US5164992A (en) 1990-11-01 1992-11-17 Massachusetts Institute Of Technology Face recognition system
JPH0817768B2 (en) 1990-11-06 1996-02-28 富士写真光機株式会社 Endoscope
US5193525A (en) 1990-11-30 1993-03-16 Vision Sciences Antiglare tip in a sheath for an endoscope
JP3041099B2 (en) 1991-02-01 2000-05-15 オリンパス光学工業株式会社 Electronic endoscope device
US5188093A (en) 1991-02-04 1993-02-23 Citation Medical Corporation Portable arthroscope with periscope optics
DE4105326A1 (en) 1991-02-21 1992-09-03 Wolf Gmbh Richard ENDOSCOPE WITH PROXIMALLY CONNECTABLE CAMERA
US5217453A (en) 1991-03-18 1993-06-08 Wilk Peter J Automated surgical system and apparatus
JP3063784B2 (en) 1991-03-26 2000-07-12 オリンパス光学工業株式会社 Endoscope device
JP3065702B2 (en) 1991-04-23 2000-07-17 オリンパス光学工業株式会社 Endoscope system
US5251613A (en) 1991-05-06 1993-10-12 Adair Edwin Lloyd Method of cervical videoscope with detachable camera
US5417210A (en) 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
DE4129961C2 (en) 1991-09-10 1996-02-15 Wolf Gmbh Richard Video endoscope with solid-state imaging device
US5188094A (en) 1991-09-30 1993-02-23 Adair Edwin Lloyd Heat sterilizable electronic video endoscope
JPH05199989A (en) 1991-10-25 1993-08-10 Asahi Optical Co Ltd Tip part of endoscope
US5347988A (en) 1992-05-13 1994-09-20 Linvatec Corporation Endoscope coupler with liquid interface
US5262815A (en) 1992-05-27 1993-11-16 Consumer Programs Incorporated Modular photobooth photography system
US5305121A (en) 1992-06-08 1994-04-19 Origin Medsystems, Inc. Stereoscopic endoscope system
US5609561A (en) 1992-06-09 1997-03-11 Olympus Optical Co., Ltd Electronic type endoscope in which image pickup unit is dismounted to execute disinfection/sterilization processing
CA2101040C (en) 1992-07-30 1998-08-04 Minori Takagi Video tape recorder with a monitor-equipped built-in camera
US5524180A (en) 1992-08-10 1996-06-04 Computer Motion, Inc. Automated endoscope system for optimal positioning
US5704892A (en) 1992-09-01 1998-01-06 Adair; Edwin L. Endoscope with reusable core and disposable sheath with passageways
US5402768A (en) 1992-09-01 1995-04-04 Adair; Edwin L. Endoscope with reusable core and disposable sheath with passageways
US5379756A (en) 1992-09-11 1995-01-10 Welch Allyn, Inc. Replaceable lens assembly for video laparoscope
US5381784A (en) 1992-09-30 1995-01-17 Adair; Edwin L. Stereoscopic endoscope
US5359992A (en) 1992-10-20 1994-11-01 Linvatec Corporation Endoscope coupler with magnetic focus control
US5334150A (en) 1992-11-17 1994-08-02 Kaali Steven G Visually directed trocar for laparoscopic surgical procedures and method of using same
US5383099A (en) 1992-12-14 1995-01-17 Peters; Larry D. Portable photography booth and improved light reflector assembly
US5418567A (en) 1993-01-29 1995-05-23 Bayport Controls, Inc. Surveillance camera system
US5408992A (en) 1993-11-05 1995-04-25 British Technology Group Usa Inc. Endoscopic device for intraoral use
ES2105936B1 (en) 1994-03-21 1998-06-01 I D Tec S L IMPROVEMENTS INTRODUCED IN INVENTION PATENT N. P-9400595/8 BY: BIOMETRIC PROCEDURE FOR SECURITY AND IDENTIFICATION AND CREDIT CARDS, VISAS, PASSPORTS AND FACIAL RECOGNITION.
US5598205A (en) 1994-04-22 1997-01-28 Olympus Optical Co., Ltd. Imaging apparatus
US5508735A (en) 1994-07-12 1996-04-16 Northeast Technical Service Co. Inc. Underdeck inspection device
JP3580869B2 (en) 1994-09-13 2004-10-27 オリンパス株式会社 Stereoscopic endoscope
US5792045A (en) 1994-10-03 1998-08-11 Adair; Edwin L. Sterile surgical coupler and drape
US5657245A (en) 1994-11-09 1997-08-12 Westinghouse Electric Corporation Component maintenance system
US5591192A (en) 1995-02-01 1997-01-07 Ethicon Endo-Surgery, Inc. Surgical penetration instrument including an imaging element
US5652849A (en) * 1995-03-16 1997-07-29 Regents Of The University Of Michigan Apparatus and method for remote control using a visual information stream
KR19980703120A (en) 1995-03-20 1998-10-15 조안나 티. 라우 Image Identification System and Method
JPH08263664A (en) 1995-03-22 1996-10-11 Honda Motor Co Ltd Artificial visual system and image recognizing method
US6182047B1 (en) 1995-06-02 2001-01-30 Software For Surgeons Medical information log system
US5828969A (en) 1995-06-22 1998-10-27 Canadian Digital Photo/Graphics Inc. Process for use with aircraft repairs
US6007484A (en) 1995-09-15 1999-12-28 Image Technologies Corporation Endoscope having elevation and azimuth control of camera
BR9607702A (en) 1995-09-15 1998-01-13 Robert Lee Thompson Surgical device / diagnostic imaging
US5891013A (en) 1996-02-07 1999-04-06 Pinotage, Llc System for single-puncture endoscopic surgery
US5846249A (en) 1996-02-07 1998-12-08 Pinotage, Llc Video gynecological examination apparatus
US5928137A (en) 1996-05-03 1999-07-27 Green; Philip S. System and method for endoscopic imaging and endosurgery
US5931877A (en) 1996-05-30 1999-08-03 Raytheon Company Advanced maintenance system for aircraft and military weapons
US5879289A (en) 1996-07-15 1999-03-09 Universal Technologies International, Inc. Hand-held portable endoscopic camera
JP3715718B2 (en) 1996-07-17 2005-11-16 キヤノン株式会社 Imaging device
DE19633286A1 (en) 1996-08-19 1998-02-26 Peter Bahr Supervision - and/or safety system for passenger aircraft
US6229904B1 (en) 1996-08-30 2001-05-08 American Alpha, Inc Automatic morphing photography booth
US5696995A (en) 1996-08-30 1997-12-09 Huang; Sming Automatic photography booth
US5986718A (en) 1996-09-19 1999-11-16 Video Magic, Inc. Photographic method using chroma-key and a photobooth employing the same
US6002740A (en) * 1996-10-04 1999-12-14 Wisconsin Alumni Research Foundation Method and apparatus for X-ray and extreme ultraviolet inspection of lithography masks and other objects
US5800344A (en) 1996-10-23 1998-09-01 Welch Allyn, Inc. Video laryngoscope
US5784651A (en) 1996-11-15 1998-07-21 Polaroid Corporation Photo booth with modular construction
US6330351B1 (en) * 1996-11-29 2001-12-11 Kabushiki Kaisha Yuyama Seisakusho Drug inspection device and drug packaging device
US5757419A (en) 1996-12-02 1998-05-26 Qureshi; Iqbal Inspection method and apparatus for tanks and the like
US5991429A (en) 1996-12-06 1999-11-23 Coffin; Jeffrey S. Facial recognition system for security access and identification
US6185337B1 (en) 1996-12-17 2001-02-06 Honda Giken Kogyo Kabushiki Kaisha System and method for image recognition
US5980450A (en) 1997-05-07 1999-11-09 Pinotage, Llc Coupling device for use in an imaging system
US6142876A (en) 1997-08-22 2000-11-07 Cumbers; Blake Player tracking and identification system
US6141482A (en) 1997-11-13 2000-10-31 Foto Fantasy, Inc. Method for saving, accessing and reprinting a digitized photographic image
US6113533A (en) 1997-12-10 2000-09-05 Transamerica Technologies International Endoscope video adapter with zoom
US5989182A (en) 1997-12-19 1999-11-23 Vista Medical Technologies, Inc. Device-steering shaft assembly and endoscope
US6038333A (en) * 1998-03-16 2000-03-14 Hewlett-Packard Company Person identifier and management system
US6301370B1 (en) 1998-04-13 2001-10-09 Eyematic Interfaces, Inc. Face recognition from video images
US6292575B1 (en) 1998-07-20 2001-09-18 Lau Technologies Real-time facial recognition and verification system
US6377699B1 (en) * 1998-11-25 2002-04-23 Iridian Technologies, Inc. Iris imaging telephone security module and method
US6266436B1 (en) * 1999-04-09 2001-07-24 Kimberly-Clark Worldwide, Inc. Process control using multiple detections
US6067486A (en) 1999-02-01 2000-05-23 General Electric Company Method and system for planning repair of an aircraft engine
US6246320B1 (en) * 1999-02-25 2001-06-12 David A. Monroe Ground link with on-board security surveillance system for aircraft and other commercial vehicles
US6574672B1 (en) * 1999-03-29 2003-06-03 Siemens Dematic Postal Automation, L.P. System, apparatus and method for providing a portable customizable maintenance support computer communications system
US6323761B1 (en) 2000-06-03 2001-11-27 Sam Mog Son Vehicular security access system
US6954657B2 (en) * 2000-06-30 2005-10-11 Texas Instruments Incorporated Wireless communication device having intelligent alerting system
EP1332443A2 (en) * 2000-09-11 2003-08-06 Pinotage, LLC System and method for obtaining and utilizing maintenance information

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HAGENIERS, OMER L.: "Inspection methodology and data structure for large-area NDI", Nondestructive Evaluation of Aging Aircraft, Airports, and Aerospace Hardware, Scottsdale, AZ, USA, 3-5 December 1996, Proceedings of SPIE - The International Society for Optical Engineering, vol. 2945, 1996, pages 152-159, XP008009821 *
KOMOROWSKI, J. P. ET AL.: "Synergy between advanced composites and new NDI methods", Advanced Performance Materials, Kluwer Academic Publishers, Dordrecht, Netherlands, vol. 5, no. 1-2, January 1998, pages 137-151, XP001132098 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2394808A (en) * 2002-11-01 2004-05-05 Canon Europa Nv E-Maintenance System
WO2004066606A2 (en) * 2003-01-24 2004-08-05 Jarvis Facilities Ltd Work site monitoring
WO2004066606A3 (en) * 2003-01-24 2005-03-24 Jarvis Facilities Ltd Work site monitoring
GB2415037A (en) * 2003-01-24 2005-12-14 Jarvis Rail Ltd Work site monitoring
GB2415037B (en) * 2003-01-24 2007-01-03 Jarvis Rail Ltd Work site monitoring
GB2417091A (en) * 2004-08-03 2006-02-15 Advanced Analysis And Integrat Aircraft test and measuring instruments
EP2388742A3 (en) * 2004-11-05 2012-03-28 Hitachi Ltd. Remote maintenance system, monitoring center computer used for the same, monitoring system and method of communication for maintenance
US8234095B2 (en) 2004-11-05 2012-07-31 Hitachi, Ltd. Remote maintenance system, monitoring center computer used for the same, monitoring system and method of communication for maintenance

Also Published As

Publication number Publication date
US20020122583A1 (en) 2002-09-05
US20020033946A1 (en) 2002-03-21
US6529620B2 (en) 2003-03-04
WO2002023403A3 (en) 2003-03-13
US7068301B2 (en) 2006-06-27
EP1332443A2 (en) 2003-08-06
AU2001289056A1 (en) 2002-03-26

Similar Documents

Publication Publication Date Title
US6529620B2 (en) System and method for obtaining and utilizing maintenance information
US20080204553A1 (en) System and method for obtaining and utilizing maintenance information
US7952641B2 (en) Sensor for imaging inside equipment
US20020110263A1 (en) System and method for obtaining and utilizing maintenance information
US6393431B1 (en) Compact imaging instrument system
US7684544B2 (en) Portable digital radiographic devices
US20080116093A1 (en) Apparatus for storing an insertion tube
US6990455B2 (en) Command and control using speech recognition for dental computer connected devices
CN110212451A Electric power AR intelligent patrol detection device
US20080122936A1 (en) System and method for imaging
EP2620099B1 (en) Digital slit lamp microscope system
US20100145146A1 (en) Endoscopic digital recording system with removable screen and storage device
CA2420139A1 (en) Method and system for transmitting digital media between remote locations
CA2252786C (en) Video camera system
KR20120008059A (en) Imaging system
US20070097324A1 (en) Image information display unit
US20090198990A1 (en) Accessory support system for remote inspection device
JP2004191911A (en) Endoscope control system
WO1999042030A1 (en) Compact imaging instrument system
EP4115797A1 (en) Image capture systems and methods for identifying abnormalities using multispectral imaging
US20190017970A1 (en) Apparatus and method for an inspection device
JP2004344390A (en) Image recording device
JP2002326772A (en) Crime prevention image managing system
US20040218089A1 (en) Ruggedized remote monitoring system and method
JP2001060109A (en) Maintenance system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2001968842

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWP Wipo information: published in national office

Ref document number: 2001968842

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Ref document number: 2001968842

Country of ref document: EP