
US6484049B1 - Fluoroscopic tracking and visualization system


Info

Publication number
US6484049B1
US6484049B1 (application US09/560,940 / US56094000A)
Authority
US
United States
Prior art keywords
image
fluoroscope
markers
tool
tracking
Prior art date
Legal status
Expired - Lifetime
Application number
US09/560,940
Inventor
Teresa Seeley
Faith Lin
Tina Kapur
Gene Gregerson
Current Assignee
GE Medical Systems Global Technology Co LLC
Stryker European Holdings I LLC
Original Assignee
GE Medical Systems Global Technology Co LLC
Priority date
Filing date
Publication date
Application filed by GE Medical Systems Global Technology Co LLC
Priority to US09/560,940
Priority to US09/560,608 (US6490475B1)
Assigned to VISUALIZATION TECHNOLOGY, INC. (assignment of assignors interest; see document for details). Assignors: GREGERSON, GENE; KAPUR, TINA; LIN, FAITH; SEELEY, TERESA
Priority to CA002407616A (CA2407616A1)
Priority to EP01930893A (EP1278458B1)
Priority to PCT/US2001/013734 (WO2001087136A2)
Priority to AU2001257384A (AU2001257384A1)
Priority to AT01930893T (ATE515976T1)
Priority to US10/298,149 (US6856826B2)
Publication of US6484049B1
Application granted
Assigned to GE MEDICAL SYSTEMS NAVIGATION AND VISUALIZATION, INC. (change of name; see document for details). Assignors: VISUALIZATION TECHNOLOGY, INC.
Assigned to VISUALIZATION TECHNOLOGY, INC. (merger; see document for details). Assignors: EMERALD MERGER CORPORATION, GENERAL ELECTRIC COMPANY
Assigned to OEC MEDICAL SYSTEMS, INC. (merger; see document for details). Assignors: GE MEDICAL SYSTEMS NAVIGATION & VISUALIZATION, INC.
Assigned to GE MEDICAL SYSTEMS GLOBAL TECHNOLOGY COMPANY, LLC (assignment of assignors interest; see document for details). Assignors: OEC MEDICAL SYSTEMS, INC.
Assigned to STRYKER EUROPEAN HOLDINGS I, LLC (assignment of assignors interest; see document for details). Assignors: GENERAL ELECTRIC COMPANY
Anticipated expiration
Legal status: Expired - Lifetime (current)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/061 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B5/062 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body, using magnetic field
    • A61B5/064 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body, using markers
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/12 Arrangements for detecting or locating foreign bodies
    • A61B6/44 Constructional features of apparatus for radiation diagnosis
    • A61B6/4429 Constructional features related to the mounting of source units and detector units
    • A61B6/4435 Constructional features in which the source unit and the detector unit are coupled by a rigid structure
    • A61B6/4441 Constructional features in which the rigid structure is a C-arm or U-arm
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices involving processing of medical diagnostic data
    • A61B6/5229 Devices combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5235 Devices combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
    • A61B6/54 Control of apparatus or devices for radiation diagnosis
    • A61B6/547 Control involving tracking of position of the device or parts of the device
    • A61B6/58 Testing, adjusting or calibrating thereof
    • A61B6/582 Calibration
    • A61B6/583 Calibration using calibration phantoms
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/367 Correlation creating a 3D dataset from 2D images using position information
    • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/16 Details of sensor housings or probes; Details of structural supports for sensors
    • A61B2562/17 Comprising radiolucent components

Definitions

  • the present invention relates to medical and surgical imaging, and in particular to intraoperative or perioperative imaging in which images are formed of a region of the patient's body and a surgical tool or instrument is applied thereto, and the images aid in an ongoing procedure. It is of special utility in surgical procedures such as brain surgery and arthroscopic procedures on the knee, wrist, shoulder or spine, as well as certain types of angiography, cardiac procedures, interventional radiology and biopsies in which x-ray images may be taken to display, correct the position of, or otherwise navigate a tool or instrument involved in the procedure.
  • these previously recorded diagnostic image sets themselves define a three dimensional rectilinear coordinate system, by virtue of their precision scan formation or the spatial mathematics of their reconstruction algorithms. However, it may be necessary to correlate the available fluoroscopic views and anatomical features visible from the surface or in fluoroscopic images with features in the 3-D diagnostic images and with the external coordinates of the tools being employed.
  • imageable fiducials which may for example be imaged in both fluoroscopic and MRI or CT images
  • systems can also operate to a large extent with simple optical tracking of the surgical tool, and may employ an initialization protocol wherein the surgeon touches or points at a number of bony prominences or other recognizable anatomic features in order to define the external coordinates in relation to the patient anatomy and to initiate software tracking of those features.
  • systems of this type operate with an image display which is positioned in the surgeon's field of view, and which displays a few panels such as a selected MRI image and several x-ray or fluoroscopic views taken from different angles.
  • the three-dimensional diagnostic images typically have a spatial resolution that is both rectilinear and accurate to within a very small tolerance, e.g., to within one millimeter or less.
  • the fluoroscopic views by contrast are distorted, and they are shadowgraphic in that they represent the density of all tissue through which the conical x-ray beam has passed.
  • the display visible to the surgeon may show an image of the surgical tool, biopsy instrument, pedicle screw, probe or the like projected onto a fluoroscopic image, so that the surgeon may visualize the orientation of the surgical instrument in relation to the imaged patient anatomy, while an appropriate reconstructed CT or MRI image, which may correspond to the tracked coordinates of the probe tip, is also displayed.
  • the various sets of coordinates may be defined by robotic mechanical links and encoders, or more usually, are defined by a fixed patient support, two or more receivers such as video cameras which may be fixed to the support, and a plurality of signaling elements attached to a guide or frame on the surgical instrument that enable the position and orientation of the tool with respect to the patient support and camera frame to be automatically determined by triangulation, so that various transformations between respective coordinates may be computed.
  • Three-dimensional tracking systems employing two video cameras and a plurality of emitters or other position signaling elements have long been commercially available and are readily adapted to such operating room systems.
  • Similar systems may also determine external position coordinates using commercially available acoustic ranging systems in which three or more acoustic emitters are actuated and their sounds detected at plural receivers to determine their relative distances from the detecting assemblies, and thus define by simple triangulation the position and orientation of the frames or supports on which the emitters are mounted.
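A minimal numerical sketch of the distance-based position fix described above: one emitter is located from its measured distances to receivers at known coordinates, and three or more such emitter fixes on a frame would give the frame's position and orientation. The function name, interface and solution method are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def locate_emitter(receivers, distances):
    """Estimate one emitter's position from measured distances to three
    or more receivers at known positions (simple multilateration)."""
    r = np.asarray(receivers, float)   # (N, 3) receiver coordinates
    d = np.asarray(distances, float)   # (N,) emitter-to-receiver distances
    # Subtracting the first receiver's range equation from the others
    # linearizes |x - r_i|^2 = d_i^2 into A x = b.
    A = 2.0 * (r[1:] - r[0])
    b = (d[0] ** 2 - d[1:] ** 2) + np.sum(r[1:] ** 2, axis=1) - np.sum(r[0] ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```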
  • tracked fiducials appear in the diagnostic images, it is possible to define a transformation between operating room coordinates and the coordinates of the image.
  • Intraoperative x-ray images taken by C-arm fluoroscopes alone have both a high degree of distortion and a low degree of repeatability, due largely to deformations of the basic source and camera assembly, and to intrinsic variability of positioning and image distortion properties of the camera.
  • within the intraoperative sterile field, such devices must be draped, which may impair the optical or acoustic signal paths of the signal elements they employ to track the patient, tool or camera.
  • Correlation of patient anatomy or intraoperative fluoroscopic images with precompiled 3-D diagnostic image data sets may also be complicated by intervening movement of the imaged structures, particularly soft tissue structures, between the times of original imaging and the intraoperative procedure.
  • transformations between three or more coordinate systems for two sets of images and the physical coordinates in the operating room may require a large number of registration points to provide an effective correlation.
  • in spinal tracking to position pedicle screws, it may be necessary to initialize the tracking assembly on ten or more points on a single vertebra to achieve suitable accuracy. In cases where a growing tumor or evolving condition actually changes the tissue dimension or position between imaging sessions, further confounding factors may appear.
  • the registration may alternatively be effected without ongoing reference to tracking images, by using a computer modeling procedure in which a tool tip is touched to and initialized on each of several bony prominences to establish their coordinates and disposition, after which movement of the spine as a whole is modeled by optically initially registering and then tracking the tool in relation to the position of those prominences, while mechanically modeling a virtual representation of the spine with a tracking element or frame attached to the spine.
  • Such a procedure dispenses with the time-consuming and computationally intensive correlation of different image sets from different sources, and, by substituting optical tracking of points, may eliminate or reduce the number of x-ray exposures required to effectively determine the tool position in relation to the patient anatomy with the required degree of precision.
  • an x-ray imaging machine of movable angulation such as a fluoroscope
  • a tracking system employs a tracking element affixed to each of the imaging machine and tool, and preferably to the patient as well, to provide respective position data for the tool, the fluoroscope and patient, while a fixed volume array of markers, which is also tracked, is imaged in each frame.
  • the array of markers is affixed to the detector assembly of the imaging machine, where a single tracking element determines position of the fluoroscope and entire array of markers.
  • the fluoroscope may itself also provide further shot-specific indexing or identification data of conventional type, such as time, settings or the like.
  • a processor then applies the position data from the tracking system, and operates on the imaged markers to produce a correct tool navigation image for surgical guidance.
  • the markers are preferably arranged in a known pattern of substantially non-shadowing point elements positioned in different planes. These may be rigidly spaced apart in a predefined configuration in an assembly attached to the fluoroscope, so that the physical position of each marker is known exactly in a fixed fluoroscope-based coordinate system, and the positions may, for example, be stored in a table.
  • a single tracking element may be affixed on the marker assembly, which may in turn be locked in a fixed position on the fluoroscope, so that the fluoroscope and marker positions are known in relation to the tool and the patient.
  • one or more separate arrays of markers may be independently positioned and each tracked by a separate tracking element.
  • the processor identifies a subset of the markers and recovers geometric camera calibration parameters from the imaged marker positions. These calibration parameters then allow accurate reference between the recorded image and the tool and patient coordinates measured by the trackers.
  • the processor may also receive patient identification data of a conventional type to display or record with the shot.
  • the processor computes the calibration as well as geometric distortion due to the imaging process, and converts the tracked or actual location of the tool to a distorted tool image position at which the display projects a representation of the tool onto the fluoroscopic image to guide the surgeon in tool navigation.
  • the processor identifies markers in the image, and employs the geometry of the identified markers to model the effective source and camera projection geometry each time a shot is taken, e.g., to effectively define its focus and imaging characteristics for each frame. These parameters are then used to compute the projection of the tool in the fluoroscope image.
  • the fluoroscope is operated to take a series of shots in progressively varying orientations and positions as the camera and source are moved about the patient. Accurate calibration for multiple images is then employed to allow three-dimensional reconstruction of the image data.
  • the processor applies a reconstruction operation or procedure, for example, back projection of the registered images to form a volume image data set, e.g., a three dimensional set of image density values of a tissue volume.
  • the initial set of fluoroscopic images may, for example, be acquired by taking a series of views rotating the fluoroscope in a fixed plane about a target region of tissue. A common center and coordinate axes are determined for the reconstructed volume, such that the volume image data set constructed from the images corresponds to the target region. Image planes are then directly constructed and displayed from this volume image data set.
  • the resultant fluoro-CT images are geometrically comparable to conventional diagnostic image sets of the imaged volume, and obviate the need for complex tracking and image correlation systems otherwise proposed or required for operating-room management and display of pre-operatively acquired volumetric data sets with intraoperative fluoro images.
  • this reconstructed fluoro-CT data set is then registered to or transformed to the image space coordinates of a preoperative PET, MRI or CT data set for simultaneous display of both sets of images.
  • the system of the present invention may be used simply for the purpose of intraoperatively registering preoperative 3D image data to the patient tissue.
  • a set of fluoro-CT image data is constructed as described above, and these are registered to preoperative 3D image data by mutual information, contour matching or other correlation procedure. This provides a direct registration of the preoperative data to tracking coordinates without requiring the surgeon to place and image fiducials, touch and enter skeletal or surface registration points, or perform invasive pre-operation image registration protocols.
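As an illustration of the mutual-information correlation mentioned above, the sketch below scores the statistical dependence between two equally sampled volumes; a registration loop (not shown) would resample the fluoro-CT reconstruction under candidate rigid transforms and maximize this score. The bin count and interface are assumptions.

```python
import numpy as np

def mutual_information(vol_a, vol_b, bins=32):
    """Mutual information between two image volumes of identical shape,
    e.g. the fluoro-CT reconstruction resampled into the preoperative
    volume's grid."""
    hist, _, _ = np.histogram2d(vol_a.ravel(), vol_b.ravel(), bins=bins)
    p_ab = hist / hist.sum()                 # joint intensity distribution
    p_a = p_ab.sum(axis=1, keepdims=True)    # marginals
    p_b = p_ab.sum(axis=0, keepdims=True)
    nz = p_ab > 0
    return float(np.sum(p_ab[nz] * np.log(p_ab[nz] / (p_a @ p_b)[nz])))
```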
  • the tracking elements of the tracking system may comprise various position-indicating elements or markers which operate optically, ultrasonically, electromagnetically or otherwise, and the tracking system itself may include hybrid software-mediated elements or steps wherein a pointer or tool of defined geometry is tracked as it touches fiducials or markers in order to enter or initialize position coordinates in a tracking system that operates by triangulating paths, angles or distances to various signal emitting or reflecting markers.
  • a hybrid tracking system may also be used, including one or more robotic elements which physically encode mechanical positions of linkages or supports as part of one or more of the tracking measurements being made.
  • the tracking system employs electromagnetic tracking elements such as shown in U.S. Pat. No.
  • a single tracking element may be affixed to each of the fluoroscope, the patient, and the surgical tool.
  • a tracking element employs a magnetic field element, such as one configured with three mutually orthogonal coils, that otherwise operates as a substantially point-origin field generator or field sensor.
  • the element may have a rigid or oriented housing, so that when attached to a rigid object, the tracked coordinates of the element yield all coordinates, with only a defined constant offset, of the object itself.
  • the element may be energized as a field generator, or sampled as a field sensor, to produce or detect a field modulated in phase, frequency or time so that some or all of the x-, y-, z-, roll-, pitch-, and yaw coordinates of each tracking element, and thus its associated object, are quickly and accurately determined.
  • a table of position correction factors or characteristics may be compiled for one or more of the tracking elements to correct for the effects of electromagnetic shunting or other forms of interference with the generator or receiver which may occur when positioned in a region near to the body of the fluoroscope. This allows a magnetic tracking element to be placed quite close to the imaging assembly or other conductive structure and achieve high position tracking accuracy or resolution.
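One plausible form for such a correction table is a regular grid of offset vectors sampled in the region near the imaging assembly, applied to raw tracker readings by interpolation. The sketch below is illustrative only; the table layout and argument names are assumptions, not the patent's implementation.

```python
import numpy as np

def corrected_position(raw_xyz, grid_origin, grid_spacing, correction_table):
    """Apply a precompiled position-correction table (nx, ny, nz, 3) of
    offsets on a regular grid to a raw magnetic tracker reading."""
    raw = np.asarray(raw_xyz, float)
    # Continuous index of the reading within the correction grid.
    f = (raw - np.asarray(grid_origin, float)) / np.asarray(grid_spacing, float)
    i0 = np.clip(np.floor(f).astype(int), 0,
                 np.array(correction_table.shape[:3]) - 2)
    t = f - i0
    # Trilinear interpolation of the stored offset vectors.
    c = np.zeros(3)
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((t[0] if dx else 1 - t[0]) *
                     (t[1] if dy else 1 - t[1]) *
                     (t[2] if dz else 1 - t[2]))
                c += w * correction_table[i0[0] + dx, i0[1] + dy, i0[2] + dz]
    return raw + c
```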
  • one or more tracking elements may be mounted directly on the fluoroscope and/or on calibration fixtures positioned close to the image detector of the fluoroscope to define camera and imaging parameters relative to another tracker which may move with the patient or with a tool.
  • Various alternative magnetic generating and sensing assemblies may be used for the tracking component, such as ones having a tetrahedrally-disposed generating element and a single sensing/receiving coil, or ones having a multipole generating assembly that defines a suitably detectable spatial field.
  • FIG. 1 illustrates a fluoroscopic image and tool navigation system in accordance with one embodiment of the present invention;
  • FIG. 1A illustrates camera imaging of a tissue region with the system of FIG. 1;
  • FIG. 2 illustrates representative navigation images of one embodiment of the system of FIG. 1;
  • FIG. 2A illustrates the display of fluoroscope orientation in a preferred implementation of the system of FIG. 1;
  • FIG. 3 shows details of one camera calibration sub-assembly useful in the embodiment of FIG. 1;
  • FIG. 3A shows another calibration sub-assembly of the invention;
  • FIG. 4 is a flow chart showing image processing and tool tracking in accordance with a first aspect of the invention;
  • FIG. 5 illustrates operation of a second aspect of the invention;
  • FIG. 6 illustrates a sheet fixture for use with the invention and having combined calibration and tracking elements;
  • FIG. 7 illustrates camera calibrations corresponding to the fluoroscope poses illustrated in FIG. 1A and used for the operation illustrated in FIG. 5;
  • FIG. 8 illustrates operation of the system to register preoperative images to a patient.
  • FIG. 1 illustrates elements of a basic embodiment of a system 10 in accordance with the present invention for use in an operating room environment.
  • the system 10 includes a fluoroscope 20 , a work station 30 having one or more displays 32 and a keyboard/mouse or other user interface 34 , and a plurality of tracking elements T 1 , T 2 , T 3 .
  • the fluoroscope 20 is illustrated as a C-arm fluoroscope in which an x-ray source 22 is mounted on a structural member or C-arm 26 opposite to an x-ray receiving and detecting unit, referred to herein as an imaging assembly 24 .
  • the C-arm moves about a patient for producing two dimensional projection images of the patient from different angles
  • the patient remains positioned between the source and the camera, and may, for example, be situated on a table or other support, although the patient may move.
  • the tracking elements are mounted such that one element T 1 is affixed to, incorporated in or otherwise secured against movement with respect to a surgical tool or probe 40 .
  • a second tracking unit T 2 is fixed on or in relation to the fluoroscope 20
  • a third tracking unit T 3 fixed on or in relation to the patient.
  • the surgical tool may be a rigid probe as shown in FIG.
  • the tracker T 1 is preferably a small, localized element positioned in or at the operative tip of the tool as shown by catheter tracker T 1 ′ in FIG. 1, to track coordinates of the tip within the body of the patient.
  • fluoroscopes typically operate with an x-ray source 22 positioned opposite the camera or image sensing assembly 24 . While in some systems, the X-ray source is fixed overhead, and the camera is located below a patient support, the discussion below will be illustrated with regard to the more complex case of a typical C-arm fluoroscope, in which the source and camera are connected by a structural member, the C-arm, that allows movement of the source and camera assembly about the patient so it may be positioned to produce x-ray views from different angles or perspectives.
  • the imaging beam generally diverges at an angle
  • the relative locations and orientations of the source and camera vary with position due to structural flexing and mechanical looseness
  • the position of both the source and the camera with respect to the patient and/or a tool which it is desired to track may also vary in different shots.
  • the imaging beam illustrated by B in FIG. 1 diverges from the source 22 in a generally truncated conical beam shape, and the C-arm 26 is movable along a generally arcuate path to position the source and camera for imaging from different directions. This generally involves positioning the camera assembly 24 as close as possible behind the relevant tissue or operating area of the patient, while the C-arm assembly is moved roughly about a targeted imaging center to the desired viewing angle.
  • the C-arm or other beam structure 26 may be a somewhat flexible structure, subject to bending, deflection or sagging as the source and camera move to different positions around the patient, and the C-arm may also have other forms of dimensional variation or looseness, such as drive gear backlash, compressible elastomeric suspension components or the like, which may contribute to variations and non-repeatability of the relative disposition and alignment of the source and camera with respect to each other, and with respect to the patient, as the assembly is moved to different positions.
  • the C-arm may also move eccentrically or translationally to allow better clearance of the patient support table.
  • the bending deflections of the C-arm assembly may vary the actual position of the source 22 by almost a centimeter or more with respect to the image detector, and displace it from a nominal position which may be indicated, for example, by an encoder present in the fluoroscope stand or C-arm positioning assembly. These variations may therefore be significant.
  • FIG. 1A illustrates the fluoroscope 20 in two different imaging positions, with a first position shown in solid line, and a second position in dashed line phantom.
  • a tissue volume V is imaged with a divergent beam from the above right, and a virtual beam origin or focal point at F, while the image from the second position catches a largely overlapping but partly distinct tissue volume with a divergent beam from the upper left, and a different focal point F′.
  • the distances from points F, F′ to the camera may be different, and the camera itself may shift and tilt with respect to the beam and its center axis, respectively.
  • the x-ray beam is generally aimed by its center ray, whose intersection with the imaging plane, referred to as the piercing point, may be visually estimated by aiming the assembly with a laser pointing beam affixed to the source.
  • the x-ray beam may be considered to have a virtual origin or focal point F at the apex of the cone beam.
  • the camera assembly 24 is positioned close to the patient, but is subject to constraints posed by the operating table, the nature of the surgical approach, and the necessary tools, staging, clamps and the like, so that imaging of a tissue volume somewhat off the beam center line, and at different distances along the beam, may occur.
  • flexing of the C-arm also changes the distance to the focal point F and this also may slightly vary the angular disposition of the beam to the camera, so this shifting geometry may affect the fluoroscope images.
  • the camera 24 may utilize an image sensing unit that itself introduces further distortions into the received distribution of image radiation.
  • the unit may involve a detector that employs a phosphor surface of generally curved contour to convert the x-ray image intensity distribution to a free electron distribution.
  • a curved phosphor screen is generally placed over an electron multiplier or image intensifier assembly that provides an enhanced output video signal, but may further introduce a form of electron optical distortion that depends upon the intensifier geometry and varies with the orientation of the camera assembly in the earth's magnetic field.
  • Other configurations of image detectors are also known or proposed, such as digital x-ray detectors or flat semiconductor arrays, which may have different imaging-end fidelity characteristics.
  • deflection or physical movement of the camera itself as well as electron/optical distortion from the camera geometry, image detector and variations due to gravitational, magnetic or electromagnetic fields can all enter the image reception and affect the projective geometry and other distortion of the final image produced by the assembly.
  • the equipment and procedure have two components, a first component provided by a tracking assembly which determines position of a fluoroscope calibration fixture relative to one or both of the tool and patient body, and a second component provided by a processor operating on each image that characterizes or models the geometry of the camera and performs all subsequent processing.
  • a calibration fixture that contains an array of markers, which is either tracked as a rigid unit or affixed to the camera, while the imaged position of the markers in each fluoroscope shot serves to characterize the imaging geometry so as to allow correction of imaged features at measured distances from the camera, and permit registration of successive images in different poses.
  • the surgical instrument display 40 ′ of FIGS. 1 and 2 is effected by determining tool position, focus and imaging axis, and rendering the instrument in conjunction with one or more of the three types of images mentioned above.
  • the processor determines an image distortion inverse transform and projects a distorted or transformed tool graphic or image on the fluoroscopic view.
  • the processor determines the camera geometry for each image and transforms the set of fluoroscopic images such that the screen coordinates of display 33 are similar or aligned with the operating coordinates as measured by tracking elements T2, T3. This calibration results in more accurate tool tracking and representation over time. As further discussed below, the image data of an imaging sequence for a region of tissue about a common origin may be back-projected or otherwise processed to define a three-dimensional stack of fluoro-CT images.
  • the invention thus allows a relatively inexpensive C-arm fluoroscope to achieve accuracy and registration to prepare CT images for tool guidance and reconstruction of arbitrary planes in the imaged volume.
  • the data processing and work station unit 30 illustrated in FIG. 1 may be laid out in a conventional fashion, with a display section in which, for example, a previously acquired CT or diagnostic image is displayed on one screen 32 while one or more intraoperative images 33, such as an A/P and a lateral fluoroscopic view, are displayed on another screen.
  • FIG. 2 schematically represents one such display.
  • the system may present an appearance common to many systems of the prior art, but, in a first aspect provides enhanced or corrected navigation guiding images, while in a second aspect may provide CT or other reconstructed images in display 32 formed directly from the fluoroscopic views.
  • the system may provide dynamic referencing between these reconstructed images and a set of preoperative 3D image data.
  • one fluoroscope image in display 33 may be taken with the beam disposed vertically to produce an A/P fluoroscopic image projected against a horizontal plane, while another may be taken with beam projected horizontally to take a lateral view projected in a vertical plane.
  • the image typically shows a plurality of differently shaded features, so that a patient's vertebra, for example, may appear as an irregular three-dimensional darkened region shadow-profiled in each of the views.
  • the tool representation for a navigation system may consist of a brightly-colored dot representing tip position and a line or vector showing orientation of the body of the tool approaching its tip. In the example shown, the probe projected image 40′ may extend directly over the imaged structure from the side in the A/P or top view, while when viewed in the vertical plane the perspective clearly reveals that the tip has not reached that feature but lies situated above it in space.
  • the display employs position data from the tracking assembly to display the fluoroscope's current angle of offset from the baseline AP and lateral views. Surgeons have generally become accustomed to operating with such images, and despite the fact that the fluoroscopic images are limited by being projection images rather than 3D images, their display of approximate position and orientation, in conjunction with the diagnostic image on panel 32 which may also have a tool point representation on it, enables the surgeon to navigate during the course of a procedure.
  • this display is further enhanced by employing position data from the tracking assembly to display the fluoroscope's current angle of offset from the baseline AP and lateral fluoroscope views. This may be done as shown in FIG. 2A, by marking the fluoroscope's tracked angle or viewing axis with a marker on a circle between the twelve o'clock and three o'clock positions representing the AP and lateral view orientations.
  • a tracking system tracks the surgical instrument 40 , and the system projects a representation 40 ′ of the tool on each of the images detected by the image detector 24 .
  • This representation, while appearing as a simple vector drawing of the tool, is displayed with its position and orientation determined in the processor by applying a projective transform and an inverting image distortion transformation to the actual tool coordinates determined by the tracking elements. Thus, it is displayed in “fluoroscope image space”, rather than displaying a simple tool glyph, or correcting the image to fit the operating room coordinates of the tool.
  • FIG. 3 illustrates one embodiment 50 of a suitable marker array, calibration fixture or standard ST for the practice of the invention.
  • the fixture may include several sheets 52 of radiolucent material, each holding an array of radiopaque point-like markers 54, such as stainless steel balls (hereafter simply referred to as BBs).
  • the BBs may be of different sizes in the different planes, or may be of the same size.
  • the illustrated calibration fixture 50 includes a releasable clamp assembly 51, with a camming clamp handle 51a, configured to attach directly on or over the face of the camera assembly.
  • a tracking element is associated with each of the tool, the patient and the fluoroscope.
  • Each tracking element is secured against movement with respect to the structure it is tracking, but advantageously, all three of those structures are free to move.
  • the fluoroscope may move freely about the patient, and both the patient and the tool may move within the operative field.
  • the tracking element associated with the fluoroscope is positioned on a calibration fixture 50 which is itself rigidly affixed to the camera of the fluoroscope as described above.
  • the calibration fixture may be removably attached in a precise position, and the tracking element T 2 may be held in a rigid oriented body affixed to the fixture 50 .
  • the tracking element T 2 may, for example, be a point-origin defining tracking element that identifies the spatial coordinates and orientation of its housing, hence, with a rigid coordinate transform, also specifies the position and orientation coordinates of the object to which it is attached.
  • the tracking element T 2 may with one measurement determine the positions of all markers in the calibration fixture, and the position and orientation of the fixture itself or the horizontal surface of the camera assembly.
  • the illustrated marker plates may each be manufactured by NC drilling of an array of holes in an acrylic, e.g., Lexan, and/or other polymer plate, with the BBs pressed into the holes, so that all marker coordinates are exactly known.
  • marker plates may be manufactured by circuit board microlithography techniques, to provide desired patterns of radiopaque markers, for example as metallization patterns, on one or more thin radiolucent films or sheets.
  • the calibration assembly, rather than employing separate sheets bearing the markers, may be fabricated as a single block 50 of a suitable radiolucent material, such as a structural foam polymer having a low density and high stiffness and strength. In that case, holes may be drilled to different depths and BB markers pressed in to defined depths Z1, Z2 . . . at specific locations to create the desired spatial array of markers in a solid foam calibration block.
  • One suitable material of this type is a structural foam of the type used in aircraft wings for lightweight structural rigidity. This material may also be employed in separate thin marker-holding sheets. In any case the selected polymer or foam, and the number and size of the markers, are configured to remain directly in the imaging beam of the fluoroscope device and be imaged in each shot, while the position of the fixture is tracked.
  • the fixture materials are selected to avoid introducing any significant level of x-ray absorption or x-ray scattering by the plates, sheets or block, and the size and number of markers are similarly chosen to avoid excessive shadowing of the overall image, while maintaining a sufficiently dense image level for their detectability, so that both the imaging source radiation level and the resulting image density scale remain comparable to currently desired operating levels.
  • the BBs are arranged in a pattern at one or more levels, with a different pattern at each level. Further, when more than one array at different depths is used, the patterns are positioned so that as the source/camera alignment changes, BBs of one pattern cast shadows substantially distinct from those of the other pattern(s).
  • the array of markers is imaged in each fluoroscope shot.
  • the image display system of the present invention operates by first identifying markers in the image. This is done in an automated procedure, for example, by a pipeline of grey level thresholding based on the x-ray absorption properties of the markers, followed by spatial clustering based on the shape and size of the markers.
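A minimal sketch of such a thresholding-and-clustering pipeline, assuming the BBs image darker than the surrounding field; the threshold value and blob-size limits are placeholders rather than values from the patent.

```python
import numpy as np
from scipy import ndimage

def detect_marker_candidates(image, threshold, min_px=4, max_px=200):
    """Find candidate BB shadows in a fluoroscopic frame by grey-level
    thresholding followed by spatial clustering. Returns (row, col)
    centroids of blobs whose pixel count is consistent with a BB image."""
    dark = image < threshold            # BBs absorb x-rays, so they image dark
    labels, n = ndimage.label(dark)     # cluster connected dark pixels
    centroids = []
    for idx in range(1, n + 1):
        blob = labels == idx
        if min_px <= blob.sum() <= max_px:
            centroids.append(ndimage.center_of_mass(blob))
    return np.array(centroids)
```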
  • each sheet has markers arranged in a particular pattern. The pattern of each sheet will be enlarged in the image by a scale that varies with the cone divergence and the distance of the marker sheet along the axis from the optical center (or x-ray source) to the detection surface.
  • the marker images will also be shifted radially away from the beam center axis due to the beam divergence.
  • the calibration fixture is positioned close to the image detection surface, and the markers lie in arrays distributed in planes placed substantially perpendicular to the optical axis and offset from the detection surface. In general, not all markers will be located in the image, due to shadowing of some of the markers or occlusion of a marker by another object of similar x-ray absorption response.
  • the candidate markers in the image are first identified using image processing and then matched with corresponding markers in the fixture.
  • One suitable protocol takes a candidate marker Pi in image coordinates, assumes it is, e.g., marker number Qj of sheet one, and then determines how many other candidate markers support this match, i.e., line up with the expected projections of the remaining markers of one array, e.g., in the pattern of sheet one.
  • the number of candidates matching the known template or pattern of sheet one is totaled, and is taken as the score of that marker.
  • This process is repeated to score each candidate marker in the image, and an identification scored above a threshold is taken as correct when it leads to the highest score for that candidate, and does not conflict with the identification of another high-scoring candidate.
  • Scoring of the match is done by using the observation that ratios of distances and angles between line segments on the same plane are invariant under perspective projection.
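The voting step can be sketched as follows, assuming a hypothesis (one candidate/fixture-marker pairing plus a provisional projection) has already been used to predict where the remaining fixture markers should appear; a full implementation such as Chang's algorithm also exploits the projective invariants noted above. Names and tolerance are illustrative.

```python
import numpy as np

def score_hypothesis(candidates_px, predicted_px, tol=3.0):
    """Count how many detected candidates fall within `tol` pixels of the
    expected projections of the fixture markers under one hypothesis.

    candidates_px : (M, 2) detected candidate centroids
    predicted_px  : (K, 2) expected image positions of the fixture markers
    """
    cand = np.asarray(candidates_px, float)
    if cand.size == 0:
        return 0
    score = 0
    for p in np.asarray(predicted_px, float):
        if np.linalg.norm(cand - p, axis=1).min() <= tol:
            score += 1          # one supporting candidate found for this marker
    return score
```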
  • the processor may proceed on a point-by-point basis, that is, an exhaustive matching process may be used to determine the correspondence between points.
  • the marker detection processor preferably employs an optimization algorithm such as the Powell, Fletcher or a simplex algorithm.
  • One particularly useful pattern matching algorithm is that published by Chang et al. in Pattern Recognition, Volume 30, No. 2, pp. 311-320, 1997. That algorithm is both fast and robust with respect to typically encountered fluoroscopic distortions.
  • the Chang alignment/identification algorithm may be accelerated relying upon the fact that the marker fixture itself has a known marker geometry.
  • the marker identification module may predict the expected positions in the image, and search for matches within a defined small neighborhood.
  • the image processor calibration module includes a pre-compiled table, for example, stored in non-volatile memory, indicating the coordinates of each marker of the pattern, and preferably includes tables of separation for each pair, and/or included angle for each triplet of markers, to implement fast identification.
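A sketch of how such pre-compiled tables might be generated from the exactly known fixture geometry; the dictionary layout is an assumed convenience, not the patent's storage format.

```python
import numpy as np
from itertools import combinations

def build_marker_tables(marker_xyz):
    """Precompile pairwise separations and the included angle at the middle
    marker of each triplet, for fast identification lookups."""
    pts = np.asarray(marker_xyz, float)
    separations = {
        (i, j): float(np.linalg.norm(pts[i] - pts[j]))
        for i, j in combinations(range(len(pts)), 2)
    }
    angles = {}
    for i, j, k in combinations(range(len(pts)), 3):
        u, v = pts[i] - pts[j], pts[k] - pts[j]
        cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        angles[(i, j, k)] = float(np.degrees(np.arccos(np.clip(cosang, -1, 1))))
    return separations, angles
```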
  • each such unit may be tracked by a separate tracking element.
  • the array of marker positions are determined in each fluoroscopic image frame from the tracking element T 2 and from the fixed relative position coordinates stored in the marker table.
  • the camera is next calibrated using the marker identification information of the previous steps.
  • the imaging carried out by the fluoroscope is modeled as a camera system in which the optical center is located at the x-ray source and the imaging plane is located a distance F (focal length) away from it inside the camera assembly.
  • the optical axis is the line through the x-ray source and perpendicular to the horizontal face of the camera. The intersection of the optical axis and the image plane is defined as the piercing point.
  • Certain imaging or distortion characteristics may also be measured by the array of marker images, which thus determines a corrective perspective transformation.
  • a suitable algorithm is that described by Roger Tsai in his article on 3-D camera calibration published in the IEEE Journal of Robotics and Automation , Volume RA-3, No. 4, August 1987, pp. 323-344.
  • This model determines radial distortion in addition to the camera parameters, using an algorithm that takes as input the matched marker and image locations, estimates of focal length, and information about the number of rows and columns in the projection image.
  • This algorithm is readily implemented with one or more planes of markers in the fixture 50 or 50 ′. When the fluoroscope is sufficiently rigid that focus does not vary, a single plane of markers may be used to define the camera parameters.
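For illustration, the sketch below estimates a 3x4 projection matrix from matched fixture-marker world coordinates and image positions using a plain direct linear transform. This is a simplified stand-in for the cited Tsai method, which additionally recovers radial distortion and uses focal-length and sensor-geometry estimates.

```python
import numpy as np

def dlt_projection_matrix(world_xyz, image_uv):
    """Estimate a 3x4 perspective projection matrix from six or more
    matched fixture markers (tracker-referenced 3D points) and their
    detected 2D image positions."""
    rows = []
    for (x, y, z), (u, v) in zip(np.asarray(world_xyz, float),
                                 np.asarray(image_uv, float)):
        rows.append([x, y, z, 1, 0, 0, 0, 0, -u * x, -u * y, -u * z, -u])
        rows.append([0, 0, 0, 0, x, y, z, 1, -v * x, -v * y, -v * z, -v])
    A = np.array(rows)
    _, _, vt = np.linalg.svd(A)
    P = vt[-1].reshape(3, 4)                 # null-space solution, up to scale
    return P / np.linalg.norm(P[2, :3])      # conventional normalization
```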
  • a pattern of markers may comprise a rectangular lattice, e.g., one marker every centimeter or half-centimeter in two orthogonal directions, or may occupy a non-periodic but known set of closely-spaced positions.
  • the calibration fixture may be constructed such that markers fill a peripheral band around the imaged tissue, to provide marker shadow images that lie outside the imaged area and do not obscure the tissue which is being imaged for display.
  • the markers are located in the imaged field, so that the imaging camera and distortion transforms they define closely fit and characterize the geometric imaging occurring in that area.
  • the image processor removes the marker shadow-images from the fluoroscope image frame before display on the console 30 (FIG. 1 ), and may interpolate or otherwise correct image values in the surrounding image.
  • the processor in one basic embodiment then integrates tracked tool position with the fluoroscope shot. That is, having tracked the position of tool 40 via tracking element T1, relative to the marker array 50 and tracking element T2, and having modeled the camera focus, optical axis and image plane relative to the position of the fixture 50, the system then synthesizes a projection image of the tool as it dynamically tracks movement of the tool, and displays that tool navigation image on the fluoro A/P and/or lateral view of screen 33 (FIG. 1).
  • the processor obtains the position of the front and back tips of the tool. These are fixed offsets from the coordinates of the tracking element T 1 associated with the tool.
  • the tracker may also determine tool orientation relative to the patient from position and orientation relative to the tracking element T 3 on the patient at the time of image capture. Tracked position coordinates are converted to be relative to the fixed tracking element on the camera, or so that all coordinates reference the image to which the camera model applies.
  • the camera calibration matrix is then applied to the front and back tip position coordinates of the tool to convert them to fluoroscope image space coordinates.
  • end point coordinates are converted to undistorted two-dimensional image coordinates (e.g., perspective coordinates) using the calculated focal length of the camera, which are then converted to distorted two-dimensional image coordinates using the lens distortion factor derived from the matrix of marker positions.
  • Corresponding pixel locations in the two-dimensional fluoroscope image are determined using the x-scale factor, the calculated origin of the image plane and scaling based on the number of pixels per millimeter in the camera image sensor and display.
  • the determined position is then integrated with the video display on the fluoroscope to show a graphical representation of the tool with its front tip location in image coordinates.
  • the tool is displayed as an instrument vector, a two-dimensional line on the fluoroscopic image with a red dot representing its tip.
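A condensed sketch of that projection chain, assuming a single radial distortion coefficient and an isotropic pixel scale as stand-ins for the calibrated distortion factor and the x-scale and pixel-pitch terms described above; the argument names are illustrative.

```python
import numpy as np

def tool_to_pixels(tip_xyz, P, piercing_px, kappa, px_per_mm):
    """Project a tracked tool tip into the (distorted) fluoroscope image.

    P           : 3x4 projection matrix from the per-shot camera model
    piercing_px : pixel coordinates of the piercing point
    kappa       : assumed single radial distortion coefficient
    px_per_mm   : assumed isotropic pixel scale of the image sensor/display
    """
    X = np.append(np.asarray(tip_xyz, float), 1.0)   # homogeneous tip position
    u, v, w = P @ X                                   # perspective projection
    und = np.array([u / w, v / w])                    # undistorted image-plane coords
    dist = und * (1.0 + kappa * np.dot(und, und))     # simple radial distortion
    return np.asarray(piercing_px, float) + px_per_mm * dist
```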
  • the tracking assembly may track tool movement relative to the patient, and a processor controls the tracking and determines from the position of the tool when it is necessary to redraw the integrated display using the above-described image distortion transformations to correctly situate the displayed tool in a position on a new image.
  • the process of camera calibration is a process of applying actual coordinates as determined by the tracking system and marker positions, and image coordinates as seen in the fluoroscopic marker images, to model a camera for the image.
  • applicant's provision of an array of marker points having known coordinates in each of several planes, together with tracking coordinates corresponding to the absolute position of those planes and modeling of the camera image plane with respect to these tracked positions obviates the need for lengthy initialization or correlation steps, and allows an image processor to simply identify the marker images and their positions in the image, model the camera to define focus, image plane and piercing point, and to effect image corrections with a few automated tracking measurements and transformations.
  • the fixture is preferably fixed close to the front surface of the image detector assembly, so the calibration fits the detected image closely.
  • the marker positions allow a simple computation of effective parameters to fully characterize the camera. This allows one to scale and correct positions of the image (for example a tool) when their coordinates are tracked or otherwise unknown.
  • the fluoroscope is operated to take a large number of fluoro images, with fixture tracking and camera modeling as described above, and a 3D CT image data set is reconstructed from the acquired data.
  • this data set can be acquired such that it is dimensionally accurate and useful for close surgical guidance, although parameters such as x-ray absorbance, corresponding, for example, to bone or tissue density, will be of lesser accuracy than those obtainable from a CT scanner, and should not be relied upon.
  • the fluoroscopic CT images so formed may be further correlated with preoperative MRI, PET or CT images to define a direct image coordinate transformation, using established techniques such as MI (mutual information) registration, edge or contour matching, or the like, between the fluoroscopic 3D data set of the present invention and the existing preoperative 3D image set.
  • Operation for forming a volume image data set for CT reconstruction proceeds as follows.
  • the fluoroscope is operated to obtain a dense set of fluoroscope images, for example, by rotating the fluoroscope approximately in a plane about the patient through 180° plus the angle of divergence of the cone beam, taking a shot every degree or less, so as to image a particular three-dimensional tissue volume of the patient in a large number of images.
  • pose information, given for example by the position and orientation measurement of the tracking element T2, is stored, and the marker detection/calibration module operates on each shot so that a correction factor and a perspective projection matrix are determined for each image, as described above, to model the camera focus, image plane and optical axis for that shot.
  • a coordinate system for the tissue volume for which reconstruction is desired is then computed, and the processor then applies filtered back projection or other reconstruction processing (such as lumigraphs or lambda-CT), with indexing provided by the relative disposition of each pose, to reconstruct a three-dimensional volume data image set in the intra-operative coordinate system for a region of tissue around the origin of the reconstruction coordinate system.
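The indexing step that the per-shot camera models make possible can be sketched as a simple accumulation loop; a practical reconstruction would first filter the projections and apply cone-beam weighting, so this is only an unfiltered illustration with assumed argument names.

```python
import numpy as np

def backproject(volume_shape, voxel_mm, origin, projections, proj_matrices):
    """Accumulate an unfiltered back projection of calibrated fluoro frames
    into a volume centered on the reconstruction origin (tracker space).

    volume_shape  : tuple (nz, ny, nx)
    proj_matrices : one 3x4 projection matrix per frame, from the camera model
    """
    vol = np.zeros(volume_shape)
    zi, yi, xi = np.indices(volume_shape)
    world = (np.stack([xi, yi, zi], axis=-1)
             - np.array(volume_shape)[::-1] / 2.0) * voxel_mm + np.asarray(origin)
    homog = np.concatenate([world, np.ones(volume_shape + (1,))], axis=-1)
    for img, P in zip(projections, proj_matrices):
        uvw = homog @ P.T
        w = uvw[..., 2]
        valid = w > 1e-9                       # voxel lies in front of the source
        u = np.zeros_like(w, dtype=int)
        v = np.zeros_like(w, dtype=int)
        u[valid] = np.round(uvw[..., 0][valid] / w[valid]).astype(int)
        v[valid] = np.round(uvw[..., 1][valid] / w[valid]).astype(int)
        ok = valid & (u >= 0) & (u < img.shape[1]) & (v >= 0) & (v < img.shape[0])
        vol[ok] += img[v[ok], u[ok]]           # accumulate the projected density
    return vol
```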
  • This 3-D image data set referenced to tracker coordinates readily allows CT reconstruction of desired planes within the image set, referenced to patient or tool position.
  • In order to integrate the tracking system with the fluoroscopic images, it is necessary to establish a coordinate system for the three-dimensional reconstructed volume. This entails defining the origin and the coordinate axes for that volume. Once such a coordinate system is defined in relation to all fluoro images, one can compute the back projection at voxels in a region referenced to the origin, in planes that are perpendicular to one of the coordinate axes. In the case of a spinal scan, for example, the desired CT planes will be planes perpendicular to an axis that approximates the long axis of the body.
  • Such a spinal data set is especially useful, since this view cannot be directly imaged by a fluoroscope, and it is a view that is critical for visually assessing alignment of pedicle screws. Applicant establishes this common coordinate system in a way that minimizes risk of: (a) backprojecting voxels where insufficient data exists in the projections or (b) being unable to define the relationship between the natural coordinate system of the patient and that of the reconstruction.
  • the camera tracking data may be used to fit a center. This is considered an advance over systems that require a coordinate system to be specified manually.
  • the tracking elements automatically detect coordinates of the marker array, tool and patient at the time each image is taken. Detection of the calibration fixture position allows camera modeling to provide the position of the optical center (F), optical axis and image plane, in tracker coordinates for each shot as described above.
  • the combination of tracked position and modeled camera information is used to define a coordinate system for the reconstruction, which is preferably computed by performing statistical and computational geometry analysis on the pose information recorded and derived for each of the fluoroscopic image frames.
  • the “projection plane” is the plane on which the image is formed through the operation of perspective projection.
  • the “optical center” or the “center of projection”, C is located at a distance F, the focal length of the optical system, from the projection plane. In the case of a fluoroscope, this is the actual location of the x-ray source; the source is positioned at the optical center of the imaging system.
  • the projection of a given point M in the world is computed as the intersection of the ray connecting M and the optical center C with the projection plane.
  • the “optical axis” of a fluoroscopic imaging system is the line that passes through its optical center (the x-ray source) and is normal to the projection plane.
  • the point at which the optical axis intersects the projection plane is known as the “principal point” or the “piercing point”.
  • a textbook such as “Three-Dimensional Computer Vision” by Olivier Faugeras, MIT Press, may be consulted for further background or illustration of basic concepts used here.
  • Applicant's approach to the problem of computing a coordinate origin for reconstruction assures that in this set of data, the origin of the 3D coordinate system lies at a point that is the center of the region that the surgeon is most interested in visualizing. That point is identified in a prototype system by computing a point that is closest to being centered in all of the acquired fluoroscopic images, and then taking that point as the origin of a coordinate system in which the reconstruction is performed.
  • FIG. 5 sets forth the steps of this processing.
  • Each configuration of the C-arm defines a coordinate system in which the origin, (0,0,0) is defined by the location of the x-ray source. The principal point is located at (0,0,F) where F is the focal length. That is, the optical axis, or axis of the imaging beam, is aligned along the third axis.
  • Such a situation is schematically illustrated in FIG. 7 for the two fluoroscope positions shown in FIG. 1A. If all the fluoroscope configurations are taken in the context of a common world-coordinate system, each of these configurations defines a unique optical axis.
  • ideally, the point in three-space where all these optical axes intersect would be visible and centered in all the projection images. In practice, however, (a) the optical axes of a given pair of poses are generally skew lines that do not intersect exactly, and (b) the intersection points obtained from different pairs of poses need not coincide.
  • to address situation (a), the processor incorporates a software condition check for skewness of lines: if two optical axes are skew, the processor defines their intersection point as a computed point that is halfway between the two lines. To address situation (b), the processor takes the mean coordinates of the pairwise skew-intersection points determined in the first step as its common center of projection. Thus the cluster of points defined by the pairs of axes determines a single point. This point is defined as the origin of the tissue region for which reconstruction is undertaken; a sketch of this computation appears after this list.
  • the axial planes of the reconstruction are to be parallel to the plane of motion of the x-ray source.
  • Applicant's presently preferred processing module computes the plane of motion of the x-ray source by fitting a least-squares solution to the poses of the x-ray source. Any two non-collinear vectors in this plane define a basis for this plane and serve as two of the axes for the coordinate system. The module also computes a normal to this plane to serve as the third coordinate axis.
  • the coordinate axis computation may be done by using eigen-analysis of the covariance matrix of the coordinates of the optical centers (x-ray source locations) and the principal points in the world-coordinate system. The eigenvectors are ordered by decreasing eigenvalue: the first two eigenvectors provide a basis for the axial plane of interest, and the third eigenvector provides the normal to this plane. This procedure thus provides all three coordinate axes for the three-dimensional reconstruction (a sketch appears after this list). The determination is fully automated, and requires only the tracker data and camera models determined by the processor when each shot is taken. Further background and details of implementation for applying the eigen-analysis technique to define coordinate axes may be found in reference texts, such as the 1984 textbook “Pattern Recognition” by J. Therrien.
  • the processor filters and back-projects the image data to form a volume image data set, from which CT planes may be reconstructed or retrieved in a conventional manner.
  • the back projection step may utilize fast or improved processes, such as the fast Feldkamp algorithm or another variant, or may be replaced by another suitable volume data reconstruction technique, such as the local or Lambda tomography method described by A. Louis and P. Maass in IEEE Trans. Med. Imag., pp. 764-769 (1993) and papers cited therein.
  • a simple set of automated tracking elements, combined with image processing operative on a fixed or tracked marker array, provides accurate tool tracking on fluoroscope images, or a set of geometrically accurate reconstructed or CT images from the shadowgraphic images of a C-arm or intraoperative fluoroscope.
  • the nature of the multi-point, marker-defined camera image model allows the processor to quickly register the images to a common coordinate system and back-project or otherwise reconstruct accurate volume image data. The determination of a camera parameter model for each shot proceeds quickly and allows accurate tool display for intraoperative tool navigation and dynamic tracking, without requiring rigid frames or robotic assemblies that can obstruct surgery, and without the necessity of matching to an MRI or PET database to achieve precision.
  • the models, transformations and fitting to a coordinate system proceed from the tracker position measurements of the marker fixture relative to the patient or tool, rather than from an extrinsic fixed frame, reducing potential sources of cumulative errors and simplifying the task of registering and transforming to common coordinates. Applicant is therefore able to precisely track and display the tool in real time, and to produce accurate fluoro-CT images using a C-arm fluoroscope.
  • the system employs tracking elements, each fixed with respect to one of a few movable objects.
  • these tracking elements may be affixed by belts, frames, supports, clips, handles or other securing or orienting structures known in the art.
  • applicant's preferred tracking element is a magnetic field tracking element, which may be oriented and affixed in a rigid housing that allows it to be secured to the structure to be tracked.
  • the calibration fixture has been described above as being preferably affixed to the image detector portion of the fluoroscope, where, illustratively one or several precision arrays of markers located along the imaging axis provide necessary data in the image itself to characterize the camera each time an image is taken.
  • This location, with the markers in a single fixture, provides a high level of accuracy in determining the desired camera parameters, and enables tracking to proceed without obstructing the surgical field.
  • while the constraint of positioning the calibration fixture between the target tissue and the detector may limit flexibility in positioning the image detector near the patient, this may be addressed in other embodiments by having all or a portion of the marker array assembly implemented with markers located on or in a radiographic support table (75, FIG. 6) or other structure on which the patient or the imaged tissue portion is supported.
  • the table or support itself, which is radiolucent, may have a thickness and structure that permit markers to be embedded at different depths.
  • it may be formed of a structural foam material as described above in regard to the marker fixture of FIG. 3 A.
  • the markers may be included in one or more sheets that fit within the x-ray sheet film tray of a conventional radiography table, or such marker sheets may be laminated to the bottom and/or top surfaces of the table.
  • the tracking element T2 may then be attached anywhere on the rigid structure of the table itself, with suitable offsets stored in a fixed memory element of the system.
  • the total angular range of the poses in which useful marker images will appear in the fluoroscope images may be restricted to somewhat under 180°.
  • the image plane will generally not be parallel to the marker arrays, so a different set of computations is utilized by the processor to characterize the camera position and geometry. However, these computations involve straightforward camera modeling, and may be accelerated by also tracking the image detector with an additional element T2′.
  • FIG. 6 shows elements of one such embodiment wherein a marker array 50″ is formed as a pattern of metallized dots 56, which may be formed lithographically on a printed-circuit type sheet. As indicated schematically in this Figure, the sheet may also bear one or more lithographically-formed conductive loops 58, configured as a field generating or field sensing loop, for defining one or more elements of a magnetic tracking assembly.
  • Three or more such patterned loops may be formed to constitute a basic electromagnetic generator or sensor that advantageously is precisely pre-aligned with respect to the coordinates of the markers 56 by virtue of its having been manufactured using a pattern lithography mask.
  • the magnetic circuit loops may define magnetic multipoles for establishing or sensing position-tracking electromagnetic fields, or may, for example, include one or more coils of a system of Helmholtz coils for establishing a gradient field in the region where tracking is to occur. These may operate in conjunction with other coils disposed elsewhere for defining the tracking field.
  • the implementation of magnetic tracking and radiographic marker elements on a sheet also allows plural sheets to be positioned and tracked separately for effecting the image-based processing of the present invention.
  • a fluoro-CT data set is constructed as described above, and the fluoro-3D data set is then registered or correlated to an existing MRI, CT or PET 3D data set to form a fused set of images. These are then displayed on the system console 30 (FIG. 1) to provide enhanced patient information during surgery.
  • the coordinates of the fluoro-CT images are known from the coordinates used in the reconstruction processing, while the correlation of the two different 3D image sets may proceed without reference to patient or other tracking coordinates, using any conventional 3D registration or correlation technique. This provides fast and effective fused image sets for surgical guidance or diagnostic evaluation.
  • the system need not produce detailed fluoro-CT images, or need not display those images.
  • the fluoro-CT images, or a lesser quality set of fluoro-CT images constructed from a faster (smaller) scan sequence of fluoro images, defined in tracker coordinates, may be produced and simply registered to a preoperative 3D data set in order to bring that preoperative image data set into the tracker coordinate system.
  • the system applies this registration, and proceeds thereafter by simply tracking the patient and the tool, and displaying the appropriate preoperative images for each tracked location as shown in FIG. 8 .
  • the system provides an automated registration system for the intraoperative display of preoperative MRI, PET or CT images, without requiring placement or imaging of fiducials, without requiring the surgeon to initialize or set up a plurality of reference points, without requiring the surgeon to cut down to or expose a fixed skeletal registration feature, and without requiring immobilization of the patient in a frame or support.
  • the intermediate fluoro-CT images are produced as part of an automated modeling and coordinatizing process, and both the production and the registration of these images to the preoperative data set may proceed entirely automatically in software, for example, registering by mutual information (MI), feature correlation or a similar process (a sketch of the MI measure appears after this list).
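As a concrete illustration of the origin computation referenced above, the following sketch (not taken from the patent; the numpy-based implementation and all function names are assumptions for illustration only) computes the closest-approach midpoint for each pair of tracked optical axes and averages those points to obtain the reconstruction origin:

```python
import numpy as np
from itertools import combinations

def skew_midpoint(p1, d1, p2, d2):
    """Midpoint of the common perpendicular of two (possibly skew) lines.

    Each line is given by a point p (x-ray source position) and a
    direction d (along the optical axis).  If the lines are nearly
    parallel, the midpoint of the two source points is returned instead.
    """
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    n = np.cross(d1, d2)
    denom = np.dot(n, n)
    if denom < 1e-12:                      # nearly parallel axes
        return 0.5 * (p1 + p2)
    t1 = np.dot(np.cross(p2 - p1, d2), n) / denom
    t2 = np.dot(np.cross(p2 - p1, d1), n) / denom
    c1 = p1 + t1 * d1                      # closest point on line 1
    c2 = p2 + t2 * d2                      # closest point on line 2
    return 0.5 * (c1 + c2)

def reconstruction_origin(sources, axes):
    """Mean of the pairwise skew-intersection points of all optical axes."""
    pts = [skew_midpoint(sources[i], axes[i], sources[j], axes[j])
           for i, j in combinations(range(len(sources)), 2)]
    return np.mean(pts, axis=0)
```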
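Similarly, the eigen-analysis used to choose the reconstruction axes can be sketched as a principal-component computation on the tracked x-ray source positions (optionally augmented with the principal points). This is an illustrative sketch under the same assumptions, not the patent's implementation:

```python
import numpy as np

def reconstruction_axes(points):
    """Derive three orthogonal coordinate axes from tracked positions.

    `points` is an (N, 3) array of x-ray source locations (optionally
    augmented with the principal points) in world/tracker coordinates.
    The two leading eigenvectors of the covariance matrix span the plane
    of C-arm motion (the axial plane of the reconstruction); the third
    eigenvector is the normal to that plane.
    """
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered / max(len(pts) - 1, 1)
    eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]           # re-order: decreasing
    e1, e2, e3 = (eigvecs[:, i] for i in order)
    return e1, e2, e3    # e1, e2 span the motion plane; e3 is its normal
```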
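The mutual-information measure mentioned above for registering the fluoro-CT volume to a preoperative study can be sketched as a joint-histogram computation. The search over candidate rigid transforms that would drive an actual registration is omitted, and the function below is a hypothetical illustration only:

```python
import numpy as np

def mutual_information(vol_a, vol_b, bins=64):
    """Mutual information between two equally-sized, overlapping image volumes.

    A registration search would repeatedly resample one volume under a
    candidate rigid transform and keep the transform that maximizes this
    score; that resampling/optimization loop is not shown here.
    """
    joint, _, _ = np.histogram2d(vol_a.ravel(), vol_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                                   # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```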


Abstract

A system employs a tracker and a set of substantially non-shadowing point markers, arranged in a fixed pattern or set in a fluoroscope calibration fixture that is imaged in each shot. The fixture is preferably affixed to the image detector of the fluoroscope, and tracking elements secured with respect to the fixture and at least one of a tool and the patient, provide respective position data irrespective of movement. A marker detection module identifies markers imaged in each shot, and a processor applies the known marker positions to model the projection geometry, e.g., camera axis and focus, for the shot and, together with the tracked tool position, forms a corrected tool navigation image. In one embodiment an inverting distortion correction converts the tracked or actual location of the tool and displays the tool on the fluoroscopic image to guide the surgeon in tool navigation. In another aspect of the invention, the fluoroscope takes a series of frames while rotating in a plane about the patient, and the camera models derived from the marker images in each frame are applied to define a common center and coordinate axes in the imaged tissue region to which all of the fluoroscope views may be registered. The processor then filters and back-projects the image data or otherwise forms a volume image data set corresponding to the region of tissue being imaged, and desired fluoro-CT planar images of the imaged patient volume are constructed from this data set. Planes may then be constructed and displayed without requiring complex tracking and image correlation systems previously needed for operating-room management of MRI, CT or PET study image data. Further, the fluoro-CT images thus constructed may be directly registered to preoperative MRI, CT or PET 3D image data, or may obviate the need for such preoperative imaging. Preferably, the tracker employs electromagnetic tracking elements, as shown for example in U.S. Pat. No. 5,967,980, to generate and/or detect electromagnetic field components unobstructed by the patient and intervening structures, and to determine coordinates directly referenced to the patient, the tool or the camera. The calibration fixture may be implemented with BBs in a radiolucent block of structural foam, and/or may be implemented by microlithographic techniques, in which case magnetic tracking elements may be simultaneously formed in registry with the markers on a sheet that mounts to the camera, is incorporated in a radiographic support table, or otherwise positioned to be imaged in each shot.

Description

BACKGROUND
The present invention relates to medical and surgical imaging, and in particular to intraoperative or perioperative imaging in which images are formed of a region of the patient's body and a surgical tool or instrument is applied thereto, and the images aid in an ongoing procedure. It is of a special utility in surgical procedures such as brain surgery and arthroscopic procedures on the knee, wrist, shoulder or spine, as well as certain types of angiography, cardiac procedures, interventional radiology and biopsies in which x-ray images may be taken to display, correct the position of, or otherwise navigate a tool or instrument involved in the procedure.
Several areas of surgery have required very precise planning and control for the placement of an elongated probe or other article in tissue or bone that is internal or difficult to view directly. In particular, for brain surgery, stereotactic frames to define the entry point, probe angle and probe depth are used to access a site in the brain, generally in conjunction with previously compiled three-dimensional diagnostic images such as MRI, PET or CT scan images which provide accurate tissue images. For placement of pedicle screws in the spine, where visual and fluoroscopic imaging directions cannot capture an axial view necessary to center the profile of an insertion path in bone, such systems have also been useful.
When used with existing CT, PET or MRI image sets, these previously recorded diagnostic image sets themselves define a three dimensional rectilinear coordinate system, by virtue of their precision scan formation or the spatial mathematics of their reconstruction algorithms. However, it may be necessary to correlate the available fluoroscopic views and anatomical features visible from the surface or in fluoroscopic images with features in the 3-D diagnostic images and with the external coordinates of the tools being employed. This is often done by providing implanted fiducials, and adding externally visible or trackable markers that may be imaged, and using a keyboard or mouse to identify fiducials in the various images, and thus identify common sets of coordinate registration points in the different images, that may also be trackable in an automated way by an external coordinate measurement device, such as a suitably programmed off-the-shelf optical tracking assembly. Instead of imageable fiducials, which may for example be imaged in both fluoroscopic and MRI or CT images, such systems can also operate to a large extent with simple optical tracking of the surgical tool, and may employ an initialization protocol wherein the surgeon touches or points at a number of bony prominences or other recognizable anatomic features in order to define the external coordinates in relation to the patient anatomy and to initiate software tracking of those features.
Generally, systems of this type operate with an image display which is positioned in the surgeon's field of view, and which displays a few panels such as a selected MRI image and several x-ray or fluoroscopic views taken from different angles. The three-dimensional diagnostic images typically have a spatial resolution that is both rectilinear and accurate to within a very small tolerance, e.g., to within one millimeter or less. The fluoroscopic views by contrast are distorted, and they are shadowgraphic in that they represent the density of all tissue through which the conical x-ray beam has passed. In tool navigation systems of this type, the display visible to the surgeon may show an image of the surgical tool, biopsy instrument, pedicle screw, probe or the like projected onto a fluoroscopic image, so that the surgeon may visualize the orientation of the surgical instrument in relation to the imaged patient anatomy, while an appropriate reconstructed CT or MRI image, which may correspond to the tracked coordinates of the probe tip, is also displayed.
Among the systems which have been proposed for effecting such displays, many rely on closely tracking the position and orientation of the surgical instrument in external coordinates. The various sets of coordinates may be defined by robotic mechanical links and encoders, or more usually, are defined by a fixed patient support, two or more receivers such as video cameras which may be fixed to the support, and a plurality of signaling elements attached to a guide or frame on the surgical instrument that enable the position and orientation of the tool with respect to the patient support and camera frame to be automatically determined by triangulation, so that various transformations between respective coordinates may be computed. Three-dimensional tracking systems employing two video cameras and a plurality of emitters or other position signaling elements have long been commercially available and are readily adapted to such operating room systems. Similar systems may also determine external position coordinates using commercially available acoustic ranging systems in which three or more acoustic emitters are actuated and their sounds detected at plural receivers to determine their relative distances from the detecting assemblies, and thus define by simple triangulation the position and orientation of the frames or supports on which the emitters are mounted. When tracked fiducials appear in the diagnostic images, it is possible to define a transformation between operating room coordinates and the coordinates of the image.
In general, the feasibility or utility of a system of this type depends on a number of factors such as cost, accuracy, dependability, ease of use, speed of operation and the like. Intraoperative x-ray images taken by C-arm fluoroscopes alone have both a high degree of distortion and a low degree of repeatability, due largely to deformations of the basic source and camera assembly, and to intrinsic variability of positioning and image distortion properties of the camera. In an intraoperative sterile field, such devices must be draped, which may impair optical or acoustic signal paths of the signal elements they employ to track the patient, tool or camera.
More recently, a number of systems have been proposed in which the accuracy of the 3-D diagnostic data image sets is exploited to enhance accuracy of operating room images, by matching these 3-D images to patterns appearing in intraoperative fluoroscope images. These systems may require tracking and matching edge profiles of bones, morphologically deforming one image onto another to determine a coordinate transform, or other correlation process. The procedure of correlating the lesser quality and non-planar fluoroscopic images with planes in the 3-D image data sets may be time-consuming, and in those techniques that rely on fiducials or added markers, the processing necessary to identify and correlate markers between various sets of images may require the surgeon to follow a lengthy initialization protocol, or may be a slow and computationally intensive procedure. All of these factors have affected the speed and utility of intraoperative image guidance or navigation systems.
Correlation of patient anatomy or intraoperative fluoroscopic images with precompiled 3-D diagnostic image data sets may also be complicated by intervening movement of the imaged structures, particularly soft tissue structures, between the times of original imaging and the intraoperative procedure. Thus, transformations between three or more coordinate systems for two sets of images and the physical coordinates in the operating room may require a large number of registration points to provide an effective correlation. For spinal tracking to position pedicle screws it may be necessary to initialize the tracking assembly on ten or more points on a single vertebra to achieve suitable accuracy. In cases where a growing tumor or evolving condition actually changes the tissue dimension or position between imaging sessions, further confounding factors may appear.
When the purpose of image guided tracking is to define an operation on a rigid or bony structure near the surface, as is the case in placing pedicle screws in the spine, the registration may alternatively be effected without ongoing reference to tracking images, by using a computer modeling procedure in which a tool tip is touched to and initialized on each of several bony prominences to establish their coordinates and disposition, after which movement of the spine as a whole is modeled by optically initially registering and then tracking the tool in relation to the position of those prominences, while mechanically modeling a virtual representation of the spine with a tracking element or frame attached to the spine. Such a procedure dispenses with the time-consuming and computationally intensive correlation of different image sets from different sources, and, by substituting optical tracking of points, may eliminate or reduce the number of x-ray exposures required to effectively determine the tool position in relation to the patient anatomy with the required degree of precision.
However, each of the foregoing approaches, correlating high quality image data sets with more distorted shadowgraphic projection images and using tracking data to show tool position, or fixing a finite set of points on a dynamic anatomical model on which extrinsically detected tool coordinates are superimposed, results in a process whereby machine calculations produce either a synthetic image or select an existing data base diagnostic plane to guide the surgeon in relation to current tool position. While various jigs and proprietary subassemblies have been devised to make each individual coordinate sensing or image handling system easier to use or reasonably reliable, the field remains unnecessarily complex. Not only do systems often require correlation of diverse sets of images and extensive point-by-point initialization of the operating, tracking and image space coordinates or features, but they are subject to constraints due to the proprietary restrictions of diverse hardware manufacturers, the physical limitations imposed by tracking systems and the complex programming task of interfacing with many different image sources in addition to determining their scale, orientation, and relationship to other images and coordinates of the system.
Several proposals have been made that fluoroscope images be corrected to enhance their accuracy. This is a complex undertaking, since the nature of the fluoroscope's 3D to 2D projective imaging results in loss of a great deal of information in each shot, so the reverse transformation is highly underdetermined. Changes in imaging parameters due to camera and source position and orientation that occur with each shot further complicate the problem. This area has been addressed to some extent by one manufacturer which has provided a more rigid and isocentric C-arm structure. The added positional precision of that imaging system offers the prospect that, by taking a large set of fluoroscopic shots of an immobilized patient composed under determined conditions, one may be able to undertake some form of planar image reconstruction. However, this appears to be computationally very expensive, and the current state of the art suggests that while it may be possible to produce corrected fluoroscopic image data sets with somewhat less costly equipment than that required for conventional CT imaging, intra-operative fluoroscopic image guidance will continue to require access to MRI, PET or CT data sets, and to rely on extensive surgical input and set-up for tracking systems that allow position or image correlations to be performed.
Thus, it remains highly desirable to utilize simple, low-dose and low cost fluoroscope images for surgical guidance, yet also to achieve enhanced accuracy for critical tool positioning.
It would be desirable to provide an improved image guided navigation system for a surgical instrument.
It would also be desirable to provide such an image guided system which operates with a C-arm fluoroscope to produce enhanced images and information.
It would also be desirable to provide an image-guided surgical navigation system adaptable to a fluoroscope that accurately depicts tool position.
SUMMARY OF THE INVENTION
One or more of the foregoing features and other desirable ends are achieved in a method or system of the present invention wherein an x-ray imaging machine of movable angulation, such as a fluoroscope, is operated to form reference or navigation images of a patient undergoing a procedure. A tracking system employs a tracking element affixed to each of the imaging machine and tool, and preferably to the patient as well, to provide respective position data for the tool, the fluoroscope and patient, while a fixed volume array of markers, which is also tracked, is imaged in each frame. Preferably the array of markers is affixed to the detector assembly of the imaging machine, where a single tracking element determines position of the fluoroscope and entire array of markers. The fluoroscope may itself also provide further shot-specific indexing or identification data of conventional type, such as time, settings or the like. A processor then applies the position data from the tracking system, and operates on the imaged markers to produce a correct tool navigation image for surgical guidance.
The markers are preferably arranged in a known pattern of substantially non-shadowing point elements positioned in different planes. These may be rigidly spaced apart in a predefined configuration in an assembly attached to the fluoroscope, so that the physical position of each marker is known exactly in a fixed fluoroscope-based coordinate system, and the positions may, for example, be stored in a table. A single tracking element may be affixed on the marker assembly, which may in turn be locked in a fixed position on the fluoroscope, so that the fluoroscope and marker positions are known in relation to the tool and the patient. Alternatively, one or more separate arrays of markers may be independently positioned and each tracked by a separate tracking element.
In each fluoroscopic image, the processor identifies a subset of the markers and recovers geometric camera calibration parameters from the imaged marker positions. These calibration parameters then allow accurate reference between the recorded image and the tool and patient coordinates measured by the trackers. The processor may also receive patient identification data of a conventional type to display or record with the shot. In one embodiment the processor computes the calibration as well as geometric distortion due to the imaging process, and converts the tracked or actual location of the tool to a distorted tool image position at which the display projects a representation of the tool onto the fluoroscopic image to guide the surgeon in tool navigation.
In this aspect of the invention, the processor identifies markers in the image, and employs the geometry of the identified markers to model the effective source and camera projection geometry each time a shot is taken, e.g., to effectively define its focus and imaging characteristics for each frame. These parameters are then used to compute the projection of the tool in the fluoroscope image.
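One standard way to model the per-shot projection geometry from matched marker positions is a direct linear transform (DLT) fit of a 3×4 projection matrix, sketched below in Python. This is an illustrative stand-in for the specific calibration procedure described in the detailed description, and the function names are hypothetical.

```python
import numpy as np

def fit_projection_matrix(world_pts, image_pts):
    """Least-squares DLT fit of a 3x4 perspective projection matrix.

    world_pts : (N, 3) known marker positions (from the tracked fixture), N >= 6
    image_pts : (N, 2) corresponding detected marker centroids in pixels
    """
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 4)          # right singular vector of least singular value

def project(P, point3d):
    """Project a tracked 3-D point (e.g. the tool tip) into image coordinates."""
    x, y, w = P @ np.append(point3d, 1.0)
    return np.array([x / w, y / w])
```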
In yet a further aspect of the invention, the fluoroscope is operated to take a series of shots in progressively varying orientations and positions as the camera and source are moved about the patient. Accurate calibration for multiple images is then employed to allow three-dimensional reconstruction of the image data. The processor applies a reconstruction operation or procedure, for example, back projection of the registered images to form a volume image data set, e.g., a three dimensional set of image density values of a tissue volume. The initial set of fluoroscopic images may, for example, be acquired by taking a series of views rotating the fluoroscope in a fixed plane about a target region of tissue. A common center and coordinate axes are determined for the reconstructed volume, such that the volume image data set constructed from the images corresponds to the target region. Image planes are then directly constructed and displayed from this volume image data set.
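A highly simplified, unfiltered backprojection loop is sketched below to show how the per-shot projection models index each frame into a common volume; a practical implementation would add ramp filtering and cone-beam weighting (e.g., a Feldkamp-type variant). The array layout and names are assumptions for illustration.

```python
import numpy as np

def backproject(images, proj_matrices, grid_pts, shape):
    """Accumulate pixel values along each ray into a voxel grid (no filtering).

    images        : list of 2-D fluoro frames (already distortion-corrected)
    proj_matrices : one 3x4 matrix per frame, mapping reconstruction
                    coordinates to homogeneous pixel coordinates for that shot
    grid_pts      : (M, 3) voxel-center coordinates in the reconstruction
                    coordinate system, M == nx*ny*nz
    shape         : (nx, ny, nz) of the output volume
    """
    volume = np.zeros(len(grid_pts))
    homog = np.hstack([grid_pts, np.ones((len(grid_pts), 1))])
    for img, P in zip(images, proj_matrices):
        uvw = homog @ P.T                                  # project every voxel
        u = np.round(uvw[:, 0] / uvw[:, 2]).astype(int)    # column index
        v = np.round(uvw[:, 1] / uvw[:, 2]).astype(int)    # row index
        inside = (u >= 0) & (u < img.shape[1]) & (v >= 0) & (v < img.shape[0])
        volume[inside] += img[v[inside], u[inside]]
    return volume.reshape(shape)
```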
The resultant fluoro-CT images are geometrically comparable to conventional diagnostic image sets of the imaged volume, and obviate the need for complex tracking and image correlation systems otherwise proposed or required for operating-room management and display of pre-operatively acquired volumetric data sets with intraoperative fluoro images. In accordance with a still further aspect of the invention, this reconstructed fluoro-CT data set is then registered to or transformed to the image space coordinates of a preoperative PET, MRI or CT data set for simultaneous display of both sets of images. In other embodiments, the system of the present invention may be used simply for the purpose of intraoperatively registering preoperative 3D image data to the patient tissue. In accordance with this aspect of the invention, a set of fluoro-CT image data is constructed as described above, and these are registered to preoperative 3D image data by mutual information, contour matching or other correlation procedure. This provides a direct registration of the preoperative data to tracking coordinates without requiring the surgeon to place and image fiducials, touch and enter skeletal or surface registration points, or perform invasive pre-operation image registration protocols.
The tracking elements of the tracking system may comprise various position-indicating elements or markers which operate optically, ultrasonically, electromagnetically or otherwise, and the tracking system itself may include hybrid software-mediated elements or steps wherein a pointer or tool of defined geometry is tracked as it touches fiducials or markers in order to enter or initialize position coordinates in a tracking system that operates by triangulating paths, angles or distances to various signal emitting or reflecting markers. A hybrid tracking system may also be used, including one or more robotic elements which physically encode mechanical positions of linkages or supports as part of one or more of the tracking measurements being made. Preferably, however, the tracking system employs electromagnetic tracking elements such as shown in U.S. Pat. No. 5,967,980, to generate and/or detect electromagnetic field components that pass through or are substantially unobstructed by the patient and intervening structures, and to directly determine coordinates in three or more dimensions referenced to the tool, the patient or the fluoroscope to which the elements are attached.
A single tracking element may be affixed to each of the fluoroscope, the patient, and the surgical tool. One presently preferred embodiment of a tracking element employs a magnetic field element, such as one configured with three mutually orthogonal coils, that otherwise operates as a substantially point-origin field generator or field sensor. The element may have a rigid or oriented housing, so that when attached to a rigid object, the tracked coordinates of the element yield all coordinates, with only a defined constant offset, of the object itself. The element may be energized as a field generator, or sampled as a field sensor, to produce or detect a field modulated in phase, frequency or time so that some or all of the x-, y-, z-, roll-, pitch-, and yaw coordinates of each tracking element, and thus its associated object, are quickly and accurately determined. A table of position correction factors or characteristics may be compiled for one or more of the tracking elements to correct for the effects of electromagnetic shunting or other forms of interference with the generator or receiver which may occur when positioned in a region near to the body of the fluoroscope. This allows a magnetic tracking element to be placed quite close to the imaging assembly or other conductive structure and achieve high position tracking accuracy or resolution. In particular, one or more tracking elements may be mounted directly on the fluoroscope and/or on calibration fixtures positioned close to the image detector of the fluoroscope to define camera and imaging parameters relative to another tracker which may move with the patient or with a tool. Various alternative magnetic generating and sensing assemblies may be used for the tracking component, such as ones having a tetrahedrally-disposed generating element and a single sensing/receiving coil, or ones having a multipole generating assembly that defines a suitably detectable spatial field.
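Converting a tracked element's pose into the pose of the object it is mounted on amounts to composing the measured pose with the stored constant offset; a minimal sketch, assuming 4×4 homogeneous-matrix conventions and hypothetical names, follows.

```python
import numpy as np

def pose_matrix(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def object_pose(world_T_sensor, sensor_T_object):
    """Compose the tracked sensor pose with the stored constant offset."""
    # world_T_sensor : tracker measurement of the element's rigid housing
    # sensor_T_object: fixed offset to the tool tip, fixture, or marker origin
    return world_T_sensor @ sensor_T_object

def to_tracker_coords(world_T_object, local_pts):
    """Map stored local coordinates (e.g. a marker table) into tracker space."""
    local_pts = np.asarray(local_pts, dtype=float)
    homog = np.hstack([local_pts, np.ones((len(local_pts), 1))])
    return (homog @ world_T_object.T)[:, :3]
```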
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will be understood from the description and claims herein, viewed in light of the prior art, and taken together with the Figures illustrating several basic embodiments and representative details of construction, wherein
FIG. 1 illustrates a fluoroscopic image and tool navigation system in accordance with one embodiment of the present invention;
FIG. 1A illustrates camera imaging of a tissue region with the system of FIG. 1;
FIG. 2 illustrates representative navigation images of one embodiment of the system of FIG. 1;
FIG. 2A illustrates the display of fluoroscope orientation in a preferred implementation of the system of FIG. 1;
FIG. 3 shows details of one camera calibration sub-assembly useful in the embodiment of FIG. 1;
FIG. 3A shows another calibration sub-assembly of the invention;
FIG. 4 is a flow chart showing image processing and tool tracking in accordance with a first aspect of the invention;
FIG. 5 illustrates operation of a second aspect of the invention;
FIG. 6 illustrates a sheet fixture for use with the invention and having combined calibration and tracking elements;
FIG. 7 illustrates camera calibrations corresponding to the fluoroscope poses illustrated in FIG. 1A and used for the operation illustrated in FIG. 5; and
FIG. 8 illustrates operation of the system to register preoperative images to a patient.
DETAILED DESCRIPTION OF THE INVENTION
FIG. 1 illustrates elements of a basic embodiment of a system 10 in accordance with the present invention for use in an operating room environment. As shown, the system 10 includes a fluoroscope 20, a work station 30 having one or more displays 32 and a keyboard/mouse or other user interface 34, and a plurality of tracking elements T1, T2, T3. The fluoroscope 20 is illustrated as a C-arm fluoroscope in which an x-ray source 22 is mounted on a structural member or C-arm 26 opposite to an x-ray receiving and detecting unit, referred to herein as an imaging assembly 24. The C-arm moves about a patient for producing two dimensional projection images of the patient from different angles. The patient remains positioned between the source and the camera, and may, for example, be situated on a table or other support, although the patient may move. The tracking elements, described further below, are mounted such that one element T1 is affixed to, incorporated in or otherwise secured against movement with respect to a surgical tool or probe 40. A second tracking unit T2 is fixed on or in relation to the fluoroscope 20, and a third tracking unit T3 is fixed on or in relation to the patient. The surgical tool may be a rigid probe as shown in FIG. 1, allowing the tracker T1 to be fixed at any known or convenient position, such as on its handle, or the tool may be a flexible tool, such as a catheter, flexible endoscope or an articulated tool. In the latter cases, the tracker T1 is preferably a small, localized element positioned in or at the operative tip of the tool as shown by catheter tracker T1′ in FIG. 1, to track coordinates of the tip within the body of the patient.
As will be understood by those skilled in the art, fluoroscopes typically operate with an x-ray source 22 positioned opposite the camera or image sensing assembly 24. While in some systems, the X-ray source is fixed overhead, and the camera is located below a patient support, the discussion below will be illustrated with regard to the more complex case of a typical C-arm fluoroscope, in which the source and camera are connected by a structural member, the C-arm, that allows movement of the source and camera assembly about the patient so it may be positioned to produce x-ray views from different angles or perspectives. In these devices, the imaging beam generally diverges at an angle, the relative locations and orientations of the source and camera vary with position due to structural flexing and mechanical looseness, and the position of both the source and the camera with respect to the patient and/or a tool which it is desired to track may also vary in different shots.
The imaging beam illustrated by B in FIG. 1 diverges from the source 22 in a generally truncated conical beam shape, and the C-arm 26 is movable along a generally arcuate path to position the source and camera for imaging from different directions. This generally involves positioning the camera assembly 24 as close as possible behind the relevant tissue or operating area of the patient, while the C-arm assembly is moved roughly about a targeted imaging center to the desired viewing angle. The C-arm or other beam structure 26 may be a somewhat flexible structure, subject to bending, deflection or sagging as the source and camera move to different positions around the patient, and the C-arm may also have other forms of dimensional variation or looseness, such as drive gear backlash, compressible elastomeric suspension components or the like, which may contribute to variations and non-repeatability of the relative disposition and alignment of the source and camera with respect to each other, and with respect to the patient, as the assembly is moved to different positions. The C-arm may also move eccentrically or translationally to allow better clearance of the patient support table. The bending deflections of the C-arm assembly may vary the actual position of the source 22 by almost a centimeter or more with respect to the image detector, and displace it from a nominal position which may be indicated, for example, by an encoder present in the fluoroscope stand or C-arm positioning assembly. These variations may therefore be significant.
FIG. 1A illustrates the fluoroscope 20 in two different imaging positions, with a first position shown in solid line, and a second position in dashed line phantom. In the first position, a tissue volume V is imaged with a divergent beam from the above right, and a virtual beam origin or focal point at F, while the image from the second position catches a largely overlapping but partly distinct tissue volume with a divergent beam from the upper left, and a different focal point F′. The distances from points F, F′ to the camera may be different, and the camera itself may shift and tilt with respect to the beam and its center axis, respectively. In practice, the x-ray beam is generally aimed by its center ray, whose intersection with the imaging plane, referred to as the piercing point, may be visually estimated by aiming the assembly with a laser pointing beam affixed to the source. The x-ray beam may be considered to have a virtual origin or focal point F at the apex of the cone beam. Generally, the camera assembly 24 is positioned close to the patient, but is subject to constraints posed by the operating table, the nature of the surgical approach, and the necessary tools, staging, clamps and the like, so that imaging of a tissue volume somewhat off the beam center line, and at different distances along the beam, may occur. As noted above, flexing of the C-arm also changes the distance to the focal point F and this also may slightly vary the angular disposition of the beam to the camera, so this shifting geometry may affect the fluoroscope images.
Furthermore, the camera 24 may utilize an image sensing unit that itself introduces further distortions into the received distribution of image radiation. For example, the unit may involve a detector that employs a phosphor surface of generally curved contour to convert the x-ray image intensity distribution to a free electron distribution. Such a curved phosphor screen is generally placed over an electron multiplier or image intensifier assembly that provides an enhanced output video signal, but may further introduce a form of electron optical distortion that depends upon the intensifier geometry and varies with the orientation of the camera assembly in the earth's magnetic field. Other configurations of image detectors are also known or proposed, such as digital x-ray detectors or flat semiconductor arrays, which may have different imaging-end fidelity characteristics. In any case, deflection or physical movement of the camera itself as well as electron/optical distortion from the camera geometry, image detector and variations due to gravitational, magnetic or electromagnetic fields can all enter the image reception and affect the projective geometry and other distortion of the final image produced by the assembly.
The foregoing aspects of imaging system variability are addressed by the present invention by using tracking elements in conjunction with a camera calibration fixture or correction assembly to provide fluoroscopic images of enhanced accuracy for tool navigation and workstation display.
A more detailed description of the operation of the present invention follows, and proceeds initially from 1) a mechanism for effectively characterizing camera imaging parameters while addressing distortion in each image frame or shot of a C-arm fluoroscope, to 2) using these parameters to reconstruct a 3D volume that is dynamically referenced to the patient and the tool; and finally 3) fusing the dynamically referenced 3D volume with preoperative volumetric data. The equipment and procedure have two components: a first component provided by a tracking assembly which determines position of a fluoroscope calibration fixture relative to one or both of the tool and patient body, and a second component provided by a processor operating on each image that characterizes or models the geometry of the camera and performs all subsequent processing. This is done by providing a calibration fixture that contains an array of markers, which is either tracked as a rigid unit or affixed to the camera, while the imaged position of the markers in each fluoroscope shot serves to characterize the imaging geometry so as to allow correction of imaged features at measured distances from the camera, and permit registration of successive images in different poses.
In accordance with a principal aspect of the present invention, when tracked relative to a tool, the surgical instrument display 40′ of FIGS. 1 and 2 is effected by determining tool position, focus and imaging axis, and rendering the instrument in conjunction with one or more of the three types of images mentioned above. In one embodiment, the processor determines an image distortion inverse transform and projects a distorted or transformed tool graphic or image on the fluoroscopic view. In another aspect, the processor determines the camera geometry for each image and transforms the set of fluoroscopic images such that the screen coordinates of display 33 are similar or aligned with the operating coordinates as measured by tracking elements T2, T3. This calibration results in more accurate tool tracking and representation over time. As further discussed in regard to FIG. 5 below, the image data of an imaging sequence for a region of tissue about a common origin may be back-projected or otherwise processed to define a three dimensional stack of fluoro-CT images. The invention thus allows a relatively inexpensive C-arm fluoroscope to achieve accuracy and registration to prepare CT images for tool guidance and reconstruction of arbitrary planes in the imaged volume.
In overall appearance, the data processing and work station unit 30 illustrated in FIG. 1 may be laid out in a conventional fashion, with a display section in which, for example, a previously acquired CT or diagnostic image is displayed on one screen 32 while one or more intraoperative images 33, such as an A/P and a lateral fluoroscopic view, are displayed on another screen. FIG. 2 schematically represents one such display. In its broad aspects the system may present an appearance common to many systems of the prior art, but, in a first aspect, it provides enhanced or corrected navigation guiding images, while in a second aspect it may provide CT or other reconstructed images in display 32 formed directly from the fluoroscopic views. In a third aspect the system may provide dynamic referencing between these reconstructed images and a set of preoperative 3D image data.
Typically, for tool positioning, one fluoroscope image in display 33 may be taken with the beam disposed vertically to produce an A/P fluoroscopic image projected against a horizontal plane, while another may be taken with the beam projected horizontally to take a lateral view projected in a vertical plane. As schematically illustrated therein, the image typically shows a plurality of differently shaded features, so that a patient's vertebra, for example, may appear as an irregular three-dimensional darkened region shadow-profiled in each of the views. The tool representation for a navigation system may consist of a brightly-colored dot representing tip position and a line or vector showing orientation of the body of the tool approaching its tip. In the example shown in FIG. 2, in the horizontal plane, the probe projected image 40′ may extend directly over the imaged structure from the side in the A/P or top view, while when viewed in the vertical plane the perspective clearly reveals that the tip has not reached that feature but lies situated above it in space. Surgeons have generally become accustomed to operating with such images, and despite the fact that the fluoroscopic images are limited by being projection images rather than 3D images, their display of approximate position and orientation, in conjunction with the diagnostic image on panel 32 which may also have a tool point representation on it, enables the surgeon to navigate during the course of a procedure. In a preferred embodiment of the present invention, this display is further enhanced by employing position data from the tracking assembly to display the fluoroscope's current angle of offset from the baseline AP and lateral fluoroscope views. This may be done as shown in FIG. 2A, by marking the fluoroscope's tracked angle or viewing axis with a marker on a circle between the twelve o'clock and three o'clock positions representing the AP and lateral view orientations.
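The angle-of-offset indicator described above reduces to measuring the angle between the currently tracked beam axis and the stored baseline AP and lateral axes; a small hypothetical sketch:

```python
import numpy as np

def offset_from_baselines(current_axis, ap_axis, lateral_axis):
    """Angle in degrees between the tracked beam axis and each baseline axis."""
    def angle(a, b):
        cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
    return angle(current_axis, ap_axis), angle(current_axis, lateral_axis)
```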
The nature of the enhancement or correction is best understood from a discussion of one simple embodiment of the present invention, wherein a tracking system tracks the surgical instrument 40, and the system projects a representation 40′ of the tool on each of the images detected by the image detector 24. This representation, while appearing as a simple vector drawing of the tool, is displayed with its position and orientation determined in the processor by applying a projective transform and an inverting image distortion transformation to the actual tool coordinates determined by the tracking elements. Thus, it is displayed in “fluoroscope image space”, rather than displaying a simple tool glyph, or correcting the image to fit the operating room coordinates of the tool.
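The projective transform plus inverting distortion applied to the tracked tool can be sketched as below; the simple radial distortion model and all parameter names are assumptions for illustration rather than the distortion model actually used.

```python
import numpy as np

def tool_to_image(P, distortion_k, principal_point, tool_pts_3d):
    """Map tracked tool points into (distorted) fluoroscope image coordinates.

    P              : 3x4 projection matrix modeled for this shot
    distortion_k   : radial distortion coefficient (assumed simple model)
    principal_point: (u0, v0) piercing point in pixels
    tool_pts_3d    : (N, 3) tool points (tip, shaft samples) in tracker space
    """
    pp = np.asarray(principal_point, dtype=float)
    tool_pts_3d = np.asarray(tool_pts_3d, dtype=float)
    homog = np.hstack([tool_pts_3d, np.ones((len(tool_pts_3d), 1))])
    uvw = homog @ P.T
    uv = uvw[:, :2] / uvw[:, 2:3]              # ideal (undistorted) projection
    d = uv - pp                                # offset from the piercing point
    r2 = np.sum(d * d, axis=1, keepdims=True)
    return pp + d * (1.0 + distortion_k * r2)  # re-distort to overlay on the raw image
```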
FIG. 3 illustrates one embodiment 50 of a suitable marker array, calibration fixture or standard ST for the practice of the invention. As illustrated in this prototype embodiment, the fixture may include several sheets 52 of radiolucent material, each holding an array of radiopaque point-like markers 54, such as stainless steel balls (hereafter simply referred to as BBs). The BBs may be of different sizes in the different planes, or may be of the same size. Preferably, they are of the same size, e.g., about one or two millimeters in diameter, and preferably the one or more plates holding them are rigidly affixed at or near to the face of the camera imaging assembly so as to allow accurate calibration of the entire volume of interest while occupying a sufficiently small space that the camera may be positioned closely to the patient. The illustrated calibration fixture 50 includes a releasable clamp assembly 51, with a camming clamp handle 51a, configured to attach directly on or over the face of the camera assembly.
As shown in the system diagram, FIG. 4, operation of the system proceeds as follows.
Initially, as noted above, a tracking element is associated with each of the tool, the patient and the fluoroscope. Each tracking element is secured against movement with respect to the structure it is tracking, but advantageously, all three of those structures are free to move. Thus, the fluoroscope may move freely about the patient, and both the patient and the tool may move within the operative field. Preferably, the tracking element associated with the fluoroscope is positioned on a calibration fixture 50 which is itself rigidly affixed to the camera of the fluoroscope as described above. The calibration fixture may be removably attached in a precise position, and the tracking element T2 may be held in a rigid oriented body affixed to the fixture 50. The tracking element T2 (FIG. 3) may, for example, be a point-origin defining tracking element that identifies the spatial coordinates and orientation of its housing, hence, with a rigid coordinate transform, also specifies the position and orientation coordinates of the object to which it is attached. Thus, the tracking element T2 may with one measurement determine the positions of all markers in the calibration fixture, and the position and orientation of the fixture itself or the horizontal surface of the camera assembly.
The illustrated marker plates may each be manufactured by NC drilling of an array of holes in an acrylic, e.g., Lexan, and/or other polymer plate, with the BBs pressed into the holes, so that all marker coordinates are exactly known. Alternatively, marker plates may be manufactured by circuit board microlithography techniques, to provide desired patterns of radiopaque markers, for example as metallization patterns, on one or more thin radiolucent films or sheets. Applicants also contemplate that the calibration assembly, rather than employing separate sheets bearing the markers, may be fabricated as a single block 50 of a suitable radiolucent material, such as a structural foam polymer having a low density and high stiffness and strength. In that case, as shown in FIG. 3A, holes may be drilled to different depths and BB markers may be pressed in to defined depths Z1, Z2 . . . at specific locations to create the desired space array of markers in a solid foam calibration block. One suitable material of this type is a structural foam of the type used in aircraft wings for lightweight structural rigidity. This material may also be employed in separate thin marker-holding sheets. In any case the selected polymer or foam, and the number and size of the markers, are configured to remain directly in the imaging beam of the fluoroscope device and be imaged in each shot, while the position of the fixture is tracked. The fixture materials are selected to avoid introducing any significant level of x-ray absorption or x-ray scattering by the plates, sheets or block, and the size and number of markers are similarly chosen to avoid excessive shadowing of the overall image, while maintaining a sufficiently dense image level for their detectability, so that both the imaging source radiation level and the resulting image density scale remain comparable to currently desired operating levels. Preferably, the BBs are arranged in a pattern at one or more levels, with a different pattern at each level. Further, when more than one array at different depths is used, the patterns are positioned so that as the source/camera alignment changes, BBs of one pattern cast shadows substantially distinct from those of the other pattern(s).
As noted above, in accordance with a principal aspect of the present invention, the array of markers is imaged in each fluoroscope shot. As shown in FIG. 4, the image display system of the present invention operates by first identifying markers in the image. This is done in an automated procedure, for example, by a pipeline of grey level thresholding based on the x-ray absorption properties of the markers, followed by spatial clustering based on the shape and size of the markers. In the preferred embodiment having two or more planar sheets, each sheet has markers arranged in a particular pattern. The pattern of each sheet will be enlarged in the image by a scale that varies with the cone divergence and the distance of the marker sheet along the axis from the optical center (or x-ray source) to the detection surface. The marker images will also be shifted radially away from the beam center axis due to the beam divergence. In the preferred embodiment, the calibration fixture is positioned close to the image detection surface, and the markers lie in arrays distributed in planes placed substantially perpendicular to the optical axis and offset from the detection surface. In general, not all markers will be located in the image, due to shadowing of some of the markers or occlusion of a marker by another object of similar x-ray absorption response. In a prototype embodiment of the marker identification image processor, the candidate markers in the image are first identified using image processing and then matched with corresponding markers in the fixture.
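The threshold-and-cluster pipeline for locating BB candidates can be sketched with standard image-processing primitives; scipy.ndimage is used here purely as an illustrative choice, and the size limits are hypothetical.

```python
import numpy as np
from scipy import ndimage

def find_marker_candidates(frame, intensity_thresh, min_px=4, max_px=200):
    """Return centroids of dark, BB-sized blobs in a fluoroscopic frame.

    The radiopaque BBs absorb x-rays, so they appear as low-intensity
    spots; connected regions are filtered by pixel count to reject
    larger shadows and noise specks.
    """
    mask = frame < intensity_thresh            # grey-level threshold
    labels, n = ndimage.label(mask)            # spatial clustering
    centroids = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if min_px <= ys.size <= max_px:        # keep only BB-sized clusters
            centroids.append((ys.mean(), xs.mean()))
    return np.array(centroids)                 # (row, col) per candidate
```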
One suitable protocol takes a candidate marker Pi in image coordinates, assumes it is, e.g., marker number Qj of sheet one, and then determines how many other candidate markers support this match, i.e., line up with the expected projections of the remaining markers of one array, e.g., in the pattern of sheet one. The number of candidates matching the known template or pattern of sheet one is totaled, and is taken as the score of that marker. This process is repeated to score each candidate marker in the image, and an identification scored above a threshold is taken as correct when it leads to the highest score for that candidate, and does not conflict with the identification of another high-scoring candidate. Scoring of the match is done by using the observation that ratios of distances and angles between line segments on the same plane are invariant under perspective projection. When the array has only about fifty to one hundred markers, the processor may proceed on a point-by-point basis, that is, an exhaustive matching process may be used to determine the correspondence between points. When a larger number of markers is desired, the marker detection processor preferably employs an optimization algorithm such as the Powell, Fletcher or a simplex algorithm. One particularly useful pattern matching algorithm is that published by Chang et al. in Pattern Recognition, Volume 30, No. 2, pp. 311-320, 1997. That algorithm is both fast and robust with respect to typically encountered fluoroscopic distortions. As applied to calibration markers of the present invention, the Chang alignment/identification algorithm may be accelerated by relying upon the fact that the marker fixture itself has a known marker geometry. For example, the marker identification module may predict the expected positions in the image, and search for matches within a defined small neighborhood. The image processor calibration module includes a pre-compiled table, for example, stored in non-volatile memory, indicating the coordinates of each marker of the pattern, and preferably includes tables of separation for each pair, and/or included angle for each triplet of markers, to implement fast identification.
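The hypothesis-and-score idea can be illustrated by the following simplified sketch, which uses a two-dimensional similarity transform estimated from an assumed pair of correspondences in place of the full projective-invariant matching of Chang et al.; the function names and the pixel tolerance are illustrative assumptions.

```python
# Simplified hypothesis scoring: guess that two image candidates correspond to two
# template markers, estimate the implied scale/rotation/translation, project the
# whole template, and count candidates that land near the predictions.
import numpy as np

def similarity_from_pair(p1, p2, q1, q2):
    """2D similarity transform (scale s, rotation R, translation t) mapping q1->p1 and q2->p2."""
    dp, dq = p2 - p1, q2 - q1
    s = np.linalg.norm(dp) / np.linalg.norm(dq)
    ang = np.arctan2(dp[1], dp[0]) - np.arctan2(dq[1], dq[0])
    R = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
    t = p1 - s * R @ q1
    return s, R, t

def score_match(candidates, template, s, R, t, tol=3.0):
    """Count template markers whose predicted image positions have a candidate within tol pixels."""
    pred = (s * (R @ template.T)).T + t      # predicted image position of every template marker
    return sum(np.min(np.linalg.norm(candidates - q, axis=1)) < tol for q in pred)
```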
As noted above, when the calibration plates are rigidly affixed to the camera, only a single tracking element T2 is needed to determine the positions of all the markers, which differ only by a rigid transform (e.g., a translation plus a rotation) from those of the tracking element. Otherwise, if one or more of the arrays of markers is carried in a separately-positioned sheet or fixture, each such unit may be tracked by a separate tracking element. In either case, the array of marker positions is determined in each fluoroscopic image frame from the tracking element T2 and from the fixed relative position coordinates stored in the marker table.
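A minimal sketch of this bookkeeping follows, assuming the tracking element T2 reports its pose as a rotation matrix and a translation vector, and that the stored marker table is expressed in the fixture's own coordinate frame.

```python
# Markers stored once in fixture coordinates are mapped into tracker coordinates
# from the single rigid-body pose reported for the fixture's tracking element.
import numpy as np

def markers_in_tracker_frame(marker_table, R_fixture, p_fixture):
    """marker_table: (N, 3) fixture-frame marker coordinates; returns (N, 3) tracker-frame points."""
    return (R_fixture @ marker_table.T).T + p_fixture
```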
Continuing with a description of FIG. 4, in accordance with a principal aspect of the invention, the camera is next calibrated using the marker identification information of the previous steps. The imaging carried out by the fluoroscope is modeled as a camera system in which the optical center is located at the x-ray source and the imaging plane is located a distance F (focal length) away from it inside the camera assembly. The optical axis is the line through the x-ray source and perpendicular to the horizontal face of the camera. The intersection of the optical axis and the image plane is defined as the piercing point. Certain imaging or distortion characteristics may also be measured by the array of marker images, which thus determines a corrective perspective transformation. A suitable algorithm is that described by Roger Tsai in his article on 3-D camera calibration published in the IEEE Journal of Robotics and Automation, Volume RA-3, No. 4, August 1987, pp. 323-344. This model determines radial distortion in addition to the other camera parameters, using an algorithm that takes as input the matched marker and image locations, an estimate of the focal length, and information about the number of rows and columns in the projection image. This algorithm is readily implemented with one or more planes of markers in the fixture 50 or 50′. When the fluoroscope is sufficiently rigid that focus does not vary, a single plane of markers may be used to define the camera parameters.
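As a simplified stand-in for the cited Tsai calibration, the sketch below recovers a 3x4 projection matrix by the direct linear transform (DLT) from matched marker and image coordinates; unlike the Tsai model it omits the radial-distortion term, and it is included only to illustrate the kind of computation the calibration step performs.

```python
# Direct linear transform: estimate a 3x4 projection matrix from >= 6 matched
# 3D marker positions and their 2D image locations (no distortion term).
import numpy as np

def dlt_projection_matrix(X, u):
    """X: (N, 3) marker positions in tracker coordinates; u: (N, 2) image points; N >= 6."""
    A = []
    for (x, y, z), (px, py) in zip(X, u):
        A.append([x, y, z, 1, 0, 0, 0, 0, -px * x, -px * y, -px * z, -px])
        A.append([0, 0, 0, 0, x, y, z, 1, -py * x, -py * y, -py * z, -py])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 4)              # smallest singular vector gives the projection matrix
```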
By providing a pattern of markers in a plane, the shifts in position of those markers in the image define a local transformation that corrects for radial distortion of the image, while non-occluding markers in two planes, or at two different positions along the z-axis, are sufficient to identify the focus and the optical or central axis of the beam. Other models relying, for example, on defining a distortion morphing transformation from the array of marker images may also be applied. A pattern of markers may comprise a rectangular lattice, e.g., one marker every centimeter or half-centimeter in two orthogonal directions, or may occupy a non-periodic but known set of closely-spaced positions. The calibration fixture may be constructed such that markers fill a peripheral band around the imaged tissue, to provide marker shadow images that lie outside the imaged area and do not obscure the tissue which is being imaged for display. Preferably, however, the markers are located in the imaged field, so that the imaging camera and distortion transforms they define closely fit and characterize the geometric imaging occurring in that area. In the preferred embodiment, the image processor removes the marker shadow-images from the fluoroscope image frame before display on the console 30 (FIG. 1), and may interpolate or otherwise correct image values in the surrounding image.
Continuing with a description of the processing, the processor in one basic embodiment then integrates tracked tool position with the fluoroscope shot. That is, having tracked the position of tool 40 via tracking element T1, relative to the marker array 50 and tracking element T2, and having modeled the camera focus, optical axis and image plane relative to the position of the fixture 50, the system then synthesizes a projection image of the tool as it dynamically tracks movement of the tool, and displays that tool navigation image on the fluoro A/P and/or lateral view of screen 33 (FIG. 1).
To display the tool position on an uncorrected fluoroscope image, the processor obtains the position of the front and back tips of the tool. These are fixed offsets from the coordinates of the tracking element T1 associated with the tool. The tracker may also determine tool orientation relative to the patient from position and orientation relative to the tracking element T3 on the patient at the time of image capture. Tracked position coordinates are converted to be relative to the fixed tracking element on the camera, or so that all coordinates reference the image to which the camera model applies. In a basic tool navigation embodiment, the camera calibration matrix is then applied to the front and back tip position coordinates of the tool to convert them to fluoroscope image space coordinates. These end point coordinates are converted to undistorted two-dimensional image coordinates (e.g., perspective coordinates) using the calculated focal length of the camera, which are then converted to distorted two-dimensional image coordinates using the lens distortion factor derived from the matrix of marker positions. Corresponding pixel locations in the two-dimensional fluoroscope image are determined using the x-scale factor, the calculated origin of the image plane and scaling based on the number of pixels per millimeter in the camera image sensor and display. The determined position is then integrated with the video display on the fluoroscope to show a graphical representation of the tool with its front tip location in image coordinates. Preferably, the tool is displayed as an instrument vector, a two-dimensional line on the fluoroscopic image with a red dot representing its tip. Thereafter, during an ongoing procedure, the tracking assembly may track tool movement relative to the patient, and a processor controls the tracking and determines from the position of the tool when it is necessary to redraw the integrated display using the above-described image distortion transformations to correctly situate the displayed tool in a position on a new image.
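The tip-projection step may be illustrated by the following sketch, which assumes a single-coefficient radial distortion model; the parameters k1, cx, cy and px_per_mm stand in for the distortion factor, image-plane origin and pixel scaling discussed above, and are not values taken from the described system.

```python
# Project a tracked tool tip through the camera model, apply radial distortion,
# and convert to display pixel coordinates.
import numpy as np

def tip_to_pixel(tip_xyz, P, k1, cx, cy, px_per_mm):
    """tip_xyz: 3-vector in tracker coordinates; P: 3x4 projection matrix."""
    u = P @ np.append(tip_xyz, 1.0)
    x, y = u[0] / u[2], u[1] / u[2]          # undistorted perspective coordinates
    r2 = x * x + y * y
    xd, yd = x * (1 + k1 * r2), y * (1 + k1 * r2)      # distorted coordinates
    return cx + xd * px_per_mm, cy + yd * px_per_mm    # pixel location for the overlay
```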
As described above, the process of camera calibration is a process of applying actual coordinates as determined by the tracking system and marker positions, and image coordinates as seen in the fluoroscopic marker images, to model a camera for the image. In general, applicant's provision of an array of marker points having known coordinates in each of several planes, together with tracking coordinates corresponding to the absolute position of those planes and modeling of the camera image plane with respect to these tracked positions obviates the need for lengthy initialization or correlation steps, and allows an image processor to simply identify the marker images and their positions in the image, model the camera to define focus, image plane and piercing point, and to effect image corrections with a few automated tracking measurements and transformations. The fixture is preferably fixed close to the front surface of the image detector assembly, so the calibration fits the detected image closely.
As noted above, the marker positions allow a simple computation of effective parameters to fully characterize the camera. This allows one to scale and correct positions of the image (for example a tool) when their coordinates are tracked or otherwise unknown.
In accordance with a preferred method of operation of the present device, the fluoroscope is operated to take a large number of fluoro images, with fixture tracking and camera modeling as described above, and a 3D CT image data set is reconstructed from the acquired data. In general, this data set can be acquired such that it is dimensionally accurate and useful for close surgical guidance, although parameters such as x-ray absorbance, corresponding, for example, to bone or tissue density, will be of lesser accuracy than those obtainable from a CT scanner, and should not be relied upon. The fluoroscopic CT images so formed may be further correlated with preoperative MRI, PET or CT images to define a direct image coordinate transformation, using established techniques such as MI (mutual information) registration, edge or contour matching, or the like, between the fluoroscopic 3D data set of the present invention and the existing preoperative 3D image set.
Operation for forming a volume image data set for CT reconstruction proceeds as follows. First, the fluoroscope is operated to obtain a dense set of fluoroscope images, for example, by rotating the fluoroscope approximately in a plane about the patient through 180° plus the angle of divergence of the cone beam, taking a shot every degree or less, so as to image a particular three-dimensional tissue volume of the patient in a large number of images. As each frame is acquired, pose information, given for example by the position and orientation measurement of the tracking element T2, is stored, and the marker detection/calibration module operates on each shot so that a correction factor and a perspective projection matrix are determined for each image, as described above, to model the camera focus, image plane and optical axis for that shot. A coordinate system for the tissue volume for which reconstruction is desired is then computed, and the processor then applies filtered back projection or other reconstruction processing (such as lumigraphs or lambda-CT), with indexing provided by the relative disposition of each pose, to reconstruct a three-dimensional volume data image set in the intra-operative coordinate system for a region of tissue around the origin of the reconstruction coordinate system. This 3-D image data set referenced to tracker coordinates readily allows CT reconstruction of desired planes within the image set, referenced to patient or tool position.
In order to integrate the tracking system with the fluoroscopic images, it is necessary to establish a coordinate system for the three-dimensional reconstructed volume. This entails defining the origin and the coordinate axes for that volume. Once such a coordinate system is defined in relation to all fluoro images, one can compute the back projection at voxels in a region referenced to the origin, in planes that are perpendicular to one of the coordinate axes. In the case of a spinal scan, for example, the desired CT planes will be planes perpendicular to an axis that approximates the long axis of the body. Such a spinal data set is especially useful, since this view cannot be directly imaged by a fluoroscope, and it is a view that is critical for visually assessing alignment of pedicle screws. Applicant establishes this common coordinate system in a way that minimizes risk of: (a) backprojecting voxels where insufficient data exists in the projections or (b) being unable to define the relationship between the natural coordinate system of the patient and that of the reconstruction.
In the discussion that follows, it will be assumed that the user, e.g., the surgeon or radiologist, takes the fluoroscopic images such that the region of interest stays visible, preferably centered, in all the fluoroscopic images, and that each arc traced by the C-arm is approximately planar. These requirements may be met in practice by users of C-arm fluoroscopes, since surgeons have extensive practice in acquiring fluoroscopic images by tracing planar motion trajectories in which the relevant anatomy is centered in both AP and lateral views. Such centering is easiest to achieve, or most accurately attained, when using a substantially isocentric C-arm such as that made by Siemens. However, in calibrating the camera for each image, applicants are able to automatically determine a reconstruction coordinate system for an arbitrary sequence of images. In this regard, the camera tracking data may be used to fit a center. This is considered an advance over systems that require a coordinate system to be specified manually.
It will be appreciated that in the above-described system, the tracking elements automatically detect coordinates of the marker array, tool and patient at the time each image is taken. Detection of the calibration fixture position allows camera modeling to provide the position of the optical center (F), optical axis and image plane, in tracker coordinates for each shot as described above. In accordance with this further aspect of the invention, the combination of tracked position and modeled camera information is used to define a coordinate system for the reconstruction, which is preferably computed by performing statistical and computational geometry analysis on the pose information recorded and derived for each of the fluoroscopic image frames.
A few definitions will clarify the underlying procedure, employed in the prototype embodiment and automated in software. The “projection plane” is the plane on which the image is formed through the operation of perspective projection. The “optical center” or the “center of projection”, C, is located at a distance F, the focal length of the optical system, from the projection plane. In the case of a fluoroscope, this is the actual location of the x-ray source; the source is positioned at the optical center of the imaging system. The projection of a given point M in the world is computed as the intersection of the ray connecting M and the optical center C with the projection plane. The “optical axis” of a fluoroscopic imaging system is the line that passes through its optical center (the x-ray source) and is normal to the projection plane. The point at which the optical axis intersects the projection plane is known as the “principal point” or the “piercing point”. A textbook such as “Three-Dimensional Computer Vision” by Olivier Faugeras, MIT Press, may be consulted for further background or illustration of basic concepts used here.
Applicant's approach to the problem of computing a coordinate origin for reconstruction assures that in this set of data, the origin of the 3D coordinate system lies at a point that is the center of the region that the surgeon is most interested in visualizing. That point is identified in a prototype system by computing a point that is closest to being centered in all of the acquired fluoroscopic images, and then taking that point as the origin of a coordinate system in which the reconstruction is performed. FIG. 5 sets forth the steps of this processing.
It will be recalled that the camera calibration described above models the camera for each shot. Each configuration of the C-arm defines a coordinate system in which the origin, (0,0,0), is defined by the location of the x-ray source. The principal point is located at (0,0,F) where F is the focal length. That is, the optical axis, or axis of the imaging beam, is aligned along the third axis. Such a situation is schematically illustrated in FIG. 7 for the two fluoroscope positions shown in FIG. 1A. If all the fluoroscope configurations are taken in the context of a common world-coordinate system, each of these configurations defines a unique optical axis. Ideally, the point in three-space where all these optical axes intersect would be visible and centered in all the projection images. Based on the assumption that the fluoroscopic images are acquired by approximately centering the region of interest, applicant defines a projection center of the imaged tissue volume from the ensemble of camera models, and uses this intersection point as the origin for a three-dimensional reconstruction. This is done by applying a coordinate determination module, which identifies the optical axis intersection point as the intersection of, or best fit to, the N² pairs of optical axes of the modeled cameras for the N poses. In practice, two facts should be addressed in computing the center of projection: (a) the optical axes of any two fluoroscope shots are usually somewhat skew, lying in separate, but substantially parallel planes, and do not really intersect in space, and (b) the two "intersection" points determined by two different pairs of axes also do not generally coincide exactly.
In order to address these problems, for situation (a), the processor incorporates a software condition check for skewness of lines. If the optical axes are skew, the processor defines the intersection point as a computed point that is halfway between the two lines. In order to address situation (b), the processor takes the mean coordinates of the N² skew-intersection points determined in the first step as its common center of projection. Thus the cluster of points defined by the N² pairs of axes determines a single point. This point is defined as the origin of the tissue region for which reconstruction is undertaken.
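A sketch of this computation follows, assuming each pose is supplied as an x-ray source position and a unit optical-axis direction in tracker coordinates; the midpoints of the common perpendiculars of the pairwise axes are averaged to give the reconstruction origin.

```python
# Origin of the reconstruction volume: for each pair of (possibly skew) optical
# axes, take the midpoint of the shortest segment joining them, then average
# those points over all pairs.
import numpy as np
from itertools import combinations

def midpoint_between_lines(p1, d1, p2, d2):
    """Midpoint of the common perpendicular between lines p1 + t*d1 and p2 + s*d2."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    n = np.cross(d1, d2)
    if np.linalg.norm(n) < 1e-9:             # nearly parallel axes: fall back to midpoint of sources
        return (p1 + p2) / 2
    A = np.column_stack((d1, -d2, n))        # express p2 - p1 in the basis {d1, -d2, n}
    t, s, _ = np.linalg.solve(A, p2 - p1)    # t, s locate the feet of the common perpendicular
    return ((p1 + t * d1) + (p2 + s * d2)) / 2

def reconstruction_origin(sources, axes):
    """Mean of the pairwise 'intersection' points of all optical axes."""
    pts = [midpoint_between_lines(sources[i], axes[i], sources[j], axes[j])
           for i, j in combinations(range(len(sources)), 2)]
    return np.mean(pts, axis=0)
```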
It is also necessary to determine a set of coordinate axes for the volume data set. Preferably, the axial planes of the reconstruction are to be parallel to the plane of motion of the x-ray source. Applicant's presently preferred processing module computes the plane of motion of the x-ray source by fitting a least-squares solution to the poses of the x-ray source. Any two non-collinear vectors in this plane define a basis for this plane and serve as two of the axes for the coordinate system. The module also computes a normal to this plane to serve as the third coordinate axis. The coordinate axis computation may be done by using eigen-analysis of the covariance matrix of the coordinates of the optical centers (x-ray source locations) and the principal points in the world-coordinate system. These eigenvectors are then ordered by decreasing eigenvalue. The first two eigenvectors provide a basis for the axial plane of interest, and the third eigenvector provides the normal to this plane. This procedure thus provides all three coordinate axes for the three-dimensional reconstruction. This determination is fully automated, and requires only the tracker data and camera models determined by the processor when each shot is taken. Further background and details of implementation for applying the eigenanalysis technique to define coordinate axes may be found in reference texts, such as the 1984 textbook "Pattern Recognition" by J. Therrien.
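The eigen-analysis step may be illustrated by the following sketch, assuming the optical centers and principal points have been collected into a single array of tracker-frame coordinates.

```python
# Coordinate axes for the reconstruction: eigen-analysis of the covariance of
# the tracked source/principal-point positions. The two dominant eigenvectors
# span the plane of C-arm motion; the third is the plane normal.
import numpy as np

def reconstruction_axes(points):
    """points: (N, 3) optical centers and principal points in tracker coordinates."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues returned in ascending order
    order = np.argsort(eigvals)[::-1]        # reorder to decreasing eigenvalue
    e1, e2, e3 = eigvecs[:, order].T
    return e1, e2, e3                        # e1, e2 span the axial plane; e3 is its normal
```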
Having determined a coordinate system for the reconstruction, the processor then filters and back-projects the image data to form a volume image data set, from which CT planes may be reconstructed or retrieved in a conventional manner. The back projection step may utilize fast or improved processes, such as the fast Feldkamp algorithm or other variant, or may be replaced by another suitable volume data reconstruction technique, such as the local or Lambda tomography method described by A. Louis and P. Maass in IEEE Trans. Med. Imag., 764-769 (1993) and papers cited therein.
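For illustration, the following sketch shows an unfiltered back-projection loop indexed by the per-shot projection matrices; a practical implementation would first ramp-filter the projections and apply cone-beam weighting as in the Feldkamp method, but the loop conveys how each pose's camera model indexes the accumulation.

```python
# Accumulate projection values into a voxel grid using each shot's 3x4
# projection matrix (unfiltered back projection, for illustration only).
import numpy as np

def back_project(shots, grid_xyz):
    """shots: list of (image, P) pairs; grid_xyz: (M, 3) voxel centers in tracker coordinates."""
    volume = np.zeros(len(grid_xyz))
    homog = np.hstack([grid_xyz, np.ones((len(grid_xyz), 1))])
    for image, P in shots:
        uvw = (P @ homog.T).T
        u = np.rint(uvw[:, 0] / uvw[:, 2]).astype(int)     # column index in the projection
        v = np.rint(uvw[:, 1] / uvw[:, 2]).astype(int)     # row index in the projection
        inside = (u >= 0) & (u < image.shape[1]) & (v >= 0) & (v < image.shape[0])
        volume[inside] += image[v[inside], u[inside]]
    return volume
```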
Thus, a simple set of automated tracking elements combined with image processing operative on a fixed or tracked marker array provides accurate tool tracking fluoroscope images, or a set of geometrically accurate reconstructed or CT images from the shadowgraphic images of a C-arm or intraoperative fluoroscope. The nature of the multi-point marker-defined camera image model allows the processor to quickly register, reference to a common coordinate system and back project or otherwise reconstruct accurate volume image data, and the fast determination of a camera parameter model for each shot proceeds quickly and allows accurate tool display for intraoperative tool navigation and dynamic tracking, without requiring rigid frames or robotic assemblies that can obstruct surgery, and without the necessity of matching to an MRI or PET database to achieve precision. Furthermore in the preferred embodiment, the models, transformations and fitting to a coordinate system proceed from the tracker position measurements of the marker fixture relative to the patient or tool, rather than from an extrinsic fixed frame, reducing potential sources of cumulative errors and simplifying the task of registering and transforming to common coordinates. Applicant is therefore able to precisely track and display the tool in real time, and to produce accurate fluoro-CT images using a C-arm fluoroscope.
It will be understood that the description above relies upon tracking measurements made by tracking elements each fixed with respect to one of a few movable objects. As applied to the patient or tool, these tracking elements may be affixed by belts, frames, supports, clips, handles or other securing or orienting structures known in the art. In general, applicant's preferred tracking element is a magnetic field tracking element, which may be oriented and affixed in a rigid housing that allows it to secure to the structure to be tracked. Actual implementation of the system may involve a preliminary calibration procedure wherein the actual dimension, offset or relative position of the tool tip, the marker array or the like, with respect to the tool or array tracking element is permanently stored in a chip or non-volatile memory so that minimal or no set-up initialization is required during an imaging session. Similarly, when employing such magnetic tracking elements, a table of field or position corrections may be initially compiled for the tracking element mounted on the fluoroscope to assure that the tracking achieves a high level of accuracy over a broad field extending quite close to the image detector and C-arm structures. Additional reference sensor-type tracking elements or standards may also be provided as described in the aforesaid '980 patent, if desired to enhance the range, resolution or accuracy of the tracking system.
The calibration fixture has been described above as being preferably affixed to the image detector portion of the fluoroscope, where, illustratively one or several precision arrays of markers located along the imaging axis provide necessary data in the image itself to characterize the camera each time an image is taken. This location, with the markers in a single fixture, provides a high level of accuracy in determining the desired camera parameters, and enables tracking to proceed without obstructing the surgical field.
To the extent that the constraint of positioning the calibration fixture between the target tissue and the detector may limit flexibility in positioning the image detector near the patient, this may be addressed in other embodiments by having all or a portion of the marker array assembly implemented with markers located on or in a radiographic support table (75, FIG. 6) or other structure on which the patient or the imaged tissue portion is supported. In this case, the table or support itself, which is radiolucent, may have a thickness and structure that permits markers to be embedded at different depths. For example, it may be formed of a structural foam material as described above in regard to the marker fixture of FIG. 3A. Alternatively, the markers may be included in one or more sheets that fit within the x-ray sheet film tray of a conventional radiography table, or such marker sheets may be laminated to the bottom and/or top surfaces of the table. When affixed to the table or inserted in a registered or fixed fashion, the tracking element T2 may then be attached anywhere on the rigid structure of the table itself, with suitable offsets stored in a fixed memory element of the system. In embodiments utilizing such markers in the table, the total angular range of the poses in which useful marker images will appear in the fluoroscope images may be restricted to somewhat under 180°. Furthermore, the image plane will generally not be parallel to the marker arrays, so a different set of computations is utilized by the processor to characterize the camera position and geometry. However, these computations involve straightforward camera modeling, and may be accelerated by also tracking the image detector with an additional element T2′.
The calibration fixtures of the invention as well as the embodiments having markers on or in the table may be implemented with one or more separately-made sheet structures. FIG. 6 shows elements of one such embodiment wherein a marker array 50″ is formed as a pattern of metallized dots 56, which may be formed lithographically on a printed-circuit type sheet. As indicated schematically in this Figure, the sheet may also bear one or more lithographically-formed conductive loops 58, configured as a field generating or field sensing loop, for defining one or more elements of a magnetic tracking assembly. Three or more such patterned loops may be formed to constitute a basic electromagnetic generator or sensor that advantageously is precisely pre-aligned with respect to the coordinates of the markers 56 by virtue of its having been manufactured using a pattern lithography mask. The magnetic circuit loops may define magnetic multipoles for establishing or sensing position-tracking electromagnetic fields, or may, for example, include one or more coils of a system of Helmholtz coils for establishing a gradient field in the region where tracking is to occur. These may operate in conjunction with other coils disposed elsewhere for defining the tracking field. The implementation of magnetic tracking and radiographic marker elements on a sheet also allows plural sheets to be positioned and tracked separately for effecting the image-based processing of the present invention.
In addition to the above-described structure and operation of the invention, applicant contemplates system embodiments wherein a fluoro-CT data set is constructed as described above, and the fluoro-3D data set is then registered or correlated to an existing MRI, CT or PET 3D data set to form a fused set of images. These are then displayed on the system console 30 (FIG. 1) to provide enhanced patient information during surgery. Advantageously, the coordinates of the fluoro-CT images are known from the coordinates used in the reconstruction processing, while the correlation of the two different 3D image sets may proceed without reference to patient or other tracking coordinates, using any conventional 3D registration or correlation technique. This provides fast and effective fused image sets for surgical guidance or diagnostic evaluation.
Indeed, the system need not produce detailed fluoro-CT images, or need not display those images. Instead, the fluoro-CT images, or a lesser quality set of fluoro-CT images constructed from a faster (smaller) scan sequence of fluoro images, defined in tracker coordinates, may be produced and simply registered to a preoperative 3D data set in order to bring that preoperative image data set into the tracker coordinate system. In that case, the system applies this registration, and proceeds thereafter by simply tracking the patient and the tool, and displaying the appropriate preoperative images for each tracked location as shown in FIG. 8. Thus, in accordance with this aspect of the invention, the system provides an automated registration system for the intraoperative display of preoperative MRI, PET or CT images, without requiring placement or imaging of fiducials, without requiring the surgeon to initialize or set up a plurality of reference points, without requiring the surgeon to cut down to or expose a fixed skeletal registration feature, and without requiring immobilization of the patient in a frame or support. Instead, the intermediate fluoro-CT images are produced as part of an automated modeling and coordinatizing process, and both the production and the registration of these images to the preoperative data set may proceed in an entirely automated fashion in software, for example, registering by mutual information (MI), feature correlation or a similar process.
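As an illustration of one such registration metric, the following sketch computes the mutual information between two intensity volumes resampled onto a common grid; an optimizer would maximize this score over candidate rigid transforms. The bin count is an assumed value.

```python
# Mutual information between two image volumes, computed from their joint
# intensity histogram at a candidate alignment.
import numpy as np

def mutual_information(a, b, bins=32):
    """a, b: intensity arrays of identical shape (one volume resampled into the other's grid)."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0                              # avoid log(0) for empty histogram bins
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))
```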
The invention being thus disclosed and illustrative embodiments depicted herein, further variations and modifications of the invention will occur to those skilled in the art, and all such variations and modifications are considered to be within the scope of the invention, as defined by the claims appended hereto and equivalents thereof. Each of the patents and publications identified above is hereby incorporated by reference herein in its entirety.

Claims (31)

What is claimed is:
1. A system for surgical imaging and display of the type indicating tool position together with an image of a patient undergoing surgery, and including a display and an image processor for displaying such image in coordination with an image of the tool for visual navigation of the tool during the surgical procedure, the system being configured for use with a fluoroscope that includes an x-ray source and an imaging system, and such that at least one image in the display is derived from the fluoroscope at the time of surgery, wherein the system comprises:
a fixture including a three-dimensional array of markers and adapted to be rigidly affixed to a fluoroscope for automatically imaging the array of markers as a fluoroscope image is acquired with the fluoroscope;
a tracking assembly including a first tracking element associated with the tool and a second tracking element associated with the fixture, the tracking assembly being effective to automatically determine relative position data of the marker array and the tool;
a marker determination unit operative on the fluoroscope image to identify at least a subset of said markers in the image;
a processor operative with said identification to calibrate the system; and
a correction unit responsive to the system calibration and to said position data to determine an image transformation such that a representation of the tool is correctly positioned and oriented in the fluoroscope image.
2. The surgical imaging and display system of claim 1, wherein said fixture includes arrays of markers arranged in two different planes along an imaging direction in substantially non-shadowing patterns, and said correction unit applies the marker position data to determine camera focus and center to register the representation of the tool in the fluoroscope image.
3. The surgical imaging and display system of claim 1, wherein said correction unit determines an image distortion inverse transform operative to convert tracking assembly spatial coordinates of the tool relative to the imaging system, to fluoroscope image coordinates.
4. The surgical imaging and display system of claim 1, wherein said processor determines a camera projection geometry for each of a plurality of fluoroscope images and applies said camera projection geometry to identify a tissue coordinate system for a common region of imaged tissue.
5. The surgical imaging and display system of claim 4, wherein said processor transforms image data of said plurality of fluoroscope images to a 3D fluoro-CT data set representative of said common region of imaged tissue.
6. The surgical imaging and display system of claim 1, wherein said processor determines a camera model that includes radial distortion of said fluoroscope image.
7. The surgical imaging and display system of claim 1, further comprising a table of calibration data representing static camera corrections that vary with camera position or orientation.
8. The surgical imaging and display system of claim 1, wherein the fixture is releasably attachable to an image detector assembly of the fluoroscope camera.
9. The surgical imaging and display system of claim 1, wherein the three dimensional array of markers includes a first array of markers disposed in a first plane, and a second array of markers disposed in a second plane parallel to the first plane, the planes having a defined separation.
10. The surgical imaging and display system of claim 1, wherein the three dimensional array of markers includes a first array of markers disposed in a first plane, and a second array of markers disposed in a second plane, wherein one of said planes is affixed to the camera, and a tracking element is secured against movement with respect to the other of said planes.
11. The surgical imaging and display system of claim 1, wherein said fixture comprises a radiolucent and substantially non-scattering body having a plurality of radio-opaque features at different depths arranged to constitute patterns of markers imaged by the image detector.
12. The surgical imaging and display system of claim 1, wherein one of said tracking elements is an electromagnetic tracking element that includes a lithographically formed sheet assembly comprising marker elements.
13. A system for use with a fluoroscope that includes a source and an image detector, wherein the system comprises:
a fixture including a three dimensional array of point-like fluoroscopically opaque markers disposed over a region in a rigid body, said body being configured for disposition between the source and camera such that the array of markers is automatically imaged as an image is acquired by the fluoroscope;
a tracking assembly including at least a first tracking element associated with the fixture and a second tracking element associated with a tool and configured for automatically determining position data of the fixture with respect to the tool;
a marker determination unit operative on the fluoroscope image to identify at least a subset of said markers in the image and operative with a table of predefined marker positions to determine camera imaging parameters for the fluoroscope image; and
a correction unit responsive to said camera imaging parameters and to said position data to form a corrected tool navigation fluoroscopic image.
14. A system for display of tissue images and probe images during a surgical procedure on a patient, such system comprising
a fixture holding a plurality of radiopaque markers disposed in predefined patterns, said fixture being positionable between a source and an imaging side of a fluoroscope;
a stored table of position of said markers;
an image processor operative on a fluoroscopic image to identify imaged markers in the fluoroscopic image and operative with said stored table to model at least camera image plane piercing point and focus data for said image;
a tracker including a first tracking element securable against movement with respect to said fixture and at least one further tracking element securable against movement with respect to an object of interest such as a tool or a patient, said tracker determining coordinates of said fixture and said object of interest,
wherein said image processor applies said coordinates together with image positions of the identified imaged markers to model the camera data; and
an image display that applies said camera data to display a corrected fluoroscope image for tool guided surgery.
15. The tissue image display system of claim 14, wherein said processor and image display remove images of the markers from the fluoroscope image to display an interpolated unobstructed fluoroscope image.
16. The tissue image display system of claim 14, wherein the stored table is stored in non-volatile memory.
17. The tissue image display system of claim 16, wherein the tracker assembly includes a table of tracker position correction data for a given fluoroscope.
18. The tissue image display system of claim 14, wherein the tracker assembly includes a field generating tracking element and a plurality of field sensing tracking elements, respective ones of the tracking elements being securable against movement with respect to the patient, a tool and the fluoroscope imaging detector such that the system models said camera data and corrects images of the patient and tool irrespective of movement of the patient, the tool and the fluoroscope.
19. A system for use with a fluoroscope camera device for intraoperative tracking during a procedure on a patient, such system being of the type including a display for displaying one or more fluoroscope images together with an indication of probe or tissue status or position, and wherein the system comprises:
a fixed multi-dimensional array of fluoroscopically-imageable markers affixed on or proximate to an imaging assembly of the fluoroscope such that the markers are imaged in each view;
a tracking assembly including at least a first tracking element referenced to a patient and a second tracking element referenced to said fixed array;
a marker localization and identification unit operative on image data to identify in real time the position of markers in each fluoroscopic image and to replace the marker images with smooth image values to thereby create an unobstructed real time patient fluoroscopic view of patient tissue; and
a processor operative on detected image markers for integrating said view with patient position, tool position or preexisting image data to depict a trajectory of the surgical probe in said patient tissue intraoperatively in real time.
20. The system of claim 19, wherein said tracking assembly tracks position of a patient with respect to the marker array, and an image registration unit operates with data from the tracking assembly and the tracked markers to register the dimensionally corrected image to the patient.
21. The system of claim 20, wherein said processor determines a projection geometry camera calibration for each of a plurality of fluoroscope images, and further determines a coordinate system about a region of tissue imaged by the fluoroscope, said coordinate system being common to the tracking assembly and the fluoroscopic view of said region of tissue,
said processor further applying a transformation of image data of said plurality of images to define a fluoro-CT image data set representative of said region of tissue.
22. Apparatus for surgical imaging and probe guidance, such apparatus comprising:
a fluoroscope having a radiation emitting side and an imaging side, said fluoroscope being moveable to provide two-dimensional projection images through the body of a patient placed in its imaging field;
a tracking system effective to simultaneously determine relative position of a patient, the fluoroscope and a surgical probe;
a calibration fixture securing imageable registration markers positioned such that a plurality of said markers are identifiably imaged in every image taken by the fluoroscope, said calibration fixture being affixed to the fluoroscope so that the fixture is tracked by the tracking system;
at least some of said markers lying at different coordinate positions along an imaging axis and being characterized by defined positions with respect to each other such that angle and spacing of images of said markers are effective to characterize camera parameters for relating a volume region of the patient imaged in each fluoroscopic image to camera focus and optical center, and
an image processor that operates on images of said markers to model fluoroscope image projection relative to tracked coordinates for each fluoroscope view such that imaged tool position in successive fluoroscope images taken from different orientations is accurately registered in successive images to track actual movement of the probe with respect to the patient.
23. The surgical imaging system of claim 22, wherein said image processor corrects image coordinates to projected tracked coordinate of a probe, to provide accurate tracking and display of a probe by measurement of probe position and change of display coordinates so as to display probe position on a display image without re-imaging of the tissue or probe.
24. A calibration fixture for use with a fluoroscope, such fixture comprising a radiolucent sheet or body having a major surface and a thickness, said fixture including an array of closely spaced radioopaque markers distributed over a region at or below said major surface at defined coordinate positions and being configured to secure in a fixed position in the imaging beam of a fluoroscope such that a plurality of the markers are imaged in each pose and images of the markers are effective to define a camera model for the pose,
wherein the marker array is lithographically formed together with a co-registered magnetic element.
25. The calibration fixture of claim 24, wherein the sheet or body is formed of a rigid foam configured to have low x-ray absorption and scattering characteristics.
26. The calibration fixture of claim 25, wherein the array includes first and second sub-arrays of markers disposed at first and second offsets from said major surface.
27. The calibration fixture of claim 24, mounted in a radiographic patient support so as to be imaged together with the patient.
28. The calibration fixture of claim 24, adapted to be releasably affixed to a fluoroscope image detector to image the markers in each image formed by the fluoroscope.
29. The calibration fixture of claim 24, wherein the marker array is lithographically formed together with a co-registered magnetic element.
30. A calibration fixture comprising
a sheet body having
(i) an array of radioopaque markers distributed over a major region of the sheet, and
(ii) a magnetic tracking element attached to the sheet in defined relation to the array of markers.
31. The calibration fixture of claim 30, wherein the magnetic tracking element is at least partly formed on the sheet in registration with a pattern of said markers.
US09/560,940 2000-04-28 2000-04-28 Fluoroscopic tracking and visualization system Expired - Lifetime US6484049B1 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
US09/560,940 US6484049B1 (en) 2000-04-28 2000-04-28 Fluoroscopic tracking and visualization system
US09/560,608 US6490475B1 (en) 2000-04-28 2000-04-28 Fluoroscopic tracking and visualization system
AT01930893T ATE515976T1 (en) 2000-04-28 2001-04-30 FLUOROSCOPIC TRACKING AND IMAGING SYSTEM
EP01930893A EP1278458B1 (en) 2000-04-28 2001-04-30 Fluoroscopic tracking and visualization system
PCT/US2001/013734 WO2001087136A2 (en) 2000-04-28 2001-04-30 Fluoroscopic tracking and visualization system
AU2001257384A AU2001257384A1 (en) 2000-04-28 2001-04-30 Fluoroscopic tracking and visualization system
CA002407616A CA2407616A1 (en) 2000-04-28 2001-04-30 Fluoroscopic tracking and visualization system
US10/298,149 US6856826B2 (en) 2000-04-28 2002-11-15 Fluoroscopic tracking and visualization system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/560,940 US6484049B1 (en) 2000-04-28 2000-04-28 Fluoroscopic tracking and visualization system
US09/560,608 US6490475B1 (en) 2000-04-28 2000-04-28 Fluoroscopic tracking and visualization system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US10/298,149 Continuation US6856826B2 (en) 2000-04-28 2002-11-15 Fluoroscopic tracking and visualization system

Publications (1)

Publication Number Publication Date
US6484049B1 true US6484049B1 (en) 2002-11-19

Family

ID=27072400

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/560,940 Expired - Lifetime US6484049B1 (en) 2000-04-28 2000-04-28 Fluoroscopic tracking and visualization system
US09/560,608 Expired - Lifetime US6490475B1 (en) 2000-04-28 2000-04-28 Fluoroscopic tracking and visualization system

Family Applications After (1)

Application Number Title Priority Date Filing Date
US09/560,608 Expired - Lifetime US6490475B1 (en) 2000-04-28 2000-04-28 Fluoroscopic tracking and visualization system

Country Status (6)

Country Link
US (2) US6484049B1 (en)
EP (1) EP1278458B1 (en)
AT (1) ATE515976T1 (en)
AU (1) AU2001257384A1 (en)
CA (1) CA2407616A1 (en)
WO (1) WO2001087136A2 (en)

Cited By (339)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010036245A1 (en) * 1999-02-10 2001-11-01 Kienzle Thomas C. Computer assisted targeting device for use in orthopaedic surgery
US20020156359A1 (en) * 2001-03-16 2002-10-24 Knoplioch Jerome Francois Short/long axis cardiac display protocol
US20030021381A1 (en) * 2001-07-25 2003-01-30 Reiner Koppe Method and device for the registration of two 3D image data sets
US20030088179A1 (en) * 2000-04-28 2003-05-08 Teresa Seeley Fluoroscopic tracking and visualization system
US20030128800A1 (en) * 2001-08-23 2003-07-10 Norbert Strobel Method for detecting the three-dimensional position of an examination instrument inserted into a body region
US20030147492A1 (en) * 2001-04-10 2003-08-07 Roland Proksa Fluoroscopy intervention method with a cone-beam
US20030149364A1 (en) * 2002-02-01 2003-08-07 Ajay Kapur Methods, system and apparatus for digital imaging
US20030212320A1 (en) * 2001-10-01 2003-11-13 Michael Wilk Coordinate registration system for dual modality imaging systems
US20030218720A1 (en) * 2002-02-07 2003-11-27 Olympus Optical Co., Ltd. Three-dimensional observation apparatus and method of three-dimensional observation
US20030235266A1 (en) * 2002-06-11 2003-12-25 Breakaway Imaging, Llc Cantilevered gantry apparatus for x-ray imaging
US20040013225A1 (en) * 2002-03-19 2004-01-22 Breakaway Imaging, Llc Systems and methods for imaging large field-of-view objects
US20040015077A1 (en) * 2002-07-11 2004-01-22 Marwan Sati Apparatus, system and method of calibrating medical imaging systems
US20040013239A1 (en) * 2002-03-13 2004-01-22 Breakaway Imaging, Llc Systems and methods for quasi-simultaneous multi-planar x-ray imaging
US20040022350A1 (en) * 2002-02-15 2004-02-05 Breakaway Imaging, Llc Breakable gantry apparatus for multidimensional x-ray based imaging
US20040034300A1 (en) * 2002-08-19 2004-02-19 Laurent Verard Method and apparatus for virtual endoscopy
US20040082854A1 (en) * 2002-09-12 2004-04-29 Robert Essenreiter X-ray image-assisted navigation using original, two-dimensional x-ray images
US20040143178A1 (en) * 2003-01-21 2004-07-22 Francois Leitner Recording localization device tool positional parameters
WO2004064379A1 (en) * 2003-01-16 2004-07-29 Philips Intellectual Property & Standards Gmbh Method of determining the position of an object in an image
US20040170254A1 (en) * 2002-08-21 2004-09-02 Breakaway Imaging, Llc Gantry positioning apparatus for X-ray imaging
US20040239314A1 (en) * 2003-05-29 2004-12-02 Assaf Govari Hysteresis assessment for metal immunity
US20050004449A1 (en) * 2003-05-20 2005-01-06 Matthias Mitschke Method for marker-less navigation in preoperative 3D images using an intraoperatively acquired 3D C-arm image
US20050024043A1 (en) * 2003-07-31 2005-02-03 Assaf Govari Detection of metal disturbance in a magnetic tracking system
US20050027193A1 (en) * 2003-05-21 2005-02-03 Matthias Mitschke Method for automatically merging a 2D fluoroscopic C-arm image with a preoperative 3D image with one-time use of navigation markers
US20050033164A1 (en) * 2001-07-12 2005-02-10 Takeshi Yatsuo Endoscopic image pickup method and magnetic resonance imaging device using the same
WO2005024729A1 (en) * 2003-09-04 2005-03-17 Philips Intellectual Property & Standards Gmbh Device and method for displaying ultrasound images of a vessel
US20050065513A1 (en) * 2002-03-22 2005-03-24 Irion Klaus M. Medical instrument for the treatment of tissue by means of a high-frequency current and medical system with a medical instrument of this type
US20050149041A1 (en) * 2003-11-14 2005-07-07 Mcginley Brian J. Adjustable surgical cutting systems
US20050154285A1 (en) * 2004-01-02 2005-07-14 Neason Curtis G. System and method for receiving and displaying information pertaining to a patient
US20050154279A1 (en) * 2003-12-31 2005-07-14 Wenguang Li System and method for registering an image with a representation of a probe
US20050209524A1 (en) * 2004-03-10 2005-09-22 General Electric Company System and method for receiving and storing information pertaining to a patient
US20050222509A1 (en) * 2004-04-02 2005-10-06 General Electric Company Electrophysiology system and method
US20050228252A1 (en) * 2004-04-02 2005-10-13 General Electric Company Electrophysiology system and method
US20050228251A1 (en) * 2004-03-30 2005-10-13 General Electric Company System and method for displaying a three-dimensional image of an organ or structure inside the body
US20050262031A1 (en) * 2003-07-21 2005-11-24 Olivier Saidi Systems and methods for treating, diagnosing and predicting the occurrence of a medical condition
US20050281385A1 (en) * 2004-06-02 2005-12-22 Johnson Douglas K Method and system for improved correction of registration error in a fluoroscopic image
US20050288574A1 (en) * 2004-06-23 2005-12-29 Thornton Thomas M Wireless (disposable) fiducial based registration and EM distoration based surface registration
US20060025668A1 (en) * 2004-08-02 2006-02-02 Peterson Thomas H Operating table with embedded tracking technology
US20060063998A1 (en) * 2004-09-21 2006-03-23 Von Jako Ron Navigation and visualization of an access needle system
US20060064005A1 (en) * 2004-09-23 2006-03-23 Innovative Spinal Technologies System and method for externally controlled surgical navigation
US20060089552A1 (en) * 2004-10-05 2006-04-27 Gunter Goldbach Tracking system with scattering effect utilization, in particular with star effect and/or cross effect utilization
US20060115054A1 (en) * 2004-11-12 2006-06-01 General Electric Company System and method for integration of a calibration target into a C-arm
US20060121849A1 (en) * 2003-07-01 2006-06-08 Peter Traneus Anderson Electromagnetic coil array integrated into flat-panel detector
US20060153468A1 (en) * 2002-09-04 2006-07-13 Torsten Solf Imaging system and method for optimizing an x-ray image
US7106825B2 (en) 2002-08-21 2006-09-12 Breakaway Imaging, Llc Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system
US20060229626A1 (en) * 2005-02-22 2006-10-12 Mclean Terry W In-line milling system
US20060293592A1 (en) * 2005-05-13 2006-12-28 General Electric Company System and method for controlling a medical imaging device
US20070016025A1 (en) * 2005-06-28 2007-01-18 Siemens Medical Solutions Usa, Inc. Medical diagnostic imaging three dimensional navigation device and methods
US20070016009A1 (en) * 2005-06-27 2007-01-18 Lakin Ryan C Image guided tracking array and method
US20070073143A1 (en) * 2004-09-03 2007-03-29 Cti Concorde Microsystems, Llc Solid fiduciary marker for multimodality imaging
US20070078466A1 (en) * 2005-09-30 2007-04-05 Restoration Robotics, Inc. Methods for harvesting follicular units using an automated system
US20070100234A1 (en) * 2005-10-27 2007-05-03 Arenson Jerome S Methods and systems for tracking instruments in fluoroscopy
US20070106306A1 (en) * 2005-09-30 2007-05-10 Restoration Robotics, Inc. Automated system for harvesting or implanting follicular units
US20070106307A1 (en) * 2005-09-30 2007-05-10 Restoration Robotics, Inc. Methods for implanting follicular units using an automated system
US20070197908A1 (en) * 2003-10-29 2007-08-23 Ruchala Kenneth J System and method for calibrating and positioning a radiation therapy treatment table
WO2007132381A2 (en) * 2006-05-11 2007-11-22 Koninklijke Philips Electronics N.V. System and method for generating intraoperative 3-dimensional images using non-contrast image data
US20080002809A1 (en) * 2006-06-28 2008-01-03 Mohan Bodduluri Parallel stereovision geometry in image-guided radiosurgery
US20080009715A1 (en) * 2006-05-16 2008-01-10 Markus Kukuk Rotational stereo roadmapping
US20080161684A1 (en) * 2006-10-26 2008-07-03 General Electric Company Systems and methods for integrating a navigation field replaceable unit into a fluoroscopy system
US7398116B2 (en) * 2003-08-11 2008-07-08 Veran Medical Technologies, Inc. Methods, apparatuses, and systems useful in conducting image guided interventions
US20080255414A1 (en) * 2007-04-13 2008-10-16 Ethicon Endo-Surgery, Inc. Fluorescent nanoparticle scope
US7444178B2 (en) 2004-10-05 2008-10-28 Brainlab Ag Positional marker system with point light sources
WO2008130354A1 (en) * 2007-04-24 2008-10-30 Medtronic, Inc. Intraoperative image registration
US20080269588A1 (en) * 2007-04-24 2008-10-30 Medtronic, Inc. Intraoperative Image Registration
US20080285707A1 (en) * 2005-10-24 2008-11-20 Cas Innovations Ag System and Method for Medical Navigation
US20080292046A1 (en) * 2007-05-09 2008-11-27 Estelle Camus Bronchopulmonary medical services system and imaging method
US20080312529A1 (en) * 2007-06-15 2008-12-18 Louis-Philippe Amiot Computer-assisted surgery system and method
US20080319311A1 (en) * 2007-06-22 2008-12-25 General Electric Company System and method for accuracy verification for image based surgical navigation
US7471202B2 (en) 2006-03-29 2008-12-30 General Electric Co. Conformal coil array for a medical tracking system
US20090028291A1 (en) * 2007-07-23 2009-01-29 Rainer Graumann X-ray system and method for image composition
US20090116616A1 (en) * 2007-10-25 2009-05-07 Tomotherapy Incorporated System and method for motion adaptive optimization for radiation therapy delivery
US7532997B2 (en) 2006-04-17 2009-05-12 General Electric Company Electromagnetic tracking using a discretized numerical field model
US20090135992A1 (en) * 2007-11-27 2009-05-28 Regis Vaillant Method for the processing of radiography cardiac images with a view to obtaining a subtracted and registered image
US20090198124A1 (en) * 2008-01-31 2009-08-06 Ralf Adamus Workflow to enhance a transjugular intrahepatic portosystemic shunt procedure
US20090252291A1 (en) * 2007-10-25 2009-10-08 Weiguo Lu System and method for motion adaptive optimization for radiation therapy delivery
US20090264729A1 (en) * 2000-01-10 2009-10-22 Pinhas Gilboa Methods And Systems For Performing Medical Procedures With Reference To Projective Image And With Respect To Pre Stored Images
EP2119397A1 (en) 2008-05-15 2009-11-18 BrainLAB AG Determining calibration information for an x-ray machine
US7657300B2 (en) 1999-10-28 2010-02-02 Medtronic Navigation, Inc. Registration of human anatomy integrated for electromagnetic localization
US7660623B2 (en) 2003-01-30 2010-02-09 Medtronic Navigation, Inc. Six degree of freedom alignment display for medical procedures
US20100069741A1 (en) * 2005-11-23 2010-03-18 General Electric Company System and method for detection of electromagnetic radiation by amorphous silicon x-ray detector for metal detection in x-ray imaging
US20100080415A1 (en) * 2008-09-29 2010-04-01 Restoration Robotics, Inc. Object-tracking systems and methods
US7697972B2 (en) 2002-11-19 2010-04-13 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US20100164950A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Efficient 3-d telestration for local robotic proctoring
US7751865B2 (en) 2003-10-17 2010-07-06 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US7764985B2 (en) 2003-10-20 2010-07-27 Smith & Nephew, Inc. Surgical navigation system component fault interfaces and related processes
US7763035B2 (en) 1997-12-12 2010-07-27 Medtronic Navigation, Inc. Image guided spinal surgery guide, system and method for use thereof
US7797032B2 (en) 1999-10-28 2010-09-14 Medtronic Navigation, Inc. Method and system for navigating a catheter probe in the presence of field-influencing objects
US7831082B2 (en) 2000-06-14 2010-11-09 Medtronic Navigation, Inc. System and method for image based sensor calibration
US20100284601A1 (en) * 2006-09-25 2010-11-11 Mazor Surgical Technologies, Ltd. C-arm computerized tomography system
US7835784B2 (en) 2005-09-21 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for positioning a reference frame
US7835778B2 (en) 2003-10-16 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for surgical navigation of a multiple piece construct for implantation
US20100292565A1 (en) * 2009-05-18 2010-11-18 Andreas Meyer Medical imaging medical device navigation from at least two 2d projections from different angles
US7840253B2 (en) 2003-10-17 2010-11-23 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US7853305B2 (en) 2000-04-07 2010-12-14 Medtronic Navigation, Inc. Trajectory storage apparatus and method for surgical navigation systems
US20100331855A1 (en) * 2005-05-16 2010-12-30 Intuitive Surgical, Inc. Efficient Vision and Kinematic Data Fusion For Robotic Surgical Instruments and Other Applications
US7881770B2 (en) 2000-03-01 2011-02-01 Medtronic Navigation, Inc. Multiple cannula image guided tool for image guided procedures
USRE42194E1 (en) 1997-09-24 2011-03-01 Medtronic Navigation, Inc. Percutaneous registration apparatus and method for use in computer-assisted surgical navigation
US20110054297A1 (en) * 2008-03-03 2011-03-03 Clemens Bulitta Medical system
US7920909B2 (en) 2005-09-13 2011-04-05 Veran Medical Technologies, Inc. Apparatus and method for automatic image guided accuracy verification
US7925328B2 (en) 2003-08-28 2011-04-12 Medtronic Navigation, Inc. Method and apparatus for performing stereotactic surgery
US7953471B2 (en) 2004-05-03 2011-05-31 Medtronic Navigation, Inc. Method and apparatus for implantation between two vertebral bodies
US7966058B2 (en) 2003-12-31 2011-06-21 General Electric Company System and method for registering an image with a representation of a probe
US20110152676A1 (en) * 2009-12-21 2011-06-23 General Electric Company Intra-operative registration for navigated surgical procedures
US7974677B2 (en) 2003-01-30 2011-07-05 Medtronic Navigation, Inc. Method and apparatus for preplanning a surgical procedure
US7996064B2 (en) 1999-03-23 2011-08-09 Medtronic Navigation, Inc. System and method for placing and determining an appropriately sized surgical implant
US7998062B2 (en) 2004-03-29 2011-08-16 Superdimension, Ltd. Endoscope structures and techniques for navigating to a target in branched structure
US8057407B2 (en) 1999-10-28 2011-11-15 Medtronic Navigation, Inc. Surgical sensor
US8060185B2 (en) 2002-11-19 2011-11-15 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US8074662B2 (en) 1999-10-28 2011-12-13 Medtronic Navigation, Inc. Surgical communication and power system
US8109942B2 (en) 2004-04-21 2012-02-07 Smith & Nephew, Inc. Computer-aided methods, systems, and apparatuses for shoulder arthroplasty
US8112292B2 (en) 2006-04-21 2012-02-07 Medtronic Navigation, Inc. Method and apparatus for optimizing a therapy
US8150495B2 (en) 2003-08-11 2012-04-03 Veran Medical Technologies, Inc. Bodily sealants and methods and apparatus for image-guided delivery of same
USRE43328E1 (en) 1997-11-20 2012-04-24 Medtronic Navigation, Inc Image guided awl/tap/screwdriver
US8165658B2 (en) 2008-09-26 2012-04-24 Medtronic, Inc. Method and apparatus for positioning a guide relative to a base
US8175681B2 (en) 2008-12-16 2012-05-08 Medtronic Navigation Inc. Combination of electromagnetic and electropotential localization
US8200314B2 (en) 1992-08-14 2012-06-12 British Telecommunications Public Limited Company Surgical navigation
US8229068B2 (en) 2005-07-22 2012-07-24 Tomotherapy Incorporated System and method of detecting a breathing phase of a patient receiving radiation therapy
US8239001B2 (en) 2003-10-17 2012-08-07 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US8303505B2 (en) 2005-12-02 2012-11-06 Abbott Cardiovascular Systems Inc. Methods and apparatuses for image guided medical procedures
WO2012155050A2 (en) * 2011-05-12 2012-11-15 The Johns Hopkins University Electromagnetic tracking system and methods of using same
USRE43952E1 (en) 1989-10-05 2013-01-29 Medtronic Navigation, Inc. Interactive system for local intervention inside a non-homogeneous structure
US8391952B2 (en) 2007-10-11 2013-03-05 General Electric Company Coil arrangement for an electromagnetic tracking system
US8452068B2 (en) 2008-06-06 2013-05-28 Covidien Lp Hybrid registration method
US20130150865A1 (en) * 2011-12-09 2013-06-13 Samsung Electronics Co., Ltd. Medical robot system and method for controlling the same
US8473032B2 (en) 2008-06-03 2013-06-25 Superdimension, Ltd. Feature-based registration method
US8473026B2 (en) 1994-09-15 2013-06-25 Ge Medical Systems Global Technology Company System for monitoring a position of a medical instrument with respect to a patient's body
US8494614B2 (en) 2009-08-31 2013-07-23 Regents Of The University Of Minnesota Combination localization system
US8494613B2 (en) 2009-08-31 2013-07-23 Medtronic, Inc. Combination localization system
WO2013106926A1 (en) * 2012-01-17 2013-07-25 Sunnybrook Health Sciences Centre Method for three-dimensional localization of an object from a two-dimensional medical image
US20130217952A1 (en) * 2003-07-21 2013-08-22 Vanderbilt University Ophthalmic orbital surgery apparatus and method and image-guided navigation system
AU2011250755B2 (en) * 2005-09-30 2013-08-29 Restoration Robotics, Inc. Automated systems and methods for harvesting and implanting follicular units
US20130245429A1 (en) * 2012-02-28 2013-09-19 Siemens Aktiengesellschaft Robust multi-object tracking using sparse appearance representation and online sparse appearance dictionary update
US8611984B2 (en) 2009-04-08 2013-12-17 Covidien Lp Locatable catheter
US8644907B2 (en) 1999-10-28 2014-02-04 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US8660635B2 (en) 2006-09-29 2014-02-25 Medtronic, Inc. Method and apparatus for optimizing a computer assisted surgical procedure
US8663088B2 (en) 2003-09-15 2014-03-04 Covidien Lp System of accessories for use with bronchoscopes
US8696549B2 (en) 2010-08-20 2014-04-15 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
US20140112529A1 (en) * 2012-10-24 2014-04-24 Samsung Electronics Co., Ltd. Method, apparatus, and system for correcting medical image according to patient's pose variation
CN103767683A (en) * 2012-10-19 2014-05-07 韦伯斯特生物官能(以色列)有限公司 Integration between 3D maps and fluoroscopic images
US8764725B2 (en) 2004-02-09 2014-07-01 Covidien Lp Directional anchoring mechanism, method and applications thereof
US8768437B2 (en) 1998-08-20 2014-07-01 Sofamor Danek Holdings, Inc. Fluoroscopic image guided surgery system with intraoperative registration
US8767917B2 (en) 2005-07-22 2014-07-01 Tomotherapy Incorporated System and method of delivering radiation therapy to a moving region of interest
US8781186B2 (en) 2010-05-04 2014-07-15 Pathfinder Therapeutics, Inc. System and method for abdominal surface matching using pseudo-features
US8792963B2 (en) 2007-09-30 2014-07-29 Intuitive Surgical Operations, Inc. Methods of determining tissue distances using both kinematic robotic tool position information and image-derived position information
US8838199B2 (en) 2002-04-04 2014-09-16 Medtronic Navigation, Inc. Method and apparatus for virtual digital subtraction angiography
US8845655B2 (en) 1999-04-20 2014-09-30 Medtronic Navigation, Inc. Instrument guide system
EP2701605A4 (en) * 2011-04-27 2014-10-01 Univ Virginia Commonwealth 3d tracking of an hdr source using a flat panel detector
US8905920B2 (en) 2007-09-27 2014-12-09 Covidien Lp Bronchoscope adapter and method
US8911453B2 (en) 2010-12-21 2014-12-16 Restoration Robotics, Inc. Methods and systems for directing movement of a tool in hair transplantation procedures
US8932207B2 (en) 2008-07-10 2015-01-13 Covidien Lp Integrated multi-functional endoscopic tool
US9055881B2 (en) 2004-04-26 2015-06-16 Super Dimension Ltd. System and method for image-based alignment of an endoscope
US9078685B2 (en) 2007-02-16 2015-07-14 Globus Medical, Inc. Method and system for performing invasive medical procedures using a surgical robot
US9125611B2 (en) 2010-12-13 2015-09-08 Orthoscan, Inc. Mobile fluoroscopic imaging system
US9138165B2 (en) 2012-02-22 2015-09-22 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US9155592B2 (en) 2009-06-16 2015-10-13 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9168102B2 (en) 2006-01-18 2015-10-27 Medtronic Navigation, Inc. Method and apparatus for providing a container to a sterile environment
US20150355113A1 (en) * 2013-01-25 2015-12-10 Werth Messtechnik Gmbh Method and device for determining the geometry of structures by means of computer tomography
US20160133016A1 (en) * 2006-11-26 2016-05-12 Algotec Systems Ltd. Comparison workflow automation by registration
US9398675B2 (en) 2009-03-20 2016-07-19 Orthoscan, Inc. Mobile imaging apparatus
US9443633B2 (en) 2013-02-26 2016-09-13 Accuray Incorporated Electromagnetically actuated multi-leaf collimator
US9492240B2 (en) 2009-06-16 2016-11-15 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US20160331481A1 (en) * 2002-03-20 2016-11-17 P Tech, Llc Methods of using a robotic spine system
US9498289B2 (en) 2010-12-21 2016-11-22 Restoration Robotics, Inc. Methods and systems for directing movement of a tool in hair transplantation procedures
US9575140B2 (en) 2008-04-03 2017-02-21 Covidien Lp Magnetic interference detection system and method
US20170157770A1 (en) * 2014-06-23 2017-06-08 Abb Schweiz Ag Method for calibrating a robot and a robot system
US9675424B2 (en) 2001-06-04 2017-06-13 Surgical Navigation Technologies, Inc. Method for calibrating a navigation system
US9757087B2 (en) 2002-02-28 2017-09-12 Medtronic Navigation, Inc. Method and apparatus for perspective inversion
US9782229B2 (en) 2007-02-16 2017-10-10 Globus Medical, Inc. Surgical robot platform
US20170347922A1 (en) * 2010-05-06 2017-12-07 Sachin Bhandari Calibration Device for Inertial Sensor Based Surgical Navigation System
US9974525B2 (en) 2014-10-31 2018-05-22 Covidien Lp Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
USD820452S1 (en) 2016-07-21 2018-06-12 Broncus Medical Inc. Radio-opaque marker
US9993273B2 (en) 2013-01-16 2018-06-12 Mako Surgical Corp. Bone plate and tracking device using a bone plate for attaching to a patient's anatomy
US10080615B2 (en) 2015-08-12 2018-09-25 Globus Medical, Inc. Devices and methods for temporary mounting of parts to bone
US10117632B2 (en) 2016-02-03 2018-11-06 Globus Medical, Inc. Portable medical imaging system with beam scanning collimator
US10136954B2 (en) 2012-06-21 2018-11-27 Globus Medical, Inc. Surgical tool systems and method
US10166078B2 (en) * 2015-07-21 2019-01-01 Synaptive Medical (Barbados) Inc. System and method for mapping navigation space to patient space in a medical procedure
US10231791B2 (en) 2012-06-21 2019-03-19 Globus Medical, Inc. Infrared signal based position recognition system for use with a robot-assisted surgery
US10292778B2 (en) 2014-04-24 2019-05-21 Globus Medical, Inc. Surgical instrument holder for use with a robotic surgical system
US10299871B2 (en) 2005-09-30 2019-05-28 Restoration Robotics, Inc. Automated system and method for hair removal
US10335237B2 (en) 2008-04-03 2019-07-02 Brainlab Ag Visual orientation aid for medical instruments
US10350013B2 (en) 2012-06-21 2019-07-16 Globus Medical, Inc. Surgical tool systems and methods
US10357184B2 (en) 2012-06-21 2019-07-23 Globus Medical, Inc. Surgical tool systems and method
US10357257B2 (en) 2014-07-14 2019-07-23 KB Medical SA Anti-skid surgical instrument for use in preparing holes in bone tissue
US10418705B2 (en) 2016-10-28 2019-09-17 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10420616B2 (en) 2017-01-18 2019-09-24 Globus Medical, Inc. Robotic navigation of robotic surgical systems
US10426555B2 (en) 2015-06-03 2019-10-01 Covidien Lp Medical instrument with sensor for use in a system and method for electromagnetic navigation
WO2019155036A3 (en) * 2018-02-11 2019-10-03 St. Jude Medical International Holding S.À R.L. Mechanical design considerations for table-mounted device used as a sub-assembly in a magnetic tracking system working in conjunction with an x-ray imaging system
US10446931B2 (en) 2016-10-28 2019-10-15 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10448910B2 (en) 2016-02-03 2019-10-22 Globus Medical, Inc. Portable medical imaging system
US10456601B2 (en) * 2013-07-17 2019-10-29 Vision Rt Limited Method of calibration of a stereoscopic camera system for use with a radio therapy treatment apparatus
US10478254B2 (en) 2016-05-16 2019-11-19 Covidien Lp System and method to access lung tissue
CN110495899A (en) * 2018-05-16 2019-11-26 西门子医疗有限公司 The method for determining the method and apparatus of geometry calibration and determining associated data
US10517505B2 (en) 2016-10-28 2019-12-31 Covidien Lp Systems, methods, and computer-readable media for optimizing an electromagnetic navigation system
US10531926B2 (en) 2016-05-23 2020-01-14 Mako Surgical Corp. Systems and methods for identifying and tracking physical objects during a robotic surgical procedure
US10531925B2 (en) 2013-01-16 2020-01-14 Stryker Corporation Navigation systems and methods for indicating and reducing line-of-sight errors
US10537395B2 (en) 2016-05-26 2020-01-21 MAKO Surgical Group Navigation tracker with kinematic connector assembly
US10546423B2 (en) 2015-02-03 2020-01-28 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US10548620B2 (en) 2014-01-15 2020-02-04 Globus Medical, Inc. Notched apparatus for guidance of an insertable instrument along an axis during spinal surgery
US10555775B2 (en) 2005-05-16 2020-02-11 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US10555782B2 (en) 2015-02-18 2020-02-11 Globus Medical, Inc. Systems and methods for performing minimally invasive spinal surgery with a robotic surgical system using a percutaneous technique
US10573023B2 (en) 2018-04-09 2020-02-25 Globus Medical, Inc. Predictive visualization of medical imaging scanner component movement
US10569794B2 (en) 2015-10-13 2020-02-25 Globus Medical, Inc. Stabilizer wheel assembly and methods of use
US10582834B2 (en) 2010-06-15 2020-03-10 Covidien Lp Locatable expandable working channel and method
US10615500B2 (en) 2016-10-28 2020-04-07 Covidien Lp System and method for designing electromagnetic navigation antenna assemblies
US10617324B2 (en) 2014-04-23 2020-04-14 Veran Medical Technologies, Inc Apparatuses and methods for endobronchial navigation to and confirmation of the location of a target tissue and percutaneous interception of the target tissue
US10624710B2 (en) 2012-06-21 2020-04-21 Globus Medical, Inc. System and method for measuring depth of instrumentation
US10624701B2 (en) 2014-04-23 2020-04-21 Veran Medical Technologies, Inc. Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter
US10638952B2 (en) 2016-10-28 2020-05-05 Covidien Lp Methods, systems, and computer-readable media for calibrating an electromagnetic navigation system
US10646283B2 (en) 2018-02-19 2020-05-12 Globus Medical Inc. Augmented reality navigation systems for use with robotic surgical systems and methods of their use
US10646298B2 (en) 2015-07-31 2020-05-12 Globus Medical, Inc. Robot arm and methods of use
US10646280B2 (en) 2012-06-21 2020-05-12 Globus Medical, Inc. System and method for surgical tool insertion using multiaxis force and moment feedback
US10653497B2 (en) 2006-02-16 2020-05-19 Globus Medical, Inc. Surgical tool systems and methods
US10660712B2 (en) 2011-04-01 2020-05-26 Globus Medical Inc. Robotic system and method for spinal and other surgeries
US10674982B2 (en) 2015-08-06 2020-06-09 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
US10675094B2 (en) 2017-07-21 2020-06-09 Globus Medical Inc. Robot surgical platform
US10687905B2 (en) 2015-08-31 2020-06-23 KB Medical SA Robotic surgical systems and methods
US10699448B2 (en) 2017-06-29 2020-06-30 Covidien Lp System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data
US10702226B2 (en) 2015-08-06 2020-07-07 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
US10716525B2 (en) 2015-08-06 2020-07-21 Covidien Lp System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction
US10722311B2 (en) 2016-10-28 2020-07-28 Covidien Lp System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map
US10751126B2 (en) 2016-10-28 2020-08-25 Covidien Lp System and method for generating a map for electromagnetic navigation
US10758315B2 (en) 2012-06-21 2020-09-01 Globus Medical Inc. Method and system for improving 2D-3D registration convergence
US10765438B2 (en) 2014-07-14 2020-09-08 KB Medical SA Anti-skid surgical instrument for use in preparing holes in bone tissue
WO2020186075A1 (en) * 2019-03-12 2020-09-17 Micro C, LLC Method of fluoroscopic surgical registration
US10792106B2 (en) 2016-10-28 2020-10-06 Covidien Lp System for calibrating an electromagnetic navigation system
US10799298B2 (en) 2012-06-21 2020-10-13 Globus Medical Inc. Robotic fluoroscopic navigation
US10806471B2 (en) 2017-01-18 2020-10-20 Globus Medical, Inc. Universal instrument guide for robotic surgical systems, surgical instrument systems, and methods of their use
US10813704B2 (en) 2013-10-04 2020-10-27 Kb Medical, Sa Apparatus and systems for precise guidance of surgical tools
US10828120B2 (en) 2014-06-19 2020-11-10 Kb Medical, Sa Systems and methods for performing minimally invasive surgery
CN111956327A (en) * 2020-07-27 2020-11-20 季鹰 Image measuring and registering method
US10842453B2 (en) 2016-02-03 2020-11-24 Globus Medical, Inc. Portable medical imaging system
US10842461B2 (en) 2012-06-21 2020-11-24 Globus Medical, Inc. Systems and methods of checking registrations for surgical systems
US10866119B2 (en) 2016-03-14 2020-12-15 Globus Medical, Inc. Metal detector for detecting insertion of a surgical device into a hollow tube
US10864057B2 (en) 2017-01-18 2020-12-15 Kb Medical, Sa Universal instrument guide for robotic surgical systems, surgical instrument systems, and methods of their use
US10874466B2 (en) 2012-06-21 2020-12-29 Globus Medical, Inc. System and method for surgical tool insertion using multiaxis force and moment feedback
US10893842B2 (en) 2018-02-08 2021-01-19 Covidien Lp System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US10893843B2 (en) 2017-10-10 2021-01-19 Covidien Lp System and method for identifying and marking a target in a fluoroscopic three-dimensional reconstruction
US10893912B2 (en) 2006-02-16 2021-01-19 Globus Medical Inc. Surgical tool systems and methods
US10898252B2 (en) 2017-11-09 2021-01-26 Globus Medical, Inc. Surgical robotic systems for bending surgical rods, and related methods and devices
US10905498B2 (en) 2018-02-08 2021-02-02 Covidien Lp System and method for catheter detection in fluoroscopic images and updating displayed position of catheter
US10925681B2 (en) 2015-07-31 2021-02-23 Globus Medical Inc. Robot arm and methods of use
US10939968B2 (en) 2014-02-11 2021-03-09 Globus Medical Inc. Sterile handle for controlling a robotic surgical system from a sterile field
US10952593B2 (en) 2014-06-10 2021-03-23 Covidien Lp Bronchoscope adapter
US10973594B2 (en) 2015-09-14 2021-04-13 Globus Medical, Inc. Surgical robotic systems and methods thereof
US11006921B2 (en) 2016-09-15 2021-05-18 Oxos Medical, Inc. Imaging systems and methods
US11006914B2 (en) 2015-10-28 2021-05-18 Medtronic Navigation, Inc. Apparatus and method for maintaining image quality while minimizing x-ray dosage of a patient
US11039893B2 (en) 2016-10-21 2021-06-22 Globus Medical, Inc. Robotic surgical systems
US11045267B2 (en) 2012-06-21 2021-06-29 Globus Medical, Inc. Surgical robotic automation with tracking markers
US11045179B2 (en) 2019-05-20 2021-06-29 Globus Medical Inc. Robot-mounted retractor system
US11051886B2 (en) 2016-09-27 2021-07-06 Covidien Lp Systems and methods for performing a surgical navigation procedure
CN113081009A (en) * 2015-04-15 2021-07-09 莫比乌斯成像公司 Integrated medical imaging and surgical robotic system
US11058378B2 (en) 2016-02-03 2021-07-13 Globus Medical, Inc. Portable medical imaging system
US11071594B2 (en) 2017-03-16 2021-07-27 KB Medical SA Robotic navigation of robotic surgical systems
US11103316B2 (en) 2014-12-02 2021-08-31 Globus Medical Inc. Robot assisted volume removal during surgery
US11116576B2 (en) 2012-06-21 2021-09-14 Globus Medical Inc. Dynamic reference arrays and methods of use
CN113473915A (en) * 2019-01-15 2021-10-01 皇家飞利浦有限公司 Real-time tracking of fused ultrasound and X-ray images
US11134862B2 (en) 2017-11-10 2021-10-05 Globus Medical, Inc. Methods of selecting surgical implants and related devices
US11153555B1 (en) 2020-05-08 2021-10-19 Globus Medical Inc. Extended reality headset camera system for computer assisted navigation in surgery
CN113520593A (en) * 2020-04-15 2021-10-22 史赛克欧洲运营有限公司 Techniques for determining the position of one or more imaging markers in an image coordinate system
US11172895B2 (en) 2015-12-07 2021-11-16 Covidien Lp Visualization, navigation, and planning with electromagnetic navigation bronchoscopy and cone beam computed tomography integrated
US11207047B2 (en) * 2018-08-01 2021-12-28 Oxos Medical, Inc. Imaging systems and methods
US11207150B2 (en) 2020-02-19 2021-12-28 Globus Medical, Inc. Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
CN113873936A (en) * 2019-03-26 2021-12-31 皇家飞利浦有限公司 Continuous guidewire identification
CN113905685A (en) * 2019-05-23 2022-01-07 伯恩森斯韦伯斯特(以色列)有限责任公司 Probe with radiopaque label
US11219489B2 (en) 2017-10-31 2022-01-11 Covidien Lp Devices and systems for providing sensors in parallel with medical tools
US11241165B2 (en) 2017-12-05 2022-02-08 St. Jude Medical International Holding S.À R.L. Magnetic sensor for tracking the location of an object
US11253216B2 (en) 2020-04-28 2022-02-22 Globus Medical Inc. Fixtures for fluoroscopic imaging systems and related navigation systems and methods
US11253327B2 (en) 2012-06-21 2022-02-22 Globus Medical, Inc. Systems and methods for automatically changing an end-effector on a surgical robot
US11278360B2 (en) 2018-11-16 2022-03-22 Globus Medical, Inc. End-effectors for surgical robotic systems having sealed optical components
US11298196B2 (en) 2012-06-21 2022-04-12 Globus Medical Inc. Surgical robotic automation with tracking markers and controlled tool advancement
US11304629B2 (en) 2005-09-13 2022-04-19 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US11317971B2 (en) 2012-06-21 2022-05-03 Globus Medical, Inc. Systems and methods related to robotic guidance in surgery
US11317973B2 (en) 2020-06-09 2022-05-03 Globus Medical, Inc. Camera tracking bar for computer assisted navigation during surgery
US11317978B2 (en) 2019-03-22 2022-05-03 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11331150B2 (en) 1999-10-28 2022-05-17 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US11337742B2 (en) 2018-11-05 2022-05-24 Globus Medical Inc Compliant orthopedic driver
US11357548B2 (en) 2017-11-09 2022-06-14 Globus Medical, Inc. Robotic rod benders and related mechanical and motor housings
US11382699B2 (en) 2020-02-10 2022-07-12 Globus Medical Inc. Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery
US11382549B2 (en) 2019-03-22 2022-07-12 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11382713B2 (en) 2020-06-16 2022-07-12 Globus Medical, Inc. Navigated surgical system with eye to XR headset display calibration
US11382700B2 (en) 2020-05-08 2022-07-12 Globus Medical Inc. Extended reality headset tool tracking and control
US11395706B2 (en) 2012-06-21 2022-07-26 Globus Medical Inc. Surgical robot platform
US11399900B2 (en) 2012-06-21 2022-08-02 Globus Medical, Inc. Robotic systems providing co-registration using natural fiducials and related methods
US20220257206A1 (en) * 2018-04-17 2022-08-18 The Board Of Trustees Of Leland Stanford Junior University Augmented Fluoroscopy with Digital Subtraction Imaging
US11419616B2 (en) 2019-03-22 2022-08-23 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11426178B2 (en) 2019-09-27 2022-08-30 Globus Medical Inc. Systems and methods for navigating a pin guide driver
US11439444B1 (en) 2021-07-22 2022-09-13 Globus Medical, Inc. Screw tower and rod reduction tool
US11464581B2 (en) 2020-01-28 2022-10-11 Globus Medical, Inc. Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums
US11510750B2 (en) 2020-05-08 2022-11-29 Globus Medical, Inc. Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
US11510684B2 (en) 2019-10-14 2022-11-29 Globus Medical, Inc. Rotary motion passive end effector for surgical robots in orthopedic surgeries
US11523785B2 (en) 2020-09-24 2022-12-13 Globus Medical, Inc. Increased cone beam computed tomography volume length without requiring stitching or longitudinal C-arm movement
US11571265B2 (en) 2019-03-22 2023-02-07 Globus Medical Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11571171B2 (en) 2019-09-24 2023-02-07 Globus Medical, Inc. Compound curve cable chain
US11589771B2 (en) 2012-06-21 2023-02-28 Globus Medical Inc. Method for recording probe movement and determining an extent of matter removed
US11602402B2 (en) 2018-12-04 2023-03-14 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11607149B2 (en) 2012-06-21 2023-03-21 Globus Medical Inc. Surgical tool systems and method
US11628023B2 (en) 2019-07-10 2023-04-18 Globus Medical, Inc. Robotic navigational system for interbody implants
US11627924B2 (en) 2019-09-24 2023-04-18 Covidien Lp Systems and methods for image-guided navigation of percutaneously-inserted devices
US11642182B2 (en) * 2016-09-27 2023-05-09 Brainlab Ag Efficient positioning of a mechatronic arm
US11717350B2 (en) 2020-11-24 2023-08-08 Globus Medical Inc. Methods for robotic assistance and navigation in spinal surgery and related systems
CN116593121A (en) * 2023-07-12 2023-08-15 中国航空工业集团公司沈阳空气动力研究所 Aircraft model vibration measurement method based on monitoring camera
US11737831B2 (en) 2020-09-02 2023-08-29 Globus Medical Inc. Surgical object tracking template generation for computer assisted navigation during surgical procedure
US11744655B2 (en) 2018-12-04 2023-09-05 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11779732B2 (en) 2016-11-21 2023-10-10 St Jude Medical International Holding S.À R.L. Medical device sensor
US11786324B2 (en) 2012-06-21 2023-10-17 Globus Medical, Inc. Surgical robotic automation with tracking markers
US11794338B2 (en) 2017-11-09 2023-10-24 Globus Medical Inc. Robotic rod benders and related mechanical and motor housings
US11793588B2 (en) 2020-07-23 2023-10-24 Globus Medical, Inc. Sterile draping of robotic arms
US11793570B2 (en) 2012-06-21 2023-10-24 Globus Medical Inc. Surgical robotic automation with tracking markers
US11806084B2 (en) 2019-03-22 2023-11-07 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11850009B2 (en) 2021-07-06 2023-12-26 Globus Medical, Inc. Ultrasonic robotic surgical navigation
US11857149B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. Surgical robotic systems with target trajectory deviation monitoring and related methods
US11857266B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. System for a surveillance marker in robotic-assisted surgery
US11864745B2 (en) 2012-06-21 2024-01-09 Globus Medical, Inc. Surgical robotic system with retractor
US11864839B2 (en) 2012-06-21 2024-01-09 Globus Medical Inc. Methods of adjusting a virtual implant and related surgical navigation systems
US11864857B2 (en) 2019-09-27 2024-01-09 Globus Medical, Inc. Surgical robot with passive end effector
US11877807B2 (en) 2020-07-10 2024-01-23 Globus Medical, Inc Instruments for navigated orthopedic surgeries
US11883217B2 (en) 2016-02-03 2024-01-30 Globus Medical, Inc. Portable medical imaging system and method
US11890066B2 (en) 2019-09-30 2024-02-06 Globus Medical, Inc Surgical robot with passive end effector
US11896446B2 (en) 2012-06-21 2024-02-13 Globus Medical, Inc Surgical robotic automation with tracking markers
US11911112B2 (en) 2020-10-27 2024-02-27 Globus Medical, Inc. Robotic navigational system
US11911115B2 (en) 2021-12-20 2024-02-27 Globus Medical Inc. Flat panel registration fixture and method of using same
US11918313B2 (en) 2019-03-15 2024-03-05 Globus Medical Inc. Active end effectors for surgical robots
US11941814B2 (en) 2020-11-04 2024-03-26 Globus Medical Inc. Auto segmentation using 2-D images taken during 3-D imaging spin
US11944325B2 (en) 2019-03-22 2024-04-02 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11963755B2 (en) 2012-06-21 2024-04-23 Globus Medical Inc. Apparatus for recording probe movement
US11974822B2 (en) 2012-06-21 2024-05-07 Globus Medical Inc. Method for a surveillance marker in robotic-assisted surgery
US11974886B2 (en) 2016-04-11 2024-05-07 Globus Medical Inc. Surgical tool systems and methods
US11992373B2 (en) 2019-12-10 2024-05-28 Globus Medical, Inc Augmented reality headset with varied opacity for navigated robotic surgery
US12004905B2 (en) 2012-06-21 2024-06-11 Globus Medical, Inc. Medical imaging systems using robotic actuators and related methods
US12048493B2 (en) 2022-03-31 2024-07-30 Globus Medical, Inc. Camera tracking system identifying phantom markers during computer assisted surgery navigation
EP4375934A3 (en) * 2016-02-12 2024-07-31 Intuitive Surgical Operations, Inc. Systems and methods of pose estimation and calibration of perspective imaging system in image guided surgery
US12059804B2 (en) 2019-05-22 2024-08-13 Mako Surgical Corp. Bidirectional kinematic mount
US12064189B2 (en) 2019-12-13 2024-08-20 Globus Medical, Inc. Navigated instrument for use in robotic guided surgery
US12070276B2 (en) 2020-06-09 2024-08-27 Globus Medical Inc. Surgical object tracking in visible light via fiducial seeding and synthetic image registration
US12070286B2 (en) 2021-01-08 2024-08-27 Globus Medical, Inc System and method for ligament balancing with robotic assistance
US12076091B2 (en) 2020-10-27 2024-09-03 Globus Medical, Inc. Robotic navigational system
US12082886B2 (en) 2017-04-05 2024-09-10 Globus Medical Inc. Robotic surgical systems for preparing holes in bone tissue and methods of their use
US12089902B2 (en) 2019-07-30 2024-09-17 Covidien Lp Cone beam and 3D fluoroscope lung navigation
US12103480B2 (en) 2022-03-18 2024-10-01 Globus Medical Inc. Omni-wheel cable pusher
US12121278B2 (en) 2023-12-05 2024-10-22 Globus Medical, Inc. Compliant orthopedic driver

Families Citing this family (155)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7074179B2 (en) * 1992-08-10 2006-07-11 Intuitive Surgical Inc Method and apparatus for performing minimally invasive cardiac procedures
US6132441A (en) 1996-11-22 2000-10-17 Computer Motion, Inc. Rigidly-linked articulating wrist with decoupled motion transmission
US20030135115A1 (en) * 1997-11-24 2003-07-17 Burdette Everette C. Method and apparatus for spatial registration and mapping of a biopsy needle during a tissue biopsy
DE19930408A1 (en) * 1999-07-02 2001-01-04 Zeiss Carl Fa An optical coherence computer tomography (OCT) system for surgical support combines pre-operation tissue position data and CT data for use in navigation assistance for the operating surgeon
US6991656B2 (en) 2000-04-26 2006-01-31 Dana Mears Method and apparatus for performing a minimally invasive total hip arthroplasty
US6856826B2 (en) * 2000-04-28 2005-02-15 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
JP2001325614A (en) * 2000-05-15 2001-11-22 Sony Corp Device and method for processing three-dimensional model and program providing medium
US6782287B2 (en) * 2000-06-27 2004-08-24 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for tracking a medical instrument based on image registration
US6837892B2 (en) * 2000-07-24 2005-01-04 Mazor Surgical Technologies Ltd. Miniature bone-mounted surgical robot
US6584339B2 (en) * 2001-06-27 2003-06-24 Vanderbilt University Method and apparatus for collecting and processing physical space data for use while performing image-guided surgery
US7286866B2 (en) * 2001-11-05 2007-10-23 Ge Medical Systems Global Technology Company, Llc Method, system and computer product for cardiac interventional procedure planning
US7311705B2 (en) 2002-02-05 2007-12-25 Medtronic, Inc. Catheter apparatus for treatment of heart arrhythmia
DE10210287B4 (en) * 2002-03-08 2004-01-22 Siemens Ag Method and device for markerless registration for navigation-guided interventions
US7499743B2 (en) * 2002-03-15 2009-03-03 General Electric Company Method and system for registration of 3D images within an interventional system
US7346381B2 (en) * 2002-11-01 2008-03-18 Ge Medical Systems Global Technology Company Llc Method and apparatus for medical intervention procedure planning
US7927368B2 (en) 2002-03-25 2011-04-19 Kieran Murphy Llc Device viewable under an imaging beam
US20030181810A1 (en) * 2002-03-25 2003-09-25 Murphy Kieran P. Kit for image guided surgical procedures
US9375203B2 (en) 2002-03-25 2016-06-28 Kieran Murphy Llc Biopsy needle
ES2225680T3 (en) * 2002-04-16 2005-03-16 Brainlab Ag MARKER FOR AN INSTRUMENT AND PROCEDURE FOR THE LOCATION OF A MARKER.
US7778686B2 (en) * 2002-06-04 2010-08-17 General Electric Company Method and apparatus for medical intervention procedure planning and location and navigation of an intervention tool
US6978167B2 (en) * 2002-07-01 2005-12-20 Claron Technology Inc. Video pose tracking system and method
GB0216893D0 (en) * 2002-07-20 2002-08-28 Univ Surrey Image colouring
US6987834B2 (en) * 2003-01-09 2006-01-17 Ge Medical Systems Global Technology Company, Llc Optimized record technique selection in radiography and fluoroscopy applications
US7505809B2 (en) * 2003-01-13 2009-03-17 Mediguide Ltd. Method and system for registering a first image with a second image relative to the body of a patient
US20050267354A1 (en) * 2003-02-04 2005-12-01 Joel Marquart System and method for providing computer assistance with spinal fixation procedures
DE10313829B4 (en) * 2003-03-21 2005-06-09 Aesculap Ag & Co. Kg Method and device for selecting an image section from an operating area
US20050049672A1 (en) * 2003-03-24 2005-03-03 Murphy Kieran P. Stent delivery system and method using a balloon for a self-expandable stent
US20040236216A1 (en) * 2003-04-15 2004-11-25 Manjeshwar Ravindra Mohan System and method for simulating imaging data
US7983735B2 (en) * 2003-04-15 2011-07-19 General Electric Company Simulation of nuclear medical imaging
US7747047B2 (en) * 2003-05-07 2010-06-29 Ge Medical Systems Global Technology Company, Llc Cardiac CT system and method for planning left atrial appendage isolation
US7343196B2 (en) * 2003-05-09 2008-03-11 Ge Medical Systems Global Technology Company Llc Cardiac CT system and method for planning and treatment of biventricular pacing using epicardial lead
US7565190B2 (en) * 2003-05-09 2009-07-21 Ge Medical Systems Global Technology Company, Llc Cardiac CT system and method for planning atrial fibrillation intervention
US20050010105A1 (en) * 2003-07-01 2005-01-13 Sra Jasbir S. Method and system for Coronary arterial intervention
US7344543B2 (en) * 2003-07-01 2008-03-18 Medtronic, Inc. Method and apparatus for epicardial left atrial appendage isolation in patients with atrial fibrillation
US7813785B2 (en) * 2003-07-01 2010-10-12 General Electric Company Cardiac imaging system and method for planning minimally invasive direct coronary artery bypass surgery
EP1646317A1 (en) * 2003-07-10 2006-04-19 Koninklijke Philips Electronics N.V. Apparatus and method for navigating an instrument through an anatomical structure
US7209538B2 (en) * 2003-08-07 2007-04-24 Xoran Technologies, Inc. Intraoperative stereo imaging system
WO2005013841A1 (en) * 2003-08-07 2005-02-17 Xoran Technologies, Inc. Intra-operative ct scanner
US20050049485A1 (en) * 2003-08-27 2005-03-03 Harmon Kim R. Multiple configuration array for a surgical navigation system
US20060009755A1 (en) * 2003-09-04 2006-01-12 Sra Jasbir S Method and system for ablation of atrial fibrillation and other cardiac arrhythmias
US20050054918A1 (en) * 2003-09-04 2005-03-10 Sra Jasbir S. Method and system for treatment of atrial fibrillation and other cardiac arrhythmias
US7153268B2 (en) * 2003-09-09 2006-12-26 General Electric Company Motion adaptive frame averaging for ultrasound doppler color flow imaging
US8354837B2 (en) * 2003-09-24 2013-01-15 Ge Medical Systems Global Technology Company Llc System and method for electromagnetic tracking operable with multiple coil architectures
DE102004001858A1 (en) * 2003-10-22 2005-05-25 Schaerer Mayfield Technologies Gmbh Procedure for fluoroscopy-based neuronavigation
US7308299B2 (en) * 2003-10-22 2007-12-11 General Electric Company Method, apparatus and product for acquiring cardiac images
US7274811B2 (en) * 2003-10-31 2007-09-25 Ge Medical Systems Global Technology Company, Llc Method and apparatus for synchronizing corresponding landmarks among a plurality of images
US7308297B2 (en) * 2003-11-05 2007-12-11 Ge Medical Systems Global Technology Company, Llc Cardiac imaging system and method for quantification of desynchrony of ventricles for biventricular pacing
US20050143777A1 (en) * 2003-12-19 2005-06-30 Sra Jasbir S. Method and system of treatment of heart failure using 4D imaging
US20050137661A1 (en) * 2003-12-19 2005-06-23 Sra Jasbir S. Method and system of treatment of cardiac arrhythmias using 4D imaging
US7103136B2 (en) * 2003-12-22 2006-09-05 General Electric Company Fluoroscopic tomosynthesis system and method
US7454248B2 (en) * 2004-01-30 2008-11-18 Ge Medical Systems Global Technology, Llc Method, apparatus and product for acquiring cardiac images
US8126224B2 (en) * 2004-02-03 2012-02-28 Ge Medical Systems Global Technology Company, Llc Method and apparatus for instrument tracking on a scrolling series of 2D fluoroscopic images
US20050187562A1 (en) * 2004-02-03 2005-08-25 Grimm James E. Orthopaedic component inserter for use with a surgical navigation system
JP4783742B2 (en) * 2004-02-20 2011-09-28 パチェコ,ヘクター,オー. Method for improving pedicle screw placement in spinal surgery
US20050215888A1 (en) * 2004-03-05 2005-09-29 Grimm James E Universal support arm and tracking array
US20060052691A1 (en) * 2004-03-05 2006-03-09 Hall Maleata Y Adjustable navigated tracking element mount
US20050203539A1 (en) * 2004-03-08 2005-09-15 Grimm James E. Navigated stemmed orthopaedic implant inserter
US7993341B2 (en) * 2004-03-08 2011-08-09 Zimmer Technology, Inc. Navigated orthopaedic guide and method
US8114086B2 (en) 2004-03-08 2012-02-14 Zimmer Technology, Inc. Navigated cut guide locator
JP4630564B2 (en) 2004-03-30 2011-02-09 国立大学法人浜松医科大学 Surgery support apparatus, method and program
US20050228270A1 (en) * 2004-04-02 2005-10-13 Lloyd Charles F Method and system for geometric distortion free tracking of 3-dimensional objects from 2-dimensional measurements
US20050222793A1 (en) * 2004-04-02 2005-10-06 Lloyd Charles F Method and system for calibrating deformed instruments
US20060036148A1 (en) * 2004-07-23 2006-02-16 Grimm James E Navigated surgical sizing guide
US8021162B2 (en) * 2004-08-06 2011-09-20 The Chinese University Of Hong Kong Navigation surgical training model, apparatus having the same and method thereof
US7634122B2 (en) 2004-08-25 2009-12-15 Brainlab Ag Registering intraoperative scans
DE502004009087D1 (en) * 2004-08-25 2009-04-16 Brainlab Ag Registration of intraoperative scans
US8515527B2 (en) * 2004-10-13 2013-08-20 General Electric Company Method and apparatus for registering 3D models of anatomical regions of a heart and a tracking system with projection images of an interventional fluoroscopic system
US7327872B2 (en) * 2004-10-13 2008-02-05 General Electric Company Method and system for registering 3D models of anatomical regions with projection images of the same
KR100702148B1 (en) 2004-12-30 2007-03-30 한국전기연구원 X-ray computed tomography apparatus to acquire the tomography and three-dimension surface image
US20060161059A1 (en) * 2005-01-20 2006-07-20 Zimmer Technology, Inc. Variable geometry reference array
US20060173268A1 (en) * 2005-01-28 2006-08-03 General Electric Company Methods and systems for controlling acquisition of images
CN101528122B (en) * 2005-03-07 2011-09-07 赫克托·O·帕切科 Drivepipe used for inserting into vertebral pedicle
EP1859407A1 (en) 2005-03-10 2007-11-28 Koninklijke Philips Electronics N.V. Image processing system and method for registration of two-dimensional with three-dimensional volume data during interventional procedures
US8147503B2 (en) * 2007-09-30 2012-04-03 Intuitive Surgical Operations Inc. Methods of locating and tracking robotic instruments in robotic surgical systems
US8108072B2 (en) * 2007-09-30 2012-01-31 Intuitive Surgical Operations, Inc. Methods and systems for robotic instrument tool tracking with adaptive fusion of kinematics information and image information
DE102005032523B4 (en) * 2005-07-12 2009-11-05 Siemens Ag Method for the pre-interventional planning of a 2D fluoroscopy projection
US7640121B2 (en) * 2005-11-30 2009-12-29 General Electric Company System and method for disambiguating the phase of a field received from a transmitter in an electromagnetic tracking system
US20070156066A1 (en) * 2006-01-03 2007-07-05 Zimmer Technology, Inc. Device for determining the shape of an anatomic surface
US7520880B2 (en) * 2006-01-09 2009-04-21 Zimmer Technology, Inc. Adjustable surgical support base with integral hinge
US7744600B2 (en) 2006-01-10 2010-06-29 Zimmer Technology, Inc. Bone resection guide and method
KR101286362B1 (en) * 2006-01-24 2013-07-15 루카디아 6, 엘엘씨 Methods for determining the pedicle base circumference and the pedicle isthmus for enabling optimum screw or instrument placement in a pedicle of a vertebral body during spinal surgery
US8526688B2 (en) * 2006-03-09 2013-09-03 General Electric Company Methods and systems for registration of surgical navigation data and image data
CA2644574C (en) 2006-03-17 2016-11-08 Zimmer, Inc. Methods of predetermining the contour of a resected bone surface and assessing the fit of a prosthesis on the bone
US20070225725A1 (en) * 2006-03-21 2007-09-27 Zimmer Technology, Inc. Modular acetabular component inserter
US8121361B2 (en) 2006-05-19 2012-02-21 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
DE102006024425A1 (en) * 2006-05-24 2007-11-29 Siemens Ag Medical instrument e.g. catheter, localizing method during electrophysiological procedure, involves obtaining position information of instrument using electromagnetic localization system, and recording two-dimensional X-ray images
CN100399997C (en) * 2006-06-22 2008-07-09 上海交通大学 Turn moving type C type arm alignment target
CN100394890C (en) * 2006-06-22 2008-06-18 上海交通大学 Disassembling C type arm alignment target
JP5052608B2 (en) * 2006-06-28 2012-10-17 パチェコ,ヘクター,オー. Method of operating an apparatus for determining the size and / or placement of a prosthetic disc
US8040127B2 (en) * 2006-08-15 2011-10-18 General Electric Company Multi-sensor distortion mapping method and system
US8022990B2 (en) 2006-08-18 2011-09-20 General Electric Company Systems and methods for on-line marker-less camera calibration using a position tracking system
US10016148B2 (en) * 2006-09-27 2018-07-10 General Electric Company Method and apparatus for correction of multiple EM sensor positions
US7892165B2 (en) * 2006-10-23 2011-02-22 Hoya Corporation Camera calibration for endoscope navigation system
US20080118116A1 (en) * 2006-11-20 2008-05-22 General Electric Company Systems and methods for tracking a surgical instrument and for conveying tracking information via a network
US7671887B2 (en) * 2006-11-20 2010-03-02 General Electric Company System and method of navigating a medical instrument
US7995818B2 (en) * 2006-11-22 2011-08-09 General Electric Company Systems and methods for synchronized image viewing with an image atlas
EP1925256A1 (en) * 2006-11-24 2008-05-28 BrainLAB AG Method and device for registering an anatomical structure with markers
US20080132757A1 (en) * 2006-12-01 2008-06-05 General Electric Company System and Method for Performing Minimally Invasive Surgery Using a Multi-Channel Catheter
US20080139929A1 (en) * 2006-12-06 2008-06-12 General Electric Company System and method for tracking an invasive surgical instrument while imaging a patient
US7933730B2 (en) * 2006-12-21 2011-04-26 General Electric Co. Method and system for restoration of a navigation data loss in image-guided navigation
US7780349B2 (en) * 2007-01-03 2010-08-24 James G. Schwade Apparatus and method for robotic radiosurgery beam geometry quality assurance
US20080183064A1 (en) * 2007-01-30 2008-07-31 General Electric Company Multi-sensor distortion detection method and system
EP2123232A4 (en) * 2007-01-31 2011-02-16 Nat University Corp Hamamatsu University School Of Medicine Device for displaying assistance information for surgical operation, method for displaying assistance information for surgical operation, and program for displaying assistance information for surgical operation
US8108025B2 (en) * 2007-04-24 2012-01-31 Medtronic, Inc. Flexible array for use in navigated surgery
US8734466B2 (en) * 2007-04-25 2014-05-27 Medtronic, Inc. Method and apparatus for controlled insertion and withdrawal of electrodes
US9289270B2 (en) 2007-04-24 2016-03-22 Medtronic, Inc. Method and apparatus for performing a navigated procedure
US8311611B2 (en) 2007-04-24 2012-11-13 Medtronic, Inc. Method for performing multiple registrations in a navigated procedure
US8301226B2 (en) 2007-04-24 2012-10-30 Medtronic, Inc. Method and apparatus for performing a navigated procedure
US7658541B2 (en) * 2007-06-26 2010-02-09 General Electric Company Apparatus for universal electromagnetic navigation target for fluoroscopic systems
DE102007029364A1 (en) * 2007-06-26 2009-01-02 Siemens Ag A method of determining access to an area of a brain
US8055049B2 (en) * 2007-07-18 2011-11-08 Xoran Technologies, Inc. Motion correction for CT using marker projections
US9179983B2 (en) 2007-08-14 2015-11-10 Zimmer, Inc. Method of determining a contour of an anatomical structure and selecting an orthopaedic implant to replicate the anatomical structure
US9265589B2 (en) 2007-11-06 2016-02-23 Medtronic Navigation, Inc. System and method for navigated drill guide
EP2186466A4 (en) * 2007-12-28 2011-01-19 Olympus Medical Systems Corp Medical instrument system
US8571637B2 (en) 2008-01-21 2013-10-29 Biomet Manufacturing, Llc Patella tracking method and apparatus for use in surgical navigation
US8571633B2 (en) * 2008-03-23 2013-10-29 Scott Rosa Diagnostic imaging method
WO2010044844A1 (en) 2008-10-13 2010-04-22 George Papaioannou Dynamic biplane roentgen stereophotogrammetric analysis
JP5566657B2 (en) * 2008-10-15 2014-08-06 株式会社東芝 3D image processing apparatus and X-ray diagnostic apparatus
JP5569711B2 (en) 2009-03-01 2014-08-13 国立大学法人浜松医科大学 Surgery support system
US8180130B2 (en) * 2009-11-25 2012-05-15 Imaging Sciences International Llc Method for X-ray marker localization in 3D space in the presence of motion
US8942457B2 (en) 2010-01-12 2015-01-27 Koninklijke Philips N.V. Navigating an interventional device
EP2747641A4 (en) 2011-08-26 2015-04-01 Kineticor Inc Methods, systems, and devices for intra-scan motion correction
US8849388B2 (en) 2011-09-08 2014-09-30 Apn Health, Llc R-wave detection method
US9305365B2 (en) 2013-01-24 2016-04-05 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9717461B2 (en) 2013-01-24 2017-08-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
CN109008972A (en) 2013-02-01 2018-12-18 凯内蒂科尔股份有限公司 The motion tracking system of real-time adaptive motion compensation in biomedical imaging
DE102013209158A1 (en) * 2013-05-16 2014-11-20 Fiagon Gmbh Method for integrating data obtained by means of an imaging method
US10854111B2 (en) 2013-06-12 2020-12-01 University Of Florida Research Foundation, Inc. Simulation system and methods for surgical training
DE102013222230A1 (en) 2013-10-31 2015-04-30 Fiagon Gmbh Surgical instrument
CN106572810A (en) 2014-03-24 2017-04-19 凯内蒂科尔股份有限公司 Systems, methods, and devices for removing prospective motion correction from medical imaging scans
CN111184577A (en) * 2014-03-28 2020-05-22 直观外科手术操作公司 Quantitative three-dimensional visualization of an instrument in a field of view
FR3024235B1 (en) 2014-07-22 2022-01-28 Univ Joseph Fourier PLANAR IMAGING SYSTEM ALLOWING DIFFUSE CORRECTION
EP3188660A4 (en) 2014-07-23 2018-05-16 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9872663B2 (en) * 2015-02-04 2018-01-23 Dentsply Sirona Inc. Methods, systems, apparatuses, and computer programs for removing marker artifact contribution from a tomosynthesis dataset
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US11253322B2 (en) 2015-10-14 2022-02-22 Ecential Robotics Fluoro-navigation system for navigating a tool relative to a medical image
FR3042881B1 (en) 2015-10-26 2019-10-25 Surgiqual Institute ROTATING COLLIMATOR FOR DETERMINING THE POSITION OF AN ELEMENT PROVIDED WITH SENSORS IN AN X-RAY IMAGING SYSTEM
WO2017091479A1 (en) 2015-11-23 2017-06-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
CN114376588A (en) * 2016-03-13 2022-04-22 乌泽医疗有限公司 Apparatus and method for use with bone surgery
US10478143B2 (en) 2016-08-02 2019-11-19 Covidien Lp System and method of generating and updating a three dimensional model of a luminal network
EP3531946A4 (en) 2016-10-27 2020-10-21 Leucadia 6, LLC Intraoperative fluoroscopic registration of vertebral bodies
US10478149B2 (en) * 2017-02-21 2019-11-19 Siemens Healthcare Gmbh Method of automatically positioning an X-ray source of an X-ray system and an X-ray system
CN110769771B (en) 2017-04-02 2023-07-04 马佐尔机器人有限公司 Three-dimensional robot biological printer
US11166766B2 (en) 2017-09-21 2021-11-09 DePuy Synthes Products, Inc. Surgical instrument mounted display system
US20190247638A1 (en) 2018-02-13 2019-08-15 Kieran P. Murphy Delivery system for delivering a drug depot to a target site under image guidance and methods and uses of same
US10849711B2 (en) 2018-07-11 2020-12-01 DePuy Synthes Products, Inc. Surgical instrument mounted display system
US20210327089A1 (en) * 2018-08-27 2021-10-21 Ying Ji Method for Measuring Positions
US11406472B2 (en) 2018-12-13 2022-08-09 DePuy Synthes Products, Inc. Surgical instrument mounted display system
US20200237459A1 (en) * 2019-01-25 2020-07-30 Biosense Webster (Israel) Ltd. Flexible multi-coil tracking sensor
EP3719749A1 (en) 2019-04-03 2020-10-07 Fiagon AG Medical Technologies Registration method and setup
FR3099832B1 (en) 2019-08-09 2021-10-08 Univ Grenoble Alpes Rotating collimator for an X-ray detection system
US11622739B2 (en) * 2019-12-23 2023-04-11 General Electric Company Intra-surgery imaging system
EP4092684A1 (en) * 2021-05-18 2022-11-23 Koninklijke Philips N.V. Method for providing feedback data in a medical imaging system
US20240216072A1 (en) * 2023-01-03 2024-07-04 Nuvasive, Inc. Calibration for surgical navigation

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5628315A (en) 1994-09-15 1997-05-13 Brainlab Med. Computersysteme Gmbh Device for detecting the position of radiation target points
US5643268A (en) 1994-09-27 1997-07-01 Brainlab Med. Computersysteme Gmbh Fixation pin for fixing a reference system to bony structures
US5676673A (en) 1994-09-15 1997-10-14 Visualization Technology, Inc. Position tracking and imaging system with error detection for use in medical applications
US5690106A (en) 1995-06-30 1997-11-25 Siemens Corporate Research, Inc. Flexible image registration for rotational angiography
US5702406A (en) 1994-09-15 1997-12-30 Brainlab Med. Computersysteme Gmbh Device for noninvasive stereotactic immobilization in reproducible position
US5769861A (en) 1995-09-28 1998-06-23 Brainlab Med. Computersysteme Gmbh Method and devices for localizing an instrument
US5772594A (en) 1995-10-17 1998-06-30 Barrick; Earl F. Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
US5823958A (en) * 1990-11-26 1998-10-20 Truppe; Michael System and method for displaying a structural data image in real-time correlation with moveable body
US5829444A (en) 1994-09-15 1998-11-03 Visualization Technology, Inc. Position tracking and imaging system for use in medical applications
US5836954A (en) 1992-04-21 1998-11-17 University Of Utah Research Foundation Apparatus and method for photogrammetric surgical localization
US5852646A (en) 1996-05-21 1998-12-22 U.S. Philips Corporation X-ray imaging method
US5889834A (en) 1995-09-28 1999-03-30 Brainlab Med. Computersysteme Gmbh Blade collimator for radiation therapy
US5923727A (en) 1997-09-30 1999-07-13 Siemens Corporate Research, Inc. Method and apparatus for calibrating an intra-operative X-ray system
US5951475A (en) 1997-09-25 1999-09-14 International Business Machines Corporation Methods and apparatus for registering CT-scan data to multiple fluoroscopic images
US5967982A (en) 1997-12-09 1999-10-19 The Cleveland Clinic Foundation Non-invasive spine and bone registration for frameless stereotaxy
WO1999060939A1 (en) 1998-05-28 1999-12-02 Orthosoft, Inc. Interactive computer-assisted surgical system and method thereof
US6006126A (en) * 1991-01-28 1999-12-21 Cosman; Eric R. System and method for stereotactic registration of image scan data
US6069932A (en) 1996-05-15 2000-05-30 Northwestern University Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
US6081577A (en) * 1998-07-24 2000-06-27 Wake Forest University Method and system for creating task-dependent three-dimensional images
US6097994A (en) * 1996-09-30 2000-08-01 Siemens Corporate Research, Inc. Apparatus and method for determining the correct insertion depth for a biopsy needle
US6118845A (en) 1998-06-29 2000-09-12 Surgical Navigation Technologies, Inc. System and methods for the reduction and elimination of image artifacts in the calibration of X-ray imagers
US6149592A (en) * 1997-11-26 2000-11-21 Picker International, Inc. Integrated fluoroscopic projection image data, volumetric image data, and surgical device position data

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8921300D0 (en) * 1989-03-13 1989-11-08 Univ Sheffield Improvements in the radiographic analysis of bones
US5437277A (en) * 1991-11-18 1995-08-01 General Electric Company Inductively coupled RF tracking system for use in invasive imaging of a living body
US6285902B1 (en) * 1999-02-10 2001-09-04 Surgical Insights, Inc. Computer assisted targeting device for use in orthopaedic surgery

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5823958A (en) * 1990-11-26 1998-10-20 Truppe; Michael System and method for displaying a structural data image in real-time correlation with moveable body
US6006126A (en) * 1991-01-28 1999-12-21 Cosman; Eric R. System and method for stereotactic registration of image scan data
US6165181A (en) 1992-04-21 2000-12-26 Sofamor Danek Holdings, Inc. Apparatus and method for photogrammetric surgical localization
US5836954A (en) 1992-04-21 1998-11-17 University Of Utah Research Foundation Apparatus and method for photogrammetric surgical localization
US5676673A (en) 1994-09-15 1997-10-14 Visualization Technology, Inc. Position tracking and imaging system with error detection for use in medical applications
US5702406A (en) 1994-09-15 1997-12-30 Brainlab Med. Computersysteme Gmbh Device for noninvasive stereotactic immobilization in reproducible position
US5628315A (en) 1994-09-15 1997-05-13 Brainlab Med. Computersysteme Gmbh Device for detecting the position of radiation target points
US5800352A (en) 1994-09-15 1998-09-01 Visualization Technology, Inc. Registration system for use with position tracking and imaging system for use in medical applications
US5803089A (en) 1994-09-15 1998-09-08 Visualization Technology, Inc. Position tracking and imaging system for use in medical applications
US5829444A (en) 1994-09-15 1998-11-03 Visualization Technology, Inc. Position tracking and imaging system for use in medical applications
US5967980A (en) 1994-09-15 1999-10-19 Visualization Technology, Inc. Position tracking and imaging system for use in medical applications
US5873822A (en) 1994-09-15 1999-02-23 Visualization Technology, Inc. Automatic registration system for use with position tracking and imaging system for use in medical applications
US5643268A (en) 1994-09-27 1997-07-01 Brainlab Med. Computersysteme Gmbh Fixation pin for fixing a reference system to bony structures
US5690106A (en) 1995-06-30 1997-11-25 Siemens Corporate Research, Inc. Flexible image registration for rotational angiography
US5889834A (en) 1995-09-28 1999-03-30 Brainlab Med. Computersysteme Gmbh Blade collimator for radiation therapy
US5769861A (en) 1995-09-28 1998-06-23 Brainlab Med. Computersysteme Gmbh Method and devices for localizing an instrument
US5772594A (en) 1995-10-17 1998-06-30 Barrick; Earl F. Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
US6198794B1 (en) 1996-05-15 2001-03-06 Northwestern University Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
US6069932A (en) 1996-05-15 2000-05-30 Northwestern University Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
US5852646A (en) 1996-05-21 1998-12-22 U.S. Philips Corporation X-ray imaging method
US6097994A (en) * 1996-09-30 2000-08-01 Siemens Corporate Research, Inc. Apparatus and method for determining the correct insertion depth for a biopsy needle
US5951475A (en) 1997-09-25 1999-09-14 International Business Machines Corporation Methods and apparatus for registering CT-scan data to multiple fluoroscopic images
US5923727A (en) 1997-09-30 1999-07-13 Siemens Corporate Research, Inc. Method and apparatus for calibrating an intra-operative X-ray system
US6149592A (en) * 1997-11-26 2000-11-21 Picker International, Inc. Integrated fluoroscopic projection image data, volumetric image data, and surgical device position data
US5967982A (en) 1997-12-09 1999-10-19 The Cleveland Clinic Foundation Non-invasive spine and bone registration for frameless stereotaxy
WO1999060939A1 (en) 1998-05-28 1999-12-02 Orthosoft, Inc. Interactive computer-assisted surgical system and method thereof
US6118845A (en) 1998-06-29 2000-09-12 Surgical Navigation Technologies, Inc. System and methods for the reduction and elimination of image artifacts in the calibration of X-ray imagers
US6081577A (en) * 1998-07-24 2000-06-27 Wake Forest University Method and system for creating task-dependent three-dimensional images

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
R. Hofstetter et al., "Fluoroscopy as an Imaging Means for Computer-Assisted Surgical Navigation," Computer Aided Surgery, vol. 4, pp. 65-76 (1999).
Roger Y. Tsai, "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses," IEEE Journal of Robotics and Automation, vol. RA-3, no. 4 (Aug. 1987).
Shih-Hsu Chang et al., "Fast Algorithm for Point Pattern Matching: Invariant to Translations, Rotations and Scale Changes," Pattern Recognition, vol. 30, no. 2, pp. 311-320 (1997).

Cited By (685)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE43952E1 (en) 1989-10-05 2013-01-29 Medtronic Navigation, Inc. Interactive system for local intervention inside a non-homogeneous structure
US8200314B2 (en) 1992-08-14 2012-06-12 British Telecommunications Public Limited Company Surgical navigation
US8473026B2 (en) 1994-09-15 2013-06-25 Ge Medical Systems Global Technology Company System for monitoring a position of a medical instrument with respect to a patient's body
USRE44305E1 (en) 1997-09-24 2013-06-18 Medtronic Navigation, Inc. Percutaneous registration apparatus and method for use in computer-assisted surgical navigation
USRE42194E1 (en) 1997-09-24 2011-03-01 Medtronic Navigation, Inc. Percutaneous registration apparatus and method for use in computer-assisted surgical navigation
USRE42226E1 (en) 1997-09-24 2011-03-15 Medtronic Navigation, Inc. Percutaneous registration apparatus and method for use in computer-assisted surgical navigation
USRE43328E1 (en) 1997-11-20 2012-04-24 Medtronic Navigation, Inc. Image guided awl/tap/screwdriver
USRE46422E1 (en) 1997-11-20 2017-06-06 Medtronic Navigation, Inc. Image guided awl/tap/screwdriver
USRE46409E1 (en) 1997-11-20 2017-05-23 Medtronic Navigation, Inc. Image guided awl/tap/screwdriver
US7763035B2 (en) 1997-12-12 2010-07-27 Medtronic Navigation, Inc. Image guided spinal surgery guide, system and method for use thereof
US8105339B2 (en) 1997-12-12 2012-01-31 Sofamor Danek Holdings, Inc. Image guided spinal surgery guide system and method for use thereof
US8768437B2 (en) 1998-08-20 2014-07-01 Sofamor Danek Holdings, Inc. Fluoroscopic image guided surgery system with intraoperative registration
US20010036245A1 (en) * 1999-02-10 2001-11-01 Kienzle Thomas C. Computer assisted targeting device for use in orthopaedic surgery
US6697664B2 (en) * 1999-02-10 2004-02-24 Ge Medical Systems Global Technology Company, Llc Computer assisted targeting device for use in orthopaedic surgery
US7996064B2 (en) 1999-03-23 2011-08-09 Medtronic Navigation, Inc. System and method for placing and determining an appropriately sized surgical implant
US8845655B2 (en) 1999-04-20 2014-09-30 Medtronic Navigation, Inc. Instrument guide system
US8644907B2 (en) 1999-10-28 2014-02-04 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US11331150B2 (en) 1999-10-28 2022-05-17 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US7797032B2 (en) 1999-10-28 2010-09-14 Medtronic Navigation, Inc. Method and system for navigating a catheter probe in the presence of field-influencing objects
US8057407B2 (en) 1999-10-28 2011-11-15 Medtronic Navigation, Inc. Surgical sensor
US9504530B2 (en) 1999-10-28 2016-11-29 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US8290572B2 (en) 1999-10-28 2012-10-16 Medtronic Navigation, Inc. Method and system for navigating a catheter probe in the presence of field-influencing objects
US8074662B2 (en) 1999-10-28 2011-12-13 Medtronic Navigation, Inc. Surgical communication and power system
US7657300B2 (en) 1999-10-28 2010-02-02 Medtronic Navigation, Inc. Registration of human anatomy integrated for electromagnetic localization
US8548565B2 (en) 1999-10-28 2013-10-01 Medtronic Navigation, Inc. Registration of human anatomy integrated for electromagnetic localization
US20090264729A1 (en) * 2000-01-10 2009-10-22 Pinhas Gilboa Methods And Systems For Performing Medical Procedures With Reference To Projective Image And With Respect To Pre Stored Images
US8565858B2 (en) * 2000-01-10 2013-10-22 Covidien Lp Methods and systems for performing medical procedures with reference to determining estimated dispositions for actual dispositions of projective images to transform projective images into an image volume
US7881770B2 (en) 2000-03-01 2011-02-01 Medtronic Navigation, Inc. Multiple cannula image guided tool for image guided procedures
US10898153B2 (en) 2000-03-01 2021-01-26 Medtronic Navigation, Inc. Multiple cannula image guided tool for image guided procedures
US7853305B2 (en) 2000-04-07 2010-12-14 Medtronic Navigation, Inc. Trajectory storage apparatus and method for surgical navigation systems
US8634897B2 (en) 2000-04-07 2014-01-21 Medtronic Navigation, Inc. Trajectory storage apparatus and method for surgical navigation systems
US6856827B2 (en) * 2000-04-28 2005-02-15 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
US20030088179A1 (en) * 2000-04-28 2003-05-08 Teresa Seeley Fluoroscopic tracking and visualization system
US7831082B2 (en) 2000-06-14 2010-11-09 Medtronic Navigation, Inc. System and method for image based sensor calibration
US8320653B2 (en) 2000-06-14 2012-11-27 Medtronic Navigation, Inc. System and method for image based sensor calibration
US20020156359A1 (en) * 2001-03-16 2002-10-24 Knoplioch Jerome Francois Short/long axis cardiac display protocol
US6975897B2 (en) * 2001-03-16 2005-12-13 Ge Medical Systems Global Technology Company, Llc Short/long axis cardiac display protocol
US6817762B2 (en) * 2001-04-10 2004-11-16 Koninklijke Philips Electronics N.V. Fluoroscopy intervention method with a cone-beam
US20030147492A1 (en) * 2001-04-10 2003-08-07 Roland Proksa Fluoroscopy intervention method with a cone-beam
US9675424B2 (en) 2001-06-04 2017-06-13 Surgical Navigation Technologies, Inc. Method for calibrating a navigation system
US20050033164A1 (en) * 2001-07-12 2005-02-10 Takeshi Yatsuo Endoscopic image pickup method and magnetic resonance imaging device using the same
US7653426B2 (en) * 2001-07-12 2010-01-26 Hitachi Medical Corporation Endoscopic image pickup method and magnetic resonance imaging device using the same
US6999811B2 (en) * 2001-07-25 2006-02-14 Koninklijke Philips Electronics N.V. Method and device for the registration of two 3D image data sets
US20030021381A1 (en) * 2001-07-25 2003-01-30 Reiner Koppe Method and device for the registration of two 3D image data sets
US6810280B2 (en) * 2001-08-23 2004-10-26 Siemens Aktiengesellschaft Method and apparatus for detecting the three-dimensional position of an examination instrument inserted into a body region
US20030128800A1 (en) * 2001-08-23 2003-07-10 Norbert Strobel Method for detecting the three-dimensional position of an examination instrument inserted into a body region
US20030212320A1 (en) * 2001-10-01 2003-11-13 Michael Wilk Coordinate registration system for dual modality imaging systems
US20030149364A1 (en) * 2002-02-01 2003-08-07 Ajay Kapur Methods, system and apparatus for digital imaging
US20030218720A1 (en) * 2002-02-07 2003-11-27 Olympus Optical Co., Ltd. Three-dimensional observation apparatus and method of three-dimensional observation
US6985765B2 (en) * 2002-02-07 2006-01-10 Olympus Corporation Three-dimensional observation apparatus and method of three-dimensional observation
US6940941B2 (en) 2002-02-15 2005-09-06 Breakaway Imaging, Llc Breakable gantry apparatus for multidimensional x-ray based imaging
US20040022350A1 (en) * 2002-02-15 2004-02-05 Breakaway Imaging, Llc Breakable gantry apparatus for multidimensional x-ray based imaging
US9757087B2 (en) 2002-02-28 2017-09-12 Medtronic Navigation, Inc. Method and apparatus for perspective inversion
US8746973B2 (en) 2002-03-13 2014-06-10 Medtronic Navigation, Inc. Systems and methods for quasi-simultaneous multi-planar x-ray imaging
US7188998B2 (en) 2002-03-13 2007-03-13 Breakaway Imaging, Llc Systems and methods for quasi-simultaneous multi-planar x-ray imaging
US20040013239A1 (en) * 2002-03-13 2004-01-22 Breakaway Imaging, Llc Systems and methods for quasi-simultaneous multi-planar x-ray imaging
USRE49349E1 (en) 2002-03-19 2022-12-27 Medtronic Navigation, Inc. Systems and methods for imaging large field-of-view objects
US9724058B2 (en) 2002-03-19 2017-08-08 Medtronic Navigation, Inc. Systems and methods for imaging large field-of-view objects
US7661881B2 (en) 2002-03-19 2010-02-16 Medtronic Navigation, Inc. Systems and methods for imaging large field-of-view objects
US20040013225A1 (en) * 2002-03-19 2004-01-22 Breakaway Imaging, Llc Systems and methods for imaging large field-of-view objects
US7108421B2 (en) 2002-03-19 2006-09-19 Breakaway Imaging, Llc Systems and methods for imaging large field-of-view objects
US8678647B2 (en) 2002-03-19 2014-03-25 Medtronic Navigation, Inc. Systems and methods for imaging large field-of-view objects
US9398886B2 (en) 2002-03-19 2016-07-26 Medtronic Navigation, Inc. Systems and methods for imaging large field-of-view objects
US10959791B2 (en) 2002-03-20 2021-03-30 P Tech, Llc Robotic surgery
US10869728B2 (en) * 2002-03-20 2020-12-22 P Tech, Llc Robotic surgery
US10368953B2 (en) 2002-03-20 2019-08-06 P Tech, Llc Robotic system for fastening layers of body tissue together and method thereof
US20160331481A1 (en) * 2002-03-20 2016-11-17 P Tech, Llc Methods of using a robotic spine system
US10201391B2 (en) * 2002-03-20 2019-02-12 P Tech, Llc Methods of using a robotic spine system
US10265128B2 (en) * 2002-03-20 2019-04-23 P Tech, Llc Methods of using a robotic spine system
US10932869B2 (en) 2002-03-20 2021-03-02 P Tech, Llc Robotic surgery
US7588569B2 (en) * 2002-03-22 2009-09-15 Karl Storz Gmbh & Co. Kg Medical instrument for the treatment of tissue by means of a high-frequency current and medical system with a medical instrument of this type
US20050065513A1 (en) * 2002-03-22 2005-03-24 Irion Klaus M. Medical instrument for the treatment of tissue by means of a high-frequency current and medical system with a medical instrument of this type
US8838199B2 (en) 2002-04-04 2014-09-16 Medtronic Navigation, Inc. Method and apparatus for virtual digital subtraction angiography
US8696548B2 (en) 2002-04-17 2014-04-15 Covidien Lp Endoscope structures and techniques for navigating to a target in branched structure
US9642514B2 (en) 2002-04-17 2017-05-09 Covidien Lp Endoscope structures and techniques for navigating to a target in a branched structure
US8696685B2 (en) 2002-04-17 2014-04-15 Covidien Lp Endoscope structures and techniques for navigating to a target in branched structure
US10743748B2 (en) 2002-04-17 2020-08-18 Covidien Lp Endoscope structures and techniques for navigating to a target in branched structure
US20030235266A1 (en) * 2002-06-11 2003-12-25 Breakaway Imaging, Llc Cantilevered gantry apparatus for x-ray imaging
US7001045B2 (en) 2002-06-11 2006-02-21 Breakaway Imaging, Llc Cantilevered gantry apparatus for x-ray imaging
US7905659B2 (en) 2002-06-11 2011-03-15 Medtronic Navigation, Inc. Cantilevered gantry apparatus for x-ray imaging
US20080212743A1 (en) * 2002-06-11 2008-09-04 Gregerson Eugene A Cantilevered gantry apparatus for x-ray imaging
US8308361B2 (en) 2002-06-11 2012-11-13 Medtronic Navigation, Inc. Cantilevered gantry apparatus for X-ray imaging
US20040015077A1 (en) * 2002-07-11 2004-01-22 Marwan Sati Apparatus, system and method of calibrating medical imaging systems
US7065393B2 (en) * 2002-07-11 2006-06-20 Cedara Software Corp. Apparatus, system and method of calibrating medical imaging systems
US20040034300A1 (en) * 2002-08-19 2004-02-19 Laurent Verard Method and apparatus for virtual endoscopy
US20060120511A1 (en) * 2002-08-21 2006-06-08 Gregerson Eugene A Gantry positioning apparatus for x-ray imaging
US7338207B2 (en) 2002-08-21 2008-03-04 Medtronic Navigation, Inc. Gantry positioning apparatus for X-ray imaging
US20040170254A1 (en) * 2002-08-21 2004-09-02 Breakaway Imaging, Llc Gantry positioning apparatus for X-ray imaging
US7965811B1 (en) 2002-08-21 2011-06-21 Medtronic Navigation, Inc. Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system
US7490982B2 (en) 2002-08-21 2009-02-17 Medtronic Navigation, Inc. Gantry positioning apparatus for x-ray imaging
US7106825B2 (en) 2002-08-21 2006-09-12 Breakaway Imaging, Llc Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system
US7903779B2 (en) 2002-08-21 2011-03-08 Medtronic Navigation, Inc. Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system
US20060153468A1 (en) * 2002-09-04 2006-07-13 Torsten Solf Imaging system and method for optimizing an x-ray image
US20040082854A1 (en) * 2002-09-12 2004-04-29 Robert Essenreiter X-ray image-assisted navigation using original, two-dimensional x-ray images
US7251522B2 (en) * 2002-09-12 2007-07-31 Brainlab Ag X-ray image-assisted navigation using original, two-dimensional x-ray images
US8401616B2 (en) 2002-11-19 2013-03-19 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US8467853B2 (en) 2002-11-19 2013-06-18 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US8046052B2 (en) 2002-11-19 2011-10-25 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US7697972B2 (en) 2002-11-19 2010-04-13 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US8060185B2 (en) 2002-11-19 2011-11-15 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US20060098786A1 (en) * 2003-01-15 2006-05-11 Kai Eck Method of determining the position of an object in an image
WO2004064379A1 (en) * 2003-01-16 2004-07-29 Philips Intellectual Property & Standards Gmbh Method of determining the position of an object in an image
US8355773B2 (en) 2003-01-21 2013-01-15 Aesculap Ag Recording localization device tool positional parameters
US20040143178A1 (en) * 2003-01-21 2004-07-22 Francois Leitner Recording localization device tool positional parameters
WO2004064659A1 (en) * 2003-01-21 2004-08-05 Aesculap Ag & Co. Kg Method and device for recording localization device tool positional parameters
US9867721B2 (en) 2003-01-30 2018-01-16 Medtronic Navigation, Inc. Method and apparatus for post-operative tuning of a spinal implant
US11707363B2 (en) 2003-01-30 2023-07-25 Medtronic Navigation, Inc. Method and apparatus for post-operative tuning of a spinal implant
US7974677B2 (en) 2003-01-30 2011-07-05 Medtronic Navigation, Inc. Method and apparatus for preplanning a surgical procedure
US7660623B2 (en) 2003-01-30 2010-02-09 Medtronic Navigation, Inc. Six degree of freedom alignment display for medical procedures
US11684491B2 (en) 2003-01-30 2023-06-27 Medtronic Navigation, Inc. Method and apparatus for post-operative tuning of a spinal implant
US20050004449A1 (en) * 2003-05-20 2005-01-06 Matthias Mitschke Method for marker-less navigation in preoperative 3D images using an intraoperatively acquired 3D C-arm image
US20050027193A1 (en) * 2003-05-21 2005-02-03 Matthias Mitschke Method for automatically merging a 2D fluoroscopic C-arm image with a preoperative 3D image with one-time use of navigation markers
AU2004202167B2 (en) * 2003-05-29 2010-09-16 Biosense Webster, Inc. Hysteresis assessment for metal immunity
EP1510174A1 (en) * 2003-05-29 2005-03-02 Biosense Webster, Inc. Apparatus, method and software for tracking an object
US7974680B2 (en) 2003-05-29 2011-07-05 Biosense, Inc. Hysteresis assessment for metal immunity
US20040239314A1 (en) * 2003-05-29 2004-12-02 Assaf Govari Hysteresis assessment for metal immunity
US20060154604A1 (en) * 2003-07-01 2006-07-13 General Electric Company Electromagnetic coil array integrated into antiscatter grid
US7907701B2 (en) 2003-07-01 2011-03-15 General Electric Company Electromagnetic coil array integrated into antiscatter grid
US7580676B2 (en) 2003-07-01 2009-08-25 General Electric Company Electromagnetic coil array integrated into flat-panel detector
US20060121849A1 (en) * 2003-07-01 2006-06-08 Peter Traneus Anderson Electromagnetic coil array integrated into flat-panel detector
US20130217952A1 (en) * 2003-07-21 2013-08-22 Vanderbilt University Ophthalmic orbital surgery apparatus and method and image-guided navigation system
US20050262031A1 (en) * 2003-07-21 2005-11-24 Olivier Saidi Systems and methods for treating, diagnosing and predicting the occurrence of a medical condition
US20050024043A1 (en) * 2003-07-31 2005-02-03 Assaf Govari Detection of metal disturbance in a magnetic tracking system
US7321228B2 (en) 2003-07-31 2008-01-22 Biosense Webster, Inc. Detection of metal disturbance in a magnetic tracking system
US11154283B2 (en) 2003-08-11 2021-10-26 Veran Medical Technologies, Inc. Bodily sealants and methods and apparatus for image-guided delivery of same
US8483801B2 (en) 2003-08-11 2013-07-09 Veran Medical Technologies, Inc. Methods, apparatuses, and systems useful in conducting image guided interventions
US7853307B2 (en) 2003-08-11 2010-12-14 Veran Medical Technologies, Inc. Methods, apparatuses, and systems useful in conducting image guided interventions
US10470725B2 (en) 2003-08-11 2019-11-12 Veran Medical Technologies, Inc. Method, apparatuses, and systems useful in conducting image guided interventions
US8150495B2 (en) 2003-08-11 2012-04-03 Veran Medical Technologies, Inc. Bodily sealants and methods and apparatus for image-guided delivery of same
US11426134B2 (en) 2003-08-11 2022-08-30 Veran Medical Technologies, Inc. Methods, apparatuses and systems useful in conducting image guided interventions
US7398116B2 (en) * 2003-08-11 2008-07-08 Veran Medical Technologies, Inc. Methods, apparatuses, and systems useful in conducting image guided interventions
US7925328B2 (en) 2003-08-28 2011-04-12 Medtronic Navigation, Inc. Method and apparatus for performing stereotactic surgery
US20070038081A1 (en) * 2003-09-04 2007-02-15 Koninklijke Philips Electronics N.V. Device and method for displaying ultrasound images of a vessel
US8090427B2 (en) 2003-09-04 2012-01-03 Koninklijke Philips Electronics N.V. Methods for ultrasound visualization of a vessel with location and cycle information
WO2005024729A1 (en) * 2003-09-04 2005-03-17 Philips Intellectual Property & Standards Gmbh Device and method for displaying ultrasound images of a vessel
US8663088B2 (en) 2003-09-15 2014-03-04 Covidien Lp System of accessories for use with bronchoscopes
US9089261B2 (en) 2003-09-15 2015-07-28 Covidien Lp System of accessories for use with bronchoscopes
US10383509B2 (en) 2003-09-15 2019-08-20 Covidien Lp System of accessories for use with bronchoscopes
US7835778B2 (en) 2003-10-16 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for surgical navigation of a multiple piece construct for implantation
US8706185B2 (en) 2003-10-16 2014-04-22 Medtronic Navigation, Inc. Method and apparatus for surgical navigation of a multiple piece construct for implantation
US7840253B2 (en) 2003-10-17 2010-11-23 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US7971341B2 (en) 2003-10-17 2011-07-05 Medtronic Navigation, Inc. Method of forming an electromagnetic sensing coil in a medical instrument for a surgical navigation system
US7818044B2 (en) 2003-10-17 2010-10-19 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US8359730B2 (en) 2003-10-17 2013-01-29 Medtronic Navigation, Inc. Method of forming an electromagnetic sensing coil in a medical instrument
US8549732B2 (en) 2003-10-17 2013-10-08 Medtronic Navigation, Inc. Method of forming an electromagnetic sensing coil in a medical instrument
US8239001B2 (en) 2003-10-17 2012-08-07 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US7751865B2 (en) 2003-10-17 2010-07-06 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US8271069B2 (en) 2003-10-17 2012-09-18 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US7764985B2 (en) 2003-10-20 2010-07-27 Smith & Nephew, Inc. Surgical navigation system component fault interfaces and related processes
US20070197908A1 (en) * 2003-10-29 2007-08-23 Ruchala Kenneth J System and method for calibrating and positioning a radiation therapy treatment table
US20100312104A1 (en) * 2003-10-29 2010-12-09 Ruchala Kenneth J System and method for calibrating and positioning a radiation therapy treatment table
US7794467B2 (en) 2003-11-14 2010-09-14 Smith & Nephew, Inc. Adjustable surgical cutting systems
US20050149041A1 (en) * 2003-11-14 2005-07-07 Mcginley Brian J. Adjustable surgical cutting systems
US7966058B2 (en) 2003-12-31 2011-06-21 General Electric Company System and method for registering an image with a representation of a probe
US20050154279A1 (en) * 2003-12-31 2005-07-14 Wenguang Li System and method for registering an image with a representation of a probe
US20050154285A1 (en) * 2004-01-02 2005-07-14 Neason Curtis G. System and method for receiving and displaying information pertaining to a patient
US8764725B2 (en) 2004-02-09 2014-07-01 Covidien Lp Directional anchoring mechanism, method and applications thereof
US20050209524A1 (en) * 2004-03-10 2005-09-22 General Electric Company System and method for receiving and storing information pertaining to a patient
US7998062B2 (en) 2004-03-29 2011-08-16 Superdimension, Ltd. Endoscope structures and techniques for navigating to a target in branched structure
US20050228251A1 (en) * 2004-03-30 2005-10-13 General Electric Company System and method for displaying a three-dimensional image of an organ or structure inside the body
US20050228252A1 (en) * 2004-04-02 2005-10-13 General Electric Company Electrophysiology system and method
US20050222509A1 (en) * 2004-04-02 2005-10-06 General Electric Company Electrophysiology system and method
US8109942B2 (en) 2004-04-21 2012-02-07 Smith & Nephew, Inc. Computer-aided methods, systems, and apparatuses for shoulder arthroplasty
US10321803B2 (en) 2004-04-26 2019-06-18 Covidien Lp System and method for image-based alignment of an endoscope
US9055881B2 (en) 2004-04-26 2015-06-16 Super Dimension Ltd. System and method for image-based alignment of an endoscope
US7953471B2 (en) 2004-05-03 2011-05-31 Medtronic Navigation, Inc. Method and apparatus for implantation between two vertebral bodies
US7097357B2 (en) 2004-06-02 2006-08-29 General Electric Company Method and system for improved correction of registration error in a fluoroscopic image
US20050281385A1 (en) * 2004-06-02 2005-12-22 Johnson Douglas K Method and system for improved correction of registration error in a fluoroscopic image
US20050288574A1 (en) * 2004-06-23 2005-12-29 Thornton Thomas M Wireless (disposable) fiducial based registration and EM distortion based surface registration
US20060025668A1 (en) * 2004-08-02 2006-02-02 Peterson Thomas H Operating table with embedded tracking technology
US7925326B2 (en) * 2004-09-03 2011-04-12 Siemens Molecular Imaging, Inc. Solid fiduciary marker for multimodality imaging
US20070073143A1 (en) * 2004-09-03 2007-03-29 Cti Concorde Microsystems, Llc Solid fiduciary marker for multimodality imaging
US20060063998A1 (en) * 2004-09-21 2006-03-23 Von Jako Ron Navigation and visualization of an access needle system
US20060064005A1 (en) * 2004-09-23 2006-03-23 Innovative Spinal Technologies System and method for externally controlled surgical navigation
US7444178B2 (en) 2004-10-05 2008-10-28 Brainlab Ag Positional marker system with point light sources
US8446473B2 (en) 2004-10-05 2013-05-21 Brainlab Ag Tracking system with scattering effect utilization, in particular with star effect and/or cross effect utilization
US20060089552A1 (en) * 2004-10-05 2006-04-27 Gunter Goldbach Tracking system with scattering effect utilization, in particular with star effect and/or cross effect utilization
US20060115054A1 (en) * 2004-11-12 2006-06-01 General Electric Company System and method for integration of a calibration target into a C-arm
US7344307B2 (en) 2004-11-12 2008-03-18 General Electric Company System and method for integration of a calibration target into a C-arm
US20060229626A1 (en) * 2005-02-22 2006-10-12 Mclean Terry W In-line milling system
US8177788B2 (en) 2005-02-22 2012-05-15 Smith & Nephew, Inc. In-line milling system
US20060293592A1 (en) * 2005-05-13 2006-12-28 General Electric Company System and method for controlling a medical imaging device
US8208988B2 (en) 2005-05-13 2012-06-26 General Electric Company System and method for controlling a medical imaging device
US10555775B2 (en) 2005-05-16 2020-02-11 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US10792107B2 (en) 2005-05-16 2020-10-06 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US10842571B2 (en) 2005-05-16 2020-11-24 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US11116578B2 (en) 2005-05-16 2021-09-14 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US20100331855A1 (en) * 2005-05-16 2010-12-30 Intuitive Surgical, Inc. Efficient Vision and Kinematic Data Fusion For Robotic Surgical Instruments and Other Applications
US8971597B2 (en) * 2005-05-16 2015-03-03 Intuitive Surgical Operations, Inc. Efficient vision and kinematic data fusion for robotic surgical instruments and other applications
US11478308B2 (en) 2005-05-16 2022-10-25 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US11672606B2 (en) 2005-05-16 2023-06-13 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US7840256B2 (en) * 2005-06-27 2010-11-23 Biomet Manufacturing Corporation Image guided tracking array and method
US20070016009A1 (en) * 2005-06-27 2007-01-18 Lakin Ryan C Image guided tracking array and method
US20070016025A1 (en) * 2005-06-28 2007-01-18 Siemens Medical Solutions Usa, Inc. Medical diagnostic imaging three dimensional navigation device and methods
US7773074B2 (en) 2005-06-28 2010-08-10 Siemens Medical Solutions Usa, Inc. Medical diagnostic imaging three dimensional navigation device and methods
US8229068B2 (en) 2005-07-22 2012-07-24 Tomotherapy Incorporated System and method of detecting a breathing phase of a patient receiving radiation therapy
US8767917B2 (en) 2005-07-22 2014-07-01 Tomotherapy Incorporated System and method of delivering radiation therapy to a moving region of interest
US9218663B2 (en) 2005-09-13 2015-12-22 Veran Medical Technologies, Inc. Apparatus and method for automatic image guided accuracy verification
US10617332B2 (en) 2005-09-13 2020-04-14 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US11304629B2 (en) 2005-09-13 2022-04-19 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US9218664B2 (en) 2005-09-13 2015-12-22 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US11304630B2 (en) 2005-09-13 2022-04-19 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US7920909B2 (en) 2005-09-13 2011-04-05 Veran Medical Technologies, Inc. Apparatus and method for automatic image guided accuracy verification
US7835784B2 (en) 2005-09-21 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for positioning a reference frame
US8467851B2 (en) 2005-09-21 2013-06-18 Medtronic Navigation, Inc. Method and apparatus for positioning a reference frame
US10299871B2 (en) 2005-09-30 2019-05-28 Restoration Robotics, Inc. Automated system and method for hair removal
US20070106307A1 (en) * 2005-09-30 2007-05-10 Restoration Robotics, Inc. Methods for implanting follicular units using an automated system
US8690894B2 (en) 2005-09-30 2014-04-08 Restoration Robotics, Inc. Automated system for harvesting or implanting follicular units
US10327850B2 (en) 2005-09-30 2019-06-25 Restoration Robotics, Inc. Automated system and method for harvesting or implanting follicular units
US7962192B2 (en) * 2005-09-30 2011-06-14 Restoration Robotics, Inc. Systems and methods for aligning a tool with a desired location or object
US20070078466A1 (en) * 2005-09-30 2007-04-05 Restoration Robotics, Inc. Methods for harvesting follicular units using an automated system
US9526581B2 (en) 2005-09-30 2016-12-27 Restoration Robotics, Inc. Automated system and method for harvesting or implanting follicular units
US20110224693A1 (en) * 2005-09-30 2011-09-15 Mohan Bodduluri Automated System for Harvesting or Implanting Follicular Units
US20070106306A1 (en) * 2005-09-30 2007-05-10 Restoration Robotics, Inc. Automated system for harvesting or implanting follicular units
AU2011250755B2 (en) * 2005-09-30 2013-08-29 Restoration Robotics, Inc. Automated systems and methods for harvesting and implanting follicular units
US20080285707A1 (en) * 2005-10-24 2008-11-20 Cas Innovations Ag System and Method for Medical Navigation
US20070100234A1 (en) * 2005-10-27 2007-05-03 Arenson Jerome S Methods and systems for tracking instruments in fluoroscopy
US20100069741A1 (en) * 2005-11-23 2010-03-18 General Electric Company System and method for detection of electromagnetic radiation by amorphous silicon x-ray detector for metal detection in x-ray imaging
US7711406B2 (en) 2005-11-23 2010-05-04 General Electric Company System and method for detection of electromagnetic radiation by amorphous silicon x-ray detector for metal detection in x-ray imaging
US8303505B2 (en) 2005-12-02 2012-11-06 Abbott Cardiovascular Systems Inc. Methods and apparatuses for image guided medical procedures
US10597178B2 (en) 2006-01-18 2020-03-24 Medtronic Navigation, Inc. Method and apparatus for providing a container to a sterile environment
US9168102B2 (en) 2006-01-18 2015-10-27 Medtronic Navigation, Inc. Method and apparatus for providing a container to a sterile environment
US10893912B2 (en) 2006-02-16 2021-01-19 Globus Medical Inc. Surgical tool systems and methods
US10653497B2 (en) 2006-02-16 2020-05-19 Globus Medical, Inc. Surgical tool systems and methods
US11628039B2 (en) 2006-02-16 2023-04-18 Globus Medical Inc. Surgical tool systems and methods
US7471202B2 (en) 2006-03-29 2008-12-30 General Electric Co. Conformal coil array for a medical tracking system
US7532997B2 (en) 2006-04-17 2009-05-12 General Electric Company Electromagnetic tracking using a discretized numerical field model
US8112292B2 (en) 2006-04-21 2012-02-07 Medtronic Navigation, Inc. Method and apparatus for optimizing a therapy
WO2007132381A3 (en) * 2006-05-11 2008-01-24 Koninkl Philips Electronics Nv System and method for generating intraoperative 3-dimensional images using non-contrast image data
US20090123046A1 (en) * 2006-05-11 2009-05-14 Koninklijke Philips Electronics N.V. System and method for generating intraoperative 3-dimensional images using non-contrast image data
WO2007132381A2 (en) * 2006-05-11 2007-11-22 Koninklijke Philips Electronics N.V. System and method for generating intraoperative 3-dimensional images using non-contrast image data
US8233962B2 (en) * 2006-05-16 2012-07-31 Siemens Medical Solutions Usa, Inc. Rotational stereo roadmapping
US20080009715A1 (en) * 2006-05-16 2008-01-10 Markus Kukuk Rotational stereo roadmapping
US7620144B2 (en) 2006-06-28 2009-11-17 Accuray Incorporated Parallel stereovision geometry in image-guided radiosurgery
US20080002809A1 (en) * 2006-06-28 2008-01-03 Mohan Bodduluri Parallel stereovision geometry in image-guided radiosurgery
US9044190B2 (en) * 2006-09-25 2015-06-02 Mazor Robotics Ltd. C-arm computerized tomography system
US20100284601A1 (en) * 2006-09-25 2010-11-11 Mazor Surgical Technologies, Ltd. C-arm computerized tomography system
US9597154B2 (en) 2006-09-29 2017-03-21 Medtronic, Inc. Method and apparatus for optimizing a computer assisted surgical procedure
US8660635B2 (en) 2006-09-29 2014-02-25 Medtronic, Inc. Method and apparatus for optimizing a computer assisted surgical procedure
US20080161684A1 (en) * 2006-10-26 2008-07-03 General Electric Company Systems and methods for integrating a navigation field replaceable unit into a fluoroscopy system
US7621169B2 (en) 2006-10-26 2009-11-24 General Electric Company Systems and methods for integrating a navigation field replaceable unit into a fluoroscopy system
US20160133016A1 (en) * 2006-11-26 2016-05-12 Algotec Systems Ltd. Comparison workflow automation by registration
US10172678B2 (en) 2007-02-16 2019-01-08 Globus Medical, Inc. Method and system for performing invasive medical procedures using a surgical robot
US9782229B2 (en) 2007-02-16 2017-10-10 Globus Medical, Inc. Surgical robot platform
US9078685B2 (en) 2007-02-16 2015-07-14 Globus Medical, Inc. Method and system for performing invasive medical procedures using a surgical robot
US20080255403A1 (en) * 2007-04-13 2008-10-16 Ethicon Endo-Surgery, Inc. Magnetic nanoparticle therapies
US8239008B2 (en) 2007-04-13 2012-08-07 Ethicon Endo-Surgery, Inc. Sentinel node identification using fluorescent nanoparticles
US8239007B2 (en) * 2007-04-13 2012-08-07 Ethicon Endo-Surgery, Inc. Biocompatible nanoparticle compositions and methods
US20080255460A1 (en) * 2007-04-13 2008-10-16 Ethicon Endo-Surgery, Inc. Nanoparticle tissue based identification and illumination
US20080255459A1 (en) * 2007-04-13 2008-10-16 Ethicon Endo-Surgery, Inc. Sentinel node identification using fluorescent nanoparticles
US20080255537A1 (en) * 2007-04-13 2008-10-16 Ethicon Endo-Surgery, Inc. Biocompatible nanoparticle compositions and methods
US20080255425A1 (en) * 2007-04-13 2008-10-16 Ethicon Endo-Surgery, Inc. Nanoparticle treated medical devices
US8062215B2 (en) 2007-04-13 2011-11-22 Ethicon Endo-Surgery, Inc. Fluorescent nanoparticle scope
US20080255414A1 (en) * 2007-04-13 2008-10-16 Ethicon Endo-Surgery, Inc. Fluorescent nanoparticle scope
WO2008130354A1 (en) * 2007-04-24 2008-10-30 Medtronic, Inc. Intraoperative image registration
US8010177B2 (en) 2007-04-24 2011-08-30 Medtronic, Inc. Intraoperative image registration
US20080269588A1 (en) * 2007-04-24 2008-10-30 Medtronic, Inc. Intraoperative Image Registration
US20080292046A1 (en) * 2007-05-09 2008-11-27 Estelle Camus Bronchopulmonary medical services system and imaging method
US20210378761A1 (en) * 2007-06-15 2021-12-09 Orthosoft Inc. Computer-assisted surgery system and method
US20170119477A1 (en) * 2007-06-15 2017-05-04 Orthosoft Inc. Computer-assisted surgery system and method
US9532848B2 (en) * 2007-06-15 2017-01-03 Orthosoft, Inc. Computer-assisted surgery system and method
US20080312529A1 (en) * 2007-06-15 2008-12-18 Louis-Philippe Amiot Computer-assisted surgery system and method
US11771502B2 (en) * 2007-06-15 2023-10-03 Orthosoft Ulc Computer-assisted surgery system and method
US11116577B2 (en) * 2007-06-15 2021-09-14 Orthosoft Ulc Computer-assisted surgery system and method
US20080319311A1 (en) * 2007-06-22 2008-12-25 General Electric Company System and method for accuracy verification for image based surgical navigation
US9468412B2 (en) * 2007-06-22 2016-10-18 General Electric Company System and method for accuracy verification for image based surgical navigation
US7742569B2 (en) * 2007-07-23 2010-06-22 Siemens Aktiengesellschaft X-ray system and method for image composition
US20090028291A1 (en) * 2007-07-23 2009-01-29 Rainer Graumann X-ray system and method for image composition
US10390686B2 (en) 2007-09-27 2019-08-27 Covidien Lp Bronchoscope adapter and method
US9668639B2 (en) 2007-09-27 2017-06-06 Covidien Lp Bronchoscope adapter and method
US9986895B2 (en) 2007-09-27 2018-06-05 Covidien Lp Bronchoscope adapter and method
US8905920B2 (en) 2007-09-27 2014-12-09 Covidien Lp Bronchoscope adapter and method
US10980400B2 (en) 2007-09-27 2021-04-20 Covidien Lp Bronchoscope adapter and method
US8792963B2 (en) 2007-09-30 2014-07-29 Intuitive Surgical Operations, Inc. Methods of determining tissue distances using both kinematic robotic tool position information and image-derived position information
US8391952B2 (en) 2007-10-11 2013-03-05 General Electric Company Coil arrangement for an electromagnetic tracking system
US8467497B2 (en) 2007-10-25 2013-06-18 Tomotherapy Incorporated System and method for motion adaptive optimization for radiation therapy delivery
US20090116616A1 (en) * 2007-10-25 2009-05-07 Tomotherapy Incorporated System and method for motion adaptive optimization for radiation therapy delivery
US20090252291A1 (en) * 2007-10-25 2009-10-08 Weiguo Lu System and method for motion adaptive optimization for radiation therapy delivery
US8509383B2 (en) 2007-10-25 2013-08-13 Tomotherapy Incorporated System and method for motion adaptive optimization for radiation therapy delivery
US20090135992A1 (en) * 2007-11-27 2009-05-28 Regis Vaillant Method for the processing of radiography cardiac images with a view to obtaining a subtracted and registered image
US20090198124A1 (en) * 2008-01-31 2009-08-06 Ralf Adamus Workflow to enhance a transjugular intrahepatic portosystemic shunt procedure
US20110054297A1 (en) * 2008-03-03 2011-03-03 Clemens Bulitta Medical system
US9575140B2 (en) 2008-04-03 2017-02-21 Covidien Lp Magnetic interference detection system and method
US10335237B2 (en) 2008-04-03 2019-07-02 Brainlab Ag Visual orientation aid for medical instruments
EP2119397A1 (en) 2008-05-15 2009-11-18 BrainLAB AG Determining calibration information for an x-ray machine
US20090285366A1 (en) * 2008-05-15 2009-11-19 Robert Essenreiter Determining calibration information for an x-ray apparatus
US7922391B2 (en) * 2008-05-15 2011-04-12 Brainlab Ag Determining calibration information for an x-ray apparatus
US10096126B2 (en) 2008-06-03 2018-10-09 Covidien Lp Feature-based registration method
US8473032B2 (en) 2008-06-03 2013-06-25 Superdimension, Ltd. Feature-based registration method
US9659374B2 (en) 2008-06-03 2017-05-23 Covidien Lp Feature-based registration method
US11783498B2 (en) 2008-06-03 2023-10-10 Covidien Lp Feature-based registration method
US11074702B2 (en) 2008-06-03 2021-07-27 Covidien Lp Feature-based registration method
US9117258B2 (en) 2008-06-03 2015-08-25 Covidien Lp Feature-based registration method
US10285623B2 (en) 2008-06-06 2019-05-14 Covidien Lp Hybrid registration method
US8452068B2 (en) 2008-06-06 2013-05-28 Covidien Lp Hybrid registration method
US10478092B2 (en) 2008-06-06 2019-11-19 Covidien Lp Hybrid registration method
US8467589B2 (en) 2008-06-06 2013-06-18 Covidien Lp Hybrid registration method
US9271803B2 (en) 2008-06-06 2016-03-01 Covidien Lp Hybrid registration method
US10674936B2 (en) 2008-06-06 2020-06-09 Covidien Lp Hybrid registration method
US11931141B2 (en) 2008-06-06 2024-03-19 Covidien Lp Hybrid registration method
US11241164B2 (en) 2008-07-10 2022-02-08 Covidien Lp Integrated multi-functional endoscopic tool
US8932207B2 (en) 2008-07-10 2015-01-13 Covidien Lp Integrated multi-functional endoscopic tool
US10912487B2 (en) 2008-07-10 2021-02-09 Covidien Lp Integrated multi-function endoscopic tool
US11234611B2 (en) 2008-07-10 2022-02-01 Covidien Lp Integrated multi-functional endoscopic tool
US10070801B2 (en) 2008-07-10 2018-09-11 Covidien Lp Integrated multi-functional endoscopic tool
US8165658B2 (en) 2008-09-26 2012-04-24 Medtronic, Inc. Method and apparatus for positioning a guide relative to a base
US9405971B2 (en) 2008-09-29 2016-08-02 Restoration Robotics, Inc. Object-Tracking systems and methods
US8848974B2 (en) * 2008-09-29 2014-09-30 Restoration Robotics, Inc. Object-tracking systems and methods
US20100080417A1 (en) * 2008-09-29 2010-04-01 Qureshi Shehrzad A Object-Tracking Systems and Methods
US9589368B2 (en) 2008-09-29 2017-03-07 Restoration Robotics, Inc. Object-tracking systems and methods
US8811660B2 (en) * 2008-09-29 2014-08-19 Restoration Robotics, Inc. Object-tracking systems and methods
US20100080415A1 (en) * 2008-09-29 2010-04-01 Restoration Robotics, Inc. Object-tracking systems and methods
US8731641B2 (en) 2008-12-16 2014-05-20 Medtronic Navigation, Inc. Combination of electromagnetic and electropotential localization
US8175681B2 (en) 2008-12-16 2012-05-08 Medtronic Navigation Inc. Combination of electromagnetic and electropotential localization
US9402690B2 (en) 2008-12-31 2016-08-02 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local and remote robotic proctoring
US20100164950A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Efficient 3-d telestration for local robotic proctoring
US8830224B2 (en) 2008-12-31 2014-09-09 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local robotic proctoring
US9398675B2 (en) 2009-03-20 2016-07-19 Orthoscan, Inc. Mobile imaging apparatus
US8611984B2 (en) 2009-04-08 2013-12-17 Covidien Lp Locatable catheter
US10154798B2 (en) 2009-04-08 2018-12-18 Covidien Lp Locatable catheter
US9113813B2 (en) 2009-04-08 2015-08-25 Covidien Lp Locatable catheter
US20100292565A1 (en) * 2009-05-18 2010-11-18 Andreas Meyer Medical imaging medical device navigation from at least two 2d projections from different angles
US9155592B2 (en) 2009-06-16 2015-10-13 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9492240B2 (en) 2009-06-16 2016-11-15 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US8494614B2 (en) 2009-08-31 2013-07-23 Regents Of The University Of Minnesota Combination localization system
US8494613B2 (en) 2009-08-31 2013-07-23 Medtronic, Inc. Combination localization system
US20110152676A1 (en) * 2009-12-21 2011-06-23 General Electric Company Intra-operative registration for navigated surgical procedures
US8694075B2 (en) 2009-12-21 2014-04-08 General Electric Company Intra-operative registration for navigated surgical procedures
US8781186B2 (en) 2010-05-04 2014-07-15 Pathfinder Therapeutics, Inc. System and method for abdominal surface matching using pseudo-features
US11246509B2 (en) * 2010-05-06 2022-02-15 Sachin Bhandari Calibration device for inertial sensor based surgical navigation system
US20170347922A1 (en) * 2010-05-06 2017-12-07 Sachin Bhandari Calibration Device for Inertial Sensor Based Surgical Navigation System
US10582834B2 (en) 2010-06-15 2020-03-10 Covidien Lp Locatable expandable working channel and method
US11690527B2 (en) 2010-08-20 2023-07-04 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
US10264947B2 (en) 2010-08-20 2019-04-23 Veran Medical Technologies, Inc. Apparatus and method for airway registration and navigation
US10898057B2 (en) 2010-08-20 2021-01-26 Veran Medical Technologies, Inc. Apparatus and method for airway registration and navigation
US8696549B2 (en) 2010-08-20 2014-04-15 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
US10165928B2 (en) 2010-08-20 2019-01-01 Mark Hunter Systems, instruments, and methods for four dimensional soft tissue navigation
US11109740B2 (en) 2010-08-20 2021-09-07 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
US9125611B2 (en) 2010-12-13 2015-09-08 Orthoscan, Inc. Mobile fluoroscopic imaging system
US10178978B2 (en) 2010-12-13 2019-01-15 Orthoscan, Inc. Mobile fluoroscopic imaging system
US9833206B2 (en) 2010-12-13 2017-12-05 Orthoscan, Inc. Mobile fluoroscopic imaging system
US9498289B2 (en) 2010-12-21 2016-11-22 Restoration Robotics, Inc. Methods and systems for directing movement of a tool in hair transplantation procedures
US11510744B2 (en) 2010-12-21 2022-11-29 Venus Concept Inc. Methods and systems for directing movement of a tool in hair transplantation procedures
US8911453B2 (en) 2010-12-21 2014-12-16 Restoration Robotics, Inc. Methods and systems for directing movement of a tool in hair transplantation procedures
US10188466B2 (en) 2010-12-21 2019-01-29 Restoration Robotics, Inc. Methods and systems for directing movement of a tool in hair transplantation procedures
US9743988B2 (en) 2010-12-21 2017-08-29 Restoration Robotics, Inc. Methods and systems for directing movement of a tool in hair transplantation procedures
US12096994B2 (en) 2011-04-01 2024-09-24 KB Medical SA Robotic system and method for spinal and other surgeries
US11744648B2 (en) 2011-04-01 2023-09-05 Globus Medical, Inc. Robotic system and method for spinal and other surgeries
US10660712B2 (en) 2011-04-01 2020-05-26 Globus Medical Inc. Robotic system and method for spinal and other surgeries
US11202681B2 (en) 2011-04-01 2021-12-21 Globus Medical, Inc. Robotic system and method for spinal and other surgeries
EP2701605A4 (en) * 2011-04-27 2014-10-01 Univ Virginia Commonwealth 3D tracking of an HDR source using a flat panel detector
US9474493B2 (en) 2011-04-27 2016-10-25 Virginia Commonwealth University 3D tracking of an HDR source using a flat panel detector
US10575797B2 (en) 2011-05-12 2020-03-03 The Johns Hopkins University Electromagnetic tracking system and methods of using same
US11596367B2 (en) 2011-05-12 2023-03-07 The Johns Hopkins University Electromagnetic tracking system and methods of using same
WO2012155050A2 (en) * 2011-05-12 2012-11-15 The Johns Hopkins University Electromagnetic tracking system and methods of using same
WO2012155050A3 (en) * 2011-05-12 2013-03-07 The Johns Hopkins University Electromagnetic tracking system and methods of using same
US20130150865A1 (en) * 2011-12-09 2013-06-13 Samsung Electronics Co., Ltd. Medical robot system and method for controlling the same
US9277968B2 (en) * 2011-12-09 2016-03-08 Samsung Electronics Co., Ltd. Medical robot system and method for controlling the same
US9142018B2 (en) 2012-01-17 2015-09-22 Sunnybrook Health Sciences Centre Method for three-dimensional localization of an object from a two-dimensional medical image
WO2013106926A1 (en) * 2012-01-17 2013-07-25 Sunnybrook Health Sciences Centre Method for three-dimensional localization of an object from a two-dimensional medical image
US9972082B2 (en) 2012-02-22 2018-05-15 Veran Medical Technologies, Inc. Steerable surgical catheter having biopsy devices and related systems and methods for four dimensional soft tissue navigation
US10977789B2 (en) 2012-02-22 2021-04-13 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US11551359B2 (en) 2012-02-22 2023-01-10 Veran Medical Technologies, Inc Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US11830198B2 (en) 2012-02-22 2023-11-28 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US10140704B2 (en) 2012-02-22 2018-11-27 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US9138165B2 (en) 2012-02-22 2015-09-22 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US10460437B2 (en) 2012-02-22 2019-10-29 Veran Medical Technologies, Inc. Method for placing a localization element in an organ of a patient for four dimensional soft tissue navigation
US11403753B2 (en) 2012-02-22 2022-08-02 Veran Medical Technologies, Inc. Surgical catheter having side exiting medical instrument and related systems and methods for four dimensional soft tissue navigation
US10249036B2 (en) 2012-02-22 2019-04-02 Veran Medical Technologies, Inc. Surgical catheter having side exiting medical instrument and related systems and methods for four dimensional soft tissue navigation
US9700276B2 (en) * 2012-02-28 2017-07-11 Siemens Healthcare Gmbh Robust multi-object tracking using sparse appearance representation and online sparse appearance dictionary update
US20130245429A1 (en) * 2012-02-28 2013-09-19 Siemens Aktiengesellschaft Robust multi-object tracking using sparse appearance representation and online sparse appearance dictionary update
US11395706B2 (en) 2012-06-21 2022-07-26 Globus Medical Inc. Surgical robot platform
US11026756B2 (en) 2012-06-21 2021-06-08 Globus Medical, Inc. Surgical robot platform
US10231791B2 (en) 2012-06-21 2019-03-19 Globus Medical, Inc. Infrared signal based position recognition system for use with a robot-assisted surgery
US11399900B2 (en) 2012-06-21 2022-08-02 Globus Medical, Inc. Robotic systems providing co-registration using natural fiducials and related methods
US10624710B2 (en) 2012-06-21 2020-04-21 Globus Medical, Inc. System and method for measuring depth of instrumentation
US11439471B2 (en) 2012-06-21 2022-09-13 Globus Medical, Inc. Surgical tool system and method
US10136954B2 (en) 2012-06-21 2018-11-27 Globus Medical, Inc. Surgical tool systems and method
US10639112B2 (en) 2012-06-21 2020-05-05 Globus Medical, Inc. Infrared signal based position recognition system for use with a robot-assisted surgery
US11331153B2 (en) 2012-06-21 2022-05-17 Globus Medical, Inc. Surgical robot platform
US11317971B2 (en) 2012-06-21 2022-05-03 Globus Medical, Inc. Systems and methods related to robotic guidance in surgery
US10350013B2 (en) 2012-06-21 2019-07-16 Globus Medical, Inc. Surgical tool systems and methods
US10646280B2 (en) 2012-06-21 2020-05-12 Globus Medical, Inc. System and method for surgical tool insertion using multiaxis force and moment feedback
US11298196B2 (en) 2012-06-21 2022-04-12 Globus Medical Inc. Surgical robotic automation with tracking markers and controlled tool advancement
US11284949B2 (en) 2012-06-21 2022-03-29 Globus Medical, Inc. Surgical robot platform
US11253327B2 (en) 2012-06-21 2022-02-22 Globus Medical, Inc. Systems and methods for automatically changing an end-effector on a surgical robot
US11589771B2 (en) 2012-06-21 2023-02-28 Globus Medical Inc. Method for recording probe movement and determining an extent of matter removed
US12070285B2 (en) 2012-06-21 2024-08-27 Globus Medical, Inc. Systems and methods for automatically changing an end-effector on a surgical robot
US10357184B2 (en) 2012-06-21 2019-07-23 Globus Medical, Inc. Surgical tool systems and method
US11607149B2 (en) 2012-06-21 2023-03-21 Globus Medical Inc. Surgical tool systems and method
US11684437B2 (en) 2012-06-21 2023-06-27 Globus Medical Inc. Systems and methods for automatically changing an end-effector on a surgical robot
US11684433B2 (en) 2012-06-21 2023-06-27 Globus Medical Inc. Surgical tool systems and method
US11191598B2 (en) 2012-06-21 2021-12-07 Globus Medical, Inc. Surgical robot platform
US12016645B2 (en) 2012-06-21 2024-06-25 Globus Medical Inc. Surgical robotic automation with tracking markers
US11684431B2 (en) 2012-06-21 2023-06-27 Globus Medical, Inc. Surgical robot platform
US12004905B2 (en) 2012-06-21 2024-06-11 Globus Medical, Inc. Medical imaging systems using robotic actuators and related methods
US10758315B2 (en) 2012-06-21 2020-09-01 Globus Medical Inc. Method and system for improving 2D-3D registration convergence
US11690687B2 (en) 2012-06-21 2023-07-04 Globus Medical Inc. Methods for performing medical procedures using a surgical robot
US11135022B2 (en) 2012-06-21 2021-10-05 Globus Medical, Inc. Surgical robot platform
US11116576B2 (en) 2012-06-21 2021-09-14 Globus Medical Inc. Dynamic reference arrays and methods of use
US11744657B2 (en) 2012-06-21 2023-09-05 Globus Medical, Inc. Infrared signal based position recognition system for use with a robot-assisted surgery
US11109922B2 (en) 2012-06-21 2021-09-07 Globus Medical, Inc. Surgical tool systems and method
US10799298B2 (en) 2012-06-21 2020-10-13 Globus Medical Inc. Robotic fluoroscopic navigation
US11974822B2 (en) 2012-06-21 2024-05-07 Globus Medical Inc. Method for a surveillance marker in robotic-assisted surgery
US11103320B2 (en) 2012-06-21 2021-08-31 Globus Medical, Inc. Infrared signal based position recognition system for use with a robot-assisted surgery
US11103317B2 (en) 2012-06-21 2021-08-31 Globus Medical, Inc. Surgical robot platform
US11786324B2 (en) 2012-06-21 2023-10-17 Globus Medical, Inc. Surgical robotic automation with tracking markers
US10835326B2 (en) 2012-06-21 2020-11-17 Globus Medical Inc. Surgical robot platform
US10835328B2 (en) 2012-06-21 2020-11-17 Globus Medical, Inc. Surgical robot platform
US11963755B2 (en) 2012-06-21 2024-04-23 Globus Medical Inc. Apparatus for recording probe movement
US11045267B2 (en) 2012-06-21 2021-06-29 Globus Medical, Inc. Surgical robotic automation with tracking markers
US11793570B2 (en) 2012-06-21 2023-10-24 Globus Medical Inc. Surgical robotic automation with tracking markers
US11911225B2 (en) 2012-06-21 2024-02-27 Globus Medical Inc. Method and system for improving 2D-3D registration convergence
US10842461B2 (en) 2012-06-21 2020-11-24 Globus Medical, Inc. Systems and methods of checking registrations for surgical systems
US11819283B2 (en) 2012-06-21 2023-11-21 Globus Medical Inc. Systems and methods related to robotic guidance in surgery
US11896446B2 (en) 2012-06-21 2024-02-13 Globus Medical, Inc Surgical robotic automation with tracking markers
US11819365B2 (en) 2012-06-21 2023-11-21 Globus Medical, Inc. System and method for measuring depth of instrumentation
US10485617B2 (en) 2012-06-21 2019-11-26 Globus Medical, Inc. Surgical robot platform
US10874466B2 (en) 2012-06-21 2020-12-29 Globus Medical, Inc. System and method for surgical tool insertion using multiaxis force and moment feedback
US11857149B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. Surgical robotic systems with target trajectory deviation monitoring and related methods
US10912617B2 (en) 2012-06-21 2021-02-09 Globus Medical, Inc. Surgical robot platform
US11857266B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. System for a surveillance marker in robotic-assisted surgery
US11864745B2 (en) 2012-06-21 2024-01-09 Globus Medical, Inc. Surgical robotic system with retractor
US10531927B2 (en) 2012-06-21 2020-01-14 Globus Medical, Inc. Methods for performing invasive medical procedures using a surgical robot
US11864839B2 (en) 2012-06-21 2024-01-09 Globus Medical Inc. Methods of adjusting a virtual implant and related surgical navigation systems
EP3175815B1 (en) * 2012-10-19 2023-12-13 Biosense Webster (Israel) Ltd. Integration between 3d maps and fluoroscopic images
EP2722018B1 (en) 2012-10-19 2017-03-08 Biosense Webster (Israel) Ltd. Integration between 3D maps and fluoroscopic images
EP2722018B2 (en) 2012-10-19 2020-01-01 Biosense Webster (Israel) Ltd. Integration between 3D maps and fluoroscopic images
US10441236B2 (en) * 2012-10-19 2019-10-15 Biosense Webster (Israel) Ltd. Integration between 3D maps and fluoroscopic images
CN103767683B (en) * 2012-10-19 2018-11-13 韦伯斯特生物官能(以色列)有限公司 Integration between 3D maps and fluoroscopic images
CN103767683A (en) * 2012-10-19 2014-05-07 韦伯斯特生物官能(以色列)有限公司 Integration between 3D maps and fluoroscopic images
US20140112529A1 (en) * 2012-10-24 2014-04-24 Samsung Electronics Co., Ltd. Method, apparatus, and system for correcting medical image according to patient's pose variation
US9437003B2 (en) * 2012-10-24 2016-09-06 Samsung Electronics Co., Ltd. Method, apparatus, and system for correcting medical image according to patient's pose variation
US11369438B2 (en) 2013-01-16 2022-06-28 Stryker Corporation Navigation systems and methods for indicating and reducing line-of-sight errors
US10932837B2 (en) 2013-01-16 2021-03-02 Mako Surgical Corp. Tracking device using a bone plate for attaching to a patient's anatomy
US11622800B2 (en) 2013-01-16 2023-04-11 Mako Surgical Corp. Bone plate for attaching to an anatomic structure
US9993273B2 (en) 2013-01-16 2018-06-12 Mako Surgical Corp. Bone plate and tracking device using a bone plate for attaching to a patient's anatomy
US10531925B2 (en) 2013-01-16 2020-01-14 Stryker Corporation Navigation systems and methods for indicating and reducing line-of-sight errors
US12102365B2 (en) 2013-01-16 2024-10-01 Mako Surgical Corp. Bone plate for attaching to an anatomic structure
US10900777B2 (en) * 2013-01-25 2021-01-26 Werth Messtechnik Gmbh Method and device for determining the geometry of structures by means of computer tomography
US20150355113A1 (en) * 2013-01-25 2015-12-10 Werth Messtechnik Gmbh Method and device for determining the geometry of structures by means of computer tomography
US9443633B2 (en) 2013-02-26 2016-09-13 Accuray Incorporated Electromagnetically actuated multi-leaf collimator
US11896363B2 (en) 2013-03-15 2024-02-13 Globus Medical Inc. Surgical robot platform
US20240123260A1 (en) * 2013-07-17 2024-04-18 Vision Rt Limited Method of calibration of a stereoscopic camera system for use with a radio therapy treatment apparatus
US12042671B2 (en) * 2013-07-17 2024-07-23 Vision Rt Limited Method of calibration of a stereoscopic camera system for use with a radio therapy treatment apparatus
US20200016434A1 (en) * 2013-07-17 2020-01-16 Vision Rt Limited Method of calibration of a stereoscopic camera system for use with a radio therapy treatment apparatus
US20240042241A1 (en) * 2013-07-17 2024-02-08 Vision Rt Limited Calibration of a stereoscopic camera system for use with a radio therapy treatment apparatus
US20230330438A1 (en) * 2013-07-17 2023-10-19 Vision Rt Limited Method of calibration of a stereoscopic camera system for use with a radio therapy treatment apparatus
US11633629B2 (en) * 2013-07-17 2023-04-25 Vision Rt Limited Method of calibration of a stereoscopic camera system for use with a radio therapy treatment apparatus
US20210146162A1 (en) * 2013-07-17 2021-05-20 Vision Rt Limited Method of calibration of a stereoscopic camera system for use with a radio therapy treatment apparatus
US10933258B2 (en) * 2013-07-17 2021-03-02 Vision Rt Limited Method of calibration of a stereoscopic camera system for use with a radio therapy treatment apparatus
US10456601B2 (en) * 2013-07-17 2019-10-29 Vision Rt Limited Method of calibration of a stereoscopic camera system for use with a radio therapy treatment apparatus
US12114939B2 (en) 2013-10-04 2024-10-15 KB Medical SA Apparatus, systems, and methods for precise guidance of surgical tools
US10813704B2 (en) 2013-10-04 2020-10-27 Kb Medical, Sa Apparatus and systems for precise guidance of surgical tools
US11172997B2 (en) 2013-10-04 2021-11-16 Kb Medical, Sa Apparatus and systems for precise guidance of surgical tools
US11737766B2 (en) 2014-01-15 2023-08-29 Globus Medical Inc. Notched apparatus for guidance of an insertable instrument along an axis during spinal surgery
US10548620B2 (en) 2014-01-15 2020-02-04 Globus Medical, Inc. Notched apparatus for guidance of an insertable instrument along an axis during spinal surgery
US10939968B2 (en) 2014-02-11 2021-03-09 Globus Medical Inc. Sterile handle for controlling a robotic surgical system from a sterile field
US10617324B2 (en) 2014-04-23 2020-04-14 Veran Medical Technologies, Inc Apparatuses and methods for endobronchial navigation to and confirmation of the location of a target tissue and percutaneous interception of the target tissue
US10624701B2 (en) 2014-04-23 2020-04-21 Veran Medical Technologies, Inc. Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter
US11553968B2 (en) 2014-04-23 2023-01-17 Veran Medical Technologies, Inc. Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter
US11793583B2 (en) 2014-04-24 2023-10-24 Globus Medical Inc. Surgical instrument holder for use with a robotic surgical system
US10292778B2 (en) 2014-04-24 2019-05-21 Globus Medical, Inc. Surgical instrument holder for use with a robotic surgical system
US10828116B2 (en) 2014-04-24 2020-11-10 Kb Medical, Sa Surgical instrument holder for use with a robotic surgical system
US10952593B2 (en) 2014-06-10 2021-03-23 Covidien Lp Bronchoscope adapter
US12042243B2 (en) 2014-06-19 2024-07-23 Globus Medical, Inc Systems and methods for performing minimally invasive surgery
US10828120B2 (en) 2014-06-19 2020-11-10 Kb Medical, Sa Systems and methods for performing minimally invasive surgery
US20170157770A1 (en) * 2014-06-23 2017-06-08 Abb Schweiz Ag Method for calibrating a robot and a robot system
US9889565B2 (en) * 2014-06-23 2018-02-13 Abb Schweiz Ag Method for calibrating a robot and a robot system
US10765438B2 (en) 2014-07-14 2020-09-08 KB Medical SA Anti-skid surgical instrument for use in preparing holes in bone tissue
US10357257B2 (en) 2014-07-14 2019-07-23 KB Medical SA Anti-skid surgical instrument for use in preparing holes in bone tissue
US11534179B2 (en) 2014-07-14 2022-12-27 Globus Medical, Inc. Anti-skid surgical instrument for use in preparing holes in bone tissue
US10945742B2 (en) 2014-07-14 2021-03-16 Globus Medical Inc. Anti-skid surgical instrument for use in preparing holes in bone tissue
US9986983B2 (en) 2014-10-31 2018-06-05 Covidien Lp Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
US11871913B2 (en) 2014-10-31 2024-01-16 Covidien Lp Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
US10321898B2 (en) 2014-10-31 2019-06-18 Covidien Lp Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
US10314564B2 (en) 2014-10-31 2019-06-11 Covidien Lp Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
US9974525B2 (en) 2014-10-31 2018-05-22 Covidien Lp Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
US11103316B2 (en) 2014-12-02 2021-08-31 Globus Medical Inc. Robot assisted volume removal during surgery
US12002171B2 (en) 2015-02-03 2024-06-04 Globus Medical, Inc Surgeon head-mounted display apparatuses
US11217028B2 (en) 2015-02-03 2022-01-04 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US10650594B2 (en) 2015-02-03 2020-05-12 Globus Medical Inc. Surgeon head-mounted display apparatuses
US11763531B2 (en) 2015-02-03 2023-09-19 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US11062522B2 (en) 2015-02-03 2021-07-13 Global Medical Inc Surgeon head-mounted display apparatuses
US10580217B2 (en) 2015-02-03 2020-03-03 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US11176750B2 (en) 2015-02-03 2021-11-16 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US11734901B2 (en) 2015-02-03 2023-08-22 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US11461983B2 (en) 2015-02-03 2022-10-04 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US10546423B2 (en) 2015-02-03 2020-01-28 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US10555782B2 (en) 2015-02-18 2020-02-11 Globus Medical, Inc. Systems and methods for performing minimally invasive spinal surgery with a robotic surgical system using a percutaneous technique
US11266470B2 (en) 2015-02-18 2022-03-08 KB Medical SA Systems and methods for performing minimally invasive spinal surgery with a robotic surgical system using a percutaneous technique
US12076095B2 (en) 2015-02-18 2024-09-03 Globus Medical, Inc. Systems and methods for performing minimally invasive spinal surgery with a robotic surgical system using a percutaneous technique
CN113081009A (en) * 2015-04-15 2021-07-09 莫比乌斯成像公司 Integrated medical imaging and surgical robotic system
US10426555B2 (en) 2015-06-03 2019-10-01 Covidien Lp Medical instrument with sensor for use in a system and method for electromagnetic navigation
US10166078B2 (en) * 2015-07-21 2019-01-01 Synaptive Medical (Barbados) Inc. System and method for mapping navigation space to patient space in a medical procedure
US10646298B2 (en) 2015-07-31 2020-05-12 Globus Medical, Inc. Robot arm and methods of use
US10925681B2 (en) 2015-07-31 2021-02-23 Globus Medical Inc. Robot arm and methods of use
US11672622B2 (en) 2015-07-31 2023-06-13 Globus Medical, Inc. Robot arm and methods of use
US11337769B2 (en) 2015-07-31 2022-05-24 Globus Medical, Inc. Robot arm and methods of use
US10674982B2 (en) 2015-08-06 2020-06-09 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
US11547377B2 (en) 2015-08-06 2023-01-10 Covidien Lp System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction
US11992349B2 (en) 2015-08-06 2024-05-28 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
US11559266B2 (en) 2015-08-06 2023-01-24 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
US11707241B2 (en) 2015-08-06 2023-07-25 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
US10716525B2 (en) 2015-08-06 2020-07-21 Covidien Lp System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction
US10702226B2 (en) 2015-08-06 2020-07-07 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
US11751950B2 (en) 2015-08-12 2023-09-12 Globus Medical Inc. Devices and methods for temporary mounting of parts to bone
US10080615B2 (en) 2015-08-12 2018-09-25 Globus Medical, Inc. Devices and methods for temporary mounting of parts to bone
US10786313B2 (en) 2015-08-12 2020-09-29 Globus Medical, Inc. Devices and methods for temporary mounting of parts to bone
US11872000B2 (en) 2015-08-31 2024-01-16 Globus Medical, Inc Robotic surgical systems and methods
US10687905B2 (en) 2015-08-31 2020-06-23 KB Medical SA Robotic surgical systems and methods
US10973594B2 (en) 2015-09-14 2021-04-13 Globus Medical, Inc. Surgical robotic systems and methods thereof
US11066090B2 (en) 2015-10-13 2021-07-20 Globus Medical, Inc. Stabilizer wheel assembly and methods of use
US10569794B2 (en) 2015-10-13 2020-02-25 Globus Medical, Inc. Stabilizer wheel assembly and methods of use
US11801024B2 (en) 2015-10-28 2023-10-31 Medtronic Navigation, Inc. Apparatus and method for maintaining image quality while minimizing x-ray dosage of a patient
US11006914B2 (en) 2015-10-28 2021-05-18 Medtronic Navigation, Inc. Apparatus and method for maintaining image quality while minimizing x-ray dosage of a patient
US11172895B2 (en) 2015-12-07 2021-11-16 Covidien Lp Visualization, navigation, and planning with electromagnetic navigation bronchoscopy and cone beam computed tomography integrated
US11925493B2 (en) 2015-12-07 2024-03-12 Covidien Lp Visualization, navigation, and planning with electromagnetic navigation bronchoscopy and cone beam computed tomography integrated
US12016714B2 (en) 2016-02-03 2024-06-25 Globus Medical Inc. Portable medical imaging system
US10842453B2 (en) 2016-02-03 2020-11-24 Globus Medical, Inc. Portable medical imaging system
US11986333B2 (en) 2016-02-03 2024-05-21 Globus Medical Inc. Portable medical imaging system
US10448910B2 (en) 2016-02-03 2019-10-22 Globus Medical, Inc. Portable medical imaging system
US11058378B2 (en) 2016-02-03 2021-07-13 Globus Medical, Inc. Portable medical imaging system
US10849580B2 (en) 2016-02-03 2020-12-01 Globus Medical Inc. Portable medical imaging system
US11883217B2 (en) 2016-02-03 2024-01-30 Globus Medical, Inc. Portable medical imaging system and method
US11801022B2 (en) 2016-02-03 2023-10-31 Globus Medical, Inc. Portable medical imaging system
US10687779B2 (en) 2016-02-03 2020-06-23 Globus Medical, Inc. Portable medical imaging system with beam scanning collimator
US10117632B2 (en) 2016-02-03 2018-11-06 Globus Medical, Inc. Portable medical imaging system with beam scanning collimator
US11523784B2 (en) 2016-02-03 2022-12-13 Globus Medical, Inc. Portable medical imaging system
EP4375934A3 (en) * 2016-02-12 2024-07-31 Intuitive Surgical Operations, Inc. Systems and methods of pose estimation and calibration of perspective imaging system in image guided surgery
US10866119B2 (en) 2016-03-14 2020-12-15 Globus Medical, Inc. Metal detector for detecting insertion of a surgical device into a hollow tube
US11920957B2 (en) 2016-03-14 2024-03-05 Globus Medical, Inc. Metal detector for detecting insertion of a surgical device into a hollow tube
US11668588B2 (en) 2016-03-14 2023-06-06 Globus Medical Inc. Metal detector for detecting insertion of a surgical device into a hollow tube
US12044552B2 (en) 2016-03-14 2024-07-23 Globus Medical, Inc. Metal detector for detecting insertion of a surgical device into a hollow tube
US11974886B2 (en) 2016-04-11 2024-05-07 Globus Medical Inc. Surgical tool systems and methods
US11160617B2 (en) 2016-05-16 2021-11-02 Covidien Lp System and method to access lung tissue
US11786317B2 (en) 2016-05-16 2023-10-17 Covidien Lp System and method to access lung tissue
US10478254B2 (en) 2016-05-16 2019-11-19 Covidien Lp System and method to access lung tissue
US11937881B2 (en) 2016-05-23 2024-03-26 Mako Surgical Corp. Systems and methods for identifying and tracking physical objects during a robotic surgical procedure
US10531926B2 (en) 2016-05-23 2020-01-14 Mako Surgical Corp. Systems and methods for identifying and tracking physical objects during a robotic surgical procedure
US11559358B2 (en) 2016-05-26 2023-01-24 Mako Surgical Corp. Surgical assembly with kinematic connector
US10537395B2 (en) 2016-05-26 2020-01-21 MAKO Surgical Group Navigation tracker with kinematic connector assembly
USD820452S1 (en) 2016-07-21 2018-06-12 Broncus Medical Inc. Radio-opaque marker
US11006921B2 (en) 2016-09-15 2021-05-18 Oxos Medical, Inc. Imaging systems and methods
US11647976B2 (en) 2016-09-15 2023-05-16 Oxos Medical, Inc. Imaging systems and methods
US11051886B2 (en) 2016-09-27 2021-07-06 Covidien Lp Systems and methods for performing a surgical navigation procedure
US12114944B2 (en) * 2016-09-27 2024-10-15 Brainlab Ag Efficient positioning of a mechatronic arm
US11642182B2 (en) * 2016-09-27 2023-05-09 Brainlab Ag Efficient positioning of a mechatronic arm
US20230293248A1 (en) * 2016-09-27 2023-09-21 Brainlab Ag Efficient positioning of a mechatronic arm
US11806100B2 (en) 2016-10-21 2023-11-07 Kb Medical, Sa Robotic surgical systems
US11039893B2 (en) 2016-10-21 2021-06-22 Globus Medical, Inc. Robotic surgical systems
US11786314B2 (en) 2016-10-28 2023-10-17 Covidien Lp System for calibrating an electromagnetic navigation system
US10418705B2 (en) 2016-10-28 2019-09-17 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10722311B2 (en) 2016-10-28 2020-07-28 Covidien Lp System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map
US11672604B2 (en) 2016-10-28 2023-06-13 Covidien Lp System and method for generating a map for electromagnetic navigation
US10792106B2 (en) 2016-10-28 2020-10-06 Covidien Lp System for calibrating an electromagnetic navigation system
US10517505B2 (en) 2016-10-28 2019-12-31 Covidien Lp Systems, methods, and computer-readable media for optimizing an electromagnetic navigation system
US10751126B2 (en) 2016-10-28 2020-08-25 Covidien Lp System and method for generating a map for electromagnetic navigation
US11759264B2 (en) 2016-10-28 2023-09-19 Covidien Lp System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map
US10446931B2 (en) 2016-10-28 2019-10-15 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10615500B2 (en) 2016-10-28 2020-04-07 Covidien Lp System and method for designing electromagnetic navigation antenna assemblies
US10638952B2 (en) 2016-10-28 2020-05-05 Covidien Lp Methods, systems, and computer-readable media for calibrating an electromagnetic navigation system
US11779732B2 (en) 2016-11-21 2023-10-10 St Jude Medical International Holding S.À R.L. Medical device sensor
US10806471B2 (en) 2017-01-18 2020-10-20 Globus Medical, Inc. Universal instrument guide for robotic surgical systems, surgical instrument systems, and methods of their use
US10420616B2 (en) 2017-01-18 2019-09-24 Globus Medical, Inc. Robotic navigation of robotic surgical systems
US11779408B2 (en) 2017-01-18 2023-10-10 Globus Medical, Inc. Robotic navigation of robotic surgical systems
US11529195B2 (en) 2017-01-18 2022-12-20 Globus Medical Inc. Robotic navigation of robotic surgical systems
US10864057B2 (en) 2017-01-18 2020-12-15 Kb Medical, Sa Universal instrument guide for robotic surgical systems, surgical instrument systems, and methods of their use
US11813030B2 (en) 2017-03-16 2023-11-14 Globus Medical, Inc. Robotic navigation of robotic surgical systems
US11071594B2 (en) 2017-03-16 2021-07-27 KB Medical SA Robotic navigation of robotic surgical systems
US12082886B2 (en) 2017-04-05 2024-09-10 Globus Medical Inc. Robotic surgical systems for preparing holes in bone tissue and methods of their use
US10846893B2 (en) 2017-06-29 2020-11-24 Covidien Lp System and method for identifying, marking and navigating to a target using real time three dimensional fluoroscopic data
US10699448B2 (en) 2017-06-29 2020-06-30 Covidien Lp System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data
US11341692B2 (en) 2017-06-29 2022-05-24 Covidien Lp System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data
US11135015B2 (en) 2017-07-21 2021-10-05 Globus Medical, Inc. Robot surgical platform
US11253320B2 (en) 2017-07-21 2022-02-22 Globus Medical Inc. Robot surgical platform
US11771499B2 (en) 2017-07-21 2023-10-03 Globus Medical Inc. Robot surgical platform
US10675094B2 (en) 2017-07-21 2020-06-09 Globus Medical Inc. Robot surgical platform
US12064280B2 (en) 2017-10-10 2024-08-20 Covidien Lp System and method for identifying and marking a target in a fluoroscopic three-dimensional reconstruction
US10893843B2 (en) 2017-10-10 2021-01-19 Covidien Lp System and method for identifying and marking a target in a fluoroscopic three-dimensional reconstruction
US11564649B2 (en) 2017-10-10 2023-01-31 Covidien Lp System and method for identifying and marking a target in a fluoroscopic three-dimensional reconstruction
US11219489B2 (en) 2017-10-31 2022-01-11 Covidien Lp Devices and systems for providing sensors in parallel with medical tools
US11357548B2 (en) 2017-11-09 2022-06-14 Globus Medical, Inc. Robotic rod benders and related mechanical and motor housings
US11382666B2 (en) 2017-11-09 2022-07-12 Globus Medical Inc. Methods providing bend plans for surgical rods and related controllers and computer program products
US11794338B2 (en) 2017-11-09 2023-10-24 Globus Medical Inc. Robotic rod benders and related mechanical and motor housings
US10898252B2 (en) 2017-11-09 2021-01-26 Globus Medical, Inc. Surgical robotic systems for bending surgical rods, and related methods and devices
US11786144B2 (en) 2017-11-10 2023-10-17 Globus Medical, Inc. Methods of selecting surgical implants and related devices
US11134862B2 (en) 2017-11-10 2021-10-05 Globus Medical, Inc. Methods of selecting surgical implants and related devices
US11872027B2 (en) 2017-12-05 2024-01-16 St. Jude Medical International Holding S.á r.l. Magnetic sensor for tracking the location of an object
US11241165B2 (en) 2017-12-05 2022-02-08 St. Jude Medical International Holding S.À R.L. Magnetic sensor for tracking the location of an object
US11701184B2 (en) 2018-02-08 2023-07-18 Covidien Lp System and method for catheter detection in fluoroscopic images and updating displayed position of catheter
US11364004B2 (en) 2018-02-08 2022-06-21 Covidien Lp System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US10905498B2 (en) 2018-02-08 2021-02-02 Covidien Lp System and method for catheter detection in fluoroscopic images and updating displayed position of catheter
US10893842B2 (en) 2018-02-08 2021-01-19 Covidien Lp System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US11253325B2 (en) 2018-02-08 2022-02-22 Covidien Lp System and method for catheter detection in fluoroscopic images and updating displayed position of catheter
US11712213B2 (en) 2018-02-08 2023-08-01 Covidien Lp System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US11896414B2 (en) 2018-02-08 2024-02-13 Covidien Lp System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US11950932B2 (en) 2018-02-11 2024-04-09 St Jude Medical International Holding, Sa.R.L. Mechanical design considerations for table-mounted device used as a sub-assembly in a magnetic tracking system working in conjunction with an X-ray imaging system
WO2019155036A3 (en) * 2018-02-11 2019-10-03 St. Jude Medical International Holding S.À R.L. Mechanical design considerations for table-mounted device used as a sub-assembly in a magnetic tracking system working in conjunction with an x-ray imaging system
US10646283B2 (en) 2018-02-19 2020-05-12 Globus Medical Inc. Augmented reality navigation systems for use with robotic surgical systems and methods of their use
US11694355B2 (en) 2018-04-09 2023-07-04 Globus Medical, Inc. Predictive visualization of medical imaging scanner component movement
US11100668B2 (en) 2018-04-09 2021-08-24 Globus Medical, Inc. Predictive visualization of medical imaging scanner component movement
US10573023B2 (en) 2018-04-09 2020-02-25 Globus Medical, Inc. Predictive visualization of medical imaging scanner component movement
US20220257206A1 (en) * 2018-04-17 2022-08-18 The Board Of Trustees Of Leland Stanford Junior University Augmented Fluoroscopy with Digital Subtraction Imaging
CN110495899B (en) * 2018-05-16 2023-08-22 西门子医疗有限公司 Method and device for determining geometric calibration and method for determining associated data
CN110495899A (en) * 2018-05-16 2019-11-26 西门子医疗有限公司 The method for determining the method and apparatus of geometry calibration and determining associated data
US12097065B2 (en) 2018-08-01 2024-09-24 Oxos Medical, Inc. Imaging systems and methods
US11207047B2 (en) * 2018-08-01 2021-12-28 Oxos Medical, Inc. Imaging systems and methods
US11337742B2 (en) 2018-11-05 2022-05-24 Globus Medical Inc Compliant orthopedic driver
US11751927B2 (en) 2018-11-05 2023-09-12 Globus Medical Inc. Compliant orthopedic driver
US11832863B2 (en) 2018-11-05 2023-12-05 Globus Medical, Inc. Compliant orthopedic driver
US11278360B2 (en) 2018-11-16 2022-03-22 Globus Medical, Inc. End-effectors for surgical robotic systems having sealed optical components
US11602402B2 (en) 2018-12-04 2023-03-14 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11744655B2 (en) 2018-12-04 2023-09-05 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11969224B2 (en) 2018-12-04 2024-04-30 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
CN113473915B (en) * 2019-01-15 2024-06-04 皇家飞利浦有限公司 Real-time tracking of fused ultrasound images and X-ray images
CN113473915A (en) * 2019-01-15 2021-10-01 皇家飞利浦有限公司 Real-time tracking of fused ultrasound and X-ray images
WO2020186075A1 (en) * 2019-03-12 2020-09-17 Micro C, LLC Method of fluoroscopic surgical registration
US11918313B2 (en) 2019-03-15 2024-03-05 Globus Medical Inc. Active end effectors for surgical robots
US11944325B2 (en) 2019-03-22 2024-04-02 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11806084B2 (en) 2019-03-22 2023-11-07 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11571265B2 (en) 2019-03-22 2023-02-07 Globus Medical Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11850012B2 (en) 2019-03-22 2023-12-26 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11419616B2 (en) 2019-03-22 2022-08-23 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11744598B2 (en) 2019-03-22 2023-09-05 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11317978B2 (en) 2019-03-22 2022-05-03 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11382549B2 (en) 2019-03-22 2022-07-12 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11737696B2 (en) 2019-03-22 2023-08-29 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
CN113873936A (en) * 2019-03-26 2021-12-31 皇家飞利浦有限公司 Continuous guidewire identification
US11045179B2 (en) 2019-05-20 2021-06-29 Global Medical Inc Robot-mounted retractor system
US12059804B2 (en) 2019-05-22 2024-08-13 Mako Surgical Corp. Bidirectional kinematic mount
CN113905685A (en) * 2019-05-23 2022-01-07 伯恩森斯韦伯斯特(以色列)有限责任公司 Probe with radiopaque label
US11628023B2 (en) 2019-07-10 2023-04-18 Globus Medical, Inc. Robotic navigational system for interbody implants
US12076097B2 (en) 2019-07-10 2024-09-03 Globus Medical, Inc. Robotic navigational system for interbody implants
US12089902B2 (en) 2019-07-30 2024-09-17 Covidien Lp Cone beam and 3D fluoroscope lung navigation
US11571171B2 (en) 2019-09-24 2023-02-07 Globus Medical, Inc. Compound curve cable chain
US11627924B2 (en) 2019-09-24 2023-04-18 Covidien Lp Systems and methods for image-guided navigation of percutaneously-inserted devices
US11864857B2 (en) 2019-09-27 2024-01-09 Globus Medical, Inc. Surgical robot with passive end effector
US11426178B2 (en) 2019-09-27 2022-08-30 Globus Medical Inc. Systems and methods for navigating a pin guide driver
US11890066B2 (en) 2019-09-30 2024-02-06 Globus Medical, Inc Surgical robot with passive end effector
US11510684B2 (en) 2019-10-14 2022-11-29 Globus Medical, Inc. Rotary motion passive end effector for surgical robots in orthopedic surgeries
US11844532B2 (en) 2019-10-14 2023-12-19 Globus Medical, Inc. Rotary motion passive end effector for surgical robots in orthopedic surgeries
US11992373B2 (en) 2019-12-10 2024-05-28 Globus Medical, Inc Augmented reality headset with varied opacity for navigated robotic surgery
US12064189B2 (en) 2019-12-13 2024-08-20 Globus Medical, Inc. Navigated instrument for use in robotic guided surgery
US11464581B2 (en) 2020-01-28 2022-10-11 Globus Medical, Inc. Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums
US11883117B2 (en) 2020-01-28 2024-01-30 Globus Medical, Inc. Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums
US11382699B2 (en) 2020-02-10 2022-07-12 Globus Medical Inc. Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery
US11690697B2 (en) 2020-02-19 2023-07-04 Globus Medical, Inc. Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
US11207150B2 (en) 2020-02-19 2021-12-28 Globus Medical, Inc. Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
CN113520593A (en) * 2020-04-15 2021-10-22 史赛克欧洲运营有限公司 Techniques for determining the position of one or more imaging markers in an image coordinate system
US11253216B2 (en) 2020-04-28 2022-02-22 Globus Medical Inc. Fixtures for fluoroscopic imaging systems and related navigation systems and methods
US11839435B2 (en) 2020-05-08 2023-12-12 Globus Medical, Inc. Extended reality headset tool tracking and control
US11153555B1 (en) 2020-05-08 2021-10-19 Globus Medical Inc. Extended reality headset camera system for computer assisted navigation in surgery
US11382700B2 (en) 2020-05-08 2022-07-12 Globus Medical Inc. Extended reality headset tool tracking and control
US11510750B2 (en) 2020-05-08 2022-11-29 Globus Medical, Inc. Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
US11838493B2 (en) 2020-05-08 2023-12-05 Globus Medical Inc. Extended reality headset camera system for computer assisted navigation in surgery
US12115028B2 (en) 2020-05-08 2024-10-15 Globus Medical, Inc. Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
US12070276B2 (en) 2020-06-09 2024-08-27 Globus Medical Inc. Surgical object tracking in visible light via fiducial seeding and synthetic image registration
US11317973B2 (en) 2020-06-09 2022-05-03 Globus Medical, Inc. Camera tracking bar for computer assisted navigation during surgery
US11382713B2 (en) 2020-06-16 2022-07-12 Globus Medical, Inc. Navigated surgical system with eye to XR headset display calibration
US11877807B2 (en) 2020-07-10 2024-01-23 Globus Medical, Inc Instruments for navigated orthopedic surgeries
US11793588B2 (en) 2020-07-23 2023-10-24 Globus Medical, Inc. Sterile draping of robotic arms
CN111956327A (en) * 2020-07-27 2020-11-20 季鹰 Image measuring and registering method
CN111956327B (en) * 2020-07-27 2024-04-05 季鹰 Image measurement and registration method
US11737831B2 (en) 2020-09-02 2023-08-29 Globus Medical Inc. Surgical object tracking template generation for computer assisted navigation during surgical procedure
US11890122B2 (en) 2020-09-24 2024-02-06 Globus Medical, Inc. Increased cone beam computed tomography volume length without requiring stitching or longitudinal c-arm movement
US11523785B2 (en) 2020-09-24 2022-12-13 Globus Medical, Inc. Increased cone beam computed tomography volume length without requiring stitching or longitudinal C-arm movement
US11911112B2 (en) 2020-10-27 2024-02-27 Globus Medical, Inc. Robotic navigational system
US12076091B2 (en) 2020-10-27 2024-09-03 Globus Medical, Inc. Robotic navigational system
US11941814B2 (en) 2020-11-04 2024-03-26 Globus Medical Inc. Auto segmentation using 2-D images taken during 3-D imaging spin
US11717350B2 (en) 2020-11-24 2023-08-08 Globus Medical Inc. Methods for robotic assistance and navigation in spinal surgery and related systems
US12070286B2 (en) 2021-01-08 2024-08-27 Globus Medical, Inc System and method for ligament balancing with robotic assistance
US11857273B2 (en) 2021-07-06 2024-01-02 Globus Medical, Inc. Ultrasonic robotic surgical navigation
US11850009B2 (en) 2021-07-06 2023-12-26 Globus Medical, Inc. Ultrasonic robotic surgical navigation
US11439444B1 (en) 2021-07-22 2022-09-13 Globus Medical, Inc. Screw tower and rod reduction tool
US11622794B2 (en) 2021-07-22 2023-04-11 Globus Medical, Inc. Screw tower and rod reduction tool
US11911115B2 (en) 2021-12-20 2024-02-27 Globus Medical Inc. Flat panel registration fixture and method of using same
US11918304B2 (en) 2021-12-20 2024-03-05 Globus Medical, Inc Flat panel registration fixture and method of using same
US12103480B2 (en) 2022-03-18 2024-10-01 Globus Medical Inc. Omni-wheel cable pusher
US12048493B2 (en) 2022-03-31 2024-07-30 Globus Medical, Inc. Camera tracking system identifying phantom markers during computer assisted surgery navigation
US12127803B2 (en) 2023-01-05 2024-10-29 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
CN116593121A (en) * 2023-07-12 2023-08-15 中国航空工业集团公司沈阳空气动力研究所 Aircraft model vibration measurement method based on monitoring camera
CN116593121B (en) * 2023-07-12 2023-10-24 中国航空工业集团公司沈阳空气动力研究所 Aircraft model vibration measurement method based on monitoring camera
US12121240B2 (en) 2023-11-01 2024-10-22 Globus Medical, Inc. Rotary motion passive end effector for surgical robots in orthopedic surgeries
US12121278B2 (en) 2023-12-05 2024-10-22 Globus Medical, Inc. Compliant orthopedic driver

Also Published As

Publication number Publication date
WO2001087136A2 (en) 2001-11-22
EP1278458A4 (en) 2006-10-18
EP1278458A2 (en) 2003-01-29
AU2001257384A1 (en) 2001-11-26
WO2001087136A3 (en) 2002-02-28
ATE515976T1 (en) 2011-07-15
EP1278458B1 (en) 2011-07-13
CA2407616A1 (en) 2001-11-22
US6490475B1 (en) 2002-12-03

Similar Documents

Publication Publication Date Title
US6484049B1 (en) Fluoroscopic tracking and visualization system
US6856827B2 (en) Fluoroscopic tracking and visualization system
US6856826B2 (en) Fluoroscopic tracking and visualization system
US7097357B2 (en) Method and system for improved correction of registration error in a fluoroscopic image
US8131031B2 (en) Systems and methods for inferred patient annotation
US8682413B2 (en) Systems and methods for automated tracker-driven image selection
US7831096B2 (en) Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use
US7344307B2 (en) System and method for integration of a calibration target into a C-arm
US8022990B2 (en) Systems and methods for on-line marker-less camera calibration using a position tracking system
US9320569B2 (en) Systems and methods for implant distance measurement
US7885441B2 (en) Systems and methods for implant virtual review
JP6526688B2 (en) Method of reconstructing a three-dimensional image from a two-dimensional x-ray image
US20080119712A1 (en) Systems and Methods for Automated Image Registration
US20080119725A1 (en) Systems and Methods for Visual Verification of CT Registration and Feedback
EP1173105B1 (en) Apparatus and method for image guided surgery
US6782287B2 (en) Method and apparatus for tracking a medical instrument based on image registration
US7621169B2 (en) Systems and methods for integrating a navigation field replaceable unit into a fluoroscopy system
US20080154120A1 (en) Systems and methods for intraoperative measurements on navigated placements of implants
US9477686B2 (en) Systems and methods for annotation and sorting of surgical images
Yaniv Fluoroscopic X-ray image guidance for manual and robotic orthopedic surgery

Legal Events

Date Code Title Description
AS Assignment

Owner name: VISUALIZATION TECHNOLOGY, INC., MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SEELEY, TERESA;LIN, FAITH;KAPUR, TINA;AND OTHERS;REEL/FRAME:010928/0110

Effective date: 20000613

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: OEC MEDICAL SYSTEMS, INC., UTAH

Free format text: MERGER;ASSIGNOR:GE MEDICAL SYSTEMS NAVIGATION & VISUALIZATION, INC.;REEL/FRAME:013907/0699

Effective date: 20020927

Owner name: VISUALIZATION TECHNOLOGY, INC., MASSACHUSETTS

Free format text: MERGER;ASSIGNORS:GENERAL ELECTRIC COMPANY;EMERALD MERGER CORPORATION;REEL/FRAME:013908/0875

Effective date: 20020212

Owner name: GE MEDICAL SYSTEMS NAVIGATION AND VISUALIZATION, INC.

Free format text: CHANGE OF NAME;ASSIGNOR:VISUALIZATION TECHNOLOGY, INC.;REEL/FRAME:013907/0403

Effective date: 20020417

AS Assignment

Owner name: GE MEDICAL SYSTEMS GLOBAL TECHNOLOGY COMPANY, LLC,

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OEC MEDICAL SYSTEMS, INC.;REEL/FRAME:014567/0262

Effective date: 20030925

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12

AS Assignment

Owner name: STRYKER EUROPEAN HOLDINGS I, LLC, MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GENERAL ELECTRIC COMPANY;REEL/FRAME:046020/0621

Effective date: 20171206