
US10251705B2 - Software for use with deformity correction - Google Patents

Software for use with deformity correction

Info

Publication number
US10251705B2
US10251705B2 (application US15/171,121; US201615171121A)
Authority
US
United States
Prior art keywords
bone
image
model
deformed
orientation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/171,121
Other versions
US20170348054A1 (en)
Inventor
Anup Kumar
Sridhar Anjanappa
Ashish Gangwar
Manash Lahiri
Arpit Gautam
Kanishk Sethi
Sistu Ganesh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Stryker European Operations Holdings LLC
Original Assignee
Stryker European Holdings I LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US15/171,121 (US10251705B2)
Application filed by Stryker European Holdings I LLC
Priority to EP17173905.5A (EP3251625B1)
Assigned to STRYKER EUROPEAN HOLDINGS I, LLC. Assignment of assignors interest. Assignors: GANGWAR, ASHISH; GAUTAM, Arpit; KUMAR, ANUP; SETHI, Kanishk; ANJANAPPA, Sridhar; LAHIRI, Manash; GANESH, Sistu
Priority to US15/626,497 (US10154884B2)
Publication of US20170348054A1
Priority to US16/286,757 (US10603112B2)
Publication of US10251705B2
Application granted
Priority to US16/793,145 (US11020186B2)
Assigned to STRYKER EUROPEAN HOLDINGS III, LLC. Nunc pro tunc assignment. Assignor: STRYKER EUROPEAN HOLDINGS I, LLC
Assigned to STRYKER EUROPEAN OPERATIONS HOLDINGS LLC. Change of name. Assignor: STRYKER EUROPEAN HOLDINGS III, LLC
Priority to US17/242,389 (US11553965B2)
Priority to US18/066,679 (US12029496B2)
Priority to US18/739,413 (US20240325086A1)
Legal status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/25: User interfaces for surgical systems
    • A61B 17/00: Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/56: Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor
    • A61B 17/58: Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor for osteosynthesis, e.g. bone plates, screws, setting implements or the like
    • A61B 17/60: Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor for osteosynthesis, e.g. bone plates, screws, setting implements or the like for external osteosynthesis, e.g. distractors, contractors
    • A61B 17/62: Ring frames, i.e. devices extending around the bones to be positioned
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/102: Modelling of surgical devices, implants or prosthesis
    • A61B 2034/104: Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/108: Computer aided selection or customisation of medical implants or cutting guides
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 2090/376: Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy

Definitions

  • the present disclosure relates to software used in planning the correction of bone deformities preoperatively and/or postoperatively, and in particular relates to autonomously or semi-autonomously creating virtual models to create the correction plan.
  • the Ilizarov external fixation device (or similar system) may be used for such a purpose.
  • the Ilizarov-type devices generally translate bone segments by manipulating the position of rings connected to each bone segment.
  • These external fixation devices generally utilize threaded rods fixated to through-holes in the rings to build the frame. In order to build the desired frame, these rods generally have to have different lengths. Once the frame is installed, the patient or surgeon moves the rings or percutaneous fixation components manually or mechanically by adjusting a series of nuts.
  • a method of generating a correction plan for correcting a deformed bone includes inputting to a computer system a first image of the deformed bone in a first plane and inputting to the computer system a second image of the deformed bone in a second plane.
  • Image processing techniques are employed to identify a plurality of anatomical landmarks of the deformed bone in the first image.
  • the first image of the deformed bone is displayed on a display device.
  • a graphical template of the deformed bone is autonomously generated and graphically overlaid on the first image of the deformed bone on the display device, the graphical template including a plurality of lines, each line connected at each end to a landmark point corresponding to one of the anatomical landmarks.
  • a model of the deformed bone may be autonomously generated based on the graphical template.
  • a first model fixation ring having a first position and orientation and a second model fixation ring having a second position and orientation may be generated and displayed on the display device. At least one of the position and orientation of at least one of the model fixation rings may be graphically manipulated.
  • Combinations of sizes of a plurality of model struts to connect the models of the first and second fixation rings may be determined with an algorithm using the position and orientation of the first and second model fixation rings.
  • a first position for a limiting anatomical structure may be input to the computer system, the limiting anatomical structure defining a location having a maximum distraction value.
  • the model rings and the model struts may be simultaneously displayed on the display device and overlap the first image of the deformed bone on the display device.
  • the first image of the deformed bone may include visible soft tissue structures.
  • the limiting anatomical structure may be input graphically using an input device, which may be a computer mouse.
  • a second position for the limiting anatomical structure may be input to the computer system while the model rings and the model struts are simultaneously displayed on the display device and overlap the second image of the deformed bone on the display device, the second image of the deformed bone including visible soft tissue structures.
  • Each landmark point of the graphical template may be configured to be repositioned via an input device. Upon repositioning one of the landmark points, each line connected to the repositioned landmark point may remain connected to the repositioned landmark point.
  • the first image of the deformed bone may be an x-ray image displayed on the visual medium in one of an anterior-posterior and a lateral view
  • the second image of the deformed bone may be an x-ray image displayed on the visual medium in the other of an anterior-posterior and a lateral view
  • the first and second images of the deformed bone may include images of physical rings and physical struts of an external fixation frame coupled to a patient.
  • a position and orientation of the physical rings and a length and orientation of the physical struts may be autonomously determined based on the first and second images.
  • the determined position and orientation of the physical rings and the determined length and orientation of the physical struts may be displayed on the visual medium.
  • At least one of the determined position and the determined orientation of at least one of the physical rings may be graphically manipulated.
  • the determined orientation of at least one of the struts may be graphically manipulated.
  • FIG. 1 illustrates a home page screen of a deformity correction application.
  • FIGS. 2A-I illustrate various deformity definition screens of the deformity correction application.
  • FIGS. 3A-B illustrate various ring configuration screens of the deformity correction application in a preoperative (“pre-op”) mode.
  • FIGS. 4A-B illustrate various strut configuration screens of the deformity correction application in the pre-op mode.
  • FIG. 5 illustrates a limiting anatomical structure input screen of the deformity correction application in the pre-op mode.
  • FIG. 6A illustrates a ring configuration screen of the deformity correction application in a postoperative (“post-op”) mode prior to a determination step.
  • FIG. 6B illustrates a ring configuration screen of the deformity correction application in the post-op mode during the determination step.
  • FIG. 7 illustrates a ring configuration screen of the deformity correction application in the post-op mode after the determination step.
  • FIG. 8 illustrates a limiting anatomical structure input screen of the deformity correction application in the post-op mode.
  • software aids a user, such as a physician, surgeon, or other medical personnel, in planning and carrying out the correction of a bone deformity with a limb reconstruction frame, for example via a web application.
  • Other software for creating a correction plan for an external fixation frame is described in U.S. Patent Publication No. 2014/0236153, the contents of which are hereby incorporated by reference herein.
  • the login screen preferably includes a username field and password field in which the user enters, respectively, a username and password to gain further access to the application.
  • This step of authentication may, for example, help maintain compliance with patient privacy regulations.
  • a new user account may have to be created.
  • upon logging in, the user is taken to the home screen 110.
  • the user has the option of starting a new case for a patient whose information has not yet been entered into the software.
  • the user may enter a case name and/or number for later reference, and may also enter any desired notes regarding the case to be saved with the case.
  • a skeletal representation 112 may also be provided, for example on the home screen 110 , so that the user may select the relevant bone. As shown in FIG. 1 , the left femur has been selected.
  • the user may choose to begin the case as a pre-op case or a post-op case, with each procedure being described separately below.
  • the pre-op mode is used prior to the surgical fixation of the limb reconstruction device to the deformed bone.
  • the post-op mode is to be used after the limb reconstruction device, with associated rings and struts, has already been affixed to the patient.
  • the pre-op mode can be used alone, the post-op mode can be used alone, or each mode can be used prior to and following surgery, respectively.
  • the user may be brought to a case details screen which may allow entering, viewing, or modifying patient details such as the patient's name, gender, race, date of birth, anatomy relevant to the case, and notes as the user sees fit.
  • the user may begin a deformity definition procedure.
  • the user may be initially presented with a first deformity definition screen 200 A, as shown in FIG. 2A , which may prompt a user to open or otherwise load one or more medical images, such as X-ray images, onto the application.
  • although described herein in terms of X-ray images, it should be understood that other types of medical images, such as “slices” of a CT-scan, may also be used with the methods and systems described herein.
  • multiple medical images such as images of the same anatomy in different views (e.g. anterior-posterior view and lateral view) may be loaded to the application. This may be accomplished by any number of suitable methods, for example by choosing one or more image files that have been previously saved to memory on the computer running the application.
  • the medical image 201 may be shown on a second deformity definition screen 200 B.
  • These medical images 201 may help the user to define the bone deformity, described more fully below.
  • before, during, or after uploading, the user may also provide details relating to the image 201, such as the view (e.g. lateral plane) in which the image was taken.
  • the user may scale the image 201 to the application.
  • a size reference R, such as a ruler, may be included in the image 201 so that a measurement unit in the application (e.g. a pixel) can be scaled to a real measurement unit represented in the image 201 (e.g. a millimeter).
  • This step may be performed by the user by selecting a drawing tool, such as a line or circle, and creating a drawing on the image, preferably in relation to the size reference R.
  • the user may enter the measurement of the drawn line (or other geometry) that represents the real measurement value, which the application may then correlate to the application measurement unit.
  • this scaling step may be performed automatically, for example by the application recognizing a defining characteristic of the size reference R, which may be compared to a real measurement value already stored in the application.
  • relevant axes of the anatomy may be defined, either autonomously or semi-autonomously.
  • a user may define, aided by the application, the mechanical and/or anatomical axes of the bone(s) against the backdrop of image 201 on screen 200 C.
  • the user may select the “mechanical axis” radio button on screen 200 C and using an input device, may define relevant anatomic landmarks.
  • one (or two) femoral mechanical axis indicia and one (or two) tibial mechanical axis indicia may appear on the image 201 .
  • two mechanical femoral axis indicia take the form of lines 280
  • two mechanical tibial axis indicia take the form of lines 290 .
  • Each line 280 , 290 may include an endpoint that the user can drag to a different position on the image 201 to help define the relevant axes.
  • the user may drag a first end of line 280 to a center of the femoral head on one leg and the other end of line 280 to the articular surface of the distal femur, and the process may be repeated for the other leg if desired.
  • the user may drag a first end of line 290 to the articular surface of the proximal tibia and the other end of line 290 to the center of the ankle joint.
  • the application may calculate and display relevant mechanical axis measurements, including, for example, the lateral proximal femoral angle (“LPFA”), the mechanical lateral distal femoral angle (“mLDFA”), the lateral proximal tibial angle (“LPTA”), and the lateral distal tibial angle (“LDTA”), although other relevant measurements may also be calculated and displayed. In order to make these measurements, additional lines must be provided.
  • the LPFA is measured as the angle between the line joining the trochanteric tip to the femoral head center 281 and the femoral mechanical axis as represented by line 280 .
  • the mLDFA is measured as the angle that a condylar tangent line 282 makes with the line representing the femoral mechanical axis as represented by line 280 .
  • Lines 281 and 282 may be displayed on the medical image 201 and manipulated as desired.
  • the LPTA is measured laterally as the angle between a tibial plateau tangent line 291 and the line representing the tibial mechanical axis 290
  • the LDTA is measured laterally as the intercept of a tibial articular plafond line 292 with the line representing the tibial mechanical axis 290 .
  • Lines 291 and 292 may also be displayed on the medical image 201 and manipulated by the user.
  • the application may automatically recognize the relevant landmarks and place the lines on the image 201 , with the user having the ability to modify the placement of lines 280 - 282 and 290 - 292 if such placement is incorrect.
  • the process may be repeated for each leg if two legs are shown in medical image 201 .
  • the application may also compare the calculated angles described above to a range of values considered normal, which may be stored in memory, and highlight or otherwise indicate to the user any calculated angle falling outside the range. As shown in FIG. 2 , the LDTA of the leg with the deformed tibia is calculated as 65°, which is outside a range considered normal, which may be for example between 86° and 92°, leading to the abnormal LDTA being highlighted.
  • a rectangle 270 (or other shape) may be overlaid on the medical image 201 with the option for the user to resize and/or reposition the rectangle 270 to select the relevant deformed anatomy that is to be corrected.
  • the image 201 may be modified, for example by cropping the image so that only the relevant deformed anatomy is displayed, as shown on screen 200 E of FIG. 2E .
  • the cropped image 201′ may be further modified with a number of image processing features, including, for example, resizing, repositioning (e.g. rotating), or changing the contrast of the cropped image 201′.
  • the user may apply an exposure filter to minimize or eliminate the tissue region shown in the cropped image 201 ′ for a better view of the deformed bone.
  • the user may further define the deformity.
  • the cropped image 201 ′ of the deformed right tibia is illustrated in FIG. 2F on a new screen 200 F after being rotated and filtered.
  • the mechanical tibial axis 290 from screen 200 C may be displayed, along with an anatomical axis 295 a of the proximal tibia and an anatomical axis 295 b of the distal tibia, the axes 295 a and 295 b being different due to the deformity.
  • the lines representing the anatomical axes 295 a , 295 b may be automatically placed on the cropped image 201 ′, for example based upon landmarks of the tibia, with the user having the option to modify the position of the lines 295 a , 295 b .
  • a template 260 may be displayed on the cropped image 201 ′ to identify landmarks on the deformed tibia.
  • the template 260 includes a plurality of landmark points corresponding to anatomical landmarks.
  • in the illustrated example, the landmark points of the template 260 include the medial and lateral edges of the proximal tibia, the center of the proximal tibia, the medial and lateral edges of the distal tibia, and the medial and lateral surfaces at the location of the deformity in the tibia.
  • the template 260 may be automatically placed on the cropped image 201 ′, for example based upon landmarks of the tibia, with the user having the option of moving one or more of the landmark points to a different position on the cropped image 201 ′, resulting in the connecting lines repositioning and altering the shape of template 260 .
  • each line connected to the repositioned landmark point remains connected to the repositioned landmark point.
  • a radio button may be selected to display a model bone, in this case a model tibia 202 , overlaid on the cropped image 201 ′.
  • the model bone may be selected from a library of model bones based, at least in part, on the particular anatomy and patient information entered upon creating the case.
  • a button may be clicked on screen 200 G to deform the model bone 202 such that the landmarks of the model tibia align with the landmarks defined by template 260 .
  • the deformity may be further defined on deformity definition screen 200 H, as shown in FIG. 2H .
  • a line 262 representing the osteotomy plane, and a point 264 representing the deformity apex, may each be shown on the cropped image 201 ′, whether or not the model 202 is simultaneously shown on screen 200 H.
  • the model bone 202 may include separate proximal and distal (or reference and moving) portions that may be manipulated directly by the user. For example, the user may click one of the portions of model bone 202 and drag the portion into a different position, with calculated values (e.g. angulation, translation) updating as the model bone 202 is manipulated.
  • the deformity definition is described above with reference to a medical image 201 in an anterior-posterior plane, it is preferable that some or all of the deformity definition steps are additionally performed on a medical image in a different plane, such as a medial-lateral plane or a superior-inferior plane, for example.
  • the medical image 201 may alternatively be viewed in axial, coronal or sagittal planes, for example.
  • the deformed model bone 202 is shown over a cropped image 201 ′ of the deformed tibia in an AP view and the model bone 202 is also shown on an adjacent image of the deformed tibia in a lateral view.
  • the parameters of the model bone 202 , the mechanical and anatomic axes, and the template 260 may be revised in either view to update the deformity parameters until the user is satisfied that the model bone 202 accurately reflects the patient's deformed bone.
  • the system and methods described herein provide a user the ability to accurately define the deformity of the deformed bone by manipulating on-screen representations of the bone or relevant parameters or landmarks with an input device, such as a mouse, without needing to manually enter numerical values relating to the deformity.
  • the user can proceed to the first ring configuration screen 300 A ( FIG. 3A ).
  • the user may input the size of the desired rings, including a reference ring 305 and a moving ring 310 .
  • a user may be able to choose a 155 mm, 180 mm, or 210 mm ring.
  • the user may also be able to choose the type of ring, such as a full ring, partial ring, open ring, or closed ring.
  • Different types of rings are known in the art and the inclusion of different rings as options in the software is largely a matter of design choice.
  • the rings 305 , 310 are displayed along with the model bone 202 on the screen, preferably in an AP view, a lateral view, and/or an axial view. Additional views, such as a perspective view, may be included.
  • the model bone 202 and rings 305 , 310 are displayed in an AP view with options to change to lateral, axial, and perspective views by choosing the corresponding tab on screen.
  • the cropped medical image 201 ′ is also displayed, although either the model bone 202 or the cropped image 201 ′ may be removed by clicking the appropriate radio button on screen.
  • the portion of the model bone 202 proximal to the deformity and the portion of the model bone 202 distal to the deformity are based on the input received during the deformity definition described above.
  • once a size and/or type of ring is selected for the reference ring 305, it is displayed perpendicular to the reference bone fragment (in the illustrated example, the portion of the model bone 202 proximal to the deformity) with a longitudinal axis of the reference bone fragment extending through the center of the reference ring 305.
  • the moving ring 310 is displayed perpendicular to the non-reference bone fragment (in the illustrated example, the portion of the model bone 202 distal to the deformity) with a longitudinal axis of the non-reference bone fragment extending through the center of the moving ring 310 .
  • the rings 305 , 310 may also be placed with a default axial translation that can be changed.
  • the reference ring 305 may have a default axial translation of approximately 50 mm with respect to the deformity apex
  • the moving ring 310 may have a default axial translation of approximately 150 mm with respect to the deformity apex.
  • the user may enter numerical values for position and orientation parameters for the rings 305, 310 by inputting values, clicking the “up” or “down” arrows associated with the particular position or orientation, or by interacting with the rings 305, 310 on screen, for example by clicking one of the rings 305, 310 with a mouse and dragging or rotating the ring to a new position and/or orientation. Because this is the pre-op mode and no fixation device has yet been attached to the patient, the user chooses the ring sizes, positions, and orientations that he believes will be effective for the correction based, for example, on his experience and knowledge.
  • as numerical values are entered, the graphical representations of the rings 305, 310 change to reflect the new values. If the rings 305, 310 are being manipulated graphically (e.g. via dragging on screen with a mouse), the numerical values associated with the position and/or orientation may update accordingly.
  • the position values may include an AP translation, a lateral translation, an axial translation, and an axial orientation.
  • the moving ring 310 may include these values, and additional values may include an AP orientation and a lateral orientation. Any of the above-described values may be displayed on screen to assist the user in understanding the position of the rings 305 , 310 relative to the model 202 .
  • the user may position multiple views of the model bone 202 and the rings 305 , 310 on the screen simultaneously.
  • screen 300 B illustrates the model bone 202 with rings 305, 310 positioned thereon simultaneously in the AP and lateral views with the cropped image 201′ hidden.
  • the first strut configuration screen 400 A allows the user to initiate an automatic calculation of possible strut combinations to connect the reference ring 305 to the moving ring 310 (a simplified geometric sketch of this calculation is given after this list).
  • a plurality of graphical representations of struts 410 are illustrated on the screen in their intended initial positions with respect to the graphical representation of the reference ring 305 and the moving ring 310 .
  • the user also has the option to display all of the calculated combinations of struts 410 that may be used with the external fixator. For example, although one particular combination of struts 410 is illustrated on screen, multiple combinations may be calculated as possibilities.
  • the application may default to showing the combination of struts 410 that requires the fewest number of strut change-outs during the deformity correction, but other options may be available for the user to choose based on his or her particular desire.
  • the possible strut combinations may be presented in a table with a description of each strut in a particular combination.
  • the user may cause other views of the model bone 202 , rings 305 , 310 and struts 410 to be illustrated on screen, either individually or simultaneously.
  • the model bone 202 , rings 305 , 310 , and struts 410 are shown in the AP, lateral, axial, and perspective views on screen 400 B in FIG. 4B . This may help the user better visualize the external fixator system.
  • the orientation of each strut 410 including strut length and strut angle, may be displayed.
  • the user may proceed to a limiting anatomical structure (“LAS”) input screen 500 .
  • the LAS input screen 500 allows a user to input a position for a limiting anatomical structure.
  • the user may input a value (or the application may provide a default value) for a maximum distraction rate, which is the maximum distance a structure may move over time. For example, nerves, soft tissue, or even ends of the bone may be damaged if the rate of distraction at these points is too great.
  • the user may define a LAS point 510 on screen 500 by dragging the LAS point 510 to the desired position. This step may be done in both the AP and lateral views to define the LAS point 510 in three dimensions.
  • the LAS point 510 defines a position that cannot have a distraction rate greater than the maximum distraction rate, so that the anatomy at the LAS point 510 does not distract too quickly during correction and become damaged. For example, neurovascular tissue may sustain stretch damage if the tissue experiences too great a distraction rate.
  • although a user may choose the position of the LAS point 510 based on his experience and the model bone 202 on screen 500, it would be helpful for the user to be able to visualize soft tissue when defining the position of the LAS point 510, as soft tissue may be the anatomy at risk of damage from the deformity correction.
  • the cropped image 201 ′ may be unhidden, with one or more of the model bone 202 , rings 305 , 310 , and struts 410 simultaneously being shown on screen 500 .
  • the user may view the patient's soft tissue in addition to the deformed bone on a screen with the models of the bone 202 , rings 305 , 310 , and/or struts 410 .
  • the visualization of the soft tissue may aid the user in precisely defining the LAS point 510 to reduce the chance of injury to the patient's LAS during correction of the deformed bone.
  • the user may generate the correction plan.
  • the user may enter the date on which the user or patient will begin adjusting the fixation frame according to the correction plan.
  • the user commands the computer to generate the correction plan, which may be displayed on screen.
  • the correction plan may include, for example, the position and angle of each strut of the fixation frame for each day of the correction, along with the date and day number (e.g. first day, second day) of the correction plan.
  • the correction plan may also show a relationship between positions of the struts and discrete user or patient actions.
  • the correction plan may indicate that the user or patient should increase the length of that strut four separate times, for example by 0.25 millimeters in the morning, 0.25 millimeters at noon, 0.25 millimeters in the evening and another 0.25 millimeters at night.
  • the correction plan may also aid a physician or surgeon in monitoring the progress of the correction of the bone deformity, for example by checking at periodic intervals that the struts of the fixation frame are in the proper position as called for by the correction plan.
  • the application can be used in a post-op mode in addition or as an alternative to the pre-op mode.
  • This mode can be used once the patient has already undergone surgery to attach the fixation frame to the deformed bone.
  • the post-op mode can be used as an alternative to the pre-op mode, for example in cases in which time is limited and surgery must be performed without the benefit of the planning provided in the pre-op mode described above.
  • the post-op mode can be used in addition to the pre-op mode, if the physician was unable to affix the fixation frame to the bone as suggested by the pre-op mode.
  • the steps described above with reference to the login screen and home page 110 are the same as in the pre-op mode ( FIG. 1 ).
  • accurate models of the mounted frame should be created in the application. Any misinterpretations or calculation errors during the modeling process can affect the correction plan.
  • the application is capable of recognizing the anatomical structures and frame components in the medical image 201 (or cropped image 201 ′) by using image processing algorithms and coordinate geometry theories to provide accurate measurements of the fixation frame and anatomy, either in a fully autonomous or semi-autonomous fashion.
  • a first step, shown on screen 600 A, may include scaling the medical images 201.
  • a size reference R with a known size may be included in the medical image 201 , with the known size stored in memory so that the application is able to automatically scale each medical image 201 to the correct size.
  • the user may initiate a processing step in which the application determines the size and orientations of the physical reference ring 605 , the physical moving ring 610 , and the physical struts 710 .
  • the application may process the medical images 201 , with a first recognition stage employing texture guided shape analysis algorithms that recognize and identify the structures based on textures and/or shapes in the images 201 .
  • the application employs projective geometry techniques to determine the position and orientation of the physical rings 605 , 610 and physical struts 710 .
  • This step may include the calculation of the radius (or diameter) of each physical ring 605, 610, the angular orientations of each ring 605, 610, the length of each physical strut 710, the angular orientation of each physical strut 710, and the connection points of each strut 710 to each ring 605, 610 (an elementary sketch of how a ring's elliptical silhouette constrains its size and tilt is given after this list).
  • the application may also recognize the patient's bone structures as well as the position and orientations of relevant fragments. During this step, the application recognizes a reference fragment (as illustrated on screen 600 B of FIG. 6B , this is the bone fragment proximal to the deformity) and a moving fragment (as illustrated on screen 600 B of FIG. 6B , this is the bone fragment distal to the deformity).
  • the bone structures may be recognized using image processing techniques that use structural and textural features along with machine learning techniques, including, for example, statistical shape modelling. Subspace analysis techniques for bone detection may make use of shape, texture distributions, and kernel method based learning techniques for accurate extraction of anatomical structures.
  • indicia I may be provided on screen to indicate the structures as identified by the application. As shown in FIG. 6B , such indicia I may include one or more points along the physical reference ring 605 , one or more points along the physical moving ring 610 , and one or more relevant positions of the bone.
  • the relevant parameters are displayed on a ring configuration screen 700 as shown in FIG. 7 .
  • Relevant parameters, which may be the same as those described with respect to the pre-op mode and FIG. 3B, may be displayed so that the user is able to confirm that the calculations performed by the application are correct. If the user desires to alter any of the parameters, he may use an input device (e.g. a mouse or keyboard) to activate the “up” or “down” arrow on screen 700 next to the relevant parameter to increase or decrease the parameter, or use the input device to graphically change one of the lines representing the relevant parameter on the image 201.
  • the user may similarly confirm or revise relevant parameters calculated with respect to the physical struts 710 on a strut configuration page (not shown).
  • the user may advance to a LAS input page 800 , as shown in FIG. 8 , to indicate the position of the LAS point 810 .
  • the procedure regarding the input of the position of the LAS point 810 may be the same as described in connection with FIG. 5 in the pre-op mode.
  • the application preferably automatically and correctly identifies the bone fragments, the physical components of the fixation frame, and the positions and orientations of the fragments and components.
  • if the user desires to change the automatically determined identifications, positions, and orientations of the frame components, he may do so as described above with respect to FIGS. 7-8 by adjusting the parameters on screen.
  • the application may display the template 260 over the model bone 202 (and/or medical image 201 ) similar to that shown and described in connection to FIGS. 2F-G in the pre-op mode.
  • this automatic recognition process may be used when initiating the deformity measurement in the pre-op mode as well.
  • if the user desires to alter the automatically populated template, he may alter the template graphically by moving the relevant landmarks of the template, similar to the method described in connection with FIGS. 2F-G.
  • the user may generate a correction schedule in the same manner as described above with respect to the pre-op mode.
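
The strut-combination calculation, the day-by-day correction plan, and the maximum-distraction-rate check described in the list above all follow from elementary rigid-body geometry once the two ring poses are known. The Python sketch below illustrates the idea for a generic hexapod-style frame: it computes strut lengths from the ring poses, interpolates the moving ring from its mounted position to the corrected position over a number of days, and reports how far a limiting anatomical structure point moves each day. Everything here (mount layout, poses, the LAS point, the 1 mm/day limit, function names) is an illustrative assumption, not the patent's algorithm or any particular frame's geometry.

```python
import numpy as np

def ring_mounts(radius, offset_deg=0.0):
    """Six strut attachment points evenly spaced around a ring (hexapod-style
    layout; real frames pair the mounts differently, this is only illustrative)."""
    a = np.radians(offset_deg + np.arange(6) * 60.0)
    return np.stack([radius * np.cos(a), radius * np.sin(a), np.zeros(6)], axis=1)

def pose_matrix(translation, rot_deg_xyz):
    """4x4 rigid transform from XYZ Euler angles (degrees) and a translation."""
    rx, ry, rz = np.radians(rot_deg_xyz)
    Rx = np.array([[1, 0, 0], [0, np.cos(rx), -np.sin(rx)], [0, np.sin(rx), np.cos(rx)]])
    Ry = np.array([[np.cos(ry), 0, np.sin(ry)], [0, 1, 0], [-np.sin(ry), 0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0], [np.sin(rz), np.cos(rz), 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = translation
    return T

def transform(points, T):
    return points @ T[:3, :3].T + T[:3, 3]

def strut_lengths(ref_mounts, moving_mounts_local, moving_pose):
    """Each strut length is the distance between a reference-ring mount and the
    corresponding moving-ring mount expressed in the reference ring's frame."""
    return np.linalg.norm(transform(moving_mounts_local, moving_pose) - ref_mounts, axis=1)

ref_mounts_305 = ring_mounts(radius=90.0)                  # 180 mm reference ring
mov_mounts_310 = ring_mounts(radius=90.0, offset_deg=30.0)

start_t, start_r = np.array([10.0, 0.0, 150.0]), np.array([12.0, 0.0, 0.0])  # mounted (deformed)
end_t, end_r = np.array([0.0, 0.0, 150.0]), np.array([0.0, 0.0, 0.0])        # corrected

las_local = np.array([30.0, 0.0, -40.0])   # LAS point fixed to the moving fragment
MAX_RATE_MM_PER_DAY = 1.0                  # illustrative maximum distraction value

def correction_plan(days):
    prev_las = transform(las_local, pose_matrix(start_t, start_r))
    for day in range(1, days + 1):
        a = day / days
        pose = pose_matrix((1 - a) * start_t + a * end_t, (1 - a) * start_r + a * end_r)
        las = transform(las_local, pose)
        lengths = strut_lengths(ref_mounts_305, mov_mounts_310, pose)
        step = np.linalg.norm(las - prev_las)
        ok = "ok" if step <= MAX_RATE_MM_PER_DAY else "too fast -> add days"
        print(f"day {day:2d}  struts(mm) {np.round(lengths, 1)}  LAS moved {step:.2f} mm ({ok})")
        prev_las = las

correction_plan(days=10)
```

A daily strut change could then be split into discrete adjustments (for example four 0.25 mm turns, as in the text above), and the number of days increased until every day satisfies the limiting-structure constraint.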
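
For the post-op recognition described above, the text refers to texture-guided shape analysis and projective geometry. As a much simpler intuition for how a ring's pose can be read from a single radiograph, the sketch below uses the fact that, under an idealized parallel projection, a circular ring of diameter D whose plane is tilted by an angle t relative to the image plane projects to an ellipse with major axis D and minor axis D·cos(t). The numbers and the parallel-projection assumption are illustrative; real X-rays are cone-beam projections and the patent's method is not reproduced here.

```python
import math

def ring_from_ellipse(major_axis_px, minor_axis_px, mm_per_px):
    """Parallel-projection estimate of a circular ring's diameter and tilt from
    the ellipse fitted to its silhouette in a scaled radiograph."""
    diameter_mm = major_axis_px * mm_per_px
    ratio = min(1.0, minor_axis_px / major_axis_px)
    tilt_deg = math.degrees(math.acos(ratio))   # tilt of ring plane vs. image plane
    return diameter_mm, tilt_deg

# Illustrative ellipse measurements for the reference ring 605 in one view.
d, t = ring_from_ellipse(major_axis_px=742, minor_axis_px=188, mm_per_px=0.243)
print(f"ring diameter about {d:.0f} mm, ring plane tilted about {t:.0f} deg from the image plane")
```

Combining such estimates from the AP and lateral views, together with the known discrete ring sizes, narrows down the ring pose considerably before any fine adjustment by the user.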

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Robotics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Surgical Instruments (AREA)

Abstract

A method of generating a correction plan for correcting a deformed bone includes inputting to a computer system a first image of the deformed bone in a first plane and inputting to the computer system a second image of the deformed bone in a second plane. Image processing techniques are employed to identify a plurality of anatomical landmarks of the deformed bone in the first image. The first image of the deformed bone is displayed on a display device. A graphical template of the deformed bone is autonomously generated and graphically overlaid on the first image of the deformed bone on the display device, the graphical template including a plurality of lines, each line connected at each end to a landmark point corresponding to one of the anatomical landmarks. A model of the deformed bone may be autonomously generated based on the graphical template.

Description

FIELD OF THE INVENTION
The present disclosure relates to software used in planning the correction of bone deformities preoperatively and/or postoperatively, and in particular relates to autonomously or semi-autonomously creating virtual models to create the correction plan.
BACKGROUND OF THE INVENTION
Currently, external fixation systems may be used to correct skeletal deformities using the distraction osteogenesis process, for example. The Ilizarov external fixation device (or similar system) may be used for such a purpose. The Ilizarov-type devices generally translate bone segments by manipulating the position of rings connected to each bone segment.
These external fixation devices generally utilize threaded rods fixated to through-holes in the rings to build the frame. In order to build the desired frame, these rods generally have to have different lengths. Once the frame is installed, the patient or surgeon moves the rings or percutaneous fixation components manually or mechanically by adjusting a series of nuts.
As fixation devices become more complex, the task of determining the optimal lengths and positions of the struts with respect to rings of the fixation frame, as well as creating a correction plan for manipulating the struts to correct the bone deformity, becomes more difficult.
The increasing difficulty of these determinations decreases the attractiveness of using complex fixation frames. It would be advantageous to have an at least partially automated method for determining the optimal configuration of a fixation frame in reference to a deformed bone, as well as a correction plan for manipulating the fixation frame to correct the bone deformity.
BRIEF SUMMARY OF THE INVENTION
According to a first embodiment of the disclosure, a method of generating a correction plan for correcting a deformed bone includes inputting to a computer system a first image of the deformed bone in a first plane and inputting to the computer system a second image of the deformed bone in a second plane. Image processing techniques are employed to identify a plurality of anatomical landmarks of the deformed bone in the first image. The first image of the deformed bone is displayed on a display device. A graphical template of the deformed bone is autonomously generated and graphically overlaid on the first image of the deformed bone on the display device, the graphical template including a plurality of lines, each line connected at each end to a landmark point corresponding to one of the anatomical landmarks. A model of the deformed bone may be autonomously generated based on the graphical template. A first model fixation ring having a first position and orientation and a second model fixation ring having a second position and orientation may be generated and displayed on the display device. At least one of the position and orientation of at least one of the model fixation rings may be graphically manipulated.
Combinations of sizes of a plurality of model struts to connect the models of the first and second fixation rings may be determined with an algorithm using the position and orientation of the first and second model fixation rings. A first position for a limiting anatomical structure may be input to the computer system, the limiting anatomical structure defining a location having a maximum distraction value. During the step of inputting the first position for the limiting anatomical structure, the model rings and the model struts may be simultaneously displayed on the display device and overlap the first image of the deformed bone on the display device. During the step of inputting the first position for the limiting anatomical structure, the first image of the deformed bone may include visible soft tissue structures. The limiting anatomical structure may be input graphically using an input device, which may be a computer mouse. A second position for the limiting anatomical structure may be input to the computer system while the model rings and the model struts are simultaneously displayed on the display device and overlap the second image of the deformed bone on the display device, the second image of the deformed bone including visible soft tissue structures.
Each landmark point of the graphical template may be configured to be repositioned via an input device. Upon repositioning one of the landmark points, each line connected to the repositioned landmark point may remain connected to the repositioned landmark point.
The first image of the deformed bone may be an x-ray image displayed on the visual medium in one of an anterior-posterior and a lateral view, and the second image of the deformed bone may be an x-ray image displayed on the visual medium in the other of an anterior-posterior and a lateral view. The first and second images of the deformed bone may include images of physical rings and physical struts of an external fixation frame coupled to a patient. A position and orientation of the physical rings and a length and orientation of the physical struts may be autonomously determined based on the first and second images. The determined position and orientation of the physical rings and the determined length and orientation of the physical struts may be displayed on the visual medium. At least one of the determined position and the determined orientation of at least one of the physical rings may be graphically manipulated. The determined orientation of at least one of the struts may be graphically manipulated.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a home page screen of a deformity correction application.
FIGS. 2A-I illustrate various deformity definition screens of the deformity correction application.
FIGS. 3A-B illustrate various ring configuration screens of the deformity correction application in a preoperative (“pre-op”) mode.
FIGS. 4A-B illustrate various strut configuration screens of the deformity correction application in the pre-op mode.
FIG. 5 illustrates a limiting anatomical structure input screen of the deformity correction application in the pre-op mode.
FIG. 6A illustrates a ring configuration screen of the deformity correction application in a postoperative (“post-op”) mode prior to a determination step.
FIG. 6B illustrates a ring configuration screen of the deformity correction application in the post-op mode during the determination step.
FIG. 7 illustrates a ring configuration screen of the deformity correction application in the post-op mode after the determination step.
FIG. 8 illustrates a limiting anatomical structure input screen of the deformity correction application in the post-op mode.
DETAILED DESCRIPTION
In one embodiment of the disclosure, software aids a user, such as a physician, surgeon, or other medical personnel, in planning and carrying out the correction of a bone deformity with a limb reconstruction frame, for example via a web application. Other software for creating a correction plan for an external fixation frame is described in U.S. Patent Publication No. 2014/0236153, the contents of which are hereby incorporated by reference herein.
Upon starting the application, the user is presented with a login screen. The login screen preferably includes a username field and password field in which the user enters, respectively, a username and password to gain further access to the application. This step of authentication may, for example, help maintain compliance with patient privacy regulations. In cases where a first time user tries to gain further access to the application, a new user account may have to be created.
As shown in FIG. 1, upon logging in, the user is taken to the home screen 110. From the home screen 110, the user has the option of starting a new case for a patient whose information has not yet been entered into the software. Prior to starting the new case, the user may enter a case name and/or number for later reference, and may also enter any desired notes regarding the case to be saved with the case. A skeletal representation 112 may also be provided, for example on the home screen 110, so that the user may select the relevant bone. As shown in FIG. 1, the left femur has been selected. With the desired anatomy entered, and the relevant case name and number information entered, the user may choose to begin the case as a pre-op case or a post-op case, with each procedure being described separately below. Generally speaking, the pre-op mode is used prior to the surgical fixation of the limb reconstruction device to the deformed bone. The post-op mode is to be used after the limb reconstruction device, with associated rings and struts, has already been affixed to the patient. In a single case, the pre-op mode can be used alone, the post-op mode can be used alone, or each mode can be used prior to and following surgery, respectively.
After the user begins the case as a pre-op case, the user may be brought to a case details screen which may allow entering, viewing, or modifying patient details such as the patient's name, gender, race, date of birth, anatomy relevant to the case, and notes as the user sees fit. With the case details entered as desired, the user may begin a deformity definition procedure.
The user may be initially presented with a first deformity definition screen 200A, as shown in FIG. 2A, which may prompt a user to open or otherwise load one or more medical images, such as X-ray images, onto the application. Although described herein in terms of X-ray images, it should be understood that other types of medical images, such as “slices” of a CT-scan, may also be used with the methods and systems described herein. It should also be understood that multiple medical images, such as images of the same anatomy in different views (e.g. anterior-posterior view and lateral view), may be loaded to the application. This may be accomplished by any number of suitable methods, for example by choosing one or more image files that have been previously saved to memory on the computer running the application. In the pre-op mode, there will generally not be a fixation device shown, as the fixation device has not yet been implanted onto the patient. Once chosen, the medical image 201, for example as shown in FIG. 2B, may be shown on a second deformity definition screen 200B. These medical images 201 may help the user to define the bone deformity, described more fully below. Before, during, or after uploading, the user may also provide details relating to the image 201, such as the view (e.g. lateral plane) in which the image was taken. With the image 201 shown on screen 200B, the user may scale the image 201 to the application. For example, a size reference R, such as a ruler, may be included in the image 201, so that the user may scale a measurement unit in the application (e.g. a pixel) to a real measurement unit represented in the image 201 (e.g. a millimeter). This step may be performed by the user by selecting a drawing tool, such as a line or circle, and creating a drawing on the image, preferably in relation to the size reference R. The user may enter the measurement of the drawn line (or other geometry) that represents the real measurement value, which the application may then correlate to the application measurement unit. Alternatively, this scaling step may be performed automatically, for example by the application recognizing a defining characteristic of the size reference R, which may be compared to a real measurement value already stored in the application.
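
The scaling step described above reduces to a ratio between the length of the line the user draws over the size reference R and the real length the user types in. The following Python sketch is a minimal illustration of that arithmetic; the function name, coordinates, and the 100 mm ruler segment are illustrative assumptions, not values from the patent.

```python
import math

def mm_per_pixel(p1, p2, known_length_mm):
    """Image scale (mm per pixel) from a line drawn over the size reference R.

    p1, p2          -- (x, y) pixel endpoints of the drawn line
    known_length_mm -- real-world length of that segment, entered by the user
    """
    length_px = math.dist(p1, p2)            # Euclidean length of the drawn line
    if length_px == 0:
        raise ValueError("drawn line has zero length; cannot scale the image")
    return known_length_mm / length_px

# Example: a 100 mm stretch of the ruler spans 412 pixels in the uploaded X-ray.
scale = mm_per_pixel((120, 830), (120, 1242), 100.0)      # ~0.243 mm per pixel
print(f"{scale:.3f} mm/px; a 1650 px distance is {1650 * scale:.0f} mm")
```

Once the scale factor is known, later measurements (axis lengths, ring diameters, strut lengths) can be reported in millimeters rather than pixels.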
With the image 201 uploaded and scaled, relevant axes of the anatomy may be defined, either autonomously or semi-autonomously. For example, as shown in FIG. 2C, a user may define, aided by the application, the mechanical and/or anatomical axes of the bone(s) against the backdrop of image 201 on screen 200C. For example, as shown in FIG. 2C, the user may select the “mechanical axis” radio button on screen 200C and using an input device, may define relevant anatomic landmarks. For example, upon selecting the “mechanical axis” radio button, one (or two) femoral mechanical axis indicia and one (or two) tibial mechanical axis indicia may appear on the image 201.
In the illustrated embodiment, two mechanical femoral axis indicia take the form of lines 280, and two mechanical tibial axis indicia take the form of lines 290. Each line 280, 290 may include an endpoint that the user can drag to a different position on the image 201 to help define the relevant axes. For example, to define the mechanical femoral axis, the user may drag a first end of line 280 to a center of the femoral head on one leg and the other end of line 280 to the articular surface of the distal femur, and the process may be repeated for the other leg if desired. Similarly, to define the mechanical tibial axis, the user may drag a first end of line 290 to the articular surface of the proximal tibia and the other end of line 290 to the center of the ankle joint. Based on the placement of lines 280 and 290, the application may calculate and display relevant mechanical axis measurements, including, for example, the lateral proximal femoral angle (“LPFA”), the mechanical lateral distal femoral angle (“mLDFA”), the lateral proximal tibial angle (“LPTA”), and the lateral distal tibial angle (“LDTA”), although other relevant measurements may also be calculated and displayed. In order to make these measurements, additional lines must be provided. For example, the LPFA is measured as the angle between the line 281 joining the trochanteric tip to the femoral head center and the femoral mechanical axis as represented by line 280. Similarly, the mLDFA is measured as the angle that a condylar tangent line 282 makes with the femoral mechanical axis as represented by line 280. Lines 281 and 282 may be displayed on the medical image 201 and manipulated as desired. With regard to the tibial angles, the LPTA is measured laterally as the angle between a tibial plateau tangent line 291 and the line representing the tibial mechanical axis 290, while the LDTA is measured laterally as the intercept of a tibial articular plafond line 292 with the line representing the tibial mechanical axis 290. Lines 291 and 292 may also be displayed on the medical image 201 and manipulated by the user. Although the steps above are described as manual placement of lines 280-282 and 290-292, it should be understood that the application may automatically recognize the relevant landmarks and place the lines on the image 201, with the user having the ability to modify the placement of lines 280-282 and 290-292 if such placement is incorrect. As should be understood, the process may be repeated for each leg if two legs are shown in medical image 201. The application may also compare the calculated angles described above to a range of values considered normal, which may be stored in memory, and highlight or otherwise indicate to the user any calculated angle falling outside the range. As shown in FIG. 2, the LDTA of the leg with the deformed tibia is calculated as 65°, which is outside the range considered normal (for example, between 86° and 92°), leading to the abnormal LDTA being highlighted.
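
Each of the angles above is simply the angle between two user-adjustable (or automatically placed) lines, compared against a stored normal range. The sketch below shows that computation in Python; the endpoint coordinates and the normal ranges other than the 86°-92° LDTA range quoted above are illustrative assumptions.

```python
import math

def angle_between(line_a, line_b):
    """Angle in degrees (0-180) between two lines, each given as a pair of
    (x, y) endpoints, e.g. the tibial mechanical axis 290 and plafond line 292."""
    (ax1, ay1), (ax2, ay2) = line_a
    (bx1, by1), (bx2, by2) = line_b
    ua = (ax2 - ax1, ay2 - ay1)
    ub = (bx2 - bx1, by2 - by1)
    cos_t = (ua[0] * ub[0] + ua[1] * ub[1]) / (math.hypot(*ua) * math.hypot(*ub))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

# Normal ranges in degrees; only the LDTA range (86-92) is taken from the text,
# the other ranges are placeholders.
NORMAL_RANGES = {"LPFA": (85, 95), "mLDFA": (85, 90), "LPTA": (85, 90), "LDTA": (86, 92)}

def check(name, value):
    lo, hi = NORMAL_RANGES[name]
    flag = "normal" if lo <= value <= hi else "outside normal range -> highlight"
    print(f"{name} = {value:.1f} deg ({flag})")

mechanical_axis_290 = ((512, 90), (520, 980))      # illustrative endpoints
plafond_line_292 = ((430, 965), (610, 1040))
check("LDTA", angle_between(mechanical_axis_290, plafond_line_292))   # ~67 deg, flagged
```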
On a fourth deformity definition screen 200D, the user may select the area of interest on the medical image 201 for measuring deformity parameters in a following step. For example, a rectangle 270 (or other shape) may be overlaid on the medical image 201 with the option for the user to resize and/or reposition the rectangle 270 to select the relevant deformed anatomy that is to be corrected. With the relevant area selected, the image 201 may be modified, for example by cropping the image so that only the relevant deformed anatomy is displayed, as shown on screen 200E of FIG. 2E. The cropped image 201′ may be further modified with a number of image processing features, including, for example, resizing, repositioning (e.g. rotating), or changing the contrast of the cropped image 201′. In one example, the user may apply an exposure filter to minimize or eliminate the tissue region shown in the cropped image 201′ for a better view of the deformed bone. Once the cropped image 201′ is edited as desired, the user may further define the deformity.
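
The cropping and “exposure filter” steps can be pictured as a rectangular region-of-interest crop followed by an intensity window that suppresses the soft-tissue gray range so bone stands out. The sketch below is one plausible way to do this with NumPy; the window limits and rectangle are illustrative, and the patent does not specify how its filter is implemented.

```python
import numpy as np

def crop_and_window(image, rect, low, high):
    """Crop a region of interest (like rectangle 270) from a grayscale X-ray and
    apply a simple intensity window: values below `low` or above `high` are
    clipped, then the remaining range is stretched to 0-255."""
    x, y, w, h = rect
    roi = image[y:y + h, x:x + w].astype(np.float32)
    windowed = np.clip(roi, low, high)
    windowed = (windowed - low) / (high - low) * 255.0
    return windowed.astype(np.uint8)

# Illustrative use on a synthetic 8-bit image standing in for image 201.
xray = (np.random.rand(1200, 900) * 255).astype(np.uint8)
cropped_roi = crop_and_window(xray, rect=(300, 200, 320, 700), low=120, high=230)
print(cropped_roi.shape)   # (700, 320)
```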
The cropped image 201′ of the deformed right tibia is illustrated in FIG. 2F on a new screen 200F after being rotated and filtered. On screen 200F, the mechanical tibial axis 290 from screen 200C may be displayed, along with an anatomical axis 295 a of the proximal tibia and an anatomical axis 295 b of the distal tibia, the axes 295 a and 295 b being different due to the deformity. As with the mechanical tibial axis 290 described above, the lines representing the anatomical axes 295 a, 295 b may be automatically placed on the cropped image 201′, for example based upon landmarks of the tibia, with the user having the option to modify the position of the lines 295 a, 295 b. In addition to the tibial mechanical axis 290 and the anatomical axes 295 a, 295 b, a template 260 may be displayed on the cropped image 201′ to identify landmarks on the deformed tibia. The template 260 includes a plurality of landmark points corresponding to anatomical landmarks. In the illustrated example, the template 260 includes the medial and lateral edges of the proximal tibia, the center of the proximal tibia, the medial and lateral edges of the distal tibia, and the medial and lateral surfaces of the location of the deformity in the tibia. The template 260 may be automatically placed on the cropped image 201′, for example based upon landmarks of the tibia, with the user having the option of moving one or more of the landmark points to a different position on the cropped image 201′, resulting in the connecting lines repositioning and altering the shape of template 260. In other words, upon repositioning one of the landmark points, each line connected to the repositioned landmark point remains connected to the repositioned landmark point.
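One plausible way to represent such a template (a sketch only; the landmark names and coordinates are hypothetical) is to store each connecting line as a pair of landmark names rather than coordinates, so that repositioning a landmark point automatically carries every connected line with it, as described above:

```python
class Template:
    """Deformity template: named landmark points plus lines joining them.
    Lines store landmark names, not coordinates, so moving a landmark
    implicitly moves every line connected to it."""

    def __init__(self, landmarks, connections):
        self.landmarks = dict(landmarks)      # name -> (x, y) in image pixels
        self.connections = list(connections)  # (name_a, name_b) pairs

    def move_landmark(self, name, new_xy):
        self.landmarks[name] = new_xy

    def lines(self):
        """Yield current line segments as ((x1, y1), (x2, y2))."""
        for a, b in self.connections:
            yield self.landmarks[a], self.landmarks[b]

# Hypothetical tibial template (coordinates in cropped-image pixels)
t = Template(
    landmarks={
        "prox_medial": (120, 60), "prox_lateral": (240, 60),
        "prox_center": (180, 80),
        "apex_medial": (140, 400), "apex_lateral": (230, 410),
        "dist_medial": (150, 780), "dist_lateral": (235, 780),
    },
    connections=[
        ("prox_medial", "prox_lateral"), ("prox_center", "apex_medial"),
        ("prox_center", "apex_lateral"), ("apex_medial", "dist_medial"),
        ("apex_lateral", "dist_lateral"), ("dist_medial", "dist_lateral"),
    ],
)
t.move_landmark("apex_medial", (150, 395))  # connected lines follow automatically
```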
As shown in FIG. 2G, a radio button may be selected to display a model bone, in this case a model tibia 202, overlaid on the cropped image 201′. The model bone may be selected from a library of model bones based, at least in part, on the particular anatomy and patient information entered upon creating the case. A button may be clicked on screen 200G to deform the model bone 202 such that the landmarks of the model tibia align with the landmarks defined by template 260. The deformity may be further defined on deformity definition screen 200H, as shown in FIG. 2H. A line 262 representing the osteotomy plane, and a point 264 representing the deformity apex, may each be shown on the cropped image 201′, whether or not the model 202 is simultaneously shown on screen 200H. The model bone 202 may include separate proximal and distal (or reference and moving) portions that may be manipulated directly by the user. For example, the user may click one of the portions of model bone 202 and drag the portion into a different position, with calculated values (e.g. angulation, translation) updating as the model bone 202 is manipulated.
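The alignment of model landmarks to template landmarks could, for example, be approximated with a least-squares 2-D affine fit computed separately for the portions of the model on either side of the deformity. The sketch below illustrates that assumption only and is not necessarily the warping method used by the application; the landmark coordinates are hypothetical:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2-D affine transform mapping src points onto dst points.
    src, dst: (N, 2) arrays of corresponding landmark coordinates, N >= 3."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    ones = np.ones((len(src), 1))
    A = np.hstack([src, ones])                         # N x 3
    coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)   # 3 x 2
    return coeffs

def apply_affine(coeffs, pts):
    """Apply the fitted transform to additional outline points of the model."""
    pts = np.asarray(pts, float)
    ones = np.ones((len(pts), 1))
    return np.hstack([pts, ones]) @ coeffs

# Hypothetical landmarks: model tibia vs. template 260 (proximal portion only)
model_prox = [(100, 50), (220, 50), (160, 300)]
templ_prox = [(120, 60), (240, 60), (175, 320)]
T = fit_affine(model_prox, templ_prox)
warped_outline = apply_affine(T, [(110, 40), (230, 45), (170, 310)])
```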
Although the deformity definition is described above with reference to a medical image 201 in an anterior-posterior plane, it is preferable that some or all of the deformity definition steps are additionally performed on a medical image in a different plane, such as a medial-lateral plane or a superior-inferior plane, for example. The medical image 201 may alternatively be viewed in axial, coronal or sagittal planes, for example. As shown in FIG. 2I, the deformed model bone 202 is shown over a cropped image 201′ of the deformed tibia in an AP view and the model bone 202 is also shown on an adjacent image of the deformed tibia in a lateral view. The parameters of the model bone 202, the mechanical and anatomic axes, and the template 260 may be revised in either view to update the deformity parameters until the user is satisfied that the model bone 202 accurately reflects the patient's deformed bone. As should be understood from the above description, the system and methods described herein provide a user the ability to accurately define the deformity of the deformed bone by manipulating on-screen representations of the bone or relevant parameters or landmarks with an input device, such as a mouse, without needing to manually enter numerical values relating to the deformity.
Once the user is satisfied that the model bone 202 is an accurate representation of the deformed bone, the user can proceed to the first ring configuration screen 300A (FIG. 3A). At this point, the user may input the size of the desired rings, including a reference ring 305 and a moving ring 310. For example, a user may be able to choose between a 155 mm, 180 mm, or 210 mm ring. The user may also be able to choose the type of ring, such as a full ring, partial ring, open ring, or closed ring. Different types of rings are known in the art and the inclusion of different rings as options in the software is largely a matter of design choice. The rings 305, 310 are displayed along with the model bone 202 on the screen, preferably in an AP view, a lateral view, and/or an axial view. Additional views, such as a perspective view, may be included. On screen 300A, the model bone 202 and rings 305, 310 are displayed in an AP view with options to change to lateral, axial, and perspective views by choosing the corresponding tab on screen. The cropped medical image 201′ is also displayed, although either the model bone 202 or the cropped image 201′ may be removed by clicking the appropriate radio button on screen.
The position and orientation of the portion of the model bone 202 proximal to the deformity and the portion of the model bone 202 distal to the deformity are based on the input received during the deformity definition described above. Once a size and/or type of ring is selected for the reference ring 305, it is displayed perpendicular to the reference bone fragment (in the illustrated example, the portion of the model bone 202 proximal to the deformity) with a longitudinal axis of the reference bone fragment extending through the center of the reference ring 305. Similarly, once a size and/or type of ring is selected for the moving ring 310, it is displayed perpendicular to the non-reference bone fragment (in the illustrated example, the portion of the model bone 202 distal to the deformity) with a longitudinal axis of the non-reference bone fragment extending through the center of the moving ring 310. The rings 305, 310 may also be placed with a default axial translation that can be changed. For example, the reference ring 305 may have a default axial translation of approximately 50 mm with respect to the deformity apex, while the moving ring 310 may have a default axial translation of approximately 150 mm with respect to the deformity apex. The user may set position and orientation parameters for the rings 305, 310 by typing numerical values, by clicking the “up” or “down” arrows associated with the particular position or orientation, or by interacting with the rings 305, 310 on screen, for example by clicking one of the rings 305, 310 with a mouse and dragging or rotating the ring to a new position and/or orientation. Because this is the pre-op mode and no fixation device has yet been attached to the patient, the user chooses the ring sizes, positions, and orientations that he believes will be effective for the correction based, for example, on his experience and knowledge. As the values for the position and/or orientation of the rings 305, 310 are changed, the graphical representations of the rings 305, 310 change to reflect the new values. If the rings 305, 310 are being manipulated graphically (e.g. via dragging on screen with a mouse), the numerical values associated with the position and/or orientation may update accordingly. For the reference ring 305, the position values may include an AP translation, a lateral translation, an axial translation, and an axial orientation. The moving ring 310 may include these values, as well as additional values such as an AP orientation and a lateral orientation. Any of the above-described values may be displayed on screen to assist the user in understanding the position of the rings 305, 310 relative to the model 202. It may be particularly useful to display only non-zero values so that the most pertinent information is displayed. The user may position multiple views of the model bone 202 and the rings 305, 310 on the screen simultaneously. For example, as shown in FIG. 3B, screen 300B illustrates the model bone 202 with rings 305, 310 positioned thereon simultaneously in the AP and lateral views with the cropped image 201′ hidden.
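Geometrically, the default ring pose described above amounts to placing the ring's center on the fragment's longitudinal axis at the default axial offset and orienting the ring's plane perpendicular to that axis. A minimal sketch under those assumptions (the axis values and helper names are hypothetical):

```python
import numpy as np

def default_ring_pose(axis_point, axis_dir, axial_offset_mm):
    """Place a ring so its plane is perpendicular to the fragment axis and the
    axis passes through its center. Returns (center, normal) of the ring plane.
    axis_point: a point on the fragment axis (e.g. the deformity apex).
    axis_dir:   vector along the fragment's longitudinal axis.
    axial_offset_mm: default axial translation from axis_point along the axis."""
    axis_dir = np.asarray(axis_dir, float)
    axis_dir = axis_dir / np.linalg.norm(axis_dir)
    center = np.asarray(axis_point, float) + axial_offset_mm * axis_dir
    normal = axis_dir  # ring plane perpendicular to the fragment axis
    return center, normal

# Hypothetical fragment axes around a deformity apex at the origin (mm)
apex = (0.0, 0.0, 0.0)
reference_axis_dir = (0.0, 0.0, 1.0)    # proximal fragment, pointing proximally
moving_axis_dir = (0.15, 0.0, -0.99)    # distal fragment, angulated by the deformity

ref_center, ref_normal = default_ring_pose(apex, reference_axis_dir, 50.0)
mov_center, mov_normal = default_ring_pose(apex, moving_axis_dir, 150.0)
```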
Once the user is satisfied that the reference ring 305 and moving ring 310 are at locations on the model bone 202 representative of where the actual rings should be located on the patient's deformed bone, the user can proceed to the first strut configuration screen 400A as shown in FIG. 4A. The first strut configuration screen 400A allows the user to initiate an automatic calculation of possible strut combinations to connect the reference ring 305 to the moving ring 310. Once the calculation is complete, a plurality of graphical representations of struts 410 are illustrated on the screen in their intended initial positions with respect to the graphical representation of the reference ring 305 and the moving ring 310. The user also has the option to display all of the calculated combinations of struts 410 that may be used with the external fixator. For example, although one particular combination of struts 410 is illustrated on screen, multiple combinations may be calculated as possibilities. The application may default to showing the combination of struts 410 that requires the fewest number of strut change-outs during the deformity correction, but other options may be available for the user to choose based on his or her particular desire. The possible strut combinations may be presented in a table with a description of each strut in a particular combination.
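As an illustration of how a "fewest change-outs" default might be ranked, the sketch below assumes a hypothetical set of strut size ranges (not actual product dimensions) and simply counts, for each strut position, how often a chosen size would have to be swapped as its required length changes over the correction:

```python
# Hypothetical strut sizes: name -> (min_length_mm, max_length_mm)
STRUT_SIZES = {"short": (90, 130), "medium": (120, 180), "long": (170, 250)}

def change_outs(daily_lengths, sizes=STRUT_SIZES):
    """Count strut swaps needed for one strut position over the correction,
    greedily keeping a size that will remain valid for as long as possible."""
    swaps, current = 0, None
    for length in daily_lengths:
        fits = [n for n, (lo, hi) in sizes.items() if lo <= length <= hi]
        if not fits:
            raise ValueError(f"No strut covers {length} mm")
        if current not in fits:
            if current is not None:
                swaps += 1
            # prefer the size with the largest remaining travel
            current = max(fits, key=lambda n: sizes[n][1])
    return swaps

# Daily required lengths for each of the six strut positions (hypothetical)
plan = [
    [100, 145, 190, 210],   # grows from "short" through "medium" to "long": 2 swaps
    [150, 152, 154, 156],
    [175, 178, 181, 185],
    [140, 141, 142, 143],
    [160, 163, 166, 169],
    [172, 174, 176, 178],
]
total = sum(change_outs(lengths) for lengths in plan)
print(f"total change-outs for this combination: {total}")
```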
As with the other planning stages described above, the user may cause other views of the model bone 202, rings 305, 310, and struts 410 to be illustrated on screen, either individually or simultaneously. For example, the model bone 202, rings 305, 310, and struts 410 are shown in the AP, lateral, axial, and perspective views on screen 400B in FIG. 4B. This may help the user better visualize the external fixator system. When a particular combination of struts 410 is selected, the orientation of each strut 410, including strut length and strut angle, may be displayed. After the user is satisfied with the selected combination of struts 410, the user may proceed to a limiting anatomical structure (“LAS”) input screen 500.
The LAS input screen 500 (FIG. 5) allows a user to input a position for a limiting anatomical structure. In particular, the user may input a value (or the application may provide a default value) for a maximum distraction rate, which is the maximum distance a structure may be allowed to move per unit of time. For example, nerves, soft tissue, or even ends of the bone may be damaged if the rate of distraction at these points is too great. The user may define a LAS point 510 on screen 500 by dragging the LAS point 510 to the desired position. This step may be done in both the AP and lateral views to define the LAS point 510 in three dimensions. The LAS point 510 defines a position that cannot have a distraction rate greater than the maximum distraction rate, so that the anatomy at the LAS point 510 does not distract too quickly during correction and become damaged. For example, neurovascular tissue may sustain stretch damage if the tissue experiences too great a distraction rate. Although a user may choose the position of the LAS point 510 based on his experience and the model bone 202 on screen 500, it would be helpful to the user to be able to visualize soft tissue when defining the position of the LAS point 510, as soft tissue may be the anatomy at risk of damage from the deformity correction. To that end, when defining the position of the LAS point 510, the cropped image 201′ may be unhidden, with one or more of the model bone 202, rings 305, 310, and struts 410 simultaneously being shown on screen 500. By editing the parameters of cropped image 201′, for example by adjusting the contrast or exposure as described above, the user may view the patient's soft tissue in addition to the deformed bone on a screen with the models of the bone 202, rings 305, 310, and/or struts 410. The visualization of the soft tissue may aid the user in precisely defining the LAS point 510 to reduce the chance of injury to the patient's LAS during correction of the deformed bone.
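The constraint imposed by the LAS point can be sketched as follows, assuming the planned correction yields a daily 3-D position for the LAS point and a default maximum rate of 1 mm per day (both assumptions for illustration): the largest day-to-day displacement of the point must stay at or below the maximum rate, and if it does not, the schedule can be stretched over more days.

```python
import math

def max_daily_displacement(las_positions):
    """las_positions: list of (x, y, z) positions of the LAS point, one per day."""
    steps = [math.dist(a, b) for a, b in zip(las_positions, las_positions[1:])]
    return max(steps) if steps else 0.0

def required_days(las_positions, planned_days, max_rate_mm_per_day=1.0):
    """Stretch the plan if the LAS point would move faster than the allowed rate."""
    worst = max_daily_displacement(las_positions)
    if worst <= max_rate_mm_per_day:
        return planned_days
    return math.ceil(planned_days * worst / max_rate_mm_per_day)

# Hypothetical LAS positions (mm) over a 4-day plan
las_path = [(0, 0, 0), (0.9, 0.4, 0.2), (1.8, 0.8, 0.4), (2.7, 1.2, 0.6)]
print(required_days(las_path, planned_days=4))   # plan is stretched to 5 days
```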
Based on the position and orientation of the model bone 202, the rings 305, 310, the struts 410, and the position of the LAS point 510, the user may generate the correction plan. To generate the correction plan, the user may enter the date on which the user or patient will begin adjusting the fixation frame according to the correction plan. Once entered, the user commands the computer to generate the correction plan, which may be displayed on screen. The correction plan may include, for example, the position and angle of each strut of the fixation frame for each day of the correction, along with the date and day number (e.g. first day, second day) of the correction plan. The correction plan may also show a relationship between positions of the struts and discrete user or patient actions. For example, if the correction plan calls for a strut to be lengthened by 1 millimeter on the first day, the correction plan may indicate that the user or patient should increase the length of that strut four separate times, for example by 0.25 millimeters in the morning, 0.25 millimeters at noon, 0.25 millimeters in the evening and another 0.25 millimeters at night. Besides use as an instructional tool, the correction plan may also aid a physician or surgeon in monitoring the progress of the correction of the bone deformity, for example by checking at periodic intervals that the struts of the fixation frame are in the proper position as called for by the correction plan.
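A simple sketch of how such a schedule might be tabulated, assuming each strut length changes linearly from its initial to its final value and each day's change is split into four adjustments (real plans are driven by the frame kinematics, so the numbers and helper names here are illustrative only):

```python
from datetime import date, timedelta

def correction_plan(initial_mm, final_mm, start, days, splits=4):
    """Yield one row per day: date, day number, target strut lengths, and the
    per-adjustment increment when a day's change is split into several turns."""
    for day in range(days + 1):
        frac = day / days
        lengths = [i + frac * (f - i) for i, f in zip(initial_mm, final_mm)]
        if day == 0:
            increments = [0.0] * len(initial_mm)
        else:
            increments = [(f - i) / days / splits for i, f in zip(initial_mm, final_mm)]
        yield {
            "date": start + timedelta(days=day),
            "day": day,
            "strut_lengths_mm": [round(l, 2) for l in lengths],
            "per_adjustment_mm": [round(d, 3) for d in increments],
        }

# Hypothetical: six struts adjusted over a 10-day correction beginning on a chosen date
initial = [150, 152, 148, 155, 149, 151]
final = [160, 150, 156, 158, 147, 159]
for row in correction_plan(initial, final, date(2016, 6, 2), days=10):
    print(row["date"], row["day"], row["strut_lengths_mm"])
```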
As mentioned above, the application can be used in a post-op mode in addition to, or as an alternative to, the pre-op mode. This mode can be used once the patient has already undergone surgery to attach the fixation frame to the deformed bone. The post-op mode can be used as an alternative to the pre-op mode, for example in cases in which time is limited and surgery must be performed without the benefit of the planning provided in the pre-op mode described above. The post-op mode can also be used in addition to the pre-op mode, for example if the physician was unable to affix the fixation frame to the bone as suggested by the pre-op mode.
In the post-op mode, the steps described above with reference to the login screen and home page 110 are the same as in the pre-op mode (FIG. 1). In order to generate an accurate post-op correction plan that minimizes the risk of misalignment of the deformed bone and/or damage to tissue, accurate models of the mounted frame should be created in the application. Any misinterpretations or calculation errors during the modeling process can affect the correction plan. Thus, it would be beneficial for the application to assist the user in creating the model rings 305, 310 and struts 410 to generate the model parameters as accurately as possible, preferably while minimizing user intervention. As described below, the application is capable of recognizing the anatomical structures and frame components in the medical image 201 (or cropped image 201′) by using image processing algorithms and coordinate geometry theories to provide accurate measurements of the fixation frame and anatomy, either in a fully autonomous or semi-autonomous fashion.
Similar to the pre-op mode, after entering the relevant patient details, the user can upload one or more medical images 201 in one or more views to the application. Because this is a post-op mode, the uploaded medical images 201 show the physical rings and struts of the fixation frame, as they have already been attached to the patient's bone. The process of inputting the measurements in the deformity definition step may be similar to or the same as that described with respect to the pre-op mode. For example, as shown in FIG. 6A, a first screen 600A may be used to scale the medical images 201. A size reference R with a known size may be included in the medical image 201, with the known size stored in memory so that the application is able to automatically scale each medical image 201 to the correct size.
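The scaling step reduces to a simple ratio once the size reference R has been located and measured in pixels, as sketched below (the reference size and pixel counts are hypothetical):

```python
def mm_per_pixel(reference_length_mm, reference_length_px):
    """Scale factor derived from the size reference R of known physical length."""
    return reference_length_mm / reference_length_px

def to_mm(pixel_distance, scale):
    """Convert a measured pixel distance on the radiograph to millimeters."""
    return pixel_distance * scale

# Hypothetical: a 30 mm reference ball spans 84 pixels in the radiograph
scale = mm_per_pixel(30.0, 84.0)
print(f"{scale:.3f} mm/px; 420 px = {to_mm(420, scale):.1f} mm")
```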
With each medical image 201 properly scaled, the user may initiate a processing step in which the application determines the size and orientations of the physical reference ring 605, the physical moving ring 610, and the physical struts 710. The application may process the medical images 201, with a first recognition stage employing texture guided shape analysis algorithms that recognize and identify the structures based on textures and/or shapes in the images 201. Once recognized, the application employs projective geometry techniques to determine the position and orientation of the physical rings 605, 610 and physical struts 710. This step may include the calculation of the radius (or diameter) of each physical ring 605, 610, the angular orientations of each ring 605, 610, the length of each physical strut 710, the angular orientation of each physical strut 710, and the connection points of each strut 710 to each ring 605, 610.
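One common projective-geometry simplification, given here only as an illustration and not necessarily the technique used by the application, treats the projection of a circular ring as an ellipse: the ratio of the ellipse's axes then gives the ring's tilt out of the image plane, and the known ring diameter gives the image scale.

```python
import math

def ring_pose_from_ellipse(known_diameter_mm, major_px, minor_px):
    """Estimate ring tilt and image scale from its elliptical projection.
    Assumes a weak-perspective projection of a circular ring:
      - the major axis is the (nearly) unforeshortened diameter,
      - the tilt about the major axis = arccos(minor / major)."""
    tilt_deg = math.degrees(math.acos(min(1.0, minor_px / major_px)))
    mm_per_px = known_diameter_mm / major_px
    return tilt_deg, mm_per_px

# Hypothetical: a 180 mm ring appears as a 600 x 210 px ellipse in the radiograph
tilt, scale = ring_pose_from_ellipse(180.0, 600.0, 210.0)
print(f"tilt ~{tilt:.1f} deg, scale ~{scale:.3f} mm/px")
```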
In addition to recognizing the components of the physical fixation system and determining the position and orientation of the components, the application may also recognize the patient's bone structures as well as the positions and orientations of relevant fragments. During this step, the application recognizes a reference fragment (as illustrated on screen 600B of FIG. 6B, the bone fragment proximal to the deformity) and a moving fragment (as illustrated on screen 600B of FIG. 6B, the bone fragment distal to the deformity). The bone structures may be recognized using image processing techniques that use structural and textural features along with machine learning techniques, including, for example, statistical shape modelling. Subspace analysis techniques for bone detection may make use of shape, texture distributions, and kernel-method-based learning techniques for accurate extraction of anatomical structures. During or after the automatic recognition of the fixation frame components and the bone fragments, indicia I may be provided on screen to indicate the structures as identified by the application. As shown in FIG. 6B, such indicia I may include one or more points along the physical reference ring 605, one or more points along the physical moving ring 610, and one or more relevant positions of the bone.
Once the identification step and position and orientation recognition steps are completed for both the physical frame components and the bone fragments, the relevant parameters are displayed on a ring configuration screen 700 as shown in FIG. 7. Relevant parameters, which may be the same as those described with respect to the pre-op mode and FIG. 3B, may be displayed so that the user is able to confirm that the calculations performed by the application are correct. If the user desires to alter any of the parameters, he may use an input device (e.g. a mouse or keyboard) to activate the “up” or “down” arrow on screen 700 next to the relevant parameter to increase or decrease the parameter, or use the input device to graphically change one of the lines representing the relevant parameter on the image 201. The user may similarly confirm or revise relevant parameters calculated with respect to the physical struts 710 on a strut configuration page (not shown). Finally, the user may advance to a LAS input page 800, as shown in FIG. 8, to indicate the position of the LAS point 810. The procedure regarding the input of the position of the LAS point 810 may be the same as described in connection with FIG. 5 in the pre-op mode.
As mentioned above, the application preferably identifies the bone fragments, the physical components of the fixation frame, and the positions and orientations of the fragments and components automatically and correctly, but the user retains the ability to review and adjust these determinations. To the extent that the user desires to change the automatically determined identifications, positions, and orientations of the frame components, he may do so as described above with respect to FIGS. 7-8 by adjusting the parameters on screen. With regard to the bone fragments, upon the identification and determination of the position of the fragments, the application may display the template 260 over the model bone 202 (and/or medical image 201) similar to that shown and described in connection with FIGS. 2F-G in the pre-op mode. In fact, this automatic recognition process may be used when initiating the deformity measurement in the pre-op mode as well. To the extent the user desires to alter the automatically populated template, he may alter the template graphically by moving the relevant landmarks of the template similar to the method described in connection with FIGS. 2F-G.
Once the user is satisfied that the automatically calculated positions and orientations of the physical components of the frame and the bone fragments are accurate, or after adjusting the calculated positions and orientations to the user's satisfaction, and also after inputting the position of the LAS point 810, the user may generate a correction schedule in the same manner as described above with respect to the pre-op mode.
Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. For example, although described in relation to a correction of a deformed tibia, other bones and fixation frames for those bones may be modeled by the application according to the same principles described above. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims.

Claims (21)

The invention claimed is:
1. A method of generating a correction plan for correcting a deformity in a deformed bone comprising the steps of:
inputting to a computer system a first image of the deformed bone in a first plane;
inputting to the computer system a second image of the deformed bone in a second plane;
employing image processing techniques to identify a plurality of anatomical landmarks of the deformed bone in the first image;
displaying the first image of the deformed bone on a display device;
autonomously generating and graphically overlaying a graphical template of the deformed bone on the first image of the deformed bone on the display device, the graphical template including a plurality of lines, each line connected at each end to a landmark point corresponding to one of the anatomical landmarks;
displaying a model bone on the display device so that the model bone overlies the first image of the deformed bone, the model bone selected from a library of model bones; and
deforming the model bone so that the deformed model bone includes a plurality of model bone landmarks that are aligned with the landmark points of the graphical template.
2. The method of claim 1, wherein the step of deforming the model bone is performed autonomously.
3. The method of claim 2, further comprising generating and displaying on the display device a first model fixation ring having a first position and orientation and second model fixation ring having a second position and orientation.
4. The method of claim 3, further comprising graphically manipulating at least one of the position and orientation of at least one of the model fixation rings.
5. The method of claim 3, further comprising determining combinations of sizes of a plurality of model struts to connect the models of the first and second fixation rings with an algorithm using the position and orientation of the first and second model fixation rings.
6. The method of claim 5, further comprising inputting to the computer system a first position for a limiting anatomical structure, the limiting anatomical structure defining a location having a maximum distraction value.
7. The method of claim 6, wherein during the step of inputting the first position for the limiting anatomical structure, the model rings and the model struts are simultaneously displayed on the display device and overlap the first image of the deformed bone on the display device.
8. The method of claim 7, wherein during the step of inputting the first position for the limiting anatomical structure, the first image of the deformed bone includes visible soft tissue structures.
9. The method of claim 8, wherein the limiting anatomical structure is input graphically using an input device.
10. The method of claim 9, wherein the input device is a computer mouse.
11. The method of claim 8, further comprising inputting to the computer system a second position for the limiting anatomical structure while the model rings and the model struts are simultaneously displayed on the display device and overlap the second image of the deformed bone on the display device, the second image of the deformed bone including visible soft tissue structures.
12. The method of claim 1, wherein each landmark point of the graphical template is configured to be repositioned via an input device.
13. The method of claim 12, wherein upon repositioning one of the landmark points, each line connected to the repositioned landmark point remains connected to the repositioned landmark point.
14. The method of claim 1, wherein the first image of the deformed bone is an x-ray image displayed on the visual medium in one of an anterior-posterior and a lateral view.
15. The method of claim 14, wherein the second image of the deformed bone is an x-ray image displayed on the visual medium in the other of an anterior-posterior and a lateral view.
16. The method of claim 15, wherein the first and second images of the deformed bone include images of physical rings and physical struts of an external fixation frame coupled to a patient.
17. The method of claim 16, further comprising autonomously determining a position and orientation of the physical rings and a length and orientation of the physical struts based on the first and second images.
18. The method of claim 17, further comprising displaying the determined position and orientation of the physical rings and the determined length and orientation of the physical struts on the visual medium.
19. The method of claim 18, further comprising graphically manipulating at least one of the determined position and the determined orientation of at least one of the physical rings.
20. The method of claim 18, further comprising graphically manipulating the determined orientation of at least one of the struts.
21. The method of claim 1, wherein the graphical template spans the deformity in the first image.
US15/171,121 2016-06-02 2016-06-02 Software for use with deformity correction Active US10251705B2 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
US15/171,121 US10251705B2 (en) 2016-06-02 2016-06-02 Software for use with deformity correction
EP17173905.5A EP3251625B1 (en) 2016-06-02 2017-06-01 Software for use with deformity correction
US15/626,497 US10154884B2 (en) 2016-06-02 2017-06-19 Software for use with deformity correction
US16/286,757 US10603112B2 (en) 2016-06-02 2019-02-27 Software for use with deformity correction
US16/793,145 US11020186B2 (en) 2016-06-02 2020-02-18 Software for use with deformity correction
US17/242,389 US11553965B2 (en) 2016-06-02 2021-04-28 Software for use with deformity correction
US18/066,679 US12029496B2 (en) 2016-06-02 2022-12-15 Software for use with deformity correction
US18/739,413 US20240325086A1 (en) 2016-06-02 2024-06-11 Software for use with deformity correction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/171,121 US10251705B2 (en) 2016-06-02 2016-06-02 Software for use with deformity correction

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US15/626,497 Continuation US10154884B2 (en) 2016-06-02 2017-06-19 Software for use with deformity correction
US16/286,757 Continuation US10603112B2 (en) 2016-06-02 2019-02-27 Software for use with deformity correction

Publications (2)

Publication Number Publication Date
US20170348054A1 US20170348054A1 (en) 2017-12-07
US10251705B2 true US10251705B2 (en) 2019-04-09

Family

ID=59215446

Family Applications (7)

Application Number Title Priority Date Filing Date
US15/171,121 Active US10251705B2 (en) 2016-06-02 2016-06-02 Software for use with deformity correction
US15/626,497 Active US10154884B2 (en) 2016-06-02 2017-06-19 Software for use with deformity correction
US16/286,757 Active US10603112B2 (en) 2016-06-02 2019-02-27 Software for use with deformity correction
US16/793,145 Active US11020186B2 (en) 2016-06-02 2020-02-18 Software for use with deformity correction
US17/242,389 Active 2036-08-19 US11553965B2 (en) 2016-06-02 2021-04-28 Software for use with deformity correction
US18/066,679 Active 2036-08-10 US12029496B2 (en) 2016-06-02 2022-12-15 Software for use with deformity correction
US18/739,413 Pending US20240325086A1 (en) 2016-06-02 2024-06-11 Software for use with deformity correction

Family Applications After (6)

Application Number Title Priority Date Filing Date
US15/626,497 Active US10154884B2 (en) 2016-06-02 2017-06-19 Software for use with deformity correction
US16/286,757 Active US10603112B2 (en) 2016-06-02 2019-02-27 Software for use with deformity correction
US16/793,145 Active US11020186B2 (en) 2016-06-02 2020-02-18 Software for use with deformity correction
US17/242,389 Active 2036-08-19 US11553965B2 (en) 2016-06-02 2021-04-28 Software for use with deformity correction
US18/066,679 Active 2036-08-10 US12029496B2 (en) 2016-06-02 2022-12-15 Software for use with deformity correction
US18/739,413 Pending US20240325086A1 (en) 2016-06-02 2024-06-11 Software for use with deformity correction

Country Status (2)

Country Link
US (7) US10251705B2 (en)
EP (1) EP3251625B1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10603112B2 (en) * 2016-06-02 2020-03-31 Stryker European Holdings I, Llc Software for use with deformity correction
US10687899B1 (en) * 2016-07-05 2020-06-23 Smith & Nephew, Inc. Bone model correction angle determination
US10881433B2 (en) 2013-02-19 2021-01-05 Stryker European Operations Holdings Llc Software for use with deformity correction
US11759216B2 (en) 2021-09-22 2023-09-19 Arthrex, Inc. Orthopaedic fusion planning systems and methods of repair
US11877802B2 (en) 2020-12-30 2024-01-23 DePuy Synthes Products, Inc. Perspective frame matching process for deformed fixation rings
US11890058B2 (en) 2021-01-21 2024-02-06 Arthrex, Inc. Orthopaedic planning systems and methods of repair
US12127752B2 (en) 2023-07-07 2024-10-29 Arthrex, Inc. Orthopaedic fusion planning systems and methods of repair

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201008281D0 (en) 2010-05-19 2010-06-30 Nikonovas Arkadijus Indirect analysis and manipulation of objects
EP3281592B1 (en) 2013-03-13 2021-04-21 DePuy Synthes Products, Inc. External bone fixation device
US20170340448A1 (en) * 2016-04-07 2017-11-30 Kambiz Behzadi Materials in orthopedics and fracture fixation
EP3471635A4 (en) * 2016-06-19 2020-03-18 Orthospin Ltd. User interface for strut device
US10835318B2 (en) 2016-08-25 2020-11-17 DePuy Synthes Products, Inc. Orthopedic fixation control and manipulation
US20180268614A1 (en) * 2017-03-16 2018-09-20 General Electric Company Systems and methods for aligning pmi object on a model
US20210007806A1 (en) * 2018-03-21 2021-01-14 Vikas KARADE A method for obtaining 3-d deformity correction for bones
CN108888340A (en) * 2018-06-27 2018-11-27 上海昕健医疗技术有限公司 Personalized preoperative planning system
CN113301864B (en) 2019-01-31 2024-09-17 史密夫和内修有限公司 Device for external fixed strut measurement and real-time feedback
US11439436B2 (en) 2019-03-18 2022-09-13 Synthes Gmbh Orthopedic fixation strut swapping
US11304757B2 (en) 2019-03-28 2022-04-19 Synthes Gmbh Orthopedic fixation control and visualization
CN110136051A (en) * 2019-04-30 2019-08-16 北京市商汤科技开发有限公司 A kind of image processing method, device and computer storage medium
US20230086184A1 (en) * 2020-02-27 2023-03-23 Smith & Nephew, Inc. Methods and arrangements for external fixators
US11334997B2 (en) 2020-04-03 2022-05-17 Synthes Gmbh Hinge detection for orthopedic fixation
WO2023205046A1 (en) * 2022-04-22 2023-10-26 Smith & Nephew, Inc. Automated transosseous element planning for orthopedic devices

Citations (113)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5546942A (en) 1994-06-10 1996-08-20 Zhang; Zhongman Orthopedic robot and method for reduction of long-bone fractures
US5681309A (en) 1993-06-10 1997-10-28 Texas Scottish Rite Hospital For Crippled Children Distractor mechanism for external fixation device
US5682886A (en) 1995-12-26 1997-11-04 Musculographics Inc Computer-assisted surgical system
US5702389A (en) 1995-03-01 1997-12-30 Smith & Nephew Richards, Inc. Orthopaedic fixation device
US5728095A (en) 1995-03-01 1998-03-17 Smith & Nephew, Inc. Method of using an orthopaedic fixation device
US5769092A (en) 1996-02-22 1998-06-23 Integrated Surgical Systems, Inc. Computer-aided system for revision total hip replacement surgery
US5824085A (en) 1996-09-30 1998-10-20 Integrated Surgical Systems, Inc. System and method for cavity generation for surgical planning and initial placement of a bone prosthesis
US5880976A (en) 1997-02-21 1999-03-09 Carnegie Mellon University Apparatus and method for facilitating the implantation of artificial components in joints
US5891143A (en) 1997-10-20 1999-04-06 Smith & Nephew, Inc. Orthopaedic fixation plate
US5971984A (en) 1995-03-01 1999-10-26 Smith & Nephew, Inc. Method of using an orthopaedic fixation device
US6030386A (en) 1998-08-10 2000-02-29 Smith & Nephew, Inc. Six axis external fixator strut
US6112109A (en) 1993-09-10 2000-08-29 The University Of Queensland Constructive modelling of articles
US6129727A (en) 1999-03-02 2000-10-10 Smith & Nephew Orthopaedic spatial frame apparatus
US6205411B1 (en) 1997-02-21 2001-03-20 Carnegie Mellon University Computer-assisted surgery planner and intra-operative guidance system
US20020010465A1 (en) 2000-01-31 2002-01-24 Ja Kyo Koo Frame fixator and operation system thereof
US20030191466A1 (en) 2002-04-05 2003-10-09 Ed Austin Orthopaedic fixation method and device
US20040039259A1 (en) 2000-04-07 2004-02-26 Norman Krause Computer-aided bone distraction
US6711432B1 (en) 2000-10-23 2004-03-23 Carnegie Mellon University Computer-aided orthopedic surgery
US20040068187A1 (en) * 2000-04-07 2004-04-08 Krause Norman M. Computer-aided orthopedic surgery
US20040073212A1 (en) 2002-10-15 2004-04-15 Kim Jung Jae Extracorporeal fixing device for a bone fracture
US20050004451A1 (en) 2002-04-26 2005-01-06 Stefan Vilsmeier Planning and navigation assistance using two-dimensionally adapted generic and detected patient data
US20050054917A1 (en) 2002-09-26 2005-03-10 David Kitson Orthopaedic surgery planning
US20050267360A1 (en) 2004-04-26 2005-12-01 Rainer Birkenbach Visualization of procedural guidelines for a medical procedure
US20060015120A1 (en) 2002-04-30 2006-01-19 Alain Richard Determining femoral cuts in knee surgery
US20060079745A1 (en) 2004-10-07 2006-04-13 Viswanathan Raju R Surgical navigation with overlay on anatomical images
US7039225B2 (en) 2000-09-18 2006-05-02 Fuji Photo Film Co., Ltd. Artificial bone template selection system, artificial bone template display system, artificial bone template storage system and artificial bone template recording medium
US20060161052A1 (en) 2004-12-08 2006-07-20 Perception Raisonnement Action En Medecine Computer assisted orthopaedic surgery system for ligament graft reconstruction
US20060189842A1 (en) 2005-02-14 2006-08-24 Hoeg Hans D Method for using variable direction of view endoscopy in conjunction with image guided surgical systems
US20060276786A1 (en) 2005-05-25 2006-12-07 Brinker Mark R Apparatus for accurately positioning fractured bone fragments toward facilitating use of an external ring fixator system
US20070055234A1 (en) 2005-06-10 2007-03-08 Mcgrath William M External fixation system with provisional brace
US20070078678A1 (en) 2005-09-30 2007-04-05 Disilvestro Mark R System and method for performing a computer assisted orthopaedic surgical procedure
US20070133845A1 (en) 2003-11-13 2007-06-14 Maxim Fradkin Three-dimensional segmentation using deformable surfaces
US20070219561A1 (en) 2006-03-20 2007-09-20 Perception Raisonnement Action En Medecine Distractor system
US7280683B2 (en) 2002-07-22 2007-10-09 Compumed, Inc. Method, code, and system for assaying joint deformity
US20080051779A1 (en) 2006-08-02 2008-02-28 The Nemours Foundation Force-controlled autodistraction
DE102006048451A1 (en) 2006-10-11 2008-04-17 Siemens Ag Object e.g. implant, virtual adjustment method for e.g. leg, of patient, involves automatically adjusting object relative to body part in smooth manner for long time, until tolerance dimension achieves desired threshold value
US20080108912A1 (en) 2006-11-07 2008-05-08 General Electric Company System and method for measurement of clinical parameters of the knee for use during knee replacement surgery
US20080119719A1 (en) * 2006-08-21 2008-05-22 The Regents Of The University Of California Templates for assessing bone quality and methods of use thereof
US20080137923A1 (en) 2006-12-06 2008-06-12 Siemens Medical Solutions Usa, Inc. X-Ray Identification of Interventional Tools
US7394946B2 (en) 2004-05-18 2008-07-01 Agfa Healthcare Method for automatically mapping of geometric objects in digital medical images
US20080177203A1 (en) 2006-12-22 2008-07-24 General Electric Company Surgical navigation planning system and method for placement of percutaneous instrumentation and implants
US20080198966A1 (en) 2007-01-31 2008-08-21 Sectra Mamea Ab Method and Arrangement Relating to X-Ray Imaging
US20080234554A1 (en) 2007-03-21 2008-09-25 Vvedensky Pyotr S Computer-Aided System for Limb Lengthening
US20080243127A1 (en) 2001-05-25 2008-10-02 Conformis, Inc. Surgical Tools for Arthroplasty
US20080275467A1 (en) 2007-05-02 2008-11-06 Siemens Corporate Research, Inc. Intraoperative guidance for endovascular interventions via three-dimensional path planning, x-ray fluoroscopy, and image overlay
US20080319448A1 (en) 2006-12-12 2008-12-25 Perception Raisonnement Action En Medecine System and method for determining an optimal type and position of an implant
US20090054887A1 (en) 2004-10-06 2009-02-26 Covidien Ag Systems and Methods for Thermally Profiling Radiofrequency Electrodes
US7547307B2 (en) 2001-02-27 2009-06-16 Smith & Nephew, Inc. Computer assisted knee arthroplasty instrumentation, systems, and processes
WO2009076296A2 (en) 2007-12-06 2009-06-18 Smith & Nephew, Inc. Systems and methods for determining the mechanical axis of a femur
US20100036393A1 (en) 2007-03-01 2010-02-11 Titan Medical Inc. Methods, systems and devices for threedimensional input, and control methods and systems based thereon
US20100087819A1 (en) 2008-10-07 2010-04-08 Extraortho, Inc. Forward Kinematic Solution for a Hexapod Manipulator and Method of Use
US20100130858A1 (en) 2005-10-06 2010-05-27 Osamu Arai Puncture Treatment Supporting Apparatus
WO2010104567A1 (en) 2009-03-10 2010-09-16 Stryker Trauma Sa External fixation system
US20100286995A1 (en) 2007-07-27 2010-11-11 Koninklijke Philips Electronics N.V. Interactive atlas to image registration
US20110004199A1 (en) 2008-02-18 2011-01-06 Texas Scottish Rite Hospital For Children Tool and method for external fixation strut adjustment
US20110009868A1 (en) 2007-09-28 2011-01-13 Takashi Sato Apparatus for preoperative planning of artificial knee joint replacement operation and jig for supporting operation
US20110103676A1 (en) 2002-11-14 2011-05-05 Extraortho, Inc. Method for using a fixator device
US20110103556A1 (en) 2009-11-02 2011-05-05 Carn Ronald M Alignment fixture for x-ray images
US20110116041A1 (en) 2006-04-11 2011-05-19 Hartung Paul D Ocular Imaging
US7967868B2 (en) 2007-04-17 2011-06-28 Biomet Manufacturing Corp. Patient-modified implant and associated method
US20110188726A1 (en) 2008-06-18 2011-08-04 Ram Nathaniel Method and system for stitching multiple images into a panoramic image
US8055487B2 (en) 2005-02-22 2011-11-08 Smith & Nephew, Inc. Interactive orthopaedic biomechanics system
US20110304332A1 (en) 2009-02-25 2011-12-15 Mohamed Rashwan Mahfouz Intelligent cartilage system
US20110313418A1 (en) 2010-05-19 2011-12-22 Arkadijus Nikonovas Orthopedic fixation with imagery analysis
US20110313424A1 (en) 2010-06-18 2011-12-22 Howmedica Osteonics Corp. Patient-specific total hip arthroplasty
US20110313419A1 (en) 2010-06-22 2011-12-22 Extraortho, Inc. Hexapod External Fixation System with Collapsing Connectors
RU2448663C1 (en) 2010-11-19 2012-04-27 Федеральное государственное учреждение "Российский ордена Трудового Красного Знамени научно-исследовательский институт травматологии и ортопедии им. Р.Р. Вредена" Министерства здравоохранения и социального развития Российской Федерации (ФГУ "РНИИТО им. Р.Р. Вредена" Минздравсоцразвития России) Method of osteosynthesis with ortho-suv apparatus for treating injuries of distal one-third of femur
US20120130687A1 (en) 2008-09-19 2012-05-24 Smith & Nephew, Inc. Tuning Implants For Increased Performance
US20120155732A1 (en) 2009-06-26 2012-06-21 University Of South Florida CT Atlas of Musculoskeletal Anatomy to Guide Treatment of Sarcoma
US20120214121A1 (en) * 2011-01-26 2012-08-23 Greenberg Surgical Technologies, Llc Orthodontic Treatment Integrating Optical Scanning and CT Scan Data
US8257353B2 (en) 2010-02-24 2012-09-04 Wright Medical Technology, Inc. Orthopedic external fixation device
US8265949B2 (en) 2007-09-27 2012-09-11 Depuy Products, Inc. Customized patient surgical plan
US8296094B2 (en) 2007-04-04 2012-10-23 Smith & Nephew, Inc. Analysis of parallel manipulators
US8311306B2 (en) 2008-04-30 2012-11-13 Otismed Corporation System and method for image segmentation in generating computer models of a joint to undergo arthroplasty
US20120330312A1 (en) 2011-06-23 2012-12-27 Stryker Trauma Gmbh Methods and systems for adjusting an external fixation frame
US20120328174A1 (en) 2011-06-24 2012-12-27 Rajendra Prasad Jadiyappa System and method for processing an x-ray image of an organ
RU2471447C1 (en) 2011-11-01 2013-01-10 Федеральное государственное бюджетное учреждение "Российский ордена Трудового Красного Знамени научно-исследовательский институт травматологии и ортопедии им. Р.Р. Вредена" Министерства здравоохранения и социального развития Российской Федерации (ФГБУ "РНИИТО им. Р.Р. Вредена" Минздравсоцразвития Ро METHOD OF OSTEOSYNTHESIS BY APPARATUS Ortho-SUV IN TREATMENT OF INJURIES OF PROXIMAL THIRD OF FEMORAL BONE
WO2013013170A1 (en) 2011-07-20 2013-01-24 Smith & Nephew, Inc. Systems and methods for optimizing fit of an implant to anatomy
US20130089253A1 (en) * 2010-06-16 2013-04-11 A2 Surgical Method for determining bone resection on a deformed bone surface from few parameters
US20130096373A1 (en) 2010-06-16 2013-04-18 A2 Surgical Method of determination of access areas from 3d patient images
US8439914B2 (en) 2008-02-08 2013-05-14 Texas Scottish Rite Hospital For Children External fixation strut
US20130121612A1 (en) 2008-08-29 2013-05-16 Peter F. Falco, Jr. Preventing pixel modification of an image based on a metric indicating distortion in a 2d representation of a 3d object
US20130172783A1 (en) 2011-12-29 2013-07-04 Mako Surgical Corp. Systems and Methods for Prosthetic Component Orientation
US8484001B2 (en) 2003-08-26 2013-07-09 Voyant Health Ltd. Pre-operative medical planning system and method for use thereof
US20130201212A1 (en) 2012-02-03 2013-08-08 Orthohub, Inc. External Fixator Deformity Correction Systems and Methods
RU2489106C2 (en) 2011-11-01 2013-08-10 Федеральное государственное бюджетное учреждение "Российский ордена Трудового Красного Знамени научно-исследовательский институт травматологии и ортопедии им. Р.Р. Вредена" Министерства здравоохранения и социального развития Российской Федерации (ФГБУ "РНИИТО им. Р.Р. Вредена" Минздравсоцразвития Ро Method of osteosynthesis by ortho-suv apparatus in case of deformations of midfoot
US20130211792A1 (en) * 2011-12-30 2013-08-15 Mako Surgical Corp. Systems and methods for customizing interactive haptic boundaries
US8617171B2 (en) 2007-12-18 2013-12-31 Otismed Corporation Preoperatively planning an arthroplasty procedure and generating a corresponding patient specific arthroplasty resection guide
US20140039663A1 (en) 2012-07-31 2014-02-06 Makerbot Industries, Llc Augmented three-dimensional printing
US20140073907A1 (en) 2012-09-12 2014-03-13 Convergent Life Sciences, Inc. System and method for image guided medical procedures
US8715291B2 (en) 2007-12-18 2014-05-06 Otismed Corporation Arthroplasty system and related methods
US8731885B2 (en) 2007-03-06 2014-05-20 The Cleveland Clinic Foundation Method and apparatus for preparing for a surgical procedure
US8737700B2 (en) 2007-12-18 2014-05-27 Otismed Corporation Preoperatively planning an arthroplasty procedure and generating a corresponding patient specific arthroplasty resection guide
US20140189508A1 (en) 2012-12-31 2014-07-03 Mako Surgical Corp. Systems and methods for guiding a user during surgical planning
US8777946B2 (en) 2009-10-05 2014-07-15 Aalto University Foundation Anatomically customized and mobilizing external support, method for manufacture
EP2767252A1 (en) * 2013-02-19 2014-08-20 Stryker Trauma GmbH Software for planning deformity correction
US20140270437A1 (en) 2013-03-14 2014-09-18 Reuven R. Shreiber Method for efficient digital subtraction angiography
US20140303486A1 (en) 2013-03-07 2014-10-09 Adventist Health System/Sunbelt, Inc. Surgical Navigation Planning System and Associated Methods
US8860753B2 (en) 2004-04-13 2014-10-14 University Of Georgia Research Foundation, Inc. Virtual surgical system and methods
US8864763B2 (en) 2013-03-13 2014-10-21 DePuy Synthes Products, LLC External bone fixation device
US20140324403A1 (en) 2011-12-09 2014-10-30 Brainlab Ag Determining a range of motion of an anatomical joint
US20140328460A1 (en) 2013-05-06 2014-11-06 Siemens Aktiengesellschaft Method and device for assisting in the treatment of bone fractures
US20140343586A1 (en) 2012-01-31 2014-11-20 Fujifilm Corporation Surgery assistance apparatus, surgery assistance method and non-transitory computer-readable recording medium having stored therein surgery assistance program
US20140350389A1 (en) 2013-05-21 2014-11-27 Autonomic Technologies, Inc. System and method for surgical planning and navigation to facilitate placement of a medical device within a target region of a patient
US20140357984A1 (en) 2013-05-30 2014-12-04 Translucent Medical, Inc. System and method for displaying anatomy and devices on a movable display
US20140379356A1 (en) 2013-06-20 2014-12-25 Rohit Sachdeva Method and system for integrated orthodontic treatment planning using unified workstation
US8923590B2 (en) 2011-01-20 2014-12-30 Siemens Aktiengesellschaft Method and system for 3D cardiac motion estimation from single scan of C-arm angiography
US8945128B2 (en) 2010-08-11 2015-02-03 Stryker Trauma Sa External fixator system
US20150049083A1 (en) 2013-08-13 2015-02-19 Benjamin J. Bidne Comparative Analysis of Anatomical Items
US20150087965A1 (en) 2013-09-20 2015-03-26 Junichi Tokuda System and method for automatic detection and registration of medical images
US9101398B2 (en) 2012-08-23 2015-08-11 Stryker Trauma Sa Bone transport external fixation frame
US20150238271A1 (en) * 2014-02-25 2015-08-27 JointPoint, Inc. Systems and Methods for Intra-Operative Image Analysis
US20160331463A1 (en) 2014-01-10 2016-11-17 Ao Technology Ag Method for generating a 3d reference computer model of at least one anatomical structure

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004071314A1 (en) * 2003-02-12 2004-08-26 Tsuyoshi Murase Member for assisting cutting of diseased bone and member for assisting judgment of corrected position
US8202273B2 (en) 2007-04-28 2012-06-19 John Peter Karidis Orthopedic fixation device with zero backlash and adjustable compliance, and process for adjusting same
US8690808B2 (en) * 2010-05-28 2014-04-08 Fixes 4 Kids Inc. Systems, devices, and methods for mechanically reducing and fixing bone fractures
CA2809002C (en) 2010-08-20 2017-11-21 Amei Technologies, Inc. Method and system for roentgenography-based modeling
US10540479B2 (en) 2011-07-15 2020-01-21 Stephen B. Murphy Surgical planning system and method
TWM433186U (en) 2012-03-22 2012-07-11 National Yang-Ming Univ Bone-plate type multi-axial external fastener
US9017339B2 (en) 2012-04-26 2015-04-28 Stryker Trauma Gmbh Measurement device for external fixation frame
EP2872065A1 (en) * 2012-07-12 2015-05-20 AO Technology AG Method for generating a graphical 3d computer model of at least one anatomical structure in a selectable pre-, intra-, or postoperative status
US9039706B2 (en) 2013-03-13 2015-05-26 DePuy Synthes Products, Inc. External bone fixation device
BR112015023127B1 (en) 2013-03-15 2022-02-08 Texas Scottish Rite Hospital For Children METHOD TO DETERMINE THE POSITION OF AN OBJECT USING MARKER OR RUBBER PROJECTIONS
US20160092651A1 (en) 2013-05-14 2016-03-31 Smith & Nephew, Inc. Apparatus and method for administering a medical device prescription
US10258377B1 (en) 2013-09-27 2019-04-16 Orthex, LLC Point and click alignment method for orthopedic surgeons, and surgical and clinical accessories and devices
US10082384B1 (en) 2015-09-10 2018-09-25 Stryker European Holdings I, Llc Systems and methods for detecting fixation frame parameters
US9959689B2 (en) * 2015-11-23 2018-05-01 Tesla Laboratories Llc System and method for creation of unique identification for use in gathering survey data from a mobile device at a live event
US10603122B2 (en) 2016-02-17 2020-03-31 Rowan University Surgical robot
US10010346B2 (en) * 2016-04-20 2018-07-03 Stryker European Holdings I, Llc Ring hole planning for external fixation frames
US10251705B2 (en) 2016-06-02 2019-04-09 Stryker European Holdings I, Llc Software for use with deformity correction
US10835318B2 (en) 2016-08-25 2020-11-17 DePuy Synthes Products, Inc. Orthopedic fixation control and manipulation

Patent Citations (132)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5681309A (en) 1993-06-10 1997-10-28 Texas Scottish Rite Hospital For Crippled Children Distractor mechanism for external fixation device
US6112109A (en) 1993-09-10 2000-08-29 The University Of Queensland Constructive modelling of articles
US5546942A (en) 1994-06-10 1996-08-20 Zhang; Zhongman Orthopedic robot and method for reduction of long-bone fractures
US5971984A (en) 1995-03-01 1999-10-26 Smith & Nephew, Inc. Method of using an orthopaedic fixation device
US5702389A (en) 1995-03-01 1997-12-30 Smith & Nephew Richards, Inc. Orthopaedic fixation device
US5728095A (en) 1995-03-01 1998-03-17 Smith & Nephew, Inc. Method of using an orthopaedic fixation device
US5682886A (en) 1995-12-26 1997-11-04 Musculographics Inc Computer-assisted surgical system
US5769092A (en) 1996-02-22 1998-06-23 Integrated Surgical Systems, Inc. Computer-aided system for revision total hip replacement surgery
US5824085A (en) 1996-09-30 1998-10-20 Integrated Surgical Systems, Inc. System and method for cavity generation for surgical planning and initial placement of a bone prosthesis
US5880976A (en) 1997-02-21 1999-03-09 Carnegie Mellon University Apparatus and method for facilitating the implantation of artificial components in joints
US6205411B1 (en) 1997-02-21 2001-03-20 Carnegie Mellon University Computer-assisted surgery planner and intra-operative guidance system
US5891143A (en) 1997-10-20 1999-04-06 Smith & Nephew, Inc. Orthopaedic fixation plate
USRE40914E1 (en) 1997-10-20 2009-09-08 Smith & Nephew, Inc. Orthopaedic fixation plate
US6030386A (en) 1998-08-10 2000-02-29 Smith & Nephew, Inc. Six axis external fixator strut
US6129727A (en) 1999-03-02 2000-10-10 Smith & Nephew Orthopaedic spatial frame apparatus
US20020010465A1 (en) 2000-01-31 2002-01-24 Ja Kyo Koo Frame fixator and operation system thereof
US7837621B2 (en) 2000-04-07 2010-11-23 Carnegie Mellon University Computer-aided bone distraction
US20040039259A1 (en) 2000-04-07 2004-02-26 Norman Krause Computer-aided bone distraction
US6701174B1 (en) 2000-04-07 2004-03-02 Carnegie Mellon University Computer-aided bone distraction
US20040068187A1 (en) * 2000-04-07 2004-04-08 Krause Norman M. Computer-aided orthopedic surgery
US7039225B2 (en) 2000-09-18 2006-05-02 Fuji Photo Film Co., Ltd. Artificial bone template selection system, artificial bone template display system, artificial bone template storage system and artificial bone template recording medium
US6711432B1 (en) 2000-10-23 2004-03-23 Carnegie Mellon University Computer-aided orthopedic surgery
US7547307B2 (en) 2001-02-27 2009-06-16 Smith & Nephew, Inc. Computer assisted knee arthroplasty instrumentation, systems, and processes
US20080243127A1 (en) 2001-05-25 2008-10-02 Conformis, Inc. Surgical Tools for Arthroplasty
US20050215997A1 (en) 2002-04-05 2005-09-29 Ed Austin Orthopaedic fixation method and device with delivery and presentation features
US20030191466A1 (en) 2002-04-05 2003-10-09 Ed Austin Orthopaedic fixation method and device
US20040073211A1 (en) 2002-04-05 2004-04-15 Ed Austin Orthopaedic fixation method and device with delivery and presentation features
US20050004451A1 (en) 2002-04-26 2005-01-06 Stefan Vilsmeier Planning and navigation assistance using two-dimensionally adapted generic and detected patient data
US20060015120A1 (en) 2002-04-30 2006-01-19 Alain Richard Determining femoral cuts in knee surgery
US7280683B2 (en) 2002-07-22 2007-10-09 Compumed, Inc. Method, code, and system for assaying joint deformity
US7388972B2 (en) 2002-09-26 2008-06-17 Meridian Technique Limited Orthopaedic surgery planning
US20050054917A1 (en) 2002-09-26 2005-03-10 David Kitson Orthopaedic surgery planning
US20040073212A1 (en) 2002-10-15 2004-04-15 Kim Jung Jae Extracorporeal fixing device for a bone fracture
US8419732B2 (en) 2002-11-14 2013-04-16 Sixfix, Inc. Method for using a fixator device
US20110103676A1 (en) 2002-11-14 2011-05-05 Extraortho, Inc. Method for using a fixator device
US8484001B2 (en) 2003-08-26 2013-07-09 Voyant Health Ltd. Pre-operative medical planning system and method for use thereof
US20070133845A1 (en) 2003-11-13 2007-06-14 Maxim Fradkin Three-dimensional segmentation using deformable surfaces
US8860753B2 (en) 2004-04-13 2014-10-14 University Of Georgia Research Foundation, Inc. Virtual surgical system and methods
US20050267360A1 (en) 2004-04-26 2005-12-01 Rainer Birkenbach Visualization of procedural guidelines for a medical procedure
US7394946B2 (en) 2004-05-18 2008-07-01 Agfa Healthcare Method for automatically mapping of geometric objects in digital medical images
US20090054887A1 (en) 2004-10-06 2009-02-26 Covidien Ag Systems and Methods for Thermally Profiling Radiofrequency Electrodes
US20060079745A1 (en) 2004-10-07 2006-04-13 Viswanathan Raju R Surgical navigation with overlay on anatomical images
US20060161052A1 (en) 2004-12-08 2006-07-20 Perception Raisonnement Action En Medecine Computer assisted orthopaedic surgery system for ligament graft reconstruction
US20060189842A1 (en) 2005-02-14 2006-08-24 Hoeg Hans D Method for using variable direction of view endoscopy in conjunction with image guided surgical systems
US8055487B2 (en) 2005-02-22 2011-11-08 Smith & Nephew, Inc. Interactive orthopaedic biomechanics system
US20060276786A1 (en) 2005-05-25 2006-12-07 Brinker Mark R Apparatus for accurately positioning fractured bone fragments toward facilitating use of an external ring fixator system
US20070055234A1 (en) 2005-06-10 2007-03-08 Mcgrath William M External fixation system with provisional brace
US20070078678A1 (en) 2005-09-30 2007-04-05 Disilvestro Mark R System and method for performing a computer assisted orthopaedic surgical procedure
US20100130858A1 (en) 2005-10-06 2010-05-27 Osamu Arai Puncture Treatment Supporting Apparatus
US20070219561A1 (en) 2006-03-20 2007-09-20 Perception Raisonnement Action En Medecine Distractor system
US20110116041A1 (en) 2006-04-11 2011-05-19 Hartung Paul D Ocular Imaging
US20080051779A1 (en) 2006-08-02 2008-02-28 The Nemours Foundation Force-controlled autodistraction
US20080119719A1 (en) * 2006-08-21 2008-05-22 The Regents Of The University Of California Templates for assessing bone quality and methods of use thereof
DE102006048451A1 (en) 2006-10-11 2008-04-17 Siemens Ag Method for virtual adjustment of an object, e.g. an implant, relative to a body part of a patient, e.g. a leg, in which the object is automatically and smoothly adjusted until a tolerance measure reaches a desired threshold value
US20080108912A1 (en) 2006-11-07 2008-05-08 General Electric Company System and method for measurement of clinical parameters of the knee for use during knee replacement surgery
US20080137923A1 (en) 2006-12-06 2008-06-12 Siemens Medical Solutions Usa, Inc. X-Ray Identification of Interventional Tools
US20080319448A1 (en) 2006-12-12 2008-12-25 Perception Raisonnement Action En Medecine System and method for determining an optimal type and position of an implant
US20080177203A1 (en) 2006-12-22 2008-07-24 General Electric Company Surgical navigation planning system and method for placement of percutaneous instrumentation and implants
US20080198966A1 (en) 2007-01-31 2008-08-21 Sectra Mamea Ab Method and Arrangement Relating to X-Ray Imaging
US20100036393A1 (en) 2007-03-01 2010-02-11 Titan Medical Inc. Methods, systems and devices for threedimensional input, and control methods and systems based thereon
US8731885B2 (en) 2007-03-06 2014-05-20 The Cleveland Clinic Foundation Method and apparatus for preparing for a surgical procedure
US8157800B2 (en) 2007-03-21 2012-04-17 Vvedensky Pyotr S Computer-aided system for limb lengthening
US20080234554A1 (en) 2007-03-21 2008-09-25 Vvedensky Pyotr S Computer-Aided System for Limb Lengthening
US8296094B2 (en) 2007-04-04 2012-10-23 Smith & Nephew, Inc. Analysis of parallel manipulators
US7967868B2 (en) 2007-04-17 2011-06-28 Biomet Manufacturing Corp. Patient-modified implant and associated method
US20080275467A1 (en) 2007-05-02 2008-11-06 Siemens Corporate Research, Inc. Intraoperative guidance for endovascular interventions via three-dimensional path planning, x-ray fluoroscopy, and image overlay
US20100286995A1 (en) 2007-07-27 2010-11-11 Koninklijke Philips Electronics N.V. Interactive atlas to image registration
US8265949B2 (en) 2007-09-27 2012-09-11 Depuy Products, Inc. Customized patient surgical plan
US20110009868A1 (en) 2007-09-28 2011-01-13 Takashi Sato Apparatus for preoperative planning of artificial knee joint replacement operation and jig for supporting operation
WO2009076296A2 (en) 2007-12-06 2009-06-18 Smith & Nephew, Inc. Systems and methods for determining the mechanical axis of a femur
US20110029116A1 (en) * 2007-12-06 2011-02-03 Jason Sean Jordan Systems and methods for determining the mechanical axis of a femur
US8737700B2 (en) 2007-12-18 2014-05-27 Otismed Corporation Preoperatively planning an arthroplasty procedure and generating a corresponding patient specific arthroplasty resection guide
US8617171B2 (en) 2007-12-18 2013-12-31 Otismed Corporation Preoperatively planning an arthroplasty procedure and generating a corresponding patient specific arthroplasty resection guide
US8715291B2 (en) 2007-12-18 2014-05-06 Otismed Corporation Arthroplasty system and related methods
US8439914B2 (en) 2008-02-08 2013-05-14 Texas Scottish Rite Hospital For Children External fixation strut
US20110004199A1 (en) 2008-02-18 2011-01-06 Texas Scottish Rite Hospital For Children Tool and method for external fixation strut adjustment
US8864750B2 (en) 2008-02-18 2014-10-21 Texas Scottish Rite Hospital For Children Tool and method for external fixation strut adjustment
US8311306B2 (en) 2008-04-30 2012-11-13 Otismed Corporation System and method for image segmentation in generating computer models of a joint to undergo arthroplasty
US20110188726A1 (en) 2008-06-18 2011-08-04 Ram Nathaniel Method and system for stitching multiple images into a panoramic image
US20130121612A1 (en) 2008-08-29 2013-05-16 Peter F. Falco, Jr. Preventing pixel modification of an image based on a metric indicating distortion in a 2d representation of a 3d object
US20120130687A1 (en) 2008-09-19 2012-05-24 Smith & Nephew, Inc. Tuning Implants For Increased Performance
US20100087819A1 (en) 2008-10-07 2010-04-08 Extraortho, Inc. Forward Kinematic Solution for a Hexapod Manipulator and Method of Use
US20110304332A1 (en) 2009-02-25 2011-12-15 Mohamed Rashwan Mahfouz Intelligent cartilage system
US8333766B2 (en) 2009-03-10 2012-12-18 Stryker Trauma Sa External fixation system
WO2010104567A1 (en) 2009-03-10 2010-09-16 Stryker Trauma Sa External fixation system
US20120155732A1 (en) 2009-06-26 2012-06-21 University Of South Florida CT Atlas of Musculoskeletal Anatomy to Guide Treatment of Sarcoma
US8777946B2 (en) 2009-10-05 2014-07-15 Aalto University Foundation Anatomically customized and mobilizing external support, method for manufacture
US20110103556A1 (en) 2009-11-02 2011-05-05 Carn Ronald M Alignment fixture for x-ray images
US8257353B2 (en) 2010-02-24 2012-09-04 Wright Medical Technology, Inc. Orthopedic external fixation device
US20110313418A1 (en) 2010-05-19 2011-12-22 Arkadijus Nikonovas Orthopedic fixation with imagery analysis
US20130096373A1 (en) 2010-06-16 2013-04-18 A2 Surgical Method of determination of access areas from 3d patient images
US20130089253A1 (en) * 2010-06-16 2013-04-11 A2 Surgical Method for determining bone resection on a deformed bone surface from few parameters
US20110313424A1 (en) 2010-06-18 2011-12-22 Howmedica Osteonics Corp. Patient-specific total hip arthroplasty
US20110313419A1 (en) 2010-06-22 2011-12-22 Extraortho, Inc. Hexapod External Fixation System with Collapsing Connectors
US8945128B2 (en) 2010-08-11 2015-02-03 Stryker Trauma Sa External fixator system
RU2448663C1 (en) 2010-11-19 2012-04-27 Federal State Institution "R.R. Vreden Russian Research Institute of Traumatology and Orthopedics" of the Ministry of Health and Social Development of the Russian Federation Method of osteosynthesis with Ortho-SUV apparatus for treating injuries of distal one-third of femur
US8923590B2 (en) 2011-01-20 2014-12-30 Siemens Aktiengesellschaft Method and system for 3D cardiac motion estimation from single scan of C-arm angiography
US20120214121A1 (en) * 2011-01-26 2012-08-23 Greenberg Surgical Technologies, Llc Orthodontic Treatment Integrating Optical Scanning and CT Scan Data
US20120330312A1 (en) 2011-06-23 2012-12-27 Stryker Trauma Gmbh Methods and systems for adjusting an external fixation frame
US20120328174A1 (en) 2011-06-24 2012-12-27 Rajendra Prasad Jadiyappa System and method for processing an x-ray image of an organ
WO2013013170A1 (en) 2011-07-20 2013-01-24 Smith & Nephew, Inc. Systems and methods for optimizing fit of an implant to anatomy
RU2471447C1 (en) 2011-11-01 2013-01-10 Federal State Budgetary Institution "R.R. Vreden Russian Research Institute of Traumatology and Orthopedics" of the Ministry of Health and Social Development of the Russian Federation Method of osteosynthesis by Ortho-SUV apparatus in treatment of injuries of proximal third of femoral bone
RU2489106C2 (en) 2011-11-01 2013-08-10 Federal State Budgetary Institution "R.R. Vreden Russian Research Institute of Traumatology and Orthopedics" of the Ministry of Health and Social Development of the Russian Federation Method of osteosynthesis by Ortho-SUV apparatus in case of deformations of midfoot
US20140324403A1 (en) 2011-12-09 2014-10-30 Brainlab Ag Determining a range of motion of an anatomical joint
US20130172783A1 (en) 2011-12-29 2013-07-04 Mako Surgical Corp. Systems and Methods for Prosthetic Component Orientation
US20130211792A1 (en) * 2011-12-30 2013-08-15 Mako Surgical Corp. Systems and methods for customizing interactive haptic boundaries
US20140343586A1 (en) 2012-01-31 2014-11-20 Fujifilm Corporation Surgery assistance apparatus, surgery assistance method and non-transitory computer-readable recording medium having stored therein surgery assistance program
US8952986B2 (en) 2012-02-03 2015-02-10 Orthohub, Inc. External fixator deformity correction systems and methods
US20130201212A1 (en) 2012-02-03 2013-08-08 Orthohub, Inc. External Fixator Deformity Correction Systems and Methods
US9524581B2 (en) 2012-02-03 2016-12-20 Stryker European Holdings I, Llc Orthopedic treatment device co-display systems and methods
US8654150B2 (en) 2012-02-03 2014-02-18 Orthohub, Inc. External fixator deformity correction systems and methods
US20140039663A1 (en) 2012-07-31 2014-02-06 Makerbot Industries, Llc Augmented three-dimensional printing
US9101398B2 (en) 2012-08-23 2015-08-11 Stryker Trauma Sa Bone transport external fixation frame
US20140073907A1 (en) 2012-09-12 2014-03-13 Convergent Life Sciences, Inc. System and method for image guided medical procedures
US20140189508A1 (en) 2012-12-31 2014-07-03 Mako Surgical Corp. Systems and methods for guiding a user during surgical planning
US9724129B2 (en) * 2013-02-19 2017-08-08 Stryker European Holdings I, Llc Software for use with deformity correction
EP2767252A1 (en) * 2013-02-19 2014-08-20 Stryker Trauma GmbH Software for planning deformity correction
US20160045225A1 (en) 2013-02-19 2016-02-18 Stryker European Holdings I, Llc Software for use with deformity correction
US9204937B2 (en) 2013-02-19 2015-12-08 Stryker Trauma Gmbh Software for use with deformity correction
US20170281233A1 (en) 2013-02-19 2017-10-05 Stryker European Holdings I, Llc Software for use with deformity correction
US20140236153A1 (en) * 2013-02-19 2014-08-21 Stryker Trauma Gmbh Software for use with deformity correction
US20140303486A1 (en) 2013-03-07 2014-10-09 Adventist Health System/Sunbelt, Inc. Surgical Navigation Planning System and Associated Methods
US8864763B2 (en) 2013-03-13 2014-10-21 DePuy Synthes Products, LLC External bone fixation device
US20140270437A1 (en) 2013-03-14 2014-09-18 Reuven R. Shreiber Method for efficient digital subtraction angiography
US20140328460A1 (en) 2013-05-06 2014-11-06 Siemens Aktiengesellschaft Method and device for assisting in the treatment of bone fractures
US20140350389A1 (en) 2013-05-21 2014-11-27 Autonomic Technologies, Inc. System and method for surgical planning and navigation to facilitate placement of a medical device within a target region of a patient
US20140357984A1 (en) 2013-05-30 2014-12-04 Translucent Medical, Inc. System and method for displaying anatomy and devices on a movable display
US20140379356A1 (en) 2013-06-20 2014-12-25 Rohit Sachdeva Method and system for integrated orthodontic treatment planning using unified workstation
US20150049083A1 (en) 2013-08-13 2015-02-19 Benjamin J. Bidne Comparative Analysis of Anatomical Items
US20150087965A1 (en) 2013-09-20 2015-03-26 Junichi Tokuda System and method for automatic detection and registration of medical images
US20160331463A1 (en) 2014-01-10 2016-11-17 Ao Technology Ag Method for generating a 3d reference computer model of at least one anatomical structure
US20150238271A1 (en) * 2014-02-25 2015-08-27 JointPoint, Inc. Systems and Methods for Intra-Operative Image Analysis

Non-Patent Citations (10)

* Cited by examiner, † Cited by third party
Title
Craveiro-Lopes, MD, Software Assisted "Ortho-SUV Frame", Int'l Congress on External Fixation & Bone Reconstruction, Oct. 22, 2010.
European Patent Office (ISA), International Search Report and Written Opinion dated Jun. 25, 2013 for International Application No. PCT/US2013/024548, International filing date Feb. 3, 2013.
Extended European Search Report for Application No. 14154820.6 dated Jun. 16, 2014.
Extended European Search Report for Application No. EP17173905, dated Oct. 30, 2017.
IMED Surgical, Adam Frame with Paley's Method, Workshop, Oct. 2010.
LITOS GmbH, "Ilizarov Hexapod System," available from http://d3llyibkg2zj6z.cloudfront.net/ImagemAnexo/Ilozarov-Hexapod-System.-PDF, dated May 23, 2007.
LITOS GmbH, "Ilizarov Hexapod System," available from http://d3llyibkg2zj6z.cloudfront.net/ImagemAnexo/Ilozarov-Hexapod-System.—PDF, dated May 23, 2007.
Response Ortho LLC, Smart Correction Computer Assisted Circular Hexapod System Brochure, date not known.
Smart Correction, Computer-Assisted Circular External Fixator System, website printout, Feb. 2, 2011.
Vreden Russian Research Institute of Traumatology and Orthopedics Ortho-SUV Ltd., Deformity Correction and Fracture Treatment by Software-based Ortho-SUV Frame, Saint-Petersburg, 2013.

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10881433B2 (en) 2013-02-19 2021-01-05 Stryker European Operations Holdings Llc Software for use with deformity correction
US11819246B2 (en) 2013-02-19 2023-11-21 Stryker European Operations Holdings Llc Software for use with deformity correction
US10603112B2 (en) * 2016-06-02 2020-03-31 Stryker European Holdings I, Llc Software for use with deformity correction
US11020186B2 (en) * 2016-06-02 2021-06-01 Stryker European Operations Holdings Llc Software for use with deformity correction
US20210244476A1 (en) * 2016-06-02 2021-08-12 Stryker European Operations Holdings Llc Software for Use with Deformity Correction
US11553965B2 (en) * 2016-06-02 2023-01-17 Stryker European Operations Holdings Llc Software for use with deformity correction
US12029496B2 (en) * 2016-06-02 2024-07-09 Stryker European Operations Holdings Llc Software for use with deformity correction
US10687899B1 (en) * 2016-07-05 2020-06-23 Smith & Nephew, Inc. Bone model correction angle determination
US11877802B2 (en) 2020-12-30 2024-01-23 DePuy Synthes Products, Inc. Perspective frame matching process for deformed fixation rings
US11890058B2 (en) 2021-01-21 2024-02-06 Arthrex, Inc. Orthopaedic planning systems and methods of repair
US11759216B2 (en) 2021-09-22 2023-09-19 Arthrex, Inc. Orthopaedic fusion planning systems and methods of repair
US12127752B2 (en) 2023-07-07 2024-10-29 Arthrex, Inc. Orthopaedic fusion planning systems and methods of repair

Also Published As

Publication number Publication date
US11020186B2 (en) 2021-06-01
US20210244476A1 (en) 2021-08-12
US12029496B2 (en) 2024-07-09
US20230111705A1 (en) 2023-04-13
EP3251625B1 (en) 2022-04-27
US20190183581A1 (en) 2019-06-20
US10603112B2 (en) 2020-03-31
US20240325086A1 (en) 2024-10-03
EP3251625A1 (en) 2017-12-06
US11553965B2 (en) 2023-01-17
US20200179055A1 (en) 2020-06-11
US20170348057A1 (en) 2017-12-07
US20170348054A1 (en) 2017-12-07
US10154884B2 (en) 2018-12-18

Similar Documents

Publication Publication Date Title
US12029496B2 (en) Software for use with deformity correction
US11819246B2 (en) Software for use with deformity correction
US11918292B2 (en) Orthopedic fixation control and manipulation
US6711432B1 (en) Computer-aided orthopedic surgery
EP2512360B1 (en) Visualization guided acl localization system
US20040068187A1 (en) Computer-aided orthopedic surgery
EP1955668B1 (en) Method and device for the determination of alignment information during sonographically navigable repositioning of bone fragments
EP3012759A1 (en) Method for planning, preparing, accompaniment, monitoring and/or final control of a surgical procedure in the human or animal body, system for carrying out such a procedure and use of the device
JP5898186B2 (en) A method to determine bone resection on deformed bone surface from a small number of parameters
JP2005521534A (en) Orthopedic fixation method and apparatus having delivery and display features
JP2017507689A (en) Method for generating a 3D reference computer model of at least one anatomical structure
JP2004254899A (en) Surgery supporting system and surgery supporting method
JP7500601B2 (en) Orthopedic Fixation Control and Visualization
Lai et al. Computer-Aided Preoperative Planning and Virtual Simulation in Orthopedic

Legal Events

Date Code Title Description
AS Assignment

Owner name: STRYKER EUROPEAN HOLDINGS I, LLC, MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUMAR, ANUP;ANJANAPPA, SRIDHAR;GANGWAR, ASHISH;AND OTHERS;SIGNING DATES FROM 20160728 TO 20170605;REEL/FRAME:042615/0406

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: STRYKER EUROPEAN HOLDINGS III, LLC, DELAWARE

Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:STRYKER EUROPEAN HOLDINGS I, LLC;REEL/FRAME:055019/0258

Effective date: 20210114

AS Assignment

Owner name: STRYKER EUROPEAN OPERATIONS HOLDINGS LLC, MICHIGAN

Free format text: CHANGE OF NAME;ASSIGNOR:STRYKER EUROPEAN HOLDINGS III, LLC;REEL/FRAME:055117/0114

Effective date: 20190226

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4