(19)
(11)EP 2 814 644 B1

(12)EUROPEAN PATENT SPECIFICATION

(45)Mention of the grant of the patent:
24.06.2020 Bulletin 2020/26

(21)Application number: 13748533.0

(22)Date of filing:  11.02.2013
(51)International Patent Classification (IPC): 
A61B 34/30(2016.01)
B25J 9/16(2006.01)
(86)International application number:
PCT/US2013/025630
(87)International publication number:
WO 2013/122889 (22.08.2013 Gazette  2013/34)

(54)

SWITCHING CONTROL OF AN INSTRUMENT TO AN INPUT DEVICE UPON THE INSTRUMENT ENTERING A DISPLAY AREA VIEWABLE BY AN OPERATOR OF THE INPUT DEVICE

SCHALTSTEUERUNG EINES INSTRUMENTS MIT EINER EINGABEVORRICHTUNG AUF DEM INSTRUMENT ZUM AUFRUFEN EINES ANZEIGEBEREICHS ZUR ANZEIGE FÜR EINEN BEDIENER DER EINGABEVORRICHTUNG

COMMANDE DE COMMUTATION D'UN INSTRUMENT VERS UN DISPOSITIF D'ENTRÉE LORS DE L'ENTRÉE DE L'INSTRUMENT DANS UNE ZONE D'AFFICHAGE OBSERVABLE PAR L'OPÉRATEUR DU DISPOSITIF D'ENTRÉE


(84)Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

(30)Priority: 15.02.2012 US 201261599208 P

(43)Date of publication of application:
24.12.2014 Bulletin 2014/52

(73)Proprietor: Intuitive Surgical Operations, Inc.
Sunnyvale, CA 94086 (US)

(72)Inventor:
  • DIOLAITI, Nicola
    Menlo Park, California 94025 (US)

(74)Representative: MacDougall, Alan John Shaw et al
Mathys & Squire LLP The Shard 32 London Bridge Street
London SE1 9SG (GB)


(56)References cited:
WO-A2-2007/146987
US-A- 6 102 850
US-A1- 2009 088 773
US-A1- 2010 228 264
US-B2- 7 865 266
KR-A- 20110 081 153
US-A1- 2007 265 495
US-A1- 2009 326 556
US-B2- 7 543 588
    Note: Within nine months from the publication of the mention of the grant of the European patent, any person may give notice to the European Patent Office of opposition to the European patent granted. Notice of opposition shall be filed in a written reasoned statement. It shall not be deemed to have been filed until the opposition fee has been paid. (Art. 99(1) European Patent Convention).


    Description

    FIELD OF THE INVENTION



    [0001] The present invention generally relates to robotic systems for switching control of an instrument to an input device upon the instrument entering a display area viewable by an operator of the input device.

    BACKGROUND OF THE INVENTION



    [0002] In a robotic system, a plurality of instruments may be telerobotically controlled by an operator to perform a procedure on an object at a work site. A camera is provided at the work site to capture images of end effectors of the instruments as they interact with the object to perform the procedure, so that the operator may view their movements on a display screen while controlling them through associated input devices.

    [0003] While performing the procedure, it may be necessary to introduce a new instrument to the work site to perform a needed function; replace an instrument already there with a new instrument; replace an end effector on an instrument already there with a new end effector; introduce some accessory item to the work site for performing the procedure by attaching it to an instrument and inserting the instrument to the work site; or remove an item from the work site by attaching it to an instrument, retracting the instrument and removing the item from the instrument, and reinserting the instrument back to the work site so that it may once again be available for use there.

    [0004] In each of these applications, it may be advantageous to have an assistant control all or a part of these actions while the primary operator continues to perform the procedure using the remaining instruments. Once the new or the retracted and reinserted instrument becomes available at the work site, its control may be transferred to the primary operator so that the instrument is available for use by the primary operator at the work site.

    [0005] To insert the new or the retracted and reinserted instrument to the work site, however, may be a time consuming and difficult task for the assistant. Also, it may be difficult for the assistant to bring the new or reinserted instrument into the field of view of the on-site camera so that it may be viewed by the primary operator on the display screen before switching control of the instrument to the primary operator. It is also possible for the assistant to misjudge the depth of insertion of the new or reinserted instrument and place its distal end too deep into the work site, which may cause unintended contact between the instrument and objects at the work site. To avoid such unintended and possibly harmful contact, the assistant is likely to move the new or reinserted instrument very slowly into the work site.

    [0006] U.S. Pat. No. 6,645,196 describes a guided tool exchange procedure employable in a medical robotic system to guide a new tool quickly and precisely, after a tool exchange operation, into close proximity to the operating position of the original tool prior to its removal from a work site.

    [0007] During the performance of a procedure, however, the camera pose may change so that the camera may get a better view of the working ends of instruments as the instruments move while performing a procedure on an object at the work site. In such a situation, inserting a replacement or reinserting the old instrument to its former position may be undesirable since it may not be a good location for the new instrument and further because it may be outside the current field of view of the camera. In addition, it may be desirable to allow the primary operator to assume control of the new instrument as soon as practical so as not to unduly delay the continued performance of the procedure using the new instrument.

    [0008] US 2009/088773 describes a robotic medical system that includes a stereo endoscopic camera having a field of view; instruments having working ends, operator manipulatable input devices, a processor, and a processor readable medium for storing program code to be executed by the processor. The processor implements a tool tracking system that performs tool acquisition and tool tracking by fusing kinematics and visual information to obtain pose information for one or more of the instruments. In particular, the goal of the tool acquisition stage is to obtain the absolute pose information of one or more robotic instruments within the field of view of the camera. The goal of the tool tracking stage is to dynamically update the absolute pose of a moving robotic instrument.

    [0009] US 2007/265495 describes a field of view control system which includes a camera and a display for providing visual feedback to a user who is manipulating tools. The camera has a field of view that may be viewed by the user as an image on the display. One or more markers are applied to the tools to permit a tracking controller to autonomously move the camera in order to keep the markers within the field of view, so that images of the tools may continuously be seen on the display by the user while the user is manipulating the tools.

    OBJECTS AND SUMMARY OF THE INVENTION



    [0010] Accordingly, one object of one or more aspects of the present invention is a robotic system implemented therein, as set out in the appended claims, that switches control of an instrument from one of an assistant control mode and a processor control mode to an operator control mode, the assistant control mode being a mode in which an assistant manually controls movement of the instrument, the processor control mode being a mode in which the processor responds to programmed instructions stored in a memory to control movement of the instrument, the operator control mode being a mode in which the processor responds to operator manipulation of the input device to control movement of the instrument, upon determining that a working end of the instrument has entered an area of a field of view of a camera.
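    The switching condition described above — a working end entering an area of the camera's field of view — can be sketched as a simple geometric containment test. The symmetric viewing-cone model, the function name, and the near/far limits below are illustrative assumptions for this sketch, not the claimed method:

```python
import math

def working_end_in_view(point_cam, fov_half_angle, near, far):
    """Illustrative test: is a point (in the camera frame, with z along
    the viewing axis) inside a conical field of view between depth
    limits near and far?  All parameters are assumptions."""
    x, y, z = point_cam
    if not (near <= z <= far):
        # Outside the usable depth range of the camera.
        return False
    # Inside the viewing cone when the radial offset from the view axis
    # is within the cone opening at that depth.
    return math.hypot(x, y) <= z * math.tan(fov_half_angle)
```

    A switching routine built on such a test would poll the working end's estimated position in the camera frame and transfer control only once the test returns true.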

    [0011] Also described is a robotic system and exemplary method implemented therein that is compatible with an instrument or tool exchange procedure.

    [0012] Also described is a robotic system and exemplary method implemented therein that prevents harm to an object at a work site while an instrument is being guided to the work site.

    [0013] Also described is a robotic system and exemplary method implemented therein that warns an operator of the robotic system if an insertion of an instrument towards a work site reaches a maximum insertion distance without the instrument entering a display area viewable by the operator on a display screen.

    [0014] Also described is a robotic system comprising: a camera; an instrument; an input device; and a processor configured to switch control of the instrument to the input device upon determining that the instrument has entered an area of a field of view of the camera.

    [0015] Also described, and not part of the invention set out in the appended claims, is a method implemented in a robotic system to switch control of an instrument to an input device, the method comprising: switching control of the instrument to the input device upon determining that the instrument has entered an area of a field of view of a camera.

    [0016] Additional objects, features and advantages of the various aspects of the present invention will become apparent from the following description of its preferred embodiment, which description should be taken in conjunction with the accompanying drawings.

    BRIEF DESCRIPTION OF THE DRAWINGS



    [0017] 

    FIG. 1 illustrates a block diagram of a robotic system utilizing aspects of the present invention.

    FIG. 2 illustrates a flow diagram of a method of switching control of an instrument to an input device utilizing aspects of the present invention.

    FIG. 3 illustrates a flow diagram of a method for determining whether an instrument is within a viewing area of a display screen, which may be used in the method of FIG. 2 utilizing aspects of the present invention.

    FIG. 4 illustrates left and right views of a point in a camera reference frame.

    FIGS. 5 and 6 respectively illustrate a full left camera view being displayed on a left viewing area of a computer monitor and a partial left camera view being displayed on a left viewing area of a computer monitor.

    FIGS. 7a-7d illustrate schematic views of instruments at a work site in various stages of a guided tool exchange procedure performed in a multiple aperture medical robotic system utilizing aspects of the present invention.

    FIGS. 8a-8c illustrate schematic views of articulated instruments at a work site in different stages of a guided tool exchange procedure performed in a single aperture medical robotic system utilizing aspects of the present invention.

    FIG. 9 illustrates a top view of an operating room employing a multiple aperture medical robotic system utilizing aspects of the present invention.

    FIG. 10 illustrates a front view of a patient side cart for a multiple aperture medical robotic system utilizing aspects of the present invention.

    FIG. 11 illustrates a perspective view of an instrument usable in a multiple aperture medical robotic system utilizing aspects of the present invention.

    FIG. 12 illustrates a top view of an operating room employing a single aperture medical robotic system utilizing aspects of the present invention.

    FIG. 13 illustrates a perspective view of a distal end of an entry guide with articulated instruments extending out of it in a single aperture medical robotic system utilizing aspects of the present invention.

    FIG. 14 illustrates a cross-sectional view of an entry guide useful in a single aperture medical robotic system utilizing aspects of the present invention.

    FIG. 15 illustrates a perspective view of articulated instrument assemblies mounted on a platform coupled to a robotic arm assembly in a single aperture medical robotic system utilizing aspects of the present invention.

    FIG. 16 illustrates a front view of a console usable in a robotic system utilizing aspects of the present invention.


    DETAILED DESCRIPTION



    [0018] FIG. 1 illustrates, as an example, a block diagram of various components used by a method implemented in a robotic system 1000 for switching control of an instrument (also referred to as a "tool") to an input device upon the instrument entering a display area viewable by an operator of the input device. FIGS. 2-8 illustrate flow charts and other items describing the method. FIGS. 9-11 illustrate details of a first exemplary robotic system 2000 using multiple apertures for entry to a work site. FIGS. 12-15 illustrate details of a second exemplary robotic system 3000 using a single aperture for entry to a work site. FIG. 16 illustrates an exemplary operator's console that may be used with both of the robotic systems 2000 and 3000.

    [0019] Before describing details of the invention as illustrated in FIGS. 1-8, the exemplary robotic systems 2000 and 3000 will first be described to provide context and additional details on exemplary implementations of the robotic system 1000. Although medical robotic systems are described herein as examples of the robotic system 1000, it is to be appreciated that the various aspects of the invention as claimed herein are not to be limited to such types of robotic systems.

    [0020] Referring to FIG. 9, a top view of an operating room is illustrated in which a multiple aperture medical robotic system 2000 is being employed by a Surgeon ("S") to perform a medical procedure on a Patient ("P"). The medical robotic system in this case is a Minimally Invasive Robotic Surgical (MIRS) system including a Console ("C") utilized by the Surgeon while performing a minimally invasive diagnostic or surgical procedure on the Patient with assistance from one or more Assistants ("A") while the Patient is on an Operating table ("O").

    [0021] The Console, as further described in reference to FIG. 16, includes a processor 43 which communicates with a patient-side cart 150 over a bus 110. A plurality of robotic arms 34, 36, 38 are included on the cart 150. An instrument 33 is held and manipulated by robotic arm 34, another instrument 35 is held and manipulated by robotic arm 36, and an endoscope 37 is held and manipulated by robotic arm 38. The medical robotic system 2000 is referred to as being a multiple aperture medical robotic system, because multiple apertures are used so that each of its instruments is introduced through its own entry aperture in the Patient. As an example, instrument 35 is inserted into aperture 166 to enter the Patient.

    [0022] The Surgeon performs the medical procedure by manipulating the input devices 41, 42 so that the processor 43 causes their respectively associated robotic arms 34, 36 to manipulate their respective removably coupled instruments 33, 35 accordingly while the Surgeon views real-time images of a work site in three-dimensions ("3D") on a stereo vision display 45 (also referred to as "display screen") of the Console. A stereoscopic endoscope 37 (having left and right cameras for capturing left and right stereo views) captures stereo images of the work site. The processor 43 processes the stereo images so that they may be properly displayed on the stereo vision display 45.

    [0023] Each of the robotic arms 34, 36, 38 is conventionally formed of links, such as link 162, which are coupled together and manipulated through actuatable joints, such as joint 163. Each of the robotic arms includes a setup arm and an instrument manipulator. The setup arm positions its held instrument so that a pivot point occurs at its entry aperture into the Patient. The instrument manipulator may then manipulate its held instrument so that it may be pivoted about the pivot point, inserted into and retracted out of the entry aperture, and rotated about its shaft axis. The robotic arms 34, 36, 38 may be carted into the operating room via a Patient-Side Cart 150 or alternatively, they may be attached to sliders on a wall or ceiling of the operating room.

    [0024] FIG. 10 illustrates a front view of an exemplary Patient-Side Cart 150. In addition to the robotic arms 34, 36, 38, shown in FIG. 9, a fourth robotic arm 32 is shown in FIG. 10. The fourth robotic arm 32 is available so that another instrument 31 may be introduced at the work site along with the instruments 33, 35 and endoscope 37. Each of the robotic arms 32, 34, 36, 38 may be adapted with a Light Emitting Diode ("LED") array 1091 or other visual indicator to indicate status and/or other information for or related to the robotic arm. One or more buttons 1092 or other type of switch mechanism may also be provided on or near the robotic arm for various purposes such as to allow the Assistant to take control of the robotic arm so that the Assistant may manually retract and/or insert an instrument held by the robotic arm out of or into a corresponding entry aperture in the Patient.

    [0025] FIG. 11 illustrates an exemplary instrument 100 that may be used for either instrument 33 or 35. The instrument 100 comprises an interface housing 108, a shaft 104, an end effector 102, and a wrist mechanism 106 which includes one or more wrist joints. The interface housing 108 is removably attached to a robotic arm so as to be mechanically coupled to actuators (such as motors) in the instrument manipulator of the attached robotic arm. Cables or rods, that are coupled to the actuators of the instrument manipulator and extend through the shaft 104 from the interface housing 108 to the one or more wrist joints of the wrist mechanism 106 and to the jaws of the instrument's end effector 102, actuate the wrist joints and jaws in a conventional manner. The instrument manipulator may also manipulate the instrument in pitch and yaw angular rotations about its pivot point at the entry aperture, manipulate the instrument in a roll angular rotation about the instrument's shaft axis, and insert and retract the instrument along a rail on the robotic arm as commanded by the processor 43.
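    The pivot-point manipulation described above — pitch and yaw rotations about the entry aperture, plus insertion along the shaft — can be sketched with a minimal kinematic model. The axis conventions, function names, and units are assumptions for illustration only; a real instrument manipulator also actuates the wrist joints and jaws:

```python
import numpy as np

def rot_x(a):
    """Rotation matrix about the x-axis (pitch, in this sketch's convention)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    """Rotation matrix about the y-axis (yaw)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def tip_position(pitch, yaw, insertion):
    """Position of the instrument tip relative to the fixed pivot point at
    the entry aperture: the shaft is rotated in yaw and pitch about the
    pivot, and the tip lies `insertion` units along the rotated shaft axis."""
    shaft_axis = rot_y(yaw) @ rot_x(pitch) @ np.array([0.0, 0.0, 1.0])
    return insertion * shaft_axis
```

    Because the rotations preserve length, the tip always remains exactly `insertion` units from the pivot, which is the geometric constraint the setup arm establishes at the aperture.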

    [0026] The number of surgical tools used at one time and consequently, the number of robotic arms being used in the system 2000 will generally depend on the diagnostic or surgical procedure and the space constraints within the operating room, among other factors. If it is necessary to change one or more of the tools being used during a procedure, the Surgeon may instruct the Assistant to remove the instrument that is no longer being used from its robotic arm, and replace it with another instrument 131 from a Tray ("T") in the operating room. To aid the Assistant in identifying the instrument to be replaced, one or more LEDs of the LED color array 1091 on the robotic arm holding the instrument may be lit. Likewise, to aid the Assistant in identifying the replacement instrument 131, an LED that is adjacent the replacement instrument may be energized. Alternatively, other well known means for conveying such information may be used such as providing the information on a local display screen viewable by the Assistant or by voice instructions to the Assistant from the Surgeon over an audio communication system. To transfer control of the instrument (and its manipulator) from the Surgeon to the Assistant, either the Surgeon may activate a button on or adjacent to the input device associated with the instrument (such as button 49 adjacent the input device 42 as seen in FIG. 16), or alternatively, the Assistant may activate a button such as the mechanism 1092 on the robotic arm holding the instrument. After the Assistant is in control of the instrument, the Assistant may then retract the instrument out of the Patient by manually sliding it backwards along the rail on its robotic arm or insert the instrument into the Patient by manually sliding it forward along the rail.

    [0027] U.S. Patent No. 6,659,939 B2 entitled "Cooperative Minimally Invasive Telesurgical System," provides additional details on a multiple aperture medical robotic system such as described herein.

    [0028] As a second example of the robotic system 1000, FIG. 12 illustrates a top view of an operating room in which a single aperture medical robotic system 3000 is being employed by a Surgeon ("S") to perform a medical procedure on a Patient ("P"). The medical robotic system in this case is a Minimally Invasive Robotic Surgical (MIRS) system including a Console ("C") utilized by the Surgeon while performing a minimally invasive diagnostic or surgical procedure on the Patient with assistance from one or more Assistants ("A") while the Patient is on an Operating table ("O").

    [0029] In the single aperture medical robotic system 3000, a plurality of articulated instruments are introduced to a work site through a single entry aperture 1101 in the Patient by an entry guide (EG) 200. The aperture 1101 may be a minimally invasive incision or a natural body orifice. The entry guide 200 is a cylindrical structure which is held and manipulated by a robotic arm 2514. The robotic arm 2514 includes a setup arm and an entry guide manipulator. The setup arm is used to position the entry guide 200 at the aperture 1101 so that a pivot point occurs at the aperture. The entry guide manipulator may then be used to robotically insert and retract the entry guide 200 into and out of the aperture 1101. It may also be used to robotically pivot the entry guide 200 (and the articulated instruments disposed within it at the time) in pitch and yaw angular rotations about the pivot point. It may also rotate the entry guide 200 (and the articulated instruments disposed within it at the time) in roll about a longitudinal axis of the entry guide 200. Attached to the distal end of the robotic arm 2514 is a platform 2512 upon which instrument assemblies 2516 are mounted so that their respective instruments may extend through the entry guide 200. Each instrument assembly comprises an articulated instrument and its instrument manipulator.

    [0030] As shown in FIG. 13, the entry guide 200 has articulated instruments such as articulated surgical tool instruments 231, 241 and an articulated stereo camera instrument 211 (or other image capturing device instrument) extending out of its distal end. The camera instrument 211 has a pair of stereo image capturing devices 311, 312 and a fiber optic cable 313 (coupled at its proximal end to a light source) housed in its tip. The surgical tools 231, 241 have end effectors 331, 341. Although only two tools 231, 241 are shown, the entry guide 200 may guide additional tools as required for performing a medical procedure at a work site in the Patient. For example, as shown in a cross-sectional view of the entry guide 200 in FIG. 14, a passage 351 is available for extending another articulated surgical tool through the entry guide 200 and out through its distal end. Passages 431, 441, are respectively used by the articulated surgical tool instruments 231, 241, and passage 321 is used for the articulated camera instrument 211.

    [0031] Each of the articulated instruments comprises a plurality of actuatable joints and a plurality of links coupled to the joints. As an example, as shown in FIG. 13, the second articulated instrument 241 comprises first, second, and third links 322, 324, 326, first and second joints 323, 325, and a wrist assembly 327. The first joint 323 couples the first and second links 322, 324 and the second joint 325 couples the second and third links 324, 326 so that the second link 324 may pivot about the first joint 323 in pitch and yaw while the first and third links 322, 326 remain parallel to each other. The first, third, and camera articulated instruments, 231, 251, 211, may be similarly constructed and operated.

    [0032] FIG. 15 illustrates, as an example, articulated instrument assemblies 2516 mounted on a platform 2512 at a distal end of the robotic arm 2514. The entry guide 200 is attached to the platform 2512 so that entry guide 200 may be manipulated (along with the platform 2512) by the entry guide manipulator, as previously described. Each articulated instrument assembly includes an articulated instrument and its instrument manipulator. For example, an exemplary articulated instrument 2502a is mounted on an actuator assembly 2504 which includes a plurality of actuators for actuating joints of the articulated instrument. Instrument 2502a has a body tube 2506 that extends past its actuator assembly 2504 and enters the entry guide 200. Actuator assembly 2504 is mounted to a linear actuator 2510 (e.g. a servocontrolled lead screw and nut or a ball screw and nut assembly) that controls the insertion and retraction of the body tube 2506 into and out of the entry guide 200. The instrument manipulator 2520 in this case comprises the actuator assembly 2504 and the linear actuator 2510. In the case where the instrument 2502a is the articulated instrument 241, the distal end of the body tube 2506 is the first link 322 shown in FIG. 13. The second instrument 2502b is mounted with similar mechanisms as shown. In addition, an articulated camera instrument may be similarly mounted.

    [0033] FIG. 16 illustrates, as an example, a front view of a Console which may be used in both medical robotic systems 2000 and 3000. The Console has left and right input devices 41, 42 which the user may grasp respectively with his/her left and right hands to manipulate associated devices, such as the entry guide and articulated instruments, in preferably six degrees-of-freedom ("DOF"). Foot pedals 44 with toe and heel controls are provided on the Console so the user may control movement and/or actuation of devices associated with the foot pedals. A processor 43 is provided in the Console for control and other purposes. A stereo vision display 45 is also provided in the Console so that the user may view the work site in stereo vision from images captured by the stereoscopic camera of the endoscope 37 or the articulated camera instrument 211. Left and right eyepieces, 46 and 47, are provided in the stereo vision display 45 so that the user may view left and right two-dimensional ("2D") display screens inside the display 45 respectively with the user's left and right eyes.

    [0034] The processor 43 performs various functions in the medical robotic system. One important function that it performs is to translate and transfer the mechanical motion of input devices 41, 42 through control signals over bus 110 to command actuators in their associated manipulators to actuate their respective joints so that the Surgeon can effectively manipulate devices, such as the tool instruments 231, 241, camera instrument 211, and entry guide 200. Another function is to perform various methods described herein.

    [0035] Although described as a processor, it is to be appreciated that the processor 43 may be implemented by any combination of hardware, software and firmware. Also, its functions as described herein may be performed by one unit or divided up among different components, each of which may be implemented in turn by any combination of hardware, software and firmware. Further, although being shown as part of or being physically adjacent to the Console, the processor 43 may also comprise a number of subunits distributed throughout the system.

    [0036] U.S. Publication No. US 2008/0065108 A1 entitled "Minimally Invasive Surgical System," provides additional details on a single aperture medical robotic system such as described herein.

    [0037] Now referring back to FIG. 1, a block diagram of components of the robotic system 1000 is illustrated to describe various aspects of the present invention. In this example, the robotic system 1000 has two tools and one camera. The pose (i.e., position and orientation) of the working end of the first tool is movable by a first plurality of actuatable joints 1001 whose positions and/or velocities are sensed by a first plurality of joint sensors 1002, the pose of the working end of the second tool is movable by a second plurality of actuatable joints 1004 whose positions and/or velocities are sensed by a second plurality of joint sensors 1005, and the pose of the image capturing end of the camera 37 is movable by a third plurality of actuatable joints 1007 whose positions and/or velocities are sensed by a third plurality of joint sensors 1008. Images captured by the camera 37 are processed by a video processor 1010 and/or the processor 43 with the processed images displayed on the stereo vision display 45.

    [0038] In a normal operating mode (referred to as a "tool following" mode), an operator of the system 1000 controls the tools to perform a procedure on an object at a work site while an on-site camera captures images of the working ends of the tools. In this mode, the input device 41 is associated with the first tool so that the processor 43 causes a first plurality of joint actuators 1003 to actuate the first plurality of joints 1001 so that sensed joint positions and/or velocities provided by the first plurality of joint sensors 1002 match the joint positions and/or velocities commanded at the time by the input device 41. Also, input device 42 is associated with the second tool so that the processor 43 causes a second plurality of joint actuators 1006 to actuate the second plurality of joints 1004 so that sensed joint positions and/or velocities provided by the second plurality of joint sensors 1005 match the joint positions and/or velocities commanded at the time by the input device 42. Meanwhile, the camera 37 may be held in place by the processor 43 commanding a third plurality of joint actuators 1009 to maintain the third plurality of joints 1007 in their current positions. Alternatively, the camera 37 may track movement of the first and second tools by the processor 43 commanding the third plurality of joint actuators 1009 to actuate the third plurality of joints 1007 so that the working ends of the first and second tools remain in a field-of-view ("FOV") of the camera 37.
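    The tool-following behavior described above — actuating joints until the sensed joint positions match those commanded by the input device — can be sketched as a discrete proportional servo cycle. The gain, the names, and the per-joint decoupling are simplifying assumptions; an actual controller for such a system would be considerably more elaborate:

```python
def tool_following_step(commanded, sensed, gain=0.5):
    """One illustrative control cycle of the tool-following mode: each
    joint actuator is driven in proportion to the error between the
    position commanded by the input device and the position reported
    by the corresponding joint sensor."""
    return [s + gain * (c - s) for c, s in zip(commanded, sensed)]

# Iterating the cycle drives the sensed joint positions toward the
# commanded ones.
commanded = [0.4, -0.2, 0.1]
sensed = [0.0, 0.0, 0.0]
for _ in range(20):
    sensed = tool_following_step(commanded, sensed)
```

    The same loop structure applies to each of the first and second pluralities of joints, each paired with its own input device.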

    [0039] One or more switches 1011 are provided to indicate that control of one of the tools is to be switched to an assistant operator or the processor 43. Some or all of the switches 1011 may be provided on the robotic arms holding the tools so that the assistant may activate a designated one of the switches and assume control of the corresponding tool through the processor 43. Alternatively or additionally, some or all of the switches 1011 may be provided on or adjacent to the input devices 41, 42 (such as the switch 40 on an arm rest as shown in FIG. 16) so that the operator may activate a designated one of the switches and transfer control of the associated tool to the assistant through the processor 43 for manual action or directly to the processor 43 for automated action. One or more indicators 1012 (such as LEDs) are also provided to indicate to the assistant which one of the tools is having its control switched to the assistant by the processor 43. One or more of the indicators 1012 may be provided on the robotic arms holding the tools so that the assistant knows which one of the tools is now under his or her control. Additionally, one or more of the indicators 1012 may be provided on a structure so that a tool may be indicated by the processor 43 as a new or replacement tool to be introduced to the work site. A memory 1013 is provided to store programmed instructions and data.

    [0040] FIG. 2 illustrates a flow diagram of a method implemented in the processor 43 of the robotic system 1000 for switching control of an instrument to an input device. In block 1021, the operator is operating in a normal mode in which the operator has control of an instrument through its associated input device, as described previously. In block 1022, the method checks whether control of the instrument is to be switched to the assistant so that the assistant may manually control its movement by moving its manipulator or to the processor 43 so that the processor may control its movement according to programmed instructions. As previously described, such switching may be indicated by activation of one of the switches 1011. If the determination in block 1022 is NO, then the method jumps back to block 1021 and the operator maintains control of the instrument. On the other hand, if the determination in block 1022 is YES, then in block 1023, the method switches control of the instrument to the assistant or processor 43.
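    The decision of blocks 1021-1023 can be sketched as a single dispatch function. The mode names and the function signature are hypothetical labels for this sketch only:

```python
def resolve_control(switch_activated, designated_target):
    """Illustrative decision step of the FIG. 2 flow: while no switch is
    activated the operator keeps control (block 1021); otherwise control
    passes to the designated target (block 1023), i.e. the assistant
    (manual control) or the processor (programmed control)."""
    if not switch_activated:
        return "operator"
    if designated_target not in ("assistant", "processor"):
        raise ValueError("unknown control target")
    return designated_target
```

    Calling this once per control cycle reproduces the loop back to block 1021 whenever the determination in block 1022 is NO.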

    [0041] In blocks 1024 and 1026, the method respectively determines whether the instrument is moving in a retraction direction or an insertion direction. If the instrument is moving in a retraction direction (i.e., in a direction that would result in the instrument moving away from the work site), then the method allows such action and either loops back to repeat block 1024 or performs an optional block 1025 to perform a retraction mode algorithm that commands an articulated instrument to assume a retraction pose and/or avoid collisions with other instruments or objects during the instrument's retraction. Additional details for such a retraction algorithm for an articulated instrument may be found, for example, in U.S. Publication No. US 2011/0040305 A1 entitled "Controller Assisted Reconfiguration of an Articulated Instrument during Movement Into and Out of an Entry Guide". On the other hand, if the instrument is moving in an insertion direction (i.e., in a direction moving towards the work site), then the method allows such action and either proceeds directly to block 1028 or performs an optional block 1027 in which a camera field of view (FOV) targeting algorithm commands the tool joint actuators to actuate the tool joints so that the working end of the instrument moves towards the FOV of the camera 37 while avoiding collisions with other instruments and objects on the way. In this way, if the camera 37 moves during the procedure, the working end of the instrument may automatically change its direction to follow the camera's field of view as the instrument is being inserted towards the work site.

    [0042] The camera 37 may move during the procedure if a coupled control mode is implemented by the processor 43 in which the pose of the camera 37 is automatically changed in response to commanded movement of the instruments so that the working ends of the instruments are maintained in the field of view of the camera 37. Alternatively, the camera 37 may move in response to direct operator commands through an associated input device. In this latter case, the operator may effectively guide the placement of the instrument being inserted by the assistant to a desired location at the work site.

    [0043] As a simplified example of the FOV targeting algorithm, FIGS. 7a-7d illustrate schematic views of instruments 33, 35 extending out of their respective apertures 166, 167 at a work site in various stages of a guided tool exchange procedure using the multiple aperture medical robotic system 2000. In FIG. 7a, the working ends of the instruments are shown as being in a field of view (FOV) 2120 of the endoscope 37. In FIG. 7b, three events have occurred since the time corresponding to FIG. 7a. First, instrument 33 has been retracted for a tool exchange, so the instrument 33 is not seen in this figure. Second, the working end of instrument 35 has moved, as indicated by the instrument's prior position in dotted line form and new position in solid line form with an arrow indicating the direction of the movement. Third, the image capturing end of the endoscope 37 and its FOV 2120 have also moved to maintain the working end of the instrument 35 in the camera's FOV 2120. In FIG. 7c, the instrument 33 (or its replacement) has been reinserted along a line extending through its original retraction path. Since the FOV 2120 of the endoscope 37 has moved, however, the working end of the instrument 33 does not enter the FOV 2120 in this case. Therefore, to prevent this situation from occurring, in FIG. 7d, the FOV targeting algorithm has caused the instrument 33 to be pivoted during its insertion so that the working end of the instrument 33 is in the repositioned FOV 2120. The pivoting in this case is illustrated by showing the instrument's original position in dotted line form and modified position in solid line form with an arrow indicating the direction of the pivoting performed in response to the insertion mode algorithm. As may be appreciated, care must be taken during the pivoting of the instrument 33 to avoid collisions with other instruments and/or striking objects at or on the way to the work site. To prevent such collisions, conventional collision avoidance techniques may be employed by the FOV targeting algorithm using knowledge of the current placements of such objects and the other instruments.

    [0044] Referring back to FIG. 2, in block 1028, the method determines whether or not the instrument has entered into a view of the work site that is being displayed at the time on the stereo vision display 45. If the determination in block 1028 is NO, then the method jumps back to perform blocks 1024 and 1026 again. On the other hand, if the determination in block 1028 is YES, then in block 1029 the method switches control of the instrument from the assistant back to the operator and loops back to block 1021.

    [0045] Although not shown in the flow diagram, the method may also be implemented so that either the assistant or the operator may override the flow of FIG. 2 at any time to switch control of the instrument to the operator. For example, one of the switches 1011 may be activated to indicate that the normal process is being interrupted and control of the instrument is to immediately revert back to the operator. One way this may be done is by deactivating the switch that was previously activated to switch control of the instrument from the operator.

    [0046] FIG. 3 illustrates, as an example, a flow diagram of a method for performing block 1028 of FIG. 2. In block 1031, the method receives data from the tool joint sensors of the instrument currently being controlled by the assistant. Concurrently, in block 1032, the method receives data from the camera joint sensors. In block 1033, the method determines the pose of the working end of the instrument in its tool reference frame by, for example, mapping the tool joint data to a pose in the tool reference frame using a forward kinematics algorithm for the instrument, wherein the tool reference frame is, for example, a Cartesian reference frame having its origin fixed on a point on the instrument. In block 1034, the method determines the pose of an image capturing end of the camera 37 in its camera reference frame by, for example, mapping the camera joint data to a pose in the camera reference frame using a forward kinematics algorithm for the camera instrument, wherein the camera reference frame is, for example, a Cartesian reference frame having its origin fixed on a point on the camera instrument.
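
    The joint-data-to-pose mapping of blocks 1033 and 1034 can be illustrated with a toy two-link planar arm; the real instruments have more joints, but the principle of feeding joint-sensor readings through kinematics equations to obtain a working-end pose (as recited in claims 7 and 11) is the same. This is a hypothetical sketch with assumed link lengths.

```python
import math

def planar_tip_pose(q1, q2, l1=1.0, l2=1.0):
    """Forward kinematics of a 2-link planar arm: map joint-sensor
    readings (q1, q2) to the tip pose (x, y, heading) in the tool
    reference frame, analogous to blocks 1033/1034."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y, q1 + q2
```
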

    [0047] In block 1035, the method determines the pose of the working end of the instrument in the camera reference frame. It may do this simply by using a known transformation between the tool reference frame and a world reference frame and a known transformation between the camera reference frame and the world reference frame, wherein the world reference frame is, for example, a Cartesian reference frame having its origin at a stationary point at the work site. The determined pose of the working end of the instrument in the camera reference frame may then be corrected using a previously determined error transform, wherein the error transform may be determined from a difference between the tool pose determined using the inverse kinematics algorithm and a tool pose determined using video image processing. The error transform may be first determined with a pre-operative calibration step, and periodically updated when the working end of the instrument is in the field of view of the camera 37. For additional details on such reference frames and transformations, see, for example, U.S. Patent No. 6,671,581 B2 entitled "Camera Referenced Control in a Minimally Invasive Surgical Apparatus".
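
    The chain of transformations in block 1035 can be written out concretely: the pose of the tool in the camera reference frame is the inverse of the camera's world pose composed with the tool's world pose. A minimal sketch for the planar case using 3x3 homogeneous transforms (all function names are assumptions; the error-transform correction is omitted).

```python
import math

def se2(theta, x, y):
    """3x3 homogeneous transform for a planar pose (rotation + translation)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def mat_mul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def se2_inv(t):
    """Closed-form inverse of a planar rigid transform: R^T and -R^T t."""
    c, s, x, y = t[0][0], t[1][0], t[0][2], t[1][2]
    return [[c, s, -(c * x + s * y)],
            [-s, c, s * x - c * y],
            [0.0, 0.0, 1.0]]

def tool_in_camera(world_T_tool, world_T_camera):
    """Block 1035: camera_T_tool = inv(world_T_camera) * world_T_tool."""
    return mat_mul(se2_inv(world_T_camera), world_T_tool)
```
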

    [0048] In block 1036, the method determines the display view relative to the camera view, wherein the display view is what is being displayed at the time on the stereo vision display 45 and the camera view is the stereo image being captured at the time by the stereo camera.

    [0049] Referring to FIG. 4, as an illustrative example, the stereo camera includes two cameras, C1 and C2, separated by a baseline distance "b", and having image planes, I1 and I2, defined at the focal length "f" of the cameras. The image planes, I1 and I2, are warped using a conventional stereo rectification algorithm to remove the effects of differing internal and external camera geometries.

    [0050] A point P in the camera reference frame is projected onto the image planes, I1 and I2, at image points, P1 and P2, by an epipolar plane containing the point P, the two optical centers of the cameras, C1 and C2, and the image points, P1 and P2. The position of the point P may then be determined in the camera reference frame using known values for the baseline distance "b" and focal length "f", and a disparity "d" calculated from the distances of the image points, P1 and P2, from their respective image plane center points (i.e., at the intersections of the x-axis with the y1 and y2 axes).
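
    The triangulation just described reduces to similar triangles: with disparity d = x1 - x2, the depth of P is Z = f·b/d, and the lateral position follows from the perspective projection. A minimal sketch (rectified image planes assumed, as in paragraph [0049]; names are illustrative).

```python
def triangulate(p1_x, p2_x, baseline, focal):
    """Recover (X, Z) of point P in the camera reference frame from the
    image-plane x-coordinates of P1 and P2, the baseline b, and the
    focal length f, per the geometry of FIG. 4."""
    d = p1_x - p2_x            # disparity
    z = focal * baseline / d   # depth: Z = f * b / d
    x = p1_x * z / focal       # lateral position by similar triangles
    return x, z
```
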

    [0051] When the display view is the same as the camera view (i.e., the field of view of the camera), then the left image plane I1 will be displayed in the left eye piece 46 of the stereo display 45 and the right image plane I2 will be displayed in the right eye piece 47 of the stereo display 45. As shown in FIG. 5, the display view in this case for the left eye piece 46 is a frustum of the left camera C1 which emanates from the camera C1 and passes through the left image plane I1 as indicated in the figure as display view 1062-L. A viewing area 1082-L is also shown in FIG. 5 to illustrate the frustum, wherein the viewing area 1082-L is a slice of the frustum that is further away from the camera C1 than the left image plane I1, but parallel to the left image plane I1. The display view for the right eye piece 47 would be a similar frustum emanating from the right camera C2 and passing through the right image plane I2.

    [0052] If only a portion of the field of view of the camera 37 is being displayed in the stereo vision display 45, however, such as depicted by area 1071 in FIG. 6, then the display view for the left eye piece 46 is a smaller frustum emanating from the camera C1 as indicated in the figure as display view 1061-L. Such a smaller display view may occur, for example, when the operator commands the stereo vision display 45 to "zoom-in" on images being displayed at the time on the stereo vision display 45. In addition, the processor 43 may further regulate the display view by shifting it to the right or left, shifting it up or down, or rotating it, to provide an intuitive connection in controlling the tools whose images are being seen at the time on the stereo vision display 45 with their associated input devices 41, 42. Although only the left camera view I1 is shown in FIGS. 5 and 6, it is to be appreciated that for a 3-D display, a corresponding right camera view I2 is also necessary as described in reference to FIG. 4, but is not shown in FIGS. 5 and 6 to simplify the description. Additional details for a stereo imaging system such as used herein may be found, for example, in U.S. 6,720,988 entitled "Stereo Imaging System and Method for Use in Telerobotic Systems".
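
    Whether a camera-frame point falls inside the displayed frustum of FIGS. 5 and 6 can be tested by projecting it onto the image plane and comparing against the displayed sub-window. This is a hypothetical sketch for one eye's view; the `zoom` handling is an assumption about how the smaller display view of FIG. 6 might be modelled, and shifting/rotation of the view is omitted.

```python
def in_display_view(point, focal, half_w, half_h, zoom=1.0):
    """Project a camera-frame point (x, y, z) onto the image plane at
    focal length f and check whether it lies within the displayed
    sub-window; zoom > 1 shrinks the window (FIG. 6's smaller frustum)."""
    x, y, z = point
    if z <= 0.0:
        return False            # behind the camera: never in the frustum
    u = focal * x / z           # perspective projection onto image plane
    v = focal * y / z
    return abs(u) <= half_w / zoom and abs(v) <= half_h / zoom
```
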

    [0053] Referring back to FIG. 3 now, in block 1037, the method then determines whether the pose of the working end of the instrument (which was determined in block 1035) is at least partially in the display view (which was determined in block 1036). If the determination in block 1037 is NO, then the method jumps back to block 1024 as shown in FIG. 2. On the other hand, if the determination in block 1037 is YES, then the method proceeds to block 1029 as shown in FIG. 2. As a refinement to simply using the kinematically determined pose of the working end of the instrument in block 1037, a computer model of the working end may also be used as a template to identify the working end in the left and right images being displayed at the time in the left and right eye pieces 46, 47 of the stereo vision display 45. The image matching in this case may be performed only after the kinematically determined pose of the working end of the instrument has reached a threshold distance from the display view in order to save the processor 43 from performing unnecessary processing. During the insertion of the instrument, it is desirable to specify a "target location" for the instrument by specifying minimum and maximum insertion distances. As an example, FIG. 8a illustrates a simplified schematic view of the articulated tool instruments 231, 241 and articulated camera instrument 211 extending out of a distal end of the entry guide 200 in the single aperture medical robotic system 3000. In this schematic, an object 500 and field of view (FOV) 2110 of the camera 211 are also shown. For descriptive purposes, the FOV 2110 is also assumed to be the display view for the purposes of block 1028 in FIG. 2.

    [0054] In FIG. 8b, the articulated tool instrument 241 is shown being inserted along its insertion axis (dotted line). A minimum insertion distance 2111 is specified which represents a minimum insertion point at which the working end of the instrument 241 is expected to enter the FOV 2110 of the camera 211. A maximum insertion distance 2112 is also specified which represents a maximum insertion point (with safety margin) beyond which the working end of the instrument 241 may strike the object 500. In the event that the instrument 241 reaches the maximum insertion distance 2112 without entering the FOV 2110 of the camera 211 for some reason, for safety purposes, further insertion of the instrument 241 by the assistant is prevented by locking the instrument in place.

    [0055] In FIG. 8c, the articulated tool instrument 241 is shown being inserted along its insertion axis (dotted line), but with the camera 211 (and possibly also the entry guide 200) moved back away from its original position by the operator. In this case, the operator can capture control of the instrument 241 sooner during the insertion. The minimum and maximum insertion distances, 2113 and 2114, are also adjusted as the field of view (FOV) 2110 of the camera 211 moves. In particular, the minimum insertion distance 2113 is adjusted so that it still represents the minimum insertion point at which the working end of the instrument 241 is expected to enter the camera's FOV 2110. The maximum insertion distance 2114, however, now represents an insertion distance that is near (or on the boundary) of where the FOV 2110 intersects the insertion plane of the instrument 241. Note that in this case, the maximum insertion distance 2114 is before the object 500 since the object 500 is well beyond the camera's FOV 2110. When the object 500 is within the camera's FOV 2110, then the maximum insertion distance would be based instead on the distance to the object 500 as described in reference to FIG. 8b.

    [0056] When the instrument 241 is being inserted and the maximum insertion distance is reached before it enters the view on the stereo vision display 45, in addition to the instrument 241 being locked in place for safety purposes, a warning message is preferably provided on the stereo vision display 45 to inform the operator that the instrument 241 is outside the display area and locked in place. The assistant may also be warned by the lighting of an LED, for example, on the instrument's manipulator. Audio warnings may also be provided, alternatively or in addition to these visual warnings, so as to be heard by the operator and/or assistant. In addition to such warnings, a tool position and identification indicator may be displayed in a boundary area extending around the display area in the stereo vision display 45, so that the operator knows approximately how far away and in which direction the instrument 241 currently is. Details for such a tool position and identification indicator may be found, for example, in U.S. Publication No. 2008/0004603 A1 entitled "Tool Position and Identification Indicator Displayed in a Boundary Area of a Computer Display Screen". With such knowledge of the approximate position of the instrument 241, the operator may then move the camera instrument 211 towards the working end of the instrument 241 until the working end is within view on the stereo vision display 45, at which time, control of the instrument 241 is switched to the operator according to the method described in reference to FIG. 2.
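
    The maximum-insertion-distance safety rule of paragraphs [0054]-[0056] can be sketched as a guard on the commanded insertion: past the maximum distance without the working end entering the display view, the instrument stays locked in place and a warning is raised. The function name and return convention are assumptions.

```python
def insertion_guard(requested, current, d_max, in_view):
    """Hypothetical guard for FIGS. 8b-8c: if the requested insertion
    would pass the maximum insertion distance d_max and the working end
    has not entered the display view, hold position (lock) and signal a
    warning; otherwise allow the requested insertion."""
    if not in_view and requested >= d_max:
        return min(current, d_max), True    # locked in place, warn
    return requested, False                  # motion allowed, no warning
```

The warning flag would drive the on-screen message, manipulator LED, and/or audio alert described in paragraph [0056].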

    [0057] Although the various aspects of the present invention have been described with respect to a preferred embodiment, it will be understood that the invention is entitled to full protection within the full scope of the appended claims.


    Claims

    1. A robotic system (1000) comprising:

    a camera (37) having a field of view;

    an instrument (100, 231, 241) having a working end (102, 331, 341);

    an input device (41, 42);

    a processor (43); and

    a memory (1013); and characterised by:
    the processor (43) being configured to switch control of the instrument (100, 231, 241) from one of an assistant control mode and a processor control mode to an operator control mode upon determining that the working end (102, 331, 341) of the instrument (100, 231, 241) has entered an area of the field of view, the assistant control mode being a mode in which an assistant manually controls movement of the instrument (100, 231, 241), the processor control mode being a mode in which the processor (43) responds to programmed instructions stored in the memory (1013) to control movement of the instrument (100, 231, 241), and the operator control mode being a mode in which the processor (43) responds to operator manipulation of the input device (41, 42) to control movement of the instrument (100, 231, 241).


     
    2. The robotic system (1000) of claim 1:
    wherein the processor (43) is configured to switch control of the instrument (100, 231, 241) from the operator control mode to the assistant control mode upon receiving an indication that such switch to the assistant control mode is to be made.
     
    3. The robotic system (1000) of claim 2, further comprising:
    means (1011) for providing the indication that a switch is to be made from the operator control mode to the assistant control mode.
     
    4. The robotic system (1000) of claim 1, further comprising:
    means (1012, 45) for providing a sensory indication that the processor (43) has switched control of the instrument (100, 231, 241) from the assistant control mode to the operator control mode.
     
    5. The robotic system (1000) of claim 1, further comprising:

    a stereo vision display (45) viewable by an operator interacting with the input device (41, 42);

    the camera (37) being a stereoscopic camera (37), the field of view being a volume, a view being displayed on the display (45) being a subspace of the volume that corresponds to the area of the field of view.


     
    6. The robotic system (1000) of claim 1:
    wherein the processor (43) is configured to determine that the working end (102, 331, 341) of the instrument (100, 231, 241) has entered the area of the field of view by: determining a pose of the instrument (100, 231, 241) and a pose of the camera (37) in a world reference frame, translating the pose of the instrument (100, 231, 241) to a reference frame of the camera (37) to generate a pose of the instrument (100, 231, 241) in a reference frame of the camera (37), and using the pose of the instrument (100, 231, 241) in the reference frame of the camera (37) to determine if the working end (102, 331, 341) of the instrument (100, 231, 241) is within the area of the field of view.
     
    7. The robotic system (1000) of claim 6, further comprising:

    a plurality of joints (1001, 1004) which cause movement of the working end (102, 331, 341) of the instrument (100, 231, 241); and

    a plurality of sensors (1002, 1005) for sensing states of the plurality of joints (1001, 1004);

    wherein the processor (43) is configured to determine the pose of the instrument (100, 231, 241) in the world reference frame by receiving information from the plurality of sensors (1002, 1005) and applying the information to one or more forward kinematics equations to generate a kinematically derived estimate for the pose of the instrument (100, 231, 241) in the world reference frame.


     
    8. The robotic system (1000) of claim 7:
    wherein the instrument (100, 231, 241) comprises a wrist joint (106, 327), the plurality of joints (1001, 1004) includes the wrist joint (106, 327), and the plurality of sensors (1002, 1005) includes a sensor that senses a state of the wrist joint (106, 327).
     
    9. The robotic system of claim 7:
    wherein the instrument (241) comprises a plurality of instrument joints (323, 325, 327); the plurality of joints (1001, 1004) includes the plurality of instrument joints (323, 325, 327), and the plurality of sensors (1002, 1005) include sensors that sense states of the plurality of instrument joints (323, 325, 327).
     
    10. The robotic system (1000) of claim 9, further comprising:

    an entry guide (200); and

    an articulated camera instrument (211) that includes the camera (37);

    wherein the working end (331, 341) of the instrument (231, 241) and a distal end of the articulated camera instrument (211) are extendable out of a distal end of the entry guide (200).


     
    11. The robotic system (1000) of claim 7, further comprising:

    a second plurality of joints (1007) which cause movement of the camera (37); and

    a second plurality of sensors (1008) for sensing states of the second plurality of joints (1007);

    wherein the processor (43) is configured to determine the pose of the camera (37) in the world reference frame by receiving information from the second plurality of sensors (1008) and applying the information to one or more forward kinematics equations.


     
    12. The robotic system (1000) of claim 1:
    wherein the processor (43) is configured to determine that the instrument (100, 231, 241) has entered the area of the field of view by identifying the working end (102, 331, 341) of the instrument (100, 231, 241) in an image captured by the camera (37).
     
    13. The robotic system (1000) of claim 1:
    wherein the processor (43) is configured to control movement of the instrument (100, 231, 241) to prevent insertion of the working end (102, 331, 341) of the instrument (100, 231, 241) beyond a maximum insertion distance, and to issue a warning to an operator of the input device (41, 42) if insertion of the working end (102, 331, 341) of the instrument (100, 231, 241) has reached the maximum insertion distance without entering the area of the field of view.
     


    Ansprüche

    1. Ein Robotersystem (1000), das Folgendes beinhaltet:

    eine Kamera (37) mit einem Sichtfeld;

    ein Instrument (100, 231, 241) mit einem Arbeitsende (102, 331, 341);

    ein Eingabegerät (41, 42);

    einen Prozessor (43); und

    einen Speicher (1013); und das dadurch gekennzeichnet ist, dass:
    der Prozessor (43) dazu ausgelegt ist, die Steuerung des Instruments (100, 231, 241) von einem von einem Assistentensteuerungsmodus und einem Prozessorsteuerungsmodus auf einen Bedienersteuerungsmodus umzuschalten, nachdem er bestimmt hat, dass das Arbeitsende (102, 331, 341) des Instruments (100, 231, 241) in einen Bereich des Sichtfelds eingetreten ist, wobei der Assistentensteuerungsmodus ein Modus ist, in dem ein Assistent die Bewegung des Instruments (100, 231, 241) manuell steuert, der Prozessorsteuerungsmodus ein Modus ist, in dem der Prozessor (43) auf die im Speicher (1013) gespeicherten programmierten Anweisungen anspricht, um die Bewegung des Instruments (100, 231, 241) zu steuern, und der Bedienersteuerungsmodus ein Modus ist, in dem der Prozessor (43) auf die Bedienermanipulation des Eingabegeräts (41, 42) anspricht, um die Bewegung des Instruments (100, 231, 241) zu steuern.


     
    2. Robotersystem (1000) nach Anspruch 1:
    wobei der Prozessor (43) dazu ausgelegt ist, die Steuerung des Instruments (100, 231, 241) von dem Bedienersteuerungsmodus auf den Assistentensteuerungsmodus umzuschalten, nachdem er einen Hinweis erhalten hat, dass eine solche Umschaltung auf den Assistentensteuerungsmodus vorgenommen werden soll.
     
    3. Robotersystem (1000) nach Anspruch 2, das ferner Folgendes beinhaltet:
    Mittel (1011) zum Bereitstellen des Hinweises, dass eine Umschaltung von dem Bedienersteuerungsmodus auf den Assistentensteuerungsmodus vorgenommen werden soll.
     
    4. Robotersystem (1000) nach Anspruch 1, das ferner Folgendes beinhaltet:
    Mittel (1012, 45) zum Bereitstellen eines sensorischen Hinweises, dass der Prozessor (43) die Steuerung des Instruments (100, 231, 241) von dem Assistentensteuerungsmodus auf den Bedienersteuerungsmodus umgeschaltet hat.
     
    5. Robotersystem (1000) nach Anspruch 1, das ferner Folgendes beinhaltet:

    eine Stereosichtanzeige (45), die von einem mit dem Eingabegerät (41, 42) interagierenden Bediener eingesehen werden kann;

    wobei die Kamera (37) eine stereoskopische Kamera (37) ist, das Sichtfeld ein Volumen ist, eine auf der Anzeige (45) angezeigte Sicht ein dem Bereich des Sichtfelds entsprechender Teilraum des Volumens ist.


     
    6. Robotersystem (1000) nach Anspruch 1:
    wobei der Prozessor (43) dazu ausgelegt ist, zu bestimmen, dass das Arbeitsende (102, 331, 341) des Instruments (100, 231, 241) in den Bereich des Sichtfelds eingetreten ist, indem er: eine Stellung des Instruments (100, 231, 241) und eine Stellung der Kamera (37) in einem Weltbezugsrahmen bestimmt, die Stellung des Instruments (100, 231, 241) auf einen Bezugsrahmen der Kamera (37) überträgt, um eine Stellung des Instruments (100, 231, 241) in einem Bezugsrahmen der Kamera (37) zu erzeugen, und die Stellung des Instruments (100, 231, 241) in dem Bezugsrahmen der Kamera (37) verwendet, um zu bestimmen, ob das Arbeitsende (102, 331, 341) des Instruments (100, 231, 241) innerhalb des Bereichs des Sichtfelds liegt.
     
    7. Robotersystem (1000) nach Anspruch 6, das ferner Folgendes beinhaltet:

    eine Vielzahl von Gelenken (1001, 1004), die eine Bewegung des Arbeitsendes (102, 331, 341) des Instruments (100, 231, 241) verursachen; und

    eine Vielzahl von Sensoren (1002, 1005) zum Abtasten von Zuständen der Vielzahl von Gelenken (1001, 1004);

    wobei der Prozessor (43) dazu ausgelegt ist, die Stellung des Instruments (100, 231, 241) in dem Weltbezugsrahmen zu bestimmen, indem er Informationen von der Vielzahl von Sensoren (1002, 1005) empfängt und die Informationen auf eine oder mehrere Vorwärtskinematik-Gleichungen anwendet, um eine kinematisch abgeleitete Schätzung für die Stellung des Instruments (100, 231, 241) in dem Weltbezugsrahmen zu erzeugen.


     
    8. Robotersystem (1000) nach Anspruch 7:
    wobei das Instrument (100, 231, 241) ein Handgelenksgelenk (106, 327) beinhaltet, die Vielzahl von Gelenken (1001, 1004) das Handgelenksgelenk (106,327) umfasst und die Vielzahl von Sensoren (1002, 1005) einen Sensor umfasst, der einen Zustand des Handgelenksgelenks (106, 327) abtastet.
     
    9. Robotersystem nach Anspruch 7:
    wobei das Instrument (241) eine Vielzahl von Instrumentengelenken (323, 325, 327) beinhaltet; die Vielzahl von Gelenken (1001, 1004) die Vielzahl von Instrumentengelenken (323, 325, 327) umfasst und die Vielzahl von Sensoren (1002, 1005) Sensoren umfasst, die Zustände der Vielzahl von Instrumentengelenken (323, 325, 327) abtasten.
     
    10. Robotersystem (1000) nach Anspruch 9, das ferner Folgendes beinhaltet:

    eine Eintrittsführung (200); und

    ein gelenkiges Kamerainstrument (211), das die Kamera (37) umfasst;

    wobei das Arbeitsende (331, 341) des Instruments (231, 241) und ein distales Ende des gelenkigen Kamerainstruments (211) aus einem distalen Ende der Eintrittsführung (200) ausziehbar sind.


     
    11. Robotersystem (1000) nach Anspruch 7, das ferner Folgendes beinhaltet:

    eine zweite Vielzahl von Gelenken (1007), die eine Bewegung der Kamera (37) verursachen; und

    eine zweite Vielzahl von Sensoren (1008) zum Abtasten von Zuständen der zweiten Vielzahl von Gelenken (1007);

    wobei der Prozessor (43) dazu ausgelegt ist, die Stellung der Kamera (37) in dem Weltbezugsrahmen zu bestimmen, indem er Informationen von der zweiten Vielzahl von Sensoren (1008) empfängt und die Informationen auf eine oder mehrere Vorwärtskinematik-Gleichungen anwendet.


     
    12. Robotersystem (1000) nach Anspruch 1:
    wobei der Prozessor (43) dazu ausgelegt ist, zu bestimmen, dass das Instrument (100, 231, 241) in den Bereich des Sichtfelds eingetreten ist, indem er das Arbeitsende (102, 331, 341) des Instruments (100, 231, 241) in einem von der Kamera (37) erfassten Bild identifiziert.
     
    13. Robotersystem (1000) nach Anspruch 1:
    wobei der Prozessor (43) dazu ausgelegt ist, die Bewegung des Instruments (100, 231, 241) zu steuern, um ein Einführen des Arbeitsendes (102, 331, 341) des Instruments (100, 231, 241) über eine maximale Einführungsdistanz hinaus zu verhindern und eine Warnung an einen Bediener des Eingabegeräts (41, 42) auszugeben, wenn die Einführung des Arbeitsendes (102, 331, 341) des Instruments (100, 231, 241) die maximale Einführungsdistanz erreicht hat, ohne in den Bereich des Sichtfelds einzutreten.
     


    Revendications

    1. Système robotique (1000) comprenant :

    une caméra (37) ayant un champ de vision ;

    un instrument (100, 231, 241) ayant une extrémité de travail (102, 331, 341) ;

    un dispositif d'entrée (41, 42) ;

    un processeur (43) ; et

    une mémoire (1013) ; et caractérisé en ce que :
    le processeur (43) est configuré pour commuter la commande de l'instrument (100, 231, 241) d'un mode parmi un mode de commande par assistant et un mode de commande par processeur à un mode de commande par opérateur lorsqu'il est déterminé que l'extrémité de travail (102, 331, 341) de l'instrument (100, 231, 241) est entrée dans une zone du champ de vision, le mode de commande par assistant étant un mode dans lequel un assistant commande manuellement le mouvement de l'instrument (100, 231, 241), le mode de commande par processeur étant un mode dans lequel le processeur (43) répond à des instructions programmées stockées dans la mémoire (1013) pour commander le mouvement de l'instrument (100, 231, 241), et le mode de commande par opérateur étant un mode dans lequel le processeur (43) répond à une manipulation par opérateur du dispositif d'entrée (41, 42) pour commander le mouvement de l'instrument (100, 231, 241).


     
    2. Système robotique (1000) selon la revendication 1 :
    dans lequel le processeur (43) est configuré pour commuter la commande de l'instrument (100, 231, 241) du mode de commande par opérateur au mode de commande par assistant lors de la réception d'une indication selon laquelle cette commutation au mode de commande par assistant doit être effectuée.
     
    3. Système robotique (1000) selon la revendication 2, comprenant en outre :
    des moyens (1011) pour fournir l'indication selon laquelle une commutation doit être effectuée du mode de commande par opérateur au mode de commande par assistant.
     
    4. Système robotique (1000) selon la revendication 1, comprenant en outre :
    des moyens (1012, 45) pour fournir une indication sensorielle selon laquelle le processeur (43) a commuté la commande de l'instrument (100, 231, 241) du mode de commande par assistant au mode de commande par opérateur.
     
    5. Système robotique (1000) selon la revendication 1, comprenant en outre :

    un écran de vision stéréo (45) visible par un opérateur qui interagit avec le dispositif d'entrée (41, 42) ;

    la caméra (37) étant une caméra stéréoscopique (37), le champ de vision étant un volume, une vue qui est affichée sur l'écran (45) étant un sous-espace du volume qui correspond à la zone du champ de vision.


     
    6. The robotic system (1000) according to claim 1:
    wherein the processor (43) is configured to determine that the working end (102, 331, 341) of the instrument (100, 231, 241) has entered the area of the field of view by: determining a pose of the instrument (100, 231, 241) and a pose of the camera (37) in a world reference frame, translating the pose of the instrument (100, 231, 241) into a reference frame of the camera (37) so as to generate a pose of the instrument (100, 231, 241) in the camera (37) reference frame, and using the pose of the instrument (100, 231, 241) in the camera (37) reference frame to determine whether the working end (102, 331, 341) of the instrument (100, 231, 241) is within the area of the field of view.
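The frame translation and membership test described in claim 6 can be sketched as follows: the camera pose (rotation and translation in the world frame) is inverted, the instrument tip is mapped into the camera frame, and the resulting point is tested against a field-of-view volume, modeled here as a symmetric pyramid between near and far limits. This is a minimal sketch under those modeling assumptions; the patent does not prescribe this representation.

```python
import math

def invert_pose(R, t):
    """Invert a rigid transform given as a 3x3 rotation (row lists) and translation."""
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]  # transpose = inverse rotation
    t_inv = [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]
    return Rt, t_inv

def point_in_view(p_world, cam_R, cam_t, half_angle, near, far):
    """Map a world-frame point into the camera frame and test it against
    a symmetric pyramidal field-of-view volume along the camera z-axis."""
    R_inv, t_inv = invert_pose(cam_R, cam_t)
    p_cam = [sum(R_inv[i][j] * p_world[j] for j in range(3)) + t_inv[i]
             for i in range(3)]
    x, y, z = p_cam
    if not (near <= z <= far):
        return False
    limit = z * math.tan(half_angle)  # lateral extent of the pyramid at depth z
    return abs(x) <= limit and abs(y) <= limit
```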
     
    7. The robotic system (1000) according to claim 6, further comprising:

    a plurality of joints (1001, 1004) which cause movement of the working end (102, 331, 341) of the instrument (100, 231, 241); and

    a plurality of sensors (1002, 1005) for sensing states of the plurality of joints (1001, 1004);

    wherein the processor (43) is configured to determine the pose of the instrument (100, 231, 241) in the world reference frame by receiving information from the plurality of sensors (1002, 1005) and applying the information to one or more forward kinematics equations to generate a kinematically derived estimate for the pose of the instrument (100, 231, 241) in the world reference frame.
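The forward-kinematics step can be illustrated with a planar serial chain: sensed joint angles and link lengths are chained link by link to yield a pose estimate for the working end. This is a simplified two-dimensional sketch for illustration; the actual instrument kinematics (including the wrist joint of claim 8) are spatial and not specified here.

```python
import math

def forward_kinematics(joint_angles, link_lengths):
    """Estimate the working-end pose (x, y, heading) of a planar serial
    chain by accumulating each link's rotation and translation."""
    x = y = theta = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle                 # each joint rotates the remaining chain
        x += length * math.cos(theta)  # advance along the rotated link
        y += length * math.sin(theta)
    return x, y, theta
```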


     
    8. The robotic system (1000) according to claim 7:
    wherein the instrument (100, 231, 241) comprises a wrist joint (106, 327), the plurality of joints (1001, 1004) includes the wrist joint (106, 327), and the plurality of sensors (1002, 1005) includes a sensor which senses a state of the wrist joint (106, 327).
     
    9. The robotic system according to claim 7:
    wherein the instrument (241) comprises a plurality of instrument joints (323, 325, 327); the plurality of joints (1001, 1004) includes the plurality of instrument joints (323, 325, 327), and the plurality of sensors (1002, 1005) includes sensors which sense states of the plurality of instrument joints (323, 325, 327).
     
    10. The robotic system (1000) according to claim 9, further comprising:

    an entry guide (200); and

    an articulated camera instrument (211) which includes the camera (37);

    wherein the working end (331, 341) of the instrument (231, 241) and a distal end of the articulated camera instrument (211) are extendable out of a distal end of the entry guide (200).


     
    11. The robotic system (1000) according to claim 7, further comprising:

    a second plurality of joints (1007) which cause movement of the camera (37); and

    a second plurality of sensors (1008) for sensing states of the second plurality of joints (1007);

    wherein the processor (43) is configured to determine the pose of the camera (37) in the world reference frame by receiving information from the second plurality of sensors (1008) and applying the information to one or more forward kinematics equations.


     
    12. The robotic system (1000) according to claim 1:
    wherein the processor (43) is configured to determine that the instrument (100, 231, 241) has entered the area of the field of view by identifying the working end (102, 331, 341) of the instrument (100, 231, 241) in an image captured by the camera (37).
     
    13. The robotic system (1000) according to claim 1:
    wherein the processor (43) is configured to command movement of the instrument (100, 231, 241) so as to prevent insertion of the working end (102, 331, 341) of the instrument (100, 231, 241) beyond a maximum insertion distance, and to issue a warning to an operator of the input device (41, 42) if insertion of the working end (102, 331, 341) of the instrument (100, 231, 241) has reached the maximum insertion distance without entering the area of the field of view.
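The insertion limit and warning condition of claim 13 can be sketched as a simple guard: the commanded depth is clamped to the maximum, and a warning is raised only when the limit is reached while the working end is still outside the view area. The function name and scalar depth model are illustrative assumptions, not the patent's implementation.

```python
def clamp_insertion(commanded_depth, max_depth, tip_in_view):
    """Clamp a commanded insertion depth to the maximum insertion distance.

    Returns the allowed depth and a flag telling whether to warn the
    operator: the warning fires when the maximum is reached without the
    working end having entered the viewable area of the field of view.
    """
    depth = min(commanded_depth, max_depth)
    warn = depth >= max_depth and not tip_in_view
    return depth, warn
```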
     




    Drawing