FIELD OF THE INVENTION
[0001] The invention relates to a system for assisting a user in navigating a medical device
in a region of a patient body. Moreover, the invention relates to a computer program
for carrying out a method for assisting a user in navigating a medical device in a
region of a patient body. The region of the patient body is a cardiac chamber and
the medical device is an intracardiac catheter or another intracardiac device.
BACKGROUND OF THE INVENTION
[0002] Interventional cardiology procedures, including electrophysiology (EP) and structural
heart disease (SHD) procedures, rely on the use of fluoroscopy, which allows real-time
visualization of the anatomy and of the radiopaque devices used in these procedures. However,
the major disadvantage of fluoroscopy is the exposure of the patient and staff
to radiation doses. Therefore, there is a trend and desire to minimize the use of
fluoroscopy during these procedures. Another disadvantage of fluoroscopy is its inability
to visualize soft-tissue structures.
[0003] Ultrasound (US) imaging is also often used in these procedures, including intracardiac
echocardiography (ICE), transesophageal echocardiography (TEE) and transthoracic echocardiography
(TTE). US imaging has the advantage that it allows for the visualization of soft-tissue
structures and blood flow without harmful scatter radiation. Devices such as catheters
and needles can be visualized using ultrasound. However, it is often difficult to
identify the tip of such a device, in particular when using two-dimensional ultrasound,
because the device can be out of the imaged plane and because shadowing and reverberations
complicate the identification of the tip.
[0004] Navigation platforms for navigating medical devices in cardiology procedures therefore
may use additional hardware for tracking the medical device in accordance with a certain
tracking modality such as electromagnetic (EM) tracking, impedance tracking, optical
shape sensing or satellite-based tracking. However, these tracking modalities give
rise to inaccuracies with respect to the localization of the medical device relative to
the anatomy as shown, e.g., in the US images.
[0005] Likewise, if the tracked devices are used to reconstruct the anatomy of the heart
or another body region as in electro-anatomical mapping, for example, the generated
representation of the anatomy may be inaccurate due to inaccuracies in the tracking
of the devices. In EM tracking, such inaccuracies may particularly be due to metal
in the environment which can cause disturbances. For impedance tracking, patches on
the patient surface are used as reference but inhomogeneities in impedances for various
tissues (e.g. cardiac and lung) and changes in volume load during the procedure can
create inaccuracies. For optical shape sensing, a fixture at the patient table is
used as a reference and the position error of this fixture propagates over the length
of the optical fiber. For satellite-based tracking, such as tracking using the Global
Positioning System (GPS), the localization is likewise independent of the anatomy.
[0006] WO 2015/092667 A1 discloses a system for tracking an instrument including an intraoperative transducer
array configured to generate signals from array positions to generate real-time images
of an area of interest. The instrument can be a penetrating instrument having a sensor
mounted at a position of interest and being responsive to the signals from the array
positions. A signal processing module can be provided and configured to determine
a position and orientation of the instrument in accordance with the signals and to
classify media of the position of interest based upon a response of the sensor to
the signals from the array positions. An overlay module can be provided and configured
to generate an overlay image registered to the real-time images to identify a position
of the position of interest and provide feedback on the media in which the position
of interest is positioned. A display can be provided and configured to provide visual
feedback of the overlay image on the real-time images.
[0007] WO 2016/009350 A1 discloses a system for tracking an instrument, comprising: two or more sensors disposed
along a length of an instrument and being spaced apart from adjacent sensors; an interpretation
module configured to select and update an image slice from a three-dimensional image
volume in accordance with positions of the two or more sensors, the three-dimensional
image volume including the positions of two or more sensors with respect to a target
in the volume; and an image processing module configured to generate an overlay indicating
reference positions in the image slice, the reference positions including the positions
of the two or more sensors and relative offsets from the image slice in a display
to provide feedback for positioning and orienting the instrument.
[0008] EP1312309 A1 discloses a navigation system for assisting a surgeon during an operation in which
a first reference image is generated using computer tomography or magnetic resonance
prior to the operation. During an operation an area of interest is imaged ultrasonically.
A coordinate transformation is created linking the images. The position of an object
in further images representing further situations can then be seen.
SUMMARY OF THE INVENTION
[0009] It is an object of the present invention to provide a navigation platform that mitigates
the aforementioned problems and allows for a more accurate localization of a medical
device with respect to the anatomy as shown in US images thereof.
[0010] The invention is defined by the appended independent claims. Further embodiments
are defined by the dependent claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] In the following drawings:
Fig. 1 schematically and exemplarily shows components of a system for navigating a
medical device in a region of a patient body,
Fig. 2 schematically and exemplarily shows a three-dimensional model of a left atrium
of a heart,
Fig. 3a schematically and exemplarily shows a two-dimensional slice corresponding
to a field of view of an US probe of the system, which is mapped onto the model,
Fig. 3b schematically and exemplarily shows a three-dimensional cone corresponding
to a field of view of an US probe of the system, which is mapped onto the model,
Fig. 4 schematically and exemplarily shows a visualization in which a live US-image
and a position of the medical device is overlaid over the model,
Fig. 5 schematically and exemplarily shows an overlay of a current position and preceding
positions of an US sensor attached to the medical device over the model,
Fig. 6 schematically and exemplarily shows steps of a procedure for generating visualizations
in which a position of a medical device is shown using a model.
DETAILED DESCRIPTION OF EMBODIMENTS
[0012] Fig. 1 schematically and exemplarily shows components of a system for navigating
a medical device 1 in a region of a patient body, which corresponds to a cardiac chamber.
The system allows for visualizing the relevant region of the patient body and a position
and/or orientation of one or more medical device(s) 1 used in the region of the patient
body to a physician performing an interventional procedure using the medical device.
On the basis of the generated visualizations, the physician can steer the medical
device 1 during the interventional procedure.
[0013] The medical device 1 may be a catheter, particularly an ablation catheter, a needle
or a guidewire, for example. The system is used for carrying out structural heart
disease procedures including valve replacement/repair (e.g. Transcatheter Aortic Valve
Replacement (TAVR), mitraclip, pulmonic valve, tricuspid valve etc.) and occlusions
(e.g. ASD/PFO closure, VSD closure, left atrial appendage closure, etc.). Moreover,
the system may be used in electrophysiology (EP) studies with ablation, including
catheter ablation procedure for treatment of arrhythmias including atrial fibrillation
(AF).
[0014] The system comprises a miniaturized US probe 2 which includes an US transducer for
emitting US signals and for sensing echoes of the US signals in order to generate
US images with respect to a certain field of view. During the interventional procedure,
the US probe 2 is inserted into the patient body to acquire live US images of the
relevant body region essentially in real-time. In order to insert the US probe 2 into
the patient body, it may be attached to a catheter or a similar elongated device.
[0015] The US probe 2 is configured to acquire three- or two-dimensional US images. In order
to generate the US images, the US signals sensed by means of the US probe 2 are
processed in a US unit 3 which is located external to the patient body and connected
to the US probe 2 and which is configured to generate the US images on the basis of
US signals in a manner known to the person skilled in the art as such.
[0016] The US probe 2 is preferably inserted into the heart to image the relevant cardiac
chamber in accordance with an ICE technique. However, the US probe 2 may likewise
be configured and utilized in accordance with another echocardiography technique known
to a person skilled in the art, such as echocardiographic imaging from the esophagus
as in TEE or echocardiographic imaging from a position external to the patient body
as in TTE.
[0017] Moreover, the system comprises a tracking arrangement for determining the position
and/or orientation of the medical device 1 relative to the US probe 2. This tracking
arrangement will be described in more detail further below. On the basis of the relative
position and/or orientation of the medical device 1 with respect to the US probe 2,
the system generates the visualization of the position and/or orientation of the medical
device 1 in the relevant region of the patient body.
[0018] In the system, the visualization of the relevant region of the patient body and of
the position and/or orientation of the medical device 1 positioned therein is based
on a three-dimensional model of the relevant region of the patient body. More specifically,
the system may generate visualizations in which the live US images and indications
of the position and/or orientation of the medical device are overlaid over the model.
In addition or as an alternative, the system may generate visualizations which include
a part of the model included in the field of view of a virtual eye at the tip of the
medical device 1. This part of the model may further be overlaid by the live US images
in the visualizations.
[0019] For displaying the visualizations of the volume of interest and of the position and/or
orientation of the medical device 1, the system further comprises a display unit 4.
The display unit 4 may comprise a monitor screen. Likewise, the display unit 4 may
be configured in another way and may comprise virtual reality glasses, for example.
[0020] The three-dimensional model of the relevant region of the patient is preferably created
prior to the actual interventional procedure, during which the live US images are acquired,
and is stored in a 3D model providing unit 5 for use during the actual interventional
procedure. By way of example, a corresponding model 21 of the left atrium of the heart
is schematically illustrated in Fig. 2.
[0021] In one implementation, the model is created on the basis of a series of US images
acquired using the US probe 2 during an initialization phase preceding the actual
interventional procedure. During this initialization phase, the US probe 2 may be
moved to image the relevant region of the patient body essentially completely in a series
of US images. Then, the 3D model providing unit 5 may create the model by combining
the US images, particularly by stitching the US images. For this purpose, any stitching
technique known to the person skilled in the art may be applied.
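By way of a purely illustrative sketch (not part of the original disclosure), one possible compounding approach places the pixels of tracked two-dimensional US frames into a common voxel grid and averages overlapping contributions. The per-frame probe poses, the unit pixel spacing and the grid size used below are assumptions of the sketch; any other stitching technique may equally be used.
```python
import numpy as np

def compound_frames(frames, poses, voxel_size=1.0, grid_shape=(128, 128, 128)):
    """Compound tracked 2D US frames into a common voxel grid.

    frames: list of 2D arrays of pixel intensities (pixel spacing of 1 unit assumed).
    poses:  list of (R, t), with R a 3x3 rotation and t a 3-vector mapping image
            coordinates (u, v, 0) into the common reference frame (assumed known,
            e.g. from the probe tracking described below).
    """
    acc = np.zeros(grid_shape)
    cnt = np.zeros(grid_shape)
    for img, (R, t) in zip(frames, poses):
        h, w = img.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        pts = np.stack([u.ravel(), v.ravel(), np.zeros(u.size)], axis=1)
        world = pts @ R.T + t                              # image plane -> reference frame
        idx = np.round(world / voxel_size).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
        idx, vals = idx[ok], img.ravel()[ok]
        np.add.at(acc, (idx[:, 0], idx[:, 1], idx[:, 2]), vals)
        np.add.at(cnt, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
```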
[0022] If the relevant region of the patient body comprises the left atrium of the heart,
as is the case in ablation of atrial fibrillation (AF), for example, it may be
imaged from the right atrium through the interatrial septum. For this purpose, the
US probe 2 is placed at an appropriate position in the right atrium and is operated
to acquire a series of US images of the left atrium under different viewing angles
so that the left atrium is imaged essentially completely. On the basis of these images,
a model of the left atrium may then be created in the 3D model providing unit 5 by
stitching the acquired US images. As an alternative, the US probe 2 may be positioned
within the left atrium for acquiring the series of images of the left atrium under
different viewing angles. For this purpose, a transseptal puncture can be made in
order to cross the interatrial septum with the US probe 2. In this procedure, a sufficiently
small US probe 2 may be used which allows for a safe transseptal crossing. In order
to acquire images of the complete left atrium, the US probe 2 may be moved in a suitable
combination of translations, deflections and rotations.
[0023] During the acquisition of the series of images used for creating the model, the position
and orientation of the US probe 2 may optionally be tracked with respect to a certain
reference frame in order to determine the position and orientation of the model in
this reference frame. As will be explained further below, the position and orientation
may be used in the process of mapping the live US images onto the model. For tracking
the position and orientation of the US probe 2, any suitable tracking technique known
to a person skilled in the art may be used. Examples of such tracking techniques include
a tracking on the basis of images of the relevant region of the patient body acquired
using a suitable imaging modality, such as fluoroscopy, or EM tracking, impedance
tracking, optical shape sensing and satellite-based tracking.
[0024] In accordance with a further approach, the position and orientation of the US probe
2 may be tracked relative to the position and orientation of a further medical device
in a manner further described below, when the further medical device, which is also
referred to as a reference device herein below, is positioned at a fixed reference location
during the initialization phase and during the actual interventional procedure. In
this case, the reference device defines the reference frame for the tracking of the
US probe 2.
[0025] In further implementations, the model of the relevant body region of a particular
patient may be selected from a plurality of pre-generated models for the same body
region, which may be generated on the basis of data collected for other patients and
stored in a corresponding library. These models may likewise be created on the basis
of US image data. Alternatively, these models may be created on the basis of imaging
data of another imaging modality, such as computed tomography (CT imaging) or magnetic
resonance (MR) imaging. From the pre-generated models, one model may be selected which
best matches the anatomy of the patient.
[0026] The selection of the best matching model may again be carried out on the basis of
US images acquired during an initialization phase. In particular, the model may be
selected which has the largest similarity to the US images in accordance with a
suitable similarity measure. The similarity between an US image and the model may
be determined on the basis of a segmented version of the US image, which may be computed
using a suitable segmentation procedure known to the person skilled in the art. The similarity measure
may be computed on the basis of the number of overlapping points between the segmented
US image and the model for the best overlap between the segmented US image and the
model. On the basis of the determined position and orientation of the US probe 2 at
the time of the acquisition of the US images, the position and orientation of the
selected model in a reference frame may again be determined as described above.
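As a purely illustrative sketch (not part of the original disclosure) of such a selection, the following fragment scores each candidate model of a library by the number of segmented US surface points lying close to the model surface and returns the best-scoring model; it assumes that the point clouds have already been brought into a common frame by a coarse alignment, and the tolerance value is an assumed parameter.
```python
import numpy as np

def overlap_similarity(us_points, model_points, tol=2.0):
    """Count the segmented US points lying within 'tol' of any model point."""
    d = np.linalg.norm(us_points[:, None, :] - model_points[None, :, :], axis=2)
    return int(np.sum(d.min(axis=1) < tol))

def select_best_model(us_points, model_library):
    """model_library: dict mapping a model name to an (N, 3) array of surface points."""
    scores = {name: overlap_similarity(us_points, pts)
              for name, pts in model_library.items()}
    return max(scores, key=scores.get), scores
```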
[0027] In further alternatives, the three-dimensional model may be created on the basis
of images of the relevant body region which are not acquired using the US probe 2
but using another imaging modality. For instance, in case the US probe 2 is an ICE
probe, the images may be acquired using another US imaging modality, such as TEE or
TTE. Likewise, another imaging modality may be used to acquire one or more image(s)
for creating the model, such as, for example, computed tomography (CT) imaging, magnetic
resonance (MR) imaging or 3D rotational angiography (3DRA). Also in these implementations,
the position and orientation of the model in a reference frame may be determined,
e.g. by tracking the utilized US probe or on the basis of the known image frame of
the CT or MR image.
[0028] Further, when the relevant body region moves periodically during the interventional
procedure, the three-dimensional model of the relevant body region may represent the
body region in one particular phase of its periodic motion. In this implementation,
visualizations may only be generated for the relevant motion phase. This particularly
means that only live US images and position and/or orientation information acquired
during the relevant motion phase are used in the system. These data may be selected
on the basis of a gating signal, which indicates the start and end of the relevant
motion phase in each cycle of the periodic motion.
[0029] The relevant motion phase may correspond to the systole or the diastole. The gating
signal may be derived from an electrocardiography (ECG) signal, for example. As an
alternative, any other signal varying in synchronization with the periodic motion
of the heart may be used. So, the gating signal may be derived from position and/or
orientation information of the US probe 2 and/or the tracked medical device 1. Likewise,
the gating signal may be derived from the live US images acquired by means of the
US probe 2. In this embodiment, a statistical property of the live US images varying
in synchronization with the periodic motion of the heart, such as the mean pixel value
(in case of two-dimensional images) or voxel value (in case of three-dimensional images)
or the variance of all pixel or voxel values, may be evaluated, and the gating signal
may be derived from the variations of this property.
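As a purely illustrative sketch (not part of the original disclosure), a gating signal may for instance be derived from the mean pixel value per frame as follows; the smoothing window and the threshold band are assumed tuning parameters, and frames whose smoothed mean falls below the threshold are treated as belonging to the relevant motion phase.
```python
import numpy as np

def gate_frames(frames, band=0.2, window=5):
    """Return a boolean mask marking the frames assigned to the gated motion phase."""
    means = np.array([f.mean() for f in frames])       # statistical property per frame
    kernel = np.ones(window) / window
    smooth = np.convolve(means, kernel, mode="same")   # suppress speckle-induced jitter
    lo, hi = smooth.min(), smooth.max()
    threshold = lo + band * (hi - lo)
    return smooth <= threshold
```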
[0030] Moreover, a gating mechanism may be applied with respect to other motions of the
heart, such as respiratory motion. In this case, the model of the heart may be created
for a particular phase of the respiratory motion of the heart, and only live US images
and position and/or orientation information acquired during this phase are used in
the system for generating a visualization. For determining the occurrence of the relevant
phase of the respiratory motion, the system may further comprise a sensor for determining
the respiratory motion, such as, for example, a sensor for determining the ventilation
air flow and/or a sensor for determining the movement of the patient's chest or abdominal
wall during breathing. On the basis of the signals of this sensor, the data including
the live US images and the position and/or orientation data are unlocked (for the
relevant phase of the respiratory motion) or locked (during other phases of the respiratory
motion) for the creation of visualizations.
[0031] As an alternative to a static model and the aforementioned gating, a dynamic model
may be used. This model may include deforming sub-models for each relevant phase of
the periodic motion of the relevant body region, where each deforming sub-model
models the changing form of the relevant body region. These sub-models may be defined
on the basis of vector fields describing the displacement of image portions of the
model with time during the motion phases. In each of the motion phases, the system
uses the associated sub-model for generating the visualizations on the basis of live
US images and position and/or orientation information for the tracked medical device
1 acquired during this motion phase. Corresponding sub-models may be created for different
phases of the cardiac motion and/or for the respiratory motion of the relevant body
region.
[0032] For identifying the relevant motion phases in this alternative, suitable trigger
signals are used, which may be derived in a similar manner as the aforementioned gating
signals. In case the relevant region of the patient body includes a cardiac chamber,
the trigger signals may particularly again be derived from an ECG signal or from another
signal varying in synchronization with the heart motion. Optionally, the dynamic model
may also be generated for different phases of the respiratory motion of the heart
and the corresponding phases may be identified using a sensor for determining the
respiratory motion.
[0033] In the way described above, models of various regions of interest may be created.
One such region may be the left atrium as described above. In a similar manner, models
can particularly be created for other heart chambers, such as the right atrium, left
and right ventricle, or for vessels such as the aorta, pulmonary artery, pulmonary
veins, inferior vena cava, superior vena cava, coronary arteries, coronary veins,
or for a valve anatomy, such as the aortic valve, mitral valve, tricuspid valve, pulmonary
valve, or the esophagus.
[0034] The tracking arrangement for determining the position and/or orientation of the medical
device 1 relative to the US probe 2 includes at least one US sensor 6 attached to
the medical device 1, particularly to its tip. The US sensor 6 is configured to sense
US signals incident onto the US sensor 6. For this purpose, the US sensor 6 may comprise
a foil of US sensitive material. Likewise, the US sensor 6 may comprise an US transducer,
such as, for example, a lead zirconate titanate (PZT) transducer, a single crystal
transducer (SXL), a capacitive micro-machined ultrasonic transducer (CMUT) or a piezoelectric
micro-machined ultrasonic transducer (PMUT), where only the ability to sense US signals
is used here. During operation in the present system, the US sensor 6 senses US signals
emitted by the US probe 2.
[0035] The US sensor 6 is connected to a tracking unit 7 which determines the relative position
of the US sensor 6 with respect to the US probe 2 on the basis of the sensed US signals
and, thus, determines the relative position of the tip of the medical device 1 with
respect to the US probe 2. In order to determine the orientation of the medical device
1, at least one further US sensor 6 is attached to the medical device 1 and the tracking
unit 7 also determines the relative position of the further US sensor 6 with respect
to the US probe 2 on the basis of the US signals sensed by the further US sensor 6.
On the basis of the relative positions of the US sensors 6, the tracking unit then
determines the orientation of the medical device 1.
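As a purely illustrative sketch (not part of the original disclosure), the orientation of the distal device section may for instance be expressed as the unit vector between two sensor positions, both already given relative to the US probe 2.
```python
import numpy as np

def device_direction(p_tip, p_proximal):
    """Unit vector along the distal device axis from two tracked sensor positions."""
    v = np.asarray(p_tip, dtype=float) - np.asarray(p_proximal, dtype=float)
    return v / np.linalg.norm(v)
```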
[0036] In order to ascertain the position of one US sensor 6, the tracking unit 7 evaluates
the US signals sensed by the US sensor 6 while the US probe 2 images the volume of
interest by emitting US beam pulses under different azimuth angles and, in case of
a 3D US probe 2, also under different elevation angles. In order to determine the
angular position of the US sensor 6 with respect to the US probe 2, the tracking unit
7 compares the responses to the emitted US beams sensed by the US sensor 6 and determines
the azimuth angle and, in case of a 3D US probe 2, also the elevation angle under
which the beam(s) resulting in the maximum response(s) have been emitted. The determined
angle(s) define(s) the relative angular position of the US sensor 6 with respect to
the US probe 2. The distance between the US sensor 6 and the US probe 2 is determined
on the basis of the time delays between the times of the transmission of the beams
producing the maximum responses and the times of the sensing of the beams by the US
sensor 6, i.e. on the basis of the time of flight of the beams.
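As a purely illustrative sketch (not part of the original disclosure) for a two-dimensional probe, the localization described above may be expressed as follows; the per-beam steering angles, sensor amplitudes and delays are assumed to be available from the acquisition, and a nominal speed of sound is assumed.
```python
import numpy as np

SPEED_OF_SOUND = 1540.0  # m/s, assumed value for soft tissue

def locate_sensor(beam_angles_rad, sensor_amplitudes, sensor_delays_s):
    """Estimate the sensor position in the probe frame (x: lateral, z: depth)."""
    i = int(np.argmax(sensor_amplitudes))      # beam producing the maximum response
    azimuth = beam_angles_rad[i]               # angular position of the sensor
    r = SPEED_OF_SOUND * sensor_delays_s[i]    # range from the one-way time of flight
    return np.array([r * np.sin(azimuth), r * np.cos(azimuth)])
```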
[0038] As said above, the system generates visualizations in which the live US images and
indications of the position and/or orientation of the medical device 1 are overlaid
over the model of the relevant region of the patient body. These visualizations are
displayed at the display unit 4 during an interventional procedure in order to assist
the physician in steering the medical device 1 during the interventional procedure.
[0039] In order to generate these visualizations, a mapping unit 8 of the system maps the
live US images acquired using the imaging probe 2 onto the model of the relevant region
of the patient body provided by the 3D model providing unit 5. Thus, the mapping unit
8 determines the part of the model which is included in the live images. In Fig. 3a,
this mapping is schematically and exemplarily illustrated for a two-dimensional slice
31 corresponding to a field of view of an US probe 2 for acquiring two-dimensional
images, which is mapped onto the model 21 of the left atrium shown in Fig. 2. Fig.
3b schematically and exemplarily illustrates the mapping for a three-dimensional cone
32 corresponding to a field of view of an US probe 2 for acquiring three-dimensional
images, which is mapped onto the model 21 of the left atrium shown in Fig. 2.
[0040] In one implementation, the mapping of a live US image onto the model is performed
on the basis of the comparison between the live US image and the model. In particular,
an image registration between the live US image and the model may be carried out which
involves the determination of a rigid transformation for transforming the US image
such that it matches a portion of the model. The rigid transformation comprises a
rotation and/or a translation.
[0041] In one embodiment of the registration procedure, the mapping unit 8 may identify
fiducial image points in the live US image and map these image points to corresponding
points of the model in order to determine the transformation. The mapping of fiducial
points can be carried out using known computer vision techniques, such as, for example,
scale-invariant feature transform (SIFT). Alternatively, a registration method may
be applied which determines the rigid transformation such that the transformed live
US image has the largest similarity to the model. Such a registration procedure may
be performed on the basis of a segmented version of the live US image, which may be
determined using a suitable segmentation procedure known to the person skilled in the
art. The similarity between the (transformed) US image and the model may again be
determined on the basis of a suitable similarity measure, e.g. as explained above.
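As a purely illustrative sketch (not part of the original disclosure), once fiducial correspondences between the live US image and the model are available, the rigid transformation may for instance be estimated with the standard SVD-based least-squares fit; the matched point sets are assumed to be given.
```python
import numpy as np

def rigid_transform(src, dst):
    """Find R, t minimizing sum ||R @ src_i + t - dst_i||^2 over matched 3D points."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # avoid a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t
```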
[0042] In case the model is a dynamic model, the mapping of the live US image onto the model
may also be made by matching estimated motion vectors describing the displacement
of image portions in the live image pertaining to one motion phase relative to the
positions of the image portions in a live image of the preceding motion phase with
the motion vectors describing the displacement of image portions of the dynamic model.
[0043] The mapping of the live US images onto the model may be performed on the basis of
the aforementioned image registration procedure alone. In this case, the determined
transformation may also be evaluated to determine the relative position of the US
probe 2 with respect to the model, i.e. in the reference frame in which the model
is defined.
[0044] In addition or as an alternative, the mapping of a live US image onto the model may
be performed on the basis of information about the position and orientation of the
US probe 2 in case the position and orientation of the model has been determined with
respect to a reference frame as explained above. Using this position and orientation
information, the mapping unit 8 may determine a rigid transformation for transforming
the live US image into the reference frame in which the model is defined and maps
the live US image onto the model by applying this transformation. The transformation
may be determined on the basis of the information about position and orientation of
the US probe 2 alone or it may be determined based on this information and additionally
based on an image registration between the live US image and the model as explained
above.
[0045] In order to carry out the mapping in this embodiment, the position and orientation
of the US probe 2 at the time of the acquisition of the live image within the reference
frame is determined. On this basis, the mapping unit 8 further determines the relative
position and orientation of the field of view of the US probe 2 with respect to the
model and uses this information for determining which part of the model is imaged
by the US probe 2 in the live US image.
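As a purely illustrative sketch (not part of the original disclosure), the part of the model covered by a two-dimensional live image may for instance be found by mapping the corners of the imaging plane into the model reference frame with the tracked probe pose; the pose convention (R, t mapping probe coordinates into the model frame) and the field-of-view dimensions are assumptions.
```python
import numpy as np

def fov_in_model_frame(R_probe, t_probe, depth=80.0, half_width=40.0):
    """Corners of a triangular 2D field of view expressed in the model frame."""
    corners_probe = np.array([
        [0.0, 0.0, 0.0],            # probe apex
        [-half_width, 0.0, depth],  # far left corner of the sector
        [half_width, 0.0, depth],   # far right corner of the sector
    ])
    return corners_probe @ R_probe.T + t_probe
```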
[0046] The determination of the position and orientation of the US probe 2 with respect
to the reference frame may be made using any of the tracking techniques already referred
to above in connection with the description of the creation of the model. Thus, it
may be determined on the basis of images of the relevant body region acquired using
a suitable imaging modality, such as fluoroscopy, or on the basis of EM tracking,
impedance tracking, optical shape sensing or satellite-based tracking.
[0047] Further, as described above, the position and orientation of the US probe 2 may likewise
be tracked with respect to the reference device when the reference device is held
at the same fixed position during the initialization phase in which the model is created
and during the actual interventional procedure. In this implementation, the position
and orientation of the reference device defines the reference frame of the model.
[0048] For determining the relative position and orientation of the US probe 2 with respect
to the reference device during the initialization phase (for creating the model) and
during the interventional procedure (for generating visualizations on the basis of
the model), the reference device may be equipped with US sensors and on the basis
of the US signals sensed by the US sensors, the relative position and orientation
of the US probe 2 and the reference device is determined as explained above in connection
with the medical device 1. On the basis of this information, the position and orientation
of the US probe 2 relative to the model is determined.
[0049] The reference device may be specifically provided in order to establish a reference
position and orientation for the tracking of the US probe 2. Alternatively, the reference
device may be a medical device which has another function during the interventional
procedure but is substantially not moved during the procedure, such as, for example,
a diagnostic EP catheter for sensing electrical signals or applying electrical signals
to tissue for stimulation.
[0050] Upon having mapped the live US image onto the model, the mapping unit 8 creates a
visualization in which the live US image is overlaid over the model in accordance
with the result of the mapping. Further, the mapping unit 8 marks the position(s)
of the US sensor(s) 6 attached to the medical device 1 in the visualization, i.e.
in the live US image and the model as included in the visualization. The marking may
be made by placing corresponding dots or other symbols in the visualization. The visualization
is then displayed at the display unit 4 of the system. A corresponding visualization
is schematically and exemplarily illustrated in Fig. 4 for a three-dimensional US
image 41. In the example illustrated in Fig. 4, the medical device 1 is shown in the
US image and the position of an US sensor 6 attached to the tip of the medical device
1 is marked with a dot 42.
[0051] In order to mark the position(s) of the US sensor(s) 6 in the visualization, the
mapping unit 8 determines the relative position(s) of the US sensor(s) 6 attached
to the medical device 1 with respect to the live US image and/or the model.
[0052] This may be done on the basis of the relative position(s) of the US sensor(s) 6 with
respect to the US probe 2 as determined in the tracking unit 7 and on the basis of
the relative position of the US probe 2 or the live US image acquired using the US
probe 2 with respect to the model. These data allow for determining the relative position(s)
of the US sensor(s) 6 with respect to the model so that the mapping unit 8 can place
the marks in the visualization accordingly.
[0053] Likewise, the mapping unit 8 may directly determine the position(s) of the US sensor(s)
6 in the model. This is particularly possible if the position and orientation of the
medical device 1 defines the reference frame of the model as described above.
[0054] The mapping unit 8 generates the visualizations in such a way that each of the visualizations
shows the current position(s) of the US sensor(s) attached to the medical device 1,
i.e. the position(s) at the time of the acquisition of the live US image included
in the visualization. Thus, a physician viewing the visualization at the display unit
can easily determine the current position and/or orientation of the medical device
1 during the interventional procedure.
[0055] In a further implementation, the mapping unit 8 may generate the visualizations in
such a way that previous positions of one or more of the US sensor(s) 6 attached
to the medical device 1 are marked in addition to the current position(s). By way
of example, a corresponding visualization is illustrated in Fig. 5. In this visualization,
the current position of a US sensor 6 attached to a medical device 1 is indicated
by means of a mark 51 in the model 21 of the left atrium and previous positions of
the US sensor are indicated by means of marks 52a-c.
[0056] In particular, the visualizations may be generated such that previous positions of
the US sensor 6 attached to the device's tip are additionally marked in the visualizations.
This is particularly useful if the medical device 1 is an ablation catheter. In this
case, the previous positions may correspond to previous ablation points. These points
may be identified manually or automatically during the ablation procedure and stored
in the mapping unit 8 in response to their identification so that they can be marked
in subsequently generated visualizations. In addition, also ablation parameters such
as power and duration, which were used for ablation at the ablation points, or lesion
parameters may be stored in the mapping unit 8 and displayed in connection with the
marks identifying the ablation points in the visualizations.
[0057] In addition or as an alternative to the previous positions of the US sensor 6, the
system may mark positions of a planned (future) trajectory of the medical device 1
in the presented visualization in order to assist the physician viewing the visualizations
in following the planned trajectory.
[0058] In a further embodiment, the mapping unit 8 generates visualizations for displaying
at the display unit 4, which comprise a part of the model included in the view of
a virtual eye at the location of the US sensor 6 attached to the tip of the medical
device 1. The field of view of the virtual eye may particularly be directed along
the longitudinal direction of the distal end section of the medical device 1 and cover
a region in front of the medical device 1. The visualization may be generated from
the three-dimensional model and optionally also from the live US images.
[0059] In this embodiment, the mapping unit 8 maps the position and orientation of the medical
device 1 on the model. This mapping is performed on the basis of a mapping of plural
US sensors 6 attached to the medical device 1 on the model. The latter mapping is
carried out directly or on the basis of the mapping of the position and orientation
of the US probe 2 onto the model and on the basis of the relative positions of the
US sensors 6 with respect to the US probe 2 as already described above. On the basis
of the mapping of the position and orientation of the medical device 1 onto the model,
the mapping unit 8 then determines the parts of the model which are included in the
field of view of the virtual eye and generates the visualization such that it includes
these parts in a view which corresponds to the view as seen by the virtual eye.
[0060] In addition, the mapping unit 8 may map the live US images acquired by means of the
US probe 2 onto the determined view of the model on the basis of a transformation
of the US image. For this purpose, the mapping unit 8 may determine a rigid transformation
for transforming the image space corresponding to the live US image to a new image
space corresponding to the field of view of the virtual eye on the basis of the relative
position and orientation of the medical device 1 with respect to the US probe 2. This
transformation is then applied to the live US image. Thereupon, the mapping unit 8
generates a visualization in which the transformed live US image is overlaid over the
model.
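As a purely illustrative sketch (not part of the original disclosure), re-expressing live-image content in the frame of the virtual eye amounts to applying the inverse of the device-tip pose; here R_dev and t_dev are assumed to map device-tip coordinates into the probe/image frame.
```python
import numpy as np

def to_virtual_eye(points_probe, R_dev, t_dev):
    """Transform (N, 3) points from the probe/image frame into the device-tip frame."""
    R_inv = R_dev.T
    return (points_probe - t_dev) @ R_inv.T    # equivalent to R_inv @ (p - t_dev) per point
```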
[0061] In such a way it is possible to visualize the anatomy of the relevant region of the
patient body from the point of view of the tip of the medical device 1. Such visualization
can further assist a physician in steering the medical device 1 during the interventional
procedure. In this respect, the embodiments described above can also be combined such
that it is possible to switch between a visualization in which the positions of the
US sensors 6 are marked in the overlay of the live US images on the model and a visualization
corresponding to the view as seen by the virtual eye.
[0062] In the system described above, particularly the US unit 3, the 3D model providing
unit 5, the tracking unit 7 and the mapping unit 8 may be implemented as software
modules executed on one or more computer device(s). For this purpose, a corresponding
computer program is provided and installed on the computer device(s), which comprises
instructions for executing the functions of the units. Further, the computer device(s)
is/are particularly connected to the US probe 2 and the US sensor(s) 6 in order to
control the operation of the US probe 2 and to receive US signals acquired by the
US probe 2 and the US sensor(s) 6. Moreover, the computer device(s) is/are connected
to the display unit 4 to control the display unit 4 to display the generated visualizations
as explained above.
[0063] In the embodiments of the system described above, it is possible to generate visualizations
of a model and live US images of a region of the patient body on the basis of the
position and/or orientation of one medical device 1 included in the relevant region
of the patient body, where the position(s) of US sensor(s) 6 attached to the medical
device 1 are marked in the visualizations or where the visualizations correspond to
the view as seen by a virtual eye at the location of a US sensor 6 attached to the
tip of the medical device 1.
[0064] In Fig. 6, some of the steps of the related procedure are summarized. In the illustrated
step 61, the three-dimensional model of the relevant region of the patient body is
generated in the initialization phase as explained above. Thereupon, during the actual
interventional procedure, live US images are acquired by means of the US probe 2 (step
62). Further, the relative position(s) of the US sensor(s) 6 attached to the medical
device 1 and the US probe 2 is/are determined as explained above (step 63). These
positions are mapped onto the model 21 by the mapping unit 8 (step 64). Moreover,
the mapping unit 8 generates a visualization as described above in which the positions
of the US sensor(s) are marked in the model 21 (step 65). In addition, the mapping
unit maps the live US images acquired by means of the US probe onto the model 21 (step
66) and overlays the live US images over the model 21 accordingly in the generated
visualization.
[0065] In a similar manner, it is also possible to generate corresponding visualizations
with respect to a plurality of medical devices 1 used in the relevant region of the
patient body.
[0066] In related embodiments, the positions of US sensors 6 attached to these medical devices
1 may all be marked in the visualizations and/or the mapping unit 8 may generate visualizations
corresponding to views as seen by virtual eyes at the locations of the tips of the
different medical devices 1. In the latter case, it may also be possible to switch
between these visualizations. Moreover, the mapping unit 8 may mark in the visualization
pertaining to one medical device 1 the positions of the US sensors 6 attached to the other
medical devices 1 if they are included in the field of view of the virtual eye at
the tip of the relevant medical device 1. The corresponding marks may be positioned
on the basis of a mapping of the positions of the US sensors 6 onto the field of view
of the virtual eye.
[0067] Further, in one embodiment of the system, the medical device 1 is an EP
catheter which is used for generating an electro-anatomical map of the relevant region
of the patient body, such as a cardiac chamber. This map may be overlaid over the
aforementioned visualizations generated in the system on the basis of the model and
may include an activation map indicating local activation times and/or a voltage map
indicating local electrogram amplitudes. The EP catheter may comprise a plurality
of electrodes for sensing electrical signals and optionally for delivering stimulation
signals and on the basis of the sensed electrical signals, local activation times
and/or electrogram amplitudes are determined in a way known to a person skilled in
the art. For generating the activation and/or voltage map, the EP catheter is moved
within the relevant region of the patient body and local measurements are made at
different locations within the region. At each measurement location, the positions
of the electrodes are determined on the basis of the US signals sensed by means of
the US sensor(s) 6 attached to the EP catheter as explained above. Then, the results
of the local measurements are combined to generate the map and the mapping unit 8
may overlay the map onto the model of the relevant region of the patient body on the
basis of the recorded position and orientation information.
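As a purely illustrative sketch (not part of the original disclosure), the local measurements may for instance be combined into a map by assigning each model surface vertex the value of the nearest measurement location; nearest-neighbour assignment is an assumed, simple choice and any interpolation scheme may be used instead.
```python
import numpy as np

def build_ea_map(measurement_positions, measurement_values, model_vertices):
    """Assign each model vertex the value (e.g. a local activation time) of the
    nearest measurement location obtained from the US-sensor tracking."""
    d = np.linalg.norm(model_vertices[:, None, :] -
                       measurement_positions[None, :, :], axis=2)
    nearest = d.argmin(axis=1)
    return np.asarray(measurement_values)[nearest]
```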
[0068] Moreover, the generated visualizations may be fused with fluoroscopy images of the
relevant region of the patient body acquired using a fluoroscopy device.
[0069] Other variations to the disclosed embodiments can be understood and effected by those
skilled in the art in practicing the claimed invention, from a study of the drawings,
the disclosure, and the appended claims.
[0070] In the claims, the word "comprising" does not exclude other elements or steps, and
the indefinite article "a" or "an" does not exclude a plurality.
[0071] A single unit or devices may fulfill the functions of several items recited in the
claims. The mere fact that certain measures are recited in mutually different dependent
claims does not indicate that a combination of these measures cannot be used to advantage.
[0072] A computer program may be stored/distributed on a suitable medium, such as an optical
storage medium or a solid-state medium, supplied together with or as part of other
hardware, but may also be distributed in other forms, such as via the Internet or
other wired or wireless telecommunication systems.
Any reference signs in the claims should not be construed as limiting the scope.
1. A system for assisting a user in navigating a cardiac medical device (1) in a cardiac
region of a patient body, the system comprising:
- a cardiac medical device (1),
- a 3D model providing unit (5) configured to provide a three-dimensional cardiac
model (21) of the cardiac region of the patient body,
- an ultrasound probe (2) for acquiring image signals of the cardiac region of the
patient body and an ultrasound unit (3) configured to provide live images of the cardiac
region of the patient body on the basis of the image signals,
- at least one ultrasound sensor (6) attached to the cardiac medical device (1) for
sensing ultrasound signals emitted by the ultrasound probe (2),
- a tracking unit (7) configured to determine a relative position of the at least one
ultrasound sensor (6) with respect to the live images and/or the ultrasound probe
(2) on the basis of the sensed ultrasound signals, and
- a mapping unit (8) configured to map the determined relative position of the at
least one ultrasound sensor (6) onto the cardiac model (21) to generate a visualization
of the cardiac region of the patient body on the basis of the cardiac model and on
the basis of the result of the mapping.
2. The system as defined in claim 1, wherein the mapping unit (8) is configured to generate
a visualization of the cardiac model (21) in which the position of the at least one
ultrasound sensor (6) is marked.
3. The system as defined in claim 2, wherein the mapping unit (8) is configured to map
the live images onto the cardiac model (21) and to overlay the cardiac model (21)
with the live images in the visualizations on the basis of the result of this mapping.
4. The system as defined in claim 3, wherein the mapping unit (8) is configured to map
the live images onto the cardiac model (21) on the basis of an image comparison of
the live images and the cardiac model.
5. The system as defined in claim 3, wherein the mapping unit (8) is configured to map
the live images onto the cardiac model (21) on the basis of a relative position and
orientation of the ultrasound probe (2) with respect to a reference frame associated
with the cardiac model (21).
6. The system as defined in claim 5, wherein the 3D model providing unit (5) is configured
to create the cardiac model (21) using ultrasound images acquired using the ultrasound
probe (2) during an initialization phase in which a further ultrasound sensor (6)
is positioned at a reference position and wherein the reference frame is defined on
the basis of a relative position and orientation of the ultrasound probe (2) with
respect to the further ultrasound sensor (6) determined on the basis of the ultrasound
signals sensed by the further ultrasound sensor (6).
7. The system as defined in claims 5 and 6, wherein the further ultrasound sensor (6)
is positioned at the reference position during the acquisition of the live images
and wherein the mapping unit (8) is configured to determine the relative position
and orientation of the ultrasound probe (2) with respect to the reference frame on
the basis of the relative position and/or orientation of the further ultrasound sensor
(6) with respect to the ultrasound probe (2).
8. The system as defined in claim 1, wherein the cardiac region of the patient body undergoes
a periodic motion having different motion phases, wherein the cardiac model (21) is
a dynamic model comprising a deforming sub-model for each of the motion phases and
wherein the mapping unit (8) is configured to determine a current motion phase and
to map the relative position of the at least one ultrasound sensor (6) on the deforming
sub-model for the current motion phase.
9. The system as defined in claim 1, wherein the cardiac medical device (1) is configured
to carry out electrical measurements to generate an electro-anatomical map of the
cardiac region of the patient body and wherein the mapping unit (8) is configured
to overlay the electro-anatomical map over the cardiac model (21) on the basis of
the relative position of the at least one ultrasound sensor (6) with respect to the
ultrasound probe (2) during the measurements.
10. The system as defined in claim 1, wherein the mapping unit (8) is configured to generate
a visualization of the cardiac model (21) corresponding to a view as seen by a virtual
eye based on the position of the at least one ultrasound sensor (6).
11. The system as defined in claim 10, wherein the mapping unit (8) is configured to map
the live images onto the view and to overlay the view with the live image in the visualization
on the basis of the result of the mapping.
12. The system as defined in claim 10, wherein the mapping unit (8) is configured to generate
the visualization on the basis of a mapping of the live image and/or the position
and orientation of the ultrasound probe (2) onto the cardiac model (21) and on the
basis of the relative position and orientation of the at least one ultrasound sensor
(6) with respect to the ultrasound probe (2).
13. The system as defined in claim 1, wherein the ultrasound probe (2) is configured to
emit ultrasound signals into different directions and wherein the tracking unit (7)
is configured to determine the position of the at least one ultrasound sensor (6)
based on a reception level of the ultrasound signals in the ultrasound sensor (6)
and/or wherein the tracking unit (7) is configured to determine the position of the
at least one ultrasound sensor (6) on the basis of a time difference between the emission
of the ultrasound signals by the ultrasound probe (2) and their sensing by the ultrasound
sensor (6).
14. A computer program comprising program code for instructing a computer device to perform
a method, when the computer program is executed on the computer device, wherein the
method comprises a method for assisting a user in navigating a cardiac medical device
(1) having at least one ultrasound sensor (6) attached thereto in a cardiac region
of a patient body, the method comprising:
- providing (61) a three-dimensional cardiac model (21) of the cardiac region of the
patient body,
- receiving (62) live images of the cardiac region of the patient body on the basis
of image signals acquired using an ultrasound probe (2),
- determining (63) a relative position of the at least one ultrasound sensor (6) attached
to the cardiac medical device (1) with respect to the ultrasound probe (2), the ultrasound
sensor (6) sensing ultrasound signals emitted by the ultrasound probe (2),
- mapping (64) the determined relative position of the at least one ultrasound sensor
(6) onto the cardiac model to generate (85) a visualization of the cardiac region of
the patient body on the basis of the cardiac model (21) and on the basis of the result
of the mapping.
1. System zum Unterstützen eines Benutzers beim Navigieren eines herzmedizinischen Geräts
(1) in einer Herzregion eines Patientenkörpers, wobei das System umfasst:
- ein herzmedizinisches Gerät (1),
- eine 3D-Modell-Bereitstellungseinheit (5), die so konfiguriert ist, dass sie ein
dreidimensionales Herzmodell (21) der Herzregion des Patientenkörpers bereitstellt,
- eine Ultraschallsonde (2) zum Erfassen von Bildsignalen der Herzregion des Patientenkörpers
und eine Ultraschalleinheit (3), die so konfiguriert ist, dass sie Live-Bilder des
Herz-Bereichs des Patientenkörpers anhand der Bildsignale liefert,
- mindestens einen an dem herzmedizinischen Gerät (1) angebrachten Ultraschallsensor
(6) zum Erfassen von Ultraschallsignalen, die von der Ultraschallsonde (2) ausgesendet
werden,
- eine Verfolgungseinheit (7), die konfiguriert ist, um eine relative Position des
zumindest einen Ultraschallsensors (6) zu bestimmen, in Bezug auf die Live-Bilder
und/oder die Ultraschallsonde (2), basierend auf den erfassten Ultraschallsignalen,
und
- eine Kartierungseinheit (8), die konfiguriert ist, um die bestimmte relative Position
des mindestens einen Ultraschallsensors (6) auf das Herzmodell (21) zu kartieren,
um eine Visualisierung der Herzregion des Patientenkörpers auf der Grundlage des Herzmodells
und auf der Grundlage des Ergebnisses der Kartierung zu erzeugen.
2. System wie definiert in Anspruch 1, wobei die Abbildungseinheit (8) konfiguriert ist,
eine Visualisierung des Herzmodells (21) zu erzeugen, in der die Position des mindestens
einen Ultraschallsensors (6) markiert ist.
3. System nach Anspruch 2, wobei die Abbildungseinheit (8) konfiguriert ist, die Live-Bilder
auf das Herzmodell (21) abzubilden und das Herzmodell (21) mit den Livebildern in
den Visualisierungen auf Basis des Ergebnisses dieser Kartierung zu überlagern.
4. System nach Anspruch 3, wobei die Abbildungseinheit (8) konfiguriert ist, die Live-Bilder
auf das Herzmodell (21) anhand eines Bildvergleichs der Live-Bilder und des Herzmodells
abzubilden.
5. System wie definiert in Anspruch 3, wobei die Abbildungseinheit (8) konfiguriert ist,
die Live-Bilder auf das Herzmodell (21) abzubilden, und dies anhand einer relativen
Position und Orientierung der Ultraschallsonde (2) in Bezug auf ein Referenzsystem,
das dem Herzmodell (21) zugeordnet ist.
6. System wie definiert in Anspruch 5, wobei die 3D-Modellbereitstellungseinheit (5)
konfiguriert ist, um das Herzmodell (21) unter Verwendung von Ultraschallbildern zu
erstellen, die unter Verwendung der Ultraschallsonde (2) während einer Initialisierungsphase
erfasst werden, in der sich ein weiterer Ultraschallsensor (6) an einer Referenzposition
befindet und wobei der Referenzrahmen definiert ist auf Grundlage einer relativen
Position und Orientierung der Ultraschallsonde (2) bezüglich des weiteren Ultraschallsensors
(2), bestimmt anhand der Ultraschallsignale, die vom weiteren Ultraschallsensor (6)
erfasst wurden.
7. System wie definiert wie definiert in Anspruch 5 und 6, wobei der weitere Ultraschallsensor
(6) während der Aufnahme der Live-Bilder an der Referenzposition positioniert wird
und wobei die Kartierungseinheit (8) konfiguriert ist, um die relative Position und
Orientierung der Ultraschallsonde (2) in Bezug auf das Bezugssystem auf der Grundlage
der relativen Position und/oder Orientierung des weiteren Ultraschallsensors (6) bezüglich
der UltraschallSonde (6) zu bestimmen.
8. System wie definiert in Anspruch 1, wobei die Herzregion des Patienten-Körpers eine
periodische Bewegung mit unterschiedlichen Bewegungsphasen erfährt, wobei das Herzmodell
(21) ein dynamisches Modell ist, das ein sich verformendes Teilmodell für jede der
Bewegungsphasen umfasst, und wobei die Abbildungseinheit (8) konfiguriert ist, um
eine aktuelle Bewegungsphase zu bestimmen und die relative Position des mindestens
einen Ultraschallsensors (6) auf dem sich verformenden Teilmodell für die aktuelle
Bewegungsphase abzubilden.
9. System wie definiert in Anspruch 1, wobei die herzmedizinische Vorrichtung (1) konfiguriert
ist, um elektrische Messungen durchzuführen, um eine elektroanatomische Karte der
Herzregion des Patientenkörpers zu erstellen, und wobei die Abbildungseinheit (8)
zum Überlagern der elektroanatomischen Karte über dem Herzmodell (21) auf Grundlage
der relativen Position des mindestens einen Ultraschallsensors (6) bezüglich der Ultraschallsonde
(2) während der Messungen konfiguriert ist.
10. System wie definiert in Anspruch 1, wobei die Kartierungseinheit (8) konfiguriert
ist, um eine Visualisierung des Herzmodells (21) zu erzeugen, die einer Ansicht entspricht,
wie sie der Ansicht eines virtuellen Auges aufgrund der Position des mindestens einen
Ultraschallsensors (6) entspricht.
11. System wie definiert in Anspruch 10, wobei die Kartierungseinheit (8) konfiguriert
ist, um die Live-Bilder auf die Ansicht abzubilden und die Ansicht mit dem Live-Bild
in der Visualisierung auf Basis des Ergebnisses der Kartierung zu überlagern.
12. System nach Anspruch 10, wobei die Abbildungseinheit (8) konfiguriert ist, die Visualisierung
auf Basis einer Abbildung des Livebildes und/oder der Position und Ausrichtung der
Ultraschallsonde (2) auf das Herzmodell (21) zu erzeugen, und dies anhand der relativen
Position und Orientierung des mindestens einen Ultraschallsensors (6) bezüglich der
Ultraschallsonde (2).
13. System wie definiert in Anspruch 1, wobei die Ultraschallsonde (2) konfiguriert, um
Ultraschallsignale in verschiedene Richtungen auszusenden, und wobei die Verfolgungseinheit
(7) konfiguriert ist, um die Position des mindestens einen Ultraschallsensors (6)
auf Grundlage des Empfangspegels der Ultraschallsignale im Ultraschallsensor (6) zu
bestimmen, und/oder wobei die Verfolgungseinheit (7) dazu konfiguriert ist, die Position
des mindestens einen Ultraschallsensors (6) auf der Grundlage einer Zeitdifferenz
zwischen dem Aussenden der Ultraschallsignale durch die Ultraschallsonde (2) und deren
Erfassung durch den Ultraschallsensor (6) zu bestimmen.
14. A computer program comprising program code for causing a computer device to carry
out a method when the computer program is executed on the computer device, the method
being a method for assisting a user in navigating a cardiac medical device (1) having
at least one ultrasound sensor (6) attached thereto in a cardiac region of a patient
body, the method comprising:
- providing (61) a three-dimensional heart model (21) of the cardiac region of the
patient body,
- receiving (62) live images of the cardiac region of the patient body on the basis of
image signals acquired using the ultrasound probe (2),
- determining (63) a relative position of the at least one ultrasound sensor (6)
attached to the cardiac medical device (1) with respect to the ultrasound probe (2),
the ultrasound sensor (6) detecting ultrasound signals emitted by the ultrasound probe
(2),
- mapping (64) the determined relative position of the at least one ultrasound sensor
(6) onto the heart model in order to generate (85) a visualization of the cardiac
region of the patient body on the basis of the heart model (21) and on the basis of
the result of the mapping.
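Claim 13 recites determining the sensor position from the time difference between the
emission of the ultrasound signals by the probe (2) and their detection by the sensor
(6). The following is a minimal sketch of such time-of-flight localization, assuming a
fixed speed of sound, known in-plane positions of the emitting transducer elements of a
two-dimensional (matrix) array, and a sensor located in front of the array; the function
name and the linearized least-squares trilateration are illustrative assumptions, not
the claimed implementation.

import numpy as np

SPEED_OF_SOUND_M_S = 1540.0  # assumed average speed of sound in soft tissue

def locate_sensor(element_xy, arrival_times):
    """Estimate the sensor position in the probe frame from times of flight.

    element_xy    : (N, 2) in-plane positions of the emitting elements (z = 0) [m].
    arrival_times : (N,) times between emission by each element and detection [s].
    Returns (x, y, z) with z >= 0, i.e. the sensor is assumed in front of the array.
    """
    p = np.asarray(element_xy, dtype=float)
    r = SPEED_OF_SOUND_M_S * np.asarray(arrival_times, dtype=float)  # ranges [m]
    # Linearize by subtracting the first range equation from the others:
    # 2 (p_i - p_0) . (x, y) = |p_i|^2 - |p_0|^2 - r_i^2 + r_0^2
    A = 2.0 * (p[1:] - p[0])
    b = np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2) - r[1:] ** 2 + r[0] ** 2
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Recover the elevation from any single range, taking the non-negative root.
    z2 = r[0] ** 2 - np.sum((xy - p[0]) ** 2)
    return np.array([xy[0], xy[1], np.sqrt(max(z2, 0.0))])

# Illustrative check with a 2x2 matrix array and a sensor at (0.01, 0.02, 0.05) m:
elements = np.array([[-0.005, -0.005], [0.005, -0.005],
                     [-0.005, 0.005], [0.005, 0.005]])
true_pos = np.array([0.01, 0.02, 0.05])
dists = np.linalg.norm(np.c_[elements, np.zeros(4)] - true_pos, axis=1)
print(locate_sensor(elements, dists / SPEED_OF_SOUND_M_S))  # ~[0.01, 0.02, 0.05]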
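Claim 8 describes a dynamic heart model with one deforming sub-model per motion phase,
where the mapping unit determines the current phase and maps the sensor position onto
the matching sub-model. Below is a minimal sketch of such a phase lookup; deriving the
phase from the R-R interval of an ECG signal and using equal phase bins are illustrative
assumptions and not prescribed by the claim.

import bisect

class DynamicHeartModel:
    """Toy container holding one deforming sub-model per cardiac phase bin."""
    def __init__(self, sub_models):
        # sub_models: list of per-phase sub-models (e.g. surface meshes), covering
        # the cardiac cycle in equal phase bins from 0.0 to 1.0.
        self.sub_models = sub_models

    def sub_model_for_phase(self, phase):
        """Return the sub-model for a normalised cardiac phase in [0, 1)."""
        index = int(phase * len(self.sub_models)) % len(self.sub_models)
        return self.sub_models[index]

def cardiac_phase(t_now, r_peak_times):
    """Normalised phase in [0, 1) of time t_now within the current R-R interval.

    r_peak_times must be a sorted list of detected ECG R-peak times [s]."""
    i = bisect.bisect_right(r_peak_times, t_now) - 1
    if i < 0 or i + 1 >= len(r_peak_times):
        raise ValueError("t_now is not bracketed by two detected R-peaks")
    t0, t1 = r_peak_times[i], r_peak_times[i + 1]
    return (t_now - t0) / (t1 - t0)

# Usage (illustrative): pick the sub-model matching the tracking sample's timestamp.
# model = DynamicHeartModel(per_phase_meshes)
# phase = cardiac_phase(tracking_timestamp, detected_r_peaks)
# current_mesh = model.sub_model_for_phase(phase)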
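The mapping step (64) of claim 14 places the sensor position, which is determined in the
coordinate frame of the ultrasound probe (2), onto the heart model (21). A minimal sketch
is given below, under the assumption that a rigid registration of the probe frame to the
model frame is available as a 4x4 homogeneous transform (for instance from the
image-to-model mapping of claims 3 to 5); the matrix and function names are illustrative.

import numpy as np

def to_model_frame(T_model_from_probe, point_in_probe):
    """Apply a 4x4 homogeneous transform to a 3-D point given in the probe frame."""
    p = np.append(np.asarray(point_in_probe, dtype=float), 1.0)
    return (T_model_from_probe @ p)[:3]

def mark_on_model(model_vertices, sensor_in_model):
    """Return the index of the model vertex closest to the mapped sensor position,
    e.g. to highlight the device tip on a rendered heart model."""
    d = np.linalg.norm(model_vertices - sensor_in_model, axis=1)
    return int(np.argmin(d))

# Usage (illustrative): a sensor position from time-of-flight tracking, expressed in
# the probe frame, is mapped into the model frame and marked on the model mesh.
# sensor_model = to_model_frame(T_model_from_probe, sensor_in_probe)
# tip_vertex = mark_on_model(heart_mesh_vertices, sensor_model)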
1. A system for assisting a user in navigating a cardiac medical device (1) in a cardiac
region of a patient body, the system comprising:
- a cardiac medical device (1),
- a 3D model providing unit (5) configured to provide a three-dimensional heart model
(21) of the cardiac region of the patient body,
- an ultrasound probe (2) for acquiring image signals of the cardiac region of the
patient body and an ultrasound unit (3) configured to provide live images of the cardiac
region of the patient body on the basis of the image signals,
- at least one ultrasound sensor (6) attached to the cardiac medical device (1) for
detecting ultrasound signals emitted by the ultrasound probe (2),
- a tracking unit (7) configured to determine a relative position of the at least one
ultrasound sensor (6) with respect to the live images and/or the ultrasound probe (2)
on the basis of the detected ultrasound signals, and
- a mapping unit (8) configured to map the determined relative position of the at least
one ultrasound sensor (6) onto the heart model (21) in order to generate a visualization
of the cardiac region of the patient body on the basis of the heart model and on the
basis of the result of the mapping.
2. The system as defined in claim 1, wherein the mapping unit (8) is configured to
generate a visualization of the heart model (21) in which the position of the at least
one ultrasound sensor (6) is marked.
3. The system as defined in claim 2, wherein the mapping unit (8) is configured to map
the live images onto the heart model (21) and to overlay the heart model (21) with the
live images in the visualization on the basis of the result of this mapping.
4. The system as defined in claim 3, wherein the mapping unit (8) is configured to map
the live images onto the heart model (21) on the basis of an image comparison of the
live images and the heart model.
5. The system as defined in claim 3, wherein the mapping unit (8) is configured to map
the live images onto the heart model (21) on the basis of a relative position and
orientation of the ultrasound probe (2) with respect to a frame of reference associated
with the heart model (21).
6. The system as defined in claim 5, wherein the 3D model providing unit (5) is
configured to create the heart model (21) using ultrasound images acquired using the
ultrasound probe (2) during an initialization phase in which a further ultrasound sensor
(6) is positioned at a reference position, and wherein the frame of reference is defined
on the basis of a relative position and orientation of the ultrasound probe (2) with
respect to the further ultrasound sensor (6) determined on the basis of the ultrasound
signals detected by the further ultrasound sensor (6).
7. The system as defined in claims 5 and 6, wherein the further ultrasound sensor (6)
is positioned at the reference position during the acquisition of the live images, and
wherein the mapping unit (8) is configured to determine the relative position and
orientation of the ultrasound probe (2) with respect to the frame of reference on the
basis of the relative position and/or orientation of the further ultrasound sensor (6)
with respect to the ultrasound probe (2).
8. The system as defined in claim 1, wherein the cardiac region of the patient body
undergoes a periodic motion having different motion phases, wherein the heart model
(21) is a dynamic model comprising a deforming sub-model for each of the motion phases,
and wherein the mapping unit (8) is configured to determine a current motion phase and
to map the relative position of the at least one ultrasound sensor (6) onto the
deforming sub-model for the current motion phase.
9. The system as defined in claim 1, wherein the cardiac medical device (1) is configured
to carry out electrical measurements in order to generate an electro-anatomical map of
the cardiac region of the patient body, and wherein the mapping unit (8) is configured
to overlay the electro-anatomical map on the heart model (21) on the basis of the
relative position of the at least one ultrasound sensor (6) with respect to the
ultrasound probe (2) during the measurements.
10. The system as defined in claim 1, wherein the mapping unit (8) is configured to
generate a visualization of the heart model (21) corresponding to a view as seen by a
virtual eye positioned at the position of the at least one ultrasound sensor (6).
11. The system as defined in claim 10, wherein the mapping unit (8) is configured to map
the live images onto the view and to overlay the view with the live images in the
visualization on the basis of the result of the mapping.
12. The system as defined in claim 10, wherein the mapping unit (8) is configured to
generate the visualization on the basis of a mapping of the live images and/or of the
position and orientation of the ultrasound probe (2) onto the heart model (21), and on
the basis of the relative position and orientation of the at least one ultrasound
sensor (6) with respect to the ultrasound probe (2).
13. The system as defined in claim 1, wherein the ultrasound probe (2) is configured to
emit ultrasound signals into different directions, and wherein the tracking unit (7)
is configured to determine the position of the at least one ultrasound sensor (6) on
the basis of a level of reception of the ultrasound signals in the ultrasound sensor
(6), and/or wherein the tracking unit (7) is configured to determine the position of
the at least one ultrasound sensor (6) on the basis of a time difference between the
emission of the ultrasound signals by the ultrasound probe (2) and their detection by
the ultrasound sensor (6).
14. A computer program comprising program code for causing a computer device to carry
out a method when the computer program is executed on the computer device, the method
being a method for assisting a user in navigating a cardiac medical device (1) having
at least one ultrasound sensor (6) attached thereto in a cardiac region of a patient
body, the method comprising:
- providing (61) a three-dimensional heart model (21) of the cardiac region of the
patient body,
- receiving (62) live images of the cardiac region of the patient body on the basis of
image signals acquired using the ultrasound probe (2),
- determining (63) a relative position of the at least one ultrasound sensor (6)
attached to the cardiac medical device (1) with respect to the ultrasound probe (2),
the ultrasound sensor (6) detecting ultrasound signals emitted by the ultrasound probe
(2),
- mapping (64) the determined relative position of the at least one ultrasound sensor
(6) onto the heart model in order to generate (85) a visualization of the cardiac
region of the patient body on the basis of the heart model (21) and on the basis of
the result of the mapping.
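Claims 5 to 7 map the live images onto the heart model (21) using the relative position
and orientation of the ultrasound probe (2) with respect to a frame of reference that is
tied to a further ultrasound sensor held at a fixed reference position. The sketch below
illustrates only the underlying composition of rigid 4x4 homogeneous transforms; how the
individual transforms are measured is not part of the sketch, and the matrix names are
illustrative assumptions.

import numpy as np

def invert_rigid(T):
    """Invert a rigid 4x4 homogeneous transform (rotation plus translation)."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def probe_in_reference_frame(T_probe_from_ref_sensor, T_model_from_ref_sensor):
    """Pose of the probe in the model/reference frame.

    T_probe_from_ref_sensor : pose of the further (reference) sensor in the probe
                              frame, tracked from its detected ultrasound signals.
    T_model_from_ref_sensor : pose of the reference position in the model frame,
                              fixed when the model was built during initialization.
    """
    T_ref_sensor_from_probe = invert_rigid(T_probe_from_ref_sensor)
    return T_model_from_ref_sensor @ T_ref_sensor_from_probe

# An image point given in probe coordinates can then be placed on the model:
# T_model_from_probe = probe_in_reference_frame(T_probe_from_ref, T_model_from_ref)
# point_on_model = (T_model_from_probe @ np.append(image_point_probe, 1.0))[:3]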
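Claim 10 generates a visualization of the heart model (21) corresponding to the view of
a virtual eye placed at the position of the at least one ultrasound sensor (6), e.g. a
view from the catheter tip. A minimal sketch of building the corresponding look-at view
matrix is given below; the choice of view target and up vector is an illustrative
assumption, not part of the claim.

import numpy as np

def look_at(eye, target, up=(0.0, 0.0, 1.0)):
    """Right-handed look-at view matrix mapping model coordinates to eye coordinates.

    eye is the mapped sensor position; target is a point of interest on the heart
    model, e.g. the structure the device is steered towards."""
    eye = np.asarray(eye, dtype=float)
    forward = np.asarray(target, dtype=float) - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, np.asarray(up, dtype=float))
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
    view[:3, 3] = -view[:3, :3] @ eye
    return view

# Usage (illustrative): render the heart model with this view matrix so that the
# scene is shown as seen from the tracked device tip.
# V = look_at(eye=sensor_position_in_model, target=region_of_interest_in_model)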
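Claim 4 maps the live images onto the heart model (21) on the basis of an image
comparison between the live images and the model. The sketch below shows one possible
similarity measure, normalized cross-correlation between a live image and a simulated
view of the model for a candidate probe pose; the pose search is only indicated, and
the render_model_slice callback is a hypothetical placeholder, not a function of any
real library.

import numpy as np

def normalized_cross_correlation(a, b):
    """Similarity of two equally sized 2-D images, in [-1, 1]."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def best_pose(live_image, candidate_poses, render_model_slice):
    """Pick the candidate probe pose whose simulated model slice is most similar to
    the live ultrasound image. render_model_slice(pose) is a hypothetical callback
    that renders the heart model as it would be imaged from the given probe pose."""
    scores = [normalized_cross_correlation(live_image, render_model_slice(pose))
              for pose in candidate_poses]
    return candidate_poses[int(np.argmax(scores))]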