(11) EP 2 577 229 B1

(12) EUROPEAN PATENT SPECIFICATION

(45) Mention of the grant of the patent:
11.07.2018 Bulletin 2018/28

(21) Application number: 11724013.5

(22) Date of filing: 27.05.2011

(51) International Patent Classification (IPC):
G01C 21/00 (2006.01)    G06T 15/00 (2011.01)
B64C 39/00 (2006.01)    G09B 9/36 (2006.01)

(86) International application number:
PCT/GB2011/051015

(87) International publication number:
WO 2011/148199 (01.12.2011 Gazette 2011/48)

(54) SIMULATING A TERRAIN VIEW FROM AN AIRBORNE POINT OF VIEW

SIMULIERUNG EINER GELÄNDEANSICHT AUS EINER LUFTANSICHT

SIMULATION D'UNE VUE DE TERRAIN DU POINT DE VUE D'UN OBJET AÉROPORTÉ


(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

(30) Priority: 28.05.2010 EP 10275056
28.05.2010 GB 201008950

(43) Date of publication of application:
10.04.2013 Bulletin 2013/15

(73) Proprietor: BAE Systems PLC
London SW1Y 5AD (GB)

(72) Inventors:
  • STANNARD, Andrew, John
    Preston Lancashire PR4 1AX (GB)
  • GREEN, Mark
    Preston Lancashire PR4 1AX (GB)
  • SNAPE, John
    Lancashire PR4 1AX (GB)


(56) References cited:
US-A-5 187 754
US-A-5 995 903
US-A-5 904 724
US-A-6 053 736
    Note: Within nine months from the publication of the mention of the grant of the European patent, any person may give notice to the European Patent Office of opposition to the European patent granted. Notice of opposition shall be filed in a written reasoned statement. It shall not be deemed to have been filed until the opposition fee has been paid. (Art. 99(1) European Patent Convention).


    Description


    [0001] The present invention relates to a method of simulating a terrain view from the point of view of an airborne object. In particular, a terrain view is generated by correlating position and orientation information relating to the airborne object, or to an aircraft on which the object is mounted, with a geo-referenced model of the terrain.

    [0002] Camera pods are often employed on aircraft, particularly military aircraft, to provide the pilot and/or co-pilot with terrain views and for reconnaissance purposes. While extremely useful for such purposes, camera pods are very expensive, which prevents widespread deployment; they also require significant maintenance and bring their own power requirements and weight implications.

    [0003] The lack of such widespread deployment means that many of the advantages of such camera pods cannot be enjoyed by the majority of personnel. For example, while flying at night, a pilot can be presented with an infrared view of the terrain below, assisting in the identification of terrain, landmarks and targets. Furthermore, mission traces can be presented to the pilot and/or co-pilot, which can be quickly correlated with what the pilot can actually see.

    [0004] Furthermore, in joint-training of aircraft pilots and ground based personnel, it is extremely helpful to display, at ground-level, real-time footage from the point of view of the pilot. This allows the ground based personnel to see what the pilot sees, and to act and advise accordingly.

    [0005] To date, such real-time footage has been achieved by way of a video stream between the aircraft (transmitting data from the camera pod) and the ground. However, such transmissions require high bandwidth communication links which are costly and whose effectiveness reduces with increasing transmission distance due to signal degradation. In addition, in poor weather conditions such video feeds may be of limited use due to low visibility of the terrain.

    [0006] US 5187754 discloses a method for generating a composite terrain map from an overview image taken at relatively high altitude and photographs taken at relatively low altitudes. The resulting composite terrain map is relatively free of step irradiance variations where the low-altitude photographs are splined together.

    [0007] US 5904724 discloses a method and apparatus that allow a remote aircraft to be controlled by a remotely located pilot who is presented with a synthesized three-dimensional projected view representing the environment around the remote aircraft. According to one aspect of this prior art, the remote aircraft transmits its three-dimensional position and orientation to a remote pilot station. The remote pilot station applies this information to a digital database containing a three-dimensional description of the environment around the remote aircraft to present the remote pilot with a three-dimensional projected view of that environment.

    [0008] US 5995903 discloses a digital computer system for displaying a computer-generated terrain representing a three-dimensional depiction of the real-world terrain surrounding a vehicle. The 3D image is rendered in real time while the vehicle is in motion, using Global Positioning System (GPS) or differential GPS (dGPS) data available from a GPS unit, which is translated into virtual space within an Image Generation Processing block of the digital computer system.

    [0009] According to a first aspect of the present invention, there is provided a method of simulating views of a terrain from an airborne object according to claim 1. The position and orientation information may be generated at the airborne object and transmitted to the remote location for correlation with the geo-referenced terrain model at the remote location.

    [0010] Preferably, the airborne object is a virtual camera pod.

    [0011] The remote location may be at ground level.

    [0012] The step of obtaining position and orientation information relating to the airborne object may comprise receiving at said remote location said information via telemetry.

    [0013] The geo-referenced model obtained at the ground level and the geo-referenced model obtained onboard the airborne object, or aircraft on which the airborne object is mounted, comprise the same geo-referenced terrain model.

    [0014] The step of obtaining position and orientation information relating to the airborne object may comprise receiving said information via telemetry. Optionally, said position and orientation information is determined via GPS, LORAN or equivalent.

    [0015] Alternatively, the step of obtaining position and orientation information relating to the airborne object may comprise the step of detecting the position and orientation of the airborne object, or aircraft on which the airborne object is mounted, at ground level, for example via radar, LIDAR or equivalent.

    [0016] The position and orientation information relating to the airborne object may be comprised in a viewing frustum representative of a field of view from the airborne object.

    [0017] Optionally, calculation of the viewing frustum includes the step of determining the position and orientation of the aircraft on which the airborne object is mounted and incorporating an offset between the determined position and orientation of the aircraft and a relative position and viewing direction of a camera pod or a pilot's head.

    [0018] The step of generating the simulated view may further comprise the step of obtaining moding information relating to the simulated camera pod, said moding information selected from the group comprising infra-red mode, tracking mode, display contrast, display gain and zoom level.

    [0019] The step of obtaining a geo-referenced model of the terrain preferably comprises correlating aerial images of the terrain with terrain height data.
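    By way of illustration only, and not forming part of the patent disclosure, the correlation of aerial imagery with terrain height data might be sketched as below. The array shapes, the affine geo-transform and all names are assumptions made for this example.

    import numpy as np

    def build_terrain_mesh(heights, geo_transform):
        """heights: (rows, cols) elevation grid; geo_transform: (x0, dx, y0, dy)
        mapping grid indices to map coordinates."""
        rows, cols = heights.shape
        x0, dx, y0, dy = geo_transform
        jj, ii = np.meshgrid(np.arange(cols), np.arange(rows))
        xs, ys = x0 + jj * dx, y0 + ii * dy
        # One vertex per height sample, positioned in map coordinates.
        vertices = np.stack([xs, ys, heights], axis=-1).reshape(-1, 3)
        # Texture coordinates in [0, 1] index the ortho-rectified aerial
        # image covering the same geographic extent, draping it over the mesh.
        uvs = np.stack([jj / (cols - 1), ii / (rows - 1)], axis=-1).reshape(-1, 2)
        return vertices, uvs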

    [0020] Optionally, the step of obtaining a geo-referenced model of the terrain comprises incorporating 3D models of objects located in the terrain. Additionally, or alternatively, the step of obtaining a geo-referenced model of the terrain comprises incorporating 3D models of virtual entities in the terrain.

    [0021] According to a second aspect of the present invention, there is provided a terrain view simulation system for simulating a view from an airborne object according to claim 12. The position and orientation information may relate to at least one of a simulated camera pod and an aircraft upon which the object is mounted. The airborne object, or aircraft upon which the object is mounted, comprises said first computer and said first display.

    [0022] The geo-referenced terrain model is common to both the first and second computers such that the generated simulated views are the same.

    [0023] The system may be configured to interface with the aircraft on which the object is mounted, and to obtain the position and direction information from the aircraft.

    [0024] The terrain view simulation system preferably further comprises a communication system adapted to communicate said position and orientation information from the airborne object to the second computer.

    [0025] Alternatively, the terrain view simulation system further comprises a detection and ranging system to determine the position and direction of the airborne object.

    [0026] The system may further comprise a human computer interface device configured to control parameters of the airborne object, said parameters selected from the group comprising; position, orientation, zoom level, viewing frustum and operational mode settings.

    [0027] The present invention will now be described by way of example only and with reference to the accompanying figures in which:

    Figure 1 illustrates in schematic form the use of geo-referenced terrain models in an airborne environment and also a ground-level environment in accordance with an aspect of the present invention; and

    Figure 2 illustrates in schematic form an exemplary architecture for carrying out the simulation of the view from a virtual camera pod mounted on an aircraft on both a cockpit display and a ground-based display in accordance with an aspect of the present invention.



    [0028] Figure 1 presents a schematic view of a terrain view simulation system that functions to present to an observer a view of a terrain from the point of view of an airborne object, in this case a virtual camera pod, as described in detail below.

    [0029] The system can be thought of as comprising an airborne environment 1 and a ground environment 3. The airborne environment includes an aircraft 5 with an onboard computer 7 and a cockpit display 9 operably connected to the onboard computer 7. A geo-referenced terrain model 11 is stored on the onboard computer 7 or on readable media connected thereto.

    [0030] The aircraft is capable of determining its own position and orientation and the onboard computer 7, which for example might be a ruggedized PC, is configured to interface with the aircraft's systems so as to obtain this position and orientation information. The position and orientation of a virtual camera pod 6 is then determined using a predetermined position (and optionally orientation) offset.
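    A minimal sketch, offered for illustration only, of how such a predetermined offset might be applied; the Euler-angle convention, the reference frames and all names are assumptions rather than the patent's implementation.

    import numpy as np

    def yaw_pitch_roll_matrix(yaw, pitch, roll):
        # Rotation matrix from Euler angles in radians (Z-Y-X convention, assumed).
        cy, sy = np.cos(yaw), np.sin(yaw)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cr, sr = np.cos(roll), np.sin(roll)
        rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        return rz @ ry @ rx

    def pod_pose(aircraft_pos, aircraft_att, mount_offset, mount_att):
        # Compose the virtual pod pose from the aircraft pose plus a fixed
        # body-frame mounting offset and a relative pod attitude.
        r_ac = yaw_pitch_roll_matrix(*aircraft_att)
        pod_pos = aircraft_pos + r_ac @ mount_offset  # offset rotated into the world frame
        pod_rot = r_ac @ yaw_pitch_roll_matrix(*mount_att)
        return pod_pos, pod_rot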

    [0031] Additionally, the ground environment 3 includes a receiver 13 which receives the position and orientation information relating to the camera pod 6 via telemetry. This information is relayed to a ground based computer 17 having a connected display 19. Similarly to the onboard computer 7 on the aircraft 5, a geo-referenced terrain model 11' is stored on the ground based computer 17 or on readable media connected thereto.

    [0032] Both the onboard computer 7 and the ground based computer 17 correlate the position and orientation information relating to the virtual camera pod 6 with the respective geo-referenced terrain models 11,11' to render a simulated view of the terrain from the point of view of the virtual camera pod 6. This simulated view is displayed simultaneously on the cockpit display 9 and on the ground based display 19.
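    Purely as an illustrative sketch of this correlation step (not the patent's code), a renderer might turn the pod pose from the previous sketch into a world-to-camera view matrix before drawing the terrain model:

    import numpy as np

    def view_matrix(pod_pos, pod_rot):
        # 4x4 view matrix: the inverse of the camera's world transform, so
        # terrain vertices are mapped into the virtual camera pod's frame.
        m = np.eye(4)
        m[:3, :3] = pod_rot.T             # inverse rotation
        m[:3, 3] = -pod_rot.T @ pod_pos   # inverse translation
        return m

    Because both computers hold the same terrain model, feeding them the same pose yields the same rendered image, which is the property the system relies upon.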

    [0033] The geo-referenced terrain models 11,11' are common to both the airborne 1 and the ground 3 environments. Accordingly, the simulated views on the cockpit display 9 and the ground based display 19 convey the same information. For practical reasons the actual images displayed may differ in some details (for example there may be overlays presented to the pilot that are not presented to ground based personnel, and vice versa), but the effect is that a ground based observer can view the same image of the terrain that the pilot of the aircraft is presented with in the cockpit.

    [0034] For low-bandwidth operation, it is beneficial that only position and orientation information is transmitted. Conventional systems employ video feeds which require not only a high-bandwidth communications link but also additional expensive camera equipment; the present invention can instead be piggy-backed onto conventional telemetry. Other information that it is beneficial to transmit is so-called moding information, that is, information relating to the operational mode of the virtual camera pod: for example, whether a virtual infra-red mode is on or off, the tracking mode, display contrast and/or gain (to further enhance the correlation between the cockpit display and the ground based display), cross-hair visibility and/or position(s), etc.
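    To make the bandwidth argument concrete, a fixed-layout state message might look as follows; the field layout, update rate and names are illustrative assumptions, not a format disclosed in the patent.

    import struct

    # lat, lon (double); alt, yaw, pitch, roll (float); infra-red on/off,
    # tracking mode, cross-hair visibility (byte flags); zoom level (float).
    STATE_MSG = struct.Struct("<ddffffBBBf")

    def pack_state(lat, lon, alt, yaw, pitch, roll, ir_on, track, xhair, zoom):
        return STATE_MSG.pack(lat, lon, alt, yaw, pitch, roll, ir_on, track, xhair, zoom)

    # 39 bytes per update; even at 50 updates per second this is roughly
    # 16 kbit/s, orders of magnitude below a live video stream.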

    [0035] It is also beneficial to represent the position and orientation information by a viewing frustum. A viewing frustum is the field of view of a notional camera or observer in three-dimensional space. This information still requires far less bandwidth than a video stream as it can be represented simply, for example, by a field of view angle, an aspect ratio, and near and far bounding planes.
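    The compactness of that representation is easy to see in code; a sketch of the frustum description listed above, with names assumed for illustration:

    from dataclasses import dataclass

    @dataclass
    class ViewingFrustum:
        fov_y_deg: float  # field of view angle
        aspect: float     # aspect ratio (width / height)
        near: float       # near bounding plane distance
        far: float        # far bounding plane distance

    # Four scalars describe the frustum shape; together with the transmitted
    # position and orientation, a remote renderer can reconstruct the full
    # field of view from the airborne object.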

    [0036] This also benefits the rendering process which produces the simulated view of the terrain, because the portions of the geo-referenced model which lie outside the viewing frustum can be removed (or disregarded) and the rendering process carried out only on the portion of the geo-referenced model bounded by the viewing frustum.
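    An illustrative sketch of such culling, assuming terrain tiles with bounding spheres; the plane extraction from a combined view-projection matrix is the standard Gribb/Hartmann technique, assumed here rather than taken from the patent.

    import numpy as np

    def frustum_planes(view_proj):
        # Six planes (a, b, c, d), inward-facing, extracted from the rows
        # of a 4x4 combined view-projection matrix.
        m = view_proj
        planes = []
        for row, sign in ((0, 1), (0, -1), (1, 1), (1, -1), (2, 1), (2, -1)):
            p = m[3] + sign * m[row]
            planes.append(p / np.linalg.norm(p[:3]))
        return planes

    def tile_visible(centre, radius, planes):
        # A terrain tile is kept unless its bounding sphere lies entirely
        # behind any one frustum plane.
        return all(np.dot(p[:3], centre) + p[3] >= -radius for p in planes)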

    [0037] An example of such a terrain view simulation system which has been demonstrated in practice is described in detail below with reference to Fig. 2.

    [0038] Again, the system can be thought of as comprising an airborne environment 101 and a ground based environment 103. A Tornado GR4 aircraft 105 was equipped with a ruggedized PC 107 containing a geo-referenced terrain model 111 representative of North-West England, and in particular of the locality of Lytham St Annes, for which additional information such as 3D models of buildings in the area was also included.

    [0039] A ground based visualisation facility (indicated by reference numeral 117) was provided with the same geo-referenced terrain model 111', and a telemetry link via telemetry station 113 to the aircraft 105.

    [0040] During the above-mentioned demonstration, terrain view simulation software installed on the ruggedized PC 107 generated a real-time simulated view from a virtual camera pod 106 pointed in a direction chosen by the pilot, and presented this view to the pilot on cockpit display 109. As the geo-referenced model was based on aerial photography of the area and the aforementioned building models, the simulated view corresponded closely with the view that would have been generated by a multi-million pound camera pod pointed in the same direction, but at nowhere near the associated cost.

    [0041] At ground level, the position and orientation information was received via telemetry at telemetry station 113. By using the same information as used to generate the simulated view on the cockpit display 109, and the same geo-referenced terrain model 111', the ground based visualisation facility 117 then re-generated, at ground level, a simulated view identical to that being displayed to the pilot on the cockpit display 109. Ground based personnel 121 were thus able to view, in real time, exactly what was being displayed to the pilot (on cockpit display 109) on ground based display 119.

    [0042] The use of geo-referenced terrain models 111,111' on board the aircraft 105 and at ground level 117 meant that both displays 109,119 were able to show views correlating closely to what would have been achieved had a real camera pod and high-bandwidth communication link been used. This was achieved without expensive equipment (and associated installation) costs, and made use of conventional telemetry.

    [0043] Furthermore, conventional camera pods are typically controlled using a small joystick located in the cockpit. In this demonstration, the ruggedized PC was interfaced with the onboard camera pod controls in such a way that not only was the pilot able to control the orientation of the virtual camera pod 106 using the joystick, but the aircraft was unable to distinguish the returning video feed from the ruggedized PC from that which would ordinarily be returned from a real camera pod mounted on the aircraft. This was achieved by simulating to the aircraft the various interfaces and protocols it would expect to see were it communicating with a real camera pod, both in terms of the control output from the aircraft and the data input to the aircraft, effectively replicating a camera pod interface. This also meant that the pilot did not require additional training in how to use the virtual camera pod.

    [0044] The system allowed the pilot in the above example to carry out an extremely effective training exercise. Furthermore, ground based personnel 121 were able to take advantage of a ground based display 119, displaying a view which corresponded to the view displayed on the cockpit display 109, to interact with the aircrew and provide training and advice on the basis of the simulated view. Within the simulated view, virtual features such as buildings or tanks etc. can be simulated such that, while the actual terrain may be relatively clear, the view presented to the pilot in the cockpit (and to the ground based personnel) contains targets which can be used for training purposes.

    [0045] There are numerous benefits obtained, in addition to those discussed in the foregoing description. For example, a pilot can be trained to use a camera pod without necessarily having to fit his aircraft with a real camera pod. This is not only beneficial in terms of the cost saving involved, but it also allows pilots to train in conditions where a real camera pod would be useless. For example, in conditions of extremely poor visibility, a real camera pod may present no useful information to the pilot, whereas a virtual camera pod view can simulate good weather conditions and permit training to continue. The converse is also true; pilots can train to use the camera pod in poor conditions even when the actual weather conditions are good.

    [0046] It is also envisaged that the geo-referenced model used may not correspond to the actual geographical location in which the aircraft is being flown. This would allow a pilot to train locally for missions in remote locations, and in an actual airborne aircraft rather than in a simulator, thus improving the realism and effectiveness of the training exercise. By way of example, the aircraft may be flying over Lytham St Annes while the geo-referenced model relates to a distant region in which a number of hostile targets are located. The pilot may therefore receive training in engaging the hostile targets (presented to him on the cockpit display) without having to leave the relative safety of domestic airspace.

    [0047] Additionally, weather and other environmental variables may be simulated on the display to give an authentic representation of the target environment, regardless of weather conditions in the training location. For example, it may be daylight with good visibility in the training location, but the pilot will be able to train for a night-time mission in a dusty environment. Conversely again, when actually carrying out a mission at night-time in a dusty environment, target identification and navigation may be assisted by presenting to the pilot a virtualised day-time and clear-sky view of the target area.

    [0048] It is anticipated, and it will be readily appreciated by the skilled person, that the examples described above in relation to a single aircraft and single ground based display may be extended, for example to multiple aircraft and/or multiple ground stations. It is also anticipated that "ground based" operations could be carried out on naval platforms, such as aircraft carriers or naval command and/or training vessels. This would facilitate coordinated training on large scales with associated cost savings because aircraft would not need to be fitted with expensive camera pods, and existing telemetry systems could be utilised. Furthermore, it is envisaged that a view in one aircraft could be reproduced in another aircraft.

    [0049] The viewing frustum may be calculated from determination of the position and orientation of the pilot's head (or a virtual position and orientation of a notional pilot's head) in real time. Calculation of the viewing frustum may also take into account an offset between a determined position and orientation of the aircraft and a relative position and orientation of the pilot's head. In either case, the view can be made to correspond to what the pilot is actually looking at rather than just what a virtual camera pod mounted, for example, on the aircraft undercarriage is pointed at.

    [0050] The position and orientation of the aircraft may be determined by one or more of a number of suitable systems. For example, on board GPS or LORAN may be employed to determine position and/or orientation information on board the aircraft. Alternatively, this information may be determined remotely; for example, via RADAR or LIDAR. Accordingly, a ground based system may be able to simulate the terrain view without the need to receive any information from an aircraft at all.

    [0051] Note that, for the purposes of the foregoing description and the appended claims, the term "ground level" should not be construed as limited to locations, objects, installations or the like actually on the ground, but to locations, objects, installations or the like where it is desirable to remotely simulate the view from an airborne object (such as an aircraft). As an example, it is possible that the simulation methods described herein may be carried out on a naval platform such as on board a ship or other sea-going vessel. It will be further appreciated that such ground level procedures and processes may equally be carried out on mobile, including airborne, platforms.

    [0052] Throughout the specification, unless the context demands otherwise, the terms "comprise" or "include", or variations such as "comprises" or "comprising", "includes" or "including" should be understood to imply the inclusion of a stated integer or group of integers, but not the exclusion of any other integer or group of integers.


    Claims

    1. A method of simulating views of a terrain from an airborne object, the method comprising the steps of:

    obtaining, at a remote location (3), a first geo-referenced model of the terrain (11');

    obtaining, at the airborne object in an airborne environment (1), a second geo-referenced model of the terrain (11), wherein the first and second geo-referenced models are common to both the airborne environment (1) and the remote location;

    obtaining, at the remote location (3), position and orientation information relating to the airborne object;

    obtaining, at the airborne object, the position and orientation information relating to the airborne object;

    correlating, at the remote location (3), the position and orientation information with the first geo-referenced terrain model (11') and generating a corresponding simulated view, from the airborne object, of the terrain;

    correlating, at the airborne object, the position and orientation information with the second geo-referenced terrain model (11) and generating said corresponding simulated view of the terrain; and

    simultaneously displaying the simulated view to a first observer at the remote location (3) and displaying the simulated view to a second observer aboard the airborne object or aboard an aircraft (5) on which the airborne object is mounted.


     
    2. The method of claim 1, wherein the position and orientation information is generated at the airborne object and transmitted to the remote location (3) for correlation with the first geo-referenced terrain model.
     
    3. The method of claim 2, wherein the remote location (3) is at ground level.
     
    4. The method of any of claims 1 to 3, wherein the step of obtaining, at the remote location (3), position and orientation information relating to the airborne object comprises receiving at said remote location (3) said information via telemetry.
     
    5. The method of any preceding claim, wherein said position and orientation information is determined via GPS or LORAN.
     
    6. The method of any of claims 1 to 5, wherein the step of obtaining position and orientation information relating to the airborne object, or aircraft (5) on which the object is mounted, comprises the step of detecting the position and orientation of the airborne object, or aircraft (5) on which the object is mounted, at ground level.
     
    7. The method of any preceding claim, wherein the position and orientation information relating to the airborne object is comprised in a viewing frustum representative of a field of view from the airborne object.
     
    8. The method of claim 7, wherein calculation of the viewing frustum includes the step of determining the position and orientation of the aircraft (5) on which the object is mounted and incorporating an offset between the determined position and orientation of the aircraft (5) and a relative position and viewing direction of a virtual camera pod or a pilot's head.
     
    9. The method of any preceding claim, wherein the step of generating the simulated view further comprises the step of obtaining moding information relating to a virtual camera pod, said moding information selected from the group comprising infra-red mode, tracking mode, display contrast, display gain and zoom level.
     
    10. The method of any preceding claim, wherein a step of obtaining a geo-referenced model of the terrain comprises correlating aerial images of the terrain with terrain height data.
     
    11. The method of any preceding claim, wherein the step of obtaining a geo-referenced model of the terrain comprises incorporating 3D models of objects and/or of virtual entities located in the terrain.
     
    12. A terrain view simulation system for simulating views of a terrain from an airborne object, comprising:

    a first geo-referenced model of the terrain (11');

    a first computer (17); and

    a first display (19) at a remote location; the system further comprising:

    a second geo-referenced model of the terrain (11);

    a second computer (7); and

    a second display (9) at an airborne environment, which is aboard the airborne object or aboard an aircraft (5) on which the airborne object is mounted; wherein

    the first and second geo-referenced models are common to both the airborne environment (1) and the remote location; and wherein

    the first computer (17) is configured to:

    receive position and orientation information relating to the airborne object at the remote location (3);

    correlate the position and orientation information with the first geo-referenced terrain model (11') to generate a corresponding simulated view, from the airborne object, of the terrain; and

    output said simulated view to the first display (19);

    the second computer (7) is configured to:

    receive the position and orientation information relating to the airborne object;

    correlate the position and orientation information with the second geo-referenced terrain model (11) to generate said corresponding simulated view, from the airborne object, of the terrain; and

    output said simulated view to the second display (9); and

    the first and second displays (9, 19) are configured to simultaneously display the simulated view to a first observer at the remote location (3) and a second observer aboard the airborne object or aboard the aircraft (5) on which the airborne object is mounted.


     
    13. The system of claim 12, wherein the position and orientation information relates to at least one of a simulated camera pod and an aircraft (5) upon which the object is mounted.
     
    14. The system of claims 12 or 13, wherein the position and orientation information is generated at the airborne object and the first computer (17) is configured to receive a transmission comprising said information for correlation with the first geo-referenced terrain model (11').
     
    15. The system of any of claims 12 to 14, configured to interface with an aircraft (5) on which the object is mounted, and to obtain the position and direction information from the aircraft (5).
     
    16. The system of any of claims 12 to 15, further comprising a communication system (13) adapted to communicate said position and orientation information from the airborne object to the first computer (17).
     
    17. The system of any of claims 12 to 16, further comprising a detection and ranging system to determine the position and direction of the airborne object.
     
    18. The system of any of claims 12 to 17, further comprising a human computer interface device configured to control parameters of the airborne object, said parameters selected from the group comprising; position, orientation, zoom level, viewing frustum and operational mode settings.
     






    Drawing

    Figure 1 (schematic): use of geo-referenced terrain models in an airborne environment and a ground-level environment.

    Figure 2 (schematic): exemplary architecture for simulating the view from a virtual camera pod mounted on an aircraft, on both a cockpit display and a ground-based display.
    Cited references

    REFERENCES CITED IN THE DESCRIPTION

    This list of references cited by the applicant is for the reader's convenience only. It does not form part of the European patent document. Even though great care has been taken in compiling the references, errors or omissions cannot be excluded and the EPO disclaims all liability in this regard.

    Patent documents cited in the description

    • US 5187754 A [0006]
    • US 5904724 A [0007]
    • US 5995903 A [0008]