(19)
(11)EP 3 722 749 A1

(12)EUROPEAN PATENT APPLICATION

(43)Date of publication:
14.10.2020 Bulletin 2020/42

(21)Application number: 19214594.4

(22)Date of filing:  09.12.2019
(51)International Patent Classification (IPC): 
G01C 21/16(2006.01)
G01C 25/00(2006.01)
G06T 7/277(2017.01)
(84)Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
Designated Extension States:
BA ME
Designated Validation States:
KH MA MD TN

(30)Priority: 09.04.2019 US 201916379210

(71)Applicant: ROSEMOUNT AEROSPACE INC.
Burnsville, MN 55306-4898 (US)

(72)Inventors:
  • ELLIS, David
    Plymouth, MN 55447 (US)
  • ROBERTS, Shannon
    Burnsville, MN 55337 (US)

(74)Representative: Dehns 
St. Bride's House 10 Salisbury Square
London EC4Y 8JD (GB)

  


(54)NAVIGATION AUGMENTATION SYSTEM AND METHOD


(57) A navigation augmentation system includes a vehicle (101) including an imaging device (102) operably connected to a navigation data fusion module (104) for receiving and analyzing visual point of interest data, gyroscope data, and accelerometer data, wherein the navigation data fusion module (104) is operably connected to a sensor compensation module and an autopilot module for controlling the navigation of the vehicle.




Description

Background


Technological Field



[0001] The present disclosure relates to a navigation augmentation system, and more particularly to a navigation augmentation system that operates without relying on GPS.

Description of Related Art



[0002] A variety of devices are known in the art of navigation systems for vehicles. Precision guided munitions require accurate estimates of position, velocity, and attitude in order to hit a designated target. Current weapon systems rely on GPS and high-cost IMUs. The new paradigm for precision guided munitions is GPS-denied operation, low-cost and low-volume IMUs, and maximum airframe performance. The system described herein eliminates major IMU error sources and enables the use of IMUs of lower cost and smaller volume than would otherwise be required.

[0003] The conventional methods and systems have generally been considered satisfactory for their intended purpose. However, there is still a need in the art for low-cost navigation systems operating in a GPS-denied environment but maintaining accurate position, velocity, and attitude estimates, which in turn improve seeker target acquisition capabilities. There also remains a need in the art for such systems and components that are economically viable. The present disclosure may provide a solution for at least one of these remaining challenges.

Summary of the Invention



[0004] A navigation augmentation system includes a vehicle, such as an aircraft, missile or projectile, including an imaging device operably connected to a navigation data fusion module configured for receiving and analyzing visual point of interest data, gyroscope data, and accelerometer data, wherein the navigation data fusion module is operably connected to a sensor compensation module and an autopilot module for controlling navigation of the vehicle. The accelerometer and the gyroscope can also be directly and operably connected to the sensor compensation module. The navigation data fusion module can include a Kalman filter.

[0005] The imaging device can include a series of cameras which can be interconnected or independent of each other, each communicating with the navigation data fusion module. Each imaging device can include a horizon sensor.

[0006] A method of augmenting navigation of a vehicle is also disclosed. The method includes updating a vehicle control command in response to receiving data from an autopilot module and data from a sensor compensation module for controlling a direction of travel of a vehicle, wherein the sensor compensation module receives bias and scale factor error estimates for a gyroscope and bias and scale factor error estimates for an accelerometer from a navigation data fusion system, wherein the navigation data fusion system receives and aggregates input from an integrator for the gyroscope, an integrator for the accelerometer, and an imaging device, and the vehicle changes direction based on the navigation updates. Roll and pitch of the vehicle can be determined by the imaging device when used as a horizon sensor, and can be used as inputs to the navigation data fusion system. The roll and pitch of the vehicle can be coupled with the visual point of interest data as an input to the data fusion system.

[0007] The imaging device can detect at least one point of interest in a first image and track the point of interest in a second image using an algorithm. The algorithm can be a SIFT (Scale-Invariant Feature Transform) algorithm, a FAST (Features from Accelerated Segment Test) algorithm, or a similar algorithm.

[0008] The method described above is intended to be used in areas where GPS data is inaccessible or intermittently accessible; as such, the bias and scale factor error estimates are calculated without using GPS data, and in particular the vehicle can operate in a GPS-denied area.

[0009] These and other features of the systems and methods of the subject disclosure will become more readily apparent to those skilled in the art from the following detailed description of the preferred embodiments taken in conjunction with the drawings.

Brief Description of the Drawings



[0010] So that those skilled in the art to which the subject invention appertains will readily understand how to make and use the devices and methods of the subject invention without undue experimentation, preferred embodiments thereof will be described in detail herein below with reference to certain figures, wherein:

Fig. 1 is a schematic view of a navigation augmentation system; and

Fig. 2 is a 2-D Camera Rotation/Translation Diagram for the navigation augmentation system of Fig. 1.


Detailed Description



[0011] Reference will now be made to the drawings wherein like reference numerals identify similar structural features or aspects of the subject invention. For purposes of explanation and illustration, and not limitation, a partial view of an exemplary embodiment of a navigation augmentation system in accordance with the invention is shown in Fig. 1 and is designated generally by reference character 100. The methods and systems of the invention can be used to bound the change in position and attitude by estimating the bias and scale factor errors of the accelerometers and gyroscopes of a vehicle in flight.

[0012] During flight, the imaging seeker is used to capture images at fixed intervals. Each recorded image is scanned for points of interest (POI) which can be tracked or matched in successive images. Given many such points, a least-squares estimate of the camera's change in roll, pitch, yaw, and translation direction can be calculated for each pair of images. The translation direction vector must be multiplied by a scaling factor, a, which can be determined using an additional information source, e.g. an altimeter reading, a velocity measurement, or a position measurement within the image. These can be used as aiding information to enhance navigation estimates.
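As a hedged illustration only (not part of the description above), the Python sketch below shows one way an altimeter measurement could fix the scaling factor a. It assumes the unit translation direction has already been resolved into a local North-East-Down frame, an assumption the description does not state, and the function name is hypothetical.

```python
import numpy as np

def scale_from_altimeter(t_unit_ned, delta_altitude):
    """Estimate the scaling factor a from the change in altitude between the two
    image frames. Assumes t_unit_ned is the unit translation direction already
    resolved in a local North-East-Down frame (illustrative assumption)."""
    t_down = t_unit_ned[2]
    if abs(t_down) < 1e-6:
        raise ValueError("nearly level translation: altitude alone cannot fix the scale")
    return -delta_altitude / t_down      # altitude decreases as the Down component grows

# e.g. a 12 m descent between frames with unit direction (0.80, 0.52, 0.30) in NED
a = scale_from_altimeter(np.array([0.80, 0.52, 0.30]), delta_altitude=-12.0)  # -> 40.0 m
```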

[0013] A navigation augmentation system 100 includes a vehicle 101, specifically an airborne vehicle such as an aircraft or a munition, including an imaging device 102 operably connected to a navigation data fusion module 104 configured for receiving and analyzing delta rotations and delta translations (from block 3), gyroscope data, and accelerometer data. The navigation data fusion module 104 is operably connected to a sensor compensation module 106 and an autopilot module 108 for controlling the vehicle 101. An accelerometer 110 and a gyroscope 112 are also directly and operably connected to the sensor compensation module 106. The navigation data fusion module 104 could be implemented as a Kalman filter.

[0014] The imaging device 102 can include a series of cameras which can be interconnected or independent of each other, each with independent translation/rotation estimation modules ((1)-(3)), each providing delta translation/rotation angles to the navigation data fusion module 104. Each imaging device can be used as a horizon sensor.

[0015] A method of augmenting navigation of a vehicle is also disclosed. Any feature detection method that provides a list of unique points can be used, so long as the same point can be detected or tracked in the second image, e.g. the FAST or the SIFT algorithm. The output of the image collection step can be a matrix of points, p, in homogeneous coordinates:

$$p = \begin{bmatrix} p_{1x} & p_{2x} & \cdots & p_{nx} \\ p_{1y} & p_{2y} & \cdots & p_{ny} \\ 1 & 1 & \cdots & 1 \end{bmatrix}$$
[0016] Where $p_{nx}$ is the pixel column of the nth point and $p_{ny}$ is the pixel row of the nth point. Each of these homogeneous points can be treated as a 3-dimensional pointing vector from the optical center of the camera towards the point in 3D space, where the camera is at the origin and its center pixel points along the Z axis, where the X axis aligns to the image rows and the Y axis aligns to the image columns.
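As a minimal sketch of this step (not taken from the patent text), the following Python uses OpenCV's FAST detector as one possible point-of-interest detector and stacks the detected pixel coordinates into the homogeneous matrix p described above; the function name and threshold are illustrative.

```python
import cv2
import numpy as np

def detect_points_homogeneous(image_gray):
    """Detect points of interest and return them as a 3xN matrix of homogeneous
    pixel coordinates [column; row; 1], matching the matrix p above."""
    detector = cv2.FastFeatureDetector_create(threshold=25)   # FAST corner detector
    keypoints = detector.detect(image_gray, None)
    if not keypoints:
        return np.zeros((3, 0))
    cols_rows = np.array([kp.pt for kp in keypoints], dtype=float).T  # kp.pt = (column, row)
    return np.vstack([cols_rows, np.ones((1, cols_rows.shape[1]))])   # shape (3, N)
```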

[0017] For each of the points identified in the first image, a matching point must be located in the second image. In the case where there is relatively little motion between successive camera frames, as in a high frame-rate video stream, the points can be located using a tracking algorithm. One method for tracking is to consider a small region of pixels around the POI of the first image and search the second image for the nearby region with the highest correlation:

$$(p_x',\, p_y') = \arg\max_{(x',\, y')} \sum_{(u,v) \in R} I(p_x + u,\, p_y + v)\; I'(x' + u,\, y' + v)$$

Where:

$p_x'$ and $p_y'$ are the corresponding column and row of point p in the second image

R is a small region of pixels around p

I is the first image

I' is the second image

[0018] Other methods can be used, such as the method above augmented with a Kalman filter, or the Kanade-Lucas-Tomasi (KLT) tracker. As an alternative to feature tracking, feature matching can also be performed.
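A minimal sketch of the region-correlation search described in paragraph [0017] is shown below (not from the patent). It assumes grayscale images stored as NumPy arrays, points given as (row, column) away from the image border, and uses a normalized correlation score, which is one common choice.

```python
import numpy as np

def track_point(img1, img2, row, col, half_win=7, search_radius=10):
    """Find the location in img2 that best matches the patch around (row, col)
    in img1 by maximizing a normalized correlation score over a search area."""
    tmpl = img1[row - half_win:row + half_win + 1,
                col - half_win:col + half_win + 1].astype(float)
    tmpl = (tmpl - tmpl.mean()) / (tmpl.std() + 1e-9)
    best_score, best = -np.inf, None
    for dr in range(-search_radius, search_radius + 1):
        for dc in range(-search_radius, search_radius + 1):
            r, c = row + dr, col + dc
            cand = img2[r - half_win:r + half_win + 1,
                        c - half_win:c + half_win + 1].astype(float)
            if cand.shape != tmpl.shape:       # candidate window falls off the image
                continue
            cand = (cand - cand.mean()) / (cand.std() + 1e-9)
            score = float((tmpl * cand).mean())
            if score > best_score:
                best_score, best = score, (r, c)
    return best, best_score                    # best is None if no valid window found
```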

[0019] The output of this step is a second matrix of points, p', in homogeneous coordinates:

$$p' = \begin{bmatrix} p_{1x}' & p_{2x}' & \cdots & p_{nx}' \\ p_{1y}' & p_{2y}' & \cdots & p_{ny}' \\ 1 & 1 & \cdots & 1 \end{bmatrix}$$

[0020] Where the correspondence between the points is maintained via their indices. If a matching point cannot be located in the second image, the point is removed from the list.

[0021] For every point in 3-dimensional space that is observed from two different camera locations, a plane can be defined using the point P and the optical centers (lenses) of each camera, O and O'.



[0022] The line OO' is the direction of translation between frames k and k+1, and the lines OP and O'P are the lines connecting each camera's optical center to the point in 3-dimensional space. Since these points define a plane, the following relation holds:

$$\vec{OP} \cdot \left( \vec{OO'} \times \vec{O'P} \right) = 0$$
[0023] This can be re-written using the coordinates of the first camera as:

$$p^{\top} \left[ t_{\times} \right] R\, p' = 0$$
[0024] Where t is the translation and R is the rotation matrix from the second camera orientation to the first camera orientation. The essential matrix is defined as:

$$\varepsilon = \left[ t_{\times} \right] R$$
[0025] Where $\left[ t_{\times} \right]$ is the skew-symmetric form of t, which implements the cross-product as a matrix multiplication. The essential matrix is defined using normalized camera coordinates, accounting for the camera's intrinsic parameters (focal length, etc.). The transformation between the essential matrix and the fundamental matrix is linear:

$$F = K^{-\top}\, \varepsilon\, K^{-1}$$
[0026] Where K is the camera calibration matrix. This will be defined and handled in the following step. For the moment, for each point (defined in pixel coordinates) that exists in both images, the following relation holds:

$$p^{\top} F\, p' = 0$$
[0027] Using the set of point correspondences acquired in the previous steps, a least-squares estimate of the fundamental matrix can be computed, which contains the desired rotation and translation information.

[0028] The equation can be rewritten to allow for a solution using standard linear least-squares optimization techniques:

$$A\,\mathbf{f} = 0$$

[0029] This is a system of homogeneous linear equations in the nine unknown entries of F:

Where

$$\mathbf{f} = \begin{bmatrix} F_{11} & F_{12} & F_{13} & F_{21} & F_{22} & F_{23} & F_{31} & F_{32} & F_{33} \end{bmatrix}^{\top}$$

And each row of A is formed from one point correspondence:

$$A_{n} = \begin{bmatrix} p_{nx} p_{nx}' & p_{nx} p_{ny}' & p_{nx} & p_{ny} p_{nx}' & p_{ny} p_{ny}' & p_{ny} & p_{nx}' & p_{ny}' & 1 \end{bmatrix}$$
[0030] The fundamental matrix can now be estimated using standard linear least-squares techniques.
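A minimal Python sketch of this least-squares estimate is given below (not from the patent); it assumes the p-transpose-F-p' convention used in the reconstructed equations above and omits the point normalization usually applied in practice.

```python
import numpy as np

def estimate_fundamental(p, p_prime):
    """Linear least-squares estimate of the fundamental matrix from matched
    homogeneous pixel points p and p' (each 3xN with N >= 8 correspondences)."""
    n = p.shape[1]
    A = np.zeros((n, 9))
    for i in range(n):
        x, y = p[0, i], p[1, i]
        xp, yp = p_prime[0, i], p_prime[1, i]
        # one row per correspondence, encoding p_i^T F p'_i = 0
        A[i] = [x * xp, x * yp, x, y * xp, y * yp, y, xp, yp, 1.0]
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)         # right singular vector of the smallest singular value
    U, S, Vt2 = np.linalg.svd(F)     # enforce the rank-2 constraint on F
    S[2] = 0.0
    return U @ np.diag(S) @ Vt2
```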

[0031] The fundamental matrix estimate does not account for the camera's focal length, focal center, and pixel size; however, it is linearly related to the desired essential matrix:

$$\varepsilon = K^{\top} F\, K$$
[0032] Where K is the camera calibration matrix (sometimes called the intrinsic matrix), which is predetermined and defined as:

$$K = \begin{bmatrix} f & 0 & c_{x} \\ 0 & f & c_{y} \\ 0 & 0 & 1 \end{bmatrix}$$
[0033] Where f is the focal length expressed in pixels and $c_{x}$ and $c_{y}$ are the pixel column and row aligned with the lens's optical axis.

[0034] To extract the rotation and translation vectors from the essential matrix, the singular value decomposition is utilized:

$$\varepsilon = U\, W\, V^{\top}$$
[0035] Where U and V are rotation matrices and W is a diagonal matrix containing the singular values. Due to the defined constraints of the essential matrix, it must have rank 2 and therefore contains only two non-zero singular values. This means that the third column of U defines the null space of ε. Since ε is defined as the product of one rotation matrix and one skew-symmetric matrix representing the translation vector, the vector t' is the null space of ε and is therefore the third column of U. It is defined only by its direction, and at this point its sign could be positive or negative.

[0036] There are also two possibilities for the rotation matrix R, depending on the ordering of the first two columns of U. Of the four total combinations of R and t', only one will result in the points being in front of the camera in both positions, and this provides the estimate of R and t'. The translation vector t' is a unit vector; therefore, the scaling factor a must be multiplied by t' in order to obtain a properly scaled delta relative position.



[0037] This information can then be used to extract the delta attitude and delta relative position between frames, which can then be applied as an aiding source to a navigation algorithm.
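The following Python sketch (not from the patent) illustrates one standard way to form the essential matrix from F and K and recover the two candidate rotations and the unit translation direction via the singular value decomposition; the cheirality (points-in-front-of-both-cameras) test that selects the single valid (R, t') pair, and the subsequent scaling by a, are omitted here.

```python
import numpy as np

def decompose_essential(F, f, cx, cy):
    """Form the essential matrix from F and K, then recover the two candidate
    rotations and the unit translation direction via the SVD."""
    K = np.array([[f, 0.0, cx],
                  [0.0, f, cy],
                  [0.0, 0.0, 1.0]])
    E = K.T @ F @ K                      # essential matrix (see paragraph [0031])
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:             # keep U and V proper rotations
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    R_candidates = (U @ W @ Vt, U @ W.T @ Vt)   # two rotation possibilities
    t_unit = U[:, 2]                            # null-space direction; sign is ambiguous
    return R_candidates, t_unit
```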

[0038] The method includes updating a vehicle control command from an autopilot module 108 and data from a sensor compensation module 106 for controlling a direction of travel of a vehicle, wherein the sensor compensation module 106 receives bias and scale factor error estimates for a gyroscope 112 and bias and scale factor error estimates for an accelerometer 110 from a navigation data fusion system 104, wherein the navigation data fusion system 104 receives and aggregates input from an integrator for the gyroscope 113, an integrator for the accelerometer 111, and an imaging device 102, and the vehicle changes direction based on the navigation updates. Change in position, velocity, and attitude of the vehicle are determined by the accelerometer and gyroscope integrators and used as inputs to the navigation data fusion system. The visual point of interest data are processed to estimate delta translation and rotation of the imager, which is used as an input to the data fusion system. The data fusion system compares the delta attitude and delta relative positions to the attitude and position changes estimated by the gyroscope and accelerometers. The difference between the two is attributed to bias and scale factor error of the gyroscope and accelerometers, as the imaging device has bias-free error characteristics. The sensor compensation module uses these error terms to compensate the gyroscope and accelerometer outputs, producing a compensated gyroscope output and a compensated accelerometer output.
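The patent does not reproduce its compensation equations here; as an assumed illustration only, the sketch below applies one common bias and scale-factor model, corrected = (measured - bias) / (1 + scale factor error), element-wise to raw gyroscope and accelerometer samples, with made-up numbers standing in for the fusion module's estimates.

```python
import numpy as np

def compensate(measured, bias, scale_factor_error):
    """corrected = (measured - bias) / (1 + scale factor error), element-wise.
    An assumed, commonly used model; not the patent's own equations."""
    measured = np.asarray(measured, dtype=float)
    return (measured - bias) / (1.0 + np.asarray(scale_factor_error, dtype=float))

# Illustrative (made-up) error estimates as they might arrive from the fusion module
omega_meas = np.array([0.012, -0.003, 0.250])                     # raw gyro, rad/s
omega_hat = compensate(omega_meas,
                       bias=np.array([0.001, -0.0005, 0.002]),
                       scale_factor_error=np.array([0.01, 0.01, 0.01]))
a_meas = np.array([0.05, -0.02, -9.79])                           # raw accel, m/s^2
a_hat = compensate(a_meas,
                   bias=np.array([0.03, -0.01, 0.02]),
                   scale_factor_error=np.array([0.005, 0.005, 0.005]))
```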
[0039] The imaging device 102 can detect at least one point of interest in a first image and track the point of interest in a second image. The detection algorithm can be the SIFT or the FAST algorithm. The system uses an imaging seeker to track multiple distinguishable terrain features (e.g. rivers, lakes, roads, shorelines, rock formations, vegetation, etc.) which are assumed to be on the surface and inertially fixed. An imaging sensor provides angle-only measurements (with respect to the body) to each feature. Tracking these features allows delta-position, delta-attitude, and velocity to be bounded while navigating via dead-reckoning, by using the features as an aid to navigation.

[0040] This system and method enhance integrated guidance units by improving attitude and position estimates and seeker acquisition capabilities. The system allows for the use of small, low-cost sensors, which can be calibrated post launch. The system and method disclosed above remove gyro and accelerometer scale factor and bias errors, which are major error sources of low-cost units, and improve navigation, airframe stability, and guidance accuracy.

[0041] The method described above is intended to be used in areas where GPS data is inaccessible or intermittently accessible; as such, the bias and scale factor error estimates are calculated without using GPS data.

[0042] The methods and systems of the present disclosure, as described above and shown in the drawings, provide for flight systems with superior properties including increased reliability and stability, and reduced size, weight, complexity, and/or cost. While the apparatus and methods of the subject disclosure have been shown and described with reference to embodiments, those skilled in the art will readily appreciate that changes and/or modifications may be made thereto without departing from the scope of the subject disclosure.


Claims

1. A navigation augmentation system comprising:

a vehicle (101) including an imaging device (102) operably connected to a navigation data fusion module (104) configured for receiving and analyzing visual point of interest data, gyroscope data, and accelerometer data;

wherein the navigation data fusion module (104) is operably connected to a sensor compensation module and an autopilot module for controlling the navigation of the vehicle.


 
2. The navigation augmentation system of claim 1, wherein the accelerometer is operably connected to the sensor compensation module.
 
3. The navigation system of claim 1 or 2, wherein the gyroscope is operably connected to the sensor compensation module.
 
4. The navigation system of any preceding claim, wherein the navigation data fusion module includes a Kalman filter.
 
5. The navigation system of any preceding claim, wherein the imaging device includes a series of independent cameras.
 
6. The navigation system of any preceding claim, wherein the imaging device includes a horizon sensor.
 
7. A method of augmenting navigation of a vehicle comprising:

updating a vehicle control command due to receiving data from an autopilot module, data from a sensor compensation module, and data from a guidance module for controlling a vehicle;

wherein the sensor compensation module receives bias and scale factor error estimates for a gyroscope and bias and scale factor error estimates for an accelerometer from a navigation data fusion system; and

the navigation data fusion system receiving and aggregating input from an integrator for a gyroscope, an integrator for an accelerometer, and an imaging device.
 
8. The method of claim 7, wherein the sensor compensation module adjusts the angular rate sensed by the gyroscope based upon the estimated gyroscope bias and scale factors.
 
9. The method of claim 7 or 8, wherein the sensor compensation module adjusts the acceleration sensed by the accelerometer based upon the estimated accelerometer bias and scale factors.
 
10. The method of any of claims 7-9 wherein the imaging device detects at least one point of interest in a first image and tracks the point of interest in a second image using a feature-tracking algorithm.
 
11. The method of claim 10, wherein the feature-tracking algorithm receives scale information from an altitude estimate, velocity estimate, or estimated distance in the imaging information to provide scale to at least one point of interest in the first image and the second image; and/or wherein the feature-tracking algorithm is a scale-invariant feature transform (SIFT) algorithm.
 
12. The method of any of claims 7-10, wherein an algorithm is configured to determine the change in attitude and the change in relative position between the first and second images by applying a least-squares fit to satisfy epipolar geometry of at least one point of interest in a first image and the point of interest tracked in a second image, and preferably wherein the navigation data fusion system receives from the algorithm an estimated attitude change independent of an estimated relative position change.
 
13. The method of any of claims 7-12 wherein the accelerometer supplies acceleration vector data directly to the sensor compensation module.
 
14. The method of any of claims 7-13, wherein the gyroscope supplies angular acceleration vector data directly to the sensor compensation module, and preferably wherein the navigation data fusion system compares estimated attitude and estimated relative position change to the gyroscope and accelerometer data to determine the bias and the scale factor errors.
 
15. The method of any of claims 7-14, wherein bias and scale factor error estimates are calculated without using GPS data; and/or wherein roll and pitch are determined and used as an input to the navigation data fusion system and the roll and pitch of the vehicle are coupled with change in attitude and change in relative position as an input to the navigation data fusion system; and/or wherein the vehicle changes direction based on the navigation updates.
 




Drawing










Search report








