TECHNICAL FIELD
[0001] The present invention relates to a navigation system that estimates the attitude
of a moving body in which a camera for surveys and a laser distance measuring device
are mounted, and a survey system provided with this navigation system.
BACKGROUND ART
[0002] Patent Literature 1, for example, discloses a survey system that performs photographic
surveying and airborne laser scanning by using a camera and a laser emitting and receiving
device which are mounted in a flying body.
[0003] In this survey system, the camera that shoots a survey target from the flying body
is supported by an attitude stabilizing device called a stabilizer, and the shooting
direction can be kept aligned with a vertically downward direction regardless of the
attitude of the flying body in flight.
[0004] Further, the laser emitting and receiving device projects laser light to a survey
target from the flying body at a predetermined period, and receives light reflected
from this survey target. A control device in this survey system performs the airborne
laser scanning by using information about the reflected light from the survey target
which is received by the laser emitting and receiving device. The laser emitting and
receiving device corresponds to a laser distance measuring device according to the
present invention.
[0005] In the airborne laser scanning, both three-dimensional coordinate data about the
flying body (the horizontal position and the altitude of the flying body) and information
showing the attitude of the flying body in flight are needed in addition to the above-mentioned
information. The three-dimensional coordinate data about the flying body, among these
pieces of information, is detected by a GNSS (Global Navigation Satellite System)
device mounted in the flying body. More specifically, the GNSS device receives GNSS
information from a GNSS satellite at a predetermined period, and analyzes this GNSS
information to acquire the three-dimensional coordinate data about the flying body.
[0006] On the other hand, the period at which the laser light is projected to a survey
target by the laser emitting and receiving device is shorter than the period at which
GNSS information is received by the GNSS device. Therefore, even when reflected light
from a survey target is received by the laser emitting and receiving device, the control
device cannot acquire the three-dimensional coordinate data about the flying body at
times that do not match the period at which GNSS information is received.
[0007] In contrast with this, in conventional typical airborne laser scanning, three-dimensional
coordinate data about a flying body are acquired at a certain period other than the
period at which GNSS information is received, by using information about acceleration
along three axes and angular acceleration along three axes which are measured by an
IMU (Inertial Measurement Unit) mounted in the flying body.
[0008] However, because the IMU is very expensive and is relatively heavy, a limitation
is imposed on the types of flying bodies into which this IMU can be incorporated.
[0009] Accordingly, in the survey system described in Patent Literature 1, instead of the
IMU, an accelerometer and an angular accelerometer which are less expensive and smaller
than the IMU are disposed.
[0010] More specifically, this survey system acquires three-dimensional coordinate data
about a flying body at a certain period not matching the period at which GNSS information
is received, by using both information about acceleration along three axes from the
accelerometer and information about angular acceleration along three axes from the
angular accelerometer.
[0011] Further, the information showing the attitude of the flying body consists of the
angles of the flying body in the rolling direction, the pitching direction and the yawing
direction (referred to as a roll angle, a pitch angle and a yaw angle from here on),
and values acquired by bundle calculation for corresponding points of images shot from
two or more different positions by a camera are used as these angles. The control device
calculates an attitude of the flying body at each scan period of the laser light (this
period does not match the period at which the GNSS information is received) from the
attitude of the flying body acquired by the bundle calculation, by using both the
acceleration from the accelerometer and the angular acceleration from the angular
accelerometer.
[0012] Patent Literature 2 describes a method of calculating the position and attitude
of aerial photographs. In the method, after first and second aerial photographs
successively taken in flight are mutually located, a topological model corresponding
to these aerial photographs is read, and the positions and attitudes of the first and
second aerial photographs are obtained by absolute plots in which the brightnesses of
both photographs coincide in the topological model, based on a point inputted on the
topological model.
[0013] Patent Literature 3 describes an indoor navigation system which is based on a multi-beam
laser projector, a set of calibrated cameras, and a processor that uses knowledge
of the projector design and data on laser spot locations observed by the cameras to
solve the space resection problem to find the location and orientation of the projector.
[0014] Patent Literature 4 describes a motion control method for an imaging means that
makes available images with a longer base length, photographed at various points by
a moving object such as a helicopter, and a method of measuring a three-dimensional
shape or three-dimensional topography by using those images.
CITATION LIST
PATENT LITERATURE
SUMMARY OF INVENTION
TECHNICAL PROBLEM
[0016] The survey system described in Patent Literature 1 estimates the attitude of the
flying body by the bundle calculation using the image data about images captured from
different positions by a camera, and uses only the image data for the estimation of
the attitude. Therefore, the accuracy of the attitude estimation is limited.
[0017] Further, the above-mentioned survey system is based on the premise that information
of images to be used for the estimation of the attitude of the flying body is taken
with the camera pointing vertically downward with respect to the flying body, regardless
of the attitude of the flying body.
[0018] Thus, a stabilizer for keeping the imaging direction of the camera vertically downward
at all times needs to be provided, and therefore the system configuration becomes
complicated.
[0019] The present invention is made in order to solve the above-mentioned problems, and
it is therefore an object of the present invention to provide a navigation system
and a survey system capable of estimating the attitude of a moving body with a high
degree of accuracy by using a configuration not having any IMU and any stabilizer.
SOLUTION TO PROBLEM
[0020] The above problems are solved by the subject-matter according to the independent
claim. According to the present invention, there is provided a navigation system including:
a data acquiring unit for acquiring distance data showing a distance from a projection
reference point of laser light to a distance measurement point, the distance being
measured by a laser distance measuring device mounted in a moving body, angle data
showing a projection angle of the laser light, coordinate data showing three-dimensional
coordinates of the projection reference point of the laser light, the three-dimensional
coordinates being measured by a coordinate measuring device mounted in the moving
body, and image data containing a distance measurement point on an object taken by
an image shooting device mounted in the moving body; a coordinate calculating unit
for calculating coordinates of the distance measurement point on an image shown by
the image data on a basis of the distance data, the angle data and the coordinate
data that are acquired by the data acquiring unit, and a parameter showing an attitude
of the moving body; an image matching unit for performing image matching on a pair
of pieces of image data taken by the image shooting device at different shooting positions,
and searching an image shown by one image data of the pair for a point corresponding
to coordinates of a distance measurement point on an image shown by the other image
data of the pair, the coordinates being calculated by the coordinate calculating unit;
and an attitude estimating unit for correcting a value of the parameter showing the
attitude of the moving body in such a way that a difference between coordinates of
the distance measurement point on the image shown by the other image data of the pair,
the coordinates being calculated by the coordinate calculating unit, and coordinates
of the corresponding point searched for by the image matching unit becomes small,
and estimating the attitude of the moving body.
ADVANTAGEOUS EFFECTS OF INVENTION
[0021] The navigation system according to the present invention pays attention to the
fact that the coordinates of corresponding points between images shot at different
shooting positions deviate from each other depending on the attitude of the moving
body, and corrects the value of the parameter showing the attitude of the moving body
in such a way that the difference between these coordinates becomes small, to estimate
the attitude of the moving body. Therefore, the navigation system can estimate the
attitude of the moving body even if the navigation system does not use an IMU or a
stabilizer.
[0022] Further, because the navigation system estimates the attitude of the moving body
by using, in addition to image data about an image containing a shot distance measurement
point, the distance from the projection reference point of laser light to the distance
measurement point, the projection angle of the laser light and the three-dimensional
coordinates of the projection reference point of the laser light, the navigation system
can estimate the attitude of the moving body with a high degree of accuracy.
BRIEF DESCRIPTION OF DRAWINGS
[0023]
Fig. 1 is a block diagram showing the configuration of a survey system according to
Embodiment 1 of the present invention;
Fig. 2 is a block diagram showing the function configuration of a navigation system
according to Embodiment 1;
Fig. 3 illustrates block diagrams showing the hardware configuration of the navigation
system according to Embodiment 1, wherein Fig. 3A shows a processing circuit which
is hardware for implementing the functions of the navigation system and Fig. 3B shows
a hardware configuration which executes software for implementing the functions of
the navigation system;
Fig. 4 is a flow chart showing an overview of the operation of the navigation system
according to Embodiment 1;
Fig. 5 illustrates diagrams schematically showing a positional relationship among
a left camera, a right camera and a laser distance measuring device, wherein Fig.
5A is a perspective view of a unit provided with the left camera, the right camera
and the laser distance measuring device, Fig. 5B is a diagram showing the unit when
viewed from a direction of an X axis, Fig. 5C is a diagram showing the unit when viewed
from a direction of a Z axis, and Fig. 5D is a diagram showing the unit when viewed
from a direction of a Y axis;
Fig. 6 illustrates diagrams showing changes in the positions of the left camera, the
right camera and the laser distance measuring device, the changes being caused by
a flight of an airplane, wherein Fig. 6A shows data about the position coordinates
of the laser distance measuring device, Fig. 6B is a graph in which the position coordinates
of the left camera, the right camera and the laser distance measuring device are plotted
on an XZ plane, Fig. 6C is a graph in which these position coordinates are plotted
on a YZ plane, and Fig. 6D is a graph in which these position coordinates are plotted
on an XY plane;
Fig. 7 illustrates diagrams showing changes in results of measurements performed by
the laser distance measuring device, the changes being caused by a flight of the airplane,
wherein Fig. 7A shows angle data and distance data acquired at times, and Fig. 7B
is a graph in which the data shown in Fig. 7A are plotted;
Fig. 8 is a diagram showing images which are shot every second by the left camera
and the right camera;
Fig. 9 is a diagram showing images which are shot by the left camera and the right
camera while the airplane makes a level flight and each of which contains a distance
measurement point of the laser distance measuring device;
Fig. 10 is a diagram showing images which are shot by the left camera and the right
camera when the airplane flies while the airplane is tilted in a pitch direction and
each of which contains a distance measurement point of the laser distance measuring
device;
Fig. 11 is a diagram showing an error occurring between the coordinates of a distance
measurement point on an image, the coordinates being calculated on the assumption
that the airplane makes a level flight, and the coordinates of a corresponding point
on an image shot when the airplane flies while the airplane is tilted in a pitch direction;
Fig. 12 is a flow chart showing the operation of the navigation system according to
Embodiment 1;
Fig. 13 is a diagram showing results of the calculation of the three-dimensional coordinates
of a distance measurement point;
Fig. 14 is a diagram showing the coordinates of the projection centers of the left
camera and the right camera;
Fig. 15 is a diagram showing the coordinates of distance measurement points on images
shot by the left and right cameras;
Fig. 16 is a diagram showing the coordinates of a distance measurement point on a
left camera image at each time i, and the coordinates of a point corresponding to
the above-mentioned coordinates and existing on a right camera image at a time j,
the point being searched for through image matching;
Fig. 17 is a diagram showing a constant vector of an observation equation;
Fig. 18 is a diagram showing a design matrix;
Fig. 19 is a diagram showing the product of the transpose of the design matrix shown
in Fig. 18, and the design matrix;
Fig. 20 is a diagram showing the product of a matrix acquired from the product of
the transpose of the design matrix shown in Fig. 18 and the constant vector shown
in Fig. 17, and the inverse of the matrix of Fig. 19;
Fig. 21 is a diagram showing correction amounts for attitude angles;
Fig. 22 is a diagram showing estimated results of final attitude angles; and
Fig. 23 is a block diagram showing the configuration of a survey system according
to Embodiment 2 of the present invention.
DESCRIPTION OF EMBODIMENTS
[0024] Hereafter, in order to explain this invention in greater detail, embodiments of the
present invention will be described with reference to the accompanying drawings.
Embodiment 1.
[0025] Fig. 1 is a block diagram showing the configuration of a survey system 1 according
to Embodiment 1 of the present invention. The survey system 1 surveys geographical
features from an airplane 2, and includes a left camera 20a, a right camera 20b, a
laser distance measuring device 21, a GNSS device 22 and a memory card 23 which are
mounted in the airplane 2, and a navigation system 3. The navigation system 3 estimates
the attitude of the airplane 2 in flight, and, as shown in Fig. 1, is disposed separately
from the airplane 2. Alternatively, the navigation system 3 may be mounted in the
airplane 2. Further, the attitude of the airplane 2 is specified by the following
three parameters: a roll angle ω, a pitch angle φ and a yaw angle κ which are attitude
angles in a rolling direction, in a pitching direction and in a yawing direction of
the airplane 2.
[0026] The airplane 2 is an embodiment of a moving body described in the present invention,
and can fly with the left camera 20a, the right camera 20b, the laser distance measuring
device 21, the GNSS device 22 and the memory card 23 mounted therein. For example,
an airplane which a pilot on board operates may be used, or a UAV (Unmanned Aerial
Vehicle) may be used.
[0027] The left camera 20a and the right camera 20b are components which are embodiments
of a first shooting unit and a second shooting unit according to the present invention,
and each of the cameras shoots a ground surface including a distance measurement point
of the laser distance measuring device 21. In this case, a device including the left
camera 20a, the right camera 20b and a control device for controlling shooting processes
performed by these cameras corresponds to an image shooting device according to the
present invention. For example, the control device instructs the left camera 20a and
the right camera 20b to shoot a ground surface at a predetermined period, and stores
image data in which an image acquired via shooting and a shooting date are brought
into correspondence with each other in the memory card 23. As the predetermined period,
it is conceivable to perform shooting every second.
[0028] The laser distance measuring device 21 measures a distance l from a projection
reference point of laser light to a distance measurement point by projecting the laser
light to a ground surface, which is a survey target, while changing a projection angle
θ of the laser light. Further, every time the laser distance measuring device 21 measures
the distance l, the laser distance measuring device 21 stores, in the memory card 23,
distance data showing this distance l and angle data showing the projection angle θ
of the laser light at which this distance l is acquired.
[0029] The GNSS device 22 is a component which is a concrete example of a coordinate measuring
device according to the present invention, and measures the three-dimensional coordinates
of the projection reference point of the laser light in the laser distance measuring
device 21.
[0030] The GNSS device 22 also stores coordinate data showing the three-dimensional coordinates
of the projection reference point in the memory card 23 at a predetermined period.
For example, the GNSS device measures the coordinates every second in synchronization
with the shooting performed by the left camera 20a and the right camera 20b.
[0031] The difference between the position of the GNSS device 22 and the position of the
projection reference point falls within an allowable range, with respect to the accuracy
of measurement of the GNSS device 22. More specifically, it is assumed that the GNSS
device 22 is located at the same position as the projection reference point, and the
position of the projection reference point has the same meaning as the position of
the airplane 2.
[0032] The memory card 23 is a component which is a concrete example of a storage device
according to the present invention, and stores distance data, angle data, image data
and coordinate data which are acquired during a flight of the airplane 2.
[0033] As the memory card 23, for example, an SD (Secure Digital) memory card can be used.
[0034] Fig. 2 is a block diagram showing the function configuration of the navigation system
3. The navigation system 3 includes a data acquiring unit 30, a coordinate calculating
unit 31, an image matching unit 32 and an attitude estimating unit 33, as shown in
Fig. 2. The data acquiring unit 30 is a component that acquires distance data, angle
data, coordinate data and image data which are stored in the memory card 23 of the
airplane 2.
[0035] For example, the data acquiring unit 30 connects to the card drive of the memory
card 23 via a cable or radio, and reads and acquires the above-mentioned data.
[0036] The coordinate calculating unit 31 calculates the coordinates of a distance measurement
point on the image shown by image data by using the distance data, the angle data
and the coordinate data, which are acquired by the data acquiring unit 30, and the
attitude angles of the airplane 2 (the roll angle ω, the pitch angle φ and the yaw
angle κ). For example, the coordinate calculating unit calculates the three-dimensional
coordinates of a distance measurement point by using the distance l from the projection
reference point of the laser light to the distance measurement point, the projection
angle θ of the laser light, the three-dimensional coordinates of the projection reference
point of the laser light, and the roll angle ω, the pitch angle φ and the yaw angle
κ of the airplane 2. The coordinate calculating unit then calculates the coordinates
of the distance measurement point on the image shown by each of image data generated
by the left camera 20a and the right camera 20b by using the three-dimensional coordinates
of the distance measurement point, the coordinates of the projection center of the
left camera 20a and the coordinates of the projection center of the right camera 20b.
[0037] Because the roll angle ω, the pitch angle φ and the yaw angle κ of the airplane 2
are unknown, and no correction amounts for the attitude angles have been calculated
at first, the coordinate calculating unit calculates the coordinates by using the roll
angle ω=0, the pitch angle φ=0 and the yaw angle κ=0 as their initial values. The details
of this coordinate calculation will be described below.
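By way of illustration only, the coordinate calculation of paragraphs [0036] and [0037] can be sketched as follows. This is a minimal sketch under stated assumptions: the rotation convention, the assumption that the laser scans in the body YZ plane with the Z axis pointing vertically downward (as in Fig. 5), and all function names are hypothetical and are not taken from the embodiment.

```python
import math

def rot_matrix(roll, pitch, yaw):
    # Body-to-world rotation from the attitude angles (omega, phi, kappa),
    # in radians.  The axis order (yaw about Z, pitch about Y, roll about X)
    # is an assumed convention; the embodiment does not fix one here.
    cw, sw = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    ck, sk = math.cos(yaw), math.sin(yaw)
    return [[ck * cp, ck * sp * sw - sk * cw, ck * sp * cw + sk * sw],
            [sk * cp, sk * sp * sw + ck * cw, sk * sp * cw - ck * sw],
            [-sp,     cp * sw,                cp * cw]]

def measurement_point(ref, l, theta_deg, attitude):
    # Three-dimensional coordinates of the distance measurement point:
    #   ref       -- projection reference point (X, Y, Z) from the GNSS device
    #   l         -- measured distance
    #   theta_deg -- projection angle (90 degrees = vertically downward, Fig. 5B)
    #   attitude  -- (roll, pitch, yaw) of the airplane, in radians
    t = math.radians(theta_deg)
    d_body = (0.0, math.cos(t), math.sin(t))  # scan in the body YZ plane, Z down
    R = rot_matrix(*attitude)
    d_world = [sum(R[i][j] * d_body[j] for j in range(3)) for i in range(3)]
    return [ref[i] + l * d_world[i] for i in range(3)]
```

With the initial values (ω, φ, κ) = (0, 0, 0) and θ = 90 degrees, the calculated point lies the distance l vertically below the projection reference point; any actual rotation of the airplane shifts it, which is exactly the deviation the later correction exploits.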
[0038] The image matching unit 32 performs image matching on a pair of image data about
an image shot at a shooting position by at least one of the left camera 20a and the
right camera 20b, and image data about an image shot at a different shooting position
by at least one of the left camera 20a and the right camera 20b, and searches for
a point corresponding to the coordinates of a distance measurement point on the image
shown by one (referred to as first image data from here on as needed) of the pair
of image data, through the image shown by the other image data (referred to as second
image data from here on as needed) of the pair.
[0039] As an image matching method, a well-known template matching method of examining the
degree of similarity between two images, or the like can be used. For example, both
the two image data are compared with each other with the first image data being set
as template image data and the second image data being set as image data to be compared,
and a point corresponding to the coordinates of a distance measurement point on the
template image is searched for through the image shown by the image data to be compared.
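A minimal, pure-Python sketch of such a template matching search is shown below. It uses normalized cross-correlation, one well-known way of scoring the degree of similarity between two images; the function name and the exhaustive search are illustrative only, and a practical implementation would restrict the search window and operate on real image data.

```python
def match_template(image, template):
    # Exhaustively search the compared image for the window most similar to
    # the template, scored by normalized cross-correlation.
    # image, template: 2-D lists of grey values.
    # Returns (row, col) of the best-matching window's top-left corner.
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    t = [v for row in template for v in row]
    t_mean = sum(t) / len(t)
    t_norm = sum((v - t_mean) ** 2 for v in t) ** 0.5
    best_score, best_pos = -2.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = [image[r + i][c + j] for i in range(th) for j in range(tw)]
            w_mean = sum(w) / len(w)
            w_norm = sum((v - w_mean) ** 2 for v in w) ** 0.5
            if t_norm == 0.0 or w_norm == 0.0:
                continue  # flat window: correlation undefined
            score = sum((w[k] - w_mean) * (t[k] - t_mean)
                        for k in range(len(w))) / (t_norm * w_norm)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos
```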
[0040] Further, because all that is necessary is a pair of image data about images
shot at different shooting positions, image data about an image shot at a time i during
a flight of the airplane 2 and image data about an image shot at a later time j during
the flight can alternatively be used.
[0041] As an alternative, as the pair of image data, a pair of image data about respective
images shot at a time i by the left camera 20a and the right camera 20b can be used.
[0042] As an alternative, as the pair of image data, a pair of image data about an image
shot at a time i by at least one of the left camera 20a and the right camera 20b and
image data about an image shot at a time j later than the time i by at least one of
the cameras can be used.
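The three pairing options of paragraphs [0040] to [0042] can be enumerated as in the following sketch, in which each image is identified by a (camera, time) tuple; the mode names and the representation are purely illustrative.

```python
def image_pairs(n_times, mode):
    # Enumerate pairs of images shot at different shooting positions:
    #   "stereo"   -- left and right images shot at the same time i ([0041])
    #   "temporal" -- images shot by one camera at times i and j > i ([0040])
    #   "cross"    -- one camera at time i, the other camera at j > i ([0042])
    # Each image is identified by a (camera, time) tuple.
    pairs = []
    for i in range(n_times):
        if mode == "stereo":
            pairs.append((("L", i), ("R", i)))
            continue
        for j in range(i + 1, n_times):
            if mode == "temporal":
                pairs.append((("L", i), ("L", j)))
            elif mode == "cross":
                pairs.append((("L", i), ("R", j)))
    return pairs
```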
[0043] By using such a pair of image data as above, a change in an object to be shot on
images, the change being dependent on the attitude of the airplane 2, can be used
for the estimation of the attitude of the airplane 2.
[0044] In other words, because the survey system according to the present invention uses
a change in an object to be shot on images, the change being dependent on the attitude
of the airplane 2, for the estimation of the attitude of the airplane 2, no stabilizers
are needed for the left camera 20a and the right camera 20b.
[0045] The attitude estimating unit 33 corrects the values of the attitude angles of the
airplane 2 in such a way that the difference between the coordinates of the distance
measurement point on the image shown by the other image data (second image data) of
the pair, the coordinates being calculated by the coordinate calculating unit 31,
and the coordinates of the corresponding point which is searched for by the image
matching unit 32 becomes small, to estimate the attitude of the airplane 2. As mentioned
above, the coordinate calculating unit 31 calculates the coordinates of the distance
measurement point on the image by using the attitude angles (ω, φ, κ) = (0, 0, 0)
as their initial values. Therefore, when the airplane 2 flies while rotating, the
coordinates of the above-mentioned distance measurement point, which are calculated
by the coordinate calculating unit 31, do not match the coordinates of the above-mentioned
corresponding point which is searched for by the image matching unit 32.
[0046] To solve this problem, the attitude estimating unit 33 calculates correction amounts
for the values of the attitude angles of the airplane 2 in such a way that the difference
between the coordinates of these two points becomes small, and estimates the attitude
angles which minimize the difference between the coordinates of the two points as
final attitude angles of the airplane 2. As a result, the attitude estimating unit
can estimate the attitude angles of the airplane 2 with a high degree of accuracy
by using the distance data, the angle data, the coordinate data and the image data.
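The correction performed by the attitude estimating unit 33 can be illustrated with a small numeric sketch. The loop below starts from (ω, φ, κ) = (0, 0, 0), builds residuals between the observed (matched) and calculated image coordinates, forms a design matrix by numeric differentiation, and solves the normal equations for correction amounts, in the spirit of the constant vector, design matrix and correction amounts of Figs. 17 to 21. The pinhole projection model, the rotation convention and all names are assumptions made for illustration, not the embodiment's actual formulation.

```python
import math

def rot(w, p, k):
    # Body-to-world rotation for attitude angles (roll w, pitch p, yaw k),
    # in radians; the axis order is an assumed convention.
    cw, sw, cp, sp, ck, sk = (math.cos(w), math.sin(w), math.cos(p),
                              math.sin(p), math.cos(k), math.sin(k))
    return [[ck * cp, ck * sp * sw - sk * cw, ck * sp * cw + sk * sw],
            [sk * cp, sk * sp * sw + ck * cw, sk * sp * cw - ck * sw],
            [-sp,     cp * sw,                cp * cw]]

def project(att, pt):
    # Image coordinates of a ground point for a nadir-looking pinhole camera
    # at the origin, with the world Z axis pointing downward (Fig. 5).
    R = rot(*att)
    b = [sum(R[j][i] * pt[j] for j in range(3)) for i in range(3)]  # R^T * pt
    return (b[0] / b[2], b[1] / b[2])

def solve3(M, v):
    # Solve the 3x3 normal equations M * x = v by Cramer's rule.
    def det(A):
        return (A[0][0] * (A[1][1] * A[2][2] - A[1][2] * A[2][1])
                - A[0][1] * (A[1][0] * A[2][2] - A[1][2] * A[2][0])
                + A[0][2] * (A[1][0] * A[2][1] - A[1][1] * A[2][0]))
    d = det(M)
    x = []
    for col in range(3):
        Mc = [row[:] for row in M]
        for r in range(3):
            Mc[r][col] = v[r]
        x.append(det(Mc) / d)
    return x

def estimate_attitude(observed, ground_pts, iters=10, eps=1e-6):
    # Correct (roll, pitch, yaw) from the initial values (0, 0, 0) so that
    # the calculated image coordinates approach the observed (matched) ones.
    att = [0.0, 0.0, 0.0]
    for _ in range(iters):
        resid, A = [], []                 # constant vector and design matrix
        for obs, pt in zip(observed, ground_pts):
            calc = project(att, pt)
            for axis in range(2):
                resid.append(obs[axis] - calc[axis])
                row = []
                for k in range(3):        # numeric partial derivatives
                    d_att = att[:]
                    d_att[k] += eps
                    row.append((project(d_att, pt)[axis] - calc[axis]) / eps)
                A.append(row)
        n = len(A)
        N = [[sum(A[r][i] * A[r][j] for r in range(n)) for j in range(3)]
             for i in range(3)]           # A^T * A
        g = [sum(A[r][i] * resid[r] for r in range(n)) for i in range(3)]
        delta = solve3(N, g)              # correction amounts for the angles
        att = [att[k] + delta[k] for k in range(3)]
    return att
```

Running this with image coordinates generated from a known attitude recovers that attitude, mirroring how the estimated angles minimize the coordinate difference between the calculated measurement points and the matched corresponding points.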
[0047] Fig. 3 illustrates block diagrams showing the hardware configuration of the navigation
system 3. Fig. 3A shows a processing circuit 100 which is hardware for implementing
the functions of the navigation system 3, and Fig. 3B shows a hardware configuration
which executes software for implementing the functions of the navigation system 3.
Fig. 4 is a flow chart showing an overview of the operation of the navigation system
3.
[0048] The functions of the data acquiring unit 30, the coordinate calculating unit 31,
the image matching unit 32 and the attitude estimating unit 33 of the navigation system
3 are implemented by a processing circuit.
[0049] More specifically, the navigation system 3 includes a processing circuit for performing
a step ST1 of acquiring distance data, angle data, coordinate data and image data,
a step ST2 of calculating the coordinates of a distance measurement point on an image
shown by image data by using the distance data, the angle data, the coordinate data
and the attitude angles of the airplane 2, a step ST3 of performing image matching
on a pair of image data about images shot at different shooting positions, to search
for a point corresponding to the coordinates of a distance measurement point on the
image shown by one image data of the pair, through the image shown by the other image
data of the pair, and a step ST4 of correcting the values of the attitude angles of
the airplane 2 in such a way that the difference between the coordinates of the distance
measurement point on the image shown by the other image data of the pair and the coordinates
of the corresponding point which is searched for by the image matching unit 32 becomes
small, to estimate the attitude of the airplane 2, the steps being shown in Fig. 4.
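Steps ST1 to ST4 above can be outlined as the following orchestration sketch. The function arguments are hypothetical callables standing in for the units of Fig. 2, and the loop reflects that the correction of step ST4 is repeated until the coordinate difference becomes small.

```python
def run_navigation(acquire, calc_coords, match, correct, iters=5):
    # Orchestration of steps ST1 to ST4 of Fig. 4 (all arguments are
    # hypothetical callables, not part of the embodiment):
    #   ST1  acquire()                 -> distance, angle, coordinate, image data
    #   ST2  calc_coords(data, att)    -> measurement-point coordinates on images
    #   ST3  match(data, pts)          -> corresponding points via image matching
    #   ST4  correct(att, pts, found)  -> corrected attitude angles
    data = acquire()                        # ST1
    att = (0.0, 0.0, 0.0)                   # initial attitude angles
    for _ in range(iters):
        pts = calc_coords(data, att)        # ST2
        found = match(data, pts)            # ST3
        att = correct(att, pts, found)      # ST4, repeated until the
    return att                              #      difference becomes small
```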
[0050] The processing circuit can be hardware for exclusive use, or a CPU (Central Processing
Unit) that executes a program stored in a memory.
[0051] As shown in Fig. 3A, in the case in which the above-mentioned processing circuit
is the processing circuit 100 which is hardware for exclusive use, the processing
circuit 100 is, for example, a single circuit, a composite circuit, a programmed processor,
a parallel programmed processor, an ASIC (Application Specific Integrated Circuit),
an FPGA (Field-Programmable Gate Array), or a circuit which is a combination of some
of these circuits.
[0052] In addition, the functions of each of the following units: the data acquiring unit
30, the coordinate calculating unit 31, the image matching unit 32 and the attitude
estimating unit 33 can be implemented by respective processing circuits, or can be
implemented collectively by a single processing circuit.
[0053] In the case in which the above-mentioned processing circuit is a CPU 101, as shown
in Fig. 3B, the functions of the data acquiring unit 30, the coordinate calculating
unit 31, the image matching unit 32 and the attitude estimating unit 33 are implemented
by software, firmware or a combination of software and firmware. Software and firmware
are described as programs and stored in the memory 102. The CPU 101 implements the
functions of each of the units by reading and executing programs stored in the memory
102. More specifically, in the case in which the navigation system 3 is implemented
by the CPU 101, the memory 102 is disposed to store the programs which the CPU executes
so as to perform, as a result, the processes of steps ST1 to ST4 shown in Fig. 4.
These programs are provided to cause a computer to execute procedures or methods which
are carried out by the data acquiring unit 30, the coordinate calculating unit 31,
the image matching unit 32 and the attitude estimating unit 33.
[0054] Here, the memory is, for example, a nonvolatile or volatile semiconductor memory,
such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable
ROM) or an EEPROM (Electrically EPROM), a magnetic disk, a flexible disk, an optical
disk, a compact disk, a mini disk, or a DVD (Digital Versatile Disk).
[0055] A part of the functions of the data acquiring unit 30, the coordinate calculating
unit 31, the image matching unit 32 and the attitude estimating unit 33 can be implemented
by hardware for exclusive use and another part of the functions can be implemented
by software or firmware.
[0056] For example, the processing circuit 100 which is hardware for exclusive use implements
the functions of the data acquiring unit 30, and the CPU 101 implements the functions
of the coordinate calculating unit 31, the image matching unit 32 and the attitude
estimating unit 33 by executing programs stored in the memory 102.
[0057] In the way mentioned above, the above-mentioned processing circuit can implement
the above-mentioned functions by using hardware, software, firmware or a combination
of some of these elements.
[0058] Fig. 5 illustrates diagrams schematically showing a positional relationship among
the left camera 20a, the right camera 20b and the laser distance measuring device
21. Fig. 5A is a perspective view of a unit provided with the left camera 20a, the
right camera 20b and the laser distance measuring device 21, Fig. 5B is a diagram
showing the unit when viewed from a direction of an X axis, Fig. 5C is a diagram showing
the unit when viewed from a direction of a Z axis, and Fig. 5D is a diagram showing
the unit when viewed from a direction of a Y axis. As shown in Fig. 5A, the left camera
20a is attached to an end of an arm 20c extending leftward from the laser distance
measuring device 21, and the right camera 20b is attached to an end of an arm 20d
extending rightward from the laser distance measuring device 21. The length of each
of the arms 20c and 20d is, for example, 1 m. Further, the shooting directions
of the left camera 20a and the right camera 20b are oriented toward a vertically downward
direction with respect to the airplane 2 (a direction of the Z axis).
[0059] The laser distance measuring device 21 projects laser light from the projection reference
point 21a to a distance measurement point P0 on a ground surface and receives reflected
light from the distance measurement point P0 while changing the projection angle θ
of the laser light, to measure the distance l from the projection reference point 21a
to the distance measurement point P0, as shown in Fig. 5B. It is assumed that the projection
angle θ at which laser light is projected from the projection reference point 21a toward
the vertically downward direction is 90 degrees.
[0060] The left camera 20a and the right camera 20b acquire image data, as will be mentioned
below using Fig. 8, by shooting images of rectangular image shooting areas, as shown
in Fig. 5C.
[0061] Here, it is assumed that the position of the projection reference point 21a is the
same as that of the airplane 2. Therefore, when the airplane 2 makes a level flight
in a direction of the X axis, the unit configured by the left camera 20a, the
right camera 20b and the laser distance measuring device 21 also moves in the direction
of the X axis, as shown in Fig. 5D.
[0062] However, in an actual flight environment, even if the airplane 2 intends to make
a level flight, the airplane cannot maintain a linear flight path under the influence
of winds and so on. More specifically, the airplane 2 flies in a state in which the
airplane rotates in a rolling direction, in a pitching direction and in a yawing direction.
[0063] Fig. 6 illustrates diagrams showing changes in the positions of the left camera 20a,
the right camera 20b and the laser distance measuring device 21, the changes being
caused by a flight of the airplane. Fig. 6A shows data about the coordinates of the
position of the laser distance measuring device 21. Fig. 6B is a graph in which the
coordinates of the positions of the left camera 20a, the right camera 20b and the
laser distance measuring device 21 are plotted on the XZ plane. Fig. 6C is a graph
in which these position coordinates are plotted on the YZ plane, and Fig. 6D is a
graph in which these position coordinates are plotted on the XY plane. As shown in
Fig. 6A, the airplane 2 was located at the point of origin (0, 0, 0) of the XYZ coordinate
system at a time t=0, and had made a level flight for three seconds in the direction
of the X axis shown in Fig. 6B at a certain speed.
[0064] The position coordinates of the laser distance measuring device 21 are the position
coordinates of the projection reference point 21a which are measured every second
by the GNSS device 22. The position coordinates of the left camera 20a and the right
camera 20b are calculated by assuming that each of the left camera 20a and the right
camera 20b is apart from the position of the projection reference point 21a in a
direction of the Y axis by the length of the corresponding one of the arms 20c and
20d, which is 1 m.
[0065] In Figs. 6B to 6D, a large square symbol denotes the position coordinates of the
left camera 20a at a time t=0. A large inverse triangle symbol denotes the position
coordinates of the right camera 20b at the time t=0, and a small square symbol denotes
the position coordinates of the laser distance measuring device 21 at the time t=0.
[0066] Further, a large square symbol with a point denotes the position coordinates of the
left camera 20a at a time t=1. A large inverse triangle symbol with a point denotes
the position coordinates of the right camera 20b at the time t=1, and a small square
symbol with a point denotes the position coordinates of the laser distance measuring
device 21 at the time t=1.
[0067] A large circle symbol denotes the position coordinates of the left camera 20a at
a time t=2. A large triangle symbol denotes the position coordinates of the right
camera 20b at the time t=2, and a small circle symbol denotes the position coordinates
of the laser distance measuring device 21 at the time t=2.
[0068] A large circle symbol with a point denotes the position coordinates of the left camera
20a at a time t=3. A large triangle symbol with a point denotes the position coordinates
of the right camera 20b at the time t=3, and a small circle symbol with a point denotes
the position coordinates of the laser distance measuring device 21 at the time t=3.
[0069] As explained using Fig. 5, in the unit which consists of the left camera 20a, the
right camera 20b and the laser distance measuring device 21, these components are
connected along a direction of the Y axis in order of the left camera 20a, the laser
distance measuring device 21 and the right camera 20b. For this reason, as shown in
Fig. 6B, the positions of the left camera 20a, the right camera 20b and the laser
distance measuring device 21 matched one another when viewed from a direction of
the Y axis.
[0070] After the time t=1, the positions of the left camera 20a, the right camera 20b and
the laser distance measuring device 21 shifted toward a direction of the Z axis.
[0071] Further, as shown in Figs. 6C and 6D, the positions of the left camera 20a, the right
camera 20b and the laser distance measuring device 21 shifted by 0.5 m toward a direction
of the Y axis within a time interval from time t=1 to time t=2, and then returned to
the same position as the point of origin at time t=3.
[0072] The graphs of Figs. 6B to 6D show that the position of the airplane 2 shifted
toward the direction of the Y axis and the direction of the Z axis during the three
seconds, and that the airplane 2 flew while rotating during that time.
[0073] Fig. 7 illustrates diagrams showing changes in measurement results acquired by the
laser distance measuring device 21, the changes being caused by the flight of the
airplane 2, and shows the measurement results acquired when the airplane 2 flew in
the state shown in Fig. 6.
[0074] Fig. 7A shows the angle data and the distance data at each of times, and Fig. 7B
is a graph in which the data shown in Fig. 7A are plotted.
[0075] Further, Fig. 8 is a diagram showing images which were shot every second by the left
camera 20a and the right camera 20b, and shows the images which were shot when the
airplane 2 flew in the state shown in Fig. 6.
[0076] As shown in Fig. 7A, the measurement results acquired by the laser distance measuring
device 21 are stored in the memory card 23 in such a way that each set of a measurement
time t, a projection angle θ and a distance l is defined as one record. Here, it is
assumed that the laser distance measuring device 21 performs four measurements per
second. In addition, the projection angle θ is taken in such a manner that the vertical
direction downward from the projection reference point 21a, shown in Fig. 5B, is 90
degrees.
[0077] As shown in Fig. 5B, the laser distance measuring device 21 scans the laser light
by rotating the projection reference point 21a clockwise in steps of 18 degrees around
the X axis when viewed from the positive direction of the X axis.
[0078] In Fig. 7B, white triangle symbols denote angle data and distance data at times t=0.00
to 0.15, black triangle symbols denote angle data and distance data at times t=1.00
to 1.15, and white rectangle symbols denote angle data and distance data at times
t=2.00 to 2.15.
[0079] As explained with reference to Fig. 6, when the airplane 2 flies while rotating,
laser light is also projected from the laser distance measuring device 21 while the
laser light is inclined. More specifically, distance data and angle data as shown
in Figs. 7A and 7B also change depending on the attitude angles of the airplane 2.
[0080] Further, when the airplane 2 flies while rotating, the shooting directions of the
left camera 20a and the right camera 20b are also inclined. As a result, left camera
images and right camera images, as shown in Fig. 8, which are shot by the left camera
20a and the right camera 20b, also change depending on the attitude angles
of the airplane 2.
[0081] Therefore, an error depending on the attitude angles of the airplane 2 occurs between
the coordinates of a distance measurement point on an image, the coordinates being
calculated using distance data, angle data, coordinate data and image data on the
assumption that the airplane 2 makes a level flight, and the coordinates of the same
distance measurement point when the airplane 2 actually flies while rotating.
[0082] Accordingly, in the present invention, the attitude angles are corrected in such
a way that the above-mentioned error becomes small, and the attitude angles that minimize
the above-mentioned error are determined as the estimated values of the attitude angles
of the airplane 2. Hereafter, an overview of a process of estimating the attitude
angles according to the present invention will be explained.
[0083] Fig. 9 is a diagram showing images 100a and 100b which are shot by the left camera
20a and the right camera 20b while the airplane 2 makes a level flight and each of
which contains a distance measurement point P0 of the laser distance measuring device
21. In Fig. 9, it is assumed that the airplane 2 makes a level flight along the
positive direction of the X axis.
[0084] In this example, it is assumed that a ground surface below the airplane 2 is shot
by the left camera 20a and the right camera 20b every second, and the laser distance
measuring device 21 measures the distance to the distance measurement point P0 located
directly under the airframe of the airplane with the projection angle θ being set
to be 90 degrees.
[0085] Further, the coordinates P0a of the distance measurement point P0 on the image 100a
which is shot at a time t=0 by the left camera 20a can be calculated from both the
three-dimensional coordinates of the projection reference point 21a of the laser
light and the attitude angles of the airplane 2 at the time t=0.
[0086] Similarly, the coordinates P0b of the distance measurement point P0 on the image
100b which is shot at a time t=1 by the right camera 20b can be calculated from both
the three-dimensional coordinates of the projection reference point 21a and the attitude
angles of the airplane 2 at the time t=1.
[0087] In the example shown in Fig. 9, because it is assumed that the airplane 2 makes a
level flight, the attitude angles are zeros.
[0088] Fig. 10 is a diagram showing images 100a and 100c which are shot by the left camera
20a and the right camera 20b when the airplane 2 flies while the airplane is tilted
in a pitch direction and each of which contains a distance measurement point P0 of
the laser distance measuring device 21. In the example shown in Fig. 10, it is
assumed that the airplane 2 flies while the airplane is tilted by a pitch angle φ
at the time t=1. In this example, in the image 100c shot at the time t=1 by the right
camera 20b, the object to be shot is seen at a position close to the right as a whole,
as shown by a broken chain line, as compared with the example shown in Fig. 9.
[0089] Fig. 11 is a diagram showing an error occurring between the coordinates of a distance
measurement point P0 on an image, the coordinates being calculated on the assumption
that the airplane 2 makes a level flight, and the coordinates of a corresponding point
on an image shot when the airplane 2 flies while the airplane is tilted in a pitch
direction. The coordinates P0b of the distance measurement point P0 on the image 100c
shown in Fig. 11 are calculated on the assumption that the airplane 2 flies without
rotating also at the time t=1, like in the example shown in Fig. 9. On the other hand,
as a result of searching through the image 100c, by performing image matching, for
a point corresponding to the coordinates P0a of the distance measurement point P0
on the image 100a which is shot at the time t=0 by the left camera 20a, the coordinates
P0b' of the point on the image 100c are acquired.
[0090] Such a difference ΔuL between the coordinates P0b and the coordinates P0b' occurs
as a result of calculating the coordinates P0b of the distance measurement point P0
on the image 100c on the assumption that the airplane 2 flies without rotating, even
though the airplane 2 actually flies while rotating. Therefore, the attitude angles
of the airplane 2 which minimize the difference ΔuL are defined as estimated results
expressing the attitude of the actual airplane 2 appropriately.
[0091] For example, because in the example shown in Fig. 11 the difference ΔuL is minimized
when the airplane 2 is tilted by the pitch angle φ, the pitch angle φ is acquired
as an estimated result of the attitude.
[0092] Although the airplane 2 actually rotates also in both a rolling direction and a yawing
direction, in addition to a pitching direction, what is necessary in this case is
just to similarly estimate the roll angle ω and the yaw angle κ.
[0093] Next, operations will be explained.
[0094] Fig. 12 is a flow chart showing the operation of the navigation system 3 according
to Embodiment 1, and shows a series of processes of estimating the attitude angles
of the airplane 2 in flight.
[0095] Hereafter, the attitude angles of the airplane 2 are expressed by the three parameters
including the roll angle ω, the pitch angle φ and the yaw angle κ, and these angles
are estimated per second.
[0096] Because it is assumed hereafter, for the sake of convenience, that the attitude
angles (ω, φ, κ) at the times t=0 and t=3 are (0, 0, 0), only the unknown attitude
angles (ω, φ, κ) at the times t=1 and t=2 need to be estimated. More specifically,
six unknown attitude angles in total are estimated.
[0097] First, the data acquiring unit 30 reads and acquires distance data, angle data, coordinate
data and image data from the memory card 23 mounted in the airplane 2 (step ST1a).
[0098] Each distance data shows the distance l measured by the laser distance measuring
device 21 from the projection reference point 21a of laser light to a distance measurement
point P0, and each angle data shows the projection angle θ of laser light. Each coordinate
data shows the three-dimensional coordinates (X0, Y0, Z0) of the projection reference
point 21a of laser light, the three-dimensional coordinates being measured by the
GNSS device 22. Each image data shows images each containing a distance measurement
point P0 on an object to be shot, the images being shot by the left camera 20a and
the right camera 20b.
[0099] By using data accumulated in the memory card 23 during a flight of the airplane 2,
the attitude of the airplane 2 can be estimated after the flight has ended, and
survey results can also be corrected by using the estimated attitude angles.
[0100] Next, the coordinate calculating unit 31 calculates the three-dimensional coordinates
(X, Y, Z) of each distance measurement point P0 in accordance with the following
expression (1) by using the distance data, the angle data and the coordinate data,
which are acquired by the data acquiring unit 30, and the settings of the attitude
angles (ω, φ, κ) of the airplane 2 (step ST2a).
[0101] In the following expression (1), a11 to a33 denote the elements of a 3×3 rotation
matrix showing the inclinations of the laser distance measuring device 21, the left
camera 20a and the right camera 20b, the inclinations depending on the attitude of
the airplane 2.
[0102] Further, in the above-mentioned expression (1), (X0, Y0, Z0) denote the three-dimensional
coordinates of the projection reference point 21a of laser light, the three-dimensional
coordinates being shown by each of the above-mentioned coordinate data. θ denotes
the projection angle of the laser light which is shown by the corresponding one of
the above-mentioned angle data, and l denotes the distance from the projection reference
point 21a of the laser light to the distance measurement point P0, the distance being
shown by the corresponding one of the above-mentioned distance data. The projection
angle θ is defined by assuming that the vertically downward direction with respect
to the airplane 2 makes an angle of 90 degrees.
[0103] Further, the settings of the attitude angles are (ω, φ, κ) = (0, 0, 0) by assuming
that the airplane 2 had made a level flight.
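The calculation of step ST2a can be sketched as follows. Expression (1) itself is not reproduced in this text, so the rotation order Rz(κ)·Ry(φ)·Rx(ω) and the body-frame direction of the laser vector are assumptions made for illustration, not the expression's definitive form.

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """3x3 rotation matrix whose elements play the role of a11 to a33
    in expression (1). The composition order is an assumption."""
    cw, sw = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    rx = np.array([[1, 0, 0], [0, cw, -sw], [0, sw, cw]])   # roll
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch
    rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])   # yaw
    return rz @ ry @ rx

def measurement_point(ref, omega, phi, kappa, theta, dist):
    """Three-dimensional coordinates (X, Y, Z) of a distance measurement
    point P0: the body-frame laser vector of length `dist` at projection
    angle `theta` (radians; pi/2 points straight down) is rotated by the
    attitude angles and added to the projection reference point
    (X0, Y0, Z0) given as `ref`."""
    body = np.array([0.0, dist * np.cos(theta), -dist * np.sin(theta)])
    return np.asarray(ref, dtype=float) + rotation_matrix(omega, phi, kappa) @ body
```

With the level-flight settings (ω, φ, κ) = (0, 0, 0) and θ = 90 degrees, the computed point lies directly below the projection reference point.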
[0104] Results of calculating the three-dimensional coordinates (X, Y, Z) of each distance
measurement point P0 by using the coordinate data at the times t=0 to 2 shown in
Fig. 6A, and the angle data and the distance data for the distance measurement point
P0 at the times t=0 to 2 shown in Fig. 7A, are shown in Fig. 13.
[0105] Next, the coordinate calculating unit 31 calculates the second-by-second projection
center coordinates (XL, YL, ZL) of the left camera 20a and the second-by-second projection
center coordinates (XR, YR, ZR) of the right camera 20b by using the coordinate data
and the settings of the attitude angles, in accordance with the following expressions
(2) and (3) (step ST3a).
[0106] Results of calculating the projection center coordinates by assuming that the settings
of the attitude angles are (ω, φ, κ) = (0, 0, 0), and by using the coordinate data
at the times t=0 to 2 shown in Fig. 6A are shown in Fig. 14.
[0107] Next, the coordinate calculating unit 31 calculates the coordinates (xL, yL) of a
distance measurement point P0 on a left camera image and the coordinates (xR, yR)
of the distance measurement point P0 on a right camera image on the basis of the
coordinate data, the settings of the attitude angles, the three-dimensional coordinates
of the distance measurement point P0, the projection center coordinates of the left
camera 20a, and the projection center coordinates of the right camera 20b, in accordance
with the following expressions (4) and (5) (step ST4a).
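Because expressions (4) and (5) are not reproduced in this text, the projection of step ST4a can be sketched with the standard photogrammetric collinearity equations as a stand-in; the focal distance `c` and the sign conventions below are illustrative assumptions.

```python
import numpy as np

def project_to_image(point, center, R, c):
    """Project a ground point onto an image plane, collinearity-style.

    point:  distance measurement point (X, Y, Z)
    center: projection center of the camera, e.g. (XL, YL, ZL)
    R:      camera rotation matrix derived from the attitude angles
    c:      focal distance of the camera (an internal parameter)
    """
    dX = np.asarray(point, dtype=float) - np.asarray(center, dtype=float)
    u = R.T @ dX                 # ground vector expressed in camera axes
    x = -c * u[0] / u[2]         # image coordinate, e.g. xL
    y = -c * u[1] / u[2]         # image coordinate, e.g. yL
    return (x, y)
```

A point directly below the projection center maps to the image origin, which matches the level-flight situation of Fig. 9.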
[0109] Fig. 15 shows results of calculation of the coordinates (xL, yL) of a distance measurement
point P0 on each left camera image and the coordinates (xR, yR) of the distance measurement
point P0 on the corresponding right camera image. The calculation is made using the
coordinate data at the times t=0 to 2 shown in Fig. 6A, the three-dimensional coordinates
(X, Y, Z) of each distance measurement point P0 shown in Fig. 13, and the projection
center coordinates shown in Fig. 14.
[0110] Next, the image matching unit 32 extracts, as a pair, a left camera image shot at
a time i and a right camera image shot at a time j which is later than the time i
by 1, from the image data acquired by the data acquiring unit 30. The process of
extracting a pair of image data which is a target for image matching in this way is
referred to as pairing. Through this pairing, a pair of image data about images which
are shot at different shooting positions is acquired.
[0111] Then, the image matching unit 32 searches the right camera image shot at the time
j for a point corresponding to the coordinates (xL, yL) of a distance measurement
point P0 on the left camera image shot at the time i, the coordinates being calculated
by the coordinate calculating unit 31, by performing template matching on the left
camera image shot at the time i and the right camera image shot at the time j (step ST5a).
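The search of step ST5a can be sketched as an exhaustive template search. A sum-of-squared-differences criterion is used here for simplicity; the text does not specify the matching measure, so this is an illustrative choice.

```python
import numpy as np

def match_template(image, template):
    """Return the top-left (x, y) of the best template position.

    Slides the small region (template) cut from the left camera image
    over the right camera image and keeps the position with the smallest
    sum of squared differences. A minimal sketch; practical systems
    typically use a normalized correlation measure instead.
    """
    ih, iw = image.shape
    th, tw = template.shape
    best, best_xy = np.inf, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            ssd = np.sum((image[y:y + th, x:x + tw] - template) ** 2)
            if ssd < best:
                best, best_xy = ssd, (x, y)
    return best_xy
```

The returned position, shifted from the top-left corner to the region center, plays the role of the corresponding-point coordinates searched for in step ST5a.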
[0112] Fig. 16 shows a correspondence between the coordinates of the distance measurement
point P0 on the left camera image at the time i, and the coordinates of the point
corresponding to the coordinates of the distance measurement point and existing on
the right camera image shot at the time j, the point being searched for through the
template matching.
[0113] SCANx(xLi, yLi) denotes the x coordinate of the corresponding point which is searched
for by performing template matching on the right camera image shot at the time j
with respect to a small region centered at the coordinates (xLi, yLi) of the distance
measurement point P0 on the left camera image shot at the time i. Further, SCANy(xLi, yLi)
denotes the y coordinate of the corresponding point which is searched for by performing
template matching in the same way.
[0114] As shown in Fig. 16, there occurs a difference between the coordinates (xRj, yRj)
of the distance measurement point P0 on the right camera image shot at the time j,
the coordinates being calculated by the coordinate calculating unit 31, and the coordinates
(SCANx(xLi, yLi), SCANy(xLi, yLi)) of the corresponding point which is searched for
by the image matching unit 32. This is because the attitude angles (ω, φ, κ) of the
airplane 2 have values other than zeros.
[0115] More specifically, by setting appropriate attitude angles (ω, φ, κ) showing the attitude
of the airplane 2 in flight, and then re-calculating the coordinates as shown in Fig.
16, the above-mentioned coordinates (xRj, yRj) are made to match the above-mentioned
coordinates (SCANx(xLi, yLi), SCANy(xLi, yLi)).
[0116] Therefore, the attitude angles which minimize the difference between the above-mentioned
pair of coordinates are set as the estimated values of the attitude angles of the
airplane 2.
[0117] Returning to the explanation of Fig. 12, a case in which the attitude estimating
unit 33 estimates the attitude angles of the airplane 2 in accordance with a procedure
based on a nonlinear least square method will be explained.
[0118] The attitude estimating unit 33 calculates the correction amounts for the attitude
angles which reduce the difference between the coordinates (xRj, yRj) of the distance
measurement point P0 on the right camera image shot at the time j and the coordinates
(SCANx(xLi, yLi), SCANy(xLi, yLi)) of the corresponding point which is searched for
by the image matching unit 32 (step ST6a). For example, observation equations vx and
vy which are shown in the following expression (6) are used.
[0119] In the above-mentioned expression (6), tilde ω (∼ω), tilde φ (∼φ) and tilde κ (∼κ)
are approximate solutions of the roll angle ω, the pitch angle φ and the yaw angle
κ, which are unknown. δω, δφ and δκ are the correction amounts for the approximate
solutions tilde ω, tilde φ and tilde κ.
[0120] Further, ∂Fx/∂ω is the partial derivative of Fx with respect to the roll angle ω,
∂Fx/∂φ is the partial derivative of Fx with respect to the pitch angle φ, and ∂Fx/∂κ
is the partial derivative of Fx with respect to the yaw angle κ. These partial derivatives
are coefficients whose values are acquired by substitutions of the approximate solutions
tilde ω, tilde φ and tilde κ.
[0121] Similarly, ∂Fy/∂ω is the partial derivative of Fy with respect to the roll angle
ω, ∂Fy/∂φ is the partial derivative of Fy with respect to the pitch angle φ, and ∂Fy/∂κ
is the partial derivative of Fy with respect to the yaw angle κ. These partial derivatives
are also coefficients whose values are acquired by substitutions of the approximate
solutions tilde ω, tilde φ and tilde κ.
[0122] Tilde Fx (∼Fx) is a value which is acquired by substituting SCANx(xL, yL) and an
approximate solution of xR into Fx, and tilde Fy (∼Fy) is a value which is acquired
by substituting SCANy(xL, yL) and an approximate solution of yR into Fy.
[0123] When the data shown in Fig. 16 are used, as to each of the pairs of a left camera
image shot at a time t=i and a right camera image shot at a time t=i+1, four observation
equations vx are acquired for x and four observation equations vy are acquired for
y. Therefore, the number of observation equations is obtained as 3×4×2=24. A constant
vector for the observation equations at this time is shown in Fig. 17.
[0124] Next, the attitude estimating unit 33 partially differentiates the observation equations
with respect to each of the six unknown quantities. For example, the attitude estimating
unit partially differentiates the observation equations with respect to the roll angle
ω, the pitch angle φ and the yaw angle κ at the time t=1 and, after that, partially
differentiates the observation equations with respect to the roll angle ω, the pitch
angle φ and the yaw angle κ at the time t=2.
[0125] A 24×6 design matrix which consists of partial differential coefficients which are
calculated for the observation equations in this way is shown in Fig. 18.
[0126] The attitude estimating unit 33 then calculates the product of the transpose of this
design matrix and the design matrix. A calculation result acquired using the design
matrix shown in Fig. 18 is shown in Fig. 19.
[0127] The attitude estimating unit 33 further calculates the product of the transpose of
this design matrix, and the constant vector shown in Fig. 17. A result of this calculation
is shown in Fig. 20.
[0128] After that, the attitude estimating unit 33 calculates the product of the inverse
matrix calculated from the matrix shown in Fig. 19, and the vector shown in Fig. 20.
A result of this calculation is the correction amounts (δω, δφ, δκ) for the attitude
angles shown in Fig. 21.
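The normal-equation solve described above (Figs. 18 to 21) can be sketched as follows. The actual design-matrix and constant-vector values appear only in the figures and are not reproduced here, so the function operates on whatever 24×6 matrix and 24-element vector are supplied.

```python
import numpy as np

def correction_amounts(A, f):
    """Least-squares corrections for the six unknown attitude angles.

    A is the 24x6 design matrix of partial differential coefficients
    (Fig. 18) and f is the 24-element constant vector (Fig. 17). The
    products A^T A (Fig. 19) and A^T f (Fig. 20) form the normal
    equations, whose solution is the correction vector
    (delta-omega, delta-phi, delta-kappa at t=1 and t=2) of Fig. 21.
    """
    N = A.T @ A
    b = A.T @ f
    return np.linalg.solve(N, b)
```

Equivalently, `np.linalg.lstsq(A, f, rcond=None)` avoids forming A^T A explicitly, which is numerically preferable when the design matrix is ill-conditioned.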
[0129] Because it is assumed initially that the airplane 2 makes a level flight without
rotating, and (0, 0, 0) are set as the initial values of the attitude angles (ω, φ,
κ), the above-mentioned correction amounts serve as the approximate solutions of the
attitude angles, just as they are.
[0130] The attitude estimating unit 33 adds the correction amounts which the attitude estimating
unit calculates in the above-mentioned way to the previous approximate solutions to
correct these approximate solutions, and determines the corrected approximate solutions
as the settings of the attitude angles (step ST7a). At this time, when the series
of processes has not been performed the predetermined number of repetitions (when
NO in step ST8a), the attitude estimating unit 33 instructs the coordinate calculating
unit 31 to perform the same coordinate calculation as the above-mentioned coordinate
calculation.
[0131] As a result, the coordinate calculating unit 31 performs the processes in steps ST2a
to ST4a by using the corrected approximate solutions as the settings of the attitude
angles, and the image matching unit 32 performs the process in step ST5a.
[0132] By using the correction amounts (δω, δφ, δκ) which are calculated by repeatedly
performing the above-mentioned series of processes, the difference between the coordinates
(xRj, yRj) of the distance measurement point P0 on the right camera image shot at
the time j and the coordinates (SCANx(xLi, yLi), SCANy(xLi, yLi)) of the corresponding
point which is searched for by the image matching unit 32 becomes small.
[0133] When the above-mentioned series of processes has been performed the predetermined
number of repetitions (when YES in step ST8a), and the correction amounts which minimize
the difference between the above-mentioned coordinates are acquired, the attitude
estimating unit 33 outputs the approximate solutions which are corrected by using
these correction amounts as final estimated results of the attitude angles (step ST9a).
[0134] The estimated results of the attitude angles at the times t=0.00 to 3.00, which are
acquired in this way, are shown in Fig. 22.
[0135] Although the case in which a pair of a left camera image shot at a time i and a right
camera image shot at a time j (=i+1) is used as the pair of image data is shown above,
a pair of a left camera image shot at a time i and a right camera image shot at the
time i can be alternatively used. More specifically, according to the present invention,
it is sufficient that a pair of image data about images shot at different shooting
positions is used.
[0136] As stereo image processing for searching for corresponding points between images
shot by cameras located at different positions, and acquiring three-dimensional information
including the distance to an observation object and depth information, there are a
method called fixed stereo and a method called motion stereo.
[0137] In the fixed stereo, two cameras are arranged at a spacing, and images are shot by
the cameras. Pairing of a left camera image shot at a time i and a right camera image
shot at the time i is equivalent to the fixed stereo.
[0138] In the motion stereo, images are shot from different shooting positions by a camera
while the camera is moved. Pairing of a left camera image shot at a time i and a right
camera image shot at a time j (=i+1) is equivalent to the motion stereo.
[0139] Further, although the configuration using the left camera 20a and the right camera
20b is shown in the above explanation, only one camera can be used instead of the
two cameras. In this case, a pair of a camera image shot at a time i and a camera
image shot at a time j (=i+1) is used.
[0140] In addition, although the case in which the unknown quantities are the three parameters
(ω, φ, κ) which are the attitude angles at each time is shown above, the six parameters
additionally including the position coordinates (X, Y, Z) of the airplane 2 can be
alternatively used, or an internal parameter, such as the focal distance c of the
cameras, can be included.
[0141] As mentioned above, the navigation system 3 according to Embodiment 1 pays attention
to the fact that the coordinates of corresponding points between images shot at different
shooting positions deviate from each other depending on the attitude of the airplane
2, and corrects the values of the parameters (ω, φ, κ) showing the attitude of the
airplane 2 in such a way that the difference between these coordinates becomes small,
to estimate the attitude of the airplane 2. As a result, the attitude of the airplane
2 can be estimated even if an IMU and a stabilizer are not used.
[0142] Further, because the attitude of the airplane 2 is estimated by using, in addition
to image data about an image containing a shot distance measurement point P0, the
distance l from the projection reference point 21a of laser light to the distance
measurement point P0, the projection angle θ of the laser light and the three-dimensional
coordinates (X, Y, Z) of the projection reference point 21a of the laser light, the
attitude of the airplane 2 can be estimated with a high degree of accuracy.
[0143] Further, in the navigation system 3 according to Embodiment 1, the pair of image
data includes image data about an image shot at a time i during a flight of the airplane
2, and image data about an image shot at a time j later than this time i during the
flight. By using such a pair of image data, a change in an object to be shot on an
image, the change depending on the attitude of the airplane 2, can be used for the
estimation of the attitude of the airplane 2.
[0144] Further, in the navigation system 3 according to Embodiment 1, the pair of image
data includes a pair of image data about images shot at a time i by the left camera
20a and the right camera 20b, or a pair of image data about an image shot at a time
i by at least one of the left camera 20a and the right camera 20b, and image data
about an image shot at a time j later than the time i by at least one of the left
and right cameras.
[0145] Even by using such image data, a change in an object to be shot on an image, the
change depending on the attitude of the airplane 2, can be used for the estimation
of the attitude of the airplane 2.
[0146] In addition, although the navigation system 3 according to Embodiment 1 performs
the calculation of the coordinates by setting the initial value of the yaw angle κ
to zero, the navigation system can acquire an approximate solution of the yaw angle
κ from a time series of three-dimensional coordinates which is measured by the GNSS
device 22. Therefore, a value calculated from the time series of three-dimensional
coordinates measured by the GNSS device 22 can be used as the initial value of the
yaw angle κ.
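One way to obtain such an initial value, sketched here under the assumption that zero yaw corresponds to flight along the positive X axis, is to take the horizontal direction between successive GNSS fixes:

```python
import math

def initial_yaw(p_prev, p_next):
    """Approximate yaw angle kappa (radians) from two successive GNSS
    positions (X, Y, Z) of the projection reference point, using the
    horizontal flight direction. The zero-yaw reference (flight along
    the positive X axis) is an assumption made for illustration.
    """
    dx = p_next[0] - p_prev[0]
    dy = p_next[1] - p_prev[1]
    return math.atan2(dy, dx)
```

The resulting value can then be used in place of zero as the initial setting of κ in the iteration described above.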
[0147] In addition, the survey system 1 according to Embodiment 1 is provided with the memory
card 23 mounted in the airplane 2. The data acquiring unit 30 reads and acquires distance
data, angle data, coordinate data and image data which are stored in the memory card
23.
[0148] By using data which are stored in the memory card 23 during a flight of the airplane
2 in this way, the attitude of the airplane 2 can be estimated after the flight has
ended, and the survey results can also be corrected by using the estimated attitude
angles.
Embodiment 2.
[0149] Fig. 23 is a block diagram showing the configuration of a survey system 1A according
to Embodiment 2 of the present invention. The survey system 1A surveys geographical
features from an airplane 2A, and includes a left camera 20a, a right camera 20b,
a laser distance measuring device 21, a GNSS device 22 and a wireless communication
device 24 which are mounted in the airplane 2A, and a navigation system 3.
[0150] The wireless communication device 24 transmits distance data, angle data, coordinate
data and image data which are acquired during a flight of the airplane 2A to the navigation
system 3.
[0151] The navigation system 3 is provided separately from the airplane 2A, as shown in
Fig. 23. As an alternative, the navigation system 3 can be mounted in the airplane
2A.
[0152] A data acquiring unit 30 of the navigation system 3 receives and acquires the distance
data, the angle data, the coordinate data and the image data which are transmitted
by the wireless communication device 24.
[0153] The navigation system 3 estimates the attitude of the airplane 2A by performing the
same processing as that shown in Embodiment 1 by using the above-mentioned data which
the navigation system acquires in this way.
[0154] As mentioned above, the survey system 1A according to Embodiment 2 includes the wireless
communication device 24 mounted in the airplane 2A. The data acquiring unit 30 receives
and acquires distance data, angle data, coordinate data and image data which are transmitted
by the wireless communication device 24.
[0155] By using the data transmitted by radio from the wireless communication device 24
in this way, the attitude of the airplane 2A can be estimated during a flight of the
airplane 2A. The survey results can also be corrected during a flight of the airplane
2A by using the estimated attitude angles.
[0156] Although the above explanation shows an example in which the moving body according
to the present invention is a flying body such as the airplane 2, the invention
is not limited to this example. For example, the navigation system according to the
present invention can be implemented as a mobile mapping system, and a vehicle in
which this system is mounted is defined as a moving body. Further, a railroad car,
a ship or a robot can be defined as a moving body, and the navigation system according
to the present invention can be used as a device that estimates the attitude of the
moving body. Also for such a moving body, the attitude angles (ω, φ, κ) of the moving
body can similarly be used as the parameters showing the attitude of the moving body,
and position information can be included in the parameters in some cases.
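As an illustration of how the attitude angles (ω, φ, κ) parameterize the attitude of a moving body, the sketch below builds a rotation matrix from them using one common photogrammetric rotation order, R = R_x(ω)·R_y(φ)·R_z(κ). The rotation order and axis conventions vary between systems and are not specified by the embodiment, so this particular convention is an assumption for illustration only.

```python
import math

def rotation_matrix(omega, phi, kappa):
    """Rotation matrix from attitude angles (omega, phi, kappa)
    in radians, composed in the order R_x(omega) @ R_y(phi) @ R_z(kappa).
    This order is one common photogrammetric convention and is an
    illustrative assumption, not a detail of the embodiment."""
    co, so = math.cos(omega), math.sin(omega)
    cp, sp = math.cos(phi), math.sin(phi)
    ck, sk = math.cos(kappa), math.sin(kappa)
    # Elementary rotations about the x, y, and z axes.
    rx = [[1, 0, 0], [0, co, -so], [0, so, co]]
    ry = [[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]
    rz = [[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]]

    def matmul(a, b):
        # Plain 3x3 matrix product.
        return [[sum(a[i][k] * b[k][j] for k in range(3))
                 for j in range(3)] for i in range(3)]

    return matmul(matmul(rx, ry), rz)

R = rotation_matrix(0.0, 0.0, 0.0)  # identity for zero angles
```

Because each factor is orthonormal, the product is always a proper rotation matrix, regardless of the angle values.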
INDUSTRIAL APPLICABILITY
[0157] Because the navigation system according to the present invention can estimate the
attitude of a moving body with a high degree of accuracy by using a configuration
that includes neither an IMU nor a stabilizer, the navigation system is suitable for
use as, for example, a navigation system for UAVs.
REFERENCE SIGNS LIST
[0158] 1, 1A survey system; 2, 2A airplane; 3 navigation system; 20a left camera; 20b right
camera; 20c, 20d arm; 21 laser distance measuring device; 21a projection reference
point; 22 GNSS device; 23 memory card; 24 wireless communication device; 30 data acquiring
unit; 31 coordinate calculating unit; 32 image matching unit; 33 attitude estimating
unit; 100 processing circuit; 100a to 100c image; 101 CPU; and 102 memory.