Technical Field
[0001] The present disclosure relates to a device for monitoring anomalies in railroad systems.
Background Art
[0002] Patent Document 1 describes a technique for automatically detecting an obstacle
to railroad equipment, such as an overhead wire or a rail, by comparing image data
obtained by photographing the area ahead of a traveling railroad vehicle with previously
captured background image data. The technique described in Patent Document 1 aims to
efficiently monitor anomalies in railroad systems.
Citation List
Patent Document
Summary of Invention
Problems to be Solved by Invention
[0004] One of the factors that cause anomalies in railroad systems is an object dropped
from a railroad vehicle. The problem with the technique described in Patent Document
1 is that a railroad vehicle can detect objects dropped from its preceding railroad
vehicles but cannot detect objects dropped from the vehicle itself while traveling.
[0005] The present invention has been made to solve the above-mentioned problem and aims
to enable a railroad vehicle to detect an object dropped from the vehicle itself.
Means for Solving the Problems
[0006] The monitoring device according to the present disclosure includes: a front data
acquisition unit to acquire, via a sensor installed at the front of a mobile object,
data around the front area with respect to its traveling direction and status information
of the mobile object; a rear data acquisition unit to acquire, via a sensor installed
at the rear of the mobile object, data around the rear area with respect to the traveling
direction and status information of the mobile object; a position identification unit
to identify a position of the data around the front area and a position of the data
around the rear area; and an object detection unit to compare the data around the
front area with the data around the rear area of the same position as the data around
the front area, detect an object, and warn of an anomaly.
Effect of Invention
[0007] The present invention makes it possible for a railroad vehicle to detect an object
dropped from it while the railroad vehicle is traveling.
Brief Description of Drawings
[0008]
FIG. 1 is a hardware configuration diagram showing a mobile object according to Embodiment
1.
FIG. 2 is a functional configuration diagram showing a monitoring device according
to Embodiment 1.
FIG. 3 is a flowchart showing operation of the monitoring device according to Embodiment
1.
FIG. 4 is a hardware configuration diagram of a mobile object according to Modified
Example 1 of Embodiment 1.
FIG. 5 is a hardware configuration diagram of a mobile object according to Modified
Example 2 of Embodiment 1.
FIG. 6 is a functional configuration diagram showing a monitoring device according
to Embodiment 2.
FIG. 7 is a flowchart showing operation of the monitoring device according to Embodiment
2.
Modes for Carrying Out Invention
Embodiment 1
[0009] FIG. 1 is a hardware configuration diagram of a mobile object using a monitoring
device according to the present embodiment.
[0010] In FIG. 1, the numeral 100 denotes the mobile object, the numeral 10 denotes the
monitoring device, and the numeral 101 denotes a vehicle control unit. The monitoring
device 10 is a computer provided in the mobile object 100.
[0011] Note that the monitoring device 10 may be integrated with (inseparable from), or
removable (separable) from, the mobile object 100 or another component illustrated
herein. Further, although a railroad vehicle is used as an example of the mobile object
100 in the present embodiment, the mobile object 100 is not limited to the one described
in the present embodiment.
[0012] The monitoring device 10 includes hardware such as a processor 11, a storage device
12, a communication interface 13, and an on-board interface 14. The processor 11 is
connected to other hardware devices via a system bus to control them.
[0013] The processor 11 is an integrated circuit (IC) that performs processing. For specific
examples, the processor 11 is a central processing unit (CPU), a digital signal processor
(DSP), or a graphics processing unit (GPU).
[0014] The storage device 12 includes a memory 121 and a storage 122. For a specific example,
the memory 121 is a random-access memory (RAM). For a specific example, the storage
122 is a hard disk drive (HDD). Further, the storage 122 may be a portable storage
medium such as a secure digital (SD) memory card, a compact flash (CF), a NAND flash,
a flexible disk, an optical disk, a compact disk, a Blu-ray (registered trademark)
disk, or a DVD.
[0015] The communication interface 13 is a device for communicating with a communication
device around the mobile object 100. For a specific example, the communication interface
13 is an Ethernet (registered trademark) terminal or a universal serial bus (USB)
terminal.
[0016] The on-board interface 14 is a device for connecting to the vehicle control unit
101 installed on the mobile object 100. For a specific example, the on-board interface
14 is a USB terminal, an IEEE1394 terminal, or an HDMI (registered trademark) terminal.
[0017] The vehicle control unit 101 includes sensing devices such as a camera, a light
detection and ranging (LiDAR) device, a radar, a sonar, and a positioning device, and
also includes devices for controlling the mobile object 100, such as a steering system,
a brake, and an accelerator.
[0018] FIG. 2 shows a functional configuration diagram of the monitoring device 10. The
monitoring device 10 includes, as functional components, a front data acquisition
unit 21, a rear data acquisition unit 22, a position identification unit 23, an object
detection unit 24, and a history storage unit 25. The numeral 31 denotes front data;
the numeral 32 denotes rear data; the numeral 41 denotes locating data; and the numeral
42 denotes exclusion data.
[0019] The front data acquisition unit 21 collects information obtained by a first sensor
installed in the front of the mobile object 100 via the vehicle control unit 101.
In the present embodiment, the first sensor is exemplified as a camera (front camera)
installed in the front of the mobile object 100, and the information obtained from
the front camera is used as data around its front area. However, the data may be information
obtained from a LiDAR, a radar, or a sonar. Note that, in the present embodiment,
the information to be collected by the front camera is the information obtained in
the direction of travel of the mobile object 100. However, the first sensor only needs
to be installed in the front of the mobile object 100, and the information to be collected
by the first sensor may be information in any direction.
[0020] The acquired data around the front area is recorded in the storage 122. In addition,
the front data acquisition unit 21 also records in the storage 122 the status information
of the mobile object 100 at the time of the data acquisition, such as position information,
speed, attitude angles (roll angle, pitch angle, yaw angle), and the lighting color
of the mobile object. The position information of the mobile object 100 may be, for
example, the latitude and longitude of the mobile object 100 obtained from the output
value of the positioning device connected to the mobile object via the vehicle control
unit 101. Here, let the data recorded by the front data acquisition unit 21 in the
storage 122 be the front data 31.
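The embodiment does not prescribe a concrete data layout for the front data 31 and the rear data 32; purely for illustration, a record combining the sensor frame with the status information described above might look like the following Python sketch (all field names are assumptions):

```python
import time
from dataclasses import dataclass, field
from typing import Optional, Tuple

@dataclass
class SensorRecord:
    """One entry of the front data 31 or rear data 32 (illustrative layout)."""
    image: bytes                    # raw frame from the front or rear sensor
    latitude: float                 # position of the mobile object at acquisition
    longitude: float
    speed_mps: float                # speed of the mobile object
    roll_deg: float                 # attitude angles of the mobile object
    pitch_deg: float
    yaw_deg: float
    light_color: str                # lighting color at the time of acquisition
    timestamp: float = field(default_factory=time.time)
    identified_position: Optional[Tuple[float, float]] = None  # set by unit 23
```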
[0021] The rear data acquisition unit 22 collects information obtained by a second sensor
installed in the rear of the mobile object 100 via the vehicle control unit 101. In
the present embodiment, the second sensor is exemplified as a camera (rear camera)
installed in the rear of the mobile object 100, and the image obtained from the rear
camera is used as data around its rear area. However, the data may be information
obtained from a LiDAR, a radar, or a sonar. In the present embodiment, the information
to be collected by the rear camera is the information obtained in the direction opposite
to the direction of travel of the mobile object 100. However, the second sensor only
needs to be installed in the rear of the mobile object 100, and the information to
be collected by the second sensor may be information in any direction.
[0022] The acquired data around the rear area is recorded in the storage 122. In addition,
the rear data acquisition unit 22 also records in the storage 122 the status information
of the mobile object 100 at the time of the data acquisition, such as position information,
speed, attitude angles (roll angle, pitch angle, yaw angle), and the lighting color
of the mobile object. The position information of the mobile object 100 may be, for
example, the latitude and longitude of the mobile object 100 obtained from the output
value of the positioning device connected to the mobile object via the vehicle control
unit 101. Here, let the data recorded by the rear data acquisition unit 22 in the
storage 122 be the rear data 32.
[0023] In the present embodiment, it is described that the data around its front area and
the data around its rear area are recorded in the storage 122. However, they may be
recorded in the memory 121, another area prepared in the storage device 12, or an
external device (not shown) connected via the communication I/F 13.
[0024] The position identification unit 23 is called from the front data acquisition unit
21 and the rear data acquisition unit 22. The position identification unit 23 identifies
the position of the front data 31 when called from the front data acquisition unit
21 and identifies the position of the rear data 32 when called from the rear data
acquisition unit 22. Note that the identified position of the data around the front
area and the identified position of the data around the rear area are each the position
corresponding to the information obtained from the first or second sensor (the identified
position information), not the position of the mobile object 100.
[0025] In the present embodiment, the identified position information identified by the
position identification unit 23 is recorded together with the front data 31 and the
rear data 32, but it may be recorded in a separate area as long as it is known which
data the identified position information is linked with. Further, the position identification
unit 23 may identify the positions of the data around the front area and the data
around the rear area by using the locating data 41. The locating data 41 may be data of
any object that can uniquely identify the data obtained by the first and second sensors,
such as a building, a pillar, a signboard, or a characteristic landscape existing
along the railroad track.
[0026] The image data of such uniquely identifiable objects and their position information
are associated with the locating data 41 in advance. By doing so, if the data around
the front area and the data around the rear area contain information that matches
the locating data 41, the position identification unit 23 can identify the position
without using the status information of the mobile object 100, making it possible
to shorten the processing time. In the present embodiment, a camera is used as the
sensor. However, when a radar or a sonar is used as the sensor, a combination of an
object made of a material that generates characteristic reflected waves and its
positional information may be used.
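As an illustration of the shortcut described above, the following sketch matches a sensor frame against landmark images registered in the locating data 41; the OpenCV template-matching call is one plausible realization, and the data layout and threshold are assumptions:

```python
import cv2
import numpy as np

def identify_position(frame: np.ndarray, locating_data: list) -> tuple:
    """Identify the position of a frame by matching it against the locating
    data 41: landmark images paired in advance with their coordinates.
    Each entry is assumed to be {"template": image, "position": (lat, lon)},
    with every template no larger than the frame itself."""
    for entry in locating_data:
        result = cv2.matchTemplate(frame, entry["template"], cv2.TM_CCOEFF_NORMED)
        _, max_val, _, _ = cv2.minMaxLoc(result)
        if max_val > 0.9:              # assumed similarity threshold
            return entry["position"]   # landmark found: position identified
    return None  # no match: fall back to status-information-based identification
```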
[0027] The object detection unit 24 compares the identified position information recorded
in the front data 31 with the identified position information recorded in the rear
data 32 to find a matching combination. If there is a matching combination in the
identified position information, the object detection unit 24 compares the data around
the front area with the data around the rear area. Then, if there is a difference,
it is determined that the difference indicates a fallen object dropped during the
passage of the mobile object 100, and an anomaly warning is issued.
[0028] Further, the object detection unit 24 can determine, by using the exclusion data
42, whether an object determined to be a fallen object really was dropped during the
passage of the mobile object 100. Examples of the items to be excluded include animals
such as crows and cats, gravel and stones, and things blown by the wind such as paper
waste (newspapers and magazines) and vinyl sheets; the image data of these items may
be held as the exclusion data 42.
[0029] The object detection unit 24 compares the image determined to be a fallen object
with the images of the exclusion data 42. If they match, the object detection unit
24 determines that the object found is not one that was dropped during the passage
of the mobile object 100, so that a false alarm regarding the occurrence of a fallen
object will not be issued to the mobile object 100. At this time, a warning may instead
be issued that the object is not a fallen object from the mobile object 100 but something
that was blown in from the outside during its passage. Note that the locating data
41 and the exclusion data 42 may be recorded in the storage 122 or in the memory 121.
These data may also be recorded in another area prepared in the storage device 12
or in an external device (not shown) connected via the communication I/F 13.
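For illustration, the exclusion check might be realized as template matching against the stored exclusion images; the function name and threshold below are assumptions:

```python
import cv2
import numpy as np

def is_excluded(candidate: np.ndarray, exclusion_data: list,
                threshold: float = 0.9) -> bool:
    """Check a candidate fallen-object image against the exclusion data 42
    (crows, cats, stones, wind-blown paper, vinyl sheets, ...). A match means
    the object was not dropped from the mobile object itself. Assumes each
    exclusion template is no larger than the candidate image."""
    for template in exclusion_data:
        result = cv2.matchTemplate(candidate, template, cv2.TM_CCOEFF_NORMED)
        if cv2.minMaxLoc(result)[1] > threshold:
            return True   # matches an excluded item: suppress the false alarm
    return False
```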
[0030] The function of each functional component of the monitoring device 10 is implemented
by software. The storage 122 of the storage device 12 stores a program that implements
the function of each functional component implemented by the software. This program
is loaded into the memory 121 by the processor 11 and executed by the processor 11.
[0031] In addition, the storage 122 implements the function of the history storage unit
25. The history storage unit 25 stores information about the fallen objects that the
object detection unit 24 detected in the past. Examples of such information to be
stored include position, time, and number of times of detection of each fallen object
detected. By using such information held by the history storage unit 25, the front
data acquisition unit 21 and the rear data acquisition unit 22 may, for example, shorten
the interval for collecting data in the vicinity of the location where fallen objects
are frequent, and may, conversely, lengthen the interval for collecting data in the
vicinity of the location where fallen objects are less frequent. This makes it possible
to efficiently obtain the data around its front area and the data around its rear
area.
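A sketch of how the collection interval might be derived from the detection history, with all thresholds assumed for illustration:

```python
def acquisition_interval(position, history, base_s=1.0,
                         busy_s=0.2, quiet_s=5.0, hot_count=3):
    """Derive the data-collection interval in seconds from the detection
    history held by the history storage unit 25: collect more often near
    locations where fallen objects have been frequent, less often elsewhere."""
    count = history.get(position, 0)   # past detections near this position
    if count >= hot_count:
        return busy_s                  # fallen objects frequent: sample densely
    if count == 0:
        return quiet_s                 # no history here: sample sparsely
    return base_s
```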
[0032] Note that, in the present embodiment, as shown in FIG. 1, only one processor 11 is
provided. Instead, however, multiple processors 11 may be provided. In that case,
the multiple processors 11 cooperate to execute the program that implements each function
of the monitoring device 10.
[0033] FIG. 3 is a flowchart showing the processing of the monitoring device according
to the present embodiment. The operation of the monitoring device 10 according to
Embodiment 1 will be described with reference to FIG. 3. In the present embodiment,
for ease of explanation, the processes of the front data acquisition unit 21, the
rear data acquisition unit 22, the position identification unit 23, and the object
detection unit 24 are described as being executed sequentially as shown in the flowchart.
However, the three units, namely, the front data acquisition unit 21, the rear data
acquisition unit 22, and the object detection unit 24 (that is, the units other than
the position identification unit 23, which is called from the front data acquisition
unit 21 and the rear data acquisition unit 22), can instead be executed in parallel,
as sketched below.
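As a sketch of this parallel arrangement, the three units could be run as concurrent threads (the unit callables are placeholders):

```python
import threading

def run_units_in_parallel(front_unit, rear_unit, object_detection_unit):
    """Run the two acquisition units and the object detection unit as
    concurrent threads; the position identification unit needs no thread of
    its own because it is called from inside the acquisition units."""
    threads = [threading.Thread(target=front_unit),
               threading.Thread(target=rear_unit),
               threading.Thread(target=object_detection_unit)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```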
(Step S11: Processing of front data acquisition)
[0034] The front data acquisition unit 21 acquires, by the first sensor installed in the
front of the mobile object 100, the data around the front area together with the status
information, and writes them in the front data 31. The front data acquisition unit
21 then calls the position identification unit 23.
(Step S12: Calculation and identification of front data position)
[0035] The position identification unit 23 identifies the position of the data around its
front area on the basis of the status information written in the front data 31 and
writes the identified position in the front data 31.
(Step S13: Processing of rear data acquisition)
[0036] The rear data acquisition unit 22 acquires, by the second sensor installed in the
rear of the mobile object 100, the data around the rear area together with the status
information, and writes them in the rear data 32. The rear data acquisition unit 22
then calls the position identification unit 23.
(Step S14: Calculation and identification of rear data position)
[0037] The position identification unit 23 identifies the position of the data around its
rear area on the basis of the status information written in the rear data 32 and writes
the identified position in the rear data 32.
(Step S15: Detection of object)
[0038] The object detection unit 24 compares the identified position information of the
front data 31 with the identified position information of the rear data 32, both stored
in the storage 122, and finds a matching combination. If there is a matching combination
in the identified position information, the object detection unit 24 compares the
data around the front area with the data around the rear area. Then, if there is a
difference in the combination, it is determined that the difference indicates the
existence of a fallen object dropped during the passage of the mobile object 100,
and an anomaly warning is issued.
[0039] However, since the acquisition directions of the front data 31 and the rear data
32 having identical identified position information differ by about 180 degrees, their
data cannot simply be compared. Therefore, the object detection unit 24 converts the
pixel signals of either the front data 31 or the rear data 32 by using the status
information included in the front data 31 and the rear data 32 as well as the
characteristics of the acquired data.
[0040] The status information will now be explained in detail. Specifically, the speed
information, the attitude angles (roll angle, pitch angle, yaw angle) of the mobile
object 100, and the characteristics of the sensing devices provided in the vehicle
control unit 101 for data acquisition are used. For example, in the case of a camera,
the lens and the size of the image sensor are characteristics, and the focal length
and the angle of view are determined from them. In an image taken by the camera, the
distances corresponding to the pixels can be roughly known from the shooting position,
the focal length, and the angle of view, and as a result, the positions corresponding
to the pixel signals can be known.
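As a worked example of this relationship, the angle of view follows from the sensor width and focal length as 2·atan(sensor width / (2 × focal length)); the sensor dimensions below are assumed for illustration:

```python
import math

def angle_of_view_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal angle of view: 2 * atan(sensor width / (2 * focal length))."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Example: a sensor 5.6 mm wide behind a 6 mm lens gives roughly a 50-degree
# angle of view, i.e. about 0.026 degrees per pixel across a 1920-pixel row.
aov = angle_of_view_deg(5.6, 6.0)
print(f"{aov:.1f} deg total, {aov / 1920:.4f} deg per pixel")
```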
[0041] In this way, the object detection unit 24 obtains the positions of the pixel signals
for each of the front data 31 and the rear data 32 and finds a pair of the front data
31 and the rear data 32 having the same pixel-signal positions to perform the comparison.
When the pixel signals are compared, the markers and the characteristics of landmarks
and the like recorded in the data can be used. In doing so, size scaling and angle
adjustment may be performed in order to match the size of the pixel area and the obstacle.
[0042] Note that, in the acquired data, depending on the imaging interval, the pixel signals
of the same position may be found in multiple images, such as a near view and a distant
view. When there are multiple choices of images to use, the identification accuracy
is improved by giving priority to the image with higher resolution and not using the
image with lower resolution. Similarly, depending on the imaging interval, the pixel
signals of the same position may be found in multiple images with different subject
depths. In that case as well, the identification accuracy is improved by giving priority
to a sharp, higher-resolution image and not using an image whose resolution is lowered
by blurring.
[0043] Also, depending on the direction of the sun, the shadow of the mobile object may
be cast on the image. In such a case, the cast shadow may be corrected so that the
images match each other, or the orientation of the sensor may be changed to take a
shadow-free image for priority use. A shadow can be detected from the image signal,
but it can also be predicted from the size of the mobile object and the position of
the sun calculated from the photographing time and place; alternatively, an image
determined in advance to be unsuitable for the identification processing may simply
not be used.
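The prediction from the sun position reduces to simple geometry; for illustration, assuming the sun elevation has already been computed from the photographing time and place:

```python
import math

def shadow_length_m(object_height_m: float, sun_elevation_deg: float) -> float:
    """Length of the shadow cast by the mobile object, predicted from the sun
    elevation (computable from the photographing time and place): a low sun
    casts a long shadow, length = height / tan(elevation)."""
    if sun_elevation_deg <= 0.0:
        return float("inf")            # sun at or below the horizon
    return object_height_m / math.tan(math.radians(sun_elevation_deg))

# Example: a 4 m high vehicle under a 30-degree sun casts a shadow of ~6.9 m.
print(f"{shadow_length_m(4.0, 30.0):.1f} m")
```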
[0044] The object detection unit 24 searches for the object to be detected on the basis
of the pixel signals at the identified data position. In the description of the present
embodiment, this object is assumed to be a part that has fallen from the mobile object
100. The object detection unit 24 can create a stereoscopic image from the front data
31 by performing viewpoint conversion. The stereoscopic image may be a bird's-eye
view image or a three-dimensionally reconstructed image. The object detection unit
24 performs similar processing on the rear data 32 as well.
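One common way to realize such a viewpoint conversion to a bird's-eye view is a perspective (homography) warp over the track plane, sketched below with OpenCV; the four point correspondences would come from camera calibration and are assumptions here:

```python
import cv2
import numpy as np

def to_birds_eye(frame: np.ndarray, src_pts: np.ndarray, dst_pts: np.ndarray,
                 out_w: int = 400, out_h: int = 600) -> np.ndarray:
    """Warp a camera frame to a bird's-eye view. src_pts are four image points
    on the track plane; dst_pts are their desired locations in the top view.
    Both are 4x2 arrays, typically obtained from camera calibration."""
    h = cv2.getPerspectiveTransform(src_pts.astype(np.float32),
                                    dst_pts.astype(np.float32))
    return cv2.warpPerspective(frame, h, (out_w, out_h))
```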
[0045] Then, the front data 31 and the rear data 32 are compared to determine whether
the difference between them indicates an object to be detected. Stones that have merely
moved on the railroad track, for example, are excluded from the objects to be detected
so that they are not determined to be fallen objects. However, in consideration of
the possibility that foreign objects such as stones that have moved onto the railroad
track may hinder safe railroad operation, it is also possible to include such objects
among the targets to be detected.
[0046] In bright daytime conditions there is no particular problem, but when lighting
is required, care must be taken if a camera recording visible light is used as the
sensor to acquire images. For example, if the mobile object 100 is a railroad vehicle,
it is required that, during operation, the color of the front light and the color
of the rear light be different from each other. Therefore, if a camera recording visible
light is used as the sensor, an object may be erroneously determined to be a fallen
object due to this difference in color.
[0047] In such a case, the object detection unit 24 cancels the color difference between
the front light and the rear light by using setting values such as the colors of the
lights in the status information. A straightforward way to correct the color difference
of the lights is to use the von Kries color conversion formula. Thus, even if a color
difference occurs between the camera image acquired in front of the mobile object
100 and the camera image acquired in the rear, it is possible to prevent erroneous
detection of an obstacle due to the color difference.
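The von Kries correction amounts to a diagonal (per-channel) gain. Strictly, the transform operates in an LMS cone-response space; the per-RGB-channel simplification below is a common approximation, with the light colors taken from the status information:

```python
import numpy as np

def von_kries_correct(image: np.ndarray, src_light_rgb, dst_light_rgb) -> np.ndarray:
    """Scale each color channel by the ratio of the reference light color to
    the source light color, canceling the front/rear light color difference.
    (The strict von Kries transform works in an LMS cone space; this per-RGB
    diagonal version is a common simplification.)"""
    gains = np.asarray(dst_light_rgb, np.float32) / np.asarray(src_light_rgb, np.float32)
    corrected = image.astype(np.float32) * gains   # broadcasts over an H x W x 3 image
    return np.clip(corrected, 0, 255).astype(np.uint8)

# Example: map an image lit by a reddish rear light toward a white front light.
# corrected = von_kries_correct(rear_image, (255, 200, 180), (255, 255, 255))
```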
[0048] As described above, the monitoring device 10 according to the present embodiment
uses the front data acquired by the front data acquisition unit 21 together with the
status information acquired at that time, and the rear data acquired by the rear data
acquisition unit 22 together with the status information acquired at that time. The
position identification unit 23 further uses the characteristics contained in these
pieces of status information. The front data and the rear data having the same identified
position information are then compared to detect a fallen object. This makes it possible
to detect a fallen object immediately after it occurs during the travel of the mobile
object 100.
[0049] As a result, if a fallen object occurs that may interfere with railroad operation,
immediate action can be taken, improving the safety of railroad operation. In addition,
this configuration, which uses only the sensors attached to the front and the rear
of the railroad vehicle, contributes to reducing the number of monitoring cameras
to be installed along the railroad track. In the present embodiment, a case is described
in which the front data acquisition unit 21 acquires the data around the front area
in the direction of travel of the mobile object 100, but it may acquire the data around
the rear area as long as the first sensor is installed in the front of the mobile
object 100. Similarly, in the present embodiment, a case is described in which the
rear data acquisition unit 22 acquires the data around the rear area in the direction
of travel of the mobile object 100, but it may acquire the data around the front area
as long as the second sensor is installed in the rear of the mobile object 100.
[0050] When the mobile object 100 includes a plurality of railroad cars, the front data
acquisition unit 21 and the rear data acquisition unit 22 may be provided at the front
and the rear of each of the cars constituting the mobile object 100, respectively.
In that case, the front data acquisition unit 21 of the frontmost car and the rear
data acquisition unit 22 of the rearmost car are used while traveling. With this
configuration, even if the mobile object 100 is separated into a plurality of mobile
objects, it is possible to use the front data acquisition unit 21 of the frontmost
car and the rear data acquisition unit 22 of the rearmost car in each separated mobile
object. Conversely, when a plurality of mobile objects 100 are connected into one
mobile object, the front data acquisition unit 21 of the frontmost mobile object 100
and the rear data acquisition unit 22 of the rearmost mobile object 100 can be used.
Modified Example 1
[0051] Embodiment 1 describes the case where each functional component is implemented by
software. However, each of these functional components may be implemented by hardware.
[0052] FIG. 4 shows a hardware configuration of the monitoring device 10 according to Modified
Example 1. If each functional component is implemented by hardware, the monitoring
device 10 includes an electronic circuit 15 in place of the processor 11 and the storage
device 12. The electronic circuit 15 is a dedicated circuit that implements the functions
of each functional component and the storage device 12.
[0053] FIG. 4 shows, as FIG. 1 does, a configuration in which the communication interface
13, the on-board interface 14, and the electronic circuit 15 are connected via a bus.
However, the electronic circuit 15 may be configured as a single circuit that also
implements the functions of the communication interface 13 and the on-board interface
14.
[0054] Examples of the electronic circuit 15 include a single circuit, a composite circuit,
a programmed processor, a parallel programmed processor, a logic IC, a gate array
(GA), an application specific integrated circuit (ASIC), and a field-programmable
gate array (FPGA).
[0055] Further, each functional component of the monitoring device 10 described in Embodiment
1 may be integrated into one electronic circuit 15, or they may be allocated to and
implemented by a plurality of the electronic circuits 15.
Modified Example 2
[0056] Modified Example 2 describes the case in which some of the functional components
are implemented by hardware and the remaining functional components are implemented
by software. FIG. 5 shows a configuration of the monitoring device 10 according to
Modified Example 2.
[0057] In FIG. 5, the processor 11, the storage device 12, and the electronic circuit 15
are called processing circuits.
[0058] In other words, the function of each functional component is implemented by a processing
circuit.
Embodiment 2
[0059] The configuration of Embodiment 1 is for detecting an object fallen from a railroad
vehicle. Next, in the present embodiment, a configuration for detecting an obstacle
at a station provided with platform doors will be described. The present embodiment
differs from Embodiment 1 in that the mobile object 100 identifies the position of
the data around its front area and the position of the data around its rear area on
the basis of the positions of the platform doors at a station. In the present embodiment,
the points of difference will be explained, and descriptions of the common points
will be omitted.
[0060] FIG. 6 shows a functional configuration diagram of the monitoring device 10 according
to the present embodiment. In FIG. 6, the numeral 26 denotes a control unit with which
the monitoring device 10 identifies the position of the data around its front area
and the position of the data around its rear area on the basis of the positions of
the platform doors at the station. For example, the control unit 26 stores an image
of a platform door in advance and determines whether the platform door image is included
in the data around the front area acquired by the front data acquisition unit 21.
If the platform door image is included, the control unit 26 calls the front data
acquisition unit 21 each time the frontmost end of the mobile object 100 reaches a
platform door at the station and calls the rear data acquisition unit 22 each time
the rearmost end of the mobile object 100 reaches a platform door at the station.
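For illustration, the door-triggered acquisition by the control unit 26 might look like the following sketch, where the platform-door image match and its threshold are assumptions:

```python
import cv2

class ControlUnit:
    """Sketch of the control unit 26: call the acquisition units each time an
    end of the vehicle reaches a platform door, recognized by matching a
    pre-stored platform-door image against the incoming frame."""

    def __init__(self, door_template, front_unit, rear_unit, threshold=0.9):
        self.door_template = door_template   # stored platform-door image
        self.front_unit = front_unit         # front data acquisition unit 21
        self.rear_unit = rear_unit           # rear data acquisition unit 22
        self.threshold = threshold           # assumed match threshold

    def _door_visible(self, frame) -> bool:
        result = cv2.matchTemplate(frame, self.door_template, cv2.TM_CCOEFF_NORMED)
        return cv2.minMaxLoc(result)[1] > self.threshold

    def on_front_frame(self, frame):
        if self._door_visible(frame):        # frontmost end at a platform door
            self.front_unit()

    def on_rear_frame(self, frame):
        if self._door_visible(frame):        # rearmost end at a platform door
            self.rear_unit()
```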
[0061] The control unit 26 outputs an anomaly warning when the object detection unit 24
detects an obstacle. In FIG. 6, the same reference numerals as those in FIG. 2 denote
the same or corresponding components or units, each of which, except for the control
unit 26, performs the same operation as those described in FIG. 2 shown in Embodiment
1.
[0062] FIG. 7 is a flowchart showing the operation of the monitoring device 10 according
to the present embodiment. The operation of the monitoring device 10 according to
the present embodiment will be described with reference to FIG. 7. In the present
embodiment, for ease of explanation, it is described that the control unit 26 calls
the front data acquisition unit 21, the rear data acquisition unit 22, and the object
detection unit 24, and that an anomaly warning is issued when an obstacle is detected
as a result. Instead, as in Embodiment 1, the three units, namely, the front data
acquisition unit 21, the rear data acquisition unit 22, and the object detection unit
24 (that is, the units other than the position identification unit 23, which is called
from the front data acquisition unit 21 and the rear data acquisition unit 22), can
be executed in parallel. In that case, the object detection unit 24 can itself output
the anomaly warning when an obstacle is detected.
(Step S21: Processing of front data acquisition)
[0063] When the railroad vehicle enters the station, the control unit 26 repeatedly calls
the front data acquisition unit 21 until the railroad vehicle stops at the station.
The front data acquisition unit 21 acquires the data around the front area using the
first sensor installed in the front of the mobile object 100. The front data acquisition
unit 21 writes the acquired data around the front area in the memory 121.
[0064] For example, if there are three platform doors corresponding to the doors of the
railroad vehicle, the control unit 26 calls the front data acquisition unit 21 each
time the railroad vehicle approaches each of the three platform doors. When the mobile
object 100 approaches a predetermined position with respect to each platform door
or each vehicle door (for example, a position that gives a full view of the door),
the front data acquisition unit 21 records the data around the front area and the
status information as the front data 31. In this example, the front data taken at
the three locations is written in the memory 121.
(Step S22: Position identification processing of data from around front area)
[0065] As in Embodiment 1, when called by the front data acquisition unit 21, the position
identification unit 23 identifies the position of the data around its front area and
writes the position in the front data 31.
(Step S23: Waiting for completion of passengers getting on and off)
[0066] The control unit 26 waits while the doors of the railroad vehicle and the platform
doors open, the passengers get on and off, and the platform doors and the doors of
the railroad vehicle close.
(Step S24: Processing of rear data acquisition)
[0067] The control unit 26 repeatedly calls the rear data acquisition unit 22 until the
railroad vehicle leaves the station. The rear data acquisition unit 22 acquires the
data around the rear area using the second sensor installed in the rear of the mobile
object 100. Specifically, as in Step S21, the rear data acquisition unit 22 collects
the information obtained by the second sensor via the vehicle control unit 101. The
rear data acquisition unit 22 writes the acquired rear data 32 in the memory 121.
Following the example given in Step S21, when the rear of the mobile object 100
approaches a predetermined position with respect to each platform door or each vehicle
door (for example, a position that gives a full view of the door), the rear data
acquisition unit 22 acquires the data around the rear area and the status information.
In this way, the rear data taken at the three locations, each corresponding to the
front data described above, is written in the memory 121.
(Step S25: Position identification processing of data from around rear area)
[0068] As in Embodiment 1, when called by the rear data acquisition unit 22, the position
identification unit 23 identifies the position of the data around its rear area and
writes the position in the rear data 32.
(Step S26: Processing of obstacle detection)
[0069] Next, the control unit 26 calls the object detection unit 24. As in Embodiment
1, the object detection unit 24 compares the data around the front area with the data
around the rear area whose identified positions match, determines any difference to
be a fallen object which occurred during the passage of the mobile object 100, and
issues an anomaly warning.
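A sketch of this door-by-door comparison, pairing the front data and the rear data by their identified positions (the detector callable stands in for the comparison performed by the object detection unit 24):

```python
def check_platform_doors(front_records, rear_records, detector):
    """Pair the front data 31 and rear data 32 taken at each platform door by
    their identified positions, run the object detection on every pair, and
    return the door positions for which an anomaly warning should be issued."""
    rear_by_pos = {r.identified_position: r for r in rear_records}
    warnings = []
    for front in front_records:
        rear = rear_by_pos.get(front.identified_position)
        if rear is not None and detector(front, rear):  # unit 24's comparison
            warnings.append(front.identified_position)
    return warnings
```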
[0070] In the present embodiment, an example is described in which the position of each
platform door or each vehicle door is recognized in advance and the front data and
the rear data are then taken at each position. However, in a case where the front
data and the rear data are taken continuously at regular time intervals, it is also
possible to detect and count doors from the images by considering the front-rear
relationship between images taken in time series. Alternatively, the position
identification unit 23 can identify the position of an obstacle from the numbers or
codes written on the platform doors recorded in the stored front data and rear data.
(Step S27: Anomaly warning)
[0071] The control unit 26 issues an anomaly warning when the object detection unit 24
detects an obstacle (YES in Step S26). For example, by transmitting this anomaly warning
to the station (management center), the station staff can respond promptly. It is
also possible to take measures such as stopping the following train if the situation
is urgent, or, if it is not urgent, allowing the next train to enter and the doors
to open at the platform so that the problem that has occurred can be handled.
[0072] As described so far, the monitoring device 10 according to the present embodiment
can detect, at the time when the mobile object 100 leaves the platform of a station,
an anomaly which occurred after the mobile object 100 entered the platform. With the
first and second sensors installed at the front and the rear of the mobile object
100, the monitoring device 10 according to the present embodiment obtains the data
around the front area viewed from the mobile object 100 and the data around the rear
area viewed from the mobile object 100. By comparing them, it becomes possible to
immediately detect a fallen object lying outside the platform doors and on the near
side of the railroad track, which is normally difficult to detect with monitoring
cameras installed on the platform alone.
[0073] With the above effects, when a detected obstacle may interfere with the railroad
service, immediate action can be taken, which contributes to improving the safe and
on-time operation of the railroad service. Further, since the sensors attached to
the railroad vehicle are used, it is not necessary to equip each platform door with
its own image monitoring device, and the cost can be greatly reduced.
Description of Reference Numerals and Signs
[0074] 10 monitoring device, 11 processor, 12 storage device, 13 communication I/F, 14 on-board
I/F, 15 electronic circuit, 21 front data acquisition unit, 22 rear data acquisition
unit, 23 position identification unit, 24 object detection unit, 25 history storage
unit, 26 control unit, 31 front data, 32 rear data, 41 locating data, 42 exclusion
data, 100 mobile object, 101 vehicle control unit, 121 memory, 122 storage