(19)
(11) EP 4 286 245 A1

(12) EUROPEAN PATENT APPLICATION

(43) Date of publication:
06.12.2023 Bulletin 2023/49

(21) Application number: 23176460.6

(22) Date of filing: 31.05.2023
(51) International Patent Classification (IPC): 
B61L 25/02(2006.01)
B61L 23/04(2006.01)
(52) Cooperative Patent Classification (CPC):
B61L 25/025; B61L 23/04
(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR
Designated Extension States:
BA
Designated Validation States:
KH MA MD TN

(30) Priority: 03.06.2022 JP 2022090911

(71) Applicants:
  • KABUSHIKI KAISHA TOSHIBA
    Minato-ku Tokyo 105-0023 (JP)
  • Toshiba Infrastructure Systems & Solutions Corporation
    Kawasaki-shi, Kanagawa 212-0013 (JP)

(72) Inventors:
  • SUZUKI, Yoshihiko
    Kanagawa, 212-0013 (JP)
  • KOBAYASHI, Hiroyuki
    Kanagawa, 212-0013 (JP)
  • TAKAHASHI, Yusuke
    Kanagawa, 212-0013 (JP)
  • OODAKE, Tatsuya
    Kanagawa, 212-0013 (JP)
  • SETO, Naoto
    Kanagawa, 212-0013 (JP)
  • FUTAGAMI, Takuya
    Kanagawa, 212-0013 (JP)
  • NAKANO, Takahisa
    Kanagawa, 212-0013 (JP)
  • ASANO, Wataru
    Kanagawa, 212-0013 (JP)
  • HATTORI, Yohei
    Kanagawa, 212-0013 (JP)
  • KATO, Noriyasu
    Kanagawa, 212-0013 (JP)
  • SEGAWA, Taisei
    Kanagawa, 212-0013 (JP)
  • NAMEKI, Hideaki
    Kanagawa, 212-0013 (JP)

(74) Representative: AWA Sweden AB 
Box 5117
200 71 Malmö (SE)

   


(54) POSITION INFORMATION DETECTION DEVICE


(57) According to one embodiment, a position information detection device includes a receiver configured to receive a plurality of data in which a plurality of node coordinates arranged at regular intervals in such a manner as to correspond to left and right rails are correlated with position information; a generator configured to generate an edge image by using an input image from an imaging device disposed in a vehicle; a rail specifying unit configured to specify the data including second rail node string information that is the node coordinates that are most overlapping with a position of an edge included in the edge image; and a vehicle position acquisition unit configured to acquire second position information based on the specified data.




Description

FIELD



[0001] Embodiments described herein relate generally to a position information detection device.

BACKGROUND



[0002] There is known a method of extracting image features (for example, edges) of the left and right rails from a camera attached to the front of a vehicle (in particular, a train), and detecting the rails in the traveling direction. In this method, there are cases where the image features cannot be extracted due to the shapes of the rails (curves, branches, or the like), the weather, the time of day, and the like, and it is difficult to obtain stable performance.

[0003] In addition, there is known a method of acquiring left-and-right rail information by prestoring information of rails in a map database, and interlocking the map database with GNSS (Global Navigation Satellite System) position information.

[0004] In the method in which the map database is interlocked with the GNSS position information, stable performance can be obtained, but the rails cannot be detected in a manner that takes the error of the GNSS position information into account, and there are cases where position correction of the vehicle is needed at a place where the position accuracy of the vehicle is regarded as important.

BRIEF DESCRIPTION OF THE DRAWINGS



[0005] 

FIG. 1 is a diagram schematically illustrating an example of an overall configuration of a position information detection system including a position information detection device according to one embodiment.

FIG. 2 is a block diagram illustrating an example of a configuration of a central device according to one embodiment.

FIG. 3 is a view for describing an example of left-and-right rail node information included in a map database.

FIG. 4 is a block diagram illustrating one configuration example of a vehicle in which a position information detection device according to one embodiment is mounted.

FIG. 5 is a view for describing an example of edge positions near left and right rails included in an edge image, and a plurality of node coordinates included in left-and-right rail node string information.

FIG. 6 is a flowchart for describing an example of a position detection process operation of a position information detection device according to one embodiment.

FIG. 7 is a flowchart for describing an example of the position detection process operation of the position information detection device according to the embodiment.

FIG. 8 is a flowchart for describing an example of the position detection process operation of the position information detection device according to the embodiment.


DETAILED DESCRIPTION



[0006] According to one embodiment, a position information detection device includes a receiver configured to receive a plurality of data in which a plurality of node coordinates arranged at regular intervals in such a manner as to correspond to left and right rails are correlated with position information; a generator configured to generate an edge image by using an input image from an imaging device disposed in a vehicle; a rail specifying unit configured to specify the data including second rail node string information that is the node coordinates that are most overlapping with a position of an edge included in the edge image; and a vehicle position acquisition unit configured to acquire second position information based on the specified data.

[0007] Hereinafter, a position information detection device according to an embodiment is described in detail with reference to the accompanying drawings. Note that in the drawings used in the description of the embodiment, scales of parts are changed as appropriate. In addition, in some cases, in the drawings used in the description of the embodiment, configurations are omitted as appropriate for the purpose of description.

[0008] FIG. 1 is a diagram schematically illustrating an example of an overall configuration of a position information detection system including a position information detection device according to one embodiment.

[0009] The position information detection system illustrated in FIG. 1 includes a vehicle 1 and a central device 2.

[0010] The vehicle 1 is a vehicle included in a train that runs on rails, for example, according to a predetermined schedule. Note that in a case where a train includes a plurality of vehicles, it is not necessary that all vehicles include the functions described below.

[0011] The vehicle 1 is connected to the central device 2 and a satellite 3 via a network. The vehicle 1 is, for example, the vehicle located forwardmost in the traveling direction of a train including a plurality of vehicles, and includes an antenna (not illustrated) that receives a radio wave from the satellite 3. For example, the antenna is, by design, disposed on a roof portion about 3 m rearward of the frontmost position of the forwardmost vehicle. The position information of the vehicle 1 is based on the position at which the antenna is disposed.

[0012] The central device 2 is configured to be communicable with the vehicle 1 via a network, and manages the operational condition of, for example, a plurality of vehicles 1.

[0013] FIG. 2 is a block diagram illustrating an example of a configuration of the central device 2 according to one embodiment.

[0014] The central device 2 of the present embodiment includes a receiver 21, a controller 22, an input unit 23, an output unit 24, a storage unit 25, a transmitter 26, and a bus communication line BL1.

[0015] The bus communication line BL1 is connected to each of the structural components included in the central device 2. The controller 22 can transmit and receive data to and from the other structural components included in the central device 2 via the bus communication line BL1.

[0016] The receiver 21 receives, from the vehicle 1, landmark information and information relating to an error corrected by the vehicle 1 (hereinafter referred to as "error information"). The error information includes, for example, at least one of: a fact that an error exists in present position information of the vehicle 1 (hereinafter referred to as "first position information") acquired by using GNSS position information; a fact that an error was corrected by using position information of the vehicle 1 (hereinafter "second position information") acquired by using an input image from an imaging device 4 (to be described later); a fact that an error was corrected by using position information of the vehicle 1 (hereinafter "third position information") acquired by using landmark information; landmark information included in a corrected input image; a correction area estimated from the first position information; and rail node string information of the correction area that can be acquired from the data of a map database. The information included in the error information is not limited to the above. The details of the landmark information, the second position information and the third position information will be described later.

[0017] The controller 22 includes at least one of processors such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and an FPGA (Field-Programmable Gate Array). The controller 22 can implement various functions of the central device 2, based on programs, such as system software, application software or firmware, stored in an auxiliary storage unit 252.

[0018] By referring to the position information of the vehicle 1, the controller 22 compares the landmark information fed back from the vehicle 1 with the landmark information included in the data of the map database stored in the storage unit 25. In a case where a landmark existing in a landmark candidate area included in the fed-back landmark information does not agree with a landmark existing in a landmark candidate area included in the landmark information of the map database, the controller 22 corrects the information as needed, and outputs the corrected information to the output unit 24.

[0019] The input unit 23 may include, for example, a user interface such as a mouse or a keyboard, and various kinds of sensors such as a microphone, a touch panel and a camera. The input unit 23 transmits information, which is acquired by an operation of a user, to the controller 22 via the bus communication line BL1.

[0020] The output unit 24 may include, for example, display means such as a monitor, and sound output means such as a speaker. Note that the output unit 24 may be configured to be connected to a device outside the computer. The output unit 24 displays the information output by the controller 22 on the monitor or the like as alarm information, or outputs the information by using the sound output means such as the speaker.

[0021] The storage unit 25 includes, for example, a main storage unit 251 and an auxiliary storage unit 252.

[0022] The main storage unit 251 may include, for example, a ROM (read-only memory) and a RAM (random-access memory). The ROM is a nonvolatile memory that is used only for reading data, and can store data and various setting values which are used by the controller 22 in executing various processes. In addition, the RAM can be used as a so-called work area in which the controller 22 temporarily stores data in executing various processes. The main storage unit 251 of the present embodiment is, for example, a RAM, and is used as a memory.

[0023] The main storage unit 251 can temporarily store the data of the map database, a position information priority setting, error information, and the like.

[0024] The auxiliary storage unit 252 is a non-transitory computer-readable storage medium of a computer whose central component is the controller 22. The auxiliary storage unit 252 is, for example, an EEPROM (trademark) (electrically erasable programmable read-only memory), an HDD (hard disk drive), an SSD (solid state drive), or the like.

[0025] The auxiliary storage unit 252 can store data used by the controller 22 in executing various processes, and data or various setting values generated in the processing in the controller 22. For example, the auxiliary storage unit 252 is a memory that stores various kinds of information, and can store timetable (schedule) information, the data of the map database, position information priority setting, and error information. A concrete description of the position information priority setting will be given later.

[0026] The timetable (schedule) information includes information of the position and time of a vehicle, for example, information of a station at which the vehicle stops, a time at which the vehicle arrives at the station, and a time at which the vehicle leaves the station, in a running route of the vehicle 1.

[0027] The map database includes a plurality of sets of data, which are correlated and stored so as to correspond to the position information of the vehicle 1. The map database includes, for example, a plurality of sets of data, such as position information (latitude, longitude, altitude), land undulations, rail shapes, location features, landmark information (number, size, coordinates, type), left rail node information (number, coordinates), and right rail node information (number, coordinates). In the map database, the pieces of data are correlated with, for example, identifiers (for example, row numbers) that are assigned in the order of the positions over which the vehicle runs. Note that the data stored in the map database is not limited to the above. It is assumed that the data of the map database is acquired or updated in advance by a dedicated vehicle.
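
The following sketch, in Python, shows one hypothetical way to hold such a map-database record in memory; the class name, field names and types are assumptions made for illustration and are not specified by the embodiment.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# A hypothetical map-database record holding the items listed in paragraph [0027].
# Field names and types are illustrative assumptions, not part of the embodiment.
@dataclass
class MapRecord:
    row_id: int                # identifier assigned in running order
    latitude: float
    longitude: float
    altitude: float
    # landmark entries: (number, (width, height), (x, y), type)
    landmarks: List[Tuple[int, Tuple[int, int], Tuple[int, int], str]] = field(default_factory=list)
    # rail node entries: (node number, x, y) in image coordinates
    left_rail_nodes: List[Tuple[int, int, int]] = field(default_factory=list)
    right_rail_nodes: List[Tuple[int, int, int]] = field(default_factory=list)
```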

[0028] FIG. 3 is a view for describing an example of left-and-right rail node information included in the map database.

[0029] The coordinates in the left-and-right rail node information included in the map database are the position coordinates of the rails in a photographed image captured by the imaging device 4 while the vehicle 1 runs over the position of the position information (latitude, longitude, altitude). The coordinates of a node are calculated by setting the upper left of the input image from the imaging device 4 (the upper left when facing in the traveling direction of the vehicle 1) as the reference 0 (origin), with the horizontal direction to the right of the reference as the positive x-axis and the vertical direction downward from the reference as the positive y-axis. It is assumed that the reference point and the axes are preset by a map database creator or the like. In addition, the same applies to the coordinates in the landmark information stored in the map database.

[0030] It is assumed that the set of the left rail node information (number, coordinates) and the right rail node information (number, coordinates) included in the data of the map database is rail node string information. The rail node string information is composed of a plurality of node coordinates arranged at regular intervals, and is correlated with the position information (latitude, longitude, altitude) of the vehicle 1. A shape substantially equal to the shape of a rail can be reproduced by connecting neighboring nodes (nodes whose coordinates are closest to each other) included in the rail node string information by straight lines. In a case where the shape of a rail is a curve, a branch or the like, a shape substantially equal to the shape of the rail can similarly be reproduced by making the distance between nodes as small as possible, thereby approximating the curved line.
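
The reconstruction described above can be sketched as follows, assuming nodes are given as (x, y) image coordinates ordered along the rail; the function name and sampling step are illustrative assumptions.

```python
from typing import List, Tuple

def reconstruct_rail(nodes: List[Tuple[int, int]],
                     samples_per_segment: int = 10) -> List[Tuple[float, float]]:
    """Approximate a rail as a polyline through its node coordinates.

    Each pair of neighboring nodes is joined by a straight segment, which
    reproduces the rail shape to within the node spacing (paragraph [0030]).
    """
    points: List[Tuple[float, float]] = []
    for (x0, y0), (x1, y1) in zip(nodes, nodes[1:]):
        for i in range(samples_per_segment):
            t = i / samples_per_segment
            points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    if nodes:
        points.append((float(nodes[-1][0]), float(nodes[-1][1])))
    return points
```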

[0031] The transmitter 26 transmits the data of the map database to the vehicle 1 (a position information detection device 5 and a support control device 6 to be described later). Note that the data of the map database in the central device 2 may include data of two map databases, namely master data and vehicle instruction data. The master data of the map database includes detailed map information of the entire running route, and the vehicle instruction data of the map database includes at least a part of the information of the master data, namely the data of the running route of each vehicle according to the schedule (timetable). Upon a request from the vehicle 1, or periodically, the transmitter 26 transmits at least the vehicle instruction data to the position information detection device 5 and the support control device 6.

[0032] FIG. 4 is a block diagram illustrating one configuration example of a vehicle in which a position information detection device according to one embodiment is mounted.

[0033] A vehicle 1 of the present embodiment includes an imaging device 4, a position information detection device 5, a support control device 6, and a bus communication line BL2.

[0034] The bus communication line BL2 is connected to each of the structural components included in the vehicle 1. A controller 52 included in the position information detection device 5, and a correction controller 62 included in the support control device 6 can communicate data with the other structural components included in the vehicle 1 via the bus communication line BL2.

[0035] The imaging device 4 is, for example, a stereo camera. The imaging device 4 transmits a photographed image to the position information detection device 5 as an input image. In a case where the frame rate is 30 fps, the imaging device 4 may be configured to transmit input images 30 times per second. The frequency at which the imaging device 4 transmits input images to the position information detection device 5 is not limited to the above, and may be changed as appropriate in accordance with the frame rate of the imaging device 4 and the frequency of radio wave reception from the satellite 3.

[0036] The support control device 6 includes a receiver 61, a correction controller 62, a storage unit 63, and a transmitter 64.

[0037] The receiver 61 receives the first position information from the antenna that receives a radio wave from the satellite 3. The receiver 61 includes a function of receiving a plurality of data of the map database. In addition, the receiver 61 receives, from the position information detection device 5, the second position information, the third position information, the rail node string information and the landmark information.

[0038] The correction controller 62 includes at least one of processors such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and an FPGA (Field-Programmable Gate Array). The correction controller 62 can implement various functions of the support control device 6, based on programs, such as system software, application software or firmware, stored in an auxiliary storage unit 632.

[0039] The correction controller 62 compares the first position information and the second position information, and executes a process corresponding to the comparison result. To be more specific, the correction controller 62 detects an error between the first position information and the second position information, and, in a case where the error exceeds a predetermined threshold, refers to a position information priority setting, and can correct the error by setting the first position information or the second position information as the present position information of the vehicle 1.

[0040] Similarly, the correction controller 62 compares the first position information and the third position information, and executes a process corresponding to the comparison result. The correction controller 62 compares the second position information and the third position information, and executes a process corresponding to the comparison result.

[0041] To be more specific, the correction controller 62 detects an error between the first position information and the third position information, and, in a case where the error exceeds a predetermined threshold, refers to the position information priority setting, and can correct the error by setting the first position information or the third position information as the present position information of the vehicle 1. In addition, the correction controller 62 detects an error between the second position information and the third position information, and, in a case where the error exceeds a predetermined threshold, refers to the position information priority setting, and can correct the error by setting the second position information or the third position information as the present position information of the vehicle 1.
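
A minimal sketch of the pairwise comparison performed by the correction controller 62, assuming positions are (latitude, longitude) pairs, the priority setting is an ordered list of labels, and the threshold is 2 m; all of these representations and values are assumptions for illustration.

```python
import math
from typing import List, Tuple

Position = Tuple[float, float]  # assumed (latitude, longitude) in degrees

def horizontal_error_m(a: Position, b: Position) -> float:
    """Approximate metric distance between two latitude/longitude points."""
    lat = math.radians((a[0] + b[0]) / 2.0)
    dy = (a[0] - b[0]) * 111_320.0                  # metres per degree of latitude
    dx = (a[1] - b[1]) * 111_320.0 * math.cos(lat)  # metres per degree of longitude
    return math.hypot(dx, dy)

def correct_error(label_a: str, pos_a: Position,
                  label_b: str, pos_b: Position,
                  priority: List[str],
                  threshold_m: float = 2.0) -> Tuple[str, Position]:
    """If the error between two position estimates exceeds the threshold,
    keep the estimate ranked higher in the priority setting; otherwise keep
    the first estimate as it stands (paragraphs [0039] to [0041])."""
    if horizontal_error_m(pos_a, pos_b) <= threshold_m:
        return label_a, pos_a
    winner = min((label_a, label_b), key=priority.index)
    return (winner, pos_a) if winner == label_a else (winner, pos_b)
```

For example, with the assumed priority list ["third", "second", "first"], a discrepancy larger than the threshold between the first and second position information results in the second position information being kept.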

[0042] The storage unit 63 includes, for example, a main storage unit 631 and an auxiliary storage unit 632.

[0043] The main storage unit 631 may include, for example, a ROM (read-only memory) and a RAM (random-access memory). The ROM is a nonvolatile memory that is used only for reading data, and can store data and various setting values which are used by the correction controller 62 in executing various processes. In addition, the RAM can be used as a so-called work area in which the correction controller 62 temporarily stores data in executing various processes. The main storage unit 631 of the present embodiment is, for example, a RAM, and is used as a memory.

[0044] The main storage unit 631 can temporarily store the data of the map database, first position information, rail node string information, second position information, landmark information, third position information, position information priority setting, and information (error information) relating to an error corrected by the correction controller 62.

[0045] The auxiliary storage unit 632 is a non-transitory computer-readable storage medium of a computer whose central component is the correction controller 62. The auxiliary storage unit 632 is, for example, an EEPROM (trademark) (electrically erasable programmable read-only memory), an HDD (hard disk drive), an SSD (solid state drive) or the like.

[0046] The auxiliary storage unit 632 can store data used by the correction controller 62 in executing various processes, and data or various setting values generated in the processing in the correction controller 62. For example, the auxiliary storage unit 632 is a memory that stores various kinds of information, and can store the data of the map database, first position information, rail node string information, second position information, landmark information, third position information, position information priority setting, and information (error information) relating to an error corrected by the correction controller 62.

[0047] The transmitter 64 transmits the landmark information and the error information to the central device 2. In addition, the transmitter 64 transmits the error information to the position information detection device 5.

[0048] The position information detection device 5 of the present embodiment includes a receiver 51, a controller 52, a storage unit 53, and a transmitter 54.

[0049] The position information detection device 5 may include a processor that executes programs for implementing various functions to be described later, and a memory storing the programs. The processor is typically a CPU (Central Processing Unit) and/or a GPU (Graphics Processing Unit), but may be a microcomputer, an FPGA (Field Programmable Gate Array), or a DSP (Digital Signal Processor). In addition, the memory records a program that is executed by the processor in order to implement the operation of the position information detection device 5, and temporarily stores data or the like used by the processor. Note that the program may be recorded in a recording medium that is readable by the position information detection device 5. In this case, the processor can implement various functions by executing the program that is read from the recording medium.

[0050] The receiver 51 receives the first position information from the antenna that receives a radio wave from the satellite 3. The receiver 51 includes a function of receiving, from the central device 2, a plurality of pieces of the timetable information and the data of the map database. The receiver 51 can receive, for example, the timetable of the running route for one day and the data of the map database in a batch from the central device 2 before starting the running for the day.

[0051] The receiver 51 can receive the data of the map database in accordance with the operational condition. The data of the map database that the receiver 51 receives may be configured, for example, such that when the vehicle 1 moves from a station A to a station B, the data from the station A to the station B is received from the central device 2, and then, when the vehicle 1 moves from the station B to a station C, the data from the station B to the station C is received from the central device 2. In addition, for example, the data of the map database that the receiver 51 receives from the central device 2 may be configured such that only the data correlated with the position information of several meters before and after the present position information of the vehicle 1 is received, and this data is updated in real time.

[0052] The receiver 51 further receives an input image from the imaging device 4. The imaging device 4 and the receiver 51 may be communicably connected by wire, or may be communicably connected wirelessly. The receiver 51 can communicate with the imaging device 4, for example, based on communication standards such as the internet, Ethernet (trademark), wireless LAN (Wi-Fi (trademark) or the like), and Bluetooth (trademark).

[0053] The controller 52 includes at least one of processors such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and an FPGA (Field-Programmable Gate Array). The controller 52 can implement various functions of the position information detection device 5, based on programs, such as system software, application software or firmware, stored in an auxiliary storage unit 532.

[0054] The controller 52 includes a generator 521, a rail specifying unit 522, a vehicle position acquisition unit 523, and a landmark detector 524.

[0055] The landmark detector 524 detects, among the objects included in the input image from the imaging device 4, an object (hereinafter referred to as "landmark") that is a fixed, immovable object and has distinctive features in regard to shape, luminosity, color, pattern and the like. The information of the landmark is included in the data of the map database in advance, and is stored by being correlated with the position information and the rail node string information. The landmark detector 524 displays a landmark candidate area after recognizing the presence of a landmark by referring to the landmark information correlated with the first position information. The data included in the landmark information is, for example, the number of landmarks, the size (vertical and horizontal), the coordinates (x, y), and the landmark type (for example, an object attached to an overhead wire pole, a building, or the like).

[0056] The landmark detector 524 sends the detected landmark information to the rail specifying unit 522 and the vehicle position acquisition unit 523.

[0057] FIG. 5 is a view for describing an example of edge positions near left and right rails included in an edge image, and a plurality of node coordinates included in left-and-right rail node string information.

[0058] The generator 521 detects parts with a conspicuous luminosity change from the input image of the imaging device 4 received by the receiver 51, and generates an edge image. The edge image generation process is executed by a generally known method.
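
The embodiment only states that a generally known edge-extraction method is used; the sketch below uses a Sobel gradient magnitude with a fixed threshold as one such method, chosen purely for illustration.

```python
import numpy as np

def edge_image(gray: np.ndarray, threshold: float = 50.0) -> np.ndarray:
    """Return a binary edge image by thresholding the luminosity gradient.

    `gray` is a 2-D array of pixel luminosities; the Sobel kernels and the
    threshold value are assumptions, not the method specified in the embodiment.
    """
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = gray.shape
    gx = np.zeros((h, w), dtype=float)
    gy = np.zeros((h, w), dtype=float)
    padded = np.pad(gray.astype(float), 1, mode="edge")
    for i in range(3):
        for j in range(3):
            window = padded[i:i + h, j:j + w]
            gx += kx[i, j] * window
            gy += ky[i, j] * window
    magnitude = np.hypot(gx, gy)        # strength of the luminosity change
    return (magnitude > threshold).astype(np.uint8)
```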

[0059] The rail specifying unit 522 compares the edge positions near the left and right rails included in the edge image generated by the generator 521 with the plurality of node coordinates included in the left-and-right rail node string information. The rail specifying unit 522 specifies second rail node string information, which is composed of the plurality of node coordinates that overlap the edge positions most, and the data including the second rail node string information.
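
A simplified sketch of this comparison, assuming each candidate is represented as a list of (x, y) node coordinates in the same image coordinate system as the edge image; the overlap measure (a count of nodes falling on edge pixels) is an assumption, since the embodiment does not specify how overlap is quantified.

```python
from typing import Dict, List, Tuple

import numpy as np

def overlap_score(edge_img: np.ndarray, nodes: List[Tuple[int, int]]) -> int:
    """Count how many node coordinates fall on an edge pixel."""
    h, w = edge_img.shape
    return sum(1 for x, y in nodes if 0 <= x < w and 0 <= y < h and edge_img[y, x])

def specify_rail(edge_img: np.ndarray,
                 candidates: Dict[int, List[Tuple[int, int]]]) -> int:
    """Return the identifier of the candidate rail node string with the
    largest overlap; `candidates` maps a data identifier to its combined
    left and right node list (an assumed representation)."""
    return max(candidates, key=lambda k: overlap_score(edge_img, candidates[k]))
```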

[0060] In addition, the rail specifying unit 522 can specify first rail node string information by using the data of the map database correlated with the first position information received by the receiver 51.

[0061] The rail specifying unit 522 specifies third rail node string information from the data correlated with the landmark information detected by the landmark detector 524.

[0062] The vehicle position acquisition unit 523 acquires position information included in the data specified by the rail specifying unit 522. The position information acquired by the vehicle position acquisition unit 523 becomes the second position information.

[0063] The vehicle position acquisition unit 523 specifies the third position information from the data correlated with the landmark information detected by the landmark detector 524.

[0064] The vehicle position acquisition unit 523 compares the first position information, second position information and third position information, and, in a case where there is an error between the first position information, second position information and third position information, the vehicle position acquisition unit 523 selects the position information in accordance with the position information priority setting in which the position information, to which priority is to be given due to rail shapes or other factors, is set in advance. The details of the above process will be described later. Note that the process of selecting the position information of the vehicle 1 may not be executed by the vehicle position acquisition unit 523, and the support control device 6 may correct the error, and the position information detection device 5 may be configured to execute the process, based on the corrected result.

[0065] In addition, the vehicle position acquisition unit 523 may be configured to output the result of the comparison between the first rail node string information and the second rail node string information.

[0066] The above-described position information priority setting based on the rail shape is merely an example, and is not limited to this example. In the position information priority setting, the reliability of each of the first position information, second position information and third position information can be set as one index.

[0067] In the present embodiment, the positioning methods for the position information using the GNSS include singular positioning, relative positioning, and self-position detection. In the singular positioning, information, such as the position and time of the satellite, which is transmitted from the GNSS satellite, is received by a single antenna; the time needed from when a radio wave is emitted from the satellite to when the radio wave reaches the receiver is measured and converted to a distance. By setting GNSS satellites whose positions are known as reference points, the distances from four or more satellites to the observation point are obtained at the same time, thereby determining the position of the vehicle 1.

[0068] In the relative positioning, two or more receivers are used, and four or more identical GNSS satellites are observed at the same time. With the positions of the GNSS satellites as references, the time differences between the radio wave signals reaching the receivers from the GNSS satellites are measured, and the relative positional relationship between the two points is computed.

[0069] In the self-position detection, for example, in a case of performing self-position estimation in a place or the like where the electric wave from the GNSS satellite does not reach, the position information is computed based on the number of revolutions of the wheels, the present speed of the vehicle, or the like.
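
Paragraph [0069] names the inputs of the self-position detection but not a formula; the following dead-reckoning sketch, with an assumed wheel-circumference model, is one possible reading.

```python
import math

def distance_from_revolutions_m(wheel_diameter_m: float, revolutions: float) -> float:
    """Distance travelled estimated from the number of wheel revolutions
    (assumed model: circumference times revolutions)."""
    return math.pi * wheel_diameter_m * revolutions

def distance_from_speed_m(speed_m_per_s: float, elapsed_s: float) -> float:
    """Alternative estimate from the present speed of the vehicle."""
    return speed_m_per_s * elapsed_s
```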

[0070] In regard to the reliability of the position information, the relative positioning has the highest reliability, the singular positioning has the next highest reliability, and the self-position detection has the lowest reliability. The position information error in the relative positioning is about 0.5 m or less, the position information error in the singular positioning is about 1 to 2 m or less, and the position information error in the self-position detection is greater than these. Thus, the position information priority setting is set in accordance with the positioning method. The position information priority setting is preset by an administrator or a user of the position information detection system.

[0071] A concrete example of the above-described position information priority setting is described. If it is assumed that the vehicle 1 runs at 108 km per hour (30 m per second) and the frame rate of the imaging device 4 is 30 fps, the distance the vehicle moves during one frame is about 1 m. If it is assumed that the vehicle 1 runs at 36 km per hour (10 m per second) and the frame rate of the imaging device 4 is 30 fps, the distance the vehicle moves during one frame is about 0.3 m. In this manner, since the distance of movement per frame varies in accordance with the speed of the vehicle, the error amount of the position information of the vehicle 1 also varies. Thus, the position information priority setting is set, for example, such that priority is given to the first position information in the case of the relative positioning, and priority is given to the second position information or the third position information in the case of the singular positioning or the self-position detection.
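
The per-frame movement distance quoted above follows directly from the speed and frame rate, as the following sketch verifies; the function name is an assumption.

```python
def movement_per_frame_m(speed_kmh: float, frame_rate_fps: float) -> float:
    """Distance the vehicle moves between two consecutive frames."""
    return (speed_kmh / 3.6) / frame_rate_fps

# The figures from paragraph [0071]: about 1 m per frame at 108 km/h and
# about 0.3 m per frame at 36 km/h, both at 30 fps.
assert abs(movement_per_frame_m(108.0, 30.0) - 1.0) < 1e-6
assert abs(movement_per_frame_m(36.0, 30.0) - 1.0 / 3.0) < 1e-6
```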

[0072] In addition, in some cases, the position information priority setting is set based on the shape of the rail. The shape of the rail can be discriminated by narrowing down the area from the first position information. For example, if the shape of the rail is straight, there is a case where the rail node string information of different positions is similar. Thus, the possibility that correct position information cannot be acquired increases, and the priority of the second position information is lowered. Similarly, in a case where the shape of the rail is a curved line, such as a curve, there is a case where the rail node string information becomes similar at a place where the curvature does not vary, and therefore the priority of the second position information is lowered.

[0073] On the other hand, in a case where the rail shape is a rail branch part, the rail node string information becomes characteristic and is rarely similar to that of other positions. Thus, it can be said that the second position information is more accurate than the first position information, and the priority of the second position information is raised.

[0074] In addition, in an area where radio waves do not reach, for example, a tunnel, the first position information is obtained by the self-position detection, and the priority is set in accordance with the shape of the rail in a similar manner as described above. Note that the third position information has a higher reliability than the first position information and the second position information, and is set to have the highest priority regardless of the shape of the rail.

[0075] The storage unit 53 includes, for example, a main storage unit 531 and an auxiliary storage unit 532.

[0076] The main storage unit 531 may include, for example, a ROM (read-only memory) and a RAM (random-access memory). The ROM is a nonvolatile memory that is used only for reading data, and can store data and various setting values which are used by the controller 52 in executing various processes. In addition, the RAM can be used as a so-called work area in which the controller 52 temporarily stores data in executing various processes. The main storage unit 531 of the present embodiment is, for example, a RAM, and is used as a memory.

[0077] The main storage unit 531 can temporarily store the data of the map database, the first position information, the input image from the imaging device 4, the edge image, the rail node string information, the second position information, the landmark information, the third position information, and the position information priority setting.

[0078] The auxiliary storage unit 532 is a non-transitory computer-readable storage medium of a computer whose central component is the controller 52. The auxiliary storage unit 532 is, for example, an EEPROM (trademark) (electrically erasable programmable read-only memory), an HDD (hard disk drive), an SSD (solid state drive) or the like.

[0079] The auxiliary storage unit 532 can store data used by the controller 52 in executing various processes, and data or various setting values generated in the processing in the controller 52. For example, the auxiliary storage unit 532 is a memory that stores various kinds of information, and can store the data of the map database, the first position information, the input image from the imaging device 4, the edge image, the rail node string information, the second position information, the landmark information, the third position information, and the position information priority setting.

[0080] The transmitter 54 transmits the second position information, third position information, rail node string information and landmark information to the support control device 6. The transmitter 54 can execute communication, for example, based on communication standards such as the internet, Ethernet (trademark), wireless LAN (Wi-Fi (trademark) or the like), and Bluetooth (trademark).

[0081] FIG. 6, FIG. 7 and FIG. 8 are flowcharts for describing an example of a position detection process operation of a position information detection device according to one embodiment.

[0082] Hereinafter, a description is given of an example of a procedure of acquiring, by the position information detection device 5, a plurality of pieces of position information of the vehicle 1, and determining the position information of the vehicle 1 in accordance with the position information priority setting. Note that the content of the process in the following description of the operation is merely an example, and various processes, by which similar advantageous effects can be obtained, can be utilized as appropriate.

[0083] The position information detection device 5 receives, for example, vehicle instruction data (hereinafter referred to as "data") of the map database in a specific section from the central device 2 by the receiver 51. The position information detection device 5 stores the data in the storage unit 53, and the controller 52 acquires the data from the storage unit 53 (step 1). Since the vehicle 1 of the present embodiment is a vehicle that runs on a route determined according to a schedule, a time in which the vehicle runs between predetermined stations is determined in advance, and the data is acquired by using this time as a reference.

[0084] The receiver 51 acquires the first position information from the antenna (step 2). Note that the order of processes of step 1 and step 2 may be reversed, and, for example, such a configuration may be adopted that step 2 is executed, and, by referring to the first position information, the central device 2 transmits data of the vicinity of the area.

[0085] The controller 52 acquires the input image corresponding to the frame number from the imaging device 4 (step 3). It is assumed that the input image is acquired by the controller 52 in real time.

[0086] The generator 521 executes an edge process on the acquired input image, and generates an edge image from the input image. Note that the generator 521 is not required to execute the edge process on all input images and generate edge images for all of them. For example, the process may be executed so as to generate an edge image for at least one frame out of every three frames of the input images.

[0087] The rail specifying unit 522 compares the edge image and a plurality of node coordinates (step 4). Here, the node coordinates that are the target of the comparison are selected, for example, based on the first position information. Taking the error of the first position information into account, the rail specifying unit 522 selects node coordinates from the data correlated with the position information of several meters before and after the first position information. In addition, a configuration may be adopted in which the comparison is executed with all node coordinates included in the data.

[0088] The rail specifying unit 522 specifies, from the comparison result of step 4, data including a plurality of node coordinates (rail node string information) that are most overlapping with the edge position included in the edge image (step 5). The vehicle position acquisition unit 523 acquires the second position information based on the specified data (step 6).
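
Steps 4 to 6 can be sketched together as follows, assuming a simple record layout and a 5 m search window around the first position information; both the layout and the window size are assumptions for illustration.

```python
import math
from typing import Dict, Tuple

import numpy as np

# Assumed record layout: {"position": (lat, lon), "nodes": [(x, y), ...]},
# where the node list combines the left and right rail node coordinates.
Record = dict

def rough_distance_m(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    """Small-angle approximation of the distance between two lat/lon points."""
    lat = math.radians((a[0] + b[0]) / 2.0)
    return math.hypot((a[0] - b[0]) * 111_320.0,
                      (a[1] - b[1]) * 111_320.0 * math.cos(lat))

def acquire_second_position(edge_img: np.ndarray,
                            first_position: Tuple[float, float],
                            map_data: Dict[int, Record],
                            window_m: float = 5.0) -> Tuple[float, float]:
    """Steps 4 to 6 in outline: restrict the comparison to records lying within
    a few metres of the first position information (to absorb its error), score
    each candidate by how many of its node coordinates fall on edge pixels, and
    return the position of the best-scoring record as the second position
    information. The 5 m window is an assumed value."""
    h, w = edge_img.shape

    def score(rec: Record) -> int:
        return sum(1 for x, y in rec["nodes"]
                   if 0 <= x < w and 0 <= y < h and edge_img[y, x])

    nearby = [rec for rec in map_data.values()
              if rough_distance_m(rec["position"], first_position) <= window_m]
    if not nearby:
        return first_position          # no candidate data: keep the GNSS estimate
    return max(nearby, key=score)["position"]
```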

[0089] The landmark detector 524 acquires information relating to the landmark from the data including the first position information (step 7). The information relating to the landmark is the number of landmarks, the size of the landmark, the coordinates of the landmark, and the type of landmark. If the landmark detector 524 determines, from the landmark information, that a landmark exists in the input image (step 7, YES), the landmark detector 524 displays a landmark candidate area, and detects a landmark existing within the range of, or in the vicinity of, the landmark candidate area (step 8).

[0090] If the landmark detector 524 detects a landmark within the range of, or in the vicinity of, the landmark candidate area, the landmark detector 524 detects an error between the landmark candidate area and the landmark detection area. Regardless of the presence or absence of the error, the landmark detector 524 specifies the data including the landmark information, from the landmark coordinates of the landmark detection area, and acquires the third position information based on the specified data (step 9).
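
A sketch of the candidate-area matching in steps 8 and 9, assuming the candidate area is given by a centre and a size in image coordinates and that "the vicinity" is approximated by a fixed pixel margin; these representations are assumptions.

```python
from typing import List, Optional, Tuple

# Assumed candidate-area representation: centre (x, y) and size (width, height)
# in image coordinates, taken from the landmark information of the map database.
Candidate = Tuple[Tuple[int, int], Tuple[int, int]]

def match_landmark(detected_centres: List[Tuple[int, int]],
                   candidate: Candidate,
                   margin_px: int = 20) -> Optional[Tuple[int, int]]:
    """Return the first detection lying inside, or in the vicinity of, the
    landmark candidate area (step 8)."""
    (cx, cy), (cw, ch) = candidate
    for x, y in detected_centres:
        if abs(x - cx) <= cw / 2 + margin_px and abs(y - cy) <= ch / 2 + margin_px:
            return (x, y)
    return None

def candidate_error_px(detection: Tuple[int, int], candidate: Candidate) -> Tuple[int, int]:
    """Offset between the detected landmark and the candidate-area centre,
    i.e. the error examined in step 9 before the third position information
    is acquired from the matching map-database record."""
    (cx, cy), _ = candidate
    return detection[0] - cx, detection[1] - cy
```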

[0091] Note that the transition of the process of step 2 of acquiring the first position information, the process of step 3 to step 6 of acquiring the second position information and the process of step 7 to step 9 of acquiring the third position information is not limited to the transition illustrated in FIG. 6. For example, such a configuration may be adopted that the process of step 2, the process of steps 3 to 6, and the process of steps 7 to 9 are executed in parallel.

[0092] If the landmark detector 524 acquires the third position information in step 9, the controller 52 maintains or updates the present position information of the vehicle 1 in accordance with the position information priority setting.

[0093] Based on the position information priority setting, the controller 52 compares the first position information and the second position information, and determines the present position information of the vehicle 1 (step 10).

[0094] If the controller 52 determines, based on the position information priority setting, the first position information as the present position information of the vehicle 1 (step 10, NO), the controller 52 maintains the present vehicle position information as the first position information (step 11).

[0095] After the process of step 11, the controller 52 compares the first position information and the third position information, based on the position information priority setting once again, and determines the present position information of the vehicle 1 (step 13).

[0096] If the controller 52 determines, based on the position information priority setting, the first position information as the present position information of the vehicle 1 (step 13, NO), the controller 52 maintains the present vehicle position information as the first position information (step 14).

[0097] If the controller 52 determines, based on the position information priority setting, the third position information as the present position information of the vehicle 1 (step 13, YES), the controller 52 updates the present vehicle position information to the third position information (step 15).

[0098] If the controller 52 determines, based on the position information priority setting, the second position information as the present position information of the vehicle 1 (step 10, YES), the controller 52 updates the present vehicle position information to the second position information (step 12).

[0099] After the process of step 12, once again, based on the position information priority setting, the controller 52 compares the second position information and the third position information, and determines the present position information of the vehicle 1 (step 16).

[0100] If the controller 52 determines, based on the position information priority setting, the second position information as the present position information of the vehicle 1 (step 16, NO), the controller 52 maintains the present vehicle position information as the second position information (step 17).

[0101] If the controller 52 determines, based on the position information priority setting, the third position information as the present position information of the vehicle 1 (step 16, YES), the controller 52 updates the present vehicle position information to the third position information (step 18).

[0102] The controller 52 causes the transmitter 54 to transmit the data including the present position information of the vehicle 1 to the support control device 6 (step 19), and terminates the process.

[0103] If the landmark detector 524 determines, from the landmark information, that a landmark does not exist in the input image (step 7, NO), the process goes to step 20.

[0104] The controller 52 compares the first position information and the second position information, based on the position information priority setting, and determines the present position information of the vehicle 1 (step 20).

[0105] If the controller 52 determines, based on the position information priority setting, the first position information as the present position information of the vehicle 1 (step 20, NO), the controller 52 maintains the present vehicle position information as the first position information (step 21).

[0106] If the controller 52 determines, based on the position information priority setting, the second position information as the present position information of the vehicle 1 (step 20, YES), the controller 52 updates the present vehicle position information to the second position information (step 22).

[0107] The controller 52 causes the transmitter 54 to transmit the data including the present position information of the vehicle 1 to the support control device 6 (step 23), and terminates the process.
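
The branching of steps 10 to 23 can be condensed into the following sketch, in which the priority-setting lookups are abstracted into an assumed `prefer` callback; this is an outline of the flow, not the device's implementation.

```python
from typing import Callable, Optional, Tuple

Position = Tuple[float, float]

def decide_present_position(first: Position,
                            second: Position,
                            third: Optional[Position],
                            prefer: Callable[[str, str], str]) -> Position:
    """Condensed form of steps 10 to 23. `prefer(a, b)` stands in for the
    lookups against the position information priority setting and is assumed
    to return whichever of the two labels has the higher priority."""
    # Steps 10 to 12 (or 20 to 22): choose between the first and second
    # position information.
    if prefer("first", "second") == "second":
        label, position = "second", second
    else:
        label, position = "first", first
    # Step 7, NO branch: no landmark in the input image, so no third position
    # information exists and the survivor is transmitted as-is (step 23).
    if third is None:
        return position
    # Steps 13 to 18: compare the survivor with the third position information.
    if prefer(label, "third") == "third":
        return third
    return position
```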

[0108] In the description of the present embodiment, as regards the present position information of the vehicle 1, the first position information is set as the reference, but the second position information or the third position information may be set as the reference. If the present position information of the vehicle 1 is updated, the process of the flowchart transitions by using this position information as the reference.

[0109] Here, a case is considered in which, in the process of the above flowchart, the area determination is executed by using the first position information as the reference, and position information other than the first position information is selected as the present position information by the position information priority setting. If the difference between the data number, in the map database, of the first rail node string information and the data number, in the map database, of the rail node string information correlated with the selected position information exceeds a predetermined threshold, the vehicle position acquisition unit 523 is configured to select the position information that is next highest in the position information priority setting after the presently selected position information.

[0110] Note that the predetermined threshold is preset by an administrator or a user of the position information detection system. In addition, the reference for the area determination is not limited to the first position information, and may be the second position information or the third position information.
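
Under the reading of paragraph [0109] given above (a data-number difference exceeding a threshold triggers a fallback), the check could look as follows; the interpretation, function name and threshold value are all assumptions.

```python
def needs_fallback(selected_row_id: int,
                   first_row_id: int,
                   threshold_rows: int = 10) -> bool:
    """Return True when the map-database data number of the rail node string
    information correlated with the selected position information lies too far
    from that of the first rail node string information, in which case the
    next entry in the position information priority setting is selected."""
    return abs(selected_row_id - first_row_id) > threshold_rows
```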

[0111] According to the position information detection device 5 of the present embodiment, the position information detection device 5 acquires the present position information of the vehicle 1, and accurate information relating to the rails on which the vehicle 1 is presently running can thereby be acquired from the rail node string information of the data including this position information. By acquiring the information relating to the rails, the vehicle 1 can set, for example, an area of about 10 cm around the rails as the detection target area, and can detect an obstacle existing in the forward direction.

[0112] Specifically, according to the present embodiment, the position information detection device capable of detecting accurate position information can be provided.

[0113] While certain embodiments of the present invention have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the invention. These novel embodiments may be implemented in a variety of other forms, and various omissions, substitutions and changes may be made without departing from the spirit of the inventions. These embodiments and their modifications fall within the scope and spirit of the invention and within the inventions of the accompanying claims and their equivalents.


Claims

1. A position information detection device comprising:

a receiver configured to receive a plurality of data in which a plurality of node coordinates arranged at regular intervals in such a manner as to correspond to left and right rails are correlated with position information;

a generator configured to generate an edge image by using an input image from an imaging device disposed in a vehicle;

a rail specifying unit configured to specify the data including second rail node string information that is the node coordinates that are most overlapping with a position of an edge included in the edge image; and

a vehicle position acquisition unit configured to acquire second position information based on the specified data.


 
2. The position information detection device of Claim 1, further comprising a detector configured to detect a landmark included in the input image from the imaging device, wherein

the data includes information of the landmark correlated with the position information,

the rail specifying unit specifies the data including the information of the landmark detected by the detector, and

the vehicle position acquisition unit acquires third position information based on the specified data.


 
3. The position information detection device of Claim 2, wherein

the receiver receives first position information of the vehicle measured by a GNSS, and

the vehicle position acquisition unit selects present position information of the vehicle from the first position information, the second position information and the third position information, in accordance with a position information priority setting that is preset.


 
4. The position information detection device of Claim 1, wherein

the receiver receives first position information of the vehicle measured by a GNSS,

the rail specifying unit specifies first rail node string information from the data including the first position information, and

the vehicle position acquisition unit outputs a result of a comparison between the first rail node string information and the second rail node string information.


 




Drawing




























Search report








