(19)
(11) EP 4 207 133 A1

(12) EUROPEAN PATENT APPLICATION
published in accordance with Art. 153(4) EPC

(43) Date of publication:
05.07.2023 Bulletin 2023/27

(21) Application number: 20954577.1

(22) Date of filing: 25.09.2020
(51) International Patent Classification (IPC): 
G08G 1/0967(2006.01)
G08G 1/01(2006.01)
(86) International application number:
PCT/CN2020/117785
(87) International publication number:
WO 2022/061725 (31.03.2022 Gazette 2022/13)
(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
Designated Extension States:
BA ME
Designated Validation States:
KH MA MD TN

(71) Applicant: Huawei Technologies Co., Ltd.
Shenzhen, Guangdong 518129 (CN)

(72) Inventors:
  • LU, Yuanzhi
    Shenzhen, Guangdong 518129 (CN)
  • CHEN, Canping
    Shenzhen, Guangdong 518129 (CN)
  • CHEN, Baocheng
    Shenzhen, Guangdong 518129 (CN)
  • ZHAO, Jian
    Shenzhen, Guangdong 518129 (CN)

(74) Representative: Thun, Clemens 
Mitscherlich PartmbB Patent- und Rechtsanwälte Sonnenstraße 33
80331 München (DE)

   


(54) TRAFFIC ELEMENT OBSERVATION METHOD AND APPARATUS


(57) This application provides a traffic element observation method and a related device. The method includes: receiving a plurality of groups of first observation data that are of a traffic element and that are sent by a plurality of vehicles, where each of the plurality of vehicles collects one of the plurality of groups of first observation data, and the first observation data indicates a change of a coordinate location of the traffic element with time and/or a change of a speed of the traffic element with time; performing time synchronization processing and/or space correction processing on the plurality of groups of first observation data, to obtain a plurality of groups of processed observation data; and determining, based on the plurality of groups of processed observation data, second observation data that is of the traffic element and that is observed by the plurality of vehicles. Therefore, accuracy of obtained observation data of the traffic element is improved.




Description

TECHNICAL FIELD



[0001] This application relates to the field of autonomous driving, and more specifically, to a traffic element observation method and apparatus.

BACKGROUND



[0002] Autonomous driving is a mainstream application in the field of artificial intelligence. The autonomous driving technology relies on collaboration of computer vision, a radar, a monitoring apparatus, a global positioning system, and the like, to implement autonomous driving of a motor vehicle without human intervention. An autonomous vehicle uses various computing systems to assist in transporting passengers from one location to another location. Some autonomous vehicles may require some initial or continuous input from operators (such as navigators, drivers, or passengers). The autonomous vehicle allows the operator to switch from a manual operation mode to an autonomous driving mode or a mode between the manual operation mode and the autonomous driving mode. Because the autonomous driving technology does not require a human to drive the motor vehicle, driving errors caused by people can be effectively avoided in theory, traffic accidents can be reduced, and road transportation efficiency can be improved. Therefore, the autonomous driving technology attracts increasing attention.

[0003] With development of autonomous driving technologies, functions such as vehicle path planning and vehicle obstacle avoidance become increasingly important, and these functions cannot be separated from a basic technology of collecting observation data of a traffic element. Currently, a solution in which a plurality of vehicles cooperatively collect observation data of a traffic element is usually used to improve accuracy of collected observation data of the traffic element. To be specific, the plurality of vehicles separately observe a same traffic element, and send first observation data obtained after respective observation to a cloud server, and then the cloud server integrates the first observation data respectively uploaded by the plurality of vehicles, to finally obtain second observation data of the traffic element.

[0004] However, in the foregoing solution in which the plurality of vehicles cooperatively collect the observation data of the traffic element, because there is an error in observation data collected by the vehicles, in a process of integrating a plurality of groups of first observation data, the cloud server may identify a plurality of groups of observation data of a same traffic element as observation data of different traffic elements, or identify observation data of different traffic elements as observation data of a same traffic element. Consequently, the obtained second observation data of the traffic element is inaccurate.

SUMMARY



[0005] This application provides a traffic element observation method and apparatus, to improve accuracy of obtained observation data of a traffic element.

[0006] According to a first aspect, a traffic element observation method is provided. The method includes: receiving a plurality of groups of first observation data that are of a traffic element and that are sent by a plurality of vehicles, where each of the plurality of vehicles collects one of the plurality of groups of first observation data, and the first observation data indicates a change of a coordinate location of the traffic element with time and/or a change of a speed of the traffic element with time; performing time synchronization processing and/or space correction processing on the plurality of groups of first observation data, to obtain a plurality of groups of processed observation data; and determining, based on the plurality of groups of processed observation data, second observation data that is of the traffic element and that is observed by the plurality of vehicles.

[0007] In embodiments of this application, the time synchronization processing and/or space correction processing are/is performed on the plurality of groups of first observation data to obtain the plurality of groups of processed observation data, and the second observation data that is of the traffic element and that is observed by the plurality of vehicles is determined based on the plurality of groups of processed observation data, to improve accuracy of obtained observation data of the traffic element. This avoids a problem in the conventional technology in which obtained second observation data of the traffic element is inaccurate because the second observation data of the traffic element is determined by directly integrating a plurality of groups of first observation data.

[0008] In a possible implementation, if the first observation data indicates the change of a speed of the traffic element with time, the performing time synchronization processing on the first observation data that is of the traffic element and that is sent by the plurality of vehicles, to obtain a plurality of groups of processed observation data includes: determining a time offset between the plurality of groups of first observation data; and adjusting each of the plurality of groups of first observation data based on the time offset between the plurality of groups of first observation data, to obtain the processed observation data, where time points of all groups of the processed observation data are synchronized.

[0009] In embodiments of this application, each of the plurality of groups of first observation data is adjusted based on the time offset between the plurality of groups of first observation data, to obtain the processed observation data. The time points of all groups of the processed observation data are synchronized, so that accuracy of obtained observation data of the traffic element is improved.

[0010] In a possible implementation, if the first observation data indicates the change of a coordinate location of the traffic element with time, the performing space correction processing on the plurality of groups of first observation data, to obtain a plurality of groups of processed observation data includes: determining, in a preset coordinate system, coordinates that are of the traffic element at different time points and that are indicated by each of the plurality of groups of first observation data; and representing a coordinate value that is of the traffic element and that is included in each preset coordinate range in the coordinate system as a target coordinate value corresponding to the coordinate range, to obtain coordinates that are of the traffic element and that are in each preset coordinate range in the coordinate system, where the processed observation data includes the coordinates that are of the traffic element and that are in each preset coordinate range in the coordinate system.

[0011] In this embodiment of this application, the coordinate value that is of the traffic element and that is included in each preset coordinate range in the coordinate system is represented as the target coordinate value corresponding to the coordinate range, to obtain the coordinates that are of the traffic element and that are in each preset coordinate range in the coordinate system, so that accuracy of obtained observation data of the traffic element is improved.

[0012] In a possible implementation, when the traffic element is a target object, the first observation data includes at least one of a type of the target object, a motion status of the target object, a motion trail of the target object, and a size of the target object.

[0013] In a possible implementation, when the target object is a traffic signal light, the first observation data further includes timing information of the traffic signal light.

[0014] In a possible implementation, the plurality of groups of first observation data are obtained after being collected by an in-vehicle sensor in each of the plurality of vehicles and processed by a multi-domain controller.

[0015] In embodiments of this application, the first observation data is collected by using the in-vehicle sensor, and is processed by using the multi-domain controller, to avoid adding an additional data collection apparatus and data processing apparatus. This helps avoid increasing costs.

[0016] According to a second aspect, a traffic element observation apparatus is provided. The apparatus may be a computing device, or may be a chip in a computing device.

[0017] The apparatus may include a processing unit and a receiving unit. When the apparatus is the computing device, the processing unit may be a processor, and the receiving unit may be a communications interface. Optionally, the apparatus may further include a storage unit. When the apparatus is the computing device, the storage unit may be a memory. The storage unit is configured to store instructions, and the processing unit executes the instructions stored in the storage unit, so that the computing device performs the method according to the first aspect.

[0018] When the apparatus is the chip in a computing device, the processing unit may be a processor, and the receiving unit may be an input/output interface, a pin, a circuit, or the like. The processing unit executes the instructions stored in the storage unit, so that the computing device performs the method according to the first aspect.

[0019] Optionally, the storage unit may be a storage unit (for example, a register or a cache) in the chip, or may be a storage unit (for example, a read-only memory or a random access memory) that is located in the computing device and that is located outside the chip.

[0020] That the memory is coupled to the processor may be understood as that the memory is located in the processor, or the memory is located outside the processor, so that the memory is independent of the processor.

[0021] According to a third aspect, a computer program product is provided, where the computer program product includes computer program code. When the computer program code is run on a computer, the computer is enabled to perform the method according to the foregoing aspects.

[0022] It should be noted that all or a part of the foregoing computer program code may be stored on a first storage medium. The first storage medium may be encapsulated together with a processor, or may be encapsulated separately from a processor. This is not specifically limited in embodiments of this application.

[0023] According to a fourth aspect, a computer-readable medium is provided, where the computer-readable medium stores program code. When the computer program code is run on a computer, the computer is enabled to perform the method according to the foregoing aspects.

BRIEF DESCRIPTION OF DRAWINGS



[0024] 

FIG. 1 is a functional block diagram of a vehicle 100 according to an embodiment of this application;

FIG. 2 is a schematic diagram of an autonomous driving system to which an embodiment of this application is applicable;

FIG. 3 is a schematic diagram of a system 300 that includes an autonomous vehicle and a cloud service center and to which an embodiment of this application is applicable;

FIG. 4 is a flowchart of a traffic element observation method according to an embodiment of this application;

FIG. 5 is a simulation diagram of observation data collected before time synchronization processing;

FIG. 6 is a simulation diagram of observation data collected after time synchronization processing;

FIG. 7 is a flowchart of a traffic element observation method according to an embodiment of this application;

FIG. 8 is a schematic diagram of a traffic element observation apparatus according to an embodiment of this application; and

FIG. 9 is a schematic block diagram of a computing device according to another embodiment of this application.


DESCRIPTION OF EMBODIMENTS



[0025] The following describes technical solutions in this application with reference to the accompanying drawings. To facilitate understanding, a scenario to which embodiments of this application are applicable is described below with reference to FIG. 1 to FIG. 3 by using an intelligent driving scenario as an example.

[0026] FIG. 1 is a functional block diagram of a vehicle 100 according to an embodiment of this application. In an embodiment, the vehicle 100 is configured to be in a fully or partially autonomous driving mode. For example, the vehicle 100 in an autonomous driving mode may control the vehicle 100, and may determine current states of the vehicle and an ambient environment of the vehicle through manual operations, determine possible behavior of at least one other vehicle in the ambient environment, determine a confidence level corresponding to a possibility that the other vehicle performs the possible behavior, and control the vehicle 100 based on determined information. When the vehicle 100 is in the autonomous driving mode, the vehicle 100 may be set to operate without interaction with a human.

[0027] The vehicle 100 may include various subsystems, for example, a travel system 102, a sensor system 104, a control system 106, one or more peripheral devices 108, a power supply 110, a computer system 112, and a user interface 116. Optionally, the vehicle 100 may include more or fewer subsystems, and each subsystem may include a plurality of elements. In addition, all the subsystems and elements of the vehicle 100 may be interconnected in a wired or wireless manner.

[0028] The travel system 102 may include a component that provides power for the vehicle 100 to move. In an embodiment, the travel system 102 may include an engine 118, an energy source 119, a drive apparatus 120, and a wheel/tire 121. The engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or a combination of other types of engines, for example, a hybrid engine including a gasoline engine and an electric motor, or a hybrid engine including an internal combustion engine and an air compression engine. The engine 118 converts the energy source 119 into mechanical energy.

[0029] Examples of the energy source 119 include gasoline, diesel, other oil-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other power sources. The energy source 119 may also provide energy for another system of the vehicle 100.

[0030] The drive apparatus 120 may transmit mechanical power from the engine 118 to the wheel 121. The drive apparatus 120 may include a gearbox, a differential, and a drive shaft. In an embodiment, the drive apparatus 120 may further include another component, for example, a clutch. The drive shaft may include one or more shafts that may be coupled to one or more wheels 121.

[0031] The sensor system 104 (also referred to as a "collection device") may include several sensors that sense information about an environment around the vehicle 100. For example, the sensor system 104 may include a positioning system 122 (the positioning system may be a global positioning system (global positioning system, GPS), or may be a Beidou system or another positioning system), an inertial measurement unit (inertial measurement unit, IMU) 124, a radar 126, a laser rangefinder 128, and a camera 130. The sensor system 104 may further include sensors that monitor an internal system of the vehicle 100 (for example, an in-vehicle air quality monitor, a fuel gauge, and an oil temperature gauge). One or more pieces of sensor data from these sensors may be used to detect an object and corresponding features (a location, a shape, a direction, a speed, and the like) of the object. Such detection and identification are key functions for safe operation of the autonomous vehicle 100.

[0032] The positioning system 122 may be configured to estimate a geographical location of the vehicle 100. The IMU 124 is configured to sense location and orientation changes of the vehicle 100 based on inertial acceleration. In an embodiment, the IMU 124 may be a combination of an accelerometer and a gyroscope.

[0033] The radar 126 may sense an object in the ambient environment of the vehicle 100 by using a radio signal. In some embodiments, in addition to sensing a target object, the radar 126 may be further configured to sense one or more states of a speed, a location, and a forward direction of the target object.

[0034] The laser rangefinder 128 may sense, by using a laser, an object in an environment in which the vehicle 100 is located. In some embodiments, the laser rangefinder 128 may include one or more laser sources, a laser scanner, one or more detectors, and another system component.

[0035] The camera 130 may be configured to capture a plurality of images of the ambient environment of the vehicle 100. The camera 130 may be a static camera or a video camera.

[0036] The control system 106 controls operations of the vehicle 100 and the components of the vehicle. The control system 106 may include various elements, including a steering system 132, a throttle 134, a brake unit 136, a computer vision system 140, a route control system 142, and an obstacle avoidance system 144.

[0037] The steering system 132 may be operated to adjust a forward direction of the vehicle 100. For example, in an embodiment, the steering system may be a steering wheel system.

[0038] The throttle 134 is configured to control an operating speed of the engine 118 and further control a speed of the vehicle 100.

[0039] The brake unit 136 is configured to control the vehicle 100 to decelerate. The brake unit 136 may use friction to reduce a rotational speed of the wheel 121. In another embodiment, the brake unit 136 may convert kinetic energy of the wheel 121 into a current. The brake unit 136 may alternatively use another form to reduce a rotational speed of the wheel 121, to control the speed of the vehicle 100.

[0040] The computer vision system 140 may be operated to process and analyze an image captured by the camera 130, to recognize an object and/or a feature in the ambient environment of the vehicle 100. The object and/or the feature may include a traffic signal, a road boundary, and an obstacle. The computer vision system 140 may use an object recognition algorithm, a structure from motion (structure from motion, SFM) algorithm, video tracking, and another computer vision technology. In some embodiments, the computer vision system 140 may be configured to: draw a map for an environment, trail an object, estimate an object speed, and the like.

[0041] The route control system 142 is configured to determine a travel route of the vehicle 100. In some embodiments, the route control system 142 may determine a driving route for the vehicle 100 with reference to data from the sensors, the GPS 122, and one or more predetermined maps.

[0042] The obstacle avoidance system 144 is configured to recognize, evaluate, and avoid or surmount, in other manners, potential obstacles in the environment of the vehicle 100.

[0043] Certainly, for example, the control system 106 may additionally or alternatively include components in addition to those shown and described. Alternatively, the control system may not include some of the components shown above.

[0044] The vehicle 100 interacts with an external sensor, another vehicle, another computer system, or a user by using the peripheral device 108. The peripheral device 108 may include a wireless communications system 146, a vehicle-mounted computer 148, a microphone 150, and/or a speaker 152.

[0045] In some embodiments, the peripheral device 108 provides a means for a user of the vehicle 100 to interact with the user interface 116. For example, the vehicle-mounted computer 148 may provide information for the user of the vehicle 100. The user interface 116 may further operate the vehicle-mounted computer 148 to receive an input of the user. The vehicle-mounted computer 148 may be operated by using a touchscreen. In another case, the peripheral device 108 may provide a means for the vehicle 100 to communicate with another device located in the vehicle. For example, the microphone 150 may receive audio (for example, according to a voice command or based on other audio input) from the user of the vehicle 100. Likewise, the speaker 152 may output audio to the user of the vehicle 100.

[0046] The wireless communications system 146 may communicate wirelessly with one or more devices directly or through a communications network. For example, the wireless communications system 146 may use 3G cellular communications, for example, code division multiple access (code division multiple access, CDMA), global system for mobile communications (Global System for Mobile Communications, GSM)/GPRS, fourth generation (fourth generation, 4G) communications, for example, LTE, or 5th-generation (5th-Generation, 5G) communications. The wireless communications system 146 may communicate with a wireless local area network (wireless local area network, WLAN) by using Wi-Fi. In some embodiments, the wireless communications system 146 may communicate directly with a device by using an infrared link, Bluetooth, or ZigBee (ZigBee). The wireless communications system 146 may also use other wireless protocols, for example, various vehicle communications systems; for instance, the wireless communications system 146 may include one or more dedicated short range communications (dedicated short range communications, DSRC) devices, which may include public and/or private data communications between vehicles and/or roadside stations.

[0047] The power supply 110 may supply power to various components of the vehicle 100. In an embodiment, the power supply 110 may be a rechargeable lithium-ion or lead-acid battery. One or more battery packs of such batteries may be configured as the power supply to supply power to the components of the vehicle 100. In some embodiments, the power supply 110 and the energy source 119 may be implemented together, for example, in some pure electric vehicles.

[0048] Some or all functions of the vehicle 100 are controlled by the computer system 112. The computer system 112 may include at least one processor 113. The processor 113 executes instructions 115 stored in a non-transitory computer-readable medium such as a data memory 114. The computer system 112 may alternatively be a plurality of computing devices that control an individual component or a subsystem of the vehicle 100 in a distributed manner.

[0049] The processor 113 may be any conventional processor, such as a commercially available central processing unit (central processing unit, CPU). Alternatively, the processor may be a dedicated device such as an application-specific integrated circuit (application-specific integrated circuit, ASIC) or another hardware-based processor. Although FIG. 1 functionally illustrates the processor, the memory, and other elements of the computer 110 in a same block, a person of ordinary skill in the art should understand that the processor, the computer, or the memory may actually include a plurality of processors, computers, or memories that may or may not be stored in a same physical housing. For example, the memory may be a hard disk drive, or another storage medium not located in a housing of the computer 110. Thus, it is understood that a reference to the processor or the computer includes a reference to a set of processors or computers or memories that may or may not operate in parallel. Different from using a single processor to perform the steps described herein, some components such as a steering component and a deceleration component may include respective processors. The processor performs only computation related to a component-specific function.

[0050] In various aspects described herein, the processor may be located far away from the vehicle and wirelessly communicate with the vehicle. In another aspect, some of processes described herein are performed on a processor disposed inside the vehicle, while others are performed by a remote processor. The processes include necessary steps for performing a single operation.

[0051] In some embodiments, the memory 114 may include the instructions 115 (for example, program logics), and the instructions 115 may be executed by the processor 113 to perform various functions of the vehicle 100, including the functions described above. The memory 114 may also include additional instructions, including instructions used to send data to, receive data from, interact with, and/or control one or more of the travel system 102, the sensor system 104, the control system 106, and the peripheral device 108.

[0052] In addition to the instructions 115, the memory 114 may further store data, such as a road map, route information, a location, a direction, a speed, and other vehicle data of the vehicle, and other information. Such information may be used by the vehicle 100 and the computer system 112 when the vehicle 100 operates in an autonomous mode, a semi-autonomous mode, and/or a manual mode.

[0053] In some embodiments, the processor 113 may further execute a solution for planning a vertical motion parameter of the vehicle in this embodiment of this application, to help the vehicle plan the vertical motion parameter. For a specific method for planning a vertical motion parameter, refer to the following description of FIG. 3. For brevity, details are not described herein again.

[0054] The user interface 116 is configured to provide information for or receive information from the user of the vehicle 100. Optionally, the user interface 116 may include one or more input/output devices within a set of peripheral devices 108, such as the wireless communications system 146, the vehicle-mounted computer 148, the microphone 150, and the speaker 152.

[0055] The computer system 112 may control the functions of the vehicle 100 based on inputs received from various subsystems (for example, the travel system 102, the sensor system 104, and the control system 106) and from the user interface 116. For example, the computer system 112 may use input from the control system 106 to control the steering unit 132 to avoid an obstacle detected by the sensor system 104 and the obstacle avoidance system 144. In some embodiments, the computer system 112 may operate to provide control over many aspects of the vehicle 100 and the subsystems of the vehicle 100.

[0056] Optionally, one or more of the foregoing components may be installed separately from or associated with the vehicle 100. For example, the memory 114 may exist partially or completely separate from the vehicle 100. The foregoing components may be communicatively coupled together in a wired and/or wireless manner.

[0057] Optionally, the components are merely examples. In actual application, components in the foregoing modules may be added or deleted based on an actual requirement. FIG. 1 should not be construed as a limitation on embodiments of the present invention.

[0058] An autonomous vehicle driving on a road, such as the vehicle 100, may identify an object in an ambient environment of the vehicle to determine an adjustment to a current speed. The object may be another vehicle, a traffic control device, or another type of object. In some examples, the autonomous vehicle may independently consider each identified object, and may determine a to-be-adjusted speed of the autonomous vehicle based on characteristics of each identified object, such as a current speed of the object, acceleration of the object, and a distance between the object and the autonomous vehicle.

[0059] Optionally, the autonomous vehicle 100 or a computing device associated with the autonomous vehicle 100 (for example, the computer system 112, the computer vision system 140, or the memory 114 in FIG. 1) may predict behavior of the identified object based on a feature of the identified object and a state of the ambient environment (for example, traffic, rain, and ice on a road). Optionally, identified objects depend on behavior of each other, and therefore all the identified objects may be considered together to predict behavior of a single identified object. The vehicle 100 can adjust the speed of the vehicle based on the predicted behavior of the identified object. In other words, the autonomous vehicle can determine, based on the predicted behavior of the object, a stable state to which the vehicle needs to be adjusted (for example, acceleration, deceleration, or stop). In this process, another factor may also be considered to determine the speed of the vehicle 100, for example, a horizontal location of the vehicle 100 on a road on which the vehicle travels, a curvature of the road, and proximity between a static object and a dynamic object.

[0060] In addition to providing an instruction for adjusting the speed of the autonomous vehicle, the computing device may provide an instruction for modifying a steering angle of the vehicle 100, so that the autonomous vehicle follows a given trajectory and/or maintains safe transverse and longitudinal distances from an object (for example, a car in an adjacent lane on the road) next to the autonomous vehicle.

[0061] The vehicle 100 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, a playground vehicle, a construction device, a trolley, a golf cart, a train, a handcart, or the like. This is not specifically limited in embodiments of the present invention.

[0062] The foregoing describes, with reference to FIG. 1, a scenario to which embodiments of this application are applicable. The following describes, with reference to FIG. 2, an autonomous driving system to which embodiments of this application are applicable.

[0063] FIG. 2 is a schematic diagram of an autonomous driving system to which an embodiment of this application is applicable. A computer system 101 includes a processor 103, and the processor 103 is coupled to a system bus 105. The processor 103 may be one or more processors, and each processor may include one or more processor cores. A video adapter (video adapter) 107 may drive a display 109, and the display 109 is coupled to the system bus 105. The system bus 105 is coupled to an input/output (input/output, I/O) bus 113 through a bus bridge 111. An I/O interface 115 is coupled to the I/O bus. The I/O interface 115 communicates with a plurality of I/O devices, for example, an input device 117 (such as a keyboard, a mouse, and a touchscreen), a media tray (media tray) 121 (such as a CD-ROM and a multimedia interface), a transceiver 123 (which may send and/or receive a radio communications signal), a camera 155 (which may capture static and dynamic digital video images), and an external USB interface 125. Optionally, an interface connected to the I/O interface 115 may be a USB port.

[0064] The processor 103 may be any conventional processor, including a reduced instruction set computing (Reduced Instruction Set Computing, RISC) processor, a complex instruction set computing (Complex Instruction Set Computing, CISC) processor, or a combination thereof. Optionally, the processor may be a dedicated apparatus such as an application-specific integrated circuit (ASIC). Optionally, the processor 103 may be a neural network processor or a combination of the neural network processor and the foregoing conventional processor.

[0065] Optionally, in various embodiments described herein, the computer system 101 may be located away from an autonomous vehicle, and may wirelessly communicate with the autonomous vehicle. In another aspect, some of processes described herein are performed on a processor disposed in the autonomous vehicle, and others are performed by a remote processor, including taking an action required to perform a single manipulation.

[0066] The computer 101 may communicate with a software deploying server 149 by using a network interface 129. The network interface 129 is a hardware network interface, such as a network adapter. A network 127 may be an external network such as the Internet, or an internal network such as the Ethernet or a virtual private network (virtual private network, VPN). Optionally, the network 127 may alternatively be a wireless network, for example, a Wi-Fi network or a cellular network.

[0067] A hard disk drive interface is coupled to the system bus 105. The hard disk drive interface is connected to a hard disk drive. A system memory 135 is coupled to the system bus 105. Data running in the system memory 135 may include an operating system 137 and an application program 143 of the computer 101.

[0068] The operating system includes a shell (shell) 139 and a kernel (kernel) 141. The shell 139 is an interface between a user and the kernel of the operating system. The shell 139 is an outermost layer of the operating system. The shell 139 manages interaction between the user and the operating system: waiting for input of the user, explaining the input of the user to the operating system, and processing various output results of the operating system.

[0069] The kernel 141 includes parts of the operating system that are used for managing a memory, a file, a peripheral device, and a system resource. When directly interacting with hardware, the kernel of the operating system usually runs a process, provides inter-process communication, and provides functions such as CPU time slice management, interrupt, memory management, and I/O management.

[0070] The application program 143 includes a program related to controlling autonomous driving of the vehicle, for example, a program for managing interaction between the autonomous vehicle and a road obstacle, a program for controlling a route or a speed of the autonomous vehicle, and a program for controlling interaction between the autonomous vehicle and another autonomous vehicle on the road. The application program 143 may be on a system of the software deploying server (deploying server) 149. In an embodiment, when the application program 143 needs to be executed, the computer system 101 may download the application program 143 from the software deploying server (deploying server) 149.

[0071] In some embodiments, the application program may further include an application program corresponding to a solution of target object perception provided in embodiments of this application. The solution of target object perception in embodiments of this application is specifically described below. For brevity, details are not described herein again.

[0072] A sensor 153 is associated with the computer system 101. The sensor 153 is configured to detect an ambient environment of the computer 101. For example, the sensor 153 may detect a target object, for example, an animal, a vehicle, or an obstacle. Further, the sensor may detect an ambient environment of the target object, for example, an environment around the animal, another animal appearing around the animal, a weather condition, or brightness of the ambient environment. Optionally, if the computer 101 is located in an autonomous vehicle, the sensor may be a laser radar, a camera, an infrared sensor, a chemical detector, a microphone, or the like.

[0073] The foregoing describes, with reference to FIG. 1 and FIG. 2, a vehicle and a driving system to which embodiments of this application are applicable. The following describes, with reference to FIG. 3, a scenario to which the embodiments of this application are applicable by using a system including a vehicle and a cloud service center as an example.

[0074] FIG. 3 is a schematic diagram of a system 300 that includes an autonomous vehicle and a cloud service center and to which an embodiment of this application is applicable. The cloud service center 310 may receive information from an autonomous vehicle 330 and an autonomous vehicle 331 via a network 320, such as a wireless communications network.

[0075] Optionally, the received information may be a location of a target object, a speed of the target object, or the like sent by the autonomous vehicle 330 and/or the autonomous vehicle 331. The target object may be a traffic element detected by a vehicle that collects data in a running process, for example, another vehicle, a pedestrian, or a traffic signal light.

[0076] The cloud service center 310 runs, based on received data, a program that is stored in the cloud service center and that is related to controlling autonomous driving of a vehicle, to control the autonomous vehicles 330 and 331. The program related to controlling autonomous driving of a vehicle may include a program for managing interaction between the autonomous vehicle and a road obstacle, a program for controlling a route or a speed of the autonomous vehicle, and a program for controlling interaction between the autonomous vehicle and another autonomous vehicle on the road.

[0077] The network 320 provides a portion of a map outward to the autonomous vehicle 330 or 331. In another example, operations may be divided between different locations. For example, a plurality of cloud service centers may receive, acknowledge, combine, and/or send information reports. In some examples, information reports and/or sensor data may also be sent between autonomous vehicles. Another configuration is also possible.

[0078] In some examples, the cloud service center sends a suggested solution to the autonomous vehicle for possible driving conditions within the system 300 (for example, informing the vehicle of a forward obstacle and telling it how to bypass the obstacle). For example, the cloud service center may assist the vehicle in determining how to travel when there is a specific obstacle ahead in the environment. The cloud service center sends, to the autonomous vehicle, a response indicating how the vehicle should travel in a given scenario. For example, the cloud service center may determine, based on collected sensor data, that there is a temporary stop sign in the road ahead, and further determine, based on a "lane closed" sign and sensor data from a construction vehicle, that the lane is closed due to construction. Correspondingly, the cloud service center sends a suggested operation mode used by the autonomous vehicle to go around the obstacle (for example, indicating the vehicle to change to another lane or road). When the cloud service center observes a video stream in its operating environment and determines that the autonomous vehicle can safely and successfully go around the obstacle, operation steps used for the autonomous vehicle may be added to a driving information map. Correspondingly, the information may be sent to another vehicle that may encounter the same obstacle in the region, to assist that vehicle not only in recognizing the closed lane but also in knowing how to go around it.

[0079] Currently, a solution in which a plurality of vehicles cooperatively collect observation data of a traffic element in a traffic scenario is usually used to improve accuracy of obtained observation data of the traffic element in the traffic scenario. In the solution in which the plurality of vehicles cooperatively collect the observation data, there is an error in observation data collected by the vehicles in terms of time and space. Consequently, in a process of integrating data, observation data of a same traffic element may be identified as observation data of different traffic elements, or observation data of different traffic elements is identified as observation data of a same traffic element. As a result, an observation result of the traffic element is inaccurate.

[0080] To avoid the foregoing disadvantages, this application provides a new traffic element observation solution. To be specific, observation data that is of a traffic element and that is collected by a plurality of vehicles is synchronized in terms of time and/or corrected in terms of space, and the observation data collected by the plurality of vehicles is adjusted and then integrated, to obtain a final observation result of the traffic element, so as to improve accuracy of the observation result of the traffic element. The following describes a traffic element observation method according to an embodiment of this application with reference to FIG. 4.

[0081] FIG. 4 is a flowchart of a traffic element observation method according to an embodiment of this application. The method shown in FIG. 4 may be performed by the cloud service center 310 shown in FIG. 3, or may be performed by another computing device. This is not limited in this embodiment of this application. The method shown in FIG. 4 includes steps 410 to 430.

[0082] 410: Receive a plurality of groups of first observation data that are of a traffic element and that are sent by a plurality of vehicles, where each of the plurality of vehicles collects one of the plurality of groups of first observation data, and the first observation data indicates a change of a coordinate location of the traffic element with time and/or a change of a speed of the traffic element with time.

[0083] The traffic element may include a dynamic obstacle or a static obstacle in a traffic scenario. The dynamic obstacle may be a vehicle other than the vehicle that collects the first observation data, or may be a pedestrian or the like in the traffic scenario. The static obstacle may be a traffic signal light or the like.

[0084] Optionally, when the traffic element is a target object, the first observation data may include at least one of a type of the target object, a motion status of the target object, a motion trail of the target object, and a size of the target object. The type of the target object may include a vehicle, a pedestrian, a bicycle, and the like. The motion status of the target object may include a static state and a dynamic state. The motion trail of the target object may include a speed trail of the target object and a spatial trail of the target object. The size of the target object may include a length of the target object and a width of the target object.

[0085] Optionally, when the target object is a traffic signal light, the first observation data further includes timing information of the traffic signal light.

[0086] Optionally, the plurality of groups of first observation data are obtained after being collected by an in-vehicle sensor in each of the plurality of vehicles and processed by a multi-domain controller (multi-domain controller, MDC).

[0087] Optionally, when a distance between the plurality of vehicles is within 100 meters, vehicle observation accuracy is high, and may be about 3 cm to 4 cm. For example, a distance between a vehicle #1 and the traffic element is 200 meters, and a distance between a vehicle #2 and the traffic element is 100 meters. In this case, accuracy of observation data based on the distance from the vehicle #1 to the traffic element may be low. According to the method in this embodiment of this application, the accuracy of observation data that is of the traffic element and that is collected by the vehicle #1 can be compensated for by using observation data that is of the traffic element and that is collected by the vehicle #2. Certainly, a distance between every two of the plurality of vehicles is not specifically limited in this embodiment of this application.

[0088] Optionally, the plurality of vehicles are intelligent vehicles.

[0089] 420: Perform time synchronization processing and/or space correction processing on the plurality of groups of first observation data, to obtain a plurality of groups of processed observation data.

[0090] The foregoing time synchronization processing and space correction processing may be selected based on a type of the observation data. If the observation data includes location information of the traffic element, space correction processing may be performed on observation data of this type. For example, when the observation data includes a coordinate location of the traffic element, space correction processing and time synchronization processing may be performed on observation data of this type. If the observation data includes time information, time synchronization can be performed on observation data of this type. For example, if the observation data includes a speed curve of the traffic element, time synchronization processing may be performed on observation data of this type.

[0091] Optionally, the foregoing step 420 includes: determining a time offset between the plurality of groups of first observation data; and adjusting each of the plurality of groups of first observation data based on the time offset between the plurality of groups of first observation data, to obtain the plurality of groups of processed observation data, where time points of all of the plurality of groups of processed observation data are synchronized.

[0092] The time offset between the plurality of groups of first observation data may be an average value of time offsets between every two of the plurality of groups of first observation data, or may be a minimum value in time offsets between every two of the plurality of groups of first observation data, or may be a maximum value in time offsets between every two of the plurality of groups of first observation data. This is not specifically limited in this embodiment of this application.
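
As a small illustration of the aggregation choices just listed, the following Python sketch derives a single offset from pairwise offsets; the numerical values and the helper name aggregate_offset are hypothetical and serve only to make the example runnable.

    pairwise_offsets = [0.08, 0.12, 0.10]  # hypothetical offsets between every two groups, in seconds

    def aggregate_offset(offsets, mode="mean"):
        # The time offset between the plurality of groups may be taken as the
        # average, minimum, or maximum of the pairwise offsets.
        if mode == "mean":
            return sum(offsets) / len(offsets)
        if mode == "min":
            return min(offsets)
        return max(offsets)

    print(aggregate_offset(pairwise_offsets))         # approximately 0.10
    print(aggregate_offset(pairwise_offsets, "max"))  # 0.12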

[0093] Optionally, for a same traffic element, longitudes, latitudes, or course angles that are of the traffic element and that are collected by all of the vehicles at a same moment should be the same. Therefore, the time offset may alternatively be determined based on an offset between longitudes, latitudes, or course angles that are of the traffic element and that are collected by all of the vehicles. The following describes a method for determining a time offset between the plurality of groups of first observation data by using an example in which an offset between longitudes, latitudes, or course angles that are of the traffic element and that are collected by all of the vehicles is calculated by using the least squares method.

[0094] The time offset Δoffset between the plurality of groups of first observation data may be determined by using the formula Δoffset = argmin_Δ Σ_{t=1}^{Y_t} (lon_{i,t} − lon_{j,t+Δ})², where Y_t represents a total quantity of moments at which the plurality of vehicles observe the target traffic element, i represents an ith vehicle in the plurality of vehicles, j represents a jth vehicle in the plurality of vehicles, t represents a tth moment in the Y_t moments, lon_{i,t} represents a longitude that is of the target traffic element and that is collected by the ith vehicle at the tth moment, and lon_{j,t} represents a longitude that is of the target traffic element and that is collected by the jth vehicle at the tth moment.

[0095] Alternatively, the time offset Δoffset between the plurality of groups of first observation data may be determined by using the formula Δoffset = argmin_Δ Σ_{t=1}^{Y_t} (lat_{i,t} − lat_{j,t+Δ})², where Y_t represents a total quantity of moments at which the plurality of vehicles observe the target traffic element, i represents an ith vehicle in the plurality of vehicles, j represents a jth vehicle in the plurality of vehicles, t represents a tth moment in the Y_t moments, lat_{i,t} represents a latitude that is of the target traffic element and that is collected by the ith vehicle at the tth moment, and lat_{j,t} represents a latitude that is of the target traffic element and that is collected by the jth vehicle at the tth moment.

[0096] Alternatively, the time offset Δoffset between the plurality of groups of first observation data may be determined by using the formula Δoffset = argmin_Δ Σ_{t=1}^{Y_t} (yaw_{i,t} − yaw_{j,t+Δ})², where Y_t represents a total quantity of moments at which the plurality of vehicles observe the target traffic element, i represents an ith vehicle in the plurality of vehicles, j represents a jth vehicle in the plurality of vehicles, t represents a tth moment in the Y_t moments, yaw_{i,t} represents an angular rate of a course angle that is of the target traffic element and that is collected by the ith vehicle at the tth moment, and yaw_{j,t} represents an angular rate of the course angle that is of the target traffic element and that is collected by the jth vehicle at the tth moment.

[0097] It should be noted that the foregoing three manners of determining the time offset may be separately used based on different scenarios, or may be combined to determine the time offset. This is not specifically limited in this embodiment of this application.
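
For illustration only, the following Python sketch shows one possible way to carry out the least-squares offset search described above for two longitude sequences; the function name estimate_time_offset, the use of NumPy, and the bounded integer search over candidate shifts are assumptions of this sketch rather than limitations of the method, and the same routine may equally be applied to latitude or course-angle sequences.

    import numpy as np

    def estimate_time_offset(lon_i, lon_j, max_shift=20):
        # Estimate the sample offset between the longitude series of the same
        # traffic element reported by vehicle i and vehicle j by minimizing
        # the sum of squared differences (least squares).
        best_shift, best_cost = 0, float("inf")
        for shift in range(-max_shift, max_shift + 1):
            if shift >= 0:
                a, b = lon_i[shift:], lon_j[:len(lon_j) - shift]
            else:
                a, b = lon_i[:len(lon_i) + shift], lon_j[-shift:]
            n = min(len(a), len(b))
            if n == 0:
                continue
            cost = float(np.sum((np.asarray(a[:n]) - np.asarray(b[:n])) ** 2))
            if cost < best_cost:
                best_shift, best_cost = shift, cost
        return best_shift  # multiply by the sampling period to obtain a time offset

The returned shift can then be applied to one of the groups of first observation data so that the time points of the processed observation data are synchronized, as described in step 420.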

[0098] With reference to FIG. 5 and FIG. 6, the following describes a simulation result of time synchronization processing according to an embodiment of this application. FIG. 5 is a simulation diagram of observation data collected before time synchronization processing. In FIG. 5, a curve 1 indicates a change of observation data that is of a traffic element 1 and that is collected by the vehicle #1 with time, and a curve 2 indicates a change of observation data that is of the traffic element 1 and that is collected by the vehicle #2 with time. It can be learned that before time synchronization processing, the two curves correspond to different observation data at a same moment. FIG. 6 is a simulation diagram of observation data collected after time synchronization processing. It can be learned that after the time synchronization processing method in this embodiment of this application is used, the curve 1 and the curve 2 basically overlap.

[0099] Optionally, if the first observation data indicates a change of a coordinate location of the traffic element with time, step 420 includes: determining, in a preset coordinate system, coordinates over time that are of the traffic element and that are indicated by each of the plurality of groups of first observation data; and representing a coordinate value that is of the traffic element and that is included in each preset coordinate range in the coordinate system as a target coordinate value corresponding to the coordinate range, to obtain coordinates that are of the traffic element and that are in each preset coordinate range in the coordinate system, where the processed observation data includes the coordinates that are of the traffic element and that are in each preset coordinate range in the coordinate system.

[0100] For example, at the qth moment, coordinates that are of the traffic element and that are indicated by each of the plurality of groups of first observation data are located in a grid #1 in the preset coordinate system, and the target coordinate value corresponding to the grid #1 is (x, y). In this case, it may be determined that the coordinates that are of the traffic element and that are indicated by each group of first observation data at the qth moment are the target coordinate value (x, y) corresponding to the grid #1.

[0101] It should be noted that grids in the coordinate system may be divided in advance. This is not limited in this embodiment of this application.
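
As an illustrative sketch of the space correction described above, the following Python code maps reported coordinates to the target coordinate value of the preset coordinate range that contains them; modeling each range as a square grid cell, taking the cell center as the target value, and the cell size of 0.5 are assumptions of this example.

    import math

    def snap_to_grid(x, y, cell_size=0.5):
        # Each preset coordinate range is modeled as a square grid cell of side
        # cell_size; the target coordinate value of a cell is assumed here to be
        # its center.
        gx = math.floor(x / cell_size)
        gy = math.floor(y / cell_size)
        return ((gx + 0.5) * cell_size, (gy + 0.5) * cell_size)

    # Coordinates of the same traffic element reported by two vehicles at the
    # q-th moment fall into the same cell and are therefore represented by the
    # same target coordinate value.
    print(snap_to_grid(10.12, 3.41))  # (10.25, 3.25)
    print(snap_to_grid(10.31, 3.26))  # (10.25, 3.25)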

[0102] Optionally, the change of a coordinate location of the traffic element with time may be determined based on current states (including a location and a speed of the traffic element) that are of the traffic element and that are collected by the plurality of vehicles, and a Kalman filtering algorithm.

[0103] The foregoing Kalman filtering algorithm may be divided into two phases: a prediction phase and an update phase. The prediction phase is used to predict a state of a traffic element at a kth moment based on a state of a traffic element at a (k - 1)th moment. The update phase is used to update a variable in the Kalman filtering algorithm based on a predicted state of the traffic element at the kth moment.

[0104] In the prediction phase, a state x̂_{k|k-1} of the traffic element at the kth moment is predicted by using the formulas x̂_{k|k-1} = F_k x̂_{k-1|k-1} + B_k u_k and P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_k, where x̂_{k|k-1} represents a state vector that is of the traffic element at the kth moment and that is predicted based on a state vector of the traffic element at the (k-1)th moment, x̂_{k-1|k-1} represents a state vector including a location and a speed of the traffic element at the (k-1)th moment, P_{k|k-1} represents a first covariance matrix that is at the kth moment and that is predicted based on a covariance matrix at the (k-1)th moment, P_{k-1|k-1} represents a covariance matrix at the (k-1)th moment, u_k represents a preset control vector, B_k represents a preset control matrix, Q_k represents a preset second covariance matrix, and F_k represents a prediction matrix used for predicting a state of the traffic element at the kth moment based on the state of the traffic element at the (k-1)th moment.

[0105] It should be noted that the control vector and the control matrix may reflect impact of an external factor on the state of the traffic element at the kth moment, and the second covariance matrix may reflect impact of an external uncertainty on the state of the traffic element at the kth moment.

[0106] In the update phase, a measurement residual vector ỹ_k at the kth moment, a measurement residual covariance matrix S_k at the kth moment, and a Kalman gain K_k at the kth moment may be determined by using the formulas ỹ_k = z_k − H_k x̂_{k|k-1}, S_k = H_k P_{k|k-1} H_k^T + R_k, and K_k = P_{k|k-1} H_k^T S_k^{-1}, where H_k represents a sensor reading matrix used to collect a state of the traffic element, R_k represents a preset third covariance matrix, and z_k represents a distribution average value of the sensor readings.

[0107] It should be noted that the third covariance matrix R_k may be set based on noise of the sensor.

[0108] Based on the measurement residual vector ỹ_k, the measurement residual covariance matrix S_k, and the Kalman gain K_k, the state x̂_{k|k} of the traffic element at the kth moment and the covariance matrix P_{k|k} at the kth moment are updated by using the formulas x̂_{k|k} = x̂_{k|k-1} + K_k ỹ_k and P_{k|k} = (I − K_k H_k) P_{k|k-1}.
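
For illustration, the following Python sketch implements one prediction/update cycle of the Kalman filter described above for a simple constant-velocity model; the numerical values of dt, F_k, H_k, Q_k, and R_k, and the function name kalman_step, are assumptions chosen only to make the example runnable.

    import numpy as np

    def kalman_step(x, P, z, F, B, u, Q, H, R):
        # Prediction phase
        x_pred = F @ x + B @ u                     # x̂_{k|k-1}
        P_pred = F @ P @ F.T + Q                   # P_{k|k-1}
        # Update phase
        y = z - H @ x_pred                         # measurement residual ỹ_k
        S = H @ P_pred @ H.T + R                   # residual covariance S_k
        K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain K_k
        x_new = x_pred + K @ y                     # x̂_{k|k}
        P_new = (np.eye(len(x)) - K @ H) @ P_pred  # P_{k|k}
        return x_new, P_new

    # Constant-velocity example: state = [location, speed], measurement = location.
    dt = 0.1
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.zeros((2, 1)); u = np.zeros(1)
    H = np.array([[1.0, 0.0]])
    Q = 0.01 * np.eye(2); R = np.array([[0.25]])
    x = np.array([0.0, 5.0]); P = np.eye(2)
    x, P = kalman_step(x, P, np.array([0.52]), F, B, u, Q, H, R)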

[0109] 430: Determine, based on the plurality of groups of processed observation data, second observation data that is of the traffic element and that is observed by the plurality of vehicles. The second observation data may be used as final observation data of the traffic element.

[0110] Certainly, the plurality of vehicles may further report one or more types of information such as a location of a current vehicle, a speed of the current vehicle, and a state of the current vehicle, so that a cloud computing server determines information such as a relative location and a relative speed between the current vehicle and the traffic element based on the second observation data and the information about the vehicle, and feeds back calculated information to the current vehicle. In this way, the current vehicle may adjust a driving route, a driving speed, or the like based on the information such as the location of the current vehicle, the speed of the current vehicle, and the state of the current vehicle.
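
As a minimal sketch of the relative quantities mentioned above (the function name, the planar coordinate approximation, and the scalar speed are assumptions of this example):

    def relative_state(vehicle_xy, vehicle_speed, element_xy, element_speed):
        # Relative location and relative speed of the traffic element with
        # respect to the reporting vehicle, in a shared planar coordinate frame.
        rel_location = (element_xy[0] - vehicle_xy[0], element_xy[1] - vehicle_xy[1])
        rel_speed = element_speed - vehicle_speed
        return rel_location, rel_speed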

[0111] With reference to FIG. 7, the following describes a traffic element observation method according to an embodiment of this application by using an example in which observation data collected by a vehicle #1, a vehicle #2, and a vehicle #3 is processed. FIG. 7 is a flowchart of a traffic element observation method according to an embodiment of this application. The method shown in FIG. 7 includes steps 710 to 750.

[0112] 710: The vehicle #1, the vehicle #2, and the vehicle #3 each send data collected by the vehicle to a cloud server.

[0113] The data uploaded by each of the vehicles includes observation data collected by the vehicle, a traveling trail of the vehicle, and information about the vehicle. Observation data #1 is collected by the vehicle #1 and includes data of a traffic element #1, observation data #2 is collected by the vehicle #2 and includes data of the traffic element #1, and observation data #3 is collected by the vehicle #3 and includes data of the traffic element #1.

[0114] Optionally, when a distance between every two of the vehicle #1, the vehicle #2, and the vehicle #3 is less than or equal to 100 m, and a distance between any one of the vehicles and the traffic element #1 is less than or equal to 100 m, accuracy of data of the traffic element #1 separately collected by the vehicle #1, the vehicle #2, and the vehicle #3 is high.
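
For illustration, the distance conditions in paragraph [0114] may be checked as in the following sketch; the vehicle and traffic element positions are assumed planar coordinates in metres, and the 100 m threshold is taken from the text.

```python
# Sketch of the distance conditions in paragraph [0114]; the positions used in
# the example call are illustrative assumptions.
from itertools import combinations
import math

def within_range(vehicle_positions, element_position, threshold_m=100.0):
    """True if every pair of vehicles, and every vehicle-to-element pair,
    is within threshold_m metres."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    pairs_ok = all(dist(a, b) <= threshold_m
                   for a, b in combinations(vehicle_positions, 2))
    element_ok = all(dist(v, element_position) <= threshold_m
                     for v in vehicle_positions)
    return pairs_ok and element_ok

print(within_range([(0, 0), (40, 10), (80, 5)], (60, 30)))
```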

[0115] Optionally, the vehicle #1, the vehicle #2, and the vehicle #3 are intelligent vehicles.

[0116] 720: The cloud computing server performs time synchronization processing on the observation data in the foregoing data, to obtain processed observation data #1, processed observation data #2, and processed observation data #3.

[0117] It should be noted that, for a process of the time synchronization processing, refer to the foregoing description. For brevity, details are not described herein again.

[0118] 730: Determine a plurality of groups of coordinate locations of the traffic element #1 on a high-definition map according to location information that is of the traffic element #1 and that is carried in the processed observation data #1, the processed observation data #2, and the processed observation data #3.

[0119] 740: Perform space correction processing on the plurality of groups of coordinate locations to obtain a corrected coordinate location of the traffic element #1.

[0120] It should be noted that, for a process of the space correction processing, refer to the foregoing description. For brevity, details are not described herein again.

[0121] 750: Determine an observation result of each vehicle for the traffic element #1 based on the corrected coordinate location of the traffic element #1, a processed speed curve of the traffic element #1, the traveling trail of each vehicle, and the information of each vehicle, and send the observation result of each vehicle to the vehicle.

[0122] An observation result #1 of the vehicle #1 for the traffic element #1 includes information such as a relative location and a relative speed between the vehicle #1 and the traffic element #1. An observation result #2 of the vehicle #2 for the traffic element #1 includes information such as a relative location and a relative speed between the vehicle #2 and the traffic element #1. An observation result #3 of the vehicle #3 for the traffic element #1 includes information such as a relative location and a relative speed between the vehicle #3 and the traffic element #1.

[0123] The foregoing describes, with reference to FIG. 1 to FIG. 7, the traffic element observation method in embodiments of this application. The following describes apparatuses in embodiments of this application with reference to FIG. 8 and FIG. 9. It should be understood that the apparatuses shown in FIG. 8 and FIG. 9 may implement the steps in the foregoing methods. For brevity, details are not described herein again.

[0124] FIG. 8 is a schematic diagram of a traffic element observation apparatus according to an embodiment of this application. An apparatus 800 shown in FIG. 8 includes a receiving unit 810 and a processing unit 820.

[0125] The receiving unit 810 is configured to receive a plurality of groups of first observation data that are of a traffic element and that are sent by a plurality of vehicles, where each of the plurality of vehicles collects one of the plurality of groups of first observation data, and the first observation data indicates a change of a coordinate location of the traffic element with time and/or a change of a speed of the traffic element with time.

[0126] The processing unit 820 is configured to perform time synchronization processing and/or space correction processing on the plurality of groups of first observation data, to obtain a plurality of groups of processed observation data.

[0127] The processing unit 820 is further configured to determine, based on the plurality of groups of processed observation data, second observation data that is of the traffic element and that is observed by the plurality of vehicles.

[0128] Optionally, in an embodiment, the processing unit 820 is further configured to: determine a time offset between the plurality of groups of first observation data; and adjust each of the plurality of groups of first observation data based on the time offset between the plurality of groups of first observation data, to obtain the processed observation data, where time points of all groups of the processed observation data are synchronized.
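
A minimal sketch of this time synchronization step is shown below. Estimating the offset by cross-correlating the speed curves is only one possible estimator and is an assumption of the example, not a limitation of the embodiment; the sketch then shifts the time points of each group so that all groups are synchronized to the first group.

```python
# Hedged sketch of time synchronization: estimate the time offset between two
# groups of speed samples by cross-correlation (one possible estimator), then
# shift each group's timestamps so that all groups align with a reference group.
import numpy as np

def estimate_offset(ref_speeds, speeds, dt):
    """Offset in seconds that best aligns `speeds` with `ref_speeds`,
    assuming both are sampled on the same uniform grid with period dt."""
    ref = ref_speeds - np.mean(ref_speeds)
    sig = speeds - np.mean(speeds)
    corr = np.correlate(sig, ref, mode="full")
    lag = np.argmax(corr) - (len(ref) - 1)   # positive lag: sig is delayed
    return lag * dt

def synchronize(groups, dt):
    """Each group is a dict with 'timestamps' and 'speeds' numpy arrays;
    returns groups whose timestamps are shifted to match groups[0]."""
    ref = groups[0]["speeds"]
    out = []
    for g in groups:
        offset = estimate_offset(ref, g["speeds"], dt)
        out.append({"timestamps": g["timestamps"] - offset,
                    "speeds": g["speeds"]})
    return out
```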

[0129] Optionally, in an embodiment, the processing unit 820 is further configured to: determine, in a preset coordinate system, coordinates that are of the traffic element at different time points and that are indicated by each of the plurality of groups of first observation data; and represent a coordinate value that is of the traffic element and that is included in each preset coordinate range in the coordinate system as a target coordinate value corresponding to the coordinate range, to obtain coordinates that are of the traffic element and that are in each preset coordinate range in the coordinate system, where the processed observation data includes the coordinates that are of the traffic element and that are in each preset coordinate range in the coordinate system.
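
A minimal sketch of this space correction step is shown below, assuming square preset coordinate ranges of 0.5 m and taking the centre of each range as the target coordinate value; both choices are assumptions for the example.

```python
# Minimal sketch of the space correction described above: every observed
# coordinate that falls inside a preset coordinate range (here a square grid
# cell of 0.5 m, an assumed value) is represented by that range's target
# coordinate value (here the cell centre, also an assumption).
import numpy as np

def snap_to_ranges(coords, cell_size=0.5):
    """coords: (N, 2) array of traffic-element coordinates in the preset
    coordinate system; returns the corrected (N, 2) coordinates."""
    coords = np.asarray(coords, dtype=float)
    cell_index = np.floor(coords / cell_size)      # which preset coordinate range
    return (cell_index + 0.5) * cell_size          # target value: range centre

corrected = snap_to_ranges([[12.31, 4.12], [12.27, 4.08], [12.34, 4.15]])
# All three observations of the same traffic element map to (12.25, 4.25).
```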

[0130] Optionally, in an embodiment, when the traffic element is a target object, the first observation data includes at least one of a type of the target object, a motion status of the target object, a motion trail of the target object, and a size of the target object.

[0131] Optionally, in an embodiment, when the target object is a traffic signal light, the first observation data further includes time serving information of the traffic signal light.

[0132] Optionally, in an embodiment, the plurality of groups of first observation data are obtained after being collected by an in-vehicle sensor in each of the plurality of vehicles and processed by a multi-domain controller.

[0133] In an optional embodiment, the receiving unit 810 may be a communications interface 930, the processing unit 820 may be a processor 920, and the computing device may further include a memory 910. Details are shown in FIG. 9.

[0134] FIG. 9 is a schematic block diagram of a computing device according to another embodiment of this application. The computing device 900 shown in FIG. 9 may include a memory 910, a processor 920, and a communications interface 930. The memory 910, the processor 920, and the communications interface 930 are connected by using an internal connection path. The memory 910 is configured to store instructions. The processor 920 is configured to execute the instructions stored in the memory 910, to control the communications interface 930 to receive and send data such as the foregoing observation data. Optionally, the memory 910 may be coupled to the processor 920 through an interface, or may be integrated together with the processor 920.

[0135] It should be noted that the communications interface 930 uses a transceiver apparatus, such as but not limited to a transceiver, to implement communication between the computing device 900 and another device or a communications network. The communications interface 930 may further include an input/output interface.

[0136] In an implementation process, steps in the foregoing methods can be implemented by using a hardware integrated logical circuit in the processor 920, or by using instructions in a form of software. The methods disclosed with reference to embodiments of this application may be directly performed and completed by using a hardware processor, or may be performed and completed by using a combination of hardware in the processor and a software module. The software module may be located in a mature storage medium in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, a register, or the like. The storage medium is located in the memory 910, and the processor 920 reads information in the memory 910 and completes the steps in the foregoing methods in combination with the hardware of the processor. To avoid repetition, details are not described herein again.

[0137] It should be understood that, the processor in embodiments of this application may be a central processing unit (central processing unit, CPU), or may further be another general-purpose processor, a digital signal processor (digital signal processor, DSP), an application-specific integrated circuit (application-specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA), or another programmable logic device, discrete gate or transistor logic device, discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may further be any conventional processor, or the like.

[0138] It should also be understood that in embodiments of this application, the memory may include a read-only memory and a random access memory, and provide instructions and data for the processor. A part of the processor may further include a non-volatile random access memory. For example, the processor may further store information about a device type.

[0139] The term "and/or" in this specification describes only an association relationship for describing associated objects and indicates that three relationships may exist. For example, A and/or B may represent the following three cases: Only A exists, both A and B exist, and only B exists. In addition, the character "/" in this specification generally indicates an "or" relationship between the associated objects.

[0140] It should be understood that sequence numbers of the foregoing processes do not mean execution sequences in various embodiments of this application. The execution sequences of the processes should be determined according to functions and internal logic of the processes, and should not be construed as any limitation on the implementation processes of embodiments of this application.

[0141] A person of ordinary skill in the art may be aware that, in combination with the examples described in embodiments disclosed in this specification, units and algorithm steps may be implemented by electronic hardware or a combination of computer software and electronic hardware. Whether the functions are performed by hardware or software depends on particular applications and design constraint conditions of the technical solutions. A person skilled in the art may use different methods to implement the described functions for each particular application, but it should not be considered that the implementation goes beyond the scope of this application.

[0142] It may be clearly understood by a person skilled in the art that, for the purpose of convenient and brief description, for a detailed working process of the foregoing system, apparatus, and unit, refer to a corresponding process in the foregoing method embodiments, and details are not described herein again.

[0143] In several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other manners. For example, the described apparatus embodiments are merely examples. For example, division into the units is merely logical function division and may be other division during an actual implementation. For example, a plurality of units or components may be combined or integrated into another system, or some features may be ignored or may not be performed. In addition, the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented by using some interfaces. Indirect couplings or communication connections between the apparatuses or units may be implemented in an electrical form, a mechanical form, or another form.

[0144] The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units may be selected based on actual requirements to achieve the objectives of the solutions of embodiments.

[0145] In addition, functional units in embodiments of this application may be integrated into one processing unit, each of the units may exist alone physically, or two or more units are integrated into one unit.

[0146] When the functions are implemented in a form of a software functional unit and sold or used as an independent product, the functions may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of this application essentially, or the part contributing to the conventional technology, or some of the technical solutions may be implemented in a form of a software product. The computer software product is stored in a storage medium, and includes several instructions for instructing a computer device (which may be a personal computer, a server, or a network device) to perform all or some of the steps of the methods described in embodiments of this application. The foregoing storage medium includes: any medium that can store program code, such as a USB flash disk, a removable hard disk, a read-only memory (read-only memory, ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disc.

[0147] The foregoing descriptions are merely specific implementations of this application, but are not intended to limit the protection scope of this application. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.


Claims

1. A traffic element observation method, comprising:

receiving a plurality of groups of first observation data that are of a traffic element and that are sent by a plurality of vehicles, wherein each of the plurality of vehicles collects one of the plurality of groups of first observation data, and the first observation data indicates a change of a coordinate location of the traffic element with time and/or a change of a speed of the traffic element with time;

performing time synchronization processing and/or space correction processing on the plurality of groups of first observation data, to obtain a plurality of groups of processed observation data; and

determining, based on the plurality of groups of processed observation data, second observation data that is of the traffic element and that is observed by the plurality of vehicles.


 
2. The method according to claim 1, wherein if the first observation data indicates the change of the speed of the traffic element with time, the performing time synchronization processing on the first observation data that is of the traffic element and that is sent by the plurality of vehicles, to obtain a plurality of groups of processed observation data comprises:

determining a time offset between the plurality of groups of first observation data; and

adjusting each of the plurality of groups of first observation data based on the time offset between the plurality of groups of first observation data, to obtain the processed observation data, wherein time points of all groups of the processed observation data are synchronized.


 
3. The method according to claim 1 or 2, wherein if the first observation data indicates the change of the coordinate location of the traffic element with time, the performing space correction processing on the plurality of groups of first observation data, to obtain a plurality of groups of processed observation data comprises:

determining, in a preset coordinate system, coordinates that are of the traffic element at different time points and that are indicated by each of the plurality of groups of first observation data; and

representing a coordinate value that is of the traffic element and that is comprised in each preset coordinate range in the coordinate system as a target coordinate value corresponding to the coordinate range, to obtain coordinates that are of the traffic element and that are in each preset coordinate range in the coordinate system, wherein the processed observation data comprises the coordinates that are of the traffic element and that are in each preset coordinate range in the coordinate system.


 
4. The method according to any one of claims 1 to 3, wherein when the traffic element is a target object, the first observation data comprises at least one of a type of the target object, a motion status of the target object, a motion trail of the target object, and a size of the target object.
 
5. The method according to claim 4, wherein when the target object is a traffic signal light, the first observation data further comprises time serving information of the traffic signal light.
 
6. The method according to any one of claims 1 to 5, wherein the plurality of groups of first observation data are obtained after being collected by an in-vehicle sensor in each of the plurality of vehicles and processed by a multi-domain controller MDC.
 
7. A traffic element observation apparatus, comprising:

a receiving unit, configured to receive a plurality of groups of first observation data that are of a traffic element and that are sent by a plurality of vehicles, wherein each of the plurality of vehicles collects one of the plurality of groups of first observation data, and the first observation data indicates a change of a coordinate location of the traffic element with time and/or a change of a speed of the traffic element with time; and

a processing unit, configured to perform time synchronization processing and/or space correction processing on the plurality of groups of first observation data, to obtain a plurality of groups of processed observation data, wherein

the processing unit is further configured to determine, based on the plurality of groups of processed observation data, second observation data that is of the traffic element and that is observed by the plurality of vehicles.


 
8. The apparatus according to claim 7, wherein the processing unit is further configured to:

determine a time offset between the plurality of groups of first observation data; and

adjust each of the plurality of groups of first observation data based on the time offset between the plurality of groups of first observation data, to obtain the processed observation data, wherein time points of all groups of the processed observation data are synchronized.


 
9. The apparatus according to claim 7 or 8, wherein the processing unit is further configured to:

determine, in a preset coordinate system, coordinates that are of the traffic element at different time points and that are indicated by each of the plurality of groups of first observation data; and

represent a coordinate value that is of the traffic element and that is comprised in each preset coordinate range in the coordinate system as a target coordinate value corresponding to the coordinate range, to obtain coordinates that are of the traffic element and that are in each preset coordinate range in the coordinate system, wherein the processed observation data comprises the coordinates that are of the traffic element and that are in each preset coordinate range in the coordinate system.


 
10. The apparatus according to any one of claims 7 to 9, wherein when the traffic element is a target object, the first observation data comprises at least one of a type of the target object, a motion status of the target object, a motion trail of the target object, and a size of the target object.
 
11. The apparatus according to claim 10, wherein when the target object is a traffic signal light, the first observation data further comprises time serving information of the traffic signal light.
 
12. The apparatus according to any one of claims 7 to 11, wherein the plurality of groups of first observation data are obtained after being collected by an in-vehicle sensor in each of the plurality of vehicles and processed by a multi-domain controller MDC.
 
13. A computing device, comprising at least one processor and a memory, wherein the at least one processor is coupled to the memory, and is configured to: read and execute instructions in the memory, to perform the method according to any one of claims 1 to 6.
 
14. A computer-readable medium, wherein the computer-readable medium stores program code, and when the program code is run on a computer, the computer is enabled to perform the method according to any one of claims 1 to 6.
 




Drawing


Search report