(19)
(11) EP 4 089 648 A1

(12) EUROPEAN PATENT APPLICATION

(43) Date of publication:
16.11.2022 Bulletin 2022/46

(21) Application number: 22164698.7

(22) Date of filing: 28.03.2022
(51) International Patent Classification (IPC):
G06V 20/56 (2022.01)
G06V 10/62 (2022.01)
G06V 10/44 (2022.01)
G06T 7/246 (2017.01)
(52) Cooperative Patent Classification (CPC):
G06V 20/588; G06V 10/44; G06V 10/62; G06T 7/12; G06T 2207/30256; G06T 7/251
(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
Designated Extension States:
BA ME
Designated Validation States:
KH MA MD TN

(30) Priority: 10.05.2021 CN 202110505759

(71) Applicant: Nio Technology (Anhui) Co., Ltd
Hefei City, Anhui 230601 (CN)

(72) Inventor:
  • LIN, Binbin
    Anhui, 230611 (CN)

(74) Representative: Vossius & Partner Patentanwälte Rechtsanwälte mbB
Siebertstraße 3
81675 München (DE)

  


(54) LANE EDGE EXTRACTION METHOD AND APPARATUS, AUTONOMOUS DRIVING SYSTEM, VEHICLE, AND STORAGE MEDIUM


(57) This application relates to a lane edge extraction method and apparatus, an autonomous driving system, a vehicle, and a storage medium. The lane edge extraction method includes the steps of: receiving tracking edge points, about lane edges, of an immediately preceding frame of an edge image sequence; determining observation edge points, about the lane edges, of a current frame of the edge image sequence; continuing and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame, to obtain temporary tracking edge points of the current frame; fitting a lane edge curve based on the temporary tracking edge points; and excluding outliers from the temporary tracking edge points based on the lane edge curve, to form tracking edge points of the current frame. The method can improve the stability and accuracy of lane edge extraction.




Description

Technical Field



[0001] This application relates to the field of visual control for vehicles, and in particular, to a lane edge extraction method, a lane edge extraction apparatus, an autonomous driving system, a vehicle, and a computer-readable storage medium.

Background Art



[0002] Computer vision processing technology is increasingly applied in the field of vehicle driving. At present, functions such as lateral control for driver assistance depend heavily on the quality of lane lines on a road. When information about lane lines is inaccurate due to blurred road markings, accumulated water and snow on a road, etc., a driver assistance system can hardly control the direction of a vehicle correctly. These common scenarios limit the application range of vehicle driving assistance and easily lead to dangers during application. Existing visual identification can provide a drivable space for a vehicle, but the resulting data is relatively primitive in structure, noisy, and subject to large errors. Therefore, although such data is adequate for function suppression and warning, it may raise many risks if used directly for route planning and control.

Summary of the Invention



[0003] Embodiments of this application provide a lane edge extraction method, a lane edge extraction apparatus, an autonomous driving system, a vehicle, and a computer-readable storage medium, which are used for improving the stability and accuracy of lane edge extraction.

[0004] According to an aspect of this application, a lane edge extraction method is provided, the method including: receiving tracking edge points, about lane edges, of an immediately preceding frame of an edge image sequence; determining observation edge points, about the lane edges, of a current frame of the edge image sequence; continuing and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame, to obtain temporary tracking edge points of the current frame; fitting a lane edge curve based on the temporary tracking edge points; and excluding outliers from the temporary tracking edge points based on the lane edge curve, to form tracking edge points of the current frame.

[0005] In some embodiments of this application, optionally, tracking edge points of a first frame of the edge image sequence are observation edge points of the first frame.

[0006] In some embodiments of this application, optionally, the observation edge points of the current frame are determined in a vehicle rectangular coordinate system, and the tracking edge points of the immediately preceding frame are corrected in a vehicle polar coordinate system.

[0007] In some embodiments of this application, optionally, the continuing and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame, to obtain temporary tracking edge points of the current frame include: determining current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system; mapping the observation edge points of the current frame and the tracking edge points of the immediately preceding frame from the vehicle rectangular coordinate system to the vehicle polar coordinate system; continuing and correcting, in the vehicle polar coordinate system, the tracking edge points of the immediately preceding frame by using the observation edge points of the current frame, to obtain the temporary tracking edge points; and mapping the temporary tracking edge points into the vehicle rectangular coordinate system.

[0008] In some embodiments of this application, optionally, the continuing and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame, to obtain temporary tracking edge points of the current frame include: determining current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system; continuing, in the vehicle rectangular coordinate system, a non-coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame, to obtain a part of the temporary tracking edge points; mapping a coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame into the vehicle polar coordinate system, and correcting the tracking edge points of the immediately preceding frame by using the observation edge points of the current frame in the coincident part, to obtain a remaining part of the temporary tracking edge points; and mapping the remaining part of the temporary tracking edge points into the vehicle rectangular coordinate system.

[0009] In some embodiments of this application, optionally, the determining current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system includes: determining the current positions of the tracking edge points of the immediately preceding frame based on a speed and a yaw angle of a vehicle and a time difference between the immediately preceding frame and the current frame.

[0010] In some embodiments of this application, optionally, where an angle of orientation of a first point in the tracking edge points of the immediately preceding frame is between a second point and a third point of the observation edge points of the current frame, linear interpolation is performed by using amplitude values of the second point and the third point, to obtain an amplitude value of a fourth point having the same angle of orientation as the first point; and an amplitude value of the first point is corrected by means of filtering based on the amplitude value of the first point and the amplitude value of the fourth point.

[0011] In some embodiments of this application, optionally, the method includes: fitting the lane edge curve by means of a least square method based on the temporary tracking edge points; and determining that the fourth point is a part of the outliers if a distance between the fourth point in the temporary tracking edge points and the lane edge curve is greater than a preset value.

[0012] According to another aspect of this application, a lane edge extraction apparatus is provided, the apparatus including: an image obtaining apparatus configured to obtain an edge image sequence; a calculation apparatus configured to: receive tracking edge points, about lane edges, of an immediately preceding frame of the edge image sequence; determine observation edge points, about the lane edges, of a current frame of the edge image sequence; continue and correct the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame, to obtain temporary tracking edge points of the current frame; fit a lane edge curve based on the temporary tracking edge points; and exclude outliers from the temporary tracking edge points based on the lane edge curve, to form tracking edge points of the current frame; and an edge generation unit configured to output the lane edge curve.

[0013] In some embodiments of this application, optionally, tracking edge points of a first frame of the edge image sequence are observation edge points of the first frame.

[0014] In some embodiments of this application, optionally, the calculation apparatus is configured to: determine the observation edge points of the current frame in a vehicle rectangular coordinate system, and correct the tracking edge points of the immediately preceding frame in a vehicle polar coordinate system.

[0015] In some embodiments of this application, optionally, the calculation apparatus is configured to: determine current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system; map the observation edge points of the current frame and the tracking edge points of the immediately preceding frame from the vehicle rectangular coordinate system to the vehicle polar coordinate system; continue and correct, in the vehicle polar coordinate system, the tracking edge points of the immediately preceding frame by using the observation edge points of the current frame, to obtain the temporary tracking edge points; and map the temporary tracking edge points into the vehicle rectangular coordinate system.

[0016] In some embodiments of this application, optionally, the calculation apparatus is configured to: determine current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system; continue, in the vehicle rectangular coordinate system, a non-coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame, to obtain a part of the temporary tracking edge points; map a coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame into the vehicle polar coordinate system, and correct the tracking edge points of the immediately preceding frame by using the observation edge points of the current frame in the coincident part, to obtain a remaining part of the temporary tracking edge points; and map the remaining part of the temporary tracking edge points into the vehicle rectangular coordinate system.

[0017] In some embodiments of this application, optionally, the determining current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system includes: determining the current positions of the tracking edge points of the immediately preceding frame based on a speed and a yaw angle of a vehicle and a time difference between the immediately preceding frame and the current frame.

[0018] In some embodiments of this application, optionally, the calculation apparatus is configured to: where an angle of orientation of a first point in the tracking edge points of the immediately preceding frame is between a second point and a third point of the observation edge points of the current frame, perform linear interpolation by using amplitude values of the second point and the third point, to obtain an amplitude value of a fourth point having the same angle of orientation as the first point; and correct an amplitude value of the first point by means of filtering based on the amplitude value of the first point and the amplitude value of the fourth point.

[0019] In some embodiments of this application, optionally, the calculation apparatus is configured to: fit the lane edge curve by means of a least square method based on the temporary tracking edge points; and determine that the fourth point is a part of the outliers if a distance between the fourth point in the temporary tracking edge points and the lane edge curve is greater than a preset value.

[0020] According to another aspect of this application, an autonomous driving system is provided, which includes any one of the lane edge extraction apparatuses as described above.

[0021] According to another aspect of this application, a vehicle is provided, which includes any one of the lane edge extraction apparatuses as described above or any one of the autonomous driving systems as described above.

[0022] According to another aspect of this application, there is provided a computer-readable storage medium storing instructions, where the instructions, when executed by a processor, cause the processor to perform any one of the methods as described above.

Brief Description of the Drawings



[0023] The above and other objectives and advantages of this application will become more apparent and clearer from the following detailed description in conjunction with the drawings, in which the same or similar elements are represented by the same reference numerals.

FIG. 1 shows a lane edge extraction method according to an embodiment of this application.

FIG. 2 shows a lane edge extraction apparatus according to an embodiment of this application.

FIG. 3 shows a principle of lane edge extraction according to an embodiment of this application.

FIG. 4 shows a principle of lane edge extraction according to an embodiment of this application.

FIG. 5 shows a principle of lane edge extraction according to an embodiment of this application.

FIG. 6 shows a scenario for lane edge extraction according to an embodiment of this application.


Detailed Description of Embodiments



[0024] For the sake of brevity and illustrative purposes, the principles of this application are mainly described herein with reference to its exemplary embodiments. However, those skilled in the art would readily appreciate that the same principles can be equivalently applied to all types of lane edge extraction methods, lane edge extraction apparatuses, autonomous driving systems, vehicles, and computer-readable storage media, and the same or similar principles may be implemented therein. These variations do not depart from the true spirit and scope of this application.

[0025] FIG. 6 shows a scenario 60 for lane edge extraction according to an embodiment of this application, in which a vehicle 601 obtains image information of lane edges by using an image obtaining apparatus 602 mounted on the vehicle itself. Only one image obtaining apparatus 602 is shown in the figure; however, the vehicle 601 may be provided with multiple image obtaining apparatuses as desired, which may also operate in different wavelength ranges.

[0026] As shown in FIG. 6, the lane edges may include a lane dividing line 612 (generally a broken white line) of lanes in a same direction, a road edge line 613 (generally a long solid white or yellow line), and a lane dividing line 611 (generally a long single or double solid yellow line) of lanes in different directions. Nevertheless, these traffic markings may be visually discontinuous due to aging, accumulated water and snow, etc., so that the image obtaining apparatus 602 fails to capture continuous markings. In addition, the image obtaining apparatus 602 may fail to capture the markings due to short-term occlusion. However, computer vision-assisted/autonomous driving functions require accurate lane edge information. In view of this, in the following examples of this application, high-quality lane edge information is generated from the defective marking images obtained by the image obtaining apparatus 602, for use by a processing device such as an onboard computer.

[0027] According to an aspect of this application, a lane edge extraction method is provided. As shown in FIG. 1, a lane edge extraction method 10 includes the steps as follows. The lane edge extraction method 10 involves receiving tracking edge points, about lane edges, of an immediately preceding frame of an edge image sequence in step S102. An image obtaining apparatus 602 such as that in FIG. 6 can, when moving with a vehicle, capture a sequence of edge images at a fixed time interval. The sequence of edge images will be used to form observation edge points about lane edges, and tracking edge points can be further obtained by the following calculations. In some examples of this application, tracking edge points of each frame are generated on a rolling basis, and tracking edge points of a current frame are generated in step S110 below. It should be noted that the tracking edge points of the immediately preceding frame may be generated in a previous computing cycle by using the same method as in steps S102 to S110.

[0028] In the context of the invention, the immediately preceding frame refers to a preceding frame that participates in the calculation and is used to determine the observation edge points and tracking edge points therein. In a typical example, the immediately preceding frame refers to an adjacent previous frame. In other examples, not every frame in the image sequence is used to determine observation edge points and tracking edge points therein, and the immediately preceding frame in this case may also be a non-adjacent previous frame.

[0029] The lane edge extraction method 10 involves determining observation edge points, about the lane edges, of a current frame of the edge image sequence in step S104. The observation edge points are related to the various lane edges shown in FIG. 6, and specifically, may be feature points extracted from the various lane edges in the image sequence. The feature point extraction method may be implemented according to existing technology or technology to be developed, and is not limited in this application. In addition, the tracking edge points in the context of this application are calculated based on the observation edge points, and therefore, the tracking edge points are indirectly calculated from the lane edges.
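Since the application deliberately leaves the extraction method open, the following is only an illustrative sketch: it derives candidate observation edge points from a camera frame with OpenCV's Canny detector. The function name, the thresholds, the choice of Canny itself, and the omitted projection from pixel coordinates into the vehicle coordinate system are all assumptions, not part of the claimed method.

```python
import cv2
import numpy as np

def extract_observation_edge_points(frame_bgr: np.ndarray) -> np.ndarray:
    """Return an (N, 2) array of candidate edge-point coordinates (pixels)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)        # suppress sensor noise
    edges = cv2.Canny(blurred, threshold1=50, threshold2=150)
    ys, xs = np.nonzero(edges)                         # pixel positions of edge responses
    return np.stack([xs, ys], axis=1).astype(float)
```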

[0030] The lane edge extraction method 10 involves continuing and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame, to obtain temporary tracking edge points of the current frame in step S106. As the vehicle 601 in, for example, FIG. 6 travels, the image obtaining apparatus 602 captures new image frames. When the vehicle 601 travels to a current position, the image obtaining apparatus 602 captures a current frame. In this case, observation edge points can be extracted from the current frame and continued with the tracking edge points of the immediately preceding frame, to supplement the lane edge information brought about by the vehicle 601 traveling to a new position. On the other hand, there may be differences in a coincident area between some observation edge points in the current frame and some tracking edge points in the immediately preceding frame; in this case, these tracking edge points can be corrected by using the observation edge points, thereby eliminating accumulated errors. Edge points formed after continuation and correction will continue to be used in the following steps, and are thus called temporary tracking edge points.

[0031] Certainly, in a specific case, there may be no tracking edge points of the immediately preceding frame, and in this case, all observation edge points will serve as the temporary tracking edge points of the current frame. In other words, the above-mentioned continuation and correction operations may be omitted (or it is considered that a do-nothing operation is performed). This processing approach to unusual situations is used throughout this application.

[0032] The lane edge extraction method 10 involves fitting a lane edge curve based on the temporary tracking edge points in step S108. For example, the lane edge curve may be fitted by means of a least square method based on the temporary tracking edge points. Although some outliers will be removed from these temporary tracking edge points in step S110, since outliers are usually few, the curve fitted in step S108 can well reflect the features of the lane edges, without the need to remove the outliers and then fit the curve again.

[0033] The lane edge extraction method 10 involves excluding outliers from the temporary tracking edge points based on the lane edge curve, to form tracking edge points of the current frame in step S110. For example, if a distance between a certain point in the temporary tracking edge points and the lane edge curve is greater than a preset value, this point is determined to be a part of the outliers. After exclusion of the outliers, the remaining temporary tracking edge points are collected as "the tracking edge points of the current frame". The tracking edge points of the current frame will be input into the processing of an immediately following frame, and the above process can be repeated. For the definition of the immediately following frame, reference may be made to the foregoing definition of the immediately preceding frame.
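Steps S108 and S110 can be illustrated together. The following minimal sketch fits a polynomial lane edge curve by least squares and then excludes points whose distance to the curve exceeds a preset value; the cubic degree, the vertical-residual distance metric, and the default threshold are illustrative assumptions.

```python
import numpy as np

def fit_and_filter(points: np.ndarray, max_dist: float = 0.5):
    """Fit a lane edge curve (step S108) and exclude outliers (step S110).

    points: (N, 2) array of (x, y) temporary tracking edge points.
    Returns the curve coefficients and the tracking edge points of the frame.
    """
    coeffs = np.polyfit(points[:, 0], points[:, 1], deg=3)   # least-squares fit
    residuals = np.abs(np.polyval(coeffs, points[:, 0]) - points[:, 1])
    inliers = points[residuals <= max_dist]   # drop points too far from the curve
    return coeffs, inliers
```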

[0034] The general principle of processing for unusual situations has been described above. In some embodiments of this application, there is no tracking edge point input before a first frame of the edge image sequence, and therefore, observation edge points of the first frame can be used as tracking edge points of the first frame, and continue to be used for subsequent processing.

[0035] It can be seen from the above example that the tracking edge points are continuously updated iteratively during a calculation process. In each processing cycle, an observation edge point captured at a current moment is input, and the fitted lane edge curve is output. The iterative update of the tracking edge points can effectively eliminate the accumulated errors, and therefore, the lane edge curve generated by using the above example can well reflect the features of the lane edges, which provides a reliable data guarantee for functions such as autonomous driving.
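To make the rolling update concrete, the following sketch ties the steps into one processing cycle. It reuses the extract_observation_edge_points and fit_and_filter sketches above, while continue_and_correct is a naive, hypothetical stand-in for step S106 (continuation only; the polar-coordinate correction is sketched further below).

```python
import numpy as np

def continue_and_correct(tracking_prev, observed):
    # Naive stand-in for step S106: concatenate old tracking points with new
    # observations; the real step also corrects the coincident part in polar
    # coordinates (see the later sketches).
    return np.vstack([tracking_prev, observed])

def process_frame(frame, tracking_prev):
    observed = extract_observation_edge_points(frame)        # step S104
    if tracking_prev is None:                                # first frame ([0034])
        temporary = observed
    else:
        temporary = continue_and_correct(tracking_prev, observed)  # step S106
    curve, tracking_cur = fit_and_filter(temporary)          # steps S108 and S110
    return curve, tracking_cur                               # curve is the output
```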

[0036] In some embodiments, the determination of the observation edge points of the current frame is accomplished in a vehicle rectangular coordinate system. The vehicle rectangular coordinate system is a reference system with a certain point of the vehicle itself (for example, the image obtaining apparatus) as the origin. With the vehicle rectangular coordinate system, motion information of the vehicle can be directly used to deduce the updated position of a measured point in that coordinate system. This configuration can significantly reduce the computational complexity.

[0037] In some embodiments, the correction of the tracking edge points of the immediately preceding frame is accomplished in a vehicle polar coordinate system. The image obtaining apparatus 602 such as that in FIG. 6 captures images with its own position as the center of a circle (or sphere), and therefore, the imaging error is also related to this imaging principle. The determination of the observation edge points in the vehicle rectangular coordinate system has been described above; correcting the tracking edge points of the immediately preceding frame in the same rectangular coordinate system cannot well reflect the characteristics of the imaging error. Performing error correction in the vehicle polar coordinate system, by contrast, reflects the characteristics of circular (spherical) imaging and achieves a better correction effect than correction in the vehicle rectangular coordinate system. Although the additional coordinate transformation appears to increase computing overheads, the road edge curve can be efficiently fitted by fully utilizing the different characteristics of the two coordinate systems, which cannot be achieved with a conventional single coordinate system.
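The mapping between the two coordinate systems can be sketched as follows, assuming both share the image obtaining apparatus as their origin and using the angle of orientation and the amplitude as the polar coordinates; the axis conventions are assumptions.

```python
import numpy as np

def rect_to_polar(points_xy: np.ndarray) -> np.ndarray:
    """(N, 2) of (x, y) -> (N, 2) of (theta, r): angle of orientation, amplitude."""
    x, y = points_xy[:, 0], points_xy[:, 1]
    return np.stack([np.arctan2(y, x), np.hypot(x, y)], axis=1)

def polar_to_rect(points_tr: np.ndarray) -> np.ndarray:
    """(N, 2) of (theta, r) -> (N, 2) of (x, y)."""
    theta, r = points_tr[:, 0], points_tr[:, 1]
    return np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)
```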

[0038] In some embodiments of this application, the lane edge extraction method 10 further specifically includes the following process: determining current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system; mapping the observation edge points of the current frame and the tracking edge points of the immediately preceding frame from the vehicle rectangular coordinate system to the vehicle polar coordinate system; continuing and correcting, in the vehicle polar coordinate system, the tracking edge points of the immediately preceding frame by using the observation edge points of the current frame, to obtain the temporary tracking edge points; and mapping the temporary tracking edge points into the vehicle rectangular coordinate system. In this example, the continuation and correction are both performed in the polar coordinate system. In this way, the continuation and correction can be completed in one coordinate system, which simplifies processing.

[0039] Referring to FIG. 3, a tracking edge point 301 of an immediately preceding frame and an observation edge point 302 of a current frame are both shown in a vehicle rectangular coordinate system, and there is an overlapping part 303 between the two. The so-called overlapping part refers to the region over which the two point sets extend into each other. There may be a difference between the coordinates of the tracking edge point and the coordinates of the observation edge point in the overlapping part 303; the existence of two coordinate records for a same feature point indicates an error in the coordinate values. Therefore, the overlapping part 303 includes accumulated errors recorded in the tracking edge point 301 of the immediately preceding frame, and these accumulated errors can be corrected by using the observation edge point 302 of the current frame. In the previous example, the continuation and correction are both performed in polar coordinates, and therefore, no continuation or correction is performed in the rectangular coordinate system shown in FIG. 3.

[0040] FIG. 4 shows an example of mapping points in a rectangular coordinate system into a polar coordinate system. In the upper part of FIG. 4, a point set 401 corresponds to the tracking edge point 301 of the immediately preceding frame, a point set 402 corresponds to the observation edge point 302 of the current frame, and a point set 403 corresponds to the overlapping part 303 between the tracking edge point and the observation edge point. In the previous example, the continuation and correction are both performed in the polar coordinates. Specifically, the overlapping part can first be corrected to obtain a correction point set corresponding to the point set 403, and the correction point set can then be continued with the point set 401 and the point set 402. These point sets continued together are referred to as temporary tracking edge points, which will also be mapped back into the vehicle rectangular coordinate system.

[0041] In some other embodiments of this application, the lane edge extraction method 10 further specifically includes the following process: determining current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system; continuing, in the vehicle rectangular coordinate system, a non-coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame, to obtain a part of the temporary tracking edge points; mapping a coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame into the vehicle polar coordinate system, and correcting the tracking edge points of the immediately preceding frame by using the observation edge points of the current frame in the coincident part, to obtain a remaining part of the temporary tracking edge points; and mapping the remaining part of the temporary tracking edge points into the vehicle rectangular coordinate system. In this example, the continuation is carried out in rectangular coordinates, while the correction is carried out in polar coordinates. In this way, each operation is handled in the coordinate system whose characteristics suit it best, such that the processing efficiency can be improved.

[0042] Still referring to FIG. 3, the tracking edge point 301 of the immediately preceding frame can first be continued with the observation edge point 302 of the current frame in the rectangular coordinate system; that is to say, the non-coincident parts of the tracking edge point of the immediately preceding frame and the observation edge point of the current frame (i.e., 301 and 302 in FIG. 3) are first continued in the rectangular coordinate system. In this case, a part of the temporary tracking edge points can be generated (and the remaining part thereof will be generated in the polar coordinates).

[0043] In addition, FIG. 4 shows an example of mapping a point in a rectangular coordinate system into a polar coordinate system. In a lower part of FIG. 4, the point set 413 corresponds to the overlapping part 303 in FIG. 3. It can be seen that compared with the upper part of the figure, in the lower part, only the overlapping part is mapped. In this case, the tracking edge points of the immediately preceding frame can be corrected by using the observation edge points of the current frame in the overlapping part, to obtain a correction result, that is, the remaining part of the temporary tracking edge points, and the remaining part is also mapped back into the vehicle rectangular coordinate system. The part of the temporary tracking edge points and the remaining part of the temporary tracking edge points may then be combined to form the temporary tracking edge points.

[0044] Due to a dynamic change in vehicle positions, current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system need to be first determined in the above two examples. For example, the current positions of the tracking edge points of the immediately preceding frame may be determined based on a speed and a yaw angle of a vehicle and a time difference between the immediately preceding frame and the current frame. In this way, the positions of the tracking edge points can be updated directly based on the vehicle motion and the time difference.
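A minimal sketch of this position update follows, assuming a simple planar dead-reckoning model: the vehicle travels v·dt along its previous heading, and its heading then changes by the yaw angle difference between the frames. The model and the x-forward/y-left axis convention are assumptions; the application only states that the speed, the yaw angle, and the time difference are used.

```python
import numpy as np

def update_positions(points_xy: np.ndarray, v: float, yaw: float, dt: float) -> np.ndarray:
    """Re-express previous-frame points in the current vehicle frame.

    v: vehicle speed; yaw: heading change between frames; dt: time difference.
    """
    c, s = np.cos(yaw), np.sin(yaw)
    rotation = np.array([[c, -s], [s, c]])            # R(yaw); row @ R applies R^T
    translated = points_xy - np.array([v * dt, 0.0])  # remove vehicle displacement
    return translated @ rotation                      # remove vehicle rotation
```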

[0045] In the above two examples, the correction operation is implemented in polar coordinates. In some embodiments of this application, where an angle of orientation of a first point in the tracking edge points of the immediately preceding frame is between a second point and a third point of the observation edge points of the current frame, linear interpolation is performed by using amplitude values of the second point and the third point, to obtain an amplitude value of a fourth point having the same angle of orientation as the first point; and an amplitude value of the first point is corrected by means of filtering based on the amplitude value of the first point and the amplitude value of the fourth point. Specifically, as shown in FIG. 5, points A and B in the vehicle rectangular coordinate system are observation edge points of the current frame, and a point C is a tracking edge point of the immediately preceding frame. In other words, the area formed by the points A, B, and C is the coincident (overlapping) part described in the context of this application. In this case, the points A, B, and C are mapped into the vehicle polar coordinate system. The right part of FIG. 5 shows the relative positional relationships of the points A, B, and C in the polar coordinate system, where the angle of orientation of the point C is between the points A and B. In some examples of this application, the coordinates of the point C may be corrected by using the coordinates of the points A and B. First, linear interpolation can be performed by using the points A and B, to obtain a point C' having the same angle of orientation as the point C. Then, filtering (e.g., Kalman filtering) can be performed on the points C and C', to obtain a corrected position point C", which is used as the new coordinates of the point C.
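A minimal sketch of this correction in polar coordinates follows, with each point given as an (angle of orientation, amplitude) pair. A fixed-gain blend stands in for the filtering step; the application names Kalman filtering only as one option, so the gain value here is purely an assumption.

```python
def correct_point(point_c, point_a, point_b, gain=0.5):
    """Correct tracked point C using observations A and B (cf. FIG. 5).

    Each point is (theta, r); requires theta_a <= theta_c <= theta_b.
    """
    theta_c, r_c = point_c
    theta_a, r_a = point_a
    theta_b, r_b = point_b
    w = (theta_c - theta_a) / (theta_b - theta_a)   # interpolation weight
    r_interp = (1.0 - w) * r_a + w * r_b            # amplitude of C'
    r_new = (1.0 - gain) * r_c + gain * r_interp    # filtered amplitude of C''
    return (theta_c, r_new)
```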

[0046] According to another aspect of this application, a lane edge extraction apparatus is provided. As shown in FIG. 2, a lane edge extraction apparatus 20 includes an image obtaining apparatus 202, a calculation apparatus 204, and an edge generation unit 206. The image obtaining apparatus 202 of the lane edge extraction apparatus 20 is configured to obtain an edge image sequence.

[0047] The calculation apparatus 204 of the lane edge extraction apparatus 20 is configured to receive tracking edge points, about lane edges, of an immediately preceding frame of the edge image sequence. The image obtaining apparatus 202 can, when moving with a vehicle, capture a sequence of edge images at a fixed time interval. The sequence of edge images will be used by the calculation apparatus 204 to form observation edge points about lane edges, and tracking edge points can be further obtained by the following calculations. In some examples of this application, tracking edge points of each frame are generated on a rolling basis, and the calculation apparatus 204 may also generate tracking edge points of a current frame as described below. It should be noted that the tracking edge points of the immediately preceding frame may be generated in a previous computing cycle by using the same method as that used by the calculation apparatus 204 to calculate the tracking edge points of the current frame.

[0048] In the context of the invention, the immediately preceding frame refers to a preceding frame that participates in the calculation and is used to determine the observation edge points and tracking edge points therein. In a typical example, the immediately preceding frame refers to an adjacent previous frame. In other examples, not every frame in the image sequence is used to determine observation edge points and tracking edge points therein, and the immediately preceding frame in this case may also be a non-adjacent previous frame.

[0049] The calculation apparatus 204 is further configured to determine observation edge points, about the lane edges, of a current frame of the edge image sequence. The observation edge points are related to the various lane edges shown in FIG. 6, and specifically, may be feature points extracted from the various lane edges in the image sequence. The method used by the calculation apparatus 204 to extract feature points may be implemented according to existing technology or technology to be developed, and is not limited in this application. In addition, the tracking edge points in the context of this application are calculated based on the observation edge points, and therefore, the tracking edge points are indirectly calculated from the lane edges.

[0050] The calculation apparatus 204 is further configured to continue and correct the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame, to obtain temporary tracking edge points of the current frame. As the vehicle travels, the image obtaining apparatus 202 captures new image frames. When the vehicle travels to a current position, the image obtaining apparatus 202 captures a current frame. In this case, observation edge points can be extracted from the current frame and continued with the tracking edge points of the immediately preceding frame, to supplement the lane edge information brought about by the vehicle traveling to a new position. On the other hand, there may be differences in a coincident area between some observation edge points in the current frame and some tracking edge points in the immediately preceding frame; in this case, these tracking edge points can be corrected by using the observation edge points, thereby eliminating accumulated errors. Edge points formed after continuation and correction will continue to be used in the following steps, and are thus called temporary tracking edge points.

[0051] Certainly, in a specific case, there may be no tracking edge points of the immediately preceding frame, and in this case, all observation edge points will serve as the temporary tracking edge points of the current frame. In other words, the above-mentioned continuation and correction operations may be omitted (or it is considered that a do-nothing operation is performed). This processing approach to unusual situations is used throughout this application.

[0052] The calculation apparatus 204 is further configured to fit a lane edge curve based on the temporary tracking edge points. For example, the calculation apparatus 204 may fit the lane edge curve by means of a least square method based on the temporary tracking edge points. Although some outliers will be removed from these temporary tracking edge points, since outliers are usually few, the curve fitted by the calculation apparatus 204 can well reflect the features of the lane edges, without the need to remove the outliers and then fit the curve again.

[0053] The calculation apparatus 204 is further configured to exclude outliers from the temporary tracking edge points based on the lane edge curve, to form tracking edge points of the current frame. For example, if a distance between a certain point in the temporary tracking edge points and the lane edge curve is greater than a preset value, this point is determined to be a part of the outliers. After exclusion of the outliers, the remaining temporary tracking edge points are collected as "the tracking edge points of the current frame". The tracking edge points of the current frame will be input into the processing of an immediately following frame, and the above process can be repeated. For the definition of the immediately following frame, reference may be made to the foregoing definition of the immediately preceding frame.

[0054] The general principle of processing for unusual situations has been described above. In some embodiments of this application, there is no tracking edge point input before a first frame of the edge image sequence, and therefore, observation edge points of the first frame can be used as tracking edge points of the first frame, and continue to be used for subsequent processing.

[0055] The edge generation unit 206 of the lane edge extraction apparatus 20 is configured to output the lane edge curve. It can be seen from the above example that the tracking edge points are continuously updated iteratively during a calculation process. In each processing cycle, a parameter of an observation edge point captured at a current moment is input, and the fitted lane edge curve is output by the edge generation unit 206. The iterative update of the tracking edge points can effectively eliminate the accumulated errors, and therefore, the lane edge curve generated by using the above example can well reflect the features of the lane edges, which provides a reliable data guarantee for functions such as autonomous driving.

[0056] In some embodiments of this application, the calculation apparatus 204 is configured to determine the observation edge points of the current frame in a vehicle rectangular coordinate system, and to correct the tracking edge points of the immediately preceding frame in a vehicle polar coordinate system. The vehicle rectangular coordinate system is a reference system with a certain point of the vehicle itself (for example, the image obtaining apparatus) as the origin. With the vehicle rectangular coordinate system, motion information of the vehicle can be directly used to deduce the updated position of a measured point in that coordinate system. This configuration can significantly reduce the computational complexity.

[0057] However, in general, the image obtaining apparatus 202 captures images with its own position as the center of a circle (or sphere), and therefore, the imaging error is also related to this imaging principle. The determination of the observation edge points in the vehicle rectangular coordinate system has been described above; correcting the tracking edge points of the immediately preceding frame in the same rectangular coordinate system cannot well reflect the characteristics of the imaging error. Performing error correction in the vehicle polar coordinate system, by contrast, reflects the characteristics of circular (spherical) imaging and achieves a better correction effect than correction in the vehicle rectangular coordinate system. Although the additional coordinate transformation appears to increase computing overheads, the road edge curve can be efficiently fitted by fully utilizing the different characteristics of the two coordinate systems, which cannot be achieved with a conventional single coordinate system.

[0058] In some embodiments of this application, the calculation apparatus 204 is specifically configured to perform the following operations: determining current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system; mapping the observation edge points of the current frame and the tracking edge points of the immediately preceding frame from the vehicle rectangular coordinate system to the vehicle polar coordinate system; continuing and correcting, in the vehicle polar coordinate system, the tracking edge points of the immediately preceding frame by using the observation edge points of the current frame, to obtain the temporary tracking edge points; and mapping the temporary tracking edge points into the vehicle rectangular coordinate system. In this example, the continuation and correction are both performed in the polar coordinate system. In this way, the continuation and correction can be completed in one coordinate system, which simplifies processing.

[0059] Referring to FIG. 3, a tracking edge point 301 of an immediately preceding frame and an observation edge point 302 of a current frame are both shown in a vehicle rectangular coordinate system, and there is an overlapping part 303 between the two. The so-called overlapping part refers to the region over which the two point sets extend into each other. There may be a difference between the coordinates of the tracking edge point and the coordinates of the observation edge point in the overlapping part 303; the existence of two coordinate records for a same feature point indicates an error in the coordinate values. Therefore, the overlapping part 303 includes accumulated errors recorded in the tracking edge point 301 of the immediately preceding frame, and these accumulated errors can be corrected by using the observation edge point 302 of the current frame. In the previous example, the continuation and correction are both performed in polar coordinates, and therefore, no continuation or correction is performed in the rectangular coordinate system shown in FIG. 3.

[0060] FIG. 4 shows an example of mapping points in a rectangular coordinate system into a polar coordinate system. In the upper part of FIG. 4, a point set 401 corresponds to the tracking edge point 301 of the immediately preceding frame, a point set 402 corresponds to the observation edge point 302 of the current frame, and a point set 403 corresponds to the overlapping part 303 between the tracking edge point and the observation edge point. In the previous example, the continuation and correction are both performed in the polar coordinates. Specifically, a correction point set corresponding to the point set 403 can be obtained by correcting the overlapping part, and the correction point set can then be continued with the point set 401 and the point set 402. These point sets continued together are referred to as temporary tracking edge points, which will also be mapped back into the vehicle rectangular coordinate system.

[0061] In some embodiments of this application, the calculation apparatus 204 is specifically configured to perform the following operations: determining current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system; continuing, in the vehicle rectangular coordinate system, a non-coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame, to obtain a part of the temporary tracking edge points; mapping a coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame into the vehicle polar coordinate system, and correcting the tracking edge points of the immediately preceding frame by using the observation edge points of the current frame in the coincident part, to obtain a remaining part of the temporary tracking edge points; and mapping the remaining part of the temporary tracking edge points into the vehicle rectangular coordinate system. In this example, the continuation is carried out in rectangular coordinates, while the correction is carried out in polar coordinates. In this way, each operation is handled in the coordinate system whose characteristics suit it best, such that the processing efficiency can be improved.

[0062] Still referring to FIG. 3, the tracking edge point 301 of the immediately preceding frame can first be continued with the observation edge point 302 of the current frame in the rectangular coordinate system; that is to say, the non-coincident parts of the tracking edge point of the immediately preceding frame and the observation edge point of the current frame (i.e., 301 and 302 in FIG. 3) are first continued in the rectangular coordinate system. In this case, a part of the temporary tracking edge points can be generated (and the remaining part thereof will be generated in the polar coordinates).

[0063] In addition, FIG. 4 shows an example of mapping a point in a rectangular coordinate system into a polar coordinate system. In a lower part of FIG. 4, the point set 413 corresponds to the overlapping part 303 in FIG. 3. It can be seen that compared with the upper part of the figure, in the lower part, only the overlapping part is mapped. In this case, the tracking edge points of the immediately preceding frame can be corrected by using the observation edge points of the current frame in the overlapping part, to obtain a correction result, that is, the remaining part of the temporary tracking edge points, and the remaining part is also mapped back into the vehicle rectangular coordinate system. The part of the temporary tracking edge points and the remaining part of the temporary tracking edge points may then be combined to form the temporary tracking edge points.

[0064] Due to a dynamic change in vehicle positions, current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system need to be first determined in the above two examples. For example, the current positions of the tracking edge points of the immediately preceding frame may be determined based on a speed and a yaw angle of a vehicle and a time difference between the immediately preceding frame and the current frame. In this way, the positions of the tracking edge points can be updated directly based on the vehicle motion and the time difference.

[0065] In the above two examples, the correction operation is implemented in polar coordinates. In some embodiments of this application, the calculation apparatus 204 is specifically configured to perform the following process to implement the correction: where an angle of orientation of a first point in the tracking edge points of the immediately preceding frame is between a second point and a third point of the observation edge points of the current frame, performing linear interpolation by using amplitude values of the second point and the third point, to obtain an amplitude value of a fourth point having the same angle of orientation as the first point; and correcting an amplitude value of the first point by means of filtering based on the amplitude value of the first point and the amplitude value of the fourth point. Specifically, as shown in FIG. 5, points A and B in the vehicle rectangular coordinate system are observation edge points of the current frame, and a point C is a tracking edge point of the immediately preceding frame. In other words, the area formed by the points A, B, and C is the coincident (overlapping) part described in the context of this application. In this case, the points A, B, and C are mapped into the vehicle polar coordinate system. The right part of FIG. 5 shows the relative positional relationships of the points A, B, and C in the polar coordinate system, where the angle of orientation of the point C is between the points A and B. In some examples of this application, the coordinates of the point C may be corrected by using the coordinates of the points A and B. First, linear interpolation can be performed by using the points A and B, to obtain a point C' having the same angle of orientation as the point C. Then, filtering (e.g., Kalman filtering) can be performed on the points C and C', to obtain a corrected position point C", which is used as the new coordinates of the point C.

[0066] According to another aspect of this application, an autonomous driving system is provided, which includes any one of the lane edge extraction apparatuses as described above.

[0067] According to another aspect of this application, a vehicle is provided, which includes any one of the lane edge extraction apparatuses as described above or any one of the autonomous driving systems as described above.

[0068] According to another aspect of this application, there is provided a computer-readable storage medium storing instructions, where the instructions, when executed by a processor, cause the processor to perform any one of the lane edge extraction methods as described above. The computer-readable medium in this application includes various types of computer storage media, and may be any usable medium accessible to a general-purpose or special-purpose computer. For example, the computer-readable medium may include a RAM, a ROM, an EPROM, an EEPROM, a register, a hard disk, a removable hard disk, a CD-ROM or another optical memory, a magnetic disk memory or another magnetic storage device, or any other transitory or non-transitory medium that can carry or store expected program code in the form of instructions or data structures and that can be accessed by the general-purpose or special-purpose computer or a general-purpose or special-purpose processor. As used herein, a disk usually reproduces data magnetically, while a disc reproduces data optically by using lasers. Combinations of the above shall also fall within the scope of protection of computer-readable media. For example, the storage medium is coupled to a processor, so that the processor can read information from and write information to the storage medium. In an alternative solution, the storage medium may be integrated into the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In an alternative solution, the processor and the storage medium may reside as discrete components in a user terminal.

[0069] The foregoing descriptions are merely embodiments of this application, and are not intended to limit the protection scope of this application. Any feasible variation or replacement conceived by a person skilled in the art within the technical scope disclosed in this application shall fall within the scope of protection of this application. In the case of no conflict, the embodiments of this application and the features in the embodiments may also be combined with one another. The scope of protection of this application shall be subject to the recitations of the claims.


Claims

1. A lane edge extraction method, comprising:

receiving tracking edge points, about lane edges, of an immediately preceding frame of an edge image sequence;

determining observation edge points, about the lane edges, of a current frame of the edge image sequence;

continuing and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame, to obtain temporary tracking edge points of the current frame;

fitting a lane edge curve based on the temporary tracking edge points; and

excluding outliers from the temporary tracking edge points based on the lane edge curve, to form tracking edge points of the current frame.


 
2. The method according to claim 1, wherein tracking edge points of a first frame of the edge image sequence are observation edge points of the first frame.
 
3. The method according to claim 1 or 2, wherein the observation edge points of the current frame are determined in a vehicle rectangular coordinate system, and the tracking edge points of the immediately preceding frame are corrected in a vehicle polar coordinate system.
 
4. The method according to claim 3, wherein the continuing and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame, to obtain temporary tracking edge points of the current frame comprise:

determining current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system;

mapping the observation edge points of the current frame and the tracking edge points of the immediately preceding frame from the vehicle rectangular coordinate system to the vehicle polar coordinate system;

continuing and correcting, in the vehicle polar coordinate system, the tracking edge points of the immediately preceding frame by using the observation edge points of the current frame, to obtain the temporary tracking edge points; and

mapping the temporary tracking edge points into the vehicle rectangular coordinate system.


 
5. The method according to claim 3, wherein the continuing and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame, to obtain temporary tracking edge points of the current frame comprise:

determining current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system;

continuing, in the vehicle rectangular coordinate system, a non-coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame, to obtain a part of the temporary tracking edge points;

mapping a coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame into the vehicle polar coordinate system, and correcting the tracking edge points of the immediately preceding frame by using the observation edge points of the current frame in the coincident part, to obtain a remaining part of the temporary tracking edge points; and

mapping the remaining part of the temporary tracking edge points into the vehicle rectangular coordinate system.


 
6. The method according to claim 4 or 5, wherein the determining current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system comprises: determining the current positions of the tracking edge points of the immediately preceding frame based on a speed and a yaw angle of a vehicle and a time difference between the immediately preceding frame and the current frame.
 
7. The method according to claim 4, 5 or 6, wherein, where an angle of orientation of a first point in the tracking edge points of the immediately preceding frame is between a second point and a third point of the observation edge points of the current frame,

linear interpolation is performed by using amplitude values of the second point and the third point, to obtain an amplitude value of a fourth point having the same angle of orientation as the first point; and

an amplitude value of the first point is corrected by means of filtering based on the amplitude value of the first point and the amplitude value of the fourth point.


 
8. The method according to any one of claims 1 to 7, wherein

the lane edge curve is fitted by means of a least square method based on the temporary tracking edge points; and

it is determined that the fourth point is a part of the outliers if a distance between the fourth point in the temporary tracking edge points and the lane edge curve is greater than a preset value.


 
9. A computer-readable storage medium storing instructions, wherein the instructions, when executed by a processor, cause the processor to perform a lane edge extraction method, preferably the method of any one of claims 1 to 8, the method comprising:

receiving tracking edge points, about lane edges, of an immediately preceding frame of an edge image sequence;

determining observation edge points, about the lane edges, of a current frame of the edge image sequence;

continuing and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame, to obtain temporary tracking edge points of the current frame;

fitting a lane edge curve based on the temporary tracking edge points; and

excluding outliers from the temporary tracking edge points based on the lane edge curve, to form tracking edge points of the current frame.


 
10. A lane edge extraction apparatus, comprising:

an image obtaining apparatus configured to obtain an edge image sequence;

a calculation apparatus, preferably being adapted for performing the method of any one of claims 1 to 8, the apparatus being configured to:

receive tracking edge points, about lane edges, of an immediately preceding frame of the edge image sequence;

determine observation edge points, about the lane edges, of a current frame of the edge image sequence;

continue and correct the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame, to obtain temporary tracking edge points of the current frame;

fit a lane edge curve based on the temporary tracking edge points; and

exclude outliers from the temporary tracking edge points based on the lane edge curve, to form tracking edge points of the current frame; and

an edge generation unit configured to output the lane edge curve.


 
11. An autonomous driving system, wherein the autonomous driving system comprises the lane edge extraction apparatus of claim 10.
 
12. A vehicle, wherein the vehicle comprises the lane edge extraction apparatus of claim 10 or the autonomous driving system of claim 11.
 




Drawing

Search report