(19)
(11)EP 3 124 185 B1

(12)EUROPEAN PATENT SPECIFICATION

(45)Mention of the grant of the patent:
17.03.2021 Bulletin 2021/11

(21)Application number: 15770321.6

(22)Date of filing:  26.03.2015
(51)Int. Cl.: 
B25J 9/16  (2006.01)
B23K 9/127  (2006.01)
(86)International application number:
PCT/JP2015/001722
(87)International publication number:
WO 2015/146180 (01.10.2015 Gazette  2015/39)

(54)

ROBOT CONTROL METHOD

ROBOTERSTEUERUNGSVERFAHREN

PROCÉDÉ DE COMMANDE DE ROBOT


(84)Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

(30)Priority: 27.03.2014 JP 2014065178

(43)Date of publication of application:
01.02.2017 Bulletin 2017/05

(73)Proprietor: Panasonic Intellectual Property Management Co., Ltd.
Osaka 540-6207 (JP)

(72)Inventors:
  • KOMATSU, Takamichi
    Osaka-shi Osaka 540-6207 (JP)
  • YOSHIMA, Kazumasa
    Osaka-shi Osaka 540-6207 (JP)
  • IKEDA, Tatsuya
    Osaka-shi Osaka 540-6207 (JP)

(74)Representative: Vigand, Philippe et al
Novagraaf International SA Chemin de l'Echo 3
1213 Onex - Genève (CH)


(56)References cited:
EP-A1- 0 862 963
JP-A- H06 320 462
JP-A- 2001 071 286
JP-A- 2003 164 982
JP-A- 2011 138 275
US-A- 5 534 705
JP-A- H04 367 373
JP-A- H06 324 733
JP-A- 2003 053 539
JP-A- 2007 290 025
JP-A- 2011 206 830
US-A- 5 582 750
  
      
    Note: Within nine months from the publication of the mention of the grant of the European patent, any person may give notice to the European Patent Office of opposition to the European patent granted. Notice of opposition shall be filed in a written reasoned statement. It shall not be deemed to have been filed until the opposition fee has been paid. (Art. 99(1) European Patent Convention).


    Description

    TECHNICAL FIELD



    [0001] The disclosure relates to robot control methods for moving a work tool along a bent processing line, based on information from a sensor that is mounted on the work tool and recognizes the shape of a workpiece.

    BACKGROUND ART



    [0002] Recently, higher-quality work has been demanded of industrial robots. To meet this demand, a sensor for recognizing the shape of a workpiece is additionally mounted on the work tool attached to the tip of the robot. This sensor recognizes deviations from the teaching points set before processing that may occur during processing, and modifies the processing points in real time.

    [0003] PTL1 and PTL2 describe a control method for a welding robot to which a laser sensor and a welding torch are attached at its tip.

    [0004] EP 0 862 963 A1 discloses a welding robot control method for welding two plate-like workpieces along a straight corner line, that deals with the problem of tack welded portions.

    [0005] US 5 582 750 A discloses a control method for a robot which performs various operations in such a manner that a taught track is corrected by using detected signals from a sensor, and more particularly, to a robot control method in which a tool is moved along a corrected track for carrying out weaving welding or overlap welding.

    [0006] A conventional robot control method is described with reference to Fig. 17. Fig. 17 is a schematic view of a conventional welding robot. Conventional welding robot 401 includes welding torch 402 and laser sensor 403. Laser sensor 403 is provided ahead of welding torch 402 in the welding advancing direction to detect the shape of workpiece W. When welding workpiece W, welding robot 401 moves welding torch 402 and laser sensor 403 from teaching point A1 to teaching point B1, both set before starting welding, in a state where the output of laser sensor 403 is turned ON. Laser sensor 403 recognizes a point where the shape of workpiece W changes (a point where a step appears) as welding start point A2, and welding by welding torch 402 starts from welding start point A2. Laser sensor 403 recognizes a point where the shape of workpiece W changes again (a point where the step disappears) as welding end point B2, and welding by welding torch 402 ends at welding end point B2.

    [0007] Laser sensor 403 continues to detect the shape of workpiece W while welding torch 402 welds workpiece W, and modifies the welding point of welding torch 402 accordingly. This enables welding that accommodates any displacement of the welding point that may occur during welding.

    Citation List


    Patent Literature



    [0008] 

    PTL1 Japanese Patent Unexamined Publication No. H8-39470

    PTL2 Japanese Patent Unexamined Publication No. 2007-185666


    SUMMARY



    [0009] In the conventional robot control method, however, the shape of workpiece W to be detected by laser sensor 403 needs to be fixed, and only linear welding can be performed. Therefore, for example, to weld workpieces W along an L-shaped welding line from point C to point E via point D, welding must be carried out in two operations. In other words, welding takes place from point C to point D in the first step, and then from point D to point E in the second step. The L-shaped welding line thus cannot be welded continuously, and the desired bead shape cannot be achieved at the portion bent in the L shape. Work efficiency is also low.

    [0010] An object of the present invention is to provide a robot control method that achieves continuous processing even when the processing line is bent, such as in an L shape, so as to achieve the required finish and also higher work efficiency.

    [0011] The problem is solved by a method according to claim 1.

    [0012] As described above, the robot control method in the disclosure enables continuous processing that achieves the required finish even when a bent portion exists in the processing line. The work efficiency can also be further improved.

    BRIEF DESCRIPTION OF DRAWINGS



    [0013] 

    Fig. 1 is a schematic diagram of a robot system in accordance with an exemplary embodiment.

    Fig. 2 is a perspective view illustrating a processing step in accordance with the exemplary embodiment.

    Fig. 3 is a block diagram of robot controller and sensor controller in accordance with the exemplary embodiment.

    Fig. 4 illustrates a result of detecting a shape of workpiece by a sensor in accordance with the exemplary embodiment.

    Fig. 5 is a perspective view illustrating detection of the shape of workpiece by the sensor in accordance with the exemplary embodiment.

    Fig. 6 illustrates a result of detecting the shape of workpiece by the sensor in accordance with the exemplary embodiment.

    Fig. 7 is a perspective view illustrating teaching points in accordance with the exemplary embodiment.

    Fig. 8 is a sectional view illustrating an attitude of a welding torch relative to workpieces in accordance with the exemplary embodiment.

    Fig. 9 is a perspective view illustrating the attitude of the welding torch relative to the workpieces at each teaching point in accordance with the exemplary embodiment.

    Fig. 10 is a flowchart illustrating a robot control method in accordance with the exemplary embodiment.

    Fig. 11 illustrates a method of generating an abnormality decision point from the teaching point in accordance with the exemplary embodiment.

    Fig. 12 illustrates calculation of an amount of modification of a welding point from the shape of workpiece detected by the sensor.

    Fig. 13 illustrates notification of the amount of modification of the welding point to the robot controller.

    Fig. 14 illustrates modification of an interpolation point based on modification of the welding point by the robot controller.

    Fig. 15 illustrates transmission of an amount of modification from detection of the edge of the workpiece by the sensor to sending of an end point notice to the robot controller.

    Fig. 16 illustrates modification of teaching points in accordance with the exemplary embodiment. (a) is the case when the end point notice is received at the back of the teaching point in the welding advancing direction. (b) is the case when the end point notice is received in front of the teaching point in the welding advancing direction.

    Fig. 17 is a schematic view of a conventional welding robot.

    Fig. 18 is a perspective view illustrating a disadvantage of a conventional robot system.


    DESCRIPTION OF EMBODIMENTS


    (EXEMPLARY EMBODIMENT)



    [0014] The exemplary embodiment is described with reference to Fig. 1 to Fig. 16.

    (Configuration of robot system 100)



    [0015] Fig. 1 is a schematic diagram of robot system 100. In Fig. 1, robot system 100 includes robot controller 110, manipulator 120, sensor controller 130, and welding power supply unit 140. Welding wire feeder 121, welding wire 122, welding torch 123 (work tool), and sensor 131 are provided on manipulator 120.

    [0016] Robot controller 110 typically has CPU (Central Processing Unit, not illustrated) and memory (not illustrated) inside for executing various calculations at high speed. Robot controller 110 is connected to manipulator 120 to control the operation of manipulator 120. Robot controller 110 is connected to sensor controller 130 to control sensor controller 130. Robot controller 110 is connected to welding power supply unit 140 to control welding power supply unit 140.

    [0017] Manipulator 120 is configured with multiple servo motors, and robot controller 110 controls manipulator 120 to conduct various operations. Welding torch 123 is provided at a tip of manipulator 120, and welding torch 123 has a gas nozzle (not illustrated) at its tip. The gas nozzle supplies shielding gas fed from a gas canister (not illustrated) to a welding point of workpiece W based on a command from welding power supply unit 140. A contact chip (not illustrated) is also attached to the tip of welding torch 123. Welding wire 122 is supplied and power is supplied through this contact chip of welding torch 123.

    [0018] Welding power supply unit 140 typically includes an output part (not illustrated) for flowing welding current by applying welding voltage, a voltage detection part (not illustrated) for detecting the welding voltage, and a welding wire control part (not illustrated) for controlling welding wire feeder 121. The output part of welding power supply unit 140 is electrically coupled to welding torch 123 and workpiece W. The output part of welding power supply unit 140 applies the welding voltage between welding wire 122, which is a consumable electrode, and workpiece W based on a command from robot controller 110.

    [0019] Welding wire feeder 121 is mounted on an upper part of manipulator 120. Welding wire feeder 121 includes a feeding motor with a guide roller, and an angle detector that detects the rotation angle of the feeding motor using an angular sensor (not illustrated), such as an encoder. Welding wire feeder 121 is controlled by welding power supply unit 140, and feeds welding wire 122, which is a consumable electrode, to welding torch 123.

    [0020] When welding starts according to a command from robot controller 110, welding power supply unit 140 applies the welding voltage between workpiece W and welding wire 122, and also controls welding wire feeder 121 such that welding wire 122 is fed at a feeding speed determined by the command current. This generates an arc between the fed welding wire 122 and workpiece W, and welding takes place by droplet transfer from welding wire 122 to workpiece W.

    [0021] Sensor controller 130 is connected to sensor 131 to control sensor 131. Sensor 131 can two-dimensionally or three-dimensionally detect the shape (surface shape) of workpiece W in a noncontact manner. A known method for detecting the shape of workpiece W in a noncontact manner with sensor 131 uses a laser beam. Laser-based detection methods include calculating the distance from the time it takes a pulse-waveform laser beam emitted from sensor 131 to be reflected by the workpiece and return, and from the angle of the returning laser beam reflected by workpiece W. In general, sensor 131 detects the shape of workpiece W by scanning a broad range with the laser beam, using an oscillating mirror.
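As an illustration of the time-of-flight principle mentioned above, the round-trip travel time of a reflected laser pulse can be converted to a one-way distance. This sketch is not part of the patent; the function name is an assumption.

```python
# Illustrative time-of-flight conversion (not from the patent): a pulse
# travels to the workpiece and back, so the one-way distance is half of
# the speed of light multiplied by the measured round-trip time.
def tof_distance_m(round_trip_s: float, c: float = 299_792_458.0) -> float:
    """Return the one-way sensor-to-workpiece distance in meters."""
    return c * round_trip_s / 2.0
```

For a 2 ns round trip this gives roughly 0.3 m, which illustrates why such sensors require high-resolution timing (or the angle-based triangulation variant) for millimeter-scale workpieces.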

    (Modification of welding point by sensor 131)



    [0022] The control of sensor 131 is further described with reference to Fig. 2 to Fig. 6. Fig. 2 is a perspective view illustrating a processing step in the exemplary embodiment. Fig. 3 is a block diagram of robot controller 110 and sensor controller 130 in the exemplary embodiment. Fig. 4 illustrates a result of detecting the shape of workpiece by the sensor. Fig. 5 illustrates detection of the shape of workpiece by the sensor in the exemplary embodiment. Fig. 6 illustrates a result of detecting the shape of workpiece by the sensor in the exemplary embodiment.

    [0023] As shown in Fig. 2, sensor 131 is provided in front of welding torch 123 in the welding advancing direction. Distance L between sensor 131 and welding torch 123 is equivalent to a distance between a position on workpiece scanned by sensor 131 and a welding point that is the tip of welding wire 122.

    [0024] As shown in Fig. 3, robot controller 110 includes sensor control part 111, trajectory arithmetic part 112, interpolation calculation part 113, and first data communication part 114. Sensor control part 111 sends a sensor output command to sensor controller 130 via first data communication part 114 based on a command in a teaching program stored in robot controller 110.

    [0025] As shown in Fig. 3, sensor controller 130 includes laser output control part 132, laser input control part 133, input data processing part 134, modification calculation part 135, data buffer 136, and second data communication part 137. When sensor controller 130 receives a sensor output command from robot controller 110, laser output control part 132 controls laser sensor 138, which is an example of sensor 131, and laser sensor 138 outputs a laser beam. Here, by using the oscillating mirror (not illustrated), sensor controller 130 outputs the laser beam in a radial fashion onto workpiece W, as shown in Fig. 2. Laser input control part 133 receives, from laser sensor 138, information on the received reflected light of the laser beam output in a radial fashion. Input data processing part 134 then applies arithmetic processing to the received laser beam (reflected light). Input data processing part 134 expresses and plots each element point as a coordinate in a sensor coordinate system. In this way, as shown in Fig. 4, the shape of workpiece W is detected, and the coordinate of feature point P can be calculated. As shown in Fig. 2 and Fig. 4, in the exemplary embodiment, the welding advancing direction is the Z axis, the horizontal direction (within the plane of workpiece W) relative to the welding advancing direction is the X axis, and the vertical direction (vertical direction of workpiece W) relative to the welding advancing direction is the Y axis. This feature point P is the target welding point of welding torch 123.

    [0026] When laser sensor 138 outputs a laser beam at the position shown in Fig. 5, the shape of workpiece W shown in Fig. 6 is detected. In this case, feature point P cannot be detected. Accordingly, the point at which feature point P shifts between presence and absence is evidently an edge of one of the two workpieces W.

    (Operation of robot system 100)



    [0027] The operation of robot system 100 as configured above is described with reference to Fig. 7 to Fig. 15. Fig. 7 is a perspective view illustrating teaching points P1 to P5 in the exemplary embodiment. Fig. 8 is a sectional view illustrating an attitude of welding torch 123 relative to workpiece W in the exemplary embodiment. Fig. 9 is a perspective view illustrating an attitude of welding torch 123 relative to workpiece W at teaching points P2 to P4. Fig. 10 is a flowchart of the robot control method in the exemplary embodiment. Fig. 11 illustrates a method of generating abnormality decision point P2a from teaching point P2 in the exemplary embodiment. Fig. 12 illustrates calculation of an amount of modification of the welding point from the shape of workpiece W detected by sensor 131 in the exemplary embodiment. Fig. 13 illustrates notification of the amount of modification of the welding point to robot controller 110 in the exemplary embodiment. Fig. 14 illustrates modification of an interpolation point based on modification of the welding point by robot controller 110 in the exemplary embodiment. Fig. 15 illustrates transmission of an amount of modification from detection of an edge of workpiece W to sending of an end point notice to robot controller 110 in the exemplary embodiment. Fig. 16 illustrates modification of teaching points in the exemplary embodiment. (a) is the case when the end point notice is at the back of the teaching point in the welding advancing direction. (b) is the case when the end point notice is in front of the teaching point in the welding advancing direction.

    [0028] The robot control method in the exemplary embodiment modifies the operation of manipulator 120 in real time, based on the shape of workpiece W obtained by sensor 131. Still more, when sensor 131 detects the edge of workpiece W, the robot control method in the exemplary embodiment makes it possible to weld the corner of workpiece W while changing the angle of welding torch 123, taking into account displacement at the edge. This achieves high-quality welding also at the corner of workpiece W. Furthermore, the welding work can be continued without stopping at the corner. Each step of the robot control method in the exemplary embodiment is detailed below.

    (Teaching step)



    [0029] Fig. 7 shows an example of teaching when no thermal strain, or only negligibly small strain, is generated by welding workpieces W. In Fig. 7, teaching points P1 to P5 are set (taught) in this order in the welding advancing direction. The teaching step is conducted before the welding step. In this step, the positions of teaching points P1 to P5, the attitude of welding torch 123 at each teaching point, the trajectory of the welding line (straight or arc) between two teaching points, and the attitude of welding torch 123 between two teaching points are set. This creates a program for the movement of welding torch 123 in the welding step. Still more, in the teaching step, interpolation points are automatically set between two teaching points at a constant interval, based on the trajectory between the two teaching points, so as to set the movement of welding torch 123 more finely. In the exemplary embodiment, the trajectories of the welding lines between the teaching points are all straight lines.
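The automatic generation of interpolation points at a constant interval along a straight trajectory between two teaching points can be sketched as follows. This is a minimal illustration only; the function name, tuple representation, and interval handling are assumptions, not part of the patent.

```python
import math

# Sketch (names assumed): divide the straight line between two teaching
# points into sub-segments no longer than `interval`, returning both
# teaching points and the interpolation points between them.
def interpolate_segment(p_start, p_end, interval):
    delta = [e - s for s, e in zip(p_start, p_end)]
    length = math.sqrt(sum(d * d for d in delta))
    n = max(1, int(length // interval))  # number of sub-segments
    return [
        tuple(s + (i / n) * d for s, d in zip(p_start, delta))
        for i in range(n + 1)
    ]
```

For example, a straight 10 mm section divided at a 2 mm interval yields six points, including both teaching points.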

    [0030] Next is detailed teaching of an attitude of welding torch 123 in the exemplary embodiment with reference to Fig. 7 and Fig. 8.

    [0031] As shown in Fig. 7, workpieces W are welded in an L-shape welding line, starting from teaching point P1 at an end of workpiece W. Fig. 8 illustrates an attitude of welding torch 123 from teaching point P1 to teaching point P2 when seen from the welding advancing direction side. As shown in Fig. 8, welding torch 123 is taught to operate (move) retaining an appropriate angle relative to workpieces W, typically determined by the size of workpieces W and welding conditions. In the same way as from teaching point P1 to teaching point P2, welding torch 123 is taught to move retaining an appropriate angle relative to workpieces W from teaching point P4 to teaching point P5.

    [0032] Next is described teaching of the attitude of welding torch 123 from teaching point P2 to teaching point P4 via teaching point P3, which is the corner, with reference to Fig. 9. Fig. 9 is a perspective view magnifying the corner of workpiece W. Teaching point P2 and teaching point P4 are teaching points not far from teaching point P3, which is the edge of workpiece W. More specifically, distance D between teaching point P2 and teaching point P3, and distance D between teaching point P4 and teaching point P3, are shorter than distance L between welding torch 123 and sensor 131. As described above, welding torch 123 is taught to move retaining a certain attitude relative to workpieces W from teaching point P1 to teaching point P2 and from teaching point P4 to teaching point P5. However, the attitudes of welding torch 123 differ largely between the section from teaching point P1 to teaching point P2 and the section from teaching point P4 to teaching point P5. In the exemplary embodiment, the attitude of welding torch 123 is rotated by 90°. Therefore, as shown in Fig. 9, the attitude of welding torch 123 (torch angle) is successively changed from teaching point P2 to teaching point P4 via teaching point P3. At teaching point P3, an intermediate attitude may be taught between the attitude of welding torch 123 from teaching point P1 to teaching point P2 and the attitude of welding torch 123 from teaching point P4 to teaching point P5. From teaching point P2 to teaching point P3, the attitude of welding torch 123 (torch angle) is successively changed based on the teaching program stored in robot controller 110.
In the same way, from teaching point P3 to teaching point P4, the attitude of welding torch 123 (torch angle) is successively changed to achieve an appropriate attitude (torch angle) on arriving at teaching point P4. By teaching the attitudes of welding torch 123 in this way, the corner of workpiece W can be welded smoothly. Alternatively, the attitude of welding torch 123 may be changed by rotating about an axis perpendicular to workpiece W, from the attitude of welding torch 123 at teaching point P2 to the attitude of welding torch 123 at teaching point P4. In other words, the attitude may be rotated while retaining a predetermined angle formed by welding torch 123 and workpiece W. The rotation speed may be fixed, or accelerated from teaching point P2 to teaching point P3 and decelerated from teaching point P3 to teaching point P4.
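The successive change of the torch angle through the corner, with either a fixed rotation speed or acceleration toward P3 and deceleration after it, can be sketched as below. This is only an illustration under assumed names; the patent does not prescribe a particular interpolation formula.

```python
# Sketch (function name and profiles assumed): torch angle as a function
# of normalized progress t, where t = 0 at teaching point P2, t = 0.5 at
# the corner (teaching point P3), and t = 1 at teaching point P4.
def torch_angle(t, start_deg=0.0, end_deg=90.0, profile="linear"):
    if profile == "ease":
        # smoothstep: accelerates up to the corner, decelerates after it
        s = 3.0 * t * t - 2.0 * t * t * t
    else:
        s = t  # fixed rotation speed
    return start_deg + (end_deg - start_deg) * s
```

Both profiles reach the same intermediate attitude (45° here) at the corner; the "ease" profile merely redistributes the rotation speed over the section.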

    (Welding step)



    [0033] Next is described the robot control method, which can flexibly handle strain on workpieces W caused by welding heat during welding, after the aforementioned teaching step has been conducted.

    [0034] Fig. 10 is a flowchart illustrating a series of steps in the robot control method by robot system 100 in the exemplary embodiment. When robot system 100 starts auto-operation after being taught, welding torch 123 first moves to teaching point P1. Then, trajectory arithmetic part 112 generates abnormality decision point P2a, instead of teaching point P2, at a point extending from teaching point P2 by predetermined distance M in the welding advancing direction. Trajectory arithmetic part 112 then generates a linear trajectory from teaching point P1 to abnormality decision point P2a, and welding torch 123 starts to move. Here, trajectory arithmetic part 112 calculates distance D from teaching point P2 to teaching point P3, and notifies sensor controller 130 of it. Trajectory arithmetic part 112 also notifies sensor controller 130 of movement speed v of welding torch 123 via sensor control part 111 and first data communication part 114 (STEP 1).

    [0035] Next, robot system 100 starts to move welding torch 123, and also starts welding (STEP 2). At this point, robot system 100 also starts the profile modifying control using sensor 131 (STEP 3). When the profile modifying control starts, sensor 131 starts to detect the shape of workpiece W. Fig. 12 shows an example of detection of the shape of workpiece W. For example, when sensor 131 detects the shape indicated by the solid line in Fig. 12, modification calculation part 135 compares it with the shape taught in the teaching step. Feature point P detected by sensor 131 and feature point Q taught in the teaching step may be displaced from each other in the sensor coordinate system. This displacement is caused typically by thermal strain on workpieces W due to welding. By modifying this displacement during welding, the control can also accommodate thermal strain arising during welding. If teaching takes place after workpiece W is placed, there is no placement error of workpiece W. However, if multiple workpieces are welded using a program for which the teaching step was conducted once, displacement due to a placement error of workpiece W may occur in addition to that due to thermal strain. The control can also accommodate displacement due to such a placement error.

    [0036] Displacement at this feature point can be expressed by values in the sensor coordinate system. In general, teaching takes place using a two-dimensional sensor coordinate system (X axis and Y axis) perpendicular to the welding advancing direction (Z axis), as shown in Fig. 12. Therefore, when feature point P displaced relative to feature point Q is obtained, as shown in Fig. 12, displacement along the X axis is displacement in the horizontal direction relative to the welding advancing direction, and displacement along the Y axis is displacement in the height direction (vertical direction) relative to the welding advancing direction. After modification calculation part 135 calculates the displacement in the horizontal direction and the vertical direction relative to the welding advancing direction, sensor controller 130 stores these displacement values in data buffer 136.
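The modification amount computed here reduces to a per-axis difference between taught feature point Q and detected feature point P in the sensor coordinate system. A minimal sketch, with assumed names:

```python
# Sketch (names assumed): displacement of detected feature point P from
# taught feature point Q in the sensor coordinate system; X is horizontal
# and Y is vertical relative to the welding advancing direction (Z).
def displacement(taught_q, detected_p):
    dx = detected_p[0] - taught_q[0]  # horizontal modification amount
    dy = detected_p[1] - taught_q[1]  # vertical modification amount
    return (dx, dy)
```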

    [0037] Fig. 13 shows the displacement values retained in data buffer 136. As shown in Fig. 13, data buffer 136 includes a buffer for the horizontal direction and a buffer for the vertical direction relative to the welding line. One buffer entry is provided per sampling interval Ts at which an amount of modification is sent from sensor controller 130 to robot controller 110, and the entire data buffer can retain entries covering time T. Here, time T is the value obtained by dividing distance L between sensor 131 and welding torch 123, preset in sensor controller 130, by movement speed v of welding torch 123.
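The buffer arrangement described above behaves like a FIFO of depth T / Ts: a sample written now is read out after time T, when the torch reaches the spot the sensor measured. A sketch under assumed names follows; the patent does not specify a concrete data structure.

```python
from collections import deque

# Sketch (class and method names assumed): a FIFO delaying modification
# amounts by time T = L / v, with one entry per sampling interval Ts.
class DelayBuffer:
    def __init__(self, L, v, Ts):
        self.depth = round((L / v) / Ts)  # number of entries covering time T
        self.buf = deque([(0.0, 0.0)] * self.depth, maxlen=self.depth)

    def push(self, horizontal, vertical):
        """Store the newest sample and return the one delayed by time T."""
        delayed = self.buf[0]  # oldest entry, about to be evicted
        self.buf.append((horizontal, vertical))
        return delayed
```

With illustrative values L = 100 mm, v = 10 mm/s and Ts = 1 s, a displacement sample re-emerges ten samples later, exactly when the torch has covered distance L.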

    [0038] Transmission of the displacement value in the horizontal direction and the displacement value in the vertical direction detected by sensor 131 is delayed by time T, and then the displacement values are sent to robot controller 110 via second data communication part 137. When robot controller 110 receives an amount of modification via first data communication part 114, the received displacement values in the horizontal direction and the vertical direction relative to the welding advancing direction are converted to displacement values in the robot coordinate system. Then, robot controller 110 adds the displacement values to the set interpolation points to modify them, in order to achieve a modified trajectory (welding line). Interpolation calculation part 113 calculates the attitude of each shaft of manipulator 120 by inverse kinematics computing, so as to weld at the modified interpolation points. Robot controller 110 then sends a command to the motor of each shaft of manipulator 120 to take the calculated attitude.

    [0039] An effect of the above operation is described. Sensor 131 always detects the shape of workpiece W ahead of welding torch 123 by distance L. Therefore, displacement detected by sensor 131 is displacement that will be encountered when welding torch 123 advances farther by distance L. Sensor controller 130 thus delays transmission by time T, obtained by dividing distance L between sensor 131 and welding torch 123 by movement speed v of welding torch 123, using the data buffer. After delaying transmission by time T, sensor controller 130 sends the amount of modification to robot controller 110. Robot controller 110 reflects the received amount of modification in the trajectory (welding line) through the aforementioned processing in interpolation calculation part 113. Accordingly, displacement detected by sensor 131 is incorporated, and welding torch 123 can weld at a position taking into account the displacement of workpiece W during welding, as shown in Fig. 14.

    [0040] Next is described a system for detecting an edge of workpiece W. Welding torch 123 moves toward abnormality decision point P2a created in STEP 1 under the profile modifying control described in STEP 3. If welding torch 123 reaches abnormality decision point P2a without the edge of workpiece W being detected, robot system 100 stops with an error, based on a decision that the edge of workpiece W is drastically out of position (Yes in STEP 4).

    [0041] Sensor 131 always moves ahead of welding torch 123 by distance L. Therefore, sensor 131 can detect the edge of workpiece W ahead of welding torch 123. As described with reference to Fig. 6, sensor 131 detects the shape of workpiece W without a feature point when sensor 131 reaches the edge of workpiece W. When the shape without a feature point is still detected after a predetermined time passes, sensor controller 130 determines that sensor 131 has reached the edge of workpiece W, and stops storing modification amounts in the buffer. Instead, an edge detection flag is turned ON. Even after the edge of workpiece W is detected, sensor controller 130 continues to send modification amounts to robot controller 110 for a while. This is because welding torch 123 is still moving at a position before the edge of workpiece W when sensor 131 detects the edge. Displacement at positions before the edge of workpiece W is stored in the data buffer of sensor controller 130. Therefore, the profile modifying control continues, as shown in Fig. 15, by sequentially sending the modification amounts in the data buffer to robot controller 110.

    [0042] Then, sensor controller 130 turns ON the edge detection flag of workpiece W, and calculates time Te = (L - D) / v, using distance L between sensor 131 and welding torch 123, distance D between teaching point P2 and teaching point P3, and movement speed v of welding torch 123. After time Te passes, sensor controller 130 notifies robot controller 110 via second data communication part 137 that the edge of workpiece W has been detected, i.e., that welding torch 123 has reached a point equivalent to teaching point P2. The modified point equivalent to teaching point P2 is set as the first modified point.

    [0043] Now, the effect of time Te is described. Detection of the edge of workpiece W by sensor 131 essentially means detection of teaching point P3. After sensor 131 detects teaching point P3, welding torch 123 reaches teaching point P3 by moving distance L, which is the distance between welding torch 123 and sensor 131, at movement speed v. Therefore, arrival of welding torch 123 at a point equivalent to teaching point P3 could be notified after a time of distance L / movement speed v passes. However, in the exemplary embodiment, the attitude of welding torch 123 (torch angle) starts to be changed before teaching point P3, so arrival must be notified at teaching point P2, which is distance D before teaching point P3 (the edge of workpiece W). Accordingly, time Te from detection of the edge of workpiece W to arrival of welding torch 123 at teaching point P2 is calculated, so that welding torch 123 moves a distance equal to distance L minus distance D.
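The relation Te = (L - D) / v can be checked numerically; the function name and the values below are illustrative only and do not appear in the patent.

```python
# Sketch: time from edge detection by the sensor until the torch reaches
# the point equivalent to teaching point P2 (distance D short of the edge).
def end_point_notice_delay(L, D, v):
    return (L - D) / v
```

For example, with L = 100 mm, D = 20 mm and v = 10 mm/s, the end point notice would be issued 8 s after the edge is detected.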

    [0044] Next is described the processing in robot controller 110 after receiving the end point notice from sensor controller 130 (YES in STEP 5), with reference to Fig. 16. Fig. 16 is a top view of Fig. 7, illustrating teaching points P1 to P5. For convenience of description, vertical displacement is omitted in Fig. 16.

    [0045] Fig. 16 (a) shows an example in which the position at which the end point notice is received lies before teaching point P2 in the welding advancing direction. The position at which the end point notice is received is a modified point displaced from teaching point P2 by a component synthesizing the displacement caused by the profile modifying control and the displacement caused by the end detection function. In the exemplary embodiment, modified point P2b (first modified point) at the time the end point notice is received is regarded as the position of reaching teaching point P2. Since the profile modifying control is to be stopped at teaching point P2, the profile modifying control is stopped at modified point P2b, at which the end point notice is received (STEP 6).

    [0046] Trajectory arithmetic part 112 then calculates the difference (Δx, Δy, Δz), that is, the displacement of modified point P2b relative to teaching point P2, and generates modified point P3b (second modified point) by adding the same amount of difference to teaching point P3 (STEP 7). Furthermore, trajectory arithmetic part 112 regenerates a trajectory for welding torch 123 to move from modified point P2b to modified point P3b, and welding torch 123 continues to move (STEP 8). Movement of welding torch 123 from teaching point P2 to teaching point P3 takes place while the attitude of welding torch 123 (torch angle) is successively changed. In the same way, movement of welding torch 123 from modified point P2b to modified point P3b takes place while the attitude of welding torch 123 (torch angle) is successively changed.
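STEP 7 amounts to a rigid shift of the remaining teaching points by the difference (Δx, Δy, Δz). The following sketch uses hypothetical coordinates; only the point names are taken from the description:

```python
def shift_points(points, p2, p2b):
    """Shift each teaching point by the displacement (dx, dy, dz) of
    modified point P2b relative to teaching point P2, yielding the
    modified points P3b, P4b, P5b of STEP 7."""
    delta = tuple(b - t for b, t in zip(p2b, p2))
    return [tuple(c + d for c, d in zip(p, delta)) for p in points]

# Hypothetical coordinates in millimetres.
p2, p2b = (100.0, 0.0, 0.0), (101.5, -0.5, 0.2)
p3, p4, p5 = (110.0, 0.0, 0.0), (112.0, 5.0, 0.0), (112.0, 50.0, 0.0)
p3b, p4b, p5b = shift_points([p3, p4, p5], p2, p2b)
# p3b == (111.5, -0.5, 0.2)
```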

    [0047] When welding torch 123 reaches modified point P3b (Yes in STEP 9), trajectory arithmetic part 112 generates modified point P4b (third modified point) by adding the amount of difference between teaching point P2 and modified point P2b to teaching point P4, in the same way as for modified point P3b, and generates a trajectory from modified point P3b to modified point P4b. Movement of welding torch 123 from modified point P3b to modified point P4b also takes place while the attitude of welding torch 123 (torch angle) is successively changed, in the same way as the movement from teaching point P3 to teaching point P4. Then, when welding torch 123 reaches modified point P4b (Yes in STEP 10), a trajectory is regenerated toward modified point P5b (fourth modified point), obtained by adding the same amount of difference to teaching point P5, and welding torch 123 moves toward modified point P5b. Here, the profile modifying control restarts for the section from modified point P4b to modified point P5b, just as it would restart at teaching point P4 (STEP 11). Welding torch 123 moves to modified point P5b while the profile modifying control is applied. When welding torch 123 reaches a point equivalent to modified point P5b, the work ends (Yes in STEP 12). Modified point P4b and modified point P5b may be calculated at the same time as the difference between teaching point P2 and modified point P2b is calculated.

    [0048] The step of welding from teaching point P1 to modified point P2b is the first processing step, the step of modifying teaching points P2 to P5 into modified points P2b to P5b is the modifying step, the welding from modified point P2b to modified point P4b is the second processing step, and the welding from modified point P4b to modified point P5b is the third processing step.
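The division of the work into the sections named in paragraph [0048] can be tabulated in a short sketch; the section labels come from the description, while the tuple representation and point labels are purely illustrative:

```python
def section_plan(p1, p2b, p3b, p4b, p5b):
    """Return the welding sections of paragraph [0048] as
    (name, start, end, profile-modifying-control state) tuples."""
    return [
        ("first processing step",  p1,  p2b, "profile modifying control ON"),
        ("second processing step", p2b, p4b,
         "control OFF, attitude changed via " + str(p3b)),
        ("third processing step",  p4b, p5b, "profile modifying control ON"),
    ]

# Hypothetical point labels instead of coordinates.
plan = section_plan("P1", "P2b", "P3b", "P4b", "P5b")
```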

    [0049] Fig. 16 (b) shows an example in which the position at which the end point notice is received is ahead of teaching point P2 in the welding advancing direction. A control method in this case can be achieved by the same control as that described for Fig. 16 (a).

    [0050] Next, an effect of the exemplary embodiment is described.

    [0051] In the exemplary embodiment, the profile modifying control by sensor 131 stops at a position preceding the edge of workpiece W by distance D. Then, the corner of workpiece W is welded while the attitude of welding torch 123 (torch angle) is changed based on the operation program stored in robot controller 110. If the attitude of welding torch 123 were changed at the corner of workpiece W while the profile modifying control is applied, the positional relationship between sensor 131, which is fixed on welding torch 123, and workpiece W would also change, and the position of workpiece W could no longer be recognized properly. Accordingly, as in the exemplary embodiment, the profile modifying control is stopped before the corner of workpiece W is welded with the attitude of welding torch 123 being changed, so that the correct position of workpiece W can be identified. However, if distance D between teaching point P2 and teaching point P3 is enlarged, the period of profile modifying control becomes shorter. Distance D is thus preferably limited to the distance needed for changing the attitude of welding torch 123.

    [0052] As described above, the bead at the corner of workpiece W can be formed into the required shape by temporarily stopping the profile modifying control by sensor 131 and welding the corner of workpiece W while changing the attitude of welding torch 123. Welding at the corner of workpiece W covers only a partial section of the entire welding section of workpiece W. Accordingly, only negligible displacement occurs due to the strain generated by welding the corner section, and the temporary stoppage of the profile modifying control is thus not a problem.

    [0053] Furthermore, distance L and distance D are stored in advance in sensor controller 130 in the exemplary embodiment. Distance L is the distance between welding torch 123 and sensor 131. Distance D is the distance between teaching point P3, which is the edge of workpiece W, and teaching point P2 before the edge. The time at which welding torch 123 reaches modified point P2b, equivalent to teaching point P2, is calculated based on distance L, distance D, and speed v, and is notified to robot controller 110 once sensor 131 detects teaching point P3, the edge of workpiece W. This enables the operation according to the teaching to be executed from a position preceding the position detected by sensor 131 by predetermined distance D, even if the edge of workpiece W is displaced, typically due to thermal strain caused by welding.

    [0054] Furthermore, in the exemplary embodiment, abnormality decision point P2a for receiving the end point notice is created at a point ahead of teaching point P2 in the welding advancing direction, and the trajectory control is applied up to abnormality decision point P2a. This enables welding torch 123 to keep moving even if the point at which the end point notice is received is ahead of teaching point P2, as shown in Fig. 16 (b).

    [0055] Furthermore, in the exemplary embodiment, teaching point P3, which is the edge of workpiece W, teaching point P2 preceding the edge by distance D, and distance D are stored in sensor controller 130, enabling sensor controller 130 to calculate modified point P2b equivalent to teaching point P2. However, distance D may instead be stored in robot controller 110. In this case, sensor controller 130 sends the end point notice to robot controller 110 at the moment sensor 131 detects the end point of workpiece W, and robot controller 110 regards the position advanced by distance D from the position at which the end point notice is received as modified point P2b.
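The variant of paragraph [0055] can be sketched as a simple vector advance; the function name, the advancing-direction vector, and all coordinate values below are hypothetical illustrations, not part of the patent:

```python
import math

def modified_p2b(notice_position, advancing_direction, D):
    """Variant of paragraph [0055], with distance D stored in robot
    controller 110: the point advanced by distance D along the welding
    advancing direction from the position at which the end point notice
    is received is taken as modified point P2b.  The direction vector
    need not be unit length; it is normalized here."""
    norm = math.sqrt(sum(c * c for c in advancing_direction))
    unit = [c / norm for c in advancing_direction]
    return tuple(p + D * u for p, u in zip(notice_position, unit))

# Hypothetical: notice received at (100, 0, 0) mm, advancing along +x,
# D = 10 mm.
p2b_variant = modified_p2b((100.0, 0.0, 0.0), (1.0, 0.0, 0.0), 10.0)
# p2b_variant == (110.0, 0.0, 0.0)
```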

    [0056] Furthermore, in the exemplary embodiment, sensor 131 using a laser beam is used as the detector for recognizing workpiece W. However, as long as the shape of workpiece W can be recognized, detectors other than a laser-beam sensor are applicable. For example, equipment for image recognition using a camera or equipment using ultrasonic waves can be used as the detector.

    [0057] Furthermore, the exemplary embodiment refers to an example of mounting sensor 131 on welding torch 123. However, sensor 131 may instead be mounted on manipulator 120 such that sensor 131 is ahead of welding torch 123 by predetermined distance L.

    [0058] Furthermore, a weaving operation may be added to welding torch 123 in the exemplary embodiment. The control described above then also accommodates the resulting displacement of the welding position.

    [0059] Furthermore, the exemplary embodiment refers to a welding line with one corner (one bend). However, the control in the exemplary embodiment is applicable to each corner even when the welding line includes multiple corners. This achieves even more efficient welding.

    [0060] Furthermore, the exemplary embodiment refers to an arc welding torch as an example of the work tool. However, the work tool may also be a laser head for laser welding, which achieves welding by laser beam. Likewise, the exemplary embodiment refers to welding as an example of the processing. However, the processing may also be coating using a coating material or application of an adhesive. In this case, a spray gun may be given as an example of the work tool.

    INDUSTRIAL APPLICABILITY



    [0061] The robot control method of the disclosure can achieve a required finish by continuous processing even if the processing line includes a bent part. This further improves work efficiency and is thus industrially useful.

    REFERENCE MARKS IN THE DRAWINGS



    [0062] 
    P1 - P5    Teaching point
    P2a    Abnormality decision point
    P2b - P5b    Modified point
    100    Robot system
    110    Robot controller
    111    Sensor control part
    112    Trajectory arithmetic part
    113    Interpolation calculation part
    114    First data communication part
    120    Manipulator
    121    Welding wire feeder
    122    Welding wire
    123, 402    Welding torch
    130    Sensor controller
    131    Sensor
    132    Laser output control part
    133    Laser input control part
    134    Input data processing part
    135    Modification calculation part
    136    Data buffer
    137    Second data communication part
    138, 403    Laser sensor
    140    Welding power supply unit



    Claims

    1. A robot control method for performing processing on a workpiece (W) on a processing line using a work tool (123) and a sensor (131) mounted on the processing advancing direction side of the work tool (123), wherein the sensor detects the shape of the workpiece (W);
    the robot control method comprising:

    a) setting, in advance of the processing of the workpiece (W), a first teaching point (P1), a second teaching point (P2), a third teaching point (P3), a fourth teaching point (P4), and a fifth teaching point (P5) on the processing line for the workpiece (W), to create a teaching program,
    wherein each of the first teaching point (P1), the second teaching point (P2), the third teaching point (P3), the fourth teaching point (P4), and the fifth teaching point (P5) corresponds to a position of the work tool (123) during processing of the workpiece (W) in a process advancing direction on the processing line,
    wherein the third teaching point (P3) is a bent point in the processing line; and
    wherein the work tool is taught to move retaining a certain attitude relative to the workpiece from the first teaching point (P1) to the second teaching point (P2) and from the fourth teaching point (P4) to the fifth teaching point (P5), while the attitude of the work tool is successively changed from the second teaching point (P2) to the fourth teaching point (P4) via the third teaching point (P3);

    b) a first processing of performing the processing on the workpiece (W) using the work tool (123) from the first teaching point (P1) to a first modified point (P2b) on the processing line reached as a result of applying a profile modifying control to change a position of the work tool (123) based on the second teaching point (P2) and the shape of the workpiece detected by the sensor;

    c) changing, by adding the same amount of difference between the second teaching point (P2) and the first modified point (P2b), the third teaching point (P3) to a second modified point (P3b), the fourth teaching point (P4) to a third modified point (P4b), and the fifth teaching point (P5) to a fourth modified point (P5b);

    d) a second processing of performing the processing on the workpiece (W) using the work tool (123) from the first modified point (P2b) through the second modified point (P3b) to the third modified point (P4b); wherein the attitude of the work tool (123) is changed based on the teaching program created in step a) and stored in the robot, and

    e) a third processing of performing the processing on the workpiece (W) using the work tool (123) from the third modified point (P4b) to the fourth modified point (P5b).


     
    2. The robot control method of claim 1, wherein a distance between the second teaching point (P2) and the third teaching point (P3) is shorter than a distance (L) between the work tool (123) and the sensor (131).
     
    3. The robot control method of claim 1, wherein the work tool (123) is an arc welding torch, the sensor (131) is a laser sensor, and the processing line is a welding line.
     
    4. The robot control method of claim 1, wherein the work tool is a laser head for laser welding, the sensor is a laser sensor, and the processing line is a welding line.
     
    5. The robot control method of claim 1, wherein the attitude of the work tool (123) is fixed during the first processing.
     
    6. The robot control method of claim 1, wherein the attitude of the work tool (123) is fixed during the third processing.
     


    Ansprüche

    1. Robotersteuerverfahren zum Durchführen einer Verarbeitung an einem Werkstück (W) in einer Verarbeitungslinie unter Verwendung eines Arbeitswerkzeugs (123) und eines Sensors (131), der auf der Verarbeitungsvorschubrichtungsseite des Arbeitswerkzeugs (123) montiert ist, und wobei der Sensor die Form des Werkstücks (W) detektiert;
    wobei das Robotersteuerverfahren Folgendes umfasst:

    a) Einstellen eines ersten Lehrpunkts (P1), eines zweiten Lehrpunkts (P2), eines dritten Lehrpunkts (P3), eines vierten Lehrpunkts (P4) und eines fünften Lehrpunkts (P5) in der Verarbeitungslinie für das Werkstück (W) vor dem Verarbeiten des Werkstücks (W), um ein Lehrprogramm zu erstellen,
    wobei jeder des ersten Lehrpunkts (P1), des zweiten Lehrpunkts (P2), des dritten Lehrpunkts (P3), des vierten Lehrpunkts (P4) und des fünften Lehrpunkts (P5) einer Position des Arbeitswerkzeugs (123) während der Verarbeitung des Werkstücks (W) in einer Prozessvorschubrichtung in der Verarbeitungslinie entspricht,
    wobei der dritte Lehrpunkt (P3) ein abgeknickter Punkt in der Verarbeitungslinie ist, und
    wobei das Arbeitswerkzeug gelehrt wird, sich unter Beibehaltung einer bestimmten Lage relativ zum Werkstück vom ersten Lehrpunkt (P1) zum zweiten Lehrpunkt (P2) und vom vierten Lehrpunkt (P4) zum fünften Lehrpunkt (P5) zu bewegen, während die Lage des Arbeitswerkzeugs vom zweiten Lehrpunkt (P2) via den dritten Lehrpunkt (P3) zum vierten Lehrpunkt (P4) sukzessive geändert wird;

    b) eine erste Verarbeitung des Durchführens der Verarbeitung am Werkstück (W) unter Verwendung des Arbeitswerkzeugs (123) vom ersten Lehrpunkt (P1) zu einem ersten modifizierten Punkt (P2b) in der Prozesslinie, der als Resultat des Anwendens einer Profilmodifikationssteuerung zum Ändern einer Position des Arbeitswerkzeugs (123) auf Basis des zweiten Lehrpunkts (P2) und der Form des Werkstücks, die vom Sensor detektiert wird, erreicht wird;

    c) Ändern des dritten Lehrpunkts (P3) in einen zweiten modifizierten Punkt (P3b), des vierten Lehrpunkts (P4) in einen dritten modifizierten Punkt (P4b) und des fünften Lehrpunkts (P5) in einen vierten modifizierten Punkt (P5b) durch Hinzufügen desselben Differenzbetrags zwischen dem zweiten Lehrpunkt (P2) und dem ersten modifizierten Punkt (P2b);

    d) eine zweite Verarbeitung des Durchführens der Verarbeitung am Werkstück (W) unter Verwendung des Arbeitswerkzeugs (123) vom ersten modifizierten Punkt (P2b) über den zweiten modifizierten Punkt (P3b) zum dritten modifizierten Punkt (P4b),
    wobei die Lage des Arbeitswerkzeug (123) auf Basis des Lehrprogramms, das in Schritt a) erstellt und im Roboter gespeichert wird, geändert wird; und

    e) eine dritte Verarbeitung des Durchführens der Verarbeitung am Werkstück (W) unter Verwendung des Arbeitswerkzeugs (123) vom dritten modifizierten Punkt (P4b) zum vierten modifizierten Punkt (P5b).


     
    2. Robotersteuerverfahren nach Anspruch 1, wobei ein Abstand zwischen dem zweiten Lehrpunkt (P2) und dem dritten Lehrpunkt (P3) kürzer ist als ein Abstand (L) zwischen dem Arbeitswerkzeug (123) und dem Sensor (131).
     
    3. Robotersteuerverfahren nach Anspruch 1, wobei das Arbeitswerkzeug (123) ein Lichtbogenbrenner ist, der Sensor (131) ein Lasersensor ist und die Verarbeitungslinie eine Schweißlinie ist.
     
    4. Robotersteuerverfahren nach Anspruch 1, wobei das Arbeitswerkzeug ein Laserkopf zum Laserschweißen ist, der Sensor ein Lasersensor ist und die Verarbeitungslinie eine Schweißlinie ist.
     
    5. Robotersteuerverfahren nach Anspruch 1, wobei die Lage des Arbeitswerkzeugs (123) während der ersten Verarbeitung fest ist.
     
    6. Robotersteuerverfahren nach Anspruch 1, wobei die Lage des Arbeitswerkzeugs (123) während der dritten Verarbeitung fest ist.
     


    Revendications

    1. Procédé de commande de robot pour réaliser un traitement sur une pièce de fabrication (W) sur une ligne de traitement à l'aide d'un outil de travail (123) et d'un capteur (131) monté sur le côté de direction d'avancement de traitement de l'outil de travail (123), et dans lequel le capteur détecte la géométrie de la pièce de fabrication (W) ;
    le procédé de commande de robot comprenant :

    a) la définition, avant le traitement de la pièce de fabrication (W), d'un premier point d'apprentissage (P1), d'un deuxième point d'apprentissage (P2), d'un troisième point d'apprentissage (P3), d'un quatrième point d'apprentissage (P4) et d'un cinquième point d'apprentissage (P5) sur la ligne de traitement pour la pièce de fabrication (W), pour créer un programme d'apprentissage,
    dans lequel chacun du premier point d'apprentissage (P1), du deuxième point d'apprentissage (P2), du troisième point d'apprentissage (P3), du quatrième point d'apprentissage (P4) et du cinquième point d'apprentissage (P5) correspond à une position de l'outil de travail (123) au cours du traitement de la pièce de fabrication (W) dans une direction d'avancement de traitement sur la ligne de traitement,
    dans lequel le troisième point d'apprentissage (P3) est un point d'inflexion dans la ligne de traitement, et
    dans lequel l'outil de travail apprend à se déplacer en conservant une certaine attitude par rapport à la pièce de fabrication du premier point d'apprentissage (P1) au deuxième point d'apprentissage (P2) et du quatrième point d'apprentissage (P4) au cinquième point d'apprentissage (P5), tandis que l'attitude de l'outil de travail est successivement changée du deuxième point d'apprentissage (P2) au quatrième point d'apprentissage (P4) via le troisième point d'apprentissage (P3) ;

    b) un premier traitement consistant à réaliser le traitement sur la pièce de fabrication (W) à l'aide de l'outil de travail (123) du premier point d'apprentissage (P1) à un premier point modifié (P2b) sur la ligne de traitement atteint en conséquence de l'application d'une commande de modification de profil pour changer une position de l'outil de travail (123) sur la base du deuxième point d'apprentissage (P2) et de la géométrie de la pièce de fabrication détectée par le capteur ;

    c) le changement, par l'ajout de la même quantité de différence entre le deuxième point d'apprentissage (P2) et le premier point modifié (P2b), du troisième point d'apprentissage (P3) en un deuxième point modifié (P3b), du quatrième point d'apprentissage (P4) en un troisième point modifié (P4b) et du cinquième point d'apprentissage (P5) en un quatrième point modifié (P5b) ;

    d) un deuxième traitement consistant à réaliser le traitement sur la pièce de fabrication (W) à l'aide de l'outil de travail (123) du premier point modifié (P2b), en passant par le deuxième point modifié (P3b), au troisième point modifié (P4b),
    dans lequel l'attitude de l'outil de travail (123) est changée sur la base du programme d'apprentissage créé à l'étape a) et stocké dans le robot ; et

    e) un troisième traitement consistant à réaliser le traitement sur la pièce de fabrication (W) à l'aide de l'outil de travail (123) du troisième point modifié (P4b) au quatrième point modifié (P5b).


     
    2. Procédé de commande de robot selon la revendication 1, dans lequel une distance entre le deuxième point d'apprentissage (P2) et le troisième point d'apprentissage (P3) est plus courte qu'une distance (L) entre l'outil de travail (123) et le capteur (131).
     
    3. Procédé de commande de robot selon la revendication 1, dans lequel l'outil de travail (123) est une torche de soudage à l'arc, le capteur (131) est un capteur laser et la ligne de traitement est une ligne de soudage.
     
    4. Procédé de commande de robot selon la revendication 1, dans lequel l'outil de travail est une tête laser pour un soudage au laser, le capteur est un capteur laser et la ligne de traitement est une ligne de soudage.
     
    5. Procédé de commande de robot selon la revendication 1, dans lequel l'attitude de l'outil de travail (123) est fixe au cours du premier traitement.
     
    6. Procédé de commande de robot selon la revendication 1, dans lequel l'attitude de l'outil de travail (123) est fixe au cours du troisième traitement.
     




    Drawing
    REFERENCES CITED IN THE DESCRIPTION



    This list of references cited by the applicant is for the reader's convenience only. It does not form part of the European patent document. Even though great care has been taken in compiling the references, errors or omissions cannot be excluded and the EPO disclaims all liability in this regard.

    Patent documents cited in the description