(11) EP 4 292 774 A1
(12) EUROPEAN PATENT APPLICATION
(54) DIGITAL TWIN-BASED INTELLIGENT MANUFACTURING SYSTEM WITH INTEGRATED SYNCHRONIZATION CONTROLLER
(57) A digital twin-based intelligent manufacturing system, composed of a human-computer
interaction system, a programming control system, a twin virtual system and a processing
execution system; the human-computer interaction system being used for input and transmission
of characteristic parameters of parts to be processed and bionic unit information;
the programming control system being used for converting the characteristic parameters
of parts to be processed and the bionic unit information into a motion trajectory
of a six-degree-of-freedom joint robot and processing parameters, and an integrated
synchronization controller on a high-performance computer realizing integrated control
on the whole system; the twin virtual system being used for real-time monitoring and
offline simulation of processing quality and bionic information during bionic processing,
thereby realizing real-time adjustment and feedback of process parameters; and the
processing execution system realizing a multi-dimensional motion of a laser processing
work head and laser processing of workpieces by means of the laser processing work
head arranged at a head end of the six-degree-of-freedom joint robot. The system can
be used to well meet the current precision and multifunctional manufacturing needs
of engineering bionics using laser methods.
BACKGROUND OF THE INVENTION
Field of the Invention
Description of Related Art
SUMMARY OF THE INVENTION
the human-computer interaction system includes a 3D model editor module and a bionic processing unit constructor module; a processing scheme generator receives parameters of parts to be processed from the 3D model editor module and bionic information parameters of the bionic processing unit constructor module;
the bionic information parameters include the selection of a bionic unit processing method, the setting of workpiece matrix material parameters, unit body characteristic parameters, the construction and calibration of processing position and area, etc.;
the programming control system includes a high-performance computer, a laser controller, a robot motion controller and an auxiliary device controller, with the laser controller, the robot motion controller and the auxiliary device controller being directly connected to the high-performance computer via a serial port;
the high-performance computer is integrated with a bionic process parameter database, the processing scheme generator and an integrated synchronization controller, with the bionic process parameter database and the processing scheme generator providing a human-computer interaction system interface;
the bionic process parameter database outputs laser processing parameters and environmental medium parameters to the processing scheme generator;
the processing scheme generator takes a 3D workpiece model added with bionic processing unit information and the bionic processing parameter information output from the bionic process parameter database as input, and adopts an intelligent algorithm to generate corresponding control command sequences respectively for a laser, a six-degree-of-freedom joint robot and an auxiliary device; and the control command sequences include a time control sequence and a device command control sequence, and the control command sequences not only contain respective control information, but also contain mutual synchronization information;
the input of the integrated synchronization controller involves the time control sequence and the device command control sequence generated by the processing scheme generator, and command execution feedback information of the six-degree-of-freedom joint robot, the laser and the auxiliary device is received at the same time; and the output of the integrated synchronization controller involves control commands respectively sent to the laser controller, the robot motion controller and the auxiliary device controller (an illustrative sketch of these sequences follows this summary);
the twin virtual system includes a manufacturing terminal entity scene twin model, a process data real-time acquisition module, a bionic information parameter offline acquisition module, a process offline simulation module and a process data controller;
the process data controller receives process data information from the manufacturing terminal entity scene twin model, the process data real-time acquisition module, the bionic information parameter offline acquisition module and the process offline simulation module; the process offline simulation module receives information from the process data real-time acquisition module and the bionic information parameter offline acquisition module, and process parameter regulation results of the process data controller are transmitted to the processing scheme generator to generate control command sequences; and the bionic process parameter database receives process information of the bionic information parameter offline acquisition module and updates process data in the bionic process parameter database in real time; and
the processing execution system includes a laser, an optical fiber coupling transmission unit, a laser processing work head, a six-degree-of-freedom joint robot and an auxiliary device; the laser is connected to the laser processing work head via the optical fiber coupling transmission unit, and the laser processing work head arranged at a head end of the six-degree-of-freedom joint robot realizes a multi-dimensional motion of the laser processing work head and laser processing.
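By way of non-limiting illustration, the time control sequence and the device command control sequences summarized above may take a form such as the following minimal Python sketch; the field names, synchronization indices and command strings are illustrative assumptions and are not specified by the application.

```python
# Illustrative sketch only: one possible shape for the time control sequence and
# the device command control sequences, including their mutual synchronization
# information. All field names and command strings are invented placeholders.
time_control_sequence = [0, 1, 2]  # shared synchronization points

device_command_sequences = {
    "laser":     [{"t": 0, "code": "LASER ON P=800"}, {"t": 1, "code": "LASER OFF"}],
    "robot":     [{"t": 0, "code": "MOVE P1"},        {"t": 1, "code": "MOVE P2"}],
    "auxiliary": [{"t": 0, "code": "GAS ON"},         {"t": 2, "code": "GAS OFF"}],
}

def commands_at(t):
    """Commands each controller must acknowledge before the system advances past t."""
    return {device: [c["code"] for c in sequence if c["t"] == t]
            for device, sequence in device_command_sequences.items()}
```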
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a schematic diagram of an overall structural composition of a digital twin-based intelligent manufacturing system according to the present invention.
Fig. 2 is a schematic diagram of a working principle of a digital twin-based intelligent manufacturing system according to the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
The human-computer interaction system includes a 3D model editor module 11 and a bionic processing unit constructor module 12, and is used for input and transmission of characteristic parameters of parts to be processed and bionic unit information. The 3D model editor module 11 outputs the parameters of the parts to be processed (a 3D workpiece model) to a processing scheme generator 22, and the bionic processing unit constructor module 12 outputs the bionic information parameters to the processing scheme generator 22.
The human-computer interaction system is the window for entering customer demand information and allows parametric customization of the parts to be processed and of a bionic unit. It mainly refers to interface system software specially programmed for the design and preparation of bionic coupled surfaces; the preparation work in the early stage of processing only requires the customer to input the bionic unit design information by following the bionic unit information construction wizard. The interface system software mainly includes the 3D model editor module 11 and the bionic processing unit constructor module 12, and the 3D model editor module 11 can perform CAD modeling, loading, display, manipulation, modification, storage, data export, etc. on the parts to be processed.
The bionic processing unit constructor module 12 uses a wizard-style construction method to guide the user through setting the bionic information parameters step by step. The bionic information parameters include the selection of a bionic unit processing method (phase transformation, fusion, engraving, cladding or alloying), the setting of workpiece matrix material parameters (material grade, absorption coefficient, thermal conductivity, etc.) and unit body characteristic parameters (shape, spacing, depth, width and average microhardness of the unit body), and the construction and calibration of the processing position and area. If the processing method selected in the bionic processing unit information parameters is cladding or alloying, a dialog box for adding powder material parameters is popped up so that the cladding or alloying powder parameters (grade, composition, absorption coefficient, thermal conductivity, etc.) can be set.
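The wizard output described above can be pictured with a minimal data-structure sketch; the class and field names, units and the validation rule below are illustrative assumptions rather than features defined by the application.

```python
# Illustrative sketch only: a possible data structure for the wizard output.
# All class and field names are hypothetical and not defined by the application.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MatrixMaterial:
    grade: str
    absorption_coefficient: float
    thermal_conductivity: float   # W/(m*K), unit assumed for illustration

@dataclass
class PowderMaterial:
    grade: str
    composition: str
    absorption_coefficient: float
    thermal_conductivity: float
    particle_size_um: float

@dataclass
class BionicUnitParameters:
    processing_method: str             # "phase_transformation", "fusion", "engraving", "cladding" or "alloying"
    matrix_material: MatrixMaterial
    unit_shape: str                    # e.g. dot, stripe, grid
    spacing_mm: float
    depth_mm: float
    width_mm: float
    average_microhardness_hv: float
    processing_regions: list = field(default_factory=list)  # calibrated positions/areas
    powder: Optional[PowderMaterial] = None                  # required only for cladding/alloying

    def validate(self) -> None:
        # The wizard prompts for powder parameters only when cladding or alloying is selected.
        if self.processing_method in ("cladding", "alloying") and self.powder is None:
            raise ValueError("cladding/alloying requires powder material parameters")
```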
The programming control system 2 includes a high-performance computer, a laser controller 62, a robot motion controller 41 and an auxiliary device controller 51, with the laser controller 62, the robot motion controller 41 and the auxiliary device controller 51 directly connected to the high-performance computer via a serial port. The programming control system 2 is used for converting the characteristic parameters of the parts to be processed and the bionic unit information into a motion trajectory of a six-degree-of-freedom joint robot 42 and processing parameters, and an integrated synchronization controller 23 on the high-performance computer realizes integrated control of the whole system.
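As a minimal sketch of such serial links, the snippet below opens one connection per controller using the pyserial package; the port names, baud rate and line-based command protocol are assumptions made purely for illustration.

```python
# Minimal sketch of the serial links described above, using the pyserial package
# (pip install pyserial). Port names, baud rate and command payloads are
# assumptions, not values specified by the application.
import serial

CONTROLLER_PORTS = {
    "laser_controller": "/dev/ttyUSB0",       # hypothetical port assignments
    "robot_motion_controller": "/dev/ttyUSB1",
    "auxiliary_device_controller": "/dev/ttyUSB2",
}

def open_links(baudrate: int = 115200) -> dict:
    """Open one serial connection per controller."""
    return {name: serial.Serial(port, baudrate, timeout=1.0)
            for name, port in CONTROLLER_PORTS.items()}

def send_command(link: serial.Serial, command: str) -> str:
    """Send one command line and return the controller's reply line."""
    link.write((command + "\n").encode("ascii"))
    return link.readline().decode("ascii", errors="replace").strip()
```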
The high-performance computer is integrated with a bionic process parameter database 21, the processing scheme generator 22 and the integrated synchronization controller 23, with the bionic process parameter database 21 and the processing scheme generator 22 providing the human-computer interaction system interface.
The bionic process parameter database 21 outputs laser processing parameters (wavelength, mode, power density, frequency, pulse width, defocus amount, light spot diameter, scanning speed, etc.) and environmental medium parameters (airflow angle, speed, powder flow, water film thickness, etc.) to the processing scheme generator 22; the database is built on a large body of theoretical and experimental data and parameters.
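Conceptually, such a database maps a workpiece material and processing method to recommended laser and environmental-medium parameters; the sketch below shows one possible lookup, with all keys and numeric values invented for illustration only.

```python
# Illustrative sketch of a process-parameter lookup; the keys, field names and
# numeric values are invented placeholders, not data from the application.
PROCESS_PARAMETER_DB = {
    # (matrix material grade, processing method) -> recommended parameters
    ("45_steel", "phase_transformation"): {
        "laser": {"wavelength_nm": 1064, "power_density_w_cm2": 1.0e4,
                  "frequency_hz": 50, "pulse_width_ms": 4.0,
                  "defocus_mm": 0.0, "spot_diameter_mm": 3.0,
                  "scanning_speed_mm_s": 10.0},
        "environment": {"airflow_angle_deg": 30, "airflow_speed_m_s": 5.0},
    },
}

def lookup_parameters(material_grade: str, method: str) -> dict:
    """Return the stored laser and environmental-medium parameters, if any."""
    try:
        return PROCESS_PARAMETER_DB[(material_grade, method)]
    except KeyError:
        raise KeyError(f"no process data for {material_grade!r} / {method!r}")
```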
The processing scheme generator 22 takes as input the 3D workpiece model to which the bionic processing unit information has been added, together with the bionic processing parameter information output from the bionic process parameter database 21, and adopts an intelligent algorithm to generate corresponding control command sequences for a laser 61, a six-degree-of-freedom joint robot 42 and an auxiliary device 52, respectively. The control command sequences contain not only the respective control information but also mutual synchronization information.
Its working principle is as follows. The processing scheme generator 22 first allows the user to determine the center coordinates of the six-degree-of-freedom joint robot 42, the workpiece fixture and the workpiece 6, so that the overall layout and positioning of the processing scheme are completed. The processing scheme generator 22 then traverses the bionic processing positions and areas in the 3D workpiece model: it first extracts the characteristic parameters of a unit body to be processed (shape, spacing, depth, width and average microhardness of the unit body) from the 3D workpiece model, and then uses the processing information of the bionic processing unit to find the optimized process parameter data in the bionic process parameter database 21. Next, it combines the path-related laser processing parameters (defocus amount and scanning speed) with the curve of the bionic unit to be processed to calculate the walking path, speed, attitude and other information of the six-degree-of-freedom joint robot 42; this information can also be obtained by connecting the robot motion controller 41 to the processing scheme generator 22 and instructing the six-degree-of-freedom joint robot 42. Finally, it extracts the relevant control information of the six-degree-of-freedom joint robot 42, the laser 61 and the auxiliary device 52 in a classified manner and converts it, by rule mapping, into code sequences that can be recognized by the laser controller 62, the robot motion controller 41 and the auxiliary device controller 51; at the same time, it generates a cooperative working time sequence of the six-degree-of-freedom joint robot 42, the laser 61 and the auxiliary device 52 and establishes a mapping between the code sequences and the time sequence, so that the integrated synchronization controller 23 can exercise unified control and enable coordinated work.
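The generation steps just described (traverse the units, look up parameters, plan the robot path, and rule-map everything into controller code sequences tied to a common time sequence) can be summarized in the following sketch; the function names, data shapes and command strings are illustrative assumptions, not the application's actual algorithm.

```python
# Illustrative sketch of the scheme-generation steps described above. Function
# names, data shapes and the rule mapping are assumptions for illustration.
def generate_scheme(workpiece_model, parameter_db):
    time_sequence = []                              # cooperative working time sequence
    command_sequences = {"robot": [], "laser": [], "auxiliary": []}

    for t, unit in enumerate(workpiece_model["bionic_units"]):
        # 1) extract unit-body characteristics from the 3D model
        features = unit["features"]                 # shape, spacing, depth, width, hardness
        # 2) look up optimized process parameters in the database
        params = parameter_db[(unit["material"], unit["method"])]
        # 3) combine path-related parameters with the unit curve to get robot motion
        path = plan_robot_path(unit["curve"], params["laser"]["scanning_speed_mm_s"])
        # 4) rule-map classified control information into controller code sequences
        command_sequences["robot"].append({"t": t, "code": f"MOVE {path}"})
        command_sequences["laser"].append({"t": t, "code": f"SET {params['laser']} UNIT {features}"})
        command_sequences["auxiliary"].append({"t": t, "code": f"MEDIUM {params['environment']}"})
        # 5) record the synchronization point shared by all three devices
        time_sequence.append(t)

    return time_sequence, command_sequences

def plan_robot_path(curve_points, scanning_speed):
    """Placeholder: pair each curve point with the required traverse speed."""
    return [(point, scanning_speed) for point in curve_points]
```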
The coordinated control of the processing execution system is mainly performed by the integrated synchronization controller 23, which integrates the control of the six-degree-of-freedom joint robot 42, the laser 61 and the auxiliary device 52 under unified scheduling management, so as to achieve synchronous and coordinated operation among the three. The input of the integrated synchronization controller 23 involves the time control sequence and the device command control sequence generated by the processing scheme generator 22, and command execution feedback information of the six-degree-of-freedom joint robot 42, the laser 61 and the auxiliary device 52 is received at the same time; the output of the integrated synchronization controller 23 involves the control commands respectively sent to the laser controller 62, the robot motion controller 41 and the auxiliary device controller 51. Its working process is as follows: according to the time control sequence, the corresponding control commands for the laser controller 62, the robot motion controller 41 and the auxiliary device controller 51 are first looked up and sent, while feedback information from the six-degree-of-freedom joint robot 42, the laser 61 and the auxiliary device 52 is received and compared with the execution expectations. If the feedback meets the expectations, the three devices are operating synchronously and the subsequent control commands continue to be sent; if the feedback does not meet the expectations, a preset synchronization processing mechanism is invoked (for example, certain equipment stops and waits) according to the feedback conditions of the three devices, and execution of the subsequent commands resumes once the three devices are in sync again.
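The synchronization loop described above (send the commands scheduled at each time point, collect feedback, and hold the devices until they are in step again) can be sketched as follows; the device names, the "OK" feedback convention and the injected send/poll helpers are assumptions for illustration.

```python
# Illustrative sketch of the synchronization loop. "send" and "poll" stand for
# whatever transport the controllers use (e.g. the serial links shown earlier);
# the "OK" status convention is an assumption, not a defined protocol.
import time

DEVICES = ("laser", "robot", "auxiliary")

def run_synchronized(time_control_sequence, device_command_sequences, send, poll):
    for t in time_control_sequence:
        # send the commands scheduled at this synchronization point to each device
        for device in DEVICES:
            for entry in device_command_sequences[device]:
                if entry["t"] == t:
                    send(device, entry["code"])

        # receive command-execution feedback and compare it with expectations
        feedback = {device: poll(device) for device in DEVICES}
        while any(status != "OK" for status in feedback.values()):
            # preset synchronization mechanism: devices that are already done
            # effectively stop and wait while the lagging ones are polled again
            time.sleep(0.1)
            feedback = {device: poll(device) for device in DEVICES}
```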
The twin virtual system 3 includes a manufacturing terminal entity scene twin model 31, a process data real-time acquisition module 32, a bionic information parameter offline acquisition module 33, a process offline simulation module 34 and a process data controller 35. The manufacturing terminal entity scene twin model 31 contains physical entity scene models and a large number of modeled and formulated process models. The twin virtual system 3 is used for real-time monitoring and offline simulation of processing quality and bionic information during bionic processing, thereby realizing real-time adjustment and feedback of process parameters.
The process data controller 35 receives process data information from the manufacturing terminal entity scene twin model 31, the process data real-time acquisition module 32, the bionic information parameter offline acquisition module 33 and the process offline simulation module 34. The process offline simulation module 34 receives information from the process data real-time acquisition module 32 and the bionic information parameter offline acquisition module 33, and the process parameter regulation results of the process data controller 35 are transmitted to the processing scheme generator 22 to generate control command sequences. The bionic process parameter database 21 receives process information from the bionic information parameter offline acquisition module 33 and updates its process data in real time.
The process data controller 35 transmits the process parameter regulation results to the processing scheme generator 22 in the following specific manner:
The processing scheme generator 22 first traverses the manufacturing terminal entity scene twin model 31 and the bionic process parameter database 21 according to the bionic information parameters set by the user and receives the initial processing parameters. During bionic processing, the process data real-time acquisition module 32 acquires process data such as processing-terminal scene perception data, in-situ molten pool temperature and splash process data by means of sensors and a visual acquisition device. The process data controller 35 receives the relevant process information from the process data real-time acquisition module 32 and the process offline simulation module 34 for comprehensive analysis, judges the accuracy of the bionic information parameters, adjusts the process parameters in real time, and performs real-time data interaction with the manufacturing terminal entity scene twin model 31. The bionic information parameter offline acquisition module 33 updates the process information in the bionic process parameter database 21 via the process data controller 35 by performing offline acquisition and entry of bionic processing elements. The process offline simulation module 34 receives the information from the process data real-time acquisition module 32 and the bionic information parameter offline acquisition module 33 to perform offline simulation of the bionic manufacturing process, and the manufacturing terminal entity scene twin model 31 learns through data interaction with the process offline simulation module 34 and adjusts its models.
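A minimal sketch of the closed-loop regulation performed by the process data controller is given below; the monitored quantity (molten-pool temperature), the tolerance and the proportional correction rule are illustrative assumptions only.

```python
# Illustrative sketch of the closed-loop adjustment performed by the process
# data controller; tolerances, field names and the adjustment rule are invented.
def regulate_process(realtime_sample, simulation_result, current_params,
                     temperature_tolerance=50.0):
    """Compare measured and simulated molten-pool temperature and adjust power."""
    error = realtime_sample["molten_pool_temp_c"] - simulation_result["expected_temp_c"]

    adjusted = dict(current_params)
    if abs(error) > temperature_tolerance:
        # simple proportional correction, purely illustrative
        adjusted["power_density_w_cm2"] = current_params["power_density_w_cm2"] * (1 - 0.001 * error)

    # the regulation result would be handed back to the processing scheme
    # generator to regenerate the control command sequences
    return adjusted
```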
The processing execution system includes a laser 61, an optical fiber coupling transmission unit 63, a laser processing work head 64, a six-degree-of-freedom joint robot 42 and an auxiliary device 52. The laser 61 is connected to the laser processing work head 64 via the optical fiber coupling transmission unit 63, and the laser processing work head 64, arranged at the head end of the six-degree-of-freedom joint robot 42, realizes multi-dimensional motion of the laser processing work head 64 and laser processing.
In use, the user enters the 3D model editor module 11 of the human-computer interaction system and creates or imports a CAD model of the test pieces or parts to be processed; in the 3D model editor module 11, the customer can also carry out operations such as modification and export on the created or imported test pieces or parts to be processed. After entering the bionic processing unit constructor module 12, the construction of the bionic processing unit is completed step by step by using the construction wizard provided by the bionic processing unit constructor module 12: the type of processing method is first selected, such as phase transformation, fusion, engraving, cladding, alloying, etc., and the system then pops up different selection pages according to the user's selection to prompt the user to enter the workpiece matrix material parameter information (material grade, composition, absorption coefficient, thermal conductivity, etc.) required by the selected processing type.
If the processing method is phase transformation, fusion or engraving, the user only needs to provide the workpiece matrix material parameter information (material grade, composition, absorption coefficient, thermal conductivity, etc.).
If the processing method is cladding or alloying, the user needs to provide, in addition to the workpiece matrix material parameter information (material grade, composition, absorption coefficient, thermal conductivity, etc.), parameter information of the added powder material such as its grade, composition, absorption coefficient, thermal conductivity, powder type and particle size.
After the input information is confirmed, the system automatically returns to the 3D model editor module 11, and, taking the created or loaded 3D model of the test pieces or parts to be processed as the blueprint, the bionic processing unit that needs to be processed is calibrated on the 3D model by means of a 3D modeling operation, so that the processing curve can be generated at a later stage. The system provides 3D model calibration tools suited to the selected processing method, the workpiece matrix material parameters and/or the powder material parameters, so that the user can calibrate the positions and areas that need to be subjected to bionic processing on the 3D model; the calibration tools include points, lines and surfaces.
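One possible representation of the calibrated positions and areas (points, lines and surfaces marked on the 3D model) is sketched below; the marker kinds and coordinate format are assumptions for illustration.

```python
# Illustrative sketch of how calibrated processing regions on the 3D model could
# be represented; the marker types and coordinate format are assumptions.
from dataclasses import dataclass
from typing import List, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class CalibrationMarker:
    kind: str                # "point", "line" or "surface"
    vertices: List[Point3D]  # one point, a polyline, or the boundary of a patch

def calibrate_region(kind: str, vertices: List[Point3D]) -> CalibrationMarker:
    expected_min = {"point": 1, "line": 2, "surface": 3}
    if len(vertices) < expected_min[kind]:
        raise ValueError(f"a {kind} marker needs at least {expected_min[kind]} vertices")
    return CalibrationMarker(kind, vertices)
```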
After the bionic processing unit information is input and confirmed, the system automatically enters the processing scheme generator 22 in the programming control system. Taking the 3D workpiece model to which the bionic processing unit information has been added and the bionic process parameter information as input, the processing scheme generator 22 uses an intelligent method to generate corresponding control command sequences for the laser 61, the six-degree-of-freedom joint robot 42 and the auxiliary device 52, respectively.
The processing scheme generator 22 first allows the user to determine the center coordinates of the six-degree-of-freedom joint robot 42, the workpiece fixture and the workpiece, so that the overall layout and positioning of the processing scheme are completed. The processing scheme generator 22 then traverses the bionic processing positions and areas in the 3D workpiece model: it first extracts the characteristic parameters of a unit body to be processed (shape, spacing, depth, width and average microhardness of the unit body) from the 3D workpiece model, and then uses the processing information of the bionic processing unit to find the optimized process parameter data (laser wavelength, mode, power density, defocus amount, scanning speed, pulse width, frequency, shielding gas flow, angle, etc.) in the bionic process parameter database 21. At the same time, the processing scheme generator 22 extracts the curve of the bionic processing unit to be processed from the 3D workpiece model and combines the path-related laser processing parameters (such as defocus amount and scanning speed) with this curve to calculate the walking path, walking speed, attitude and other information of the six-degree-of-freedom joint robot 42; this information can also be obtained by connecting the robot motion controller 41 to the processing scheme generator 22 and instructing and programming the six-degree-of-freedom joint robot 42. Finally, the relevant control information of the six-degree-of-freedom joint robot 42, the laser 61 and the auxiliary device 52 is extracted in a classified manner and converted, by rule mapping, into code sequences that can be recognized by the laser controller 62, the robot motion controller 41 and the auxiliary device controller 51; at the same time, a cooperative working time sequence of the six-degree-of-freedom joint robot 42, the laser 61 and the auxiliary device 52 is generated and a mapping between the code sequences and the time sequence is established, so that the integrated synchronization controller 23 can exercise unified control and enable coordinated work.
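The conversion of an extracted processing curve plus the path-related laser parameters into robot walking path, speed and attitude information can be sketched as follows; the fixed tool attitude, the use of the defocus amount as a height offset, and the data shapes are simplifying assumptions for illustration.

```python
# Illustrative sketch of deriving robot waypoints from the extracted unit curve
# and the path-related laser parameters; the attitude model (tool axis fixed
# along -Z) and the data shapes are simplifying assumptions.
import math

def curve_to_waypoints(curve_points, scanning_speed_mm_s, defocus_mm):
    """Turn a sampled processing curve into (position, attitude, dwell time) steps."""
    waypoints = []
    for i, (x, y, z) in enumerate(curve_points):
        position = (x, y, z + defocus_mm)   # keep the work head at the defocus height
        attitude = (0.0, 0.0, -1.0)         # assumed fixed tool orientation
        if i == 0:
            dwell_s = 0.0
        else:
            px, py, pz = curve_points[i - 1]
            segment_mm = math.dist((x, y, z), (px, py, pz))
            dwell_s = segment_mm / scanning_speed_mm_s
        waypoints.append({"position": position, "attitude": attitude, "time_s": dwell_s})
    return waypoints

# Example: a short straight scratch processed at 10 mm/s with zero defocus.
demo = curve_to_waypoints([(0, 0, 0), (5, 0, 0), (10, 0, 0)], 10.0, 0.0)
```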
CLAIMS
the human-computer interaction system comprises a 3D model editor module (11) and a bionic processing unit constructor module (12); a processing scheme generator (22) receives parameters of parts to be processed from the 3D model editor module (11) and bionic information parameters of the bionic processing unit constructor module (12);
the bionic information parameters comprise the selection of a bionic unit processing method, the setting of workpiece matrix material parameters, unit body characteristic parameters, and the construction and calibration of processing position and area;
the programming control system (2) comprises a high-performance computer, a laser controller (62), a robot motion controller (41) and an auxiliary device controller (51), with the laser controller (62), the robot motion controller (41) and the auxiliary device controller (51) being directly connected to the high-performance computer via a serial port;
the high-performance computer is integrated with a bionic process parameter database (21), the processing scheme generator (22) and an integrated synchronization controller (23), with the bionic process parameter database (21) and the processing scheme generator (22) providing a human-computer interaction system interface;
the bionic process parameter database (21) outputs laser processing parameters and environmental medium parameters to the processing scheme generator (22);
the processing scheme generator (22) takes a 3D workpiece model added with bionic processing unit information and the bionic processing parameter information output from the bionic process parameter database (21) as input, and adopts an intelligent algorithm to generate corresponding control command sequences respectively for a laser (61), a six-degree-of-freedom joint robot (42) and an auxiliary device (52); and the control command sequences comprise a time control sequence and a device command control sequence, and the control command sequences not only contain respective control information, but also contain mutual synchronization information;
the input of the integrated synchronization controller (23) involves the time control sequence and the device command control sequence generated by the processing scheme generator (22), and command execution feedback information of the six-degree-of-freedom joint robot (42), the laser (61) and the auxiliary device (52) is received at the same time; and the output of the integrated synchronization controller (23) involves control commands respectively sent to the laser controller (62), the robot motion controller (41) and the auxiliary device controller (51);
the twin virtual system (3) comprises a manufacturing terminal entity scene twin model (31), a process data real-time acquisition module (32), a bionic information parameter offline acquisition module (33), a process offline simulation module (34) and a process data controller (35);
the process data controller (35) receives process data information from the manufacturing terminal entity scene twin model (31), the process data real-time acquisition module (32), the bionic information parameter offline acquisition module (33) and the process offline simulation module (34); the process offline simulation module (34) receives information from the process data real-time acquisition module (32) and the bionic information parameter offline acquisition module (33), and process parameter regulation results of the process data controller (35) are transmitted to the processing scheme generator (22) to generate control command sequences; and the bionic process parameter database (21) receives process information of the bionic information parameter offline acquisition module (33) and updates process data in the bionic process parameter database (21) in real time; and
the processing execution system comprises a laser (61), an optical fiber coupling transmission unit (63), a laser processing work head (64), a six-degree-of-freedom joint robot (42), and an auxiliary device (52); the laser (61) is connected to the laser processing work head (64) via the optical fiber coupling transmission unit (63), and the laser processing work head (64) arranged at a head end of the six-degree-of-freedom joint robot (42) realizes a multi-dimensional motion of the laser processing work head (64) and laser processing.