(19)
(11) EP 3 667 567 A1

(12) EUROPEAN PATENT APPLICATION

(43) Date of publication:
17.06.2020 Bulletin 2020/25

(21) Application number: 18211638.4

(22) Date of filing: 11.12.2018

(51) International Patent Classification (IPC):
G06N 3/04 (2006.01)
G06N 5/02 (2006.01)
G06N 3/08 (2006.01)
G06N 5/04 (2006.01)

(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
Designated Extension States:
BA ME
Designated Validation States:
KH MA MD TN

(71) Applicant: Siemens Aktiengesellschaft
80333 München (DE)

(72) Inventors:
  • Hildebrandt, Marcel
    81737 München (DE)
  • Mogoreanu, Serghei
    81827 München (DE)
  • Shyam Sunder, Swathi
    81549 München (DE)

  


(54) PLATFORM FOR SELECTION OF ITEMS USED FOR THE CONFIGURATION OF AN INDUSTRIAL SYSTEM


(57) A computer-implemented method and platform for context aware sorting of items available for configuration of a system during a selection session, the method comprising the steps of
providing (S1) a numerical input vector, V, representing items selected in a current selection session as context; calculating (S2) a compressed vector, Vcomp, from the numerical input vector, V, using an artificial neural network, ANN, adapted to capture non-linear dependencies between items; multiplying (S3) the compressed vector, Vcomp, with a weight matrix, EI, derived from a factor matrix, E, obtained as a result of a tensor factorization of a stored relationship tensor, Tr, representing relations, r, between selections of items performed in historical selection sessions, available items and their attributes to compute an output score vector, S; and sorting (S4) automatically the available items for selection in the current selection session according to relevance scores of the computed output score vector, S.




Description


[0001] The invention relates to a platform configured to select items which can be used for the configuration of a technical system, in particular an industrial system such as an automated system comprising a plurality of items, in particular hardware components and/or software components of the system.

[0002] A technical system, in particular an industrial system, can be very complex and comprise a plurality of different subsystems and/or components. Each component can comprise a variety of different features or attributes required for the operation of the respective system. The industrial system can be for instance a manufacturing facility having a plurality of machines connected to each other in a communication subsystem and having a plurality of machine tools and/or hardware components controlled by control components adapted to execute software components during the manufacturing process. All these components form items required for setting up the respective technical system. For implementing such an industrial system, in particular an industrial manufacturing system or automation system, it is necessary to provide a plurality of items supplied by the manufacturer of the components or a component provider. An end customer planning to build an industrial system or a complex product needs to order a plurality of different items or components. Conventionally, end customers have access to product lists of the manufacturer listing a plurality of different available items or components offered by the respective manufacturer. A complex system or a complex product normally consists of several components or items which are typically bought together. For selection of the components, the provided product lists are normally sorted based on some criteria. The sorting criteria can comprise for instance the product name, where the products are sorted alphabetically. A further sorting criterion can be for instance the product price of the respective item or component, where the items are sorted according to increasing or decreasing price per component. A further possible sorting criterion is the product release date of the respective item.

[0003] Conventional platforms also provide additional services to the end customer, such as recommending at the top of a ranking list items which have been bought together most often in the past. These conventional services are mostly based on the historic selections performed by the same or different users. Such conventional platforms fail, however, in scenarios where historic selection data is missing or not available to the platform. Further, conventional platforms fail to recognize contextual aspects of the current selection session and of the items themselves. A contextual aspect is for instance formed by the items currently selected in the current selection session.

[0004] Accordingly, there is a need to provide a method and a platform which provides for a context aware sorting of items available for the configuration of a technical system during a selection session.

[0005] This object is achieved according to a first aspect of the present invention by a computer-implemented method for context aware sorting of items available for the configuration of the system comprising the features of claim 1.

[0006] The invention provides according to a first aspect a computer-implemented method for context aware sorting of items available for configuration of a system during a selection session,
the method comprising the steps of:

providing a numerical input vector representing items selected in a current selection session as context,

calculating a compressed vector from the numerical input vector using an artificial neural network adapted to capture non-linear dependencies between items,

multiplying the compressed vector with a weight matrix derived from a factor matrix obtained as a result of a tensor factorization of a stored relationship tensor representing relations between selections of items performed in historical selection sessions, available items and their attributes to compute an output score vector and

sorting automatically the available items for selection in the current selection session according to relevance scores of the computed output score vector.



[0007] In a possible embodiment of the method according to the first aspect of the present invention, the numerical input vector is applied to an input layer of a trained feedforward artificial neural network.

[0008] In a still further possible embodiment of the method according to the first aspect of the present invention, the used artificial neural network comprises at least one hidden layer having nodes adapted to apply a non-linear activation function, in particular a ReLU activation function.

[0009] In a further possible embodiment of the method according to the first aspect of the present invention, the number of nodes in the last hidden layer of the used artificial neural network is equal to a dimensionality of a relationship core tensor obtained as a result of the tensor factorization of the stored relationship tensor.

[0010] In a further possible embodiment of the method according to the first aspect of the present invention, the used artificial neural network comprises an output layer having nodes adapted to apply a sigmoid activation function to compute the output score vector.

[0011] In a possible embodiment of the method according to the first aspect of the present invention, the numerical vector comprises for each available item a vector element having a numerical value indicating how many of the respective available items have been selected by a user or agent in the current selection session.
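The numerical input vector of paragraph [0011] can be sketched as follows; this is a minimal illustration, and the item catalogue, item names and selections are purely hypothetical:

```python
from collections import Counter
import numpy as np

# Hypothetical catalogue of N = 5 available items (names are illustrative).
catalogue = ["controller", "sensor", "actuator", "cable", "panel"]

def build_input_vector(selected, catalogue):
    """Return the numerical input vector V: one count per available item."""
    counts = Counter(selected)
    return np.array([counts[item] for item in catalogue], dtype=float)

# Two controllers and one actuator selected so far in the current session.
V = build_input_vector(["controller", "actuator", "controller"], catalogue)
print(V)  # [2. 0. 1. 0. 0.]
```

Each vector element thus records how many units of the corresponding available item are contained in the current selection session.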

[0012] In a further possible embodiment of the method according to the first aspect of the present invention, the relationship tensor is decomposed by means of tensor factorization into a relationship core tensor and factor matrices.

[0013] In a still further possible embodiment of the method according to the first aspect of the present invention, the relationship tensor is derived automatically from a stored knowledge graph wherein the knowledge graph comprises nodes representing historic selection sessions, nodes representing available items and nodes representing features or attributes of the available items and further comprises edges representing relations between nodes of the knowledge graph.

[0014] In a further possible embodiment of the method according to the first aspect of the present invention, the relationship tensor comprises a three-dimensional contain-relationship tensor wherein each tensor element of the three-dimensional contain-relationship tensor represents a triple within the knowledge graph,
wherein the triple consists of a first node representing a selection session, a second node representing an available item and a contain-relationship between both nodes indicating that the selection session represented by the first node of the knowledge graph contains the item represented by the second node of the knowledge graph.

[0015] In a further possible embodiment of the method according to the first aspect of the present invention, the three-dimensional relationship tensor comprises a sparse tensor, wherein each tensor element has a logic high value if the associated triple is existent in the stored knowledge graph and has a logic low value if the associated triple is not existent in the stored knowledge graph.
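The sparse contain-relationship tensor of paragraphs [0014] and [0015] can be sketched as follows for the triples of the example of Fig. 2; the index maps are assumptions made for this sketch, and a dense array is used only for clarity:

```python
import numpy as np

# Illustrative index maps; in practice they would be derived from the KG.
sessions = {"SS1": 0, "SS2": 1}
relations = {"contains": 0}
items = {"I1": 0, "I2": 1}

# Triples (session, relation, item) present in the knowledge graph of Fig. 2.
triples = [("SS1", "contains", "I1"),
           ("SS2", "contains", "I1"),
           ("SS2", "contains", "I2")]

# Dense array for clarity; a real relationship tensor Tr would use a sparse format.
Tr = np.zeros((len(sessions), len(relations), len(items)))
for s, r, i in triples:
    Tr[sessions[s], relations[r], items[i]] = 1  # logic high for existing triples

print(Tr[:, 0, :])
# [[1. 0.]
#  [1. 1.]]
```

Tensor elements for triples not present in the knowledge graph remain at the logic low value zero.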

[0016] In a still further possible embodiment of the method according to the first aspect of the present invention, the relationship tensor is decomposed automatically via Tucker decomposition into a product consisting of a transposed factor matrix, a relationship core tensor and a factor matrix.

[0017] In a still further possible embodiment of the method according to the first aspect of the present invention, the score vector comprises as vector elements relevance scores for each available item used to sort automatically the available items in a ranking list for selection by a user or by an agent in the current selection session.

[0018] In a further possible embodiment of the method according to the first aspect of the present invention, the numerical value of each item within the numerical vector selected by the user or agent in the current selection session from the ranking list is automatically incremented.

[0019] In a still further possible embodiment of the method according to the first aspect of the present invention, the knowledge graph is generated automatically by combining historical selection session data comprising for all historic selection sessions the items selected in the respective historic selection sessions and technical data of the items comprising for each item attributes of the respective item.

[0020] In a still further possible embodiment of the method according to the first aspect of the present invention, if the current selection session is completed, all items selected in the completed selection session and represented by the associated numerical input vector are used to extend the historical selection session data.

[0021] In a further possible embodiment of the method according to the first aspect of the present invention, the extended historic selection session data is used to update the stored knowledge graph and/or to update the relationship tensor derived from the updated knowledge graph.

[0022] In a still further possible embodiment of the method according to the first aspect of the present invention, the steps of providing the numerical input vector, calculating the compressed vector, computing the output score vector and sorting the available items for selection are performed iteratively until the current selection session is completed by the user or by the agent.

[0023] In a still further possible embodiment of the method according to the first aspect of the present invention, the available items comprise hardware components and/or software components selectable for the configuration of the respective system.

[0024] The invention further provides according to a further aspect a platform used for selection of items from context aware sorted available items in a selection session, comprising the features of claim 18.

[0025] The invention provides according to the second aspect a platform used for selection of items from context aware sorted available items in a selection session,
wherein the selected items are used for the configuration of a system, in particular an industrial system, said platform comprising
a processing unit adapted to calculate a compressed vector from a numerical input vector representing items selected in a current selection session as context,
wherein the compressed vector is calculated from the numerical input vector using an artificial neural network adapted to capture non-linear dependencies between items,
wherein the processing unit is adapted to multiply the compressed vector with a weight matrix derived from a factor matrix obtained as a result of a tensor factorization of a stored relationship tensor representing relations between selections of items performed in historical selection sessions, available items and their attributes to compute an output score vector,
wherein the available items are sorted automatically by the processing unit for selection in the current selection session according to relevance scores of the output score vector computed by said processing unit.

[0026] In a possible embodiment of the platform according to the second aspect of the present invention, the processing unit has access to a memory of the platform which stores a knowledge graph and/or the relationship tensor derived from the knowledge graph.

[0027] In a still further possible embodiment of the platform according to the second aspect of the present invention, the platform comprises an interface used for selecting items in a selection session from a ranking list of available items sorted according to the relevance scores of the computed output score vector.

[0028] In the following, possible embodiments of the different aspects of the present invention are described in more detail with reference to the enclosed figures.
Fig. 1
shows a schematic block diagram for illustrating a possible exemplary embodiment of a platform for selection of items according to an aspect of the present invention;
Fig. 2
shows schematically an exemplary knowledge graph for illustrating the operation of the method and platform according to the present invention;
Fig. 3
illustrates schematically the decomposition of a tensor performed by the method and apparatus according to the present invention;
Fig. 4
illustrates a further example of an industrial knowledge graph;
Fig. 5
illustrates the operation of a computer-implemented method according to the present invention;
Fig. 6
shows a flowchart of a possible exemplary embodiment of a computer-implemented method for context aware sorting of items according to a further aspect of the present invention.


[0029] As can be seen in the block diagram of Fig. 1, a platform 1 according to an aspect of the present invention comprises in the illustrated embodiment a processing unit 2 having access to a memory or database 3. The platform 1 illustrated in Fig. 1 can be used for selection of items from context aware sorted available items in a selection session. The items can form a variety of different items used for the configuration of a technical system, in particular an industrial system or automation system requiring a plurality of different items for its configuration. The processing unit 2 can be implemented on a server of a service provider providing items which can be used by an end customer to build up an industrial system or a complex product from a plurality of different hardware and/or software components forming available items provided by the service provider.

[0030] The processing unit 2 as shown in the embodiment of Fig. 1 can comprise several processing stages 2A, 2B, 2C each having at least one processor adapted to perform calculations. The processing unit 2 can have access to a local memory 3 or via a network to a remote memory 3. In the illustrated exemplary embodiment, the processing unit 2 comprises a first processing stage 2A adapted to process a numerical input vector V received by the processing unit 2 via a user interface 4 of a user terminal operated by an end customer or user. In a possible embodiment, the user terminal 4 can also be connected via a data network to the processing unit 2 implemented on the server of the service provider. In a possible embodiment, to start a selection session the end customer has to be authorized by the platform 1. After having initiated the selection session the end customer can start to select items from available items provided by the service provider or manufacturer of the items, i.e. the hardware and/or software components necessary to implement or build the respective industrial system. These items can for instance comprise sensor items, actuator items, cables, display panels or controller items as hardware components of the system. The items can also comprise software components, i.e. different versions of executable software programs. The numerical input vector V is provided in the initiated current selection session as context to the platform 1. The processing unit 2 is adapted to perform the computer-implemented method illustrated in the flowchart of Fig. 6. The processing unit 2 is adapted to calculate a compressed vector Vcomp from the numerical input vector V using an artificial neural network ANN. 
The compressed vector Vcomp is multiplied with a weight matrix EI derived from a factor matrix E obtained as a result of a tensor factorization of a stored relationship tensor Tr representing relations r between selections of items performed in historic selection sessions and available items as well as their attributes to compute a score output vector S. The available items are sorted by the processing unit 2 for selection in the current selection session according to relevance scores of the computed score vector S calculated by the processing unit 2 in response to the compressed vector Vcomp using the weight matrix EI.

[0031] In the illustrated exemplary embodiment of Fig. 1, the processing unit 2 comprises three processing stages. In the first processing stage 2A, the compressed vector Vcomp is calculated from the received numerical vector V representing items selected by the customer in the current selection session as context. The numerical input vector V comprises for each available item a vector element having a numerical value indicating how many of the respective available items have been selected by the user or agent in the current selection session. The number N of vector elements within the numerical vector V corresponds to the number N of available items.



[0032] For instance, a first vector element V1 comprises a value indicating how many units of the first item have been selected by the customer in the current selection session. On the basis of the received numerical input vector V, the first processing stage 2A of the processing unit 2 calculates the compressed vector Vcomp from the received numerical vector V using an artificial neural network ANN and using a stored relationship tensor Tr representing relations between selections of items performed in historic selection sessions and the available items. The relationship tensor Tr is decomposed by means of tensor factorization into a relationship core tensor Gr and factor matrices E as illustrated in Figs. 3, 5. The relationship core tensor Gr and the factor matrices E are used to calculate the compressed vector Vcomp from the received numerical input vector V.



[0033] The compressed vector Vcomp comprises M vector elements wherein M<<N. In a preferred embodiment, the decomposed relationship tensor Tr is stored in the memory 3 as also illustrated in Fig. 1. The relationship tensor Tr is derived automatically from a stored knowledge graph KG. Fig. 2 and Fig. 4 show schematically examples of such a knowledge graph KG. The knowledge graph KG comprises in a possible embodiment nodes representing historic selection sessions SS, nodes representing available items such as system components and/or nodes representing features or attributes f of available items. The different nodes of the knowledge graph KG are connected via edges representing the relations r between nodes of the knowledge graph KG. One of the relations r is a contain relation c as illustrated in Fig. 2. In the illustrated example of Fig. 2, the historic selection session SS1 contains the item I1, for instance a specific controller which can be used for the implementation of a production facility. Further, another historic selection session SS2 also contains this item I1. The second historic selection session SS2 further contains a second item I2 as shown in Fig. 2. All items I1, I2 can comprise one or several features or attributes f, in particular technical features. The relationships within the knowledge graph KG can comprise other relations such as type or size or e.g. a specific supply voltage. In a possible embodiment, the knowledge graph KG as illustrated schematically in Fig. 2 can be enriched by the owner of the platform 1. In a possible embodiment, the knowledge graph KG stored in the memory 3 can be generated automatically by combining historical selection session data hss and technical data comprising for each item features f of the respective item as also illustrated in Fig. 1. 
The historical selection session data can comprise for all historic selection sessions SS performed by the same or different users the items selected in the respective historic selection session SS. For instance, historic selection session data can comprise a list of all historic selection sessions SS and the associated items selected within the respective historic selection session SS. The features, i.e. attributes f, of the items I can comprise technical features such as type, size or supply voltage of the item. Other examples of the features f can also comprise different operation modes available for the specific item. For instance, a feature or attribute can indicate whether the respective component provides a fail-safe operation mode or not. Besides the technical features f, the knowledge graph KG can also comprise additional features f such as the price of the respective item. In a possible embodiment, the knowledge graph KG is generated automatically by combining the available historic selection session data and the available known features f of the items I in a preparation phase. Further, it is possible to derive in the preparation phase a corresponding relationship tensor automatically from the generated knowledge graph KG. Further, it is possible that the generated tensor is also already decomposed to provide a core tensor Gc available to the processing unit 2 of the platform 1.
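The combination of historic selection session data and technical item data into knowledge-graph triples described above can be sketched as follows; all session names, item names and attribute values are illustrative assumptions:

```python
# Sketch of the preparation phase: merge historical selection-session data
# with technical item data into knowledge-graph triples (subject, relation, object).
historic_sessions = {"SS1": ["I1"], "SS2": ["I1", "I2"]}
item_features = {"I1": {"type": "controller", "voltage": "24V"},
                 "I2": {"type": "sensor"}}

triples = []
# Contain-relations between sessions and the items they selected.
for session, selected_items in historic_sessions.items():
    for item in selected_items:
        triples.append((session, "contains", item))
# Attribute relations between items and their technical features.
for item, feats in item_features.items():
    for attr, value in feats.items():
        triples.append((item, attr, value))

print(len(triples))  # 6
```

The resulting triple set corresponds to the merged knowledge graph KG that can then be represented as a relationship tensor.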

[0034] The first processing stage 2A of the processing unit 2 is adapted to calculate the compressed vector Vcomp from the received numerical vector V using a trained artificial neural network ANN as also illustrated in Fig. 5.

[0035] The relationship tensor Tr can be decomposed according to the following equation:

        Tr ≈ ET × Gc × E        (3)

wherein E is a factor matrix (embedding matrix), ET is its transpose and Gc is the core tensor.
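Assuming the element-wise contraction implied by equation (3), i.e. Tr[i, r, j] = Σ_ab E[a, i] · Gc[a, r, b] · E[b, j], the reconstruction of the relationship tensor from the core tensor and the factor matrix can be sketched as follows; all shapes and the random values are purely illustrative:

```python
import numpy as np

# Illustrative dimensions: N = 4 entities, R = 1 relation, M = 2 latent factors.
rng = np.random.default_rng(0)
N, R, M = 4, 1, 2
E = rng.normal(size=(M, N))      # factor (embedding) matrix
Gc = rng.normal(size=(M, R, M))  # relationship core tensor

# Reconstruct the (approximate) relationship tensor from the factors.
Tr_hat = np.einsum('ai,arb,bj->irj', E, Gc, E)
print(Tr_hat.shape)  # (4, 1, 4)
```

The frontal slices of the reconstructed tensor then approximate the adjacency matrices of the individual relations.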

[0036] The second processing stage 2B of the processing unit 2 is adapted to calculate an output score vector S for the compressed vector Vcomp output by the first processing stage 2A. The score vector S provides relevance scores for the different available items.

[0037] The compressed vector Vcomp is calculated by the trained artificial neural network implemented in the first processing stage 2A.

[0038] On the basis of the calculated compressed vector Vcomp, it is possible to calculate the output score vector S by multiplication as follows:

        S = EI × Vcomp

wherein EI is a weight matrix derived from the factor matrix (embedding matrix) E obtained as a result of the tensor decomposition as specified in equation (3).
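The score computation of the second processing stage 2B can be sketched as follows; the sigmoid of the output layer (paragraph [0010]) is applied to the product, and the shapes and random weights are purely illustrative (M = 3 compressed dimensions, N = 5 items):

```python
import numpy as np

rng = np.random.default_rng(1)
EI = rng.normal(size=(5, 3))   # weight matrix derived from the factor matrix E
V_comp = rng.normal(size=3)    # compressed vector output by the ANN

S = 1.0 / (1.0 + np.exp(-(EI @ V_comp)))  # sigmoid(EI · Vcomp)
ranking = np.argsort(-S)       # item indices sorted by decreasing relevance score
```

The indices in `ranking` give the order in which the available items would be presented to the user.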

[0039] The third processing stage 2C of the processing unit 2 is adapted to sort automatically the available items for selection in the current selection session according to the relevance scores of the calculated score vector S.

[0040] In a possible embodiment, the relationship tensor Tr comprises a three-dimensional contain-relationship tensor. Each tensor element of the three-dimensional contain-relationship tensor represents a triple t within the knowledge graph KG.



[0041] Each triple t consists of a first node n1 representing a selection session SS in the knowledge graph KG, a second node n2 representing an available item I in the knowledge graph KG and a contain-relationship c between both nodes n1, n2 indicating that the selection session SS represented by the first node n1 of the knowledge graph KG contains the item I represented by the second node n2 of the knowledge graph KG. For instance, a tensor element of the three-dimensional relationship tensor Tr represents a triple SS1, c, I1 in the knowledge graph KG shown in Fig. 2. The three-dimensional relationship tensor Tr accordingly comprises a sparse tensor. Each tensor element within the three-dimensional relationship tensor Tr comprises a logic high value (H) if the associated triple t is existent in the stored knowledge graph KG and comprises a logic low value (L) if the associated triple is not existent in the stored knowledge graph KG. In a possible embodiment, the stored relationship tensor Tr can be decomposed automatically via Tucker decomposition into a product consisting of a transposed factor matrix ET, a relationship core tensor Gr, and a factor matrix E as expressed in equation (3) above. The score vector S can be computed by the second stage 2B of the processing unit 2 by multiplying the compressed vector Vcomp output by the trained artificial neural network ANN with the weight matrix EI as illustrated in Fig. 5. The calculated score vector S comprises as vector elements relevance scores for each available item I used by the sorting stage 2C to sort the available items I in a ranking list for selection by a user or by an agent in the current selection session SS. The items I sorted according to the ranking list can be displayed in a possible embodiment on a display of a graphical user interface 4 to the user performing the selection in the current selection session SS. 
If the user selects an item from the available items, the corresponding vector element of the numerical input vector V is incremented by the number of items selected by the user. The numerical value of each item I within the numerical input vector V selected by the user or agent in the current selection session SS from the ranking list is automatically incremented. If the current selection session SS is completed, all items I selected in the completed selection session SS and represented by its associated numerical vector V can be used to extend the historical selection session data stored in the memory 3 of the platform 1. The extended historic selection session data can be used to update the stored knowledge graph KG and to update the relationship tensor Tr derived from the updated knowledge graph KG.

[0042] The processing steps of providing the numerical vector V, calculating the compressed vector Vcomp, computing the score vector S and sorting available items I for selection performed within the stages of the processing unit 2 can be performed in a possible embodiment iteratively until the current selection session SS is completed by the user or agent.
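The iterative loop of paragraph [0042] can be sketched as follows; `rank_items` stands in for the processing stages 2A to 2C (steps S2 to S4) and is a hypothetical placeholder assumed to return item indices sorted by relevance:

```python
# Sketch of the iterative selection loop: after every user pick the input
# vector V is incremented and the ranking is recomputed with the new context.
def run_session(picks, n_items, rank_items):
    V = [0.0] * n_items
    ranking = rank_items(V)       # initial ranking for the empty context
    for item in picks:            # loop until the user completes the session
        V[item] += 1              # increment the selected item's count
        ranking = rank_items(V)   # re-sort the items with the updated context
    return V, ranking

# Toy stand-in ranker: sort items by how often they were already selected.
V, ranking = run_session([0, 2, 0], 4,
                         rank_items=lambda V: sorted(range(len(V)), key=lambda i: -V[i]))
print(V)  # [2.0, 0.0, 1.0, 0.0]
```

Because only the forward computation of the compressed vector and the matrix multiplication are repeated, the re-ranking remains cheap enough to run after every selection.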

[0043] Fig. 6 shows a flowchart of a possible exemplary embodiment of a computer-implemented method for context aware sorting of items available for the configuration of a system, in particular an industrial system, during a selection session.

[0044] In the illustrated exemplary embodiment, the method comprises four main steps S1, S2, S3, S4.

[0045] In a first step S1, a numerical vector V representing items I selected in the current selection session SS is provided as context for the sorting.

[0046] In a second step S2, the compressed vector Vcomp is calculated from the numerical input vector V using a trained artificial neural network ANN adapted to capture non-linear dependencies between the items. The artificial neural network ANN can comprise in a preferred embodiment a feedforward artificial neural network. The numerical input vector V is applied to an input layer of the trained feedforward artificial neural network ANN as also illustrated in the diagram of Fig. 5. The used artificial neural network ANN comprises at least one hidden layer having nodes adapted to apply a non-linear activation function σ. In a possible embodiment, the activation function is a ReLU activation function. Other non-linear activation functions σ can also be used. The number of nodes in the last hidden layer of the used artificial neural network ANN is equal to a dimensionality of a relationship core tensor Gc obtained as a result of the tensor factorization of the stored relationship tensor Tr. The used artificial neural network comprises an output layer having nodes adapted to apply a sigmoid activation function to compute an output score vector S.
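The forward pass through such a feedforward network can be sketched as follows; the shapes (N = 6 items, one hidden layer of M = 3 nodes matching the core-tensor dimensionality) and the random weights W1 and EI are assumptions made purely for illustration, since a real platform would learn or derive them:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(42)
N, M = 6, 3
W1 = rng.normal(size=(M, N))   # input layer -> hidden layer weights (learned in practice)
EI = rng.normal(size=(N, M))   # output-layer weight matrix derived from E

V = np.array([2.0, 0.0, 1.0, 0.0, 0.0, 1.0])  # counts of currently selected items

V_comp = relu(W1 @ V)          # compressed vector Vcomp (step S2)
S = sigmoid(EI @ V_comp)       # output score vector S (step S3)
```

The ReLU hidden layer produces the M-dimensional compressed vector, and the sigmoid output layer maps it to a relevance score in (0, 1) for each of the N available items.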

[0047] In a further step S3, the compressed vector Vcomp calculated in step S2 is multiplied with a weight matrix EI as illustrated in the schematic diagram of Fig. 5. The weight matrix EI is derived from a factor matrix E (embedding matrix) obtained as a result of a tensor factorization of a stored relationship tensor Tr representing relations between selections of items performed in historical (previous) selection sessions, available items and their attributes to compute the output score vector S.

[0048] Finally, in step S4, the available items for selection in the current selection session are sorted according to the relevance scores of the score vector computed in step S3.

[0049] The platform 1 according to the present invention takes into account contextual properties of selection sessions. The platform 1 makes use of a knowledge database which can contain historic data of selection sessions SS performed by users in the past but also descriptive features of the different available items. This leads to a graph-structured, multi-relational data description, i.e. a knowledge graph KG, which is equivalently represented as a high-dimensional tensor T. In this setting, predicting an edge in the knowledge graph KG corresponds to predicting a positive entry in the knowledge tensor. The method exploits the sparsity of this knowledge tensor by finding a low-rank approximation via tensor factorization such as Tucker decomposition of the tensor. The platform 1 as illustrated in Fig. 1 takes into account the current configuration of the project, i.e. the items selected by the user in the current selection session SS as well as descriptive features f and attributes of the available items, and not just historical data about past user behavior. In a preparation phase of the platform 1, a joint database is formed and a tensor factorization model is fitted. This is resource-consuming and can be executed either in regular time intervals or when new information becomes available and is included into the database 3.

[0050] In a separate execution phase, the end customer or agent can perform a process of configuration of the respective industrial system. During the execution phase, the method for context aware sorting of items for the configuration of the system as illustrated in Fig. 6 can be performed by a processing unit of the platform 1. It provides for a dynamic adjustment of the order of the displayed or output items depending on the current user actions. The sorting of the items is performed on the basis of the compressed vector Vcomp, which can be computed efficiently and executed multiple times as the customer modifies his selection in the current selection session SS. The historic selection session data stored in the database 3 can contain information about previously configured solutions with respect to the implemented system. This is typically an integer-valued data matrix stored in CSV data format, where the rows correspond to the different project solutions, i.e. historic selection sessions, and the columns correspond to the different available items.
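The CSV data matrix described above can be sketched as follows; the session names, item columns and counts are illustrative only:

```python
import csv
import io

# Illustrative CSV of historic selection sessions: rows are sessions,
# columns are integer counts per available item.
raw = "session,I1,I2,I3\nSS1,1,0,0\nSS2,1,1,0\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Integer-valued data matrix: one row per historic selection session.
matrix = [[int(r[c]) for c in ("I1", "I2", "I3")] for r in rows]
print(matrix)  # [[1, 0, 0], [1, 1, 0]]
```

Each row of this matrix corresponds to one historic selection session and can be turned into contain-triples of the knowledge graph KG.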

[0051] Further, the database 3 can comprise technical information of the different items. This data can comprise detailed technical information about each item such as type information, voltage, size, etc.

[0052] The knowledge graph KG can comprise merged information of the historical selection session data and the technical information about the features f. The knowledge graph KG can be stored e.g. in an RDF format or as a triple store. The knowledge graph KG can equivalently be represented as a sparse numerical tensor with three modes, where the frontal slices correspond to adjacency matrices with respect to the different edge types and/or relations. A factorized tensor forming a low-rank approximation of the knowledge graph KG can be stored in a set of numerical tensors. Different algorithms can be used to compute a tensor factorization, such as a Tucker decomposition or a CP decomposition.
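The construction of such a three-mode tensor from knowledge graph triples can be sketched as follows (a minimal Python illustration; the entity and relation names are hypothetical and not part of the embodiment):

```python
import numpy as np

# Hypothetical triples (head, relation, tail) extracted from a
# knowledge graph KG; names are illustrative only.
triples = [
    ("ss1", "contains", "item1"),
    ("ss1", "contains", "item2"),
    ("item1", "category", "controller"),
    ("item2", "category", "socket"),
]

entities = sorted({h for h, _, _ in triples} | {t for _, _, t in triples})
relations = sorted({r for _, r, _ in triples})
e_idx = {e: i for i, e in enumerate(entities)}
r_idx = {r: i for i, r in enumerate(relations)}

# Sparse binary tensor with three modes: entities x entities x relations.
# Each frontal slice T[:, :, k] is the adjacency matrix of relation k.
T = np.zeros((len(entities), len(entities), len(relations)))
for h, r, t in triples:
    T[e_idx[h], e_idx[t], r_idx[r]] = 1.0

# An existing edge of the knowledge graph is a positive tensor entry.
assert T[e_idx["ss1"], e_idx["item1"], r_idx["contains"]] == 1.0
```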

[0053] The numerical vector V corresponds to a new selection session SS that is in the process of configuration, i.e. where a customer can currently add further items into the selection.

[0054] The compressed vector Vcomp is a numerical vector that contains a model-based compression of the numerical input vector V using the artificial neural network ANN. The sorting stage 2C can provide a ranked list of items, i.e. a model-based ranking of all items specific to the current selection within the current selection session. The items are presented to the user on the user interface 4 in a sorted order according to the calculated rank of each item. Ranking helps the customer or user to quickly find the items that he wants to configure by displaying the most relevant items in an exposed top position of a list. Further, the sorting according to the rank helps the user to know which items match the current selection input by the user into the user interface 4. Ranking can serve as an indicator of which items complement the already configured components or items selected in the current selection session. Assisted by the ranking, the user can add additional items into a selected group of items of the current selection session SS. The numerical vector V is updated accordingly in the current selection session.
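The sorting of the available items according to their relevance scores can be sketched as follows (a minimal Python illustration; the item names and score values are hypothetical):

```python
import numpy as np

# Hypothetical relevance scores S for four available items.
item_names = ["panel", "cable", "controller", "socket"]
S = np.array([0.12, 0.85, 0.64, 0.31])

# Sort the items in descending order of relevance so that the most
# relevant items appear in the exposed top position of the list.
order = np.argsort(-S)
ranked = [item_names[i] for i in order]
print(ranked)  # ['cable', 'controller', 'socket', 'panel']
```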

[0055] The platform 1 according to the present invention as illustrated in Fig. 1 can take into account the context in which a purchase order or selection has been made, i.e. what other items have already been selected by the end customer in the current selection session SS. This allows the platform 1 to estimate what might be the end goal of the end customer with respect to the chosen components or items.

[0056] Further, the platform 1 takes into account the predefined relationships between the items, e.g. area of application, compatibility, item "tier", etc. This contextual knowledge significantly enhances the overall quality of the inherent recommendations of items for the further selection provided by the sorting of the output items. Further, if an item I is previously unseen, the platform 1 can still make meaningful recommendations by embedding the item I into the previously constructed latent space via its contextual description.

[0057] The method for context aware sorting of items I according to the present invention can be performed in a fully automated process generating functions in a source code of a product configurator platform. The platform 1 allows ranking items, including hardware and/or software components, intelligently, making the setup of an industrial system, in particular an automation system, easier and speeding up the process of configuration of a technical system. In a possible embodiment, the knowledge graph KG can also be enriched by the platform owner of the platform 1. In a possible embodiment, the knowledge graph KG, also illustrated in Figs. 2, 4, can be editable and displayed to the platform owner for enriching the graph with additional nodes and/or edges, in particular relevant features f.

[0058] In a preferred embodiment, the platform 1 and method according to the present invention make use of tensor decompositions (tensor factorization) to provide a factor matrix E from which a weight matrix EI is derived, which is used to calculate an output score vector S with relevance scores used to sort the available items I. A three-dimensional tensor T can be seen as a data cube having tensor elements. In a possible embodiment of the platform 1 according to the present invention, the tensor elements correspond to triples in the knowledge graph KG. Different algorithms can be employed for tensor decomposition of the tensor T. In a possible embodiment, a Tucker decomposition is applied. In an alternative embodiment, a canonical polyadic decomposition CPD can be applied. The decomposition algorithm can be performed by a processor of the processing unit 2. The Tucker decomposition decomposes the tensor T into a so-called core tensor Gc and multiple matrices which can correspond to different core scalings along each mode. The core tensor Gc expresses how and to what extent different tensor elements interact with each other.
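A Tucker decomposition of this kind can be sketched in Python via a truncated higher-order SVD (HOSVD); this is one of several possible algorithms and not necessarily the one used in the embodiment:

```python
import numpy as np

def unfold(T, mode):
    """Matricize the tensor T along the given mode."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tucker_hosvd(T, ranks):
    """Truncated HOSVD: T is approximated by a core tensor Gc
    multiplied along each mode with a factor matrix."""
    factors = []
    for mode, rank in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :rank])
    # Core tensor Gc: project T onto the factor matrices mode by mode.
    Gc = T
    for mode, U in enumerate(factors):
        Gc = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(Gc, mode, 0), axes=1), 0, mode
        )
    return Gc, factors

rng = np.random.default_rng(0)
T = rng.random((4, 4, 2))      # illustrative dense stand-in tensor
Gc, factors = tucker_hosvd(T, (2, 2, 2))
print(Gc.shape)  # (2, 2, 2)
```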

[0059] The platform 1 according to the present invention comprises two major building blocks. A memory 3 is adapted to store a knowledge graph KG which allows structuring context information about items. The relationship tensor Tr is derived automatically from the stored knowledge graph KG and also stored in the memory 3 as illustrated in Fig. 1. The tensor factorization is performed for the relationship tensor Tr, providing a factor matrix E from which the weight matrix EI is derived. The compressed vector Vcomp output by the artificial neural network ANN is multiplied with this weight matrix EI to compute an output score vector S. The available items are then sorted automatically for selection in the current selection session according to the relevance scores of the calculated score vector S. An artificial neural network ANN is used to compress the numerical input vector V to generate the compressed vector Vcomp. The artificial neural network ANN acts as an encoder. Accordingly, the platform 1 comprises an autoencoder-like structure that results in a context-aware recommendation engine.
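The computation chain of this paragraph can be sketched as follows (a minimal Python illustration with random, untrained weights; one hidden ReLU layer and a sigmoid output layer are assumed, as in the claims):

```python
import numpy as np

rng = np.random.default_rng(42)
n_items, hidden_dim = 6, 3

# Numerical input vector V: counts of items selected so far (illustrative).
V = np.array([1.0, 0.0, 2.0, 0.0, 0.0, 1.0])

# Encoder ANN: one hidden layer with ReLU activation compresses V.
W1 = rng.normal(size=(hidden_dim, n_items))
b1 = np.zeros(hidden_dim)
V_comp = np.maximum(0.0, W1 @ V + b1)       # compressed vector Vcomp

# Weight matrix EI derived from the factor matrix E of the tensor
# factorization (random stand-ins for the learned item embeddings).
E = rng.normal(size=(n_items, hidden_dim))  # item embeddings
EI = E                                      # weight sharing: EI reuses E

# Output score vector S with one relevance score per available item.
S = 1.0 / (1.0 + np.exp(-(EI @ V_comp)))    # sigmoid output layer

# Sort the available items by descending relevance score.
ranking = np.argsort(-S)
print(S.shape, ranking.shape)  # (6,) (6,)
```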

[0060] The knowledge graph KG stored in the memory 3 contains technical information of the configurable items I and past selection sessions for configurations. All entities under consideration correspond to vertices, i.e. nodes, in a directed multigraph, i.e. a graph with typed edges. Relations in the knowledge graph KG specify how the entities (nodes) are connected with each other. For example, selection sessions (solutions) can be linked to items I via a contain relationship c which specifies which items have been configured in a solution or selection session. Other relations within the knowledge graph KG link items I with technical attributes or features. The knowledge graph KG has a numerical representation in terms of an adjacency relationship tensor Tr. In a possible embodiment, latent representations, i.e. low-dimensional vector space embeddings, of the items I can be computed with the help of RESCAL to perform a tensor factorization of the adjacency relationship tensor Tr. These embeddings preserve a local proximity of the available items I. Hence, if items are similar from a technical point of view or if they are often configured together, i.e. in a selection session SS, they are close to each other in the latent feature space.
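RESCAL factorizes each frontal slice X_r of the adjacency tensor as A R_r A^T, with a shared entity embedding matrix A and one relation matrix R_r per relation. How such factors score a candidate edge can be sketched as follows (the factors here are random stand-ins, not trained values):

```python
import numpy as np

rng = np.random.default_rng(1)
n_entities, latent_dim, n_relations = 5, 2, 2

# RESCAL factors: shared entity embedding matrix A and one relation
# matrix R[r] per relation (random stand-ins for trained factors).
A = rng.normal(size=(n_entities, latent_dim))
R = rng.normal(size=(n_relations, latent_dim, latent_dim))

def score(head, relation, tail):
    """Reconstructed tensor entry ~ plausibility of edge (h, r, t)."""
    return float(A[head] @ R[relation] @ A[tail])

# Predicting an edge in the knowledge graph corresponds to predicting
# a large (positive) reconstructed entry in the adjacency tensor.
print(score(0, 1, 3))
```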

[0061] Fig. 4 shows a depiction of an exemplary knowledge graph KG. A corresponding adjacency relationship tensor Tr can be factorized as illustrated in Fig. 5. The adjacency tensor Tr can in a possible embodiment be three-dimensional with the dimensions: entities x entities x relations. The number of entities e can be quite high, e.g. 43,948 entities, connected with each other through different relations r. Entities e comprise selection sessions ss (solutions), items I and attributes. A solution or a selection session ss comprises a set of items I selected to configure a complex system. The items I can comprise hardware items and/or software items. Examples of hardware items are display panels, cables or processors. Examples of software items are software modules or software components. Attributes or features f of the entities e indicate properties of the items I. Examples of the relations within the knowledge graph KG and the corresponding tensor comprise a contain relationship c, a category relationship cat and other kinds of relationships, for instance the line voltage applied to the respective item. A selection session can contain one or more items I. An item I can also belong to a category. For instance, an item I (I1 in Fig. 4) can belong to the category controller CONT, another item I (I2 in Fig. 4) can belong to the category socket SOCK. A knowledge graph KG such as illustrated in Fig. 4 captures technical information describing configurable items I and past solutions or configurations. The knowledge graph KG makes it possible to structure context information about items. The platform 1 makes use of this information for recommendation purposes via a tensor factorization. The artificial neural network ANN acts as an encoder for solutions.

[0062] An industrial system or automation solution can be very complex and can comprise a wide range of subsystems and components such as controllers, panels and software modules. Each component can comprise different features or attributes that are required for the proper operation of the overall industrial system. Conventionally, finding a suitable solution (i.e. configuration) of the industrial system involves a rather high effort and requires expertise. The method and platform 1 according to the present invention overcome this obstacle and can recommend a set of items I that complement a user's current partial solution or selection and/or reorder a list of all available items based on their relevance, e.g. displaying the items I that are most relevant first. With the method and platform 1 according to the present invention, relevance scores for all items I are computed. These relevance scores are adjusted dynamically depending on the components or items I a user has already configured in a partial solution, i.e. a partial selection session ss.

[0063] A feedforward artificial neural network ANN can be used to extract high-level representations of solutions that capture non-linear interactions or dependencies among different items I. The artificial neural network ANN is used to compute a score vector S with relevance scores for each item I based on the item embeddings (embedding matrix E) which are obtained by the tensor factorization. The platform 1 according to the present invention comprises an autoencoder-like structure where the embedding matrix E (factorization matrix) can serve as a basis to derive a weight matrix EI multiplied with the compressed vector Vcomp output by the artificial neural network ANN. The calculated output score vector S comprises relevance scores and can be used to reorder the items I and/or recommend certain items I to a user or configuration unit that may complement other items or components the user or configuration unit has already configured. A weight sharing mechanism can be used to train the model end-to-end. The overall architecture of the platform 1 according to the present invention is also illustrated in Fig. 5. The platform 1 is adapted to merge both historical data and technical information from industrial databases to form a joint multirelational knowledge graph KG stored in a memory or database 3 of the platform 1. It is possible to extract context-aware embeddings by factorizing the corresponding adjacency relationship tensor Tr as illustrated in Fig. 5. The resulting latent representations of items I are employed both in the tensor factorization as well as in the output layer of the autoencoder-like artificial neural network ANN that is employed for scoring items I based on a current configuration. The basic idea of the employed architecture is to form a graphical, multirelational knowledge base which contains technical information about items I as well as historical user-item interactions.
By factorizing the resulting adjacency relationship tensor Tr one can obtain semantically meaningful embeddings that preserve local proximity in the graph structure. This information is leveraged by coupling the tensor factorization with a deep learning autoencoder via a weight sharing mechanism. The modelling of context information leads to large performance gains and thus lowers the dependency on historical data. The tensor factorization-based recommendation system provided by the platform 1 according to the present invention integrates an artificial neural autoencoder as illustrated in Fig. 5. The platform 1 according to the present invention can be executed in a possible embodiment in real time via a simple forward pass. This is crucial in real-world applications where the platform 1 must work in real time while a user is configuring a solution or performing a selection session. By employing an artificial neural network ANN with non-linear activation functions, the platform 1 is sufficiently expressive to capture complex non-linear dependencies among items. This is advantageous in the case of automation solutions for industrial systems. The inclusion of context information further allows tackling the cold-start problem.
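The end-to-end training objective with weight sharing can be sketched as follows (a Python illustration with random, untrained weights; the binary cross-entropy target, reconstructing which items a completed session contains, is one plausible choice and an assumption of this sketch):

```python
import numpy as np

rng = np.random.default_rng(7)
n_items, hidden_dim = 6, 3

# Shared item embedding matrix E: used both in the tensor factorization
# and as the output-layer weights of the autoencoder (weight sharing).
E = rng.normal(size=(n_items, hidden_dim))
W1 = rng.normal(size=(hidden_dim, n_items))

def forward(V):
    V_comp = np.maximum(0.0, W1 @ V)            # ReLU encoder
    return 1.0 / (1.0 + np.exp(-(E @ V_comp)))  # sigmoid scores S

# Target: binary indicator of the items contained in a completed session.
V = np.array([1.0, 0.0, 2.0, 0.0, 0.0, 1.0])
y = (V > 0).astype(float)

S = forward(V)
# Binary cross-entropy loss that a gradient-based optimizer would
# minimize when training the model end-to-end.
eps = 1e-9
bce = -np.mean(y * np.log(S + eps) + (1 - y) * np.log(1 - S + eps))
print(round(float(bce), 4))
```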


Claims

1. A computer-implemented method for context aware sorting of items available for configuration of a system during a selection session, the method comprising the steps of:

(a) providing (S1) a numerical input vector, V, representing items selected in a current selection session as context;

(b) calculating (S2) a compressed vector, Vcomp, from the numerical input vector, V, using an artificial neural network, ANN, adapted to capture non-linear dependencies between items;

(c) multiplying (S3) the compressed vector, Vcomp, with a weight matrix, EI, derived from a factor matrix, E, obtained as a result of a tensor factorization of a stored relationship tensor, Tr, representing relations, r, between selections of items performed in historical selection sessions, available items and their attributes to compute an output score vector, S; and

(d) sorting (S4) automatically the available items for selection in the current selection session according to relevance scores of the computed output score vector, S.


 
2. The method according to claim 1 wherein the numerical input vector, V, is applied to an input layer of a trained feedforward artificial neural network, ANN.
 
3. The method according to claim 1 or 2 wherein the used artificial neural network, ANN, comprises at least one hidden layer having nodes adapted to apply a non-linear activation function, σ, in particular a ReLU activation function.
 
4. The method according to claim 3 wherein the number of nodes in the last hidden layer of the used artificial neural network, ANN, is equal to a dimensionality of a relationship core tensor, Gc, obtained as a result of the tensor factorization of the stored relationship tensor, Tr.
 
5. The method according to any of the preceding claims 1 to 4 wherein the used artificial neural network, ANN, comprises an output layer having nodes adapted to apply a sigmoid activation function to compute the output score vector, S.
 
6. The method according to any of the preceding claims 1 to 5 wherein the numerical input vector, V, comprises for each available item a vector element having a numerical value indicating how many of the respective available items have been selected by a user or agent in the current selection session.
 
7. The method according to any of the preceding claims 1 to 6 wherein the relationship tensor, Tr, is decomposed by means of tensor factorization into a relationship core tensor, Gc, and factor matrices.
 
8. The method according to any of the preceding claims 1 to 7 wherein the relationship tensor, Tr, is derived automatically from a stored knowledge graph, KG, wherein the knowledge graph, KG, comprises nodes, n, representing historical selection sessions, nodes, n, representing available items and nodes, n, representing technical attributes of the available items and further comprises edges, e, representing relationships, r, between nodes, n, of the knowledge graph, KG.
 
9. The method according to claim 8 wherein the relationship tensor, Tr, comprises a three-dimensional contain-relationship tensor, Tc, wherein each tensor element of the three-dimensional contain-relationship tensor, Tc, represents a triple, t, within the knowledge graph, KG, wherein the triple, t, consists of a first node, n1, representing a selection session, a second node, n2, representing an available item and a contain-relationship, rc, between both nodes, n1, n2, indicating that the selection session represented by the first node, n1, of the knowledge graph, KG, contains the item represented by the second node, n2, of the knowledge graph, KG.
 
10. The method according to claim 9 wherein the three-dimensional relationship tensor, Tr, comprises a sparse tensor, wherein each tensor element has a logic high value if the associated triple, t, is existent in the stored knowledge graph, KG, and has a logic low value if the associated triple, t, is not existent in the stored knowledge graph, KG.
 
11. The method according to any of the preceding claims 1 to 10 wherein the relationship tensor, Tr, is decomposed automatically via Tucker-decomposition into a product comprising a transposed factor matrix, ET, a relationship core tensor, Gc, and a factor matrix, E.
 
12. The method according to claim 11 wherein the output score vector, S, comprises as vector elements relevance scores for each available item used to sort the available items in a ranking list for selection by a user or by an agent.
 
13. The method according to claim 12 wherein the numerical value of each item within the numerical input vector, V, selected by the user or agent in the current selection session from the ranking list is automatically incremented.
 
14. The method according to any of the preceding claims 8 to 13 wherein the knowledge graph, KG, is generated automatically by combining historical selection session data comprising for all historical selection sessions the items selected in the respective historical selection sessions and technical data of the items comprising for each item attributes of the respective item,
wherein if the current selection session is completed all items selected in the completed selection session and represented by the associated numerical input vector, V, are used to extend the historical session data.
 
15. The method according to claim 14 wherein the extended historical session data is used to update the stored knowledge graph, KG, and to update the relationship tensor, Tr, derived from the updated knowledge graph, KG.
 
16. The method according to any of the preceding claims 1 to 15 wherein the steps of providing the numerical input vector, V, calculating the compressed vector, Vcomp, computing the output score vector, S, and sorting the available items for selection are performed iteratively until the current selection session is completed by the user or agent.
 
17. The method according to any of the preceding claims 1 to 16 wherein the available items comprise hardware components and/or software components selectable for the configuration of the respective system.
 
18. A platform (1) used for selection of items from context aware sorted available items in a selection session,
wherein the selected items are used for the configuration of a system, in particular an industrial system,
said platform (1) comprising
a processing unit (2) adapted to calculate a compressed vector, Vcomp, from a numerical input vector, V, representing items selected in a current selection session as context,
wherein the compressed vector, Vcomp, is calculated from the numerical input vector, V, using an artificial neural network, ANN, adapted to capture non-linear dependencies between items,
wherein the processing unit (2) is adapted to multiply the compressed vector, Vcomp, with a weight matrix, EI, derived from a factor matrix, E, obtained as a result of a tensor factorization of a stored relationship tensor, Tr, representing relations, r, between selections of items performed in historical selection sessions, available items and their attributes to compute an output score vector, S,
wherein the available items are sorted automatically by said processing unit (2) for selection in the current selection session according to relevance scores of the output score vector, S, computed by said processing unit (2).
 
19. The platform according to claim 18 wherein the processing unit (2) has access to a memory (3) of the platform (1) which stores a knowledge graph, KG, and/or the relationship tensor, Tr, derived from the knowledge graph, KG.
 
20. The platform according to claim 18 or 19 wherein the platform (1) comprises an interface (4) used for selecting items in a selection session from a ranking list of available items sorted according to the relevance scores of the computed output score vector, S.
 




Drawing
















Search report