(19)
(11)EP 3 503 065 A1

(12)EUROPEAN PATENT APPLICATION

(43)Date of publication:
26.06.2019 Bulletin 2019/26

(21)Application number: 17306918.8

(22)Date of filing:  22.12.2017
(51)International Patent Classification (IPC): 
G08G 1/01(2006.01)
G08G 5/00(2006.01)
G08G 1/13(2006.01)
(84)Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
Designated Extension States:
BA ME
Designated Validation States:
MA MD TN

(71)Applicant: Ecole Nationale de l'Aviation Civile
31400 Toulouse Cedex (FR)

(72)Inventors:
  • GARCIA, Jérémie
    31400 TOULOUSE (FR)
  • CONVERSY, Stéphane
    31400 TOULOUSE (FR)
  • SAPORITO, Nicolas
    31400 TOULOUSE (FR)
  • BUISAN, Guilhem
    31400 TOULOUSE (FR)

(74)Representative: Bell, Mark 
Marks & Clerk France Immeuble «Visium» 22, avenue Aristide Briand
94117 Arcueil Cedex (FR)

  


(54)METHOD AND APPARATUS FOR MANAGING ENTITIES IN A PHYSICAL SPACE


(57) The movements and interactions of mobile physical entities, such as vehicles or individuals, in a physical space can be monitored and controlled by providing a graphical representation of the physical space that includes an indication of the position of a mobile entity with respect to a region associated with a reference point in the physical space. The indication is situated at a distance from the reference point proportional to the predicted time of arrival of the entity in the region. On this basis, mobile entities are shown in the order of expected arrival, rather than at their instantaneous physical locations, supporting more rapid assimilation of anticipated arrivals and traffic conditions.




Description

Field of the invention



[0001] The present invention relates generally to the managing of entities in a physical space.

Background of the invention



[0002] Various classes of oversight of mobile entities can be defined, including pilots and drivers on one hand, and supervisors, air traffic controllers, vessel traffic controllers and the like, who have a general responsibility for entities in a given area, on the other. The working conditions of these classes of individuals are affected by current technological trends, and furthermore by a significant convergence of their roles. On one hand, vehicles are increasingly autonomous, so that the role of the driver or pilot is increasingly supported by electronic tools such as navigation tools, and/or entrusted to a remote operator who can take over control of the vehicle via a telecommunications channel at critical instants. Meanwhile, supervision and guidance tasks may be increasingly supported by information technology, so that one individual can be expected to supervise an ever larger area, or increasingly, to supervise several different areas, or to attribute only a portion of their attention to traffic considerations, with the relevant traffic information being relayed from the respective areas via telecommunications means.

[0003] It is desirable to integrate automation into a supervision interface so as to reduce the overall work load of such individuals without reducing their situational awareness or the availability of direct control where appropriate.

Summary of the invention



[0004] In accordance with the present invention in a first aspect there is provided a mobile entity manager for managing a mobile entity in a physical space. The mobile entity manager comprises a graphics renderer adapted to generate a graphical representation of the physical space at a first scale, a representation engine adapted to define a reference point in the physical space, to determine a position of the mobile entity with respect to the reference point, and to determine a predicted time of arrival of the mobile entity within a region defined with respect to the reference point. The graphics renderer is further adapted to modify the graphical representation to incorporate an indication of the position of the mobile entity on a path with respect to the region, wherein the indication is situated in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival.

[0005] In accordance with the present invention in a second aspect there is provided a method of managing a mobile entity in a physical space, the method comprising the steps of:

defining a graphical representation of the physical space at a first scale,

defining a reference point in the physical space,

determining a position of the mobile entity with respect to the reference point,

determining a predicted time of arrival of the mobile entity within a region defined with respect to the reference point, and

modifying the graphical representation to incorporate an indication of the position of the mobile entity on a path with respect to the region, wherein the indication is situated in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival.
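By way of illustration only (no such code appears in the application), the placement rule recited in the steps above may be sketched as follows; the function names and the `scale_km_per_hour` mapping factor are assumptions introduced for the example:

```python
# Illustrative sketch only: place an on-screen indication at a distance from
# the reference point proportional to the predicted time of arrival.
# `scale_km_per_hour` converts hours of predicted travel into displayed
# kilometres; its value here is an assumption, not taken from the application.

def eta_hours(distance_km: float, speed_kmh: float) -> float:
    """Predicted time of arrival at the region, in hours."""
    if speed_kmh <= 0:
        raise ValueError("entity must be moving towards the region")
    return distance_km / speed_kmh

def indicator_distance_km(distance_km: float, speed_kmh: float,
                          scale_km_per_hour: float = 75.0) -> float:
    """Displayed distance of the indication from the reference point."""
    return scale_km_per_hour * eta_hours(distance_km, speed_kmh)
```

With figures of the kind used later in the description (1 km at 75 km/h against 0.8 km at 50 km/h), the physically closer but slower entity is displayed further from the reference point, which is the ordering behaviour the representation is intended to produce.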



[0006] In accordance with a development of the second aspect, the method comprises the further steps of receiving a user input specifying a new position of the indication on the path, calculating one or more variations in the speed of the mobile entity which would modify the predicted time of arrival of the mobile entity to correspond to the new position of the indication, and issuing a communication to the mobile entity indicating the variations.
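A minimal sketch of the speed calculation of this development, under the assumption that the dragged indicator position is converted back to a target time of arrival via the same proportionality factor used to draw it (all names are illustrative):

```python
# Illustrative sketch only: recover the target time of arrival implied by a
# dragged indicator position, then the constant speed needed to meet it over
# the real remaining distance. The shared proportionality factor is an
# assumption introduced for the example.

def speed_for_new_indicator(physical_distance_km: float,
                            new_indicator_km: float,
                            scale_km_per_hour: float = 75.0) -> float:
    target_eta_h = new_indicator_km / scale_km_per_hour  # invert the display mapping
    return physical_distance_km / target_eta_h
```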

[0007] In accordance with a development of the second aspect, the method comprises the further steps of receiving a user input specifying a new position of the indication on the path, calculating a modification of the path which would modify the predicted time of arrival of the mobile entity to correspond to the new position of the indication, and issuing a communication to the mobile entity indicating the modification.

[0008] In accordance with a development of the second aspect, at the step of modifying the graphical representation to incorporate an indication of the position of the mobile entity with respect to the region, the indication is situated in a position in the graphical representation of the physical space with an orientation with respect to the reference point corresponding to the relative orientation of the entity to the reference point.

[0009] In accordance with a development of the second aspect, the method comprises the further steps of:

determining a position of a further mobile entity with respect to the reference point,

determining a predicted time of arrival of the further mobile entity within the region,

and, modifying the graphical representation to incorporate an indication of the position of the further mobile entity with respect to the region, wherein the indication is situated on the path in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival of the further mobile entity.



[0010] In accordance with a development of the second aspect, the method comprises the further step of determining a timing for a convergence of the two entities on the basis of a speed of the first mobile entity and a speed of the further mobile entity.
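The convergence timing of this development may be sketched, for the simple case of two entities following the same path towards the reference point, as follows; the function name and the treatment of non-converging pairs are assumptions introduced for the example:

```python
# Illustrative sketch only, for two entities following the same path towards
# the reference point: time until a faster trailing entity catches the
# leading one. Returns None when the pair never converges.

def convergence_time_hours(lead_distance_km: float, lead_speed_kmh: float,
                           trail_distance_km: float, trail_speed_kmh: float):
    gap_km = trail_distance_km - lead_distance_km    # separation along the path
    closing_kmh = trail_speed_kmh - lead_speed_kmh   # relative closing speed
    if closing_kmh <= 0:
        return None  # the trailing entity is not catching up
    return gap_km / closing_kmh
```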

[0011] In accordance with a development of the second aspect, the path corresponds to a path in the physical space that the mobile entity is expected to follow.

[0012] In accordance with a development of the second aspect, the factor by which the distance is proportional to the predicted time of arrival varies as a function of the position of the indication thereon, or wherein the factor by which the distance is proportional to the predicted time of arrival varies as a function of the orientation.
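One possible reading of a position-dependent proportionality factor is a piecewise-linear mapping that expands the display close to the reference point and compresses it further out; the factors and threshold below are invented values for the example, not taken from the application:

```python
# Illustrative sketch only: a piecewise-linear time-to-distance mapping in
# which the proportionality factor depends on the indication's position,
# expanding the display near the reference point and compressing it further
# out. The factors and threshold are invented values for the example.

def indicator_distance_variable_km(eta_hours: float,
                                   near_factor: float = 100.0,
                                   far_factor: float = 50.0,
                                   threshold_hours: float = 0.01) -> float:
    if eta_hours <= threshold_hours:
        return near_factor * eta_hours
    return near_factor * threshold_hours + far_factor * (eta_hours - threshold_hours)
```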

[0013] In accordance with a development of the second aspect, the method comprises the further step of:

modifying the graphical representation to incorporate a speed vector indicator graphic, wherein the speed vector indicator graphic has a dimension proportional to the speed of the mobile entity in the physical space.



[0014] In accordance with a development of the second aspect, the method comprises a further step of receiving a user input specifying a new position of the reference point, and repeating the steps of determining a position of the mobile entity, determining a predicted time of arrival of the mobile entity, and modifying the graphical representation on the basis of the new position of the reference point.

[0015] In accordance with a development of the second aspect, the method comprises the further step of receiving a user input specifying a new scale for the graphical representation, and repeating the steps of determining a position of the mobile entity, determining a predicted time of arrival of the mobile entity, and modifying the graphical representation on the basis of the new scale.

[0016] In accordance with a development of the second aspect, the graphical representation is three dimensional.

[0017] In accordance with the present invention in a third aspect there is provided a computer program comprising instructions adapted to implement the steps of the second aspect.

Brief Description of the Drawings



[0018] The above and other advantages of the present invention will now be described with reference to the accompanying drawings, for illustration purposes only, in which:
Figure 1
shows a method of managing the behaviour of a mobile entity in a physical space in accordance with an embodiment;
Figure 2
shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1;
Figure 3
shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1;
Figure 4
shows an example of a physical space susceptible to graphical representation in accordance with an embodiment;
Figure 5
shows an example of a graphical representation of the physical space of figure 4 generated in accordance with an embodiment;
Figure 6
shows a further example of a graphical representation of the physical space of figure 4 generated in accordance with an embodiment;
Figure 7
shows a further example of a physical space susceptible to graphical representation in accordance with an embodiment;
Figure 8
shows an example of a graphical representation of the physical space of figure 7 generated in accordance with an embodiment;
Figure 9
shows a mobile entity manager in accordance with an embodiment;
Figure 10
shows a generic computing system suitable for implementation of embodiments of the invention;
Figure 11
shows a smartphone device adaptable to constitute an embodiment; and
Figure 12
shows an Air Traffic control desk adaptable to constitute an embodiment.

Detailed description



[0019] Figure 1 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with an embodiment.

[0020] As shown in figure 1, the method starts at step 100 before proceeding to step 110, at which a graphical representation of the physical space is defined at a first scale. The method next proceeds to step 120 at which a reference point in the physical space is defined. This definition may comprise retrieving a position value from memory, receiving user input, for example as discussed in further detail below, or otherwise. At step 130 a position of the mobile entity is determined with respect to the reference point. The position of the mobile entity may be obtained directly, for example by means of a radar, lidar or other such system, or by triangulation of a signal emitted or reflected by the entity, or indirectly, for example by receiving positioning information, as obtained for example from a GNSS system, from the mobile entity itself, or inferred, for example on a dead-reckoning basis, by applying time and velocity data to a known starting point, or otherwise. At step 140 a predicted time of arrival of the mobile entity within a region defined with respect to the reference point is determined. Determination of the predicted time of arrival may comprise determining a speed of the mobile entity, either directly, for example by means of a radar, lidar or other such system, or by inference from successive location determinations, or indirectly, for example by receiving speed information from the mobile entity itself. The method then finally proceeds to step 150 at which the graphical representation is modified to incorporate an indication of the position of the mobile entity on a path with respect to the region, wherein the indication is situated in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival, before terminating at step 190.
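The dead-reckoning option mentioned above may be sketched as follows, assuming a flat two-dimensional coordinate frame and a constant velocity (both simplifications introduced for the example):

```python
# Illustrative sketch only: dead-reckoned position from a known starting
# point, a constant velocity and an elapsed time, in a flat two-dimensional
# frame. Both simplifications are assumptions introduced for the example.

def dead_reckon(start_xy, velocity_kmh_xy, elapsed_hours):
    x0, y0 = start_xy
    vx, vy = velocity_kmh_xy
    return (x0 + vx * elapsed_hours, y0 + vy * elapsed_hours)
```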

[0021] Examples of embodiments in accordance with this method are described below with reference to figures 2 to 8.

[0022] Figure 2 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1.

[0023] As shown in figure 2, the method proceeds through steps 110, 120, 130, 140 and 150 as described with reference to figure 1, before proceeding to step 260 at which a user input is received specifying a new position of the indication on the path. The method then proceeds to calculate, at step 270, one or more variations in the speed of the mobile entity which would modify the predicted time of arrival of the mobile entity to correspond to the new position of the indication, before issuing a communication to the mobile entity indicating the variations at step 280 and terminating at step 190. Such speed variations may be obtained on the basis of a model of the mobile entity's capacities in terms of acceleration and deceleration, possibly taking into account fuel efficiency considerations, passenger comfort, possible interactions with other entities, and the like, as will occur to the skilled person.

[0024] Possible implementations of this approach and certain implications thereof are discussed further below, for example with reference to figures 4 to 8.

[0025] Figure 3 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1.

[0026] As shown in figure 3, the method proceeds through steps 110, 120, 130, 140 and 150 as described with reference to figure 1, before proceeding to step 360 at which a position of a further mobile entity is determined with respect to the reference point. The method next proceeds to step 370 at which a predicted time of arrival of the further mobile entity within the region is determined, and then proceeds to step 380 of modifying the graphical representation to incorporate an indication of the position of the further mobile entity with respect to the region, wherein the indication is situated on the path in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival of the further mobile entity.

[0027] Possible implementations of this approach and certain implications thereof are discussed further below, for example with reference to figures 4 to 8.

[0028] It will be appreciated that certain variants of the methods of figures 1, 2 and 3 are possible. In particular, it may be desirable to redefine the basic graphical representation of the physical space by looping back to step 110 as desired. Similarly, on determining that no interaction has occurred, the process might not necessarily look for user input of new regions by looping back to step 120 on every iteration, and might loop back to steps 130 or 140 in some iterations. Still further, the method may look for certain inputs in parallel; for example, some or all of the changes to the underlying physical space, entities, regions and interactions between them may be monitored in parallel.

[0029] It will be appreciated that the steps of figures 1, 2 and 3 may be looped so as to continually update the graphical representation as the mobile entities progress, and as new mobile entities appear or disappear. Further variants will now be discussed with respect to figures 4 to 8.

[0030] Figure 4 shows an example of a physical space susceptible to graphical representation in accordance with an embodiment. As shown in figure 4 there is provided a section of road 400, having two lanes 411, 412 in a first direction 410 (right to left), and two lanes 421, 422 in a second direction 420 (left to right). An entity shown as a car 431 is shown as travelling in the first lane 411 in the first direction 410, an entity shown as a car 432 is shown as travelling in the second lane 412 in the first direction 410, an entity shown as a car 433 is shown as travelling in the first lane 421 in the second direction 420, and an entity shown as a car 434 is shown as travelling in the second lane 422 in the second direction 420. The road section 400, or alternatively a road direction 410 or 420, or a lane 411, 412, 421 or 422, may correspond to the physical space 400 as discussed above. Road section 400 has an entry slip road 423.

[0031] A reference point 440 is defined in the physical space. This reference point 440 may be defined by a user input, with reference to the position of an entity as discussed further below, or be in a defined relation to the physical space or an element thereof.

[0032] In accordance with the embodiment, a position of a mobile entity is determined with respect to the reference point. As shown, the positions of the two mobile entities 433 and 434 are determined with respect to the reference point 440, as indicated by the lines 443 and 444. It will be appreciated that the position of any number of mobile entities may be determined in this way. In particular, the position of every mobile entity, or of every mobile entity satisfying certain selection criteria, in the physical space may be thus determined. Furthermore, the position of every mobile entity, or of every mobile entity satisfying certain selection criteria, in some other region, which may be larger than or smaller than the physical space, may be thus determined.

[0033] For the purposes of the present example, the speed of mobile entity 433 is 75km/h, while the speed of mobile entity 434 is 50km/h.

[0034] For the purposes of the present example, the distance from the reference point to the first mobile entity 433 is 1km, while the distance from the reference point to the second mobile entity 434 is 0.8km.

[0035] Figure 5 shows an example of a graphical representation 500 of the physical space of figure 4 generated in accordance with an embodiment. As shown in figure 5 there is presented a representation of a section of road 500, having two lanes 511, 512 in a first direction 510 (right to left), and two lanes 521, 522 in a second direction 520 (left to right). An indication 533 of the position of the first mobile entity is shown travelling in the first lane 521 in the second direction 520, and an indication 534 of the position of the second mobile entity is shown travelling in the second lane 522 in the second direction 520. As such, the road section 500 corresponds to the physical space 400 as discussed above, and represents the relevant physical features thereof.

[0036] It will be appreciated that while the indications of the mobile entities resemble the entities themselves, this need not be the case, and may be the case to any desired extent. For example, the indications may be simple points, boxes containing an icon and/or text or a schematic representation of the class of mobile entity in question at one end of the scale, up to a photorealistic representation or live image of the mobile entity itself at the other end of the scale.

[0037] As such, figure 5 presents an example of a graphical representation of the physical space at a first scale, defined in accordance with an embodiment. The scale is not specified in figure 5, and it will be appreciated that any convenient scale may be chosen on the basis of the size of the physical space and the size of whatever display is to be used for the presentation of the graphical representation.

[0038] While for the sake of simplicity figure 5 shows a two dimensional representation of the physical space, other embodiments may equally present a three dimensional representation of the physical space.

[0039] Furthermore, the reference point 540 as defined in the physical space is presented in the graphical representation. It will be appreciated that in some embodiments, the position of the reference point need not be included in the graphical representation, or not at all times. The reference point 540 may be defined by a user input, with reference to the position of an entity as discussed further below, or be in a defined relation to the physical space or an element thereof. Where the reference point is defined by user input, this may conveniently take place by manipulation of the graphical representation, for example by conventional user interface mechanisms. For example, the user might select the desired location in the graphical representation by clicking with a mouse cursor, finger tip, or the like.

[0040] In accordance with the embodiment a predicted time of arrival of each mobile entity whose position is determined within a region 550 defined with respect to the reference point 540 is obtained. The region and the reference point may coincide, or otherwise the region may be situated in any predefined spatial relationship to the reference point.

[0041] On the basis of the speed and distance information presented above, it may be determined that the first mobile entity 533 will arrive at the reference point in 1/75 = 0.01333 hrs, that is, 48 seconds.

[0042] On the basis of the speed and distance information presented above, it may be determined that the second mobile entity 534 will arrive at the reference point in 0.8/50 = 0.016 hrs, that is, 58 seconds.

[0043] As shown in figure 5, the graphical representation is modified to incorporate an indication of the position of the first mobile entity 533 on a path 560 with respect to the region 550, wherein the indication 533 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 48s.

[0044] Meanwhile the graphical representation is further modified to incorporate an indication of the position of the second mobile entity 534 on the path 560 with respect to the region 550, wherein the indication 534 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 58s.

[0045] On this basis, whilst a direct representation of the real physical positions of the mobile entities with respect to the reference point would show the first mobile entity further from the reference point than the second entity, a graphical representation in accordance with the embodiment of figure 5 shows the second mobile entity further from the reference point than the first mobile entity, on the basis that it is expected to arrive later: the first mobile entity, although presently further away in space, is moving faster, and is thus closer in time.

[0046] As such, where the position of a further mobile entity such as the second mobile entity 534 is determined with respect to the reference point, and a predicted time of arrival of the further mobile entity within the region is determined, the graphical representation may further be modified to incorporate an indication of the position of the further mobile entity with respect to the region, wherein the indication is situated on the path in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival of the further mobile entity.

[0047] While figure 5 shows indications of the position of two physical entities, it will be appreciated that at any given time the number of indications may be any value from zero upwards, depending on the number of mobile entities in the physical space, or otherwise taken into account as discussed above.

[0048] The representation of figure 5 may support further variants.

[0049] For example, on the basis of a speed of the first mobile entity and a speed of a further mobile entity, a timing for a convergence of the two entities may be determined. This determination may then be used to indicate whether a particular mobile entity can overtake another within a specified time window (for example, before the arrival of an oncoming vehicle), or within a particular physical space (for example, before an overtaking lane ends). Still further, the speed of one or the other of the two or more entities may be adapted to achieve a desired result, for example indicating the required acceleration to achieve the desired manoeuvre in the available time or space, or in order for the two mobile entities to reach a common destination (corresponding to the reference point) at the same time.
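The case of two mobile entities reaching a common destination at the same time may be sketched as follows; the function name and the assumption of constant speeds along known remaining distances are illustrative simplifications:

```python
# Illustrative sketch only: constant speed at which a second entity would
# reach the common destination at the same time as a first entity, assuming
# both follow known remaining distances at constant speeds.

def matching_speed_kmh(target_distance_km: float,
                       other_distance_km: float,
                       other_speed_kmh: float) -> float:
    shared_eta_h = other_distance_km / other_speed_kmh  # the arrival time to match
    return target_distance_km / shared_eta_h
```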

[0050] This determination may constitute the basis of a communication. The communication may comprise, by way of example, the transmission of a message to the mobile entity, the transmission of a message to the user, or a transmission to any other destination. The communication may comprise multiple transmissions to different destinations. The communication may comprise a modification of the graphical representation, for example through changing the colour or prominence of certain features, such as the path, the region or the entity, or an indicator region that may be defined in the graphical representation for this purpose. A text message may be presented via the graphical representation, or otherwise. The communication may comprise transmission via any suitable channel. It may be transmitted via any data network such as a WAN (e.g. the internet, GSM, UMTS), a LAN (e.g. a Wi-Fi network or Ethernet), or a PAN (e.g. Bluetooth, ZigBee). The communication may be of any format, and may include text, graphical or audio content, or a combination of these. The communication may be formatted for a human addressee, for example in the form of a human readable text or audio message. The communication may be formatted for a machine recipient, for example on the basis of an API, or any suitable technical format having regard to the context. The communication may be transmitted merely for information or as a warning, or may contain instructions. Where the communication involves transmission to a machine recipient, these instructions may be directly operable by that machine. In particular, the communication may directly control the speed of the mobile entity.

[0051] The communication might additionally or alternatively comprise the transmission of an audible chime to be played to an occupant of the entity to alert them to the change of status.

[0052] As such, the communication may comprise sending a signal to an entity.

[0053] It will be appreciated that countless further additional or alternative components of the communication may be envisaged as a function of the context.

[0054] Still further, it will be appreciated that while the embodiments herein have been described with respect to a single reference point, any embodiment may be implemented with additional reference points. Any mobile entity may be represented in a position reflecting its time of arrival at a respective one of these reference points, or any mobile entity may be represented multiple times, with each representation position reflecting the time of arrival of the corresponding mobile entity at a respective one of these reference points, the reference points corresponding to different targets in the physical space.

[0055] Still further, in any embodiment, in addition to the indication of the position of the mobile entity on a path with respect to the region, situated at a distance from the reference point proportional to the predicted time of arrival, a further indication for that same mobile entity may be provided in a position in the physical space corresponding to the actual physical position of that entity.

[0056] Still further, user input may be received specifying a new position of the reference point, whereupon the selection of mobile entities for representation, the determination of the positions of those mobile entities, and the determination of their predicted times of arrival are recalculated, and the graphical representation is updated on the basis of the new reference point.

[0057] Figure 6 shows a further example of a graphical representation 500 of the physical space of figure 4 generated in accordance with an embodiment. As shown in figure 6 there is presented a representation of a section of road 500, substantially as described with reference to figure 5. In accordance with certain embodiments, user input is received specifying a new position of the indication on the path. In figure 6 this is represented by the position of the cursor 601, which as shown has "dragged" the indicator of the second mobile entity 534 to a new position 633 on the path 560. Specifically, the indicator of the second mobile entity is now situated ahead of that of the first mobile entity 533. On this basis, one or more variations in the speed of the second mobile entity 434 which would modify its predicted time of arrival to correspond to the new position of the indication may be calculated, and a communication indicating the variations issued to the mobile entity. Such a communication may be used in any of the ways, and take any of the forms, of the communications discussed above with reference to figure 5.

[0058] Alternatively, or additionally, a modification of the path which would modify the predicted time of arrival of the second mobile entity to correspond to the new position of the indication may be calculated, and a communication indicating the modification issued to the mobile entity. For example, a shorter or longer path may be imposed, having a length such that at a particular speed the mobile entity will arrive at a desired time. The modified path may comprise the original path replaced in whole or in part with a new path, may constitute a distortion of the original path, or may otherwise be selected from a predefined template or automatically generated by a path-finding algorithm given defined path constraints. Meanders, loops, dog legs or other such path extensions may be added to delay the time of arrival. A variety of automated or semi-automated algorithms for defining a suitable path adjustment will occur to the skilled person. The exact manner and degree of freedom for such modifications will vary depending on the context of the implementation: where the mobile entities are aircraft in flight, a wide range of variations in three dimensions may be envisaged, whilst road vehicles as shown in figure 8 may be more constrained.
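A minimal sketch of sizing such a path extension, assuming the entity holds a fixed speed and the desired time of arrival is known (all names are illustrative):

```python
# Illustrative sketch only: length of path extension (e.g. a meander or dog
# leg) needed so that, at a fixed speed, the entity arrives at the desired
# time. All names are assumptions introduced for the example.

def path_extension_km(current_path_km: float, speed_kmh: float,
                      desired_eta_hours: float) -> float:
    needed_km = speed_kmh * desired_eta_hours     # path length matching the target ETA
    return max(needed_km - current_path_km, 0.0)  # this sketch only ever lengthens
```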

[0059] The speed and/or path of the mobile entity may be adjusted, for example as described above, or by any combination of the two approaches.

[0060] Figure 7 shows a further example of a physical space susceptible to graphical representation in accordance with an embodiment. As shown in figure 7 there is provided a section of road 700, having two lanes 711, 712 in a first direction 710 (right to left), and two lanes 721, 722 in a second direction 720 (left to right). An entity shown as a car 733 is shown as travelling in the first lane 721 in the second direction 720, and an entity shown as a car 734 is shown as travelling on an entry slip road 723. As shown, in view of the different paths of the first and second mobile entities, the first mobile entity 733 has a first orientation ϕ1 with respect to the reference point 740 and the second mobile entity 734 has a second orientation ϕ2 with respect to the reference point 740.

[0061] A reference point 740 is defined in the physical space.

[0062] In accordance with the embodiment, a position of a mobile entity is determined with respect to the reference point. As shown, the positions of the two mobile entities 733 and 734 are determined with respect to the reference point 740, as indicated by the lines 743 and 744. It will be appreciated that the position of any number of mobile entities may be determined in this way. In particular, the position of every mobile entity, or of every mobile entity satisfying certain selection criteria, in the physical space may thus be determined. Furthermore, the position of every mobile entity, or of every mobile entity satisfying certain selection criteria, in some other region, which may be larger or smaller than the physical space, may thus be determined.

[0063] For the purposes of the present example, the speed of mobile entity 733 is 75km/h, while the speed of mobile entity 734 is 50km/h.

[0064] For the purposes of the present example, the distance from the reference point to the first mobile entity 733 is 1km, while the distance from the reference point to the second mobile entity 734 is 0.8km.

[0065] Figure 8 shows an example of a graphical representation 800 of the physical space of figure 7 generated in accordance with an embodiment. As shown in figure 8 there is presented a representation of a section of road 800, having two lanes 811, 812 in a first direction 810 (right to left), and two lanes 821, 822 in a second direction 820 (left to right). A representation of the first mobile entity 833 is shown as travelling in the first lane 821 in the second direction 820, and the second mobile entity 834 is shown as travelling in the entry slip road 823. As such, the road section 800 corresponds to the physical space 700 as discussed above, and represents the relevant physical features thereof.

[0066] As such, figure 8 presents an example of a graphical representation of the physical space at a first scale, defined in accordance with an embodiment. The scale is not specified in figure 8, and it will be appreciated that any convenient scale may be chosen on the basis of the size of the physical space and the size of whatever display is to be used for the presentation of the graphical representation.

[0067] In accordance with the embodiment, a predicted time of arrival within a region defined with respect to the reference point is determined for each mobile entity whose position has been determined.

[0068] On the basis of the speed and distance information presented above, it may be determined that the first mobile entity 833 will arrive at the reference point in 1/75 = 0.01333hrs, that is, 48 seconds.

[0069] On the basis of the speed and distance information presented above, it may be determined that the second mobile entity 834 will arrive at the reference point in 0.8/50 = 0.016 hrs, that is, approximately 58 seconds.
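The arithmetic of paragraphs [0068] and [0069] may be reproduced with a short helper function; this sketch is purely illustrative and the names are assumptions, not taken from the application.

```python
def predicted_arrival_s(distance_km, speed_kmh):
    """Predicted time of arrival at the reference point, in seconds."""
    return distance_km / speed_kmh * 3600.0

eta_first = predicted_arrival_s(1.0, 75.0)   # first mobile entity 733/833: 48 s
eta_second = predicted_arrival_s(0.8, 50.0)  # second mobile entity 734/834: 57.6 s, i.e. about 58 s
```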

[0070] As shown in figure 8, the graphical representation is modified to incorporate an indication of the position of the first mobile entity 833 on a path 863 with respect to the region 850, wherein the indication 833 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 48s.

[0071] Meanwhile the graphical representation is further modified to incorporate an indication of the position of the second mobile entity 834 on the path 864 with respect to the region 850, wherein the indication 834 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 58s.

[0072] As shown, the indication of the first mobile entity 833 is situated in a position in the representation of the physical space with an orientation θ1 with respect to the reference point 840 corresponding to the relative orientation of the first mobile entity to the reference point ϕ1.

[0073] Similarly, the indication of the second mobile entity 834 is situated in a position in the representation of the physical space with an orientation θ2 with respect to the reference point 840 corresponding to the relative orientation of the second mobile entity to the reference point ϕ2.
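The placement rule of paragraphs [0070] to [0073] amounts to positioning each indication in polar coordinates about the reference point: a radius proportional to the predicted time of arrival, at the entity's physical orientation. A minimal sketch follows; the pixels-per-second factor and names are illustrative assumptions.

```python
import math

def indication_position(ref_x, ref_y, eta_s, orientation_rad, px_per_s=2.0):
    """Screen position of an indication: distance from the reference
    point proportional to the predicted time of arrival, placed at the
    entity's physical orientation relative to the reference point."""
    r = px_per_s * eta_s
    return (ref_x + r * math.cos(orientation_rad),
            ref_y + r * math.sin(orientation_rad))
```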

[0074] It will be appreciated that even where entities arrive with different orientations as shown in figure 7, it is still possible to represent the respective indicators on a single path, such as shown in figure 5. This approach may be appropriate where the direction of arrival is unimportant, or where so many directions of arrival are possible that attempting to represent them all accurately would introduce an undesirable degree of variation in the graphical representation.

[0075] Still further, the indication of each entity may be located on a respective path corresponding to the path in the physical space that the respective mobile entity is expected to follow.

[0076] In accordance with certain embodiments, and as shown by way of example in figure 8, the factor by which the distance is proportional to the predicted time of arrival may vary as a function of the position of the indication in the representation.

[0077] Specifically, as shown in figure 8 there are provided additional zones at the periphery of the graphical representation 800. In particular, a first additional zone 802 is provided around the edge of the graphical representation, and a second additional zone 803 is provided around the outer edge of the first additional zone 802.

[0078] Indications for mobile entities that are outside the physical space may be represented in these additional zones, such as elements 835 and 836 in figure 8. These may be presented as mobile entities that are out of the visible spatial range but within the visible temporal range. In certain embodiments, the factor by which the distance is proportional to the predicted time of arrival may be different in each zone, becoming progressively greater for each zone successively more remote from the reference point. In other embodiments, the factor by which the distance is proportional to the predicted time of arrival may rise continuously, in accordance with any continuous function, with the zones serving to mark convenient reference time graduations in this succession.
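The zone behaviour of paragraph [0078] can be modelled as a piecewise-linear mapping from predicted arrival time to display radius. In the illustrative sketch below the time-to-distance scale decreases in each successive outer zone, compressing more remote entities into the peripheral bands; the boundaries and factors are assumptions, and the direction of the change is a design choice.

```python
# (zone time limit in s, pixels per second within that zone), innermost zone first
ZONES = [(60.0, 2.0), (120.0, 1.0), (300.0, 0.5)]

def radius_for_eta(eta_s):
    """Display distance from the reference point; each successively more
    remote zone applies a different time-to-distance factor."""
    radius, start = 0.0, 0.0
    for limit, px_per_s in ZONES:
        if eta_s <= limit:
            return radius + (eta_s - start) * px_per_s
        radius += (limit - start) * px_per_s
        start = limit
    return radius  # clamp entities beyond the outermost zone to its edge
```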

[0079] While as shown in figure 8 the zones are situated all around the periphery of the graphical representation, in some cases the zones need not occupy the entire periphery, for example in embodiments where mobile entities only arrive from certain directions. Still further, in cases where mobile entities only arrive along certain paths, the zones may be linear in nature, with mobile entities presented as a queue. Still further, in some cases, for example where such a linear presentation is adopted or where mobile entities arrive from a limited range of directions, the zones need not appear at the periphery of the graphical representation, but may be situated in any convenient location within the graphical representation.

[0080] Still further, the factor by which the distance is proportional to the predicted time of arrival may vary as a function of orientation with regard to the reference point, so that entities arriving from one direction are subject to time scaling according to one factor, or one linear evolution, whilst entities arriving from another direction are subject to time scaling according to a further respective factor, or linear evolution.

[0081] Still further, a user input may be received specifying a new scale for the graphical representation or some part thereof, whereupon the selection of mobile entities for representation, the determination of the position of each mobile entity, and the determination of the predicted time of arrival of each mobile entity, can be recomputed and the graphical representation modified on the basis of the new scale accordingly.

[0082] In accordance with certain embodiments, and as shown by way of example in figure 8, the graphical representation may also be modified to incorporate a speed vector indicator graphic, having a dimension proportional to the speed of each respective mobile entity in the physical space.

[0083] Accordingly, as shown in figure 8 the four mobile entities are each provided with a speed vector indicator in the form of an arrow whose length is proportional to the speed of the mobile entity, and whose orientation corresponds to the direction of movement thereof. It will be appreciated that not all mobile entities need be associated with a speed vector indicator, and that for example speed vector indicators might be presented only for mobile entities in the physical space, or in one or more of the additional zones. Inherently there will exist a relationship between the scale of the speed vector indicator and the factor by which the distance is proportional to the predicted time of arrival of the mobile entity in question. In some embodiments, the dimension of the speed vector indicator may be chosen to correspond to the time the entity will take to travel a specified distance at the known speed of the mobile entity, using the same factor by which the distance is proportional to the predicted time of arrival of the mobile entity in question.
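A speed vector indicator with a dimension proportional to the speed of the entity, as in paragraph [0082], can be sketched as the display length of the distance that would be covered during a fixed lookahead time. The lookahead and scale values below are illustrative assumptions, not taken from the application.

```python
def speed_vector_length(speed_kmh, lookahead_s=10.0, px_per_km=50.0):
    """Arrow length proportional to the entity's speed: the display
    length of the distance covered during a fixed lookahead time."""
    distance_km = speed_kmh / 3600.0 * lookahead_s
    return px_per_km * distance_km
```

The resulting arrow lengths scale linearly with speed, so an entity at 75 km/h receives an arrow 1.5 times as long as one at 50 km/h.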

[0084] It will be appreciated that while figures 2 to 8 present certain combinations of features together, any other combination of features may be envisaged. For example, the drag feature of figure 6 might be integrated with the zone feature of figure 8, without necessarily adopting other features of those embodiments, and so on.

[0085] Figure 9 shows a mobile entity manager in accordance with an embodiment.

[0086] As shown there is provided a mobile entity manager 900 for managing a mobile entity 923 in a physical space 990. As shown, the mobile entity manager 900 comprises a graphics renderer 910 adapted to generate a graphical representation 911 of the physical space 990 at a first scale. The mobile entity manager 900 further comprises a representation engine 950 adapted to define a reference point in the physical space 990, to determine a position of the mobile entity 923 with respect to the reference point, and to determine a predicted time of arrival of the mobile entity 923 within a region defined with respect to the reference point. The graphics renderer 910 is further adapted to modify the graphical representation 911 to incorporate an indication of the position of the mobile entity 923 on a path with respect to the region, wherein the indication is situated in a position in the representation of the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival.
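The division of labour in paragraph [0086] between the representation engine and the graphics renderer can be sketched as two cooperating components; the class and method names below are hypothetical, not taken from the application.

```python
class RepresentationEngine:
    """Defines the reference point and derives per-entity quantities."""
    def __init__(self, ref_point):
        self.ref_point = ref_point  # (x, y) in the physical space

    def predicted_arrival_s(self, entity):
        # entity: mapping with distance to the reference point (km) and speed (km/h)
        return entity["distance_km"] / entity["speed_kmh"] * 3600.0


class GraphicsRenderer:
    """Places each indication at a distance proportional to its predicted arrival time."""
    def __init__(self, px_per_s=2.0):
        self.px_per_s = px_per_s

    def indication_radius(self, eta_s):
        return self.px_per_s * eta_s


class MobileEntityManager:
    """Couples the two components, in the manner of elements 900, 910 and 950."""
    def __init__(self, engine, renderer):
        self.engine = engine
        self.renderer = renderer

    def radius_for(self, entity):
        return self.renderer.indication_radius(self.engine.predicted_arrival_s(entity))
```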

[0087] The representation of the physical space may be generated on the basis of a digital representation of the space 920. The position of the mobile entities may be generated on the basis of entity data 960. User input, for example as described with respect to any of figures 1 to 8, may be provided to the representation engine via the graphical representation 911 from a user 940.

[0088] Further adaptations to the system of figure 9 may be envisaged to implement any of the features of any of the foregoing embodiments.

[0089] According to certain embodiments, the movements and interactions of mobile physical entities such as vehicles, individuals and so on in a physical space can be monitored and controlled by providing a graphical representation of the physical space including an indication of a position of a mobile entity with respect to a region associated with a reference point in the physical space, the indication being situated at a distance from the reference point proportional to the predicted time of arrival of the entity in the region. On this basis, mobile entities are shown in the order of their expected arrival, rather than at their instantaneous physical locations, supporting more rapid assimilation of anticipated arrivals and traffic conditions.

[0090] Software embodiments include but are not limited to applications, firmware, resident software, microcode, etc. The invention can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or an instruction execution system. Software embodiments include software adapted to implement the steps discussed above with reference to figures 1 to 8. A computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.

[0091] In some embodiments, the methods and processes described herein may be implemented in whole or part by a user device. These methods and processes may be implemented by computer-application programs or services, an application-programming interface (API), a library, and/or other computer-program product, or any combination of such entities.

[0092] The user device may be a mobile device such as a smart phone or tablet, a drone, a computer or any other device with processing capability, such as a robot or other connected device, including IoT (Internet of Things) devices.

[0093] Figure 10 shows a generic computing system suitable for implementation of embodiments of the invention.

[0094] As shown in figure 10, a system includes a logic device 1001 and a storage device 1002. The system may optionally include a display subsystem 1011, input/output subsystem 1003, communication subsystem 1020, and/or other components not shown.

[0095] Logic device 1001 includes one or more physical devices configured to execute instructions. For example, the logic device 1001 may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.

[0096] The logic device 1001 may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic device may include one or more hardware or firmware logic devices configured to execute hardware or firmware instructions. Processors of the logic device may be single-core or multi-core, and the instructions executed thereon may be configured for sequential, parallel, and/or distributed processing. Individual components of the logic device 1001 optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic device 1001 may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.

[0097] Storage device 1002 includes one or more physical devices configured to hold instructions executable by the logic device to implement the methods and processes described herein. When such methods and processes are implemented, the state of the storage device 1002 may be transformed, e.g., to hold different data.

[0098] Storage device 1002 may include removable and/or built-in devices. Storage may be local or remote (in a cloud, for instance). Storage device 1002 may comprise one or more types of storage device including optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., FLASH, RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage devices may include volatile, non-volatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.

[0099] In certain arrangements, the system may comprise an interface 1003 adapted to support communications between the logic device 1001 and further system components. For example, additional system components may comprise removable and/or built-in extended storage devices. Extended storage devices may comprise one or more types of storage device including optical memory 1032 (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory 1033 (e.g., RAM, EPROM, EEPROM, FLASH etc.), and/or magnetic memory 1031 (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Such extended storage device may include volatile, non-volatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.

[0100] It will be appreciated that storage device includes one or more physical devices, and excludes propagating signals per se. However, aspects of the instructions described herein alternatively may be propagated by a communication medium (e.g., an electromagnetic signal, an optical signal, etc.), as opposed to being stored on a storage device.

[0101] Aspects of logic device 1001 and storage device 1002 may be integrated together into one or more hardware-logic components. Such hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC), and complex programmable logic devices (CPLDs), for example.

[0102] The term "program" may be used to describe an aspect of a computing system implemented to perform a particular function. In some cases, a program may be instantiated via the logic device 1001 executing machine-readable instructions held by the storage device 1002. It will be understood that different modules may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The term "program" may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.

[0103] In particular, the system of figure 10 may be used to implement embodiments of the invention.

[0104] For example, a program implementing the steps described with respect to figures 1 to 8, or the algorithms presented above, may be stored in storage device 1002 and executed by logic device 1001. Information reflecting or defining the physical space, the entities, or the regions may be stored in storage devices 1002, 1031, 1032, 1033. Information reflecting or defining the physical space, the entities, or the regions may be received via the communications interface 1020. User input defining the regions may be received via the I/O interface 1003 and in particular the touchscreen display 1011, camera 1016, microphone 1015, mouse 1013, keyboard 1012 or otherwise. The functions of any or all of the units 910 or 950 may similarly be implemented by a program performing the required functions, in communication with additional dedicated hardware units as necessary. The display 1011 may display the graphical representation of the physical space, and/or the regions, and/or the entities. Accordingly the invention may be embodied in the form of a computer program.

[0105] It will be appreciated that a "service", as used herein, is an application program executable across multiple user sessions. A service may be available to one or more system components, programs, and/or other services. In some implementations, a service may run on one or more server-computing devices.

[0106] When included, display subsystem 1011 may be used to present a visual representation of data held by a storage device. This visual representation may take the form of a graphical user interface (GUI). As the herein described methods and processes change the data held by the storage device 1002, and thus transform the state of the storage device 1002, the state of display subsystem 1011 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 1011 may include one or more display devices utilizing virtually any type of technology for example as discussed above. Such display devices may be combined with logic device and/or storage device in a shared enclosure, or such display devices may be peripheral display devices. An audio output such as speaker 1014 may also be provided.

[0107] When included, input subsystem may comprise or interface with one or more user-input devices such as a keyboard 1012, mouse 1013, touch screen 1011, or game controller (not shown). In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board. Example NUI componentry may include a microphone 1015 for speech and/or voice recognition; an infrared, colour, stereoscopic, and/or depth camera 1016 for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity. The input/output interface 1003 may similarly interface with a loudspeaker 1014, vibromotor or any other transducer device as may occur to the skilled person. For example, the system may interface with a printer 1017.

[0108] When included, communication subsystem 1020 may be configured to communicatively couple the computing system with one or more other computing devices. For example, the communication subsystem may communicatively couple the computing device to a remote service hosted, for example, on a remote server 1076 via a network of any size including for example a personal area network, local area network, wide area network, or the Internet. Communication subsystem may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network 1074, or a wired or wireless local- or wide-area network. In some embodiments, the communication subsystem may allow the computing system to send and/or receive messages to and/or from other devices via a network such as the Internet 1075. The communications subsystem may additionally support short range inductive communications with passive or active devices (NFC, RFID, UHF, etc.). In certain variants of the embodiments described above, the traffic data may be received via the telephone network 1074 or the Internet 1075.

[0109] The system of figure 10 is intended to reflect a broad range of different types of information handling system. It will be appreciated that many of the subsystems and features described with respect to figure 10 are not required for implementation of the invention, but are included to reflect possible systems in accordance with the present invention. It will be appreciated that system architectures vary widely, and the relationship between the different sub-systems of figure 10 is merely schematic, and is likely to vary in terms of layout and the distribution of roles in systems. It will be appreciated that, in practice, systems are likely to incorporate different subsets of the various features and subsystems described with respect to figure 10.

[0110] Examples of devices comprising at least some elements of the system described with reference to figure 10 and suitable for implementing embodiments of the invention include cellular telephone handsets including smart phones, and vehicle navigation systems.

[0111] Figure 11 shows a smartphone device adaptable to constitute an embodiment. As shown in figure 11, the smartphone device incorporates elements 1001, 1002, 1003, 1020, optional near field communications interface 1021, flash memory 1033 and elements 1014, 1015, 1016, and 1011 as described above. It is in communication with the telephone network 1074 and a server 1076 via the network 1075. Alternative communication mechanisms such as a dedicated network or Wi-Fi may also be used. The features disclosed in this figure may also be included within a tablet device.

[0112] Figure 12 shows an air traffic control desk adaptable to constitute an embodiment. As shown in figure 12, the air traffic control desk comprises elements 1001, 1002, 1003, 1020, 1014, 1015, 1016, 1011, 1031, 1032, 1033 as described above. As shown, it is in communication with a drone 1001 via a communications satellite 1002 and a radio antenna 1003 coupled to the communications interface 1020. As shown, the desk comprises a seat and joysticks, either of which may constitute suitable locations for any user status sensors and/or vibration transducers as discussed above. Alternative communication mechanisms may also be used.

[0113] Further embodiments may be based on or include immersive environment devices such as the HTC Vive, Oculus Rift, etc., or other hybrid devices such as the Hololens or the Meta Vision 2.

[0114] It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.

[0115] The subject matter of the present disclosure includes all novel and non-obvious combinations and sub-combinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.


Claims

1. A mobile entity manager for managing a mobile entity in a physical space, said mobile entity manager comprising
a graphics renderer adapted to generate a graphical representation of said physical space at a first scale,
a representation engine adapted to define a reference point in said physical space, to determine a position of said mobile entity with respect to said reference point, and to determine a predicted time of arrival of said mobile entity within a region defined with respect to said reference point,
said graphics renderer being further adapted to modify said graphical representation to incorporate an indication of the position of said mobile entity on a path with respect to said region, wherein said indication is situated in a position in said representation of said physical space at a distance from said reference point, said distance being proportional to said predicted time of arrival.
 
2. A method of managing a mobile entity in a physical space, said method comprising the steps of:

defining a graphical representation of said physical space at a first scale,

defining a reference point in said physical space,

determining a position of said mobile entity with respect to said reference point,

determining a predicted time of arrival of said mobile entity within a region defined with respect to said reference point,

modifying said graphical representation to incorporate an indication of the position of said mobile entity on a path with respect to said region, wherein said indication is situated in a position in said graphical representation of said physical space at a distance from said reference point, said distance being proportional to said predicted time of arrival.


 
3. The method of claim 2 comprising the further steps of receiving a user input specifying a new position of said indication on said path, calculating one or more variations in the speed of said mobile entity which would modify the predicted time of arrival of said mobile entity to correspond to said new position of said indication, and issuing a communication to said mobile entity indicating said variations.
 
4. The method of claim 2 or 3 comprising the further steps of receiving a user input specifying a new position of said indication on said path, calculating a modification of said path which would modify the predicted time of arrival of said mobile entity to correspond to said new position of said indication, and issuing a communication to said mobile entity indicating said modification.
 
5. The method of any of claims 2 to 4 wherein at said step of modifying said graphical representation to incorporate an indication of the position of said mobile entity with respect to said region, said indication is situated in a position in said graphical representation of said physical space with an orientation with respect to said reference point corresponding to the relative orientation of said entity to said reference point.
 
6. The method of any of claims 2 to 5 comprising the further steps of
determining a position of a further said mobile entity with respect to said reference point,
determining a predicted time of arrival of said further mobile entity within said region,
and modifying said graphical representation to incorporate an indication of the position of said further mobile entity with respect to said region, wherein said indication is situated on said path in a position in said graphical representation of said physical space at a distance from said reference point, said distance being proportional to said predicted time of arrival of said further mobile entity.
 
7. The method of claim 6 comprising a further step of determining, on the basis of a speed of said mobile entity and a speed of said further mobile entity, a timing for a convergence of the two entities.
 
8. The method of any of claims 2 to 7 wherein said path corresponds to a path in said physical space that said mobile entity is expected to follow.
 
9. The method of any of claims 2 to 8 wherein the factor by which said distance is proportional to said predicted time of arrival varies as a function of the position of said indication on said path, or wherein the factor by which said distance is proportional to said predicted time of arrival varies as a function of said orientation.
 
10. The method of any of claims 2 to 9 comprising the further step of
modifying said graphical representation to incorporate a speed vector indicator graphic, wherein said speed vector indicator graphic has a dimension proportional to the speed of said mobile entity in said physical space.
 
11. The method of any of claims 2 to 10 comprising a further step of receiving a user input specifying a new position of said reference point, and repeating said steps of
determining a position of said mobile entity, determining a predicted time of arrival of said mobile entity, and modifying said graphical representation on the basis of said new position of said reference point.
 
12. The method of any of claims 2 to 11 comprising a further step of receiving a user input specifying a new scale for said graphical representation, and repeating said steps of
determining a position of said mobile entity, determining a predicted time of arrival of said mobile entity, and modifying said graphical representation on the basis of said new scale.
 
13. The method of any of claims 2 to 12 in which said graphical representation is three dimensional.
 
14. A computer program comprising instructions adapted to implement the steps of any of claims 2 to 13.
 




Drawing
Search report