[0001] The present invention concerns a method and a system for measuring the latency of
a graphical display output.
[0002] A system is known in which a high-speed camera is used to measure input and graphics
latency, counting frames between a visible mouse click and a display update. This
system requires user interaction.
[0003] In other systems, a test display is compared with a control display, usually a CRT,
and stopwatch software displays a running stopwatch concurrently on both displays.
The time lag between the two displays is measured by taking a photograph of both displays.
[0004] Further, there exist standards, in particular ISO 13406-2, in which a monitor is
switched from black to white and vice versa and the time between 10 and 90 percent
brightness is measured. Another standard is ISO 9241-305, in which not only black and white,
but also grey values are used.
[0005] The drawback of these systems is that they also take into account other lags
or latencies, for example the processing of the input device (input latency), without
being able to determine the graphics latency alone. Further, they do not take into
account the time a display needs to display a complete frame.
[0006] Update latency is important in the automotive domain, since various safety applications
may require driver attention, and low latency gives the driver more time to handle
a dangerous situation if the software is not able to do so.
[0007] The object of the invention is to provide a method and a system which provide an improved
measurement of the latency of the graphical display output.
[0008] According to one aspect, a method is provided for measuring the latency of a graphical
display output, comprising the following steps:
- obtaining, by a processor, a time value;
- generating, by the processor, a time stamp based on the time value;
- transmitting, by the processor, the time stamp to a graphics processing unit, GPU;
- rendering, by the GPU, a display frame including at least one graphical symbol being
based on the time stamp;
- providing the rendered frame including the at least one graphical symbol to at least
one display;
- capturing, by at least one camera, at least a portion of the display comprising the
at least one graphical symbol of the time stamp;
- determining from the at least one captured graphical symbol the rendered time stamp;
and
- comparing the rendered time stamp with a time reference for obtaining the latency
of the graphical display output.
[0009] Further embodiments may relate to one or more of the following features, which may
be combined in any technically feasible combination:
- the processor transmits to the GPU the time stamp as one or more characters and/or
one or more number values, in particular as non-graphical number values;
- the at least one graphical symbol is a visually readable graphically coded number;
- the graphical symbol is a bar code, in particular a one dimensional or two dimensional
bar code;
- the at least one graphical symbol is a black and white bar code;
- the camera is adapted to capture at least 120 frames per second and/or the camera
capture rate is at least twice the display update frequency of the at least one display;
- two graphical symbols depending on the time stamp are arranged at the top and the
bottom of the display frame, in particular adjacent the top edge and adjacent the
bottom edge of the display frame;
- the coded time stamp is compared with the time reference only if the graphical symbol
at the top of the display and the graphical symbol at the bottom of the display are
based on the same time stamp;
- the time value is obtained from a network time protocol;
- the time stamp generated comprises reduced time information with respect to
the time value;
- capturing comprises capturing, by the camera, rendered frames of at least two displays,
each of the displays displaying display frames including at least one graphical symbol
based on a time stamp, wherein at least one of the at least two displays is a reference
display, in particular a cathode ray tube display, adapted to display a frame within
a predetermined time, for example 50 ms, wherein the time reference is obtained by
determining, from the captured graphical symbol of the reference display, the rendered
time stamp;
- the method further comprises:
- assigning to each captured frame of the camera a time value indicating when the frame was captured
by the camera;
- determining the time reference includes obtaining the time value assigned to the captured
frame of the camera, wherein, in particular the at least one camera obtains a time
from the network time protocol and determines a time value depending on the obtained
time; and/or
- the method further comprises:
determining the workload of the processor;
recording a plurality of latency values and a plurality of workload values in at least
one database, such that the latency values are associated with the workload values.
[0010] According to another aspect, a system is provided for measuring the latency of a
graphical display output, comprising a first device having at least one processor and a graphics
processing unit, GPU, connected to the at least one processor, the GPU being connected
to at least one first display, wherein the processor is adapted to obtain a time value,
to generate a time stamp based on the time value and to transmit the time stamp to
the GPU, wherein the GPU is adapted to render a display frame including at least one
graphical symbol being based on the time stamp and to provide the rendered frame including
the at least one graphical symbol to the at least one first display; wherein the system
further comprises at least one camera adapted to capture at least a portion of the
first display comprising the at least one graphical symbol of the time stamp, wherein
the system is adapted to determine from the at least one captured graphical symbol
the rendered time stamp, and to compare the rendered time stamp with a time reference
for obtaining the latency of the graphical display output.
[0011] Further embodiments may relate to one or more of the following features, which may
be combined in any technically feasible combination:
The system further comprises a second device having at least one processor and a graphics processing
unit, GPU, connected to the at least one processor, the GPU being connected to at
least one second display, wherein the processor of the second device is adapted to
obtain a time value, to generate a time stamp based on the time value and to transmit
the time stamp to the GPU of the second device, wherein the GPU of the second device
is adapted to render a display frame including at least one graphical symbol being
based on the time stamp and to provide the rendered frame including the at least one
graphical symbol to the at least one second display, wherein the first and second
device obtain a time value from the same time source, wherein the camera is adapted
to capture concurrently at least a portion of the first display and at least a portion
of the second display comprising the at least one graphical symbol of the time stamp,
wherein the system is adapted to determine from the at least one captured graphical
symbol of the second display the rendered time stamp corresponding to the time reference.
[0012] Embodiments are also directed to a system for carrying out the disclosed method
steps, in particular including apparatus parts and/or devices for performing the described
method steps.
[0013] The method steps may be performed by way of hardware components, firmware, software,
a computer programmed by appropriate software, by any combination thereof or in any
other manner.
[0014] According to a further aspect, a computer program product is provided comprising
commands for executing the method according to an embodiment disclosed herein, when loaded
and executed on a processor. According to an embodiment, a computer program product
may be a physical software product, for example a hard disc, a solid state disc, a
CD-ROM, a DVD, comprising the program.
[0015] According to other aspects, the present invention relates to non-volatile memory,
for example a hard disc, a solid state device, a CD-ROM, a DVD, including a program
containing commands for executing the method according to an embodiment disclosed herein,
when loaded and executed on a processor.
[0016] Further advantages, features, aspects and details are evident from the dependent
claims, the description and the drawings.
[0017] So that the manner in which the above recited features of the present invention can
be understood in detail, a more particular description of the invention, briefly summarized
above, may be read by reference to embodiments. It is to be noted, however, that the
appended drawings illustrate only typical embodiments of this invention and are therefore
not to be considered limiting of its scope, for the invention may admit to other equally
effective embodiments.
[0018] The accompanying drawings relate to embodiments of the invention and are described
in the following:
Fig. 1 shows schematically a system for measuring the latency of a graphical display
output according to an embodiment,
Fig. 2 shows schematically the display screen of such a system;
Fig. 3 shows schematically a system for measuring the latency of a graphical display
output according to an embodiment using two displays;
Fig. 4 shows schematically a system for measuring the latency of a graphical display
output according to an embodiment with a single display; and
Fig. 5 shows a flowchart of a method for measuring the latency of a graphical display
output according to an embodiment.
[0019] Figure 1 shows schematically a system for measuring the latency of a graphical display
output. The system includes a device 3 for generating at least one display output.
According to an embodiment, the device 3 comprises a processor 5 and a graphics processing
unit (GPU) 7. Further, the device 3 optionally includes one or more hardware controllers
6 for a network and/or a bus 6a. The GPU 7 is connected to one or more displays 10a,
10b and at least one camera 12 is provided to capture a picture from the one or more
displays 10a, 10b.
[0020] The processor 5 may include one or more cores. For example, the processor 5 is a central
processing unit (CPU). Typically, each core is an independent processing unit,
independent from the other cores. The cores enable parallel computing in the processor
5.
[0021] According to an embodiment, the device 3 may comprise an interface for connecting
to one or more bus systems, for example one or more hardware controllers 6 for controller
area network (CAN) busses 6a and/or one or more hardware controllers 6 for FlexRay
busses 6a. In other embodiments, the device 3 may also comprise further hardware controllers
6 for connecting to one or more wired or wireless networks 6a, for example a Bluetooth
connection, an Ethernet network and/or a USB (Universal Serial Bus) connection.
[0022] According to embodiments, a time value is obtained by the processor 5, for example
from an internal clock 9 of the device 3 and/or of the processor 5. In other embodiments,
the processor 5 obtains the time value via a computer network 6a, for example using a
network time protocol (NTP). The network time protocol is defined in different standards,
for example RFC 5905 or RFC 1305. The network time protocol is provided for synchronization
between computer systems over variable latency data networks.
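As an illustration only, a time value could be obtained via NTP in a few lines of Python; the library ntplib and the host pool.ntp.org are example choices and are not prescribed by the disclosure:

    import ntplib  # third-party NTP client, an illustrative choice

    def obtain_time_value(host="pool.ntp.org"):
        # Return the current time as seconds since the Unix epoch, obtained via NTP.
        client = ntplib.NTPClient()
        response = client.request(host, version=3)
        return response.tx_time  # server transmit time as a float in seconds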
[0023] The processor 5 is adapted to generate the time stamp from the obtained time value
and to transmit the time stamp to the GPU 7, in particular immediately after the generation
of the time stamp. In other words, the time stamp is selected or generated by the
processor 5 immediately before transmitting a rendering request to the GPU 7. The
resulting latency is therefore a time between selecting or generating the timestamp
on the processor 5 and displaying the time stamp in the form of a barcode on the one or
more displays 10a, 10b.
[0024] The time value is for example the astronomic time. In an embodiment, the time value
includes year, month, day, hours, minutes, seconds and/or milliseconds. For example,
the time value may include all of this information. Instead of hours, minutes, seconds,
and milliseconds, the time value may include the milliseconds elapsed from the start
of the day.
[0025] According to embodiments, the time stamp generated comprises reduced time information
compared to the time value. In other words, the processor 5 is adapted to reduce the
time information with respect to the time value. For example, the time stamp includes
milliseconds elapsed from the start of the day. In an embodiment, the time stamp does
not include information about the year, the month and/or the day. For example, the
processor 5 removes the information about the year, the month and/or the day. Thus,
the amount of information that needs to be transferred to the GPU 7 is reduced. Further,
portions that are unlikely to change during the test, such as the day, month or year,
are filtered out. Therefore, a reduced time stamp is transferred to the GPU 7.
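A minimal sketch of such a reduction, assuming the time value is available as seconds since the Unix epoch (the function name is illustrative):

    from datetime import datetime, timezone

    def reduce_time_stamp(time_value_s: float) -> int:
        # Reduce a full time value to the milliseconds elapsed since the start of the day;
        # year, month and day are discarded.
        t = datetime.fromtimestamp(time_value_s, tz=timezone.utc)
        return ((t.hour * 60 + t.minute) * 60 + t.second) * 1000 + t.microsecond // 1000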
[0026] The time stamp includes for example an absolute time or a relative time, in particular
of the time when the time stamp is created. A relative time is the time passed since
a specific starting time. For example, the time stamp can include the number of seconds
and/or milliseconds since the start of the day. In such a case, the specific starting
time is the start of the day, for example midnight. The time stamp includes one or
more characters and/or numerical values representing the time. In other words, the
time stamp does not include a graphical representation.
[0027] The graphics processing unit (GPU) 7 is a device specifically adapted for calculating
and/or rendering graphics, for example two or three dimensional graphics. For that
purpose the GPU 7 has a highly parallel structure, where processing of data is done
in parallel.
[0028] According to an embodiment, the GPU 7 is separate from the one or more processors
5. In other embodiments, the GPU 7 is integrated into the one or more CPUs 5. Typically,
a GPU 7 transmits a rendered frame to the one or more displays 10a, 10b, for example
directly via a specific connection.
[0029] In some embodiments, the GPU 7 is connected to the one or more displays 10a, 10b via
the hardware controller 6 and the bus and/or the network 6a. The GPU 7 is adapted
to transmit the rendered frames via the bus and/or the network 6a to the one or more
displays 10a, 10b.
[0030] According to an embodiment, rendering is the process of generating an image or a
display frame for one or more displays from the information, here the time stamp,
received from the processor 5. For example, models or scene files, which are virtual
models defining the characteristics of objects and lights, are used as a basis for
rendering the frame. For example, during rendering it is determined which objects
are visible to the observer, whether the objects are shaded or not, the light distribution
within the scene, etc.
[0031] According to an embodiment, the GPU is adapted to generate or render display frames
from a textual and/or numerical input value, in particular the time stamp received
from the processor 5. For that purpose, the GPU can be programmed.
[0032] For example, the GPU is adapted to generate or render from the time stamp including
the one or more characters and/or numerical values at least one graphical symbol in
a display frame. The graphical symbol is for example a bar code, in particular a one
dimensional or two dimensional bar code. The bar code may be a black and white bar
code. According to an embodiment, the time stamp is transformed into a UPC (Universal
Product Code)-A barcode.
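For illustration, the reduced time stamp (at most eight digits for the milliseconds of a day) could be zero-padded to the eleven data digits of a UPC-A code and completed with the standard UPC-A check digit; the sketch below covers only this digit-level encoding, not the drawing of the bars, and the function name is illustrative:

    def upc_a_digits(ms_since_midnight: int) -> str:
        # Build the 12 digits of a UPC-A code (11 data digits plus check digit).
        data = f"{ms_since_midnight:011d}"          # zero-pad to 11 data digits
        odd = sum(int(d) for d in data[0::2])       # digits in positions 1, 3, 5, ... (1-based)
        even = sum(int(d) for d in data[1::2])      # digits in positions 2, 4, 6, ...
        check = (10 - (3 * odd + even) % 10) % 10   # standard UPC-A check digit
        return data + str(check)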
[0033] Figure 2 discloses schematically an embodiment of a display frame sent to a display
10a, 10b. The display frame has a top edge 14 and a bottom edge 16. Two graphical
symbols 18, 20 calculated based on the time stamp are arranged at the top and the
bottom of the display frame, in particular adjacent the top edge 14 and adjacent the
bottom edge 16 of the display frame. In other embodiments, the display frame may include
more or fewer graphical symbols based on the time stamp. For example, the display frame
may include a single graphical symbol or three or more graphical symbols based on
the time stamp.
[0034] In the embodiment shown in Figure 2, the graphical symbols 18 and 20 are different
even though they are based on the same time stamp. For example, the top graphical
symbol and the bottom graphical symbol encode the same time stamp in a different
manner. In the example shown, the encoded numbers of the top and bottom graphical
symbols 18, 20, at the same position, are complemented to a predetermined number,
for example 9. In other words, if at a first position the digit is 1 in the top graphical
symbol 18, the first position of the bottom graphical symbol 20 is 8.
[0035] This makes it possible to determine in a simple manner whether the frame has been
displayed completely, by comparing the top graphical symbol 18 and the bottom graphical symbol
20. In other words, it is determined whether the graphical symbols 18, 20 are based on
or refer to the same time stamp. In the example shown, the sum of the encoded time
stamps equals a predetermined number, which in the example of Figure 2 is 9999999.
If this is not the case, the camera 12 might have captured a state of the display
10a, 10b in the middle of a frame update.
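A possible sketch of how the two complementary digit strings could be derived from one time stamp, assuming a seven-digit encoded number as in the example of Figure 2 (names are illustrative):

    def complementary_pair(ms_since_midnight: int, digits: int = 7):
        # Digit strings for the top and bottom symbols; corresponding digits sum to 9.
        top = f"{ms_since_midnight % 10**digits:0{digits}d}"
        bottom = "".join(str(9 - int(d)) for d in top)   # nines' complement of each digit
        # by construction int(top) + int(bottom) == 10**digits - 1, i.e. 9999999 for 7 digits
        return top, bottom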
[0036] The generated display frame is then transmitted by the GPU 7 to the one or more displays
10a, 10b, either via a direct connection as shown in Figure 1 or via a bus and/or
network connection using the hardware controller 6 and the bus and/or network 6a.
[0037] The display or displays 10a, 10b then display the received display frame upon reception.
[0038] In case only a single display 10a is used, the display 10a is the display whose
latency is to be measured.
[0039] The latency is the time from generation of the time stamp provided to the GPU until
the complete display of the display frame at the display. In other words, the graphical
latency is determined in the present disclosure.
[0040] In case a plurality of displays 10a, 10b is used, all the displays 10a, 10b are the
displays for which the latency may be measured.
[0041] In other embodiments, in case of a plurality of displays 10a, 10b, one of the displays,
for example display 10b, is a reference display. The reference display is a display
with known display capabilities, for example having a known display latency. In an
example, the reference display is adapted to display a complete frame within 50 milliseconds
after the display frame has been provided to the input port of the display. Then,
the latency of a display may be measured with respect to the reference display. If,
for example, the same GPU is used, the latency due to the GPU 7 can be considered
to be the same for all displays.
[0042] In other systems, two or more devices 3 are used, each connected to a respective display
10a, 10b. In such a case the time stamps of the devices 3 are synchronized using the
NTP protocol. Typically, in such systems the time difference between the different
devices 3 is below 1 millisecond.
[0043] In an embodiment, the reference display is a cathode ray tube (CRT) display. A CRT
is considered to have nearly no latency itself.
[0044] The system further includes the at least one camera 12. The camera 12 has a field
of view 22. The field of view 22 of the camera(s) 12 is arranged such that one or
more portions of the display frame including the graphical symbols 18, 20 are included
in the field of view 22.
[0045] In an embodiment, a single camera 12 is used capturing all graphical symbols 18,
20 of all the displays 10a, 10b.
[0046] The at least one camera 12 is for example a camera capable of capturing 120 frames
per second. In an embodiment, the camera speed required depends on the display update
frequency. For example, for a display update rate of 60 Hz, the minimum capture rate of the
camera is 120 frames per second, and preferably 240 frames per second. In some embodiments,
the camera capture rate should be at least twice the display update frequency, in
particular that of the fastest display 10a, 10b captured by the camera 12. This may be
necessary, for example, in order to capture complete frames. In some embodiments, the
capture rate of the camera is at least four times the display update rate.
[0047] In some embodiments, the precision of the disclosed method may be improved by using
displays with a higher update frequency and cameras with a higher capture rate in frames
per second.
[0048] The at least one camera 12 may be connected to an evaluation device 24 in order to
provide the captured frames to the evaluation device 24. The evaluation device 24
is provided to determine the latency of the at least one display 10a, 10b. For example,
the at least one camera 12 is connected via the network and/or bus 6a to the evaluation
device 24. In other embodiments, the at least one camera 12 is directly connected
to the evaluation device 24, for example using a USB (Universal Serial Bus) connection.
[0049] In some embodiments, the evaluation device 24 is integrated into the device 3. In
other words, the evaluation device 24 and the device 3 are the same device.
[0050] In some embodiments, which may be connected to other embodiments, the at least one
camera 12 is adapted to obtain a time value, for example from the device 3, the evaluation
device 24 and/or the bus and/or network 6a using the network time protocol.
[0051] The at least one camera 12 may associate the time value with a captured frame. For
example, the time value can be stored with the captured frame.
[0052] According to some embodiments, the at least one camera 12 captures frames for
a predetermined time, for example for several minutes.
[0053] The captured frame is provided to the evaluation device 24. The evaluation device
24 is adapted to analyze each captured frame. For example, the evaluation device is
adapted to convert the graphical symbols 18, 20 into numbers and/or a time stamp.
For that purpose, in case the graphical symbol is a barcode, automated barcode
recognition software may be used.
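As an illustration, off-the-shelf libraries could perform this step; OpenCV and pyzbar in the sketch below are example choices not named in the disclosure:

    import cv2                        # illustrative: OpenCV for reading captured frames
    from pyzbar.pyzbar import decode  # illustrative: zbar-based barcode recognition

    def decode_symbols(frame_path: str) -> list[str]:
        # Decode all barcodes visible in a captured camera frame into digit strings.
        image = cv2.imread(frame_path)
        return [symbol.data.decode("ascii") for symbol in decode(image)]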
[0054] Then the evaluation device 24 is adapted to compare the decoded time stamp with a
time reference. In some embodiments, the coded time stamp is compared with the time
reference if, in particular only if, the graphical symbol at the top of the display
and the graphical symbol at the bottom of the display refer to the same time stamp.
For example, this is done by adding the numbers of the decoded symbols and comparing
them with a predefined number, as explained above.
[0055] In other words, display frames that were captured in the middle of a display update,
that is, frames in which the graphical symbols at the top and the bottom of the display refer
to different time stamps, are discarded.
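A minimal sketch of this completeness check, assuming seven-digit symbols and the predetermined sum 9999999 of the example above (names are illustrative):

    def frame_is_complete(top_digits: str, bottom_digits: str, digits: int = 7) -> bool:
        # True if top and bottom symbols encode the same time stamp,
        # i.e. the captured frame shows a completely displayed display frame.
        return int(top_digits) + int(bottom_digits) == 10**digits - 1  # 9999999 for 7 digits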
[0056] Figure 3 shows a system with a first display 10a and a second display 10b, wherein
the second display is a reference display. The displays 10a and 10b may be connected
to the same device 3 or to two different devices 3. In the second case (two different
devices 3, for example a first device 3 and a second device 3), each device 3 has
obtained the time value from the same time source, for example via the network time
protocol and/or the clock 9. Thus, both devices 3 are synchronized to the same clock.
Both displays 10a, 10b, for example a first display 10a connected to the GPU of the
first device and a second display 10b connected to the GPU 7 of the second device
3, are within the field of view 22 of the camera 12.
[0057] The evaluation device 24 then determines in a first step from the graphical symbols
18, 20 the time stamps of a complete displayed display frame. In a second step, the
time stamp of the first display 10a is then compared with the time stamp of the second
display 10b. In case the second display 10b is a reference display, the latency of
the first display 10a can be calculated. In some embodiments, the time stamp difference
is calculated and analyzed. This allows an automatic comparison of the graphical latency
between the two displays. In the embodiment of Figure 3, it is not necessary to associate
a captured frame of the camera with a time stamp, but it is also possible to do so.
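A possible sketch of this comparison, assuming both decoded time stamps are milliseconds since the start of the day and the latency of the reference display is known (for example the 50 ms mentioned above); the function name is illustrative:

    def latency_vs_reference_ms(ts_test_ms: int, ts_reference_ms: int,
                                reference_latency_ms: float = 0.0) -> float:
        # At the capture instant the reference display lags only by its known latency,
        # so the test display lags behind by the difference of the decoded time stamps.
        return (ts_reference_ms - ts_test_ms) + reference_latency_ms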
[0058] Typical barcode recognition software handles many types of image distortion, for
example camera angles and perspective distortions, and allows reliable time stamp recognition.
[0059] In another embodiment, for example as shown in Figure 4, as indicated above, the
captured frames of the at least one camera 12 are associated with a time value indicating when
the frame was captured by the at least one camera 12. For example, the camera 12 saves
the time of the start of recording, and the time value or time stamp of each frame can then
be calculated from the time of the start of recording, the number of the captured frame and
a specified, predefined frame rate. In other words, a time value or a time stamp
is embedded into or associated with each captured frame.
[0060] The evaluation device 24 is adapted to receive the captured frames from the at least one
camera 12 and to determine, optionally, a complete displayed display frame.
[0061] Then, the time stamp of the captured display frame is determined from the at least
one graphical symbol 18, 20 in the captured frame. The evaluation device is adapted
to compare the determined time stamp with the time value associated with the captured
frame, from which the time stamp was determined, in order to determine the graphical
latency. For example, a difference is calculated between the determined time stamp and
the associated time value in order to determine the graphical latency.
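A sketch of this evaluation, assuming the capture time of each frame is derived from the recording start time, the frame index and the frame rate, and that the decoded time stamp is given in milliseconds since the start of the day (names are illustrative):

    def capture_time_ms(start_of_recording_ms: float, frame_index: int, frame_rate: float) -> float:
        # Time value assigned to a captured frame, in milliseconds since the start of the day.
        return start_of_recording_ms + frame_index * 1000.0 / frame_rate

    def graphical_latency_ms(decoded_time_stamp_ms: int, capture_ms: float) -> float:
        # Latency between generating the time stamp and its complete display, as seen by the camera.
        return capture_ms - decoded_time_stamp_ms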
[0062] In such a case, no reference system is necessary. In other embodiments, more
than a single display can also be evaluated at the same time with a single camera 12. In
other embodiments, the systems described with respect to Figures 3 and 4 could be
combined.
[0063] Figure 5 shows a flow chart of a method according to an embodiment. In a first step
100, the processor 5 obtains a time value, for example from the clock 9 or via
the network and/or bus 6a using the network time protocol. Thus, the clock 9 and the
network time protocol each serve as a time source.
[0064] In step 102, the processor 5 determines or creates a time stamp based on the time
value and transmits the time stamp to the GPU 7.
[0065] In the next step 104, the GPU 7 calculates at least one graphical symbol 18, 20 from
the time stamp and renders a display frame including the at least one graphical symbol
18, 20. In an embodiment, the calculating of the at least one graphical symbol and
the rendering of the display frame can take place at the same time during the same calculation.
[0066] For example, the rendered display frame includes two graphical symbols, a first graphical
symbol 18 at the top edge of the display frame and a second graphical symbol 20 at
the bottom edge of the display frame.
[0067] In the next step 106, the GPU 7 provides the rendered display frame including the
at least one graphical symbol to at least one display 10a, 10b, for example via a
respective direct connection, as shown in Figure 1, or via a network and/or bus connection
6a.
[0068] In step 108 the at least one camera 12 captures at least a portion of the at least
one display 10a, 10b comprising the graphical symbol 18, 20 of the time stamp. The
at least one camera 12 provides the captured frame to the evaluation device 24. Preferably,
the camera captures all graphical symbols of the display frame displayed by the at
least one display 10a, 10b.
[0069] Finally, for example by the evaluation device 24, the time stamp is determined or
decoded from the captured graphical symbol(s) and the time stamp is compared with
a time reference in step 110.
[0070] In some embodiments, which may be combined with other embodiments disclosed herein,
the workload of the processor 5 and/or the GPU 7 may be varied over time and the
latency of the display is determined over time. For that purpose, the evaluation
device 24 may store the latency and workload values, in particular subsequent latency
values of the same display 10a, 10b and/or subsequent workload values, in at least
one database, in particular so that the workload values of the processor 5 can be
associated with corresponding individual latency values. The workload is the amount
of work that the processor 5 and/or the GPU 7 needs to perform.
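A minimal sketch of such a recording, using SQLite as one possible database (table and column names are illustrative):

    import sqlite3

    def record_measurement(db_path: str, timestamp_ms: int,
                           latency_ms: float, workload_percent: float) -> None:
        # Store a latency value together with the processor workload observed at the same time.
        with sqlite3.connect(db_path) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS measurements "
                "(timestamp_ms INTEGER, latency_ms REAL, workload_percent REAL)"
            )
            conn.execute(
                "INSERT INTO measurements VALUES (?, ?, ?)",
                (timestamp_ms, latency_ms, workload_percent),
            )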
[0071] Depending on the nature of the running applications, the workload can have peaks, when
the processor 5 and/or the GPU 7 is suddenly very busy with calculations. During these peaks
the latency of many parameters of the system usually also increases.
[0072] According to embodiments, in particular for safety applications, the latency is continuously
measured.
[0073] The present disclosure enables a precise detection of graphical latency.
[0074] According to the invention, the task of comparing the performance of different solutions
for transferring graphics, in particular over a network, as well as measuring the latency
with which the display gets updated, is solved.
[0075] Further, the disclosure allows automatic measurement of latency, producing more data
points at a much higher frequency and allowing a more complex analysis of the graphical latency
distribution under different workloads.
[0076] Further, the method and the system according to the invention enable an automatic
latency measurement. Further, any consumer grade camera capable of capturing 120 frames
per second or more can be used to measure graphics latency, so that it is not necessary
to use industrial grade high-speed cameras.
[0077] The present disclosure can be used as part of continuous integration regression
tests for graphical system developers, detecting latency problems as soon as they
are introduced.
[0078] In some examples of implementation, any feature of any embodiment described herein
may be used in combination with any feature of any other embodiment described herein.
1. Method for measuring the latency of a graphical display output, comprising the following
steps:
- obtaining, by a processor (5), a time value;
- generating, by the processor (5), a time stamp based on the time value;
- transmitting, by the processor, the time stamp to a graphics processing unit (7),
GPU;
- rendering, by the GPU (7), a display frame including at least one graphical symbol
being based on the time stamp;
- providing the rendered frame including the at least one graphical symbol to at least
one display (10a, 10b);
- capturing, by at least one camera (12), at least a portion of the display comprising
the at least one graphical symbol (18, 20) of the time stamp;
- determining from the at least one captured graphical symbol the rendered time stamp;
and
- comparing the rendered time stamp with a time reference for obtaining the latency
of the graphical display output.
2. Method according to claim 1, wherein the processor transmits to the GPU the time stamp
as one or more characters and/or one or more number values, in particular as non-graphical
number values.
3. Method according to claim 1 or 2, wherein the at least one graphical symbol (18, 20)
is a visually readable graphically coded number.
4. Method according to one of the preceding claims, wherein the graphical symbol (18,
20) is a bar code, in particular a one dimensional or two dimensional bar code.
5. Method according to one of the preceding claims, wherein the at least one graphical
symbol (18, 20) is a black and white bar code.
6. Method according to one of the preceding claims, wherein the camera (12) is adapted
to capture at least 120 frames per second and/or the camera capture rate is at least
twice the display update frequency of the at least one display (10a, 10b).
7. Method according to one of the preceding claims, wherein two graphical symbols (18,
20) depending on the time stamp are arranged at the top and the bottom of the display
frame, in particular adjacent the top edge and adjacent the bottom edge of the display
frame.
8. Method according to claim 7, wherein the coded time stamp is compared with the time
reference only if the graphical symbol (18) at the top of the display and the graphical
symbol (20) at the bottom of the display (10a, 10b) are based on the same time stamp.
9. Method according to one of the preceding claims, wherein the time value is obtained
from a network time protocol.
10. Method according to one of the preceding claims, wherein the time stamp generated
comprises reduced time information with respect to the time value.
11. Method according to one of the preceding claims, wherein capturing comprises capturing,
by the camera (12), rendered frames of at least two displays (10a, 10b), each of the
displays displaying display frames including at least one graphical symbol (18, 20)
based on a time stamp, wherein at least one of the at least two displays is a reference
display, in particular a cathode ray tube display, adapted to display a frame within
a predetermined time, for example 50 ms, wherein the time reference is obtained by
determining, from the captured graphical symbol of the reference display (10b), the
rendered time stamp.
12. Method according to one of the claims 1 to 10, further comprising:
- assigning to each captured frame of the camera a time value indicating when the frame was
captured by the camera (12);
- determining the time reference includes obtaining the time value assigned to the
captured frame of the camera (12), wherein, in particular the at least one camera
(12) obtains a time from the network time protocol and determines a time value depending
on the obtained time.
13. Method according to one of the preceding claims, further comprising:
determining the workload of the processor (5);
recording a plurality of latency values and a plurality of workload values in at least
one database, such that the latency values are associated with the workload values.
14. System for measuring the latency of a graphical display output, comprising a first
device (3) having at least one processor (5) and a graphics processing unit (7), GPU, connected
to the at least one processor (5), the GPU being connected to at least one first display
(10a, 10b), wherein the processor is adapted to obtain a time value, to generate a
time stamp based on the time value and to transmit the time stamp to the GPU, wherein
the GPU (7) is adapted to render a display frame including at least one graphical
symbol (18, 20) being based on the time stamp and to provide the rendered frame including
the at least one graphical symbol to the at least one first display (10a); wherein
the system further comprises at least one camera (12) adapted to capture at least
a portion of the first display (10a) comprising the at least one graphical symbol
(18, 20) of the time stamp, wherein the system is adapted to determine from the at
least one captured graphical symbol (18, 20) the rendered time stamp, and to compare
the rendered time stamp with a time reference for obtaining the latency of the graphical
display output.
15. System according to claim 14, further comprising a second device (3) having at least one
processor (5) and a graphics processing unit (7), GPU, connected to the at least one processor
(5), the GPU being connected to at least one second display (10b), wherein the processor
(5) of the second device (3) is adapted to obtain a time value, to generate a time
stamp based on the time value and to transmit the time stamp to the GPU of the second
device (3), wherein the GPU (7) of the second device (3) is adapted to render a display
frame including at least one graphical symbol (18, 20) being based on the time stamp
and to provide the rendered frame including the at least one graphical symbol to the
at least one second display (10b), wherein the first and second device obtain a time
value from the same time source, wherein the camera (12) is adapted to capture concurrently
at least a portion of the first display (10a) and at least a portion of the second display
(10b) comprising the at least one graphical symbol (18, 20) of the time stamp, wherein
the system is adapted to determine from the at least one captured graphical symbol
(18, 20) of the second display (10b) the rendered time stamp corresponding to the
time reference.