[0001] The present invention relates to a method for controlling an elevator system, a control
system for the elevator system and to the elevator system.
[0002] Certain disabilities or circumstances may make it difficult for a person to press
a call button of an elevator to travel to their intended floor. For example, the buttons
may be out of reach to individuals in a wheel-chair, the use of crutches or other
aids may pose a problem, or decreased motor function may make it difficult to press
the intended button. Others may be hesitant to press the buttons for sanitary reasons
or may want the convenience to call a floor hands-free if they are holding items in
both of their hands.
[0003] JP 2010 100 370 describes that a line of sight of a passenger of an elevator car is determined and
that the elevator car is stopped at the next floor, when the passenger looks at a
specific location. This may be used as an alternative to a security button.
[0004] WO 2011 114 489 A1 relates to a guide device for an elevator. The guide device comprises a camera, which
takes pictures around the entrance of the elevator. Based on the pictures, it is detected
whether or not a person has entered the elevator.
[0005] WO 2005 56251 A1 describes an elevator system with a camera, which detects the face of a person and
determines therefrom, whether the person uses a wheel-chair or not.
[0006] There may be a need for a hands-free and economic control method of an elevator system.
[0007] Such a need may be met with the subject-matter of the independent claims. Advantageous
embodiments are defined in the dependent claims. Ideas underlying embodiments of the
present invention may be interpreted as being based, inter alia, on the following
observations and recognitions.
[0008] An aspect of the present invention relates to a method for controlling an elevator
system. The method may be automatically performed by a control system of the elevator
system and/or may be implemented as a computer program. In general, an elevator system
may be any device adapted for transporting persons and/or goods vertically with an
elevator car. An elevator system may be installed in a building in an elevator shaft.
[0009] According to an embodiment of the invention, the method comprises: detecting the
presence of a person in front of a control panel of the elevator system with a presence
detection sensor; in the case that a person has been detected, detecting the presence of
at least one eye of a person in front of the control panel from a video stream of
a camera of the control panel; in the case that the eye has been detected, determining
a gaze point of the eye on the control panel from a video stream of the camera and/or
a further camera, wherein a line of sight of the eye is determined from an image of
the eye in the video stream and the gaze point is determined by intersecting the line
of sight with a component of the control panel stored in a virtual layout of the control
panel; determining a selected floor from the gaze point; and controlling the elevator
system to move an elevator car to the selected floor.
[0010] The control system of the elevator system may wait for a person to appear in front
of a control panel. In this case, a camera may start to work and/or the video stream
of the camera may be analysed to determine whether an eye of the person is visible to the camera
or not. It may be possible that an eye detection module analyses the video stream
and/or one or more images of the video stream. The eye detection module may be based
on a neural network and/or machine learning and/or may have been trained to detect
a portion of an image that contains an eye.
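As a purely illustrative example, a minimal sketch of such an eye detection module is given below. It assumes OpenCV's pre-trained Haar eye cascade as the detector; the embodiments above only require some trained detector, for example a neural network, and the function name is chosen for illustration.

```python
import cv2  # OpenCV is only one possible detector backend (assumption)

# Pre-trained Haar cascade shipped with OpenCV; a neural-network based
# detector trained as described above could be used instead.
_EYE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")


def detect_eye_regions(frame):
    """Return a list of (x, y, w, h) image portions that contain an eye,
    or an empty list when no eye is visible in the given video frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return list(_EYE_CASCADE.detectMultiScale(
        gray, scaleFactor=1.1, minNeighbors=5))
```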
[0011] When the eye is visible, the video stream may be analysed to determine a line of
sight of the eye and/or whether the person is looking at the control panel. A gaze
tracking module that analyses one or more video streams of one or more of the cameras
may perform the determination of the line of sight. For example, from the portion
of the image and/or video stream that has been detected, the gaze tracking module
may determine reflections on the eye. The reflections may have been caused by infrared
lights and/or by other light sources, such as the elevator car lighting. From the reflections
an orientation of the eye and/or view direction may be determined. From the position
of the eye and the orientation a line of sight of the eye may be determined.
[0012] From the line of sight, a gaze point on the control panel may be determined, which
also may be performed by the gaze tracking module. The position of the camera and/or
of the control panel and/or optionally a virtual layout of the control panel may be
stored in a controller, which performs the method. From this information, it may be
determined at which part of the control panel the person is looking. For example,
such a part may be a display and/or may be a button, such as a floor selection button,
of the control panel. Thus, from the gaze point it may be determined to which floor
the person intends to go. The elevator car then may be controlled to move to this
floor.
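One possible way to compute the gaze point from the line of sight is a ray-plane intersection with the (planar) control panel, as in the following sketch. It assumes that the eye position, the gaze direction and the panel geometry are already expressed in a common coordinate system; all names are illustrative.

```python
import numpy as np


def gaze_point_on_panel(eye_pos, gaze_dir, panel_origin, panel_normal):
    """Intersect the line of sight (eye_pos + t * gaze_dir) with the plane
    of the control panel and return the 3-D gaze point, or None when the
    person is not looking towards the panel."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    denom = float(np.dot(panel_normal, gaze_dir))
    if abs(denom) < 1e-6:      # line of sight runs parallel to the panel
        return None
    t = float(np.dot(panel_normal, panel_origin - eye_pos)) / denom
    if t <= 0:                 # the panel lies behind the person
        return None
    return eye_pos + t * gaze_dir
```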
[0013] The virtual layout may comprise the positions and/or extensions of buttons on the
control panel, the positions and/or extensions of visual control commands on a display
of the control panel, etc.
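Such a virtual layout could, for example, be stored as a list of named rectangles in panel coordinates, together with a hit test that maps a gaze point to the component being looked at; the classes and field names below are assumptions made for this sketch.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class PanelComponent:
    name: str      # e.g. "button_floor_3" or "cmd_yes" (illustrative names)
    x: float       # position of the component on the panel
    y: float
    width: float   # extension of the component
    height: float


def component_at(layout: List[PanelComponent], gx: float,
                 gy: float) -> Optional[PanelComponent]:
    """Return the layout component that contains the gaze point (gx, gy),
    or None when the gaze point is not on any stored component."""
    for comp in layout:
        if comp.x <= gx <= comp.x + comp.width and \
                comp.y <= gy <= comp.y + comp.height:
            return comp
    return None
```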
[0014] It also may be that a floor selection is cancelled by looking at a visual cue, such
as a text, a symbol, etc. and then moving the gaze point to the previously selected
floor. Such a visual cue also may be displayed as visual control command on a display.
[0015] The implementation of an eye tracking system for calling floors may help facilitate
the independence of disabled people as well as provide convenience to others. With the method,
an elevator floor destination may be selected without requiring tactile input. The method
may use an eye-tracking and gaze point detection algorithm so that a person can
send command prompts to the control panel hands-free.
[0016] This may result in an increased efficiency and less time spent picking floors. There
may be no need to request other persons to push a button. Disabled persons may use
the elevator easily, for example wheel-chair users may select the floor hands-free.
Furthermore, additional support may be provided to persons under situations of stress
and emergency.
[0017] According to an embodiment of the invention, the method further comprises: playing
an audio prompt instructing the person to look at the control panel, when a person
has been detected and no eye has been detected. In the case that the controller is not
able to detect an eye in the video stream of the camera, an audio prompt with instructions
on how a floor can be selected with gaze tracking may be output via a loudspeaker.
[0018] According to an embodiment of the invention, the method further comprises: determining
the presence of the eye with a second camera of the control panel, when no eye has
been detected with the camera being a first camera. The control panel may comprise
more than one camera, which is used for eye tracking. When the controller analyses
the video stream of the first camera and is not able to detect an eye, it may switch
to the video stream of another camera. The video stream from the second camera may
be analysed for the presence of an eye and/or, when an eye has been detected for a
line of sight and/or the gaze point. To this end, a position of the second camera
relative to the control panel may be stored in the controller.
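The fallback to a second camera could be realised as a simple loop over the available cameras, as sketched below; detect_eye_regions refers to the illustrative detector sketched earlier, and the camera interface (read_frame) is an assumption.

```python
def find_camera_with_eye(cameras, detect_eye_regions):
    """Try the first camera, then the second (and any further) camera until
    an eye is detected; return the camera and the detected eye region, or
    (None, None) when no camera sees an eye."""
    for camera in cameras:            # e.g. [first_camera, second_camera]
        frame = camera.read_frame()   # assumed camera interface
        eyes = detect_eye_regions(frame)
        if eyes:
            return camera, eyes[0]
    return None, None
```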
[0019] According to an embodiment of the invention, the second camera is installed lower
than the first camera. For example, the first camera (or the second camera) may be
installed at a height adapted for eye tracking of a standing adult person. The
second camera (or the first camera) may be installed at a height adapted for eye tracking
of a child and/or a person in a wheel-chair.
[0020] According to an embodiment of the invention, the gaze point is determined with the
second camera. It may be that the first camera is solely used for determining whether
an eye is visible or not and the second camera is then used for eye tracking. For
example, the first camera may be less power-consuming than the second one and this may
save power consumed by the control system.
[0021] According to an embodiment of the invention, the method further comprises: displaying
visual control commands on a display of the control panel. For example, when a person
has been detected and/or when the eye has been detected, control commands that may
be selected with the eye are displayed. Such control commands may comprise commands
like "Do you need help? Yes/No", "Do you want to move up? Yes/No", etc. The control
commands may provide possibilities to control the movement of the elevator car and/or
the selection of a floor. In general, the display may show visual symbols and/or text
as control commands. The display and/or screen may be integrated into the control
panel or may be provided in the elevator car. It may allow a person to interact with
the control system with his or her eye movement. The person may interact with control
command prompts on the display screen in an emergency scenario. It also may be possible
for a person at an offsite location to control what is being displayed on the display.
[0022] According to an embodiment of the invention, the method further comprises: selecting
a control command by determining, whether the gaze point is on the control command.
The selected floor then may be determined with the selected control command. For example,
the control command may be a number representing a floor. When the gaze point stays
on this number, the control system may decide that the respective floor has been selected.
[0023] According to an embodiment of the invention, the method further comprises: selecting
a button on the control panel by determining, whether the gaze point is on the button.
The selected floor then may be determined from the selected button. It has to be noted
that the same button also may be pressed for selecting the same floor. With the method,
the floor may not solely be selected by pressing a button, but also by looking at
the button. It also may be possible that other operations are initiated on the control
panel, such as generating an emergency call, when the person looks at an emergency
button.
[0024] For example, to ensure that a person really has selected a specific button, it may
be that the gaze point has to stay on the specific button for more than a specific
time, such as three seconds. A button press on a special button of the control panel
may be used to cancel the last stored floor call and/or any other operation.
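One way to implement such a dwell-time criterion is sketched below: a selection is only confirmed when consecutive gaze points stay on the same component for at least the configured time, for example the three seconds mentioned above. The class and method names are illustrative.

```python
import time


class DwellSelector:
    """Confirm a selection once the gaze stays on one component long enough."""

    def __init__(self, dwell_seconds: float = 3.0):
        self.dwell_seconds = dwell_seconds
        self._current = None   # component the gaze point is currently on
        self._since = None     # time when the gaze point entered it

    def update(self, component, now: float = None):
        """Feed the component under the gaze point (or None); return that
        component once it has been looked at for dwell_seconds, else None."""
        now = time.monotonic() if now is None else now
        if component is None or component != self._current:
            self._current, self._since = component, now
            return None
        if now - self._since >= self.dwell_seconds:
            self._current, self._since = None, None  # reset after a selection
            return component
        return None
```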
[0025] According to an embodiment of the invention, the presence detection sensor is a motion
detection sensor. For example, the presence detection sensor may be an infrared sensor,
an electromagnetic sensor, an ultrasonic sensor, etc. Like the one or more cameras,
the presence detection sensor may be integrated into the control panel. In particular,
the presence detection sensor may be a sensor different from the camera providing
the video stream for gaze tracking. This may provide the ability to interface with
a secondary sensor to control when the one or more cameras and/or other parts of the
control system are active, off, in a power-saving state and/or any other mode of operation.
[0026] It has to be noted that presence detection of a person in front of the control panel
also may be performed by analysing the video stream from a camera, such as the first
and/or second camera.
[0027] A further aspect of the invention relates to a control system for an elevator system,
which comprises a control panel and a controller adapted for performing the method
as described in the above and in the following. The control panel may be installed
in an elevator car and/or at a door of the elevator system. The controller may be
part of the control panel. It also may be that the controller comprises several parts,
for example a part in the control panel and a further part in the central controller
of the elevator system, which, for example, may be installed near a drive of the elevator
system. The control system may be adapted for controlling elevator calls that select
the desired floor by tracking eye movements based on where and/or what a person is
looking at.
[0028] The controller may comprise one or more processors, which are adapted for performing
the method, when a corresponding computer program is executed with them.
[0029] According to an embodiment of the invention, the control panel comprises at least
one camera adapted for eye tracking. The video stream from the at least one camera
may be analysed to determine a line of sight of the eye. It also may be that the video
stream of the at least one camera is provided to a central processing center for monitoring
for safety reasons. The central processing center may be connected to a control system
of the elevator system via the Internet and/or the video stream may be transmitted via
the Internet.
[0030] According to an embodiment of the invention, the control panel comprises a first
module and a second module. In this context, a module may be mechanically interconnected
components that can be installed as one unit in the elevator car and/or other position.
The first module of the control panel may comprise buttons, such as floor selection
buttons, and a camera. The camera may be used for eye tracking. The second module
of the control panel may comprise a display and a further camera. The display may
be used for presenting control commands to a person in front of the control panel.
The further camera alternatively and/or additionally may be used for eye tracking. A
video stream of the further camera may be transmitted to a central processing center
for monitoring the place in front of the control panel.
[0031] Alternatively, all components of the control panel as described in the above and
in the following also may be provided as one module.
[0032] According to an embodiment of the invention, the control panel comprises buttons
for manually selecting a floor. These buttons may be part of the first module.
[0033] According to an embodiment of the invention, the control panel comprises a display
for displaying control commands. The display also may be used as an information
device for deaf users. For example, information on how to use the gaze tracking method
may be displayed in text on the display. The display may be part of the second module.
[0034] According to an embodiment of the invention, the control panel comprises a loudspeaker
for outputting audio prompts. The loudspeaker also may provide audible information
for blind users. A loudspeaker may be part of the first module and/or the second module.
[0035] According to an embodiment of the invention, the control panel comprises a presence
detection sensor. The presence detection sensor may be part of the first module.
[0036] A further aspect of the invention relates to an elevator system, which comprises
an elevator car movable in an elevator shaft and a control system as described herein.
The control panel of the control system may be installed in the elevator car. However,
it also may be possible that the control panel is installed at a door of the elevator
system for getting access to the elevator car.
[0037] It has to be noted that features of the elevator system and the control system as
described herein may be features of the method for controlling the elevator system,
and vice versa.
[0038] In the following, advantageous embodiments of the invention will be described with
reference to the enclosed drawings. However, neither the drawings nor the description
shall be interpreted as limiting the invention.
Fig. 1 schematically shows an elevator system according to an embodiment of the invention.
Fig. 2 schematically shows a control panel for a control system according to an embodiment
of the invention.
Fig. 3 shows a flow diagram for a method for controlling an elevator system according
to an embodiment of the invention.
[0039] The figures are only schematic and not to scale. Same reference signs refer to same
or similar features.
[0040] Fig. 1 shows an elevator system 10 comprising an elevator car 12 movable in an elevator
shaft 14 by a drive 16. The elevator system 10 furthermore comprises a central controller
18 (which may be a part of the drive 16 or at least arranged near the drive 16) for
controlling the drive 16 and further equipment of the elevator system 10. For example,
the central controller 18 may also control elevator doors 20.
[0041] The central controller 18 may receive electronic control commands from a control
panel 22 inside the elevator car 12. It also may be that here and in the following
the control panel 22 is installed outside of the elevator car 12, for example beside
one of the doors 20. The control panel 22 and the central controller 18 may be seen
as a control system 24 of the elevator system 10.
[0042] The control panel 22 comprises a first module 26 and a second module 28, which will
be described in more detail with respect to Fig. 2.
[0043] The first module 26 comprises floor select buttons 30, an upper camera 32 arranged
above the buttons 30, a lower camera 34 arranged below the buttons 30 and a presence
detection sensor 36.
[0044] The floor select buttons 30 may be used for selecting a floor to which the elevator
car should move. There may be a button 30 for each floor. When a person pushes the
respective button 30, a corresponding electronic command is sent via a local controller
38 to the central controller 18, which then controls the elevator system 10 to move
the elevator car 12 to the respective floor. The local controller 38 may be part of
the control panel 22.
[0045] The camera 32 may generate a video stream that may be analysed by the local controller
38 to determine face data and/or retinal data of a person. Also the camera 34 may
generate a video stream to determine face data and/or retinal data of a person. This
may be performed additionally or alternatively with respect to the video stream of
the camera 32. The camera 34 may be a low-angle emergency and/or disability camera
34, for example, for high-stress situations and/or for persons with eyes on a lower
level, such as persons in a wheel-chair or children.
[0046] The lower camera 34 may provide more accurate registering of information for a person
in a wheelchair or of shorter stature. In addition, the camera 34 may be seen as a
secondary camera 34 and/or may be used instead of the primary camera 32, when the
secondary camera 34 is more viable based on the position of the person in the elevator
car 12. Additionally or alternatively, the camera 32 and the camera 34 may be used
in conjunction with each other.
[0047] From the video stream(s), the local controller 38 may determine a line of sight of
an eye of a person and a gaze point of the person, in particular a gaze point on the
module 26 and/or the module 28.
[0048] The presence detection sensor 36 may be arranged above the buttons 30 and/or the
camera 32. The presence detection sensor 36 may be adapted for detecting the presence
of a person in front of the control panel 22. For example, the presence detection
sensor 36 is adapted for sensing changes in infrared radiation and/or ultrasonic
sound, which may be caused by a human body in front of the control panel 22.
[0049] The module 28 may be arranged above the module 26. Module 28 comprises a display
40 and one or more further cameras 42. Also the camera(s) 42 may generate a video
stream, which is evaluated by the controller 38 for face tracking and/or gaze tracking
data. For example, with the camera 42, gaze tracking data relating to the display
40 may be generated and analysed by the controller 38.
[0050] For additional support and/or emergency purposes the video stream of the camera 42
may be transmitted to a central processing center, for example via the controller
38 to the controller 18, which may be connected to the central processing center via
the Internet. The camera 42 may be used for multiple purposes besides gaze detection, including
but not limited to in-car monitoring.
[0051] The display 40 may be used for displaying text prompts or visual control commands
44, such as "Yes", "No", "Up", "Down", etc. For example, a selection of choices may
be presented, such as "Yes" and "No". A text may be added, such as "Do you need help?".
When the display 40 is not used for any other functions, it may serve as an emergency
services device.
[0052] Module 28 also may comprise a loudspeaker or audio speaker 46, for example for prompting
and interacting with a person inside the elevator car 12.
[0053] Fig. 3 shows a flow diagram of a method that may be executed by the control system
24. In the following, most of the control functions are described with respect to
the local controller 38. However, it has to be understood that these functions also
may be performed by the controller 18 or by a combination of both controllers 18,
38.
[0054] In step S10, only the presence detection sensor 36 may be active, i.e. measurements
may be performed with the presence detection sensor 36 and evaluated with the local
controller 38. Other components of the control panel 22, such as the cameras 32, 34,
42 and the display 40, may be inactive.
[0055] In step S12, if the presence detection sensor 36 does not detect a human presence,
the control system 24 continues in a passive state. The control system 24 returns
to step S10 and the components 32, 34, 42 may remain inactive.
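The passive and active states of steps S10 to S14 could be realised with a loop of the following kind, in which the presence detection sensor 36 gates the power state of the cameras; the sensor and camera interfaces are assumptions made for this sketch.

```python
import time


def presence_loop(presence_sensor, cameras, handle_active_state):
    """Keep the cameras off while nobody is in front of the control panel
    (steps S10/S12) and power them on once a presence is detected (S14)."""
    while True:
        if not presence_sensor.person_detected():  # assumed sensor interface
            time.sleep(0.1)                        # remain in the passive state
            continue
        for camera in cameras:
            camera.power_on()                      # switch to the active state
        handle_active_state(cameras)               # eye detection and gaze tracking
        for camera in cameras:
            camera.power_off()                     # return to power-saving state
```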
[0056] In step S14, when the presence of a person in front of the control panel 22 of the
elevator system 10 is detected, the control system 24 is switched to an active state,
which may mean that the eye-tracking functionality is powered on. For example, firstly
the camera 32 may be powered on, which then generates a video stream, which is analysed
by the local controller 38.
[0057] In step S16, the local controller 38 starts to detect, whether there is an eye visible
in the video stream. The local controller 38 may detect, whether or not human eyes
are readable by scanning and analysing retinal data in the video stream.
[0058] For example, the local controller 38 may comprise an eye detection module, which,
for example based on a neural network and/or machine learning, has been trained to
detect portions of images that contain an eye.
[0059] In general, the presence of an eye of a person in front of the control panel 22 may
be detected with one or more video streams from one or more of the cameras 32, 34,
42 of the control panel 22.
[0060] For example, when no eye has been detected with the first camera 32, camera 34 may
be powered on and the video stream of the camera 34 may be analysed. The presence
of the eye may then be determined with the second camera 34.
[0061] In step S18, when no eye has been detected (either with one or with more cameras
32, 34, 42), the local controller 38 plays an audio prompt instructing the person to
look at the control panel 22.
[0062] In step S20, when the eye has been detected, the video stream of the respective camera
32, 34, 42 is analysed for determining a gaze point of the eye. It may be that the
gaze point is firstly determined with the first camera 32 and, if this is not possible,
secondly with the second camera 34.
[0063] The local controller 38 may comprise a gaze tracking module that analyses one or
more video streams of one or more of the cameras. For example, from the portion of
the image and/or video stream that has been detected in step S14, the gaze tracking
module may determine reflections on the eye. The reflections may have been caused
by infrared lights included in the control panel 22 and/or by other light sources,
such as an elevator car lighting. From the reflections an orientation of the eye and/or
view direction may be determined. From the position of the eye and the orientation
a line of sight of the eye may be determined.
[0064] The gaze point then may be determined by intersecting the line of sight with a component
of the control panel 22, which component is stored in a virtual layout of the control
panel. The virtual layout may comprise the positions and/or extensions of the first
and/or second module 26, 28, the positions and/or extensions of the buttons 30, the
positions and/or extensions of the visual control commands 44, etc. The virtual layout
may be stored in the local controller 38 and/or may be updated by the local controller
38, for example when the visual control commands 44 change.
[0065] Additionally, when the eye has been detected, visual control commands 44 may be displayed
on the display 40 of the control panel 22.
[0066] In step S22, the local controller 38 detects, whether or not the person is looking
at the control panel 22 and/or at which component of the control panel 22 the person
is looking.
[0067] When the gaze point is not on the control panel 22, the control system 24 may return
to step S20 and, for example, may output user instructions via the display 40 and/or
the loudspeaker 46. For example, if the eyes are not readable, an audio and/or visual
prompt may alert the person stating instructions on how to improve the chances of
the person's eyes being readable. Similarly, if the person is not looking at the control
panel 22, additional user instructions may be given.
[0068] In step S24, if the gaze point is on the control panel 22, the control system 24
identifies, which floor is selected with the gaze point, i.e. a selected floor is
determined from the gaze point. This may be done by determining, whether the person
is looking at a specific button 30 or at one of the visual control commands 44 on the display
40.
[0069] For example, a control command 44 may be selected by determining, whether the gaze
point is on the control command 44. The selected floor then may be determined with
the selected control command 44. For example, when the control command is "Up", the
selected floor may be the next floor above a current floor. By looking longer at the
control command "Up", the number of floors above the current floor may be increased.
Analogously, with the control command "Down", a floor below the current floor may
be selected.
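The relative selection with the "Up" and "Down" commands could, for example, map the additional dwell time on the command to a number of floors, as in the following sketch; the rate of one second per additional floor is an assumption chosen for illustration.

```python
def floor_from_relative_command(command: str, current_floor: int,
                                dwell_seconds: float,
                                seconds_per_floor: float = 1.0) -> int:
    """Translate an "Up" or "Down" gaze selection into a target floor; the
    longer the gaze stays on the command, the more floors are added."""
    steps = 1 + int(dwell_seconds // seconds_per_floor)
    if command == "Up":
        return current_floor + steps
    if command == "Down":
        return current_floor - steps
    return current_floor
```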
[0070] Another possibility is that the person is looking at one of the buttons 30. A button
30 on the control panel 22 may be selected by determining, whether the gaze point
is on the button 30. The selected floor then may be determined from the selected button
30.
[0071] In both cases, a layout of the control panel 22 may be stored in the local controller
38, which then can determine from the gaze point the component of the control panel
22 at which the person is looking. This layout may comprise the positions of the buttons
30 and/or the display 40.
[0072] In step S26, an electronic control command indicating which floor has been selected
is sent to the central controller 18. This electronic control command may be the same as
the one that is generated when a corresponding floor select button 30 is pushed.
[0073] In step S28, the elevator system 10 controls the drive 16 to move the elevator car
12 to the selected floor. Furthermore, other equipment, such as the doors 20, may
be controlled based on the electronic control command.
[0074] Finally, it should be noted that the term "comprising" does not exclude other elements
or steps and the "a" or "an" does not exclude a plurality. Also elements described
in association with different embodiments may be combined. It should also be noted
that reference signs in the claims should not be construed as limiting the scope of
the claims.
List of reference signs
[0075]
10 elevator system
12 elevator car
14 elevator shaft
16 drive
18 central controller
20 elevator door
22 control panel
24 control system
26 first module
28 second module
30 floor select button
32 first camera
34 second camera
36 presence detection sensor
38 local controller
40 display
42 further camera
44 visual control command
46 loudspeaker
1. A method for controlling an elevator system (10), the method comprising:
detecting the presence of a person in front of a control panel (22) of the elevator
system (10) with a presence detection sensor (36);
in the case that a person has been detected, detecting the presence of at least one eye
of a person in front of the control panel (22) from a video stream of a camera (32,
34, 42) of the control panel (22);
in the case that the eye has been detected, determining a gaze point of the eye on
the control panel (22) from a video stream of the camera (32, 34, 42) and/or a further
camera (32, 34, 42), wherein a line of sight of the eye is determined from an image
of the eye in the video stream and the gaze point is determined by intersecting the
line of sight with a component of the control panel stored in a virtual layout of
the control panel (22);
determining a selected floor from the gaze point;
controlling the elevator system (10) to move an elevator car (12) to the selected
floor.
2. The method of claim 1, further comprising:
when a person has been detected and no eye has been detected, playing an audio prompt
instructing the person to look at the control panel (22).
3. The method of claim 1, further comprising:
when no eye has been detected with the camera (32) being a first camera, determining
the presence of the eye with a second camera (34) of the control panel (22).
4. The method of claim 3,
wherein the second camera (34) is installed lower than the first camera (32).
5. The method of claim 3 or 4,
wherein the gaze point is determined with the second camera (34).
6. The method of one of the previous claims, further comprising:
displaying control commands (44) on a display (40) of the control panel (22);
selecting a control command (44) by determining, whether the gaze point is on the
control command (44);
wherein the selected floor is determined with the selected control command (44).
7. The method of one of the previous claims, further comprising:
selecting a button (30) on the control panel (22) by determining, whether the gaze
point is on the button (30);
wherein the selected floor is determined from the selected button (30).
8. The method of one of the previous claims,
wherein the presence detection sensor (36) is a motion detection sensor.
9. A control system (24) for an elevator system (10), the control system (24) comprising:
a control panel (22);
a controller (38, 18) adapted for performing the method of one of the previous claims.
10. The control system (24) of claim 9,
wherein the control panel (22) comprises at least one camera (32, 34, 42) adapted
for eye tracking.
11. The control system (24) of claim 9 or 10,
wherein the control panel (22) comprises a first module (26) and a second module (28);
wherein the first module (26) comprises buttons (30) and a camera (32, 34);
wherein the second module (28) comprises a display (40) and a camera (42).
12. The control system (24) of one of claims 9 to 11,
wherein the control panel (22) comprises buttons (30) for manually selecting a floor.
13. The control system (24) of one of claims 9 to 12,
wherein the control panel (22) comprises a display (40) for displaying control commands
(44).
14. The control system (24) of one of claims 9 to 13,
wherein the control panel (22) comprises a loudspeaker (46) for outputting audio prompts;
and/or
wherein the control panel (22) comprises a presence detection sensor (36).
15. An elevator system (10), comprising:
an elevator car (12) movable in an elevator shaft (14);
a control system (24) according to one of claims 9 to 14;
wherein the control panel (22) of the control system (24) is installed in the elevator
car (12).