[0001] This invention relates to a smart guide device for a visually impaired person according
to claim 1.
Background of the Invention
[0002] For a visually impaired person, walking alone without falling is a difficult task. It is hard for such a person to walk safely even with a stick. Many assistive devices are available on the market. Such a stick is useful for alerting the person to an obstacle while walking, but it may not indicate whether the obstacle is a pit or a bump on the road. Moreover, these devices cannot give information about the dimensional features of the obstacle. That is, whether the obstacle is a bump or any other object, the device is not capable of providing information about its height or depth.
[0003] In the prior art
KR20110078229A, a walking guide device for the blind is disclosed that comprises an ultrasonic sensor, which senses dynamic and non-dynamic obstacles using an ultrasonic signal, a sensor unit which is equipped with an RFID and obtains location guide information using an RFID signal, a vibrating motor which generates vibration if obstacles are sensed, a controller which controls the output of walk guiding information, and an output unit which outputs obstacle information by voice and indicates situation information by an LED.
[0004] In another prior art document,
CN103830071A, a walking stick provided with an infrared detector is shown. An infrared detection head is installed at the front end of the handle of the stick, and an internal circuit of the infrared detection head is connected to an alarm prompter on the handle. The handle is provided with a switch for connecting or disconnecting the infrared detection head. When the stick is used, the infrared detection head detects the condition of the road ahead in time, and the alarm prompter gives an audible prompt when an obstacle occurs, so that a blind person can walk conveniently with the walking stick.
[0005] The subject-matter of these prior art documents does not provide the visually impaired person with information about the height or depth of an obstacle. Further, the prior art may not provide the distance of the obstacle from the visually impaired person.
Object of the Invention
[0006] It is therefore the object of the present invention to provide a smart guide device for a visually impaired person to detect or measure the height or depth and the distance of any obstacle in the walking direction.
Description of the Invention
[0007] The aforementioned object is solved by providing a smart guide device [hereinafter referred to as 'device'] for a visually impaired person to detect or measure the height or depth and the distance of any obstacle in the walking direction according to claim 1. According to an embodiment, the device for a visually impaired person comprises a laser source attached to the device for generating a laser beam, an ultrasonic sensor provided near the laser source for measuring a height of the laser source from the ground, and an output mechanism for providing an alert to the visually impaired person about any obstacle in a walking direction, characterized in that a gyroscope is attached to the laser source for maintaining an angle between the laser source and the ultrasonic sensor, a camera is provided for capturing an image of a refracted laser beam from the obstacle, an image processing unit is provided for processing the image captured by the camera, and an embedded system is configured to calculate a height or depth of the obstacle based on the image of the refracted laser beam from the obstacle, the height of the laser source and the angle between a laser beam path and the ultrasonic sensor.
[0008] Further embodiments are subject-matter of the dependent claims and/or of the following
specification parts.
[0009] According to an exemplary embodiment, the camera may be maintained with a fixed focal length and a fixed screen resolution. The image of the refracted laser beam can be captured for a reference object or for the obstacle, according to claims 3 to 8. The image of the refracted laser beam includes a plurality of horizontal pixel lines (HPL); and the image processing unit is configured to measure the number of HPL between the laser line on the ground and the laser line on the reference object or the obstacle. The number of HPL can be transferred to the embedded system. The image processing unit is further configured to calculate the height of one HPL using the ratio between the height of the reference object and the number of HPL measured for the reference object, wherein the HPL height is stored in the embedded system. A vertical scaling factor is obtained from the height of the laser source from the ground and the angle between the laser beam path and the ultrasonic sensor.
[0010] According to another embodiment, the embedded system is further configured to calculate the height or depth of the obstacle based on the height of one HPL for the reference object, the number of HPL between the laser line on the ground and the laser line on the obstacle, and the vertical scaling factor. The embedded system is further configured to calculate the height or depth of the obstacle using the formula,

h = Lr × Ngo × V

wherein
h is the height or depth of the obstacle;
Lr is the height of one HPL for the reference object;
Ngo is the number of HPL between the laser line on the ground and the laser line on the obstacle; and
V is the vertical scaling factor.
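As a purely illustrative numerical example (the values are hypothetical and merely assume the product form given above): if one HPL corresponds to 0.5 cm for the reference object (Lr = 0.5 cm), 20 HPL are counted between the laser line on the ground and the laser line on the obstacle (Ngo = 20), and the vertical scaling factor is V = 1.2, the estimated height of the obstacle would be h = 0.5 cm × 20 × 1.2 = 12 cm.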
[0011] According to an exemplary embodiment, the embedded system is further configured to measure a distance of the obstacle in the walking direction from the device based on the height of the laser source from the ground and the angle between the laser beam path and the ultrasonic sensor. The obstacle can include, but is not limited to, a pit in the ground, a bump on the ground, and any object on the ground.
[0012] Further, the direction of the laser beam refraction can provide information about
whether the obstacle is a pit or a bump. The embedded system is further configured
to calculate a depth of the obstacle in case the obstacle is a pit. The depth of the
obstacle is determined based on the image of the refracted laser beam from the obstacle,
the height of the laser source and the angle between the laser beam path and the ultrasonic
sensor.
[0013] Further benefits, goals and features of the present invention will be described by
the following specification of the attached figures, in which components of the invention
are exemplarily illustrated. Components of the devices and methods according to the invention which match at least essentially with respect to their function can be marked with the same reference sign, wherein such components do not have to be marked or described in all figures.
[0014] In the following, the invention is described merely by way of example with respect to the attached figures.
Brief Description of the Drawings
[0015]
- Fig. 1
- illustrates the block diagram of a smart guide device for a visually impaired person according to the present invention;
- Fig. 2
- illustrates the graphical representation of the function of the smart guide device for a visually impaired person without an obstacle, according to the present invention; and
- Fig. 3
- illustrates the graphical representation of the function of the smart guide device for a visually impaired person with an obstacle, according to the present invention.
Detailed Description of the Drawings
[0016] Fig. 1 illustrates the block diagram 100 of a smart guide device for a visually impaired person, according to the present invention.
[0017] According to an exemplary embodiment, the smart guide device [hereinafter referred to as "device"] for a visually impaired person comprises a laser source 101, an ultrasonic sensor 104, an output mechanism 108, a gyroscope 102, a camera 103, an image processing unit 105, an embedded system 106, and a power supply 107.
[0018] According to an embodiment, the laser source 101 is attached to the device and provided for generating a laser beam, wherein the generated laser beam is focused towards the walking path of the person. In an embodiment, the ultrasonic sensor 104 is provided near the laser source 101 for measuring a height of the laser source 101 from the ground. In an embodiment, the output mechanism 108 is capable of providing an alert to the visually impaired person about any obstacle 109 in a walking direction.
[0019] According to another embodiment, the gyroscope 102 is attached to the laser source
101 for maintaining an angle between the laser source 101 and the ultrasonic sensor
104. In an embodiment, the camera 103 is provided for capturing an image of a refracted
laser beam from the obstacle 109. The image processing unit 105 is provided for processing
the image captured by the camera 103.
[0020] According to a further embodiment, the embedded system 106 is configured to calculate
a height or depth of the obstacle 109 based on the image of the refracted laser beam,
the height of the laser source 101 and the angle between a laser beam path and the
ultrasonic sensor 104. The power supply 107 is provided for supplying power to the
device. In an embodiment, the output mechanism 108 may be a speaker-like device, a vibrator, or a combination of both, to provide an alert to the visually impaired person.
[0021] Fig. 2 illustrates the graphical representation 200 of the function of the device for a visually impaired person without an obstacle, according to the present invention.
[0022] According to an exemplary embodiment, the function of the device starts with the transmission of a laser beam from the laser source 101. When the laser beam touches the ground 202, the laser beam may get refracted due to a disturbance in the straight path 204. The camera 103 is configured to capture the image 207 of the refracted laser beam 206. The camera 103 is maintained with a fixed focal length and a fixed screen resolution; that is, the view angle 203 of the camera 103 is kept at a constant value. The angle 'α' 201 between the laser beam path 204 and the ultrasonic sensor 104 is maintained constant by keeping the laser source 101 over the gyroscope 102. The gyroscope 102 enables the laser source 101 to be moved so as to keep the angle 'α' 201 constant. Using trigonometry, the distance 204a of the laser beam path 204 from the source to the ground 202 can be calculated. The distance 204a of the laser path 204 can be calculated from the angle 'α' 201 between the laser beam path 204 and the ultrasonic sensor 104 and the height 'h' 205 of the laser source 101 from the ground 202.
[0023] That is,

d = h / cos(α)

wherein d is the distance 204a of the laser beam path 204 from the source to the ground 202.
[0024] From the above equation, a vertical scaling factor 'V' can be calculated. That is, the vertical scaling factor is obtained from the height 205 of the laser source 101 from the ground 202 and the angle 201 between the laser beam path 204 and the ultrasonic sensor 104.
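A minimal illustrative sketch in Python of the geometry described above (not part of the claimed subject-matter): it assumes the angle α is measured between the laser beam path 204 and the vertical axis of the ultrasonic sensor 104, and the d/h (i.e. 1/cos(α)) form of the vertical scaling factor is a hypothetical choice, since the specification only states that V is derived from the height 205 and the angle 201.

```python
import math

def laser_path_distance(h: float, alpha_deg: float) -> float:
    """Length d (204a) of the laser beam path from the source to the ground,
    assuming alpha is the angle between the laser beam path (204) and the
    vertical axis of the ultrasonic sensor (104): d = h / cos(alpha)."""
    return h / math.cos(math.radians(alpha_deg))

def vertical_scaling_factor(h: float, alpha_deg: float) -> float:
    """Hypothetical vertical scaling factor V. The specification states only
    that V is obtained from the height h (205) and the angle alpha (201);
    the ratio d / h (= 1 / cos(alpha)) used here is an assumption made for
    illustration only."""
    return laser_path_distance(h, alpha_deg) / h

# Example: laser source 0.9 m above the ground, beam tilted 60 degrees
# away from the vertical sensor axis.
d = laser_path_distance(0.9, 60.0)       # approx. 1.8 m along the beam
V = vertical_scaling_factor(0.9, 60.0)   # approx. 2.0 (illustrative only)
```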
[0025] According to an exemplary embodiment, the image 207 of the refracted laser beam 206 that is captured by the camera 103 is processed to find the height or depth of the obstacle 109. The height or depth of the obstacle 109 is calculated by comparing the laser beam refraction for a reference object with that for the obstacle 109. The reference object may be an object with a known height or depth. The image 207 of the refracted laser beam 206 includes a plurality of horizontal pixel lines (HPL). The number of HPL between the laser line on the ground 202 and the laser line on the reference object is measured using the image processing unit 105. Further, the number of HPL between the laser line on the ground 202 and the laser line on the reference object is transferred to the embedded system 106. From this, the height of one HPL from the ground 202 for the reference object is calculated based on the number of HPL and the height of the reference object.
That is,

Hrhpl = Hr / Nrhpl

[0026] where Nrhpl is the number of HPL, Hr is the height or depth of the reference object, and Hrhpl is the height of one HPL from the ground 202. The Hrhpl value is stored in the embedded system 106 for reference.
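The following Python sketch merely illustrates the HPL counting and the calibration step described above; the way the laser line rows are located in the image (the brightest row of the red channel) is an assumption made for this sketch, as the specification does not prescribe a particular line-detection method, and the function and variable names are hypothetical.

```python
import numpy as np

def laser_line_row(image: np.ndarray) -> int:
    """Index of the horizontal pixel line (image row) with the strongest
    laser response; using the brightest row of the red channel is an
    assumption for illustration only."""
    red = image[:, :, 0].astype(np.float64)
    return int(np.argmax(red.sum(axis=1)))

def hpl_count(ground_row: int, object_row: int) -> int:
    """Number of horizontal pixel lines (HPL) between the laser line on the
    ground (202) and the laser line on the reference object or obstacle."""
    return abs(object_row - ground_row)

def calibrate_hpl_height(n_rhpl: int, h_r: float) -> float:
    """Height of one HPL for a reference object of known height Hr:
    Hrhpl = Hr / Nrhpl, stored afterwards in the embedded system (106)."""
    return h_r / n_rhpl

# Example calibration: a hypothetical 0.10 m reference object displaces the
# laser line by 25 pixel lines in the captured image (207).
H_rhpl = calibrate_hpl_height(25, 0.10)   # 0.004 m of height per pixel line
```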
[0027] Fig. 3 illustrates the graphical representation 300 of the function of the device for a visually impaired person with an obstacle, according to the present invention.
[0028] According to an exemplary embodiment, the image 207 of the refracted laser beam 206
is processed for the obstacle 109. The number of HPL between the laser line on the ground 202 and the laser line on the obstacle 109 is measured using the image processing unit 105. The number of HPL between the laser line on the ground 202 and the laser line on the obstacle 109 is transferred to the embedded system 106. The embedded system 106 can be further configured to calculate the height or depth of the obstacle 109 based on the height of one HPL for the reference object, the number of HPL between the laser line on the ground 202 and the laser line on the obstacle 109, and the vertical scaling factor.
[0029] According to a preferred embodiment, the height or depth of the obstacle 109 is calculated using the formula,

ho = Hrhpl × Nohpl × V

wherein
ho is the height or depth of the obstacle 109;
Hrhpl is the height of one HPL for the reference object;
Nohpl is the number of HPL between the laser line on the ground 202 and the laser line on the obstacle 109; and
V is the vertical scaling factor.
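A short Python sketch of the height calculation for the obstacle; since the exact formula is not reproduced in this text, the simple product of Hrhpl, Nohpl and V used below is an assumed functional form that is merely consistent with the stated dependencies.

```python
def obstacle_height(h_rhpl: float, n_ohpl: int, v: float) -> float:
    """Estimated height or depth of the obstacle (109) from the height of one
    HPL for the reference object, the HPL count between the laser line on the
    ground (202) and the laser line on the obstacle (109), and the vertical
    scaling factor V. The product form is an assumption for illustration."""
    return h_rhpl * n_ohpl * v

# Example with the hypothetical values from the previous sketches:
# 0.004 m per pixel line, 30 pixel lines of displacement, V approx. 2.0.
h_o = obstacle_height(0.004, 30, 2.0)   # 0.24 m (illustrative only)
```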
[0030] According to an exemplary embodiment, the embedded system 106 is further configured to measure a distance of the obstacle 109 in the walking direction from the device based on the height 'h' 205 of the laser source 101 from the ground 202 and the angle 'α' 201 between the laser beam path 204 and the ultrasonic sensor 104. The obstacle 109 includes a pit in a road, a bump on the road, and any object on the ground 202. The direction of the laser beam refraction provides information about whether the obstacle 109 is a pit or a bump, since for a bump and a pit the directions of the refracted laser beam 206 are opposite to each other. In case the obstacle 109 is a pit, the embedded system 106 can be further configured to calculate a depth of the obstacle 109. The depth of the obstacle 109 is determined based on the image 207 of the refracted laser beam 206 from the obstacle 109, the height 205 of the laser source 101 and the angle 201 between the laser beam path 204 and the ultrasonic sensor 104. The method of calculating the depth of the obstacle 109 is similar to the calculation of the height of the obstacle 109. Further, the device can be worn on the body of the person, for example on a belt. Similarly, the device can also be incorporated into accessories such as a stick, a guide, an umbrella, and so on.
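The distance measurement and the pit/bump distinction described above can be illustrated by the following Python sketch; the tan(α) relation for the horizontal distance and the sign convention for the displacement of the laser line in the image are assumptions, since the specification only states that the distance follows from the height 205 and the angle 201 and that the refraction directions for a pit and a bump are opposite.

```python
import math

def obstacle_distance(h: float, alpha_deg: float) -> float:
    """Horizontal distance of the obstacle (109) from the device, assuming the
    laser spot marks the obstacle and alpha (201) is measured from the vertical
    sensor axis: distance = h * tan(alpha). This form is an assumption."""
    return h * math.tan(math.radians(alpha_deg))

def classify_obstacle(ground_row: int, object_row: int) -> str:
    """Classify the obstacle from the direction in which the laser line is
    displaced in the image (207). The convention that an upward shift of the
    laser line (smaller row index) indicates a bump and a downward shift
    indicates a pit is hypothetical; the specification states only that the
    two directions are opposite."""
    return "bump" if object_row < ground_row else "pit"

# Example: device 0.9 m above the ground, beam at 60 degrees from vertical.
print(obstacle_distance(0.9, 60.0))                       # approx. 1.56 m
print(classify_obstacle(ground_row=400, object_row=370))  # "bump"
```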
[0031] Thus, the present invention refers to a smart guide device for a visually impaired person, wherein the device includes a laser source 101, an ultrasonic sensor 104 provided near the laser source 101, an output mechanism 108 for providing an alert to the visually impaired person about any obstacle 109, a gyroscope 102 attached to the laser source 101, a camera 103 for capturing an image 207 of a refracted laser beam 206 from the obstacle 109, an image processing unit 105 for processing the image captured by the camera 103, and an embedded system 106 configured to calculate a height or depth of the obstacle 109 based on the image 207 of the refracted laser beam, the height 205 of the laser source 101 and the angle 201 between a laser beam path 204 and the ultrasonic sensor 104.
List of reference numbers
[0032]
- 100
- block diagram for the device
- 101
- laser source
- 102
- gyroscope
- 103
- camera
- 104
- ultrasonic sensor
- 105
- image processing unit
- 106
- embedded system
- 107
- power supply
- 108
- output mechanism
- 109
- obstacle
- 200
- graphical representation of function of the device without obstacle
- 201
- the angle between a laser beam path and the ultrasonic sensor
- 202
- ground line
- 203
- view angle of a camera
- 204
- laser beam path
- 204a
- distance of the laser beam path from source to ground (d)
- 205
- height of the laser source from the ground (h)
- 206
- refracted laser beam
- 207
- image of the refracted laser beam
- 300
- graphical representation of function of the device with the obstacle
1. A smart guide device for a visually impaired person, comprising:
a laser source (101) attached to the device for generating a laser beam;
an ultrasonic sensor (104) provided near the laser source (101) for measuring a height
(205) of the laser source (101) from the ground (202);
an output mechanism (108) for providing an alert to the visually impaired person about
any obstacle (109) in a walking direction;
characterized in that the device further comprises:
a gyroscope (102) attached to the laser source (101) for maintaining an angle between
the laser source (101) and the ultrasonic sensor (104);
a camera (103) for capturing an image (207) of a refracted laser beam (206) from the
obstacle (109);
an image processing unit (105) for processing the image captured by the camera (103);
an embedded system (106) configured to calculate a height or depth of the obstacle
(109) based on the image (207) of the refracted laser beam, the height (205) of the
laser source (101) and the angle (201) between a laser beam path (204) and the ultrasonic
sensor (104).
2. The smart guide device of claim 1, wherein the camera (103) is maintained with a fixed
focal length and a fixed screen resolution.
3. The smart guide device of claim 1, wherein the image (207) of the refracted laser
beam (206) is captured for a reference object or the obstacle (109).
4. The smart guide device of claim 3, wherein the image (207) of the refracted laser
beam (206) includes a plurality of horizontal pixel lines (HPL); and wherein the image
processing unit (105) is configured to measure the number of HPL between the laser line on the ground (202) and the laser line on the reference object or the obstacle (109), and to transfer the number of HPL to the embedded system (106).
5. The smart guide device of claim 4, wherein the image processing unit (105) is further configured to calculate the height of one HPL using the ratio between the height or depth of the reference object and the number of HPL measured for the reference object; wherein the HPL height is stored in the embedded system (106).
6. The smart guide device of claim 5, wherein a vertical scaling factor is obtained from
the height (205) of the laser source (101) from the ground (202) and the angle (201) between
the laser beam path (204) and the ultrasonic sensor (104).
7. The smart guide device of claim 6, wherein the embedded system (106) is further configured to calculate the height or depth of the obstacle (109) based on the height of one HPL for the reference object, the number of HPL between the laser line on the ground (202) and the laser line on the obstacle (109), and the vertical scaling factor.
8. The smart guide device of claim 7, wherein the embedded system (106) is further configured to calculate the height or depth of the obstacle (109) using the formula,

ho = Hrhpl × Nohpl × V

wherein
ho is the height or depth of the obstacle (109);
Hrhpl is the height of one HPL for the reference object;
Nohpl is the number of HPL between the laser line on the ground (202) and the laser line on the obstacle (109); and
V is the vertical scaling factor.
9. The smart guide device of claim 1, wherein the embedded system (106) is further configured to measure a distance of the obstacle (109) in the walking direction from the device based on the height (205) of the laser source (101) from the ground (202) and the angle (201) between the laser beam path (204) and the ultrasonic sensor (104).
10. The smart guide device of claim 1, wherein the obstacle (109) includes a pit in a road, a bump on the road, and any object on the ground (202).
11. The smart guide device of claim 1, wherein the direction of the laser beam refraction provides information about whether the obstacle (109) is a pit or a bump.
12. The smart guide device of claim 1, wherein the output mechanism (108) is a speaker-like device, a vibrator, or a combination of both, to provide an alert to the visually impaired person.
13. The smart guide device of claim 1, wherein the device is wearable on the body of the person.
14. The smart guide device of claim 1, wherein the device is incorporated into accessories including a stick, a guide, an umbrella, and a shoe.