(19) European Patent Office
(11) EP 3 489 919 A1

(12) EUROPEAN PATENT APPLICATION

(43) Date of publication:
29.05.2019 Bulletin 2019/22

(21) Application number: 17203574.3

(22) Date of filing: 24.11.2017
(51) International Patent Classification (IPC):
G08B 5/36 (2006.01)
G08B 25/00 (2006.01)
G08B 7/06 (2006.01)
G08B 6/00 (2006.01)
G08B 3/10 (2006.01)
G08B 29/18 (2006.01)
(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
Designated Extension States:
BA ME
Designated Validation States:
MA MD

(71) Applicant: Vestel Elektronik Sanayi ve Ticaret A.S.
45030 Manisa (TR)

(72) Inventors:
  • ÖZTEKIN, Güner
    45030 Manisa (TR)
  • YILMAZ, Evren Gökhan
    45030 Manisa (TR)

(74) Representative: Flint, Adam 
Page White & Farrer
Bedford House, John Street
London WC1N 2BF (GB)

   


(54) APPARATUS, METHOD AND COMPUTER PROGRAM FOR INDICATING AN EMERGENCY SITUATION


(57) An apparatus (2) has a sound detection unit (20) configured to detect a sound signal in an environment of a user. The apparatus (2) has a processing unit (30) configured to determine whether the detected sound signal is indicative of an emergency situation in the environment of the user. The apparatus (2) also has a haptic output unit (22) configured to output a haptic signal when the detected sound signal is indicative of an emergency situation in the environment of the user.




Description

Technical Field



[0001] The present disclosure relates to an apparatus, a method and a computer program for indicating an emergency situation.

Background



[0002] In recent years, various solutions have been proposed for assisting people with hearing impairment in their everyday life. For example, headbands, glasses, helmets or other wearable apparatuses have been equipped with microphones and displays so that visual information can be output when audio information in the environment of a user is indicative of an emergency situation.

Summary



[0003] According to a first aspect disclosed herein, there is provided an apparatus for indicating an emergency situation, the apparatus comprising:

a sound detection unit configured to detect a sound signal in an environment of a user;

a processing unit configured to determine whether the detected sound signal is indicative of an emergency situation in the environment of the user; and

a haptic output unit configured to output a haptic signal when the detected sound signal is indicative of an emergency situation in the environment of the user.



[0004] The haptic signal may comprise a vibration signal.

[0005] The detected sound signal may comprise a name (e.g. name of the user or specified names), a siren (e.g. an ambulance siren, a police siren, a fire brigade siren), an alarm (e.g. a fire alarm, an earthquake alarm, a hurricane alarm, a flood alarm), a car horn, a voice (e.g. a human voice as opposed to a machine generated voice), a word in a specific language such as "look out", "help", "beware", or other.

[0006] The apparatus may comprise:

a visual output unit configured to output a visual signal when the detected sound signal is indicative of an emergency situation in the environment of the user.



[0007] The visual signal may comprise a light signal, for example a flashing light signal.

[0008] The visual output unit may be configurable to output a light signal when the detected sound signal is indicative of an emergency situation in the environment of the user only when the light level in the environment of the user is below a threshold.
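
By way of illustration only (this sketch is not part of the application as filed), such light-level gating could take the following form in Python; the threshold value and the function names are assumptions introduced here for clarity:

```python
AMBIENT_LIGHT_THRESHOLD_LUX = 50.0  # assumed threshold; could be user-configurable

def maybe_output_light(emergency_detected: bool, ambient_light_lux: float) -> bool:
    """Output the light signal only when an emergency is detected AND the
    environment is dark enough for a light signal to be noticeable."""
    if emergency_detected and ambient_light_lux < AMBIENT_LIGHT_THRESHOLD_LUX:
        # here the LED / flashing light would be driven (hardware-specific)
        return True
    return False
```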

[0009] The processing unit may be configured to:

determine whether the detected sound signal is associated with a predetermined sound signal stored on the apparatus; and

if the detected sound signal is associated with a predetermined sound signal stored on the apparatus, determine that the detected sound signal is indicative of an emergency situation in the environment of the user.



[0010] The predetermined sound signal stored on the apparatus may be programmed by the manufacturer during manufacture or by the user after purchase.

[0011] The user may program the apparatus to download a sound signal from another apparatus or to record a sound signal from the environment of the user (e.g. a sound signal generated by the user's pet, such as the user's dog barking).

[0012] The apparatus may comprise:

a communication unit configured to communicate with another apparatus via a communication link.



[0013] The other apparatus may be a smart phone, a desktop computer, a laptop computer, a tablet computer, a television, a server or other.

[0014] The communication link may be a wireless link.

[0015] The communication link may be a Bluetooth link.

[0016] The communication link may be an Internet link, for example a Wi-Fi link.

[0017] The processing unit may be configured to: if the detected sound signal is not associated with a predetermined sound signal stored on the apparatus, communicate with another apparatus to query whether the detected sound signal is associated with a predetermined sound signal stored on the other apparatus and to receive a response to the query; and
if the detected sound signal is associated with a predetermined sound signal stored on the other apparatus, determine that the detected sound signal is indicative of an emergency situation in the environment of the user.

[0018] The processing unit may be configured to:
if the detected sound signal is indicative of an emergency situation in the environment of the user, instruct another apparatus to display an indication of an emergency situation in the environment of the user.

[0019] The indication may comprise a text and/or an image.

[0020] The processing unit may be configured to:

if the detected sound signal is indicative of an emergency situation in the environment of the user, instruct another apparatus to display information related to the emergency situation in the environment of the user.



[0021] The information may be retrieved by the other apparatus from a network. The information may comprise podcasts, articles, photographs, videos, etc.

[0022] The apparatus may be portable and preferably wearable.

[0023] The apparatus may be a watch.

[0024] The apparatus may be waterproof.

[0025] According to a second aspect disclosed herein, there is provided a method comprising:

detecting a sound signal in an environment of a user;

determining whether the detected sound signal is indicative of an emergency situation in the environment of the user; and

outputting a haptic signal when the detected sound signal is indicative of an emergency situation in the environment of the user.



[0026] The method may comprise:

determining whether the detected sound signal is associated with a predetermined sound signal stored on the apparatus; and

if the detected sound signal is associated with a predetermined sound signal stored on the apparatus, determining that the detected sound signal is indicative of an emergency situation in the environment of the user.



[0027] The method may comprise:

if the detected sound signal is not associated with a predetermined sound signal stored on the apparatus, instructing another apparatus to determine whether the detected sound signal is associated with a predetermined sound signal stored on the other apparatus; and

if the detected sound signal is associated with a predetermined sound signal stored on the other apparatus, determining that the detected sound signal is indicative of an emergency situation in the environment of the user.



[0028] The method may comprise:

if the detected sound signal is indicative of an emergency situation in the environment of the user, instructing another apparatus to display an indication of an emergency situation in the environment of the user.



[0029] According to a third aspect disclosed herein, there is provided a computer program for an apparatus, comprising software code portions for performing the above method when said computer program is run on the apparatus.

Brief Description of the Drawings



[0030] For assisting understanding of the present disclosure and to show how embodiments may be put into effect, reference is made by way of example to the accompanying drawings in which:

Figure 1 shows schematically an example of a system according to an embodiment described herein;

Figure 2 shows schematically an example of an apparatus which is part of the system of Figure 1 according to an embodiment described herein; and

Figure 3 shows schematically an example of a flow diagram of a method of operating the apparatus of Figure 2 according to an embodiment described herein.


Detailed Description



[0031] Existing apparatus for assisting people with hearing impairment rely on visual information to indicate an emergency situation to a user. Accordingly, they may be ineffective in circumstances where visual information cannot be perceived by the user. For example, the user may have a visual impairment. The user may be asleep. The user may not be able to wear the apparatus because s/he is having a shower, a swim or practising another activity that is incompatible with wearing the apparatus.

[0032] Figure 1 shows schematically an example of a system according to an embodiment. The system comprises a watch 2, a television set 4, a tablet computer 6, a laptop computer 8, a mobile phone 10 and a server 12 connected to a network 14. The network 14 may be a local area network and/or a wide area network (e.g. the Internet). In some examples, not all of the television set 4, tablet computer 6, laptop computer 8, mobile phone 10 and server 12 are required or used.

[0033] The watch 2 is indirectly connected to the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 and the server 12 via one or more routers 16. The watch 2 may communicate with the routers 16 over a medium-range wireless communication link (e.g. a Wi-Fi link).

[0034] The watch 2 is directly connected to the television 4, the tablet computer 6, the laptop computer 8 and the mobile phone 10. The watch 2 may communicate with these devices over a short-range wireless communication link (e.g. a Bluetooth link).

[0035] Figure 2 shows schematically an example of the watch 2 shown in Figure 1 according to an embodiment.

[0036] The watch 2 comprises a sound detection unit 20 for detecting a sound signal in the environment of a user. The sound detection unit 20 may comprise a microphone.

[0037] The watch 2 comprises a haptic output unit 22 for outputting a haptic signal. The haptic signal may comprise a vibration signal, a heat signal, a braille signal, an electric shock signal or other. The haptic output unit 22 may comprise a vibration motor, a heating element, a braille display, an electrode or other.

[0038] The watch 2 comprises a visual output unit 24 for outputting a visual signal. The visual signal may comprise a text, an image, a light (the colour and/or flashing frequency of which may be adjustable for example) or other. The visual output unit 24 may comprise a display, a light emitting diode or other.

[0039] The watch 2 comprises a communication unit 26 for communicating with the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 and the server 12. The communication unit 26 may comprise a Bluetooth communication unit, a Wi-Fi communication unit or other.

[0040] The watch 2 comprises a power unit 28, a processing unit 30 and a memory unit 32. The memory unit 32 contains instructions which, when executed by the processing unit 30, allow the watch 2 to perform the method of Figure 3 (described in further detail below). The memory unit 32 further comprises a sound database storing predetermined (i.e. predefined) sound signals.

[0041] The detected sound signal may comprise a name (e.g. name of the user and/or specified names of one or more other people), a siren (e.g. an ambulance siren, a police siren, a fire brigade siren), an alarm (e.g. a fire alarm, an earthquake alarm, a hurricane alarm, a flood alarm), a car horn, a voice (e.g. a human voice as opposed to a machine generated voice), one or more words or phrases in a specific language such as "look out", "help", "beware", or other.

[0042] The sound database may be populated by the manufacturer during manufacture and/or by the user after purchase. For example, the user may program the watch 2 to download and store a predetermined sound signal stored on the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12. The user may also program the watch 2 to capture a sound (e.g. specific words or phrases, including for example specific names, the sound of a user's dog barking, etc.) and store the captured sound in the sound database. In this way, the sound database can be customized by the user.
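
Purely as an illustrative sketch (the file name and fingerprint format are assumptions introduced here, not part of the application), the sound database of paragraph [0042] could be held as a simple label-to-fingerprint store that the user can extend:

```python
import json
from pathlib import Path

DB_PATH = Path("sound_db.json")  # hypothetical on-device store

def load_sound_db() -> dict:
    """Load the database of predetermined sound fingerprints."""
    return json.loads(DB_PATH.read_text()) if DB_PATH.exists() else {}

def register_sound(label: str, fingerprint: list[float]) -> None:
    """Add a factory-programmed, downloaded or user-recorded sound
    (e.g. the user's dog barking) under a human-readable label."""
    db = load_sound_db()
    db[label] = fingerprint
    DB_PATH.write_text(json.dumps(db))
```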

[0043] The watch 2 comprises a housing. The housing may be waterproof.

[0044] Figure 3 shows schematically an example of a flow diagram of a method of operating the watch 2 according to an embodiment.

[0045] In step 302, the watch 2 is activated (i.e. the various units are powered on).

[0046] In step 304, the sound detection unit 20 determines whether a sound signal is detected. If a sound signal is detected the method goes to step 306. If no sound signal is detected the method loops back to step 304.

[0047] In step 306, the processing unit 30 processes the detected sound signal to perform sound recognition. The processing may include analogue to digital conversion, time to frequency conversion, filtering, speech detection or other.
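
As a minimal sketch of the time-to-frequency conversion mentioned in paragraph [0047] (the frame length, windowing and normalisation choices are assumptions, not a prescribed method), a coarse spectral fingerprint of the digitised signal could be computed as follows:

```python
import numpy as np

def extract_fingerprint(samples: np.ndarray, frame_len: int = 1024) -> np.ndarray:
    """Convert a digitised sound signal into a coarse spectral fingerprint:
    split into frames, window, FFT, then average the magnitude spectra."""
    assert len(samples) >= frame_len, "need at least one full frame"
    n_frames = len(samples) // frame_len
    frames = samples[: n_frames * frame_len].reshape(n_frames, frame_len)
    window = np.hanning(frame_len)
    spectra = np.abs(np.fft.rfft(frames * window, axis=1))
    fingerprint = spectra.mean(axis=0)
    return fingerprint / (np.linalg.norm(fingerprint) + 1e-12)  # normalise
```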

[0048] The processing unit 30 determines whether the detected sound signal is associated with (i.e. matches) a predetermined sound signal in the sound database in the memory unit 32. If the detected sound signal is associated with a predetermined sound signal in the sound database in the memory unit 32, the method goes to step 308. If the detected sound signal is not associated with a predetermined sound signal in the sound database, the method goes to step 318.
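
One way this matching step could be realised (the cosine-similarity measure and the threshold value are illustrative assumptions) is:

```python
import numpy as np

MATCH_THRESHOLD = 0.9  # assumed similarity threshold

def best_match(fingerprint: np.ndarray, db: dict[str, list[float]]) -> str | None:
    """Return the label of the closest predetermined sound, or None if no
    stored sound is similar enough (which routes the method to step 318)."""
    best_label, best_score = None, 0.0
    for label, stored in db.items():
        stored = np.asarray(stored)
        score = float(np.dot(fingerprint, stored) /
                      (np.linalg.norm(fingerprint) * np.linalg.norm(stored) + 1e-12))
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= MATCH_THRESHOLD else None
```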

[0049] In step 308, the processing unit 30 identifies an emergency situation. For example, the processing unit 30 may determine that a vehicle is approaching when the detected sound signal matches an ambulance siren or a car horn. The processing unit 30 may determine that a fire is ongoing when the detected sound signal matches a fire alarm or other.

[0050] In step 310, the haptic output unit 22 outputs a haptic signal and/or the visual output unit 24 outputs a visual signal. The haptic signal and/or the visual signal may be based on the emergency situation. For example, the intensity or the frequency of the haptic signal and/or the visual signal may vary based on the type of emergency situation or the degree of emergency of the emergency situation.
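
A minimal sketch of how the haptic signal could vary with the identified emergency (the pattern values and the drive_motor callable are hypothetical, standing in for a hardware-specific motor driver):

```python
import time

# on/off durations in seconds, keyed by emergency type (illustrative values)
VIBRATION_PATTERNS = {
    "fire_alarm": [(0.2, 0.1)] * 6,  # rapid pulses: high urgency
    "car_horn":   [(0.5, 0.3)] * 3,  # medium urgency
    "name_call":  [(1.0, 0.5)] * 2,  # long, gentle pulses
}

def output_haptic(emergency_type: str, drive_motor) -> None:
    """Play the vibration pattern associated with the identified emergency."""
    pattern = VIBRATION_PATTERNS.get(emergency_type, [(0.5, 0.5)] * 3)
    for on_s, off_s in pattern:
        drive_motor(True)
        time.sleep(on_s)
        drive_motor(False)
        time.sleep(off_s)
```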

[0051] In step 312, the communication unit 26 connects to the television 4, the tablet computer 6, the laptop computer 8 or the mobile phone 10.

[0052] In step 314, the communication unit 26 instructs the television 4, the tablet computer 6, the laptop computer 8 or the mobile phone 10 to remotely display an indication (i.e. a notification) of the identified emergency situation. Alternatively, the visual output unit 24 locally displays an indication of the identified emergency situation.

[0053] The indication may comprise a text (e.g. "Warning: ongoing fire!", "Warning: ongoing hurricane!", "Warning: ongoing flood!", "Warning: ongoing earthquake!", "Warning: ongoing tsunami!", "Warning: ongoing thunder!" or other), and/or an image (e.g. a flame icon, a hurricane icon, a flood icon, an earthquake icon, a tsunami icon, a thunder icon, or other).

[0054] In an implementation, the watch 2, the television 4, the tablet computer 6, the laptop computer 8 or the mobile phone 10 checks whether there is an actual emergency situation before displaying the indication. Checking can be done by connecting to the network 14 and analysing one or more of current or breaking news items, articles, videos, podcasts, images trending thereon, etc.
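
As a heavily simplified sketch of such a check (fetch_headlines is a hypothetical callable standing in for whatever news source the checking apparatus queries over the network 14; no specific service is implied):

```python
def confirm_emergency(keyword: str, fetch_headlines) -> bool:
    """Cross-check the identified emergency against current news items
    before displaying the indication; returns True if any recent headline
    mentions the emergency keyword (e.g. "fire", "flood")."""
    try:
        headlines = fetch_headlines()
    except OSError:
        return True  # if the check is unavailable, err on the side of warning
    return any(keyword.lower() in h.lower() for h in headlines)
```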

[0055] In step 316, the communication unit 26 instructs the television 4, the tablet computer 6, the laptop computer 8 or the mobile phone 10 to display further information related to the identified emergency situation. Alternatively, the visual output unit 24 locally displays further information related to the identified emergency situation.

[0056] The information may comprise a text (e.g. fire brigade phone number, flood related newspaper article, hurricane instructions, transcription of a podcast related to an ongoing earthquake) and/or an image (e.g. map of the building showing the location of closest exits, safe rooms or fire extinguishers).

[0057] The information may be stored locally on the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or remotely on the server 12.

[0058] The method then loops back to step 304.

[0059] In step 318 (i.e. for the case that the detected sound signal is not associated with a predetermined sound signal in the sound database of the watch 2), the communication unit 26 connects to some other apparatus, such as for example one or more of the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12. It will be understood that the communication unit 26 may connect to the same or a different apparatus to the apparatus it connects with in step 312.

[0060] In step 320, the communication unit 26 transmits the detected sound signal to the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12 to remotely determine whether the detected sound signal is associated with (e.g. matches) a predetermined sound signal on the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12. The communication unit 26 then receives a result of the determination.

[0061] Alternatively, the communication unit 26 receives a predetermined sound signal from the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12. The watch 2 can then locally determine whether the detected sound signal is associated with (e.g. matches) a predetermined sound signal stored on the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12.
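
A minimal sketch of the remote-matching exchange of steps 318-320 (the host address, the /match endpoint and the JSON format are illustrative assumptions, not a protocol defined by the application):

```python
import requests  # assumes an HTTP-reachable companion apparatus

def remote_match(fingerprint: list[float],
                 host: str = "http://192.168.1.10:8080") -> str | None:
    """Send the detected sound's fingerprint to another apparatus and
    return the matching label it reports, or None if nothing matched."""
    resp = requests.post(f"{host}/match",
                         json={"fingerprint": fingerprint}, timeout=5)
    resp.raise_for_status()
    # e.g. response body {"label": "fire_alarm"} or {"label": null}
    return resp.json().get("label")
```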

[0062] If the detected sound signal is associated with (e.g. matches) a predetermined sound signal stored on the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12, the method goes to step 308. If the detected sound signal is not associated with (e.g. does not match) a predetermined sound signal stored on the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12, the method goes to step 322.

[0063] In step 322, the processing unit 30 determines that there is no emergency situation and the method loops back to step 304.

[0064] An advantage of the watch 2 over existing apparatus for assisting people with hearing impairment is that it does not only rely on visual information to indicate an emergency situation to a user. Accordingly, the user can be notified of an emergency situation even where visual information cannot be perceived by the user.

[0065] Moreover, it does not only rely on sound signals stored on the watch 2 to identify an emergency situation. Accordingly, the identification may be more accurate.

[0066] It will be understood that although the watch 2 has been described as an example of an embodiment, other embodiments may encompass other wearable or non-wearable apparatus.

[0067] The watch 2 as shown in Figure 2 is represented as a schematic block diagram for the purposes of explaining the functionality of the watch 2 only. Hence, it is understood that each unit of the watch is a functional block for performing the functionality ascribed to it herein. Each unit may be implemented in hardware, software, firmware, or a combination thereof.

[0068] It will be understood that the processing unit referred to herein may in practice be provided by a single chip or integrated circuit or plural chips or integrated circuits, optionally provided as a chipset, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a digital signal processor (DSP), a graphics processing unit (GPU), etc. The chip or chips may comprise circuitry (as well as possibly firmware) for embodying at least one or more of a data processor or processors, a digital signal processor or processors, baseband circuitry and radio frequency circuitry, which are configurable so as to operate in accordance with the exemplary embodiments. In this regard, the exemplary embodiments may be implemented at least in part by computer software stored in (non-transitory) memory and executable by the processor, or by hardware, or by a combination of tangibly stored software and hardware (and tangibly stored firmware).

[0069] Reference is made herein to a memory unit. This may be provided by a single device or by plural devices. Suitable devices include for example a hard disk and nonvolatile semiconductor memory.

[0070] Although at least some aspects of the embodiments described herein with reference to the drawings comprise computer processes performed in processing systems or processors, the invention also extends to computer programs, particularly computer programs on or in a carrier, adapted for putting the invention into practice. The program may be in the form of non-transitory source code, object code, a code intermediate source and object code such as in partially compiled form, or in any other non-transitory form suitable for use in the implementation of processes according to the invention. The carrier may be any entity or device capable of carrying the program. For example, the carrier may comprise a storage medium, such as a solid-state drive (SSD) or other semiconductor-based RAM; a ROM, for example a CD ROM or a semiconductor ROM; a magnetic recording medium, for example a floppy disk or hard disk; optical memory devices in general; etc.

[0071] The examples described herein are to be understood as illustrative examples of embodiments of the invention. Further embodiments and examples are envisaged. Any feature described in relation to any one example or embodiment may be used alone or in combination with other features. In addition, any feature described in relation to any one example or embodiment may also be used in combination with one or more features of any other of the examples or embodiments, or any combination of any other of the examples or embodiments. Furthermore, equivalents and modifications not described herein may also be employed within the scope of the invention, which is defined in the claims.

[0072] In other aspects, there may also be provided an apparatus comprising:

a communication unit configured to communicate with another apparatus via a communication link;

a sound detection unit configured to detect a sound signal in an environment of a user;

a processing unit configured to:

if the detected sound signal is not associated with a predetermined sound signal on the apparatus, instruct the other apparatus to determine whether the detected sound signal is associated with a predetermined sound signal on the other apparatus; and

if the detected sound signal is associated with a predetermined sound signal on the other apparatus, determine that the detected sound signal is indicative of an emergency situation in the environment of the user.




Claims

1. An apparatus for indicating an emergency situation, the apparatus comprising:

a sound detection unit configured to detect a sound signal in an environment of a user;

a processing unit configured to determine whether the detected sound signal is indicative of an emergency situation in the environment of the user; and

a haptic output unit configured to output a haptic signal when the detected sound signal is indicative of an emergency situation in the environment of the user.


 
2. An apparatus according to claim 1, comprising:

a visual output unit configured to output a visual signal when the detected sound signal is indicative of an emergency situation in the environment of the user.


 
3. An apparatus according to claim 2, wherein the visual output unit is configurable to output a light signal when the detected sound signal is indicative of an emergency situation in the environment of the user only when the light level in the environment of the user is below a threshold.
 
4. An apparatus according to any of claims 1 to 3, wherein the processing unit is configured to:

determine whether the detected sound signal is associated with a predetermined sound signal stored on the apparatus; and

if the detected sound signal is associated with a predetermined sound signal stored on the apparatus, determine that the detected sound signal is indicative of an emergency situation in the environment of the user.


 
5. An apparatus according to claim 4, wherein the predetermined sound signal on the apparatus is stored by the manufacturer or by the user.
 
6. An apparatus according to any of claims 1 to 5, wherein the processing unit is configured to:

if the detected sound signal is not associated with a predetermined sound signal stored on the apparatus, communicate with another apparatus to query whether the detected sound signal is associated with a predetermined sound signal stored on the other apparatus and to receive a response to the query; and

if the detected sound signal is associated with a predetermined sound signal stored on the other apparatus, determine that the detected sound signal is indicative of an emergency situation in the environment of the user.


 
7. An apparatus according to any of claims 1 to 6, wherein the processing unit is configured to:

if the detected sound signal is indicative of an emergency situation in the environment of the user, instruct another apparatus to display an indication of an emergency situation in the environment of the user.


 
8. An apparatus according to any of claims 1 to 7, wherein the processing unit is configured to:

if the detected sound signal is indicative of an emergency situation in the environment of the user, instruct another apparatus to display information related to the emergency situation in the environment of the user.


 
9. An apparatus according to any of claims 1 to 8, wherein the apparatus is a wearable apparatus.
 
10. A method comprising:

detecting a sound signal in an environment of a user;

determining whether the detected sound signal is indicative of an emergency situation in the environment of the user; and

outputting a haptic signal when the detected sound signal is indicative of an emergency situation in the environment of the user.


 
11. A method according to claim 10, comprising:

determining whether the detected sound signal is associated with a predetermined sound signal stored on the apparatus; and

if the detected sound signal is associated with a predetermined sound signal stored on the apparatus, determining that the detected sound signal is indicative of an emergency situation in the environment of the user.


 
12. A method according to claim 11, comprising:

if the detected sound signal is not associated with a predetermined sound signal stored on the apparatus, instructing another apparatus to determine whether the detected sound signal is associated with a predetermined sound signal stored on the other apparatus; and

if the detected sound signal is associated with a predetermined sound signal stored on the other apparatus, determining that the detected sound signal is indicative of an emergency situation in the environment of the user.


 
13. A method according to any of claims 10 to 12, comprising:

if the detected sound signal is indicative of an emergency situation in the environment of the user, instructing another apparatus to display an indication of an emergency situation in the environment of the user.


 
14. A method according to any of claims 10 to 13, comprising:

if the detected sound signal is indicative of an emergency situation in the environment of the user, instructing another apparatus to display information related to the emergency situation in the environment of the user.


 
15. A computer program for an apparatus, comprising software code portions for performing the method of any of claims 10 to 14 when said computer program is run on the apparatus.
 




Drawing

Search report