(19)
(11) EP 3 767 602 A1

(12) EUROPEAN PATENT APPLICATION

(43) Date of publication:
20.01.2021 Bulletin 2021/03

(21) Application number: 20185382.7

(22) Date of filing: 10.07.2020
(51) International Patent Classification (IPC): 
G08B 21/04(2006.01)
G06N 20/10(2019.01)
A61B 5/00(2006.01)
G16H 50/20(2018.01)
A61B 5/11(2006.01)
(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
Designated Extension States:
BA ME
Designated Validation States:
KH MA MD TN

(30) Priority: 14.07.2019 BE 201905453

(71) Applicant: Niko NV
9100 Sint-Niklaas (BE)

(72) Inventors:
  • DE MEY, Jens
    9240 Zele (BE)
  • VUEGEN, Lode
    2440 Geel (BE)
  • DEKKERS, Gert
    2440 Geel (BE)
  • KARSMAKERS, Peter
    2440 Geel (BE)

(74) Representative: DenK iP bv 
Hundelgemsesteenweg 1116
9820 Merelbeke (BE)

   


(54) SENSOR SYSTEM FOR ACTIVITY RECOGNITION


(57) The present invention provides a sensor system (1) for activity recognition. The sensor system (1) comprises at least two sensors (S1, S2, ..., Sn) for capturing environmental data, a data processing unit (3) for each of the at least two sensors (S1, S2, ..., Sn) for processing the captured data, a feature extraction unit (6) for each of the at least two sensors (S1, S2, ..., Sn) for compacting the processed data by filtering information irrelevant for the activity out of the processed data, thereby obtaining activity relevant data, and a primary activity recognition unit (10) for, from the extracted relevant data, recognizing a primary activity. The feature extraction unit (6) and the primary activity recognition unit (10) are part of the sensor system (1), or in other words are located in the sensor system (1).




Description

Technical field of the invention



[0001] The present invention relates to a sensor system. More particularly, the present invention relates to a sensor system for activity recognition.

Background of the invention



[0002] More and more, sensor devices, such as temperature sensors, relative humidity sensors, CO2 sensors, movement sensors and the like, are integrated in homes or buildings to improve the comfort of a user. This goes from simple, single sensor devices to more sophisticated sensor systems that make the home or building "smarter".

[0003] US 2019/0103005 relates to a multi-resolution audio activity tracker based on acoustic scene recognition. The method is based on collecting data from different acoustic sensors to learn about the habits of an elderly individual and to notify dedicated medical staff or a close relative about detected behavior anomalies or a shift in habits of the elderly individual.

[0004] The system described in US 2019/0103005 comprises e.g. three microphones that are connected to a centralized device, such as e.g. an RGW (residential gateway), wirelessly or by PLC (power line communication) technology. Audio features are extracted from the acoustic signals received from the microphones to determine the location and activity of the elderly person.

[0005] Feature extraction is done in the remote gateway or remote centralized device. A disadvantage hereof is that special measures have to be taken for user privacy to be guaranteed. Further, as information is continuously sent back and forth between the sensor system and the cloud, any loss of connection with the cloud can interrupt the working of the system and/or can stop the system from working properly.

[0006] Further, the system described in US 2019/0103005 determines activity based only on acoustic sensors (microphones). It therefore always requires at least three sensors to obtain robust and reliable results, and it remains difficult to get good results when all sensors are of the same type.

[0007] US 2018/0306609 describes a sensing system comprising a sensor assembly that is communicably connected to a computer system, such as a server or a cloud computing system. A block diagram of the sensing system of US 2018/0306609 is shown in Fig. 1. The sensing system 100 comprises a sensor assembly 102 having one or more sensors 110 that sense a variety of different physical phenomena. The sensor assembly 102 featurizes, via a featurization module 112, the raw sensor data and transmits the featurized data to a computer system 104. Through machine learning (machine learning module 116), the computer system 104 then trains a classifier to serve as a virtual sensor 118 for an event that is correlated to the data from one or more sensor streams within the featurized sensor data. The virtual sensor 118 can then subscribe to the relevant sensor feeds from the sensor assembly 102 and monitor for subsequent occurrences of the event. Higher order virtual sensors can receive the outputs from lower order virtual sensors to infer nonbinary details about the environment in which the sensor assemblies 102 are located. Hence, the sensing system 100 is configured to train and implement one or more virtual sensors 118, which are machine learning based classification systems or algorithms trained to detect particular events to which the virtual sensors 118 are assigned as correlated to the data sensed by the sensors 110 of the sensor assembly 102 and/or other virtual sensors 118.

[0008] Similarly to US 2019/0103005, in the system of US 2018/0306609 further processing of the featurized raw data occurs at a remote location, i.e. at a location away from the sensor. Hence, data has to be transferred over the internet to the cloud. Consequently, this system has the same disadvantages as described for US 2019/0103005: special measures have to be taken for user privacy to be guaranteed because information is continuously sent back and forth between the sensor system and the cloud, and any loss of connection with the cloud can interrupt the working of the system and/or can stop the system from working properly. Also, a lot of data needs to be sent to the cloud, especially in the case of audio data. Therefore, a connection with high bandwidth or capacity needs to be provided between the sensor and the cloud to be able to send the data through at a high enough speed. Sending high amounts of data further requires a lot of energy, so that a system in which battery based sensors are used is hard to realize.

[0009] Further, for both US 2019/0103005 and US 2018/0306609, as the model on which the processing is based is located in the cloud, it is a generic model which cannot be adjusted to the system at a particular location.

Summary of the invention



[0010] It is an object of embodiments of the present invention to provide a sensor system for activity recognition.

[0011] The above objective is accomplished by a device according to embodiments of the present invention.

[0012] The present invention provides a sensor system for activity recognition. The sensor system comprises at least two sensors for capturing environmental data, a data processing unit for each of the at least two sensors for processing the captured data, a feature extraction unit for each of the at least two sensors for compacting the (raw) processed data by filtering information irrelevant for the activity out of the processed data, thereby obtaining activity relevant data, and a primary activity recognition unit for, from the extracted relevant data, recognizing a primary activity. The feature extraction unit and the primary activity recognition unit are part of the sensor system, or in other words, are located in the sensor system.

[0013] The at least two sensors are located in a same room or area, i.e. in the room or area where the activity has to be recognized.

[0014] Hence, the feature extraction unit and the primary activity recognition unit are part of the sensor system and are located close to the at least two sensors in a same unit. In other words, all data processing is done locally, i.e. internally in the sensor system.

[0015] With primary activity is meant a basic activity such as, for example but not limited to, a door that is opened or closed, water running out of a water tap, a light that is on, high relative humidity in a room, increasing temperature in a room, ....

[0016] With a sensor for capturing environmental data is meant any connected object that is capable of providing various types of information with respect to the environment, such as e.g. location, position, an individual's movements, sounds, humidity, temperature, .... Hence, according to embodiments of the invention, the at least two sensors may, for example but not limited to, be at least one of a temperature sensor, a CO2 sensor, a radar sensor, a relative humidity sensor, an acoustic sensor, a VOC sensor or the like.

[0017] According to embodiments of the invention, the at least two sensors may be at least two sensors of the same type. However, according to other embodiments, which are more preferred, the at least two sensors may be at least two sensors of a different type. An advantage of the latter is that the results will be more robust and reliable. However, for some particular applications, where a less robust result is acceptable, the at least two sensors may, as described above, be of a same type.

[0018] An advantage of a sensor system according to embodiments of the invention is that processing of the data collected by the at least two sensors in the sensor module is done close to the sensors at the extreme edge level, or in other words is done locally. No data coming from the at least two sensors is transferred into the cloud, which increases safety and offers a better protection of the data of a user. Moreover, people tend to feel more at ease when realising their data is not transferred over the internet.

[0019] A further advantage is also that it is a simpler system, as no large amounts of data have to be sent to the cloud or any other remote system to make the sensor system according to embodiments of the invention work, since processing of the sensor results is done locally in the system. Only in particular embodiments of the invention, where secondary activity recognition is also done, are the recognized primary activities sent to the cloud for coupling with other primary activities to determine a secondary activity (see further).

[0020] A still further advantage of a system according to embodiments of the invention is that, as the data are processed locally in the system, i.e. the model for recognizing at least a primary activity is stored locally in the system, and not in the cloud, it may learn and be adapted locally as well, so that it can be specifically adapted and optimised for the specific location of the system.

[0021] Further, as the data captured from the at least two sensors are processed locally in the system and the data do not have to be sent to the cloud, the energy necessary to process the data is rather limited, which allows battery based sensors to be used in the system. This is a big advantage over the existing systems, which do send all data captured by the sensors to the cloud for processing; this requires a lot of energy, which makes the use of battery based sensors very difficult.

[0022] According to embodiments of the invention, a data processing unit and a feature extraction unit are provided for each sensor present in the sensor system; each of the sensors thus has its own data processing unit and feature extraction unit. An advantage thereof is that events are detected based on multi-modal sensor data. Because of this, the detection robustness is very much increased.

[0023] According to further embodiments, the sensor system may furthermore comprise a secondary activity recognition unit for, from a combination of each of the primary activities recognized by the primary activity recognition unit, determining a higher level secondary activity.

[0024] With higher level secondary activity is meant an activity that can be derived from a combination of at least two primary activities.

[0025] The secondary activity recognition unit may be part of the sensor system, or in other words may be located in the sensor system. According to other embodiments of the invention, the secondary activity recognition unit may be provided at a location remote from the sensor system, such as e.g. in a gateway or in the cloud.

[0026] According to embodiments of the invention, the data processing unit may comprise means for capturing data received from the at least two sensors, and an A/D converter for converting the captured data.

[0027] The feature extraction unit may comprise means for framing the A/D processed data into overlapping frames, a detector unit for evaluating each of the overlapping frames, and an extracting unit for extracting activity relevant frames from the overlapping frames.

[0028] According to embodiments of the invention, the sensor system may furthermore comprise at least one further sensor, and a further data processing unit and a further feature extraction unit for each of the at least one further sensor.

[0029] The sensor system may further comprise a memory for storing parameters of the relevant data and correlated primary and/or secondary activities.

[0030] Moreover, the sensor system may furthermore comprise a training unit for, from subsequent relevant data and correlated primary and/or secondary activities, updating the stored parameters for improved performance.

[0031] The sensor system may furthermore comprise a communication unit for sending signals representative of the primary and/or secondary activity to an electric or electronic device. This may, for example, be sending a notification to e.g. a smartphone, a tablet or any other suitable device for notifying a user e.g. of someone entering the home, a temperature that is increasing in a home, water that is running, or the like. According to other embodiments, the signal may be sent to a remote electric or electronic device for starting an action. For example but not limited to, when the sensor system detects that a door has been opened, it can be decided that someone is entering the home and a signal can be sent to a thermostat to start heating the home.

[0032] According to embodiments of the invention, each of the at least two sensors of the sensor system may be located inside the sensor system. However, according to other embodiments, at least one of the at least two sensors may be located outside the sensor system. For example, sensors already present in a home or building can also be used to send sensor data to the sensor system, in order to help recognize the activity and make the system more robust. According to embodiments of the invention, the sensor system may be a standalone system. According to other embodiments, the sensor system may be part of an automation system.

Brief description of the drawings



[0033] It has to be noted that same reference signs in the different figures refer to same, similar or analogous elements.

Fig. 1 illustrates a sensing system according to the prior art.

Fig. 2 shows the hierarchical approach of a sensor system according to embodiments of the invention.

Fig. 3 schematically illustrates a sensor system according to an embodiment of the invention.

Fig. 4 schematically illustrates a sensor system according to an embodiment of the invention.

Fig. 5 shows the hierarchical approach of a sensor system according to embodiments of the invention.

Fig. 6 schematically illustrates a sensor system according to an embodiment of the invention.

Fig. 7 schematically illustrates a sensor system according to an embodiment of the invention.


Description of illustrative embodiments



[0034] In the description different embodiments will be used to describe the invention. Therefore reference will be made to different drawings. It has to be understood that these drawings are intended to be non-limiting; the invention is only limited by the claims. The drawings are thus for illustrative purposes; the size of some of the elements in the drawings may be exaggerated for clarity purposes.
The term "comprising" is not to be interpreted as limiting the invention in any way. The term "comprising", used in the claims, is not intended to be restricted to what means is described thereafter; it does not exclude other elements, parts or steps.

[0035] The term "connected" as used in the claims and in the description has not to be interpreted as being restricted to direct connections, unless otherwise specified. Thus, part A being connected to part B is not limited to part A being in direct contact to part B, but also includes indirect contact between part A and part B, in other words also includes the case where intermediate parts are present in between part A and part B.
Not all embodiments of the invention comprise all features of the invention. In the following description and claims, any of the claimed embodiments can be used in any combination.
The present invention provides a sensor system for activity recognition. The sensor system comprises at least two sensors for capturing environmental data, a data processing unit for each of the at least two sensors for processing the captured data, a feature extraction unit for each of the at least two sensors for compacting the processed data by filtering information irrelevant for the activity out of the processed data, thereby obtaining activity relevant data, and a primary activity recognition unit for, from the extracted relevant data, recognizing a primary activity. The feature extraction unit and the primary activity recognition unit are part of the sensor system, or in other words are located in the sensor system.
Hence, the feature extraction unit and the primary activity recognition unit are part of the sensor system and are located close to the sensor in a same unit. In other words, all data processing is done internally in the sensor system.
With activity recognition within the scope of the invention is meant using sensor data and data mining and machine learning techniques to model a wide range of human activities. With primary activity is meant a basic activity such as, for example but not limited to, a door that is opened or closed, water running out of a water tap, a light that is on, high relative humidity in a room, increasing temperature in a room, presence of a person in the room, ....
With a sensor for capturing environmental data is meant any connected object that is capable of providing various types of information with respect to the environment, such as e.g. location, position, an individual's movements, sounds, humidity, temperature, ....
An advantage of a sensor system according to embodiments of the invention is that processing of the data collected by the at least two sensors in the sensor module is done close to the sensors at the extreme edge level, or in other words is done locally. No data coming from the at least two sensors is transferred into the cloud, which increases safety and offers a better protection of the data of a user. Hence, user privacy is improved with respect to prior art sensor systems. Moreover, people tend to feel more at ease when realising their data is not transferred over the internet.
A further advantage is also that it is a simpler system, as no large amounts of data have to be sent to the cloud or any other remote system to make the sensor system according to embodiments of the invention work, since processing of the sensor data is done locally in the system. Only in particular embodiments of the invention, where secondary activity recognition is also done, are the recognized primary activities sent to the cloud for coupling with other primary activities to determine a secondary activity (see further). Hence, any interruption in the connection with the cloud does not have a big effect on the working of the sensor system.
A still further advantage of a system according to embodiments of the invention is that, as the data are processed locally in the system, i.e. the model for recognizing at least a primary activity is stored locally in the system, and not in the cloud, it may learn and be adapted locally as well, so that it can be specifically adapted and optimised for the specific location of the system.
Further, as the data captured from the at least two sensors are processed locally in the system and the data do not have to be sent to the cloud, the energy necessary to process the data is rather limited, which allows battery based sensors to be used in the system. This is a big advantage over the existing systems, which do send all data captured by the sensors to the cloud for processing; this requires a lot of energy, which makes the use of battery based sensors very difficult.
The present invention will hereinafter be described by means of different embodiments. It has to be understood that these embodiments are only for the ease of understanding the invention and are not intended to limit the invention in any way.
Fig. 2 illustrates a hierarchical approach according to one embodiment of the invention. It is the intention according to embodiments of the invention to collect environmental information by means of at least two sensors. With a sensor for capturing environmental data is meant any connected object that is capable of providing various types of information with respect to the environment, such as e.g. location, position, an individual's movements, sounds, humidity, temperature, .... Hence, according to embodiments of the invention, the at least two sensors may, for example but not limited to, be at least one of a temperature sensor, a CO2 sensor, a radar sensor, a relative humidity sensor, an acoustic sensor, a VOC sensor or the like.
According to embodiments of the invention, the at least two sensors may be at least two sensors of the same type. However, according to other embodiments, which are more preferred, the at least two sensors may be at least two sensors of a different type. An advantage of the latter is that the results will be more robust and reliable. However, for some particular applications, where a less robust result is acceptable, the at least two sensors may, as described above, be of a same type.
The at least two sensors are located in a same room or area, i.e. in the room or area where the activity has to be recognized.
From the data collected by the at least two sensors primary activities are detected or recognized on the extreme edge level, i.e. locally in the sensor system. With primary activities is meant basic, simple activities, such as e.g. footsteps, speech, running faucet, ventilation active, kitchen hood active, gas stove active, temperature increasing, CO2 amount increasing, flushing toilet, opening or closing of a door or window, locking or unlocking the door with a key and the like.
The above can be done with a sensor system according to embodiments of the invention.
Fig. 3 illustrates a sensor system 1 according to a first embodiment. The sensor system 1 comprises at least two sensors S1, S2, ..., Sn for capturing environmental data. In general, the sensor system 1 may comprise any number of sensors as is required or wanted by a user. The at least two sensors S1, S2, ..., Sn may, for example, be one of a temperature sensor, a CO2 sensor, a radar sensor, a relative humidity sensor, an acoustic sensor, a VOC sensor or the like. As already described above, preferably the at least two sensors S1, S2, ..., Sn may be sensors of a different type. However, in more basic systems the at least two sensors S1, S2, ..., Sn may also be sensors of a same type. The sensor system 1 further comprises a data processing unit 3 for each of the at least two sensors S1, S2, ..., Sn for processing the environmental data captured by the respective sensor. The data processing unit 3 is part of the sensor system 1, or in other words is located in the sensor system 1. According to embodiments of the invention, the data processing unit 3 may comprise means 4 for capturing data received from the sensor S1 and an A/D converter 5 for converting the captured data. The sensor system 1 further comprises a feature extraction unit 6 for each of the at least two sensors S1, S2, ..., Sn for filtering information irrelevant for the activity out of the A/D processed data, after which only activity relevant data remains in the data that is further processed within the sensor system 1. The feature extraction unit 6 is part of the sensor system 1, or in other words, is located in the sensor system. The feature extraction unit 6 may comprise means 7 for framing the A/D processed data into overlapping frames, a detector unit 8 for evaluating each of the frames and an extracting unit 9 for extracting activity relevant frames from the frames.
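To make the chain of Fig. 3 more tangible, a minimal sketch of how the per-sensor units could be wired together is given below. It is purely illustrative: the function and parameter names are chosen here and are not part of the application, and the three callables only mirror the roles of the data processing unit 3, the feature extraction unit 6 and the primary activity recognition unit 10.

```python
from typing import Callable, List, Sequence

# Illustrative only: one such chain per sensor S1 ... Sn, all running locally
# in the sensor system (data processing 3 -> feature extraction 6 -> primary
# activity recognition 10); nothing is sent to the cloud.

def make_sensor_chain(
    capture_and_convert: Callable[[Sequence[float]], List[float]],          # role of unit 3 (capture + A/D)
    extract_relevant_features: Callable[[List[float]], List[List[float]]],  # role of unit 6
    recognise_primary: Callable[[List[List[float]]], str],                  # role of unit 10
) -> Callable[[Sequence[float]], str]:
    """Return a function that turns the raw readings of one sensor into a primary activity label."""
    def run(raw_readings: Sequence[float]) -> str:
        digitised = capture_and_convert(raw_readings)    # A/D converted data
        relevant = extract_relevant_features(digitised)  # irrelevant data filtered out
        return recognise_primary(relevant)               # local decision
    return run
```

One such chain would be instantiated per sensor S1, S2, ..., Sn, so that, as described above, every sensor keeps its own data processing unit and feature extraction unit.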
Irrelevant data is data that indicates something other than what the sensor wants to detect. For example, in case of an acoustic sensor, irrelevant data can be data indicating silence in the environment, or, in case of a movement sensor, data indicating that in between movements there is a moment of no movement, or in general an interruption in the event that the sensor wants to detect or measure. Hence, when such interruptions occur, the data will only contain sensor noise. This sensor noise is not relevant for the recognition of the primary activities and can thus be seen as irrelevant data. Consequently, relevant data is all data that relates to the event that a particular sensor wants to detect or measure, such as sound, humidity, movement, ..., and that thus is relevant for the recognition of the primary activities. As an example, a simple way of making a difference between relevant and irrelevant data coming from an acoustic sensor can be, as is described in more detail below, on the basis of an energy threshold.
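As an illustration of such an energy threshold for an acoustic sensor, a minimal sketch is given below; the frame length, hop size and threshold value are arbitrary assumptions and do not come from the application.

```python
import numpy as np

def relevant_frames(samples: np.ndarray,
                    frame_len: int = 512,
                    hop: int = 256,
                    energy_threshold: float = 1e-4) -> list[np.ndarray]:
    """Cut the A/D converted signal into overlapping frames (means 7), evaluate the
    mean energy of each frame (detector unit 8) and keep only the frames whose energy
    exceeds the threshold (extracting unit 9); low-energy frames are treated as
    sensor noise, i.e. irrelevant data, and are discarded."""
    frames = [samples[i:i + frame_len]
              for i in range(0, len(samples) - frame_len + 1, hop)]
    return [f for f in frames if float(np.mean(f ** 2)) > energy_threshold]
```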

[0036] In other words, the data coming from the at least two sensors S1, S2, ..., Sn needs to be transformed to features that are then used to recognize the primary activity. Hence, the size of the captured data is significantly reduced in this step, as it only contains data relevant for processing.
The sensor system 1 further comprises a primary activity recognition unit 10 for, from the extracted activity relevant data, recognizing a primary activity. With primary activity is meant a basic activity such as, for example but not limited to, a door that is opened or closed, water running out of a water tap, a light that is on, high relative humidity in a room, increasing temperature in a room, ....
According to the invention, the primary activity recognition unit 10 is part of the sensor system 1, or in other words is located in the sensor system. Hence, the primary activity recognition unit 10 is part of the sensor system 1 and is located close to the sensor S1 in a same unit. In other words, all data processing for determining or recognizing the primary activity is done internally in the sensor system.
The primary activity recognition unit 10 takes the features, which were extracted from the data, and compares them with data models of possible activities, stored in the primary activity recognition unit 10. Based on the similarity between both, the system decides if a certain activity is taking place or not. According to the invention, data coming from at least two sensors is used to be able to make a better decision. During a learning process, the data models are created and continuously updated with such features to have a better and more reliable activity detection in the future.
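A strongly simplified, hypothetical picture of this comparison is a nearest-model rule: each candidate activity is represented by a stored feature vector, and the extracted features are assigned to the most similar model, provided the similarity is good enough. A real implementation would use a trained machine learning model; the distance measure and the threshold below are only stand-ins.

```python
import numpy as np

def recognise_primary_activity(features: np.ndarray,
                               models: dict[str, np.ndarray],
                               max_distance: float = 1.0) -> str | None:
    """Compare an extracted feature vector with the stored per-activity data models
    and return the best matching primary activity, or None if nothing is similar enough."""
    best_activity, best_distance = None, float("inf")
    for activity, model in models.items():
        distance = float(np.linalg.norm(features - model))  # similarity measure (here: Euclidean)
        if distance < best_distance:
            best_activity, best_distance = activity, distance
    return best_activity if best_distance <= max_distance else None
```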
The A/D processed data is thus cut into little pieces, the frames, in the feature extraction unit 6. On the basis of a set of frames, an activity, e.g. a sound, can be recognized. However, in the specific example of an acoustic sensor, only the frames with sufficient signal energy or signal pressure are used for such activity recognition. In other words, a set of frames where the signal energy is too low, which can indicate that there are no relevant sounds to be detected, will not be sent to the primary activity recognition unit 10. In this way, the number of processing steps in the primary activity recognition unit 10 can be kept to a minimum. This is a big advantage as the primary activity recognition unit 10 is part of the sensor system 1. In that way, heating because of the processing can be kept low.
The primary activity recognition unit 10 can be seen as an artificial intelligence unit. Machine learning is applied to the data frames to recognize or detect the relevant activity.
According to embodiments of the invention, the sensor system 1 may furthermore comprise a memory 11 for storing parameters of the relevant data and the correlated primary activity, as was described above. The sensor system 1 may furthermore also comprise a training unit 12 for, from subsequent relevant data and correlated primary activities, updating the stored parameters for improved performance. Activities are then more efficiently and correctly recognized by comparing the extracted features with a database of features related to activities that is stored in the sensor system 1.
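One possible, purely illustrative way for the training unit 12 to refine the stored parameters locally is a running-mean update of the per-activity model vector each time new relevant data is confirmed for that activity; the application does not prescribe this particular update rule.

```python
import numpy as np

def update_stored_model(models: dict[str, np.ndarray],
                        counts: dict[str, int],
                        activity: str,
                        new_features: np.ndarray) -> None:
    """Move the stored model of `activity` towards newly observed features, so that
    recognition gradually adapts to the specific location of the sensor system."""
    n = counts.get(activity, 0)
    if activity in models:
        models[activity] = (models[activity] * n + new_features) / (n + 1)  # running mean
    else:
        models[activity] = new_features.astype(float).copy()                # first observation
    counts[activity] = n + 1
```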
The sensor system 1 may furthermore comprise a communication unit 13 for sending signals representative of the primary activity to a remote electric or electronic device. According to embodiments of the invention, the communication unit 13 may be adapted for sending a notification to the remote device so as to notify a user of the recognized primary activity. For example, a user may receive a notification on his/her smartphone or tablet from the sensor system 1 that, for example, it was detected that a water faucet is running. Another example may be that a user receives a notification on his/her smartphone or tablet that a door has been opened or closed. According to other embodiments of the invention, the communication unit 13 may be adapted for sending a signal to the remote electric or electronic device so as to start an action. For example, when it is detected that it is getting dark, a signal may be sent to a lighting device for being turned on and/or a signal may be sent to blinds for going down.
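Both roles of the communication unit 13, notifying a user and triggering an action on another device, can be sketched as a simple dispatch; the transport (e.g. a push notification service or a home automation bus) is deliberately left abstract, since the application does not prescribe it, and the function names are hypothetical.

```python
from typing import Callable

def dispatch_activity(activity: str,
                      notify_user: Callable[[str], None],
                      device_actions: dict[str, Callable[[], None]]) -> None:
    """Send a human-readable notification for the recognized activity and, if an
    action is registered for it, trigger that action on the remote device."""
    notify_user(f"Recognized activity: {activity}")
    action = device_actions.get(activity)
    if action is not None:
        action()
```

Registering, for instance, an entry for a "getting dark" activity that switches on a lighting device would reproduce the lighting example above.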

[0037] Hence, according to the embodiment illustrated in Fig. 3, a primary activity is recognized or detected for each of the at least two sensors S1, S2, ..., Sn. For example, sensor S1 may be a temperature sensor, so the primary activity detected by the sensor system 1 may, for example, be that the temperature is increasing. The second sensor S2 may, for example, be an environmental sensor and the primary activity detected by the sensor system 1 may be that the gas concentration is increasing, and so on. Hence, each of the sensors S1, S2, ..., Sn leads to a different primary activity.
According to a further embodiment, a sensor system 1 according to embodiments of the invention may be adapted for using sensor fusion. With sensor fusion is meant that the results of the at least two sensors are combined to come to one result, i.e. to one primary activity. This is illustrated in Fig. 4. The primary activity recognition unit 10 may, according to this embodiment, be adapted to detect a primary activity from the extracted features of the data of the different sensors.
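One common way to realise such fusion, used here only as an illustration, is early fusion: the feature vectors of the different sensors are concatenated into a single vector before one recognition step is carried out.

```python
import numpy as np
from typing import Callable

def fuse_and_recognise(per_sensor_features: list[np.ndarray],
                       recognise_primary: Callable[[np.ndarray], str | None]) -> str | None:
    """Early fusion: stack the extracted feature vectors of all sensors (e.g.
    temperature, relative humidity, acoustic) into one vector and take a single
    primary activity decision on the combined information."""
    fused = np.concatenate(per_sensor_features)
    return recognise_primary(fused)
```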
According to a further embodiment, a further step is taken in the hierarchical approach, as illustrated in Fig. 5. According to this embodiment, the sensor system 1 may furthermore comprise a secondary activity recognition unit 14, as schematically illustrated in Fig. 6. The secondary activity recognition unit 14 is adapted for, from a combination of each of the primary activities recognized by the primary activity recognition unit 10, determining a higher level secondary activity. With higher level secondary activity is meant an activity that can be derived from a combination of at least two primary activities. A higher level secondary activity is more complex than a primary, basic activity. A higher level secondary activity may, for example, be gas stove activity derived from primary activities such as increased temperature, sound (of gas), and gas concentration increase. Another example may, for example, be presence prediction as a higher level secondary activity determined from primary activities such as CO2 increase, increase of relative humidity and increase of temperature. A further example may be abnormal water consumption as a higher level secondary activity determined from primary activities such as water leakage.
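The kind of combination performed by the secondary activity recognition unit 14 can be illustrated with a simple rule table that maps sets of recognized primary activities to a higher level activity. The rules below merely restate the gas stove and presence examples from this paragraph; a real system could equally well learn such combinations instead of hard-coding them.

```python
# Illustrative rule table: a secondary activity is reported when all of its
# required primary activities have been recognized.
SECONDARY_RULES: dict[str, set[str]] = {
    "gas stove active": {"temperature increasing", "gas sound", "gas concentration increasing"},
    "presence in room": {"CO2 increasing", "relative humidity increasing", "temperature increasing"},
}

def recognise_secondary(primary_activities: set[str]) -> list[str]:
    """Return every higher level secondary activity whose required primary
    activities are all present in the set of recognized primary activities."""
    return [secondary for secondary, required in SECONDARY_RULES.items()
            if required <= primary_activities]
```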
According to embodiments of the invention, the secondary activity recognition unit 14 may be part of the sensor system 1, or in other words, may be located in the sensor system 1 at the location of the sensor system 1 (see Fig. 6). In such cases, all processing is done within the sensor system 1 and no or only a limited internet or cloud connection is required for good functioning of the sensor system 1. According to other embodiments of the invention, the secondary activity recognition unit 14 may be provided at a remote location, or in other words, is not part of the sensor system. According to these embodiments, the secondary activity recognition unit 14 may be located in a remote gateway or in the cloud (see Fig. 7). According to these embodiments, although information has to be sent over the internet, no crucial private information is sent over the internet to the cloud or to a remote gateway; only primary activities have to be sent. Hence, a sensor system 1 according to the present embodiment is still very secure and takes care of a user's privacy, because processing of crucial, private data is all done locally in the sensor system 1 and does not have to be transferred over the internet to the cloud.
According to the embodiment illustrated in Fig. 6 and Fig. 7, the sensor system 1 may also comprise a memory 11. According to such embodiments, the memory 11 may be adapted to store parameters of relevant data and correlated primary and secondary activities.
The sensor system 1 may also comprise a training unit 12 for, from subsequent relevant data and correlated primary and/or secondary activities, updating the stored parameters for improved performance.
Still further, the sensor system 1 may furthermore comprise a communication unit for sending signals representative of the recognized primary and/or secondary activity to a remote electric or electronic device.
According to embodiments, the at least two sensors S1, S2, ..., Sn may all be located within the sensor system 1. However, according to other embodiments, at least one of the at least two sensors S1, S2, ..., Sn may be located outside the sensor system 1. For example, already existing sensors present in a home or building can be integrated so as to work with the sensor system 1. This means that the sensor system 1 can take into account input received from the "outside" sensor(s) to determine the primary and/or secondary activities.
According to embodiments of the invention, the sensor system 1 may be a standalone system, which means that it can perfectly work on its own. According to other embodiments, the sensor system 1 may be part of an automation system.


Claims

1. Sensor system (1) for activity recognition, the sensor system (1) comprising:

- at least two sensors (S1, S2, ..., Sn) for capturing environmental data,

- a data processing unit (3) for each of the at least two sensors (S1, S2, ..., Sn) for processing the captured data,

- a feature extraction unit (6) for each of the at least two sensors (S1, S2, ..., Sn) for filtering information irrelevant for the activity out of the A/D processed data, after which only activity relevant data remains in the data that is further processed in the sensor system (1), and

- a primary activity recognition unit (10) for, from the activity relevant data, recognizing a primary activity,

characterized in that the feature extraction unit and the primary activity recognition unit are part of the sensor system (1).
 
2. Sensor system (1) according to claim 1, wherein the sensor system (1) furthermore comprises a secondary activity recognition unit (14) for, from a combination of each of the primary activities recognized by the primary activity recognition unit (10), determining a higher level secondary activity.
 
3. Sensor system (1) according to claim 2, wherein the secondary activity recognition unit (14) is part of the sensor system (1).
 
4. Sensor system (1) according to claim 2, wherein the secondary activity recognition unit (14) is provided at a location remote from the sensor system (1).
 
5. Sensor system (1) according to any of the previous claims, wherein the data processing unit (3) comprises:

- means (4) for capturing data received from the at least two sensors (S1, S2, ..., Sn), and

- an A/D converter (5) for converting the captured data.


 
6. Sensor system (1) according to any of the previous claims, wherein the feature extraction unit (6) comprises:

- means (7) for framing the A/D processed data into overlapping frames,

- a detector unit (8) for evaluating each of the frames, and

- an extracting unit (9) for extracting activity relevant frames from the frames.


 
7. Sensor system (1) according to any of the previous claims, further comprising a memory (11) for storing parameters of the relevant data and correlated primary and/or secondary activities.
 
8. Sensor system (1) according to claim 7, furthermore comprising a training unit (12) for, from subsequent relevant data and correlated primary and/or secondary activities, updating the stored parameters for improved performance.
 
9. Sensor system (1) according to any of the previous claims, furthermore comprising a communication unit (13) for sending signals representative of the recognized primary and/or secondary activity to a remote electric or electronic device.
 
10. Sensor system (1) according to claim 9, wherein the communication unit (13) is adapted for sending a notification to the remote electric or electronic device so as to notify a user of the recognized primary and/or secondary activity.
 
11. Sensor system (1) according to claim 9, wherein the communication unit (13) is adapted for sending a signal to the remote electric or electronic device so as to start an action.
 
12. Sensor system (1) according to any of the previous claims, wherein each of the at least two sensors (S1, S2, ..., Sn) is located inside the sensor system (1).
 
13. Sensor system (1) according to any of claims 1 to 11, wherein at least one of the at least two sensors (S1, S2, ..., Sn) is located outside the sensor system (1).
 
14. Sensor system (1) according to any of the previous claims, wherein at least one of the at least two sensors (S1, S2, ..., Sn) is at least one of a temperature sensor, a CO2 sensor, a relative humidity sensor, an acoustic sensor, a VOC sensor, a radar sensor or the like.
 
15. Sensor system (1) according to any of the previous claims wherein the sensor system (1) is either a standalone system or is part of an automation system.
 




Drawing


Search report




Cited references

REFERENCES CITED IN THE DESCRIPTION



This list of references cited by the applicant is for the reader's convenience only. It does not form part of the European patent document. Even though great care has been taken in compiling the references, errors or omissions cannot be excluded and the EPO disclaims all liability in this regard.

Patent documents cited in the description