(19)
(11)EP 3 397 525 B1

(12)EUROPEAN PATENT SPECIFICATION

(45)Mention of the grant of the patent:
06.05.2020 Bulletin 2020/19

(21)Application number: 16891731.8

(22)Date of filing:  06.09.2016
(51)International Patent Classification (IPC): 
B60R 22/48(2006.01)
B60N 2/00(2006.01)
B60R 21/015(2006.01)
B60R 16/037(2006.01)
(86)International application number:
PCT/KR2016/009940
(87)International publication number:
WO 2017/146327 (31.08.2017 Gazette  2017/35)

(54)

SYSTEM AND METHOD FOR LOCATING AN OCCUPANT WITHIN A VEHICLE

SYSTEM UND VERFAHREN ZUR ORTUNG EINES INSASSEN IN EINEM FAHRZEUG

SYSTÈME ET PROCÉDÉ DE LOCALISATION D'UN OCCUPANT À L'INTÉRIEUR D'UN VÉHICULE


(84)Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

(30)Priority: 23.02.2016 US 201615051556

(43)Date of publication of application:
07.11.2018 Bulletin 2018/45

(73)Proprietor: Samsung Electronics Co., Ltd.
Suwon-si, Gyeonggi-do 16677 (KR)

(72)Inventors:
  • WANG, Yu
    Mountain View, California 94043 (US)
  • LI, Zhiyun
    Mountain View, California 94043 (US)
  • ZHENG, Pei
    Mountain View, California 94043 (US)

(74)Representative: HGF Limited 
Saviour House 9 St. Saviourgate
York YO1 8NQ (GB)


(56)References cited:
EP-A1- 2 930 585
KR-A- 20160 006 408
US-A1- 2014 028 542
US-A1- 2014 309 862
US-A1- 2015 210 287
KR-A- 20140 059 478
KR-A- 20160 008 372
US-A1- 2014 297 220
US-A1- 2015 149 042
US-B2- 8 660 735
  
      
    Note: Within nine months from the publication of the mention of the grant of the European patent, any person may give notice to the European Patent Office of opposition to the European patent granted. Notice of opposition shall be filed in a written reasoned statement. It shall not be deemed to have been filed until the opposition fee has been paid. (Art. 99(1) European Patent Convention).


    Description

    Technical Field



    [0001] This disclosure relates generally to vehicle personalization systems. More specifically, this disclosure relates to identifying and localizing a vehicle occupant by correlating hand gesture and seatbelt motion.

    Background Art



    [0002] The connected car, with increasing annual revenue, has attracted strong research and development interest from automakers, mobile carriers, and consumer electronics manufacturers. Among these research and development focuses, a key challenge is enabling personalized services, such as information and entertainment (infotainment), for each occupant, including both the driver and passengers. Wearable devices, such as Samsung Gear series smartwatches, have become increasingly popular. These wearable devices can provide activity tracking and other functions comparable to those provided by a smartphone.

    [0003] Mobile carriers have recently launched applications and services to restrict phone use while driving. Specifically, if a phone is detected to be inside a moving car, the service will limit the phone's functionality to avoid possible distractions, regardless of whether the phone belongs to the driver or a passenger. This is inconvenient for passengers, who will usually end up disabling the service altogether. US20150149042A discloses a system for localizing a vehicle occupant.

    Disclosure of Invention


    Solution to Problem



    [0004] Embodiments of the present disclosure provide for identifying and localizing a vehicle occupant by correlating hand gesture and seatbelt buckle motion. The invention resides in the system of claim 1, the vehicle of claim 7 and the method of claim 10.

    Brief Description of Drawings



    [0005] For a more complete understanding of the present disclosure and its advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which like reference numerals represent like parts:

    FIG. 1 illustrates an example communication system in which various embodiments of the present disclosure may be implemented;

    FIG. 2 illustrates an example computer system according to various embodiments of the present disclosure;

    FIG. 3 illustrates an example electronic device according to various embodiments of the present disclosure;

    FIG. 4 illustrates a block diagram of an example vehicle for providing personalization services for located occupants according to one embodiment of this disclosure;

    FIG. 5 illustrates a block diagram of an example enhanced seatbelt buckle according to various embodiments of the present disclosure;

    FIG. 6 illustrates an example enhanced seatbelt buckle according to one embodiment of the present disclosure;

    FIG. 7 illustrates an example sensor fusion pipeline process for obtaining a device rotation matrix according to one embodiment of the present disclosure;

    FIG. 8 illustrates an example process for extracting ground-based motion features according to one embodiment of the present disclosure;

    FIG. 9 illustrates a graph including a comparison of example ground-based acceleration features;

    FIG. 10 illustrates a flowchart of an example process for occupant identification and personalization according to one embodiment of the present disclosure; and

    FIG. 11 illustrates a flowchart for a process for locating an occupant within a vehicle according to various embodiments of this disclosure.



    [0006] Embodiments of the present disclosure recognize that a major challenge in connected car solutions is how to enable personalization automatically. For example, the driver can have her favorite music channel and seat position set up without taking additional actions. Moreover, if the vehicle has an entertainment system for backseat passengers (e.g., display, audio), further personalization is useful. To enable complete personalization for each seat, the vehicle needs to know the identity of the occupant in each seat. Accordingly, embodiments of the present disclosure provide for identification of the occupant in each seat automatically, without the occupant needing to enter identifying information herself. Various embodiments of the present disclosure support occupant personalization without requiring the occupant's intervention for identification. Some detailed examples are provided below.

    [0007] Embodiments of the present disclosure recognize that progress in driver identification has occurred. Some cars identify the driver based on the key inserted. However, embodiments of the present disclosure recognize that this method is not applicable if the key is shared by multiple drivers (e.g., in the case of rental cars). Some other cars identify the driver by asking for an identifier (e.g., a PIN). Embodiments of the present disclosure recognize that this method is inconvenient as it relies on the driver's interaction.

    [0008] Embodiments of the present disclosure also recognize that other options for identifying the occupant in each seat of a vehicle involve requesting occupants to take additional actions. In particular, an occupant can identify herself through barcode labels, RFID tags, or biometric collection devices (e.g., a camera, a fingerprint reader, etc.) at her seat. Embodiments of the present disclosure recognize that these options may be undesirable because they require intervention from each occupant. Moreover, privacy issues become prevalent when biometrics are used for identification.

    [0009] Accordingly, embodiments of the present disclosure provide convenience by using wearable devices to provide personalization in a connected car. Embodiments of the present disclosure address the aforementioned issues by making the vehicle aware of the identity of the occupant in each seat without requiring the occupant's intervention.

    [0010] FIG. 1 illustrates an example communication system 100 in which various embodiments of the present disclosure may be implemented. The embodiment of the communication system 100 shown in FIG. 1 is for illustration only. Other embodiments of the communication system 100 could be used without departing from the scope of this disclosure.

    [0011] As shown in FIG. 1, the system 100 includes a network 102, which facilitates communication between various components in the system 100. For example, the network 102 may communicate Internet Protocol (IP) packets, frame relay frames, or other information between network addresses. The network 102 may include one or more local area networks (LANs); metropolitan area networks (MANs); wide area networks (WANs); all or a portion of a global network, such as the Internet; or any other communication system or systems at one or more locations.

    [0012] The network 102 facilitates communications between at least one server 104 and various client devices 106-110. Each server 104 includes any suitable computing or processing device that can provide computing services for one or more client devices. Each server 104 could, for example, include one or more processing devices, one or more memories storing instructions and data, and one or more network interfaces facilitating communication over the network 102.

    [0013] Each client device 106-110 represents any suitable computing or communication device with a display that interacts with at least one server or other computing device(s) over the network 102. In this example, the client devices 106-110 include electronic devices, such as, for example, mobile telephone(s) or smartphone(s) 106, wearable device(s) 108, vehicle(s) 110, a laptop computer, a tablet computer, etc. However, any other or additional client devices could be used in the communication system 100.

    [0014] In this example, some client devices 106-110 communicate indirectly with the network 102. For example, the client devices 106-110 communicate via one or more base stations 116, such as cellular base stations or eNodeBs. Also, the client devices 106-110 may communicate via one or more wireless access points (APs), such as IEEE 802.11 wireless APs. Note that these are for illustration only and that each client device could communicate directly with the network 102 or indirectly with the network 102 via any suitable intermediate device(s) or network(s). For example, the vehicle 110 may communicate with the network 102 via one or more satellites 118 for receiving and/or sending entertainment and location (e.g., GPS) information.

    [0015] In this illustrative example, vehicle 110 is a connected vehicle capable of providing information and entertainment to occupants. As described in more detail below, the vehicle 110 provides personalization options for the occupants based on a user profile as a result of locating the occupants within seats of the vehicle 110. The vehicle 110 also communicates with devices (e.g., wearable devices 108 and enhanced seatbelt buckles) located within the vehicle 110. For example, the vehicle 110 may communicate via a personal area network (PAN), such as Bluetooth, or near field communication (NFC) to send and receive information to and from devices located within the vicinity of vehicle 110. While illustrated as a car, vehicle 110 may be any suitable vehicle capable of communication and for which personalization options can be provided for occupants, such as, for example, without limitation, a bus, a train, a plane, etc.

    [0016] As described in more detail below, the server 104 may represent a server of a cloud computing system that provides for the identification and localization of a vehicle occupant by correlating hand gesture and seatbelt motion. For example, the server 104 may receive motion patterns from a wearable device 108 of an occupant of the vehicle 110 as well as from an enhanced seatbelt buckle 112 (e.g., via the connected vehicle 110).

    [0017] Although FIG. 1 illustrates one example of a communication system 100, various changes may be made to FIG. 1. For example, the system 100 could include any number of each component in any suitable arrangement. In general, computing and communication systems come in a wide variety of configurations, and FIG. 1 does not limit the scope of this disclosure to any particular configuration. While FIG. 1 illustrates one operational environment in which various features disclosed in this patent document can be used, these features could be used in any other suitable system.

    [0018] FIGS. 2 and 3 illustrate example electronic devices in a communication system according to various embodiments of the present disclosure. In particular, FIG. 2 illustrates an example computer system 200, and FIG. 3 illustrates an example electronic device 300. In this illustrative example, the computer system 200 represents the server 104 in FIG. 1, and the electronic device 300 could represent one or more of the client devices 106-108 in FIG. 1.

    [0019] As shown in FIG. 2, the computer system 200 includes a bus system 205, which supports communication between at least one processor 210, at least one storage device 215, at least one communications interface 220, and at least one input/output (I/O) unit 225.

    [0020] The processor 210 executes instructions that may be loaded into a memory 230. The processor 210 may include any suitable number(s) and type(s) of processors or other devices in any suitable arrangement. Example types of the processor 210 include microprocessors, microcontrollers, digital signal processors, field programmable gate arrays, application specific integrated circuits, and discrete circuitry. The processor 210 may be a general-purpose CPU or a special-purpose processor for encoding or decoding of video data.

    [0021] The memory 230 and a persistent storage 235 are examples of storage devices 215, which represent any structure(s) capable of storing and facilitating retrieval of information (such as data, program code, and/or other suitable information on a temporary or permanent basis). The memory 230 may represent a random access memory or any other suitable volatile or non-volatile storage device(s). The persistent storage 235 may contain one or more components or devices supporting longer-term storage of data, such as a read-only memory, hard drive, Flash memory, or optical disc.

    [0022] The communications interface 220 supports communications with other systems or devices. For example, the communications interface 220 could include a network interface card or a wireless transceiver (e.g., satellite, cellular, WiFi, Bluetooth, NFC, etc.) facilitating communications over the network 102. The communications interface 220 may support communications through any suitable physical or wireless communication link(s). The communications interface 220 may include only one or both of a transmitter and a receiver; for example, only a receiver may be included in a decoder, or only a transmitter may be included in an encoder.

    [0023] The I/O unit 225 allows for input and output of data. For example, the I/O unit 225 may provide a connection for user input through a keyboard, mouse, keypad, touchscreen, or other suitable input device. The I/O unit 225 may also send output to a display, printer, or other suitable output device.

    [0024] As described in more detail below, the computer system 200 may be a server of a cloud computing system that provides for the identification and localization of a vehicle occupant by correlating hand gesture and seatbelt motion. The computer system 200 may also be located in a vehicle (e.g., such as vehicle 110) for locating an occupant and/or providing personalization options for occupants located in the vehicle.

    [0025] FIG. 3 illustrates an example electronic device 300 according to various embodiments of the present disclosure. In this embodiment, the electronic device 300 is an example of one or more of the client devices (e.g., such as client devices 106-108 of FIG. 1). The embodiment of the electronic device 300 illustrated in FIG. 3 is for illustration only, and the client devices 106-108 of FIG. 1 could have the same or similar configuration. However, electronic devices come in a wide variety of configurations, and FIG. 3 does not limit the scope of this disclosure to any particular implementation of an electronic device.

    [0026] As shown in FIG. 3, the electronic device 300 includes antenna(s) 305, a transceiver 310, transmit (TX) processing circuitry 315, a microphone 320, and receive (RX) processing circuitry 325. The electronic device 300 also includes a speaker 330, a processor 340, an input/output (I/O) interface (IF) 345, a touchscreen 350, a display 355, a memory 360, and one or more sensors 365. The memory 360 includes an operating system (OS) 361 and one or more applications 362.

    [0027] The transceiver 310 receives, from the antenna(s) 305, an incoming RF signal transmitted by an access point (e.g., base station, WiFi router, Bluetooth device) for a network (e.g., a WiFi, Bluetooth, cellular, 5G, LTE, LTE-A, WiMAX, or any other type of wireless network). The transceiver 310 down-converts the incoming RF signal to generate an intermediate frequency (IF) or baseband signal. The IF or baseband signal is sent to the RX processing circuitry 325, which generates a processed baseband signal by filtering, decoding, and/or digitizing the baseband or IF signal. The RX processing circuitry 325 transmits the processed baseband signal to the speaker 330 (such as for voice data) or to the processor 340 for further processing (such as for web browsing data).

    [0028] The TX processing circuitry 315 receives analog or digital voice data from the microphone 320 or other outgoing baseband data (such as web data, e-mail, or interactive video game data) from the processor 340. The TX processing circuitry 315 encodes, multiplexes, and/or digitizes the outgoing baseband data to generate a processed baseband or IF signal. The transceiver 310 receives the outgoing processed baseband or IF signal from the TX processing circuitry 315 and up-converts the baseband or IF signal to an RF signal that is transmitted via the antenna(s) 305. In some embodiments, the transceiver 310, the TX processing circuitry 315, and the RX processing circuitry 325 may be implemented within the same block or chip, such as a WiFi and/or Bluetooth module and/or chip.

    [0029] The processor 340 can include one or more processors or other processing devices and execute the OS 361 stored in the memory 360 in order to control the overall operation of the electronic device 300. For example, the processor 340 could control the reception of forward channel signals and the transmission of reverse channel signals by the transceiver 310, the RX processing circuitry 325, and the TX processing circuitry 315 in accordance with well-known principles. In some embodiments, the processor 340 includes at least one microprocessor or microcontroller.

    [0030] The processor 340 is also capable of executing other processes and programs resident in the memory 360. The processor 340 can move data into or out of the memory 360 as required by an executing process. In some embodiments, the processor 340 is configured to execute the applications 362 based on the OS 361 or in response to signals received from eNBs or an operator. The processor 340 is also coupled to the I/O interface 345, which provides the electronic device 300 with the ability to connect to other devices, such as laptop computers and handheld computers. The I/O interface 345 is the communication path between these accessories and the processor 340.

    [0031] The processor 340 is also coupled to the touchscreen 350 and the display 355. The operator of the electronic device 300 can use the touchscreen 350 to enter data and/or inputs into the electronic device 300. The display 355 may be a liquid crystal display, light-emitting diode (LED) display, optical LED (OLED), active matrix OLED (AMOLED), or other display capable of rendering text and/or graphics, such as from web sites, videos, games, etc. The touchscreen 350 can include a touch panel, a (digital) pen sensor, a key, or an ultrasonic input device.

    [0032] The memory 360 is coupled to the processor 340. Part of the memory 360 could include a random access memory (RAM), and another part of the memory 360 could include a Flash memory or other read-only memory (ROM).

    [0033] Electronic device 300 further includes one or more sensors 365 that can meter a physical quantity or detect an activation state of the electronic device 300, and convert metered or detected information into an electrical signal. For example, sensor 365 may include one or more buttons for touch input, a camera, a gesture sensor, a gyroscope or gyro sensor, an air pressure sensor, a magnetic field sensor or magnetometer, an acceleration sensor or accelerometer, a grip sensor, a proximity sensor, a color sensor (e.g., a Red Green Blue (RGB) sensor), a temperature/humidity sensor, an illumination sensor, etc. The sensor(s) 365 can further include a control circuit for controlling at least one of the sensors included therein. As will be discussed in greater detail below, one or more of these sensor(s) 365 may be one or more of an accelerometer, a gyroscope, and a magnetic field sensor to generate a motion pattern for an occupant buckling her seatbelt.

    [0034] Although FIG. 3 illustrates one example of an electronic device 300, various changes may be made to FIG. 3. For example, various components in FIG. 3 could be combined, further subdivided, or omitted, and additional components could be added according to particular needs. As a particular example, the processor 340 could be divided into multiple processors, such as one or more central processing units (CPUs) and one or more graphics processing units (GPUs). In one or more embodiments, the speaker 330, the microphone 320, and/or the touchscreen 350 may not be included in the device 300. For example, in some embodiments, the electronic device 300 may be a smartwatch having display/user interface functionality comparable with that of a smartphone or may be a simple activity tracker with wireless communication functionality but limited or no display/user interface functionality.

    [0035] Embodiments of the present disclosure recognize and take into account that each occupant of a vehicle 110 makes a unique gesture to buckle her seatbelt, which includes raising her hand to grab the seatbelt, pulling the seatbelt around her waist, inserting the buckle into the latch, and finally moving the hand away after the click. This entire gesture can be captured by occupant's wearable device 108 (e.g., a smartwatch or activity tracker) on her hand. Based on the assumption that the seatbelt sensors in the enhanced seatbelt buckle 112 (as discussed below) can simultaneously capture the corresponding movement during this gesture, embodiments of the present disclosure correlate these two movement patterns to uniquely identify the occupant of each seat, thus supporting personalization for all occupants.

    [0036] Components of various embodiments of the present disclosure include an enhanced seatbelt buckle 112 to capture the seatbelt's movement during the seatbelt buckling gesture; a wearable device 108 worn by an occupant to capture her hand movement during the seatbelt buckling gesture; an occupant identification method to associate the occupant's identity through her wearable device and the seat she occupies based on correlating the above two movement patterns; a connected vehicle for identifying locations of occupants within the vehicle and providing personalization options based on occupant locations; and a cloud based platform to support personalization for each occupant.

    [0037] FIG. 4 illustrates a block diagram of an example vehicle 110 for providing personalization services for located occupants according to one embodiment of this disclosure. In this embodiment, the vehicle 110 of FIG. 1 is illustrated with example components. The embodiment of the vehicle 110 illustrated in FIG. 4 is for illustration only. FIG. 4 does not limit the scope of this disclosure to any particular implementation of an electronic device.

    [0038] As illustrated, the vehicle 110 includes a computer system 200, various seats 405 for occupants (e.g., a driver and/or passengers), and personalization components 410 and 415. The personalization components 410 and 415 are components of the vehicle that can provide personalized services to occupants of the vehicle. Personalized services include media content such as infotainment that is specific to the occupant. For example, the display 410 may display a home screen of applications or content from an occupant's mobile device or preset infotainment based on the occupant's profile. The illustrated example includes displays 410 and speakers 415; however, any type of personalization components may be used. For example, each seat 405 may have a separate display 410 and speaker(s) 415 or the display 410 and speaker(s) 415 may be common for multiple seats 405. The display 410 may be a display in a dashboard, a console, or the back of another seat. In another example, the speaker 415 may be a headphones or audio jack for providing audio to the seat occupant.

    [0039] The computer system 200 provides for communication and control of components within the vehicle 110. For example, the computer system 200 may query and receive motion data from the enhanced seatbelt buckles 112 and may identify that the seats 405 are occupied based on an output from the seat sensor(s) 425. The computer system 200 can communicate this information to a cloud connected server 104 and receive information about the locations of occupants within the vehicle and occupant profile data for personalization. The computer system 200 can also communicate with user devices, such as mobile phones 106 and wearable devices 108, and control the output of infotainment via the displays 410 and speakers 415. In some embodiments, the computer system 200 may compare the motion data from the enhanced seatbelt buckles 112 and the user wearable devices 108 to determine the locations of occupants within the vehicle and the occupant profile data for personalization.
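The comparison described above amounts to matching each seat's buckle motion pattern against the motion patterns reported by the occupants' wearables. The following sketch illustrates one way this matching could be organized; the function names, dictionary layout, and the externally supplied similarity function are illustrative assumptions, not part of the claimed system.

```python
def assign_occupants(buckle_patterns, wearable_patterns, similarity):
    """Map each seat to the occupant whose wearable motion pattern best
    matches that seat's buckle motion pattern.

    buckle_patterns:  dict of seat id -> feature vector (hypothetical layout)
    wearable_patterns: dict of occupant id -> feature vector
    similarity: callable scoring two feature vectors (higher is better)
    """
    return {
        seat: max(wearable_patterns,
                  key=lambda occupant: similarity(pattern,
                                                  wearable_patterns[occupant]))
        for seat, pattern in buckle_patterns.items()
    }
```

In practice, a threshold on the best score (as discussed in paragraph [0050]) would guard against assigning a seat when no wearable's pattern is a genuine match.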

    [0040] FIG. 5 illustrates a block diagram of an example enhanced seat belt buckle 112 according to various embodiments of the present disclosure. The embodiment of the enhanced seat belt buckle 112 shown in FIG. 5 is for illustration only. Other embodiments of the enhanced seat belt buckle 112 could be used without departing from the scope of this disclosure.

    [0041] The enhanced seatbelt buckle 112 is an important component of various embodiments of the present disclosure. The enhanced seatbelt buckle 112 captures the seatbelt buckling gesture on the vehicle side. The enhanced seatbelt buckle 112 includes one or more motion sensor(s) 505 to capture the movement of the buckle 112 during the seatbelt buckling gesture. The motion sensor(s) 505 may include, but are not limited to, motion sensors such as an accelerometer and a gyroscope. For example, the motion sensor(s) 505 may include a gesture sensor, a gyroscope or gyro sensor, a magnetic field sensor or magnetometer, an acceleration sensor or accelerometer, etc. The memory 510 stores the captured motion pattern. The transceiver 515 communicates motion pattern data to and from the vehicle. One or more of various wireless communication protocols can be used, such as Wi-Fi, Bluetooth, ZigBee, and Z-Wave, for example. The controller 520 provides control to coordinate and bridge the different components of the buckle 112. Specifically, the controller 520 processes the measurements of the motion sensor(s) 505, stores the measurements in the memory 510, and transmits them to the vehicle via the transceiver 515. The energy supply 525 supplies power to the above-discussed components. For example, the energy supply 525 may be a battery, a wired connection to the energy supply of the vehicle 110 via the seatbelt or the latch, and/or may use induction charging, for example, when buckled or when in an unbuckled position.

    [0042] In practice, the enhanced seatbelt buckle 112 may be either a replacement for existing seatbelt buckles (i.e., integrated within the seatbelt buckle) or an accessory that can be attached to existing seatbelt buckles. FIG. 6 illustrates an example enhanced seat belt buckle 112 according to one embodiment of the present disclosure. The embodiment of the enhanced seat belt buckle 112 shown in FIG. 6 is for illustration only. Other embodiments of the enhanced seat belt buckle 112 could be used without departing from the scope of this disclosure. In this example, the enhanced seat belt buckle 112 is implemented with a TI MSP430 controller, 10 KB of RAM, a CC2420 radio, and an MPU-6050 sensor, and is powered by two AA batteries.

    [0043] Embodiments of the present disclosure extract robust motion features to characterize the seatbelt buckling gesture. One challenge is that motion sensors on mobile/wearable devices typically use the coordinate system of the device. Thus, the device's measurements are dependent on the device orientation and may not be usable directly. To address this issue, embodiments of the present disclosure map the device coordinate system to a coordinate system of the environment by computing a device rotation matrix, and then extract motion features based on the coordinate system of the environment. Alternatively, in practice, the motion features can be significantly simplified to the scalar magnitudes of acceleration during a period of measurement, so that the orientation of either the seatbelt buckle or the wearable becomes irrelevant.
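The simplified alternative above, reducing each 3-axis accelerometer sample to its scalar magnitude, can be sketched as follows; the function name and sample format are illustrative, not part of the claimed implementation.

```python
import math

def acceleration_magnitudes(samples):
    """Reduce 3-axis accelerometer samples (ax, ay, az) to
    orientation-independent scalar magnitudes, one per sample.
    The Euclidean norm is invariant under rotation, so the result
    does not depend on how the buckle or wearable is oriented."""
    return [math.sqrt(ax * ax + ay * ay + az * az)
            for ax, ay, az in samples]
```

Because the magnitude is rotation-invariant, two devices that experience the same motion produce comparable magnitude sequences even when their local coordinate frames differ.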

    [0044] FIG. 7 illustrates an example sensor fusion pipeline process 700 for obtaining a device rotation matrix according to one embodiment of the present disclosure. The process 700 may be performed by any component that processes the sensed motion data of the wearable device 108, for example, the server 104, the wearable device 108, and/or the computer system 200, collectively referred to here as "the system." The embodiment of the sensor fusion pipeline process 700 shown in FIG. 7 is for illustration only. Other embodiments of the sensor fusion pipeline process 700 could be used without departing from the scope of this disclosure.

    [0045] In this illustrative example, the system computes the device rotation matrix based on sensor fusion. The system fuses the measurements of the accelerometer 702, gyroscope 704, and magnetic field sensor 706 to obtain the accurate device orientation 708. For example, the system performs noise removal 710 to remove noise from the noisy orientation 712 derived from the accelerometer 702 and magnetic field sensor 706 outputs. The system also accounts for the drifted orientation 714 of the gyroscope 704 using bias removal 716. By comparing device orientation readings over time, embodiments of the present disclosure obtain the device rotation matrix 718. In one example, the noise removal 710 and bias removal 716 are implemented using low-pass and high-pass filters, respectively.

    [0046] FIG. 8 illustrates an example process 800 for extract ground-based motion features according to one embodiment of the present disclosure. The process 800 may be performed by any component that processes the sensed motion data of the wearable device 108. For example, the server 104, the wearable device 108, and/or the computer system 200, collectively referred to here as "the system." The embodiment of the process 800 shown in FIG. 8 is for illustration only. Other embodiments of the process 800 could be used without departing from the scope of this disclosure.

    [0047] FIG. 8 illustrates the extraction of robust motion features. The system converts raw sensor measurements 802 from the device coordinate system to the coordinate system of the environment using the rotation matrix 718, thus extracting ground-based motion features 806. In addition, the system applies exponential smoothing to remove the bias and random noise in the raw sensor measurements 802.
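The two operations above, rotation into the ground frame followed by exponential smoothing, can be sketched as below; the function name, the tuple-based sample format, and the smoothing factor are illustrative assumptions.

```python
def to_ground_frame(samples, rotation_matrix, alpha=0.3):
    """Rotate device-frame acceleration samples into the ground
    (environment) frame using a 3x3 rotation matrix, then apply
    exponential smoothing to suppress bias and random noise."""
    def rotate(v):
        return tuple(sum(rotation_matrix[i][j] * v[j] for j in range(3))
                     for i in range(3))

    smoothed = None
    features = []
    for sample in samples:
        ground = rotate(sample)
        if smoothed is None:
            smoothed = ground                     # seed the smoother
        else:
            # exponential smoothing: new = alpha * current + (1 - alpha) * previous
            smoothed = tuple(alpha * g + (1 - alpha) * s
                             for g, s in zip(ground, smoothed))
        features.append(smoothed)
    return features
```

The resulting sequence is the ground-based feature stream that, per the following paragraphs, does not depend on how the device is carried.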

    [0048] FIG. 9 illustrates an example graph 900 including a comparison of example ground-based acceleration features. The example graph 900 shown in FIG. 9 is for illustration only. Other example graphs 900 could be used without departing from the scope of this disclosure.

    [0049] In this illustrative example, the ground-based acceleration features extracted from three buckling gestures for the enhanced seatbelt buckle 112 and wearable device 108 are compared during the same period of time. The buckling gesture maintains stable acceleration features along the three directions in the coordinate system of the environment. Note that such ground-based motion features do not depend on how the device is carried.

    [0050] Embodiments of the present disclosure identify an occupant by correlating the movement patterns captured by the enhanced seatbelt buckle 112 and the wearable device 108 during the same period of time. In some embodiments, various metrics such as cosine similarity or other threshold correlation values may be adopted to quantify the correlation. Two movement patterns are considered correlated if the movement patterns captured by the enhanced seatbelt buckle 112 and the wearable device 108 have matching timestamps and their similarity score is higher than a certain threshold. FIG. 9 illustrates the acceleration features simultaneously captured by the enhanced seatbelt buckle 112 and the wearable device 108 during three buckling gestures. As illustrated, the two movement patterns are well correlated.
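As a non-limiting sketch of the cosine-similarity check described above: two feature-time sequences pass if their timestamps align and the similarity of the flattened features clears a threshold. The threshold value, the skew tolerance, and all names here are illustrative assumptions:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity of two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def patterns_correlated(buckle, wearable, threshold=0.9, max_skew=0.5):
    """Each argument is a list of (feature_tuple, timestamp) pairs.
    Patterns match if they start within max_skew seconds of each other
    and their flattened features score above the threshold."""
    if abs(buckle[0][1] - wearable[0][1]) > max_skew:
        return False  # timestamps do not match
    flat_b = [x for feat, _ in buckle for x in feat]
    flat_w = [x for feat, _ in wearable for x in feat]
    return cosine_similarity(flat_b, flat_w) >= threshold
```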

    [0051] FIG. 10 illustrates a flowchart of an example process 1000 for occupant identification and personalization according to one embodiment of the present disclosure. In this embodiment, the parts of the process depicted in FIG. 10 are performed by various components in the communication system 100, including the cloud-connected server 104, the wearable device 108, the vehicle 110, and the enhanced seatbelt buckle 112. The embodiment of the process 1000 shown in FIG. 10 is for illustration only. Other embodiments of the process 1000 could be used without departing from the scope of this disclosure.

    [0052] The process begins with the occupants (e.g., via the wearable device 108 or the user's mobile phone) and the vehicle 110 providing their current locations to the cloud server 104 (steps 1002 and 1004, respectively). The server 104 associates a group of occupants with a vehicle based on the occupant and vehicle location information (step 1006).
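The location-based association in step 1006 can be sketched as a simple proximity test between each reported device position and the vehicle's position. The radius and all names are illustrative assumptions, not the disclosed method:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def associate_occupants(vehicle_pos, occupant_positions, radius_m=25.0):
    """Return IDs of occupants whose reported device location lies within
    radius_m of the vehicle's reported location."""
    vlat, vlon = vehicle_pos
    return [oid for oid, (lat, lon) in occupant_positions.items()
            if haversine_m(vlat, vlon, lat, lon) <= radius_m]
```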

    [0053] When an occupant (e.g., the driver or a passenger) enters the vehicle 110 and sits in a seat with the enhanced seatbelt buckle 112, the occupant will buckle her seatbelt. During this gesture, the sensors of the enhanced seatbelt buckle 112 capture the movement of the buckle (step 1008) as a feature-time sequence, e.g., denoted by [(p_0^CS, t_0), (p_1^CS, t_1), ..., (p_{N-1}^CS, t_{N-1})], in which feature p_i^CS is ground-based and may include the moving speed, acceleration, timing, and/or any other motion features of the seatbelt buckle at time t_i. The seatbelt buckling gesture is made by the occupant's hand wearing the wearable device 108. Simultaneously during this gesture, the motion sensors on the wearable device 108 capture the movement of the occupant's hand (step 1010) as a feature-time sequence, e.g., denoted by [(P_0^R, T_0), (P_1^R, T_1), ..., (P_{N-1}^R, T_{N-1})], in which feature P_i^R is ground-based and may include the moving speed, acceleration, timing, and/or any other motion features of the occupant's hand at time T_i.
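The feature-time sequences above pair a ground-based feature vector with its capture timestamp. A trivial sketch of that packing, assuming hypothetical per-sample speed and acceleration streams (the helper name and feature choice are illustrative only):

```python
def capture_sequence(timestamps, speeds, accels):
    """Pack per-sample motion features into the feature-time sequence
    [(p_0, t_0), ..., (p_{N-1}, t_{N-1})], where each feature p_i bundles
    the moving speed and acceleration observed at time t_i."""
    return [((s, a), t) for t, s, a in zip(timestamps, speeds, accels)]
```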

    [0054] In some embodiments, a set of machine learning algorithms, such as a support vector machine (SVM) and a random forest (RF), is used to distinguish the seatbelt buckling gesture from other gestures. Therefore, in various embodiments, the features captured by the wearable device 108 may characterize only the occupant's hand movement during the buckling gesture.
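For illustration only, a gesture classifier of the kind alluded to here can be sketched with a nearest-centroid rule, a deliberately lightweight stand-in for the SVM or random-forest classifiers the text names; the centroids would be learned offline from labelled gesture recordings:

```python
def nearest_centroid_classify(sample, centroids):
    """Label a feature vector by its closest class centroid (squared
    Euclidean distance). centroids maps label -> centroid vector."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))
```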

    [0055] When the seat is occupied, the vehicle 110 identifies that the seat is occupied, for example, via an output from a seat sensor, and sends a notification to the server 104 that the seat is occupied (step 1012). Upon detecting that the seatbelt is buckled, the vehicle 110 queries for the captured movement (step 1014). The enhanced seatbelt buckle 112 sends the captured movement to the vehicle 110 (step 1016). Then, the vehicle 110 uploads the motion data (e.g., the movement pattern) to the cloud server 104 (step 1018).

    [0056] Meanwhile, as a result of the vehicle 110 notifying the server 104 that the seat is occupied, the cloud server 104 then queries the wearable device 108 of each associated occupant for the captured motion data (step 1020). Each wearable device 108 uploads the recently captured movement of the buckling gesture, if available, to the server 104 (step 1022).

    [0057] The cloud server 104 compares the movement pattern from each wearable device 108 associated with the vehicle 110 with that from the seatbelt buckle 112 to identify a pair of correlated movement patterns. Based on the correlated movement patterns, the server determines that a particular occupant R sits in a particular seat S of the vehicle 110.
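The server-side matching can be sketched as picking the best-correlated candidate among all wearables associated with the vehicle. This is a minimal sketch under assumed names and an assumed cosine-similarity threshold, not the disclosed implementation:

```python
import math

def match_occupant_to_seat(seat_pattern, wearable_patterns, threshold=0.9):
    """Return the occupant ID whose wearable pattern best correlates
    (cosine similarity) with the seatbelt buckle pattern, or None if no
    candidate clears the threshold."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb) if na and nb else 0.0

    best_id, best_score = None, threshold
    for occupant_id, pattern in wearable_patterns.items():
        score = cosine(seat_pattern, pattern)
        if score >= best_score:
            best_id, best_score = occupant_id, score
    return best_id
```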

    [0058] The cloud server 104 notifies the vehicle 110 that occupant R is in seat S and then pushes (e.g., sends identification information of) occupant R's profile to the vehicle 110 (step 1026). As a result of being informed of the identity and location of the occupant R in the vehicle, the vehicle 110 provides personalized services at seat S based on occupant R's profile.

    [0059] These illustrative embodiments do not require the occupant's intervention (e.g., inputting a PIN or scanning a barcode) to identify the occupant in each seat. Rather, in these embodiments, the vehicle is enabled to automatically load the occupant's personalized services before the vehicle starts to move.

    [0060] FIG. 11 illustrates a process 1100 for locating an occupant within a vehicle in accordance with various embodiments of the present disclosure. For example, the process depicted in FIG. 11 may be performed by the server 104 in FIG. 1. The process may also be implemented by the vehicle 110 in FIG. 1. The embodiment of the process 1100 shown in FIG. 11 is for illustration only. Other embodiments of the process 1100 could be used without departing from the scope of this disclosure.

    [0061] The process begins with the system receiving motion data of a seat belt for a seat in the vehicle (step 1105). For example, in step 1105, the system may receive the seat belt motion data from the enhanced seat buckle 112 via the vehicle 110. The system then receives motion data of a wearable device of the occupant (step 1110). For example, in step 1110, the system may receive, from the vehicle 110, a notification that the seat in the vehicle is occupied and then request or identify received wearable device motion data for any occupant associated with the vehicle in response to receipt of the notification. The system may associate the occupant with the vehicle 110 based on earlier received location information of the vehicle and location information of the wearable device of the occupant.

    [0062] The system compares the seat belt motion data with the wearable device motion data (step 1115). For example, in step 1115, the motion data may be processed to identify features of the motion from the respective devices for comparison. The system then identifies that the occupant is located at the seat in the vehicle based on a result of the comparison (step 1120). For example, in step 1120, the system may identify a pair of correlated wearable device and seat belt motion patterns from among a plurality of wearable device motion patterns and seat belt motion patterns for a plurality of occupants of the vehicle. The motion data may be acceleration data and the system may then compare an acceleration pattern of the seat belt with an acceleration pattern of the wearable device.

    [0063] Thereafter, the system identifies personalization options for the occupant at the seat based on a user profile of the occupant (step 1125). For example, in step 1125, the system may provide user profile information to the vehicle or the vehicle may load profile information from a connected user device. Based on identifying that the occupant is located at the seat and the user's profile information, the system may load and provide personalized infotainment to the user in the vehicle (e.g., play audio or video, load applications, modify seat settings, etc.).

    [0064] Although FIGS. 10 and 11 illustrate examples of processes for occupant identification and personalization and locating an occupant within a vehicle, respectively, various changes could be made to FIGS. 10 and 11. For example, while shown as a series of steps, various steps in each figure could overlap, occur in parallel, occur in a different order, or occur multiple times.

    [0065] Embodiments of the present disclosure provide an enhanced seatbelt buckle to capture the movement of the buckle during the occupant's seatbelt buckling gesture. Embodiments of the present disclosure characterize the seatbelt buckling gesture using robust ground-based motion features. By monitoring and correlating the movement of the occupant's hand and the seatbelt buckle, embodiments of the present disclosure can identify the occupant in each seat without the occupant's intervention and thus enable her personalized services automatically. Embodiments of the present disclosure can be implemented in any other transportation systems that have seatbelts and the ability to support personalization.

    [0066] Embodiments of the present disclosure address a key problem in connected car solutions: how to provide each occupant with personalized services automatically. By monitoring and correlating the movement of the occupant's hand and seatbelt, embodiments of the present disclosure identify the occupant in each seat without the occupant's intervention and then enable personalization based on the occupant's profile stored in the cloud or on a user device.

    [0067] Although the present disclosure has been described with an exemplary embodiment, various changes and modifications may be suggested to one skilled in the art. It is intended that the present disclosure encompass such changes and modifications as fall within the scope of the appended claims.


    Claims

    1. A system for identifying an occupant within a vehicle, the system comprising:
    at least one processor configured to:

    obtain seatbelt buckle motion data from a motion sensor of a seatbelt buckle;

    obtain wearable device motion data from a motion sensor of a wearable device worn by the occupant;

    compare the seatbelt buckle motion data with the wearable device motion data from a same period of time; and

    identify the occupant located in a seat of the vehicle based on a result of the comparison.


     
    2. The system of claim 1, wherein the at least one processor is configured to identify personalization options for the occupant located in the seat based on a user profile of the occupant in response to identification.
     
    3. The system of claim 1, wherein the at least one processor is configured to associate the occupant with the vehicle based on location information of the vehicle and location information of the wearable device of the occupant.
     
    4. The system of claim 1, wherein the at least one processor is configured to identify a pair of correlated wearable device and seatbelt buckle motion patterns from among a plurality of wearable device motion patterns and seatbelt buckle motion patterns for a plurality of occupants of the vehicle.
     
    5. The system of claim 1, wherein:

    the seatbelt buckle motion data and the wearable device motion data comprise acceleration data; and

    the at least one processor is configured to compare an acceleration pattern of the seatbelt buckle with an acceleration pattern of the wearable device.


     
    6. The system of claim 1, wherein the system is located in one of the vehicle and a server for a cloud computing system.
     
    7. A vehicle for identifying an occupant, the vehicle comprising:

    one or more seats;

    one or more seatbelts for the one or more seats, respectively,

    wherein the one or more seatbelts each comprise a seatbelt buckle comprising one or more motion sensors configured to capture motion data of the seatbelt buckle; and

    at least one processor configured to:

    obtain seatbelt buckle motion data from a seatbelt buckle motion sensor of any one of the one or more seats,

    send the seatbelt buckle motion data to a server for comparison with wearable device motion data of a wearable device worn by the occupant of the vehicle from a same period of time, and

    receive identification information indicating an identification result for the occupant located in any one of the one or more seats; and

    determine personalization options for the occupant based on the identification information and a user profile of the occupant.


     
    8. The vehicle of claim 7, wherein the at least one processor is configured to send, to the server, location information of the vehicle for associating the occupant with the vehicle based on location.
     
    9. The vehicle of claim 7, further comprising:

    one or more seat sensors,

    wherein the at least one processor is configured to identify that the seat is occupied based on an output of the one or more seat sensors, and

    send, to the server, a notification that the seat in the vehicle is occupied.


     
    10. A method for identifying an occupant within a vehicle, the method comprising:

    obtaining seatbelt buckle motion data from a motion sensor of a seatbelt buckle;

    obtaining wearable device motion data from a motion sensor of a wearable device worn by the occupant;

    comparing the seatbelt buckle motion data with the wearable device motion data from a same period of time; and

    identifying the occupant located in the seat of the vehicle based on a result of the comparison.


     
    11. The method of claim 10, further comprising identifying personalization options for the occupant located in the seat based on a user profile of the occupant in response to identifying.
     
    12. The method of claim 10, further comprising associating the occupant with the vehicle based on location information of the vehicle and location information of the wearable device of the occupant.
     
    13. The method of claim 10, wherein identifying that the occupant is located at the seat in the vehicle comprises identifying a pair of correlated wearable device and seatbelt buckle motion patterns from among a plurality of wearable device motion patterns and seatbelt buckle motion patterns for a plurality of occupants of the vehicle.
     
    14. The method of claim 10, wherein:

    the seatbelt buckle motion data and the wearable device motion data comprise acceleration data; and

    the comparison includes comparing an acceleration pattern of the seatbelt buckle with an acceleration pattern of the wearable device.


     
    15. The method of claim 10, wherein the method is performed by at least one of the vehicle and a server for a cloud computing system.
     


    Ansprüche

    1. System zum Identifizieren eines Insassen in einem Fahrzeug, wobei das System folgendes umfasst:
    mindestens einen Prozessor, der für folgende Zwecke gestaltet ist:

    Erhalten von Sicherheitsgurtschloss-Bewegungsdaten von einem Bewegungssensor eines Sicherheitsgurtschlosses;

    Erhalten von Bewegungsdaten einer tragbaren Vorrichtung von einem Bewegungssensor einer von dem Insassen getragenen tragbaren Vorrichtung;

    Vergleichen der Sicherheitsgurtschloss-Bewegungsdaten mit den Bewegungsdaten einer tragbaren Vorrichtung aus einem gleichen Zeitraum; und

    Identifizieren des sich auf einem Sitz des Fahrzeugs befindenden Insassen auf der Basis eines Ergebnisses des Vergleichs.


     
    2. System nach Anspruch 1, wobei der mindestens eine Prozessor so gestaltet ist, dass er Personalisierungsoptionen für den sich auf dem Sitz befindenden Insassen auf der Basis eines Benutzerprofils des Insassen als Reaktion auf die Identifizierung identifiziert.
     
    3. System nach Anspruch 1, wobei der mindestens eine Prozessor so gestaltet ist, dass er den Insassen mit dem Fahrzeug auf der Basis von Positionsinformationen des Fahrzeugs und Positionsinformationen der tragbaren Vorrichtung des Insassen assoziiert.
     
    4. System nach Anspruch 1, wobei der mindestens eine Prozessor so gestaltet ist, dass er ein Paar korrelierter Bewegungsmuster der tragbaren Vorrichtung und des Sicherheitsgurtschlosses aus einer Mehrzahl von Bewegungsmustern der tragbaren Vorrichtung und von Bewegungsmustern des Sicherheitsgurtschlosses für eine Mehrzahl von Insassen des Fahrzeugs identifiziert.
     
    5. System nach Anspruch 1, wobei:

    die Sicherheitsgurtschloss-Bewegungsdaten und die Bewegungsdaten einer tragbaren Vorrichtung Beschleunigungsdaten umfassen; und

    der mindestens eine Prozessor so gestaltet ist, dass er ein Beschleunigungsmuster des Sicherheitsgurtschlosses mit einem Beschleunigungsmuster der tragbaren Vorrichtung vergleicht.


     
    6. System nach Anspruch 1, wobei sich das System in dem Fahrzeug oder in einem Server für ein Cloud Computing-System befindet.
     
    7. Fahrzeug zum Identifizieren eines Insassen, wobei das Fahrzeug folgendes umfasst:

    einen oder mehrere Sitze;

    einen oder mehrere entsprechende Sicherheitsgurte für den einen oder die mehreren Sitze,

    wobei der eine oder die mehreren Sicherheitsgurte jeweils ein Sicherheitsgurtschloss umfassen, das einen oder mehrere Bewegungssensoren umfasst, die zur Erfassung von Bewegungsdaten des Sicherheitsgurtschlosses gestaltet sind; und

    mindestens einen Prozessor, der für folgende Zwecke gestaltet ist:

    Erhalten von Sicherheitsgurtschloss-Bewegungsdaten von einem Sicherheitsgurtschloss-Bewegungssensor eines Sitzes des einen oder der mehreren Sitze;

    Senden der Sicherheitsgurtschloss-Bewegungsdaten an einen Server für einen Vergleich mit Bewegungsdaten einer tragbaren Vorrichtung einer von dem Insassen des Fahrzeugs getragenen tragbaren Vorrichtung aus einem gleichen Zeitraum; und

    Empfangen von Identifizierungsinformationen, die ein Identifizierungsergebnis für den Insassen anzeigen, der sich auf einem Sitz des einen oder der mehreren Sitze befindet; und

    Bestimmen von Personalisierungsoptionen für den Insassen auf der Basis der Identifizierungsinformationen und eines Benutzerprofils des Insassen.


     
    8. Fahrzeug nach Anspruch 7, wobei der mindestens eine Prozessor so gestaltet ist, dass er Positionsinformationen des Fahrzeugs an den Server sendet, um den Insassen auf der Basis der Position mit dem Fahrzeug zu assoziieren.
     
    9. Fahrzeug nach Anspruch 7, ferner umfassend:

    einen oder mehrere Sitzsensoren,

    wobei der mindestens eine Prozessor so gestaltet ist, dass er auf der Basis einer Ausgabe des einen oder der mehreren Sitzsensoren identifiziert, dass der Sitz belegt ist; und

    Senden einer Mitteilung an den Server, dass der Sitz in dem Fahrzeug belegt ist.


     
    10. Verfahren zum Identifizieren eines Insassen in einem Fahrzeug, wobei das Verfahren folgendes umfasst:

    Erhalten von Sicherheitsgurtschloss-Bewegungsdaten von einem Bewegungssensor eines Sicherheitsgurtschlosses;

    Erhalten von Bewegungsdaten einer tragbaren Vorrichtung von einem Bewegungssensor einer von dem Insassen getragenen tragbaren Vorrichtung;

    Vergleichen der Sicherheitsgurtschloss-Bewegungsdaten mit den Bewegungsdaten einer tragbaren Vorrichtung aus einem gleichen Zeitraum; und

    Identifizieren des sich auf einem Sitz des Fahrzeugs befindenden Insassen auf der Basis eines Ergebnisses des Vergleichs.


     
    11. Verfahren nach Anspruch 10, ferner umfassend das Identifizieren von Personalisierungsoptionen für den sich auf dem Sitz befindenden Insassen auf der Basis eines Benutzerprofils des Insassen als Reaktion auf die Identifizierung.
     
    12. Verfahren nach Anspruch 10, ferner umfassend das Assoziieren des Insassen mit dem Fahrzeug auf der Basis der Positionsinformationen des Fahrzeugs und der Positionsinformationen der tragbaren Vorrichtung des Insassen.
     
    13. Verfahren nach Anspruch 10, wobei das Identifizieren, dass sich der Insasse in dem Sitz in dem Fahrzeug befindet, das Identifizieren eines Paares korrelierter Bewegungsmuster der tragbaren Vorrichtung und des Sicherheitsgurtschlosses aus einer Mehrzahl von Bewegungsmustern der tragbaren Vorrichtung und von Bewegungsmustern des Sicherheitsgurtschlosses für eine Mehrzahl von Insassen des Fahrzeugs umfasst.
     
    14. Verfahren nach Anspruch 10, wobei:

    die Sicherheitsgurtschloss-Bewegungsdaten und die Bewegungsdaten einer tragbaren Vorrichtung Beschleunigungsdaten umfassen; und

    der Vergleich das Vergleichen eines Beschleunigungsmusters des Sicherheitsgurtschlosses mit einem Beschleunigungsmuster der tragbaren Vorrichtung aufweist.


     
    15. Verfahren nach Anspruch 10, wobei das Verfahren wenigstens durch das Fahrzeug und/oder einen Server für ein Cloud Computing-System ausgeführt wird.
     


    Revendications

    1. Système pour identifier un occupant à l'intérieur d'un véhicule, le système comprenant :
    au moins un processeur conçu pour :

    obtenir des données de mouvement de boucle de ceinture de sécurité à partir d'un capteur de mouvement d'une boucle de ceinture de sécurité ;

    obtenir des données de mouvement de dispositif portable à partir d'un capteur de mouvement d'un dispositif portable porté par l'occupant ;

    comparer les données de mouvement de boucle de ceinture de sécurité aux données de mouvement de dispositif portable pour une même période ; et

    identifier l'occupant situé sur un siège du véhicule sur la base d'un résultat de la comparaison.


     
    2. Système selon la revendication 1, l'au moins un processeur étant conçu pour identifier les options de personnalisation pour l'occupant situé sur le siège sur la base d'un profil utilisateur de l'occupant en réponse à l'identification.
     
    3. Système selon la revendication 1, l'au moins un processeur étant conçu pour associer l'occupant au véhicule sur la base des informations d'emplacement du véhicule et des informations d'emplacement du dispositif portable de l'occupant.
     
    4. Système selon la revendication 1, l'au moins un processeur étant conçu pour identifier une paire de modèles de mouvement de dispositif portable et de boucle de ceinture de sécurité corrélés parmi une pluralité de modèles de mouvement de dispositif portable et de modèles de mouvement de boucle de ceinture de sécurité pour une pluralité d'occupants du véhicule.
     
    5. Système selon la revendication 1 :

    les données de mouvement de boucle de ceinture de sécurité et les données de mouvement de dispositif portable comprenant des données d'accélération ; et

    l'au moins un processeur étant conçu pour comparer un modèle d'accélération de la boucle de ceinture de sécurité à un modèle d'accélération du dispositif portable.


     
    6. Système selon la revendication 1, le système étant situé dans le véhicule ou un serveur pour un système d'informatique en nuage.
     
    7. Véhicule pour identifier un occupant, le véhicule comprenant :

    au moins un siège ;

    au moins une ceinture de sécurité pour l'au moins un siège, respectivement,

    l'au moins une ceinture de sécurité comprenant chacune une boucle de ceinture de sécurité comprenant au moins un capteur de mouvement conçu pour saisir les données de mouvement de la boucle de ceinture de sécurité ; et

    au moins un processeur conçu pour :

    obtenir des données de mouvement de boucle de ceinture de sécurité à partir d'un capteur de mouvement de boucle de ceinture de sécurité de l'un quelconque de l'au moins un siège,

    envoyer les données de mouvement de boucle de ceinture de sécurité à un serveur pour les comparer aux données de mouvement de dispositif portable porté par l'occupant du véhicule pendant une même période, et

    recevoir des informations d'identification indiquant un résultat d'identification pour l'occupant situé sur l'un quelconque de l'au moins un siège ; et

    déterminer les options de personnalisation pour l'occupant sur la base des informations d'identification et d'un profil utilisateur de l'occupant.


     
    8. Véhicule selon la revendication 7, l'au moins un processeur étant conçu pour envoyer, au serveur, des informations d'emplacement du véhicule pour associer l'occupant au véhicule sur la base de l'emplacement.
     
    9. Véhicule selon la revendication 7, comprenant en outre :

    au moins un capteur de siège,

    l'au moins un processeur étant conçu pour identifier que le siège est occupé sur la base d'une sortie de l'au moins un capteur de siège, et

    envoyer, au serveur, une notification indiquant que le siège dans le véhicule est occupé.


     
    10. Procédé d'identification d'un occupant dans un véhicule, le procédé comprenant les étapes suivantes :

    obtention des données de mouvement de boucle de ceinture de sécurité à partir d'un capteur de mouvement de boucle de ceinture de sécurité ;

    obtention des données de mouvement de dispositif portable à partir d'un capteur de mouvement d'un dispositif portable porté par l'occupant ;

    comparaison des données de mouvement de boucle de ceinture de sécurité aux données de mouvement de dispositif portable pour une même période ; et

    identification de l'occupant situé sur un siège du véhicule sur la base d'un résultat de la comparaison.


     
    11. Procédé selon la revendication 10, comprenant en outre l'identification des options de personnalisation pour l'occupant situé sur le siège, sur la base d'un profil utilisateur de l'occupant en réponse à l'identification.
     
    12. Procédé selon la revendication 10, comprenant en outre l'association de l'occupant au véhicule sur la base des informations d'emplacement du véhicule et des informations d'emplacement du dispositif portable de l'occupant.
     
    13. Procédé selon la revendication 10, l'identification du fait que l'occupant est situé au siège dans le véhicule comprenant l'identification d'une paire de modèles de mouvement de dispositif portable et de mouvement de boucle de ceinture de sécurité corrélés parmi une pluralité de modèles de mouvement de dispositif portable et de modèles de mouvement de boucle de ceinture de sécurité pour une pluralité d'occupants du véhicule.
     
    14. Procédé selon la revendication 10,
    les données de mouvement de boucle de ceinture de sécurité et les données de mouvement de dispositif portable comprenant des données d'accélération ; et
    la comparaison comprenant la comparaison d'un modèle d'accélération de la boucle de ceinture de sécurité à un modèle d'accélération du dispositif portable.
     
    15. Procédé selon la revendication 10, le procédé étant exécuté par au moins l'un du véhicule et d'un serveur pour un système d'informatique en nuage.
     




    Drawing