(19) European Patent Office

(11) EP 2 778 848 B1

(12) EUROPEAN PATENT SPECIFICATION

(45) Mention of the grant of the patent:
09.09.2020 Bulletin 2020/37

(21) Application number: 14158952.3

(22) Date of filing: 11.03.2014

(51) International Patent Classification (IPC):
G06F 3/01 (2006.01)

(54)

System, method and electronic device for providing a haptic effect to a user

System, Verfahren und elektronische Vorrichtung zur Bereitstellung eines haptischen Effekts an einen Benutzer

Système, procédé et dispositif électronique pour fournir un effet haptique à un utilisateur


(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

(30) Priority: 11.03.2013 US 201313793966

(43) Date of publication of application:
17.09.2014 Bulletin 2014/38

(73) Proprietor: Immersion Corporation
San Jose CA 95134 (US)

(72) Inventors:
  • Levesque, Vincent
    Montreal, Québec H2J 2R1 (CA)
  • Grant, Danny
    Laval, Québec H7M 2A1 (CA)

(74) Representative: Beck Greener LLP
Fulwood House, 12 Fulwood Place
London WC1V 6HR (GB)

(56) References cited:
US-A1- 2011 227 824
  
      
    Note: Within nine months from the publication of the mention of the grant of the European patent, any person may give notice to the European Patent Office of opposition to the European patent granted. Notice of opposition shall be filed in a written reasoned statement. It shall not be deemed to have been filed until the opposition fee has been paid. (Art. 99(1) European Patent Convention).


    Description


    [0001] The present invention relates to systems, electronic devices, and methods for providing a haptic effect to a user, for example providing haptic sensations as a function of eye gaze.

    [0002] Detection methods for determining the direction of a user's eye gaze are becoming more prevalent in human-computer interactions. Eye gaze detection may be used, for example, to better determine where a user is pointing at a display when using Microsoft's KINECT® gaming system or Sony's PLAYSTATION EYE® camera with a computer game console. In general, the user's eye gaze may be used as an input to the computer device. For example, the current focus of the user interface may be directed to where the user is looking, similar to the use of a mouse as a pointer on a computer screen. As systems like the KINECT® get more sophisticated, eye gaze detection may become a standard means of providing input to software applications.

    [0003] US 2011/227824 relates to a method of providing immersive effects to a user looking at images on a display device, based on the visual focus of attention (eye gaze) of the user.

    [0004] It is desirable to use eye gaze direction information as an input to create haptic effects in an electronic device so that the haptic effects may be better matched to the user's overall experience according to what the user is currently looking at. This may provide a more immersive experience to the user.

    [0005] According to an aspect of the present invention, there is provided a system that includes a detector configured to determine a direction of an eye gaze of a user of the system, a processor configured to generate a signal representative of a haptic effect based on the direction of the eye gaze, and a haptic output device configured to receive the signal from the processor and output the haptic effect to the user.

    [0006] According to an aspect of the present invention, there is provided a method for providing a haptic effect to a user of a system. The method includes determining a direction of an eye gaze of the user of the system, generating a haptic effect based on the direction of the eye gaze, and outputting the haptic effect to the user.

    [0007] According to an aspect of the present invention, there is provided an electronic device that includes a housing, a detector supported by the housing, the detector configured to determine a direction of an eye gaze of a user of the electronic device, a haptic output device supported by the housing, the haptic output device configured to generate a haptic effect to the user, and a processor supported by the housing. The processor is configured to generate a first haptic drive signal representative of a first haptic effect based on the eye gaze of the user corresponding to the user looking at the electronic device, and a second haptic drive signal representative of a second haptic effect based on the eye gaze of the user corresponding to the user looking away from the electronic device. The processor is configured to output the first haptic drive signal or the second haptic drive signal to the haptic output device to generate the haptic effect based on the direction of the eye gaze determined by the detector.

    [0008] According to a first aspect of the present invention there is provided a system comprising: a detector configured to determine a direction of an eye gaze of a user of the system; a processor configured to generate a signal representative of a haptic effect based on the direction of the eye gaze; and, a haptic output device configured to receive the signal from the processor and output the haptic effect to the user.

    [0009] The first aspect may be modified in any suitable way as disclosed herein including but not limited to any one or more of the following.

    [0010] The system may be configured such that the haptic output device comprises an actuator configured to output the haptic effect.

    [0011] The system may be configured such that the haptic output device comprises a non-mechanical device configured to output the haptic effect.

    [0012] The system may be configured such that the detector comprises a camera configured to capture an image of an eye of the user, and an image processor configured to determine the direction of the eye gaze based on the image.

    [0013] The system may be configured such that the image processor is part of the processor.

    [0014] The system may be configured such that the detector comprises a sensor configured to monitor movements of muscles near an eye of the user, and a second processor configured to determine the direction of the eye gaze based on the monitored movements.

    [0015] The system may be configured such that the second processor is part of the processor.

    [0016] The system may be configured such that the detector and the processor are part of a first device and the haptic output device is part of a second device that is separate from the first device.

    [0017] The system may further comprise a communication port configured to establish a wired or wireless communication channel between the first device and the second device.

    [0018] The system may be configured such that the detector is part of a first device and the processor and the haptic output device are part of a second device that is separate from the first device.

    [0019] The system may further comprise a communication port configured to establish a wired or wireless communication channel between the first device and the second device.

    [0020] The system may be configured such that the processor is configured to generate a first signal representative of the haptic effect when the eye gaze of the user is determined to be directed at an object, and wherein the processor is configured to generate a second signal representative of a second haptic effect that is different from the haptic effect when the eye gaze of the user is determined to not be directed at the object.

    [0021] The system may be configured such that the processor is configured to generate a first signal representative of the haptic effect when the eye gaze of the user is determined to be directed at a first object, and wherein the processor is configured to generate a second signal representative of a second haptic effect that is different from the haptic effect when the eye gaze of the user is determined to be directed at a second object that is different from the first object.

    [0022] The system may be configured such that the processor is further configured to generate an augmented reality space, and use the direction of the eye gaze to determine when the user is looking at an object in the augmented reality space, wherein the haptic effect is further based on the object.

    [0023] According to a second aspect of the present invention there is provided a method for providing a haptic effect to a user of a system, the method comprising: determining a direction of an eye gaze of the user of the system; generating a haptic effect based on the direction of the eye gaze; and, outputting the haptic effect to the user.

    [0024] The second aspect may be modified in any suitable way as disclosed herein including but not limited to any one or more of the following.

    [0025] The method may be configured such that said determining comprises capturing an image of an eye of the user, and determining the direction of the eye gaze based on the image.

    [0026] The method may be configured such that said determining comprises monitoring movements of muscles near an eye of the user, and determining the direction of the eye gaze based on the monitored movements.

    [0027] The method may be configured such that said generating the haptic effect comprises generating a first haptic effect when the eye gaze of the user is determined to be directed at a first object, and generating a second haptic effect that is different from the first haptic effect when the eye gaze of the user is determined to not be directed at the first object.

    [0028] The method may be configured such that said generating the haptic effect comprises generating a first haptic effect when the eye gaze of the user is determined to be directed at a first object, and generating a second haptic effect that is different from the first haptic effect when the eye gaze of the user is determined to be directed at a second object that is different from the first object.

    [0029] According to a third aspect of the present invention there is provided an electronic device comprising: a housing; a detector supported by the housing, the detector configured to determine a direction of an eye gaze of a user of the electronic device; a haptic output device supported by the housing, the haptic output device configured to generate a haptic effect to the user; and, a processor supported by the housing, the processor configured to generate: a first haptic drive signal representative of a first haptic effect based on the eye gaze of the user corresponding to the user looking at the electronic device; and, a second haptic drive signal representative of a second haptic effect based on the eye gaze of the user corresponding to the user looking away from the electronic device, the processor further being configured to output the first haptic drive signal or the second haptic drive signal to the haptic output device to generate the haptic effect based on the direction of the eye gaze determined by the detector.

    [0030] The third aspect may be modified in any suitable way as disclosed herein including but not limited to any one or more of the following.

    [0031] The electronic device may be configured such that the detector comprises a camera configured to capture an image of an eye of the user, and an image processor configured to determine the direction of the eye gaze based on the image.

    [0032] The electronic device may further comprise a display supported by the housing, the display configured to display images.

    [0033] The electronic device may be configured such that the display includes a touch screen configured to receive input from the user.

    [0034] The electronic device may be configured such that an intensity of the second haptic effect is greater than an intensity of the first haptic effect.

    [0035] According to a fourth aspect of the present invention, there is provided an electronic device comprising: the system as described in the first aspect, optionally with any one or more of the optional features described for the first aspect; and a housing; wherein: the detector is supported by the housing, the detector configured to determine a direction of an eye gaze of a user of the electronic device; the haptic output device is supported by the housing; and the processor is supported by the housing, the processor configured to generate a first haptic drive signal representative of a first haptic effect based on the eye gaze of the user corresponding to the user looking at the electronic device, and a second haptic drive signal representative of a second haptic effect based on the eye gaze of the user corresponding to the user looking away from the electronic device, the processor further being configured to output the first haptic drive signal or the second haptic drive signal to the haptic output device to generate the haptic effect based on the direction of the eye gaze determined by the detector.

    [0036] The fourth aspect may be modified in any suitable way as disclosed herein including but not limited to any one or more of the following.

    [0037] The electronic device may be configured such that an intensity of the second haptic effect is greater than an intensity of the first haptic effect.

    [0038] The components of the following Figures are illustrated to emphasize the general principles of the present disclosure and are not necessarily drawn to scale. Reference characters designating corresponding components are repeated as necessary throughout the Figures for the sake of consistency and clarity.

    [0039] Embodiments of the present invention will now be described in detail with reference to the accompanying drawings, in which:

    FIG. 1 is a schematic illustration of a system for providing haptic sensations to a user of the system as a function of eye gaze of the user in accordance with an embodiment as described herein;

    FIG. 1A is a schematic illustration of an embodiment of a detector of the system of FIG. 1;

    FIG. 1B is a schematic illustration of an embodiment of a detector of the system of FIG. 1;

    FIG. 2 is a schematic illustration of an embodiment of a processor of the system of FIG. 1;

    FIG. 3 is a schematic illustration of an embodiment of the system of FIG. 1;

    FIG. 4 is a schematic illustration of an electronic device for providing haptic sensations to a user of the device as a function of eye gaze of the user in accordance with an embodiment as described herein;

    FIG. 5 is a schematic illustration of the electronic device of FIG. 4 in accordance with an embodiment as described herein;

    FIG. 6 is a schematic illustration of the electronic device of FIG. 5 in accordance with an embodiment as described herein;

    FIGs. 7A and 7B are schematic illustrations of the electronic device of FIG. 5 in accordance with an embodiment as described herein;

    FIG. 8 is a schematic illustration of the system of FIG. 1 in accordance with an embodiment as described herein; and

    FIG. 9 is a flow diagram of a method for providing haptic sensations to a user of the system of FIG. 1 or the electronic device of FIG. 4 as a function of eye gaze of the user in accordance with an embodiment as described herein.



    [0040] FIG. 1 illustrates a system 100 in accordance with an embodiment as described herein. As illustrated, the system 100 includes a detector 110, a processor 120, a haptic output device 130, and a display 140. The detector 110 is configured to determine a direction of an eye gaze of a user of the system 100, the processor 120 is configured to generate a signal representative of a haptic effect based on the direction of the eye gaze, the haptic output device 130 is configured to receive the signal from the processor and output the haptic effect to the user, and the display 140 is configured to display content to the user.
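
    By way of illustration only, the interplay of the detector 110, the processor 120, and the haptic output device 130 described above can be sketched in Python as below; all class and method names are hypothetical and are not part of this specification.

        # Illustrative sketch only: hypothetical names, not the patented implementation.
        from dataclasses import dataclass

        @dataclass
        class HapticSignal:
            amplitude: float  # normalized 0..1
            frequency: float  # Hz
            duration: float   # seconds

        class Detector:
            def eye_gaze_direction(self):
                # A real detector (e.g., camera plus image processing) would
                # return the measured gaze direction here.
                return (0.0, 0.0)  # (pitch, yaw) in radians, straight ahead

        class Processor:
            def signal_for_gaze(self, gaze):
                pitch, yaw = gaze
                # Assumed mapping for illustration: stronger effect as the gaze
                # moves off-center; the specification covers many other mappings.
                off_center = min(1.0, (pitch ** 2 + yaw ** 2) ** 0.5)
                return HapticSignal(amplitude=off_center, frequency=150.0, duration=0.05)

        class HapticOutputDevice:
            def output(self, signal):
                print(f"play {signal}")  # stands in for driving an actuator

        detector, processor, device = Detector(), Processor(), HapticOutputDevice()
        device.output(processor.signal_for_gaze(detector.eye_gaze_direction()))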

    [0041] The detector 110 may include any detection means that are used to detect eye gaze. For example, FIG. 1A illustrates an embodiment of a detector 110' that may include a camera 112 configured to capture an image of an eye of the user of the system 100, and a processor 114 configured to determine the direction of the eye gaze based on the image. In an embodiment, the processor 114 may be part of the processor 120 of FIG. 1. Image processing techniques to determine eye gaze direction are well known in the literature and therefore are not described herein. FIG. 1B illustrates an embodiment of a detector 110" that may include a sensor 116 configured to monitor movements of muscles near the eye of the user, and a processor 118 configured to determine the direction of the eye gaze based on the monitored movements. In an embodiment, the sensor 116 may be configured to measure electrical activity of the muscles moving the eyes. In an embodiment, the processor 118 may be part of the processor 120 of FIG. 1.
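
    As a hedged sketch of the camera-based approach of FIG. 1A, assuming the pupil center has already been located in the eye image by one of the image processing techniques alluded to above, gaze angles might be approximated as follows; the calibration constants are invented for illustration.

        import math

        def gaze_angles(pupil_x, pupil_y, eye_center=(320, 240), px_per_radian=900.0):
            """Approximate (pitch, yaw) in radians from the pupil offset in pixels."""
            dx = pupil_x - eye_center[0]
            dy = pupil_y - eye_center[1]
            yaw = math.atan(dx / px_per_radian)     # left/right rotation
            pitch = math.atan(-dy / px_per_radian)  # up/down; image y grows downward
            return pitch, yaw

        print(gaze_angles(380, 210))  # pupil offset right and up -> gaze right and up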

    [0042] The illustrated embodiments of the detector are not considered to be limiting in any way and other detection means that provide for the determination of a direction of the user's eye gaze may be used in accordance with embodiments as described herein. For example, in an embodiment, the user's eye gaze direction may be estimated by analyzing the user's body or head posture.

    [0043] In an embodiment, the detector 110 may also be configured to determine where the user's current eye gaze direction is focused. This may be accomplished by using image processing techniques to determine the position and the shape of the iris of a user's eye, in combination with a model or stored reference image of the iris. In an embodiment, the user's eye gaze direction may be stored as pitch and yaw angles for each eye. With this information, the depth of field of the user's current gaze may also be determined. In an embodiment, other sensors may be used in addition to the detector 110 to better determine the user's intent or volition, such as sensors that are typically associated with functional magnetic resonance imaging ("fMRI") or electroencephalogram ("EEG"). Haptic effects may be rendered as a function of these combined sensor and detector outputs.
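
    For example, once pitch and yaw angles are available for each eye, the depth of the gaze point can be estimated from the vergence of the two gaze rays. The following is a minimal sketch under simplified eyes-on-a-line geometry; the interpupillary distance is a typical value, not a measured one.

        import math

        def gaze_depth(yaw_left, yaw_right, interpupillary_m=0.063):
            """Depth (m) at which the two gaze rays converge."""
            # Ray from each eye: x = eye_x + z * tan(yaw); solve for the z
            # where the two rays meet.
            convergence = math.tan(yaw_left) - math.tan(yaw_right)
            if convergence <= 0:
                return float("inf")  # parallel or diverging rays: far gaze
            return interpupillary_m / convergence

        print(round(gaze_depth(math.radians(2.0), math.radians(-2.0)), 3))  # ~0.9 m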

    [0044] FIG. 2 illustrates an embodiment of the processor 120 in more detail. The processor 120 may be configured to execute one or more computer program modules. The one or more computer program modules may include one or more of a content provision module 122, an eye gaze determination module 124, a haptic effect determination module 126, a haptic output device control module 128, and/or other modules. The processor 120 may be configured to execute the modules 122, 124, 126, and/or 128 by software, hardware, firmware, some combination of software, hardware, and/or firmware, and/or other mechanisms for configuring processing capabilities on processor 120.

    [0045] It should be appreciated that although modules 122, 124, 126, and 128 are illustrated in FIG. 2 as being co-located within a single processing unit, in embodiments in which the processor 120 includes multiple processing units, one or more of modules 122, 124, 126, and/or 128 may be located remotely from the other modules. For example, the eye gaze determination module 124 may reside in the processors 114, 118 described above. The description of the functionality provided by the different modules 122, 124, 126, and/or 128 described below is for illustrative purposes, and is not intended to be limiting, as any of the modules 122, 124, 126, and/or 128 may provide more or less functionality than is described. For example, one or more of the modules 122, 124, 126, and/or 128 may be eliminated, and some or all of its functionality may be provided by other ones of the modules 122, 124, 126, and/or 128. As another example, the processor 120 may be configured to execute one or more additional modules that may perform some or all of the functionality attributed below to one of the modules 122, 124, 126, and/or 128.

    The content provision module 122 is configured to control the provision of content to the user of the system 100 via the display 140. If the content includes computer generated images (e.g., in a videogame, virtual world, augmented reality virtual world, simulation, etc.), the content provision module 122 is configured to generate the images and/or views for display to the user through the display 140. If the content includes video and/or still images, the content provision module 122 is configured to access the video and/or still images and to generate views of the video and/or still images for display on the display 140. If the content includes audio content, the content provision module 122 is configured to generate the electronic signals that will drive a speaker, which may be part of the display 140, to output corresponding sounds. The content, or information from which the content is derived, may be obtained by the content provision module 122 from an electronic storage 129, which may be part of the processor 120, as illustrated in FIG. 2, or may be separate from the processor 120.

    [0046] The eye gaze determination module 124 is configured to determine a direction of the eye gaze of the user based on information from the output signals generated by the detector 110. The information related to the direction of the user's eye gaze determined by the eye gaze determination module 124 may describe the direction as a vector in an absolute coordinate system, with respect to other objects, and/or in other contexts. Such information may include, without limitation, coordinates and/or angular relationships, such as pitch and yaw angles, as described above.

    [0047] The haptic effect determination module 126 is configured to determine the haptic effect or sensation to be generated by the haptic output device 130 for the user, based on information received from the detector 110 and any other sensor that is configured to determine the user's intent or volition, as described above. The gaze parameters determined by the eye gaze determination module 124 may be used alone or in conjunction with other inputs such as content events, motion of any portion of the system 100, etc. Determining the haptic effect may include determining one or more parameters that include an amplitude, frequency, duration, etc., of the haptic sensation. The haptic effect is determined by the haptic effect determination module 126 to enhance one or more aspects of the experience provided by the content to the user. For example, the haptic effect may be determined to enhance one or more of the realism of the content, the enjoyability of content, perception of the content by the user, and/or other aspects of the experience provided by the content being conveyed to the user via the display 140.
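
    A minimal sketch of such a determination step is given below, assuming a simple lookup of amplitude, frequency, and duration by gaze target, optionally adjusted by a content event; the table entries and names are illustrative assumptions, not the claimed mapping.

        # Hypothetical effect table: gaze target -> haptic parameters.
        EFFECTS = {
            "interactive_object": dict(amplitude=0.8, frequency=200.0, duration=0.10),
            "empty_area":         dict(amplitude=0.2, frequency=60.0,  duration=0.05),
        }

        def determine_effect(gaze_target, content_event=None):
            effect = dict(EFFECTS.get(gaze_target, EFFECTS["empty_area"]))
            if content_event == "explosion":  # other inputs, e.g. content events
                effect["amplitude"] = min(1.0, effect["amplitude"] + 0.2)
            return effect

        print(determine_effect("interactive_object", content_event="explosion"))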

    [0048] The haptic output device control module 128 is configured to control the haptic output device 130 to generate the haptic effect determined by the haptic effect determination module 126. This includes communicating the haptic output signal generated by the processor 120 to the haptic output device 130. The haptic effect to be generated may be communicated over wired communication links, wireless communication links, and/or other communication links between the processor 120 and the haptic output device 130. In an embodiment, at least a portion of the functionality attributed to the haptic output device control module 128 may be disposed in a processor carried by the haptic output device 130.

    [0049] The haptic effects or sensations can be created with any of the methods of creating haptics, such as vibration, deformation, kinesthetic sensations, electrostatic or ultrasonic friction, etc. In an embodiment, the haptic output device 130 may include an actuator, for example, an electromagnetic actuator such as an Eccentric Rotating Mass ("ERM") in which an eccentric mass is moved by a motor, a Linear Resonant Actuator ("LRA") in which a mass attached to a spring is driven back and forth, or a "smart material" such as piezoelectric materials, electro-active polymers or shape memory alloys, a macro-composite fiber actuator, an electro-static actuator, an electro-tactile actuator, and/or another type of actuator that provides a physical feedback such as a haptic (e.g., vibrotactile) feedback. The haptic output device 130 may include non-mechanical or non-vibratory devices such as those that use electrostatic friction (ESF), ultrasonic surface friction (USF), or those that induce acoustic radiation pressure with an ultrasonic haptic transducer, or those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on. In an embodiment, multiple haptic output devices may be used to generate different haptic effects. The haptic output device 130 may be located in a gamepad, a mobile phone, a wearable device, or any other device that may be used to create the haptic effects or sensations to the user.

    [0050] FIG. 3 illustrates a system 300 in accordance with embodiments as described herein. As illustrated, the system 300 includes a camera based gaming system 302, such as a KINECT® gaming system by Microsoft or a PLAYSTATION® gaming system by Sony that is equipped with a PLAYSTATION EYE® camera. In the illustrated embodiment, the gaming system 302 includes the detector 110, which may include the detector 110' and/or the detector 110" described above, and the processor 120 described above. The system 300 also includes a user input device 304, such as a gamepad, configured to receive an input from a user to provide a command to the gaming system 302. The user input device 304 may also include the haptic output device 130 described above. The system 300 also includes the display 140, such as a television or monitor, configured to display images output by the content provision module 122 of the gaming system 302.

    [0051] As illustrated in FIG. 3, the user input device 304 is in communication with the gaming system 302 through a wired or wireless communication channel 350 established between a communication port 306 of the gaming system 302 and a communication port 308 of the user input device 304. The haptic output device 130 is configured to receive the signal from the processor 120, such as from the haptic output device control module 128, and output a haptic effect to the user through the user input device 304. The haptic sensation the user feels may depend on the location on the display 140 the user is currently looking at. For example, the user may be looking at the current scene on the display 140, which is not necessarily where the current gamepad focus is located.

    [0052] For example, if a user is playing a first person shooter game and enters a new room, the user uses his/her eye gaze to quickly look around the scene on the display 140. When the user's eye gaze, represented by EG1, falls on a first portion of the room on the left side of the display 140, represented by area "A", a first haptic effect or sensation may be felt in the user input device 304 to indicate the presence of an object that can be interacted with. When the user's eye gaze, represented by EG2, falls on a second portion of the room on the right side of the display 140, represented by area "B", which does not contain an object that can be interacted with, the user may feel a second haptic effect or sensation that is different from the first haptic effect or sensation. The second haptic effect may be configured to communicate to the user that the second portion of the room does not contain an object that can be interacted with, so that the user does not need to continue to look at the second portion of the room.
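
    A sketch of this first-person-shooter example, assuming normalized display coordinates and the two areas "A" and "B" from the description, might look as follows; the region bounds and effect names are illustrative.

        REGIONS = [
            # (x_min, x_max, has_interactive_object)
            (0.0, 0.5, True),   # area "A", left half: object present
            (0.5, 1.0, False),  # area "B", right half: nothing to interact with
        ]

        def effect_for_gaze(gaze_x):
            """gaze_x: horizontal gaze position on the display, normalized to 0..1."""
            for x_min, x_max, has_object in REGIONS:
                if x_min <= gaze_x < x_max:
                    return "first_effect" if has_object else "second_effect"
            return None  # gaze off the display

        print(effect_for_gaze(0.25))  # EG1 on area "A" -> "first_effect"
        print(effect_for_gaze(0.75))  # EG2 on area "B" -> "second_effect"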

    [0053] In an embodiment, the user input device 304 may include a gamepad with a touch screen or touch pad input. The user may play a game using the display 140 as a main display and the touch screen on the user input device 304. The detector 110 may detect the eye gaze of the user and determine whether the user is looking at the display 140 or whether the user is looking at the touch screen on the user input device 304. The processor 120 may output a signal to the haptic output device 130 based on the detected direction of the user's eye gaze so that the haptic effect generated by the haptic output device 130 may communicate to the user which screen the user should be looking at. In an embodiment, the haptic effect may communicate to the user that something has occurred on the display or screen that the user is not currently looking at, to direct the user to look at the other display or screen.

    [0054] In an embodiment, the user input device 304 may include a wearable device, such as a GLASS® augmented reality display by Google Inc. In this embodiment, the detected eye gaze may be used to indicate what the user would like to select on an augmented reality visual display. For example, the user may look at a menu item or icon and then verbally say "select" or even blink for selection of the menu item or icon. The haptic output device 130 may output haptic effects as the user's gaze moves across menu elements or icons.

    [0055] In an embodiment, an augmented reality virtual world provided by the content provision module 122 may be combined with the eye gaze direction information determined by the eye gaze determination module 124 to determine that a user is looking at a specific object within the augmented reality virtual world at a certain depth in the visual field. The haptic effect determination module 126 may identify the type of object and determine the type of haptic effect to be output by the haptic output device 130. For example, in an embodiment, a user may play a game in which he/she must find real or virtual objects in an augmented reality type display. The user's eye gaze direction may be determined and if the user is looking at object X, a haptic effect associated with object X, i.e. object X's haptic signature, may be generated on the user input device 304 by the haptic output device 130.
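
    The haptic-signature lookup in this example reduces, in sketch form, to a table keyed by the identified object; the object identifiers and signature parameters below are hypothetical.

        HAPTIC_SIGNATURES = {
            "object_x":     dict(pattern="double_pulse", amplitude=0.9),
            "virtual_coin": dict(pattern="ramp_up", amplitude=0.5),
        }

        def signature_for(looked_at_object_id):
            # None means the gazed-at object has no haptic signature.
            return HAPTIC_SIGNATURES.get(looked_at_object_id)

        print(signature_for("object_x"))  # the user's gaze reaches object X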

    [0056] In an embodiment, the display 140 may be part of a personal computer. The detector 110 may track the user's eye gaze as the user navigates the display 140 with his/her eyes to control a pointer location on the display 140 in a similar manner as a user would navigate a display with a mouse. The haptic output device 130 may output a haptic effect as the detector 110 detects that the user is looking at each item on the display 140 or when the user is looking at certain highlighted items on the display 140.

    [0057] In an embodiment, the system 100 of FIG. 1 may be part of a single electronic device. FIG. 4 is a schematic illustration of an electronic device 400 in accordance with an embodiment as described herein. As illustrated, the electronic device 400 includes a processor 410, a memory device 420, and input/output devices 430, which are interconnected via a bus 440. In an embodiment, the input/output devices 430 may include a touch screen device 450 or other human-computer interface devices.

    [0058] The touch screen device 450 may be configured as any suitable human-computer interface or touch/contact surface assembly. The touch screen device 450 may be any touch screen, touch pad, touch sensitive structure, computer monitor, laptop display device, workbook display device, kiosk screen, portable electronic device screen, or other suitable touch sensitive device. The touch screen device 450 may be configured for physical interaction with a user-controlled device, such as a stylus, finger, etc. In some embodiments, the touch screen device 450 may include at least one output device and at least one input device. For example, the touch screen device 450 may include a visual display and a touch sensitive screen superimposed thereon to receive inputs from a user's finger. The visual display may include a high definition display screen.

    [0059] In various embodiments, the touch screen device 450 is configured to provide haptic feedback to at least a portion of the electronic device 400, which can be conveyed to a user in contact with the electronic device 400. Particularly, the touch screen device 450 can provide haptic feedback to the touch screen itself to impose a haptic effect when the user is in contact with the screen. The haptic effects can be used to enhance the user experience, and particularly can provide a confirmation to the user that the user has made sufficient contact with the screen to be detected by the touch screen device 450.

    [0060] The electronic device 400 may be any device, such as a desktop computer, laptop computer, electronic workbook, electronic handheld device (such as a mobile phone, gaming device, personal digital assistant ("PDA"), portable e-mail device, portable Internet access device, calculator, etc.), kiosk (such as an automated teller machine, ticket purchasing machine, etc.), printer, point-of-sale device, game controller, or other electronic device.

    [0061] The processor 410 may be a general-purpose or specific-purpose processor or microcontroller for managing or controlling the operations and functions of the electronic device 400. For example, the processor 410 may be specifically designed as an application-specific integrated circuit ("ASIC") to control output signals to a driver of the input/output devices 430 to provide haptic effects. In an embodiment, the processor 410 includes the processor 120 described above. The processor 410 may be configured to decide, based on predefined factors, what haptic effects are to be generated, the order in which the haptic effects are generated, and the magnitude, frequency, duration, and/or other parameters of the haptic effects. The processor 410 may also be configured to provide streaming commands that can be used to drive a haptic output device for providing a particular haptic effect. In some embodiments, the processor 410 may include a plurality of processors, each configured to perform certain functions within the electronic device 400.

    [0062] The memory device 420 may include one or more internally fixed storage units, removable storage units, and/or remotely accessible storage units. The various storage units may include any combination of volatile memory and non-volatile memory. The storage units may be configured to store any combination of information, data, instructions, software code, etc. More particularly, the storage devices may include haptic effect profiles, instructions for how the haptic output device of the input/output devices 430 are to be driven, or other information for generating haptic effects.

    [0063] In addition to the touch screen device 450, the input/output devices 430 may also include specific input mechanisms and output mechanisms. For example, the input mechanisms may include such devices as keyboards, keypads, cursor control devices (e.g., computer mice), or other data entry devices. The input/output devices 430 may include the detector 110, such as the detector 110' that includes the camera 112 and the processor 114 described above. In an embodiment, the user's eye gaze that is detected by the detector 110 may be used to control a pointer location on the touch screen device 450, as described above.

    [0064] Output mechanisms may include a computer monitor, virtual reality display device, audio output device, printer, or other peripheral devices, such as the gamepad described above with respect to the user input device 304 of FIG. 3. The input/output devices 430 may include mechanisms that are designed to not only receive input from a user, but also provide feedback to the user, such as many examples of touch screen devices.

    [0065] The touch screen device 450 and other input/output devices 430 may include any suitable combination and configuration of buttons, keypads, cursor control devices, touch screen components, stylus-receptive components, or other data entry components. The touch screen device 450 may also include any suitable combination of computer monitors, display screens, touch screen displays, haptic output devices, such as the haptic output device 130 described above, or other notification devices for providing output to the user.

    [0066] In an embodiment, the touch screen device 450 includes a display surface 460, which may be rigid, that is configured to modulate its friction properties through methods including, but not limited to, electrostatic friction and ultrasonic surface vibration, to give the user a feeling of surface relief (e.g., hills and valleys) when running a finger or stylus across the display that corresponds to the display image.
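
    As a rough sketch, friction modulation for surface relief can be thought of as a command that tracks a height map under the finger; the height map and scaling below are invented, and the actual electrostatic or ultrasonic drive electronics are not modeled.

        def friction_level(height_map, x, y):
            """Return a normalized friction command (0..1) for a finger position."""
            h = height_map[y][x]
            return max(0.0, min(1.0, h))  # higher relief -> more friction ("hills")

        relief = [[0.1, 0.8, 0.1],
                  [0.1, 0.9, 0.1]]  # a ridge down the middle of the surface
        print(friction_level(relief, 1, 0))  # finger on the ridge -> high friction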

    [0067] FIG. 5 illustrates an embodiment of an electronic device 500, such as the electronic device 400 described above. As illustrated, the electronic device 500, which may be a mobile phone, includes a housing 502 having a first side 504 and a second side 506, a first camera 510 located on the first side 504 of the housing 502 and configured to face the user during normal operation, and a second camera 512 located on the second side 506 of the housing 502 and configured to face away from the user during normal operation. The first camera 510 may be part of the detector 110' described above and may be configured to track the direction of the user's eye gaze EG. The second camera 512 may be used to capture an image 514 that includes a real physical object RO. The device 500 also includes a display 540 that is supported by the housing 502 and is configured to project an image 516 from the first side 504 of the housing 502. The image 516 displayed by the display may be the image 514 captured by the second camera 512.

    [0068] As illustrated in FIG. 5, a processor 520 may be located within and supported by the housing 502, and may be an embodiment of the processor 120 described above. The processor 520 may be configured to augment the image 514 captured by the second camera 512 with a virtual object VO so that the image 516 displayed by the display 540 includes representations of the real object RO and the virtual object VO. A haptic output device 530, such as the haptic output device 130 described above, may be located in and/or supported by the housing 502. In an embodiment, a first haptic effect may be generated by the haptic output device 530 if the processor 520 determines that the eye gaze EG of the user is in the direction of the real object RO, and a second haptic effect that is different from the first haptic effect may be generated by the haptic output device 530 if the processor 520 determines that the eye gaze EG of the user is in the direction of the virtual object VO. In an embodiment, the cameras 510, 512 of the device 500 may be used in a similar manner to determine the user's eye gaze for navigation purposes.

    [0069] In an embodiment, the haptic effect that is generated by the haptic output device 530 may depend on whether the user's eye gaze is directed towards or away from the display 540 of the device 500, as detected by the camera 510 and determined by the processor 520. For example, in an embodiment, an intensity of the haptic effect associated with contact with a widget or button 560 on the device 500 may be increased by the haptic output device 530 when the processor 520 determines that the eye gaze of the user is directed away from the screen 540, as indicated by EGA in FIG. 6. This may assist the user of the electronic device 500 when trying to operate the widget or button 560 without looking at the device 500. When the processor 520 determines that the eye gaze of the user is directed towards the screen 540, as indicated by EGT in FIG. 6, an intensity of the haptic effect may be decreased, because it is assumed that the user can see the widget or button 560. The amplitudes of the periodic signals in FIG. 6 represent different intensities of haptic effects associated with the two different eye gazes EGA, EGT.
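
    In sketch form, this intensity rule is a simple selection on the gaze state; the two intensity values below are assumptions chosen to mirror FIG. 6, where the away-gaze waveform EGA has the larger amplitude.

        def button_intensity(gaze_on_screen):
            # Stronger effect when the user is not looking at the screen,
            # weaker when visual feedback already confirms the contact.
            return 0.3 if gaze_on_screen else 1.0

        print(button_intensity(gaze_on_screen=False))  # EGA, looking away -> 1.0
        print(button_intensity(gaze_on_screen=True))   # EGT, looking at screen -> 0.3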

    [0070] In an embodiment, haptic effects may also be used to attract attention to a notification or other onscreen event, such as the arrival of a new email. Visual attention of the user with respect to the display 540 of the device 500 may be monitored with the camera 510, and the processor 520 may determine whether a haptic notification in the form of a haptic sensation generated by the haptic output device 530 is necessary or whether the user has looked at the display 540 to review the notification or other onscreen event. For example, as illustrated in FIG. 7A, if it is determined that the user has an eye gaze EGA away from the display 540, a haptic effect may be generated by the haptic output device 530 to notify the user that a new email has been received. The haptic effect may initially be very strong, as indicated in FIG. 7A, but may gradually decrease in intensity and fade away once the camera 510 and the processor 520 determine that the user has looked at a notification icon 570 or the display 540, by the detection of the eye gaze EGT towards the notification icon 570 or the display 540, as illustrated in FIG. 7B, since the visual feedback is then sufficient to provide the notification.
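
    The fade-out behavior can be sketched as an amplitude that stays at full strength until the gaze EGT is detected and then decays; the exponential form and decay rate are illustrative assumptions.

        def notification_amplitude(initial, seconds_since_looked, decay_per_s=2.0):
            """Full strength until the user looks at the icon; then fade away."""
            if seconds_since_looked is None:  # user has not looked yet (FIG. 7A)
                return initial
            return initial * (0.5 ** (seconds_since_looked * decay_per_s))

        print(notification_amplitude(1.0, None))  # still looking away -> 1.0
        print(notification_amplitude(1.0, 0.5))   # looked 0.5 s ago -> 0.5
        print(notification_amplitude(1.0, 1.0))   # fading away -> 0.25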

    [0071] FIG. 8 illustrates an embodiment in which an electronic device 800 is provided in the form of a vehicle's onboard computer. The device 800 may include a detector, such as the detector 110' described above, that includes a camera 810 and a processor 820, such as the processor 120 described above. In this embodiment, the camera 810 and the processor 820 are also configured to monitor the number of glances the user makes at a display 840 of the device 800 during a predetermined amount of time. The haptic effects provided by a haptic output device 830, such as the haptic output device 130 discussed above, may be tuned to reduce the number of glances the user makes at the display 840. The haptic output device 830 may be located at the display 840, if the display 840 includes a touch screen, and/or at another location within the vehicle that the driver contacts, such as a steering wheel SW, a surface of the dashboard DB, or the seat on which the driver sits. In an embodiment, if it is determined that the user is glancing at the display 840 more than, for example, ten times during a minute, an intensity of the haptic effect provided by the haptic output device 830 may be increased, based on the assumption that the user needs to rely on visual confirmation because the haptic feedback being provided by the device 800 is not strong enough to provide a satisfactory confirmation. Similarly, if it is determined that the user is glancing at the display 840 less than, for example, two times per minute, the intensity of the haptic effect may be decreased.
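
    The glance-rate tuning rule sketched below uses the thresholds from the text (more than ten glances per minute, fewer than two); the step size and bounds are assumptions.

        def tune_intensity(intensity, glances_per_minute, step=0.1):
            if glances_per_minute > 10:   # driver relies on visual confirmation:
                intensity += step         # strengthen the haptic feedback
            elif glances_per_minute < 2:  # haptic feedback already sufficient:
                intensity -= step         # back it off
            return max(0.0, min(1.0, intensity))

        print(tune_intensity(0.5, glances_per_minute=12))  # -> 0.6
        print(tune_intensity(0.5, glances_per_minute=1))   # -> 0.4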

    [0072] FIG. 9 illustrates a method 900 in accordance with embodiments as described herein. As illustrated, the method starts at 902. At 904, a direction of an eye gaze of a user of a device or a system is determined. At 906, a haptic effect is generated based on the direction of the eye gaze. At 908, the haptic effect is output to the user. The method ends at 912.

    [0073] The illustrated and above-described embodiments are not considered to be limiting in any way, and embodiments as described herein may be used to enable haptic feedback in various electronic devices, such as touch screen handheld devices (mobile devices, PDAs, and navigation systems), automotive applications, gaming consoles, etc.

    [0074] Although many of the examples described herein relate to touch screen devices, it should be understood that the present disclosure also encompasses other types of human-computer interfaces involving touch sensitive structures. In addition, other features and advantages will be apparent to one of ordinary skill in the art upon reading and understanding the general principles of the present disclosure. These other features and advantages are intended to be included in the present disclosure as well.

    [0075] The embodiments described herein represent a number of possible implementations and examples and are not intended to necessarily limit the present disclosure to any specific embodiments. Instead, various modifications can be made to these embodiments as would be understood by one of ordinary skill in the art. Any such modifications are intended to be included within the scope of the present disclosure and protected by the following claims. Embodiments of the present invention have been described with particular reference to the examples illustrated. However, it will be appreciated that variations and modifications may be made to the examples described within the scope of the present invention.


    Claims

    1. A system (100; 300) comprising:

    a detector (110; 110';110") configured to determine a direction of an eye gaze of a user of the system relative to a display device;

    a processor (120; 114; 118; 410; 520; 820) configured to:

    determine one or more parameters of a first haptic effect associated with the direction of the user's eye gaze being towards the display device;

    determine one or more parameters of a second haptic effect associated with the direction of the user's eye gaze being away from the display device,

    wherein the second haptic effect is different from the first haptic effect and the one or more parameters of the first and second haptic effects are selected from the group comprising: an amplitude; a frequency; or a duration;

    generate a first signal representative of the first haptic effect when an event occurs on the display device and the eye gaze of the user is determined to be directed towards the display device; and

    generate a second signal representative of the second haptic effect when the event occurs on the display device and the eye gaze of the user is determined to be directed away from the display device; and

    a haptic output device (130; 530; 830) configured to receive the first signal and second signal from the processor and output the first haptic effect or the second haptic effect.


     
    2. The system according to claim 1, wherein the detector comprises a camera configured to capture an image of an eye of the user, and an image processor configured to determine the direction of the eye gaze based on the image.
     
    3. The system according to claim 1, wherein the detector comprises a sensor configured to monitor movements of muscles near an eye of the user, and a second processor configured to determine the direction of the eye gaze based on the monitored movements.
     
    4. The system according to any of claims 1 - 3, wherein the processor is further configured to generate an augmented reality space, and use the direction of the eye gaze to determine when the user is looking at an object in the augmented reality space, wherein the first haptic effect and the second haptic effect are further based on the object.
     
    5. The system according to any of claims 1 - 4, wherein the detector and the processor are part of a first device and the haptic output device is part of a second device that is separate from the first device.
     
    6. The system according to any of claims 1 - 4, wherein the detector is part of a first device and the processor and the haptic output device are part of a second device that is separate from the first device.
     
    7. A method (900) for providing a haptic effect to a user of a system (100; 300), the method comprising:

    determining (904) a direction of an eye gaze of the user of the system relative to a display device;

    determining one or more parameters of a first haptic effect associated with the direction of the user's eye gaze being towards the display device;

    determining one or more parameters of a second haptic effect associated with the direction of the user's eye gaze being away from the display device, wherein the second haptic effect is different from the first haptic effect and the one or more parameters are selected from the group comprising: an amplitude; a frequency; or a duration;

    generating (906) the first haptic effect when an event occurs on the display device and the eye gaze of the user is determined to be directed towards the display device, and

    generating the second haptic effect when the event occurs on the display device and the eye gaze of the user is determined to be directed away from the display device; and

    outputting (908) the first haptic effect or the second haptic effect to the user.


     
    8. The method according to claim 7, wherein said determining a direction of an eye gaze of the user of the system comprises capturing an image of an eye of the user, and determining the direction of the eye gaze based on the image.
     
    9. The method according to claim 7, wherein said determining a direction of an eye gaze of the user of the system comprises monitoring movements of muscles near an eye of the user, and determining the direction of the eye gaze based on the monitored movements.
     
    10. An electronic device (400; 500) comprising:

    A) the system (100; 300) as claimed in any of claims 1 - 3; and,

    B) a housing (502);
    wherein

    I) the detector (110; 110';110") is supported by the housing, the detector configured to determine a direction of an eye gaze of a user of the electronic device;

    II) the haptic output device (130; 530; 830) is supported by the housing; and

    III) the processor (120; 114; 118; 410; 520; 820) is supported by the housing, the processor configured to output the first haptic drive signal or the second haptic drive signal to the haptic output device to generate the first haptic effect or the second haptic effect based on the direction of the eye gaze determined by the detector.


     
    11. The electronic device according to claim 10, wherein an intensity of the second haptic effect is greater than an intensity of the first haptic effect.
     







    Cited references

    REFERENCES CITED IN THE DESCRIPTION



    This list of references cited by the applicant is for the reader's convenience only. It does not form part of the European patent document. Even though great care has been taken in compiling the references, errors or omissions cannot be excluded and the EPO disclaims all liability in this regard.

    Patent documents cited in the description

    • US 2011227824 A1 [0003]