(19)
(11) EP 2 770 408 B1

(12) EUROPEAN PATENT SPECIFICATION

(45) Mention of the grant of the patent:
22.07.2020 Bulletin 2020/30

(21) Application number: 14152597.2

(22) Date of filing: 27.01.2014

(51) International Patent Classification (IPC):
G06F 3/041 (2006.01)
G06F 3/044 (2006.01)

(54) Apparatus and method for recognizing proximity motion using sensors

Vorrichtung und Verfahren zur Erkennung von Bewegung in der Nähe mithilfe von Sensoren

Appareil et procédé de reconnaissance de mouvement de proximité à l'aide de capteurs


(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

(30) Priority: 22.02.2013 KR 20130018961
12.11.2013 KR 20130137168

(43) Date of publication of application:
27.08.2014 Bulletin 2014/35

(73) Proprietor: Samsung Electronics Co., Ltd.
Gyeonggi-do 443-742 (KR)

(72) Inventors:
  • Han, Seung Ju
    446-712 Gyeonggi-do (KR)
  • Park, Joon Ah
    446-712 Gyeonggi-do (KR)
  • Ryu, Hyun Suk
    446-712 Gyeonggi-do (KR)
  • Park, Du Sik
    446-712 Gyeonggi-do (KR)
  • Lee, Ki Myung
    446-712 Gyeonggi-do (KR)

(74) Representative: Grootscholten, Johannes A.M. et al
Arnold & Siedsma Bezuidenhoutseweg 57
2594 AC The Hague (NL)


(56) References cited:
US-A1- 2006 279 548
US-A1- 2009 309 851
  
      
    Note: Within nine months from the publication of the mention of the grant of the European patent, any person may give notice to the European Patent Office of opposition to the European patent granted. Notice of opposition shall be filed in a written reasoned statement. It shall not be deemed to have been filed until the opposition fee has been paid. (Art. 99(1) European Patent Convention).


    Description


    [0001] The invention relates to an apparatus in accordance with the preamble of claim 1 and a method in accordance with the preamble of claim 10. Such an apparatus and method are described in US 2006/0279548 A1.

    [0002] A touch input scheme performed on a two-dimensional (2D) plane is used in various portable devices, such as a smart phone, a tablet computer, and a laptop computer, for example. In particular, the touch input scheme has been developed from a single point recognition technique of receiving a single touch input to a multipoint recognition technique of receiving a plurality of touch inputs simultaneously.

    [0003] Relative to a user input provided by touch screen technology, proximity motion recognition technology may reduce fingerprint smudges on the input display screen, and provide a three-dimensional interface for user input. However, proximity motion recognition may also reduce the accuracy of the user input.

    [0004] Accordingly, a proximity motion apparatus may use a first sensor with a near range and a second sensor with a far range to detect a user input for an input display screen, and manage power supplied to each sensor. A user interface for the input display screen may be adapted to correspond to a detected motion in the near range, such that a user may more accurately provide the user input. For example, the user interface may be magnified as a motion is detected approaching the input display screen. The user interface may switch from a two dimensional interface to a three dimensional interface, or vice versa, as the approaching motion is detected. Icons may be adapted to shift position on the input display screen to prevent the icons from being obscured by the approaching motion. Such adaptations of the input display screen may be intuitive to a user, and mimic a natural movement of the user.

    [0005] The foregoing and/or other aspects are achieved by providing an apparatus in accordance with claim 1 and a method in accordance with claim 10.

    [0006] The apparatus may further include a proximity measurer to measure a proximity between the input object and the reference surface, and a sensor controller to selectively activate the first sensor and the second sensor, based on the measured proximity.

    [0007] The apparatus may further include an interface controller to control the proximity motion interface for at least one predetermined icon to be disposed along a perimeter of a point being indicated by a second input object, when the second input object is input in the second input space, and a signal processor to extract an axis and an endpoint of the first input object based on an output signal of the first sensor, when the first input object is input in the first input space. Here, the interface controller may control, using the axis and the endpoint of the first input object, the proximity motion interface for the at least one predetermined icon to be disposed along a perimeter of an area in which the display is obscured by the first input object, when the first input object is input in the first input space.

    [0008] The apparatus may further include a plurality of input sensing units to determine whether a plurality of input objects is sensed, based on at least one of an output signal of the first sensor and an output signal of the second sensor, and an input selector to select at least one of the plurality of input objects based on a predetermined mode, when the plurality of input sensing units determines that the plurality of input objects is sensed.

    [0009] The apparatus may further include an input pattern recognizer to recognize an input pattern based on at least one of an output signal of the first sensor and an output signal of the second sensor, and a function performer to perform a function corresponding to the input pattern.

    [0010] The apparatus may further include a display to provide a two-dimensional (2D) interface and a three-dimensional (3D) interface, a calculator to calculate at least one of a position, a velocity, and an angular velocity of the input object, based on at least one of an output signal of the first sensor and an output signal of the second sensor, and a display controller to control an operation mode of the display based on a result of the calculating.

    [0011] A method of controlling a user interface is also disclosed, the method including sensing an input in a first region near the user interface; sensing the input in a second region outside the first region; and selectively controlling the user interface in a first manner and a second manner based on at least one of a position and movement of the sensed input within the first region and the second region.

    [0012] Additional aspects of embodiments will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the disclosure.

    BRIEF DESCRIPTION OF THE DRAWINGS



    [0013] These and/or other aspects will become apparent and more readily appreciated from the following description of embodiments, taken in conjunction with the accompanying drawings of which:

    FIG. 1 illustrates an apparatus for recognizing a proximity motion using sensors according to example embodiments;

    FIG. 2 illustrates an output signal of a first sensor and an output signal of a second sensor according to example embodiments;

    FIG. 3 illustrates a power saving mode of an apparatus for recognizing a proximity motion according to example embodiments;

    FIG. 4 illustrates a proximity motion interface provided by an apparatus for recognizing a proximity motion according to example embodiments;

    FIG. 5 illustrates an operation of an apparatus for recognizing a proximity motion according to example embodiments;

    FIGS. 6A and 6B illustrate an operation of disposing an icon automatically, by an apparatus for recognizing a proximity motion according to example embodiments;

    FIGS. 7, 8A, and 8B illustrate a signal processing method for an operation of disposing an icon automatically, by an apparatus for recognizing a proximity motion according to example embodiments;

    FIGS. 9A, 9B, 9C, and 10 illustrate an operation of an apparatus for recognizing a proximity motion when a plurality of input objects is sensed according to example embodiments;

    FIG. 11 illustrates an apparatus for recognizing a proximity motion that may recognize an input pattern according to example embodiments; and

    FIGS. 12A and 12B illustrate an operation of switching between a two-dimensional (2D) interface and a three-dimensional (3D) interface by an apparatus for recognizing a proximity motion according to example embodiments.


    DETAILED DESCRIPTION



    [0014] Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. Embodiments are described below to explain the present disclosure by referring to the figures.

    [0015] Hereinafter, an apparatus for recognizing a proximity motion may be referred to simply as the apparatus.

    [0016] FIG. 1 illustrates an apparatus 100 for recognizing a proximity motion using sensors according to example embodiments.

    [0017] Referring to FIG. 1, the apparatus 100 may include a first sensor 110, a second sensor 120, and an information transfer controller 130.

    [0018] Here, technology for recognizing a proximity motion may refer to technology for developing a touch input scheme from a two-dimensional (2D) scheme to a three-dimensional (3D) scheme. The apparatus 100 may refer to an apparatus that may receive an input of a proximity motion from a user and recognize the input proximity motion, and may be implemented in various forms, such as a fixed device and a portable device, for example. The first sensor may be a first type of sensor and the second sensor may be a second type of sensor. However, the disclosure is not limited thereto. For example, the sensors may be of the same type, but configured to sense an input in different distance ranges.

    [0019] The first sensor 110 may sense a first input space, or region, at a distance closer than a first predetermined distance 141 from a reference surface 140 for recognizing a proximity motion.

    [0020] Although not shown in the drawings, the apparatus 100 may further include a display, and the reference surface 140 may be disposed above the display.

    [0021] In this instance, the first sensor 110 may be implemented by various schemes. For example, the first sensor 110 may be implemented using an infrared sensor or an ultrasonic sensor installed at an edge of the display, or may be implemented using an image sensor, a depth sensor, or a touch sensor panel.

    [0022] In addition, the second sensor 120 may sense a second input space at a distance farther than the first predetermined distance 141 and closer than a second predetermined distance 142 from the reference surface 140. Here, a value of the second predetermined distance 142 may be greater than a value of the first predetermined distance 141.

    [0023] When an input object 150 is input at a position corresponding to the first predetermined distance 141, the apparatus 100 may sense the input object 150 using the first sensor 110 or the second sensor 120, based on predetermined settings.

    [0024] In this instance, the second sensor 120 may be implemented by various schemes. For example, the second sensor 120 may be implemented using an infrared sensor or an ultrasonic sensor installed at an edge of the display, or may be implemented using an image sensor, a depth sensor, or a touch sensor panel.

    [0025] Here, the first predetermined distance 141 and the second predetermined distance 142 may be predetermined based on a characteristic of the first sensor 110 and a characteristic of the second sensor 120, respectively.

    [0026] The apparatus 100 may process an input being sensed in a space most proximate to the reference surface 140 and an input being sensed in a space relatively far from the reference surface 140 in different manners, using at least two different sensors.

    [0027] Accordingly, the apparatus 100 may provide technology that may combine advantages of different sensors, for example, an advantage of an image sensor sensing a relatively wide input space, and an advantage of a touch sensor panel sensing an input object rapidly and accurately.

    [0028] In addition, the apparatus may provide technology for avoiding the handprints that a touch screen input scheme leaves behind. Also, the apparatus may provide technology for reducing the input load of having to touch a screen directly, and technology for receiving various inputs, for example, intuitive and natural gesture inputs, such as a motion of turning a page, a motion of picking up an object, and the like, by utilizing a 3D space.

    [0029] Hereinafter, for ease of description, the first sensor 110 may correspond to a capacitive touch sensor configured to sense a space at a distance within 5 centimeters (cm) from the reference surface 140, and the second sensor 120 may correspond to a vision sensor configured to sense a space at a distance greater than or equal to 5 cm from the reference surface 140. However, the scope of the present disclosure is not to be limited thereto.

    [0030] Here, the vision sensor may recognize spatial information of a target object, by sensing a change in light with respect to the target object over time, for example, a time-derivative value of light. When the target object is moved, the vision sensor may selectively sense the movement, by distinguishing the movement from a background. For example, the vision sensor may include a dynamic vision sensor (DVS) configured to sense only a change in an intensity of light in each pixel.
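The change-based sensing described above can be illustrated with a toy sketch. This is not taken from the patent; the function name, frame representation, and contrast threshold are assumptions, and real DVS hardware operates asynchronously per pixel rather than on frames.

```python
# Toy sketch of DVS-style sensing: an event is emitted only for pixels
# whose intensity changed by more than a contrast threshold, so a static
# background produces no output at all.

def dvs_events(prev_frame, cur_frame, threshold=10):
    """Yield (x, y, polarity) events for sufficiently changed pixels."""
    events = []
    for y, (prow, crow) in enumerate(zip(prev_frame, cur_frame)):
        for x, (p, c) in enumerate(zip(prow, crow)):
            if abs(c - p) > threshold:
                events.append((x, y, 1 if c > p else -1))
    return events

prev = [[100, 100], [100, 100]]
cur  = [[100, 130], [80, 100]]
assert dvs_events(prev, cur) == [(1, 0, 1), (0, 1, -1)]
```

Because unchanged pixels generate no events, a moving input object is naturally distinguished from a static background, as the paragraph above notes.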

    [0031] In addition, when a transition of the input object 150 occurs between the first input space and the second input space, the information transfer controller 130 may transfer information related to the transition, between the first sensor 110 and the second sensor 120.

    [0032] For example, when the input object 150 slowly approaches from a position far from the reference surface 140, a transition of the input object 150 may occur from the second input space to the first input space.

    [0033] In this instance, when the input object 150 is input in the second input space, the apparatus 100 may sense the input object 150 using the second sensor 120. Conversely, when the input object 150 is input in the first input space, the apparatus 100 may sense the input object 150 using the first sensor 110. Accordingly, when the transition of the input object 150 occurs from the second input space to the first input space, the apparatus 100 may switch a sensor to be used from the second sensor 120 to the first sensor 110.

    [0034] Here, the information transfer controller 130 may transfer the information related to the transition to the first sensor 110 or to a controller for controlling the sensors. This information may be generated based on an output signal of the second sensor 120. The first sensor 110 may sense the input object 150 input in the first input space, based on the received information related to the transition.

    [0035] Accordingly, the apparatus 100 may provide technology for seamless switching between the sensors. For example, the information related to the transition may include information related to a position of the input object 150 sensed by the second sensor 120. The first sensor 110 may use the position of the input object 150 included in the information related to the transition, as an initial input position. In so doing, the apparatus 100 may sense the input object 150 continuously and seamlessly although a sensor to be used for sensing the input object 150 is changed.
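The seamless hand-over described in the preceding paragraphs can be sketched as follows. The class name, the 5 cm boundary, and the sensor labels are illustrative assumptions; the patent does not prescribe an implementation.

```python
# Sketch of the information transfer controller: on a transition between
# input spaces, the last sensed (x, y) position is handed to the newly
# activated sensor as its initial input position.

FIRST_RANGE_CM = 5.0  # assumed boundary between the two input spaces

class InformationTransferController:
    def __init__(self):
        self.active = "second"   # which sensor currently tracks the object
        self.last_xy = None      # most recent (x, y) from the active sensor

    def update(self, x, y, z):
        """Feed one sensed sample; return the sensor that should handle it."""
        target = "first" if z < FIRST_RANGE_CM else "second"
        if target != self.active:
            # Transition between input spaces: the last known position
            # becomes the initial input position of the new sensor.
            initial_position = self.last_xy  # transferred information
            self.active = target
        self.last_xy = (x, y)
        return self.active

ctrl = InformationTransferController()
assert ctrl.update(10, 20, 12.0) == "second"  # far: second sensor tracks
assert ctrl.update(11, 21, 4.0) == "first"    # crossed boundary: hand-over
```

Only the x- and y-coordinates are transferred, matching paragraph [0041]: the second sensor outputs 2D coordinates, so they are the common currency between the two input spaces.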

    [0036] Identical descriptions provided above may be applied to a case of a transition of the input object 150 from the first input space to the second input space, and thus will be omitted for conciseness and ease of description.

    [0037] FIG. 2 illustrates an output signal of a first sensor and an output signal of a second sensor according to example embodiments.

    [0038] Referring to FIG. 2, the first sensor may output an x-coordinate, a y-coordinate, and a z-coordinate of an input object 250 being sensed in a first input space 230. The second sensor may output an x-coordinate, and a y-coordinate of the input object 250 being sensed in a second input space 240.

    [0039] Here, the x-coordinate, the y-coordinate, and the z-coordinate may indicate positions at which the input object 250 is being sensed in an x-axial direction 211, a y-axial direction 212, and a z-axial direction 213, respectively, based on the origin present on a reference surface 220. For example, the origin may correspond to a point at an upper left edge of the reference surface 220.

    [0040] An apparatus for recognizing a proximity motion according to the present embodiments may process an input being sensed in the first input space 230 closest to the reference surface 220, using 3D coordinates, and process an input being sensed in the second input space 240 relatively far from the reference surface 220, using 2D coordinates.

    [0041] In this instance, information related to a transition of the input object 250 to be transferred by an information transfer controller when the transition occurs between the first input space 230 and the second input space 240 may include an x-coordinate and a y-coordinate of the input object 250 being sensed in a corresponding input space.

    [0042] The first sensor or the second sensor may receive the transferred x-coordinate and y-coordinate as an initial position of the input object 250, thereby enabling seamless switching between sensors.

    [0043] FIG. 3 illustrates a power saving mode of an apparatus for recognizing a proximity motion according to example embodiments.

    [0044] Referring to FIG. 3, the apparatus may selectively activate a first sensor and a second sensor based on a proximity between an input object 310 and a reference surface 320.

    [0045] Although not shown in the drawings, the apparatus may further include a proximity measurer, and a sensor controller.

    [0046] The proximity measurer may measure the proximity between the input object 310 and the reference surface 320. Here, the proximity refers to a measure indicating an extent of closeness between the input object 310 and the reference surface 320, and may include, for example, a shortest distance between the input object 310 and the reference surface 320, and the like.

    [0047] In this instance, the proximity measurer may be implemented using a third sensor distinct from the first sensor and the second sensor. For example, the proximity measurer may be implemented using an infrared (IR) sensor, or may be of the same type as the first sensor or the second sensor. According to example embodiments, the proximity measurer may be implemented using an output signal of the first sensor and an output signal of the second sensor.

    [0048] The sensor controller may selectively activate the first sensor and the second sensor, based on the measured proximity.

    [0049] For example, when the input object 310 is sensed in a first input space 330, the sensor controller may activate the first sensor and deactivate the second sensor. In addition, the sensor controller may activate the second sensor and deactivate the first sensor, when the input object 310 is sensed in a second input space 350.

    [0050] Accordingly, the apparatus may deactivate a sensor currently not in use, among sensors, thereby providing technology for reducing power consumption.

    [0051] In addition, when the input object 310 is sensed in a transition space 340 between the first input space 330 and the second input space 350, the sensor controller may operate the second sensor in a stand-by mode while the first sensor is activated.

    [0052] According to example embodiments, when the input object 310 is sensed in the transition space 340, the sensor controller may operate the first sensor in a stand-by mode while the second sensor is activated.

    [0053] Here, the stand-by mode refers to an operation mode distinct from an inactive mode, and may include, for example, a low-power operation mode in which a sensor may require a shorter time for being activated, and the like.
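The selective activation logic of FIG. 3 can be summarized in a small decision function. The distances and mode names below are illustrative assumptions, not values from the claims, and the variant of paragraph [0052] (roles swapped in the transition space) is equally valid.

```python
# Minimal sketch of the sensor controller: pick an operation mode for each
# sensor from the measured proximity, deactivating whichever is not in use.

FIRST_CM = 5.0        # first input space: closer than this
TRANSITION_CM = 7.0   # assumed outer edge of the transition space
SECOND_CM = 20.0      # second input space: up to this distance

def sensor_modes(proximity_cm):
    """Return (first_sensor_mode, second_sensor_mode)."""
    if proximity_cm < FIRST_CM:
        return ("active", "inactive")    # only the near-range sensor runs
    if proximity_cm < TRANSITION_CM:
        return ("active", "standby")     # far sensor kept ready to resume
    if proximity_cm <= SECOND_CM:
        return ("inactive", "active")    # only the far-range sensor runs
    return ("inactive", "inactive")      # nothing in range: save power

assert sensor_modes(3.0) == ("active", "inactive")
assert sensor_modes(6.0) == ("active", "standby")
assert sensor_modes(15.0) == ("inactive", "active")
```

The stand-by state is what buys the faster response: the dormant sensor does not burn full power, yet activates sooner than from the inactive state when the object crosses the boundary.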

    [0054] Accordingly, the apparatus may provide technology for reducing power consumption and technology for increasing a sensing response rate, simultaneously.

    [0055] FIG. 4 illustrates a proximity motion interface provided by an apparatus for recognizing a proximity motion according to example embodiments.

    [0056] Referring to FIG. 4, the apparatus may further include a display configured to provide a proximity motion interface. Here, a reference surface 440 may be disposed above the display.

    [0057] In particular, the apparatus may control the proximity motion interface for a size of an area being indicated by an input object to be changed, based on a proximity of the input object to the reference surface 440.

    [0058] For example, when an input object 410 is input in a second input space 460, the apparatus may obtain an x-coordinate and a y-coordinate of the input object 410, using a second sensor. The apparatus may display an area 415 being indicated by the input object 410 in a predetermined size, using the obtained x-coordinate and y-coordinate.

    [0059] In addition, when an input object 420 is input in a first input space 450, the apparatus may obtain an x-coordinate, a y-coordinate, and a z-coordinate of the input object 420 using a first sensor. In this instance, the apparatus may change a size of an area 425 being indicated by the input object 420, using the obtained x-coordinate and y-coordinate.

    [0060] In this instance, the apparatus may control the size of the area 425 to be changed based on the obtained z-coordinate. For example, the apparatus may extract a proximity between the input object 420 and the reference surface 440 based on the obtained z-coordinate, and control the proximity motion interface for the size of the area 425 to increase as the proximity between the input object 420 and the reference surface 440 increases.

    [0061] Accordingly, the apparatus may expand an area being indicated by an input object as a distance between the input object and a reference surface decreases, thereby providing technology for receiving an input of a more subtle pointing motion.

    [0062] In addition, the apparatus may determine whether an input object 430 is in contact with the reference surface 440. In this instance, the input object 430 may be input in the first input space 450, and the apparatus may obtain an x-coordinate, a y-coordinate, and a z-coordinate of the input object 430, using the first sensor. The apparatus may determine whether the input object 430 is in contact with the reference surface 440, based on the z-coordinate of the input object 430.

    [0063] When it is determined that the input object 430 is in contact with the reference surface 440, the apparatus may control the proximity motion interface for an area 435 being indicated by the input object 430 to be selected.

    [0064] In this instance, the proximity motion interface may include at least one icon. The apparatus may control the proximity motion interface for a size of an icon being indicated by an input object to increase as a proximity between the input object and a reference surface increases. In addition, when the input object is in contact with the reference surface, the apparatus may control the proximity motion interface for an icon being indicated by the input object to be selected.

    [0065] Further, the apparatus may perform a function corresponding to the selected icon. For example, in a case in which an icon being indicated by an input object corresponds to a call icon when the input object is in contact with a reference surface, the apparatus may perform a call function.
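The magnification and contact-selection behaviour described above (FIG. 4) can be sketched as two small functions. The linear scale formula, the 5 cm depth, and the contact tolerance are assumptions; the patent only requires that the indicated area grow as proximity increases and that contact trigger selection.

```python
# Illustrative sketch: the nearer the input object, the larger the area
# (or icon) it indicates; a z-coordinate near zero counts as contact.

FIRST_CM = 5.0   # assumed depth of the first input space
BASE_SIZE = 1.0  # size used while the object is in the second input space

def indicated_area_size(z_cm, max_scale=3.0):
    """Grow the area linearly from BASE_SIZE to max_scale as z falls to 0."""
    if z_cm >= FIRST_CM:
        return BASE_SIZE                  # second space: fixed size
    closeness = 1.0 - z_cm / FIRST_CM     # 0 at the boundary, 1 on contact
    return BASE_SIZE + (max_scale - BASE_SIZE) * closeness

def is_selected(z_cm, contact_eps=0.1):
    """Treat a z-coordinate within contact_eps of zero as a touch."""
    return z_cm <= contact_eps

assert indicated_area_size(10.0) == 1.0   # far away: no magnification
assert indicated_area_size(0.0) == 3.0    # on the surface: fully magnified
assert is_selected(0.05)
```

In the second input space only (x, y) are available, so the size stays fixed; the z-dependent scaling begins once the first sensor supplies 3D coordinates.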

    [0066] According to the present embodiments, the apparatus may provide user interface (UI) technology for facilitating an input motion in a space proximate to a reference surface.

    [0067] According to example embodiments, the apparatus may further include a typical touch sensor configured to sense a touch input provided on the reference surface 440, in addition to the first sensor configured to sense the first input space and the second sensor configured to sense the second input space. In this instance, the apparatus may activate the touch sensor when it is determined that the input object 430 is in contact with the reference surface 440.

    [0068] FIG. 5 illustrates an operation of an apparatus for recognizing a proximity motion according to example embodiments.

    [0069] Referring to FIG. 5, in operation 510, the apparatus may initialize a first sensor and a second sensor. Here, the apparatus may initialize operation modes of the first sensor and the second sensor to be active modes, based on predetermined initialization settings.

    [0070] In operation 520, the apparatus may determine whether an input object is present at a proximity distance. For example, the apparatus may determine whether the input object is sensed within a second distance at which an input object may be sensed by the second sensor.

    [0071] When it is determined, in operation 520, that the input object is present at the proximity distance, the apparatus may determine whether the input object is present within a predetermined proximity distance, in operation 530. For example, when the input object is sensed within the second distance, the apparatus may determine whether the input object is sensed within a first distance at which the input object may be sensed by the first sensor.

    [0072] In this example, the apparatus may maintain the active mode of the first sensor and switch the operation mode of the second sensor to a stand-by mode, in operation 540, when the input object is sensed within both the second distance and the first distance.

    [0073] In this instance, the input object may be sensed in a first input space, and the apparatus may control an interface, for example, a UI, for performing a pointing operation using an x-coordinate and a y-coordinate of the sensed input object and a magnifying operation using a z-coordinate of the sensed input object, in operation 550.

    [0074] Conversely, the apparatus may maintain the active mode of the second sensor, and switch the operation mode of the first sensor to a stand-by mode, in operation 560, when the input object is sensed within the second distance, whereas the input object is not sensed within the first distance.

    [0075] In this instance, the input object may be sensed in a second input space, and the apparatus may control the interface for performing a pointing operation using an x-coordinate and a y-coordinate of the sensed input object, in operation 570.

    [0076] Identical descriptions provided with reference to FIG. 4 may be applied to operations 550 and 570 and thus, a repeated description will be omitted for conciseness.

    [0077] In addition, the apparatus may execute an application UI corresponding to the received input, in operation 580. For example, the apparatus may activate a selected icon, or may perform an operation of switching between a 2D interface and a 3D interface.

    [0078] FIGS. 6A and 6B illustrate an operation of disposing an icon automatically by an apparatus for recognizing a proximity motion according to example embodiments.

    [0079] Referring to FIG. 6A, the apparatus may control a proximity motion interface for at least one predetermined icon to be disposed along a perimeter of a point being indicated by an input object.

    [0080] In particular, when an input object 620 is input in a second input space, the apparatus may obtain a point 621 being indicated by the input object 620, based on an output signal of a second sensor. Here, the second input space may be at a distance greater than or equal to a predetermined distance 622 from a reference surface 610. In this instance, the apparatus may provide an interface in which a plurality of icons 623 is disposed along a perimeter of the point 621 being indicated by the input object 620.

    [0081] In addition, when an input object 630 is input in a first input space, the apparatus may extract an axis and an endpoint of the input object 630, based on an output signal of a first sensor. For example, referring to FIG. 7, the apparatus may extract an axis 750 and an endpoint 740 from an image 730 of a sensed input object 720. Here, the axis 750 and the endpoint 740 may be disposed on a reference surface 710.

    [0082] In this instance, the apparatus may provide an interface in which a plurality of icons 633 is disposed along a perimeter of an area in which a display is obscured by the input object 630. For example, referring to FIG. 7, the apparatus may control a proximity motion interface for a plurality of icons to be disposed in a form of a sector 760, starting from the endpoint 740 in a direction opposite to the axis 750.

    [0083] Further, when the input object 630 is input in the first input space, the apparatus may control the proximity motion interface for a size of at least one predetermined icon to be changed based on a proximity 632 of the input object 630 and the reference surface 610. Identical descriptions provided with reference to FIG. 4 may be applied to the above operations and thus a repeated description will be omitted for conciseness.
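The sector arrangement of FIG. 7 can be sketched geometrically. Here the axis angle is assumed to point from the endpoint back along the input object (toward the hand), so the icons fan out in the opposite direction, over the unobscured part of the display; the radius and angular spread are illustrative.

```python
import math

# Sketch of arranging icons in a sector starting from the fingertip
# endpoint, in the direction opposite to the input object's axis.

def sector_positions(endpoint, axis_angle, n_icons,
                     radius=80.0, spread=math.pi / 2):
    """Place n_icons on an arc centred on the direction opposite the axis."""
    ex, ey = endpoint
    centre = axis_angle + math.pi  # away from the hand, so nothing is hidden
    if n_icons == 1:
        angles = [centre]
    else:
        step = spread / (n_icons - 1)
        angles = [centre - spread / 2 + i * step for i in range(n_icons)]
    return [(ex + radius * math.cos(a), ey + radius * math.sin(a))
            for a in angles]

# A finger whose axis runs back toward +x gets its icons fanned out on
# the -x side of the fingertip, where the display is not obscured.
pts = sector_positions((100.0, 100.0), 0.0, 3)
assert all(x < 100.0 for x, _ in pts)
```

In the second input space, where the display is not obscured, the simpler circular layout around the indicated point (icons 623, 642) can be produced by the same function with a full-circle spread.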

    [0084] Referring to FIG. 6B, the apparatus may be operated in an identical manner, when an input object having a mirror symmetry with the input object of FIG. 6A is input.

    [0085] In particular, when an input object 640 is input in a second input space, the apparatus may obtain a point 641 being indicated by the input object 640, based on an output signal of a second sensor. In this instance, the apparatus may provide an interface in which a plurality of icons 642 is disposed along a perimeter of the point 641 being indicated by the input object 640.

    [0086] In addition, when an input object 650 is input in a first input space, the apparatus may extract an axis and an endpoint of the input object 650, based on an output signal of a first sensor. In this instance, the apparatus may provide an interface in which a plurality of icons 651 is disposed along a perimeter of an area in which a display is obscured by the input object 650.

    [0087] Accordingly, when an input object is disposed at a distance greater than or equal to a predetermined distance from a reference surface 610, the apparatus may dispose a plurality of icons along a perimeter of a point being indicated by the object, irrespective of an axial direction of the input object, because the display may not be obscured by the input object.

    [0088] Conversely, when an input object is disposed proximate to the reference surface, the apparatus may dispose the plurality of icons along a perimeter of an area in which the display is obscured, in a line of sight of an axial direction of the input object, because a probability of the display being obscured by the input object may be high.

    [0089] FIGS. 8A and 8B illustrate a signal processing method for an operation of disposing an icon automatically by an apparatus for recognizing a proximity motion according to example embodiments.

    [0090] Referring to FIG. 8A, in operation 811, the apparatus may sense an input object. In operation 812, the apparatus may perform signal processing for extracting an axis and an endpoint of the sensed input object. In operation 813, the apparatus may extract the axis and the endpoint of the input object, based on a result of the signal processing. Operations 811 through 813 will be further described in detail with reference to FIG. 8B. In operation 814, the apparatus may dispose at least one predetermined icon, based on a proximity distance between the input object and a reference surface, the extracted axis, and the extracted endpoint.

    [0091] Referring to FIG. 8B, the apparatus may perform image sensing of an input object in operation 820, a subtraction for removing a background excluding the input object in operation 830, high-pass filtering for indicating an outline of the input object in operation 840, an amplification for defining the outline in operation 850, thresholding for removing an outlier excluding the input object in operation 860, a search for a top of a region to find a point corresponding to a fingertip in operation 870, and an output of a result in operation 880, thereby extracting an axis and an endpoint of the input object.

    [0092] In particular, the apparatus may acquire an input image through the image sensing performed in operation 820. For example, the apparatus may acquire a depth image of the input object as the input image. The apparatus may extract an object 831 from the input image through the subtraction performed in operation 830. For example, the apparatus may distinguish between an object and a background in the depth image, and extract a portion corresponding to the object from the depth image. A method of distinguishing between an object and a background in a depth image may be implemented using various methods. For example, the apparatus may use a threshold depth that distinguishes between an object and a background. The apparatus may classify a pixel having a depth less than or equal to the threshold depth as an object, and classify a pixel having a depth greater than the threshold depth as a background.
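The threshold-depth scheme of paragraph [0092] could be sketched as follows (a minimal sketch in Python over a depth image represented as nested lists; real implementations may use adaptive segmentation instead of a fixed threshold):

```python
def subtract_background(depth_image, threshold_depth):
    """Classify each pixel as object (depth <= threshold) or background.

    depth_image: 2D list of depth values. Returns a same-shaped boolean
    mask where True marks object pixels, per the classification rule
    described in paragraph [0092].
    """
    return [[d <= threshold_depth for d in row] for row in depth_image]
```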

    [0093] The apparatus may obtain an outline 841 of the input object through the high-pass filtering performed in operation 840. For example, the apparatus may extract pixels of which depths are different from neighboring pixels by at least a predetermined amount. The outline 841 extracted by the high-pass filtering may include outlines of the thumb and the four fingers. The apparatus may define the outline 841 of the input object through the amplification performed in operation 850. For example, the apparatus may amplify values of pixels included inside the outline 841 of the input object. When the input image corresponds to a depth image, depths of pixels in the depth image may be amplified. Hereinafter, a depth of a pixel expressed using a relatively bright color may be less than a depth of a pixel expressed using a relatively dark color. For example, a depth of a pixel included in an area 851 expressed using a relatively bright color may be less than a depth of a pixel included in an area 852 expressed using a relatively dark color.
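The outline extraction of operation 840 — marking pixels whose depth differs from a neighboring pixel by at least a predetermined amount — might be sketched as below (a simplified 4-neighbour stand-in; the exact filter kernel used by the apparatus is not specified):

```python
def outline_pixels(depth, min_diff):
    """Mark pixels whose depth differs from any 4-neighbour by >= min_diff.

    depth: 2D list of depth values; returns a same-shaped boolean mask
    where True marks outline pixels.
    """
    h, w = len(depth), len(depth[0])
    outline = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    if abs(depth[y][x] - depth[ny][nx]) >= min_diff:
                        outline[y][x] = True
    return outline
```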

    [0094] The apparatus may remove an outlier excluding the input object, through the thresholding performed in operation 860. For example, the apparatus may remove pixels having depths greater than the threshold depth from among pixels of the provided depth image. The apparatus may identify a point corresponding to a fingertip through the search for the top of the region performed in operation 870. The apparatus may generate a rectangular model 871 surrounding a finger, with respect to each of the thumb and the four fingers. The rectangular model 871 may have a height h and a width w.

    [0095] The apparatus may search for a pixel at a topmost point, for example, a pixel having a least depth, in a single end area of the rectangular model 871. The apparatus may search for a pixel 873 having a least depth in a rectangular area 872 located at a single end of the rectangular model 871 and of which four sides are of a length h. The found pixel 873 may correspond to an endpoint of the input object. In addition, a line segment which extends in a direction of the width of the rectangular model 871 based on the endpoint may correspond to an axis of the input object. The apparatus may output at least one pair of the endpoint and the axis of the input object, through the result output performed in operation 880. For example, the apparatus may output a pair of an endpoint and an axis, with respect to each of the thumb and the four fingers.
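The endpoint search of paragraph [0095] — finding the least-depth pixel in the h-by-h square at one end of a finger's rectangular model — could look like this (a minimal sketch; the coordinate convention and the assumption that the fingertip lies at the x+ end of the box are illustrative):

```python
def find_fingertip(depth, box):
    """Find the endpoint (least-depth pixel) in the end area of a finger box.

    depth: 2D list indexed as depth[y][x].
    box: (x, y, w, h), a rectangular model surrounding one finger; the
    search area is the h-by-h square at the box's x+ end.
    Returns the (x, y) coordinate of the endpoint.
    """
    x, y, w, h = box
    best = None  # (depth value, (px, py))
    for py in range(y, y + h):
        for px in range(x + w - h, x + w):  # h-wide end strip of the box
            d = depth[py][px]
            if best is None or d < best[0]:
                best = (d, (px, py))
    return best[1]
```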

    [0096] FIGS. 9A through 10 illustrate an operation of an apparatus for recognizing a proximity motion when a plurality of input objects is sensed according to example embodiments.

    [0097] Although not shown in the drawings, the apparatus may further include a plurality of input sensing units, and an input selector.

    [0098] The plurality of input sensing units may determine whether a plurality of input objects is sensed, based on at least one of an output signal of a first sensor and an output signal of a second sensor.

    [0099] In addition, the input selector may select at least one of the plurality of input objects, based on a predetermined mode, when the plurality of input sensing units determines that the plurality of input objects is sensed.

    [0100] For example, referring to FIG. 9A, the apparatus may be operated in a mode in which a plurality of inputs is received. In this instance, the apparatus may process a plurality of inputs provided by a first input object 920 and a second input object 930.

    [0101] In particular, the apparatus may dispose first icons 922 along a perimeter of a point 921 being indicated by the first input object 920 at a distance greater than or equal to a predetermined distance from a reference surface 910. Simultaneously, the apparatus may dispose second icons 932 along a perimeter of an area obscured by the second input object 930, rather than a point 931 being indicated by the second input object 930 proximate to the reference surface 910.

    [0102] Referring to FIG. 9B, according to example embodiments, the apparatus may be operated in a mode in which an input object most proximate to the reference surface is selected. In this instance, the apparatus may not dispose icons along a perimeter of a point 941 being indicated by a first input object 940 located relatively far from the reference surface 910. The apparatus may select a second input object 950 most proximate to the reference surface 910, and dispose predetermined icons 952 along a perimeter of an area obscured by the second input object 950.

    [0103] Depending on a case, the second input object 950 most proximate to the reference surface 910 may be at a distance greater than or equal to a predetermined distance from the reference surface 910. In this instance, the apparatus may dispose the predetermined icons 952 along a perimeter of a point 951 being indicated by the second input object 950.

    [0104] Referring to FIG. 9C, according to example embodiments, the apparatus may be operated in a mode in which an input object is selected when a position being indicated by the input object is most proximate to a predetermined position above the reference surface.

    [0105] For example, the apparatus may select an input object indicating a position most proximate to a center 980 of the reference surface 910. A first input object 960 may indicate a position 961 at a distance 962 from the center 980 of the reference surface 910, and a second input object 970 may indicate a position 971 at a distance 972 from the center 980 of the reference surface 910. In this instance, the apparatus may select the second input object 970 indicating the position 971 more proximate to the center 980 of the reference surface 910.

    [0106] The apparatus may dispose at least one predetermined icon 973, based on a proximity between the selected second input object 970 and the reference surface 910.
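The three selection modes of FIGS. 9A through 9C can be sketched as follows (an illustrative Python sketch; the dictionary fields, mode names, and `target` default are assumptions, not terms from the specification):

```python
import math

def select_input_object(objects, mode, target=(0.0, 0.0)):
    """Select input object(s) according to the predetermined mode.

    objects: list of dicts with 'distance' (proximity to the reference
    surface) and 'point' ((x, y) indicated on the surface).
    """
    if mode == "multi":               # FIG. 9A: process every input
        return objects
    if mode == "nearest_surface":     # FIG. 9B: most proximate to surface
        return [min(objects, key=lambda o: o["distance"])]
    if mode == "nearest_position":    # FIG. 9C: indicated point closest
        return [min(objects,          # to a predetermined position
                    key=lambda o: math.dist(o["point"], target))]
    raise ValueError(mode)
```

In the FIG. 9C example, `target` would be the center 980 of the reference surface 910.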

    [0107] Referring to FIG. 10, in operation 1010, the apparatus may sense a plurality of input objects. In operation 1020, the apparatus may select at least one input object, based on a predetermined mode. In operation 1030, the apparatus may perform signal processing for extracting an axis and an endpoint of the selected input object. In operation 1040, the apparatus may extract the axis and the endpoint of the selected input object, based on a result of the signal processing. In operation 1050, the apparatus may dispose at least one predetermined icon, based on at least one of the extracted axis, the extracted endpoint, a distance between the selected input object and a reference surface, and a distance between the selected input object and a predetermined position on the reference surface, for example.

    [0108] Identical descriptions provided with reference to FIGS. 1 through 9C may be applied to the operations of FIG. 10 and thus a repeated description will be omitted for conciseness.

    [0109] FIG. 11 illustrates an apparatus 1100 for recognizing a proximity motion that may recognize an input pattern according to example embodiments.

    [0110] Referring to FIG. 11, the apparatus 1100 may include a sensor 1110, an input pattern recognizer 1120, a function performer 1130, and a display 1140.

    [0111] The sensor 1110 may include a first sensor and a second sensor. The input pattern recognizer 1120 may recognize an input pattern, based on at least one of an output signal of the first sensor and an output signal of the second sensor.

    [0112] Here, the input pattern recognizer 1120 may track a movement of an input object, thereby sensing a change in at least one of a number of proximity motion points, a direction of a proximity motion, and proximity coordinates, for example.

    [0113] The function performer 1130 may perform a function corresponding to the input pattern recognized by the input pattern recognizer 1120. Here, the function performer 1130 may determine a function corresponding to the input pattern differently, based on a type of an application currently being executed. For example, although identical input patterns are input, the apparatus may perform different functions, based on the type of the application currently being executed.

    [0114] The input pattern recognizer 1120 may calculate at least one of a velocity and an angular velocity of the input object, based on at least one of the output signal of the first sensor and the output signal of the second sensor.

    [0115] For example, the input pattern recognizer 1120 may track a change in a position of the input object, based on at least one of the output signal of the first sensor and the output signal of the second sensor. In this instance, the input pattern recognizer 1120 may calculate the velocity or the angular velocity of the input object, using a value of the change in the position of the input object.
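The finite-difference estimate described in paragraph [0115] might be sketched as follows (a minimal two-sample sketch; the angular velocity is taken about the origin, and the recognizer may in practice filter over more samples):

```python
import math

def motion_from_track(p0, p1, dt):
    """Estimate speed and angular velocity from two tracked positions.

    p0, p1: (x, y) positions of the input object at consecutive samples.
    dt: sampling interval in seconds.
    Returns (speed, angular_velocity).
    """
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    speed = math.hypot(dx, dy) / dt
    # Angular velocity about the origin, from the change in bearing.
    angle0 = math.atan2(p0[1], p0[0])
    angle1 = math.atan2(p1[1], p1[0])
    angular_velocity = (angle1 - angle0) / dt
    return speed, angular_velocity
```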

    [0116] In addition, the function performer 1130 may detect a function corresponding to the input pattern, based on the velocity or the angular velocity of the input object calculated by the input pattern recognizer 1120.

    [0117] For example, the function performer 1130 may utilize the velocity or the angular velocity of the input object calculated by the input pattern recognizer 1120, as information required for various UIs, such as a speed at which cards are shuffled and a rotational speed of a roulette wheel, for example.

    [0118] In addition, the apparatus 1100 may output a result of the function performed by the function performer 1130, using the display 1140.

    [0119] FIGS. 12A and 12B illustrate an operation of switching between a 2D interface and a 3D interface by an apparatus for recognizing a proximity motion according to example embodiments.

    [0120] Although not shown in the drawings, the apparatus may further include a display, a calculator, and a display controller.

    [0121] The display may provide a 2D interface and a 3D interface. The calculator may calculate at least one of a position, a velocity, and an angular velocity of an input object, based on an output signal of a first sensor and an output signal of a second sensor. The display controller may control an operation mode of the display, based on the calculation.

    [0122] For example, referring to FIG. 12A, the apparatus may sense an input object input in a space at a predetermined distance from a reference surface 1210. When no input object is sensed in the corresponding space, the apparatus may display a plurality of icons 1220 using the 2D interface.

    [0123] When an input object 1240 is sensed in the corresponding space, the apparatus may switch from the 2D interface to the 3D interface. In this instance, the apparatus may display a plurality of icons 1230 using the 3D interface.

    [0124] Referring to FIG. 12B, according to example embodiments, when no input object is sensed in a space at a predetermined distance from a reference surface 1250, the apparatus may display a plurality of icons 1260 using the 3D interface.

    [0125] When an input object 1280 is sensed in the corresponding space, the apparatus may switch from the 3D interface to the 2D interface. In this instance, the apparatus may display a plurality of icons 1270 using the 2D interface.
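Both variants of FIGS. 12A and 12B amount to toggling the display mode when an object enters the watched space (a minimal sketch; the mode strings are illustrative assumptions):

```python
def next_display_mode(current_mode, object_in_near_space):
    """Toggle between '2d' and '3d' when an object enters the watched space.

    FIG. 12A starts in '2d' and switches to '3d' on detection;
    FIG. 12B starts in '3d' and switches to '2d' on detection.
    """
    if not object_in_near_space:
        return current_mode  # nothing sensed: keep the current interface
    return "3d" if current_mode == "2d" else "2d"  # switch on detection
```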

    [0126] The method according to the above-described embodiments may be recorded in non-transitory computer-readable media including program instructions to implement various operations embodied by a computer. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like. The program instructions recorded on the media may be those specially designed and constructed for the purposes of embodiments, or they may be of the kind well-known and available to those having skill in the computer software arts. Examples of non-transitory computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM discs and DVDs; magneto-optical media such as optical discs; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like. The computer-readable media may also be a distributed network, so that the program instructions are stored and executed in a distributed fashion. The program instructions may be executed by one or more processors. The computer-readable media may also be embodied in at least one application specific integrated circuit (ASIC) or Field Programmable Gate Array (FPGA), which executes (processes like a processor) program instructions. Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described embodiments, or vice versa.

    [0127] Although embodiments have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles of the disclosure, the scope of which is defined by the claims.


    Claims

    1. An apparatus (100) for recognizing a proximity motion using sensors (110, 120), the apparatus comprising:

    a first sensor (110);

    a second sensor (120) configured to sense an input space at a distance farther than a predetermined distance (141) from a reference surface (140); and

    an information transfer controller to transfer information related to a transition of an input object (150), between the first sensor (110) and the second sensor (120), in response to the transition of the input object (150) occurring between a first input space and a second input space, wherein the information related to the transition of the input object when the transition occurs between the first input space and the second input space comprises a coordinate of the input object being sensed in a corresponding input space;

    wherein the first sensor (110) is configured to sense the first input space at a distance closer than the predetermined distance (141) from the reference surface (140) for recognizing a proximity motion, and wherein the second input space is sensed by the second sensor at a distance farther than the predetermined distance (141) from the reference surface (140), and

    characterized in that the second sensor (120) is configured to sense the input object by sensing a change in light with respect to the input object over time by sensing a change in an intensity of light in each pixel of the second sensor (120).


     
    2. The apparatus of claim 1, further comprising:

    a proximity measurer to measure a proximity between the input object and the reference surface; and

    a sensor controller to selectively activate the first sensor and the second sensor, based on the measured proximity,

    wherein the sensor controller preferably comprises:

    a proximity comparator to compare the proximity to the predetermined distance; and

    an operation mode controller to control an operation mode of the first sensor (110) and an operation mode of the second sensor (120), based on the comparison,

    wherein the operation mode comprises at least one of an active mode, an inactive mode, and a stand-by mode.


     
    3. The apparatus of claim 1 or 2, wherein the information related to the transition comprises an x-coordinate and a y-coordinate of the input object being sensed in at least one of the first input space and the second input space,
    wherein preferably:

    the first sensor (110) outputs an x-coordinate, a y-coordinate, and a z-coordinate of the input object (150) being sensed in the first input space, and

    the second sensor (120) outputs an x-coordinate, and a y-coordinate of the input object (150) being sensed in the second input space.


     
    4. The apparatus of any of claims 1-3, wherein:

    the first sensor (110) comprises a touch sensor, and

    the second sensor (120) comprises a vision sensor.


     
    5. The apparatus of any of claims 1-4, further comprising:

    a display to display a proximity motion interface; and

    an interface controller to control the proximity motion interface for a size of an area being indicated by a first input object to be changed based on a proximity between the first input object and the reference surface, when the first input object is input in the first input space,

    wherein:

    the proximity motion interface comprises at least one icon, and

    the interface controller controls the proximity motion interface for a size of an icon being indicated by the first input object to increase as the proximity between the first input object (150) and the reference surface (140) increases, and controls the proximity motion interface for the icon being indicated by the first input object to be selected when the first input object (150) is in contact with the reference surface (140).


     
    6. The apparatus of any of claims 1-5, further comprising:

    a display to display a proximity motion interface; and

    an interface controller to control the proximity motion interface for at least one predetermined icon to be disposed along a perimeter of a point being indicated by a second input object, when the second input object is input in the second input space,

    further preferably comprising:

    a signal processor to extract an axis and an endpoint of the first input object (150) based on an output signal of the first sensor (110), when the first input object (150) is input in the first input space,

    wherein the interface controller controls, using the axis and the endpoint of the first input object (150), the proximity motion interface for the at least one predetermined icon to be disposed along a perimeter of an area in which the display is obscured by the first input object (150), when the first input object (150) is input in the first input space,

    wherein the interface controller preferably controls the proximity motion interface for a size of the at least one predetermined icon to be changed based on the proximity between the first input object (150) and the reference surface (140).


     
    7. The apparatus of any of claims 1-6, further comprising:

    a plurality of input sensing units to determine whether a plurality of input objects is sensed, based on at least one of an output signal of the first sensor (110) and an output signal of the second sensor (120); and

    an input selector to select at least one of a plurality of input objects based on a predetermined mode, when the plurality of input sensing units determines that the plurality of input objects is sensed,

    wherein the predetermined mode preferably comprises a mode in which a plurality of inputs is received, a mode in which an input object most proximate to the reference surface is selected, and a mode in which an input object is selected when a position being indicated by the input object is most proximate to a predetermined position above the reference surface, and

    further preferably comprising:

    a display to display a proximity motion interface;

    a signal processor to extract an axis and an endpoint of at least one input object based on an output signal of the first sensor, when the at least one input object is sensed in the first input space; and

    an interface controller to control the proximity motion interface for at least one predetermined icon to be disposed along a perimeter of a point being indicated by at least one input object, when the at least one input object is sensed in the second input space, and to control, using the axis and the endpoint of the at least one input object, the proximity motion interface for the at least one predetermined icon to be disposed along a perimeter of an area in which the display is obscured by the at least one input object, when the at least one input object is sensed in the first input space,

    wherein the interface controller preferably controls the proximity motion interface for a size of at least one predetermined icon to be changed based on a proximity between at least one input object and the reference surface.


     
    8. The apparatus of any of claims 1-7, further comprising:

    an input pattern recognizer to recognize an input pattern based on at least one of an output signal of the first sensor and an output signal of the second sensor; and

    a function performer to perform a function corresponding to the input pattern,

    wherein preferably:
    the input pattern recognizer comprises:

    a calculator to calculate at least one of a velocity and an angular velocity of the input object, based on at least one of the output signal of the first sensor and the output signal of the second sensor, and

    the function performer comprises:
    a detector to detect a function corresponding to the input pattern, based on a result of the calculating.


     
    9. The apparatus of any of claims 1-8, further comprising:

    a display to provide a two-dimensional (2D) interface and a three-dimensional (3D) interface;

    a calculator to calculate at least one of a position, a velocity, and an angular velocity of the input object, based on at least one of an output signal of the first sensor and an output signal of the second sensor; and

    a display controller to control an operation mode of the display based on a result of the calculating.


     
    10. A method of recognizing a proximity motion using sensors (110, 120), the method comprising:

    sensing using a first sensor (110);

    sensing, using a second sensor (120), a second input space when an input object (150) is present in a second input space at a distance farther than a predetermined distance (141) from a reference surface; and

    transferring information related to a transition of the input object (150), between the first sensor (110) and the second sensor (120) in response to the transition of the input object (150) occurring between a first input space and a second input space, wherein the information related to the transition of the input object when the transition occurs between the first input space and the second input space comprises a coordinate of the input object being sensed in a corresponding input space;

    the method further comprises sensing, using the first sensor (110), the first input space when the input object (150) is present in the first input space at a distance closer than the predetermined distance (141) from the reference surface (140) for recognizing a proximity motion, and wherein the second input space is sensed by the second sensor at a distance farther than the predetermined distance (141) from the reference surface (140), and

    characterized in that sensing the input object using the second sensor (120) is performed by sensing a change in light with respect to the input object over time by sensing a change in an intensity of light in each pixel of the second sensor (120).


     
    11. The method of claim 10, further comprising:

    measuring a proximity between the input object (150) and the reference surface (140); and

    selectively activating the first sensor (110) and the second sensor (120), based on the measured proximity,

    wherein the information related to the transition preferably comprises an x-coordinate and a y-coordinate of the input object (150) being sensed in at least one of the first input space and the second input space,

    wherein preferably:

    the first sensor (110) outputs an x-coordinate, a y-coordinate, and a z-coordinate of the input object (150) being sensed in the first input space, and

    the second sensor (120) outputs an x-coordinate and a y-coordinate of the input object (150) being sensed in the second input space.


     
    12. The method of claim 10 or 11, wherein:

    the first sensor (110) comprises a touch sensor, and

    the second sensor (120) comprises a vision sensor.


     
    13. The method of claim 10, 11 or 12, further comprising:

    controlling a proximity motion interface for a size of an area being indicated by a first input object to be changed based on a proximity between the first input object and the reference surface, when the first input object is input in the first input space; and

    displaying the proximity motion interface,

    wherein preferably:

    the proximity motion interface comprises at least one icon, and

    the controlling comprises:

    controlling the proximity motion interface for a size of an icon being indicated by the first input object (150) to increase as the proximity between the first input object (150) and the reference surface (140) increases; and

    controlling the proximity motion interface for the icon being indicated by the first input object (150) to be selected when the first input object (150) is in contact with the reference surface (140).


     
    14. The method of any of claims 10-13, being performed by an apparatus according to any of claims 1-9.
     
    15. A non-transitory computer-readable medium comprising a program for instructing a computer to perform the method of any of claims 10-14.
     


    Ansprüche

    1. Vorrichtung (100) zur Erkennung von Bewegung in der Nähe mit Hilfe von Sensoren (110, 120), wobei die Vorrichtung Folgendes umfasst:

    einen ersten Sensor (110);

    einen zweiten Sensor (120), der zum Erfassen eines Eingaberaumes in einer Entfernung, die größer als eine vorbestimmte Entfernung (141) von einer Referenzfläche (140) ist, konfiguriert ist; und

    eine Informationsübertragungssteuerung zum Übertragen von Informationen in Verbindung mit einem Übergang eines Eingabeobjektes (150), zwischen dem ersten Sensor (110) und dem zweiten Sensor (120), als Reaktion auf den Übergang des Eingabeobjektes (150), der zwischen einem ersten Eingaberaum und einem zweiten Eingaberaum stattfindet, wobei die Informationen in Verbindung mit dem Übergang des Eingabeobjektes, wenn der Übergang zwischen dem ersten Eingaberaum und dem zweiten Eingaberaum stattfindet, eine Koordinate des in einem entsprechenden Eingaberaum erfassten Eingabeobjektes umfassen;

    wobei der erste Sensor (110) dazu konfiguriert ist, den ersten Eingaberaum in einer Entfernung zu erfassen, die kürzer als die vorbestimmte Entfernung (141) von der Referenzfläche (140) ist, um eine Bewegung in der Nähe zu erkennen, und wobei der zweite Eingaberaum von dem zweiten Sensor in einer Entfernung erfasst wird, die größer als die vorbestimmte Entfernung (141) von der Referenzfläche (140) ist, und wobei die Vorrichtung dadurch gekennzeichnet ist, dass der zweite Sensor (120) dazu konfiguriert ist, das Eingabeobjekt durch Erfassen einer Lichtveränderung in Bezug auf das Eingabeobjekt im Laufe der Zeit durch Erfassen einer Veränderung bei der Lichtintensität in jedem Pixel des zweiten Sensors (120) zu erfassen.


     
    2. Vorrichtung nach Anspruch 1, die weiterhin Folgendes umfasst:

    eine Nähemesseinrichtung zum Messen einer Nähe zwischen dem Eingabeobjekt und der Referenzfläche; und

    eine Sensorsteuerung zum wahlweisen Aktivieren des ersten Sensors und des zweiten Sensors basierend auf der gemessen Nähe,

    wobei die Sensorsteuerung vorzugsweise Folgendes umfasst:

    eine Nähevergleichseinrichtung zum Vergleichen der Nähe mit der vorbestimmten Entfernung; und

    eine Betriebsmodussteuerung zum Steuern eines Betriebsmodus des ersten Sensors (110) und eines Betriebsmodus des zweiten Sensors (120), und zwar basierend auf dem Vergleich,

    wobei der Betriebsmodus einen aktiven Modus, einen inaktiven Modus und/oder einen Bereitschaftsmodus umfasst.


     
    3. Vorrichtung nach Anspruch 1 oder 2, wobei die Informationen in Verbindung mit dem Übergang eine x-Koordinate und eine y-Koordinate des in dem ersten Eingaberaum und/oder dem zweiten Eingaberaum erfassten Eingabeobjektes umfassen,
    wobei vorzugsweise:

    der erste Sensor (110) eine x-Koordinate, eine y-Koordinate und eine z-Koordinate des in dem ersten Eingaberaum erfassten Eingabeobjektes (150) ausgibt, und

    der zweite Sensor (120) eine x-Koordinate und eine y-Koordinate des in dem zweiten Eingaberaum erfassten Eingabeobjektes (150) ausgibt.


     
    4. Vorrichtung nach einem der Ansprüche 1 bis 3, wobei:

    der erste Sensor (110) einen Berührungssensor umfasst, und

    der zweite Sensor (120) einen Bildsensor umfasst.


     
    5. Vorrichtung nach einem der Ansprüche 1 bis 4, die weiterhin Folgendes umfasst:

    eine Anzeige zum Anzeigen einer Schnittstelle zur Erkennung einer Bewegung in der Nähe; und

    eine Schnittstellensteuerung zum Steuern der Schnittstelle zur Erkennung einer Bewegung in der Nähe derart, dass die Größe eines durch ein erstes Eingabeobjekt angegebenen Bereichs basierend auf einer Nähe zwischen dem ersten Eingabeobjekt und der Referenzfläche geändert wird, wenn das erste Eingabeobjekt in den ersten Eingaberaum eingegeben wird,

    wobei:

    die Schnittstelle zur Erkennung einer Bewegung in der Nähe wenigstens ein Symbol umfasst, und

    die Schnittstellensteuerung die Schnittstelle zur Erkennung einer Bewegung in der Nähe derart steuert, dass die Größe eines durch das erste Eingabeobjekt angegebenen Symbols sich vergrößert, wenn die Nähe zwischen dem ersten Eingabeobjekt (150) und der Referenzfläche (140) sich vergrößert, und die Schnittstelle zur Erkennung einer Bewegung in der Nähe derart steuert, dass das durch das erste Eingabeobjekt angegebene Symbol ausgewählt wird, wenn das erste Eingabeobjekt (150) mit der Referenzfläche (140) in Berührung ist.


     
    6. Apparatus according to any one of claims 1 to 5, further comprising:

    a display to display an interface for recognizing a proximity motion; and

    an interface controller to control the interface for recognizing the proximity motion so that at least one predetermined icon is arranged along a circumference of a point indicated by a second input object, when the second input object is input in the second input space,

    preferably further comprising:

    a signal processor to extract an axis and an end point of the first input object (150) based on an output signal of the first sensor (110), when the first input object (150) is input in the first input space,

    wherein the interface controller controls, using the axis and the end point of the first input object (150), the interface for recognizing the proximity motion so that

    the at least one predetermined icon is arranged along a circumference of an area in which the display is occluded by the first input object (150), when the first input object (150) is input in the first input space, and

    wherein the interface controller preferably controls the interface for recognizing the proximity motion so that a size of the at least one predetermined icon is changed based on the proximity between the first input object (150) and the reference surface (140).


     
    7. Apparatus according to any one of claims 1 to 6, further comprising:

    a plurality of input sensing units to determine whether a plurality of input objects is sensed, based on an output signal of the first sensor (110) and/or an output signal of the second sensor (120); and

    an input selector to select at least one of a plurality of input objects based on a predetermined mode, when the plurality of input sensing units determines that the plurality of input objects is sensed,

    wherein the predetermined mode preferably comprises: a mode in which a plurality of inputs is sensed, a mode in which an input object closest to the reference surface is selected, and a mode in which an input object is selected when a position indicated by the input object is closest to a predetermined position above the reference surface, and

    preferably further comprising:

    a display to display an interface for recognizing a proximity motion;

    a signal processor to extract an axis and an end point of at least one input object based on an output signal of the first sensor, when the first input object is input in the first input space; and

    an interface controller to control the interface for recognizing the proximity motion so that at least one predetermined icon is arranged along a circumference of a point indicated by at least one input object, when the at least one input object is detected in the second input space, and to control, using the axis and the end point of the first input object, the interface for recognizing the proximity motion so that the at least one predetermined icon is arranged along a circumference of an area in which the display is occluded by the at least one input object, when the at least one input object is detected in the first input space, wherein the interface controller preferably controls the interface for recognizing the proximity motion so that a size of the at least one predetermined icon is changed based on the proximity between at least one input object and the reference surface.


     
    8. Apparatus according to any one of claims 1 to 7, further comprising:

    an input pattern recognizer to recognize an input pattern based on an output signal of the first sensor and/or an output signal of the second sensor; and

    a function performer to perform a function corresponding to the input pattern,

    wherein preferably:
    the input pattern recognizer comprises:

    a calculator to calculate a velocity and/or an angular velocity of the input object based on the output signal of the first sensor and/or the output signal of the second sensor, and

    the function performer comprises:
    a detector to detect a function corresponding to the input pattern based on a result of the calculation.


     
    9. Apparatus according to any one of claims 1 to 8, further comprising:

    a display to provide a two-dimensional (2D) interface and a three-dimensional (3D) interface;

    a calculator to calculate a position, a velocity, and/or an angular velocity of the input object based on the output signal of the first sensor and/or the output signal of the second sensor; and

    a display controller to control an operating mode of the display based on a result of the calculation.


     
    10. Method of recognizing a proximity motion using sensors (110, 120), the method comprising:

    sensing, using a first sensor (110);

    sensing, using a second sensor (120), a second input space when an input object (150) is present in the second input space at a distance greater than a predetermined distance (141) from the reference surface (140); and

    transferring information associated with a transition of the input object (150), between the first sensor (110) and the second sensor (120), in response to the transition of the input object (150) occurring between a first input space and a second input space, wherein the information associated with the transition of the input object comprises, when the transition occurs between the first input space and the second input space, a coordinate of the input object detected in a corresponding input space;

    the method further comprising: sensing, using the first sensor (110), the first input space when the input object (150) is present in the first input space at a distance less than the predetermined distance (141) from the reference surface (140), to recognize a proximity motion, and wherein the second input space is sensed by the second sensor at a distance greater than the predetermined distance (141) from the reference surface (140), the method being characterized in that

    the sensing of the input object using the second sensor (120) is performed by detecting a change in light over time with respect to the input object, by detecting a change in light intensity in each pixel of the second sensor (120).


     
    11. Method according to claim 10, further comprising:

    measuring a proximity between the input object (150) and the reference surface (140); and

    selectively activating the first sensor (110) and the second sensor (120) based on the measured proximity,

    wherein the information associated with the transition preferably comprises an x coordinate and a y coordinate of the input object (150) detected in the first input space and/or the second input space,

    wherein preferably:

    the first sensor (110) outputs an x coordinate, a y coordinate, and a z coordinate of the input object (150) detected in the first input space, and

    the second sensor (120) outputs an x coordinate and a y coordinate of the input object (150) detected in the second input space.


     
    12. Method according to claim 10 or 11, wherein:

    the first sensor (110) comprises a touch sensor, and

    the second sensor (120) comprises a vision sensor.


     
    13. Method according to claim 10, 11 or 12, further comprising:

    controlling an interface for recognizing a proximity motion so that a size of an area indicated by a first input object is changed based on a proximity between the first input object and the reference surface, when the first input object is input in the first input space; and

    displaying the interface for recognizing the proximity motion,

    wherein preferably:

    the interface for recognizing the proximity motion comprises at least one icon, and

    the controlling comprises:

    controlling the interface for recognizing the proximity motion so that a size of an icon indicated by the first input object (150) increases as the proximity between the first input object (150) and the reference surface (140) increases, and

    controlling the interface for recognizing the proximity motion so that the icon indicated by the first input object (150) is selected when the first input object (150) is in contact with the reference surface (140).


     
    14. Method according to any one of claims 10 to 13, performed by an apparatus according to any one of claims 1 to 9.
     
    15. Non-transitory computer-readable medium comprising a program to instruct a computer to perform the method according to any one of claims 10 to 14.
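The two-space sensing and sensor hand-over recited in claims 1, 2 and 11 can be sketched as a small state machine. This is an illustrative model only: the class name `SensorController`, the method `update`, and the numeric threshold are invented for this sketch and are not part of the patent, which defines behavior rather than an API.

```python
# Illustrative sketch of the two-space sensor hand-over (claims 1, 2, 11).
# All names and values here are hypothetical assumptions for illustration.

THRESHOLD = 4.0  # predetermined distance (141) from the reference surface, in cm

class SensorController:
    """Selectively activates the touch sensor (first input space, z < THRESHOLD)
    or the vision sensor (second input space, z >= THRESHOLD), and transfers the
    last known coordinate of the input object when a transition occurs between
    the two input spaces."""

    def __init__(self):
        self.active = None           # "touch" or "vision"
        self.last_coordinate = None  # (x, y) handed over on transition

    def update(self, proximity, x, y):
        # Compare the measured proximity with the predetermined distance
        target = "touch" if proximity < THRESHOLD else "vision"
        transition = self.active is not None and target != self.active
        # On a transition, transfer the coordinate detected in the previous space
        handover = self.last_coordinate if transition else None
        self.active = target
        self.last_coordinate = (x, y)
        return target, handover

ctrl = SensorController()
print(ctrl.update(10.0, 5, 5))  # object far away: vision sensor active, no hand-over
print(ctrl.update(2.0, 6, 5))   # crossed the threshold: touch sensor, (5, 5) handed over
```

The hand-over of the last coordinate mirrors the claimed transfer of transition information between the sensors, so the receiving sensor can continue tracking without losing the object's position.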
     


    Claims

    1. Apparatus (100) for recognizing a proximity motion using sensors (110, 120), the apparatus comprising:

    a first sensor (110);

    a second sensor (120) to sense an input space at a distance greater than a predetermined distance (141) from a reference surface (140); and

    an information transfer controller to transfer information associated with a transition of an input object (150), between the first sensor (110) and the second sensor (120), in response to the transition of the input object (150) occurring between a first input space and a second input space, wherein the information associated with the transition of the input object comprises, when the transition occurs between the first input space and the second input space, a coordinate of the input object detected in a corresponding input space;

    wherein the first sensor (110) is configured to sense the first input space at a distance less than the predetermined distance (141) from the reference surface (140), to recognize a proximity motion, and the second input space is sensed by the second sensor at a distance greater than the predetermined distance (141) from the reference surface (140); the apparatus being characterized in that

    the second sensor (120) is configured to sense the input object by detecting a change in light over time with respect to the input object, by detecting a change in light intensity in each pixel of the second sensor (120).


     
    2. Apparatus according to claim 1, further comprising:

    a proximity measurer to measure the proximity between the input object and the reference surface; and

    a sensor controller to selectively activate the first sensor and the second sensor based on the measured proximity,

    wherein the sensor controller preferably comprises:

    a proximity comparator to compare the proximity with the predetermined distance, and

    an operating mode controller to control an operating mode of the first sensor (110) and an operating mode of the second sensor (120) based on a result of the comparison,

    wherein the operating mode comprises an active mode, an inactive mode and/or a standby mode.


     
    3. Apparatus according to claim 1 or 2, wherein the information associated with the transition comprises an x coordinate and a y coordinate of the input object detected in the first input space and/or the second input space,
    wherein preferably:

    the first sensor (110) outputs an x coordinate, a y coordinate, and a z coordinate of the input object (150) detected in the first input space, and

    the second sensor (120) outputs an x coordinate and a y coordinate of the input object (150) detected in the second input space.


     
    4. Apparatus according to any one of claims 1 to 3, wherein:

    the first sensor (110) comprises a touch sensor, and

    the second sensor (120) comprises a vision sensor.


     
    5. Apparatus according to any one of claims 1 to 4, further comprising:

    a display to display an interface for recognizing a proximity motion; and

    an interface controller to control the interface for recognizing the proximity motion so that a size of an area indicated by a first input object is changed based on the proximity between the first input object and the reference surface, when the first input object is input in the first input space;

    wherein:

    the interface for recognizing the proximity motion comprises at least one icon, and

    the interface controller controls the interface for recognizing the proximity motion so that a size of an icon indicated by the first input object increases as the proximity between the first input object (150) and the reference surface (140) increases, and

    controls the interface for recognizing the proximity motion so that the icon indicated by the first input object is selected when the first input object (150) is in contact with the reference surface (140).


     
    6. Apparatus according to any one of claims 1 to 5, further comprising:

    a display to display an interface for recognizing a proximity motion; and

    an interface controller to control the interface for recognizing the proximity motion so that at least one predetermined icon is arranged along a circumference of a point indicated by a second input object, when the second input object is input in the second input space;

    preferably further comprising:

    a signal processor to extract an axis and an end point of the first input object (150) based on an output signal of the first sensor (110), when the first input object (150) is input in the first input space,

    wherein the interface controller controls, using the axis and the end point of the first input object (150), the interface for recognizing the proximity motion so that the at least one predetermined icon is arranged along a circumference of an area in which the display is occluded by the first input object (150), when the first input object (150) is input in the first input space, and

    wherein the interface controller preferably controls the interface for recognizing the proximity motion so that a size of the at least one predetermined icon is changed based on the proximity between the first input object (150) and the reference surface (140).


     
    7. Apparatus according to any one of claims 1 to 6, further comprising:

    a plurality of input sensing units to determine whether a plurality of input objects is detected, based on an output signal of the first sensor (110) and/or an output signal of the second sensor (120); and

    an input selector to select at least one of a plurality of input objects based on a predetermined mode, when the plurality of input sensing units determines that the plurality of input objects is detected,

    wherein the predetermined mode preferably comprises a mode in which a plurality of inputs is received, a mode in which an input object closest to the reference surface is selected, and a mode in which an input object is selected when a position indicated by the input object is closest to a predetermined position above the reference surface, and

    the apparatus preferably further comprising:

    a display to display an interface for recognizing a proximity motion;

    a signal processor to extract an axis and an end point of at least one input object based on an output signal of the first sensor, when the at least one input object is detected in the first input space; and

    an interface controller to control the interface for recognizing the proximity motion so that at least one predetermined icon is arranged along a circumference of a point indicated by at least one input object, when the at least one input object is detected in the second input space, and to control, using the axis and the end point of the at least one input object, the interface for recognizing the proximity motion so that the at least one predetermined icon is arranged along a circumference of an area in which the display is occluded by the at least one input object, when the at least one input object is detected in the first input space;

    wherein the interface controller preferably controls the interface for recognizing the proximity motion so that a size of the at least one predetermined icon is changed based on the proximity between at least one input object and the reference surface.


     
    8. Apparatus according to any one of claims 1 to 7, further comprising:

    an input pattern recognizer to recognize an input pattern based on an output signal of the first sensor and/or an output signal of the second sensor; and

    a function performer to perform a function corresponding to the input pattern,

    wherein preferably:
    the input pattern recognizer comprises:

    a calculator to calculate a velocity and/or an angular velocity of the input object based on the output signal of the first sensor and/or the output signal of the second sensor, and

    the function performer comprises:
    a detector to detect a function corresponding to the input pattern based on a result of the calculation.


     
    9. Apparatus according to any one of claims 1 to 8, further comprising:

    a display to provide a two-dimensional (2D) interface and a three-dimensional (3D) interface;

    a calculator to calculate a position, a velocity, and/or an angular velocity of the input object based on the output signal of the first sensor and/or the output signal of the second sensor; and

    a display controller to control an operating mode of the display based on a result of the calculation.


     
    10. Method of recognizing a proximity motion using sensors (110, 120), the method comprising:

    sensing, using a first sensor (110);

    sensing, using a second sensor (120), a second input space when an input object (150) is present in the second input space at a distance greater than a predetermined distance (141) from a reference surface; and

    transferring information associated with a transition of the input object (150), between the first sensor (110) and the second sensor (120), in response to the transition of the input object (150) occurring between a first input space and a second input space, wherein the information associated with the transition of the input object comprises, when the transition occurs between the first input space and the second input space, a coordinate of the input object detected in a corresponding input space;

    the method further comprising sensing, using the first sensor (110), the first input space when the input object (150) is present in the first input space at a distance less than the predetermined distance (141) from the reference surface (140), to recognize a proximity motion, and wherein the second input space is sensed by the second sensor at a distance greater than the predetermined distance (141) from the reference surface (140); the method being characterized in that

    the sensing of the input object using the second sensor (120) is performed by detecting a change in light over time with respect to the input object, by detecting a change in light intensity in each pixel of the second sensor (120).


     
    11. Method according to claim 10, further comprising:

    measuring the proximity between the input object (150) and the reference surface (140); and

    selectively activating the first sensor (110) and the second sensor (120) based on the measured proximity,

    wherein the information associated with the transition preferably comprises an x coordinate and a y coordinate of the input object (150) detected in the first input space and/or the second input space,

    wherein preferably:

    the first sensor (110) outputs an x coordinate, a y coordinate, and a z coordinate of the input object (150) detected in the first input space, and

    the second sensor (120) outputs an x coordinate and a y coordinate of the input object (150) detected in the second input space.


     
    12. Method according to claim 10 or 11, wherein:

    the first sensor (110) comprises a touch sensor, and

    the second sensor (120) comprises a vision sensor.


     
    13. Method according to claim 10, 11 or 12, further comprising:

    controlling an interface for recognizing a proximity motion so that a size of an area indicated by a first input object is changed based on the proximity between the first input object and the reference surface, when the first input object is input in the first input space; and

    displaying the interface for recognizing the proximity motion,

    wherein preferably:

    the interface for recognizing the proximity motion comprises at least one icon, and

    the controlling comprises:

    controlling the interface for recognizing the proximity motion so that a size of an icon indicated by the first input object (150) increases as the proximity between the first input object (150) and the reference surface (140) increases, and

    controlling the interface for recognizing the proximity motion so that the icon indicated by the first input object (150) is selected when the first input object (150) is in contact with the reference surface (140).


     
    14. Method according to any one of claims 10 to 13, performed by an apparatus according to any one of claims 1 to 9.
     
    15. Non-transitory computer-readable medium comprising a program to instruct a computer to perform the method according to any one of claims 10 to 14.
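The proximity-dependent icon behavior recited in claims 5 and 13 (the indicated icon grows as the input object approaches, and is selected on contact with the reference surface) can be sketched as follows. The function names, the pixel sizes, and the linear scaling law are all assumptions made for this illustration; the claims do not prescribe a particular scaling function.

```python
# Illustrative sketch of proximity-dependent icon scaling (claims 5 and 13).
# Names, sizes, and the linear law are hypothetical, not taken from the patent.

BASE_SIZE = 32   # icon size in pixels at the edge of the first input space (assumed)
MAX_SIZE = 64    # icon size just before touch (assumed)
THRESHOLD = 4.0  # predetermined distance (141) from the reference surface, in cm

def icon_size(distance):
    """Return the displayed size of the indicated icon.

    `distance` is the gap between the first input object and the reference
    surface; proximity increases as the distance decreases, so the icon grows
    linearly from BASE_SIZE toward MAX_SIZE."""
    distance = max(0.0, min(distance, THRESHOLD))
    proximity = 1.0 - distance / THRESHOLD  # 0.0 at the space boundary, 1.0 at touch
    return round(BASE_SIZE + (MAX_SIZE - BASE_SIZE) * proximity)

def on_input(distance, icon):
    # Contact with the reference surface (140) selects the indicated icon
    if distance <= 0.0:
        return f"selected {icon}"
    return f"{icon} at {icon_size(distance)}px"

print(on_input(4.0, "mail"))  # entering the first input space: mail at 32px
print(on_input(1.0, "mail"))  # approaching the surface: mail at 56px
print(on_input(0.0, "mail"))  # touching the surface: selected mail
```

The growth-then-select sequence matches the claimed control flow: size feedback while the object hovers in the first input space, selection only on contact.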
     




    Drawing

    Cited references

    REFERENCES CITED IN THE DESCRIPTION



    This list of references cited by the applicant is for the reader's convenience only. It does not form part of the European patent document. Even though great care has been taken in compiling the references, errors or omissions cannot be excluded and the EPO disclaims all liability in this regard.

    Patent documents cited in the description