(11)EP 2 534 554 B1

(12)EUROPEAN PATENT SPECIFICATION

(45)Mention of the grant of the patent:
10.06.2020 Bulletin 2020/24

(21)Application number: 10845909.0

(22)Date of filing:  11.02.2010
(51)International Patent Classification (IPC): 
G06F 3/01(2006.01)
G06F 3/041(2006.01)
G06F 3/16(2006.01)
G06F 1/3203(2019.01)
G06F 1/32(2019.01)
G06F 3/02(2006.01)
G06F 3/03(2006.01)
G06F 1/3231(2019.01)
(86)International application number:
PCT/US2010/023821
(87)International publication number:
WO 2011/099969 (18.08.2011 Gazette  2011/33)

(54)

INPUT COMMAND

EINGABEBEFEHL

ORDRE D'ENTRÉE


(84)Designated Contracting States:
AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

(43)Date of publication of application:
19.12.2012 Bulletin 2012/51

(73)Proprietor: Hewlett-Packard Development Company, L.P.
Spring TX 77389 (US)

(72)Inventors:
  • MCCARTHY, John
    Cupertino California 95014 (US)
  • BRIDEN, John
    Cupertino California 95014 (US)

(74)Representative: HGF Limited 
Fountain Precinct Balm Green
Sheffield S1 2JA (GB)


(56)References cited: 
US-A1- 2005 071 647
US-A1- 2006 192 775
US-A1- 2008 134 102
US-A1- 2009 073 128
US-A1- 2005 071 698
US-A1- 2007 040 033
US-A1- 2009 040 193
US-A1- 2009 085 873
  
      
    Note: Within nine months from the publication of the mention of the grant of the European patent, any person may give notice to the European Patent Office of opposition to the European patent granted. Notice of opposition shall be filed in a written reasoned statement. It shall not be deemed to have been filed until the opposition fee has been paid. (Art. 99(1) European Patent Convention).


    Description

    BACKGROUND



    [0001] When a computing machine processes an input command, a user can access an input device of the computing machine, and the input device can be configured to detect an input. The user can operate the input device to enter one or more inputs. In response to one or more inputs, the input device can transmit the input for the computing machine to process as an input command.

    [0002] US2006/192775 discloses a computer system in which connected cameras, along with visual cues based on presence detection, pose detection, and/or gaze detection software, are configured to change one or more computer operating states to accomplish non-computer-related tasks.

    [0003] US2009/073128 discloses a cleanable touch sensitive keyboard that includes proximity sensors that can detect a user's hand(s) approaching the device.

    [0004] US2007/040033 discloses a digital mirror system that displays real-time video imagery of a user who stands before it and provides a user sensor that automatically transitions between operational states in response to whether or not a user is detected before the mirror.

    [0005] US2005/0071647 relates to obtaining biogenic information of a user through a camera unit and an authentication sensor and comparing it with previously stored biogenic information of a person having use authority, thereby determining whether or not the user has the use authority.

    [0006] US2005/0071698 relates to automatic power adjustment in an electronic device.

    [0007] US2009/0040193 relates to a data processor for an occupant identification system. The processor utilizes information from a touch sensor and a contact point in order to provide output for determining the position of a touch on the touch sensor.

    BRIEF DESCRIPTION OF THE DRAWINGS



    [0008] Various features and advantages of the disclosed embodiments will be apparent from the detailed description which follows, taken in conjunction with the accompanying drawings, which together illustrate, by way of example, features of the embodiments.

    Figure 1 illustrates a computing machine coupled to a sensor and an input device according to an embodiment of the invention.

    Figure 2 illustrates a sensor configured to determine whether a user is within a proximity of a computing machine according to an embodiment of the invention.

    Figure 3 illustrates a block diagram of an input application processing an input command and configuring a computing machine according to an embodiment of the invention.

    Figure 4 illustrates a block diagram of an input application rejecting an input command and configuring a computing machine according to an embodiment of the invention.

    Figure 5 illustrates a computing machine with an embedded input application and an input application stored on a removable medium being accessed by the computing machine according to an embodiment of the invention.

    Figure 6 is a flow chart illustrating a method for detecting an input command according to an embodiment of the invention.

    Figure 7 is a flow chart illustrating a method for detecting an input command according to another embodiment of the invention.


    DETAILED DESCRIPTION



    [0009] Figure 1 illustrates a computing machine 100 coupled to a sensor 130 and an input device 170 according to an embodiment of the invention. In one embodiment, the computing machine 100 is a desktop, laptop/notebook, netbook, and/or any other computing device the sensor 130 and/or the input device 170 can be coupled to.

    [0010] As illustrated in Figure 1, the computing machine 100 is coupled to a processor 120, a sensor 130, an input device 170, a digital display device 160, a storage device 140, and a communication bus 150 for the computing machine 100 and/or one or more components of the computing machine 100 to communicate with one another.

    [0011] Further, as shown in Figure 1, the storage device 140 can store an input application 110. In other embodiments, the machine 100 includes additional components and/or is coupled to additional components in addition to and/or in lieu of those noted above and as illustrated in Figure 1.

    [0012] As noted above, the machine 100 includes a processor 120. The processor 120 sends data and/or instructions to one or more components of the computing machine 100, such as the sensor 130, the input device 170, the digital display device 160, and/or the input application 110. Additionally, the processor 120 receives data and/or instruction from one or more components of the computing machine 100, such as the sensor 130, the input device 170, and/or the input application 110.

    [0013] The input application 110 is an application which can be utilized in conjunction with the processor 120 and at least one sensor 130 to determine whether a user is within a proximity of the computing machine 100. For the purposes of this application, a user is determined to be within a proximity of the computing machine when the sensor 130 determines that the user is within a predefined distance or radius of the computing machine 100 and/or the sensor 130.

    [0014] Additionally, the input application 110 can configure the input device 170 to detect or reject an input command entered by the user in response to whether the user is determined to be within the proximity of the computing machine 100. Further, the input application 110 can transmit and/or process input commands received by the input device 170.
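
    The proximity test and the detect-or-reject behaviour described in paragraphs [0013] and [0014] can be pictured with the following Python sketch. It is illustrative only: the 1.5 m radius and all function and variable names are assumptions, not values or identifiers taken from this specification.

```python
from typing import Optional

# Illustrative sketch only: the predefined distance and all names are
# assumptions, not values taken from this specification.
PREDEFINED_DISTANCE_M = 1.5  # predefined radius around the computing machine


def user_within_proximity(measured_distance_m: float) -> bool:
    """True when the detected user is within the predefined distance."""
    return measured_distance_m <= PREDEFINED_DISTANCE_M


def handle_input(raw_input_event: str, measured_distance_m: float) -> Optional[str]:
    """Process an input as a command only while a user is in proximity."""
    if user_within_proximity(measured_distance_m):
        return f"command:{raw_input_event}"  # transmitted for processing
    return None                              # input rejected / ignored


if __name__ == "__main__":
    print(handle_input("touch(12, 40)", 1.0))  # command:touch(12, 40)
    print(handle_input("touch(12, 40)", 3.2))  # None
```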

    [0015] The input application 110 can be firmware which is embedded onto the computing machine 100. In other embodiments, the input application 110 is a software application stored on the computing machine 100 within ROM or on the storage device 140 accessible by the computing machine 100, or the input application 110 is stored on a computer readable medium that is readable and accessible by the computing machine 100 from a different location.

    [0016] Additionally, in one embodiment, the storage device 140 is included in the computing machine 100. In other embodiments, the storage device 140 is not included in the computing machine 100, but is accessible to the computing machine 100 utilizing a network interface of the computing machine 100. The network interface can be a wired or wireless network interface card.

    [0017] In a further embodiment, the input application 110 is stored and/or accessed through a server coupled through a local area network or a wide area network. The input application 110 communicates with devices and/or components coupled to the computing machine 100 physically or wirelessly through a communication bus 150 included in or attached to the computing machine 100. In one embodiment, the communication bus 150 is a memory bus. In other embodiments, the communication bus 150 is a data bus.

    [0018] As noted above, the input application 110 can be utilized in conjunction with the processor 120 and at least one sensor 130 to determine whether a user is within a proximity of the computing machine 100 and/or at least one sensor 130. When determining whether a user is within a proximity of the computing machine 100, the input application 110 and/or the processor 120 can configure the sensor 130 to scan an environment around the computing machine 100 for a user. The environment includes a space around the computing machine 100 and the objects within the space.

    [0019] A sensor 130 is a detection device configured to scan for or receive information from the environment around the sensor 130 or the computing machine 100. In one embodiment, a sensor 130 can include at least one from the group consisting of a motion sensor, a proximity sensor, an infrared sensor, and/or an image capturing device. In other embodiments, a sensor 130 can include additional devices and/or components configured to receive and/or scan for information from an environment around the sensor 130 or the computing machine 100.

    [0020] At least one sensor 130 can be coupled to one or more locations on or around the computing machine 100. In another embodiment, at least one sensor 130 can be integrated as part of the computing machine 100. In other embodiments, at least one of the sensors 130 can be coupled to or integrated as part of one or more components of the computing machine 100, such as a digital display device 160.

    [0021] Additionally, at least one sensor 130 can be configured to face towards one or more directions around the computing machine 100. In one embodiment, at least one of the sensors 130 is a front facing sensor. Further, at least one sensor 130 can be configured to rotate around and/or reposition along one or more axes.

    [0022] A digital display device 160 is a display device that can create and/or project one or more images and/or videos for display. In one embodiment, the digital display device 160 can be a monitor and/or a television. In another embodiment, the digital display device 160 is a projector that can project one or more images and/or videos. Additionally, the digital display device 160 can be coupled to the computing machine 100 or the digital display device can be integrated as part of the computing machine 100.

    [0023] A sensor 130 can be configured by the processor 120 and/or the input application 110 to actively scan for the information from the environment. When configuring the sensor 130, the processor 120 and/or the input application 110 can send one or more instructions for the sensor 130 to scan the environment for the information. In another embodiment, the sensor 130 can be configured to periodically and/or upon request scan for the information from the environment.

    [0024] For the purposes of this application, the information can specify whether an object is present in the environment, a size of the object, a shape of the object, a distance of the object, and/or whether the object is moving or stationary. In other embodiments, the information can specify additional details of the object or the environment in addition to and/or in lieu of those noted above.

    [0025] Additionally, the sensor 130 can be configured by the input application 110 and/or the processor 120 to interpret or process the information. In other embodiments, the sensor 130 can receive the information and transmit the information for the input application 110 and/or the processor 120 to interpret or process.

    [0026] When scanning for or receiving information from the environment, the sensor 130 can be configured to scan a viewing area of the sensor 130 for objects within the environment. If an object is found in the viewing area, the sensor 130 can proceed to identify a size and/or a shape of the object. Further, the sensor 130 can determine a distance of the object or user from the sensor 130 or computing machine 100. The sensor 130 can additionally scan for movement of the object to determine whether the object is active or stationary within the environment.

    [0027] In another embodiment, the sensor 130 can emit one or more signals and scan for feedback from the signals. The signals can reflect off of an object within the environment. The sensor 130 can then proceed to detect an amount of signals reflected and/or a direction of the reflected signals to identify a size and/or shape of the object within the environment.

    [0028] The sensor 130 can also detect a series of signals and compare the series of signals to determine whether the object is stationary or active. Further, the sensor 130 and/or the input application 110 can measure an amount of time taken for the signals to return to the sensor 130 to identify a distance of the object from the sensor.
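
    As a rough worked example of the signal-timing approach in paragraph [0028], the sketch below estimates distance from the round-trip time of a reflected signal and compares a series of readings to judge movement. It assumes an ultrasonic sensor and a propagation speed of 343 m/s; the specification does not fix a signal type or speed.

```python
# Illustrative time-of-flight estimate for an emitted signal reflecting
# off an object.  The ultrasonic propagation speed is an assumption.
SPEED_OF_SOUND_M_PER_S = 343.0


def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting object: half of the round-trip path."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_seconds / 2.0


def appears_stationary(distance_samples_m, tolerance_m: float = 0.05) -> bool:
    """Compare a series of readings to decide stationary versus moving."""
    return max(distance_samples_m) - min(distance_samples_m) <= tolerance_m


if __name__ == "__main__":
    print(round(distance_from_round_trip(0.006), 2))      # ~1.03 m
    print(appears_stationary([1.02, 1.03, 1.03, 1.02]))   # True
    print(appears_stationary([1.02, 1.40, 1.80, 2.10]))   # False
```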

    [0029] Utilizing the information received from the sensor 130, the input application 110 can determine whether an object in the environment is a user. In one embodiment, the input application 110 compares an identified shape and size of the object to a predefined size and/or shape of a user. The predefined size and/or shape of the user can be defined by a user or the computing machine 100. Additionally, the predefined size and/or shape can be stored on the computing machine 100 or the sensor 130.

    [0030] In another embodiment, the input application 110 additionally considers whether the object is active or stationary when determining whether the object is a user. Further, the sensor 130 and/or the input application 110 can additionally utilize facial detection technology when determining whether the object in the environment is a user. The facial detection technology can be hardware and/or software based. In other embodiments, the sensor 130 can scan for and/or identify users utilizing additional methods in addition to and/or in lieu of those noted above.
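
    The classification step in paragraphs [0029] and [0030] might be sketched as below: the object's measured size is compared against a predefined range, optionally combined with facial detection or observed motion. The data structure, size ranges, and flags are illustrative assumptions only.

```python
from dataclasses import dataclass


@dataclass
class DetectedObject:
    height_m: float
    width_m: float
    is_moving: bool
    face_detected: bool


# Assumed size range for a user; per the description such values could be
# defined by a user or the computing machine and stored on the machine or sensor.
USER_HEIGHT_RANGE_M = (1.2, 2.1)
USER_WIDTH_RANGE_M = (0.3, 0.8)


def object_is_user(obj: DetectedObject, use_face_detection: bool = True) -> bool:
    """Compare measured size/shape to the predefined user size, then
    optionally require a detected face or observed motion."""
    size_matches = (
        USER_HEIGHT_RANGE_M[0] <= obj.height_m <= USER_HEIGHT_RANGE_M[1]
        and USER_WIDTH_RANGE_M[0] <= obj.width_m <= USER_WIDTH_RANGE_M[1]
    )
    if not size_matches:
        return False
    if use_face_detection:
        return obj.face_detected
    return obj.is_moving  # fall back to active-vs-stationary


if __name__ == "__main__":
    print(object_is_user(DetectedObject(1.7, 0.5, True, True)))    # True
    print(object_is_user(DetectedObject(0.4, 0.4, False, False)))  # False
```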

    [0031] Once the input application 110 has determined that the object is a user, the input application 110 can proceed to determine whether the user is within a proximity of the computing machine 100. When determining whether the user is within a proximity of the computing machine 100, the input application 110 can compare an identified distance of the user to a predefined distance. If the user is identified by the input application 110 and/or the sensor 130 to be within the predefined distance, the input application 110 will determine that the user is within a proximity of the computing machine 100.

    [0032] If the user is determined to be within the proximity of the computing machine 100, the input application 110 proceeds to configure at least one input device 170 to detect an input command entered by the user. An input device 170 is a device which a user can access and/or utilize when entering one or more input commands for the computing machine 100.

    [0033] At least one input device 170 can include a touch display device, a keyboard, a mouse, a microphone, and/or an image capturing device. The touch display device can be included and/or be part of the digital display device 160. In another embodiment, the sensor 130 of the computing machine 100 can additionally be utilized as an input device 170. In other embodiments, the input device 170 can be or include additional devices configured to detect one or more inputs entered by a user.

    [0034] When configuring the input device 170, the input application 110 and/or the processor 120 send one or more instructions for the input device 170 to scan for the user accessing the input device and to detect one or more inputs entered by the user. If the input device 170 detects any inputs entered by the user, the input device 170 can communicate and/or transmit the input to the input application 110. The input application 110 can then interpret an input command from the input entered by the user for the computing machine 100 to process.

    [0035] In another embodiment, if the input application 110 has determined that a user is not within a proximity of the computing machine 100, the input application 110 can reject an input command detected. When rejecting a detected input command, the input application 110 can ignore and/or not process any inputs received by the input device 170. The input application 110 can continue to ignore and/or not process any inputs received by the input device 170 until a user is determined to be within a proximity of the computing machine 100.

    [0036] In another embodiment, the input application 110 can additionally configure the computing machine 100 to enter and/or transition into a low power state when the sensor 130 does not detect the user within a proximity of the computing machine 100 after a predefined period of time. The predefined period of time can be defined by a user and/or by the computing machine 100.

    [0037] The computing machine 100 can include one or more power states. In one embodiment, the computing machine 100 includes a low power state and a high power state. When in the low power state, the computing machine 100 can reduce the amount of power supplied to one or more components of the computing machine 100. Additionally, when in the low power state, the input application 110 and/or the processor 120 can configure one or more components of the computing machine to enter a low power mode or sleep state.

    [0038] The computing machine 100 can be configured by the processor 120 and/or the input application 110 to transition to and/or from one or more of the power states in response to whether a user is determined by the sensor 130 to be within a proximity of the computing machine 100.
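
    A minimal sketch of the power-state behaviour in paragraphs [0036] to [0038] follows, assuming a 300-second idle period and the state names shown; the specification fixes neither.

```python
import time

HIGH_POWER, LOW_POWER = "high power state", "low power state"
IDLE_TIMEOUT_S = 300.0  # predefined period of time (assumed value)


class PowerStateManager:
    """Transition between power states based on user proximity."""

    def __init__(self) -> None:
        self.state = HIGH_POWER
        self.last_seen = time.monotonic()

    def update(self, user_in_proximity: bool) -> str:
        now = time.monotonic()
        if user_in_proximity:
            self.last_seen = now
            self.state = HIGH_POWER  # transition to, or remain in, the high power state
        elif now - self.last_seen >= IDLE_TIMEOUT_S:
            # Display and input device may sleep; the sensor keeps receiving
            # power so it can still detect a returning user.
            self.state = LOW_POWER
        return self.state
```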

    [0039] Figure 2 illustrates a sensor 230 configured to determine whether a user 200 is within a proximity of a computing machine 200 according to an embodiment of the invention. As illustrated in Figure 2, in one embodiment, the sensor 230 can be an image and/or video capturing device and the sensor 230 can be coupled to a digital display device 260 of the computing machine 200. In other embodiments, the sensor 230 can be another type of detection device, and the sensor 230 can be coupled to other locations or positions around the computing machine 200.

    [0040] As shown in the present embodiment, the sensor 230 can capture a view of an environment around the computing machine 200 by scanning and/or detecting information around the computing machine 200. The sensor 230 captures a view of any objects within the environment of the computing machine 200. As noted above, the sensor 230 can actively scan the environment for an object or the sensor 230 can periodically or upon request scan the environment for an object.

    [0041] As illustrated in Figure 2, the sensor 230 has detected object 200 within the environment. Once an object 200 has been found within a view of the sensor 230, the sensor 230 can continue to capture additional information of the object 200. In one embodiment, the sensor 230 captures and identifies a size and a shape of the object 200. In another embodiment, the sensor 230 additionally identifies a distance of the object 200 and determines whether the object 200 is stationary or moving.

    [0042] The sensor 230 passes information of the object 200 to an input application of the computing machine 200. The input application can then determine whether the object 200 is a user and whether the user is within a proximity of the computing machine 200. In other embodiments, the sensor 230 can analyze the information captured and determine whether the object 200 is a user.

    [0043] As noted above, the input application can compare the identified shape and size of the object 200 to a predefined size and/or shape of a user. The predefined size and/or shape of the user can be defined by a user or the computing machine. In one embodiment, the predefined size and/or shape of the user can be adjusted by the input application in response to an identified distance of the object 200.

    [0044] In another embodiment, the input application and/or the sensor 230 additionally utilize facial detection technology when determining whether the object is a user 200. If a face is detected, the input application will determine that the object 200 is a user 200. In another embodiment, the input application further considers whether the object is active or stationary when determining whether the object 200 is a user 200.

    [0045] Once a user 200 has been identified within the environment around the computing machine 200, the input application can proceed to determine whether the user 200 is within a proximity of the computing machine 200. As noted above, the input application can compare the identified distance of the user 200 to a predefined distance. If the user is identified to be at a distance less than or equal to the predefined distance, then the input application will determine that the user 200 is within a proximity of the computing machine 200.

    [0046] As illustrated in Figure 2, in response to the user 200 being determined to be within a proximity of the computing machine 200, the input application configures at least one input device 270 to detect an input command entered by the user. As shown in the present embodiment, the input device 270 can be coupled to and/or integrated as part of the computing machine 200 and/or the digital display device 260. Additionally, as illustrated in Figure 2, the digital display device 260 can be configured to render an interface for the user 200 to interact with when entering inputs.

    [0047] Figure 3 illustrates a block diagram of an input application 310 processing an input command and configuring a computing machine 300 according to an embodiment of the invention. As illustrated in Figure 3, the input application 310 and/or a processor of the computing machine 300 configure a sensor 330 to determine whether a user is within a proximity of the computing machine.

    [0048] One or more instructions are sent to the sensor 330 by the input application 310 or the processor for the sensor 330 to scan an environment around the computing machine 300 for any objects. As illustrated in the present embodiment, the sensor 330 has detected an object within the environment and the sensor 330 has determined that the object is a user within a proximity of the computing machine 300.

    [0049] In another embodiment, when the sensor 330 detects an object within the environment, the sensor 330 identifies additional information of the object and sends the information to the input application 310. The input application 310 then analyzes the information to determine whether the object is a user and whether the user is within a proximity of the computing machine 300.

    [0050] In one embodiment, if the user is determined to be within a proximity of the computing machine 300, the input application proceeds to determine whether the computing machine 300 is in a high power state 320 or a low power state 325. As noted above, the computing machine 300 can enter and/or transition into one or more power states.

    [0051] If the input application 310 determines that the computing machine 300 is currently in a low power state 325, the input application 310 instructs the processor and/or the computing machine 300 to transition into a high power state. If the computing machine 300 is already in a high power state 320, the input application 310 instructs the processor and/or the computing machine 300 to remain in the high power state 320.

    [0052] When in a high power state 320, the computing machine 300 continues to supply power to the sensor 330 and the input device 370. The sensor 330 continues to receive power from the computing machine 300 so that the input application 310 and/or the sensor 330 can continue to determine whether a user is within a proximity of the computing machine.

    [0053] Additionally, when in the high power state 320, an input device 370 is configured to detect and scan for any inputs entered by the user. The input application 310 can send one or more instructions for the input device 370 to detect and/or scan for any input or input commands entered by the user.

    [0054] The input device 370 can send any detected inputs to the input application 310 for the input application 310 to process. When processing an input entered by a user, the input application 310 identifies an associated input command corresponding to the input and proceeds to process the input command. In another embodiment, the input application transmits the input command for the processor and/or one or more additional components of the computing machine 300 to process. In other embodiments, the input device 370 interprets an input entered by the user and identifies an associated input command. The input device 370 then transmits the input command for the input application 310 and/or the computing machine 300 to process.
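
    The interpretation step in paragraph [0054], in which a raw input from the input device is matched to an associated input command before processing, might look like the sketch below. The mapping table and all names are invented for illustration.

```python
from typing import Optional

# Invented example mapping from raw inputs to associated input commands.
INPUT_TO_COMMAND = {
    "swipe_left": "next_page",
    "swipe_right": "previous_page",
    "double_tap": "open_item",
}


def interpret_input(raw_input: str) -> Optional[str]:
    """Return the associated input command, or None if unrecognised."""
    return INPUT_TO_COMMAND.get(raw_input)


def process(raw_input: str) -> str:
    """Identify the corresponding command and hand it off for processing."""
    command = interpret_input(raw_input)
    if command is None:
        return "ignored"
    return f"processing {command}"


if __name__ == "__main__":
    print(process("double_tap"))  # processing open_item
    print(process("long_press"))  # ignored
```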

    [0055] Figure 4 illustrates a block diagram of an input application 410 rejecting an input command and configuring a computing machine 400 according to an embodiment of the invention. As illustrated in Figure 4, a sensor 430 coupled to the computing machine 400 has determined that a user is not detected within a proximity of the computing machine 400. In one embodiment, a user is determined to not be within a proximity of the computing machine 400 when the user is not present in an environment around the computing machine 400 or the user is not within a predefined distance of the computing machine 400.

    [0056] When determining whether the user is within a proximity of the computing machine 400, an input application 410 of the computing machine 400 polls the sensor 430 for this information. In other embodiments, the sensor 430 periodically sends the information or updates to the input application 410.

    [0057] As noted above, in response to the user not being within a proximity of the computing machine 400, the input application 410 rejects any inputs received from an input device 470 of the computing machine 400. In one embodiment, when rejecting the inputs, the input device 470 remains powered on and continues to detect and send inputs to the input application 410. The input application 410 can then reject any of the inputs received while a user is not within a proximity of the computing machine 400.

    [0058] In another embodiment, when rejecting inputs, the input application 410 can configure the input device 470 to reject any detected inputs. In other embodiments, the input application 410 and/or the processor can configure the computing machine 400 to enter into a low power state 425. The input application 410 and/or the processor can configure the computing machine 400 to enter and/or transition into the low power state 425 after a period of time of not detecting a user within a proximity of the computing machine 400. The period of time can be predefined by a user or the computing machine 400.

    [0059] As noted above, when in a low power state 425, one or more components of the computing machine 400 can be powered off and/or configured to enter into a sleep state or low power state. As illustrated in Figure 4, in one embodiment, when the computing machine 400 enters and/or transitions into the low power state 425, the input device 470 and/or a digital display device 460 coupled to the computing machine 400 are configured to power off. As a result, the input device 470 will not detect any inputs.

    [0060] Additionally, while the computing machine 400 is in a low power state, the sensor 430 continues to receive power from the computing machine 400 and continues to scan for a user within a proximity of the computing machine 400. While the computing machine 400 is in a low power state, if a user is determined by the sensor 430 and/or the input application 410 to come within a proximity of the computing machine 400, the input application 410 and/or the processor can configure the computing machine 400 to transition from the low power state 425 to the high power state 420.

    [0061] As a result, the input device 470 can proceed to detect inputs entered by the user and the inputs can be processed as input commands by the input application 410 or the computing machine 400.

    [0062] Figure 5 illustrates a computing machine 500 with an embedded input application 510 and an input application 510 stored on a removable medium being accessed by the computing machine 500 according to an embodiment of the invention. For the purposes of this description, a removable medium is any tangible apparatus that contains, stores, communicates, or transports the application for use by or in connection with the computing machine 500. As noted above, in one embodiment, the input application 510 is firmware that is embedded into one or more components of the computing machine 500 as ROM. In other embodiments, the input application 510 is a software application which is stored and accessed from a hard drive, a compact disc, a flash disk, a network drive or any other form of computer readable medium that is coupled to the computing machine 500.

    [0063] Figure 6 is a flow chart illustrating a method for detecting an input command according to an embodiment of the invention. The method of Figure 6 uses a computing machine coupled to a processor, a sensor, an input device, a digital display device, and an input application. In other embodiments, the method of Figure 6 uses additional components and/or devices in addition to and/or in lieu of those noted above and illustrated in Figures 1, 2, 3, 4, and 5.

    [0064] As noted above, the input application is initially configured by the processor to access at least one sensor and to configure a sensor to determine whether a user is within a proximity of the computing machine 600. The input application and/or the processor send one or more instructions for the sensor to continuously, periodically, and/or upon request scan for the user.

    [0065] As noted above, when scanning for a user, the sensor and/or the input application analyze information of an object captured by the sensor. A sensor is a detection device configured to scan and/or receive information from an environment around the computing machine. A sensor can be positioned at one or more positions around the environment. Additionally, the sensor can be coupled to and/or integrated to the computing machine or one or more components of the computing machine.

    [0066] A sensor can be a motion sensor, a proximity sensor, an infrared sensor, and/or an image capturing device. In other embodiments, a sensor can be other detection devices configured to scan an environment around the computing machine.

    [0067] As noted above, if the sensor detects an object within the environment, the sensor can continue to identify and capture additional information of the object. The information can specify a size and/or shape of the object. Additionally, the information can identify a distance of the object and/or determine whether the object is stationary or moving.

    [0068] Utilizing the information of the object, the sensor and/or the input application can determine whether the object is a user. If the object has a shape and a size that match a predefined shape and size of a user, then the object will be identified by the sensor and/or the input application as a user. In one embodiment, the sensor and/or the input application additionally utilize facial detection technology when determining whether the object is a user.

    [0069] If the object is determined to be a user, the sensor and/or the input application proceed to determine whether the user is within a proximity of the computing machine. The sensor and/or the input application can compare an identified distance of the user to a predefined distance. If the identified distance of the user is less than or equal to the predefined distance, then the user will be determined to be within a proximity of the computing machine.

    [0070] Once a user is determined to be within a proximity of the computing machine, the input device can be configured by the processor and/or the input application to detect any inputs entered by the user 610. In one embodiment, once the user is determined to be within the proximity of the computing machine, the input application will continue to determine whether the computing machine is in a high power state or a low power state.

    [0071] As noted above, the computing machine can include at least two power states and the computing machine can enter and/or transition into one or more of the power states in response to whether the user is within a proximity of the computing machine.

    [0072] If the computing machine is already in a high power state, the input application can configure the computing machine to remain in the high power state. In another embodiment, if the computing machine is currently in a low power state, the computing machine can be configured to transition into the high power state.

    [0073] When in the high power state, an input device can be configured to detect inputs entered by the user. The input device can be any device which a user can interact with to enter one or more inputs as commands. In one embodiment, the input device is a touch display and is integrated with a digital display device of the computing machine. In another embodiment, the sensor can additionally operate as an input device.

    [0074] If the input device detects any inputs from the user, the input device can transmit the input for the input application and/or the computing machine to identify a corresponding input command and to process the input command 620. In other embodiments, the input device can identify a corresponding input command and proceed to transmit the input command for the input application or the computing machine to process.

    [0075] In other embodiments, if a user is not determined to be within a proximity of the computing machine, the computing machine can enter a low power state. Additionally, any inputs from the input device can be rejected by the input application and/or by the input device. In one embodiment, the input device can additionally be configured to power off or enter a sleep state while in the low power state.

    [0076] The method is then complete, or the input application can continue to determine whether a user comes within a proximity of the computing machine, repeating the method disclosed above. In other embodiments, the method of Figure 6 includes additional steps in addition to and/or in lieu of those depicted in Figure 6.
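
    Putting the pieces together, the overall method of Figure 6 (steps 600 to 620) could be sketched as the loop below. The sensor, input device, and machine interfaces, the thresholds, and the helper names are assumptions made for illustration, not part of the claimed method.

```python
import time
from typing import Optional

PREDEFINED_DISTANCE_M = 1.5   # assumed proximity radius
SCAN_INTERVAL_S = 0.5         # assumed periodic scan interval


def detect_input_command(sensor, input_device, machine) -> None:
    """Illustrative loop for the method of Figure 6 (assumed interfaces)."""
    while True:
        # 600: configure the sensor to determine whether a user is in proximity.
        is_user, distance_m = sensor.scan()   # assumed to return (bool, metres)
        in_proximity = is_user and distance_m <= PREDEFINED_DISTANCE_M

        if in_proximity:
            machine.enter_high_power_state()
            # 610: configure the input device to detect inputs entered by the user.
            raw_input: Optional[str] = input_device.poll()
            if raw_input is not None:
                # 620: identify the corresponding input command and process it.
                machine.process_command(raw_input)
        else:
            # Reject inputs and, after a predefined idle period, enter low power.
            input_device.discard_pending()
            machine.maybe_enter_low_power_state()

        time.sleep(SCAN_INTERVAL_S)
```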

    [0077] Figure 7 is a flow chart illustrating a method for detecting an input command according to another embodiment of the invention. Similar to the method of Figure 6, the method of Figure 7 uses a computing machine coupled to a processor, a sensor, an input device, a digital display device, and an input application. In other embodiments, the method of Figure 7 uses additional components and/or devices in addition to and/or in lieu of those noted above and illustrated in Figures 1, 2, 3, 4, and 5.

    [0078] As noted above, the processor and/or the input application will initially configure at least one sensor to scan an environment around the computing machine and determine whether there are any objects in the environment. The sensor can scan the environment by scanning a view of the sensor for any objects. In another embodiment, the sensor can emit one or more signals and scan for feedback from the signals.

    [0079] If any objects are found in the environment, the sensor can further be configured to determine whether a user is within a proximity of a computing machine 700. The sensor can capture information of the object for the sensor and/or the input application to analyze when determining whether the object is a user. When capturing the information, the sensor can capture a view of the object to identify a size, a shape, a distance, and/or to determine whether the object is stationary or if it has moved.

    [0080] In another embodiment, the sensor can emit one or more signals. The signals can reflect or bounce off the object, and the sensor and/or the input application can measure an amount of the reflected signals, an angle of the reflected signals, a series of the reflected signals, and a time duration taken for the reflected signals to return. Utilizing the results, the sensor and/or the input application can determine a size of the object, a shape of the object, a distance of the object, and whether the object is stationary or has moved.

    [0081] By comparing the size and/or shape to a predefined size and/or shape of a user, the sensor and/or input application can determine whether the object is a user. In other embodiments, the sensor and/or the input application additionally consider whether the object is stationary or in motion when determining whether the object is a user.

    [0082] If the object is determined to be a user, the sensor and/or the input application can continue to compare the identified distance of the user to a predefined distance to determine whether the user is within a proximity of the computing machine 700.

    [0083] If the user is not within the proximity of the computing machine, the sensor will notify the input application and the input application will proceed to reject any input detected 750. When rejecting an input, the input application can ignore any inputs received from the input device. In another embodiment, the input application can configure the input device to not transmit any inputs detected.

    [0084] In other embodiments, when rejecting any input, the input application can configure the computing machine to enter a low power state. When in the low power state, the input application can power off a digital display device and the input device 770. Additionally, as noted above, when in the low power state, the sensor can continue to receive power and continue to scan for a user coming within proximity of the computing machine.

    [0085] In another embodiment, if the sensor detects a user within a proximity of the computing machine, the input application will proceed to determine whether the computing machine is currently in a low power state 710. If the computing machine is currently in a low power state, the input application can configure the computing machine to transition into a high power state 720. As noted above, when in the high power state, the input device is powered on and can detect an input from a user.

    [0086] In other embodiments, if the computing machine is not in a low power state, the input application can proceed to configure the input device to detect an input entered by the user 730. As noted above, the user can enter an input command by accessing and interacting with an input device. The input device can be any device which a user can utilize to enter an input.

    [0087] In one embodiment, the input device can be a touch display device, a keyboard, a mouse, a microphone, and/or an image capturing device. In another embodiment, the sensor can additionally operate as the input device.

    [0088] Once the input application has detected an input, the input device and/or the input application can proceed to identify a corresponding input command entered by the user. The input application and/or the input device can then transmit the input command for the computing machine or the input application to process 740.

    [0089] The method is then complete, or the input application can continue to determine whether a user is within a proximity of the computing machine and repeat the method disclosed above. In other embodiments, the method of Figure 7 includes additional steps in addition to and/or in lieu of those depicted in Figure 7.

    [0090] By configuring a sensor to determine whether a user is within a proximity of a computing machine, the computing machine can accurately and efficiently process an input command detected by an input device when the user is within the proximity of the computing machine. Additionally, by configuring the input device to reject an input command received when the sensor does not detect the user within a proximity of the computing machine, security can be increased for the computing machine and mistaken inputs can be decreased.


    Claims

    1. A method for detecting an input command by an input device (170, 270), the method comprising:

    configuring (600) a sensor (130, 230) to determine whether a user is within a proximity of a computing machine (200);

    in response to the sensor (130, 230) determining that the user is within the proximity of the computing machine, further configuring (610) the sensor (130, 230) to additionally operate as the input device (170, 270) to detect an input command entered by the user;

    transmitting (620) the input command for the computing machine to process; and, in response to the sensor (130, 230) determining that the user is not within the proximity of the computing machine,

    rejecting an input command detected by the input device (170, 270);

    wherein the sensor (130, 230) is a motion sensor, a proximity sensor, an infrared sensor or an image capture device.


     
    2. The method for detecting an input command of claim 1 further comprising configuring the computing machine to enter a low power state when the sensor does not detect the user within the proximity of the computing machine after a predefined period of time.
     
    3. The method for detecting an input command of claim 2 further comprising configuring the computing machine to enter a high power state in response to the sensor detecting the user within the proximity of the computing machine.
     
    4. The method for detecting an input command of claim 2 further comprising powering off a digital display coupled to the computing machine and the input device when the computing machine enters the low power state.
     
    5. A computing machine comprising:

    a processor (120);

    a sensor (130, 230) configured to determine whether a user is within a proximity of the computing machine;

    an input device (170, 270) configured to receive an input from the user;

    an input application (110) executed by the processor (120) from computer readable memory (570) and configured to:

    in response to the sensor (130, 230) determining that the user is within the proximity of the computing machine, further configure the sensor (130, 230) to additionally operate as an input device (170, 270) to detect an input command entered by the user;

    process the input command from the input device (170, 270); and

    in response to the sensor (130, 230) determining that the user is not within the proximity of the computing machine, reject an input command detected by the input device (170, 270);

    wherein the sensor (130, 230) is a motion sensor, a proximity sensor, an infrared sensor or an image capture device.


     
    6. The computing machine of claim 5 further comprising a digital display device and the sensor is coupled to the digital display device.
     
    7. The computing machine of claim 5 wherein the input application additionally utilizes facial detection technology when configuring the sensor to determine whether the user is within the proximity of the computing machine.
     
    8. The computing machine of claim 5 wherein the computing machine includes a low power state and the sensor is configured to receive power from the computing machine while the computing machine is in the low power state.
     
    9. The computing machine of claim 8 wherein the input device is configured to reject an input command received from the user while the computing machine is in the low power state.
     
    10. A computer-readable program in a computer-readable medium comprising:

    an input application configured to, when executed by a processor (120) of a computing machine comprising an input device (170, 270) and a sensor (130, 230), utilize the sensor (130, 230) to determine whether a user is within a proximity of the computing machine;

    wherein the input application is additionally configured to, in response to the sensor (130, 230) determining that the user is within the proximity of the computing machine, configure the sensor (130, 230) to additionally operate as the input device (170, 270) to detect an input command entered by the user and to receive and process the input command received from the input device (170, 270),

    wherein the input application is further configured to, in response to the sensor (130, 230) determining that the user is not within the proximity of the computing machine, reject an input command detected by the input device (170, 270); and

    wherein the sensor (130, 230) is a motion sensor, a proximity sensor, an infrared sensor or an image capture device.


     
    11. The computer-readable program in a computer-readable medium of claim 10 wherein determining whether a user is within a proximity of the computing machine includes configuring the sensor to scan for at least one object in an environment of the computing machine and determining whether the object is a user.
     
    12. The computer-readable program in a computer-readable medium of claim 11 wherein determining whether a user is within a proximity of the computing machine includes configuring the sensor to identify a distance of the user from the computing machine.
     


    Ansprüche

    1. Verfahren zum Erfassen eines Eingabebefehls durch eine Eingabevorrichtung (170, 270), wobei das Verfahren Folgendes umfasst:

    Konfigurieren (600) eines Sensors (130, 230), um zu bestimmen, ob sich ein Benutzer in einer Nähe einer Rechenmaschine (200) befindet;

    als Reaktion darauf, dass der Sensor (130, 230) bestimmt, dass sich der Benutzer in der Nähe der Rechenmaschine befindet, ferner Konfigurieren (610) des Sensors (130, 230), um zusätzlich als die Eingabevorrichtung (170, 270) zu arbeiten, um einen von dem Benutzer eingetragenen Eingabebefehl zu erfassen;

    Senden (620) des von der Rechenmaschine zu verarbeitenden Eingabebefehls; und

    als Reaktion darauf, dass der Sensor (130, 230) bestimmt, dass sich der Benutzer nicht in der Nähe der Rechenmaschine befindet, Ablehnen eines von der Eingabevorrichtung (170, 270) erfassten Eingabebefehls;

    wobei der Sensor (130, 230) ein Bewegungssensor, ein Nähesensor, ein Infrarotsensor oder eine Bilderfassungsvorrichtung ist.


     
    2. Verfahren zum Erfassen eines Eingabebefehls nach Anspruch 1, das ferner das Konfigurieren der Rechenmaschine umfasst, um in einen Niedrigleistungszustand einzutreten, wenn der Sensor nach einem vordefinierten Zeitraum den Benutzer nicht in der Nähe der Rechenmaschine erfasst.
     
    3. Verfahren zum Erfassen eines Eingabebefehls nach Anspruch 2, das ferner das Konfigurieren der Rechenmaschine umfasst, um in einen Hochleistungszustand einzutreten, als Reaktion darauf, dass der Sensor den Benutzer in der Nähe der Rechenmaschine erfasst.
     
    4. Verfahren zum Erfassen eines Eingabebefehls nach Anspruch 2, das ferner ein Abschalten einer digitalen Anzeige umfasst, die mit der Rechenmaschine und der Eingabevorrichtung gekoppelt ist, wenn die Rechenmaschine in den Niedrigleistungszustand eintritt.
     
    5. Rechenmaschine, die Folgendes umfasst:

    einen Prozessor (120);

    einen Sensor (130, 230)

    , der konfiguriert ist, um zu bestimmen, ob sich ein Benutzer in einer Nähe der Rechenmaschine befindet;

    eine Eingabevorrichtung (170, 270), die konfiguriert ist, um eine Eingabe von dem Benutzer zu empfangen;

    eine Eingabeanwendung (110), die von dem Prozessor (120) aus einem computerlesbaren Speicher (570) ausgeführt wird und für Folgendes konfiguriert ist:

    als Reaktion darauf, dass der Sensor (130, 230) bestimmt, dass sich der Benutzer in der Nähe der Rechenmaschine befindet, ferner Konfigurieren des Sensors (130, 230), um zusätzlich als eine Eingabevorrichtung (170, 270) zu arbeiten, um einen von dem Benutzer eingetragenen Eingabebefehl zu erfassen;

    Verarbeiten des Eingabebefehls aus der Eingabevorrichtung (170, 270); und

    als Reaktion darauf, dass der Sensor (130, 230) bestimmt, dass der Benutzer sich nicht in der Nähe der Rechenmaschine befindet, Ablehnen eines von der Eingabevorrichtung (170, 270) erfassten Eingabebefehls;

    wobei der Sensor (130, 230) ein Bewegungssensor, ein Nähesensor, ein Infrarotsensor oder eine Bilderfassungsvorrichtung ist.


     
    6. Rechenmaschine nach Anspruch 5, die ferner eine digitale Anzeigevorrichtung umfasst und wobei der Sensor mit der digitalen Anzeigevorrichtung gekoppelt ist.
     
    7. Rechenmaschine nach Anspruch 5, wobei die Eingabeanwendung zusätzlich Gesichtserkennungstechnologie nutzt, wenn der Sensor konfiguriert wird, um zu bestimmen, ob sich der Benutzer in der Nähe der Rechenmaschine befindet.
     
    8. Rechenmaschine nach Anspruch 5, wobei die Rechenmaschine einen Niedrigleistungszustand einschließt und der Sensor konfiguriert ist, um Leistung aus der Rechenmaschine zu empfangen, während sich die Rechenmaschine in dem Niedrigleistungszustand befindet.
     
    9. Rechenmaschine nach Anspruch 8, wobei die Eingabevorrichtung konfiguriert ist, um einen von dem Benutzer empfangenen Eingabebefehl abzulehnen, während die Rechenmaschine sich in dem Niedrigleistungszustand befindet.
     
    10. Computerlesbares Programm in einem computerlesbaren Medium, das Folgendes umfasst: eine Eingabeanwendung, die konfiguriert ist, wenn sie von einem Prozessor (120) einer Rechenmaschine, die eine Eingabevorrichtung (170, 270) und einen Sensor (130, 230) umfasst, ausgeführt wird, um den Sensor (130, 230) zu nutzen, um zu bestimmen, ob sich ein Benutzer in einer Nähe der Rechenmaschine befindet;
    wobei die Eingabeanwendung zusätzlich konfiguriert ist, um als Reaktion darauf, dass der Sensor (130, 230) bestimmt, dass sich der Benutzer in der Nähe der Rechenmaschine befindet,
    den Sensor (130, 230) zu konfigurieren, um zusätzlich als die Eingabevorrichtung (170, 270) zu arbeiten, um einen von dem Benutzer eingetragenen Eingabebefehl zu erfassen und um den aus der Eingabevorrichtung (170, 270) empfangenen Eingabebefehl zu empfangen und zu verarbeiten,
    wobei die Eingabeanwendung ferner konfiguriert ist, um als Reaktion darauf, dass der Sensor (130, 230) bestimmt, dass der Benutzer sich nicht in der Nähe der Rechenmaschine befindet, einen von der Eingabevorrichtung (170, 270) erfassten Eingabebefehl abzulehnen;
    und
    wobei der Sensor (130, 230) ein Bewegungssensor, ein Nähesensor, ein Infrarotsensor oder eine Bilderfassungsvorrichtung ist.
     
    11. Computerlesbares Programm in einem computerlesbaren Medium nach Anspruch 10, wobei das Bestimmen, ob sich ein Benutzer in einer Nähe der Rechenmaschine befindet, das Konfigurieren des Sensors, um nach wenigstens einem Objekt in einer Umgebung der Rechenmaschine abzutasten, und das Bestimmen, ob das Objekt ein Benutzer ist, einschließt.
     
    12. Computerlesbares Programm in einem computerlesbaren Medium nach Anspruch 11, wobei das Bestimmen, ob sich ein Benutzer in einer Nähe der Rechenmaschine befindet, das Konfigurieren des Sensors, um einen Abstand des Benutzers von der Rechenmaschine zu identifizieren, einschließt.
     


    Revendications

    1. Procédé de détection d'une commande d'entrée par un dispositif d'entrée (170, 270), le procédé comprenant :

    la configuration (600) d'un capteur (130, 230) pour déterminer si un utilisateur se trouve à proximité d'une machine informatique (200) ;

    en réponse au fait que le capteur (130, 230) détermine que l'utilisateur se trouve à proximité de la machine informatique, la configuration en outre (610) du capteur (130, 230) pour fonctionner par ailleurs en tant que dispositif d'entrée (170, 270) pour détecter un commande d'entrée entrée par l'utilisateur ;

    la transmission (620) de la commande d'entrée à la machine informatique à traiter ; et,

    en réponse à la détection par le capteur (130, 230) que l'utilisateur n'est pas à proximité de la machine informatique, le rejet d'une commande d'entrée détectée par le dispositif d'entrée (170, 270) ;

    le capteur (130, 230) étant un capteur de mouvement, un capteur de proximité, un capteur infrarouge ou un dispositif de capture d'image.


     
    2. Procédé de détection d'une commande d'entrée selon la revendication 1, comprenant en outre la configuration de la machine informatique pour entrer dans un état de faible puissance lorsque le capteur ne détecte pas l'utilisateur à proximité de la machine informatique après une période de temps prédéfinie.
     
    3. Procédé de détection d'une commande d'entrée selon la revendication 2, comprenant en outre la configuration de la machine informatique pour entrer dans un état de forte puissance en réponse à la détection par le capteur de l'utilisateur à proximité de la machine informatique.
     
    4. Procédé de détection d'une commande d'entrée selon la revendication 2, comprenant en outre la mise hors tension d'un affichage numérique couplé à la machine informatique et au dispositif d'entrée lorsque la machine informatique entre dans l'état de faible puissance.
     
    5. Machine informatique comprenant :

    un processeur (120) ;

    un capteur (130, 230)

    configuré pour déterminer si un utilisateur se trouve à proximité de la machine informatique ;

    un dispositif d'entrée (170, 270) configuré pour recevoir une entrée de l'utilisateur ;

    une application d'entrée (110) exécutée par le processeur (120) à partir d'une mémoire lisible par ordinateur (570) et configurée pour :

    en réponse au fait que le capteur (130, 230) détermine que l'utilisateur se trouve à proximité de la machine informatique, configurer en outre le capteur (130, 230) pour fonctionner par ailleurs en tant que dispositif d'entrée (170, 270) afin de détecter une commande d'entrée entrée par l'utilisateur ;

    traiter la commande d'entrée à partir du dispositif d'entrée (170, 270) ; et

    en réponse à la détection par le capteur (130, 230) que l'utilisateur n'est pas à proximité de la machine informatique, rejeter une commande d'entrée détectée par le dispositif d'entrée (170, 270) ;

    le capteur (130, 230) étant un capteur de mouvement, un capteur de proximité, un capteur infrarouge ou un dispositif de capture d'image.


     
    6. Machine informatique selon la revendication 5 comprenant en outre un dispositif d'affichage numérique et le capteur est couplé au dispositif d'affichage numérique.
     
    7. Machine informatique selon la revendication 5, dans laquelle l'application d'entrée utilise par ailleurs une technologie de détection faciale lors de la configuration du capteur pour déterminer si l'utilisateur est à proximité de la machine informatique.
     
    8. Machine informatique selon la revendication 5, dans laquelle la machine informatique comporte un état de faible puissance et le capteur est configuré pour recevoir de la puissance de la machine informatique tandis que la machine informatique est dans l'état de faible puissance.
     
    9. Machine informatique selon la revendication 8, dans laquelle le dispositif d'entrée est configuré pour rejeter une commande d'entrée reçue de l'utilisateur tandis que la machine informatique est à l'état de faible puissance.
     
    10. Programme lisible par ordinateur sur un support lisible par ordinateur comprenant : une application d'entrée configurée pour, lorsqu'elle est exécutée par un processeur (120) d'une machine informatique comprenant un dispositif d'entrée (170, 270) et un capteur (130, 230), utiliser le capteur (130, 230) pour déterminer si un utilisateur se trouve à proximité de la machine informatique ;
    l'application d'entrée étant par ailleurs configurée pour, en réponse au capteur (130, 230) déterminer que l'utilisateur est à proximité de la machine informatique,
    la configuration du capteur (130, 230) pour fonctionner par ailleurs en tant que dispositif d'entrée (170, 270) pour détecter une commande d'entrée entrée par l'utilisateur et pour recevoir et traiter la commande d'entrée reçue du dispositif d'entrée (170, 270),
    l'application d'entrée étant en outre configurée pour, en réponse au fait que le capteur (130, 230) détermine que l'utilisateur n'est pas à proximité de la machine informatique, le rejet d'une commande d'entrée détectée par le dispositif d'entrée (170, 270) ;
    et
    le capteur (130, 230) étant un capteur de mouvement, un capteur de proximité, un capteur infrarouge ou un dispositif de capture d'image.
     
    11. Programme lisible par ordinateur dans un support lisible par ordinateur selon la revendication 10, dans lequel déterminer si un utilisateur est à proximité de la machine informatique comporte la configuration du capteur pour balayer au moins un objet dans un environnement de la machine informatique et le fait de déterminer si l'objet est un utilisateur.
     
    12. Programme lisible par ordinateur dans un support lisible par ordinateur selon la revendication 11, dans lequel déterminer si un utilisateur est à proximité de la machine informatique comporte la configuration du capteur pour identifier une distance de l'utilisateur de la machine informatique.
     




    Drawing


    REFERENCES CITED IN THE DESCRIPTION



    This list of references cited by the applicant is for the reader's convenience only. It does not form part of the European patent document. Even though great care has been taken in compiling the references, errors or omissions cannot be excluded and the EPO disclaims all liability in this regard.

    Patent documents cited in the description