(19)
(11) EP 3 278 201 B1

(12) EUROPEAN PATENT SPECIFICATION

(45) Mention of the grant of the patent:
15.05.2019 Bulletin 2019/20

(21) Application number: 15826078.6

(22) Date of filing: 08.12.2015
(51) International Patent Classification (IPC): 
G06F 3/0488(2013.01)
G06F 21/32(2013.01)
G06K 9/00(2006.01)
H04M 1/67(2006.01)
G06F 21/36(2013.01)
(86) International application number:
PCT/US2015/064561
(87) International publication number:
WO 2016/160082 (06.10.2016 Gazette 2016/40)

(54)

AUTHENTICATING A USER AND LAUNCHING AN APPLICATION ON A SINGLE INTENTIONAL USER GESTURE

BENUTZERAUTHENTIFIZIERUNG UND START EINER ANWENDUNG NACH EINER EINZIGEN VORSÄTZLICHEN GESTE EINES BENUTZERS

AUTHENTIFICATION DE L'UTILISATEUR ET LANCEMENT D'UNE APPLICATION APRÈS UN SEUL GESTE INTENTIONNEL D'UN UTILISATEUR


(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

(30) Priority: 30.03.2015 US 201514673581

(43) Date of publication of application:
07.02.2018 Bulletin 2018/06

(73) Proprietor: Google LLC
Mountain View, CA 94043 (US)

(72) Inventors:
  • LU, Hao
    Mountain View, California 94043 (US)
  • LI, Yang
    Mountain View, California 94043 (US)

(74) Representative: Grant, David Michael 
Marks & Clerk LLP 15 Fetter Lane
London EC4A 1BW (GB)


(56) References cited:
US-A1- 2012 089 952
US-A1- 2012 191 993
   
       
    Note: Within nine months from the publication of the mention of the grant of the European patent, any person may give notice to the European Patent Office of opposition to the European patent granted. Notice of opposition shall be filed in a written reasoned statement. It shall not be deemed to have been filed until the opposition fee has been paid. (Art. 99(1) European Patent Convention).


    Description

    TECHNICAL FIELD



    [0001] This specification relates to authenticating users and launching applications on devices with gesture-based user interfaces.

    BACKGROUND



    [0002] User authentication and application launching are often implemented as separate processes, each requiring different user inputs. Also, every time a user attempts an authentication on a device, the device needs to be activated from a sleep or standby mode, which will consume power.

    [0003] US2012089952 A1 proposes an apparatus and method for gesture recognition in a portable terminal. An operation of the portable terminal includes determining a user situation by using at least one situation information, determining a user's gesture by using at least one sensor measurement value, and performing a function corresponding to the user situation and the gesture.

    [0004] US2012191993 A1 proposes a system and method for reducing power consumption in an electronic device by controlling the transition of the electronic device from a sleep mode to a full power mode. The electronic device comprises a main processor a touch-sensitive overlay, and an overlay controller. A sequence of touch inputs on the touch-sensitive overlay are detected and captured using the overlay controller while the main processor is in the sleep mode. A subset of the sequence of touch inputs is processed using the overlay controller to determine that the sequence of touch inputs corresponds to a coarse model of a predetermined wake-up gesture prior to transitioning the electronic device from the sleep mode to the full power mode.

    SUMMARY



    [0005] In general, this specification describes techniques for authenticating a user and launching a user-selected application in response to and in accordance with a single intentional user gesture.

    [0006] According to a first aspect of the present invention, there is provided a method as set out in claim 1. According to a second aspect of the present invention, there is provided a computing device as set out in claim 14. According to a third aspect of the present invention, there is provided a non-transitory computer storage medium as set out in claim 15.

    [0007] Particular embodiments of the subject matter described in this specification can be implemented so as to realize one or more of the following advantages. Power consumption may be reduced: a user device remains in a sleep mode, which consumes less power compared to an active mode, when only accidental gestures are detected. User input required to securely launch an application may be reduced: a single user gesture can suffice to not only authenticate a user, but also launch a user-selected application.

    [0008] The details of one or more embodiments of the subject matter of this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.

    BRIEF DESCRIPTION OF THE DRAWINGS



    [0009] 

    FIG. 1 is a block diagram illustrating an example process for authenticating a user and launching a user-selected application responsive to a single intentional user gesture.

    FIG. 2 is a flow diagram illustrating an example process for detecting and disregarding accidental gestures, as well as authenticating a user and launching an application responsive to a single intentional gesture.

    FIG. 3 is a flow diagram illustrating an example process for detecting and disregarding accidental gestures based on predefined criteria.

    FIG. 4 is a block diagram of an example device.



    [0010] Like reference numbers and designations in the various drawings indicate like elements.

    DETAILED DESCRIPTION



    [0011] The implementations described in this specification provide various technical solutions to combine user authentication and application selection shortcuts in accordance with a single intentional user gesture by (1) detecting and disregarding accidental user gestures on a device's touchscreen without transitioning the device from a reduced power consumption mode, e.g., a sleep mode, into an increased power consumption mode, e.g., an active mode, and (2) securely launching an application on the device upon detecting an intentional user gesture that matches a predefined user gesture.

    [0012] For example, a smartphone can maintain a gesture detection component, e.g., a software module, a hardware unit, or a combination of both, in an active detection mode, but other more power-consuming components, e.g., the touchscreen, the processor, and the main memory, in a sleep or standby mode.

    [0013] When a user swipes two fingers across the smartphone's touchscreen, the gesture detection component classifies this two-finger swipe gesture as either an intentional user gesture or an accidental gesture, without activating the other more power-consuming components from the sleep mode. For example, the gesture detection component tries to determine whether the user making the two-finger swipe gesture is trying to unlock the smartphone and launch a map application, or whether the user has accidentally swiped her fingers on the touchscreen while grabbing the smartphone.

    [0014] In some cases, the gesture detection component makes this classification based on a set of predefined criteria, e.g., the orientation of the smartphone when the swipe occurred, the movement speed or acceleration of the smartphone when the swipe occurred, and the total number of fingers involved in making the swipe.

    [0015] For instance, if the smartphone was placed upside down when the swipe occurred, then it is more likely that the swipe was accidental, based on the assumption that a user is more likely to make intentional gesture when a device is in an upright position. For another instance, if the smartphone was moving at an eight-mile per hour speed when the swipe occurred, then it is more likely that the swipe was accidental, based on the assumption that a user is more likely to make intentional gestures when keeping a device still. For a third instance, if the swipe was made with four fingers, then it is more likely that the swipe was accidental, based on the assumption that a user is more likely to make intentional gestures using two or fewer fingers, given the size of the touchscreen relative to that of the user's fingers.
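    The three example criteria above can be sketched as a simple classifier. This is an illustrative Python sketch, not the claimed implementation; the field names and the thresholds (5 mph, two fingers) are assumptions chosen to match the examples in the text.

```python
from dataclasses import dataclass

@dataclass
class SwipeContext:
    upside_down: bool    # device orientation when the swipe occurred
    speed_mph: float     # device movement speed when the swipe occurred
    finger_count: int    # number of fingers making the swipe

def is_likely_accidental(ctx: SwipeContext) -> bool:
    # An upside-down device is unlikely to receive an intentional gesture.
    if ctx.upside_down:
        return True
    # Intentional gestures are usually made while the device is kept still.
    if ctx.speed_mph > 5.0:
        return True
    # Intentional gestures typically use two or fewer fingers.
    if ctx.finger_count > 2:
        return True
    return False
```

    A swipe made with four fingers on a fast-moving or inverted device would thus be disregarded without waking the sleeping components.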

    [0016] Based on one or more of these predefined criteria, if the gesture detection component classifies the user's two-finger swipe on the touchscreen as accidental, the gesture detection component disregards the swipe gesture without activating some of the smartphone components that are in a sleep mode, to keep the power consumption low.

    [0017] If, however, the gesture detection component classifies the user's two-finger swipe as intentional, the gesture detection component activates other smartphone components on an as-needed basis.

    [0018] For example, upon detecting an intentional gesture, the gesture detection component, in some cases, activates a gesture-based authentication component from the sleep mode, which can further determine whether to unlock the smartphone based on the two-finger swipe. If the authentication is successful, the gesture detection component in some cases next activates an application launching component from the sleep mode, which then determines which one of the several different applications resident on the smartphone may be launched based on the two-finger swipe.

    [0019] The gesture-based authentication component, in some cases, unlocks the smartphone if the two-finger swipe matches a user gesture that has been confirmed as sufficient to unlock the smartphone, e.g., unlocking the smartphone after detecting an "L" shape gesture made using two fingers.

    [0020] The application launching component, in some cases, selectively launches an application, among several different applications, if the two-finger swipe matches a user gesture that has been confirmed as a shortcut to launch the application, e.g., launching an email application, e.g., as opposed to a map application, after detecting an "L" shape gesture using two fingers.

    [0021] In these ways, user convenience is improved and power consumption is kept low: a user is not required to provide separate inputs for authentication and application launching, and smartphone components are kept in a sleep mode and selectively activated on an as-needed basis.

    [0022] These technical solutions may be particularly advantageous when implemented on mobile devices, e.g., smartphones or tablet computers, where power consumption is a more prominent issue and user authentications are more frequently attempted.

    [0023] FIG. 1 is a block diagram illustrating an example process 100 for user authentication and application launching in response to a single intentional user gesture.

    [0024] The process begins with a device 102 in a sleep or standby mode (110).

    [0025] The device is locked when in the sleep mode and requires a successful user authentication to unlock. For example, a user may need to provide a passcode, a fingerprint identification, a voice verification, or a touchscreen gesture in order to access the device.

    [0026] In some implementations, a device or a component within the device consumes less power when placed in the sleep mode. For example, when a device 102 is in a sleep mode, a processor in the device 102 is placed in an idle mode or the device's display or touchscreen is dimmed or turned off.

    [0027] A gesture detection component within the device 102 is maintained in an active mode to detect user gestures on the device's touchscreen, while some other components of the device 102 are in a sleep mode. In some implementations, the gesture detection component is implemented as a low power consumption digital signal processor (DSP). In other implementations, the gesture detection component is implemented as a software program, e.g., a touchscreen driver program, running in the background while the device's touchscreen is dimmed.

    [0028] Based on a set of predefined criteria, the gesture detection component sometimes classifies a gesture detected on the touchscreen of the device 102 as an accidental gesture 112.

    [0029] The gesture detection component does not activate the device into an active mode after detecting an accidental gesture. In some implementations, when activated, a device or a device component is placed at an increased power consumption level.

    [0030] Techniques for classifying a user gesture as an intentional user gesture or an accidental user gesture are described in more detail in reference to FIG. 2 and FIG. 3.

    [0031] If the gesture detection component classifies a gesture as an intentional gesture 152, the gesture detection component activates an authentication component of the device from the sleep mode.

    [0032] The authentication component recognizes one or more characteristics, e.g., shape, speed, or location, of the intentional gesture 152 and, based on these characteristics, compares the gesture 152 against a set of confirmed gestures, so that the device can determine whether the user providing the gesture can be authenticated on the device 102.

    [0033] If the authentication component determines that the user can be authenticated on the device, it unlocks the device 102.

    [0034] In some implementations, as part of a successful authentication, the authentication component also activates an application launching component from the sleep mode.

    [0035] In some other implementations, as part of a successful authentication, the authentication component optionally activates one or more additional components of the device from the sleep mode into the active mode, e.g., increasing the brightness of the touchscreen and placing the processor in a working mode from an idle mode to enable faster processing of user tasks.

    [0036] Based on one or more recognized characteristics, e.g., shape, speed, or location, of the intentional gesture 152, the application launching component selectively launches one or more applications, from a set of different applications. For example, based on the shape of the intentional gesture 152, the application launching component launches an email application 182 rather than a map application.

    [0037] Note that these actions do not require additional user input, other than the intentional gesture 152. These techniques can be advantageous, as a single intentional gesture suffices to both authenticate a user on the device and launch an application as predefined by the user.

    [0038] FIG. 2 is a flow diagram illustrating an example process 200 for detecting and disregarding accidental gestures, as well as authenticating a user and launching an application in response to and in accordance with a single intentional gesture.

    [0039] For convenience, the process 200 will be described as being performed by a device having a touchscreen, one or more processors, and memory for storing executable instructions for execution by the one or more processors. For example, the device 400 shown in FIG. 4, appropriately programmed in accordance with this specification, can perform the process 200.

    [0040] The device detects a gesture by a user on the touchscreen while the computing device is in a sleep mode (step 202). The computing device can be, e.g., a smartphone, a tablet computer, or a desktop or notebook computer, with a touchscreen.

    [0041] In some implementations, the device maintains a blank screen on its touchscreen when detecting user gestures, e.g., to reduce power consumption. Maintaining a blank screen may be particularly advantageous when accidental gestures frequently occur.

    [0042] In some other implementations, when in the sleep mode, the device provides one or more visual aids to facilitate gesturing on the touchscreen. For example, the device, without transitioning the touchscreen into a fully lit mode, may display a gesture trail to guide the user to complete the instant gesture or additional gestures. This visual feedback informs a user not only that the device is in a working mode, but also of what gesture it has detected.

    [0043] After detecting the gesture, the computing device next determines whether to classify the gesture as an accidental gesture or as an intentional gesture (step 204). The computing device can make this determination by comparing the gesture against a set of predefined criteria.

    [0044] If the device classifies the gesture as an accidental user gesture 207, the device disregards the gesture (step 208) and maintains or returns to the sleep mode. The set of predefined criteria may include, for example, one or more characteristics associated with the gesture or the device.

    [0045] In some cases, when in or maintaining a sleep mode, the device maintains a blank screen on the touchscreen and one or more components of the computing device other than the touchscreen in a reduced power consumption mode, e.g., an idle mode.

    [0046] If the gesture is classified as an intentional user gesture (205), the process 200 continues to the authentication step (step 206).

    [0047] The device authenticates the user on the computing device by matching the intentional gesture to one or more confirmed gestures (step 206).

    [0048] In some implementations, the intentional user gesture identifies an alphanumeric character, e.g., a letter "B" or a number "4." In some implementations, the device matches an alphanumeric gesture against predefined alphanumeric values using a pre-trained handwriting recognition module that recognizes whether handwriting embodied in an alphanumeric gesture matches the handwriting of an authenticated user.

    [0049] In some implementations, the intentional user gesture identifies a predefined shape, e.g., a triangle shape or a circle shape. In some implementations, the device matches shape gestures against a set of predefined shape gesture templates, which an authenticated user can customize.

    [0050] In some implementations, authenticating a user on a device includes determining whether to unlock the device in accordance with one or more inputs, e.g., a finger gesture, a manually entered passcode, or a voice identification, provided by the user. For example, if a user has gestured a letter "B" on a smartphone's touchscreen and the letter "B" is one of the confirmed gestures stored on the smartphone, the smartphone unlocks to allow user access. Otherwise, the device remains locked and the device may offer alternative authentication means to the user.
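    The matching step described above can be sketched as a membership test against the stored confirmed gestures. This is a minimal illustrative Python sketch; the symbol recognizer is assumed to run upstream, and the stored gesture set is a hypothetical stand-in.

```python
# Symbols the user has previously confirmed as sufficient to unlock the
# device (illustrative values, not taken from the specification).
CONFIRMED_GESTURES = {"B", "4"}

def try_unlock(recognized_symbol: str) -> bool:
    """Unlock only when the recognized gesture symbol is a confirmed gesture."""
    return recognized_symbol in CONFIRMED_GESTURES
```

    A gestured "B" would unlock the device, while an unrecognized symbol would leave it locked and trigger the alternative authentication means.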

    [0051] If an authentication is successful, the device, without requiring additional user input, places the computing device in an active mode (step 212).

    [0052] Upon a successful authentication, the device, without requiring additional user input, also selects an application, from a plurality of different applications, according to the gesture, and launches the application on the device (step 214). For example, based on the same letter "B" gesture, the smartphone searches against a gesture-application mapping table that maps user gestures to user-designated applications, and determines that the letter "B" gesture identifies a particular application, which the devices next launches.
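    The gesture-application mapping table lookup above can be sketched as a dictionary keyed by recognized gesture symbol. The table entries and function name are illustrative assumptions, not taken from the specification.

```python
from typing import Optional

# Hypothetical gesture-application mapping table: maps user gestures to
# user-designated applications.
GESTURE_APP_TABLE = {
    "B": "email_application",
    "M": "map_application",
}

def select_application(gesture_symbol: str) -> Optional[str]:
    """Return the user-designated application for a gesture, if any."""
    return GESTURE_APP_TABLE.get(gesture_symbol)
```

    The same gesture that authenticated the user thus also selects the application, with no additional input required.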

    [0053] In some implementations, after launching an application, the device automatically provides input to the application based on the detected intentional user gesture. In other words, the single intentional user gesture detected not only serves authentication and application shortcut purposes, but also serves as an input to an application launched at the step 214. In some implementations, if the intentional user gesture matches a predefined shape, e.g., a rectangle shape or a triangle shape, the process 200 optionally includes selecting the application, from a plurality of different applications, in accordance with the predefined shape. For example, based on a triangle-shape gesture, the smartphone searches against a gesture-application table that maps user gestures to designated applications, determines that the triangle-shape gesture identifies a map application on the smartphone, and launches the map application without requiring the user to provide any additional inputs as to which application to launch.

    [0054] In some implementations, when a gesture-based authentication fails, the device provides an alternative authentication means (step 210). For example, when the computing device determines that an intentional user gesture is insufficient to authenticate a user, the computing device prompts the user to manually provide a password or a fingerprint, and attempts to authenticate the user based on these inputs (step 216).

    [0055] In some cases where a gesture-based authentication fails, the device reuses information identified in the single intentional gesture for further authentication, reducing the total amount of user input required for further authentications.

    [0056] For example, when a smartphone obtains, from a user, a number "4" gesture, but determines that the number "4" gesture (e.g., alone) does not suffice to authenticate the user, the smartphone provides a password-based authentication that requires a 2-digit passcode, and provides the number "4" as the first digit of the 2-digit passcode. These techniques are advantageous, as required user inputs are reduced, even in an event of an unsuccessful gesture-based authentication.

    [0057] In some implementations, the device enables gesturing in an alternative authentication means.

    [0058] For example, after invoking a password-based alternative authentication, the device continues to accept user gestures and provides corresponding input (e.g., symbols or numbers) to the password-based authentication. To continue with the above example, after providing the number "4" as the first digit of the 2-digit passcode, the smartphone detects a gesture having a number "6" shape and accordingly provides the number "6" as the second digit of the 2-digit passcode. The smartphone can continue the password-based authentication by attempting to authenticate the user based on the 2-digit passcode "46."
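    The fallback flow above, in which digits recognized from successive gestures are accumulated into the passcode so that the failed gesture is not wasted, can be sketched as follows. The function name and passcode length are illustrative assumptions.

```python
from typing import List, Optional

def build_passcode(gesture_digits: List[str], length: int = 2) -> Optional[str]:
    """Assemble a passcode from gesture-recognized digits.

    Returns the passcode string once enough digits have been collected,
    or None while the entry is still incomplete.
    """
    code = "".join(gesture_digits)
    return code if len(code) == length else None
```

    A "4" gesture that failed standalone authentication becomes the first digit; a subsequent "6" gesture completes the passcode "46" for the password-based check.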

    [0059] In some implementations, the device reuses information captured in a gesture in an alternative authentication means.

    [0060] For another example, after invoking a fingerprint-based alternative authentication, the computing device reuses a fingerprint characteristic included in the detected intentional user gesture and provides the fingerprint characteristic to the fingerprint-based alternative authentication. For example, if a gesture includes a pattern (e.g., arch, loop, and whorl) of a user's index finger, the computing device can capture and provide the characteristic as part of the input to the fingerprint authentication.

    [0061] An authenticated user can customize gestures for user authentication and application launching. For example, a user may designate a "B" shape gesture as sufficient to unlock a smartphone and to launch an email app on the smartphone. In these ways, an authenticated user can modify confirmed gestures or symbols stored on a mobile device, which in turn modifies the subsequent gesture-based authentication and application launching processes.

    [0062] FIG. 3 is a flow diagram illustrating an example process 300 for detecting and disregarding accidental gestures based on predefined criteria.

    [0063] For convenience, the process 300 will be described as being performed by a computing device having one or more processors and memory for storing executable instructions for execution by the one or more processors. For example, the device 400 shown in FIG. 4, appropriately programmed in accordance with this specification, can perform the process 300.

    [0064] In some cases, to determine whether a detected user gesture is an accidental gesture or an intentional gesture, a device identifies a first characteristic associated with a user gesture (step 302) and optionally a second characteristic associated with a device (step 304).

    [0065] In some cases, the device analyzes one or more of the following characteristics associated with a user gesture: the number of touches included in the gesture, the type of the gesture (e.g., press, swipe, and release), the on-touchscreen locations where the gesture occurred, the acceleration magnitude of the gesture, the size (e.g., in terms of the number of pixels) of the gesture, the amount of pressure exerted on the touchscreen by the gesture, the distance and width covered by the gesture, the duration of the gesture, the number of simultaneous finger touches included in the gesture, and, if the gesture includes multiple strokes, the proximity of two (e.g., consecutive) strokes.

    [0066] In some cases, the device optionally analyzes one or more of the following characteristics associated with the computing device when a user gesture is detected: the computing device's orientation, the computing device's tilt angle (e.g., vertically or horizontally), the computing device's speed of movement, and the computing device's acceleration magnitude.

    [0067] Based on one or more of these gesture- and device-related characteristics, the device then determines whether the user gesture is an intentional gesture (step 306). For example, the computing device determines the likelihood (L) of a gesture being an intentional gesture by assigning a weight (Wi, e.g., ranging from 0 to 1) to each of the analyzed characteristics (Ci, e.g., with values normalized to range from 0 to 1) and calculating a weighted total L. In some implementations, L = Σi (Wi × Ci), where i ranges from 1 to the total number of characteristics analyzed. In some cases, the device classifies a gesture having a weighted total L equal to or greater than a predefined value as an intentional gesture, and every other gesture as an accidental gesture.
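    The weighted-total computation described in paragraph [0067] can be sketched as follows; the specific weights, characteristic values, and threshold in the example are illustrative assumptions.

```python
from typing import Sequence

def is_intentional(values: Sequence[float],
                   weights: Sequence[float],
                   threshold: float = 0.5) -> bool:
    """Classify a gesture as intentional when the weighted total L
    meets or exceeds a predefined value.

    values:  normalized characteristic values Ci, each in [0, 1]
    weights: corresponding weights Wi, each in [0, 1]
    """
    # L = sum over i of Wi * Ci
    L = sum(w * c for w, c in zip(weights, values))
    return L >= threshold
```

    For instance, two characteristics with values 1.0 and 0.8 and equal weights of 0.5 yield L = 0.9, which exceeds a threshold of 0.5, so the gesture would be classified as intentional.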

    [0068] After classifying a detected gesture as an intentional gesture, the device attempts to authenticate the user providing the gesture based on one or more characteristics associated with the gesture.

    [0069] In some implementations, a user authentication process includes calculating the probability P(u(g)) that a user providing the gesture is an authorized user as identified by a confirmed gesture stored on the computing device, as follows:



    [0070] Here, S represents a set of symbols (e.g., "4"; "A"; "a"; "×"; and "◁"); G represents the set of confirmed gestures (provided by one or more authenticated users) identifying the set of symbols; and P(s|g) is calculated using a user-independent gesture recognition module, e.g., a handwriting recognition module.

    [0071] In some implementations, the probability P(u(g)) is calculated as follows:



    [0072] Here,

    is the probability that two gestures identifying the same symbol (e.g., "4"; "A"; "a"; "×"; and "◁") are provided by the same user.

    [0073] FIG. 4 is a block diagram of an example computing device 400. The user authentication and application launching technologies described in this specification can be implemented on the computing device 400.

    [0074] The device 400 typically includes one or more processing units CPU(s) 402 (also referred to as processors), one or more network interfaces 404, memory 406, and one or more communication buses 408 for interconnecting these components. The communication buses 408 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. The memory 406 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. The memory 406 optionally includes one or more storage devices remotely located from CPU(s) 402. The memory 406, or alternatively the non-volatile memory device(s) within the memory 406, comprises a non-transitory computer readable storage medium. In some implementations, the memory 406 or alternatively the non-transitory computer readable storage medium stores the following programs, modules and data structures, or a subset thereof:
    • an operating system 410, which includes procedures for handling various basic system services and for performing hardware dependent tasks;
    • a network communication module (or instructions) 412 for connecting the device 400 with other devices (e.g., one or more server computers) via the one or more network interfaces 404 (wired or wireless);
    • a gesture detection module 414 for detecting user gestures on the touchscreen 405 and classifying a detected user gesture as either an accidental gesture or an intentional gesture;
    • an authentication module 416 for authenticating a user on the device 400 (e.g., unlocking the device) in accordance with a detected intentional user gesture;
    • an application launching module 418 for selecting, among several different applications, an application in accordance with a detected intentional user gesture and launching the application on the device; and
    • data 420 stored on the device, which include:

      ▪ one or more confirmed gestures 422, e.g., a two-finger gesture having a letter "B" shape or a single-finger gesture having a number "4" shape; and

      ▪ a set of different applications 426, such as an email application 428-1 and a chat application 428-2.



    [0075] In some implementations, the gesture detection module 414 maintains the touchscreen 405 in an active mode to detect user gestures, when other components of the device 400 (e.g., the authentication module 416 and the application launching module 418) are in a sleep or standby mode. In some implementations, unless a successful authentication occurs, the gesture detection module 414 maintains the touchscreen 405 in a blank screen mode, e.g., to consume less power.

    [0076] In some implementations, one or more of the above identified elements are stored in one or more of the previously mentioned memory devices, and correspond to a set of instructions for performing a function described above. The above identified modules or programs (e.g., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various implementations. In some implementations, the memory 406 optionally stores a subset of the modules and data structures identified above. Furthermore, the memory 406 may store additional modules and data structures not described above.

    [0077] Embodiments of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, in tangibly-embodied computer software or firmware, in computer hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions encoded on a tangible non transitory program carrier for execution by, or to control the operation of, data processing apparatus. Alternatively or in addition, the program instructions can be encoded on an artificially generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus. The computer storage medium can be a machine-readable storage device, a machine-readable storage substrate, a random or serial access memory device, or a combination of one or more of them.

    [0078] The term "data processing apparatus" encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit). The apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.

    [0079] A computer program (which may also be referred to or described as a program, software, a software application, a module, a software module, a script, or code) can be written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, and it can be deployed in any form, including as a standalone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data, e.g., one or more scripts stored in a markup language document, in a single file dedicated to the program in question, or in multiple coordinated files, e.g., files that store one or more modules, sub-programs, or portions of code. A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.

    [0080] The processes and logic flows described in this specification can be performed by one or more programmable computers executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).

    [0081] Computers suitable for the execution of a computer program can be based, by way of example, on general or special purpose microprocessors or both, or any other kind of central processing unit. Generally, a central processing unit will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a central processing unit for performing or executing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic disks, magneto-optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device, e.g., a universal serial bus (USB) flash drive, to name just a few.

    [0082] Computer readable media suitable for storing computer program instructions and data include all forms of nonvolatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.

    [0083] To provide for interaction with a user, embodiments of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.

    [0084] Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network ("LAN") and a wide area network ("WAN"), e.g., the Internet.

    [0085] The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.

    [0086] While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.

    [0087] Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system modules and components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.

    [0088] Particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. For example, the actions recited in the claims can be performed in a different order and still achieve desirable results. As one example, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain implementations, multitasking and parallel processing may be advantageous.


    Claims

    1. A method comprising:

    detecting (202), by a device (102) having a touchscreen, a first gesture by a user on the touchscreen while the device is in a sleep mode (110);

    classifying (204) the first gesture, by the device (102), as an intentional gesture (152, 205) or an accidental gesture (112, 207) and either (i) maintaining the device (102) in the sleep mode (110) if the first gesture is classified as an accidental gesture (112, 207), or (ii) determining, by the device, whether the first gesture matches one or more confirmed gestures stored on the device if the first gesture is classified as an intentional gesture (152, 205);
    if the first gesture is classified as an intentional gesture (152), either (i) requiring (210) an additional user input to authenticate the user if the first gesture does not match one or more of the confirmed gestures stored on the device (102), or (ii) recognizing (211) the user as authenticated and selecting and launching (214) an application from a plurality of applications on the device according to the first gesture, if the first gesture matches one or more of the confirmed gestures stored on the device (102), without requiring any user input in addition to the first gesture to authenticate the user or select and launch the application.
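The single-gesture flow of claim 1 can be sketched in Python as follows. This is a hypothetical illustration only: the classifier heuristic, the gesture data shape, and all names are assumptions, not part of the claim.

```python
from dataclasses import dataclass

@dataclass
class ConfirmedGesture:
    shape: str          # e.g. a drawn character such as "C"
    application: str    # application launched when this gesture matches

@dataclass
class Device:
    confirmed_gestures: list
    authenticated: bool = False

def classify(gesture):
    # Toy stand-in for the intentional/accidental classifier: very
    # brief or very short touches are treated as accidental.
    if gesture["duration_ms"] < 100 or gesture["path_length"] < 50:
        return "accidental"
    return "intentional"

def find_confirmed_match(confirmed_gestures, gesture):
    # Match the drawn shape against the stored confirmed gestures.
    for candidate in confirmed_gestures:
        if candidate.shape == gesture["shape"]:
            return candidate
    return None

def handle_gesture(device, gesture):
    # (i) Accidental gesture: remain in sleep mode.
    if classify(gesture) == "accidental":
        return "sleep"
    # Intentional but unconfirmed: require additional user input.
    match = find_confirmed_match(device.confirmed_gestures, gesture)
    if match is None:
        return "request_additional_input"
    # (ii) Confirmed: the one gesture both authenticates and launches.
    device.authenticated = True
    return "launch:" + match.application

device = Device([ConfirmedGesture("C", "camera")])
print(handle_gesture(device, {"duration_ms": 30, "path_length": 5, "shape": ""}))      # sleep
print(handle_gesture(device, {"duration_ms": 400, "path_length": 300, "shape": "C"}))  # launch:camera
```

The key property of the claim is visible in the last branch: when the gesture matches a confirmed gesture, authentication and application selection happen on that single input, with no further prompt.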


     
    2. The method of claim 1, further comprising identifying the first gesture as a drawing of an alphanumeric character on the touchscreen, the alphanumeric character having an alphanumeric value; and optionally comprising providing the alphanumeric value as an input to the application.
     
    3. The method of claim 1, further comprising identifying the first gesture as a drawing of a predefined shape on the touchscreen.
     
    4. The method of claim 1, further comprising:
    in accordance with a determination that the first gesture is classified as an intentional gesture and the first gesture fails to match one or more of the confirmed gestures stored on the device:
    presenting (216), to the user, a password-based authentication mechanism.
     
    5. The method of claim 4, further comprising:

    determining, by the device (102), that the first gesture identifies an alphanumeric value or a predefined shape; and

    automatically providing the alphanumeric value or the predefined shape identified by the first gesture as a first portion of an input to the password-based authentication mechanism.


     
    6. The method of claim 5, further comprising:
    responsive to detecting an intentional second gesture:

    determining, by the device (102), that the second gesture identifies a second alphanumeric value or a second predefined shape on the touchscreen; and

    automatically providing the second alphanumeric value or the second predefined shape as a second portion of the input to the password-based authentication mechanism.


     
    7. The method of claim 6, further comprising: authenticating the user in accordance with the first portion of the input and the second portion of the input.
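Claims 5 to 7 describe reusing the alphanumeric values or shapes identified by successive gestures as portions of the input to the password-based authentication mechanism. A minimal sketch of that assembly follows; the function names and the plain string-equality check are illustrative assumptions.

```python
def gestures_to_password_input(gesture_values):
    # Each intentional gesture contributes the alphanumeric value (or
    # shape label) it identifies as one portion of the password input.
    return "".join(gesture_values)

def authenticate(gesture_values, stored_password):
    # The user is authenticated when the combined first and second
    # portions match the stored password (claims 5-7, simplified).
    return gestures_to_password_input(gesture_values) == stored_password

print(authenticate(["7", "3"], "73"))  # True
print(authenticate(["7", "4"], "73"))  # False
```

In other words, the gesture that failed to match a confirmed gesture is not discarded: its recognized value is carried forward automatically as the first characters of the password entry.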
     
    8. The method of claim 1, further comprising:
    in accordance with a determination that the first gesture is classified as an intentional gesture and the first gesture fails to match one or more of the confirmed gestures stored on the device (102):

    determining that the first gesture includes a fingerprint characteristic of the user;

    presenting, to the user, a fingerprint-based authentication mechanism; and

    providing the fingerprint characteristic included in the first gesture as part of an input to the fingerprint-based authentication mechanism.


     
    9. The method of claim 8, wherein identifying the intentional gesture as a confirmed gesture includes:

    identifying the intentional gesture as sufficient to authenticate a user; and

    identifying the intentional gesture as a shortcut to launch a particular application on the computing device.


     
    10. The method of claim 1, further comprising:

    collecting an intentional gesture from a user who has been authenticated; and

    identifying the intentional gesture as a confirmed gesture.


     
    11. The method of claim 1, comprising classifying (204) the first gesture, by the device, as an intentional gesture (152, 205) or an accidental gesture (112, 507) based on a set of predefined criteria that includes a characteristic associated with the first gesture or a characteristic associated with the computing device.
     
    12. The method of claim 1, further comprising:
    in accordance with a determination that the first gesture is classified as an intentional gesture and the first gesture fails to match one or more of the confirmed gestures stored on the device:
    maintaining the device in the sleep mode by

    maintaining a blank screen on the touchscreen, and

    maintaining one or more electronic components of the device other than the touchscreen in the sleep mode (110).


     
    13. The method of claim 1, further comprising:
    while detecting the first gesture on the touchscreen while the device (102) is in a sleep mode (110), displaying a visual aid tracing the first gesture on the touchscreen at a reduced power consumption level.
     
    14. A computing device (102) comprising:

    one or more processors;

    a touchscreen; and

    one or more storage units storing instructions that when executed by the one or more processors cause the computing device to perform a method according to any one of the preceding claims.


     
    15. A non-transitory computer storage medium encoded with a computer program, the computer program comprising instructions that when executed by a computing device having a touchscreen cause the computing device to perform a method according to any one of claims 1 to 13.
     







    Drawing