(19)
(11) EP 2 764 427 B1

(12) EUROPEAN PATENT SPECIFICATION

(45) Mention of the grant of the patent:
29.07.2020 Bulletin 2020/31

(21) Application number: 11767428.3

(22) Date of filing: 06.10.2011

(51) International Patent Classification (IPC):
G06F 3/048 (2013.01)
G06F 3/0488 (2013.01)
G06F 3/0486 (2013.01)

(86) International application number:
PCT/EP2011/067457

(87) International publication number:
WO 2013/050077 (11.04.2013 Gazette 2013/15)

(54) METHOD AND ELECTRONIC DEVICE FOR MANIPULATING A FIRST OR A SECOND USER INTERFACE OBJECT

VERFAHREN UND ELEKTRONISCHE VORRICHTUNG ZUR MANIPULATION EINES ERSTEN ODER ZWEITEN BENUTZERSCHNITTSTELLENOBJEKTS

PROCÉDÉ ET DISPOSITIF ÉLECTRONIQUE POUR MANIPULER UN PREMIER OU UN SECOND OBJET D'INTERFACE UTILISATEUR


(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

(43) Date of publication of application:
13.08.2014 Bulletin 2014/33

(73) Proprietor: Sony Corporation
Tokyo 108-0075 (JP)

(72) Inventor:
  • ALEXANDERSSON, Petter
    S-237 35 Bjärred (SE)

(74) Representative: Aera A/S
Gammel Kongevej 60, 18th floor
1850 Frederiksberg C (DK)


(56) References cited:
US-A1- 2005 166 159
US-A1- 2011 161 807
US-A1- 2006 070 007
US-B1- 6 590 568
  
      
    Note: Within nine months from the publication of the mention of the grant of the European patent, any person may give notice to the European Patent Office of opposition to the European patent granted. Notice of opposition shall be filed in a written reasoned statement. It shall not be deemed to have been filed until the opposition fee has been paid. (Art. 99(1) European Patent Convention).


    Description

    TECHNICAL FIELD



    [0001] The present disclosure relates to the field of user interface technologies for an electronic device comprising a touch screen. More particularly, the present disclosure relates to a method and an electronic device for manipulating a first user interface object.

    BACKGROUND



    [0002] A known user interface technology, commonly referred to as drag-and-drop, allows a user of a computer to move files between different folders in a file system in a more user-friendly manner than command-line instructions. Other uses of drag-and-drop include adding a music track to a playlist, moving text within a word processing application and more. Typically, the user uses a mouse, or a similar pointing device, connected to the computer for manipulating the file or other user interface objects, such as music tracks, text or the like.

    [0003] For the example of moving files above, the user first clicks an icon representing the file and then presses and holds a button of the mouse while pointing at the file. Next, the user drags the file to a desired location and releases the button of the mouse to complete the move of the file. In this manner, the user is able to move user interface objects in a manner similar to moving objects, such as piles of paper, binders and the like, on a desk. A disadvantage of drag-and-drop is that the user is required to press and hold the mouse button. Pressing and holding may be cumbersome while simultaneously moving the mouse.

    [0004] Another known user interface technology, commonly referred to as cut-and-paste, allows the user to move files between different folders. Additional uses of cut-and-paste include adding music tracks to a playlist, moving text within a word processing application and more.

    [0005] By means of cut-and-paste a file can be moved as follows. The user clicks, typically with a so-called left-click, on an icon representing the file and then the user clicks the file again, but this time using another kind of click, typically a so-called right-click. Now, a menu displaying for example "cut" is shown to the user. As a next step, the user cuts out the file by left-clicking on "cut". After the user has found a location to which the file is to be moved, the user right-clicks at this location. In response thereto, a menu displaying for example "paste" is shown to the user. When the user clicks on "paste", a paste action is performed, i.e. the file cut out with the "cut" command is copied to the location and deleted from its original location. In this manner, the move of the file is completed. A disadvantage of cut-and-paste is that the user needs to remember what has been cut while finding the location for the paste action.

    [0006] A known electronic device, such as a cellular phone, comprises a touch screen. By means of the touch screen, the user is able to manipulate user interface objects displayed on the touch screen. User interface technologies such as drag-and-drop and cut-and-paste may be problematic to implement in the electronic device, since the electronic device is typically operated without a mouse; the touch screen itself provides the means to manipulate user interface objects.

    [0007] The following references represent prior art relevant to the present invention.

    [0008] US 6590568 B1 relates to a method for dragging and dropping icons based on press touch.

    [0009] US2006070007 A1 discloses different types of visual feedback to assist the user when performing a drag and drop operation.

    [0010] US2005166159 A1 describes drag and drop operations involving multiple icons simultaneously.

    SUMMARY



    [0011] The present invention is defined by the independent claims. The dependent claims define further advantageous embodiments.

    [0012] An advantage is that the user is able to manipulate user interface objects in a user friendly manner, e.g. without the need for remembering contents of a clipboard of the electronic device or need for pressing and holding at a user interface object.

    [0013] It shall be noted that references made throughout this description to an "embodiment", "example", "aspect" or "configuration" may point to alternative aspects related to the invention but do not necessarily correspond to actual realisations of it. The actual embodiments of the invention fall within the scope of the appended claims. It is to be understood that other examples can be practiced and that various modifications and structural changes can be made to these embodiments without departing from the scope of the claims, as would be understood by one of ordinary skill in the art.

    BRIEF DESCRIPTION OF THE DRAWINGS



    [0014] The various aspects of embodiments disclosed herein, including particular features and advantages thereof, will be readily understood from the following detailed description and the accompanying drawings, in which:

    Fig. 1A-1I show schematic block diagrams of an exemplifying electronic device when the user picks up an item, drops the item, adds a further item and selects one out of two items for managing thereof,

    Fig. 2 shows a schematic flowchart of exemplifying methods performed by the electronic device, and

    Fig. 3 shows a schematic block diagram of an exemplifying electronic device configured to perform the methods illustrated in Fig. 2.


    DETAILED DESCRIPTION



    [0015] Throughout the following description similar reference numerals have been used to denote similar elements, network nodes, parts, items or features, when applicable. In the Figures, features that appear in some embodiments are indicated by dashed lines.

    [0016] As used herein, an electronic device may be a user equipment, a mobile phone, a cellular phone, a Personal Digital Assistant (PDA) equipped with radio communication capabilities, a smartphone, a tablet, a tablet PC, a Personal Computer (PC) with a touch screen, a portable electronic device, a portable electronic radio communication device, a touch pad or the like. Specifically, each of the exemplifying devices listed above comprises a touch screen.

    [0017] As used herein, the expression "press" is exemplified by "hard press", "a press action", "a touch of a first type" and the like. Press is herein intended to denote an action performed by the user. For example, the press action is different from a tap action. Typically, the force exerted on the touch screen is greater for the press action than for the tap action.

    [0018] As used herein, the expression "slide" indicates that, for example, a user slides a finger across the touch screen. Such a slide typically exerts a force in the same range as the tap action.

    [0019] As used herein, the expression "hint" denotes a rectangle, such as a window, that may display informative text or informative symbols to the user of the electronic device.
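
    For illustration only, the distinction between the two types of touch may be sketched in code. The following is a minimal Python sketch, assuming a touch screen driver that reports, per sample, a position and the force component perpendicular to the screen; the threshold values PRESS_THRESHOLD_N and MOVE_THRESHOLD are hypothetical tuning parameters, not part of this disclosure.

```python
from dataclasses import dataclass

# Hypothetical threshold separating a press (touch of the first type) from a
# tap or slide (touches of the second type); tuned per hardware in practice.
PRESS_THRESHOLD_N = 1.5
MOVE_THRESHOLD = 10.0  # movement (in screen units) above which a touch is a slide

@dataclass
class TouchSample:
    x: float        # position in the plane of the touch screen
    y: float
    force_n: float  # force component perpendicular to the screen (Z-force)

def classify_touch(samples: list[TouchSample]) -> str:
    """Classify a completed touch as 'press', 'tap' or 'slide'."""
    peak_force = max(s.force_n for s in samples)
    moved = (abs(samples[-1].x - samples[0].x)
             + abs(samples[-1].y - samples[0].y)) > MOVE_THRESHOLD
    if peak_force >= PRESS_THRESHOLD_N:
        return "press"                      # greater force: first type of touch
    return "slide" if moved else "tap"      # lighter force: second type of touch

# A light, moving touch is classified as a slide:
print(classify_touch([TouchSample(0, 0, 0.3), TouchSample(40, 5, 0.4)]))  # slide
```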

    [0020] Fig. 1A-1I show schematic block diagrams of an exemplifying electronic device in different situations such as when the user picks up a first item 101, drops the item 101, adds a second item 102 and selects one out of these two items for managing thereof. As an example, the first item 101 may be a file, a music track, a portion of text, an image file or the like. Generally, the first item 101 indicates user content. The user content may for example be downloaded, created or edited by the user.

    [0021] The exemplifying electronic device comprises a touch screen. The touch screen may be any type of screen, or display device, capable of detecting tactile input from a user. The electronic device 100 may display user interface objects (UIOs) on the touch screen.

    [0022] In some examples, the electronic device comprises a clipboard memory. The clipboard memory may be a separate memory or a portion of any kind of general purpose memory. The memory may be a hard disk, a magnetic storage medium, a portable computer diskette or disc, flash memory, random access memory (RAM) or the like. Furthermore, the memory may be an internal register memory of a processor comprised in the electronic device.

    [0023] In the following, non-limiting examples of the methods performed in the electronic device will be described with reference to Fig. 1A through Fig. 1I.

    [0024] In Fig. 1A, the user slides his/her finger across the touch screen of the electronic device. As the user slides his/her finger across the touch screen, a hint is displayed. The hint may comprise a text, such as "press to pick up", in order to inform the user that the first item 101 may be picked up by pressing the first item 101.

    [0025] In Fig. 1B, the electronic device 100 detects a first press on the touch screen when the user presses the first item 101. In some examples, the first item 101 shrinks to indicate that the first item 101 has been picked by the user. Metaphorically speaking, the first item 101 sticks to for example the finger of the user. The finger may of course be replaced by any other pointing device, such as a pen or the like.

    [0026] In Fig. 1C, the electronic device 100 displays a hint in response to the user tapping, by means of a first tap, the touch screen. The hint indicates, or shows, the first item 101 that was picked as described with reference to Fig. 1B. Hence, the user is reminded about what item he/she has picked up. Expressed differently, the user is reminded about what item has been stuck to the finger. Fig. 1C may show that the user slides his/her finger across the screen while the hint is displayed. The hint may follow the position indicated by the finger, i.e. the display, or showing, of the hint is continuously moved and its position is updated.

    [0027] In Fig. 1D, the electronic device 100 displays a hint. The hint may show a text, such as "press to drop", for indicating to the user that the user may press the touch screen to drop the first item 101. The hint in Fig. 1C has informed the user that the first item 101 was picked, in case the user has forgotten this. The first item 101 may be comprised in the clipboard memory of the electronic device.

    [0028] In Fig. 1E, the electronic device 100 detects a second press at the touch screen at a location where the first item 101 may be dropped. For example, if the first item 101 is a file, the location may be a folder, in which case the file is copied or moved to said folder. In case the location is occupied by an application icon representing a program installed in the electronic device, the electronic device opens the file using the program. Needless to say, the electronic device checks that the program and the file are compatible, i.e. that it is possible, and makes sense, to use the program for opening the file.

    [0029] It shall be noted that the location is within some user interface object, the category of which determines the outcome, such as drop or pick, of the second press. Categories will be explained in more detail below.

    [0030] In Fig. 1F, the addition of a second item 102 is illustrated. Typically, the action illustrated in Fig. 1F is performed after the situation shown in Fig. 1C, in which the electronic device 100 initially has picked up a first item 101. In more detail, the user has instructed, via the touch screen, the electronic device to pick up the first item 101 as in Fig. 1B. Fig. 1F shows that the electronic device 100 detects a third press at the second item 102. Since the electronic device detects that the first item 101 cannot be dropped at the second item 102, the electronic device 100 picks up the second item 102 as well in response to the third press. Now, the electronic device 100 has picked two items, i.e. the first and second items 101, 102. Again, a hint may be displayed as shown in Fig. 1G. The hint may show the contents of the clipboard memory, i.e. in this example the first and second items 101, 102.

    [0031] In Fig. 1G, the electronic device 100 displays a hint showing the first and second items 101, 102 which previously have been picked by the user. The hint may be displayed in response to the electronic device 100 detecting a second tap on the touch screen. It may be that the hint is only displayed when a location of the second tap is suitable for dropping. This action is similar to the action shown in Fig. 1C. However, it is preferred that the hint in Fig. 1G is not continuously moved. Generally, Fig. 1H illustrates that the electronic device displays a hint in response to the third press, whereby the user is able to press the one of the first and second items 101, 102 that is to be dropped.

    [0032] In Fig. 1H, the electronic device 100 detects a fourth press at the touch screen. In response to the detection of the fourth press, the electronic device 100 drops the first or second item 101, 102. The location, or position, of the fourth press, determines whether the first or second item 101, 102 is to be dropped. Expressed differently, the first item 101 is dropped when the fourth press is at, i.e. within an area of, the first item 101, and the second item 102 is dropped when the fourth press is at the second item 102.

    [0033] In Fig. 1I, the electronic device 100 drops the item, such as the first or second item 101, 102, that the user pressed by means of the fourth press in Fig. 1H. This action is similar to the action illustrated in Fig. 1E. The second tap in Fig. 1G determines where the item pressed by the fourth press in Fig. 1H is to be dropped.
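
    To make the walkthrough of Fig. 1A through Fig. 1I concrete, the following Python sketch models the pick-and-drop behaviour described above. The class names Item and Clipboard, and the string-based hint, are illustrative assumptions; the disclosure does not prescribe any particular implementation.

```python
class Item:
    """A user interface object of the first category, indicating user content."""
    def __init__(self, name: str):
        self.name = name
        self.picked = False  # the picked-up state, visually indicated by shrinking

class Clipboard:
    """Holds the items currently picked up, as shown in Fig. 1C and Fig. 1G."""
    def __init__(self):
        self.items: list[Item] = []

    def pick(self, item: Item) -> None:
        item.picked = True
        self.items.append(item)

    def hint(self) -> str:
        # The hint reminds the user what has been picked up (Fig. 1C, Fig. 1G).
        return "picked: " + ", ".join(i.name for i in self.items)

    def drop(self, item: Item, location: str) -> None:
        # Fig. 1E / Fig. 1I: a press at a drop-capable location drops the item.
        self.items.remove(item)
        item.picked = False
        print(f"dropped {item.name} at {location}")

clipboard = Clipboard()
first, second = Item("file.txt"), Item("track.mp3")
clipboard.pick(first)               # Fig. 1B: the first press picks up the item
clipboard.pick(second)              # Fig. 1F: the third press adds a second item
print(clipboard.hint())             # Fig. 1G: a tap shows the clipboard contents
clipboard.drop(second, "playlist")  # Fig. 1H-1I: the fourth press selects and drops
```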

    [0034] Fig. 2 shows a schematic flowchart of exemplifying methods performed by the electronic device 100. The electronic device 100 performs a method for manipulating a first user interface object, such as a file, an icon representing items indicating user content or the like. As mentioned, the electronic device 100 comprises a touch screen. Furthermore, the electronic device 100 is capable of detecting a first type of touch on the touch screen and a second type of touch on the touch screen. As an example, the first type of touch may be a press, a press action, a hard press, a hard touch or the like. As an example, the second type of touch may be a tap or the like. Touches of the first type exert greater forces on the touch screen than touches of the second type. The forces of the first and second types of touches are applied to the touch screen along a direction that is perpendicular to a plane of the touch screen. In the literature, this force may be referred to as a Z-force. It shall be noted that a general force applied at an arbitrary angle with respect to the plane of the touch screen always comprises a component, e.g. a force component, that is perpendicular to the plane of the touch screen.
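
    The closing remark of the paragraph above can be written out as a short worked equation. The symbols F, n and theta are introduced here purely for illustration and do not appear in the disclosure.

```latex
% Z-force component of a force F applied at an angle theta to the screen
% normal n: F_z = F . n = |F| cos(theta). For any angle at which the force
% is not parallel to the screen plane, cos(theta) != 0, so a perpendicular
% component exists and can be compared against a press threshold.
\[
  F_z = \mathbf{F}\cdot\hat{\mathbf{n}} = \lVert\mathbf{F}\rVert\cos\theta
\]
```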

    [0035] In some embodiments, the electronic device 100 is a portable electronic device or the like as mentioned above.

    [0036] In some embodiments, the first user interface object is associated to a first category of user interface objects, which user interface objects indicate user content. Examples of user interface objects indicating user content are files, music tracks, text, images, documents, sound files and the like. Hence, these examples are user interface objects of the first category. It may be noted that a key pad, or a virtual key pad, displayed on the touch screen cannot be manipulated as described herein, because a key of a key pad does not indicate user content. Moreover, such a key can typically not be moved from one location to another. Therefore, a key of a key pad is not a user interface object of the first category.

    [0037] In some embodiments, user interface objects associated to a second category of user interface objects manage user content.
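
    The two categories may be illustrated by a small sketch; the enum name UIOCategory and the example mapping below are assumptions made for illustration only.

```python
from enum import Enum, auto

class UIOCategory(Enum):
    CONTENT = auto()  # first category: indicates user content (files, tracks, text)
    MANAGER = auto()  # second category: manages user content (folders, play lists)

# Example mapping following the description; a key of a key pad belongs to
# neither category, since it neither indicates nor manages user content.
category_of = {
    "file": UIOCategory.CONTENT,
    "music track": UIOCategory.CONTENT,
    "folder": UIOCategory.MANAGER,
    "play list": UIOCategory.MANAGER,
}
```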

    [0038] The following actions may be performed. Notably, in some embodiments of the method the order of the actions may differ from what is indicated below.

    Action 201



    [0039] This action corresponds to the action illustrated in Fig. 1B, in particular the detection of the first press.

    [0040] The electronic device 100 detects a first touch of the first type, such as the first press, at the first user interface object. As an example, the first touch is at the first user interface object (UIO), when the first touch is within an area of the first user interface object. The first user interface object may be the first item 101.

    [0041] When the first touch has been detected, the first user interface object may be said to stick to, for example, the finger of the user. This is explained in action 202. Thus, the first user interface object may follow the actions of the user as he/she navigates through menus or the like in the electronic device 100. Typically, the user navigates while using touches of the second type, such as slides and taps on the touch screen.

    Action 202



    [0042] This action corresponds to the action illustrated in Fig. 1B and/or Fig. 1C.

    [0043] The electronic device 100 sets the first user interface object to a first state in response to the detection of the first touch. Expressed differently, the first user interface object is picked by the user. As an example, a reference to the first user interface object may be put in a clipboard memory of the electronic device 100. Alternatively, the actual first user interface object may be copied to the clipboard memory.

    Action 203



    [0044] This action corresponds to the action illustrated in Fig. 1B, in particular the shrinking of the item to indicate that the item has been picked.

    [0045] The electronic device 100 visually indicates the first state by altering appearance of the first user interface object. As an example, the appearance is altered by shrinking the item that has been picked up, i.e. the first user interface object.

    [0046] In some examples, a flag, or hint, may follow a finger of the user when sliding across the screen or the flag may appear at a point at which the user taps.

    Action 204



    [0047] The electronic device 100 detects a second touch of the first type at a second user interface object.

    [0048] In some examples, this action corresponds to the action illustrated in Fig. 1E, in particular the detection of the second press resulting in the dropping of the item. In these examples, the second user interface object is associated to the second category. Hence, the second user interface object corresponds to the location of the second press in these examples.

    [0049] In some examples, this action corresponds to the action illustrated in Fig. 1F, in particular the detection of the third press resulting in the addition of an item. In these examples, the second user interface object is associated to the first category. Thus, the second user interface object corresponds to the second item in these examples.

    [0050] Therefore, the second touch is the second press in some examples, and the second touch is the third press in some other examples.

    [0051] As an example, the electronic device may detect a tap within an area of the second user interface object prior to the detection of the second touch. This example corresponds to the action illustrated in Fig. 1C, in particular the showing, or display, of the hint indicating the contents of the clipboard.

    Action 205



    [0052] This action corresponds to the action illustrated in Fig. 1E and Fig. 1I.

    [0053] The electronic device 100 manipulates the first or second user interface object based on properties of the first and second user interface objects in response to the detection of the second touch. The first category and the second category are examples of properties of the first and/or second user interface object. In some examples, the electronic device 100 further manipulates the first or second user interface object based on a position of the second touch, e.g. the second user interface object may specify a destination folder when copying or moving a file.
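
    The dispatch on properties described in this action may be sketched as follows. The function name handle_press and the string categories are hypothetical illustrations of the described behaviour; they parallel, in simplified form, the sketches given earlier.

```python
def handle_press(picked: list[str], pressed_object: str, category: str) -> None:
    """Dispatch a touch of the first type based on the category of the
    user interface object under the press (action 205)."""
    if category == "content":
        picked.append(pressed_object)   # action 206 / Fig. 1F: pick it up as well
    elif category == "manager":
        item = picked.pop()             # action 211 / Fig. 1E: drop the picked item
        print(f"dropped {item} at {pressed_object}")

picked: list[str] = []
handle_press(picked, "file.txt", "content")  # first press: pick up the file
handle_press(picked, "folder", "manager")    # second press: drop into the folder
```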

    Action 206



    [0054] In some embodiments, the second user interface object is associated to the first category. In these embodiments, the manipulating 205 may be performed in that the electronic device 100 sets the second user interface object to the first state in response to the detection of the second touch. As an example, the electronic device 100 adds the second user interface object to the clipboard memory of the electronic device 100. The clipboard memory may be indicated visually as shown in for example Fig. 1C and Fig. 1G.

    [0055] In this manner, more than one user interface object may be selected and managed at a later stage, such as in action 211.

    Action 207



    [0056] This action corresponds to the action illustrated in Fig. 1B and Fig. 1C.

    [0057] The electronic device 100 visually indicates the first state by altering appearance of the second user interface object. This action is similar to action 203, but here the first state relates to the second user interface object while in action 203 the first state relates to the first user interface object.

    Action 208



    [0058] This action corresponds to the action illustrated in Fig. 1G.

    [0059] In some embodiments, the manipulating 205 is performed in that the electronic device 100 detects a third touch of the second type, such as the second tap, at a third user interface object associated to the second category. When the third user interface object is associated to the second category it means that the third user interface object is of the second category. The third user interface object is a target user interface object. Thus, dropping of the first or second user interface object is possible at the third user interface object.

    [0060] The location in Fig. 1G may be at, or within an area of, the third user interface object.

    Action 209



    [0061] This action corresponds to the action illustrated in Fig. 1G.

    [0062] The electronic device 100 displays, in response to the detection of the third touch as described in action 208, the first and second user interface objects such as to allow a user to select the first or second user interface object. The first and second user interface objects 101, 102 may be comprised in a hint displayed at the touch screen.

    Action 210



    [0063] The electronic device 100 detects a fourth touch of the first type, such as the fourth press, at the first or second user interface object, wherein the first or second user interface object is a selected user interface object.

    [0064] In some embodiments, the fourth touch may be preceded by detection of a slide across the touch screen.

    Action 211



    [0065] This action corresponds to the action illustrated in Fig. 1I.

    [0066] In some embodiments when action 211 follows directly after action 205, the second user interface object is associated to a second category of user interface objects, which user interface objects manage user content, wherein the first user interface object is a selected user interface object, and wherein the second user interface object is a target user interface object. Examples of user interface objects of the second category include drop-targets, drop-containers, folders, play-lists, execution buttons or the like.

    [0067] The selected user interface object may be determined in response to the detection of the first touch of the first type as in action 202 or in response to the detection of the fourth touch of the first type as in action 210. Thus, the electronic device 100 has detected a touch of the first type at the selected user interface object, whereby the user indicates to the electronic device that the selected user interface object is to be managed, such as moved, copied or the like.

    [0068] In some embodiments, the manipulating 205 further is performed in that the electronic device 100 manages the selected user interface object.

    [0069] In some embodiments, the selected user interface object represents a file residing in a file system for managing data on a memory comprised in the electronic device 100 and the target user interface object represents a folder of the file system. The electronic device 100 manages the file by moving the file to the folder.

    [0070] In some embodiments, the selected user interface object represents a music track and the target user interface object represents a play list. The electronic device 100 manages the music track by adding the music track to the play list.

    [0071] In some embodiments, the selected user interface object represents an application icon and the target user interface object represents a user desktop. The electronic device 100 manages the application icon by moving the application icon to a position, indicated by the target user interface object, of the user desktop.

    [0072] In some embodiments, the selected user interface object represents text in a word processing application and the target user interface object represents a location for insertion of the text. The electronic device 100 manages the text by moving the text to the location for insertion, or copying the text to the location for insertion.
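
    The four management embodiments above can be gathered into one sketch. All names below are illustrative assumptions; only the file branch uses a real library call (shutil.move from the Python standard library).

```python
import shutil
from pathlib import Path

def manage(selected: dict, target: dict) -> None:
    """Manage the selected user interface object at the target (action 211).

    Each branch mirrors one embodiment: file -> folder, music track ->
    play list, application icon -> desktop position, text -> insertion point.
    """
    kind = selected["kind"]
    if kind == "file":
        dst = Path(target["folder"]) / Path(selected["path"]).name
        shutil.move(selected["path"], dst)            # move the file to the folder
    elif kind == "music track":
        target["playlist"].append(selected["track"])  # add the track to the play list
    elif kind == "application icon":
        target["desktop"][selected["app"]] = target["position"]
    elif kind == "text":
        target["document"].insert(target["index"], selected["text"])

# Example: adding a music track to a play list.
playlist: list[str] = []
manage({"kind": "music track", "track": "song.mp3"}, {"playlist": playlist})
print(playlist)  # ['song.mp3']
```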

    [0073] Fig. 3 shows a schematic block diagram of an exemplifying electronic device configured to perform the methods illustrated in Fig. 2. The electronic device 100 is configured to manipulate a first user interface object. The electronic device 100 is capable of detecting a first type of touch on the touch screen and a second type of touch on the touch screen. Expressed differently, the electronic device 100 is configured to detect the first type of touch and the second type of touch. Touches of the first type exert greater forces on the touch screen than touches of the second type. The forces of the first and second types of touches are applied to the touch screen along a direction that is perpendicular to a plane of the touch screen.

    [0074] In some embodiments of the electronic device 100, the electronic device 100 is a portable electronic device or the like as explained above.

    [0075] The electronic device 100 comprises a touch screen 110 configured to display user interface objects.

    [0076] Furthermore, the electronic device 100 comprises a processing circuit 120 configured to manage display, or showing, of the user interface objects at the touch screen.

    [0077] The processing circuit 120 may be a processing unit, a processor, an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or the like. As an example, a processor, an ASIC, an FPGA or the like may comprise one or more processor cores.

    [0078] In some embodiments of the electronic device 100, the electronic device 100 may further comprise a memory 130, such as the clipboard memory, for storing references to the user interface object and/or a copy of the user interface object. The user interface object may be the first or second user interface object.

    [0079] The memory 130 may be used for storing software to be executed by, for example, the processing circuit. The software may comprise instructions to enable the processing circuit to perform the method in the electronic device 100 as described above in conjunction with Fig. 2. The memory 130 may be a hard disk, a magnetic storage medium, a portable computer diskette or disc, flash memory, random access memory (RAM) or the like. Furthermore, the memory may be an internal register memory of a processor.
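
    The structure of Fig. 3 may be mirrored by a simple composition of the parts named above; the class names in this sketch are hypothetical and everything hardware-specific is omitted.

```python
class TouchScreen:
    """Touch screen 110: displays user interface objects and reports touches."""
    def display(self, objects: list) -> None: ...

class ProcessingCircuit:
    """Processing circuit 120: manages display of the user interface objects
    and executes the method of Fig. 2 (software held in the memory 130)."""
    def __init__(self, screen: TouchScreen, clipboard_memory: list):
        self.screen = screen
        self.clipboard_memory = clipboard_memory  # memory 130, used as clipboard

class ElectronicDevice:
    """Electronic device 100 composed as in Fig. 3."""
    def __init__(self):
        self.touch_screen = TouchScreen()
        self.processing_circuit = ProcessingCircuit(self.touch_screen, [])
```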

    [0080] Even though embodiments of the various aspects have been described, many different alterations, modifications and the like thereof will become apparent for those skilled in the art. The described embodiments are therefore not intended to limit the scope of the present disclosure, which is defined by the appended claims.


    Claims

    1. A method in an electronic device (100), comprising
    a touch screen, for manipulating a first user interface object, wherein the electronic device (100) is capable of detecting a first type of touch on the touch screen and a second type of touch on the touch screen, wherein the forces exerted by the first type of touch on the touch screen are greater than the forces exerted by the second type of touch, wherein the forces of the first and second types of touches are applied to the touch screen along a direction that is perpendicular to a plane of the touch screen, wherein the method comprises:

    detecting a first touch of the second type corresponding to a slide on the touch screen;

    displaying, in response to the slide, a hint that the first user interface object may be picked up;

    detecting (201) a first touch of the first type at the first user interface object;

    setting (202) the first user interface object to a first internal state in response to the detection of the first touch of the first type, wherein the first user interface object is picked up;

    detecting a second touch of the second type corresponding to a tap on the touch screen;

    displaying, in response to the tap, a hint indicating or showing the first user interface object that was picked up;

    detecting (204) a second touch of the first type at a second user interface object, wherein the first user interface object is dropped at the second user interface object; and

    manipulating (205) the first or second user interface object based on properties of the first and second user interface objects in response to the detection of the second touch of the first type.


     
    2. The method according to claim 1, wherein the first user interface object is associated to a first category of user interface objects, which user interface objects indicate user content.
     
    3. The method according to claim 2, wherein the second user interface object is associated to the first category, wherein the manipulating (205) further comprises:
    setting (206) the second user interface object to the first internal state in response to the detection of the second touch.
     
    4. The method according to any one of claim 1-3, wherein user interface objects associated to a second category of user interface objects manage user content.
     
    5. The method according to claim 4, wherein the manipulating (205) further comprises:

    detecting (208) a third touch of the second type at a third user interface object associated to the second category, wherein the third user interface object is a target user interface object;

    displaying (209), in response to the detection of the third touch of the second type, the first and second user interface objects in a hint such as to allow a user to select the first or second user interface object; and

    detecting (210) a third touch of the first type at the first or second user interface object, wherein the touched first or second user interface object becomes selected.


     
    6. The method according to claim 1 or 2, wherein the second user interface object is associated to a second category of user interface objects, which user interface objects manage user content, wherein the first user interface object is a selected user interface object, and wherein the second user interface object is a target user interface object.
     
    7. The method according to claim 5 or 6, wherein the manipulating (205) further comprises:
    managing (211) the selected user interface object.
     
    8. The method according to claim 7, wherein the selected user interface object represents a file residing in a file system for managing data on a memory comprised in the electronic device (100) and the target user interface object represents a folder of the file system, wherein the managing further comprises moving the file to the folder.
     
    9. The method according to claim 7, wherein the selected user interface object represents a music track and the target user interface object represents a play list, wherein the managing further comprises adding the music track to the play list.
     
    10. The method according to claim 7, wherein the selected user interface object represents an application icon and the target user interface object represents a user desktop, wherein the managing further comprises moving the application icon to a position, indicated by the target user interface object, of the user desktop.
     
    11. The method according to claim 7, wherein the selected user interface object represents text in a word processing application and the target user interface object represents a location for insertion of the text, wherein the managing further comprises:

    moving the text to the location for insertion, or

    copying the text to the location for insertion.


     
    12. The method according to any one of claims 1-11, further comprising:
    visually indicating (203, 207) the first internal state by altering appearance of the first or second user interface object, respectively.
     
    13. The method according to any one of claims 1-12, wherein the electronic device (100) is a portable electronic device.
     
    14. An electronic device (100) for manipulating a first user interface object, wherein the electronic device is capable of detecting a first type of touch on the touch screen and a second type of touch on the touch screen, wherein the forces exerted by the first type of touch on the touch screen are greater than the forces exerted by the second type of touch, wherein the forces of the first and second types of touches are applied to the touch screen along a direction that is perpendicular to a plane of the touch screen, wherein the electronic device (100) comprises:

    a touch screen (110) configured to display user interface objects,

    a processing circuit (120) configured to manage display of the user interface objects at the touch screen, wherein the processing circuit (120) further is configured to:

    detect a first touch of the second type corresponding to a slide on the touch screen;

    display, in response to the slide, a hint that the first user interface object may be picked up;

    detect a first touch of the first type at the first user interface object;

    set the first user interface object to a first internal state in response to the detection of the first touch of the first type, wherein the first user interface object is picked up;

    detect a second touch of the second type corresponding to a tap on the touch screen;

    display, in response to the tap, a hint indicating or showing the first user interface object that was picked up;

    detect a second touch of the first type at a second user interface object, wherein the first user interface object is dropped at the second user interface object; and

    manipulate the first or second user interface object based on properties of the first and second user interface objects in response to the detection of the second touch of the first type.


     
    15. The electronic device according to claim 14, wherein the first user interface object is associated to a first category of user interface objects, which user interface objects indicate user content, wherein the second user interface object is associated to the first category, wherein the processing circuit further is configured to:
    set the second user interface object to the first internal state in response to the detection of the second touch of the first type.
     


    Ansprüche

    1. Verfahren in einer elektronischen Vorrichtung (100), Folgendes umfassend:
    einen Berührungsbildschirm zum Manipulieren eines ersten Benutzerschnittstellenobjekts, wobei die elektronische Vorrichtung (100) in der Lage ist, eine erste Art von Berührung auf dem Berührungsbildschirm und eine zweite Art von Berührung auf dem Berührungsbildschirm zu erkennen, wobei die durch die erste Art von Berührung auf den Berührungsbildschirm ausgeübten Kräfte größer sind als die durch die zweite Art von Berührung auf den Berührungsbildschirm ausgeübten Kräfte, wobei die Kräfte der ersten und der zweiten Art von Berührung auf dem Berührungsbildschirm entlang einer Richtung angewandt werden, die in einem rechten Winkel zu einer Ebene des Berührungsbildschirms stehen, wobei das Verfahren Folgendes umfasst:

    Erkennen einer ersten Berührung der zweiten Art, die einer Wischbewegung auf dem Berührungsbildschirm entspricht;

    Anzeigen eines Hinweises darauf, dass das erste Benutzerschnittstellenobjekt aufgehoben werden kann, in Reaktion auf die Wischbewegung;

    Erkennen (201) einer ersten Berührung der ersten Art an dem ersten Benutzerschnittstellenobjekt;

    Setzen (202) des ersten Benutzerschnittstellenobjekts in einen ersten internen Zustand in Reaktion auf das Erkennen der ersten Berührung der ersten Art, wobei das erste Benutzerschnittstellenobjekt aufgehoben wird;

    Erkennen einer zweiten Berührung der zweiten Art, die einem Antippen des Berührungsbildschirms entspricht;

    Anzeigen eines Hinweises, der das erste Benutzerschnittstellenobjekt, das aufgehoben wurde, angibt oder zeigt, in Reaktion auf das Antippen;

    Erkennen (204) einer zweiten Berührung der ersten Art an einem zweiten Benutzerschnittstellenobjekt, wobei das erste Benutzerschnittstellenobjekt an dem zweiten Benutzerschnittstellenobjekt fallen gelassen wird; und

    Manipulieren (205) des ersten oder des zweiten Benutzerschnittstellenobjekts, basierend auf Eigenschaften des ersten und des zweiten Benutzerschnittstellenobjekts, in Reaktion auf das Erkennen der zweiten Berührung der ersten Art.


     
    2. Verfahren nach Anspruch 1, wobei das erste Benutzerschnittstellenobjekt einer ersten Kategorie von Benutzerschnittstellenobjekten zugeordnet ist, wobei die Benutzerschnittstellenobjekte Benutzerinhalte anzeigen.
     
    3. Verfahren nach Anspruch 2, wobei das zweite Benutzerschnittstellenobjekt der ersten Kategorie zugeordnet ist, wobei das Manipulieren (205) ferner Folgendes umfasst: Setzen (206) des zweiten Benutzerschnittstellenobjekts in den ersten internen Zustand in Reaktion auf das Erkennen der zweiten Berührung.
     
    4. Verfahren nach einem der Ansprüche 1-3, wobei Benutzerschnittstellenobjekte, die einer zweiten Kategorie von Benutzerschnittstellenobjekten zugeordnet sind, Benutzerinhalte verwalten.
     
    5. Verfahren nach Anspruch 4, wobei das Manipulieren (205) ferner Folgendes umfasst:

    Erkennen (208) einer dritten Berührung der zweiten Art an einem dritten Benutzerschnittstellenobjekt, das der zweiten Kategorie zugeordnet ist, wobei das dritte Benutzerschnittstellenobjekt ein Zielbenutzerschnittstellenobjekt ist;

    Anzeigen (209) des ersten und des zweiten Benutzerschnittstellenobjekts in einem Hinweis, sodass ein Benutzer das erste oder das zweite Benutzerschnittstellenobjekt auswählen kann, in Reaktion auf das Erkennen der dritten Berührung der zweiten Art; und

    Erkennen (210) einer dritten Berührung der ersten Art an dem ersten oder dem zweiten Benutzerschnittstellenobjekt, wobei das berührte erste oder zweite Benutzerschnittstellenobjekt ausgewählt wird.


     
    6. Verfahren nach Anspruch 1 oder 2, wobei das zweite Benutzerschnittstellenobjekt einer zweiten Kategorie von Benutzerschnittstellenobjekten zugeordnet ist, wobei die Benutzerschnittstellenobjekte Benutzerinhalte verwalten, wobei das erste Benutzerschnittstellenobjekt ein ausgewähltes Benutzerschnittstellenobjekt ist, und wobei das zweite Benutzerschnittstellenobjekt ein Zielbenutzerschnittstellenobjekt ist.
     
    7. Verfahren nach Anspruch 5 oder 6, wobei das Manipulieren (205) ferner Folgendes umfasst:
    Verwalten (211) des ausgewählten Benutzerschnittstellenobjekts.
     
    8. Verfahren nach Anspruch 7, wobei das ausgewählte Benutzerschnittstellenobjekt eine Datei darstellt, die sich in einem Dateisystem befindet, das dem Verwalten von Daten auf einem Speicher dient, den die elektronische Vorrichtung (100) umfasst, und wobei das Zielbenutzerschnittstellenobjekt einen Ordner des Dateisystems darstellt, wobei das Verwalten ferner ein Bewegen der Datei in den Ordner umfasst.
     
    9. Verfahren nach Anspruch 7, wobei das ausgewählte Benutzerschnittstellenobjekt einen Musiktitel und das Zielbenutzerschnittstellenobjekt eine Wiedergabeliste darstellt, wobei das Verwalten ferner ein Hinzufügen des Musiktitels zu der Wiedergabeliste umfasst.
     
    10. Verfahren nach Anspruch 7, wobei das ausgewählte Benutzerschnittstellenobjekt ein Anwendungssymbol und das Zielbenutzerschnittstellenobjekt einen Benutzerdesktop darstellt, wobei das Verwalten ferner ein Bewegen des Anwendungssymbols zu einer Position auf dem Benutzerdesktop umfasst, die durch das Zielbenutzerschnittstellenobjekt angezeigt ist.
     
    11. Verfahren nach Anspruch 7, wobei das ausgewählte Benutzerschnittstellenobjekt einen Text in einer Textverarbeitungsanwendung und das Zielbenutzerschnittstellenobjekt eine Stelle zum Einfügen des Textes darstellt, wobei das Verwalten ferner Folgendes umfasst: Bewegen des Textes zu der Stelle zum Einfügen, oder Kopieren des Textes zu der Stelle zum Einfügen.
     
    12. Verfahren nach einem der Ansprüche 1-11, ferner Folgendes umfassend:
    visuelles Anzeigen (203, 207) des ersten internen Zustands durch ein Verändern des Aussehens jeweils des ersten oder des zweiten Benutzerschnittstellenobjekts.
     
    13. Verfahren nach einem der Ansprüche 1-12, wobei die elektronische Vorrichtung (100) eine tragbare elektronische Vorrichtung ist.
     
    14. Elektronische Vorrichtung (100) zum Manipulieren eines ersten Benutzerschnittstellenobjekts, wobei die elektronische Vorrichtung in der Lage ist, eine erste Art von Berührung auf dem Berührungsbildschirm und eine zweite Art von Berührung auf dem Berührungsbildschirm zu erkennen, wobei die durch die erste Art von Berührung auf den Berührungsbildschirm ausgeübten Kräfte größer sind als die durch die zweite Art von Berührung auf den Berührungsbildschirm ausgeübten Kräfte, wobei die Kräfte der ersten und der zweiten Art von Berührung auf dem Berührungsbildschirm entlang einer Richtung angewandt werden, die in einem rechten Winkel zu einer Ebene des Berührungsbildschirms stehen, wobei die elektronische Vorrichtung (100) Folgendes umfasst:

    einen Berührungsbildschirm (110), der dazu konfiguriert ist, Benutzerschnittstellenobjekte anzuzeigen,

    eine Verarbeitungsschaltung (120), die dazu konfiguriert ist, das Anzeigen der Benutzerschnittstellenobjekte auf dem Berührungsbildschirm zu verwalten, wobei die Verarbeitungsschaltung (120) ferner zu Folgendem konfiguriert ist:

    Erkennen einer ersten Berührung der zweiten Art, die einer Wischbewegung auf dem Berührungsbildschirm entspricht;

    Anzeigen eines Hinweises darauf, dass das erste Benutzerschnittstellenobjekt aufgehoben werden kann, in Reaktion auf die Wischbewegung;

    Erkennen einer ersten Berührung der ersten Art an dem ersten Benutzerschnittstellenobjekt;

    Setzen des ersten Benutzerschnittstellenobjekts in einen ersten internen Zustand in Reaktion auf das Erkennen der ersten Berührung der ersten Art, wobei das erste Benutzerschnittstellenobjekt aufgehoben wird;

    Erkennen einer zweiten Berührung der zweiten Art, die einem Antippen des Berührungsbildschirms entspricht;

    Anzeigen eines Hinweises, der das erste Benutzerschnittstellenobjekt, das aufgehoben wurde, angibt oder zeigt, in Reaktion auf das Antippen;

    Erkennen einer zweiten Berührung der ersten Art an einem zweiten Benutzerschnittstellenobjekt, wobei das erste Benutzerschnittstellenobjekt an dem zweiten Benutzerschnittstellenobjekt fallen gelassen wird; und Manipulieren des ersten oder des zweiten Benutzerschnittstellenobjekts, basierend auf Eigenschaften des ersten und des zweiten Benutzerschnittstellenobjekts, in Reaktion auf das Erkennen der zweiten Berührung der ersten Art.


     
    15. Elektronische Vorrichtung nach Anspruch 14, wobei das erste Benutzerschnittstellenobjekt einer ersten Kategorie von Benutzerschnittstellenobjekten zugeordnet ist, wobei die Benutzerschnittstellenobjekte Benutzerinhalte anzeigen, wobei das zweite Benutzerschnittstellenobjekt der ersten Kategorie zugeordnet ist, wobei die Verarbeitungsschaltung ferner zu Folgendem konfiguriert ist:
    Setzen des zweiten Benutzerschnittstellenobjekts in den ersten internen Zustand in Reaktion auf das Erkennen der zweiten Berührung der ersten Art.
     


    Revendications

    1. Procédé dans un dispositif électronique (100), comprenant : un écran tactile, pour manipuler un premier objet d'interface utilisateur, dans lequel le dispositif électronique (100) est capable de détecter un premier type de toucher sur l'écran tactile et un deuxième type de toucher sur l'écran tactile, dans lequel les forces exercées par le premier type de toucher sur l'écran tactile sont supérieures aux forces exercées par le deuxième type de toucher, dans lequel les forces des premier et deuxième types de toucher sont appliquées sur l'écran tactile le long d'une direction perpendiculaire à un plan de l'écran tactile, dans lequel le procédé comprend :

    la détection d'un premier toucher du deuxième type correspondant à un glissement sur l'écran tactile ;

    l'affichage, en réponse au glissement, d'une indication que le premier objet d'interface utilisateur peut être récupéré ;

    la détection (201) d'un premier toucher du premier type au niveau du premier objet d'interface utilisateur ;

    le réglage (202) du premier objet d'interface utilisateur sur un premier état interne en réponse à la détection du premier toucher du premier type, dans lequel le premier objet d'interface utilisateur est récupéré ;

    la détection d'un deuxième toucher du deuxième type correspondant à un clic sur l'écran tactile ;

    l'affichage, en réponse au clic, d'une indication indiquant ou montrant le premier objet d'interface utilisateur qui a été récupéré ;

    la détection (204) d'un deuxième toucher du premier type au niveau d'un deuxième objet d'interface utilisateur, dans lequel le premier objet d'interface utilisateur est déposé au niveau du deuxième objet d'interface utilisateur ; et

    la manipulation (205) du premier ou du deuxième objet d'interface utilisateur sur la base de propriétés des premier et deuxième objets d'interface utilisateur en réponse à la détection du deuxième toucher du premier type.


     
    2. Procédé selon la revendication 1, dans lequel le premier objet d'interface utilisateur est associé à une première catégorie d'objets d'interface utilisateur, lesquels objets d'interface utilisateur indiquent un contenu d'utilisateur.
     
    3. Procédé selon la revendication 2, dans lequel le deuxième objet d'interface utilisateur est associé à la première catégorie, dans lequel la manipulation (205) comprend en outre : le réglage (206) du deuxième objet d'interface utilisateur sur le premier état interne en réponse à la détection du deuxième toucher.
     
    4. Procédé selon l'une quelconque des revendications 1 à 3, dans lequel les objets d'interface utilisateur associés à une seconde catégorie d'objets d'interface utilisateur gèrent le contenu d'utilisateur.
     
    5. Procédé selon la revendication 4, dans lequel la manipulation (205) comprend en outre :

    la détection (208) d'un troisième toucher du deuxième type au niveau d'un troisième objet d'interface utilisateur associé à la seconde catégorie, dans lequel le troisième objet d'interface utilisateur est un objet d'interface utilisateur cible ;

    l'affichage (209), en réponse à la détection du troisième toucher du deuxième type, des premier et deuxième objets d'interface utilisateur dans une indication de manière à permettre à un utilisateur de sélectionner le premier ou le deuxième objet d'interface utilisateur ; et

    la détection (210) d'un troisième toucher du premier type au niveau du premier ou du deuxième objet d'interface utilisateur, dans lequel le premier ou le deuxième objet d'interface utilisateur touché est sélectionné.


     
    6. Procédé selon la revendication 1 ou 2, dans lequel le deuxième objet d'interface utilisateur est associé à une seconde catégorie d'objets d'interface utilisateur, lesquels objets d'interface utilisateur gèrent le contenu d'utilisateur, dans lequel le premier objet d'interface utilisateur est un objet d'interface utilisateur sélectionné, et dans lequel le deuxième objet d'interface utilisateur est un objet d'interface utilisateur cible.
     
    7. Procédé selon la revendication 5 ou 6, dans lequel la manipulation (205) comprend en outre :
    la gestion (211) de l'objet d'interface utilisateur sélectionné.
     
    8. Procédé selon la revendication 7, dans lequel l'objet d'interface utilisateur sélectionné représente un fichier résidant dans un système de fichiers pour gérer des données sur une mémoire comprise dans le dispositif électronique (100) et l'objet d'interface utilisateur cible représente un dossier du système de fichiers, dans lequel la gestion comprend en outre le déplacement du fichier vers le dossier.
     
    9. Procédé selon la revendication 7, dans lequel l'objet d'interface utilisateur sélectionné représente une piste musicale et l'objet d'interface utilisateur cible représente une liste de lecture, dans lequel la gestion comprend en outre l'ajout de la piste musicale à la liste de lecture.
     
    10. Procédé selon la revendication 7, dans lequel l'objet d'interface utilisateur sélectionné représente une icône d'application et l'objet d'interface utilisateur cible représente un bureau d'utilisateur, dans lequel la gestion comprend en outre le déplacement de l'icône d'application vers une position, indiquée par l'objet d'interface utilisateur cible, du bureau d'utilisateur.
     
    11. Procédé selon la revendication 7, dans lequel l'objet d'interface utilisateur sélectionné représente du texte dans une application de traitement de texte et l'objet d'interface utilisateur cible représente un emplacement pour l'insertion du texte, dans lequel la gestion comprend en outre :

    le déplacement du texte à l'emplacement d'insertion, ou

    la copie du texte à l'emplacement d'insertion.


     
    12. Procédé selon l'une quelconque des revendications 1 à 11, comprenant en outre :
    l'indication visuelle (203, 207) du premier état interne en modifiant l'apparence du premier ou du deuxième objet d'interface utilisateur, respectivement.
     
    13. Procédé selon l'une quelconque des revendications 1 à 12, dans lequel le dispositif électronique (100) est un dispositif électronique portable.
     
    14. Dispositif électronique (100) pour manipuler un premier objet d'interface utilisateur, dans lequel le dispositif électronique est capable de détecter un premier type de toucher sur l'écran tactile et un deuxième type de toucher sur l'écran tactile, dans lequel les forces exercées par le premier type de toucher sur l'écran tactile sont supérieures aux forces exercées par le deuxième type de toucher, dans lequel les forces des premier et deuxième types de toucher sont appliquées sur l'écran tactile le long d'une direction perpendiculaire à un plan de l'écran tactile, dans lequel le dispositif électronique (100) comprend :

    un écran tactile (110) configuré pour afficher des objets d'interface utilisateur,

    un circuit de traitement (120) configuré pour gérer l'affichage des objets d'interface utilisateur au niveau de l'écran tactile, dans lequel le circuit de traitement (120) est en outre configuré pour :

    détecter un premier toucher du deuxième type correspondant à un glissement sur l'écran tactile ;

    afficher, en réponse au glissement, une indication que le premier objet d'interface utilisateur peut être récupéré ;

    détecter un premier toucher du premier type au niveau du premier objet d'interface utilisateur ;

    régler le premier objet d'interface utilisateur sur un premier état interne en réponse à la détection du premier toucher du premier type, dans lequel le premier objet d'interface utilisateur est récupéré ;

    détecter un deuxième toucher du deuxième type correspondant à un clic sur l'écran tactile ;

    afficher, en réponse au clic, une indication indiquant ou montrant le premier objet d'interface utilisateur qui a été récupéré ;

    détecter un deuxième toucher du premier type au niveau d'un deuxième objet d'interface utilisateur, dans lequel le premier objet d'interface utilisateur est déposé au niveau du deuxième objet d'interface utilisateur ; et

    manipuler le premier ou le deuxième objet d'interface utilisateur sur la base de propriétés des premier et deuxième objets d'interface utilisateur en réponse à la détection du deuxième toucher du premier type.


     
    15. Dispositif électronique selon la revendication 14, dans lequel le premier objet d'interface utilisateur est associé à une première catégorie d'objets d'interface utilisateur, lesquels objets d'interface utilisateur indiquent un contenu d'utilisateur, dans lequel le deuxième objet d'interface utilisateur est associé à la première catégorie, dans lequel le circuit de traitement est en outre configuré pour :
    régler le deuxième objet d'interface utilisateur sur le premier état interne en réponse à la détection du deuxième toucher du premier type.
     




    Drawing

    Cited references

    REFERENCES CITED IN THE DESCRIPTION



    This list of references cited by the applicant is for the reader's convenience only. It does not form part of the European patent document. Even though great care has been taken in compiling the references, errors or omissions cannot be excluded and the EPO disclaims all liability in this regard.

    Patent documents cited in the description

    • US 6590568 B1 [0008]
    • US 2006070007 A1 [0009]
    • US 2005166159 A1 [0010]