(19)
(11)EP 3 198 394 B1

(12)EUROPEAN PATENT SPECIFICATION

(45)Mention of the grant of the patent:
06.05.2020 Bulletin 2020/19

(21)Application number: 16755789.1

(22)Date of filing:  05.02.2016
(51)International Patent Classification (IPC): 
G06F 3/0488(2013.01)
G06F 3/0484(2013.01)
G06F 3/0482(2013.01)
G06F 3/048(2013.01)
G06F 3/0486(2013.01)
G06F 3/0481(2013.01)
(86)International application number:
PCT/KR2016/001290
(87)International publication number:
WO 2016/137139 (01.09.2016 Gazette  2016/35)

(54)

METHOD AND DEVICE FOR MANAGING ITEMS

VERFAHREN UND VORRICHTUNG ZUR VERWALTUNG VON ELEMENTEN

PROCÉDÉ ET DISPOSITIF DE GESTION D'ARTICLES


(84)Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

(30)Priority: 26.02.2015 GB 201503224
25.09.2015 KR 20150137100

(43)Date of publication of application:
02.08.2017 Bulletin 2017/31

(73)Proprietor: Samsung Electronics Co., Ltd.
Suwon-si, Gyeonggi-do 16677 (KR)

(72)Inventor:
  • CRITCHLOW, Stephen Paul
    Staines Middlesex TW18 4QE (GB)

(74)Representative: Walaski, Jan Filip 
Venner Shipley LLP
200 Aldersgate
London EC1A 4HD (GB)


(56) References cited:
WO-A1-2014/106495
US-A1- 2010 058 182
US-A1- 2012 030 628
US-A1- 2013 174 069
KR-A- 20140 079 939
US-A1- 2011 059 759
US-A1- 2013 167 090
US-B1- 8 954 887
  
  • Anonymous: "iPhone User Guide for iOS 5.1", , 3 December 2014 (2014-12-03), XP055398540, Retrieved from the Internet: URL:https://manuals.info.apple.com/MANUALS /1000/MA1622/en_US/iphone_ios5_user_guide. pdf [retrieved on 2017-08-14]
  • Anonymous: "Apple - Support - Search", , 14 August 2017 (2017-08-14), XP055398543, Retrieved from the Internet: URL:https://support.apple.com/kb/index?pag e=search&type=organic&src=support_searchbo x_main&locale=en_US&q=ios+5.1 [retrieved on 2017-08-14]
  
Note: Within nine months from the publication of the mention of the grant of the European patent, any person may give notice to the European Patent Office of opposition to the European patent granted. Notice of opposition shall be filed in a written reasoned statement. It shall not be deemed to have been filed until the opposition fee has been paid. (Art. 99(1) European Patent Convention).


Description

Technical Field



[0001] Methods and apparatuses consistent with exemplary embodiments relate to managing items and, more particularly, to moving an item into a container object.

Background Art



[0002] With advances in multimedia and data processing technologies, devices have become capable of executing a large number of applications and processing a variety of information. Accordingly, devices display many items related to various applications, image files, and text files on screens of the devices. To efficiently manage these items, the devices arrange the items by displaying a graphic user interface (GUI) having a directory structure.

[0003] Traditional GUIs used in desktop computers execute an operation of moving a plurality of items into a folder based on combinations of various keys. However, these traditional GUIs are not suitable for devices, such as smartphones, tablet computers, and wearable devices, in which user input means are limited. In addition, as the screen size and resolution of a device increase, more items are displayed on the screen of the device. Therefore, there is a need for a method capable of promptly and efficiently managing various items. WO 2014/106495 A1 discloses a method for adding an application icon in a folder, comprising dragging the folder and adding an application in a capturing region of the folder to the folder when a preset shift-in triggering event is detected.

Disclosure of Invention


Technical Problem



[0004] There is a need for a method capable of promptly and efficiently managing various items.

Solution to Problem



[0005] There is provided a device according to claim 1, and a method according to claim 12.

Advantageous Effects of Invention



[0006] Exemplary embodiments can manage various items promptly and efficiently.

Brief Description of Drawings



[0007] These and/or other aspects will become apparent and more readily appreciated from the following description of exemplary embodiments, taken in conjunction with the accompanying drawings in which:

FIG. 1 is a diagram for describing an example in which a device moves an item into a container object, according to an exemplary embodiment;

FIG. 2 is a flowchart of a method of moving an item into a container object in a device, according to an exemplary embodiment;

FIG. 3 is a flowchart of a method of moving an item into a container object in a device, according to another exemplary embodiment;

FIG. 4 is a diagram of a screen of a device according to an exemplary embodiment;

FIG. 5 is a diagram for describing an example in which a device moves an item into a container object, according to another exemplary embodiment;

FIGS. 6A to 6C are diagrams for describing an example in which a device distinguishably displays an item to be moved into a container object, according to one or more exemplary embodiments;

FIG. 7 is a diagram for describing an example in which a device displays a state in which an item is moved into a container object, according to an exemplary embodiment;

FIG. 8A is a diagram for describing an example in which a device determines the number of items to be moved into a container object based on a strength of a user input, according to an exemplary embodiment;

FIG. 8B is a diagram for describing an example in which a device determines the number of items to be moved into a container object based on a thickness of a connector, according to an exemplary embodiment;

FIGS. 9A and 9B are diagrams for describing an example in which a device turns a page, according to one or more exemplary embodiments;

FIG. 10 is a diagram for describing an example in which a device displays a menu for managing a container object, according to an exemplary embodiment;

FIG. 11 is a diagram for describing an example in which a device displays a container object and a plurality of items in a small size, according to an exemplary embodiment;

FIG. 12 is a flowchart of a method of moving a plurality of items into a container object in a device, according to another exemplary embodiment;

FIG. 13 is a flowchart of a method of moving a plurality of items into a container object in a device, according to another exemplary embodiment;

FIG. 14 is a diagram for describing an example in which a device receives a user input forming a loop, according to an exemplary embodiment;

FIG. 15 is a diagram for describing an example in which a device distinguishably displays a process of moving items into a container object, according to another exemplary embodiment;

FIGS. 16A and 16B are diagrams for describing an example in which a device changes a shape of a connector, according to one or more exemplary embodiments;

FIG. 17 is a flowchart of a method of changing a shape of a connector in a device, according to an exemplary embodiment;

FIG. 18 is a diagram for describing an example in which a device changes a shape of a connector, according to another exemplary embodiment; and

FIGS. 19 to 21 are block diagrams of a device according to one or more exemplary embodiments.


Best Mode for Carrying out the Invention



[0008] Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented exemplary embodiments. In any case, the main embodiment is defined by the independent claims. Additional embodiments are defined by the dependent claims. Embodiments which do not fall within the scope of the claims do not form part of the invention.

[0009] According to an aspect of an exemplary embodiment, there is provided a device including: a display configured to display a container object and a plurality of items that are movable into the container object; a user input device configured to receive a user input; and a controller configured to, in response to the user input indicating a first gesture moving from the container object to a current position of the user input, control the display to display a first connector along a first path of the first gesture, determine a first item of the plurality of items, the first item being located within a threshold distance from an end of the first connector, and move the first item into the container object.

[0010] The controller may be further configured to, in response to the user input indicating a second gesture forming a loop, control the display to display a second connector along a second path of the second gesture, determine a second item of the plurality of items enclosed by the loop, and move the second item into the container object.

[0011] The controller may be further configured to determine the second item in response to at least one of: all portions of the second item being enclosed by the loop, or a threshold portion or more of the second item being enclosed by the loop.

[0012] The controller may be further configured to control the display to display the second item moving into the container object along the second connector based on a predetermined animation effect.

[0013] The controller may be further configured to determine a third item of the plurality of items within the threshold distance from the current position for a predetermined time, and move the third item into the container object.

[0014] The controller may be further configured to control the display to distinguishably display the first item.

[0015] The controller may be further configured to control the display to display the first item moving into the container object along the first connector.

[0016] The container object and the plurality of items may be displayed on a first page of a plurality of pages, and the controller may be further configured to, in response to the user input corresponding to an edge of the first page, control the display to display a second page of the plurality of pages.

[0017] The controller may be further configured to change a shape of the connector during the receiving of the user input.

[0018] The controller may be further configured to shorten the connector according to a predetermined criterion during the receiving of the user input.

[0019] The controller may be further configured to control the display to display a menu for managing the container object in response to receiving a user input of touching the container object for a threshold time.

[0020] The controller may be further configured to reduce a displayed size of the container object and the plurality of items during the displaying of the menu.

[0021] According to an aspect of another exemplary embodiment, there is provided a method of managing an item, the method including: displaying a container object and a plurality of items that are movable to the container object; receiving a user input; displaying, in response to the received user input indicating a first gesture moving from the container object to a current position, a first connector along a first path of the first gesture; determining a first item of the plurality of items within a threshold distance from an end of the first connector; and moving the first item into the container object.

[0022] The method may further include: displaying, in response to the received user input indicating a second gesture forming a loop, a second connector along a second path of the second gesture; determining a second item of the plurality of items enclosed by the loop; and moving the second item into the container object.

[0023] The determining of the second item may include determining at least one of whether all portions of the second item are enclosed by the loop and whether at least a threshold portion of the second item is enclosed by the loop.

[0024] The method may further include: determining a third item of the plurality of items within the threshold distance from the current position for a predetermined time; and moving the third item into the container object.

[0025] The method may further include distinguishably displaying the first item.

[0026] The method may further include displaying the first item moving into the container object along the first connector.

[0027] The container object and the plurality of items may be displayed on a first page of a plurality of pages; and the method may further include displaying a second page of the plurality of pages in response to the user input corresponding to an edge of the first page.

[0028] The method may further include changing a shape of the first connector during the receiving of the user input.

[0029] According to an aspect of yet another exemplary embodiment, there is provided a device including: a touchscreen display configured to display a container object and a plurality of items; and a controller configured to, in response to a user input indicating a gesture originating from the container object, determine a facing direction of the user input, determine at least one item of the plurality of items within a threshold distance of the user input and corresponding to the facing direction, control the touchscreen display to display an indicator corresponding to the determined at least one item, and, in response to the user input stopping, move the determined at least one item into the container object.

[0030] The threshold distance may be adjustable according to a user selection.

Mode for the Invention



[0031] Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. In this regard, exemplary embodiments may have different forms and should not be construed as being limited to the descriptions set forth herein. Accordingly, exemplary embodiments are described below, by referring to the figures, to explain aspects of the present inventive concept. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items. Expressions such as "at least one of" when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.

[0032] The terms used in the present disclosure will be described briefly and exemplary embodiments will then be described in detail.

[0033] The terms used in the present disclosure are those terms currently used in the art in consideration of functions in regard to the inventive concept, but the terms may vary according to the intention of those of ordinary skill in the art, precedents, or new technology in the art. Also, specified terms may be selected by the applicant, and in this case, the detailed meaning thereof will be described in the detailed description of the inventive concept. Thus, the terms used in the present disclosure should be understood based on the meaning of the terms and the overall description of the inventive concept.

[0034] It will also be understood that the terms "comprises", "includes", and "has", when used herein, specify the presence of stated elements, but do not preclude the presence or addition of other elements, unless otherwise defined.

[0035] Also, the terms "unit" and "module" used herein represent a unit for processing at least one function or operation, which may be implemented by hardware, software, or a combination of hardware and software.

[0036] In the specification, the term "container object" may mean a user interface (UI) object capable of containing a plurality of items in a device 1000. The container object may be a folder or may be a different type of user interface object.

[0037] In the specification, the term "item" may mean an object displayed on a screen of the device 1000 to execute an application, an image file such as a photograph, or a text file, and may be, for example, an icon, an image, or a text.

[0038] Hereinafter, exemplary embodiments will be described in detail with reference to the accompanying drawings.

[0039] FIG. 1 is a diagram for describing an example in which the device 1000 moves an item into a container object, according to an exemplary embodiment.

[0040] The device 1000 according to the exemplary embodiment may be implemented in various forms. Examples of the device 1000 may include a mobile phone, a smartphone, a personal computer (PC), a tablet PC, a personal digital assistant (PDA), a smart television, a laptop computer, a media player, an MP3 player, a portable multimedia player (PMP), a digital camera, a kiosk, a navigation device, a global positioning system (GPS) device, an e-book reader, a digital broadcasting terminal, a micro server, another mobile device, a non-mobile computing device, and a home appliance, such as a refrigerator or washing machine including a display device, but are not limited thereto. In addition, the device 1000 may include a wearable device, such as a watch, glasses, a hair band, or a ring, which has a communication function and a data processing function. However, the device 1000 is not limited thereto. The device 1000 may include any type of device capable of receiving an object from a server through a network and executing the received object.

[0041] Referring to FIG. 1, the device 1000 may display a container object and a plurality of items on a screen. In addition, the device 1000 may receive a user input of touching and dragging the container object. In addition, the device 1000 may display a connector on the screen in response to the user input, the connector being connected from the container object to a current position of the user input. Furthermore, the device 1000 may move an item of the plurality of items on the screen, the item being located within a predetermined distance from an end of the connector, into the container object.

[0042] FIG. 2 is a flowchart of a method of moving an item into a container object in a device 1000, according to an exemplary embodiment.

[0043] In operation S210, the device 1000 may display a container object and a plurality of items on a screen. The item may be an object displayed on the screen of the device 1000 to execute an application, an image file, and/or a text file. For example, the item may be an icon, an image, or a text.

[0044] When a file corresponding to the item is an application, the device 1000 may execute the application corresponding to the item based on a user input of touching the item.

[0045] In addition, a software screen (for example, a home screen, a lock screen, or an application screen) of the device 1000 may include a plurality of pages, and the container object and the plurality of items may be displayed on each of the pages. The device 1000 may display one page or may display other pages in response to a user input of turning the displayed page.

[0046] In operation S220, the device 1000 may receive a user input of touching and dragging the container object. The device 1000 may receive the user input of touching the container object and dragging the container object to an item to be moved into the container object.

[0047] In operation S230, the device 1000 may display a connector in response to the user input. The device 1000 may display the connector along a movement path of the user input. For example, the connector may be displayed along a locus of a touching and dragging input.

[0048] In addition, the device 1000 may change a shape of the connector while receiving the user input. The device 1000 may change the shape of the connector such that a distance between two ends of the connector is minimized. The two ends of the connector may be ends of the connector, which are respectively located on the container object and at a current position of the user input.

[0049] In operation S240, the device 1000 may move an item, which is located within a predetermined distance from the end of the connector, into the container object. While the connector is displayed on the screen, the device 1000 may determine whether there is an item located within the predetermined distance from the end of the connector. The predetermined distance may be changed by a user setting. According to one or more exemplary embodiments, the device 1000 may move only an item overlapping the end of the connector into the container object.
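
As a non-limiting illustration of operation S240, the following Kotlin sketch shows one way the proximity test might be implemented; the Point and Item types, the pixel values, and the main() example are hypothetical and are not taken from the specification.

```kotlin
import kotlin.math.hypot

// Hypothetical, simplified model of an on-screen item: a centre point and an identifier.
data class Point(val x: Float, val y: Float)
data class Item(val id: Int, val center: Point)

/**
 * Returns the items lying within [threshold] pixels of the connector end,
 * mirroring operation S240: while the connector is displayed, the device
 * checks whether any displayed item is close enough to the free end.
 */
fun itemsNearConnectorEnd(
    items: List<Item>,
    connectorEnd: Point,
    threshold: Float            // the "predetermined distance"; may be changed by a user setting
): List<Item> =
    items.filter { item ->
        hypot(item.center.x - connectorEnd.x, item.center.y - connectorEnd.y) <= threshold
    }

fun main() {
    val items = listOf(Item(103, Point(120f, 300f)), Item(104, Point(240f, 300f)))
    val end = Point(230f, 310f)
    println(itemsNearConnectorEnd(items, end, threshold = 48f).map { it.id })  // prints [104]
}
```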

[0050] FIG. 3 is a flowchart of a method of moving an item into a container object in a device 1000, according to another exemplary embodiment.

[0051] Because operations S310 to S330 are substantially the same as operations S210 to S230 of FIG. 2, redundant descriptions thereof will be omitted.

[0052] In operation S340, the device 1000 may determine whether an item is located in the vicinity of an end of a connector. According to one or more exemplary embodiments, the device 1000 may determine whether the end of the connector overlaps the item or may determine whether the item is located within a predetermined distance from the end of the connector.

[0053] When the item is not located in the vicinity of the end of the connector, the device 1000 may return to operation S320 and may receive a new user input.

[0054] In operation S340, when there is an item located in the vicinity of the end of the connector, the device 1000 may perform operation S350.

[0055] In operation S350, the device 1000 may determine whether the item in the vicinity of the end of the connector satisfies a selection criterion. At this time, the item satisfying the selection criterion may be, for example, an item that is located within the predetermined distance from the end of the connector for a predetermined time. The predetermined time may be preset by a manufacturer of the device 1000 or may be set to be changed by a user. However, the selection criterion may be changed according to one or more exemplary embodiments and is not limited to the example described above.

[0056] In operation S360, the device 1000 may select the item satisfying the selection criterion as an item to be moved into the container object.

[0057] In operation S370, when the user input continues, the device 1000 may return to operation S320 and may receive a new user input. For example, when the device 1000 does not determine that the user input is ended, the device 1000 may receive the new user input. However, when the user input is ended, the device 1000 may move the selected item into the container object.

[0058] When the device 1000 selects the item to be moved, even while the user input continues, the device 1000 may directly move the selected item into the container object. According to one or more exemplary embodiments, after the user input is ended, the device 1000 may move the selected items into the container object at one time. In this case, before the device 1000 moves the selected items to the container object, the device 1000 may additionally receive a user input of confirming the selected items. For example, the device 1000 may display, on a screen, a message inquiring whether the currently selected items are allowed to be moved into the container object and may receive the user input of confirming the currently selected items as the items to be moved into the container object.
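
A minimal Kotlin sketch of the dwell-based selection criterion of operations S340 to S360, under the assumption that the device samples the touch position periodically; the class and parameter names are illustrative only.

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)
data class Item(val id: Int, val center: Point)

/**
 * Dwell-based selection criterion: an item is selected once it has stayed
 * within [thresholdDistance] of the connector end for at least [dwellMillis].
 * [update] is expected to be called for every new touch sample.
 */
class DwellSelector(
    private val thresholdDistance: Float,
    private val dwellMillis: Long
) {
    private val firstSeenNear = mutableMapOf<Int, Long>()   // item id -> time it entered the vicinity
    val selected = mutableSetOf<Int>()                       // ids chosen to move into the container

    fun update(items: List<Item>, connectorEnd: Point, nowMillis: Long) {
        for (item in items) {
            val near = hypot(item.center.x - connectorEnd.x,
                             item.center.y - connectorEnd.y) <= thresholdDistance
            if (!near) { firstSeenNear.remove(item.id); continue }
            val since = firstSeenNear.getOrPut(item.id) { nowMillis }
            if (nowMillis - since >= dwellMillis) selected += item.id
        }
    }
}
```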

[0059] FIG. 4 is a diagram of a screen 1001 of a device 1000 according to an exemplary embodiment.

[0060] Referring to FIG. 4, the device 1000 may display a container object 400 and a plurality of items 101-111 on the screen 1001. As illustrated in FIG. 4, the container object 400 may be a folder, but is not limited thereto.

[0061] The plurality of items displayed on the screen 1001 may be an object displayed on the screen 1001 of the device 1000 to execute an application, an image file, and/or a text file and may be, for example, an icon, an image, or a text. For example, referring to FIG. 4, the plurality of items may include items 101 to 104 related to images, items 105 and 106 related to messages such as short message service (SMS) messages or e-mails, items 107 to 109 related to sounds, and items 110 and 111 related to texts but are not limited thereto.

[0062] As illustrated in FIG. 4, the items 101 to 111 displayed on the screen 1001 may be displayed having a rectangular shape, but are not limited thereto.

[0063] In response to a user input of touching an item, the device 1000 may execute a file corresponding to the touched item. For example, in the case of the items 110 and 111 related to the texts, in response to a user input of touching an item, the device 1000 may open a text file corresponding to the touched item. In addition, in the case of the items 105 and 106 related to the e-mails, in response to a user input of touching an item, the device 1000 may open an application corresponding to the touched item.

[0064] Furthermore, in the device 1000, a software screen (for example, a home screen, a lock screen, or an application screen) may include a plurality of pages, and the container object 400 and a plurality of items may be displayed on each of the pages. For example, a plurality of items displayed on the first page may be different from a plurality of items displayed on the second page. The device 1000 may display one page or may display other pages in response to a user input of turning the displayed page.

[0065] FIG. 5 is a diagram for describing an example in which a device 1000 moves an item into a container object 400, according to another exemplary embodiment.

[0066] Referring to FIG. 5, the device 1000 may display a connector 500 on a screen 1001 in response to a user input of touching and dragging the container object 400, the connector 500 being connected from the container object 400 to a current position 501 of the user input. At this time, the connector 500 may be displayed along a movement path of the user input. For example, the connector 500 may be displayed along a locus of a touching and dragging input. Referring to FIG. 5, the device 1000 may receive the user input of touching the container object 400, dragging the container object 400 downward toward an item 103, and dragging the container object 400 rightward toward an item 104. As illustrated in FIG. 5, the connector 500 may be displayed along a locus of the touching and dragging input.

[0067] In addition, the device 1000 may change a shape of the connector 500 in response to the user input. For example, the device 1000 may change a length of the connector 500 such that a distance between two ends 501 and 502 of the connector 500 is minimized, but the exemplary embodiment is not limited thereto.

[0068] In addition, the device 1000 may determine whether there is an item located within a predetermined distance from the end 501 of the connector 500. The predetermined distance may be changed by a user setting.

[0069] In addition, the device 1000 may move the item, which is located within the predetermined distance from the end 501 of the connector 500, into the container object 400. According to one or more exemplary embodiments, the device 1000 may move only an item overlapping the end 501 of the connector 500 into the container object 400.

[0070] FIGS. 6A to 6C are diagrams for describing an example in which a device 1000 distinguishably displays an item to be moved into a container object 400, according to one or more exemplary embodiments.

[0071] The device 1000 may display the item, of the plurality of items, that is to be moved into the container object 400 in such a manner that it is distinguishable from the other items. For example, as illustrated in FIGS. 6A and 6B, the device 1000 may change a shape of an item 104 to be moved into the container object 400 and may display the changed shape. The device 1000 may display the item 104 to be moved into the container object 400 in such a manner that it appears to shake on a screen. However, the distinguished displaying is not limited to the examples described above.

[0072] In addition, as illustrated in FIG. 6C, the device 1000 may highlight the item 104 to be moved into the container object 400.

[0073] Furthermore, when the device 1000 selects the item 104 to be moved to the container object 400, the device 1000 may separately use a non-visual effect or may use the non-visual effect together with the visual effects described above.

[0074] The non-visual effect may include, for example, a haptic effect or a sound effect. As a user input starts, the device 1000 may vibrate, allowing a user touching the device to feel a vibration. At this time, as the end 501 of the connector 500 approaches the item 104, an intensity of the vibration may increase.

[0075] The device 1000 may use various non-visual effects in addition to the haptic effect and the sound effect, but is not limited to the examples described above.

[0076] FIG. 7 is a diagram for describing an example in which a device 1000 displays a state in which an item 104 is moved into a container object 400, according to an exemplary embodiment.

[0077] As illustrated in FIG. 7, the device 1000 may display the state in which the selected item 104 is moved into the container object 400 along a connector 500. For example, the selected item 104 may be displayed, through an animated effect, as an oval shape 701 that is moved into the container object 400 along the connector 500.
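
The movement along the connector 500 may be animated by interpolating a position along the connector path. The following Kotlin sketch assumes the connector is stored as a polyline of sampled points; this representation and the function names are assumptions, not part of the specification.

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

/**
 * Returns the position of an indicator (e.g. the oval shape 701) that has
 * travelled a fraction [t] in [0, 1] along the connector path, so the moved
 * item can be drawn sliding from its slot into the container object.
 */
fun positionAlongConnector(path: List<Point>, t: Float): Point {
    require(path.size >= 2) { "connector needs at least two points" }
    val lengths = path.zipWithNext { a, b -> hypot(b.x - a.x, b.y - a.y) }
    val target = lengths.sum() * t.coerceIn(0f, 1f)
    var travelled = 0f
    for ((i, segment) in lengths.withIndex()) {
        if (travelled + segment >= target && segment > 0f) {
            val f = (target - travelled) / segment
            val (a, b) = path[i] to path[i + 1]
            return Point(a.x + (b.x - a.x) * f, a.y + (b.y - a.y) * f)
        }
        travelled += segment
    }
    return path.last()
}
```

The direction of travel depends only on the ordering of the path points, so the same routine can animate the indicator from the item toward the container object or in the opposite direction.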

[0078] In addition, as the item 104 is moved into the container object 400, the device 1000 may faintly display the item 104. Accordingly, a user may confirm which item has been moved into the container object 400. According to one or more exemplary embodiments, the device 1000 may not display the item 104 moved into the container object 400 on a screen 1001.

[0079] FIG. 8A is a diagram for describing an example in which a device 1000 moves items into a container object 400 according to an intensity of a user input, according to another exemplary embodiment. The intensity of the user input may be an intensity of a pressure applied to a screen 1001 of the device 1000 by the user input or may be a time in which the user input is located at a point 501 (end of a connector 500) on the screen 1001.

[0080] For example, when the user input is located at the point 501 on the screen 1001 for a predetermined time or more, the device 1000 may move a plurality of items, which are located within a predetermined distance from the end 501 of the connector 500, into the container object 400. As the user input is located at the point 501 on the screen 1001 for the predetermined time or more, the device 1000 may display a selection region 810 capable of moving the plurality of items into the container object 400 at a time. At this time, the longer the time the user input is located at the point 501, the larger the selection region 810 may be.

[0081] In addition, when the pressure applied to the screen 1001 of the device 1000 by the user input increases, the device 1000 may display the selection region 810 capable of moving the plurality of items into the container object 400 at a time.

[0082] Referring to FIG. 8A, when the user input is located in the vicinity of an item 104 for a predetermined time or more, the device 1000 may display the selection region 810 on the basis of the point 501 at which the user input is located. As illustrated in FIG. 8A, the selection region 810 may have a circular shape, but is not limited thereto.

[0083] The device 1000 may move items included in the selection region 810 into the container object 400 at one time. At this time, each of the items to be moved into the container object 400 may be an item, all portions of which are included in the selection region 810. Alternatively, according to one or more exemplary embodiments, an item of which a predetermined portion or more is included in the selection region 810 may be moved into the container object 400. For example, referring to FIG. 8A, when the device 1000 selects an item of which about 50% or more is included in the selection region 810, the device 1000 may move three items 103, 104, and 107 into the container object 400.
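
A hedged Kotlin sketch of one way to decide whether an item belongs to the selection region 810: the fraction of the item's rectangular bounds inside the circular region is estimated by grid sampling and compared with a threshold such as the 50% of the example above. The grid-based approximation is an implementation choice of this sketch, not a requirement of the specification.

```kotlin
import kotlin.math.hypot

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

/**
 * Estimates, by sampling a grid of points, what fraction of an item's
 * rectangular bounds falls inside the circular selection region 810.
 */
fun fractionInsideRegion(item: Rect, cx: Float, cy: Float, radius: Float, steps: Int = 20): Float {
    var inside = 0
    for (i in 0 until steps) {
        for (j in 0 until steps) {
            val x = item.left + (item.right - item.left) * (i + 0.5f) / steps
            val y = item.top + (item.bottom - item.top) * (j + 0.5f) / steps
            if (hypot(x - cx, y - cy) <= radius) inside++
        }
    }
    return inside.toFloat() / (steps * steps)
}

// Items meeting the fraction threshold would be moved into the container object at one time.
fun shouldMove(item: Rect, cx: Float, cy: Float, radius: Float, minFraction: Float = 0.5f) =
    fractionInsideRegion(item, cx, cy, radius) >= minFraction
```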

[0084] FIG. 8B is a diagram for describing an example in which a device 1000 determines the number of items to be moved into a container object 400 based on a thickness of a connector 500, according to another exemplary embodiment.

[0085] The device 1000 may variably set the thickness of the connector 500 according to a user setting. A plurality of items may be simultaneously moved into the container object 400 by setting the connector 500 to have a large thickness.

[0086] For example, referring to FIG. 8B, the device 1000 may move both an item 104 located within a predetermined distance from an end 501 of the connector 500 and an item 105 located adjacent to the item 104 into the container object 400.

[0087] The adjacent item 105 may be selected based on a direction in which the end 501 of the connector 500 faces. Referring to FIG. 8B, the end 501 of the connector 500 faces rightward. Therefore, the device 1000 may move the item 104, which is located within the predetermined distance from the end 501 of the connector 500, and the item 105, which is adjacent to the item 104, into the container object 400. However, the selecting of an adjacent item is not limited to the example described above.
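
A possible Kotlin sketch of selecting the adjacent item based on the facing direction of the end 501 of the connector 500; approximating the facing direction by the most recent drag movement vector and scoring candidates with a cosine test are assumptions of this sketch.

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)
data class Item(val id: Int, val center: Point)

/**
 * Picks the neighbouring item that lies closest to the direction the connector
 * end is facing, among items within [maxDistance] of the end.
 */
fun adjacentItemInFacingDirection(
    items: List<Item>,
    connectorEnd: Point,
    previousEnd: Point,          // an earlier drag sample, used to derive the facing direction
    maxDistance: Float
): Item? {
    val dx = connectorEnd.x - previousEnd.x
    val dy = connectorEnd.y - previousEnd.y
    val len = hypot(dx, dy)
    if (len == 0f) return null
    return items
        .filter { hypot(it.center.x - connectorEnd.x, it.center.y - connectorEnd.y) <= maxDistance }
        .maxByOrNull { item ->
            // cosine of the angle between the facing direction and the direction to the item
            val ix = item.center.x - connectorEnd.x
            val iy = item.center.y - connectorEnd.y
            val ilen = hypot(ix, iy)
            if (ilen == 0f) 1f else (dx * ix + dy * iy) / (len * ilen)
        }
}
```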

[0088] FIGS. 9A and 9B are diagrams for describing an example in which a device 1000 turns a page, according to one or more exemplary embodiments.

[0089] FIG. 9A is a diagram of a first page 901 displaying a container object 400 and a plurality of items on a screen 1001 of the device 1000.

[0090] When a user input is located at an edge of the first page 901, the device 1000 may display page turning regions 911 and 912 at the right side and the left side of the first page 901, respectively.

[0091] When the user input is located at the page turning regions 911 and 912 for a predetermined time, the device 1000 may display other pages adjacent to the first page 901 on the screen 1001. When the user input is located at the page turning region 911 at the right side of the first page 901, the device 1000 may display a second page 902 adjacent to the right side of the first page 901.

[0092] FIG. 9B is a diagram of the second page 902 on the screen 1001 of the device 1000. Referring to FIG. 9B, the second page 902 may display a plurality of items different from the plurality of items displayed on the first page 901.

[0093] In the same manner as illustrated in FIG. 9A, when the user input is located at the page turning region 912 at the left side of the second page 902 for a predetermined time, the device 1000 may re-display the first page 901 on the screen 1001.

[0094] Also, when the user input is located at the page turning region 912 at the left side of the first page 901, the device 1000 may display a third page adjacent to the left side of the first page 901.
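
A minimal Kotlin sketch of the dwell-based page turning of FIGS. 9A and 9B; the region width, the dwell time, and the return convention are assumptions rather than values taken from the specification.

```kotlin
/**
 * Dwell-based page turning: when the drag position stays inside the left or
 * right page-turning region for [dwellMillis], the adjacent page is shown.
 */
class PageTurner(
    private val screenWidth: Float,
    private val regionWidth: Float = 48f,
    private val dwellMillis: Long = 700
) {
    private var enteredAt: Long? = null
    private var side: Int = 0          // -1 = left region 912, +1 = right region 911, 0 = neither

    /** Returns -1 to turn to the previous page, +1 to turn to the next page, or 0 to stay. */
    fun onDragSample(x: Float, nowMillis: Long): Int {
        val newSide = when {
            x <= regionWidth -> -1
            x >= screenWidth - regionWidth -> 1
            else -> 0
        }
        if (newSide != side) { side = newSide; enteredAt = if (newSide == 0) null else nowMillis }
        val since = enteredAt ?: return 0
        if (side != 0 && nowMillis - since >= dwellMillis) {
            enteredAt = nowMillis          // restart the dwell so pages do not flip continuously
            return side
        }
        return 0
    }
}
```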

[0095] FIG. 10 is a diagram for describing an example in which a device 1000 displays a menu 1010 for managing a container object 400, according to an exemplary embodiment.

[0096] When the device 1000 receives a user input of touching the container object 400 for a predetermined time, the device 1000 may display the menu 1010 for managing the container object 400.

[0097] Referring to FIG. 10, the menu 1010 may include options such as "ADD", "MOVE", or "DELETE", but is not limited thereto.

[0098] The device 1000 may select an item to be moved into the container object 400 in response to a user input of selecting the "ADD" option. In addition, the device 1000 may move the container object 400 to a new position on a screen 1001 in response to a user input of selecting the "MOVE" option. Furthermore, the device 1000 may delete the container object 400 in response to a user input of touching the "DELETE" option.

[0099] As illustrated in FIG. 10, the device 1000 may display the menu 1010 in a list form, but exemplary embodiments are not limited thereto.

[0100] FIG. 11 is a diagram for describing an example in which a device 1000 displays a container object 400 and a plurality of items in a small size, according to an exemplary embodiment.

[0101] Referring to FIG. 11, the device 1000 may display the plurality of items in a small size on a screen 1001 in response to a user input of selecting "ADD". Accordingly, the device 1000 may allow a user to easily perform an input of touching and dragging the container object 400.

[0102] The container object 400 and the items to be moved into the container object 400 may be displayed on different pages. For example, the container object 400 and the items to be moved into the container object 400 may be displayed on a first page and a second page, respectively.

[0103] In this case, to receive the user input, the device 1000 may turn from the first page to the second page. At this time, when the items are displayed in a smaller size on the screen 1001, the user input of touching the container object 400 and dragging the container object 400 to the items to be moved may be easily performed.

[0104] When the items to be moved are located at an edge of the page, the user input of touching the container object 400 and dragging the container object 400 to the items to be moved may not be easily performed. At this time, when the items are displayed in a small size on the screen 1001, the items located on the edge of the page may also be easily selected.

[0105] FIG. 12 is a flowchart of a method of moving a plurality of items into a container object in a device 1000, according to another exemplary embodiment.

[0106] In operation S1210, the device 1000 may display the container object and the plurality of items on a screen.

[0107] In operation S1220, the device 1000 may receive a user input of making a connector form a loop. At this time, the user input may form the loop to enclose items to be moved into the container object.

[0108] In operation S1230, the device 1000 may display the connector connected from the container object to a current position of the user input in response to the user input. The device 1000 may display the connector along a locus of a touching and dragging input. Therefore, when the user input forming the loop is received, the connector forming the loop may be displayed.

[0109] In addition, the device 1000 may change a shape of the connector during the receiving of the user input.

[0110] In operation S1240, the device 1000 may move the items enclosed by the loop into the container object. At this time, each of the items enclosed by the loop may be an item, all portions of which are enclosed by the loop. Alternatively, each of the items determined to be enclosed by the loop may be an item, a predetermined portion or more of which is enclosed by the loop.

[0111] For example, the device 1000 may move only items which are entirely enclosed by the loop into the container object. According to one or more exemplary embodiments, the device 1000 may move items of which a predetermined portion or more is enclosed by the loop into the container object. The predetermined portion may be defined as a predetermined percentage of a total area of an item. In a case of a three-dimensional user interface (3D UI), the predetermined portion may be defined as a percentage of a total volume of an item.
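
As a non-limiting illustration, the following Kotlin sketch estimates the enclosed percentage of an item's area with a ray-casting point-in-polygon test over a sampling grid; the grid-based approximation and the type names are assumptions of this sketch.

```kotlin
data class Point(val x: Float, val y: Float)
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

/** Standard ray-casting test: is point [p] inside the closed polygon [polygon]? */
fun insidePolygon(p: Point, polygon: List<Point>): Boolean {
    var inside = false
    var j = polygon.lastIndex
    for (i in polygon.indices) {
        val a = polygon[i]
        val b = polygon[j]
        if ((a.y > p.y) != (b.y > p.y) &&
            p.x < (b.x - a.x) * (p.y - a.y) / (b.y - a.y) + a.x
        ) inside = !inside
        j = i
    }
    return inside
}

/** Fraction of the item's rectangular bounds enclosed by the loop, by grid sampling. */
fun enclosedFraction(item: Rect, loop: List<Point>, steps: Int = 20): Float {
    var inside = 0
    for (i in 0 until steps) for (j in 0 until steps) {
        val p = Point(
            item.left + (item.right - item.left) * (i + 0.5f) / steps,
            item.top + (item.bottom - item.top) * (j + 0.5f) / steps
        )
        if (insidePolygon(p, loop)) inside++
    }
    return inside.toFloat() / (steps * steps)
}

// An item is moved into the container object when, e.g., 60% or more of its area is enclosed.
fun isEnclosed(item: Rect, loop: List<Point>, threshold: Float = 0.6f) =
    enclosedFraction(item, loop) >= threshold
```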

[0112] In addition, even if the connector does not completely form a closed loop, when the device 1000 receives a user input satisfying a predetermined condition, the device 1000 may automatically complete the closed loop. For example, although a portion of the loop formed by the connector remains open, when an end of the connector approaches another portion of the connector within a predetermined distance, the device 1000 may make the connector automatically form the loop. According to one or more exemplary embodiments, when the end of the connector comes into contact with or overlaps another portion of the connector, the device 1000 may make the connector automatically form the loop. Therefore, the user may more conveniently select the items to be moved into the container object.
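
A hedged Kotlin sketch of the automatic loop completion described above: when the free end of the connector comes within a predetermined distance of an earlier portion of its own path, the loop is treated as closed. The number of trailing samples that are ignored is an assumption of this sketch.

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

/**
 * Returns the closed loop as a polygon if the path end is within [closeDistance]
 * of an earlier portion of the path, or null if the loop is not yet closed.
 */
fun tryAutoCloseLoop(path: List<Point>, closeDistance: Float, ignoreLast: Int = 5): List<Point>? {
    if (path.size <= ignoreLast + 1) return null
    val end = path.last()
    // Skip the most recent samples so the end is not trivially "close" to itself.
    for (i in 0 until path.size - ignoreLast) {
        val p = path[i]
        if (hypot(end.x - p.x, end.y - p.y) <= closeDistance) {
            // Close the loop over the enclosed portion of the path.
            return path.subList(i, path.size) + p
        }
    }
    return null   // not close enough yet; keep receiving the user input
}
```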

[0113] FIG. 13 is a flowchart of a method of moving a plurality of items into a container object in a device 1000, according to another exemplary embodiment.

[0114] Because operations S1310 to S1330 are substantially the same as operations S310 to S330 of FIG. 3, redundant descriptions thereof will be omitted.

[0115] In operation S1340, the device 1000 may determine whether the connector displayed in response to the user input forms a loop.

[0116] In operation S1350, the device 1000 may determine whether one or more items are enclosed by the loop.

[0117] In operation S1360, the device 1000 may move the items enclosed by the loop into the container object. However, when there is no item enclosed by the loop, the device 1000 may return to operation S1320 and may receive a new user input.

[0118] In operation S1370, the device 1000 may determine whether the user input continues. When the user input continues, the device 1000 may receive a new user input and may additionally select items to be moved into the container object. When the user input ends, the device 1000 may move the selected items into the container object.

[0119] FIG. 14 is a diagram for describing an example in which a device 1000 receives a user input forming a loop, according to an exemplary embodiment.

[0120] The device 1000 may receive a user input of touching the container object 400 and forming the loop. In addition, the device 1000 may display a connector 500 having a loop shape in response to the user input. At this time, the connector 500 may indicate a plurality of items.

[0121] The device 1000 may move an item indicated by the loop into the container object 400. According to one or more exemplary embodiments, the device 1000 may move items entirely or partially enclosed by the loop into the container object 400.

[0122] Referring to FIG. 14, the connector 500 forming the loop encloses a portion of each of items 107, 108, 110, and 111. For example, when 60% or more of an item is enclosed by the loop, the item is moved into the container object 400. In this example, the device 1000 may move the items 107, 108, 110, and 111 into the container object 400.

[0123] FIG. 15 is a diagram for describing an example in which a device 1000 distinguishably displays a state in which items are moved into a container object 400, according to another exemplary embodiment.

[0124] Referring to FIG. 15, the device 1000 may display a state in which the items 107, 108, 110 and 111 are moved into the container object 400 along a connector 500. For example, as illustrated in FIG. 15, the device 1000 may display a state in which four oval shapes 1510 to 1540 are moved into the container object 400 along the connector 500, but the exemplary embodiment is not limited thereto. The four oval shapes 1510 to 1540 may respectively mean the items 107, 108, 110, and 111, which are moved into the container object 400.

[0125] At this time, as illustrated in FIG. 15, the device 1000 may stop displaying the items 107, 108, 110, and 111, which are moved into the container object 400. According to one or more exemplary embodiments, the device 1000 may faintly display the items 107, 108, 110, and 111, which are moved into the container object 400.

[0126] FIGS. 16A and 16B are diagrams for describing an example in which a device 1000 changes a shape of a connector 500, according to one or more exemplary embodiments.

[0127] Referring to FIGS. 16A and 16B, the device 1000 may receive a user input forming a loop.

[0128] The device 1000 may display the connector 500 connected from a container object 400 to a current position of the user input in response to the user input. The device 1000 may display the connector 500 along a movement path of the user input. For example, the connector 500 may be displayed along a locus of a touching and dragging input. Therefore, the connector 500 may be displayed in various shapes depending on the movement path of the user input.

[0129] At this time, the device 1000 may change the shape of the connector 500 while receiving the user input. The device 1000 may change the shape of the connector 500 such that a distance between two ends 501 and 502 of the connector 500 is shortened. For example, as illustrated in FIG. 16A, the device 1000 may gradually reduce a size of the loop, thereby decreasing the distance between the two ends 501 and 502 of the connector 500.

[0130] When the user input is located at the same position for a predetermined time or more, the device 1000 may change the shape of the connector 500 such that the distance between the two ends 501 and 502 of the connector 500 is minimized. In this case, as illustrated in FIG. 16B, the device 1000 may change the shape of the connector 500 into a shape of a straight line that connects the two ends 501 and 502 of the connector 500.

[0131] Accordingly, when the user input continues at the same location for a predetermined time or more, the device 1000 may prevent the connector 500 from occupying a predetermined portion or more of a screen 1001 by changing the shape of the connector 500. In addition, by changing a length of the connector 500, the device 1000 may prevent a loop unintended by the user from being formed and an unintended item from being moved into the container object 400.

[0132] Furthermore, the device 1000 may preset a time that is spent to change the shape of the connector 500. A manufacturer of the device 1000 may preprogram a time spent to change the shape of the connector 500 such that the distance between the two ends 501 and 502 of the connector 500 is minimized, or a user may set the time.

[0133] For example, the device 1000 may allow the user to select the time spent to change the shape of the connector 500 in the range of about 0.5 seconds to 5 seconds. However, the set time is not limited to the example described above.

[0134] A short time for changing the shape of the connector 500 may mean that, although the connector 500 displayed on the screen 1001 is initially long, the length of the connector 500 becomes shorter within a very short time. Therefore, a short time for the device 1000 to change the shape of the connector 500 may correspond to the connector 500 having high elasticity.

[0135] FIG. 17 is a flowchart of a method of changing a shape of a connector 500 in a device 1000, according to an exemplary embodiment.

[0136] A path of the connector 500 on a screen 1001 may be expressed by coordinate values of a plurality of points. The coordinate values constituting the connector 500 will be referred to as a connector coordinate set. In this case, the connector coordinate set may include coordinate values of one or more points existing along the path of the connector 500. The device 1000 may add, to the connector coordinate set, coordinate values of all of the points sensed on the screen 1001 while receiving a user input. Alternatively, the device 1000 may add, to the connector coordinate set, coordinate values of some of the points sensed on the screen 1001 during the receiving of the user input. For example, the device 1000 may add, to the connector coordinate set, coordinate values of every N-th point of the points sensed on the screen 1001 during the receiving of the user input.
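
A minimal Kotlin sketch of building the connector coordinate set by keeping every N-th sensed point; the sampling interval shown is an arbitrary example, not a value from the specification.

```kotlin
data class Point(val x: Float, val y: Float)

/**
 * Maintains the "connector coordinate set": the list of coordinate values that
 * describe the path of the connector 500, keeping only every
 * [samplingInterval]-th touch sample.
 */
class ConnectorCoordinateSet(private val samplingInterval: Int = 3) {
    private var sampleCount = 0
    private val points = mutableListOf<Point>()

    fun onTouchSample(x: Float, y: Float) {
        if (sampleCount % samplingInterval == 0) points += Point(x, y)
        sampleCount++
    }

    fun coordinates(): List<Point> = points.toList()
}
```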

[0137] In operation S1710, the device 1000 may determine a straight line that connects the two ends 501 and 502 of the connector 500 displayed in response to the user input. For example, the straight line may be expressed by a straight-line equation having a form of y=ax+b in a coordinate system defined by the connector coordinate set. However, the expression is not limited to the straight-line equation described above.

[0138] In operation S1720, the device 1000 may initialize a value of a counter i to 1.

[0139] In operation S1730, the device 1000 may calculate the shortest distance between the straight line connecting the two ends 501 and 502 of the connector 500 and a point corresponding to an i-th coordinate value constituting the connector coordinate set. The shortest distance may be the perpendicular distance from the point corresponding to the i-th coordinate value to the straight line connecting the two ends 501 and 502 of the connector 500.

[0140] In operation S1740, the device 1000 may determine a distance by which the i-th coordinate value is to be moved, based on the shortest distance calculated in operation S1730. For example, the distance by which the i-th coordinate value is to be moved may be determined as a fixed ratio of the shortest distance calculated in operation S1730, but is not limited thereto.

[0141] In operation S1750, the device 1000 may determine whether a value of the counter i is equal to N (the number of coordinate values constituting the connector coordinate set). When the value of the counter i is equal to N, it may be considered that a distance by which each of the coordinate values is to be moved has been calculated. However, when the value of the counter i is not equal to N, the device 1000 may increase the value of the counter i by 1 (operation S1760) and perform operation S1730 for the updated i-th coordinate value.

[0142] In operation S1770, the device 1000 may move all of the coordinate values toward the straight line connecting the two ends 501 and 502 of the connector 500 by the distances determined in operation S1740. The device 1000 may move a coordinate value more quickly as the distance determined for it in operation S1740 is longer. Therefore, the device 1000 may set the movement times of all of the coordinate values constituting the connector coordinate set to be equal.

[0143] In operation S1780, the device 1000 may determine whether a new path defined by the connector coordinate set is equal to the straight line that connects the two ends 501 and 502 of the connector 500.

[0144] Although the new path is not necessarily a perfect straight line, when the new path is recognized as having a predetermined correlation with a straight line, the device 1000 may end the changing of the shape of the connector 500. For example, the device 1000 may calculate a correlation coefficient between the coordinate values constituting the connector coordinate set and the equation defining the straight line that connects the two ends 501 and 502 of the connector 500. When the calculated correlation coefficient becomes equal to or greater than a predetermined critical value, the device 1000 may end the changing of the shape of the connector 500. Therefore, it is possible to prevent excessive use of resources of the device 1000 for changing the shape of the connector 500.
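
The following Kotlin sketch gathers operations S1710 to S1780 into one straightening step and one stopping test. The perpendicular-distance formula implements the shortest distance of operation S1730; the fixed movement ratio and the interpretation of the correlation test as a Pearson coefficient between the x and y coordinates of the path are assumptions of this sketch.

```kotlin
import kotlin.math.abs
import kotlin.math.hypot
import kotlin.math.sqrt

data class Point(val x: Double, val y: Double)

/** Perpendicular (shortest) distance from [p] to the straight line through [a] and [b]. */
fun perpendicularDistance(p: Point, a: Point, b: Point): Double {
    val len = hypot(b.x - a.x, b.y - a.y)
    if (len == 0.0) return hypot(p.x - a.x, p.y - a.y)
    return abs((b.y - a.y) * p.x - (b.x - a.x) * p.y + b.x * a.y - b.y * a.x) / len
}

/** Foot of the perpendicular from [p] onto the line through [a] and [b]. */
fun footOfPerpendicular(p: Point, a: Point, b: Point): Point {
    val dx = b.x - a.x; val dy = b.y - a.y
    val len2 = dx * dx + dy * dy
    if (len2 == 0.0) return a
    val t = ((p.x - a.x) * dx + (p.y - a.y) * dy) / len2
    return Point(a.x + t * dx, a.y + t * dy)
}

/** One straightening step: each interior point moves a fixed [ratio] of its distance toward the line. */
fun straightenStep(connector: List<Point>, ratio: Double = 0.3): List<Point> {
    if (connector.size < 3) return connector
    val a = connector.first(); val b = connector.last()
    return connector.mapIndexed { i, p ->
        if (i == 0 || i == connector.lastIndex) {
            p
        } else {
            // Operation S1730: shortest distance from the point to the straight line.
            val dist = perpendicularDistance(p, a, b)
            if (dist == 0.0) {
                p
            } else {
                // Operation S1740: move the point by a fixed ratio of that distance toward the line.
                val foot = footOfPerpendicular(p, a, b)
                val move = ratio * dist
                Point(p.x + (foot.x - p.x) / dist * move, p.y + (foot.y - p.y) / dist * move)
            }
        }
    }
}

/** Stopping test of operation S1780, interpreted here as |Pearson correlation| of x and y. */
fun isNearlyStraight(connector: List<Point>, criticalValue: Double = 0.99): Boolean {
    val n = connector.size
    if (n < 3) return true
    val mx = connector.sumOf { it.x } / n
    val my = connector.sumOf { it.y } / n
    var sxy = 0.0; var sxx = 0.0; var syy = 0.0
    for (p in connector) {
        sxy += (p.x - mx) * (p.y - my)
        sxx += (p.x - mx) * (p.x - mx)
        syy += (p.y - my) * (p.y - my)
    }
    if (sxx == 0.0 || syy == 0.0) return true   // vertical or horizontal path is already straight
    return abs(sxy / sqrt(sxx * syy)) >= criticalValue
}
```

Repeating straightenStep until isNearlyStraight returns true corresponds to ending the change of shape once the path is sufficiently close to the straight line, as described in [0144].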

[0145] FIG. 18 is a diagram for describing an example in which a device 1000 changes a shape of a connector 500, according to another exemplary embodiment.

[0146] As illustrated in FIG. 18, a path of the connector 500 may be expressed by a connector coordinate set. At this time, the connector coordinate set may include coordinate values of points marked by "x".

[0147] In addition, the device 1000 may determine a straight line 1800 that connects two ends 501 and 502 of the connector 500. The device 1000 may change the shape of the connector 500 by moving the coordinate values constituting the connector coordinate set by a predetermined distance.

[0148] For example, referring to FIG. 18, the device 1000 may calculate a shortest distance Xi between the straight line 1800 and a point 1810 corresponding to an i-th coordinate value of the coordinate values constituting the connector coordinate set. At this time, the shortest distance Xi may be the perpendicular distance from the point 1810 corresponding to the i-th coordinate value to the straight line 1800. The device 1000 may determine a distance by which the i-th coordinate value is to be moved, based on the shortest distance Xi. For example, the distance by which the i-th coordinate value is to be moved may be defined as a ratio of the shortest distance Xi, but is not limited thereto.

[0149] The device 1000 may calculate the shortest distance between the straight line 1800 and each of the coordinate values and calculate a distance by which each of the coordinate values is to be moved, with respect to all of the coordinate values constituting the connector coordinate set. The device 1000 may move all of the coordinate values constituting the connector coordinate set toward the straight line 1800 by the calculated distances. Therefore, when all of the coordinate values are moved by the calculated distances, a new path defined by the connector coordinate set may correspond to the straight line 1800.

[0150] FIGS. 19 to 21 are block diagrams of a device 1000 according to one or more exemplary embodiments.

[0151] Referring to FIG. 19, the device 1000 may include a display 1101, a user input device 1200, and a controller 1300. However, not all of the illustrated elements are necessary elements. The device 1000 may be embodied with more or fewer elements than the illustrated elements.

[0152] For example, as illustrated in FIG. 20, the device 1000 may further include a memory 1700 and a display controller 1111 in addition to the display 1101, the user input device 1200, and the controller 1300.

[0153] In addition, as illustrated in FIG. 21, the device 1000 may further include an output device 1100, a communicator 1400, a sensing device 1500, and an audio/video (A/V) input device 1600.

[0154] Hereinafter, the elements will be described.

[0155] The output device 1100 may output an audio signal, a video signal, or a vibration signal and may include the display 1101, a sound output device 1102, a vibration motor 1103, and the like.

[0156] The display 1101 may display information processed by the device 1000.

[0157] The display 1101 may display a container object and a plurality of items capable of being moved to the container object. The container object may be a folder or a different type of user interface object.

[0158] The items displayed on the display 1101 may be an object displayed on the display 1101 of the device 1000 to execute an application, an audio file, and a text file and may be, for example, an icon, an image, or a text.

[0159] The display 1101 may display items, of the plurality of items, that are to be moved to the container object such that they are distinguished from the other items, under control of the controller 1300 described below. The display 1101 may display a state in which the items are moved into the container object along a connector, under control of the controller 1300 described below.

[0160] A predetermined software screen displayed on the display 1101 may include a plurality of pages. For example, the display 1101 may display a first page and a second page, the first page displaying the container object and a plurality of items and the second page displaying a plurality of items different from the plurality of items displayed on the first page.

[0161] In addition, the display 1101 may display a menu for managing the container object under control of the controller 1300 described later.

[0162] A touch screen may be implemented by forming the display 1101 and a touch pad to have a mutual layer structure, allowing the display 1101 to be used as both an output device and an input device. The display 1101 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, a three-dimensional (3D) display, and an electrophoretic display. According to a type of the device 1000, the device 1000 may include two or more displays 1101.

[0163] The sound output device 1102 may output audio data that is received from the communicator 1400 or is stored in the memory 1700. The sound output device 1102 may also output a sound signal (e.g., a call signal receiving sound, a message receiving sound, a notifying sound, or the like) related to capabilities performed by the device 1000. The sound output device 1102 may include a speaker, a buzzer, or the like.

[0164] The vibration motor 1103 may output a vibration signal. For example, the vibration motor 1103 may output the vibration signal in concert with an output of audio data (e.g., the call signal receiving sound, the message receiving sound, or the like) or video data. In addition, the vibration motor 1103 may output a vibration signal when a touch is input to the touch screen.

[0165] The user input device 1200 may mean a unit by which a user inputs data to control the device 1000. For example, the user input device 1200 may include one or more of a key pad, a dome switch, a touch capacitive type touch pad, a pressure resistive type touch pad, an infrared beam sensing type touch pad, a surface acoustic wave type touch pad, an integral strain gauge type touch pad, a piezo effect type touch pad, any other type of touch pad, a jog wheel, and a jog switch. However, the exemplary embodiment is not limited thereto.

[0166] The user input device 1200 may receive a user input of touching and dragging the container object. In addition, the user input device 1200 may receive a user input of touching the container object and forming a loop. The user input device 1200 may include the touch screen.
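
By way of illustration only, the following Python sketch shows one possible way the device could decide whether a touch input that started on the container object is a simple drag or has formed a loop. The path is assumed to be a list of (x, y) touch points; the thresholds and names are hypothetical and are not part of the claimed method.

    # Hypothetical sketch of distinguishing the two inputs described above: a
    # plain drag from the container object versus a drag whose path closes
    # into a loop. Thresholds are illustrative only.
    import math

    CLOSE_DISTANCE = 30.0     # how near the path must return to an earlier point
    MIN_LOOP_POINTS = 8       # ignore tiny jitters near the start of the drag

    def forms_loop(path):
        """True if the most recent point comes back close to a much earlier point."""
        if len(path) < MIN_LOOP_POINTS:
            return False
        end = path[-1]
        for earlier in path[:-MIN_LOOP_POINTS]:
            if math.dist(end, earlier) <= CLOSE_DISTANCE:
                return True
        return False

    if __name__ == "__main__":
        drag = [(0, 0), (20, 5), (40, 10), (60, 15), (80, 20),
                (100, 25), (120, 30), (140, 35), (160, 40)]
        loop = [(0, 0), (40, 0), (80, 0), (80, 40), (80, 80),
                (40, 80), (0, 80), (0, 40), (5, 5)]
        print(forms_loop(drag))   # False - a plain drag
        print(forms_loop(loop))   # True - the path returned near its start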

[0167] The controller 1300 may control all operations of the device 1000. For example, the controller 1300 may control the output device 1100, the user input device 1200, the communicator 1400, the sensing device 1500, the A/V input device 1600, and the like by executing programs stored in the memory 1700. Accordingly, the device 1000 may move the items into the container object by using the connector.

[0168] The controller 1300 may display the connector connected from the container object to a current position of the user input on the display 1101 in response to the user input. In addition, the controller 1300 may change a shape of the connector during the receiving of the user input. At this time, the controller 1300 may change the shape of the connector such that a distance between two ends of the connector is minimized.
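
By way of illustration only, the following Python sketch shows one possible way the connector described above could be represented and shortened: the drag path is kept as a polyline whose first point stays on the container object and whose last point follows the current position of the user input, and intermediate points are pulled toward the straight segment between the two ends. All names and the relaxation factor are hypothetical; the embodiments are not limited to this implementation.

    # Hypothetical sketch of maintaining and shortening a "connector" polyline.
    # The connector is a list of (x, y) points whose first point stays on the
    # container object and whose last point follows the current touch position.

    def update_connector(points, touch_pos):
        """Append the current touch position to the connector path."""
        return points + [touch_pos]

    def shorten_connector(points, factor=0.5):
        """Pull intermediate points toward the straight segment between the two
        ends, so the connector is progressively shortened toward the direct
        path while its ends stay anchored."""
        if len(points) <= 2:
            return list(points)
        (x0, y0), (xn, yn) = points[0], points[-1]
        n = len(points) - 1
        shortened = [points[0]]
        for i, (x, y) in enumerate(points[1:-1], start=1):
            # Point on the straight chord corresponding to this index.
            cx = x0 + (xn - x0) * i / n
            cy = y0 + (yn - y0) * i / n
            shortened.append((x + (cx - x) * factor, y + (cy - y) * factor))
        shortened.append(points[-1])
        return shortened

    if __name__ == "__main__":
        path = [(0, 0)]                      # connector starts on the container object
        for touch in [(1, 3), (2, 5), (4, 4)]:
            path = update_connector(path, touch)
        print(shorten_connector(path))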

[0169] The controller 1300 may move an item of the plurality of items on the display 1101, which is located within a predetermined distance from the end of the connector, into the container object.

[0170] In addition, the controller 1300 may move an item of the plurality of items on the display 1101, which is enclosed by the loop formed by the user input, into the container object.

[0171] The controller 1300 may move only an item completely enclosed by the loop into the container object. According to one or more exemplary embodiments, the controller 1300 may move an item partially enclosed by the loop into the container object.
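
By way of illustration only, the following Python sketch shows one possible test for whether an item is completely or partially enclosed by the loop: the loop is approximated by a closed polygon of touch points, an item by the four corners of its bounding box, and the fraction of corners inside the polygon is compared against a threshold. The ray-casting test and all names are illustrative assumptions, not the claimed implementation.

    # Hypothetical sketch of deciding whether an item is enclosed by the loop
    # drawn by the user. The loop is a closed polygon of touch points and an
    # item is its bounding box; names are illustrative only.

    def point_in_polygon(pt, polygon):
        """Standard ray-casting test: returns True if pt lies inside polygon."""
        x, y = pt
        inside = False
        n = len(polygon)
        for i in range(n):
            x1, y1 = polygon[i]
            x2, y2 = polygon[(i + 1) % n]
            if (y1 > y) != (y2 > y):
                x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x < x_cross:
                    inside = not inside
        return inside

    def enclosed_fraction(item_rect, loop):
        """Fraction of the item's bounding-box corners that lie inside the loop."""
        left, top, right, bottom = item_rect
        corners = [(left, top), (right, top), (right, bottom), (left, bottom)]
        return sum(point_in_polygon(c, loop) for c in corners) / 4.0

    def items_to_move(items, loop, threshold=1.0):
        """threshold=1.0 moves only completely enclosed items; a lower threshold
        also accepts partially enclosed items, as in the alternative embodiment."""
        return [name for name, rect in items if enclosed_fraction(rect, loop) >= threshold]

    if __name__ == "__main__":
        loop = [(0, 0), (10, 0), (10, 10), (0, 10)]           # square loop
        items = [("A", (2, 2, 4, 4)), ("B", (8, 8, 12, 12))]  # B only partly inside
        print(items_to_move(items, loop))                 # ['A']
        print(items_to_move(items, loop, threshold=0.25)) # ['A', 'B']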

[0172] In addition, the controller 1300 may move an item of the plurality of items, which is located within the predetermined distance from the end of the connector for a predetermined time or more, into the container object.
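
By way of illustration only, the following Python sketch combines the two conditions described in paragraphs [0169] and [0172]: an item is selected for moving into the container object only after the end of the connector has remained within a predetermined distance of the item for a predetermined time. The distance and time values, the class, and the function names are hypothetical.

    # Hypothetical sketch of the proximity-and-dwell test described above.
    # Names and thresholds are illustrative only.
    import math, time

    PREDETERMINED_DISTANCE = 48.0   # e.g. pixels
    PREDETERMINED_TIME = 0.5        # e.g. seconds

    def within_distance(connector_end, item_center, limit=PREDETERMINED_DISTANCE):
        return math.dist(connector_end, item_center) <= limit

    class DwellTracker:
        """Tracks how long the connector end has stayed near each item."""
        def __init__(self):
            self.entered_at = {}

        def update(self, connector_end, items, now=None):
            """items: mapping item_id -> center; returns ids to move into the container."""
            now = time.monotonic() if now is None else now
            ready = []
            for item_id, center in items.items():
                if within_distance(connector_end, center):
                    start = self.entered_at.setdefault(item_id, now)
                    if now - start >= PREDETERMINED_TIME:
                        ready.append(item_id)
                        del self.entered_at[item_id]    # item is moved; stop tracking
                else:
                    self.entered_at.pop(item_id, None)  # left the neighbourhood; reset
            return ready

    if __name__ == "__main__":
        tracker = DwellTracker()
        items = {"icon_103": (100.0, 100.0)}
        print(tracker.update((110.0, 105.0), items, now=0.0))  # [] - just arrived
        print(tracker.update((112.0, 102.0), items, now=0.6))  # ['icon_103'] - dwelled long enough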

[0173] The communicator 1400 may include one or more elements allowing communication between the device 1000 and external devices or between the device 1000 and a server. For example, the communicator 1400 may include a short-range wireless communicator 1401, a mobile communicator 1402, and a broadcast receiver 1403.

[0174] The short-range wireless communicator 1401 may include, but is not limited to, a Bluetooth communicator, a near field communicator, a wireless local area network (WLAN or Wi-Fi) communicator, a ZigBee communicator, an infrared data association (IrDA) communicator, a Wi-Fi direct (WFD) communicator, an ultra wideband (UWB) communicator, or an Ant+ communicator.

[0175] The mobile communicator 1402 may exchange a wireless signal with at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data according to communication of a sound call signal, a moving picture call signal, or a text/multimedia message.

[0176] The broadcast receiver 1403 may receive a broadcast signal and/or information related to a broadcast from the outside through a broadcast channel. The broadcast channel may include a satellite channel and a ground wave channel. According to one or more exemplary embodiments, the device 1000 may not include the broadcast receiver 1403.

[0177] The sensing device 1500 may sense a state of the device 1000 or a state around the device 1000 and may transmit the sensed information to the controller 1300.

[0178] The sensing device 1500 may include, but is not limited to, at least one of a magnetic sensor 1501, an acceleration sensor 1502, a temperature/humidity sensor 1503, an infrared sensor 1504, a gyro sensor 1505, a position sensor (for example, a GPS sensor) 1506, a pressure sensor 1507, a proximity sensor 1508, and an RGB sensor (illuminance sensor) 1509.

[0179] The A/V input device 1600 may input an audio signal or a video signal and may include a camera 1601 and a microphone 1602. The camera 1601 may obtain an image frame such as a still image or a moving picture via an image sensor during a moving picture call mode or an image-capturing mode. An image captured via the image sensor may be processed by the controller 1300 or a separate image processor.

[0180] The image frame processed by the camera 1601 may be stored in the memory 1700 or may be transmitted to the outside via the communicator 1400. According to a type of the device 1000, the device 1000 may include two or more cameras 1601.

[0181] The microphone 1602 may receive a sound signal from the outside as an input and may process the received sound signal into an electrical voice data signal. For example, the microphone 1602 may receive a sound signal from an external device or a speaker. To remove noise that occurs while the sound signal is received from the outside, the microphone 1602 may use various noise removing algorithms.

[0182] The memory 1700 may store a program for processing and controlling the controller 1300 or may store a plurality of pieces of input/output data (e.g., an application, content, an image file, a text file, etc.).

[0183] The memory 1700 may include at least one type of storage medium from among a flash memory, a hard disk, a multimedia card type memory, a card type memory such as a Secure Digital (SD) or eXtreme Digital (XD) card memory, a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disc, and an optical disc. In addition, the device 1000 may use web storage or a cloud server on the Internet that performs the storage function of the memory 1700.

[0184] The programs stored in the memory 1700 may be classified into a plurality of modules according to their functions. For example, the programs stored in the memory 1700 may be classified into a UI module 1701, a touch screen module 1702, an alarm module 1703, a speech-to-text (STT) module 1704, etc.

[0185] The UI module 1701 may provide a UI or GUI in connection with the device 1000 for each application. The touch screen module 1702 may detect a user's touch gesture on the touch screen and may transmit information on the touch gesture to the controller 1300. The touch screen module 1702 may include separate hardware including a controller.

[0186] Various sensors may be arranged in or near the touch screen to detect a touch or a proximate touch on the touch screen. An exemplary sensor to detect the touch is a tactile sensor. The tactile sensor may detect a contact of an object at or beyond the sensitivity of a human being. The tactile sensor may detect various types of information such as the roughness of a contact surface, the hardness of a contact object, the temperature of a contact point, or the like.

[0187] Examples of the sensor that detects the touch on the touch screen may include a proximity sensor.

[0188] The proximity sensor may detect the presence or absence of an object approaching a predetermined detection surface, or an object existing near the proximity sensor, by using the force of an electromagnetic field or an infrared ray, without a mechanical contact. Examples of the proximity sensor may include a transmission-type photoelectric sensor, a direct reflection-type photoelectric sensor, a mirror reflection-type photoelectric sensor, a high frequency oscillation-type proximity sensor, a capacitance-type proximity sensor, a magnetic proximity sensor, an infrared-type proximity sensor, or the like. The touch gesture of the user may include a tap gesture, a touch and hold gesture, a double tap gesture, a drag gesture, a panning gesture, a flick gesture, a drag & drop gesture, a swipe gesture, or the like.

[0189] The alarm module 1703 may generate a signal for notifying the user about an occurrence of an event in the device 1000. Examples of the event occurring in the device 1000 may include a call signal receiving event, a message receiving event, a key signal input event, a schedule notifying event, or the like. The alarm module 1703 may output an alarm signal in a video signal form through the display 1101, may output the alarm signal in an audio signal form through the sound output device 1102, and may output the alarm signal in a vibration signal form through the vibration motor 1103.

[0190] The STT module 1704 may convert speech included in multimedia content into text and may generate a transcript corresponding to the multimedia content. The transcript may be mapped to replay time information of the multimedia content.
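
By way of illustration only, the following Python sketch shows one possible form of the mapping mentioned above, in which each recognized text segment is stored together with its replay time so that the transcript can be used to locate the corresponding playback position. The data layout and function names are hypothetical.

    # Hypothetical sketch of a transcript mapped to replay time information.

    def build_transcript(segments):
        """segments: list of (start_seconds, text) pairs produced by speech-to-text."""
        return sorted(segments)

    def seek_position(transcript, query):
        """Return the replay time of the first segment containing the query text."""
        for start, text in transcript:
            if query.lower() in text.lower():
                return start
        return None

    if __name__ == "__main__":
        transcript = build_transcript([(12.0, "open the folder"), (3.5, "hello")])
        print(seek_position(transcript, "folder"))   # 12.0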

[0191] The display controller 1111 may control the display 1101 through a wired or wireless connection. Computer programming commands may control the display controller 1111 to display the container object and the plurality of items on the display 1101.

[0192] The exemplary embodiments set forth herein may be embodied as program instructions that can be executed by various computing units and recorded on a non-transitory computer-readable recording medium. Examples of the non-transitory computer-readable recording medium may include program instructions, data files, and data structures solely or in combination. The program instructions recorded on the non-transitory computer-readable recording medium may be designed and configured for exemplary embodiments, or may be well known to and usable by one of ordinary skill in the field of computer software. Examples of the non-transitory computer-readable recording medium may include magnetic media (e.g., a hard disk, a floppy disk, a magnetic tape, etc.), optical media (e.g., a compact disc-read-only memory (CD-ROM), a digital versatile disk (DVD), etc.), magneto-optical media (e.g., a floptical disk, etc.), and a hardware device configured to store and execute program instructions (e.g., a read only memory (ROM), a random access memory (RAM), a flash memory, etc.). Examples of the program instructions may include not only machine language codes prepared by a compiler but also high-level codes executable by a computer by using an interpreter.

[0193] It should be understood that exemplary embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects within each exemplary embodiment should typically be considered as available for other similar features or aspects in other exemplary embodiments. For example, each element described as a singular form may be implemented in a distributed manner, and elements described as distributed may be implemented in an integrated manner.

[0194] While one or more exemplary embodiments have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the scope as defined by the following claims.


Claims

1. A device (1000) comprising:

a display (1101) configured to display a container object (400) and a plurality of items (103, 104, 105, 107, 108, 110, 111) that are movable into the container object;

a user input device (1200) configured to obtain a user input; and

at least one processor (1300),

wherein the at least one processor is configured to: until a user touch input is ended, determine whether the user input (501) is indicative of:

- a first gesture moving from the container object to a current position of the user input or

- a second gesture forming a loop;

based on the user input (501) indicating the first gesture moving from the container object to a current position of the user input, control the display to display a first connector along a first path of the first gesture, the first connector being connected from the container object to the current position of the user input, identify a first item among the plurality of items, located within a predetermined distance of an end of the first connector at the current location of the user input, and, if the first item is located within the predetermined distance from the end of the first connector for a predetermined time, move the first item into the container object, and

based on the user input (501) indicating the second gesture forming a loop, control the display to display a second connector along a second path of the second gesture, the second connector being connected from the container object to the current position of the user input, identify the items, among the plurality of items, enclosed by the loop, and move the identified items into the container object.


 
2. The device of claim 1, wherein the at least one processor is further configured to identify the items enclosed by the loop in response to at least one among all portions of each of the items being enclosed by the loop, and a threshold portion or more of each of the items being enclosed by the loop.
 
3. The device of claim 1, wherein the at least one processor is further configured to control the display to display the items moving into the container object along the second connector based on a predetermined animation effect (1510, 1520, 1530, 1540).
 
4. The device of claim 1, wherein the at least one processor is further configured to determine a third item of the plurality of items within the predetermined distance from the current position for the predetermined time, and move the third item into the container object.
 
5. The device of claim 1, wherein the at least one processor is further configured to control the display to distinguishably display the first item.
 
6. The device of claim 1, wherein the at least one processor is further configured to control the display to display the first item moving into the container object along the first connector.
 
7. The device of claim 1, wherein the container object and the plurality of items are displayed on a first page of a plurality of pages, and the at least one processor is further configured to, in response to the user input corresponding to an edge of the first page, control the display to display a second page of the plurality of pages.
 
8. The device of claim 1, wherein the at least one processor is further configured to change a shape of the connector during the obtaining of the user input.
 
9. The device of claim 1, wherein the at least one processor is further configured to shorten the connector according to a predetermined criteria during the obtaining of the user input.
 
10. The device of claim 1, wherein the at least one processor is further configured to control the display to display a menu for managing the container object in response to obtaining a user input of touching the container object for a threshold time.
 
11. The device of claim 10, wherein the at least one processor is further configured to reduce a displayed size of the container object and the plurality of items during the displaying of the menu.
 
12. A method of managing an item, the method comprising:

displaying (S210; S310; S1210; S1310) a container object and a plurality of items that are movable to the container object;

obtaining (S220; S320; S1220; S1320) a user input;

wherein the method further comprises:

until a user touch input is ended, determining whether the user input is indicative of:

- a first gesture moving from the container object to a current position of the user input or

- a second gesture forming a loop;

displaying (S230; S330; S1230; S1330), based on the obtained user input indicating the first gesture moving from the container object to a current position of the user input, a first connector along a first path of the first gesture, the first connector being connected from the container object to the current position of the user input;

identifying (S350; S1350) a first item among the plurality of items, located within a predetermined distance of an end of the first connector at the current location of the user input;

moving (S240; S360; S1240; S1360) the first item into the container object, if the first item is located within the predetermined distance from the end of the first connector for a predetermined time;

displaying, based on the user input indicating the second gesture forming a loop, a second connector along a second path of the second gesture, the second connector being connected from the container object to the current position of the user input;

identifying the items, among the plurality of items, enclosed by the loop; and

moving the identified items into the container object.


 
13. The method of claim 12, wherein the identifying of the items enclosed by the loop comprises determining at least one among: all portions of each of the items being completely enclosed by the loop, and at least a threshold portion of each of the items being enclosed by the loop.
 


Ansprüche

1. Vorrichtung (1000), die Folgendes aufweist:

eine Anzeige (1101), die zum Anzeigen eines Container-Objekts (400) und mehrerer Elemente (103, 104, 105, 107, 108, 110, 111), die in das Container-Objekt bewegt werden können, konfiguriert ist;

eine Benutzereingabevorrichtung (1200), die zum Erhalten einer Benutzereingabe konfiguriert ist; und

wenigstens einen Prozessor (1300),

wobei der wenigstens eine Prozessor konfiguriert ist zum: Bestimmen, bis eine Benutzerberührungseingabe beendet wird, ob die Benutzereingabe (501) Folgendes erkennen lässt:

- eine erste, sich vom Container-Objekt zu einer aktuellen Position der Benutzereingabe bewegende Geste oder

- eine zweite, eine Schleife bildende Geste;

auf Basis des Zeigens der ersten, sich vom Container-Objekt zu einer aktuellen Position der Benutzereingabe bewegenden Geste durch die Benutzereingabe (501) Steuern der Anzeige zum Anzeigen eines ersten Verbinders an einem ersten Weg der ersten Geste entlang, wobei der erste Verbinder vom Container-Objekt aus mit der aktuellen Position der Benutzereingabe verbunden ist, Identifizieren eines ersten Elements unter den mehreren Elementen, das sich in einer vorbestimmten Entfernung von einem Ende des ersten Verbinders an der aktuellen Position der Benutzereingabe befindet, und, falls sich das erste Element für eine vorbestimmte Zeit innerhalb der vorbestimmten Entfernung vom Ende des ersten Verbinders befindet, Bewegen des ersten Elements in das Container-Objekt, und

auf Basis des Zeigens der zweiten, eine Schleife bildenden Geste durch die Benutzereingabe (501) Steuern der Anzeige zum Anzeigen eines zweiten Verbinders an einem zweiten Weg der zweiten Geste entlang, wobei der zweite Verbinder vom Container-Objekt aus mit der aktuellen Position der Benutzereingabe verbunden ist, Identifizieren der Elemente unter den mehreren Elementen, die von der Schleife umschlossen sind, und Bewegen der identifizierten Elemente in das Container-Objekt.


 
2. Vorrichtung nach Anspruch 1, wobei der wenigstens eine Prozessor ferner zum Identifizieren der Elemente, die von der Schleife umschlossen sind, als Reaktion auf wenigstens eins unter allen Teilen von jedem der Elemente, die von der Schleife umschlossen sind, und einem Schwellenteil oder mehr von jedem der Elemente, die von der Schleife umschlossen sind, konfiguriert ist.
 
3. Vorrichtung nach Anspruch 1, wobei wenigstens ein Prozessor ferner zum Steuern der Anzeige, wobei die Elemente sich am zweiten Verbinder entlang in das Container-Objekt bewegen, auf Basis eines vorbestimmten Animationseffekts (1510, 1520, 1530, 1540) konfiguriert ist.
 
4. Vorrichtung nach Anspruch 1, wobei der wenigstens eine Prozessor ferner zum Bestimmen eines dritten Elements der mehreren Elemente, das für die vorbestimmte Zeit innerhalb der vorbestimmten Entfernung von der aktuellen Position ist, und zum Bewegen des dritten Elements in das Container-Objekt konfiguriert ist.
 
5. Vorrichtung nach Anspruch 1, wobei der wenigstens eine Prozessor ferner zum Steuern der Anzeige zum unterscheidbaren Anzeigen des ersten Elements konfiguriert ist.
 
6. Vorrichtung nach Anspruch 1, wobei der wenigstens eine Prozessor ferner zum Steuern der Anzeige zum Anzeigen des ersten, sich am ersten Verbinder entlang in das Container-Objekt bewegenden Elements konfiguriert ist.
 
7. Vorrichtung nach Anspruch 1, wobei das Container-Objekt und die mehreren Elemente auf einer ersten Seite von mehreren Seiten angezeigt werden und der wenigstens eine Prozessor ferner, als Reaktion auf die einem Rand der ersten Seite entsprechende Benutzereingabe, zum Steuern der Anzeige zum Anzeigen einer zweiten Seite der mehreren Seiten konfiguriert ist.
 
8. Vorrichtung nach Anspruch 1, wobei der wenigstens eine Prozessor ferner zum Ändern einer Form des Verbinders während des Erhaltens der Benutzereingabe konfiguriert ist.
 
9. Vorrichtung nach Anspruch 1, wobei der wenigstens eine Prozessor ferner zum Kürzen des Verbinders gemäß vorbestimmten Kriterien während des Erhaltens der Benutzereingabe konfiguriert ist.
 
10. Vorrichtung nach Anspruch 1, wobei der wenigstens eine Prozessor ferner zum Steuern der Anzeige zum Anzeigen eines Menüs zum Verwalten des Container-Objekts als Reaktion auf das Erhalten einer Benutzereingabe des Berührens des Container-Objekts für eine Schwellenzeit konfiguriert ist.
 
11. Vorrichtung nach Anspruch 10, wobei der wenigstens eine Prozessor ferner zum Reduzieren einer angezeigten Größe des Container-Objekts und der mehreren Elemente während des Anzeigens des Menüs konfiguriert ist.
 
12. Verfahren zum Verwalten eines Elements, wobei das Verfahren Folgendes aufweist:

Anzeigen (S210; S310; S1210; S1310) eines Container-Objekts und mehrerer Elemente, die in das Container-Objekt bewegt werden können;

Erhalten (S220; S320; S1220; S1320) einer Benutzereingabe;

wobei das Verfahren ferner Folgendes aufweist:

Bestimmen, bis eine Benutzerberührungseingabe beendet wird, ob die Benutzereingabe Folgendes erkennen lässt:

- eine erste, sich vom Container-Objekt zu einer aktuellen Position der Benutzereingabe bewegende Geste oder

- eine zweite, eine Schleife bildende Geste;

auf Basis des Zeigens der ersten, sich vom Container-Objekt zu einer aktuellen Position der Benutzereingabe bewegenden Geste durch die erhaltene Benutzereingabe Anzeigen (S230; S330; S1230; S1330) eines ersten Verbinders an einem ersten Weg der ersten Geste entlang, wobei der erste Verbinder vom Container-Objekt aus mit der aktuellen Position der Benutzereingabe verbunden ist;

Identifizieren (S350; S1350) eines ersten Elements unter den mehreren Elementen, das sich innerhalb einer vorbestimmten Entfernung von einem Ende des ersten Verbinders an der aktuellen Position der Benutzereingabe befindet;

Bewegen (S240; S360; S1240; S1360) des ersten Elements in das Container-Objekt, falls sich das erste Element für eine vorbestimmte Zeit innerhalb der vorbestimmten Entfernung vom Ende des ersten Verbinders befindet;

auf Basis des Zeigens der zweiten, eine Schleife bildenden Geste durch die Benutzereingabe Anzeigen eines zweiten Verbinders an einem zweiten Weg der zweiten Geste entlang, wobei der zweite Verbinder vom Container-Objekt aus mit der aktuellen Position der Benutzereingabe verbunden ist;

Identifizieren der Elemente unter den mehreren Elementen, die von der Schleife umschlossen sind; und

Bewegen der identifizierten Elemente in das Container-Objekt.


 
13. Verfahren nach Anspruch 12, wobei das Identifizieren der Elemente, die von der Schleife umschlossen sind, das Bestimmen aufweist, dass wenigstens eines unter allen Teilen von jedem der Elemente vollständig von der Schleife umschlossen ist und wenigstens ein Schwellenteil von jedem der Elemente von der Schleife umschlossen ist.
 


Revendications

1. Dispositif (1000), comprenant :

un écran (1101) configuré pour afficher un objet conteneur (400) et une pluralité d'éléments (103, 104, 105, 107, 108, 110, 111) qui peuvent être déplacés dans l'objet conteneur ;

un dispositif d'entrée d'utilisateur (1200) configuré pour obtenir une entrée d'utilisateur ; et

au moins un processeur (1300),

l'au moins un processeur étant configuré pour :

jusqu'à ce qu'une entrée tactile d'utilisateur soit terminée, déterminer si l'entrée d'utilisateur (501) indique :

- un premier geste se déplaçant de l'objet conteneur à une position actuelle de l'entrée d'utilisateur, ou

- un second geste formant une boucle ;

sur la base de l'entrée d'utilisateur (501) indiquant le premier geste se déplaçant de l'objet conteneur à une position actuelle de l'entrée d'utilisateur, contrôler l'écran pour afficher un premier connecteur suivant une première trajectoire du premier geste, le premier connecteur étant connecté de l'objet conteneur à la position actuelle de l'entrée d'utilisateur, identifier parmi la pluralité d'éléments un premier élément situé à une distance prédéterminée ou moins d'une extrémité du premier connecteur à l'emplacement actuel de l'entrée d'utilisateur, et, si le premier élément est situé à la distance prédéterminée ou moins de l'extrémité du premier connecteur pendant une durée prédéterminée, déplacer le premier élément dans l'objet conteneur, et

sur la base de l'entrée d'utilisateur (501) indiquant le second geste formant une boucle, contrôler l'écran pour afficher un second connecteur suivant une seconde trajectoire du second geste, le second connecteur étant connecté de l'objet conteneur à la position actuelle de l'entrée d'utilisateur, identifier parmi la pluralité d'éléments les éléments enfermés par la boucle, et déplacer les éléments identifiés dans l'objet conteneur.


 
2. Dispositif selon la revendication 1, dans lequel l'au moins un processeur est en outre configuré pour identifier les éléments enfermés par la boucle en réponse au fait que toutes les parties de chacun des éléments sont enfermées par la boucle et/ou au fait qu'une ou plusieurs parties-seuils de chacun des éléments sont enfermées par la boucle.
 
3. Dispositif selon la revendication 1, dans lequel l'au moins un processeur est en outre configuré pour contrôler l'écran pour afficher les éléments se déplaçant dans l'objet conteneur suivant le second connecteur sur la base d'un effet d'animation prédéterminé (1510, 1520, 1530, 1540) .
 
4. Dispositif selon la revendication 1, dans lequel l'au moins un processeur est en outre configuré pour déterminer un troisième élément de la pluralité d'éléments à la distance prédéterminé ou moins de la position actuelle pendant la durée prédéterminée, et déplacer le troisième élément dans l'objet conteneur.
 
5. Dispositif selon la revendication 1, dans lequel l'au moins un processeur est en outre configuré pour contrôler l'écran pour afficher distinctement le premier élément.
 
6. Dispositif selon la revendication 1, dans lequel l'au moins un processeur est en outre configuré pour contrôler l'écran pour afficher le premier élément se déplaçant dans l'objet conteneur suivant le premier connecteur.
 
7. Dispositif selon la revendication 1, dans lequel l'objet conteneur et la pluralité d'éléments sont affichés sur une première page d'une pluralité de pages, et l'au moins un processeur est en outre configuré pour, en réponse au fait que l'entrée d'utilisateur correspond à un bord de la première page, contrôler l'écran pour afficher une seconde page de la pluralité de pages.
 
8. Dispositif selon la revendication 1, dans lequel l'au moins un processeur est en outre configuré pour changer une forme du connecteur pendant l'obtention de l'entrée d'utilisateur.
 
9. Dispositif selon la revendication 1, dans lequel l'au moins un processeur est en outre configuré pour raccourcir le connecteur selon un critère prédéterminé pendant l'obtention de l'entrée d'utilisateur.
 
10. Dispositif selon la revendication 1, dans lequel l'au moins un processeur est en outre configuré pour contrôler l'écran pour afficher un menu pour gérer l'objet conteneur en réponse à l'obtention d'une entrée d'utilisateur consistant à toucher l'objet conteneur pendant une durée-seuil.
 
11. Dispositif selon la revendication 10, dans lequel l'au moins un processeur est en outre configuré pour réduire une taille affichée de l'objet conteneur et de la pluralité d'éléments pendant l'affichage du menu.
 
12. Procédé de gestion d'un élément, le procédé consistant à :

afficher (S210 ; S310 ; S1210 ; S1310) un objet conteneur et une pluralité d'éléments qui peuvent être déplacés dans l'objet conteneur ;

obtenir (S220 ; S320 ; S1220 ; S1320) une entrée d'utilisateur ;

le procédé consistant en outre à :

jusqu'à ce qu'une entrée tactile d'utilisateur soit terminée, déterminer si l'entrée d'utilisateur indique :

- un premier geste se déplaçant de l'objet conteneur à une position actuelle de l'entrée d'utilisateur, ou

- un second geste formant une boucle ;

afficher (S230 ; S330 ; S1230 ; S1330), sur la base de l'entrée d'utilisateur obtenue indiquant le premier geste se déplaçant de l'objet conteneur à une position actuelle de l'entrée d'utilisateur, un premier connecteur suivant une première trajectoire du premier geste, le premier connecteur étant connecté de l'objet conteneur à la position actuelle de l'entrée d'utilisateur ;

identifier (S350 ; S1350) parmi la pluralité d'éléments un premier élément situé à une distance prédéterminée ou moins d'une extrémité du premier connecteur à l'emplacement actuel de l'entrée d'utilisateur ;

déplacer (S240 ; S360 ; S1240 ; S1360) le premier élément dans l'objet conteneur si le premier élément est situé à la distance prédéterminée ou moins de l'extrémité du premier connecteur pendant une durée prédéterminée ;

afficher, sur la base de l'entrée d'utilisateur indiquant le second geste formant une boucle, un second connecteur suivant une seconde trajectoire du second geste, le second connecteur étant connecté de l'objet conteneur à la position actuelle de l'entrée d'utilisateur ;

identifier parmi la pluralité d'éléments les éléments enfermés par la boucle ; et

déplacer les éléments identifiés dans l'objet conteneur.


 
13. Procédé selon la revendication 12, dans lequel l'identification des éléments enfermés par la boucle consiste à déterminer le fait que toutes les parties de chacun des éléments sont complètement enfermées par la boucle et/ou le fait qu'au moins une partie-seuil de chacun des éléments est enfermée par la boucle.
 




Drawing