(19)
(11)EP 3 687 161 A1

(12)EUROPEAN PATENT APPLICATION
published in accordance with Art. 153(4) EPC

(43)Date of publication:
29.07.2020 Bulletin 2020/31

(21)Application number: 18876905.3

(22)Date of filing:  07.11.2018
(51)International Patent Classification (IPC): 
H04N 5/232(2006.01)
G06T 5/50(2006.01)
(86)International application number:
PCT/CN2018/114422
(87)International publication number:
WO 2019/091411 (16.05.2019 Gazette  2019/20)
(84)Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
Designated Extension States:
BA ME
Designated Validation States:
KH MA MD TN

(30)Priority: 13.11.2017 CN 201711116498

(71)Applicant: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., LTD.
Wusha, Chang'an Dongguan, Guangdong 523860 (CN)

(72)Inventor:
  • CHEN, Yan
    Guangdong 523860 (CN)

(74)Representative: Manitz Finsterwald Patent- und Rechtsanwaltspartnerschaft mbB 
Martin-Greif-Strasse 1
80336 München (DE)

  


(54)IMAGE CAPTURING METHOD, DEVICE, TERMINAL, AND STORAGE MEDIUM


(57) Embodiments of the present disclosure relate to the field of electronics and provide a method for image shooting, an apparatus, a terminal device, and a storage medium. The method includes: shooting, via an image shooting component, a preview image and acquiring an exposure parameter of the preview image, when a terminal device is in a state of being ready for image shooting; obtaining an image shooting parameter for a present night scene via an image shooting parameter estimation model based on the preview image and the exposure parameter; and performing, in response to receiving an image shooting instruction, image shooting processing based on the estimated image shooting parameter to obtain a target synthesized image.




Description


[0001] The present application claims foreign priority of Chinese Patent Application No. 201711116498.8, filed on November 13, 2017 in the National Intellectual Property Administration of China, the entire contents of which are hereby incorporated by reference.

TECHNICAL FIELD



[0002] The present disclosure relates to the field of electronics, and in particular to a method for image shooting, an apparatus, a terminal device, and a non-transitory storage medium.

BACKGROUND



[0003] As electronic technologies develop, terminal devices, such as mobile phones, computers, and the like, have been widely used. Varieties of applications installed in the terminal devices are increasing, and the applications may have an increased number of functions. An application for image shooting may be a commonly installed application, and a user may activate the application to shoot an image.

SUMMARY OF THE DISCLOSURE



[0004] Embodiments of the present disclosure may provide a method for image shooting, an apparatus, a terminal device, and a non-transitory storage medium.

[0005] According to a first aspect of the present disclosure, the method for image shooting may include following operations.

[0006] When the terminal device is in a pending state for shooting an image, a preview image may be acquired by an image shooting component, and an exposure parameter value corresponding to the preview image may be acquired.

[0007] An image shooting parameter value of a present night scene may be acquired via an image shooting parameter value estimation model, based on the preview image and the exposure parameter value corresponding to the preview image. The image shooting parameter value estimation model may be trained by taking an image data parameter, the exposure parameter, and the image shooting parameter as variables. The image shooting parameter may include the number of images for synthesis, and the number of images for synthesis may be the number of preview images required for obtaining a target synthesized image.

[0008] In response to receiving an image shooting instruction, image shooting and processing may be performed based on an estimated image shooting parameter value to obtain the target synthesized image.

[0009] According to a second aspect of the present disclosure, an apparatus for image shooting may be provided and include following modules.

[0010] A first capturing module may be arranged to shoot a preview image via an image shooting component when a terminal device is in a pending state for image shooting, and may be arranged to acquire an exposure parameter value corresponding to the preview image.

[0011] An estimation module may be arranged to acquire an image shooting parameter value of a present night scene via an image shooting parameter value estimation model, based on the preview image and the exposure parameter value corresponding to the preview image. The image shooting parameter value estimation model may be trained by taking an image data parameter, the exposure parameter, and the image shooting parameter as variables. The image shooting parameter may include the number of images for synthesis, and the number of images for synthesis may be the number of preview images required for obtaining a target synthesized image.

[0012] An execution module may be arranged to perform image shooting and processing based on an estimated image shooting parameter value, in response to receiving an image shooting instruction, to obtain the target synthesized image.

[0013] According to a third aspect of the present disclosure, a terminal device may be provided and include a processor and a non-transitory memory. The non-transitory memory may be arranged to store at least one instruction, at least one program, a code set or an instruction set. The at least one instruction, the at least one program, the code set, or the instruction set may be loaded and executed by the processor to perform the method for image shooting as described in the first aspect.

[0014] According to a fourth aspect of the present disclosure, a computer-readable non-transitory storage medium may be arranged to store at least one instruction, at least one program, a code set or an instruction set. The at least one instruction, the at least one program, the code set, or the instruction set may be capable of being loaded and executed by a processor to perform the method for image shooting as described in the first aspect.

[0015] According to the present disclosure, following beneficial effects may be achieved.

[0016] When the terminal device is in a pending state for image shooting, a preview image may be acquired by an image shooting component, and an exposure parameter corresponding to the preview image may be acquired. An image shooting parameter of a present night scene may be estimated by an image shooting parameter estimation model based on the preview image, wherein the image shooting parameter estimation model may be trained in advance by taking an image data parameter, the exposure parameter, and the image shooting parameter as variables. The image shooting parameter may include the number of images for synthesis. In response to receiving an image shooting instruction, image shooting and processing may be performed based on the estimated image shooting parameter. In such a way, when a user intends to shoot an image in a night scene, the terminal device may automatically calculate the number of images for synthesis for the present night scene, such that the images may be acquired and processed based on the number of images for synthesis, and the user may not need to manually activate a function of image synthesis for night scenes, improving an efficiency of image shooting.

BRIEF DESCRIPTION OF THE DRAWINGS



[0017] In order to illustrate technical solutions of embodiments of the present disclosure in detail, drawings required for illustrating the embodiments will be described in brief. Obviously, the following drawings illustrate only some embodiments of the present disclosure, and those of ordinary skill in the related art may obtain other drawings based on the following drawings without any creative work.

FIG. 1 is a flow chart of a method for image shooting according to an embodiment of the present disclosure.

FIG. 2 is a diagram of various predefined numbers of images for synthesis corresponding to a set of target preview images according to an embodiment of the present disclosure.

FIG. 3 is a diagram of a predefined number of images for synthesis corresponding to a plurality of predefined performance parameter values of a terminal device according to an embodiment of the present disclosure.

FIG. 4 is a structural view of an apparatus for image shooting according to an embodiment of the present disclosure.

FIG. 5 is a structural view of an apparatus for image shooting according to an embodiment of the present disclosure.

FIG. 6 is a structural view of a terminal device according to an embodiment of the present disclosure.

FIG. 7 is a structural view of a terminal device according to an embodiment of the present disclosure.

FIG. 8 is a structural view of a terminal device according to an embodiment of the present disclosure.

FIG. 9 is a structural view of a terminal device according to an embodiment of the present disclosure.

FIG. 10 is a structural view of a terminal device according to an embodiment of the present disclosure.

FIG. 11 is a structural view of a terminal device according to an embodiment of the present disclosure.

FIG. 12 is a structural view of a terminal device according to an embodiment of the present disclosure.

FIG. 13 is a structural view of a terminal device according to an embodiment of the present disclosure.


DETAILED DESCRIPTION



[0018] Implementations of the present disclosure will be further described in detail by referring to the drawings, in order to illustrate technical solutions and objectives of the present disclosure clearly.

[0019] Embodiments of the present disclosure may provide a method for image shooting, and the method may be performed by a terminal device. The terminal device may be a terminal device having a function of image shooting, such as a terminal device installed with an application of image shooting. The terminal device may include components such as a processor, a non-transitory memory, an image shooting component, a display screen, and the like. The processor may be a central processing unit (CPU) or the like, and may be arranged to determine an image shooting parameter value and perform a process related to image shooting. The non-transitory memory may be a random access memory (RAM), a flash memory, and the like, and may be arranged to store received data, data required during performing the process, data generated during performing the process, and the like, such as an image shooting parameter value estimation model. The image shooting component may be a camera, and may be arranged to acquire a preview image. The display screen may be a touch screen, and may be arranged to display the preview image acquired by the image shooting component and to detect a touch signal.

[0020] In the related art, when a user activates the application of image shooting, the application of image shooting may provide a function of night scene synthesis to allow the user to shoot an image in a night scene. Specifically, when the user intends to shoot an image in a night scene, the user may find a control to activate or deactivate the function of night scene synthesis. By pressing the control, the function of night scene synthesis may be activated, and when the user presses a control for image shooting, the terminal device may perform image shooting and processing based on a predefined number of images for image synthesis. Specifically, the terminal device may consecutively acquire the predefined number of images for image synthesis via the image shooting component (such as the camera). The images may be preview images, and the acquired predefined number of images for image synthesis may be processed to obtain a synthesized image, that is, an image synthesized from a plurality of preview images may be obtained, and the synthesized image may be stored in a gallery of the terminal device. In such a way, whenever the user intends to shoot an image in a night scene, the control to activate the function of night scene synthesis needs to be found and pressed manually, reducing the efficiency of image shooting.

[0021] As shown in FIG. 1, a flow chart of a method for image shooting according to an embodiment of the present disclosure is provided. The method may include following operations.

[0022] At an operation of 101, when a terminal device is in a pending state for image shooting, a preview image may be acquired via an image shooting component, and an exposure parameter value corresponding to the preview image may be acquired.

[0023] The pending state for image shooting may refer to the terminal device displaying an interface for image shooting. The image shooting component may be a camera of the terminal device. The preview image may be an image acquired by the image shooting component and displayed in the terminal device, that is, an image prior to image synthesis. In other words, the preview image may be an image acquired by the image shooting component before the user presses a control for image capturing. The exposure parameter value corresponding to the preview image may be an exposure parameter value determined during shooting the preview image, and may include exposure duration, white balance, and the like. The exposure parameter value may be arranged to identify a night scene, that is, various exposure parameter values may identify various night scenes.

[0024] Alternatively, the terminal device may be installed with an application of image shooting. When the user intends to shoot an image, an icon of the application may be clicked, and the terminal device may receive an activation instruction correspondingly, such that the application of image shooting may be activated, and at this moment, the terminal device may be in the pending state for image shooting.

[0025] When the terminal device is in the pending state for image shooting, the terminal device may shoot the preview image via the image shooting component. Further, after the application of image shooting is activated, the terminal device may determine the exposure parameter value (the exposure parameter value may include the exposure duration, the white balance, and the like) in real time based on brightness of an environment and a color of a light source in the environment, such that the terminal device may perform image shooting and processing based on the exposure parameter value.

[0026] In addition, a predefined period for image shooting may be set in the terminal device. When the terminal device is in the pending state for image shooting, the terminal device may shoot the preview image via the image shooting component and acquire the exposure parameter value corresponding to the preview image in response to the predefined period for image shooting being met. Further, a predefined time range may be stored in the terminal device. The predefined time range may be a time range corresponding to night scenes, such as from 5pm to 12am, and from 12am to 6am. In response to a present time point being within the predefined time range, when the terminal device is in the pending state for image shooting, the preview image may be acquired via the image shooting component, and the exposure parameter value corresponding to the preview image may be acquired.
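As an illustrative sketch only (not part of the original disclosure), the predefined-time-range check described in paragraph [0026] might be expressed as follows; the concrete ranges, function name, and variable names are assumptions added for illustration:

```python
from datetime import time

# Hypothetical night-scene time ranges: 5 p.m. to midnight, and midnight to 6 a.m.
# (assumed values; the disclosure only gives these as examples).
NIGHT_RANGES = [(time(17, 0), time(23, 59, 59)), (time(0, 0), time(6, 0))]

def in_night_range(now: time) -> bool:
    """Return True when the present time point falls within a predefined
    night-scene time range, i.e. when preview capture should proceed."""
    return any(start <= now <= end for start, end in NIGHT_RANGES)
```

A terminal would call such a check each time the predefined period for image shooting is met, and only acquire the preview image and exposure parameter value when it returns True.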

[0027] At an operation of 102, an image shooting parameter value of a present night scene may be acquired via an image shooting parameter value estimation model based on the preview image and the exposure parameter value corresponding to the preview image.

[0028] The image shooting parameter value estimation model may be obtained by training, and during the training, an image data parameter, the exposure parameter, and the image shooting parameter may be taken as variables. In the present embodiment, the terminal device may set the preview image as a value of the image data parameter, and the exposure parameter value corresponding to the preview image may be determined as a value of the exposure parameter. The image shooting parameter may include the number of images for image synthesis. A value of the number of images for image synthesis may refer to the number of preview images required to obtain a target synthesized image. Alternatively, the image shooting parameter may further include a performance parameter of the terminal device. The performance parameter of the terminal device may be a parameter able to affect a capability of the terminal device, such as an operating frequency of a central processing unit (also termed a CPU dominant frequency).

[0029] The image shooting parameter value may refer to a value of the image shooting parameter required to obtain the target synthesized image. When the image shooting parameter includes the number of images for image synthesis, the image shooting parameter value may include a value of the number of images for image synthesis, and will be referred to as a target predefined synthesis number hereafter. When the image shooting parameter includes the performance parameter of the terminal device, the image shooting parameter value may include a value of the performance parameter of the terminal device, and will be referred to as a target predefined performance parameter value of the terminal device. During implementation, the terminal device may store the image shooting parameter value estimation model in advance, and the image shooting parameter estimation model is trained in advance. The image shooting parameter estimation model may be arranged to estimate the image shooting parameter value of the present night scene based on the preview image acquired at present and the exposure parameter value corresponding to the preview image. Each preview image and the exposure parameter value corresponding to the preview image may be input into the trained image shooting parameter estimation model immediately after being acquired, an output of the image shooting parameter estimation model may be obtained, and the image shooting parameter value of the present night scene may thus be obtained. To be specific, after the preview image and the exposure parameter value corresponding to the preview image are acquired, the terminal device may take the preview image as the value of the image data parameter, and take the exposure parameter value corresponding to the preview image as the value of the exposure parameter.
The values may be input into the image shooting parameter estimation model to obtain the image shooting parameter value for the present night scene.
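The inference step of the operation of 102 can be sketched as follows. This is an editorial illustration rather than the claimed implementation: the trained model is replaced by a toy stand-in, and all function and variable names are assumptions:

```python
import numpy as np

def estimate_shooting_params(model, preview_image: np.ndarray,
                             exposure_params: np.ndarray) -> int:
    """Input the preview image (value of the image data parameter) and its
    exposure parameter values into the estimation model, and return the
    estimated number of images for synthesis for the present night scene."""
    raw = model(preview_image, exposure_params)  # model output, e.g. a float
    return max(1, int(round(float(raw))))        # at least one frame is needed

# Toy stand-in "model": darker previews need more frames (illustration only;
# the disclosure uses a trained estimation model, not this heuristic).
toy_model = lambda img, exp: 8.0 - img.mean() / 40.0
```

With a moderately bright preview (mean pixel value 120 on a 0-255 scale), the stand-in yields an estimate of five frames; a real trained model would of course produce its own learned estimate.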

[0030] At an operation of 103, in response to receiving an image shooting instruction, an image-shooting process may be performed based on the estimated image shooting parameter value to obtain the target synthesized image.

[0031] The image shooting instruction may be triggered by the user. When the user intends to shoot the image, the user may click an image shooting control, and the terminal device may receive the image shooting instruction.

[0032] When the image shooting parameter includes the number of images for image synthesis, the image shooting parameter value may include the target predefined synthesis number. The operation of 103 may be achieved by: acquiring a set of target preview images based on the target predefined synthesis number; and performing the image-synthesis process on the set of target preview images to obtain the target synthesized image. Specifically, the terminal device may consecutively shoot the estimated number of preview images for image synthesis, and perform image synthesis on the acquired preview images to obtain the target synthesized image. The target synthesized image may be stored in a gallery of the terminal device.
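The acquisition-and-synthesis step of the operation of 103 might be sketched as follows. Simple mean stacking stands in for the night-scene synthesis algorithm, which the disclosure leaves to the related art; the `capture_frame` callback and all names are assumptions:

```python
import numpy as np

def shoot_and_synthesize(capture_frame, n_frames: int) -> np.ndarray:
    """Consecutively capture `n_frames` preview images via the image
    shooting component and synthesize them into one target image.
    Mean stacking is used here purely as a stand-in synthesis step."""
    frames = [capture_frame() for _ in range(n_frames)]
    return np.mean(frames, axis=0)
```

In a real terminal, `n_frames` would be the target predefined synthesis number estimated by the model, and the result would be written to the device gallery.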

[0033] When the image shooting parameter includes the number of images for image synthesis and the performance parameter of the terminal device, the image shooting parameter value may include the target predefined synthesis number and the target predefined performance parameter value of the terminal device. The operation of 103 may be achieved by: acquiring a set of target preview images based on the target predefined synthesis number; and performing the image-synthesis process on the set of target preview images based on the target predefined performance parameter value of the terminal device to obtain the target synthesized image. When the image shooting parameter includes the number of images for image synthesis and the performance parameter of the terminal device, the terminal device may consecutively shoot the estimated number of preview images for image synthesis, and may set the performance parameter of the terminal device to the estimated performance parameter value. Further, based on the estimated performance parameter value of the terminal device, image synthesis may be performed on the preview images acquired by the terminal device (image synthesis may be achieved by performing a night-scene image synthesis algorithm, which is available in the related art) to obtain the synthesized image. The synthesized image may be stored in the gallery of the terminal device.

[0034] Further, referring to the operation of 101, in the case where the preview image is shot via the image shooting component and the exposure parameter value corresponding to the preview image is acquired in response to the predefined period being met, the operation of 102 may be performed once after each preview image is acquired in order to determine the image shooting parameter value of the present night scene (i.e., corresponding to a present acquisition period). In the present shooting period, the terminal device may perform image shooting and processing based on the estimated image shooting parameter value corresponding to the present shooting period in response to each image shooting instruction.

[0035] According to the present disclosure, when the terminal device is in the pending state for image shooting, the terminal device may shoot the preview image via the image shooting component and acquire the exposure parameter value corresponding to the preview image. The image shooting parameter value of the present night scene may be estimated by the image shooting parameter estimation model based on the preview image. The image shooting parameter estimation model may be trained in advance by taking the image data parameter, the exposure parameter, and the image shooting parameter as variables. The image shooting parameter may include the number of images for synthesis. In response to receiving the image shooting instruction, the terminal device may perform image shooting and processing based on the estimated image shooting parameter value. In such a way, when the user intends to shoot an image in a night scene, the terminal device may automatically calculate the number of images for image synthesis for the present night scene, and may perform image shooting and processing based on the number of images for synthesis. The user may not need to manually activate the function of night scene synthesis, improving the efficiency of image shooting.

[0036] Before the image shooting parameter estimation model acquires the image shooting parameter value of the present night scene, the image shooting parameter estimation model is required to be trained. In an optional embodiment based on the embodiment shown in FIG. 1, prior to the operation of 102, the method for image shooting may further include following operations.

[0037] The image shooting parameter estimation model may be trained by following a training principle and based on a correspondence relationship. The correspondence relationship may be correspondence among the preview image, the exposure parameter value, and the image shooting parameter value, wherein the preview image, the exposure parameter value, and the image shooting parameter value are stored in the training set in advance. The training principle may be that the image shooting parameter value estimated by the image shooting parameter estimation model approaches the image shooting parameter value corresponding to the preview image and the exposure parameter value, wherein the image shooting parameter value corresponding to the preview image and the exposure parameter value may be stored in the terminal device in advance. A trained image shooting parameter estimation model may be obtained, which takes the image data parameter, the exposure parameter, and the image shooting parameter as variables.

[0038] During implementation, the terminal device may store the training set, and the training set may include a correspondence relationship among each preview image, each exposure parameter value, and each image shooting parameter value. For each of the plurality of correspondence relationships, the image shooting parameter value may be a value of the image shooting parameter allowing the synthesized image to have optimal image quality under the situation expressed by the preview image and the exposure parameter value. The terminal device may train the image shooting parameter estimation model based on the stored training set, wherein the image shooting parameter estimation model may include a parameter to be determined. That is, the terminal device may train the image shooting parameter estimation model by following the training principle, wherein the training principle may be that the image shooting parameter value estimated by the image shooting parameter estimation model approaches the image shooting parameter value stored in the terminal device in advance, and the image shooting parameter value may correspond to the preview image and the exposure parameter value. Specifically, the terminal device may input the preview image and the exposure parameter value of each correspondence relationship into the image shooting parameter estimation model including the parameter to be determined to obtain an image shooting parameter value including a parameter to be determined. Further, a target function may be obtained by following the training principle, wherein the training principle may be that the image shooting parameter value including the parameter to be determined approaches the image shooting parameter value of the correspondence relationship.
For example, the target function may be a function of subtracting the image shooting parameter value of the correspondence relationship from the obtained image shooting parameter value including the parameter to be determined. After the target function is obtained, a training value of the parameter to be determined may be obtained by performing a gradient descent method, and the training value may be taken as a value corresponding to the parameter to be determined when training based on a next correspondence relationship. Similarly, after the training is completed, the training value of the parameter to be determined may be determined. Further, the above-mentioned image shooting parameter estimation model may be a convolutional neural network model, and in such a situation, the parameter to be determined may be each convolutional kernel of the convolutional neural network model.
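The gradient-descent training described above can be sketched in miniature. A linear model stands in for the convolutional neural network, and a squared-error objective stands in for the target function; this is an illustration under those stated assumptions, not the disclosed implementation, and all names are invented for the sketch:

```python
import numpy as np

def train_estimator(samples, lr=0.01, epochs=500):
    """Train a stand-in linear estimator by gradient descent.

    samples: list of (feature_vector, target_synthesis_number) pairs, where
    the feature vector loosely plays the role of the (preview image,
    exposure parameter value) inputs of a correspondence relationship.
    """
    dim = len(samples[0][0])
    w = np.zeros(dim)                        # the "parameter to be determined"
    for _ in range(epochs):
        for x, y in samples:
            x = np.asarray(x, dtype=float)
            err = w @ x - y                  # prediction minus stored label
            w -= lr * err * x                # gradient step on squared error
    return w
```

As in paragraph [0038], each correspondence relationship contributes one update, and the value obtained from one relationship is carried forward as the starting value for the next.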

[0039] Alternatively, each of the plurality of correspondence relationships may be selected based on quality of the synthesized image. Specifically, a predefined number of preview images may be acquired by the image shooting component. A target preview image may be determined among the acquired preview images, and a target exposure parameter value corresponding to the target preview image may be acquired. A plurality of predefined synthesis numbers may be stored in the terminal device. A predefined synthesis number of preview images including the target preview image may be selected from the predefined number of preview images, and a set of target preview images corresponding to each predefined synthesis number may be obtained. Image synthesis may be performed on the set of target preview images, and a synthesized image corresponding to each predefined synthesis number may be obtained. A target synthesized image having optimal image quality may be determined among a plurality of synthesized images corresponding to the plurality of predefined synthesis numbers, and the target predefined synthesis number corresponding to the target synthesized image may be determined as the target image shooting parameter value. The target preview image, the target exposure parameter value, and the target image shooting parameter value may be stored into the training set correspondingly.

[0040] During implementation, each of the plurality of correspondence relationships may be determined based on the predefined number of preview images acquired by the terminal device. At various night scenes (such as at various time points), the terminal device may shoot various predefined numbers of preview images, and various correspondence relationships may be determined based on the various predefined numbers of preview images acquired. For example, a first correspondence relationship may be determined based on the predefined number of preview images acquired at 19:00 by the terminal device, and a second correspondence relationship may be determined based on the predefined number of preview images acquired at 19:20 by the terminal device. A process of determining a certain correspondence relationship will be illustrated in details hereafter, and other correspondence relationships may be determined in a same manner.

[0041] Specifically, in a night scene, the terminal device may consecutively shoot the predefined number of preview images via the image shooting component. The preview images may be acquired directly by the image shooting component and may not be processed for image synthesis. Further, one preview image (referred to as the target preview image) may be selected from the acquired predefined number of preview images, and the exposure parameter value corresponding to the target preview image (referred to as the target exposure parameter value) may be acquired. The terminal device may store a plurality of predefined synthesis numbers. After the target preview image is selected, the predefined synthesis number of preview images including the target preview image may be selected from the acquired predefined number of preview images, and a set of target preview images corresponding to the predefined synthesis number may be obtained. Therefore, a plurality of sets of target preview images corresponding to the plurality of predefined synthesis numbers may be obtained. For example, as shown in FIG. 2, the terminal device may shoot three preview images: an image 1, an image 2, and an image 3. The first preview image, i.e., the image 1, may be the target preview image. The plurality of predefined synthesis numbers may be 1, 2, and 3. When the predefined synthesis number is 1, the terminal device may select the image 1 to obtain a set of target preview images corresponding to the predefined synthesis number being 1. When the predefined synthesis number is 2, the terminal device may select the image 1 and the image 2 to obtain a set of target preview images corresponding to the predefined synthesis number being 2. When the predefined synthesis number is 3, the terminal device may select the image 1, the image 2, and the image 3 to obtain a set of target preview images corresponding to the predefined synthesis number being 3.

[0042] After the set of target preview images of each predefined synthesis number is obtained, image synthesis may be performed on the set of target preview images, that is, image synthesis may be performed on the target preview images in the set to obtain the synthesized image corresponding to each predefined synthesis number. After a plurality of synthesized images corresponding to a plurality of predefined synthesis numbers are obtained, image quality (such as image definition) of each of the plurality of synthesized images may be calculated. A synthesized image having the optimal image quality (referred to as a target synthesized image) among the plurality of synthesized images may be determined, and the predefined synthesis number corresponding to the target synthesized image may be determined and referred to as a target predefined synthesis number. After the target predefined synthesis number is determined, the target predefined synthesis number may further be determined as the target image shooting parameter value. The target preview image, the target exposure parameter value, and the target image shooting parameter value may be determined and subsequently stored into the training set correspondingly. The target preview image, the target exposure parameter value, and the target image shooting parameter value corresponding to another time point may be obtained by the terminal device by following the above-mentioned process similarly. In such a way, each correspondence relationship in the training set may be obtained.
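The selection of the target predefined synthesis number by image quality can be sketched as follows. Gradient energy is used here as a stand-in definition metric (the disclosure does not fix a particular quality metric), mean stacking stands in for the synthesis algorithm, and all names are assumptions:

```python
import numpy as np

def sharpness(img: np.ndarray) -> float:
    """Gradient-energy proxy for image definition (an assumed metric)."""
    gy, gx = np.gradient(img.astype(float))
    return float((gx ** 2 + gy ** 2).mean())

def best_synthesis_number(previews, candidate_numbers):
    """For each predefined synthesis number, synthesize that many previews
    (always including the target preview, previews[0]) and return the
    number whose synthesized image scores best on the quality metric."""
    scored = {}
    for n in candidate_numbers:
        synthesized = np.mean(previews[:n], axis=0)  # stand-in synthesis
        scored[n] = sharpness(synthesized)
    return max(scored, key=scored.get)
```

With the stand-in metric, a sharp preview diluted by flat frames scores best on its own, so the sketch picks a synthesis number of 1 in that contrived case; a real definition metric on real night-scene frames would favor multi-frame synthesis.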

[0043] Alternatively, the image shooting parameter may further include the performance parameter of the terminal device. The number of images for image synthesis corresponding to the preview image and the exposure parameter, and the performance parameter of the terminal device corresponding to the preview image and the exposure parameter, may be determined during determining the training set. The terminal device may perform the operation of: performing image synthesis on each set of target preview images based on a plurality of performance parameter values of the terminal device to obtain a synthesized image corresponding to the predefined synthesis number and each of the plurality of performance parameter values of the terminal device. Accordingly, the terminal device may further perform the operations of: determining the target synthesized image having the optimal image quality among the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers and the plurality of performance parameter values of the terminal device; and determining the target predefined synthesis number and a target performance parameter value of the terminal device corresponding to the target synthesized image to be the target image shooting parameter value.

[0044] During implementation, the terminal device may store a plurality of performance parameter values. In a situation where the image shooting parameter includes the performance parameter of the terminal device, a plurality of sets of target preview images corresponding to the plurality of predefined synthesis numbers may be determined, and the terminal device may subsequently perform image synthesis on the plurality of sets of target preview images based on the plurality of performance parameter values of the terminal device respectively to obtain a plurality of synthesized images corresponding to the predefined synthesis numbers and the plurality of predefined performance parameter values of the terminal device. That is, each predefined synthesis number and each performance parameter value may correspond to one of the plurality of synthesized images; in other words, the number of obtained synthesized images may be equal to the number of the predefined synthesis numbers multiplied by the number of the performance parameter values of the terminal device. For example, as shown in FIG. 3, one of the plurality of predefined synthesis numbers is 2, and the plurality of performance parameter values of the terminal device may include a value a and a value b. For the set of target preview images corresponding to the predefined synthesis number being 2, the terminal device may perform image synthesis on the set of target preview images, in response to the performance parameter value of the terminal device being a (i.e., the terminal device may set the performance parameter of the terminal device to be a), to obtain a synthesized image corresponding to the predefined synthesis number being 2 and the performance parameter value of the terminal device being a. The terminal device may perform image synthesis on the set of target preview images, in response to the performance parameter value of the terminal device being b, to obtain a synthesized image corresponding to the predefined synthesis number being 2 and the performance parameter value of the terminal device being b.
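The grid of candidates described above (each predefined synthesis number combined with each performance parameter value) can be sketched as follows. The `synthesize_with_perf` and `quality` functions are hypothetical stand-ins assumed for illustration.

```python
import itertools

def synthesize_with_perf(images, perf):
    # Hypothetical stand-in: per-pixel average of the stack, scaled by the
    # performance parameter value.
    return [perf * sum(px) / len(px) for px in zip(*images)]

def quality(image):
    return sum(image) / len(image)  # toy quality metric

def pick_number_and_perf(target_sets, perf_values):
    """Evaluate every (synthesis number, performance value) pair; as the
    paragraph above notes, the number of synthesized images equals
    len(target_sets) * len(perf_values)."""
    candidates = itertools.product(target_sets, perf_values)
    return max(
        candidates,
        key=lambda c: quality(synthesize_with_perf(target_sets[c[0]], c[1])),
    )

# Hypothetical 2-pixel "images" and two performance parameter values a=0.5, b=1.0.
target_sets = {2: [[10, 10], [30, 30]], 3: [[10, 10], [30, 30], [20, 20]]}
print(pick_number_and_perf(target_sets, [0.5, 1.0]))  # (2, 1.0)
```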

[0045] After the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers and the plurality of predefined performance parameter values of the terminal device are obtained, the image quality of each of the plurality of synthesized images may be calculated. Among the plurality of synthesized images, the target synthesized image having the optimal image quality may be determined. Further, the target predefined synthesis number and the target predefined performance parameter value of the terminal device corresponding to the target synthesized image may be determined to be the target image shooting parameter value. The target preview image, the target exposure parameter value, and the target image shooting parameter value may be stored into the training set correspondingly.

[0046] Alternatively, the terminal device may record the power consumption for performing image synthesis on each set of target preview images. Accordingly, the terminal device may perform the following operations to determine the target image shooting parameter value. The terminal device may record the power consumption for obtaining each synthesized image corresponding to each predefined synthesis number. The target synthesized image having the comprehensively optimal image quality and power consumption may be determined among the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers, and the target predefined synthesis number corresponding to the target synthesized image may be determined to be the target image shooting parameter value.

[0047] During implementation, after the set of target preview images corresponding to each of the plurality of predefined synthesis numbers is obtained, image synthesis may be performed on the set of target preview images, the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers may be obtained, and the power consumption for performing image synthesis may be recorded. That is, the power consumption for obtaining each synthesized image corresponding to each predefined synthesis number may be recorded. The power consumption may include one or more of: a consumed battery level and a duration spent for image synthesis. In such a situation, after the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers and the power consumption corresponding to the plurality of predefined synthesis numbers are obtained, the terminal device may determine the target synthesized image having the comprehensively optimal image quality and power consumption among the plurality of synthesized images (for example, a synthesized image having the greatest quotient of the image quality to the power consumption may be determined to be the target synthesized image). The target predefined synthesis number corresponding to the target synthesized image may be determined to be the target image shooting parameter value.
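The quality-per-power selection rule mentioned above can be sketched directly. The measurements below are hypothetical example values, not figures from the disclosure.

```python
def pick_by_quality_per_power(candidates):
    """candidates: (synthesis_number, image_quality, power_consumption) tuples.
    The 'comprehensively optimal' rule sketched in paragraph [0047]: pick
    the candidate with the greatest quotient of quality to power."""
    return max(candidates, key=lambda c: c[1] / c[2])[0]

# Hypothetical measurements: more frames raise quality but cost more power.
candidates = [(1, 50.0, 1.0), (2, 80.0, 2.0), (3, 90.0, 3.0)]
print(pick_by_quality_per_power(candidates))  # 1 (quotient 50 beats 40 and 30)
```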

[0048] An apparatus embodiment of the present disclosure will be described hereafter, and may be arranged to perform the method embodiments of the present disclosure. For details not illustrated in the apparatus embodiment, reference may be made to the method embodiments of the present disclosure.

[0049] As shown in FIG. 4, a diagram of an image shooting apparatus according to an embodiment of the present disclosure is provided. The apparatus may have a function of performing the method as described in the above embodiments, and the function may be realized by hardware or by hardware executing corresponding software. The apparatus may include the following modules. A first capturing module 410 may be arranged to allow an image shooting component to shoot a preview image, and may be arranged to acquire an exposure parameter value corresponding to the preview image, when the terminal device is in a pending state for image shooting.

[0050] An estimation module 420 may be arranged to obtain an image shooting parameter value of a present night scene via an image shooting parameter estimation model based on the preview image and the exposure parameter corresponding to the preview image. The image shooting parameter estimation model may be trained by taking an image data parameter, the exposure parameter, and the image shooting parameter as variables. The image shooting parameter may include the number of images for image synthesis, and the number of images for image synthesis may be the number of preview images required to obtain a target synthesized image.

[0051] An execution module 430 may be arranged to perform an image-shooting process, in response to receiving an image shooting instruction, based on the image shooting parameter, to obtain the target synthesized image.

[0052] Alternatively, the image shooting parameter may include a performance parameter of the terminal device.

[0053] Alternatively, as shown in FIG. 5, the apparatus may further include the following modules.

[0054] A training module 440 may be arranged to train the image shooting parameter estimation model based on a correspondence relationship and by following a training principle. The correspondence relationship may be correspondence among the preview image, the exposure parameter value, and the image shooting parameter value. The training principle may be to make the image shooting parameter value estimated by the image shooting parameter estimation model approach a stored image shooting parameter value corresponding to the preview image and the exposure parameter.
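The training principle above (drive the model's estimate toward the stored value) can be sketched as a least-squares update. The linear model, feature vector, and learning rate are illustrative assumptions; the disclosure does not specify the model's internal form.

```python
def estimate(weights, features):
    # Linear stand-in for the image shooting parameter estimation model.
    return sum(w * x for w, x in zip(weights, features))

def train_step(weights, features, stored_value, lr=0.01):
    """One gradient step on the squared error between the estimated and the
    stored image shooting parameter value, so that over repeated steps the
    estimate approaches the stored value, as the training principle requires."""
    error = estimate(weights, features) - stored_value
    return [w - lr * error * x for w, x in zip(weights, features)]

# Hypothetical features derived from (preview image, exposure parameter),
# with a stored target image shooting parameter value of 3.0.
w = [0.0, 0.0]
for _ in range(500):
    w = train_step(w, [1.0, 2.0], 3.0)
print(round(estimate(w, [1.0, 2.0]), 3))
```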

[0055] Alternatively, as shown in FIG. 5, the apparatus may further include the following modules.

[0056] A second capturing module 450 may be arranged to allow the image shooting component to shoot a predefined number of preview images.

[0057] A first determination module 460 may be arranged to determine a target preview image from the acquired predefined number of preview images, and may be arranged to acquire a target exposure parameter corresponding to the target preview image.

[0058] A second determination module 470 may be arranged to select a predefined synthesis number of preview images including the target preview image from the predefined number of preview images to obtain a set of target preview images corresponding to the predefined synthesis number, and may be arranged to perform an image-synthesis process on the set of target preview images to obtain a synthesized image corresponding to the predefined synthesis number. A plurality of predefined synthesis numbers may be stored in the terminal device, a plurality of sets of target preview images may be obtained correspondingly, and a plurality of synthesized images may be obtained correspondingly.

[0059] A third determination module 480 may be arranged to determine a target synthesized image having optimal image quality from the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers, and may be arranged to determine a target predefined synthesis number corresponding to the target synthesized image to be the target image shooting parameter.

[0060] A storage module 490 may be arranged to store the target preview image, the target exposure parameter value, and the target image shooting parameter value into the training set correspondingly.

[0061] Alternatively, the second determination module 470 may be arranged to perform the following operations.

[0062] Image synthesis may be performed on the set of target preview images based on a plurality of predefined performance parameter values of the terminal device respectively, and the synthesized images corresponding to the predefined synthesis number and each of the plurality of performance parameter values of the terminal device may be obtained.

[0063] The third determination module 480 may further be arranged to perform the following operations.

[0064] The target synthesized image having optimal image quality may be determined from the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers and the plurality of predefined performance parameter values of the terminal device. The target predefined synthesis number and the target performance parameter value of the terminal device corresponding to the target synthesized image may be determined to be the target image-shooting parameter value.

[0065] Alternatively, as shown in FIG. 5, the apparatus may further include the following modules.

[0066] A recording module 4100 may be arranged to record power consumption for obtaining each of the plurality of synthesized images corresponding to each of the plurality of predefined synthesis numbers.

[0067] The third determination module 480 may be arranged to perform the following operations.

[0068] The target synthesized image having comprehensively optimal image quality and power consumption may be determined from the plurality of synthesized images, and the target predefined synthesis number corresponding to the target synthesized image may be determined to be the target image shooting parameter.

[0069] Alternatively, the image shooting parameter may include the target predefined synthesis number, and the execution module 430 may be arranged to perform the following operations.

[0070] The set of target preview images may be obtained based on the target predefined synthesis number, and image synthesis may be performed on the set of target preview images to obtain the target synthesized image.

[0071] Alternatively, the image shooting parameter may include the target predefined performance parameter of the terminal device, and the execution module 430 may be arranged to perform the following operations.

[0072] Image synthesis may be performed on the set of target preview images, based on the target predefined performance parameter of the terminal device, to obtain the target synthesized image.

[0073] Alternatively, as shown in FIG. 5, the apparatus may further include the following modules.

[0074] A detection module 4110 may be arranged to detect whether a present time stamp is within a predefined time range.

[0075] The first capturing module 410 may be arranged to perform the preview image shooting process via the image shooting component, in response to the present time stamp being within the predefined time range, and may be arranged to acquire the exposure parameter corresponding to the preview image.

[0076] In the embodiment of the present disclosure, when the terminal device is in the pending state for image shooting, the preview image may be acquired via the image shooting component, and the exposure parameter value corresponding to the preview image may be acquired. The image shooting parameter value of the present night scene may be estimated based on the preview image and the image shooting parameter estimation model. The image shooting parameter estimation model may be trained in advance by taking the image data parameter, the exposure parameter, and the image shooting parameter as variables. The image shooting parameter may include the number of images for synthesis. In response to receiving an image shooting instruction, an image-shooting process may be performed based on the estimated image shooting parameter value. In such a way, when the user intends to shoot an image in a night scene, the terminal device may automatically calculate the number of images for synthesis for the present night scene. Further, the acquired image may be processed based on the number of images for synthesis, and the user does not need to manually activate the function of image synthesis for night scenes. Therefore, the efficiency of image shooting may be improved.

[0077] It should be noted that, when the image shooting apparatus provided by the above-mentioned embodiments shoots images, the division into the functional modules described above is merely exemplary. In practical applications, the functions may be assigned to different functional modules as needed; that is, the internal structure of the terminal device may be divided into different functional modules to achieve all of or a part of the above-mentioned functions. In addition, the apparatus for image shooting provided in the above-mentioned embodiments and the method for image shooting provided in the embodiments share a same inventive concept. For the process of achieving the functions, reference may be made to the method embodiments, which will not be repeatedly described herein.

[0078] As shown in FIG. 6 and FIG. 7, a structural diagram of the terminal device 100 according to an embodiment of the present disclosure is provided. The terminal device 100 may be a mobile phone, a tablet computer, a laptop, an electronic book, and the like. The terminal device 100 of the present disclosure may include one or more of: a processor 110, a non-transitory memory 120, and a touch screen 130.

[0079] The processor 110 may include one or more processing cores. The processor 110 may be arranged to connect each internal component of the terminal device 100 via various interfaces and lines, and may be arranged to execute various functions of the terminal device 100 and process data by running or executing an instruction, a program, a code set, or an instruction set stored in the non-transitory memory 120 and by invoking data stored in the non-transitory memory 120. Alternatively, the processor 110 may be realized by at least one of: a hardware form of digital signal processing (DSP), a hardware form of field-programmable gate array (FPGA), and a hardware form of programmable logic array (PLA). The processor 110 may integrate at least one of: a central processing unit (CPU), a graphics processing unit (GPU), and a modem. The CPU may substantially be arranged to process an operating system, a user interface, an application, and the like. The GPU may be arranged to render and draw content to be displayed by the touch screen 130. The modem may be arranged to process wireless communication. It may be understood that the modem may not be integrated into the processor 110, but may instead be implemented as a separate chip.

[0080] The non-transitory memory 120 may include a random access memory (RAM) or a read-only memory (ROM). Alternatively, the non-transitory memory 120 may include a non-transitory computer-readable storage medium. The non-transitory memory 120 may be arranged to store the instruction, the program, the code, the code set, or the instruction set. The non-transitory memory 120 may include a program storage area and a data storage area. An instruction for achieving the operating system, an instruction for achieving at least one function (such as touching, audio playing, image playing, and the like), and an instruction for performing the method in each embodiment may be stored in the program storage area. Data generated based on usage of the terminal device 100 (such as audio data and contact information) may be stored in the data storage area.

[0081] Taking the Android operating system as an example, the program and the data stored in the non-transitory memory 120 may be shown in FIG. 6. The non-transitory memory 120 may be arranged to store a Linux kernel layer 220, a library and Android runtime level 240, an application framework layer 260, and an application layer 280. The Linux kernel layer 220 may provide underlying drivers for various hardware of the terminal device 100, such as a display driver, an audio driver, a camera driver, a Bluetooth driver, a Wireless-Fidelity (Wi-Fi) driver, power management, and the like. The library and Android runtime level 240 may provide a feature support for the Android system via some C/C++ libraries. For example, a SQLite library may provide a support for databases, an OpenGL/ES library may provide a support for three-dimensional drawing, and a Webkit library may provide a support for a browser kernel. The library and Android runtime level 240 may further include an Android Runtime, and the Android Runtime may provide some core libraries allowing a developer to program Android applications through Java. The application framework layer 260 may provide various application programming interfaces (API) for generating applications, and the developer may generate their own applications, such as activity management, window management, view management, notification management, a content provider, package management, call management, resource management, and positioning management, through the various APIs. At least one program may run in the application layer 280, and the at least one program may be originally installed with the system, such as a contact program, a message program, a clock program, a camera application, and the like; alternatively, the at least one program may be developed by a third-party developer, such as a real-time communication program, an image retouching program, and the like.

[0082] Taking an iPhone operating system (iOS) as an example, the program and the data stored in the non-transitory memory 120 may be shown in FIG. 9. The iOS may include: a core OS layer 320, a core service layer 340, a media layer 360, and a Cocoa Touch layer 380. The core OS layer 320 may include an OS kernel, driving programs, and underlying program frameworks. The functions provided by the underlying program frameworks may be the closest to the hardware, and the application frameworks in the core service layer 340 may use the underlying program frameworks. The core service layer 340 may provide systemic services and/or program frameworks required by an application, such as a foundation framework, an account framework, an advertisement framework, a data storage framework, a network connection framework, a geographic location framework, a motion framework, and the like. The media layer 360 may provide audio-visual related interfaces for the application, such as graphics-related and image-related interfaces, audio technology-related interfaces, video technology-related interfaces, an AirPlay interface for audio-video transmission, and the like. The Cocoa Touch layer 380 may provide various frameworks related to common interfaces for program development, such as a local notification service, a remote notification service, an advertisement framework, a game tool framework, a user interface (UI) framework, a UI kit framework, a map framework, and the like, and may manage interactive touching operations performed by a user on the terminal device 100.

[0083] As shown in FIG. 7, the frameworks related to most applications may include, but may not be limited to, the foundation framework in the core service layer 340 and the UI kit framework in the Cocoa Touch layer 380. The foundation framework may provide various foundational object types and data types, and may provide foundational systemic services for all applications, but may not be related to the UI. The UI kit framework may provide a foundational UI class library for generating a touchable UI. Applications of iOS may provide the UI based on the UI kit framework. Therefore, the UI kit framework may provide foundational frameworks for the applications to generate the UI, to draw, to process interactive events with the user, to respond to gestures, and the like.

[0084] The touch screen 130 may be arranged to receive a touch operation performed on or near the touch screen 130 by a finger, a stylus, or any other appropriate object, and may be arranged to display the user interface of each application. The touch screen 130 may commonly be arranged on a front plate of the terminal device 100. The touch screen 130 may be provided as a full screen, a curved screen, or an anomalous screen. The touch screen 130 may also be provided as a curved full screen or a curved anomalous screen, which will not be limited by the present disclosure.

[0085] A full screen may be provided.

[0086] For the full screen, an area ratio occupied by the touch screen 130 may be greater than a threshold value (such as 80%, 90%, or 95%). The area ratio occupied by the touch screen 130 may be calculated as: (the area of the touch screen 130 / the area of the front plate of the terminal device 100) × 100%. The area ratio occupied by the touch screen 130 may also be calculated as: (the area of a display region defined by the touch screen 130 / the area of the front plate of the terminal device 100) × 100%. The area ratio occupied by the touch screen 130 may also be calculated as: (a diagonal of the touch screen 130 / a diagonal of the front plate of the terminal device 100) × 100%. According to an example provided in FIG. 8, almost the entire area of the front plate of the terminal device 100 is covered by the touch screen 130. On the front plate 40 of the terminal device 100, all the area, excluding an edge defined by a middle frame, may be covered by the touch screen 130. Four corners of the touch screen 130 may be right angles or rounded corners.
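The first area-ratio formula above can be computed directly. The areas below are hypothetical example values chosen for illustration.

```python
def screen_to_body_ratio(screen_area, front_plate_area):
    """(area of the touch screen / area of the front plate) x 100%,
    per the first formula in paragraph [0086]."""
    return screen_area / front_plate_area * 100.0

# Hypothetical areas in mm^2: a 9000 mm^2 screen on a 10000 mm^2 front plate.
ratio = screen_to_body_ratio(9000.0, 10000.0)
print(ratio)  # 90.0, above an 80% full-screen threshold
```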

[0087] For the full screen, at least one front-plate component may be integrated within or under the touch screen 130. Alternatively, the at least one front-plate component may include: a camera, a fingerprint sensor, a proximity light sensor, a distance sensor, and the like. In some embodiments, other components arranged on the front plate of a conventional terminal device may be integrated into an entire region or a part of the entire region of the touch screen 130. For example, a photosensitive element of the camera may be divided into various photosensitive pixels, and each of the various photosensitive pixels may be integrated into a black region of each display pixel of the touch screen 130. As the at least one front-plate component is integrated within the touch screen 130, the area ratio occupied by the full screen may be increased.

[0088] In other embodiments, the at least one front-plate component of a conventional terminal device may be arranged on a side or a back of the terminal device 100. For example, an ultrasound fingerprint sensor may be arranged below the touch screen 130, a bone conduction earpiece may be arranged inside the terminal device 100, and the camera may be arranged on a side of the terminal device 100 or may be arranged as pluggable.

[0089] In some available embodiments, when the terminal device 100 is arranged with the full screen, one side, two sides (such as a left side and a right side), or four sides (such as a top side, a bottom side, a left side, and a right side) of the middle frame of the terminal device 100 may be arranged with an edged touch sensor 120. The edged touch sensor 120 may be arranged to detect at least one of the touch operation, a click operation, a press operation, and a sliding operation performed by the user. The edged touch sensor 120 may be any one of a touch sensor, a thermal sensor, and a pressure sensor. The user may perform operations on the edged touch sensor 120 to manipulate the applications and processes of the terminal device 100.

[0090] A curved screen may be arranged.

[0091] For a curved screen, a screen region of the touch screen 130 may not be planar. Typically, the curved screen may at least have a cross section. The cross section may be curved, a projection of the curved screen along a direction perpendicular to the cross section may be a flat plane, and the curved cross section may be U-shaped. Alternatively, at least one side edge of the curved screen may be curved. Alternatively, at least one side edge of the curved touch screen 130 may extend to cover the middle frame of the terminal device 100. In such a way, the middle frame, which may initially be unable to display and unable to respond to the touch operation, may be covered by the curved touch screen 130 to form a display region and/or a region able to receive operations, such that the area ratio occupied by the curved screen may be increased. Alternatively, as shown in FIG. 9, the left side and the right side 42 of the touch screen may be curved. Alternatively, the top side and the bottom side of the touch screen may be curved. Alternatively, the top side, the bottom side, the left side, and the right side of the touch screen may be curved. In an available embodiment, the touch screen may be made of a flexible material.

[0092] An anomalous screen may be arranged.

[0093] For an anomalous screen, a shape of the touch screen may be irregular, where the irregular shape is neither a rectangle nor a rounded rectangle. Alternatively, the anomalous screen may refer to a rectangular or a rounded rectangular touch screen 130 having a protrusion or a notch, and/or defining a hole. Alternatively, the protrusion, the notch, and/or the hole may be arranged or defined on an edge and/or in a center of the touch screen 130. The protrusion, the notch, and/or the hole may be arranged or defined at a middle or two ends of the edge. The protrusion, the notch, and/or the hole may be arranged or defined in at least one of a top region, a left top region, a left region, a left bottom region, a bottom region, a right bottom region, a right region, and a right top region. When protrusions, notches, and/or holes are arranged or defined in various regions, they may be clustered or spaced apart from each other, and may be distributed symmetrically or asymmetrically. The numbers of the protrusions, the notches, and the holes may not be limited by the present disclosure.

[0094] As the anomalous screen may cover an upper region and a bottom region of the touch screen, allowing the upper and bottom regions to display and/or to receive operations, the area of the front plate of the terminal device occupied by the touch screen may be increased, and the occupation ratio of the anomalous screen may be greater. In some embodiments, the notch and/or the hole may be defined to receive the at least one front-plate component, and the at least one front-plate component may include at least one of the camera, the fingerprint sensor, the proximity light sensor, the distance sensor, an earpiece, an ambient light sensor, and a physical button.

[0095] For example, the notch may be defined on one or more edges. The notch may be a semicircular notch, a rectangular notch with right angles, a rounded rectangular notch, or an irregular notch. According to examples shown in FIG. 10, the anomalous screen may refer to a touch screen 130 defining the semicircular notch 43 at a middle of the top side, and the notch 43 may be defined to receive at least one front-plate component, including the camera, the distance sensor (also referred to as a proximity sensor), the earpiece, and the ambient light sensor. As shown in FIG. 11, the anomalous screen may refer to a touch screen 130 defining the semicircular notch 44 at a middle of the bottom side, and the semicircular notch 44 may be defined to receive at least one of the physical button, the fingerprint sensor, and a microphone. As shown in FIG. 12, the anomalous screen may refer to a touch screen 130 defining a half-elliptical notch 45 at a middle of the bottom side and defining another half-elliptical notch on the front plate of the terminal device 100. A side wall of the half-elliptical notch 45 and a side wall of the other half-elliptical notch may be combined to form an elliptical region to receive the physical button or the fingerprint sensor. As shown in FIG. 13, the anomalous screen may refer to a touch screen 130 defining a hole 45 at a top, and the hole 45 may be defined to receive at least one front-plate component, including the camera, the distance sensor, the earpiece, and the ambient light sensor.

[0096] Further, it may be understood by any one skilled in the art that the structure of the terminal device 100 as shown in the above-mentioned figures may not limit the terminal device 100. The terminal device may be arranged with more or fewer components compared to the components shown in the figures, some components may be combined, or the components may be distributed in a different manner. For example, the terminal device 100 may further include a radio frequency circuit, an input unit, a sensor, an audio circuit, a Wi-Fi module, a power supply, a Bluetooth module, and the like, which may not be illustrated in detail herein.

[0097] In the embodiments of the present disclosure, when the terminal device is in the pending state for image shooting, the image shooting component may acquire the preview image, and the exposure parameter corresponding to the preview image may be acquired. Based on the preview image and the image shooting parameter estimation model, the image shooting parameter of the present night scene may be estimated, wherein the image shooting parameter estimation model may be trained in advance by taking the image data parameter, the exposure parameter, and the image shooting parameter as variables, and the image shooting parameter may include the number of images for synthesis. In response to receiving the image shooting instruction, an image-shooting process may be performed based on the estimated image shooting parameter. In such a way, when the user intends to shoot an image in a night scene, the terminal device may automatically calculate the number of images for synthesis for the present night scene. Therefore, the image-shooting process may be performed based on the number of images for synthesis, and the user is not required to manually activate the function of image synthesis for night scenes, improving the efficiency of image shooting.

[0098] Any person skilled in the art may understand that all or a part of the operations of the above-mentioned embodiments may be implemented by hardware, or by programs instructing related hardware. The program may be stored in a non-transitory computer-readable storage medium, and the storage medium may be a read-only memory (ROM), a magnetic disk, or an optical disc.

[0099] The above-mentioned embodiments are preferred embodiments of the present disclosure and shall not limit the present disclosure. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present disclosure shall fall within the scope of the present disclosure.
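The construction of the training set used by the estimation model (shooting a batch of preview images, synthesizing each candidate number of frames, and labeling the sample with the number that yields the best image) can be sketched as follows. This is a hypothetical illustration: `quality_score` stands in for whatever image-quality metric the terminal applies, simple pixel-wise averaging stands in for the image-synthesis process, and the function name is invented for this sketch.

```python
def build_training_sample(frames, exposure_value, candidate_numbers, quality_score):
    """Pick the synthesis number whose synthesized image scores best,
    and return one (preview image, exposure, label) training sample."""
    target = frames[0]  # target preview image (first frame, for simplicity)
    best_k, best_q = None, float("-inf")
    for k in candidate_numbers:
        # Pixel-wise mean of the first k frames (flat grayscale lists here).
        synthesized = [sum(f[i] for f in frames[:k]) / k for i in range(len(target))]
        q = quality_score(synthesized)
        if q > best_q:
            best_k, best_q = k, q
    return (target, exposure_value, best_k)
```

Repeating this over many scenes yields the pre-stored correspondence among preview images, exposure values, and synthesis numbers on which the estimation model is trained.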


Claims

1. A method of image shooting, comprising:

shooting preview images via an image shooting component and acquiring values of an exposure parameter corresponding to the preview images, when a terminal device is in a pending state for image shooting;

estimating a value of an image-shooting parameter of a present night scene via an image shooting parameter estimation model according to the preview images and the values of the exposure parameter corresponding to the preview images, wherein

the image shooting parameter estimation model is trained by taking an image data parameter, the exposure parameter, and the image shooting parameter as variables; and

the image shooting parameter comprises an image-synthesizing number, which is a number of preview images required to obtain a target synthesized image; and

performing an image-shooting process according to the estimated value of the image-shooting parameter, in response to an image-shooting instruction, to obtain the target synthesized image.


 
2. The method according to claim 1, wherein the image-shooting parameter further comprises a performance parameter of the terminal device.
 
3. The method according to claim 1 or claim 2, wherein before the estimating a value of an image-shooting parameter of a present night scene via an image shooting parameter estimation model according to the preview images and the values of the exposure parameter corresponding to the preview images, the method further comprises:
training the image shooting parameter estimation model based on a pre-stored correspondence relationship among preview images, values of the exposure parameter, and values of the image-shooting parameter in a training set, and by following a pre-stored training principle that values of the image-shooting parameter estimated by the estimation model approach the values of the image-shooting parameter corresponding to the preview images and the values of the exposure parameter, to obtain the estimation model.
 
4. The method according to claim 3, further comprising:

shooting a predefined number of preview images via the image shooting component;

determining a target preview image among the predefined number of preview images and acquiring a target value of the exposure parameter corresponding to the target preview image;

selecting a predefined synthesis number of preview images comprising the target preview image from the predefined number of preview images to obtain a set of target preview images corresponding to the predefined synthesis number, wherein a plurality of predefined synthesis numbers are pre-stored in the terminal device, and a plurality of sets of target preview images corresponding to the plurality of predefined synthesis numbers are obtained;

performing an image-synthesis process for each of the plurality of sets of target preview images to obtain a synthesized image corresponding to the predefined synthesis number;

determining the target synthesized image having optimal image quality from the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers, and determining a target predefined synthesis number corresponding to the target synthesized image to be a target value of the image-shooting parameter; and

storing the target preview image, the target value of the exposure parameter, and the target value of the image-shooting parameter into the training set correspondingly.


 
5. The method according to claim 4, wherein
the performing an image-synthesis process for each of the plurality of sets of target preview images to obtain a synthesized image corresponding to the predefined synthesis number comprises:
performing the image-synthesis process to the set of target preview images based on a plurality of values of a performance parameter of the terminal device respectively to obtain a plurality of synthesized images corresponding to the predefined synthesis number and the plurality of values of the performance parameter of the terminal device; and
the determining the target synthesized image having optimal image quality from the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers, and determining a target predefined synthesis number corresponding to the target synthesized image to be a target value of the image-shooting parameter, comprises:
determining the target synthesized image having the optimal image quality from the obtained plurality of synthesized images corresponding to the plurality of predefined synthesis numbers and the plurality of values of the performance parameter, and determining the target predefined synthesis number and a target predefined value of the performance parameter of the terminal device corresponding to the target synthesized image to be the target value of the image-shooting parameter.
 
6. The method according to claim 4, further comprising:

recording power consumption for obtaining the synthesized image corresponding to the predefined synthesis number; and

the determining the target synthesized image having optimal image quality from the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers, and determining a target predefined synthesis number corresponding to the target synthesized image to be a target value of the image-shooting parameter, comprises:
determining the target synthesized image having comprehensively optimal image quality and power consumption from the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers, and determining the target predefined synthesis number to be the target value of the image-shooting parameter.


 
7. The method according to any one of claims 1 to 6, wherein the performing an image-shooting process according to the estimated value of the image-shooting parameter, in response to an image-shooting instruction, to obtain the target synthesized image comprises:

acquiring a set of target preview images based on the target predefined synthesis number; and

performing the image-synthesis process to the set of target preview images to obtain the target synthesized image.


 
8. The method according to claim 7, wherein the value of the image-shooting parameter comprises a target value of the performance parameter of the terminal device, and the performing the image-synthesis process to the set of target preview images to obtain the target synthesized image comprises:
performing the image-synthesis process to the set of target preview images based on the target value of the performance parameter of the terminal device to obtain the target synthesized image.
 
9. The method according to any one of claims 1 to 6, wherein before the shooting preview images via an image shooting component and acquiring values of an exposure parameter corresponding to the preview images, the method further comprises:

detecting whether a present time stamp is within a predefined time range; and

performing, in response to the present time stamp being within the predefined time range, the shooting preview images via an image shooting component and acquiring values of an exposure parameter corresponding to the preview images.


 
10. An apparatus for image shooting, comprising:

a first capturing module, arranged to shoot, via an image shooting component, preview images and to acquire values of an exposure parameter corresponding to the preview images, when a terminal device is in a pending state for image shooting;

an estimation module, arranged to estimate, via an image shooting parameter estimation model, a value of an image-shooting parameter of a present night scene according to the preview images and the values of the exposure parameter corresponding to the preview images, wherein

the image shooting parameter estimation model is trained by taking an image data parameter, the exposure parameter, and the image shooting parameter as variables; and

the image shooting parameter comprises an image-synthesizing number, which is a number of preview images required to obtain a target synthesized image; and

an execution module, arranged to perform an image-shooting process according to the estimated value of the image-shooting parameter, in response to an image-shooting instruction, to obtain the target synthesized image.


 
11. The apparatus according to claim 10, wherein the image shooting parameter further comprises a performance parameter of the terminal device.
 
12. The apparatus according to claim 10 or claim 11, further comprising:
a training module, arranged to train the image shooting parameter estimation model based on a pre-stored correspondence relationship among preview images, values of the exposure parameter, and values of the image-shooting parameter in a training set, and by following a pre-stored training principle that values of the image-shooting parameter estimated by the estimation model approach the values of the image-shooting parameter corresponding to the preview images and the values of the exposure parameter, to obtain the estimation model.
 
13. The apparatus according to claim 12, further comprising:

a second image capturing module, arranged to shoot, via the image shooting component, a predefined number of preview images;

a first determination module, arranged to determine a target preview image from the predefined number of preview images and acquire a target value of the exposure parameter corresponding to the target preview image;

a second determination module, arranged to select a predefined synthesis number of preview images comprising the target preview image from the predefined number of preview images to obtain a set of target preview images corresponding to the predefined synthesis number; and arranged to perform an image synthesis process to the set of target preview images to obtain a synthesized image corresponding to the predefined synthesis number, wherein a plurality of predefined synthesis numbers are stored in the terminal device, a plurality of sets of target preview images corresponding to the plurality of predefined synthesis numbers are obtained, and a plurality of synthesized images corresponding to the plurality of predefined synthesis numbers are obtained;

a third determination module, arranged to determine the target synthesized image having optimal image quality from the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers, and arranged to determine a target predefined synthesis number corresponding to the target synthesized image to be a target value of the image shooting parameter; and

a storage module, arranged to store the target preview image, the target value of the exposure parameter, the target value of the image shooting parameter into the training set correspondingly.


 
14. The apparatus according to claim 13, wherein
the second determination module is arranged to perform operations of:
performing the image-synthesis process to the set of target preview images, based on a plurality of predefined values of a performance parameter of the terminal device respectively, to obtain a synthesized image corresponding to the predefined synthesis number and each of the plurality of predefined values of the performance parameter of the terminal device, wherein a plurality of synthesized images are obtained corresponding to the plurality of predefined synthesis numbers and the plurality of predefined values of the performance parameter of the terminal device; and
the third determination module is arranged to perform operations of:
determining the target synthesized image having the optimal image quality from the obtained plurality of synthesized images corresponding to the plurality of predefined synthesis numbers and the plurality of predefined values of the performance parameter of the terminal device, and determining the target predefined synthesis number and a target predefined value of the performance parameter of the terminal device corresponding to the target synthesized image to be target values of the image shooting parameter.
 
15. The apparatus according to claim 13, further comprising:

a recording module, arranged to record power consumption for obtaining each of the plurality of synthesized images corresponding to each of the plurality of predefined synthesis numbers; and

the third determination module, arranged to perform operations of:
determining the target synthesized image having comprehensively optimal image quality and the power consumption from the plurality of synthesized images corresponding to the plurality of predefined synthesis numbers, and determining the target predefined synthesis number corresponding to the target synthesized image to be the target value of the image shooting parameter.


 
16. The apparatus according to any one of claims 10 to 15, wherein the value of the image shooting parameter comprises the target predefined synthesis number, and the execution module is arranged to perform operations of:
obtaining the set of target preview images based on the target predefined synthesis number, and performing the image-synthesis process to the set of target preview images to obtain the target synthesized image.
 
17. The apparatus according to claim 16, wherein the value of the image shooting parameter comprises a target value of the performance parameter of the terminal device, and the execution module is arranged to perform operations of:
performing the image-synthesis process to the set of target preview images, based on the target predefined value of the performance parameter of the terminal device, to obtain the target synthesized image.
 
18. The apparatus according to any one of claims 10 to 15, further comprising:

a detection module, arranged to detect whether a present time stamp is within a predefined time range; and

the first capturing module, further arranged to shoot, via the image shooting component, the preview images and to acquire the values of the exposure parameter corresponding to the preview images, in response to the present time stamp being within the predefined time range.


 
19. A terminal device, comprising a processor and a non-transitory memory, wherein the non-transitory memory is arranged to store at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set are loaded and executed by the processor to perform the method according to any one of claims 1 to 9.
 
20. A non-transitory computer-readable storage medium, having at least one instruction, at least one program, a code set, or an instruction set stored therein, wherein the at least one instruction, the at least one program, the code set, or the instruction set are capable of being loaded and executed by a processor to perform the method according to any one of claims 1 to 9.
 




Drawing
Search report
Cited references

REFERENCES CITED IN THE DESCRIPTION

This list of references cited by the applicant is for the reader's convenience only. It does not form part of the European patent document. Even though great care has been taken in compiling the references, errors or omissions cannot be excluded and the EPO disclaims all liability in this regard.

Patent documents cited in the description