CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims priority to Chinese Patent Application No.
201710557843.5, filed by Guangdong OPPO Mobile Telecommunications Corp. Ltd. on July 10, 2017 and
entitled "White Balance Processing Method and Apparatus", the contents of which are
hereby incorporated by reference in their entirety.
TECHNICAL FIELD
[0002] The disclosure relates to the technical field of terminals, and particularly to a
white balance processing method and device.
BACKGROUND
[0003] With the progress of science and technology and the development of image processing
technologies, photographing technologies for mobile terminals (for example, smart
phones and personal digital assistants) have also advanced rapidly, including image
processing software for Automatic White Balance (AWB) as well as automatic
white balance for a face, i.e., FACE AWB.
[0004] However, during practical use, when FACE AWB is applied to a rear camera for portrait
shooting, the color accuracy of the shot photo remains low even after FACE AWB adjustment,
which results in white balance regulation errors and a poor user experience.
SUMMARY
[0005] The disclosure is intended to solve, at least to a certain extent, one of the technical
problems in the related art.
[0006] To this end, the disclosure discloses a white balance processing method, which solves
the problems of inaccurate image color reproduction and poor user experience that
arise when, at a relatively long shooting distance, the area proportion of a face
region in an image is relatively low and white balance regulation is performed according
to a white balance gain value determined based on the area occupied by the face region.
[0007] The disclosure also discloses a white balance processing device.
[0008] The disclosure also discloses a computer device.
[0009] The disclosure also discloses a computer-readable storage medium.
[0010] The disclosure also discloses a computer program product.
[0011] A first aspect of embodiments of the disclosure discloses a white balance processing
method, which may include the following operations.
[0012] A portrait region in an image is identified.
[0013] A target white balance gain value is calculated according to an area occupied by
the portrait region in the image.
[0014] White balance processing is performed on the image according to the target white
balance gain value.
[0015] In the white balance processing method of the embodiments of the disclosure, the
portrait region in the image is identified, the target white balance gain value is
calculated according to the area occupied by the portrait region in the image, and
white balance processing is performed on the image according to the target white balance
gain value. This solves the technical problems of inaccurate image color reproduction
and poor user experience that arise when, at a relatively long shooting distance,
the area proportion of a face region in an image is relatively low and white balance
regulation is performed according to a white balance gain value determined based on
the area occupied by the face region.
[0016] A second aspect of the embodiments of the disclosure discloses a white balance processing
device, which may include a recognition module, a calculation module and a white balance
module.
[0017] The recognition module may be configured to identify a portrait region in an image.
[0018] The calculation module may be configured to calculate a target white balance gain
value according to an area occupied by the portrait region in the image.
[0019] The white balance module may be configured to perform white balance processing on
the image according to the target white balance gain value.
[0020] In the white balance processing device of the embodiments of the disclosure, the
recognition module is configured to identify the portrait region in the image, the
calculation module is configured to calculate the target white balance gain value
according to the area occupied by the portrait region in the image, and the white
balance module is configured to perform white balance processing on the image according
to the target white balance gain value. This solves the technical problems of inaccurate
image color reproduction and poor user experience that arise when, at a relatively
long shooting distance, the area proportion of a face region in an image is relatively
low and white balance regulation is performed according to a white balance gain value
determined based on the area occupied by the face region.
[0021] A third aspect of the embodiments of the disclosure discloses a computer device,
which may include a memory, a processor and a computer program stored in the memory
and capable of running on the processor. The processor executes the program to implement
the white balance processing method of the first aspect of the embodiments.
[0022] A fourth aspect of the embodiments of the disclosure discloses a computer-readable
storage medium, in which a computer program may be stored. The program is executed
by a processor to implement the white balance processing method of the first aspect
of the embodiments.
[0023] A fifth aspect of the embodiments of the disclosure discloses a computer program
product. When an instruction in the computer program product is executed by a processor,
the white balance processing method of the first aspect of the embodiments is executed.
[0024] Additional aspects and advantages of the disclosure will be partially presented in
the following descriptions, and will partially become apparent from the following
descriptions or be understood through practice of the disclosure.
BRIEF DESCRIPTION OF DRAWINGS
[0025] The abovementioned and/or additional aspects and advantages of the disclosure will
become apparent and easy to understand from the following descriptions of the embodiments
in combination with the drawings.
FIG. 1 is a schematic flowchart of a white balance processing method according to
an embodiment of the disclosure.
FIG. 2 is a schematic flowchart of another white balance processing method according
to an embodiment of the disclosure.
FIG. 3 is a structure diagram of a white balance processing device according to an
embodiment of the disclosure.
FIG. 4 is a structure diagram of another white balance processing device according
to an embodiment of the disclosure.
FIG. 5 is a block diagram of an exemplary computer device suitable for implementing
implementation modes of the disclosure.
DETAILED DESCRIPTION
[0026] The embodiments of the disclosure will be described below in detail. Examples of
the embodiments are illustrated in the drawings and the same or similar reference
signs always represent the same or similar components or components with the same
or similar functions. The embodiments described below with reference to the drawings
are exemplary, are intended to explain the disclosure, and should not be understood
as limiting the disclosure.
[0027] A white balance processing method and device of the embodiments of the disclosure
will be described below with reference to the drawings.
[0028] FIG. 1 is a schematic flowchart of a white balance processing method according to
an embodiment of the disclosure. The method of the embodiment may be executed by a
terminal with a data processing function, for example, a smart phone, a tablet or a
personal computer. As illustrated in FIG. 1, the method includes the following operations.
[0029] In 101, a portrait region in an image is identified.
[0030] The portrait region includes a face region and a body region. That is, an area of
the face region is smaller than an area of the portrait region.
[0031] Specifically, the image is obtained by shooting with a rear camera and/or shooting
with a focal length greater than a preset focal length threshold, and face recognition
is performed on the image to obtain the face region. Face recognition may be implemented
by use of a face recognition algorithm in the related art. There are no specific limits
made herein.
[0032] Each pixel in the image includes depth information, and the depth information of the
face region is determined according to the identified face region. Regions with depths
similar to that of the face region are determined as candidate regions, and the regions
adjacent to the face region among the candidate regions are identified as the portrait
region in the image.
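The identification in [0031]-[0032] can be sketched as below. This is a minimal illustration assuming an aligned per-pixel depth map and an already-detected face bounding box; the depth tolerance `tol` and the 4-connected region growing are illustrative choices, not details from the disclosure:

```python
import numpy as np

def portrait_mask(depth, face_box, tol=0.3):
    """Return a boolean mask of the portrait region in the image.

    depth    : (H, W) array of per-pixel depth values
    face_box : (top, bottom, left, right) of the detected face region
    tol      : depth-similarity tolerance (an illustrative value)
    """
    t, b, l, r = face_box
    face_depth = depth[t:b, l:r].mean()            # depth of the face region
    candidate = np.abs(depth - face_depth) < tol   # depth-similar pixels
    mask = np.zeros_like(candidate)
    # Seed with candidate pixels inside the face region, then grow outward,
    # so only candidate regions adjacent to the face are kept.
    stack = [(y, x) for y in range(t, b) for x in range(l, r) if candidate[y, x]]
    for y, x in stack:
        mask[y, x] = True
    while stack:
        y, x = stack.pop()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < depth.shape[0] and 0 <= nx < depth.shape[1]
                    and candidate[ny, nx] and not mask[ny, nx]):
                mask[ny, nx] = True
                stack.append((ny, nx))
    return mask
```

Depth-similar but disconnected regions (for example, another person in the background at a similar distance) are excluded because the growth starts from the face region.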
[0033] In 102, a target white balance gain value is calculated according to an area occupied
by the portrait region in the image.
[0034] Specifically, an area proportion of the portrait region in the image is calculated
according to the area occupied by the portrait region in the image, and a first gain
value and a second gain value of each color component are calculated according to the
area proportion to obtain the white balance gain value.
[0035] The first gain value is used to regulate a face in the image to a skin color.
[0036] Specifically, it is determined whether the skin color of the face in the image is
a normal face skin color. When the skin color of the face in the image is not a normal
face skin color, the first gain value capable of regulating the skin color of the
face to the normal skin color is generated.
[0037] As a possible implementation mode, color vectors of all the pixels of the face
region are acquired, with the color of each pixel represented by a color vector (R,
G, B), and these color vectors may be averaged to calculate a color vector
corresponding to the skin color of the face. It is determined whether the R, G and B values
corresponding to the skin color of the face are within the range of R, G and B values
corresponding to the normal face skin color. When R, G and B values corresponding
to the skin color of the face are not within the range of R, G and B values corresponding
to the normal face skin color, the R, G and B values corresponding to the skin color
of the face are adjusted through a gain value to be within the range of R, G and B
values corresponding to the normal face skin color, and the gain value is the first
gain value.
[0038] The range of R, G and B values corresponding to the normal face skin color may be
determined according to R, G and B values provided in a color matrix CC. The R, G
and B values in the color matrix CC may be obtained according to a CIE color space
provided by the Commission Internationale de L'Eclairage.
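The first-gain computation of [0037]-[0038] might look like the following sketch. The per-channel ranges in `SKIN_RANGE` are placeholders standing in for values derived from the color matrix CC and CIE data; they are not values from the disclosure:

```python
import numpy as np

# Placeholder per-channel ranges for a "normal" face skin color; actual
# values would come from the color matrix CC / CIE color space data.
SKIN_RANGE = {"R": (140.0, 200.0), "G": (100.0, 160.0), "B": (80.0, 140.0)}

def first_gain(face_pixels):
    """face_pixels: (N, 3) array of (R, G, B) values in the face region.

    Averages the face pixels into one color vector, then returns the
    per-channel gain that pulls each channel into the skin-color range
    (gain 1.0 if the channel is already within the range).
    """
    mean = face_pixels.reshape(-1, 3).mean(axis=0)   # average color vector
    gains = []
    for value, (lo, hi) in zip(mean, SKIN_RANGE.values()):
        if value < lo:
            gains.append(lo / value)     # boost the channel up into the range
        elif value > hi:
            gains.append(hi / value)     # attenuate the channel into the range
        else:
            gains.append(1.0)            # already a normal skin color
    return np.array(gains)
```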
[0039] The second gain value is different from the first gain value. The second gain value
refers to a gain value determined according to the portrait region to adjust white
balance and is calculated according to each color component in the portrait region.
[0040] As a possible implementation mode, when the image contains sufficient color variation,
the average values of the three components R, G and B over the color vectors of all
the pixels tend to be balanced (1:1:1), and a relatively accurate white balance
gain value, i.e., the second gain value, may be obtained by a grayscale weighting
algorithm.
[0041] Specifically, the portrait region is divided into a plurality of sub-blocks, color
vectors of all pixels in each sub-block are acquired, and each pixel is represented
by a color vector (R, G, B). Then the average value and standard deviation of the three
channels R, G and B in each sub-block are calculated, and each sub-block is weighted
by its standard deviation (sub-blocks with low weights are discarded and those with
high weights are reserved) to reduce the influence of a large-area single color and
preserve the color variety of the image. The standard-deviation-weighted average value
of the three channels R, G and B is further calculated, and a gain coefficient for
each of the three channels R, G and B is calculated to obtain the second gain value.
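The weighted gray-world computation of [0040]-[0041] can be sketched as follows. The choice of each block's overall standard deviation as its weight, and the convention of fixing the G gain at 1 while returning G/R and G/B gains, are assumptions made for illustration:

```python
import numpy as np

def second_gain(region, blocks=4):
    """region: (H, W, 3) RGB array for the portrait region.

    Splits the region into sub-blocks, weights each block's channel means
    by the block's color variation (standard deviation of its pixel values),
    and derives R/B gains relative to a fixed G channel, in the manner of a
    weighted gray-world method.
    """
    h, w, _ = region.shape
    means, weights = [], []
    for ys in np.array_split(np.arange(h), blocks):
        for xs in np.array_split(np.arange(w), blocks):
            block = region[ys[0]:ys[-1] + 1, xs[0]:xs[-1] + 1]
            means.append(block.reshape(-1, 3).mean(axis=0))
            # Weight: overall std of the block; near-uniform (single-color)
            # blocks get weight ~0 and are effectively discarded.
            weights.append(block.std())
    means = np.array(means)
    weights = np.array(weights)
    if weights.sum() == 0:                 # degenerate: region fully uniform
        weights = np.ones_like(weights)
    avg = (means * weights[:, None]).sum(axis=0) / weights.sum()
    r_avg, g_avg, b_avg = avg
    return np.array([g_avg / r_avg, 1.0, g_avg / b_avg])  # G gain fixed at 1
```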
[0042] In 103, white balance processing is performed on the image according to the target
white balance gain value.
[0043] Specifically, the regulated Red (R) and Blue (B) values of each pixel are calculated
according to the calculated target white balance gain value, thereby achieving color
correction.
[0044] It is to be noted that, since the human eye is most sensitive to light at Green
(G) wavelengths (480 nm to 600 nm) in the spectrum, and the number of green
pixels acquired in a Bayer array is the greatest, a present-day camera usually fixes
the gain value of the G component and then regulates the gain values of the R component
and the B component to regulate the R component and the B component respectively.
[0045] Furthermore, before the operation in 102, the method further includes the following
operation. It is determined that the area occupied by the portrait region in the image
is less than a preset area threshold.
[0046] This is because, when the portrait region is relatively small, the area of the face
region is smaller, and in such case, if weights of the first gain value and the second
gain value are regulated based on the area occupied by the face region in the image,
a face skin color regulation does not have a significant effect. Responsive to determining
that the area occupied by the portrait region in the image is less than the preset
area threshold, it is necessary to adopt instead a calculation manner of calculating
the target white balance gain value based on the area occupied by the portrait region
in the image.
[0047] It is to be understood that, when the area occupied by the portrait region in the
image is less than the preset area threshold, it is indicated that the image is acquired
in a distant shooting manner and is applied to an application scenario of the disclosure.
[0048] Specifically, there are multiple possible implementation modes for calculating the
area occupied by a target region in the image. As a possible implementation mode,
the image is divided into multiple sub-blocks of the same area. For example, a target
picture is divided into m∗n sub-blocks: the length of each sub-block is 1/m of the
length of the target picture, and the width of each sub-block is 1/n of the width of
the target picture. Therefore, the area of each sub-block is 1/(m∗n) of the area of
the target picture, where m and n are positive integers, and preferably, m is 9 and
n is 7.
[0049] Furthermore, the acquired m∗n sub-blocks are searched for sub-blocks in the coordinate
interval of the face region and sub-blocks including an edge of that coordinate interval,
to obtain all the sub-blocks in the face region. The area of each sub-block is known,
so that the area of the face region may be calculated.
[0050] All sub-blocks in the portrait region may be found by the same method. An area of
each sub-block is known, so that the area occupied by the portrait region in the image
may be calculated.
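The sub-block counting of [0048]-[0050] can be sketched as below, using the 9×7 grid suggested in the text; representing the target region by a rectangular `region_box` coordinate interval is a simplifying assumption for either the face or the portrait region:

```python
def region_area_fraction(img_w, img_h, region_box, m=9, n=7):
    """Approximate a region's area fraction by counting grid sub-blocks.

    region_box : (left, top, right, bottom) pixel coordinates of the region
    m, n       : grid divisions (9 x 7 as suggested in the text)

    A sub-block is counted if it lies inside the region's coordinate
    interval or intersects its edge; each block covers 1/(m*n) of the image.
    """
    l, t, r, b = region_box
    bw, bh = img_w / m, img_h / n          # sub-block width and height
    count = 0
    for i in range(m):
        for j in range(n):
            x0, y0 = i * bw, j * bh
            x1, y1 = x0 + bw, y0 + bh
            # Does the block overlap the region interval (edge included)?
            if x0 < r and x1 > l and y0 < b and y1 > t:
                count += 1
    return count / (m * n)
```

The same routine serves for both the face region and the portrait region, so the area proportion used in the threshold comparison and the weight computation below can be obtained directly from the returned fraction.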
[0051] It is determined whether the area occupied by the portrait region in the image is
less than the preset area threshold. Responsive to determining that the area occupied
by the portrait region in the image is less than the preset area threshold, it is
necessary to adopt the calculation manner of calculating the target white balance
gain value based on the area occupied by the portrait region in the image.
[0052] In a practical application scenario, there is another possible circumstance. The
area of the portrait region in the acquired image is not small, and when it is determined
that the area occupied by the portrait region in the image is not less than the preset
area threshold, namely the area of the portrait region is relatively large, the area
of the face region is correspondingly large. Under the circumstance, the effect of
the face skin color regulation is more significant if the weights of the first gain
value and the second gain value are adjusted based on the area occupied by the face
region in the image. Therefore, responsive to determining that the area occupied by
the portrait region in the image is not less than the preset area threshold, a calculation
manner of calculating the target white balance gain value based on the area occupied
by the face region in the image may also be adopted instead.
In the white balance processing method of the embodiments of the disclosure, the
portrait region in the image is identified, the target white balance gain value is
calculated according to the area occupied by the portrait region in the image, and
white balance processing is performed on the image according to the target white balance
gain value. This solves the technical problems of inaccurate image color reproduction
and poor user experience that arise when, at a relatively long shooting distance,
the area proportion of a face region in an image is relatively low and white balance
regulation is performed according to a white balance gain value determined based on
the area occupied by the face region.
[0053] For describing the previous embodiment clearly, the embodiments of the disclosure
provide another possible white balance processing method. FIG. 2 is a flowchart of
another white balance processing method according to an embodiment of the disclosure.
The weights of the first gain value and the second gain value are determined according
to the area proportion of the portrait region in the image, and a final white balance
gain value is obtained by performing weighted calculation based on the weights. As
illustrated in FIG. 2, the method includes the following operations.
[0054] In 201, an image is obtained by shooting with a rear camera, and/or, the image is
obtained by shooting with a focal length greater than a preset focal length threshold.
[0055] Specifically, when the rear camera is adopted to shoot a portrait, a distance between
the portrait and the camera is relatively long and an area proportion of a face in
the image is relatively low, and/or, when the focal length greater than the preset
focal length threshold is adopted to shoot the portrait, namely distant shooting is
performed, the area occupied by the face in the obtained image is also relatively
small.
[0056] It is to be noted that the rear camera may be a Red-Green-Blue-Depth (RGBD)
camera or a structured light camera, or may be a dual camera or a Time of Flight
(TOF) camera; the possibilities will not be enumerated herein. Through these cameras,
depth information of the shot image may be obtained.
[0057] In 202, face recognition is performed on the image to obtain a face region.
[0058] Specifically, the face in the image is identified through a face recognition technology
to obtain a coordinate interval of the face region. There are multiple implementation
manners for a face recognition algorithm in the related art. For example, an Adaboost
model algorithm may be adopted for face recognition, and another algorithm capable
of rapidly identifying the face region may also be adopted to identify the face region.
The corresponding implementation manner for face recognition is not limited in the
embodiments of the disclosure.
[0059] In 203, candidate regions with depths similar to that of the face region are determined
according to depth information of the image, and the regions adjacent to the face
region in the candidate regions are identified as a portrait region in the image.
[0060] The depth information indicates a distance between each pixel in the image and the
camera.
[0061] Specifically, depth information corresponding to each pixel in the image is obtained.
The depth information of the pixels corresponding to the face region may be determined
according to the determined face region, and the pixels with depth information similar
to that of the pixels corresponding to the face region are determined as candidate
pixels. Regions formed by the candidate pixels are the candidate regions, and the
regions adjacent to the face region in the candidate regions are identified as the
portrait region in the image.
[0062] In 204, an area proportion of the portrait region in the image is calculated according
to an area occupied by the portrait region in the image.
[0063] Specifically, a quotient obtained by dividing the area of the portrait region by
a total area of the image is the area proportion of the portrait region in the image.
[0064] In 205, a weight of a first gain value and a weight of a second gain value are determined
according to the area proportion.
[0065] Specifically, for convenient description, the weight of the first gain value is set
to be K, and accordingly the weight of the second gain value is determined to be
1-K. The value of K is determined according to the area proportion; in general, the
area proportion is positively correlated with the value of K.
[0066] In 206, weighted calculation is performed on the first gain value and the second
gain value according to the determined weight of the first gain value and the weight
of the second gain value to obtain a white balance gain value.
[0067] Specifically, the first gain value and the second gain value are multiplied by their
respective weights to calculate the white balance gain value, namely: the white balance
gain value = the first gain value ∗ K + the second gain value ∗ (1-K).
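Operations 205 and 206 can be combined into a short sketch. The disclosure only states that K is positively correlated with the area proportion, so the clamped linear mapping used for K here is an illustrative assumption:

```python
def target_gain(first_gain, second_gain, area_proportion, k_max=1.0):
    """Blend the two gains channel by channel.

    K weights the first (skin-color) gain and grows with the portrait's
    area proportion; 1 - K weights the second (gray-world) gain. The
    linear clamp used for K is only an illustrative positive correlation.
    """
    k = min(max(area_proportion, 0.0), k_max)   # K rises with the proportion
    return [f * k + s * (1.0 - k) for f, s in zip(first_gain, second_gain)]
```

A small portrait (low area proportion) thus leans on the second gain, while a large portrait leans on the skin-color gain.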
[0068] In 207, white balance processing is performed on the image according to the target
white balance gain value.
[0069] Specifically, according to the calculated white balance gain value, the R value and
B value of each color component in the image are multiplied by the respective gains
in the white balance gain value to obtain the R value and B value of the color component
subjected to white balance processing, so as to implement color adjustment of the image.
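Applying the gains in operation 207, with the G gain fixed as noted in [0044], might look like the following sketch (8-bit RGB is an assumption):

```python
import numpy as np

def apply_white_balance(image, gains):
    """image: (H, W, 3) RGB array; gains: (gain_R, gain_G, gain_B).

    Multiplies each channel by its gain (G is typically fixed at 1.0)
    and clips the result back to the valid 8-bit range.
    """
    out = image.astype(np.float64) * np.asarray(gains)
    return np.clip(out, 0, 255).astype(np.uint8)
```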
[0070] In the white balance processing method of the embodiments of the disclosure, the
portrait region in the image is identified, the target white balance gain value is
calculated according to the area occupied by the portrait region in the image, and
white balance processing is performed on the image according to the target white balance
gain value. This solves the technical problems of inaccurate image color reproduction
and poor user experience that arise when, at a relatively long shooting distance,
the area proportion of a face region in an image is relatively low and white balance
regulation is performed according to a white balance gain value determined based on
the area occupied by the face region.
[0071] For implementing the abovementioned embodiments, the disclosure also discloses a
white balance processing device.
[0072] FIG. 3 is a structure diagram of a white balance processing device according to an
embodiment of the disclosure. As illustrated in FIG. 3, the device includes a recognition
module 31, a calculation module 32 and a white balance module 33.
[0073] The recognition module 31 is configured to identify a portrait region in an image.
[0074] The calculation module 32 is configured to calculate a target white balance gain
value according to an area occupied by the portrait region in the image.
[0075] The white balance module 33 is configured to perform white balance processing on
the image according to the target white balance gain value.
[0076] As a possible implementation mode, the recognition module 31 is specifically configured
to perform face recognition on the image to obtain a face region, determine candidate
regions with depths similar to that of the face region according to depth information
of the image and identify regions adjacent to the face region in the candidate regions
as the portrait region in the image.
[0077] It is to be noted that the explanations and descriptions about the method embodiments
also apply to the device of this embodiment and will not be elaborated herein.
[0078] In the white balance processing device of the embodiment, the recognition module
is configured to identify the portrait region in the image, the calculation module
is configured to calculate the target white balance gain value according to the area
occupied by the portrait region in the image, and the white balance module is configured
to perform white balance processing on the image according to the target white balance
gain value. When a rear camera is adopted to shoot a portrait, the area of the portrait
region in the image is larger than that of the face region, and thus the target white
balance gain value of the image is calculated according to the area occupied by the
portrait region. This solves the technical problems of inaccurate image color reproduction
and poor user experience that arise when the area proportion of a face region in an
image is relatively low and white balance regulation is performed according to a white
balance gain value determined based on the area occupied by the face region.
[0079] Based on the abovementioned embodiments, an embodiment of the disclosure also provides
a possible implementation mode of a white balance processing device.
[0080] FIG. 4 is a structure diagram of another white balance processing device according
to an embodiment of the disclosure. Based on the previous embodiments, the device
further includes a shooting module 34 and a determination module 35.
[0081] The shooting module 34 is configured to obtain the image by shooting with a rear
camera, and/or, obtain the image by shooting with a focal length greater than a preset
focal length threshold.
[0082] The determination module 35 is configured to determine that the area occupied by
the portrait region in the image is less than a preset area threshold.
[0083] As a possible implementation mode, the calculation module 32 may further include
a first calculation unit 321 and a second calculation unit 322.
[0084] The first calculation unit 321 is configured to calculate an area proportion of the
portrait region in the image according to the area occupied by the portrait region
in the image.
[0085] The second calculation unit 322 is configured to calculate a first gain value and
a second gain value of each color component according to the area proportion to obtain
the white balance gain value. The first gain value is used to regulate a face in the
image to a skin color and the second gain value is different from the first gain value.
[0086] As a possible implementation mode, the second calculation unit 322 may further include
a determination subunit 3221 and a second calculation subunit 3222.
[0087] The determination subunit 3221 is configured to determine a weight of the first gain
value and a weight of the second gain value according to the area proportion.
[0088] The second calculation subunit 3222 is configured to perform weighted calculation
on the first gain value and the second gain value according to the determined weight
of the first gain value and the weight of the second gain value to obtain the white
balance gain value.
[0089] As a possible implementation mode, the device further includes a determination and
calculation module.
[0090] The determination and calculation module is configured to, when the area occupied
by the portrait region in the image is not less than the preset area threshold, calculate
the target white balance gain value according to an area occupied by a face region
in the image.
[0091] As a possible implementation mode, the device further includes an acquisition module.
[0092] The acquisition module is configured to perform synchronized imaging between a
structured light camera or a depth camera and the camera used to obtain the image,
so as to obtain the depth information of the image.
[0093] It is to be noted that the explanations and descriptions about the method embodiments
also apply to the device of these embodiments and will not be elaborated herein.
[0094] In the white balance processing device of the embodiments, the recognition module
is configured to identify the portrait region in the image, the calculation module
is configured to calculate the target white balance gain value according to the area
occupied by the portrait region in the image, and the white balance module is configured
to perform white balance processing on the image according to the target white balance
gain value. When the rear camera is adopted to shoot the portrait, the area of the
portrait region in the image is larger than that of the face region, and the target
white balance gain value of the image is calculated according to the area occupied
by the portrait region. This solves the technical problems of inaccurate image color
reproduction and poor user experience that arise when the area proportion of a face
region in an image is relatively low and white balance regulation is performed according
to a white balance gain value determined based on the area occupied by the face region.
For implementing the abovementioned embodiments, the disclosure also discloses another
device, which includes a processor and a memory configured to store an instruction
executable by the processor.
[0095] For implementing the abovementioned embodiments, the disclosure also discloses a
computer device, which includes a memory, a processor and a computer program stored
in the memory and capable of running on the processor. The processor executes the
program to implement the white balance processing method of the method embodiments.
[0096] FIG. 5 is a block diagram of an exemplary computer device suitable for implementing
implementation modes of the disclosure. The computer device 12 illustrated in FIG.
5 is only an example and should not form any limit to functions and scope of application
of the embodiments of the disclosure.
[0097] As illustrated in FIG. 5, the computer device 12 is embodied in the form of a general-purpose
computer device. Components of the computer device 12 may include, but are not limited
to: one or more processors or processing units 16, a system memory 28 and a bus 18
connecting different system components (including the system memory 28 and the processing
unit 16).
[0098] The bus 18 represents one or more of several types of bus structures, including a
memory bus or memory controller, a peripheral bus, an accelerated graphics port, and
a processor or local bus adopting any of multiple bus structures. For example, these
bus structures include, but are not limited to, an Industry Standard Architecture (ISA)
bus, a Micro Channel Architecture (MCA) bus, an enhanced ISA bus, a Video Electronics
Standards Association (VESA) local bus and a Peripheral Component Interconnect (PCI)
bus.
[0099] The computer device 12 typically includes multiple computer system-readable media.
These media may be any available medium that the computer device 12 may access, including
volatile and nonvolatile media and movable and immovable media.
[0100] The memory 28 may include a computer system-readable medium in the form of a volatile
memory, for example, a Random Access Memory (RAM) 30 and/or a high-speed cache memory
32. The computer device 12 may further include another movable/immovable and volatile/nonvolatile
computer system storage medium. Only as an example, a storage system 34 may be configured
to read and write an immovable and nonvolatile magnetic medium (not illustrated in
FIG. 5 and usually called a "hard disk drive"). Although not illustrated in FIG. 5,
a magnetic disk drive configured to read and write a movable nonvolatile magnetic
disk (for example, a "floppy disk") and an optical disk drive configured to read and
write a movable nonvolatile optical disk (for example, a Compact Disc Read Only Memory
(CD-ROM), a Digital Video Disc Read Only Memory (DVD-ROM) or another optical medium)
may be provided. Under such circumstances, each drive may be connected with the bus
18 through one or more data medium interfaces. The memory 28 may include at least
one program product. The program product includes a group of (for example, at least
one) program modules, and these program modules are configured to execute the functions
of each embodiment of the disclosure.
[0101] A program/utility tool 40 with a group of (at least one) program modules 42 may be
stored in, for example, the memory 28. Such program modules 42 include, but are not
limited to, an operating system, one or more application programs, other program
modules and program data, and each of these examples, or a certain combination thereof,
may include an implementation of a network environment. The program modules 42 usually
execute the functions and/or methods in the embodiments described in the disclosure.
[0102] The computer device 12 may also communicate with one or more external devices 14
(for example, a keyboard, a pointing device and a display 24), may further communicate
with one or more devices through which a user may interact with the computer device
12, and/or may communicate with any device (for example, a network card and a modem)
through which the computer device 12 may communicate with one or more other computer
devices. Such communication may be implemented through an Input/Output (I/O) interface 22.
Moreover, the computer device 12 may further communicate with one or more networks
(for example, a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public
network such as the Internet) through a network adapter 20. As illustrated in
FIG. 5, the network adapter 20 communicates with the other modules of the computer
device 12 through the bus 18. It is to be understood that, although not illustrated
in the figure, other hardware and/or software modules may be used in combination with
the computer device 12, including, but not limited to, microcode, a device driver,
a redundant processing unit, an external disk drive array, a Redundant Array of Independent
Disks (RAID) system, a magnetic tape drive, a data backup storage system and the like.
[0103] The processing unit 16 runs the program stored in the system memory 28 to execute
various function applications and data processing, for example, implementing the white
balance processing method mentioned in the abovementioned embodiments.
[0104] For implementing the abovementioned embodiments, the disclosure also discloses a
computer-readable storage medium, in which a computer program is stored. The program
is executed by a processor to implement the white balance processing method in the
abovementioned method embodiments.
[0105] For implementing the abovementioned embodiments, the disclosure also discloses a
computer program product. An instruction in the computer program product is executed
by a processor to implement the white balance processing method in the abovementioned
method embodiments.
[0106] In the descriptions of the specification, the descriptions made with reference to
terms such as "an embodiment", "some embodiments", "example", "specific example"
or "some examples" mean that specific features, structures, materials or characteristics
described in combination with the embodiment or example are included in at least
one embodiment or example of the disclosure. In the specification, these terms do
not necessarily refer to the same embodiment or example. The specific features,
structures, materials or characteristics described may be combined in a
proper manner in any one or more embodiments or examples. In addition, those skilled
in the art may integrate and combine different embodiments or examples described in
the specification, and features of different embodiments or examples, without conflicts.
[0107] In addition, the terms "first" and "second" are only adopted for description and should
not be understood to indicate or imply relative importance or to implicitly indicate
the number of indicated technical features. Therefore, a feature defined by "first"
or "second" may explicitly or implicitly include at least one such feature. In the
descriptions of the disclosure, "multiple" means at least two, for example, two or
three, unless otherwise definitely and specifically limited.
[0108] Any process or method described in the flowcharts or herein in another manner may
be understood to represent a module, segment or part of code including one or more
executable instructions configured to realize specific logic functions or operations
of the process. Moreover, the scope of the preferred implementation modes of the
disclosure includes other implementations in which the functions may be executed out
of the sequence illustrated or discussed herein, including in a substantially
simultaneous manner or in a reverse sequence according to the functions involved.
This should be understood by those skilled in the art to which the embodiments of
the disclosure pertain.
[0109] The logics and/or operations represented in the flowcharts or described herein in another
manner, for example, may be considered as an ordered list of executable instructions
configured to realize the logic functions, and may be specifically implemented in any
computer-readable medium for use by an instruction execution system, device or equipment
(for example, a computer-based system, a system including a processor, or another system
capable of reading instructions from the instruction execution system, device or equipment
and executing the instructions), or for use in combination with the instruction
execution system, device or equipment. For the specification, a "computer-readable medium"
may be any device capable of including, storing, communicating, propagating or
transmitting a program for use by, or in combination with, the instruction execution
system, device or equipment.
A more specific example (non-exhaustive list) of the computer-readable medium includes:
an electric connection portion (electronic device) with one or more wires, a portable
computer disk (magnetic device), a RAM, a Read-Only Memory (ROM), an Erasable Programmable
ROM (EPROM or flash memory), an optical fiber device and a portable CD-ROM. In addition,
the computer-readable medium may even be paper or another medium on which the program
may be printed because, for example, the paper or the other medium may be optically
scanned and then edited, interpreted or, when necessary, processed in another proper manner
to obtain the program in an electronic form for storage in a computer memory.
[0110] It is to be understood that each part of the disclosure may be implemented by hardware,
software, firmware or a combination thereof. In the abovementioned implementation
modes, multiple operations or methods may be implemented by software or firmware stored
in a memory and executed by a proper instruction execution system. For example, in
the case of implementation with hardware, as in another implementation mode, any one
or a combination of the following technologies well-known in the art may be adopted
for implementation: a discrete logic circuit with a logic gate circuit configured
to realize a logic function for a data signal, an application-specific integrated
circuit with a proper combined logic gate circuit, a Programmable Gate Array (PGA),
a Field-Programmable Gate Array (FPGA) and the like.
[0111] Those of ordinary skill in the art should understand that all or part of the operations
in the methods of the abovementioned embodiments may be completed by related hardware
instructed by a program. The program may be stored in a computer-readable storage
medium, and when the program is executed, one or a combination of the operations of
the method embodiments is included.
[0112] In addition, the functional units in the embodiments of the disclosure may be integrated
into a processing module, each unit may also exist independently in a physical form, and
two or more units may also be integrated into one module. The integrated module
may be implemented in a hardware form and may also be implemented in the form of a software
functional module. When implemented in the form of a software functional module and
sold or used as an independent product, the integrated module may be stored in a computer-readable
storage medium.
[0113] The storage medium may be a ROM, a magnetic disk, an optical disk or the like. The
embodiments of the disclosure have been illustrated and described above. It can be
understood that the abovementioned embodiments are exemplary and should not be understood
as limitations to the disclosure, and those of ordinary skill in the art may make variations,
modifications, replacements and transformations to the abovementioned embodiments within
the scope of the disclosure.
1. A white balance processing method, comprising:
identifying a portrait region in an image;
calculating a target white balance gain value according to an area occupied by the
portrait region in the image; and
performing white balance processing on the image according to the target white balance
gain value.
2. The white balance processing method of claim 1, wherein calculating the target white
balance gain value according to the area occupied by the portrait region in the image comprises:
calculating an area proportion of the portrait region in the image according to the
area occupied by the portrait region in the image; and
calculating a first gain value and a second gain value of each color component according
to the area proportion to obtain the target white balance gain value, the first gain value
being used to regulate a face in the image to a skin color and the second gain value
being different from the first gain value.
3. The white balance processing method of claim 2, wherein calculating the first gain
value and the second gain value of each color component according to the area proportion
to obtain the target white balance gain value comprises:
determining a weight of the first gain value and a weight of the second gain value
according to the area proportion; and
performing weighted calculation on the first gain value and the second gain value
according to the determined weight of the first gain value and the determined weight of
the second gain value to obtain the target white balance gain value.
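The weighted calculation recited in claims 2 and 3 can be sketched in Python as follows. The function name, the linear weighting rule and the `full_weight_ratio` parameter are illustrative assumptions, not the claimed implementation; the claims only require that the weights be determined from the area proportion and the two gains be combined by weighted calculation.

```python
def blend_gains(first_gain, second_gain, area_ratio, full_weight_ratio=0.5):
    """Blend two per-channel white balance gains by portrait area proportion.

    first_gain / second_gain: (R, G, B) gain tuples; the first gain would pull
    the face toward a target skin color, the second is e.g. a global AWB gain.
    area_ratio: proportion of the image occupied by the portrait region (0..1).
    full_weight_ratio: assumed proportion above which the first gain fully dominates.
    """
    # Determine the weights from the area proportion: the larger the portrait
    # region, the more the skin-color-oriented gain contributes.
    w1 = min(area_ratio / full_weight_ratio, 1.0)
    w2 = 1.0 - w1
    # Weighted combination per color component yields the target gain value.
    return tuple(w1 * g1 + w2 * g2 for g1, g2 in zip(first_gain, second_gain))
```

For example, with an area proportion of 0.25 and `full_weight_ratio` of 0.5, each gain contributes half; above 0.5 the skin-color gain is used as-is.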
4. The white balance processing method of any one of claims 1-3, wherein before identifying
the portrait region in the image, the method further comprises at least one of the
following:
obtaining the image by shooting with a rear camera; or
obtaining the image by shooting with a focal length greater than a preset focal
length threshold.
5. The white balance processing method of any one of claims 1-3, wherein before calculating
the target white balance gain value according to the area occupied by the portrait
region in the image, the method further comprises:
determining that the area occupied by the portrait region in the image is less than
a preset area threshold.
6. The white balance processing method of claim 5, further comprising:
when the area occupied by the portrait region in the image is not less than the preset
area threshold, calculating the target white balance gain value according to an area
occupied by a face region in the image.
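The threshold condition of claims 5 and 6 amounts to a simple dispatch on the portrait area. A minimal sketch, with illustrative function and parameter names (the claims do not prescribe a particular threshold value or representation):

```python
def select_gain_basis(portrait_area, face_area, preset_area_threshold):
    """Decide which region's area the target white balance gain is computed from.

    Per the claimed condition: when the portrait area is less than the preset
    threshold (e.g. a distant subject whose face alone is too small to be
    reliable), the gain is computed from the portrait region's area; otherwise
    it is computed from the face region's area.
    """
    if portrait_area < preset_area_threshold:
        return ("portrait", portrait_area)
    return ("face", face_area)
```

In a full pipeline the returned area would then feed the area-proportion and weighted-gain steps of claims 2 and 3.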
7. The white balance processing method of any one of claims 1-6, wherein identifying
the portrait region in the image comprises:
performing face recognition on the image to obtain a face region;
determining candidate regions with depths similar to that of the face region according
to depth information of the image; and
identifying a region adjacent to the face region in the candidate regions as the portrait
region in the image.
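The three steps of claim 7 (face recognition, depth-similar candidate regions, adjacency to the face) can be sketched as a depth-gated flood fill growing outward from the detected face. This is a minimal stand-in, assuming the face box is already known from a face detector and using a relative depth `tolerance` as the similarity criterion; the claim does not fix either choice.

```python
from collections import deque

import numpy as np


def portrait_mask(depth_map, face_box, tolerance=0.15):
    """Identify the portrait region by growing the face over similar depths.

    depth_map: 2-D numpy array of per-pixel depth values.
    face_box: (top, bottom, left, right) indices of the detected face region.
    tolerance: assumed fraction of the face depth still treated as "similar".
    Returns a boolean mask of the identified portrait region.
    """
    t, b, l, r = face_box
    face_depth = float(np.median(depth_map[t:b, l:r]))
    # Candidate regions: pixels whose depth is similar to the face's depth.
    similar = np.abs(depth_map - face_depth) <= tolerance * face_depth
    mask = np.zeros(depth_map.shape, dtype=bool)
    # Seed with the face pixels, then keep only candidate pixels 4-connected
    # to the face (i.e. regions adjacent to it, such as the body below).
    queue = deque((y, x) for y in range(t, b) for x in range(l, r)
                  if similar[y, x])
    for y, x in queue:
        mask[y, x] = True
    h, w = depth_map.shape
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and similar[ny, nx] and not mask[ny, nx]:
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask
```

A background object at a similar depth but not adjacent to the face is excluded, which is the point of the adjacency requirement in the claim.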
8. The white balance processing method of claim 7, wherein before identifying the portrait
region in the image, the method further comprises:
performing synchronized imaging between a structured light camera or a depth camera
and the camera that captures the image, to obtain the depth information of the image.
9. A white balance processing device, comprising:
a recognition module, configured to identify a portrait region in an image;
a calculation module, configured to calculate a target white balance gain value according
to an area occupied by the portrait region in the image; and
a white balance module, configured to perform white balance processing on the image
according to the target white balance gain value.
10. The white balance processing device of claim 9, wherein the calculation module comprises:
a first calculation unit, configured to calculate an area proportion of the portrait
region in the image according to the area occupied by the portrait region in the image;
and
a second calculation unit, configured to calculate a first gain value and a second
gain value of each color component according to the area proportion to obtain the
target white balance gain value, the first gain value being used to regulate a face
in the image to a skin color and the second gain value being different from the first
gain value.
11. The white balance processing device of claim 10, wherein the second calculation unit
comprises:
a determination subunit, configured to determine a weight of the first gain value
and a weight of the second gain value according to the area proportion; and
a second calculation subunit, configured to perform weighted calculation on the first
gain value and the second gain value according to the determined weight of the first
gain value and the determined weight of the second gain value to obtain the target
white balance gain value.
12. The white balance processing device of any one of claims 9-11, further comprising:
a shooting module, configured to obtain the image by shooting with a rear camera,
and/or, obtain the image by shooting with a focal length greater than a preset focal
length threshold.
13. The white balance processing device of any one of claims 9-11, further comprising:
a determination module, configured to determine that the area occupied by the portrait
region in the image is less than a preset area threshold.
14. The white balance processing device of claim 13, further comprising:
a determination and calculation module, configured to calculate, when the area occupied
by the portrait region in the image is not less than the preset area threshold, the
target white balance gain value according to an area occupied by a face region in
the image.
15. The white balance processing device of any one of claims 9-14, wherein the recognition
module is configured to:
perform face recognition on the image to obtain a face region;
determine candidate regions with depths similar to that of the face region according
to depth information of the image; and
identify a region adjacent to the face region in the candidate regions as the portrait
region in the image.
16. The white balance processing device of claim 15, further comprising:
an acquisition module, configured to perform synchronized imaging between a structured
light camera or a depth camera and the camera that captures the image, to obtain the
depth information of the image.
17. A computer device, comprising a memory, a processor and a computer program stored
in the memory and capable of running in the processor, wherein the processor executes
the program to implement the white balance processing method of any one of claims
1-8.
18. A computer-readable storage medium, having a computer program stored thereon, wherein
the program is executed by a processor to implement the white balance processing method
of any one of claims 1-8.
19. A computer program product, wherein an instruction in the computer program product
is executed by a processor to implement the white balance processing method of any one
of claims 1-8.