[Technical Field]
[0001] Embodiments of the disclosure relate to an electronic device including a display
and a method of operating the same.
[Background Art]
[0002] Along with the development of electronic, information, and communication technologies,
various functions are integrated into a single portable communication device or electronic
device. For example, an electronic device (e.g., a smartphone, a wearable electronic
device, or other portable electronic devices) may perform a variety of functions by
installing and executing various applications, in addition to communications functions.
[0003] The electronic device may also include a display to enable display of information
associated with execution of various functions. For example, the display may include
various display devices such as a liquid crystal display (LCD), an organic light emitting
diode (OLED) display, or an electrophoretic display.
[Disclosure of Invention]
[Solution to Problems]
[0004] An electronic device according to an embodiment includes memory, a processor, a display
controller, and a display.
[0005] The memory may store at least one instruction configured to cause, when executed
by the processor, the electronic device according to an embodiment to transmit a first
frame to the display controller based on a first tearing signal output at a first
output time from the display.
[0006] The memory may store at least one instruction configured to cause, when executed
by the processor, the electronic device according to an embodiment to obtain a second
frame for updating a portion of a plurality of layers related to the first frame.
[0007] The memory may store at least one instruction configured to cause, when executed
by the processor, the electronic device according to an embodiment to, based on identifying
that the portion of the plurality of layers corresponds to a region of interest (ROI)
of the first frame, identify that screen tearing occurs when the second frame is transmitted
to the display through the display controller.
[0008] In the electronic device according to an embodiment, based on identifying that
screen tearing occurs, the second frame is not written to memory included in the display
while the first frame is scanned on the display.
[0009] A method of operating an electronic device according to an embodiment includes transmitting
a first frame to a display controller included in the electronic device based on a
first tearing signal output at a first output time from a display of the electronic device.
[0010] The method of operating the electronic device according to an embodiment includes
obtaining a second frame for updating a portion of a plurality of layers related
to the first frame.
[0011] The method of operating the electronic device according to an embodiment includes,
based on identifying that the portion of the plurality of layers corresponds to a region
of interest (ROI) of the first frame, identifying that screen tearing occurs when
the second frame is transmitted to the display through the display controller.
[0012] In the method of operating the electronic device according to an embodiment,
based on identifying that screen tearing occurs, the second frame is not written to
memory included in the display while the first frame is scanned on the display.
[0013] A non-transitory computer readable medium storing at least one instruction, wherein
the at least one instruction, when executed by a processor of an electronic device,
may cause the electronic device, according to an embodiment, to transmit a first frame
to a display controller included in the electronic device based on a first tearing
signal output at a first output time from a display of the electronic device.
[0014] The non-transitory computer readable medium storing at least one instruction, wherein
the at least one instruction, when executed by the processor of the electronic device,
may cause the electronic device, according to an embodiment, to obtain a second frame
for updating a portion of a plurality of layers related to the first frame.
[0015] The non-transitory computer readable medium storing at least one instruction, wherein
the at least one instruction, when executed by the processor of the electronic device,
may cause the electronic device, according to an embodiment, to, based on identifying
that the portion of the plurality of layers corresponds to a region of interest (ROI) of
the first frame, identify that screen tearing occurs when the second frame is transmitted
to the display through the display controller.
[0016] In the non-transitory computer readable medium according to an embodiment, based
on identifying that screen tearing occurs, the second frame is not written to memory
included in the display while the first frame is scanned on the display.
[Brief Description of Drawings]
[0017]
FIG. 1 is a block diagram illustrating an electronic device in a network environment
according to various embodiments.
FIG. 2 is a block diagram illustrating a display module according to various embodiments.
FIG. 3A is a block diagram illustrating a configuration of an electronic device according
to an embodiment.
FIG. 3B is a block diagram illustrating a configuration of an electronic device according
to an embodiment.
FIG. 4A is a flowchart illustrating a method of operating an electronic device according
to an embodiment.
FIG. 4B is a flowchart illustrating a method of operating an electronic device according
to an embodiment.
FIG. 5 is a timing diagram illustrating a process of transmitting, writing, and scanning
a frame after adjustment of an output time of a tearing signal according to a comparative
embodiment.
FIG. 6A is a flowchart illustrating a method of operating an electronic device to
prevent screen tearing according to an embodiment.
FIG. 6B is a timing diagram illustrating a process of transmitting, writing, and scanning
a frame according to an embodiment.
FIG. 7A is a flowchart illustrating a method of operating an electronic device to
prevent screen tearing according to an embodiment.
FIG. 7B is a timing diagram illustrating a process of transmitting, writing, and scanning
a frame according to an embodiment.
FIG. 8A is a flowchart illustrating a method of operating an electronic device to
prevent screen tearing according to an embodiment.
FIG. 8B is a timing diagram illustrating a process of transmitting, writing, and scanning
a frame according to an embodiment.
[Mode for the Invention]
[0018] The various embodiments of the invention mentioned herein can be advantageously combined
with each other, unless otherwise specified in the individual case.
[0019] FIG. 1 is a block diagram illustrating an electronic device 101 in a network environment
100 according to various embodiments. Referring to FIG. 1, the electronic device 101
in the network environment 100 may communicate with an electronic device 102 via a
first network 198 (e.g., a short-range wireless communication network), or at least
one of an electronic device 104 or a server 108 via a second network 199 (e.g., a
long-range wireless communication network). According to an embodiment, the electronic
device 101 may communicate with the electronic device 104 via the server 108. According
to an embodiment, the electronic device 101 may include a processor 120, memory 130,
an input module 150, a sound output module 155, a display module 160, an audio module
170, a sensor module 176, an interface 177, a connecting terminal 178, a haptic module
179, a camera module 180, a power management module 188, a battery 189, a communication
module 190, a subscriber identification module (SIM) 196, or an antenna module 197.
In some embodiments, at least one of the components (e.g., the connecting terminal
178) may be omitted from the electronic device 101, or one or more other components
may be added in the electronic device 101. In some embodiments, some of the components
(e.g., the sensor module 176, the camera module 180, or the antenna module 197) may
be implemented as a single component (e.g., the display module 160).
[0020] The processor 120 may execute, for example, software (e.g., a program 140) to control
at least one other component (e.g., a hardware or software component) of the electronic
device 101 coupled with the processor 120, and may perform various data processing
or computation. According to an embodiment, as at least part of the data processing
or computation, the processor 120 may store a command or data received from another
component (e.g., the sensor module 176 or the communication module 190) in volatile
memory 132, process the command or the data stored in the volatile memory 132, and
store resulting data in non-volatile memory 134. According to an embodiment, the processor
120 may include a main processor 121 (e.g., a central processing unit (CPU) or an
application processor (AP)), or an auxiliary processor 123 (e.g., a graphics processing
unit (GPU), a neural processing unit (NPU), an image signal processor (ISP), a sensor
hub processor, or a communication processor (CP)) that is operable independently from,
or in conjunction with, the main processor 121. For example, when the electronic device
101 includes the main processor 121 and the auxiliary processor 123, the auxiliary
processor 123 may be adapted to consume less power than the main processor 121, or
to be specific to a specified function. The auxiliary processor 123 may be implemented
as separate from, or as part of the main processor 121.
[0021] The auxiliary processor 123 may control at least some of functions or states related
to at least one component (e.g., the display module 160, the sensor module 176, or
the communication module 190) among the components of the electronic device 101, instead
of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep)
state, or together with the main processor 121 while the main processor 121 is in
an active state (e.g., executing an application). According to an embodiment, the
auxiliary processor 123 (e.g., an image signal processor or a communication processor)
may be implemented as part of another component (e.g., the camera module 180 or the
communication module 190) functionally related to the auxiliary processor 123. According
to an embodiment, the auxiliary processor 123 (e.g., the neural processing unit) may
include a hardware structure specified for artificial intelligence model processing.
An artificial intelligence model may be generated by machine learning. Such learning
may be performed, e.g., by the electronic device 101 where the artificial intelligence
model is performed, or via a separate server (e.g., the server 108). Learning algorithms
may include, but are not limited to, e.g., supervised learning, unsupervised learning,
semi-supervised learning, or reinforcement learning. The artificial intelligence model
may include a plurality of artificial neural network layers. The artificial neural
network may be a deep neural network (DNN), a convolutional neural network (CNN),
a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief
network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network,
or a combination of two or more thereof, but is not limited thereto. The artificial
intelligence model may, additionally or alternatively, include a software structure
other than the hardware structure.
[0022] The memory 130 may store various data used by at least one component (e.g., the processor
120 or the sensor module 176) of the electronic device 101. The various data may include,
for example, software (e.g., the program 140) and input data or output data for a
command related thereto. The memory 130 may include the volatile memory 132 or the
non-volatile memory 134.
[0023] The program 140 may be stored in the memory 130 as software, and may include, for
example, an operating system (OS) 142, middleware 144, or an application 146.
[0024] The input module 150 may receive a command or data to be used by another component
(e.g., the processor 120) of the electronic device 101, from the outside (e.g., a
user) of the electronic device 101. The input module 150 may include, for example,
a microphone, a mouse, a keyboard, a key (e.g., a button), or a digital pen (e.g.,
a stylus pen).
[0025] The sound output module 155 may output sound signals to the outside of the electronic
device 101. The sound output module 155 may include, for example, a speaker or a receiver.
The speaker may be used for general purposes, such as playing multimedia or playing
a recording. The receiver may be used to receive incoming calls. According to an embodiment,
the receiver may be implemented as separate from, or as part of, the speaker.
[0026] The display module 160 may visually provide information to the outside (e.g., a user)
of the electronic device 101. The display module 160 may include, for example, a display,
a hologram device, or a projector and control circuitry to control a corresponding
one of the display, hologram device, and projector. According to an embodiment, the
display module 160 may include a touch sensor adapted to detect a touch, or a pressure
sensor adapted to measure the intensity of force incurred by the touch.
[0027] The audio module 170 may convert a sound into an electrical signal and vice versa.
According to an embodiment, the audio module 170 may obtain the sound via the input
module 150, or output the sound via the sound output module 155 or a headphone of
an external electronic device (e.g., an electronic device 102) directly (e.g., wiredly)
or wirelessly coupled with the electronic device 101.
[0028] The sensor module 176 may detect an operational state (e.g., power or temperature)
of the electronic device 101 or an environmental state (e.g., a state of a user) external
to the electronic device 101, and then generate an electrical signal or data value
corresponding to the detected state. According to an embodiment, the sensor module
176 may include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure
sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor,
a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor,
a humidity sensor, or an illuminance sensor.
[0029] The interface 177 may support one or more specified protocols to be used for the
electronic device 101 to be coupled with the external electronic device (e.g., the
electronic device 102) directly (e.g., wiredly) or wirelessly. According to an embodiment,
the interface 177 may include, for example, a high definition multimedia interface
(HDMI), a universal serial bus (USB) interface, a secure digital (SD) card interface,
or an audio interface.
[0030] A connecting terminal 178 may include a connector via which the electronic device
101 may be physically connected with the external electronic device (e.g., the electronic
device 102). According to an embodiment, the connecting terminal 178 may include,
for example, a HDMI connector, a USB connector, a SD card connector, or an audio connector
(e.g., a headphone connector).
[0031] The haptic module 179 may convert an electrical signal into a mechanical stimulus
(e.g., a vibration or a movement) or an electrical stimulus which may be recognized by
a user via their tactile sensation or kinesthetic sensation. According to an embodiment,
the haptic module 179 may include, for example, a motor, a piezoelectric element,
or an electric stimulator.
[0032] The camera module 180 may capture a still image or moving images. According to an
embodiment, the camera module 180 may include one or more lenses, image sensors, image
signal processors, or flashes.
[0033] The power management module 188 may manage power supplied to the electronic device
101. According to an embodiment, the power management module 188 may be implemented
as at least part of, for example, a power management integrated circuit (PMIC).
[0034] The battery 189 may supply power to at least one component of the electronic device
101. According to an embodiment, the battery 189 may include, for example, a primary
cell which is not rechargeable, a secondary cell which is rechargeable, or a fuel
cell.
[0035] The communication module 190 may support establishing a direct (e.g., wired) communication
channel or a wireless communication channel between the electronic device 101 and
the external electronic device (e.g., the electronic device 102, the electronic device
104, or the server 108) and performing communication via the established communication
channel. The communication module 190 may include one or more communication processors
that are operable independently from the processor 120 (e.g., the application processor
(AP)) and support a direct (e.g., wired) communication or a wireless communication.
According to an embodiment, the communication module 190 may include a wireless communication
module 192 (e.g., a cellular communication module, a short-range wireless communication
module, or a global navigation satellite system (GNSS) communication module) or a
wired communication module 194 (e.g., a local area network (LAN) communication module
or a power line communication (PLC) module). A corresponding one of these communication
modules may communicate with the external electronic device via the first network
198 (e.g., a short-range communication network, such as Bluetooth™, wireless-fidelity
(Wi-Fi) direct, or infrared data association (IrDA)) or the second network 199 (e.g.,
a long-range communication network, such as a legacy cellular network, a 5G network,
a next-generation communication network, the Internet, or a computer network (e.g.,
a LAN or wide area network (WAN))). These various types of communication
modules may be implemented as a single component (e.g., a single chip), or may be
implemented as multi components (e.g., multi chips) separate from each other. The
wireless communication module 192 may identify and authenticate the electronic device
101 in a communication network, such as the first network 198 or the second network
199, using subscriber information (e.g., international mobile subscriber identity
(IMSI)) stored in the subscriber identification module 196.
[0036] The wireless communication module 192 may support a 5G network, after a 4G network,
and next-generation communication technology, e.g., new radio (NR) access technology.
The NR access technology may support enhanced mobile broadband (eMBB), massive machine
type communications (mMTC), or ultra-reliable and low-latency communications (URLLC).
The wireless communication module 192 may support a high-frequency band (e.g., the
mmWave band) to achieve, e.g., a high data transmission rate. The wireless communication
module 192 may support various technologies for securing performance on a high-frequency
band, such as, e.g., beamforming, massive multiple-input and multiple-output (massive
MIMO), full dimensional MIMO (FD-MIMO), array antenna, analog beam-forming, or large
scale antenna. The wireless communication module 192 may support various requirements
specified in the electronic device 101, an external electronic device (e.g., the electronic
device 104), or a network system (e.g., the second network 199). According to an embodiment,
the wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or
more) for implementing eMBB, loss coverage (e.g., 164 dB or less) for implementing
mMTC, or U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink
(UL), or a round trip of 1 ms or less) for implementing URLLC.
[0037] The antenna module 197 may transmit or receive a signal or power to or from the outside
(e.g., the external electronic device) of the electronic device 101. According to
an embodiment, the antenna module 197 may include an antenna including a radiating
element composed of a conductive material or a conductive pattern formed in or on
a substrate (e.g., a printed circuit board (PCB)). According to an embodiment, the
antenna module 197 may include a plurality of antennas (e.g., array antennas). In
such a case, at least one antenna appropriate for a communication scheme used in the
communication network, such as the first network 198 or the second network 199, may
be selected, for example, by the communication module 190 (e.g., the wireless communication
module 192) from the plurality of antennas. The signal or the power may then be transmitted
or received between the communication module 190 and the external electronic device
via the selected at least one antenna. According to an embodiment, another component
(e.g., a radio frequency integrated circuit (RFIC)) other than the radiating element
may be additionally formed as part of the antenna module 197.
[0038] According to various embodiments, the antenna module 197 may form an mmWave antenna
module. According to an embodiment, the mmWave antenna module may include a printed
circuit board, an RFIC disposed on a first surface (e.g., the bottom surface) of the
printed circuit board, or adjacent to the first surface and capable of supporting
a designated high-frequency band (e.g., the mmWave band), and a plurality of antennas
(e.g., array antennas) disposed on a second surface (e.g., the top or a side surface)
of the printed circuit board, or adjacent to the second surface and capable of transmitting
or receiving signals of the designated high-frequency band.
[0039] At least some of the above-described components may be coupled mutually and communicate
signals (e.g., commands or data) therebetween via an inter-peripheral communication
scheme (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface
(SPI), or mobile industry processor interface (MIPI)).
[0040] According to an embodiment, commands or data may be transmitted or received between
the electronic device 101 and the external electronic device 104 via the server 108
coupled with the second network 199. Each of the external electronic devices 102 or 104 may
be a device of the same type as, or a different type from, the electronic device 101.
According to an embodiment, all or some of operations to be executed at the electronic
device 101 may be executed at one or more of the external electronic devices 102,
104, or 108. For example, if the electronic device 101 should perform a function or
a service automatically, or in response to a request from a user or another device,
the electronic device 101, instead of, or in addition to, executing the function or
the service, may request the one or more external electronic devices to perform at
least part of the function or the service. The one or more external electronic devices
receiving the request may perform the at least part of the function or the service
requested, or an additional function or an additional service related to the request,
and transfer an outcome of the performing to the electronic device 101. The electronic
device 101 may provide the outcome, with or without further processing of the outcome,
as at least part of a reply to the request. To that end, a cloud computing, distributed
computing, mobile edge computing (MEC), or client-server computing technology may
be used, for example. The electronic device 101 may provide ultra low-latency services
using, e.g., distributed computing or mobile edge computing. In another embodiment,
the external electronic device 104 may include an internet-of-things (IoT) device.
The server 108 may be an intelligent server using machine learning and/or a neural
network. According to an embodiment, the external electronic device 104 or the server
108 may be included in the second network 199. The electronic device 101 may be applied
to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based
on 5G communication technology or IoT-related technology.
[0041] FIG. 2 is a block diagram 200 illustrating the display module 160 according to various
embodiments. Referring to FIG. 2, the display module 160 may include a display 210
and a display driver integrated circuit (DDI) 230 to control the display 210. The
DDI 230 may include an interface module 231, memory 233 (e.g., buffer memory), an
image processing module 235, or a mapping module 237. The DDI 230 may receive image
information that contains image data or an image control signal corresponding to a
command to control the image data from another component of the electronic device
101 via the interface module 231. For example, according to an embodiment, the image
information may be received from the processor 120 (e.g., the main processor 121 (e.g.,
an application processor)) or the auxiliary processor 123 (e.g., a graphics processing
unit) operated independently from the function of the main processor 121. The DDI
230 may communicate, for example, with touch circuitry 250 or the sensor module 176
via the interface module 231. The DDI 230 may also store at least part of the received
image information in the memory 233, for example, on a frame by frame basis. The image
processing module 235 may perform pre-processing or post-processing (e.g., adjustment
of resolution, brightness, or size) with respect to at least part of the image data.
According to an embodiment, the pre-processing or post-processing may be performed,
for example, based at least in part on one or more characteristics of the image data
or one or more characteristics of the display 210. The mapping module 237 may generate
a voltage value or a current value corresponding to the image data pre-processed or
post-processed by the image processing module 235. According to an embodiment, the
generating of the voltage value or current value may be performed, for example, based
at least in part on one or more attributes of the pixels (e.g., an array, such as
an RGB stripe or a pentile structure, of the pixels, or the size of each subpixel).
At least some pixels of the display 210 may be driven, for example, based at least
in part on the voltage value or the current value such that visual information (e.g.,
a text, an image, or an icon) corresponding to the image data may be displayed via
the display 210.
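The DDI pipeline described in the paragraph above (buffer the received image information per frame, pre/post-process it, then map it to pixel drive values) can be sketched roughly as follows. This is an illustrative Python model only; the function name `ddi_pipeline`, the 8-bit pixel representation, the brightness adjustment, and the assumed 5 V maximum drive voltage are all hypothetical choices, not details of the disclosure:

```python
# Hypothetical sketch of the DDI stages above:
# interface 231 -> memory 233 (frame buffer) -> image processing 235
# (e.g., brightness adjustment) -> mapping 237 (pixel drive values).

def ddi_pipeline(image_data, brightness_gain=1.0):
    # Store received image data on a frame-by-frame basis (memory 233).
    frame_buffer = list(image_data)
    # Pre/post-processing, modeled here as a brightness adjustment
    # clamped to an assumed 8-bit range (image processing module 235).
    processed = [min(255, int(px * brightness_gain)) for px in frame_buffer]
    # Map processed values to drive values; a 5 V maximum is assumed
    # purely for illustration (mapping module 237).
    drive_values = [px / 255 * 5.0 for px in processed]
    return drive_values

# 8-bit pixel values mapped to voltages driving the pixels:
print(ddi_pipeline([0, 128, 255]))
```

The real mapping depends on pixel attributes such as the subpixel array (e.g., RGB stripe or pentile) and subpixel size, as the paragraph above notes; the sketch deliberately omits those details.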
[0042] According to an embodiment, the display module 160 may further include the touch
circuitry 250. The touch circuitry 250 may include a touch sensor 251 and a touch
sensor IC 253 to control the touch sensor 251. The touch sensor IC 253 may control
the touch sensor 251 to sense a touch input or a hovering input with respect to a
certain position on the display 210. To achieve this, for example, the touch sensor
251 may detect (e.g., measure) a change in a signal (e.g., a voltage, a quantity of
light, a resistance, or a quantity of one or more electric charges) corresponding
to the certain position on the display 210. The touch circuitry 250 may provide input
information (e.g., a position, an area, a pressure, or a time) indicative of the touch
input or the hovering input detected via the touch sensor 251 to the processor 120.
According to an embodiment, at least part (e.g., the touch sensor IC 253) of the touch
circuitry 250 may be formed as part of the display 210 or the DDI 230, or as part
of another component (e.g., the auxiliary processor 123) disposed outside the display
module 160.
[0043] According to an embodiment, the display module 160 may further include at least one
sensor (e.g., a fingerprint sensor, an iris sensor, a pressure sensor, or an illuminance
sensor) of the sensor module 176 or a control circuit for the at least one sensor.
In such a case, the at least one sensor or the control circuit for the at least one
sensor may be embedded in one portion of a component (e.g., the display 210, the DDI
230, or the touch circuitry 250) of the display module 160. For example, when the
sensor module 176 embedded in the display module 160 includes a biometric sensor (e.g.,
a fingerprint sensor), the biometric sensor may obtain biometric information (e.g.,
a fingerprint image) corresponding to a touch input received via a portion of the
display 210. As another example, when the sensor module 176 embedded in the display
module 160 includes a pressure sensor, the pressure sensor may obtain pressure information
corresponding to a touch input received via a partial or whole area of the display
210. According to an embodiment, the touch sensor 251 or the sensor module 176 may
be disposed between pixels in a pixel layer of the display 210, or over or under the
pixel layer.
[0044] FIG. 3A is a block diagram illustrating a configuration of an electronic device according
to an embodiment.
[0045] According to an embodiment, an electronic device 301 (e.g., the electronic device
101 of FIG. 1) may include a first processor 320, memory 330, and a display 360.
[0046] According to an embodiment, the electronic device 301 may be implemented the same
as or similar to the electronic device 101 of FIG. 1.
[0047] According to an embodiment, the display 360 may be implemented the same as or similar
to the display module 160 of FIG. 1 or the display 210 of FIG. 2. According
to an embodiment, the display 360 may include a DDI 370 and a display panel 380. According
to an embodiment, the DDI 370 may control the display panel 380. According to an embodiment,
the DDI 370 may be implemented the same as or similar to the DDI 230 of FIG. 2.
[0048] According to an embodiment, the first processor 320 may provide overall control to
operations of the electronic device 301. For example, the first processor 320 may
be implemented the same as or similar to the processor 120 of FIG. 1. According to
an embodiment, the first processor 320 may be implemented as an AP. According to an
embodiment, the first processor 320 may include a second processor 321, a third processor
322, and a display controller 350. The display controller 350 may be implemented as
a display processing unit or data processing unit (DPU). According to an embodiment,
the second processor 321 may be implemented as a GPU. According to an embodiment,
the third processor 322 may be implemented as a CPU. According to an embodiment, the
third processor 322 may execute a display driver.
[0049] According to an embodiment, the display controller 350 may receive a tearing signal
for a frame (or image) from the display 360. For example, the tearing signal may refer
to a tearing effect (TE) signal.
[0050] A tearing effect (TE) may refer to a visual artifact that appears when a plurality
of frames are displayed as one screen on the display 360. For example, the TE may occur
when the refresh rate of the display and the frame rate of the GPU are not synchronized.
[0051] According to an embodiment, the tearing signal may include a control signal for determining
a time interval during which a frame is written and a time interval during which the
frame is scanned (or displayed) to prevent the TE of the frame scanned (or displayed)
on the display panel 380. When identifying the tearing signal, the display controller
350 may transmit the frame to the DDI 370. According to an embodiment, transmitting
the frame to the DDI 370 by the display controller 350 may include writing the frame
to memory (e.g., the memory 233 of FIG. 2 (e.g., graphic RAM (GRAM))) for the display
360 by the DDI 370. The DDI 370 may then read the frame written to the memory 233
and scan (or display) the frame on the display panel 380. For example, a period of
the tearing signal may coincide with a period of a synchronization signal (e.g., a
Vsync signal). Alternatively, an output time of the tearing signal may be adjusted
based on an output time of the synchronization signal depending on implementation.
According to an embodiment, the tearing signal may be a clock signal that rises at
a rising time (e.g., a rising edge time) of the synchronization signal and falls at
a falling time (e.g., a falling edge time) of the synchronization signal. According
to an embodiment, the rising time of the tearing signal may be the output time of
the tearing signal. According to an embodiment, the display controller 350 may transmit
the frame to the DDI 370, and the DDI 370 may write the frame to the memory 233, at
the rising time of the tearing signal. According to an embodiment, the DDI 370 may
read the frame from the memory 233 and scan (or display) the frame on the display
panel 380, at the falling time of the tearing signal.
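The write-at-rising-edge, scan-at-falling-edge behavior described above may be sketched as follows. This is an illustrative sketch only; the names TearingSignal, period_ms, and duty are hypothetical and do not appear in the disclosure.

```python
from dataclasses import dataclass

@dataclass
class TearingSignal:
    period_ms: float       # matches the Vsync period, e.g. 1000/120 for 120 Hz
    duty: float = 0.5      # hypothetical fraction of the period the signal stays high

    def rising_time(self, n: int) -> float:
        """Output time of the n-th tearing signal (rising edge): frame is written."""
        return n * self.period_ms

    def falling_time(self, n: int) -> float:
        """Falling edge: the DDI reads the frame from memory and scans it out."""
        return n * self.period_ms + self.duty * self.period_ms

te = TearingSignal(period_ms=1000 / 120)
# The frame is written to GRAM at the rising edge and scanned at the falling edge.
write_time = te.rising_time(1)
scan_time = te.falling_time(1)
```

Under these assumptions, the write always precedes the scan within one tearing-signal period.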
[0052] According to an embodiment, the second processor 321 may render a frame (or image).
According to an embodiment, the second processor 321 may transmit the rendered frame
(or image) to the third processor 322. According to an embodiment, the third processor
322 may transmit the frame to the display controller 350. According to an embodiment,
the display controller 350 may transmit the frame (or image) to the DDI 370. According
to an embodiment, the display controller 350 may transmit a first frame received from
the third processor 322 to the DDI 370, based on the tearing signal output from the
display 360 at a predetermined period. According to an embodiment, the tearing signal
may be output at a time advanced from an output time of a reference tearing signal
by a specified time. For example, the reference tearing signal may refer to a tearing
signal output from the display 360 at an output time prior to adjustment. For example,
the specified time may be determined based on at least one of the number of layers
included in the frame, a resolution of the display 360, or a frequency (frame rate)
of the display 360. For example, the resolution of the display 360 may be implemented
as 3,088 (the number of horizontal pixels) x 1,440 (the number of vertical pixels).
For example, the frequency of the display 360 may be implemented as 120Hz. For example,
in the case of a set value based on the above resolution, the above frequency, and
four layers, the specified time may be 2ms. However, this is exemplary, and embodiments
of the disclosure may set the specified time based on various methods.
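The dependency of the specified time on layer count, resolution, and frequency may be sketched as a lookup. This table-based mapping is an assumption introduced for illustration; the disclosure only states that the specified time depends on these parameters and gives 2ms for the set value above.

```python
# Hypothetical mapping: (layers, (width, height), hz) -> specified time in ms.
ADVANCE_MS = {
    (4, (3088, 1440), 120): 2.0,   # the set value given in the disclosure
}

def specified_advance_ms(layers, resolution, hz, default=2.0):
    """Look up the time by which the tearing-signal output time is advanced."""
    return ADVANCE_MS.get((layers, resolution, hz), default)

print(specified_advance_ms(4, (3088, 1440), 120))  # -> 2.0
```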
[0053] According to an embodiment, the display controller 350 may transmit the first frame
to the DDI 370 at the output time of the tearing signal. According to an embodiment,
the first frame may be written to the memory 233 (e.g., the memory 233 of FIG. 2)
at the output time of the tearing signal. According to an embodiment, the memory 233
may be implemented as a GRAM. According to an embodiment, the DDI 370 may read the
first frame from the memory 233 and scan (or display) the first frame on the display
panel 380.
[0054] According to an embodiment, the third processor 322 may receive a second frame from
the second processor 321 to update some layers related to the first frame. For example,
the second frame may be a frame rendered by the second processor 321 after the first
frame. According to an embodiment, the third processor 322 may identify whether the
second frame is a frame for updating a part of the first frame. For example, the third
processor 322 may identify whether the second frame is a frame for updating some of
a plurality of layers included in the first frame. In this case, the second frame
may include only some of the layers, not all of them. When identifying that the second
frame is for updating some layers of the first frame, the third processor 322 may
identify whether the second frame is for updating a layer corresponding to a region
of interest (ROI) of the first frame. For example, the ROI may include a bottom region
of the first frame displayed (or scanned) on the display panel 380. For example, the
ROI may be set based on user input. For example, the ROI is not limited
to the bottom region of the first frame and may include at least a portion of the
regions of the first frame displayed on the display panel 380.
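The partial-update and ROI checks of this paragraph may be sketched as follows. The layer names, and the functions is_partial_update and updates_roi, are hypothetical names introduced for illustration.

```python
def is_partial_update(frame_layers, all_layers):
    """A second frame that carries only some of the layers is a partial update."""
    return set(frame_layers) < set(all_layers)

def updates_roi(frame_layers, roi_layers):
    """True when any updated layer corresponds to the region of interest."""
    return bool(set(frame_layers) & set(roi_layers))

all_layers = {"status_bar", "content", "nav_bar"}
roi = {"nav_bar"}                 # e.g. the bottom region of the first frame
second_frame = {"nav_bar"}        # updates only the bottom layer
assert is_partial_update(second_frame, all_layers)
assert updates_roi(second_frame, roi)
```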
[0055] According to an embodiment, when identifying that the second frame is for updating
the layer corresponding to the ROI of the first frame, the third processor 322 may
identify that transmission of the second frame to the DDI 370 based on the tearing
signal will cause (or is expected to cause) screen tearing. For example, when the
second frame is transmitted to the DDI 370 based on the tearing signal, a time when
the DDI 370 scans (or displays) the first frame written to the memory 233 on the display
panel 380 may overlap with a time when the DDI 370 writes the second frame to the
memory 233. This may result in tearing of an image or frame displayed (or scanned)
on the display 360.
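The overlap condition described in this paragraph may be sketched as an interval check. The interval endpoints below are hypothetical values chosen for illustration, not values from the disclosure.

```python
def intervals_overlap(a_start, a_end, b_start, b_end):
    """Half-open intervals [a_start, a_end) and [b_start, b_end) overlap?"""
    return a_start < b_end and b_start < a_end

# Hypothetical timing (ms): the ROI (bottom region) of the first frame is
# scanned late in the scan-out of the frame...
roi_scan = (10.0, 12.0)
# ...while the second-frame write, started at the tearing signal, runs into it.
second_write = (8.3, 11.0)

# Tearing is expected because the write interval overlaps the ROI scan interval.
tearing_expected = intervals_overlap(*roi_scan, *second_write)
```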
[0056] According to an embodiment, the third processor 322 and the display controller 350
may perform an operation of preventing screen tearing based on identifying that screen
tearing will occur.
[0057] According to an embodiment, the display controller 350 may transmit the second frame
to the DDI 370 after the first frame is scanned (or displayed) on the display panel
380. According to an embodiment, the display controller 350 may transmit the second
frame to the DDI 370 after a specified first time from an output time of the tearing
signal. According to an embodiment, the specified first time (e.g., 2ms) may refer
to a time between the output time of the tearing signal and a completion time of scanning
(or displaying) the first frame on the display panel 380. Alternatively, according
to an embodiment, the third processor 322 may transmit the second frame to the display
controller 350 after a specified second time from an output time of the tearing signal.
According to an embodiment, the specified second time (e.g., 2ms) may refer to a time
between the output time of the tearing signal and the completion time of scanning
(or displaying) the first frame on the display panel 380. According to an embodiment,
the specified first time and the specified second time may be equal or different.
According to an embodiment, the specified first time and the specified second time
may refer to a time set such that a time interval during which an ROI of a part of
the first frame is scanned (or displayed) on the display panel 380 does not coincide
with a time interval during which the second frame for updating a layer corresponding
to the ROI is transmitted to the DDI 370. For example, transmitting the second frame
to the DDI 370 may include writing the second frame to the memory 233 by the DDI 370.
For example, when the first frame is partially updated, the third processor 322 may
adjust a transmission time of the second frame to the DDI 370. In this way, even updating
only a part of the frame may not result in screen tearing.
[0058] According to an embodiment, the third processor 322 may transmit a third frame to
the display controller 350 to update the entire first frame based on identifying that
screen tearing will occur. According to an embodiment, the display controller 350
may receive the third frame from the third processor 322 based on the tearing signal.
According to an embodiment, the display controller 350 may transmit the third frame
to the DDI 370 at an output time of the tearing signal. According to an embodiment,
the display controller 350 may transmit the third frame to the DDI 370 during scanning
(or display) of the first frame on the display panel 380. For example, transmitting
the third frame to the DDI 370 may include writing the third frame to the memory 233
by the DDI 370. For example, updating the entire first frame, as opposed to updating
a part of the first frame, may not cause screen tearing. Accordingly, when screen
tearing is expected, the third processor 322 may request the third frame for updating
the entire first frame from the second processor 321, and transmit the third frame
received from the second processor 321 to the display controller 350.
[0059] According to an embodiment, the third processor 322 may receive the first frame in
response to output of the first tearing signal. However, when the third processor
322 fails to receive the first frame between an output time of the first tearing signal
and an output time of a subsequent second tearing signal (e.g., a tearing signal output
after the first tearing signal), a frame delay may occur. For example, when a frame
delay occurs, the third processor 322 may transmit the first frame to the display
controller 350 in a time interval following an originally allocated time interval.
To address this, the third processor 322 may advance an output time of the tearing
signal by a specified time (e.g., 2ms). This allows the third processor 322 to wait
for a rendered frame from the second processor 321 for a longer period corresponding
to the specified time, and thus reduce the frame delay.
[0060] FIG. 3B is a block diagram illustrating a configuration of an electronic device according
to an embodiment.
[0061] According to an embodiment, the first processor 320 (e.g., the first processor 320
of FIG. 3A) may include the second processor 321 (e.g., the second processor 321 of
FIG. 3A), the third processor 322 (e.g., the third processor 322 of FIG. 3A), and
the display controller 350 (e.g., the display controller 350 of FIG. 3A).
[0062] According to an embodiment, the third processor 322 may execute a surface flinger
381, a hardware (HW) composer 382, and a display driver 383. According to an embodiment,
the second processor 321 may render a frame (or an image). According to an embodiment,
the rendered frame (or image) may be stored in a graphics buffer (e.g., a frame storage
area in the memory 330).
[0063] According to an embodiment, the surface flinger 381 may transmit information about
the graphics buffer (e.g., an address of the frame storage area, resolution information,
and so on) to the HW composer 382. According to an embodiment, the HW composer 382
may transmit the information about the graphics buffer to the display driver 383.
[0064] According to an embodiment, the display driver 383 may transmit the rendered frame
to the display controller 350. For example, the display driver 383 may determine a
time when the display controller 350 is to transmit the frame to the display 360 (e.g.,
the display 360 of FIG. 3A). For example, the transmission time of the frame may be
an output time of a tearing signal (or a time when the display controller 350 identifies
the tearing signal). According to an embodiment, the display driver 383 may adjust
an output time of a reference tearing signal to be advanced by a specified time (e.g.,
2ms).
[0065] According to an embodiment, the display controller 350 may transmit the frame to
the display 360 at the adjusted output time of the tearing signal. For example, transmitting
the frame by the display controller 350 may include writing the frame to the memory
233 (e.g., the memory 233 of FIG. 2) by the display 360.
[0066] According to an embodiment, the display 360 may read the frame from the memory 233
and scan (or display) the frame.
[0067] FIG. 4A is a flowchart illustrating a method of operating an electronic device according
to an embodiment. Although the flowchart of FIG. 4A comprises conditional branching
statements such as operation 405, it is defined that each of the branching operations
may form a separate additional embodiment, and each of the branching operations may
be considered optional, as described below.
[0068] According to an embodiment, in operation 401, the display controller 350 (e.g., the
display controller 350 of FIG. 3A) may transmit a first frame to the display 360 (e.g.,
the display 360 of FIG. 3A), based on a tearing signal output at a predetermined period
from the display 360. According to an embodiment, the first frame may be a frame rendered
by the second processor 321 (e.g., the second processor 321 of FIG. 3A). According
to an embodiment, the output time of the tearing signal may be adjusted to be advanced
from the output time of a reference tearing signal by a specified time (e.g., 2ms).
For example, the reference tearing signal may refer to a tearing signal output from
the display 360 at an output time prior to the adjustment. For example, the specified
time may be determined based on at least one of the number of layers included in the
frame, the resolution of the display 360, or the frequency of the display 360. However,
this is exemplary, and the specified time may be set based on various methods in embodiments
of the disclosure.
[0069] According to an embodiment, the first frame may be written to the memory 233 (e.g.,
the memory 233 of FIG. 2) at an output time of the tearing signal. According to an
embodiment, the display controller 350 may transmit the first frame to the display
360 within a preset time (e.g., 2ms) from the output time of the tearing signal. According
to an embodiment, the display 360 may read the first frame from the memory 233 and
scan (or display) the first frame. According to an embodiment, the preset time may
be set by a user or the first processor 320.
[0070] According to an embodiment, in operation 403, the third processor 322 (e.g., the
third processor 322 of FIG. 3A) (e.g., the display driver 383) (e.g., the display
driver 383 of FIG. 3B) may receive a second frame for updating some of layers related
to the first frame from the second processor 321. According to an embodiment, the
second frame may include a frame rendered by the second processor 321 after the first
frame. According to an embodiment, the second frame may include only some of the layers
rather than all of the layers.
[0071] According to an embodiment, in operation 405, the third processor 322 (e.g., the
display driver 383) may identify whether the second frame is a frame for updating
a layer corresponding to an ROI of the first frame (e.g., a bottom region of the first
frame). When steps 411 and 413 are not part of an embodiment as explained above, step
405 may be formulated as determining 405, by the third processor 322 (e.g., the display
driver 383), that the second frame is a frame for updating a layer corresponding to
an ROI of the first frame (e.g., a bottom region of the first frame). When steps 407
and 409 are not part of an embodiment as explained above, step 405 may be formulated
as determining 405, by the third processor 322 (e.g., the display driver 383), that
the second frame is not a frame for updating a layer corresponding to an ROI of the
first frame (e.g., a bottom region of the first frame).
[0072] According to an embodiment, when the third processor 322 (e.g., the display driver
383) identifies that the second frame is a frame for updating the layer corresponding
to the ROI of the first frame (e.g., the bottom region of the first frame) (yes in
operation 405), the third processor 322 (e.g., the display driver 383) may identify
that transmission of the second frame to the display 360 through the display controller
350 will cause (or is expected to cause) screen tearing in operation
407. For example, when the second frame is transmitted to the display 360 based on
the tearing signal, a time when the display 360 scans (or displays) the first frame
written to the memory 233 may overlap with a time when the display 360 writes the
second frame to the memory 233. This may cause tearing of an image or frame displayed
(or scanned) on the display 360. According to an embodiment, the third processor 322
(e.g., the display driver 383) may transmit the second frame to the display controller
350. For example, the third processor 322 may transmit, to the display controller 350,
information related to the time when the display controller 350 is to transmit the
second frame to the display 360
(e.g., a specified first time from an output time of the tearing signal). Alternatively,
for example, the third processor 322 may transmit the second frame to the display
controller 350 after a specified second time (e.g., 2ms) from the output time of the
tearing signal.
[0073] According to an embodiment, in operation 409, the display controller 350 may transmit
the second frame to the display 360 after the first frame is scanned (or displayed)
on the display 360. According to an embodiment, the display controller 350 may transmit
the second frame to the display 360 after the specified first time (e.g., 2ms) from
the output time of the tearing signal. Alternatively, according to an embodiment,
the display controller 350 may receive the second frame from the third processor 322
(e.g., the display driver 383) after the specified second time (e.g., 2ms) from the
output time of the tearing signal. According to an embodiment, the specified first
time and the specified second time may be equal or different. According to an embodiment,
the specified first time and the specified second time may refer to times that are
set such that a time interval during which an ROI in a part of the first frame is
scanned (or displayed) on the display 360 does not coincide with a time interval during
which a second frame for updating a layer corresponding to the ROI is transmitted
to the display 360.
[0074] As a result, the electronic device 301 (e.g., the electronic device 301 of FIG. 3A)
may ensure that the time interval during which the ROI of the first frame is scanned
(or displayed) on the display 360 does not coincide with the time interval during
which the second frame for updating the layer corresponding to the ROI is transmitted
to the display 360 (or the time interval during which the second frame is written),
thereby avoiding screen tearing, when updating only a part of the screen.
[0075] According to an embodiment, the operation of transmitting the second frame to the
display 360 after the specified first time (e.g., 2ms) from an output time
of the tearing signal by the display controller 350 will be described in detail with
reference to FIG. 6A. According to an embodiment, the operation of transmitting the
second frame to the display controller 350 after the specified second time (e.g.,
2ms) from the output time of the tearing signal by the third processor 322 will be
described in detail with reference to FIG. 7A.
[0076] According to an embodiment, when the third processor 322 (e.g., the display driver
383) does not identify that the second frame is a frame for updating a layer corresponding
to an ROI of the first frame (e.g., a bottom region of the first frame) (no in operation
405), the third processor 322 may identify that transmission of the second frame to
the display 360 through the display controller 350 will not cause screen tearing in
operation 411.
[0077] According to an embodiment, in operation 413, the display controller 350 may transmit
the second frame to the display 360 at an output time of the tearing signal. According
to an embodiment, the display controller 350 may transmit the second frame to the
display 360 during scanning (or display) of the first frame on the display 360. According
to an embodiment, the second frame may be written to the memory 233 at the output
time of the tearing signal. According to an embodiment, the display controller 350
may transmit the second frame to the display 360 after a preset time from the output
time of the tearing signal. For example, the display 360 may read the second frame
from the memory 233 and scan (or display) the second frame at a falling time of the
tearing signal.
[0078] According to an embodiment, the electronic device 301 (e.g., the electronic device
301 of FIG. 3A) may display a frame that updates only a part of a screen on the display,
thereby consuming less current than would be consumed to update the entire screen. Furthermore,
when updating only a part of the screen, the electronic device 301 may not cause screen
tearing.
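One possible reading of the FIG. 4A flow (operations 405 to 413) may be written as a small decision function. The function name, the 2ms first time, and the numeric values are assumptions made for illustration.

```python
def handle_second_frame(updates_roi_layer: bool,
                        te_output_ms: float,
                        first_time_ms: float = 2.0):
    """Return (tearing_expected, transmit_time_ms) for the second frame."""
    if updates_roi_layer:                       # "yes" in operation 405
        # Operations 407/409: tearing is expected, so delay transmission by
        # the specified first time from the tearing-signal output time.
        return True, te_output_ms + first_time_ms
    # Operations 411/413: no tearing expected; transmit at the tearing-signal
    # output time while the first frame is being scanned.
    return False, te_output_ms

assert handle_second_frame(True, 8.0) == (True, 10.0)
assert handle_second_frame(False, 8.0) == (False, 8.0)
```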
[0079] FIG. 4B is a flowchart illustrating a method of operating an electronic device according
to an embodiment. Although the flowchart of FIG. 4B comprises conditional branching
statements such as operation 425, it is defined that each of the branching operations
may form a separate additional embodiment, and each of the branching operations may
be considered optional, as described below.
[0080] According to an embodiment, in operation 421, the display controller 350 (e.g., the
display controller 350 of FIG. 3A) may transmit a first frame to the display 360,
based on a tearing signal output at a predetermined period from the display 360 (e.g.,
the display 360 of FIG. 3A). According to an embodiment, an output time of the tearing
signal may be adjusted to be advanced from an output time of a reference tearing signal
by a specified time (e.g., 2ms).
[0081] According to an embodiment, in operation 423, the third processor 322 (e.g., the
third processor 322 of FIG. 3A) (e.g., the display driver 383) (e.g., the display
driver 383 of FIG. 3B) may receive a second frame for updating some layers related
to the first frame from the second processor 321.
[0082] According to an embodiment, in operation 425, the third processor 322 (e.g., the
display driver 383) may identify whether the second frame is a frame for updating
a layer corresponding to an ROI of the first frame (e.g., a bottom region of the first
frame).
[0083] When steps 431 and 433 are not part of an embodiment as explained above, step 425
may be formulated as determining 425, by the third processor 322 (e.g., the display
driver 383), that the second frame is a frame for updating a layer corresponding to
an ROI of the first frame (e.g., a bottom region of the first frame). When steps 427
and 429 are not part of an embodiment as explained above, step 425 may be formulated
as determining 425, by the third processor 322 (e.g., the display driver 383), that
the second frame is not a frame for updating a layer corresponding to an ROI of the
first frame (e.g., a bottom region of the first frame).
[0084] According to an embodiment, when the third processor 322 (e.g., the display driver
383) identifies that the second frame is a frame for updating the layer corresponding
to the ROI of the first frame (e.g., yes in operation 425), the third processor 322
(e.g., the display driver 383) may identify that transmission of the second frame
to the display 360 through the display controller 350 will cause (or is expected
to cause) screen tearing in operation 427.
[0085] According to an embodiment, in operation 429, the display controller 350 may transmit
a third frame for updating the entire first frame to the display 360 at an output
time of the tearing signal. For example, updating the entire first frame, as opposed
to updating a part of the first frame, may not result in screen tearing. Accordingly,
when screen tearing is expected, the third processor 322 (e.g., the display driver
383) may request, from the second processor 321, the third frame for updating the
entire first frame. The third processor 322 (e.g., the display driver 383) may transmit
the third frame to the display controller 350, and the display controller 350 may
transmit the third frame to the display 360. According to an embodiment, the third
frame may include a frame rendered by the second processor 321 (e.g., the second processor
321 of FIG. 3A). According to an embodiment, the display controller 350 may transmit
the third frame to the display 360 during scanning (or display) of the first frame
on the display 360. According to an embodiment, the operation of transmitting the
third frame to the display 360 at the output time of the tearing signal by the display
controller 350 will be described in more detail with reference to FIG. 8A.
[0086] According to an embodiment, when the second frame is not identified as a frame for
updating the layer corresponding to the ROI of the first frame (no in operation 425),
the third processor 322 (e.g., the display driver 383) may identify that transmission
of the second frame to the display 360 through the display controller 350 will not
cause screen tearing in operation 431.
[0087] According to an embodiment, in operation 433, the display controller 350 may transmit
the second frame to the display 360 at an output time of the tearing signal. According
to an embodiment, the display controller 350 may transmit the second frame to the
display 360 during the scanning (or display) of the first frame on the display 360.
According to an embodiment, the second frame may be written to the memory 233 at the
output time of the tearing signal.
[0088] Consequently, screen tearing may be avoided when updating only a part of the screen.
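The FIG. 4B alternative (operations 425 to 433) may be sketched as follows: when tearing is expected, a full-frame update (the third frame) is requested instead of delaying the partial one. The function and return labels are hypothetical names introduced for illustration.

```python
def choose_frame_to_send(second_frame_updates_roi: bool):
    """Pick which frame the display controller sends at the tearing signal."""
    if second_frame_updates_roi:
        # Operation 429: request a third frame updating the entire first frame,
        # since a full update does not cause screen tearing.
        return "third_frame_full_update"
    # Operation 433: the partial second frame is safe to send as-is.
    return "second_frame_partial_update"

assert choose_frame_to_send(True) == "third_frame_full_update"
assert choose_frame_to_send(False) == "second_frame_partial_update"
```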
[0089] FIG. 5 is a timing diagram illustrating a process of transmitting, writing, and scanning
a frame after adjusting an output time of a tearing signal according to a comparative
embodiment.
[0090] According to the comparative embodiment, when identifying delay of a first frame,
the display driver may not transmit the first frame to the display controller at an
output time of a second tearing signal (not shown) (e.g., a tearing signal after a
first tearing signal). For example, when failing to receive the first frame during
the output time of the first tearing signal (not shown) to the output time of the
second tearing signal (not shown) (e.g., the tearing signal after the first tearing
signal), the display driver may identify that there is a delay in the first frame.
According to the comparative embodiment, when the frame delay occurs, the display
driver may transmit the first frame to the display controller at an output time of
a third tearing signal (not shown) (e.g., a tearing signal output after the second
tearing signal).
[0091] To address this, according to the comparative embodiment, the display driver may
adjust the output time of the tearing signal such that the tearing signal is output
at a time adjusted from an output time of a vertical synchronization signal by a
specified time T1. Therefore, according to the comparative embodiment, the electronic
device may relatively reduce the occurrence of a frame delay. FIG. 5 illustrates an output
time 561 of the tearing signal from the display before adjustment, and output times
511, 512, 513 of the tearing signal, which have been adjusted by the specified time
T1.
[0093] Referring to FIG. 5, the display driver may wait to receive a first frame until the
second time 512 of a tearing signal 560 output from the display after the first time
513. According to the comparative embodiment, the display driver may receive the first frame rendered
by the GPU between the first time 511 and the second time 512 based on a synchronization
signal 540. For example, the first time 511, the second time 512, and the third time
513 may refer to times when the tearing signal is output from the display and rises.
The display driver may transmit the first frame to the display controller between
the first time 511 and the second time 512.
[0094] According to the comparative embodiment, the display controller may transmit the
first frame to the display at the second time 512. When the first frame is written
to the memory, the display may read the first frame from the memory and scan (or display)
the first frame.
[0095] According to the comparative embodiment, the display driver may wait to receive a
second frame until the third time 513 after the second time 512. According to the
comparative embodiment, the display driver may receive the second frame rendered by
the GPU based on the synchronization signal 540. For example, the second frame may
be a frame for updating some layers of the first frame.
[0096] According to the comparative embodiment, the electronic device may experience screen
tearing of the display in a time interval 510 in which a time interval during which
the second frame is written at least partially overlaps with a time interval during
which the first frame is scanned (or displayed).
[0097] FIG. 6A is a flowchart illustrating a method of operating an electronic device to
prevent screen tearing according to an embodiment.
[0098] According to an embodiment, when the third processor 322 (e.g., the third processor
322 of FIG. 3A) (e.g., the display driver 383) (e.g., the display driver 383 of FIG.
3B) identifies that the second frame is for updating a layer corresponding to an ROI
in a part of a first frame (e.g., a bottom region of the first frame), the third processor
322 may identify that transmission of the second frame to the display 360 (e.g., the
display 360 of FIG. 3A) through the display controller 350 (e.g., the display controller
350 of FIG. 3A) based on the tearing signal will cause (or is expected to cause) screen
tearing in operation 605. According to an embodiment, the second frame may refer to
a frame rendered by the second processor 321 (e.g., the second processor 321 of FIG.
3A) after the first frame. According to an embodiment, the second frame may include
only some layers rather than all layers. According to an embodiment, screen tearing
may mean that a time interval during which an ROI in a part of the first frame is
scanned (or displayed) on the display 360 coincides at least partially with a time
interval during which the second frame is transmitted to the display 360 (or a time
interval during which the second frame is written to the memory 233 (e.g., the memory
233 in FIG. 2)).
[0099] According to an embodiment, in operation 607, the display controller 350 may transmit
the second frame to the display 360 after a specified time (e.g., 2ms) from an output
time of the tearing signal. For example, transmitting the second frame to the display
360 by the display controller 350 may include writing the second frame to the memory
233 by the display 360. According to an embodiment, the specified time may mean a
time between the output time of the tearing signal and a time when the first frame
begins to be scanned (or displayed) on the display 360. According to an embodiment,
the specified time may be set such that the time interval during which the ROI in
the part of the first frame is scanned (or displayed) on the display 360 does not
coincide with the time interval during which the second frame for updating the layer
corresponding to the ROI is transmitted to the display 360. However, this is an example,
and the specified time may be set based on various methods in embodiments of the disclosure.
[0100] According to an embodiment, the electronic device 301 (e.g., the electronic device
301 of FIG. 3A) may ensure that the time interval during which the ROI in the part
of the first frame is scanned (or displayed) on the display 360 does not coincide
with the time interval during which the second frame for updating the layer corresponding
to the ROI is transmitted to the display 360 (or the time interval during which the
second frame is written), thereby avoiding screen tearing, when updating only a part
of the screen.
[0101] FIG. 6B is a timing diagram illustrating a process of transmitting, writing, and
scanning a frame according to an embodiment.
[0102] According to an embodiment, the display driver 383 (e.g., the display driver 383
of FIG. 3B) may wait to receive a first frame until a second time 612 after a first
time 611 of a tearing signal 660 output from the display 360 (e.g., the display 360
of FIG. 3A). According to an embodiment, the display driver 383 may receive the first
frame rendered by the second processor 321 (e.g., the second processor 321 of FIG.
3A) at the second time 612 based on a synchronization signal 640. For example, the
synchronization signal 640 (e.g., a Vsync signal) may be a clock signal including
a rising time (e.g., a rising edge time) and a falling time (e.g., a falling edge
time). The tearing signal 660 may have the same period as the synchronization signal
640 (e.g., the Vsync signal). According to an embodiment, the tearing signal may be
a clock signal that rises at a rising time (e.g., a rising edge time) of the synchronization
signal and falls at a falling time (e.g., a falling edge time) of the synchronization
signal. According to an embodiment, a time when the tearing signal rises may be an
output time of the tearing signal. For example, the first time 611, the second time
612, and a third time 613 may refer to times (e.g., rising edge times) when the tearing
signal 660 is output from the display 360 and rises. The display driver 383 may transmit
the first frame to the display controller 350 between the first time 611 and the second
time 612.
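The clock relationship described in paragraph [0102] — a tearing signal that shares its period with the synchronization signal, whose rising edges are the output times (e.g., the first, second, and third times) and whose falling edges trigger scan-out — may be sketched as a minimal illustrative model. The function name, the 60 Hz period, and the duty fraction below are assumptions for illustration only, not part of the disclosure.

```python
def tearing_signal_edges(period_ms: float, duty: float, n_cycles: int):
    """Return (rising, falling) edge times of a tearing signal that has the
    same period as the Vsync signal: it rises at each Vsync rising edge and
    falls a fixed fraction (duty) into each cycle."""
    rising = [i * period_ms for i in range(n_cycles)]
    falling = [r + duty * period_ms for r in rising]
    return rising, falling

# At an assumed 60 Hz refresh, the period is ~16.67 ms.  The rising edges
# model the "output times" of the tearing signal; the driver transmits the
# first frame between the first and second times, and the display scans the
# frame out starting at the falling edge of each cycle.
rising, falling = tearing_signal_edges(period_ms=16.67, duty=0.25, n_cycles=3)
first_time, second_time, third_time = rising
```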
[0103] According to an embodiment, the display controller 350 (e.g., the display controller
350 of FIG. 3A) may transmit the first frame to the display 360 at the second time
612. For example, the display 360 may write the first frame to the memory 233 (e.g.,
the memory 233 of FIG. 2) at the second time 612.
[0104] According to an embodiment, the display 360 may read the first frame from the memory
233 and scan (or display) the first frame at a falling time (e.g., a falling edge
time) of the tearing signal 660.
[0105] According to an embodiment, the display driver 383 may wait to receive a second frame
until the third time 613 after the second time 612. According to an embodiment, the
display driver 383 may receive the second frame rendered by the second processor 321
based on the synchronization signal 640. For example, the second frame may be a frame
for updating some layers of the first frame.
[0106] According to an embodiment, the display driver 383 may identify whether the second
frame is for updating a layer corresponding to an ROI (e.g., a bottom region) of the
first frame. According to an embodiment, when identifying that the second frame is
for updating the layer corresponding to the ROI (e.g., the bottom region) of the first
frame, the display controller 350 may transmit the second frame to the display 360
after a specified time T2 from the third time 613. For example, the specified time
T2 may be set such that a time interval during which the ROI in a part of the first
frame is scanned (or displayed) on the display 360 does not coincide with a time interval
during which the second frame for updating the layer corresponding to the ROI is transmitted
to the display 360. For example, the specified time T2 may be set to 2ms.
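The condition that the specified time T2 is set to satisfy — the ROI scan interval and the second-frame transmission interval must not coincide — may be sketched as a simple interval check. All names and the example timings below are illustrative assumptions; only the non-overlap rule itself comes from the description above.

```python
def intervals_overlap(a, b):
    """True if half-open time intervals (start_ms, end_ms) overlap."""
    return a[0] < b[1] and b[0] < a[1]

def delayed_transmit_window(tearing_output_ms, t2_ms, transmit_len_ms):
    """Window in which the second frame is transmitted, delayed by T2
    after the tearing-signal output time (the third time above)."""
    start = tearing_output_ms + t2_ms
    return (start, start + transmit_len_ms)

# Assumed example: the ROI (bottom region) of the first frame is scanned
# during (3.0 ms, 5.0 ms) after a tearing-signal output at 0 ms.  T2 is
# chosen so that the transmit window starts only after the ROI scan ends.
roi_scan = (3.0, 5.0)
window = delayed_transmit_window(tearing_output_ms=0.0, t2_ms=5.0,
                                 transmit_len_ms=2.0)
assert not intervals_overlap(roi_scan, window)  # no tearing in the ROI
```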
[0107] According to an embodiment, the display controller 350 may transmit the second frame
to the display 360. For example, the display controller 350 may cause the display
360 to write the second frame to the memory 233, as at least a part of transmitting
the second frame. According to an embodiment, the display 360 may read the second
frame from the memory 233 and scan (or display) the second frame at a falling time
(e.g., a falling edge time) of the tearing signal 660.
[0108] According to the above-described method, when expecting screen tearing of the display
360, the display controller 350 may adjust the transmission time of the second frame.
Therefore, the display controller 350 may prevent screen tearing of the display 360.
[0109] FIG. 7A is a flowchart illustrating a method of operating an electronic device to
prevent screen tearing according to an embodiment.
[0110] According to an embodiment, when identifying that a second frame is for updating
a layer corresponding to an ROI (e.g., a bottom region of the first frame) in a part
of the first frame, the third processor 322 (e.g., the third processor 322 of FIG.
3A) (e.g., the display driver 383) may identify, in operation 705, that transmission
of the second frame to the display 360 through the display controller 350 (e.g., the
display controller 350 of FIG. 3A) based on a tearing signal will cause (or is expected
to cause) screen tearing.
[0111] According to an embodiment, in operation 707, the third processor 322 (e.g., the
display driver 383) may transmit the second frame to the display controller 350 after
a specified time (e.g., 2ms) from an output time of the tearing signal. According
to an embodiment, the specified time may mean the time between the output time of
the tearing signal and a time when the first frame begins to be scanned (or displayed)
on the display 360 (e.g., the display 360 of FIG. 3A). According to an embodiment,
the specified time may be set such that a time interval during which the ROI in the
part of the first frame is scanned on the display 360 does not coincide with a time
interval during which the second frame for updating a layer corresponding to the ROI
is transmitted to the display 360.
[0112] According to an embodiment, in operation 709, the display controller 350 may transmit
the second frame to the display 360.
[0113] FIG. 7B is a timing diagram illustrating a process of transmitting, writing, and
scanning a frame according to an embodiment.
[0114] According to an embodiment, the display driver 383 (e.g., the display driver 383
of FIG. 3B) may wait to receive a first frame until a second time 712 after a first
time 711 of a tearing signal 760 output from the display 360 (e.g., the display 360
of FIG. 3A). According to an embodiment, the display
driver 383 may receive the first frame rendered by the second processor 321 (e.g.,
the second processor 321 of FIG. 3A) at the second time 712 based on a synchronization
signal 740. For example, the synchronization signal 740 may be a vertical synchronization
signal. For example, the first time 711, the second time 712, and a third time 713
may refer to times when the tearing signal 760 is output from the display 360 and
rises. The display driver 383 may transmit the first frame to the display controller
350 between the first time 711 and the second time 712.
[0115] According to an embodiment, the display controller 350 (e.g., the display controller
350 of FIG. 3A) may transmit the first frame to the display 360 at the second time
712. The display 360 may write the first frame to the memory 233 (e.g., the memory
233 of FIG. 2) at the second time 712.
[0116] According to an embodiment, the display 360 may read the first frame from the memory
233 and scan (or display) the first frame at a falling time of the tearing signal
760.
[0117] According to an embodiment, the display driver 383 may wait to receive a second frame
until the third time 713 after the second time 712. According to an embodiment, the
display driver 383 may receive the second frame rendered by the second processor 321
based on the synchronization signal 740. For example, the second frame may be a frame
for updating some layers of the first frame.
[0118] According to an embodiment, when identifying that the second frame is for updating
a layer corresponding to an ROI (e.g., a bottom region) of the first frame, the display
driver 383 may transmit the second frame to the display controller 350 after a specified
time T2 from the third time 713. For example, the specified time T2 may be set such
that a time interval during which the ROI in a part of the first frame is scanned
(or displayed) on the display 360 does not coincide with a time interval during which
the second frame for updating a layer corresponding to the ROI is transmitted to the
display 360. For example, the specified time T2 may be set to 2ms.
[0119] According to an embodiment, the display controller 350 may transmit the second frame
to the display 360. For example, as at least a part of transmitting the second frame,
the display 360 may write the second frame to the memory 233.
[0120] According to an embodiment, the display 360 may read the second frame from the memory
233 and scan (or display) the second frame at a falling time of the tearing signal
760.
[0121] According to the above-described method, when expecting screen tearing, the display
driver 383 may adjust the transmission time of the second frame to the display controller
350. Therefore, the electronic device 301 (e.g., the electronic device 301 of FIG.
3A) may prevent the display 360 from experiencing screen tearing.
[0122] FIG. 8A is a flowchart illustrating a method of operating a display controller to
prevent screen tearing according to an embodiment.
[0123] According to an embodiment, in operation 805, when the third processor 322 (e.g.,
the third processor 322 of FIG. 3A) (e.g., the display driver 383) identifies that
a second frame is for updating a layer corresponding to an ROI in a part of a first
frame, the third processor 322 may identify that transmission of the second frame to
the display 360 (e.g., the display 360 of FIG. 3A) through the display controller
350 based on a tearing signal will cause (or is expected to cause) screen tearing.
[0124] According to an embodiment, in operation 807, the third processor 322 (e.g., the
display driver 383) may request a third frame for updating the entire first frame
from the second processor 321 (e.g., the second processor 321 of FIG. 3A). According
to an embodiment, updating the entire first frame, as opposed to updating a part thereof,
may not cause screen tearing. Accordingly, when expecting screen tearing, the third
processor 322 (e.g., the display driver 383) may request the third frame for updating
the entire first frame from the second processor 321. According to an embodiment,
the second processor 321 may transmit the rendered third frame to the third processor
322 (e.g., the display driver 383).
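The fallback of operation 807 — requesting a third frame that updates the entire first frame when a partial update would touch the ROI — may be sketched as follows. The layer names and the set-based check are illustrative assumptions; only the decision rule (partial update in the ROI implies expected tearing, so fall back to a full update) comes from the description above.

```python
def choose_update(updated_layers, roi_layers):
    """Return 'full' when any updated layer falls within the ROI (screen
    tearing is expected for a partial update there), else 'partial'."""
    if set(updated_layers) & set(roi_layers):
        return "full"      # request a third frame updating the entire first frame
    return "partial"       # the second frame can be transmitted as-is

# Assumed example: a status-bar layer in the bottom region overlaps the ROI,
# so a full-frame (third-frame) update is requested instead.
assert choose_update(updated_layers={"status_bar"},
                     roi_layers={"status_bar", "nav_bar"}) == "full"
assert choose_update(updated_layers={"clock"},
                     roi_layers={"status_bar"}) == "partial"
```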
[0125] According to an embodiment, in operation 809, the third processor 322 (e.g., the
display driver 383) may receive the third frame from the second processor 321. According
to an embodiment, the third processor 322 may transmit the third frame to the display
controller 350.
[0126] According to an embodiment, in operation 811, the display controller 350 may transmit
the third frame to the display 360 at an output time of the tearing signal. According
to an embodiment, the display 360 may write the third frame to the memory 233 (e.g.,
the memory 233 of FIG. 2) at the output time of the tearing signal. According to an
embodiment, the display 360 may scan (or display) the third frame. According to an
embodiment, the display controller 350 may transmit the third frame to the display
360 during the scanning (or display) of the first frame on the display 360.
[0127] According to an embodiment, the electronic device 301 (e.g., the electronic device
301 of FIG. 3A) may write a top region (or a middle region) of the third frame to
the memory 233 during scanning (or display) of the bottom region of the first frame
on the display 360 to prevent screen tearing of an image displayed on the display
360.
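The region argument of paragraph [0127] — that the top (or middle) region of the third frame may be written while the bottom region of the first frame is still being scanned — may be sketched as a line-range check: writing is safe because the scan-out position never reads a line that is being overwritten. The panel height and region boundaries below are illustrative assumptions.

```python
def safe_to_write(write_region, scan_region):
    """Regions are (first_line, last_line) ranges of the panel.  Writing is
    safe when the written lines lie strictly above the lines being scanned."""
    return write_region[1] < scan_region[0]

PANEL_LINES = 2400                                 # assumed panel height in lines
top = (0, PANEL_LINES // 3 - 1)                    # top third of the third frame
bottom = (2 * PANEL_LINES // 3, PANEL_LINES - 1)   # bottom third of the first frame

assert safe_to_write(top, bottom)         # no tearing: write above the scan lines
assert not safe_to_write(bottom, bottom)  # overwriting the scanned region would tear
```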
[0128] FIG. 8B is a timing diagram illustrating a process of transmitting, writing, and
scanning a frame according to an embodiment.
[0129] According to an embodiment, the display driver 383 (e.g., the display driver 383
of FIG. 3B) may wait to receive a first frame until a second time 822 after a first
time 821 of a tearing signal 860 output from the display 360 (e.g., the display 360
of FIG. 3A). According to an embodiment, the display driver 383 may receive the
first frame rendered by the second processor 321 (e.g., the second processor 321 of
FIG. 3A) at the second time 822 based on a synchronization signal 840. For example,
the synchronization signal 840 may be a vertical synchronization signal. For example,
the first time 821, the second time 822, and a third time 823 may refer to times when
the tearing signal 860 is output from the display 360 and rises. The display driver
383 may transmit the first frame to the display controller 350 between the first time
821 and the second time 822.
[0130] According to an embodiment, the display controller 350 (e.g., the display controller
350 of FIG. 3A) may transmit the first frame to the display 360 at the second time
822.
[0131] According to an embodiment, the display 360 may read the first frame from the memory
233 and scan (or display) the first frame at a falling time of the tearing signal
860.
[0132] According to an embodiment, the display driver 383 may wait to receive a next frame
(e.g., a second frame or a third frame) for updating the first frame until a third
time 823 after the second time 822.
[0133] According to an embodiment, the display driver 383 may receive the second frame rendered
by the second processor 321 based on the synchronization signal 840. For example,
the second frame may be a frame for updating some layers of the first frame.
[0134] According to an embodiment, when identifying that the second frame is a frame for
updating some layers of the first frame, the display driver 383 may request the third
frame for updating the entire first frame from the second processor 321.
[0135] According to an embodiment, the display controller 350 may transmit the third frame
to the display 360 at the third time 823. For example, the display 360 may write the
third frame to the memory 233 at the third time 823. According to an embodiment, the
display 360 may read the third frame from the memory 233 and scan (or display) the
third frame at a falling time of the tearing signal 860.
[0136] According to an embodiment, the display controller 350 may transmit a top region
(or middle region) of the third frame to the display 360 during scanning (or display)
of a bottom region of the first frame on the display 360, such that the top region
(or middle region) of the third frame may be written to the memory 233 by the display
360.
[0137] According to the above method, when expecting screen tearing, the display driver
383 may transmit a third frame for updating an entire first frame to the display 360
through the display controller 350. Therefore, the display driver 383 may prevent
screen tearing of the display 360.
[0138] An electronic device according to an embodiment may include memory, a processor,
a display controller, and a display.
[0139] The memory may store at least one instruction configured to cause, when executed
by the processor, the electronic device to, according to an embodiment, transmit a
first frame to the display controller based on a first tearing signal output at a
first output time from the display.
[0140] The memory may store at least one instruction configured to cause, when executed
by the processor, the electronic device to, according to an embodiment, obtain a second
frame for updating a portion of the plurality of layers related to the first frame.
[0141] The memory may store at least one instruction configured to cause, when executed
by the processor, the electronic device to, according to an embodiment, based on identifying
that the portion of the plurality of layers corresponds to a region of interest (ROI)
of the first frame, identify that screen tearing occurs when the second frame is transmitted
to the display through the display controller.
[0142] In the electronic device according to an embodiment, the second frame may not be
written to memory included in the display while the first frame is scanned on the
display, based on identifying that the screen tearing occurs.
[0143] The memory may store at least one instruction configured to cause, when executed
by the processor, the electronic device to, according to an embodiment, transmit the
second frame to the display controller after the first frame is scanned on the display,
based on identifying that the screen tearing occurs.
[0144] The memory may store at least one instruction configured to cause, when executed
by the display controller, the electronic device to, according to an embodiment, transmit
the second frame to the display after the first frame is scanned on the display, based
on identifying that the screen tearing occurs.
[0145] The memory may store at least one instruction configured to cause, when executed
by the display controller, the electronic device to, according to an embodiment, transmit
the second frame to the display after a first specified time from a second output
time of a second tearing signal after the first tearing signal, based on identifying
that the screen tearing occurs.
[0146] The memory may store at least one instruction configured to cause, when executed
by the display controller, the electronic device to, according to an embodiment, write
the second frame to the memory of the display after a second specified time from a
second output time of a second tearing signal after the first tearing signal.
[0147] In the electronic device according to an embodiment, the ROI may include a bottom
region of the first frame displayed on the display.
[0148] The memory may store at least one instruction configured to cause, when executed
by the processor, the electronic device to, according to an embodiment, transmit the
second frame to the display controller after a third specified time from a second
output time of a second tearing signal after the first tearing signal, based on identifying
that the screen tearing occurs.
[0149] The memory may store at least one instruction configured to cause, when executed
by the processor, the electronic device to, according to an embodiment, obtain a third
frame for updating the plurality of layers related to the first frame, based on identifying
that the screen tearing occurs.
[0150] The memory may store at least one instruction configured to cause, when executed
by the processor, the electronic device to, according to an embodiment, transmit the
third frame to the display controller at a second output time of a second tearing
signal after the first tearing signal.
[0151] The memory may store at least one instruction configured to cause, when executed
by the display controller, the electronic device to, according to an embodiment, transmit
the third frame to the display during scanning of the first frame on the display.
[0152] A method of operating an electronic device according to an embodiment may include
transmitting a first frame to a display controller included in the electronic device
based on a first tearing signal output at a first output time from the display.
[0153] The method of operating the electronic device according to an embodiment may include
obtaining a second frame for updating a portion of the plurality of layers related
to the first frame.
[0154] The method of operating the electronic device according to an embodiment may include,
based on identifying that the portion of the plurality of layers corresponds to a
region of interest (ROI) of the first frame, identifying that screen tearing occurs
when the second frame is transmitted to the display through the display controller.
[0155] In the method of operating the electronic device according to an embodiment, the
second frame may not be written to memory included in the display while the first
frame is scanned on the display, based on identifying that the screen tearing occurs.
[0156] The method of operating the electronic device according to an embodiment may include
transmitting the second frame to the display controller after the first frame is scanned
on the display, based on identifying that the screen tearing occurs.
[0157] The method of operating the electronic device according to an embodiment may include
transmitting the second frame to the display after the first frame is scanned on the
display, based on identifying that the screen tearing occurs, by the display controller.
[0158] The method of operating the electronic device according to an embodiment may include
transmitting, by the display controller, the second frame to the display after a first
specified time from a second output time of a second tearing signal, based on identifying
that the screen tearing occurs.
[0159] The method of operating the electronic device according to an embodiment may include
writing the second frame to the memory of the display after a second specified time
from a second output time of a second tearing signal after the first tearing signal,
by the display controller.
[0160] In the method of operating the electronic device according to an embodiment, the
ROI may include a bottom region of the first frame displayed on the display.
[0161] The method of operating the electronic device according to an embodiment may include
transmitting the second frame to the display controller after a third specified time
from a second output time of a second tearing signal, based on identifying that the
screen tearing occurs.
[0162] The method of operating the electronic device according to an embodiment may include
obtaining a third frame for updating the plurality of layers related to the first
frame, based on identifying that the screen tearing occurs.
[0163] The method of operating the electronic device according to an embodiment may include
transmitting the third frame to the display controller at a second output time of
a second tearing signal after the first tearing signal.
[0164] The method of operating the electronic device according to an embodiment may include
transmitting, by the display controller, the third frame to the display during scanning
of the first frame on the display.
[0165] A non-transitory computer readable medium storing at least one instruction, wherein
the at least one instruction, when executed by at least one processor of an electronic
device, may cause the electronic device to, according to an embodiment, transmit a
first frame to a display controller included in an electronic device based on a first
tearing signal output at a first output time from the display.
[0166] The non-transitory computer readable medium storing at least one instruction, wherein
the at least one instruction, when executed by at least one processor of an electronic
device, may cause the electronic device to, according to an embodiment, obtain a second
frame for updating a portion of the plurality of layers related to the first frame.
[0167] The non-transitory computer readable medium storing at least one instruction, wherein
the at least one instruction, when executed by at least one processor of an electronic
device, may cause the electronic device to, according to an embodiment, based on identifying
that the portion of the plurality of layers corresponds to a region of interest (ROI)
of the first frame, identify that screen tearing occurs when the second frame is
transmitted to the display through the display controller.
[0168] In the non-transitory computer readable medium storing the at least one instruction
according to an embodiment, the second frame may not be written to memory included
in the display while the first frame is scanned on the display, based on identifying
that the screen tearing occurs.
The electronic device according to various embodiments may be one of various types
of electronic devices. The electronic devices may include, for example, a portable
communication device (e.g., a smartphone), a computer device, a portable multimedia
device, a portable medical device, a camera, a wearable device, or a home appliance.
According to an embodiment of the disclosure, the electronic devices are not limited
to those described above.
[0169] It should be appreciated that various embodiments of the present disclosure and the
terms used therein are not intended to limit the technological features set forth
herein to particular embodiments and include various changes, equivalents, or replacements
for a corresponding embodiment. With regard to the description of the drawings, similar
reference numerals may be used to refer to similar or related elements. It is to be
understood that a singular form of a noun corresponding to an item may include one
or more of the things, unless the relevant context clearly indicates otherwise.
[0170] As used herein, each of such phrases as "A or B", "at least one of A and B", "at
least one of A or B", "A, B, or C", "at least one of A, B, and C", and "at least one
of A, B, or C", may include any one of, or all possible combinations of the items
enumerated together in a corresponding one of the phrases. As used herein, such terms
as "1st" and "2nd", or "first" and "second" may be used to simply distinguish a corresponding
component from another, and do not limit the components in other aspects (e.g., importance
or order). It is to be understood that if an element (e.g., a first element) is referred
to, with or without the term "operatively" or "communicatively", as "coupled with",
"coupled to", "connected with", or "connected to" another element (e.g., a second
element), it means that the element may be coupled with the other element directly
(e.g., wiredly), wirelessly, or via a third element.
[0171] As used in connection with various embodiments of the disclosure, the term "module"
may include a unit implemented in hardware, software, or firmware, and may interchangeably
be used with other terms, for example, logic, logic block, part, or circuitry. A module
may be a single integral component, or a minimum unit or part thereof, adapted to
perform one or more functions. For example, according to an embodiment, the module
may be implemented in a form of an application-specific integrated circuit (ASIC).
[0172] Various embodiments as set forth herein may be implemented as software (e.g., the
program 140) including one or more instructions that are stored in a storage medium
(e.g., internal memory 136 or external memory 138) that is readable by a machine (e.g.,
the electronic device 101, 201, or 301). For example, a processor (e.g., the processor
120 or 320) of the machine (e.g., the electronic device 101, 201 or 301) may invoke
at least one of the one or more instructions stored in the storage medium, and execute
it, with or without using one or more other components under the control of the processor.
This allows the machine to be operated to perform at least one function according
to the at least one instruction invoked. The one or more instructions may include
a code generated by a compiler or a code executable by an interpreter. The machine-readable
storage medium may be provided in the form of a non-transitory storage medium. Herein,
the term "non-transitory" simply means that the storage medium is a tangible device,
and does not include a signal (e.g., an electromagnetic wave), but this term does
not differentiate between where data is semi-permanently stored in the storage medium
and where the data is temporarily stored in the storage medium.
[0173] According to an embodiment, a method according to various embodiments of the disclosure
may be included and provided in a computer program product. The computer program product
may be traded as a product between a seller and a buyer. The computer program product
may be distributed in the form of a machine-readable storage medium (e.g., compact
disc read only memory (CD-ROM)), or be distributed (e.g., downloaded or uploaded)
online via an application store (e.g., PlayStore™), or between two user devices (e.g.,
smart phones) directly. If distributed online,
at least part of the computer program product may be temporarily generated or at least
temporarily stored in the machine-readable storage medium, such as memory of the manufacturer's
server, a server of the application store, or a relay server.
[0174] According to various embodiments, each component (e.g., a module or a program) of
the above-described components may include a single entity or multiple entities, and
some of the multiple entities may be separately disposed in different components.
According to various embodiments, one or more of the above-described components may
be omitted, or one or more other components may be added. Alternatively or additionally,
a plurality of components (e.g., modules or programs) may be integrated into a single
component. In such a case, according to various embodiments, the integrated component
may still perform one or more functions of each of the plurality of components in
the same or similar manner as they are performed by a corresponding one of the plurality
of components before the integration. According to various embodiments, operations
performed by the module, the program, or another component may be carried out sequentially,
in parallel, repeatedly, or heuristically, or one or more of the operations may be
executed in a different order or omitted, or one or more other operations may be added.