Optimizing rendering for a web application requires knowing how frames get to the screen and what limits the frame rate (FPS).
A display can run at 60 Hz, 120 Hz, 144 Hz, or some other frequency, known as its refresh rate.
60 Hz means the display refreshes the image it shows 60 times per second. So we can put up at most 60 new images each second (60 fps); any extra images won't be displayed. If we have fewer images, say 30 fps, then two consecutive refreshes will display the same frame.
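The arithmetic above can be sketched in a few lines. The function names here are made up for illustration; this is just the refresh-rate math, not any browser API:

```javascript
// Time budget for one refresh at a given refresh rate, in milliseconds.
function frameBudgetMs(refreshHz) {
  return 1000 / refreshHz;
}

// Given a display refresh rate and a (lower or equal) source frame rate,
// how many consecutive refreshes show the same frame?
function refreshesPerFrame(refreshHz, fps) {
  return Math.ceil(refreshHz / fps);
}

console.log(frameBudgetMs(60).toFixed(2)); // "16.67" ms per refresh
console.log(refreshesPerFrame(60, 30));    // 2: each frame shown on two refreshes
console.log(refreshesPerFrame(60, 60));    // 1: a new frame every refresh
```

So at 60 Hz, anything that takes longer than about 16.67 ms to produce a frame misses a refresh, and the previous frame is shown again.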
The question I have been wondering about is: what if I'm just looking at a static web page without triggering any events on it?
Is the GPU still producing 60 fps of the exact same image, or does it produce no image at all?
If we take a look at Wikipedia's definition of a GPU:
A graphics processing unit (GPU), occasionally called visual processing unit (VPU), is a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display device.
I suppose the display refreshes its screen and shows images from the frame buffer.
If no events are triggered, then the GPU does not need to alter the frame buffer.
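That intuition, that the buffer is only rewritten when something changes while the display scans it out on every refresh, can be expressed as a toy model. Everything here (the class, the method names) is invented for illustration; real compositors are far more involved:

```javascript
// Toy model: the "GPU" redraws the frame buffer only when an event has
// produced a new frame; the display reads the buffer on every refresh.
class ToyDisplay {
  constructor() {
    this.frameBuffer = "frame-0"; // whatever is currently on screen
    this.pendingFrame = null;     // set when an event produces new content
    this.redraws = 0;             // times the GPU actually rewrote the buffer
    this.scans = 0;               // times the display read the buffer
  }

  // An event (click, animation, DOM change) requests a new frame.
  triggerEvent(frame) {
    this.pendingFrame = frame;
  }

  // One refresh cycle: redraw only if something changed, then scan out.
  refresh() {
    if (this.pendingFrame !== null) {
      this.frameBuffer = this.pendingFrame;
      this.pendingFrame = null;
      this.redraws += 1;
    }
    this.scans += 1; // the display reads the buffer regardless
    return this.frameBuffer;
  }
}

const d = new ToyDisplay();
d.refresh();               // static page: refresh, but no redraw
d.refresh();               // same again
d.triggerEvent("frame-1");
d.refresh();               // event happened: buffer rewritten once
console.log(d.scans);      // 3 refreshes
console.log(d.redraws);    // only 1 redraw
```

Three refreshes happened, but the buffer was only rewritten once, which is the static-page case in miniature.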
What is the frame buffer anyway?
Well, going back to Wikipedia, we find that it is:
A framebuffer (frame buffer, or sometimes framestore) is a portion of RAM containing a bitmap that is used to refresh a video display from a memory buffer containing a complete frame of data.
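To make "a portion of RAM containing a bitmap" concrete, here is a minimal in-memory framebuffer: a flat byte array holding one complete frame. The dimensions and pixel layout are arbitrary choices for this sketch (4 bytes per pixel, RGBA):

```javascript
// A tiny framebuffer: one complete frame as a bitmap in memory.
const width = 4;
const height = 3;
const bytesPerPixel = 4; // R, G, B, A
const framebuffer = new Uint8ClampedArray(width * height * bytesPerPixel);

// Write one pixel's color at (x, y).
function setPixel(x, y, r, g, b, a) {
  const offset = (y * width + x) * bytesPerPixel;
  framebuffer[offset] = r;
  framebuffer[offset + 1] = g;
  framebuffer[offset + 2] = b;
  framebuffer[offset + 3] = a;
}

setPixel(1, 2, 255, 0, 0, 255);  // one opaque red pixel
console.log(framebuffer.length); // 48 bytes holds the complete 4x3 frame
```

The display hardware repeatedly reads a buffer exactly like this, row by row, on every refresh; whether the contents changed since the last refresh is not its concern.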
It all makes sense now.