This modification introduces a new layer to the painting process. The
stacking context traversal no longer immediately calls the
Gfx::Painter methods. Instead, it writes serialized painting commands
into the newly introduced RecordingPainter. The recorded list of
commands is executed later to produce the resulting bitmap.
Producing a painting command list will make it easier to add new
optimizations:
- It's simpler to check whether a painting result falls outside the
visible viewport at the command level than during stacking context
traversal.
- Painting can run in a separate thread: the painting thread processes
the serialized painting commands, while the main thread works on the
next paintable tree and safely invalidates the previous one.
- As we consider GPU-accelerated painting support, it would be easier
to implement a backend for each painting command than to construct an
alternative for the entire Gfx::Painter API.
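As a rough, self-contained sketch of the recording idea (invented names
and command set, not the actual RecordingPainter API), the traversal
appends plain-data commands that a separate pass replays:

    #include <cstdio>
    #include <type_traits>
    #include <variant>
    #include <vector>

    struct Rect { int x { 0 }, y { 0 }, width { 0 }, height { 0 }; };
    struct Color { unsigned char r { 0 }, g { 0 }, b { 0 }, a { 255 }; };

    // Each painting operation becomes plain data instead of an immediate
    // painter call.
    struct FillRect { Rect rect; Color color; };
    struct DrawText { Rect rect; char const* text; };
    using PaintingCommand = std::variant<FillRect, DrawText>;

    // Stand-in recorder: the stacking context traversal only appends commands.
    class CommandRecorder {
    public:
        void fill_rect(Rect rect, Color color) { m_commands.push_back(FillRect { rect, color }); }
        void draw_text(Rect rect, char const* text) { m_commands.push_back(DrawText { rect, text }); }
        std::vector<PaintingCommand> const& commands() const { return m_commands; }

    private:
        std::vector<PaintingCommand> m_commands;
    };

    // Replaying the list is a separate step that could run later, on another
    // thread, or against a different backend; here it just prints.
    void execute(std::vector<PaintingCommand> const& commands)
    {
        for (auto const& command : commands) {
            std::visit([](auto const& c) {
                using T = std::decay_t<decltype(c)>;
                if constexpr (std::is_same_v<T, FillRect>)
                    std::printf("fill_rect %dx%d\n", c.rect.width, c.rect.height);
                else
                    std::printf("draw_text \"%s\"\n", c.text);
            }, command);
        }
    }

    int main()
    {
        CommandRecorder recorder;
        recorder.fill_rect({ 0, 0, 800, 600 }, { 255, 255, 255, 255 });
        recorder.draw_text({ 10, 10, 100, 20 }, "hello");
        execute(recorder.commands());
    }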
Otherwise, in a simple page such as:

    <video src=...>
    <audio src=...>

the video's clip rect would "leak" to the AudioPaintable, preventing the
audio controls from rendering at all.
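One way to keep a clip rect from outliving the paintable that set it is
an RAII save/restore scope. The sketch below is illustrative only; the
Painter type and method names are invented and are not Gfx::Painter or
the actual fix:

    #include <optional>

    struct Rect { int x { 0 }, y { 0 }, width { 0 }, height { 0 }; };

    class Painter {
    public:
        void set_clip_rect(Rect rect) { m_clip_rect = rect; }
        void clear_clip_rect() { m_clip_rect.reset(); }
        std::optional<Rect> clip_rect() const { return m_clip_rect; }

    private:
        std::optional<Rect> m_clip_rect;
    };

    // Restores the previous clip when the paintable is done, so the next
    // paintable (e.g. the audio element's controls) starts from a clean state.
    class ScopedClip {
    public:
        ScopedClip(Painter& painter, Rect rect)
            : m_painter(painter)
            , m_saved(painter.clip_rect())
        {
            m_painter.set_clip_rect(rect);
        }
        ~ScopedClip()
        {
            if (m_saved.has_value())
                m_painter.set_clip_rect(*m_saved);
            else
                m_painter.clear_clip_rect();
        }

    private:
        Painter& m_painter;
        std::optional<Rect> m_saved;
    };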
This moves the painting of the media timeline out of VideoPaintable into
a base MediaPaintable. This allows reusing the same timeline logic and
controls for audio elements.
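Structurally, the refactor looks roughly like the sketch below; apart
from the MediaPaintable/VideoPaintable class names mentioned above, the
names are placeholders, not the real interface:

    struct PaintContext;

    class MediaPaintable {
    public:
        virtual ~MediaPaintable() = default;

    protected:
        // Timeline and control-bar painting shared by all media elements.
        void paint_control_bar(PaintContext&) const { }
        void paint_timeline(PaintContext&) const { }
    };

    // Video paints decoded frames and then the shared controls...
    class VideoPaintable final : public MediaPaintable { };

    // ...while audio has no frames and paints only the shared controls.
    class AudioPaintable final : public MediaPaintable { };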
This fixes a plethora of rounding problems on many websites.
In the future, we may want to replace this with fixed-point arithmetic
(bug #18566) for performance (and consistency with other engines),
but in the meantime this makes the web look a bit better. :^)
There are many more things that could be converted to doubles, which
would reduce the amount of casting necessary in this patch.
We can do that incrementally, however.
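As a toy illustration of the class of rounding problem (unrelated to the
real CSSPixels code), accumulated fractional values drift far sooner in
single precision than in double precision:

    #include <cstdio>

    int main()
    {
        float accumulated_float = 0.0f;
        double accumulated_double = 0.0;
        for (int i = 0; i < 1'000'000; ++i) {
            accumulated_float += 0.1f;   // e.g. sub-pixel offsets summed during layout
            accumulated_double += 0.1;
        }
        std::printf("float:  %.6f\n", accumulated_float);   // noticeably far from 100000
        std::printf("double: %.6f\n", accumulated_double);  // much closer
    }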
This will be needed by the layout node, which may change what is painted
when the position of the current frame image does not match the
element's current playback time.
When clicking on the media timeline, compute the percentage along the
timeline's width at which the user clicked, and set the playback time to
that same percentage of the video's duration.
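The math amounts to a clamped ratio; a self-contained sketch with
made-up names, not the LibWeb event-handling code:

    #include <algorithm>

    double playback_time_for_click(double timeline_x, double timeline_width,
                                   double click_x, double duration_in_seconds)
    {
        // Fraction of the timeline's width at which the click landed, clamped to [0, 1].
        double ratio = std::clamp((click_x - timeline_x) / timeline_width, 0.0, 1.0);
        // The same fraction of the media's duration becomes the new playback time.
        return ratio * duration_in_seconds;
    }

For example, a click three quarters of the way along the timeline of a
60-second video would seek to 45 seconds.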
When the control bar is shown, do not toggle playback when clicking on
the control bar, unless the click target is the playback button. This
will let us implement clicking on the timeline to seek.
Note that this requires caching some layout rects on the video element.
We need to remember where the corresponding layout boxes are, and we
can't cache them on the layout box itself, as that may be destroyed at
any time.
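A sketch of the resulting hit-testing, using invented names and rects
cached on the element during painting (not the actual handler):

    struct DoublePoint { double x { 0 }, y { 0 }; };

    struct DoubleRect {
        double x { 0 }, y { 0 }, width { 0 }, height { 0 };
        bool contains(DoublePoint p) const
        {
            return p.x >= x && p.x < x + width && p.y >= y && p.y < y + height;
        }
    };

    // Filled in while painting, since the layout/paint boxes these come from
    // may be destroyed at any time.
    struct CachedControlRects {
        DoubleRect control_bar_rect {};
        DoubleRect playback_button_rect {};
        DoubleRect timeline_rect {};
    };

    enum class ClickAction { TogglePlayback, SeekTimeline, Ignore };

    ClickAction action_for_click(CachedControlRects const& rects, DoublePoint position, bool control_bar_shown)
    {
        // Clicks outside the (shown) control bar still toggle playback.
        if (!control_bar_shown || !rects.control_bar_rect.contains(position))
            return ClickAction::TogglePlayback;
        // Within the control bar, only the playback button toggles playback...
        if (rects.playback_button_rect.contains(position))
            return ClickAction::TogglePlayback;
        // ...which frees the timeline up for click-to-seek.
        if (rects.timeline_rect.contains(position))
            return ClickAction::SeekTimeline;
        return ClickAction::Ignore;
    }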
The layout node, and therefore the painting box, is frequently destroyed
and recreated. This causes us to forget the cached mouse position we use
to highlight media controls. Move this cached position to the DOM node
instead, which survives relayout.
The link color was what most closely resembled the color I was going for
on the machine I originally developed the controls on. But it turns out
this is a very dark blue on most Serenity themes. Instead, hard-code the
originally intended color, which is a lighter blue.
Rather than storing static DevicePixels dimensions, treat the desired
pixel sizes as CSSPixels and convert them to DevicePixels.
This was originally developed on a Mac with a device-to-CSS-pixel ratio
of 2. Running it on another machine with a ratio of 1 made the controls
appear huge.
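The conversion is a simple scale by the device pixel ratio; a sketch
with placeholder names, not the real CSSPixels/DevicePixels types:

    // Scale the CSS-pixel size by the device pixel ratio and round.
    int css_to_device_pixels(double css_pixels, double device_pixels_per_css_pixel)
    {
        // 40 CSS pixels -> 80 device pixels at ratio 2, 40 at ratio 1, so the
        // controls occupy the same apparent size on both machines.
        return static_cast<int>(css_pixels * device_pixels_per_css_pixel + 0.5);
    }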
If the video element has a 'controls' attribute, we now paint some basic
video controls over the video element. If no frame has been decoded yet,
we paint a play button in the center of the element.
If a frame has been decoded, we paint that frame and paint a control bar
along the bottom of it. This control bar currently contains only a
play/pause button, which reflects the video's playback state. We will
only paint the control bar if the video is paused or hovered.
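Roughly, the painting decision described above looks like this sketch
(assumed names and stub functions, not the real VideoPaintable):

    struct PaintContext { };

    void paint_current_frame(PaintContext&) { /* blit the most recent decoded frame */ }
    void paint_centered_play_button(PaintContext&) { /* play glyph in the center */ }
    void paint_control_bar(PaintContext&) { /* play/pause button along the bottom */ }

    struct VideoState {
        bool has_controls_attribute { false };
        bool has_decoded_frame { false };
        bool is_paused { true };
        bool is_hovered { false };
    };

    void paint_video_element(PaintContext& context, VideoState const& state)
    {
        if (state.has_decoded_frame)
            paint_current_frame(context);

        if (!state.has_controls_attribute)
            return;

        if (!state.has_decoded_frame) {
            // No frame decoded yet: a play button in the center of the element.
            paint_centered_play_button(context);
            return;
        }

        // A frame exists: paint the control bar along the bottom, but only
        // while the video is paused or hovered.
        if (state.is_paused || state.is_hovered)
            paint_control_bar(context);
    }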