The if statement in the dispatch code implies we are in the idle state,
so of course the active time will always be undefined. If the animation
was cancelled via a call to cancel(), we can save the active time at
that point. Otherwise, just send 0.
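A minimal sketch of the idea; the member and helper names here are
assumptions, not the actual LibWeb code:

```c++
#include <optional>

// Toy model: remember the active time at the moment cancel() runs, since by
// the time the cancel event is dispatched the animation is idle and its
// active time is undefined.
struct Animation {
    std::optional<double> m_saved_cancel_time;

    std::optional<double> active_time() const { return {}; } // nullopt while idle

    void cancel()
    {
        // Capture the active time *before* transitioning to the idle state.
        m_saved_cancel_time = active_time();
        // ... transition to idle and queue the cancel event ...
    }

    void dispatch_cancel_event()
    {
        // Idle state: fall back to the saved value, or 0 if there isn't one.
        double event_time = m_saved_cancel_time.value_or(0.0);
        (void)event_time; // ... create and dispatch the event with event_time ...
    }
};
```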
This patch fixes two issues:
- Animation events that should go to the target element now do
  (some were previously being dispatched on the animation itself).
- We update the "previous phase" and "previous iteration" fields of
  animation effects, so that we can actually detect phase changes
  (see the sketch after this list). This means we stop thinking
  animations have always just started, which previously caused each
  animation to send 60 animationstart events every second (to the
  wrong target!).
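Here is that sketch of the phase-change detection; the enum values and
helper names are assumptions:

```c++
// Toy model: compare the freshly computed phase/iteration against the values
// remembered from the previous update; only a real change queues events.
enum class Phase { Before, Active, After, Idle };

struct AnimationEffect {
    Phase m_previous_phase = Phase::Idle;
    int m_previous_iteration = -1;

    Phase compute_phase() const { return Phase::Active; }      // stand-in
    int compute_iteration() const { return 0; }                // stand-in
    void queue_events_for_transition(Phase, Phase) {}          // stand-in

    void update()
    {
        auto phase = compute_phase();
        auto iteration = compute_iteration();

        // Without the remembered fields, every update looked like a fresh
        // "start", firing animationstart 60 times per second.
        if (phase != m_previous_phase || iteration != m_previous_iteration)
            queue_events_for_transition(m_previous_phase, phase);

        m_previous_phase = phase;
        m_previous_iteration = iteration;
    }
};
```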
Now that the lambda capture plugin isn't full of false positives, we can
make the jump and start halting builds on these errors. This also allows
these plugins to be useful in CI.
Instead of being opt-out with NOESCAPE, it is now opt-in with ESCAPING.
Opt-out is ideal, but unfortunately this was extremely noisy when
compiling the entire codebase. Escaping functions are rarer than non-
escaping ones, so let's just go with that for now.
This also allows us to gradually add heuristics for detecting missing
ESCAPING annotations and emitting them as errors. It also nicely matches
the spelling that Swift uses (@escaping), which is where this idea
originally came from.
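A sketch of what the opt-in annotation might look like at a call site.
The macro expansion shown is an assumption, and std::function stands in
for the real function type:

```c++
#include <functional>
#include <utility>

// Assumed expansion: an attribute the clang plugin can look for.
#define ESCAPING [[clang::annotate("escaping")]]

struct Timer {
    std::function<void()> m_on_timeout;

    // Stores the callback past the end of the call, so the parameter must be
    // explicitly marked ESCAPING; unannotated parameters are assumed
    // non-escaping, and stashing them would be flagged by the plugin.
    void set_on_timeout(ESCAPING std::function<void()> callback)
    {
        m_on_timeout = std::move(callback);
    }
};
```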
This change adds the `width` and `height` properties to
`HTMLVideoElement` and `HTMLSourceElement`. These properties reflect
their respective content attribute values.
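A minimal, self-contained model of what a reflected property boils down
to; the names are illustrative and the parsing is simplified compared to
the HTML spec's rules:

```c++
#include <cstdlib>
#include <optional>
#include <string>

// A reflected IDL attribute simply reads/writes the content attribute of the
// same name.
struct VideoElementModel {
    std::optional<std::string> width_content_attribute;

    unsigned width() const
    {
        if (!width_content_attribute)
            return 0;
        return static_cast<unsigned>(
            std::strtoul(width_content_attribute->c_str(), nullptr, 10));
    }

    void set_width(unsigned value)
    {
        width_content_attribute = std::to_string(value);
    }
};
```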
This returns the secondary target of a mouse event. For `onmouseenter`
and `onmouseover` events, this is the EventTarget the mouse exited
from. For `onmouseleave` and `onmouseout` events, this is the
EventTarget the mouse entered.
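For illustration, a toy sketch of how the two sides pair up when the
pointer moves between elements; the dispatch helper is a stand-in:

```c++
#include <cstdio>

struct EventTarget { char const* name; };

// Stand-in for real event dispatch, just to show where relatedTarget goes.
static void dispatch_mouse_event(char const* type, EventTarget& target, EventTarget* related)
{
    std::printf("%s at %s (relatedTarget: %s)\n", type, target.name,
        related ? related->name : "null");
}

// When the pointer moves from old_target to new_target:
//  - mouseout/mouseleave fire at old_target with relatedTarget = new_target
//  - mouseover/mouseenter fire at new_target with relatedTarget = old_target
static void fire_pointer_transition(EventTarget& old_target, EventTarget& new_target)
{
    dispatch_mouse_event("mouseout", old_target, &new_target);
    dispatch_mouse_event("mouseover", new_target, &old_target);
}
```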
For now, this slot is always 0 (the default value per spec). But once
we start actually processing audio streams, this internal slot should be
updated accordingly.
To determine the color palette, we use the median cut algorithm. While
the implementation is correct, there is obvious room for improvement in
both the median cut algorithm itself and on the encoding side.
This is useful for finding the best-matching color palette for an
existing bitmap. It can be used in PixelPaint, but also in encoders for
older image formats that only support indexed colors, e.g. GIF.
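A self-contained sketch of the median cut idea (illustrative, not the
codebase's actual implementation): repeatedly split the box whose widest
channel range is largest at its median, then average each box into one
palette entry.

```c++
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

struct Rgb { uint8_t r, g, b; };

// Index (0=r, 1=g, 2=b) of the channel with the widest range in `box`.
static int widest_channel(std::vector<Rgb> const& box)
{
    uint8_t lo[3] = { 255, 255, 255 }, hi[3] = { 0, 0, 0 };
    for (auto const& p : box) {
        uint8_t c[3] = { p.r, p.g, p.b };
        for (int i = 0; i < 3; ++i) {
            lo[i] = std::min(lo[i], c[i]);
            hi[i] = std::max(hi[i], c[i]);
        }
    }
    int best = 0;
    for (int i = 1; i < 3; ++i)
        if (hi[i] - lo[i] > hi[best] - lo[best])
            best = i;
    return best;
}

static std::vector<Rgb> median_cut(std::vector<Rgb> pixels, std::size_t palette_size)
{
    std::vector<std::vector<Rgb>> boxes;
    boxes.push_back(std::move(pixels));

    // Split the largest box at the median of its widest channel until we
    // have enough boxes (or can't split any further).
    while (boxes.size() < palette_size) {
        auto largest = std::max_element(boxes.begin(), boxes.end(),
            [](auto const& a, auto const& b) { return a.size() < b.size(); });
        if (largest->size() < 2)
            break;
        int channel = widest_channel(*largest);
        auto key = [channel](Rgb const& p) {
            return channel == 0 ? p.r : channel == 1 ? p.g : p.b;
        };
        std::sort(largest->begin(), largest->end(),
            [&](Rgb const& a, Rgb const& b) { return key(a) < key(b); });
        std::vector<Rgb> upper(largest->begin() + largest->size() / 2, largest->end());
        largest->resize(largest->size() / 2);
        boxes.push_back(std::move(upper));
    }

    // Average each box into a single palette entry.
    std::vector<Rgb> palette;
    for (auto const& box : boxes) {
        unsigned r = 0, g = 0, b = 0;
        for (auto const& p : box) { r += p.r; g += p.g; b += p.b; }
        auto n = box.empty() ? 1 : box.size();
        palette.push_back({ uint8_t(r / n), uint8_t(g / n), uint8_t(b / n) });
    }
    return palette;
}
```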
For example, for 7z7c.gif, we now store one 500x500 frame and then
a 94x78 frame at (196, 208) and a 91x78 frame at (198, 208).
This reduces how much data we have to store.
We currently store all pixels in the bounding rect of the changed
pixels. In the future, we could store pixels within that rect that are
unchanged as transparent pixels. When the inputs are gif files, this
would guarantee that new frames have at most 256 distinct colors (since
GIFs require that), which would help a future color indexing transform.
For now, we don't do that though.
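A sketch of how the rect of changed pixels can be found; this is
illustrative code that assumes frames of equal size:

```c++
#include <algorithm>
#include <cstdint>
#include <vector>

struct Frame {
    int width = 0, height = 0;
    std::vector<uint32_t> pixels; // row-major RGBA, width * height entries

    uint32_t pixel(int x, int y) const { return pixels[y * width + x]; }
};

struct IntRect { int x = 0, y = 0, width = 0, height = 0; };

// Bounding rect of all pixels that differ between two same-sized frames.
static IntRect changed_rect(Frame const& previous, Frame const& current)
{
    int min_x = current.width, min_y = current.height, max_x = -1, max_y = -1;
    for (int y = 0; y < current.height; ++y) {
        for (int x = 0; x < current.width; ++x) {
            if (previous.pixel(x, y) == current.pixel(x, y))
                continue;
            min_x = std::min(min_x, x);
            min_y = std::min(min_y, y);
            max_x = std::max(max_x, x);
            max_y = std::max(max_y, y);
        }
    }
    if (max_x < 0)
        return {}; // frames are identical; nothing to store
    return { min_x, min_y, max_x - min_x + 1, max_y - min_y + 1 };
}
```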
The API I'm adding here is a bit ugly:
* WebPs can only store x/y offsets that are a multiple of 2. This
  currently leaks into the AnimationWriter base class (see the sketch
  after this list).
  (Since we potentially have to make a webp frame 1 pixel wider and
  taller due to this, it's possible for a frame to have <= 256 colors
  in a gif input but > 256 colors in the webp, if we use the technique
  above.)
* Every client writing animations has to have logic to track
previous frames, decide which of the two functions to call, etc.
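Here is that sketch of handling the even-offset constraint; names are
illustrative:

```c++
struct IntRect { int x = 0, y = 0, width = 0, height = 0; };

// Snap the rect's origin down to even coordinates and grow the size so the
// original pixels stay covered; this is what can make a frame 1 pixel wider
// and taller than the gif's dirty rect.
static IntRect aligned_to_even_offsets(IntRect rect)
{
    int dx = rect.x % 2;
    int dy = rect.y % 2;
    return { rect.x - dx, rect.y - dy, rect.width + dx, rect.height + dy };
}
```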
This also adds an opt-out flag to `animation`, because:
1. Some clients apparently assume the size of the last VP8L
chunk is the size of the image
(see https://github.com/discord/lilliput/issues/159).
2. Having incremental frames is good for filesize and for
playing the animation start-to-end, but it makes it hard
to extract arbitrary frames (have to extract all frames
from start to target frame) -- but this is meant to be a
delivery codec, not an editing codec. It's also more vulnerable to
corrupted bytes in the middle of the file -- but transport
protocols are good these days.
(It might also be a good idea to write a full frame every N frames.)
For https://giphy.com/gifs/XT9HMdwmpHqqOu1f1a (a 184K gif),
output webp size goes from 21M to 11M.
For 7z7c.gif (an 11K gif), output webp size goes from 2.1M to 775K.
(The webp image data still isn't compressed at all.)
Truncating the value is mathematically incorrect; this error made the
conversion to grayscale unstable. In other words, calling `to_grayscale`
on a gray value would return a different value. As an example,
`Color::from_string("#686868ff"sv).to_grayscale()` used to return
#676767ff.
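A sketch of the difference. The luminosity weights below are an
assumption; the point is truncating vs. rounding the final float:

```c++
#include <cmath>
#include <cstdint>

// With r == g == b, the weighted sum should be exactly that value, but
// floating-point rounding can leave it slightly below (e.g. 103.9999...),
// and truncation then drops 104 (0x68) to 103 (0x67).
uint8_t to_grayscale_truncating(uint8_t r, uint8_t g, uint8_t b)
{
    float luminosity = 0.2126f * r + 0.7152f * g + 0.0722f * b;
    return static_cast<uint8_t>(luminosity); // truncates: unstable
}

uint8_t to_grayscale_rounding(uint8_t r, uint8_t g, uint8_t b)
{
    float luminosity = 0.2126f * r + 0.7152f * g + 0.0722f * b;
    return static_cast<uint8_t>(std::lround(luminosity)); // stable
}
```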