We were able to keep LibCoreMinimal a bit smaller as an object library,
but that is causing ODR violations in the fuzzer build (realistically,
this should be an issue in all builds, but only the fuzzer actively
complains for some reason).
To make it a shared library, we have to add a couple more symbols to it,
and make LibCore publicly depend on it.
We were off-by-one when returning the result of parsing a quoted string
in Web::Fetch::Infrastructure::collect_an_http_quoted_string. Instead of
backtracking the lexer and consuming the backtracked string, do a simple
substring operation.
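As a standalone sketch of the substring idea (hypothetical helper and
names, not the actual LibWeb code), the point is to remember where the
quoted string started and slice the input once, rather than rewinding a
lexer and re-consuming:

#include <cstddef>
#include <string_view>

// Return the raw quoted-string text, including the surrounding quotes,
// by slicing the original input. `position` is the index of the opening
// '"' in `input`. All names here are illustrative.
static std::string_view collect_quoted_string(std::string_view input, size_t position)
{
    size_t start = position; // index of the opening quote
    ++position;              // skip the opening '"'
    while (position < input.size() && input[position] != '"') {
        if (input[position] == '\\' && position + 1 < input.size())
            ++position;      // skip the escaped character
        ++position;
    }
    if (position < input.size())
        ++position;          // include the closing '"'
    // One-past-the-end minus start is the correct length; an off-by-one
    // here is exactly the kind of bug being fixed.
    return input.substr(start, position - start);
}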
We have been dancing around circular dependencies between LibCore and
generated sources. For example, LibURL currently cannot depend on
LibUnicode because the LibUnicode generators depend on LibCore, and
LibCore depends on LibURL. LibTimeZone is in a similar situation.
To alleviate this, we can define the minimal sources that the code
generators need as an object library. This will allow the generators to
depend on this library, rather than the full LibCore.
JPEG2000 is the last image format used in PDF filters that we
don't have a loader for. Let's change that.
This adds all the scaffolding, but no actual implementation yet.
This is a fetching AO and is only used by LibWeb in the context of fetch
tasks. Move it to LibWeb with other fetch methods.
The main reason for this is that it requires the use of other LibWeb AOs
such as the forgiving Base64 decoder and MIME sniffing. These AOs aren't
available within LibURL.
These are standalone applications meant to be run by the user directly,
as opposed to other libexec processes which are programmatically forked
by the browser. To do this, we simply remove these processes from the
`ladybird_helper_processes` list. We must also explicitly list the
dependencies for these processes.
This does not implement any of the IDL methods, but GitHub requires the
interface to exist to upload files via an <input type="file"> element.
Their JS handles uploads via this element and via drag-and-drop in one
function, and checks if the uploaded file is `instanceof DataTransfer`
to decide how to handle it.
This patch implements and tests window.crypto.subtle.generateKey with
an RSA-OAEP algorithm. In order for the types to be happy, the
KeyAlgorithms objects are moved to their own .h/.cpp pair, and the new
KeyAlgorithms for RSA are added there.
This patch throws away some of the spec suggestions for how to implement
the normalize_algorithm AO and uses a new pattern that we can actually
extend in our C++.
Also update CryptoKey to store the key data.
Now that all input events are handled by LibWebView, replace the IPCs
which send the fields of Web::KeyEvent / Web::MouseEvent individually
with one IPC per event type (key or mouse).
We can also replace the ad-hoc queued input structure with a smaller
struct that simply holds the transferred Web::KeyEvent / Web::MouseEvent.
In the future, we can also adapt Web::EventHandler to use these structs.
The Serenity chrome is the only chrome thus far that sends all input key
and mouse events to WebContent, including shortcut activations. This is
necessary for all chromes - we must give web pages a chance to intercept
input events before handling them ourselves.
To make this easier for other chromes, this patch moves Serenity's input
event handling to LibWebView. To do so, we add the Web::InputEvent type,
which models the event data we need within LibWeb. Chromes will then be
responsible for converting between this type and their native events.
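As a rough sketch of the shape of this type (the exact fields here are
assumptions; only the struct-per-event-kind-plus-variant shape is the
point):

#include <AK/Types.h>
#include <AK/Variant.h>
#include <LibGfx/Point.h>

namespace Web {

struct KeyEvent {
    enum class Type { KeyDown, KeyUp };
    Type type;
    u32 code_point { 0 };     // text produced by the key press, if any
    unsigned modifiers { 0 }; // ctrl/alt/shift/super bitmask
};

struct MouseEvent {
    enum class Type { MouseDown, MouseUp, MouseMove, MouseWheel, DoubleClick };
    Type type;
    Gfx::IntPoint position;
    unsigned button { 0 };
    unsigned buttons { 0 };
    unsigned modifiers { 0 };
};

using InputEvent = AK::Variant<KeyEvent, MouseEvent>;

}

A Qt chrome would then translate e.g. a QKeyEvent into a Web::KeyEvent
before handing it to LibWebView, and other chromes do the same with
their native event types.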
This class lives in LibWeb (rather than LibWebView) because the plan is
to use it wholesale throughout the Page's event handler and across IPC.
Right now, we still send the individual fields of the event over IPC,
but it will be an easy refactor to send the event itself. We just can't
do this until all chromes have been ported to this event queueing.
Also note that we now only handle key input events back in the chrome.
WebContent handles all mouse events that it possibly can. If it was not
able to handle a mouse event, there's nothing for the chrome to do (i.e.
there is no clicking, scrolling, etc. the chrome is able to do if the
WebContent couldn't).
This will be used to transfer information about the parent context to
DedicatedWorkers and future out-of-process Worker/Worklet
implementations for fetching purposes. In order to properly check
same-origin and other policies, we need to know more about the outside
settings than we were previously passing to the WebWorker process.
It aligns better with the Filesystem Hierarchy Standard[1] to put our
program-specific helper programs that are not intended to be executed by
the user of the application in $prefix/libexec or in whatever the
packager sets as the CMake equivalent. Namely, on Debian systems this
should be /usr/lib/Ladybird or similar.
[1] https://refspecs.linuxfoundation.org/FHS_3.0/fhs-3.0.html#usrlibexec
We had previously implemented some plumbing for file input elements in
commit 636602a54e.
This implements the return path for chromes to inform WebContent of the
file(s) the user selected. This patch includes a dummy implementation
for headless-browser to enable testing.
This works very similarly to MarkedVector<T>, but instead of expecting
T to be Value or a GC-allocated pointer type, T can be anything.
Every pointer-sized value in the vector's storage will be checked during
conservative root scanning.
In other words, this allows you to put something like this in a
ConservativeVector<Foo> and it will be protected from GC:
struct Foo {
    i64 number;
    Value some_value;
    GCPtr<Object> some_object;
};
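A hypothetical usage sketch (assuming the constructor takes a Heap&,
like MarkedVector's does; `vm` and `some_object` stand in for whatever
is in scope):

ConservativeVector<Foo> foos(vm.heap());
foos.append(Foo { .number = 1,
                  .some_value = JS::js_undefined(),
                  .some_object = some_object });

Every element's storage is then scanned conservatively, so the Value and
GCPtr members of each Foo keep their referenced cells alive.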
The JIT compiler was an interesting experiment, but ultimately the
security & complexity cost of doing arbitrary code generation at runtime
is far too high.
In subsequent commits, the bytecode format will change drastically, and
instead of rewriting the JIT to fit the new bytecode, this patch simply
removes the JIT instead.
Other engines, JavaScriptCore in particular, have already proven that
it's possible to handle the vast majority of contemporary web content
with an interpreter. They are currently ~5x faster than us on benchmarks
when running without a JIT. We need to catch up to them before
considering performance techniques with a heavy security cost.
We currently bundle AK with LibCore on Lagom. This means that to use AK,
all libraries must also depend on LibCore. This will create circular
dependencies when we create LibURL, as LibURL will depend on LibUnicode,
which will depend on LibCore, which will depend on LibURL.
This splits the RIFFTypes header/TU into the WAV specific parts, which
move to WavTypes.h, as well as the general RIFF parts which move to the
new LibRIFF.
Sidenote for the spec comments: even though they are linked from a site
that explains the WAV format, the document is the (an) overall RIFF spec
from Microsoft. A better source may be used later; the changes to the
header are as minimal as possible.
Instead of spawning these processes from the WebContent process, we now
create them in the Browser chrome.
Part 1/N of "all processes are owned by the chrome".
To date, we have two known PlatformObjects that need to implement some
of the behavior of LegacyPlatformObject: Window and HTMLFormElement.
To make this not require double (or virtual) inheritance of
PlatformObject, move the behavior of LegacyPlatformObject into
PlatformObject. The selection of LegacyPlatformObject behavior is done
with a new bitfield of feature flags instead of a dozen virtual
functions that return bool. This change simplifies every class involved
in the diff with the notable exception of Window, which now needs some
ugly const casts to implement named property access.
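A rough sketch of the flag-based selection (the flag and member names
here are illustrative, not necessarily the ones used in the diff):

#include <AK/Optional.h>
#include <AK/Types.h>

struct LegacyPlatformObjectFlags {
    u16 supports_indexed_properties : 1 { 0 };
    u16 supports_named_properties : 1 { 0 };
    u16 has_indexed_property_setter : 1 { 0 };
    u16 has_named_property_setter : 1 { 0 };
    u16 has_named_property_deleter : 1 { 0 };
    u16 has_legacy_override_built_ins : 1 { 0 };
    // ... one bit per former bool-returning virtual
};

class PlatformObject /* : public JS::Object, simplified */ {
protected:
    Optional<LegacyPlatformObjectFlags> m_legacy_platform_object_flags;
};

// A subclass opts in once, in its constructor, instead of overriding a
// dozen virtuals:
//   m_legacy_platform_object_flags =
//       LegacyPlatformObjectFlags { .supports_named_properties = 1 };

PlatformObject's internal methods can then branch on these bits.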
This large block of code is repeated nearly verbatim in LibWeb. Move it
to a helper function that both LibIPC and LibWeb can defer to. This will
let us make changes to this method in a singular location going forward.
Note this is a bit of a regression for the MessagePort. It now suffers
from the same performance issue that IPC messages face - we prepend the
message size to the message buffer. This degradation is very temporary
though, as a fix is imminent, and this change makes that fix easier.
With this, it's possible to build Ladybird without having Qt installed.
(Previously, the build required `moc` to exist.)
In fact, it's possible to build Ladybird without anything off `brew`
as long as you have `ninja` and `gn` (neither of which has any
dependencies itself, and both are easy to build).
Before this change, we would only cache and reuse Gfx::ScaledFont
instances for downloaded CSS fonts.
By moving it into Gfx::VectorFont, we get caching for all vector fonts,
including local system TTFs etc.
This avoids a *lot* of style invalidations in LibWeb, since we now vend
the same Gfx::Font pointer for the same font when used repeatedly.
This commit un-deprecates DeprecatedString, and repurposes it as a byte
string.
As the null state has already been removed, there are no other
particularly hairy blockers in repurposing this type as a byte string
(what it _really_ is).
This commit is auto-generated:
$ xs=$(ack -l \bDeprecatedString\b\|deprecated_string AK Userland \
      Meta Ports Ladybird Tests Kernel)
$ perl -pie 's/\bDeprecatedString\b/ByteString/g;
      s/deprecated_string/byte_string/g' $xs
$ clang-format --style=file -i \
      $(git diff --name-only | grep \.cpp\|\.h)
$ gn format $(git ls-files '*.gn' '*.gni')
This adds APIs to allow Inspector clients to:
* Change a DOM text or comment node's text data.
* Add, replace, or remove a DOM element's attribute.
* Change a DOM element's tag.
From test262 documentation, this flag means:
The test file should only be run when the [[CanBlock]] property of
the Agent Record executing the file is `false`.
This patch stubs out the accessor for that internal slot and skips tests
with the CanBlockIsFalse flag if that internal slot is true.
AbstractBrowsingContext has a subclass RemoteBrowsingContext without a
visit_edges() override (and it doesn't really need one). But currently,
we rely on subclasses visiting AbstractBrowsingContext's opener BC.
This adds a visit_edges() to AbstractBrowsingContext to explicitly visit
the opener BC itself.
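Concretely, that boils down to an override along these lines (a sketch;
the member name is an assumption):

void AbstractBrowsingContext::visit_edges(JS::Cell::Visitor& visitor)
{
    Base::visit_edges(visitor);
    visitor.visit(m_opener_browsing_context);
}

so RemoteBrowsingContext (and any other subclass without its own
override) still keeps the opener alive.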
Writing to /sys/kernel/request_panic will trigger a kernel panic.
Trying to truncate the node will result in a kernel panic with a
slightly different message.
This was used to provide base functionality for model-based chromes for
viewing the DOM and accessibility trees. All chromes now use the WebView
inspector model for those trees, thus this class is unused.
This is modeled after a similar implementation for the JS console.
This client takes over an inspector WebView (created by the chrome) to
create the inspector application. Currently, this application includes
the DOM tree and accessibility tree as a first pass. It can later be
extended to include the style tables, the JS console itself, etc.
This is an internal object that must be explicitly enabled by the chrome
before it is added to the Window. The Inspector object will be used by a
special WebView that will replace all chrome-specific inspector windows.
The IDL defines methods that this WebView will need to inform the chrome
of various events, such as the user clicking a DOM node.
If a unit test defines a `deps` array, the unit test template would
have tried to overwrite it (and it is actually an error to overwrite
a non-empty list with another non-empty list).
This is a bit spammy now that we are performing some overload resolution
at build time. The fallback to an interface has generally worked fine on
the types it warns about (BufferSource, Module, etc.) so let's not warn
about it for every build.
The previous implementation was calling `backtrace()` for every
function call, which is quite slow.
Instead, this implementation provides VM::stack_trace() which unwinds
the native stack, maps it through NativeExecutable::get_source_range
and combines it with source ranges from interpreted call frames.
For now, part of this is commented-out. Our current implementations of
`<mask>` and `<symbol>` rely on creating layout nodes, so they can't be
`display: none`.
The `to_string()` for this is modified a little from the original,
because we have to calculate the layer count at that point instead of
having it already calculated.
Support for this element was removed from all major engines years ago,
and it's currently the only reason we have a weird "visible" flag on
Layout::Node (which we toggle on a timer here...)
This event is the star of the show, and the main way that web content
can react to either programmatic or user-initiated navigation.
All of the fun algorithms will have to come later though.
This API is how JavaScript can manipulate the new Navigable concepts
directly. We are still missing most of the interesting algorithms on
Navigation that do the actual navigation steps, and call into the
currently WIP navigable AOs.
IFF was a generic container file format that was popular on the Amiga,
since it was the only file format supported by Deluxe Paint.
ILBM is an image format popular in the late eighties/nineties
that uses the IFF container.
This is a very first version of the decoder that only supports
(byterun) compressed files with bpp <= 8.
Only the minimal chunks are decoded: CMAP, BODY, BMHD.
I am planning to add support for the following variants:
- EHB (32 colours + lighter 32 colours)
- HAM6 / HAM8 (special modes that allowed displaying the whole Amiga
palette of 4096 colours / 262 144 colours)
- TrueColor (24bit)
Things that could be fun to do:
- Still images could be animated using color cycle information
We now apply MathML's default user agent style sheet along with other
default styles. This sheet is not mixed in with the other styles in
CSS/Default.css because it is a namespaced stylesheet and so has to
be its own sheet.
This will help a lot with developing chromes for different UI frameworks
where we can see which helper classes and processes are really using Qt
vs just using it to get at helper data.
As a bonus, remove Qt dependency from WebDriver.
We got some errors while loading https://twinings.co.uk/ about this
interface missing, and it looked fairly simple so I sketched it out.
Note that I did leave some FIXMEs where it's not clear exactly which
metrics we should be returning.
Now that all the classes for Ladybird are in the Ladybird namespace, we
don't need them named as Ladybird::FooBarLadybird. For the Qt-specific
classes, we can tack on a Qt at the end for clarity, but FontPlugin and
ImageCodecPlugin no longer have anything to do with Qt.
LibTLS still can't access many parts of the web, so let's hide this
behind a flag (with all the plumbing that entails).
Hopefully this can encourage folks to improve LibTLS's algorithm support
:^).
Re-organize our helper files here a bit, to make a clearer distinction
between Qt-specific helpers and generic non-serenity helpers.
A future commit will move Lagom-specific code from LibSQL to Ladybird
as well, so that we can see about future generic APIs for spawning
helper processes.
The Qt relationship was removed in de31a8a4, so let's acknowledge that
and make it clearer to potential non-Qt ports that this file is usable
by them :^).
This reduces the number of tasks to schedule, and the complexity of the
build system integrations for the BindingsGenerator. As a bonus, we move
the "only write if changed" feature into the generator to reduce the
build system load on generated files for this generator.
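The "only write if changed" behavior boils down to something like this
standalone sketch (standard-library flavored; not the generator's
actual code):

#include <fstream>
#include <sstream>
#include <string>

// Write `contents` to `path` only if it differs from what is already on
// disk, so the output's mtime is untouched and the build system does
// not rebuild dependents of an unchanged generated file.
static void write_if_changed(std::string const& path, std::string const& contents)
{
    if (std::ifstream existing { path }) {
        std::ostringstream buffer;
        buffer << existing.rdbuf();
        if (buffer.str() == contents)
            return;
    }
    std::ofstream output(path, std::ios::trunc);
    output << contents;
}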
The idl file lists are used for two things:
1. As inputs for `generate_window_or_worker_interfaces`
2. In a loop in `generate_idl_bindings` and the loop variable
is passed to `rebase_path`
Both these cases can handle a normal fully-qualified GN path,
so there's no need for the "abspath".
No behavior change.
The dollar sign is a special character in POSIX shells and in the Ninja
build file format. If the file name contains a `$`, something goes wrong
in the escaping/unescaping of this symbol, and CMake/GCC/Clang generate
invalid dependency files where not all instances of `$` are escaped
properly. Because of this, Ninja fails to rebuild `$262Object.cpp` if
the headers included by it have changed.
Stale `$262Object.cpp.o` files have been the cause of mysterious crashes
multiple times which only go away after doing a clean build. Let's
prevent these from happening again by removing the `$` from the
filename.
This allows opening the ladybird.app app bundle on macOS, using Xcode
tools like Instruments on the applications in the app bundle, and even
installing the app bundle into /Applications :^)
A lot of code gen happening here. These generators are kind of
awkward to work with, and the fact that the CLDR data download
extracts over 8,000 files makes it hard to fit into the explicit
patterns GN expects of us.