This replaces the current LexicalPath::append() API with a method that
returns a new LexicalPath object and leaves the original object untouched.
With this, LexicalPath is now immutable. It also adds a
LexicalPath::parent() method and the relevant test cases.
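A minimal usage sketch of the new shape of the API (exact parameter and
return types are assumed here, not copied from the patch):

```cpp
#include <AK/LexicalPath.h>

void example()
{
    LexicalPath path("/home/anon/Documents");
    auto file_path = path.append("todo.txt"); // returns a new LexicalPath; `path` is unchanged
    auto parent = file_path.parent();         // yields "/home/anon/Documents" again
}
```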
Since this flag is always set to true by the non-default constructor and
subsequently never modified, it is somewhat pointless. Furthermore,
there are arguably no invalid relative paths.
Split out the functionality to gather multiple tests from the filesystem
and run them in turn into Test::TestRunner, and leave the JavaScript
specific test harness logic in Test::JS::TestRunner and friends.
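A rough sketch of the shape of this split (class and method names here are
illustrative, not necessarily the ones in the tree):

```cpp
#include <AK/String.h>

namespace Test {

class TestRunner {
public:
    virtual ~TestRunner() = default;

    // Shared, filesystem-level logic: gather the test files under the test
    // root and run them one by one via the virtual hook below.
    void run();

protected:
    virtual void do_run_single_test(const String& test_path) = 0;
};

}

namespace Test::JS {

class TestRunner final : public ::Test::TestRunner {
protected:
    // All JavaScript-specific harness logic stays down here.
    void do_run_single_test(const String& test_path) override;
};

}
```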
If someone runs the test with shell redirection going on, or in a way
that changes any of the standard file descriptors, this assumption will
not hold. When running normally from a terminal, however, it does hold.
Instead, check that /proc/self/fd/[0,1,2] are symlinks and can be
stat'd, by verifying that both stat() and lstat() succeed and give
different struct stat contents.
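A sketch of one way to express that check for a single descriptor (the real
test code presumably loops over fds 0 through 2 and reports failures through
the test framework):

```cpp
#include <stdio.h>
#include <string.h>
#include <sys/stat.h>

// /proc/self/fd/<fd> must be a symlink, both stat() (which follows the link)
// and lstat() (which does not) must succeed, and the two results must differ.
static bool fd_looks_usable(int fd)
{
    char path[32];
    snprintf(path, sizeof(path), "/proc/self/fd/%d", fd);

    struct stat followed {};
    struct stat link_itself {};
    if (stat(path, &followed) < 0 || lstat(path, &link_itself) < 0)
        return false;
    if (!S_ISLNK(link_itself.st_mode))
        return false;
    return memcmp(&followed, &link_itself, sizeof(struct stat)) != 0;
}
```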
Also add some tests to ensure that they _remain_ constexpr.
In general, any runtime assertions, weirdo C casts, pointer aliasing,
and such shenanigans should be gated behind the (helpfully newly added)
AK::is_constant_evaluated() function when the intention is to write
constexpr-capable code.
a.k.a. deliver promises of constexpr-ness :P
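A minimal sketch of the gating pattern (the header locations and the use of
VERIFY here are assumptions):

```cpp
#include <AK/Assertions.h>
#include <AK/StdLibExtras.h>

constexpr int checked_half(int value)
{
    // Only run the runtime assertion when not in a constant evaluation; at
    // compile time the shenanigans are skipped and the function stays usable
    // in constant expressions.
    if (!AK::is_constant_evaluated())
        VERIFY(value % 2 == 0);
    return value / 2;
}

// The kind of test that ensures the function _remains_ constexpr:
static_assert(checked_half(4) == 2);
```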
This commit converts naked `new`s to `AK::try_make` and `AK::try_create`
wherever possible. If the called constructor is private, this cannot be
done, so we instead now use the standard-defined and compiler-agnostic
`new (nothrow)`.
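A hedged sketch of the two patterns (the example type is made up, and the
headers and return types of the AK helpers are assumed):

```cpp
#include <AK/NonnullOwnPtr.h>
#include <AK/OwnPtr.h>
#include <new>

struct Widget {
    explicit Widget(int) { }
};

void example()
{
    // Preferred: hands back a null pointer on allocation failure instead of crashing.
    auto widget = AK::try_make<Widget>(42);

    // Fallback for types whose constructors are private (shown on the same
    // type for brevity): standard, compiler-agnostic non-throwing new.
    auto* raw = new (std::nothrow) Widget(42);
    delete raw;
}
```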
Scanning tables is a linear process using pointers in the table's
tuples, and does not involve more 'stochastic' code paths like index
traversals. Therefore the 1000 and 10000 row tests were basically
overkill and added nothing we can't find out with fewer rows.
This declares all test cases which compare function outputs over the
entire Unicode range as `BENCHMARK_CASE`, to avoid them being run by CI.
This reduces runtime of TestCharacterTypes (without benchmarks) by about
one third.
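For illustration, the shape of such a declaration (the test name and body are
made up):

```cpp
#include <AK/Types.h>
#include <LibTest/TestCase.h>

// Declared as a benchmark so CI skips it; it can still be run locally on demand.
BENCHMARK_CASE(to_lowercase_matches_reference_over_entire_unicode_range)
{
    for (u32 code_point = 0; code_point <= 0x10FFFF; ++code_point) {
        // ... compare the LibUnicode output against the reference here ...
    }
}
```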
SQL was standardized before consensus on sane language syntax
constructs had evolved. The language is mostly case-insensitive, with
unquoted text converted to upper case. Identifiers can include lower
case characters and other 'special' characters by enclosing the
identifier with double quotes. A double quote is escaped by doubling it.
Likewise, a single quote in a literal string is escaped by doubling it.
All this means that the strategy used in the lexer, where a token's
value is a StringView 'window' on the source string, does not work,
because the value needs to be massaged before being handed to the
parser. Therefore a token now has a String containing its value. Given
the limited lifetime of a token, this is acceptable overhead.
Not doing this means that, for example, quote removal and double quote
escaping would need to be done in the parser or in AST node
construction, which would spread lexing basically all over the place,
which would be suboptimal.
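As a sketch of the kind of massaging that now happens while lexing
(illustrative code, not the actual LibSQL lexer):

```cpp
#include <AK/String.h>
#include <AK/StringBuilder.h>
#include <AK/StringView.h>

// Strip the surrounding quotes from a quoted identifier or string literal and
// collapse each doubled quote back into a single one.
static String unescape_quoted(StringView source, char quote)
{
    StringBuilder builder;
    for (size_t i = 1; i + 1 < source.length(); ++i) {
        builder.append(source[i]);
        if (source[i] == quote && source[i + 1] == quote)
            ++i; // skip the second half of an escaped quote
    }
    return builder.to_string();
}

// unescape_quoted("\"My \"\"odd\"\" name\"", '"') => My "odd" name
// unescape_quoted("'O''Brien'", '\'')             => O'Brien
```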
There was some impact on the sql utility and the SyntaxHighlighter
component, which was addressed by storing the token's end position
together with the start position in order to properly highlight it.
Finally, reviewing the tests for parsing numeric literals revealed an
inconsistency in which tokens we accept or reject: `1a` is accepted but
`1e` is rejected. Related to this is the fate of `0x`. Added a FIXME
reminding us to address this.
Unfortunately this patch is quite large.
The main pieces of functionality included are a BTree index
implementation and the Heap class, which manages persistent storage.
Also included is a Key subclass of the Tuple class, which is a
specialization for index key tuples. This "dragged in" the Meta layer,
which has classes defining SQL objects like tables and indexes.
This patch adds the basic dynamic value classes used by the SQL Storage
layer. The most elementary class is Value, which holds a typed value
that can be converted to standard C++ types. A Tuple is a collection
of Values described by a TupleDescriptor, which specifies the names,
types, and ordering of the elements in the Tuple.
Tuples and Values can be serialized and deserialized to and from
ByteBuffers. This is the mechanism used to save them to disk.
Tuples are used as keys in SQL indexes and rows in SQL tables.
Also included is a test file.
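Purely as an illustration of the serialization idea (this does not mirror the
actual LibSQL classes or their on-disk format):

```cpp
#include <AK/String.h>
#include <AK/Types.h>
#include <AK/Vector.h>

// A toy value flattened into a byte vector so it could be written out and
// read back, which is the role ByteBuffers play for Values and Tuples.
struct ToyValue {
    u32 integer { 0 };
    String text;

    void serialize(Vector<u8>& out) const
    {
        for (size_t i = 0; i < sizeof(integer); ++i)
            out.append(static_cast<u8>((integer >> (8 * i)) & 0xff));
        out.append(static_cast<u8>(text.length()));
        for (size_t i = 0; i < text.length(); ++i)
            out.append(static_cast<u8>(text[i]));
    }
};
```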
The insert_before method on AK::InlineLinkedList is used, so in order to
achieve feature parity, we need to implement it for AK::IntrusiveList as
well.
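For reference, the generic shape of such an insert_before on a doubly linked
list (illustrative only; the real AK::IntrusiveList implementation also has
list-level bookkeeping to update, e.g. when inserting before the first node):

```cpp
struct Node {
    Node* prev { nullptr };
    Node* next { nullptr };
};

// Link `node` into the list directly before `existing`.
static void insert_before(Node& existing, Node& node)
{
    node.prev = existing.prev;
    node.next = &existing;
    if (existing.prev)
        existing.prev->next = &node;
    existing.prev = &node;
}
```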
A POSIX-compatibility fix was introduced in 64740a0214 to make the
compilation of the `diffutils` port work, which expected a
`char* const* argv` signature.
And indeed, the POSIX spec does not mention permutation of `argv`:
https://pubs.opengroup.org/onlinepubs/9699919799/functions/getopt.html
However, most implementations do modify `argv` as evidenced by
documentation such as:
https://refspecs.linuxbase.org/LSB_5.0.0/LSB-Core-generic/LSB-Core-generic/libutil-getopt-3.html
"The function prototype was aligned with POSIX 1003.1-2008 (ISO/IEC
9945-2009) despite the fact that it modifies argv, and the library
maintainers are unwilling to change this."
Change the behavior back to permuting `argv` to allow the following
command line argument order to work again:
unzip ./file.zip -o target-dir
Without this change, `./file.zip` in the example above would have been
ignored completely.
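A small sketch of why permutation matters for that invocation (standard
getopt() usage, not the unzip source):

```cpp
#include <stdio.h>
#include <unistd.h>

int main(int argc, char** argv)
{
    // Invoked as: unzip ./file.zip -o target-dir
    const char* output_dir = ".";
    int opt;
    while ((opt = getopt(argc, argv, "o:")) != -1) {
        if (opt == 'o')
            output_dir = optarg;
    }
    // With argv permutation, getopt() moves the non-option "./file.zip"
    // behind the options, so both the file and -o are picked up; this is the
    // behavior the change restores.
    if (optind < argc)
        printf("extracting %s into %s\n", argv[optind], output_dir);
    return 0;
}
```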
Doing these as custom classes might be faster, especially when writing
them in SSE, but this would cause a lot of code duplication, and due to
the nature of constexpr and the intelligence of the compiler, they might
be using SSE/MMX either way.
This commit makes it possible to instantiate `Vector<T&>` and use it
to store references to `T` in a vector.
All non-pointer observers are made to return the reference, and the
pointer observers simply yield the underlying pointer.
Note that the 'find_*' methods act on the values and not the pointers
that are stored in the vector.
This commit also makes errors in various vector methods much more
readable by directly using requires-clauses on them.
And finally, it should be noted that Vector cannot hold temporaries :^)
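A minimal usage sketch (assuming the Vector interface described above):

```cpp
#include <AK/Vector.h>

void example()
{
    int a = 1;
    int b = 2;

    Vector<int&> refs;
    refs.append(a);
    refs.append(b);

    refs[0] = 42; // non-pointer observers hand back the reference itself, so this writes to `a`
    // find_* and friends compare the referenced values, not the stored addresses.
}
```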
Previously, AK::Function would accept _any_ callable type, and try to
call it when called, first with the given set of arguments, then with
zero arguments, and if all of those failed, it would simply not call the
function and **return a value-constructed Out type**.
This led to many, many, many hard-to-debug situations when someone
forgot a `const` in their lambda argument types, and many cases of
people taking zero arguments in their lambdas to ignore them.
This commit reworks the Function interface to not include any such
surprising behaviour: if your function instance is not callable with
the declared argument set of the Function, it simply cannot be
assigned to that Function instance, end of story.
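A short sketch of the stricter behaviour:

```cpp
#include <AK/Function.h>

void example()
{
    Function<void(int)> on_event;

    on_event = [](int value) { (void)value; }; // OK: callable with the declared arguments

    // on_event = []() {};
    // ^ Previously this compiled and was silently invoked with zero arguments;
    //   now it is simply a compile-time error.
}
```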
According to the definition at https://sqlite.org/lang_expr.html, SQL
expressions can be nested to arbitrary depth. For practicality, SQLite
enforces a maximum expression tree depth of 1000. Apply the same limit in
LibSQL to avoid stack overflow in the expression parser.
Fixes https://crbug.com/oss-fuzz/34859.
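An illustrative sketch of the kind of guard (not the actual LibSQL parser
code):

```cpp
#include <AK/StringView.h>

constexpr size_t maximum_expression_tree_depth = 1000;

// A recursive-descent step that tracks its own nesting depth and refuses to
// recurse past the limit, turning a potential stack overflow into a regular
// parse error.
static bool parse_nested(StringView input, size_t& position, size_t depth)
{
    if (depth > maximum_expression_tree_depth)
        return false;
    while (position < input.length()) {
        char c = input[position++];
        if (c == '(') {
            if (!parse_nested(input, position, depth + 1))
                return false;
        } else if (c == ')') {
            return true;
        }
    }
    return depth == 0;
}
```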
Because the bytes of non-ASCII code points have negative values (char
is signed), trimming away control characters requires checking for
negative byte values as well.
This also adds a test case with a URL containing non-ASCII code points.
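A sketch of the pitfall and the check (signed `char` assumed, as on most
targets; the function name is illustrative):

```cpp
// With a signed `char`, every byte of a non-ASCII (multi-byte UTF-8) code
// point is negative, so a bare `c <= 0x1f`-style comparison would match those
// bytes too. The extra `c >= 0` check keeps the trimming to actual ASCII
// control characters (and space).
static bool is_trimmable_ascii(char c)
{
    return c >= 0 && c <= ' ';
}
```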