- Introduced an optional callback for the Deno.core.serialize API that returns
the cloning error, if there is one.
- Removed the try/catch in the serialize structured clone function and instead
throw the error from the callback (see the sketch after this list).
- Removed "Object with a getter that throws" assertion from WPT.
This commit fixes a failing WPT test by making EventTarget's
addEventListener() method throw if both the listener and the
signal option are null.
Fixes: https://github.com/denoland/deno/issues/14593
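A minimal reproduction of the behavior under test:

```ts
const target = new EventTarget();

// A null listener on its own is still silently ignored.
target.addEventListener("test", null);

// With this change, a null signal option is validated even when the listener
// is null, so this call now throws instead of returning silently.
target.addEventListener("test", null, { signal: null as any });
```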
This commit adds proper support for import assertions and JSON modules.
The implementation of "core/modules.rs" was changed to account for multiple
possible module types, instead of always assuming that the code is an "ES
module". In effect, "ModuleMap" now has knowledge of each module's type (stored
via the "ModuleType" enum). The module loading pipeline now stores the expected
module type for each request and validates that it matches the module type
discovered from the file's "MediaType".
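On the JavaScript side the feature looks like this (the file names below are
placeholders); the import fails if the asserted type doesn't match the
discovered module type:

```ts
// Static import with an import assertion; the module must resolve to JSON.
import config from "./config.json" assert { type: "json" };
console.log(config);

// Dynamic import; the assertion is passed via the second argument.
const data = await import("./data.json", { assert: { type: "json" } });
console.log(data.default);
```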
Relevant tests were added to "core/modules.rs" and to the integration tests;
additionally, multiple WPT tests were enabled.
There are still some rough edges in the implementation, and not all WPT tests
were enabled, due to:
a) unclear handling of BOMs in source code by the "FileFetcher"
b) a design limitation of Deno's "FileFetcher", which doesn't download the same
module more than once in a single run
Co-authored-by: Kitson Kelly <me@kitsonkelly.com>
This commit adds support for exporting RSA JWKs in the Web Crypto API.
It also does some minor fixes for RSA JWK imports.
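A short sketch of the newly supported path:

```ts
// Generate an RSA key pair and export the public key as a JWK.
const { publicKey } = await crypto.subtle.generateKey(
  {
    name: "RSASSA-PKCS1-v1_5",
    modulusLength: 2048,
    publicExponent: new Uint8Array([1, 0, 1]),
    hash: "SHA-256",
  },
  true, // extractable, so the key can be exported
  ["sign", "verify"],
);

const jwk = await crypto.subtle.exportKey("jwk", publicKey);
console.log(jwk.kty, jwk.n, jwk.e); // "RSA", base64url modulus and exponent
```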
Co-authored-by: Sean Michael Wykes <sean.wykes@nascent.com.br>
The Web IDL conversion to `BufferSource` and similar types shouldn't
check whether the buffer is detached.
In the case of `TextDecoder`, our implementation would still throw after
the Web IDL conversions because we're creating a new `Uint8Array` from
the buffer source's buffer, which throws if it's detached. This change
fixes that bug as well.
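A small sketch of the resulting behavior:

```ts
const buf = new ArrayBuffer(8);

// Detach `buf` by transferring it, e.g. through a structured clone transfer.
structuredClone(buf, { transfer: [buf] });

// The Web IDL BufferSource conversion no longer rejects the detached buffer,
// but decode() still throws, because the implementation constructs a new
// Uint8Array over it and that fails for a detached ArrayBuffer.
new TextDecoder().decode(buf);
```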
The initial implementation of `importScripts()` in #11338 used
`reqwest`'s default client to fetch HTTP scripts, which meant it would
not use certificates or other fetching configuration passed by command
line flags. This change fixes it.
* 'request-upload.h2' and 'redirect-upload.h2' only work with a
  functional HTTP2 test harness server; otherwise, they're flaky.
* Fetch request streaming tests require a server that doesn't choke
on requests that use 'Transfer-Encoding: chunked'.
`Window`'s `self` property and `DedicatedWorkerGlobalScope`'s `name`
property are defined as Web IDL read-only attributes with the
`[Replaceable]` extended attribute, meaning that their setter will
redefine the property as a data property with the set value, rather than
changing some internal state. Deno currently defines them as read-only
data properties instead.
Given that Web IDL requires all attributes to be accessor properties
rather than data properties, but Deno exposes almost all of those
properties as either read-only or writable data properties, it makes
sense to expose `[Replaceable]` properties as writable as well – as is
already the case with `WindowOrWorkerGlobalScope`'s `performance`
property.
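The observable behavior being matched, sketched below, is that assignment
replaces the value instead of throwing or being silently ignored:

```ts
// `self` starts out as a property on the global object...
console.log(self === globalThis); // true

// ...and assigning to it simply leaves a plain writable data property behind.
(globalThis as any).self = 42;
console.log(self); // 42
console.log(Object.getOwnPropertyDescriptor(globalThis, "self")?.writable); // true
```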
Classic workers were implemented in denoland#11338, which also enabled the WPT
tests in the `workers` directory. However, the rest of WPT worker tests
were not enabled because a number of them were hanging due to
web-platform-tests/wpt#29777. Now that that WPT issue is fixed, the bulk
of worker tests can be enabled.
There are still a few tests that hang, and so haven't been enabled. In
particular:
- The following tests seem to hang because a promise fails to resolve.
We can detect such cases in non-worker tests because the process will
exit without calling the WPT completion callback, but in worker tests
the worker message ops will keep the event loop running. This will be
fixed when we add timeouts to WPT tests (denoland#9460).
- `/fetch/api/basic/error-after-response.any.worker.html`
- `/html/webappapis/microtask-queuing/queue-microtask-exceptions.any.worker.html`
- `/webmessaging/message-channels/worker-post-after-close.any.worker.html`
- `/webmessaging/message-channels/worker.any.worker.html`
- `/websockets/Create-on-worker-shutdown.any.worker.html`
- The following tests apparently hang because a promise rejection is
never handled, which will kill the process in the main thread but not
in workers (denoland#12221).
- `/streams/readable-streams/async-iterator.any.worker.html`
- `/workers/interfaces/WorkerUtils/importScripts/report-error-setTimeout-cross-origin.sub.any.worker.html`
- `/workers/interfaces/WorkerUtils/importScripts/report-error-setTimeout-redirect-to-cross-origin.sub.any.worker.html`
- `/workers/interfaces/WorkerUtils/importScripts/report-error-setTimeout-same-origin.sub.any.worker.html`
In the spec, a URL record has an associated "blob URL entry", which for
`blob:` URLs is populated during parsing to contain a reference to the
`Blob` object that backs that object URL. It is this blob URL entry that
the `fetch` API uses to resolve an object URL.
Therefore, since the `Request` constructor parses URL inputs, it will
have an associated blob URL entry which will be used when fetching, even
if the object URL has been revoked since the construction of the
`Request` object. (The `Request` constructor takes the URL as a string
and parses it, so the object URL must be live at the time it is called.)
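For example:

```ts
const blob = new Blob(["hello"], { type: "text/plain" });
const url = URL.createObjectURL(blob);

// The blob URL entry is captured while the Request's URL is parsed here.
const request = new Request(url);

// Revoking the object URL afterwards does not affect the captured entry...
URL.revokeObjectURL(url);

// ...so fetching the Request still resolves to the blob's contents.
const response = await fetch(request);
console.log(await response.text()); // "hello"
```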
This PR adds a new `blobFromObjectUrl` JS function (backed by a new
`op_blob_from_object_url` op) that, if the URL is a valid object URL,
returns a new `Blob` object whose parts are references to the same Rust
`BlobPart`s used by the original `Blob` object. It uses this function to
add a new `blobUrlEntry` field to inner requests, which will be `null`
or such a `Blob`, and then uses `Blob.prototype.stream()` as the
response's body. As a result of this, the `blob:` URL resolution from
`op_fetch` is now useless, and has been removed.
This adds support for the URLPattern API.
The API is added in --unstable only, as it has not yet shipped in any
browser. It is targeted for shipping in Chrome 95.
Spec: https://wicg.github.io/urlpattern/
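Basic usage (behind `--unstable`):

```ts
const pattern = new URLPattern({ pathname: "/books/:id" });

console.log(pattern.test("https://example.com/books/123")); // true

const match = pattern.exec("https://example.com/books/123");
console.log(match?.pathname.groups.id); // "123"
```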
Co-authored-by: crowlKats <crowlkats@toaxl.com>
Classic worker scripts are now executed in the context of a Tokio
runtime. This does mean we cannot spawn additional Tokio runtimes in
"op_worker_sync_fetch". Instead, we spawn a new thread there that creates
a new Tokio runtime, which we can use to block the worker thread.
This commit implements classic workers, but only when the
`--enable-testing-features-do-not-use` flag is provided. This change is
not user facing. Classic workers are used extensively in WPT tests. The
classic workers do not support loading from disk, and do not support
TypeScript.
Co-authored-by: Luca Casonato <hello@lcas.dev>
Currently, calling the `abort()` method on a `FileReader` object aborts
any current read operation, but it also prevents any later read
operation from starting. The File API instead specifies that calling
`abort()` should reset the `FileReader`'s state and result, as well as
remove any tasks queued by the current operation that haven't yet run.
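A sketch of the corrected behavior:

```ts
const blob = new Blob(["some bytes"]);
const reader = new FileReader();

reader.readAsText(blob);
reader.abort(); // aborts the in-flight read and resets state and result
console.log(reader.result); // null

// Previously this second read would never start; now it runs as specified.
reader.onload = () => console.log(reader.result); // "some bytes"
reader.readAsText(blob);
```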
Currently, a WPT test is considered failed if its status code is
anything other than 0, regardless of whether the test suite completed
running or not, and any subtests that haven't finished running are not
considered to be failures.
But a test can exit with a zero status code before it has completed
running, if the event loop has run out of tasks because of a bug in one
of the ops, leading to false positives. This change fixes that.