Fixes #26498.
This was originally a somewhat intentional decision, as I wanted to avoid
caching extra files that may not be needed. That behavior turns out to be
unintuitive, so I propose we cache all of the exports of listed JSR
packages when you run a bare `deno install`.
This commit makes sure that `deno add`, `deno install` and `deno remove`
update the lockfile if only a `package.json` file is present.
Fixes https://github.com/denoland/deno/issues/26270
1. Respects the formatting of the file (e.g. keeps four-space indents or
tabs).
2. Handles editing of comments.
3. Handles trailing commas.
4. The code is easier to maintain.
This PR fixes an issue where mapped specifiers in a workspace member
would never be found; only mapped paths from the workspace root would
resolve.
This was caused by always passing the workspace root URL to the import
map resolver instead of the workspace member's.
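For illustration (names and paths here are hypothetical), the kind of setup that was affected is a workspace member with its own `imports` in its deno.json:

```jsonc
// deno.json (workspace root)
{
  "workspace": ["./member"]
}

// member/deno.json — before this fix, specifiers mapped here would not
// resolve, because the workspace root URL was passed to the import map resolver.
{
  "imports": {
    "foo": "jsr:@scope/foo"
  }
}
```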
Fixes https://github.com/denoland/deno/issues/26138
Fixes https://github.com/denoland/fresh/issues/2615
---------
Signed-off-by: Marvin Hagemeister <marvinhagemeister50@gmail.com>
Co-authored-by: David Sherret <dsherret@users.noreply.github.com>
Fixes https://github.com/denoland/deno/issues/26115.
We weren't normalizing the headers to lower case, so code that attempted
to delete the `Content-Length` header (but used a different case) wasn't
actually removing the header.
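A minimal sketch of the idea (illustrative only, not the actual internal code): normalize header names to lower case on every operation so lookups and deletions are case-insensitive.

```ts
// Illustrative only: a tiny case-insensitive header store.
const headers = new Map<string, string>();

function setHeader(name: string, value: string) {
  headers.set(name.toLowerCase(), value);
}

function removeHeader(name: string) {
  headers.delete(name.toLowerCase());
}

setHeader("Content-Length", "42");
removeHeader("content-length");
console.log(headers.has("content-length")); // false: the delete matched despite the different casing
```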
partially unblocks #25470
This PR aligns the resolution of the `localhost` hostname with Node.js
behavior.
In Node.js, `dns.lookup("localhost", (_, addr) => console.log(addr))`
prints the IPv6 address `::1`, but in Deno it prints the IPv4 address
`127.0.0.1`. That difference causes some errors in the work of enabling
the `createConnection` option in `http.request` (#25470). This PR fixes the
issue by aligning `dns.lookup` behavior with Node.js.
This PR also changes the following behaviors (resolving TODOs):
- `http.createServer` now listens on the IPv6 address `[::]` by default on
Linux/macOS
- `net.createServer` now listens on the IPv6 address `[::]` by default on
Linux/macOS
These changes also align with Node.js behavior.
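For reference, the behavior described above can be observed with a small snippet like this:

```ts
import dns from "node:dns";

// With this change, Deno matches Node.js and prints "::1" (IPv6)
// instead of "127.0.0.1" (IPv4).
dns.lookup("localhost", (_err, addr) => console.log(addr));
```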
`napi_call_function` should use our equivalent of `NAPI_PREAMBLE` (`&mut Env`
instead of `*mut Env`) since it needs to set error codes based on
whether the body of the function raised a JS exception.
Fixes: https://github.com/denoland/deno/issues/26282
This came up as a question on Discord, so I thought it was worth adding a
hint for it, as it might be a common pitfall.
---------
Signed-off-by: Bartek Iwańczuk <biwanczuk@gmail.com>
Co-authored-by: David Sherret <dsherret@users.noreply.github.com>
Fixes https://github.com/denoland/deno/issues/26119.
Originally I wanted to put them in `package.json` if there's no `deno.json`,
but on second thought it makes more sense to just create a `deno.json`.
Fixes https://github.com/denoland/deno/issues/26177
The significant delay was caused by Nagle's algorithm + delayed ACKs in
Linux kernels. Here's the [kernel
patch](https://lwn.net/Articles/502585/) which added the 40ms
`tcp_default_delack_min`.
```
$ deno run -A pg-bench.mjs # main
Tue Oct 15 2024 12:27:22 GMT+0530 (India Standard Time): 42ms
$ target/release/deno run -A pg-bench.mjs # this patch
Tue Oct 15 2024 12:28:02 GMT+0530 (India Standard Time): 1ms
```
```js
import { Buffer } from "node:buffer";
import pg from 'pg'
const { Client } = pg

const client = new Client({
  connectionString: 'postgresql://postgres:postgres@127.0.0.1:5432/postgres'
})
await client.connect()

async function fetch() {
  const startPerf = performance.now();
  const res = await client.query(`select
    $1::int as int,
    $2 as string,
    $3::timestamp with time zone as timestamp,
    $4 as null,
    $5::bool as boolean,
    $6::bytea as bytea,
    $7::jsonb as json
  `, [
    1337,
    'wat',
    new Date().toISOString(),
    null,
    false,
    Buffer.from('awesome'),
    JSON.stringify([{ some: 'json' }, { array: 'object' }])
  ])
  console.log(`${new Date()}: ${Math.round(performance.now() - startPerf)}ms`)
}

for (;;) await fetch();
```
When using the `--unstable-detect-cjs` flag or adding `"unstable":
["detect-cjs"]` to a deno.json, Deno will treat a JS file as CJS if the
closest package.json contains `"type": "commonjs"` and the file is not
an ESM module (no TLA, no `import.meta`, no `import`/`export`).
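For illustration (file names and contents are hypothetical), a file that this detection applies to could look like:

```js
// package.json next to this file contains: { "type": "commonjs" }
// This file has no top-level await, no `import.meta`, and no `import`/`export`
// statements, so with detect-cjs enabled it is loaded as CommonJS.
const path = require("node:path");

module.exports = {
  joinPaths: (...parts) => path.join(...parts),
};
```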
Fixes #22995. Fixes #23000.
There were a handful of bugs here causing the hang (each with a
corresponding minimized test):
- We were canceling recv futures when `receiveMessageOnPort` was called,
but this caused the "receive loop" in the message port to exit. This was
due to the fact that `CancelHandle`s are never reset (i.e., once you
`cancel` a `CancelHandle`, it remains cancelled). That meant that after
`receiveMessageOnPort` was called, the subsequent calls to
`op_message_port_recv_message` would throw `Interrupted` exceptions, and
we would exit the loop.
The cancellation, however, isn't actually necessary.
`op_message_port_recv_message` only borrows the underlying port for long
enough to poll the receiver, so the borrow there could never overlap
with `op_message_port_recv_message_sync`.
- Calling `MessagePort.unref()` caused the "receive loop" in the message
port to exit. This was because we were setting
`messageEventListenerCount` to 0 on unref. Not only does that break the
counter when multiple `MessagePort`s are present in the same thread, but
we also exited the "receive loop" whenever the listener count was 0. I
assume this was to prevent the recv promise from keeping the event loop
open.
Instead of this, I chose to just unref the recv promise as needed to
control the event loop.
- The last bug causing the hang (which was a doozy to debug) ended up
being an unfortunate interaction between how we implement our
messageport "receive loop" and a pattern found in `npm:piscina` (which
angular uses). The gist of it is that piscina uses an atomic wait loop
along with `receiveMessageOnPort` in its worker threads, and as the
worker is getting started, the following incredibly convoluted series of
events occurs:
1. Parent sends a MessagePort `p` to worker
2. Parent sends a message `m` to the port `p`
3. Parent notifies the worker with `Atomics.notify` that a new message
is available
4. Worker receives message, adds "message" listener to port `p`
5. Adding the listener triggers `MessagePort.start()` on `p`
6. Receive loop in MessagePort.start receives the message `m`, but then
hits an await point and yields (before dispatching the "message" event)
7. Worker continues execution, starts the atomic wait loop, and
immediately receives the existing notification from the parent that a
message is available
8. Worker attempts to receive the new message `m` with
`receiveMessageOnPort`, but this returns `undefined` because the receive
loop already took the message in step 6
9. Atomic wait loop continues to next iteration, waiting for the next
message with `Atomics.wait`
10. `Atomics.wait` blocks the worker thread, which prevents the receive
loop from continuing and dispatching the "message" event for the
received message
11. The parent waits for the worker to respond to the first message, and
waits
12. The thread can't make any more progress, and the whole process hangs
The fix I've chosen here (which I don't particularly love, but it works)
is to just delay the `MessagePort.start` call until the end of the event
loop turn, so that the atomic wait loop receives the message first. This
prevents the hang.
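For reference, a stripped-down sketch of the worker-side pattern described in steps 7-10 (names are illustrative; this is not piscina's actual code):

```ts
import { receiveMessageOnPort, type MessagePort } from "node:worker_threads";

// `signal` is an Int32Array over a SharedArrayBuffer; the parent increments
// index 0 and calls Atomics.notify after posting each message to `port`.
function workerWaitLoop(port: MessagePort, signal: Int32Array) {
  for (;;) {
    const seen = Atomics.load(signal, 0);
    const received = receiveMessageOnPort(port);
    if (received !== undefined) {
      handleMessage(received.message);
      continue;
    }
    // Nothing queued: block until the parent notifies about the next message.
    // This is the point where step 10 above blocks the whole worker thread.
    Atomics.wait(signal, 0, seen);
  }
}

function handleMessage(_msg: unknown) {
  // ... process the task ...
}
```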
---
Those were the main issues causing the hang. There ended up being a few
other small bugs as well, namely `exit` being emitted multiple times,
and not patching up the message port when it's received by
`receiveMessageOnPort`.
Although using `--allow-run` without an allow list gives basically no
security, I think we should remove this warning because it gets in the
way and the only way to disable it is via `--quiet`.
Closes https://github.com/denoland/deno/issues/26012
```
========================================
failures:
"/WebCryptoAPI/wrapKey_unwrapKey/wrapKey_unwrapKey.https.any.html - Can wrap and unwrap X25519 private key keys as non-extractable using pkcs8 and AES-CTR"
final result: failed. 1 passed; 1 failed; 0 expected failure; total 2 (15646ms)
diff --git a/Users/divy/gh/deno/tests/wpt/runner/expectation.json b/var/folders/ll/ltqsx4nx72v5_r7rlsl36pgm0000gn/T/375d79f1257b05cd
index fb2063935..4449c5d15 100644
--- a/Users/divy/gh/deno/tests/wpt/runner/expectation.json
+++ b/var/folders/ll/ltqsx4nx72v5_r7rlsl36pgm0000gn/T/375d79f1257b05cd
@@ -1531,6 +1531,7 @@
},
"wrapKey_unwrapKey": {
"wrapKey_unwrapKey.https.any.html": [
+ "Can wrap and unwrap X25519 private key keys as non-extractable using pkcs8 and AES-CTR",
"Can wrap and unwrap X25519 private key keys as non-extractable using pkcs8 and AES-CBC",
"Can wrap and unwrap X25519 private key keys as non-extractable using pkcs8 and AES-GCM",
"Can wrap and unwrap X25519 private key keys as non-extractable using pkcs8 and AES-KW",
```
Fixes #25998. Fixes https://github.com/denoland/deno/issues/25928.
Originally I was just going to make this an error message instead of a
panic, but once I got to a minimal repro I felt that this really should
work.
The panic occurs when you have `nodeModulesDir: manual` (or a
package.json present), and you have an npm package with a tag in your
deno.json (see the spec test that illustrates this).
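A minimal repro of the kind described above (illustrative, not the exact spec test) might be a deno.json like:

```jsonc
{
  "nodeModulesDir": "manual",
  "imports": {
    // a version requirement with a dist-tag rather than a semver range
    "chalk": "npm:chalk@latest"
  }
}
```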
This code path only actually executes when trying to choose an
appropriate package version from `node_modules/.deno`, so we should be
able to fix it by storing some extra data at install time.
The fix proposed here is to repurpose the `.initialized` file that we
store in `node_modules` to store the tags associated with a package.
Basically, if you have a version requirement with a tag (e.g.
`npm:chalk@latest`), when we set up the node_modules folder for that
package, we store the tag (`latest`) in `.initialized`. Then, when doing
BYONM resolution, if we have a version requirement with a tag, we read
that file and check if the tag is present.
The downside is that we do more work when setting up `node_modules`. We
_could_ do this only when BYONM is enabled, but that would have the
downside of needing to re-run `deno install` when you switch from auto
-> manual, though maybe that's not a big deal.
Fixes #25861.
Previously we were attempting to match the version requirement against
the version already present in `node_modules` root, and if they didn't
match we would create a node_modules dir in the workspace member's
directory with the dependency.
Aside from the fact that this caused the panic, on second thought it
just doesn't make sense in general. We shouldn't be semver matching, as
resolution has already occurred and decided what package versions are
required. Instead, we can just compare the versions directly.
Fixes #24740.
Implements the `uv_mutex_*` and `uv_async_*` APIs.
The mutex API is implemented exactly as in libuv: a thin wrapper over the
OS's native mutex.
The async API is implemented in terms of `napi_async_work`. The napi docs
say you really shouldn't call `napi_queue_async_work` multiple times (it's
documented as undefined behavior). However, our implementation doesn't have
any issue with this, so I believe it suits our purpose here.
Previously the CLI was incorrectly reporting `React` as unused in a JSX
file that uses the "old" transform.
The LSP was already handling this correctly.
Currently we only warn once. With this PR, we keep warning about
not-run lifecycle scripts on explicit `deno install` (or cache). For `run` (and
other subcommands) we still only warn once, as we do currently.
Fixes https://github.com/denoland/deno/issues/25862.
npm only makes bin entries executable if they get linked into `.bin`, as
we did before this PR. So this PR actually deviates from npm, because
it's the only reasonable way to fix this that I can think of.
---
The reason this was broken in moment is the following:
Moment has dependencies on two typescript versions: 1.8 and 3.1
If you have two packages with conflicting bin entries (i.e. two
typescript versions which both have a bin entry `tsc`), in npm it is
non-deterministic and undefined which one will end up in `.bin`.
npm, due to implementation differences, chooses to put typescript 1.8
into the `.bin` directory, and so `node_modules/typescript/bin/tsc` ends
up getting marked executable. We, however, choose typescript 3.2, and so
we end up making `node_modules/typescript3/bin/tsc` executable.
As part of its tests, moment executes `node_modules/typescript/bin/tsc`.
Because we didn't make it executable, this fails.
Since the conflict resolution is undefined in npm, instead of trying to
match it, I think it makes more sense to just make bin entries
executable even if they aren't chosen in the case of a conflict.
This replaces `--allow-net` for import permissions and makes the
security sandbox stricter by also checking permissions for statically
analyzable imports.
By default, this has a value of
`--allow-import=deno.land:443,jsr.io:443,esm.sh:443,raw.githubusercontent.com:443,gist.githubusercontent.com:443`,
but that can be overridden by providing a different set of hosts.
Additionally, when no value is provided, import permissions are inferred
from the CLI arguments so the following works because
`fresh.deno.dev:443` will be added to the list of allowed imports:
```
deno run -A -r https://fresh.deno.dev
```
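For example, to replace the default allow list with a custom one (the hosts shown are illustrative):

```
deno run --allow-import=deno.land:443,example.com:443 main.ts
```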
---------
Co-authored-by: David Sherret <dsherret@gmail.com>
Fixes #25813.
I initially tried doing this in `deno_semver`, where it's a cleaner
change, but that caused breakage in deno in places where we don't expect
a tag (see https://github.com/denoland/deno/issues/25857).
This does not fix wildcard requirements failing to choose pre-release
versions. That's a little more involved and I'll do a separate PR.
Refactors the lifecycle scripts code to extract out the common
functionality and then uses that to provide a warning in the global
resolver.
While ideally we would still support them with the global cache, for now
a warning is at least better than the status quo (where people are
unaware why their packages aren't working).
`deno fmt --check` was broken for CSS, YAML and HTML files.
Before this PR, formatting any of these file types would return a
string, even though the contract in `cli/tools/fmt.rs` is to only return a
string if the formatting changed. This caused these files to be wrongly
flagged as badly formatted even though the diff showed nothing (because
they were in fact formatted properly).
Closes https://github.com/denoland/deno/issues/25840
Partially addresses https://github.com/denoland/deno/issues/25648.
This allows packages that use `crossws` to be installed with `deno
install`. `crossws` specifies an optional peer dependency on
`uWebSockets`, but `uWebSockets` is not on npm (it is used with `git:`
or `github:` specifiers). Previously we would error on this, now we
don't error on non-existent optional peer dependencies.
This is being done for security reasons for the time being, for Deno 2.
Details will follow after the Deno 2.0 release.
Remote import maps seem incredibly rare (only 2 usages on GitHub from
what I can tell), so we'll add this back with more permissions if
there's enough demand for it:
https://github.com/search?type=code&q=%2F%22importMap%22%3A+%22http%2F
In the meantime, use the `--import-map` flag and `"deno.importMap"`
config in the LSP for remote import maps.
Fixes #25802
The markup_fmt plugin supports some HTML-like formats, such as Angular, Jinja,
Twig, Nunjucks and Vento, that are not supported by `deno fmt`. This PR
adds support for the extensions `njk` (Nunjucks) and `vto` (Vento).
Angular doesn't have a custom extension (it uses `html` afaik), and Jinja
and Twig are template engines written in Python and PHP respectively, so
it doesn't make sense for Deno to support them.
This commit stabilizes the CSS, HTML and YAML formatters
in `deno fmt`.
It is no longer required to use any of these flags:
- `--unstable-css`
- `--unstable-html`
- `--unstable-yaml`
Or these `unstable` options in the config file:
- `fmt-css`
- `fmt-html`
- `fmt-yaml`
This commit adds better handling of terminal errors when the
`window` global is used. This global is removed in Deno 2,
and while we have lints to help with that, additional information and
hints are helpful to guide users toward working code.
Ref https://github.com/denoland/deno/issues/25797
Fixes https://github.com/denoland/deno/issues/23508
`width` and `height` are required to configure the wgpu surface because
Deno is headless and depends on the user to create a window. These options
were a non-standard extension of `GPUCanvasConfiguration#configure`.
This PR adds a required options parameter with the `width` and `height`
options to the `Deno.UnsafeWindowSurface` constructor.
```typescript
// Old, non-standard extension of GPUCanvasConfiguration
const surface = new Deno.UnsafeWindowSurface("x11", displayHandle, windowHandle);
const context = surface.getContext();
context.configure({ width: 600, height: 800, /* ... */ });
```
```typescript
// New
const surface = new Deno.UnsafeWindowSurface({
  system: "x11",
  windowHandle,
  displayHandle,
  width: 600,
  height: 800,
});
const context = surface.getContext();
context.configure({ /* ... */ });
```
This PR optimizes the case when `performance.measure()` needs to find
the `startMark` by name. It is a simple change to the `findMostRecent` function
that avoids copying and reversing the complete entries list.
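A rough sketch of the idea (illustrative only, not the actual internal code): scan the entries array from the end instead of copying and reversing it.

```ts
interface PerfEntry {
  name: string;
  startTime: number;
}

// Illustrative only: find the most recently added entry with a given name
// by walking backwards, avoiding `[...entries].reverse().find(...)`.
function findMostRecent(
  entries: PerfEntry[],
  name: string,
): PerfEntry | undefined {
  for (let i = entries.length - 1; i >= 0; i--) {
    if (entries[i].name === name) return entries[i];
  }
  return undefined;
}
```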
Adds minor missing tests for:
- `clearMarks()`, general
- `clearMeasures()`, general
- `measure()`, case when the startMark name exists more than once
### Benchmarks
#### main
```
CPU | AMD Ryzen 7 PRO 6850U with Radeon Graphics
Runtime | Deno 2.0.0-rc.4 (x86_64-unknown-linux-gnu)
benchmark time/iter (avg) iter/s (min … max) p75 p99 p995
---------------------- ----------------------------- --------------------- --------------------------
worst case measure() 2.1 ms 486.9 ( 1.7 ms … 2.4 ms) 2.2 ms 2.4 ms 2.4 ms
```
#### this PR
```
CPU | AMD Ryzen 7 PRO 6850U with Radeon Graphics
Runtime | Deno 2.0.0-rc.4 (x86_64-unknown-linux-gnu)
benchmark time/iter (avg) iter/s (min … max) p75 p99 p995
---------------------- ----------------------------- --------------------- --------------------------
worst case measure() 966.3 µs 1,035 (876.9 µs … 1.1 ms) 1.0 ms 1.1 ms 1.1 ms
```
```ts
Deno.bench("worst case measure()", (b) => {
performance.mark('start');
for (let i = 0; i < 1e5; i += 1) {
performance.mark(crypto.randomUUID());
}
b.start();
performance.measure('total', 'start');
b.end();
performance.clearMarks();
performance.clearMeasures();
});
```
Fixes rsbuild running in Deno.
You can look at the test to see what was failing; the gist is that we
were trying to statically analyze the re-exports of a CJS script, and if
we couldn't find the source for the re-exported file we would fail.
Instead, we should just treat these as if they were too dynamic to
analyze, and let them fail (or succeed) at runtime. This aligns with
Node's behavior.
Aligns the error messages in `ext/http` and a few messages in the
`ext/fetch` folder to be in line with the Deno style guide.
This changeset also removes some unnecessary checks in `00_serve.ts`.
These options were recently removed, so it doesn't make sense to check
for them anymore.
https://github.com/denoland/deno/issues/25269
This commit lets the `deno test --doc` command actually evaluate code snippets in
JSDoc and markdown files.
## How it works
1. Extract code snippets from JSDoc or code fences
2. Convert them into pseudo files by wrapping them in `Deno.test(...)`
3. Register the pseudo files as in-memory files
4. Run type-check and evaluation
We apply some magic at step 2 - let's say we have the following file named
`mod.ts` as input:
````ts
/**
* ```ts
* import { assertEquals } from "jsr:@std/assert/equals";
*
* assertEquals(add(1, 2), 3);
* ```
*/
export function add(a: number, b: number) {
  return a + b;
}
````
This is virtually transformed into:
```ts
import { assertEquals } from "jsr:@std/assert/equals";
import { add } from "files:///path/to/mod.ts";
Deno.test("mod.ts$2-7.ts", async () => {
assertEquals(add(1, 2), 3);
});
```
Note that a new import statement is inserted here to make the `add` function
available. In a nutshell, all items exported from `mod.ts` become available in
the generated pseudo file with this automatic import insertion.
The intention behind this design is that, from a library user's standpoint, it
should be very obvious that this `add` function is what this example code is
attached to. Also, if there is an explicit import statement like
`import { add } from "./mod.ts"`, the import path `./mod.ts` is not helpful for
doc readers because they will need to import it in a different way.
The automatic import insertion has some edge cases, in particular where there is
a local variable in a snippet with the same name as one of the exported items.
This case is addressed by employing swc's scope analysis (see test cases for
more details).
## "type-checking only" mode stays around
This change will likely impact a lot of existing doc tests in the ecosystem
because some doc tests rely on the fact that they are not evaluated - some cause
side effects if executed, some throw errors at runtime although they do pass the
type check, etc. To help those tests gradually transition to the ones runnable
with the new `deno test --doc`, we will keep providing the ability to run
type-checking only via `deno check --doc`. Additionally there is a `--doc-only`
option added to the `check` subcommand too, which is useful when you want to
type-check on code snippets in markdown files, as normal `deno check` command
doesn't accept markdown.
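For example (file names are illustrative):

```
deno test --doc mod.ts           # type-check and evaluate snippets in JSDoc
deno check --doc mod.ts          # type-check the module and its snippets only
deno check --doc-only README.md  # type-check snippets in a markdown file
```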
## Demo
https://github.com/user-attachments/assets/47e9af73-d16e-472d-b09e-1853b9e8f5ce
---
Closes #4716