The map field has been empty for years now, and we don't want the emit
file to be exposed, since keeping it internal allows us to iterate on
making the cache faster. Additionally, relying on this information is
racy/unreliable. Instead, people should emit the TS files themselves
using tools like deno_emit, typescript, esbuild, etc.
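For example, a minimal sketch of doing the emit yourself with deno_emit (this assumes deno_emit's `transpile` API; pin a version in real usage):
```ts
// Transpile a local module graph with deno_emit. `transpile` returns a
// map of module URL -> emitted JavaScript.
import { transpile } from "https://deno.land/x/emit/mod.ts";

const url = new URL("./mod.ts", import.meta.url);
const result = await transpile(url);
console.log(result.get(url.href));
```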
Closes https://github.com/denoland/deno/issues/17703
This allows using npm deps of JSR deps without having to add them to the
root package.json.
It works by taking the package requirement and scanning the
`node_modules/.deno` directory for the best matching package, so it
relies on Deno's node_modules structure.
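Roughly, the lookup works like this TypeScript sketch (a hypothetical helper, not the actual Rust implementation; it assumes `.deno` folder names of the form `<name>@<version>`, optionally with a `_<peers-hash>` suffix, and ignores the encoding of scoped package names):
```ts
import {
  format,
  maxSatisfying,
  parse,
  parseRange,
  type SemVer,
} from "jsr:@std/semver";

// Scan node_modules/.deno for installed copies of `name` and return the
// path of the best version satisfying the requirement, if any.
function findBestMatch(name: string, req: string): string | undefined {
  const versions: SemVer[] = [];
  for (const entry of Deno.readDirSync("node_modules/.deno")) {
    if (!entry.isDirectory || !entry.name.startsWith(`${name}@`)) continue;
    // strip a possible peer-dependency suffix after "_"
    const raw = entry.name.slice(name.length + 1).split("_")[0];
    try {
      versions.push(parse(raw));
    } catch {
      // folder name isn't a plain semver version; skip it
    }
  }
  const best = maxSatisfying(versions, parseRange(req));
  return best ? `node_modules/.deno/${name}@${format(best)}` : undefined;
}

// e.g. findBestMatch("chalk", "^5.0.0")
```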
Additionally, to make the transition from package.json to deno.json
easier, Deno now:
1. Installs npm deps listed in a deno.json at the same time as
installing npm deps from a package.json.
2. Uses the alias in the import map as the `node_modules/<alias>` folder
name for better package.json compatibility.
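For illustration (the package name is hypothetical), a deno.json entry like the following now behaves like the equivalent package.json dependency, with the alias doubling as the node_modules folder name:
```jsonc
{
  "imports": {
    // "chalk" is the alias: it resolves the bare specifier and also
    // names the node_modules/chalk folder
    "chalk": "npm:chalk@^5.0.0"
  }
}
```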
Stores normalized version constraints in the lockfile, which improves
reproducibility and fixes a bug where duplicate specifiers ended up in
the lockfile. It also gets rid of some duplicate data in the specifiers
area of the lockfile.
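The idea, sketched with a generic semver library rather than the actual lockfile code: equivalent constraints parse to one range and serialize back to one canonical string, so two spellings of the same requirement can no longer produce two lockfile entries.
```ts
import { formatRange, parseRange } from "jsr:@std/semver";

// Normalize a user-written constraint to a canonical string form.
const normalize = (req: string) => formatRange(parseRange(req));

// "1.x" and "^1.0.0" describe the same range, so after normalization
// they collapse into a single lockfile key.
console.log(normalize("1.x") === normalize("^1.0.0")); // true
```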
Adds much better support for the unstable Deno workspaces, as well as
support for npm workspaces. npm workspace support is still limited in
that we only install packages into the root node_modules folder. We'll
make it smarter over time so it can figure out when to add node_modules
folders within packages.
This includes a breaking change in config file resolution: we now stop
searching for config files at the first package.json found, unless it is
part of a workspace. To get the previous behaviour, the root deno.json
needs to be updated to be a workspace by adding `"workspace":
["./path-to-pkg-json-folder-goes-here"]`. See details in
https://github.com/denoland/deno_config/pull/66
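For example (keeping the placeholder path from above):
```jsonc
// root deno.json
{
  "workspace": ["./path-to-pkg-json-folder-goes-here"]
}
```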
Closes #24340
Closes #24159
Closes #24161
Closes #22020
Closes #18546
Closes #16106
Closes #24160
VS Code will typically send a `textDocument/semanticTokens/full` request
followed by `textDocument/semanticTokens/range`, and occasionally
requests semantic tokens even when we know nothing has changed. Semantic
tokens also get refreshed on each change. Computing semantic tokens is
relatively heavy in TSC, so we should avoid it as much as possible.
This caches the semantic tokens for open documents to avoid making TSC
do unnecessary work. It results in a noticeable improvement in local
benchmarking.
before:
```
Starting Deno benchmark
-> Start benchmarking lsp
- Simple Startup/Shutdown
(10 runs, mean: 383ms)
- Big Document/Several Edits
(5 runs, mean: 1079ms)
- Find/Replace
(10 runs, mean: 59ms)
- Code Lens
(10 runs, mean: 440ms)
- deco-cx/apps Multiple Edits + Navigation
(5 runs, mean: 9921ms)
<- End benchmarking lsp
```
after:
```
Starting Deno benchmark
-> Start benchmarking lsp
- Simple Startup/Shutdown
(10 runs, mean: 395ms)
- Big Document/Several Edits
(5 runs, mean: 1024ms)
- Find/Replace
(10 runs, mean: 56ms)
- Code Lens
(10 runs, mean: 438ms)
- deco-cx/apps Multiple Edits + Navigation
(5 runs, mean: 8927ms)
<- End benchmarking lsp
```
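The caching idea, as a rough TypeScript sketch (the actual implementation is in the Rust LSP code; the names here are hypothetical): reuse the last computed tokens while the document version is unchanged, and recompute only after an edit.
```ts
type Tokens = number[];

// Cache semantic tokens per document URI, keyed by document version, so
// repeated full/range requests for an unchanged document skip TSC.
class SemanticTokensCache {
  #entries = new Map<string, { version: number; tokens: Tokens }>();

  get(uri: string, version: number): Tokens | undefined {
    const entry = this.#entries.get(uri);
    return entry?.version === version ? entry.tokens : undefined;
  }

  set(uri: string, version: number, tokens: Tokens): void {
    this.#entries.set(uri, { version, tokens });
  }
}
```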
Moves sloppy import resolution from the loader to the resolver.
Also adds some test helper functions to make the LSP tests less verbose.
---------
Co-authored-by: David Sherret <dsherret@gmail.com>
This PR enables the V8 code cache for ES modules and for `require`
scripts through `op_eval_context`. Code cache artifacts are
transparently stored and fetched using an SQLite database and are passed
to V8. `--no-code-cache` can be used to disable it.
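Code caching is on by default; it can be disabled per invocation with the new flag (the script name here is just an example):
```
deno run --no-code-cache main.ts
```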
---------
Co-authored-by: Bartek Iwańczuk <biwanczuk@gmail.com>
I'm running into a node resolution bug in the LSP only, and while
tracking it down I noticed this one.
Fixed by moving the project version out of `Documents`.
Previously we locked the entire `FileSystemDocuments` even for lookups,
causing contention. This was particularly bad because some of the hot
ops (namely `op_resolve`) can end up hitting that lock under contention.
This PR replaces the mutex with synchronization internal to
`FileSystemDocuments` (an `AtomicBool` for the dirty flag, and then a
`DashMap` for the actual documents).
I need to think a bit more about whether or not this introduces any
problematic race conditions.