It's unnecessary indirection, and it prevents easily passing isolate
references into the dispatch and dyn_import closures.
Note: this changes how StartupData::Script is executed. It's no longer done
during Isolate::new() but rather lazily on first poll or execution.
This patch makes it so that RecursiveLoad no longer owns the Isolate,
which means Worker::execute_mod_async no longer consumes the Worker.
Previously Worker implemented Loader; now ThreadSafeState does.
This is necessary preparation work for dynamic import (#1789) and import
maps (#1921).
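Roughly, the new shape looks like the minimal Rust sketch below; the
Loader methods and types here are simplified placeholders, not the
actual deno signatures (the real trait is async/futures-based):

```rust
// Sketch only: simplified stand-ins for deno's Loader/ThreadSafeState.
trait Loader {
    fn resolve(&self, specifier: &str, referrer: &str) -> String;
    fn load(&self, url: &str) -> Result<String, String>;
}

#[derive(Clone)]
struct ThreadSafeState;

// Loader is implemented on the cheaply clonable state object instead of
// on Worker itself...
impl Loader for ThreadSafeState {
    fn resolve(&self, specifier: &str, referrer: &str) -> String {
        format!("{}/{}", referrer.trim_end_matches('/'), specifier)
    }
    fn load(&self, url: &str) -> Result<String, String> {
        Ok(format!("// source of {}", url))
    }
}

struct Worker {
    state: ThreadSafeState,
}

impl Worker {
    // ...so module loading borrows a clone of the state rather than the
    // whole Worker, and execute_mod_async no longer takes `self` by value.
    fn execute_mod_async(&mut self, url: &str) -> Result<(), String> {
        let loader = self.state.clone();
        let resolved = loader.resolve(url, ".");
        let _source = loader.load(&resolved)?;
        Ok(())
    }
}

fn main() {
    let mut worker = Worker { state: ThreadSafeState };
    worker.execute_mod_async("mod.ts").unwrap();
}
```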
* Compiler no longer has its own Tokio runtime; it handles one message
and then exits.
* Uses the simpler ts.CompilerHost interface instead of
ts.LanguageServiceHost.
* Avoids recompiling the same module by introducing a hacky but simple
`hashset<string>` that stores the names of modules that have already
been compiled.
* Removes the CompilerConfig op.
* Removes a lot of the mocking stuff in compiler.ts like `this._ts`. It
is not useful as we don't even have tests.
* Turns off checkJs because it causes fmt_test to die with OOM.
This patch provides a work-around for an apparent V8 bug where
initializing multiple isolates concurrently leads to a crash on
Windows.
At the time of writing, the cause of this crash is not exactly
understood, but it seems to be related to the V8-internal
function win64_unwindinfo::RegisterNonABICompliantCodeRange(),
which didn't exist in older versions of V8.
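A minimal Rust sketch of the general shape of such a workaround,
assuming it serializes isolate creation behind a process-wide lock (the
names are illustrative, not the actual libdeno code):

```rust
use std::sync::Mutex;

// Illustrative only: a process-wide lock taken around isolate creation
// so that two threads never initialize V8 isolates at the same time,
// which is the situation that triggers the crash.
static ISOLATE_INIT_LOCK: Mutex<()> = Mutex::new(());

fn new_isolate_serialized() {
    let _guard = ISOLATE_INIT_LOCK.lock().unwrap();
    // ... call into libdeno/V8 to create the isolate while the lock is
    // held; the guard is dropped when this function returns, letting
    // the next thread proceed.
}

fn main() {
    new_isolate_serialized();
}
```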
* In order to prevent ArrayBuffers from getting garbage collected by V8,
we used to store a v8::Persistent<ArrayBuffer> in a map. This patch
introduces a custom ArrayBuffer allocator which doesn't use Persistent
handles, but instead stores a pointer to the actual ArrayBuffer data
alongside a reference count. Since creating Persistent handles
has quite a bit of overhead, this change significantly increases
performance. Various HTTP server benchmarks report about 5-10% more
requests per second than before.
* Previously the Persistent handle that prevented garbage collection had
to be released manually, and this wasn't always done, which was
causing memory leaks. This has been resolved by introducing a new
`PinnedBuf` type in both Rust and C++ that automatically re-enables
garbage collection when it goes out of scope.
* Zero-copy buffers are now correctly wrapped in an Option when there is
a possibility that they're not present. This clears up a correctness
issue where we were creating zero-length slices from a null pointer,
which is undefined behavior. (See the sketch after this list.)
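A minimal Rust sketch of the ownership pattern described above, with a
hypothetical BackingStore standing in for the refcounted storage the
custom allocator manages (the real PinnedBuf wraps a raw data pointer
handed over from the C++ side):

```rust
use std::ops::Deref;
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::Arc;

// Hypothetical stand-in for the allocator-side storage: the data plus
// the reference count that keeps the ArrayBuffer's backing memory alive
// while Rust holds a pin on it.
struct BackingStore {
    data: Vec<u8>,
    pins: AtomicUsize,
}

// Sketch of PinnedBuf: taking a pin on construction and releasing it in
// Drop means a forgotten manual "release" call can no longer leak.
struct PinnedBuf {
    store: Arc<BackingStore>,
}

impl PinnedBuf {
    fn new(store: Arc<BackingStore>) -> PinnedBuf {
        store.pins.fetch_add(1, Ordering::SeqCst);
        PinnedBuf { store }
    }
}

impl Deref for PinnedBuf {
    type Target = [u8];
    fn deref(&self) -> &[u8] {
        &self.store.data
    }
}

impl Drop for PinnedBuf {
    fn drop(&mut self) {
        // In the real implementation this is the point where garbage
        // collection of the ArrayBuffer is re-enabled.
        self.store.pins.fetch_sub(1, Ordering::SeqCst);
    }
}

fn main() {
    let store = Arc::new(BackingStore {
        data: vec![1, 2, 3],
        pins: AtomicUsize::new(0),
    });

    // Ops that may not receive a buffer take Option<PinnedBuf> instead
    // of a possibly-null, zero-length slice.
    let zero_copy: Option<PinnedBuf> = Some(PinnedBuf::new(store.clone()));
    if let Some(buf) = &zero_copy {
        assert_eq!(&buf[..], &[1, 2, 3]);
    }
    drop(zero_copy); // the pin is released here
    assert_eq!(store.pins.load(Ordering::SeqCst), 0);
}
```

The important property is that the release happens in Drop, so every
code path that stops using the buffer re-enables collection
automatically.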
Ops are now dynamically dispatched, which is slightly less efficient.
The immeasurable perf hit is a reasonable trade for the API simplicity
gained here.
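For illustration, a minimal sketch of what the dynamically dispatched
path looks like, with a made-up dispatcher signature (the real deno op
signature takes control and zero-copy buffers and can return a future):

```rust
// Illustrative signature only, not the actual deno dispatcher type.
type OpDispatcher = dyn Fn(&[u8]) -> Vec<u8> + Send + Sync;

struct Isolate {
    dispatcher: Box<OpDispatcher>,
}

impl Isolate {
    fn new(dispatcher: Box<OpDispatcher>) -> Isolate {
        Isolate { dispatcher }
    }

    fn dispatch(&self, control: &[u8]) -> Vec<u8> {
        // One virtual call through the trait object per op: this pointer
        // indirection is the cost being traded for a simpler API.
        (self.dispatcher)(control)
    }
}

fn main() {
    let isolate = Isolate::new(Box::new(|control: &[u8]| control.to_vec()));
    assert_eq!(isolate.dispatch(&[1, 2, 3]), vec![1, 2, 3]);
}
```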