mirror of https://github.com/denoland/deno.git synced 2024-11-24 15:19:26 -05:00

chore: fix typos (#19572)

Martin Fischer 2023-06-26 15:10:27 +02:00, committed by GitHub
parent ad3c494b46
commit 801b9ec62d
83 changed files with 169 additions and 168 deletions


@@ -66,7 +66,7 @@ https://github.com/denoland/deno_install
   (#19355)
 - perf(ext/http): Use flat list of headers for multiple set/get methods (#19336)
 - perf(ext/websocket): Make send sync for non-stream websockets (#19376)
-- perf(ext/websocket): Reduce GC pressure & monomorpize op_ws_next_event
+- perf(ext/websocket): Reduce GC pressure & monomorphize op_ws_next_event
   (#19405)
 - perf(ext/websocket): monomorphize code (#19394)
 - perf(http): avoid flattening http headers (#19384)
@@ -217,7 +217,7 @@ https://github.com/denoland/deno_install
 - feat(bench): add `--no-run` flag (#18433)
 - feat(cli): don't check permissions for statically analyzable dynamic imports
   (#18713)
-- feat(cli): flatten deno.json configuaration (#17799)
+- feat(cli): flatten deno.json configuration (#17799)
 - feat(ext/ffi): support marking symbols as optional (#18529)
 - feat(ext/http): Rework Deno.serve using hyper 1.0-rc3 (#18619)
 - feat(ext/kv): add more atomic operation helpers (#18854)
@@ -431,7 +431,7 @@ https://github.com/denoland/deno_install
 - fix(npm): improve peer dependency resolution with circular dependencies
   (#18069)
 - fix(prompt): better output with control chars (#18108)
-- fix(rumtime): Add `Deno.` prefix for registered symbols (#18086)
+- fix(runtime): Add `Deno.` prefix for registered symbols (#18086)
 - fix(runtime/windows): ensure `Deno.stdin.setRaw(false)` properly disables raw
   mode (#17983)
 - fix: Split extension registration and snapshotting (#18098)
@@ -609,7 +609,7 @@ https://github.com/denoland/deno_install
 - feat(core): Reland support for async ops in realms (#17204)
 - fix(cli/fmt): show filepath for InvalidData error (#17361)
-- fix(core): Add `Generator` and `AsyncGenerator` to promordials (#17241)
+- fix(core): Add `Generator` and `AsyncGenerator` to primordials (#17241)
 - fix(ext/fetch) Fix request clone error in flash server (#16174)
 - fix(ext/fetch): remove Response.trailer from types (#17284)
 - fix(ext/ffi): use SafeMap in getTypeSizeAndAlignment (#17305)
@@ -690,7 +690,7 @@ https://github.com/denoland/deno_install
 - fix(lsp): "Add all missing imports" uses correct specifiers (#17216)
 - fix(lsp): completions for private variables (#17220)
 - fix(lsp): don't error if completionItem/resolve request fails (#17250)
-- fix(lsp): less agressive completion triggers (#17225)
+- fix(lsp): less aggressive completion triggers (#17225)
 - fix(lsp/format): language formatter used should be based on language id
   (#17148)
 - fix(lsp/testing): fallback name for non-analyzable tests in collector (#17120)
@@ -1242,7 +1242,7 @@ https://github.com/denoland/deno_install
 - feat(unstable): initial support for npm specifiers (#15484)
 - feat: `queueMicrotask()` error handling (#15522)
 - feat: add "deno init" subcommand (#15469)
-- fix(cache): do not attempt to emit non-emitable files (#15562)
+- fix(cache): do not attempt to emit non-emittable files (#15562)
 - fix(core/runtime): always cancel termination in exception handling (#15514)
 - fix(coverage): ensure coverage is only collected in certain situations
   (#15467)
@@ -1446,7 +1446,7 @@ https://github.com/denoland/deno_install
 - feat(web): enable deflate-raw compression format (#14863)
 - fix(check): use "moduleDetection": "force" (#14875)
 - fix(cli): add config flag to `deno info` (#14706)
-- fix(console): constrol inspect() indent with option (#14867)
+- fix(console): control inspect() indent with option (#14867)
 - fix(url): properly indent when inspecting URLs (#14867)
 - upgrade: v8 10.4.132.5 (#14874)
@@ -2008,7 +2008,7 @@ Note 1.20.0 was dead on arrival, see https://github.com/denoland/deno/pull/13993
 - feat(core): update to V8 9.7 (#12685)
 - fix(cli): do not cache emit when diagnostics present (#12541)
 - fix(cli): don't panic when mapping unknown errors (#12659)
-- fix(cli): lint/format all discoverd files on each change (#12518)
+- fix(cli): lint/format all discovered files on each change (#12518)
 - fix(cli): linter/formater watches current directory without args (#12550)
 - fix(cli): no-check respects inlineSources compiler option (#12559)
 - fix(cli/upgrade): nice error when unzip is missing (#12693)
@@ -2236,7 +2236,7 @@ Note 1.20.0 was dead on arrival, see https://github.com/denoland/deno/pull/13993
 - feat(lsp): ignore specific lint for entire file (#12023)
 - feat(unstable): Add file locking APIs (#11746)
 - feat(unstable): Support file URLs in Deno.dlopen() (#11658)
-- feat(unstable): allow specifing gid and uid for subprocess (#11586)
+- feat(unstable): allow specifying gid and uid for subprocess (#11586)
 - feat(workers): Make the `Deno` namespace configurable and unfrozen (#11888)
 - feat: ArrayBuffer in structured clone transfer (#11840)
 - feat: add URLPattern API (#11941)
@@ -4257,7 +4257,7 @@ Read more about this release at https://deno.land/v1
 - feat: Deno.test() sanitizes ops and resources (#4399)
 - feat: Fetch should accept a FormData body (#4363)
 - feat: First pass at "deno upgrade" (#4328)
-- feat: Prvode way to build Deno without building V8 from source (#4412)
+- feat: Provide way to build Deno without building V8 from source (#4412)
 - feat: Remove `Object.prototype.__proto__` (#4341)
 - fix(std/http): Close open connections on server close (#3679)
 - fix(std/http): Properly await ops in a server test (#4436)
@@ -5301,7 +5301,7 @@ details.
 - Upgrade V8 to 7.2.502.16 (#1403)
 - make stdout unbuffered (#1355)
 - Implement `Body.formData` for fetch (#1393)
-- Improve handling of non-coercable objects in assertEqual (#1385)
+- Improve handling of non-coercible objects in assertEqual (#1385)
 - Avoid fetch segfault on empty Uri (#1394)
 - Expose deno.inspect (#1378)
 - Add illegal header name and value guards (#1375)


@@ -26,7 +26,7 @@ macro_rules! bench_or_profile {
   )+

   if $crate::is_profiling() {
-    // Run profling
+    // Run profiling
     $crate::run_profiles(&test_opts, benches);
   } else {
     // Run benches


@@ -64,7 +64,7 @@ impl Emitter {
   }

   /// Gets a cached emit if the source matches the hash found in the cache.
-  pub fn maybed_cached_emit(
+  pub fn maybe_cached_emit(
     &self,
     specifier: &ModuleSpecifier,
     source: &str,
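The renamed `maybe_cached_emit` returns a cached emit only when the current source still matches the hash recorded at cache time. A minimal Python sketch of that idea (the `EmitCache` class and its method names are illustrative, not Deno's actual data structures):

```python
import hashlib


class EmitCache:
    """Illustrative hash-checked emit cache (not Deno's real implementation)."""

    def __init__(self):
        # specifier -> (hash of the source at emit time, emitted code)
        self._store = {}

    def set_emit(self, specifier: str, source: str, emitted: str) -> None:
        digest = hashlib.sha256(source.encode()).hexdigest()
        self._store[specifier] = (digest, emitted)

    def maybe_cached_emit(self, specifier: str, source: str):
        # Only return the cached emit if the current source still hashes to
        # the value recorded when it was cached; otherwise it is stale.
        entry = self._store.get(specifier)
        if entry is None:
            return None
        cached_hash, emitted = entry
        if cached_hash != hashlib.sha256(source.encode()).hexdigest():
            return None
        return emitted


cache = EmitCache()
cache.set_emit("file:///a.ts", "let x: number = 1;", "var x = 1;")
print(cache.maybe_cached_emit("file:///a.ts", "let x: number = 1;"))  # var x = 1;
print(cache.maybe_cached_emit("file:///a.ts", "let x: number = 2;"))  # None
```

The hash check is what makes emitting "non-emittable" or edited files safe to skip: a stale cache entry simply misses.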


@@ -1808,7 +1808,7 @@ impl Iterator for PreloadDocumentFinder {
     }
   }

-/// Removes any directorys that are a descendant of another directory in the collection.
+/// Removes any directories that are a descendant of another directory in the collection.
 fn sort_and_remove_non_leaf_dirs(mut dirs: Vec<PathBuf>) -> Vec<PathBuf> {
   if dirs.is_empty() {
     return dirs;
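The doc comment above describes the pruning rule: any directory that is a descendant of another directory in the collection is dropped. A rough Python equivalent of that behavior, inferred only from the doc comment (the helper name and semantics here are assumptions, not Deno's code):

```python
from pathlib import PurePosixPath


def sort_and_remove_descendant_dirs(dirs):
    # Sort so ancestors come before their descendants, then keep a
    # directory only if no already-kept directory is one of its ancestors.
    kept = []
    for d in sorted(PurePosixPath(p) for p in dirs):
        if not any(k == d or k in d.parents for k in kept):
            kept.append(d)
    return [str(d) for d in kept]


print(sort_and_remove_descendant_dirs(["/a/b", "/a", "/c", "/c/d/e"]))  # ['/a', '/c']
```

Sorting first means each candidate only needs to be checked against directories already kept, since an ancestor always sorts before its descendants.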


@@ -1374,7 +1374,7 @@ impl Inner {
   }

   self.recreate_npm_services_if_necessary().await;
-  self.assets.intitialize(self.snapshot()).await;
+  self.assets.initialize(self.snapshot()).await;
   self.performance.measure(mark);

   Ok(InitializeResult {


@@ -626,7 +626,7 @@ pub fn tokens_to_regex(
       route.push('$');
     }
   } else {
-    let is_end_deliminated = match maybe_end_token {
+    let is_end_delimited = match maybe_end_token {
       Some(Token::String(mut s)) => {
         if let Some(c) = s.pop() {
           delimiter.contains(c)
@@ -642,7 +642,7 @@ pub fn tokens_to_regex(
       write!(route, r"(?:{delimiter}(?={ends_with}))?").unwrap();
     }

-    if !is_end_deliminated {
+    if !is_end_delimited {
       write!(route, r"(?={delimiter}|{ends_with})").unwrap();
     }
   }


@@ -74,7 +74,7 @@ fn base_url(url: &Url) -> String {
 }

 #[derive(Debug)]
-enum CompletorType {
+enum CompletionType {
   Literal(String),
   Key {
     key: Key,
@@ -85,25 +85,25 @@ enum CompletorType {
 /// Determine if a completion at a given offset is a string literal or a key/
 /// variable.
-fn get_completor_type(
+fn get_completion_type(
   offset: usize,
   tokens: &[Token],
   match_result: &MatchResult,
-) -> Option<CompletorType> {
+) -> Option<CompletionType> {
   let mut len = 0_usize;
   for (index, token) in tokens.iter().enumerate() {
     match token {
       Token::String(s) => {
         len += s.chars().count();
         if offset < len {
-          return Some(CompletorType::Literal(s.clone()));
+          return Some(CompletionType::Literal(s.clone()));
         }
       }
       Token::Key(k) => {
         if let Some(prefix) = &k.prefix {
           len += prefix.chars().count();
           if offset < len {
-            return Some(CompletorType::Key {
+            return Some(CompletionType::Key {
               key: k.clone(),
               prefix: Some(prefix.clone()),
               index,
@@ -120,7 +120,7 @@ fn get_completor_type(
       .unwrap_or_default();
     len += value.chars().count();
     if offset <= len {
-      return Some(CompletorType::Key {
+      return Some(CompletionType::Key {
         key: k.clone(),
         prefix: None,
         index,
@@ -130,7 +130,7 @@ fn get_completor_type(
     if let Some(suffix) = &k.suffix {
       len += suffix.chars().count();
       if offset <= len {
-        return Some(CompletorType::Literal(suffix.clone()));
+        return Some(CompletionType::Literal(suffix.clone()));
       }
     }
   }
@@ -688,17 +688,17 @@ impl ModuleRegistry {
       .ok()?;
     if let Some(match_result) = matcher.matches(path) {
       did_match = true;
-      let completor_type =
-        get_completor_type(path_offset, &tokens, &match_result);
-      match completor_type {
-        Some(CompletorType::Literal(s)) => self.complete_literal(
+      let completion_type =
+        get_completion_type(path_offset, &tokens, &match_result);
+      match completion_type {
+        Some(CompletionType::Literal(s)) => self.complete_literal(
           s,
           &mut completions,
           current_specifier,
           offset,
           range,
         ),
-        Some(CompletorType::Key { key, prefix, index }) => {
+        Some(CompletionType::Key { key, prefix, index }) => {
           let maybe_url = registry.get_url_for_key(&key);
           if let Some(url) = maybe_url {
             if let Some(items) = self


@@ -650,7 +650,7 @@ impl Assets {
   }

   /// Initializes with the assets in the isolate.
-  pub async fn intitialize(&self, state_snapshot: Arc<StateSnapshot>) {
+  pub async fn initialize(&self, state_snapshot: Arc<StateSnapshot>) {
     let assets = get_isolate_assets(&self.ts_server, state_snapshot).await;
     let mut assets_map = self.assets.lock();
     for asset in assets {
@@ -4737,7 +4737,7 @@ mod tests {
   }

   #[test]
-  fn include_supress_inlay_hit_settings() {
+  fn include_suppress_inlay_hit_settings() {
     let mut settings = WorkspaceSettings::default();
     settings
       .inlay_hints


@@ -3,7 +3,7 @@
 This directory contains source for Deno's Node-API implementation. It depends on
 `napi_sym` and `deno_napi`.

-- [`async.rs`](./async.rs) - Asyncronous work related functions.
+- [`async.rs`](./async.rs) - Asynchronous work related functions.
 - [`env.rs`](./env.rs) - Environment related functions.
 - [`js_native_api.rs`](./js_native_api.rs) - V8/JS related functions.
 - [`thread_safe_function.rs`](./threadsafe_functions.rs) - Thread safe function


@@ -136,7 +136,7 @@ fn napi_create_threadsafe_function(
   _async_resource_name: napi_value,
   _max_queue_size: usize,
   initial_thread_count: usize,
-  thread_finialize_data: *mut c_void,
+  thread_finalize_data: *mut c_void,
   thread_finalize_cb: Option<napi_finalize>,
   context: *mut c_void,
   maybe_call_js_cb: Option<napi_threadsafe_function_call_js>,
@@ -168,7 +168,7 @@ fn napi_create_threadsafe_function(
   thread_counter: initial_thread_count,
   sender: env_ref.async_work_sender.clone(),
   finalizer: thread_finalize_cb,
-  finalizer_data: thread_finialize_data,
+  finalizer_data: thread_finalize_data,
   tsfn_sender: env_ref.threadsafe_function_sender.clone(),
   ref_counter: Arc::new(AtomicUsize::new(1)),
   env,


@@ -463,7 +463,7 @@ impl NpmCache {
 }

 pub fn mixed_case_package_name_encode(name: &str) -> String {
-  // use base32 encoding because it's reversable and the character set
+  // use base32 encoding because it's reversible and the character set
   // only includes the characters within 0-9 and A-Z so it can be lower cased
   base32::encode(
     base32::Alphabet::RFC4648 { padding: false },
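The comment explains why base32 suits mixed-case npm package names: the encoding is reversible, and its output alphabet (A–Z, 2–7) survives lower-casing on case-insensitive filesystems. A small Python sketch of the same scheme, using the stdlib's RFC 4648 base32 (the function names mirror the Rust ones but are illustrative):

```python
import base64


def mixed_case_package_name_encode(name: str) -> str:
    # RFC 4648 base32 without padding, lower-cased: safe on
    # case-insensitive filesystems and still fully reversible.
    return base64.b32encode(name.encode()).decode().rstrip("=").lower()


def mixed_case_package_name_decode(encoded: str) -> str:
    # Restore padding and upper-case before decoding.
    padded = encoded.upper() + "=" * (-len(encoded) % 8)
    return base64.b32decode(padded).decode()


name = "JSONStream"
encoded = mixed_case_package_name_encode(name)
assert encoded == encoded.lower()                        # lower-casing is a no-op
assert mixed_case_package_name_decode(encoded) == name   # round-trips exactly
```

Because no two distinct names share an encoding even after lower-casing, the cache directory name remains unambiguous.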


@@ -1325,7 +1325,7 @@ fn lsp_hover_change_mbc() {
   "end": {
     "line": 1,
     // the LSP uses utf16 encoded characters indexes, so
-    // after the deno emoiji is character index 15
+    // after the deno emoji is character index 15
     "character": 15
   }
 },
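The comment above hinges on LSP positions being counted in UTF-16 code units, so an astral-plane character like the deno emoji occupies two units even though it is one code point. A quick Python illustration of that counting:

```python
def utf16_units(s: str) -> int:
    # Each UTF-16 code unit is two bytes in the utf-16-le encoding,
    # which is how LSP positions are counted by default.
    return len(s.encode("utf-16-le")) // 2


emoji = "\N{SAUROPOD}"          # the deno dinosaur, an astral-plane character
assert len(emoji) == 1          # one Unicode code point
assert utf16_units(emoji) == 2  # but two UTF-16 code units
```

This is why a character that visually follows the emoji lands at index 15 rather than 14 in the test fixture.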
@@ -2088,7 +2088,7 @@ fn lsp_document_symbol() {
   "uri": "file:///a/file.ts",
   "languageId": "typescript",
   "version": 1,
-  "text": "interface IFoo {\n foo(): boolean;\n}\n\nclass Bar implements IFoo {\n constructor(public x: number) { }\n foo() { return true; }\n /** @deprecated */\n baz() { return false; }\n get value(): number { return 0; }\n set value(newVavlue: number) { return; }\n static staticBar = new Bar(0);\n private static getStaticBar() { return Bar.staticBar; }\n}\n\nenum Values { value1, value2 }\n\nvar bar: IFoo = new Bar(3);"
+  "text": "interface IFoo {\n foo(): boolean;\n}\n\nclass Bar implements IFoo {\n constructor(public x: number) { }\n foo() { return true; }\n /** @deprecated */\n baz() { return false; }\n get value(): number { return 0; }\n set value(_newValue: number) { return; }\n static staticBar = new Bar(0);\n private static getStaticBar() { return Bar.staticBar; }\n}\n\nenum Values { value1, value2 }\n\nvar bar: IFoo = new Bar(3);"
 }
 }),
 );
@@ -3114,7 +3114,7 @@ fn lsp_code_lens_test_disabled() {
   "text": "const { test } = Deno;\nconst { test: test2 } = Deno;\nconst test3 = Deno.test;\n\nDeno.test(\"test a\", () => {});\nDeno.test({\n name: \"test b\",\n fn() {},\n});\ntest({\n name: \"test c\",\n fn() {},\n});\ntest(\"test d\", () => {});\ntest2({\n name: \"test e\",\n fn() {},\n});\ntest2(\"test f\", () => {});\ntest3({\n name: \"test g\",\n fn() {},\n});\ntest3(\"test h\", () => {});\n"
 }
 }),
-// diable test code lens
+// disable test code lens
 json!([{
   "enable": true,
   "codeLens": {
@@ -7373,7 +7373,7 @@ Deno.test({
   .as_str()
   .unwrap();
 // deno test's output capturing flushes with a zero-width space in order to
-// synchronize the output pipes. Occassionally this zero width space
+// synchronize the output pipes. Occasionally this zero width space
 // might end up in the output so strip it from the output comparison here.
 assert_eq!(notification_value.replace('\u{200B}', ""), "test a\r\n");
 assert_eq!(
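The comment describes stripping the synchronization zero-width space before comparing captured output. The same normalization in Python, for illustration (the helper name is ours):

```python
ZWSP = "\u200b"  # zero-width space flushed to synchronize output pipes


def normalize_captured_output(raw: str) -> str:
    # The marker may occasionally leak into captured test output,
    # so remove it before comparing against the expected string.
    return raw.replace(ZWSP, "")


assert normalize_captured_output("test a\u200b\r\n") == "test a\r\n"
assert normalize_captured_output("test a\r\n") == "test a\r\n"  # unchanged otherwise
```

Normalizing rather than asserting on the raw string keeps the test independent of whether the marker happened to land in the captured chunk.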


@@ -2466,7 +2466,7 @@ mod permissions {
   #[test]
   fn net_listen_allow_localhost() {
-    // Port 4600 is chosen to not colide with those used by
+    // Port 4600 is chosen to not collide with those used by
     // target/debug/test_server
     let (_, err) = util::run_and_collect_output(
       true,


@@ -105,7 +105,7 @@ Deno.test(
   },
   async function chownSyncSucceed() {
     // TODO(bartlomieju): when a file's owner is actually being changed,
-    // chown only succeeds if run under priviledged user (root)
+    // chown only succeeds if run under privileged user (root)
     // The test script has no such privilege, so need to find a better way to test this case
     const { uid, gid } = await getUidAndGid();
@@ -114,7 +114,7 @@ Deno.test(
     Deno.writeTextFileSync(filePath, "Hello");

     // the test script creates this file with the same uid and gid,
-    // here chown is a noop so it succeeds under non-priviledged user
+    // here chown is a noop so it succeeds under non-privileged user
     Deno.chownSync(filePath, uid, gid);

     Deno.removeSync(dirPath, { recursive: true });
@@ -182,7 +182,7 @@ Deno.test(
     await Deno.writeFile(fileUrl, fileData);

     // the test script creates this file with the same uid and gid,
-    // here chown is a noop so it succeeds under non-priviledged user
+    // here chown is a noop so it succeeds under non-privileged user
     await Deno.chown(fileUrl, uid, gid);

     Deno.removeSync(dirPath, { recursive: true });


@@ -797,7 +797,7 @@ setInterval(() => {
 Deno.writeFileSync(`${cwd}/${programFile}`, enc.encode(program));
 Deno.writeFileSync(`${cwd}/${childProgramFile}`, enc.encode(childProgram));
 // In this subprocess we are spawning another subprocess which has
-// an infite interval set. Following call would never resolve unless
+// an infinite interval set. Following call would never resolve unless
 // child process gets unrefed.
 const { success, stdout, stderr } = await new Deno.Command(
   Deno.execPath(),


@@ -3,7 +3,7 @@
 // TODO(ry) The unit test functions in this module are too coarse. They should
 // be broken up into smaller bits.
-// TODO(ry) These tests currentl strip all the ANSI colors out. We don't have a
+// TODO(ry) These tests currently strip all the ANSI colors out. We don't have a
 // good way to control whether we produce color output or not since
 // std/fmt/colors auto determines whether to put colors in or not. We need
 // better infrastructure here so we can properly test the colors.
@@ -1069,7 +1069,7 @@ Deno.test(function consoleTestWithCustomInspectorError() {
     () => stringify(a),
     Error,
     "BOOM",
-    "Inpsect should fail and maintain a clear CTX_STACK",
+    "Inspect should fail and maintain a clear CTX_STACK",
   );
 });
@@ -1779,7 +1779,7 @@ Deno.test(function consoleLogShouldNotThrowErrorWhenInvalidCssColorsAreGiven() {
 });

 // console.log(Invalid Date) test
-Deno.test(function consoleLogShoultNotThrowErrorWhenInvalidDateIsPassed() {
+Deno.test(function consoleLogShouldNotThrowErrorWhenInvalidDateIsPassed() {
   mockConsole((console, out) => {
     const invalidDate = new Date("test");
     console.log(invalidDate);


@@ -1270,9 +1270,9 @@ Deno.test(
     }, 1000);
   },
 });
-const nonExistantHostname = "http://localhost:47582";
+const nonExistentHostname = "http://localhost:47582";
 await assertRejects(async () => {
-  await fetch(nonExistantHostname, { body, method: "POST" });
+  await fetch(nonExistentHostname, { body, method: "POST" });
 }, TypeError);
 await done;
 },


@@ -29,7 +29,7 @@ Deno.test({ permissions: { ffi: false } }, function ffiPermissionDenied() {
   Deno.dlopen("/usr/lib/libc.so.6", {});
 }, Deno.errors.PermissionDenied);
 const fnptr = new Deno.UnsafeFnPointer(
-  // @ts-expect-error: Not NonNullable but null check is after premissions check.
+  // @ts-expect-error: Not NonNullable but null check is after permissions check.
   null,
   {
     parameters: ["u32", "pointer"],
@@ -43,7 +43,7 @@ Deno.test({ permissions: { ffi: false } }, function ffiPermissionDenied() {
   Deno.UnsafePointer.of(new Uint8Array(0));
 }, Deno.errors.PermissionDenied);
 const ptrView = new Deno.UnsafePointerView(
-  // @ts-expect-error: Not NonNullable but null check is after premissions check.
+  // @ts-expect-error: Not NonNullable but null check is after permissions check.
   null,
 );
 assertThrows(() => {


@@ -14,14 +14,14 @@ Deno.test({ permissions: { read: true } }, function watchFsInvalidPath() {
   if (Deno.build.os === "windows") {
     assertThrows(
       () => {
-        Deno.watchFs("non-existant.file");
+        Deno.watchFs("non-existent.file");
       },
       Error,
       "Input watch path is neither a file nor a directory",
     );
   } else {
     assertThrows(() => {
-      Deno.watchFs("non-existant.file");
+      Deno.watchFs("non-existent.file");
     }, Deno.errors.NotFound);
   }
 });
@@ -51,7 +51,7 @@ Deno.test(
   const testDir = await makeTempDir();
   const iter = Deno.watchFs(testDir);
-  // Asynchornously capture two fs events.
+  // Asynchronously capture two fs events.
   const eventsPromise = getTwoEvents(iter);

   // Make some random file system activity.


@@ -1662,7 +1662,7 @@ Deno.test({
   await db.enqueue("msg2");
   await promise;

-  // Close the database and wait for the listerner to finish.
+  // Close the database and wait for the listener to finish.
   db.close();
   await listener;
@@ -1718,7 +1718,7 @@ Deno.test({
   await db.enqueue("msg1", { delay: 10000 });
   await db.enqueue("msg2", { delay: 10000 });

-  // Close the database and wait for the listerner to finish.
+  // Close the database and wait for the listener to finish.
   db.close();
   await listener;


@@ -2153,16 +2153,16 @@ const compressionTestCases = [
 //   out: { "Content-Type": "text/plain" },
 //   expect: null,
 // },
-  { name: "Uncompressible", length: 1024, in: {}, out: {}, expect: null },
+  { name: "Incompressible", length: 1024, in: {}, out: {}, expect: null },
   {
-    name: "UncompressibleAcceptGzip",
+    name: "IncompressibleAcceptGzip",
     length: 1024,
     in: { "Accept-Encoding": "gzip" },
     out: {},
     expect: null,
   },
   {
-    name: "UncompressibleType",
+    name: "IncompressibleType",
     length: 1024,
     in: { "Accept-Encoding": "gzip" },
     out: { "Content-Type": "text/fake" },
@@ -2190,21 +2190,21 @@ const compressionTestCases = [
     expect: "br",
   },
   {
-    name: "UncompressibleRange",
+    name: "IncompressibleRange",
     length: 1024,
     in: { "Accept-Encoding": "gzip" },
     out: { "Content-Type": "text/plain", "Content-Range": "1" },
     expect: null,
   },
   {
-    name: "UncompressibleCE",
+    name: "IncompressibleCE",
     length: 1024,
     in: { "Accept-Encoding": "gzip" },
     out: { "Content-Type": "text/plain", "Content-Encoding": "random" },
     expect: null,
   },
   {
-    name: "UncompressibleCC",
+    name: "IncompressibleCC",
     length: 1024,
     in: { "Accept-Encoding": "gzip" },
     out: { "Content-Type": "text/plain", "Cache-Control": "no-transform" },


@@ -250,17 +250,17 @@ Deno.test(function toStringShouldBeWebCompatibility() {
 Deno.test(function textEncoderShouldCoerceToString() {
   const encoder = new TextEncoder();
-  const fixutreText = "text";
+  const fixtureText = "text";
   const fixture = {
     toString() {
-      return fixutreText;
+      return fixtureText;
     },
   };
   const bytes = encoder.encode(fixture as unknown as string);
   const decoder = new TextDecoder();
   const decoded = decoder.decode(bytes);
-  assertEquals(decoded, fixutreText);
+  assertEquals(decoded, fixtureText);
 });

 Deno.test(function binaryEncode() {
Deno.test(function binaryEncode() { Deno.test(function binaryEncode() {


@ -23,7 +23,7 @@ Deno.test(function urlSearchParamsWithQuotes() {
assertEquals(searchParams, "str=%27hello+world%27"); assertEquals(searchParams, "str=%27hello+world%27");
}); });
Deno.test(function urlSearchParamsWithBraket() { Deno.test(function urlSearchParamsWithBracket() {
const init = [ const init = [
["str", "(hello world)"], ["str", "(hello world)"],
]; ];
@@ -328,10 +328,10 @@ Deno.test(
 // If a class extends URLSearchParams, override one method should not change another's behavior.
 Deno.test(
   function urlSearchParamsOverridingAppendNotChangeConstructorAndSet() {
-    let overridedAppendCalled = 0;
+    let overriddenAppendCalled = 0;
     class CustomSearchParams extends URLSearchParams {
       append(name: string, value: string) {
-        ++overridedAppendCalled;
+        ++overriddenAppendCalled;
         super.append(name, value);
       }
     }
@@ -339,7 +339,7 @@ Deno.test(
     new CustomSearchParams([["foo", "bar"]]);
     new CustomSearchParams(new CustomSearchParams({ foo: "bar" }));
     new CustomSearchParams().set("foo", "bar");
-    assertEquals(overridedAppendCalled, 0);
+    assertEquals(overriddenAppendCalled, 0);
   },
 );

@@ -215,7 +215,7 @@ Deno.test({
     assertEquals(
       Buffer.byteLength(Buffer.alloc(0)),
       Buffer.alloc(0).byteLength,
-      "Byte lenght differs on buffers",
+      "Byte length differs on buffers",
     );
   },
 });

@@ -288,7 +288,7 @@ Deno.test("[node/http] non-string buffer response", {
 });
 // TODO(kt3k): Enable this test
-// Currently ImcomingMessage constructor has incompatible signature.
+// Currently IncomingMessage constructor has incompatible signature.
 /*
 Deno.test("[node/http] http.IncomingMessage can be created without url", () => {
   const message = new http.IncomingMessage(

@@ -103,7 +103,7 @@ Deno.test({
     worker.postMessage("Hello, how are you my thread?");
     assertEquals((await once(worker, "message"))[0], "I'm fine!");
     const data = (await once(worker, "message"))[0];
-    // data.threadId can be 1 when this test is runned individually
+    // data.threadId can be 1 when this test is run individually
     if (data.threadId === 1) data.threadId = 3;
     assertObjectMatch(data, {
       isMainThread: false,
@@ -144,7 +144,7 @@ Deno.test({
 });
 Deno.test({
-  name: "[worker_threads] inheritences",
+  name: "[worker_threads] inheritances",
   async fn() {
     const worker = new workerThreads.Worker(
       `

@@ -132,12 +132,12 @@ impl CoverageCollector {
       let mut out = BufWriter::new(File::create(filepath)?);
       let coverage = serde_json::to_string(&script_coverage)?;
-      let formated_coverage = format_json(&coverage, &Default::default())
+      let formatted_coverage = format_json(&coverage, &Default::default())
         .ok()
         .flatten()
         .unwrap_or(coverage);
-      out.write_all(formated_coverage.as_bytes())?;
+      out.write_all(formatted_coverage.as_bytes())?;
       out.flush()?;
     }
@@ -533,20 +533,20 @@ impl CoverageReporter for PrettyCoverageReporter {
     let mut last_line = None;
     for line_index in missed_lines {
       const WIDTH: usize = 4;
-      const SEPERATOR: &str = "|";
+      const SEPARATOR: &str = "|";
       // Put a horizontal separator between disjoint runs of lines
       if let Some(last_line) = last_line {
         if last_line + 1 != line_index {
           let dash = colors::gray("-".repeat(WIDTH + 1));
-          println!("{}{}{}", dash, colors::gray(SEPERATOR), dash);
+          println!("{}{}{}", dash, colors::gray(SEPARATOR), dash);
         }
       }
       println!(
         "{:width$} {} {}",
         line_index + 1,
-        colors::gray(SEPERATOR),
+        colors::gray(SEPARATOR),
         colors::red(&lines[line_index]),
         width = WIDTH
       );
@@ -703,7 +703,7 @@ pub async fn cover_files(
       | MediaType::Mts
       | MediaType::Cts
      | MediaType::Tsx => {
-        match emitter.maybed_cached_emit(&file.specifier, &file.source) {
+        match emitter.maybe_cached_emit(&file.specifier, &file.source) {
          Some(code) => code.into(),
          None => {
            return Err(anyhow!(

@@ -442,8 +442,8 @@ fn format_ensure_stable(
         concat!(
           "Formatting succeeded initially, but failed when ensuring a ",
           "stable format. This indicates a bug in the formatter where ",
-          "the text it produces is not syntatically correct. As a temporary ",
-          "workfaround you can ignore this file ({}).\n\n{:#}"
+          "the text it produces is not syntactically correct. As a temporary ",
+          "workaround you can ignore this file ({}).\n\n{:#}"
         ),
         file_path.display(),
         err,

@@ -288,7 +288,7 @@ fn validate(input: &str) -> ValidationResult {
         | (Some(Token::LBrace), Token::RBrace)
         | (Some(Token::DollarLBrace), Token::RBrace) => {}
         (Some(left), _) => {
-          // queue up a validation error to surface once we've finished examininig the current line
+          // queue up a validation error to surface once we've finished examining the current line
           queued_validation_error = Some(ValidationResult::Invalid(Some(
             format!("Mismatched pairs: {left:?} is not properly closed"),
           )));

@@ -145,7 +145,7 @@ fn get_script_with_args(script: &str, options: &CliOptions) -> String {
     .argv()
     .iter()
     // surround all the additional arguments in double quotes
-    // and santize any command substition
+    // and sanitize any command substitution
     .map(|a| format!("\"{}\"", a.replace('"', "\\\"").replace('$', "\\$")))
     .collect::<Vec<_>>()
     .join(" ");

@@ -125,7 +125,7 @@ impl<TEnvironment: UpdateCheckerEnvironment> UpdateChecker<TEnvironment> {
   /// Returns the version if a new one is available and it should be prompted about.
   pub fn should_prompt(&self) -> Option<String> {
     let file = self.maybe_file.as_ref()?;
-    // If the current version saved is not the actualy current version of the binary
+    // If the current version saved is not the actually current version of the binary
     // It means
     // - We already check for a new version today
     // - The user have probably upgraded today

@@ -45,7 +45,7 @@ pub fn human_download_size(byte_count: u64, total_bytes: u64) -> String {
   }
 }
-/// A function that converts a milisecond elapsed time to a string that
+/// A function that converts a millisecond elapsed time to a string that
 /// represents a human readable version of that time.
 pub fn human_elapsed(elapsed: u128) -> String {
   if elapsed < 1_000 {

@@ -221,7 +221,7 @@
       error = errorMap[className]?.(message);
     } catch (e) {
       throw new Error(
-        `Unsable to build custom error for "${className}"\n ${e.message}`,
+        `Unable to build custom error for "${className}"\n ${e.message}`,
       );
     }
     // Strip buildCustomError() calls from stack trace

@@ -24,7 +24,7 @@ bytes.workspace = true
 deno_ops.workspace = true
 futures.workspace = true
 # Stay on 1.6 to avoid a dependency cycle in ahash https://github.com/tkaitchuck/aHash/issues/95
-# Projects not depending on ahash are unafected as cargo will pull any 1.X that is >= 1.6.
+# Projects not depending on ahash are unaffected as cargo will pull any 1.X that is >= 1.6.
 indexmap = "1.6"
 libc.workspace = true
 log.workspace = true

@@ -390,7 +390,7 @@ mod internal {
       let turn = self.turn.get();
       if id < turn {
         // We already made a borrow count reservation for this waiter but the
-        // borrow will never be picked up and removesequently, never dropped.
+        // borrow will never be picked up and consequently, never dropped.
         // Therefore, call the borrow drop handler here.
         self.drop_borrow::<M>();
       } else {

@@ -4,7 +4,7 @@
 // NOTE:
 // Here we are deserializing to `serde_json::Value` but you can
-// deserialize to any other type that implementes the `Deserialize` trait.
+// deserialize to any other type that implements the `Deserialize` trait.
 use deno_core::v8;
 use deno_core::JsRuntime;

@@ -72,7 +72,7 @@ enum PollState {
 /// After creating this structure it's possible to connect multiple sessions
 /// to the inspector, in case of Deno it's either: a "websocket session" that
 /// provides integration with Chrome Devtools, or an "in-memory session" that
-/// is used for REPL or converage collection.
+/// is used for REPL or coverage collection.
 pub struct JsRuntimeInspector {
   v8_inspector_client: v8::inspector::V8InspectorClientBase,
   v8_inspector: Rc<RefCell<v8::UniquePtr<v8::inspector::V8Inspector>>>,
@@ -143,7 +143,7 @@ impl v8::inspector::V8InspectorClientImpl for JsRuntimeInspector {
 impl JsRuntimeInspector {
   /// Currently Deno supports only a single context in `JsRuntime`
-  /// and thus it's id is provided as an associated contant.
+  /// and thus it's id is provided as an associated constant.
   const CONTEXT_GROUP_ID: i32 = 1;
   pub fn new(
@@ -270,7 +270,7 @@ impl JsRuntimeInspector {
     mut invoker_cx: Option<&mut Context>,
   ) -> Result<Poll<()>, BorrowMutError> {
     // The futures this function uses do not have re-entrant poll() functions.
-    // However it is can happpen that poll_sessions() gets re-entered, e.g.
+    // However it is can happen that poll_sessions() gets re-entered, e.g.
     // when an interrupt request is honored while the inspector future is polled
     // by the task executor. We let the caller know by returning some error.
     let mut sessions = self.sessions.try_borrow_mut()?;

@@ -73,7 +73,7 @@ pub(crate) struct ModuleMap {
   pub(crate) pending_dynamic_imports:
     FuturesUnordered<StreamFuture<RecursiveModuleLoad>>,
-  // This store is used temporarly, to forward parsed JSON
+  // This store is used temporarily, to forward parsed JSON
   // value from `new_json_module` to `json_module_evaluation_steps`
   json_value_store: HashMap<v8::Global<v8::Module>, v8::Global<v8::Value>>,
 }

@@ -242,7 +242,7 @@ async fn op_read_all(
   let mut grow_len: usize = 64 * 1024;
   let (min, maybe_max) = resource.size_hint();
-  // Try to determine an optimial starting buffer size for this resource based
+  // Try to determine an optimal starting buffer size for this resource based
   // on the size hint.
   let initial_size = match (min, maybe_max) {
     (min, Some(max)) if min == max => min as usize,

@@ -369,7 +369,7 @@ fn empty_fn(
   //Do Nothing
 }
-//It creates a reference to an empty function which can be mantained after the snapshots
+//It creates a reference to an empty function which can be maintained after the snapshots
 pub fn create_empty_fn<'s>(
   scope: &mut v8::HandleScope<'s>,
 ) -> Option<v8::Local<'s, v8::Function>> {

@@ -200,7 +200,7 @@ impl JsRealmInner {
     // Expect that this context is dead (we only check this in debug mode)
     // TODO(mmastrac): This check fails for some tests, will need to fix this
-    // debug_assert_eq!(Rc::strong_count(&self.context), 1, "Realm was still alive when we wanted to destory it. Not dropped?");
+    // debug_assert_eq!(Rc::strong_count(&self.context), 1, "Realm was still alive when we wanted to destroy it. Not dropped?");
   }
 }

@@ -401,7 +401,7 @@ pub struct RuntimeOptions {
   pub create_params: Option<v8::CreateParams>,
   /// V8 platform instance to use. Used when Deno initializes V8
-  /// (which it only does once), otherwise it's silenty dropped.
+  /// (which it only does once), otherwise it's silently dropped.
   pub v8_platform: Option<v8::SharedRef<v8::Platform>>,
   /// The store to use for transferring SharedArrayBuffers between isolates.
@@ -924,7 +924,7 @@ impl JsRuntime {
     // macroware wraps an opfn in all the middleware
     let macroware = move |d| middleware.iter().fold(d, |d, m| m(d));
-    // Flatten ops, apply middlware & override disabled ops
+    // Flatten ops, apply middleware & override disabled ops
     let ops: Vec<_> = exts
       .iter_mut()
       .filter_map(|e| e.init_ops())
@@ -1771,7 +1771,7 @@ impl JsRuntime {
     let has_dispatched_exception =
       state_rc.borrow_mut().dispatched_exception.is_some();
     if has_dispatched_exception {
-      // This will be overrided in `exception_to_err_result()`.
+      // This will be overridden in `exception_to_err_result()`.
       let exception = v8::undefined(tc_scope).into();
       let pending_mod_evaluate = {
         let mut state = state_rc.borrow_mut();

@@ -364,7 +364,7 @@ fn terminate_execution_webassembly() {
   let (mut runtime, _dispatch_count) = setup(Mode::Async);
   let v8_isolate_handle = runtime.v8_isolate().thread_safe_handle();
-  // Run an infinite loop in Webassemby code, which should be terminated.
+  // Run an infinite loop in WebAssembly code, which should be terminated.
   let promise = runtime.execute_script_static("infinite_wasm_loop.js",
     r#"
     (async () => {
@@ -1894,7 +1894,7 @@ fn test_op_unstable_disabling() {
       "test.js",
       r#"
       if (Deno.core.ops.op_foo() !== 42) {
-        throw new Error("Exptected op_foo() === 42");
+        throw new Error("Expected op_foo() === 42");
       }
       if (typeof Deno.core.ops.op_bar !== "undefined") {
         throw new Error("Expected op_bar to be disabled")

@@ -1402,7 +1402,7 @@ function formatSet(value, ctx, _ignored, recurseTimes) {
   return output;
 }
-function formatMap(value, ctx, _gnored, recurseTimes) {
+function formatMap(value, ctx, _ignored, recurseTimes) {
   ctx.indentationLvl += 2;
   const values = [...new SafeMapIterator(value)];

@@ -2666,7 +2666,7 @@ function importKeyAES(
           TypedArrayPrototypeGetByteLength(keyData) * 8,
         )
       ) {
-        throw new DOMException("Invalid key length", "Datarror");
+        throw new DOMException("Invalid key length", "DataError");
       }
       break;

@@ -130,7 +130,7 @@ fn export_key_rsa(
         algorithm: spki::AlgorithmIdentifier {
           // rsaEncryption(1)
           oid: const_oid::ObjectIdentifier::new_unwrap("1.2.840.113549.1.1.1"),
-          // parameters field should not be ommited (None).
+          // parameters field should not be omitted (None).
           // It MUST have ASN.1 type NULL.
           parameters: Some(asn1::AnyRef::from(asn1::Null)),
         },
@@ -158,7 +158,7 @@ fn export_key_rsa(
         algorithm: rsa::pkcs8::AlgorithmIdentifier {
           // rsaEncryption(1)
           oid: rsa::pkcs8::ObjectIdentifier::new_unwrap("1.2.840.113549.1.1.1"),
-          // parameters field should not be ommited (None).
+          // parameters field should not be omitted (None).
           // It MUST have ASN.1 type NULL as per defined in RFC 3279 Section 2.3.1
           parameters: Some(asn1::AnyRef::from(asn1::Null)),
         },

@@ -96,7 +96,8 @@ pub trait FileSystem: std::fmt::Debug + MaybeSend + MaybeSync {
     options: OpenOptions,
   ) -> FsResult<Rc<dyn File>>;
-  fn mkdir_sync(&self, path: &Path, recusive: bool, mode: u32) -> FsResult<()>;
+  fn mkdir_sync(&self, path: &Path, recursive: bool, mode: u32)
+    -> FsResult<()>;
   async fn mkdir_async(
     &self,
     path: PathBuf,

@@ -293,7 +293,7 @@ function createRespondWith(
         if (respBody.locked) {
           throw new TypeError("ReadableStream is locked.");
         }
-        reader = respBody.getReader(); // Aquire JS lock.
+        reader = respBody.getReader(); // Acquire JS lock.
         try {
           await core.opAsync(
             "op_http_write_resource",

@@ -97,10 +97,10 @@ static USE_WRITEV: Lazy<bool> = Lazy::new(|| {
 /// MUST be followed by a SETTINGS frame (Section 6.5), which MAY be empty.
 const HTTP2_PREFIX: &[u8] = b"PRI * HTTP/2.0\r\n\r\nSM\r\n\r\n";
-/// ALPN negotation for "h2"
+/// ALPN negotiation for "h2"
 const TLS_ALPN_HTTP_2: &[u8] = b"h2";
-/// ALPN negotation for "http/1.1"
+/// ALPN negotiation for "http/1.1"
 const TLS_ALPN_HTTP_11: &[u8] = b"http/1.1";
 /// Name a trait for streams we can serve HTTP over.

@@ -126,7 +126,7 @@ pub struct NetworkBufferedStream<S: AsyncRead + Unpin> {
 }
 impl<S: AsyncRead + Unpin> NetworkBufferedStream<S> {
-  /// This constructor is private, because passing partically initialized data between the [`NetworkStreamPrefixCheck`] and
+  /// This constructor is private, because passing partially initialized data between the [`NetworkStreamPrefixCheck`] and
   /// this [`NetworkBufferedStream`] is challenging without the introduction of extra copies.
   fn new(
     io: S,

@@ -63,7 +63,7 @@ pub fn slab_get(index: SlabId) -> SlabEntry {
     unsafe { std::mem::transmute(x.borrow_mut()) }
   });
   let Some(entry) = lock.get_mut(index as usize) else {
-    panic!("HTTP state error: Attemped to access invalid request {} ({} in total available)",
+    panic!("HTTP state error: Attempted to access invalid request {} ({} in total available)",
       index,
       lock.len())
   };

@@ -8,7 +8,7 @@ edition.workspace = true
 license.workspace = true
 readme = "README.md"
 repository.workspace = true
-description = "IO promitives for Deno extensions"
+description = "IO primitives for Deno extensions"
 [lib]
 path = "lib.rs"

@@ -325,7 +325,7 @@ impl SqliteQueue {
       // Oneshot requeue of all inflight messages.
       Self::requeue_inflight_messages(conn.clone()).await.unwrap();
-      // Continous dequeue loop.
+      // Continuous dequeue loop.
       Self::dequeue_loop(conn.clone(), dequeue_tx, shutdown_rx, waker_rx)
         .await
         .unwrap();
@@ -716,12 +716,12 @@ impl Database for SqliteDb {
         }
         tx.commit()?;
-        let new_vesionstamp = version_to_versionstamp(version);
+        let new_versionstamp = version_to_versionstamp(version);
         Ok((
           has_enqueues,
           Some(CommitResult {
-            versionstamp: new_vesionstamp,
+            versionstamp: new_versionstamp,
           }),
         ))
       })

@@ -43,7 +43,7 @@ extern "C" fn call_fn(info: *const v8::FunctionCallbackInfo) {
   if let Some(f) = info.cb {
     // SAFETY: calling user provided function pointer.
     let value = unsafe { f(info.env, info_ptr as *mut _) };
-    // SAFETY: napi_value is reprsented as v8::Local<v8::Value> internally.
+    // SAFETY: napi_value is represented as v8::Local<v8::Value> internally.
     rv.set(unsafe { transmute::<napi_value, v8::Local<v8::Value>>(value) });
   }
 }

@@ -463,12 +463,12 @@ where
   {
     modules_path
   } else {
-    let orignal = modules_path.clone();
+    let original = modules_path.clone();
     let mod_dir = path_resolve(vec![modules_path, name]);
     if fs.is_dir(Path::new(&mod_dir)) {
       mod_dir
     } else {
-      orignal
+      original
     }
   };
   let pkg = node_resolver.load_package_json(

@@ -874,7 +874,7 @@ Module.prototype.load = function (filename) {
     pathDirname(this.filename),
   );
   const extension = findLongestRegisteredExtension(filename);
-  // allow .mjs to be overriden
+  // allow .mjs to be overridden
   if (
     StringPrototypeEndsWith(filename, ".mjs") && !Module._extensions[".mjs"]
   ) {

@@ -13,7 +13,7 @@ import {
 } from "ext:deno_node/internal/validators.mjs";
 import { promisify } from "ext:deno_node/internal/util.mjs";
-/** These options aren't funcitonally used right now, as `Dir` doesn't yet support them.
+/** These options aren't functionally used right now, as `Dir` doesn't yet support them.
  * However, these values are still validated.
  */
 type Options = {

@@ -86,13 +86,13 @@ export class FileHandle extends EventEmitter {
   ): Promise<WriteResult>;
   write(
     bufferOrStr: Buffer | string,
-    offsetOrPotition: number,
+    offsetOrPosition: number,
     lengthOrEncoding: number | string,
     position?: number,
   ): Promise<WriteResult> {
     if (bufferOrStr instanceof Buffer) {
       const buffer = bufferOrStr;
-      const offset = offsetOrPotition;
+      const offset = offsetOrPosition;
       const length = lengthOrEncoding;
       return new Promise((resolve, reject) => {
@@ -110,7 +110,7 @@ export class FileHandle extends EventEmitter {
       });
     } else {
       const str = bufferOrStr;
-      const position = offsetOrPotition;
+      const position = offsetOrPosition;
       const encoding = lengthOrEncoding;
       return new Promise((resolve, reject) => {

@@ -1,5 +1,5 @@
 // Copyright 2018-2023 the Deno authors. All rights reserved. MIT license.
-// This file contains C++ node globals accesed in internal binding calls
+// This file contains C++ node globals accessed in internal binding calls
 /**
  * Adapted from

@@ -731,14 +731,14 @@ class URL {
   get username() {
     webidl.assertBranded(this, URLPrototype);
     // https://github.com/servo/rust-url/blob/1d307ae51a28fecc630ecec03380788bfb03a643/url/src/lib.rs#L881
-    const schemeSeperatorLen = 3; /* :// */
+    const schemeSeparatorLen = 3; /* :// */
     if (
       this.#hasAuthority() &&
-      this.#usernameEnd > this.#schemeEnd + schemeSeperatorLen
+      this.#usernameEnd > this.#schemeEnd + schemeSeparatorLen
     ) {
       return StringPrototypeSlice(
         this.#serialization,
-        this.#schemeEnd + schemeSeperatorLen,
+        this.#schemeEnd + schemeSeparatorLen,
         this.#usernameEnd,
       );
     } else {

@@ -129,7 +129,7 @@ class AbortSignal extends EventTarget {
     }
   }
-  // `addEventListener` and `removeEventListener` have to be overriden in
+  // `addEventListener` and `removeEventListener` have to be overridden in
   // order to have the timer block the event loop while there are listeners.
   // `[add]` and `[remove]` don't ref and unref the timer because they can
   // only be used by Deno internals, which use it to essentially cancel async

@@ -17,7 +17,7 @@ const {
   WeakMapPrototypeSet,
 } = primordials;
-const locationConstructorKey = Symbol("locationConstuctorKey");
+const locationConstructorKey = Symbol("locationConstructorKey");
 // The differences between the definitions of `Location` and `WorkerLocation`
 // are because of the `LegacyUnforgeable` attribute only specified upon

@@ -123,7 +123,7 @@ class MessagePort extends EventTarget {
     }
     const { transfer } = options;
     if (ArrayPrototypeIncludes(transfer, this)) {
-      throw new DOMException("Can not tranfer self", "DataCloneError");
+      throw new DOMException("Can not transfer self", "DataCloneError");
     }
     const data = serializeJsMessageData(message, transfer);
     if (this[_id] === null) return;

@@ -91,7 +91,7 @@ pub async fn op_sleep(
   // Windows timer period is 15ms, this means a 100ms timer could fire at 115ms (15% late). We assume that
   // timers longer than 100ms are a reasonable cutoff here.
-  // The high-res timers on Windows are still limited. Unfortuntely this means that our shortest duration 4ms timers
+  // The high-res timers on Windows are still limited. Unfortunately this means that our shortest duration 4ms timers
   // can still be 25% late, but without a more complex timer system or spinning on the clock itself, we're somewhat
   // bounded by the OS' scheduler itself.
   let _hr_timer_lock = if millis <= 100 {

@@ -121,7 +121,7 @@ function type(V) {
     case "function":
       // Falls through
     default:
-      // Per ES spec, typeof returns an implemention-defined value that is not any of the existing ones for
+      // Per ES spec, typeof returns an implementation-defined value that is not any of the existing ones for
       // uncallable non-standard exotic objects. Yet Type() which the Web IDL spec depends on returns Object for
       // such cases. So treat the default case as an object.
       return "Object";

@@ -13,24 +13,24 @@ declare module "ext:deno_webidl/00_webidl.js" {
   ): any;
   interface IntConverterOpts {
     /**
-     * Wether to throw if the number is outside of the acceptable values for
+     * Whether to throw if the number is outside of the acceptable values for
      * this type.
      */
     enforceRange?: boolean;
     /**
-     * Wether to clamp this number to the acceptable values for this type.
+     * Whether to clamp this number to the acceptable values for this type.
      */
     clamp?: boolean;
   }
   interface StringConverterOpts {
     /**
-     * Wether to treat `null` value as an empty string.
+     * Whether to treat `null` value as an empty string.
      */
     treatNullAsEmptyString?: boolean;
   }
   interface BufferConverterOpts {
     /**
-     * Wether to allow `SharedArrayBuffer` (not just `ArrayBuffer`).
+     * Whether to allow `SharedArrayBuffer` (not just `ArrayBuffer`).
      */
     allowShared?: boolean;
   }


@@ -22,7 +22,7 @@ pub(crate) fn import() -> TokenStream {
   // TODO(@littledivy): This won't work for `deno_core` examples
   // since `crate` does not refer to `deno_core`.
   // examples must re-export deno_core to make this work
-  // until Span inspection APIs are stabalized.
+  // until Span inspection APIs are stabilized.
   //
   // https://github.com/rust-lang/rust/issues/54725
   quote!(crate)


@@ -746,7 +746,7 @@ fn codegen_sync_ret(
   } else if is_ptr_cvoid(output) || is_ptr_cvoid_rv(output) {
     quote! {
       if result.is_null() {
-        // External canot contain a null pointer, null pointers are instead represented as null.
+        // External cannot contain a null pointer, null pointers are instead represented as null.
         rv.set_null();
       } else {
         rv.set(v8::External::new(scope, result as *mut ::std::ffi::c_void).into());


@@ -329,7 +329,7 @@ pub(crate) struct Optimizer {
   pub(crate) has_rc_opstate: bool,
-  // Do we need an explict FastApiCallbackOptions argument?
+  // Do we need an explicit FastApiCallbackOptions argument?
   pub(crate) has_fast_callback_option: bool,
   // Do we depend on FastApiCallbackOptions?
   pub(crate) needs_fast_callback_option: bool,
@@ -426,7 +426,7 @@ impl Optimizer {
       }
     };
-    // The reciever, which we don't actually care about.
+    // The receiver, which we don't actually care about.
     self.fast_parameters.push(FastValue::V8Value);
     if self.is_async {


@@ -385,7 +385,7 @@ pub fn os_uptime() -> u64 {
   #[cfg(target_family = "windows")]
   // SAFETY: windows API usage
   unsafe {
-    // Windows is the only one that returns `uptime` in milisecond precision,
+    // Windows is the only one that returns `uptime` in millisecond precision,
     // so we need to get the seconds out of it to be in sync with other envs.
    uptime = winapi::um::sysinfoapi::GetTickCount64() / 1000;
  }


@@ -29,7 +29,7 @@ fn op_main_module(state: &mut OpState) -> Result<String, AnyError> {
 }
 /// This is an op instead of being done at initialization time because
-/// it's expensive to retreive the ppid on Windows.
+/// it's expensive to retrieve the ppid on Windows.
 #[op]
 pub fn op_ppid() -> i64 {
   #[cfg(windows)]


@@ -51,7 +51,7 @@ pub fn op_worker_sync_fetch(
     .clone();
   // TODO(andreubotella): make the below thread into a resource that can be
-  // re-used. This would allow parallel fecthing of multiple scripts.
+  // re-used. This would allow parallel fetching of multiple scripts.
   let thread = std::thread::spawn(move || {
     let runtime = tokio::runtime::Builder::new_current_thread()


@@ -1659,7 +1659,7 @@ impl Permissions {
       v.iter()
         .map(|x| {
           if x.is_empty() {
-            Err(AnyError::msg("emtpy"))
+            Err(AnyError::msg("empty"))
           } else {
             Ok(SysDescriptor(x.to_string()))
           }


@@ -29,9 +29,9 @@ nested HashMaps), so when writing ops we recommend directly using rust
 structs/tuples or primitives, since mapping to `serde_json::Value` will add
 extra overhead and result in slower ops.
-I also recommend avoiding unecessary "wrappers", if your op takes a single-keyed
-struct, consider unwrapping that as a plain value unless you plan to add fields
-in the near-future.
+I also recommend avoiding unnecessary "wrappers", if your op takes a
+single-keyed struct, consider unwrapping that as a plain value unless you plan
+to add fields in the near-future.
 Instead of returning "nothing" via `Ok(json!({}))`, change your return type to
 rust's unit type `()` and returning `Ok(())`, `serde_v8` will efficiently encode


@@ -29,7 +29,7 @@ impl<T: serde::Serialize> Serializable for T {
 }
 /// SerializablePkg exists to provide a fast path for op returns,
-/// allowing them to avoid boxing primtives (ints/floats/bool/unit/...)
+/// allowing them to avoid boxing primitives (ints/floats/bool/unit/...)
 pub enum SerializablePkg {
   Primitive(Primitive),
   Serializable(Box<dyn Serializable>),


@@ -15,7 +15,7 @@ pub extern "C" fn print_something() {
 /// # Safety
 ///
-/// The pointer to the buffer must be valid and initalized, and the length must
+/// The pointer to the buffer must be valid and initialized, and the length must
 /// not be longer than the buffer's allocation.
 #[no_mangle]
 pub unsafe extern "C" fn print_buffer(ptr: *const u8, len: usize) {
@@ -25,7 +25,7 @@ pub unsafe extern "C" fn print_buffer(ptr: *const u8, len: usize) {
 /// # Safety
 ///
-/// The pointer to the buffer must be valid and initalized, and the length must
+/// The pointer to the buffer must be valid and initialized, and the length must
 /// not be longer than the buffer's allocation.
 #[no_mangle]
 pub unsafe extern "C" fn print_buffer2(
@@ -117,7 +117,7 @@ pub extern "C" fn sleep_blocking(ms: u64) {
 /// # Safety
 ///
-/// The pointer to the buffer must be valid and initalized, and the length must
+/// The pointer to the buffer must be valid and initialized, and the length must
 /// not be longer than the buffer's allocation.
 #[no_mangle]
 pub unsafe extern "C" fn fill_buffer(value: u8, buf: *mut u8, len: usize) {
@@ -129,7 +129,7 @@ pub unsafe extern "C" fn fill_buffer(value: u8, buf: *mut u8, len: usize) {
 /// # Safety
 ///
-/// The pointer to the buffer must be valid and initalized, and the length must
+/// The pointer to the buffer must be valid and initialized, and the length must
 /// not be longer than the buffer's allocation.
 #[no_mangle]
 pub unsafe extern "C" fn nonblocking_buffer(ptr: *const u8, len: usize) {
@@ -517,7 +517,7 @@ pub struct Mixed {
 /// # Safety
 ///
-/// The array pointer to the buffer must be valid and initalized, and the length must
+/// The array pointer to the buffer must be valid and initialized, and the length must
 /// be 2.
 #[no_mangle]
 pub unsafe extern "C" fn create_mixed(


@@ -69,7 +69,7 @@ await sendWorkerMessage("register");
 dylib.symbols.call_stored_function();
-// Unref both main and worker thread callbacks and terminate the wrorker: Note, the stored function pointer in lib is now dangling.
+// Unref both main and worker thread callbacks and terminate the worker: Note, the stored function pointer in lib is now dangling.
 mainThreadCallback.unref();
 await sendWorkerMessage("unref");


@@ -1,6 +1,6 @@
 // Copyright 2018-2023 the Deno authors. All rights reserved. MIT license.
-// This test performs initilization similar to napi-rs.
+// This test performs initialization similar to napi-rs.
 // https://github.com/napi-rs/napi-rs/commit/a5a04a4e545f268769cc78e2bd6c45af4336aac3
 use napi_sys as sys;


@@ -111,7 +111,7 @@ impl TestContextBuilder {
   pub fn use_sync_npm_download(self) -> Self {
     self.env(
-      // make downloads determinstic
+      // make downloads deterministic
       "DENO_UNSTABLE_NPM_SYNC_DOWNLOAD",
       "1",
     )
@@ -379,7 +379,7 @@ impl TestCommandBuilder {
 fn sanitize_output(text: String, args: &[String]) -> String {
   let mut text = strip_ansi_codes(&text).to_string();
   // deno test's output capturing flushes with a zero-width space in order to
-  // synchronize the output pipes. Occassionally this zero width space
+  // synchronize the output pipes. Occasionally this zero width space
   // might end up in the output so strip it from the output comparison here.
   if args.first().map(|s| s.as_str()) == Some("test") {
     text = text.replace('\u{200B}', "");
@@ -647,7 +647,7 @@ impl TestCommandOutput {
 }
 #[track_caller]
-pub fn assert_stderrr_matches_file(
+pub fn assert_stderr_matches_file(
   &self,
   file_path: impl AsRef<Path>,
 ) -> &Self {


@@ -109,7 +109,7 @@ pub fn env_vars_for_npm_tests_no_sync_download() -> Vec<(String, String)> {
 pub fn env_vars_for_npm_tests() -> Vec<(String, String)> {
   let mut env_vars = env_vars_for_npm_tests_no_sync_download();
   env_vars.push((
-    // make downloads determinstic
+    // make downloads deterministic
     "DENO_UNSTABLE_NPM_SYNC_DOWNLOAD".to_string(),
     "1".to_string(),
   ));
@@ -1372,7 +1372,7 @@ async fn wrap_main_https_server() {
     .expect("Cannot bind TCP");
   println!("ready: https"); // Eye catcher for HttpServerCount
   let tls_acceptor = TlsAcceptor::from(tls_config.clone());
-  // Prepare a long-running future stream to accept and serve cients.
+  // Prepare a long-running future stream to accept and serve clients.
   let incoming_tls_stream = async_stream::stream! {
     loop {
       let (socket, _) = tcp.accept().await?;
@@ -1417,7 +1417,7 @@ async fn wrap_https_h1_only_tls_server() {
     .expect("Cannot bind TCP");
   println!("ready: https"); // Eye catcher for HttpServerCount
   let tls_acceptor = TlsAcceptor::from(tls_config.clone());
-  // Prepare a long-running future stream to accept and serve cients.
+  // Prepare a long-running future stream to accept and serve clients.
   let incoming_tls_stream = async_stream::stream! {
     loop {
       let (socket, _) = tcp.accept().await?;
@@ -1463,7 +1463,7 @@ async fn wrap_https_h2_only_tls_server() {
     .expect("Cannot bind TCP");
   println!("ready: https"); // Eye catcher for HttpServerCount
   let tls_acceptor = TlsAcceptor::from(tls_config.clone());
-  // Prepare a long-running future stream to accept and serve cients.
+  // Prepare a long-running future stream to accept and serve clients.
   let incoming_tls_stream = async_stream::stream! {
     loop {
       let (socket, _) = tcp.accept().await?;
@@ -1527,7 +1527,7 @@ async fn wrap_client_auth_https_server() {
     .expect("Cannot bind TCP");
   println!("ready: https_client_auth on :{HTTPS_CLIENT_AUTH_PORT:?}"); // Eye catcher for HttpServerCount
   let tls_acceptor = TlsAcceptor::from(tls_config.clone());
-  // Prepare a long-running future stream to accept and serve cients.
+  // Prepare a long-running future stream to accept and serve clients.
   let incoming_tls_stream = async_stream::stream! {
     loop {
       let (socket, _) = tcp.accept().await?;


@@ -9,7 +9,7 @@ Node.js compat testing in Deno repository.
 - `//tools/node_compat/setup.ts`
   - This script sets up the Node.js compat tests.
-- `//tools/node_comapt/versions/`
+- `//tools/node_compat/versions/`
   - Node.js source tarballs and extracted test cases are stored here.
 - `//cli/tests/node_compat/config.jsonc`
   - This json file stores the settings about which Node.js compat test to run
@@ -29,7 +29,7 @@ The above command copies the updated items from Node.js tarball to the Deno
 source tree.
 Ideally Deno should pass the Node.js compat tests without modification, but if
-you need to modify it, then add that item in `ignore` perperty of
+you need to modify it, then add that item in `ignore` property of
 `config.jsonc`. Then `setup.ts` doesn't overwrite the modified Node.js test
 cases anymore.
@@ -41,7 +41,7 @@ If the test needs to be ignored in particular platform, then add them in
 Node.js compat tests are run as part of `cargo test` command. If you want to run
 only the Node.js compat test cases you can use the command
 `cargo test node_compat`. If you want to run specific tests you can use the
-command `deno task test` (in `tools/node_comapt` dir). For example, if you want
+command `deno task test` (in `tools/node_compat` dir). For example, if you want
 to run all test files which contains `buffer` in filename you can use the
 command:


@@ -115,7 +115,7 @@ async function copyTests() {
   for await (const entry of walk(VENDORED_NODE_TEST, { skip: ignoreList })) {
     const fragments = entry.path.split(sep);
     // suite is the directory name after test/. For example, if the file is
-    // "node_comapt/node/test/fixtures/policy/main.mjs"
+    // "node_compat/node/test/fixtures/policy/main.mjs"
     // then suite is "fixtures/policy"
     const suite = fragments.slice(fragments.indexOf("node_compat") + 3, -1)
       .join("/");


@@ -188,7 +188,7 @@ script generates the symbols based on the latest tags.
 <details>
 <summary>Failure Steps</summary>
-1. Clone `deno/apliland_scripts`.
+1. Clone `deno/apiland_scripts`.
 2. Execute `deno task release`.
 </details>