phoenisx
Deno
Created by phoenisx on 12/20/2022 in #help
LSP becomes slow when importing large json
Filtered perf measures
semantic_tokens_range (6638ms)
request (6638ms)
inlay_hint (105800ms)
request (105800ms)
initialize (1350ms)
update_registries (1292ms)
update_tsconfig (53ms)
The JSON file that's being imported (and lives in memory) is 116 MB in size. How can I disable or exclude parsing of this JSON file in the VS Code extension, so that extension features are not affected?
1 reply
Deno
Created by phoenisx on 12/9/2022 in #help
Deno Subprocess piping is slow when piping large data between processes
I could be wrong about this, but this is what I have been facing on my M1 Mac. The following code snippet pipes the stdout of one process into another, and it works seamlessly for small JSON data:
import { copyN } from "https://deno.land/std/io/util.ts";

try {
  // First process: dump the JSON file to stdout.
  const catProcess = Deno.run({
    cmd: ["cat", "small-stats.json"],
    stdout: "piped",
  });
  // Second process: filter the piped data.
  const proc = Deno.run({
    cmd: ["grep", "search_text"],
    stdin: "piped",
    stdout: "piped",
  });

  // Copy (up to) 64 KiB from cat's stdout into grep's stdin.
  await copyN(catProcess.stdout, proc.stdin, 65536);
  console.log("Cat Process Status: ", await catProcess.status());
  proc.stdin.close();

  console.log("Proc Status: ", await proc.status());
  const decoder = new TextDecoder();
  const out = decoder.decode(await proc.output());
  console.log(out);
} catch (e) {
  console.error(e);
}
The program completely freezes when JSON data of around 1 GB is piped via cat to the second subprocess. Am I doing something wrong here?
1 reply