back to poking at adding type acquisition support to vscode.dev and finally fixed some issues that were making it slow. Tested with a huge package.json, yeah, it took a while. Tested on orogene desktop, took about the same amount of time. ok.

npm? 3x the amount of time (9 minutes!!), even though orogene web can only open two fetch requests at a time and has to serialize stuff between the Rust and JS layers.

lol. lmao. And orogene desktop was only a few seconds faster!

I should jump back on #orogene dev some time. This thing is still pretty solid.

#orogene web: 177.5s
orogene desktop: 157.861s
bun: 106.3s
npm: 540s

not bad for something done in someone's spare time, rather than with VC money, and that can run in the browser (!)

This is full cold start. No cache, no lockfile, nothing. Just a package.json and some vibes.

@joepie91

pnpm cold cache: 107s
pnpm warm cache + lockfile: 8.7s (??? not sure if I'm testing this right. This seems wrong.)


@zkat pnpm typically uses a system-wide package installation folder, so the second 'install' is likely just linking into that


@joepie91 I thought that would be the --store-dir, which I'm definitely wiping to test, but maybe the store dir also keeps data about installations specific to certain projects on the system?...

@joepie91 it's complicated. These numbers would very likely be much closer on linux or macOS, too, which have... more reasonable file systems.

@joepie91 oh.

I was doing it wrong.

I was testing across drives, and not passing in the cache stuff correctly. So pnpm was actually hard linking/cloning, whereas the other two were having a Bad Time doing cross-disk full copies.
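The fix above comes down to a filesystem fact: hard links (and reflink clones) only work within a single volume, so a cache on one drive and a project on another forces a full byte-for-byte copy. A minimal illustration of the difference (illustrative only, not any package manager's actual code):

```python
import os
import shutil
import tempfile

# Create a fake cached tarball and a fake project dir (both land on the
# same filesystem here, since they share the system temp directory).
src_dir = tempfile.mkdtemp()
dst_dir = tempfile.mkdtemp()
src = os.path.join(src_dir, "package.tgz")
with open(src, "wb") as f:
    f.write(b"x" * 1024)

# Same filesystem: a hard link is nearly free -- no data is copied,
# both paths point at the same inode.
linked = os.path.join(dst_dir, "linked.tgz")
os.link(src, linked)
assert os.stat(src).st_ino == os.stat(linked).st_ino

# Across filesystems os.link raises OSError (EXDEV), and a package
# manager has to fall back to a full copy: new inode, new data blocks.
copied = os.path.join(dst_dir, "copied.tgz")
shutil.copyfile(src, copied)
assert os.stat(src).st_ino != os.stat(copied).st_ino
```

That fallback is why the cross-drive runs above penalized the tools that were actually copying, while pnpm's linking path looked misleadingly fast.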

Here are the timings with this issue addressed, on NTFS instead of ReFS (where things can be weird). All timings with an existing lockfile.

orogene cold cache: 105s
orogene warm cache: 16s

pnpm cold cache: 39.7s
pnpm warm cache: 26.8s

bun cold cache: 85.6s
bun warm cache: 72.19s

@zkat Those are some pretty good results for orogene, nice :)

This is probably tangential and rambly, but what I would love to see some day is a JS package manager that can do cross-source installations; i.e. packages can depend on packages on another registry (but not outright import-from-URL, because of the linkrot problem).

Worked on a design for this a long time ago, but ended up slightly stuck on the interoperability with stock npm; best workaround we could find with semver preservation would be to have every registry operate a fake git server, since that seemed to be the only way to make npm do auto-updated cross-registry installs... but that would probably wreck these performance scores in how slow it is 🙂

@joepie91 I wouldn't recommend this just on the basis of it being a massive security hole.

@joepie91 I mean, so are git and http packages, and imo those don't belong in registry-published packages either.

But this would just add to the problem.

In 99.9% of cases, you know exactly where a package is coming from: your --registry config. Bam, that's it.

You can further extend that by saying "also anything that comes from @somescope comes from this other registry", which is useful for organizations.
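As a concrete sketch of that model, here's roughly what the standard npm-style config looks like in an `.npmrc` (registry URLs here are hypothetical): one default registry, plus a per-scope override.

```ini
; default source for all unscoped packages
registry=https://registry.example.com/

; everything under @somescope comes from the org's own registry instead
@somescope:registry=https://npm.internal.example.com/
```

The point is that both mappings are configured by the consumer, not declared by an arbitrary dependency.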

But it's very important that organizations be able to control what sources they're consuming packages from. The main registry is not a safe place, and never will be.

The idea that an arbitrary package I download could bypass this entirely is... not great (again, this can already be done with http and git packages, but those should not be available either).

@zkat I'm not convinced that this is a problem, to be honest. That it's possible to install dependencies cross-repository does not mean that there cannot be policies to restrict it; it would not be fundamentally different from a security perspective from how it is today, except that the whole ecosystem would not be beholden to a singular for-profit registry operator with a dubious security track record anymore (because cross-registry interoperability would break the network effect).

Speaking more personally, I'm also less concerned about organizational needs, and more about community needs, and then particularly the tools for having a genuine distributed commons (where eg. manual scope configuration is not a viable solution).
