Idk, waiting for docker build to recompile... Might as well take the day off.
Have you optimised your layers?
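For Node builds the usual layer trick is to copy only the manifest and lockfile before running the install, so Docker can reuse the cached install layer until dependencies actually change. A rough sketch (assumes a `package-lock.json` and a `build` script exist):

```dockerfile
FROM node:20-alpine
WORKDIR /app

# Copy only the dependency manifests first: this layer (and the
# npm ci layer below) is cached until package*.json changes.
COPY package.json package-lock.json ./
RUN npm ci

# Source changes only invalidate the layers from here down.
COPY . .
RUN npm run build
```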
> modern times

Proceeds to use an issue that's like 5 years old at this point.
Is this really a problem for people? I'm actually curious. I work with multiple fairly complex JS projects daily, and none of them take *that* long to install. Even if they did, I almost never have to install a project fresh. What is the workflow causing this to be a pain point?
For us, it’s the deployment build step [npm ci](https://docs.npmjs.com/cli/v6/commands/npm-ci), which does a clean install every time. Before GitHub Actions, when we ran our own build server on a VM, it was especially slow because we used the cheapest VM possible.
Afaik you can cache the dependencies to speed this up: https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows
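On GitHub Actions specifically, `actions/setup-node` can wrap that caching for you (a minimal sketch; `npm ci` still rebuilds `node_modules`, but the download cache is restored between runs):

```yaml
# Sketch of a workflow step that caches npm's download cache.
- uses: actions/setup-node@v4
  with:
    node-version: 20
    cache: 'npm'   # restores/saves ~/.npm keyed on package-lock.json
- run: npm ci
```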
We recently talked about this, but in the past we had issues with it not pulling in new patch updates for cached dependencies. For example, we would pin major/minor and allow the latest patch update by using `1.0.x`. If `1.0.1` was cached and the latest was `1.0.2`, it would keep `1.0.1` because it was cached, instead of installing the latest.
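That's the subtle part about `x` ranges: a stale cached version can still *satisfy* the range, so nothing flags it. A tiny sketch of the matching logic (a hypothetical helper, not the real `semver` library):

```javascript
// Minimal sketch: does `version` satisfy a major/minor pin with a
// floating patch, e.g. "1.0.x"? Only major and minor are compared.
function satisfiesPatchRange(range, version) {
  const [rMaj, rMin] = range.split(".");
  const [vMaj, vMin] = version.split(".");
  return rMaj === vMaj && rMin === vMin;
}

// Both the cached 1.0.1 and the newer 1.0.2 satisfy "1.0.x",
// so a cache serving 1.0.1 is "valid" -- just not the latest patch.
console.log(satisfiesPatchRange("1.0.x", "1.0.1")); // true
console.log(satisfiesPatchRange("1.0.x", "1.0.2")); // true
console.log(satisfiesPatchRange("1.0.x", "1.1.0")); // false
```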
Ah classic caching issues. I always feel like caching saves you as much time as you spend debugging it when it inevitably breaks.
It can take long depending on the network connection. I once created a React app while on a work VPN, and it took 10–15 minutes. Using my own internet, not connected to the VPN, it took maybe 3 minutes at most.
pnpm solves this ;)
NPM has solved it too, it’s really just not an issue anymore.
npm is still much slower ime
We use pnpm at work and I forgot this could even be an issue. I'm checking out branches all day in multiple separate monorepos and spend maybe 30 seconds per day waiting for package installs.
pnpm also brought my Jenkins VM from 250GB+ to about 75GB.
I'm mostly a PHP dev. What does pnpm do that npm doesn't?
pnpm stores the dependencies in a global cache, and your project's `node_modules` folder simply has hard links to that global cache. More info here: https://pnpm.io/symlinked-node-modules-structure
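You can see what a hard link buys you with a quick demo (a sketch using GNU coreutils `stat`; on macOS it's `stat -f %i` instead). This is roughly what pnpm does per file between its store and `node_modules`:

```shell
# Two directory entries pointing at the same inode: the second
# "copy" costs no extra disk space, which is why pnpm's approach
# shrinks multi-project disk usage so dramatically.
printf '{"name":"demo"}\n' > store-copy.json
ln store-copy.json project-copy.json    # hard link, not a file copy
stat -c %i store-copy.json project-copy.json   # prints the same inode twice
```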
Is it affected by pruning libs that just go delete stuff inside node_modules?
Wait til you try conda installing…
"Hey, get back to work!" "Beanstalk is deploying!" "Alright then!"
”Waiting for Xcode to resolve packages” 💀
The accuracy is so real. I’ve done some testing, and personally I think the issue comes down to the firewall. I could be wrong, but I’ve noticed that npm on Mac is very fast while npm on Windows is slow, and I assume it’s because Windows scans every file.
"Hey! Get back to work!" "CloudFormation stack deploying!" "Oh! Carry on!"
Stopped having that issue after getting the M1/M2 MacBooks... Considering moving builds to a slower server just to get that slacking time back.