You have probably faced the same dilemma. If you let your dependencies get out of date, the chances are you'll harbor a code vulnerability. If you update them too soon, you risk pulling in a malicious version from a supply chain attack.
This may leave you wondering whether you should update dependencies at all.
Well, here's an adaptable strategy I apply to my PNPM dependencies that seeks the Goldilocks zone of "not too hot, not too cold" for dependency updates.
First, the config in my pnpm-workspace.yaml file:
ignoreScripts: true
minimumReleaseAge: 10080
minimumReleaseAgeExclude:
- astro-accelerator
- astro-accelerator-utils
trustPolicy: no-downgrade
blockExoticSubdeps: true
What the settings do
And here's a breakdown of each setting...
ignoreScripts: true
Prevents packages from running scripts automatically during installation. Supply chain attacks often hide malicious code in postinstall hooks. This setting stops them from executing without your explicit permission.
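Some dependencies genuinely need their install scripts (native modules often do). Rather than re-enabling scripts globally, pnpm lets you allow-list specific packages. A minimal sketch using pnpm's onlyBuiltDependencies setting in pnpm-workspace.yaml; esbuild and sharp are illustrative examples, and you should check how this interacts with ignoreScripts in your pnpm version:

```yaml
# Keep dependency install scripts blocked by default...
ignoreScripts: true
# ...but permit builds for specific packages you trust.
# These names are examples; list the ones your project actually needs.
onlyBuiltDependencies:
  - esbuild
  - sharp
```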
blockExoticSubdeps: true
Blocks dependency subtrees that use unusual or uncommon install patterns. These "exotic" configurations are a common hiding spot for malicious packages trying to slip in under the radar.
minimumReleaseAge: 10080
Refuses to install any package version released within the last 10,080 minutes (that's exactly one week). This guards against "package takeover" attacks, where a bad actor publishes a malicious version of a legitimate package and hopes projects grab it before anyone notices. The companion minimumReleaseAgeExclude list exempts named packages from this quarantine; here, it's the astro-accelerator packages, which I'm happy to take immediately.
trustPolicy: no-downgrade
Prevents packages from being silently downgraded to older, potentially vulnerable versions. If something tries to pull in a lower version than what you already have verified, PNPM will block it.
Together these four settings form a pretty solid defense-in-depth strategy, blocking execution at install time, restricting unusual patterns, enforcing a freshness quarantine, and preventing version rollbacks.
You can "adjust the slider" on this by changing the minimum release age. If you can tolerate more risk, lower it to take packages sooner; if you're more cautious, raise it to delay even longer.
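Since the value is in minutes, a few common quarantine windows work out like this (a sketch; pick whatever fits your risk appetite):

```yaml
# 1 day:   1 * 24 * 60 = 1440
# 3 days:  3 * 24 * 60 = 4320
# 1 week:  7 * 24 * 60 = 10080
# 2 weeks: 14 * 24 * 60 = 20160
minimumReleaseAge: 10080
```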
And if you really care about security
If you're working on a system where a malicious package could cause widespread damage (a regulated industry, say, or a vendor of tools used in sensitive areas like build pipelines, where secrets could be exfiltrated), you need to go a bit deeper.
The belt-and-braces approach to dependencies here would be that you review the changes to the packages you depend on. If you see something dodgy, not only should you not update the package, you should also raise the security concern so alarms can sound for everyone else.
Many organizations put an intermediary package manager in place so all internal apps pull from the approved source of packages instead of public repositories. A central team might update the packages with a deep review process, giving you extra confidence in your supply chain.
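Pointing pnpm at an internal registry is typically a one-line change in .npmrc. A minimal sketch, where registry.internal.example is a hypothetical intermediary such as Verdaccio or Artifactory:

```ini
# .npmrc - route all installs through the approved internal mirror
registry=https://registry.internal.example/
```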
This may make you think twice about taking on a dependency as the review burden may exceed the cost of simply creating the code that does what you need.
Why so serious?
As an industry, we don't really take packages very seriously. But I want folks to think about it like this.
Your Git repo likely doesn't let a bunch of strangers view and update your code. You might accept contributions from strangers, but only through a review process that lets you check the code before you accept it.
Packages are basically strangers adding code to your system, so why would you treat them any differently?