Or find the best third party library and copy the code from a widely used version that has been out long enough to have been well tested into your source tree.
The problem is not third party libraries. It is updating third party libraries when the version you have still works fine for your needs.
Don't do this. Use a package manager that lets you pin against an exact version. Vendoring sidesteps most of the automated tooling that can warn you about vulnerabilities. Vendoring is a signal that your tooling is insufficient, 99% of the time.
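As a rough sketch of what pinning buys you (the package names, versions, and lockfile shape here are all illustrative, not from any real tool):

```python
# Minimal sketch of version pinning: a lockfile maps each package to one
# exact version, and the installer refuses anything else.
# (Package names and versions are illustrative placeholders.)

LOCKFILE = {
    "requests": "2.31.0",
    "urllib3": "2.0.7",
}

def check_pin(package, candidate_version):
    """Return True only if the candidate matches the pinned version exactly."""
    pinned = LOCKFILE.get(package)
    return pinned is not None and candidate_version == pinned

print(check_pin("requests", "2.31.0"))  # True: matches the pin
print(check_pin("requests", "2.32.0"))  # False: newer, but not what was pinned
```

Real package managers (pip with hash-checking mode, npm with package-lock.json, Cargo with Cargo.lock) implement this idea with integrity hashes on top of exact versions.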
Vendoring means you don't have to hit the internet for every build, that you can work offline, that you're not at the mercy of oh-so-close-to-99.999% availability, that it will keep working in 10 years, and probably other advantages.
If your tooling can pull a dependency from the internet, it can certainly check whether a more recent version of a vendored one is available.
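The check being described is simple. A sketch, with hard-coded version lists standing in for a real registry query:

```python
# Given the version you vendored and the versions a registry reports,
# is something newer available? (The version lists below are stand-ins
# for an actual registry API call.)

def parse(version):
    """Parse a dotted version like '1.26.4' into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def newer_available(vendored, registry_versions):
    """Return the newest registry version if it beats the vendored one, else None."""
    newest = max(registry_versions, key=parse)
    return newest if parse(newest) > parse(vendored) else None

print(newer_available("1.26.4", ["1.26.4", "1.26.18", "2.0.7"]))  # 2.0.7
print(newer_available("2.0.7", ["1.26.4", "2.0.7"]))              # None
```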
This is only true if you aren’t internally mirroring those packages.
Most places I’ve worked have Artifactory or something like it sitting between you and actual PyPI/npm/etc. As long as someone has pulled that version at some point before the internet goes out, it’ll continue to work after.
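The pull-through behavior described above can be sketched in a few lines (the dict here is a stand-in for the real upstream registry; class and method names are invented for illustration):

```python
# A pull-through mirror serves from its local store when it can and only
# hits the upstream registry on a miss. Once a package is cached, it
# survives upstream outages.

class PullThroughMirror:
    def __init__(self, upstream):
        self.upstream = upstream   # stand-in for PyPI/npm/etc.
        self.cache = {}            # packages someone has already pulled

    def fetch(self, name, online=True):
        if name in self.cache:
            return self.cache[name]   # served locally; no internet needed
        if not online:
            raise LookupError(f"{name} not cached and upstream unreachable")
        self.cache[name] = self.upstream[name]
        return self.cache[name]

mirror = PullThroughMirror({"requests-2.31.0": b"...tarball bytes..."})
mirror.fetch("requests-2.31.0")                # first pull populates the cache
mirror.fetch("requests-2.31.0", online=False)  # still works with upstream down
```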
And this is exactly why we see noise on HN/Reddit when a supply-chain attack breaks out, yet no breach ever gets reported: enterprises are protected by internal mirroring.
> Is there any package manager incapable of working offline?
I think you've identified the problem here: package management and package distribution are two different problems. Both tools can be exploited, but if they are separate tools the attack surface of each is smaller.
I'm thinking that the package distribution tool maintains a local system cache of packages, using keys/webrings/whatever to verify provenance, while the package management tool allows pinning, minver/maxver, etc.
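A sketch of that split (the digest and version bounds are computed inline for illustration; function names are invented):

```python
# The *distribution* side keeps a local cache and verifies provenance,
# here via a sha256 digest recorded at lock time. The *management* side
# enforces minver/maxver bounds. Neither needs to know about the other.

import hashlib

def verify_artifact(data, expected_sha256):
    """Distribution concern: does the cached artifact match its recorded digest?"""
    return hashlib.sha256(data).hexdigest() == expected_sha256

def satisfies(version, minver, maxver):
    """Management concern: is the version within the allowed [minver, maxver] range?"""
    as_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return as_tuple(minver) <= as_tuple(version) <= as_tuple(maxver)

artifact = b"pretend package contents"
digest = hashlib.sha256(artifact).hexdigest()  # recorded at lock time
print(verify_artifact(artifact, digest))       # True: provenance checks out
print(satisfies("1.4.2", "1.0.0", "2.0.0"))    # True: inside the allowed range
print(satisfies("2.1.0", "1.0.0", "2.0.0"))    # False: exceeds maxver
```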