Thursday, July 14, 2016

Using NuGet Package Restore Instead of Checking Packages into Source Control Is Evil


NuGet is great and wonderful: it makes acquiring packages for .NET projects easy and encourages consistency in directory structure, which is always a happy thing.

But this trend towards using NuGet Package Restore instead of checking into source control is evil.

Dynamically downloading NuGet packages has these supposed advantages:
  1. Reduces the disk storage needed for version-controlled files
  2. Reduces the number of binaries you have to check in.  We all know that checking binaries into a version control system is bad, because Git doesn't handle binaries well: they rarely delta-compress, so every new version bloats the repository for good.
Well, neither of those is compelling to me.  Disk space is cheap.  Really.  Don't worry about it.  You can store binaries in Git and it's OK.  Really.  Philosophically it's bad to store binaries, but in the real world no kittens die.
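
For what it's worth, checking packages in takes no exotic setup. With classic packages.config projects, NuGet keeps everything under a packages folder next to the solution, and a NuGet.config at the solution root can pin that location and turn off automatic restore. A minimal sketch (the folder name is just the usual convention):

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <config>
        <!-- keep all packages in one folder that gets committed -->
        <add key="repositoryPath" value="packages" />
      </config>
      <packageRestore>
        <!-- don't re-download on build; the repository is the source of truth -->
        <add key="enabled" value="false" />
      </packageRestore>
    </configuration>

Commit the packages folder along with your source and the build stops caring whether NuGet.org exists.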

Seven reasons not to dynamically download NuGet packages:

1. The most compelling reason not to pull your libraries directly from NuGet.org for each build is security.  NuGet.org is a huge security risk.  A bad guy could quietly inject malware into a NuGet package, and by the next day, via continuous delivery, that malware would be embedded in hundreds of projects across the web.

A security researcher would hopefully find the malware within a few days, but by then it will have compromised thousands of machines.

By taking a library and putting it in version control, you protect yourself by giving the internet time to vet it.  If word gets out that a package was compromised, you have time to react before it ever ships in your product.  After a few weeks in the wild, a package is probably fine.
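
If you want extra assurance before a new package version goes into the repo, record a hash of each .nupkg when you first bring it in and compare later. A rough sketch with PowerShell's Get-FileHash (the CSV file name is my own invention):

    # baseline: hash every .nupkg in the committed packages folder
    Get-ChildItem packages -Recurse -Filter *.nupkg |
        Get-FileHash -Algorithm SHA256 |
        Export-Csv package-hashes.csv -NoTypeInformation

    # later: re-hash and diff against the baseline before trusting an update
    Get-ChildItem packages -Recurse -Filter *.nupkg |
        Get-FileHash -Algorithm SHA256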

2. NuGet will go away someday.  It won't be tomorrow, but eventually a better technology will replace it, and if you try to build your system in 10 years you may be out of luck.  I know this is true; I wrote about it extensively on my GeoCities page.

3. A package author may remove her code from the NuGet feed the night before you compile your next release.  Sure, you can get a backup from somewhere, but it takes time.

4. NuGet.org goes offline.  It is not guaranteed to always be up.  I've been working from home late at night and couldn't compile because NuGet.org was down.  Eventually I contacted a coworker who emailed me the missing libraries, but is that any way to develop software?
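
A cheap mitigation, even if you insist on restore, is a local mirror: a plain folder (or file share) of .nupkg files registered as an extra package source, so restore can still succeed when the public feed is down. In NuGet.config it's just another entry (the share path here is made up):

    <packageSources>
      <!-- a folder of .nupkg files works as a feed; hypothetical share path -->
      <add key="LocalMirror" value="\\fileserver\nuget-mirror" />
      <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
    </packageSources>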

5. Late at night, your build server may lose internet connectivity and your six-hour build fails.

6. Reading libraries from local disk is faster than pulling them over the internet.

7. Dynamically downloading dependencies complicates the build.  You have to remember to do a "dotnet restore -v Minimal" (see the sketch below).  OK, it's not much more complicated, but it's an extra step.
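
To make reason 7 concrete, here's the difference in a build script; the solution name is made up. With the packages folder checked in (packages.config era), the build is one step; with dynamic restore it's two, and the first one needs the network:

    rem packages checked in: one step, no network
    msbuild MySolution.sln /p:Configuration=Release

    rem dynamic restore: every clean build starts with a trip to the internet
    nuget restore MySolution.sln
    msbuild MySolution.sln /p:Configuration=Release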

In conclusion, spend an extra hundred bucks on a terabyte of disk space and check in all your dependencies.  Your co-workers in 10 years will thank you.
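
One gotcha: most stock .NET .gitignore templates exclude the packages folder, so you have to override that before anything gets committed:

    # delete the "packages/" rule from .gitignore, or just force-add:
    git add -f packages/
    git commit -m "Check in NuGet dependencies"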

1 comment:

Anonymous said...

I agree it makes no sense to pull dynamically when you can store in source control or another repository. Your dependencies should really be negligible in size. Then again, we are on v1 of the product, and I'm sure there is a setting for this we haven't found or made just yet!