The root of all evil for “DLL Hell” is the Microsoft C runtime library.  Yes, I said it!  I threw a rock and it wasn’t nice, but give me a few minutes and you’ll see why solving the problem is much more complicated than just “fixing it”. 

Programmer hat turned on for this post.


The Microsoft Visual C runtime implements all of the functions in the Kernighan and Ritchie ‘C’ book.  Over time, they also added support for C++, MFC, ATL and a bunch of other TLAs that I haven’t bothered to list here.  

The implementation of these is the C runtime, and the runtime is available to all programmers who write code in these languages.  The runtime comes in two forms: as a static lib, where the executable bits become PART of the produced executable, and as a DLL, where it is dynamically linked to the application at runtime.  Either way, bits are bits and the program can call the runtime to have it do work.  It also means that bugs in the runtime have the ability to affect the calling program.

When building an application, the programmer has to decide how to “link” with the runtime.  This MSDN page shows the details of it, but the short version is to choose the “static” (lib) or “dynamic” (DLL) implementation.  In almost all cases, dynamic is the right answer. 
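With the Visual C compiler, the choice is made with a compiler switch.  As a sketch (the classic switches are /MT for static and /MD for dynamic; check the MSDN page for the exact spelling on your compiler version):

```
cl /MT app.c    (static:  the runtime code from the lib becomes part of app.exe)
cl /MD app.c    (dynamic: app.exe imports the runtime from its DLL at load time)
```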

We walked uphill, to get to school, in the snow, both directions

In the beginning (early 1990s), RAM was scarce, and I’ll state that a drive to save memory was the root of the design decisions that ultimately invented DLL Hell.  

DLLs are much preferred to libs because a DLL means that multiple executable programs, and even the individual DLLs of various programs, can all SHARE a single physical copy of the runtime DLL. 

If the runtime DLL is 100KB and there are 10 programs using it, it will only occupy 100KB in memory.  Notice I said 100KB and not 1000KB.  It doesn’t matter how many programs link to the dynamic DLL; the number of copies of the runtime code itself will be “one”.  Compare to static linking, where it would be 100KB * 10 apps = 1000KB.  In this example, a 10X savings!  And remember, memory is scarce.

How far should this savings go?   Put your compiler maker hat on.

There are two approaches.

1) All executable content on the machine shares the single global runtime DLL.

2) All executable content within a single PRODUCT shares the same runtime DLL.

Way back in olden days, Microsoft went with “1”.  This was Windows 3.1 time, and I’ll say that we are today STILL paying the price for this short-sighted choice.

In the same time frame, “other” compiler vendors forced their customers to implement “2”.  It was in the license – you could not, shall not and should not redistribute THE compiler runtime DLLs in their native form, period. 

The only way to redistribute the runtime was to build your own runtime DLL and then have your program link to that DLL rather than the default compiler dynamic runtime.  This DLL would have a name different than the official runtime DLL, but would for all other purposes be the same.

To accomplish this:

  • You had to produce your own DLL that was nothing but a repackaging.
  • All exports from the runtime were imported into your DLL, then re-exported. 
  • It was a big PIA, but when you got done, you had a reliable system.

And DLL Hell would not occur!
The cost though was sacrificing memory.   If each “application” on the machine had to carry along its own copy of the C runtime and you have 4 applications running, then you have 4 copies of the runtime in memory.  They may be shared between the EXEs and DLLs of each program, but there would be no sharing across software vendors; the usage of each copy of the runtime would be limited to its own software package.

MSVC 4, 5, 6

Microsoft Visual C ran through a bunch of revisions and became the fine compiler it is today.  This took time and during that run of releases, each major release of the compiler had multiple updates to the C runtime. 


In THEORY, each new release of the runtime is BETTER than each prior release.  Only bug fixes could be added, and all releases with bigger numbers have to be supersets of the priors.


Fixing a bug in the runtime is not always good.  Applications CAN BE and ARE dependent on the bugs, and if you fix them, you break the applications.

In theory, the C runtime exists in global space in \Windows\System32 and ALL installers check that version against what they have and if there’s a better version available, they update it.

DLL Hell is born…

The result is that an application can be working happily, and then along comes some other unrelated application that just flat out craters the first one.  The first application vendor is the one that takes the service call, and they did nothing wrong!  The app vendor that took them out gets no call because their application is running happily.  This continues until the first app vendor fixes the DLL and craters the second.

This is the world’s most complicated solution to what should never have been a problem.  In a Utopian world (listed as “2” above), it is not even needed. 

In a post-trouble world, one COULD move to “2” and this would help all new programs, but the installed base comes into play here, and to solve this, Microsoft invents WinSxS. 

How does it work?  Behind the scenes, the operating system stores numerous versions of the C runtime, and at application launch the OS loader decides which one to give to this specific application based upon “manifest” data that the application carries along, describing in great detail the very specific version of the runtime that this application requires.  
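On later compilers, that manifest is literally XML embedded in the binary.  A sketch of the relevant fragment, using the VC9 (Visual Studio 2008) runtime as the example; the version number shown is illustrative of the format rather than a recommendation:

```xml
<dependency>
  <dependentAssembly>
    <assemblyIdentity type="win32" name="Microsoft.VC90.CRT"
                      version="9.0.21022.8" processorArchitecture="x86"
                      publicKeyToken="1fc8b3b9a1e18e3b" />
  </dependentAssembly>
</dependency>
```

The loader matches this identity against the versions stored under \Windows\WinSxS and binds the application to the one it asked for.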

This is SLIGHTLY better than forcing each application to repackage its own DLL, and by better, I mean that it uses less memory.  But WOW, this is a lot of work to accomplish a fairly modest savings.   It also means that there ARE multiple copies of the dynamic runtime in memory, so the global memory savings intended in “1” were a fantasy.

Installed base though is a tough nut 

The new administration at Microsoft likely had little choice but to implement a complicated solution.  I hope they threw many rocks.

Where this gets interesting

I got an inquiry recently: could your product’s runtime DLL be updated, because we want to standardize on a SINGLE version of the compiler runtime across all of the software that we use.  Notice that they want this because an “old” runtime is known to have some kind of string overrun bug where, in some product, code could be overwritten and evil doers could inject badness.  I note that this happens every Tuesday.

This is actually an excellent goal, but the realities of it are near impossible to achieve, and so I write this blog post to ponder the ideas.  Now, 3 pages in, we are finally on to the purpose of this post.

The meat of the post

Consider that a more secure world could be achieved if the only C runtime were the one that has all the bugs fixed.  The current version of the runtime is the ONLY version that has all known security threats plugged, and by inference, this means that all “old” versions of the C runtime are insecure.  This is an academic exercise and it is oversimplified, but in concept, it’s right.

Reread the paragraph above and focus on the word “known”.  Okay, here we go.

Folks are RIGHT to motivate application vendors to move up to the “secure” version.  Though one can debate the timing and the subsets of the runtime actually used by the application, in concept the argument is good.

Taken to the end though, what this means is that using WinSxS to manage multiple versions of the runtime is a dead idea.  We’re back to “1”.  A single runtime is assigned for global use and …

DLL Hell is coming back

If we move all applications to THE good C runtime, then we have immediately resurrected DLL Hell!  Awesome!  Nothing in this business is ever new and even this Utopian solution is also doomed. 

There will always be a new runtime

This past Sunday night, my notebook brought down 2 pages of updates from Microsoft Update and then auto-rebooted to complete the install.  My first thought was “since when is Patch Tuesday on Sunday?”  More to the point, most of the updates were labeled “security” and several were runtime DLLs of various forms.  

Consider that you are an application vendor and you just completed testing and verifying version 118 of the runtime.   This was last Friday.  Over the weekend, a new zero day vulnerability was discovered in strstr and Microsoft released version 119 to fix this and … You are ALREADY out of date.

There is no way to win this puzzle.   If there is “one” DLL, then DLL Hell exists.  If there are “many”, then the world is not running the “secure” version of the runtime.

Going static doesn’t help

Actually, it makes it worse.  If the runtime is a DLL and it is updated to fix a problem, you CAN update that DLL and the fix is near immediately in use by the application.  Statistically, you’ll be in good shape from a DLL Hell perspective; the odds are that your application will work with an updated runtime, and it’s easy to try it and revert if it doesn’t work, at least until your customer discovers the code path that you missed.

By contrast, if the runtime is statically linked, then updating it requires a re-release of the application.  Ugly.  Whether linked into the executable EXE itself or dynamically linked at runtime, it doesn’t matter, the same bits are there; the DLL implementation of the runtime just makes them easier to update.

So, the world sticks with dynamic and we use THE compiler runtime, as is standard today.  WinSxS though is frowned upon, because it is actually working and people don’t remember how many dinners they missed because of DLL Hell.  All this stuff goes round and round, and if it plays out to completion, DLL Hell will return…  This time, we will call it .NET Assembly Hell.

Joe Nord