Sooner or later, every administrator ends up wanting to run applications from multiple operating systems on the same physical machine simultaneously, and then struggles to find a solution that works even somewhat seamlessly.
Maybe you're married to Microsoft Exchange, but you secretly pine for open-source e-mail tools like SpamAssassin or fetchmail. Or maybe you're using Unix-based applications for some network services, but you really want to run them under Windows so you can integrate them into your overall network security model. Whatever the case, wishing that you could run best-of-breed applications from different operating systems simultaneously is pretty common, and often unavoidable.
In my case, I ran into this situation while upgrading an aging utility server: I made the mistake of buying a brand-new Intel motherboard that did not yet have adequate Linux driver support. If I were going to use this system at all, it had to run Windows, but most of the software this particular system is meant to run was designed for Unix.
The traditional solution to this dilemma is to force yourself to pick one platform--either by developing detailed lists of weighted pros and cons for each operating system, or by using the proven decision-making technique of eenie-meenie-minie-moe--and then making do with the tools available for the winner. But you find yourself in this dilemma precisely because you want the best applications from multiple platforms, so "making do" with almost-as-good alternatives means settling for something other than "the best," and is sub-optimal by definition.
Another option, and one that is increasingly viable, is to run multiple systems in parallel, or to use virtualization technology to run multiple platforms on a single system simultaneously. Given the relatively low cost of high-powered modern hardware, not to mention the availability of zero-cost virtualization products, this is certainly more feasible than it used to be. However, this approach creates problems of its own, particularly in the areas of user and device integration.
But there is another option that hardly anyone seems to know about: simply run your Unix-based applications under Windows itself, using the Posix subsystem from Microsoft's Services for Unix (SFU) package. In this model, the underlying operating system is Windows, while system libraries and executables provide a Posix-compliant front-end to the operating system's resources. Simply put, Unix-based applications think they are running on a regular Unix system, but are actually using Windows resources.
This is easier shown than explained, and since a picture is worth a thousand words anyway, the screenshot below illustrates what I mean.
In this example, a Windows domain user is logged into a Windows XP PC via a local bash shell, using the Posix subsystem in SFU. To the user and applications, the environment looks and feels pretty much like a regular Unix system, but it is in fact Windows XP.
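Beyond the look and feel, you can ask the environment to identify itself. The snippet below is a minimal, hypothetical sketch of such a check: it assumes that uname -s reports "Interix" under the SFU Posix subsystem (the name of the subsystem's underlying technology), which you should verify on your own installation; on a native Unix system it simply reports that kernel's name instead.

```shell
#!/bin/sh
# Hypothetical sketch: report whether this shell is running under the
# SFU Posix subsystem or a native Unix-like kernel. Assumes uname -s
# prints "Interix" under SFU; verify on your own installation.
detect_subsystem() {
    os=$(uname -s)
    if [ "$os" = "Interix" ]; then
        echo "Running under the SFU Posix subsystem on Windows ($os)"
    else
        echo "Running under a native Unix-like kernel ($os)"
    fi
}

detect_subsystem
```

Run from the SFU bash shell shown in the screenshot, a check like this gives scripts a portable way to branch on Windows-specific behavior while still using ordinary Unix tools.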