The rise of virtualization has transformed the IT landscape over the past ten years. Sounds a little stiff and boring, maybe, but the ability to fire up a system inside of a software-based box can be incredibly useful—and not just for servers.
My first run-ins with virtualization came well before that term was in common use. I used to run Mac emulators on my Atari ST and Amiga 3000 computers back in the day, so I could get a high-quality word processor working alongside all the color and games on my computers of choice. It was fun to tweak Mac guys by going meta on their entire OS.
Nowadays, the Mac guys tend to be the ones using VMs to run Windows.
MAME, the Multiple Arcade Machine Emulator, runs a series of pretty precise, low-level hardware emulation VMs, and that's nothing but pure entertainment. (Well, and fiddling with ROMs.) I've used Amiga Forever to resurrect my A3000 and play some sweet, sweet Psygnosis games.
Technically, one uses a VM to run Minecraft, too, I suppose, but that's getting a little precious.
I keep a VirtualBox VM around on my desktop running Windows 7, so I can run some older software I like and so I can use all the scanning features of my (still relatively new) copy/fax/scan/print device.
And here at TR, we are working on converting our web servers to VMs, as well. The added flexibility is looking very attractive, and I think we can gain performance overall simply by being able to move to newer hardware with ease.
So... how do you use VMs? What are they good for? Any disappointments or frustrations about what they can't yet do, or are those getting to be rare these days? Discuss.