Saturday, June 21, 2008

Why 10.6 So Soon? A New Kernel? Could it be Xen? Arguments for and against

The speed of the new Mac OS release has me thinking.  It hasn't been long since Leopard shipped, and Exchange support alone really isn't enough to warrant a new version.  Then I keep coming back to the reasons Apple gave for releasing it:  security, efficiency, and power consumption.  That's really low-level stuff, right down at the kernel. 

When Mac OS X was first released, the OS was built around the Mach kernel.  To date, there are only two OSes that I am aware of that have successfully used the Mach kernel:  Mac OS X and NeXTSTEP.  This shouldn't be surprising, since Steve Jobs founded NeXT and brought its technology over to Apple when he came back.  

But the Mach kernel is limiting in the sense that porting it involves a lot of overhead.  While it can run on various architectures, it has to be ported and tuned specifically for each platform before it will work.  As such, there is an inherent drawback to using this core in an OS that is poised to do so many things.  

Another problem with the Mach kernel is virtualization.  Now, I'm not talking about virtualization in a desktop sense, but rather a server sense.  While it is possible to run the current OS in a virtual machine (both Parallels and VMware are doing something close to that), it's very difficult to get it running as a paravirtualized guest, because the guest kernel has to be modified heavily to cooperate with the hypervisor.  Since paravirtualization is more efficient than full hardware virtualization (HVM), it should be a goal for Apple.
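
To make the distinction concrete, here's a rough sketch of the two guest styles as old-style Xen xm domain configs (which are just Python-syntax key/value files).  Every path, device, and name below is made up for illustration; the point is that a paravirtualized guest boots a Xen-aware kernel directly, while an HVM guest boots an unmodified OS through the hvmloader firmware and a QEMU device model, at some cost in efficiency.

    # /etc/xen/pv-guest.cfg  (hypothetical)
    # Paravirtualized (PV) guest: the guest kernel itself must be built to
    # talk to the Xen hypervisor, which is the part Mac OS X can't do today.
    name    = "pv-guest"
    memory  = 512
    vcpus   = 1
    kernel  = "/boot/vmlinuz-xen"               # a Xen-aware guest kernel
    ramdisk = "/boot/initrd-xen.img"
    root    = "/dev/xvda1"
    disk    = ['phy:/dev/vg0/pv-guest,xvda,w']
    vif     = ['bridge=xenbr0']

    # /etc/xen/hvm-guest.cfg  (hypothetical)
    # HVM guest: the OS runs unmodified, but relies on Intel VT (or AMD-V)
    # hardware support and emulated devices, so it pays a performance tax.
    name         = "hvm-guest"
    memory       = 1024
    vcpus        = 2
    builder      = "hvm"
    kernel       = "/usr/lib/xen/boot/hvmloader"
    device_model = "/usr/lib/xen/bin/qemu-dm"
    disk         = ['file:/var/xen/hvm-guest.img,hda,w']
    boot         = "c"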

But then I read this article regarding the possibility of using Xen as a replacement for the Mach kernel, as tested and run by Moshe Bar.  All of a sudden, my heart skipped a beat.  Xen!  Running natively on the Mac as a Bare-bones OS, virtualizing the Mac OS!  I started looking back at the evidence:  no PPC support, which means Intel only.  The Core 2 Duo and Atom chips all have Intel VT technology, so it should be no problem.  With Xen at the core, they can still keep Darwin open source, which is a huge plus.  And, you no longer need to boot up to Windows to use it:  Just run it through Xen.  It would work almost like fast user switching, but fast OS switching.  
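
As a quick sanity check on the hardware side, here's a tiny Python sketch, assuming an Intel Mac, that reads the machdep.cpu.features sysctl and looks for the VMX flag, which is how Intel VT-x shows up in the CPU's feature list:

    import subprocess

    # On an Intel Mac, this sysctl prints the CPU feature flags on one line,
    # e.g. "FPU VME DE PSE ... VMX ...".  VMX is Intel VT-x, the hardware
    # assist Xen needs to run an unmodified guest OS (HVM mode).
    features = subprocess.check_output(
        ["sysctl", "-n", "machdep.cpu.features"]).decode().split()

    if "VMX" in features:
        print("Intel VT-x (VMX) is available on this machine.")
    else:
        print("This CPU does not report VT-x support.")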

And virtualization would no longer be a problem, at either the desktop or the server level.  The OS could still be targeted specifically at Mac hardware (though I think that will become less of an issue, given the legal questions around tying software to specific hardware), and it could even be migrated to other hardware platforms fairly easily, should Apple so choose.  

Okay, once the euphoria over the possibility of Xen being the platform for OS X 10.6 Snow Leopard wore off, the nagging doubts started to hit me.  Could there be reasons why Apple wouldn't go with Xen?

  1. The new "Grand Central" multi-core optimization project.  It *could* be built on Xen, but why rename it?  Perhaps because it isn't Xen at all.  Of course, it still could be, just modified to fit the Mac even more. 

  2. XenSource was purchased by Citrix not long ago, and the question of its open source status is still hanging.  There could be some collaboration here, but Apple likes to have control of everything from start to finish, which makes this scenario look much less likely.  


So the possibility starts to dim, and my hopes dim with it.  Perhaps the new core will at least be more Xen-friendly.  


So what do you think?  
