• Improved Multiboot support in NetBSD/i386

    Back in February this year I added Multiboot support to NetBSD/i386. Unfortunately, the implementation was quite hackish because it required the application of a patch to GRUB Legacy: the code used the "a.out kludge" present in the Multiboot specification, which this boot loader incorrectly omitted for ELF kernels; the patch fixed this issue. However, this prevented booting NetBSD with mainstream GRUB builds (those used by all Linux distributions), thus making the feature mostly useless.

    The need for the "a.out kludge" came from two different problems:

    1. The kernel's ELF image was incorrectly linked because it did not set the correct physical load addresses for the segments it contained, so GRUB could not load it: it thought there was not enough memory. The "a.out kludge" was used here to tell the boot loader the correct address at which to load the binary image. Pavel Cahyna fixed this issue back in May, removing the need for the hack in this specific case.

    2. The native boot loader constructs a minimal ELF image that contains the kernel symbol table (ksyms) and places it just after the BSS space in memory. GRUB did not do this, so the NetBSD kernel resorted to creating this image manually, based on the data passed in by GRUB. For this to work, some space was reserved after the BSS section by using the "a.out kludge" (tricking the boot loader into thinking that this section was larger than it actually was) so that the kernel's bootstrapping process could freely access it. Pavel's fix did not address this problem, so, when booting a NetBSD Multiboot kernel with an unpatched GRUB, ksyms did not work.

    I've now finally fixed this long-standing issue appropriately. All the code to create the minimal ELF image is gone; instead, the kernel simply moves the data passed in by GRUB to a memory region that remains available after bootstrapping.
    Then it uses a custom function (ksyms_init_explicit instead of ksyms_init), which does not need any ELF headers, to initialize the ksyms.

    The results are much clearer and less error-prone code, as well as the ability to boot NetBSD straight from a stock GRUB installation! Keep in mind that this will go into 4.0, so setting up dual-boot machines will be easier than ever :-)

    I've prepared a couple of screenshots for your pleasure. First, a look at the configuration used to boot a Multiboot-enabled NetBSD kernel; and then a look at the messages printed by the kernel, as well as a demonstration that ksyms work, by invoking a backtrace in the debugger (ddb). [Continue reading]
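    The whole handshake described above is driven by a small header embedded early in the kernel image. As a rough illustration of why the "a.out kludge" matters (a sketch based on the Multiboot 0.6.96 specification, not NetBSD's actual code), a minimal header looks like this:

```c
#include <stdint.h>

/* Minimal Multiboot 0.6.96 header; an illustrative sketch, not
 * NetBSD's actual code. */
#define MULTIBOOT_HEADER_MAGIC 0x1BADB002u
/* Bit 16 of flags is the "a.out kludge": when set, address fields
 * appended to the header override the load information in the ELF
 * program headers. */
#define MULTIBOOT_AOUT_KLUDGE (1u << 16)

struct multiboot_header {
    uint32_t magic;    /* must be MULTIBOOT_HEADER_MAGIC */
    uint32_t flags;    /* requested features */
    uint32_t checksum; /* magic + flags + checksum == 0 (mod 2^32) */
    /* header_addr, load_addr, load_end_addr, bss_end_addr and
     * entry_addr follow only when the kludge bit is set. */
};

/* With the fix described above, the kernel no longer needs the
 * kludge bit, so a stock GRUB can load the plain ELF image. */
static const struct multiboot_header mbh = {
    .magic    = MULTIBOOT_HEADER_MAGIC,
    .flags    = 0u,
    .checksum = (uint32_t)0u - (MULTIBOOT_HEADER_MAGIC + 0u),
};
```

    The boot loader scans the first 8 KiB of the image for the magic value and validates the checksum, which is why a correctly linked ELF kernel with a plain header like this one is enough for an unpatched GRUB.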

  • Mac OS X vs. Ubuntu: Summary

    I think I've already covered all the areas I had in mind about these two operating systems. And as the thread has lasted for too long, I'm concluding it now. Here is a summary of all the items described:

    - Introduction
    - Hardware support
    - The environment
    - Software installation
    - Automatic updates
    - Freedom
    - Commercial software
    - Development platform

    After all these notes I still can't decide which operating system I'd prefer based on quality, features and cost. Nowadays I'm quite happy with Kubuntu (I installed it to see how it works after breaking Ubuntu, and it seems good so far) and I'll possibly stick with it for some more months. This will last until I feel the need to buy a Mac again (or simply renew my desktop), at which point I might buy one with Mac OS X or wait until the desire passes ;-) [Continue reading]

  • Mac OS X vs. Ubuntu: Development platform

    First of all, sorry for not completing the comparison between systems earlier. I had to work on some university assignments and started to play a bit with Haskell, which made me start a rewrite of a utility (more on this soon, I hope!).

    Let's now compare the development platform provided by these operating systems. This is something most end users will never care about, but it certainly affects the availability of some applications (especially commercial ones), their future evolution and how the applications work, e.g. during installation.

    As you may already know, both systems are Unix-like. First of all, they provide a comfortable command line interface with the usual utilities and development tools to get started very easily. They also come with the common POSIX interfaces to manage files, sockets, devices, etc., which allow a great deal of compatibility among the operating systems that support them. The problem is that these interfaces are too low level, are C-specific and are "console-based"; there is no way to develop visual applications with them. This is why almost all programs use some sort of abstraction layer over these interfaces, plus some library that provides a graphical toolkit; otherwise development times could be extremely long and there could be lots of portability problems. These extra libraries bring us to the biggest difference between the two OSes.

    When you are coding for Linux, the de facto standard graphical interface is the X Window System, which comes with its own set of graphical libraries (Xlib) to program applications. The problem is that these are, again, too low level for general usage, so developers have come up with some nice abstractions that provide widgets, layouts, etc. Among them are the well-known Qt and GTK+ toolkits. These, on their own, also lack the functionality to build complete desktop environments (DEs), so KDE and GNOME were born on top of them.
    They not only provide a consistent graphical interface but also a development platform on which to build applications: each DE has a set of services and components that make the implementation of shiny tools a breeze.

    However, application developers are faced with the difficult task of choosing the adequate subset of libraries for their application, which at its root means choosing one of the two major development platforms (KDE and GNOME), assuming they don't implement their own, something not that uncommon. For tiny programs this may not be an issue (as can be seen in the duality of tools available), but it certainly is one for big applications (you certainly do not want to rewrite, e.g., The GIMP for KDE) and for commercial ones. In some way you can think of it as if you were coding for KDE or GNOME, not Linux. You may argue that competition is good but, in my opinion, not at this level.

    On the other hand, Mac OS X has three frameworks: Cocoa, Carbon and Cocoa on Java (I'm not sure this last name is correct, but you get the idea). Carbon is from the Mac OS 9 days and Cocoa on Java is not recommended for anything other than learning. Even if you chose to use Cocoa on Java, in the end you would be using plain Cocoa, so you needn't consider it in the equation. In other words, the only reasonable choice when developing an application for Mac OS X is Cocoa. This brings a lot of consistency between applications, keeps a single set of services available for all programs to use and allows easy interoperability with each component. (Not to mention that you either use Cocoa or you don't; you cannot do strange mixes... or I haven't seen them.)

    Oh, and before you tell me that Qt is also available for Mac OS X... yes, it is, but it is built on top of Cocoa. So there is a common, high-level layer beneath all APIs that provides consistency among them.

    As a side effect we have the problem of application redistribution.
    End users do not want to deal with source code, so you have to provide them with binaries. But how do you do that on Linux and ensure they will work on any system? Keep in mind that "any system" does not mean any version of a specific distribution; it means any distribution! Well, the thing is... it is almost impossible: there are problems everywhere that prevent binary applications from being transported between systems. I'm not going to discuss this here because it is a rather long topic; check out the linked article for more details (and I think it misses some).

    Conversely, Mac OS X is simpler in this aspect. There is just one operating system with a consistent set of libraries, so you build software for those explicitly. You only need to care about the compatibility of some APIs between versions. And if your application uses any non-standard library, you can bundle it with the final binaries for easy redistribution (OK, OK, you could also use static binaries in Linux). This of course also has its own drawbacks, but in general it is nicer in the developer's eyes.

    There are other differences, but the point I want to make (and which is entirely my own view) is that the diversity in Linux hurts development. Different distributions make it hard to package software for each of them (can you conceive the amount of time wasted by the package maintainers of every single distribution out there?) and bring many binary compatibility issues. Because, you know, Linux is just the kernel. Aside from that, different desktop environments pose some hard decisions for developers, and there is a lot of duplicate code in them to manage common stuff; fortunately Freedesktop.org is solving some of these points.

    Systems such as Mac OS X (or the BSDs, or Solaris, etc.) are better in this regard because the system is a single unit distributed by a single group of people.
    So, whenever I say I use "Mac OS X Tiger", developers know exactly what my system has available for them.

    Yeah, this is a rather generic rant against Linux and is possibly not that important in our comparison, but I had to mention it because I've faced the above issues multiple times. [Continue reading]
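    To make the "too low level and C-specific" point about the POSIX interfaces concrete, here is a tiny sketch of my own (not tied to either OS): even something as mundane as copying a file means juggling descriptors, buffers and error codes by hand.

```c
#include <fcntl.h>
#include <unistd.h>

/* Copy a file using only the raw POSIX descriptor interfaces.
 * Returns 0 on success, -1 on error.  Portable across Unix-like
 * systems, but low level: buffering, error handling and cleanup
 * are all the caller's problem. */
static int copy_file(const char *src, const char *dst)
{
    char buf[4096];
    ssize_t n;
    int in = open(src, O_RDONLY);
    if (in == -1)
        return -1;
    int out = open(dst, O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (out == -1) {
        close(in);
        return -1;
    }
    while ((n = read(in, buf, sizeof buf)) > 0) {
        if (write(out, buf, n) != n) {
            n = -1;
            break;
        }
    }
    close(in);
    close(out);
    return n == 0 ? 0 : -1;
}
```

    Both OSes compile this unchanged, which is exactly the compatibility POSIX buys you; everything graphical, however, has to come from a toolkit layered on top, and that is where the platforms diverge.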

  • Ubuntu vs. Mac OS X: Commercial software

    As much as we may like free software, there are a lot of interesting commercial applications out there (whether free as in free beer or not). Given the origins and spirit of each OS, the amount of commercial applications available for them is vastly different.

    Let's start with Ubuntu (strictly speaking, Linux). Although trends are slowly changing, the number of commercial programs targeting Linux systems is really small. I have reasons to believe that this is because Linux, as a platform on which to provide such applications, is awful. We already saw an example of this in the software installation comparison, because third parties have a hard time distributing their software in the Linux world. We will see more examples of this soon in another post.

    In my opinion, this is a disadvantage because, although there are free replacements for almost any utility you can imagine, they are not necessarily better yet. Similarly, there are tools for which no replacement exists yet. Or, simply put, the user may want to use a commercial tool because he prefers it over any of the other alternatives.

    On the other side of things, a typical user will generally be satisfied with all the free tools included in the Ubuntu repositories. If not, sites such as SourceForge or Freshmeat are full of Unix-based free applications. Generally such users won't ever need to consider commercial applications, so they won't have to spend any money to use their software or keep it up to date.

    Mac OS X is a different world; commercial software (shareware, freeware, etc.) is still extremely abundant on it. This is probably, in part, because the platform itself is commercial: developers won't feel "strange" providing applications that follow the same model, and there are chances that their applications will succeed.
    Fortunately, there is also a growing number of free applications that compete with these commercial ones, and they do a great job (to mention a few: Camino, Adium X, Colloquy, Smultron, etc.).

    Even more, given that Mac OS X is based on Unix and provides an X Window System server, it is possible to run most of the free applications available under Linux on this operating system. Just check out, for example, The GIMP, or fetch pkgsrc and start building your own favourite programs!

    Aside from that, there are also very popular commercial applications available for this OS. These include the popular Apple and Adobe applications (iWork, Photoshop, Premiere, etc.) and others such as Microsoft Office, Parallels or Skype (I know, the latter is also available for Linux). It is a fact that nowadays some of these programs are superior to their free alternatives, and some people will want to use them. But, ultimately, they have the freedom to make that decision.

    In this area I think that Mac OS X is more versatile because it can take advantage of both free applications and some interesting commercial ones. Only time will tell whether those will be natively ported to Linux some day but, if/when that happens, Linux will be as versatile as Mac OS X with the added advantage of a prevailing culture of developing free software. [Continue reading]

  • Mac OS X vs. Ubuntu: Freedom

    Ubuntu is based on Debian GNU/Linux, a free (as in free beer and free speech) Linux-based distribution, and the free GNOME desktop environment. Therefore it keeps the philosophy of the two, being itself also free. Summarizing, this means that the user can legally modify and copy the system at will, without having to pay anyone for doing so. When things break, it is great to be able to look at the source code, find the problem and fix it yourself; of course, this is not something that end users will ever do, but I have found this ability valuable many times (not under Ubuntu, though).

    Mac OS X, on the other hand, is a proprietary OS, with the exception of the core kernel, whose source code is published as free software (I don't know the license details, though). This means that you must pay for a license in order to use it, and even then you cannot mess with its internals (its source code) nor redistribute it. Given that Mac OS X comes prebundled with new Apple machines, this is not so important because you'll rarely feel the need to look at its code (I certainly don't care as long as it works). However, if you want to jump to a new major version, you must pay for it. For example, if I got an iMac now, I'd have to pay around 200€ in mid-2007 to get the Mac OS X 10.5 family pack (5 licenses); I'm not implying that it's not worth it, though.

    I know the free software ideals very well and like them but, sincerely, freedom is something that end users do not perceive in general. And I won't decide which OS to run on my computer based on this criterion alone; that's why the iBook is stuck with Mac OS X ;-) Really, I've lately come to think that what really matters are free and open standards (i.e. communication protocols, document formats, etc.), not the software packages themselves. [Continue reading]

  • Mac OS X vs. Ubuntu: Automatic updates

    Security and/or bug fixes, new features... all these are very common in newer versions of applications, and this obviously includes the operating system itself. A desktop OS should provide a way to painlessly update your system (and possibly your applications) to the latest available versions; the main reason is to stay safe from exploits that could damage your software and/or data.

    Both Mac OS X and Ubuntu provide tools to keep themselves updated and, to some extent, their applications too. These utilities include an automated way to schedule updates, which is important to avoid leaving a system unpatched against important security fixes. Let's now drill down into the two OSes a bit more.

    Ubuntu shines in this aspect thanks to the centralized packaging of software. Given that all available applications are packaged by the developers and put on a common server, the apt package manager is able to automatically update all of your installed packages to the latest available versions. This also includes keeping track of added dependencies, so that an update will not (generally) break any of the existing stuff. In some sense, you can consider that there is no "core OS": once a new program is installed from the repository, it is integrated into the OS in such a way that it is indistinguishable from it.

    Unfortunately, if an application was not explicitly packaged for Ubuntu, it is not possible to use apt (I mean Synaptic, the tool you'll always work with) to keep it up to date. In that case either the program provides its own updating method or it provides none at all, leaving the user on his own to update it whenever he wants/remembers. We saw some examples of applications not made for Ubuntu in the previous post, which basically means commercial software.

    Mac OS X is slightly different. Similarly to Ubuntu, it has a tool that can update your system as well as applications, but the latter are restricted to Apple ones such as iLife.
    Third-party applications need to provide their own updating method, and most of them actually do (*). For example, taking Adium X again: this program checks on startup whether any newer version is available and, if so, offers to download and install it. This is completely decoupled from the system, which makes it suboptimal. It'd be great if the OS could keep everything up to date, as long as applications provided the required information to the update manager.

    So... it is clear that Ubuntu wins this specific comparison as long as you always use prepackaged software. Mac OS X, while not as clean, is good enough because the OS is able to "fix" itself and most third-party applications already provide custom update methods. In the end, the user will not notice the difference.

    (*) I don't know whether it is possible for such programs to "hook" into the system's update manager. This sounds reasonable and, if indeed supported, could make this point moot. Don't hesitate to correct me in that case! [Continue reading]
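    At its core, the startup check described above boils down to comparing the installed version against the latest published one. A minimal sketch of such a comparison (my own illustration; version_cmp is a hypothetical helper, not the API of Adium X or of any real updater):

```c
/* Compare two dotted version strings such as "1.0.3" and "1.1".
 * Returns <0, 0 or >0, strcmp-style.  Missing components are
 * treated as zero, so "1.0" equals "1.0.0".  A hypothetical
 * sketch of the decision an auto-updating application makes at
 * startup; real updaters use their own metadata formats. */
static int version_cmp(const char *a, const char *b)
{
    while (*a != '\0' || *b != '\0') {
        long na = 0, nb = 0;
        while (*a >= '0' && *a <= '9')
            na = na * 10 + (*a++ - '0');
        while (*b >= '0' && *b <= '9')
            nb = nb * 10 + (*b++ - '0');
        if (na != nb)
            return na < nb ? -1 : 1;
        if (*a == '.')
            a++;
        if (*b == '.')
            b++;
    }
    return 0;
}
```

    An application would fetch the latest version string from its server and offer an update whenever version_cmp(installed, latest) < 0; the point of the post is that each Mac OS X application ships this logic on its own, whereas apt centralizes it.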

  • Mac OS X vs. Ubuntu: Software installation

    Installing software on a desktop OS should be a quick and easy task. Although systems such as pkgsrc, which build software from source code, are sometimes very convenient, they get really annoying on desktops because problems pop up more often than desired and builds take hours to complete. In my opinion, a desktop end user should never need to build software himself; if he does, someone in the development chain failed. Fortunately, the two systems I'm comparing seem to have resolved this issue: all common software is available in binary form.

    Ubuntu, as you may already know, is based on Debian GNU/Linux, which means that it uses dpkg and apt to manage installed software. Their developers do a great job of providing binary packages for almost every program out there. These packages can be installed really quickly (if you have a broadband Internet connection) and they automatically configure themselves to work flawlessly in your system, including any dependencies they may need.

    On the ease-of-use side, Ubuntu provides the Add/Remove Applications utility and the Synaptic package manager, both of which are great interfaces to the apt packaging system. The former shows a simple list of programs that can be installed, while the latter lets you manage your software on a per-package basis. After enabling the Universe and Multiverse repositories from Synaptic, you can quickly search for and install almost any piece of software you can imagine, including a few commercial applications. Installation is then trivial because apt takes care of downloading and installing the software.

    Given that the software is packaged explicitly for Ubuntu (or Debian), each package blends into the system seamlessly, placing each file (binaries, documentation, libraries, etc.) where it belongs.
    On a somewhat related note, the problem of rebuilding kernels and/or drivers is mostly gone: the default kernel comes heavily modularized and some proprietary drivers are ready to be installed from the repository (such as the NVIDIA one).

    Unfortunately, you are screwed if some application you want to install is not explicitly packaged for the system (not only does it need to be compiled for Linux; it needs to be "Ubuntu-aware"). These applications are on their own in providing an installer and instructions on how to use it, not to mention that they may not work at all on the system due to ABI problems. I installed the Linux version of Doom 3 yesterday and I can't imagine an end user following the process. The same goes for, e.g., JRE/JDK versions prior to 1.5, which are not packaged due to license restrictions (as far as I know). We will talk some more about this in a future post when we compare the development platform of each system.

    Mac OS X has a radically different approach to software distribution and installation. An application is presented to the user as a single object that can be moved around the system and work from anywhere. (These objects are really directories with the application files inside them, but the user will not be aware of this.) Therefore the most common way to distribute software is through disk images that contain these objects. To install an application you just drag it to the Applications folder; that's all. (Some people argue that this is counterintuitive, but it's very convenient.) These bundles often include all the required dependencies too, which saves the end user trouble.

    Other applications may include custom (graphical!) installers, although all of them behave similarly (much like what happens under Windows). Finally, some other programs may be distributed in the form of "mpkg"s, which are processed by a common installer built into the system; all the user has to do is double-click them.
    No matter what method a specific program uses, its installation is usually trivial: there is no need to resort to the command line or make any specific changes by hand.

    As you can see, the two systems are very different when it comes to software installation. If all the software you need is in the Ubuntu repositories, Ubuntu probably beats Mac OS X in this area. But this won't always be the case, especially for commercial software, and in those situations it can be much worse than any other system. I'm not sure which method I like best; each one has its own pros and cons, as described above. [Continue reading]