• Keeping pkgsrc packages up to date

    drio asks in the suggestion box about the best way to keep all the packages installed from pkgsrc up to date. I must confess that pkgsrc is quite weak in the updating area when compared to systems such as apt-get or yum. The problem comes from the fact that pkgsrc is a source-based packaging system, meaning that end users build packages by themselves most of the time. Doing updates in such a system is hard because rebuilds take a long time and have high chances of breaking, leaving your system in an unusable state. Of course there is support for binary packages in pkgsrc, but we are not doing a good job at providing good sets of prebuilt binaries. Furthermore, as drio stated, there is little documentation on the subject.

    The thing is, there are several ways of updating all your installed packages. All of them are quite tedious and not "official", but with some work you can configure some scripts and cron jobs to automate the process as much as possible.

    Before doing an update, I usually start by running pkg_chk -u -n; this tells me which packages are out of date and what their new versions are. If the resulting list is short, I tend to follow the make replace procedure. This only works if the new versions of the packages are binary compatible with the old ones, something you cannot guarantee just by looking at the version numbers. For example, you might assume that if you have version X.Y.A of the libfoo library, the newer X.Y.B will be compatible with it; this is generally true, but not always. Plus you need some knowledge of the dependency graph of your installed packages. Anyway, if you want to take the risk, simply go to the pkgsrc directories of the outdated packages and run make replace in them (there is a small sketch of this below). In most cases this works and is the fastest way to do minor updates.

    Things get worse when you have to update lots of stuff. The first and most obvious approach is to do a clean reinstall: start by issuing pkg_delete -r "*", followed by wiping /usr/pkg and /var/db/pkg, and then rebuild your packages. The problems with this approach are that it introduces a huge downtime in the system (until you have rebuilt everything, which can take a long time, the tools won't be available) and that any build failure can prevent you from getting your system back in shape soon enough.

    Another approach involves using different installation prefixes for the old and new installations; I used to do that when working on major GNOME updates. To do this, set LOCALBASE to something like /usr/pkg-YYYYMMDD (and similarly for PKG_SYSCONFBASE, VARBASE and PKG_DBDIR), where YYYYMMDD is the date when you started the installation of that specific set of packages (a sketch of the relevant mk.conf settings appears below). Then install your packages as usual and finally create a /usr/pkg symlink that points to the real directory. Do not change the date until you need to do major updates. When that time comes, change the date in your configuration to the current day; after that, pkgsrc will think that you don't have any packages installed, so you can cleanly reinstall everything. Once you have finished installing all your packages again, update the /usr/pkg symlink to point to the new directory and remove the old one. Voila: minimal downtime, and build failures cannot bother you. (However, you will need to migrate configuration files to the new tree, for example.)

    The last approach I can think of involves using pkg_comp. Use this tool to configure a sandbox in which you build binary packages for all the stuff you are interested in.
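
    To make the make replace route described above a bit more concrete, here is roughly what such a session looks like; graphics/libfoo is just a made-up example, and your pkgsrc tree may live somewhere other than /usr/pkgsrc:

        # See which installed packages are out of date, without touching anything.
        pkg_chk -u -n

        # Rebuild and replace one of the reported packages in place; repeat for
        # each outdated package (graphics/libfoo is a hypothetical example).
        cd /usr/pkgsrc/graphics/libfoo
        make replace
        make clean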
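
    The separate-prefixes trick boils down to a few mk.conf settings plus a symlink. A minimal sketch, assuming your configuration lives in /etc/mk.conf; the exact paths are only examples and you should adapt them to your layout:

        # /etc/mk.conf: install this generation of packages into its own tree.
        LOCALBASE=        /usr/pkg-YYYYMMDD
        PKG_SYSCONFBASE=  /usr/pkg-YYYYMMDD/etc
        VARBASE=          /var/pkg-YYYYMMDD
        PKG_DBDIR=        /var/db/pkg-YYYYMMDD

        # Once everything is installed, point the well-known prefix at it:
        #   rm -f /usr/pkg && ln -s /usr/pkg-YYYYMMDD /usr/pkg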
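
    And a rough sketch of the pkg_comp route, assuming a pkg_comp configuration is already in place; I am writing the target names from memory, so double-check them against pkg_comp(8), and the package paths are again just examples:

        # Create the chroot sandbox and build binary packages inside it.
        pkg_comp makeroot
        pkg_comp build graphics/libfoo www/firefox

        # Tear the sandbox down when done; the binary packages remain in the
        # packages directory you configured for the sandbox.
        pkg_comp removeroot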

    You can even set up a cron job to do this rebuild weekly, for example, which is trivial using the tool's auto target. Once pkg_comp has generated a clean set of binary packages for you, you can proceed to update your real system with those packages. The way you proceed is up to you, though: you can remove everything and do a clean reinstall (which should be a quick process anyway, because you needn't rebuild anything!) or use pkg_add -u for the outdated packages. I think this is the safest way to proceed.

    Oh, I now notice that there is a pkg_rolling-replace utility that can also be used for updates. Dunno how it works, though.

    Hope this makes some sense!

    Edit (22:15): Peter Bex refers us to the How to upgrade packages page in the unofficial NetBSD Wiki. It contains all these tricks plus many more, so it is worth linking from here for completeness. [Continue reading]

  • Piled Higher and Deeper

    I was told today about the Piled Higher and Deeper website, also known as phdcomics (easier to remember). And so far I'm hooked. I love this comic strip! Could it be because I'm already involved in research due to my PFC and know what they are talking about? Possibly. And it also illustrates what I can "expect" if I finally enroll in a Ph.D. program. [Continue reading]

  • Monotone's help rewrite merged

    I have just merged my net.venge.monotone.help-rewrite branch into Monotone's mainline source code. I already explained its purpose in a past post, so please refer to it to see what has changed.

    There is still some work to do in the "help rewrite" area, but I won't have the time to do it in the near future. Hence I added some items to the ROADMAP file explaining what needs to be done, hoping that someone else can pick them up and do the work. They are not difficult, but they can introduce you to Monotone's development if you are interested! ;-) [Continue reading]

  • Talk about Git

    I've been using Git (or rather Cogito) recently as part of my PFC and, although I don't like the way Git was started, I must confess I like it a lot. In some ways it is very similar to Monotone (the version control system I currently prefer), but it has its own features that make it very interesting. One of these is the distinction between local and remote branches, something I'll talk about in a future post.

    For now I would just like to point you to a talk about Git given by Linus at Google. He focuses more on the general concepts of distributed version control systems than on Git itself, so most of the ideas presented there apply to many other systems as well. If you still don't see the advantages of distributed VCSs over centralized ones, you must watch this. Really. Oh, and it is quite "funny" too ;-) [Continue reading]