Ubuntu 2 1/2 month review, etc.

I’ve been using Ubuntu now for about 2 1/2 months. The verdict so far: I like it. When I initially installed it, I was looking for a Linux distro with a reasonably well-integrated and user-friendly desktop environment, and Ubuntu (with GNOME) has lived up to those expectations. I really like the GNOME file manager with its built-in sshfs support, and the menu/taskbar integration works really well. When I install an app, it automatically shows up in my “Applications” menu, and the GNOME-aware apps also take advantage of the task bar. This works even with GNOME apps that aren’t provided with Ubuntu — for example, today I installed gSTM, a GUI for managing SSH tunnels. It’s not part of the Ubuntu “universe” repository, so I downloaded a .deb from Sourceforge and installed it (there’s a handy GUI for installing .debs too, which Firefox launched automatically). Once installed, gSTM showed up in my “Applications” menu and also added itself to the task bar when I launched it. Very nice.
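For the record, the command-line equivalent of that GUI install is just dpkg — the package filename below is illustrative, not the exact one I downloaded:

```shell
# Install a manually downloaded .deb (filename is illustrative)
sudo dpkg -i gstm_1.2-1_i386.deb
# If dpkg complains about missing dependencies, let apt resolve them
sudo apt-get -f install
```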

The only thing I’m not quite happy with is my age-old gripe with all Linux distros: fonts. I’ve done all my requisite font-fiddling and I’ve got fonts I’m pretty much happy with now. But the font rendering in Firefox is just horrible. Text is always overflowing table cells and other stuff, and certain web sites just look, well, bad. It’s not bad enough to be a show stopper, but I really wish it looked nicer. I’m not sure what’s to blame: Firefox, X, GNOME, or whatever. But I will say that the fonts look pretty good in most of the other apps.

Linux GUI distros have to fight an uphill battle, because there are so many different apps (some 20+ years old) coded to all sorts of different GUI standards. There’s no way to get all of these apps to look perfect — it’s like herding cats. But the GNOME people have done a pretty admirable job fitting everything together. The user experience is about as seamless as one could hope for.

I hate Microsoft

So, I’m trying to find a nice, easy, seamless way to access data on my home Linux fileserver from XP. The goal is to have drag-and-drop access to files using the Windows GUI, as opposed to using sftp, which I’ve been doing pretty much forever and am finally getting tired of. But of course, with Windows, nothing is ever easy. My first thought was to use NFS, as I’m already using that to provide connectivity for my Mac, and Microsoft kindly provides a free toolkit (“Services for Unix”) which includes an NFS client. Nope.. our Windows PC runs XP Home Edition (which really should be called “Crippled Edition”) and SFU doesn’t work with XP Home. Of course, I didn’t find this out until I had downloaded the entire 200+ meg SFU distro, extracted it, and attempted to run the installer, which happily crapped out. Thanks guys.

With SFU ruled out, I fell back on SMB. I already run Samba on the Linux box, so I can just map my home directory to a drive letter and do it that way. That’s not quite as nice as NFS because I have to enter a username and password when I map the drive (although I might be able to work around that). But, there’s a hitch with that too — the XP box already has a couple anonymous shares mapped from the same Linux server, and for some inane reason, XP won’t let me map shares from the same server under multiple usernames. But, I outsmarted it by connecting to the server using an alternate DNS name, and that seems to work fine.
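From the XP command line, the workaround amounts to mapping the second share through the alternate name — something like this, with the server, share, and account names made up for illustration:

```
:: Already connected to \\fileserver anonymously; map the authenticated
:: share through an alternate DNS name for the same machine
net use Z: \\fileserver-alt\home /user:paul
```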

So in spite of Microsoft’s best efforts, I now have an XP box that’s actually somewhat useful on a heterogeneous network. Party on…

All’s not happy in the land of calendaring

Well, it appears I spoke too soon about backwards compatibility with the new Oracle Collaboration Suite and the old Calendar API that I’m using for my iCal downloader stuff. The first bad sign happened a couple weeks ago, when I noticed that a bunch of my Oracle Calendar entries had mysteriously disappeared from my iCal subscription. After investigating, it turned out that an entry in September 2007 was screwing it up. If I downloaded a date range up to but not including that date, it worked fine, but as soon as I included that date, about 50% of my entries disappeared. Hmmmm, not good. Then, today, I noticed that my iCal subscription was about a week out of date. When I went to run the download job manually, it bombed out with the API error code CAPI_STAT_DATA_ICAL_COMPVALUE. The docs have the following description for this code: “There was a problem with what a component contained.” Thanks guys, that’s really helpful.

So anyhow, it looks like I’m back to square one with the calendar stuff. When I get the time, I’ll rewrite it to use the newer API. Alternatively, maybe OCS has a way to do this without having to write custom code. That would sure be nice. Unfortunately though, until I get around to this, I guess I’m stuck with no Oracle Calendar data on my Palm. Bummer.

Weird mp3act streaming problem solved..

When I switched my desktop Linux box to Ubuntu recently, I was able to get everything working relatively easily except for one thing: For some reason, I couldn’t stream MP3s from my mp3act server. The playlists would download properly, but nothing would play: xmms would just ignore everything in the playlists. Well, today I finally tracked the problem down, and it turned out to have nothing to do with Ubuntu. The culprit was my web proxy configuration. I run the Privoxy ad-blocking software (ObPlug: Privoxy really makes the web a much more pleasant experience). Now, I used to run a local Privoxy on each of my Linux boxes. But when I installed Ubuntu, I elected not to run Privoxy on the Ubuntu box. Instead, I configured Firefox to proxy through the Privoxy running on my server. That way, I only have to maintain one Privoxy installation. That works fine, but it does make all of my web traffic appear to come from the server rather than the desktop. And therein lies the problem. When mp3act generates a playlist, the URLs for the MP3 streams are all keyed to the IP address that requested the playlist. When xmms requests the stream, it doesn’t go through the proxy, so the requests come directly from the desktop, and the IPs don’t match. So, mp3act refuses to serve the file.

Solution: When talking to the mp3act server, use a direct connection and don’t go through the proxy. In Firefox, this is configured under Edit->Preferences->General->Connection Settings. Once I made that change, everything worked fine.
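The mismatch is easy to see with curl — a request through Privoxy (which listens on port 8118 by default) reaches mp3act from the proxy host’s IP, while a direct request comes from the desktop’s. Hostnames here are illustrative:

```shell
# Playlist fetched through the proxy: mp3act keys the stream URLs
# to the *server's* IP address
curl -x http://server:8118 http://mp3act-host/playlist.m3u
# Stream fetched directly (as xmms does): the request comes from the
# *desktop's* IP, so the keys don't match and mp3act refuses to serve it
curl http://mp3act-host/stream/1234.mp3
```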

Very very happy to have this working again!

3 cheers for backward compatibility

Well, we finally upgraded our calendar server last night, from old-and-crusty Steltor CorporateTime to its successor, new-and-shiny Oracle Collaboration Suite. And, kudos to Oracle, as it looks like they’ve kept it backwards-compatible with the old CorporateTime API. That is, my homegrown OracleCalendar-to-iCalendar exporter thingy is still working. That’s nice, because I’ve come to depend on it, and this means that I won’t need to spend lots of time fixing it. Things are a little busy here right now, so if it had broken, it probably would have stayed broken for a while.

Once things slow down, and I can revisit this, I bet I can make it work even better with the newer APIs now. Actually I may not need to use the APIs at all any more, as OCS supposedly supports CalDAV. In the meantime though, I’ll very happily continue to use the stuff I already have.

Oh, and a few days ago, I installed MediaWiki. I think I’m going to get a lot of use out of it. Read about my recent trials and tribulations with PHP iCalendar.

Party on…

Ubuntu fonts

I think I’ve finally got an Ubuntu font setup that I can live with. It’s not perfect, but it’s livable. Here’s what I did so I can replicate it if necessary.

  • Install the msttcorefonts package.
  • Install the MS “Tahoma” and “Tahoma Bold” fonts, neither of which is included with msttcorefonts.
  • Set the X server to 96x96 DPI.
  • Install a custom .fonts.conf that disables anti-aliasing for smaller fonts, sets some font prefs, and enables sub-pixel rendering.
  • In Firefox, go to Edit->Content->Fonts & Colors->Advanced. Set the Proportional font to “Sans Serif” at 14pt, the Serif font to “Times New Roman”, the Sans-Serif font to “Verdana”, and the Monospace font to “Courier New” at 12pt.

I’m pretty sure that’s it. Further references may be found in other posts in this category.
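As a rough sketch, the anti-aliasing and sub-pixel parts of that .fonts.conf look something like this — the 10pt threshold and the “rgb” sub-pixel order are assumptions, not necessarily the exact values I used:

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- Turn off anti-aliasing for small font sizes (threshold illustrative) -->
  <match target="font">
    <test name="size" compare="less">
      <double>10</double>
    </test>
    <edit name="antialias" mode="assign">
      <bool>false</bool>
    </edit>
  </match>
  <!-- Enable sub-pixel rendering for an LCD panel -->
  <match target="font">
    <edit name="rgba" mode="assign">
      <const>rgb</const>
    </edit>
  </match>
</fontconfig>
```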

The overall result is a very Microsoft-y look, probably because of the heavy use of the Tahoma font. Some fonts are a little too small, others are a little too big (the default font in Firefox, for one). But, I can live with this until I go to a Mac on my desktop. It took a bit of tweaking, but it definitely looks nicer than my old vanilla Debian setup.

Followup 8/15.. The menu fonts in OpenOffice.org were still kinda ugly after doing all this.. I fixed this by going to Tools->Options->OpenOffice.org->View, and unchecking “Use system font for user interface.” Then when I restarted, the menus came up in Tahoma. Problem solved.

Followup 1/11/07: Installed Firefox 2 and found things required some additional tweaking. Changed Sans-Serif font from Verdana to Arial. Changed proportional font size to 16pt and fixed font size to 14pt. Re-enabled anti-aliasing for smaller fonts. Not sure I’m 100% happy with it, but it’s tolerable.

Installed Ubuntu

I tracked down a spare 8.5gig disk today (the one that came with my old P2-300 box, ironically) and installed Ubuntu on it. First problem: I installed the spare disk as an IDE slave, and the Ubuntu install totally hosed the boot loader on the master (Grub). After installation, the boot loader gave some cryptic error message and hung. So, I booted into Knoppix and reinstalled the boot loader, which allowed me to boot into my Debian OS on the master disk. I then attempted to configure my existing Grub to boot Ubuntu off the slave. But, when I booted, grub refused to recognize the slave disk. Not sure why (a BIOS issue, maybe?), but I ended up copying all of the Ubuntu /boot stuff into the /boot partition on my master disk, pointing grub at that, and just booting everything from there. Once I did that I was finally able to boot Ubuntu. (One hitch with this method — kernel upgrades in Ubuntu are no longer automatic. I have to copy the kernel and initrd images into /boot on the main disk, then edit the grub.conf there to boot the new kernel. Not a big deal, as I don’t plan on running this configuration for too long — if I like Ubuntu, I’ll install it as the main OS on the computer.)
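The grub stanza for the copied kernel ends up looking something like this — the kernel version and device names here are illustrative guesses, not my exact values:

```
# Boot the Ubuntu install on the slave disk using the kernel copied
# into /boot on the master (version and devices are illustrative)
title  Ubuntu (slave disk)
root   (hd0,0)
kernel /vmlinuz-2.6.15-27-386 root=/dev/hdb1 ro
initrd /initrd.img-2.6.15-27-386
```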

Upon bootup, it immediately prompted me to install 160-odd megs of software updates, which mostly worked, but some of them apparently crapped out, as I got a happy-fun-ball “some updates failed to install” window after the installation finished. Given that Ubuntu uses apt, this is somewhat to be expected, but I hope it doesn’t screw up further updates (as apt failures on my Debian boxes are wont to do). Followup — no further problems with apt so far. After installing the updates, I was prompted to reboot, which I did, which brings me to where I am now, writing this entry.

Ubuntu seems nice enough, but so far it doesn’t seem much different from other Linux desktop installations I’ve seen, all of which are fraught with quality-control issues such as these. Once configured, they work well, but there’s always that pain of setup and configuration. I guess I’m a little disappointed — after all the hype I’ve read, I was hoping Ubuntu would be more revolutionary — a Linux desktop that doesn’t really feel like a Linux desktop. Oh well. Off I go to a command-line to get my graphics card working and fix my fonts, just like every other Linux desktop….

OK.. Installing the nvidia driver was easy, actually. There’s a package (nvidia-glx) that takes care of it. After installing this, I went in and copied my configuration out of my old xorg.conf, restarted X (by way of the gdm display manager), and it came right up with my dualhead configuration.
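In package/config terms, that whole step is roughly the following (the restart command assumes gdm is the display manager):

```shell
# Install the packaged nVidia binary driver
sudo apt-get install nvidia-glx
# In xorg.conf's "Device" section, switch the driver to the binary one:
#   Driver "nvidia"
# Then restart X by way of the display manager
sudo /etc/init.d/gdm restart
```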

I’m now in the process of installing some other “must-have” apps such as emacs, thunderbird, etc. Oh yeah.. and OpenAFS. Uh-oh…

Well, openafs turned out to be painless. Just install the modules-source package and follow the directions in /usr/share/doc/openafs-client/README.modules. Now to work on fonts. Installing msttcorefonts package helped a lot. To do that I first needed to go into Synaptic (Ubuntu’s GUI front-end to apt) and enable the “multiverse” repository, which includes non-free packages. Then, I found that my X display was not set to 96x96dpi, which is supposedly optimal for font rendering. Based on info found here and here, I tweaked the display DPI in xorg.conf by adding the following option to my “Monitor” section:

DisplaySize 783 277 # 96 DPI @ 2960x1050

and the following to “Device”:

Option "UseEdidDpi" "false"
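The DisplaySize numbers fall out of the target DPI: DPI = pixels ÷ (millimeters ÷ 25.4). A quick shell sanity check of the values above:

```shell
# Verify that 2960x1050 pixels on a 783x277mm "screen" works out to 96x96 DPI
awk 'BEGIN {
  printf "%.0fx%.0f DPI\n", 2960 / (783 / 25.4), 1050 / (277 / 25.4)
}'
# Prints: 96x96 DPI
```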

Next it looks like I need to tweak anti-aliasing for certain fonts (reference).

Little by little it’s coming along.

I found another good tutorial for configuring fonts under Ubuntu. This one includes instructions for installing the Tahoma family, which for some reason is not included with Microsoft’s Core fonts for the web. With the MS fonts (plus Tahoma) installed, things look much better already, and apparently I can improve the look further by tweaking anti-aliasing and other stuff… might play with that a bit tomorrow.

First impressions of Ubuntu, etc.

Last Friday I tried out the latest release of Ubuntu Linux. They provide a “live” CD, which boots and runs directly from the CD just like Knoppix. My goal is to find a nice desktop-oriented version of Linux that “just works”. On the server side, I’m sticking with Debian, but I find vanilla Debian a bit lacking in the desktop department. So as a stop-gap before cutting over to OS X completely, I thought I’d try out Ubuntu and see how I like it. Ubuntu is based on the same apt package system as Debian, so it’s familiar, and it’s touted as being very desktop-friendly.

First impressions: it looks nice. apt-get works as expected from the command line, but the default archive server has a very slow connection — I wonder if there are mirrors on Internet2 that I could use. If not, that’s a definite drawback, as I’m not sure I could give up the blazing speed I get from debian.lcs.mit.edu. I was able to install xmms easily; my sound card was immediately recognized, and the system shares it between apps. However, for some reason streaming MP3s from my mp3act server didn’t work. Recent versions of OpenOffice and Firefox are provided. It didn’t pick up my dual-head setup, but I didn’t expect it to — I’ll need to download and install nVidia’s x.org driver manually. It looks like I’ll need to install some compilers and devel tools before I’ll be able to build the nVidia driver. But I expect it’ll work.

As with every other version of Linux, the default fonts are butt-ugly. Why can’t someone put out a Linux distro that has nice fonts out of the box? That has always been one of my biggest gripes with Linux. There are tutorials on the ‘net to improve the look of the fonts under Ubuntu, but honestly, this shouldn’t be something I have to mess with. Linux is never going to get anywhere in the desktop market until they can get past this issue.

All of that said, I think I may try out an “official” install of Ubuntu on the hard drive, and see how it goes for a while. I’d rather not wipe out my existing Debian install, so I’ll have to scrounge around for a spare hard drive first.

In other news.. I’m thinking about finally taking the plunge and going with totally paperless bills and financial statements (where possible). My redundant disk setup gives me a more reliable place to store documents electronically, so there’s no reason not to go for it. As with everything else, I’ll see how it goes.

MySQL Replication

I’m about done with the computer shuffling I started a month or so ago. I have a 300g drive at work and a 120g drive at home. The idea is to replicate stuff like the MP3 collection, photos, and system backups in both places, to guard against losing data to a disk crash.

The next challenge is to set up an mp3act server at home that replicates the one at work. I’m going to try doing this with MySQL replication. The idea here is to replicate the mostly-static music library tables with work being the master and home being the slave. Then, each site would have its own copies of the dynamic tables like playlists and playlist history. This lets me use one-way replication and avoid setting up a dual master/slave configuration, which I don’t think would work well with the dynamic tables, particularly WRT simultaneous access etc.

Yesterday and this morning I took the first steps toward doing this, which involved locking down my MySQL server at home, then setting it up to allow remote TCP connections (comment out the bind-address option in my.cnf). Then I needed to create a slave user, which I did with:

GRANT REPLICATION SLAVE ON *.* TO slave@homedsl IDENTIFIED BY 'password';

The grant command will create the user if it doesn’t already exist.

Then, I needed to edit /etc/mysql/my.cnf on both the master and the slave, to give each one a unique server ID. I gave the master an ID of 1 and the slave an ID of 2. This is accomplished in the [mysqld] section of the config file, with a line like this:

server-id=1

Next, I followed the instructions in the documentation (see link above) to lock the tables on the master and create tar archives of each of the databases I wanted to replicate (on my Debian box, each database has its own directory under /var/lib/mysql). I then untarred the databases into /var/lib/mysql on the slave. For each database I was copying, I added a line like this to the slave’s config file:

replicate-do-db=database-name

I found that if I wasn’t replicating all the databases from the master, the replication would fail unless I explicitly listed the databases like this. The slave expects the tables it’s replicating to already be present — it does not “automagically” create tables as part of the replication process. I wasn’t really clear on this point after reading the documentation; it was only clear after I actually tried it.
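Putting the slave-side settings together, the relevant my.cnf fragment ends up looking something like this (the database name is illustrative):

```
# Slave's /etc/mysql/my.cnf
[mysqld]
server-id       = 2
# Replicate only the databases that were copied over from the master
replicate-do-db = mp3act
```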

With the tables copied over and the configurations tweaked accordingly, I followed the docs to record the master’s ‘state’ info, point the slave at the master, and start the replication threads on the slave. This all worked as expected. All in all, it wasn’t too hard to set up, so I’ll see how well it works going forward.
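For the record, the “point the slave at the master” step boils down to something like this on the slave — the host, credentials, and log coordinates here are illustrative; the real File and Position values come from SHOW MASTER STATUS on the master:

```sql
-- Tell the slave where the master is and where to start reading its binlog
CHANGE MASTER TO
  MASTER_HOST='work.example.com',
  MASTER_USER='slave',
  MASTER_PASSWORD='password',
  MASTER_LOG_FILE='mysql-bin.000001',
  MASTER_LOG_POS=98;
-- Start the replication threads
START SLAVE;
-- Verify that both the I/O and SQL threads are running
SHOW SLAVE STATUS\G
```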

Whew… what a day

Thursday was quite the day of triumphs and tribulations.

It all started out with a successful swap-out of my home Linux server. It had been running on an old, trusty-but-tired Dell P2-300mhz, and I upgraded it to a slightly-less-old Dell P3-450mhz (Linux is far from perfect, but it truly shines at getting new life out of old, scavenged hardware). The upgrade was as easy as: build a new kernel, shut the old box down, pull out all the boards and peripherals, put the stuff in the new box, and boot the new box up. The result was a home server with a 50% faster processor and an extra 256mb RAM (640mb total vs 384mb). Not earth shattering, but a noticeable improvement, and it was easy to do. The trick to doing this is to transplant as much of the “guts” from the old box to the new box as possible, so the hardware configuration stays mostly the same.

Next up was the launch of our new myUMBC portal, which so far has been very smooth, other than the usual little niggling problems that always pop up with these things. We had already been running uPortal in production for six months, and that experience definitely helped ease the transition. The centerpiece of this launch was a complete redesign of the UI, but behind the scenes we also upgraded the framework and layout manager. This gave us an opportunity to start out “clean” with a fresh set of database tables, and to redo a few things which were done sloppily with the initial launch (such as our PAGS group and channel hierarchies). It positions us very well for future releases and gives us a clean platform on which to build future improvements.

Of course, by Murphy’s Law, the air conditioning in ECS picked yesterday to go on the fritz. So the launch was a little sweaty, but it happened anyhow. When I left the office, the A/C was finally getting back to normal, and then I get home and our power goes out. It ended up being out for around 4 hours, from 7:30pm till 11:30pm. Not all that long, but it always seems like forever when it’s actually out, and it made for a pretty sweaty (and surly) day. And of course, BGE’s automated system never gave us an estimate of when the power would be restored, so Cathy packed up the kids to go sleep at her sister’s, and pulled out 2 minutes before the power came back on. Fortunately I was eventually able to get a decent night’s sleep. I must say I’m more than ready for this summer to end. This is the first truly miserable summer we’ve had since 2002, and I had forgotten just how bad 2002 was. Oh well… t-minus 7 weeks till the Outer Banks.