More fun with digital TV

The great digital TV antenna project continues.  I found out that the cheapie UHF antennas I built are known as “4-bay bowtie dipoles,” and they are very similar to the Model 4221 by Channel Master.  Based on my reading, I’ve decided not to try using a combiner to join the antenna signals.  Instead I picked up a remote control A/B switch at Radio Shack, model 15-1968, and it seems to work great.  I’m going to buy a second one for our other TV.  Provided most of your stations are in one of two different directions (as mine are), this switch is a great alternative to a rotator.  In particular, multiple TVs can watch signals from different antennas simultaneously, which is not possible with a rotator.  The downside, of course, is that you need to run two separate antenna cables to each TV.  But that only needs to be done once.  I’ve also ordered a couple of Sony model RM-VL600 universal remotes, based on all the positive reviews.  My hope is that I can use these to work the A/B switches.  We’ll see how they work out once they get here.

I may need to move my Baltimore antenna.  It’s aimed NNE directly at TV Hill, but there are a lot of tall trees blocking its path.  It seems to pick up most of the Baltimore stations just fine.  WMAR-2, WBAL-11 and WBFF-45 all come in perfectly with 95%+ signal strength consistently and no dropouts.  WJZ-13 is my problem child, though.  I was watching it this afternoon and it started dropping out as soon as the wind kicked up.  I’m wondering if the frequency WJZ-DT is currently using has something to do with it — WMAR, WBAL and WBFF are all currently at the higher end of the UHF spectrum, while WJZ is lower at channel 38.  Dunno, but I’m going to try moving the antenna to the other end of the house, where it can hopefully get a clear shot through the foliage.  Just need a longer length of RG-6.

All bets are going to be off come February 2009, when a lot of these stations will be shifting back to the VHF band.  At that point, I may need to add a VHF antenna to my setup.  Looks like all of my local stations will end up on the high VHF band (channels 7-13), so I should be able to get by with a smaller VHF antenna.  I’m going to hold off before I do anything, though.  My current antennas seem to pick up the analog channels in the VHF-hi band pretty well, so they may do the job with the digital channels.

Stay tuned..  (no pun intended)

Mortgage Escrow Follies

A year or so back, the mortgage on our primary house was sold to CitiMortgage.  Prior to this happening, I was aware of quite a number of horror stories about CitiMortgage, so when the sale was announced, I was a little apprehensive.  However, one thing I’ve learned over the years is to always look at the big picture.  The nature of these things is that people who have problems tend to complain the loudest, so for every one person complaining about CitiMortgage on the Internet, there are probably hundreds who are not having problems.  Still, there are disproportionately more horror stories floating around about CitiMortgage than about other mortgage companies, which is a little troubling.  All the same, I gave them the benefit of the doubt.

After a year, the verdict on CitiMortgage is neutral.  The loan transfer went off without a hitch, with no mistakes on the principal, interest, and amortization side of things (kind of hard to screw up a simple 15-year fixed-rate loan, I would think).  They also made two interest credits to the escrow account, which was a pleasant surprise, although I’m assuming they were one-time credits as there have been none since (Maryland does not require lenders to pay interest on escrow balances, so I’m wondering if this was one of the terms of the loan sale or something — haven’t bothered to investigate).  To date, all of our escrow transactions (property tax and hazard insurance bills) have been processed correctly and on time.

There’s been one little hiccup to the whole CitiMortgage experience, related to the annual escrow analysis process.  First, some background.  Our property taxes come due semiannually, as is the case in many municipalities.  However, they’re not billed in equal installments — the first installment (due in July) is always several hundred dollars more than the second (due in December).  I’m not sure how common this practice is.  But in any case, most mortgage companies I’ve dealt with take a full year’s worth of past tax payments into account when running an escrow analysis.  CitiMortgage, however, assumes that the tax is billed in equal installments, and only looks at the most recent tax payment during the escrow analysis.  They then use this amount to project both semiannual tax payments for the coming 12 months, and as a result, their numbers are always wrong.

Depending on what time of year they run the escrow analysis, this can be good or bad.  CitiMortgage ran our first escrow analysis in January 2008, right after they paid the (lower) December property tax installment out of escrow.  They then used the lower December amount to project both the July 2008 and December 2008 payments.  The result was a lower-than-expected monthly escrow payment, which is great (as long as there’s still enough in the escrow account to cover the bills — you never, ever want your escrow balance to go negative).  However, CitiMortgage caught onto this in July, when the tax payment was much higher than they had projected.  This triggered another escrow analysis in July.  This time, they used the July amount to project payments in December 2008 and July 2009, which resulted in a monthly escrow payment that was too high.

Now, it’s nothing personal, but if CitiMortgage underestimates my monthly escrow payment, and there’s still enough in the account to pay the bills, I’m certainly not going to call it to their attention.  In the opposite situation, though, I’m always going to call them on it, because I don’t want to pay more than necessary into a non-interest-bearing escrow account.  So, when we got the second escrow analysis statement, we got on the phone with them.  They told us to fax them a copy of our property tax bill, showing the correct amount due for December, which we did.  This didn’t produce any action for two weeks, though, so my wife called them again.  This time she reached a supervisor who acknowledged receipt of the fax, re-computed the escrow, and adjusted our monthly payment on the spot.  So the two weeks of inaction was a little questionable, but the followup call produced an immediate resolution.

I was recently reading some of the FAQs on CitiMortgage’s web site (I’d provide a link, but it appears you need to be signed in to get at the FAQs) and it turns out this process is documented there.  Here’s an excerpt; the instructions for requesting an interim adjustment are the relevant bit:

We automatically adjust your escrow payment one time a year to reflect changes in your escrow related items. If you would like us to complete an interim adjustment, please send official documentation of the new tax amount to our Tax Department at:

CitiMortgage, Inc.
Attn: Tax Dept.
PO Box 23689
Rochester, NY 14692
Please write your account number on the documentation.

If this happens again (and I’m assuming it will) I may try sending my request to them via certified mail, with the thinking that it’ll be more effective than a fax.  Either way, though, it seems we should be able to get it rectified with a letter/fax and possibly a single follow-up phone call.  Time will tell.

If this little snafu unfolds the same way every year, it will result in a lower average escrow balance over time, because our escrow payment will be lower than it should be for the first half of the year.  The price of this is the hassle of getting the payment corrected after it’s overestimated in July.  Yet another reason to try and pay off the mortgage early, I guess 🙂

And finally, some parting advice:

  • Stay on top of things.  Monitor your escrow balance and activity at least monthly.  Make sure there is always enough in it to cover the bills.  Ensure that the mortgage company is paying the bills in a timely fashion.
  • Understand how mortgage companies compute escrow payments.  It’s a simple formula, and every mortgage company uses it.  If you know it, you can double-check the mortgage company’s numbers and call them on any errors (see the sketch below).
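
For the curious, here’s a rough sketch of the standard calculation with made-up numbers: a $2,800 July tax bill, a $2,200 December bill, and a $900 annual insurance premium.  The real inputs (and the exact cushion your servicer is allowed to hold) come off the escrow analysis statement itself.

$ echo "scale=2; (2800 + 2200 + 900) / 12" | bc   # projected annual disbursements, spread over 12 months
491.66
$ echo "scale=2; (2800 + 2200 + 900) / 6" | bc    # the cushion, typically capped at two months' worth (1/6 of the annual total)
983.33

The first number is the base monthly escrow payment; the second is the cushion the account is generally allowed to carry.  If the projected low point of the account falls short of that cushion, the shortage typically gets spread across the next twelve payments, which is exactly how a bad tax projection turns into a payment that’s too high or too low.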

Laptop Travel Essentials

Being an intrepid occasional business traveler, I’ve come to rely on my trusty MacBook as sort of an office-away-from-home. Working while away from the office presents an interesting set of challenges. Internet access (particularly Wi-Fi) is becoming ever more ubiquitous, so getting connected is easy, but there’s no guarantee that a given Internet access point is secure — in fact, it’s most likely not. Working remotely often requires access to potentially sensitive data and resources that are protected behind firewalls and the like. It’s best to keep sensitive data off laptops, as laptops are easily stolen, and a data breach could land you on the 11:00 news, or worse.

This is a collection of tips, tools, etc. that I use to work securely with my laptop while on travel. It’s geared towards Macs, but much of it is applicable to other operating systems as well. Comments, suggestions, corrections, etc. are welcome.

SSH Tunnel Manager

A lot of organizations use VPNs to facilitate off-site access to private intranets, and ours is no exception.  I’ve never been a big fan of VPNs, because they all seem to rely on OS-specific drivers (or weird Java applets) that inevitably fail to work properly with my OS or web browser.  So, I avoid our VPN and use SSH tunnels instead.  All this requires is SSH access to a host with access to the intranet resource(s) I need.  With several well-crafted SSH tunnels, I’ve never found a need to use our VPN.

There’s one catch with SSH tunnels where laptops are concerned.  Setting up an SSH tunnel often requires feeding the SSH command a complex set of options.  When I’m on travel, I’m constantly moving from place to place, and bringing my laptop in and out of sleep mode.  This causes the SSH connections to time out, and I end up having to re-initialize all my tunnels every time I want to work on something — a big pain.  This is where a good SSH tunnel manager helps.  A tunnel manager maintains a list of tunnels and lets you start and stop them with a mouse click.  There’s a decent app for OS X called (surprise) “SSH Tunnel Manager,” and PuTTY does a nice job on Windows.  For Linux, I like gSTM.  With SSH Tunnel Manager, I’m up and running in seconds after starting up the laptop, and I don’t have to remember complex SSH command-line options.
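
To give an idea of what the tunnel manager is saving me from retyping, here’s the sort of command it runs under the hood.  The hostnames are made up; substitute your own gateway and intranet hosts:

$ ssh -f -N -L 8080:intranet.example.com:80 me@gateway.example.com

The -L option forwards local port 8080 to port 80 on the intranet host, by way of the gateway, and -f -N put the session in the background without running a remote command.  Anything pointed at localhost:8080 then rides the encrypted tunnel.  Multiply that by four or five tunnels, each with its own hosts and ports, and it’s easy to see why I’d rather just click a menu item.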

Firefox Proxy-switching Extension

Secure web browsing is a primary concern when traveling.  As such, I do all my browsing through SSH tunnels, which ensures that all my browser traffic is encrypted.  For general purpose browsing, I use a tunnel to an ad-filtering proxy running on a server in my office.  For work-related stuff, as well as online banking and the like, I use a SOCKS proxy.  There are a couple other configurations I use as well.  Each of these requires a different proxy configuration in Firefox, but as shipped, Firefox only lets you define a single one; if you want to change your proxy, you have to go in and manually update the settings each time.  Proxy-switching extensions allow you to define as many proxy configurations as you want, and switch between them quickly and conveniently.  I’ve found them to be indispensable.  There are a bunch of proxy-switching extensions out there, but my favorite is SwitchProxy, because it seems to strike the best balance between simplicity and functionality (note that the stock version of SwitchProxy doesn’t run on Firefox 3, but I found a modified version that works nicely here).
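
The SOCKS proxy, incidentally, can be just another SSH tunnel; OpenSSH has a built-in SOCKS server behind the -D option.  A minimal sketch, with a made-up hostname:

$ ssh -f -N -D 1080 me@work-server.example.com

One of my proxy configurations then just points Firefox’s SOCKS host at localhost, port 1080, and everything browsed under that configuration goes out through the far end of the tunnel, encrypted along the way.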

Foxmarks

Foxmarks is a Firefox extension that synchronizes bookmarks between different instances of Firefox.  With Foxmarks, I now have the same set of bookmarks at work, at home, and on my laptop, and when I change my bookmarks in one place, all the others stay in sync automatically.  I’ve been running separate Firefox installations on different computers forever now, and I only recently discovered Foxmarks.  It’s one of those things where once you have it, you wonder how you got along without it.

VNC

VNC, or Virtual Network Computing, is a remote desktop-sharing technology.  It’s similar to Microsoft’s Remote Desktop service, but it’s an open standard and is platform-independent.  It allows me to pull up a virtual desktop and access data on a remote server as if I were physically sitting at the server.  This is a great way to keep sensitive data off my laptop — I just manipulate it remotely.  All of the connections are made through SSH tunnels (what else?).

VNC is one of those things that I keep finding more and more uses for as time goes on.  I use it to access various GUI-based apps on my home and work PCs while traveling.  It’s particularly useful for running the occasional Windows or Linux-based app that I don’t have available on my Mac.  For example, I use GnuCash to track all of our household finances.  It’s installed on my Linux server at home.  With VNC, I can connect to my home server, run GnuCash remotely, and keep up with the finances while I’m away from home.  No need to run it locally on the Mac and worry about the data getting out of sync.
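
For the record, the tunneling part of this is nothing exotic.  A sketch of how a session like the GnuCash one gets set up, with a made-up hostname (VNC display :1 listens on port 5901):

$ ssh -f -N -L 5901:localhost:5901 me@home-server.example.com

The VNC client on the laptop then connects to localhost:5901, and the whole desktop session travels inside the SSH connection instead of crossing the hotel network in the clear.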

My favorite VNC client for the Mac is Chicken of the VNC.

FileVault

FileVault is a file-encryption system that ships with OS X.  It will transparently encrypt and decrypt files on the fly, using the user’s account password as a key.  I haven’t used it before, but I am going to give it a go with my new laptop.  It seems like an easy way to protect sensitive data that inadvertently finds its way onto the laptop.  In the event the laptop is stolen, the thieves will at least have to work harder to get at the data.

And there you have it.  I’m sure I’m leaving something out that will become apparent the next time I travel.  One thing I’d like to have is some sort of theft recovery software.  Haven’t yet looked into what’s available in that department.

They warned me

Everyone warned me the time would fly by, and they were right.

Today was my oldest son’s first day of kindergarten.  In what seems like the blink of an eye, he’s gone from infant, to toddler, to preschooler, to kindergartener.  Kind of scary how the time flies.  Next thing I know he’ll be in high school, then college.  I’m doing my best to enjoy his childhood while I can.  One of the rewards of age and wisdom is the ability to recognize life’s significant moments while they’re still happening.  Nothing drives this home like parenthood.  Our own childhoods are gone, but we can relive them vicariously through our children.

There’s an awful lot of advice about parenting floating around out there.  But I think a happy childhood is the single greatest gift a parent can give a child.  Hopefully my own kids will grow up to be well-adjusted adults with happy memories of childhood.

And with that bit of late-summer sentimentality, we now return you to our regular bevy of posts about geeky stuff and swimming pool maintenance 🙂

Gorilla Glue Test

We have an automatic cleaner for our swimming pool.  It’s the kind that runs on water pressure and uses a booster pump.  I have kind of a love-hate relationship with it.  When it works, it works great, but when it doesn’t, I’m constantly swearing at it.  It’s got a lot of moving parts (belts, gears, you name it) and it seems like something breaks on it every year.  Anyhow, this year it was the automatic backup valve’s turn to break.

The backup valve is a big conglomeration of gears driven by a paddle wheel.  It’s actually quite ingenious.  The gears drive a rotating port that changes the flow of water through the valve every several minutes, which causes the pool cleaner to “back up,” an essential feature since the cleaner is continually getting stuck in the pool’s corners.  One of the gears is held in place by a small retaining clip, and at some point this year the clip came off and disappeared into the great unknown.  The result was that the gear no longer stayed on the shaft.  It would stay put for a while, but eventually it would fall off and the backup valve would stop cycling.

Lacking a matching retaining clip, I wasn’t sure how to fix this without shelling out megabucks for a new valve.  So I decided to try Gorilla Glue. We picked up a bottle of this a while back, and it’s pretty impressive stuff.  One of its interesting properties is that it expands as it cures, sort of like that expanding aerosol spray foam stuff.  This can be a pain when you’re using it to repair furniture, as the glue tends to ooze out of the repair as it cures.  But, it’s exactly what I needed in this case.  I just stuck the gear on the shaft and added a dab of Gorilla Glue, and it expanded into a small blob, holding the gear in place perfectly.

So anyways, Gorilla Glue claims to be waterproof.  During pool season (which runs from Memorial Day through around the end of September in these parts), the pool cleaner spends 90% of its time at the bottom of the pool, submerged in chemically treated water.  I made the repair somewhere around the end of June, and it’s been mostly submerged ever since.  At the end of pool season, I’ll take the valve apart and see how well the repair has held up.  I’ve had no problems with the backup valve since making the repair, which I take as a good sign.  In any case, I can’t think of a better way to see how waterproof this stuff really is.  Stay tuned!

Blog administrivia

OK.. figured out a somewhat better solution for links to blog posts. Use relative URLs, and use named permalinks: in other words, something like /gorilla-glue-test/ rather than an absolute link to /?p=123 (just an example). Then, I should be able to move the blog in the future without breaking self-referential links (provided that the new blog either runs WordPress or can support WordPress-style permalinks). I still need to go through and update all my old links, which looks to be a tedious process, but with any luck it’ll be the last time I need to do it.

I discovered the other day that it doesn’t work to set up a CNAME pointing to ‘lpaulriddle.wordpress.com.’ I just end up getting sent to the main wordpress.com homepage. Using an HTTP redirect (or meta refresh) seems to work. It doesn’t appear that I can set up a redirect with my current domain hosting provider, though, at least not without signing up for web hosting that I don’t want. However, I’m going to be shopping around for a new home for my domain within the next couple of months anyhow, so I’ll check into this as I’m evaluating new providers.

Blogging again…

Hopefully I’ll pick up the blogging pace a bit now that I’ve moved everything over to wordpress.com.  In preparation for un-password-protecting the blog, I’m going through all of my old posts, and the only thing I’ve noticed is that all of my links to other blog entries are broken.  Nothing unexpected, but I wonder if there’s anything that can be done to prevent this from happening every time I move the blog (it’s moved 3 times now, and it’ll probably move again).  Probably not, but right now all my links reference posts by number, and it might help to change them to use named permalinks.

My latest pet project at home is preparing us for the impending cutover to digital TV.  We’re far too cheap to pay for cable or satellite TV (although FiOS may be hard to resist), so I’m concentrating on getting a nice setup for receiving over-the-air digital broadcasts.  Following some instructions I found on the Internet (where else), I built two homemade UHF antennas.  The author of this web page uses a single antenna with a rotator.  At our location, though, we’re smack-dab in the middle of the Baltimore and DC TV markets.  So I can set 2 antennas up in the attic, one aimed at Baltimore, and another aimed at Washington, and pick up pretty much every station within 50 miles, without the hassle of a rotator.  The only issue is combining the signals.  The antennas work great separately, but I haven’t tried using a combiner yet (I try to avoid going up in the attic this time of year..).  Using a combiner in this kind of setup is always going to result in some signal loss, so the question is, will the resulting combined signal be acceptable?  I don’t know, but it’s easy enough to try, which I plan on doing soon.  If the combiner setup doesn’t work well, the other option is to run separate antenna feeds to each TV and then use a switch similar to Radio Shack Cat. No. 15-1968 (each TV would need its own switch).  I know this will work, but it obviously involves extra work and expense.  But it’s still preferable to a rotator, IMO.

Sort of on the same topic, I picked up one of those much-ballyhooed digital “converter boxes” a while back, to use with our old TV.  Total outlay was just over $20, thanks to the $40 coupon from Uncle Sam.  This is an Apex model that is being sold at Best Buy.  It works as advertised, and includes all the standard features you would expect from a digital tuner (TV guide, signal strength meter, etc.).  However, I kind of wish it had come with a universal remote.  The included remote control works fine, but it’d be a nice touch if I could also turn the TV on/off and adjust the volume with it.  As it is now, I’m stuck with 2 remotes until I can find a cheap universal remote that can also work the converter box.

More later..

That was easy…

I was searching for a new home for my blog (and didn’t want to pay for web hosting, at least not yet), and it came down to wordpress.com vs. blogger.com.  Blogger.com had the early edge because it works with my existing Google account, and offers free domain name mapping.  WordPress.com charges a nominal annual fee for domain name mapping, and requires a separate account.  However, Blogger.com didn’t have an easy way for me to import my existing WordPress blog.  With wordpress.com, it was as simple as exporting the old and importing the new, and everything came in completely intact.  Other than the URL, I can’t even tell the difference between this and my old self-hosted WordPress blog.

There’s also the question of whether I need domain name mapping in the first place.  It’s not like I expect this to be a high-visibility or high-traffic blog.  I think I’ll just set up a CNAME for this under lpaulriddle.com, and be done with it.

In any case, looks like I’ll be setting up shop here for awhile.

Latest Ubuntu Upgrade

I just upgraded my Ubuntu box from 7.04 (Feisty Fawn) to 7.10 (Gutsy Gibbon), and after 3 upgrades (I started out with Dapper Drake) I remain impressed with how easy and painless the process is. This time there was a hiccup, though it’s not something I’d expect an average user to encounter.

First, a bit of background. My desktop machine gets a lot of its files via NFS from a remote server. The server runs a base of Debian Etch with a bunch of stuff from the testing/unstable trees. The two computers are both on the same LAN, and the server currently runs kernel v. 2.6.20.1. Ubuntu 7.10 currently uses 2.6.22.

After completing the upgrade, I rebooted my machine into Ubuntu 7.10 for the first time, and logged on. It took about 5 minutes for all my menus and apps to come up (some of the apps came up before I had any menus, making me wonder if the upgrade had botched something.. but everything did finally appear). I quickly figured out the cause of the problem: all of my NFS mounts were timing out.

I did a few more tests, and I found out that I had no problem mounting and unmounting NFS directories from the server. But when I tried to run ‘ls’, my terminal just froze and I got ‘NFS server blah.blah not responding’ in the kernel log. No amount of rebooting, re-exporting filesystems, etc. seemed to help. I wondered if it was some sort of subtle incompatibility between the two different kernel versions, although I’d never had this kind of issue with NFS before in my almost 20 years of dealing with it. (Wow, has it really been that long?)

I’m aware that there are two versions of NFS nowadays, the older version 3 and the newer version 4. The 2.6 kernel supports both versions, but I’ve always run version 3 because it has always worked for me, and I’ve never seen a need to change. Plus, when I go to configure my kernel, all of the NFS v4 stuff is labeled as EXPERIMENTAL, which makes me shy away from it. This time, though, rather than futzing around trying to get my old NFS v3 setup to work again, I decided to try v4. I built it into the server’s kernel and rebooted the server. I then followed the very helpful Ubuntu NFSv4 Howto, which explained the differences between v3 and v4 and walked me through the setup.  It worked, and it doesn’t hang any more.
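
For my own future reference, the setup the howto walks you through boils down to exporting a single pseudo-root and mounting relative to it.  Something along these lines, with made-up paths and addresses (the howto has the full details, including the idmapd bits):

/export        192.168.1.0/24(rw,fsid=0,no_subtree_check,async)
/export/home   192.168.1.0/24(rw,nohide,no_subtree_check,async)

Those lines go in /etc/exports on the server; /export/home is a bind mount of the real /home, and the fsid=0 entry marks the NFSv4 pseudo-root.  The client then mounts paths relative to that root rather than the server’s real paths:

# mount -t nfs4 server.example.com:/home /home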

It’s a little troubling not knowing why the upgrade broke my NFS v3 setup.  Searching around on Google wasn’t too helpful.  I suspect it’s some kind of issue with the 2.6.22 kernel, but I did not want to spend a lot of time troubleshooting it..  I just need my computer to work.  So I’m glad NFS v4 fixed it for me; otherwise I’d probably have had to downgrade back to Feisty.

NFS issue aside, the Gutsy upgrade was very smooth, and I continue to be happy with Ubuntu.

Ubuntu hard drive upgrade

I just finished upgrading the hard drive on my Ubuntu machine, and it wasn’t as easy or straightforward as I was expecting.  So I figured I’d write up some notes for the next time I do it.

First I backed everything up. Then I shut down the computer, put the new drive in, and booted up with a copy of Knoppix I had lying around. Under Knoppix, I opened up a shell and mounted my old root filesystem:

# mount /dev/hda1 /mnt

After partitioning the new drive and creating a filesystem on it, I mounted the new root filesystem on /mnt2 and copied all the files over:

# mount /dev/hdb1 /mnt2
# cd /mnt
# tar cvpf - . | (cd /mnt2; tar xpf -)

Then, I installed the grub boot loader in the MBR of the new drive:

# grub
grub> root (hd1,0)
grub> setup (hd1)

(The root and setup commands are entered at the grub shell prompt.  In grub’s numbering, hd1 is the second drive, i.e. the new drive that was hdb at the time, and (hd1,0) is its first partition.)

At that point, I shut down the computer, removed the old drive, installed the new one in its place, and attempted to boot back up. Happily, it found the grub boot loader and proceeded to load the kernel. But then it hung trying to mount the root filesystem.

It turns out that a couple releases ago, Ubuntu started referring to disk partitions by UUID rather than using a specific device name such as /dev/sda1 or /dev/hda1. Both /boot/grub/menu.lst and /etc/fstab still contained UUID references for the old hard drive, so I had to go through and painstakingly replace all the old UUID references with the updated UUID for the new disk. I just used vi and did a search-and-replace, although there’s probably an easier way. Once I did this, everything booted up just fine.
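
Note for next time: the new partition’s UUID is easy to look up, and the search-and-replace doesn’t have to be done by hand in vi.  Something along these lines, with the old and new UUIDs as placeholders (adjust the paths if the new root is mounted somewhere other than /):

# blkid                                                             # lists each partition along with its UUID
# sed -i 's/OLD-UUID-HERE/NEW-UUID-HERE/g' /boot/grub/menu.lst /etc/fstab

Running ls -l /dev/disk/by-uuid/ is another quick way to see which UUID belongs to which device.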

I can see the advantages to using UUIDs, but it does add an extra layer of complexity when doing something like this. At least I know what to expect the next time around.

Coming soon: my adventures upgrading from Ubuntu 7.04 (Feisty Fawn) to 7.10 (Gutsy Gibbon).  There were a couple of snags, but it was mostly painless.