Fruddled Gruntbugglies

Enthralling readers since 2005

Author: lpaulriddle

  • Leaf Patrol

    Yesterday, I finally finished up this year’s round of fall leaf removal. After 5 go-arounds with leaf removal on this property, I’m getting better at it, but the process could still stand some improvement.

    The bulk of the leaves fall in back of the house, with the Tulip Poplars starting earliest, and the Oaks finishing up last. The Tulip Poplars start dropping leaves in mid to late August, creating an ongoing chore of clearing leaves from the pool, pool area and deck. The rest of the trees are better behaved, and drop their leaves in November.

    Up to now, my leaf removal equipment has consisted of a Toro electric blower/vac/mulcher, a push broom, a rake, and an old chipper/shredder (rescued from my parents’ garage).

    Now, the Toro actually does a really nice job. I use it in vac mode in the summer to clean up around the pool area without blowing debris into the pool. The blower does a good job clearing off the deck and other paved surfaces. Its only problem? The cord. It’s a pain maneuvering the cord around the pool fencing and trying to keep it from falling in the pool. This past fall, I tried using the blower to clear some grassy areas, but my extension cord was too short. My solution for next season: I’m going to upgrade to a gas-powered blower/vac, probably an Echo ES-230. I’m hoping it’ll work as well as the Toro, without the cord.

    The next big issue is removing the piles of leaves from the property. This year, I mulched and composted a bunch with the chipper/shredder, and put a bunch more out for yard waste pickup. I’m really looking for ways to make this process more efficient, because it’s long, hard work. The chipper/shredder has a ramp that you can lower to the ground so you can rake leaves right up into the unit. I used to use this, but this year I found it was faster to just grab a big armload of leaves and slowly drop it into the hopper. After a few tries, I got to where I could do this without clogging up the intake. Still, it takes a long time.

    A bigger chipper/shredder might help. I’ve seen yard sweepers (Agri-Fab seems to be a popular brand) selling for $200 or so at Sears and Lowe’s. Still, with the amount of leaves we get here, mulching them down is pretty much a necessity, or I’d be putting out hundreds of bags a year for pickup. I’d love it if Howard County would start a service where you rake all your leaves to the curb and they pick them up without you having to bag them. That would eliminate the need for mulching, but they’d probably use it as an excuse to jack up our property taxes again.

  • Do-it-yourself DVDs: If at first you don’t succeed…

    When I got my Powerbook, it came with software for creating/editing movies (iMovie) and burning them to DVD (iDVD). I already have a Sony MiniDV video camera, and several hours of footage of my now-3-year-old son. With this gear, all I needed to make DVDs was a FireWire cable and some blank DVD media. So I figured, what the heck, I’d give it a try. I bought a cable for $10 and a spindle of DVD-R media for $12 (after rebates, of course), and today I gave it a shot. It took two tries, but the end result was success.

    The first step was to copy the video onto the computer. This was straightforward: connect the camera, start iMovie, and tell it to import from the camera. I imported two 1-hour tapes, which took up around 26 gigs total (13 per tape). Then I used iMovie to add DVD chapter titles to the movie, and told it to create a project in iDVD.

    In iDVD, I was able to build menus for the soon-to-be DVD using several different built-in themes. It’s actually pretty cool. I went through this process, got everything looking good, and attempted to burn a disc. Nope. The project was too big for the single-layer disc I inserted; it asked for dual-layer media, which I don’t have. So instead, I created a new iDVD project with only half the footage from my imported video (one tape, or one hour’s worth). Then I went into iDVD’s Project menu and told it I was using single-layer media. That seemed to make it happy. I redid the menus and went to burn again.

    Dang, this takes a long time! The encoding process seems very CPU-intensive, and encoding the video is the most time-consuming part. After it does the video, it encodes the audio. This takes longer than you’d think from looking at the progress meter, but it eventually completes after 10 minutes or so. Then it goes to actually burn the disc.

    The disc seems to burn OK, but at the end I get some sort of happy-fun-ball encoding error. The resulting disc plays in the Mac, but my 1-year-old Sony DVD player refuses to recognize it. Bummer.

    I try to quit out of iDVD. It seems wonky; I have to Cmd-Q to quit it, and I get a “terminated unexpectedly” dialog. Now, the odd part: I start it back up, open my project, and this time it tells me the “project is too large for my encoding scheme” or some such. I wonder if that was the problem. If it was, why didn’t it tell me that in the first place? OK, so the software’s not perfect, I guess. I’ve got a nice shiny round coaster to show for it.

    Not one to be discouraged, I try again. This time, I change the encoding scheme to “maximize quality” (it was previously set to “maximize performance”), and go to burn again. One bit of weirdness this time: during encoding, the progress bar got to 100% when the encoding was only half done. That didn’t give me warm fuzzies, but I let it keep going anyhow. It finished this time, with no errors. Seemed to play OK on the Mac, too. Cool.

    Moment of truth: I popped it into my Sony again, and this time it worked! Great.

    Moral(s) of the story:

    1. Use the “maximize quality” setting.
    2. Ignore the progress meter during encoding.
    3. Keep videos to around one hour for single-layer media (this works well with tapes recorded in SP mode: 1 tape == 1 DVD).

    Just from looking at the disc, it seems to have used most of the available space on the media. The “maximize quality” setting must use minimal compression. I’ve got no problem with that; the media is cheap.

    Just for yuks, I’ll try it out in my 1997-vintage Toshiba 3006. I really don’t expect that it’ll play DVD-R media, but if it does, I’ll be really impressed.

  • Wiring’s done!

    Subject says it all! I finished the wiring up today, installed the fan control, and replaced an outlet while I was at it. All my extra wiring turned out to be worth the effort — there’s absolutely no way I would have gotten the fan control in the wall box with all the extra wires there. It’s enough of a challenge just getting these controls in the box with only one wire.

    Which brings me to my obligatory gripe of the day. These fan controls (Lutron Skylark model) are great. They seem well-made and reliable. But I hate installing them. They’re so deep that they barely fit in a standard-depth wall box. And on top of that, they have pigtails, so you have to fit three wirenuts (four if you’re attaching the ground) in the box, in addition to the control. This makes them very bad for retrofit work, particularly in older houses where the boxes tend to be smaller. If there’s more than one cable going into your wall box, you can pretty much forget it. It’d be much nicer if these controls could be backwired (stick wire in hole, tighten screw) to eliminate the need for wirenut splices. Maybe Lutron will eventually figure this out. Unfortunately, it’ll be too late to help me out.

    Anyhow, the only thing left now is to remount the fan and clean up all the plaster chunks, insulation and other crap that fell out of the hole in the ceiling. I’d say we can pretty much stick a fork in this project.

  • Quality time in the attic

    I spent the afternoon in the attic today, and got the lion’s share of the wiring done for the fan project. Last week I fished the wire from the basement to the attic. It was pretty straightforward. Some medium-duty nylon rope was the ticket. I dropped it down into the stud cavity from the attic, went into the basement, poked up a hooked piece of stiff wire, snagged the rope, and pulled it through. Then I used the rope to pull the romex up from the basement into the attic. The two keys to doing this successfully are:

    1. Electrical tape; and
    2. A helper.

    Just tape the romex to the end of the rope with plenty of electrical tape, go up to the attic, and have your helper feed the cable up from the basement while you pull it up. This can be done by yourself, but you’ll get lots of exercise running upstairs and downstairs to unkink the romex.

    The first job today was to get the old box and brace out of the ceiling to make room for the new fan-approved brace and box. Every time I do this, I’m reminded of how much I hate those metal ceiling box braces that nail to the underside of the joists. There’s no way to get them all the way out without tearing up the ceiling. Plus, the weight of the fixture tends to pull the nails loose over time, which is not good news for the ceiling, or for the person standing under the fixture when it eventually comes crashing down.

    The trick to getting these out is to cut them, removing the center part and leaving the ends nailed to the joists. I’ve found that the best tool for this is a Dremel rotary tool with a cutoff wheel. I’ve used a hacksaw, and it’s laborious (the bars are actually pretty thick metal) and the sawing action can damage the ceiling (and your knuckles). The Dremel is not perfect (if you breathe wrong on the cutoff wheels, they break), but believe me, it is far superior to sawing.

    This bar came out easier than others I’ve done. Once I cut one side, the other side just swung out of the way (because, of course, the nails had pulled loose).

    The actual wiring was involved, but straightforward. There were a lot of wires in the old box (it fed two different downstream branches). Rather than put everything back into the fixture box, I mounted a second junction box, wired everything up to that, and ran a single 12/3 cable to the fixture box carrying two switched hots (lights and fan) and a neutral. This makes for a neater job and lets me use a larger box for all of my splices.

    Just a couple parting tips for doing this kind of work:

    1. Invest in a pair of knee pads (or “kneelers”). Your knees will thank you for it.
    2. If your house has lots of BX wiring like mine, invest $25 or so for a good quality rotary BX cutter. It’s absolutely worth its weight in gold, which you’ll appreciate if you’ve ever tried to cut BX with a hacksaw.

    Almost done now, just need to wire up the fan control, route the wire in the basement, and remount the fan.

  • Fixing Daily Notes

    It turns out that Daily Notes, Day Events, and Holidays all get the same treatment from the CAPI export process, so I need to rewrite the iCalendar output for all of them. Instead of adding a DURATION to these events, I ended up just removing DTEND. Thus we end up with an event with a DTSTART but no DTEND, which iCalendar defines as an event that takes up no time. That’s pretty much accurate, except in the case of Day Events, which technically take up all day. Unfortunately, in Oracle Calendar, some people put entries in as Day Events when they really should be Daily Notes. For that reason, I’m not quite decided yet as to whether I should put a DURATION in for Day Events. In any case, I’ve fixed the problem, and everything shows up in PHP iCalendar. For now I’ll just leave DURATION out, until I change my mind.
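
    The fix itself boils down to a filter over the exported .ics stream. Here’s a minimal sketch of the idea, with plenty of simplifications: it assumes one property per line (no iCalendar line folding), and the CATEGORIES test for spotting these entries is a guess on my part; the real script decides this from the actual exported properties.

        #!/usr/bin/perl
        # Sketch: remove DTEND from Daily Note/Day Event/Holiday entries
        # so they become zero-duration events. Simplified: assumes one
        # property per line, and a hand-waved test for which events to hit.
        use strict;
        use warnings;

        my ($in_event, @event) = (0);
        while (my $line = <STDIN>) {
            if ($line =~ /^BEGIN:VEVENT/) {
                ($in_event, @event) = (1, $line);
            } elsif ($in_event) {
                push @event, $line;
                if ($line =~ /^END:VEVENT/) {
                    # Hypothetical test -- the real criteria come from the export.
                    my $is_note = grep { /^CATEGORIES:.*(DAILY NOTE|DAY EVENT|HOLIDAY)/i } @event;
                    @event = grep { !/^DTEND[:;]/ } @event if $is_note;
                    print @event;
                    $in_event = 0;
                }
            } else {
                print $line;    # pass non-VEVENT lines straight through
            }
        }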

    One thing I might consider is splitting the four Oracle Calendar categories (appointment, daily note, day event, holiday) into separate calendars, so I can differentiate the various events more easily in iCal and on the Palm. It seems like a good idea, but it will require some extra work.

  • First wrench in the works…

    Well, I found the first problem with my exported iCalendar data. In iCal, I turned off everything except my two exported Oracle Calendar views (one done via export/import, the other extracted and published), so I could compare the two. I noticed that my published calendar was not showing recurring events properly. Only one instance of each event was showing up.

    Now, I already knew that the CAPI is supposed to export multiple VEVENT records for recurring events, instead of adding RRULE attributes. But I didn’t expect this to be a problem, as I just need the stuff to show up, and I’m not worried about editing the exported data, adding new recurrences, etc. I’m doing all that kind of manipulation via the Oracle Calendar client.

    So, why am I only seeing one event? First thing I checked was the .ics file. Maybe the docs are wrong, and it’s exporting RRULEs after all, and I’m ignoring them? Nope… each recurrence does have a separate VEVENT in the iCalendar file. So why aren’t they all showing up? Because they all have the same UID.

    So, CAPI does export separate VEVENTs, but it doesn’t make each one a “real” separate event by assigning it a new UID. Kind of annoying. It’s one thing to cheap out and not support RRULE, but it’s another thing entirely when the result doesn’t comply with the iCalendar spec.

    Interestingly, the vCalendar export does assign unique IDs to the recurrences. Too bad they couldn’t do it with the iCalendar export. Looks like I’m going to have to do it myself. For my first stab, I’ll just build a hash of UIDs as I’m reading the iCalendar file, and if I find a duplicate, I’ll append an ascending number to the end. Hopefully that’ll work OK. It’s dependent on Oracle Calendar exporting its data in the same order each time. Dunno if it does or not. If not, it means that the recurring events’ UIDs may not be consistent across multiple exports. That may or may not present a problem with the Palm export. I guess time will tell.
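
    In sketch form, the idea looks something like this (simplified: it assumes each UID property fits on a single unfolded line):

        #!/usr/bin/perl
        # Sketch: uniquify duplicate UIDs by appending an ascending
        # counter. The second occurrence of a UID becomes UID-2, the
        # third UID-3, and so on.
        use strict;
        use warnings;

        my %seen;
        while (my $line = <STDIN>) {
            if ($line =~ /^(UID[^:]*:)(.*?)(\r?\n)$/) {
                my ($prop, $uid, $eol) = ($1, $2, $3);
                # Post-increment: first sighting is 0 (false), so the
                # first occurrence keeps its original UID.
                $line = "$prop$uid-$seen{$uid}$eol" if $seen{$uid}++;
            }
            print $line;
        }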

    Well, off I go to make this happen.

  • Getting closer

    Today I began work on a Perl script to massage the exported Oracle Calendar data before publishing. I exported a full 3-year date range (the same date range I’m currently loading into iCal via export/import), and began addressing some of the issues I noted in my previous post.

    • Times showing up incorrectly in PHP iCalendar: This was just a config thing. I hadn’t configured PHP iCalendar with my time zone, so it was defaulting to UTC. The iCal-exported stuff was showing up correctly because Apple adds explicit time zone info to the .ics file.
    • I did a few things to fix up the display of DESCRIPTION fields. First, I used MIME::QuotedPrint to strip out the ‘=20’s and other MIME artifacts that were lying around. Then, I stripped out blank lines to keep entries from getting truncated. I was originally un-escaping commas (by stripping out leading backslashes), but apparently they need to be escaped to comply with the iCalendar specification. iCal displays the commas correctly (without backslashes), but PHP iCalendar leaves the backslashes in. (There’s a rough sketch of this cleanup after the list.)
    • I added an Apple-specific field, X-WR-CALNAME. iCal uses this as the default calendar name to use when importing or subscribing to the file. Not totally necessary, but saves typing.
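
    Here’s a rough sketch of the DESCRIPTION cleanup and the X-WR-CALNAME addition mentioned above (simplified: it assumes each property sits on a single unfolded line, which the real script can’t get away with, and the calendar name is just a placeholder):

        #!/usr/bin/perl
        # Sketch: decode quoted-printable junk in DESCRIPTION, collapse
        # blank lines (encoded as literal "\n" in iCalendar), and slip
        # in an Apple-specific default calendar name.
        use strict;
        use warnings;
        use MIME::QuotedPrint qw(decode_qp);

        while (my $line = <STDIN>) {
            if ($line =~ /^DESCRIPTION/) {
                $line = decode_qp($line);        # '=20' and friends back to text
                $line =~ s/(?:\\n){2,}/\\n/g;    # blank lines truncate the display
            }
            print $line;
            # Placeholder name; iCal uses this when subscribing.
            print "X-WR-CALNAME:Oracle Calendar\n"
                if $line =~ /^BEGIN:VCALENDAR/;
        }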

    With those changes, everything shows up nicely now and the descriptions look good. The end product is a usable .ics file that I can subscribe to with iCal, and it includes everything I want except alarm and attendee data. I’ll tackle alarms first.

    Alarms: The vCalendar export includes alarm data. The strange thing is, I have no idea where the alarms in the vCalendar file are coming from. I’ve never defined any alarms within Oracle Calendar, yet somehow they’re showing up in the vCalendar export file. Certain events will show up with alarms in the vCalendar file, but when I go into Oracle Calendar and bring up one of those events, it says there’s no alarm defined. I almost wonder if the vCalendar export process is picking up other people’s alarms or something. Very strange.

    Looks like the iCalendar export gets this right. I went into Oracle Calendar and defined an alarm (Oracle Calendar actually calls them reminders), then re-exported the data in iCalendar format, and I got a VALARM section added to that event. I’ll have to see if it shows up properly in iCal. I’d like to be able to set alarms and have them show up on my Palm, as I’m prone to get sidetracked and forget meetings otherwise.
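
    For reference, the VALARM block that gets added looks something like this (the trigger value here is just illustrative):

        BEGIN:VALARM
        ACTION:DISPLAY
        DESCRIPTION:Reminder
        TRIGGER:-PT15M
        END:VALARM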

    OK, looks like iCal picks up the VALARMs properly. It just remains to be seen how they’ll show up on the Palm. It looks like I can set different alarm types (message, email, audio, etc.) using the ACTION attribute. As exported, they show up as ACTION:DISPLAY, which gets translated to ‘Message’ in iCal. At some point I’ll have to see how the different actions affect behavior in iCal and on the Palm. Then, if necessary, I can tweak the ACTION attribute with my Perl script.

    With that, I think this gives me all the functionality I was getting with the export/import process, so there’s no reason I can’t make this my “official” method now, and sync this data to my Palm. If that works OK, I’ll work on automating it and bringing in attendee data.

  • First attempt at publishing exported Oracle Calendar data

    Made a first stab at publishing exported output from Oracle Calendar tonight. I ran my CAPI program for a one-week time window, deleted the MIME headers at the beginning and end of the output, and slapped it up on my web server without any further mods. It actually sorta worked. Observations:

    • The iCalendar data appears to come through in DOS format (CRLF line endings), although neither iCal nor PHP iCalendar seems to have problems reading it. That actually makes sense: the iCalendar spec calls for CRLF line endings, so there’s no need to convert.
    • PHP iCalendar shows the times incorrectly. The exported data has the time in GMT (or “Zulu” time). iCal correctly converts things to EST, but PHP iCalendar does not. I probably need to add timezone data to the file.
    • One of my meetings has a long DESCRIPTION property, and the text is kind of screwed up. There are a lot of backslashed characters, a few ‘=20’s scattered around, and most importantly, the exported data has an extra carriage return thrown in. Both apps truncate the description at that spot.
    • Attendees are missing (which was expected), as are alarms. Hopefully, getting alarms is just a matter of adding the appropriate alarm properties to my list of properties to export. We’ll see.

    So it’s obvious that I’ll need to write some Perl code to massage the exported data a bit, but I already expected that. The important thing is, we’re definitely making progress here.

  • My Fun Day.

    Today was lotsa fun. It started out with the National Student Clearinghouse. I decided to get a “real” development instance going where I could connect to them as a student, demo it to Academic Services, etc. I ended up wrestling with their stupid referrer-based security scheme again. I took my existing clearinghouse script, which was working fine, and added Webauth authentication to it. The idea is to have the script verify the user’s Webauth credentials, then do an LDAP query to get the student ID, then pass that to the remote site. That way, my local script will have some authentication built in.

    Well, that broke it. On the initial authentication attempt, Webauth adds a query parameter called WebAuthExtAction (which the client is supposed to decode, and use the result to set a cookie). Great, but that changes the HTTP Referer header, which breaks the clearinghouse crap. Hey, but they changed their site so it actually tells you what’s going on now, rather than just booting you out. Have to at least give them props for that; it saved me some head-scratching.

    First attempt at fixing this: check for a WebAuthExtAction parameter, and if it exists, append it to the initial referrer string that I send them. Nope, that makes the referrer string too long, and the clearinghouse code can’t deal with it. Second attempt: look for the WebAuthExtAction parameter, and if it’s there, redirect the browser back to the same script, omitting the parameter (see the sketch below). Bloody convoluted, but it works. Fortunately, in production, we won’t have to deal with this, because the prod code will run from the same web server as the portal, and the user will always have valid creds when they come to the site. Aargh.
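
    The redirect fix boils down to something like this (a sketch; names and details are from memory, and the real script does the Webauth/LDAP business before it gets here):

        #!/usr/bin/perl
        # Sketch of the workaround: if Webauth tacked WebAuthExtAction
        # onto our URL, bounce the browser back to this same script
        # without it, so the Referer we present downstream stays clean.
        use strict;
        use warnings;
        use CGI;

        my $q = CGI->new;

        if (defined $q->param('WebAuthExtAction')) {
            print $q->redirect($q->url);    # URL minus the query string
            exit;
        }

        # ...normal clearinghouse handoff continues here...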

    Then there was fun with myUMBC itself. In an attempt to speed things up on the myUMBC web server, I decided to redo the Webauth ticket-logout script it was using, and make it part of the myUMBC app itself. That way, logouts go to the FastCGI processes, reducing overhead (the script needs to connect to the database, among other things) and hopefully speeding the machine up. This actually worked OK eventually, but of course, it broke things at first. Turns out I was short-circuiting the FastCGI loop without resetting certain global variables, which is a big no-no. But, that was good for a few choice expletives.

    When does Christmas break start again?

  • Attendee field definitely the culprit

    See subject. I checked the API documentation, and it has a complete list of iCalendar attributes that the server returns. So, in my downloader code I just listed out each attribute except ATTENDEE. With that list of attributes, it took about 1 minute to download a year’s worth of data. When I added ATTENDEE in, the download pretty much ground to a halt.

    At any rate, it looks like I want to leave ATTENDEE out when doing my bulk downloads. That’s a bit of a bummer, though, because it means I won’t get attendee lists for meetings and such, and it’d be useful to have those. So, what about this compromise: for a small window, say today through two weeks from today, I’ll export events with attendees. Then, for events outside that window, I’ll export events without attendees. I think that gives me the best balance between performance and function.
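
    In sketch form (export_range here is a hypothetical stand-in for the real CAPI downloader code):

        #!/usr/bin/perl
        # Sketch of the two-window compromise: attendees for the near
        # term, bare events for the bulk of the range.
        use strict;
        use warnings;
        use POSIX qw(strftime);

        # Hypothetical stand-in for the real CAPI download code.
        sub export_range {
            my ($start, $end, %opt) = @_;
            printf "would export %s .. %s (attendees: %s)\n",
                strftime('%Y-%m-%d', localtime $start),
                strftime('%Y-%m-%d', localtime $end),
                $opt{attendees} ? 'yes' : 'no';
        }

        my $now  = time;
        my $soon = $now + 14 * 24 * 60 * 60;     # two weeks out
        my $far  = $now + 365 * 24 * 60 * 60;    # rest of the bulk range

        export_range($now,  $soon, attendees => 1);   # small window, full detail
        export_range($soon, $far,  attendees => 0);   # everything else, fast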

    The only other possible issue I can think of is the UID field. Each event has a UID that uniquely identifies it. I’m hoping the UID stays consistent across multiple exports of the same data. I’m pretty sure the Palm sync stuff keys off the UID, so keeping it consistent should make the Palm sync process faster and more reliable. I did try exporting the same date range (all of 2005) twice, saving the results to files, and comparing them. The UIDs were the same both times, so that’s a good (although not conclusive) sign.

    Next up: I’ll massage the data as necessary and try turning it into an iCal subscription.