Towards Better Bookmark Syncing: del.icio.us and Diigo

I use Firefox (well, Iceweasel) from several machines. On a daily basis, at least three: my workstation at home, my workstation at work, and my laptop. I have wanted my bookmarks synced between all three of them for some time. I’ve been using Unison to sync them, which mostly works. But Firefox likes to store a last-visited timestamp in bookmarks.html, so if I have a browser open in more than one place, I get frequent Unison conflicts.

I started searching for better alternatives again, and noticed that the new alternative del.icio.us plugin for Firefox supports a del.icio.us version of the traditional Firefox Bookmarks Toolbar. I use that toolbar a lot, and anything I use in place of standard Firefox bookmarks absolutely must support something like it.

I imported my Firefox bookmarks (about 900) into del.icio.us. They arrived OK, but flattened, as del.icio.us doesn’t have a hierarchical structure like Firefox does. After a good deal of experimentation, I have mostly gotten it working how I want. I’m using the bundles mode of the extension’s toolbar in Firefox, and simulating subfolders with certain tags. It works well enough; not quite what I’d ideally want, but everything else is so much better that I’m happy with it.

The social bookmarking aspects of del.icio.us sound interesting, too, but I haven’t really looked at that side of things much yet. Delicious also has a new “Firefox 3” extension that is documented to work fine in Firefox 2. It has a few new features, but nothing I care all that much about.

My main gripe at this point is that the Firefox extension doesn’t allow me to set things as private by default. It also doesn’t propagate my changes to the site immediately, which led to a considerable amount of confusion initially. On the plus side, it does synchronize and store a local cache, so I can still use it offline to load up file:/// links.

Some things about del.icio.us bug me. There are very limited features for editing things in bulk (though Greasemonkey scripts help here). It has a published API, but it seems quite limited; I couldn’t find, for instance, how to add a tag to an existing bookmark anywhere in their documentation.
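For what it’s worth, the v1 API does appear to let you get the same effect by re-posting a URL, since, as I read the docs, posts/add replaces an existing entry for the same URL. Here is a rough Haskell sketch of that workaround using the http-conduit library; the endpoints, parameters, and replace-on-add behavior are my assumptions from the v1 documentation, and the XML parsing is stubbed out:

-- Rough sketch of the "add a tag" workaround, not a polished client.  The
-- endpoint names and parameters are my reading of the historical v1 API,
-- so treat the details as assumptions.
import Network.HTTP.Simple (httpLBS, parseRequest)
import Network.HTTP.Client (applyBasicAuth)
import qualified Data.ByteString.Char8 as B

apiBase :: String
apiBase = "https://api.del.icio.us/v1"

addTag :: String -> String -> String -> String -> IO ()
addTag user pass bookmarkUrl newTag = do
    let auth = applyBasicAuth (B.pack user) (B.pack pass)
    -- 1. Fetch the existing post for this URL (an XML document).
    getReq <- parseRequest (apiBase ++ "/posts/get?url=" ++ bookmarkUrl)
    existing <- httpLBS (auth getReq)
    let oldTags     = "placeholder+tags"  -- parse tag="..." out of 'existing'
        description = "placeholder"       -- parse description="..." likewise
    -- 2. posts/add on the same URL replaces the old entry, so re-posting the
    --    old tags plus one more has the effect of "adding a tag".
    addReq <- parseRequest (apiBase ++ "/posts/add?url=" ++ bookmarkUrl
                            ++ "&description=" ++ description
                            ++ "&tags=" ++ oldTags ++ "+" ++ newTag)
    _ <- httpLBS (auth addReq)
    return ()   -- proper URL-encoding of the parameters is omitted for brevity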

del.icio.us lets you export all your bookmarks, so you have freedom to leave. Also, if you poke around on freshmeat.net, you can find Free Software alternatives that actually emulate del.icio.us APIs and sites.

I also looked at alternatives, and it seems that the most plausible one is Diigo. But I’m going to refuse to use it right now for two reasons: 1) its Firefox plugin has nothing like the Firefox bookmarks toolbar, and 2) its hideous Terms of Service. If you go to their ToS and scroll down to “Content/Activity Prohibited”, you’ll see these gems:

6. provides any telephone numbers, street addresses, last names, URLs or email addresses;

7. promotes information that you know is false or misleading or promotes illegal activities or conduct that is abusive, threatening, obscene, defamatory or libelous;

11. furthers or promotes any criminal activity or enterprise or provides instructional information about illegal activities including, but not limited to making or buying illegal weapons, violating someone’s privacy, or providing or creating computer viruses;

So, in other words, they can delete my account if I bookmark the Amazon.com contact page, or if I bookmark the opinions of someone I disagree with. Good thing the Vietnam War protesters in the 70s didn’t use Diigo, because they’d be kicked off if they wrote about their sit-ins at Berkeley. Also, I didn’t even quote the other section that says they get to remove anything you post that they think is offensive, in their sole judgment. Goodbye, links to the EFF’s articles about the RIAA.

Since we can’t use last names, I guess it’s just “Hillary” and “John” instead of “Clinton” and “McCain”. Oh, and don’t get me started about the folly of operating a social bookmarking site where you aren’t allowed to post URLs. That’s right up there with Apple releasing a Windows version of Safari that you aren’t allowed to install on PCs.

Compare that to the del.icio.us terms and privacy policy and the contrast is stark indeed.

DjVu and the scourge of the PDF

A little while back, I wrote a blog post called DjVu: Almost Awesome, where I pointed out the strengths of the three formats in the DjVu family, but lamented the fact that there was no Free Software to create DjVu files in the most interesting format, DjVu Document.

Well, now there is: pdf2djvu is out and works, and it’s been ITP’d to Debian, too.
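If you want to give it a spin, the basic invocation is about as simple as it gets. Here’s a tiny sketch of driving it from a script; the --dpi and -o flags are my reading of the pdf2djvu manpage, so double-check them against your version:

-- Minimal sketch: shell out to pdf2djvu to convert a scanned PDF to DjVu.
-- The --dpi and -o flags are assumptions taken from pdf2djvu's documentation.
import System.Process (callProcess)

main :: IO ()
main = callProcess "pdf2djvu" ["--dpi=300", "-o", "scan.djvu", "scan.pdf"]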

As a very quick recap, DjVu is a family of raster image codecs that often creates files much smaller than PDFs, PNGs, TIFFs, etc. It has a ton of advanced features for things like partial downloads from websites. It’s pretty amazing that a raster format can create smaller files than PDFs, even at 300 or 600dpi output resolutions. Of course, for some ultra-high-end press work, PDF would still be needed, but DjVu is compelling for quite a few uses. Since it is a raster format, it is simpler to decode and, unlike PDF, is not subject to local system variations such as installed fonts.

Which brings me to the scourge of PDF. Recently we got a trouble ticket at work from someone saying there was a bug with our Linux environment because Linux users didn’t see the correct results when they opened his PDF file. A quick inspection with some of the xpdf utilities (pdffonts, to be specific) revealed that the correct fonts were not embedded in the file. The user didn’t believe me, and still wanted to blame Linux, saying that it worked fine on his PC with Acrobat. So I tried opening the file on a Windows 2003 terminal server, and it looked worse there than it did with any Free Linux viewer — really quite terribly corrupted. He still wasn’t entirely convinced, until he happened to try printing the file in question, and even Acrobat couldn’t print it right.

PDF was supposed to be a “read anywhere” format that produces exact results. But it hasn’t really lived up to that. Font embedding is one reason; the spec lists a handful of fonts that are allowed to not be embedded, but it is routine for some reason to violate that and fail to embed quite a few more. Then you have to deal with font substitution on the receiving end, which is inexact at best. Then you have all sorts of complex differences between versions, and it becomes quite the mess. (And don’t even get me started on broken PDF editors, such as the ones Adobe sells…) Somehow, quite a few people seem to have this idea built up in their heads that PDF is both an exact format, and an editable format, when really it is neither. (Last week, I was asked to convert a PDF file to a Word document. Argh.)

DjVu keeps looking more and more pleasant to my eyes.

Knuth and Reusable Code

In a recent interview with InformIT, Donald Knuth said:

I also must confess to a strong bias against the fashion for reusable code. To me, “re-editable code” is much, much better than an untouchable black box or toolkit. I could go on and on about this. If you’re totally convinced that reusable code is wonderful, I probably won’t be able to sway you anyway, but you’ll never convince me that reusable code isn’t mostly a menace.

I have tried in vain to locate any place where he talks about this topic at greater length. Does anyone have a link?

A Smart Gas Tax

The recent announcements by McCain and Clinton of their support for a temporary repeal of the Federal gas tax make me sick. More on why later, but first, I want to put forth my idea. I think both Republicans and Democrats would like it — as it’s based on market principles and achieves a reduction in costs to the average household, while simultaneously helping the environment and reducing our dependency on foreign oil. But of course, it’s courageous, and we don’t have many politicians of that type anymore.

What we need is a large, revenue-neutral, gas tax increase. Now, before people go nuts, let’s explore what this means.

Revenue-neutral means that it doesn’t result in a net increase of monies going to the government. The increase in the gas tax rate is offset by a decrease in the income tax, tied to the cost of direct and indirect taxable gasoline each family or business consumes. So on day 1, if your cost of filling up at the pump goes up by $10 in a week, and you are an average family, your total paychecks also go up by $10. Your cost for receiving a package might go up by $1, and your paycheck goes up by the same amount. So you’re no worse off than before — if you’re average.

Let’s look at the pros and cons of this sort of plan:

  • The economic incentive to be efficient consumers of gas is magnified. This will eventually lead to Americans having more money in their pockets, increasing market incentives for fuel efficiency, and a decreasing (or more slowly increasing) price of oil as demand slows.
  • Economic incentives to use mass transit, live close to urban centers, or drive fuel-efficient vehicles are magnified. Likewise, the economic incentives to invest in mass transit and efficient automobiles are also magnified.
  • As more efficient technologies come on the market, and Americans decide that they’d like to pad their bank accounts by hundreds or thousands of dollars a year, more sustainable and environmentally-friendly development patterns will emerge. Also, the price of oil will be kept low. Of course, people that choose not to change will, on average, be no worse off than before.
  • Alternative choices to the automobile will have a greater incentive to develop. Think the return of a fast national passenger and freight network, greater mass transit options, etc.
  • The marketplace will drive Detroit to love making fuel-efficient vehicles, because they will be the new profit centers.
  • This sort of thing is known to work well in other countries around the world.

If we think more long-term, we see even more positive effects:

  • The return to local agriculture and manufacturing. Because their transportation costs are lower, local farmers and manufacturers will be able to undercut Walmart’s prices, since Walmart’s much-vaunted national distribution network will carry relatively larger transportation costs. Unless, that is, Walmart starts buying local — which is a good thing too. This is a good thing for American jobs.
  • Keeping all that oil money in the domestic economy is a good thing for American jobs, too.
  • Our businesses will have a jump start on being competitive in the increasingly carbon-regulated global marketplace.

As for the cons:

  • Eventually this will lead to a net reduction in Federal revenues as efficiencies develop in the marketplace and people save money on gas. Corresponding budget cuts will be required. (A good thing, I figure)
  • Implementing this all at once would be a shock to some people living inefficiently now — those that are far above average. It would have to be implemented gradually to avoid being a shock to the economy.

Now, for the McCain/Clinton plan: it’s a farce. Reducing the gas tax means cheaper gas, which means more consumption of gas, which in turn leads to — yes — higher gas prices. Its real effect will be minimal, and it is terrible long-term policy. It charges tens of billions of dollars to the national credit card (which we, and our children, will have to repay) while achieving almost no benefit now. It’s a gimmick through and through, and something that says loud and clear that neither candidate is on track for the “Straight Talk Express”.

Update 4/29/2008: One potential solution for the problem of declining revenues over time is to periodically re-index the averages to mirror current usage. Assuming this really does lead to the expected drop in consumption, there is no sense, in 2020, in paying people for how much gas they would have used in 2008.

datapacker

Every so often, I come across some utility that I need. I think it must have been written before, but I can’t find it.

Today I needed a tool to take a set of files and split them up into directories in a size that will fit on DVDs. I wanted a tool that could either produce the minimum number of DVDs, or keep the files in order. I couldn’t find one. So I wrote datapacker.

datapacker is a tool to group files by size. It is perhaps most often used to fit a set of files onto the minimum number of CDs or DVDs.

datapacker is designed to group files such that they fill fixed-size containers (called “bins”) using the minimum number of containers. This is useful, for instance, if you want to archive a number of files to CD or DVD, and want to organize them such that you use the minimum possible number of CDs or DVDs.
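To give a flavor of the idea, here is an illustrative first-fit-decreasing sketch in Haskell. This is not datapacker’s actual code, and it is only one of the possible strategies (datapacker can also keep files in their original order); it just shows the general heuristic of sorting by size and dropping each file into the first bin that still has room:

-- Illustrative only: a plain first-fit-decreasing bin packer, the general
-- kind of heuristic a tool like datapacker can use.  Sort the files
-- largest-first, then drop each into the first bin with room, opening a
-- new bin when none fits.
import Data.List (sortBy)
import Data.Ord (Down (..), comparing)

type Bin = (Integer, [FilePath])   -- (remaining capacity, files in this bin)

packFFD :: Integer -> [(FilePath, Integer)] -> [[FilePath]]
packFFD capacity files = map (reverse . snd) (foldl place [] sorted)
  where
    sorted = sortBy (comparing (Down . snd)) files
    place [] (name, size) = [(capacity - size, [name])]     -- open a new bin
    place ((free, items) : rest) (name, size)
      | size <= free = (free - size, name : items) : rest   -- fits here
      | otherwise    = (free, items) : place rest (name, size)

-- packFFD 4700000000 [("a.iso", 3500000000), ("b.tar", 2000000000),
--                     ("c.tar", 1000000000)]
--   ==> [["a.iso","c.tar"],["b.tar"]]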

In many cases, datapacker executes almost instantaneously. Of particular note, the hardlink action can be used to effectively copy data into bins without having to actually copy the data at all.

datapacker is a tool in the traditional Unix style; it can be used in pipes and call other tools.

I have, of course, uploaded it to sid. But while it sits in NEW, you can download the source tarball (with debian/ directory) from the project homepage at http://software.complete.org/datapacker. I’ve also got an HTML version of the manpage online, so you can see all the cool features of datapacker. It works nicely with find, xargs, mkisofs, and any other Unixy pipe-friendly program.

Those of you that know me will not be surprised that I wrote datapacker in Haskell. For this project, I added a bin-packing module and support for parsing inputs like 1.5g to MissingH. So everyone else that needs to do that sort of thing can now use library functions for it.
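The size parsing is a small thing, but it’s nice to have as a library function. Here is a rough sketch of the idea, purely illustrative; the actual function names and unit conventions in MissingH may differ:

-- Illustrative sketch of parsing sizes like "1.5g" into bytes.  Whether a
-- suffix means powers of 1024 or of 1000 is a convention choice (1024 is
-- assumed here); MissingH's real parser may differ.
import Data.Char (isDigit, toLower)

parseSize :: String -> Maybe Integer
parseSize s =
    case span (\c -> isDigit c || c == '.') s of
      (num, suffix) | not (null num) ->
          fmap (\m -> round (read num * m :: Double))
               (multiplier (map toLower suffix))
      _ -> Nothing
  where
    multiplier ""  = Just 1
    multiplier "k" = Just 1024
    multiplier "m" = Just (1024 ^ 2)
    multiplier "g" = Just (1024 ^ 3)
    multiplier "t" = Just (1024 ^ 4)
    multiplier _   = Nothing

-- parseSize "1.5g"  ==> Just 1610612736
-- parseSize "700m"  ==> Just 734003200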

Update… I should have mentioned the really cool thing about this. After datapacker compiled and ran, I had only one mistake that was not caught by the Haskell compiler: I said < where I should have said <= in one place. This is one of the very nice things about Haskell: the language lends itself to compilers that can catch so much. It’s not that I’m a perfect programmer, just that my compiler is pretty crafty.

Backup Software

I think most people reading my blog would agree that backups are extremely important. So much important data is on computers these days: family photos, emails, financial records. So I take backups seriously.

A little while back, I purchased two identical 400GB external hard disks. One is kept at home, and the other at a safe deposit box in a bank in a different town. Every week or two, I swap drives, so that neither one ever becomes too dated. This process is relatively inexpensive (safe deposit boxes big enough to hold the drive go for $25/year), and works well.

I have been using rdiff-backup to make these backups for several years now. (Since at least 2004, when I submitted a patch to make it record all metadata on MacOS X). rdiff-backup is quite nice. It is designed for storage to a hard disk. It stores on the disk a current filesystem mirror along with some metadata files that include permissions information. History is achieved by storing compressed rdiff (rsync) deltas going backwards in time. So restoring “most recent” files is a simple copy plus application of metadata, and restoring older files means reversing history. rdiff-backup does both automatically.
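Conceptually, the reverse-delta scheme looks something like this toy sketch (an illustration of the idea only, not rdiff-backup’s actual on-disk format):

-- Toy illustration of reverse-delta history, not rdiff-backup's real format:
-- the current mirror is stored whole, and each increment is a patch that
-- rewinds a file one step into the past.  Restoring the version N steps back
-- means applying the N most recent increments to the mirror, newest first.
import qualified Data.ByteString as B

type Increment = B.ByteString -> B.ByteString   -- stand-in for an rdiff patch

restoreStepsBack :: Int -> B.ByteString -> [Increment] -> B.ByteString
restoreStepsBack n currentMirror increments =
    foldl (flip ($)) currentMirror (take n increments)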

This is a nice system and has served me well for quite some time. But it has its drawbacks. One is that you always have to have the current image, uncompressed, which uses up lots of space. Another is that you can’t encrypt these backups with something like gpg for storage on a potentially untrusted hosting service (say, rsync.net). Also, when your backup disk fills up, it takes forever to figure out what to delete, since rdiff-backup --list-increment-sizes must stat tens of thousands of files. So I went looking for alternatives.

The author of rdiff-backup actually wrote one, called duplicity. Duplicity works by, essentially, storing a full backup as a tarball along with its rdiff signature, then storing tarballs of rdiff deltas going forward in time. The reason rdiff-backup must keep the full mirror is that it must generate rdiff deltas “backwards”, which requires the full prior file to be available. Duplicity works around this.

However, the problem with duplicity is that if the full backup gets lost or corrupted, nothing newer than it can be restored. You must make new full backups periodically so that you can remove the old history. The other big problem with duplicity is that it doesn’t grok hard links at all. That makes it unsuitable for backing up /sbin, /bin, /usr, and my /home, in which I frequently use hard links for preparing CD images, linking DVCS branches, etc.

So I went off searching out other projects and thinking about the problem myself.

One potential solution is to simply store tarballs and rdiff deltas going forward. That would require performing an entire full backup every day, which probably isn’t a problem for me now, but I worry about the load that would place on my hard disks and the additional power it would consume to process all that data.

So what other projects are out there? Two caught my attention. The first is Box Backup. It is similar in concept to rdiff-backup. It has its own archive format, and otherwise operates on a similar principle to rdiff-backup. It stores the most recent data in its archive format, compressed, along with the signatures for it. Then it generates reverse deltas similar to rdiff-backup. It supports encryption out of the box, too. It sounded like a perfect solution. Then I realized it doesn’t store hard links, device entries, etc., and has a design flaw that causes it to miss some changes to config files in /etc on Gentoo. That’s a real bummer, because it sounded so nice otherwise. But I just can’t trust my system to a program where I have to be careful not to use certain OS features because they won’t be backed up right.

The other interesting one is dar, the Disk ARchive tool, described by its author as the great grandson of tar — and a pretty legitimate claim at that. Traditionally, if you are going to back up a Unix box, you have to choose between two not-quite-perfect options. You could use something like tar, which backs up all your permissions, special files, hard links, etc, but doesn’t support random access. So to extract just one file, tar will read through the 5GB before it in the archive. Or you could use zip, which doesn’t handle all the special stuff, but does support random access. Over the years, many backup systems have improved upon this in various ways. Bacula, for instance, is incredibly fast for tapes as it creates new tape “files” every so often and stores the precise tape location of each file in its database.

But none seem quite as nice as dar for disk backups. In addition to supporting all the special stuff out there, dar sports built-in compression and encryption. Unlike tar, compression is applied per-file, and encryption is applied per 10K block, which is really slick. This allows you to extract one file without having to decrypt and decompress the entire archive. dar also maintains a catalog which permits random access, has built-in support for splitting archives across removable media like CD-Rs, has a nice incremental backup feature, and sports a host of tools for tweaking archives — removing files from them, changing compression schemes, etc.

But dar does not use binary deltas. I thought this would be quite space-inefficient, so I decided I would put it to the test, against a real-world scenario that would probably be pretty much a worst case for it and a best case for rdiff-backup.

I track Debian sid and haven’t updated my home box in quite some time. I have over 1GB of .debs downloaded which represent updates. Many of these updates are going to touch tons of files in /usr, though often making small changes, or even none at all. Sounds like rdiff-backup heaven, right?

I ran rdiff-backup to a clean area before applying any updates, and used dar to create a full backup file of the same data. Then I ran apt-get upgrade, and made incrementals with both rdiff-backup and dar. Finally I ran apt-get dist-upgrade, and did the same thing. So I have three backups with each system.

Let’s look at how rdiff-backup did first.

According to rdiff-backup --list-increment-sizes, my /usr backup looks like this:

        Time                       Size        Cumulative size
-----------------------------------------------------------------------------
Sun Apr 13 18:37:56 2008         5.15 GB           5.15 GB   (current mirror)
Sun Apr 13 08:51:30 2008          405 MB           5.54 GB
Sun Apr 13 03:08:07 2008          471 MB           6.00 GB

So what we see here is that we’re using 5.15GB for the mirror of the current state of /usr. The delta between the old state of /usr and the state after apt-get upgrade was 471MB, and the delta representing dist-upgrade was 405MB, for total disk consumption of 6GB.

But if I run du -s over the /usr storage area in rdiff-backup, it says that 7.0GB was used. du -s --apparent-size shows 6.1GB. The difference is that the tens of thousands of files each waste some space at the end of their blocks, and that adds up to an entire gigabyte. rdiff-backup effectively consumed 7.0GB of space.

Now, for dar:

-rw-r--r-- 1 root root 2.3G Apr 12 22:47 usr-l00.1.dar
-rw-r--r-- 1 root root 826M Apr 13 11:34 usr-l01.1.dar
-rw-r--r-- 1 root root 411M Apr 13 19:05 usr-l02.1.dar

This was using bzip2 compression, and backed up the exact same files and data that rdiff-backup did. The initial mirror was 2.3GB, much smaller than the 5.1GB that rdiff-backup consumes. The apt-get upgrade differential was 826MB compared to the 471MB in rdiff-backup — not really a surprise. But the dist-upgrade differential — still a pathologically bad case for dar, but less so — was only 6MB larger than the 405MB rdiff-backup case. And the total actual disk consumption of dar was only 3.5GB — half the 7.0GB rdiff-backup claimed!

I still expect that, over an extended time, rdiff-backup could chip away at dar’s lead… or maybe not, if lots of small files change.

But this was a completely unexpected result. I am definitely going to give dar a closer look.

Also, before I started all this, I converted my external hard disk from ext3 to XFS because of ext3’s terrible performance with rdiff-backup.

Pennsylvania and Irrelevance

NPR has been doing an interesting series this week. They’ve sent out a reporter who is going all across Pennsylvania interviewing people at local food markets. He found a fish shop in Pittsburgh, a market in Lancaster, and some shops in Philadelphia. He sought out Democratic voters to ask them about their thoughts on Clinton vs. Obama.

A lot of the Pennsylvania voters were for Clinton. When asked why, most of them said that they liked Bill Clinton and his policies. A few said they liked how Hillary handled the Lewinsky affair. To me, none of that has anything to do with whether Clinton or Obama would be better for the country.

Then there was the person this morning who was criticizing Obama for not offering specifics. She said she is Jewish, and so Israel is important to her, and Obama hasn’t said anything about helping along the peace process. So I went to barackobama.com, clicked Enter the Site, went to Issues, Foreign Policy, then Israel. Then I clicked on the full fact sheet, which was a full 2 pages on Israel, including far more detail than the voter said she wanted.

I often wonder about these people that say Obama doesn’t have specifics. Just because each speech doesn’t read off a whole lot of information doesn’t mean that he doesn’t have it — it’s all there on the website. I’m sure people that don’t have Internet access could call the Obama campaign and get information, too. It seems Obama ought to do a better job of mentioning this fact at every possible opportunity.

Then I hear a lot of Clinton supporters saying that since Clinton has won states like Ohio in the primaries, she’d do better there in the general election. I think that is a totally specious argument. Just because Clinton did better with Democrats doesn’t mean that she’d do better in the general election. We can generally assume that the Democratic voters will vote for the Democratic nominee, whoever it is. The question is how many independents and Republicans a person can win over.

A Realistic View of the Economy

Yesterday, I read an article on CNN called From $70K to food bank.

It describes a woman who was laid off in February from a job paying $70,000 a year. “Weeks later”, with bills “piling up and in need of food for her family”, she went to a food bank.

The article proceeds to talk about the subprime lending situation at great length, which is largely irrelevant to this person’s situation.

Then we learn she applied for food stamps, but was denied. There’s a quote from this person about how frustrating that was, and general “tugging at the heartstrings” trying to make us feel sorry for this woman with two children whose mother moved in to help make the house payment. It seems to me that this is a correct decision; someone that can pay a $2500 mortgage each month ought to move into an apartment before trying to leech food or money from social service agencies.

And that’s where this story gets interesting.

She has an interest-only mortgage, and is managing to pay the $2500 bill each month.

If you’re not familiar with an interest-only mortgage, here’s how it works. The bank loans you money to buy your house — say, $200,000. This is a loan, and you have to pay interest on it each month, just like a regular mortgage. But with an interest-only mortgage, you never pay off the loan. You could be making monthly payments for 30 years and still owe $200,000. In general, the only ways to “pay off” this kind of loan are to sell your house or to get a conventional mortgage that pays off the interest-only loan.
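To put rough numbers on that (the interest rate here is made up purely for illustration):

-- Made-up numbers, just to illustrate the structure of the payment: an
-- interest-only payment is principal * rate / 12, and it never shrinks,
-- because nothing is ever applied to the principal.
monthlyInterestOnly :: Double -> Double -> Double
monthlyInterestOnly principal annualRate = principal * annualRate / 12

-- monthlyInterestOnly 200000 0.06  ==> 1000.0 per month, forever, with the
-- full $200,000 still owed at the end.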

Interest-only mortgages were largely banned after the Great Depression. Prior to that time, they were how mortgages normally worked. But there are several problems with them. One is that you have to pay on them forever, even after you retire. Another is that you can’t move unless you can sell your house for at least as much as the bank financed, even if you’ve lived there for 20 years. In times of declining housing prices and unemployment, that really stinks. People often default on the loans, and from a bank’s perspective, that really stinks, too.

Interest-only mortgages are usually used by banks to finance construction (we had one for a few months when we renovated our farmhouse) or other short-term projects, such as those of professional real-estate investors who buy old houses and fix them up to sell at a profit. Outside of cases like these, they should generally never be used for a primary residence. It’s not in the interest of the bank or the homeowner.

But since you never pay off the principal, the monthly payments can be lower. It seems likely that this woman took a knowing gamble, buying a home more expensive than she could afford, and somehow found a bank willing to finance this. Problem is, both she and the bank took a knowing risk. If she ever ran into financial difficulties, she’d have to sell the house quick. But now the house is probably worth less than the value of the mortgage, so selling it won’t remove the loan — BUT it would let her pay off a large part of the principal, reducing her monthly payments and giving her some wiggle-room to buy food and pay off the rest of it.

It seems to me that she is unwilling to own up to the calculated risk she took, and wants society to help bail her out. Don’t get me wrong; I think we need to help people that run into hard times. We need to help make sure they still have the tools they need to find a job and a place to stay. But bailing out people that take huge financial risks shouldn’t be the job of society. Let’s help them land softly, but not be enablers keeping them in a home they never could — and still can’t — afford. Fortunately, I don’t think anyone in government (or running for president) is suggesting we should.

Not only that, but her bank shouldn’t have ever made that loan. Banks should be held accountable to not sell unwise products to people that rely on them for their primary residences.

Here’s another interesting point: in just a few weeks, she had burned through her entire savings.

This, unfortunately, is a quite typical situation for many Americans. My financial planner, and I think most experts, suggest that everyone ought to have 6 months of income in liquid non-retirement assets (savings accounts, investments, etc.) in case something like a layoff happens. Very few Americans have this.

And when it comes down to it, isn’t that part of the problem? The economy thrives on consumer spending. Or, put more starkly, overconsumption. If people start saving like they ought to, and stop feeling like they’re outcasts just for not keeping up with the Joneses and buying every last gadget or the biggest house, we’d all be in better shape — but the economy wouldn’t have grown as much.

The growth it would have seen, had we all been more responsible, would have been a lot more durable and recession-proof, I think.

The Power of Love

A few years ago, Elvera Voth, a musician who grew up a few miles from here, was back in the area. Her specialty is vocal music, and one evening, she led a hymn sing at our church.

During the event, she talked about how much music can touch the heart. Elvera remembered that, many years ago, a woman in the church was leaving for a service trip to India. She would be gone for 7 years straight. None of her family or friends would be able to see her during that entire time.

The day she was to depart, friends, family, and church members went with her to the Santa Fe station in Newton, KS. While waiting for her train, at some point, the group started singing. Elvera remembered that they sang So nimm denn meine Hände (Take thou my hand, O Father) and Ich bete an die Macht der Liebe (O Power of Love).

Elvera remembered they sang in the station, and the high, wood ceilings made it sound like the music filled the whole building. I can’t think of a better goodbye than that.

Elvera remembered so many details about the event, but two things she didn’t remember were who was leaving and what year this happened. So I remembered this story for a while, but didn’t really follow up on it.

Then last December, our neighbor Hildred called. Hildred and her sister live on their old family farm about a mile from us. They’re some of the older members of our church, and I believe both of them have lived on that farm their entire lives. Hildred heard that I am gathering photos for a book about the centennial of our church, and she offered to bring some of them over. Knowing that it was cold and dark outside, the roads were snowy, and that Hildred drives a car at least 40 years old (because “Daddy said this is a good car”), I offered to drive to their place. “Oh no,” she said, “it’s no trouble. I like to get out. Besides, I haven’t seen your house since it’s been remodeled!” So she came over.

Hildred had stacks of amazing old photos from the church and the community. And she had a stack of photos and letters from India, where her aunt Augusta Schmidt was a nurse for 14 years. She was very proud of her aunt’s service to the needy there. I started to put things together in my head and asked her if she remembered singing at the train station when Augusta left for India. “Oh sure,” said Hildred, as if everyone knew about that.

So that’s how it happened that the “historical moment” on Feb. 10 was about Augusta Schmidt. Each month during church, leading up to our centennial in October, we have a brief time where we highlight some interesting story from the church’s past. I happened to mention this one at a historical committee meeting.

So, on that Sunday in February, someone got up and told everyone about Augusta’s life. She was born in 1894 and graduated from college with a nursing degree in 1927. She heard about India at a conference, and quickly felt that God wanted her to serve there. She left for India in the fall of 1927, and would serve two 7-year terms there.

She wrote that India was a beautiful land, contrary to things she had heard. The city where she worked (I believe it was Bombay, but I’m not positive) had hallmarks of a wealthy city, such as educational institutions, hotels, etc. However, it saddened her greatly to learn that 80% of the people in the city were homeless and slept on the street. No doubt this played a role in her dedication to service there.

After we learned about Augusta, the choir sang So nimm denn meine Hände — one of the songs that Augusta heard at the train station back in 1927. Imagine you were there, 81 years ago, seeing a friend off on a trip across continents, not to see her again for 7 years. Then the people there start singing a cappella:

So nimm denn meine Hände
und führe mich
Bis an mein selig Ende
und ewiglich!
Ich kann allein nicht gehen,
nicht einen Schritt;
Wo du wirst gehn und stehen,
da nimm mich mit.

Take thou my hand, O Father,
and lead thou me,
until my journey endeth
eternally.
Alone I will not wander
one single day.
Be thou my true companion
and with me stay.

You probably weren’t there that day in 1927, or even the day in February when the choir sang the song. I wasn’t either because I had the flu that day. But I borrowed the cassette recording of that day’s service, recorded using the best we have right now — the wrong type of microphone pointed the wrong way, onto a cassette tape that has certainly been reused way more times than anybody knows.

Click here to listen.

The choir sang the first verse in German, verse 2 in English, and the whole church joined in on verse 3. I’m told there weren’t many dry eyes in the church after that. After all, how could you keep a straight face singing “Take, then, my hand, O Father, and lead thou me, until my journey endeth eternally” right after the narrator read about Augusta’s retirement and death, saying, “there, I was surrounded by friends, but most of all, by the sovereign love of God who had been with me my entire life.”

Postscript

Remember Elvera Voth, from whom I first learned this story? In 1961, she moved to Alaska. Elvera taught at several universities; founded the Anchorage Opera; directed the Alaska Festival of Music, Anchorage Boys Choir, and Alaska Chamber Singers; and there is Elvera Voth Hall at the Alaska Center for the Performing Arts.

But her best work, I think, happened after she retired and moved back to Kansas in 1995. In 1998, Elvera founded the East Hill Singers, a choir composed mainly of minimum-security prison inmates, plus volunteers from the community. Elvera has inspired so many people, taught them that they have value, that they can succeed and make themselves better. One of them said:

Can you imagine what a standing ovation feels like after being told all your life that you are worthless?

And another inmate commented:

It made me feel like maybe I’m not just being punished. I mean, I am being punished for what I did. But being in this program made me think that I can also come out… well, better … a better person.

It all makes me think. What an amazing thing these two women with love in their hearts have done to make this planet a better place. Is it even possible to do that by using weapons that kill and power to frighten?

As Elvera puts it, “many of the men in prison will be back in the community soon. I’d rather have them as a neighbor with hope in their hearts than with hate in their eyes.”

At long last, software.complete.org migrated to Redmine

I’ve been writing a bit about Trac and Redmine lately. For approximately 1/3 of the publicly-available software that I’ve written, I maintain a Trac site at software.complete.org. This 1/3 is generally the third with the most interest from others, and each project gets a bug tracker, wiki, download area, etc.

Trac is nice, and much nicer than one of the *Forge systems for a setup of this scale. But it has long bugged me that Trac has no integration between projects. To see what open bugs are out there on my software, I have to check — yes — 17 individual bug trackers.

To keep watch on the wikis and make sure that nobody is adding spam, I have to subscribe to 17 different RSS feeds.

It took me some time just to hack up a way so I didn’t have to have 17 different accounts to log in to…

So, mainly, my use case for Trac isn’t what it was intended for.

Enter Redmine. It’s similar in concept to Trac — a lightweight project management system. But unlike Trac, Redmine allows you to have separate projects, but still manage them all as one if you please.

Redmine didn’t have Git support in its latest release, but there was a patch in Redmine’s BTS for it. I discussed with Redmine’s author why it wasn’t being applied, and then went in and fixed it up myself. (I used Git to make a branch off the Redmine SVN repo — very slick.) Unlike Trac’s Git support, Redmine’s is *fast*. I tested it against a clone of the Linux kernel repo on my local machine.

There are a few things about Redmine I don’t like, but I have learned that they mainly have to do with Ruby on Rails. As someone pointed out on Planet Debian lately (sorry, can’t find the link), the very nature of Rails makes it almost impossible for distributions like Debian to include Rails apps.

Not only that, but it seems like Rails assumes that even if you are just going to *use* an app, you know how to *write* one. For instance, this is pretty much the extent of documentation on how to set up a Rails app to be able to send out mail:


# See Rails::Configuration for more options

And of course, googling that turns up nothing useful.

Redmine is a Rails app, so it cannot escape some of this. It seems to be a solid piece of work, but Rails seems to make things unnecessarily complex. That, and I’ve found some bugs in the underlying Rails infrastructure (like ActiveRecord not quoting the schema name when talking to PostgreSQL) that make me nervous about the stack.

But the site is up and running well now, so I’m happy, and am planning to keep working with Redmine for quite some time.