The Haskell Blog Tutorial

The first installment of Mark C. Chu-Carroll’s Haskell tutorial series went up last week.

It begins this way:

Before diving in and starting to explain Haskell, I thought it would be good to take a moment and answer the most important question before we start:

Why should you want to learn Haskell?

It’s always surprised me how many people don’t ask questions like that.

Farther down:

So what makes Haskell so wonderful? Or, to ask the question in a slightly better way: what is so great about the pure functional programming model as exemplified by Haskell?

The answer is simple: glue.

Languages like Haskell have absolutely amazing support for modular development.

An interesting and thought-provoking article, even for someone who's been using Haskell for more than 2 years now. (Yikes, I had no idea it had been that long.)

You can also see all his posts on Haskell, which include a couple more installments.

Weather in Kansas

Here’s what it’s been doing:

2 days ago: high of 69
yesterday: daytime high of 23 plus “breezy” winds at 30MPH
today: forecasted high of 26, 3-7 inches of snow, winds at 25MPH, gusting to 35MPH

Yesterday, I drove to the doctor’s office and back to work, a drive of about 15 minutes. During that time, I encountered:

  • calm
  • thunder and lightning
  • freezing rain bad enough that my windshield almost froze over in less than 5 minutes
  • calm again
  • sleet

I’ve heard people from all over the country say, “if you don’t like our weather, wait 5 minutes.”

In Kansas, we really mean it.

Renovation: Weeks 19-22

It’s been a few weeks since my last house update.

I’ve posted some new photos of the project.

The new insulation has been sprayed. It’s a foam type of insulation that dries solid. It not only acts as an insulator, but also as a sealant, filling in little cracks that could let air in and out of the house. In some places it is up to 12 inches thick — mostly in the attic.

After that, work on the drywall began. By this point, it’s pretty much done and interior painting should be starting this week. After the painting, the next thing will be the floor restoration.

Concrete work started last week. The porches are being rebuilt now, and in the photo below, you can see the forms being set in place for them.

So the appearance of the interior has changed dramatically over the past few weeks, mainly because there are now walls there.

Here’s the usual sample photo:

hpodder to be multithreaded… done right.

I’ll be hacking on my hpodder program this weekend. hpodder is a full-featured podcast aggregator that runs on the command line, and has many features over other command-line podcatchers like bashpodder, and even over GUI tools like iPodder.

I originally envisioned hpodder as something that I'd cron up and run in the background. But I have tended to run it in the foreground more than in the background. Some others have too, and one requested hpodder feature is parallel downloads.

So I am working on that. I already have code working, in fact, that will parallelize both the hpodder update (downloading the feeds) and the hpodder download (downloading the actual episodes) commands.

Unlike ipodder, my code will make sure that no more than 1 thread will ever be downloading from a given server at a given time. ipodder had the terribly annoying habit of pointing all of its threads at a single server, thus pounding it while also providing little benefit for someone with a pipe fatter than the server’s.
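One straightforward way to enforce a one-thread-per-server rule in Haskell is to keep an on-demand lock per hostname. Here's a sketch of the idea; the names are mine, not hpodder's actual code:

```haskell
import Control.Concurrent.MVar
import qualified Data.Map as Map

-- One lock per server hostname, created on demand.
type ServerLocks = MVar (Map.Map String (MVar ()))

newServerLocks :: IO ServerLocks
newServerLocks = newMVar Map.empty

-- Look up (or create) the lock for a given host.
lockFor :: ServerLocks -> String -> IO (MVar ())
lockFor locks host = modifyMVar locks $ \m ->
    case Map.lookup host m of
        Just l  -> return (m, l)
        Nothing -> do l <- newMVar ()
                      return (Map.insert host l m, l)

-- Run a download action while holding the host's lock, so at most
-- one thread is ever talking to a given server at a time.
withServer :: ServerLocks -> String -> IO a -> IO a
withServer locks host action = do
    l <- lockFor locks host
    withMVar l (const action)
```

Any number of worker threads can call `withServer` concurrently; downloads from different hosts proceed in parallel, while downloads from the same host serialize on its `MVar`.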

Before all this multithreaded stuff could be written, I needed to write my own status bar code instead of just letting curl display its own status bar. (That wouldn’t work when there are 5 curls running at once)

I decided that I would write some generic status bar code, rather than something specific to hpodder. I took the apt-get status bar as an example, and whipped one up in Haskell and added it to my MissingH Haskell library.

But a status bar just begged for another feature: a generalized progress tracker. Something that could keep track of where a task (and its sub-tasks) are, calculate ETA, estimated time remaining, speed, etc. So I wrote that and made the status bar use it.
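At its core, the remaining-time estimate in a tracker like that is simple arithmetic: assuming you track elapsed time and work completed, the ETA falls out of the observed speed. A sketch (not the actual MissingH code):

```haskell
-- Estimate seconds remaining from elapsed seconds and work done.
-- Observed speed is done/elapsed, so remaining work at that speed
-- takes elapsed * (total - done) / done seconds.
etaSecs :: Integer -> Integer -> Integer -> Integer
etaSecs elapsed done total
    | done <= 0 = 0    -- no data yet; avoid dividing by zero
    | otherwise = elapsed * (total - done) `div` done
```

For example, 25 of 100 units done after 10 seconds gives an estimate of 30 seconds remaining.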

AND, a status bar begged for a generalized numeric formatter: something that could render 512 as 512, 2048 as 2K, 1048576 as 1M, etc. So I wrote that, and it’s general enough that it can render into both SI and “binary” units by default (and others that users may want).
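In binary units, that rendering boils down to repeated division by 1024, walking up a list of suffixes. Here's a simplified sketch of the idea; MissingH's real version is more general (SI vs. binary, user-supplied units):

```haskell
-- Render an integer with binary (1024-based) unit suffixes:
-- 512 -> "512", 2048 -> "2K", 1048576 -> "1M".
-- For SI units, divide by 1000 instead.
renderBinary :: Integer -> String
renderBinary n = go n ["", "K", "M", "G", "T"]
  where
    go x [u] = show x ++ u          -- out of suffixes; stop here
    go x (u:us)
        | x < 1024  = show x ++ u
        | otherwise = go (x `div` 1024) us
    go x []  = show x               -- unreachable, kept for totality
```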

Finally, I wrote a function to take a number of seconds and render it in something friendly like 23m5s like apt uses, and shoved that in MissingH as well.
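That conversion is just repeated `divMod`, dropping the zero-valued larger units. A minimal sketch (the function in MissingH may differ in name and signature):

```haskell
-- Render a duration in seconds as e.g. "23m5s" or "1h2m3s".
renderSecs :: Integer -> String
renderSecs n = concat [part d "d", part h "h", part m "m", show s ++ "s"]
  where
    (d, r1) = n  `divMod` 86400
    (h, r2) = r1 `divMod` 3600
    (m, s)  = r2 `divMod` 60
    part 0 _ = ""                -- omit zero-valued larger units
    part x u = show x ++ u
```

So 1385 seconds comes out as "23m5s", just like apt shows it.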

So now hpodder will have a status bar, and any other Haskell program can use the same status bar code in minutes because it’s all generic. Or if someone just needs to render a number in megabytes, they can do that.

I really enjoy it when a program needs a solution that is generic enough to put in a larger library. I try to put as much of my Haskell code in MissingH as I can, so as to make it useful to others (and my other programs).

Desktop Linux: NFS or something else?

Recently, I asked for opinions on desktop Linux. Thanks very much to those that replied. I’ve set up an old laptop as an experiment. I’m using Debian, Gnome, and Systemimager. It’s been an interesting project (especially getting SystemImager and a splash screen program to do what I want).

I’d like for my desktop machines to mount /home over the network. I could use NFS, but of course that has all the well-known security risks. Is there a better network filesystem that is easy to use, fast, and more secure than NFS?

Desktop Linux Opinions?

I’m brainstorming about ways of setting up Linux desktop machines for Windows users on a LAN. It could be any size of LAN.

I’d like people to be able to sit down at any Linux machine on the LAN and log in — probably using an LDAP directory for that, plus NFS-mounted home directories. I wouldn’t want to NFS-mount the entire system for performance reasons.

So, some of the things I’m thinking about are:

  • Desktop environment: KDE or Gnome? Which would give Windows users all the tools they’d want? Which would they feel most at home with? I’m thinking it’s KDE, but Gnome has a more polished “feel” to it.
  • Image management. How could the desktops be updated? Just rsync everything except fstab over? Can we actually have a single system image? Is XOrg powerful enough to just recognize hardware at boot and Do The Right Thing? Can we build a unified initrd somehow?
  • Distribution. Debian, Ubuntu, Kubuntu? Do the Ubuntus bring anything to the table, if we take as a given that an experienced Debian admin is managing all this?
  • Laptops. What do we do about the home directories there? Some sort of automated rsync thingy?
  • Installation. FAI? Or some homegrown thing that just boots up, partitions, and runs rsync?

Renovation: Week 18

Well, it’s been a while since I’ve posted a renovation update. Things have been going on, but not much that is all that impressive in photos.

We’ve had electrical work going on. The new water and propane lines have been trenched in. New ductwork is in progress. New electrical and phone service will be trenched in this week or next.

In the past week, the new siding has started going up, and the sheetrock is supposed to start going up on Monday.

So there is some quite visible change going on now! Click here for the pictures.

Managing Software

Recently I mentioned that I hate releasing software. It’s true, and I’ve decided that the first part of fixing it is to tackle the presentation of software to the world.

My current scheme of darcs.complete.org for repositories plus bare directories on my gopher (yes, that gopher) site leaves a lot to be desired. There is no bug tracker, there are few screenshots, there is no consistency. It is also not easy to empower others to work on them directly.

At the same time, I am the sole or primary contributor to most of them. These are not huge kernel-sized projects. These are smaller, bite-size projects. So I don’t want or need a lot of overhead. I’ve been thinking about my options.

  • I could just use sourceforge.net. I poked around there today, and all the advertising there is a real eyesore. Plus I figure that if anyone is getting paid for all my hard work, why should it be some random people that no longer write free software? On the other hand, it would be an easy way for my projects to gain visibility. Or I could use alioth, and give up both the advertisements and the larger visibility. But I don’t like giving up control over my site’s appearance, or being beholden to others for backups, uptime, etc.
  • I could use trac. It’s nice, and is the only option that supports darcs, and has a very cool wiki integrated into everything (even parsing out keywords from changelogs). On the other hand, downloads are — at best — attachments to wiki pages. There is no download manager. And you have to set up a separate trac instance for each project. That is a non-starter for me. If I can’t see all my bug reports at one place, the bug tracker will be too annoying to use.
  • I could use gforge or savane. These are the sourceforge forks. Neither seems to be as resource-hungry as I expected, and debs are available for both. I could just install them locally and use them for my projects, though that seems like overkill. Plus, like SourceForge and Alioth, they have a crappy web-only bug tracker. I’d rather use something like RT that works by email. (Though RT is too resource-intensive to run on my server). However, web-only is better than nothing so I could hold my nose and use it.
  • I could write my own. But I’d rather not, if there’s already something workable out there.

Is anyone else thinking about this? What are your thoughts?