Sunday, August 26, 2007

Laptop dead?

I was doing KDE programming on a 3-year-old Celeron 2.2GHz laptop late Thursday night. I finished and turned it off -- nothing out of the ordinary.

The next evening I tried turning it on. The "on" light lit up and the fan started spinning. All was good until 5 seconds later: it just shut off - the light went off and the fan stopped. A few seconds later it turned on again. It kept looping in this cycle.

If anyone has an idea of what this is or whether this can be fixed, I'd be very grateful. I have lots of uncommitted code sitting there so if the laptop's dead, I'll have to move the hard disk to another (less powerful) laptop, at least to get my code off.

Nothing is displayed on the screen so it's not even getting to the POST tests. There's no beeping whatsoever. And there's no smoke :)

I opened up the laptop today hoping to find something loose or abnormal but I can't see anything wrong with it.

This laptop has had a habit of getting up to 90-91 (ninety to ninety-one) degrees Celsius during compiles (before Linux shut it down) if I didn't prop it up to give it more ventilation, so maybe it finally fried?

Thursday, May 31, 2007

Offline GMail?

With today's announcement of Google Gears - "an open source browser extension that lets developers create web applications that can run offline" - I'd like to make a somewhat obvious prediction that GMail will feature a disconnected mode within, say, 2 years.

It looks like Google is seriously entering the desktop space.

Tuesday, May 01, 2007

Linux pids wrapping

I have this old "ps waux" textdump:


Thu Jun 2 22:55:56 EDT 2005 i686 i686 i386 GNU/Linux
23:29:59 up 9:35, 22 users, load average: 3.47, 1.13, 0.44
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
[...]
clarence 32522 0.0 0.7 4516 1616 pts/18 Ss 22:33 0:00 /bin/bash
clarence 32544 0.0 1.3 9100 2992 pts/18 T 22:34 0:01 vim test2.cc
clarence 729 0.9 6.0 36360 13516 ? S 23:27 0:01 konsole [kdeinit]
clarence 730 0.0 0.6 4516 1508 pts/19 Ss 23:27 0:00 /bin/bash


I still cannot figure out how I managed to do that. As far as I can see, I was only using OpenOffice.org Writer, viewing webpages and had compiled KDE4 earlier in the day. The machine had not even been up for 10 hours. (Note that konsole, at PID 729, was started after vim, at PID 32544 - so the kernel must have hit its PID ceiling, 32768 by default, and wrapped back around to the low numbers.)
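For the curious, here is a quick way to watch the wrap happen (a sketch I'd try, not what actually ran that day; 32768 is the usual default on 2.6-era kernels):

# The PID ceiling; PIDs wrap back to the low numbers once it's reached.
cat /proc/sys/kernel/pid_max

# Each subshell consumes one PID, so this walks the counter all the way around.
for i in $(seq 1 33000); do ( : ); done
sh -c 'echo "PID after wrapping: $$"'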

Mimetypes not working for you in KDE4?

If KDE4 mimetypes aren't working, type "kbuildsycoca4" instead of "kbuildsycoca". I believe this change allows for KDE3 and KDE4 apps to coexist. I thought I'd share this since it took me an embarrassingly long amount of time to figure it out :)

I also recall someone saying that the "shared-mime-info" package had to be a certain version. So I found a Fedora RPM for 0.17 and upgraded to that (I know that RPM is old but it was the only one I could find quickly). Not sure if it was required though.
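For reference, here are the two commands in question (the rpm query is Fedora-specific, of course):

# Rebuild KDE4's cache of mimetypes and services (KDE3 keeps its own via kbuildsycoca).
kbuildsycoca4

# Check which shared-mime-info version is installed.
rpm -q shared-mime-info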

Monday, April 23, 2007

Using a HP PSC 1315 as a scanner under Fedora Core 4

Update (2007-04-24): I've been told (see the comments) that the Linux support works out-of-the-box these days with newer distros. Also, I've got nothing against hplip - it's just that hpoj was the first one I got working with such an old distro. I ran into numerous difficulties setting up this scanner because of conflicting information on the internet and also instructions that just didn't seem to work for my setup.

The "PSC" in HP PSC 1315 stands for "Printer-Scanner-(photo)Copier", i.e. a Hewlett-Packard all-in-one. It's a USB colour inkjet and was dirt cheap considering all the functionality - less than $100, years ago. I do not recommend this hardware (see the bottom of this post).

Now I was interested in getting scanning to work under my old Fedora Core 4 system. The scanner supports 600 DPI and this is more than I need. After much googling and conflicting information, a search string of hp psc fedora got me to these working instructions. This uses the obsolete "hpoj" driver, instead of hplip's "hpaio" driver, but I don't mind since kooka and xsane both work now. Thanks very much to "zparihar" on linuxquestions.org for that useful post.
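Once the backend is set up, the standard SANE tools give a quick sanity check (the device name below is illustrative, not copied from my machine):

# Probe the USB bus for scanner-like devices.
sane-find-scanner

# List devices known to SANE; an "hpoj:..." entry should appear.
scanimage -L

# Grab a quick test scan to a PNM file.
scanimage --resolution 300 > test.pnm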

I can think of only 2 things that I have found harder to install under Linux: WPA support for an Atheros wireless card and an analog joystick. I won't even attempt to set up printing for the time being since I don't need that capability.

By the way, I initially found this HOWTO Install a USB Scanner guide on the Gentoo wiki, which looks like it could be useful for other scanner models. I gave up on it since it looked too complicated by the time it asked me to add a 03f0:3f11 line (printer USB ID given by lsusb) to /etc/hotplug/usb/libsane.usermap.
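For completeness, here's my reconstruction of the sort of line it wanted (treat it as illustrative and untested - the 0x0003 flag tells hotplug to match on the vendor and product IDs, and the remaining fields are left zeroed):

# /etc/hotplug/usb/libsane.usermap
libsane 0x0003 0x03f0 0x3f11 0x0000 0x0000 0x00 0x00 0x00 0x00 0x00 0x00 0x00000000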

I strongly recommend against buying this printer, given that Linux support was not out-of-the-box and that HP's driver for Windows is horrendous. Regarding the latter: sometimes print jobs go nowhere, and deleting them from the print queue does nothing, blocking all future print jobs. Unplugging the USB cable is another sure way to end up with undeletable print jobs, even after reattaching the cable. Deleting the stuck jobs requires some combination of rebooting, reinstalling the printer under a different USB port etc. (I still haven't found a reliable recipe). To make things worse, the "HP Director" application would quit silently after the IE7 upgrade, and after applying a patch from HP, it now starts but the buttons don't work.

Wednesday, April 11, 2007

Sydney meeting

As Seb Ruiz writes, we met up with Brad Hards last night in the city. See Seb's blog for photos.

Some interesting KDE trivia that was shared around:

* Kopete is apparently pronounced cop-it-tay, not co-pete

* Krita can be read as kri-ta instead of krit-a

Thursday, March 08, 2007

Going to the ACM programming contest in Tokyo

Update: We came equal 44th out of 88. While it's generally invalid to draw certain comparisons in such competitions (since you can jump a large number of rankings by debugging just one extra problem), I'll draw one anyway: we equalled CMU and beat Harvard :) However, pretty much all of the other unis I listed below beat us easily. The University of Auckland is a deserving South Pacific Champion with a Bronze Medal.

Overall, I'm happy that we made it this far and I'm happy with this year's performance against an elite set of world-class competitors. On the non-technical side of things, we had a great time in Tokyo and at the Tokyo DisneySea theme park, paid for by IBM. It is on this high note, and after nearly 10 years of contests, that I'm retiring from programming competitions.



I've been a bit quiet these last few months as I've been practising for this contest. Two teams are representing Australia this year - one from the University of Adelaide (the defending South Pacific Champion) and the other, the University of New South Wales (one of the members being yours truly).

It will be interesting to see how we'll fare against the Chinese, Russians and MIT, CMU, Harvard, Waterloo etc. considering that their teams were presumably picked from a multi-tiered qualification process (and reportedly, the Chinese education system also has high schools - not just universities - dedicated to programming!), while the qualification process for the South Pacific consists of a single competition :) In any case, we hope to improve on the performance of recent South Pacific teams.

Anyway, after all this craziness / intense competition is over, I hope to finally, finally, finally get back to opensource programming (although, consistently, every year, I seem to find a different distraction). So in a sense, I wish the competition was over more quickly :)

Sayoonara!

Monday, December 25, 2006

Another script: revision controlled OpenOffice.org documents

I mentioned that I used OpenOffice.org for my thesis.

However, a problem is that binary files, like .ODT, don't work well with revision control systems.

Firstly, they aren't diff-able, so you also have to keep a .TXT file in the repository in sync with the .ODT. This is error-prone, and the .TXT diffs don't reflect formatting changes.

Secondly, because OpenOffice.org files are compressed (.ZIP format), the binary deltas that revision control systems (e.g. SVN) use to save space fall apart for even the tiniest of changes to documents: a small edit to the underlying XML can change most of the compressed byte stream after the edit point.
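You can see the effect with gzip, which uses the same deflate compression as the ZIP format (a quick illustrative experiment, nothing to do with my actual thesis repository):

# Compress a file, change one character in the middle, and compress again.
# (-n omits the embedded name/timestamp so only the real data differs.)
seq 1 20000 > orig.txt
gzip -nc orig.txt > orig.gz
sed 's/^9999$/9990/' orig.txt | gzip -nc > edited.gz

# Count differing bytes: roughly everything after the edit point has changed.
cmp -l orig.gz edited.gz | wc -l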

But thanks to the second anonymous commenter on that blog post, I came up with a new way of revision-controlling OpenOffice.org documents:

What I now do is keep the unzipped OpenOffice.org document in revision control, not the binary .ODT file. This means that I can diff the content.xml between revisions, not bother keeping a .TXT file in sync and avoid SVN's space-inefficient binary deltas since I'm really only keeping text files in the repository (more on this later).

If you checkout such an unzipped OpenOffice.org document from SVN, you can use my magic Makefile to reconstruct the .ODT from these unzipped contents by typing make. It's just like adding water to milk powder.

And every time you make a change to the .ODT file, the same make command works the other way and updates the unzipped contents to match the changed .ODT. Then you can svn commit a new revision.
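In case that scripts link ever goes stale, here's roughly what the two directions boil down to (the zip/unzip essence, not the Makefile verbatim - note that the OpenDocument format wants the mimetype file to be the first archive entry, stored uncompressed):

# Changed .ODT -> refresh the unzipped contents:
unzip -o kolourpaint-developer-guide.odt -d .

# Unzipped contents -> rebuild the .ODT:
rm -f kolourpaint-developer-guide.odt
zip -X0 kolourpaint-developer-guide.odt mimetype
zip -rX kolourpaint-developer-guide.odt . -x mimetype -x '.svn/*' -x '*.odt'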

Caveats:

* Warning: svn status will not report changes to the .ODT until you type make. Be careful, or you might think that your checkout has no local changes and decide to delete it to "save space"! A foolproof way around this problem is to skip the svn propedit svn:ignore . [and type kolourpaint-developer-guide.odt] step.

* It actually does store a binary file in revision control, namely Thumbnails/thumbnail.png. I haven't dared try to work around this though, for fear that OpenOffice.org won't like me playing with it.

* Some files, such as layout-cache and settings.xml, while not binary (unlike Thumbnails/thumbnail.png), change on every save and probably shouldn't be revision controlled either.

* You can't have 2 people working on the same document, as merging two content.xml files is asking for trouble. But at least you can diff between revisions.

* A lot of lines in content.xml may change in response to even a small layout change, due to lots of automatically numbered tags being renumbered.

If you try this scheme, please let me know how it goes.

BTW, in the end, my thesis turned out to be 192 pages (probably about 100 pages too long :)). It was on porting the L4 microkernel to processors without virtual memory, specifically the Blackfin processor. It was written in a rush, so apologies for the awful number of spelling and grammatical errors!

A few scripts that could be useful

I just uploaded some scripts I've had lying around for months. Hope they're useful:

1. kbus: starts the DBUS server since it's not done automatically (there's a sketch of the idea at the end of this post). Sure beats typing eval `dbus-launch --auto-syntax` and copying environment variables all the time.

Update: It looks like DBUS has an auto-starting feature now, via --ensure, so this script is no longer needed.

2. svnupfast: an "svn up" that takes half the bandwidth.

3. svn-import-changes: a poor person's distributed revision control system.

Full details and "sort of" docs here.
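For the curious, the essence of kbus is only a few lines (a sketch, not the script verbatim; it would have to be sourced, not executed, for the variables to reach your interactive shell):

# Usage: . kbus
# Start a DBUS session bus if one isn't already running, and
# export DBUS_SESSION_BUS_ADDRESS / DBUS_SESSION_BUS_PID into this shell.
if [ -z "$DBUS_SESSION_BUS_ADDRESS" ]; then
    eval `dbus-launch --auto-syntax`
    echo "DBUS session bus at $DBUS_SESSION_BUS_ADDRESS"
fi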

Re: We’ve got a couple of feet in our sights

Regarding a few things Boudewijn Rempt wrote about the look of KDE:

1. I actually thought Keramik was stunning during the early builds of KDE 3.1 - it was the perfect balance between good eye candy and not being so noticeable that it got in the way. Unfortunately, I believe it changed in the final KDE 3.1 (please correct me if I'm wrong) and the gradients became too striking.

I liked it in combination with KDE 3.1's bluish icon set, which was sadly replaced in KDE 3.2.

I still use Keramik with the KDE 2 colour scheme and I like it better than Plastik (which is still a very good theme).

2. The window colour changing on focus switches in KDE 4 is a feature of the style, I believe. It is indeed really slow on my old graphics cards as well, and for a while I thought it was a palette bug, until I realised that it was supposed to be highlighting the focused window.

I'm sure this will be fixed eventually but I'd still expect KDE 4 to be a tad slower than KDE 3 because of Qt4's automatic double buffering. But I think this is a reasonable price to pay for less flicker.

On a side note, for those of you who are wondering, that automatic double buffering is done per top-level window, rather than per widget (I looked inside the Qt code for this and it's actually quite clever).