Thursday 22 November 2007

Installing Postgresql 8.2 on 7.10 server

Installing Postgresql 8.2 was tricky. The database did not start automatically. The directory /etc/postgresql was not created by the installation scripts of the package.

In a previous version of this blog I complained that Ubuntu developers would ignore the problem.

Martin Pitt pointed out that the problem had something to do with my locale settings. It turns out that the installation script checks very stringently for correct locale settings and simply terminates the installation process if they are wrong.

The locale command showed a correct list of locale settings.

Martin suggested using the following command:
sudo locale-gen de_AT.UTF-8 en_US.UTF-8
On my test machine this changed nothing obvious. Only when Martin explained why he insisted on the locales (they set the character set and collation sequence in tables) did I follow his advice and reset the locales.

Reinstalling PostgreSQL 8.2 finally worked; the database started automatically.

Kudos to Martin Pitt for having the patience to put up with me. Martin, I am sorry.

To set up your database server you first need to:
  • edit postgresql.conf configuration file and set
    listen_addresses = '*'
  • edit pg_hba.conf host based authentication configuration, add
    hostssl all all 0.0.0.0 0.0.0.0 md5
    to the end of the file to allow all clients to connect to the database
  • restart the database server with
    /etc/init.d/postgresql-8.2 restart
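Once the server is back up, a quick way to check that remote access works is to connect from another machine with the psql client (host, user and database names here are placeholders):
psql -h serverip -U username -d mydb
If the pg_hba.conf entry above is in effect, you will be prompted for the md5 password.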
I installed pgadmin3 on my notebook to access and successfully configure the database. I created a table using webmin for testing purposes. Finally I tried to connect to my database using MonoDevelop and reconfigured the table to my requirements.

Friday 16 November 2007

7.10 -> HP 8510w

Installing 7.10 on a HP 8510w (T7700, 2.4 GHz, 120 GB SATA HD, 2 GB 600MHz RAM, nVidia FX560M) works fine.

To boot the LiveCD of the 64bit edition you need to set SATA Native mode = Disabled in the BIOS. After the basic installation, I set SATA Native mode = Enabled again.

Migrating settings onto the machine worked as described for the HP 8510p.
  • Copying Thunderbird settings worked fine (with the exception of Lightning, which requires a 64bit version). The additional address books require special treatment
  • Migrating keyrings is as simple as copying the .gnupg directory
  • Transferring Firefox settings worked by copying the user profile to the target machine

Some things I did not get to work yet:
  • SD cards are not recognized (they are in my old nx8220)
  • Sleep mode sometimes does not resume. The fans run at maximum, but the screen remains black. This happens roughly every fifth suspend and you have to reboot the machine.
All other things work fine:
  • The screen resolution of 1920x1200 was recognized and the screen adjusted correctly
  • Touchpad and the little blue nub in the middle of the keyboard work fine
  • Even though the screen resolution is larger than that of the 8510p, overall the 8510w feels faster and snappier than the 8510p.
A note on the notebook itself: the keyboard feels slightly different from the 8510p I used before or the nx8220. I have to type harder or some letters get swallowed. The key spacing also feels slightly different; I sometimes miss the "i" and get an "o" instead.
Other than that, it's a pretty machine.

Saturday 3 November 2007

Installing Ubuntu 7.10 on a Sony Vaio Z600NE

Haven't used my old Sony Vaio Z600NE for a while (Windows XP was prohibitively slow). The Sony Vaio Z600NE shipped with Windows 98 in 2000. Sony does not support any OS after W98 on this machine, but W2K and Windows XP worked fine.

The machine came with a CD ROM drive that plugs into the PCMCIA slot. The USB 1.1 connectors don't support booting from USB devices. There is a 10 Mbit Ethernet connector on board.

Booting LiveCD

First I tried to boot the LiveCD. The machine hung without giving any indication about the cause. I tried several boot parameters and settings:
  • set keyboard to German / left default
  • set graphics resolution to 1024x768 / left VGA
  • set kernel parameters: noquiet, nosplash, irqpoll, irqfixup, acpi=off, pci=off, clocksource=hpet, clocksource=pit (in any combination)
The LiveCD would not boot: after the graphical or text-based loading of the kernel image (vmlinuz) the screen blanked and the installation halted (I even tried a full night in case Linux required a longer timeout period).

Booting the Alternate CD

Booting the alternate CD worked fine (I first tried several boot parameters, with the same results as with the LiveCD: the CD would not boot). The alternate CD boots into a text-based installer.

I entered username, password, keyboard layout and location. From there the installation went without any interruptions. Installation took pretty long (approx. 50 minutes): after detecting the hardware, a default set of software packages was copied and then, in a third round, installed.

After installation, a reboot was required.

My Sony did not boot up. Turning the machine off completely did not change the situation.

Removing the CD ROM from the PCMCIA slot and rebooting finally worked. The machine shows the GRUB boot loader prompt. It prints ACPI capabilities on the console. Then the splash screen shows up and with one hesitation it boots up quite fast.

Adding Netgear WG511T Wireless adapter

After the first reboot, software updates were available. In order to get higher bandwidth, I inserted a WG511T wireless adapter from Netgear. The adapter was recognized immediately and configured correctly. I had to enter key information and then could access my network at 108 Mbits/sec.

Software update after this was fast and reliable.

Performance

As a general office machine, the Z600NE is sufficient. Accessing the internet or reading e-mails works fine. Even blogging this article works fine.

However, I tried to install TuxKart and PlanetPenguin Racer. Both programs crawl along, so I uninstalled them. Other than this, I am happy having revived a long-lasting companion.

Final steps

Sometimes the Z600 refuses to reboot. Kernel boot options were set to quiet splash. After changing this in /boot/grub/menu.lst to
quiet nosplash noapic clocksource=acpi_pm
the machine rebooted reliably.
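For reference, the resulting kernel line in menu.lst looks roughly like this (kernel version and root device are placeholders for whatever your installation uses):
kernel /boot/vmlinuz-2.6.22-14-generic root=/dev/hda1 ro quiet nosplash noapic clocksource=acpi_pm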

Wednesday 31 October 2007

I'm not reading mags

I stopped reading magazines years ago for several reasons:
  1. they distract from the issue
  2. their net ratio of information transferred is bad
  3. cost per page of information is beyond acceptable ranges
Let me elaborate on this.

Magazines usually cover just one or two subjects that interest you or that are helpful in your current situation. The rest is either unsolicited input or, worse still, advertising. I don't need advertising. Who wants to learn about the n-th version of a charting package that can be embedded into your source code or the umpteenth version management tool with pink colour coding of source files ending with the letter x? Take away cover, index and imprint, and the information-per-page ratio lies somewhere around 3%.

Even if an article covers an issue right in your focus, usually the issue is large enough to be broken into several articles. Publishers do not want articles to exceed two pages, and they want to sell the next issue as well. So the article consists of 30% intro and repetition, 60% information and 10% links to supportive web pages, half of them not online, otherwise you would have found them in previous web searches. So the information-per-space ratio lies around 60%.

Your average magazine costs about € 10,- for around 60 pages. This makes 17 cent / page.

Compared to a book with 1.000 pages (e.g. O'Reilly JavaScript Reference, € 46,99) the price of a magazine is about 3,5 times as high.

All that said, I found a magazine and an aspiring magazine that are worth reading:

dotnetpro is a magazine that concentrates on .NET software development. Much to my surprise it also covers Mono (in those much-hated multi-issued never-ending article-threads). On average, I can read 4 to 8 articles in it that interest me and the quality of code is more than acceptable.

pythonmagazine is an effort to get a Python developer magazine up and running. Issue 1 can be downloaded in PDF and offers an interesting blend of Python issues.

I am not going to change my mind: I read books, not magazines.

But I will keep an eye on the two.

Monday 29 October 2007

Upgrading 7.04 server -> 7.10

I upgraded my server from 7.04 to 7.10. It went through straightforward.

Prerequisites
  • package update-manager-core
If you have not installed it, you can do so:
sudo apt-get install update-manager-core
Installation

Start upgrade with:
sudo do-release-upgrade
I did this over an ssh session. The installation process warns you that you might lose the connection during the upgrade and gives you the option to open a secondary ssh session on port 9004.
ssh server -D serverip:9005 -l username
will do the trick.

My upgrade went smoothly. Connection did not fail and after about 30 minutes I had 7.10 installed.

Post installation procedures

VMware Server did not start automatically. I had to install the Linux headers manually; the upgrade deleted the old ones without adding new ones. This was due to my having installed Server 1.0.4 from the vmware.com site, so this is not an error in the package upgrade mechanism.
sudo apt-get install linux-headers-generic
Then recompiling all modules:
/usr/bin/vmware-config.pl
and
/usr/bin/vmware-config-mui.pl
There is an error in the way the installation procedure sets up access rights to the /var/run/vmware/ directory: httpd.vmware cannot create a httpd directory to store temporary data. I added the following lines to /etc/init.d/httpd.vmware (after the ### END INIT BLOCK):
RUNDIR="/var/run/vmware/httpd"
OWNER="www-data"
GROUP="www-data"

# create the run directory if it is missing and hand it over to the web server user
/usr/bin/test -d "$RUNDIR" || \
/bin/mkdir -p "$RUNDIR" && /bin/chown "$OWNER:$GROUP" "$RUNDIR"
This changes the ownership to www-data, the process's user and group ID. Now the httpd.vmware process that handles the web administrative interface can be started.

Tuesday 23 October 2007

Enabling compiz on ATI Mobility Radeon X600

I enjoyed compiz on my ATI graphics card during the 7.10 beta. Then xserver-xorg-video-ati got updated and the screen effects stopped working.

Many complaints and bugs were filed, but no update to reactivate the feature.

To fix this, I opened /usr/bin/compiz (a wrapper script to start compiz) and edited the card blacklist.

In my case I searched for my card using
$ lspci -nn | grep ATI
This gave me 1002:3150 as the card ID.

Delete this string from the blacklist (the variable T in the script).
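In my copy of the wrapper the blacklisted PCI IDs sit in a plain, space-separated shell variable, roughly like this (the neighbouring IDs are only examples):
T="1002:5954 1002:3150 1002:7249"
Removing 1002:3150 from that list is all that is needed.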

Activating compiz now works.

Sunday 21 October 2007

11 most useful console tools

Here are the most essential command line tools:

> filename
Redirect output of commands into a text file, so that you can attach it to emails.
$ uname -a > mypc.log
man command
This gives a brief online description of the command. man always requires a parameter.
$ man uname
sudo command
Run a command with full access rights. You will be asked to enter your password again. The command then executes.
$ sudo chown root:root /etc/init.d/networking
Attention: You turn off Ubuntu's protection features, so double check what you enter!

uname
Provides general system information. Usually called like:
$ uname -a
less filename
If you need to view configuration- or logfiles, less allows you to view them on the console. less always requires a parameter.
$ less /etc/X11/xorg.conf
tail (-f) filename
To view just the most recent part of logfiles, tail shows the last 10 lines of the file. The parameter -f will show new lines as they come in.
$ tail -f /var/log/syslog
lspci
Lists all devices attached to the PCI bus. You can analyze hardware problems with this.
$ lspci
lsusb
If you have trouble with USB devices, this lists all attached USB devices.
$ lsusb
lsmod
Lists kernel modules and their dependencies. You may want to find out, if kernel modules are loaded at all and which modules depend on others.
$ lsmod
dmesg
dmesg prints the kernel message buffer, so you can still read what was printed during boot even when you are in X.
$ dmesg
sysctl kernel.parameter
If you ever need to display or set any kernel parameter, you can use this.
$ sysctl kernel.hostname
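To change a value rather than display it, the same tool takes -w and needs root privileges (the hostname below is just an example):
$ sudo sysctl -w kernel.hostname=myserver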
To start a console window, from the main menu select Applications -> Accessories -> Terminal. At the prompt ($) you can enter your command. The output will be shown in the same window.

Saturday 20 October 2007

8 steps into Ubuntu

You are new to Ubuntu? Here are 8 steps to remember in order to enjoy working with it.

1. Ubuntu is not Windows

If you come from a Microsoft Windows environment, you might expect everything to work the same.
Drop this expectation!
If you don't, you will always compare Windows with Ubuntu and will not benefit from the differences between the two.

2. Different GUI

The first and most obvious difference is the graphical user interface: the way information is displayed on the screen and the way you interact with the machine.

With Ubuntu you can choose your favourite Edition:
  • Ubuntu (easy to use Gnome based user interface)
  • Kubuntu (highly customizable KDE user interface)
  • Xubuntu (optimized for old and slow machines)
  • Gobuntu (no restricted software included)
To get the most out of Ubuntu, start experimenting. Things follow a certain logic that is concise and clear.

3. Different CDs

Ubuntu comes in different packages:
  • LiveCD (Ubuntu, Kubuntu, Xubuntu) lets you start Ubuntu from CD, play and experiment with it or install a standard set of applications on your PC
  • AlternateCD is for installing special hardware or configuration requirements
  • Server CD is required to install out-of-the-box server systems
If you are new to Ubuntu, you should be fine with the LiveCD. You can download any of the above from here.

4. You cannot destroy anything

Ubuntu protects essential parts of the computer system from being changed in a way that the system stops working. Every change that is considered critical is cross checked and authorized by you by entering your password again.

Ubuntu cleans up after all changes, so there are no leftovers from previous changes.

You may lose some personal adjustments, but the system generally protects you from rendering it inoperable.

5. One CD covers all

Ubuntu comes with everything on board.
  • Email -> Use Thunderbird (my favorite) or Evolution
  • Internet -> Firefox
  • Office productivity -> OpenOffice is a complete suite of applications that allow you to write documents, calculate spread sheets, present slide shows, access databases and some things more
  • Entertainment -> Sound Juicer, Media Player, Rhythmbox, Serpentine CD Creator and others come out of the box
There are more applications in the software repository that comes with Ubuntu. You can install, test and deinstall any of them without the risk of degraded performance or instability of your system.

6. Under the hood

Every 6 months a new version is released. You can upgrade to these new versions automatically and at no extra cost. You choose the time and authorize individual updates at your own leisure.

Security issues are fixed and shipped to you during these periods. To update your system you can choose between fully automated or individually authorized procedures. There is no hidden updating of system components.
You get what you ask for. Nothing less, nothing more.
7. Yes, there is a console window

Most things can be done from the graphical user interface. Some actions are more efficient or only possible using a special command window: the Console.

Don't panic. You will use the console only on rare occasions where you want to query your machine about specific details usually required to fix problems.

See my blog entry about the most useful commands.

8. Help where help is due

There are zillions of books on Windows available. There are only a few books on Ubuntu out.

Information is conveyed over the Internet.
  • A good starting point would be Ubuntu's homepage
  • Special search engines cover Ubuntu topics
  • There are User groups around the world. (If you happen to be Austrian, here is a good choice)
  • You can get professional support from Canonical or RSB at a reasonable fee.
Good luck!

Chess does not show 3D board

In Gnome, when you switch Chess (glchess) into 3D mode, you get an error dialog:
Unable to enable 3D mode

Your system does not have the required software to enable 3D mode. Please contact your system administrator and ask them to install the OpenGL Python bindings and the GtkGLExt Python bindings.

You are still able to play chess in 2D without these packages.

[Close]
In order to activate 3D you need to install the following packages (a one-line install command follows the list):
  • python-gtkglext1
  • libgtkglext1
  • python-opengl
  • python-setuptools (not required in 8.04)
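All four can be pulled in with a single command:
sudo apt-get install python-gtkglext1 libgtkglext1 python-opengl python-setuptools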
I had trouble getting 3D to work on an ATI X2600 graphics card. With the nVidia card, everything works fine.

Monday 15 October 2007

VMWare Workstation & Server console

VMware Workstation prerequisites:
  • You need the linux headers installed
  • also install the build-essential package (this has the C compiler and make utility required for building the vmware modules); see the install command below
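On 7.10 both prerequisites can be installed in one go (the headers package matching the running kernel is picked via uname -r):
sudo apt-get install linux-headers-$(uname -r) build-essential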
Installing VMware Workstation is straightforward:
  • Download the installation package from VMware's webpage.
  • Unpack the zip-file in a directory
  • sudo ./vmware-install.pl and answer the questions about directory location
  • confirm that you want to run the vmware-config.pl script as well
  • add bridged, NATed and host network adapters. I chose the ones suggested
You will get a working installation with icons available under Applications -> System Tools. Start the Workstation and enter a license. Voilà, you are ready to go.

There is also a VMware Player installed which is free of charge and does not require a license.

If you run into trouble, the most likely cause is that you do not have the latest VMware build. On several occasions I found that there was a newer build on the web. Incidents included:
  • Could not compile vmnet and vmmon utilities (I tried the vmware_any_any_update113 patch and it worked partly but not on every machine, so I could not recommend it)
  • The installation routine finds the wrong network adapter. This is the case when you work with a notebook connected to your LAN via a wireless adapter. The script suggests eth0 as the default bridge. This is no good if you use eth1 or wlan0 to connect to your LAN.
  • Wrong set of kernel module drivers. I tried to install the kernel module drivers that come with Ubuntu. They never worked for me.
To fix problems, run sudo /usr/bin/vmware-config.pl and recompile the kernel modules. You can keep the network settings by skipping network discovery.

VMware Server console:
  • Same prerequisites as with VMware Workstation.
  • Decompress the vmware_server_linux_client archive. Take the vmware_server_console archive and decompress it in a directory
  • in the vmware-server-console-distrib directory run sudo ./vmware-install.pl
  • again answer the questions. You can change the port over which the client communicates with the server, but this has to be changed on the server as well. I recommend you don't change it.
  • The installation script asks whether you want to run the configuration, which you want to do.
After the installation, you are left without any menu entry. I manually added a launch icon under Applications -> System Tools and chose the icon that comes in the /usr/lib/vmware-server-console/share/pixmaps directory.

Most likely errors are:
  • Your server does not run the console service (default on port 902)
  • You haven't installed the prerequisite components
  • You do not run the latest vmware build (in 7.10 only 1.0.4 build 56528 worked on my machines)
To fix problems, run sudo /usr/bin/vmware-config-server-console.pl and recompile the kernel modules if required.

During the latest updates I got VMware kernel components from the Ubuntu repository. They save you from having to recompile the kernel modules after each kernel update.

Sunday 14 October 2007

Screen resolution on HP 8510p is wrong

I run the Gnome 2.20 desktop on the machine. Ubuntu 7.10 recognized a screen resolution of 1400x1050. This produces a blurred image (as the native resolution is 1680x1050).

Here is what my xorg.conf says:
Section "Device"
Identifier "ATI Technologies Inc ATI Default Card"
Driver "vesa"
BusID "PCI:1:0:0"
EndSection

Section "Monitor"
Identifier "Generic Monitor"
Option "DPMS"
EndSection

Section "Screen"
Identifier "Default Screen"
Device "ATI Technologies Inc ATI Default Card"
Monitor "Generic Monitor"
DefaultDepth 24
SubSection "Display"
Modes "1680x1050"
EndSubSection
EndSection
Under System -> Administration -> Screens and Graphics -> Screen I have selected:
  • LCD Panel 1680x1050 (wide screen)
  • Resolution = 1400x1050 (cannot be changed), 60Hz
  • Default Screen selected
On the Graphics tab:
  • Graphics Card VESA driver (generic)
  • Video memory = automatic
Under System -> Preferences -> Screen Resolution:
  • Resolution = 1400x1050
  • Refresh Rate = 60Hz
  • Rotation = (disabled) Normal
  • Make default for this computer only = not selected
Steps to fix the screen resolution
  1. Changing xorg.conf. I tried several hints offered on the internet:
    a. changed the graphics driver to "ati" in xorg.conf Section "Device". Restarting the X server gives a distorted logon screen (horizontal sync does not work)
    b. added
    Screen 0
    Option "MergedFB" "off"
    to the Section "Device". Still the same effect.
    c. added
    Vendorname "Generic LCD Display"
    Modelname "LCD Panel 1680x1050"
    Horizsync 31.5-65.5
    Vertrefresh 56.0 - 65.0
    modeline "1680x1050@60" 147.14 1680 1784 1968 2256 1050 1051 1054 1087 -hsync +vsync
    and removed
    Option "DPMS"
    to and from Section "Monitor". Still the same pattern after restarting gdm.
  2. Then I tried setting the screen resolution in Gnome:
    a. from the recovery dialog at the start of Gnome I set 1680x1050. Gnome starts with 1400x1050 resolution.
    b. tried several other ways but no resolution other than 1400x1050 and a vesa driver.
  3. Tried to install xserver-xorg-video-radeonhd package. I could not find the correct driver in the control panel and the driver was not selected during startup of the X server.
Installed ATI proprietary drivers

Next I tried the ATI proprietary drivers. I followed the installation instructions:
  • Download the driver from the web site
  • chmod +x to make the driver file executable
  • ran the file as sudo user
  • answered the following dialogs by selecting a default installation
  • The final dialog informs you that you have to run aticonfig --initial to create a suitable xorg.conf configuration file and reboot the machine (the full sequence is sketched below)
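On my machine the whole sequence boiled down to roughly this (the installer file name is only an example; it depends on the driver version you downloaded):
chmod +x ati-driver-installer-8.42.3-x86.x86_64.run
sudo ./ati-driver-installer-8.42.3-x86.x86_64.run
sudo aticonfig --initial
sudo reboot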
If you do not follow the procedure exactly as described, you still run into the situation described above. If you reboot (hey, are we back to Windows?) you are welcomed with the correct screen resolution.

OK, now it works

Having a suitable screen resolution of 1680x1050 I tried to enable desktop appearance enhancements (compiz). As described in the Readme of the ATI installer, compiz is not supported.

Further, the graphical performance of the driver leaves a lot to be desired. I have never seen anything that sluggish. I'm left with the choice of a screen with a readable resolution on which I do not even want to move windows, or a decently fast GUI with an unreadable resolution.

The driver also prevents the machine from entering sleep mode (suspend to RAM). The machine stays in a semi-suspended state, consuming full power but with a blank display. To wake up the machine you have to force a reboot, as the driver never wakes up again.

Conclusion

If you have the choice of not getting an ATI card, then do so. NVidia seems the far better choice: it works out of the box and has support for visual enhancements.

7.04 -> Server

My server is an assembly of individual components that had a high price/performance ratio at the time of purchase. You can always get cheaper or better equipment if you wait, but eventually you have to start. This is what I started with.

Installing the hardware

Right after the first boot we ran into several issues:
  • CPU frequency would only operate at 2,2 GHz
  • Memory access was half the frequency than what was advertised
I installed a BIOS update downloaded from the EliteGroup website; this solved the issue of the CPU speed. The motherboard documentation mentioned nothing about specific banks to install memory in; it turned out, though, that leaving a gap between the two memory banks provided the desired improvement in access time.

Installing Ubuntu 7.04

Booting with the LiveCD did not work. Sometimes the CD would boot and perform as expected, but this was not reproducible.
  1. At first, after the initial loading dialog, the screen blanked. This could be resolved by selecting a display resolution using F4.
  2. Next, the boot process would continue providing the well known splash screen. After a short period the boot process terminated and dropped out into a recovery console (a special ash).
  3. Quick research on the kernel homepage did not help.
  4. I found an online copy of O'Reilly's Kernel in a Nutshell. Two parameters, irqpoll and (the more resource-saving) irqfixup, solved the problem of hangups during the boot process
Currently I boot the machine with irqfixup, as this performs significantly faster than irqpoll (which polls all IRQs and rebuilds an IRQ table inside the kernel; irqfixup does this on the fly where needed).

LiveCD installation worked fine from there on.

As this was intended to be a server, a Gnome front end was not required. The LiveCD does not allow you to install the logical volume manager LVM or RAID functionality, so I reinstalled from the Alternate 64bit CD.

Installing from the Alternative CD

With irqfixup in place, installation went without interruption. Some issues that required consideration:
  • LVM2 requires you to define physical volumes, volume groups and logical volumes. While the installation is straightforward, it helped me to read through the RedHat LVM webpages to gain a better understanding and build confidence in the software (I later ran into problems that I cover in a separate blog entry). Another source of information is the LVM HOWTOs. Other than those two, most web pages were junk.
  • Ubuntu's native web-based administration tool was too limiting; only a few services are supported. I installed webmin instead. Webmin comes as a Debian package with a clear, concise installation script (see below) and supports a full range of services (including a nice LVM administration).
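For the record, installing the package is a matter of (the version number is a placeholder):
sudo dpkg -i webmin_1.380_all.deb
sudo apt-get -f install
The second command pulls in any missing dependencies.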
After the basic installation I added servers as required:
  • For file services I installed Samba. Configuration was straightforward. Adjusting security required a bit of fiddling, as the modifications recommended on the Ubuntu wiki page were simply not correct. I followed the original Samba documentation and got usable results.
  • For print services I installed an HP 4500DN on the server. The printer driver that comes with the Alternate CD is different from the one installed from the LiveCD: it allows for duplex printing and 3 trays, and even PCL works fine. I copied the PPD file to all workstations and get better results and faster print times.
  • The DHCP server installed fine. Defining scopes was straightforward; help from the ISC homepage was available and useful (a minimal scope definition is sketched after this list).
  • DNS server required some tweaking. I use the DNS server internally and also service several domains externally. Not so straight forward was the adjustment to allow dynamic DNS updates only internally. I will cover this in a separate blog entry.
  • OpenSSH requires no separate configuration
  • VMWare Server and web administration will be covered in a separate blog entry.
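For reference, a minimal DHCP scope definition of the kind I used looks roughly like this in /etc/dhcp3/dhcpd.conf (all addresses are placeholders for your own network):
subnet 192.168.1.0 netmask 255.255.255.0 {
  range 192.168.1.100 192.168.1.200;
  option routers 192.168.1.1;
  option domain-name-servers 192.168.1.1;
}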
The server runs stably. LVM survived some tweaking and several power outages, just as expected.

7.10 -> HP 8510p

Started to install 7.10 Tribe 5 on my HP 8510p from a downloaded LiveCD.
  • Bootup worked to the selection screen
  • Default installation dropped out into BusyBox, an ash recovery console
  • Tried the kernel parameters irqpoll and irqfixup (both were successful on my server). No success
  • Tried the kernel parameters noquiet and nosplash. This showed that the harddisk was not recognised. The 8510p has a 120GB SATA disk which was running in native mode (BIOS setting)
  • Changed BIOS SATA to Disable native mode -> 7.10 booted ok from LiveCD
  • Installed on the local harddisk. Boot worked fine
  • Changed SATA setting in the BIOS to enable native mode, booted fine still
As 7.10 rc came out that very evening, I did a complete reinstall (as there was not much customizing done yet).

7.10 rc did not hiccup with SATA native mode enabled and installed without any flaws. I ran the customizing from there.

Saturday 13 October 2007

Migrating Thunderbird nx8220 -> 8510p

I use Thunderbird as my Mail and Calendar client. Main reason for this is that Thunderbird has an excellent spam filter built in. My current settings are:
  • Thunderbird 2.0.0.6
  • Additional address books
  • IMAP and POP connections
  • Extended German dictionary
  • Dictionary switcher 1.1.2
  • Enigmail 0.95.3
  • Lightning 0.5 (2MB version that supports calendar publishing)
  • QuoteColors 0.2.8
  • QuoteCollapse 0.7
  • Signature Switch 1.5.4
Target machine is the HP 8510p. Here are the results:
  • Thunderbird 2.0.0.6 works fine.
    I had trouble maintaining the date format (US format shown, EU format required).
    I installed language-pack-de.
    In .profile I added a line to export LC_TIME=de_AT.utf8. This did the trick
  • Address books had to be exported on the source machine and re-imported on the target machine. Only the default address book (abook.mab) can be copied
  • Extended German dictionary went in ok.
  • Dictionary switcher 1.1.2 works fine. Due to installed language pack I see all English and all German settings. This can be awkward but works
  • Enigmail 0.95.0 was installed from the repository (originally I installed a separate xpi). Adding the keys for communication required me to copy the folder .gnupg onto the target machine.
  • Lightning 0.5 could not be installed (32 bit version). Even the 64 bit version crashes the system. Installing the Ubuntu package lightning-extension 0.5-0ubuntu4 works. I copied storage.db to get all calendar settings and data.
  • QuoteColors 0.2.8 works fine
  • QuoteCollapse 0.7 works fine
  • Signature Switch 1.4.2 upgraded to 1.5.4 without problem. Added the signatures which I kept in separate files.
Errors I ran into:

I tried to shortcut porting my settings by copying the profile folder. This did not work for several reasons:
  • Some files contain absolute path descriptions
  • The extension folder contained the old 32 bit version of Lightning
  • Thunderbird crashed unexpectedly and not reproducibly
I resorted to installing a plain vanilla Thunderbird and adding extensions and settings iteratively. There is a detailed description of porting profiles in the Mozilla knowledge base. It helped migrating address books, key rings and other account settings.

Installing Lightning from the Mozilla download site resulted in reproducible crashes. Installing from the Ubuntu repository solved the issue.

How I test

I am not an expert in Ubuntu. So all I can do is
  • try to identify incidences
  • isolate the possible cause (by comparing 3 different machines, documenting changes)
  • check for solutions on the web
  • register bug (if it is one) with Launchpad
If possible, I provide additional documentation, logfiles and surrounding information.

My major target of testing lies in business applications. I do not care much about fancy stuff. If it works, ok; if it does not, so what.

This site concentrates on Ubuntu Linux

7.04 -> 7.10 on nx8220

I upgraded from 7.04 to 7.10 (Tribe 5) by simply downloading the CD and inserting it.

The update started without requiring any further input and went without flaws.

With 7.10 installed, the compiz desktop effects worked out of the box, until an update to xorg 7.2-5ubuntu3. Since then the nice desktop effects are gone again (Launchpad 144077).

All in all the machine is stable. I update regularly and things keep working pretty stably.

Some known issues:
  • Thunderbird does NOT honor gnome-vfs. They say it should but it does not (Launchpad 105088)
  • Having usplash enabled (kernel parameter splash) the machine hangs after a resume from suspend to RAM (Launchpad 106198, 146489)
  • After resume from suspend, sound does not work (Launchpad 146491)
Other than that there are some minor glitches that do not hinder efficient work.

Wolf's configuration

Here is my testbed environment:

HP nx8220 notebook (my workhorse):
Pentium M 760, 2GHz, 1GB RAM, 100 GB IDE HD, ATI Mobility Radeon X600 (64MB), wireless network
Ubuntu 7.10 rc 32bit

HP 510 notebook (my spouse's workhorse):
Pentium M 770, 2,16 GHz, 1GB RAM, 60 GB IDE HD, Intel 945 graphics card
Ubuntu 7.04 32bit

HP 8510p notebook (my future workhorse if it survives my testing):
Intel T7700 2,4GHz Dual Core 2, 2GB RAM, 120 GB SATA HD, ATI Radeon X2600 (256MB)
Ubuntu 7.10 rc 64bit

Server
EliteGroup P956T-A (V1.08), Intel E6600 2,6GHz Dual Core 2, 2GB RAM PC5400 CL4, 1 TB SATA HD, Gigabyte GV RX155256D-RH Radeon X1550 PCI express fanless, Antec P180 case

Monday 14 May 2007

First steps in Python

In my blog Python demystified I reflected on some thoughts about computer programming and the language in particular.

Ever since then I have been aware of and interested in this language. Remember, I said:
Without any decent IDE (like Netbeans for Java) that allows for graphical programming and UI-design, I strongly doubt that Python will ever gain momentum.
Well, I found a decent IDE: ActiveState Komodo IDE.

Except for a GUI designer it has everything a decent IDE needs: syntax highlighting, code completion and code folding, integration into version control (Subversion), debugging and profiling.

So just as I was about to change my attitude and general opinion on Python, the snake's ugly head rose from the depths of my notebook's core:

(Briefly): Python allows for object oriented programming. Objects can be created and instantiated. When they go out of scope they are subjected to the garbage collector for destruction. So far so good.

I tried an example program from a popular Python book - it worked.

I tried to extend the program (for better understanding) - it crashed.

Well, it terminated with an exception.

Further investigation revealed:
Python stores class definition and object instances in a globally accessible list. When program flow exits the current scope, all objects within scope are subjected to garbage collection according to this global list (and in the exact order of appearance within it).

So there is the possibility that an instance "wol" is destructed after the class definition "Person". Changing the object name to "wolf" places it after the "Person" identifier in the globals list, and thus there is an object still in memory and valid while the class definition is destroyed. A subsequent destructor call on the object instance cannot succeed; the code is not there any more.
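A minimal sketch of the kind of code involved (the names are illustrative, not the exact example from the book):

class Person:
    def __init__(self, name):
        self.name = name
    def __del__(self):
        # runs during interpreter shutdown; if the class or other module
        # globals have already been torn down at that point, this fails
        print("destructing " + self.name)

wol = Person("wol")
# no explicit cleanup: at exit, the order in which "wol" and "Person" are
# removed from the module globals decides whether __del__ still succeeds

Whether the destructor actually fails depends on the Python version and on that ordering, exactly as described above.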

Is it just me, or why do I find these things on my first day with the language?

Other than that I am fascinated by this elegant and slim language. Worth a try.

Thursday 26 April 2007

The war of the open file formats

ODF vs. OpenXML

In his blog [1], Brian Jones, Office Program Manager at Microsoft, claims that the war of the open file formats is over. The reason, he says, is Novell's release of a special version of OpenOffice that can read and write data in Microsoft's new file format OpenXML. In fact it is merely a stand-alone converter [2] that transforms OpenOffice Writer documents into Word 2007 documents. Is the war of the open file formats really over?

Until now, application software stored its data on disk as an image of main memory. This is fast and efficient, since lengthy translations of the data structures are avoided. It is, however, also error-prone: memory images contain cross references and linked structures, and a single corrupted value during data transfer renders the whole result unusable.

Over the years the internal data structures grew ever more complicated. Initially documents consisted only of character strings; later, text formatting, fonts, embedded images and tables were added. With rising complexity, the space required on disk grew, as did the time the software needed for loading and saving. Every new version introduced incompatibilities with the data of its predecessors. Importing data from other vendors was hampered by missing or insufficient documentation of the file formats. The market leader Microsoft in particular defended its position vehemently, both by continually changing its file formats and by prohibiting reverse engineering. Alternative vendors therefore developed import and export filters that allow rudimentary document exchange, but restoring the original appearance requires laborious manual rework.

Why new file formats?

In the long run, problems arose with access to documents and with their readability, usability and comparability. Public administrations in particular need unrestricted access to old and historical documents. For centuries paper has been used successfully as an information store; replacing it with electronic data processing is only possible if sustainability, confidentiality, security and data integrity can be guaranteed.

Conventional file formats cannot meet such requirements. In parallel with the growing demand from the public sector, cost pressure grew in companies, caused by incompatibilities between file formats and the resulting inefficiency of work processes.

Long-term suitability demanded

Within the OASIS consortium [3], leading software companies developed an open document standard intended to solve the most serious problems such as longevity, readability and error resistance. The Open Document Format (ODF) [4] promised an end to previous incompatibilities between file formats, both across versions and across vendors. At more than 700 pages the standard is comprehensive and open to future extensions.

Public administrations and institutions recognised the potential of the new standard. Besides some US states, countries in South America and Europe as well as several Asian countries defined ODF as the mandatory document standard for dealings with citizens and for internal processes. At the latest with the 2003 announcement of the Department of Defense (DoD) to make increased use of open source software and open document standards, Microsoft reacted to these developments.

Not just a means of storing data

Microsoft holds a high market share in base operating systems, in standard application software and also in document formats. The document formats in particular help Microsoft to control the update cycles of standard software and operating systems. In the medium term, incompatibilities between versions force companies to update because of their external contacts, if they do not want to lose the electronic connection to their partners. Automated system updates make it easy for Microsoft to tighten this pressure at will. A lasting reorientation of large customers such as the DoD threatens Microsoft's market position at its foundations.

This defection could only be prevented by providing open standards for file formats. For reasons of market politics, ODF as a file format is out of the question for Microsoft. So in 2005 a new standard was drafted within ECMA: ECMA-376, or Office OpenXML [5,6], was pushed through on 7 December 2006 despite numerous technical objections [7]. This standard currently comprises more than 6,500 pages plus numerous XML schema specifications and covers the Office applications Word, Excel, Powerpoint and Access [8,9].

(A side irony: in his proposal presentation to ECMA [10], Brian Jones refers to an EU document on open document standards and the advantages of using them. The original document [11], however, describes these advantages with explicit reference to ODF.)

What can the new formats do ...

Technically, ODF and OpenXML are very similar. They store the various parts of a document as XML files inside a ZIP archive. ZIP is a well-established and widespread compression algorithm; XML is an extensible description language for hierarchically structured data. XML files are larger than binary files of the same content, but thanks to their high information redundancy they compress better. The compressed archives can be written to and read from disk faster. File contents are not loaded directly into memory but are first parsed and converted into a machine-usable form. The formats are more error-resistant than their predecessors, and because of their open structure both can be post-processed automatically.
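Since both formats are just XML inside a ZIP container, a document can be inspected with standard tools; the file name here is only an example:
unzip -l report.odt
This typically lists entries such as mimetype, content.xml, styles.xml, meta.xml and META-INF/manifest.xml.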

ODF and OpenXML allow the embedding of binary fragments as well as scripting and macro languages. This introduces new security risks. Neither format offers sufficient built-in protection against unwanted changes to content, and both offer enough attack surface for injecting malicious code; preventing this is left to the applications. Microsoft Office uses the file name extension to decide whether macros may be executed. This protection is easy to circumvent, however, and therefore suggests a security that is not actually there.

Both formats guarantee long-term access to the data. For rendering, however, they depend (like their traditional predecessors) on numerous external factors; an identical visual appearance cannot be guaranteed with the current standards.

... and where do they differ?

ODF files are smaller than their OpenXML counterparts. ODF stores content together with its formatting, while OpenXML strictly separates text from formatting. That is technically cleaner and, surprisingly, does not lead to longer load times. Microsoft reserves the option of embedding larger files in a proprietary format in order to work around possible performance bottlenecks. Incompatibilities are preprogrammed here.

OpenXML does not build on existing standards. Among other things, new standards were defined for graphics, text, tables, mathematical formulas, country codes and colour codes, which bloated the specification. In practice this increases the error-proneness of application software and format converters. ODF relies throughout on established standards such as SVG, XML, MathML and standardised ISO codes.

How free is "free"?

ODF and OpenXML are licensable standards whose use is free of charge. Within the terms of its licences, ODF can be extended by third parties. The Open Document white paper points to nine reference implementations [12], for some of which the source code is available. Microsoft, by contrast, points to only one reference implementation, Office 2007, which is not open source.

A legal opinion confirms that ODF is unobjectionable with respect to patent law. Sun, the main contributor to the ODF standard, has issued an additional waiver of claims. The withdrawal of individual members from the OASIS consortium is clearly regulated, so claims from former members are not to be expected in the future. Patent claims that might surface later are not completely ruled out; OASIS acknowledges this marginal residual risk but does not itself see a solution for it.

Microsoft allows anyone to use OpenXML. It points to its patents in the OpenXML standard but also publishes a written covenant not to sue on its website. This covenant is limited to the technologies and patents touched by the standard itself. Patents that are touched by a proper implementation of the OpenXML specification in application software, as well as possible third-party patents, are not covered by this exemption. Especially for the integration into application software, legal experts see a not inconsiderable legal risk that Microsoft has not dispelled.

At present both ODF and OpenXML are freely accessible, freely usable, free of charge and free of patent restrictions. ODF is more approachable for software developers: the manageable size of the specification allows one to work into the subject economically. OpenXML, with its more than 6,500 pages and numerous schemas, does not exactly invite the interested reader.

Attractive for the public sector

Public administrations and governments demand the use of open file formats for two main reasons:
1. Long-term usability shall be guaranteed, even across the most diverse IT systems. This demand comes mainly from offices and public authorities.
2. The strong dependency on software vendors shall be reduced and, where possible, local know-how used. This demand comes primarily from governments.

Here ODF has a clear head start. The format has existed for some time, has been implemented several times and is accessible in open source form. The standard is supported by several large software vendors as well as by a large developer community. ODF still has some limitations that hinder its use in the public sector, and its long-term extensibility has not yet been proven. OpenXML cannot point to any substantial advantages that would make its use compelling. Microsoft does, however, possess a broad installed base over which it can exert strong influence through its automated updates.

Until the open issues are finally resolved, public bodies can take the first steps towards automation; necessary additions can be made in later phases of the roll-out.

Attractive for companies?

Companies operate in shorter innovation cycles. Only a small part of business-critical information has long-term relevance (contracts, financial information) and has to be archived and maintained accordingly; the rest of the working documents have a short half-life.

A fundamental decision for a file format is only possible when the choice of alternative application software is itself up for discussion. Creating ODF files with Microsoft Office (Word, Excel, Powerpoint, Access) is not possible today and not foreseeable in the future. Conversely, creating OpenXML files from alternative application software is currently possible only to a limited extent. Since Microsoft's applications are particularly widespread in companies, the use of OpenXML is at least foreseeable there.

Well-known migration projects such as those of the City of Munich or Lufthansa are largely driven by strategic, political or idealistic motives; economic advantages in terms of total cost are marginal or non-existent. Who will prevail in the end - public administrations and governments preferring ODF, or companies using Microsoft with OpenXML - cannot be predicted at present.

The war of the open file formats is therefore not over yet. At best we are witnessing a short-lived ceasefire.


Bibliography:
[1] Brian Jones's blog
[2] Novell OpenXML Translator
[3] OASIS founding members
[4] Open Document for Office Applications
[5] ECMA-376 - Office OpenXML
[6] Office OpenXML Fact Sheet
[7] Objections to JTC-1 Fast-Track Processing of the Ecma 376 Specification v. 0.1, 27.1.2007
[8] ECMA-376 OpenXML White Paper
[9] Introducing the Office (2007) Open XML File Formats
[10] Start of TC45: Presentation to GA
[11] TAC approval on conclusions and recommendations on open document formats
[12] Open by Design, ODF White Paper

Monday 23 April 2007

Ubuntu vacation feelings

After a long and discontinuous experience with Linux and particularly Ubuntu, I switched to Ubuntu over Easter.

All the good reviews

If you are interested in appraisal-only reports, look somewhere else. I had so many errors, bugs and strange behaviours that I cannot join the choir of Ubuntu enthusiasts.

Ready for prime time - for some

If you use your computer just for E-Mail, web browsing and the occasional word processing, you will love Ubuntu.

If you watch a video every now and then, you will be excited to see that it can be painless.

but not for all

If, on the other hand, you want to use Ubuntu in a mixed environment with Windows clients and servers, prepare for some surprising incidences.

Gnome provides an interesting approach to file access: gnome-vfs (the Gnome virtual file system). It's an easy-to-use API that allows applications to access remote and heterogeneous file systems. Just mount a volume, drive or directory and access it with any application. That's what it says on the box.

Reality quickly catches up: only a few applications are aware of gnome-vfs and the mounted drives. Nautilus (the Explorer counterpart under Gnome) can access files. OpenOffice supports gnome-vfs as well. Others don't. And they are not just any applications: Thunderbird and Firefox are among them.

Notebook misery

I run Ubuntu on several notebooks. The basic system will always work. If you want to use notebook-specific features like touchpads, sleep mode or wireless LAN, prepare for nightly sessions of debugging and error discovery. If your notebook is equipped with exotic peripherals (anything other than a keyboard, a screen and an external mouse will do), you will likely find them not working.

On a HP nx8220 the smart card reader is not recognized, the SD cards cannot be mounted and sleep mode will wake up with sound amiss.

My HP 510 has a built-in Synaptics touchpad. It is recognized on my nx8220 but not on the HP 510. To calm us down, sound works after wakeup.


Tiny little annoyances

As a professional developer I am not prepared to ship things that do not work. And it seems pretty clear that some things don't work. So they should not have been shipped.

The proprietary graphics drivers from ATI do not go well with video playback. OK, they are turned off by default. Compiz is also turned off by default, and that is good, as it conflicts with video playback as well.

There is no centralized tool to adjust regional settings. This has to be done in configuration files, logon scripts, Gnome tools and sometimes within the application itself. Thunderbird, for example, does not honour the system-wide font setting. It also ignores regional time formats. You have to set these using environment variables.

Why do I use it?

So, if I am not happy, why did I bother migrating?

Well, I did not say I was unhappy. There are things that really work well. Automatic updates, upgrading to a new version, installation and deinstallation of software are all more stable and trustworthy than the monopolist's counterparts.

There is no IE installing malware behind my back, no Office update that deletes some of my .NET framework DLLs and most of all, no DRM to tell me what I am allowed to do, see and view.

I have no need to defrag my harddisk or registry, no thrills using some low-level maintenance tools. I do my work and that's ok.

If there are some issues or lack of functionality I can look under the hood and identify the problem myself. I can contribute to the evolution of a system and that contribution is valued (as opposed to Microsoft where reporting a bug will cost you money).

But most of all, I feel like a person that has left their privileged life behind. All the high-tech gadgets, the nitty-gritties, items and toys that seemed so important mean nothing. I stand here with my bare feet in the sand and watch the sunrise (or sunset, whichever you prefer). I feel like I don't need all the chaos, hectic rush and stress.

I feel like I'm on vacation.

Tuesday 27 March 2007

Hands off Eclipse

We have this ongoing argument about there not being enough Java/Oracle programmers available. I took this as an occasion to brush up on my Java know-how.

A friend suggested using Eclipse as an IDE. So I listened and tried it.

Let me put it this way: I have not used such a bad piece of software in a while.

OK, it's free of charge but that's about all that speaks for it.
  • It's slow. Dead slow. A simple "Hello world" took 5 minutes to set up, 2 minutes to debug and more than a minute to launch from the IDE
  • It's complicated: To set up a project you need to go from Window to Window to set up projects, packages, classes, hierarchies and outlines
  • It's slow (did I mention that already?): The code completion takes ages and even blocks text entry (System - hang - .out. - hang - println - hang ...)
  • It's confusing: Try to debug: nothing happens. You have to select which type of project it should be (can a Java class be debugged as C/C++?)
  • It's broken: I tried to update the IDE (I used 3.2). There were some updated modules. The update mechanism asked over and over which Download center I wanted to use (the automatic selection ended in a disaster). And after another 15 minutes of downloading, the IDE told me, I had not enough rights to install the update. Thanks for telling me so soon
  • It's cumbersome: Creating an SWT app is not straightforward. I gave up working on this one
  • It's slow (deja vu): You can add plug-ins easily (if you have root privilege, that is). With each plug-in, the IDE becomes slower and more unusable
Conclusion: If I have to write programs in Java, I use Netbeans (free of charge) or IntelliJ ($ 599,- incl. TeamCity).

Friday 23 March 2007

Where is IT security heading?

I was asked to participate in a survey covering IT security. The survey was carried out by a university task group. The goal was to identify areas of security that companies were aware or unaware of.

I tried very hard to give this group meaningful data and information. However, I could not answer the questionnaire past question number 13. The questions were irrelevant, ill-formulated, misleading and mostly of archaeological value.

Some examples?

x. How much has your company spent on IT security in the last year?
(without ever asking for the size, branch, turnover or revenue to put this number in relation to)

y. What IT risks are you aware of:
- Viruses
- Trojans
- Worms
- Adware
- Dialers
- Hardware errors
- User failure
- Theft

z. What IT risks do you prevent:
- (above list)

It did not start out that bad. The introduction claimed that due to the increased penetration of IT in different businesses, an increase in IT risks was to be expected. The survey aimed to help identify areas of IT risks and security issues in these new fields of application.

Is this where university education goes? Triviality!

But the issue in itself is interesting enough to dig into.

I offered the group some thoughts about IT security. First I tried to identify areas of risk that require assessment:
  • Systemic risk
  • Implementation and operational risk
  • Environmental risk
  • Human factors
Let me get into this a little more.

Systemic risk

Admittedly viruses, trojans and dialers are issues not to be ignored, but widely overemphasized.
Another, more severe issue is rootkits. Lacking medical or historical verbal associations, rootkits are perceived as something remote and arcane, but hardly threatening to me.

Rootkits may be introduced into a computer system using such seemingly harmless methods as playing a CD or watching a video. They lie dormant until activated by their creators. Using stealth technology they are invisible to ordinary administrative activities. They are universal in their functionality, offering a privileged environment inside the infected computer.

Virtualisation will add one layer of complexity and uncertainty to this scenario.

Extending the system boundaries, the next vulnerable technology is naming services. DNS is a highly fragile system, relying on approximately 13 root servers. Adding or removing one root server from the DNS net has an immediate impact on 7.6% of the network load.

DNS offers some resilience through load balancing, database replication and local name caches; still, a targeted, combined denial-of-service and man-in-the-middle attack could wreak havoc in our IP-based world. And this is just the top of the tree; at the bottom of the hierarchy, vulnerabilities exist as well.
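To make the dependency tangible, here is a minimal sketch, assuming the dnspython module is installed, that asks one of the 13 root addresses (a.root-servers.net) directly where the .org zone lives. Every full resolution ultimately starts at one of these few well-known machines:

    import dns.message
    import dns.query   # dnspython, assumed to be installed

    # Ask a.root-servers.net (one of the 13 root addresses) which name
    # servers are responsible for the .org zone. A normal lookup walks
    # down the tree starting from exactly these few machines.
    query = dns.message.make_query("org.", "NS")
    response = dns.query.udp(query, "198.41.0.4", timeout=5)
    for record in response.answer + response.authority:
        print(record)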

Another area pertaining to risk assessment is the routing of information flows. At the base, data is forwarded from sender to recipient using a combination of interacting protocols. From ARP, UDP and TCP to RIP, OSPF and BGP (to name but a few), data is packed and transferred, destinations are looked up, recipients verified and sequences honored. At a higher level, even more protocols come into play when analyzing mail traffic, viewed web pages and the exchange of authentication credentials. Attacking one of those protocols renders the whole network useless.

And there are several attack points, both known and widely unknown, out in the wild.

Finally (and this is not really an exhaustive enumeration), identification and identity management will become increasingly important. As systems are externalized and users access these services in ever more mobile and volatile ways, identity, authentication, authorization and non-repudiation will move into the focus of future security assessments.

Another area of concern:

Implementation and operational issues

A quick query on CERT or SecurityFocus reveals that most current issues stem from programming problems. Buffer overflows, unrecognized error conditions and weak, template-based programming are the root source of these issues. If time to market drives engineering efforts, it is no wonder we have such poor and unstable software.

Next in this chain is the implementation of systems (combined hardware and software) by following standard how-tos or simply clicking through a few defaults. For a system to function in a variety of different setups, it has to run with low security settings out of the box. While this allows a quick start, it opens a vast array of vulnerabilities in a production environment.

Data and function exploitation techniques (like SQL injection), improper access to underlying data (reading native file systems), identity spoofing, to name but a few: the number of imaginable attack vectors is uncountable. A small sketch of the SQL injection case follows below.
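To illustrate just the SQL injection case, here is a minimal, self-contained sketch using Python's built-in sqlite3 module; the table, column and input values are invented for the example:

    import sqlite3   # standard library; table and data below are made up

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

    user_input = "x' OR '1'='1"   # what an attacker might type into a form

    # Vulnerable: the input is pasted straight into the SQL text,
    # so the injected OR clause matches every row.
    leaked = conn.execute(
        "SELECT * FROM users WHERE name = '" + user_input + "'").fetchall()

    # Safer: a parameterized query treats the input as a plain value.
    safe = conn.execute(
        "SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()

    print(leaked)   # every row comes back
    print(safe)     # nothing comes back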

A highly rated issue is data backup. Millions are spent on data backup; hardly anything is invested into (bare metal) recovery. It takes an initial effort (and thus a one-time cost) to set up a working backup strategy.

It takes permanent effort (and therefore a lasting and substantial cost factor) to test the quality and feasibility of data recovery. And while a backup covers only a selected number of threat scenarios (full recoveries or just single-file restores), testing recovery has to deal with all possible cases. A change in an underlying system component may render the whole process useless.

Dealing with large data quantities and centralized storage (using NAS or SAN) increases the sensitivity of the subject.

Availability of computing resources is another issue. While virtualization allows computing hardware to be used efficiently, resilience to hardware and software failure becomes a critical factor, as the hosting components are single points of failure that affect quality of service and system integrity alike.


Environmental risk

Threat scenarios become wider, ranging from fire to water to temperature. While the first two were always on the agenda of even the smallest operation, temperature is becoming an issue in the workplace environment. As CPUs and graphics cards produce more and more heat and computer systems become ubiquitous, heat emission will only increase in the future.

I cannot go into every aspect of environmental risk here. Suffice it to say that the company's line of business, the typical usage of its IT systems, even legal issues and changing regulations influence environmental risk.

An arbitrary example might be the sales representative of a company doing business in Asia. Due to restrictive information policies in some Asian countries (e.g. China, Burma), content stored on a computer hard disk might be seen as an offense by the local authorities (carry a PDF covering the October revolution into China for a little adrenaline peak). To this day, possession of a computer in Burma is strictly prohibited and can carry the death penalty.

Another widely ignored factor is the dependency on one single monopolist. Currently there is one major vendor dictating formats for data storage, communication, even software update cycles. This monopolist leverages each of its systems to gain more control and to suffocate technological innovation that is incompatible with (and therefore unwelcome to) its overall strategy.

The human factor

What can I say? Undereducated, ill-trained, so-called computer experts; time-to-market driven decisions about system releases; cost-based human resource policies. All of these are security issues.

Hiring a novice programmer might reduce labour cost in the programming department. It sure will increase labour cost in the help desk and call center. Does it make systems more reliable, more secure? No.

Does it make the CFO happy? Yes.

He can always point to the low wages.

Outsourcing? Let's ship our development to India. Let's move our help desk to India. Let's move accounting to India. Labour cost for these activities drops (for how long, may I ask?). But the understanding of risk and the correct assessment of security issues still lie with the company.

What is more, due to increased communication and repair efforts, costs will rise, effectively shifting expenses from production to communication and rework.

Conclusion

These are the security issues I really see and anticipate for the future. In a few years there will still be some viruses, some adware and some hardware failures, but they will hardly be covered by the press.

We will see more incidents hitting one or another of the areas I described above. And the impact will hit more than a single computer or company. It will hit communities, regions and industry segments.

Wednesday 7 March 2007

Customer Service

I read this excellent article on customer service. While I try to enforce good customer support at my company, I have never had my quality standards laid out in such an easy-to-follow list of steps.

Thanks to Joel Spolsky, here it is. Read, enjoy and act on it.

Sunday 4 March 2007

The future of video

Currently, we are migrating our workstations from Windows to Linux. While the basic stuff works perfectly fine, we run into trouble when dealing with music and video.

Especially video.

Our video collection does not work under Linux.

While playing videos under Windows is not an easy feat, playing them under Linux provides insight into a lot of things. Unfortunately, videos are not among them.

Let's summarize:
Videos are stored in container files. To play them, codecs and decoders are required. Some, like MPEG-2 and MPEG-4, are straightforward; others, like DivX, XviD and H.264, are typically embedded in AVI containers.

Under Windows, you install codecs and the installer hooks them into the search path of any installed media player. OK, some players honor these paths, some don't. Sometimes helper applications like Explorer get confused and crash. But lo and behold, it works pretty well.

Under Linux, you need media frameworks that in turn load codec plug-ins. If you have ever tried to make ends meet with GStreamer, you know what I am talking about.
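For reference, even the "simple" path goes through a plug-in framework. A minimal sketch, assuming the gst-python 0.10 bindings are installed (the file path is made up), might look like this; playbin only finds decoders for codecs whose plug-in packages happen to be present:

    import pygst
    pygst.require("0.10")   # gst-python bindings, assumed to be installed
    import gst
    import gobject

    # playbin picks decoder plug-ins automatically, but only among those
    # installed on the system; a missing codec package means no playback.
    player = gst.element_factory_make("playbin", "player")
    player.set_property("uri", "file:///home/me/movie.avi")   # hypothetical path
    player.set_state(gst.STATE_PLAYING)

    gobject.MainLoop().run()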

My suggestion

Here is a proposal for how video streams could be packaged in order to eliminate the codec problem and thus make video handling user-friendly.

All video is encoded using a video and an audio encoder. Mixtures are possible and exist.

A new file format could accommodate two parts: one, the decoder codec, and two, the film itself. The codec would be extracted by a generic decoding engine and used to decode the film stream. The player would be a generic codec interpreter that allows platform-independent decoder plug-ins to be hooked in.

None of the above-mentioned technologies is new. Platform-independent plug-ins are a reality in Mozilla-driven XUL tools. Piggyback codecs can be attached to the video stream. Interpretation engines could be Python or JavaScript.

This mechanism would even allow for commercial content, as the codecs may well be connected to payment systems.

I haven't given this deeper thought. Maybe my tweaking with Linux will lead me to deeper insights and eventually make me mad enough to write a prototype.
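As a thought experiment only, here is a toy sketch in Python of what such a two-part container could look like. Everything in it is invented for illustration: the layout, the file name and the "codec", which here just reverses the bytes of the stream:

    import json
    import struct

    # Toy layout (invented): 4-byte header length, a JSON header describing
    # the embedded decoder, the decoder source itself, then the raw stream.
    def write_container(path, decoder_source, stream_bytes):
        header = json.dumps({"decoder_language": "python",
                             "decoder_length": len(decoder_source)}).encode("utf-8")
        with open(path, "wb") as f:
            f.write(struct.pack(">I", len(header)))
            f.write(header)
            f.write(decoder_source.encode("utf-8"))
            f.write(stream_bytes)

    def play_container(path):
        with open(path, "rb") as f:
            header_len = struct.unpack(">I", f.read(4))[0]
            header = json.loads(f.read(header_len).decode("utf-8"))
            decoder_source = f.read(header["decoder_length"]).decode("utf-8")
            stream = f.read()
        # The generic engine runs the shipped decoder against the stream.
        scope = {}
        exec(decoder_source, scope)
        return scope["decode"](stream)

    # A trivial stand-in "codec" that just reverses the bytes.
    toy_decoder = "def decode(data):\n    return data[::-1]\n"
    write_container("demo.vid", toy_decoder, b"semarf oediv")
    print(play_container("demo.vid"))   # b'video frames'

A real implementation would of course have to sandbox the shipped decoder; running arbitrary code out of a media file is exactly the kind of rootkit vector described above.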

Tuesday 16 January 2007

Python demystified?

Four years ago I was asked to troubleshoot a project under a tight schedule and budget. The team had committed itself to finishing an ASP application within two years. When I was called in, there were only four months to the deadline.

I agreed to manage the team under the condition that two out of three parts be substituted by standard off-the-shelf components (who needs to develop their own database or browser?) and that efforts be concentrated on the core functionality.

My proposal was rejected. The project team was confident that using the Python programming language would give them the edge to finish on time and on budget.

Later I learned that the project failed and the company went out of business.

My argument then was that Python could not be compared to C# or Java in efficiency. Moreover, no real libraries existed.

I was right about the potential of Python as a tool to finish in time.

However, I was wrong about the true reason why Python has not gained momentum the way Java and C# have, and probably will not in the future.

Recent occupation with the subject led me to a different opinion.

1. C# is easy to use and convenient for writing software. However, C#'s static typing requires programmers to define in advance what they are going to handle. Even with refactoring tools and support for generics, changes in structure and data definitions can cause headaches.

Python (like JavaScript) offers dynamic typing. Objects can be of any type, and programmers don't have to worry in advance about what they will have to handle. Python, as opposed to JavaScript, also has a tighter declaration syntax (var i: I know it's a variable, so why do I have to state it?).
There are many more advantages to the Python language. Some of them are:
  • everything is an object,
  • strong string operations,
  • plethora of external modules,
  • integration into host operating systems,
  • integration with other programming languages,
  • etc. ...
These make Python a powerful programming tool. A short sketch of the dynamic typing point follows below.
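Here is a minimal sketch of the dynamic typing argument and the "everything is an object" point, in plain Python with no external modules:

    # One function handles whatever supports the operations it uses;
    # nothing has to be declared in advance.
    def describe(thing):
        return "%r of type %s, %d items" % (thing, type(thing).__name__, len(thing))

    print(describe("hello"))            # a string works
    print(describe([1, 2, 3]))          # so does a list
    print(describe({"a": 1, "b": 2}))   # or a dictionary

    # Everything is an object, including functions and types themselves.
    print(type(describe))
    print(isinstance(len, object))      # True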

Going from here, I would prefer Python to any language any day.

2. In the early days of Java, everyone wrote their own IDE. This was possible because Java offered AWT, a platform-independent graphics subsystem. That allowed many programmers to adopt Java. C# is hosted in the Visual Studio IDE (with a free Express edition available). Even Mono has two prominent IDEs: MonoDevelop and SharpDevelop.

Python offers an outdated IDLE (basically a specialised shell with no charm), ERIC (a bloated Qt-based environment bundling an older version of the Python interpreter), PyDev (a sluggish Eclipse plugin), and many more alpha and pre-alpha editors.

Without a decent IDE (like Netbeans for Java) that allows for graphical programming and UI design, I strongly doubt that Python will ever gain momentum.

3. There are plenty of introductory books about Java, C# or VB. There are some books around introducing Python. None of the books I reviewed showed how to set up a working Python development environment. Maybe it's my selection, maybe I am used to skipping the getting-started chapters of other languages. With Python I wasted hours installing, testing and uninstalling IDEs and development tools.

Without a cross-platform, native or Python-based IDE, I am pretty sure that Python will end up like Modula-2 in the 70s, Ada in the 80s and LiveScript in the 90s.

Looking back, finishing the project on time was absolutely impossible, not because of missing power in the language but because of the lack of a powerful IDE.