Tuesday, 27 March 2007

Hands off Eclipse

We have an ongoing argument about there not being enough Java/Oracle programmers available. I took this as an occasion to brush up on my Java know-how.

A friend suggested using Eclipse as an IDE. So I listened and tried it.

Let me put it this way: I have not used such a bad piece of software in a while.

OK, it's free of charge, but that's about all that speaks for it.
  • It's slow. Dead slow. A simple "Hello world" took 5 minutes to set up, 2 minutes to debug, and more than a minute to launch from the IDE
  • It's complicated: To set up a project you have to go from window to window to configure projects, packages, classes, hierarchies and outlines
  • It's slow (did I mention that already?): Code completion takes ages and even blocks text entry (System - hang - .out. - hang - println - hang ...)
  • It's confusing: Try to debug, and nothing happens. You have to select which type of project it should be (can a Java class be debugged as C/C++?)
  • It's broken: I tried to update the IDE (I used 3.2). There were some updated modules. The update mechanism asked over and over which download center I wanted to use (the automatic selection ended in disaster). And after another 15 minutes of downloading, the IDE told me I had not enough rights to install the update. Thanks for telling me so soon
  • It's cumbersome: Creating an SWT app is not straightforward. I gave up on this one
  • It's slow (déjà vu): You can add plug-ins easily (if you have root privileges, that is). With each plug-in, the IDE becomes slower and more unusable
Conclusion: If I have to write programs in Java, I will use NetBeans (free of charge) or IntelliJ IDEA ($599, incl. TeamCity).

Friday, 23 March 2007

Where is IT security heading?

I was asked to participate in a survey covering IT security. The survey was carried out by a university task group. The goal was to identify areas of security that companies were aware or unaware of.

I tried very hard to give this group meaningful data and information. However, I could not answer the questionnaire past question number 13. The questions were irrelevant, ill-formulated, misleading and mostly of archaeological value.

Some examples?

x. How much has your company spent on IT security in the last year?
(without ever asking for the size, branch, turnover or revenue to put this number into relation)

y. What IT risks are you aware of:
- Viruses
- Trojans
- Worms
- Adware
- Dialers
- Hardware errors
- User failure
- Theft

z. What IT risks do you prevent:
- (above list)

It did not start out that bad. The introduction claimed that, due to the increased penetration of IT into different businesses, an increase in IT risks was to be expected. The survey aimed to help identify IT risks and security issues in these new fields of application.

Is this where university education is heading? Triviality!

But the issue in itself is interesting enough to dig into.

I offered the group some thoughts about IT security. First I tried to identify areas of risk that require assessment:
  • Systemic risk
  • Implementational and operational risk
  • Environmental risk
  • Human factors
Let me get into this a little more.

Systemic risk

Admittedly, viruses, trojans and dialers are issues not to be ignored, but they are widely overemphasized.
Another, more severe issue is rootkits. Lacking any medical or historical verbal association, rootkits are perceived as something remote and arcane, but hardly a personal threat.

Rootkits may be introduced into a computer system using such seemingly harmless methods as playing a CD or watching a video. They lie dormant until activated by their creators. Using stealth technology they are invisible to ordinary administrative activities. They are universal in their functionality, offering a privileged environment inside the infected computer.

Virtualisation will add another layer of complexity and uncertainty to this scenario.

Extending the system boundaries, the next vulnerable technology is naming services. DNS is a highly fragile system, relying on just 13 root servers. Adding a root server to the DNS net, or taking one away, has an immediate impact on roughly 7.7% (one thirteenth) of the root query load.
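That percentage is simply the load share of one server out of thirteen. A quick back-of-the-envelope check in Python, assuming a uniform split (which real anycast deployments only approximate):

```python
# Load share of a single DNS root server, assuming queries are
# spread evenly across all 13 of them.
servers = 13
share = 100 / servers
print(f"one root server carries ~{share:.1f}% of the load")
```

So removing (or adding) one root server shifts roughly that fraction of the root query traffic at once.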

DNS offers some resilience, like load balancing, database replication and local name caches; still, a targeted, combined denial-of-service and man-in-the-middle attack could wreak havoc in our IP-based world. And this is just the top of the tree. At the bottom, vulnerabilities exist as well.

Another area pertaining to risk assessment is the routing of information flow. At the base, data is forwarded from the sender to the recipient using a combination of interacting protocols. From ARP, UDP and TCP to RIP, OSPF and BGP (to name but a few), data is packed and transferred, destinations are looked up, recipients verified and sequences honored. At a higher level, even more protocols come into play when analyzing mail traffic, viewed web pages and the exchange of authentication credentials. Attacking one of those protocols renders the whole network useless.

And there are several attack points both known and widely unknown out in the wild.

Finally (and this is not really an exhaustive enumeration), identification and identity management will become increasingly important in the future. As systems become externalized and users access these services in an ever more mobile and volatile way, identity, authentication, authorization and non-repudiation will move into the focus of future security assessments.

Another area of concern is

Implementational and operational issues

A quick query on CERT or SecurityFocus reveals that most current issues stem from programmatic problems. Buffer overflows, unrecognized error conditions and weak, template-based programming are the root cause of these issues. If time to market is driving engineering efforts, it is no wonder we have such poor and unstable software.

Next in this chain is the implementation of systems (combined hardware and software) by following standard how-tos or simply clicking through a few defaults. In order for a system to function in a variety of different setups, it has to ship with low security settings. While this allows a quick start, it opens up a vast array of vulnerabilities in a running environment.

Data and function exploitation techniques (like SQL injection), improper access to underlying data (reading native file systems), identity spoofing, to name but a few: the number of imaginable attack vectors is uncountable.
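SQL injection in particular is easy to demonstrate. A minimal sketch in Python with an in-memory SQLite database (the table and the malicious input are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

# Attacker-controlled input that rewrites the WHERE clause.
malicious = "nobody' OR '1'='1"

# Vulnerable: string formatting pastes the input into the SQL text.
unsafe = conn.execute(
    "SELECT secret FROM users WHERE name = '%s'" % malicious
).fetchall()

# Safe: a parameterized query treats the input as a plain value.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (malicious,)
).fetchall()

print(len(unsafe), len(safe))  # the unsafe query leaks the row, the safe one doesn't
```

The defense has been known for years; the vulnerability persists because the unsafe variant is the one that template-based programming produces by default.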

A highly rated issue is data backup. Millions are spent on data backup; hardly anything is invested in (bare-metal) recovery. Setting up a working backup strategy takes an initial effort (and is thus a one-time cost).

Testing the quality and feasibility of data recovery, by contrast, takes permanent effort (and is therefore a lasting and substantial cost factor). And while a backup covers only a selected number of threat scenarios (mostly full recoveries or single-file recovery), testing recovery has to deal with all possible cases. A change in an underlying system component may render the whole process useless.
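The recovery side can at least be smoke-tested automatically. A minimal sketch, in which a simple file copy stands in for the real backup tool (a real test would restore onto separate hardware with the production restore procedure):

```python
import hashlib
import os
import shutil
import tempfile

def sha256(path):
    """Checksum a file so source and restored copy can be compared."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

workdir = tempfile.mkdtemp()
original = os.path.join(workdir, "data.db")
backup = os.path.join(workdir, "data.db.bak")
restored = os.path.join(workdir, "data.db.restored")

with open(original, "wb") as f:
    f.write(b"important payroll records")

shutil.copy2(original, backup)    # the "backup" step
shutil.copy2(backup, restored)    # the "recovery" step

# The test that is so often skipped: does the restore match the source?
assert sha256(original) == sha256(restored)
print("restore verified")
```

It is exactly this verification step, run permanently and after every change to the underlying system, that turns a backup into a recovery strategy.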

Dealing with large data quantities and centralized storage (using NAS or SAN) increases the sensitivity of the subject.

Availability of computing resources is another issue. While virtualization allows computing hardware to be used efficiently, resilience to hardware and software failure is a critical factor, as the hosting components are single points of failure that may influence quality of service and system integrity alike.

Environmental risk

Threat scenarios are becoming wider, ranging from fire to water to temperature. While the first two have always been on the agenda of even the smallest operation, temperature is becoming an issue in the workplace environment. As CPUs and graphics cards produce more and more heat, and computer systems become ubiquitous, heat emission will increase in the future.

I cannot go into every aspect of environmental risk here. Suffice to say that the branch of the company, the typical usage of IT systems, even legal issues and changing regulations are influencing environmental issues.

An arbitrary example might be a sales representative of a company doing business in Asia. Due to restrictive information policies in some Asian countries (e.g. China, Burma), content stored on a computer hard disk might be seen as an offense by the local authorities (carry a PDF covering the October revolution into China for a little adrenaline peak). To this day, possession of computers in Burma is strictly prohibited, with offenders facing the death penalty.

Another widely ignored factor is the dependency on one single monopolist. Currently, there is one major vendor dictating formats for data storage and communication, and even software update cycles. This monopolist leverages each of its systems to gain more control and to suffocate technological innovation that is incompatible with (and therefore unwelcome to) its overall strategy.

The human factor

What can I say. Undereducated, ill-trained, so-called computer experts; time-to-market driven decisions about system releases; cost-based human resource policies. All of these are security issues.

Hiring a novice programmer might reduce labour cost in the programming department. It will surely increase labour cost in the help desk and call center. Does it make systems more reliable, more secure? No.

Does it make the CFO happy? Yes.

He can always point to the low wages.

Outsourcing? Let's ship our development to India. Let's move our help desk to India. Let's move the accounting to India. Labour cost for these activities drops (for how long, may I ask?). But the understanding of risk, the correct assessment of security issues, still lies with the company.

Moreover, due to increased communication and repair efforts, cost will rise, effectively shifting expenses from the production source to communication and repair.


These are the security issues I really see and anticipate for the future. In a few years there will still be some viruses, some adware and some hardware failures. But they will hardly be covered by the press.

We will see more incidents hitting one or the other of the areas I described above. And the impact will hit more than a single computer or company. It will hit communities, regions and industry segments.

Wednesday, 7 March 2007

Customer Service

I read this excellent article on customer service. While I try to enforce good customer support at my company, I have never had my quality standards laid out in such an easy list of steps to follow.

Thanks to Joel Spolsky, here it is. Read, enjoy and act on it.

Sunday, 4 March 2007

The future of video

Currently, we are migrating our workstations from Windows to Linux. While the basic stuff works perfectly fine, we run into trouble when dealing with music and video.

Especially video.

Our video collection does not work under Linux.

While playing videos under Windows is no great feat, playing them under Linux provides insight into a lot of things. Unfortunately, videos are not among them.

Let's summarize:
Videos are stored in container files. To play them, codecs and decoders are required. Some, like MPEG-2 and MPEG-4, are straightforward; others, like DivX, XviD and H.264, are embedded in AVI files.

Under Windows, you install codecs and the installer hooks them into the search path of any installed media player. OK, some players honor these paths, some don't. Sometimes helper applications like Explorer get confused and crash. But lo and behold, it works pretty well.

Under Linux, you need graphics libraries that have their own library plug-ins. If you have ever tried to make ends meet with GStreamer, you know what I am talking about.

My suggestion

Here is a proposal of how video streams could be encoded in order to eliminate the codec problem and thus make video handling user friendly.

All video is encoded using a video and an audio encoder. Mixtures are possible and exist.

A new file format could accommodate two parts: one, the decoder codec, and two, the film itself. The codec would be extracted by a generic decoding engine and used to decode the film stream. The player would be a generic codec interpreter that allows platform-independent decoder plug-ins to be hooked in.

None of the above-mentioned technologies is new. Platform-independent plug-ins are a reality in Mozilla-driven XUL tools. Piggyback codecs can be attached to the video stream. Interpretation engines can be Python or JavaScript.
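As a toy illustration of the proposed format, here is a minimal Python sketch: the container carries the decoder source in front of the payload, and a generic engine extracts and interprets it. The layout and the "codec" (which merely reverses bytes) are hypothetical, invented for this sketch; they match no existing format.

```python
import io
import struct

def pack(codec_src: bytes, stream: bytes) -> bytes:
    """Container layout: [4-byte codec length][codec source][film payload]."""
    return struct.pack(">I", len(codec_src)) + codec_src + stream

def play(container: bytes) -> bytes:
    buf = io.BytesIO(container)
    (codec_len,) = struct.unpack(">I", buf.read(4))
    codec_src = buf.read(codec_len)
    payload = buf.read()
    # The generic engine interprets the shipped codec. A real player
    # would sandbox this instead of calling exec() on untrusted input.
    namespace = {}
    exec(codec_src.decode(), namespace)
    return namespace["decode"](payload)

# A toy "codec" whose decode step just reverses the byte stream.
codec = b"def decode(data):\n    return data[::-1]\n"
container = pack(codec, b"maerts oediv")
print(play(container))
```

The sandboxing question is exactly where the commercial-content angle comes in: an engine that controls codec execution could also gate it behind a payment check.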

This mechanism would even allow for commercial content, as the codecs may well be connected to payment systems.

I haven't given this deeper thought yet. Maybe my tinkering with Linux will take me to deeper insights and eventually make me mad enough to write a prototype.