I tried very hard to give this group meaningful data and information. However, I could not answer the questionnaire past question number 13. The questions were irrelevant, ill-formulated, misleading and mostly of archaeological value.
x. How much has your company spent on IT security in the last year?
(without ever asking about company size, industry, turnover or revenue to put this number into perspective)
y. What IT risks are you aware of:
- Hardware errors
- User failure
z. What IT risks do you prevent:
- (above list)
It did not start out that badly. The introduction claimed that, due to the increasing penetration of IT into different businesses, an increase in IT risks was to be expected. The survey aimed to help identify areas of IT risk and security issues in these new fields of application.
Is this what university education has come to? Triviality!
But the issue in itself is interesting enough to dig into.
I offered the group some thoughts about IT security. First I tried to identify areas of risk that require assessment:
- Systemic risk
- Implementation and operational risk
- Environmental risk
- Human factors
Admittedly, viruses, trojans and dialers are issues not to be ignored, but they are widely overemphasized.
Another, more severe issue is rootkits. Lacking the medical and historical connotations of the word "virus", rootkits are perceived as something remote and arcane, hardly a personal threat.
Rootkits may be introduced into a computer system using such seemingly harmless methods as playing a CD or watching a video. They lie dormant until activated by their creators. Using stealth technology they are invisible to ordinary administrative activities. They are universal in their functionality, offering a privileged environment inside the infected computer.
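To make the stealth aspect less abstract: one classic user-land countermeasure is cross-view detection, comparing two independent enumerations of running processes, since a rootkit that hooks the interfaces used by one tool may miss another path. The sketch below assumes a Linux host with /proc and the ps tool available; it is a heuristic, not a substitute for kernel-level integrity checking.

```python
# Cross-view process detection sketch (the idea behind tools like unhide).
# Assumes Linux: /proc exists and ps is installed.
import os
import subprocess

def pids_from_proc() -> set[int]:
    """Enumerate PIDs by listing /proc directly."""
    return {int(d) for d in os.listdir("/proc") if d.isdigit()}

def pids_from_ps() -> set[int]:
    """Enumerate PIDs via the ps tool -- a second, independent view."""
    out = subprocess.run(["ps", "-eo", "pid="], capture_output=True, text=True)
    return {int(line) for line in out.stdout.split()}

def suspicious_pids() -> set[int]:
    """PIDs visible in /proc but hidden from ps -- worth a closer look.
    Short-lived processes can show up here too, so re-check before raising alarm."""
    return pids_from_proc() - pids_from_ps()
```

A mismatch between the two views is only a hint; the races caused by processes starting and exiting between the snapshots mean any finding must be confirmed by re-checking.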
Virtualisation will add one layer of complexity and uncertainty to this scenario.
Extending the system boundary, the next vulnerable technology is naming services. DNS is a highly fragile system, relying on 13 root server identities. Adding or removing one root server from the DNS net has an immediate impact on roughly 7.7% of the root name service load.
DNS offers some resilience, such as load balancing, database replication and local name caches; still, a targeted, combined denial-of-service and man-in-the-middle attack could wreak havoc in our IP-based world. And this is just the top of the tree. At the bottom, vulnerabilities exist as well.
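The load figure is simple arithmetic over the 13 root server identities, assuming queries were spread evenly across them:

```python
# Back-of-the-envelope check: with 13 root server identities
# (a.root-servers.net through m.root-servers.net), an even spread
# of root queries puts roughly 1/13 -- about 7.7% -- on each.
root_servers = [f"{letter}.root-servers.net" for letter in "abcdefghijklm"]
share_per_server = 100 / len(root_servers)
print(f"{len(root_servers)} roots, ~{share_per_server:.1f}% of load each")
```

In practice each identity is served by many anycast instances, so losing one identity does not mean losing 7.7% of capacity outright, but it illustrates the small number of targets involved.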
Another area pertaining to risk assessment is the routing of information flows. At the base, data is forwarded from sender to recipient using a combination of interacting protocols. From ARP, UDP and TCP to RIP, OSPF and BGP (to name but a few), data is packed and transferred, destinations are looked up, recipients are verified and sequences honored. At a higher level, even more protocols come into play when analyzing mail traffic, viewed web pages and the exchange of authentication credentials. Attacking one of these protocols can render the whole network useless.
And there are several attack points, both known and widely unknown, out in the wild.
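To make the layering concrete, here is a sketch of how a payload gets wrapped on its way out: first in a UDP header, then in an IPv4 header, each layer adding its own addressing. Checksums are left at zero and the addresses come from the documentation range; a real stack computes and verifies both.

```python
# Illustrative encapsulation sketch: payload -> UDP segment -> IPv4 packet.
# Checksums are left at 0 for brevity; addresses are from the 192.0.2.0/24
# documentation range.
import struct

def udp_header(src_port: int, dst_port: int, payload: bytes) -> bytes:
    length = 8 + len(payload)      # UDP header is 8 bytes
    checksum = 0                   # 0 = "no checksum" is legal for UDP over IPv4
    return struct.pack("!HHHH", src_port, dst_port, length, checksum)

def ipv4_header(src: str, dst: str, payload_len: int) -> bytes:
    version_ihl = (4 << 4) | 5     # IPv4, 5 x 32-bit header words = 20 bytes
    total_len = 20 + payload_len
    return struct.pack("!BBHHHBBH4s4s",
                       version_ihl, 0, total_len,
                       0, 0,                 # identification, flags/fragment
                       64, 17, 0,            # TTL, protocol 17 = UDP, checksum
                       bytes(map(int, src.split("."))),
                       bytes(map(int, dst.split("."))))

data = b"hello"
segment = udp_header(53000, 53, data) + data
packet = ipv4_header("192.0.2.1", "192.0.2.53", len(segment)) + segment
```

Every field in those headers is an attack surface: spoofable source addresses, forgeable sequence information one layer up in TCP, and so on, which is exactly why a single subverted protocol can poison everything stacked on top of it.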
Finally (and this is by no means an exhaustive enumeration), identification and identity management will become increasingly important. As systems become externalized and users access these services in ever more mobile and volatile ways, identity, authentication, authorization and non-repudiation will move into the focus of future security assessments.
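As one small illustration of that toolbox, the sketch below signs and verifies a token with an HMAC, a shared-key authentication primitive (true non-repudiation would need asymmetric signatures). The key, token format and user id are invented for the demo; real systems use vetted protocols and proper key management.

```python
# Minimal shared-key token sketch. SECRET_KEY and the "user:tag" token
# format are hypothetical, for illustration only.
import hmac
import hashlib

SECRET_KEY = b"demo-key-not-for-production"

def sign(user_id: str) -> str:
    """Issue a token binding the user id to an HMAC-SHA256 tag."""
    tag = hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()
    return f"{user_id}:{tag}"

def verify(token: str) -> bool:
    """Recompute the tag and compare; any tampering with the id breaks it."""
    user_id, _, tag = token.partition(":")
    expected = hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(tag, expected)
```

The point is not the ten lines of code but the management problem around them: who holds the key, how it is rotated, and how identities are provisioned and revoked across externalized systems.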
Another area of concern:
Implementation and operational issues
A quick query on CERT or SecurityFocus reveals that most current advisories deal with programming problems. Buffer overflows, unrecognized error conditions and weak, template-based programming are the root causes of these issues. If time-to-market is driving engineering efforts, no wonder we get such poor and unstable software.
Next in this chain is the implementation of systems (combined hardware and software) by following standard how-tos or simply clicking through a few defaults. For a system to function in a variety of different setups, it has to ship with low security settings. While this allows a quick start, it opens a vast array of vulnerabilities in a running environment.
Consider data and function exploitation techniques (like SQL injection), improper access to underlying data (reading native file systems) and identity spoofing, to name but a few. The number of imaginable attack vectors is uncountable.
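The first of these is trivially easy to demonstrate. The sketch below, against an in-memory SQLite table invented for the demo, shows how string concatenation turns attacker input into SQL, and how bound parameters close the hole:

```python
# SQL injection in a few lines, plus the standard fix.
# Table, user and secret are made up for the demonstration.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

hostile = "nobody' OR '1'='1"

# Vulnerable: attacker-controlled text becomes part of the SQL statement,
# so the OR clause matches every row.
leaked = db.execute(
    f"SELECT secret FROM users WHERE name = '{hostile}'").fetchall()

# Safe: the driver passes the value out of band; the quote stays data.
safe = db.execute(
    "SELECT secret FROM users WHERE name = ?", (hostile,)).fetchall()
```

The vulnerable query returns alice's secret; the parameterized one returns nothing, because the hostile string is treated as a literal name rather than as SQL.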
A highly rated issue is data backup. Millions are spent on data backup; hardly anything is invested in (bare-metal) recovery. Setting up a working backup strategy takes an initial effort (and is thus a one-time cost).
Testing the quality and feasibility of data recovery takes permanent effort (and is therefore a lasting and substantial cost factor). And while a backup covers only a selected number of threat scenarios (from full recoveries down to single-file restores), testing recovery has to deal with all possible cases. A change in an underlying system component may render the whole process useless.
Dealing with large data quantities and centralized storage (NAS or SAN) only increases the sensitivity of the subject.
Availability of computing resources is another issue. While virtualization allows computing hardware to be used efficiently, resilience to hardware and software failure becomes critical, as the hosting components are single points of failure that may affect quality of service and system integrity alike.
Threat scenarios are widening, ranging from fire to water to temperature. While the first two have always been on the agenda of even the smallest operation, temperature is becoming an issue in the workplace environment. As CPUs and graphics cards produce more and more heat, and computer systems become ubiquitous, heat emission will keep increasing.
I cannot go into every aspect of environmental risk here. Suffice it to say that the company's industry, the typical usage of its IT systems, even legal issues and changing regulations all influence environmental risk.
An arbitrary example might be a sales representative of a company doing business in Asia. Due to restrictive information policies in some Asian countries (e.g. China, Burma), content stored on a computer hard disk might be seen as an offense by the local authorities (carrying a PDF covering the October revolution in China makes for a little adrenaline peak). To this day, possession of computers in Burma is strictly prohibited, with offenders facing the death penalty.
Another widely ignored factor is the dependency on a single monopolist. Currently there is one major vendor dictating formats for data storage and communication, even software update cycles. This monopolist leverages each of its systems to gain more control and to suffocate technological innovation that is incompatible with (and therefore unwelcome to) its overall strategy.
The human factor
What can I say. Under-educated, ill-trained, so-called computer experts; time-to-market driven decisions about system releases; cost-based human resource policies. All of these are security issues.
Hiring a novice programmer might reduce labour cost in the programming department. It will surely increase labour cost in the help desk and call center. Does it make systems more reliable, more secure? No.
Does it make the CFO happy? Yes.
He can always point to the low wages.
Outsourcing? Let's ship our development to India. Let's move our help desk to India. Let's move our accounting to India. Labour costs for these activities drop (for how long, may I ask?). But the understanding of risk, the correct assessment of security issues, still lies with the company.
More so, due to increased communication and repair efforts, costs will rise, effectively shifting expenses from production to communication and repair.
These are the security issues I really see and anticipate for the future. In a few years there will still be some viruses, some adware and some hardware failures, but they will hardly be covered by the press.
We will see more incidents hitting one or another of the areas I described above. And the impact will reach beyond a single computer or company. It will hit communities, regions and industry segments.