Lifted from http://www.infowar.com/mil_c4i/mil_c4id.html-ssi
First Steps Toward a Defense
by Michael Wilson.
Though this be madness, yet there is method in 't.
-- Shakespeare, _Hamlet_
I've written a great deal about information warfare and the application of technology to areas of pursuit such as guerrilla warfare and terrorism. What about the other side of the coin? Can a defense be mounted? Certainly.
First, a few constraints:
No Reduction of Civil Liberties
Why must security seemingly be purchased at the cost of the liberties that make life worth living? What have prior purchases of 'security' bought us? Are we truly safer, more secure? Have crime, terror, and war gone the way of the dinosaur? Of course not. Therefore I will not posit nor endorse any further reduction of civil liberties; I will even endeavor to strengthen a few.
No Empty Gestures
The only interpretation one can give to anti- or counter-terror measures, other than the cynical portrayal of politicians as hidden fascists, is that politicians are being political--fond of empty gestures. Like chrome on a car, such gestures serve no purpose; not being a politician, I will pass on the option of doing something to 'look good' or, Soviet-like, justifying my existence by taking action, any action, regardless of applicability or utility, not to mention consequence.
My approach is to propose the initiation of a number of measures and projects to establish a defense-in-depth approach. Strategic elements fall in the domains of technology; social policy and initiatives; and military, intelligence, and law enforcement work. As the name of this article states, these are the first steps; the threat is developing, and so must the defense. This is like the recognition of the tank or helicopter on the battlefield; speculation and experimentation at first, evolving into important, potent tools later--and always playing with measures/countermeasures along the way.
Technological Steps
Engineering Rigor
Engineering is about trying and failing--then learning from mistakes and trying again. And again, and so on. Sometimes, in the process of those mistakes, people die--bridges and buildings collapse, airplanes fall out of the sky, ships sink. Sometimes the fear of people dying complicates the engineering process to the point where the process becomes almost unworkable--such as in pharmaceutical development. Sometimes the conditions engineered for change, or we learn things that change how we view the problem--like with chemicals, or radiation.
In some fields of technology, there isn't a whole lot of engineering rigor going around these days. Look at the trouble expected from the switch of the calendar year to 2000 and you'll see a simple example. You don't know what you don't know, you can't adequately plan for the future, and nothing gets used quite as it was intended. Technology is hard enough to get working as it is--it doesn't happen on its own, and if a capability isn't explicitly built into the hardware or software, the system simply doesn't have it. This should be understood before responsibility for large, critical portions of the political economy is handed over to the machines. It means changing the education of engineers to change the engineering process--research, development, testing, evaluation, utilization, consequences.
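As a rough illustration of how thin that rigor can be, consider the assumption at the heart of the year-2000 trouble: two digits were 'enough' for a year. A minimal sketch in Python follows; the expiry check is hypothetical, standing in for real legacy code.

# Minimal sketch of the two-digit-year assumption behind the year-2000
# problem; the expiry check is hypothetical, standing in for legacy code.

def is_expired(expiry_yy, current_yy):
    # Compare two-digit years the way much legacy code did.
    return expiry_yy < current_yy

# A card expiring in 2001 ('01'), checked in 1999 ('99'):
print(is_expired(1, 99))    # True -- wrongly reported as expired
# The same comparison with four-digit years behaves as intended:
print(2001 < 1999)          # False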
Safe Fail Designs
From the design phase on out, failure of systems (or subsystems, or remote systems, or users) should be expected. Not if, when. While not every particular real-world failure can be predicted in advance (nice to know that some element of human behavior is forever beyond prediction), certainly more effort can be undertaken. Outside labs such as Underwriters Laboratories test products, but by then it is already too late, and they don't test much of what society depends upon daily. Safe fail designs need to be devolved down into every development project. This means computers, software, and networks; it means all those complex systems they are part of, and all the systems those systems interact with. And that's just the start.
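A minimal sketch of the idea, assuming a hypothetical valve controller: whatever goes wrong inside the control step, the design drives the system back to a known safe state rather than leaving it wherever the failure happened to occur.

# Sketch of a safe-fail default; the valve interface is hypothetical.

class Valve:
    def __init__(self):
        self.position = "closed"    # closed is the defined safe state

    def set_position(self, position):
        self.position = position

def run_control_cycle(valve, compute_setting):
    try:
        valve.set_position(compute_setting())
    except Exception:
        # Fail safe: any unexpected error returns the valve to the safe state.
        valve.set_position("closed")
        raise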
Forcing Factors
Forcing factors are design elements that require proper, defined usage for a system to work. You can't successfully drive a vehicle unless you are in the right seat for it--this is a physical forcing factor. Security systems use forcing factors--they require a confrontation before access is allowed (usually authentication--keys, passwords, tokens). Forcing factors can be cheap and effective when built in rather than added later (automotive alarm systems, for instance). Much of what passes for security has no forcing factors--sleepy security guards, or locked doors and open windows. Some forcing factors mis-assess the threat, like protecting a nuclear power reactor from protesters or takeovers while leaving critical subsystems (like cooling circulation) exposed to destruction without the attacker ever having to confront a security measure.
Forcing factors are essential to good secure system design; defense-in-depth strategies layer forcing factors so that each layer covers the weaknesses of the others and overall system integrity is maintained. Social systems at risk rarely have forcing factors; there is a trade-off between freedom and restriction, but again, such factors are designed to meet threats--and free, responsible behavior is not a threat.
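A minimal sketch of a forcing factor in software (the shared secret and token scheme are deliberately simplified and hypothetical): the privileged operation is only reachable through a gate that demands authentication, so there is no path around the confrontation.

# Sketch of a software forcing factor: no valid token, no action.
# The shared secret and token scheme are simplified and hypothetical.

import hashlib
import hmac

SECRET = b"shared-secret-provisioned-out-of-band"

def token_for(user):
    return hmac.new(SECRET, user.encode(), hashlib.sha256).digest()

def privileged_action(user, token):
    # The forcing factor: the only entry point confronts the caller first.
    if not hmac.compare_digest(token, token_for(user)):
        raise PermissionError("authentication required")
    return "action performed for " + user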
Hardware/Software Designed for Threat
This is obviously easier said than done. The U.S. National Security Agency (NSA) has had long-running projects to design and implement secure computing technology with minimal success--the technology keeps advancing in front of them. Even as a branch of the government, the NSA hasn't had the resources to keep pace; I personally believe this is why every so often they try to hit the brakes and slow the juggernaut (see NSA and civilian cryptographic usage policy for an example). The approach is obviously silly and hasn't a prayer of succeeding, which is just as well--since it is exactly the wrong strategy.
Some parts of the NSA's approach to security are sound; for instance, the Rainbow Series advocates going back to the design phase when security flaws are found. Don't patch the hole, cure the logical design concept that led to it. Integrity is like a balloon: no matter how good the rubber, the air still leaks out the hole. This doesn't mean throwing out the unsecured systems we have now--in fact, a lot of good could be accomplished with a major effort to close the existing holes in design and implementation (yes, patches). Society needs to keep functioning while improvements are made, and computers are now essential. But it does mean an effort must be started at every major hardware and software developer--no more pitiful efforts by amateurs. So follows the next point.
Strong Cryptography
Put good, strong, fast cryptography where it is needed--in every computer, on the motherboard. No key escrow--that just leaves the key management body as judge, jury, and executioner, not to mention a sitting target; no weakened keys--that makes cryptography value subtracted, since it takes time but provides no protection. Crypto such as DES or RSA should be readily available to everyone with a computer; while we're at it, who knows that there isn't something even better at the NSA? Open the technology up--get out the strong crypto, security, authentication, etc. Ship the scientists out from Fort Meade to computer hardware and software developers, or ship the engineers to Fort Meade, or at least publish what the market needs to know. Think of it as investing the Cold War peace dividend to help strengthen the society to weather the next wars. Moving the national network to a point-to-point, session-to-session encryption capability would be, as they say, a startling development.
Hardened Computers
I don't believe in computer security--I've had too much luck playing the other side of the field. I don't believe in secure systems, that NSA Orange Book Platonic Ideal. Go ahead and isolate the computer behind a locked door with no network connection; you still use an operating system and applications, and those (and the tools they are built with) aren't built on isolated systems.
I believe in cryptography and communications, and oddly enough, those can be solutions that work better than obscure technological attempts to prevent data from doing what it does best--move. That's why crypto is great--let the data move, but reader makes right. If you haven't got the key in a strong crypto world, you better get used to the noise.
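A minimal sketch of 'reader makes right', using the third-party Python 'cryptography' package's Fernet recipe purely as a stand-in for any strong authenticated cipher: the data moves freely, and without the key it is just noise.

# Sketch of "let the data move, reader makes right". Fernet (from the
# third-party 'cryptography' package) stands in for any strong cipher.

from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"your recipe for soup")

# The holder of the key recovers the plaintext...
print(Fernet(key).decrypt(ciphertext))

# ...anyone without it gets nothing but noise.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("without the key, you had better get used to the noise")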
The impact of crypto on security can be profound--call it cryptosecurity as an approach:
Data Security
If data is stored or on the move, only you or the intended party/parties can make sense out of it; whether your recipe for soup or your credit information, your privacy is yours.
Anti-viral
Software systems can catch the signature of the known virus, but not the unknown--the first sign of trouble can already be too late. Cryptosecurity stops viruses by stopping infection or execution. On boot, the operating system can be confirmed with cryptographic checksums (for instance, DES's Message Authentication Codes, or MACs for short) prior to execution, as can all applications; if the MACs don't match those of a 'clean' MAC'ed application, you don't execute the application. If all stored applications and data are encrypted, an extra layer of protection is present--decrypting the application or datafile actually turns the virus into unexecutable noise (since it appended itself to an encrypted file). The system is necessarily more complex than this (for instance, maintaining memory maps and referencing against MAC'ed file loads prior to privileged functions to prevent datafile virus execution, etc.), but prototyping has shown the process to work effectively.
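A minimal sketch of the check-before-execute step, with HMAC-SHA256 standing in for the DES MACs mentioned above; the known-good manifest and key handling are hypothetical.

# Sketch of cryptosecurity's check-before-execute: compute a MAC over the
# stored binary and compare it to the value recorded while the system was
# known to be clean. HMAC-SHA256 stands in for DES MACs; the manifest and
# key handling are hypothetical.

import hashlib
import hmac
import subprocess

MAC_KEY = b"machine-local-mac-key"
KNOWN_GOOD = {}    # path -> MAC recorded at install time

def mac_of(path):
    with open(path, "rb") as f:
        return hmac.new(MAC_KEY, f.read(), hashlib.sha256).digest()

def record_clean(path):
    KNOWN_GOOD[path] = mac_of(path)

def run_if_clean(path):
    # Refuse to execute anything whose MAC no longer matches the clean copy.
    if not hmac.compare_digest(mac_of(path), KNOWN_GOOD.get(path, b"")):
        raise RuntimeError(path + " failed its MAC check; refusing to execute")
    return subprocess.run([path], check=True)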
Anti-malice (virus, hacker)
The hardware needs to trap calls to specified functions (delete, formatting, etc.) and require a token (a password provided to the hardware subsystem, or a physical key turned) prior to approval.
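A minimal software sketch of the same idea (a guarding layer approximates the hardware trap and physical key; the token is hypothetical): the destructive call simply cannot proceed until a recognized token is presented.

# Sketch of trapping a destructive call behind a token; purely illustrative.

import hmac
import os

ARMED_TOKENS = {b"operator-hardware-token"}    # hypothetical provisioned token

def guarded_delete(path, token):
    # The delete is refused unless a recognized token is presented first.
    if not any(hmac.compare_digest(token, t) for t in ARMED_TOKENS):
        raise PermissionError("destructive operation blocked: no valid token")
    os.remove(path)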
Miscellaneous Safety Functions
Saves aren't overwrites, but are layered for transaction rollback and checkpointing--copious storage makes this cost-effective in most cases. Dense sites should use WORM drives; regular backups and frequent checkpointing are necessary practices; redundant systems are essential in critical systems.
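A minimal sketch of 'saves aren't overwrites' (the versioning scheme is hypothetical): every save lands in a fresh numbered copy, so any earlier checkpoint can be rolled back to.

# Sketch of layered saves for rollback and checkpointing; the .vN naming
# scheme is hypothetical.

import os

def versioned_save(path, data):
    n = 0
    while os.path.exists(path + ".v" + str(n)):
        n += 1
    version = path + ".v" + str(n)
    with open(version, "wb") as f:    # never overwrite an earlier version
        f.write(data)
    return version

def rollback(path, n):
    with open(path + ".v" + str(n), "rb") as f:
        return f.read()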
Heterogeneous Systems/Networks
Biodiversity makes for resilient ecosystems, and technodiversity makes for a resilient infosphere; wide-spread standards make standard attacks that much more leveraged, and dangerous.
[The specifics on cryptosecurity require significant ongoing research and development by skilled scientists and engineers, but are encouraging areas to explore.]
Social Steps
Community Education
Threats and countermeasures need to be explained at all levels--to engineers, students, politicians, business leaders, etc. Disclosures (read 'evangelism') on cryptography, security, information warfare, terrorism, etc. need to be made. Such action would reduce the value of the moral and material surprise that any harmful operation against the community may carry. It also enlists the general community in the cause, rather than leaving its members as helpless casualties.
Root Causes
Opposition forces using tactics to effect social change--whether bombs or bytes--are doing so for a reason. While there are certainly criminal or mentally ill perpetrators, there are also those who are where they are for a reason. Odd as it may seem, sometimes the only rational reaction to an irrational situation is irrational action--or so it seems. Blanket criminalization (including the truly moronic (and hypocritical) statements such as "We don't deal with terrorists and criminal elements.") removes options for these players, and gives them no other recourse than to act as criminals (the Nostromo dilemma, see Joseph Conrad's _Nostromo_: if you are labeled to your detriment, you may as well act the way they expect you to). The triggers or causes for behavior should be sought out and dealt with; who knows what possibilities it might create?
Lock Personal Data
You own your data--too much harm can occur to you if your data (credit, medical, legal, etc.) end up in the wrong hands. Nobody should be able to view your data without your permission; enforcing this will become much easier when strong crypto and authentication are more prevalent, but even then, access shouldn't be easy.
Honest Media Relationships
The voice to the people needs to be put to good use; first though, trust needs to be reestablished. All the players in this game--government, free market, media--need to stop scare tactics, spin control, hidden agendas, mythmaking, and personal aggrandizement. The issues are confused enough without making fuel for every 'Oliver Stone'-esque conspiracy theorist's fire.
Military, Intelligence, Law Enforcement
Jurisdiction
The jurisdictional muddle needs to be cleaned up; how to resolve it will be a troublesome issue, however. Law enforcement agencies like the FBI see the actors as criminals, which colors their perceptions and actions more than a little. Intelligence has a more balanced view of the world, but has a "water's edge" difficulty, as well as the difficulty of giving clearances to people it may rather not wish to. The military mindset led to troubles such as we have now with the NSA, where the priorities are antithetical to solving the problems. The free market doesn't have the powers, budget, or access to tackle the problem on its own. The solution is likely some sort of new agency composed of all the mentioned elements, although the anarchist in me screams against the creation of yet another body.
Recruit Inside the Community
The paradigm mismatch between how government bodies and the free market (not to mention the opposition forces) view the issues is crippling. These organizations need to get 'local guides' who know the language, customs, and terrain. Such domain expertise would help them stop making fools of themselves, not to mention start addressing the problems. It also provides a necessary positive element of value to the community, addressing one of the reasons individuals are drawn in in the first place: the need to belong and be valued.
HUMINT
The days of SIGINT strength are already fading; computing power and knowledge have given the opposition forces that want or need them the tools to secure themselves. HUMINT sources need to be developed--you learn intent, capabilities, plans, etc. in advance and possibly on an interactive basis. Of course it's hard to develop HUMINT sources, but that doesn't mean you don't do it anyway.
Tradecraft
Opposition forces have their tradecraft, but so do intelligence and law enforcement--shoe-leather research, manpower, resources. Certain elements of the opposition forces' tradecraft are susceptible to analysis; for instance, all senders and receivers of remailer traffic can be studied and mapped, as can all traffic flow in the remailer net. It doesn't violate any civil liberties; in fact, it is almost an expectation (read the remailer threat assessment some time). Strengths need to be challenged, and weaknesses need to be forced to failure. This means following the traffic, following the money, etc.
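A minimal sketch of that kind of traffic mapping (the record format is hypothetical): tally sender/receiver pairs observed on the wire, never touching message content, and the heaviest flows emerge on their own.

# Sketch of mapping traffic flow from metadata alone; no message content
# is examined. The record format is hypothetical.

from collections import Counter

def map_flows(records):
    # records: iterable of (sender, receiver) pairs observed on the wire.
    flows = Counter(records)
    return flows.most_common()    # the heaviest edges are worth following

print(map_flows([("a", "remailer1"), ("remailer1", "b"), ("a", "remailer1")]))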
Rapid Response Team
Since the timing is a matter of when, not if, a response should be prepared. Threats should be modeled, games played, tools built, measures planned--try to prevent fires, but still have a fire department. Team members should be domain experts in related fields, but cross-disciplinary enough to handle a widely divergent set of crises.
Propaganda Countering Team
PsyOps is a critical component of potential future conflicts, and measures need to be taken to cope with attacks and the aftermath. Nothing cuts through like the truth, but the team needs to establish trust and earn the reputation capital in advance to be able to spend it when the crisis hits.
Conclusions
This is, I hope, a living list; measures will be thought up and initiated, and the feedback loop between opponents will become faster, denser, and more intense as the years go by.
But--you have to start somewhere, you have to start something, and you have to start sometime. Why not here, now, and with these?
Michael Wilson [5514706@mcimail.com]
Managing Director, The Nemesis Group
Copyright 1996 by author. All rights reserved.