The Importance of Truth: Information Warfare as the Ultimate Weapon
By John D. Turner
27 Dec 2006

Have you ever considered how much of everything we do each day depends on the concept that what we perceive is true? Truth underlies almost everything we take for granted.

Our legal system is grounded in the belief that witnesses will tell the truth; that once under oath, whatever is said on the stand will be “the truth, the whole truth, and nothing but the truth”. If witnesses lie, justice is cheated; the guilty are wrongly freed and the innocent erroneously jailed. Our medical system is based on truth; when you visit the doctor, you expect that the diagnosis is made based on the doctor’s knowledge of medicine and your symptoms. When they prescribe medication, you take it on faith that the diagnosis is correct and that what they prescribed for you is exactly what you need to make you better. Most Americans believe that what they read in their newspapers and hear on their TV news programs is the truth, or at least that those presenting the news are not intentionally lying to them.

All around us each day, we filter our input through a lens that accepts what we hear, see, smell, feel and taste as “truth”. Most of this input is passed straight on through and accepted as “true” without further thought. It is the “normal” input we receive on a daily basis. For example, we get on the freeway and merge in with the traffic flow. We accept as “truth” that the cars around us are filled with people who are typically “like us”. If the traffic is flowing at 70 mph in a 60 mph zone, we flow right along with it. If we see a car with a rack of lights on the top, we may slow down, perceiving the “truth” of such a sighting as a police car. If not, we go about our merry way.

But what if we hear on the radio that the local police are out patrolling heavily in unmarked vehicles? What if we see such an unmarked vehicle with its “catch” on the side of the road? How do we react? Suddenly, every car we see is suspicious. We can’t know until we are right next to it whether it is an “innocent” vehicle or a “marauding” police cruiser in disguise. Intellectually we know that the odds of our encountering such a vehicle are small, and that if we are going the same speed as everyone else, the chances that we individually will be singled out are slight; nevertheless, we slow down anyway.

As does everyone else who is thinking exactly the same way.

Our concept of truth has just been changed and our behavior modified accordingly. What we had previously perceived as “truth”, that police cars could be uniquely identified by the lights mounted on their roof, giving us sufficient warning to react to their presence, has been proven false. Now, additional mental processing must be devoted to ensuring that every car we see is not a threat, rather than relying on pattern recognition to flag the threat for us.

The concept of “untruth” is a learned concept. We typically perceive most of our surroundings from a perspective of truth. Untruth is an “exception” condition and requires further processing of the data stream. Once we encounter the untruth, we must then decide how it affects us and plan a course of action. That plan can be to counter the untruth, respond to the untruth, or ignore it if we perceive that it doesn’t affect us. For example, I might notice that unmarked police car on the side of the road, and correctly infer that my assumption that none of the cars around me are police cars may be flawed, and yet be unconcerned because I am driving the speed limit.

For the most part, we accept our perceptions as truth unless forced to do otherwise. This is normal human behavior. Typically it takes something unpleasant happening to us personally, the perceived potential of such an event happening to us, or a general conditioning concerning a potential event over a period of time, to change this behavior. An example of this might be locking the doors of your house when you leave. For someone raised in the country, this might be a foreign concept; one that changes rapidly when they move to the city and someone enters their home while they are gone and steals something. For someone raised in the city, this is a normal part of everyday living. In the first case, the “truth set” that would allow the door to stay unlocked has been violated. An exception flag has been raised, and a new response, locking the door, is now required. In the second case, locking the door is a conditioned response and has been part of the “truth set” from the beginning.

As a software engineer, I know that when software is written, it must first be tested before it is released to remove as many errors or “bugs” as possible. We won’t get them all; in any but the most trivial piece of software there will be latent defects. Testing doesn’t guarantee the absence of errors; it merely attempts to find the worst of them and ensure that any that do turn up won’t be “show stoppers”. Software testers know that the best place to find errors is “at the edges”: places where one piece of software interfaces with another; at “boundaries” where a value on one side of the boundary is “true” and on the other is “false”; where “off by one” errors can occur; and where an accumulation of data can “overflow” the area set aside to contain it.
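To make those “edges” concrete, here is a toy sketch (my own invention, not drawn from any particular project) of a classic off-by-one lurking at a boundary, and the kind of boundary-focused testing that flushes it out:

```python
def last_n_items(items, n):
    """Return the last n items of a list.

    A classic off-by-one hides at the boundary n == 0: a naive
    return of items[-n:] evaluates to items[-0:], which is the
    ENTIRE list -- the opposite of what the caller asked for.
    """
    if n <= 0:           # guard the boundary explicitly
        return []
    return items[-n:]

# Testers probe the edges: 0, 1, exactly the length, and one past it.
assert last_n_items([1, 2, 3], 0) == []          # the off-by-one trap
assert last_n_items([1, 2, 3], 1) == [3]
assert last_n_items([1, 2, 3], 3) == [1, 2, 3]   # at the boundary
assert last_n_items([1, 2, 3], 4) == [1, 2, 3]   # one past the boundary
```

Each assertion sits at an edge where “true” flips to “false”; the interior values, as the testers’ folklore predicts, rarely hide anything interesting.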

When these conditions occur, bad things can happen. Effects can occur that the programmer never planned for. These can range from minor nuisances to major catastrophes for the user, for whom “truth”, a bug-free application environment, has now become “untruth”, an environment where the unexpected can happen and all your work can be lost at a moment’s notice, prompting a new response: save your work frequently!

In war, we also look for interfaces, and attack them to produce desired results on enemy forces. For example, you may not have the necessary force to attack an enemy armor unit directly. However if you can successfully attack the communications link that connects it to its headquarters, you can perhaps prevent it from responding to an attack elsewhere or at least delay its response. If you can successfully interdict its fuel supply, you may in fact be able to stop it in its tracks (no pun intended). In maneuver warfare, a unit that can’t maneuver isn’t very useful.

But what if the targeted interface isn’t physical? What if it is something more fundamental than a communication link, or what we in the military like to refer to as a “center of gravity”? What if it is something very basic, like your perception of “truth”?

Perception of truth as a military target is not new. Throughout history, offensive and defensive military deception has been a part of warfare. Defensive deception is common. Examples include camouflage netting and uniforms that attempt to conceal the location of forces. Other examples include decoy guns, tanks, planes, etc., that attempt to get the enemy to spend time and munitions attacking objects of no value while the real weapon systems are protected or off doing something else.

Offensive military deception can be much scarier. In fact, certain types of offensive military deception are so scary that the rules of war allow them to be dealt with harshly or even ban them completely. A modern offensive example comes from the Battle of the Bulge in 1944, when German soldiers dressed as American MPs infiltrated U.S. forces at the critical moment of the German advance. The ploy was countered, but not before sowing confusion and mistrust among U.S. troops, who could no longer take an American uniform for granted without verifying that each person wearing one was, in fact, an American. This is a problem similar to the unmarked police car, but with deadly consequences instead of just an inconvenient speeding ticket. Truth was attacked. A paradigm was shifted. A new course of action had to be taken and learned. Valuable time was spent verifying what had previously been known “truth” that was now suspect.

And what of the Germans who conducted this operation? When enemy soldiers are captured during battle, they are to be treated in accordance with the Geneva Conventions and other “rules of war” agreed to by signatory nations. An enemy soldier is identified by the uniform of his particular armed forces. Soldiers captured wearing your uniform instead are regarded as spies and can be executed. The Germans knew this going in, and in fact, this is what happened to many of them.

Other actions of this type banned under the rules of war include attacking under a flag of truce, using noncombatant status to disguise an attack (primarily misuse of the Red Cross emblem to gain military advantage), and feigning surrender to draw an enemy out where he can be attacked.

Such tactics may indeed be difficult if not impossible for your enemy to counter. They may gain you momentary advantage. However, once you start down that path, your enemy probably will as well. The tactics which seemed so good when you used them will not seem so good when your enemy begins using them too; best not to go down that path in the first place.

All of these actions have one thing in common: they attack the perception of “truth”, of what is expected. These activities are put off limits by nations because the consequences of their employment are ultimately more detrimental than the momentary advantage one may gain by employing them. As individuals, we consider them treacherous or cowardly acts.

Now we move into the modern Information Age. Information flows are everywhere, more information than we could ever process. When I was a kid, we had one television station, which was off the air from midnight until six a.m., and cable was for rich folks. Today, we have 500 stations at our fingertips and TV is 24x7; my kids have never known a world without cable and have no idea what a “test pattern” is. When I was young, “the news” came on at 6 pm, and there were only three networks providing it. Now news is 24x7 also, and you have multiple sources from all around the globe. It used to require a shortwave set (and luck) if you wanted to hear the BBC, for example. Today you can watch British television on cable or satellite, or stream it across the Internet with perfect fidelity.

We have movies, whose special effects are better than the real thing. Indeed, I have heard it said by those who think the space program a waste of money, “Why send probes to Mars to take pictures, when Hollywood can do it so much better so much cheaper?”

Want to put the boss’s head on Paris Hilton’s body? Photoshop will let you do that in the comfort of your own home. Want to put the boss’s head on the body of one of the people in the background in the picture of the police raid on the local massage parlor? You can do that too.

Hollywood has released movies and TV shows and commercials where dead actors take on new life, where current actors appear in scenes shot long ago; CGI effects that appear “real”. And this technology is not available just to Hollywood; the video equivalents of Photoshop and powerful, modern computers make this available to the hobbyist as well.

We make decisions every day in life based on information flows that we trust. Our perceptions of the news are shaped by the news sources we trust. Some people trust the news from the major networks. Some trust CNN. Some trust FOX. Likewise, some sources are distrusted – what one person trusts may be distrusted by another. I tend to watch FOX, for example, because my perception is that they are indeed more “fair and balanced” than the alternatives, even though there is a conservative bias. Other people I know trust CNN and wouldn’t watch FOX if it were the only channel left on the planet.

When we go to the store, we trust that the bar code scanner accurately and correctly translates that bar code into the proper price; that the price in the computer matches that on the shelves, and the barcode on the item is in fact the code for that item. We certainly can’t read it ourselves!
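In fact, the system quietly guards that trust: the last digit of a UPC-A bar code is a check digit computed from the other eleven, so a single misread digit makes the scan fail rather than ring up the wrong item. A minimal sketch of the standard calculation:

```python
def upc_check_digit(digits11):
    """Compute the UPC-A check digit for an 11-digit code string.

    Per the UPC-A scheme: digits in odd positions (1st, 3rd, ...)
    are weighted 3, digits in even positions are weighted 1, and
    the check digit brings the grand total to a multiple of 10.
    """
    odd = sum(int(d) for d in digits11[0::2])   # 1st, 3rd, 5th, ...
    even = sum(int(d) for d in digits11[1::2])  # 2nd, 4th, 6th, ...
    return (10 - (3 * odd + even) % 10) % 10

# The oft-cited first UPC ever scanned (a pack of Wrigley's gum, 1974)
# was 036000291452: eleven data digits plus the check digit 2.
assert upc_check_digit("03600029145") == 2
```

Flip any single digit and the checksum no longer works out, which is why a smudged label produces a beep of refusal rather than a wrong price.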

Trusted sources are, well, trusted. Data flows from one trusted source to another. If I can insert myself into that trusted data stream (a so-called man-in-the-middle attack), then I, and the information I provide, become trusted as well. This gives me the ability to manage the data stream, to “massage” it, and by extension, the perceptions of the people using it, in the direction I want it to go.
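As a toy illustration (the names and messages here are invented for the example), a man in the middle doesn’t need to break the channel at all; he only needs to become it:

```python
def honest_channel(message):
    # A trusted relay: passes the data stream through unchanged.
    return message

def man_in_the_middle(message):
    # The interceptor relays the stream but "massages" its content.
    # Crucially, it has the same interface as the honest channel.
    return message.replace("attack at dawn", "hold position")

def receive(message, channel):
    # The receiver trusts whatever arrives; it cannot tell one
    # relay from the other, because both look like "the channel".
    return channel(message)

print(receive("attack at dawn", honest_channel))     # attack at dawn
print(receive("attack at dawn", man_in_the_middle))  # hold position
```

The receiver’s code is identical in both cases; only the unseen middleman differs, which is exactly what makes the attack on a trusted stream so hard to detect from the inside.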

Such information management is part of a “new” kind of warfare, called “Information Warfare”. It really isn’t new; it has been around for ages. But our modern information age can make it much more effective than ever before. And the target isn’t the military, it’s you. The objective is to get you to do something that the targeteer wants you to do, or to get you to not do something the targeteer doesn’t want you to do. The targeteer comes in many forms; the company advertising a product they want you to buy; the politician who wants you to pull the lever for them; the special interest group that wants your support; those we are fighting who want us to quit.

With our global connectivity, it is very easy for someone on the other side of the globe to reach out and touch you. As if things weren’t confusing enough, information warfare as a component of asymmetric warfare means that in today’s world, your perceptions of reality are now a target in a much larger conflict. The news can be and has been manipulated to create perceptions of reality intended to evoke responses favorable to the designs and aims of the perpetrators.

Examples include pictures coming out of the Israeli–Hezbollah conflict, and pictures and news stories coming out of Iraq: carefully staged pictures and stories of events that never occurred, or that, if they did occur, were very different from the “reality” presented. These events and pictures were not initially questioned because, first, they came from “trusted” sources and, second, they conformed to pre-existing media biases.

Such things can be very difficult to find, and once found, can be difficult to prove. And even if they are eventually proven to have been deliberate manipulation and fabrication, it may be too late – the damage may already be done.

These things are especially pernicious, because they strike at the heart of something that is fundamental to our being able to function in the world: our perception of truth.

What if we could no longer believe anything? What if everything you saw on TV suddenly became suspect? You get an announcement over the TV that a threat has just been received that a nuclear weapon has been planted in your city and is set to go off in three hours. How do you know it is real? You switch to CNN, and they aren’t carrying it. Whew! Glad I checked. It must be a hoax. Except what if CNN were the vehicle used to carry the message? Once inside a “trusted” system, the news is trusted. All the other networks will pick it up as well. Sure, it will probably get sorted out after a while, but in the meantime…

Or turn it around: what if the CNN news feed is the one being manipulated, so that it does not carry information about the threat? Then when you checked CNN, “the most trusted name in news”, and found nothing, you would incorrectly conclude that the story was a hoax. Wheels within wheels. What is truth and what is not? What if all information sources could no longer be trusted? What would you do? How would that affect the nation? Could a nation, dependent on information flows, survive if those flows were no longer reliable?

Because it targets people’s perception of reality, because today’s society provides the means to corrupt that reality more easily than ever before, and because the target is largely undefended and attackable by virtually any group anywhere on the globe without going through the U.S. military first, perception management as a tool of Information Warfare is a weapon of choice in today’s world of asymmetric warfare against the United States. It is a weapon that can be wielded by nation states, criminal and terrorist groups, and individuals.

Because it has the capability to undermine our perceptions of “truth”, information warfare, carried to its extreme, has the capability to undermine everything. If you can no longer believe anything you see or hear, if everything suddenly becomes an “exception condition”, paralysis results. Imagine 300 million individuals for whom no communication can be regarded as true. The result would be mass chaos, death and destruction on a national scale.

Ultimately, information warfare, out of control, can be more powerful than nuclear bombs.