[wordup] How to start a language virus.
Adam Shand
adam at personaltelco.net
Mon Oct 8 12:48:08 EDT 2001
MeatBall is one of my favorite wikis. It's dedicated to the building and
understanding of online communities, and as such its members discuss some
pretty interesting things.
The article below discusses how internet memes (the Burroughs-esque
"language is a virus" deal) get started, and how the anonymity or
pseudo-anonymity of the internet can help them spread.
Anyway, while not directly relevant, I thought it was something worth
keeping in mind as the war machine begins to crank up. Keep your wits
about you, folks.
There are lots more links on the web site; the question marks and
RunTogetherWords are part of the wiki deal, don't worry about it :-)
From: http://www.usemod.com/cgi-bin/mb.pl?PhonyFlood
In spaces without some method of accountability, content has limited
credibility. For instance, in a completely anonymous system, there is no
accountability: all interactions are "one-time identities," so there is
nothing but a whiff of a shadow to hold responsible.
Alternatively, in a weak pseudonymous space, there is no way of
guaranteeing that the pseudonym isn't being co-opted by different people at
different times. In a strong pseudonymous space, there is no way of
holding the underlying person accountable in the RealWorld. Moreover, the
Internet loses ParaLanguage and BodyLanguage, so it's easier to lie. (See
ImportanceOfIdentityInOnlineCommunities for more on this.)
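To make the weak/strong distinction concrete, here's a minimal sketch -- not
from the wiki page, and assuming Python with the third-party "cryptography"
package -- of the idea: a weak pseudonym is just a name string anyone can
attach to a post, while a strong pseudonym is a signing key, so readers can
verify that the same key-holder wrote every post under that name without
ever learning who holds the key.

# Illustrative sketch only; assumes: pip install cryptography
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Weak pseudonym: just a claimed name. Anyone can reuse "alice".
def weak_post(name: str, text: str) -> dict:
    return {"name": name, "text": text}  # nothing ties the post to its author

# Strong pseudonym: posts are signed, so the same key-holder is verifiably
# the same author, yet the key maps to no RealWorld identity.
class StrongPseudonym:
    def __init__(self):
        self._key = Ed25519PrivateKey.generate()
        self.public_key = self._key.public_key()  # publish this as the "identity"

    def post(self, text: str) -> dict:
        return {"text": text, "sig": self._key.sign(text.encode())}

def verify(public_key, post: dict) -> bool:
    try:
        public_key.verify(post["sig"], post["text"].encode())
        return True
    except InvalidSignature:
        return False

if __name__ == "__main__":
    alice = StrongPseudonym()
    genuine = alice.post("I saw nothing unusual that night.")
    forged = {"text": "I confess to everything.", "sig": genuine["sig"]}
    print(verify(alice.public_key, genuine))  # True: same pseudonym, same author
    print(verify(alice.public_key, forged))   # False: words can't be put in her mouth

Note that this only buys consistency of authorship; as the wiki page says,
it still gives you no way to hold the underlying person accountable.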
Those people who understand this will attempt to corroborate important
information on the Internet by cross-checking it against other
information. Usually, they will cross-check it against other information
on the Internet first because that's the easiest. However, note that
information on the Internet is not validated, as noted above. Feeding into
this, information on the Internet moves rapidly, à la memes (see
VirianLexicon), because it is easy to copy on the Internet. Thus,
duplicated information doesn't necessarily mean independently derived
information. Finally, feeding into this, the Internet is not an edited
forum; anyone can publish and republish, which means it's possible to
create an idea at the "grassroots" level without having it marshalled
through fact-checkers. All one needs to do is appear convincing. If one is
really good at this, one may induce a mass hallucination, a self-reinforcing
fiction along similar lines to FollowYourNeighbour?.
Therefore, to thoroughly mislead the public, one need only create a
seemingly trustworthy yet false account. Flood the space with phony facts
and disinformation that, taken together, seem to reinforce whatever idea
one wishes to create. Furthermore, because in these spaces there is no
external traceability, it is also trivial to assassinate someone's
character by putting words into their mouth.
Note that while it is trivial to disconnect nodes physically flooding the
network in a short period of time (those perpetrating a DenialOfService),
a PhonyFlood is semantic, and it can be built over time. If a hostile
entity (say, a repressive government) wishes to discredit an individual,
the network may decide to disconnect both the hostile entity and the
individual frantically proclaiming her innocence, treating them as
confused nodes. This
would be a win for the hostile entity. At worst, the disinformation will
be permitted to remain on the network, thoroughly confusing the audience
and also scoring a win for the hostile entity.
Also, this scenario doesn't require a complex architecture like FreeNet.
It is possible on the Internet as it is today, as was blatantly
demonstrated in the aftermath of the crash of TWA Flight 800.
LawrenceLessig recounts this well in CodeAndOtherLawsOfCyberSpace, p. 171:
On July 17, 1996, TWA Flight 800 fell from the sky ten miles off the
southern coast of Center Moriches, New York. Two hundred and thirty
people were killed. Immediately after the accident the United States
launched the largest investigation of an airplane crash in the history
of the National Transportation Safety Board (NTSB), spending $27
million to discover the cause of the crash, which eventually was
determined to have been a mechanical failure. [1]
This was not, however, the view of the Internet. From the beginning
stories circulated about missiles--people said they saw a streaking
light shoot toward the plane just before it went down. There were also
stories about missile tests conducted by the Navy seventy miles from
the crash site. And then there were reports of a cover-up by the U.S.
government to hide its involvement in one of the worst civil air
disasters in American history.
The government denied these reports, yet the more the government denied
them, the more contrary "evidence" appeared on the Net. [2] [3] [4]
There were repeated reports of sightings of missiles by witnesses on
the ground. These reports, writers on the Net claimed, were being
"suppressed" by the government. The witnesses were being silenced. And
then, as a final straw in the story, there was a report, purportedly by
a government insider, claiming that indeed there was a
conspiracy--because evidence suggested that a friendly fire had shot
down TWA 800.
A former press secretary to President John F. Kennedy believed it. In a
speech in France, Pierre Salinger announced that his government was
hiding the facts of the case, and that he had proof.
Of course, he had no proof. He was duped by misinformation (perhaps
disinformation?) on the Internet.
A similar instance occurred in the days following the destruction of the
World Trade Center on September 11, 2001. Many Arab news organizations
began reporting a story, based on a dubious e-mail reference to
[Information Times], that the Israeli government had warned 4000 New York
Jews not to show up to work that morning. As more and more news
organizations reported this, the Russian Pravda did as well--although, to
their credit, they pulled the article hours after they posted it. [5]
Ultimately, there is no way to embed semantic notions of trust in a
system that actively breaks social structures. Any attempt to do so will
necessarily expose the network's members to some degree. And without
traceability to the RealWorld, the exposure will not be sufficient to
defend against a dedicated PhonyFlood.
But of course this is doable in the RealWorld as well. There are
countless occurrences of urban myths in high school textbooks, for
instance. And much journalism is just dressed-up gossip.
All you need is a medium and anonymous or near-anonymous sources. Sources
are effectively anonymous when nobody bothers to follow them up. The phony
flood effectively prevents people from making an informed decision. Take
UFO stories as an example. There are magazines dedicated to the topic and
sources remain nameless. People tire of trying to verify the claims and
thus the topic as a whole is discredited. Confronted with the lack of
trusted sources, people usually just DefendAgainstParanoia -- effectively
shutting out information that doesn't fit the current world model.
Nonetheless, the point remains that it is impossible to secure a network
against semantic attacks and that anonymity makes this easier.
----
UrbanLegend?s are one instance, and the phenomenon certainly is
independent of the Internet. ConspiracyTheory? is another example, as are
PropagandaLie?s disseminated during wartime. There are others as well;
more later.
There seem to be several elements required for a successful PhonyFlood:
1. Iteration. The story is repeated and reinserted many times. It is a
frequently repeated lie.
2. Appeal. There has to be a fundamental appeal to the story being told
-- a desire to think it is true.
3. Plausibility. Though not required, it helps for such stories to be
at least on the border of plausible reality. If the story is not itself
believable, the contrary evidence should be weak or diffuse.
4. Vague origins. The origins of the story should be somewhat hazy.
"Friend of a friend" or other undefined but plausible source.
5. Hazy facts. Either the facts are few, disputed, or hazy, or their
negating arguments are in a similar position. It's hard (though not
impossible) to espouse a viewpoint in the face of overwhelming
evidence. It's easier to pick a situation in which there's some
ambiguity to exploit.
6. Emotional investment. Many of these issues are a counterpoint of
disputed facts and a strong emotional factor, whether it's paranoia,
prestige, patriotism, religion, or similar.
There's probably one ultimate exemplar of this phenomenon, a story which
simply transcends proof, and relies instead on faith and repetition for
belief: religion. I don't mean this as an assessment of religion as true
or false -- in fact, as I said, it transcends this. Religion is ultimately
something taken on faith -- you believe or...you don't.
What's the ultimate answer? "Truth is a battle." No rest for the wicked --
you've got to get your story out early and often. -- KarstenSelf 19 Apr
2001