There’s a delicious irony in some of the testimony on cybersecurity
that the Senate Homeland Security and Governmental Affairs Committee
will hear today (starting at 2:30 Eastern — it’s unclear from the hearing’s page
whether it will be live-streamed). Former National Security Agency
general counsel Stewart Baker flubs a basic mathematical concept.
If Congress credits his testimony, is it really equipped to regulate the Internet in the name of “cybersecurity”?
Baker’s written testimony (not yet posted) says, stirringly, “Our
vulnerabilities, and their consequences, are growing at an exponential
rate.” He’s stirring cake batter, though. Here’s why.
Exponential growth occurs when the growth rate of the value of a
mathematical function is proportional to the function’s current value.
It’s nicely illustrated with rabbits. If in week one you have two
rabbits, and in week two you have four, you can expect eight rabbits in
week three and sixteen in week four. That’s exponential growth. The
number of rabbits each week dictates the number of rabbits the following
week. By the end of the year, the earth will be covered in rabbits.
(The Internet provides us an exponents calculator, you see. Try calculating 2^52.)
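To see the arithmetic laid out, here is a quick Python sketch. The rabbit counts are the hypothetical ones from the paragraph above, and 2^52 is the 52-week figure; nothing here is real data.

```python
# Doubling each week: exponential growth, where next week's count
# is dictated by (is proportional to) this week's count.
rabbits = 2
for week in range(1, 5):
    print(f"Week {week}: {rabbits} rabbits")
    rabbits *= 2

# After 52 weeks of doubling:
print(2 ** 52)  # 4503599627370496 -- an earth covered in rabbits
```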
The vulnerabilities of computers, networks, and data may be growing.
But such vulnerabilities are not a function of the number of transistors
that can be placed on an integrated circuit. Baker is riffing on Moore’s Law, which describes long-term exponential growth in computing power.
Instead, vulnerabilities will generally
be a function of the number of implementations of information
technology. A new protocol may open one or more vulnerabilities. A new
piece of software may have one or more vulnerabilities. A new chip
design may have one or more vulnerabilities. Interactions between
various protocols and pieces of hardware and software may create
vulnerabilities. And so on. At worst, in some fields of information
technology, there might be something like cubic growth in
vulnerabilities, but it’s doubtful that such a trend could last.
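For illustration only, here is what linear, cubic, and genuinely exponential growth look like side by side. The numbers are made up purely to show the shapes of the curves, not to estimate actual vulnerability counts.

```python
# Illustrative comparison of growth shapes (no real vulnerability data).
for n in (1, 5, 10, 20, 30):
    linear = n            # one new vulnerability per new implementation
    cubic = n ** 3        # the "at worst" polynomial case described above
    exponential = 2 ** n  # what exponential growth would actually require
    print(f"n={n:>2}  linear={linear:>3}  cubic={cubic:>6}  exponential={exponential:>13,}")
```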
Why? Because vulnerabilities are also regularly closing. Protocols
get ironed out. Software bugs get patched. Bad chip designs get fixed.
There’s another dimension along which vulnerabilities are also
probably growing. This would be a function of the “quantity” of
information technology out there. If there are 10,000 instances of a
given piece of software in use out there with a vulnerability, that’s
10,000 vulnerabilities. If there are 100,000 instances of it, that’s 10
times more vulnerabilities—but that’s still linear growth, not
exponential growth. The number of vulnerabilities grows in direct
proportion to the number of instances of the technology.
Ignore the downward pressure on vulnerabilities, though, and put
growth in the number of vulnerabilities together with the growth in the
propagation of vulnerabilities. Don’t you have exponential growth? No.
You still have linear growth. The growth in vulnerability from new
implementations of information technology and the growth from new
instances of that technology multiply together. Across technologies, they sum. They don’t act as exponents to one another.
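If it helps, here is the same arithmetic sketched in Python. Every figure is hypothetical; the point is only that the quantities multiply and sum rather than compound.

```python
# Hypothetical figures: vulnerabilities per technology scale as
# (flaws per implementation) x (instances deployed); technologies then add.
technologies = [
    {"name": "protocol A", "flaws_per_impl": 3, "instances": 10_000},
    {"name": "software B", "flaws_per_impl": 5, "instances": 100_000},
]

total = sum(t["flaws_per_impl"] * t["instances"] for t in technologies)
print(total)  # 530000 -- a big number, but it grows by multiplication and addition,
              # not by raising anything to an ever-growing power
```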
Baker uses “vulnerability” and “threat” interchangeably, but careful
thinkers about risk wouldn’t. Vulnerability is
the existence of weakness. Threat is someone or something animated to
exploit it (a “hazard” if that thing is inanimate). Vulnerabilities
don’t really matter, in fact, if there isn’t anyone to exploit them. Do
you worry about the number of hairs on your body being a source of pain?
No, because nobody is going to come along and pluck them all. You need
to have a threat vector, or vulnerability is just idle worry.
Now, threats can multiply quickly online. When exploits to some
vulnerabilities are devised, their creators can propagate them quickly
to others, such as “script kiddies” who will run such exploits everywhere they can. Hence, the significance of the “zero-day threat” and the importance of patching software promptly.
As to consequence, Baker cites examples of recent hacks on HBGary,
RSA, Verisign, and DigiNotar, as well as weakness in industrial control
systems. This says nothing about growth rates, much less how the number
of hacks in the last year forms the basis for more in the next. If some
hacks allow other hacks to be implemented, that, again, would be a
multiplier, not an exponent. (Generally, these most worrisome hacks
can’t be executed by script kiddies, so they are not soaring in
numerosity. You know what happens to consequential hacks that do soar in
numerosity? They’re foreclosed by patches.)
Vulnerability and threat analyses are inputs into determinations
about the likelihood of bad things happening. The next step is to
multiply that likelihood by the consequence. The product is a sense of
how important a given risk is. That’s risk assessment.
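The whole exercise reduces to one multiplication. Here is a minimal Python sketch with made-up likelihood and consequence figures, just to show the step.

```python
def risk(likelihood: float, consequence: float) -> float:
    """Expected loss: probability of the bad thing times the cost if it happens."""
    return likelihood * consequence

# Hypothetical numbers: a rare, high-consequence event vs. a common, cheap one.
print(risk(0.001, 10_000_000))  # 10000.0
print(risk(0.5, 1_000))         # 500.0
```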
But Baker isn’t terribly interested in acute risk management. During
his years as Assistant Secretary for Policy at the Department of
Homeland Security, the agency didn’t do the risk management work that would validate or invalidate the strip-search machine/intrusive pat-down policy (and it still hasn’t, despite a court order).
The bill he’s testifying in support of wouldn’t manage cybersecurity
risks terribly well, either, for reasons I’ll articulate in a
forthcoming post.
Do your representatives in Congress get the math involved here? Do
they know the difference between exponential growth and linear growth?
Do they “get” risk management? Chances are they don’t. They may even
parrot the “statistic” that Baker is putting forth. How well equipped do
you suppose a body like that is for telling you how to do your
cybersecurity?