2012-07-31

‘Coverage Will Not Necessarily Translate into Care’

Posted by Michael F. Cannon at http://www.cato-at-liberty.org/coverage-will-not-necessarily-translate-into-care/


Members of the Anti-Universal Coverage Club already knew this. Members of the Church of Universal Coverage may want to take heed. The New York Times reports:
In the Inland Empire, an economically depressed region in Southern California, President Obama’s health care law is expected to extend insurance coverage to more than 300,000 people by 2014. But coverage will not necessarily translate into care: Local health experts doubt there will be enough doctors to meet the area’s needs. There are not enough now.
Other places around the country, including the Mississippi Delta, Detroit and suburban Phoenix, face similar problems…
Moreover, across the country, fewer than half of primary care clinicians were accepting new Medicaid patients as of 2008, making it hard for the poor to find care even when they are eligible for Medicaid. The expansion of Medicaid accounts for more than one-third of the overall growth in coverage in President Obama’s health care law.
But isn’t the important thing that they’ll have a piece of paper that says “health insurance”?

Police Use of Drones Leads to Arrest of North Dakota Farmer

Posted by Walter Olson at http://www.cato-at-liberty.org/police-use-of-drones-leads-to-arrest-of-north-dakota-farmer/


From a Minneapolis Star-Tribune account last week:
…a Predator drone led to the arrests of farmer Rodney Brossart and five members of his family last year after a dispute over a neighbor’s six lost cows on his property escalated into a 16-hour standoff with police.
It is one of the first reported cases in the nation where an unmanned drone was used to assist in the arrest of a U.S. citizen on his own property…
Many more cases are likely to follow, in areas that include drug enforcement, child welfare and environmental regulation. Warns the ACLU in a report issued in December:
All the pieces appear to be lining up for the eventual introduction of routine aerial surveillance in American life — a development that would profoundly change the character of public life in the United States.
Lots of money is being made from drone (UAV) operations already, and those who are dubious about the privacy and constitutional aspects of the trend can expect to run into comebacks like the following:
“If you’re concerned about it, maybe there’s a reason we should be flying over you, right?” said Douglas McDonald, the company’s director of special operations and president of a local chapter of the unmanned vehicle trade group.
Read the whole thing. The Fourth Amendment to the U.S. Constitution states:
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
Related on the surveillance state here. Cato Unbound published a symposium on drones in January which included discussion of their domestic implications.

2012-07-30

Talking about ‘Trade-offs’ between Liberty and Security Begs the Question

Posted by Julian Sanchez at http://www.cato-at-liberty.org/talking-about-trade-offs-between-liberty-and-security-begs-the-question/


Over at the New York Times, reporter Scott Shane announces the beginning of a running dialogue about how to strike “the proper balance between liberty and security” more than a decade after the terror attacks of 9/11.  I want to suggest, however, that framing the question the way Shane does, in terms of optimizing the “trade-off” between these competing values, begs the crucial question: Has there been a trade-off? Have all the billions of dollars and intrusive new surveillance powers granted our intelligence agencies in recent years actually made us any safer? Shane presents the choice we face in a way that simply assumes, without argument, that they have:
The next president might reach one of two very different conclusions: to continue its record of success, the government should keep doing everything it is doing, and Americans should accept that the trade-offs of the national security state are permanent. Or: the terrorism emergency that began with 9/11 has eased, the threat has diminished, so the security bureaucracy should shrink accordingly and the pendulum should swing back in favor of civil liberties and individual privacy.
But surely there’s a third possible conclusion—and one with the virtue of being far better supported by the available evidence: That much of the expansion of the national security state has not involved any “trade-off” at all because it has not meaningfully increased our security; that the absence of major terror attacks since 9/11 is not remotely the same thing as a “record of success” for the “security bureaucracy,” and that to the extent a genuine “record of success” does exist, it has very little to do with the most controversial and prominent “War on Terror” measures.
As my colleague John Mueller observes, there has been essentially no serious effort at cost-benefit analysis of the costly security measures imposed since 2001—probably in part because it seems so obvious that most would fail any such test. The mere fact that we haven’t seen a repeat of 9/11 hardly tells us much: We hadn’t had one in the decades before we started inflating the powers and budgets of the intelligence agencies either. Inferring a causal relationship from the absence of an already rare phenomenon is pure magical thinking—the national security equivalent of trumpeting the lack of volcano eruptions since the last virgin sacrifice. Moreover, there have been multiple attacks over the past decade that the newly expanded security bureaucracy clearly failed to detect or preempt. The NSA’s vast surveillance apparatus didn’t stop shoe bomber Richard Reid or undie bomber Umar Farouk Abdulmutallab: Alert passengers did that. Attempted Times Square bomber Faisal Shahzad was foiled by his own incompetence and an alert street vendor.
Of course, our intelligence agencies have had concrete counterterrorism successes, but that in itself doesn’t tell us anything about the causes of those successes. If, as the saying goes, “everything changed” after 9/11, we need to know which changes contributed to those successes and which didn’t. There is little evidence that the intelligence failures leading up to the 2001 attacks were fundamentally about insufficient power or resources. Rather, as intelligence scholar Amy Zegart has convincingly argued, they were the result of organizational and cultural problems within the agencies which had been identified again and again over the years by expert panels and blue ribbon reports whose reform recommendations somehow never wound up getting implemented. To the extent some of those problems finally started to be addressed after 9/11, that would be a much more plausible explanation of any subsequent performance improvements.
What about the government’s own claims that some of these radical new surveillance powers have been vital, life-saving intelligence tools? That would support the “trade-off” thesis, but since it’s bad politics to announce that you’ve violated people’s rights frivolously, we should want to check those claims against some independent analysis. In the case of the warrantless surveillance program authorized by President Bush—predecessor to the programmatic surveillance now conducted under FISA’s section 702—we’ve actually got just that: An unclassified report by the Inspectors General of the intelligence community. And what did they find? A lot of officials making vague statements to the effect that the program was “one useful tool in the toolbox,” but not much in the way of concrete achievements—plots foiled or terrorists captured—that could be attributed to the program we’d been told for years was absolutely critical. Instead—as Scott Shane’s own reporting had told us!—it seems to have wasted the time of a lot of FBI agents chasing down all the dead-end leads it generated.
Approaching it from the other direction, we can look at the known cases of people charged with terror-related crimes. Do we find a string of plots foiled thanks to sophisticated surveillance methods that would have been unavailable under pre-2001 laws? We do not. Mostly we find human intelligence and tips from alert members of the community playing the critical role.
Benjamin Franklin’s wise aphorism notwithstanding, there are cases where we do face genuine trade-offs between liberty and security, and sometimes—as when we allow homes to be searched pursuant to a judicial warrant supported by probable cause—those are even trade-offs worth making. But many measures that make us more secure don’t affect liberty at all: The single reform that probably did the most to guarantee we wouldn’t see a repeat of 9/11 was the simple decision to reinforce and lock cockpit doors. And the easiest thing of all is to implement “security theater” that diminishes our privacy to create the appearance of “doing something” without actually making us any safer. Sometimes we face hard questions about the trade-off between liberty and security—but we shouldn’t even begin to consider which trade-offs are worth making until we’ve seen some solid evidence that the trade-off is real. For most “war on terror” measures, the evidence just isn’t there.

Aaron Sorkin's The Newsroom: One-Sided Politics Will Not Save Us from Politics

by Trevor Burrus at http://www.cato.org/publications/commentary/aaron-sorkins-newsroom-onesided-politics-will-not-save-us-politics

Aaron Sorkin's new HBO show The Newsroom is dishing out and receiving a lot of criticism. The Newsroom is Sorkin's latest attempt to cleanse the demons from our national character through fast-talking characters fighting for their principles. This time, however, rather than just obliquely commenting on the political fights of the day through thinly veiled metaphors, Sorkin's characters deal with political events of the recent past. In the pilot episode, Jeff Daniels's character, a mundane and middle-of-the-road newscaster, lets loose his spleen upon an innocent college student who asks him why America is the greatest country on Earth. Daniels rants on America's fallen status but fondly remembers its great past. In the wake of his outburst, he's rebranded as a no-nonsense truth-speaker who will confront the powers that be — a supposed return to the glory days of Edward R. Murrow and Walter Cronkite. In the latter half of the pilot, the newsroom accepts the challenge of "speaking truth to power" and exposes the alleged corporate malfeasance that led to the BP oil spill.
Most recently, The Newsroom has taken on the Tea Party, attacking the Koch brothers by name, as well as the Cato Institute, the Institute for Justice (IJ), and Heritage. In this clip, the characters discuss Cato's and IJ's amicus briefs on behalf of Citizens United. Jeff Daniels's character then misstates the holding of Citizens United — the case did not hold that corporations can donate directly to political candidates — and another character badly misquotes from the Institute for Justice's brief. According to her, the Institute for Justice argued that "finance laws prohibiting unlimited corporate contributions trump the First Amendment." This poorly written line not only misstates IJ's brief, it actually makes it sound as if IJ supports limits on campaign spending. The actual quote from the brief: "The problem lies in allowing the logic of campaign-finance laws to trump the First Amendment." As in all of his projects, Sorkin's characters are prone to flowery orations. Unfortunately, this time Sorkin's words, including the sloppy mischaracterization of IJ's argument, were shamefully taken nearly verbatim from a ThinkProgress blog post written by the extremely partisan Lee Fang (compare the characters' words with Fang's words here).
That fact merely underscores the folly in Sorkin's obvious goals for The Newsroom. If he wants to lambaste the mainstream media for no longer providing hard-hitting coverage that "speaks truth to power" and to lament the fallen nature of modern, partisan journalism, then I would suggest to Mr. Sorkin that, in the future, he should not outsource his thinking and language to one politically committed blog. At the very least, he should ensure that his characters do not misstate the central holding of the case they are attacking. If he wants to portray smart, honest, hard-working people turning journalism back into an antacid for our partisan-induced ulcers, then he should make more of an effort to be a non-partisan researcher.
Sorkin is coming from a long tradition in American political thought which holds that a well-functioning republic requires virtuous citizens. Those citizens must be informed and high-minded, not prone to meaningless squabbles or the pursuit of naked self-interest. As John Adams, perhaps the foremost proponent of virtue among the Founding Fathers, said, "Liberty can no more exist without virtue and independence than the body can live and move without a soul."
Sorkin is also within a more recent tradition in American politics: A utopian pining for the days when D.C. was not deeply divided along partisan lines. There was once a time when Washington got things done, so the story goes. There was a time when parties didn't thwart the proposals of an opposition president merely because he was on the other side. There was a time when representatives, opinion-leaders, and ordinary people from both sides reached across the aisle and went to the same social functions, the same movies, and watched the same news. Those days are gone, but perhaps, The Newsroom tells us, the right combination of good-hearted elites could rescue our national narrative from the warring factions and give it purpose and direction.
It's all a little smug, as many have pointed out. Perhaps D.C. politics has become more divided. Perhaps we are becoming a red vs. blue country. What is never mentioned in these increasingly common lamentations, however, is the simple question, "What else would you expect?" Over the past 50 years, politics has crept into nearly every area of our lives, affecting our most personal and consequential decisions. Our political parties no longer fight over simple regulations of interstate commerce and tariffs; we fight, on a national level, over the nature of American health care and how we will educate our children. How could these fights not be schismatic, vicious, and underhanded?
Football teams didn't exist before the game, and fans didn't exist before the teams. The teams fight over zero-sum gains, and there can be only one Super Bowl champion. Thus, the players and the fans react accordingly. If the stakes were even higher, however, if Ohio State and Michigan had more to lose than simple bragging rights, then the players and fans would certainly ramp up their partisan loyalties, their vicious name-calling, and their parking lot brawls. Put simply, increased partisanship is a direct result of the increased scope and importance of politics in our daily lives.
None of this should be surprising, yet the collective head-scratching over where our politics went awry continues. Sorkin's plea for elites to create a national narrative that brings us together is no more coherent than his plea that a brilliant, charming, Nobel-laureate, polyglot president can best fix our national crises. America doesn't need better elites; we need fewer of them.

2012-07-29

Timber Payments Subsidize Counties at Taxpayers' Expense

by Randal O'Toole at http://www.cato.org/publications/commentary/timber-payments-subisidize-counties-taxpayers-expense

Many Oregon counties, particularly in southwestern Oregon, are in deep financial trouble. Coos, Curry, Douglas, Jackson, Josephine, Klamath, Lake, and Lane counties historically received 15 to 33 percent of their revenues from the federal government as payments in lieu of property taxes for the national forest and Bureau of Land Management lands in those counties.
Those payments came out of timber sale revenues, but as concerns over the spotted owl and other environmental issues led to a decline in timber sales after 1990, the payments also fell. To ease the transition to more sustainable revenue sources, Congress provided "temporary" funding out of general funds.
Each time temporary funding was set to expire, though, counties complained about a financial crisis, and Congress extended the funding. The latest extension was added to a transportation bill that Congress passed on June 29. But this bill extends the funding only one more year, so county treasuries may be emptied next year. Curry County has threatened to simply shut down, and the Oregon state auditor recently reported that all of these counties have a high risk of financial distress.
The truth is that taxpayers in these counties (of which I am one) have been getting a free ride for decades. While federal lands impose little cost on counties, the payments out of timber receipts have been many times greater than the federal government would have paid if it had paid ordinary property taxes.
Counties throughout the country that have national forests in them receive 25 percent of timber sale receipts. In most cases, this was more than property taxes before sales declined. But the greatest difference was in Oregon, whose valuable old-growth timber produced 40 percent of national forest revenues in the 1970s and 1980s.
Congress allowed the states to divide these "25-percent funds" between schools and county road departments. Most states gave half to each, but Oregon gave 75 percent to roads and 25 percent to schools. This meant that Oregon county road departments were literally rolling in cash in the 1970s and 1980s, but it also meant that the decline in timber sales hit them the hardest.
To make matters worse, the BLM paid a whopping 50 percent of the revenues from most of its Western Oregon timber sales to counties. This compares with just 10 percent of timber receipts paid by the BLM to counties elsewhere. While the national forest funds were split between roads and schools, all BLM funds went straight into county general funds.
The result is that these counties have some of the lowest property tax rates in the state. While the average Oregon property owner pays more than $2.80 per $1,000 in assessed value to the county, property owners in Curry and Josephine counties pay only 60 cents, and rates also are much lower than average in Coos, Douglas and Jackson counties.
Raising property taxes to somewhere around the statewide average would solve the problems in all of these counties except Lake and Lane. But Oregon law prevents counties from raising taxes without voter approval, and county commissioners suspect that few voters will be willing to double or quadruple their county tax burden.
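To put rough numbers on that (an illustrative calculation using the rates above, with a hypothetical assessed value): a home assessed at $200,000 in Curry or Josephine County now pays about $200,000 × $0.60 per $1,000, or $120 a year, in county property tax; at the statewide average of $2.80 per $1,000 it would pay more than $560 — which is why commissioners expect voters to balk.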
Rep. Peter DeFazio has proposed to divide Western Oregon BLM lands into two chunks. One portion, containing mostly old-growth timber, would be set aside for conservation. The other portion, mainly second-growth timber, would be managed as a source of revenues for the counties.
While some environmental groups oppose this plan, I don't see anything wrong with managing cutover land for timber. But I have to wonder why southwest Oregon counties should continue to live off of federal taxpayers, who otherwise would get any receipts from Forest Service and BLM sales.
County leaders say these BLM lands (which Congress originally granted to a railroad, then took back when the railroad failed to live up to the terms of the grant) would have been private had they not been taken back by Congress. Perhaps so, but the amounts the counties are asking federal taxpayers to pay — either through an extension of timber payments or via DeFazio's bill — greatly exceed the amount private forestland owners pay in property and harvest taxes.
Most of these counties spend the largest share of their funds on public safety, including the sheriff, courts and jail. Other funds go for health and human services. But most also spend a significant amount of money on what might be called luxuries, including recreation, cultural resources and community development programs (which mainly means land-use planning).
County leaders need to accept reality and make some hard decisions about their budgets. Recreation, culture and most public works programs should be funded out of user fees rather than taxes. If users aren't willing to pay for them, then they aren't really needed. Counties also could stop funding land-use planning and let the state pay for those programs if it feels they are needed.
To the extent that these cuts aren't enough to maintain public safety and human service programs, county leaders will have to make it plain to voters that they will have a choice between somewhat higher property taxes or accepting major cuts to these programs.
There is no justification for forcing federal taxpayers elsewhere to subsidize county taxpayers in Oregon.

Homegrown Failure: Why the Domestic Terror Threat Is Overblown

by Benjamin H. Friedman at http://www.cato.org/publications/commentary/homegrown-failure-why-domestic-terror-threat-is-overblown

Homegrown terrorism is not becoming more common and dangerous in the United States, contrary to warnings issuing regularly from Washington. American jihadists attempting local attacks are predictably incompetent, making them even less dangerous than their rarity suggests.
Janet Napolitano, Secretary of Homeland Security, and Robert Mueller, Director of the Federal Bureau of Investigation, are among legions of experts and officials who have recently warned of a rise in homegrown terrorism, meaning terrorist acts or plots carried out by American citizens or long-term residents, often without guidance from foreign organisations.
But homegrown American terrorism is not new.
Leon Czolgosz, the anarchist who assassinated President McKinley in 1901, was a native-born American who got no foreign help. The same goes for John Wilkes Booth, Lee Harvey Oswald and James Earl Ray. The deadliest act of domestic terrorism in U.S. history, the 1995 Oklahoma City bombing, was largely the work of New York-born Gulf War vet Timothy McVeigh.
As Brian Michael Jenkins of RAND notes, there is far less homegrown terrorism today than in the 1970s, when the Weather Underground, the Jewish Defense League, anti-Castro Cuban exile groups, and the Puerto Rican Nationalists of the FALN were setting off bombs on U.S. soil.
There was an increase in homegrown terrorism arrests in the late 2000s, with the decade's high coming in 2009. That year saw the decade's deadliest act of homegrown terrorism when Nidal Hasan killed thirteen people at Ft. Hood. Homegrown terrorism has declined since. According to a report published earlier this year by Charles Kurzman of the University of North Carolina, arrests of homegrown terrorists fell from 47 in 2009 to 20 in 2011. No more successful plots have occurred.
There are reasons to doubt that the recent increase in homegrown terrorism arrests reflected an increase in actual terrorism. One reason is random variation. A run of aces may be due to chance; it does not make the next gamble a good one. Similarly, a spate of homegrown terrorism does not necessarily indicate a continuing trend.
Second, as Jenkins notes, a sizeable minority of those arrested for terrorism in the late 2000s were U.S. nationals trying to help the al-Shabaab group in Somalia, either by recruiting, fundraising or joining its ranks. That counts as terrorism because the U.S. government categorises al-Shabaab as a terrorist organisation and criminalises support for it. But al-Shabaab is an insurgent organisation chiefly interested in Somali politics, and it has not attempted terrorism in the United States. With Ethiopian forces occupying parts of Somalia from 2006 to 2009, many in the Somali diaspora saw support for al-Shabaab as a defense of their homeland. Those who aid or join it are not necessarily interested in terrorism, let alone terrorism against Americans.
Third, U.S. authorities began to search harder for terrorists at home. After the September 11 attacks, the FBI received a massive boost in counterterrorism funding and shifted a small army of agents from crime-fighting to counterterrorism. Many joined new Joint Terrorism Task Forces. Ambitious prosecutors increasingly looked for terrorists to indict. Most states stood up intelligence fusion centers, which the Department of Homeland Security (DHS) soon fed with threat intelligence.
The intensification of the search was bound to produce more arrests, even without more terrorism, just as the Inquisition was sure to find more witches. Of course, unlike the witches, only a minority of those found by this search are innocent. But many seem like suggestible idiots unlikely to have produced workable plots without the help of FBI informants or undercover agents taught to induce criminal conduct without engaging in entrapment.
Take Rezwan Ferdaus, the 26-year-old who lived with his parents outside Boston before his arrest last fall. He allegedly planned to fly small remote-controlled airplanes carrying a few pounds of explosives into the Pentagon and Capitol dome, assuming they would easily collapse. A second attack would somehow destroy the bridges at the Pentagon complex, before a six-man team armed with AK-47s attacked the survivors. Happily, Ferdaus had no accomplices, aside from those provided by the FBI, no money for the planes, other than what the FBI loaned him, and no explosives, beyond the fake sort that the FBI provides.
The officials and pundits most worried about homegrown terrorists claim that Americans are lucky to have enemies like Ferdaus. They say the same of Faisal Shahzad, whose car bomb failed to explode in Times Square, Najibullah Zazi, who could not make a working bomb despite the training he got on the subject in Pakistan, and the many other incompetents who have lately attempted terrorism in the United States.
Despite their serial failure, U.S. leaders describe homegrown terrorists as cunning and their threat as great. Napolitano says they are especially dangerous because they can come from "any direction, and with little or no warning." Mueller warns that they "understand our culture, our security protocols, and our vulnerabilities. They use the Internet, social media, and marketing skills to influence like-minded individuals."
The failure of U.S.-born jihadists, however, reflects more than luck. There are at least two good reasons for it. The first is al Qaeda's ideology. By supporting the murder of most people, including most Muslims, al Qaeda ensures that it remains wildly unpopular in most places. Its ideology is especially noxious to those living in coherent, liberal societies like the United States. Americans drawn to al Qaeda are likely to be a troubled and disaffected lot, lacking traits that most organisations value in recruits.
A more important source of failure is organisational weakness. Mass violence has historically been the product of bureaucratic, hierarchical organisations belonging to states or to insurgencies that resemble them. Only bureaucratic organisations have the tools to train and motivate many to act on the orders of a few, which is historically how mass violence with small arms has occurred. As agents of states or other organisations that monopolise violence, bureaucratic organisations alone have the physical security, expertise and capital needed to manufacture mass-killing weapons like artillery, strike aircraft, and nuclear weapons.
Because they are generally clandestine, terrorist groups usually lack these attributes. They struggle to gain and transfer deadly knowledge, amass wealth, build the physical plants needed to make sophisticated weapons, or muster enough manpower to sustain attacks on populations. Those flaws are especially evident in al Qaeda, which has always been more a loosely linked set of radicals than an organisation that commands adherents.
Homegrown American jihadists, who generally lack guidance even from al Qaeda's withering core, are about the least organised terrorists imaginable. They cannot acquire the funds and training needed for terroristic expertise. Most would quickly kill themselves once they achieved it.
Contrary to much recent analysis, the internet does not solve these problems. As Anne Stenersen of the Norwegian Defence Research Establishment has shown, online guides to bomb-making, poison manufacture and other tools of mayhem provide unreliable information. Authorities can monitor such sites or set up their own to mislead or trap malfeasants.
Moreover, internet-based instruction does not provide the sort of rapid interaction between trainer and trainee that characterises most successful training in complex tasks. The internet is even more useless for mastering acts of violence that require teamwork. There is a reason why organisations that effectively coordinate activity, whether it is the Marine Corps or Real Madrid, avoid virtual training.
If DHS is right that homegrown terrorists are now a bigger threat than the international variety, we should celebrate. Even if American-born jihadists grow more numerous and skilled, which now seems unlikely, they will remain far less deadly than the terrorist supervillains we have been taught to expect. They will never compare to big risks to American longevity like heart disease and depression.
The other good news is that the two best ways to combat this overrated danger are cheap. One is community policing, where police form relationships with local groups, including criminals, to generate tips that lead to other criminals, including terrorists.
According to a report by the Institute for Homeland Security Solutions, this method produced most of the recent arrests of homegrown terrorists. It is nearly free, given that police do much of the relevant work in the course of their normal duties. The other promising antidote to homegrown terrorism is foreign policy restraint. Most homegrown terrorists credibly claim that U.S. foreign policy in the Muslim world, especially the wars in Iraq and Afghanistan, motivated them to violence. The end of the war in Iraq has probably shrunk their number. A U.S. exit from Afghanistan and disengagement from Middle-Eastern politics would shrink it further, while allowing vast defense savings.
Unfortunately, DHS has almost nothing to do with those matters. Towns, states and the Department of Justice handle most policing, while State and Defense do foreign policy. Rather than admit its general irrelevance to a threat it is funded to combat, DHS will likely continue counterterrorism policies with costs that outweigh their benefit. That may prove the most costly consequence of homegrown terrorism in the United States.

2012-07-28

Malicious Meddling in Washington: Just What Is the Government's Business?

by Doug Bandow at http://www.cato.org/publications/commentary/malicious-meddling-washington-just-what-is-governments-business

The battle over spending continues in Washington. Despite decades of grotesque federal malfeasance, President Obama believes that government can easily fix the world — including us, the people in it. Republicans don't believe much of anything, other than they don't want to raise taxes. It's a good instinct, but not much of a political philosophy.
The fundamental issue is not government expenditures, but the role of government. If you believe most of human affairs should be conducted, and controlled, by the public sphere, then you must support lots of government spending. If you want to limit those outlays, you have to reduce the state's responsibilities.
America has become a transfer society in which Uncle Sam subsidizes virtually every noisy and noisome interest group, as well as scores of dubious friends and allies overseas. Subsidies are where most of the money goes — Social Security and Medicare, which are middle class welfare; Medicaid, for low-income people; the Pentagon, which devotes much of its resources to defending other peoples, such as the Europeans, Japanese, and South Koreans, who could defend themselves; and endless smaller benefit and grant programs for education, housing, income support, training, transportation, and more. Roll back domestic and foreign subsidies for those who don't need them and Americans would be well on their way to solving the current budget crisis.
Some government programs don't cost a lot but still lead to the obvious question: Why is that activity government's business? Even if such actions were costless, they still would be inappropriate. There are some things which the state simply should not do, at least in a society which purports to be "free."
For instance, in little more than two weeks 100-watt incandescent bulbs will be illegal. After failing in the war on drugs, Uncle Sam is about to initiate a war on bulbs. Luckily, I stocked up earlier this year. I don't go through them very quickly, so I figure I should be set for the rest of my natural (and even unnatural) life. Perhaps I can turn a little black-market profit on the side and leave a few bulbs for my heirs.
But the 100-watt bulbs are only the start. Lesser wattages will be banned in coming years. Naturally, it is supposed to be for our own good. Our betters in Washington believe that average people are too stupid to choose the right bulbs. So in our name we are being forced to raid the college fund to purchase expensive compact fluorescent lamp (CFL) bulbs, which take a long time to reach full brightness, yield an inferior glow, and require a hazmat team to deal with breakage. For the latter, the government urges people to open windows, evacuate the room, and toss any clothes contaminated by mercury from the wonderful CFLs.
Yes, yes, we are told — there is no ban on incandescents, only a standard which they cannot meet. Just purely coincidental that they all will be illegal. And the CFLs are getting better, much better, and shouldn't be rejected because of consumer prejudices. No, of course not. We should just let the smart people decide that everyone should buy Pepsi rather than Coke, or Coke rather than Pepsi, or Diet-Rite instead of the other two, or just drink water instead.
Why are the light bulbs we buy the government's business?
Last year Senate Majority Whip Richard Durbin (D-Ill.) decided that Wal-Mart and friends needed a break. So he led an effort to cut the "swipe" fees charged by banks to retailers for debit card purchases. Sen. Durbin was just outraged at the thought that America's retail giants were being gouged by companies allowed to set their own prices. Imagine! In America! Companies allowed to charge whatever they want!
Just like the big merchants themselves.
Of course, passage of the law led to a bitter regulatory fight to influence the Federal Reserve, which was tasked with arbitrarily setting the swipe fees. Then the retailers whined that the new, lower fees were too high, which meant they weren't getting as much of a return on their investment in lobbying as they had expected. So they did what most Americans do when disappointed: they sued.
After having their swipe fees cut by the government, several banks announced plans to raise costs on consumers. Under protest most backed down — for now, but their forbearance may not last forever. Moreover, smaller retailers aren't doing nearly as well as the big boys: banks have dropped discounts they once offered and companies which process debit transactions are raising their fees. Obviously, it is dangerous to allow too much freedom in a free market! Maybe Sen. Durbin needs to push legislation for a new round of price controls on everyone, retailers included. I think everything should be free!
Why are swipe fees the government's business?
The Food and Drug Administration, which has killed tens or hundreds of thousands of people by delaying safe drugs from reaching the sick, is now considering regulations to reduce the amount of salt in our food. Not recommend that consumers use less. But mandate that producers use less. After all, notes the agency, long one of Washington's worst national nannies, consumers' "taste preference for sodium is acquired and can be modified."
Presumably that is the case for broccoli, lima beans, and tofu as well. What will the health fascists come up with next? Everyone must eat Spam! No doubt, most Americans would benefit from a healthier diet. But which one?
Even the elites who are supposed to know better than the rest of us rubes often fall victim to fads and battle each other over who is the biggest, baddest genius. My Cato Institute colleague Walter Olson observed: "the government's dietary advice has changed often through the years, and its recommendations in retrospect have regularly proved to be unfounded and even damaging. Sure enough, reports have begun to come out that the salt panic has been exaggerated and may even pose some health dangers of its own."
Great. The government already has multiple ways to kill its citizens, starting with foolish, stupid, and unnecessary wars. Now it is planning to impose the latest diet fads.
Why is people's salt consumption the government's business?
Flying is a pain. After years of losing money while sending millions of people to thousands of destinations, the airlines are trying to make a profit by charging fees on everything but bathroom use, and that may be next. One of the most irritating costs to passengers is for checking luggage. Which encourages people to carry more bags onto planes, irritating flight attendants. So Sen. Benjamin Cardin (D-MD) wants to save us all. He has introduced legislation mandating that everyone gets one free bag and banning any charges for carry-ons.
After all, bags have a constitutional right to travel free. But why stop there? Passengers also should have a right to free booze. And better meals. Travelers shouldn't have to pay more for good seats. Moreover, it is time to think outside the box. Airlines should have to provide live music entertainment. That would spice up an otherwise boring time in the air. There are so many other items that belong on a list of passenger freebies.
In fact, government regulation of airline luggage policies is an idiotic idea. Personally, I prefer one (higher) price for everything. But there's a good argument for people who ship more bags paying more than people who carry their own. I don't know which business model is better. But I am certain that Sen. Cardin doesn't have the slightest idea. As Milton Friedman observed, there ain't no such thing as a free lunch!
Why is the cost of checking luggage the government's business?
A battle recently opened over which religious organizations should be forced to provide coverage for birth control as part of their health insurance policies. Complained Rep. Diana DeGette (D-Col.): "I think in the 21st century, most people are stunned to hear that we would even be talking about whether women can buy birth control through their insurance policies."
Of course, people should be stunned, since in a normal, sensible, nonpolitical world, birth control would not be covered by health insurance. People normally seek insurance to cover large, unexpected costs, not pay for modest, recurring expenses of activities freely chosen. Apparently Rep. DeGette is not familiar with the fact that sex normally is a voluntary activity, the frequency of which is under individual control. Imagine auto "insurance" which covered gas fill-ups, new stereo systems, and fancy detailing. It doesn't take a genius to realize the cost implications for everyone. A company might decide that covering contraception would still be cheaper than paying for unplanned pregnancies, but Rep. DeGette is not the one to make that decision.
States have been playing this game for years. The podiatrists show up in the state capital and insist that health care policies cover their services. Then the acupuncturists make the lobbying trip. Followed by doctors doing hair transplants. Soon the legislature is forcing everyone to pay for everything, even if most people would prefer an inexpensive catastrophic policy — real insurance. Now ObamaCare has Washington deciding what everyone in America must pay for, raising both health care costs and government outlays.
Why is insurance coverage of birth control the government's business?
There's a reason government is so costly. It does too much. But some of its worst abuses are more intrusive than expensive. It simply is not government's business which light bulbs we buy, what banks charge for debit transactions, how much salt we eat, what airlines charge for checked bags, and whether insurance policies cover birth control. None of these issues should be dictated by the enlightened public servants of civics education myth, let alone the cynical vote-seeking politicians of Capitol Hill reality.
The Founders intended to create a limited government dedicated to protecting individual liberty. There can still be disagreement over what is necessary for the framework of a free society, but today Washington vastly exceeds its proper role. As a result, government costs far too much. And interferes far too much with our liberty.

Don't Blame the Depression on the Gold Standard -- But Don't Expect It Back Either

by George A. Selgin at http://www.cato.org/publications/commentary/dont-blame-depression-gold-standard-dont-expect-it-back-either

Two of America’s Republican candidates — Newt Gingrich and Ron Paul — have dared to toy with the idea of bringing back the gold standard. Their remarks have in turn triggered a fusillade of indignant replies, from pundits and professional economists alike, the general theme of which is that no one fit to be America’s Commander-in-Chief can possibly have a good word to say about gold.
Of all the reasons usually given for condemning the gold standard, perhaps the most common is the claim that it was to blame for the Great Depression. What responsible politician, gold’s critics ask rhetorically, wants to relive the 1930s?
But the criticism misses its mark. Fans of the gold standard are no more anxious to repeat the 1930s than their critics are. Their nostalgia is instead for the interval of exceptional international monetary stability that prevailed from the mid-1870s until World War I. That was the era of the classical gold standard — a standard policed by the citizens of participating countries, all of whom were able to convert their nations’ paper money into gold.

This classical gold standard can have played no part in the Great Depression for the simple reason that it vanished during World War I, when most participating central banks suspended gold payments. (The US, which entered the war late, settled for a temporary embargo on gold exports.) Having cut their gold anchors, the belligerent nations’ central banks proceeded to run away, so that by the war’s end money stocks and price levels had risen substantially, if not dramatically, throughout the old gold standard zone.
Postwar sentiments ran strongly in favour of restoring gold payments. Countries that had inflated, therefore, faced a stark choice. To make their gold reserves adequate to the task, they could either permanently devalue their currencies relative to gold and start new gold standards on that basis, or they could try to restore their currencies’ pre-war gold values, though doing so would require severe deflation. France and several other countries decided to devalue. America and Great Britain chose the second path.
The decision by Winston Churchill, then Britain’s chancellor of the exchequer, to immediately restore the pre-war pound prompted John Maynard Keynes to ask, “Why did he do such a silly thing?” The answer was twofold: first, Churchill’s advisers considered a restored pound London’s best hope for regaining its former status — then already all but lost to New York — as the world’s financial capital.
Second, Britain had other cards to play, aimed at making its limited gold holdings go further than usual. Primarily, it would convince other countries to take part in a gold-exchange standard, by using claims against either the Bank of England or the Federal Reserve in place of gold in international settlements. It would also ask the Fed to help improve Great Britain’s trade balance by pursuing an easy monetary policy.
The hitch was that the gold-exchange standard was extremely fragile: if any major participant defected, the British-built house of cards would come tumbling down, turning the world financial system into one big smouldering ruin.
In the event, the fatal huffing and puffing came then, as it has come several times since, from France, which decided in 1927 to cash in its then large pile of sterling chips. The Fed, in turn, decided that pulling back the reins on a runaway stock market was more important than propping up the pound. Soon other central banks joined what became a mad scramble for gold, in which Britain was the principal loser. At long last, in September of 1931, the pound was devalued. But by then it was too late: the Great Depression, with its self-reinforcing rondos of failure and panic, was well under way.
So the gold standard that failed so catastrophically in the 1930s wasn’t the gold standard that some Republicans admire: it was the cut-rate gold standard that Great Britain managed to cobble together in the 20s — a gold standard designed not to follow the rules of the classical gold standard but to allow Great Britain to break the old rules and get away with it.
So does this mean that those Republican candidates are right to pin America’s hopes on a return to gold? Alas, it doesn’t: the collapse of the gold-exchange standard forever undermined the public’s confidence in governments’ monetary promises; and absent such confidence there can be no question of a credible, government-sponsored gold standard, classical or otherwise. Sometimes with monetary systems, as with life, you can’t go home again.

2012-07-27

Let the Games Begin! — But Free of Myths, Ancient and Modern

by Ilya Shapiro at http://www.cato.org/publications/commentary/let-games-begin-free-myths-ancient-modern

Preparations for the London Olympics, about to start in the first city to host three Games, have not been without controversy. Cost overruns — more than 100 percent over budget — will make these by far the most expensive Games. Landlords in working-class East London rewrote leases so apartments would be temporarily vacated in favor of cash-rich tourists. And the logo is an incomprehensible mishmash of jagged shapes.
The International Olympic Committee has also taken heat for declining to commemorate the 40th anniversary of Palestinian jihadists' murder of 11 Israeli athletes and coaches at the 1972 Munich Games. Across the pond, Washington saw a rare moment of bipartisanship in politicians' outrage that American Olympians' garb was made in China. "I think they should take all the uniforms, put them in a big pile, and burn them," said Senate Majority Leader Harry Reid, apparently not yet taken with the Olympic spirit.
More broadly, the media again remind us of how commercialism and the threat of terrorism have spoiled the world's preeminent athletic event. Columnists lament the passing of a purer age, when doctors trained in their spare time — Chariots of Fire is enjoying a rebirth — and competition was about more than endorsement contracts. These Cassandras habitually predict the demise of the Olympics as modern society wreaks havoc on the sacrosanct traditions of the ancients.
But this idea that the Games should promote a kinder, gentler world reflects sentimentalized history. Since the end of the Cold War, the Olympics have thrown off the corrosive chains of ideology to revert to the values of the original Games, among which were the dominance of the personal over the national, the economic over the political, and the athletic over larger concerns of the state.
The standard view of the Greek Olympics as a halcyon festival bringing amateur sportsmen together in the name of peace and brotherhood is a remnant of 19th-century Romanticism that was institutionalized by aristocrats like modern Games founder Pierre de Coubertin. Adolf Hitler, who staged the 1936 Berlin Games as a testament to the German people, was taken in by a similar vision of nationalism via physical perfection.
The ancient reality could not have been further from these modern misconceptions, as Greek armies routinely violated the Olympic truce, sometimes battling in the Olympic sanctuary itself. Individual achievement was valued much more than participation, and wealth superseded ideology.
Pindar, the lyric poet whose odes tell us much of what we know about the early Olympians, wrote at the behest and patronage of wealthy athletes, who sought personal glory rather than the vindication of their city-state and its political system. The great champion Alcibiades used his prestige to gain fame and riches, often at the expense of "national interest."
The modern Games, in allowing politics to overshadow sports, broke with their predecessors. After the Munich tragedy, the 1976 Montreal event left a trail of debt that Quebec taxpayers only recently paid off — and for which British taxpayers now brace themselves — and saw the first of a series of boycotts. The Olympics had lost their ancient bearings.
Though nobody knew it at the time, the 1988 Seoul Olympics were a watershed. These Games were the first to be free from major political turmoil since Tokyo in 1964. More importantly, they represented the last Olympiad of the Cold War, with the Berlin Wall falling the next year, followed by the dissolution of the Evil Empire and German reunification.
The 20th Century took us through almost continual political upheaval, most of it defined by the bipolar Cold War mentality and the specter of nuclear Armageddon. With that edifice of pretension eroded, the Games were free to become athletic spectacles again.
Under today's conditions of globalization — cultural homogenization, economic interdependence, decline of the nation-state even with respect to our enemies in war — international athletic competition assumes an ever-more parallel course to that of the world at large. As with all sporting events, the Olympics of the past two decades have become exponentially more entertainment-oriented. Even the proliferation of crass commercialism is a positive step because it returns the Olympics to the role they fulfill best: providing a forum for the globe's finest athletes to show the rest of us a good time.
The Olympics now bring us the absolute best, without regard to color, creed, contract, or the Iron Curtain. The nature of the Olympic "movement," meanwhile, has returned to the entertainment, ritual, and indeed athletic value of the original Games. Gone is the sham of amateurism, as athletes are once more individuals, not tools of the state.
Tradition, meet meritocracy; Coubertin, meet Milton Friedman. Contrary to the conventional narrative, the symbiotic relationship between sports and society has reverted to its original, proper status under the ancient Greeks.
Returning to 2012, the various London "scandals" are mere sound and fury compared to the Cold War-era misuse of sports for political purposes — or, more prosaically, the lack of snow at the 2010 Vancouver Winter Games. Even the tiff over foreign-made blazers betrays a lack of understanding about international trade. After all, the very reason consumers don't have to be polo-playing scions to afford the iconic wares of Ralph Lauren — which is outfitting the U.S. team at no cost to taxpayers — is because the company seeks out low-cost manufacturing.
The grandees of both the I.O.C. and Congress peddle utopian myths when they should be recognizing that the Olympics are no more or less than the very best the sports world has to offer.

The Federal Reserve's Covert Bailout of Europe

by Gerald P. O'Driscoll Jr. at http://www.cato.org/publications/commentary/federal-reserves-covert-bailout-europe

America's central bank, the Federal Reserve, is engaged in a bailout of European banks. Surprisingly, its operation is largely unnoticed here.
The Fed is using what is termed a "temporary U.S. dollar liquidity swap arrangement" with the European Central Bank (ECB). There are similar arrangements with the central banks of Canada, England, Switzerland and Japan. Simply put, the Fed trades or "swaps" dollars for euros. The Fed is compensated by payment of an interest rate (currently 50 basis points, or one-half of 1%) above the overnight index swap rate. The ECB, which guarantees to return the dollars at an exchange rate fixed at the time the original swap is made, then lends the dollars to European banks of its choosing.
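To make the pricing concrete (an illustrative calculation with hypothetical figures, not numbers from the Fed): on a $10 billion swap, with the overnight index swap rate at, say, 0.10%, the ECB would pay the Fed roughly $10 billion × (0.10% + 0.50%) = $60 million per year, while the guaranteed exchange rate shields the Fed from any movement in the euro over the life of the swap.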
Why are the Fed and the ECB doing this? The Fed could, after all, lend directly to U.S. branches of foreign banks. It did a great deal of lending to foreign banks under various special credit facilities in the aftermath of Lehman's collapse in the fall of 2008. Or, the ECB could lend euros to banks and they could purchase dollars in foreign-exchange markets. The world is, after all, awash in dollars.
The two central banks are engaging in this roundabout procedure because each needs a fig leaf. The Fed was embarrassed by the revelations of its prior largess with foreign banks. It does not want the debt of foreign banks on its books. A currency swap with the ECB is not technically a loan.
The ECB is entangled in an even bigger legal and political mess. What the heads of many European governments want is for the ECB to bail them out. The central bank and some European governments say that it cannot constitutionally do that. The ECB would also prefer not to create boatloads of new euros, since it wants to keep its reputation as an inflation-fighter intact. To mitigate its euro lending, it borrows dollars to lend them to its banks. That keeps the supply of new euros down. This lending replaces dollar funding from U.S. banks and money-market institutions that are curtailing their lending to European banks — which need the dollars to finance trade, among other activities. Meanwhile, European governments pressure the banks to purchase still more sovereign debt.
The Fed's support is in addition to the ECB's €489 billion ($638 billion) low-interest loans to 523 euro-zone banks last week. And if 2008 is any guide, the dollar swaps will again balloon to supplement the ECB's euro lending.
This Byzantine financial arrangement could hardly be better designed to confuse observers, and it has largely succeeded on this side of the Atlantic, where press coverage has been light. Reporting in Europe is on the mark. On Dec. 21 the Frankfurter Allgemeine Zeitung noted on its website that European banks took three-month credits worth $33 billion, which was financed by a swap between the ECB and the Fed. When it first came out in 2009 that the Greek government was much more heavily indebted than previously known, currency swaps reportedly arranged by Goldman Sachs were one subterfuge employed to hide its debts.
The Fed had more than $600 billion of currency swaps on its books in the fall of 2008. Those draws were largely paid down by January 2010. As recently as a few weeks ago, the amount under the swap renewal agreement announced last summer was $2.4 billion. For the week ending Dec. 14, however, the amount jumped to $54 billion. For the week ending Dec. 21, the total went up by a little more than $8 billion. The aforementioned $33 billion three-month loan was not picked up because it was only booked by the ECB on Dec. 22, falling outside the Fed's reporting week. Notably, the Bank of Japan drew almost $5 billion in the most recent week. Could a bailout of Japanese banks be afoot? (All data come from the Federal Reserve Board H.4.1. release, the New York Fed's Swap Operations report, and the ECB website.)
No matter the legalistic interpretation, the Fed, working through the ECB, is bailing out European banks and, indirectly, spendthrift European governments. It is difficult to count the number of things wrong with this arrangement.
First, the Fed has no authority for a bailout of Europe. My source for that judgment? Fed Chairman Ben Bernanke met with Republican senators on Dec. 14 to brief them on the European situation. After the meeting, Sen. Lindsey Graham told reporters that Mr. Bernanke himself said the Fed did not have "the intention or the authority" to bail out Europe. The week Mr. Bernanke promised no bailout, however, the size of the swap lines to the ECB ballooned by around $52 billion.
Second, these Federal Reserve swap arrangements foster the moral hazards and distortions that government credit allocation entails. Allowing the ECB to do the initial credit allocation — to favored banks and then, some hope, through further lending to spendthrift EU governments — does not make the problem better.
Third, the nontransparency of the swap arrangements is troublesome in a democracy. To his credit, Mr. Bernanke has promised more openness and better communication of the Fed's monetary policy goals. The swap arrangements are at odds with his promise. It is time for the Fed chairman to provide an honest accounting to Congress of what is going on.

Sweet Land of Liberty: Public School Principals Must Be Evaluated, Too

by Nat Hentoff at http://www.cato.org/publications/commentary/sweet-land-liberty-public-school-principals-must-be-evaluated-too

Fierce battles continue around the country among school officials, teachers unions and parents about how best to evaluate teachers so that the incompetents can be terminated. Largely overlooked, however, is the vital need to evaluate principals. In many schools I've reported on over the years, it's been clear that a principal can shape a school's learning environment for the better, or drive students to drop out.
In several schools with a domineering but uneducable principal, I've actually witnessed a few teachers — able to create lifelong learners — keeping the doors of their classrooms shut as long as possible lest a clueless, destructive principal wander in and warn these creative teachers they'd be disciplined for being out of step.
I've rarely seen an education reporter so well describe an exemplary principal as the New York Times' Maria Newman did in her Feb. 14 story "On the Front Lines of School Reform."
Here is Jim Manly, principal of New York City's Harlem Success Academy 2, part of Eva Moskowitz's network of 40 charter schools that are public but not required to employ unionized teachers.
Jim Manly so believes in personalized education that at the beginning of each year, he works on learning the name of every one of the school's students, kindergarten through fourth grade.
He tells Maria Newman: "I go down first thing every morning and I shake every scholar's hand and I say good morning by their name."
This reminded me of a fourth-grader in another New York public school who was explaining why he so liked being there. He was still surprised, he said, that his teachers "know my name," unlike in his previous school.
Principal Manly doesn't just know the students' names. He also keeps up on how each is doing. When he finds some of them faltering, he'll then tell the parents:
"Your kid is coming to school at 12:30 in the afternoon, or they're missing three days in a row for no other reason than (they) felt tired or (they) didn't feel like coming to school."
He then brings those parents right into their children's education. "We can't throw anybody out, but we sit the parents down and say there is a waiting list a mile long of people who want in to this school and you have this spot and you're throwing it away. We need your help."
Just about every one of Eva Moskowitz's Success charter schools does have long lines of parents intently striving to get their kids into those schools because academically, they genuinely outperform neighboring regular schools.
That's why I have urged, in New York City's Village Voice newspaper, that Eva Moskowitz run for mayor when the extended term of the present incumbent, Michael Bloomberg, expires. Bloomberg proudly calls himself "The Education Mayor" and has repeatedly urged New Yorkers to judge his reign by what he has achieved for the students.
How badly he is failing is revealed by the long-reliable Quinnipiac University poll, reported on Feb. 12 by Michael Goodwin in the New York Post:
"The numbers jump off the page. Only 26 percent of New Yorkers approve of Mayor Bloomberg's handling of the public schools, while 61 percent disapprove."
One of Bloomberg's main pledges while running for office and since was that he would prove the lasting value for students of mayoral control of the schools. Here is the current report card on how well he is doing in that regard: "Just 24 percent say mayoral control has been a success, with 57 percent calling it a failure."
And The Education Mayor's response, reported by Michael Goodwin: "The mayor says he kept that promise, recently declaring that 'schools are better than they have ever been.'"
Sadly, many years ago, the teachers of young Bloomberg in Massachusetts were unable to teach him to think critically about assessing the actual results of education.
If Eva Moskowitz does not run to succeed Michael Bloomberg, I would gladly vote for Principal Jim Manly, because he sees — and continually acts on — what he calls the real urgency to this work, telling the New York Times this about the kindergarten-to-fourth-graders in his Harlem Success Academy 2:
"These kids don't have more time. They don't get to say 'I'll wait five or six more years for this school to get fixed.' By then they'll be in eighth grade, reading at a third-grade level."
As a reporter and then a friend, I came to know Dr. Kenneth Clark, whose research on how many black children were deprived of becoming lifelong learners contributed significantly to the Supreme Court's unanimous 1954 Brown v. Board of Education ruling that racially segregated public schools are unconstitutional.
When that decision failed to racially integrate many schools because of legal residential segregation, Kenneth told me: "So, by the end of the second grade, some black kids still learn to believe that they're dumb, and they are not."
In such Success Academy Schools as the one where Jim Manly is principal, what the students are learning is the joy of learning.
Perpetuating the other kinds of schools are principals who judge students not through personalized learning but by how they do on collective standardized tests.

2012-07-26

Fixing the Federal Reserve

by Richard W. Rahn at http://www.cato.org/publications/commentary/fixing-federal-reserve

There is a growing consensus that the Federal Reserve is broken — because it is. The Fed was established to provide price stability and prevent periodic banking crises. It has accomplished neither.
The wholesale price level in the United States was almost the same when the Fed was established in 1913 as it had been in 1793, 120 years earlier. Now it takes about 22 dollars to equal the 1913 dollar. There have been far more bank failures post-Fed than pre-Fed, and we seem to be in an almost permanent state of banking crisis with “too big to fail.”
The Fed’s near-zero interest rate policy is a growing disaster. With inflation near 4 percent and interest on various types of savings accounts less than 1 percent, those who have been prudent and saved are being punished — forced to accept what is, in effect, a negative rate of interest. Credit is no longer being allocated by the market but to classes of borrowers as determined by politicians. Homeowners are being given money at a near-zero real rate (the interest rate they are being charged is about equal to inflation), and the interest expense is tax-deductible. Many small-business people cannot get loans because they are “risky,” while the banks can borrow from the Fed at rates below the yield on government bonds, earning a safe spread with no incentive to take on business-loan risk. Unless the banks become more willing to lend to businesses that create real jobs and innovations, the economy will continue to stagnate.
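Two quick calculations make these figures concrete. A minimal sketch follows; both inputs are the rough numbers cited above, not precise data.

    # Inputs are the approximate figures from the text, not precise data.

    # 1) The post-1913 inflation record, annualized: if it now takes
    #    about $22 to equal the 1913 dollar, roughly 99 years on, the
    #    implied average inflation rate is:
    implied_inflation = 22 ** (1 / 99) - 1
    print(f"Average inflation since 1913: {implied_inflation:.1%}")  # ~3.2%
    # versus essentially zero per year over the 120 pre-Fed years.

    # 2) The saver's penalty today, via the Fisher relation:
    nominal_rate = 0.01    # interest on a typical savings account
    inflation = 0.04       # approximate inflation rate
    real_rate = (1 + nominal_rate) / (1 + inflation) - 1
    print(f"Real return on savings: {real_rate:.2%}")  # about -2.9% a year

    # Compounded for a decade, $10,000 of savings loses roughly a
    # quarter of its purchasing power:
    print(f"${10_000 * (1 + real_rate) ** 10:,.0f}")  # ~$7,463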
All of the Republican presidential candidates have called for getting rid of Fed Chairman Ben S. Bernanke, but only Rep. Ron Paul has advocated abolishing the Fed. Mr. Paul wants to return to a gold standard. There are pros and cons of going back to gold, but, short of that, there are a number of constructive things that can be done.
One reason Fed policy is so confused and conflicted is that the Fed has been given multiple targets and tasks, some of which, at times, conflict with one another. The Fed is supposed to maintain not only price stability but also full employment. In addition, it is supposed to make sure the banking system is sound. The Dodd-Frank bill gave it the additional task of consumer financial protection. To understand the problem, assume you decide to participate in the Olympics because you are a fast runner and want to compete in the 100-meter dash. But then the government says, “By the way, you must also compete in weight lifting.” A bit later, the government comes back to you and says, “You must also add diving to your Olympic sports.” How would you train?
Fed officials often say — and some seem to believe — that their job is to “lean against the wind.” Do they know which way the wind is blowing better than anyone else? Remember, they managed to miss the financial meltdown in 2008 even though some in the private sector got it right.
Lesson 1: The Fed should have only one target and one responsibility, and that is price stability. (Other government agencies can do the other things.)
Lesson 2: Even with only one target, the Fed still will have trouble getting it right.
Therefore, the government should let citizens experiment, as the great economist F.A. Hayek advocated, with developing their own monies, whether it be gold, silver, a commodity basket or whatever. Americans now do have the legal right to make contracts in gold, as long as both buyer and seller agree.
There are two reasons why private monies have not been successful. The first is that the Treasury Department has taken the position that only the government can produce money. The Constitution says the Congress shall have the power “To coin Money, and regulate the Value thereof.” Clearly, the government has the right to specify what legal tender is for the collection of taxes, for government payments and for payment of debts when an alternative to government money is not specified. However, the Constitution does not prohibit nongovernment entities from producing money, provided they do not claim it is legal tender and both buyer and seller agree to the alternative money.
The second reason gold or other commodities cannot practically be used as money is that the U.S. Treasury takes the economically destructive position that capital gains taxes must be paid on commodity transactions. This means, in a practical sense, that to use gold for payment, every transaction, no matter how small, would require a calculation and report of the capital gain or loss. Commodities trading is a zero-sum game, so the capital losses and gains offset each other over time. Thus, the Treasury receives no net revenue from such trades, making the capital gains levy on them a truly stupid tax.
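To see the compliance burden concretely, here is a minimal sketch with hypothetical prices: gold acquired at $1,500 an ounce and spent when the spot price is $1,600.

    # Illustrative only: the record-keeping a capital-gains regime
    # forces onto anyone paying with gold. Prices are hypothetical.

    COST_BASIS_PER_OZ = 1500.00   # assumed price when the gold was bought
    SPOT_PRICE_PER_OZ = 1600.00   # assumed price on the day it is spent

    def taxable_gain(purchase_usd: float) -> float:
        """Capital gain to be computed and reported for one purchase
        paid in gold, no matter how small the transaction."""
        oz_spent = purchase_usd / SPOT_PRICE_PER_OZ
        return oz_spent * (SPOT_PRICE_PER_OZ - COST_BASIS_PER_OZ)

    # Even a $4.00 cup of coffee triggers a reportable $0.25 gain:
    print(f"${taxable_gain(4.00):.2f}")

Multiply that calculation by every cash-register transaction in a day, and the tax makes gold unusable as a medium of exchange regardless of its other merits.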
In sum, the monetary situation could be greatly improved if: (1) The Fed were charged only with maintaining the value of the currency and nothing else; (2) others were given the right to compete with the Fed in creating money (again, provided they do not claim it is legal tender); and finally, (3) the capital gains tax were removed from commodity transactions.

Blind Ambition Is Not a Presidential Job Qualification

by Gene Healy at http://www.cato.org/publications/commentary/blind-ambition-is-not-presidential-job-qualification

Are you depressed about the shape of the 2012 presidential race? Maybe you're not depressed enough.
Nobody who wants the presidency too badly ought to be trusted with it. George Washington struck the right note in his first inaugural: "No event could have filled me with greater anxieties" than learning of his election.
Yet, as the powers of the presidency have grown far beyond what Washington could have imagined, the selection process has changed in ways that make it vanishingly unlikely that a latter-day Washington will seek the job.
Unfortunately, the modern presidential campaign calls forth characters with delusions of grandeur, a flair for dissembling, and a bottomless hunger for higher office.
Barack Obama's audacious ambition is by now well-known. "He's always wanted to be president," one of Obama's oldest friends, presidential adviser Valerie Jarrett, has admitted.
In a November 2007 interview, then-candidate Obama commented, "If you don't have enough self-awareness to see the element of megalomania involved in thinking you should be leader of the free world, then you probably shouldn't be president."
So, only "self-aware" megalomaniacs should get nuclear weapons — that's one way of looking at it. Judging by the 2012 field, it may be the best we can do.
In a famous 1979 television interview, Democratic presidential contender Ted Kennedy flubbed a softball question: "Why do you want to be president?" Kennedy's sputtering answer damaged his campaign.
Despite extraordinary efforts in two campaigns, including spending millions of dollars of his own money, it's not obvious that Mitt Romney has a clear answer to that question, either. Mitt's "main cause appeared to be himself," a longtime Republican observer of the Massachusetts governor told the authors of "The Real Romney."
"Commander-in-chief of this country," is how former Sen. Rick Santorum describes the job he's applying for — and he sees the CINC's portfolio as broad enough to include hectoring Americans about their sex lives: "The dangers of contraception in this country, the whole sexual libertine idea... these are important public policy issues."
Clearly, anyone who wants the job badly enough to campaign as exhaustingly as Santorum has — living out of a suitcase on the long march through all 99 Iowa counties — doesn't simply want to take care that the laws are faithfully executed and otherwise mind his own business.
But maybe we shouldn't be surprised that the modern process calls forth people with inordinate ambition and grandiose visions, like Newt Gingrich, who has bragged that "I first talked about [saving civilization] in August of 1958."
As the Atlantic's James Fallows put it recently, "an abnormal-psych study could be written on every president of the modern era except the one who never ran for national office, Gerald R. Ford." With apologies to Groucho Marx, anybody who wants to belong to this club shouldn't be allowed to be a member.
In his terrific book "See How They Ran," historian Gil Troy writes that "Originally, presidential candidates were supposed to 'stand' for election, not 'run.' They did not make speeches. They did not shake hands. Republican detachment from the political arena was good and dignified; actively seeking office and soliciting votes was humiliating and bad."
The Jeffersonian ideal of the "mute tribune" was imperfectly observed, Troy notes, but it was something to aspire to, and candidates who violated it were occasionally punished at the polls.
Amid the tumult of the 2012 race, it's hard to imagine returning to the era of the "front porch campaign," when candidates were hardly seen and rarely heard.
But we ought to strive to make the office less powerful, and thus, a less attractive prize for those who hunger for power.

2012-07-25

The Real Victims in the Patent Wars

by Timothy B. Lee at http://www.cato.org/publications/commentary/real-victims-patent-wars

Last May, James Thompson, the maker of a popular calculator app for the iPhone, got a nasty surprise. "Just got hit by very worrying threat of patent infringement lawsuit," he tweeted.
The letter came from a Texas company called Lodsys LLC. The firm, widely described as a "patent troll," doesn't produce any useful products. Rather, its primary business is suing other companies that accidentally infringe its portfolio of patents, including one on the concept of buying content from within a mobile application.
Lodsys' letters, which went out to numerous small software entrepreneurs last year, place their targets in a bind. Merely asking a patent lawyer to evaluate the case and advise a company on whether it was guilty of infringement could cost a firm tens of thousands of dollars. And a full-blown patent lawsuit could easily carry a price tag in the millions of dollars, with no guarantee of recovering attorney's fees even if the defendant prevailed.
The Lodsys litigation campaign is a particularly egregious example of a much broader problem. Patents are supposed to reward innovation, but in the software industry, they are having the opposite effect. The patent system has become a minefield that punishes innovators who accidentally infringe the patents of others. There are now so many software patents in force that it is practically impossible to avoid infringing at least some of them.
The result has been an explosion of litigation. Large firms like Apple, Microsoft, Motorola, and Samsung are suing one another over mobile phone patents. And as a recent episode of This American Life documented, there are entire office buildings full of "patent trolls" that produce no useful products but sue other companies that do. What has gone largely overlooked in the coverage of the “patent wars,” however, has been the disproportionate burden placed on small firms—which has enormous consequences for the movement toward DIY innovation.
Software is unusual because it is effectively eligible for both copyright and patent protection. Patents traditionally protect physical machines or processes, like the light bulb, the vulcanization of rubber, or the transistor. Copyrights protect written and audiovisual works, like novels, music, or movies. Computer programs straddle this boundary. They are written works, but when executed by computers, they affect the real world. Since the 1990s, courts have allowed software creators to seek both copyright and patent protections.
While copyright law has served the software industry well, the same is not true of patents. Copyright protection is granted automatically when a work is created. In contrast, obtaining a patent is an elaborate, expensive process. Copyright infringement occurs only when someone deliberately copies someone else's work. But a programmer can infringe someone else's patent by accident, simply by creating a product with similar features.
The patent system doesn't even offer software developers an efficient way of figuring out which patents they are in danger of infringing upon. It’s a matter of arithmetic: There are hundreds of thousands of active software patents, and a typical software product contains thousands of lines of code. Given that a handful of lines of code can lead to patent infringement, the amount of legal research required to compare every line of a computer program against every active software patent is astronomical.
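The scale of that research burden is easy to sketch. Both inputs below are assumptions drawn from the rough figures above, not measurements.

    # Back-of-the-envelope scale of clearing one product against every
    # active software patent. Both inputs are assumptions.

    ACTIVE_SOFTWARE_PATENTS = 250_000   # "hundreds of thousands"
    PRODUCT_FEATURES = 2_000            # checkable features in a product
                                        # with thousands of lines of code

    comparisons = ACTIVE_SOFTWARE_PATENTS * PRODUCT_FEATURES
    print(f"{comparisons:,} feature-vs-patent checks")   # 500,000,000

    # Even at one lawyer-minute per check, at ~2,000 billable hours a
    # year, the clearance search alone consumes thousands of careers:
    lawyer_years = comparisons / 60 / 2_000
    print(f"about {lawyer_years:,.0f} lawyer-years")     # ~4,167

However rough the inputs, the conclusion is robust: no plausible legal budget makes an exhaustive clearance search feasible.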
Little wonder, then, that most software firms don't even try to avoid infringement. Defending against patent litigation is simply seen as a cost of doing business in the software industry. Startups hope that by the time the inevitable lawsuits arrive, they will have grown large enough to hire good lawyers to defend themselves. But as the number of software patents—and with it, the volume of litigation—has soared, smaller companies have become targets.
These startup firms face legal threats from two directions: patent trolls and large incumbents like Microsoft and IBM that demand small firms pay them licensing fees.
The contrast between Microsoft and Google helps to illustrate the problem. The U.S. Patent Office has issued Microsoft more than 19,000 patents since 1998, the year Google was founded. In contrast, Google has been issued fewer than 1,100. While Microsoft is undoubtedly an innovative company, it's hard to argue that it has been almost 20 times as innovative as Google in the last 14 years. Rather, Microsoft's larger portfolio reflects the fact that a decade ago, Microsoft was a mature company with plenty of cash to spend on patent lawyers, while Google was still a small startup focused on hiring engineers.
Most of Microsoft's patents cover relatively pedestrian features of software products. In a pending lawsuit against Barnes & Noble, for example, Microsoft asserts that the Nook infringes patents on the concept of selecting text by dragging "graphical selection handles" and the idea of displaying a website sans background image while waiting for the background image to load. Individually, these patents are unremarkable. But when consolidated in the hands of one firm, they form a dense "patent thicket." Microsoft's vast portfolio—reportedly numbering about 60,000 when acquisitions are taken into account—allows the Redmond, Wash., software giant to sue almost any software firm for patent infringement.
And that makes Microsoft a de facto gatekeeper to the software industry. This isn't a problem for other large incumbents, such as Apple and IBM, which have thousands of their own patents and use them to negotiate broad cross-licensing agreements. But small firms haven't had the time—or the millions of dollars—to acquire large portfolios.
And that's troubling because Silicon Valley has traditionally been a place where new firms can come out of nowhere to topple entrenched incumbents. Yet new firms wanting to compete against Microsoft, Apple, or IBM are increasingly forced to first license the incumbents' patents. It’s hard to win in the marketplace when you're forced to share your revenues with your competitors.
The patent system is a poor fit for the software industry. We shouldn't force the small, nimble firms (which make the field so dynamic) to divert scarce capital to defending themselves against patent lawsuits or amassing patent portfolios of their own. And computer software is already eligible for copyright protection, so patent protection is superfluous for rewarding software innovation.
Unfortunately, there appears to be little understanding among policymakers of the damage patents are doing in the software industry and particularly for the scrappy innovators they love to talk up. Congress excluded tax strategies from patentability in last year's patent reform legislation, but it did not even consider a similar carve-out for software. The Supreme Court seems equally oblivious to the damage patents are doing to the software industry. Writing in a 2010 patent case, Justice Kennedy worried that a lower court's ruling, if upheld, would "create uncertainty as to the patentability of software." He seemed to regard this as a bad thing, but software entrepreneurs like James Thompson are likely to see things differently.

The Top Ten Things Santa Claus Forgot to Give Me

by Doug Bandow at http://www.cato.org/publications/commentary/top-ten-things-santa-claus-forgot-give-me

Santa Claus came and went. Truth be told, I'm a bit disappointed. He didn't leave me even one of my ten favorite gifts. I guess I have to wait another year. Maybe after next year's election Santa will be more forthcoming.
Top of my list is for Americans to stop confusing Uncle Sam with Santa Claus. The idea of some rich guy from far away showing up to fulfill one's most devout desires is really quite attractive. When people expect the government to do the same, things quickly get ugly — and quite expensive. No wonder economist Lawrence Kotlikoff figures that we face debts and unfunded liabilities totaling some $211 trillion, 14 times America's annual GDP. We've been racking up the red ink in the belief that someone else would pay the bill.
Next, I wish the people of the world would stop confusing Uncle Sam with Joan of Arc. It seems everyone everywhere expects America to show up and save them. The South Koreans desire to be defended from the North Koreans. The Japanese want protection from China. The Afghans in government want to keep the Taliban out of government. The Europeans expect Americans to buy the expensive weapons necessary to allow them to take credit for tossing out a North African dictator. The Israelis insist that the U.S. bomb their enemies. And so it goes.
It's kind of nice to know that most everyone — except the cuddly North Koreans and their new dictator, informally known as the Cute Leader — trusts Americans with guns. (Too bad liberal congressmen at home don't, but that's another story!) However, the result is a big expense, with the U.S., despite its $211 trillion in debts and liabilities, spending as much on the military as the rest of the world combined. And it means Americans are constantly at war, dying for things which are pretty hard to explain to the families of those doing the dying. Such as creating a strong, honest, and competent central government in Afghanistan, a country which never has had a strong, honest, and competent central government, at least in our lifetimes. And a country where it wouldn't make any difference to America if there was a strong, honest, and competent central government.
Number three is that Washington stop lecturing other nations about democracy while sucking up to corrupt thugs who jail anyone foolish enough to support democracy there. You know, like the Saudi royals. It's a great scam — they promote ascetic Islamic lifestyles at home while enjoying licentious playboy lifestyles abroad. The U.S. also supports crooked autocrats throughout Central Asia. A little hypocrisy might be necessary in international relations, but American officials tend to engage in ostentatious hypocrisy, which unfortunately is noticed around the world.
My fourth wish is for my conservative friends who claim to believe in individual liberty and limited government to stop campaigning to toss people in jail for smoking marijuana and stop glorifying participation in deadly and destructive wars. It may be stupid to use drugs — though not obviously more so than to use alcohol and tobacco — but that's not a good reason for filling America's prisons. War is the ultimate big government program. Not to mention the fact that killing people always should be a last resort, not something to engage in when one has a midnight brainstorm after consuming a quart of one's favorite ice cream, which probably explains most of Newt Gingrich's crackpot pronouncements.
Number five on the unfulfilled Christmas list is that my liberal friends who say they believe in "choice" apply the same principle to issues other than sex. Like choosing to engage in economic acts among consenting adults. To use one's own earnings how one wishes, even if that means being selfish, greedy, obnoxious, and just not very nice. To purchase firearms to defend oneself from criminals. To engage in even "offensive" free speech. In short, to live pretty much as you'd like so long as you aren't violating other people's rights.
Coming in sixth place is my desire that conservative Republicans who have trouble staying married or staying faithful to their wives — and especially who have trouble doing both — shut up about family, marriage, fidelity, religion, morality, and especially Western civilization. I used to hope that they would just slink away if they were on their second wife. But Santa consistently refused to provide that gift, so I'm now asking for less. Could Republicans shut up about these things if they were caught cheating on their second wife? (I call it defining deviancy down, or "Newting" for short.)
At seven is my hope that my fellow Christian believers will get over their feelings of persecution. Yes, much of elite culture is unreservedly hostile. More ominous are occasional legal attempts to limit religious activity. However, the First Amendment remains a powerful bulwark against state interference, a protection lacked by people in other lands. Christians need to do more to reclaim the culture instead of just complaining about its decline.
Moreover, hundreds of millions of believers in nations as diverse as China and Pakistan, North Korea and Iran, and Burma and Saudi Arabia face brutal, sometimes murderous persecution. George W. Bush's needless war in Iraq and this year's "Arab Spring" have unleashed successive waves of new persecution against Christians and other religious minorities. I have stood amid the rubble of wrecked churches in Indonesia and Pakistan. Christians in such nations know what persecution really is. In contrast, yesterday, Christmas Day, tens if not hundreds of millions of Americans safely attended religious services of all kinds.
In eighth place is my wish that members of the bipartisan War Party stop smearing their opponents as isolationists. There is something strange about people who joyously propose bombing, invading, and occupying nations around the globe claiming to be internationalists. The real internationalists are those who argue that the best forms of global involvement do not involve slaughtering other peoples. No doubt, there are a lot of bad folks whose deaths make the world a better place. Saddam Hussein for one. But it is not clear that the benefits of his death outweigh the tragedy of some 200,000 Iraqis killed in the ensuing civil strife. And it certainly wasn't America's place to decide that "the price is worth it," as then-UN Ambassador Madeleine Albright described her view of the deaths of Iraqi babies due to U.S.-supported sanctions.
Wish number nine: partisans of all stripes should stop demonizing their opponents. Bill Clinton had a pretty disreputable marital life. Nevertheless, he was a smart guy, interested in policy, and with great but sadly unfulfilled potential. He also managed to stay married, in contrast to so many holier-than-thou Republicans. He deserved to be impeached for committing perjury, but he was not the president most deserving of that fate: think Richard Nixon, who shamelessly abused the trust placed in him.
George W. Bush was a big spender who made tragically foolish international decisions. He was a poor decision-maker who should have stayed a baseball owner. But his personal life was exemplary; he treated people decently. He was a bad president, not a moral monster. There are lots of reasons to disagree with Barack Obama on policy. But he is bright and engaged, has suffered no hint of personal scandal, is known for treating staff well, and gives no sign of being anything other than a patriot. He is liberal, yes, but certainly not a "socialist thug" as one embittered conservative columnist described him, let alone the evil incarnate that so many conservative emailers suggest.
All of these presidents deserved determined opposition from people who believe in limited government and individual liberty. However, none deserved to be targeted by an increasingly vicious political campaign of personal destruction.
Last but not least, to paraphrase that great political philosopher Michael Jackson, everyone should look at the person "in the mirror" before rushing off to demand some politician somewhere do something. Compassion originally meant to "suffer with," as Marvin Olasky explained in his book The Tragedy of American Compassion. Compassion should require giving both of one's money and of oneself. It should not mean stealing from others, even for alleged good works.
Reform of all sorts should start at home and in community with our friends and neighbors. Children must be raised, morality must be taught, needs must be met, lives must be healed, problems must be solved. There is a role for government, but it should be the last resort. We live within concentric rings of people and institutions. At the center are individuals and families and we move outward as we relate to and cooperate with others. The national government is the outer ring, like the planet Pluto in our solar system. We should go there only after everyone and everything else has failed.
These ten seemed like pretty reasonable wishes to me. I don't know why Santa was so uncooperative. He didn't give me even one of them. But there's always next year! I'll mail my letter to Santa earlier next time. Maybe then Santa Claus will make an appearance at my house next Christmas.

2012-07-24

America As a Constitutional Republic: When Can the President Kill?

by Doug Bandow at http://www.cato.org/publications/commentary/america-constitutional-republic-when-can-president-kill

The U.S. has been fighting the “war on terrorism” for more than a decade. Thousands of Americans have died, both in the 9/11 attacks and Washington’s wars in Afghanistan and Iraq. The Constitution also is under assault, as successive presidents have asserted extraordinary and unreviewable power in the name of combating terrorism.
Washington even has turned targeted killing—or assassination—into routine practice. U.S. SEALs are used when the job needs to be close and personal, like the mission against Osama bin Laden. But drones have become the tool of choice, widely used in Pakistan, Yemen, and elsewhere.
This new form of warfare raises fundamental questions for a democratic, constitutional republic. International law bars arbitrary killing. Domestic law further restricts the execution of U.S. citizens. Moreover, promiscuous assassinations move foreign policy into the shadows, reducing the opportunity for a full public debate over issues of war and peace.
In traditional conflict the opposing sides are reasonably clear. Not so in the “war on terrorism.” Is this fight traditional war, law enforcement, or a new hybrid? If the latter, what rules apply? What should be done if there are no obvious battlefields and no certain combatants? Should propagandists be treated as fighters? Are any procedural protections required before a U.S. citizen can be killed?
These issues ended up in federal court in August 2010 when Nasser al-Aulaqi filed suit seeking a preliminary injunction to prevent the Obama administration from killing his son, Anwar al-Aulaqi. The latter, an American citizen living in Yemen, had been added to a federal “kill list” four months before. Judge John Bates dismissed the lawsuit on procedural grounds, ruling that Nasser al-Aulaqi lacked “standing” to sue and that the so-called “political question” doctrine prevented the court from deciding the issue. Last September Anwar al-Aulaqi was killed by a Predator drone.
The Constitution is the fount of authority for the national government and protects Americans even when they are overseas. The Fourth Amendment regulates the seizure of citizens, who are to be “secure in their persons.” The Fifth Amendment mandates that no one can “be deprived of life, liberty, or property, without due process of law.” Other constitutional provisions cover prosecuting traitors and imposing bills of attainder (the former requires the testimony of two witnesses; the latter is prohibited).
The Alien Tort Statute and the Torture Victim Protection Act also bar arbitrary killing. Moreover, this principle has been incorporated into customary international law. Admittedly, “many norms of international law are vague and even border on the vacuous.” Nevertheless, international law reinforces domestic legal restrictions on killing American citizens.
Limits on government are necessary to preserve a liberal democratic order and protect individual liberty. These constraints are most important where state power is most extreme and its consequences are most significant. Like killing people. In 2004 the United Nations Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions observed: “Empowering governments to identify and kill ‘known terrorists’ places no verifiable obligation upon them to demonstrate in any way that those against whom legal force is used indeed are terrorists, or to demonstrate that every other alternative has been exhausted.”
So can the U.S. government kill its own citizens, like al-Aulaqi? Ryan Alford, a professor at Ave Maria School of Law, observed: “It is beyond peradventure that the Framers never intended to invest the president with the power to order a citizen’s execution without trial.”
Yet police sometimes shoot and kill without trial. Doing so is legal, but requires a powerful justification. The same principle applies to combating terrorism.
The U.S. government may prosecute citizens for many reasons. Committing treason, for instance. Supporting organizations which threaten the U.S. Perhaps even for serving as propagandists for America’s avowed enemies. But none of these activities would warrant secretly placing the person’s name on a “death list,” especially without a conviction or other adjudication of guilt by an objective body. Even Jeh Johnson, the Department of Defense General Counsel, observed that simply embracing al-Qaeda’s ideology would not be enough.
In contrast, joining enemy armed forces and fighting U.S. forces would allow the U.S. government to target a U.S. citizen. But Anwar al-Aulaqi was living in Yemen where no U.S. troops were fighting, unlike in Afghanistan and Iraq, had joined the equivalent of a gang rather than an army, and was not involved in traditional combat.
Nor did the authorization to use military force adopted by Congress after 9/11 cover al-Aulaqi. The resolution authorized the president to use “all necessary and appropriate force” against those who “planned, authorized, committed, or aided” the 9/11 attacks in order to “prevent future acts of terrorism.” As such, the AUMF targeted al-Qaeda and those who attacked America a decade ago. Al-Aulaqi did not leave the U.S. until 2002, settling in Yemen two years later.
Outside of active combat, especially in a declared war, when can Washington kill American citizens? Only if the government can demonstrate a compelling interest subject to what the U.S. Supreme Court terms “strict scrutiny.” That means the targeted citizen must pose an imminent threat to life (or threaten serious physical injury) and killing him or her must be a last resort.
International law embraces similar concepts. One is proportionality—such as responding to a threat to life. Another is necessity—which reflects imminence and last resort. Also considered is precaution—which requires planning to limit the recourse to lethal force. Thus, under both domestic and international law, the American government can execute people, including American citizens, only for the most important of reasons and when there is no reasonable alternative to doing so.
Were these criteria met in the case of Anwar al-Aulaqi?
The administration insisted they were. It called him chief of operations for al-Qaeda in Yemen. It said he personally instructed a suicide bomber in 2009. It claimed he was “intimately involved in the attacks that have come closest to hitting the United States.” It contended that he had a “direct role in supervising” the attempt to send mail bombs to America. It asserted that he pushed al-Qaeda (Yemen) to attack the U.S., something he “said publicly was his goal.”
If these allegations are true, al-Aulaqi threatened the lives of Americans. One could imagine someone joining a completely ineffective terrorist-wannabe group, which might not justify a deadly U.S. government response. However, while al-Qaeda (Yemen) thankfully so far has achieved little practical success, it is not for want of trying. Washington should not be restricted to playing defense, hoping to always be lucky in foiling new terrorist plots. By his conduct al-Aulaqi created a presumptive danger to America.
However, the government was not attempting to preempt any particular plot. Was the threat imminent, especially since names apparently are entered on the “kill list” for months or years, without apparent regard to potentially changed circumstances?
Regarding al-Aulaqi, Cato Institute Chairman Robert Levy said bluntly: “The imminent-threat contention isn’t credible.” There is no obvious reason why it was necessary to kill al-Aulaqi on September 30, 2011, rather than October 30 or November 30. Indeed, only rarely is the government likely to have reliable knowledge of an upcoming plot of the sort necessary to demonstrate “imminence” in a particular case.
Nevertheless, membership in a hostile, violent terrorist group engaged in an ongoing campaign to harm Americans arguably creates a substitute form of imminence. In essence, al-Aulaqi’s actions shifted the burden of proof. Take a leadership role in a group dedicated to attacking Americans and you can be presumed to pose an imminent threat to kill or commit great bodily harm. Membership in al-Qaeda (Yemen) joined intent with action.
Finally, was assassination a last resort — could al-Aulaqi have been captured? The U.S. government has successfully prosecuted other individuals for terrorist activities. Trying al-Aulaqi instead of killing him would have showcased America’s commitment to the rule of law.
But it obviously is easier to capture a fugitive in the United States, where the government (at whatever level) has full authority. Even with the cooperation of foreign governments, it is much more difficult to grab someone overseas, especially if he or she has friends in the local police, military, or intelligence services. In fact, even attempting to capture someone might require a significant military operation. It would be ironic if the Constitution were interpreted to bar use of a drone to kill a person while justifying a large foreign expedition to capture the same person.
In short, if the administration’s claims were true, the al-Aulaqi killing probably met constitutional requirements. However, simply saying it is so does not make it so. My Cato Institute colleague Julian Sanchez warned of the tendency to treat such assertions “as ironclad facts rather than contestable inferences from necessarily patchy data—even though the past decade should have made it abundantly clear that analysts sometimes get it wrong.”
Washington policymakers have commonly relied on discredited intelligence claims. Consider the catastrophic war against Iraq. The credibility of foreign sources must be weighed. Competing intelligence must be balanced.
Indeed, this is why trials are held on criminal charges: juries assess witness credibility and compare conflicting claims. In Hamdi v. Rumsfeld, the U.S. Supreme Court even ruled that “a citizen held in the United States as an enemy combatant [must] be given a meaningful opportunity to contest the factual basis for that detention before a neutral decisionmaker.” It would be ironic if it were easier to kill than to imprison a U.S. citizen.
Administration claims regarding al-Aulaqi have been challenged. Gregory Johnsen, a Yemen specialist, contended: “Certainly, Aulaqi was a threat, but eliminating him is not the same as killing Osama bin Laden.” Johnsen pointed out that al-Aulaqi was not the head of al-Qaeda (Yemen), in charge of military operations, or even the organization’s top religious scholar.
Explained Johnsen: “Rather, he is a mid-level religious functionary who happens to have American citizenship and speak English. This makes him a propaganda threat, but not one whose elimination would do anything to limit the reach of the Qaeda brand.” Indeed, added Johnsen, “Mr. Aulaqi’s name may be the only one Americans know, but that doesn’t make him the most dangerous threat to our security.”
How to decide the truth about al-Aulaqi and others like him? The administration apparently produced a 50-page memo citing his operational role in a group viewed as a co-belligerent with al-Qaeda. Washington also contended that his capture was impracticable. What was reasonable and due process, the administration added, had to be determined by consequence.
These are reasonable arguments. But allowing the president and his aides to compile “kill lists” in secret with no charges filed, no outside review of evidence, and no oversight of decisions should leave every American more than uncomfortable. Unreviewable and unaccountable power is inconsistent with a constitutional republic.
Events like 9/11 may justify expanding government power. However, officials still must be held accountable for their use of that power. Yet in cases like al-Aulaqi there is no accountability so long as the government is careful to assert arguments which offer a constitutional justification for targeted killings—that the person posed an imminent threat which could be dealt with no other way—and the courts refuse to exercise oversight.
Even if the president can get away with acting unilaterally, he should not do so. The administration could create a formal process with internal checks and balances. Afsheen John Radsan and Richard Murphy, of the William Mitchell School of Law and Texas Tech University School of Law, respectively, argued that “the government must take reasonable steps based on individualized facts to ensure accuracy before depriving any person of life, liberty, or property,” but suggested that this requirement “might be satisfied by independent, intra-executive review.” In fact, Jeh Johnson contended: “Within the executive branch the views and opinions of the lawyers on the president’s national security team are debated and heavily scrutinized.”
However honest such an internal review, it is not enough. In the case of al-Aulaqi, the administration should have released its decision memo. It need not reveal any sensitive intelligence. But the government’s arguments should be available for public review. Chicago Tribune columnist Steve Chapman complained that the president “saw no need to bother” to make the case that al-Aulaqi “posed a clear threat to American lives and that the missile was the only feasible way to avert it.” The president should have made the case.
Moreover, the nation’s founders created a system with numerous checks and balances to constrain government irrespective of who was in office. Argued Robert Levy: “The separation of powers doctrine, if it means anything, stands for the proposition that citizens cannot be killed on command of the executive branch alone, without regard to the Fourth and Fifth Amendments.” Institutionalizing stricter safeguards is imperative today, given the new forms of warfare which have come to dominate U.S. policy.
Electronic surveillance of foreign powers and their agents, which could include Americans, posed a similar challenge. In 1978 Congress passed the Foreign Intelligence Surveillance Act. FISA allows surveillance of foreign parties without a court order, but requires a warrant, through a special court which hears the case in secret, when Americans are involved.
Congress should create a similar process for targeted killings. Legislators should establish special National Security Courts to grant formal Assassination Warrants. The government would have to demonstrate that a serious threat was imminent and there was no reasonable alternative to a targeted killing. Judges would be trained to assess intelligence claims. A warrant would allow the government to place a name on an official “kill list.” The warrant would sunset after a period of time—six months, perhaps—after which the government would have to return to court to renew the warrant.
Admittedly, “assassination warrants” would seem grotesque in a free society. The fact that the threat of terrorism has generated new forms of war which undercut Americans’ liberty provides another reason to rethink an interventionist foreign policy which encourages terrorism. Promiscuous intervention by Washington has left the U.S. less secure in recent years. An activist foreign policy also is undercutting America’s heritage of liberty.
As long as Washington responds to terrorism with extreme countermeasures, such as targeted killings, new procedures are necessary. At least judicial review would force the government to make a proffer of proof to someone independent of the executive branch. Moreover, specialized training would enable jurists to ask the right questions. Executive authority might remain excessive and subject to abuse, but it would no longer be essentially limitless.
Osama bin Laden and his fellow terrorists have lost the war on terrorism. However, their attacks have transformed the U.S., threatening the liberties as well as lives of Americans. There is no greater government power than to order someone’s death.
In killing Anwar al-Aulaqi the administration may have acted constitutionally. But even if so, it did not act consistently with a free society. Congress should create additional safeguards.
Americans must never forget that we are securing a democratic republic, a system based on protecting individual liberty. If we fail to preserve the freedoms which make America unique and worth defending, the terrorists truly will have won.