Should Government Implement Net Neutrality?

This is a slightly modified version of a speech I gave in a debate on Net Neutrality that I did with my partner, Louis Saintvil on January 18, 2018. The debate was sponsored by MicGoat and was an Oxford-Style debate. My partner and I won by swaying more of the undecided vote: our side began with 4 votes and ended with 9, while the opposition began with 9 votes and ended with 11.

You can see the debate on YouTube here.


I want to begin with an ethical explanation for why Net Neutrality is not valid legislation. As with any ethical discussion, a proper context is required. To that end, we need to define and understand three key concepts: the nature of man, the nature of government, and the nature of individual rights.

Let’s look at the first of these: the nature of man. In the words of Aristotle, man is the rational animal. He survives by the free exercise of his reasoning mind. A man may feel hunger on instinct, but only his reason will tell him how to distinguish food from poison. A man’s body may suffer from infection without his thinking about it, but only his thinking about it can produce medicines to counteract the disease.

What is it that prevents man from exercising his reason? Physical force. Coercion. You can’t think if someone else brandishes a gun your way or threatens to break your legs should you come to the wrong conclusion. Force is anti-mind and anti-reason.

To summarize this point: man uses reason to survive and force is opposed to reason.

The next concept to examine is the nature of government. What is government? Government is an organization with a monopoly on physical force in a given geographic area; coercion is the essence of government. If you think otherwise, try ignoring a call from the IRS or disobeying a traffic cop sometime to see just how voluntary government mandates are.

To summarize this point: the essence of government is force, and a moral government seeks to prohibit the initiation of force in civilian life.

What is government’s proper relationship with man? To answer this, we must discuss our third key concept: individual rights.

Rights are principles that define an individual’s freedom to act in a given social context; they are a bridge between ethics and politics. The theory of individual rights recognizes that an individual’s life is his own, and that the individual is not to be expropriated by those who would sacrifice him to the ends of the group, the tribe, the race, the class, “God” or “society”.

Legitimate rights are concerned with freedom of action, and are not guarantees of free goodies. This is worth repeating: rights do not entitle a person to the products or services created by others on the grounds of need or whim. Rather, rights secure what a person creates by his own effort and allow him the freedom to trade with others.

There is no right to a “fair” wage, only a right to hire someone on the basis of agreed employment terms.

There is no right to a “competitive” price, only a right to produce and negotiate the terms of sale.

There is no right to happiness, only a right to the pursuit of your own happiness by your own effort.

Rights provide us with an objective means of rating a government. If government protects citizens from those who initiate force, it secures their rights and is a critical ally of man. If government initiates force against man so as to exploit what he creates, it becomes even worse than the two-bit thugs who hold him at gunpoint.

History is littered with governments that used force to oppress the people within their borders. Examples include the absolutist monarchies of Europe, the current theocratic states in the Middle East, and the totalitarian states of the 20th century. It wasn’t until the late 18th century that the American founding fathers discovered the proper role of government and gave birth to the United States, the first government that was limited by design.

To summarize this point: rights are principles that delineate individual action in a social context, and government is at its best when it protects man’s rights.

So how does this viewpoint apply to the issue of Net Neutrality? Whenever we examine a government policy, what we need to ask is: is this a proper place to use a gun instead of a comment card, a boycott, or a rational argument?

In the case of Net Neutrality, the answer to that question is a resounding “no!”

Net Neutrality legislation is based on the idea that cheap internet access is a right. In the words of the Orwellian “media watchdog” Free Press:

“What we want to have in the US and in every society, is an Internet that is not private property, but a public utility. We want an Internet where you don’t have to have a password and that you don’t pay a penny for. It is your right to use the Internet.”

Like all phony rights, Net Neutrality is an attempt to guarantee people access to something that they didn’t produce without first asking: at whose expense?

Internet service is not something that simply exists in nature, ripe for redistribution and smarmy government guarantees. Net Neutrality legislation treats this expensive infrastructure as a given. It evades and invalidates the rights of the internet service providers (ISPs) who make the internet possible.

Advocates for Net Neutrality will counter that it protects free speech. If you want to post controversial blog posts or Facebook comments, proponents of the law argue, then what stops Verizon or Comcast from preventing that from occurring on their networks? Or even worse, what if your ISP decides to block access to certain websites on the grounds that it objects to the content there? As stated by the ACLU, “freedom of expression isn’t worth much if the forums where people actually make use of it are not themselves free.”

This argument is based on a flawed conception of free speech. All rights are contextual; that is, there is a context in which one’s freedom to act can be limited by the rights of others involved, and this includes property rights. One person’s right to free speech does not mean that others must provide him with the means to express his ideas. Rather, free speech ensures that you are free to spread your views by whatever means you have earned in trade with others.

Free speech allows you to publish content in your own newspaper; it does not entitle you to publish content in someone else’s newspaper. Free speech holds that you can freely rent a lecture hall to speak, not that you are entitled to the microphone at an event that someone else has paid for. Content providers today such as Google or Facebook reserve the right to remove posts from their media at their discretion, and that is their right. By the same token, your right to free speech does not impose any obligations on those that deliver such content to you in the first place.

Even so, say Net Neutrality spokespeople, what about corporate influence in America? It is a long-accepted bromide among the political left (and some on the political “right”) that corporations collectively benefit from screwing over their customers and monopolizing whole industries for the sake of “greed.”

This attitude is generally misguided, but in the case of the internet it is totally out of touch with reality. To bring the internet to your home requires myriad cables, satellites, wireless transmitters, servers and other expensive electronic equipment. ISPs have labored for decades to innovate and provide quality service to their customers, moving us from dial-up to DSL to fiber optics and beyond. From 2011 to 2013, the top three providers alone spent over $100 billion on improving their service. From 2005 to 2015, average broadband speeds increased by a whopping 1,150%. All this, without government mandates to enforce Net Neutrality. So much for the mustache-twirling businessman.

The fact is that Net Neutrality is not only unnecessary, but it is unjust and immoral. The internet is an important part of life in today’s world; I do not deny this. Precisely because it is so vital, however, the least we can do for those who make it possible, rather than try to regulate them out of existence, is to say if only once and as a whisper: thank you.

Is It A Proper Role of the Government to Provide Healthcare to Its Citizens?

This is an essay based on a motion debate from June 2017. My friend Chuck Braman and I defended the con side of the debate and argued that it is NOT proper for the government to provide healthcare for its citizens. This was another victory for the two of us as a team, but it was a lot closer than the previous debate on education. This time around, the same number of people shifted their vote on both sides, but because we came in with a far smaller percentage of the total, we were declared the winners.

The division of labor was the same as before: Chuck established the ethical context for our position and I supplemented it with economic and practical arguments. This essay has been lengthened to include some of Chuck’s arguments for the benefit of those that were not present.


America is obsessed with healthcare, and has been for nearly a century. In particular, activists and ideologues have led the charge with demands for government health guarantees, especially for those who are unable to afford care. The alternative, these people hold, is to simply allow children and the elderly to die in the street while well-fed elites enjoy the latest technology and treatments. It has been said that the free market has failed to provide Americans with adequate healthcare and that the government must now step in to correct the issue.

This view is entirely mistaken.

I’d like to disabuse the reader of a common misconception that is popular with both political parties: the idea that we have tried the free market in medicine and that it has failed to deliver results. The truth is that we do not have a free market in medicine in the United States today, and have not had one for much of the past century. The US is a mixed economy with some freedom and some controls in the healthcare industry. Over time, the controls have continued to grow at the expense of Americans’ freedom.

Government intrusion into healthcare began with occupational licensing laws passed in the late 19th century. These regulations determine who can practice medicine by making it illegal to do so without a license. Medicine is already a demanding profession, as it requires considerable training and skill to become a doctor, yet these laws make that harder still. In the name of “high standards,” the state requires doctors to jump over regulatory hurdles in order to gain regulators’ permission to practice. Licensing laws overrule the independent judgment of dedicated doctors and savvy consumers and leave the market subject to the whims of the bureaucrats. The result is an artificially restricted supply of doctors, and therefore a higher cost for their services.

Licensing laws are only the beginning. The next step in the march towards more government in healthcare was the Stabilization Act of 1942, which froze wages nationwide during World War II. Employee benefits, such as health insurance, were not considered wages under the law and were therefore exempt from the freeze. Employers who sought to attract better talent were incentivized to offer health insurance during this period as part of their compensation packages. The practice was made permanent with the passage of the Internal Revenue Code of 1954, which enshrined the tax benefits for these compensation packages.

The Stabilization Act brought us the modern link between health insurance coverage and employment that is bemoaned by many of today’s unemployed millennials.

The beat goes on, as Medicare and Medicaid were passed in 1965. These two programs taken together implemented full-on socialized medicine for the elderly and low-income populations in the United States. Years later, President Nixon declared a “war on cancer” with the passage of the National Cancer Act. Bill Clinton later took the baton from Tricky Dick in the 90’s when he earmarked more federal funds to cure cancer. The spent dollars are gone, but cancer remains a potent cause of death in America.

In 1986, under the supervision of the allegedly capitalistic Reagan administration, Congress passed the Emergency Medical Treatment and Active Labor Act, also known as EMTALA. EMTALA mandated that hospitals could not refuse emergency room care to any patient in need, and required doctors by law to treat patients regardless of their ability to pay. In the 1990’s, attempts by the Clinton administration to further increase government involvement in healthcare were narrowly defeated by the Republican majority in Congress.

This victory was short-lived, though, as the later Obama administration succeeded in passing the Affordable Care Act. Known colloquially as Obamacare, this bill has made health insurance mandatory for American citizens. In one of the most shameful Supreme Court decisions in history, the bill was upheld based on the government’s power to tax. Despite promises to repeal Obamacare, President Trump has no intention of slowing the growth of government in this sector of our economy.

This gallop through healthcare history explains how we got what we have today: a bloated, chimeric monstrosity bred from a mixture of cronyism and big government. Today’s health care industry was not birthed by the free market, but by the enemies of freedom. What are the consequences of such a system?

One is the creation of an enormous, unnecessary bureaucracy that would make even the protagonist of Brazil recoil in horror. In the healthcare industry, red tape is the norm as clerks that are responsible for processing payments and dealing with insurance forms outnumber nurses two to one. The same clerks outnumber the physicians by an outrageous nine to one! In case these figures make you think there may be too many nurses on staff, note that it is estimated that nurses spend nearly 35% of their time on documentation alone.

A typical hospital spends over 38,000 man-hours a year dealing with the billing requirements for Medicare. It is easy to see why: Medicare has over 130,000 pages of rules and restrictions that need to be applied whenever a patient uses it. Studies have found that for every hour spent caring for a Medicare patient, half an hour must be spent on paperwork. Many doctors no longer accept Medicare, and those that do are stuck with the high overhead cost it imposes. Meanwhile, the FDA prohibits the manufacture and sale of new medical devices and prescription drugs until it completes tests that take an average of 16 years. In the interim, many who would otherwise have had early access to life-saving technology perish while awaiting the seal of approval from some Washington bureaucrat. Such facts are enough to warrant changing the meaning of the acronym “FDA” from the “Food and Drug Administration” to the “Federal Death Administration.”

American lives are not the only thing spent on the healthcare industry. In 2014, Medicare and Medicaid together cost the country nearly $1 trillion; that is nearly 20% of the annual federal budget. Medicaid alone is the single most expensive budget item in many states. The trend indicates that this will only get worse: it is estimated that by 2029, 20% of federal dollars will be spent on Medicare, and by 2041 the figure is expected to be closer to 25%. Obamacare, the most recent healthcare legislation, on its own cost taxpayers $3 billion in 2016. Was this due to a new advance in technology or perhaps a new line of miracle drugs? No, the costs were due to penalties for not having purchased health insurance.

Is so-called “single-payer” (read: full socialized medicine) any cheaper than the hybrid system that we have today? Let’s look at some example cases to find out.

California recently sought to enact a single-payer healthcare system, with an estimated cost of $400 billion. The only problem was that the entire state budget was $183 billion at the time. Vermont also tried to implement a single-payer healthcare system in 2014, but it too failed due to the high cost. Taxpayers would have been on the hook to fund a $4.3 billion program in that state. How did the government of Vermont propose to pay? An 11.5% payroll tax on all businesses, and an income tax as high as 9.5% for individual taxpayers on top of it. In Colorado, voters rejected plans for a single-payer system in 2016 when it was revealed that a 10% payroll tax increase would be needed to meet the $25 billion price tag.

The facts speak for themselves. Is single-payer cheaper? Hell, no!

What makes it so expensive? People are misled to believe that healthcare allegedly provided by the government is free when it isn’t. Nothing that requires human effort and the use of man’s reason is “free”; someone has to produce it. Yet in America, customers are insulated from the true costs of healthcare and health insurance by a looming third-party-payer environment that is in cahoots with the government. If and when this colossus fails to deliver, left-wing activists say, what we need is a complete government takeover to eliminate the despised “middleman.”

The truth is that even in single-payer government systems, government does not produce anything. All it can do is redistribute by force what has been created by successful, productive citizens. The healthcare industry in the United States is inspired not by socialism on the Bolshevik model (communism), but socialism on the German model (fascism). The former is a system wherein the government completely owns the means of production outright, and the latter is a system wherein the government forces people with nominal “property rights” to do its bidding. Both systems are opposed to laissez-faire capitalism and both systems obliterate private property rights.

The advantage of the fascist model is that by preserving the trappings of a capitalist system, it is more difficult to see its totalitarian nature. It also makes it that much more difficult to see the true costs, since government’s ability to borrow and print unlimited money serves to obscure what is really happening. People tend to spend more when they are not given the bill to settle at the end. Before Medicare, nearly 55% of healthcare spending was out of pocket; by 2010 that figure had dropped to just over 10%. For every dollar of care that a patient receives today, on average he personally pays only 14 cents. Who covers the rest? The benevolent government, a.k.a. “the public,” a.k.a. the cash-strapped taxpayer.

The final and arguably most morally devastating sin I will attribute here to government meddling in the healthcare industry is the massive wedge that it drives between doctors and their patients. Under the free market, the focus is on individuals: individual patients treated by individual doctors. The standard is individualized care. Under socialized medicine, the focus is on collectives: a collective of patients that is owed the services of a collective of doctors. The standard is collectivized care.

Collectivized care holds that what matters is what the FDA thinks is good for all patients, not what an individual doctor thinks is good for his patient.

Collectivized care holds that what matters is overall spending, not what an individual consumer should spend to satisfy her healthcare needs.

Collectivized care holds that what matters is what people need, not what they have earned.

The activists will argue that healthcare is a right, and that the state has a responsibility to provide affordable healthcare to its citizens. But there can be no right to the products or services produced by others. Such “rights” only enslave the producers to the consumers. To see the absurdity of such an argument, take the principle that doctors owe care to whoever claims a need and apply it to other industries that serve a vital need, say restaurants. What if restaurant owners were required to serve free meals to anyone that came in claiming to be hungry, regardless of their ability to pay? What about clothing stores being required to let anyone who needs a new pair of high heels or the latest swimsuit come in and take it without compensation? Such a principle leads necessarily to the virtual enslavement of the doctors on the grounds that the products of their minds are owed to the first person who demands them.

But the doctors are not the only ones that are enslaved by government control of medicine; so too are the consumers who buy it. Once the premise that government ought to pay your medical bills is accepted, it follows logically that government has a stake in lowering the costs by whatever means it deems appropriate. This is done by controlling behavior, either by limiting “unhealthy” activities or “encouraging” healthy ones.

As an example of the former, consider the onerous taxes on smoking and soda. People genuinely get pleasure from these products, and they are available at a price that even poor people can afford. In the name of public health costs, the government thinks it fitting to make these things harder to buy. California has a “zero tolerance” policy on after-school bake sales on the grounds that the children are too obese. The message is clear: “you do not own your life, and the state has the authority to tell you what is good for you.”

Other countries in the world are further along the authoritarian road than the United States. They impose even more humiliating restrictions on their people and annihilate even a pretense of privacy. In Japan, the government measures your waistline once you are over 40. If you are deemed too fat, the government dictates that you undergo “reeducation” and lose weight; otherwise, you pay stiff fines. Germany publishes an annual list of people with high health care costs and labels them “antisocial.” New Zealand has turned people away from its borders if they are obese, on the grounds that their healthcare bills would be too high. How is that for fat-shaming?

We do not need this nonsense in the United States; what we need is a free market in medicine.

There is ample evidence that markets lower costs and raise quality, even in the healthcare industry. While healthcare on the whole has risen in price, elective procedures like LASIK and cosmetic surgery have gone down in price and up in quality year after year, even though they are not covered by insurance. In 1998 LASIK cost $2,200 per eye; by 2014 it cost a mere $300 per eye. In the case of cosmetic surgery, the three most popular procedures (Botox, laser hair removal, and chemical peels) have fallen in price by double digits since 1998. For those procedures not covered by insurance which have seen increased spending, the increase averages 32%, compared to the industry-wide 47.2%.

Advocates for government-run healthcare seem to believe that no matter how many regulations and controls they place on the healthcare industry, medical care will be readily available to all those that want or need it without regard to cost or the rights of the providers, but this is not true. Health care is not exempt from the laws of economics. It is not manna from heaven.

If you want everyone in society to have bread, your first priority would be to respect the rights of the bakers who toil in kitchens to bake it.

If you want everyone in society to have clothing, your first priority would be to respect the rights of the weavers who sit at looms to make it.

If you want everyone in society to have iPhones, your first priority would be to respect the rights of the engineers who design and build them.

The same reasoning applies to those that provide healthcare. Leave doctors free; it’s the healthy thing to do.

Trump is Right…Health Care is Complicated!

Unfortunately, that complexity does not stop people from trying to run health care through government.

In a recent meeting with state governors, President Trump admitted that implementing government health care is a tricky business and that Republicans are having a hard time pulling the plug. One would think that the wayward health care law, which is responsible for skyrocketing insurance premiums and deductibles, would be easy to ax. Even the standard-issue, rose-colored glasses worn by Democrats cannot hide the fact that the bill has been a complete failure. Whence the trouble?

I remember when Obamacare was first rolled out to the public as a 20,000-page bill that nobody in our government seemed to have read. Nancy Pelosi clumsily explained that we would need to pass the bill in order to know what was in it. Healthcare.gov ended up costing three times what was expected and launched months after the intended date due to mismanagement. I even remember when Obama said that Americans could keep their doctors if they wanted. We know how that ended up.

Conservative and libertarian Americans flocked to Tea Party protests to oppose Obama’s attempts to expand socialized medicine, and today Americans cite “health care” and “dissatisfaction with government” as two of the largest issues facing the country. This was a healthy reaction, but there were warnings that the law, if successfully passed, would be difficult to dislodge. Wary observers may recall one Tea Party participant’s eloquent sign: “Keep your government hands off my Medicare!” The promise of free goods can corrupt even those Americans that are most suspicious of government.

During the 2016 campaign, Trump went so far as to say he was a “fan of the [individual] mandate” and that the more popular aspects of the law would remain. Today, Republican voters that oppose “socialized medicine” clamor to keep the parts of the bill that would help them personally. As Trump aptly put it: “People hate it…but now they see that the end is coming and they’re saying, ‘Oh, maybe we love it.’” “Repeal and Replace” has become “Rebrand and Renege.”

Programs like Obamacare show how government expansion occurs and persists in general. Government-sponsored goodies are as potent and addictive as heroin or cocaine, particularly when people are convinced that they are a right they are owed. To fund the largess, the government raises taxes and imposes restrictions on businesses. Hard-done-by Americans organize into pressure groups to lobby for new handouts to mitigate the ruinous effects of the old handouts. By the end, a single entitlement thread has morphed into a regulatory spider-web.

This was done by design. Leftists for decades have known that Americans would reject explicit socialism, but would handily vote it in piecemeal if couched in the proper language.

Advocates for the free market were always aware that government health care would be a train wreck. What is one to expect when a gang of government bureaucrats, armed with their pens and the entitled screeching of their constituents, takes it upon itself to deliver a complex, modern service to a population as large and diverse as that of the United States?

“You know, health care is a very complex subject,” Trump said. “If you do this, it affects nine different things. If you do that, it affects 15 different things.” Couldn’t agree more, Mr. President.

Is It The Duty Of The Government To Educate Its Citizens?

Government has no business in education and should get out of it.

This is an essay based on a motion debate that I recently took part in. My friend Chuck Braman and I defended the con side of the debate and argued that it is NOT proper for the government to get involved in education. We won the debate by swaying more audience members to our side, based on votes taken before and after our speeches.

The division of labor was such that Chuck addressed the ethical issues concerned with government involvement in education while I was tasked with providing the economic arguments. His powerful opening statement can be found here; I encourage anyone interested in this topic to give it a read! Below is an essay based on my speech.


Larry Elder makes the point that government education is similar to an item on a restaurant menu that not even the waitress would order. Roughly 11% of Americans send their kids to private school, but nearly 30% of parents who work in public schools do so. In urban areas such as Chicago, New York, San Francisco, and Cincinnati it hovers closer to 40%. To reiterate, these are government education providers choosing to send their kids to the competing private schools.

What about the government officials themselves? 37% of Representatives send their kids to private school. For US senators, that number is a staggering 45%. President Obama, himself a product of private education, made a big show of vetting DC public schools when he was elected. After all of the hullabaloo, he sent his daughters to the most elite private school in the capital. If government education is so great, why do its biggest advocates avoid it like tap water in Mexico?

The reason is that empirically, government education has been a total failure.

Consider the money first. Over the four decades from 1970 to 2010, spending on education increased by 375% while test scores stagnated. We spent a total of $934 billion on public education in 2013 alone. Overall, the US spends about 7% of its GDP on education. That works out to a little over $15,000 per head, all in.


You would think that such figures would mean that we had a fairly educated populace, right? Think again.

The US administers the National Assessment of Educational Progress (NAEP) exams every four years. The test is scored out of 500 and is meant to determine how proficient US students are in a variety of subjects. What does the performance on these tests look like for high school seniors, who have been through the rigmarole of twelve years of government schooling?

In history, 50% of seniors place below “basic” and a mere 12% are deemed either “proficient” or “advanced.” In science, 79% of seniors failed to show “proficiency.” In reading, 26% of seniors scored below the minimum. You read that correctly: nearly a quarter of the students that graduate from government education are, for all intents and purposes, illiterate! These stats are all well-documented here and here and here.

Further evidence of the ignorance bred by government schools can be observed in various polls and surveys. For instance, 42% of Americans think that the slogan “From each according to his ability, to each according to his need” appears in one of America’s founding documents. That is, they do not know the difference between the Constitution and the Communist Manifesto. Another poll demonstrated that 18% of millennials were not familiar with Soviet mass-murderer Joseph Stalin. The same poll showed that 32% did not know who Marx was. A frightening 42% were probably under the impression that Mao Zedong was an item you might order from a Chinese restaurant, since they were not familiar with the communist dictator and his oppressive regime.

In 2003, the National Assessment of Adult Literacy conducted by the US Department of Education found that 14% of adults scored “below basic” on the exam, making them virtually illiterate. That’s nearly 30 million people, 45% of whom graduated from high school!

Far from being “the bedrock of our democracy”, the US Department of Education came into existence in 1979, during the Carter years. Before that, in 1940, the US had a literacy rate exceeding 97%, even though most of the population had no more than an 8th grade education. Now, nearly 60% of graduating seniors in the US who enter community colleges require remedial education.

Why is the education system producing such abysmal ignorance, in spite of such high spending? The answer is that government is not suited to educating anyone, let alone impressionable children.

Education is another aspect of raising children. Americans expect that parents will shelter, clothe and feed their children without complete government control; why not allow for the same thing in education? In principle, there is no difference between a child with an empty mind and a child with an empty stomach: both needs ought to be served by the parents, not the state.

On most days, liberals and conservatives alike oppose monopolies and will ask the government to interfere in order to prevent their formation. However, the government education system is a monopoly that exercises absolute control over the quality of teachers and the material that is presented. There are no truly “private” schools since each one exists with government permission. Instead of viewing the parent as a client with needs to fulfill in a market setting, the school bureaucrats see parents as obstacles in their way.

The alternative to coercive, government education is voluntary education on the free market. Advocates for free market education do not trivialize its importance by asking to get government out; we hold that education is too important to let the government in. Free markets allow individuals to patronize those establishments that provide the best value at the lowest price. This is the way to establish rational, objective standards in education. Government standards, on the other hand, are based on the arbitrary whims of bureaucrats rather than what is actually demanded in the market. According to the prevailing view, an Ivy League scholar with multiple degrees is less qualified to teach than someone whose bachelor’s degree in education included coursework in “bulletin board” design.

Compare education to another, relatively freer industry: technology. Today we carry within our pockets micro-machines that are more powerful than mainframes that took up entire rooms just 30 years ago! 81% of households below the poverty level have access to these miraculous devices. This technology was created not by government bureaucrats, but by entrepreneurs who sought to make a profit by providing value to paying customers.

What would a free market education look like, you may ask? Coercive government education dominates the industry today and renders this question difficult to answer. As in many industries, it is nearly impossible to know exactly what innovative solutions would be implemented, but there are glimpses that occasionally break through here and there.

The internet allows for low cost teleconferencing, recording and podcasting. People can get lessons on the go, or retake courses that they have trouble understanding. Students can view worked examples on YouTube as effortlessly as their parents can view movies on Netflix or Amazon. Indeed, there is a burgeoning industry for private tutoring which features companies such as Hooked on Phonics, Varsity Tutors, Coursera, Rosetta Stone, Khan Academy, Lynda.com, and many others. The trend is that technology is rendering government education largely obsolete and unable to compete.

Meanwhile, companies like Boeing, Apple, IBM and Google already teach summer workshops and seminars for students free of charge. If government were out of the picture, many tech companies would be able to invest in computer science academies that specialize in teaching students how to use the best-selling products of today and program the revolutionary products of tomorrow. It would be a win-win for the companies and the students, just as one would expect in a situation with no coercion. Tech would not be the only industry to benefit from such an arrangement, though it would certainly be among the first.

All of this, despite the fact that government continues to tax away our earnings in order to subsidize the compulsory government schools. One can only imagine what things would look like if the sacred cow were put to pasture and people were left free to innovate.

Many concerned people may ask: what about the poor? Would they not be able to receive an education if there were no government schools? The critics that take this line seem to forget the fact that most parents, even the poor, love their children and want to see them succeed. In fact, one could argue that the reason some parents may not be as involved in their kids’ education in the first place is because they have been taught by “education theorists” and government policy-wonks that they should stay out of the way and let the state handle it. But I digress.

We can take solace in the fact that education in a free market would be cheaper, since government involvement creates artificial scarcity both by limiting the number of schools that come into existence and by limiting the number of educators via certification. Parents would also have less of their income taxed away and would therefore be able to direct more funds into education if they have kids, and this would include the poor. James Tooley and Pauline Dixon have found that even in third world countries, private educators are finding effective means of educating children at a lower cost than the government schools, and with better results.

If there remains a need for education among the poorest, there is always charity. Even with huge amounts of taxation, charities have given millions of dollars to low-income families. For instance, in NYC a single charity (the Children’s Scholarship Fund) has donated $525 million over the past 13 years. One can only imagine what that figure would look like if the government were not sapping over $900 billion a year from American taxpayers!

The extent to which a market is free is the extent to which individuals are free to offer value for value, without coercion or physical force. When man has the ability to invest and build without confiscatory taxation or hyper-regulation, he is free to unleash the power of his mind. Imagine what the education industry would look like if we allowed the genius of a Steve Jobs, a Henry Ford, or a Thomas Edison to tackle the problem at hand: how to deliver high quality education at the lowest cost.

Education is not a privilege nor is it a right; it is a service, made possible by the effort of those with the ability and the will to provide it.


The Obama Economy: A Study in Keynesian Stagnation

The first in a series of posts about Obama’s legacy.

This is the first in a series of essays on the Obama presidency and its long-term consequences for the United States.


President Obama entered mainstream politics by reviving the rhetoric of class warfare. The community organizer-turned-President promised everyone that the key to prosperity was for “the rich to pay their fair share.” His initial policy was to raise taxes on everyone who made at least $250,000 a year in order to fund more entitlement spending. He equated “millionaires and billionaires” with “lottery winners,” further cementing the well-known sentiments of his “you didn’t build that” speech. Rather memorably, Obama opined that wealth is largely a matter of happenstance that we as individuals have little control over. When he gave his 2008 victory speech, I sat in a Cornell dining hall and listened to Obama tell throngs of cheering Democrats that the United States has “never been a mere collection of individuals” and that we prosper to the extent that each person “resolves to pitch in and work harder and look after not only ourselves, but each other.” Now, as an adult in New York City, I look forward to his departure from the White House.

It is not difficult to make the case that Obama is a believer in collectivism, but the exact nature of his economic philosophy is not total socialism. Recall that socialism occurs when the government is the sole owner of the means of production and is responsible for allocating goods and services to the general public. Nazi Germany, Soviet Russia, Maoist China, Castro’s Cuba, and numerous other Soviet satellite states are historical examples of socialism in practice.

President Obama, like his predecessor George W. Bush, believes in the mixed economy. A mixed economy is capitalist in its basic structure but with significant government intervention on behalf of various pressure-groups within society. Another term for a mixed economy is a “hampered free market” economy, because people are free to own property and engage in trade but on a far more limited basis than that of a laissez-faire capitalist society. Even in a pure capitalist society it is proper for the government to limit the free actions of its citizens in order to protect individual rights; capitalism is not anarchy. In a mixed economy, the state will exceed this legitimate function and act with expediency to do “good” according to some non-objective standard, often at the behest of the ruling class or some politically powerful democratic majority.

The narrative favored by mixed economy advocates to justify such intrusion is that government is needed to restrain the excesses of the free market. The economist most responsible for formulating the principles of the mixed economy in the 20th century is John Maynard Keynes, whose work was the inspiration for Franklin Roosevelt’s “New Deal” policies in the 1930’s.


Keynes argued that the private sector, left to its own devices, was bound to fail without occasional intervention by the public sector. In particular, the government is obligated to “invest” in infrastructure to stimulate growth. There should be a central bank, said Keynes, to control interest rates and strictly regulate the money supply. Keynes even advocated deficit spending in the short term to fund large projects if the money was not forthcoming in the present, remarking that “in the long run, we’re all dead.” This is the economic theory that the Obama administration adopted and sought to implement during its tenure. What are the results?

The United States has experienced economic stagnation under Obama, with growth rates less than 3% per year. The percentage of American households with at least one person receiving government benefits has reached an unprecedented 52%, up from 44% in 2008. There are at present a shocking 97,000 pages of federal regulations following Obama’s tenure, up from the 79,000 pages that we had under George W. Bush. Though these regulations no doubt vary widely in their objectives, the result is a complicated hodgepodge that makes it difficult for upstart firms to compete with more established players. In order to survive in the business world, companies cannot afford to ignore lobbyists.

Now some might ask: shouldn’t Obama get a pass on the bad economy since he inherited Bush’s mess? It is true that the economy was in a tight spot at the end of the Bush years thanks to the subprime mortgage crisis of 2008 (which, by the way, was caused by the government and not the free market). All this proves, though, is that George Bush was not a champion of the free market. It does not exonerate Obama, who pursued policies in line with what Bush started and took them much further.

It is no secret that the United States experienced a recession in 2007 that Obama inherited. Before he left office, President Bush signed into law the Troubled Asset Relief Program (TARP) in order to bail out the large banks that were deemed “too big to fail.” Congress then passed the American Recovery and Reinvestment Act of 2009 in an effort to “stimulate” the economy with a cash injection of $831 billion into the arteries of American business. The subsequent Dodd-Frank legislation of 2010 further tightened regulations on the financial industry, which was held responsible for the whole affair. Obama, following traditional Keynesian doctrine, signed these bills into law and initiated a largely phony “recovery.”

Despite the lack of results, there is a cost. The stimulus and the later attempt to nationalize the health care industry have resulted in a dramatic increase in the national debt. Under Obama, the debt increased from $10.6 trillion to nearly $19 trillion. Even more alarming is that in 2008 the debt was approximately 65% of the GDP of the US, meaning that it was approximately 2/3 of what the United States produces as a country in a given year. Today, the national debt is 104.5% of GDP if you include external debt! The conclusion that we can draw from this figure is that the US economy is heavily over-leveraged, a fact that is ironic considering Obama and his followers criticized the banks for the same sort of risky behavior when the financial crisis of 2008 hit.

Supporters of the president will sometimes claim that Ronald Reagan produced a higher percentage increase in the national debt than Obama, but this is dishonest rhetoric. When Reagan took office the total debt was about $1 trillion, and by the time he left it stood at $2.1 trillion. Overall, Reagan added a total of $1.1 trillion to the debt, compared to Obama’s nearly $9 trillion. Even if we adjust the Reagan number to 2016 dollars with the CPI, this leaves close to $3 trillion, still roughly a third of what Obama inflicted. Economists have also pointed out that aside from the existing national debt, the US suffers from over $127 trillion in unfunded liabilities.
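As a rough check on that inflation adjustment, here is a minimal sketch of the arithmetic. The CPI index values are approximate assumptions, and the whole amount is anchored to 1985 (the midpoint of Reagan’s two terms), so the exact output depends on the anchor year you choose.

```python
# Rough CPI adjustment of Reagan-era debt growth into 2016 dollars.
# CPI-U annual averages are approximate assumptions (base 1982-84 = 100).
CPI = {1985: 107.6, 2016: 240.0}

reagan_debt_added = 1.1e12  # ~$1.1 trillion added between 1981 and 1989

# Anchor the whole amount at 1985, the midpoint of Reagan's two terms.
adjusted = reagan_debt_added * CPI[2016] / CPI[1985]

print(f"${adjusted / 1e12:.2f} trillion in 2016 dollars")  # ~$2.45 trillion
# Even a generous anchor leaves the figure in the $2.5-3 trillion range,
# roughly a third of the ~$9 trillion added under Obama.
```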

Americans also experienced the so-called Debt Ceiling Crisis of 2011 under Barack Obama. Congressional Republicans, keen to slow down the administration’s expansive spending as much as possible, refused to raise the amount of money that the government could borrow. The result was a political standoff that did little to stop the Obama administration’s spending. The “compromise” that ensued promised future tax cuts in exchange for an increase in the national debt by the largest amount in a single day in US history. S&P even went so far as to downgrade the United States’ credit rating from AAA to AA+.

The truth is that stimulus spending by the government is nothing more than a short-term palliative and is not a viable long-term solution. Government is funded by tax money collected from citizens who participate in the market economy to produce goods and services; it has no money of its own. The fiat money that we use in trade today has only nominal value so long as it can be traded for real goods and services that have actual value to consumers on the market. If the government increases the amount of fiat money in society without a corresponding increase in the number of goods, the result is inflation: a decrease in the purchasing power of money. Inflation functions as an indirect means of taxation, since it devalues the savings that existed prior to the money creation. If printing money fails to fix the issue, the government can kick the can down the road with Keynes’ preferred alternative, deficit spending.

What about employment, that old Keynesian bellwether? Supporters of the president claim that he “created” 11.3 million jobs over the course of his administration and continue to invoke favorable comparisons with Ronald Reagan. There are several things to keep in mind with regard to these misleading statements.

To begin, the comparison to Reagan is specious, since labor force participation soared under Reagan while the opposite has occurred under Obama. Indeed, there has been an 18% uptick in the number of people who have stopped looking for jobs since Obama took office in 2009. As it turns out, the unemployment statistics only account for people who are actively looking for work; those who remain out of the labor force are not counted. Further, statistics on job creation do not account for job quality at all and treat full-time jobs with paid benefits as the equivalent of part-time jobs. One could arrive at 100% employment by having half the population dig ditches and the other half fill them; in the meantime, little to no economic value is created.

These quibbles notwithstanding, there is a more fundamental reason to reject the positive picture painted by the Obama administration with regard to jobs. The fact is that the president, like any government official, simply cannot create jobs. The reason is that government does not produce wealth; it merely redistributes, by force in the form of taxation, wealth that is generated by the private sector. Capitalists create jobs when they defer their consumption to invest in a business and serve a market need. Customers patronize the business, and the proceeds are used to pay individuals that wish to sell their services without having to take on the risk of the investment. Job creation is one of the many improper functions taken up by government in a mixed economy.

Obama, like many leftists, does not understand that it is private savings and investment on the part of capitalists that expand the economy. When entrepreneurs invest in capital goods, the result is lower prices for goods and a subsequent increase in the purchasing power of money. The tragic but instructive Obama legacy on economics is a case study in outright Keynesian stagnation.

Why We Should Not Expel the Electoral College

A defense of America’s method for electing the president.

It is periodically fashionable for some Americans to question the electoral college, the mechanism used to elect the commander-in-chief. These individuals couch their position in language that portrays the staid institution as primitive or outdated. A recent petition circulated by the “progressive” MoveOn.org characteristically tries to make the case that the electoral college “has outlived its usefulness” and that it, along with the Constitution itself, was “written when communication was by Pony Express.”

This is curious rhetoric. One would do well to point out that the Pony Express line is an intellectually lazy one, because that system actually came into existence around 1860, more than seventy years after the Constitution was penned. The larger point, though, is that when the electoral college was installed has little bearing on how valid it is as an election mechanism. The reason for this is that the problem of political order is as old as human society itself. While our technology and culture may change with time, the core issues at stake in how human society is organized are timeless. It is my contention that the electoral college is an effective, albeit imperfect, method for electing the president of the United States and that it should be preserved. To fully appreciate the brilliance of the electoral college, one must understand not only the way that it operates but the history behind its inception.

The electoral college is a system whereby Americans indirectly elect the president. Each state receives one electoral vote for each representative it has in the House of Representatives, plus another electoral vote for each of its two senators. Currently, the states possess 435 representatives and 100 senators between them. Additionally, the Twenty-Third Amendment to the Constitution provides Washington DC, the capital of the federal government, with three electoral votes. There are therefore 538 total electoral votes to allocate for president and a candidate must receive at least 270 to proclaim victory.

Though a state cannot unilaterally decide how many electoral votes it receives, it does have the ability to determine how to allocate the electoral votes that it possesses. On election day, voters cast their ballots in their home state. As the votes are tallied and the states determine the winning candidate in their jurisdiction, electoral votes are “called” for the candidates in the national election. Aside from Nebraska and Maine, all of the states employ a “winner-take-all” approach whereby the candidate that receives the most votes in the state receives all the electoral votes in that state. If there is a situation wherein no candidate for president receives at least 270 electoral votes, the president is elected by a vote in the House of Representatives while the vice president is elected by the Senate.
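To make the mechanics concrete, here is a minimal sketch of a winner-take-all tally in Python. The electoral-vote counts reflect the post-2010 apportionment, but the popular-vote returns are hypothetical numbers invented purely for illustration.

```python
# Minimal sketch of a winner-take-all electoral tally.
# Electoral-vote counts follow the post-2010 apportionment; the
# popular-vote returns below are hypothetical, for illustration only.

ELECTORAL_VOTES = {"Florida": 29, "Wyoming": 3, "New York": 29}

returns = {
    "Florida":  {"A": 4_600_000, "B": 4_500_000},
    "Wyoming":  {"A": 100_000,   "B": 150_000},
    "New York": {"A": 3_000_000, "B": 4_000_000},
}

tally = {}
for state, votes in returns.items():
    winner = max(votes, key=votes.get)  # plurality winner of the state
    # The winner takes ALL of the state's electors (Nebraska and Maine excepted).
    tally[winner] = tally.get(winner, 0) + ELECTORAL_VOTES[state]

print(tally)  # {'A': 29, 'B': 32}
# Nationally, a candidate needs a majority of the 538 electors (at least 270);
# failing that, the House picks the president and the Senate the vice president.
```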

The individuals that cast the electoral votes in each state are known as “electors,” and they take an oath to vote for the candidate who wins the state they serve. Curiously, individual electors can diverge from their state’s prescription; such a person is known as a “faithless elector.” This has occurred in American history, but it is a rare phenomenon that is discouraged with state laws which levy a fine on such behavior. These laws have never been challenged in the courts, however, so there is some doubt as to whether such statutes are constitutional.

This is what is meant when it is said that America is not a democracy, but a republic. By definition, a democracy is a system of government where the prevailing power is unlimited majority rule. A society that restricts voting to specific matters can be called democratic, but it is technically not a full-fledged democracy. In a pure democracy, the people would be able to vote not just on taxes and parades, but also on whether to dispose of the life or property of specific individuals. One need only recall the story of Socrates, who was sentenced to death for “corrupting the youth” by vote in Ancient Athens, to see the dark side of unrestricted democracy. The electoral college entrusts the electors to vote as representatives of the general population.

Why allow a select group to cast the final ballots for the president rather than open it up, Athenian style, to the general populace? It is no secret that the American founding fathers, contrary to what some may believe, were not huge fans of direct democracy. James Madison, the future fourth president, argued extensively in Federalist 10 that, with regards to government, “measures are too often decided, not according to the rules of justice and the rights of the minor party, but by the superior force of an interested and overbearing majority.” Alexander Hamilton, the man who would go on to serve as the first Treasury Secretary and inspire a hit Broadway musical, sympathized with the British monarchical system of government and sharply criticized the bloodshed of the French Revolution. Fans of the musical may be surprised that Hamilton believed the greatest threat to American liberty, apart from the return of the British army to North America, was mob rule. Ben Franklin, never at a loss for witty aphorisms, is said to have quipped that democracy is two wolves and a sheep voting on what to eat for dinner.

The founding fathers sought to limit the “tyranny of the majority” caused by a democratic system when they drafted the Constitution. The method that they elected to use in this mission was federalism, a system of government where power is divided between a large, central governing body and smaller, regional governments. The founding fathers understood that if the United States were overly centralized, the government would lose touch with people on a local level, because traditions and culture differ from state to state. On the other hand, if there were no centralization at all, the states would be less able to protect themselves from foreign aggression. The US Constitution also limits the extent to which democracy plays a role in American politics by restricting what we can and cannot vote for. America can thus be said to be a democratic, or representative, republic.

Apart from granting several powers to the state governments, the American founding fathers also implemented a system of checks and balances between the branches of the federal government to make it more difficult for any one person or political party to fully control it. The Congress is tasked with legislating, the president is tasked with enforcing the legislation passed by Congress, and the Supreme Court is tasked with ensuring that the actions of the prior two branches are in accordance with the Constitution. Madison succinctly encapsulated the rationale for these checks when he wrote in Federalist 47 that “the accumulation of all powers, legislative, executive and judiciary in the same hands, whether of one, a few, or many, and whether hereditary, self-appointed, or elective may justly be pronounced the very definition of tyranny.” Like all federalist devices, the electoral college represents an attempt to mitigate the negative effects of democracy while still allowing the common man a say in who his ruler is.

There are persistent opponents of the electoral college to this day, despite the arguments of the founders. Traditionally, these opponents have been agents of the Democratic party, dating back to its creation under President Andrew Jackson. The most common objection raised by these latter-day Jacksonians applies to the most recent election between Hillary Clinton and Donald Trump, wherein Clinton won the popular vote but Trump won the election with over 300 electoral votes. The basic idea is that because more individual people voted for Clinton than for Trump, the electoral college failed to account for the true preferences of the voters. Critics contend that a direct, popular vote is a better approach to electing the president.

This argument overlooks the federalist character of the USA. The Constitution was originally ratified with the understanding that the federal government was created by the states, not the other way around. When the states ratified the Constitution, they delegated some powers to the federal government and retained the remainder, as per the Tenth Amendment. The federal government, then, was created not by the act of a single, united American people but by the various peoples of each individual state. As such, it is the states that select the president, not the populace at large. If the president were instead chosen by direct popular vote, the largest and most populous cities would carry disproportionate weight in the election. States with more populous cities, such as Texas and California, would overshadow states with smaller cities, such as Wyoming and Delaware. The result would be a complete collapse of individual state sovereignty and representation.

Advocates for the popular vote may counter that the electoral system also disenfranchises states, just a different set of them. Under the electoral system, presidential candidates spend the majority of their time and ad money in so-called “swing states” such as Iowa and Florida and less time in stronghold states such as Alabama and New York. This view is also misguided. In any majority-rule contest along a single left-right spectrum, the voters near the median decide the outcome, because the voters at the extreme ends cancel each other out; this is the median voter principle familiar from social choice theory. In a direct popular vote, a simple national majority could elect the president without needing to appeal to the minority position at all. The electoral college shifts this “median voter effect” to the state level and makes it more difficult to overlook those holding a minority position.
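To make the intuition concrete, here is a minimal sketch of the median voter effect. It assumes voters sit on a single left-right axis and vote for whichever candidate is positioned nearer to them; the numbers are illustrative, not drawn from any real election.

```python
# A minimal sketch of the median voter effect, assuming a one-dimensional
# political spectrum and voters who pick the candidate nearest to them.
import random

random.seed(0)
# Ideal points on a left-right axis; illustrative only.
voters = [random.gauss(0, 1) for _ in range(100_001)]
median = sorted(voters)[len(voters) // 2]

def vote_share(candidate_a: float, candidate_b: float) -> float:
    """Fraction of voters strictly closer to candidate A than to candidate B."""
    a_votes = sum(1 for v in voters if abs(v - candidate_a) < abs(v - candidate_b))
    return a_votes / len(voters)

# A candidate at the median beats one positioned away from it,
# because a majority of voters sit closer to the median position.
print(vote_share(median, median + 0.5))  # prints a value above 0.5
```

Under these assumptions, any candidate who strays from the median loses to one who does not, which is why a direct national vote rewards chasing the national median and nothing else.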

To see why, let’s look at a short example. Consider a minority group numbering roughly half a percent of the total population; as of 2016, that is about 1.5 million people. In a direct popular vote their voice is a drop in the bucket, and no candidate worth their salt would court such a small niche. Now suppose a modest, politically conscious chunk of that group moved to a state with a small population, say Wyoming, with roughly half a million residents. If even one tenth of our beleaguered minority group lived in Wyoming, it would comprise nearly 30% of the state’s population, a sizable share. Come campaign season, a candidate seeking Wyoming’s electoral votes could not simply ignore the group, and its members would have a better chance of being heard in the national election. A useful side effect is that Wyoming’s state politics would also become more conducive to the group’s goals and interests. Apart from casting ballots every four years, individuals vote with their feet all the time when they move between states.
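For those who like to check the arithmetic, here is the same example as a back-of-the-envelope calculation. The population figures are the rough 2016 numbers used above, not exact census data.

```python
# Rough 2016 figures as used in the text, not exact census data.
us_population = 300_000_000        # approximate US population
minority_group = 1_500_000         # about half a percent of the total
wyoming_population = 500_000       # "half a million" residents

print(minority_group / us_population)    # 0.005 -> half a percent nationally
relocated = minority_group // 10         # one tenth of the group lives in Wyoming
print(relocated / wyoming_population)    # 0.3  -> nearly 30% of the state
```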

American federalism remains an innovative solution to the problem of political order. This is not the first time in history that aspersions have been cast on the electoral college, and it will not be the last. We should be suspicious of those who seek to overturn it not by refuting the arguments that gave rise to it, but by portraying it as old and outdated. Tyrannical government, after all, is older than federalism; I leave it to you to decide which is the more primitive relic.

The Deplorable Hillary Clinton

Why Hillary was a bad choice for president.

If we cared to vote on such things, “deplorable” would probably win the election for this year’s favorite adjective. For Republicans, it is a reappropriation that signals their opposition to the Hillary Clinton campaign. For Democrats, the word conjures images of a cartoon frog. Today, I plan to use it in its original sense, to describe the political career of Hillary Clinton, the Democratic nominee for president in 2016.

In the 1990s, it was mostly Republicans and fiscal conservatives who disliked Hillary Clinton, after she took the lead during her husband’s administration on an effort to overhaul the health care system. As her career progressed, Hillary changed positions rapidly and sought to amass power for herself and her husband unencumbered by principle. Even committed leftists such as my personal hero, Christopher Hitchens, saw her as a dangerous political opportunist; Hitchens published an entire book critical of the Clintons. Hillary started out as an opponent of gay marriage in 1996, when her husband signed DOMA into law, only to reverse herself in 2013 once the political winds had changed. She went on record in the mid-1990s as an opponent of “violent video games” and a supporter of harsher penalties for recreational drug users, only to see the error of her ways in the 2008 campaign against Barack Obama. Clinton has likewise tried to backpedal from her hawkish support of the Iraq War during the Bush administration once doing so became politically expedient for politicians interested in reelection.

Aside from her actual voting record, the public has from time to time gotten a taste of her penchant for inspirational white lies calculated to make her appear likable, brave, and principled. She claimed in 1995 to have been named after Sir Edmund Hillary, the first man to summit Mount Everest; inconveniently, Hillary Clinton was born several years before that 1953 ascent. In the 2008 campaign she falsely claimed to have landed in Bosnia under sniper fire; in reality, she arrived in one of the safest parts of Bosnia, with not so much as a spitball fired at her. The Wikileaks emails have also exposed Clinton admitting to being disingenuous with voters, conceding that she holds “both a public and a private position” on several issues, such as the common Democratic staple of “Wall Street reform”.

What about Clinton’s experience? She was Secretary of State under President Obama, her erstwhile supporters noted, willing to excuse her mendacity; perhaps the country can endure an opportunist, so long as that opportunist is competent. In answer, I would argue there is substantial evidence that Clinton is an incompetent administrator with an inability to mind details. There was the email scandal, in which Clinton routed official business through a private email server in violation of US security protocol, putting American national security interests at risk. There was the 2012 attack on the American diplomatic compound in Benghazi, ahead of which Clinton’s State Department downplayed the possibility of an attack; four Americans died, including Ambassador Chris Stevens. Additionally, Clinton is too eager to intervene in foreign affairs where scant American interests are present while ignoring legitimate threats to America from international bad actors. Clinton championed the interventions in both Libya and Syria, where tyrannical dictators were pitted against “freedom fighters,” many of whom are sympathizers of radical Islam. In cases such as these, America ought not choose a side, since both parties are anathema to American values. On the other hand, Clinton was happy to take money from Saudi Arabia via the Clinton Foundation, even after that regime had worked hard to spread a radical Wahhabi ideology hostile to the American Constitution.

Finally, there is the rank corruption that has become synonymous with the Clinton name. There is evidence that Hillary colluded with the DNC to stack the deck against her primary opponent, Senator Bernie Sanders; when this became public and Debbie Wasserman Schultz resigned as DNC chair, Hillary promptly gave her a post in the Clinton campaign. Hillary took part in the perpetual war the Democrats are wont to wage against Wall Street, only to accept hundreds of thousands of dollars for speeches delivered to various financial institutions behind closed doors. Wikileaks further illuminated how far in the tank for Hillary the mainstream media were. One example was CNN contributor Donna Brazile, who furnished the Clinton campaign with debate questions in advance. Another was CNBC’s John Harwood, who cozied up to campaign chairman John Podesta and solicited suggestions for questions to ask Jeb Bush ahead of an interview he was scheduled to conduct.

Reviewing the full Clinton record is as daunting to write as it is to read. In hindsight, I think this election was lost for the Democrats as soon as Hillary was selected as the nominee. How could things have been different? For one, the DNC could have acted sooner to get a younger, fresher face with less baggage into the process to oppose Hillary. There are many Democrats in the House and Senate who might have stepped forward had the powers that be encouraged them to act. The only serious challenger Hillary faced in the primaries was Senator Sanders, who was not even a Democrat but an independent who describes himself as a democratic socialist. Rather than stand up to the Clinton political machine, the DNC and a plurality of Democratic primary voters decided it was best to cash in on the Clinton brand in order to gain power. There was ample evidence during the primary process that Hillary had ethical problems, and yet many Democrats enthusiastically signed on to her campaign despite it.

Once she had secured the nomination, Democrats enjoyed the biased coverage of the mainstream media. Many newspapers and magazines burned the bridge of objectivity to endorse Hillary, including traditionally conservative papers such as The Arizona Republic, which had never before endorsed a Democrat for president. Polls consistently showed Hillary ahead, and Newsweek printed a commemorative “Madam President” cover before the actual election. In the final days of the campaign, Trump spent hours a day giving speeches across the country to get his message out, while Hillary all but vanished from the public eye. She avoided interviews, largely because she would have had to confront the more unsavory news surrounding her and her husband. Enthusiastic Clinton supporters I know here in New York City even passed around a GIF of Hillary performing what appeared to be a “victory shimmy” after out-talking Donald Trump in one of the presidential debates. Fed by the echo chambers of social media, Clinton supporters treated the election as a mere formality; their girl had it in the bag. Even Clinton’s rival Bernie Sanders betrayed his constituency and drank the Kool-Aid when it came time to line up behind Clinton. Since the election, Sanders has had little to show for his endorsement beyond a tarnished reputation for integrity that he had spent decades in Congress building.

In sum, Hillary was a weak candidate for president. The Democrats could have done better, and really should have if they cared about the future of their party. With rare exceptions, Democrats arrogantly decided the election was already over and that they could coast to an easy win on a familiar brand without asking the right questions. The result is that the “I’m With Her” hashtags turned out to be not a winning slogan but the inscription on an unsightly political tombstone.