Thursday, December 31, 2015

Died: Ed Dobson, Pastor and One-Time Moral Majority Leader


After he was diagnosed with ALS in 2000, Dobson spoke and wrote on how to die—and live—well.


By Morgan Lee
http://www.christianitytoday.com/
December 29, 2015





Ed's Story screenshot


“Ed Dobson is now healed and with his Lord.”
That’s how the Facebook page chronicling the life of the former Religious Right leader since his diagnosis with ALS announced his death on Saturday.
Dobson, 65, was the former senior pastor at Calvary Church in Grand Rapids, Michigan.
Born on December 29, 1949, in Northern Ireland, Dobson immigrated to the US at the age of 14. He attended Bob Jones University and earned his doctorate from the University of Virginia. In the early part of his career, Dobson worked closely with Jerry Falwell, became a Liberty University administrator, and served on the Moral Majority board.
“I graduated from Bob Jones and couldn't get a job, and Falwell offered me to come work. It was the second year they had a college, and I figured that was better than what I was doing, which was digging graves,” Dobson told PBS in 2009. “So I ended up going to Lynchburg till I found something better, which took 14 and a half years.”
He became the pastor of the non-denominational Calvary Church in 1987.
In 1999, Dobson and Cal Thomas co-authored Blinded by Might, which criticized Falwell and the Religious Right movement. (Read CT’s review.)
Dobson later felt he’d been too harsh.
"I was an outspoken critic of Jerry Falwell and others. Recently, I've changed my mind," he told CT after Falwell’s death in 2007. "I think he was doing what he felt God was leading him to do, and I was doing what I felt God was leading me to do. The ultimate judgment is up to God, not me or Jerry."

Today's Laugh Track: Merry New Year! (Trading Places)

Why Would Anyone Want a Firearm?


By Charles C. W. Cooke — December 31, 2015
From the December 31, 2015, issue of NR
http://www.nationalreview.com/


The "spectacularly unhelpful" Second Amendment
(Credit: Mike Flippo via Shutterstock/Salon)

Of all the ill-considered tropes that are trotted out in anger during our ongoing debate over gun control, perhaps the most irritating is the claim that the Constitution may indeed protect firearms, but it says “nothing at all about bullets.”

On its face, this is flatly incorrect. Quite deliberately, the Bill of Rights is worded so as to shield categories and not specifics, which is why the First Amendment protects the “press” and not “ink”; why the Fourth covers “papers” and “effects” instead of listing every item that might be worn about one’s person; and why the Fifth insists broadly that one may not be deprived of “life, liberty, or property” and leaves the language there. The “right of the people” that is mentioned in the Second Amendment is not “to keep and bear guns” or “to keep and bear ammunition” but “to keep and bear arms,” which, per Black’s Law Dictionary, was understood in the 18th century to include the “musket and bayonet”; “sabre, holster pistols, and carbine”; an array of “side arms”; and any accoutrements necessary for their operation. To propose that a government could restrict access to ammunition without gutting the Second Amendment is akin to proposing that a government could ban churches without hollowing out the First. If a free people are to enjoy their liberties without encumbrance, the prerequisite tools must be let well alone.

Without doubt, the vast majority of those who offer up the “But bullets!” talking point are doing little more than repeating memes that they have encountered. Yet at the root of their provocation is a serious misconception that needs to be seriously reckoned with. In most of the world’s countries, firearms are regulated in much the same way as are, say, cars, radios, and lawnmowers: as everyday tools whose utility can be evaluated without prejudice. In the United States, by contrast, the government’s hands are tied tight. To those who are unfamiliar with the contours of Anglo-American history, this can be understandably confusing. “Why,” we often hear it asked, “would the architects of the Constitution put a public policy question into the national charter? Do we really have to stick with a regulatory scheme that originated before the invention of the light bulb?”

The answer to this question is a simple one: “Yes.” Why? Because, our contemporary rhetorical habits notwithstanding, the right to keep and bear arms is not so much a right in and of itself as an auxiliary mechanism that protects the real unalienable right underneath: that of self-defense. By placing a prohibition on strict gun control into the Constitution, the Founders did not accidentally insert a matter of quotidian rulemaking into a statement of foundational law; rather, they sought to secure a fundamental liberty whose explicit recognition was the price of the state’s construction. To understand this, I’d venture, is to understand immediately why the people of these United States remain so doggedly attached to their weapons. At bottom, the salient question during any gun-control debate is less “Do you think people should be allowed to have rifles?” and more “Do you think you should be permitted to take care of your own security?”

A five-foot-tall, 110-pound woman is in a certain sense “armed” if she has a kitchen knife or a baseball bat at her disposal. But if the six-foot-four, 250-pound man who has broken into her apartment has one, too, she is not likely to overwhelm him. If that same woman has a nine-millimeter Glock, however? Well, then there is a good chance of her walking out unharmed. From the perspective of our petite woman, there is really no way for the state to endorse her right to defend herself if it deprives her of the tools she needs for the job.

In the sixth century, the Byzantine emperor Justinian compiled the monumental Digest of Roman Law, cataloguing the laws that had developed over centuries of Roman jurisprudence — among which was this rule of thumb: “That which someone does for the safety of his body, let it be regarded as having been done legally.” When it comes to the police and the armed forces, this principle is widely acknowledged, which is why most nations are happy to let their cops walk around with semi-automatic handguns and an array of advanced tactical gear. Within the civilian context, however, the same idea has become strangely controversial. Think of how often you hear Second Amendment advocates being asked with irritation why they “need” a particular firearm. Think, too, of how infrequently gun controllers focus on keeping weapons out of the hands of ne’er-do-wells rather than on limiting the efficacy of those available to the good guys. This makes no sense whatsoever. If a 15-round magazine and a one-shot-per-trigger-pull sidearm are necessary to give a trained police officer a fighting chance against a man who wishes him harm, there is no good reason that my sister shouldn’t have them, too.

As it happens, exactly this parity is presumed by America’s founding documents. The Declaration of Independence establishes that all men are born in possession of certain unchallengeable rights, and that among them are “life, liberty, and the pursuit of happiness.” This phrase, as with so many promulgated during the revolutionary era, is lightly adapted from John Locke, the English Enlightenment intellectual on whose philosophical presumptions the United States was in large part built. Inter alia, Locke held that every individual has a right to control and to defend his body, and that any government that attempted to deny that right was by necessity unjust. “Self defense,” Locke wrote in his Two Treatises of Government, “is a part of the law of nature” and in consequence cannot be “denied the community, even against the king himself.” In Locke’s view, this principle could be applied both on an individual level — against, say, intruders and other attackers — and on a collective level, against governments that turn tyrannical. Crucially, unlike Rousseau, Locke and his ideological heirs did not consider the establishment of the state to be a justification for the restriction of this principle.

To peruse the explanatory strictures of the Founders’ era is to discover just how seriously the right to protect oneself was taken in the early Anglo-American world. Writing in his 1768 Commentaries on the Laws of England, the great jurist William Blackstone contended that “self-defence” was “justly called the primary law of nature” and confirmed the Lockean contention that it could not be “taken away by the law of society.” In most instances, Blackstone observed, injuries inflicted by one citizen on another could wait to be mediated by the “future process of law.” But if those “injuries [are] accompanied with force . . . it is impossible to say, to what wanton lengths of rapine or cruelty outrages of this sort might be carried, unless it were permitted a man immediately to oppose one violence with another.”

These conceptions were carried over wholesale into the American colonies and cherished long after independence had been won. In Federalist No. 28, Alexander Hamilton affirmed the importance of the “original right of self-defense which is paramount to all positive forms of government” and conceded that, in extreme circumstances, it may even be asserted legitimately “against the usurpations of the national rulers.” This conceit was explicitly established in New Hampshire’s constitution of 1784, which, astonishingly enough, included an enumerated right to revolution: “The doctrine of nonresistance against arbitrary power, and oppression,” its signatories acknowledged, “is absurd, slavish, and destructive of the good and happiness of mankind.” Similar statements were subsequently added to the charters of Kentucky, Pennsylvania, North Carolina, Texas, and Tennessee.

For almost all of American history, this idea remained uncontroversial. When, in the early 19th century, certain large cities took it upon themselves to establish police forces, they presented their initiatives as complementary to, not in lieu of, the status quo. Likewise, when the architects of Reconstruction wondered aloud how free blacks would defend themselves against the hostile white majority, their first instinct, to paraphrase Yale law professor Akhil Reed Amar, was to make minutemen out of freedmen. Today, the Supreme Court continues to affirm the right to defend oneself, refusing to hand that task over exclusively to the armed agents of the state, even in the age of the standing army and militarized police departments. Despite progressivism’s endless march, the spirit of John Locke is alive and well.

But not, alas, omnipresent. Unfortunately, it has become commonplace over the last few decades to hear opponents of the right to keep and bear arms recite aggregate statistics as their case against individual liberties. A particularly egregious example of this came with Colorado’s post-Aurora gun-control debate, during which a state legislator named Evie Hudak casually informed a female survivor of rape that, mathematically speaking, she was more likely to hurt herself with her concealed firearm than to forestall another attack. “Actually, statistics are not on your side even if you had a gun,” Hudak told the stunned hearing. “Chances are that if you had had a gun, then he would have been able to get that from you and possibly use it against you.”

This approach is entirely inconsistent with America’s founding ideals. If it is the case that free people have the right to defend themselves regardless of whether they are likely to prevail, then what their elected representatives think of their endeavors is irrelevant. To take any other approach is to strip from mankind what the great American jurist Henry St. George Tucker, echoing Blackstone, termed the “first law of nature,” and to do so in the name of unwarranted superintendence.

That those who would engage in such supervision do so with good intentions is neither here nor there. When, in their infinite wisdom, the legislators of New Jersey passed the draconian permitting requirements that have led to their constituents’ waiting months for the chance to buy a gun, they presumably believed that they were striking a strong blow for public safety. In truth, however, they were overstepping their legitimate bounds and condemning a handful of American citizens to ignominious death. One such citizen, a diminutive woman named Carol Bowne, found this out firsthand in June of this year, when, having waited long beyond the statutory processing window, she watched her stalker of an ex-boyfriend come into her driveway with a knife and stab her to death. “Who does not see that self-defense is a duty superior to every precept?” asked Montesquieu in his magisterial Spirit of the Laws. Judging by our present debate, the answer to this question is “Too many.”

— Charles C. W. Cooke is a staff writer at National Review. This article originally appeared in the December 31, 2015, issue of National Review.




Tuesday, December 29, 2015

HOUSE DEMOCRATS MOVE TO CRIMINALIZE CRITICISM OF ISLAM

Lumping together violence with “hateful rhetoric” is a call to destroy the freedom of speech.


December 29, 2015


December 17, 2015 ought henceforth to be a date which will live in infamy, as that was the day that some of the leading Democrats in the House of Representatives came out in favor of the destruction of the First Amendment. Sponsored by, among others, Muslim Congressmen Keith Ellison and Andre Carson, as well as Eleanor Holmes Norton, Loretta Sanchez, Charles Rangel, Debbie Wasserman Schultz, Joe Kennedy, Al Green, Judy Chu, Debbie Dingell, Niki Tsongas, John Conyers, José Serrano, Hank Johnson, and many others, House Resolution 569 condemns “violence, bigotry, and hateful rhetoric towards Muslims in the United States.” The Resolution has been referred to the House Committee on the Judiciary.
That’s right: “violence, bigotry and hateful rhetoric.” The implications of those five words will fly by most people who read them, and the mainstream media, of course, will do nothing to elucidate them. But what H. Res. 569 does is conflate violence -- attacks on innocent civilians, which have no justification under any circumstances -- with “bigotry” and “hateful rhetoric,” which are identified on the basis of subjective judgments. The inclusion of condemnations of “bigotry” and “hateful rhetoric” in this Resolution, while appearing to be high-minded, takes on an ominous character when one recalls that for years, Ellison, Carson, and their allies (including groups such as the Hamas-linked Council on American-Islamic Relations, CAIR) have been smearing any and all honest examination of how Islamic jihadists use the texts and teachings of Islam to incite hatred and violence as “bigotry” and “hateful rhetoric.” This Resolution is using the specter of violence against Muslims to try to quash legitimate research into the motives and goals of those who have vowed to destroy us, which will have the effect of allowing the jihad to advance unimpeded and unopposed.
That’s not what this H. Res. 569 would do, you say? It’s just about condemning “hate speech,” not free speech? That kind of sloppy reasoning may pass for thought on most campuses today, but there is really no excuse for it. Take, for example, the wife of Paris jihad murderer Samy Amimour – please. It was recently revealed that she happily boasted about his role in the murder of 130 Paris infidels: “I encouraged my husband to leave in order to terrorize the people of France who have so much blood on their hands […] I’m so proud of my husband and to boast about his virtue, ah la la, I am so happy.” Proud wifey added: “As long as you continue to offend Islam and Muslims, you will be potential targets, and not just cops and Jews but everyone.”
Now Samy Amimour’s wife sounds as if she would be very happy with H. Res. 569, and its sponsors would no doubt gladly avow that we should stop offending Islam and Muslims – that is, cut out the “bigotry” and “hateful rhetoric.” If we are going to be “potential targets” even if we’re not “cops” or “Jews,” as long as we “continue to offend Islam and Muslims,” then the obvious solution, according to the Western intelligentsia, is to stop doing anything that might offend Islam and Muslims – oh, and stop being cops and Jews. Barack “The future must not belong to those who slander the prophet of Islam” Obama says it. Hillary “We’re going to have that filmmaker arrested” Clinton says it. The U.S. Conference of Catholic Bishops, certain that anyone who speaks honestly about Islam and jihad is a continuing danger to the Church, says it.
And it should be easy. What offends Islam and Muslims? It ought to be a simple matter to cross those things off our list, right? Making a few sacrifices for the sake of our future of glorious diversity should be a no-brainer for every millennial, and everyone of every age who is concerned about “hate,” right? So let’s see. Drawing Muhammad – that’s right out. And of course, Christmas celebrations, officially banned this year in three Muslim countries and frowned upon (at best) in many others, will have to go as well. Alcohol and pork? Not in public, at least. Conversion from Islam to Christianity? No more of that. Building churches? Come on, you’ve got to be more multicultural!
Everyone agrees. The leaders of free societies are eagerly lining up to relinquish those freedoms. The glorious diversity of our multicultural future demands it. And that future will be grand indeed, a gorgeous mosaic, as everyone assures us, once those horrible “Islamophobes” are forcibly silenced. Everyone will applaud that. Most won’t even remember, once the jihad agenda becomes clear and undeniable to everyone in the U.S. on a daily basis and no one is able to say a single thing about it, that there used to be some people around who tried to warn them.
Tags: Criticism, House, Islam

Saturday, December 26, 2015

Why Hilaire Belloc Still Matters


December 23, 2015
George Bernard Shaw, Hilaire Belloc, G.K. Chesterton (1928)

Chatting with a British bishop who'd said the famous Catholic writer Hilaire Belloc sometimes came to his home when he was a child to visit his father, a friend, I asked the obvious question:
What was Belloc like?
The bishop didn't say a lot, but I do remember this: "...an old man in a rumpled, stained black suit." The image has stuck with me, as apparently it did with the bishop.
That would have been Belloc in the last years of his life. (He died in July, 1953, just short of turning 83.) He kept writing until near the end -- after all, he made his living like that -- social criticism plus history and biography of a polemical nature, vigorous and clear but scarcely unbiased.
But the language -- ah, the language. Here was Belloc's great gift. From beginning to end his writing was a model of simple, elegant English prose.
Lately I've been reading a pocket-sized Belloc book that I found on the shelf without even knowing it was there. Its title is Hills and the Sea, and it's a collection of short essays the author published in British popular periodicals early in the last century. First appearing in 1906, the volume was republished in 1913.
A point of interest in my copy is an inscription on the title page, written when the book was presented to someone as a gift: "For the precious moments just before repose, beauty and adventure here at hand in Belloc's unsurpassable prose."
And here, virtually at random, is a specimen of the writing that earned that effusion:
"There was no breeze in the air, and the little deep vessel swung slightly to the breathing of the sea. Her great mainsail and her balloon-jib came over lazily as she swung, and filled themselves with the cheating semblance of a wind. The boom creaked in the goose-neck, and at every roll the slack of the main sheet tautened with a kind of little thud which thrilled the deck behind me."
Do people still read Belloc?
As with many writers who write a lot, his output was a mixed bag. Undoubtedly, too, he holds no interest for those who imagine European history and European culture began in 1789 with the French Revolution. But for people with an appreciation of Europe's Christian roots, he matters.
Readers who wish to tackle Belloc a little at a time will find a helpful introduction in The Essential Belloc, a St. Benedict's Press compilation edited by Fr. C.J. McCloskey, Scott Bloch, and Brian Robertson.
His unquestioned masterpiece remains The Path to Rome, a rambling account of a journey -- a pilgrimage, really -- that he made, largely on foot, in 1901 through the heart of the Old Continent to the Eternal City. What is the book about? The answer, you might say, is whatever pops into the writer's head. But on a deeper level its subject is no less than the Christian soul of Europe.
The ideological ideal of today's vision of a secularized Europe requires the creation of a uniform continental identity from which national identities and religious identity have been erased. By contrast, Belloc's writing is full of glimpses of the Europe that was -- Christendom -- presented in inimitable prose and well worth cherishing even now.
In the long run, history will declare the verdict among Christendom, a secularized European monolith, and an Islamicized entity now perhaps starting to emerge.
Russell Shaw was secretary for public affairs of the National Conference of Catholic Bishops from 1969 to 1987. He is the author of many books, including American Church: The Remarkable Rise, Meteoric Fall, and Uncertain Future of Catholicism in America.

Today's Tune: Bruce Springsteen - The Ties That Bind (Live on SNL)

Today's Tune: Bruce Springsteen - Meet Me in the City (Live on SNL)

Food fads: Make mine gluten-full


December 24, 2015
(iStock)
When the federal government’s 1980 “Dietary Guidelines for Americans” warned about the baleful effects of saturated fats, public interest activists joined the fight and managed to persuade major food companies to switch to the shiny new alternative: trans fats. Thirty-five years later, the Food and Drug Administration finally determined that trans fats are not just useless but unsafe, and ordered them removed from all foods. Oops.
So much for settled science. To tell the truth, I never paid much attention to the fat fights in the first place. From my days as a medical student (and prodigious consumer of junk food), I’ve seen so many solemnly proclaimed “findings” come and go that I decided long ago to ignore — and outlive — them all.
So far, I’m ahead. Never had an egg substitute in my life. I figured trans fats were just another fad waiting to be revoked and renounced. Moreover, if I was wrong, the green eggs and ham would take so long to kill me anyway that I was more likely to be hit by a bus first. Either way, win-win.
Don’t get me wrong. I don’t advocate this kind of jaunty fatalism for everyone. This is a private affair. I do, however, preach skepticism. Remember that most venerable piece of received medical wisdom — 98.6 degrees as the average adult human temperature? In 1992, three researchers bothered to measure — and found that the conventional wisdom (based on an 1878 German study) was wrong. Normal is 98.2.
After that — 114 years of error — one is inclined to embrace Woody Allen’s “Sleeper” theory that in 200 years we’ll discover that smoking is good for you, fruits are not. I still love peaches, but I eat them for the taste — and the memories — not because they might add a month to my life (in the ICU when I’m 90).
I don’t mean to be cynical, just realistic. Take fish oil. For at least 10 years the National Institutes of Health has strongly recommended omega-3 fatty acids and fish oil for the prevention of cardiovascular disease.
I held out, trusting both my gastronomic prejudices (more turf than surf) and my faith that time ultimately undoes all of life’s vérités. I waited. My orneriness has not been fully vindicated — NIH still recommends dietary fish oil — but it does find omega-3 supplements to be useless.
Exhibit A for medical skepticism, however, remains vitamin C. When Linus Pauling, Nobel laureate in chemistry (not nutrition), began the vitamin-C megadose fad to fend off all manner of disease, the whole thing struck me as bizarre. Yes, you need some C to prevent scurvy if you’re seven months at sea with Capt. Cook and citrus is nowhere to be found. Otherwise, the megadose is a crock. Evolution is pretty clever. For 2 million years it made sure Homo erectus, neanderthalensis, sapiens, what have you, got his daily dose without having to visit a GNC store.
Sure enough, that fashion came and went. But there are always new windmills to be tilted at. The latest is gluten.
Now, if you suffer from celiac disease, you need a gluten-free diet. How many of us is that? Less than 1 percent. And yet supermarket shelves are groaning with products proclaiming their gluten-freedom. Sales are going through the roof.
Another crock. Turns out, according to a massive Australian study of 3,200 products, gluten-free is useless. “The foods can be significantly more expensive and are very trendy to eat,” says Jason Wu, the principal investigator. “But we discovered a negligible difference when looking at their overall nutrition.”
Told you so.
Why then am I not agitating to have this junk taken off the shelves? Because of my other obsession: placebos. For which I have an undying respect, acquired during my early years as a general-hospital psychiatrist. If you believe in the curative powers of something — often encouraged by the authority of your physician — a sugar pill or a glass of plain water can produce remarkable symptom relief. I’ve seen it. I’ve done it.
So I’d never mess with it. If a placebo can alleviate your pain, that’s better than opioids. If going gluten-free gives a spring to your step, why not? But please, let the civility go both ways. Let the virtuous Fitbit foodie, all omega-3’d and gluten-free, drop the self-congratulatory smugness. And I promise not to say it’s all in his head.
Live and let eat. Merry Christmas.

Just Asking about Islam and Terrorism


By Andrew C. McCarthy — December 26, 2015


ISIS. (photo credit: ISLAMIC SOCIAL MEDIA)


Let me ask you a question.

Let’s say you are an authentically moderate Muslim. Perhaps you were born into Islam but have become secularist. Or perhaps you consider yourself a devout Muslim but interpret Islam in a way that rejects violent jihad, rejects the concept that religious and civic life are indivisible, and rejects the principle that sharia’s totalitarian societal framework and legal code must be imposed on the state. Let’s just take that as a given: You are no more inclined toward terrorism than any truly peaceful, moderate, pro-democratic non-Muslim.

So let me pop the question: Is there any insulting thing I could say, no matter how provocative, or any demeaning video I could show you, no matter how lurid, that could convince you to join ISIS?
Mind you, I am not asking whether, upon my insulting and provoking you, you would ever want to have anything to do with me again. I am asking whether there is anything that could be said or done by me, or, say, Donald Trump, or Nakoula Basseley Nakoula — the video producer (Innocence of Muslims) whom Hillary Clinton and Barack Obama tried to blame for the Benghazi massacre — that could persuade you to throw up your hands and join the jihad? Is there anything so profoundly offensive to Islam that we could conjure up that would make a truly moderate, peaceful Muslim sign up for mass murder? Torching and beheading? Killing children? Participating in systematic rape as a weapon of war?

I didn’t think so.

Yet, understand, that is what Washington would have you believe. Whether it is Barack Obama sputtering on about how Guantanamo Bay drives jihadist recruitment, or Hillary Clinton obsessing over videos (the real one by Nakoula that she pretended caused terrorism in Libya, and the pretend ones about Donald Trump that she claims have Muslims lined up from Raqqa to Ramadi to join ISIS), you are to believe violent jihad is not something that Muslims do but that Americans incite.

And it’s not just Democrats who’d have you buy this bunkum. Think of the Arab Spring fairy tale — about Libya, Egypt, and, most recently, Syria — that Republicans have been telling for years, critiqued by yours truly in Spring Fever. It is still GOP gospel, glibly peddled by Marco Rubio just a couple of weeks ago at the 2016 presidential candidates’ debate. (Disclosure: I support Ted Cruz.)

The fairy tale goes something like this. There is a terrible dictator who so tormented his people that they rose up against him. These were noble people, overwhelmingly moderate, secular Muslims — adherents of a “religion of peace” (or, as Bush secretary of state Condi Rice put it, “a religion of peace and love”), who craved democracy. (Caution: You can call them “rebels,” but words like “Muslim Brotherhood” and “sharia” are not to be uttered — we’re trying to build a narrative here!) Sure, the noble people may have tolerated the occasional jihadist in their midst, but that could happen to even the most well-intentioned peaceful moderate, right? (The pervasive presence of jihadists who used Syria and Libya as gateways to jihad against Americans in Iraq is also not to be mentioned.)

Now let’s let bygones be bygones. No need to tarry over small details — like how the noble people installed anti-democratic Islamists who imposed a sharia constitution on Egypt after ousting their pro-American dictator; or how Libya became a jihadist playpen where Americans are murdered after the U.S. government sided with the noble people to oust the U.S.-supported dictator who had been giving us counterterrorism intelligence about jihadists in places like Benghazi.

Let’s just skip ahead to Syria. There, the noble people needed America’s help, but Barack Obama turned a deaf ear. (No need to get into Obama’s collusion with the Islamic-supremacist governments of Turkey, Saudi Arabia, and the UAE to arm and train the “rebels.”) This forfeited our golden opportunity to intervene actively and empower the bounty of moderate, secular, America-loving, democracy-craving Muslims (because that worked so well in Libya). But for Obama’s default, these moderate legions could simultaneously have toppled the dictator and purged the teeny-tiny number of jihadists who might have been skulking about. (Let’s not get into how there don’t seem to be enough of these moderates to man a soccer team, let alone a legion; or how weapons supplied to these “rebels” somehow keep ending up in the hands of the jihadists.) Obama’s default, coupled with the ruthlessness of the dictator, created a leadership and territory void into which jihadists suddenly poured (apparently out of nowhere). Somehow, these spontaneously generating jihadists managed to entice recruits, vastly increasing in number and power (even though — you’ll have to trust us on this — the moderate, secular Muslims really want nothing to do with them).

And that, ladies and gentlemen, is how ISIS was born and al-Qaeda rose from the ashes.
You buying it? Me neither.

About 20 years ago, I prosecuted a dozen jihadists, led by the “Blind Sheikh,” Omar Abdel Rahman, for waging a terrorist war against the United States — including the World Trade Center bombing and a plot to attack the Lincoln and Holland Tunnels, as well as other New York City landmarks. The defendants were caught on tape building bombs, scheming to strike at American military sites, and planning attacks timed to achieve maximum infidel carnage.

At trial, the jihadists tried to tell the jury they were just moderate, peace-loving Muslims who had been provoked by American foreign policy, a perception of anti-Muslim bias, and videos of Muslims being persecuted in Bosnia. The Blind Sheikh insisted his incitements to jihad were simply a case of faithfully applying sharia principles, which, according to his lawyers, the First Amendment gave him the right to do.

So I asked the jury a simple question:

Is there any obnoxious, insulting, infuriating thing I could say to you, or show to you, that would convince you to join up with mass-murdering terrorists? To become a terrorist yourself?

Of course, a dozen commonsense New Yorkers did not need to be asked such a question. They laughed the defense out of the courtroom.

Alas, in the 20 years since, the defense they laughed out of the courtroom has become the bipartisan government policy of the United States.

Go figure.

— Andrew C. McCarthy is a policy fellow at the National Review Institute. His latest book is Faithless Execution: Building the Political Case for Obama’s Impeachment.



Thursday, December 24, 2015

Today's Tune: Bruce Springsteen - Christmas (Baby please come home)

America’s hidden jihad


By Daniel Pipes
December 23, 2015

COURTESY NEW JERSEY STATE POLICE

Yusuf Ibrahim had been driving a Mercedes Benz owned by one of the dead men.


The police and press did an impressive job of sleuthing into the lives and motives of Syed Rizwan Farook and Tashfeen Malik, the married couple who massacred 14 people on Dec. 2, in San Bernardino, California.

We know about their families, their studies and employment histories, their travels, their marriage, their statements, and their preparations for the assault. Most importantly, the cascade of background work means we know that the pair had jihadi intentions, meaning, they attacked in their role as pious Muslims spreading the message, law, and sovereignty of Islam.

We are all better off for knowing these facts, which have had a powerful impact on the body politic, making Americans far more concerned with jihadi violence than at any time since just after 9/11, as they should be. For example, in 2011, 53 percent told a pollster that terrorism was a critical issue; that number has now reached 75 percent.

But what about the case of Yusuf Ibrahim? In early 2013, when he was 27, this Egyptian-born Muslim was living in Jersey City when he allegedly shot two Coptic Christians, Hanny F. Tawadros and Amgad A. Konds, then cut off their heads and hands, knocked out their teeth, and buried them in Buena Vista Township, New Jersey.

He is charged with two counts each of murder, felony murder, kidnapping, robbery, desecration of human remains, and other crimes. In addition, he has pleaded guilty to a Dec. 22, 2011, carjacking and a Sep. 20, 2012, armed robbery, both in Jersey City (in the latter, he shot his victim in the foot), and early in 2015 he was sentenced to 18 years in prison for these later crimes.

The twin beheadings are spectacular, gruesome, and replete with jihadi (or in police parlance, "terrorist") elements. Historian Timothy Furnish explains that "ritual beheading has a long precedent in Islamic theology and history," making it a distinctly Muslim form of execution. A Muslim killing a non-Muslim fits the ageless pattern of Islamic supremacism. It also fits a tragic pattern of behavior in the United States in recent years.

Yet the police, politicians, the press, and professors (i.e., the Establishment) have shown not the slightest interest in the Islamic angle, treating the double beheadings and amputations as a routine local murder. Symptomatic of this, the police report about Ibrahim's arrest makes no mention of motivation; on the basis of this lack of mention, left-leaning Snopes.com (which describes itself as the "definitive Internet reference source for urban legends, folklore, myths, rumors, and misinformation") goes so far as to dismiss as "false" the allegation that the mainstream media "deliberately ignored" this incident. The wagons have been circled.

Almost three years after the event, we know next-to-nothing about Ibrahim, his motives, his possible connections to others, or his institutional affiliations. We also do not know the relationship of the accused attacker to his victims: Was he a criminal who fell out with his accomplices, a friend who had drunk too much, a would-be lover knocking off his rivals for the affections of a woman, a family member eliminating aspirants for an inheritance, a crazy man randomly shooting passers-by? Or was he perhaps a jihadi seeking to spread the message, law, and sovereignty of Islam?

I cannot answer those questions because the case lingers in total obscurity, popping up from time to time only in connection with some technical procedural matter (such as the amount of Ibrahim's bail or the admissibility of his confession) that sheds no light on the motives for his alleged crime.

Nor is the Ibrahim case unusual. I have compiled long lists of other potential instances of jihadi violence (here, here, and here) in which the Establishment has colluded to sweep the Islamic dimension under the rug, treating the perpetrators as common criminals whose biographies, motives, and connections are of no interest and therefore remain unknown.

This silence about possible jihad has the major consequence of lulling the American public (and its counterparts elsewhere in the West) into believing jihadi violence is far rarer than is the case. If the body politic understood the full extent of jihad in America, the alarm would be much greater; the percentage of those calling terrorism a critical issue would rise much higher than the current 75 percent. That, in turn, might push the Establishment finally to get serious about confronting jihad.
Mr. Pipes (DanielPipes.org, @DanielPipes) is president of the Middle East Forum. © 2015 by Daniel Pipes. All rights reserved.
This text may be reposted or forwarded so long as it is presented as an integral whole with complete and accurate information provided about its author, date, place of publication, and original URL.

The Year Christmas Died


New York’s Fifth Avenue is a celebration of pretty much nothing—or worse.


By Daniel Henninger
December 23, 2015

A ‘holiday’ window at Bergdorf Goodman in New York City. PHOTO: MARK LENNIHAN/ASSOCIATED PRESS

As we moved into December and what for some time has been called “the holiday season,” the Office of Diversity and Inclusion at the University of Tennessee issued a “best practices” directive for the campus to “ensure your holiday party is not a Christmas party in disguise.”
A Christmas party in disguise? Has it come to this?
Aghast state legislators got the directive rescinded, but the Christmas killers will get the last laugh. In fact, they’ve already won. This is the year Christmas died as a public event in the United States.
We know this after touring the historic heart of public Christmas—Fifth Avenue in New York City.
For generations, American families have come to New York in December to swaddle themselves in the glow and spirit of Christmas—shops, restaurants, brownstones, the evergreen trees along Park Avenue, bar mirrors and, most of all, Fifth Avenue’s department-store windows. You couldn’t escape it, and why would you want to?
A friend, an ardent atheist, would be inconsolable if he couldn’t sing Handel’s entire “Messiah” with 3,000 other revelers this month at Lincoln Center. Even if the only god you worship is yourself, December in New York has always been about the bustling good cheer flowing from the Christian holiday.
For many, December required a pilgrimage to Saks Fifth Avenue, Lord & Taylor and Bergdorf Goodman. No matter the weather, people walked the mile from 38th Street to 59th Street and jammed sidewalks to see these stores’ joyful Christmas windows.
Stay home. This year Fifth Avenue in December is about . . . pretty much nothing, or worse.
To be sure, the magnificent Rockefeller Center Christmas tree still stands, and directly across on Fifth Avenue is St. Patrick’s Cathedral, its facade washed and hung with a big green wreath. But walk up or down the famous avenue this week and what you and your children will see is not merely Christmas scrubbed, but what one can only describe as the anti-Christmas.
Forget public Nativity scenes, as court fiat commanded us to do years ago. On Fifth Avenue this year you can’t even find dear old Santa Claus. Or his elves. Christmas past has become Christmas gone.
The scenes inside Saks Fifth Avenue’s many windows aren’t easy to describe. Saks calls it “The Winter Palace.” I would call it Prelude to an Orgy done in vampire white and amphetamine blue.
A luxuriating woman lies on a table, her legs in the air. Saks’ executives, who bear responsibility for this travesty, did have the good taste to confine to a side street the display of a passed-out man on his back (at least he’s wearing a tux), spilling his martini, beneath a moose head dripping with pearls. Adeste Gomorrah.
But you haven’t seen the anti-Christmas yet. It’s up at 59th Street in the “holiday” windows of Bergdorf Goodman. In place of anything Christmas, Bergdorf offers “The Frosty Taj Mahal,” a palm-reading fortune teller—and King Neptune, the pagan Roman god, seated with his concubine. (One Saks window features the Roman Colosseum, the historic site of Christian annihilation.)
I thought: Lord & Taylor! Surely the iconic Christmas windows on 38th Street won’t shelve St. Nick. They did. He’s gone, replaced by little bears and cupcakes, gingerbread men and Canada geese.
There is one holdout to the de-Santification of America: In Macy’s windows at Sixth Avenue and 34th Street—as in “Miracle on 34th Street”—the characters of “A Charlie Brown Christmas” frolic in Yuletide splendor.
The Christmas-less feeling along once-famous Fifth Avenue this year is similar to the loss one feels reading the last lines of “Casey at the Bat”—a shattering, historic strikeout.
The erasure of Christmas between the grinding stones of secular fanaticism will persist. Eventually the holiday will be forbidden, forgotten and filed away in attic boxes. But maybe God, in His usual mysterious way, is nudging us back toward the beginning.
Once the inevitable Federal Office of Diversity and Inclusion has joined with the commercial cynics at Saks and Bergdorf’s to suppress even Santa, what pretext will parents have to give gifts to their Christmas-cleansed children? Amazon Day?
In the post-Christmas era, the infant Jesus and Santa Claus will go back to the catacombs of early Christian life, where you won’t have to say happy holidays to anyone. Christmas as we know it will die off, and what will be left on December 25th will look a lot like Thanksgiving, but smaller.
Unless celebrating Christmas in America becomes a prosecutable crime, as it was in the Soviet Union, families will go to church in the morning to renew the beginnings of their faith and then spend the day at home listening to pirated copies of the carols and hymns on Bing Crosby’s “White Christmas” album. For radical refuseniks, I recommend playing, at the highest possible volume, “The Bells of St. Mary’s” on Phil Spector’s “A Christmas Gift for You.”
As for Saks and the other Fifth-Avenue sellouts, I have two words this season. They aren’t Merry Christmas.
Write to henninger@wsj.com