Newsgeeker.com news site RSS Email Alerts

[Legislation] S. 3036 – Keep Families Together Act [full text]

By R. Mitchell -

S. 3036 - Keep Families Together Act

115th CONGRESS 2d Session S. 3036 To limit the separation of families at or near ports of entry. IN THE SENATE OF THE UNITED STATES June 7, 2018 Mrs. Feinstein (for herself, Mr. Schumer, Ms. Harris, Mr. Leahy, Mrs. Murray, Mr. Wyden, Mr. Durbin, Mr. Reed, Mr. Nelson, Mr. Carper, Mr. Menendez, Mr. Sanders, Mr. Casey, Ms. Klobuchar, Mrs. Shaheen, Mr. Warner, Mr. Merkley, Mr. Bennet, Mr. Blumenthal, Mr. Schatz, Mr. Murphy, ...

S. 3036 – Keep Families Together Act [full text] is original content from Conservative Daily News - Where Americans go for news, current events and commentary they can trust.

Published:6/19/2018 3:46:50 PM
[Markets] Splitting California Into 3 Pieces Is Long Overdue

As we detailed earlier, it now looks like California voters will have the chance to vote on whether or not the state ought to be split up into three pieces.

The Los Angeles Times reports:

If a majority of voters who cast ballots agree, a long and contentious process would begin for three separate states to take the place of California, with one primarily centered around Los Angeles and the other two divvying up the counties to the north and south.

As Ryan McMaken, via The Mises Institute, notes, this latest move is just one of many efforts over many decades to split California up, and make its constituent parts more responsive to the people who live there. This effort, however, is more successful than past ones — for example, the 2016 proposed ballot measure breaking up California into six states. That one failed to qualify for the ballot.

To say the least, breaking up California into smaller pieces is something that is long overdue. The population of California is a massive 39 million, making it larger than either Canada or Peru. And the GDP produced by that state is enormous as well. If California were an independent country, it would have an economy larger than that of the United Kingdom.

This means the California government - which can (and does) skim off substantial portions of that wealth - is among the richest governments in the world.

Moreover, the government holds a monopoly of power over a vast area which includes some of the best real estate in the world. Much of North America's best coastlines, mountains, natural harbors, and forests are contained within California.

And, here's one of the best things about being a huge state (from the government's perspective): the government can make it extremely inconvenient to escape it: "You don't like our policies? Well, then, feel free to move hundreds of miles away to Phoenix or Reno."

It's no wonder then, that the government of California has been able to abuse its taxpayers so freely. California has one of the highest tax burdens in the nation, and many leave the state because of it. More and more, the state is becoming a playground for the wealthy who have enough of a surplus to endure what ordinary people cannot. Thanks to endless restrictions on development via environmental regulations and other measures, housing supply has been artificially limited, and thus the cost of housing in California has skyrocketed. This has led to a situation in which, as the Sacramento Bee put it, "California exports its poor to Texas... while wealthier people move in." But California apparently isn't exporting all its poor — the state has the worst poverty rate in the nation when the cost of living is taken into account.

And yet, the opponents of the new "Three Californias" ballot measure will surely be telling us that the status quo is perfectly fine. We'll be told that the political establishment in Sacramento ought not be punished for decades of mismanagement, and that it would be too "extreme" to separate hard-left northern California from the more politically moderate areas of the south and east.

A State Divided

Indeed, the state isn't nearly as united in its love of the dominant political agenda as we're supposed to believe. As recently as 2008, the vote on the same-sex marriage measure Proposition 8 illustrated some meaningful divisions in the state. Opposition to the measure (i.e., support for gay marriage) enjoyed a majority only in the northern parts of the state and along the central coast. A majority of voters in the south supported the measure, and even a majority in Los Angeles County voted for it. Whatever one's opinion on the subject of marriage, the vote reiterated what has long been known: what we regard as "progressive California" has long been pushed primarily by Californians centered around the Bay Area and Silicon Valley. While it would be silly to call Southern California a bastion of right-wing thinking, the fact is that Southern California is less about Hollywood than it is about miles upon endless miles of suburban neighborhoods filled with middle-class people who have better things to do than push for Dianne Feinstein's latest political hobbyhorse. A middle-class insurance worker in an unglamorous LA suburb with three kids has precious little in common with a pair of Princeton-educated Silicon Valley workers who live a dual-income-no-kids lifestyle, and who bring to the state the "enlightened" views we've come to expect from upper-middle-class suburbanites with expensive designer college degrees.

Moreover, this latter group has been growing into the influential majority, both in population and in the wealth and resources it can bring to the political game.

So, it's hard to not be sympathetic to Californians who might be happy to break free of the current political stranglehold and perhaps embrace a smaller, more flexible political community that might be slightly more responsive to local taxpayers and citizens.

California is the Poster Child for Un-Responsive Government

And it's easy to see why many Californians might regard their political system as un-responsive to their particular personal and regional needs: California has, by far, the most unrepresentative state government in the United States.

For every state legislator, there are more than 310,000 California residents. Second-place Texas, with 139,000 residents per legislator, doesn't come close. These numbers aren't even in the same league, though, with quite a few other states — including especially safe, wealthy, and healthy ones — like Minnesota, Utah, and Massachusetts. Those states have 1 legislator for every 23,600; 23,500; and 33,000 residents, respectively.
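To make the arithmetic concrete, here is a minimal sketch of the residents-per-legislator calculation. The California figures (39 million residents, a 120-member legislature) are the ones used in this article; the Texas seat count (150 House members plus 31 senators) and the 28 million population estimate are assumptions from public data, not from the article.

```python
# Back-of-envelope check of the representation ratios cited above.
# California's figures come from this article; the Texas inputs are
# assumed 2018-era public figures.

def residents_per_legislator(population: int, seats: int) -> float:
    """Average number of residents represented by one state legislator."""
    return population / seats

print(f"California: {residents_per_legislator(39_000_000, 120):,.0f}")
# -> 325,000, consistent with the article's "more than 310,000"

print(f"Texas: {residents_per_legislator(28_000_000, 181):,.0f}")
# -> ~155,000 with a 2018-era population estimate; the article's
# 139,000 figure implies an earlier, smaller population base
```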

This is what passes for political representation in California's government.

As Gerard Casey has pointed out, the very concept of political representation is built on a pretty shaky foundation as is. It's implausible enough to claim that one person can truly represent the interests of 50 or 100 other people. But 20,000 people spread out over numerous communities, geographies and ethnic groups?

In some circumstances involving fairly uniform populations, even that might be something many people could swallow. But 100,000 people? 300,000? The mere suggestion of such a thing should be regarded as laughable. And yet that is the foundation on which California's "democracy" is based. Its government, politically speaking, relies on the acceptance of the idea that the state's legislature of 120 people can "represent" 39 million people spread across 163,000 square miles.

In practice, however, this idea is totally implausible, and the practical downsides are numerous as well. With such an unrepresentative scheme:

  • Large constituencies increase the cost of running campaigns, and thus require greater reliance on large wealthy interests for media buys and access to mass media. The cost of running a statewide campaign in California, for example, is considerably larger than the cost of running a statewide campaign in Vermont. Constituencies spread across several media markets are especially costly.

  • Elected officials, unable to engage a sizable portion of their constituencies, rely on large interest groups claiming to be representative of constituents.

  • Voters disengage because they realize their vote is worth less in larger constituent groups.

  • Voters disengage because they are not able to meet the candidate personally.

  • Voters disengage because elections in larger constituencies are less likely to focus on issues that are of personal, local interest to many of the voters.

  • The ability to schedule a personal meeting with an elected official is far more difficult in a large constituency than a small one.

  • Elected officials recognize that a single voter is of minimal importance in a large constituency, so candidates prefer to rely on mass media rather than personal interaction with voters.

  • Larger constituent groups are more religiously, ethnically, culturally, ideologically, and economically diverse. This means elected officials from that constituent group are less likely to share social class, ethnic group, and other characteristics with a sizable number of their constituents.

  • Larger constituencies often mean the candidate is more physically remote, even when the candidate is at "home" and not at a distant parliament or congress. This further reduces access.

California epitomizes all of this. And it's even worse when we consider California's Congressional delegation. For each US Senator in California, there are 19 million Californians. How much do you suppose a Californian's single vote is worth to each senator? Approximately zero. (And each of California's Congressional districts, by the way, contains more than 600,000 residents.)

So, when we consider California's enormous size, coupled with its tiny political class, it's easy to see that political decision making occurs within a tiny, remote minority well insulated from the lives of ordinary people.

This is true most places, of course, but California takes this reality to an extreme.

The Benefits of Splitting up the State

While it is certainly not a panacea, splitting up California into smaller pieces would be a step in the right direction for many reasons.

First of all, it would provide more options and choice to people living in California right now. Most Californians no doubt consider themselves to be "pro-choice" people. So why not embrace more "choice" among political regimes along the West Coast? With three Californias, current residents would more easily be able to relocate to a state that better fits their personal needs, without having to relocate hundreds of miles away. As it's currently proposed, a resident of Los Angeles County who seeks to change the state government he lives under may relocate to Orange County. This isn't totally convenient, of course, but it's certainly more convenient and less disruptive than having to move to Tucson or Dallas or Denver.

Decentralization has also often been shown to increase efforts to attract wealth and capital to each jurisdiction. This in turn limits the extent to which governments are willing to raise taxes and crush business with burdensome regulations. This, after all, is the model that has been working moderately well in Switzerland for centuries.

And finally, splitting up the state would help put a dent in California's utterly unrepresentative and unaccountable political class.

Even after the split-up, the three Californias would still each be among the largest states in the US — and that would still come with all the problems noted above. But, it's a place to start.

Moreover, there's no reason each new California would have to adopt the model of a tiny 120-member legislature as "Old California" does now. Those who are unreasonably attached to the status quo would no doubt object to a 400-person House of Representatives as tiny New Hampshire has now. But even with a 200-person House, as Pennsylvania has, for each of the new Californias — each with approximately 13 million people — the state's power would become a little less centralized, a little less insulated, a little less lofty.


Published:6/15/2018 9:52:39 PM
[Markets] When Will Gold's "Summer Doldrums" End? History Says Pretty Soon

Authored by John Rubino via DollarCollapse.com,

This has been a uniquely boring stretch for gold and silver – especially considering all the things going on in the world that ought to light a fire under precious metals. In just the past few weeks, the US started a global trade war, Italy elected a populist government, emerging markets descended into yet another crisis, and inflation has risen from the dead – all of which would be expected to spook normal financial markets and send capital pouring into safe havens. But not this time, which leaves precious metals under the control of seasonal factors that have over the years generated the “sell in May and go away” rule-of-thumb.

So when do the summer doldrums end? Based on recent history, December is a pretty good bet. The arrows on the following chart mark the beginning of each year since 2014. Note how gold’s price frequently starts moving up either then or a few weeks before, in early December. This seasonal strength is due to Asian buying in anticipation of weddings and harvests, and though you’d think traders would anticipate – and therefore cancel – the cycle’s impact, it still seems to operate.

Prior to 2014 the pattern was slightly different. Here’s a chart from Casey Research showing gold’s average performance for each month between 1975 and 2013. September was by far the best month to buy, with January the second best, implying an eight-month window beginning in July in which buying was rewarded with at least short-term gains.
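For readers who want to check this sort of seasonality themselves, here is a minimal sketch of the average-return-by-calendar-month computation behind charts like the one described above. The file name and column names are hypothetical placeholders, not a real data source; any dated gold price series will do.

```python
# Average gold return by calendar month across all years in a sample.
# "gold_daily.csv" and its "date"/"close" columns are placeholders.
import pandas as pd

prices = pd.read_csv("gold_daily.csv", parse_dates=["date"], index_col="date")

# Month-end closes, then month-over-month percentage returns.
monthly = prices["close"].resample("M").last()
returns = monthly.pct_change().dropna()

# Mean return for each calendar month, in percent.
seasonal = returns.groupby(returns.index.month).mean() * 100
print(seasonal.round(2))  # index 1..12 = January..December
```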

If the second pattern re-emerges, then we really don’t have long to wait at all. Maybe one more month and at most three, and gold bugs can start having fun again.

As always, though, deciding when to buy precious metals is just the first in a series of challenges. Choosing the right dealer is paramount (see here for a list of reputable ones), followed by whether to take delivery and store the metal at home or seek secure vault storage for gold and silver.

And then there’s the bullion vs mining stock question. The former is money which will hold its value over long periods of time (thus preserving your purchasing power) while the latter are investments that can rise by multiples of their original cost or disappear without a trace. Jay Taylor’s newsletter is a good source for intelligence on the explorers, the riskiest and potentially most profitable segment of the mining market.

Published:6/12/2018 8:06:33 PM
Casey Anthony's mother emotionally storms off set with husband in new A&E special Casey Anthony's mother, Cindy, becomes so emotional that she walks off the set in a new interview special with her husband, George, about the infamous death of their granddaughter, Caylee, 10 years later. Published:5/29/2018 10:36:29 AM
[Markets] How To Honor Memorial Day

Authored by Ray McGovern via ConsortiumNews.com,

Memorial Day should be a time of sober reflection on war’s horrible costs, not a moment to glorify war. But many politicians and pundits can’t resist the opportunity...

Originally published on 5/24/2015

How best to show respect for the U.S. troops killed in Iraq and Afghanistan and for their families on Memorial Day? 

Simple: Avoid euphemisms like “the fallen” and expose the lies about what a great idea it was to start those wars in the first place and then to “surge” tens of thousands more troops into those fools’ errands.

First, let’s be clear on at least this much: the 4,500 U.S. troops killed in Iraq so far and the 2,350 killed in Afghanistan [by May 2015] did not “fall.” They were wasted on no-win battlefields by politicians and generals cheered on by neocon pundits and mainstream “journalists” almost none of whom gave a rat’s patootie about the real-life-and-death troops. They were throwaway soldiers.

And, as for the “successful surges,” they were just P.R. devices to buy some “decent intervals” for the architects of these wars and their boosters to get space between themselves and the disastrous endings while pretending that those defeats were really “victories squandered” all at the “acceptable” price of about 1,000 dead U.S. soldiers each and many times that in dead Iraqis and Afghans.

Memorial Day should be a time for honesty about what enabled the killing and maiming of so many U.S. troops in Iraq and Afghanistan. Presidents George W. Bush and Barack Obama and the senior military brass simply took full advantage of a poverty draft that gives upper-class sons and daughters the equivalent of exemptions, vaccinating them against the disease of war.

What drives me up the wall is the oft-heard, dismissive comment about troop casualties from well-heeled Americans: “Well, they volunteered, didn’t they?” Under the universal draft in effect during Vietnam, far fewer were immune from service, even though the well-connected could still game the system to avoid serving. Vice Presidents Dick Cheney and Joe Biden, for example, each managed to pile up five exemptions. This means, of course, that they brought zero military experience to the job; and this, in turn, may explain a whole lot — particularly given their bosses’ own lack of military experience.

The grim truth is that many of the crème de la crème of today’s Official Washington don’t know many military grunts, at least not intimately as close family or friends. They may bump into some on the campaign trail or in an airport and mumble something like, “thank you for your service.” But these sons and daughters of working-class communities from America’s cities and heartland are mostly abstractions to the powerful, exclamation points at the end of some ideological debate demonstrating which speaker is “tougher,” who’s more ready to use military force, who will come out on top during a talk show appearance or at a think-tank conference or on the floor of Congress.

Sharing the Burden?

We should be honest about this reality, especially on Memorial Day. Pretending that the burden of war has been equitably shared, and worse still that those killed died for a “noble cause,” as President George W. Bush liked to claim, does no honor to the thousands of U.S. troops killed and the tens of thousands maimed. It dishonors them. Worse, it all too often succeeds in infantilizing bereaved family members who cannot bring themselves to believe their government lied.

Sheehan: Few like her. (Photo: Joe Raedle/Getty Images)

Who can blame parents for preferring to live the fiction that their sons and daughters were heroes who wittingly and willingly made the “ultimate sacrifice,” dying for a “noble cause,” especially when this fiction is frequently foisted on them by well-meaning but naive clergy at funerals? For many it is impossible to live with the reality that a son or daughter died in vain. Far easier to buy into the official story and to leave clergy unchallenged as they gild the lilies around coffins and gravesites.

Not so for some courageous parents. Cindy Sheehan, for example, whose son Casey Sheehan was killed on April 4, 2004, in the Baghdad suburb of Sadr City, demonstrated uncommon grit when she led hundreds of friends to Crawford to lay siege to the Texas White House during the summer of 2005 trying to get Bush to explain what “noble cause” Casey died for. She never got an answer. There is none.

But there are very few, like Cindy Sheehan, able to overcome a natural human resistance to the thought that their sons and daughters died for a lie and then to challenge that lie. These few stalwarts make themselves face this harsh reality, the knowledge that the children whom they raised and sacrificed so much for were, in turn, sacrificed on the altar of political expediency, that their precious children were bit players in some ideological fantasy or pawns in a game of career maneuvering.

Former Secretary of State Henry Kissinger is said to have described the military disdainfully as “just dumb stupid animals to be used as pawns in foreign policy.” Whether or not those were his exact words, his policies and behavior certainly betrayed that attitude. It certainly seems to have prevailed among top American-flag-on-lapel-wearing officials of the Bush and Obama administrations, including armchair and field-chair generals whose sense of decency is blinded by the prospect of a shiny new star on their shoulders, if they just follow orders and send young soldiers into battle.

This bitter truth should raise its ugly head on Memorial Day but rarely does. It can be gleaned only with great difficulty from the mainstream media, since the media honchos continue to play an indispensable role in the smoke-and-mirrors dishonesty that hides their own guilt in helping Establishment Washington push “the fallen” from life to death.

We must judge the actions of our political and military leaders not by the pious words they will utter Monday in mourning those who “fell” far from the generals’ cushy safe seats in the Pentagon or somewhat closer to the comfy beds in air-conditioned field headquarters where a lucky general might be comforted in the arms of an admiring and enterprising biographer.

A military band and flag-waving for America’s national religion at National Airport, Washington, May 26, 2018. (Photo by Joe Lauria)

Many of the high-and-mighty delivering the approved speeches on Monday will glibly refer to and mourn “the fallen.” None are likely to mention the culpable policymakers and complicit generals who added to the fresh graves at Arlington National Cemetery and around the country.

Words, after all, are cheap; words about “the fallen” are dirt cheap especially from the lips of politicians and pundits with no personal experience of war. The families of those sacrificed in Iraq and Afghanistan should not have to bear that indignity.

‘Successful Surges’

The so-called “surges” of troops into Iraq and Afghanistan were particularly gross examples of the way our soldiers have been played as pawns. Since the usual suspects are again coming out of the woodwork of neocon think tanks to press for yet another “surge” in Iraq, some historical perspective should help.

Take, for example, the well-known and speciously glorified first “surge,” the one Bush resorted to in sending over 30,000 additional troops into Iraq in early 2007, and the not-to-be-outdone Obama “surge” of 30,000 into Afghanistan in early 2010. These marches of folly were the direct result of decisions by George W. Bush and Barack Obama to prioritize political expediency over the lives of U.S. troops.

Taking cynical advantage of the poverty draft, they let foot soldiers pay the “ultimate” price. That price was 1,000 U.S. troops killed in each of the two “surges.”

And the results? The returns are in. The bloody chaos these days in Iraq and the faltering war in Afghanistan were entirely predictable. They were indeed predicted by those of us able to spread some truth around via the Internet, while being mostly blacklisted by the fawning corporate media.

Yet, because the “successful surge” myth was so beloved in Official Washington, saving some face for the politicians and pundits who embraced and spread the lies that justified and sustained especially the Iraq War, the myth has become something of a touchstone for everyone aspiring to higher office or seeking a higher-paying gig in the mainstream media.

Campaigning in New Hampshire, [then] presidential aspirant Jeb Bush gave a short history lesson about his big brother’s attack on Iraq. Referring to the so-called Islamic State, Bush said, “ISIS didn’t exist when my brother was president. Al-Qaeda in Iraq was wiped out … the surge created a fragile but stable Iraq. …”

We’ve dealt with the details of the Iraq “surge” myth both before and after it was carried out. [See, for instance, Consortiumnews.com’s “Reviving the Successful Surge Myth”; “Gen. Keane on Iran Attack”; “Robert Gates: As Bad as Rumsfeld?”; and “Troop Surge Seen as Another Mistake.”]

But suffice it to say that Jeb Bush is distorting the history and should be ashamed. The truth is that al-Qaeda did not exist in Iraq before his brother launched an unprovoked invasion in 2003. “Al-Qaeda in Iraq” arose as a direct result of Bush’s war and occupation. Amid the bloody chaos, AQI’s leader, a Jordanian named Abu Musab al-Zarqawi, pioneered a particularly brutal form of terrorism, relishing videotaped decapitation of prisoners.

Zarqawi was eventually hunted down and killed not during the celebrated “surge” but in June 2006, months before Bush’s “surge” began. The so-called Sunni Awakening, essentially the buying off of many Sunni tribal leaders, also predated the “surge.” And the relative reduction in the Iraq War’s slaughter after the 2007 “surge” was mostly the result of the ethnic cleansing of Baghdad from a predominantly Sunni to a Shia city, tearing the fabric of Baghdad in two, and creating physical space that made it more difficult for the two bitter enemies to attack each other. In addition, Iran used its influence with the Shia to rein in their extremely violent militias.

Though weakened by Zarqawi’s death and the Sunni Awakening, AQI did not disappear, as Jeb Bush would like you to believe. It remained active, and when Saudi Arabia and the Sunni gulf states took aim at the secular regime of Bashar al-Assad in Syria, AQI joined with other al-Qaeda affiliates, such as the Nusra Front, to spread their horrors across Syria. AQI rebranded itself “the Islamic State of Iraq and Syria” or simply “the Islamic State.”

The Islamic State split off from al-Qaeda over strategy but the various jihadist armies, including al-Qaeda’s Nusra Front, [then] seized wide swaths of territory in Syria — and the Islamic State returned with a vengeance to Iraq, grabbing major cities such as Mosul and Ramadi.

Jeb Bush doesn’t like to unspool all this history. He and other Iraq War backers prefer to pretend that the “surge” in Iraq had won the war and Obama threw the “victory” away by following through on George W. Bush’s withdrawal agreement with Maliki.

But the crisis in Syria and Iraq is among the fateful consequences of the U.S./UK attack 12 years ago and particularly of the “surge” of 2007, which contributed greatly to Sunni-Shia violence, the opposite of what George W. Bush professed was the objective of the “surge,” to enable Iraq’s religious sects to reconcile.

Reconciliation, however, always took a back seat to the real purpose of the “surge”: buying time so Bush and Cheney could slip out of Washington in 2009 without having an obvious military defeat hanging around their necks and putting a huge stain on their legacies.

Cheney and Bush: Reframed the history. (White House photo)

The political manipulation of the Iraq “surge” allowed Bush, Cheney and their allies to reframe the historical debate and shift the blame for the defeat onto Obama, recognizing that 1,000 more dead U.S. soldiers was a small price to pay for protecting the “Bush brand.” Now, Bush’s younger brother can cheerily march off to the campaign trail for 2016 pointing to the carcass of the Iraqi albatross hung around Obama’s shoulders.

Rout at Ramadi

Less than a year after U.S.-trained and -equipped Iraqi forces ran away from the northern Iraqi city of Mosul, leaving the area and lots of U.S. arms and equipment to ISIS, something similar happened at Ramadi, the capital of the western province of Anbar. Despite heavy U.S. air strikes on ISIS, American-backed Iraqi security forces fled Ramadi, which is only 70 miles west of Baghdad, after a lightning assault by ISIS forces.

The ability of ISIS to strike just about everywhere in the area is reminiscent of the Tet offensive of January-February 1968 in Vietnam, which persuaded President Lyndon Johnson that that particular war was unwinnable. If there are materials left over in Saigon for reinforcing helicopter landing pads on the tops of buildings, it is not too early to bring them to Baghdad’s Green Zone, on the chance that U.S. embassy buildings may have a call for such materials in the not-too-distant future.

The headlong Iraqi government retreat from Ramadi had scarcely ended when Sen. John McCain (R-AZ) described the fall of the city as “terribly significant” (which is correct), adding that more U.S. troops may be needed (which is insane). His appeal for more troops neatly fit one proverbial definition of insanity (attributed or misattributed to Albert Einstein): “doing the same thing over and over again [like every eight years?] but expecting different results.”

As Jeb Bush was singing the praises of his brother’s “surge” in Iraq, McCain and his Senate colleague Lindsey Graham were publicly calling for a new “surge” of U.S. troops into Iraq. The senators urged President Obama to do what George W. Bush did in 2007: replace the U.S. military leadership and dispatch additional troops to Iraq.

But Washington Post pundit David Ignatius, even though a fan of the earlier two surges, was not yet on board for this one. Ignatius warned in a column that Washington should not abandon its current strategy:

“This is still Iraq’s war, not America’s. But President Barack Obama must reassure Prime Minister Haider al-Abadi that the U.S. has his back, and at the same time give him a reality check: If al-Abadi and his Shiite allies don’t do more to empower Sunnis, his country will splinter. Ramadi is a precursor of either a turnaround by al-Abadi’s forces, or an Iraqi defeat.”

Ignatius’s urgent tone was warranted. But what he suggests is precisely what the U.S. made a lame attempt to do with then-Prime Minister Maliki in early 2007. Yet, Bush squandered U.S. leverage by sending 30,000 troops to show he “had Maliki’s back,” freeing Maliki to accelerate his attempts to marginalize, rather than accommodate, Sunni interests.

Perhaps Ignatius now remembers how the “surge” he championed in 2007 greatly exacerbated tensions between Shia and Sunni, contributing to the chaos now prevailing in Iraq and spreading across Syria and elsewhere. But Ignatius is well connected and a bellwether; if he ends up advocating another “surge,” take shelter.

Keane and Kagan Ask For a Mulligan

Jeb Bush: Sung his brother’s praises. (Sun City Center, Florida, on May 9, 2006. White House photo by Eric Draper)

The architects of Bush’s 2007 “surge” of 30,000 troops into Iraq, former Army General Jack Keane and American Enterprise Institute neocon strategist Frederick Kagan, in testimony to the Senate Armed Services Committee, warned strongly that, without a “surge” of some 15,000 to 20,000 U.S. troops, ISIS would win in Iraq.

“We are losing this war,” warned Keane, who previously served as Vice Chief of Staff of the Army. “ISIS is on the offense, with the ability to attack at will, anyplace, anytime. … Air power will not defeat ISIS.” Keane stressed that the U.S. and its allies have “no ground force, which is the defeat mechanism.”

Not given to understatement, Kagan called ISIS “one of the most evil organizations that has ever existed. … This is not a group that maybe we can negotiate with down the road someday. This is a group that is committed to the destruction of everything decent in the world.” He called for “15-20,000 U.S. troops on the ground to provide the necessary enablers, advisers and so forth,” and added: “Anything less than that is simply unserious.”

(By the way, Frederick Kagan is the brother of neocon-star Robert Kagan, whose Project for the New American Century began pushing for the invasion of Iraq in 1998 and finally got its way in 2003. Robert Kagan is the husband of Assistant Secretary of State for European Affairs Victoria Nuland, who oversaw the 2014 coup that brought “regime change” and bloody chaos to Ukraine. The Ukraine crisis also prompted Robert Kagan to urge a major increase in U.S. military spending. [For details, see Consortiumnews.com’s “A Family Business of Perpetual War.”] )

What is perhaps most striking, however, is the casualness with which the likes of Frederick Kagan, Jack Keane, and other Iraq War enthusiasts advocated dispatching tens of thousands of U.S. soldiers to fight and die in what would almost certainly be another futile undertaking. You might even wonder why people like Kagan are invited to testify before Congress given their abysmal records.

But that would miss the true charm of the Iraq “surge” in 2007 and its significance in salvaging the reputations of folks like Kagan, not to mention George W. Bush and Dick Cheney. From their perspective, the “surge” was a great success. Bush and Cheney could swagger from the West Wing into the western sunset on Jan. 20, 2009.

As author Steve Coll has put it, “The decision [to surge] at a minimum guaranteed that his [Bush’s] presidency would not end with a defeat in history’s eyes. By committing to the surge [the President] was certain to at least achieve a stalemate.”

According to Bob Woodward, Bush told key Republicans in late 2005 that he would not withdraw from Iraq, “even if Laura and [first-dog] Barney are the only ones supporting me.” Woodward made it clear that Bush was well aware in fall 2006 that the U.S. was losing. Suddenly, with some fancy footwork, it became Laura, Barney and new Defense Secretary Robert Gates and Gen. David Petraeus along with 30,000 more U.S. soldiers making sure that the short-term fix was in.

The fact that about 1,000 U.S. soldiers returned in caskets was the principal price paid for that short-term “surge” fix. Their “ultimate sacrifice” will be mourned by their friends, families and countrymen on Memorial Day even as many of the same politicians and pundits will be casually pontificating about dispatching more young men and women as cannon fodder into the same misguided war.

[President Donald Trump has continued the U.S.’s longest war (Afghanistan), sending additional troops and dropping a massive bomb as well as missiles from drones. In Syria he has ordered two missile strikes and condoned multiple air strikes from Israel. Here’s hoping, on this Memorial Day 2018, that he turns his back on his war-mongering national security adviser, forges ahead with a summit with North Korean leader Kim Jong-un rather than toy with the lives of 30,000 U.S. soldiers in Korea, and halts the juggernaut rolling downhill toward war with Iran.]

It was difficult drafting this downer, this historical counter-narrative, on the eve of Memorial Day. It seems to me necessary, though, to expose the dramatis personae who played such key roles in getting more and more people killed. Sad to say, none of the high officials mentioned here, as well as those on the relevant Congressional committees, were affected in any immediate way by the carnage in Ramadi, Tikrit or outside the gate to the Green Zone in Baghdad.

And perhaps that’s one of the key points here. It is not most of us, but rather our soldiers and the soldiers and civilians of Iraq, Afghanistan and God knows where else who are Lazarus at the gate. And, as Benjamin Franklin once said, “Justice will not be served until those who are unaffected are as outraged as those who are.”

Published:5/28/2018 3:00:26 PM
[Politics] Trump: Pennsylvania Sen. Bob Casey 'A Do-Nothing Senator' Offering praise for Rep. Lou Barletta, R-Pa., winner of the Republican nomination for Senate, President Donald Trump slammed Sen. Bob Casey, D-Pa., on Wednesday. Published:5/16/2018 2:45:28 PM
[World] Pennsylvania Primary Day: Diners Talk Trump on Fox & Friends

It's primary day in Pennsylvania, where Republicans Lou Barletta and Jim Christiana are facing off for the right to challenge Democratic Sen. Bob Casey in November.

Published:5/15/2018 9:07:29 AM
[Markets] Credit-Driven Train Crash, Part 1

Authored by John Mauldin via MauldinEconomics.com,

In 1999, I began saying the tech bubble would eventually spark a recession. Timing was unclear because stock bubbles can blow way bigger than we can imagine. Then the yield curve inverted, and I said recession was certain. I was early in that call, but it happened.

In late 2006, I began highlighting the subprime crisis, and subsequently the yield curve again inverted, necessitating another recession call. Again, I was early, but you see the pattern.

Now let’s fast-forward to today. Here’s what I said last week that drew so much interest.

Peter [Boockvar] made an extraordinarily cogent comment that I’m going to use from now on: “We no longer have business cycles, we have credit cycles.”

For those who don’t know Peter, he is the CIO of Bleakley Advisory Group and editor of the excellent Boock Report. Let’s cut that small but meaty sound bite into pieces.

What do we mean by “business cycle,” exactly? Well, it looks something like this:


Photo: Wikispaces (Creative Commons license)

A growing economy peaks, contracts to a trough (what we call “recession”), recovers to enter prosperity, and hits a higher peak. Then the process repeats. The economy is always in either expansion or contraction.

Economists disagree on the details of all this. Wikipedia has a good overview of the various perspectives, if you want to geek out. The high-level question is why economies must cycle at all. Why can’t we have steady growth all the time? Answers vary. Whatever it is, periodically something derails growth and something else restarts it.

This pattern broke down in the last decade. We had an especially painful contraction followed by an extraordinarily weak expansion. GDP growth should reach 5% in the recovery and prosperity phases, not the 2% we have seen. Peter blames the Federal Reserve’s artificially low interest rates. Here’s how he put it in an April 18 letter to his subscribers.

To me, it is a very simple message being sent. We must understand that we no longer have economic cycles. We have credit cycles that ebb and flow with monetary policy. After all, when the Fed cuts rates to extremes, its only function is to encourage the rest of us to borrow a lot of money and we seem to have been very good at that. Thus, in reverse, when rates are being raised, when liquidity rolls away, it discourages us from taking on more debt. We don’t save enough.

This goes back farther than 2008. The Greenspan Fed pushed rates abnormally low in the late 1990s even though the then-booming economy needed no stimulus. That was in part to provide liquidity to a Y2K-wary public and partly in response to the 1998 market turmoil, but they were slow to withdraw the extra cash. Bernanke was again generous to borrowers in the 2000s, contributing to the housing crisis and Great Recession. We’re now 20 years into training people (and businesses) that running up debt is fun and easy… and they’ve responded.

But over time, debt stops stimulating growth. Over this series, we will see that it takes more debt accumulation for every point of GDP growth, both in the US and elsewhere. Hence, the flat-to-mild “recovery” years. I’ve cited academic literature via my friend Lacy Hunt that debt eventually becomes a drag on growth.

Debt-fueled growth is fun at first but simply pulls forward future spending, which we then miss. Now we’re entering the much more dangerous reversal phase in which the Fed tries to break the debt addiction. We all know that never ends well.

So, Peter’s point is that a Fed-driven credit cycle now supersedes the traditional business cycle. Since debt drives so much GDP growth, its cost (i.e. interest rates) is the main variable defining where we are in the cycle. The Fed controls that cost—or at least tries to—so we all obsess on Fed policy. And rightly so.

Among other effects, debt boosts asset prices. That’s why stocks and real estate have performed so well. But with rates now rising and the Fed unloading assets, those same prices are highly vulnerable. An asset’s value is what someone will pay for it. If financing costs rise and buyers lack cash, the asset price must fall. And fall it will. The consensus at my New York dinner was recession in the last half of 2019. Peter expects it sooner, in Q1 2019.

If that’s right, financial market fireworks aren’t far away.

Corporate Debt Disaster

In an old-style economic cycle, recessions triggered bear markets. Economic contraction slowed consumer spending, corporate earnings fell, and stock prices dropped. That’s not how it works when the credit cycle is in control. Lower asset prices aren’t the result of a recession. They cause the recession. That’s because access to credit drives consumer spending and business investment. Take it away and they decline. Recession follows.
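The pricing mechanism behind that claim is simple discounting. Here is a minimal illustrative sketch, valuing an asset as a perpetuity (price equals cash flow divided by required yield); all numbers are invented for the example, not taken from the letter.

```python
# Why rising financing costs pull asset prices down: a perpetuity
# paying a fixed cash flow is worth CF / r, so the price falls as the
# required yield r rises. Numbers are invented for illustration.

def perpetuity_price(cash_flow: float, required_yield: float) -> float:
    return cash_flow / required_yield

for r in (0.04, 0.05, 0.06):
    print(f"required yield {r:.0%}: price = {perpetuity_price(100.0, r):,.0f}")
# 4% -> 2,500   5% -> 2,000   6% -> 1,667
# Each 100 bp rise in the required yield knocks 17-20% off the price.
```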

If some of this sounds like the Hyman Minsky financial instability hypothesis I’ve described before, you’re exactly right. Minsky said exuberant firms take on too much debt, which paralyzes them, and then bad things start happening. I think we’re approaching that point.

The last “Minsky Moment” came from subprime mortgages and associated derivatives. Those are getting problematic again, but I think today’s bigger risk is the sheer amount of corporate debt, especially high-yield bonds that will be very hard to liquidate in a crisis.

Corporate debt is now at a level that has not ended well in past cycles. Here’s a chart from Dave Rosenberg:

Source: Gluskin Sheff

The Debt/GDP ratio could go higher still, but I think not much more. Whenever it falls, lenders (including bond fund and ETF investors) will want to sell. Then comes the hard part: to whom?

You see, it’s not just borrowers who’ve become accustomed to easy credit. Many lenders assume they can exit at a moment’s notice. One reason for the Great Recession was that so many borrowers had sold short-term commercial paper to buy long-term assets. Things got worse when they couldn’t roll over the debt, and some are now doing exactly the same thing again, except in much riskier high-yield debt. We have two related problems here.

  • Corporate debt and especially high-yield debt issuance has exploded since 2009.
  • Tighter regulations discouraged banks from making markets in corporate and HY debt.

Both are problems but the second is worse. Experts tell me that Dodd-Frank requirements have reduced major bank market-making abilities by around 90%. For now, bond market liquidity is fine because hedge funds and other non-bank lenders have filled the gap. The problem is they are not true market makers. Nothing requires them to hold inventory or buy when you want to sell. That means all the bids can “magically” disappear just when you need them most. These “shadow banks” are not in the business of protecting your assets. They are worried about their own profits and those of their clients.

Gavekal’s Louis Gave wrote a fascinating article on this last week titled, “The Illusion of Liquidity and Its Consequences.” He pulled the numbers on corporate bond ETFs and compared them to the inventory trading desks were holding—a rough measure of liquidity.

(Incidentally, you’ll get that full report on Monday if you subscribe to Over My Shoulder. What you learn could easily pay for your first year.)

Louis found dealer inventory is not remotely enough to accommodate the selling he expects as higher rates bite more.

We now have a corporate bond market that has roughly doubled in size while the willingness and ability of bond dealers to provide liquidity into a stressed market has fallen by more than 80%. At the same time, this market has a brand-new class of investors, who are likely to expect daily liquidity if and when market behavior turns sour. At the very least, it is clear that this is a very different corporate bond market and history-based financial models will most likely be found wanting.

The “new class” of investors he mentions are corporate bond ETF and mutual fund shareholders. These funds have exploded in size (high yield alone is now around $2 trillion) and their design presumes a market with ample liquidity. We barely have such a market right now, and we certainly won’t have one after rates jump another 50–100 basis points.
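To put rough numbers on that 50-100 basis point scenario, here is a back-of-envelope sketch using the standard duration approximation (percentage price change is roughly minus duration times the yield change). The four-year duration is an assumed, typical figure for a high-yield bond fund, not a number from the letter.

```python
# Duration approximation: dP/P ~= -D * dy. The 4-year duration is an
# assumed, typical figure for a high-yield bond fund.

def price_change_pct(duration_years: float, yield_change_bp: float) -> float:
    # yield_change_bp / 100 converts basis points to percentage points
    return -duration_years * yield_change_bp / 100.0

for bp in (50, 100):
    print(f"+{bp} bp at 4y duration: {price_change_pct(4.0, bp):+.1f}%")
# +50 bp -> -2.0%   +100 bp -> -4.0%, before any liquidity discount
```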

Worse, I don’t have enough exclamation points to describe the disaster when high-yield funds, often purchased by mom-and-pop investors in a reach for yield, all try to sell at once, and the funds sell anything they can at fire-sale prices to meet redemptions.

In a bear market you sell what you can, not what you want to. We will look at what happens to high-yield funds in bear markets in a later letter. The picture is not pretty.

To make matters worse, many of these lenders are far more leveraged this time. They bought their corporate bonds with borrowed money, confident that low interest rates and low default rates would keep risks manageable. In fact, according to S&P Global Market Watch, 77% of leveraged corporate bonds are what’s known as “covenant-lite.” We’ll discuss this more later in the series, but the short answer is that the borrower doesn’t have to repay by conventional means. Sometimes they can even force the lender to take more debt. In an odd way, some of these “covenant-lite” borrowers can actually “print their own money.”

Somehow, lenders thought it was a good idea to buy those bonds. Maybe that made sense in good times. In bad times? It can precipitate a crisis. As the economy enters recession, many companies will lose their ability to service debt, especially now that the Fed is making it more expensive to roll over—as multiple trillions of dollars will need to do in the next few years. Normally this would be the borrowers’ problem, but covenant-lite lenders took it on themselves.

The macroeconomic effects will spread even more widely. Companies that can’t service their debt have little choice but to shrink. They will do it via layoffs, reducing inventory and investment, or selling assets. All those reduce growth and, if widespread enough, lead to recession.

Let’s look at this data and troubling chart from Bloomberg:

Companies will need to refinance an estimated $4 trillion of bonds over the next five years, about two-thirds of all their outstanding debt, according to Wells Fargo Securities. This has investors concerned because rising rates means it will cost more to pay for unprecedented amounts of borrowing, which could push balance sheets toward a tipping point. And on top of that, many see the economy slowing down at the same time the rollovers are peaking.

“If more of your cash flow is spent into servicing your debt and not trying to grow your company, that could, over time—if enough companies are doing that—lead to economic contraction,” said Zachary Chavis, a portfolio manager at Sage Advisory Services Ltd. in Austin, Texas. “A lot of people are worried that could happen in the next two years.”

The problem is that much of the $2 trillion in bond ETF and mutual funds isn’t owned by long-term investors who hold to maturity. When the herd of investors calls up to redeem, there will be no bids for their “bad” bonds. But they’re required to pay redemptions, so they’ll have to sell their “good” bonds. Remaining investors will be stuck with an increasingly poor-quality portfolio, which will drop even faster. Wash, rinse, repeat. Those of us with a little gray hair have seen this before, but I think the coming one is potentially biblical in proportion.
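That wash-rinse-repeat dynamic can be sketched in a few lines. This toy simulation assumes a fund that meets each redemption wave by selling only its liquid "good" bonds, because the "bad" ones have no bid; every figure is invented for illustration.

```python
# Toy redemption spiral: outflows are met by selling liquid holdings,
# so the illiquid share of the remaining portfolio ratchets upward.
good, bad = 70.0, 30.0                  # $mm of liquid vs. no-bid bonds
for wave in range(1, 5):
    redemption = 0.15 * (good + bad)    # assume 15% of assets leave per wave
    sold = min(redemption, good)        # only the good bonds find buyers
    good -= sold
    bad_share = bad / (good + bad) * 100
    print(f"wave {wave}: sold {sold:4.1f}, illiquid share {bad_share:4.1f}%")
# The illiquid share climbs from 30% toward 100% as the good bonds go first.
```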

Casey Jones via Wikimedia Commons

Blowing the Whistle

As you can tell, this is a multifaceted problem. I will dig deeper into the specifics in the coming weeks. The numbers seem unbelievable. I truly think we are headed to a staggering credit crisis.

I began this letter describing the coming events as a train wreck. That comparison came up when my colleague Patrick Watson and I were on the phone this week, planning this series of letters. Patrick and his beautiful wife Grace had just come back from Tennessee, and he told me about visiting the Casey Jones birthplace museum in Jackson.

For those who don’t know the story or haven’t heard the songs, Casey Jones was a talented young railroad engineer in the late 1800s. On April 30, 1900, he was running at top speed when his train came upon a stopped train that wasn’t supposed to be there.

Traveling at 75 miles per hour, Jones ordered his young fireman to jump, pulled the brakes hard, and blew the train whistle, warning his passengers and the other train. Later investigations found he had slowed it to 35 mph before impact. Everyone on both trains survived… except Casey Jones.

His heroic death made Jones a folk hero to this day. Many songs told the story and even the Grateful Dead and AC/DC paid tribute decades later. (Trivia: He actually tuned his train whistle with six different tubes to make a unique whippoorwill sound. So, when people heard his train whistle, they knew it was Casey Jones.)

Right now, the US economy is kind of like that train: speeding ahead with the Fed only slowly removing the fuel it shouldn’t have loaded in the first place and passengers just hoping to reach our destination on time. Unfortunately, we don’t have a reliable Casey Jones at the throttle. We’re at the mercy of central bankers and politicians who aren’t looking ahead. They can’t simply turn the steering wheel. We are stuck on this track and will go where it takes us.

Next week, we’ll talk about the sequence of how the next debt crisis will arise, how it triggers a recession, and then $2 trillion of deficits in the US and rising debt all over the world. Which just increases pressures on interest rates and lending. And reduces growth. It is not a virtuous cycle.

Published:5/12/2018 1:19:37 PM
[Markets] Gold And Silver: Sell In May And Go Away? Not Exactly

Authored by John Rubino via DollarCollapse.com,

It’s easy to dismiss seasonality in the price of a tradable asset. After all, if supply and demand fluctuate regularly you’d think the resulting arbitrage would attract enough traders to smooth out prices.

But that’s apparently not the case with gold and silver. Here’s an analysis from Casey Research on the subject with a couple of highly revealing charts.

Gold Is Seasonal: When Is The Best Month To Buy?

Many investors, especially those new to precious metals, don’t know that gold is seasonal. For a variety of reasons, notably including the wedding season in India, the price of gold fluctuates in fairly consistent ways over the course of the year.

This pattern is borne out by decades of data, and hence has obvious implications for gold investors.

Can you guess which is the best month for buying gold?

When I first entertained this question, I guessed June, thinking it would be a summer month when the price would be at its weakest. Finding I was wrong, I immediately guessed July. Wrong again, I was sure it would be August. Nope.

Cutting to the chase, here are gold’s average monthly gain and loss figures, based on almost 40 years of data:

Since 1975—the first year gold ownership in the US was made legal again—March has been, on average, the worst-performing month for gold.

This, of course, makes March the best month for buying gold.

Here’s what buying in March has meant to past investors. We measured how well gold performed by December of each year if you had bought during the weak month of March.
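Here is a minimal sketch of that buy-in-March, check-by-December measurement. The file and column names are hypothetical placeholders; any month-end gold price series with a datetime index would work.

```python
# March-to-December gold return for each year in a sample.
# "gold_monthly.csv" is a hypothetical placeholder data file.
import pandas as pd

closes = pd.read_csv("gold_monthly.csv", parse_dates=["date"],
                     index_col="date")["close"]

returns = {}
for year, series in closes.groupby(closes.index.year):
    march = series[series.index.month == 3]
    december = series[series.index.month == 12]
    if not march.empty and not december.empty:
        returns[year] = (december.iloc[0] / march.iloc[0] - 1) * 100

by_year = pd.Series(returns)
print(by_year.round(1))                  # % gain from March to December
print(f"average: {by_year.mean():.1f}%")
```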

What does this pattern mean for us here at the end of April? Most likely a couple of boring months are in store, in which both upside potential and downside risk are modest. This in turn means that decisions (to stack, take profits, or add to mining share portfolios) can be considered instead of rushed, with good-until-canceled orders being allowed to sit until, in the summer doldrums, some bored trader comes along to make you a good deal.

COTs remain bad for gold, good for silver

Meanwhile the structure of the gold futures market got a little less screamingly bad, with both speculators and commercials moving back in the direction of balance.

They’re still a long way from what would normally be thought of as a bullish posture, however. Speculators are too long and commercials are extremely short, which traditionally precedes a drop in price. Maybe “sell in May and go away” will turn out to apply for gold this time.

Silver speculators closed out a lot of shorts last week, but are still closer to net short than is usual. That’s bullish, just less so than in the past couple of weeks.

In other words, nothing huge to report, just as you’d expect as precious metals enter their slow season.

Published:4/29/2018 6:26:40 PM
[Markets] Simon Black On "The Coming Boom In Gold Prices..."

Authored by Simon Black via SovereignMan.com,

In June 1884, a local farmer named Jan Gerritt Bantjes discovered gold on his property in a quiet corner of the South African Republic.

Though no one had any idea at the time, Bantjes’ farm was located on a vast geological formation known as the Witwatersrand Basin… which just happens to contain the world’s largest known gold reserves.

Within a few months, other local farmers started discovering gold… kicking off a full-fledged gold rush.

Just over a decade later, South Africa became the largest gold producer in the world… and the city of Johannesburg grew from absolutely nothing to a thriving boomtown.

This area is singlehandedly responsible for 40% of all the gold discovered in human history – some 2 billion ounces (or $2.6 trillion of wealth at today’s gold price).

And while the Witwatersrand Basin is still being mined to this day, it’s not as active as it used to be.

Gold production in Witwatersrand peaked in 1970, when miners pulled a whopping 1,000 metric tons of gold out of the ground.

A few decades later, in 2016, the same area produced just 166 tonnes – a decline of 83%.

That’s not unusual in the natural resource business.

Whereas it takes nature hundreds of millions of years to deposit minerals deep in the earth’s crust, human beings only require a few decades to pull most of it out.

This creates the constant need for mining companies to explore for more and more major discoveries.

Problem is, that’s not happening. Mining companies aren’t finding any more vast deposits.

According to Pierre Lassonde, founder of the gold royalty giant Franco-Nevada and former head of Newmont Mining:

If you look back to the 70s, 80s and 90s, in every one of those decades, the industry found at least one 50+ million-ounce gold deposit, at least ten 30+ million ounce deposits, and countless 5 to 10 million ounce deposits.

But if you look at the last 15 years, we found no 50-million-ounce deposit, no 30 million ounce deposit and only very few 15 million ounce deposits.

So where are those great big deposits we found in the past? How are they going to be replaced? We don’t know.

Bottom line: gold discoveries are dwindling.

Part of the reason for this is that mining companies aren’t investing as much money in exploration.

According to S&P Global Market Intelligence, major mining companies (excluding those in the iron ore business) have been cutting their exploration budgets for years.

By the end of 2016, exploration budgets hit an 11-year low.

And this has clearly had an effect on new discoveries.

This is all because the gold price has been relatively flat for the past several years.

Investors have lost interest. And the mining companies, eager to cut costs, have pared back their exploration budgets as a result.

But this is where it gets interesting: natural resources are cyclical. They go through extreme periods of BOOM and BUST.

When gold prices are high, major mining companies scramble for new discoveries.

Eventually when they start mining those deposits, though, the supply of gold increases, pushing prices down.

As the price falls, the miners’ profit margins fall, which causes investors to lose interest and the miners to reduce production.

This causes supply to fall, prices to increase, and the cycle starts all over again.

In a way it’s almost comical. And that brings us to today. Well, technically yesterday.

We’ve been seeing for more than a year that interest rates have been rising.

Yesterday afternoon the yield on the 10-year US Treasury note surpassed 3% for the first time since 2014.

And oil prices have been rising steadily as well.

Financial markets don’t like this combination– it means that inflation is coming. Big time. And stocks plummeted worldwide as a result.

Now, that immediate reaction was probably a bit too panicky.

But the deep concern that inflation is coming (or has already arrived) is completely valid.

Inflation is a HUGE problem. And the traditional hedge in times of inflation is GOLD.

But remember– new gold discoveries have collapsed in the past 15 years.

And, as Lassonde said above, there are few discoveries on the horizon to make up the difference.

These companies can’t just go out and start a new mine, either. Even if they found a promising deposit, with all of the bureaucratic red tape, it would take seven to nine years to start producing gold.

So when demand for gold really starts to heat up, the supply won’t be there.

And this could really cause the gold price to soar. (Silver could rise even more… but we’ll save that for another time.)

Now, there are plenty of small, highly speculative companies, known as ‘junior miners’, that specialize in exploring for new deposits.

And when the gold market is in a frenzy, juniors with great deposits tend to be acquired at ridiculous prices by the major miners.

To be clear, I’m not suggesting you load up on junior miners – you can make a lot of money if you know what you’re doing, and LOSE a lot of money if you don’t.

These are tiny, extremely high-risk companies often run by sharks and con-men.

As Doug Casey writes in his novel Speculator, they’re great at taking YOUR money and THEIR dream, and turning it into THEIR money and YOUR dream.

Fortunately there are safer ways to take advantage of this looming imbalance between supply and demand in the gold market.

Physical coins are an easy option.

Gold coins typically sell at a price that’s higher than the market price of gold – to account for the work involved in minting the coin.

This price difference is known as the ‘premium’.

And when gold becomes popular, the premiums often increase too.

This means you can make money both from the rise in gold prices, as well as the increased premiums.
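A hypothetical worked example makes the point; the prices and premiums below are round-number assumptions, not market quotes:

spot_buy, premium_buy = 1300.0, 0.05     # buy at 5% over spot
spot_sell, premium_sell = 1600.0, 0.10   # sell later at 10% over spot

cost = spot_buy * (1 + premium_buy)           # $1,365 paid per coin
proceeds = spot_sell * (1 + premium_sell)     # $1,760 received per coin
print(f"Total return: {proceeds / cost - 1:.1%}")           # ~28.9%
print(f"Spot-only return: {spot_sell / spot_buy - 1:.1%}")  # ~23.1%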

Avoid anything obscure– stick to the most popular gold coins like Canadian Maple Leafs.

And to continue learning how to ensure you thrive no matter what happens next in the world, I encourage you to download our free Perfect Plan B Guide.

Published:4/26/2018 9:06:13 AM
[Markets] Doug Casey Warns "It's Going To Get Very Unpleasant In The US At Some Point Soon"

Authored by Doug Casey via InternationalMan.com,

You’re likely aware that I’m a libertarian. But I’m actually more than a libertarian. I don’t believe in the right of the State to exist. The reason is that anything that has a monopoly of force is extremely dangerous.

As Mao Tse-tung, lately one of the world’s leading experts on government, said: “The power of the state comes out of the barrel of a gun.”

There are two possible ways for people to relate to each other, either voluntarily or coercively. And the State is pure institutionalized coercion. It’s not just unnecessary, but antithetical, for a civilized society. And that’s increasingly true as technology advances. It was never moral, but at least it was possible, in oxcart days, for bureaucrats to order things around. Today it’s ridiculous.

Everything that needs doing can and will be done by the market, by entrepreneurs who fill the needs of other people for a profit. The State is a dead hand that imposes itself on society. That belief makes me, of course, an anarchist.

People have a misconception about anarchists. That they’re these violent people, running around in black capes with little round bombs. This is nonsense. Of course there are violent anarchists. There are violent dentists. There are violent Christians. Violence, however, has nothing to do with anarchism. Anarchism is simply a belief that a ruler isn’t necessary, that society organizes itself, that individuals own themselves, and the State is actually counterproductive.

It’s always been a battle between the individual and the collective. I’m on the side of the individual.

I simply don’t believe anyone has a right to initiate aggression against anyone else. Is that an unreasonable belief?

Let me put it this way. Since government is institutionalized coercion—a very dangerous thing—it should do nothing but protect people in its bailiwick from physical coercion.

What does that imply? It implies a police force to protect you from coercion within its boundaries, an army to protect you from coercion from outsiders, and a court system to allow you to adjudicate disputes without resorting to coercion.

I could live happily with a government that did just those things. Unfortunately the US Government is only marginally competent in providing services in those three areas. Instead, it tries to do everything else.

The argument can be made that the largest criminal entity today is not some Colombian cocaine gang, it’s the US Government. And they’re far more dangerous. They have a legal monopoly to do anything they want with you. Don’t conflate the government with America—it’s a separate entity, with its own interests, as distinct as General Motors or the Mafia. I’d rather deal with the Mafia than I would with any agency of the US Government.

Even under the worst circumstances, even if the Mafia controlled the United States, I can’t believe Tony Soprano or Al Capone would try to steal 40% of people’s income from them every year. They couldn’t get away with it. But—perhaps because we’re said to be a democracy—the US Government is able to masquerade as “We the People.” That’s an anachronism, at best. The US has mutated into a domestic multicultural empire. The average person has been propagandized into believing that it’s patriotic to do as he’s told. “We have to obey libraries of regulations, and I’m happy to pay my taxes. It’s the price we pay for civilization.” No, that’s just the opposite of the fact. Those things are a sign that civilization is degrading, that the society is becoming less individually responsible, and has to be held together by force.

It’s all about control. Power corrupts, absolute power corrupts absolutely. The type of people that gravitate to government like to control other people. Contrary to what we’re told to think, that’s why you get the worst people—not the best—who want to get into government.

What about voting? Can that change and improve things? Unlikely. I can give you five reasons why you should not vote in an election (see this article). See if you agree.

Hark back to the ’60s when they said, “Suppose they gave a war and nobody came?” But let’s take it further: Suppose they gave a tax and nobody paid? Suppose they gave an election and nobody voted? What that would do is delegitimize government. I applaud the fact that only half of Americans vote. If that number dropped to 25%, 10%, then 0%, perhaps everybody would look around and say, “Wait a minute, none of us believe in this evil charade. I don’t like Tweedledee from the left wing of the Demopublican Party any more than I like Tweedledum from its right wing…”

Remember you don’t get the best and the brightest going into government. There are two kinds of people. You’ve got people that like to control physical reality—things. And people that like to control other people. That second group, those who like to lord it over their fellows, are drawn to government and politics.

Some might ask: “Aren’t you loyal to America?” and “How can you say these terrible things?” My response is, “Of course I’m loyal to America, but America is an idea, it’s not a place. At least not any longer…”

America was once unique among the world’s countries. Unfortunately that’s no longer the case. The idea is still unique, but the country no longer is.

I’ll go further than that. It’s said that you’re supposed to be loyal to your fellow Americans. Well, here’s a revelation. I have less in common with my average fellow American than I do with friends of mine in the Congo, or Argentina, or China. The reason is that I share values with my friends; we look at the world the same way, have the same worldview. But what do I have in common with my fellow Americans who live in the trailer parks, barrios, and ghettos? Or even Hollywood, Washington, and Manhattan? Everyone has to be judged as an individual, but the answer is probably very little besides residing in the same political jurisdiction. Most of them—about 50% of the US—are welfare recipients, and therefore an active threat. So I have more personal loyalty to the guys in the Congo than I do to most of my fellow Americans. The fact we carry US passports is simply an accident of birth.

Those who find that thought offensive likely suffer from a psychological aberration called “nationalism”; in serious cases it may become “jingoism.” The authorities and the general public prefer to call it “patriotism.” It’s understandable, though. Everyone, including the North Koreans, tends to identify with the place they were born. But these things should be fairly low on any list of virtues. Nationalism is the belief that my country is the best country in the world just because I happen to have been born there. It’s most virulent during wars and elections. And it’s very scary. It’s like watching a bunch of chimpanzees hooting and panting at another tribe of chimpanzees across the watering hole. I have no interest in being a part of the charade—although that’s dangerous.

And getting more dangerous as the State grows more powerful. The growth of the State is actually destroying society. Over the last 100 years the State has grown at an exponential rate, and it’s the enemy of the individual. I see no reason why this trend, which has been in motion and accelerating for so long, is going to stop. And certainly no reason why it’s going to reverse.

It’s like a giant snowball that’s been rolling downhill from the top of the mountain. It could have been stopped early in its descent, but now the thing is a behemoth. If you stand in its way you’ll get crushed. It will stop only when it smashes the village at the bottom of the valley.

This makes me quite pessimistic about the future of freedom in the US. As I said, it’s been in a downtrend for many decades. But the events of September 11, 2001, turbocharged the acceleration of the loss of liberty in the US. At some point either foreign or domestic enemies will cause another 9/11, either real or imagined. It’s predictable; that’s what sociopaths, which I discussed earlier, do.

When there is another 9/11—and we will have another one—they’re going to lock down this country like one of their numerous new prisons. I was afraid that the shooting deaths and injuries of several hundred people in Las Vegas on October 1st might be it. But, strangely, the news cycle has driven on, leaving scores of serious unanswered questions in its wake. And about zero public concern.

It’s going to become very unpleasant in the US at some point soon. It seems to me the inevitable is becoming imminent.

*  *  *

As Doug says, the State continues to grow more powerful. But behind it is a little-known group—one made up of unelected insiders. This group calls the real shots in Washington. And it poses the biggest threat to your freedom… and your finances. Luckily, you can protect yourself and your family from its destructive agenda. Learn how here.

Published:4/12/2018 4:06:55 AM
[Entertainment] Casey Affleck Has Been “Treated Abominably” in #MeToo Era, Kenneth Lonergan Says Manchester by the Sea director Kenneth Lonergan thinks Casey Affleck was treated unfairly by the Time's Up and #MeToo movements. Affleck, who won an Oscar for his performance in the...
Published:4/5/2018 4:56:30 PM
[Markets] The American Revolution In Two Acts

Authored by Jeff Thomas via InternationalMan.com,

The American colonies were made up of people who could not accept the downward progression in Europe and said, “I’m leaving.” That took great courage, as they were leaving their few known comforts for unknown difficulties.

However, once they had made the move and overcome the difficulties of settlement, they understood that their courage had been rewarded. Such people never look back and say, “Maybe we shouldn’t have left.”

There can be little doubt that they taught their children and grandchildren the values of courage, determination, hard work, and self-reliance. And more and more immigrants were added to their numbers, each of whom was also courageous enough to abandon Europe for freedom and opportunity. They raised generations of people with a “pioneer spirit.”

Not surprisingly, then, when the American colonists were squeezed by King George for tax increases, it wasn’t difficult for them to refuse. They chose to go it alone, rather than allow the British king to steal the fruits of their labours.

Although the tax level at that time was a mere 2%, it was the principle that taxation is theft that angered them. Further, they had already proven to themselves that they had all the character qualities necessary to determine their own future.

And so, in a sense, the American Revolution was Act II of the quest for freedom and, of the two challenges, it may have been the easier one to face.

However, the America of the late eighteenth century is not the America of today - and the outcome will not be the same for Americans in the present era.

It’s important to remember that only a very small percentage of people actually left Europe to find freedom. The great majority remained behind, complaining about the ever-increasing loss of freedoms, but doing nothing about it. Although their governments took more and more from them, the great majority simply tolerated it, saying, “What can you do?” They became the eventual victims of that oppression, as has happened throughout history.

Those in America today are, in essence, a subjugated people, just as Europeans were prior to the American Revolution. They’re accustomed to the concept of the “nanny state”—one which taxes its people heavily and throws back a portion of what they’ve stolen in the form of “bread and circuses,” as in ancient Rome.

Americans today complain continually, either that too much is being taken from them or that the state isn’t providing them with sufficient largesse. Some even complain of both at the same time.

And yet, a very large percentage of Americans holds out “hope” that somehow, the process will reverse itself—that a new political candidate will appear—a “Freedom Fairy,” who will somehow stand in front of the runaway train, stop it, and reverse it.

Historically, this never happens. What happens is that a small number decide to set sail and escape. Whether it’s the Roman commercial class, who walked away from their shops and travelled north to live amongst the barbarians, rather than accept Rome’s increasing domination, or the German Jews who locked up their shops and homes and boarded ships to the West, just prior to the lockdown of 1939, every burgeoning new “free” society has been created by the few who took courage and made an exit from a dying society.

In every case, those who exited did so with fear in their hearts that they would fail. They left their larger possessions behind and travelled light, sewing coins and jewellery into their clothing, not knowing whether they would succeed.

However, when they arrived at the new frontier, they met other like-minded people, each of whom had also shown courage and determination. They then created a new society that was, predictably, based upon the principles described above.

Today, a similar exodus is occurring. It’s made of those who place their liberty and hope for a promising future above the comforts and freedoms that, one by one, are being taken from them by their governments.

Of course, the details are not the same. They no longer travel by ship, but by jet. No one sews valuables into their clothes, as they’d never get through the metal detectors. Instead, they convert their assets to cash and purchase precious metals, to be stored in a country where there is diminished risk of confiscation by governments.

As has happened throughout history, the exodus is being undertaken quietly. Those who emigrate do not wish to call attention to themselves, but then, neither do the governments of the countries they’re leaving. It’s never seen on the news, and the official numbers who leave are far below the number that actually departed.

But the details of the exit are unimportant. What is important is that, when people meet the challenge to exit to find freedom and self-determination, they then build an extremely strong and free society. And there are many locations in the world where this is presently taking place.

But what of those left behind? Surely, the present-day US is at a breaking point and may very well explode into civil disobedience—even revolution.

Yes, this is quite so. And again, history shows us what happens in countries where the majority feel that they’re entitled to be looked after; that the rich must “pay a little more” to provide them with largesse. Good examples of this are the Russian Revolution and the French Revolution.

Both of these are marked by a predominance of belief that “someone has to pay so that I can benefit.” In both revolutions, the aristocracy were violently removed and the rebels scrambled to grab as much of the spoils as possible. Disorder became prolonged and the new leaders that rose up were, if anything, more oppressive than those they replaced.

Today, in visiting the US and talking with Americans, it’s palpable that most Americans now have a gut feeling that this will most certainly not end well. Most hope that there might be a peaceful transition of some sort. Some vainly hope that a “Freedom Fairy” will emerge.

But, Americans, more than most people in the world, incorrectly believe that freedom only exists in their country and that, when it dies there, it will die everywhere. This is far from true, but it does mean that those who were born in the former “land of the free” are more fearful and discouraged than those elsewhere. The great majority doubt that it’s possible for them, individually, to choose freedom, rather than to go down with the ship. They, in effect, are exactly the same as the great majority in Europe in the eighteenth century.

The American colonies were built upon the courage of a few who chose to leave the dominance and stagnation in Europe. The same is true today. The USA may be a sinking ship, but the concept of “America” is not. It’s a movable concept and it can exist anywhere that people have chosen future freedom over tentative comforts.

*  *  *

A “pioneer spirit” isn’t the only thing you need if you want to leave the sinking ship and pursue freedom. You’ll find details on what else you’ll need in Doug Casey’s special report, Getting Out of Dodge. Click here to download your free PDF copy.

Published:4/2/2018 10:35:29 PM
[3e71fcda-1621-4887-be88-99077f9ef8fd] Marcia Clark talks Casey Anthony case in new docu-series sneak peek Famed prosecutor Marcia Clark, best known for her role in the murder trial of O.J. Simpson, is revisiting old cases in a new docu-series called "Marcia Clark Investigates the First 48." Fox News has an exclusive sneak peek of the series' first episode, which focuses on the trial of Casey Anthony, who was acquitted of murdering her daughter Caylee in 2011. Published:3/29/2018 3:39:36 AM
[bc907e57-bcf6-4a02-964a-317f202eea60] Casey Neistat rips CNN's technology: 'Case study for the innovator's dilemma' Casey Neistat slammed CNN as a “case study for the innovator’s dilemma,” calling it an “old-school media company” after his video-sharing startup, Beme, failed under CNN Worldwide president Jeff Zucker. Published:3/24/2018 10:24:53 AM
[Markets] Bitcoin: Bubble Or Hyperdeflation?

Bitcoin flew too close to the sun. Now the eyes of the world are upon the crypto market, with all the consequences that follow...

“30th anniversary of Black Monday, when markets dropped 23% in a day. In crypto we just call that Monday."
- Alexander Tapscott

But, as Incrementum's Demelza Hays and Mark J. Valek show in their latest magnificent Crypto Research report, it all happened as it always has.

Bitcoin reached an all-time high and then predictably it fell again. All common stages of the classic bubble were accounted for: euphoria, infatuation, denial, fear, desperation.

When Bitcoin fell under $7,000 and the market capitalization of the whole sector halved, the funeral preparations by naysayers were already underway. The fact that cryptocurrencies have already survived five such bubbles, as the brilliant analysis by Michael B. Casey shows, is dutifully ignored by said gravediggers...

In general, we are talking about old-school economists who said it from the start: Bitcoin is a scam.

We strongly disagree: This initial scam phase is part of the Wild West stage of any new unregulated market, and Bitcoin and the blockchain simply have a maturing process ahead of them to weed out the bad seeds. In this respect, the crash of the past months is to a certain extent desirable, because it is cleansing the market of criminal, half-baked ideas. That is how a free-market economy works. But it might be a tall order to expect mainstream economists to recall this after more than a decade of bail-outs and quantitative easing.

All of which leaves the question of what happens next. As Hays and Valek ask (and answer below): is Bitcoin a bursting bubble, or the world's first hyperdeflation...

Since December, Bitcoin’s price dropped 69 % from a high of $19,224 to a low of $5,920 in early February. The last time Bitcoin’s price plummeted this much was after the 2013 rally when it reached $1,000 per coin for the first time. During a 411-day correction, Bitcoin’s price dropped 87 % from $1,163 on November 30, 2013 to $152 on January 14, 2015. On February 2 of this year, Nouriel Roubini claimed Bitcoin is “the largest bubble in the history of mankind”. Uber-Keynesian Paul Krugman could hardly contain his joy over the Bitcoin crash – he even created a new word for it: “cryptofreude” alluding to the German word “Schadenfreude” (i.e. to revel in someone else’s pain).
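Both drawdowns follow directly from the quoted prices; a quick check in Python confirms the figures:

def drawdown(high, low):
    # Peak-to-trough loss as a fraction of the peak price.
    return (high - low) / high

print(f"2017-18: {drawdown(19224, 5920):.0%}")   # -> 69%
print(f"2013-15: {drawdown(1163, 152):.0%}")     # -> 87%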

Has anything changed fundamentally since then? Has the bubble really popped? If Bitcoin is no bubble, what is it?

Under fiat monetary systems, average Joes and Janes can no longer store their money under the mattress for safekeeping. If they do, price inflation will eat away the purchasing power of their savings by 2 % to 7 % per year, based on official and unofficial calculations, respectively. Permanent money debasement discourages saving and encourages consumption spending on cars, clothing, and vacations. People who are determined to save money are forced to take on additional risk to preserve the purchasing power of their savings over time. Instead of saving their money in bank accounts, savers are forced to look for other long-term stores of value, such as stocks, bonds, and real estate.
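The compounding is what makes this bite. A short sketch using the 2 % and 7 % figures cited above (the 20-year horizon is an illustrative assumption):

for rate in (0.02, 0.07):
    # Real value of $100 left under the mattress for 20 years.
    real_value = 100 / (1 + rate) ** 20
    print(f"At {rate:.0%} inflation, $100 buys ~${real_value:.0f} "
          f"of today's goods after 20 years")
# -> ~$67 at 2 %, ~$26 at 7 %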

Figure 1. Fiat Currencies in Terms of Gold (Logarithmic Scale)

The main problem is that printing money is only a short-term strategy. If the purchasing power of a currency depreciates too quickly, demand for that currency decreases. In hyperinflations, demand for holding currency tends toward zero.

The Bitcoin revolution is about having a way to store and transmit value that does not depend on inflationary central bank monetary policy, capital controls, or property rights. The reason people pump their paychecks into real estate, bonds, and stocks is not because these assets make a better medium of exchange. These assets make a better store of value than fiat currency, and Bitcoin has the technological features to become an even better store of value than these assets. Similar to gold, Bitcoin allows people to save without having their savings diluted slowly by an ever-increasing money supply (or quickly, in the case of hyperinflation).
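That “no dilution” property is checkable, because Bitcoin’s issuance schedule is fixed by the protocol: the block subsidy started at 50 BTC and halves every 210,000 blocks. A minimal sketch of that schedule follows (the real consensus code rounds each subsidy down to whole satoshis, so the exact cap is fractionally below 21 million):

subsidy, total = 50.0, 0.0
while subsidy >= 1e-8:            # 1 satoshi = 1e-8 BTC, the smallest unit
    total += 210_000 * subsidy    # blocks per halving period times reward
    subsidy /= 2                  # the halving
print(f"Asymptotic supply: {total:,.0f} BTC")   # -> 21,000,000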

However, Bitcoin does not only enable secure saving; this technology also allows people to send their savings directly to other people without converting into fiat or any other asset. The Bitcoin network does not depend on intermediaries. Stocks, bonds, real estate, and fine art all depend on government stability and efficient courts that uphold legal contracts. Even owning gold was outlawed in the U.S. from 1933–1974. Today, it is only legally possible to move $10K worth of gold out of the U.S. at once.

Given Bitcoin’s qualitative features, there are three possible outcomes for Bitcoin...

1.) Bitcoin Becomes A Store of Value

It is true that low-interest-rate policy and demand for a secure way to save are fueling part of Bitcoin’s rise in price. If central banks stop debasing fiat currencies and people can return to the good old days of saving cash in their bank accounts, a large portion of Bitcoin’s appeal will vanish.

In contrast, Bitcoin’s price will go much higher if fiat currencies continue to be a poor vehicle for saving.

The number one reason Bitcoin may not be a bubble stems from Bitcoin’s technological qualities, which make it a superior way of saving value. The upward price trend and speculation around Bitcoin stem from Bitcoin’s potential to be a global and permissionless system of managing wealth that cannot be confiscated. As with gold, one gets a different perspective by denominating the USD in Bitcoin. We would not expect the steep decline of the USD to continue; however, the deflationary nature of Bitcoin implies that it gains value relative to an ever-inflating USD (or any other fiat money).

Figure 2. USD in Terms of Bitcoin (Logarithmic Scale)

If Bitcoin is adopted as a store of value in the long term, people will be able to spend Bitcoin directly, without converting into fiat currencies such as the Swiss franc or euro. The current period of volatility may be referred to as Bitcoin’s “hyperdeflation” phase, and this is the first time in history we are experiencing this type of economic phenomenon.

Investors are speculating whether or not Bitcoin will become digital gold, and rightly so. Due to the inflationary design of fiat money, Bitcoin exhibits interesting properties relative to fiat money as a store of value. If Bitcoin achieves this goal, the future appreciation relative to inflationary fiat money will happen more gradually, and volatility could fall drastically. However, the current phase of appreciation would be economically similar to a “one-off seigniorage” within a fiat money system.

2.) Bitcoin Becomes A Store of Value and a Medium of Exchange

Given the technological problems, the majority of Bitcoin users do not use it as a medium of exchange. Our current payment system is easier to use. We already have credit cards, banks, and PayPal to facilitate our payments to merchants around the world. On the other hand, technologies such as the Lightning Network and SegWit may eventually make Bitcoin a good medium of exchange as well. If Bitcoin can scale to become a global medium of exchange, its purchasing power will increase because it will be able to serve three distinct functions: storing value, transmitting value, and ultimately being a unit of account.

3.) Bitcoin Becomes Neither and Collapses

Bitcoin’s success so far has been nothing short of a miracle.

Since Bitcoin’s inception in 2009, Bitcoin has been declared “dead” hundreds of times in the media. The Bitcoin network faces several threats, including a cryptographic break of the SHA-256 hash algorithm, being outlawed by governments, a 51 % attack, and solar flares bringing down the Internet. If Bitcoin fails to become a global store of value or medium of exchange, the value and subsequently the price of Bitcoin will fall.

*  *  *

Demelza Hays is Research Analyst at Incrementum AG in Liechtenstein which publishes a quarterly crypto research report.

Mark Valek is Fund Manager of one of the first regulated Alternative Investment Funds with direct exposure to crypto currencies in Liechtenstein for professional investors.

Published:3/16/2018 6:37:35 PM
[Startups] Crypto author Paul Vigna talks about the future of token sales  Paul Vigna and his writing partner Michael Casey are crypto gurus. A crypto critic and Wall Street Journal reporter, Vigna sees through the hype and looks for the value inherent in the crypto system. Vigna and I spoke during this, the 100th episode of Technotopia. Vigna has a lot to say about the market and feels that his new book, The Truth Machine, picks up where his and… Read More
Published:3/12/2018 4:23:51 PM
[World] Delta Airlines NRA Tax Cut Stopped: Georgia Casey Cagle Reacts in Cavuto Interview

Georgia Lieutenant Gov. Casey Cagle (R) said Saturday that "conservatives are tired of being kicked around" when asked to defend his push to halt consideration of a sizeable tax cut for Delta Airlines.

Published:3/3/2018 1:29:42 PM
[World] NRA Boycott: Georgia Threatens Delta Airlines Tax Breaks

Georgia Lt. Gov. Casey Cagle (R) is threatening Atlanta-based Delta Airlines over its decision to sever ties with the National Rifle Association (NRA).

Published:2/28/2018 8:33:03 AM
[Politics] NO, it’s NOT okay for a Republican Lt. Gov to threaten a corporation SO a lot of people are praising the lieutenant governor of Georgia for saying he would rescind tax cuts for Delta Airlines because of their recent decision to cave to liberal pressure . . . Published:2/27/2018 5:04:17 PM
[Startups] Meet crypto authors Michael Casey and Paul Vigna in New York next week  A reminder that I’m going to have Paul Vigna and Michael Casey, authors of The Truth Machine, onstage with me next week at Knotel, a co-working and event space in Manhattan. I’d love for you to come. You can RSVP here and space is limited. It’s happening on February 28 at 7pm and will feature a 35-minute talk with two of the top writers in crypto. These guys literally wrote… Read More
Published:2/21/2018 7:43:04 AM
[Markets] "Their Objective Is To Create Fear..."

Authored by Jeff Thomas via InternationalMan.com,

The Social Justice trend has appeared in recent years, and has rapidly gained momentum.

It appeared first on college campuses, where students accused a professor or, indeed, another student, of making a statement or using a word that was deemed socially unacceptable. The premise by the accuser was that a campus must be a safe space, where people should not be exposed to comments that may possibly make anyone feel demeaned or uncomfortable.

The accusers have earned the name “snowflakes,” as they tend to melt down at the slightest provocation. However, the Social Justice trend has given snowflakes considerable power, a power that’s often used recklessly.

Importantly, whether the offensive comment is correct or incorrect is not an issue. The “offense” is that the speaker has stated something that should not ever be mentioned, as it might upset the listener in some way. The “justice” that takes place is that one or more people file a formal complaint with a person or body that holds power over the speaker and demand that he be punished for his “wrongdoing.”

This has led to teachers and professors being warned, suspended, or fired from their positions, based merely on the existence of a complaint. In addition, “offending” students have been warned, suspended, or expelled, again, without what might be regarded as due process.

A related form of Social Justice is the vigilantism seeking to destroy those who are prominent. Former Miss Americas demanded that the entire board of the Miss America Pageant be dismissed for making disparaging remarks about pageant contestants. Several have been forced to resign in disgrace.

And, of course, we’re seeing the rise of complaints against actors, politicians, and other prominent individuals regarding alleged sexual denigration of women, even if it’s merely verbal. In each case, witnesses are “bravely coming forward,” en masse, although they often were silent for decades (if, indeed, the individual incidents ever occurred at all).

Whether a given individual has actually committed a crime or not seems immaterial in the new Social Justice trend. The focus is on vehement condemnation of an individual, usually by a host of others. Importantly, regardless of what process is used to prosecute (or persecute) those accused, a general assumption of the Social Justice trend is that, once someone is accused, he’s guilty and punishment must take place.

But, in fact, this trend is not new. Rabid groups of accusers appear throughout history, generally during times of existing social tension.

The Salem Witch Trials: 1692-1693

In 1692, several young girls claimed to be possessed by witches and group hysteria ensued. Some 150 men, women and children were ultimately accused and nineteen were hanged. Governor William Phips ordered that an end be put to the show trials in 1693. In the process, his wife was accused of being a witch.

The Nazi Sondergerichte: 1933-1945

In Nazi Germany, kangaroo courts were held for those deemed to have committed “political crimes,” resulting in 12,000 deaths. Germans were encouraged to report on each other. (If your neighbor annoyed you, a convenient revenge was to report him as disloyal.) The persecution only ended when Nazi Germany was defeated.

The Great Soviet Purge: 1936-1938

Joseph Stalin ran many successful purges against clergymen, wealthy peasants, and oppositionists, but the foremost of them was the Great Purge, which included anyone with a perceived stain on his record. Denunciation was encouraged. The purge was highly successful and, although the show trials ended in 1938, the threat of accusation remained until the fall of the Soviet Union in 1991.

The Red Scare – McCarthyism: 1947-1956

US Senator McCarthy accused countless people in Hollywood of being communists. Thousands lost their jobs. McCarthyism ended when he accused the Protestant Church of being a communist support group. He also attacked the US Army as having communists within it. The Army lashed back, exposing McCarthy as cruel, manipulative, and reckless, and the public fervor against communists subsided.

The Spanish Inquisition: 1478-1834

The Spanish Inquisition lasted for over 350 years. It was originally conceived by King Ferdinand II as a way to expose and punish heretics and suppress religious dissent.

It was preceded by the French Inquisition and spread to other countries in Europe. At its height, it investigated, prosecuted, and sometimes burned alive some 150,000 people. The last execution was in 1826 – for teaching deist principles (deism, not Christianity, was the predominant religious belief of America’s founding fathers).

Crimes committed included blasphemy, witchcraft, immorality, and behavior unbecoming to a woman. (A woman’s role was seen as being limited to raising a family.) False denunciations were frequent and defendants were only rarely acquitted. The auto-da-fé, or public punishment, including groups of people being burned alive, provided an effective demonstration and satisfied the public’s desire for spectacle.

The inquisition finally ended when King Ferdinand VII and others came to regard the church’s power as being a threat to the government’s power and abolished it.

Others that used the Social Justice approach to great effect were China, Hungary, Romania, Czechoslovakia, Egypt (as recently as 2014), and Turkey (as recently as 2016).

And there are many more examples, far too numerous to mention.

In 1970, Monty Python did a series of sketches in which Michael Palin played a cleric, saying, “Nobody expects the Spanish Inquisition.”

And, of course, this is true. The Spanish Inquisition, the Salem witch trials, the McCarthy hearings, and the present Social Justice trend, are so over-the-top that their very existence is clearly absurd.

However, historically, whether it be a political leader like Stalin or Hitler, or a religious organisation, like the Catholic Church, or the present-day, self-styled “Social Justice Warriors,” such campaigns begin through the desire for power over others. What they have in common is that anyone can be targeted, group accusations carry greater weight than individual accusations, and the punishment invariably exceeds the level of the offense, if, indeed, there is any unlawful offense at all.

The objective is to create fear. The initiative begins with finger-pointing and mild punishment, such as the loss of a job. But it evolves into a circus that often grows to include more serious punishment, sometimes including execution.

Vigilantism grows out of troubled periods when frustrations and resentment run high. Because it’s emotionally driven, not logic-driven, it almost invariably morphs into irrational victimisation… and is always destructive in nature.

*  *  *

Fortunately, there are practical ways to escape the fallout of dangerous groupthink. Doug Casey has turned it into an art form. Find out more in Doug’s special report, Getting Out of Dodge.

Published:2/19/2018 8:44:11 PM
[The Blog] Dem Senator to Mueller: Don’t release findings around the midterms

Fantasies.

The post Dem Senator to Mueller: Don’t release findings around the midterms appeared first on Hot Air.

Published:2/19/2018 4:43:18 PM
[Politics] ‘Law & Order: SVU’ Star Running for Congress in New York

Diane Neal, who played assistant district attorney Casey Novak on NBC's hit show "Law & Order: Special Victims Unit," announced Tuesday that she is running for Congress in New York's 19th congressional district.

The post ‘Law & Order: SVU’ Star Running for Congress in New York appeared first on Washington Free Beacon.

Published:2/8/2018 11:21:05 AM
[Markets] Death Of Democracy? - Part I

Authored by Denis MacEoin via The Gatestone Institute,

For many complex reasons, Europe is in an advanced state of decline. In recent years, several important studies of this condition have appeared, advancing a variety of reasons for it: Douglas Murray's The Strange Death of Europe: Immigration, Identity, Islam, James Kirchick's The End of Europe: Dictators, Demagogues, and the Coming Dark Age, as well as Christopher Caldwell's ground-breaking 2010 study, Reflections on the Revolution in Europe: Immigration, Islam and the West. Soeren Kern at Gatestone Institute has also been detailing the steady impact of immigration from Muslim regions on countries such as Germany, Sweden, and the United Kingdom.

It is clear that something serious is happening on the continent in which I live.

The threat is not restricted to Europe, but has a global dimension. Michael J. Abramowitz, President of Freedom House, writes in his introduction to the organization's 2018 report:

A quarter-century ago, at the end of the Cold War, it appeared that totalitarianism had at last been vanquished and liberal democracy had won the great ideological battle of the 20th century.

Today, it is democracy that finds itself battered and weakened. For the 12th consecutive year, according to Freedom in the World, countries that suffered democratic setbacks outnumbered those that registered gains. States that a decade ago seemed like promising success stories—Turkey and Hungary, for example—are sliding into authoritarian rule.

For Douglas Murray, immigration and the problems it is throwing up are the key topic. He is uncompromising in his negative response to the social change that has been brought about by the excessive and barely controlled immigration of people who, for the most part, do not share the most basic values of the countries in which they now live.

Certainly, Europe's current state of decline owes much to the widely recognized fact that Muslims are the first newcomers to Europe who, over several generations, are resistant to integrating into the societies of which they now form a part. This rejection of Europe's humanitarian, Judeo-Christian values applies, not just to the successive waves of refugees and economic migrants who have washed up on the shores of Greece, Italy and Spain since the start of the Syrian civil war, but to generations of Pakistanis and Bangladeshis in the UK, North Africans in France, and Turkish "guest workers" in Germany.

A former Muslim extremist, Ed Husain, writes in his book, The Islamist: Why I Joined Radical Islam in Britain, What I Saw Inside and Why I Left:

The result of 25 years of multiculturalism has not been multicultural communities. It has been mono-cultural communities.... Islamic communities are segregated. Many Muslims want to live apart from mainstream British society; official government policy has helped them do so. I grew up without any white friends. My school was almost entirely Muslim. I had almost no direct experience of 'British life' or 'British institutions'. So it was easy for the extremists to say to me: 'You see? You're not part of British society. You never will be. You can only be part of an Islamic society.' The first part of what they said was true. I wasn't part of British society: nothing in my life overlapped with it.


In July 2015, arguing for an anti-extremism bill in parliament, Britain's prime minister at the time, David Cameron, admitted:

"For all our successes as a multi-racial, multi-faith democracy, we have to confront a tragic truth that there are people born and raised in this country who don't really identify with Britain – and who feel little or no attachment to other people here. Indeed, there is a danger in some of our communities that you can go your whole life and have little to do with people from other faiths and backgrounds."

Countless polls and investigations reveal that refusal to integrate is no figment of the supposedly "Islamophobic" political "right". A 2006 poll carried out by ICM Research on behalf of the Sunday Telegraph, for example, presented worrying findings: 40% of British Muslims polled said they backed introducing shari'a law in parts of Britain, and only 41% opposed it, leaving another 20% unclear. Sadiq Khan, the Labour MP involved with the official task force set up after the July 2005 attacks, said the findings were "alarming". Since then, similar findings have shown that the younger generation of Muslims is more conservative, even radical, than their parents or grandparents:

Commenting on a major 2016 ICM poll of Muslim opinion, Trevor Phillips, who had been Britain's foremost advocate of multiculturalism, said that, with respect to the Muslim community, he had made a 180° turn:

"for a long time, I too thought that Europe's Muslims would become like previous waves of migrants, gradually abandoning their ancestral ways, wearing their religious and cultural baggage lightly, and gradually blending into Britain's diverse identity landscape. I should have known better."

Another major 2016 review of social equality, carried out on behalf of the British government by Dame Louise Casey, found Muslims to be the least well-integrated community. In summarizing her work for the National Secular Society, Benjamin Jones wrote:

"Despite decades of failures, it is worth noting that problems integrating Muslim minorities are hardly rare around the world, and this is not a problem unique to the United Kingdom. That brings us to the final unsayable thing – well known to most British people but unmentionable to officials and politicians: Islam is a special case."

Polls carried out in other countries across Europe showed similar or worse results.

Those are only one half of a more complicated and disturbing picture. While Muslims find it hard to abandon the prejudices, doctrines, and outright hatreds (for Jews, for example) that they have imported from their home countries -- or developed as young men and women while living in European states where they were born and raised -- vast numbers of non-Muslims, including politicians, church leaders, civil servants, policemen and women, and many well-meaning people bend over backwards to accommodate them and the demands they make on their host societies.

It would take a book to summarize all the episodes in which Western officialdom, notably in Europe, has abandoned its own historical values in order to protect Islam and radical Muslims from criticism and rebuke. We are not speaking of the proper interventions of the police, courts, and social agencies to safeguard ordinary Muslims from physical attacks, vituperative insults, assaults on mosques, or basic denials of the rights they are entitled to enjoy as citizens of Western countries – much as we expect them to protect Jews, ethnic minorities, or vulnerable women from similar expressions of physical and verbal bigotry. Providing such support for the victims of prejudice should be applauded as an essential expression of post-Enlightenment liberal democratic values. Legislating and acting against outright discrimination is, perhaps, best exemplified in the way post-World War II German governments have criminalized anti-Semitism and Holocaust denial.

Ironically, what anti-Semitism there is today in Germany comes increasingly from Muslims.

According to Manfred Gerstenfeld:

  • Jens Spahn, a board member of Chancellor Merkel's Christian Democrat Union (CDU), and a possible successor to Merkel, remarked that the immigration from Muslim countries is the reason for the recent demonstrations [about immigrants] in Germany.
  • Stephan Harbarth, deputy chairman of the CDU/CSU faction in the Bundestag — the German parliament — said, "We have to strongly confront the antisemitism of migrants with an Arab background and those from African countries."
  • The CDU interior minister of the federal state of Hessen, Peter Beuth, remarked, "We have to avoid an immigration of antisemitism." He said this after a study on behalf of the state's security service concluded that antisemitism among Muslims "both quantitatively and qualitatively has at least as high relevance as the traditional antisemitism of the extreme right."

Despite this moral response, European countries, including Germany, have shown genuine weakness when face-to-face with radical Islamic ideology, hate preachers, and basic Muslim values regarding women, non-Muslims, LGBT people, and obedience to Western laws.

Before looking at some of the reasons, motivations, and outcomes of this deeply pervasive weakness, here are a handful of examples of pusillanimity from the UK alone.

Last October, it was reported that Queen's Counsel Max Hill, who acts as the British government's independent reviewer of terrorism legislation, argued that British fighters for Islamic State, who had returned or planned to return to the UK, should not be prosecuted but reintegrated into society on the grounds that they had acted "naively". This lenience extended to hate preachers who had given sermons and lectures exhorting Muslims to take direct action that has in the past led to actual terrorist attacks.

Before that, Prime Minister David Cameron and then Home Secretary Theresa May had "proposed measures including banning orders, extremism disruption orders and closure orders, which would allow premises used by extremists to be shut, and make it easier to restrict the activities of individuals and organisations."

In 2015, May had proposed a counter-extremism strategy which said laws would be introduced to "ban extremist organisations that promote hatred and draw people into extremism" and "restrict the harmful activities of the most dangerous extremist individuals". Mrs May also vowed to use the law to "restrict access to premises which are repeatedly used to support extremism". Yet Max Hill QC, the man in charge of reviewing British terrorism legislation, wants none of that. And May’s counter-terrorism measures, proposed again since she became Prime Minister, remain unlegislated.

The same month (October 2017) that Hill undertook the rehabilitation of jihadists and hate preachers, it was reported that the British Home Office (formerly run by Theresa May, now by Amber Rudd MP) was "looking at a new strategy to reintegrate extremists that could even see them propelled to the top of council house waiting lists if needed".

Extremists who had nowhere suitable to live could be put in social housing by the local council and could have their rent paid if necessary, according to reports.

They could also be given priority on waiting lists and helped into education and training or found a job with public bodies or charities.

This proposal would include returnees from the Islamic State in Syria, and overall would include some 20,000 individuals known to the security services. Around 850 British subjects have gone to Syria to fight or support fighters, and 350 of them have come back home, with only a tiny handful so far prosecuted.

This approach of providing social services is based on the oft-refuted belief that Muslim extremists (both Muslims-by-birth and converts) have suffered from deprivation. It also greatly rests on the naïve assumption that rewarding them with benefits -- for which genuinely deprived citizens generally need to wait in line -- will turn them into grateful patriots, prepared to stand for the national anthem and hold hands with Christians and Jews.

We now therefore use double standards: one for Muslims and one for the rest of our population. On January 16, 2018, in England, Daniel Grundy was jailed for six months on a charge of bigamy. However, Muslim men in polygamous marriages are rewarded by the state:

Husbands living in a "harem" with multiple wives have been cleared to claim state benefits for all their different partners.

A Muslim man with four spouses - which is permitted under Islamic law - could receive £10,000 a year in income support alone.

He could also be entitled to more generous housing and council tax benefit, to reflect the fact his household needs a bigger property.

Ministers have decided that, even though bigamy is a crime in Britain, polygamous marriages can be recognised formally by the state - provided they took place overseas, in countries where they are legal.

Actually, British Muslim men do not even have to go abroad to find wives. At least one Muslim dating site run from the UK offers contact with Muslim women who are eager to enter into polygamous marriages. It has not been closed down. The British government has shown itself incapable of enforcing its own laws when it comes to its Muslim citizens or new immigrants.

In a similar vein are official attitudes to a common Muslim practice of female genital mutilation, which has been illegal in the UK since 1985. Politico reported last year:

"Medical staff working in England's National Health Service recorded close to 5,500 cases of female genital mutilation (FGM) in 2016, but no one has been successfully prosecuted since the practice was banned over 30 years ago."

Meanwhile, the practice is rising. The police and the Crown Prosecution Service are too frightened of seeming racist or "Islamophobic" to apply the law.

Max Hill's notion that departing fighters have been naïve is itself a staggering misconception on the part of a man educated at Newcastle's prestigious Royal Grammar School and Oxford University. No one heading for Syria will have been blithely unaware of the multitude of videos broadcast by the mainstream media and all the social media, showing the beheading of hostages, the executions of homosexuals, the lashing of women, the heads spiked on fences, the use of children to shoot victims or cut their throats, and all the other excesses committed by the terrorist group.

Rather than stand up to our enemies, both external and internal, are we now so afraid of being called "Islamophobes" that we will sacrifice even our own cultural, political, and religious strengths and aspirations? The next part of this article will examine just how major this betrayal has been and how much greater it will become.

Published:2/8/2018 3:11:46 AM
[Entertainment] Ben Affleck's Father Says Hollywood Has Been “Major Factor” in Son's Drinking Timothy Affleck, the father of Ben Affleck and Casey Affleck, is sounding off in a rare interview, revealing that he believes that fame has taken a major "toll" on both of his famous...
Published:2/6/2018 5:53:25 PM
[Entertainment] Law & Order: SVU Star Diane Neal Is Running for Congress Diane Neal has launched her political campaign. The actress, known for starring as ADA Casey Novak on Law & Order: SVU, is running for Congress in New York. A registered Democrat,...
Published:2/6/2018 3:25:19 PM
[Science] Georgia state Senate denounces NFL year before Super Bowl comes to Atlanta Republican Lt. Gov. Casey Cagle, who is on the ballot for governor, said he wasn't concerned about the message it was sending to the NFL pri... Published:2/1/2018 3:49:49 PM
[Markets] Army Major: Wrong On 'Nam, Wrong On Terror

Authored by Major Danny Sjursen via TomDispatch.com,

Vietnam: it’s always there. Looming in the past, informing American futures.

A 50-year-old war, once labeled the longest in our history, is still alive and well and still being refought by one group of Americans: the military high command.  And almost half a century later, they’re still losing it and blaming others for doing so. 

Of course, the U.S. military and Washington policymakers lost the war in Vietnam in the previous century and perhaps it’s well that they did.  The United States really had no business intervening in that anti-colonial civil war in the first place, supporting a South Vietnamese government of questionable legitimacy, and stifling promised nationwide elections on both sides of that country’s artificial border.  In doing so, Washington presented an easy villain for a North Vietnamese-backed National Liberation Front (NLF) insurgency, a group known to Americans in those years as the Vietcong. 

More than two decades of involvement and, at the war’s peak, half a million American troops never altered the basic weakness of the U.S.-backed regime in Saigon.  Despite millions of Asian deaths and 58,000 American ones, South Vietnam’s military could not, in the end, hold the line without American support and finally collapsed under the weight of a conventional North Vietnamese invasion in April 1975.

There’s just one thing.  Though a majority of historians (known in academia as the “orthodox” school) subscribe to the basic contours of the above narrative, the vast majority of senior American military officers do not.  Instead, they’re still refighting the Vietnam War to a far cheerier outcome through the books they read, the scholarship they publish, and (most disturbingly) the policies they continue to pursue in the Greater Middle East.

The Big Re-Write

In 1986, future general, Iraq-Afghan War commander, and CIA director David Petraeus penned an article for the military journal Parameters that summarized his Princeton doctoral dissertation on the Vietnam War.  It was a piece commensurate with then-Major Petraeus’s impressive intellect, except for its disastrous conclusions on the lessons of that war.  Though he did observe that Vietnam had “cost the military dearly” and that “the frustrations of Vietnam are deeply etched in the minds of those who lead the services,” his real fear was that the war had left the military unprepared to wage what were then called “low-intensity conflicts” and are now known as counterinsurgencies.  His takeaway: what the country needed wasn’t fewer Vietnams but better-fought ones.  The next time, he concluded fatefully, the military should do a far better job of implementing counterinsurgency forces, equipment, tactics, and doctrine to win such wars.

Two decades later, when the next Vietnam-like quagmire did indeed present itself in Iraq, he and a whole generation of COINdinistas (like-minded officers devoted to his favored counterinsurgency approach to modern warfare) embraced those very conclusions to win the war on terror.  The names of some of them -- H.R. McMaster and James Mattis, for instance -- should ring a bell or two these days. In Iraq and later in Afghanistan, Petraeus and his acolytes would get their chance to translate theory into practice.  Americans -- and much of the rest of the planet -- still live with the results.

Like Petraeus, an entire generation of senior military leaders, commissioned in the years after the Vietnam War and now atop the defense behemoth, remain fixated on that ancient conflict.  After all these decades, such “thinking” generals and “soldier-scholars” continue to draw all the wrong lessons from what, thanks in part to them, has now become America’s second longest war. 

Rival Schools

Historian Gary Hess identifies two main schools of revisionist thinking. 

There are the “Clausewitzians” (named after the nineteenth century Prussian military theorist) who insist that Washington never sufficiently attacked the enemy's true center of gravity in North Vietnam.  Beneath the academic language, they essentially agree on one key thing: the U.S. military should have bombed the North into a parking lot.

The second school, including Petraeus, Hess labeled the “hearts-and-minders.”  As COINdinistas, they felt the war effort never focused clearly enough on isolating the Vietcong, protecting local villages in the South, building schools, and handing out candy -- everything, in short, that might have won (in the phrase of that era) Vietnamese hearts and minds.

Both schools, however, agreed on something basic: that the U.S. military should have won in Vietnam. 

The danger presented by either school is clear enough in the twenty-first century.  Senior commanders fixated on Vietnam, some now serving in key national security positions, have translated that conflict’s supposed lessons into what now passes for military strategy in Washington.  The result has been an ever-expanding war-on-terror campaign waged ceaselessly from South Asia to West Africa, a campaign that has turned out, in essence, to be perpetual war based on the can-do belief that counterinsurgency and advise-and-assist missions should have worked in Vietnam and can work now.

The Go-Big Option

The leading voice of the Clausewitzian school was U.S. Army Colonel and Korean War/Vietnam War vet Harry Summers, whose 1982 book, On Strategy: A Critical Analysis of the Vietnam War, became an instant classic within the military.  It’s easy enough to understand why.  Summers argued that civilian policymakers -- not the military rank-and-file -- had lost the war by focusing hopelessly on the insurgency in South Vietnam rather than on the North Vietnamese capital, Hanoi.  More troops, more aggressiveness, even full-scale invasions of communist safe havens in Laos, Cambodia, and North Vietnam, would have led to victory.

Summers had a deep emotional investment in his topic.  Later, he would argue that the source of post-war pessimistic analyses of the conflict lay in “draft dodgers and war evaders still [struggling] with their consciences.”  In his own work, Summers marginalized all Vietnamese actors (as would so many later military historians), failed to adequately deal with the potential consequences, nuclear or otherwise, of the sorts of escalation he advocated, and didn’t even bother to ask whether Vietnam was a core national security interest of the United States. 

Perhaps he would have done well to reconsider a famous post-war encounter he had with a North Vietnamese officer, a Colonel Tu, whom he assured, “You know you never beat us on the battlefield.”

“That may be so,” replied his former enemy, “but it is also irrelevant.”

Whatever its limitations, Summers’s work remains influential in military circles to this day. (I was assigned the book as a West Point cadet!)

A more sophisticated Clausewitzian analysis came from current National Security Adviser H.R. McMaster in a highly acclaimed 1997 book, Dereliction of Duty.  He argued that the Joint Chiefs of Staff were derelict in failing to give President Lyndon Johnson an honest appraisal of what it would take to win, which meant that “the nation went to war without the benefit of effective military advice.”  He concluded that the war was lost not in the field or by the media or even on antiwar college campuses, but in Washington, D.C., through a failure of nerve by the Pentagon’s generals, which led civilian officials to opt for a deficient strategy. 

McMaster is a genuine scholar and a gifted writer, but he still suggested that the Joint Chiefs should have advocated for a more aggressive offensive strategy -- a full ground invasion of the North or unrelenting carpet-bombing of that country.  In this sense, he was just another “go-big” Clausewitzian who, as historian Ronald Spector pointed out recently, ignored Vietnamese views and failed to acknowledge -- an observation of historian Edward Miller -- that “the Vietnam War was a Vietnamese war.”

COIN: A Small (Forever) War

Another Vietnam veteran, retired Lieutenant Colonel Andrew Krepinevich, fired the opening salvo for the hearts-and-minders.  In The Army and Vietnam, published in 1986, he argued that the NLF, not the North Vietnamese Army, was the enemy’s chief center of gravity and that the American military’s failure to emphasize counterinsurgency principles over conventional concepts of war sealed its fate.  While such arguments were, in reality, no more impressive than those of the Clausewitzians, they have remained popular with military audiences, as historian Dale Andrade points out, because they offer a “simple explanation for the defeat in Vietnam.” 

Krepinevich would write an influential 2005 Foreign Affairs piece, “How to Win in Iraq,” in which he applied his Vietnam conclusions to a new strategy of prolonged counterinsurgency in the Middle East, quickly winning over the New York Times’s resident conservative columnist, David Brooks, and generating “discussion in the Pentagon, CIA, American Embassy in Baghdad, and the office of the vice president.” 

In 1999, retired army officer and Vietnam veteran Lewis Sorley penned the definitive hearts-and-minds tract, A Better War: The Unexamined Victories and Final Tragedy of America’s Last Years in Vietnam.  Sorley boldly asserted that, by the spring of 1970, “the fighting wasn’t over, but the war was won.”  According to his comforting tale, the real explanation for failure lay with the “big-war” strategy of U.S. commander General William Westmoreland. The counterinsurgency strategy of his successor, General Creighton Abrams -- Sorley’s knight in shining armor -- was (or at least should have been) a war winner. 

Critics noted that Sorley overemphasized the marginal differences between the two generals’ strategies and produced a remarkably counterfactual work.  It didn’t matter, however.  By 2005, just as the situation in Iraq, a country then locked in a sectarian civil war amid an American occupation, went from bad to worse, Sorley’s book found its way into the hands of the head of U.S. Central Command, General John Abizaid, and State Department counselor Philip Zelikow.  By then, according to the Washington Post’s David Ignatius, it could also “be found on the bookshelves of senior military officers in Baghdad.”

Another influential hearts-and-minds devotee was Lieutenant Colonel John Nagl.  (He even made it onto The Daily Show with Jon Stewart.) His Learning to Eat Soup with a Knife: Counterinsurgency Lessons from Malaya and Vietnam followed Krepinevich in claiming that “if [Creighton] Abrams had gotten the call to lead the American effort at the start of the war, America might very well have won it.”  In 2006, the Wall Street Journal reported that Army Chief of Staff General Peter Schoomaker “so liked [Nagl’s] book that he made it required reading for all four-star generals,” while the Iraq War commander of that moment, General George Casey, gave Defense Secretary Donald Rumsfeld a copy during a visit to Baghdad.

David Petraeus and current Secretary of Defense James Mattis, co-authors in 2006 of FM 3-24, the first (New York Times-reviewed) military field manual for counterinsurgency since Vietnam, must also be considered among the pantheon of hearts-and-minders.  Nagl wrote a foreword for their manual, while Krepinevich provided a glowing back-cover endorsement.

Such revisionist interpretations would prove tragic in Iraq and Afghanistan, once they had filtered down to the entire officer corps. 

Reading All the Wrong Books 

In 2009, when former West Point history professor Colonel Gregory Daddis was deployed to Iraq as the command historian for the Multinational Corps -- the military’s primary tactical headquarters -- he noted that corps commander Lieutenant General Charles Jacoby had assigned a professional reading list to his principal subordinates.  To his disappointment, Daddis also discovered that the only Vietnam War book included was Sorley’s A Better War.  This should have surprised no one, since Sorley’s argument -- that American soldiers in Vietnam were denied an impending victory by civilian policymakers, a liberal media, and antiwar protestors -- still resonated among the officer corps in year six of the Iraq quagmire.  It wasn’t the military’s fault!

Officers have long distributed professional reading lists to subordinates, intellectual guideposts to the complex challenges ahead.  Indeed, there’s much to be admired in the concept, but such lists also carry potential dangers, since they inevitably influence the thinking of an entire generation of future leaders.  In the case of Vietnam, the perils are obvious.  The generals have been assigning and reading problematic books for years, works essentially meant to reinforce professional pride in the midst of a series of unsuccessful and unending wars.

Just after 9/11, for instance, Chairman of the Joint Chiefs Richard Myers -- who spoke at my West Point graduation -- included Summers’s On Strategy on his list.  A few years later, then-Army Chief of Staff General Peter Schoomaker added McMaster’s Dereliction of Duty.  The trend continues today.  Marine Corps Commandant Robert Neller has kept McMaster and added Diplomacy by Henry Kissinger (he of the illegal bombing of both Laos and Cambodia and war criminal fame).  Current Army Chief of Staff General Mark Milley kept Kissinger and added good old Lewis Sorley.  To top it all off, Secretary of Defense Mattis has included yet another Kissinger book and, in a different list, Krepinevich’s The Army and Vietnam.

Just as important as which books made the lists is what’s missing from them: not one of these senior commanders includes newer scholarship, novels, or journalistic accounts that might raise thorny, uncomfortable questions about whether the Vietnam War was winnable, necessary, or advisable, or incorporate local voices that might highlight the limits of American influence and power.

Serving in the Shadow of Vietnam 

Most of the generals leading the war on terror just missed service in the Vietnam War.  They graduated from various colleges or West Point in the years immediately following the withdrawal of most U.S. ground troops or thereafter: Petraeus in 1974, future Afghan War commander Stanley McChrystal in 1976, and present National Security Adviser H.R. McMaster in 1984.  Secretary of Defense Mattis finished ROTC and graduated from Central Washington University in 1971, while Trump’s Chief of Staff John Kelly enlisted at the tail end of the Vietnam War, receiving his commission in 1976.

In other words, the generation of officers now overseeing the still-spreading war on terror entered military service at the end of or after the tragic war in Southeast Asia.  That meant they narrowly escaped combat duty in the bloodiest American conflict since World War II and so the professional credibility that went with it.  They were mentored and taught by academy tactical officers, ROTC instructors, and commanders who had cut their teeth on that conflict.  Vietnam literally dominated the discourse of their era -- and it’s never ended.

Petraeus, Mattis, McMaster, and the others entered service when military prestige had reached a nadir or was just rebounding.  And those reading lists taught the young officers where to lay the blame for that -- on civilians in Washington (or in the nation’s streets) or on a military high command too weak to assert its authority effectively. They would serve in Vietnam’s shadow, the shadow of defeat, and the conclusions they would draw from it would only lead to twenty-first-century disasters.   

From Vietnam to the War on Terror to Generational War

All of this misremembering, all of those Vietnam “lessons,” inform the U.S. military’s ongoing “surges” and “advise-and-assist” approaches to its wars in the Greater Middle East and Africa. Representatives of both Vietnam revisionist schools now guide the development of the Trump administration’s version of global strategy. President Trump’s in-house Clausewitzians clamor for -- and receive -- ever more delegated authority to do their damnedest and to pursue what retired General (and Vietnam vet) Edward Meyer called for back in 1983: “a freer hand in waging war than they had in Vietnam.” In other words, more bombs, more troops, and carte blanche to escalate such conflicts to their hearts’ content.

Meanwhile, President Trump’s hearts-and-minds faction consists of officers who have spent three administrations expanding COIN-influenced missions to approximately 70% of the world’s nations.  Furthermore, they’ve recently fought for and been granted a new “mini-surge” in Afghanistan intended to -- in disturbingly Vietnam-esque language -- “break the deadlock,” “reverse the decline,” and “end the stalemate” there.  Never mind that neither 100,000 U.S. troops (when I was there in 2011) nor 16 full years of combat could, in the term of the trade, “stabilize” Afghanistan.  The can-do, revisionist believers atop the national security state have convinced Trump that -- despite his original instincts -- 4,000 or 5,000 (or 6,000 or 7,000) more troops (and yet more drones, planes, and other equipment) will do the trick.  This represents tragedy bordering on farce. 

The hearts-and-minders and Clausewitzians atop the military establishment since 9/11 are never likely to stop citing their versions of the Vietnam War as the key to victory today; that is, they will never stop focusing on a war that was always unwinnable and never worth fighting.  None of today’s acclaimed military personalities seems willing to consider that Washington couldn’t have won in Vietnam because, as former Air Force Chief of Staff Merrill McPeak (who flew 269 combat missions over that country) noted in the recent Ken Burns documentary series, “we were fighting on the wrong side.”

Today’s leaders don’t even pretend that the post-9/11 wars will ever end.  In an interview last June, Petraeus -- still considered a sagacious guru of the defense establishment -- disturbingly described the Afghan conflict as “generational.”  Eerily enough, to cite a Vietnam-era precedent, General Creighton Abrams predicted something similar, speaking to the White House as the war in Southeast Asia was winding down.  Even as President Richard Nixon slowly withdrew U.S. forces, handing over their duties to the South Vietnamese Army (ARVN) -- a process known then as “Vietnamization” -- the general warned that, despite ARVN improvements, continued U.S. support “would be required indefinitely to maintain an effective force.”  Vietnam, too, had its “generational” side (until, of course, it didn’t).

That war and its ill-fated lessons will undoubtedly continue to influence U.S. commanders until a new set of myths, explaining away a new set of failures in Iraq, Afghanistan, and elsewhere, takes over, possibly thanks to books by veterans of these conflicts about how Washington could have won the war on terror.

It’s not that our generals don’t read. They do. They just doggedly continue to read the wrong books.

In 1986, Petraeus ended his influential Parameters article with a quote from historian George Herring: “Each historical situation is unique and the use of analogy is at best misleading, at worst, dangerous.”  When it comes to Vietnam and a cohort of officers shaped in its shadow (and even now convinced it could have been won), “dangerous” hardly describes the results. They’ve helped bring us generational war and, for today’s young soldiers, ceaseless tragedy.

Published:1/29/2018 11:49:22 PM