Thursday, May 31, 2018

How did the Boomers, the Counterculture and Peace Generation, produce So Many Warmongering Nihilists?

First off, the Boomer generation was not of a single mind or attitude. If anything, polls show that close to half or more of the Boomers in the 60s and 70s were Conservative/Patriotic. Many supported the Vietnam War and voted for Nixon. And the majority of Boomers were not into the Drug Culture or crazy about the Sexual Revolution. The majority wouldn't have attended Woodstock even if they could have. And yet, the most iconic images of the 1960s are associated with the Counterculture, Hippies, Antiwar Movement, Drug Culture, Youth Rebellion, and/or Radical Politics. It just goes to show that society reflects less the ‘silent majority’ than the Vocal Minority. While it’s true that the loudest voices and splashiest personalities could be speaking in the name of the People — this phenomenon is called Populism — it is often the case that certain elements of the minority (ethnic, religious, commercial, or ideological) speak much louder than the majority that happens to be content, complacent, apathetic, clueless, bored & tired, and/or passive (even timid). Some may regard this as politically sound, as it allows some degree of balance of power between the majority and the minority, i.e. since the majority has the advantage of volume, the minority deserves the advantage of voice.
For those who subscribe to such a view, Populism is most ‘unfair’ and ‘dangerous’ because it gives the majority both the power of volume and the power of voice. To many Jews, Donald Trump is dangerous precisely because he is the Vocal Leader of the Voluminous Majority (though white numbers are fast declining as a share of the overall US population).
If capitalists and leftist radicals have one thing in common, it is a hostility to populism. Capitalist entrepreneurs (especially the big ones) are vastly outnumbered by disgruntled workers and dissatisfied consumers, and so it is in their interest to control the means of information to maintain some kind of balance. No wonder then that all the Big Media are owned by oligarchic powers like Gail Wynand (of THE FOUNTAINHEAD). And even though Charles Foster Kane began as a millionaire populist, his empire eventually grew more cynical and elitist.

Though communists spoke of People Power and Social Justice, the fact is the majority of workers in most nations never wanted communism. As such, the radical left constituted an ideological minority in most societies. What they lacked in numbers, they hoped to gain with vociferousness. The Left had to shout louder, march harder, and make a bigger nuisance of itself to get people’s attention. It had to crank up the noise to be heard. And of course, Jews have favored both the elite commercial class (especially the Advertisers) and the Radical Left because both, in their own way, waged war on the Great Majority. And Jews also did everything possible to suppress Populism of both Right and Left. Nationalist Populism could mean a real challenge to Jewish Power. And Leftist Populism could mean strong opposition to Jewish capitalist elites. Thus, Jews have favored both the elitist Right and the elitist Left. In America, the elitist Right doesn’t speak for the People. It speaks for ‘low taxes’ and ‘liberty’ so that the rich can grow richer. And the elitist Left has, over the years, increasingly altered Progressivism from a broad movement to a fractured set of movements of various identities fired up by radical intellectual (often pseudo-intellectual) theories that have no hope of making much sense to most people. And even when a certain ideology gains the support of the great majority, the effect is to favor minority-over-majority interests: a majority of Americans support Zionism and Homomania, but this isn’t Populism because it’s about the Majority being manipulated and goaded into supporting Minority Privilege and Supremacism.

Read More:

“The Idiot” savant by Gary Saul Morson. On Fyodor Dostoevsky’s The Idiot.


Just 150 years ago, Dostoevsky sent his publisher the first chapters of what was to become his strangest novel. As countless puzzled critics have observed, The Idiot violates every critical norm and yet somehow manages to achieve real greatness. Joseph Frank, the author of the definitive biography of Dostoevsky and one of his most astute critics, observed that it is easy enough to enumerate shortcomings but “more difficult to explain why the novel triumphs so effortlessly over all the inconsistencies and awkwardnesses of its structure.” The Idiot brings to mind the old saw about how, according to the laws of physics, bumblebees should be unable to fly, but bumblebees, not knowing physics, go on flying anyway.

Picture Dostoevsky in 1867. With his bride, Anna Grigorievna, he resided abroad, not for pleasure but to escape—just barely—being thrown into debtors’ prison. To pay his fare, Dostoevsky procured an advance from his publisher Katkov for a novel to be serialized in Katkov’s influential journal, The Russian Messenger. But the money was almost gone by the time Dostoevsky left Russia, for the very reason he found himself in financial straits in the first place. Generously but imprudently, he had continued to support the ne’er-do-well son of his first wife while also maintaining the family of his late brother. Anna Grigorievna complained that her sister-in-law lived better than she did.

In her memoirs, Anna Grigorievna described how, exasperated by her husband’s absent-minded generosity, she disguised herself as a beggar, got a handout from her oblivious husband, and confronted him with the donation. When she married Dostoevsky, she thought he had overcome his gambling addiction, but abroad he could not resist roulette, and, of course, always lost. They pawned her dowry, then their clothing. In one letter Dostoevsky begged Katkov for another advance, saying they would otherwise be forced to pawn their linen. It sounds like exaggeration, but he wrote to a friend confessing that he had understated the case, because he could not bring himself to say that they had already pawned it.

In such conditions, the novel Dostoevsky was working on, to be called The Idiot, did not progress well. Five times, the couple was forced to move when landlords would extend no more credit. Dostoevsky was plagued by epileptic seizures, incapacitating him for days. When Anna Grigorievna went into labor with their daughter Sonya, he suffered an attack, and it was hours before she could rouse him to go for a midwife. When the baby died, he experienced guilt as well as grief because, he believed, if they had been in Russia, Sonya would have survived.

Dostoevsky simply had to produce a novel, but refused to cheapen his work. “Worst of all I fear mediocrity,” he wrote to his niece. “I assure you the novel could have been satisfactory,” he explained to his friend, the poet Apollon Maikov, “but I got incredibly fed up with it precisely because of the fact that it was satisfactory and not absolutely good.” At last he abandoned his drafts. Nothing mattered more than artistic integrity.

Read More:

Silence Around Test Scores Serves the Privileged, by Daniel Friedman

Right-wing podcaster Stefan Molyneux recently advised his teenage fans that they should append their IQ scores to job applications. This idea was widely and deservedly ridiculed on Twitter. It’s a serious faux pas to include test scores of any kind — IQ especially, but also SAT or graduate admissions tests like LSAT, MCAT or GMAT — on a resume.  Including test scores will cause many employers to draw negative assumptions about an applicant, and thus reduce the applicant’s chances of being hired, regardless of how good the scores are.

But why is there such a taboo against sharing scores that including them on a resume would cause an employer to draw negative inferences about an applicant’s character? Why is it considered extreme and risible to suggest that a job candidate with a high IQ or a high SAT score should treat that as a qualification? And who benefits from this norm of keeping this data secret?

Proxies for aptitude

While it is bad advice for a job applicant to share test scores with an employer, nearly every job application will include an applicant’s college degree and the institution that granted it.  There’s no taboo against telling people where you went to school. You’re expected to put it on your resume. You can mention it in conversation with people you just met. You can include it in your online dating profile.  You can wear a T-shirt with the name of your school printed on it, or festoon your car with bumper stickers that show your school spirit. However, there’s almost no occasion where it is socially acceptable to brag about your test scores or ask somebody else about theirs. 

People use the relative prestige of their degree-granting institutions to sort themselves, their colleagues and their peer groups into hierarchies, and employers use this information to rank prospective hires as well. If an applicant went to an elite university, most employers will assume that applicant is likely to be smarter than someone who attended a less prestigious school.  

It’s true that being very smart and scoring very well on the SAT is the only way most people can have a chance of getting into an Ivy League school, but a lot of seats at elite schools are set aside for special people who don’t have to meet the high academic standards required of everybody else. Meanwhile, a lot of people with stellar scores attend lower-ranked schools, to take advantage of economic incentives like merit scholarships or in-state tuition, or simply because elite schools turn away a lot of top academic performers to make room for the children of the rich and powerful.

The 25th percentile SAT scores at Yale, for example, are 720 on the critical reading section and 710 on the math.  The 75th percentile SAT scores at my undergraduate alma mater, the University of Maryland, College Park, which is currently ranked #61 by USNews, are 720 critical reading and 750 math.
That means at least a quarter of Maryland students matched or beat, on at least one SAT section, the scores of the bottom quarter of Yalies.  And there are dozens of other colleges and universities where significant percentages of students are more academically accomplished than many students at top schools.
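The comparison behind that claim is simple percentile arithmetic, which can be sketched in a few lines of Python. The four section scores are the ones quoted above; the variable names are illustrative, not drawn from any published dataset:

```python
# SAT section percentiles quoted in the text.
yale_25th = {"critical_reading": 720, "math": 710}      # bottom quarter of Yale at or below these
maryland_75th = {"critical_reading": 720, "math": 750}  # top quarter of Maryland at or above these

# If Maryland's 75th-percentile score meets or exceeds Yale's 25th-percentile
# score on a section, the top quarter of Maryland students overlaps the
# bottom quarter of Yale students on that section.
overlap = {
    section: maryland_75th[section] >= yale_25th[section]
    for section in yale_25th
}
print(overlap)  # {'critical_reading': True, 'math': True}
```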

Privilege at work

Even though many students at places like Maryland have better scores than many students at places like Yale, those students don’t actually have good enough applications to get into those elite schools, because top schools have a two-tiered admissions system.  

Daniel Golden won the Pulitzer Prize in 2004 for reporting on college admissions for the Wall Street Journal, and he explains how these systems work in his excellent 2006 book The Price of Admission. In addition to the widely-discussed practice of using lower admissions cutoffs for underrepresented minorities, the most prestigious colleges in the country also use lower academic criteria to admit students who are legacies — the children or grandchildren of alumni — and students who can participate in athletics. And, while admissions officials flatly denied this on the record, Golden makes a strong anecdotal case that the admissions standards can bend much further for the children of the very rich and the very powerful. 

He found cases where administrators focused on cultivating donors attempted to exert pressure on admissions committees on behalf of certain applicants, or found sports teams to slot favored candidates onto in order to get them the benefit of lower admissions criteria for athletes. And certain special cases got very special treatment.

Read More:

Revisiting a Transformational Speech: The Culture War Scorecard. Social conservatives won some and lost some since Pat laid down the marker. By Michael Barone

On Monday, August 17, 1992, Patrick Buchanan took the stage at the Republican National Convention in Houston. Buchanan had run against incumbent President George H. W. Bush for the Republican presidential nomination and in the first primary, in New Hampshire in February, had won 37 percent of the vote to Bush’s 53 percent. That turned out to be Buchanan’s high point: overall he won just 23 percent of primary votes to Bush’s 73 percent, and under Republicans’ winner-take-all delegate allocation rules he had only a handful of delegates at the convention—the official roll call credited him with just 18. In contrast, the last challenger of an incumbent Democratic president, Edward Kennedy, held the loyalty of about 40 percent of the delegates at the party’s 1980 national convention.

Buchanan, unlike Kennedy, warmly endorsed the president who defeated him. He credited Ronald Reagan, not Bush, with “leading America to victory in the Cold War,” but noted that “under President George Bush more human beings escaped from the prison house of tyranny to freedom than in any other four-year period in history.” But he had little else to say about foreign policy. And on the economy—thought then to be in a recession which, the official arbiters ruled later, had bottomed out in March 1991—Buchanan was emphatically downbeat, devoting long stretches of his speech to people he’d met on the campaign trail in New Hampshire, Georgia, and California who were terrified of losing their jobs. This was hardly helpful to an incumbent seeking a second term.

But the heart and most of the body of Buchanan’s speech was about cultural issues. “This election is about more than who gets what,” he said, arguing against a generation of political scientists. “It is about who we are. It is about what we believe, and what we stand for as Americans. There is a religious war going on in this country. It is a cultural war, as critical to the kind of nation we shall be as was the Cold War itself, for this war is for the soul of America.” The speech drew loud and lengthy cheers from an audience that included few Buchanan primary supporters. The speech and the cheers were loud enough to delay beyond primetime the evening’s supposed headline event: the address of former president (and former Buchanan boss, in 1985-87) Ronald Reagan, which turned out to be his last about politics.

Buchanan’s oration was quickly dubbed the “culture war” speech by a hostile press. “It probably sounded better in the original German,” quipped Texas liberal journalist Molly Ivins. They were perhaps particularly appalled at Buchanan’s closing encomia to the Army troops who suppressed the 1992 Los Angeles riot that followed the acquittal of officers accused of assaulting Rodney King. “And as those boys took back the streets of Los Angeles, block by block, my friends, we must take back our cities, and take back our culture, and take back our country.” Perhaps the reporters and commentators thought it was somehow racist, or an appeal to racism, to hail those who stopped aimless property destruction and assaults on innocent human beings.

Read More:

Facebook Is Teetering at the Top of the Beanstalk

Facebook and other social media were supposed to foster community and give a voice to the greater masses. This was the appeal of social media in the first place. In case Mark Zuckerberg or Twitter’s Jack Dorsey forgot – we made you, and we can break you. The recent move by Facebook to censor ads on a whim should prompt entrepreneurs to create a totally agnostic platform outside the U.S. domain.
I started my writing career providing web content for entrepreneurs engaged in online commerce. As my expertise and writing improved, I came to be considered one of the best technology analysts in the world. By the time Web 2.0 rolled in, the landscape of media began changing on a daily basis. And within this atmosphere of paradigm change, people like me not only tested all the startup innovations, we became the lead evangelists who informed the general public of new technological developments. Tech blogs and traditional tech magazines were at the forefront of Silicon Valley’s “Jack and the Beanstalk” emergence, far ahead of media like The New York Times and the Wall Street Journal. And in public relations and marketing, my associates and I even helped transform the world’s biggest PR firms. Companies like Edelman, Waggener Edstrom, and especially Margery Kraus’ political dynamo APCO were operating at pigeon speed until digital gurus like me spurred them on. This is no brag, but a fact I can prove. Fast forward to today, and a time when everything is an advertisement, and when the giants we created trample our expectations and illusions.

This morning I went to Facebook as I do every morning. An opinion piece I wrote yesterday was on my mind. I really wanted to inform more people of a theory of mine, and of new evidence I’d discovered, about U.S. policy. This “evidence” was really important, the kind of stuff that helps normal people make heads or tails of the constant propaganda machine western media has become. The piece was moderate in tone, a dissenting view, as so many of today’s independent media stories are. Unfortunately, today was not like previous days. Facebook’s new policies about so-called “political ads” came into force. Instead of being able to create a targeted advertisement for my work, I received an ad block from Facebook telling me my page had not been authorized for political ads. To understand more about Facebook’s new policy, I suggest you read this report on The Verge, which discusses how “all” ads can be classified as unacceptable if Facebook deems it necessary. As for my own situation, the Facebook blockage destroys my capability to reach a wider audience. And destroying the “voices” that do not agree with the mainstream narrative is what the whole anti-Russia mess is really about. Censorship and the silencing of dissent, of which Facebook’s most recent policy change is the latest example, are a greater danger to humanity than any atomic bomb.

Read More:

Mailvox: Hope for Generation Zyklon

A reader sends in these optimistic observations of the postmillennial generation.

Gen-Z and Hope in the Post-Institutional Age

Recently at our church over a dozen graduating high school seniors were honored and their post-graduation plans were highlighted. Every young man is either majoring in engineering or learning a trade like welding or electrical work. Every young woman except one is going into nursing or the medical field. Why is this significant? This is Generation Z and they aren’t fooling around. They’ve seen their older Millennial siblings and cousins struggle with their worthless degrees and jobs, and they are already taking a different path. Gen-Z supports Trump, and most importantly, there’s no give-up or hopelessness in them, even with all of the problems we face.

If you are an older Millennial or a Gen-Xer like myself, you have seen nothing but loss if you are politically right of center. Vox once mentioned that Millennials know something is lost, but not what; Gen-X watched it happen. This gave us perspective, but it also gave us unbridled cynicism.  Gen-Z doesn’t really live in a world of conservative losers and cucks, as they are now irrelevant. They only see a tough road ahead, but they are determined.

I’ve noticed that older Millennials and Gen-X have a problem. Whenever somebody on our side says they are going to do something, or fight back, or even express hope of winning, we have to repress our cynicism. Why? The conservatives turned out to be the biggest bunch of feckless, stupid, political losers in the last century, and we were dumb enough to believe in them for a while. It’s nearly impossible for us to believe anyone right of center can pull their heads out of their asses long enough to do anything meaningful, so we default to cynicism.

Who can remember American cars from the 70s and 80s? If you do, you will remember they produced some of the worst vehicles in history. We inherited them from our parents in high school and we went off to college with them.  Our experience with them was so bad that GM lost an entire generation of buyers. Quite purposefully, they stopped marketing to us and went for the younger generation who knew nothing of GM.  Microsoft had to do the same thing, as the bad PR from the late 90s and early 2000s turned off millions of potential buyers. But a 14-year-old Xbox owner knows nothing of that and doesn’t care.

In the same way, the failures from the 1960s until the present are history to an 18-year-old.

Gen-Z has a big advantage as they are essentially starting at the bottom.  The institutions are all gone. The corporations are all SJW-converged. Free speech is mostly gone. They don’t know a different world than this, but they do know that this is the wrong state of affairs. They didn’t get to see the massive societal destruction of the last 50 years, which is a good thing. Which is more demoralizing, seeing something you love being destroyed or walking in after the destruction to pick up the pieces?

Read More:

Wednesday, May 30, 2018

Chicago: 35 Shot Over Memorial Day Weekend

There were over 30 shooting incidents in the city during the Memorial Day weekend and the local CBS news station only dedicated a little under 30 seconds to the story. Not even they give a shit about niggers getting killed and shot.
And how much money and time has been spent trying to integrate these baboons into our society? It’s been a huge waste of energy and resources. We see proof of it with stories like these. Only an insane person would think that we should continue trying to treat them as equals. I don’t care how many Bryant Gumbel and Barack Obama look-a-likes they put on television.
Read More:

Controlled media journalist works with Jewish supremacist organization and other anti-White activists to doxx first amendment supporters.

Matthew Stewart, a school break reporter for the Blount County, Tennessee, Daily Times newspaper, who apparently got into trouble with his editor for publishing pictures of ShieldWall Network flyers recently distributed in and around Maryville clearly showing the website address, is having to virtue signal to make up for the dozens of East Tennessee citizens who have contacted us to express support and sign up thanks to his first article. And boy, he’s signaling hard, quoting the ADL about the ShieldWall Network and allowing the Jewish supremacist organization to use his article to try to doxx White patriots:
“Protesters outside First United Methodist Church were members of the Arkansas-based ShieldWall Network, said Allison Padilla-Goodman, director of the (ADL) nonprofit’s Southeast Region, based in Atlanta. The group is run by Billy Roper, whose “goal is to create a white ethnostate,” or a whites-only nation.
Read More:

The Audacious Epigone: Hear, O Israel

Open borders and the aging of Heritage America do not bode well for Israel's relationship with the US. The following graph is sourced from Reuters-Ipsos polling data. It shows percentages of respondents by selected demographics who gauge Israel as a "moderate", "serious", or "imminent" threat--as opposed to a "minimal" or "no" threat. "Not sure" responses, comprising 16% of the total, are excluded (N = 5,097):

Quite a contrast with group sentiments towards Russia. Curious.

That white Christian America is the best friend Israel has ever had, and that the Great Replacement will be among the worst things to happen to Israel, have been obvious for decades.

Tribe members, encourage Heritage Americans to protect the borders, language, culture, and demography of their country in the same way the Israelis protect theirs and you have a pro-Israel philo-semite in me.

Read More:

Is America an Idea? by Emile A. Doak

The civic-nationalist view holds that subscribing to the philosophy of the Founding—equality, opportunity, individualism—is the defining trait of the American people. But others argue that it is those uniquely American practices that order the rhythms of life that make us love the United States as our home…

This past weekend, there was a wedding some 4,000 miles away that nonetheless captivated the American psyche. Across the country, bars opened early, women donned ornate hats, and talk shows gossiped about every detail of a quintessentially British tradition. Yes, the bride was American. Yes, the ceremony indulged more modern accommodations than any of its regal predecessors. Yet despite these moves away from tradition, the spectacle remained unmistakably British. It reflected the culture, tradition, and mores of a specific national identity.
Perhaps this unabashed Britishness is part of the reason the royal wedding captured America’s attention. Its ceremony reflected a certainty (albeit a fading one) about national identity that Americans have always lacked. The debate over the nature of American identity—one weaved throughout the history of our New World nation—has come to the fore again with the immigration question. What is America? What does it mean to be an American?
For many, these questions are answered by the “American idea.” Rishabh Bandari and Thomas Hopson, writing in National Affairs, define this view as “civic-nationalist” and place conservatives like Senator Ben Sasse within this camp.[1] This American identity places an emphasis on citizenship, but also a mythological perception of the American Founding. Under this view, subscribing to the philosophy of the Founding—equality, opportunity, perhaps individualism and self-reliance—is the defining trait of the American people. Recognizing the civic sainthood of the Founding Fathers is an added boon to one’s American bona fides.
The appeal of this civic-nationalist view is readily apparent. It forges a sense of unity out of a naturally fractured polity. Americans’ diverse ancestral homelands were always going to pose a challenge to the creation of a national identity. The thousands of years of custom and tradition that form the basis of the Britishness on display at the royal wedding are inherently absent in the United States.
This civic-nationalist Americanism also finds appeal in what it refutes: the nefarious strands of ethnic nationalism that attempt to resolve the question of national unity with a toxic, race-based Americanism. The civic-nationalist view is open to anyone from any background, so long as one signs on to the “American idea.” In this way, the multicultural nature of the nation is affirmed, with the political philosophy of the Founding providing the unity necessary for a functioning polity.
Read More:

Wolfe and Roth: A Look Back by Steve Sailer

The deaths this month of literary giants Tom Wolfe, age 88, and Philip Roth, 85, illuminated a little-noticed divide in American life. The two writers were fairly comparable combinations of talent, energy, ambition, and personality, so it’s instructive to see how their reputations differed.
Roth’s passing revived a debate among Jewish critics that has been sputtering along since he published his 1959 satire on the Jewish American Princess, Goodbye, Columbus: Is Philip Roth good for the Jews or bad for the Jews?
Should his many strong novels entitle Roth to be admired by other Jews as a Jewish hero? Or were his books, especially his funny but vulgar 1969 best-seller Portnoy’s Complaint, about a shiksa-chasing liberal lawyer’s Jewish guilt, excessively frank about Jewish life?
In Portnoy’s Complaint, Jewish guilt is the diametrical opposite of white guilt: Portnoy feels guilty not because his ancestors were too ethnocentric (as in white guilt) but because he’s not ethnocentric enough to satisfy his ancestors, who want him to knock it off with the gentile broads, settle down with a nice Jewish girl, and propagate the race.
To Roth’s Jewish critics, his insufficient feelings of Jewish guilt revealed him to be a self-hating Jew. That might seem convoluted, even contradictory, but the point was never to make the rules lucid to the American public, but to discourage Roth from using his skills to let his gentile readers in on how things worked.
But if the early Roth was seen as too honest to be good for the Jews, did Roth’s anti-gentilic 2004 alternative-history novel The Plot Against America, in which the election of Charles Lindbergh as president in 1940 is bad for the Jews, make up for his youthful indiscretions?
As Americans, we are all used to Jews arguing over whether or not various things are good for the Jews. Some Americans may find these displays of ethnocentrism unseemly, but if you are not Jewish, you’d be well advised not to mention that view in public. One thing Jews can agree upon is that gentiles having an opinion on Jews is not good for the Jews.
Read More:

Derbyshire vs. MacDonald Revisited by Edmund Connelly, Ph.D.

I don’t believe I’m exaggerating when I say that Kevin MacDonald’s Culture of Critique is one of the most important books of our age. Despite this fact, it has garnered remarkably little attention in traditional spheres, particularly academic circles. Of course, the reasons for this are obvious — the book is critical of Jewish behavior, it helps Gentiles understand how Jews are working against Gentile interests, and it shows that Jews control much of the content of academic and media discussion.
At the same time, MacDonald and his works have garnered immense attention from those who are often harmed by Jewish behavior or are excluded from areas of cultural construction, including many in the loosely-defined “Alt-Right.” Further, Prof. MacDonald has been indefatigable in his efforts to spread his views, appearing on countless podcasts and other Internet shows, speaking at conferences, etc. In my estimation, MacDonald is the hands-down intellectual leader of resistance to Jewish attacks on the White race, and in our circles he is honored as such.
Recently, a young man in the academic arena chose to address MacDonald’s work, which has given MacDonald a chance to once again defend his various theses on Jews. The academic is Nathan Cofnas, a graduate student working toward his doctorate in the philosophy of biology at the University of Oxford. (He is not a professor yet, though as a potential graduate of Oxford with a Ph.D. — and as a Jew who has published an attack on MacDonald — his chances of gaining a good tenured teaching position are almost guaranteed.)
Fortunately for us, Spencer Quinn, a prolific writer for Counter-Currents, has ably summarized the debate between MacDonald and Cofnas. See Parts 1, 2, 3 and 4.
This leaves me free to bring back a similar older debate among MacDonald, John Derbyshire, and Joey Kurtzman. Though these debates deserve a revisit based on their own merits, the fact that Cofnas has now revived similar discussions makes previous discussions all the more relevant.
I first wrote about Derbyshire’s opinions of KMAC’s work way back in 2008, then again in 2012. For the purposes of this current piece, I will crib liberally from those original two essays, though the links within have often not survived well.
Derbyshire’s first major piece on MacDonald appeared in the March 10, 2003 issue of The American Conservative under the title “The Marx of the Anti-Semites.” (Editors chose the title, not Derbyshire.) There Derb’s take on the book was mixed, beginning with “The Culture of Critique includes many good things. . . . Kevin MacDonald is working in an important field.” He even validates an important point of MacDonald’s work: “These Jewish-inspired pseudoscientific phenomena that The Culture of Critique is concerned with — Boasian anthropology, psychoanalysis, the Frankfurt School, and so on — were they a net negative for America? Yes, I agree with MacDonald, they were.”
Read More:

Brussels’ Revenge: Italian President Blocks Nationalist Populist Coaliti...

The Fight for a Human Future - Wolf Age

The Fall of Greece: Prepare Yourself Accordingly

Journalist Debunks Syrian War & Exposes White Helmets

Sunday, May 27, 2018

The Public Space: Kevin MacDonald's Reaction to the Cofnas Critique

Kevin MacDonald - How Richard Dawkins Failed Us

David Duke Show May 25th 2018 with Mark Collett


To date, 62 Palestinians have been shot dead in the Gaza Strip by the Israeli army and over 5,500 wounded by gunfire.  Their crime: protesting the loss of their ancestral homes in the West Bank.
Here was an example of Gandhi-style passive resistance that failed.  Israeli sniper teams fired at will at the protesters, some of whom were throwing rocks or using slingshots.  High-concentration tear gas was dropped by drones on the demonstrators.  Israel claimed it was killing ‘terrorists.’
The United States, Israel’s patron and financier, reveled in the move of its embassy from Tel Aviv to Jerusalem, a move seen by Bible Belt religious fundamentalists as a key step to the return of the Christian Messiah and Armageddon.  The rest of us, Jews included, are fated to be burned alive.  The American Republicans, who have become a far-right theocratic party, cheered this good news.  The Trump administration, by now an extension of Israel’s hard right Likud Party, was cock-a-hoop.
There was no joy in Gaza.  This miserable, squalid human garbage dump is a giant open-air prison packed with 2 million Palestinian refugees driven from the newly created state of Israel in 1948.  Israel and its close ally Egypt keep Gaza bottled up on its land and sea borders.  Palestinians are only allowed to fish along the shore. Coastal gas and oil reserves have been expropriated by Israel and Egypt.
Gaza’s two million people subsist on the edge of starvation. Israel openly boasts that it allows just enough food into the enclave to prevent outright starvation.  Chemicals to treat water are banned. Electricity runs only a few hours daily because the power plant was bombed by Israel’s US-supplied air force.  Hospitals have almost no medicines.  In short, wartime conditions in the open-air prison. Even the wretched animals in Gaza zoo are starving.
The intensive punishment of Gaza, a crime under international law, began after its people voted in a free election for the Hamas movement over the Palestine Liberation Organization (PLO) which is more or less run by Israel and the United States.  Israel helped found Hamas in 1987, but then sought, with the US, to destroy the organization, branding it ‘terrorist.’
Israel has extensively used US-supplied arms and money to fight Hamas in Gaza, a clear violation of the Arms Export Control Act of 1976 that bars the use of American weapons against civilian populations.
The question remains, where did all the Palestinians come from?  Israel long claimed there were no such people, or that theirs was a made-up nationality. This was a pretty rich claim coming from Israelis, many of whom hailed from Russia, Poland, and Eastern Europe and who had assumed biblical identities and asserted a direct link to the Hebrews who had lived in the Levant two thousand years earlier.
When Israel was created by the US and UN (with Soviet support) in 1948, from 750,000 to one million native Palestinians were driven from their ancestral home at gunpoint or panicked to flight by massacres and ethnic cleansing.   Their villages were bulldozed.
When Israel conquered and annexed the West Bank and the Old City of Jerusalem in 1967, another 500,000 Palestinians were made refugees.   Some 50,000-250,000 Syrians were driven by Israel from the strategic Golan Heights.  Bedouins were driven from Israel’s Negev Desert.
By our era, the number of homeless Palestinians has grown to 5 million refugees aided by the UN, plus at least another million scattered about the Mideast.  The actual number could reach as high as 8-9 million thanks to the Palestinians’ high birth rate and strong family values.
Read More:

America’s Fifth Column Will Destroy Russia by PAUL CRAIG ROBERTS

This is the lecture I would have given if I had been able to accept the invitation to address the St. Petersburg Economic Conference in Russia this weekend.
Executive Summary: From the standpoint of Russia’s dilemma, this is an important column. Putin’s partial impotence vis-à-vis Washington is due to the grip that neoliberal economics exercises over the Russian government. Putin cannot break with the West, because he believes that Russian economic development is dependent on Russia’s integration within the Western economy. That is what neoliberal economics tells the Russian economic and financial establishment.
Everyone should understand that I am not a pro-Russian anti-American. I am anti-war, especially nuclear war. My concern is that the inability of the Russian government to put its foot down is due to its belief that Russian development, despite all the talk about the Eurasian partnership and the Silk Road, is dependent on being integrated with the West. This totally erroneous belief prevents the Russian government from any decisive break with the West. Consequently, Putin continues to accept provocations in order to avoid a decisive break that would cut Russia off from the West. In Washington and the UK this is interpreted as a lack of resolve on Putin’s part and encourages an escalation in provocations that will intensify until Russia’s only option is surrender or war.
If the Russian government did not believe that it needed the West, the government could give stronger responses to provocations that would make clear that there are limits to what Russia will tolerate. It would also make Europe aware that its existence hangs in the balance. The combination of Trump abusing Europe and Europe’s recognition of the threat to its own existence of its alignment with an aggressive Washington would break the Western alliance and NATO. But Putin cannot bring this about because he erroneously believes that Russia needs the West.
If the neoconservatives had self-restraint, they would sit back and let America’s Fifth Column—Neoliberal Economics—finish off Russia for them. Russia is doomed, because the country’s economists were brainwashed during the Yeltsin years by American neoliberal economists. It was easy enough for the Americans to do. Communist economics had come to naught, the Russian economy was broken, Russians were experiencing widespread hardship, and successful America was there with a helping hand.
In reality the helping hand was a grasping hand. The hand grasped Russian resources through privatization and gave control to American-friendly oligarchs. Russian economists had no clue about how financial capitalism in its neoliberal guise strips economies of their assets while loading them up with debt.
But worse happened. Russia’s economists were brainwashed into an economic way of thinking that serves Western imperialism.
For example, neoliberal economics exposes Russia’s currency to speculation, manipulation, and destabilization. Capital inflows can be used to drive up the value of the ruble, and then at the opportune time the capital can be pulled out, dropping the ruble’s value and driving up domestic inflation with higher import prices, delivering a hit to Russian living standards. Washington has always used these kinds of manipulations to destabilize governments.
Neoliberal economics has also brainwashed the Russian central bank into the belief that Russian economic development depends on foreign investment in Russia. This erroneous belief threatens the very sovereignty of Russia. The Russian central bank could easily finance all internal economic development by creating money, but the brainwashed central bank does not realize this. The bank thinks that if it finances internal development the result will be inflation and depreciation of the ruble. So the central bank is guided by American neoliberal economics to borrow abroad money it does not need, in order to burden Russia with foreign debt that requires a diversion of Russian resources into interest payments to the West.
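The capital-flight cycle described above is simple arithmetic. The following sketch uses entirely made-up exchange rates (not actual ruble data) to illustrate the mechanism: hot money bids the currency up, then exits, and the domestic price of imports rises as the currency falls.

```python
def import_price_in_rubles(price_usd, rub_per_usd):
    """Domestic ruble price of a dollar-denominated import."""
    return price_usd * rub_per_usd

# Phase 1: speculative inflows strengthen the ruble (hypothetical rates).
rate_before = 60.0       # rubles per US dollar before the play
rate_after_exit = 75.0   # rubles per US dollar after the capital is pulled out

price_usd = 100.0        # a $100 imported good

p0 = import_price_in_rubles(price_usd, rate_before)      # 6000 rubles
p1 = import_price_in_rubles(price_usd, rate_after_exit)  # 7500 rubles

imported_inflation = (p1 - p0) / p0
print(f"Import price rises {imported_inflation:.0%}")    # prints "Import price rises 25%"
```

With these illustrative numbers, a 25% depreciation against the dollar translates directly into a 25% jump in the ruble price of imports, which is the hit to living standards the column describes.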
Read More:

US Senate Candidate for California

Thursday, May 24, 2018

Tiger Mother's Son - Steve Sailer

A year ago, after a decade of worsening pain from swinging hard since infancy, Tiger Woods, perhaps the highest-paid athlete in history, appeared headed for the same fate as Philip Seymour Hoffman, Prince, and Tom Petty. But this spring Woods is grinding away again on tour, playing surprisingly respectable golf.
Now 42, he may never win again. Or, who knows, Woods may put on a late charge toward the only records he doesn’t hold: Sam Snead’s 82 PGA wins (Woods has 79), Jack Nicklaus’ eighteen major championships (Woods has been stuck at fourteen since he memorably limped to the 2008 U.S. Open title), and his tennis peer Roger Federer’s twenty Grand Slam titles.
With the U.S. Open coming up in June at Shinnecock Hills in the Hamptons, it’s worth reviewing the new tell-all biography Tiger Woods by Jeff Benedict and Armen Keteyian. Like his mentor Michael Jordan, Tiger benefited from advancements in the techniques of access journalism for controlling his media image, until it all fell apart in the sex scandal of 2009. So it’s informative to compare what Benedict and Keteyian say actually happened with what you had been told by the press at the time.
Woods being part black via his blowhard father was usually thought of as the most important element of his story, an angle assiduously pushed by Earl Woods. The elder Woods told a sports agent when Tiger was a 5-year-old sensation, “I believe that the first black man who’s a really good golfer is going to make a hell of a lot of money.”
But this exhaustive book doesn’t come up with much evidence for the pre-disgrace assumption that Tiger’s black side was key. For example, his father liked to tell the story of how as a kindergartener, Tiger was nearly lynched by racist 5-year-olds. But the authors can’t find a shred of proof for Earl’s oft-told tale. It appears to be just another Hate Hoax.
Read More:

“Prepare for guerrilla war” say police special forces as the authority of the state crumbles in France

The local people interviewed about this incident afterwards said it was nothing special; things like that happen almost every day. Some of the footage showed them just walking around normally as the shooting went on.
Here, in a separate incident in another part of Marseille, we see a police car completely surrounded by moped-riding “jeunes” [young people]. They are intimidating the police.

Read More:

Dutch Schoolgirls Sex Trafficked by the Thousands by Moslems? Imagine Moy Shawk. By Andrew Anglin

White women may no longer make good wives and mothers. But they do make good sex slaves for entrepreneurial Islamic New Dutchmen.

You see, the thing about this is…
The thing that you really need to understand is…
If we talk about the fact that Moslems are sex trafficking underage white girls by the thousands, give it too much media attention, it will make people racist against Moslems because it will make them think that Moslems are sex trafficking underage white girls by the thousands.
And you know what the real problem here is, don’t you?
It’s education.

If I had not had my thrice-weekly 9th-grade class on why it is bad to sex traffic underage girls, I have no doubt that I myself would be running a sex trafficking gang, giving drugs to little girls, then gang-raping them, then pimping them out while keeping them drugged-up and threatening to kill them if they told anyone what was going on.
In Moslem countries, they don’t have these classes because of the Crusades and colonialism destroying their education systems.
So what do you expect is going to happen?

Read More:

The Psychology of Why Most New Year’s Resolutions Fail

Year after year, thousands of people all over the world have the same experience.
High levels of motivation in January, followed by a temporary change in behaviour, soon followed by a drop in motivation and then a reversion to old habits. (With some added self-loathing thrown into the mix.)
Indeed, a 2016 study found that over 80% of New Year’s Resolutions fail by the second week of February. Recent psychological research indicates that motivation is only effective for short-term behaviour change; if we want to make changes that last, a different approach is required.
So, if a lack of motivation isn’t behind the high failure rate, what is?
In this post, we’ll explore the underlying psychology of why most resolutions fail, then introduce a potential solution to the problem – one based on our evolutionary past which, properly understood, can empower you to make long-term behaviour changes in a sustainable way.
Read More:

'The New York Times' Runs A Comprehensive Hit Piece On Jordan Peterson. It's Dishonest, Malicious Crap.

Jordan Peterson during his lecture at UofT. Peterson is the professor at the centre of a media storm because of his public declaration that he will not use pronouns, such as 'they,' to recognize non-binary genders.
On Friday, The New York Times released a 3,400-word, spectacularly brutal and dishonest hit piece on Jordan Peterson. Its none-too-subtle title: "Jordan Peterson: Custodian Of The Patriarchy."
The author, Nellie Bowles, can’t hold back her obvious antipathy for her subject; instead, she lets her sneering condescension drop from every comma. She never lets Peterson spell out an argument; she satisfies her id with every snide aside. It’s an extraordinary performance by a person deliberately attempting to misunderstand her subject.

Read More:

Former WH Spox: Obama Knew About #Spygate – No FBI Director Would Put Spies Inside of a Campaign Without President’s AUTHORIZATION

Obama knew; the fish rots from the head down.

Obama knew about the spies planted into Trump’s campaign.
Obama knew about Hillary’s phony dossier.

Ari Fleischer, former White House Press Secretary for George W. Bush, broke down why Obama must have known about the informant spies planted in Trump’s campaign.
In fact, Ari Fleischer says Obama AUTHORIZED it.

Black hostility is the new cure for relentless white racism in Birmingham.

Read More:

Another lousy article on epigenetics, this time in the New York Review of Books

I don’t know who the New York Review of Books is getting to vet its biology articles, but this one below (free access) is really confusing. One reason may be that the authors have no particular expertise in evolution. Israel Rosenfield is an MD with training in neuroscience, while Edward Ziff is a professor at NYU who works on neural transmission. I’ve never heard of either of them, which doesn’t rule them out as being able to say anything useful about evolution, but I am usually familiar with the people who write about evolution for the NYRB, and they have always had a name in evolutionary biology (viz., my student Allen Orr). Anyway, here’s the article.
The problem with this article is that it’s deeply confused, conflating gene regulation within the lifetime of an individual (which can be achieved by “epigenetically” attaching or detaching methyl groups to genes to turn them on or off) with environmentally induced modification of the DNA that is inherited over several generations, causing a form of “non-Darwinian” evolution.
The former is not problematic; we’ve long known that genes can be regulated by environmental factors; that’s what Jacob and Monod got their Nobel Prizes for. We’ve learned more recently that this regulation can also be programmed to cause methylation: genes are adaptively activated and deactivated by the attachment of methyl groups (or small RNA molecules) to segments of genes. But even that adaptive regulation is coded for by DNA. That is, there are genes which are programmed to turn other genes on or off by adding small molecules to them. That’s how cells differentiate, so that even though all cells carry the same DNA, some become liver cells, others brain cells or blood cells, and so on. That is, permanent changes in gene regulation occur largely by methylation, and over generations of cells, but not over generations of individuals. Such differentiation within an individual still evolved in a Darwinian way; it’s just that natural selection favored particular ways to adaptively activate or inactivate genes.
However, a vocal group of biologists maintains that evolution can also occur when the environment (extrinsic factors like cold or starvation), rather than DNA, methylates genes, and that this methylation can be inherited. The problem with this, as I’ve emphasized repeatedly, is twofold. First, those environmental changes are nearly always nonadaptive or maladaptive—they’re more like random screwups—and so can’t be the basis of adaptive evolution.
Second, environmentally induced changes in DNA are nearly always wiped out during gamete formation, and so those changes cannot be the basis of long-term evolution or adaptation. There are a few exceptions, but I don’t know of any such modifications that last longer than three generations. The fact that this form of “Lamarckian” inheritance isn’t pervasive is also shown by the numerous adaptations that have been dissected genetically: all of them are based on changes in the DNA sequence rather than attachment of methyl groups induced by the environment.
Rosenfield and Ziff conflate the programmed regulation of genes by other genes that induce methylation with “Lamarckian,” environmentally induced methylation. That’s clear when they say stuff like this:
Until the mid-1970s, no one suspected that the way in which the DNA was “read” could be altered by environmental factors, or that the nervous systems of people who grew up in stress-free environments would develop differently from those of people who did not. One’s development, it was thought, was guided only by one’s genetic makeup. As a result of epigenesis, a child deprived of nourishment may continue to crave and consume large amounts of food as an adult, even when he or she is being properly nourished, leading to obesity and diabetes. A child who loses a parent or is neglected or abused may have a genetic basis for experiencing anxiety and depression and possibly schizophrenia. Formerly, it had been widely believed that Darwinian evolutionary mechanisms—variation and natural selection—were the only means for introducing such long-lasting changes in brain function, a process that took place over generations. We now know that epigenetic mechanisms can do so as well, within the lifetime of a single person.
This is deeply confusing, for it conflates gene regulation within an individual with evolutionary changes that evolve over many generations. In fact, the epigenetic regulation mentioned by Rosenfield and Ziff did evolve over generations by natural selection. They are saying that two things are distinct and contradictory when in fact they are the same thing. It’s no surprise that this article (which is largely written in technical jargon) would confuse the layperson.

Read More: