Daily Archives: March 16, 2015

“Gleefully Dabbles”???? Check this out


This article originally appeared in Open Culture. Absolutely amazing… “Gleefully dabbles…”?! Wow.

[Image: Kahlo One]

Walter Keane—supposed painter of “Big Eyed Children” and subject of a recent Tim Burton film—made a killing, attaining almost Thomas Kinkade-like status in the middlebrow art market of the 1950s and ’60s. As it turns out, his wife, Margaret, was in fact the artist, “painting 16 hours a day,” according to a Guardian profile. In some part, the story may illustrate how easy it was for a man like Walter to get millions of people to see what they wanted to see in the picture of success—a charismatic, talented man in front, his quiet, dutiful wife behind. Burton may not have taken too much license with the commonplace attitudes of the day when he has Christoph Waltz’s Walter Keane tell Margaret, “Sadly, people don’t buy lady art.”

And yet, far from the Keanes’ San Francisco, and perhaps as far as a person can get from Margaret’s frustrated acquiescence, we have Frida Kahlo creating a body of work that would eventually overshadow that of her husband, the muralist Diego Rivera. Unlike Walter Keane, Rivera was a very good painter who did not attempt to overshadow his wife. Instead of professional jealousy, he had plenty of the personal variety. Even so, Rivera encouraged Kahlo’s career and recognized her formidable talent, and she, in turn, supported him. In 1933, when Florence Davies—whom Kahlo biographer Gerry Souter describes as “a local news hen”—caught up with her in Detroit, Kahlo “played the cheeky, but adoring wife” of Diego while he labored to finish his famous Detroit mural project.

That may be so, but she did not do so at her own expense. Quite the contrary. Asked if Diego taught her to paint, she replies, “’No, I didn’t study with Diego. I didn’t study with anyone. I just started to paint.’” At which point, writes Davies, “her eyes begin to twinkle” as she goes on to say, “’Of course, he does pretty well for a little boy, but it is I who am the big artist.’” Davies praises Kahlo’s style as “skillful and beautiful” and the artist herself as “a miniature-like little person with her long black braids wound demurely about her head and a foolish little ruffled apron over her black silk dress.” And yet, despite Kahlo’s confidence and serious intent, represented by a prominent photo of her at serious work, Davies—or more likely her editor—decided to title the article, “Wife of the Master Mural Painter Gleefully Dabbles in Works of Art,” a move that reminds me of Walter Keane’s patronizing attitude.

[Image: Kahlo Two]

The belittling headline is quaint and disheartening, speaking to us, like the unearthed 1938 letter from Disney to an aspiring female animator, of the cruelty of casual sexism. Davies apparently filed another article on Rivera the year prior. This time the headline doesn’t mention Frida, though her fierce unflinching gaze, not Rivera’s wrestler’s mug, again adorns the spread. One sentence in the article says it all: “Freda [sic], it must be understood, is Senora Rivera, who came very near to stealing the show.” Davies then goes on to again describe Kahlo’s appearance, noting of her work only that “she does paint with great charm.” Six years later, Kahlo would indeed steal the show at her first and only solo show in the United States, then again in Paris, where surrealist maestro Andre Breton championed her work and the Louvre bought a painting, its first by a twentieth-century Mexican artist.

Livin’ in a Two Story House


Wound up week 5 of my songwriting class last week and am going into the sixth week with the lyrics to my C&W tune coming along pretty well. I really enjoy this Berklee College of Music class. The teacher is an old pro and very witty, and we’re all having a lot of fun with it.

Here’s a little sample….

You’re livin’ in a two story house
His story
Your story
They never match
You’re livin’ in a two story house
Too many holes in your roof for a patch
Headed back to the future ’cause
Yes, he still does, 
and back to the future where
We always were….
…….

Hey, Dumbass. This is for YOU.


Stupid is as stupid does… and Jonathon Gatehouse expresses it brilliantly in this repost from Maclean’s.

America dumbs down

The U.S. is being overrun by a wave of anti-science, anti-intellectual thinking. Has the most powerful nation on Earth lost its mind?

Jonathon Gatehouse


South Carolina’s state beverage is milk. Its insect is the praying mantis. There’s a designated dance—the shag—as well as a sanctioned tartan, game bird, dog, flower, gem and snack food (boiled peanuts). But what Olivia McConnell noticed was missing from among her home’s 50 official symbols was a fossil. So last year, the eight-year-old science enthusiast wrote to the governor and her representatives to nominate the Columbian mammoth. Teeth from the woolly proboscidean, dug up by slaves on a local plantation in 1725, were among the first remains of an ancient species ever discovered in North America. Forty-three other states had already laid claim to various dinosaurs, trilobites, primitive whales and even petrified wood. It seemed like a no-brainer. “Fossils tell us about our past,” the Grade 2 student wrote.

And, as it turns out, the present, too. The bill that Olivia inspired has become the subject of considerable angst at the legislature in the state capital of Columbia. First, an objecting state senator attached three verses from Genesis to the act, outlining God’s creation of all living creatures. Then, after other lawmakers spiked the amendment as out of order for its introduction of the divinity, he took another crack, specifying that the Columbian mammoth “was created on the sixth day with the other beasts of the field.” That version passed in the senate in early April. But now the bill is back in committee as the lower house squabbles over the new language, and it’s seemingly destined for the same fate as its honouree—extinction.

What has doomed Olivia’s dream is a raging battle in South Carolina over the teaching of evolution in schools. Last week, the state’s education oversight committee approved a new set of science standards that, if adopted, would see students learn both the case for, and against, natural selection.

Charles Darwin’s signature discovery—first published 155 years ago and validated a million different ways since—long ago ceased to be a matter for serious debate in most of the world. But in the United States, reconciling science and religious belief remains oddly difficult. A national poll, conducted in March for the Associated Press, found that 42 per cent of Americans are “not too” or “not at all” confident that all life on Earth is the product of evolution. Similarly, 51 per cent of people expressed skepticism that the universe started with a “big bang” 13.8 billion years ago, and 36 per cent doubted the Earth has been around for 4.5 billion years.

The American public’s bias against established science doesn’t stop where the Bible leaves off, however. The same poll found that just 53 per cent of respondents were “extremely” or “very confident” that childhood vaccines are safe and effective. (Worldwide, the measles killed 120,000 people in 2012. In the United States, where a vaccine has been available since 1963, the last recorded measles death was in 2003.) When it comes to global warming, only 33 per cent expressed a high degree of confidence that it is “man made,” something the UN Intergovernmental Panel on Climate Change has declared is all but certain. (The good news, such as it was in the AP poll, was that 69 per cent actually believe in DNA, and 82 per cent now agree that smoking causes cancer.)

If the rise in uninformed opinion were limited to impenetrable subjects, that would be one thing, but the scourge seems to be spreading. Everywhere you look these days, America is in a rush to embrace the stupid. Hell-bent on a path that’s not just irrational, but often self-destructive. Common-sense solutions to pressing problems are eschewed in favour of bumper-sticker simplicities and blind faith.

In a country bedevilled by mass shootings—Aurora, Colo.; Fort Hood, Texas; Virginia Tech—efforts at gun control have given way to ever-laxer standards. Georgia recently passed a law allowing people to pack weapons in state and local buildings, airports, churches and bars. Florida is debating legislation that will waive all firearm restrictions during state emergencies like riots or hurricanes. (One opponent has moved to rename it “an Act Relating to the Zombie Apocalypse.”) And since the December 2012 massacre of 20 children and six staff at Sandy Hook Elementary School, in Newtown, Conn., 12 states have passed laws allowing guns to be carried in schools, and 20 more are considering such measures.

The cost of a simple appendectomy in the United States averages $33,000 and it’s not uncommon for such bills to top six figures. More than 15 per cent of the population has no health insurance whatsoever. Yet efforts to fill that gaping hole via the Affordable Care Act—a.k.a. Obamacare—remain distinctly unpopular. Nonsensical myths about the government’s “real” intentions have found so much traction that 30 per cent still believe that there will be official “death panels” to make decisions on end-of-life care.

Since 2001, the U.S. government has been engaged in an ever-widening program of spying on its own—and foreign—citizens, tapping phones, intercepting emails and texts, and monitoring social media to track the movements, activities and connections of millions. Still, many Americans seem less concerned with the massive violations of their privacy in the name of the War on Terror than with imposing Taliban-like standards on the lives of others. Last month, the school board in Meridian, Idaho, voted to remove The Absolutely True Diary of a Part-Time Indian by Sherman Alexie from its Grade 10 supplemental reading list following parental complaints about its uncouth language and depictions of sex and drug use. When 17-year-old student Brady Kissel teamed up with staff from a local store to give away copies at a park as a protest, a concerned citizen called police. It was the evening of April 23, which was also World Book Night, an event dedicated to “spreading the love of reading.”

If ignorance is contagious, it’s high time to put the United States in quarantine.

Americans have long worried that their education system is leaving their children behind. With good reason: national exams consistently reveal how little the kids actually know. In the last set, administered in 2010 (more are scheduled for this spring), most fourth graders were unable to explain why Abraham Lincoln was an important figure, and only half were able to order North America, the U.S., California and Los Angeles by size. Results in civics were similarly dismal. While math and reading scores have improved over the years, economics remains the “best” subject, with 42 per cent of high school seniors deemed “proficient.”

They don’t appear to be getting much smarter as they age. A 2013 survey of 166,000 adults across 20 countries that tested math, reading and technological problem-solving found Americans to be below the international average in every category. (Japan, Finland, Canada, South Korea and Slovakia were among the 11 nations that scored significantly higher.)

The trends are not encouraging. In 1978, 42 per cent of Americans reported that they had read 11 or more books in the past year. In 2014, just 28 per cent can say the same, while 23 per cent proudly admit to not having read even one, up from eight per cent in 1978. Newspaper and magazine circulation continues to decline sharply, as does viewership for cable news. The three big network supper-hour shows drew a combined average audience of 22.6 million in 2013, down from 52 million in 1980. While 82 per cent of Americans now say they seek out news digitally, the quality of the information they’re getting is suspect. Among current affairs websites, Buzzfeed logs almost as many monthly hits as the Washington Post.

The advance of ignorance and irrationalism in the U.S. has hardly gone unnoticed. The late Columbia University historian Richard Hofstadter won the Pulitzer Prize back in 1964 for his book Anti-Intellectualism in American Life, which cast the nation’s tendency to embrace stupidity as a periodic by-product of its founding urge to democratize everything. By 2008, journalist Susan Jacoby was warning that the denseness—“a virulent mixture of anti-rationalism and low expectations”—was more of a permanent state. In her book, The Age of American Unreason, she posited that it trickled down from the top, fuelled by faux-populist politicians striving to make themselves sound approachable rather than smart. Their creeping tendency to refer to everyone—voters, experts, government officials—as “folks” is “symptomatic of a debasement of public speech inseparable from a more general erosion of American cultural standards,” she wrote. “Casual, colloquial language also conveys an implicit denial of the seriousness of whatever issue is being debated: talking about folks going off to war is the equivalent of describing rape victims as girls.”

That inarticulate legacy didn’t end with George W. Bush and Sarah Palin. Barack Obama, the most cerebral and eloquent American leader in a generation, regularly plays the same card, droppin’ his Gs and dialling down his vocabulary to Hee Haw standards. His ability to convincingly play a hayseed was instrumental in his 2012 campaign against the patrician Mitt Romney; in one of their televised debates the President referenced “folks” 17 times.

An aversion to complexity—at least when communicating with the public—can also be seen in the types of answers politicians now provide the media. The average length of a sound bite by a presidential candidate in 1968 was 42.3 seconds. Two decades later, it was 9.8 seconds. Today, it’s just a touch over seven seconds and well on its way to being supplanted by 140-character Twitter bursts.

Little wonder then that distrust—of leaders, institutions, experts, and those who report on them—is rampant. A YouGov poll conducted last December found that three-quarters of Americans agreed that science is a force for good in the world. Yet when asked if they truly believe what scientists tell them, only 36 per cent of respondents said yes. Just 12 per cent expressed strong confidence in the press to accurately report scientific findings. (Although according to a 2012 paper by Gordon Gauchat, a University of North Carolina sociologist, the erosion of trust in science over the past 40 years has been almost exclusively confined to two groups: conservatives and regular churchgoers. Counterintuitively, it is the most highly educated among them—with post-secondary education—who harbour the strongest doubts.)

The term “elitist” has become one of the most used, and feared, insults in American life. Even in the country’s halls of higher learning, there is now an ingrained bias that favours the accessible over the exacting.

“There’s a pervasive suspicion of rights, privileges, knowledge and specialization,” says Catherine Liu, the author of American Idyll: Academic Antielitism as Cultural Critique and a film and media studies professor at the University of California, Irvine. Both ends of the political spectrum have come to reject the conspicuously clever, she says, if for very different reasons; the left because of worries about inclusiveness, the right because they equate objections with obstruction. As a result, the very mission of universities has changed, argues Liu. “We don’t educate people anymore. We train them to get jobs.” (Boomers, she says, deserve most of the blame. “They were so triumphalist in promoting pop culture and demoting the canon.”)

The digital revolution, which has brought boundless access to information and entertainment choices, has somehow only enhanced the lowest common denominators—LOL cat videos and the Kardashians. Instead of educating themselves via the Internet, most people simply use it to validate what they already suspect, wish or believe to be true. It creates an online environment where Jenny McCarthy, a former Playboy model with a high school education, can become a worldwide leader of the anti-vaccination movement, naysaying the advice of medical professionals.

Most perplexing, however, is where the stupid is flowing from. As conservative pundit David Frum recently noted, where it was once the least informed who were most vulnerable to inaccuracies, it now seems to be the exact opposite. “More sophisticated news consumers turn out to use this sophistication to do a better job of filtering out what they don’t want to hear,” he blogged.

But are things actually getting worse? There’s a long and not-so-proud history of American electors lashing out irrationally, or voting against their own interests. Political scientists have been tracking, since the early 1950s, just how poorly those who cast ballots seem to comprehend the policies of the parties and people they are endorsing. A wealth of research now suggests that at the most optimistic, only 70 per cent actually select the party that accurately represents their views—and there are only two choices.

Larry Bartels, the co-director of the Center for the Study of Democratic Institutions at Vanderbilt University, says he doubts that the spreading ignorance is a uniquely American phenomenon. Facing complex choices, uncertain about the consequences of the alternatives, and tasked with balancing the demands of jobs, family and the things that truly interest them with boring policy debates, people either cast their ballots reflexively, or not at all. The larger question might be whether engagement really matters. “If your vision of democracy is one in which elections provide solemn opportunities for voters to set the course of public policy and hold leaders accountable, yes,” Bartels wrote in an email to Maclean’s. “If you take the less ambitious view that elections provide a convenient, non-violent way for a society to agree on who is in charge at any given time, perhaps not.”

A study by two Princeton University researchers, Martin Gilens and Benjamin Page, released last month, tracked 1,800 U.S. policy changes between 1981 and 2002, and compared the outcome with the expressed preferences of median-income Americans, the affluent, business interests and powerful lobbies. They concluded that average citizens “have little or no independent influence” on policy in the U.S., while the rich and their hired mouthpieces routinely get their way. “The majority does not rule,” they wrote.

Smart money versus dumb voters is hardly a fair fight. But it does offer compelling evidence that the survival of the fittest remains an unshakable truth even in American life. A sad sort of proof of evolution.