This is six year old, Ingrid. She worked very hard on her impersonation of Sarah Palin. I hope you enjoy it, and that you will share it if you do!
Thanks for watching!
The Mills Brothers released their hit “Too Many Irons in the Fire” in 1946. Seventy years later, it could be designated my theme song. Yet how many irons are too many? I suppose that much is subjective.
I live a multi-faceted existence and always seem to have a lot of irons in the fire. With the onset of the new year, I find myself busier than ever, yet also happier than ever, and with great hope for the future.
I am working on opening a new business and have been developing workshops and programs for it, gathering partners and financing, and creating a dynamic endeavor that may take a couple of years to get off the ground. In the meantime, I continue to work on other things as I focus on getting this done.
A friend, a former celebrity client from a decades-ago stint I did with an entertainment law firm, contacted me over the holidays to ask if I would be part of a $25 million capital-raising campaign with a view toward producing five new independent films. I will be working in the capacity of a consultant, designing social media promotions and campaigns, but won’t know many details until sometime next week. This will be the first MOIP-related, salaried work I have done since I received my master’s degree, and while I’m excited about the work, it is not what I’ll be doing professionally in the long run. But that is another story for another time.
In addition to my artwork, a large part of my vocational time is spent writing. I have my various creative writing projects going on…my cookbook, my novel, my poetry and short stories…all of which take the back burner too often in favor of the writing work that I get paid for.
Over the last 48 hours, I have written essays on the world-renowned Monte Pascoal cigars, Missouri fly fishing, eyeliner, the Bakken oil fields, maternity photo shoots and the merits of portable ballet barres. I have written essays for a graphic design company, two criminal law firms, a judge, an artist and a physician whose specialty is the treatment of diabetes. I have a long list of articles to complete today, and another list of articles that I will have to complete from our retreat at Lake Tahoe.
I have honed article writing down to a fine art and can knock out what my editors designate as “high quality” writing in a very short period of time. My research skills were perfected in graduate school, so I am able to produce many articles quickly. All this is in addition to writing the Chinese fashion catalog that provides an endless stream of work.
Admittedly, I do not feel a lot of passion about the paid writing that I do. My heart lies with my creative work, but the paid work provides a good income, and I rather enjoy it. It isn’t what I intend to do over the long run, but for now, it is fine.
I work through a number of different agencies. Over the years, my ranking has risen to the top with many of these agencies, and I have attained a more noteworthy status than I once had as a hack writer. Today, I am frequently sought out directly by editors and former clients, so that the majority of the work I do is for private clients or special projects.
My work involves long hours and intense concentration, and therefore requires me to make special efforts to exercise and stay healthy. This work can be all-consuming, and it is all too easy to forget to eat and exercise. There have been days when I started work before the sun came up and ended it well after midnight.
In this new year, I shall endeavor to moderate my writing into a more manageable enterprise. I vow to place my health first, and to exercise twice a day, beginning each morning with yoga and a long walk, and doing a concentrated aerobic effort each afternoon. I have been doing this three times a week, but I am going to up the ante.
This freedom to arrange my schedule as I want it is the primary reason I continue to pursue the writing life. This freedom to travel. This freedom to begin and end work when I want. The freedom to take off a half hour when my best friend calls, or the freedom to stop what I’m doing to pick Ingrid up from school. These are the reasons that I write.
Tomorrow, as my friends go to their offices and get snagged in rush hour traffic, I will be departing for Reno/Tahoe. THIS is why I engage in the writing life. This freedom to leave when I want or to sleep as late as I want (although I am an early riser)…this freedom is why I write.
Here is a little blurb from the new show, “Empowering Women Everywhere,” which is hosted by my friend, Nann Gill.
Watch it on Channel 23 (TWC) at 7:30 ET, or online at empoweringwomeneverywhere.tv.
Become part of the Empowering Women Everywhere Community!
Membership is FREE!
Read more about the 2 Left Feet Blues Festival HERE.
“Empowering Women Everywhere” is produced by the Academy of Film, Television and Stage Performing Arts, a 501(c)(3) corporation.
The program reaches over a million households twice a week!
For a tax-deductible donation you can become a sponsor of the show.
Sponsorship packages are individually designed to best fit your giving criteria.
Please call 845-294-8444 to discuss your particular requirements.
I kid you not. I almost fainted when I watched this the first time. It is divine…in every possible way.
This is my friend Nann, a powerful woman whom I am very proud to know. Please read about her new television show by clicking HERE.
Apart from the fact that this is a tiny little girl singing a song from a controversial video (that she has never seen, by the way), the cutest thing about Ingrid’s version of “Wrecking Ball” is the fact that she made up her own lyrics. They are lyrics that make no sense at all. One of my musician friends, Steve Grandinetti, said that if I took Ingrid to one of his shows, he would just let her take over. He posted her video on his Facebook for his fans to see. She will be thrilled when she finds out. 🙂
I cried as I watched this, and gave thanks to be as fortunate as I am.
Today, I finished the wasp nest bird sculpture that I wrote about a few weeks ago. This was one of the most challenging pieces I have ever done. The huge wasp nest was teeming with live wasps when I received it from New York. It was very dry, broken and crumbling…but I managed to work a little magic and transform it into this weird creature that I have named “WaspLark”.
The piece is going to my friends Scott Cohen and Anastasia Trania who found the big nest in upstate New York.
It was a difficult process, and even a little creepy, handling the wasp nest, but I somehow managed to fashion a piece of art out of it…albeit a very strange one.
It is very rustic, but, as is customary for me, I did manage to add a touch of bling here and there.
Sometimes, an artist just has to allow the materials to dictate the piece. In this case, I had no other choice.
I’ve been a Sam Brown fan for years. Sadly, a lot of people STILL don’t know about her. Take a listen…
“Reborners.” They’re not just the self-taught artists who hand-make, collect, and interact with these lifelike dolls. There are childless couples and single women who invest thousands upon thousands of dollars purchasing the dolls and buying their equipment and clothing. They hold them, dress them, wash their hair, and take them for walks in the park. They treat them exactly as if they were real babies.
Creating the dolls is a labor-intensive, complicated process. Some require up to 80 individual layers of painting, veining, blushing, mottling, and toning, each cured with heat. Strands of hair are attached to the scalp one at a time. The dolls are weighted so that they feel like real babies when held in someone’s arms. Some are even heated to 98.6 degrees, or fitted with mechanisms that make them move, audible “heartbeats,” and breathing devices. Shudder.
They completely freak me out.
I try my best to remain open-minded. I truly do. I do not begrudge the “reborners” the right to do whatever they want, and it does no one any harm, especially me, but admittedly, the whole thing disturbs me at some fundamental level. I can understand collecting these dolls for their artful workmanship, but I can’t relate to the fantasy of treating them as if they were real children. It just doesn’t seem healthy.
This is hard for me to explain, but it seems as though these people miss the point of actually having children and are only doing this reborn thing for show. Children are not put here to gratify our egos. They aren’t show pieces for us to dress up and parade around. This whole reborner thing seems shallow. Crazy, even, like other people who do things for appearance’s sake, such as pretending they have happy relationships when they don’t, creating “happy family” personae online while living in wild disharmony in real life.
This is just nuts. You can’t replace a child with a doll! If you’re so lonely that you need to throw away thousands of dollars on this sort of thing, why not take in a foster child? Why not volunteer at a hospital to hold and rock babies in the nursery? (I did that for years when I lived in Texas and loved every minute of it, even though I had actual children!) In other words, why not do some child some GOOD instead of becoming ensconced in a fantasy world that treats these dolls as real people?
What follows is a documentary about this whole phenomenon. It isn’t the excellent BBC documentary that I originally watched about these people, but that one doesn’t seem to be online. This will give you an idea of what goes on in the reborn community, however.
South Carolina’s state beverage is milk. Its insect is the praying mantis. There’s a designated dance—the shag—as well as a sanctioned tartan, game bird, dog, flower, gem and snack food (boiled peanuts). But what Olivia McConnell noticed was missing from among her home’s 50 official symbols was a fossil. So last year, the eight-year-old science enthusiast wrote to the governor and her representatives to nominate the Columbian mammoth. Teeth from the woolly proboscidean, dug up by slaves on a local plantation in 1725, were among the first remains of an ancient species ever discovered in North America. Forty-three other states had already laid claim to various dinosaurs, trilobites, primitive whales and even petrified wood. It seemed like a no-brainer. “Fossils tell us about our past,” the Grade 2 student wrote.
And, as it turns out, the present, too. The bill that Olivia inspired has become the subject of considerable angst at the legislature in the state capital of Columbia. First, an objecting state senator attached three verses from Genesis to the act, outlining God’s creation of all living creatures. Then, after other lawmakers spiked the amendment as out of order for its introduction of the divinity, he took another crack, specifying that the Columbian mammoth “was created on the sixth day with the other beasts of the field.” That version passed in the senate in early April. But now the bill is back in committee as the lower house squabbles over the new language, and it’s seemingly destined for the same fate as its honouree—extinction.
What has doomed Olivia’s dream is a raging battle in South Carolina over the teaching of evolution in schools. Last week, the state’s education oversight committee approved a new set of science standards that, if adopted, would see students learn both the case for, and against, natural selection.
Charles Darwin’s signature discovery—first published 155 years ago and validated a million different ways since—long ago ceased to be a matter for serious debate in most of the world. But in the United States, reconciling science and religious belief remains oddly difficult. A national poll, conducted in March for the Associated Press, found that 42 per cent of Americans are “not too” or “not at all” confident that all life on Earth is the product of evolution. Similarly, 51 per cent of people expressed skepticism that the universe started with a “big bang” 13.8 billion years ago, and 36 per cent doubted the Earth has been around for 4.5 billion years.
The American public’s bias against established science doesn’t stop where the Bible leaves off, however. The same poll found that just 53 per cent of respondents were “extremely” or “very confident” that childhood vaccines are safe and effective. (Worldwide, the measles killed 120,000 people in 2012. In the United States, where a vaccine has been available since 1963, the last recorded measles death was in 2003.) When it comes to global warming, only 33 per cent expressed a high degree of confidence that it is “man made,” something the UN Intergovernmental Panel on Climate Change has declared is all but certain. (The good news, such as it was in the AP poll, was that 69 per cent actually believe in DNA, and 82 per cent now agree that smoking causes cancer.)
If the rise in uninformed opinion was limited to impenetrable subjects that would be one thing, but the scourge seems to be spreading. Everywhere you look these days, America is in a rush to embrace the stupid. Hell-bent on a path that’s not just irrational, but often self-destructive. Common-sense solutions to pressing problems are eschewed in favour of bumper-sticker simplicities and blind faith.
In a country bedevilled by mass shootings—Aurora, Colo.; Fort Hood, Texas; Virginia Tech—efforts at gun control have given way to ever-laxer standards. Georgia recently passed a law allowing people to pack weapons in state and local buildings, airports, churches and bars. Florida is debating legislation that will waive all firearm restrictions during state emergencies like riots or hurricanes. (One opponent has moved to rename it “an Act Relating to the Zombie Apocalypse.”) And since the December 2012 massacre of 20 children and six staff at Sandy Hook Elementary School, in Newtown, Conn., 12 states have passed laws allowing guns to be carried in schools, and 20 more are considering such measures.
The cost of a simple appendectomy in the United States averages $33,000, and it’s not uncommon for such bills to top six figures. More than 15 per cent of the population has no health insurance whatsoever. Yet efforts to fill that gaping hole via the Affordable Care Act—a.k.a. Obamacare—remain distinctly unpopular. Nonsensical myths about the government’s “real” intentions have found so much traction that 30 per cent still believe that there will be official “death panels” to make decisions on end-of-life care.
Since 2001, the U.S. government has been engaged in an ever-widening program of spying on its own—and foreign—citizens, tapping phones, intercepting emails and texts, and monitoring social media to track the movements, activities and connections of millions. Still, many Americans seem less concerned with the massive violations of their privacy in the name of the War on Terror, than imposing Taliban-like standards on the lives of others. Last month, the school board in Meridian, Idaho voted to remove The Absolutely True Diary of a Part-Time Indian by Sherman Alexie from its Grade 10 supplemental reading list following parental complaints about its uncouth language and depictions of sex and drug use. When 17-year-old student Brady Kissel teamed up with staff from a local store to give away copies at a park as a protest, a concerned citizen called police. It was the evening of April 23, which was also World Book Night, an event dedicated to “spreading the love of reading.”
If ignorance is contagious, it’s high time to put the United States in quarantine.
Americans have long worried that their education system is leaving their children behind. With good reason: national exams consistently reveal how little the kids actually know. In the last set, administered in 2010 (more are scheduled for this spring), most fourth graders were unable to explain why Abraham Lincoln was an important figure, and only half were able to order North America, the U.S., California and Los Angeles by size. Results in civics were similarly dismal. While math and reading scores have improved over the years, economics remains the “best” subject, with 42 per cent of high school seniors deemed “proficient.”
They don’t appear to be getting much smarter as they age. A 2013 survey of 166,000 adults across 20 countries that tested math, reading and technological problem-solving found Americans to be below the international average in every category. (Japan, Finland, Canada, South Korea and Slovakia were among the 11 nations that scored significantly higher.)
The trends are not encouraging. In 1978, 42 per cent of Americans reported that they had read 11 or more books in the past year. In 2014, just 28 per cent can say the same, while 23 per cent proudly admit to not having read even one, up from eight per cent in 1978. Newspaper and magazine circulation continues to decline sharply, as does viewership for cable news. The three big network supper-hour shows drew a combined average audience of 22.6 million in 2013, down from 52 million in 1980. While 82 per cent of Americans now say they seek out news digitally, the quality of the information they’re getting is suspect. Among current affairs websites, Buzzfeed logs almost as many monthly hits as the Washington Post.
The advance of ignorance and irrationalism in the U.S. has hardly gone unnoticed. The late Columbia University historian Richard Hofstadter won the Pulitzer prize back in 1964 for his book Anti-Intellectualism in American Life, which cast the nation’s tendency to embrace stupidity as a periodic by-product of its founding urge to democratize everything. By 2008, journalist Susan Jacoby was warning that the denseness—“a virulent mixture of anti-rationalism and low expectations”—was more of a permanent state. In her book, The Age of American Unreason, she posited that it trickled down from the top, fuelled by faux-populist politicians striving to make themselves sound approachable rather than smart. Their creeping tendency to refer to everyone—voters, experts, government officials—as “folks” is “symptomatic of a debasement of public speech inseparable from a more general erosion of American cultural standards,” she wrote. “Casual, colloquial language also conveys an implicit denial of the seriousness of whatever issue is being debated: talking about folks going off to war is the equivalent of describing rape victims as girls.”
That inarticulate legacy didn’t end with George W. Bush and Sarah Palin. Barack Obama, the most cerebral and eloquent American leader in a generation, regularly plays the same card, droppin’ his Gs and dialling down his vocabulary to Hee Haw standards. His ability to convincingly play a hayseed was instrumental in his 2012 campaign against the patrician Mitt Romney; in one of their televised debates the President referenced “folks” 17 times.
An aversion to complexity—at least when communicating with the public—can also be seen in the types of answers politicians now provide the media. The average length of a sound bite by a presidential candidate in 1968 was 42.3 seconds. Two decades later, it was 9.8 seconds. Today, it’s just a touch over seven seconds and well on its way to being supplanted by 140-character Twitter bursts.
Little wonder then that distrust—of leaders, institutions, experts, and those who report on them—is rampant. A YouGov poll conducted last December found that three-quarters of Americans agreed that science is a force for good in the world. Yet when asked if they truly believe what scientists tell them, only 36 per cent of respondents said yes. Just 12 per cent expressed strong confidence in the press to accurately report scientific findings. (Although according to a 2012 paper by Gordon Gauchat, a University of North Carolina sociologist, the erosion of trust in science over the past 40 years has been almost exclusively confined to two groups: conservatives and regular churchgoers. Counterintuitively, it is the most highly educated among them—with post-secondary education—who harbour the strongest doubts.)
The term “elitist” has become one of the most used, and feared, insults in American life. Even in the country’s halls of higher learning, there is now an ingrained bias that favours the accessible over the exacting.
“There’s a pervasive suspicion of rights, privileges, knowledge and specialization,” says Catherine Liu, the author of American Idyll: Academic Antielitism as Cultural Critique and a film and media studies professor at University of California at Irvine. Both ends of the political spectrum have come to reject the conspicuously clever, she says, if for very different reasons; the left because of worries about inclusiveness, the right because they equate objections with obstruction. As a result, the very mission of universities has changed, argues Liu. “We don’t educate people anymore. We train them to get jobs.” (Boomers, she says, deserve most of the blame. “They were so triumphalist in promoting pop culture and demoting the canon.”)
The digital revolution, which has brought boundless access to information and entertainment choices, has somehow only enhanced the lowest common denominators—LOL cat videos and the Kardashians. Instead of educating themselves via the Internet, most people simply use it to validate what they already suspect, wish or believe to be true. It creates an online environment where Jenny McCarthy, a former Playboy model with a high school education, can become a worldwide leader of the anti-vaccination movement, naysaying the advice of medical professionals.
Most perplexing, however, is where the stupid is flowing from. As conservative pundit David Frum recently noted, where it was once the least informed who were most vulnerable to inaccuracies, it now seems to be the exact opposite. “More sophisticated news consumers turn out to use this sophistication to do a better job of filtering out what they don’t want to hear,” he blogged.
But are things actually getting worse? There’s a long and not-so-proud history of American electors lashing out irrationally, or voting against their own interests. Political scientists have been tracking, since the early 1950s, just how poorly those who cast ballots seem to comprehend the policies of the parties and people they are endorsing. A wealth of research now suggests that at the most optimistic, only 70 per cent actually select the party that accurately represents their views—and there are only two choices.
Larry Bartels, the co-director of the Center for the Study of Democratic Institutions at Vanderbilt University, says he doubts that the spreading ignorance is a uniquely American phenomenon. Facing complex choices, uncertain about the consequences of the alternatives, and tasked with balancing the demands of jobs, family and the things that truly interest them with boring policy debates, people either cast their ballots reflexively, or not at all. The larger question might be whether engagement really matters. “If your vision of democracy is one in which elections provide solemn opportunities for voters to set the course of public policy and hold leaders accountable, yes,” Bartels wrote in an email to Maclean’s. “If you take the less ambitious view that elections provide a convenient, non-violent way for a society to agree on who is in charge at any given time, perhaps not.”
A study by two Princeton University researchers, Martin Gilens and Benjamin Page, released last month, tracked 1,800 U.S. policy changes between 1981 and 2002, and compared the outcome with the expressed preferences of median-income Americans, the affluent, business interests and powerful lobbies. They concluded that average citizens “have little or no independent influence” on policy in the U.S., while the rich and their hired mouthpieces routinely get their way. “The majority does not rule,” they wrote.
Smart money versus dumb voters is hardly a fair fight. But it does offer compelling evidence that the survival of the fittest remains an unshakable truth even in American life. A sad sort of proof of evolution.
I remember when Miles Davis died. My friend Katy wept like a baby and I shared her heavy heart. Miles was legendary. I loved this man so much that I named my son after him. Perhaps this is why he turned out to be such a fine musician in his own right. (Spelled “Myles,” however, because everyone else in the family had a “y” in their names.)
The Tanglewood performance shown in the video below was possibly the largest audience that Miles Davis had encountered up to that point. His extraordinary band, containing many soon-to-be-legendary musicians, was deeply immersed in the early experiments with electric instrumentation.
This incendiary performance captures Miles embracing a rock dynamic in his music that was more electric, more funky, more rhythmic, and simply more “out there” than anything that had preceded it.
Much of the material performed that night derives from Miles’ studio sessions for the groundbreaking albums In a Silent Way and Bitches Brew. Because the performance remains one long continuous suite, it allows one to follow the flow and logic of the music over an extended period of time. This continual flow, devoid of announcements identifying the songs, often left critics and some listeners confused, but focused listening reveals that distinct changes are taking place. Miles is thoroughly in control of the musical direction at all times, whether he is in the forefront or not. Miles guides the music back to particular vamps or themes, continually bringing focus to the group improvisations. The swift and agile response of the musicians to Miles’ cues and coded phrases is truly remarkable and is a primary reason for the relentless intensity of this music.
Miles and his group were opening for Santana that night, as Carlos Santana had hand-selected Davis for the slot. Years later, Carlos had this to say about the performance: “They played music meant for the cosmos. It was out, it was in, it was unreal, and it was oh so glorious.”
The band: (I had the honor of seeing nearly all of these guys in person on numerous occasions.)
Miles Davis – trumpet
Gary Bartz – soprano and alto sax
Chick Corea – electric piano
Keith Jarrett – organ, electric piano
Dave Holland – electric and acoustic bass
Jack DeJohnette – drums
Airto Moreira – percussion
Leonard Nimoy, the sonorous, gaunt-faced actor who played Mr. Spock, the resolutely logical human-alien first officer of the Starship Enterprise in the television and movie juggernaut “Star Trek,” died on Friday morning at his home in the Bel Air section of Los Angeles. He was 83. The cause of death was end-stage chronic obstructive pulmonary disease.
Mr. Nimoy announced last year that he had the disease, which he attributed to years of smoking, a habit he had given up three decades earlier. He had been hospitalized earlier in the week.
His artistic pursuits — poetry, photography and music in addition to acting — ranged far beyond the United Federation of Planets, but it was as Mr. Spock that Mr. Nimoy became a folk hero, bringing to life one of the most indelible characters of the last half century: a cerebral, unflappable, pointy-eared Vulcan with a signature salute and blessing: “Live long and prosper” (from the Vulcan “Dif-tor heh smusma”).
Mr. Nimoy, who was teaching Method acting at his own studio when he was cast in the original “Star Trek” television series in the mid-1960s, relished playing outsiders, and he developed what he later admitted was a mystical identification with Spock, the lone alien on the starship’s bridge.
Yet he also acknowledged ambivalence about being tethered to the character, expressing it most plainly in the titles of two autobiographies: “I Am Not Spock,” published in 1977, and “I Am Spock,” published in 1995.
In the first, he wrote, “In Spock, I finally found the best of both worlds: to be widely accepted in public approval and yet be able to continue to play the insulated alien through the Vulcan character.”
“Star Trek,” which had its premiere on NBC on Sept. 8, 1966, made Mr. Nimoy a star. Gene Roddenberry, the creator of the franchise, called him “the conscience of ‘Star Trek’ ” — an often earnest, sometimes campy show that employed the distant future (as well as some primitive special effects by today’s standards) to take on social issues of the 1960s.
His stardom would endure. Though the series was canceled after three seasons because of low ratings, a cultlike following — the conference-holding, costume-wearing Trekkies, or Trekkers (the designation Mr. Nimoy preferred) — coalesced soon after “Star Trek” went into syndication.
The fans’ devotion only deepened when “Star Trek” was spun off into an animated show, various new series and an uneven parade of movies starring much of the original television cast, including — besides Mr. Nimoy — William Shatner (as Capt. James T. Kirk), DeForest Kelley (Dr. McCoy), George Takei (the helmsman, Sulu), James Doohan (the chief engineer, Scott), Nichelle Nichols (the chief communications officer, Uhura) and Walter Koenig (the navigator, Chekov).
When the director J. J. Abrams revived the “Star Trek” film franchise in 2009, with an all-new cast — including Zachary Quinto as Spock — he included a cameo part for Mr. Nimoy, as an older version of the same character. Mr. Nimoy also appeared in the 2013 follow-up, “Star Trek Into Darkness.”
His zeal to entertain and enlighten reached beyond “Star Trek” and crossed genres. He had a starring role in the dramatic television series “Mission: Impossible” and frequently performed onstage, notably as Tevye in “Fiddler on the Roof.” His poetry was voluminous, and he published books of his photography.
He also directed movies, including two from the “Star Trek” franchise, and television shows. And he made records, on which he sang pop songs, as well as original songs about “Star Trek,” and gave spoken-word performances — to the delight of his fans and the bewilderment of critics.
But all that was subsidiary to Mr. Spock, the most complex member of the Enterprise crew: both a colleague and a creature apart, who sometimes struggled with his warring racial halves.
In one of his most memorable “Star Trek” episodes, Mr. Nimoy tried to follow in the tradition of two actors he admired, Charles Laughton and Boris Karloff, who each played a monstrous character — Quasimodo and the Frankenstein monster — who is transformed by love.
In Episode 24, which was first shown on March 2, 1967, Mr. Spock is indeed transformed. Under the influence of aphrodisiacal spores he discovers on the planet Omicron Ceti III, he lets free his human side and announces his love for Leila Kalomi (Jill Ireland), a woman he had once known on Earth. In this episode, Mr. Nimoy brought to Spock’s metamorphosis not only warmth and compassion, but also a rarefied concept of alienation.
“I am what I am, Leila,” Mr. Spock declared. “And if there are self-made purgatories, then we all have to live in them. Mine can be no worse than someone else’s.”
Born in Boston on March 26, 1931, Leonard Simon Nimoy was the second son of Max and Dora Nimoy, Ukrainian immigrants and Orthodox Jews. His father worked as a barber.
From the age of 8, Leonard acted in local productions, winning parts at a community college, where he performed through his high school years. In 1949, after taking a summer course at Boston College, he traveled to Hollywood, though it wasn’t until 1951 that he landed small parts in two movies, “Queen for a Day” and “Rhubarb.”
His first starring movie role came in 1952 with “Kid Monk Baroni,” in which he played a disfigured Italian street-gang leader who becomes a boxer. He continued to be cast in little-known movies, although he did presciently play an alien invader in a cult serial called “Zombies of the Stratosphere,” and in 1961 he had a minor role in an episode of “The Twilight Zone.”
Mr. Nimoy served in the Army for two years, rising to sergeant and spending 18 months at Fort McPherson in Georgia, where he presided over shows for the Army’s Special Services branch. He also directed and starred as Stanley in the Atlanta Theater Guild’s production of “A Streetcar Named Desire” before receiving his final discharge in November 1955.
He then returned to California, where he worked as a soda jerk, movie usher and cabdriver while studying acting at the Pasadena Playhouse. He achieved wide visibility in the late 1950s and early 1960s on television shows like “Wagon Train,” “Rawhide” and “Perry Mason.” Then came “Star Trek.”
Mr. Nimoy returned to college in his 40s and earned a master’s degree in Spanish from Antioch University Austin, an affiliate of Antioch College in Ohio, in 1978. Antioch College later awarded Mr. Nimoy an honorary doctorate.
Mr. Nimoy directed two of the “Star Trek” movies, “Star Trek III: The Search for Spock” (1984) and “Star Trek IV: The Voyage Home” (1986), which he helped write. In 1991, the same year that he resurrected Mr. Spock on two episodes of “Star Trek: The Next Generation,” Mr. Nimoy was also the executive producer and a writer of the movie “Star Trek VI: The Undiscovered Country.”
He then directed the hugely successful comedy “Three Men and a Baby” (1987), a far cry from his science-fiction work, and appeared in made-for-television movies. He received an Emmy nomination for the 1982 movie “A Woman Called Golda,” in which he portrayed the husband of Golda Meir, the prime minister of Israel, who was played by Ingrid Bergman. It was the fourth Emmy nomination of his career — the other three were for his “Star Trek” work — although he never won.
Mr. Nimoy’s marriage to the actress Sandi Zober ended in divorce. Besides his wife, he is survived by his children, Adam and Julie Nimoy; a stepson, Aaron Bay Schuck; six grandchildren; one great-grandchild; and an older brother, Melvin.
Though his speaking voice was among his chief assets as an actor, the critical consensus was that his music was mortifying. Mr. Nimoy, however, was undaunted, and his fans seemed to enjoy the camp of his covers of songs like “If I Had a Hammer.” (His first album was called “Leonard Nimoy Presents Mr. Spock’s Music From Outer Space.”)
From 1995 to 2003, Mr. Nimoy narrated the “Ancient Mysteries” series on the History Channel. He also appeared in commercials, including two with Mr. Shatner for Priceline.com. He provided the voice for animated characters in “Transformers: The Movie,” in 1986, and “The Pagemaster,” in 1994.
In 2001 he voiced the king of Atlantis in the Disney animated movie “Atlantis: The Lost Empire,” and in 2005 he furnished voice-overs for the computer game Civilization IV. More recently, he had a recurring role on the science-fiction series “Fringe” and was heard, as the voice of Spock, in an episode of the hit sitcom “The Big Bang Theory.”
Mr. Nimoy was an active supporter of the arts as well. The Thalia, a venerable movie theater on the Upper West Side of Manhattan, now a multi-use hall that is part of Symphony Space, was renamed the Leonard Nimoy Thalia in 2002.
He also found his voice as a writer. Besides his autobiographies, he published “A Lifetime of Love: Poems on the Passages of Life” in 2002. Typical of Mr. Nimoy’s simple free verse are these lines: “In my heart/Is the seed of the tree/Which will be me.”
In later years, he rediscovered his Jewish heritage, and in 1991 he produced and starred in “Never Forget,” a television movie based on the story of a Holocaust survivor who sued a neo-Nazi organization of Holocaust deniers.
In 2002, having illustrated his books of poetry with his photographs, Mr. Nimoy published “Shekhina,” a book devoted to photography with a Jewish theme, that of the feminine aspect of God. His black-and-white photographs of nude and seminude women struck some Orthodox Jewish leaders as heretical, but Mr. Nimoy asserted that his work was consistent with the teaching of the kabbalah.
His religious upbringing also influenced the characterization of Spock. The character’s split-fingered salute, he often explained, had been his idea: He based it on the kohanic blessing, a manual approximation of the Hebrew letter shin, which is the first letter in Shaddai, one of the Hebrew names for God.
“To this day, I sense Vulcan speech patterns, Vulcan social attitudes and even Vulcan patterns of logic and emotional suppression in my behavior,” Mr. Nimoy wrote years after the original series ended.
But that wasn’t such a bad thing, he discovered. “Given the choice,” he wrote, “if I had to be someone else, I would be Spock.”