Many Believe Women Should Have Equal Work Opportunity, but Are Better Than Men at Child-Rearing

(p. B1) A new study, based on national survey data from 1977 to 2016, helps explain why the path to equality seems in some ways to have stalled — despite the significant increases in women’s educational and professional opportunities during that period.
Two-thirds of Americans and three-quarters of millennials say they believe that men and women should be equal in both the public sphere of work and the private sphere of home. Only a small share of people, young or old, still say that men and women should be unequal in both spheres — 5 percent of millennials and 7 percent of those born from 1946 to 1980.
But the study revealed that roughly a quarter of people hold more complicated views about gender equality, views that differ between work and home. Most of them say that while women should have the same opportunities as men to work or participate in politics, they should do more homemaking and child-rearing, found the study, which is set to be published in the journal Gender and Society.
“You can believe men and women have truly different natural tendencies and skills, that women are better nurturers and caretakers, and still believe women should have equal rights in the labor force,” said Barbara Risman, a sociology professor at the University of Illinois at Chicago and an author of the paper along with William Scarborough, a sociology doctoral candidate there, and Ray Sin, a behavioral scientist at Morningstar.

For the full commentary, see:
Miller, Claire Cain. “THE UPSHOT; Equality Valued at Work, Not Necessarily at Home.” The New York Times (Wednesday, Dec. 5, 2018): B1 & B5.
(Note: the online version of the commentary has the date Dec. 3, 2018, and has the title “THE UPSHOT; Americans Value Equality at Work More Than Equality at Home.”)

The academic paper mentioned above has been published online in advance of print publication:
Scarborough, William J., Ray Sin, and Barbara Risman. “Attitudes and the Stalled Gender Revolution: Egalitarianism, Traditionalism, and Ambivalence from 1977 through 2016.” Gender & Society (2018).

Star Wars Details Allow “a Fully Believable, Escapist Experience”

(p. A15) Mr. Jameson clearly lays out the qualities that geeks appreciate in their art: realism bolstered by a deep internal history and the sort of “world-building” exemplified by Tolkien. But in Hollywood “Star Wars” changed the game thanks to its verisimilitude, “which immediately and thoroughly convinces viewers that they are watching humans and aliens skip from planet to planet in a vast, crowded other galaxy with its own detailed history.” Similarly, the biological background of the “Alien” series includes Xenomorphs “whose intricate life cycle can be described from beginning to end in grisly detail.” Books like “The Star Trek Encyclopedia,” in which the show’s designers document “all the alien planets and species that they’d invented” and present starship engineering schematics, are quintessential works of geek culture.
Detail is important to geeks, the author suggests, because they want art without “any boundaries, any limits. . . . They don’t want the artwork to ever end.” Whether it’s playing a tabletop game filled with lore about previously unknown characters from the “Star Wars” galaxy or reading a “textbook” to study the fantastic beasts of the “Harry Potter” world, geeks want to believe — at least for a bit. As Mr. Jameson says, “geeks have long thought of artworks as places where one can hang out.” That’s one reason why single films have given way to trilogies and why characters have cross-populated to create Marvel’s seemingly endless “cinematic universe.”

For the full review, see:
Brian P. Kelly. “BOOKSHELF; The Geeks Strike Back.” The Wall Street Journal (Friday, June 8, 2018): A15.
(Note: ellipsis in original.)
(Note: the online version of the review has the date June 7, 2018, and has the title “BOOKSHELF; ‘I Find Your Lack of Faith Disturbing’ Review: The Geeks Strike Back; The “Star Wars” franchise and Marvel’s superhero films reign supreme in today’s Hollywood. How did that happen?”)

The book under review is:
Jameson, A. D. I Find Your Lack of Faith Disturbing: Star Wars and the Triumph of Geek Culture. New York: Farrar, Straus and Giroux, 2018.

What Wofford’s Family “Lacked in Money, They Made Up for in Expectations”

(p. A19) Growing up on Buffalo’s rough and often neglected East Side, Keith H. Wofford recalled many crisp autumn Sundays spent with his father bonding over the Bills, following the team’s losses and wins on the radio.
Tickets to football games were not in the family’s budget: His father, John Wofford, worked at the nearby Chevrolet factory for 32 years, and his mother, Ruby, picked up odd jobs in retail to bring in extra income. But what the Woffords lacked in money, they made up for in expectations for their two sons.
“They always had an incredible amount of confidence in us,” Mr. Wofford, 49, said in an interview. “They made very clear that they didn’t see any limitations.”
Mr. Wofford held tight to that ideal as he left high school as a 17-year-old junior to attend Harvard University on a scholarship. Seven years later, he graduated from Harvard Law School. Last year, Mr. Wofford earned at least $4.3 million as a partner overseeing 300 lawyers and 700 employees at the New York office of international law firm Ropes & Gray, LLP, according to financial disclosure forms.
Now he’s the Republican nominee for state attorney general in New York, vying to become one of the most powerful law enforcement officials in the country.
“How many guys who work at a white shoe law firm had dads who had a union job?” asked C. Teo Balbach, 50, the chief executive of a software firm who grew up in Buffalo and played intramural rugby at Harvard with Mr. Wofford.
“He’s a real hard worker and grinder, and that comes from that upbringing where you come from a middle-class family in a difficult neighborhood and you don’t take anything for granted,” Mr. Balbach added.
. . .
. . . issues facing Mr. Wofford should he win are potential conflicts of interest from his law practice.
. . .
Mr. Wofford said the criticism about him is indicative of Ms. James’s “hyperpartisan” attitude, and he sought to distinguish himself from her by characterizing himself as an outsider.
“Being on the wrong side of the tracks in Buffalo,” Mr. Wofford said, “is about as far from insider as you can get.”
His success as a lawyer, however, did allow him one heartfelt opportunity: In his father’s last years, Mr. Wofford returned to Buffalo, and during football season, they would bond again over Bills games — but in person, at the stadium, as a season-ticket holder.

For the full story, see:
Jeffery C. Mays. “Can an Unknown G.O.P. Candidate Become Attorney General?” The New York Times (Saturday, Oct. 13, 2018): A19.
(Note: ellipses added.)
(Note: the online version of the story has the date Oct. 12, 2018, and has the title “Can a Black Republican Who Voted for Trump Be New York’s Next Attorney General?”)

Buddhist Monks Fear Death

(p. C4) A recent paper in the journal Cognitive Science has an unusual combination of authors. A philosopher, a scholar of Buddhism, a social psychologist and a practicing Tibetan Buddhist tried to find out whether believing in Buddhism really does change how you feel about your self — and about death.
The philosopher Shaun Nichols of the University of Arizona and his fellow authors studied Christian and nonreligious Americans, Hindus and both everyday Tibetan Buddhists and Tibetan Buddhist monks.
. . .
The results were very surprising. Most participants reported about the same degree of fear, whether or not they believed in an afterlife. But the monks said that they were much more afraid of death than any other group.
Why would this be? The Buddhist scholars themselves say that merely knowing there is no self isn’t enough to get rid of the feeling that the self is there. Neuroscience supports this idea.
. . .
Another factor in explaining why these monks were more afraid of death might be that they were trained to think constantly about mortality. The Buddha, perhaps apocryphally, once said that his followers should think about death with every breath. Maybe just ignoring death is a better strategy.

For the full commentary, see:
Alison Gopnik. “Who’s Most Afraid to Die? A Surprise.” The Wall Street Journal (Saturday, June 9, 2018): C4.
(Note: ellipses added.)
(Note: the online version of the commentary has the date June 6, 2018.)

The print version of the Cognitive Science article discussed above is:
Nichols, Shaun, Nina Strohminger, Arun Rai, and Jay Garfield. “Death and the Self.” Cognitive Science 42, no. S1 (May 2018): 314-32.

Anthony Bourdain “Let the Locals Shine”

(p. A15) People are mourning celebrity chef Anthony Bourdain all over the world — from Kurdistan to South Africa, from Gaza to Mexico. That may surprise American social-justice warriors who have turned food into a battlefield for what they call “cultural appropriation.”
“When you’re cooking a country’s dish for other people,” an Oberlin College student wrote last year, “you’re also representing the meaning of the dish as well as its culture. So if people not from that heritage take food, modify it and serve it as ‘authentic,’ it is appropriative.” This was prompted by a dining-hall menu that included sushi and banh mi. Celebrity alumna Lena Dunham weighed in on the side of the social-justice warriors.
. . .
Bourdain was a frequent target of similar criticism. When he declared Filipino food the next big thing, a writer for London’s Independent newspaper complained that his “well-meaning” comments were “the latest from a Western (usually white) celebrity chef or food critic to take a once scoffed at cuisine, legitimize it and call it a trend.”
Bourdain took it in stride. Asked on his CNN show, “Anthony Bourdain: Parts Unknown,” what he thought about culinary cultural appropriation, he said: “Look, the story of food is the story of appropriation, of invasion and mixed marriages and war and, you know . . . it constantly changes. You know, what’s authentic anyway?”
. . .
When Bourdain took us to places like Libya and Venezuela and West Virginia, he let the locals shine. His vocation was about more than food. It was about people — understanding their cultures and their lives, lifting them up and making their dishes.

For the full commentary, see:
Elisha Maldonado. “Bourdain vs. the Social-Justice Warriors; The celebrity chef scoffed at the notion of opposing ‘cultural appropriation.’” The Wall Street Journal (Tuesday, June 12, 2018): A15.
(Note: ellipses added.)
(Note: the online version of the commentary has the date June 11, 2018.)

“Books Were Systematically Burned”

(p. 12) Vandalizing the Parthenon temple in Athens has been a tenacious tradition. Most famously, Lord Elgin appropriated the “Elgin marbles” in 1801-5. But that was hardly the first example. In the Byzantine era, when the temple had been turned into a church, two bishops — Marinos and Theodosios — carved their names on its monumental columns. The Ottomans used the Parthenon as a gunpowder magazine, hence its pockmarked masonry — the result of an attack by Venetian forces in the 17th century. Now Catherine Nixey, a classics teacher turned writer and journalist, takes us back to earlier desecrations, the destruction of the premier artworks of antiquity by Christian zealots (from the Greek zelos — ardor, eager rivalry) in what she calls “The Darkening Age.”
. . .
Debate — philosophically and physiologically — makes us human, whereas dogma cauterizes our potential as a species. Through the sharing of new ideas the ancients identified the atom, measured the circumference of the earth, grasped the environmental benefits of vegetarianism.
To be sure, Christians would not have a monopoly on orthodoxy, or indeed on suppression: The history of the ancient world typically makes for stomach-churning reading. Pagan philosophers too who flew in the face of religious consensus risked persecution; Socrates, we must not forget, was condemned to death on a religious charge.
But Christians did fetishize dogma. In A.D. 386 a law was passed declaring that those “who contend about religion … shall pay with their lives and blood.” Books were systematically burned.
. . .
. . . she opens her book with a potent description of black-robed zealots from 16 centuries ago taking iron bars to the beautiful statue of Athena in the sanctuary of Palmyra, located in modern-day Syria. Intellectuals in Antioch (in ancient Syria) were tortured and beheaded, as were the statues around them.
. . .
Nixey closes her book with the description of another Athena, in the city of her name, being decapitated around A.D. 529, her defiled body used as a steppingstone into what was once a world-renowned school of philosophy. Athena was the deity of wisdom. The words “wisdom” and “historian” have a common ancestor, a proto-Indo-European word meaning to see things clearly. Nixey delivers this ballista-bolt of a book with her eyes wide open and in an attempt to bring light as well as heat to the sad story of intellectual monoculture and religious intolerance. Her sympathy, coruscatingly, compellingly, is with the Roman orator Symmachus: “We see the same stars, the sky is shared by all, the same world surrounds us. What does it matter what wisdom a person uses to seek for the truth?”

For the full review, see:
Bettany Hughes. “How the Ancient World Was Destroyed.” The New York Times Book Review (Sunday, June 10, 2018): 12.
(Note: ellipses between, and at the start of, paragraphs, added; ellipsis internal to paragraph, in original.)
(Note: the online version of the review has the date June 8, 2018, and has the title “How Christians Destroyed the Ancient World.”)

The book under review is:
Nixey, Catherine. The Darkening Age: The Christian Destruction of the Classical World. Boston: Houghton Mifflin Harcourt, 2018.

AI “Will Never Match the Creativity of Human Beings or the Fluidity of the Real World”

(p. A21) If you read Google’s public statement about Google Duplex, you’ll discover that the initial scope of the project is surprisingly limited. It encompasses just three tasks: helping users “make restaurant reservations, schedule hair salon appointments, and get holiday hours.”
Schedule hair salon appointments? The dream of artificial intelligence was supposed to be grander than this — to help revolutionize medicine, say, or to produce trustworthy robot helpers for the home.
The reason Google Duplex is so narrow in scope isn’t that it represents a small but important first step toward such goals. The reason is that the field of A.I. doesn’t yet have a clue how to do any better.
. . .
The narrower the scope of a conversation, the easier it is to have. If your interlocutor is more or less following a script, it is not hard to build a computer program that, with the help of simple phrase-book-like templates, can recognize a few variations on a theme. (“What time does your establishment close?” “I would like a reservation for four people at 7 p.m.”) But mastering a Berlitz phrase book doesn’t make you a fluent speaker of a foreign language. Sooner or later the non sequiturs start flowing.
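The phrase-book-template idea the commentary describes can be sketched in a few lines of Python. This is a toy illustration, not how Google Duplex is built; the patterns and intent names are invented for the example. It matches a few variations on scripted themes and, as Marcus and Davis predict, falls over on anything off-script:

```python
import re

# Hypothetical phrase-book templates for a narrow reservation domain.
# Each pattern recognizes a few variations on one scripted theme.
TEMPLATES = [
    (re.compile(r"what time .* (close|open)", re.I), "ask_hours"),
    (re.compile(r"(reservation|table)\s+for\s+(\d+|two|three|four|five|six)", re.I), "book_table"),
    (re.compile(r"\bat\s+\d{1,2}(:\d{2})?\s*[ap]\.?m\.?", re.I), "give_time"),
]

def classify(utterance: str) -> str:
    """Return the first matching intent, or 'unknown' for off-script input."""
    for pattern, intent in TEMPLATES:
        if pattern.search(utterance):
            return intent
    return "unknown"  # this is where the non sequiturs start flowing
```

Here `classify("What time does your establishment close?")` returns `"ask_hours"`, but any sentence outside the hand-written templates falls through to `"unknown"` — which is the commentary’s point: widening coverage means endlessly adding patterns, because the universe of possible sentences never runs out.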
. . .
To be fair, Google Duplex doesn’t literally use phrase-book-like templates. It uses “machine learning” techniques to extract a range of possible phrases drawn from an enormous data set of recordings of human conversations. But the basic problem remains the same: No matter how much data you have and how many patterns you discern, your data will never match the creativity of human beings or the fluidity of the real world. The universe of possible sentences is too complex. There is no end to the variety of life — or to the ways in which we can talk about that variety.
. . .
Today’s dominant approach to A.I. has not worked out. Yes, some remarkable applications have been built from it, including Google Translate and Google Duplex. But the limitations of these applications as a form of intelligence should be a wake-up call. If machine learning and big data can’t get us any further than a restaurant reservation, even in the hands of the world’s most capable A.I. company, it is time to reconsider that strategy.

For the full commentary, see:
Gary Marcus and Ernest Davis. “A.I. Is Harder Than You Think.” The New York Times (Saturday, May 19, 2018): A21.
(Note: ellipses added.)
(Note: the online version of the commentary has the date May 18, 2018.)

Philosopher Argued Artificial Intelligence Would Never Reach Human Intelligence

(p. A28) Professor Dreyfus became interested in artificial intelligence in the late 1950s, when he began teaching at the Massachusetts Institute of Technology. He often brushed shoulders with scientists trying to turn computers into reasoning machines.
. . .
Inevitably, he said, artificial intelligence ran up against something called the common-knowledge problem: the vast repository of facts and information that ordinary people possess as though by inheritance, and can draw on to make inferences and navigate their way through the world.
“Current claims and hopes for progress in models for making computers intelligent are like the belief that someone climbing a tree is making progress toward reaching the moon,” he wrote in “Mind Over Machine: The Power of Human Intuition and Expertise in the Era of the Computer” (1985), a book he collaborated on with his younger brother Stuart, a professor of industrial engineering at Berkeley.
His criticisms were greeted with intense hostility in the world of artificial intelligence researchers, who remained confident that success lay within reach as computers grew more powerful.
When that did not happen, Professor Dreyfus found himself vindicated, doubly so when research in the field began incorporating his arguments, expanded upon in a second edition of “What Computers Can’t Do” in 1979 and “What Computers Still Can’t Do” in 1992.
. . .
For his 2006 book “Philosophy: The Latest Answers to the Oldest Questions,” Nicholas Fearn broached the topic of artificial intelligence in an interview with Professor Dreyfus, who told him: “I don’t think about computers anymore. I figure I won and it’s over: They’ve given up.”

For the full obituary, see:
William Grimes. “Hubert L. Dreyfus, Who Put Computing In Its Place, Dies at 87.” The New York Times (Wednesday, May 3, 2017): A28.
(Note: ellipses added.)
(Note: the online version of the obituary has the date May 2, 2017, and has the title “Hubert L. Dreyfus, Philosopher of the Limits of Computers, Dies at 87.”)

Dreyfus’s last book on the limits of artificial intelligence was:
Dreyfus, Hubert L. What Computers Still Can’t Do: A Critique of Artificial Reason. Cambridge, MA: The MIT Press, 1992.

Happiness “Emerges from the Pursuit of Purpose”

(p. C7) The modern positive-psychology movement — . . . — is a blend of wise goals, good studies, surprising discoveries, old truths and overblown promises. Daniel Horowitz’s history deftly reveals the eternal lessons that underlie all its incarnations: Money can’t buy happiness; human beings need social bonds, satisfying work and strong communities; a life based entirely on the pursuit of pleasure ultimately becomes pleasureless. As Viktor Frankl told us, “Happiness cannot be pursued; it must ensue. One must have a reason to ‘be happy.’ ” That reason, he said, emerges from the pursuit of purpose.

For the full review, see:
Carol Tavris. “How Smiles Were Packaged and Sold.” The Wall Street Journal (Saturday, March 31, 2018): C5 & C7.
(Note: ellipsis added.)
(Note: the online version of the review has the date March 29, 2018, and has the title “‘Happier?’ and ‘The Hope Circuit’ Reviews: How Smiles Were Packaged and Sold.”)

The book under review is:
Horowitz, Daniel. Happier?: The History of a Cultural Movement That Aspired to Transform America. New York: Oxford University Press, 2017.

Individualistic Cultures Foster Innovation

Source of graph: online version of the WSJ commentary quoted and cited below.

(p. B1) Luther matters to investors not because of the religion he founded, but because of the cultural impact of challenging the Catholic Church’s grip on society. By ushering in what Edmund Phelps, the Nobel-winning director of Columbia University’s Center on Capitalism and Society, calls “the age of the individual,” Luther laid the groundwork for capitalism.
. . .
(p. B10) Mr. Phelps and collaborators Saifedean Ammous, Raicho Bojilov and Gylfi Zoega show that even in recent years, countries with more individualistic cultures have more innovative economies. They demonstrate a strong link between countries that surveys show to be more individualistic, and total factor productivity, a proxy for innovation that measures growth due to more efficient use of labor and capital. Less individualistic cultures, such as France, Spain and Japan, showed little innovation while the individualistic U.S. led.
As Mr. Bojilov points out, correlation doesn’t prove causation, so they looked at the effects of country of origin on the success of second-, third- and fourth-generation Americans as entrepreneurs. The effects turn out to be significant but leave room for debate about how important individualistic attitudes are to financial and economic success.

For the full commentary, see:
James Mackintosh. “STREETWISE; What Martin Luther Says About Capitalism.” The Wall Street Journal (Friday, Nov. 3, 2017): B1 & B10.
(Note: ellipsis added.)
(Note: the online version of the commentary has the date Nov. 2, 2017, and has the title “STREETWISE; What 500 Years of Protestantism Teaches Us About Capitalism’s Future.” Where there are minor differences in wording in the two versions, the passages quoted above follow the online version.)