Fire-Cooked Carbohydrates Fed Bigger Brains

(p. D5) Scientists have long recognized that the diets of our ancestors went through a profound shift with the addition of meat. But in the September issue of The Quarterly Review of Biology, researchers argue that another item added to the menu was just as important: carbohydrates, bane of today’s paleo diet enthusiasts. In fact, the scientists propose, by incorporating cooked starches into their diet, our ancestors were able to fuel the evolution of our oversize brains.
. . .
Cooked meat provided increased protein, fat and energy, helping hominins grow and thrive. But Mark G. Thomas, an evolutionary geneticist at University College London, and his colleagues argue that there was another important food sizzling on the ancient hearth: tubers and other starchy plants.
Our bodies convert starch into glucose, the body’s fuel. The process begins as soon as we start chewing: Saliva contains an enzyme called amylase, which begins to break down starchy foods.
Amylase doesn’t work all that well on raw starches, however; it is much more effective on cooked foods. Cooking makes the average potato about 20 times as digestible, Dr. Thomas said: “It’s really profound.”
. . .
Dr. Thomas and his colleagues propose that the invention of fire, not farming, gave rise to the need for more amylase. Once early humans started cooking starchy foods, they needed more amylase to unlock the precious supply of glucose.
Mutations that gave people extra amylase helped them survive, and those mutations spread because of natural selection. That glucose, Dr. Thomas and his colleagues argue, provided the fuel for bigger brains.

For the full story, see:
Carl Zimmer. “MATTER; For Evolving Brains, a ‘Paleo’ Diet of Carbs.” The New York Times (Tues., AUG. 18, 2015): D5.
(Note: ellipses added.)
(Note: the online version of the story has the date AUG. 13, 2015.)

The academic article summarized in the passages above is:
Hardy, Karen, Jennie Brand-Miller, Katherine D. Brown, Mark G. Thomas, and Les Copeland. “The Importance of Dietary Carbohydrate in Human Evolution.” The Quarterly Review of Biology 90, no. 3 (Sept. 2015): 251-68.

Science Is a Process, Not a Set of Settled Conclusions

(p. A11) Are there any phrases in today’s political lexicon more obnoxious than “the science is settled” and “climate-change deniers”?
The first is an oxymoron. By definition, science is never settled. It is always subject to change in the light of new evidence. The second phrase is nothing but an ad hominem attack, meant to evoke “Holocaust deniers,” those people who maintain that the Nazi Holocaust is a fiction, ignoring the overwhelming, incontestable evidence that it is a historical fact.
. . .
. . . , the release of thousands of emails from the University of East Anglia’s Climate Research Unit in 2009 showed climate scientists concerned with the lack of recent warming and how to “hide the decline.” The communications showed that whatever the emailers were engaged in, it was not the disinterested pursuit of science.
Another batch of 5,000 emails written by top climate scientists came out in 2011, discussing, among other public-relations matters, how to deal with skeptical editors and how to suppress unfavorable data. It is a measure of the intellectual corruption of the mainstream media that this wasn’t the scandal of the century. But then again I forget, “the science is settled.”

For the full commentary, see:
JOHN STEELE GORDON. “The Unsettling, Anti-Science Certitude on Global Warming; Climate-change ‘deniers’ are accused of heresy by true believers; That doesn’t sound like science to me.” The Wall Street Journal (Fri., July 31, 2015): A11.
(Note: ellipses added.)
(Note: the online version of the article has the date July 30, 2015.)

Marie Curie Opposed Patents Because Women Could Not Own Property in France

(p. C6) Ms. Wirtén, a professor at Linköping University in Sweden, pays special attention to the decision not to patent and how it was treated in the founding texts of the Curie legend: Curie’s 1923 biography of her husband, “Pierre Curie,” and their daughter Eve’s 1937 biography of her mother, “Madame Curie.” The books each recount a conversation in which husband and wife agree that patenting their radium method would be contrary to the spirit of science.
It is not quite that simple. As Ms. Wirtén points out, the Curies derived a significant portion of their income from Pierre’s patents on instruments. Various factors besides beneficence could have affected their decision not to extend this approach to their radium process. Intriguingly, the author suggests that the ineligibility of women to own property under French law might have shaped Curie’s perspective. “Because the law excluded her from the status of person upon which these intellectual property rights depend,” Ms. Wirtén writes, “the ‘property’ road was closed to Marie Curie. The persona road was not.”

For the full review, see:
EVAN HEPLER-SMITH. “Scientific Saint; After scandals in France, Curie was embraced by American women as an intellectual icon.” The Wall Street Journal (Sat., March 21, 2015): C6.
(Note: the online version of the review has the date March 20, 2015.)

The book under review is:
Wirtén, Eva Hemmungs. Making Marie Curie: Intellectual Property and Celebrity Culture in an Age of Information. Chicago: University of Chicago Press, 2015.

With Self-Funding and a Donation from Sony, Khanna Builds PlayStation Supercomputer to Advance Science

“Gaurav Khanna with a supercomputer he built at the University of Massachusetts Dartmouth physics department using 200 Playstation 3 consoles that are housed in a refrigerated shipping container.” Source of caption: print version of the NYT article quoted and cited below. Source of photo: online version of the NYT article quoted and cited below.

(p. D3) This spring, Gaurav Khanna noticed that the University of Massachusetts Dartmouth physics department was more crowded than usual. Why, he wondered, were so many students suddenly so interested in science?

It wasn’t a thirst for knowledge, it turns out. News of Dr. Khanna’s success in building a supercomputer using only PlayStation 3 video game consoles had spread quickly; the students, a lot of them gamers, just wanted to gape at the sight of nearly 200 consoles stacked on one another.
. . .
Making a supercomputer requires a large number of processors — standard desktops, laptops or the like — and a way to network them. Dr. Khanna picked the PlayStation 3 for its viability and cost, currently $250 to $300 in stores. Unlike other game consoles, the PlayStation 3 allows users to install a preferred operating system, making it attractive to programmers and developers. (The latest model, the PlayStation 4, does not have this feature.)
“Gaming had grown into a huge market,” Dr. Khanna said. “There’s a huge push for performance, meaning you can buy low-cost, high-performance hardware very easily. I could go out and buy 100 PlayStation 3 consoles at my neighborhood Best Buy, if I wanted.”
That is just what Dr. Khanna did, though on a smaller scale. Because the National Science Foundation, which funds much of Dr. Khanna’s research, might not have viewed the bulk buying of video game consoles as a responsible use of grant money, he reached out to Sony Computer Entertainment America, the company behind the PlayStation 3. Sony donated four consoles to the experiment; Dr. Khanna’s university paid for eight more, and Dr. Khanna bought another four. He then installed the Linux operating system on all 16 consoles, plugged them into the Internet and booted up the supercomputer.
Lior Burko, an associate professor of physics at Georgia Gwinnett College and a past collaborator with Dr. Khanna, praised the idea as an “ingenious” way to get the function of a supercomputer without the prohibitive expense.
“Dr. Khanna was able to combine his two fields of expertise, namely general relativity and computer science, to invent something new that allowed for not just a neat new machine, but also scientific progress that otherwise might have taken many more years to achieve,” Dr. Burko said.
. . .
His team linked the consoles, housing them in a refrigerated shipping container designed to carry milk. The resulting supercomputer, Dr. Khanna said, had the computational power of nearly 3,000 laptop or desktop processors, and cost only $75,000 to make — about a tenth the cost of a comparable supercomputer made using traditional parts.
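
For readers curious how a pile of networked consoles can act as one computer, the basic pattern is to run the same program on every node and have each node take a different slice of the work, then combine the partial results. Below is a minimal, hypothetical sketch in Python using the mpi4py library; it illustrates the general MPI message-passing approach, not Dr. Khanna's actual code, and the workload (a toy numerical integral) is made up for illustration.

from mpi4py import MPI   # assumes an MPI stack (e.g., Open MPI) is installed on every Linux node

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's ID, 0 .. size-1
size = comm.Get_size()   # total number of processes across the networked consoles

# Toy workload: approximate the integral of x^2 on [0, 1] by a midpoint sum.
N = 10_000_000
local_sum = 0.0
for i in range(rank, N, size):        # each process handles every size-th term
    x = (i + 0.5) / N
    local_sum += x * x / N

total = comm.reduce(local_sum, op=MPI.SUM, root=0)   # combine partial sums on process 0
if rank == 0:
    print("approximate integral:", total)            # should be close to 1/3

Launched with something like "mpirun -hostfile consoles.txt -n 16 python integrate.py" (the file names here are placeholders), the same script runs on all the machines at once. The physics codes on the real cluster are far more elaborate, but the division-of-labor idea is the same.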

For the full story, see:
LAURA PARKER. “An Economical Way to Save Progress.” The New York Times (Tues., DEC. 23, 2014): D3.
(Note: ellipses added.)
(Note: the online version of the story has the date DEC. 22, 2014, and has the title “That Old PlayStation Can Aid Science.”)

Rather than Debate Global Warming Skeptics, Some Label Them “Denialists” to “Link Them to Holocaust Denial”

(p. D2) The contrarian scientists like to present these upbeat scenarios as the only plausible outcomes from runaway emissions growth. Mainstream scientists see them as being the low end of a range of possible outcomes that includes an alarming high end, and they say the only way to reduce the risks is to reduce emissions.
The dissenting scientists have been called “lukewarmers” by some, for their view that Earth will warm only a little. That is a term Dr. Michaels embraces. “I think it’s wonderful!” he said. He is working on a book, “The Lukewarmers’ Manifesto.”
When they publish in scientific journals, presenting data and arguments to support their views, these contrarians are practicing science, and perhaps the “skeptic” label is applicable. But not all of them are eager to embrace it.
“As far as I can tell, skepticism involves doubts about a plausible proposition,” another of these scientists, Richard S. Lindzen, told an audience a few years ago. “I think current global warming alarm does not represent a plausible proposition.”
. . .
It is perhaps no surprise that many environmentalists have started to call them deniers.
The scientific dissenters object to that word, claiming it is a deliberate attempt to link them to Holocaust denial. Some academics sharply dispute having any such intention, but others have started using the slightly softer word “denialist” to make the same point without stirring complaints about evoking the Holocaust.

For the full commentary, see:
Justin Gillis. “BY DEGREES; Verbal Warming: Labels in the Climate Debate.” The New York Times (Tues., FEB. 17, 2015): D1-D2.
(Note: ellipsis added.)
(Note: the online version of the commentary has the date FEB. 12 (sic), 2015.)

“Big Data” Does Not Tell Us What to Measure, and Ignores What Cannot Be Measured

(p. 6) BIG data will save the world. How often have we heard that over the past couple of years? We’re pretty sure both of us have said something similar dozens of times in the past few months.
If you’re trying to build a self-driving car or detect whether a picture has a cat in it, big data is amazing. But here’s a secret: If you’re trying to make important decisions about your health, wealth or happiness, big data is not enough.
The problem is this: The things we can measure are never exactly what we care about. Just trying to get a single, easy-to-measure number higher and higher (or lower and lower) doesn’t actually help us make the right choice. For this reason, the key question isn’t “What did I measure?” but “What did I miss?”
. . .
So what can big data do to help us make big decisions? One of us, Alex, is a data scientist at Facebook. The other, Seth, is a former data scientist at Google. There is a special sauce necessary to making big data work: surveys and the judgment of humans — two seemingly old-fashioned approaches that we will call small data.
Facebook has tons of data on how people use its site. It’s easy to see whether a particular news feed story was liked, clicked, commented on or shared. But not one of these is a perfect proxy for more important questions: What was the experience like? Did the story connect you with your friends? Did it inform you about the world? Did it make you laugh?
(p. 7) To get to these measures, Facebook has to take an old-fashioned approach: asking. Every day, hundreds of individuals load their news feed and answer questions about the stories they see there. Big data (likes, clicks, comments) is supplemented by small data (“Do you want to see this post in your News Feed?”) and contextualized (“Why?”).
Big data in the form of behaviors and small data in the form of surveys complement each other and produce insights rather than simple metrics.
. . .
Because of this need for small data, Facebook’s data teams look different than you would guess. Facebook employs social psychologists, anthropologists and sociologists precisely to find what simple measures miss.
And it’s not just Silicon Valley firms that employ the power of small data. Baseball is often used as the quintessential story of data geeks, crunching huge data sets, replacing fallible human experts, like scouts. This story was made famous in both the book and the movie “Moneyball.”
But the true story is not that simple. For one thing, many teams ended up going overboard on data. It was easy to measure offense and pitching, so some organizations ended up underestimating the importance of defense, which is harder to measure. In fact, in his book “The Signal and the Noise,” Nate Silver of fivethirtyeight.com estimates that the Oakland A’s were giving up 8 to 10 wins per year in the mid-1990s because of their lousy defense.
. . .
Human experts can also help data analysts figure out what to look for. For decades, scouts have judged catchers based on their ability to frame pitches — to make the pitch appear more like a strike to a watching umpire. Thanks to improved data on pitch location, analysts have recently checked this hypothesis and confirmed that catchers differ significantly in this skill.
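
Mechanically, the “big data plus small data” recipe the authors describe amounts to a join: attach a small sample of survey answers (or expert judgments) to a large behavioral log, then ask which logged signals actually track the thing you care about. Here is a tiny, hypothetical sketch in Python with pandas; the column names and numbers are invented for illustration and are not Facebook's or Google's actual pipeline.

import pandas as pd

# "Big data": one row per (user, story) pair of logged behavior.
logs = pd.DataFrame({
    "user":    [1, 1, 2, 3, 3, 4],
    "story":   ["a", "b", "a", "b", "c", "a"],
    "clicked": [1, 0, 0, 1, 0, 1],
    "liked":   [1, 0, 0, 1, 0, 0],
})

# "Small data": survey answers from a small sample of the same users and stories.
survey = pd.DataFrame({
    "user":           [1, 2, 3, 4],
    "story":          ["a", "a", "b", "a"],
    "wanted_in_feed": [1, 0, 1, 0],   # "Do you want to see this post in your News Feed?"
})

# Join the behavior log to the survey labels, then see how well each metric proxies them.
merged = logs.merge(survey, on=["user", "story"], how="inner")
print(merged[["clicked", "liked", "wanted_in_feed"]].corr()["wanted_in_feed"])

In this toy table "liked" tracks the survey answer better than "clicked" does; the numbers mean nothing, but the pattern is the point: the survey supplies the measure of what you actually care about, and the big behavioral log supplies the cheap proxies you test against it.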

For the full commentary, see:
ALEX PEYSAKHOVICH and SETH STEPHENS-DAVIDOWITZ. “How Not to Drown in Numbers.” The New York Times, SundayReview Section (Sun., MAY 3, 2015): 6-7.
(Note: ellipses added.)
(Note: the online version of the commentary has the date MAY 2, 2015.)

Babies “Have a Positive Hunger for the Unexpected”

(p. C2) In an amazingly clever new paper in the journal Science, Aimee Stahl and Lisa Feigenson at Johns Hopkins University show systematically that 11-month-old babies, like scientists, pay special attention when their predictions are violated, learn especially well as a result, and even do experiments to figure out just what happened.
They took off from some classic research showing that babies will look at something longer when it is unexpected. The babies in the new study either saw impossible events, like the apparent passage of a ball through a solid brick wall, or straightforward events, like the same ball simply moving through an empty space.
. . .
The babies explored objects more when they behaved unexpectedly. They also explored them differently depending on just how they behaved unexpectedly. If the ball had vanished through the wall, the babies banged the ball against a surface; if it had hovered in thin air, they dropped it. It was as if they were testing to see if the ball really was solid, or really did defy gravity, much like Georgie testing the fake eggs in the Easter basket.
In fact, these experiments suggest that babies may be even better scientists than grown-ups often are. Adults suffer from “confirmation bias”–we pay attention to the events that fit what we already know and ignore things that might shake up our preconceptions. Charles Darwin famously kept a special list of all the facts that were at odds with his theory, because he knew he’d otherwise be tempted to ignore or forget them.
Babies, on the other hand, seem to have a positive hunger for the unexpected. Like the ideal scientists proposed by the philosopher of science Karl Popper, babies are always on the lookout for a fact that falsifies their theories.

For the full commentary, see:
ALISON GOPNIK. “MIND AND MATTER; How 1-Year-Olds Figure Out the World.” The Wall Street Journal (Sat., April 15, 2015): C2.
(Note: ellipsis added.)
(Note: the online version of the commentary has the date April 15, 2015, and has the title “MIND AND MATTER; How 1-Year-Olds Figure Out the World.”)

The scientific article mentioned in the passages quoted is:
Stahl, Aimee E., and Lisa Feigenson. “Observing the Unexpected Enhances Infants’ Learning and Exploration.” Science 348, no. 6230 (April 3, 2015): 91-94.

We Often “See” What We Expect to See

(p. 9) The Justice Department recently analyzed eight years of shootings by Philadelphia police officers. Its report contained two sobering statistics: Fifteen percent of those shot were unarmed; and in half of these cases, an officer reportedly misidentified a “nonthreatening object (e.g., a cellphone) or movement (e.g., tugging at the waistband)” as a weapon.
Many factors presumably contribute to such shootings, ranging from carelessness to unconscious bias to explicit racism, all of which have received considerable attention of late, and deservedly so.
But there is a lesser-known psychological phenomenon that might also explain some of these shootings. It’s called “affective realism”: the tendency of your feelings to influence what you see — not what you think you see, but the actual content of your perceptual experience.
. . .
The brain is a predictive organ. A majority of your brain activity consists of predictions about the world — thousands of them at a time — based on your past experience. These predictions are not deliberate prognostications like “the Red Sox will win the World Series,” but unconscious anticipations of every sight, sound and other sensation you might encounter in every instant. These neural “guesses” largely shape what you see, hear and otherwise perceive.
. . .
. . . , our lab at Northeastern University has conducted experiments to document affective realism. For example, in one study we showed an affectively neutral face to our test subjects, and using special equipment, we secretly accompanied it with a smiling or scowling face that the subjects could not consciously see. (The technique is called “continuous flash suppression.”) We found that the unseen faces influenced the subjects’ bodily activity (e.g., how fast their hearts beat) and their feelings. These in turn influenced their perceptions: In the presence of an unseen scowling face, our subjects felt unpleasant and perceived the neutral face as less likable, less trustworthy, less competent, less attractive and more likely to commit a crime than when we paired it with an unseen smiling face.
These weren’t just impressions; they were actual visual changes. The test subjects saw the neutral faces as having a more furrowed brow, a more surly mouth and so on. (Some of these findings were published in Emotion in 2012.)
. . .
. . . the brain is wired for prediction, and you predict most of the sights, sounds and other sensations in your life. You are, in large measure, the architect of your own experience.

For the full commentary, see:
Lisa Feldman Barrett and Jolie Wormwood. “When a Gun Is Not a Gun.” The New York Times, SundayReview Section (Sun., April 19, 2015): 9.
(Note: italics in original; ellipses added.)
(Note: the date of the online version of the commentary is APRIL 17, 2015.)

The academic article mentioned in the passage quoted above is:
Anderson, Eric, Erika Siegel, Dominique White, and Lisa Feldman Barrett. “Out of Sight but Not out of Mind: Unseen Affective Faces Influence Evaluations and Social Impressions.” Emotion 12, no. 6 (Dec. 2012): 1210-21.

New NOAA Estimates Show an Increase Since 1880 of Only 1.65 Degrees Fahrenheit

(p. A10) Scientists have long labored to explain what appeared to be a slowdown in global warming that began at the start of this century as, at the same time, heat-trapping emissions of carbon dioxide were soaring. The slowdown, sometimes inaccurately described as a halt or hiatus, became a major talking point for people critical of climate science.
Now, new research suggests the whole thing may have been based on incorrect data.
When adjustments are made to compensate for recently discovered problems in the way global temperatures were measured, the slowdown largely disappears, the National Oceanic and Atmospheric Administration declared in a scientific paper published Thursday. And when the particularly warm temperatures of 2013 and 2014 are averaged in, the slowdown goes away entirely, the agency said.
. . .
The Cato Institute, a libertarian think tank in Washington that is critical of climate science, issued a statement condemning the changes and questioning the agency’s methodology.
“The main claim by the authors that they have uncovered a significant recent warming trend is dubious,” said the statement, attributed to three contrarian climate scientists: Richard S. Lindzen, Patrick J. Michaels and Paul C. Knappenberger.
However, Russell S. Vose, chief of the climate science division at NOAA’s Asheville center, pointed out in an interview that while the corrections do eliminate the recent warming slowdown, the overall effect of the agency’s adjustments has long been to raise the reported global temperatures in the late 19th and early 20th centuries by a substantial margin. That makes the temperature increase of the past century appear less severe than it does in the raw data.
“If you just wanted to release to the American public our uncorrected data set, it would say that the world has warmed up about 2.071 degrees Fahrenheit since 1880,” Dr. Vose said. “Our corrected data set says things have warmed up about 1.65 degrees Fahrenheit. Our corrections lower the rate of warming on a global scale.”
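
Dr. Vose's point can be put as simple arithmetic, using only the figures quoted above (a rough back-of-the-envelope, where delta-T is the estimated warming since 1880):

\[
\Delta T_{\text{uncorrected}} - \Delta T_{\text{corrected}} \approx 2.071\,^{\circ}\mathrm{F} - 1.65\,^{\circ}\mathrm{F} \approx 0.42\,^{\circ}\mathrm{F},
\qquad \frac{0.42}{2.071} \approx 20\%.
\]

In other words, over the full 1880-to-present record the net effect of NOAA's corrections is to lower the reported warming by roughly a fifth.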

For the full story, see:
JUSTIN GILLIS. “Global Warming ‘Hiatus’ Challenged by NOAA Research.” The New York Times (Fri., JUNE 5, 2015): A10.
(Note: ellipsis added.)
(Note: the online version of the story has the date JUNE 4, 2015.)

The scientific article mentioned in the passages quoted above is:
Karl, Thomas R., Anthony Arguez, Boyin Huang, Jay H. Lawrimore, James R. McMahon, Matthew J. Menne, Thomas C. Peterson, Russell S. Vose, and Huai-Min Zhang. “Possible Artifacts of Data Biases in the Recent Global Surface Warming Hiatus.” Science 348, no. 6242 (June 26, 2015): 1469-72.

Chimps Are Willing to Delay Gratification in Order to Receive Cooked Food

This is a big deal because cooking allows us humans to spend much less energy digesting our food, freeing more energy for the brain. So one theory is that cooking technology allowed humans eventually to develop cognitive abilities superior to those of other primates.

(p. A3) . . . scientists from Harvard and Yale found that chimps have the patience and foresight to resist eating raw food and to place it in a device meant to appear, at least to the chimps, to cook it.
. . .
But they found that chimps would give up a raw slice of sweet potato in the hand for the prospect of a cooked slice of sweet potato a bit later. That kind of foresight and self-control is something any cook who has eaten too much raw cookie dough can admire.
The research grew out of the idea that cooking itself may have driven changes in human evolution, a hypothesis put forth by Richard Wrangham, an anthropologist at Harvard, and several colleagues about 15 years ago in an article in Current Anthropology, and more recently in his book, “Catching Fire: How Cooking Made Us Human.”
He argued that cooking may have begun something like two million years ago, even though hard evidence only dates back about one million years. For that to be true, some early ancestors, perhaps not much more advanced than chimps, had to grasp the whole concept of transforming the raw into the cooked.
Felix Warneken at Harvard and Alexandra G. Rosati, who is about to move from Yale to Harvard, both of whom study cognition, wanted to see if chimpanzees, which often serve as stand-ins for human ancestors, had the cognitive foundation that would prepare them to cook.
. . .
Dr. Rosati said the experiments showed not only that chimps had the patience for cooking, but that they had the “minimal causal understanding they would need” to make the leap to cooking.

For the full story, see:
JAMES GORMAN. “Chimpanzees Would Cook if Given Chance, Research Says.” The New York Times (Weds., JUNE 3, 2015): A3.
(Note: ellipses added.)
(Note: the online version of the story has the date JUNE 2, 2015, and has the title “Chimpanzees Would Cook if Given the Chance, Research Says.”)

The academic article discussed in the passages quoted above is:
Warneken, Felix, and Alexandra G. Rosati. “Cognitive Capacities for Cooking in Chimpanzees.” Proceedings of the Royal Society of London B: Biological Sciences 282, no. 1809 (June 22, 2015).