After Failing to Enslave Indians, Starving Jamestown Colonists Ate 14-Year-Old Girl

JamestownFourteenYearOldCannibalized2013-05-14.jpg

“A facial reconstruction of a 14-year-old girl whose skull shows signs that her remains were used for food after her death and burial.” Source of caption and image: online version of the NYT article quoted and cited below.

In the long but thought-provoking opening chapter of their book Why Nations Fail, Acemoglu and Robinson discuss starvation at the Jamestown colony. Only they don’t mainly attribute it to a harsh winter or a slow rescue from England, as does the article quoted below (it is from the New York Times, after all).
Economists Acemoglu and Robinson (p. 23) instead criticize the colony’s initial plan to thrive by enslaving natives to bring them gold and food. Eventually John Smith made the bold suggestion that the colonists should try to work to produce something to eat or to trade. The rulers of the colony ignored Smith, resulting in starvation and cannibalism.

(p. A11) Archaeologists excavating a trash pit at the Jamestown colony site in Virginia have found the first physical evidence of cannibalism among the desperate population, corroborating written accounts left behind by witnesses. Cut marks on the skull and skeleton of a 14-year-old girl show that her flesh and brain were removed, presumably to be eaten by the starving colonists during the harsh winter of 1609.

The remains were excavated by archaeologists led by William Kelso of Preservation Virginia, a private nonprofit group, and analyzed by Douglas Owsley, a physical anthropologist at the National Museum of Natural History in Washington. The skull bears tentative cuts to the forehead, followed by four strikes to the back of the head, one of which split the skull open, according to an article in Smithsonian magazine, where the find was reported Wednesday.
It is unclear how the girl died, but she was almost certainly dead and buried before her remains were butchered. According to a letter written in 1625 by George Percy, president of Jamestown during the starvation period, the famine was so intense “thatt notheinge was Spared to mainteyne Lyfe and to doe those things which seame incredible, as to digge upp deade corpes outt of graves and to eate them.”

For the full story, see:
NICHOLAS WADE. “Girl’s Bones Bear Signs of Cannibalism by Starving Virginia Colonists.” The New York Times (Thurs., May 2, 2013): A11.
(Note: ellipsis added.)
(Note: the online version of the story has the date May 1, 2013.)

The Acemoglu book mentioned above is:
Acemoglu, Daron, and James Robinson. Why Nations Fail: The Origins of Power, Prosperity, and Poverty. New York: Crown Business, 2012.

JamestownBonesShowCannibalism2013-05-14.jpg “Human remains from the Jamestown colony site in Virginia bearing evidence of cannibalism.” Source of caption and photo: online version of the NYT article quoted and cited above.

Moore’s Law: Inevitable or Intel?

I believe that Moore’s Law remained true for a long time, not because it was inevitable, but because an exemplary company worked very hard and effectively to make it true.

(p. 159) In brief, Moore’s Law predicts that computing chips will shrink by half in size and cost every 18 to 24 months. For the past 50 years it has been astoundingly correct.

It has been steady and true, but does Moore’s Law reveal an imperative in the technium? In other words is Moore’s Law in some way inevitable? The answer is pivotal for civilization for several reasons. First, Moore’s Law represents the acceleration in computer technology, which is accelerating everything else. Faster jet engines don’t lead to higher corn yields, nor do better lasers lead to faster drug discoveries, but faster computer chips lead to all of these. These days all technology follows computer technology. Second, finding inevitability in one key area of technology suggests invariance and directionality may be found in the rest of the technium.

Source:
Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.
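The compounding behind Kelly’s claim is easy to understate. A minimal sketch (mine, not Kelly’s) of the arithmetic: a halving of size and cost every 18 to 24 months, repeated for 50 years, multiplies into an improvement factor in the tens of millions or more.

```python
def cumulative_factor(years, months_per_halving):
    """Total improvement factor after `years` of repeated halvings
    (each halving doubles capability per unit size/cost)."""
    halvings = years * 12 / months_per_halving
    return 2 ** halvings

# 50 years at a 24-month cadence: 25 halvings, a factor of 2^25
print(cumulative_factor(50, 24))  # about 33.5 million
# 50 years at an 18-month cadence: about 33 halvings
print(cumulative_factor(50, 18))  # about 10 billion
```

At the faster cadence the factor is roughly 300 times larger than at the slower one, which is why the quoted 18-to-24-month range matters so much over half a century.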

The Eccentric History of How Bureaucratic Paper-Pushing Drives Clerks Crazy

TheDemonOfWritingBK2013-05-13.jpg

Source of book image: http://d.gr-assets.com/books/1360928417l/15904345.jpg

(p. C4) If paperwork studies have an unofficial standard-bearer and theoretician, it’s Mr. Kafka. In “The Demon of Writing” he lays out a concise if eccentric intellectual history of people’s relationship with the paperwork that governs (and gums up) so many aspects of modern life. The rise of modern bureaucracy is a well-established topic in sociology and political science, where it is often related as a tale of increasing order and rationality. But the paper’s-eye view championed by Mr. Kafka tells a more chaotic story of things going wrong, or at least getting seriously messy.

It’s an idea that makes perfect sense to any modern cubicle dweller whose overflowing desk stands as a rebuke to the utopian promise of the paperless office. But Mr. Kafka traces the modern age of paperwork to the French Revolution and the Declaration of the Rights of Man, which guaranteed citizens the right to request a full accounting of the government. An explosion of paper followed, along with jokes, gripes and tirades against the indignity of rule by paper-pushing clerks, a fair number of whom, judging from the stories in Mr. Kafka’s book, went mad.

For the full story, see:
JENNIFER SCHUESSLER. “The Paper Trail Through History.” The New York Times (Mon., December 17, 2012): C1 & C4.
(Note: the online version of the story has the date December 16, 2012.)

Kafka’s book, mentioned above, is:
Kafka, Ben. The Demon of Writing: Powers and Failures of Paperwork. Cambridge, Mass.: Zone Books, 2012.

KafkaBenAuthor2013-05-13.jpg “Ben Kafka, author of ‘The Demon of Writing: Powers and Failures of Paperwork.’” Source of caption and photo: online version of the NYT article quoted and cited above.

We Worry Most About What We Cannot Control

(p. D7) Studies have compared Americans’ perceived ranking of dangers with the rankings of real dangers, measured either by actual accident figures or by estimated numbers of averted accidents. It turns out that we exaggerate the risks of events that are beyond our control, that cause many deaths at once or that kill in spectacular ways — crazy gunmen, terrorists, plane crashes, nuclear radiation, genetically modified crops. At the same time, we underestimate the risks of events that we can control (“That would never happen to me — I’m careful”) and of events that kill just one person in a mundane way.

For the full commentary, see:
JARED DIAMOND. “ESSAY; That Daily Shower Can Be a Killer.” The New York Times (Tues., January 28, 2013): D1 & D7.
(Note: the online version of the commentary has the date January 28, 2013.)

Faculty Unions Oppose MOOCs that Might Cost Them Their Jobs in Five to Seven Years

ThrunSabastianUdacityCEO2013-05-14.jpg “Sebastian Thrun, a research professor at Stanford, is Udacity’s chief executive officer.” Source of caption and photo: online version of the NYT article quoted and cited below.

(p. A1) SAN JOSE, Calif. — Dazzled by the potential of free online college classes, educators are now turning to the gritty task of harnessing online materials to meet the toughest challenges in American higher education: giving more students access to college, and helping them graduate on time.
. . .
Here at San Jose State, . . . , two pilot programs weave material from the online classes into the instructional mix and allow students to earn credit for them.
“We’re in Silicon Valley, we (p. A3) breathe that entrepreneurial air, so it makes sense that we are the first university to try this,” said Mohammad Qayoumi, the university’s president. “In academia, people are scared to fail, but we know that innovation always comes with the possibility of failure. And if it doesn’t work the first time, we’ll figure out what went wrong and do better.”
. . .
Dr. Qayoumi favors the blended model for upper-level courses, but fully online courses like Udacity’s for lower-level classes, which could be expanded to serve many more students at low cost. Traditional teaching will be disappearing in five to seven years, he predicts, as more professors come to realize that lectures are not the best route to student engagement, and cash-strapped universities continue to seek cheaper instruction.
“There may still be face-to-face classes, but they would not be in lecture halls,” he said. “And they will have not only course material developed by the instructor, but MOOC materials and labs, and content from public broadcasting or corporate sources. But just as faculty currently decide what textbook to use, they will still have the autonomy to choose what materials to include.”
. . .
Any wholesale online expansion raises the specter of professors being laid off, turned into glorified teaching assistants or relegated to second-tier status, with only academic stars giving the lectures. Indeed, the faculty unions at all three California higher education systems oppose the legislation requiring credit for MOOCs for students shut out of on-campus classes.
. . .
“Our ego always runs ahead of us, making us think we can do it better than anyone else in the world,” Dr. Ghadiri said. “But why should we invent the wheel 10,000 times? This is M.I.T., No. 1 school in the nation — why would we not want to use their material?”
There are, he said, two ways of thinking about what the MOOC revolution portends: “One is me, me, me — me comes first. The other is, we are not in this business for ourselves, we are here to educate students.”

For the full story, see:
TAMAR LEWIN. “Colleges Adapt Online Courses to Ease Burden.” The New York Times (Tues., April 30, 2013): A1 & A3.
(Note: ellipses added.)
(Note: the online version of the story has the date April 29, 2013.)

KormanikKatieUdacityStudent2013-05-14.jpg “Katie Kormanik preparing to record a statistics course at Udacity, an online classroom instruction provider in Mountain View, Calif.” Source of caption and photo: online version of the NYT article quoted and cited above.

Early Societies Were Violent, Superstitious and Unfair

(p. 89) Human nature is malleable. We use our minds to change our values, expectations, and definition of ourselves. We have changed our nature since our hominin days, and once changed, we will continue to change ourselves even more. Our inventions, such as language, writing, law, and science, have ignited a level of progress that is so fundamental and embedded in the present that we now naively expect to see similar good things in the past as well. But much of what we consider “civil” or even “humane” was absent long ago. Early societies were not peaceful but rife with warfare. One of the most common causes of adult death in tribal societies was to be declared a witch or evil spirit. No rational evidence was needed for these superstitious accusations. Lethal atrocities for infractions within a clan were the norm; fairness, as we might think of it, did not extend outside the immediate tribe. Rampant inequality among genders and physical advantage for the strong guided a type of justice few modern people would want applied to them.

Source:
Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

Edison, Not Muybridge, Remains the Father of Hollywood

TheInventorAndTheTycoonBK2013-05-12.jpg

Source of book image: online version of the WSJ review quoted and cited below.

(p. A13) Wish it though we might, this strangely off-center Briton isn’t really the Father of Hollywood, nor even a distant progenitor of “Avatar.” The famous time-lapse images that he took for Stanford, proving that a horse does take all four hoofs off the ground while galloping–and the tens of thousands of photographs that he went on to make of birds flying and people sneezing or bending over and picking things up–were soon so comprehensively overtaken by newer technologies (lenses, shutters, celluloid) that his stature as a proto-movie-maker was soon reduced to a way-station. His contribution was technically interesting but hardly seminal at all. The tragic reality is that Thomas Edison, with whom Muybridge was friendly enough to propose collaboration, retains the laurels–though, as Mr. Ball points out with restrained politeness, Muybridge might have fared better had he been aware of Edison’s reputation for “borrowing the work of others and not returning it.”

For the full review, see:
SIMON WINCHESTER. “BOOKSHELF; Lights, Camera, Murder; The time-lapse photos Muybridge took in the 19th century were technically innovative, but they didn’t make him the Father of Hollywood.” The Wall Street Journal (Thurs., February 6, 2013): A13.
(Note: the online version of the review has the date February 6, 2013.)

The book under review is:
Ball, Edward. The Inventor and the Tycoon: A Gilded Age Murder and the Birth of Moving Pictures. New York: Doubleday, 2013.

World Population Growth Rate “Expected to Hit Zero Around 2070”

(p. C4) In the 1960s, some experts feared an exponentially accelerating population explosion, and in 1969, the State Department envisaged 7.5 billion people by the year 2000. In 1994, the United Nations’ medium estimate expected the seven-billion milestone to arrive around 2009. Compared with most population forecasts made in the past half century, the world keeps undershooting.

The growth rate of world population has halved since the ’60s and is now expected to hit zero around 2070, with population around 10 billion, though some news outlets prefer to focus on the U.N.’s “high” estimate that it “could” reach 15 billion. The truth is, nobody can know, but if it’s below 10 billion in 2100, we will have only increased in numbers by 1.5 times in the 21st century, compared with a fourfold increase in the 20th.

For the full commentary, see:
MATT RIDLEY. “MIND & MATTER; Who’s Afraid of Seven Billion People?” The Wall Street Journal (Sat., October 29, 2011): C4.
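Ridley’s ratios can be checked with a couple of lines of arithmetic. The 1900 and 2000 population figures below (1.6 billion and 6.1 billion) are commonly cited estimates, not numbers from the article itself; only the 10 billion ceiling is Ridley’s.

```python
# Commonly cited world population estimates (illustrative assumptions):
pop_1900, pop_2000 = 1.6e9, 6.1e9
pop_2100 = 10e9  # the "below 10 billion in 2100" figure Ridley cites

# 20th century: roughly a fourfold increase
print(pop_2000 / pop_1900)  # about 3.8

# 21st century, if population stays under 10 billion:
# only about a 1.5-fold increase
print(pop_2100 / pop_2000)  # about 1.6
```

The contrast is stark either way the estimates are rounded: a near-quadrupling in the 20th century against a roughly 1.5-fold rise in the 21st.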

Tesla CTO Straubel Likes Biography of Tesla

StraubelJBteslaCTO2013-05-14.jpg

J.B. Straubel, Chief Technology Officer of Tesla Motors. Source of photo: online version of the NYT article quoted and cited below.

(p. 2) J. B. Straubel is a founder and the chief technical officer of Tesla Motors in Palo Alto, Calif. The company makes electric vehicles that some compare to Apple products in terms of obsessive attention to design, intuitive user interface and expense.

READING I like to read biographies of interesting people, mostly scientists and engineers. Right now, it’s “Steve Jobs,” by Walter Isaacson. One of my favorite biographies was “Wizard: The Life and Times of Nikola Tesla,” by Marc Seifer, which I read even before Tesla Motors started.
. . .
WATCHING I really like the movie “October Sky.” It’s about a guy who grew up in a little coal-mining town around the time of Sputnik. He fell in love with the idea of building rockets and the movie follows him through his high school years when he’s building rockets and eventually he ends up becoming an engineer at NASA. I watch it every year or so. It’s inspirational. I always come out of it wanting to work harder.

For the full interview, see:
KATE MURPHY. “DOWNLOAD; J. B. Straubel.” The New York Times, SundayReview Section (Sun., April 7, 2013): 2.
(Note: ellipsis added; bold in original.)
(Note: the online version of the interview has the date April 6, 2013.)

Cities Provide Children “Options for Their Future”

(p. 85) As Suketu Mehta, author of Maximum City (about Mumbai), says, “Why would anyone leave a brick house in the village with its two mango trees and its view of small hills in the East to come here?” Then he answers: “So that someday the eldest son can buy two rooms in Mira Road, at the northern edges of the city. And the younger one can move beyond that, to New Jersey. Discomfort is an investment.”
Then Mehta continues: “For the young person in an Indian village, the call of Mumbai isn’t just about money. It’s also about freedom.” Stewart Brand recounts this summation of the magnetic pull of cities by activist Kavita Ramdas: “In the village, all there is for a woman is to obey her husband and relatives, pound millet, and sing. If she moves to town, she can get a job, start a business, and get education for her children.” The Bedouin of Arabia were once seemingly the freest people on Earth, roaming the great Empty Quarter at will, under a tent of stars and no one’s thumb. But they are rapidly quitting their nomadic life and (p. 86) hustling into drab, concrete-block apartments in exploding Gulf-state ghettos. As reported by Donovan Webster in National Geographic, they stable their camels and goats in their ancestral village, because the bounty and attraction of the herder’s life still remain for them. The Bedouin are lured, not pushed, to the city because, in their own words: “We can always go into the desert to taste the old life. But this [new] life is better than the old way. Before there was no medical care, no schools for our children.” An eighty-year-old Bedouin chief sums it up better than I could: “The children will have more options for their future.”

Source:
Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.
(Note: italics, and bracketed “new,” in original.)