Pentagon Seeks Innovation from Private Start-Ups Since “They’ve Realized that the Old Model Wasn’t Working Anymore”

(p. A3) SAN FRANCISCO — A small group of high-ranking Pentagon officials made a quiet visit to Silicon Valley in December to solicit national security ideas from start-up firms with little or no history of working with the military.
The visit was made as part of an effort to find new ways to maintain a military advantage in an increasingly uncertain world.
In announcing its Defense Innovation Initiative in a speech in California in November, Chuck Hagel, then the defense secretary, mentioned examples of technologies like robotics, unmanned systems, miniaturization and 3-D printing as places to look for “game changing” technologies that would maintain military superiority.
“They’ve realized that the old model wasn’t working anymore,” said James Lewis, director of the Strategic Technologies Program at the Center for Strategic and International Studies in Washington. “They’re really worried about America’s capacity to innovate.”
There is a precedent for the initiative. Startled by the Soviet launch of the Sputnik satellite in 1957, President Dwight D. Eisenhower created the Advanced Research Projects Agency, or ARPA, at the Pentagon to ensure that the United States would not be blindsided by technological advances.
Now, the Pentagon has decided that the nation needs more than ARPA, renamed the Defense Advanced Research Projects Agency, or Darpa, if it is to find new technologies to maintain American military superiority.
. . .
The Pentagon focused on smaller companies during its December visit; it did not, for example, visit Google. Mr. Welby acknowledged that Silicon Valley start-ups were not likely to be focused on the Pentagon as a customer. The military has captive suppliers and a long and complex sales cycle, and it is perceived as being a small market compared with the hundreds of millions of customers for consumer electronics products.
Mr. Welby has worked for three different Darpa directors, but he said that Pentagon officials now believed they had to look beyond their own advanced technology offices.
“The Darpa culture is about trying to understand high-risk technology,” he said. “It’s about big leaps.” Today, however, the Pentagon needs to break out of what can be seen as a “not invented here” culture, he said.
“We’re thinking about what the world is going to look like in 2030 and what tools the department will need in 20 or 30 years,” he added.

For the full story, see:
JOHN MARKOFF. “Pentagon Shops in Silicon Valley for Game Changers.” The New York Times (Fri., FEB. 27, 2015): A3.
(Note: ellipsis added.)
(Note: the online version of the story has the date FEB. 26, 2015.)

Starting in the Late Middle Ages the State Tried “to Control, Delineate, and Restrict Human Thought and Action”

(p. C6) . . . transregional organizations like Viking armies or the Hanseatic League mattered more than kings and courts. It was a world, as Mr. Pye says, in which “you went where you were known, where you could do the things you wanted to do, and where someone would protect you from being jailed, hanged, or broken on the wheel for doing them.”
. . .
This is a world in which money rules, but money is increasingly an abstraction, based on insider information, on speculation (the Bourse or stock market itself is a regional invention) and on the ability to apply mathematics: What was bought or sold was increasingly the relationships between prices in different locations rather than the goods themselves.
What happened to bring this powerful, creative pattern to a close? The author credits first the reaction to the Black Death of the mid-14th century, when fear of contamination (perhaps similar to our modern fear of terrorism) justified laws that limited travel and kept people in their place. Religious and sectarian strife further limited the free flow of ideas and people, forcing people to choose one identity to the exclusion of others or else to attempt to disappear into the underground of clandestine and subversive activities. And behind both of these was the rise of the state, a modern invention that attempted to control, delineate, and restrict human thought and action.

For the full review, see:
PATRICK J. GEARY. “Lighting Up the Dark Ages.” The Wall Street Journal (Sat., May 30, 2015): C6.
(Note: ellipses added.)
(Note: the online version of the review has the date May 29, 2015.)

The book under review is:
Pye, Michael. The Edge of the World: A Cultural History of the North Sea and the Transformation of Europe. New York: Pegasus Books LLC, 2014.

More Tech Stars Skip College, at Least for a While

(p. B1) The college dropout-turned-entrepreneur is a staple of Silicon Valley mythology. Steve Jobs, Bill Gates and Mark Zuckerberg all left college.
In their day, those founders were very unusual. But a lot has changed since 2005, when Mr. Zuckerberg left Harvard. The new crop of dropouts has grown up with the Internet and smartphones. The tools to create new technology are more accessible. The cost to start a company has plunged, while the options for raising money have multiplied.
Moreover, the path isn’t as lonely.
. . .
Not long ago, dropping out of school to start a company was considered risky. For this generation, it is a badge of honor, evidence of ambition and focus. Very few dropouts become tycoons, but “failure” today often means going back to school or taking a six-figure job at a big tech company.
. . .
(p. B5) There are no hard numbers on the dropout trend, but applicants for the Thiel Fellowship tripled in the most recent year; the fellowship won’t disclose numbers.
. . .
It has tapped 82 fellows in the past five years.
“I don’t think college is always bad, but our society seems to think college is always good, for everyone, at any cost–and that is what we have to question,” says Mr. Thiel, a co-founder of PayPal and an early investor in Facebook.
Of the 43 fellows in the initial classes of 2011 and 2012, 26 didn’t return to school and continued to work on startups or independent projects. Five went to work for large tech firms, including a few through acquisitions. The remaining 12 went back to school.
Mr. Thiel says companies started by the fellows have raised $73 million, a record that he says has attracted additional applicants. He says fellows “learned far more than they would have in college.”

For the full story, see:
DAISUKE WAKABAYASHI. “College Dropouts Thrive in Tech.” The Wall Street Journal (Thurs., June 4, 2015): B1 & B10.
(Note: ellipses added. The phrase “the fellowship won’t disclose numbers” was in the online, but not the print, version of the article.)
(Note: the online version of the article has the date June 3, 2015, and has the title “College Dropouts Thrive in Tech.”)

Cultural and Institutional Differences Between Europe and U.S. Keep Europe from Having a Silicon Valley

(p. B7) “They all want a Silicon Valley,” Jacob Kirkegaard, a Danish economist and senior fellow at the Peterson Institute for International Economics, told me this week. “But none of them can match the scale and focus on the new and truly innovative technologies you have in the United States. Europe and the rest of the world are playing catch-up, to the great frustration of policy makers there.”
Petra Moser, assistant professor of economics at Stanford and its Europe Center, who was born in Germany, agreed that “Europeans are worried.”
“They’re trying to recreate Silicon Valley in places like Munich, so far with little success,” she said. “The institutional and cultural differences are still too great.”
. . .
There is . . . little or no stigma in Silicon Valley to being fired; Steve Jobs himself was forced out of Apple. “American companies allow their employees to leave and try something else,” Professor Moser said. “Then, if it works, great, the mother company acquires the start-up. If it doesn’t, they hire them back. It’s a great system. It allows people to experiment and try things. In Germany, you can’t do that. People would hold it against you. They’d see it as disloyal. It’s a very different ethic.”
Europeans are also much less receptive to the kind of truly disruptive innovation represented by a Google or a Facebook, Mr. Kirkegaard said.
He cited the example of Uber, the ride-hailing service that despite its German-sounding name is a thoroughly American upstart. Uber has been greeted in Europe like the arrival of a virus, and its reception says a lot about the power of incumbent taxi operators.
“But it goes deeper than that,” Mr. Kirkegaard said. “New Yorkers don’t get all nostalgic about yellow cabs. In London, the black cab is seen as something that makes London what it is. People like it that way. Americans tend to act in a more rational and less emotional way about the goods and services they consume, because it’s not tied up with their national and regional identities.”
. . .
With its emphasis on early testing and sorting, the educational system in Europe tends to be very rigid. “If you don’t do well at age 18, you’re out,” Professor Moser said. “That cuts out a lot of people who could do better but never get the chance. The person who does best at a test of rote memorization at age 17 may not be innovative at 23.” She added that many of Europe’s most enterprising students go to the United States to study and end up staying.
She is currently doing research into creativity. “The American education system is much more forgiving,” Professor Moser said. “Students can catch up and go on to excel.”
Even the vaunted European child-rearing, she believes, is too prescriptive. While she concedes there is as yet no hard scientific evidence to support her thesis, “European children may be better behaved, but American children may end up being more free to explore new things.”

For the full story, see:
JAMES B. STEWART. “Common Sense; A Fearless Culture Fuels Tech.” The New York Times (Fri., JUNE 19, 2015): B1 & B7.
(Note: ellipses added.)
(Note: the online version of the story has the date JUNE 18, 2015, and has the title “Common Sense; A Fearless Culture Fuels U.S. Tech Giants.”)

We Often “See” What We Expect to See

(p. 9) The Justice Department recently analyzed eight years of shootings by Philadelphia police officers. Its report contained two sobering statistics: Fifteen percent of those shot were unarmed; and in half of these cases, an officer reportedly misidentified a “nonthreatening object (e.g., a cellphone) or movement (e.g., tugging at the waistband)” as a weapon.
Many factors presumably contribute to such shootings, ranging from carelessness to unconscious bias to explicit racism, all of which have received considerable attention of late, and deservedly so.
But there is a lesser-known psychological phenomenon that might also explain some of these shootings. It’s called “affective realism”: the tendency of your feelings to influence what you see — not what you think you see, but the actual content of your perceptual experience.
. . .
The brain is a predictive organ. A majority of your brain activity consists of predictions about the world — thousands of them at a time — based on your past experience. These predictions are not deliberate prognostications like “the Red Sox will win the World Series,” but unconscious anticipations of every sight, sound and other sensation you might encounter in every instant. These neural “guesses” largely shape what you see, hear and otherwise perceive.
. . .
. . . , our lab at Northeastern University has conducted experiments to document affective realism. For example, in one study we showed an affectively neutral face to our test subjects, and using special equipment, we secretly accompanied it with a smiling or scowling face that the subjects could not consciously see. (The technique is called “continuous flash suppression.”) We found that the unseen faces influenced the subjects’ bodily activity (e.g., how fast their hearts beat) and their feelings. These in turn influenced their perceptions: In the presence of an unseen scowling face, our subjects felt unpleasant and perceived the neutral face as less likable, less trustworthy, less competent, less attractive and more likely to commit a crime than when we paired it with an unseen smiling face.
These weren’t just impressions; they were actual visual changes. The test subjects saw the neutral faces as having a more furrowed brow, a more surly mouth and so on. (Some of these findings were published in Emotion in 2012.)
. . .
. . . the brain is wired for prediction, and you predict most of the sights, sounds and other sensations in your life. You are, in large measure, the architect of your own experience.

For the full commentary, see:
Feldman Barrett, Lisa, and Jolie Wormwood. “When a Gun Is Not a Gun.” The New York Times, SundayReview Section (Sun., April 19, 2015): 9.
(Note: italics in original; ellipses added.)
(Note: the date of the online version of the commentary is APRIL 17, 2015.)

The academic article mentioned in the passage quoted above is:
Anderson, Eric, Erika Siegel, Dominique White, and Lisa Feldman Barrett. “Out of Sight but Not out of Mind: Unseen Affective Faces Influence Evaluations and Social Impressions.” Emotion 12, no. 6 (Dec. 2012): 1210-21.

Steven Johnson Is an Advocate of Collaboration in Innovation

(p. A13) Theories of innovation and entrepreneurship have always yo-yoed between two basic ideas. First, that it’s all about the single brilliant individual and his eureka moment that changes the world. Second, that it’s about networks, collaboration and context. The truth, as in all such philosophical dogfights, is somewhere in between. But that does not stop the bickering. This controversy blew up in a political context during the 2012 presidential election, when President Obama used an ill-chosen set of words (“you didn’t build that”) to suggest that government and society had a role in creating the setting for entrepreneurs to flourish, and Republicans berated him for denigrating the rugged individualists of American enterprise.
Through a series of elegant books about the history of technological innovation, Steven Johnson has become one of the most persuasive advocates for the role of collaboration in innovation. His latest, “How We Got to Now,” accompanies a PBS series on what he calls the “six innovations that made the modern world.” The six are detailed in chapters titled “Glass,” “Cold,” “Sound,” “Clean,” “Time” and “Light.” Mr. Johnson’s method is to start with a single innovation and then hopscotch through history to illuminate its vast and often unintended consequences.

For the full review, see:
PHILIP DELVES BROUGHTON. “BOOKSHELF; Unintended Consequences; Gutenberg’s printing press sparked a revolution in lens-making, which led to eyeglasses, microscopes and, yes, the selfie.” The Wall Street Journal (Tues., Sept. 30, 2014): A13.
(Note: ellipses added.)
(Note: the online version of the review has the date Sept. 29, 2014, and has the title “BOOKSHELF; Book Review: ‘How We Got to Now’ by Steven Johnson; Gutenberg’s printing press sparked a revolution in lens-making, which led to eyeglasses, microscopes and, yes, the selfie.”)

The book under review is:
Johnson, Steven. How We Got to Now: Six Innovations That Made the Modern World. New York: Riverhead Books, 2014.

Plant Breeders Use Old Sloppy “Natural” Process to Avoid Regulatory Stasis

(p. A11) What’s in a name?
A lot, if the name is genetically modified organism, or G.M.O., which many people are dead set against. But what if scientists used the precise techniques of today’s molecular biology to give back to plants genes that had long ago been bred out of them? And what if that process were called “rewilding?”
That is the idea being floated by a group at the University of Copenhagen, which is proposing the name for the process that would result if scientists took a gene or two from an ancient plant variety and melded it with more modern species to promote greater resistance to drought, for example.
“I consider this something worth discussing,” said Michael B. Palmgren, a plant biologist at the Danish university who headed a group, including scientists, ethicists and lawyers, that is funded by the university and the Danish National Research Foundation.
They pondered the problem of fragile plants in organic farming, came up with the rewilding idea, and published their proposal Thursday in the journal Trends in Plant Science.
. . .
The idea of restoring long-lost genes to plants is not new, said Julian I. Schroeder, a plant researcher at the University of California, San Diego. But, wary of the taint of genetic engineering, scientists have used traditional breeding methods to cross modern plants with ancient ones until they have the gene they want in a crop plant that needs it. The tedious process inevitably drags other genes along with the one that is targeted. But the older process is “natural,” Dr. Schroeder said.
. . .
Researchers have previously crossbred wheat plants with traits found in ancient varieties, noted Maarten Van Ginkel, who headed such a program in Mexico at the International Maize and Wheat Improvement Center.
“We selected for disease resistance, drought tolerance,” he said. “This method works but it has drawbacks. You prefer to move only the genes you want.”
When Dr. Van Ginkel crossbred for traits, he did not look for the specific genes conferring those traits. But with the flood-resistant rice plants, researchers knew exactly which gene they wanted. Nonetheless, they crossbred and did not use precision breeding to alter the plants.
Asked why not, Dr. Schroeder had a simple answer — a complex maze of regulations governing genetically engineered crops. With crossbreeding, he said, “the first varieties hit the fields in a couple of years.”
And if the researchers had used precision breeding to get the gene into the rice?
“They would still be stuck in the regulatory process,” Dr. Schroeder said.

For the full story, see:
GINA KOLATA. “A Proposal to Modify Plants Gives G.M.O. Debate New Life.” The New York Times (Fri., MAY 29, 2015): A11.
(Note: ellipses added.)
(Note: the online version of the story has the date MAY 28, 2015.)

Tesla Cars Are Built on Government Subsidies

(p. A13) Nowhere in Mr. Vance’s book, . . . , does the figure $7,500 appear–the direct taxpayer rebate to each U.S. buyer of Mr. Musk’s car. You wouldn’t know that 10% of all Model S cars have been sold in Norway–though Tesla’s own 10-K lists the possible loss of generous Norwegian tax benefits as a substantial risk to the company.
Barely developed in passing is that Tesla likely might not exist without a former State Department official whom Mr. Musk hired to explore “what types of tax credits and rebates Tesla might be able to drum up around its electric vehicles,” which eventually would include a $465 million government-backed loan.
And how Tesla came by its ex-Toyota factory in California “for free,” via a “string of fortunate turns” that allowed Tesla to float its IPO a few weeks later, is just a thing that happens in Mr. Vance’s book, not the full-bore political intrigue it actually was.
The fact is, Mr. Musk has yet to show that Tesla’s stock market value (currently $32 billion) is anything but a modest fraction of the discounted value of its expected future subsidies. In 2017, he plans to introduce his Model 3, a $35,000 car for the middle class. He expects to sell hundreds of thousands a year. Somehow we doubt he intends to make it easy for politicians to whip away the $7,500 tax credit just when somebody besides the rich can benefit from it–in which case the annual gift from taxpayers will quickly mount to several billion dollars each year.
Mother Jones, in a long piece about what Mr. Musk owes the taxpayer, suggested the wunderkind could be a “bit more grateful, a bit more humble.” Unmentioned was the shaky underpinning of this largess. Even today’s politicized climate modeling allows the possibility that climate sensitivity to carbon dioxide is far less than would justify incurring major expense to change the energy infrastructure of the world (and you certainly wouldn’t begin with luxury cars). Were this understanding to become widespread, the subliminal hum of government favoritism could overnight become Tesla’s biggest liability.

For the full commentary, see:
HOLMAN W. JENKINS, JR. “BUSINESS WORLD; The Savior Elon Musk; Tesla’s impresario is right about one thing: Humanity’s preservation is a legitimate government interest.” The Wall Street Journal (Sat., May 30, 2015): A13.
(Note: ellipsis added.)
(Note: the online version of the commentary has the date May 29, 2015.)

The book discussed in the commentary is:
Vance, Ashlee. Elon Musk: Tesla, SpaceX, and the Quest for a Fantastic Future. New York: Ecco, 2015.

The Mother Jones article discussing government subsidies for Musk’s Tesla is:
Harkinson, Josh. “Free Ride.” Mother Jones 38, no. 5 (Sept./Oct. 2013): 20-25.

Little Progress Toward Complex Autonomous Robots

(p. A8) [In June 2015] . . . , the Defense Advanced Research Projects Agency, a Pentagon research arm, . . . [held] the final competition in its Robotics Challenge in Pomona, Calif. With $2 million in prize money for the robot that performs best in a series of rescue-oriented tasks in under an hour, the event . . . offer[ed] what engineers refer to as the “ground truth” — a reality check on the state of the art in the field of mobile robotics.

A preview of their work suggests that nobody needs to worry about a Terminator creating havoc anytime soon. Given a year and a half to improve their machines, the roboticists, who shared details about their work in interviews before the contest in June, appear to have made limited progress.
. . .
“The extraordinary thing that has happened in the last five years is that we have seemed to make extraordinary progress in machine perception,” said Gill Pratt, the Darpa program manager in charge of the Robotics Challenge.
Pattern recognition hardware and software has made it possible for computers to make dramatic progress in computer vision and speech understanding. In contrast, Dr. Pratt said, little headway has been made in “cognition,” the higher-level humanlike processes required for robot planning and true autonomy. As a result, both in the Darpa contest and in the field of robotics more broadly, there has been a re-emphasis on the idea of human-machine partnerships.
“It is extremely important to remember that the Darpa Robotics Challenge is about a team of humans and machines working together,” he said. “Without the person, these machines could hardly do anything at all.”
In fact, the steep challenge in making progress toward mobile robots that can mimic human capabilities is causing robotics researchers worldwide to rethink their goals. Now, instead of trying to build completely autonomous robots, many researchers have begun to think instead of creating ensembles of humans and robots, an approach they describe as co-robots or “cloud robotics.”
Ken Goldberg, a University of California, Berkeley, roboticist, has called on the computing world to drop its obsession with singularity, the much-ballyhooed time when computers are predicted to surpass their human designers. Rather, he has proposed a concept he calls “multiplicity,” with diverse groups of humans and machines solving problems through collaboration.
For decades, artificial-intelligence researchers have noted that the simplest tasks for humans, such as reaching into a pocket to retrieve a quarter, are the most challenging for machines.
“The intuitive idea is that the more money you spend on a robot, the more autonomy you will be able to design into it,” said Rodney Brooks, an M.I.T. roboticist and co-founder of two early companies, iRobot and Rethink Robotics. “The fact is actually the opposite is true: The cheaper the robot, the more autonomy it has.”
For example, iRobot’s Roomba robot is autonomous, but the vacuuming task it performs by wandering around rooms is extremely simple. By contrast, the company’s Packbot is more expensive, designed for defusing bombs, and must be teleoperated or controlled wirelessly by people.

For the full story, see:
JOHN MARKOFF. “A Reality Check for A.I.” The New York Times (Tues., MAY 26, 2015): D2.
(Note: ellipses, and bracketed expressions, added. I corrected a misspelling of “extraordinary.”)
(Note: the online version of the story has the date MAY 25, 2015, and has the title “Relax, the Terminator Is Far Away.”)

George Bailey Wanted to Make Money, But He Wanted to Do More than Just Make Money

(p. 219) Actually, it’s not so strange. The norm for bankers was never just moneymaking, any more than it was for doctors or lawyers. Bankers made a livelihood, often quite a good one, by serving their clients – the depositors and borrowers – and the communities in which they worked. But traditionally, the aim of banking – even if sometimes honored only in the breach – was service, not just moneymaking.
In the movie It’s a Wonderful Life, James Stewart plays George Bailey, a small-town banker faced with a run on the bank – a liquidity crisis. When the townspeople rush into the bank to withdraw their money, Bailey tells them, “You’re thinking of this place all wrong. As if I had the money back in a safe. The money’s not here.” He goes on. “Your money’s in Joe’s house. Right next to yours. And in the Kennedy house, and Mrs. Backlin’s house, and a hundred others. Why, you’re lending them the money to build, and they’re going to pay you back, as best they can…. What are you going to do, foreclose on them?”
No, says George Bailey, “we’ve got to stick together. We’ve got to have faith in one another.” Fail to stick together, and the community will be ruined. Bailey took all the money he could get his hands on and gave it to his depositors to help see them through the crisis. Of course, George Bailey was interested in making money, but money was not the only point of what Bailey did.
Relying on a Hollywood script to provide evidence of good bankers is at some level absurd, but it does indicate something valuable about society’s expectations regarding the role of bankers. The norm for a “good banker” throughout most of the twentieth century was in fact someone who was trustworthy and who served the community, who was responsible to clients, and who took an interest in them.

Source:
Schwartz, Barry, and Kenneth Sharpe. Practical Wisdom: The Right Way to Do the Right Thing. New York: Riverhead Books, 2010.
(Note: italics in original.)