Big Increase in Costs of Adhering to Moore’s Law

(p. 219) Harald Bauer, Jan Veira, and Florian Weig consider “Moore’s Law: Repeal or Renewal?” “Moore’s law states that the number of transistors on integrated circuits doubles every two years, and for the past four decades it has set the pace for progress in the semiconductor industry. . . . Adherence to Moore’s law has led to continuously falling semiconductor prices. Per-bit prices of dynamic random-access memory chips, for example, have fallen by as much as 30 to 35 percent a year for several decades. . . . Some estimates ascribe up to 40 percent of the global productivity growth achieved during the last two decades to the expansion of information and communication technologies made possible by semiconductor performance and cost improvements.” But this continued technological progress comes at an ever-higher price. “A McKinsey analysis shows that moving from 32nm (p. 220) to 22nm nodes on 300-millimeter (mm) wafers causes typical fabrication costs to grow by roughly 40 percent. It also boosts the costs associated with process development by about 45 percent and with chip design by up to 50 percent. These dramatic increases will lead to process-development costs that exceed $1 billion for nodes below 20nm. In addition, the state-of-the-art fabs needed to produce them will likely cost $10 billion or more. As a result, the number of companies capable of financing next-generation nodes and fabs will likely dwindle.” McKinsey Global Institute, December 2013, http://www.mckinsey.com/insights/high_tech_telecoms_internet/moores_law_repeal_or_renewal.

Source:
Taylor, Timothy. “Recommendations for Further Reading.” Journal of Economic Perspectives 28, no. 2 (Spring 2014): 213-20.
(Note: ellipses in original.)
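The compounding in the excerpt above is easy to under-appreciate. A minimal sketch, assuming a steady two-year doubling of transistor counts and a 30 percent annual per-bit price decline (the low end of the quoted 30-35 percent range), shows how large the cumulative effects become:

```python
# Sketch of the compounding described in the Bauer, Veira, and Weig excerpt:
# transistor counts double every two years; DRAM per-bit prices fall
# roughly 30 percent per year (low end of the quoted range).

def transistor_multiple(years, doubling_period=2):
    """How many times transistor counts multiply over `years`."""
    return 2 ** (years / doubling_period)

def price_fraction_remaining(years, annual_decline=0.30):
    """Fraction of the original per-bit price remaining after `years`."""
    return (1 - annual_decline) ** years

# Over the four decades the excerpt describes, counts multiply about a
# million-fold (2^20); over just two decades, per-bit prices fall to well
# under a tenth of a percent of their starting level.
print(f"Transistor multiple over 40 years: {transistor_multiple(40):,.0f}x")
print(f"Price fraction after 20 years: {price_fraction_remaining(20):.2e}")
```

These are stylized assumptions, not the McKinsey analysis itself, but they make clear why even a modest-sounding annual decline accumulates into the dramatic per-bit cheapening the authors cite.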

Process Innovations, Allowed by Deregulation, Creatively Destroyed Railroads

(p. A11) In “American Railroads: Decline and Renaissance in the Twentieth Century,” transportation economists Robert E. Gallamore and John R. Meyer provide a comprehensive account of both the decline and the revival. . . . They point to excessive government regulation of railroad rates and services as the catalyst for the industry’s decay.
. . .
. . . deregulation, Mr. Gallamore and Meyer demonstrate, was a process of creative destruction. Conrail was created by the government in 1976 in a risky, last-ditch attempt to rescue Penn Central and other bankrupt Eastern railroads. It was quickly losing $1 million a day, and its plight helped make the case for the major revamp of railroad regulation that came in 1980. A wave of mergers followed, and the new companies slashed routes and employees on the way to profitability. The shrinking of the national rail system helped, too, as freight companies consolidated traffic on a smaller (and therefore cheaper) network. Freight-train crews were cut to two or three people from four or five. Cabooses were replaced by electronic gear at the end of freight trains.

For the full review, see:
DANIEL MACHALABA. “BOOKSHELF; Long Train Runnin’; Track conditions got so bad in the 1970s that stationary freight cars were falling off the rails thanks to rotting crossties.” The Wall Street Journal (Weds., July 9, 2014): A11.
(Note: ellipses added.)
(Note: the online version of the review has the date July 8, 2014, and has the title “BOOKSHELF; Book Review: ‘American Railroads’ by Robert E. Gallamore and John R. Meyer; Track conditions got so bad in the 1970s that stationary freight cars were falling off the rails thanks to rotting crossties.”)

The book under review is:
Gallamore, Robert E., and John R. Meyer. American Railroads: Decline and Renaissance in the Twentieth Century. Cambridge, MA: Harvard University Press, 2014.

Lynas Apologizes for Organizing Anti-GM (Genetic Modification) Movement

(p. 115) More than a decade and a half since the commercialization of first-generation agricultural biotechnology, concerns about transgenic crop impacts on human and environmental health remain, even though the experience across a cumulative 1.25 billion hectares suggests the relative safety of first-generation genetically engineered seed. The risks posed by agricultural biotechnology warrant continued attention, and new transgenic crops may pose different and bigger risks. Weighing against uncertain risks are benefits from increased food production, reduced insecticide use, and avoided health risks to food consumers and farm workers. At the same time, adoption is shown to increase herbicide use while reducing herbicide toxicity, save land by boosting yields while also making previously unfarmed lands profitable. Adoption benefits food consumers and farmers but also enriches seed companies that enjoy property right protections over new seed varieties. The (p. 116) balance of scientific knowledge weighs in favor of continued adoption of genetically engineered seed, which may explain why some longtime critics have reversed course. For example, Lord Melchett, who was the head of Greenpeace, has been advising biotechnology companies on overcoming constraints to the technology (St. Clair and Frank forthcoming). Mark Lynas, a journalist and organizer of the anti-GM (genetic modification) movement, publicly apologized for helping start the movement in his “Lecture to Oxford Farming Conference” (2013).
Agricultural biotechnology remains regulated by regimes developed at the introduction of the technology. Whereas precaution may have been appropriate before the relative magnitudes of risks and benefits could be empirically observed, accumulated knowledge suggests overregulation is inhibiting the introduction of new transgenic varieties. Regulation also discourages developing-country applications, where benefits are likely greatest. In the future, new genetic traits may promise greater benefits while also posing novel risks of greater magnitudes than existing traits. Efficient innovation and technology adoption will require different and, perhaps, more stringent regulation in the future, as well as continued insights from researchers, including economists, in order to assess evolving costs and benefits.

Source:
Barrows, Geoffrey, Steven Sexton, and David Zilberman. “Agricultural Biotechnology: The Promise and Prospects of Genetically Modified Crops.” Journal of Economic Perspectives 28, no. 1 (Winter 2014): 99-120.

Did Intel Succeed in Spite of, or Because of, Tension Between Noyce and Grove?

(p. C5) . . . , much more so than in earlier books on Intel and its principals, the embedded thread of “The Intel Trinity” is the dirty little secret few people outside of Intel knew: Andy Grove really didn’t like Bob Noyce.
. . .
(p. C6) . . . there’s the argument that one thing a startup needs is an inspiring, swashbuckling boss who lights up a room when he enters it and has the confidence to make anything he’s selling seem much bigger and more important than it actually is. And Mr. Malone makes a compelling case that Noyce was the right man for the job in this phase of the company. “Bob Noyce’s greatest gift, even more than his talent as a technical visionary,” Mr. Malone writes, “was his ability to inspire people to believe in his dreams, in their own abilities, and to follow him on the greatest adventure of their professional lives.”
. . .
Noyce hid from Mr. Grove, who was in charge of operations, the fact that Intel had a secret skunk works developing a microprocessor, a single general-purpose chip that would perform multiple functions–logic, calculation, memory and power control. Noyce had the man who was running it report directly to him rather than to Mr. Grove, even though Mr. Grove was his boss on the organizational chart. When Mr. Grove learned what was going on, he became furious, but like the good soldier he was, he snapped to attention and helped recruit a young engineer from Fairchild to be in charge of the project, which ultimately redefined the company.
. . .
Remarkably, none of this discord seemed to have much effect on the company’s day-to-day operations. Mr. Malone even suggests that the dysfunction empowered Intel’s take-no-prisoners warrior culture.
. . .
So while the humble, self-effacing Mr. Moore, who had his own time in the CEO’s chair from 1975 to 1987, played out his role as Intel’s big thinker, the brilliant visionary “who could see into the technological future better than anyone alive,” Mr. Grove was the kick-ass enforcer. No excuses. For anything.

For the full review, see:
STEWART PINKERTON. “Made in America; A Born Leader, a Frustrated Martinet Built One of Silicon Valley’s Giants.” The Wall Street Journal (Sat., July 19, 2014): C5-C6.
(Note: ellipses added.)
(Note: the online version of the review has the date July 18, 2014, and has the title “Book Review: ‘The Intel Trinity’ by Michael S. Malone; A born leader, an ethereal genius and a tough taskmaster built the most important company on the planet.”)

The book under review is:
Malone, Michael S. The Intel Trinity: How Robert Noyce, Gordon Moore, and Andy Grove Built the World’s Most Important Company. New York: HarperCollins Publishers, 2014.

Entrepreneur Gutenberg’s Press Creatively Destroyed the Jobs of Scribes

(p. 32) Poggio possessed . . . [a] gift that set him apart from virtually all the other book-hunting humanists. He was a superbly well-trained scribe, with exceptionally fine handwriting, great powers of concentration, and a high degree of accuracy. It is difficult for us, at this distance, to take in the significance of such qualities: our technologies for producing transcriptions, facsimiles, and copies have almost entirely erased what was once an important personal achievement. That importance began to decline, though not at all precipitously, even in Poggio’s own lifetime, for by the 1430s a German entrepreneur, Johann Gutenberg, began experimenting with a new invention, movable type, which would revolutionize the reproduction and transmission of texts. By the century’s end printers, especially the great Aldus in Venice, would print Latin texts in a typeface whose clarity and elegance remain unrivalled after five centuries. That typeface was based on the beautiful handwriting of Poggio and his humanist friends. What Poggio did by hand to produce a single copy would soon be done mechanically to produce hundreds.

Source:
Greenblatt, Stephen. The Swerve: How the World Became Modern. New York: W. W. Norton & Company, 2011.
(Note: ellipsis, and bracketed word, added.)

How Sega Came Out of Nowhere to Leapfrog Near-Monopolist Nintendo


Source of book image: http://images.eurogamer.net/2014/usgamer/original.jpg/EG11/resize/958x-1/format/jpg

(p. C10) “Console Wars” tells how Sega, an unremarkable Japanese manufacturer of games played in arcades, came out of nowhere to challenge Nintendo for dominance of the videogame world in the first half of the 1990s. Nintendo, which had revived the stagnant home videogame category a few years earlier, had something close to a monopoly in 1990 and behaved accordingly, dictating terms to game developers and treating retailers as peons. Sega, in Mr. Harris’s telling, was a disruptive force in a highly concentrated market, introducing more advanced gaming technology, toppling Nintendo from its perch and becoming the largest seller of home videogame hardware in the U.S. by late 1993.

Mr. Harris’s hero is a former Mattel executive named Tom Kalinske, who became president of Sega of America, then a small subsidiary, in 1990. Mr. Kalinske assembled a team of crack marketers who would not have gone near Sega but for his reputation and persuasiveness. Within a year and a half, according to Mr. Harris, Mr. Kalinske’s leadership, along with a new gaming system called Genesis and a marketing assist from a mascot named Sonic the Hedgehog, made Sega the U.S. market leader in videogames.
And then, after only three years at the top, Sega fell from its pedestal. Sega’s management in Japan, suffering mightily from not-invented-here syndrome, rejected Mr. Kalinske’s proposals to collaborate with Sony and Silicon Graphics on new gaming systems. Instead, over his objections, Sega pushed out its ill-conceived Saturn game console in 1995. While Saturn flopped, Sony struck gold with its PlayStation; Silicon Graphics sold its chip with amazing graphics capabilities to Nintendo; and the game, so to speak, was over.
. . .
The author admits he has taken liberties: “I have re-created the scenes in this book using the information uncovered from my interviews, facts gathered from supporting documents, and my best judgment as to what version most closely fits the historical record,” he writes. The result is more a 558-page screenplay than a credible work of nonfiction.

For the full review, see:
MARC LEVINSON. “Sonic Boom; How a no-name company took on Nintendo, tied its fate to a hyperactive hedgehog, and–briefly–won.” The Wall Street Journal (Sat., May 24, 2014): C10.
(Note: ellipsis added.)
(Note: the online version of the review has the date May 23, 2014, and has the title “Book Review: ‘Console Wars’ by Blake J. Harris; How a no-name company took on Nintendo, tied its fate to a hyperactive hedgehog, and–briefly–won.”)

The book under review is:
Harris, Blake J. Console Wars: Sega, Nintendo, and the Battle That Defined a Generation. New York: HarperCollins Publishers, 2014.

Open Source Guru Admits to “Mismatched Incentives” and “Serious Trouble Down the Road”

“Eric S. Raymond said that the code-checking system had failed in the case of Heartbleed.” Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B1) SAN FRANCISCO — The Heartbleed bug that made news last week drew attention to one of the least understood elements of the Internet: Much of the invisible backbone of websites from Google to Amazon to the Federal Bureau of Investigation was built by volunteer programmers in what is known as the open-source community.

Heartbleed originated in this community, in which these volunteers, connected over the Internet, work together to build free software, to maintain and improve it and to look for bugs. Ideally, they check one another’s work in a peer review system similar to that found in science, or at least on the nonprofit Wikipedia, where motivated volunteers regularly add new information and fix others’ mistakes.
This process, advocates say, ensures trustworthy computer code.
But since the Heartbleed flaw got through, causing fears — as yet unproved — of widespread damage, members of that world are questioning whether the system is working the way it should.
“This bug was introduced two years ago, and yet nobody took the time to notice it,” said Steven M. Bellovin, a computer science professor at Columbia University. “Everybody’s job is not anybody’s job.”
. . .
(p. B2) Unlike proprietary software, which is built and maintained by only a few employees, open-source code like OpenSSL can be vetted by programmers the world over, advocates say.
“Given enough eyeballs, all bugs are shallow” is how Eric S. Raymond, one of the elders of the open-source movement, put it in his 1997 book, “The Cathedral & the Bazaar,” a kind of manifesto for open-source philosophy.
In the case of Heartbleed, though, “there weren’t any eyeballs,” Mr. Raymond said in an interview this week.
. . .
The problem, Mr. Raymond and other open-source advocates say, boils down to mismatched incentives. Mr. Raymond said firms don’t maintain OpenSSL code because they don’t profit directly from it, even though it is integrated into their products, and governments don’t feel political pain when the code has problems.
With OpenSSL, by contrast, “for those that do work on this, there’s no financial support, no salaries, no health insurance,” Mr. Raymond said. “They either have to live like monks or work nights and weekends. That is a recipe for serious trouble down the road.”

For the full story, see:
Perlroth, Nicole. “A Contradiction at the Heart of the Web.” The New York Times (Sat., April 19, 2014): B1 & B2.
(Note: ellipses added.)
(Note: the online version of the story was updated April 18, 2014, and has the title “Heartbleed Highlights a Contradiction in the Web.”)

Raymond’s open source manifesto is:
Raymond, Eric S. The Cathedral & the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary. Sebastopol, CA: O’Reilly Media, Inc., 1999.

Forecasts of Mass Unemployment from Robots Were Wrong

(p. 215) Frank Levy and Richard J. Murnane consider the interaction between workers and machinery in “Dancing with Robots: Human Skills for Computerized Work.” “On March 22, 1964, President Lyndon Johnson received a short, alarming memorandum from the Ad Hoc Committee on the Triple Revolution. The memo warned the president of threats to the nation beginning with the likelihood that computers would soon create mass unemployment: ‘A new era of production has begun. Its principles of organization are as different from those of the industrial era as those of the industrial era were different from the agricultural. The cybernation revolution has been brought about by the combination of the computer and the automated self-regulating machine. This results in a system of almost unlimited productive capacity which requires progressively less human labor. Cybernation is already reorganizing the economic and social system to meet its own needs.’ The memo was signed by luminaries including Nobel Prize winning chemist Linus Pauling, Scientific American publisher Gerard Piel, and economist Gunnar Myrdal (a future Nobel Prize winner). Nonetheless, its warning was only half right. There was no mass unemployment–since 1964 the economy has added 74 million jobs. But computers have changed the jobs that are available, the skills those jobs require, and the wages the jobs pay. For the foreseeable future, the challenge of ‘cybernation’ is not mass unemployment but the need to educate many more young people for the jobs computers cannot do.” Third Way, 2013, http://content.thirdway.org/publications/714/Dancing-With-Robots.pdf.

Source:
Taylor, Timothy. “Recommendations for Further Reading.” Journal of Economic Perspectives 27, no. 4 (Fall 2013): 211-18.
(Note: italics in original.)

They Begged for a Chance to Help Edison Create the Future

(p. 289) He, and anyone working for him, were perceived as standing at the very outer edge of the present, where it abuts the future. When a young John Lawson sought a position at Edison’s lab and wrote in 1879 that he was “willing to do anything, dirty work–become anything, almost a slave, only give me a chance,” he spoke with a fervency familiar to applicants knocking today on the door of the hot tech company du jour. In the age of the computer, different companies at different times–for example, Apple in the early 1980s, Microsoft in the early 1990s, Google in the first decade of the twenty-first century–inherited the temporary aura that once hovered over Edison’s Menlo Park laboratory, attracting young talents who applied in impossibly large numbers, all seeking a role in the creation of the zeitgeist (and, like John Ott, at the same time open to a chance to become wealthy). The lucky ones got inside (Lawson got a position and worked on electric light).

Source:
Stross, Randall E. The Wizard of Menlo Park: How Thomas Alva Edison Invented the Modern World. New York: Crown Publishers, 2007.

French Protest Amazon, but Buy There for Low Prices

(p. B1) LONDON — On weekends, Guillaume Rosquin browses the shelves of local bookstores in Lyon, France. He enjoys peppering the staff with questions about what he should be reading next. But his visits, he says, are also a protest against the growing power of Amazon. He is bothered by the way the American online retailer treats its warehouse employees.
Still, as with millions of other Europeans, there is a limit to how much he will protest.
“It depends on the price,” said Mr. Rosquin, 49, who acknowledged that he was planning to buy a $400 BlackBerry smartphone on Amazon because the handset was not yet available on rival French websites. “If you can get something for half-price at Amazon, you may put your issues with their working conditions aside.”

For the full story, see:
MARK SCOTT. “Principles Are No Match for Europe’s Love of U.S. Web Titans.” The New York Times (Mon., July 7, 2014): B1 & B3.
(Note: the online version of the story has the date July 6, 2014.)