The Palace of Discovery: “They Came for Wonder and Hope”

PalaceOfDiscoveryParis.jpg
The Palace of Discovery (aka Palais de la Découverte) in Paris. Source of photo: http://www.flickr.com/photos/paris2e/2524827592/

Shortly before the beginning of World War II, the 1937 Palace of Discovery in Paris was a popular source of hope for the future:

(p. 206) An unexpectedly popular draw at the exposition was a relatively small hall hidden away behind the Grand Palais. The Palace of Discovery, as it was called, attracted more than 2 million visitors, five times the number that visited the modern art exhibit. They came for wonder and hope. The wonder was provided by exhibits including a huge electrostatic generator, like something from Dr. Frankenstein’s lab, two enormous metal spheres thirteen feet apart, across which a 5-million-volt current threw a hissing, crackling bolt of electricity. The hope came from the very nature of science itself. Designed by a group of liberal French researchers, the Palace of Discovery was intended to be more a “people’s university” than a stuffy museum, a place to hear inspiring lectures on the latest wonders of science, messages about technological confidence and progress for the peoples of the world.

Source:
Hager, Thomas. The Demon under the Microscope: From Battlefield Hospitals to Nazi Labs, One Doctor’s Heroic Search for the World’s First Miracle Drug. New York: Three Rivers Press, 2007.

Industrialist Duisberg Made Domagk’s Sulfa Discovery Possible

(p. 65) . . . Domagk’s future would be determined not only by his desire to stop disease but also by his own ambition, his family needs, and the plans of a small group of businessmen he had never met. He probably had heard of their leader, however, one of the preeminent figures in German business, a man the London Times would later eulogize as “the greatest industrialist the world has yet had.” His name was Carl Duisberg.

Duisberg was a German version of Thomas Edison, Henry Ford, and John D. Rockefeller rolled into one. He had built an empire of science in Germany, leveraging the discoveries of dozens of chemists he employed into one of the most profitable businesses on earth. He knew how industrial science worked: He was himself a chemist. At least he had been long ago. Now, in the mid-1920s, in the twilight of his years, his fortunes made, his reputation assured, he often walked in his private park alone—still solidly built, with his shaved head and a bristling white mustache, still a commanding presence in his top hat and black overcoat—through acres of forest, fountains, classical statuary, around the pond in his full-scale Japanese garden by the lacquered teahouse, over his streams, and across his lawns.

Source:
Hager, Thomas. The Demon under the Microscope: From Battlefield Hospitals to Nazi Labs, One Doctor’s Heroic Search for the World’s First Miracle Drug. New York: Three Rivers Press, 2007.
(Note: ellipsis added.)

“Four G’s Needed for Success: Geduld, Geschick, Glück, Geld”

One of Domagk’s predecessors, in goal and method, was Paul Ehrlich, who was a leader in the search for the Zauberkugel (magic bullet) against disease-causing organisms. He systematized the trial-and-error method, and pursued dyes as promising chemicals that might be modified to attach themselves to the intruders. But he never quite found a magic bullet:

(p. 82) Ehrlich announced to the world that he had found a cure for sleeping sickness. But he spoke too soon. Number 418, also, proved too toxic for general use. He and his chemists resumed the search.

Ehrlich said his method consisted basically of “examining and sweating”—and his coworkers joked that Ehrlich examined while they sweated. There was another motto attributed to Ehrlich’s lab, the list of “Four Gs” needed for success: Geduld, Geschick, Glück, Geld—patience, skill, luck, and money.

Source:
Hager, Thomas. The Demon under the Microscope: From Battlefield Hospitals to Nazi Labs, One Doctor’s Heroic Search for the World’s First Miracle Drug. New York: Three Rivers Press, 2007.
(Note: do not confuse the “Paul Ehrlich” discussed above with the modern environmentalist “Paul Ehrlich,” who is best known for losing his bet with Julian Simon.)

Most Scientists’ Lives Are “Like Those of Anxious Middle Managers”

(p. 64) The truth is that scientists come in all types, just like everyone else. They are people, not pop paradigms. They worry about how they are going to pay their bills, and they get envious of the researchers who got the credit they should have gotten. They compete for grants and complain when those grants are awarded to someone else. They focus on prestige and work for advancement and usually do what their bosses (or, less directly, granting agencies) say. Most scientists, as the great British molecular biologist J. D. Bernal noted back in the 1930s, live lives more like those of anxious middle managers than great visionaries.

Source:
Hager, Thomas. The Demon under the Microscope: From Battlefield Hospitals to Nazi Labs, One Doctor’s Heroic Search for the World’s First Miracle Drug. New York: Three Rivers Press, 2007.

Doctors Rejected Pasteur’s Work

Whether in science or in entrepreneurship, at the initial stages of an important new idea, the majority of experts will reject the idea. So a key for the advance of science, or for innovation in the economy, is to allow scientists and entrepreneurs to accumulate sufficient resources so that they can make informed bets based on their conjectures and on their tacit knowledge.
A few entries ago, Hager recounted how Leeuwenhoek faced initial skepticism from the experts. In the passage below, Hager recounts how Pasteur also faced initial skepticism from the experts:

(p. 44) If bacteria could rot meat, Pasteur reasoned, they could cause diseases, and he spent years proving the point. Two major problems hindered the acceptance of his work within the medical community: First, Pasteur, regardless of his ingenuity, was a brewing chemist, not a physician, so what could he possibly know about disease? And second, his work was both incomplete and imprecise. He had inferred that bacteria caused disease, but it was impossible for him to definitively prove the point. In order to prove that a type of bacterium could cause a specific disease, precisely and to the satisfaction of the scientific world, it would be necessary to isolate that one type of bacterium for study, to create a pure culture, and then test the disease-causing abilities of this pure culture.

Source:
Hager, Thomas. The Demon under the Microscope: From Battlefield Hospitals to Nazi Labs, One Doctor’s Heroic Search for the World’s First Miracle Drug. New York: Three Rivers Press, 2007.

The Benefits from the Discovery of Sulfa, the First Antibiotic

I quoted a review of The Demon Under the Microscope in an entry from October 12, 2006. I finally managed to read the book last month.
I don’t always agree with Hager’s interpretation of events, or with his policy advice, but he writes well, and he has much to say of interest about how the first anti-bacterial antibiotic, sulfa, was developed.
In the coming weeks, I’ll be highlighting a few key passages of special interest. In today’s entry, below, Hager nicely summarizes the importance of the discovery of antibiotics for his (and my) baby boom generation.

(p. 3) I am part of that great demographic bulge, the World War II “Baby Boom” generation, which was the first in history to benefit from birth from the discovery of antibiotics. The impact of this discovery is difficult to overstate. If my parents came down with an ear infection as babies, they were treated with bed rest, painkillers, and sympathy. If I came down with an ear infection as a baby, I got antibiotics. If a cold turned into bronchitis, my parents got more bed rest and anxious vigilance; I got antibiotics. People in my parents’ generation, as children, could and all too often did die from strep throats, infected cuts, scarlet fever, meningitis, pneumonia, or any number of infectious diseases. I and my classmates survived because of antibiotics. My parents as children, and their parents before them, lost friends and relatives, often at very early ages, to bacterial epidemics that swept through American cities every fall and winter, killing tens of thousands. The suddenness and inevitability of these epidemic deaths, facts of life before the 1930s, were for me historical curiosities, artifacts of another age. Antibiotics virtually eliminated them. In many cases, much-feared diseases of my grandparents’ day—erysipelas, childbed fever, cellulitis—had become so rare they were nearly extinct. I never heard the names.

Source:
Hager, Thomas. The Demon under the Microscope: From Battlefield Hospitals to Nazi Labs, One Doctor’s Heroic Search for the World’s First Miracle Drug. New York: Three Rivers Press, 2007.

Age and Inventiveness

AgeProductivityGraph.gif Source of graph: online version of the WSJ article quoted and cited below.

(p. B5) A particularly stark view of age-related constraints on researchers’ work comes from Benjamin Jones, an associate professor at Northwestern University’s Kellogg School of Management. He examined biographical data over the past century for more than 700 Nobel laureates and renowned inventors.

His conclusion: “Innovators are productive over a narrowing span of their life cycle.” In the early 20th century, he found, researchers at the times of their greatest contributions averaged slightly more than 36 years old. In recent decades, innovation before the age of 30 became increasingly rare, with the peak age of contribution rising toward age 40. Meanwhile, the frequency of key contributions has consistently diminished by researchers in their early or mid-50s.
Occasionally, Mr. Jones says, booming new fields “permit easier access to the frontier, allowing people to make contributions at younger ages.” That could account for the relative youth of Internet innovators, such as Netscape Communications Corp. founder Marc Andreessen and Messrs. Page and Brin. But “when the revolution is over,” Mr. Jones finds, “ages rise.”
Unwilling to see researchers at peak productivity for only a small part of their careers, tech companies are fighting back in a variety of ways. At microchip maker Texas Instruments Inc., in Dallas, executives are pairing up recent college graduates and other fresh research hires with experienced mentors, called “craftsmen,” for intensive training and coaching.
This system means that new design engineers can become fully effective in three or four years, instead of five to seven, says Taylor Efland, chief technologist for TI’s analog chip business. Analog chips are used in power management, data conversion and amplification.
At Sun Microsystems Inc., teams of younger and older researchers are common. That can help everyone’s productivity, says Greg Papadopoulos, chief technology officer for the Santa Clara, Calif., computer maker. Younger team members provide energy and optimism; veterans provide a savvier sense of what problems to tackle.

For the full story, see:
GEORGE ANDERS. “THEORY & PRACTICE; Companies Try to Extend Researchers’ Productivity; Teams of Various Ages, Newer Hires Combat Short Spans of Inventing.” The Wall Street Journal (Mon., AUGUST 18, 2008): B5.

A large literature exists on the relationship between age and scientific productivity. I am particularly fond of the following examples:

Diamond, Arthur M., Jr. “Age and the Acceptance of Cliometrics.” The Journal of Economic History 40, no. 4 (December 1980): 838-841.
Diamond, Arthur M., Jr. “An Economic Model of the Life-Cycle Research Productivity of Scientists.” Scientometrics 6, no. 3 (1984): 189-196.
Diamond, Arthur M., Jr. “The Life-Cycle Research Productivity of Mathematicians and Scientists.” The Journal of Gerontology 41, no. 4 (July 1986): 520-525.
Diamond, Arthur M., Jr. “An Optimal Control Model of the Life-Cycle Research Productivity of Scientists.” Scientometrics 11, nos. 3-4 (1987): 247-249.
Diamond, Arthur M., Jr. “The Polywater Episode and the Appraisal of Theories.” In A. Donovan, L. Laudan and R. Laudan, eds., Scrutinizing Science: Empirical Studies of Scientific Change. Dordrecht, Holland: Kluwer Academic Publishers, 1988, 181-198.
Hull, David L., Peter D. Tessner and Arthur M. Diamond, Jr. “Planck’s Principle: Do Younger Scientists Accept New Scientific Ideas with Greater Alacrity than Older Scientists?” Science 202 (November 17, 1978): 717-723.

Science Fiction Writers Provide More Accurate Forecasts Than Economists

Robert Fogel, quoted below, is a Nobel-Prize-winning professor of economics at the University of Chicago:

(p. 13) I think I’ve largely covered how things looked after World War II, highlighting both what now seems to have been an unjustified pessimism and also the difficulties in forecasting the future. I close with an anecdote from Simon Kuznets. He used to give a one-year course in growth economics, both at Johns Hopkins and Harvard. One of the points he made was that if you wanted to find accurate forecasts of what happened in the past, don’t look at what the economists said. The economists in 1850 wrote that the progress of the last decade had been so great that it could not possibly continue. And economists at the end of the nineteenth century wrote that the progress of the last half century had been so great that it could not possibly continue during the twentieth century. Kuznets said you would come closest to an accurate forecast if you read the writers of science fiction. But even the writers of science fiction were too pessimistic. Jules Verne recognized that we might eventually get to the moon, but he couldn’t conceive of the technology that actually made the journey possible.

I was at a 2003 conference at Rockefeller University that brought together about 30 people from different disciplines (economics, biology, chemistry, and physics, as well as some industrial leaders) who put forward their views of what was likely to happen in the new millennium. And I must say that the noneconomists were far more bullish than most of the economists I know. So I suspect if we have another MussaFest in 2024, we’ll all look back at how pessimistic we were in 2004.

Source:
Fogel, Robert W. “Reconsidering Expectations of Economic Growth after World War II from the Perspective of 2004.” IMF Staff Papers 52 (Special Issue 2005): 6-14.

Founder of Experimental Science Received Prison as His Reward

(p. 53) Where men had once said, ‘Credo ut intelligam’ (understanding can come only through belief), they now said, ‘Intelligo ut credam’ (belief can come only through understanding). In 1277, Roger Bacon was imprisoned for an indefinite period for holding these opinions. Free and rational investigation of nature was to come hard in the clash between reason and faith which would echo down to our own time.

Source:
Burke, James. The Day the Universe Changed: How Galileo’s Telescope Changed the Truth and Other Events in History That Dramatically Altered Our Understanding of the World. Back Bay Books, 1995.

Michael Crichton’s Scariest Story

CrichtonMichael2003.jpg

Michael Crichton speaking on environmentalism at the Fairmont Hotel on September 15, 2003. Source of photo: Bill Adams’ posting at http://www.pbase.com/bill_adams/image/21439440

The papers announced yesterday (11/6/08) that Michael Crichton had died of cancer a couple of days earlier (11/4/08).
I had mixed feelings about his stories. On the one hand, they seemed mainly to stir up unrealistic fears about technology, which I see as largely a benefit to humanity. On the other hand, they often involved intelligent heroes who struggled against danger, and won (or at least partly won).
Crichton’s best story may have been one of his last, State of Fear. In that book, he took on the environmental movement and showed, in a powerful appendix, how some scientists and scientific institutions have failed us by creating fear that is not grounded in the free exchange of ideas and evidence.
Crichton did not have to take on this issue—it earned him vituperative enemies, and probably lost him some readers. But in the end, he too was an intelligent hero who struggled against danger—the danger of politically correct closed minds.
Michael Crichton, Rest in Peace.

P.S.: Crichton had some scientific credentials. Here are a couple of interesting facts about his life:

(p. A27) At Harvard, after a professor criticized his writing style, the younger Mr. Crichton changed his major from English to anthropology and graduated summa cum laude in 1964. He then spent a year teaching anthropology on a fellowship at Cambridge University. In 1966 he entered Harvard Medical School and began writing on the side to help pay tuition.
. . .
In 1969, after earning his medical degree, Mr. Crichton moved to the La Jolla section of San Diego and spent a year as a postdoctoral fellow at the Salk Institute for Biological Studies. Already inclining toward a writing career, he tilted decisively with “The Andromeda Strain,” a medical thriller about a group of scientists racing against time to stop the spread of a lethal organism from outer space code-named Andromeda.

For the full obituary, see:
WILLIAM GRIMES. “Michael Crichton, Author of Thrillers, Dies at 66.” The New York Times (Thurs., November 6, 2008): A27.
(Note: ellipsis added.)

CrichtonMichaelHarvard2002.jpg Michael Crichton during an April 11, 2002 lecture at the Harvard Medical School (from which he graduated). Source of photo: http://www.hno.harvard.edu/gazette/2002/04.18/11-crichton.html