by Jason Crawford · August 5, 2018 · 5 min read
Lately I’ve been reading about the history of plastic, particularly via Stephen Fenichell’s Plastic: The Making of a Synthetic Century. I’ll have more to say about plastic in the future—it is an amazing and vastly underrated substance, a true wonder material—but for now I want to talk about a broader idea.
A major theme of the 19th century was the transition from plant and animal materials to synthetic versions or substitutes, mostly derived from non-organic sources. Some key examples:
Ivory. Before the invention of plastics, small objects from combs to knife handles were often made from bone or similar animal products: tortoise shells, horns and antlers, baleen from whales (called “whalebone”), and especially ivory from elephant tusks. These biomaterials were hard, smooth, lightweight, and waterproof; they didn’t rust, and were amenable to carving. Around the time of the Civil War, the game of pool became very popular in the US, leading to soaring demand for ivory for billiard balls (the New York Times warned in 1867 that elephants were becoming endangered). In 1863, the billiards firm of Phelan & Collender ran newspaper ads offering $10,000 in gold to anyone who invented a suitable substitute material. It was this challenge that drove John Wesley Hyatt to experiment until he invented celluloid, the earliest plastic. (It seems that Hyatt never collected the prize, but he did found a company to make products from celluloid, including billiard balls.)
Fertilizer. Prior to the Haber-Bosch process, fertilizer was obtained from natural sources: guano deposits on islands, or nitrate salts in the desert. By the end of the century, it was clear that the supplies would run out soon; in 1898, Sir William Crookes, president of the British Association for the Advancement of Science, called on the chemists of the world to discover a way to synthesize fertilizer in order to avoid mass famine. The solution, which Haber discovered the key to and Bosch figured out how to industrialize, was a process to create ammonia (a precursor to most fertilizers) from hydrogen in water and nitrogen in the atmosphere.
Lighting. The first big market for the oil industry, well before the invention of the internal combustion engine, was kerosene for lighting. Prior to kerosene, common lighting sources included candles made from animal fat, and lamps lit with oil from the sperm whale (which has led some writers to claim that “Rockefeller saved the whales.”)
Smelting. The iron industry in 18th-century England led to massive deforestation, as trees were felled to turn into charcoal for smelting. The solution was to convert to a mineral source, coal, which England had plenty of. (This conversion was achieved by developing a process to purify coal into coke by charring, the same way wood is purified into charcoal.)
Shellac. A secretion of the lac beetle, shellac has been used to make varnish in Asia since ancient times. The process is slow, however: it takes 15,000 lac beetles six months to produce enough resin for one pound of shellac. As long as the world only needed the substance to coat furniture, this process was sufficient. But with the advent of electricity, there was suddenly a tremendous need for insulation material, and shellac became expensive—so much so that the chemist, inventor, and businessman Leo Baekeland, already independently wealthy from his work in the photography industry, decided in the early 1900s that an artificial shellac was the number one most important problem he could work on. His solution was Bakelite, another important early plastic. Bakelite was eventually used not only for wire insulation but for countless other purposes, from the iconic Bell telephone to, again, billiard balls, replacing the inferior celluloid. (Shellac was also used for records, in which capacity it was replaced some decades later by another plastic, vinyl.)
These are just a handful of examples. There are many other biomaterials we once relied on—rubber, silk, leather and furs, straw, beeswax, wood tar, natural inks and dyes—that have been partially or fully replaced by synthetic or artificial substitutes, especially plastics, that can be derived from mineral sources. They had to be replaced, because the natural sources couldn’t keep up with rapidly increasing demand. The only way to ramp up production—the only way to escape the Malthusian trap and sustain an exponentially increasing population while actually improving everyone’s standard of living—was to find new, more abundant sources of raw materials and new, more efficient processes to create the end products we needed. As you can see from some of these examples, this drive to find substitutes was often conscious and deliberate, motivated by an explicit understanding of the looming resource crisis.
In short, plant and animal materials had become unsustainable.
The more I saw this theme, the more it seemed strange to me that today, there is a drive to return to biological sources of materials, in the name of “sustainability”. For instance, what is today referred to as sustainable or “green” plastic is made from “renewable” feedstocks, such as polylactic acid, which can be derived from corn. Similarly, biofuels are supposed to be part of the solution to the “unsustainability” of oil.
If plant and animal materials were unsustainable in the 19th century, why are they the solution to sustainability in the 21st?
The answer, I think, lies in a different concept of sustainability, based on a different vision of what exactly we want to sustain.
In the 19th century, the priority was to sustain growth and progress. For the first time in history, economic production and consequently standards of living were undergoing a long, sustained rise. The whole world appreciated this and saw the imperative to keep it going. Anything that got in the way, or even threatened to slow it down, was an obstacle to overcome, lest the world regress into famine, disease, and literal darkness—the very state that humanity had just, finally, pulled itself out of.
To my knowledge, the term “sustainability” was not used in the 19th century in the context of these problems. The term in its current sense was coined by the modern environmental movement circa 1972. It seems to represent a new and different concept: not the sustaining of growth, but simply the sustaining of a given industrial process indefinitely. It also has, perhaps, a connotation of avoiding unforeseen disasters caused by technology and industry.
What often seems left out of discussions of sustainability is the sustaining of growth and progress. Indeed, one of the goals of the environmental movement is the exact opposite: to reduce consumption (as in the common mantra “reduce, reuse, recycle”).
To my mind, any solution to sustainability that involves reducing consumption or lowering our standard of living is no solution at all. It is giving up and admitting defeat. If running out of a resource means that we have to regress back to earlier technologies, that is a failure—a failure to do what we did in the 19th century and replace unsustainable technologies with new, improved ones that can take humanity to the next level and support orders of magnitude more growth.
In the 21st century, this could mean energy from improved forms of nuclear power, or some way of harnessing energy from the Sun that is much better than today’s solar panels. It could mean breakthroughs in biotechnology: new sources of food, medicines that conquer bacterial resistance, or even biomaterials that can be manufactured at industrial scale. It could mean machine learning algorithms that optimize the global economy so that we can increase economic production much faster than we increase its mineral inputs.
But based on the history of the 19th century, I’m skeptical that it means plastic made from corn.