AdamSmith Posted November 14, 2017

Well, isn't this pleasant?

The Uninhabitable Earth
Famine, economic collapse, a sun that cooks us: What climate change could wreak — sooner than you think.
By David Wallace-Wells

I. ‘Doomsday’
Peering beyond scientific reticence.

It is, I promise, worse than you think. If your anxiety about global warming is dominated by fears of sea-level rise, you are barely scratching the surface of what terrors are possible, even within the lifetime of a teenager today. And yet the swelling seas — and the cities they will drown — have so dominated the picture of global warming, and so overwhelmed our capacity for climate panic, that they have occluded our perception of other threats, many much closer at hand. Rising oceans are bad, in fact very bad; but fleeing the coastline will not be enough. Indeed, absent a significant adjustment to how billions of humans conduct their lives, parts of the Earth will likely become close to uninhabitable, and other parts horrifically inhospitable, as soon as the end of this century.

Even when we train our eyes on climate change, we are unable to comprehend its scope. This past winter, a string of days 60 and 70 degrees warmer than normal baked the North Pole, melting the permafrost that encased Norway’s Svalbard seed vault — a global food bank nicknamed “Doomsday,” designed to ensure that our agriculture survives any catastrophe, and which appeared to have been flooded by climate change less than ten years after being built. The Doomsday vault is fine, for now: The structure has been secured and the seeds are safe. But treating the episode as a parable of impending flooding missed the more important news. Until recently, permafrost was not a major concern of climate scientists, because, as the name suggests, it was soil that stayed permanently frozen. But Arctic permafrost contains 1.8 trillion tons of carbon, more than twice as much as is currently suspended in the Earth’s atmosphere. When it thaws and is released, that carbon may evaporate as methane, which is 34 times as powerful a greenhouse-gas warming blanket as carbon dioxide when judged on the timescale of a century; when judged on the timescale of two decades, it is 86 times as powerful. In other words, we have, trapped in Arctic permafrost, twice as much carbon as is currently wrecking the atmosphere of the planet, all of it scheduled to be released at a date that keeps getting moved up, partially in the form of a gas that multiplies its warming power 86 times over.

Maybe you know that already — there are alarming stories in the news every day, like those, last month, that seemed to suggest satellite data showed the globe warming since 1998 more than twice as fast as scientists had thought (in fact, the underlying story was considerably less alarming than the headlines). Or the news from Antarctica this past May, when a crack in an ice shelf grew 11 miles in six days, then kept going; the break now has just three miles to go — by the time you read this, it may already have met the open water, where it will drop into the sea one of the biggest icebergs ever, a process known poetically as “calving.”

But no matter how well-informed you are, you are surely not alarmed enough. Over the past decades, our culture has gone apocalyptic with zombie movies and Mad Max dystopias, perhaps the collective result of displaced climate anxiety, and yet when it comes to contemplating real-world warming dangers, we suffer from an incredible failure of imagination.
The reasons for that are many: the timid language of scientific probabilities, which the climatologist James Hansen once called “scientific reticence” in a paper chastising scientists for editing their own observations so conscientiously that they failed to communicate how dire the threat really was; the fact that the country is dominated by a group of technocrats who believe any problem can be solved and an opposing culture that doesn’t even see warming as a problem worth addressing; the way that climate denialism has made scientists even more cautious in offering speculative warnings; the simple speed of change and, also, its slowness, such that we are only seeing effects now of warming from decades past; our uncertainty about uncertainty, which the climate writer Naomi Oreskes in particular has suggested stops us from preparing as though anything worse than a median outcome were even possible; the way we assume climate change will hit hardest elsewhere, not everywhere; the smallness (two degrees) and largeness (1.8 trillion tons) and abstractness (400 parts per million) of the numbers; the discomfort of considering a problem that is very difficult, if not impossible, to solve; the altogether incomprehensible scale of that problem, which amounts to the prospect of our own annihilation; simple fear. But aversion arising from fear is a form of denial, too.

In between scientific reticence and science fiction is science itself. This article is the result of dozens of interviews and exchanges with climatologists and researchers in related fields and reflects hundreds of scientific papers on the subject of climate change. What follows is not a series of predictions of what will happen — that will be determined in large part by the much-less-certain science of human response. Instead, it is a portrait of our best understanding of where the planet is heading absent aggressive action. It is unlikely that all of these warming scenarios will be fully realized, largely because the devastation along the way will shake our complacency. But those scenarios, and not the present climate, are the baseline. In fact, they are our schedule.

The present tense of climate change — the destruction we’ve already baked into our future — is horrifying enough. Most people talk as if Miami and Bangladesh still have a chance of surviving; most of the scientists I spoke with assume we’ll lose them within the century, even if we stop burning fossil fuel in the next decade.

Two degrees of warming used to be considered the threshold of catastrophe: tens of millions of climate refugees unleashed upon an unprepared world. Now two degrees is our goal, per the Paris climate accords, and experts give us only slim odds of hitting it. The U.N. Intergovernmental Panel on Climate Change issues serial reports, often called the “gold standard” of climate research; the most recent one projects us to hit four degrees of warming by the beginning of the next century, should we stay the present course. But that’s just a median projection. The upper end of the probability curve runs as high as eight degrees — and the authors still haven’t figured out how to deal with that permafrost melt. The IPCC reports also don’t fully account for the albedo effect (less ice means less reflected and more absorbed sunlight, hence more warming); more cloud cover (which traps heat); or the dieback of forests and other flora (which extract carbon from the atmosphere).
Each of these promises to accelerate warming, and the history of the planet shows that temperature can shift as much as five degrees Celsius within thirteen years. The last time the planet was even four degrees warmer, Peter Brannen points out in The Ends of the World, his new history of the planet’s major extinction events, the oceans were hundreds of feet higher.*

The Earth has experienced five mass extinctions before the one we are living through now, each so complete a slate-wiping of the evolutionary record it functioned as a resetting of the planetary clock, and many climate scientists will tell you they are the best analog for the ecological future we are diving headlong into. Unless you are a teenager, you probably read in your high-school textbooks that these extinctions were the result of asteroids. In fact, all but the one that killed the dinosaurs were caused by climate change produced by greenhouse gas. The most notorious was 252 million years ago; it began when carbon warmed the planet by five degrees, accelerated when that warming triggered the release of methane in the Arctic, and ended with 97 percent of all life on Earth dead. We are currently adding carbon to the atmosphere at a considerably faster rate; by most estimates, at least ten times faster. The rate is accelerating.

This is what Stephen Hawking had in mind when he said, this spring, that the species needs to colonize other planets in the next century to survive, and what drove Elon Musk, last month, to unveil his plans to build a Mars habitat in 40 to 100 years. These are nonspecialists, of course, and probably as inclined to irrational panic as you or I. But the many sober-minded scientists I interviewed over the past several months — the most credentialed and tenured in the field, few of them inclined to alarmism and many advisers to the IPCC who nevertheless criticize its conservatism — have quietly reached an apocalyptic conclusion, too: No plausible program of emissions reductions alone can prevent climate disaster.

Over the past few decades, the term “Anthropocene” has climbed out of academic discourse and into the popular imagination — a name given to the geologic era we live in now, and a way to signal that it is a new era, defined on the wall chart of deep history by human intervention. One problem with the term is that it implies a conquest of nature (and even echoes the biblical “dominion”). And however sanguine you might be about the proposition that we have already ravaged the natural world, which we surely have, it is another thing entirely to consider the possibility that we have only provoked it, engineering first in ignorance and then in denial a climate system that will now go to war with us for many centuries, perhaps until it destroys us. That is what Wallace Smith Broecker, the avuncular oceanographer who coined the term “global warming,” means when he calls the planet an “angry beast.” You could also go with “war machine.” Each day we arm it more.

http://nymag.com/daily/intelligencer/2017/07/climate-change-earth-too-hot-for-humans.html
AdamSmith Posted November 14, 2017

II. Heat Death
The bahraining of New York.

Humans, like all mammals, are heat engines; surviving means having to continually cool off, like panting dogs. For that, the temperature needs to be low enough for the air to act as a kind of refrigerant, drawing heat off the skin so the engine can keep pumping. At seven degrees of warming, that would become impossible for large portions of the planet’s equatorial band, and especially the tropics, where humidity adds to the problem; in the jungles of Costa Rica, for instance, where humidity routinely tops 90 percent, simply moving around outside when it’s over 105 degrees Fahrenheit would be lethal. And the effect would be fast: Within a few hours, a human body would be cooked to death from both inside and out.

Climate-change skeptics point out that the planet has warmed and cooled many times before, but the climate window that has allowed for human life is very narrow, even by the standards of planetary history. At 11 or 12 degrees of warming, more than half the world’s population, as distributed today, would die of direct heat. Things almost certainly won’t get that hot this century, though models of unabated emissions do bring us that far eventually. This century, and especially in the tropics, the pain points will pinch much more quickly even than an increase of seven degrees.

The key factor is something called wet-bulb temperature, which is a term of measurement as home-laboratory-kit as it sounds: the heat registered on a thermometer wrapped in a damp sock as it’s swung around in the air (since the moisture evaporates from a sock more quickly in dry air, this single number reflects both heat and humidity). At present, most regions reach a wet-bulb maximum of 26 or 27 degrees Celsius; the true red line for habitability is 35 degrees. What is called heat stress comes much sooner.

Actually, we’re about there already. Since 1980, the planet has experienced a 50-fold increase in the number of places experiencing dangerous or extreme heat; a bigger increase is to come. The five warmest summers in Europe since 1500 have all occurred since 2002, and soon, the IPCC warns, simply being outdoors that time of year will be unhealthy for much of the globe. Even if we meet the Paris goals of two degrees warming, cities like Karachi and Kolkata will become close to uninhabitable, annually encountering deadly heat waves like those that crippled them in 2015. At four degrees, the deadly European heat wave of 2003, which killed as many as 2,000 people a day, will be a normal summer. At six, according to an assessment focused only on effects within the U.S. from the National Oceanic and Atmospheric Administration, summer labor of any kind would become impossible in the lower Mississippi Valley, and everybody in the country east of the Rockies would be under more heat stress than anyone, anywhere, in the world today.
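To put rough numbers on the wet-bulb passage above: here is a minimal sketch using Stull's 2011 empirical approximation, which estimates wet-bulb temperature from air temperature and relative humidity alone. The formula is not from the article, and operational heat-stress indices are more elaborate; it is only accurate to about a degree, but it shows how humidity drags ordinary air temperatures toward the 35-degree red line.

```python
import math

def wet_bulb_stull(t_c: float, rh_pct: float) -> float:
    """Wet-bulb temperature (deg C) from air temperature and relative
    humidity, via Stull's 2011 empirical fit (valid roughly for
    RH 5-99% and T -20 to 50 deg C; good to about a degree)."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

# The article's reference points: most regions today peak near a
# wet-bulb of 26-27 deg C; 35 deg C is the red line for habitability.
for t, rh in [(35, 50), (40, 50), (40, 75), (46, 50)]:
    tw = wet_bulb_stull(t, rh)
    note = "  <- past the 35 C line" if tw >= 35 else ""
    print(f"T={t} C at RH={rh}% -> wet-bulb ~{tw:.1f} C{note}")
```

At 50 percent humidity, even 46 degrees Celsius of air temperature crosses the line in this approximation; at 75 percent humidity, 40 degrees is already enough.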
As Joseph Romm has put it in his authoritative primer Climate Change: What Everyone Needs to Know, heat stress in New York City would exceed that of present-day Bahrain, one of the planet’s hottest spots, and the temperature in Bahrain “would induce hyperthermia in even sleeping humans.” The high-end IPCC estimate, remember, is two degrees warmer still. By the end of the century, the World Bank has estimated, the coolest months in tropical South America, Africa, and the Pacific are likely to be warmer than the warmest months at the end of the 20th century.

Air-conditioning can help but will ultimately only add to the carbon problem; plus, the climate-controlled malls of the Arab emirates aside, it is not remotely plausible to wholesale air-condition all the hottest parts of the world, many of them also the poorest. And indeed, the crisis will be most dramatic across the Middle East and Persian Gulf, where in 2015 the heat index registered temperatures as high as 163 degrees Fahrenheit. As soon as several decades from now, the hajj will become physically impossible for the 2 million Muslims who make the pilgrimage each year.

It is not just the hajj, and it is not just Mecca; heat is already killing us. In the sugarcane region of El Salvador, as much as one-fifth of the population has chronic kidney disease, including over a quarter of the men, the presumed result of dehydration from working the fields they were able to comfortably harvest as recently as two decades ago. With dialysis, which is expensive, those with kidney failure can expect to live five years; without it, life expectancy is in the weeks. Of course, heat stress promises to pummel us in places other than our kidneys, too. As I type that sentence, in the California desert in mid-June, it is 121 degrees outside my door. It is not a record high.
AdamSmith Posted November 14, 2017

III. The End of Food
Praying for cornfields in the tundra.

Climates differ and plants vary, but the basic rule for staple cereal crops grown at optimal temperature is that for every degree of warming, yields decline by 10 percent. Some estimates run as high as 15 or even 17 percent. Which means that if the planet is five degrees warmer at the end of the century, we may have as many as 50 percent more people to feed and 50 percent less grain to give them. And proteins are worse: It takes 16 calories of grain to produce just a single calorie of hamburger meat, butchered from a cow that spent its life polluting the climate with methane farts.

Pollyannaish plant physiologists will point out that the cereal-crop math applies only to those regions already at peak growing temperature, and they are right — theoretically, a warmer climate will make it easier to grow corn in Greenland. But as the pathbreaking work by Rosamond Naylor and David Battisti has shown, the tropics are already too hot to efficiently grow grain, and those places where grain is produced today are already at optimal growing temperature — which means even a small warming will push them down the slope of declining productivity. And you can’t easily move croplands north a few hundred miles, because yields in places like remote Canada and Russia are limited by the quality of soil there; it takes many centuries for the planet to produce optimally fertile dirt.

Drought might be an even bigger problem than heat, with some of the world’s most arable land turning quickly to desert. Precipitation is notoriously hard to model, yet predictions for later this century are basically unanimous: unprecedented droughts nearly everywhere food is today produced. By 2080, without dramatic reductions in emissions, southern Europe will be in permanent extreme drought, much worse than the American dust bowl ever was. The same will be true in Iraq and Syria and much of the rest of the Middle East; some of the most densely populated parts of Australia, Africa, and South America; and the breadbasket regions of China. None of these places, which today supply much of the world’s food, will be reliable sources of any.

As for the original dust bowl: The droughts in the American plains and Southwest would not just be worse than in the 1930s, a 2015 NASA study predicted, but worse than any droughts in a thousand years — and that includes those that struck between 1100 and 1300, which “dried up all the rivers East of the Sierra Nevada mountains” and may have been responsible for the death of the Anasazi civilization.

Remember, we do not live in a world without hunger as it is. Far from it: Most estimates put the number of undernourished at 800 million globally. In case you haven’t heard, this spring has already brought an unprecedented quadruple famine to Africa and the Middle East; the U.N. has warned that separate starvation events in Somalia, South Sudan, Nigeria, and Yemen could kill 20 million this year alone.
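The cereal arithmetic at the top of this section is simple enough to lay out explicitly. A sketch, assuming the article's linear ten-percent-per-degree rule for regions already at optimal growing temperature; the compounded reading is shown for contrast, and neither is a crop model:

```python
# Back-of-envelope for the article's cereal math. Not a crop model:
# the 10%-per-degree figure applies only to regions already at optimal
# growing temperature, as the article notes.
PER_DEGREE_LOSS = 0.10   # the article's basic rule (estimates run to 0.15-0.17)
WARMING = 5              # degrees by century's end in the article's scenario

linear_left = 1 - PER_DEGREE_LOSS * WARMING        # the article's framing
compound_left = (1 - PER_DEGREE_LOSS) ** WARMING   # multiplicative reading

print(f"linear:   {linear_left:.0%} of today's yield remains")    # 50%
print(f"compound: {compound_left:.0%} of today's yield remains")  # ~59%

# Fold in the article's "as many as 50 percent more people to feed":
population_factor = 1.5
print(f"grain per person: {linear_left / population_factor:.0%} of today's")  # ~33%
```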
AdamSmith Posted November 14, 2017

IV. Climate Plagues
What happens when the bubonic ice melts?

Rock, in the right spot, is a record of planetary history, eras as long as millions of years flattened by the forces of geological time into strata with amplitudes of just inches, or just an inch, or even less. Ice works that way, too, as a climate ledger, but it is also frozen history, some of which can be reanimated when unfrozen. There are now, trapped in Arctic ice, diseases that have not circulated in the air for millions of years — in some cases, since before humans were around to encounter them. Which means our immune systems would have no idea how to fight back when those prehistoric plagues emerge from the ice.

The Arctic also stores terrifying bugs from more recent times. In Alaska, already, researchers have discovered remnants of the 1918 flu that infected as many as 500 million and killed as many as 100 million — about 5 percent of the world’s population and almost six times as many as had died in the world war for which the pandemic served as a kind of gruesome capstone. As the BBC reported in May, scientists suspect smallpox and the bubonic plague are trapped in Siberian ice, too — an abridged history of devastating human sickness, left out like egg salad in the Arctic sun.

Experts caution that many of these organisms won’t actually survive the thaw and point to the fastidious lab conditions under which they have already reanimated several of them — the 32,000-year-old “extremophile” bacteria revived in 2005, an 8 million-year-old bug brought back to life in 2007, the 3.5 million–year–old one a Russian scientist self-injected just out of curiosity — to suggest that those are necessary conditions for the return of such ancient plagues. But already last year, a boy was killed and 20 others infected by anthrax released when retreating permafrost exposed the frozen carcass of a reindeer killed by the bacteria at least 75 years earlier; 2,000 present-day reindeer were infected, too, carrying and spreading the disease beyond the tundra.

What concerns epidemiologists more than ancient diseases are existing scourges relocated, rewired, or even re-evolved by warming. The first effect is geographical. Before the early-modern period, when adventuring sailboats accelerated the mixing of peoples and their bugs, human provinciality was a guard against pandemic. Today, even with globalization and the enormous intermingling of human populations, our ecosystems are mostly stable, and this functions as another limit, but global warming will scramble those ecosystems and help disease trespass those limits as surely as Cortés did. You don’t worry much about dengue or malaria if you are living in Maine or France. But as the tropics creep northward and mosquitoes migrate with them, you will. You didn’t much worry about Zika a couple of years ago, either.

As it happens, Zika may also be a good model of the second worrying effect — disease mutation. One reason you hadn’t heard about Zika until recently is that it had been trapped in Uganda; another is that it did not, until recently, appear to cause birth defects. Scientists still don’t entirely understand what happened, or what they missed. But there are things we do know for sure about how climate affects some diseases: Malaria, for instance, thrives in hotter regions not just because the mosquitoes that carry it do, too, but because for every degree increase in temperature, the parasite reproduces ten times faster.
Which is one reason that the World Bank estimates that by 2050, 5.2 billion people will be reckoning with it.
AdamSmith Posted November 14, 2017

V. Unbreathable Air
A rolling death smog that suffocates millions.

Our lungs need oxygen, but that is only a fraction of what we breathe. The fraction of carbon dioxide is growing: It just crossed 400 parts per million, and high-end estimates extrapolating from current trends suggest it will hit 1,000 ppm by 2100. At that concentration, compared to the air we breathe now, human cognitive ability declines by 21 percent. Other stuff in the hotter air is even scarier, with small increases in pollution capable of shortening life spans by ten years. The warmer the planet gets, the more ozone forms, and by mid-century, Americans will likely suffer a 70 percent increase in unhealthy ozone smog, the National Center for Atmospheric Research has projected. By 2090, as many as 2 billion people globally will be breathing air above the WHO “safe” level; one paper last month showed that, among other effects, a pregnant mother’s exposure to ozone raises the child’s risk of autism (as much as tenfold, combined with other environmental factors). Which does make you think again about the autism epidemic in West Hollywood.

Already, more than 10,000 people die each day from the small particles emitted from fossil-fuel burning; each year, 339,000 people die from wildfire smoke, in part because climate change has extended forest-fire season (in the U.S., it’s increased by 78 days since 1970). By 2050, according to the U.S. Forest Service, wildfires will be twice as destructive as they are today; in some places, the area burned could grow fivefold. What worries people even more is the effect that would have on emissions, especially when the fires ravage forests arising out of peat. Peatland fires in Indonesia in 1997, for instance, added to the global CO2 release by up to 40 percent, and more burning only means more warming only means more burning. There is also the terrifying possibility that rain forests like the Amazon, which in 2010 suffered its second “hundred-year drought” in the space of five years, could dry out enough to become vulnerable to these kinds of devastating, rolling forest fires — which would not only expel enormous amounts of carbon into the atmosphere but also shrink the size of the forest. That is especially bad because the Amazon alone provides 20 percent of our oxygen.

Then there are the more familiar forms of pollution. In 2013, melting Arctic ice remodeled Asian weather patterns, depriving industrial China of the natural ventilation systems it had come to depend on, which blanketed much of the country’s north in an unbreathable smog. Literally unbreathable. A metric called the Air Quality Index categorizes the risks and tops out at the 301-to-500 range, warning of “serious aggravation of heart or lung disease and premature mortality in persons with cardiopulmonary disease and the elderly” and, for all others, “serious risk of respiratory effects”; at that level, “everyone should avoid all outdoor exertion.” The Chinese “airpocalypse” of 2013 peaked at what would have been an Air Quality Index of over 800. That year, smog was responsible for a third of all deaths in the country.
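For readers wondering how a reading of "over 800" can come off a scale that tops out at 500: the index is a piecewise-linear interpolation between concentration breakpoints. The sketch below uses the U.S. EPA's PM2.5 breakpoints of this era as an assumed stand-in (China's official index uses different breakpoints but the same mechanics); extreme readings come from running the top segment past its endpoint.

```python
# How an AQI value is computed: linear interpolation between breakpoint
# pairs. Breakpoints are the U.S. EPA's for 24-hour PM2.5 (ug/m3) as
# they stood around this era; an assumption for illustration only.
PM25 = [  # (conc_lo, conc_hi, aqi_lo, aqi_hi)
    (0.0, 12.0, 0, 50), (12.1, 35.4, 51, 100), (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200), (150.5, 250.4, 201, 300),
    (250.5, 350.4, 301, 400), (350.5, 500.4, 401, 500),
]

def aqi_pm25(conc: float) -> float:
    for c_lo, c_hi, i_lo, i_hi in PM25:
        if conc <= c_hi:
            return (i_hi - i_lo) / (c_hi - c_lo) * (conc - c_lo) + i_lo
    # The official scale stops at 500; "over 800" readings come from
    # extrapolating the top segment past its endpoint.
    c_lo, c_hi, i_lo, i_hi = PM25[-1]
    return (i_hi - i_lo) / (c_hi - c_lo) * (conc - c_lo) + i_lo

for conc in (35, 150, 500, 900):
    print(f"PM2.5 {conc:>3} ug/m3 -> AQI ~{aqi_pm25(conc):.0f}")
```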
AdamSmith Posted November 14, 2017

VI. Perpetual War
The violence baked into heat.

Climatologists are very careful when talking about Syria. They want you to know that while climate change did produce a drought that contributed to civil war, it is not exactly fair to say that the conflict is the result of warming; next door, for instance, Lebanon suffered the same crop failures. But researchers like Marshall Burke and Solomon Hsiang have managed to quantify some of the non-obvious relationships between temperature and violence: For every half-degree of warming, they say, societies will see between a 10 and 20 percent increase in the likelihood of armed conflict. In climate science, nothing is simple, but the arithmetic is harrowing: A planet five degrees warmer would have at least half again as many wars as we do today. Overall, social conflict could more than double this century.

This is one reason that, as nearly every climate scientist I spoke to pointed out, the U.S. military is obsessed with climate change: The drowning of all American Navy bases by sea-level rise is trouble enough, but being the world’s policeman is quite a bit harder when the crime rate doubles.

Of course, it’s not just Syria where climate has contributed to conflict. Some speculate that the elevated level of strife across the Middle East over the past generation reflects the pressures of global warming — a hypothesis all the more cruel considering that warming began accelerating when the industrialized world extracted and then burned the region’s oil.

What accounts for the relationship between climate and conflict? Some of it comes down to agriculture and economics; a lot has to do with forced migration, already at a record high, with at least 65 million displaced people wandering the planet right now. But there is also the simple fact of individual irritability. Heat increases municipal crime rates, and swearing on social media, and the likelihood that a major-league pitcher, coming to the mound after his teammate has been hit by a pitch, will hit an opposing batter in retaliation. And the arrival of air-conditioning in the developed world, in the middle of the past century, did little to solve the problem of the summer crime wave.
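The Burke and Hsiang figures above leave open how the half-degree steps combine over five degrees of warming. A sketch of both readings, using only the 10-to-20-percent-per-half-degree range the article cites; whether the steps add or compound is this sketch's assumption, not the researchers':

```python
# The article cites a 10-20% rise in the likelihood of armed conflict
# per half-degree of warming. Five degrees is ten half-degree steps;
# how the steps combine is an assumption of this sketch.
STEPS = 10  # five degrees of warming

for per_step in (0.10, 0.20):
    additive = 1 + per_step * STEPS
    compounded = (1 + per_step) ** STEPS
    print(f"{per_step:.0%} per half-degree: "
          f"additive x{additive:.1f}, compounded x{compounded:.2f}")

# Either way, the multiplier clears the article's conservative floor of
# "at least half again as many wars" (x1.5) well before five degrees.
```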
AdamSmith Posted November 14, 2017

VII. Permanent Economic Collapse
Dismal capitalism in a half-poorer world.

The murmuring mantra of global neoliberalism, which prevailed between the end of the Cold War and the onset of the Great Recession, is that economic growth would save us from anything and everything. But in the aftermath of the 2008 crash, a growing number of historians studying what they call “fossil capitalism” have begun to suggest that the entire history of swift economic growth, which began somewhat suddenly in the 18th century, is not the result of innovation or trade or the dynamics of global capitalism but simply our discovery of fossil fuels and all their raw power — a onetime injection of new “value” into a system that had previously been characterized by global subsistence living. Before fossil fuels, nobody lived better than their parents or grandparents or ancestors from 500 years before, except in the immediate aftermath of a great plague like the Black Death, which allowed the lucky survivors to gobble up the resources liberated by mass graves. After we’ve burned all the fossil fuels, these scholars suggest, perhaps we will return to a “steady state” global economy. Of course, that onetime injection has a devastating long-term cost: climate change.

The most exciting research on the economics of warming has also come from Hsiang and his colleagues, who are not historians of fossil capitalism but who offer some very bleak analysis of their own: Every degree Celsius of warming costs, on average, 1.2 percent of GDP (an enormous number, considering we count growth in the low single digits as “strong”). This is the sterling work in the field, and their median projection is for a 23 percent loss in per capita earning globally by the end of this century (resulting from changes in agriculture, crime, storms, energy, mortality, and labor). Tracing the shape of the probability curve is even scarier: There is a 12 percent chance that climate change will reduce global output by more than 50 percent by 2100, they say, and a 51 percent chance that it lowers per capita GDP by 20 percent or more by then, unless emissions decline. By comparison, the Great Recession lowered global GDP by about 6 percent, in a onetime shock; Hsiang and his colleagues estimate a one-in-eight chance of an ongoing and irreversible effect by the end of the century that is eight times worse.

The scale of that economic devastation is hard to comprehend, but you can start by imagining what the world would look like today with an economy half as big, which would produce only half as much value, generating only half as much to offer the workers of the world. It makes the grounding of flights out of heat-stricken Phoenix last month seem like pathetically small economic potatoes. And, among other things, it makes the idea of postponing government action on reducing emissions and relying solely on growth and technology to solve the problem an absurd business calculation. Every round-trip ticket on flights from New York to London, keep in mind, costs the Arctic three more square meters of ice.
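The economics in this section reduces to arithmetic worth seeing laid out. A sketch, assuming the flat 1.2-percent-of-GDP-per-degree average as the article states it; the underlying research fits nonlinear damage functions, so this straight line is only the article's summary:

```python
# The headline numbers the article reports, laid side by side.
# A sketch of the arithmetic only, not the studies' method.
PER_DEGREE = 0.012          # ~1.2% of GDP lost per degree Celsius

for warming in (2, 4, 8):   # Paris goal, IPCC median path, high-end tail
    print(f"{warming} deg C -> ~{PER_DEGREE * warming:.1%} of GDP, at the flat rate")

# The 23% median loss by 2100 is a separate, cumulative projection and
# does not follow from multiplying 1.2% by a temperature; effects on
# growth compound year over year. Scale check the article offers: the
# Great Recession was a one-time ~6% hit to global GDP; the
# 12%-probability tail here is an ongoing loss of more than 50%,
# roughly eight times as deep.
```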
AdamSmith Posted November 14, 2017

VIII. Poisoned Oceans
Sulfide burps off the skeleton coast.

That the sea will become a killer is a given. Barring a radical reduction of emissions, we will see at least four feet of sea-level rise and possibly ten by the end of the century. A third of the world’s major cities are on the coast, not to mention its power plants, ports, navy bases, farmlands, fisheries, river deltas, marshlands, and rice-paddy empires, and even those above ten feet will flood much more easily, and much more regularly, if the water gets that high. At least 600 million people live within ten meters of sea level today. But the drowning of those homelands is just the start.

At present, more than a third of the world’s carbon is sucked up by the oceans — thank God, or else we’d have that much more warming already. But the result is what’s called “ocean acidification,” which, on its own, may add half a degree to warming this century. It is also already burning through the planet’s water basins — you may remember these as the place where life arose in the first place. You have probably heard of “coral bleaching” — that is, coral dying — which is very bad news, because reefs support as much as a quarter of all marine life and supply food for half a billion people. Ocean acidification will fry fish populations directly, too, though scientists aren’t yet sure how to predict the effects on the stuff we haul out of the ocean to eat; they do know that in acid waters, oysters and mussels will struggle to grow their shells, and that when the pH of human blood drops as much as the oceans’ pH has over the past generation, it induces seizures, comas, and sudden death.

That isn’t all that ocean acidification can do. Carbon absorption can initiate a feedback loop in which underoxygenated waters breed different kinds of microbes that turn the water still more “anoxic,” first in deep ocean “dead zones,” then gradually up toward the surface. There, the small fish die out, unable to breathe, which means oxygen-eating bacteria thrive, and the feedback loop doubles back. This process, in which dead zones grow like cancers, choking off marine life and wiping out fisheries, is already quite advanced in parts of the Gulf of Mexico and just off Namibia, where hydrogen sulfide is bubbling out of the sea along a thousand-mile stretch of land known as the “Skeleton Coast.” The name originally referred to the detritus of the whaling industry, but today it’s more apt than ever.

Hydrogen sulfide is so toxic that evolution has trained us to recognize the tiniest, safest traces of it, which is why our noses are so exquisitely skilled at registering flatulence. Hydrogen sulfide is also the thing that finally did us in that time 97 percent of all life on Earth died, once all the feedback loops had been triggered and the circulating jet streams of a warmed ocean ground to a halt — it’s the planet’s preferred gas for a natural holocaust. Gradually, the ocean’s dead zones spread, killing off marine species that had dominated the oceans for hundreds of millions of years, and the gas the inert waters gave off into the atmosphere poisoned everything on land. Plants, too. It was millions of years before the oceans recovered.
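One step the blood comparison above skips is that pH is logarithmic, so small-looking drops are large changes in acidity. A sketch; the 0.1-unit figure for the surface-ocean decline since preindustrial times is a commonly cited estimate, not a number from the article:

```python
# pH is a base-10 logarithmic scale of hydrogen-ion concentration, so a
# drop of x units means 10**x times the acidity. The 0.1 figure below
# is the commonly cited surface-ocean decline since preindustrial times
# (an outside estimate; the article gives no number).
delta_ph = 0.1
h_ion_ratio = 10 ** delta_ph  # relative rise in hydrogen-ion concentration
print(f"a {delta_ph} pH drop = {h_ion_ratio - 1:.0%} more acidic")  # ~26%
```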
AdamSmith Posted November 14, 2017

IX. The Great Filter
Our present eeriness cannot last.

So why can’t we see it? In his recent book-length essay The Great Derangement, the Indian novelist Amitav Ghosh wonders why global warming and natural disaster haven’t become major subjects of contemporary fiction — why we don’t seem able to imagine climate catastrophe, and why we haven’t yet had a spate of novels in the genre he basically imagines into half-existence and names “the environmental uncanny.” “Consider, for example, the stories that congeal around questions like, ‘Where were you when the Berlin Wall fell?’ or ‘Where were you on 9/11?’ ” he writes. “Will it ever be possible to ask, in the same vein, ‘Where were you at 400 ppm?’ or ‘Where were you when the Larsen B ice shelf broke up?’ ” His answer: Probably not, because the dilemmas and dramas of climate change are simply incompatible with the kinds of stories we tell ourselves about ourselves, especially in novels, which tend to emphasize the journey of an individual conscience rather than the poisonous miasma of social fate.

Surely this blindness will not last — the world we are about to inhabit will not permit it. In a six-degree-warmer world, the Earth’s ecosystem will boil with so many natural disasters that we will just start calling them “weather”: a constant swarm of out-of-control typhoons and tornadoes and floods and droughts, the planet assaulted regularly with climate events that not so long ago destroyed whole civilizations. The strongest hurricanes will come more often, and we’ll have to invent new categories with which to describe them; tornadoes will grow longer and wider and strike much more frequently, and hail rocks will quadruple in size. Humans used to watch the weather to prophesy the future; going forward, we will see in its wrath the vengeance of the past.

Early naturalists talked often about “deep time” — the perception they had, contemplating the grandeur of this valley or that rock basin, of the profound slowness of nature. What lies in store for us is more like what the Victorian anthropologists identified as “dreamtime,” or “everywhen”: the semi-mythical experience, described by Aboriginal Australians, of encountering, in the present moment, an out-of-time past, when ancestors, heroes, and demigods crowded an epic stage. You can find it already watching footage of an iceberg collapsing into the sea — a feeling of history happening all at once. It is.

Many people perceive climate change as a sort of moral and economic debt, accumulated since the beginning of the Industrial Revolution and now come due after several centuries — a helpful perspective, in a way, since it is the carbon-burning processes that began in 18th-century England that lit the fuse of everything that followed. But more than half of the carbon humanity has exhaled into the atmosphere in its entire history has been emitted in just the past three decades; since the end of World War II, the figure is 85 percent. Which means that, in the length of a single generation, global warming has brought us to the brink of planetary catastrophe, and that the story of the industrial world’s kamikaze mission is also the story of a single lifetime.
My father’s, for instance: born in 1938, among his first memories the news of Pearl Harbor and the mythic Air Force of the propaganda films that followed, films that doubled as advertisements for imperial-American industrial might; and among his last memories the coverage of the desperate signing of the Paris climate accords on cable news, ten weeks before he died of lung cancer last July. Or my mother’s: born in 1945, to German Jews fleeing the smokestacks through which their relatives were incinerated, now enjoying her 72nd year in an American commodity paradise, a paradise supported by the supply chains of an industrialized developing world. She has been smoking for 57 of those years, unfiltered.

Or the scientists’. Some of the men who first identified a changing climate (and given the generation, those who became famous were men) are still alive; a few are even still working. Wally Broecker is 84 years old and drives to work at the Lamont-Doherty Earth Observatory across the Hudson every day from the Upper West Side. Like most of those who first raised the alarm, he believes that no amount of emissions reduction alone can meaningfully help avoid disaster. Instead, he puts his faith in carbon capture — untested technology to extract carbon dioxide from the atmosphere, which Broecker estimates will cost at least several trillion dollars — and various forms of “geoengineering,” the catchall name for a variety of moon-shot technologies far-fetched enough that many climate scientists prefer to regard them as dreams, or nightmares, from science fiction. He is especially focused on what’s called the aerosol approach — dispersing so much sulfur dioxide into the atmosphere that when it converts to sulfuric acid, it will cloud a fifth of the horizon and reflect back 2 percent of the sun’s rays, buying the planet at least a little wiggle room, heat-wise. “Of course, that would make our sunsets very red, would bleach the sky, would make more acid rain,” he says. “But you have to look at the magnitude of the problem. You got to watch that you don’t say the giant problem shouldn’t be solved because the solution causes some smaller problems.” He won’t be around to see that, he told me. “But in your lifetime …”

Jim Hansen is another member of this godfather generation. Born in 1941, he became a climatologist at the University of Iowa, developed the groundbreaking “Zero Model” for projecting climate change, and later became the head of climate research at NASA, only to leave under pressure when, while still a federal employee, he filed a lawsuit against the federal government charging inaction on warming (along the way he got arrested a few times for protesting, too). The lawsuit, which is brought by a collective called Our Children’s Trust and is often described as “kids versus climate change,” is built on an appeal to the equal-protection clause, namely, that in failing to take action on warming, the government is violating it by imposing massive costs on future generations; it is scheduled to be heard this winter in Oregon district court. Hansen has recently given up on solving the climate problem with a carbon tax alone, which had been his preferred approach, and has set about calculating the total cost of the additional measure of extracting carbon from the atmosphere.
Hansen began his career studying Venus, which was once a very Earth-like planet with plenty of life-supporting water before runaway climate change rapidly transformed it into an arid and uninhabitable sphere enveloped in an unbreathable gas; he switched to studying our planet by 30, wondering why he should be squinting across the solar system to explore rapid environmental change when he could see it all around him on the planet he was standing on. “When we wrote our first paper on this, in 1981,” he told me, “I remember saying to one of my co-authors, ‘This is going to be very interesting. Sometime during our careers, we’re going to see these things beginning to happen.’ ”

Several of the scientists I spoke with proposed global warming as the solution to Fermi’s famous paradox, which asks, If the universe is so big, then why haven’t we encountered any other intelligent life in it? The answer, they suggested, is that the natural life span of a civilization may be only several thousand years, and the life span of an industrial civilization perhaps only several hundred. In a universe that is many billions of years old, with star systems separated as much by time as by space, civilizations might emerge and develop and burn themselves up simply too fast to ever find one another.

Peter Ward, a charismatic paleontologist among those responsible for discovering that the planet’s mass extinctions were caused by greenhouse gas, calls this the “Great Filter”: “Civilizations rise, but there’s an environmental filter that causes them to die off again and disappear fairly quickly,” he told me. “If you look at planet Earth, the filtering we’ve had in the past has been in these mass extinctions.” The mass extinction we are now living through has only just begun; so much more dying is coming.

And yet, improbably, Ward is an optimist. So are Broecker and Hansen and many of the other scientists I spoke to. We have not developed much of a religion of meaning around climate change that might comfort us, or give us purpose, in the face of possible annihilation. But climate scientists have a strange kind of faith: We will find a way to forestall radical warming, they say, because we must. It is not easy to know how much to be reassured by that bleak certainty, and how much to wonder whether it is another form of delusion; for global warming to work as parable, of course, someone needs to survive to tell the story.

The scientists know that to even meet the Paris goals, by 2050, carbon emissions from energy and industry, which are still rising, will have to fall by half each decade; emissions from land use (deforestation, cow farts, etc.) will have to zero out; and we will need to have invented technologies to extract, annually, twice as much carbon from the atmosphere as the entire planet’s plants now do. Nevertheless, by and large, the scientists have an enormous confidence in the ingenuity of humans — a confidence perhaps bolstered by their appreciation for climate change, which is, after all, a human invention, too. They point to the Apollo project, the hole in the ozone we patched in the 1980s, the passing of the fear of mutually assured destruction. Now we’ve found a way to engineer our own doomsday, and surely we will find a way to engineer our way out of it, one way or another.
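The Fermi-paradox argument a few paragraphs above is usually framed with the Drake equation, which multiplies a chain of factors by L, the average lifetime of a detectable civilization. In the sketch below, every input is an invented placeholder rather than anything from the article or the scientists quoted; the point is only that the expected number of civilizations scales linearly with L, which is the knob the Great Filter turns.

```python
# Drake's equation: N = R* * fp * ne * fl * fi * fc * L. All inputs
# here are invented placeholders for illustration; only the structure
# matters, since N is directly proportional to L.
def drake(r_star=1.0, f_p=0.5, n_e=0.4, f_l=0.5, f_i=0.1, f_c=0.1, l_years=500):
    return r_star * f_p * n_e * f_l * f_i * f_c * l_years

for l_years in (500, 10_000, 1_000_000):
    print(f"L = {l_years:>9} years -> ~{drake(l_years=l_years):g} "
          "civilizations detectable at once")
```

With a several-hundred-year L, the sketch's expected count drops below one, which is the scientists' grim reading: a universe full of filters looks empty.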
The planet is not used to being provoked like this, and climate systems designed to give feedback over centuries or millennia prevent us — even those who may be watching closely — from fully imagining the damage done already to the planet. But when we do truly see the world we’ve made, they say, we will also find a way to make it livable. For them, the alternative is simply unimaginable.

*This article appears in the July 10, 2017, issue of New York Magazine.

*This article has been updated to provide context for the recent news reports about revisions to a satellite data set, to more accurately reflect the rate of warming during the Paleocene–Eocene Thermal Maximum, to clarify a reference to Peter Brannen’s The Ends of the World, and to make clear that James Hansen still supports a carbon-tax based approach to emissions.
Latbear4blk Posted November 14, 2017

Perfect reading to feed the Thanksgiving Spirit. We should be thankful: we have the chance to understand human nature better than any generation before. We have been aware for a long time now of our insignificance as individuals, or even as a civilization or a culture; now we are confronted with our insignificance as a species. Nature is about to erase humankind; no one may be left to keep a record of our existence. Life and the Universe will continue without the human infestation. The End of Days will trigger a lot of insanity (should I say "is triggering"?), but it will also inspire a lot of wisdom and awareness.

Fortunately, there is Sex.
AdamSmith Posted November 15, 2017

When Did the End Begin?
A scientific debate that’s oddly amusing to entertain: At what point, exactly, did mankind irrevocably put the Earth on the road to ruin?
By Robert Sullivan

A while back, I got invited by an artist friend to her loft for a Sunday-afternoon discussion she was hosting on the Anthropocene. I RSVP’d “yes,” enthusiastically, even though I wasn’t precisely sure what the term meant. The definition I’d sort of assumed — the age in which mankind had managed to overwhelm the world — had come to me almost through osmosis, from having encountered the term in journals, magazines, song lyrics. It was like a song that had been playing on the radio so often that I could sing along even if I didn’t really know the words. But I knew it was basically about the coming doomsday, and that it was all our fault.

By Sunday, the notion of preparing for a discussion was causing me a kind of dread, in part owing to my dread-inclined nature, in part to the reading list: everything from “Art for the Anthropocene Era,” in Art in America last winter, to “Agency at the Time of the Anthropocene,” in New Literary History, which begins with a kind of hands-in-the-air-and-surrender sentiment: “How can we absorb the odd novelty of the headline: ‘The amount of CO2 in the air is the highest it has been for more than 2.5 million years — the threshold of 400 ppm of CO2, the main agent of global warming, is going to be crossed this year’?” How can we indeed? Elizabeth Kolbert’s work on extinctions was on the list, as was Bill McKibben’s The End of Nature, a book I read in 1989, at which point I distinctly remember predicting it would change everyone’s thinking on global warming, which it did and, terrifyingly, didn’t at all. And now the Anthropocene was everywhere, like a marketing campaign, or taxes. “I have a term that I’ve been throwing out occasionally,” Erle Ellis, a global ecologist at the University of Maryland, told me. “It’s the Zeitangst.”

I went to my friend’s salon — a social scientist presented some interesting research about changing human behavior, followed by likewise interesting dialogue and great cake — and I left determined to better understand the definition of the Anthropocene, only in so doing I found myself nearly lost, swallowed up in the tar pit of debate.

Theory 2: Was it all the way back when humans first learned to control fire?

The Anthropocene does not, in the strictest sense, exist. By the standards of the International Commission on Stratigraphy (ICS), the administrators of the geologic time scale — that old-school conceptual ruler notched with eons, eras, periods, epochs, and ages stretching back to what we refer to as the Earth’s beginning, about four and a half billion years ago — we are living in the epoch called the Holocene. The proposition, however, is that this is no longer true — that we are now in a new epoch, one defined by humanity’s significant impact on the Earth’s ecosystems. This has come, for obvious reasons, to be freighted with political portent. Academics, particularly scientific ones, are excited by this notion. But they are not satisfied with the term as a metaphor; they seek the authority of geologists, the scientists historically interested in marking vast tracts of time, humanity’s official timekeepers.
And so, to learn more, I Skyped with Jan Zalasiewicz, a delightful paleogeologist at the University of Leicester and the chair of the International Commission on Stratigraphy’s Anthropocene Working Group. “Nottingham Castle is built on Sherwood sandstone,” he noted, listing the local geologic sites, “which has cut into it Britain’s oldest pub, from the time of the Crusades. If you ever visit Nottingham, you must have a pint!”

I asked Zalasiewicz when he’d first heard of the Anthropocene. He cited a 2002 paper in Nature written by Paul Crutzen, a Nobel Prize–winning chemist and a prominent practitioner of Earth-system science, which is the study of the interactions between the atmosphere and other spheres (bio-, litho-, etc.). “I read it and I thought, How interesting. What a nice idea. And then, like a lot of geologists, I forgot about it — until the word began to appear in the literature, as if it was formal, without irony, without inverted commas.”

In Nature, Crutzen urged the scientific community to formally adopt what he named the Anthropocene (anthro from the Greek anthrópos, meaning “human being”) and to mark its beginning at the start of the Industrial Revolution. The evidence he cited is too depressing to recount in full here: The human population has increased tenfold in the past 300 years; species are dying; most freshwater is being sucked up by humans; not to mention the man-induced changes in the chemical composition of the atmosphere — essentially, all the facts the world is ignoring, avoiding, or paying people to obfuscate.

Crutzen’s proposal barely registered in Zalasiewicz’s field. “It was a geologic concept launched by an atmospheric chemist within an Earth-science-systems context,” he explained. (Translation: An NHL player suggested an NBA rule.) But by the time it came up at the Geological Society of London in 2008, the ICS was in the strange position of debating a term that had already been accepted not just by laypeople but by other scientists — especially the Earth-system scientists who first trumpeted global warming. Anthropocene has proved wildly appealing. For laypeople, it’s big and futuristic and implies a science-induced bad ending; for climate-change scientists, it marks all their hard work in the lucidly solid and enduring traditions of geology. Everybody likes that it is a simple, epic, ostensibly scientific stake in the Earth that says “We’re fucked.”

“The geologists, as ever, were very late to catch up with this,” Zalasiewicz told me. “We grind slowly.” Though the commission agreed at that meeting that the term had “merit,” for it to be fully ratified, its proponents would have to play by geologists’ rules. Chief among them would be to pinpoint exactly when we left the Holocene age and the end began.

Theory 3: Or was it when we invented farming, destroying much plant life?

Academic papers, from myriad sources, have since poured in, in a situation reminiscent of the cataclysmic flood that suddenly overran the Pacific Northwest 15,000 years ago, on the day that glacial Lake Missoula’s ice dams gave way. The aim is to identify an acceptable “golden spike” — traditionally a mark, visible in the Earth, where a geologist can observe a change in the past. But to be accepted as a spike, formal criteria must be met. The marker must be geosynchronous, i.e., align with similar markers around the world from the same time. It must be technical and practical, something scientists in the field can find in the rocks and sediments.
This golden-spike search has felt backward to many geologists, who historically have first found a change in the rocks and then named it, rather than observing a change and then scouring the rocks for a marker. If you are pushing for an Anthropocene golden spike at 10,000 to 50,000 years ago, then you are endorsing a spike at the extinction of the megafauna, the massive creatures like saber-toothed tigers and the Coelodonta, a Eurasian woolly rhinoceros that was likely hunted away by humans. This is a spike favored by some archaeologists whose long view is geologic in itself and admirable in its clarity: Humanity’s impact, they argue, starts near the beginning of humanity. (The moment humans controlled fire is another spike candidate.)

However, for many stratigraphers, these early proposals fall short. There is scant geosynchronous evidence, plus the larger question of whether hunting mammoths is really what we’re talking about when we’re talking about a man-doomed world. So then maybe it’s not the beginning of humanity itself but the moment man began to farm, a development that destroyed plants and thus caused other extinctions. At that same time, many forests were converted to grazing lands, often via controlled, carbon-releasing fires. Votes for a farming spike might come from anthropologists and ecologists. Still, while it might be possible to track the fossil pollen from domesticated plants, farming takes off unevenly, beginning 11,000 years ago in Southwest Asia but not until 4,000 or 5,000 years ago in Africa and North America.

Theory 4: Or maybe it was when Columbus discovered the New World, marking the beginning of globalization.

This March, two scientists from University College London offered a new candidate. Writing in Nature, Simon Lewis, an ecologist, and Mark Maslin, a geographer, proposed the arrival of Europeans in the Caribbean in 1492 — what the authors of the paper called “the largest human population replacement in the past 13,000 years, the first global trade networks linking Europe, China, Africa and the Americas, and the resultant mixing of previously separate biotas, known as the Columbian Exchange.” Counterintuitively, this spike correlates with a decrease in carbon in the atmosphere that began in 1610, caused by the destruction of North American agricultural civilizations (likely from disease spread by humans) and the return of carbon-sucking woodlands. Lewis and Maslin have named this dip the Orbis spike (Orbis for “world”) because, they write, after 1492, “humans on the two hemispheres were connected, trade became global, and some prominent social scientists refer to this time as the beginning of the modern ‘world-system.’ ”

And yet the betting is that the Orbis spike will not make the cut among the geologists: The dip in carbon dioxide is small and observable only in remote ice, just for starters. In the office of Paige West, a professor in the Columbia-Barnard anthropology department, she and two colleagues — J. C. Salyer, a sociology professor at Barnard; and Jerry Jacka, an anthropologist from the University of Colorado at Boulder — introduced me to yet another theory of when we condemned our planet to death. The three of them are submitting a paper to Nature this month that nominates the arrival of capitalism.
That moment — they peg it to roughly 1800, when, they argue, the ideology began to push economic growth above all — has a similar logic to Crutzen’s proposal of the Industrial Revolution but is even less tied to observable phenomena (like the remnants of coal ash). The argument seems purely philosophical, except it’s not. “Capitalism fundamentally changed what people find to be valuable in society and in terms of the natural world,” West told me. Jacka added that it “allowed people to do things to their environment in ways that would have never been possible before.” Their point is that humans don’t have to act the way we do now — and the minute we started, we launched a new age. “It’s a choice,” Salyer said. “But it’s not a choice as in ‘Oh, I as an individual would like to live differently.’ Because it’s a social system.”

And then there are the proposals for extremely contemporary spikes, like ones that occurred within the lifetimes of some of the proponents. They mostly coalesce around what’s referred to by Earth-system scientists as the Great Acceleration — the huge increase in population, economic activity, resource use, transport, communication, and chemical production that kicked in around World War II. Think of our overaccessorized lives, or think of what Peter Haff, a geology professor at Duke, refers to as the “technosphere,” a system that includes all those things that we have created but lost control of: highways and power systems and communications lines. The alterations to the Earth are easy to identify — coal ash, lead and other metals, plastics everywhere. One candidate is simpler still, and its mark can be found in tree rings and, potentially, in underwater sediments: nuclear-weapon detonations.

Theory 5: Perhaps it was when modern capitalism taught us to pillage the Earth.

A lot of stratigraphers find this whole exercise useless — an attempt by well-meaning activists to hijack the language of geology and then justify their work by inventing arguably nonexistent spikes. “What’s the stratigraphic record of the Anthropocene?” asked Stan Finney, a geologist at California State University at Long Beach. “There’s nothing. Or it’s so minimal as to be inconsequential.” He is not, he insisted, saying humans have not affected the Earth. Au contraire. “I live in L.A., and I’m disgusted by the spread of the city,” he said. “And sure, humans have impacted the Earth. There’s no debate. The debate is over what we do in the ICS and what’s the nature of our units and the concept. And that’s not really a debate.” Marking the Anthropocene at the dawn of the Industrial Revolution or farming or at the sailing of the Niña, Pinta, and Santa María would be, to his mind, a ridiculous privileging of human history over rock history.

Erle Ellis, the global ecologist, is a member of Zalasiewicz’s working group, and though he likes the term Anthropocene, his relationship with it has evolved. He started out certain that stratigraphers should accept it. He worked on papers looking at agriculture. Then, given the vagaries of marking soil, he semi-reluctantly settled on the nuclear spike, though even that has proved more difficult to track than expected. His most recent paper is subtitled “Is a formally designated ‘Anthropocene’ a good idea?” Now his proposal is to keep debating and not to decide. How do we even know yet what the Anthropocene looks like? Some animal species have declined in the human-impacted world, while others, like wolves, have done well.
“The story of the Anthropocene is almost too complicated — and too important — for the Anthropocene to be formalized,” he said.

The Anthropocene Working Group hopes to present its findings at the International Geological Congress in 2016, and that fact alone is an evolution worth spiking, a change in science’s ecology. (It took more than 50 years for the Holocene to be formalized.) Zalasiewicz told me that opinions are coalescing. “Not unanimously, but we tend to be coming round to the mid–20th century,” he said, since evidence of man-made environmental destruction is so easy to find then. In the meantime, the argument is not merely over the spikes, or even the formal adoption of a new epoch; it’s over what the arrival of the term means for science and everyone else. “There’s basically a tidal wave of change coming down,” Haff told me, “and I think it’s incumbent upon the intellectual and professional community to try to respond in some positive way.”

Theory 6: Or when mass production started to produce mass trash.

For geologists, this means figuring out how to bring the field up to speed without bastardizing it. For many Earth-system scientists and their sympathizers, it means using the Anthropocene to scare the crap out of the world (for a good reason, obviously, but still); they talk about our being at a tipping point and say we should prepare for the worst. Others find the term empowering. Last summer, Andrew Revkin, a journalist and AWG member who has scrupulously documented the Anthropocene conversation (and can be said to have coined the idea ten years before Crutzen), used the phrase “‘good’ Anthropocene” to suggest that the epoch might not be all gloom. He was swiftly attacked; the ethicist Clive Hamilton called the phrase a “failure of courage, courage to face the facts,” and Elizabeth Kolbert tweeted that good and Anthropocene were “two words that probably should not be used in sequence.”

And yet maybe they should. Personally, I like aiming for a good Anthropocene better than the alternative, and it seems to me that you can be in awe of all that humans have ruined without demanding that everyone despair. As Ellis said to me, “We’re in it for better or worse; I’m in it for better.”

Over the last year, Stephanie Wakefield, a geographer at CUNY Queens, has been helping organize gatherings at a space in an old building in Ridgewood called Woodbine, where friends and neighbors attend lectures, neighborhood meetings, gardening events, you name it. Wakefield herself has given a few talks on the history of apocalyptic thinking in America, a topic that tends to numb people into inaction. To her, the term Anthropocene is galvanizing, like a wake that goes well: One epoch has died to be replaced by something else, and we all have a stake in what that new world will be. “People talk for hours,” she says of the public discussions. “There’s a new vibe to it. Everyone gets to the heart of it really fast.” A few months ago, a guest speaker Skyped in from England: Jan Zalasiewicz. The people of Ridgewood had lots of questions for him. “Jan said something really interesting that I had not thought of before,” Wakefield said, “which is that we all see that the popularity of the Anthropocene allows for all these new things — for people to meet across barriers, to talk about revolutionary things outside of the political sphere. How are we going to remake life?
Let’s figure it out.”

Zalasiewicz explained why he thinks it’s important for geologists to recognize the shift in some way. “It gives it that legitimacy,” Wakefield said of Zalasiewicz’s talk. “If it’s just a phrase, then it’s like, well, there are a lot of phrases.”

*This article appears in the June 15, 2015, issue of New York Magazine.
http://nymag.com/scienceofus/2015/06/anthropocene-debate.html
Guest Larstrup Posted November 19, 2017

Will Elon Musk’s Solar Roof Tiles Power the Future?

While we wait for Elon Musk to build his human colony on Mars to escape global warming, his latest product — Tesla’s Solar Roof — aims to help protect the planet in a far more practical and cost-effective way. Musk, never one to shy away from grand statements, says the tiles are positioned to revolutionize how we power and design our homes. Unlike the bulky, unsightly panels on most solar roofs, these are sleek and stylish, designed for a clean, streamlined look. Moreover, the tiles come with an “infinity warranty.” Let’s explore this latest invention from our favorite visionary entrepreneur and mad scientist — starting with some background on how Musk got to this point.

Let’s Get Started