The Precipice

The Book in 3 Sentences

We undervalue the benefit we could provide to future people over the long term, especially by reducing existential risk—the ways we might permanently set back or end human civilization.

We are living in a risky era—The Precipice—in which Ord estimates a roughly one-in-six chance of existential catastrophe this century.

Our potential positive impact on the future is enormous: we have the opportunity to safeguard and improve humanity's future, just as the thousands of generations before us did for ours.

Impressions

Great, concise writing. It's not beautiful, but Ord gets every point across clearly. These problems are hard to relate to because of their distance from us across time and space, yet Ord frames them well enough to make the risks feel tangible and serious.

My Top 3 Quotes

The greatest risks are caused by human action, and they can be addressed by human action.

If we think of one million years in terms of a single, eighty-year life, then today humanity would be in its adolescence—sixteen years old; just coming into our power; just old enough to get ourselves in serious trouble.

Environmentalism burst in upon the global political scene less than twenty years before I was born, and yet I was raised in a milieu where it was one of the main parts of our moral education; where the earlier disregard for the environment had become unthinkable to my generation. This can happen again.

Who Should Read It?

If you are into effective altruism, especially if you are feeling a bit skeptical about longtermism or simply curious about it.

If you are interested in the future of humanity, or ethics.

How the Book Changed Me

A much deeper appreciation for the almost absurd scale of the future and the possible trillions of future people. The most helpful framing for me was that nearly everything I have in this life is due to past generations' hard work, and that we have the same opportunity to pass this on.

A greater ethical appreciation of the moral value of future people. It often feels like we should help the people suffering right now, and although it's not as intuitive or exciting, there are many cases where it could be more beneficial to improve things for future generations in general.

Other Great Quotes

Protection from existential risk is an intergenerational global public good. So even the entire population of the globe acting in concert could be expected to undervalue existential risks by a very large factor, leaving them greatly neglected.

Five big risks: those around nuclear war, climate change, other environmental damage, engineered pandemics and unaligned AI.


My Ideas from the Book

Humanity is in its adolescence

Future people are disenfranchised

Future people may be the easiest to help

The Agricultural Revolution may have been bad for humanity

The world is getting better in nearly every measurable way

Humanity has come close to losing it all

We can pay our efforts forward to future generations, just as past generations did for us. This could be our duty.

Protection from existential risk is an intergenerational global public good

We can't appreciate exponential growth

Control comes from the threat of force and not the use of it

AI safety is important even if the risk is unlikely and speculative

There are ways to guard against unknown risks

The repugnant conclusion seems true

Notes

All Quotes

What is clear is that we came precariously close to a nuclear strike on the blockading fleet—a strike which would most likely have resulted in nuclear retaliation, then escalation to a full-scale nuclear war (the only kind the US had plans for). -- Toby Ord, The Precipice, loc. 158-60

The greatest risks are caused by human action, and they can be addressed by human action. -- Toby Ord, The Precipice, loc. 167-68

I came to realize that the risks to humanity’s future are just as real and just as urgent—yet even more neglected. And that the people of the future may be even more powerless to protect themselves from the risks we impose than the dispossessed of our own time. -- Toby Ord, The Precipice, loc. 201-3

humanity’s attention is scarce and precious, and must not be wasted on flawed narratives or ideas. -- Toby Ord, The Precipice, loc. 210-12

In ecological terms, it is not a human that is remarkable, but humanity. -- Toby Ord, The Precipice, loc. 253-54

Life in the centuries following the Agricultural Revolution generally involved more work, reduced nutrition and increased disease. -- Toby Ord, The Precipice, loc. 321-22

It is shocking then to look further back and see that prior to the Industrial Revolution 19 out of 20 people lived on less than two dollars a day (even adjusting for inflation and purchasing power). -- Toby Ord, The Precipice, loc. 337-38

On what principle is it that, when we see nothing but improvement behind us, we are to expect nothing but deterioration before us? -- Toby Ord, The Precipice, loc. 375-76

If we think of one million years in terms of a single, eighty-year life, then today humanity would be in its adolescence—sixteen years old; just coming into our power; just old enough to get ourselves in serious trouble. -- Toby Ord, The Precipice, loc. 390-92
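
My note: the arithmetic here checks out: Homo sapiens is roughly 200,000 years old (a figure Ord uses elsewhere in the book), which is a fifth of a million years, and a fifth of an eighty-year life is sixteen years.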

And six years later, a single thermonuclear bomb held more energy than every explosive used in the entire course of the Second World War. -- Toby Ord, The Precipice, loc. 424-25

Shortly after, Kennedy told a close adviser that he thought the probability of it ending in nuclear war with the USSR was “somewhere between one out of three, and even.” -- Toby Ord, The Precipice, loc. 481-83

the Cuban Missile Crisis would remain one of the pivotal moments in 200,000 years of human history: perhaps the closest we have ever come to losing it all. -- Toby Ord, The Precipice, loc. 493-94

The problem is not so much an excess of technology as a lack of wisdom. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 30, loc. 508

To risk destroying this future, for the sake of some advantage limited only to the present, seems to me profoundly parochial and dangerously short-sighted. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 44, loc. 720

But the possibility of preventable existential risks in our lifetimes shows that there are issues where our actions can have sustained positive effects over the whole longterm future, and where we are the only generation in a position to produce those effects. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 45, loc. 738

And so this knowledge will be unfolded only through long successive ages. There will come a time when our descendants will be amazed that we did not know things that are so plain to them… -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 50, loc. 807

And they smiled, telling me that this wasn’t how it worked—that one doesn’t repay one’s parents. One passes it on. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 50, loc. 819

On this view, our duties to future generations may thus be grounded in the work our ancestors did for us when we were future generations. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 51, loc. 822

So long as we find the case for safeguarding our future quite plausible, it would be extremely reckless to neglect it. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 56, loc. 915

But worse, protection from existential risk is a global public good—one where the pool of beneficiaries spans the globe. This means that even nation states will neglect it. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 58, loc. 953

Protection from existential risk is an intergenerational global public good. So even the entire population of the globe acting in concert could be expected to undervalue existential risks by a very large factor, leaving them greatly neglected. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 59, loc. 967

If only seeing is believing, we will step blindly over the precipice. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 61, loc. 991

Environmentalism burst in upon the global political scene less than twenty years before I was born, and yet I was raised in a milieu where it was one of the main parts of our moral education; where the earlier disregard for the environment had become unthinkable to my generation. This can happen again. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 63, loc. 1031

A willingness to think seriously about imprecise probabilities of unprecedented events is crucial to grappling with risks to humanity’s future. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 73, loc. 1158

Even the much smaller eruption of Indonesia’s Mount Tambora in 1815 (less than a hundredth the size) caused a global cooling of 1°C, with places as far away as the United States suffering crop failure and June snows in what became known as the “year without a summer.” -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 74, loc. 1171

For almost our entire history we have been subject to risks to which we were completely oblivious. And there is no reason to think that the flurry of discovery has finished. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 81, loc. 1277

The scientists and military appear to have assumed full responsibility for an act that threatened all life on Earth. Was this a responsibility that was theirs to assume? -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 93, loc. 1447

It took thirteen years and $500 million to produce the full DNA sequence of the human genome. Just 15 years later, a genome can be sequenced for under $1,000 or within a single hour. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 133, loc. 2102

Perhaps the best example is the Aum Shinrikyo cult in Japan, active between 1984 and 1995, which sought to bring about the destruction of humanity. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 134, loc. 2111

Or dangerous ideas, like the published techniques for how to resurrect smallpox and 1918 flu from these genomes (undermining all prior attempts to restrict physical access to them). Once released, this information spreads as far as any virus, and is as resistant to eradication. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 137, loc. 2162

For even when the overwhelming majority of scientists think the danger outweighs the benefit, it takes just one overly optimistic estimate to lead to the information being released. Contrary to good scientific practice, the community’s decision is being determined by a single outlier. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 138, loc. 2170

AlphaZero: a neural network–based system that learned to play chess from scratch. It went from novice to grand master in just four hours. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 140, loc. 2211

As the reigning world champion, Ke Jie, put it: “After humanity spent thousands of years improving our tactics, computers tell us that humans are completely wrong… I would go as far as to say not a single human has touched the edge of the truth of Go.” -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 140, loc. 2219

An intelligent agent would also resist attempts to change its reward function to something more aligned with human values—for it can predict that this would lead it to get less of what it currently sees as rewarding. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 145, loc. 2290

Without robotic manipulators, the system can only produce words, pictures and sounds. But a moment’s reflection shows that these are exactly what is needed to take control. For the most damaging people in history have not been the strongest. Hitler, Stalin and Genghis Khan achieved their absolute control over large parts of the world by using words to convince millions of others to win the requisite physical contests. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 146, loc. 2308

Yet a speculative case that there is a large risk can be more important than a robust case for a very low-probability risk, such as that posed by asteroids. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 149, loc. 2350

The main points of those who downplay the risks are that (1) we likely have decades left before AI matches or exceeds human abilities, and (2) attempting to immediately regulate research in AI would be a great mistake. Yet neither of these points is actually contested by those who counsel caution. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 149, loc. 2361

In the words of Demis Hassabis, co-founder of DeepMind: “We need to use the downtime, when things are calm, to prepare for when things get serious in the decades to come. The time we have now is valuable, and we need to make use of it.” -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 152, loc. 2415

Even though the most plausible moral views have a lot of agreement about which small changes to the world are good and which are bad, they tend to come strongly apart in their recommendations about what an optimal world would look like. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 157, loc. 2480

But the more that rigorous and rational debate is encouraged, the more truth contributes to memetic success. So encouraging a culture of such debate may be one way we can now help avoid this fate. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 158, loc. 2499

We might adopt the guiding principle of minimizing lock-in. Or to avoid the double negative, of preserving our options. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 158, loc. 2502

We have proof that some form of this is possible within our very own cells, where atomically precise machinery already performs their essential functions. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 158, loc. 2510

I sometimes think about this landscape in terms of five big risks: those around nuclear war, climate change, other environmental damage, engineered pandemics and unaligned AI. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 169, loc. 2642

To get the right answer, we need to note that the importance of eliminating a risk lies in the amount of total risk that would disappear were that risk to be eliminated. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 175, loc. 2727

Just because existential risk declines as some other goal is pursued doesn’t mean that the other goal is the most effective way to secure our future. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 180, loc. 2812

For some risks might be easier to address. For example, we might be able to reduce a smaller risk from 5 percent to 1 percent with the same resources that would be needed to reduce a larger, but more stubborn, risk from 20 percent to 19 percent. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 180, loc. 2823

One such approach is to note that the more a problem is important, tractable or neglected, the more cost-effective it is to work on it, and thus the higher its priority. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 181, loc. 2832

Let’s consider three more heuristics for setting our priorities: focusing on risks that are soon, sudden and sharp. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 182, loc. 2858

Early action is best for changing course. If we are headed the wrong way, it is better to correct this at an early stage. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 185, loc. 2897

In short, early action is higher leverage, but more easily wasted. It has more power, but less accuracy. If we do act far in advance of a threat, we should do so in ways that take advantage of this leverage, while being robust to near-sightedness. This often means a focus on knowledge and capacity building, over direct work. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 185, loc. 2903

Many risks, such as disease, war, tyranny and permanently locking in bad values are correlated across different planets: if they affect one, they are somewhat more likely to affect the others too. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 194, loc. 3037

As Carl Sagan memorably put it: “Theories that involve the end of the world are not amenable to experimental verification—or at least, not more than once.” -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 197, loc. 3089

So if a disaster does occur, it is much more likely to be because there was an estimation mistake and the real risk was higher, rather than because a one in a trillion event occurred. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 198, loc. 3110

So while I think a more mature world would indeed restrain its growth in destructive capability to a level where it was adequately managed, I don’t see much value in advocating for this at the moment. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 207, loc. 3245

there is substantially more transition risk than state risk (in large part because there is more anthropogenic risk). This suggests that rushing our overall technological progress is not warranted. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 209, loc. 3288

Cajoling them about why it is more important than a cause they hold dear is even worse. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 213, loc. 3342

Don’t be tribal. Safeguarding our future is not left or right, not eastern or western, not owned by the rich or the poor. It is not partisan. Framing it as a political issue on one side of a contentious divide would be a disaster. Everyone has a stake in our future and we must work together to protect it. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 213, loc. 3342

Don’t act without integrity. When something immensely important is at stake and others are dragging their feet, people feel licensed to do whatever it takes to succeed. We must never give in to such temptation. A single person acting without integrity could stain the whole cause and damage everything we hope to achieve. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 213, loc. 3345

If you work in computer science or programming, you might be able to shift your career toward helping address the existential risk arising from AI: perhaps through much-needed technical research on AI alignment, or by working as an engineer for an AI project that takes the risks seriously. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 214, loc. 3358

I started out studying computer science, before moving into ethics. Then within ethics, the focus of my work has shifted only recently from the issues of global poverty to the very different issues around existential risk. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 215, loc. 3375

Around the same time, we saw the breakthroughs in agricultural science now known as the Green Revolution, which saw hundreds of millions of people lifted from hunger through the creation of high-yield varieties of staple crops. Norman Borlaug, the scientist who led these efforts, would win the Nobel Peace Prize in 1970 for his work. Borlaug’s work, and the roll-out of these technologies in the developing world, was funded by private philanthropists. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 216, loc. 3385

Within 200 million years, the steady gravitational pull of the Moon will slow the Earth’s rotation, and stretch our day to 25 hours. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 221, loc. 3467

Even if our spacecraft traveled at just 1 percent of the speed of light, and took 1,000 years to establish a new settlement, the entire galaxy could be settled within 100 million years—long before the Earth becomes uninhabitable. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 231, loc. 3604
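
My note: a rough sanity check on the arithmetic (my own figures, not from the book): the Milky Way is about 100,000 light years across, so at 1 percent of light speed the travel time alone is around 10 million years; even allowing for thousands of 1,000-year settlement pauses along the way, the total stays comfortably under 100 million years.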

If, today, you shine a ray of light out into space, it could reach any galaxy that is currently less than 16 billion light years away. But galaxies further than this are being pulled away so quickly that neither light, nor anything else we might send, could ever affect them. And next year this affectable universe will shrink by a single light year. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 232, loc. 3628

Mice know very little of music, art or humor. Toward what experiences are we as mice? What beauties are we blind to? -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 238, loc. 3709

These battles can be fought by our successors. Only we can make sure we get through this period of danger, that we navigate the Precipice and find our way to safety; that we give our children the very pages on which they will author our future. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 242, loc. 3755

Many economists agree, stating that pure time preference is irrational, unfounded or immoral. For example, Ramsey himself said it was “ethically indefensible and arises merely from the weakness of the imagination,” while R. F. Harrod called it “a polite expression for rapacity and the conquest of reason by passion.” -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 255, loc. 3886

For example, while we might take a smaller benefit now over a larger one later—just because it comes sooner—we rarely do so when making the choice on behalf of others. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 256, loc. 3907

This leaves us with the uncertain and changing catastrophe rate. And discounting by this is just another way of saying that we should value the future at its expected value: if we have empirical grounds for thinking our future is very long in expectation, there is no further dampening of its value coming from the process of discounting. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 258, loc. 3936
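
My note: a minimal sketch of the idea, under my simplifying assumption of a constant annual catastrophe rate r: the probability of surviving to year t is (1 − r)^t, so weighting each year's value by that survival probability just is valuing the future at its expected value; no further discount factor is applied on top.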

repugnant conclusion: for any outcome where everyone has high wellbeing, there is an even better outcome where everyone has only a tiny amount of wellbeing, but there are so many people that quantity makes up for quality. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 260, loc. 3957

eliminating the 90 percent risk would be 81 times as important as eliminating the 10 percent risk. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 270, loc. 4120

Reducing the 90 percent risk to 89 percent is 9 times as important as reducing the 10 percent risk to 9 percent. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 270, loc. 4124
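
My note: the arithmetic behind both figures, assuming the two risks are independent (my assumption): total survival probability is (1 − 0.9) × (1 − 0.1) = 0.09. Eliminating the 90% risk raises survival to 0.9, a gain of 0.81; eliminating the 10% risk raises it to 0.1, a gain of 0.01; and 0.81 / 0.01 = 81. Likewise, reducing 90% to 89% gives survival of 0.11 × 0.9 = 0.099 (a gain of 0.009), while reducing 10% to 9% gives 0.1 × 0.91 = 0.091 (a gain of 0.001), a ratio of 9.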

But an intuitive way to see how low risk levels make reduction more important is that halving existential risk in all periods doubles the expected length of time before catastrophe. -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 275, loc. 4173
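
My note: the reasoning, assuming a constant per-period risk r: the time to catastrophe is then geometrically distributed with an expected value of 1/r periods, so halving the risk to r/2 doubles the expected time to 2/r.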

As Larry Downes (2009) put it: “technology changes exponentially, but social, economic, and legal systems change incrementally.” -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 299, loc. 4552

Organizations focused on reducing existential risk include:

The Future of Humanity Institute (FHI)
The Centre for the Study of Existential Risk (CSER)
The Future of Life Institute (FLI)
The Global Catastrophic Risk Institute (GCRI)
The Berkeley Existential Risk Initiative (BERI)
The Open Philanthropy Project (OpenPhil)
The Nuclear Threat Initiative (NTI)
The Bulletin of the Atomic Scientists
The Global Challenges Foundation
The Law and Governance of Existential Risk group (LGER)
Alliance to Feed the Earth in Disasters (ALLFED)

The high-impact careers site 80,000 Hours maintains an up-to-date job board (80000hours.org/job-board) and explanations of the kinds of careers that can really help (80000hours.org/career-reviews). -- Toby Ord, The Precipice: Existential Risk and the Future of Humanity, pg. 403, loc. 6799