Stories where Earth is utterly, irretrievably destroyed hit me in the gut like few other things
So many sci-fi stories obsess over whether humans can survive the catastrophe. Earth's just a convenient stage setting and life-support system for the human drama.
I can't get on with that total disrespect for nature.
That movie Interstellar was like that. The people in it wrecked Earth to the point of making it uninhabitable, then, without missing a beat and without a lesson learned, jumped into spaceships to carry on the cycle all over again.
The natural world is a wonder and a treasure. As important as humans are in our own minds, we aren't the only game going. Only our own self-absorption makes it seem that way.
I'd rather humans go extinct than carry on with a handful of human survivors fleeing a demolished Earth.
When I read books like Neal Stephenson's Seveneves, it's painful. Even though that techno-fantasy ends up working out, kinda, none of the characters ever seem too bothered by the magnitude of the disaster and what they've lost.
Nobody seems too bothered by the loss of Earth and every living thing on it. They're too busy worrying about this week's hair styles.
Imagine the sucker-punch I got while reading Pellegrino and Zebrowski's The Killing Star, in which (no spoilers) Earth's surface is scorched to glowing charcoal in the first few pages by a barrage of missiles traveling at near light speed.
Nothing survives except two humans in a deep-sea submersible and bacteria far enough underground to survive the strike.
And then things get serious for the survivors
The Killing Star is a depressing book in many ways. After that sucker-punch of an opener, we follow a handful of survivors as they flee the alien attackers' clean-up crew.
Despite the survivors losing almost everything, and despite nearly all life in our solar system being extinguished, the book ends on a bittersweet, not quite upbeat, note.
A word about that.
Here at RP I don't believe in criticism.
Critiquing the work of others is an inherently negative, destructive activity with little value, in my humble but correct opinion.
I'd rather find the good in a work, if it's a mixed bag, or simply not read it at all if it's not my thing.
The Killing Star is that first kind of book. There's a lot to like here if you're into sci-fi and the icy fingers of existential horror.
A flawed book it is, in some respects, but what of it? This ain't English class. I came to watch the world blow up and see the survivors scramble... and I got what I came for.
Professionally unlikable reviewers online do a lot of whining about a simulation of the Titanic, which is really one scene at the beginning of the book. It's mentioned in passing at a few other spots as one of the characters copes with the trauma. The horror.
Petty readers with shallow complaints are like people who go to a Godzilla movie and complain that there was too much monster-punching action.
The rest of us can squeeze the sweet juice out of stories even when they don't click with us.
What I want to talk about here is the Big Idea behind The Killing Star.
What is it that led to this catastrophe?
The axioms of cosmic law
Unlike many tales of doom, the Big Idea behind The Killing Star is a mystery story.
The readers gradually discover the motivations of the alien attackers.
Halfway into the book, we get a discussion between eggheads about the pros and cons of contacting extraterrestrial life.
Some are for it. Others are dead set against it, for reasons Machiavellian and bleak:
A) A species will place its own survival ahead of the survival of any other species.
B) A species that comes to dominate its planet will be, in addition to intelligent, vigilant, ruthless, and aggressive whenever it becomes necessary.
C) The above two laws apply to any other species in the universe.
The logical conclusion of these three premises?
Make contact with aliens and you risk blasting your home planet to radioactive slag.
The Killing Star anticipates Cixin Liu's more recent exploration of this theme in The Dark Forest.
Liu also plays with the theme of a hostile universe, determined by inescapable laws and ironclad logic. In The Dark Forest, we find another warning against taking candy from tentacled strangers.
Dark Forest Theory of cosmic sociology
Liu's characters offer a more sophisticated system of deductions:
First: Survival is the primary need of civilization.
Second: Civilization continuously grows and expands, but the total matter in the universe remains constant.
One more thing: To derive a basic picture of cosmic sociology from these two axioms, you need two other important concepts: chains of suspicion and the technological explosion.
This is a little different from Pellegrino and Zebrowski. Liu's more interested in a "capitalistic" take on life. Living things always grow and expand, requiring more stuff.
But there's only so much stuff. That sets every species on a hostile posture with all the others.
That leads us to The Dark Forest postulate:
The universe is a dark forest. Every civilization is an armed hunter stalking through the trees like a ghost, gently pushing aside branches that block the path and trying to tread without sound. Even breathing is done with care. The hunter has to be careful, because everywhere in the forest are stealthy hunters like him. If he finds another life—another hunter, an angel or a demon, a delicate infant or a tottering old man, a fairy or a demigod—there's only one thing he can do: open fire and eliminate them.
Eerie, provocative, and absolutely horrifying.
But is it right?
What's wrong with this picture?
Let's start at the beginning.
The trouble with axiomatic theories of behavior is that they assume that...
- There is such a thing as a behavioral universal at the level of a species, let alone a sophisticated tool-using species capable of spaceflight.
- We humans are in any position to know the universal rules based on our observations of Earth life.
- It's possible to deduce specific actions from universal rules even if you have them.
The first assumption might be the most dubious of all. If you considered "Earth," circa 2021, as a coherent civilization, you wouldn't find many determinate behavioral universals.
What's interesting about humans is that we do share so much in common... and yet everywhere you look you find idiosyncrasies and incompatible values and beliefs in spite of the common features.
If you're wondering whether the differences are more interesting than the universals, you wouldn't be the first to ask. Way back in the 4th century BC, Aristotle wrote:
Deliberation is concerned with things that happen in a certain way for the most part, but in which the event is obscure, and with things in which it is indeterminate. We call in others to aid us in deliberation on important questions, distrusting ourselves as not being equal to deciding. (Nicomachean Ethics Book III.3)
Humans don't make practical decisions according to universal behavioral rules because we live uncertain and indeterminate lives.
We reason as individuals and together in groups to figure out what's good and right and fine and what's best to do because so few things are unchanging.
Even if there are universal constraints on the behavior of an intelligent tool-using species, it's highly unlikely that you can predict specific actions from those universal rules. Any intelligent species is going to think intelligently.
If they're anything like us, intelligent action of the members is not going to lead to anything remotely resembling coherent action at the species level.
Now the counter-argument is that they're not going to be anything like us.
You can't say that thinking intelligently is going to look like human persons getting together to deliberate about politics and whatever. What we'll be dealing with is a civilization likely to have a big-time mechanical aspect, presumably lots of AI or cyborg influences. Aliens don't have to be individual persons.
You don't want to rule that out if you're speculating on fictional situations. Any life capable of spreading out to the stars may be un-human-like in the extreme.
But there's another reason to be skeptical that the abstract high-level civilization is going to act like a mechanical optimizer.
Nothing in nature does that.
You'd think that if there were any competitive advantage to either (a) killing everything that isn't your own kin or (b) expanding without brakes, some organism would have done it already and Earth would be a world populated exclusively by carnivorous grass or whatever.
If nature hasn't converged on that solution after a few billion years, it's unlikely that some nerd with a computer or some rule-following idiot AI acting like a bureaucrat is going to beat the control.
The punchline is that rational decision making often turns out hard to distinguish from psychology and sociology. Universals end up being somebody's particulars.
One man's rationality is another's schizophrenia.