
MULTIPLANETARY

from EXISTENTIAL RISK by LOSTBOYEVSKY & Egoid

lyrics

Elon Musk plans to colonize Mars
Elon Musk plans to colonize Mars
Elon Musk plans to colonize Mars
Elon Musk plans to colonize Mars

I just thought you should know
I just thought you should know
I just thought you should know
I just thought you should probably know that because

Humanity could become a multiplanetary species
And that would be cool because existence on Earth is precarious
There’s hella existential risk if we’re all on this one pebble
But if we expand outward into space—if we build a self-sustaining civilization on another planet—we increase our species’ resilience exponentially
We may even be able to engineer another biosphere on Mars or elsewhere that can sustain other earthly species—to increase the resilience of many earthly species, not just humans

Our branch of evolution is an anomaly in the cosmos, as far as we know
Our candle of consciousness, as far as we know, is the only one that exists
Our diverse and thriving biosphere is a jewel in the vast black void of space
Life is bursting with color and beauty and wonder and love
And pizza and orgasms and cannabis and all sorts of other great stuff
So we should want to preserve that
We should want to preserve our branch of evolution
Think of the trillions of future human and post-human beings who will thank us if we do so
If we at least ensure that our branch of evolution persists, creation can continue to evolve and produce untold wonders for potentially trillions of years
Some promising ways to mitigate global catastrophic risk are to stop destroying the environment, transition to renewable energy and resources, reverse overpopulation, and either disarm all nuclear weapons or reduce humanity’s nuclear arsenal to the bare minimum necessary to disincentivize major international conflicts
But even if we do these things, Earth is still risky
An asteroid could strike the Earth again and cause an apocalyptic scenario
Eventually some kind of mass extinction event will occur
Whether caused by an asteroid impact, climate change, or the sun’s eventual expansion into a red giant, sooner or later something will wipe out much or all of life on Earth
For intelligent earthly life to persist into the deep, deep future, high technology was always going to be a necessity: to mitigate natural existential risks and eventually leave Earth, we needed to develop it at some point
Don’t get me wrong: we should preserve Earth for as long as possible and treat it and its inhabitants much better than we historically have
But eventually, if we want to continue surviving, we will need to leave the pale blue dot and venture out into the stars
High technology is necessary for us to do this, but unfortunately, it also poses momentous risks
Experts believe molecular nanotechnology weapons have a 5% chance of causing the extinction of humanity before 2100
The same experts suggest a 5% chance that superintelligent AI optimizing for non-human-friendly values will result in humanity’s demise prior to 2100
And the same experts give a 2% probability to the possibility of an engineered pandemic, a kind of bioweapon, wiping out humanity before 2100
I’m really not making this stuff up—read the Wikipedia page on ‘Global catastrophic risk’
Because of these technological risks, Vinay Gupta has suggested that we should move all research and development of these dangerous technologies out of Earth’s orbit—conduct the R&D on the moon or Mars instead, so that if something goes wrong, our home planet isn’t decimated
What I’m describing might seem like the substance of science-fiction, but we now live in a sci-fi world
Look around you
These are real risks, and some of the world’s most intelligent people are discussing them in all seriousness
This is why getting some humans off of Earth is a good idea
If we can become a multiplanetary and eventually interstellar or intergalactic civilization, we will all but ensure the persistence of life in the universe

We are life

We are life
Our most basic drive is to help life continue to exist
So I say
Let the roadshow go on
This is much larger than you or I or the 7 billion of us and the billions of animals on Earth right now
This is about trillions upon trillions upon trillions of potential future beings that will never know life, unless we enable them to
So if we do nothing else in this world, we must ensure that humanity doesn’t destroy itself and all other life
We must ensure that life is protected from the various existential risks that may threaten Earth in the coming decades and centuries
This is the most basic question we should be asking ourselves: How do we ensure the persistence of our evolutionary branch, of our flicker of consciousness?
The show must go on
Life is such a marvelous, ineffable, and incomprehensibly complex show
It must go on
You can help with this
Unfortunately, our species’ ethical and moral progress has not kept pace with our technological progress
You can help to change that, by prioritizing your moral and ethical development
Live compassionately, cooperatively, minimalistically, sustainably, non-dogmatically, and self-reliantly
Begin to view yourself as a member of a vast community of 7 billion humans and billions of other sentient beings
The greater the peace and solidarity among humanity, the less chance that we will blow ourselves to hell
Do whatever you can to help our systems become more sustainable and regenerative as well
Basic stuff like reducing, reusing, and recycling
Try not to waste stuff, and don’t buy so much unnecessary stuff
Consider trying not to own a car, or get a hybrid or electric car
Think about transitioning to solar power or other renewable energy sources in the coming years or decades
Start where you are, and do what you can with what you have
Google “effective altruism” and read about their initiatives
Google “Vinay Gupta” and listen to like 10 hours of his lectures and interviews
Read ‘The Righteous Mind’ by Jonathan Haidt
Look into things like the Future of Life Institute, the Centre for the Study of Existential Risk, and the Future of Humanity Institute
And spread these ideas
Raise awareness of the fact that addressing existential risks and becoming a multiplanetary civilization is good for humanity, if we care about giving future sentient life --- human, non-human, post-human, and otherwise --- the greatest chance of actually coming into this universe and experiencing its wonders and grandeur and creating things that are incomprehensible to us currently
I think we should care about that more than anything
Don’t get me wrong: I think ending or greatly reducing extreme poverty, slavery, human trafficking, animal cruelty, and other things are extremely important and worthy causes too
But we have to start from square one --- with the fact that if life on Earth is utterly decimated or ceases to exist, no other causes matter
It will all be over
Everything we know and love --- gone
No chance for future generations to experience this cosmos
As I said, we are life
Life has already become so much, and it can become so much more
Let’s make sure that it does

Let the show go on
Let the show go on
Let the show go on
Let the show go on

credits

from EXISTENTIAL RISK, released May 31, 2017

license

all rights reserved

about

jordan bates Villa Las Estrellas, Antarctica

aka jb, lostboyevsky, goku, mowgli, master splinter, ryokan the fool, trick jester, tom bombadil, albus voldemort, etc
