Crossposted from https://kirstensnotebook.blogspot.com/2021/04/biblical-advice-for-people-with-short.html?m=1
I have a surprising number of friends, or friends of friends, who believe the world as we know it will likely end in the next 20 or 30 years.
They believe that transformative artificial intelligence will eventually either: a) solve most human problems, allowing humans to live forever, or b) kill/enslave everyone.
A lot of people honestly aren't sure of the timelines, but they're sure that this is the future. People who believe there's a good chance of transformative AI arriving in the next 20-30 years are said to have "short timelines."
There are a lot of parallels between people with short AI timelines and the early Christian church. Early Christians believed that Jesus was going to come back within their lifetimes. A lot of early Christians were quitting their jobs and selling their property to devote more to the church, in part because they thought they wouldn't be on earth for much longer! Both early Christians and people with short AI timelines believe(d):
-you're on the brink of eternal life,
-you've got a short window of opportunity to make things better before you lock in to some kind of end state, and
-everything's going to change in the next 20 or 30 years, so you don't need a pension!
So what advice did early church leaders give to Christians living with these beliefs?
Boldly tell the truth: Early church leaders were routinely beaten, imprisoned or killed for their controversial beliefs. They never told early Christians to attempt to blend in. They did, however, instruct early Christians to...
Follow common sense morality: The Apostle Paul writes to the Romans that they should "Respect what is right in the sight of all people." Even though early Christians had a radically different worldview from others at the time, they were encouraged to remain married to their unbelieving spouses, be good neighbours, and generally act in a way that would be above reproach. As part of that, church leaders also advised early Christians...
Don't quit your day job: In his second letter to the Thessalonians, Paul had to specifically tell them to go get jobs again, because so many of them had quit their jobs and become busybodies in preparation for the apocalypse. Even Paul himself, while preaching the Gospel, sometimes worked as a tentmaker. Early Christians were advised to work. A few of them worked full time on the mission of spreading the good news of Christ, with the support and blessing of their community. Most of them worked the normal, boring jobs they'd had before. In the modern day, this would likely also include making sure you have a pension and doing other normal life admin.
I am uncertain how much relevance Christian teachings have for people with short AI timelines. I don't know if it's comforting or disturbing to know that you're not the first community to experience life that you believe to be at the hinge of history.
Relatedly, a behaviour I dislike is being repeatedly and publicly wrong without acknowledging fault or changing course. Mainstream Christianity is guilty of this, though so are many other social movements.
I think that if it turns out that short AI timelines are wrong, those with short timelines should acknowledge it, and the EA community as a whole should seek to understand why we got it so wrong. I will think it odd if those who repeatedly make wrong predictions continue to be taken seriously.
Also, I'd like to see more concrete, testable short-term predictions from those we trust with AI predictions. Are they good forecasters in general? Are they well calibrated or insightful in ways we can test?