How You Can Counterfactually Send Millions of Dollars to EA Charities

"The 2.16% U.S. federal funds rate in 2019 is one of the most conservative interest rates possible."

The U.S. federal funds rate has been effectively 0% since April 2020 and was roughly 0% for the six years from 2009 to 2015. The same is roughly true of the UK. Central banks in both countries are saying they'll keep rates low for years to come.

I can't immediately find a reputable business savings account in the UK or US that currently offers more than 1%.

Those offering the highest rates (approaching 1%) on comparison sites tend to come with conditions (e.g. locking the money up for a period, or requiring regular deposits), and usually cap the amount on which you can earn interest, with the cap set low enough to be binding for these organisations.

These accounts usually offer a high rate to attract customers for a while, then dramatically reduce the interest rate and trust you won't be bothered moving your money. I think that's their basic business model.

Opening bank accounts for non-profits, at least in the UK, is a pain — something that will take a few weeks, and some time/attention from the operations team, management and trustees (who are needed for e.g. security checks). It looks like you usually won't be able to put in more than a million dollars/pounds in any given account, often less.

So you'd need to open many accounts, keep track of them, secure the chequebooks, have them audited annually, integrate them into your bookkeeping system, change the signatures when staff turn over, figure out the idiosyncratic requirements to pull out money when you need to, and so on.

This may sound simple, but if you've worked in operations you'll know it's actually a big hassle.

In return, for each account opened you make <£10k a year, and probably need to keep closing accounts and moving your money into new ones every few years, as the teaser rate used to draw you in is removed.
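The "<£10k a year" figure is easy to sanity-check. A minimal sketch, assuming the roughly £1m per-account cap and ~1% best-available rate mentioned above (both rough figures from this discussion, not quotes from any specific bank):

```python
# Rough upper bound on annual interest per account, using the
# approximate figures from the text (assumptions, not real bank terms):
# a ~£1m deposit cap per account and a best-case rate of ~1% per year.
deposit_cap_gbp = 1_000_000
best_rate = 0.01  # ~1% per year

annual_interest = deposit_cap_gbp * best_rate
print(f"At most about £{annual_interest:,.0f} per year per account")
```

Since most accounts cap interest-bearing balances well below £1m and pay well below 1%, the realistic return per account is usually much smaller than this upper bound.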

This may all be worth it, but it's far from a no-brainer, as these organisations have other fruitful projects they could be using staff to pursue.

If Causes Differ Astronomically in Cost-Effectiveness, Then Personal Fit In Career Choice Is Unimportant

In addition to the issues raised by other commentators I would worry that someone trying to work on something they're a bad fit for can easily be harmful.

That especially goes for things related to existential risk.

And in addition to the obvious mechanisms, having most of the people in a field be ill-suited to what they're doing but persisting for 'astronomical waste' reasons will mean most participants struggle to make progress, get demoralized, and repel others from joining them.

How much does a vote matter?

He says he's going to write a response. If I recall Jason isn't a consequentialist so he may have a different take on what kinds of things we can have a duty to do.

How much does a vote matter?

Want to write a TLDR summary? I could find somewhere to stick it.

How much does a vote matter?

It seems like to figure out whether it's a good use of time for 300 people like you to vote, you still need to figure out whether it's worth it for any single one of them.

When you shouldn't use EA jargon and how to avoid it

I'm actually more favourable to a smaller EA community, but I still think jargon is bad. Using jargon doesn't disproportionately appeal to the people we want.

The most capable folks are busy with other stuff and don't have time to waste trying to understand us. They're also more secure and uninterested in any silly in-group signalling games.

When you shouldn't use EA jargon and how to avoid it

Yes but grok also lacks that connotation to the ~97% of the population who don't know what it means or where it came from.

[Link] "Where are all the successful rationalists?"

The EA community seems to have a lot of very successful people by normal social standards, pursuing earning to give, research, politics and more. They are often doing better by their own lights as a result of having learned things from other people interested in EA-ish topics. Typically they aren't yet at the top of their fields but that's unsurprising as most are 25-35.

The rationality community, inasmuch as it doesn't overlap with the EA community, also has plenty of people who are successful by their own lights, but their goal tends to be to become thinkers and writers who offer the world fresh ideas and a unique perspective on things. That does seem to be the comparative advantage of that group. So it's not so surprising that we don't see lots of people e.g. getting rich. They mostly aren't trying to. 🤷‍♂️

Avoiding Munich's Mistakes: Advice for CEA and Local Groups

To better understand your view, what are some cases where you think it would be right to either

  1. not invite someone to speak, or
  2. cancel a talk you've already started organising,

but only just?

That is, cases where it's just slightly over the line of being justified.

Can my self-worth compare to my instrumental value?

For whatever reason people who place substantial intrinsic value on themselves seem to be more successful and have a larger social impact in the long term. It appears to be better for mental health, risk-taking, and confidence among other things.

You're also almost always better placed than anyone else to provide the things you need — e.g. sleep, recreation, fun, friends, healthy behaviours — so it's each person's comparative advantage to put extra effort into looking out for themselves. I don't know why, but doing that is more motivating if it feels like it has intrinsic and not just instrumental value.

Even the most self-effacing among us have a part of their mind that is selfish and cares about their welfare more than the welfare of strangers.

Folks who currently neglect their wellbeing and intrinsic value to a dangerous extent can start by fostering ways of thinking that endorse and build up that selfishness.
