Robert_Wiblin

Comments

AI Risk is like Terminator; Stop Saying it's Not

I interpreted them not as saying that Terminator underplays the issue but rather that it misrepresents what a real AI would be able to do (in a way that probably makes the problem seem far easier to solve). But that may be me suffering from the curse of knowledge.

AI Risk is like Terminator; Stop Saying it's Not

Isn't a key difference that in Terminator the AI seems incredibly incompetent at wiping us out? Surely we'd be destroyed in no time — to start with it could just manufacture a poison like dioxin and coat the world (or something much smarter). Going around with tanks and guns as depicted in the film is entirely unnecessary.

Some clarifications on the Future Fund's approach to grantmaking

If it's just a form where the main reason for rejection is chosen from a list, then that's probably fine/good.

I've seen people try to do written feedback before and find it a nightmare, so I guess people's mileage varies a fair bit.

Some clarifications on the Future Fund's approach to grantmaking

"However, banking on this as handling the concerns that were raised doesn't account for all the things that come with unqualified rejection and people deciding to do other things, leave EA, incur critical stakeholder instability etc. as a result. "

I mean, I think people are radically underestimating the opportunity cost of doing feedback properly at the moment. If I'm right, then getting feedback might reduce people's chances of getting funded by, say, 30% or 50%, because the throughput for grants will be much reduced.

I would probably rather have a 20% chance of getting funding for my project without feedback than a 10% chance with feedback, though people's preferences may vary.

(Alternatively all the time spent explaining and writing and corresponding will mean worse projects get funded as there's not much time left to actually think through which projects are most impactful.)

Some clarifications on the Future Fund's approach to grantmaking

It would be very surprising if there weren't opportunity costs to providing feedback. Those might include:

  1. Senior management time to oversee the project, bottlenecking other plans
  2. PR firefighting and morale counselling when 1 in ~100 people gets angry at what you say and causes you grief (this will absolutely happen)
  3. Any hires capable of thinking up and communicating helpful feedback (this is difficult!) could otherwise use that time to read and make decisions on more grant proposals in more areas — or just improve the decision-making among the same pool of applicants.

That there's an opportunity cost doesn't show it's not worth it, but my guess is that right now it would be a huge mistake for the Future Fund to provide substantial feedback except in rare cases.

That could change in future if their other streams of successful applicants dry up and improving the projects of people who were previously rejected becomes the best way to find new things they want to fund.

Tentative Reasons You Might Be Underrating Having Kids

I find these arguments intellectually interesting to a degree.

But like you, my aesthetic preference is just that people who personally feel like having kids should have kids, and those who personally don't feel like having kids shouldn't.

If we followed that dollar-store rule of thumb I expect things would go roughly as well as they can, all things considered.

FTX/CEA - show us your numbers!

My guess is this would reduce grant output a lot relative to how much I think anyone would learn (maybe it would cut grantmaking in half?), so personally I'd rather see them just push ahead and make a lot of grants, then review or write about just a handful of them from time to time.

Introducing 80k After Hours

Here you go: https://www.stitcher.com/show/80k-after-hours

(Seems like Stitcher is having technical problems; I've contacted their technical support about it.)

Why is Operations no longer an 80K Priority Path?

For the 10/10 criterion, do you mean a $50k hiring bonus or a $50k annual salary?

Think about EA alignment like skill mastery, not cult indoctrination

"creating closed social circles"

Just on this, my impression is that more senior people in the EA community actively recommend not closing your social circle because, among other reasons, it's more robust to have a range of social supports from separate groups of people, and it's better epistemically not to exclusively hang out with people who already share your views on things.

Inasmuch as people's social circles do shrink, I don't think it's due to guidance from leaders (as in a typical cult, I would think), but rather because people naturally find it more fun to socialise with people who share their beliefs and values, even if they think that's not in their long-term best interest.
