Given this quote from the Wall Street Journal: “Ellison testified that Bankman-Fried was a risk-taker who was comfortable with lying and stealing as long as it benefited the greater good. To attract FTX customers, he cultivated an appearance ‘as a smart, competent, somewhat eccentric founder,’ she said”

And this Substack post:

To me, it seems like a consequentialist view clearly requires you to lie in certain circumstances, but it is very difficult to recognize when lying would be a net positive. Doesn't that force an EA into rule consequentialism?




Technically multi-level consequentialism. But yeah, the basic idea that you should be cautious of trying to directly maximize utility (unconstrained by generally reliable rules or norms) is certainly a sound one.

It's not really unique to consequentialism either. Every non-absolutist view requires you to lie in some circumstances (e.g. to save innocent people from an inquiring murderer).

I've changed the title from "Thoughts?" to be more descriptive (though it could still be improved; e.g., I think "Does EA imply rule consequentialism?" would be a better title).

Please try to use more descriptive titles in the future.