PhD student in Philosophy @ Oxford
405 karma · Joined Apr



    Book Summary: The Precipice


    I feel like these actions and attitudes embody many of the virtues of effective altruism. You genuinely wanted to help somebody, and you took personally costly actions to do so. I feel great about having people like you in the EA community. My advice is to hold on to the feeling of how important you were to Tlalok's life as you do good effectively with other parts of your time and effort, knowing you may be making a profound difference in many lives.

    What is the timeline for announcing the result of this competition?

    Was the result of this competition ever announced? I can't seem to locate it.

    Are these fellowships open to applicants outside of computer science/engineering etc. doing relevant work?

    I really like Timeshifter, but honestly the following has worked better for me:

    Fast for ~16 hours before 7am in my new time zone.

    Take melatonin, usually around 10pm in my new time zone, and again if I wake up and stop feeling sleepy before around 5am in my new time zone. (I have no idea whether this second dose is optimal, but it seems to work.)

    I highly recommend a good neck pillow, earplugs, and an eye mask if you travel often or take long trips (e.g. if you are Australian and go overseas almost anywhere).

    Thanks to Chris Watkins for suggesting the fasting routine.

    The schedule looks like it's all dated for August; is that the right link?

    I'd also potentially include the latest version of Carlsmith's chapter on power-seeking AI.

    I think Thorstad's "Against the singularity hypothesis" might complement the week 10 readings.

    A quick clarification: I mean that "maximize expected utility" is what both CDT and EDT do, so saying "In other words, this would be the kind of decision theory that recommends decisions that maximize expected utility" is perhaps misleading.

    I quite like this post. I think, though, that your conclusion, to use CDT when probabilities aren't affected by your choice and EDT when they are, is slightly strange. As you note, CDT gives the same recommendations as EDT in cases where your decision affects the probabilities, so it sounds to me like you would actually follow CDT in all situations (and only trivially follow EDT in the special cases where EDT and CDT make the same recommendations).

    I think there's something to pointing out that CDT in fact recommends one-boxing wherever your action can affect what is in the boxes, but I think you should be more explicit about how you prefer CDT.

    I think near the end of the post you want to call it Bayesian decision theory. That's a nice name, but I don't think you need a new name, especially because causal decision theory already captures the same idea, is well known, and points to the distinctive feature of this view: that you care about causal probabilities rather than probabilities that use your own actions as evidence when they make no causal difference.

    When you say "This would be the kind of decision theory that smokes, one-boxes, and doesn’t pay the biker ex-post, but “chooses to pay the biker ex-ante.” In other words, this would be the kind of decision theory that recommends decisions that maximize expected utility," I find this an odd thing to say, and perhaps a bit misleading, because that's what both EDT and CDT already do; they just have different conceptions of what expected utility is.
