I think it counts against a view if it implies that none of the actions we perform are good or bad. Intuitively, it doesn't seem like all actions with chaotic effects are neutral.
I don't agree with that. Cluelessness only seems to arise if you have reason to think that, on average, your actions won't make things better. Yet even a very flawed decision procedure will, on average across worlds, do better than chance. That seems to handle epistemic cluelessness fine.
Thanks, yes, I think I fired this post off too quickly without taking the time to read deeper analyses of it. I'll try to give your post a read when I get the chance.
Interesting point, though I disagree. I think there are strong arguments for thinking that you should just maximize expected utility: https://joecarlsmith.com/2022/03/16/on-expected-utility-part-1-skyscrapers-and-madmen/
Yes, it would imply that a bit of extra energy can vastly increase consciousness. But so what? Why be 99.9999% confident that it can't?