
Austin

Cofounder @ Manifund & Manifold
3829 karma · San Francisco, CA, USA

Bio

Hey there~ I'm Austin, currently building https://manifund.org. Always happy to meet people; reach out at akrolsmir@gmail.com, or find a time on https://calendly.com/austinchen/manifold !

Comments (219)

Thanks, we'll definitely consider that option for future pieces!

For example, Substack is a bigger deal now than a few years ago, and if the Forum becomes a much worse platform for authors by comparison, losing strong writers to Substack is a risk to the Forum community.

 

I've proposed to the LW folks and I'll propose to y'all: make it easy to import/xpost Substack posts into EA Forum! Right now a lot of my writing goes from Notion draft => our Substack => LW/EAF, and getting the formatting exactly right (esp around images, spacing, and footnotes) is a pain. I would love the ability to just drop in our Substack link and have that automatically, correctly, import the article into these places.
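To make the request concrete, here's a rough sketch of the fetch half of that flow, assuming Substack's public RSS feed at <publication>.substack.com/feed (function and publication names here are just illustrative; the hard part -- converting the post HTML into the Forum editor's format with images, spacing, and footnotes intact -- is the piece the Forum would need to own):

```python
import urllib.request
import xml.etree.ElementTree as ET

def fetch_substack_posts(publication: str):
    """Yield (title, link, html_body) for recent posts from a Substack's public RSS feed."""
    url = f"https://{publication}.substack.com/feed"
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)
    ns = {"content": "http://purl.org/rss/1.0/modules/content/"}
    for item in tree.getroot().iter("item"):
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        # Substack puts the full post HTML in <content:encoded>
        html_body = item.findtext("content:encoded", default="", namespaces=ns)
        yield title, link, html_body

# e.g. for title, link, html in fetch_substack_posts("yourpub"):
#          ...run html through an HTML -> Forum-editor converter...
```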

I'm also not sure if this is what SWP is going for, but the entire proposal reminds me of Paul Christiano's proposal for humane egg offsets, which I've long been fond of: https://sideways-view.com/2021/03/21/robust-egg-offsetting/

 

With Paul's proposal, the egg certificate solves the problem of "I want humane eggs": a buyer can treat regular egg + humane cert = humane egg. Maybe the same would apply for stunned shrimp, eg a supermarket might say "I want to brand my shrimp as stunned, for marketing or for commitments; I can buy regular shrimp + stun cert = stunned shrimp".

80% ➔ 90% agree: "Vote power should scale with karma"

 

This gives EA Forum and LessWrong a very useful property of markets: more influence accrues to individuals who have a good track record of posting.
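(To gesture at what I mean with a toy illustration -- hypothetical, not the Forum's actual formula -- vote weight could grow with the log of accumulated karma, so influence accrues slowly to people with a long track record of well-received posts:)

```python
import math

# Toy, hypothetical vote-power rule -- NOT the Forum's real formula.
# Roughly one extra point of influence per 10x of accumulated karma.
def vote_power(karma: int) -> int:
    return 1 + int(math.log10(max(karma, 1)))

# vote_power(5) == 1, vote_power(500) == 3, vote_power(50_000) == 5
```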

Really appreciate this post! I think it's really important to try new things, and also have the courage to notice when things are not working and stop them. As a person who habitually starts projects, I often struggle with the latter myself, haha.

(speaking of new projects, Manifund might be interested in hosting donor lotteries or something similar in the future -- lmk if there's interest in continuity there!)

Hey! Thanks for the thoughts. I'm unfortunately very busy these days (including preparing for Manifest 2025!), so I can't guarantee I'll be able to address everything thoroughly, but here are a few quick points, written hastily and without strong conviction:

  • re non sequitur, I'm not sure if you've been on a podcast before, but one tends to just, like, say stuff that comes to mind; it's not an all-things-considered take. I agree that Hanania denouncing his past self is a great and probably more central example of growth; I just didn't reference it because the SWP stuff was more top of mind (interesting, unexpected).
  • I know approximately nothing about HBD fwiw; like I'm not even super sure what the term refers to (my guess without checking: the controversial idea that certain populations/races have higher IQs?). It's not the case that I've looked a bunch into HBD and decided I'll invite these 6 speakers because of their HBD beliefs; I outlined the specific reasons I invited them, which is that they each had an interesting topic to talk about (none of which were HBD afaik). You could accuse me of dereliction of duty wrt researching the downstream effects of inviting speakers with controversy? idk, maybe, I'm open to that criticism, it's just there's a lot of stuff to juggle and it feels a bit like an isolated demand on my time.
  • I agree that racism directly harms people, beyond being offensive, and this can be very bad. It's not obvious to me where and how racism is happening in my local community (broadly construed, ie the spaces I spend time in IRL and online), or what specific bad things are caused by it. Like, I think my general view of racism is that it's an important cause area, alongside many other important causes to work on like AI safety, animal welfare, GHD, climate change, progress, etc -- but it happens to not be very neglected or tractable for me personally to address.

No updates on ACX Grants to share atm; stay tuned!

Thank you Caleb, I appreciate the endorsement!

And yeah, I was very surprised by the dearth of strong community efforts in SF. Some guesses at this:

  • Berkeley and Oakland have been the historical nexus for EA and rationality, with a rich-get-richer effect where people migrating to the Bay choose the East Bay
  • In SF, there's much more competition for talent: people can go work on startups, AI labs, FAANG, VC
  • And also competition for mindshare: SF's higher population and density means there are many other communities (eg climbing, biking, improv, yimby, partying)

Some are! Check out each project in the post; some have links to source code.

(I do wish we'd gotten source code for all of them; next time we might consider an open-source hackathon!)

Thanks Angelina! It was indeed fun, hope to have you join in some future version of this~

And yeah, definitely great to highlight that list of projects -- many juicy, still-unexplored ideas in there for any aspiring epistemics hacker. (I think it might be good for @Owen Cotton-Barratt et al to just post that as a standalone article!)
