TL;DR: I've recently talked to several people who'd consider cofounding an EA startup but are blocked by not having a concrete idea. Help! Please post your ideas here and I'll get potential CTOs to read them.
The rest of the post is only relevant if you're unsure what such people typically would or wouldn't want to work on; feel free to skip it and just pitch your idea, or share this question with someone else. This is all somewhat time sensitive. Thanks!
They're looking for something that feels like a startup
Such as Momentum, Wave, or Metaculus.
Not something that feels like a side project, such as a small Chrome extension.
Also not a "regular" job as a senior software developer. They're aware of the 80k job board as an option; this post is aiming at something else.
Something that EAs have some kind of advantage in
For example "we care about this more than usual". Something that would explain why nobody else already implemented the idea just to make a ton of money.
Ideally there's a CEO
Especially if it's a very ambitious idea such as "a Twitter that promotes high-quality conversations", which many people have tried and which it's unclear (to me) how to pull off.
Ideally the CEO would post here and be open to questions.
Ideas I'm aware of
- Ambitious Altruistic Software Engineering Efforts: Opportunities and Benefits
- Even More Ambitious Altruistic Tech Efforts
- A list of technical EA projects
- What Are Your Software Needs?
I'm still going over them, but this is time sensitive, so I'm posting in the meantime.
The closest matches so far:
- Prediction market ideas: I'm checking those out
- Ambitious Twitter-like ideas: Blocked by the CEO problem
I don't think this passes an LT funding or talent bar, but an idea I've been interested in for a while is a way for people to anonymously report sexual harassment or abuse, or possibly abuse in general*.
I haven't thought much about implementation details, but I think the idea would be that the accused is not exposed until there are at least 3 reports (or some similar threshold), to reduce false positives and on the assumption that most abusers are serial abusers.
There are some technical nuances. Specifically, you want a way for the website to check that reporters are distinct people (so someone can't create 30 fake accounts to report) without exposing their identities to outsiders. It would also be better if the website didn't store accusers' identities in the backend at all, for obvious security reasons. A number of privacy techniques could address this; the most obvious one I can think of is storing a hash of people's Facebook unique IDs, though of course this isn't great, and there might be a better off-the-shelf solution.
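To make the threshold idea concrete, here is a minimal sketch under those assumptions. All names are hypothetical and this is not a vetted privacy design: the backend keeps only a keyed hash (pseudonym) of each reporter's unique ID, so repeat reports deduplicate without raw identities being stored, and an accusation is only surfaced once the count of distinct reporters reaches the threshold.

```python
# Minimal sketch of threshold-based disclosure with pseudonymous reporters.
# Hypothetical names throughout; a plain hash of a low-entropy ID is
# brute-forceable, which is part of why the approach above "isn't great".
import hmac
import hashlib
from collections import defaultdict

SERVER_SECRET = b"replace-with-a-real-secret"  # keyed hash instead of a bare hash
DISCLOSURE_THRESHOLD = 3  # don't surface an accusation until >= 3 unique reports

# accused -> set of pseudonymous reporter tokens (no raw identities stored)
_reports: dict[str, set[str]] = defaultdict(set)

def reporter_token(unique_id: str) -> str:
    """Derive a stable pseudonym from a reporter's unique ID without storing it."""
    return hmac.new(SERVER_SECRET, unique_id.encode(), hashlib.sha256).hexdigest()

def submit_report(unique_id: str, accused: str) -> bool:
    """Record a report; return True once the accused crosses the threshold."""
    _reports[accused].add(reporter_token(unique_id))  # set() deduplicates repeat reports
    return len(_reports[accused]) >= DISCLOSURE_THRESHOLD

if __name__ == "__main__":
    print(submit_report("alice_id", "prof_x"))  # False: 1 unique reporter
    print(submit_report("alice_id", "prof_x"))  # False: duplicate, still 1
    print(submit_report("bob_id", "prof_x"))    # False: 2
    print(submit_report("carol_id", "prof_x"))  # True: threshold reached
```

A keyed hash (HMAC) is marginally better than a bare hash because an attacker who steals the database also needs the key to brute-force low-entropy IDs, but a real deployment would want a properly reviewed privacy design rather than this sketch.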
*One use case I'm tangentially familiar with is abuse of power by PhD supervisors.
EDIT: I decided to retract this comment because the space of potential altruistic projects is extremely wide, and even though on an inside view I'm more excited about this project than many others, it still seems like a bad norm to suggest things to an EA audience that even I don't think would be competitive with top LT projects.