Do you have examples of systemic problems in the EA project that could be solved by targeted coordination mechanisms?
I'll give some answers of my own as examples. I'd like to see answers even if you aren't sure whether they are actually problems, if they are already partially solved, or if you think there might be a better solution; just mention that in the text.
I'm asking mostly because I'm curious about the extent to which we could use more coordination mechanisms, but I might also try to tackle something here - especially if it's related to local groups or to prioritization research (which I plan on learning more about).
A global "CRM".
It might be useful to have a global table of contact people who are sympathetic to the EA community and can offer help through their expertise or their position. Such a system could not demand too much of any one contact person, so some coordination would be needed there.
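As a rough illustration only, a single entry in such a list might look like the sketch below. The class name, field names, and the per-month request cap are all hypothetical assumptions, just to make the opt-in and low-demand constraints concrete:

```python
from dataclasses import dataclass

# Hypothetical sketch of one entry in a shared contact list.
# Field names and the monthly request cap are illustrative assumptions,
# not a proposal for an actual schema.
@dataclass
class ContactPerson:
    name: str
    areas_of_expertise: list[str]
    opted_in: bool = False           # only listed if they explicitly agree
    publicly_visible: bool = False   # most entries would likely stay on a closed list
    max_requests_per_month: int = 2  # keeps demands on any one person low
    requests_this_month: int = 0

    def can_take_request(self) -> bool:
        """A contact is approachable only if they opted in and have spare capacity."""
        return self.opted_in and self.requests_this_month < self.max_requests_per_month
```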
RE:
- In its current state, the Hub is not equipped to deal with people who haven't opted in to being listed there.
- I imagine such people would not want to be publicly listed unless they were already on board with EA and willing to be a resource, so some kind of closed/private list would probably be more realistic.
- One option is that you could have profiles lis…