Matthias_Samwald

Associate Professor @ Institute of Artificial Intelligence, Medical University of Vienna

Language-centric AI / Speeding up progress in biomedical research / AI robustness, interpretability, alignment

Involved in EA since 2015

https://samwald.info/

Comments

I feel anxious that there is all this money around. Let's talk about it

There should also be more transparency about how funding decisions are made, now that very substantial budgets are available. We don't want to find out at some point that many funding decisions have been made on less-than-objective grounds. 

EA funding has reached a level where 'evaluating evaluations' should soon become an important project.

Announcing the Future Fund

I love the format, but I have to voice one concern. Judging a potentially large number of brief proposals quickly with a small team might mean that the eminence of applicants ends up being a significant deciding factor (e.g., well-established people at elite institutions in the US/UK, or known EA organizations, having a significantly better chance).

While such a bias might be partially rational, the resulting Matthew effect ('the rich get richer') would also have negative consequences: funding absorbed by already well-funded institutions has diminishing returns, perceived unfairness might discourage potentially strong applicants in future funding rounds, and opportunities to spread EA-relevant work into new communities (which can bring new perspectives, talent, and follow-up funding) might be lost.

I'm not sure what this implies for this particular call (perhaps you are well aware of these issues and committed to avoiding them anyway), but it would probably be good to keep them in mind.