
I'm apparently too lazy to write Wikipedia articles, but not too lazy to identify needed articles. In cases where I have links that may be helpful to article writers, I've included them.

Center for Security and Emerging Technology (CSET)

Think tank at Georgetown University launched in 2019 with a $55m grant from the Open Philanthropy Project.

https://www.educationdive.com/news/stanford-adds-think-tank-to-expand-artificial-intelligence-work/550762/

https://www.washingtonpost.com/local/education/georgetown-launches-think-tank-on-security-and-emerging-technology/2019/02/27/d6dabc62-391f-11e9-a2cd-307b06d0257b_story.html?noredirect=on&utm_term=.386683fe7537

https://www.georgetown.edu/news/q-and-a-with-cset-founding-director-jason-matheny

https://www.openphilanthropy.org/giving/grants/georgetown-university-center-security-and-emerging-technology

https://www.georgetown.edu/news/largest-us-center-on-artificial-intelligence-and-policy-comes-to-georgetown

https://cset.georgetown.edu

Animal Charity Evaluators

Has a sentence in the main EA article.

Centre for Effective Altruism

Mentioned in the 80,000 Hours and Giving What We Can articles.

Forethought Foundation / Global Priorities Institute

Organizations associated with William MacAskill.

Effective Altruism Foundation / Stiftung für Effektiven Altruismus

Parent organization of Raising for Effective Giving and other projects.

The AI Does Not Hate You

New book by Tom Chivers just released in the UK. Focused on AI risk and the rationality movement, but covers EA and EA-relevant AI safety organizations.

Ought

Machine learning research organization that has received grants from the Long Term Future Fund and the Open Philanthropy Project. Note: there currently do not seem to be any news or magazine articles about it, which may be an obstacle to a Wikipedia article.

Center for Human-Compatible AI (CHAI)

UC Berkeley research group funded by the Open Philanthropy Project and others.

Kelsey Piper / Future Perfect

Future Perfect has a paragraph in the main EA article.

Comments

Animal Charity Evaluators previously had a page on Wikipedia, but it was deleted after a discussion. A copy of what the page looked like is still available, which could also serve as a starting point if someone wants to rewrite the page.

My guess (based on intuition and experience, without spending any time digging up sources) is that almost all of these do not meet Wikipedia's general notability guideline, so it is uncertain whether they would survive for long if someone were to write them. In other words, they might be deleted like the ACE article was.

The Chivers book will likely meet Wikipedia's notability criteria for books (if it doesn't already).

Interesting list. I know some people and organisations (which I am obviously not going to name) prefer not to have Wikipedia pages - is it possible that some of the above groups might fall into this category?

Another one: Open Phil is covered only as a subsection of the GiveWell wiki page and doesn't have its own page (but is very notable).


Open Phil used to have its own page; see e.g. this version and the revision history for some context. (Disclosure: I wrote the original version of the page.)

Thanks for the link! Is there a reason it's no longer an active page? It seems like that kind of information is probably useful to have out there.

From the edit logs: "almost no unique, well-sourced content here. Merged what was unique to GiveWell"

This is the final note from an editor who deleted the page. This was in early 2017; I'd expect an independent Open Phil page to make a lot more sense now (if they want one to exist).

In case someone has the capacity to do this right now: I'm under the impression that Open Phil does want its own page (based on a conversation I had with someone researching there).

The Life You Can Save (which I work for) would be very interested in getting a Wikipedia page set up. My understanding is that Wikipedia doesn’t allow employees or volunteers to create one, but we’d be very happy if someone in the EA community took it on themselves to create one. There’s already a Wiki page for TLYCS the book, so we’d be hoping to get a new separate page for the non-profit organization. And this disambiguation would be particularly helpful prior to the 4Q19 release of the updated and revised 10th anniversary edition of the book.

Can you make a case as to why the two have enough notability separately to deserve their own separate Wikipedia pages?

The original book was well received and got significant amounts of attention (e.g. an excerpt ran in the NYT, Peter was on the Colbert Report to talk about it, etc.). It was also highly influential, and has contributed to the way a lot of EAs (including Cari Tuna) think about giving. I’m not sure how many languages it’s been translated into, but it’s a pretty good number.

The organization has also received attention from a variety of major media outlets and has moved a considerable amount of money to effective charities (~$5.25 million in 2018 and expected to be much higher in 2019). With the publicity push around the release of the new edition, there should be much more media attention around the corner.

Also, Peter Singer is clearly notable and disambiguating the book and the nonprofit will help clarify discussion about Peter. The disambiguation is becoming even more important with the new edition (which will have substantial changes), as there will soon be two books and a charity all with the same name.

I feel like it would be more appropriate for the organisation to have its own page, while information about the book could be divided as appropriate between that page, and those of effective altruism and Peter Singer.

Interestingly, the EA Wikipedia page gets an average of 9,000 pageviews a month. I'm curious where most of these people first hear about it before googling it; maybe from Doing Good Better?

Apparently existential risk does not have its own Wikipedia article.

Some related concepts, like human extinction, global catastrophic risks, existential risk from AGI, and biotechnology risk, do have their own Wikipedia articles. On closer inspection, hyperlinks for "existential risk" on Wikipedia redirect to the global catastrophic risk page. A lot of Wikipedia articles have started using the term "existential risk". Should there be a separate article for existential risk?
