vipulnaik

Understanding Open Philanthropy's evolution on migration policy

Good point. My understanding is that Open Phil made a general decision to focus only on US policy for most of their policy areas, because there are high fixed costs to becoming familiar with a policy space. In some areas, like animal welfare, they've gone beyond US policy, but those are areas where they are spending way more money.

Their grants to Labor Mobility Partnerships stand out as not being US-specific, though LaMP is still currently more focused on the US.

I do expect that if there are shovel-ready, easy-to-justify opportunities outside the US, Open Phil would take them.

You can now apply to EA Funds anytime! (LTFF & EAIF only)

Hello! I'm wondering what implications the switch to rolling applications has for how payout reports are published. https://funds.effectivealtruism.org/funds/far-future#payout-reports doesn't include anything beyond April 1, 2021, whereas previously there would be three reports per year tied to the (discrete) grant rounds.

Wikipedia editing is important, tractable, and neglected

Hi Darius!

I appreciate that you've raised this issue and provided a reasonably thorough discussion of it. I would like to highlight a few aspects based on my experience editing Wikipedia as well as studying its culture in some depth. While the paid-editing episode and its subsequent fallout partly inform my views, they are mostly based on several years of experience before (and some after) that incident.

While none of what I say falsifies what you wrote, it is in tension with some of your tone and emphasis. So in some ways these observations are critical of your post.

How much reverence does Wikipedia's process deserve?

I think that, if your goal is to spend a lot of time editing Wikipedia, it's really important to study Wikipedia's policies -- both the de jure ones and the de facto ones. That's because the policies are not completely intuitive, and the enforcement can often be swift and unfriendly -- giving you little chance to recover once you get on the bad side of it.

So in that sense, it is important to respect and understand Wikipedia's policies.

But that is not the same as being reverent toward the policies and enforcement mechanisms. I think your post has some of that reverence, as well as a "just world" belief that the policies and their enforcement are sensible and just, and align with effective altruist ideals. For instance, you write:

Therefore, anyone considering making contributions to Wikipedia should become familiar with its rules, and in particular adhere to the requirement not to approach editing as an advocacy tool. This is important both because trying to paint an overly favourable picture of EA-related topics will, as Brian notes above, likely backfire, and because observing such a requirement is in line with EA's commitment to intellectual honesty and moral cooperation. Wikipedia is one of the world’s greatest altruistic projects—their contributors share many of our core values, and we should respect their norms and efforts to maintain Wikipedia’s high quality.

and:

Don’t feel like you need to have read all articles about Wikipedia rules and norms before you can start to edit. While reading them upfront may help you avoid some frustrating experiences later, the biggest failure mode is getting overwhelmed and being discouraged from ever taking the first step on your editing journey. Most of Wikipedia’s rules and norms are commonsensical, and you are bound to become familiar with them as you gather editing experience.

In contrast, my take on the Wikipedia system is that it bears many resemblances to other legal and bureaucratic systems -- many of the rules make sense in theory, and have good rationales, but their application is often really bad. Going in with a positive "just world" belief about Wikipedia seems like a recipe for falling down rather hard at the first incident. I think the best approach is to be well-prepared in terms of understanding the dynamics and the kinds of attacks you may endure, so that once you do get in there you have no false expectations, and if you do get into a fight you can back down and stay cool without feeling rattled.

You've linked to Gwern's inclusionism article already; a few other links I recommend: Wikipedia Bureaucracy (continued), Robert Walker's answer on frustrating aspects of being a Wikipedia editor, and Gwern's piece on dark side editing.

On that note, what kind of preparation is necessary?

Based on my experience editing Wikipedia, and seeing my edited articles spend several years surviving, growing, getting deleted, or shrinking -- all of which have happened to me -- I can say it's important to be prepared when editing Wikipedia in a few ways:

  • Prepare for your work getting deleted or maimed: On a process level, this means keeping off-Wikipedia backups (Issa and I implemented technical solutions to automatically back up the content of articles we were editing, in addition to manual syncing at every edit). During a mass-deletion episode following the paid-editing incident, we almost lost the content of several articles, but were fortunately able to retrieve it. At an emotional level, it means accepting the possibility that stuff you spent a lot of time writing can, sometimes immediately and sometimes after years, randomly get deleted or maimed beyond recognition. And even if reasons are proffered for the maiming or deletion, you are unlikely to consider them good reasons.

  • Prepare to be attacked or questioned in ways you might find ridiculous: This may not happen to you for years, and then may suddenly happen even if you are on your best behavior -- because somebody somewhere notices something. While there are a number of strategies to reduce the probability of this happening (don't get into fights, avoid editing controversial stuff, avoid overtly promotional or low-quality edits) they are no guarantee. And if you have a large corpus of edits, once somebody is suspicious of you, they can go after your whole body of work. The emotional and psychological preparation for that -- and the background knowledge of it so that you can make an informed decision to edit Wikipedia -- is important.
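On the backup point above: here is a minimal sketch of what automated off-Wikipedia backups can look like, using the public MediaWiki Action API. The endpoint and JSON shape are the standard ones for `action=query` with `prop=revisions` and `formatversion=2`, but the function names and file layout are purely illustrative -- this is not the actual tooling Issa and I used.

```python
import json
import pathlib
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def backup_url(title: str) -> str:
    """Build a MediaWiki API request for the current wikitext of one article."""
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "rvslots": "main",
        "format": "json",
        "formatversion": "2",
        "titles": title,
    }
    return API + "?" + urllib.parse.urlencode(params)

def extract_wikitext(response: dict) -> str:
    """Pull the raw wikitext out of a formatversion=2 API response."""
    page = response["query"]["pages"][0]
    return page["revisions"][0]["slots"]["main"]["content"]

def backup_article(title: str, dest_dir: str = "wiki-backups") -> pathlib.Path:
    """Fetch an article and save its wikitext to a local file."""
    with urllib.request.urlopen(backup_url(title)) as resp:
        data = json.load(resp)
    path = pathlib.Path(dest_dir) / (title.replace("/", "_") + ".wiki")
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(extract_wikitext(data), encoding="utf-8")
    return path
```

Running something like this on a schedule (or on every manual sync) means a deletion on Wikipedia never destroys your only copy of the content.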

A few specific tripping points for effective altruist projects to edit Wikipedia

When you do get into trouble on Wikipedia, keep in mind these likely truths about the other side (though this can vary a lot from situation to situation, and you could well get lucky enough for these not to apply to you):

  • The bulk of the people will be highly suspicious of you.
  • Those opposing you probably have a lot more time than you do and a better ability to navigate Wikipedia's channels.
  • They will not be impressed by your efforts to defend yourself, even against points you consider clearly illogical.
  • Efforts to point to noble goals (e.g. effective altruism) or measurement tools (e.g. pageviews) will make them more suspicious of you, as these will be taken as evidence of a conflict of interest.
  • Your efforts to recruit people through off-Wikipedia channels (e.g., this EA Forum post) may make matters worse, as it might lead to accusations of canvassing.
  • Being mindful of your feelings will not be a priority for them.

What kind of Wikipedia editing might still be safe and okay to do?

This will vary from person to person. I think the following are likely to be okay for anybody altruistically inclined but moderately risk-averse:

  • Drive-by fixes to spelling, grammar, punctuation, formatting, broken links, etc.: Once you have acquired basic familiarity with Wikipedia editing, making these fixes when you notice issues is quick and easy.
  • Substantive edits or even new page creations where you have fairly high confidence that your edits will pass under the radar of zealous attackers (this tends to work well for obscure but protected topics; some academic areas, such as higher mathematics, can be like this).
  • Substantive edits or even new page creations where, even if the edit gets reverted or the page deleted, the value you create (the update to your own understanding, or the off-Wikipedia copy of the updated content) makes the effort worthwhile.

A positive note to end on

I will end with a wholehearted commendation of the spirit of your post; as I see it, this is about being prosocial in a broad sense, "giving back" to a great resource, and finding opportunities to benefit multiple communities and work in a collaborative fashion with different groups to create more for the world. I generally favor producing public output while learning new topics; where the format and goals allow it, this could be Wikipedia pages! Issa Rice has even documented this "paper trail" approach I follow.

PS: I thank Issa Rice for some of the links and thoughts that I've included in this comment as well as for reviewing my draft of the comment. Responsibility for errors and omissions is fully mine; I did not incorporate all of Issa's feedback.

Wikipedia editing is important, tractable, and neglected

Hi Linch! I have a loose summary of my sponsored Wikipedia editing efforts at https://vipulnaik.com/sponsored-wikipedia-editing/ that I have just updated to include more information and links.

For third-party coverage of the incident, check out https://web.archive.org/web/20170625001549/http://en.kingswiki.com/wiki/Vipulgate -- I'm linking to Wayback Machine since that wiki seems to no longer exist; also a warning that the site's general viewpoints are redpill, which might be a dealbreaker for some readers. But this particular article seems reasonably well-done in terms of its reporting/coverage, and isn't too redpill.

Announcing an updated drawing protocol for the EffectiveAltruism.org donor lotteries

It looks like the NIST randomness beacon will be back in time for the draw date of the lottery. https://www.nist.gov/programs-projects/nist-randomness-beacon says "NIST will reopen at 6:00 AM on Monday, January 28, 2019."

Might it make sense to return to the NIST randomness beacon for the drawing?
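For concreteness, here is a sketch of how a drawing could use a beacon pulse. It assumes the NIST Randomness Beacon 2.0 REST endpoint and its JSON shape (worth verifying against the current docs), and treats the 512-bit `outputValue` as a big integer reduced modulo the total ticket count; the function names and ticket layout are illustrative, not the actual EffectiveAltruism.org protocol.

```python
import json
import urllib.request

# Assumed NIST Randomness Beacon 2.0 endpoint for the most recent pulse.
BEACON_URL = "https://beacon.nist.gov/beacon/2.0/pulse/last"

def fetch_output_value() -> str:
    """Fetch the latest beacon pulse and return its 512-bit outputValue hex string."""
    with urllib.request.urlopen(BEACON_URL) as resp:
        return json.load(resp)["pulse"]["outputValue"]

def winning_ticket(output_value_hex: str, total_tickets: int) -> int:
    """Reduce the beacon output to a ticket number in [0, total_tickets).

    The modulo bias is negligible: the output is uniform over 2**512
    values, vastly more than any plausible ticket count.
    """
    return int(output_value_hex, 16) % total_tickets

def find_winner(ticket_no: int, allocations: list) -> str:
    """Look up the entrant holding ticket_no.

    allocations: list of (entrant, ticket_count) pairs in a fixed,
    publicly committed order, so the mapping is auditable.
    """
    upper = 0
    for entrant, count in allocations:
        upper += count
        if ticket_no < upper:
            return entrant
    raise ValueError("ticket number exceeds total allocated tickets")
```

Because the pulse is published and signed, anyone can rerun the arithmetic after the draw date and verify the winner.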

In defence of epistemic modesty

The comments on naming beliefs by Robin Hanson (2008) appear to be where the consensus around the impressions/beliefs distinction began to form (the commenters include such movers and shakers as Eliezer and Anna Salamon).

Also, impression track records by Katja (September 2017) is a more recent blog post that circulated in the rationalist community and revived the terminology.

Introducing fortify hEAlth: an EA-aligned charity startup

Against Malaria Foundation was started by a guy who had some business and marketing experience but no global health chops. It is now a GiveWell top charity.

https://issarice.com/against-malaria-foundation

https://timelines.issarice.com/wiki/Timeline_of_Against_Malaria_Foundation

Disclosure: I funded the creation of the latter page, which inspired the creation of the former.

Why & How to Make Progress on Diversity & Inclusion in EA

I'm not sure why you brought up the downvoting in your reply to my reply to your comment, rather than replying directly to the downvoted comment. To be clear, though, I did not downvote the comment, ask others to downvote the comment, or hear from others saying they had downvoted the comment.

Also, I could (and should) have been clearer that I was focusing only on points that I didn't see covered in the post, rather than providing an exhaustive list of points. I generally try to comment with marginal value-add rather than reiterating things already mentioned in the post, which I think is sound, but for others who don't know I'm doing that, it can be misleading. Thank you for making me notice that.

Also:

I think this may be part of the problem in this context. Some EAs seem to take the attitude (i'm exaggerating a bit for effect) that if there was a post on the internet about it once, it's been discussed.

In my case, I was basing it on stuff explicitly, directly mentioned in the post on which I am commenting, and a prominently linked post. This isn't "there was a post on the internet about it once"; this is more like "it is mentioned right here, in this post". So I don't think my comment is an example of the problem you highlight.

Speaking to the general problem you claim happens, I think it is a reasonable concern. I don't generally endorse expecting people to have intricate knowledge of years' worth of community material. People who cite previous discussions should generally try to link as specifically as possible to them, so that others can easily know what they're talking about without having had a full map of past discussions.

But imo it's also bad to bring up points as if they are brand new, when they have already been discussed before, and especially when others in the discussion have already explicitly linked to past discussions of those points.
