
tl;dr: a request for more features that allow people to interact with the forum privately

Why?

This forum's current culture of complete epistemic humility, honesty and openness runs counter to a lot of professional cultures.

A person in a professional culture may not want to disclose some sensitive opinions of theirs. This could be a political leaning to the left or right. It could be revealing motivations or vulnerabilities - for instance, if someone discloses that they're actually more interested in things other than what they claimed in their career. It could even be something as sensitive as discussion of AI alignment or infohazards, which a person cannot speak about publicly because they're in a position of authority on those issues.

There may also occasionally be good reasons for someone to be deceptive in their professional life, or to project more confidence than is epistemically honest.

All of this matters more to people who have more influence and authority, which is exactly the group the EA Forum may benefit from more interaction with. At present, they would choose to limit their interaction with such forums to be on the safe side. Privacy features could change this.

Features requested

 - ability to delete account and all posts/comments

 - ability to delete account but leave posts/comments up

 - ability to change username

 - ability to disable profile from search indexing (and ensure the subdomain or link allotted reflects the changed username)

 - ability to disable posts from search indexing (see the sketch after this list)

 - ability to download user data

 - ability to create account without linking an email
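For the search-indexing items, the conventional mechanism is a robots "noindex" directive served with the opted-out page, either as a `<meta name="robots" content="noindex">` tag or an `X-Robots-Tag` response header. Below is a minimal sketch of that mechanism, not the Forum's actual implementation; the Flask app, route, and `NOINDEX_USERS` set are illustrative assumptions.

```python
# Illustrative sketch only: a tiny Flask app that marks opted-out profile pages
# with a robots "noindex" directive. The framework, route, and user store are
# assumptions, not the EA Forum's real stack.
from flask import Flask, make_response

app = Flask(__name__)

# Hypothetical per-user preference; a real forum would read this from
# the user's account settings.
NOINDEX_USERS = {"example_user"}

@app.route("/users/<username>")
def user_profile(username):
    response = make_response(f"Profile page for {username}")
    if username in NOINDEX_USERS:
        # Ask well-behaved crawlers not to index or archive this page.
        response.headers["X-Robots-Tag"] = "noindex, noarchive"
    return response
```

Note that a robots directive only affects compliant search engines; as a comment below points out, it does nothing against direct scraping of publicly served pages.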

Comments (6)



Thanks for providing feedback on the Forum! I just wanted to clarify some of the specific features you requested.

  1. ability to delete account and all posts/comments
  2. ability to delete account but leave posts/comments up
  3. ability to change username
    • You can now do this from your account settings page! Though you only get one change; after that, you'll have to contact us to change it again.
  4. ensure the subdomain or link allotted reflects the changed username
  5. ability to disable profile from search indexing
  6. ability to disable posts from search indexing

We really appreciate hearing from folks. We keep an eye on the Forum feature suggestion thread, so please feel free to post suggestions there.

Aiight.

 

To onlookers reading this: to the degree that the features below are designed to erase forum content and prevent public discoverability, you should know that they are effectively impossible to achieve in some sense:

 - ability to delete account and all posts/comments

 - ability to delete account but leave posts/comments up

 - ability to change username

 - ability to disable profile from search indexing (and ensure the subdomain or link allotted reflects the changed username)

 - ability to disable posts from search indexing

For example, this project itself makes this impossible. 

In fact, much simpler actions, such as running a Python snippet I could provide, could download much of this information in <5 minutes on an office computer.
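As an illustration, here is a minimal sketch of the kind of snippet described above, under assumed conditions: the URL pattern, server-rendered HTML, and link structure are guesses, and the Forum's real markup and API may differ. The point is only that anything served publicly can be downloaded with a few lines of standard tooling.

```python
# Illustrative sketch: download a user's public profile page and list the
# post links it contains. The URL pattern and markup are assumptions; the
# Forum's real pages and rendering may differ.
import requests
from bs4 import BeautifulSoup

BASE = "https://forum.effectivealtruism.org"  # public pages, no login required

def fetch_post_links(username: str) -> list[str]:
    """Return hrefs of post links found on a user's public profile page."""
    # Rate limits or API shutdowns don't change the picture much: requests
    # can be routed through a pool of proxies via the `proxies=` argument.
    html = requests.get(f"{BASE}/users/{username}", timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    return [a["href"] for a in soup.find_all("a", href=True) if "/posts/" in a["href"]]

if __name__ == "__main__":
    for link in fetch_post_links("some_username"):
        print(link)
```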

I write this basically for "personal deontological reasons"; however, the availability and permanence of this information is not due to me. For example, consider the situation if one had information about other, non-EA entities collecting this information (which might suggest one reason to collect it, of many).

Note that this situation cannot be prevented by shutting down the API (scraping can continue through, e.g., an array of residential proxies).

Separately, much more information is available than might be obvious, again using just the publicly available information on the forum.

 

In addition to the "deontological" reasons, one reason I comment is that I want to discourage adding duties or tasks for the EA Forum development team, since these features might often not actually do the thing you want.

Without having thought at all about the potential downsides, I'd say those all seem like plausibly useful features. Having said that, you're already allowed to create an anonymous or pseudonymous account - people frequently do - and creating a new email address isn't hard.

Is there a reason I'm missing that means that these features would give people a substantial advantage over just creating an account with a non-identifiable name?

Some people might make an account on the EA Forum, post some comments online about a variety of topics, but then later find themselves running for public office or otherwise suddenly exposed to greater public scrutiny.  In that situation, many people would want to go back and delete or hide previous comments they'd made even on relatively innocuous (by EA standards) topics.  For example, in 80,000 Hours' guide for a career path as a congressional staffer, they advise "Keep a low profile: Don’t publish controversial opinions on social media or do anything else that could make you look bad.  Congressional staffers are in the public eye, so doing so could prevent you from getting a job."

One recent example of something like this happening in real life was when Andrew Sabisky was hired by Dominic Cummings in the UK to promote Phil-Tetlock-style "superforecasting" and prediction-market initiatives within the UK government, but was soon forced to resign in disgrace when past comments on rationalist blogs (about embryo selection, IQ, nootropics, and other controversial Slate-Star-Codex-y topics) were exposed by the media and used to paint a picture of Sabisky as a misogynist and eugenicist. Most EA Forum commenters have a much less aggro writing style than Sabisky and don't have as many hot-button controversial opinions as he did. But even my own writing on the EA Forum would probably look pretty bad if I suddenly found myself running for Congress. After all, I've:

Fortunately, I'm an aerospace engineer with good personal financial security, and I have no twitter account and no plans to run for office, so I can say all this stuff without fear!  But plans change.  I could imagine plenty of unlikely-but-possible scenarios where I'd enjoy the ability to delete my account, or change my username from "Jackson Wagner" to something more discreet like "LongtermistResearcher", or lower my posts' prominence in search engine results.

Based on my very limited understanding of the GDPR, it's possible that some of these features (like the ability to delete one's account and all posts) are legally required, given that a) some users are EU citizens and b) many users have their name (which is personally identifying information) in their username. Well, I guess to be GDPR-compliant you don't need these to be automated features; it should be okay if EA Forum administrators are able to manually delete, e.g., all posts by an EU user within one month of their request.

A person in a professional culture may not want to disclose some sensitive opinions of theirs. This could be a political leaning to the left or right. It could be revealing motivations or vulnerabilities - for instance, if someone discloses that they're actually more interested in things other than what they claimed in their career. It could even be something as sensitive as discussion of AI alignment or infohazards, which a person cannot speak about publicly because they're in a position of authority on those issues.

There may also occasionally be good reasons for someone to be deceptive in their professional life, or to project more confidence than is epistemically honest.

All of this matters more to people who have more influence and authority, which is exactly the group the EA Forum may benefit from more interaction with. At present, they would choose to limit their interaction with such forums to be on the safe side. Privacy features could change this.

 

Yeah, this isn't really true.

 

The most relevant situation might be for a small group of people who join and briefly make "magisterial" comments with great depth and perspective. This is some of the most valuable and unique engagement the forum gets. 

These people giving magisterial perspective are usually senior and are basically trained to communicate and present themselves. In some sense, it is their duty and wish to use their persona in this way. 

So things are kind of exactly the opposite of the perspective in this post.

 

For established EAs in a position of authority who want to say something about "AI alignment or infohazards", there are several channels through which they can act or dissent, and I don't think the EA Forum is a large part of that. It's also trivial to create a new forum account, but a much more difficult task to really conceal one's identity.
 
