You can now subscribe to be notified when posts are added to a sequence. You can see more details in GitHub here.
We’ve also made it a bit easier to create and edit sequences, including allowing users to delete sequences they’ve made.
I've been thinking a bit about how to improve sequences, so I'd be curious to hear:
Just letting you know that you can now subscribe to be notified when posts are added to a sequence. Hope this is helpful, and let me know if you run into any issues!
Ah I see, glad to hear it! Yeah, dismissing the popup in one tab doesn't cause the others to refresh or pull the updated data.
My apologies for the inconvenience, and thanks for flagging this! Unfortunately I'm unable to reproduce the issue. Could you provide some more details, such as what browsers/devices you see this on? Does it persist when you refresh? Are you only seeing this when logged in, or have you encountered this while logged out?
Users can directly edit wiki pages once they've earned 10 karma through site activity. I don't believe this list is actively maintained - you can see in the edit history that the last addition was in 2022.
We’ve updated our new user onboarding flow! You can see more details in GitHub here.
In addition to making it way prettier, we’re trying out adding some optional steps, including:
One suggestion I would add is to try volunteering. I'm also introverted and went to my first EAG without knowing anyone, and I found it way easier to chat with other volunteers than random people at the conference. The people who volunteer tend to be either fellow first-time attendees (including many other students) or people who are friendlier than average.
Glad it was helpful! :) Unfortunately the Facebook group is the most active Boston EA space. The group website looks quite outdated, but I think you can still get notified of events via the email list, so I would recommend joining that.
A few quick things I would recommend (if you haven't already done them):
I'll bet EAF put a lot of thought into their palette.
As Ollie mentioned, I made the set you referenced for just this one thread. As far as I remember it was meant to support positive vibes in that thread and was done very quickly, so I would not say a lot of thought went into that palette.
"Filter by topics" lets you search for and select any number of topics, and the results will show anything that has all of the selected topics. Hope that helps!
Thanks for the feedback Linda! I believe you can accomplish this using the topic filters on our current search page, but please let me know if you run into any issues.
I agree that this should be a consideration. Based on the small amount of data I have talking with employees at major AI labs about this, I currently think that overall their workers are less concerned about safety than their management, so I'm worried this could be counterproductive.
Thanks for sharing your feedback! Responding to each point:
That's right, you should be able to mention users with @ and posts with #. However, it does seem like they're both currently broken, likely because we recently updated our search software. Thanks for flagging this! We'll look into it.
I definitely feel it was worth the time for me personally. It was great for learning about the field of AI alignment (problems and proposed solutions). I was hoping the course would spend more time on arguments for and against AI being an x-risk, but unfortunately there was little of that, so it didn't change my mind much.
I'm enjoying the podcast so far! One suggestion: I'd love if you could put links in the episode description for all the things you bring up during the episode (books, newsletters, articles, etc.), to make them easier to reference.
You'll need a site admin to help with both of these. Could you contact us with the details (e.g., how you want the chapters organized)? Thanks!
Interesting, thanks for flagging this bug! It should be fixed now - please let us know if you run into any related issues.
Oh sorry, a recent change to images caused a bug, but it should be fixed now. (You can fix your image by editing and submitting your comment.)
You can add "Community" as an option by clicking on the + button and searching for it.
You can re-hide them from that section by opening "Customize Feed" and setting "Community" to be "Hidden":
Thanks for reporting the bug! I just deployed a fix, so the "Magic" default sorting should be properly applied now.
Yeah I was wondering if this was what the question asker was getting at. Thank you for clearly explaining it.
You're right that this doesn't exist. My instinct is that this doesn't provide enough value to be worth the cost of the extra UX complication and the slight deanonymizing effect on voting. I'd be curious to hear how this kind of feature would be helpful for you.
Yeah, the forum relies a lot on hover effects, which don't work very well on mobile. Avoiding them in this case seems like it would overcomplicate the UI, though, so I'm not sure what an improved UX would look like. I'll add this to our backlog for triage.
Hovering over the karma score displays how many votes there are. Does that address your request, or is there something missing?
Thanks for flagging that we had a bug affecting voting! It should be fixed now; please let me know if you see any more related issues.
I appreciate the suggestions! I agree we should make this info easier to find - added these to our list for triage.
I think I'm confused by where the "additional suffering" is coming from. If dying via being caught is approximately as painful as dying via other common means, then is this argument based on the premise that the fish will lead net negative lives?
Thanks for sharing this - I found it informative! :)
Fisheries subsidy reform would probably cause more fish to be brought into existence and more fish to suffer highly painful deaths by being caught in fisheries. This additional suffering would likely outweigh any benefits from fewer fish being caught in the short term.
Just to clarify, are you saying that a fish dying via being caught in a fishery is more painful than other ways that it's likely to die? If so, can you expand on why that is?
People interested in global health & development and this post might be interested in applying to the Program Operations Assistant, Global Health & Wellbeing role at Open Philanthropy, an EA-aligned research and grantmaking foundation.
This is a test by the EA Forum Team to gauge interest in job ads relevant to posts - give us feedback here.
Thanks for the feedback! This is one of multiple job-related tests we're running on the forum, to see if we can find something impactful to build. We did try out your suggestion in the form of the Who's hiring? (May-September 2022) thread, and we're still analyzing the results. The difference here, as Lorenzo pointed out, is that we could potentially capture people who are not actively looking for a job but would consider applying if they were made aware of relevant opportunities.
We're testing this via comments because it's a cheap MVP - no coding necessary.
People interested in global health & development and this post might be interested in applying to the Senior Researcher role at GiveWell, a non-profit focused on helping people do as much good as possible with their donations.
Thanks for asking! I'd suggest looking at the 80,000 Hours job board filtered by "nuclear security" - for example, I see the Junior Policy Fellow position at the Centre for Science and Policy in Cambridge, UK. Hope that helps! :)
People interested in animal welfare and this post might be interested in applying to the Executive Assistant role at Mercy for Animals, which is working to end industrial animal agriculture.
People interested in AI risk and this post might be interested in applying to the Business Operations role at Anthropic, which is working to build reliable, interpretable, and steerable AI systems.
People interested in nuclear security and this post might be interested in applying to the Spring 2023 Internship Program at the Nuclear Threat Initiative, a non-profit working to prevent global catastrophic risks by driving systemic solutions to nuclear and biological threats imperiling humanity. Applications are due by Friday, October 21.
People interested in AI risk and this post might be interested in applying to the researcher or software engineer roles at the Alignment Research Center, a non-profit organization focused on theoretical research to align future machine learning systems with human interests.
The virtual version of this event is just a Zoom webinar; as far as I can tell, there is no open discussion available virtually. Attendees can propose questions to be asked by the moderator at the end of each talk. It sounds like recordings of the presentations will be made available afterwards.