University student in Budapest, Hungary. I'm currently studying philosophy, understand pure math, and know a lot of computer science. Wannabe AI alignment researcher.
I often lack the motivation to do things without external validation, so if you want to help me, just tell me to do it.
I can understand math-heavy ideas and can describe them unintelligibly. I can solve any computer problem remotely by telling you to turn it off and on again.
Cap the number of strong votes per week.
Strong votes with large weights have their uses, but those situations are uncommon, so instead of weakening strong votes, make them rarer.
The guideline says to use them only in exceptional cases, but there is no mechanism enforcing that: socially, strong votes are anonymous and look like standard votes; technically, any number of them can be used. As it stands, they can make a comment section appear very one-sided; if they were rarer, a few ideas could still be lifted or hidden while the rest of the section stayed more diverse.
I do not think this is a problem right now, because current power users are responsible. But that is good fortune, not a guarantee, and it could change in the future. Incidentally, a cap would also set a concrete bar for what counts as exceptional, e.g. "this comment is in the top X I've seen this week."
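For concreteness, here is a minimal sketch of what a weekly cap could look like, in Python. The vote-record fields, function names, and the cap value are illustrative assumptions, not anything from the Forum's actual codebase.

```python
from datetime import datetime, timedelta

# Illustrative cap; the real number would be a product decision.
WEEKLY_STRONG_VOTE_CAP = 5

def can_strong_vote(user_id: str, vote_log: list[dict]) -> bool:
    """Return True if the user still has strong votes left this week.

    `vote_log` is assumed to be a list of records like
    {"user_id": ..., "strong": bool, "timestamp": datetime};
    a real implementation would query the database instead.
    """
    week_ago = datetime.utcnow() - timedelta(days=7)
    used = sum(
        1
        for v in vote_log
        if v["user_id"] == user_id and v["strong"] and v["timestamp"] > week_ago
    )
    return used < WEEKLY_STRONG_VOTE_CAP
```

A rolling seven-day window (rather than a calendar week) avoids everyone's quota resetting at the same moment.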
I, as an individual, agree with the statement. No one is infallible, and every organization has its smaller or bigger problems.
On the other hand, idols and community leaders provide an easy point for concentration of force. A few big coalitions have a larger impact than many scattered small groups, and if someone wants to organize a campaign, a few leaders can reach a decision much faster than a large group of individuals.
If no one agrees on what to do, the movement of the Movement will grind to a halt. That's why there is value in keeping EA high-trust and, to some extent, accepting the word of the few at the top.
But given the number of scandals in the last few months, maybe they overshot this high-trust thing a little bit; imho a bit more transparency would be nice.
This is my favourite drama. In my interpretation it's more about AI risk (the last idea we need, the invention of all inventions), but Dürrenmatt was limited by the technology of his age. I mean, if you think Solomon is the AI character, then the end of the play is about Solomon escaping the "box" while trapping its creators inside.
That could be a plus. If you're running a local group and lend out books at a public event (like tabling), this incentivises the borrowers to attend the next local EA event too, where they can bring the books back.