All of lincolnq's Comments + Replies

There are so many different bags and brands available that you should specify more constraints if you want more personalized recommendations. Vegan is not too hard to satisfy - most vegan luggage won't necessarily say so; you just have to check the description for animal products to avoid (mostly leather and suede).

For me personally, my main carry is a Tom Bihn Techonaut 30 - it's big enough to carry 5+ days of clothing and my laptop and other gear without needing another bag, but lightweight enough that when I need more space, I am happy to carry it as just a small backpack alongside my Travelpro Maxlite 5.

I also like the /r/onebag subreddit.

(My personal opinion, not EV's:)

EV is winding down, and being on this board is quite a lot of work. This makes it very hard to recruit for! The flip side of the wind-down, though, is that the cultural leadership we are doing is a bit less impactful than it was, say, a year or two ago.

When we faced the decision of whether to keep searching or accept the candidates in front of us, I considered many factors but eventually agreed that it was ok to prioritize allowing the existing board members to leave (which they couldn't do until we found ... (read more)

-2
Jason
2mo
The EVF Boards have to approve the new setup of the spun-out organizations, right? The projects themselves (e.g., CEA, GWWC) have no legal personality at present. "Their" assets, IP, etc. belong to the EVFs, which must approve the spin-off proposals transferring those resources to a new organization. It's not obvious to me why the new organization's Board composition is for staff in the to-be-spun-off project to decide, rather than the EVF Boards. Staff picking Board members is not a best practice in the majority of cases; at a minimum, there should be less deference to the staff nominations than there would be to an independent nominations panel. In other words, the EVF Boards probably shouldn't agree to transfer "the project's" assets over to a new non-profit corporation unless they are satisfied that appropriate corporate governance procedures are in place.[1]

I think there are a number of presumptive criteria the EVF Boards could set for approval of a spin-off: spin-out Boards should have at least five members, except perhaps for the smallest of new orgs; no more than 2/5 of Board members from one cluster of influence;[2] not all men; not all White people; not all US or all UK people; etc. If a would-be spinoff org wants a variance, it needs to make an affirmative case to the Boards as to why the advantages of a variance outweigh the reasons behind the presumptive criteria.

1. ^ I would be more inclined to defer to, e.g., a non-binding GWWC director/trustee election conducted by GWWC pledgers, even if I found the resulting slate a bit shaky.
2. ^ A bit challenging to define, but "the Open Phil cluster" might be an example.
5
Ulrik Horn
2mo
Thanks Lincoln, that's a good point - I had not considered that. I'm so used to thinking of EV as a super big deal. Let's hope there is good diversity in the leadership of the various spun-out organisations.

I would like to point out that this is one of those things where n=1 is enough to improve people's lives (e.g., the placebo effect works in your favor), in the same way that I can improve my life by taking a weird supplement that isn't scientifically known to work but helps me when I take it.

For what it's worth, my life did seem to start going better (I started to feel more in touch with my emotional side) after becoming vegan.

While I broadly agree with Rocky's list I want to push back a little vs. your points:

Re your (2): I've found that small entities are in a constant struggle for survival, and must move fast and focus on the most important problems unique to their ability to make a difference in the world. Small-seeming requirements like "new hires have to find their own housing" can easily make the difference between being able to move quickly vs. slowly on some project that makes or breaks the company. I think for new entities the risks of incurring large costs before you ... (read more)

4
lilly
7mo
Thanks for your perspective on this! Do you have an example of this? It is surprising to me that maintaining reasonable/standard professional norms could actually sink a company. (Among other things because at a small company, you have limited manpower, and so personnel time devoted to helping someone find housing is presumably coming out of time spent somewhere else—i.e., working on the time-sensitive project.)

I suspect we're just defining "professional" differently here (or thinking about really different professional contexts), but my experience is pretty strongly informed by having worked in an office pre-COVID, and seeing how profoundly professional culture has eroded, and how hard it has been to build any of that back. I think grad students/academics who have taught undergrads post-COVID have also been struck by this: it seems like norms within education quickly (and understandably!) became quite lax during COVID, but it's been quite difficult to reverse those changes (i.e., get students to turn stuff in on time, respond to emails, show up to mandatory events, etc.).

That said, I know that older people have always tended to think that the youth are a bunch of degenerates, so plausibly that's coloring our perception here, too.
Answer by lincolnq · Sep 03, 2023

You might want to check out some of Phil Trammell's reports, where he analyzes what he calls time preference (time discount rate) with respect to philanthropy: https://docs.google.com/document/d/1NcfTgZsqT9k30ngeQbappYyn-UO4vltjkm64n4or5r4/edit

1
Moritz Linder
7mo
Thanks for this link, I liked it very much. However, I am inquiring about estimates of "returns on giving" to philanthropic funds that could be compared to "returns on investment" for conventional stock or index funds.

Congrats on having invented something exciting!

Usually, the best way to get innovative new technology into the hands of beneficiaries quickly is to get a for-profit company to invest with a promise of making money. This can happen via licensing a patent to an existing manufacturer, or creating a whole startup company and raising venture capital, etc.

One of the things such investors want to see is a 'moat': something that this company can do that no other company can easily copy. A patent/exclusive license is a good way to create a moat.

There are some domai... (read more)

I'm a bit confused about this because "getting ambitious slowly" seems like one of those things where you might not be able to successfully fool yourself: once you can conceive that your true goal is to cure cancer, you are already "ambitious"; unless you're really good at fooling yourself, you will immediately view smaller goals as instrumental to the big one. It doesn't work to say I'm going to get ambitious slowly.

What does work is focusing on achievable goals though! Like, I can say I want to cure cancer but then decide to focus on understanding metabolic pathways of the cell, or whatever. I think if you are saying that you need to focus on smaller stuff, then I am 100% in agreement.

3
Elizabeth
8mo
Does what I said here and here answer this? The goal isn't "put the brakes on internally motivated ambition", it's "if you want to get unambitious people to do bigger projects, you will achieve your goal faster if you start them with a snowball rather than try to skip them straight to Very Big Plans".

I separately think we should be clearer on the distinction between goals (things you are actively working on, have a plan with concrete next steps and feedback loops for, and could learn from failure) and dreams (things you vaguely aspire to and maybe are working in the vicinity of, but have no concrete plans for). Dreams are good, but the proper handling of them is pretty different from that of goals.

I avoid reading, and don't usually respond to, comments on my posts, or replies to my own comments.

The reason is that it's emotionally intense to do so: after posting something on the EA Forum, I avoid checking the forum at all for ~24h or so (for fear of noticing replies in the 'recents' area, or changes in my karma), and after that I mainly skim for people flagging major errors or omissions that need my input to be resolved.

Lizka's You Don't Have to Respond to Every Comment talks about this a bit (and was enormously helpful for me) - I am not strongly av... (read more)

3
geoffrey
8mo
This was a good nudge for me to lower the frequency on all my notifications (especially the karma one to weekly, which I’ve been checking more than I’d like lately)
2
OllieBase
8mo
Yep, I do something similar! Sounds like I end up replying to more things than you, but I usually disavow the forum for several days or just go on holiday after posting.
1
Charlie_Guthmann
8mo
I feel this. It would be cool if you could drop a post and put a Zoom link at the bottom to discuss it in, like, 24 or 48 hours; that way there can still be a discussion, but it maybe skirts around some of this obsessive forum-checking ego stuff.

Can you give some evidence/an example for "unable to mentor many of the qualified applicants"?

3
Will Aldred
8mo
One example is what Ben Garfinkel has said about the GovAI summer research fellowship:

I think this is a useful question and I'm glad to be discussing this.

I agree with many of your concerns - and would love to see a more culturally-unified EA on the axis of how conscious we are of our own impact - but I also think you're failing to acknowledge something crucial: As much as EA is about altruism, it is also about focus on what's important, and your post doesn't acknowledge this as a potential trade-off for the folks you're discussing.

You'll find a lot of EA folks perceive climate change as a real problem but also perceive marginal carbon cost... (read more)

1
Vaipan
8mo
I personally offset all my CO2 with Wren, and yet I highly doubt that most EAs do that. You say that carbon is offsettable, but that's still a vigorous debate: the measures we take to offset said carbon often won't remove it for years, if not centuries.

For someone who goes to a conference, how can they really measure the trade-offs? Meeting one person who helps them get an EA job, against ten other people from other contexts? It sounds hypocritical. The truth is, it's hard to truthfully calculate the impact you're having at these conferences because the results take years; the carbon, however, is spent. Here. Now. And seeing global warming as 'marginal' is a grave error to make, IMO. These folks justify their high-carbon cost of living by saying they make impact elsewhere, but they can't really calculate it.

All this doesn't make my post less relevant: 1) we need to talk about it more and have some kind of pledge / be transparent about it; 2) we need to do something about this carelessness that comes from lack of accountability.

I'm interested in the discussion of whether in fact we are at a hinge of history, maybe this is a good comments section for that. I agree that Will's analysis barely scratches the surface and has some flaws.

Factors under consideration for me:

  • Existence of technologies that can have direct impacts on future society through making the world much better or much worse: computation and AI, the internet & social media, nanotech, biotech, the printing press, energy production / Dyson spheres
  • Do population/economic growth rates matter? i.e., if we are growing
... (read more)

The GiveDirectly founders (Michael Faye and Paul Niehaus) also founded Taptap Send (https://techcrunch.com/2021/12/20/taptap-send-raises-65m-to-build-cross-border-remittances-focused-on-the-most-underserved-markets/), which competes with Sendwave to keep remittance prices down.

Answer by lincolnq · Jul 25, 2023

Yeah. I just joined the board so I don’t exactly know why, but we are definitely aware of missing this deadline and the charity commission is as well, and I think it is caused by the ongoing investigation.

This is correct. We ended up needing to resolve a couple issues related to the inquiry before we can file. We’ve stayed in touch with the Charity Commission about the delay.

It's a fair critique. I use "legible" in this way, and I don't really want to give it up; I think it's not too bad as jargon goes, because even non-EA people seem to understand it without much prefixed definition.

Your alternatives don't quite capture the idea right:

  • If I were to set a "clear" or "understandable" goal, I would expect people to be able to make sense of the goal statement but not necessarily see what KPIs went into it.
  • "Verifiable" is the opposite: I would expect people to expect that they could check whether or not we made the goa
... (read more)
1
Sailboat
9mo
Thanks for the pushback! I agree that the specific meaning you're outlining isn't captured by any of my alternatives. That said, part of my issue is that I don't think EAs consistently use it that way: if you search "legible" on the Forum, I think a lot of the instances are essentially just synonyms for "clear". If people used "legible" when they meant "clear and verifiable" and used "clear" when they meant "clear", I'd be on board with that. Even if the status quo continues, I don't think it's the end of the world or anything. I just sometimes feel like we're using jargon because it makes us feel cool rather than to facilitate communication, and that gives me some weird vibes. 

Why does it make sense for Rethink Priorities to host research related to all five of the listed focus areas within one research org? It seems like they have little in common (other than, I guess, all being popular EA topics)?

6
Peter Wildeford
9mo
I agree this is confusing. I get into this in my answer to Sebastian Schmidt.

You said in your "Five years" post that you are planning to do more self-eval and impact assessments, and I strongly encourage this. What are the most realistic bits of evidence you could get from an impact report of Rethink Priorities which would cause you to dramatically update your strategy? (or, another generator: what are you most worried about learning from such assessments?)

How has your experience as co-CEO been? How do you share responsibilities? Would you recommend it to other orgs?

I’ve personally liked it. There have been several times when I’ve talked with my co-CEO Marcus about whether one of us should just become CEO and it’s never really made sense. We work well together and the co-CEO dynamic creates a great balance between our pros and cons as leaders – Marcus leads the organization to be more deliberate and careful at the cost of potentially going too slowly and I lead the organization to be more visionary at the cost of potentially being too chaotic.

Right now we split the organization very well where Marcus handles the portf... (read more)

Excellent piece! I agree with this mindset but regularly struggle to explain why it's motivating / good to think this way, and I think you've done a nice job.

I don’t believe this is an unbelievably terrible idea; it makes sense to do this in some circumstances. That said, take resentment buildup seriously! If you feel that you are the sort of person who has even a small chance of feeling resentful about this choice later on, it is probably not worth it. You need to feel unambiguously good about this decision in the short and long term.

Yeah, sorry, I wrote the comment quickly and "resources" was overloaded. My first reference to resources was intended to be money; the second was information like career guides and such.

I think the critical-info-in-private thing is actually super impactful towards centralization, because when the info leaks, the "decentralized people" have a high-salience moment where they realize that what's happening privately isn't what they thought was happening publicly; they feel slightly lied-to or betrayed, and lose perceived empowerment and engagement.

The tractability of further centralisation seems low

I'm not sure yet about my overall take on the piece but I do quibble a bit with this; I think that there are lots of simple steps that CEA/Will/various central actors (possibly including me) could do, if we wished, to push towards centralization. Things like:

  • Having most of the resources come from one place
  • Declaring that a certain type of resource is the "official" resource which we "recommend"
  • Running invite-only conferences where we invite all the people that are looked-up-to as leaders in the comm
... (read more)
1
Larks
10mo
I feel like you ... maybe did not try very hard to brainstorm incremental pro-centralisation steps? I set aside 5 minutes and came up with 17 options, mainly quite tractable, that CEA/EV/OP could do if they wished, starting with very simple ideas like "publicly announce that centralisation is good".  (Not sharing the list because I'm not convinced I want more power centralised).

Thanks!  I agree that we are already (kind of) doing most of these things. So the question is whether further centralisation is tractable (and desirable). Like I say, it seems to me the big thing is if there’s someone, or some group of people, who really wants to make that further centralisation to happen. (E.g. I don’t think I’d be the right person even if I wanted to do it.)

Some things I didn't understand from your bullet-point list:


Having most of the resources come from one place


By “resources” do you primarily mean funding?  (I'll assume ... (read more)

I mostly agree, but would add that it seems totally okay if two orgs sometimes work on the same thing! It's easy to over-index on the simple existence of an item within scope and say "oh that's covered" and move on, without actually asking "is this need really being met in the world?" Competition is good in general, and I wouldn't want to overly discourage it.

Vague agree with the framing of questions vs. answers, but I feel worried that "answer-based communities" are quite divergent from the epistemic culture of EA. Like, religions are answer-based communities but a lot of EAs would dispute that EA is a religion or that it is prescriptive in that way.

Not sure how exactly this fits into what you wrote, but figured I should register it.

1
[anonymous]
10mo
I don't feel worried about that. I feel worried that this post frames neartermist-leaning orgs (like the OP's) as question-based i.e. as having an EA epistemic culture, while longtermist-leaning orgs are framed as answer-based i.e. as having an un-EA epistemic culture without good reason.

I wrote up my nutrition notes here, from my first year of being vegan: http://www.lincolnquirk.com/2023/06/02/vegan_nutrition.html

5
Julia_Wise
11mo
Really glad to have you joining!
4
Kirsten
11mo
The EV board is lucky to have you!

I want to push back a little against this. I care more about the epistemic climate than I do about the emotional climate. Ideally in most cases they don't trade off. Where they do, though, I would rather people prioritize the epistemic climate, since I think knowing what is true is incredibly core to EA, more than the motivational aspect of it!

1
tcelferact
1y
I agree with this. Where there is a tradeoff, err on the side of truthfulness.

Everything here is based on friends' recommendations and very lightweight research, I didn't do much original research and didn't measure my levels. I'll probably get around to measuring soon and I expect this plan to change a bit. Philosophically I have chosen a low effort/low risk plan which I think is sustainable for me.

I take creatine and b12 when I remember to take them, which tends to be on days I go to the gym and make a smoothie afterwards. I take D3 sporadically when I think of it during the winter months (although this winter I didn't bother for ... (read more)

Ok, my best idea is to highlight a Marxist theory of labor vs. capital at the small scale. I know this sounds very high brow but I think a distillation of it could work?

Give someone a loaf, they can eat it.
Teach them to bake, they can join the labor market and work hard to feed themselves.
Give them money for an oven, they can own the means of production.

Ok, I mostly agree with you, but let's reframe as a devil's advocate: what if "EA" is a shaky concept in the first place (doesn't carve reality at joints)? Would you then agree that borders should be redrawn to have a more coherent mission, even if that ends up cutting out some bits of the "old EA"?

3
Wil Perkins
1y
I think we can manage to have different enclaves of EA with different norms, that still broadly agree and play nicely with each other. As a community organizer I hope to get a better idea of what different groups value so I can navigate these situations better. Could you explain a bit more as to what you’re proposing?

Very clear - makes a point that I've been struggling to think about and explain to people. Thanks for writing this.

1
rgb
1y
Thank you!

Great post! It inspired me to write this, because I worry that such posts might accidentally discourage others from working on this cause area. https://forum.effectivealtruism.org/posts/e8ZJvaiuxwQraG3yL/don-t-over-update-on-others-failures

(to be clear: I really appreciate postmortems and want more content like it!)

4
saulius
1y
I worry about the same thing and it's one of the reasons why I hesitated to post this for a long time. Thank you for your comment and your post. I want to paste a comment I wrote under your post because I want people who work on WAW to read it, even though it's kind of trivial:

And to reiterate, I think that most WAW work is still very promising compared to most other altruistic work, especially when you are one of only a few people working on it. I just don't think we have enough evidence that it's impactful yet to massively scale it up. But it is important to test it.

EDIT: I just want to also add that I might still recommend WAW as a career choice for some people. For example, if you are an expert in ecology and have an aptitude for handling messy research problems.

hm, at a minimum: moving lots of money, and making a big impact on the discussion around ai risk, and probably also making a pretty big impact on animal welfare advocacy.

My loose understanding of farmed animal advocacy is that something like half the money, and most of the leaders, are EA-aligned or EA-adjacent. And the moral value of their $s is very high. Like you just see wins after wins every year, on a total budget across the entire field on the order of tens of millions.

A lot of organisations with totally awful ideas and norms have nonetheless ended up moving lots of money and persuading a lot of people. You can insert your favourite punching-bag pseudoscience movement or bad political party here. The OP is not saying that the norms of EA are worse than those organisations', just that they're not as good as they could be.

4
Guy Raveh
1y
Are we at all sure that these have had, or will have, a positive impact?

Nice. Thanks. Really well written, very clear language, and I think this is pointed in a pretty good direction. Overall I learned a lot.

I do have the sense it maybe proves too much -- i.e. if these critiques are all correct then I think it's surprising that EA is as successful as it is, and that raises alarm bells for me about the overall writeup.

I don't see you doing much acknowledging what might be good about the stuff that you critique -- for example, you critique the focus on individual rationality over e.g. deferring to external consensus. But it seem... (read more)

9
Chris Leong
1y
"I do have the sense it maybe proves too much -- i.e. if these critiques are all correct then I think it's surprising that EA is as successful as it is, and that raises alarm bells for me about the overall writeup" Agreed. Chesterton's fence applies here.

“I don't see you doing much acknowledging what might be good about the stuff that you critique”

I don’t think it’s important for criticisms to do this.

I think it’s fair to expect readers to view things on a spectrum, and interpret critiques as an argument in favour of moving in a certain direction along a spectrum, rather than going to the other extreme.

2
Guy Raveh
1y
In what ways is EA very successful? Especially if you go outside the area of global health?

It’s interesting to me, because many entrepreneurs like myself get into entrepreneurship with (we sincerely believe) a goal of making the world a better place. Some are seemingly frauds. It is good to read this, to gain perspective on what not to do.

The problem with considering optics is that it’s chaotic. I think Wytham is a reasonable example. You might want a fancy space so you can have good optics - imagining that you need to convince fancy people of things, otherwise they won’t take you seriously. Or you might imagine that it looks too fancy, and then people won’t take you seriously because it looks like you’re spending too much money.

Pretty much everything in “PR” has weird nonlinear dynamics like this. I’m not going to say that it is completely unpredictable but I do think that it’s quite hard ... (read more)

7
Jason
1y
Holding conferences is not "actually good for the world" in any direct sense. It is good only to the extent that it results in net good outcomes -- and you're quite right that those outcomes can be hard to predict. What I think we have to be careful to avoid is crediting the hoped-for positive aspects while dismissing the negative aspects as "optics" that cannot be adequately predicted.

Also, you could always commission a survey to generate at least some data on how the public would perceive an action. That doesn't give much confidence in what the actual perception would be . . . but these sorts of things are hard to measure/predict on both the positive and negative ends. If people are just too unpredictable to make EV estimates based on their reactions to anything, then we should just hold all conferences at the local Motel 6 or wherever the cheapest venue is. "Dollars spent" is at least measurable.
3
Lukas_Gloor
1y
I agree with this. It's also not clear where to draw the boundary. If even well-informed people who shared your worldview and values thought a given purchase was bad, then there's no need to call it "optics" – it's just a bad purchase. So "optics" is about what people think who either don't have all the info or who have different views and values.

There's a whole range of potential differences here that can affect what people think. Some people are more averse to spending large amounts of money without some careful process that's there to prevent corruption. Some people might be fine with the decision but would've liked to see things being addressed and explained more proactively. Some people may have uncharitable priors towards EA or towards everyone (including themselves?), so they'd never accept multi-step arguments about why some investment is actually altruistic if it superficially looks like what a selfish rich person would also buy. And maybe some people don't understand how investments work (the fact that you can sell something again and get money back).

At the extreme, it seems unreasonable to give weight to all the ways a decision could cause backlash – some of the viewpoints I described above are clearly stupid. At the same time, factoring in that there are parts of EA that would welcome more transparency or some kind of process designed to prevent risk of corruption – that seems fine/good.

Relevant: PR is corrosive; reputation is not

The problem with considering optics is that it’s chaotic.

The world is chaotic, and everything EAs try to do has a largely unpredictable long-term effect because of complex dynamic interactions. We should try to think through the contingencies and make the best guess we can, but completely ignoring chaotic considerations just seems impossible.

It’s a better heuristic to focus on things which are actually good for the world, consistent with your values.

This sounds good in principle, but there are a ton of things that might conceivably be good-but-for-... (read more)

Thanks Patrick - glad to see you on EA forum.

Did you reach out to EA funders for VaccinateCA? From the linked article:

I called in favors and pled our case up and down the tech industry, and scraped together about $1.2 million in funding.

I have the sense that (at least today) a project with this level of prioritization, organizational competence and star power would be able to pull down 5x that amount with 1/10th the fundraising effort through the EA network. I think that was approximately still the case in early 2021.

(FWIW I've been a fan of yours sinc... (read more)

I didn't reach out to any EA funders, for somewhat quirky and contingent reasons, and I'm unfortunately going to be slightly elliptical here rather than saying everything I know:

At various points when I was raising money, I had miscalibrated understanding of how much money was committed or on the cusp of being committed. Since I was optimizing for speed-to-commitment, at most points I favored either my own network or networks I had perceived-high-quality intros to rather than attempting to light up funding sources which I perceived would not have a high pr... (read more)

Answer by lincolnq · Dec 06, 2022

[note: I don't work for CEA, but I did recently invest in a house to live in and do events in.] I wrote a piece on my blog about why. Here's what I wrote:

Real estate purchases can make sense for financial planning reasons in some cases. This money should not be considered to trade off against, e.g., donations to effective charities. Instead it should trade off against short-term rental budgets for retreats, conferences, etc. And because banks are willing to loan against real estate at very good rates, it is surprisingly cheap to invest in real estate, requ... (read more)

Useful perspective. (I'm excited about this debate because I think you're wrong, but feel free to stop responding anytime obviously! You've already helped me a ton, to clarify my thoughts on this.)

First, what I agree with: I am excited by your last paragraph - my ideal EA community also helps people reason better, and the topics you listed definitely seem like part of the 'curriculum'. I only think it needs to be introduced gently, and with low expectations (e.g. in my envisioned EA world, the ~bottom 75% of engaged EAs will probably not change their caree... (read more)

Useful input. Can you give a bit more color about your feelings? In particular, whether this is a disagreement with the core direction being proposed vs. just something I wrote down that seems off? (If the latter - I wrote this quickly, trying to give a gist, so I'm not surprised. If the former, I'm more surprised and interested in what I'm missing.)

I am not fully sure, and it's a bit late. Here are some thoughts that came to mind on thinking more about this: 

I think I do personally believe if you actually think hard about the impact, few things matter, and also that the world is confusing and lots of stuff turns out to be net-negative (like, I think if you take AI X-risk seriously a lot of stuff that seemed previously good in terms of accelerating technological progress now suddenly looks quite bad). 

And so, I don't even know whether a community that just broadly encourages people to do thi... (read more)

Thanks for writing!

To be clear, I don't think we as a community should be scope insensitive. But here's the FAQ I would write about this...

  • Q: does EA mean I should only work on the most important cause areas?
    • no! being in EA means you choose to do good with your life, and think about those choices. We hope that you'll choose to improve your life / career / donations in more-altruistic ways, and we might talk with you and discover ideas for making your altruistic life even better.
  • Q: does EA mean I should do or support [crazy thing X] to improve the wo
... (read more)
Habryka · 1y
I like this comment, but also genuinely think that this Q&A would indicate that EA had lost a lot of what I think makes it valuable, and I would likely be much less interested in being engaged.

We should retain awareness around optics, in good times and bad

I'd like to push back on this frame a bit. I almost never want to be thinking about "optics" as a category, but instead to focus on mitigating specific risks, some of which might be reputational.

See https://lesswrong.substack.com/p/pr-is-corrosive-reputation-is-not for a more in-depth explanation that I tend to agree with.

I don't mean to suggest never worrying about "optics" but I think a few of the things you cited in that category are miscategorized:

err on the side of registering charita

... (read more)
Jack Lewars · 1y
This is interesting and I agree with much of it. I think two extra things:

1. Optics are worth adding into your risk calculations, even if e.g. you think the principal risk is legal.
2. I didn't mention the most egregious examples of bad optics in the OP, but I think some exist. I would argue that flying people to the Bahamas to cowork has dreadful optics, and that might be a strong argument against doing it.

Huh, useful analogy. I do think cryptocurrency has potential, I just think the expected altruism-value of the whole thing is quite negative currently, and has been for 5+ years, and this was super not true in the early days of the internet, even during the crash years.

(I was a very well-connected teenager in 1999 and I remember some things about the early internet... I remember the browser wars, Netscape, AltaVista, then Google, eBay and PayPal, as well as the adware, email viruses, chain letters, worms, hoaxes, etc.)

Early internet was clearly awful in so ... (read more)

A sweeping condemnation of crypto based on FTX's failure seems about as prudent or rational as a sweeping condemnation of democracy

Ah, but (to be a bit of a devil's advocate here) a lot of people have been sweepingly condemning crypto since before this whole fiasco, we are just making more noise and have a higher chance to be heard now :)

At risk of derailing the thread, I would argue none of your #1-5 are panning out in any substantive way. (I know a lot about 1 and 2, and claim that the entire crypto industry is at least an order of magnitude less effe... (read more)

I understand this skepticism. You're right that crypto still has a lot to prove in terms of large-scale utility, security, reliability, and regulatory compliance.

I would just caution that in the early 2000s, after the dot-com bubble, many people expressed the same kinds of skepticism about the Internet itself, and about all online businesses. From 1995 to 2002, there were too many scammers, sociopaths, and opportunists who had ridden the initial wave of hype; security protocols were not well-developed; regulation was patchy and unclear; the use cases were ... (read more)

I agree! As a founder, I promise to never engage in fraud, either personally or with my business, even if it seems like doing so would result in large amounts of money (or other benefits) to good things in the world. I also intend to discourage other people who ask my advice from making similar trade-offs.

This should obviously go without saying, and I was already operating this way, but it is worth writing down publicly that I think fraud is of course wrong, and is not in line with how I practice the philosophy of EA.

I endorse the sentiment but I think anyone who was planning to commit fraud would say the same thing, so I don't think that promise is particularly useful. 

Answer by lincolnq · Nov 10, 2022

I think you might now be overreacting to recent negative news, but before then you were probably overreacting to positive news. I do recommend building your own culture and brand for a company.

At Wave, we touch on EA in our mission and certainly did a little bit of hiring through EA-aligned venues, but our mission is both simple to explain and largely independent of the whims of EA movement stuff. I think that's the way it should be; it was a deliberate choice for us and I think it has served us well.

Sharmake · 1y
I don't think so. Normally I'm not a fan of reacting to the news, but here's the thing: this event could very well turn into a catastrophic risk for EA in the PR realm, and the consequences for things like EA and quantitative thinking could be disastrous. It certainly makes sense to bring in PR resources now.

I think 80k has tried to emphasize personal fit in the content, but something about the presentation seems to dominate the content, and I think that is somehow related to social dynamics. Something seems to get in the way of the "personal fit" message coming through; I think it is related to having "top recommended career paths". I don't know how to ameliorate this, or I would suggest it directly.

I'm sure this is frustrating to you too, since like 90% of the guide is dedicated to making the point that personal fit is important; and people seem to gloss ove... (read more)

IanDavidMoss · 1y
I agree this is an important point, but I also think identifying top-ranked paths and problems is one of 80K's core added values, so I don't want to throw out the baby with the bathwater here.

One less extreme intervention that could help would be to keep the list of top recommendations, but not rank them. Instead, 80K could list them as "particularly promising pathways" or something like that, emphasizing in the first paragraphs of text that personal fit should be a large part of the decision when choosing a career, and that the identification of a top tier of careers is intended to help the reader judge where they might fit.

Another possibility (I don't know if you all have thought of this) would be to offer something that's almost like a wizard interface, where a user inputs or checks boxes relating to various strengths/weaknesses they have, where they're authorized to work, core beliefs or moral preferences, etc., and the program spits back a few options: "you might want to consider careers x, y, and z -- for more, sign up for a session with one of our advisors." Then promote that as the primary draw for the website, more than the career guides. Just a thought?

(I submitted this to your form, figured I could also write it here for further discussion):

I think 80k has provided a clear and easy-to-consume career guide, which has influenced the conversation. There are a set of careers/cause areas which are legibly high priority and thus approved by 80k. This has the effect of nudging people into the approved careers and discouraging them from everything else.

I suspect this has both positive and negative effects. The positive ones are the first-order effects: hopefully people actually make better career choices and th... (read more)

Arden Koehler · 1y
Thanks! Agree about there being tradeoffs here. Curious if you have more to say on this: am I right in thinking the worry is that, by raising the status of some careers, 80k creates social pressure to pursue those rather than the one you have greater personal fit for? (Do you think there's a (reasonable) amount of emphasis on personal fit we could present which would mostly ameliorate your worries on this?)

Awesome - excited about this! Please let me know if you want me or someone from Wave to give a talk or say hello. (I am not likely to make it in person, but it's not crazy that someone from my team might be.)

Anne Nganga · 1y
Hi there Lincoln, thank you for your gracious offer. We would be honored to have Wave's presence in the program. Please send me an email at annenganga554@gmail.com so we can work everything out. I look forward to speaking with you. Cheers, Anne N N

These feel like classic arguments. Such arguments hold some weight. But (in agreement with Derek Shiller) I think they are more an argument against a very strong form of longtermism where you are actively sacrificing everything else we care about in order to make the future better. Such lines of reasoning seem bad-in-general because errors in reasoning magnify tremendously. Instead, we should mix lots of different worldviews into our moral strategy, preferring actions which are robustly good across many worldviews.

I also have noticed (in the process of eat... (read more)

Jeremy · 1y
The second paragraph really hits on the nose how I feel, without having ever been able to put it into words - regarding both eating less animal products and recycling.