dpiepgrass

I'm a senior software developer in Canada (earning ~US$70K in a good year) who, being late to the EA party, earns to give. Historically I've had a chronic lack of interest in making money; instead I've developed an unhealthy interest in foundational software that free markets don't build, because its effects would consist almost entirely of positive externalities.

I dream of making the world better by improving programming languages and developer tools, but AFAIK no funding is available for this kind of work outside academia. My open-source projects can be seen at loyc.net, core.loyc.net, ungglish.loyc.net and ecsharp.net (among others).

Comments

EA will likely get more attention soon

Yup, I saw somebody on Medium speaking favorably about a Phil Torres piece in a footnote of his article on Ukraine (I responded here). And earlier I responded to Alice Crary's piece. Right now the anti-EAs tend to be self-styled intellectual elites, but a chorus of bad-faith criticism could go mainstream at some point. (And then I hope you guys will see why I'm proposing an evidence clearinghouse, to help build a new and more efficient culture of good epistemics and better information... whether or not you think my idea would work as intended.)

Being Open and Honest

I don't know any other EAs in my area, so I haven't witnessed this phenomenon. On the one hand, I like the self-deprecating style because it is the exact opposite of the style of my arch-nemeses, the anti-science/dark-epistemology people (you know them by many names: climate dismissives, anti-vaxxers, anti-nuclear-power zealots, math deniers...).

On the other hand, there is a reason my nemeses act this way: it works well for them. Certain anti-vaxxers probably earn over $1 million annually on Substack from $5/mo. subscriptions. Clearly, a great many people are attracted to a confident "I'm always right" style of speaking and acting. Are there people who would like EA more if it were more like that? No doubt. Are there enough people who reject "weirdness" that EA would grow more if it worked harder to look cool? Plausible. Can we look cool without risking the soul of EA? Maybe.

But golly, I wouldn't want to take a position without collecting empirical data on all this!

Did Peter Thiel give "the keynote address at an EA conference"?

Yeah, but who is speaking here? Beckstead? I don't know any Becksteads. Phil Torres is claiming that The Longtermist Stance is "we should prioritise the lives of people in rich countries over those in poor countries", even though I've never heard EAs say that. At most Beckstead thinks so, and even that isn't what he said: what he said was provisional ("now seems more plausible to me") and not a call to action. Torres is trying to drag down discourse by killing nuance and saying misleading things.

Torres' article is filled with misleading statements, and I have made longer and stronger remarks about it here. (Even so, I'm upvoting you, because -6 is too harsh IMO.)

Did Peter Thiel give "the keynote address at an EA conference"?

Summary of the talk:

  • "Looking forward to having a conversation today with people about...how we can make the world a better place in the years and decades and centuries ahead. ... I thought it would be valuable where I ... go through some thoughts I have for ... why I think it's so important to be pushing the frontiers of technology in certain ways, both in a for-profit and non-profit context"
  • "My claim is there are only four possible charts you can come up with" for how technological development will proceed over time [I'm not sure what he means exactly; I certainly think the real-life outcome may well look more messy than the ones he presented, but at least I agree that the cyclic one is incorrect. Says he likes exponential growth.]
  • He suggests globalization has been overemphasized over technological development. "The question I always like to pose is, how can we go about developing the developed world."
  • "maybe we can no longer have globalization continue without technological progress at this point"
  • "[by 2030 we may be] losing a consensus even in a place like China for globalization, and we're close to a breakpoint in places like Brazil, Turkey [...] probably a very different paradigm is going to be needed in the decade ahead"
  • "with the founders fund and with some of the nonprofit things that we've done" he's been trying to reverse recent trends via increasing technological progress.
  • "we can think about shaping a future in which there's more technological progress"
  • he presents a dichotomy between "technology" meaning "going from 0 to 1", i.e. making something new, and "globalization" meaning "1 to N", i.e. spreading existing technology around the world. Says almost all nonprofits today (i.e. 2013) are focused on globalization in this sense, not technology.
  • "we have an educational system where we believe that all truth is collective and that the true answers are the answers that everybody knows to be true, whereas if you're going from 0 to 1, you kind of come up with a truth that nobody else knows at that point yet, and [there's] always this question about how do you explain this and [...] pull people in when you're trying to do something that's very, very new... I think there are... a number of features in our society that have made it unusually hostile to this idea of going from 0 to 1"
  • What should we do to go back to an optimistic, definite future? We should ask what things are valuable, that we can actually do, that others are not doing. He uses this as an interview question and finds that most people find it very hard to answer. Says nonprofits should be looking for answers to this question.
  • "good education...probably looks like something where everybody gets educated in a different way that's unique to them as a person"
  • aging/dying is a topic on which "there's more psychological denial than any other topic"; there are many bad arguments against life extension; "every myth on this planet teaches us that the meaning of life is death... [this area] strikes me as grossly underfunded"
  • AI "seems plausible in the next few decades" ... "probably the biggest 0 to 1 thing would be to get something like generalized artificial intelligence. It would change the world in ways that are more radical than we could imagine." [This is kind of an odd comment: he doesn't say that this immense change would be good thing or a dangerous thing, just that it would be radical.]
Did Peter Thiel give "the keynote address at an EA conference"?

Depends on whether you like Peter Thiel. I don't know much about him, but his support for Trump was a big turnoff for me. I'm not sure why he wrote the techno-optimistic book "Zero to One" and then decided to plonk down his cash on a pathological liar and bullshitter whose greatest concern seemed to be keeping Mexicans and Muslims out of the U.S. ... but he did get a tax cut out of it.

Anyway, Phil Torres is cheating here by in effect saying "longtermists are directly associated with a Trump supporter in 2013!" when Trump did not run for president until 2015.

Did Peter Thiel give "the keynote address at an EA conference"?

I see. Rather than Thiel being "the" keynote speaker in 2013, there were four keynote speakers, of whom Thiel was one (the others were Peter Singer, Jaan Tallinn, and Holden Karnofsky).

Small and Vulnerable

Thanks for pointing that out! Of course, my actual worry is that she won't pick up on EA principles when the only EA in her environment is me. I'd hate to have to move to an overpriced EA hub city to provide more intellectual infrastructure, but it's on the table.

Small and Vulnerable

Wow. I only lived two years with a legal guardian who was similarly intolerant of me for being "weird," not making friends, and loving computers.

Would you say you had some kind of insatiable need to go against your parents' wishes, or was it more like a limitation in which you simply didn't know how to be what they wanted?

I'm worried about the same problem in reverse. I'm having a child soon, and I worry about her being normal. I don't want normal, I want weird like me! I want a high-curiosity, high-openness, high-altruism, thoughtful, epistemically strong naturalist like myself... which may be out of reach. Maybe she will love to play with Barbie dolls, spend tremendous effort fitting into a school clique, read romance novels, swoon for the next Bieber, and fret about having too few shoes and dresses? There's no EA community in my area to lean on... I wonder if there are any EA children's books.

(Edit: Jeez, EAs sure can be mean with their votes.)

Where would we set up the next EA hubs?

I propose someplace affordable. What's wrong with, say, Toledo, OH (between Detroit and Cleveland), which according to one source has the least expensive rent of any U.S. city? Its location on Lake Erie should give it reasonably good weather.
