I was wondering if someone here has a better idea of how EA developed in its early days than I do.

Judging by the link I posted, it seems like Yudkowsky used the term “effective altruist” years before Will MacAskill or Peter Singer adopted it. The link doesn’t mention this explicitly, but Will MacAskill was also a lesswrong user, so it seems at least plausible that Yudkowsky is the true father of the movement.

I want to sort this out because I’ve noticed that recently a lot of EAs have been downplaying the AI and longtermist elements within the movement and talking more about Peter Singer as the movement’s founder. By contrast, the impression I get of EA’s founding, based on what I know, is that EA started with Yudkowsky and then MacAskill, with Peter Singer only getting involved later. Is my impression mistaken?

  • elmtonic@lemmy.world · 13 points · 1 year ago

    Eh, the impression that I get here is that Eliezer happened to put “effective” and “altruist” together without intending to use them as a new term. This is Yud we’re talking about - he’s written roughly 500,000 more words about Harry Potter than the average person does in their lifetime.

    Even if he had invented the term, I wouldn’t say this is a smoking gun of how intertwined EAs are with the LW rats - there’s much better evidence out there.

    • jonhendry@awful.systems · 7 points · 1 year ago

      He has quite possibly written more words about Harry Potter than She Who Shall Not Be Named herself.

    • GorillasAreForEating@awful.systemsOP · 7 points · 1 year ago

      Thank you, that link is exactly what I was looking for (and it also sated my curiosity about how Yudkowsky got involved with Bostrom and Hanson — I had heard they met on the extropian listserv but had never seen any proof).

  • titotal@awful.systems · 12 points · 1 year ago (edited)

    EA as a movement was a combination of a few different groups (this account says Giving What We Can/80,000 Hours, GiveWell, and Yudkowsky’s MIRI). However, the main source of the early influx of people was the rationalist movement, as Yud had heavily promoted EA-style ideas in the sequences.

    So if you look at surveys, right now a relatively small percentage (like 15%) of EAs first heard about it through LessWrong or SSC. But back in 2014 and earlier, LessWrong was the number one on-ramp into the movement (like 30%). (I’m sure a bunch of the other respondents may have heard about it from rationalist friends as well.) I think it would have been even more if you go back earlier.

    Nowadays, most of the recruiting is independent of the rationalists, so you have a bunch of people coming in and being like, what’s with all the weird shit? However, they still adopt a ton of rationalist ideas and language, and the EA forum is run by the same people as LessWrong. It leads to some tension: someone wrote a post saying that “Yudkowsky is frequently, confidently, egregiously wrong”, and it was somewhat upvoted on the EA forum but massively downvoted on LessWrong.

  • David Gerard@awful.systemsM · 11 points · 1 year ago

    downplaying the AI and longtermist elements within the movement

    and this has always been an issue - they’re a fucking embarrassment, but they also do a lot of the organisational work, so it’s hard to get rid of them

  • David Gerard@awful.systemsM · 11 points · 1 year ago

    yeah, he totally did. EAs claiming otherwise are just incorrect.

    remember that the original Roko’s Basilisk post talked about the dilemma of being an “altruist”, i.e. how to donate as much money as possible to MIRI (or SIAI as it was in 2010).

    the terms were in extremely heavy use back then.

    a bunch of the various Singer-inspired groups tried to work out a name for the whole thing, and they picked Yudkowsky’s coinage.

  • Evinceo@awful.systems · 8 points · 1 year ago

    This is gonna be really helpful next time someone tells me straight up that EA and Rationalism are totally different things that just overlap by coincidence.

  • Shitgenstein1@awful.systems · 6 points · 1 year ago

    Speaking of Big Yud, have there been any new developments with him since the fallout of his Time Magazine article in March? It’s not like there’s been any international coalition to first-strike rogue data centers or whatever. Did everyone just kind of ignore him?