• conciselyverbose · 4 months ago

    TLDR: he thinks the techniques are fine and you can just brute force them for the foreseeable future.

      • Voroxpete · 4 months ago

        Because he’s a salesman, and he’s selling you bullshit.

        What the experts are now saying is that it looks like the LLM approach to AI will require exponentially larger amounts of training data (and data processing) to achieve only linear growth in capability. Next-generation AI models will cost ten times as much to train, and the generation after that will cost ten times as much again (a rough sketch of that scaling math follows this comment).

        The whole thing is a giant con. Kevin is just trying to keep investor confidence floating for a little longer.
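        To make the shape of that claim concrete, here is a minimal sketch assuming a power-law scaling curve; the exponent and the cost figures are illustrative placeholders, not numbers from the article or from any real model.

        ```python
        # Toy illustration of "exponentially more compute for linear gains."
        # Assumes a power-law scaling curve L(C) = a * C**(-alpha); the
        # exponent and cost multipliers are made-up placeholders, not data.

        ALPHA = 0.05  # assumed scaling exponent (real LLM exponents are small)

        def reducible_loss(compute: float, a: float = 1.0) -> float:
            """Reducible loss under the assumed power law a * C**(-ALPHA)."""
            return a * compute ** (-ALPHA)

        for gen in range(4):
            compute = 10.0 ** gen  # each generation costs 10x more to train
            print(f"gen {gen}: compute {compute:>6.0f}x, "
                  f"loss {reducible_loss(compute):.3f}")

        # Each 10x jump in spend only shaves roughly 11% off the remaining
        # loss, so roughly linear quality gains demand exponentially
        # growing training budgets.
        ```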

      • conciselyverbose · 4 months ago

        lol I honestly needed to open the article to parse the title. That’s why I posted.

        But I’m definitely of the belief that you need a hell of a lot more architecture than they have to go meaningfully further. Humans are a hell of a lot more complicated than a big pile of neurons.