Hi, I’m currently starting to learn how LLMs work in depth, so I started using nanoGPT to understand how to train a model, and I’d like to play around with the code a little more. I’ve set myself the goal of training a model that can write basic French; it doesn’t have to be coherent or deep in its writing, just French with correct grammar. I only have a laptop without a proper GPU, so I can’t really train a model with billions of parameters. Do you think it’s possible without too large a dataset or too much intensive training? Or would it be a better idea to use something other than nanoGPT?

TL;DR: I’d like to train my own LLM on my laptop, which doesn’t have a GPU. It’s only for learning purposes, so my goal is just for it to write basic French. Is it doable? If so, do you have any tips to make it easier?
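
To make the question concrete, here’s the kind of tiny CPU-only setup I have in mind, adapted from nanoGPT’s `config/train_shakespeare_char.py` example. The `french_char` dataset folder and its `prepare.py` are hypothetical; I’d have to build them myself from a French text dump:

```python
# config/train_french_char.py  (hypothetical name, modeled on nanoGPT's
# config/train_shakespeare_char.py)

out_dir = 'out-french-char'
dataset = 'french_char'   # expects data/french_char/train.bin and val.bin

# tiny model so it fits in laptop RAM and trains on CPU in reasonable time
n_layer = 4
n_head = 4
n_embd = 128
block_size = 64           # short context window
batch_size = 12
dropout = 0.0

max_iters = 2000
lr_decay_iters = 2000
learning_rate = 1e-3
eval_iters = 20
log_interval = 1

device = 'cpu'            # no GPU available
compile = False           # torch.compile isn't worth it on CPU here
```

which I’d launch with `python train.py config/train_french_char.py`.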

  • h3ndrik@feddit.de · 11 months ago

    If you want to learn machine learning, you could start by playing around with the classic examples that recognize single handwritten digits from the MNIST dataset, or something in that league.
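
    A classifier on that dataset is only a few lines of PyTorch and trains fine on a CPU. A minimal sketch, assuming torch and torchvision are installed:

    ```python
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    # Download MNIST and wrap it in a DataLoader
    train_set = datasets.MNIST(root="data", train=True, download=True,
                               transform=transforms.ToTensor())
    loader = DataLoader(train_set, batch_size=64, shuffle=True)

    # Tiny fully connected classifier: 28x28 pixels -> 10 digit classes
    model = nn.Sequential(nn.Flatten(),
                          nn.Linear(28 * 28, 128), nn.ReLU(),
                          nn.Linear(128, 10))
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # One pass over the data is already enough to see the loss drop
    for images, labels in loader:
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()
    ```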

    I think training an LLM that is somewhat useful will be way out of scope for the RAM and computing capabilities such a laptop has to offer. Maybe you can get correct grammar if you don’t mind waiting a long, long time, i.e. something with the level of intelligence of autocomplete. But definitely not something coherent or intelligent that answers your questions.

    You could rent a VM in the cloud. Services like runpod.io or vast.ai offer a proper GPU for around $2 an hour. There are also Amazon, Google, Azure, Lambda…

    • Mixtral · 11 months ago

      Do the cloud services see everything, i.e. the text/image data used for training and the finished trained model? If so, runpod.io etc. are no solution.