Hi, I’m currently starting to learn how LLMs work in depth, so I started using nanoGPT to understand how to train a model, and I’d like to play around with the code a little more. I’ve set myself a goal of training a model that can write basic French; it doesn’t have to be coherent or deep in its writing, just French with correct grammar. I only have a laptop without a proper GPU, so I can’t really train a model with billions of parameters. Do you think it’s possible without too large a dataset or intensive training? Would it be better to use something other than nanoGPT?

TLDR: I’d like to train my own LLM on my laptop, which doesn’t have a GPU. It’s only for learning purposes, so my goal is just for it to write basic French. Is it doable? If it is, do you have any tips to make this easier?

  • blackstampede · 4 points · edited · 10 months ago

    TL;DR yeah, it’s doable, just slow.

    You can train without a GPU, it just takes longer. More RAM and a better CPU will help up to a point. I don’t think text generation is a particularly difficult task; you could probably do it with something like a Markov chain rather than an LLM if you don’t care whether it’s particularly coherent.
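
    For a rough sense of how little that takes, here’s a minimal word-level Markov chain sketch in plain Python (no libraries, no GPU). The corpus path "corpus_fr.txt" is just a placeholder for whatever French text you feed it:

    import random
    from collections import defaultdict

    def build_chain(text, order=2):
        # Map each run of `order` consecutive words to the words seen right after it.
        words = text.split()
        chain = defaultdict(list)
        for i in range(len(words) - order):
            chain[tuple(words[i:i + order])].append(words[i + order])
        return chain

    def generate(chain, length=50):
        # Start from a random state and walk the chain; picking from the follower
        # list at random means frequent continuations are more likely.
        state = random.choice(list(chain.keys()))
        out = list(state)
        for _ in range(length):
            followers = chain.get(state)
            if not followers:
                break
            out.append(random.choice(followers))
            state = tuple(out[-len(state):])
        return " ".join(out)

    # "corpus_fr.txt" is a placeholder: any plain-text French corpus works.
    with open("corpus_fr.txt", encoding="utf-8") as f:
        chain = build_chain(f.read())
    print(generate(chain))

    The output won’t be coherent, but with a decent corpus it tends to be locally grammatical, which is roughly what you’re asking for.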

    • MatburnxOP · 2 points · 10 months ago

      Well, I use my laptop as a daily driver, so training an AI in the background, even when I’m not using it, seems a bit complicated. The Markov chain seems like an interesting alternative for what I’m looking for. Do any tools exist for building one, or should I build it from scratch?