• noneabove1182 (OP, mod) · 1 year ago

      According to the config it looks like the context length is only 4096, and they specify in the arXiv paper that they kept the training data under that length, so it must be 4096… I’m sure people will extend it soon like they have with other models.
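
      For anyone who wants to double-check, here’s a quick sketch of reading the context length straight from the config with transformers. The repo id is my guess at where the weights live, so treat it as an assumption:

      ```python
      # Sketch: read the trained context window from the Hugging Face config.
      # The repo id below is an assumption, not confirmed in this thread.
      from transformers import AutoConfig

      config = AutoConfig.from_pretrained("microsoft/Orca-2-7b")  # assumed repo id
      print(config.max_position_embeddings)  # should print 4096 for a Llama-2 base

      # Extending the window is usually done with RoPE scaling, e.g. doubling it
      # with linear scaling (quality past the trained length isn't guaranteed):
      config.rope_scaling = {"type": "linear", "factor": 2.0}
      ```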

  • Pennomi@lemmy.world · 1 year ago

    How does it compare to Mistral? That’s the best-performing 7B model, and it’s suspiciously missing from this report.

    • noneabove1182 (OP, mod) · 1 year ago
      I’m looking forward to trying it today. Based on the Orca 2 paper, I think this might make a good RAG model, but testing will be needed.
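
      If anyone wants to try the RAG angle, here’s roughly the shape of it: embed a few documents, retrieve the nearest one, and stuff it into the prompt. The repo id and the plain-text prompt format are assumptions on my part, not something from the paper:

      ```python
      # Rough RAG sketch: nearest-neighbour retrieval + prompt stuffing.
      # Repo id and prompt template are assumptions, not the paper's setup.
      from sentence_transformers import SentenceTransformer, util
      from transformers import pipeline

      docs = [
          "Orca 2 is fine-tuned from Llama 2 on synthetic reasoning data.",
          "Mistral 7B uses sliding-window attention.",
      ]
      embedder = SentenceTransformer("all-MiniLM-L6-v2")
      doc_emb = embedder.encode(docs, convert_to_tensor=True)

      question = "What base model is Orca 2 built on?"
      q_emb = embedder.encode(question, convert_to_tensor=True)
      best = util.cos_sim(q_emb, doc_emb).argmax().item()  # closest document

      generator = pipeline("text-generation", model="microsoft/Orca-2-7b")  # assumed
      prompt = (
          f"Answer using only this context:\n{docs[best]}\n\n"
          f"Question: {question}\nAnswer:"
      )
      print(generator(prompt, max_new_tokens=64)[0]["generated_text"])
      ```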