• @Hanabie
    7 months ago

    What’s the context window size?

    • @noneabove1182OPM
      7 months ago

      According to the config it looks like it’s only 4096, and the arXiv paper specifies that they kept the training data under that length, so it must be 4096. I’m sure people will extend it soon like they have with other models.
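
      If anyone wants to check this themselves, here’s a minimal sketch of reading the context window from the model config with Hugging Face `transformers` (the repo id below is a placeholder, not the actual model):

      ```python
      # Minimal check of the advertised context window, assuming the model ships
      # a standard Hugging Face config. "example-org/example-7b" is a placeholder
      # repo id, not the actual model.
      from transformers import AutoConfig

      config = AutoConfig.from_pretrained("example-org/example-7b")

      # Decoder-only configs usually expose the trained context length here.
      print(config.max_position_embeddings)  # should print 4096 per the comment above
      ```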

  • Pennomi
    7 months ago

    How does it compare to Mistral? That’s the best-performing 7B model, and it’s suspiciously missing from this report.

    • @noneabove1182OPM
      7 months ago

      I’m looking forward to trying it today. Based on the Orca 2 paper, I think this might make a good RAG model, but testing will be needed.
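
      For anyone curious what that kind of RAG test could look like, here’s a rough sketch (the model id, retrieved passage, and question are all placeholders, not an actual setup):

      ```python
      # Rough RAG-style test sketch: put a retrieved passage into the prompt and
      # see whether the model answers from the provided context only.
      from transformers import pipeline

      # Placeholder repo id -- swap in the real model once it's on the Hub.
      generator = pipeline("text-generation", model="example-org/example-7b")

      retrieved_passage = "..."  # would normally come from a retriever / vector store
      question = "..."

      prompt = (
          "Use only the context below to answer the question.\n\n"
          f"Context:\n{retrieved_passage}\n\n"
          f"Question: {question}\nAnswer:"
      )

      print(generator(prompt, max_new_tokens=200)[0]["generated_text"])
      ```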