• RennederOPM
    • Chinese LLMs are reducing the size and number of parameters to make it easier for startups and small organizations to implement them.
    • The Beijing Academy of Artificial Intelligence (BAAI) has unveiled Wu Dao 3.0, an open source LLM series.
    • Wu Dao 3.0 includes smaller models such as Wu Dao AquilaChat and Wu Dao AquilaCode.
    • The Wu Dao Vision series focuses on computer vision and includes Emu, EVA, general purpose segmentation, and more.
    • BAAI has updated the FlagOpen open source system for large model development.
    • Focusing on small open source models could be a strategic choice for BAAI, given high compute costs and sanctions.
    • Smaller models have lower inference costs and are easier to commercialize, especially for niche applications.
    • The Chinese government encourages the use of open source models, data and computing resources to stimulate the development of artificial intelligence.
  • Kerfuffle

    1.7 trillion parameters is huge, so it doesn’t take a lot to be smaller than that. 33b is really small, though. Just from my own playing around with this stuff, models seem to get decent around the 65-70b parameter mark.