@ModerateImprovement to [email protected] • English • 2 months ago
Apple AI Released a 7B Open-Source Language Model Trained on 2.5T Tokens on Open Datasets
www.marktechpost.com
minus-square@[email protected]linkfedilinkEnglish31•edit-22 months agoThey managed a substantial incremental improvement over previous models by first creating a better set of data as their starting point. https://huggingface.co/apple/DCLM-7B
They managed a substantial incremental improvement over previous models by first creating a better set of data as their starting point.
https://huggingface.co/apple/DCLM-7B