“Open source” in ML is a really poor description of what is actually released; “free binary with a bit of metadata” would be more accurate. The code used to create DeepSeek is not open source, nor are the training datasets, and 99% of “open source” models are this way. The only genuinely interesting part of the open sourcing is the architecture used to run the models, since it lends a lot of insight into the training process and allows for derivatives via post-training.
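To make the “derivatives via post-training” point concrete, this is roughly what it looks like in practice: you take the released weights and fine-tune them with adapters, no training code or original datasets required. A minimal sketch only; the model ID, LoRA settings, and module names below are illustrative assumptions, not DeepSeek’s own setup.

```python
# Sketch: building a derivative of an open-weights model via post-training (LoRA).
# The checkpoint name and hyperparameters are assumptions for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "deepseek-ai/deepseek-llm-7b-base"  # assumed: any open-weights checkpoint works
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Attach small trainable adapters; the released base weights stay frozen.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],  # assumed attention projection names
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# From here, a standard supervised fine-tuning loop on your own data yields a
# derivative model, without access to the original training code or data.
```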
DeepSeek actually released a bunch of their infrastructure code, including the infamous tricks for making training and inference more efficient, a couple of weeks ago.
It certainly is a lot more open source than OpenAI, that’s for sure.