Llama-2, Mo' Lora (proof of concept MOE of LoRAs) — crumbly.medium.com
Posted by noneabove1182 to LocalLLaMA · English · 1 year ago
https://twitter.com/aicrumb/status/1681846805959528448?t=sG6Xn4p0hodDoB-g7gmuJQ
https://colab.research.google.com/#fileId=https%3A//huggingface.co/datasets/crumb/Wizard-EvolInstruct70k-k4/blob/main/MoLora_7b_(PROOF_OF_CONCEPT).ipynb

Very interesting concept, excited to see where it goes.
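For anyone wondering what a "MoE of LoRAs" means in practice, here's a rough sketch of the idea as I understand it (illustrative only, not the code from the notebook): a small gating network produces weights over several LoRA adapters, and the frozen base layer's output gets adjusted by the weighted sum of the low-rank updates. Names like MoLoraLinear, num_experts, and rank are my own placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoLoraLinear(nn.Module):
    """Sketch of a linear layer with a mixture of LoRA 'experts'."""

    def __init__(self, base: nn.Linear, num_experts: int = 4, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base                      # frozen pretrained projection
        self.base.requires_grad_(False)
        in_f, out_f = base.in_features, base.out_features
        self.scale = alpha / rank
        # One pair of low-rank matrices (A, B) per expert
        self.A = nn.Parameter(torch.randn(num_experts, in_f, rank) * 0.01)
        self.B = nn.Parameter(torch.zeros(num_experts, rank, out_f))
        # Gate maps each token to a distribution over experts
        self.gate = nn.Linear(in_f, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, in_features)
        weights = F.softmax(self.gate(x), dim=-1)            # (b, s, E)
        # Low-rank update from each expert: x @ A_e @ B_e
        updates = torch.einsum("bsi,eir,ero->bseo", x, self.A, self.B)
        mixed = torch.einsum("bse,bseo->bso", weights, updates)
        return self.base(x) + self.scale * mixed
```

The appeal is that the heavy base weights stay frozen and shared, while the "experts" are just cheap LoRA deltas picked per token by the gate.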
noneabove1182 (OP) · 1 year ago:
Probably just because of the MOE (mixture of experts) lol