What is better: higher quantization or higher parameter count?
Wander@yiffit.net to LocalLLaMA · English · 1 year ago
For example, does a 13B parameter model at 2_K quantization perform worse than a 7B parameter model at 8-bit or 16-bit?
noneabove1182 · 1 year ago
These are good sources. To add one more, the GPTQ paper talks a lot about perplexity at several quantization levels and model sizes: https://arxiv.org/abs/2210.17323
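If you want to run a rough perplexity comparison on your own checkpoints rather than relying on the paper's tables, a simple sliding-window loop is usually enough to rank a quantized 13B against an fp16/8-bit 7B. The sketch below assumes Hugging Face transformers, datasets, and bitsandbytes, and uses a placeholder model id; it also skips the stride masking the more careful evaluations do, so treat the numbers as relative, not as reproductions of the GPTQ results.

```python
# Rough perplexity check on WikiText-2 for comparing model/quantization combos.
# model_id and the 8-bit flag are placeholders -- swap in the checkpoints you care about.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-hf"  # hypothetical choice
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    load_in_8bit=True,  # requires bitsandbytes; drop this for fp16
)
model.eval()

# Concatenate the test split into one long string and tokenize it once.
text = "\n\n".join(load_dataset("wikitext", "wikitext-2-raw-v1", split="test")["text"])
encodings = tokenizer(text, return_tensors="pt")

max_len, stride = 2048, 512
nlls = []
for begin in range(0, encodings.input_ids.size(1) - max_len, stride):
    input_ids = encodings.input_ids[:, begin : begin + max_len].to(model.device)
    with torch.no_grad():
        # loss is the mean negative log-likelihood over this window
        nlls.append(model(input_ids, labels=input_ids).loss)

# Perplexity = exp(mean NLL); lower is better.
print("perplexity:", torch.exp(torch.stack(nlls).mean()).item())
```

Run the same loop for each model you want to compare and look at the gap between the perplexities; that's essentially what the quantization-vs-size tables in the papers are showing.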