SocialistVibes01@lemmy.ml to Linux@lemmy.ml · English · 1 day ago
Which specs are as low as reasonably possible for local LLM models? Do you recommend some distro in particular?
Eager Eagle@lemmy.world · 1 day ago
Heavily depends on the model and quantization level.
Choose the model you want on this website and it'll give you the specs you'd likely need to run it:
https://runthisllm.com/
Any/most distros will do, especially if you run it in Docker.
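For the Docker route, a minimal sketch using Ollama's official image (`ollama/ollama` on Docker Hub) — the model name here is just an example, and `--gpus=all` assumes an NVIDIA card with the container toolkit installed; Intel/AMD GPUs need different device flags:

```shell
# Pull and start the Ollama server container (listens on port 11434)
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Pull and chat with a model inside the running container
docker exec -it ollama ollama run llama3.2
```

The named volume (`-v ollama:/root/.ollama`) keeps downloaded model weights across container restarts, so you don't re-download tens of gigabytes each time.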
If you're going with Intel cards (best $ per GB of VRAM right now), you could get a decent machine for under $3k.
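As a rough rule of thumb (this is a back-of-the-envelope sketch, not what the site above computes): the weights alone take roughly parameter count × bits-per-weight ÷ 8 bytes, plus some headroom for the KV cache and activations. The 20% overhead factor here is an assumption and grows with context length:

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: float,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate: weight memory plus ~20% for KV cache/activations.

    NOTE: the overhead factor is an illustrative assumption; real usage
    depends on context length, batch size, and the runtime you use.
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# An 8B model at 4-bit quantization vs. full FP16:
print(round(estimate_vram_gb(8, 4), 1))   # ~4.8 GB
print(round(estimate_vram_gb(8, 16), 1))  # ~19.2 GB
```

This is why quantization matters so much: the same 8B model that won't fit on a 16 GB card at FP16 runs comfortably at 4-bit.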