SocialistVibes01@lemmy.ml to Linux@lemmy.ml · English · 1 day ago
Which specs are as low as reasonably possible for local LLM models? Do you recommend some distro in particular?
wraekscadu@vargar.org · 1 day ago
Depends on how big your local LLM model is. You can easily run 2-3B param models on a potato. BUT, the model is shit.
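As a rough back-of-the-envelope check (my own sketch, not from the thread): the memory needed just for the weights is roughly parameter count times bytes per weight, so a 4-bit-quantized 3B model fits in about 1.5 GB of RAM before KV cache and runtime overhead, which is why small models run on very modest hardware.

```python
def model_ram_gb(params_billions: float, bits_per_weight: int) -> float:
    """Rough weight-memory estimate in GB for a quantized model.

    This ignores KV cache and runtime overhead, so treat it as a
    lower bound, not an exact figure.
    """
    bytes_per_weight = bits_per_weight / 8
    # params_billions * 1e9 weights, each bytes_per_weight bytes, in GB (1e9 bytes)
    return params_billions * bytes_per_weight

print(model_ram_gb(3, 4))   # 3B params at 4-bit -> 1.5 GB
print(model_ram_gb(7, 16))  # 7B params at fp16  -> 14.0 GB
```

The second line shows why unquantized 7B models already push past what a "potato" machine can hold.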