ugjka@lemmy.world to linuxmemes@lemmy.world · English · 9 hours ago
Damn electron and the likes (i.imgur.com, image) · 42 comments
Todd Bonzalez@lemm.ee · 4 hours ago
Yeah, but if you’re interested in running an LLM faster than 1 token per minute, RAM won’t matter. You’ll need as much VRAM as you can get.
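For anyone wondering how much VRAM "as much as you can get" actually means: a rough back-of-envelope sketch is just parameter count times bytes per parameter (this ignores the KV cache and activations, so treat it as a lower bound, not a real sizing tool):

```python
def vram_gib(params_billion: float, bytes_per_param: float) -> float:
    """Approximate GiB of VRAM just to hold the model weights.

    Ignores KV cache, activations, and framework overhead,
    so the real requirement is higher.
    """
    return params_billion * 1e9 * bytes_per_param / 2**30

# A 7B-parameter model in fp16 (2 bytes per parameter):
print(round(vram_gib(7, 2), 1))  # ~13.0 GiB for weights alone
```

Quantizing to 4-bit (roughly 0.5 bytes per parameter) cuts that to about a quarter, which is why quantized models are the usual way to fit on consumer GPUs.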