wuphysics87@lemmy.ml to Privacy@lemmy.ml · 2 months ago
Can you trust locally run LLMs?
I’ve been playing around with ollama. Given that you download the model, can you trust it isn’t sending telemetry?
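One rough way to sanity-check this (a sketch, not a guarantee: it assumes psutil is installed and that the server process actually shows up under the name "ollama") is to look at what network connections the running process holds while you use it. Blocking it at the firewall or running the machine offline is the stronger test.

```python
# Rough sketch: print any remote connections held by a running ollama process.
# Assumes psutil is installed (pip install psutil) and that the server process
# shows up under the name "ollama" -- adjust for your setup.
import psutil

for proc in psutil.process_iter(["pid", "name"]):
    name = (proc.info["name"] or "").lower()
    if "ollama" not in name:
        continue
    try:
        # psutil >= 6 calls this net_connections(); older versions use connections()
        get_conns = getattr(proc, "net_connections", proc.connections)
        for conn in get_conns(kind="inet"):
            # Local API traffic (e.g. 127.0.0.1:11434) is expected;
            # anything with a remote address is worth a closer look.
            if conn.raddr:
                print(proc.info["pid"], conn.laddr, "->", conn.raddr, conn.status)
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        pass
```

This only catches connections that happen to be open at the moment you sample, so a persistent firewall rule or a packet capture is more thorough.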
Emberleaf@lemmy.ml · 2 months ago
I’m not sure what you mean by ‘hard to load’. You find the model you want, you download it, then load it up to chat. What’s the issue you’re having?