Allah@lemmy.world to Ask Lemmy@lemmy.world · 20 hours ago
What can an individual do to fight against tech billionaires?
Try hosting DeepSeek R1 locally; for me the results are similar to ChatGPT, without needing to send any info over the internet.
LM Studio is a good start.
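In case it helps, here's a rough sketch of talking to a model you've loaded in LM Studio from Python. It assumes LM Studio's local OpenAI-compatible server is running on its default port (localhost:1234); the model name here is just a placeholder, use whatever identifier LM Studio shows for the model you loaded.

```python
from openai import OpenAI

# Point the standard OpenAI client at LM Studio's local server.
# No real API key is needed for a local instance.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="deepseek-r1-distill-qwen-7b",  # placeholder; match the model loaded in LM Studio
    messages=[
        {"role": "user", "content": "Summarize why running models locally protects privacy."}
    ],
)
print(response.choices[0].message.content)
```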
Don’t you need a fast GPU to do so?
You would benefit from some GPU offloading, which considerably speeds up the answers. But at a bare minimum, you only need enough RAM to load the model.
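If you want to see what the offloading knob looks like outside of LM Studio's UI, here's a small sketch using llama-cpp-python (a different tool than LM Studio, but the same idea: some layers go to the GPU, the rest stay in system RAM). The GGUF path and layer count are just placeholders.

```python
from llama_cpp import Llama

# n_gpu_layers controls how many transformer layers are offloaded to the GPU:
# 0 = CPU only (RAM is the only requirement), -1 = offload as many as fit.
llm = Llama(
    model_path="./DeepSeek-R1-Distill-Qwen-7B-Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=20,   # offload 20 layers; the rest run from system RAM
    n_ctx=4096,
)

out = llm("Q: Why does GPU offloading speed up inference? A:", max_tokens=128)
print(out["choices"][0]["text"])
```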
Thanks, I’m trying it now!