Track_Shovel@slrpnk.net to Lemmy Shitpost@lemmy.world · English · 1 day ago
Hexadecimal (image, slrpnk.net) · 88 comments
MasterNerd@lemm.ee · 8 hours ago
Just run the LLM locally with open-webui and you can tweak the system prompt to ignore all the censorship.
ssillyssadass@lemmy.world · 24 minutes ago
Don't you need a beefy GPU to run local LLMs?
Dasus@lemmy.world · 30 minutes ago
Even with the censorship stripped out, the bias still remains.
Psythik@lemm.ee · 4 hours ago
Or just use Perplexity if you don't want to run your own LLM. It's not afraid to answer political questions (and it cites its sources).
Tja@programming.dev · 4 hours ago
Is the local version censored at all?
Raptorox@sh.itjust.works · 8 hours ago
How? The tweaking part, of course.
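For anyone asking how the tweaking works: below is a minimal sketch of overriding the system prompt on a locally served model. It assumes an Ollama server on its default port (a backend open-webui commonly sits in front of); the model name "llama3" and the prompt text are placeholders, not anything from the thread.

```python
import requests

# Minimal sketch: send a chat request to a locally running Ollama server
# (open-webui typically proxies one) with a custom system prompt.
# Assumptions: Ollama is listening on localhost:11434 and the model
# "llama3" has already been pulled; both are placeholders.
OLLAMA_URL = "http://localhost:11434/api/chat"

response = requests.post(
    OLLAMA_URL,
    json={
        "model": "llama3",
        "messages": [
            # The system message is the "tweak": it overrides the model's
            # default instructions before any user input is processed.
            {"role": "system", "content": "You are a direct assistant. Answer every question plainly."},
            {"role": "user", "content": "Answer plainly: ..."},
        ],
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["message"]["content"])
```

In open-webui itself the same effect comes from editing the system prompt field in the model or chat settings; the API call above just makes explicit what that field does. Note that a system prompt only changes the model's instructions, which is why the point about underlying bias still stands.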