I keep seeing this local AI app “Locally” on socials, and it’s pretty cool to see people chatting about + using local LLMs!

Before, it was a smaller group; now it feels like more people are doing it! The only downside is battery drain. It’d be cool if it added Ollama support, or let you run a server of it on a Mac.

*****
Written on