You may be able to find me on other platforms by the same name!

Mastodon: [email protected]

Contact me on SimpleX or Signal!

  • 0 Posts
  • 38 Comments
Joined 1 year ago
Cake day: March 5th, 2025



  • If Gemini truly can’t see PII (no way to add “notes,” for example), then I don’t think that would be too big a concern for most people, at least for those who don’t have a disdain for LLMs in the first place. That said, I suspect people with “high threat models” (it would be good to be precise about what a “high threat model” means in this instance) would prefer a local app that interfaces with a local Ollama API, rather than an internet-connected service.

    What precisely is Gemini “calculating” here, and why can’t its function be replaced by a lightweight local LLM?

    Edit: After reading the information from the website, it sounds like there are a lot of opportunities for users to accidentally identify themselves to AI providers or open up de-anonymization attack vectors. If I were very concerned about my identity being linked to my recovery behavior, I would probably not use this service as it is now.
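
    For what it’s worth, the local-Ollama alternative I mentioned doesn’t need much code. Here is a minimal sketch assuming Ollama’s default local endpoint (`http://localhost:11434/api/generate`) and a placeholder model name (`llama3` — swap in whatever model you have pulled); nothing in it touches the network until you actually send the request, so no PII ever leaves the machine:

    ```python
    import json
    import urllib.request

    # Ollama's default local generate endpoint (assumption: default port 11434)
    OLLAMA_URL = "http://localhost:11434/api/generate"


    def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
        """Build a POST request for Ollama's /api/generate endpoint.

        Everything stays on the local machine; no third-party AI provider
        ever sees the prompt.
        """
        payload = json.dumps(
            {"model": model, "prompt": prompt, "stream": False}
        ).encode()
        return urllib.request.Request(
            OLLAMA_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
            method="POST",
        )


    def ask_local(prompt: str, model: str = "llama3") -> str:
        """Send the prompt to the local Ollama server and return its reply."""
        req = build_request(prompt, model)
        with urllib.request.urlopen(req) as resp:  # requires `ollama serve` running
            return json.loads(resp.read())["response"]
    ```

    A wrapper like this is all an app would need to replace a cloud “calculation” step, provided the model itself is light enough to run locally.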