• 0 Posts
  • 7 Comments
Joined 1 month ago
cake
Cake day: April 6th, 2026

help-circle


  • That only works if everyone agrees with you, which is clearly not true. In academic math, there’s a thing called juxtaposition. It mostly exists because math people are lazy, so instead of putting parentheses around statement e.g. 5+(2*x) they’ll just write 5+2x.

    This is fine as long as you know the context of that expression. If you take it out of the context and just ask any person what is the right order of operations - it becomes ambiguous. Because some people know PEMDAS. And other people know that PEMDAS is just a simplification for middle school, when real math notation is messy, non-standard and requires a lot of local domain knowledge.



  • That’s mostly because the LLM providers put this response in the system prompt. Probably to dodge lawsuits or something, I doubt they have high morals.

    What’s interesting - you can jailbreak any current AI Model just by poisoning it’s context enough to “brainwash” it and make it “forget” the initial system prompt. Then, if you prime it to believe it’s a real person - it’ll start acting as one. And I see how gullible people can easily fall for this.

    All of this can also be done unintentionally, just by someone talking to LLM like they’d talk to a real person. But it should be long enough for original prompts to be diluted with new context.


  • If we’re not considering self-hosting:

    • GitLab - asks for phone number/cc to create an account, enterprise-level UI (in a bad way)
    • SourceHut - free (during public alpha that might end one day), ran by a controversial figure
    • BitBucket - “Code and CI/CD, powered by AI”
    • Codeberg - limited to FOSS projects, fork of Gitea, CI is compatible with GH actions

    I moved all my active and new projects to Codeberg the moment GitHub started pushing their Copilot crap everywhere. I just knew nothing good would come from it.