Ask HN: Does Claude use 'prior' in a Bayesian sense more than English?

Just an observation: when asked to summarize articles or extract insights, Claude uses the word 'prior' far more often than typical English writing (e.g. journalistic prose) does. And it's clearly using it in a Bayesian sense, because it keeps saying things like 'updating priors', 'the prior doesn't hold', etc.
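(For context, a minimal sketch of what 'updating a prior' means under Bayes' rule, assuming a simple binary hypothesis; the numbers are made up for illustration:)

```python
# Bayes' rule: posterior = likelihood * prior / evidence.
# "Updating a prior" means revising a belief P(H) into P(H | E)
# after observing evidence E.

def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return the posterior P(H | E) from the prior P(H)."""
    evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / evidence

# Start at 50/50, then observe evidence 3x likelier if H is true.
posterior = update(0.5, 0.9, 0.3)
print(round(posterior, 2))  # 0.75
```

The 'prior doesn't hold' phrasing the OP mentions would correspond to a posterior moving sharply away from the prior after evidence like this.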

Probably something I noticed after reading the 'goblin' and 'gremlin' article.

4 points | slake | 2 days ago | 9 comments

bjourne 2 days ago

Probably? Reinforcement learning creates bots with specific styles. For example, ChatGPT is very fond of "typically", "unpack this", and "if you want".

nivertech 2 days ago

AI talk is turning into Silicon Valley pseudo-math slang: priors, exponentials, latent space.

You get lines like "no priors" or "embracing exponentials" that sound smart but mostly signal status.

Same move as N. Taleb and "convexity": a real idea turned into a generic intellectual flex.

ex-aws-dude 2 days ago

Once again a post that is literally 3 points and 2 hours old is at the top of /ask.

Why is the HN algorithm such ass, can we talk about that?

  • pbkompasz 2 days ago

    Well, it did have 'Claude' in both the title and the description...

yankohr 2 days ago

[flagged]

  • jdlshore 2 days ago

    What does this have to do with the question OP is asking?

    • redrove 2 days ago

      Nothing, but he got to plug the vibe-coded startup that he advertises in his about section.