I asked the new AI engine ChatGPT: “Hey, where did you get your priors?” And she (it?) replied:
“I was trained by OpenAI on a dataset of billions of words taken from the Internet. As a language model, my training involves predicting the next word in a sequence of words, given the words that come before it. I do not have personal experiences or biases, and I do not have access to external information beyond what I was trained on. I exist to assist users by generating human-like text based on the input that I receive. Is there anything else I can help with?”
What? The ChatGPT bot says it has no priors? It is either misinformed or a liar!

Is it possible to be a Bayesian thinker with no priors?
The computational capacity of modern AI is so sophisticated that it could mimic intelligible sentience.
No, that is not possible; that is why GPT’s response to my query is neither credible nor sincere …
You have a point; any stored information or previous programming could be considered priors.
Agreed. At the very least, ChatGPT certainly “updates” its priors as it receives new information!
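For readers who want that “updating” remark made concrete, here is a minimal sketch of Bayesian updating in Python. It is a toy coin-bias example of my own; the prior, the likelihoods, and the `update` helper are all hypothetical illustrations of Bayes’ rule, and nothing here describes how ChatGPT actually works internally.

```python
# Toy illustration of Bayesian updating (not a model of ChatGPT's internals).
# Hypothesis H: "the coin is biased, landing heads 75% of the time."
# Alternative: the coin is fair (heads 50% of the time).

def update(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Return the posterior P(H | evidence) from a prior P(H) via Bayes' rule."""
    numerator = likelihood_if_true * prior
    denominator = numerator + likelihood_if_false * (1.0 - prior)
    return numerator / denominator

prior = 0.5  # start agnostic about whether the coin is biased

for flip in ["H", "H", "T", "H"]:
    p_if_biased = 0.75 if flip == "H" else 0.25
    p_if_fair = 0.5
    prior = update(prior, p_if_biased, p_if_fair)
    print(f"After observing {flip}: P(biased) = {prior:.3f}")
```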
If you haven’t read this one, I figured you would get a kick out of it:
https://betonit.substack.com/p/chatgpt-takes-my-midterm-and-gets?utm_source=post-email-title&publication_id=820634&post_id=94520267&isFreemail=true&utm_medium=email
I am a huge BC fan and finally got to meet him in person in Los Angeles last month; I will check out his post!
Nice. I am assuming you probably had a long conversation with him; considering his diverse array of interests, it must have been engaging, to say the least.
Yes, several others showed up to the meet-up, but Bryan made time to sit next to and talk with each of us!
He sounds down-to-Earth, considering he is a New York Times best-selling author. Being humble is an admirable trait.
He is super-down-to-Earth and an excellent conversationalist!
Reblogged this on prior probability and commented:
Today, I asked ChatGPT to solve the “Liar Paradox”, and it responded as follows:
“The Liar Paradox is a statement that says ‘This statement is false.’ If the statement is true, then it is false, and if it is false, then it must be true. This creates a contradiction, and it is impossible to determine if the statement is true or false.
“There are a few different ways that philosophers and logicians have tried to resolve the Liar Paradox. One approach is to reject the assumption that the statement is either true or false, and instead consider it to be meaningless. Another approach is to accept the contradiction as a valid result, and to consider the truth value of the statement to be indeterminate.
“Ultimately, the Liar Paradox is a challenge to our usual assumptions about truth and falsehood, and it highlights the limitations of language and logic. There is no one ‘correct’ way to resolve the paradox, and different people may have different perspectives on how to approach it.”