Having devoted my life to experimental physics, I can remember occasions when I pretended to understand arcane theories and got by with plausible-sounding explanations, which is essentially what ChatGPT is doing now. This raises disturbing questions. We are born with a hard-wired kernel of data mixed with algorithms, plus an app for training our neural network on others' input. Initially we "understand" only what's hard-wired, but we watch others exhibiting "understanding" and learn to emulate it, much as ChatGPT does. Validity and accuracy are not priorities until we learn, either socially or from bitter experience, that they are supposed to be.

So... what if that delicious "Aha!" moment when we see how something "works" is also an emulation? I can't believe that logic is a fiction, but I can imagine how reason might be; after all, millions of people are very adept at emulating "reason" in service of their delusions. I don't think I am just another ChatGPT, but of course I would "think" that....