When a system becomes complex and our knowledge runs out, we’re tempted to claim, in the words of Gilbert Ryle, that there’s a ‘ghost in the machine.’
“How does the stoplight work?” “Well, it knows that there’s a break in the traffic, so it switches from green to red.”
Actually, it doesn’t ‘know’ anything.
Experts can answer questions about how. All the way down.
[This is one reason the LLM AI tech stack is so confounding: there are no experts who can tell you exactly what’s going to happen next. It turns out there might be a ghost after all, or at least that’s the easiest way to explain it.]