Hi,
Have you heard of cognitive dissonance? You've certainly experienced it.
Cognitive dissonance is the psychological phenomenon where your brain hallucinates absurd excuses rather than allowing you to face the hard truth: that the reality you are experiencing does not match your beliefs.
You see it when people come up with far-fetched reasons to excuse a loved one's poor behaviour, or when "experts" start attacking those who have noticed a flaw in their argument.
And of course, whenever you're using ChatGPT.
In fact, ChatGPT is so unwilling to recognise it may be wrong that it now carries the disclaimer:
"ChatGPT can make mistakes. Consider checking important information."
Here's the thing...
People who are suffering from cognitive dissonance are typically immune to reasoning. And the same applies to ChatGPT.
But ChatGPT is not the only tool we have at our disposal. In fact, tools like knowledge graphs can help overcome this very problem.
Knowledge graphs have been used by Google for over a decade to efficiently represent knowledge.
And these can be combined with generative AI chatbots to ground the chatbot's answers in stored facts, reducing the hallucination problem experienced by chatbot users.
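To make the idea concrete, here's a toy sketch in Python (my own illustration, not Google's or TeamSolve's actual implementation): facts are stored as subject-relation-object triples, and the chatbot answers only from those stored facts, admitting ignorance instead of guessing.

    # Toy knowledge graph: facts stored as (subject, relation) -> object triples.
    knowledge_graph = {
        ("Eiffel Tower", "located_in"): "Paris",
        ("Eiffel Tower", "completed_in"): "1889",
    }

    def grounded_answer(subject, relation):
        # Look the fact up in the graph; admit ignorance rather than guess.
        fact = knowledge_graph.get((subject, relation))
        return fact if fact is not None else "I don't know."

    print(grounded_answer("Eiffel Tower", "completed_in"))  # -> 1889
    print(grounded_answer("Eiffel Tower", "height"))        # -> I don't know.

In a real system, the language model translates your question into that kind of lookup, so its answers are anchored to the graph rather than to whatever sounds plausible.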
I recently had the opportunity to speak to Dr Mudasser Iqbal, founder and CEO of TeamSolve, whose chatbot, Lily, does exactly that. You can listen to the interview HERE.
Talk again soon,
Dr Genevieve Hayes.