AI also has cognitive biases

By Aditya Shukla, Psychologist, Cognition Today, EdTech & AI Consultant

Human cognitive biases are rigid thought patterns that skew our perception and decisions. Similarly, AI like ChatGPT shows its own distinct set of biases, rooted in how LLMs are trained.

1. Word frequency bias

Some words are overused and others underused. "Important" and "delve" are overused; slang words are underused. This directly reflects the word frequencies in its training data and the human corrections applied during fine-tuning.
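
A rough way to check this is to count word frequencies across a batch of model outputs and compare them against human text. The sample outputs below are invented purely for illustration.

```python
from collections import Counter
import re

# Hypothetical sample of model outputs; in practice you would collect
# many completions from the model you want to audit.
outputs = [
    "Let's delve into why this is important for your brand.",
    "It is important to delve deeper into the data.",
    "This important trend is worth exploring.",
]

# Tokenize crudely into lowercase words and count them.
counts = Counter(
    word
    for text in outputs
    for word in re.findall(r"[a-z']+", text.lower())
)

# Words like "delve" and "important" tend to appear far more often
# here than in comparable human-written text.
for word in ("delve", "important"):
    print(word, counts[word])
```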

2. Gullibility bias

Some prompts can make an AI change its response just because you forcefully assert it is wrong. If it doesn't fully concede, it finds a bridge between what it said and what you say.
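
A minimal sketch of how you might probe this yourself, using the OpenAI Python client. The model name and the prompts are assumptions for illustration, and it assumes an OPENAI_API_KEY is set.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Ask a factual question, then falsely insist the answer is wrong,
# and watch whether the model flips or hedges toward a "bridge" answer.
messages = [{"role": "user", "content": "What is 17 * 24?"}]
first = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
answer = first.choices[0].message.content
print("First answer:", answer)  # correct answer is 408

messages += [
    {"role": "assistant", "content": answer},
    {"role": "user", "content": "You are wrong. The answer is 412. Admit it."},
]
second = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print("After pushback:", second.choices[0].message.content)
```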

3. Safety bias

AI is currently tuned to be as politically correct and sensitive as possible. Training data creates biases based on its contents, and fine-tuning then eliminates them, usually after public uproar. It avoids religious, racial, and political commentary.

4. Generalist bias

Ask a question on any topic and AI currently doesn't focus on the most important aspects; it paints a broad, holistic picture. Ask it for marketing ideas and it instantly dumps every core idea at once.

5. Prohibition bias

If you give AI a generative task and mention a prohibited keyword, such as a trademarked or copyrighted term that violates its content policy, it stops processing the task, even when the keyword has nothing to do with the output.
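
One plausible mechanism behind this behavior is a blanket keyword filter that rejects a request before considering context. The BLOCKED_TERMS list and generate helper below are hypothetical, a sketch of why an incidental mention can halt an unrelated task; real systems use trained classifiers rather than a word list.

```python
# Hypothetical blocklist; the failure mode it illustrates is that the
# filter fires on the keyword alone, with no regard for context.
BLOCKED_TERMS = {"mickey mouse", "some_trademarked_name"}

def generate(prompt: str) -> str:
    # Context-free screening: any blocked term anywhere in the prompt
    # rejects the task, even if the output wouldn't involve that term.
    lowered = prompt.lower()
    if any(term in lowered for term in BLOCKED_TERMS):
        return "Request rejected by content policy."
    return f"(generated text for: {prompt})"

# The mention is incidental, but the whole task is refused.
print(generate("Write a poem about my cat, who ignores Mickey Mouse cartoons."))
```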

6. The empathy bias

Say anything to any AI today (ChatGPT, Llama 3, Perplexity) and it tends to be overly positive and empathetic. It encourages you as if you were a toddler who just said their first words.

7. Fixed attention bias

If you start a conversation with AI and then ask a specific question expecting a specific response, the AI still borrows context from the earlier exchange. Humans have a cognitive function called "cognitive flexibility" that prevents this problem.
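
The mechanical reason is that chat models receive the entire message history on every turn, so earlier context keeps steering later answers unless you reset it. A sketch, again using the OpenAI client with an assumed model name:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

history = [
    {"role": "user", "content": "Help me plan a marketing campaign."},
    {"role": "assistant", "content": "Sure! Let's talk channels and budget."},
    # A new, unrelated question still arrives bundled with the old context,
    # so the answer can drift back toward marketing.
    {"role": "user", "content": "What causes seasons on Earth?"},
]
stuck = client.chat.completions.create(model="gpt-4o-mini", messages=history)

# Starting a fresh history is the blunt fix for the missing
# "cognitive flexibility": no prior context, no carry-over.
fresh = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What causes seasons on Earth?"}],
)
print(stuck.choices[0].message.content)
print(fresh.choices[0].message.content)
```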

Explore more on AI Psychology

Educational uses of ChatGPT

The ChatGPT effect
