You can reduce ChatGPT's hallucinations!


Hi there, happy Friday 🙂 

Many people try adding "be honest" to their prompts.

This doesn't help.

Here's why:

AI doesn't know what's true or false.

It just predicts the next word based on patterns in its training data.

That's all it does.

When you ask something, AI doesn't:

1. Look up real facts

2. Know if it's right or wrong

3. Understand when it makes mistakes

It only reproduces patterns from the information it was trained on.

You can't fix this with better questions.

AI simply doesn't understand truth.
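If you're curious what "guessing the next word" actually looks like, here's a toy Python sketch. It's a made-up mini example, nothing like a real model's scale, but the core idea is the same: count which word tends to follow which, then pick the most common one. Notice there's no step anywhere that checks whether the output is true.

```python
from collections import Counter, defaultdict

# Toy "training data" -- a real model sees trillions of words.
training_text = "the cat sat on the mat the cat ate the fish"

# Count which word follows each word.
follower_counts = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follower_counts[current][nxt] += 1

def predict_next(word):
    """Return the most common word seen after `word` in training, or None."""
    followers = follower_counts[word]
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("the"))  # "cat" -- just the most frequent pattern, not a fact-check
```

That's the whole trick: pattern frequency, not truth.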

BUT here are a few ways you can reduce hallucinations:

1. Give AI trusted sources to use

* Use tools that check real documents

* Upload your own reliable files

2. Let AI search the internet

* Turn on web search in your AI tool

* Check where it gets its information

3. Compare different answers

* Ask the same question to different AI tools (I use ChatGPT and Gemini a lot)

* If answers are very different, be careful

4. Double-check important stuff

* Never trust AI alone for health, money, or legal advice

* Always ask real experts too

No AI is perfect.

Not now.

Not ever.

But these tips help you get better answers.

Growth unlocks straight to your inbox

Optimizing for growth? Go-to-Millions is Ari Murray’s ecommerce newsletter packed with proven tactics, creative that converts, and real operator insights—from product strategy to paid media. No mushy strategy. Just what’s working. Subscribe free for weekly ideas that drive revenue.

AI Tool Of The Week - Ultimate Clipping Tool

Turn 1 long video into 10+ viral clips. Create 10x faster.

OpusClip turns long videos into shorts, and publishes them to all social platforms in one click.

So you can dominate ALL social platforms with just 1 video 🫰

Go check it out → Opus

As always, let me know if you have any questions!

best,

ali