Author: Martin Di Lorenzo - mb&l consultores

When AI Hallucinates… and So Do Salespeople…

In recent months, Generative Artificial Intelligence has become a central topic in many conversations. One of the expressions most often used to describe its behavior is “hallucinations.”

What are “hallucinations”?

This term refers to answers that a model delivers with complete confidence but which are false, fabricated, or inconsistent with reality. The peculiarity is that they are not presented as hypotheses or doubts, but as certainties, and that apparent confidence makes the recipient more likely to accept the answer without questioning it. It is a design problem that, for now, has no definitive solution.

Examples of AI hallucinations

  • Quotes attributed to authors who never wrote them.
  • Statistics with precise numbers, but with no real source.
  • Dates, names, or addresses that seem plausible but turn out to be incorrect.

In all cases, the problem is not just the error itself, but the conviction with which the system presents it.

How to reduce AI hallucinations

Although they cannot be completely eliminated, it is possible to mitigate them:

  • Source verification: cross-check what AI outputs against reliable references.
  • Better prompt design: the quality of the question determines the quality of the answer (see the short sketch after this list).
    • Be specific: instead of “tell me about sales,” ask “list three digital prospecting tactics for mid-sized B2B companies.”
    • Clarify the role: “answer as if you were a senior sales consultant.”
    • Define the output format: “present it in a table with columns for benefits and risks.”
    • Provide relevant context: “assume the client sells cybersecurity software in Mexico.”
    • Specify the expected type of response: whether you want a hard fact, an opinion, or for the AI to state clearly, “I don’t have access to that information.” This prior instruction limits hallucinations by setting clear boundaries.
  • Support role: use AI as an assistant, not as an absolute source of truth.
  • Hybrid systems: combine AI with search mechanisms, databases, and human validation.
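
To make the prompt-design tips concrete, here is a minimal Python sketch that assembles a request applying them: a specific question, an explicit role, relevant context, a defined output format, and an instruction to admit when information is missing. The names `build_prompt` and `ask_model` are hypothetical and used only for illustration; the placeholder stands in for whichever LLM API or tool you actually use.

```python
# Minimal sketch: building a prompt that applies the tips above.
# `ask_model` is a hypothetical placeholder for whatever LLM API you use;
# wire it to a real client in your own environment.

def ask_model(prompt: str) -> str:
    """Placeholder for a real LLM call (e.g. an HTTP request to your provider)."""
    raise NotImplementedError("Connect this to your LLM provider of choice.")


def build_prompt(question: str, role: str, context: str, output_format: str) -> str:
    """Combine role, context, question, format, and boundary instructions into one prompt."""
    return "\n".join([
        f"Act as {role}.",              # clarify the role
        f"Context: {context}",          # provide relevant context
        f"Question: {question}",        # be specific
        f"Format: {output_format}",     # define the output format
        # Set clear boundaries to limit hallucinations:
        "If you do not have reliable information, reply exactly: "
        "'I don't have access to that information.' Do not guess.",
    ])


if __name__ == "__main__":
    prompt = build_prompt(
        question="List three digital prospecting tactics for mid-sized B2B companies.",
        role="a senior sales consultant",
        context="Assume the client sells cybersecurity software in Mexico.",
        output_format="A table with columns for benefits and risks.",
    )
    print(prompt)                     # inspect the structured request
    # answer = ask_model(prompt)      # uncomment once ask_model is wired to a real API
```

A well-structured prompt only narrows the space in which the model can hallucinate; the verification, support-role, and human-validation steps above still apply to whatever it returns.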

In short, the key is to accept the limitation and establish a method that allows us to separate facts from illusions.


The Parallel with B2B Sales

Interestingly, something similar happens in the commercial world. Salespeople can also “hallucinate”: confusing perceptions with facts and communicating them as certainties.

Some examples:

  • “The client is very interested” because they showed enthusiasm in a meeting.
  • “The budget is already approved” based on an informal comment.
  • “We are one step away from closing” without any official confirmation.

These interpretations are not made in bad faith, but they are risky. When management makes decisions based on assumptions, the probability of error increases.

How to reduce hallucinations in sales

As with AI, the important thing is to have consistent processes:

  • Document events: record meetings, calls, and commitments in a shared system.
  • Cross-check information with additional sources.
  • Validate with the client: confirm what was interpreted instead of assuming.
  • Clear methodology: apply thorough qualification to distinguish perceptions from real evidence.

Conclusion

The lesson is twofold. In AI, hallucinations reveal the limits of the technology and the need to cross-check. In sales, they remind us that intuition cannot replace data.

In both cases, the challenge is not that hallucinations exist, but that we lack a method to detect and correct them. True value is created by those who can separate facts from illusions.
