Your AI habit could be offending the real experts

Leigh Stark is an award-winning journalist and reviewer with almost 20 years of experience, heard regularly on ABC, 2GB, 3AW, and more.

Meet Charlie*.

Charlie is an expert in a lot of things and a voice you can turn to for pretty much anything.

Need a book recommendation? Charlie has a few. How about a cool destination? Charlie knows a lot about places. And what about a way to make a few more quid? Yep, Charlie has some ideas on that, too.


The problem is Charlie isn't real.

He's one of many personas that we use for the AI systems now beginning to take over our lives.

But that isn't stopping people from asking Charlie for a second opinion over that of a real person.

Charlie could be ChatGPT, or he could be Claudia for Claude. They might be Giulia Gemini, Percy Perplexity, or maybe just "Kimi". Go beyond mere text assistance and you might have Michael Midjourney helping you make images, or Christopher Cursor and Beatrice Bolt as AIs helping you to code.

The AI chat assistant in your life might not even have a personified name. They might simply be what you turn to without giving it a pet name. A tool you rely on, part of your digital life.

Whatever you call them, AI use is having an effect on real-life people, and not necessarily in the way you might expect.

While the loss of jobs seems like an everyday risk in a world where AI is seemingly taking over, affecting everything including the environment, our gradual push to incorporate AI in every part of our lives is having some unintended consequences: it's making us distrustful of experts.

An issue of trust

A recent study by Australia's Monash University, in collaboration with the MUMA College of Business at the University of South Florida, has shown that human advisors may be less motivated to work with clients who also turn to AI for advice, particularly when that advice is weighed against their own.

It means the real specialist you ask for advice is effectively being second-guessed by what a computer trained on a variety of topics suggests, and that is taking a toll on the human in the equation.

It's a little like asking the librarian for book advice, turning to ChatGPT on your phone, and then turning back and telling the human "you're wrong". Understandably, it's a situation that isn't impressing the real-life people, especially because it feels like they're being lumped in with the tools, rather than being portrayed as the experts.

"Advisors view AI as substantially inferior to themselves," said Associate Professor Gerri Spassova, the first author on the research from Monash University. "Thus, being placed in the same category as an AI system feels insulting and signals disrespect, undermining advisors' willingness to engage."

The research effectively shows that if you're going to use both AI and real-life advisors, you might want to spare your human counterparts the detail that you used AI in your research.

While they can probably guess it was your starting point, much as a web search would be, treating AI as just as relevant and important as an expert in their field is the real sticking point: to the expert, it comes across as demeaning.

It's worth noting that AI doesn't always get it right, and sometimes gets it very wrong. AI has sent holidaymakers to destinations that don't exist, including some right in our own backyard of Australia, a sign of why you might want to trust someone in the field over a chatbot.

The technology also tends to fail at medical diagnoses, even though its answers can sound convincing, suggesting doctors and GPs are the people to turn to, not the bots.

And while AI might end up assisting with dating and financial decisions at some point, the government warns that it can be wrong and isn't licensed to give personal financial advice the way human experts may be.

While AI may well be capable of providing lots of answers, it may not be as specialised or definitive as a human with experience. That could change over time, but right now, the human appears to be in the better position.

AI is a tool, and like any tool, it wouldn't be surprising to use it to form or understand a position. But that doesn't mean relying on it solely, particularly over an expert in the very field you're turning to.

"Ultimately, we believe that at present it is better for clients not to disclose to their advisors that they consulted AI," said Associate Professor Spassova.

"And this may be particularly true for new client-advisor relationships, where the client has no track record of doing business with the advisor and where trust may not have been established yet."