Can ChatGPT coach the business leaders of tomorrow?


AI-powered content creation is the new ‘big thing’ in the world of business and marketing. Many readers will be familiar with ChatGPT – a large language model trained on vast amounts of text to generate natural-sounding responses to the questions we ask it.

But while its answers may often appear intelligent, ChatGPT isn’t ‘thinking’ about the questions we ask it in the way that a trained researcher might. Rather, it predicts what a good answer should look like, based on patterns in the text it was trained on.

This is a subtle yet important difference.

While it isn’t a genuine intelligence in human terms, ChatGPT gives such a convincing semblance of intelligence that many people may be fooled into thinking its output was produced by a real human being.

This is an issue because ChatGPT is being used in more and more applications, from search engines and social media sites to content creation and even academic research.

A world of smoke and mirrors?

In research published in the Journal of Work-Applied Management, Jonathan Passmore and David Tee explore the potential for AI to produce content that could be used in business coaching. The idea is that, one day, AI might be used in place of human coaches to develop the business leaders of tomorrow.

The research found that while ChatGPT can certainly produce content that appears meaningful, its output lacks the depth and rigour of analysis required of a successful human coach.

Of course, this isn’t to say that large language models won’t become useful at some point in the future. As it stands today, however, GPT-4 (the latest model behind ChatGPT) isn’t quite ready for these purposes.

The dangers of relying on tech

This research has many implications. One of the most important, perhaps, is that educational organisations, academic institutions and training providers should be wary of using ChatGPT to produce training materials – or indeed anything used in teaching and learning. While ChatGPT’s output may appear useful at first glance, dig beneath the surface and the content isn’t rigorous enough to be of use.

Indeed, to use such tools for training purposes would be highly unethical, and could even cause harm should someone apply the lessons given by AI in the ‘real world’.

Where next for humanity?

Of course, there are many other issues raised by ChatGPT and its competitors.

One of the main issues we face right now is the way chatbots are slowly eroding the need for human interaction. Gone are the days when you might expect to phone a call centre or speak to a human customer service representative. Instead, we open a chat window and type our question into a smart system that points us towards what we’re looking for.

With social media platforms such as LinkedIn offering AI-powered content creation, we risk entering a world where machines speak to machines and there is no room for human interaction at all. This may sound like a dark vision of the future, but there is a real risk that we undermine the very fabric on which our society is built.

ChatGPT has the potential to be a fantastic tool and resource for businesses, researchers and consumers alike. But it comes with caveats, and as a society we shouldn’t rely on machines to do all of our thinking for us.
