Three Lessons From Chatting About Strategy With ChatGPT

When generative AI’s capacity for strategy creation is put to the test, it reveals where its strengths lie — and where humans still have the edge.


When Geoffrey Hinton, a pioneer of deep learning, quit Google recently, he made it clear that he is worried about the risks of artificial intelligence. He is not alone. Following the launch of GPT-4, thousands of artificial intelligence experts signed a letter calling for a pause in the development of more-powerful AI systems.

On the other hand, excitement about the opportunities arising from large language models (LLMs) and their speed of adoption is unprecedented. Microsoft was quick to integrate the new technology into its Bing search engine, and the company’s cofounder, Bill Gates, stated that ChatGPT will “change our world” without necessarily putting jobs at risk.

While speculating about the future of AI is irresistible, the more practical question is how we can use it right now. Conversations about this are taking place in classrooms, newsrooms, and workplaces around the world.

As business strategists, we wanted to see what generative AI could add to our work. We explored this question through a series of experiments on different aspects of the strategy creation process. In each experiment, we put a realistic strategy question to ChatGPT, followed by a lengthy back-and-forth to refine its initial responses. The intention was to understand how the tool can support ideation, experimentation, evaluation, and the building of stories — and where it falls short.

Three lessons emerged from these experiments.

1. Expect interesting input, not infallible recommendations.

In one of our experiments, we asked ChatGPT to suggest some disruptive business ideas for a large European transport provider. The chatbot suggested a personalized planning app, a ride-sharing service, hyperloop transportation, and a smart-luggage delivery service. Coincidentally, the first three matched ideas from a recent workshop with a transport provider in another European country. The tool was also able to provide business models and cost estimates for these business ideas.

On the one hand, this is impressive. On the other hand, it highlights that the tool seems unlikely to come up with ideas humans can’t, although it gets results faster and with less effort. Several other experiments confirmed this.

In another experiment, it also became obvious that humans are better at translating ideas into actions.

