AI Fails to Understand Strategy

It has been my position for years that strategic plans are rarely strategic. They're just plans for the school's development (driven by the school, with everything in its locus of control), and have little to do with actual strategy. In fact, following a review of over 100 strategic plans that I did in 2010, I came to the conclusion that strategic plans were dangerously formulaic. Some thirteen years later, having reviewed hundreds more, I hold to my earlier assertion. Key learning: if something is formulaic, it can be replicated easily by means of an algorithm.

Enter ChatGPT. For all the fretting within the education sector about this product right now, consider the following prompt I entered, and the answer that was given, underscoring the formulaic, algorithmic nature of strategic plans:

This result resembles far too many strategic plans in schools and institutions of higher education...and it should, since ChatGPT relies on structured information that it can find and access on the internet. I've written elsewhere about the danger of monoculture in independent and international schools, and this app-driven 'strategic plan' exercise is no different. If a handheld device running ChatGPT can produce such a thing, why waste time on committee meetings and consultants, when a school will simply end up with a similar product? What's more, it would be a product that the senior leadership team and a handful of teachers could have devised on their own, without a six-month process. Will that product likely be substantially different from what ChatGPT has produced here? Unlikely, apart from a flashy PDF or print brochure. That ought to concern leaders of institutions of learning, and their boards, and it should underscore that we need to take strategy far more seriously, departing from a highly replicable process and 'solution.'

I've been deeply influenced by Roger L. Martin's work on strategy over the years. When one sees--even dimly--the difference between real strategy and so-called 'strategic planning,' one cannot unsee it. Strategic plans stick out like a sore thumb. I commend his tome (with co-author A.G. Lafley) Playing to Win: How Strategy Really Works, as well as Martin's profuse postings that further tease out the discipline of strategy here. Eye-opening and life-altering, when it comes to strategy.

The screenshots from my ChatGPT query above may be attractive and impressive at first glance; that would be because one is looking at the content through the lens of the familiar heuristic of strategic plans in schools and in higher education. If one is wedded to that perspective, then, yes, ChatGPT's answer should create a feeling of fright.

On closer inspection by someone who understands strategy, though, the ChatGPT sample strategic plan is woefully insufficient, to the point of being irrelevant. Consider the following:

  • Where is the obstacle that the strategy is meant to overcome? There is none present here; no problem to be solved. So, it's a list of 'continuous improvements.' Improvements against what?
  • There is no talk of choice: real strategy compels us to make choices (we will do this, and we will not do that). Has a choice been made? Unclear.
  • Where does the plan identify where the school will 'play' -- in which geography, among which audiences, etc.? No sense of its playing field.
  • How will this school win, in its given area of play? There is no positioning of a product/service that shows its differentiation against a set of other possible players on the playing field. Sidebar: if 'winning' involves 10% enrollment growth every year (as ChatGPT states above, as a goal), how does that increase show how the school wins? And does the school aspire to be large? Unclear, again. Why assume that bigger is better? In fairly quick order, the size of the school would double.
  • Where are the capabilities that would be required in order to win (in other words: to influence families to seek and favor this school against others, with a willingness to pay tuition)? Plenty of actions are listed, but they're not tied to building the capabilities needed to deliver the putative brand that families would desire. Strategy almost always necessitates building capabilities that the institution doesn't currently have, but will need, in order to succeed ("to win" with families).
  • Management systems, as Martin would say, are integral to the practice of high-quality strategy. He writes, "A company needs management systems that build and maintain the distinctive capabilities that underpin a unique how to win in the chosen where to play that meets its winning aspiration. If a strategy does not have specific management systems that serve the purpose of building and maintaining distinctive capabilities, then those capabilities either won’t get built in the first place or will deteriorate because they are not systematically maintained. Additionally, if the capabilities and management systems of an organization are entirely or nearly identical to those of competitors, its where-to-play and how-to-win choices will be replicated as soon as shown to be successful. Hence distinctiveness is a key attribute to management systems. Sameness in management systems is typically matched with sameness in capabilities which delivers competitive parity not competitive advantage." (source) 
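The 10% sidebar above is worth quantifying. Compounding makes "in fairly quick order" concrete: at 10% a year, enrollment doubles in just over seven years. A minimal sketch, using a hypothetical 500-student school purely for illustration:

```python
# How quickly does 10% annual enrollment growth double a school?
import math

growth_rate = 0.10

# Doubling time from the compound-growth formula: (1 + r)^t = 2
doubling_time = math.log(2) / math.log(1 + growth_rate)
print(f"Doubling time: {doubling_time:.2f} years")  # ~7.27 years

# Year-by-year check for a hypothetical 500-student school
enrollment = 500.0
years = 0
while enrollment < 1000:
    enrollment *= 1 + growth_rate
    years += 1
print(f"A 500-student school passes 1,000 students in year {years}")  # year 8
```

So a school adopting that goal is, whether it intends to or not, choosing to double in size within a decade, which is precisely the kind of choice a real strategy would surface and defend.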

ChatGPT fails miserably at strategy. Not only is AI unable to sense and 'make sense' of context, it fails in the domains of imagination, creativity, and judgment, all of which are required in strategy work. In other words, strategy is a wonderfully human exercise. At Westlake, we treat strategy as advisory work, not as consulting work, because we recognise that strategy, once formulated, requires ongoing monitoring and adjusting to conditions; in other words, it requires advising that relies on imagination, creativity, judgment, and sense-making. It is not episodic and formulaic; it doesn't "happen" every five years.

Take-away: AI fails miserably at strategy. It succeeds brilliantly, however, at linear, predictable planning within a monoculture of educational thought. How should we feel about that?
