It is very rare for a technology to be elevated to the status of a General Purpose Technology (GPT) – yet most technologists agree that AI should make the cut, alongside the internal combustion engine, electricity and the internet. Many definitions of a GPT exist; here’s one:
1. Pervasiveness – The GPT should spread to most sectors.
2. Improvement – The GPT should get better over time and, hence, should keep lowering the costs of its users.
3. Innovation spawning – The GPT should make it easier to invent and produce new products or processes.
While this has been true for the Consumer, can we say that AI is a GPT for the Enterprise? While there are several (and growing) examples of how AI is shaping different parts of the industry value chain, what might be the criteria relevant for the Enterprise? And does it matter? In this blog, I will argue that AI is indeed a GPT for the Enterprise, and explain why that matters.
To begin, let’s try to apply the above definition to the Enterprise:
- Pervasiveness: There is enough evidence, although scattered across enterprises. Take any process within your value chain – AI has the potential to accelerate decision cycle times (e.g. Intelligent Process Automation), enable better decisions (e.g. recommend a Next Best Action to improve the Customer Journey) and, even more, help businesses discover new opportunities (e.g. identify the Product Features that create the highest returns). And the best part is that the technology is already here – it is for every company to take a broad view of the entire enterprise and start executing on the use-cases.
- Improvement: This is one of AI’s core defining features. AI systems get better over time, as they continue to learn. And this is true in Enterprises as well, where the improvements in recommendations and process automation etc. will not just continue to lower the cost of running these systems, but at the same time, will improve the quality of the decisions.
- Innovation spawning: This is likely to be the biggest challenge for Enterprises. How do you create an ecosystem that enables AI-driven innovation in the functional units? This is a fundamental challenge that could determine how well AI will serve the Enterprise.
It is becoming increasingly clear that AI has the potential to enable transformational change in organizations – and in the process, fulfill its promise as a GPT. The question then is: what will it take for Enterprise AI to realize this potential?
The history of GPTs is a good place to start. Study the internal combustion engine, electricity, or the internet, and it becomes apparent that governments, entrepreneurs and markets had to come together to truly enable the GPTs as the catalysts that changed economies and societies forever. Similarly, it is for the Enterprise to drive the right set of policies, processes and platforms to exploit the true potential of AI:
- Organization Change: Easily the most difficult and challenging part. Just as GPTs triggered changes that went beyond the economic and altered the social and political landscapes of countries (e.g. the internal combustion engine helped the British economy move from an agrarian base to an industrial powerhouse, and in turn fired up the colonial machinery that established British rule over half the world), the Enterprise needs to be ready to transform itself to well and truly exploit AI. The challenges are many, but can be summarized into three major ones:
- Talent, talent, talent: It is not enough to hire a data scientist or two into functional units – every employee needs to adapt to survive and thrive in an atmosphere where routine tasks will be automated and whole value chains will be disrupted. A Procurement manager can no longer afford to be good at just negotiating prices with vendors – automated reverse auction engines powered by intelligent algorithms can easily outperform humans. How does this impact your workforce strategy? How do you then re-tool the Procurement Manager? This is a distributed problem – each functional unit needs to re-think roles, responsibilities and talent composition from the ground up.
- Build capability: Most functional units are struggling today to build the capability. Enterprises would do well to look at the Venture Studio model for inspiration. A centralized group under the CDO should focus on building this seed capability with individual functional units – e.g. help define AI-driven transformational programs, execute a few iterations, and in parallel help the functional unit hire and train talent. This is probably the biggest and most important role of the CDO.
- Cultural Integration: While individual functions should invest in building their Data Science Teams, the functions will run into the inevitable cultural challenge of integrating the new (and different) skillsets of the Data Scientists with the legacy teams. The cracks are already visible – starting with compensation and incentive structures. Thanks to the red-hot job market for Data Scientists, they command significant salary premiums over the traditional job functions. In the early days of the internal combustion engine, the skilled machine operators were the most visible face of change – it was not just that they were paid more than the weavers; they were also blamed as the reason the weavers lost their jobs. A microcosm of a similar phenomenon is happening in the Enterprise – and it is important to deal with it. While it is a distributed problem, the CDO can play an important role by enabling the right project structures that enable collaboration (e.g. cross-functional Pods to drive transformation projects) and, in the longer run, help train/re-tool the existing personnel in the individual functional units.
- Data Standards: All the GPTs took off once standards were established. This allowed innovation ecosystems to flourish across the value chain – establishing a standard for electricity supply across an entire country (e.g. 120V AC), for instance, created a market for electrical goods. Likewise, it is necessary not just to build a common data ontology, but also to provide clear definitions of core data elements (e.g. Customer, Product etc.) that enable data to be threaded across the value chain. Easier said than done – especially with the legacy baggage: decades of siloed IT systems that, in some egregious cases, don’t even have a standard definition of a Customer. This is clearly a centralized responsibility that should sit with the Chief Data Officer. The three primary challenges here:
- Working with the individual siloed IT system owners to drive standards for data capture at the right grain and frequency, with enterprise-wide standard definitions
- Building the pipelines to bring data from the vast cornucopia of legacy systems (owned not just by IT, but also Shadow-IT and business applications)
- Connecting datasets across siloed systems which have never bothered to talk to each other – e.g. threading data across the value chain to come up with a cogent picture of the customer journey.
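To make the standards challenge concrete, here is a minimal sketch of what a canonical data definition might look like in code. The `Customer` schema, the source-system field names (`cust_no`, `account_id` etc.) and the email-based matching rule are all hypothetical illustrations, not a prescription:

```python
from dataclasses import dataclass

# Hypothetical enterprise-wide canonical definition of a Customer.
@dataclass(frozen=True)
class Customer:
    customer_id: str   # source-prefixed identifier
    name: str
    email: str         # normalized (lower-cased), used to match across silos

def from_crm(record: dict) -> Customer:
    # Assumed CRM silo: id lives under "cust_no", name is split in two fields.
    return Customer(
        customer_id=f"CRM-{record['cust_no']}",
        name=f"{record['first_name']} {record['last_name']}".strip(),
        email=record["email"].strip().lower(),
    )

def from_billing(record: dict) -> Customer:
    # Assumed billing silo: same concepts, entirely different field names.
    return Customer(
        customer_id=f"BIL-{record['account_id']}",
        name=record["account_name"],
        email=record["contact_email"].strip().lower(),
    )

def same_customer(a: Customer, b: Customer) -> bool:
    # Deliberately naive matching rule: identical normalized email.
    # Real entity resolution needs far more robust keys and fuzzy matching.
    return a.email == b.email
```

The point of the sketch is that once every silo maps into one agreed shape, threading a customer across systems becomes a matching problem rather than an archaeology project.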
- Platforms: GPTs should provide increasing returns to scale. And for that to happen, we need the necessary infrastructure – the electricity grid and ARPANET are examples. These tend to be capital intensive and hence are best served by a centralized organization. Enter the CDO again – this time, with the right set of technology infrastructure: one that makes available the datasets AI models need to learn, and an environment where AI models can be rapidly developed and deployed at scale. The three primary challenges here:
- Ensure the availability of data through the data pipelines from source systems: AI/ML models are data hungry and they need to be fed on a continuous basis.
- The real challenge with AI models is not creation but consumption. Integrating the output of an AI model into a business workflow requires not just scoring the models, but also effectively integrating with operational systems, often managed by IT. This calls for a well-designed and executed API layer that exposes the model outputs, often with expectations of performance, uptime etc. – in short, an Engineering mindset.
- Enabling re-use: ‘Let a thousand models bloom’ may have been the mantra in the early days of AI, but that needs to change. If AI is to truly scale, the platforms need to enable a healthy ecosystem that allows the sharing not just of datasets, but also of model components (e.g. feature engineering) that can be leveraged across multiple use-cases.
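One way to picture the re-use point is a shared feature registry: feature-engineering logic is written and registered once, then any model (churn, next-best-action, etc.) requests features by name. This is a toy sketch under assumed names (`register_feature`, `feature_vector`), not a reference to any specific feature-store product:

```python
from typing import Callable, Dict, List

# Hypothetical shared registry: feature name -> function computing it.
FEATURES: Dict[str, Callable[[dict], float]] = {}

def register_feature(name: str):
    """Decorator that publishes a feature function for enterprise-wide re-use."""
    def wrap(fn: Callable[[dict], float]):
        FEATURES[name] = fn
        return fn
    return wrap

@register_feature("order_value_avg")
def order_value_avg(customer: dict) -> float:
    # Average order value; defined once, consumed by many models.
    orders = customer.get("orders", [])
    return sum(orders) / len(orders) if orders else 0.0

@register_feature("order_count")
def order_count(customer: dict) -> float:
    return float(len(customer.get("orders", [])))

def feature_vector(customer: dict, names: List[str]) -> List[float]:
    # Any model assembles its inputs from the shared registry by name.
    return [FEATURES[n](customer) for n in names]
```

The design choice here is that models depend on feature names, not on each other’s code, so a fix or improvement to one feature propagates to every use-case that consumes it.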
There is a lot of work to be done across these three dimensions to truly enable Enterprise AI to unlock the potential of a GPT for an organization. The next few years will be defining for almost every large enterprise – and the CDO has to be the agent that drives this transformation.
References:
Bresnahan, T.F., Trajtenberg, M. (1996). “General purpose technologies: ‘engines of growth’?”. Journal of Econometrics, Annals of Econometrics 65, 83–108.