How Should Businesses Position Their Data and AI Strategy For GPT-4?
Data and AI strategy is forward-looking and prescriptive. Part of the planning process must include evaluating the potential impacts of emerging models like GPT-4. The challenge in data and AI strategy is separating hype from reality.
GPT-4 isn’t out yet, and it’s getting the same treatment early iPhones did: rumors about its capabilities are hard to pin down, and strategy planning shouldn’t chase them. Best-in-class capabilities are almost guaranteed, and that’s the right starting point.
In conversations with my clients, C-level leaders want to understand what’s actionable. The focus is on applications and the opportunities created by them. One asked me last year if they could implement GPT-3 for translating documents into different languages. My response was, “Yes, but that would be like using a nuclear weapon to take out a spider’s nest.”
We need to avoid these types of use cases. They fall into a category I label “Yes, we could, but no, we shouldn’t.” Data and AI strategy must be actionable; one immediate benefit is focusing on profitability instead of possibility.
The key to a forward-looking strategy is to search for novel applications instead of dwelling on known use cases. What can’t we do without an advanced model? Natural language translation has been around for several years, so there’s nothing new to evaluate. Incremental accuracy doesn’t change the nature of the use cases.
Chatbots fall into the same category. We’ve been planning for a time when chatbots are more capable, and that time has arrived. The Q&A functionality is improved but doesn’t require a data and AI strategy shift. A better Alexa doesn’t need to be on our radar.
“Is it worth exploring building our own GPT-4?” That question came up in a conversation in mid-December. For the next two years, it won’t be worth thinking about. Three years out is a different story. Chip manufacturers and cloud providers are working to reduce model training costs.
Massive datasets will still be out of most companies’ reach. However, if the business has access to or is curating a large, unique dataset, it’s worth discussing as a long-term possibility. Still, very few use cases will justify building vs. buying.
The same logic will hold in 3 years. Just because it will be feasible doesn’t mean it will be profitable. Most companies will buy or lease access to large models from best-in-class model services companies. That brings forward opportunities worth exploring.
Getting In Front Of GPT-4’s Business Wave
Machine learning as a service (MLaaS), AI as a service (AIaaS), etc., are emerging business models. OpenAI is raising money at a valuation around $30B, which reveals the level of interest in these types of businesses. That’s worth paying attention to and planning for.
VCs can be wrong about individual companies and technical trends, but they are usually on target about business models. GPT-4 will have multiple applications, each an opportunity for monetization. Data and AI strategy should evaluate the business’s opportunities with this business model.
Why is OpenAI potentially worth so much? It’s a function of how many different use cases it can improve. Put the model behind customer service functionality, and it will handle most knowledge and support requests. Look at the Q&A section of a product on Best Buy or Amazon for the types of requests GPT-4 will successfully manage.
GPT-4 will allow businesses to support their products across all these different sites in near real-time. It will work on social media too. Questions I used to need a gate agent to answer while I was waiting for my flight will be managed on the airline’s app.
Microsoft has a suite of applications leveraging GPT-4 to deliver new or more capable functionality. If they can get another $1 per month from every Office365 subscriber, they would generate over $4B in new revenue annually. Other use cases in PowerBI, GitHub, and more apps push the opportunity scope for Microsoft into massive numbers.
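The arithmetic behind that figure is simple. Here is a minimal back-of-the-envelope sketch; the subscriber count is an assumption based on publicly reported 2022 figures (roughly 345 million paid Office 365 seats), not a number from this article:

```python
# Back-of-the-envelope estimate of the Office365 revenue opportunity.
# The subscriber count is an assumption (~345M paid seats, a commonly
# cited 2022 figure); the $1/month uplift is the hypothetical from above.
subscribers = 345_000_000
monthly_uplift = 1.00  # dollars per subscriber per month

annual_new_revenue = subscribers * monthly_uplift * 12
print(f"${annual_new_revenue / 1e9:.1f}B per year")  # -> $4.1B per year
```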
The disruptive power of MLaaS/AIaaS business models is their unique monetization. Models don’t do just one thing, and the breadth of potential applications is much greater than traditional software’s. Data and AI strategy needs to have the objective of cultivating MLaaS/AIaaS business model opportunities.
Understanding A New Customer Retention Paradigm
The key is being best in class. Many chatbots are available today, but GPT-4 will be the most functional and reliable, for now. There will be competitors, as in any other market, but only the top-performing models will gain traction. Data and AI strategy should look for opportunities for the business to develop a best-in-class model.
That’s critical because customers won’t be loyal to a model. In 12 months, no one will care if they’re using GPT-4, BLOOM, PaLM, Megatron, or Model McModely Face. Customers have stuck with Apple even when Samsung released a phone with better specifications. Models can’t rely on the same loyalty, and their protective moats are non-existent. State-of-the-art only lasts a few months for models. There’s always someone with a better approach working to deliver it to the market.
In the near term, the quality and novelty of a company’s data will determine its opportunity to develop a best-in-class model. As I said earlier, the massive dataset and compute requirements will be out of reach for most businesses. Novel datasets are another avenue to best-in-class models because competitors will lack the raw materials to develop their own version.
It’s one of the only competitive advantages in the machine learning world. To be part of the next technology gold rush, data and AI strategies should prioritize curating novel datasets. That will require intentional data gathering and curation. Experiments and access to complex user or customer workflows are two sources of novel datasets.
Anticipating Implications Of GPT-4
There will be downstream impacts of GPT-4. Network effects create the most significant risks and opportunities because most businesses don’t see them coming. Anticipating them is a critical advantage of having a data and AI strategy. Technical strategists are looking for ways to monetize the implications of new technologies.
By all indications, there will be an impact on search. GPT-4 will take some of the traffic that now goes through Google. What downstream effects could that create? Some Google ads will not perform as well.
I use this framework in the data and AI strategy planning process. Technology {X} could create change {Y}. What would the implications of the change be, and how could that impact the business? I break impacts into opportunities and risks. Then I evaluate modifications to the strategy that will either capitalize on or mitigate them.
In the Google Ads example, the business needs to monitor the types of traffic that GPT-4 takes from Google Search. If a critical customer segment gets swept up, the company should redirect advertising spend before performance takes a hit.
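For readers who want the framework in a more concrete form, here is a minimal sketch of it as a data structure, using the Google Ads example above. The class and field names are illustrative assumptions, not tooling described in this article:

```python
from dataclasses import dataclass, field

@dataclass
class Impact:
    description: str      # what changes for the business
    kind: str             # "opportunity" or "risk"
    strategy_change: str  # modification that capitalizes on or mitigates it

@dataclass
class TechnologyChange:
    technology: str                # technology {X}
    change: str                    # change {Y} it could create
    impacts: list[Impact] = field(default_factory=list)

# The Google Ads example above, expressed in the framework
gpt4_search = TechnologyChange(
    technology="GPT-4",
    change="Diverts some query traffic away from Google Search",
    impacts=[
        Impact(
            description="Ads reaching a critical customer segment underperform",
            kind="risk",
            strategy_change="Monitor which traffic GPT-4 takes; redirect ad spend early",
        ),
    ],
)
```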
There will also be an impact on autogenerated code. The hype says that software engineers should worry about their jobs. The more pragmatic implication is increasingly functional self-service data tools. It’s not just GPT-4. Google has been working on a similar project to compete with Microsoft’s Copilot.
GPT-4 should support more text-to-code tools, knowledge discovery, defect resolution, and a more comprehensive range of data and analytics use cases. Data and AI strategy should plan for more model-literate users with more powerful tools. The data team’s workload will decrease, and they’ll have time to work on more advanced projects.
The product and initiative roadmaps should be updated based on that expectation. Budget should be allocated for self-service tools vs. continued hiring and data team ramp-up. It’s small insights like these that can have the greatest impact. They are the difference between a business moving forward consistently or backtracking after a misstep becomes obvious.
What I’m Preparing My Clients For
I am working with clients to develop data curation initiatives. These low-cost projects catalog all the data the business has access to. Data curation and cataloging are the first steps in creating an AIaaS/MLaaS product. The sooner the discovery process starts, the better.
I am warning clients to be on the lookout for competitors working with one of the other companies offering models like OpenAI’s. A competitor who leverages a generative model to leap ahead is the greatest threat. I advise clients to reach out to a few AIaaS/MLaaS startups for a brainstorming session and see what kinds of ideas are out there. Even if leadership doesn’t decide to move forward with any of them, it provides a more concrete idea of what to watch out for.
I am helping clients adopt a data and AI product-first mindset. The purpose of every data and AI initiative should be a product with well-defined expected returns. That requires process changes to mature past prototypes and pilot projects. Data Product Managers are a crucial part of the process too.
OpenAI has also captured broad interest in data and AI, which means customers and internal users will be more willing to adopt data and model-supported products. This is a tipping point for those products. I am working with clients to implement more streamlined processes that take customer needs and put solutions into their hands.
Bottom-up opportunity discovery is a process that reduces the time it takes to discover, define, estimate the opportunity size, assess feasibility, and quantify costs. That is the critical upfront work required to prioritize initiatives and put them on the product roadmap. Most businesses will struggle to respond to customer needs without it.
There are smaller themes, but those four generalize across all businesses in competitive industries. As strategists, we need to put the company in a position to profit from data and AI. This is the perfect opportunity to revisit or develop a data and AI strategy.