For many small businesses, AI does not feel cheap at all.

That is the first myth worth killing.

From the outside, the AI market can look like a gold rush full of free tiers, modest subscriptions and cheerful promises of productivity. From the inside, for a normal small business, it often looks very different. Most SMEs already have working systems, established processes and staff who know how to do the job. AI is not arriving to fix a broken business. It is arriving and asking to be inserted into a functioning one, which means the business must spend money and time finding out whether the technology is mature enough to deserve a permanent place. That is a very different proposition from the "cheap experimentation" Silicon Valley likes to describe. BCG has argued that the value of AI comes not just from the model itself but from the wider changes required in data, workflows and ways of working, which supports the point that the true cost of adoption is much broader than the subscription fee alone.

That matters because the current AI race is being financed at a level that is not remotely normal for a mature software market. Reuters reported in February 2026 that Alphabet, Amazon, Meta and Microsoft alone were expected to invest about $650 billion in AI infrastructure in 2026, up sharply from 2025. Reuters also reported that Broadcom said major tech groups were planning to invest more than $600 billion in AI infrastructure this year, while HPE warned that AI servers remained expensive to produce and that memory costs and supply pressures were likely to persist into 2027. Those are not the economics of a calm, stable service market. They are the economics of a brutal race for position.

SMEs are already paying, even before the market matures

This is the point too many AI commentaries skip over.

The providers may still be subsidising growth and fighting for market share, but from a small business point of view the costs are already real. A business with a working process for handling quotes, customer service, reporting, document creation, admin or analysis is not being offered a clean replacement. It is being asked to fund a test. That test usually means trial licences, duplicated effort while old and new methods run side by side, staff learning time, governance concerns, checking outputs for errors, and tolerating disruption in systems that were previously doing the job well enough. BCG's work on digital platform lock-in also reinforces that switching and reworking digital systems is costly because of technical migration, retraining and productivity loss.

That is why AI often feels expensive to SMEs even now. The cost is not just the invoice from the provider. It is the cost of proving whether the tool is mature enough to trust. Until that proof exists, AI is not really a saving. It is a speculative business expense.

AI is being sold as efficiency before it has earned the right

There are certainly cases where AI creates immediate value. But the broad sales pitch runs ahead of reality.

A business does not become more efficient simply because it adds AI to a process. It only becomes more efficient if the output is good enough, safe enough, repeatable enough and genuinely less burdensome than what came before. That is a much higher bar than the current hype cycle usually acknowledges. Many firms are not adopting AI because their old systems failed. They are adopting it because the market is telling them they will be left behind if they do not.

That creates an awkward dynamic. Providers want real-world usage because real-world usage improves positioning, feedback and dependence. Customers, meanwhile, are being asked to absorb the risk of immature deployment. Put bluntly, many SMEs are helping de-risk AI vendors' products by testing them inside live businesses.

The current AI race is not designed to keep prices low forever

Even if a business gets through this phase and finds genuine use cases, there is little reason to assume the pricing environment will stay generous.

The current market is still in its land-grab phase. Prices, bundles and access models are being shaped by a fight for adoption. OpenAI, Anthropic, Google, Microsoft, Meta, xAI and others are still trying to secure position. That tends to produce generous-looking offers, rapid feature rollouts and underpriced capability compared with the long-term economics of the industry. But that phase does not last forever. History across software, cloud and SaaS suggests the usual pattern: first comes market capture, then ecosystem dependence, then monetisation.

"First comes market capture, then ecosystem dependence, then monetisation. That phase does not last forever, and history across software and cloud suggests what comes next."

The evidence from adjacent software markets is not subtle. Microsoft announced in December 2025 that it would update commercial Microsoft 365 pricing from 1 July 2026 alongside additional AI, security and management capabilities. Salesforce announced that list prices for several major products would rise by an average of 6% from August 2025, explicitly citing ongoing innovation and customer value. In other words, the idea that technology providers keep passing efficiency gains straight back to customers is not how this normally works in practice.

The future bill is unlikely to be one subscription

Another reason AI costs are likely to rise is that businesses probably will not end up paying just one provider.

The more realistic future is AI subscription stacking. A business might pay for a frontier general-purpose model, then also pay for specialist legal, support, finance, design, analytics or workflow tools. It may pay once via Microsoft 365 Copilot, again via a niche SaaS product with embedded AI, and again through API usage powering internal workflows. Microsoft's own pricing structure shows that Microsoft 365 Copilot generally requires a qualifying Microsoft 365 plan as well as the Copilot licence itself. Anthropic offers Free, Pro, Max, Team and Enterprise plans plus API pricing. OpenAI prices by model and token usage, while Google's Gemini API has separate free and paid tiers and model-specific rates. This is not a simple one-provider future. It is a layered spend model.

That matters because the total cost of AI in a business can rise even if some individual model prices fall. If AI starts appearing in more tools, more departments and more workflows, the overall bill can grow long before any one provider announces a dramatic price rise.
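To make that concrete, here is a minimal sketch of subscription stacking. Every price and tool name below is hypothetical, invented purely for illustration: each individual tool gets cheaper over time, yet the total monthly bill roughly doubles as AI spreads into more workflows.

```python
# Illustrative only: hypothetical per-seat prices, not real vendor pricing.
# Year 1: one general-purpose AI subscription.
# Year 3: the same business, with AI embedded across more tools,
# even though the per-unit price of each tool has fallen.

seats = 10

year1_tools = {"general_assistant": 30.0}  # GBP per seat per month
year3_tools = {
    "general_assistant": 24.0,   # 20% cheaper than in year 1
    "office_copilot":    19.0,   # AI add-on inside the office suite
    "support_ai_addon":  12.0,   # embedded AI in a niche SaaS product
    "api_workflows":      8.0,   # internal automation, billed via API
}

def monthly_bill(tools, seats):
    """Total monthly spend across all stacked AI subscriptions."""
    return seats * sum(tools.values())

print(monthly_bill(year1_tools, seats))  # 300.0
print(monthly_bill(year3_tools, seats))  # 630.0
```

The unit price of the original assistant has fallen, but the stacked total has more than doubled, which is exactly the layered spend pattern described above.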

The AI bubble probably bursts, but that does not help customers much

It is very possible that the current AI boom goes through a correction. In fact, it would be odd if it did not.

Reuters reported that the planned AI spending surge was creating investor unease about profitability and the risk posed to software and data firms. More than half of respondents in a Bank of America survey still thought AI was a bubble, and a large share viewed it as the biggest tail risk going into 2026. More recently, Reuters reported that Nvidia and OpenAI had moved away from discussion of an even larger investment structure, partly amid concerns over the sector's stability.

But a burst bubble is not necessarily good news for customers. More often, a correction leads to consolidation. Some startups disappear. Some specialist tools are acquired or shut down. Some firms decide they are not going to win the race for the top-tier model and stop trying. The result is fewer serious competitors, not a happy era of permanently cheap AI. And fewer serious competitors usually means more pricing power for the survivors.

Lock-in is real, and AI lock-in may be worse than cloud lock-in

Businesses already understand vendor lock-in in the cloud and software world. AI may make it worse.

Switching AI providers is not always a clean migration. It can involve rewriting prompts, changing APIs, retesting output quality, retraining staff, revising governance and reworking workflows that have gradually adapted to one provider's strengths and quirks. BCG's 2025 work on digital platform lock-in found that firms still cite major concern around technical migration, productivity loss and downtime when switching platforms. That general finding translates uncomfortably well into AI, where the provider is often embedded much closer to the work itself.

The practical point is simple: signing up is easy, leaving is harder. Once businesses build internal habits and processes around a particular model family, price increases become easier for the provider to impose because the customer is no longer buying a simple commodity. They are paying to avoid disruption.

Token efficiency is a real downward force, but only if competition forces savings through

There is a fair counterargument here.

Model efficiency is improving. Smaller models are becoming more capable. Official pricing pages from OpenAI, Anthropic and Google all show cheaper model tiers or lower-cost options for lighter workloads. Anthropic also said in February 2026 that pricing for Claude Opus 4.6 would remain the same as the previous rate even as the model improved, while Google's Gemini pages explicitly present "best price-performance" options for lower-latency, higher-volume tasks. So yes, there are genuine downward pressures in parts of the market.

But lower technical cost does not automatically become lower customer cost.

That only happens if competition remains strong enough to force providers to pass savings through. If the market narrows, if switching remains painful, and if businesses grow dependent on a handful of providers, efficiency gains can simply become margin. Cloud got more efficient. SaaS got more efficient. Yet that did not stop total business software bills from rising over time. AI may follow the same pattern: cheaper units, higher total spend.

The hardware and energy reality makes permanent generosity unlikely

There is also a harder physical constraint beneath all this.

AI is expensive because the infrastructure is expensive. It depends on advanced chips, memory, networking, data centre build-out and large amounts of power. McKinsey has noted that data centre power demand is rising sharply and that it is already difficult to add 100, 200, 500 megawatts or more at a single site because there simply are not that many places where that power can be attached to the grid quickly. McKinsey has also estimated that data-centre power demand could reach 1,400 terawatt-hours by 2030, or around 4% of total global power demand. Reuters' recent coverage of Broadcom and HPE adds to the same picture by highlighting continuing infrastructure spending and persistent cost pressures.
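As a rough sanity check, the two McKinsey figures quoted above are mutually consistent if global power demand reaches around 35,000 TWh by 2030. That global figure is an assumption made here for the arithmetic, not a number from the article:

```python
# Back-of-the-envelope check on the McKinsey figures quoted above.
# Assumption (not from the article): global power demand of roughly
# 35,000 TWh by 2030.

data_centre_twh = 1_400   # McKinsey's estimate for data centres by 2030
global_twh = 35_000       # assumed global demand for the check

share = data_centre_twh / global_twh
print(f"{share:.1%}")  # 4.0%
```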

That matters because some of the more optimistic AI pricing narratives quietly assume that software economics alone will drive costs down. But AI is not just software. It is software tied to a highly capital-intensive, energy-hungry infrastructure base. That makes a permanently cheap frontier market much less plausible.

The likely end state is that AI captures part of the productivity gain

This may be the most important conclusion for small businesses.

AI will probably improve productivity in a lot of firms. But that does not mean those firms get to keep the whole benefit. A meaningful share of the gain may be captured by the providers through subscriptions, usage billing, premium tiers, agent pricing and embedded add-ons across existing software estates.

"AI may become less like a one-off productivity tool and more like a tax on productivity. Not because it is useless. Because it is useful enough to become unavoidable."

That is not unusual. It is how platform economics often works. The provider helps increase output, then captures part of the value through pricing once the customer becomes dependent enough that the service is hard to remove. Microsoft's repeated ability to repackage value and raise commercial pricing is a useful reminder of how mature platform power behaves. Salesforce's pricing changes tell a similar story. AI may simply become the next layer where productivity gains are partially converted into recurring rent payable to someone else.

What this means for small businesses

Small businesses are in an awkward position. They are being asked to spend during the least mature phase of the market, when the technology still needs proving, the workflows still need redesigning and the failure modes are still being discovered. Then, if the tools do prove useful and become embedded, those same businesses may face a later phase where the field has narrowed, lock-in has deepened and prices have more room to rise.

That is the double hit. First, SMEs pay to experiment. Then they may pay more to stay dependent.

This does not mean businesses should ignore AI. It does mean they should stop telling themselves a comforting story that AI is naturally heading toward abundant, cheap, interchangeable utility pricing. Some parts of it may get cheaper. But the AI that businesses genuinely rely on is more likely to sit inside a market shaped by concentration, lock-in, layered subscriptions and the very normal habit of technology providers raising prices once dependence has formed.

The real question is not whether some AI tools will get cheaper.

They will.

The real question is whether the AI that businesses come to depend on will stay cheap once the race settles. History suggests it will not.

Richard Lowe — Founder of Small World Solutions, helping UK SMEs navigate IT infrastructure, security and AI adoption.

If you want to discuss AI dependency, lock-in, or how to evaluate AI without sleepwalking into permanent cost exposure, use the contact page.

โ† Back to Ramblings Get in Touch

References

  1. Anthropic (2026) Claude pricing. Available at: Claude pricing pages and API pricing documentation.
  2. Anthropic (2026) Introducing Claude Opus 4.6. Available at: Anthropic News.
  3. Boston Consulting Group (2025) Managing the Evolving Dynamics of Digital Platform Lock-In. Available at: BCG.
  4. Google (2026) Gemini Developer API pricing and billing. Available at: Google AI for Developers.
  5. McKinsey & Company (2025) Data centers: The race to power AI and Building data centers bigger, faster. Available at: McKinsey.
  6. Microsoft (2025) Advancing Microsoft 365: New capabilities and pricing update and related licensing notices. Available at: Microsoft commercial licensing and Partner Center pages.
  7. Microsoft (2026) Microsoft 365 Copilot pricing and licensing. Available at: Microsoft pricing and licensing pages.
  8. OpenAI (2026) API pricing and pricing documentation. Available at: OpenAI / OpenAI Developers.
  9. Reuters (2025) Global funds fear AI investment indigestion. Available at: Reuters, 19 November 2025.
  10. Reuters (2026) Big Tech to invest about $650 billion in AI in 2026, Bridgewater says. Available at: Reuters, 23 February 2026.
  11. Reuters (2026) Broadcom rises as $100 billion AI forecast signals gains in Nvidia-dominated market and related coverage. Available at: Reuters, 4–5 March 2026.
  12. Reuters (2026) HPE projects revenue above estimates, focuses on higher-margin networking orders. Available at: Reuters, 9 March 2026.
  13. Reuters (2026) Nvidia CEO hints at end of investments in OpenAI, Anthropic. Available at: Reuters, 4 March 2026.
  14. Reuters (2026) AI fears temper interest as private equity firms weigh data company deals. Available at: Reuters, 5 March 2026.
  15. Salesforce (2025) Updated Product Packaging and Pricing Offer New AI Capabilities and More Value. Available at: Salesforce, 17 June 2025.