
Trickle-Down AI: Big Bets, Small Businesses and SMB AI Adoption

  • Writer: Rudy Tossel
  • Jul 10
  • 13 min read
[Chart: "The trickle-down experiment has failed" – income growth of all tax units vs. the bottom 90% across business cycles. Source: Center for American Progress]

In the 1980s, we were sold "trickle-down economics" – feed the rich, and the crumbs will feed everyone else. Has that gotten us anywhere? 


Short answer: no. Long answer: not really.


Today, Big Tech and Wall Street titans hawk a similar narrative with artificial intelligence: they pour billions into Godzilla-sized AI models and infrastructure, promising that the benefits will eventually cascade to the little guys on Main Street. We are now in the age of Trickle-Down AI, where $100M+ bets on large language models (LLMs) and AI platforms made in boardrooms are supposed to empower the bakery, the 5-person SaaS startup, and the boutique marketing agency. Will these colossal investments uplift small and medium businesses – or absolutely crush them? Let's separate the hype from the reality and try to find out.


The $100M AI Arms Race at the Top


In the last two years, an arms race has unfolded among the world's wealthiest companies to build and own AI. Microsoft dropped a $10 billion investment into OpenAI to secure an early lead in generative AI, integrating GPT-4 into everything from Bing to Office. Google, Amazon, Meta, and Salesforce are likewise investing heavily to develop proprietary AI models and cloud services. If money is power, then AI is the new grid, and the Fortune 500 are scrambling to control the power plants.


Take Bloomberg LP, the financial data giant that quietly built BloombergGPT, a 50-billion-parameter, domain-specific LLM. Trained on decades of proprietary financial data, BloombergGPT is tuned for finance – answering questions, analyzing news, and even drafting reports for users of the Bloomberg Terminal. Bloomberg claims the model outperforms general-purpose AI on finance tasks by “significant margins”. In other words, they've created an AI brain with an MBA and a Wall Street address. Why? To keep their edge and lock customers into their ecosystem with AI-driven insights that no startup can easily replicate. As an eMarketer analysis put it, Bloomberg's exclusive data and market dominance give it a trust and performance advantage rival AI upstarts can't match. This is AI as a moat: spend a fortune to build a model only you can afford, and reap the competitive advantage.


In banking, JPMorgan Chase – hardly known for tech agility – has gone all-in on AI. The bank advertised over 3,600 AI-related jobs in just three months and filed a trademark for something called IndexGPT, an AI to help pick securities for investment portfolios. In plain English: Jamie Dimon wants a ChatGPT for stocks. While still in development, IndexGPT hints at a future where algorithms assist (or replace) financial advisors. Not to be outdone, Goldman Sachs and Morgan Stanley have their own in-house LLM pilots – Goldman using generative AI to write code, Morgan Stanley using it to assist financial advisors with research. Even BlackRock’s CEO is touting AI as the key to 30% productivity gains. The message is clear: enterprise AI is now table stakes in finance, and those who fail to ante up risk being left in the dust. 


Big Tech is, of course, both enabler and competitor in this race. Amazon’s AWS has rolled out Bedrock and custom chips for AI, vying to host the world’s models in its cloud. Google, after a late start, released its PaLM 2 and Gemini models and integrated AI copilots into everything from Gmail to Cloud. IBM launched Watsonx, aiming to sell foundation models and tooling to enterprise clients. These initiatives often carry nine-figure R&D budgets. The idea is to create full-stack AI ecosystems – from silicon to software – that lock in corporate customers (and their developers) for years to come. As Microsoft’s CEO Satya Nadella noted, having an integrated tech stack means every dollar invested in AI infrastructure can be leveraged across their cloud, software, and AI services – “all one tech stack... that advantage trickles down... to third parties”. Trickle-down, indeed. 


Proprietary Models - Moats and Micro-Effects


These proprietary AI models are the new status symbols of corporate might. But beyond bragging rights, they serve strategic purposes. They can act as moats, keeping customers tied to a platform. BloombergGPT, for example, is baked into Bloomberg's products – a private feature for subscribers that no external AI can fully mimic. In effect, it's a defensive wall against competitors and an offensive tool to upsell clients with AI capabilities. Customers (including many financial SMBs like boutique investment firms) who rely on Bloomberg’s data will now also rely on Bloomberg’s AI inside the terminal for summaries, sentiment analysis, and even drafting regulatory filings. A small fintech analytics vendor that once sold add-on tools for these tasks might suddenly find the ground cut from under it – Bloomberg’s native AI does it all. The trickle-down to that SMB vendor feels more like a shakedown: adapt or become irrelevant.


JPMorgan’s efforts show another path: using AI internally to streamline operations and then productizing that expertise. The bank’s new LLM Suite – a proprietary platform launched in 2024 – onboarded 200,000 employees in 8 months to leverage generative AI for everything from answering policy questions to generating code. Productivity of tech teams jumped ~10–20% with AI-assisted coding. This massive internal adoption not only saves JPMorgan time and money, it also lays the groundwork for future products. If IndexGPT (or similar offerings around the finance industry) matures, JPMorgan could offer AI-driven investment products to clients, or license tools to smaller banks. Imagine being a regional bank or fintech startup: do you try to build your own AI from scratch, or do you piggyback on JPMorgan’s model (and pay for the privilege)? Either way, the big bank’s $100M+ investment reshuffles the playing field.


There are plenty more examples of corporate LLMs: insurance companies developing AI for claims and risk analysis, healthcare networks training models on medical records to assist diagnostics, retail giants using AI to optimize supply chains and power customer service chatbots. Not all of these are publicized with catchy names, but the trend is unmistakable. These bespoke models and AI platforms are initially built for enterprise scale, yet they inevitably create ripple effects downstream – in supply chains, software ecosystems, and labor markets that involve smaller businesses.


The Trickle-Down - Tools for the Rest of Us?


So how exactly does an SMB feel the impact of these big-money AI investments? Ideally, through new tools and capabilities that they could not have built on their own. We’ve seen a flood of AI APIs and platforms become available, many spun out of big-company efforts. OpenAI’s GPT-4 (bankrolled by Microsoft) opened its API to developers large and small, unleashing a Cambrian explosion of AI-powered startups and features. In theory, a five-person startup or a local marketing firm can now rent world-class AI by the hour, via API call, instead of needing a research lab. This is the promise of trickle-down AI: expensive innovations at the top become affordable utilities at the bottom. 
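
To make "renting world-class AI by the API call" concrete, here is a minimal sketch using OpenAI's Python SDK. The model name and prompts are placeholders, not a recommendation – any hosted provider's SDK follows roughly the same shape.

```python
# A minimal sketch of "renting" a hosted model by the API call.
# Assumes the OpenAI Python SDK (pip install openai) and an
# OPENAI_API_KEY environment variable; the model name and the
# prompts are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # whichever hosted model fits your budget
    messages=[
        {"role": "system", "content": "You are a copywriter for a small local bakery."},
        {"role": "user", "content": "Write a two-sentence Instagram caption for our new sourdough loaf."},
    ],
)

print(response.choices[0].message.content)
```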


And indeed, small business adoption of AI has surged in the past year. A Verizon survey found that the number of SMBs using AI more than doubled in one year – from just 14% in 2023 to 39% in 2024. In large part, this jump is due to the growing accessibility of AI tools and the hype wave triggered by ChatGPT’s debut. Suddenly, every business owner knew about AI and had a user-friendly entry point to try it. Whether it’s using ChatGPT to draft marketing copy, or tapping an AI image generator for product photos, the barrier to experimentation dropped dramatically. Trickle-down AI in action: a massive research investment created a viral consumer tool, which in turn became a business tool for millions of hustlers and entrepreneurs overnight.


SMB AI adoption surged from 2023 to 2024. Nearly two in five SMBs now use AI, up from just 14% the year before – and for those of us without multiple math or engineering degrees to make sense of that number, that’s well over double the 2023 figure. Many are leveraging generative AI for marketing content, customer communication, and data analysis without having to build any AI models from scratch.


Enterprise investments have also spawned an entire ecosystem of AI infrastructure now at everyone’s fingertips. Need to search and summarize your company documents with AI? Once only big firms could do that, but now any dev can plug into OpenAI or Anthropic’s large models via cloud APIs. Fancy vector databases (for AI memory and semantic search) were niche tech a couple of years ago; now startups like Pinecone have raised over $100M to offer vector DBs as a service. This means even a small app company can afford sophisticated semantic search by renting infrastructure that big tech helped pioneer. Talk about coming full circle.

As one venture capitalist put it, vector databases have become “critical infrastructure for Generative AI”, attracting major funding and quickly being productized for broad use. The same goes for developer libraries like LangChain, or platforms like Hugging Face: they take cutting-edge AI building blocks and make them accessible to the masses.
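
For the curious, here's a rough sketch of that kind of semantic search using the open-source sentence-transformers library. The documents and model name are illustrative assumptions; a real deployment would swap the in-memory list for a hosted vector database like the ones mentioned above.

```python
# A toy version of "semantic search over your company documents," assuming
# the open-source sentence-transformers library (pip install sentence-transformers).
# The documents are made up, and a production setup would keep the vectors in a
# managed vector database (Pinecone or similar) rather than in memory.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small, free embedding model

documents = [
    "Refund policy: customers may return items within 30 days of purchase.",
    "Shipping: orders over $50 ship free within the continental US.",
    "Support hours: Monday through Friday, 9am to 5pm Eastern.",
]
doc_embeddings = model.encode(documents, convert_to_tensor=True)

query = "How long do I have to send something back?"
query_embedding = model.encode(query, convert_to_tensor=True)

# Cosine similarity between the query and every document; highest score wins.
scores = util.cos_sim(query_embedding, doc_embeddings)[0]
print(documents[int(scores.argmax())])  # -> the refund policy line
```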


Even the open-source community’s gains are a form of trickle-down AI. Consider Meta’s LLaMA series of language models. Meta spent a fortune training these models (LLaMA 2’s largest version has 70 billion parameters) and then did something radical: open-sourced them.


The result? An explosion of adoption – over 350 million downloads of LLaMA models on Hugging Face, a 10x increase in a year. An “exploding number of companies large and small, startups, governments, and non-profits” are building products on these open models, noted Meta’s chief AI scientist. Startups have fine-tuned LLaMA into domain-specific tools – from FinGPT for finance to biotech and coding assistants. This is trickle-down by design: a Big Tech company essentially gave away the golden goose (albeit an earlier generation one) to empower developers everywhere and to undermine closed competitors. Small businesses benefit by getting powerful AI engines for free or cheap, avoiding dependency on a single vendor. It's a savvy move by Meta to seed an ecosystem that challenges the proprietary approach of OpenAI.
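
As a rough illustration of what that looks like on a developer's machine, the sketch below loads an open-weight model through Hugging Face's transformers pipeline. The model ID is an assumption – Meta's Llama weights require accepting a license on Hugging Face, and the larger variants need a capable GPU.

```python
# A sketch of what "powerful AI engines for free or cheap" looks like in
# practice: loading an open-weight chat model with Hugging Face's transformers
# pipeline. The model ID is an example, not a requirement.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-2-7b-chat-hf",  # swap in any open model you can run
)

prompt = "List three ways a five-person agency could use a local language model."
output = generator(prompt, max_new_tokens=150, do_sample=False)
print(output[0]["generated_text"])
```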


On the software side, enterprise AI adoption is seeping into the SaaS products that SMBs already use. Microsoft Copilot (powered by OpenAI’s models, for the unaware) can draft emails, create PowerPoints, and analyze Excel data – features soon available to any small business with an Office subscription.


Adobe’s Firefly generative AI now helps even a solo freelance designer create stunning visuals in Photoshop with a few clicks. 


Salesforce is baking Einstein GPT into its CRM tools, meaning a small sales team can get AI-generated insights and auto-filled data without hiring data scientists. 


Trickle-down AI is often indirect: you might not build an AI yourself, but the platforms and software you rely on are suddenly supercharged with AI features, thanks to big companies’ investments. The productivity playing field is leveled – at least a little.


New Dependencies - Cloud Empires and AI Toll Roads 


[Video still: a yellow quadruped and two humanoid robots dancing in a spacious lab. Source: Boston Dynamics/YouTube]

Before we sing kumbaya about democratized AI, let's talk about the fine print. When SMBs tap these powerful AI tools, they often hitch themselves to the infrastructure and rules of the big players. An entrepreneur building an AI-powered app in 2024 is very likely using someone else’s model via API, someone else’s cloud, and maybe storing data in someone else’s specialized database. The new tech stack for “AI-powered” businesses is heavily leveraged on external services. This creates both leverage and risk.


On one hand, it's fantastic that a small firm can rent a world-class model like GPT-4 or Claude and not need a PhD research team. On the other hand, these small firms now have a strategic dependency on the AI provider. If OpenAI (or Azure, or AWS, or Gemini – insert your model or provider of choice) raises prices, suffers an outage, or changes its terms of service, the downstream startup is in trouble. We've already seen hints of this: earlier this year, OpenAI’s API had a few high-profile outages, leaving applications (from one-person Chrome plugins to VC-backed startups) non-functional until service was restored. For SMBs relying on third-party AI, downtime and price hikes are very real business risks – ones largely outside their control. The AI giants become gatekeepers and toll collectors.
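
There are ways to soften that dependency. Below is a hedged sketch (assuming the official OpenAI and Anthropic Python SDKs, with placeholder model names) of a simple fallback pattern: try the primary provider, and route to a second one if the call fails.

```python
# A hedged sketch of one way to soften single-vendor risk: call the primary
# provider, and fall back to a second if the request fails. Model names are
# placeholders; production code would catch specific errors, add retries,
# and log what happened.
from openai import OpenAI
from anthropic import Anthropic

openai_client = OpenAI()        # needs OPENAI_API_KEY
anthropic_client = Anthropic()  # needs ANTHROPIC_API_KEY

def generate(prompt: str) -> str:
    try:
        # Primary provider.
        resp = openai_client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content
    except Exception:
        # Fallback provider if the primary is down, rate-limiting, or misbehaving.
        resp = anthropic_client.messages.create(
            model="claude-3-haiku-20240307",
            max_tokens=500,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.content[0].text

print(generate("Draft a one-paragraph status update for a delayed shipment."))
```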

Cloud concentration exacerbates this asymmetry. The GPU compute required to run advanced AI models is so scarce and expensive that most small players have no choice but to go through cloud providers (who themselves are vying for GPU capacity). And so the cycle continues.


The past year saw shortages of Nvidia AI chips so severe that startups were scrambling for computing power, sometimes resorting to outrageous workarounds. Amazon Web Services even admitted the surging demand caught them off guard, leading to capacity shortfalls. One startup founder quipped that he wondered if buying Nvidia stock would yield more profit than running his AI business, given the sky-high costs of renting GPUs. In short, the big companies and well-funded AI labs are grabbing the lion’s share of AI hardware, and smaller companies pay a premium for the leftovers. Trickle-down? More like fighting for crumbs. Global spending on AI-focused chips hit $53 billion in 2023 and is set to double in four years, as the arms race continues. If you’re an SMB trying to train or host your own model, good luck competing with the cloud budgets of an AWS or the sovereign wealth funds backing big AI labs. 


Another emerging dependency is on data and knowledge pipelines. Many of the enterprise-built AI models come with proprietary data behind them. If an SMB in finance wants the insights of BloombergGPT, they effectively need to be in Bloomberg’s ecosystem (e.g. using the Terminal or associated services). The alternative – curating and training on all that data yourself – is prohibitively costly. Similarly, consider Amazon: it has built AI recommendation systems fine-tuned on its massive retail data. A small ecommerce player might improve their sales by using Amazon’s fulfillment and platform – but then they live under Amazon’s rules (and shadow). If Amazon deploys an AI-driven dynamic pricing or search ranking update, the SMB seller has no choice but to adapt, with little say in the matter.


SMB AI Adoption: Adaptation vs Extinction


Despite these asymmetries, many small and mid-size businesses are proving remarkably adept at adapting in the age of AI. Adoption is no longer optional – it's becoming essential for survival, and savvy SMBs know it. A U.S. Chamber of Commerce report noted that generative AI use among small businesses nearly doubled in a year, to ~40%, and the most common uses were marketing, content creation, and customer engagement. This aligns with what we see on the ground: a local restaurant using ChatGPT to write snappy Facebook posts, an Etsy seller using Midjourney to create product images, a 10-person software company using GitHub Copilot to speed up coding, or even a marketing business using all three of these avenues as a catalyst for its entire business model. These incremental enhancements can be the difference between a small business growing or stagnating.

Moreover, early evidence suggests SMBs that embrace AI are reaping tangible benefits. In one survey, small businesses using generative AI reported significantly easier hiring and stronger employee performance. Offloading drudge work to algorithms can free up human employees for higher-value tasks, improving morale and output. A quarter of small businesses in 2024 said they are already benefiting from GenAI in their operations. Those benefits include cost savings (automation of manual tasks), faster product development cycles, and even new revenue streams from AI-augmented services. For example, a boutique consulting firm that adopts an AI research assistant might handle more clients or offer new analysis reports powered by LLMs, punching above its weight.


Some SMBs are also finding niches alongside the giants, rather than being steamrolled by them. The big boys build general-purpose platforms; nimble startups and specialist firms can build vertical or tailored solutions on top. A large enterprise might buy a generic AI document analysis tool, but a smaller company can customize an open-source model for, say, analyzing legal contracts in a specific jurisdiction – and do it better for that narrow use-case than a one-size-fits-all model. In this way, competition isn’t dead. It's shifting: small players either integrate the new AI capabilities or focus on areas the giants overlook. The open-source LLM movement provides a lifeline here – it gives smaller companies the raw clay to mold their own AI, avoiding perpetual dependence on Big Tech. As Meta’s LLaMA case shows, with open models in hand, even startups can create products that outperform closed models on niche metrics – at a price, and very likely not at scale, but the tooling is there. In other words, SMBs can fight fire with fire: use the outputs of one giant (Meta) to avoid lock-in with another (OpenAI/Microsoft).
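
To give a flavor of what that customization path looks like, here's a hedged sketch of attaching LoRA adapters to an open-weight model using the transformers and peft libraries. The model ID, target modules, and hyperparameters are assumptions for a Llama-style architecture, not a recipe.

```python
# A sketch of how a small team might start customizing an open-weight model for
# a narrow vertical (say, contracts in one jurisdiction) with parameter-efficient
# fine-tuning. Assumes the transformers and peft libraries; a real project would
# add training data and a training loop (e.g. the Trainer API).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_id = "meta-llama/Llama-2-7b-hf"  # any open causal LM you can legally use
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

# LoRA trains a few million adapter weights instead of all 7B base parameters,
# which is what makes this affordable on an SMB budget.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections in Llama-style models
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of the base model
```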


However, there's a flip side: not all SMBs are keeping up. A significant portion still haven’t touched AI at all. Some 43% of small businesses surveyed in 2024 said they have never even considered using generative AI. Many lack the knowledge or trust in these tools, or assume (wrongly) that it's only for the big players. There's a real risk of a digital divide opening up, where early-adopting SMBs thrive and tech-shy ones fall further behind. If AI does boost productivity by double digits (as big companies like BlackRock predict), then over just a few years the gap in efficiency and innovation can become an unbridgeable chasm. In blunt terms: the mom-and-pop shops and laggard firms might find themselves outgunned in everything from customer service to marketing, unable to match the AI-augmented prowess of their peers or larger rivals. Trickle-down AI won't save those who don't grab at it.


Conclusion - No Malice, Just Adaptation


The market isn't personal – it's just business, and the rise of enterprise AI is exactly that. Big companies are doing what big companies do: investing heavily to secure their future and maximize profits, with no mercy or nostalgia for whoever gets disrupted. Someone like Kim Scott, whom I mention exhaustively in my content, might add that the most candid advice to small and medium businesses is to embrace reality and be direct about the stakes. AI is here, it's uneven, and nobody is going to hand you success out of charity. The playing field isn’t fair – but it never was.


The good news? Unlike some past technological shifts, AI truly does have democratizing potential if seized. A savvy small business can plug into the greatest innovations of our time for a fraction of the cost it took to create them. That's astonishing. The trickle-down can be real – if you position your bucket correctly. SMBs that educate themselves, leverage the new tools, and maintain flexibility can punch far above their weight. We’re seeing it in the data: nearly 40% of SMBs now using AI, and many reporting tangible wins in productivity and growth. The barrier to entry for implementing AI has fallen dramatically; what remains is the will to jump in and the wisdom to do so strategically.


In the final analysis, trickle-down AI is not a guarantee – it's an opportunity. The large enterprises have lit the AI bonfire with their $100M matches. It's up to the small businesses to come and light their candles from it. Some will get burned – by dependency, by competition, or by missteps – but many will find new light to work by. The pace of adoption will only accelerate, and the downstream effects will permeate every industry from finance to farming.


As we stand in this pivotal moment, the message to SMBs is clear: adapt, adopt, and carve your niche with the new AI tools, or risk getting left behind in the shadows of the giants. In a market with no mercy and no malice, fortune favors the bold.


 


If you’re one of the 43% of SMBs who still haven’t integrated AI into at least some part of their workflow, let’s chat: https://www.piedmontsoftware.com/book-online


And if you already are? Then there’s a lot that we can talk about. ☝️


Follow us on LinkedIn to join the conversation: https://www.linkedin.com/company/piedmont-software-llc

 
 
 
