THE ONE THING YOU NEED TO KNOW ABOUT DATA THIS WEEK is what happened last week at Nvidia’s annual developers’ conference. With 16,000 attendees packed into purple-lit stadium seats, 300,000 watching virtually, a fifty-foot projection screen, and an opening act, this was more like a rock concert or the convening of a megachurch than a user conference for a frickin chip manufacturer. People are already calling it “AI Woodstock.”
Remember that Nvidia makes GPUs, the processing units at the heart of accelerated computing and AI. A year ago you’d never heard of them. Now their market cap is $2.35 trillion--bigger than Amazon.
And it did feel like some kind of turning point.
Here’s what stands out.
1. You think of GPUs as hardware. Nvidia is making them into systems.
I think of silicon chips like toys. Like if I placed a big order of GPUs from Nvidia, a truck would show up at my house and tip a zillion tiny plastic chips into my driveway, going clickety-click like Legos as they skitter on the asphalt.
Obviously this is wrong in many ways. But the most essential way it's wrong is that those chips aren't standalone, like something you could just drop off. They need software.
Code instructs the chips. Nvidia has been offering its CUDA software for years--a kind of operating system for its GPU customers. CUDA links the program running on your CPU… to the GPU that Nvidia has just sold you.
And not just to that GPU, but to any GPU the software can link to for parallel processing. Firing all those GPUs at once is what makes the computing fast and “accelerated.”
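If you want to see what that looks like in practice, here's a minimal sketch, in Python with the numba library standing in for the full CUDA toolchain (the kernel and the numbers are illustrative, and you'd need a CUDA-capable GPU to run it). The program on the CPU defines a tiny function; CUDA fans it out across a million GPU threads at once:

    # A toy sketch of the CUDA model, via Python's numba library.
    # The CPU-side program defines a kernel; CUDA runs it across
    # thousands of GPU threads in parallel. Names/sizes are illustrative.
    import numpy as np
    from numba import cuda

    @cuda.jit
    def add_vectors(a, b, out):
        i = cuda.grid(1)          # this thread's global index
        if i < out.size:          # guard against running past the end
            out[i] = a[i] + b[i]  # each thread handles one element

    n = 1_000_000
    a = np.random.rand(n).astype(np.float32)
    b = np.random.rand(n).astype(np.float32)
    out = np.zeros(n, dtype=np.float32)

    threads_per_block = 256
    blocks = (n + threads_per_block - 1) // threads_per_block
    add_vectors[blocks, threads_per_block](a, b, out)  # CPU launches, GPU computes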
Sure, you can create your own software to make use of the GPUs. But why bother? Nvidia’s is probably better than anything you could make, and it’s tailor-made for the hardware.
And once you’re hooked on the software… now Nvidia can push any number of services to you.
Hardware + software + services = systems.
With CUDA as the core, an ecosystem of back-end integrations is growing up around Nvidia like Bay Area kudzu. “The same resource used for accelerated computing will be used for AI,” CEO Jensen Huang says in his shiny biker jacket, signature gray coif, and dork-dad delivery.
It's a giant statement. He's basically saying, I can make your computing faster, cheaper, and more energy efficient. And while I'm there, I'll open up the AI heavens to you.
Remember "Clapton is God"? Well, Jensen Huang is the new Tech Jesus.
2. This will accelerate AI adoption… the way cloud computing spread the use of big data.
Now that accelerated computing and AI are available as a service, anybody can use them. In the keynote, Jensen announced something called "microservices."
Micro-wha?
These are services that, they claim, ship virtual containers holding "pre-trained models"… that live on your platform… connect you to the LLM of your choice… and use your own data to generate… well, whatever you want.
In other words, now that services are attached to the hardware, AI computing can be dialed up (heavy, custom) or down (light, off-the-shelf) to match the user's level of savvy and investment.
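To give a feel for the "light, off-the-shelf" end, here's a hypothetical sketch of calling one of those containerized model services from your own code. The endpoint, model name, and payload shape here are my assumptions for illustration--an OpenAI-style chat API running on your own box--not a documented Nvidia API:

    # Hypothetical call to a containerized model service running locally.
    # Endpoint, model name, and payload shape are assumed for illustration.
    import requests

    resp = requests.post(
        "http://localhost:8000/v1/chat/completions",  # assumed local endpoint
        json={
            "model": "example-llm",                   # hypothetical model name
            "messages": [
                {"role": "user",
                 "content": "Summarize this week's warehouse sensor data."}
            ],
        },
        timeout=30,
    )
    print(resp.json()["choices"][0]["message"]["content"])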
A flurry of press releases over the last few weeks illustrates this.
Health company J&J makes Band-Aids and Tylenol… they are no tech company… but now they're partnering with Nvidia for surgical analytics.
Mercedes makes rich-grandma sedans with luxury leather seats… but now they're doing self-driving cars with Nvidia.
The Department of Energy's Office of Science is working with Nvidia tools to advance its mission to create "integrated models of climate, socioeconomic, infrastructural, and other human systems."
Which means you don’t have to be Big Tech to boost your business with AI. Now, you can use an entry-level Nvidia product.
It's like the birth of cloud computing in the late 2000s. Then, miraculous new tech (data storage and access) was democratized for cheap. You didn't need to fund a Tier 3 data center to use your big data. You just needed an AWS account and enough cash to keep subscribing.
That pushed the capabilities to organizations of every size.
3. What does this mean for the rest of us?
A few things to think about, in the hangover of AI Woodstock:
The systems model will pump cash, profit, and market power into Nvidia, just like AWS did for Amazon. It's all about those network effects. Razors and razor blades. Smartphones and app stores. Already, biggies like Google Cloud, Intel, and Qualcomm, among others, are spooked; they've announced the formation of the Unified Acceleration Foundation (UXL) to create an open-source alternative to Nvidia's CUDA.
On the positive side, look for innovation, anywhere--come on, nonprofits! B-Corps! Climate groups!--to accelerate on the back of AI infrastructure-for-rent.
Look for tech-soft sectors (like consumer goods) to get savvier, as the cleverer companies pull away from the pack based on how fast they embrace the new tools.
There will be a pull-forward effect on big data, too. Look for companies that in the past considered, but passed on, harnessing the power of big data--their own, or someone else's--to suddenly get motivated to do so. Why? Use cases for big data can be fuzzy and hard to identify. Democratized AI will make those use cases commonplace and, hence, better understood. Plus, AI-for-all will make the business case, too. If the timeline to make money from data is now shorter… and the investment lower (it's just a rental)… then more players in more sectors will start gathering their data and putting it to work, advancing their missions with AI.
If this is Woodstock, should we be having a generational moment? Sure, that smell in the air is profits, not pot; it's technobabble you hear, not Carlos Santana shredding. Yet it's hard not to look around you, checking if everyone feels the same thing you do. That there is genuine wonder in the math and the machines.
Plenty of time, later, for the consequences.