Scale, Not LLMs, Wins Big in A.I.
That which cannot scale becomes a stranded asset - a boat without water
LLMs Are About Scale – And Data Centers Do Not Scale – Period.
Data centers are going to become stranded assets - boats in a desert - because software efficiency is finally catching up with chip design.
A.I., if you believe the tech press, is all about who has the largest LLM – large language model. A lot of “larges” in that concept.
To them, “mine is bigger than yours” matters.
Elon can LLM all your Twitter posts. Meta can do it with Facebook, and the DeepSeek guys in China can do it with smaller data centers.
The A.I. industry has defined – in one single word – who wins.
The winner is the one who can scale the most.
Now comes the problem.
Scalability is hardware-, energy- and farmland-constrained.
He who gathers the most farmland the fastest – before neighbors with pitchforks drive zoning commissions to stop the madness - wins.
Maybe, but perhaps it is he who can build the biggest data center the fastest – maybe it’s that guy, or maybe it’s someone who can repurpose a current data center to run LLMs.
Well, maybe not.
It might be the person who locks up the most energy – maybe even getting the local community to pop for a nuclear power plant – how would you like one of those 8 miles from your house?
Eight miles, nuclear – nice choice of words.
Early A.I. devotees said the A.I. winner would be the one with the most data – but that doesn’t work in a world where the internet and social media made data infinite – and easily obtainable by anyone.
Scale chooses the winner, and the winner is whoever can scale better than the other guy.
Is scaling incremental?
Can Google put more data centers – farmland – energy to work faster than the competitor?
If it can, how much more of these resources buys permanent market dominance?
How much buys dominance for a year or two?
When scale is defined by throwing ten-figure capital at a problem – yearly – it’s a game few can play and none can win.
Nobody ever wins, everyone spends billions for tiny, incremental advantages.
A.I. scale is no longer incremental.
You heard it first, here at The Sustainable Computing Initiative – yes, it’s us, the Fractal guys.
If your technology needs data centers to scale and data centers are woefully constrained by land, power and zoning to grow – you are pretty much screwed at some point.
Not Fractal, as we are showing now with the ability to run the world’s largest, or smallest, or most entertaining LLMs in a fully distributed manner – without a data center, without high energy bills, and without a peep from the zoning people.
Let’s tell you what we are doing – now that our customers and friends insisted we get out there and do a road show (of sorts, we hate to fly) to show off “non-constrained LLMs.”
We will use a different term, as that one will never pass our marketing guys – but you get the picture.
That means demonstrating, live, great big applications - ones everyone agrees need a whopping big boy data center - and we show them off running on a computer 4 inches by 4 inches - a thousand times faster than a data center could do it.
Here’s our pitch, how we are proving it, and we hope you will let us show it to you sometime this summer.
Fractal is fully distributed.
Fractal must be distributed because it was born of a U.S. Intelligence Community project where they wanted massive, quantum speed (they did not use that word, it was a while ago), compute without a data center.
Those guys didn’t care about energy or farmland because they would just take all they needed if they wanted to.
They did care about a data center as a target – big time.
Their requirement was survivability – thus no data center – period.
So: no data center, no central point of control, quantum speed, tiny applications, compute where the data resides, and every Fractal running 100% the same code – just different data. Phew!
Those were generally the requirements.
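Those requirements boil down to a simple pattern: ship identical code to every node and let each node answer from the data it already holds, merging only the tiny results. Here is a minimal sketch of that pattern – our own illustration with invented names (`node_query`, `run_everywhere`), not Fractal’s actual implementation:

```python
from concurrent.futures import ThreadPoolExecutor

def node_query(shard):
    """Identical code on every node: count records matching a predicate."""
    return sum(1 for record in shard if record["active"])

def run_everywhere(shards):
    """No central data store or point of control: each node answers from
    its own shard, and only the small per-node results are merged."""
    with ThreadPoolExecutor(max_workers=len(shards)) as pool:
        return sum(pool.map(node_query, shards))

# Four "nodes", each holding only its own slice of the data.
shards = [
    [{"active": True}, {"active": False}],
    [{"active": True}],
    [{"active": False}, {"active": False}],
    [{"active": True}, {"active": True}],
]
print(run_everywhere(shards))  # prints 4
```

The point of the sketch: the function every node runs is the same; only the shard it is handed differs, so losing any one node loses only that node’s data, never the system.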
For the last few years, we have been quietly building our customer portfolio, never bothering anyone.
Then came the LLM stuff and customers asked if we could do LLMs without a data center – thus infinite scalability.
In a recent post, we noted how we can use Apple devices and install Fractals on every Apple watch, computer, phone, tablet at a specific agency or company and make it instantly a super computer – and, yes, it can do LLMs all day long without a data center.
Without those constraints.
Some wags wrote in and said it would be cool if we could do it on heterogeneous hardware – like Apple, Android, HP, Dell – and we guess we never made the point, but of course we can, and we demonstrate that all the time.
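For the curious, here is one way a heterogeneous pool could divide a batch of LLM requests among mixed devices – a hypothetical sketch under our own assumptions (the device names and `relative_speed` weights are invented for illustration), not Fractal’s actual scheduler:

```python
def split_batch(requests, devices):
    """Assign requests to devices in proportion to relative speed.

    devices: list of (name, relative_speed) tuples.
    Returns {device_name: sublist_of_requests}.
    """
    total_speed = sum(speed for _, speed in devices)
    assignments, start = {}, 0
    for i, (name, speed) in enumerate(devices):
        if i == len(devices) - 1:
            end = len(requests)  # last device takes the remainder
        else:
            end = start + round(len(requests) * speed / total_speed)
        assignments[name] = requests[start:end]
        start = end
    return assignments

# A mixed pool: a slow watch, a mid-range phone, a fast laptop.
pool = [("apple-watch", 1), ("android-phone", 2), ("dell-laptop", 7)]
print(split_batch(list(range(10)), pool))
# The laptop gets 7 of the 10 requests, the phone 2, the watch 1.
```

The design choice is the usual one for uneven hardware: weight the work by device capability so the slowest device never becomes the bottleneck.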
You likely know we love working with partners.
Everyone seems to be freaking out that data centers are eating up farmland for A.I. so we thought we would give the citizens a way to prove data centers are going to be obsolete in a couple of years.
Data centers are what stranded assets look like before they get stranded by new technology.
The first question is answered out of the box – no, we do not need a data center, no matter how large the application and we can prove it all day long.
As you may have read on our micro site TheFractalGovernment.com, we took the entire Federal Election Commission System – 680,000,000 records – which needs both a data center and a cloud – and we run it on a 4 inch by 4 inch cube using less power than a kitchen microwave.
The second question is yes, Fractal scales essentially infinitely.
The MIT guys gave it a boundary with more zeros than the federal budget squared, so it’s not a thing.
If you are into LLMs – and if you are feeling the pain of not enough energy, cannot build data centers fast enough, or maybe the natives aren’t cool with a nuclear power plant down the street – and if you have really, really big compute problems, come see us and let’s chat.
Fractal eliminates the need for centralized, energy-consuming data centers.
Now that LLMs are eating the world, it looks like we need to lend a hand and help citizens save their rural farmland - by proving to their government that data centers are not the future - they are the past.
We are easy to reach - we are happy to prove this any day - just give us a shout.

FractalComputing Substack is a newsletter about the journey of taking a massively disruptive technology to market. We envision a book about our journey, so each post is a way to capture some fun events.
Subscribe at FractalComputing.Substack.com
Fractal Website: Fractal-Computing.com
Fractal Utility Site: TheFractalUtility.com
Fractal Government Site: TheFractalGovernment.com
Our new Sustainable Computing site: TheSustainableComputingInitiative.com
Portions of our revenue are given to animal rescue charities.
Fractal will adapt well as A.I. grift-giddiness moves toward “small LLMs” (see the recent NVIDIA paper, “Small Language Models are the Future of Agentic AI”) – namely, iPhone- or NUC-sized.