A.I. Will Doom Large Data Centers
The future of A.I. is decisioning - and decisioning inherently cannot be centralized
Thesis:
Data centers - centralized computing - were built to deliver batch reports, to a bureaucracy, to manage day-to-day corporate activities.
A.I. is about real-time decisioning - gathering the most relevant data, rendering a decision solving a business problem - creating economic value.
Business problems, by definition, are specific and constrained.
A.I. applications delivering the fastest decision-speed across the largest relevant data set drive out all competitors.
Centralized computing is inherently unable to match the speed or data relevance of distributed edge, quantum-speed computing.
Thus, the A.I. market will shift from centralized data centers to intelligent edge devices - because that’s where the money aggregates.
Large, centralized data centers become stranded assets as they can no longer compete for the most value-generating A.I. applications - decisioning.
Citizens in states with new data centers will have their property values, lifestyles and health negatively impacted.
This is an Apple Mac Studio, about six inches square - on which the Fractal team ran an Oracle Cluster - a hunk of a data center.
What is the purpose of A.I.?
What is the financial payoff of A.I. that will support the trillions of dollars in investment?
If you follow the tech press narrative, you already know what the future is NOT.
The future of A.I. is not anyone asking any question about any topic and getting the correct, optimal answer, instantly.
If you doubt that, take a look at the articles about ChatGPT and its abysmal financials.
Take a look at how Oracle financiers are backing out of that Oracle ChatGPT deal - at quantum speed.
The funding players asked, “Where does A.I. create economic value?”
The future of A.I. is simple - making FAST, INFORMED decisions about constrained problems - driving big dollar benefits.
Decisions. Fast. Informed. Constrained.
That’s it. Period.
The first ingredient of optimal decisioning is speed - beyond anything current data centers and 1980s-era I/O-intensive software can deliver.
Speed:
Your politicos are pitching the Golden Dome - how to stop hypersonic missiles coming over Greenland to Chicago.
Speed was always important - the stakes just multiplied.
Informed:
The critical A.I. ingredient is data - in multiple formats, different types - data in rows and columns, videos, meters, devices, voice - integrated instantly to support a fast and informed decision.
The data comes from scores, hundreds, thousands or more devices - depending on the application.
Applying data and speed yields intelligence - for decisioning.
Merging data, instantly for context - for decisioning - makes the difference and creates economic value.
Fast without informed is lethal.
Informed without speed - useless.
These two ingredients are so vital they create a zero-sum, winner-take-all game - finality.
If your side is off by 1 second in the A.I. world, your missile fails and your adversary’s lands.
If you make a financial hedge bet, sub-second timing drives billions of dollars in profit or loss.
If you are not gathering ALL the relevant data, merging it instantly - you lose.
A.I. applications create a very dangerous game at full scale.
You are entering the ultimate winner takes all game.
The winning A.I. application must have both - speed and informed data sets.
Centralized computing - data centers - cannot render the fastest speed and most informed decisioning.
It’s the architecture - and it cannot scale to the emerging money applications in A.I.
In philosophy this is called an inherent contradiction.
You never want one of those in your thesis.
A property of inherent contradictions is they are often discovered TOO LATE - after investors mindlessly bet trillions of dollars on ripping up Pennsylvania farmland for data centers.
Inherent contradictions show up when you scale your thesis - when you go big, things fall apart. That is the future for many data centers as the economic value of A.I. moves away from ChatGPT-type endeavors toward real-time decisioning.
Data centers are infrastructure, not intelligence.
Data centers are needed for ancient software - like Oracle or Palantir - I/O generators so painfully slow at the individual transaction level that they need machines covering acres of farmland to run them.
Data center software systems are I/O latency factories - that means the “FAST” part of the data center - the CPU - is essentially inactive, like asleep, about 95% of the time.
Virtually all enterprise software today is I/O intensive - thus 95% of the time the CPU is at rest.
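The mechanism behind that idle-CPU claim can be shown with a small timing sketch. The 95%/98% figures above are the article's; the sketch below just illustrates the duty-cycle gap between I/O-bound and CPU-bound work, with `time.sleep` standing in for disk and network waits:

```python
import time

def busy_fraction(task):
    """Fraction of wall-clock time the CPU actually spent computing."""
    wall0, cpu0 = time.perf_counter(), time.process_time()
    task()
    wall = time.perf_counter() - wall0
    cpu = time.process_time() - cpu0
    return cpu / wall

def io_bound():
    # I/O-intensive workload: the CPU mostly waits
    # (sleep stands in for disk and network round trips).
    for _ in range(20):
        time.sleep(0.01)   # simulated I/O wait
        sum(range(1000))   # a sliver of real work

def cpu_bound():
    # Compute-intensive workload: the same CPU works the whole time.
    for _ in range(10):
        sum(range(2_000_000))

print(f"I/O-bound, CPU busy: {busy_fraction(io_bound):.0%}")
print(f"CPU-bound, CPU busy: {busy_fraction(cpu_bound):.0%}")
```

On a typical machine the I/O-bound loop reports low single-digit CPU utilization while the compute-bound loop sits near 100% - same hardware, two very different duty cycles.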
Quantum-speed software, from Fractal, operating without a data center - now being demonstrated daily - delivers decisioning speed unobtainable with current software technology.
In the quantum-speed world, on current hardware, that same CPU is decisioning 98% of the time.
CPU at rest 95% of the time = centralized data centers and the current enterprise tech stack.
CPU decisioning 98% of the time = quantum-speed software on computers so small they do not require a data center.
That’s a fatal equation for data centers - as time will demonstrate.
A.I. needs that speed - centralized computing models cannot deliver it - thus A.I. requirements will over time doom these data centers into stranded assets.
The upcoming battle in A.I. is about which infrastructure delivers the speed and information fastest for decisioning.
The choices are centralized command and control - data centers - or distributed intelligent edge devices - edge computing.
The problem in centralized compute (DATA CENTER) is moving data from the decision point to the processing point - fraught with both time delays and the danger of comms loss - and then sending the answer back to the decision point.
One adversary has a battlefield team, collecting data from devices, drones, planes, satellites, in a fight - processing it right there without a data center - receiving an instant response.
The other adversary collects data from devices, sends data to a data center nearby or far away and awaits an answer.
You know where this is going.
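Rough numbers make the gap concrete. None of these figures come from the article - the 30 Hz sensor rate, 5 ms of local processing, and 80 ms network round trip are assumptions for illustration:

```python
SENSOR_HZ = 30   # assumed: a 30-frames-per-second sensor feed
LOCAL_MS = 5     # assumed: on-device processing time
RTT_MS = 80      # assumed: network round trip to a distant data center

def frames_stale(latency_ms):
    """Sensor frames that arrive while a decision is still in flight."""
    return int(latency_ms / 1000 * SENSOR_HZ)

edge_latency = LOCAL_MS            # process at the point of the problem
dc_latency = LOCAL_MS + RTT_MS     # same processing plus the round trip

print(frames_stale(edge_latency))  # 0 - the edge decision is never behind
print(frames_stale(dc_latency))    # 2 - the data center answer is already stale
```

And that assumes the link stays up; lose comms and the data center path returns nothing at all.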
This is the inherent contradiction: A.I. is about fast, informed decisioning.
Centralized computing will always be slower and less informed than computing and decisioning at the point of the problem, or the fight - intelligent edge computing.
In a world where seconds matter - centralized computing fails for A.I. systems.
I/O latency - operating a CPU 5% of the time - scales only by building data centers the size of Manhattan.
Certainly in battlefield conditions, there cannot be a data center - period.
We were called about a less sanguine example this week.
Autos - the kind without drivers - are failing to scale in Europe, according to a person who is somewhat responsible for their safety. Each auto collects massive data every second, and each must send a lot of data to a central point to know where the other driverless autos may be.
Slow response speed is fatal.
It does not scale - it is a disaster.
The problem is each auto is an edge computing platform, collecting more data as it drives along European streets than the early space program collected from the Lunar Module.
Centralized computing creates latency among the multiple autos.
Their navigation data cannot be processed and sent back instantly - keeping them from driving into each other - so they drive somewhat blindly - and “blind” is not a good thing in a car.
This is the future of A.I. computing - gather data as it is created, where it emerges, process it locally, at quantum-speed, share it with peers - decision it instantly.
Like this:
Driverless cars collect data constantly, in multiple forms.
Every car has different data - its own data - but the same application.
Those “applications” - which we call Fractals - form an instant compute MESH, and every car knows where every other Fractal car is, all the time.
Every vehicle is a Fractal.
Want 100,000 more vehicles? Add more Fractals - there is no degradation in speed; all data is collected by each vehicle.
There is no central data center - no data is sent to a central computer.
Decisioning is local, instant - thus effective and scalable.
There is no data center needed because the car is an intelligent edge device. The car does the computing.
All the cars together form an intelligent edge MESH - which delivers decisioning faster than a data center - without a central point of processing.
Even if all communication is lost - this system continues to operate.
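Fractal's internals are not public, so the following is only a toy sketch of the no-central-server idea: every node broadcasts its state directly to its peers, and each node ends up with the full picture. The class and field names are invented for illustration:

```python
class Car:
    """A toy intelligent-edge node: its own state plus a map of its peers."""
    def __init__(self, car_id, x, y):
        self.id, self.x, self.y = car_id, x, y
        self.peers = {}   # car_id -> last known (x, y), learned peer-to-peer

    def broadcast(self, fleet):
        # Share position directly with every peer - no central server involved.
        for car in fleet:
            if car is not self:
                car.peers[self.id] = (self.x, self.y)

fleet = [Car("A", 0, 0), Car("B", 10, 5), Car("C", -3, 7)]
for car in fleet:
    car.broadcast(fleet)

# Every car now knows where every other car is, with no data center in the loop.
print(fleet[0].peers)   # {'B': (10, 5), 'C': (-3, 7)}
```

Add a fourth car and it simply joins the broadcast loop; nothing central has to scale.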
You cannot do this with centralized command and control, thus NO DATA CENTER.
This is DECISIONING where the decision counts - combining speed and information.
This is where economic value is created - as in getting paid for this A.I. solution.
If there are two competing A.I. systems, one decisioning at the problem point while the other needs a data center delivering a decision seconds or minutes later - which gets funded over the long term?
The data center system costs 100 times more, destroys farmland, and drives citizens to pass the hat to hire lawyers to fight the central compute model - that’s the variable nobody tracks right now.
Data centers everywhere are not going to happen - the opposition is now too great.
We need to visit the concept of “constrained.”
A.I. - the kind that pays the bills, solves problems for businesses, agencies, governments - has its financial future delivering decisioning applications.
Applications solve problems - when you solve a problem, you get paid.
Every problem is by definition constrained.
If the problem you are trying to solve is “what is the optimal lubricant for this machine under these stress conditions,” your data reflects everything about that machine, the oil, and all past histories.
A.I. must enable the human to be taken out of that decision - so when a machine has a problem, the A.I. system runs the query and applies the decision.
Seconds may prevent catastrophic failure.
Companies pay millions of dollars to stop multi-billion-dollar enterprises - like a drilling platform - from suffering a catastrophic failure.
Why send the data to a centralized data center when all that computing can be done at the EDGE - on an oil rig - locally - instantly?
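A constrained decision like that can run as a small local rule at the edge. Everything below - the sensor fields, thresholds, and lubricant names - is invented for illustration, not taken from any real system:

```python
def choose_lubricant(reading, history):
    """Decide locally, on the rig, from the machine's own data - no round trip."""
    avg_temp = sum(h["temp_c"] for h in history) / len(history)
    # A sudden spike over the machine's own history is the emergency case:
    if reading["temp_c"] > avg_temp + 15 or reading["vibration_mm_s"] > 7.0:
        return "shutdown"   # seconds matter: act before catastrophic failure
    if reading["temp_c"] > 90:
        return "high-temp synthetic"
    return "standard mineral oil"

history = [{"temp_c": 80}, {"temp_c": 82}, {"temp_c": 81}]   # past readings
print(choose_lubricant({"temp_c": 120, "vibration_mm_s": 3.0}, history))  # shutdown
print(choose_lubricant({"temp_c": 92, "vibration_mm_s": 3.0}, history))   # high-temp synthetic
```

The point is not the rule itself but where it runs: on the machine's own data, at the machine, with the human taken out of the loop.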
The inherent contradiction is command and control - DATA CENTERS - are architecturally prohibited from delivering the blinding speed for A.I. decisioning.
It’s not technology here - it is the computing model.
One computing model has a centralized infrastructure.
The other computing model has no central point of control - it computes at the problem.
Why haven’t you read there is an alternative to every piece of virgin farmland in America needing a data center?
The tech press is subsidized by the obsolete, ancient software industry which lived quite comfortably having I/O laden software run in a centralized data center.
A.I. changed the game - overnight.
Data centers are for reports.
Data centers run the H.R. and 401(k) systems.
Data centers and Oracle and Palantir are there to manage the humdrum daily systems for a batch processing-oriented bureaucracy.
A.I. renders decisions - decisions are inherently time sensitive - so speed is now life or death for an application.
You don’t need to hear it from us - you can see Oracle and the others scrambling to sound like they are at the forefront of A.I., while you also read articles everywhere about A.I. not paying off.
A.I. isn’t yet paying off because the centralized compute model does not allow A.I. decisioning at the problem point.
Central computing delivering batch reports is not built for agility, speed and decisioning.
That is changing as our team and others demonstrate every day.
Watch for the upcoming webinar: Sustainable Computing for An A.I. World sponsored by Fractal and our partners - it will be on the SustainableComputingInitiative.com site.
We are working with early adopter partners building applications which decision at the point of the problem. While these are far faster than any centralized infrastructure, they also make us popular with the NO DATA CENTER crowd.
Our team moved an entire Oracle Cluster - kind of a hunk of a data center - to an Apple mini - even less powerful than the photo above.
The Apple mini cost about $2,000. It plugs into a wall, it uses less power than a kitchen microwave oven.
If someone has an application - like the world’s largest billing application - and it costs them $1.2 billion a year to operate, needs a dedicated data center and a staff of dozens, and takes 23 days to process…
The same application can be done on a hundred or so Apple minis, at a cost of a million or two million dollars, with a couple of engineers running it, using about $3,000 a month in power - without a data center.
So all in: $1.2 billion a year versus, even generously, $40 million a year - those are the economics of Fractal edge computing.
And the bills?
Instead of 23 days for about 150 million bills - they would be done in a couple of hours.
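The throughput arithmetic, using only the figures the article gives (150 million bills, 23 days, with “a couple of hours” taken literally as two):

```python
BILLS = 150_000_000          # bills per run, per the article
LEGACY_DAYS = 23             # current processing window, per the article
EDGE_HOURS = 2               # "a couple of hours", taken as two

legacy_rate = BILLS / (LEGACY_DAYS * 24 * 3600)   # bills per second today
edge_rate = BILLS / (EDGE_HOURS * 3600)           # bills per second required

print(f"legacy: {legacy_rate:,.0f} bills/sec over {LEGACY_DAYS} days")
print(f"edge:   {edge_rate:,.0f} bills/sec to finish in {EDGE_HOURS} hours")
print(f"implied speedup: {edge_rate / legacy_rate:,.0f}x")   # 276x
```

That 276x gap is what the hundred-odd minis, working in parallel, would have to deliver.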
So when you hear 99.8% of the tech press telling you America needs to destroy the rural landscapes of Virginia, Maryland, Georgia, Pennsylvania, Arizona and Texas to get A.I. dominance over the Chinese, remember we told you it ain’t so.
Our team is working with citizen groups in a dozen states proving to their legislators these mammoth data centers are on their way to becoming stranded assets in the next few years.
The upcoming March webinar is the first in a monthly series for anyone who wants to see the future from their dashboard rather than from their rear view mirror.
Remember the same tech press - their moms and dads - told you Blockbuster would have a video store on every corner and Kodak would have your film processed in every strip mall.
The future is seen by 99.8% of the population in their rear view mirror - follow us here - on this Substack - share it with your friends - so you can see what the tech press cannot.
NOTE: Our team is doing a webinar in early March, with partners, demonstrating there is no need for a data center. We will run a full Oracle Cluster application on an Apple mini - and show you the future.
We are working on the dates and speakers now.