The Black Swan Event About To Hit A.I.
You are about to learn more about computer design than your 17-year-old - we try to make it painless
If you follow us here, you are probably interested in one of the greatest technology head-butts of the last 50 years - and it is about to happen.
There is a Black Swan Event now hitting the A.I. world and you have a front row seat - and we are telling you what to look for!
Maybe you joined the site because you were originally interested in voter fraud and saw our work in 2022 - but now you are sucked into the Big Fight about to happen, and you have that ringside seat.
Here’s the fight. Here’s the BLACK SWAN.
99% of the tech planet - even the guys you love, like Elon and Vivek and that VP guy who knows zip about technology - are telling you America could be falling behind the Chinese in A.I. and needs quadrillions of dollars in data centers to catch up.
They are backed by every major tech company on planet Earth.
Every consulting company publishes “whitepapers” (brochures) about the race toward more data centers.
TV pundits in the financial space - like Charles Payne and the bald guy who jumps around, screams with his shirtsleeves rolled up to the elbows, and tells you which stocks to buy - pitch “data centers.”
They are in the near corner.
Then there’s us - scrappy Austin guys - who built some pretty famous systems - eBay fraud detection, TSA No-Fly List, that sort of stuff - telling you they are ALL wrong.
This is how these sorts of things start - some small entity stands up and says “it ain’t so!”
For nearly 200 years, children have heard the story of The Emperor’s New Clothes!
We are now those children.
Often, big things come from such types.
This is a Rocky movie - instead of a guy in a sweatshirt running through slums at dawn to fight the undisputed champ, you are reading a Substack about some guys nobody ever heard of telling you the whole world is wrong.
The world isn’t wrong about a weather forecast. They aren’t wrong about CO2 levels in the atmosphere. They are fundamentally wrong about the single biggest tech challenge in the last 40 years!
Data centers are obsolete - there is no corporate or government application that requires a data center!
That’s why you want to stay tuned here - because to prove we are correct (we are, we have live customers), there must be a Black Swan event where a massive data center is moved to a computer the size of a pack of cigarettes.
When that becomes widely known - wow, they are just screwed!
Our guys are building quiet systems for electric utilities and now for some famous politicos - demonstrating data centers are NEVER needed.
At stake are trillions of dollars in revenue - plus trillions more in stranded infrastructure.
Our new friends in Virginia, holding vigils and fighting with town councils to stop their state being overrun with data centers - they are about to join the fight - on our side.
That’s no bullshit - we are talking trillions if we are right - or trillions for the other side if we are wrong - but trillions nonetheless.
We don’t get the dough, but that’s what’s at stake.
Tell me this - is there any fight more interesting for you to watch right now?
Well, maybe watching Trump fire half the government, and DC and Maryland homeowners selling their houses at any price because they lost their hidden income - that’s more fun!
But we are close!
Isn’t this more interesting than hockey?
Today, you are going to get an education in advanced software architecture and chip design - and a peek into the metrics that show why we are right.
Remember how the Black Swan thing works.
It’s one thing to predict there will be a Black Swan event.
It is quite another to take major U.S. Government systems - systems that need a data center a city block in size - and run the same stuff 1,000 times faster on the computer you see below.
We do it every day. The Black Swan ain’t coming - she’s here.
You are looking at a data center - one that runs 680,000,000 Federal Election Commission records at 200,000,000 transactions per second, and uses the power of a table lamp!
If you are not a Pepsi drinker, you can put these computers in your coat pocket.
The FEC needs a multi-million dollar data center PLUS a cloud - which is just someone else’s data center.
They need 80 people to run this thing.
They use the electric power of a city of 25,000 to run it.
They use over 100,000 gallons of water a day to cool it.
We do it ALL on that little computer - we don’t need both of them, we don’t need either of them - today.
How the heck do we do this?
This is a longer post, stay with us because in about a year everyone will be asking how we did it - and you will know the answer!
That education is so simple - so easy to understand - at the end of this article you can explain it with authority to anyone.
If you can, and you will be able, just being able to explain the chip/processing/speed problem will convince you we are right.
A year ago, when we got boastful that our way was right, nobody listened.
Our dogs just wanted to take a walk and have a treat. Our kids thought wow, at least he’s not a greeter at Walmart.
It’s a little different now - and we are giving you a peek - because this is ending up in a book someday and you can tell your friends you were one of the subscribers who read about these crazy guys.
Let’s take a walk into software and chip design - and show you why you see more of the future than the tech writers at the Wall Street Journal or Forbes, more than many VCs, and certainly more than some of your tech heroes.
What is I/O?
I/O is the hero or the goat of our story - and if you understand I/O, you can see the future of the tech industry.
Computers have CPUs - the brain. Big computers have lots. Your phone has at least one.
A CPU operates at silicon speed. That’s close to the speed of light.
It does a calculation or other piece of meaningful work.
When it finishes the calc, it reaches into what is called Level 1 (L1) cache and grabs the next thing - an instruction or a piece of data - via what is called an I/O event.
I/O stands for input output - and it is what computers do. They process data, using a CPU to deal with I/O events.
Let’s be a CPU for a moment to make the critical point.
A CPU does one instruction - one thing - and we will say that one thing takes a day, in human terms. It’s really a nanosecond, but we are going to give you a relative scale here - so if we say that nanosecond is the equivalent of a day, what’s next?
Well, there is that I/O event.
So with modern computers, if we call the CPU action the equivalent of a day, the next I/O event is the equivalent of 100 years. Yeah - a century!
So every time a CPU finishes a task, it waits the equivalent of a century (in human terms) for the next task. That is latency.
One day’s work, wait a century, one task, wait a century, one task, wait a century - you get the picture.
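The day-vs-century picture is easy to check with arithmetic. The numbers below are illustrative, not measurements of any particular system: we assume one instruction takes about 1 nanosecond, and pick an I/O wait of ~36.5 microseconds to match the 100-year figure above.

```python
# Map machine time onto human time: 1 nanosecond of CPU work = 1 "day".
# Illustrative numbers, not measurements of any particular system.

CPU_OP_NS = 1          # one instruction: ~1 ns
IO_WAIT_NS = 36_500    # one I/O wait: ~36.5 microseconds

# If 1 ns "feels like" a day, the I/O wait feels like this many years:
io_wait_years = (IO_WAIT_NS / CPU_OP_NS) / 365

# Fraction of wall-clock time the CPU actually spends computing:
utilization = CPU_OP_NS / (CPU_OP_NS + IO_WAIT_NS)

print(f"one day of work, then a {io_wait_years:.0f}-year wait")
print(f"CPU utilization: {utilization:.4%}")
```

With those assumed numbers, the CPU is computing for a fraction of a hundredth of one percent of the time - the rest is waiting.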
Now you don’t see that on your desktop PC because it processes so few transactions.
Try running the billing system for AT&T!
AT&T needs a data center occupying blocks of space in Dallas.
They need 23 days - fully using a massive data center - to generate 125,000,000 monthly bills.
They spend $1.2 billion a year - and 99% of the energy consumed goes to - yup, you guessed it - latency. The CPU isn’t doing squat!
Those computers are waiting for I/O events. Now you can see why those nanoseconds add up!
The computer world has really smart men and women - so fix it! Obviously everyone knows this.
Well, you are correct - the chip designers made chips run faster and faster - so the speed hid the latency problems. Results came in faster because chips were smaller and faster - but the latency persisted.
Why?
Aha, now the clever part nobody told you.
If you were a computer programmer in 1975, you never heard of “off the shelf software.” You built everything yourself - bespoke, as the Brits say - from the operating system up, and it was really fast.
That little “tech stack” you created to run as a computer application knew lots of stuff about how the underlying chip worked.
You “pipelined” your data because you knew transaction A would always come before transaction M.
No off the shelf - general purpose software. Everything purpose-built.
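Here is a toy model of why that 1975-style knowledge mattered. The numbers are hypothetical, and this is not anyone’s actual code - it just shows the shape of the trick: when you know the order records will arrive in, you can start fetching the next one while the CPU works on the current one, and when the fetch fits under the work, the I/O wait vanishes from the total.

```python
# Toy pipelining model in abstract time units (hypothetical numbers).
FETCH = 2    # time to fetch one record (the I/O)
WORK = 3     # time to process one record (the CPU)
N = 1000     # number of records

# Naive loop: fetch, then process, one record at a time.
naive_time = N * (FETCH + WORK)

# Pipelined: only the first fetch is exposed; every later fetch
# overlaps the previous record's processing, so each record costs
# max(FETCH, WORK) in steady state, and the last record's work
# finishes after its fetch.
pipelined_time = FETCH + (N - 1) * max(FETCH, WORK) + WORK

print(naive_time, pipelined_time)
```

With these numbers the naive loop costs 5000 time units and the pipelined one 3002 - the fetches ride for free under the compute, which is exactly what purpose-built 1975 code did.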
That was 1975.
In the 1980s, the “off the shelf software” (COTS - commercial off-the-shelf) industry exploded. In came Oracle and 5,000 others with software that did everything for everyone.
Security, development, virtualization, graphics, statistics - everything was purchasable.
That software was general purpose - to a point.
Every security software vendor made software for just about any industry. Databases could store any arbitrary value for any arbitrary application.
That was a huge step forward.
What nobody did was optimize to take advantage of innovations in the chip - that processor down in the computer.
Why optimize anything if chips are getting faster every day? No need.
Well, chips aren’t getting any faster today - so data centers have to get bigger.
OK, enough.
Our team worked with one of the intelligence agencies several decades ago on a classified project that sidestepped all that COTS software.
Our team had to build software that “knew about the underlying chip” and optimized I/O to such a degree that today we run from a thousand to a million times faster than any current tech.
It did other stuff - really important stuff too which we will cover in later posts.
These include inverting the compute model.
Currently, companies move their data to a computer - centralized.
The better way is to move all compute to the point where the data resides - which we do and is the subject of a future post.
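The inverted model - ship the question to the data instead of shipping the data to a central computer - can be sketched generically. The names and numbers here are hypothetical, not actual Fractal code; the point is only that the centralized path moves every record, while the inverted path moves only a tiny answer.

```python
# Hypothetical sketch: two ways to answer "how many records match?"
# across several data nodes.

nodes = [
    [5, 12, 7, 30],      # records living on node A
    [2, 44, 18],         # records living on node B
    [9, 9, 25, 31, 4],   # records living on node C
]

def centralized(nodes, predicate):
    # Ship ALL records to one place, then filter there (the expensive move).
    all_records = [r for node in nodes for r in node]
    return sum(1 for r in all_records if predicate(r))

def compute_at_data(nodes, predicate):
    # Ship the predicate to each node; only tiny counts travel back.
    return sum(sum(1 for r in node if predicate(r)) for node in nodes)

over_10 = lambda r: r > 10
print(centralized(nodes, over_10), compute_at_data(nodes, over_10))
```

Both paths print the same count - the difference is what crossed the wire to get it.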
We also use A.I. to store the data in the manner in which it will be retrieved - called locality of reference - another post there too.
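Locality of reference is simple to demonstrate. This is a hypothetical sketch, not the authors’ method: if you lay data out in the order it will be asked for, each lookup becomes a cheap ordered step instead of a hunt through unordered records.

```python
import bisect

# Hypothetical sketch: store records pre-sorted in retrieval order so
# lookups become O(log n) bisects instead of O(n) scans.

unordered = [(41, "d"), (7, "a"), (99, "e"), (23, "b"), (36, "c")]

def lookup_unordered(records, key):
    # Without locality: scan everything for each request.
    for k, v in records:
        if k == key:
            return v

# With locality: lay the data out in the order it will be retrieved.
ordered = sorted(unordered)            # (key, value) pairs sorted by key
keys = [k for k, _ in ordered]

def lookup_ordered(key):
    i = bisect.bisect_left(keys, key)
    return ordered[i][1]

print(lookup_unordered(unordered, 23), lookup_ordered(23))  # same answer
```

Same answer both ways - but the ordered layout keeps related records next to each other, which is what lets caches and prefetchers do their job.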
The result is our Fractal technology runs at quantum speed - today, on current hardware - and works for about 95% of all government and commercial applications.
So the Black Swan event is this: nations are hitting the wall on building more data centers. Even if you can build one in a couple of years, the electric power is not there - and a utility may take 5 years to build a new plant to deliver the energy.
When the proverbial irresistible force meets the immovable object - very cool stuff happens.
So, here we are.
Nobody is going to have the electricity for a decade to do what the A.I. guys claim is needed.
We currently run some of the country’s largest systems - in full production - without data centers at a cost 1/100th that of conventional technology.
We have live customers - the Black Swan is here.
So how do you think this is going to turn out?
FractalComputing Substack is a newsletter about the journey of taking a massively disruptive technology to market. We envision a book about our journey so each post is a way to capture some fun events.
Subscribe at FractalComputing.Substack.com
Fractal Website: Fractal-Computing.com
Fractal Utility Site: TheFractalUtility.com
Fractal Government Site: TheFractalGovernment.com
Jay@FractalWeb.App
Portions of our revenue are given to animal rescue charities