On the Flipside of the Stargate
Is Stargate the AI Apocalypse, Or Is There More Here to Consider?
While this week has had monumental developments with Trump’s Executive Orders, it’s safe to say that the very first press conference announcement caught many by surprise. Trump has represented hope for many, me included, as his first term ignited our economy and the climate change nonsense was put into perspective as we provided our own energy. The topic of his first press conference of his second term came out of left field, and I’m both alarmed and optimistic in different ways. To be honest, I was floored that the first technology announcement from his administration was with Sam Altman of OpenAI and Larry Ellison, two oligarchs who weren’t even prominent during the campaign. Their project is Stargate, a series of massive data centers needed to power the systems that provide AI services.
Sam is truly out of left field: his AI app ChatGPT wouldn’t engage in conversations regarding Hillary Clinton and RussiaGate or the Hunter Biden laptop, and would not allow a full-on criticism of the handling of the Covid crisis. I expected Trump to announce that Musk would be heading up any AI efforts, so Trump has pulled a fast one on Musk as well.
There are significantly bad ideas involved with Stargate, and significantly good ones as well. The best knife is a sharp knife, but if that knife is used against you, that’s a problem. That doesn’t mean I think knives should be outlawed.
For this analysis we’re getting into the specs for Stargate, the National Security Memorandum on AI, why OpenAI is afraid of competition, something called the Breakaway Civilization and what positive elements could emerge if nuclear energy becomes a primary driver for powering our grid.
The Stargate Specs
The Stargate project is championed by a consortium of OpenAI, SoftBank, Oracle, and MGX, and will represent $500 billion in private investment. SoftBank and MGX are the financiers, with Masayoshi Son, CEO of SoftBank, promising $100 billion ready for immediate utilization. MGX is an investment firm launched by the Abu Dhabi government that focuses on high-tech startups, with a board of directors helmed by Sheikh Tahnoun bin Zayed al Nahyan, who was not present for the press conference. It’s interesting that MGX has also invested in xAI, Musk’s alternative to OpenAI’s mainline product ChatGPT.
But again, many were too quick to state that the federal government was paying for this. It’s a private investment.
Larry Ellison, who frankly has been off my radar for a while, was suddenly front and center of this announcement. While I pay attention to the impact of AI, I don’t follow the investment side and maybe that should change. I don’t recall seeing Ellison’s name mentioned during the campaign, and rarely in the context of leading edge AI. It’s always been ChatGPT and Sam Altman and the other tech bros.
The goal of Stargate is to jumpstart the energy infrastructure build-out to support AI in the United States. AI requires considerably more power for processing than other computer operations. According to Utility Dive:
U.S. data center load is expected to grow to nearly 21 GW this year, up from 19 GW in 2023, according to a Federal Energy Regulatory Commission report this month. Data center electricity demand across the U.S. is expected to climb to 35 GW by the end of this decade, according to the FERC report.
The Big Tech companies have done a miserable job at remaining green with their energy consumption and CO2 emissions. CO2 is not detrimental to the atmosphere, but it is still a problem if Big Tech fails to meet their own standards while advising me to eat less meat and live in smaller quarters to save the planet from anthropogenic global warming. I wrote of Big Tech’s failure in Still Our Wisest Pursuit.
In OpenAI’s case, its daily cost for maintaining its systems in 2023 was $700,000.
For the US to retain its position as the leader in AI development, the amount of electricity that data centers consume will have to double. As I have written many times before, the drive to clear more land and confiscate property via eminent domain will only increase once AI is considered a primary factor in our economy. More solar electricity only comes from more solar panels.
Stargate offers a different approach, and that is the use of nuclear power. Sam Altman wrote last year about hoping to adopt nuclear energy to make up the difference. That’s actually a good idea, and the best implementation would be to locate the nuclear energy operations at the data center itself: you won’t need to clear miles of trees for new transmission lines, and you won’t need to deforest and destroy farmland to line the horizon with wind turbines and solar wastelands.
The first Stargate initiative is underway in Texas, and there will be 10 centers nationwide. The project is expected to create around 100,000 jobs, including construction and data center operations.
The federal government’s role will be to facilitate the permitting of nuclear energy. If we have a sense of Trump, this could mean clearing red tape.
Larry And the Jab
Larry Ellison made some very dumb, hyperbolic CEO statements about the wonders this investment will bring. Larry has always been jealous of Bill Gates; I recall him stating in the early days of the Internet, “the new age of technology lies with bits and bytes, and that is what the internet will deliver. Not your software delivered to you on DVD dropped off by a truck.” Microsoft was able to drive many corporations to its products, particularly its database system, which was cheaper and easier to administer than Ellison’s Oracle software. Ellison has many accomplishments, yet I couldn’t help feeling that he was eager to have the AI spotlight because many consider Oracle to be behind in the drive for AI adoption.
His statement that AI will be able to create an mRNA vaccine that cures cancer, tailored to specific immune systems, in 48 hours is just brain dead. For one, if you have cancer you receive treatment; the pre-Covid definition of a vaccine is a substance that prevents you from contracting an illness.
mRNA’s original intent was to act as a gene therapy treatment for cancer. As a vaccine it should never have been used; it has killed and maimed people.
AI has been used in medicine to aid in cancer diagnosis, as the strength of AI is the ability to analyze data and identify trends at a high speed. AI has been used for gene sequencing and finding potential new combinations of proteins that are of potential use in medicine. There have been studies of the genes of Neanderthals used in combination with AI to potentially identify new treatments for medical disorders.
But as Dr. Malone, one of the pioneers of mRNA, says, human immunology is too complex to believe that new permutations of protein combinations alone can result in quick-fix medicine. While Yuval Harari of the WEF thinks human DNA is hackable and that he has the right to treat your genetic makeup as Lego blocks, reality tells a different story.
So Larry is doing a terrible sales job here. People are rightfully outraged given what we have endured with Covid, but what Larry described is not a Fauci mandate. It is still alarming. Can mRNA treatment be used in cancer research? If we return to safe trials and take the time needed, it’s very possible. But it’s far too early to make such a triumphant prediction.
Should we take an mRNA anti-cancer vaccine the way they recommended Gardasil? I’m not signing up for that. The idea of trusting AI to whip up new vaccines frightens me, given the bird flu activity and the rush toward vaccine approvals. Larry is being ridiculous here, and everyone sees it. And everyone should be alarmed.
Destroying The National Security Memorandum for AI
In October 2023, President Biden issued Executive Order 14110 on AI, followed in 2024 by a National Security Memorandum (NSM) on AI.
Trump rescinded this order on his first day in office. I was surprised Trump recognized the need so early, but it absolutely needed to go. It represented several horrible things:
Tight control by the federal government over the development of AI-based technology
Oversight of testing AI technology for safety regarding the transmission of disinformation and information considered harmful. Censorship, in short.
Consolidation of the Tech industry through onerous compliance and continual reporting of AI system activity. In other words, incident reporting regarding misuse.
Monitoring of the training of AI systems. This meant that the sources of knowledge an AI system learns from would be monitored and rated. It also meant identifying resources on the internet that, if used in training AI, would result in harm. One way to undermine AI is to feed it data that would make it “hateful”; that information would be put on a watch list.
The NSM also identified AI as a primary component in our national defense. Given this new distinction, it meant that the federal government would take every measure to protect and control that technology. It also meant that the DoD National Defense Industrial Strategy, which identified bolstering supply chains, would be applicable, and this would drive economic activity at the guidance of the federal government.
Trump wisely eliminated this. It truly was a consolidation in favor of the established large players. Sam Altman was in favor of the NSM, because it benefited his company in many ways. OpenAI had the funding to comply with the regimen for testing and validation; smaller competitors do not. For example, it is estimated that OpenAI expended $7 billion in 2024 on its operations and development. Few companies can sustain those levels, and fewer still could sustain the additional costs of jumping through hoops of requirements proposed by a government consortium supported by OpenAI.
OpenAI’s Competition
ChatGPT is OpenAI’s flagship product. It is what’s called a Large Language Model (LLM). Think of it as a dictionary and a set of rules that can interpret your written questions and provide answers. LLMs can be specialized: through a process called fine-tuning, you can adapt a model to perform specific tasks or favor certain methods.
But models are also interchangeable, particularly to businesses: we don’t care how a result is produced as long as it is accurate and produced consistently. AI is really good at accepting reference documents and transforming that information into different formats. I’ve avoided programming reports for people like the plague for years, because they wanted too many changes and would change their minds constantly, and there were too many one-off reports at the last minute. With AI, subject matter experts don’t need software developers or complicated software to create information in a tailored presentation format. I won’t address whether that’s good or bad here. The point is that AI is a tool that empowers people in cost-effective ways.
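To make the “reference document in, tailored report out” pattern concrete, here is a minimal sketch in Python. The sample figures, audience, and format are hypothetical, and the function is my own illustration, not any particular product’s API; the resulting prompt could be sent to any chat-style LLM.

```python
# Sketch of using an LLM as a document-transformation tool: the subject
# matter expert supplies the source material and the desired shape of the
# output, and the model does the reformatting.

def build_report_prompt(reference_doc: str, audience: str, fmt: str) -> str:
    """Assemble a prompt asking an LLM to reshape source material."""
    return (
        f"Using only the reference material below, produce a {fmt} "
        f"for {audience}. Do not invent figures.\n\n"
        f"--- REFERENCE ---\n{reference_doc}"
    )

prompt = build_report_prompt(
    reference_doc="Q3 revenue rose 4%. Data center load reached 21 GW.",
    audience="a non-technical board",
    fmt="one-page executive summary",
)
print(prompt)
```

The point of the sketch is that the expert never touches report-generation software; they only describe the output they want.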
OpenAI’s CEO has floated a “PhD” tier of AI for $2,000 a month. That threatens actual PhDs, but for a business that is still a huge cost if 100 of your employees sign up. Currently there are two paid tiers, $20 and $200 per month, with fewer features. OpenAI currently has around 200 million weekly users.
A few weeks ago, a competitor, DeepSeek, released an LLM that matches, and in some cases beats, ChatGPT’s performance. This LLM is open source, meaning anyone can download it and, with the right equipment, run it on their private network.
For free.
Not only is DeepSeek free to download, it was far cheaper to train than the OpenAI model it beat. While OpenAI is reluctant to release details, it is rumored that training cost $500 million for each six months. DeepSeek accomplished its results for around $5 million. This cost is computational time, not the salaries of the technical team. Note that a thousand tokens represent approximately 750 words of output.
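The gap is easier to feel with some back-of-the-envelope math. Both training figures below are estimates from press reports, not audited accounts, and the token rule of thumb is the rough one cited above.

```python
# Back-of-the-envelope comparison of the cited training costs, plus the
# token-to-word rule of thumb (1,000 tokens is roughly 750 words).

WORDS_PER_1K_TOKENS = 750

def tokens_to_words(tokens: int) -> float:
    """Convert a token count to an approximate word count."""
    return tokens / 1000 * WORDS_PER_1K_TOKENS

rumored_openai_cost = 500_000_000  # rumored, per six months of training
deepseek_cost = 5_000_000          # reported compute-only cost

ratio = rumored_openai_cost / deepseek_cost
print(f"DeepSeek trained for roughly 1/{ratio:.0f}th the cost")   # 1/100th
print(f"4,000 tokens is about {tokens_to_words(4000):.0f} words")  # about 3000 words
```

If the rumored numbers are even close to right, that is a hundred-fold difference in training cost.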
(Sources for graphs at conclusion.)
The irony here is that DeepSeek is the work of a Chinese company with a staff of 200 people, while OpenAI has 4,500 employees. DeepSeek has released its code, while OpenAI keeps the inner workings of ChatGPT a secret.
If I were a business and wanted to avoid the headache of operating and maintaining DeepSeek’s LLM myself, I could still use it quite cheaply: DeepSeek’s hosted service is roughly 95% cheaper than OpenAI’s.
But let’s say I, as a single business operator, wanted to run an LLM on my own computer. DeepSeek offers smaller “distilled” versions that are not as powerful, yet still perform many tasks well with negligible performance issues. I can download one and experiment for free.
You can’t with OpenAI.
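For readers curious what “run it on my own computer” actually looks like, here is one common route. This is a sketch assuming the Ollama runner is installed; the specific model tag is an assumption on my part and changes between releases, so check the current catalog before pulling.

```shell
# Pull and run a distilled DeepSeek model locally with Ollama
# (model tag is an example -- verify it in the Ollama model catalog).
ollama pull deepseek-r1:8b    # a distilled ~8B-parameter build, a few GB
ollama run deepseek-r1:8b "Summarize the pros and cons of on-site nuclear power."
```

Everything runs on your own hardware, with no account, no subscription, and no data leaving your network.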
Had it remained in effect in 2025, the NSM could have severely limited products like DeepSeek. You see, the NSM protected OpenAI’s near monopoly. That’s gone now. And for near-ChatGPT levels of performance, there is now a product that delivers 95% cost savings.
That is a threat to OpenAI’s business model. You can make technical arguments as to why you would remain with ChatGPT, but most businesses will opt for the savings. OpenAI’s lead is not that large.
Larry, Sam and the Breakaway Civilization
Catherine Austin Fitts and Dr. Joseph Farrell have both described the divide between the elite and the remainder of humanity as the Breakaway Civilization, where technology and resources are diverted to the elite few, while we must contend with less and less. The advantages garnered by amassing wealth translate into technology that benefits the elite, while there is no trickle-down for the rest of us.
During the Biden presidency, Sam Altman floated the idea that nuclear energy would be an option, and hoped that special provisions would be granted so that AI could tap additional energy sources to move forward and benefit humanity. At the time I thought this would be yet another arrow in the elite’s quiver, another step forward for the Breakaway Civilization. In light of the NSM, it wouldn’t have been a stretch for the last administration to shower favor on Sam Altman and the tech elite by granting them permission to use nuclear, while the rest of us get barren land and solar panels. A two-caste system.
Larry is frightening in that he believes people are better behaved when they know they are under surveillance. By this thinking, wouldn’t a higher degree of surveillance ensure we would be better behaved? Further, why not just place cameras in our homes in case we are consuming misinformation or coming to the wrong conclusions? The idea of Larry, Peter Thiel of Palantir, or Elon Musk gaining more control over our information and feeding it to vast data centers of AI scares the hell out of me. Let’s say Trump keeps the surveillance state at bay; what happens when he leaves office?
Sam Altman is no better, and believes that due to AI, we will require a restructuring of society. Is this to gain investment from super fans of Yuval Harari and Klaus Schwab? This has the air of someone who feels they are acting for our benefit, but these are the same tones that say we should depopulate the planet and that free will is an illusion. Or at least my free will is an illusion, and because AI is just so much better at decisions, it should make them for me.
https://x.com/TFTC21/status/1882571514891080030
We have every right to be alarmed by these developments. Was Trump’s announcement a betrayal? Did he just institute a surveillance state, and will we be forced to take mRNA vaccines? No. But I am very concerned, and we should raise our voices to ensure this doesn’t take on a trajectory we can’t control.
That said, Trump struck another blow to the surveillance state when he ended the sudden rush toward a Central Bank Digital Currency with his executive order Strengthening American Leadership in Digital Financial Technology, which includes:
taking measures to protect Americans from the risks of Central Bank Digital Currencies (CBDCs), which threaten the stability of the financial system, individual privacy, and the sovereignty of the United States, including by prohibiting the establishment, issuance, circulation, and use of a CBDC within the jurisdiction of the United States.
Sec. 5. Prohibition of Central Bank Digital Currencies.
(a) Except to the extent required by law, agencies are hereby prohibited from undertaking any action to establish, issue, or promote CBDCs within the jurisdiction of the United States or abroad.
(b) Except to the extent required by law, any ongoing plans or initiatives at any agency related to the creation of a CBDC within the jurisdiction of the United States shall be immediately terminated, and no further actions may be taken to develop or implement such plans or initiatives.
A centralized digital currency has long been feared as a control mechanism that would track all facets of life. China has such a system, and the World Economic Forum has promoted it as a way to enforce “good” behavior. I bring attention to this so people can take a step back and assess whether we have been betrayed. And also to stay alert.
An Opportunity For the Vigilant
All is not lost. There is an opportunity here, a big one. Trump has ended the Green New Deal policies, and announced to the WEF that our energy policy will include coal, natural gas and oil. Trump has also stated that nuclear energy is very much an option.
If we can fast track the construction of energy infrastructure for OpenAI with nuclear reactors, why can’t we push for this in other arenas?
Rolls-Royce has developed a Small Modular Reactor that can reportedly power 1 million homes. If this, or the MARVEL microreactor developed at the Idaho National Laboratory, is approved for use with data centers, then what is preventing us from implementing such solutions for other commercial use in our states? We have to ask if AI is the only thing that will benefit humanity. Since that has yet to be seen, and we know the benefits of nuclear energy, why not go with safe bets for our society? Currently in Michigan we hope to restart the Palisades reactor, but an awful lot of bureaucracy stands in the way. That plant had been operating, and if Sam Altman and Larry Ellison can get attention, we should push the issue for ourselves as well.
I have extreme reservations about Larry and Sam’s ability to deliver wonder drugs built with AI, but we have an opportunity to push for energy solutions that we know will work. We have to take advantage of that opportunity while remaining vigilant.
We have no other choice.