It’s easy to get caught up in the rush of people claiming AI will revolutionize every facet of life, and that humanity is on track to Artificial General Intelligence (AGI), a lofty and still-undefined notion of an AI “superintelligence.” But reality is dirty and complicated.
College students and new graduates face a bizarre situation: for the first time since at least 1992, their unemployment rate is higher than that of the American workforce as a whole. According to a study by J.P. Morgan, employment across key tech industries has plateaued since December 2022, right around ChatGPT’s release. Their fears of unemployment and a rocky job market are, at the very least, not unfounded.
Ask any prospective computer science major here at BHS or a current student at university about jobs, and you’ll hear a similar picture painted. The rise of AI has boosted the productivity of senior developers at tech companies, reducing the need for junior positions and shrinking openings for students. The result: horror stories of college juniors and seniors applying to hundreds or even thousands of jobs and internships and receiving only a handful of interviews or online assessments, let alone an offer.
But despite the outsized usage of AI in the tech sector, the general sentiment around it is buoyed by speculation and hype. It is not only the companies looking to cut costs with AI, and the providers of AI services, that are swept up in the hype; so are the manufacturers of the chips and GPUs needed to run AI datacenters, and the companies that build these eye-wateringly expensive facilities. Independent estimates put the cost of a new AI cloud computing datacenter in the tens of millions of dollars, with the largest currently planned carrying a staggering $100bn price tag.
This raises the question: where are these AI startups getting all this money? As it turns out, through something Sam Altman has ironically dubbed “financial innovation,” which is, in reality, an intricate web of overlapping deals between mega tech companies and smaller AI startups.

Above is a graphical representation of the circular nature of AI finances at the trillion-dollar scale they have reached. The best example of this is the $100bn deal between OpenAI and Nvidia just last month for a new “Stargate” datacenter. OpenAI received $10bn up front, with another $10bn promised for every gigawatt of computing power deployed. It plans to lease the chips, then commit over $300bn to Oracle, a massive cloud computing firm, to deploy an additional 4.5 gigawatts of computing power. Since the best chips in the world for AI inference are Nvidia chips, it is no mystery where most of that money is going.
In essence, Nvidia invests in OpenAI; OpenAI pays Oracle to deploy the compute; Oracle buys the chips from Nvidia. This is alarmingly similar to a feature of the dot-com bubble that burst in 2001, termed “vendor financing,” in which telecommunications companies financed their customers’ purchases of their own products. Now the most valuable company on the face of the planet, Nvidia has drawn scrutiny from analysts and academics who argue these circular deals inflate its stock price and revenue figures.
Granted, there are tangible customers buying actual products outside this apparent loop. Big investors like Microsoft and Google have cash from real businesses, money that isn’t being swapped around like a hot potato. But so far, spending on AI far outpaces monetization. OpenAI, the maker of ChatGPT and the most valuable private company in the world, has never turned a profit. In fact, it is on track to lose $27bn this year alone.
The chokehold AI has on jobs seems poised either to collapse at a moment’s notice or to endure and transform technology, and how we use it, forever. Either way, even after sweeping through entire industries, AI’s own future remains obscured by speculation and bubble sentiment. AMD CEO Lisa Su assures us that Oracle, Nvidia, and OpenAI are locked in a “virtuous, positive cycle.” 2001 begs to differ.