TechReaderDaily.com
Opinion · Technology Economics

Big Tech Earnings: The $715 Billion Line Item That Redrew the Cycle

Four hyperscalers reported strong earnings but tempered forward guidance as combined AI capex commitments approached Switzerland's GDP.

Rows of AI server racks in a hyperscale data center, with GPU clusters under blue LED lighting, illustrating the scale of the infrastructure buildout. Photo: tomshardware.com
In this article
  1. The Goldman Bear Case and the Math That Undercuts It
  2. What the Lease Liabilities Tell Us

On page 47 of Microsoft's 10-Q, filed the morning of April 30, a single line item caught the attention of sell-side analysts at Goldman Sachs and Morgan Stanley before the earnings call even began: finance leases entered into during the quarter had risen 47 percent sequentially, pushing total lease commitments to $108.4 billion, nearly all of it tied to data-center construction and GPU-cluster provisioning. The company did not highlight the figure in its press release. It did not need to. The number told investors what the prepared remarks would spend twenty minutes massaging: capital expenditure is not decelerating, and the financing structure behind it is becoming more complex.

Across the five trading sessions that followed, the hyperscalers reported in rapid succession. By the time Apple closed the cycle on May 1, the four principal AI spenders (Amazon, Alphabet, Microsoft, and Meta) had together disclosed 2026 capital-expenditure plans that, depending on which analyst's lease-capitalization methodology you prefer, land between $674 billion and $715 billion. Blockonomi tabulated the high end of that range on May 2, noting that the aggregate nearly doubles the prior year's figure. The reaction from equity markets was bifurcated: companies that could point to revenue acceleration tied to AI workloads were rewarded; companies that asked investors to trust a multiyear payoff saw their shares marked down.

It is worth pausing on the scale. Seven hundred and fifteen billion dollars represents roughly 2.5 percent of United States GDP. It is larger than the market capitalization of all but a handful of publicly traded companies. It exceeds total federal spending on transportation, housing, and education. For readers who prefer a corporate frame: it is nearly three times the entire revenue base of Oracle, a company that has itself become a significant AI-infrastructure participant. The capital is real, the concrete is being poured, and the NVIDIA GB200 racks are being bolted into place from Ashburn, Virginia, to Västerås, Sweden.

The critical question that emerged from the earnings cycle is not whether the spending will occur. It will. The question is whether the incremental dollar of capex deployed in the second half of 2026 will earn a return that justifies the cost of capital, and what happens to the multiples assigned to these companies when the year-over-year growth rate of that spending line inevitably rolls over in 2027. On this point, the sell-side has begun to split.

The Goldman Bear Case and the Math That Undercuts It

Goldman Sachs published a note during earnings week arguing that AI capex is now consuming free cash flow at a rate that will force either dividend restraint, buyback reduction, or incremental leverage across three of the four hyperscalers by the fourth quarter. The analysts flagged a specific metric: maintenance capex plus AI investment is running at 1.7 times operating cash flow at Meta, a level that has historically preceded multiple compression in the enterprise-hardware sector. Forbes contributor Jon Markman examined the Goldman thesis on May 5 and identified a weakness: the firm's model assumes a 12 percent weighted average cost of capital for AI projects but applies it uniformly, ignoring the fact that Alphabet and Microsoft are financing a meaningful share of their builds through leases and structured debt at effective rates below 5.5 percent.

The distinction matters. If your hurdle rate is 12 percent, the present value of an AI workload that matures in 2029 looks unappealing. If your actual blended financing rate is closer to 5 percent, the same cash-flow stream clears the bar with room to spare. A long-only portfolio manager at a Boston fund, speaking off the record, put it this way: "The Goldman note got the numerator right and the denominator wrong. That is not a small error when you are discounting over a decade." The PM added that his firm has been adding to Alphabet and Microsoft on the post-earnings dip, specifically because the lease-adjusted cost of capital argument works in their favor.
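The sensitivity can be sketched with a toy discounted-cash-flow calculation. Every figure below is a hypothetical placeholder rather than anything from a filing; the point is only how much the choice of discount rate moves the answer when the revenue does not arrive until 2029 and beyond.

```python
# Illustrative only: hypothetical cash flows (in $ billions) for an AI
# buildout whose revenue does not ramp until year 3 (i.e., 2029).
def npv(rate, cashflows):
    """Present value of cashflows[t] received t years from now."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Year 0 = upfront 2026 capex; later entries are made-up revenue figures.
flows = [-100, 0, 0, 25, 40, 55, 60]

print(round(npv(0.12, flows), 1))   # at Goldman's assumed 12% hurdle rate
print(round(npv(0.055, flows), 1))  # at a ~5.5% lease-adjusted rate
```

With these invented numbers, the same cash-flow stream is barely positive at a 12 percent hurdle but clears comfortably at 5.5 percent, which is the entire substance of the numerator-versus-denominator dispute.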

The 2027 capex trajectory is where even the bulls grow cautious. Seeking Alpha published an analysis on May 1 that projects AI capex growth decelerating from more than 100 percent year-over-year in 2026 to between 15 and 30 percent in 2027. The arithmetic is straightforward: a $715 billion base leaves little room for triple-digit growth without absorbing the entirety of the S&P 500's free cash flow. At some point, the law of large numbers asserts itself. The question is whether the deceleration signals a healthy normalization or the beginning of a correction that drags down the semiconductor supply chain, the data-center REITs, and the debt markets that have funded the buildout.
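The base-effect arithmetic can be made concrete with two round numbers. The S&P 500 free-cash-flow figure below is an assumed illustrative estimate, not a sourced value; only the $715 billion base comes from the article.

```python
# Back-of-envelope check on why triple-digit capex growth cannot persist.
base_capex = 715   # $B, 2026 aggregate at the high end of the range
sp500_fcf = 1500   # $B, assumed annual S&P 500 free cash flow (illustrative)

doubled = base_capex * 2  # one more year of ~100% growth
print(doubled)                        # 1430
print(round(doubled / sp500_fcf, 2))  # ~0.95 of index-wide free cash flow
```

Another doubling would consume on the order of 95 percent of the entire index's free cash flow under this assumption, which is why even the bulls pencil in 15 to 30 percent for 2027.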

Meta's earnings report on April 29 provided the most dramatic illustration of the tension. The company raised its 2026 capex guidance to a midpoint of $98 billion, up from a prior $84 billion, representing roughly 45 percent of expected revenue. At the same time, it announced 11,000 additional layoffs, concentrated in non-AI engineering and policy roles. The Economic Times, reporting on the broader pattern on May 3, noted that the five largest tech employers have shed a combined 47,000 positions in the first four months of 2026, even as their aggregate revenue grew 14 percent year-over-year. The capital is being reallocated from labor to silicon.

"If you look at the total addressable market for these investments, we are still in the first or second inning. The infrastructure we are building today will serve workloads that have not yet been invented." (Meta CFO Susan Li, Q1 2026 earnings call, April 29)

Susan Li's framing has become the standard rebuttal to the capex-skeptic thesis across the sector. Alphabet's Ruth Porat used nearly identical language on her call the following day, and Amazon's Brian Olsavsky devoted a section of his prepared remarks to what he called "the capacity overbuild that isn't an overbuild." The argument rests on an empirical claim: that enterprise AI adoption is still in its earliest phase and that the inference workloads of 2029 and 2030 will consume multiples of the compute currently deployed. If that claim is correct, today's spending looks prescient. If it is incorrect, the assets become stranded.

The evidence for the bullish case is not trivial. Amazon Web Services reported that its AI-related revenue run rate crossed $65 billion in the March quarter, up from $38 billion a year earlier. Microsoft's Azure AI services grew 94 percent year-over-year, though the company declined to break out the absolute dollar figure. Alphabet's Cloud division, which includes its Vertex AI platform, posted 33 percent growth and expanded its operating margin by 400 basis points. Those growth rates suggest that at least some of the infrastructure spend is finding a revenue home.

The counterargument, articulated most forcefully by AI researcher Gary Marcus in an interview with Business Insider published April 30, is that revenue growth from AI services is being purchased at negative gross margins in many cases, particularly where hyperscalers are subsidizing inference to win enterprise accounts from competitors. Marcus called the collective spending "the greatest capital misallocation in history." He is not a neutral observer; he has been a persistent critic of the scaling hypothesis. But his point about subsidized pricing is one that even the hyperscalers acknowledge in private conversations with institutional investors.

Paul Tudor Jones offered a different perspective in a televised interview on May 7, suggesting the market is in a period analogous to late 1999, with the AI-driven rally having "another two years" to run before the cycle exhausts itself. Seeking Alpha's analysis of Jones's comments, published ten days ago, focused on the liquidity-squeeze dimension: as capex consumes an ever-larger share of corporate cash flow, the pool of capital available for share repurchases, which have been a critical support for Big Tech valuations, shrinks. Jones did not frame it as a contradiction; he framed it as a timing question.

What the Lease Liabilities Tell Us

One structural shift that emerged across the Q1 2026 filings deserves more attention than it received during earnings week: the hyperscalers are increasingly financing AI infrastructure through operating leases and structured finance vehicles rather than outright capital expenditure. The accounting treatment matters because operating leases keep the associated liabilities partially off the traditional debt-to-equity calculations that screeners and rating agencies use.

The rating agencies have begun to notice. Moody's placed Amazon's A1 senior unsecured rating on negative outlook in late April, explicitly citing the scale and pace of AI infrastructure commitments. S&P Global followed with a similar action on Meta's A+ rating on May 2. Neither action constitutes a downgrade, but both signal that the credit-rating apparatus is now treating AI capex not as a transient earnings-cycle phenomenon but as a structural shift in the leverage profile of the sector.

The capital-markets dimension is equally significant. Microsoft alone issued $22 billion in investment-grade debt during the March quarter, the largest single-quarter corporate bond issuance in history, with the proceeds explicitly earmarked for data-center construction and GPU procurement. Alphabet followed with an $18 billion offering in April. The bonds were oversubscribed, and the yields were modest: Microsoft's 10-year tranche priced at 4.85 percent, just 95 basis points over the comparable Treasury. The debt markets are, for now, fully accommodating the capex cycle. The question is what happens if and when the revenue curve fails to match the borrowing curve.

A metric to track: the ratio of AI-attributed revenue to AI-attributed capex, computed on a trailing-twelve-month basis and compared quarter-over-quarter. For the March 2026 quarter, the blended ratio across the four hyperscalers stood at approximately 0.31, meaning they are spending more than three dollars on AI infrastructure for every dollar of identifiable AI revenue. That ratio has improved from roughly 0.26 a year ago, and the pace of that improvement is what the bulls are watching. If the ratio inflects above 0.50 by the fourth quarter, the capex narrative shifts from "overbuild" to "ramp." If it stalls at 0.30, the bear case strengthens considerably.
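The metric itself is simple to formalize. The input figures below are invented round numbers chosen only to reproduce a blend near 0.31, since none of the four companies discloses a clean AI-attributed revenue-versus-capex split.

```python
# Trailing-twelve-month AI revenue per dollar of AI capex.
# Inputs are hypothetical $B figures; no hyperscaler reports a clean split.
def ai_revenue_capex_ratio(ttm_ai_revenue_bn, ttm_ai_capex_bn):
    return ttm_ai_revenue_bn / ttm_ai_capex_bn

ratio = ai_revenue_capex_ratio(200, 645)  # invented figures
print(round(ratio, 2))      # ~0.31
print(round(1 / ratio, 1))  # ~3.2 dollars of capex per dollar of AI revenue
```

Reading the ratio's reciprocal is often the more intuitive framing: a blend of 0.31 means roughly $3.20 of infrastructure spend per identifiable AI revenue dollar.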

The chip-supply constraint, which dominated discourse in 2024 and 2025, has eased. NVIDIA's B200 and B100 GPUs are now shipping in volume, and the lead times that stretched to 52 weeks in early 2025 have contracted to roughly 18 weeks. The binding constraint has shifted from supply to demand: the question is no longer whether the hyperscalers can acquire enough silicon, but whether the enterprise customers exist who can profitably consume the inference capacity that silicon enables. On that score, the Q1 earnings provided encouraging signals but not yet conclusive evidence.

Apple's earnings, reported on May 1, offered a useful counterpoint. The company's capex came in at $9.2 billion for the quarter, far below the hyperscaler cohort, and its AI strategy relies heavily on on-device inference and a partnership model with OpenAI that does not require massive proprietary infrastructure. Apple's shares rose 2.3 percent on earnings day while Microsoft declined 1.7 percent on the same afternoon. The market is rewarding capital efficiency in AI, not merely scale.

The next checkpoint arrives in mid-July, when the Q2 earnings cycle will provide the first read on whether the $715 billion deployment rate is holding, accelerating, or moderating. Between now and then, three variables will determine the direction of the debate: the trajectory of enterprise AI adoption as measured by cloud-provider bookings, the pace of further job cuts at the hyperscalers and whether they begin to attract political attention, and the shape of the yield curve, which will dictate the cost of the next round of bond issuance that the capex cycle requires. The market has priced in a soft landing for AI spending. The Q1 filings suggest that landing strip is narrower than the consensus assumes.
