Tech Digest – February 5, 2026

The Space Race for Compute

SpaceX Files for One Million Orbital Data Centers — Musk Plans Lunar Satellite Factories

The FCC has accepted SpaceX’s application to launch up to one million satellites designed to function as orbital data centers, powered by near-constant solar energy. SpaceX described the system as “a first step towards becoming a Kardashev II-level civilization” in its filing. Elon Musk, who recently merged xAI into SpaceX at a combined $1.25 trillion valuation, stated that “anything less than K2 is feeble” and outlined plans for lunar manufacturing: “Factories on the Moon can take advantage of lunar resources to manufacture satellites and deploy them further into space. By using an electromagnetic mass driver and lunar manufacturing, it is possible to put 500 to 1,000 TW/year of AI satellites into deep space.” He estimates space-based AI compute will be cheaper than terrestrial alternatives within two to three years. Meanwhile, China confirmed plans to land astronauts on the Moon by 2030 to establish its own base — a separate but parallel bid for off-planet infrastructure.

Note: The filing is preliminary and the FCC has not approved it. But the direction is concrete: compute demand is outgrowing what terrestrial power grids and water supplies can sustain, and the people writing the biggest checks are now looking off-planet. Public comment closes March 6.

Sources: FCC, Fortune, SpaceNews, IEEE Spectrum

Russian Spy Satellites Intercepted Communications from a Dozen European Satellites

European security officials disclosed that two Russian spacecraft — Luch-1 and Luch-2 — have intercepted communications from at least a dozen key European geostationary satellites over the past three years, making risky close approaches and lingering for weeks. Major General Michael Traut, head of Germany’s military space command, confirmed both vehicles are suspected of conducting signals intelligence. The concern: many European satellites transmit unencrypted command data because they were launched decades ago without advanced onboard encryption. Officials warn that recorded command sequences could later be replayed to manipulate satellite trajectories or cause controlled deorbits. Russia has also launched newer maneuverable spacecraft — Cosmos 2589 and 2590 — with similar capabilities.

Note: German Defence Minister Pistorius called satellite networks “the Achilles’ heel of modern societies.” For any institution that depends on satellite-relayed communications, GPS, or positioning data — which is nearly all of them — this is a concrete infrastructure vulnerability, not an abstract threat.

Sources: Ars Technica, Financial Times

AI Capabilities Spiking

GPT-5.2 Achieves 6.6-Hour Autonomy on Complex Tasks — OpenAI Confirms Recursive Self-Improvement Goal

METR, the AI evaluation organization, measured GPT-5.2 with “high” reasoning effort at a 50%-time-horizon of approximately 6.6 hours on complex software tasks — the highest autonomy measurement they have ever reported. Separately, OpenAI’s Chief Research Officer Mark Chen confirmed the lab’s goal is recursive self-improvement: building AI systems that improve themselves with decreasing human involvement. The two data points map onto each other — autonomous capability is extending from minutes to hours, and the research direction aims to remove the ceiling entirely.

Note: Six months ago, autonomous AI could reliably handle tasks lasting minutes. Now it’s hours. That curve matters for every institution planning IT projects, procurement timelines, or staffing levels over the next 18 months.

Sources: METR, Mark Chen (OpenAI CRO)

Sam Altman: OpenAI Has “Basically Built AGI”

OpenAI CEO Sam Altman stated publicly that the company has “basically built AGI” and warned that models are about to become “extremely powerful” and fast. The comments came alongside the appointment of a new Head of Preparedness, a role focused on safety and risk management for increasingly capable systems. Regardless of where one places the AGI threshold, the statement from the CEO of the most capitalized AI company signals that internal capability assessments have shifted meaningfully.

Sources: Forbes, Sam Altman

China’s Kimi K2.5 Sets New Open-Weight Record on Capabilities Index

Moonshot AI’s Kimi K2.5 set a new record among open-weight models on Epoch AI’s Capabilities Index, scoring 147 — roughly on par with o3, Grok 4, and Sonnet 4.5. It still trails the overall frontier, but the gap between open-weight and closed models continues to compress.

Note: Open-weight models at near-frontier performance give institutions procurement options that don’t require long-term vendor lock-in. Worth tracking for anyone evaluating AI deployments where data sovereignty matters.

Sources: Epoch AI

Capital Flows: Building the Machine

Alphabet Plans Up to $185 Billion in Capex — Nearly Double Last Year

Alphabet reported Q4 2025 revenue of $113.8 billion (up 18% YoY) and announced 2026 capital expenditure guidance of $175–185 billion — roughly double the $91.4 billion spent in 2025 and far above the $115 billion analysts had forecast. Google Cloud revenue surged 48% to $17.7 billion, its fastest growth in four years, with a backlog that more than doubled year-over-year to $240 billion. CEO Sundar Pichai noted the company remains supply-constrained despite ongoing expansion. Roughly 60% of capex goes to servers, 40% to data centers and networking.

Note: When a single company plans to spend more on infrastructure than the GDP of most EU member states, the downstream effects — on energy grids, real estate, construction labor, and chip supply — ripple into every digital procurement timeline.

Sources: Alphabet Earnings Release, Financial Times, CNBC

Nvidia Nears $20 Billion Investment in OpenAI

Nvidia is close to finalizing a $20 billion investment in OpenAI’s latest funding round, according to Bloomberg. The deal would deepen the coupling between the dominant chip supplier and the dominant model provider — concentrating capital and supply-chain leverage at the top of the AI stack.

Sources: Bloomberg

Cerebras Raises $1 Billion at $23 Billion Valuation

AI chip company Cerebras raised another $1 billion in funding at a $23 billion valuation. Cerebras builds wafer-scale processors designed as an alternative to Nvidia GPUs for AI training and inference — one of a small number of companies offering a different path for compute infrastructure.

Note: Every billion flowing into non-Nvidia chip companies is a signal that the market expects compute demand to outstrip any single supplier’s capacity. For procurement planners, the chip landscape is diversifying.

Sources: Bloomberg

At Top AI Labs, Compute Costs Now Exceed All Other Spending Combined

Epoch AI analyzed financials from Anthropic, Minimax, and 01.AI and found that compute costs at these companies now exceed salaries, marketing, and all other spending combined. The finding quantifies what the capex numbers imply: AI development is becoming an infrastructure-first industry where raw compute is the binding constraint.

Sources: Epoch AI

Legacy Software Under Siege

Anthropic Cowork Plugins Trigger ~$285 Billion Selloff Across Software and Financial Data Stocks

Anthropic released specialized Cowork plugins for legal, finance, sales, and other knowledge-work domains — automating contract review, NDA triage, compliance workflows, and data analysis. Markets reacted immediately: Thomson Reuters fell 18% (its worst day on record), RELX dropped 14%, LegalZoom sank 20%, and a Goldman Sachs basket of US software stocks fell 6%. Bloomberg estimated roughly $285 billion in market value was erased in a single session, with the selloff cascading into European and Asian markets the following day.

Note: The speed of the market reaction is the story. One product announcement from one company wiped out the equivalent of a major European bank’s market cap — not because the tool replaced everything overnight, but because investors suddenly priced in the structural risk to seat-based software licensing. Any institution paying per-seat for legal research, compliance, or data analytics should be watching this closely.

Sources: Bloomberg, Wall Street Journal, Artificial Lawyer

Gemini Passes 750 Million Monthly Users — OpenAI’s Mobile Share Drops to 45%

Google’s Gemini app surpassed 750 million monthly active users, up from 650 million last quarter. Meanwhile, OpenAI’s share of the mobile AI market has contracted to 45%, down from a dominant position a year ago. The AI tool market is becoming genuinely multi-vendor — a shift that matters for procurement strategy and long-term dependency planning.

Sources: TechCrunch, Big Technology

Agent Infrastructure Taking Shape

1.5 Million AI Agents Join Moltbook in Days — And They Can Now Rent Humans

Moltbook, an AI-only social network built on the OpenClaw framework, surged from 30,000 to over 1.5 million registered AI agents within days of launch. The agents post, comment, upvote, form communities, and interact autonomously — though security researchers at Wiz found only about 17,000 humans behind them. The platform exposed massive security vulnerabilities: API keys, login tokens, and email addresses were left unprotected, and any post could serve as a prompt injection vector. Separately, RentAHuman launched as a service where AI agents can hire humans for tasks requiring physical presence — one human was paid $100 by an agent to hold a sign reading “AN AI PAID ME TO HOLD THIS SIGN.”

Note: The agent count is inflated and the security was atrocious — but the experiment demonstrated something real: autonomous agents can now coordinate, form structures, and even contract human labor, all at machine speed. Palo Alto Networks published an advisory specifically on the governance implications. Anyone deploying agents internally should be reading it.

Sources: The Verge, Axios, Palo Alto Networks, RentAHuman

Robotics Scaling for Infrastructure

Bedrock Robotics Raises $270 Million to Automate Excavators for Data Center Construction

Bedrock Robotics raised $270 million to automate multi-ton excavators for constructing data centers. The funding targets one of the most concrete bottlenecks in the AI buildout: physical construction capacity. As hyperscalers commit hundreds of billions to new facilities, the labor to build them is in short supply.

Note: The loop closes itself: AI needs data centers, data centers need construction, construction is labor-constrained, so AI automates construction. This is what capital-driven acceleration looks like in practice.

Sources: New York Times

Similar Posts