WhoKnows.

Daily Briefing — May 7, 2026


01

Mira Murati told the court she couldn't trust what Sam Altman said

The Verge →
Career & skills + Money & markets

Mira Murati, OpenAI's former CTO, testified under oath in the Musk v. Altman trial that Altman told her a new AI model had been cleared by legal and didn't need to go through the deployment safety board. She checked with the general counsel directly. The stories didn't match. So she ran the model through the safety board anyway.

Her grievance was specific. She wasn't saying Altman was a bad boss or OpenAI was a bad place to work. She was saying she couldn't do her job because her CEO wasn't being straight with her about safety decisions.

That should bother anyone watching how AI governance actually works inside these companies. When safety relies on people acting in good faith instead of systems that make shortcuts impossible, you have a problem.

SO WHAT

Safety processes only work if they don't depend on the honesty of the person at the top. Look at how your own org handles safety or compliance reviews, and ask whether the process would survive someone senior deciding to skip it.


02

Global finance watchdog flags private credit risk in the AI boom

The Guardian Tech →
Money & markets + Tech shifts

The Financial Stability Board, the global referee for financial system risk, just flagged the private credit industry as a weak spot in the AI funding chain. AI companies have been borrowing heavily from private lenders to build datacentres, and the bet is starting to look concentrated. The AI sector took more than a third of all private credit deals in 2025, up from an average of 17% across the previous five years. That's a fast move.

The FSB isn't predicting a crash. The point is that when one industry absorbs that much private lending that quickly, the whole stack gets sensitive to a single bad season. If AI valuations correct hard, the losses don't stay inside tech. They ripple back through private credit funds, and from there into the broader system.

SO WHAT

For anyone working in or around tech, this matters. The funding climate isn't as bulletproof as the hype suggests. A tighter credit environment could slow hiring, shrink tooling budgets, and cool the infrastructure buildout faster than most roadmaps assume.


03

Chrome silently installs a 4 GB AI model on your device

Hacker News →
Tech shifts + What to do

Google Chrome has been quietly installing a 4 GB file called weights.bin onto user devices. The file holds the weights for Gemini Nano, Google's on-device LLM, and lives in a folder called OptGuideOnDeviceModel. Delete it and Chrome downloads it again next launch. No consent dialog. No opt-out.
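If you want to see whether the file is sitting on your own machine, here is a minimal sketch. The profile locations below are assumptions that vary by OS and Chrome channel; only the folder name `OptGuideOnDeviceModel` comes from the report.

```python
from pathlib import Path

# Assumed default Chrome profile roots; adjust for your OS and channel.
CANDIDATE_ROOTS = [
    Path.home() / ".config" / "google-chrome",                              # Linux
    Path.home() / "Library" / "Application Support" / "Google" / "Chrome",  # macOS
    Path.home() / "AppData" / "Local" / "Google" / "Chrome" / "User Data",  # Windows
]

def find_on_device_models(roots=CANDIDATE_ROOTS):
    """Return (path, total_bytes) for any OptGuideOnDeviceModel folders found."""
    hits = []
    for root in roots:
        if not root.exists():
            continue
        for d in root.rglob("OptGuideOnDeviceModel"):
            if not d.is_dir():
                continue
            size = sum(f.stat().st_size for f in d.rglob("*") if f.is_file())
            hits.append((d, size))
    return hits

for path, size in find_on_device_models():
    print(f"{path}: {size / 1e9:.1f} GB")
```

Deleting whatever this turns up is, per the report, pointless: Chrome fetches it again on the next launch.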

Chrome has billions of users. Privacy advocate Alexander Hanff, who flagged the install, estimates the carbon cost of pushing this one file across that base at somewhere between six thousand and sixty thousand tonnes of CO2 equivalent. One company made that call unilaterally — no privacy notice, no terms update, no climate disclosure.
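The wide range is a back-of-envelope calculation, and the arithmetic is easy to reproduce. In the sketch below, the user count, energy-per-gigabyte figure, and grid carbon intensity are all illustrative assumptions, not numbers from the article; the wide spread in published network-energy estimates is what drives the order-of-magnitude gap.

```python
def transfer_carbon_tonnes(file_gb, users, kwh_per_gb, kg_co2_per_kwh=0.4):
    """Tonnes of CO2e to push one file to a user base.

    kg_co2_per_kwh=0.4 is a rough global grid average (assumption).
    """
    return file_gb * users * kwh_per_gb * kg_co2_per_kwh / 1000

# 4 GB file, ~3 billion users (assumed), with low and high
# published estimates for network transfer energy per GB.
low = transfer_carbon_tonnes(4, 3e9, kwh_per_gb=0.001)   # ~4,800 tonnes
high = transfer_carbon_tonnes(4, 3e9, kwh_per_gb=0.01)   # ~48,000 tonnes
print(f"{low:,.0f} to {high:,.0f} tonnes CO2e")
```

With these assumed inputs the result lands in the same order of magnitude as the quoted six-thousand-to-sixty-thousand-tonne range.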

SO WHAT

If you work in compliance, data protection, or product governance, this is going to land on your desk. EU regulators especially are going to want answers about consent when AI vendors start treating user devices as distributed compute. Your third-party software policy probably assumes you have visibility over silent vendor installs. You almost certainly don't.