Last week, Anthropic and OpenAI released new flagship models within minutes of each other. Buried in the technical specs was a number that matters more than most of the benchmark scores the companies like to tout: Anthropic’s Claude Opus 4.6 now offers a one-million-token context window, the first time its top-tier model has reached that threshold. If that sentence means nothing to you, this article is for you. Context windows are one of those under-discussed technical details that shape what AI can and cannot do in practice, and their relentless expansion is contributing to a global memory shortage that is about to make a lot of everyday electronics more expensive.
For now, the practical picture is this: AI’s working memory is getting dramatically bigger and, at last, meaningfully more reliable. That is genuinely useful if you work with long documents, complex projects, or extended conversations. It is also genuinely expensive, not just for the companies running the models, but for anyone buying a laptop, a phone, or a TV in 2026. The insatiable memory of the machine turns out to be everyone’s bill to pay.