- John Carmack has shared an idea for using fiber optic cable rather than RAM
- This is a vision of the future for replacing RAM modules in AI workloads
- While it’s highly theoretical and a long way off, there are other possible nearer-term solutions to reduce AI’s all-consuming appetite for RAM
John Carmack has aired an idea to effectively use fiber optic cables as ‘storage’ in place of conventional RAM modules – a particularly intriguing vision of the future given the current memory crisis and the havoc it’s wreaking.
Tom’s Hardware spotted a post on X from the id Software cofounder, in which Carmack proposes that a very long fiber optic cable – 200km long, to be precise – could effectively fill in for system RAM, at least when working with AI models.
Carmack observes: “256Tb/s data rates over 200km distance have been demonstrated on single-mode fiber optic, which works out to 32GB of data in flight, ‘stored’ in the fiber, with 32TB/s bandwidth. Neural network inference and training [AI] can have deterministic weight reference patterns, so it is amusing to consider a system with no DRAM, and weights continuously streamed into an L2 cache by a recycling fiber loop.”
In other words, that length of fiber forms a loop in which the needed data (normally held in RAM) is “continuously streamed”, keeping the AI processor constantly fed. This works because AI model weights can be read sequentially in a predictable order – it wouldn’t be possible otherwise. Compared to traditional RAM, it could also be a power-saving, more eco-friendly way of completing these tasks.
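To put those numbers in context: the amount of data ‘stored’ in the fiber is simply the link rate multiplied by how long the light takes to traverse the loop. Here’s a minimal back-of-the-envelope sketch in Python, assuming a typical refractive index of around 1.5 for the fiber (that figure is our assumption, not Carmack’s):

```python
# Back-of-the-envelope check of Carmack's fiber-as-memory numbers.
# Assumption (not in Carmack's post): a refractive index of ~1.5,
# meaning light in the fiber travels at roughly 2e8 m/s.

LINK_RATE_BPS = 256e12        # 256 Tb/s demonstrated on single-mode fiber
FIBER_LENGTH_M = 200e3        # 200km loop
SPEED_IN_FIBER = 3e8 / 1.5    # speed of light divided by the assumed index

transit_time_s = FIBER_LENGTH_M / SPEED_IN_FIBER    # ~1ms around the loop
bits_in_flight = LINK_RATE_BPS * transit_time_s     # data "stored" in transit

print(f"Transit time:   {transit_time_s * 1e3:.1f} ms")   # 1.0 ms
print(f"Data in flight: {bits_in_flight / 8 / 1e9:.0f} GB")  # 32 GB
print(f"Bandwidth:      {LINK_RATE_BPS / 8 / 1e12:.0f} TB/s")  # 32 TB/s
```

Run that and you get the same figures Carmack quotes: roughly 32GB of data in flight at any moment, delivered at 32TB/s.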
As Carmack points out, this is the “modern equivalent of the ancient mercury echo tube memories”, or delay-line memory – an early storage technique in which data was kept alive as waves traveling through a medium, such as a column of mercury or, in later designs, a coiled wire.
As mentioned, this isn’t feasible now – it’s a concept for the future – but Carmack argues it’s a conceivable path forward, one that may have a “better growth trajectory” than traditional DRAM.
Analysis: flash forward
There are very obvious problems with RAM right now in terms of supply and demand, with the latter far outstripping the former thanks to the rise of AI and the huge memory requirements it brings. (That’s not just for servers in data centers that field queries to popular AI models, but for the video RAM in AI accelerator boards, too.)
So what Carmack is envisioning is a different way of operating with AI models, one that uses fiber loops instead. This could, in theory, leave the rest of us free to stop worrying about RAM costing a ridiculous amount of cash (or a PC, or a graphics card – the knock-on pricing effects of the memory crisis go on and on).
The problem is that there are a lot of issues with such a fiber proposition, as Carmack acknowledges. That includes the sheer quantity of fiber needed and difficulties around maintaining the signal strength through the loop.
However, there are other possibilities along these lines, and other people have been talking about similar concepts over the past few years. Carmack mentions: “Much more practically, you should be able to gang cheap flash memory together to provide almost any read bandwidth you require, as long as it is done a page at a time and pipelined well ahead. That should be viable for inference serving today if flash and accelerator vendors could agree on a high-speed interface.”
In other words, picture an army of cheap flash memory chips working massively in parallel; the key, as Carmack notes, would be agreeing on a high-speed interface that lets those chips feed the AI accelerator directly.
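For a sense of the scale involved, here’s a rough sizing sketch in Python. The per-die read rate, target bandwidth, page size, and latency figures are illustrative assumptions on our part, not numbers from Carmack’s post:

```python
# Rough sizing of a parallel flash array for streaming inference weights.
# All figures below are illustrative assumptions, not vendor specs:
# a single NAND die sustaining ~2 GB/s sequential reads, and a target
# of feeding an accelerator at 1 TB/s.

import math

PER_DIE_READ_BPS = 2e9       # assumed sustained sequential read per flash die
TARGET_BANDWIDTH = 1e12      # assumed accelerator feed rate: 1 TB/s
PAGE_SIZE_BYTES = 16 * 1024  # typical NAND page, read "a page at a time"
PAGE_READ_LATENCY = 50e-6    # assumed ~50us page read latency

dies_needed = math.ceil(TARGET_BANDWIDTH / PER_DIE_READ_BPS)

# To hide the page-read latency, each die must keep several page reads
# pipelined "well ahead", as Carmack puts it.
reads_in_flight_per_die = math.ceil(
    PER_DIE_READ_BPS * PAGE_READ_LATENCY / PAGE_SIZE_BYTES
)

print(f"Dies needed for 1 TB/s:  {dies_needed}")            # 500
print(f"Pipelined reads per die: {reads_in_flight_per_die}")  # 7
```

With those assumed figures, a few hundred commodity flash dies would do the job, each keeping a handful of page reads in flight to hide latency – straightforward on paper, but only if that shared interface actually exists.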
This is an interesting nearer-term proposition, but one that relies on the relevant manufacturers (of AI GPUs and storage) getting their act together and hammering out a new system in this vein.
The RAM crisis is forecast to last through this year, and likely next year too, potentially dragging on even longer than that, with all sorts of pain for consumers. So exploring alternative approaches to memory for AI models could be a valuable pursuit – one that might help ensure this RAM crisis is the last such episode we have to suffer through.
