AI GPU accelerators with 6TB of HBM memory could appear by 2035 as AI GPU die sizes are set to shrink – but there’s far worse coming up
- Future AI memory chips could demand more power than entire industrial zones combined
- 6TB of memory in one GPU sounds amazing until you see the power draw
- HBM8 stacks are impressive in theory, but terrifying in practice for any energy-conscious enterprise

The relentless drive to expand AI processing power is ushering in a new era…
