Researchers from Purdue University and the Georgia Institute of Technology argue that the growing energy demands of artificial‑intelligence models stem from the longstanding "memory wall": the gap between fast processors and slower memory. By adopting brain‑inspired architectures that integrate processing and memory, such as spiking neural networks and compute‑in‑memory (CIM) designs, they claim AI can become far more energy‑efficient. The study highlights potential benefits for devices ranging from medical equipment to drones, suggesting that a shift away from traditional von Neumann designs could dramatically cut power consumption while preserving performance.
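To give a sense of the event‑driven computation behind spiking neural networks, here is a minimal sketch of a leaky integrate‑and‑fire (LIF) neuron, their basic building block; the parameter values are illustrative assumptions for this example and do not come from the study.

```python
import numpy as np

# Illustrative leaky integrate-and-fire (LIF) neuron. All parameters
# (tau, thresholds, input current) are assumed for demonstration only.
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Return spike times (ms) for an input-current trace, one value per step."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        # Leaky integration: membrane potential decays toward rest
        # while the input current pushes it upward.
        v += (dt / tau) * (v_rest - v + i_in)
        if v >= v_threshold:          # fire only when the threshold is crossed
            spikes.append(step * dt)  # record the spike time
            v = v_reset               # reset the membrane potential
    return spikes

# A constant drive produces sparse, periodic spikes; between spikes the
# neuron is idle, which is where the claimed energy savings come from.
current = np.full(100, 1.5)
print(simulate_lif(current))
```

Unlike a conventional dense layer, which multiplies every input on every step, an LIF neuron only communicates when it spikes, so hardware that implements it natively can stay idle most of the time.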