The first AI chip that learns and infers has been developed, combining FeCAPs and memristors for efficient, adaptive edge computing.
According to Robin Mitchell, one of the significant barriers to progress in artificial intelligence is not algorithmic, but physical.
The challenge lies in running AI systems efficiently on hardware, as current approaches require large amounts of specialized silicon, massive energy consumption, and complicated cooling systems.
This approach is anything but sustainable, and if AI continues to scale at its current pace, the industry could face serious resource shortages.
Data centers already strain power grids, and the heat they generate demands industrial-scale cooling, often supported by water-intensive processes that carry environmental and economic costs.
Of all AI processes, training remains the most energy-intensive.