
Analog In-Memory Computing Attention Mechanism for Fast and Energy-Efficient Large Language Models

A Nature paper describes an analog in-memory computing (IMC) architecture tailored to the attention mechanism of large language models (LLMs), with the goal of drastically reducing latency and energy consumption during inference. The design leverages gain-cell crossbar arrays, capacitor-based memory devices built from oxide-semiconductor field-effect transistors (IGZO or ITO), to store the key (K) and value (V) …
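To make the data flow concrete, here is a minimal NumPy sketch of a single attention step in which the stored K and V matrices play the role of crossbar conductances and each matrix-vector product stands in for one analog in-memory multiply-accumulate. This is an illustration, not the paper's circuit model: the function name `imc_attention`, the `noise_std` knob, and the Gaussian noise term are assumptions of this sketch, and the charge-domain readout, gain-cell retention dynamics, and any hardware-friendly replacement for softmax are abstracted away.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def imc_attention(q, K, V, noise_std=0.0, rng=None):
    """One attention step for a single query vector q.

    K (T x d) and V (T x d) are treated as the contents of two
    crossbar arrays; each matrix-vector product below models one
    analog in-memory operation. The Gaussian noise is an
    illustrative stand-in for analog read non-idealities.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    d = q.shape[0]
    # First crossbar: dot products of the query with every stored key.
    scores = K @ q
    scores = scores + noise_std * rng.standard_normal(scores.shape)
    # Normalize the scores (standard softmax here for simplicity).
    weights = softmax(scores / np.sqrt(d))
    # Second crossbar: attention-weighted sum of the stored values.
    out = V.T @ weights
    out = out + noise_std * rng.standard_normal(out.shape)
    return out

# Tiny demo: 128 cached tokens, a 64-dimensional attention head.
rng = np.random.default_rng(1)
T, d = 128, 64
q = rng.standard_normal(d)
K = rng.standard_normal((T, d))
V = rng.standard_normal((T, d))
y = imc_attention(q, K, V, noise_std=0.01, rng=rng)
print(y.shape)  # (64,)
```

The point of the sketch is the mapping, not the arithmetic: because K and V stay resident in the crossbars, the two matrix-vector products that dominate attention never move the cached tokens through a memory bus, which is where the latency and energy savings claimed for the IMC design come from.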