A CPU-based AI system developed by an Indian start-up in collaboration with IIT Madras has cut inferencing costs by nearly 50% compared with global benchmarks by running large language models on CPUs instead of expensive GPUs. The system is known as ‘Kompact’.
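
To make the idea concrete, the sketch below shows what generic CPU-only inference of a language model looks like using the open-source Hugging Face Transformers library. It is purely illustrative and says nothing about Kompact’s own (proprietary) implementation; the model name "distilgpt2" and the prompt are arbitrary choices for the example.

    # Minimal sketch of generic CPU-only LLM inference (illustrative only;
    # not Kompact's implementation). Assumes the Hugging Face Transformers
    # library and PyTorch are installed.
    import torch
    from transformers import pipeline

    # device=-1 forces the pipeline onto the CPU; no GPU is required.
    generator = pipeline(
        "text-generation",
        model="distilgpt2",          # arbitrary small model chosen for the example
        device=-1,
        torch_dtype=torch.float32,   # full precision; quantization could reduce cost further
    )

    result = generator("Running language models on CPUs", max_new_tokens=30)
    print(result[0]["generated_text"])

Running models this way trades raw speed for hardware that is far cheaper and more widely available than data-centre GPUs, which is the cost lever the Kompact approach targets.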