Machine learning has advanced considerably in recent years, with algorithms achieving human-level performance on diverse tasks. The real challenge, however, lies not just in building these models but in deploying them efficiently in real-world applications. This is where AI inference takes center stage, emerging as a critical focus for practitioners.