Artificial Intelligence has made significant progress in recent years, with systems matching human capabilities across a range of tasks. The real challenge, however, lies not just in training these models but in deploying them efficiently for everyday use. This is where AI inference comes into play, emerging as a critical focus for experts and practitioners alike.