
The new GPUs and AI tools help developers and professionals run design, engineering, and AI workloads more easily and with better performance.

Intel has introduced a new lineup of graphics processing units (GPUs) and AI accelerators aimed at professionals and developers. The announcement highlights Intel’s continued focus on delivering high-performance tools for AI and workstation applications.
The Intel Arc Pro B-Series GPUs include the Arc Pro B60 and Arc Pro B50, expanding the existing Arc Pro family. These GPUs are designed for AI inference and professional workloads, offering configurations suited for demanding tasks in areas like engineering, media, and design.

Intel also launched the Gaudi 3 AI accelerators, which are now available in both PCIe card and rack-scale system formats. These accelerators provide scalable and open solutions for enterprise and cloud-based AI inference, helping customers run a wide range of AI models efficiently.
Additionally, Intel made the AI Assistant Builder publicly available on GitHub. This lightweight software framework allows developers to create local, purpose-built AI agents optimized for Intel hardware, enabling easier deployment of AI tools across personal and enterprise systems.
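The announcement does not detail AI Assistant Builder's own API, so the snippet below is only an illustration of the kind of local, purpose-built agent it targets: a small assistant that answers questions with a model running entirely on local Intel hardware. It uses OpenVINO GenAI as a stand-in runtime (an assumption, not something named in the article), and the model directory is hypothetical.

```python
# Illustrative sketch only: a tiny local "agent" that keeps inference on the
# machine, in the spirit of what AI Assistant Builder enables. OpenVINO GenAI
# is an assumed runtime here; the model directory is hypothetical.
import openvino_genai

# Path to a locally exported model (hypothetical location).
MODEL_DIR = "models/my-local-assistant"

# "GPU" targets an Intel GPU; "CPU" or "NPU" are other typical device strings.
pipe = openvino_genai.LLMPipeline(MODEL_DIR, "GPU")

def ask(question: str) -> str:
    """Answer a question entirely on local hardware."""
    prompt = f"You are a concise engineering assistant.\nQuestion: {question}\nAnswer:"
    return pipe.generate(prompt, max_new_tokens=128)

if __name__ == "__main__":
    print(ask("What is AI inference?"))
```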
The Intel Arc Pro B60 and B50 GPUs, built on the Xe2 architecture, feature Intel® Xe Matrix Extensions (XMX) AI cores and advanced ray tracing units. These additions bring powerful capabilities to professionals working in content creation, software development, and engineering.
Designed specifically for AI inference and workstation workloads, the Arc Pro B60 and B50 come with 24GB and 16GB of memory, respectively. They support multi-GPU setups, making them suitable for demanding professional environments. This expansion of Intel’s GPU lineup offers scalable, AI-ready solutions for creators and developers needing reliable performance.
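The article does not show how a multi-GPU workstation is addressed from software. As a minimal sketch, assuming a PyTorch build with Intel XPU support (PyTorch 2.5+ or Intel Extension for PyTorch), discovering and selecting one of several Arc Pro GPUs for inference could look like this; no specific model or workload is implied.

```python
# Minimal sketch: enumerate Intel GPUs from PyTorch and run a small tensor op
# on one of them. Assumes a PyTorch build with XPU (Intel GPU) support.
import torch

if torch.xpu.is_available():
    n = torch.xpu.device_count()
    print(f"Found {n} Intel GPU(s)")
    for i in range(n):
        print(f"  xpu:{i} -> {torch.xpu.get_device_name(i)}")

    # In a multi-GPU workstation, work can be pinned to a specific device.
    device = torch.device("xpu:0")
    x = torch.randn(1024, 1024, device=device)
    y = x @ x  # executes on the selected GPU
    print(y.shape, y.device)
else:
    print("No XPU device available; falling back to CPU")
```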
The GPUs are optimized for architecture, engineering, and construction (AEC) use, as well as AI inference tasks. They come with broad support from independent software vendors (ISVs) and work with both consumer and professional drivers on Windows. On Linux, they support a containerized software stack to simplify deployment and will receive ongoing updates for improved functionality. Their high memory capacity and strong software compatibility make them cost-effective choices for professionals building or scaling AI systems.
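The containerized Linux stack is described only at a high level. As a sketch of what deployment could look like, the snippet below starts a container with the GPU's Linux render nodes (/dev/dri) passed through, using the Docker SDK for Python; the image name, entrypoint, and model path are hypothetical placeholders, not part of Intel's announcement.

```python
# Sketch of launching a containerized inference workload on Linux with an
# Intel GPU exposed to the container. The image, command, and paths are
# hypothetical; the /dev/dri passthrough and Docker SDK calls are standard.
import docker

client = docker.from_env()

container = client.containers.run(
    "example.registry/intel-inference-stack:latest",   # hypothetical image
    command="python serve.py --model /models/demo",    # hypothetical entrypoint
    devices=["/dev/dri:/dev/dri"],   # expose Intel GPU render/card nodes
    volumes={"/srv/models": {"bind": "/models", "mode": "ro"}},
    detach=True,
)
print("Started container:", container.short_id)
```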