Intel LLM-Scaler-Omni Update Brings ComfyUI & SGLang Improvements On Arc Graphics
([Intel] 6 Hours Ago, LLM-Scaler-Omni 0.1.0-b5)
- Reference: 0001607023
- News link: https://www.phoronix.com/news/Intel-LLM-Scaler-Omni-0.1.0-b5
Following last week's [1]updated Intel LLM-Scaler-vLLM release for advancing vLLM usage on Intel Arc Graphics, LLM-Scaler-Omni is out today with a new release of that LLM-Scaler environment focused on image/voice/video generation using its Omni Studio and Omni Serving modes.
LLM-Scaler-Omni 0.1.0-b5 is the new release that adds Python 3.12 and PyTorch 2.9 support, delivering some additional performance benefits. The updated LLM-Scaler-Omni brings a number of ComfyUI upgrades, including support for new models and workflows such as Qwen-Image-Layered, Qwen-Image-Edit-2511, Qwen-Image-2512, and HY-Motion. The ComfyUI upgrades also include ComfyUI-GGUF support for enabling GGUF model usage.
LLM-Scaler-Omni 0.1.0-b5 also brings SGLang Diffusion updates, including CacheDiT support, tensor parallelism for multi-XPU inference, and SGLD ComfyUI custom node support.
There are also updated code samples and other improvements with the Docker image of LLM-Scaler-Omni 0.1.0-b5. Downloads and more details on the new LLM-Scaler-Omni release, further advancing AI capabilities on Intel Arc Graphics Battlemage hardware, are available via [2]GitHub.
[1] https://www.phoronix.com/news/Intel-LLM-Scaler-vLLM-0.11.1-b7
[2] https://github.com/intel/llm-scaler/releases/tag/omni-0.1.0-b5