github · vllm-project/vllm-omni
vllm-project/vllm-omni is trending on GitHub
A framework for efficient model inference with omni-modality models
Why it matters: A useful discovery radar beyond the fixed watchlist, though still secondary to the anchor sources.
Raw Extract
- Repo: vllm-project/vllm-omni
- Description: A framework for efficient model inference with omni-modality models
Notes
Pulled from the GitHub Trending daily feed and filtered against the project's signal profile.
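The filtering step mentioned above can be sketched as a simple keyword match between a repo's description and a signal profile. This is a minimal illustration only: the `SIGNAL_PROFILE` contents and the `matches_profile` helper are assumptions for the sketch, not the pipeline's actual implementation.

```python
# Hypothetical signal profile: keywords the watchlist cares about.
# (Illustrative assumption -- the real profile is not shown in this note.)
SIGNAL_PROFILE = {"inference", "llm", "multimodal", "omni-modality", "serving"}

def matches_profile(description: str, profile: set[str] = SIGNAL_PROFILE) -> bool:
    """Return True if any profile keyword appears in the repo description."""
    text = description.lower()
    return any(keyword in text for keyword in profile)

# Example: the entry above passes the filter (e.g. via "inference").
repo = {
    "name": "vllm-project/vllm-omni",
    "description": "A framework for efficient model inference with omni-modality models",
}
print(matches_profile(repo["description"]))
```

A substring match like this is the simplest possible filter; a real signal profile might instead weight keywords, match on topics/tags, or exclude repos already on the fixed watchlist.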