
    Repositories list

    • A Flexible Framework for Experiencing Cutting-edge LLM Inference Optimizations
      Python
      Apache License 2.0
      Updated Feb 10, 2025
    • FlashInfer: Kernel Library for LLM Serving
      Cuda
      Apache License 2.0
      Updated Feb 10, 2025
    • Mooncake (Public)
      Mooncake is the serving platform for Kimi, a leading LLM service provided by Moonshot AI.
      C++
      Apache License 2.0
      Updated Feb 8, 2025
    • A Flexible Framework for Experiencing Cutting-edge LLM Inference Optimizations
      Python
      Apache License 2.0
      Updated Feb 8, 2025
    • vllm (Public)
      A high-throughput and memory-efficient inference and serving engine for LLMs
      Python
      Apache License 2.0
      Updated Feb 7, 2025