Orhun Parmaksız 👾
@orhun@fosstodon.org · 6 days ago

New TUI dropped for managing LLM traffic and GPU resources 🔥

🌀 **ollamaMQ** — Async message queue proxy for Ollama

💯 Per-user queues, fair-share scheduling, OpenAI-compatible endpoints, streaming

🦀 Written in Rust & built with @ratatui_rs

⭐ GitHub: https://github.com/Chleba/ollamaMQ

#rustlang #ratatui #tui #gpu #llm #ollama #backend #proxy #terminal
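Since the post advertises OpenAI-compatible endpoints with streaming, any standard OpenAI-style chat client should be able to talk to the proxy. A minimal sketch of what such a request body looks like — the URL, port, and model name below are assumptions for illustration, not defaults taken from the ollamaMQ README:

```python
import json

# Assumed proxy address -- check the ollamaMQ README for the real default.
OLLAMAMQ_URL = "http://localhost:11434/v1/chat/completions"

# An OpenAI-style chat completion payload; "llama3" is a placeholder for
# whatever model the Ollama backend actually serves.
payload = {
    "model": "llama3",
    "stream": True,  # streaming is listed among the supported features
    "messages": [
        {"role": "user", "content": "Hello from behind the proxy!"},
    ],
}

# A client (or the official OpenAI SDK pointed at OLLAMAMQ_URL) would POST
# this JSON body; the proxy enqueues it in the sender's per-user queue and
# forwards it to Ollama under fair-share scheduling.
body = json.dumps(payload).encode("utf-8")
```

The point of OpenAI compatibility is exactly this: existing tooling keeps working unchanged, with only the base URL swapped for the proxy's address.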

[Attached: demo GIF of the ollamaMQ TUI dashboard]
GitHub — Chleba/ollamaMQ: High-performance Ollama proxy with per-user fair-share queuing, round-robin scheduling, and a real-time TUI dashboard. Built in Rust.

Indieweb Studio — a relaxed, online social space for the indieweb community, brought to you by indieweb.social.