A tiny, practical tool that gives you instant visibility into your Ollama workloads without cluttering your desktop.
If you’re running Ollama locally on Windows—whether you’re tinkering with LLMs, building local AI demos, or just curious about the overhead of large models—you’ve probably wondered: Is it still running? How much CPU is it chewing? Did that model load?
Welcome to ElBruno.OllamaMonitor.
It’s a no-frills system tray app that sits in your Windows notification area and tells you, at a glance, exactly what your local Ollama instance is doing. No dashboards. No complexity. Just real-time status, resource metrics, and a floating details window when you need more info.
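Under the hood, a monitor like this essentially polls Ollama's HTTP API. As a rough illustration (not the app's actual code), here's a Python sketch that turns an `/api/ps`-style response into the kind of one-line summary a tray tooltip could show; the payload shape below is simplified:

```python
import json

def summarize_ps(payload: str) -> str:
    """Turn an Ollama /api/ps-style response into a one-line status summary."""
    models = json.loads(payload).get("models", [])
    if not models:
        return "Ollama: idle (no models loaded)"
    parts = [f"{m['name']} ({m.get('size', 0) / 1e9:.1f} GB)" for m in models]
    return "Ollama: " + ", ".join(parts)

# Illustrative payload shaped like a /api/ps response (simplified)
sample = '{"models": [{"name": "llama3.2:3b", "size": 3300000000}]}'
print(summarize_ps(sample))  # Ollama: llama3.2:3b (3.3 GB)
```

The real app does considerably more (CPU, memory, and GPU sampling), but the polling loop is conceptually this simple.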
You could open Task Manager. You could open a terminal and curl the Ollama API. You could build a web dashboard.
But you probably don’t want to. You want something that Just Works™—that launches with your system, stays out of the way, and gives you instant feedback without thinking about it.
That’s the philosophy behind this tool.
When you run ollamamon, you get a simple CLI:
ollamamon # Launch app
ollamamon --help # Show help
ollamamon config # View settings
ollamamon config set endpoint <url> # Change endpoint
ollamamon config set refresh-interval <sec> # Change polling rate
ollamamon config reset # Reset defaults
You’re prototyping a .NET app that uses local embeddings or inference. You need to know: Is Ollama running? Is the model loaded? Is there enough headroom to keep iterating?
Answer: Glance at the tray. Done.
You’re live-demoing a local AI feature. Suddenly, the inference slows down. Your audience wonders: Is it CPU-bound? GPU-saturated? Out of memory?
Answer: Click the tray icon, show the floating window with real-time metrics, explain what’s happening. Credibility earned.
You’re running Ollama in the background while you work. Occasionally, your system gets sluggish. Is it Ollama? Is the model loading? Is it GPU-bound?
Answer: Check the tray. If it’s orange, you know Ollama is hogging resources. You can decide whether to wait or pause it.
You’re running tests that depend on a local Ollama instance. You want to verify Ollama is healthy before running tests.
Answer: Use the CLI:
ollamamon config # Returns current config
curl http://localhost:11434/api/version # Or curl the endpoint the app reports
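If you want that health gate as a scripted check rather than a manual curl, a small probe against Ollama's `/api/version` endpoint does the job. A hedged Python sketch (the endpoint is Ollama's standard version route; the timeout value is an arbitrary choice):

```python
import json
import urllib.error
import urllib.request

def ollama_healthy(base_url: str = "http://localhost:11434",
                   timeout: float = 2.0) -> bool:
    """Return True if Ollama's /api/version endpoint answers with a version."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/version", timeout=timeout) as resp:
            return "version" in json.load(resp)
    except (urllib.error.URLError, OSError, ValueError):
        return False
```

Call it at the top of your test fixture and skip (or fail fast) when it returns False, instead of letting every test time out individually.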
dotnet tool install --global ElBruno.OllamaMonitor
That’s it. The ollamamon command is now in your PATH.
Clone the repository, build, and run:
git clone https://github.com/ElBruno/ElBruno.OllamaMonitor.git
cd ElBruno.OllamaMonitor
dotnet build
dotnet run --project src/ElBruno.OllamaMonitor/
All settings are stored in a JSON file:
%LOCALAPPDATA%\ElBruno\OllamaMonitor\settings.json
Default values are sensible for most use cases:
{
  "endpoint": "http://localhost:11434",
  "refreshIntervalSeconds": 2,
  "startMinimizedToTray": true,
  "enableGpuMetrics": true,
  "highCpuThresholdPercent": 80,
  "highMemoryThresholdGb": 16,
  "highGpuThresholdPercent": 85
}
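Exactly how those thresholds map to the tray icon's color is internal to the app, but the idea is easy to sketch. In this hedged Python illustration, the any-threshold-exceeded rule is an assumption on my part, not the app's documented behavior:

```python
def status_color(cpu_pct: float, mem_gb: float, gpu_pct: float,
                 cfg: dict) -> str:
    """Map current metrics to a tray color using settings.json thresholds.
    Rule (assumed): orange when any configured threshold is exceeded."""
    over = (
        cpu_pct > cfg["highCpuThresholdPercent"]
        or mem_gb > cfg["highMemoryThresholdGb"]
        or (cfg.get("enableGpuMetrics", True)
            and gpu_pct > cfg["highGpuThresholdPercent"])
    )
    return "orange" if over else "green"

# Defaults mirroring the settings.json above
defaults = {
    "highCpuThresholdPercent": 80,
    "highMemoryThresholdGb": 16,
    "highGpuThresholdPercent": 85,
    "enableGpuMetrics": True,
}
print(status_color(95, 8, 40, defaults))  # high CPU -> orange
```

Raising the thresholds in settings.json simply moves the point at which the icon flips to the warning state.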
Need to monitor a remote Ollama instance? Just set the endpoint:
ollamamon config set endpoint http://192.168.1.100:11434
Under the hood it uses:
H.NotifyIcon.Wpf for clean tray integration
nvidia-smi for GPU metrics if available, gracefully degrading otherwise
This is the foundational release: real-time status, resource metrics, and the floating details window.
There’s more on the roadmap for Phase 2.
dotnet tool install --global ElBruno.OllamaMonitor # 1. Install
ollama serve # 2. Start Ollama
ollamamon # 3. Launch the monitor
Look at the system tray for the status icon. Click it to see details.
ollamamon config set refresh-interval 5 # Slower polling
ollamamon config set endpoint http://192.168.1.50:11434 # Remote Ollama
Comprehensive docs are in the repository:
Found a bug? Have an idea? Open an issue or PR on GitHub.
The codebase is clean, documented, and ready for contributions. Phase 1 is intentionally minimal—Phase 2 is where we add the fancy stuff.
ElBruno.OllamaMonitor is proof that sometimes the best tools are the simplest ones. No bloat. No complexity. Just a tray icon that tells you what’s going on.
Whether you’re a local AI enthusiast, a .NET developer, or just someone who runs Ollama and wonders about resource usage, give it a try. It’s free, open-source, and takes 30 seconds to install.
Your productivity will thank you.
Try it: dotnet tool install --global ElBruno.OllamaMonitor
Docs: See the GitHub repository
Made by: El Bruno — A .NET developer obsessed with local AI and productivity.