ElBruno.OllamaMonitor

LinkedIn Post

Main Text

🟢 Just shipped: ElBruno.OllamaMonitor, a tiny Windows tray app to monitor your local Ollama runtime.

No dashboards. No bloat. Just a status icon that tells you:

• Is Ollama running?
• Is a model loaded?
• How much CPU/GPU is it using?

Glance at your system tray. That's it.

Perfect for:
→ Local AI developers who need quick visibility
→ Demo presenters showing real-time resource impact
→ Anyone running Ollama and curious about overhead

The tray icon changes color to show status at a glance.

Built on .NET 10. Fully configurable. CLI support. Open source.

Install: dotnet tool install --global ElBruno.OllamaMonitor

GitHub: [Link to repo]


Hashtags

#Ollama #LocalAI #dotNET #Windows #OpenSource #Developer #AITools #Productivity


Media Suggestions


Call to Action

⭐ Star it on GitHub
🐛 Report issues or suggest features
🚀 Contribute to Phase 2 (historical charts, MVVM, multi-instance support)


Variant (Shorter Version)

🟢 New: ElBruno.OllamaMonitor, see your Ollama status in the Windows tray.

Quick visual feedback: Is Ollama running? Is a model loaded? How much CPU/GPU is it using?

One glance. That's all you need.

👉 dotnet tool install --global ElBruno.OllamaMonitor

#Ollama #LocalAI #dotNET #Windows


Variant (More Technical)

🟢 Shipped: ElBruno.OllamaMonitor v0.1.0

A .NET 10 system tray app for monitoring local Ollama workloads on Windows.

Features:
• Real-time status icon (5-state FSM)
• Floating details window (CPU, RAM, disk, GPU metrics)
• CLI configuration
• Best-effort NVIDIA GPU tracking
• Copy diagnostics to clipboard
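For context, the liveness check behind the status icon can be sketched against Ollama's public local HTTP API (default port 11434; `/api/ps` lists models currently loaded in memory). A minimal Python sketch follows; the coarse status labels here are illustrative, not the app's actual FSM state names:

```python
import json
import urllib.error
import urllib.request

OLLAMA_BASE = "http://localhost:11434"  # Ollama's default local API address


def ollama_status(base: str = OLLAMA_BASE, timeout: float = 2.0) -> str:
    """Return a coarse status label: 'down', 'idle', or 'model-loaded'."""
    try:
        # /api/ps reports models currently loaded into memory.
        with urllib.request.urlopen(f"{base}/api/ps", timeout=timeout) as resp:
            running = json.load(resp).get("models", [])
    except (urllib.error.URLError, OSError):
        return "down"  # runtime not reachable (or endpoint unavailable)
    return "model-loaded" if running else "idle"


if __name__ == "__main__":
    print(ollama_status())
```

The tray app layers richer states (and CPU/RAM/GPU metrics) on top of this kind of poll.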

Built for developers. Minimal footprint. Full documentation.

GitHub: [Link to repo]
Install: dotnet tool install --global ElBruno.OllamaMonitor

#OpenSource #dotNET #Ollama #Windows #LocalAI


Engagement Questions