ElBruno.OllamaMonitor

Architecture Guide

Overview

ElBruno.OllamaMonitor is a Windows desktop application built with .NET and WPF. It provides a system tray interface, a standard details window, and a compact always-on-top mini monitor.

The architecture is split into two tracks: a command-line track, where recognized arguments are parsed and executed before the process exits, and a desktop track, which hosts the tray icon, the details window, and the mini monitor.

Project Structure

src/ElBruno.OllamaMonitor/
├── Cli/
│   ├── CliCommand.cs              # Command model
│   ├── CliCommandKind.cs          # Command type enum
│   ├── CliCommandParser.cs        # Parse args -> CliCommand
│   └── CliCommandRunner.cs        # Execute commands (help, config, reset, etc.)
├── Configuration/
│   ├── AppSettings.cs             # Settings data model (JSON-serializable)
│   └── AppSettingsService.cs      # Load/save settings from disk
├── Diagnostics/
│   ├── StatusFormatter.cs         # Format snapshots as readable text
│   ├── ClipboardService.cs        # Copy diagnostics to clipboard
│   └── DiagnosticsLogService.cs   # Event logging
├── Helpers/
│   ├── ProcessLauncher.cs         # Launch URLs / external commands
│   └── ...
├── Interop/
│   └── ...                        # P/Invoke / Windows interop
├── Models/
│   ├── OllamaMonitorState.cs      # State enum (Gray, Green, Blue, Orange, Red)
│   ├── OllamaMonitorSnapshot.cs   # Aggregated status snapshot
│   ├── OllamaModelSnapshot.cs     # Model info
│   └── ResourceSnapshot.cs        # CPU, RAM, GPU metrics
├── Ollama/
│   ├── OllamaClient.cs            # HTTP client for Ollama API
│   ├── OllamaStatusService.cs     # Aggregate Ollama state
│   ├── OllamaStatus.cs            # API response models
│   └── OllamaModelInfo.cs         # Model details
├── Services/
│   ├── ProcessMetricsService.cs   # CPU, RAM from Ollama process
│   ├── NvidiaSmiMetricsService.cs # GPU metrics via nvidia-smi
│   ├── TrayIconService.cs         # System tray lifecycle
│   ├── TrayStatusMapper.cs        # OllamaMonitorState -> icon color
│   └── TrayMenuBuilder.cs         # Context menu construction
├── ViewModels/
│   └── MainWindowViewModel.cs     # UI state, refresh logic
├── App.xaml / App.xaml.cs         # WPF Application entry point
├── MainWindow.xaml / MainWindow.xaml.cs      # Standard details window
└── MiniMonitorWindow.xaml / .cs              # Compact always-on-top monitor

Command Flow

Application Startup

App.OnStartup()
  ├─ Parse CLI args (CliCommandParser)
  ├─ If args = ["--help", "config", "config set", etc.]
  │   └─ Run CLI command (CliCommandRunner) → exit
  └─ Else
      └─ LaunchTrayApplication()
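The CLI-versus-tray split can be sketched as a small predicate. This is an illustrative assumption, not the production code: the helper name IsCliInvocation and the verb list below are stand-ins for the real dispatch, which lives in CliCommandParser.

```csharp
using System;
using System.Linq;

static class StartupDispatch
{
    // Hypothetical list of argument prefixes that route into CLI mode
    // instead of launching the tray application.
    static readonly string[] CliVerbs = { "--help", "-h", "config" };

    // True when the first argument matches a known CLI verb;
    // an empty argument list always launches the tray application.
    public static bool IsCliInvocation(string[] args) =>
        args.Length > 0 &&
        CliVerbs.Contains(args[0], StringComparer.OrdinalIgnoreCase);
}
```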

Tray Application Bootstrap

LaunchTrayApplication()
  ├─ Load settings (AppSettingsService)
  ├─ Create HttpClient (singleton)
  ├─ Create service stack:
  │   ├─ OllamaClient
  │   ├─ ProcessMetricsService
  │   ├─ NvidiaSmiMetricsService
  │   └─ OllamaStatusService (aggregates all three)
  ├─ Create MainWindow, MiniMonitorWindow, and MainWindowViewModel
  ├─ Create TrayIconService
  ├─ Start DispatcherTimer (refresh loop)
  └─ Show window or minimize to tray (based on settings)

Refresh Loop

Every N seconds (default 2, configurable):

DispatcherTimer.Tick
  └─ MainWindowViewModel.RefreshAsync()
      ├─ OllamaStatusService.GetStatusAsync()
      │   ├─ OllamaClient.GetVersionAsync()
      │   ├─ OllamaClient.GetTagsAsync()
      │   ├─ OllamaClient.GetProcessesAsync()
      │   ├─ ProcessMetricsService.GetMetricsAsync()
      │   └─ NvidiaSmiMetricsService.GetGpuMetricsAsync()
      ├─ Determine OllamaMonitorState (Gray/Green/Blue/Orange/Red)
      ├─ Update UI bindings
      └─ Update tray icon color (TrayStatusMapper)
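One practical detail of this loop is that the metric probes should fail independently, so that a missing GPU or a transient process error does not abort the whole snapshot. A minimal sketch of such a best-effort wrapper (the helper name TryGetAsync is an assumption, not the production API):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

static class RefreshLoop
{
    // Run one metrics probe; on any failure return null instead of throwing,
    // so the other probes in the same refresh tick still contribute data.
    public static async Task<T?> TryGetAsync<T>(
        Func<CancellationToken, Task<T?>> probe,
        CancellationToken ct) where T : class
    {
        try
        {
            return await probe(ct).ConfigureAwait(false);
        }
        catch (Exception)
        {
            return null; // best-effort: treat as "metrics unavailable"
        }
    }
}
```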

State Model

OllamaMonitorState

Determines the tray icon color:

public enum OllamaMonitorState
{
    NotReachable,    // Gray   — API unreachable
    Running,         // Green  — API reachable, no model
    ModelLoaded,     // Blue   — Model loaded, low usage
    HighUsage,       // Orange — Model running, high usage
    Error            // Red    — Unexpected error
}

State Determination Logic

  1. Can we reach the Ollama API?
    • No → NotReachable
  2. Is a model loaded or running?
    • No → Running
    • Yes, CPU/GPU low → ModelLoaded
    • Yes, CPU/GPU > threshold → HighUsage
  3. Any errors?
    • Yes → Error (overrides other states)
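The decision table above can be expressed as a pure function. This is a sketch: StateMapper and the parameter names are hypothetical, the threshold defaults mirror the configuration table below, and only the enum values are taken verbatim from OllamaMonitorState.

```csharp
public enum OllamaMonitorState { NotReachable, Running, ModelLoaded, HighUsage, Error }

public static class StateMapper
{
    public static OllamaMonitorState Determine(
        bool apiReachable, bool modelLoaded, bool hadError,
        double cpuPercent, double gpuPercent,
        double highCpuThreshold = 80, double highGpuThreshold = 85)
    {
        if (hadError) return OllamaMonitorState.Error;             // Red: overrides all other states
        if (!apiReachable) return OllamaMonitorState.NotReachable; // Gray: API unreachable
        if (!modelLoaded) return OllamaMonitorState.Running;       // Green: reachable, no model
        if (cpuPercent > highCpuThreshold || gpuPercent > highGpuThreshold)
            return OllamaMonitorState.HighUsage;                   // Orange: model busy
        return OllamaMonitorState.ModelLoaded;                     // Blue: model idle
    }
}
```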

Configuration

Settings are stored as JSON at:

%LOCALAPPDATA%\ElBruno\OllamaMonitor\settings.json

Editable via the ollamamon config commands (see CLI Commands below) or by editing the JSON file directly.

Key settings:

| Key | Type | Default | Purpose |
| --- | --- | --- | --- |
| endpoint | string | http://localhost:11434 | Ollama API endpoint |
| refreshIntervalSeconds | int | 2 | Polling interval |
| startMinimizedToTray | bool | true | Hide the windows on startup |
| showFloatingWindowOnStart | bool | false | Show the details window on startup |
| enableGpuMetrics | bool | true | Include GPU metrics |
| enableDiskMetrics | bool | true | Include disk I/O metrics |
| highCpuThresholdPercent | double | 80 | CPU % that triggers the HighUsage state |
| highMemoryThresholdGb | double | 16 | RAM (GB) that triggers the HighUsage state |
| highGpuThresholdPercent | double | 85 | GPU % that triggers the HighUsage state |
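A minimal persistence sketch for this settings shape, assuming System.Text.Json with camelCase keys to match the table above. AppSettingsStore and the exact class layout are assumptions, not the real AppSettingsService; loading falls back to defaults when the file is missing.

```csharp
using System;
using System.IO;
using System.Text.Json;

public sealed class AppSettings
{
    public string Endpoint { get; set; } = "http://localhost:11434";
    public int RefreshIntervalSeconds { get; set; } = 2;
    public bool StartMinimizedToTray { get; set; } = true;
    public bool ShowFloatingWindowOnStart { get; set; } = false;
    public bool EnableGpuMetrics { get; set; } = true;
    public bool EnableDiskMetrics { get; set; } = true;
    public double HighCpuThresholdPercent { get; set; } = 80;
    public double HighMemoryThresholdGb { get; set; } = 16;
    public double HighGpuThresholdPercent { get; set; } = 85;
}

public static class AppSettingsStore
{
    static readonly JsonSerializerOptions Options = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase, // matches keys like "endpoint"
        WriteIndented = true,
    };

    // Read settings from disk, or return defaults if no file exists yet.
    public static AppSettings Load(string path) =>
        File.Exists(path)
            ? JsonSerializer.Deserialize<AppSettings>(File.ReadAllText(path), Options)
              ?? new AppSettings()
            : new AppSettings();

    // Persist settings, creating the directory on first save.
    public static void Save(string path, AppSettings settings)
    {
        Directory.CreateDirectory(Path.GetDirectoryName(path)!);
        File.WriteAllText(path, JsonSerializer.Serialize(settings, Options));
    }
}
```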

Key Classes

OllamaStatusService

Aggregates Ollama API state, process metrics, and GPU metrics into a single OllamaMonitorSnapshot.

Task<OllamaMonitorSnapshot> GetStatusAsync(CancellationToken cancellationToken)

Returns an OllamaMonitorSnapshot describing the overall state, the loaded models, and the collected resource metrics.

ProcessMetricsService

Polls the Ollama process for CPU and memory usage using System.Diagnostics.Process.

Task<ResourceSnapshot?> GetMetricsAsync(CancellationToken cancellationToken)

Returns CPU%, RAM (MB), disk I/O if available.
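The usual technique with System.Diagnostics.Process is to sample Process.TotalProcessorTime twice and divide the delta by wall-clock time and core count. The helper below isolates that arithmetic; CpuSampler is an illustrative name, not the service's actual API.

```csharp
using System;

static class CpuSampler
{
    // CPU% = cpu-time delta / (wall-clock delta * processor count), as a
    // percentage, clamped to [0, 100] to absorb timer jitter.
    public static double CpuPercent(TimeSpan cpuBefore, TimeSpan cpuAfter,
                                    TimeSpan wallElapsed, int processorCount)
    {
        if (wallElapsed <= TimeSpan.Zero || processorCount <= 0) return 0;
        double busyMs = (cpuAfter - cpuBefore).TotalMilliseconds;
        return Math.Clamp(
            busyMs / (wallElapsed.TotalMilliseconds * processorCount) * 100,
            0, 100);
    }
}
```

In the service itself, the two readings would come from the Ollama process's TotalProcessorTime at consecutive refresh ticks.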

NvidiaSmiMetricsService

Best-effort GPU metrics via nvidia-smi CLI tool.

Task<GpuMetrics?> GetGpuMetricsAsync(CancellationToken cancellationToken)

Returns GPU utilization%, VRAM used/total if available. Fails gracefully if nvidia-smi not found.
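Such a query typically uses nvidia-smi's CSV output mode. The sketch below shows the standard flags and a defensive parser; GpuMetrics and NvidiaSmiParser are assumed names, and the actual service may shape this differently.

```csharp
using System;
using System.Globalization;

public sealed record GpuMetrics(double UtilizationPercent, double VramUsedMb, double VramTotalMb);

public static class NvidiaSmiParser
{
    // nvidia-smi --query-gpu=utilization.gpu,memory.used,memory.total \
    //            --format=csv,noheader,nounits
    // emits one line per GPU, e.g. "37, 2048, 8192".
    public static GpuMetrics? ParseLine(string? line)
    {
        if (string.IsNullOrWhiteSpace(line)) return null;
        string[] parts = line.Split(',', StringSplitOptions.TrimEntries);
        if (parts.Length < 3) return null;
        if (double.TryParse(parts[0], NumberStyles.Float, CultureInfo.InvariantCulture, out double util) &&
            double.TryParse(parts[1], NumberStyles.Float, CultureInfo.InvariantCulture, out double used) &&
            double.TryParse(parts[2], NumberStyles.Float, CultureInfo.InvariantCulture, out double total))
            return new GpuMetrics(util, used, total);
        return null; // malformed output: fail gracefully, as the service does
    }
}
```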

TrayIconService

Manages system tray lifecycle, context menu, and state-driven icon updates.

MainWindowViewModel

Binds the UI to the latest OllamaMonitorSnapshot. Handles the periodic refresh, exposes snapshot fields as bindable properties, and drives window show/hide behavior.

CLI Commands

All commands are parsed and executed by CliCommandRunner:

| Command | Effect |
| --- | --- |
| ollamamon | Launch tray app |
| ollamamon --help | Show help text |
| ollamamon config | Print current settings |
| ollamamon config set endpoint <url> | Change Ollama endpoint |
| ollamamon config set refresh-interval <seconds> | Change polling interval |
| ollamamon config reset | Reset to defaults |
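The mapping from argument lists to commands can be sketched with C# 11 list patterns. CliKind, CliCmd, and CliParser are illustrative stand-ins for the real Cli* classes, whose exact shapes are not shown here.

```csharp
public enum CliKind { LaunchTray, Help, ConfigShow, ConfigSet, ConfigReset, Unknown }

public sealed record CliCmd(CliKind Kind, string? Key = null, string? Value = null);

public static class CliParser
{
    // Map the raw argument list onto a command, mirroring the table above.
    public static CliCmd Parse(string[] args) => args switch
    {
        [] => new CliCmd(CliKind.LaunchTray),                 // no args: tray app
        ["--help" or "-h"] => new CliCmd(CliKind.Help),
        ["config"] => new CliCmd(CliKind.ConfigShow),
        ["config", "reset"] => new CliCmd(CliKind.ConfigReset),
        ["config", "set", var key, var value] => new CliCmd(CliKind.ConfigSet, key, value),
        _ => new CliCmd(CliKind.Unknown),                     // anything else: show usage
    };
}
```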

Error Handling

The design degrades gracefully rather than crashing the tray application: an unreachable Ollama API maps to the NotReachable (gray) state, a missing nvidia-smi simply omits GPU metrics, and any unexpected exception surfaces as the Error (red) state.

Deployment

The project is packaged as a .NET global tool:

<PackAsTool>true</PackAsTool>
<ToolCommandName>ollamamon</ToolCommandName>

Install via:

dotnet tool install --global ElBruno.OllamaMonitor

This places the executable in the user’s PATH and creates the ollamamon command.

Testing Checklist (Phase 1)


Next Phase (Phase 2): MVVM framework, logging framework, unit tests, settings UI dialog, historical charts.