```shell
git clone https://github.com/ElBruno/ElBruno.OllamaMonitor.git
cd ElBruno.OllamaMonitor
dotnet restore
```
```shell
dotnet --version
```

This should report .NET 10.0.x or later.

```shell
dotnet build
```

To build in Release mode:

```shell
dotnet build -c Release
```
```shell
dotnet run --project src/ElBruno.OllamaMonitor/
```

The app will launch. Check the system tray for the icon.
```shell
# Show help
dotnet run --project src/ElBruno.OllamaMonitor/ -- --help

# Show current config
dotnet run --project src/ElBruno.OllamaMonitor/ -- config

# Set endpoint
dotnet run --project src/ElBruno.OllamaMonitor/ -- config set endpoint http://localhost:11434

# Reset config
dotnet run --project src/ElBruno.OllamaMonitor/ -- config reset
```
```
src/ElBruno.OllamaMonitor/
├── Cli/
│   ├── CliCommand.cs              # Command model
│   ├── CliCommandKind.cs          # Command types enum
│   ├── CliCommandParser.cs        # Parse command-line args
│   └── CliCommandRunner.cs        # Execute commands
├── Configuration/
│   ├── AppSettings.cs             # Settings model (JSON-serializable)
│   └── AppSettingsService.cs      # Load/save settings file
├── Diagnostics/
│   ├── DiagnosticsLogService.cs   # Event logging
│   ├── StatusFormatter.cs         # Format snapshots as text
│   └── ClipboardService.cs        # Copy to clipboard
├── Helpers/
│   ├── ProcessLauncher.cs         # Launch URLs / processes
│   └── AppPaths.cs                # Config/log paths
├── Interop/
│   └── ...                        # Windows P/Invoke helpers
├── Models/
│   ├── OllamaMonitorState.cs      # State enum (Gray/Green/Blue/Orange/Red)
│   ├── OllamaMonitorSnapshot.cs   # Aggregated status
│   ├── OllamaModelSnapshot.cs     # Model info
│   └── ResourceSnapshot.cs        # CPU/RAM/GPU metrics
├── Ollama/
│   ├── OllamaClient.cs            # HTTP client for Ollama API
│   ├── OllamaStatusService.cs     # Aggregate Ollama state
│   ├── OllamaStatus.cs            # API response models
│   └── OllamaModelInfo.cs         # Model info models
├── Services/
│   ├── ProcessMetricsService.cs   # CPU/RAM metrics
│   ├── NvidiaSmiMetricsService.cs # GPU metrics
│   ├── TrayIconService.cs         # System tray lifecycle
│   ├── TrayStatusMapper.cs        # Map state to icon color
│   └── TrayMenuBuilder.cs         # Build context menu
├── ViewModels/
│   └── MainWindowViewModel.cs     # UI state and logic
├── App.xaml / App.xaml.cs         # WPF Application entry
└── MainWindow.xaml / MainWindow.xaml.cs  # Floating details window
```
Add the new command to the `CliCommandKind` enum in `Cli/CliCommandKind.cs`:

```csharp
public enum CliCommandKind
{
    // ... existing values
    MyNewCommand
}
```
Update `Cli/CliCommandParser.cs` to recognize the new command in the parser logic.
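As a rough sketch, recognizing a new verb usually comes down to one more arm in the argument dispatch. The structure below is illustrative only (the real `CliCommandParser` may look quite different), and `my-new-command` is a placeholder:

```csharp
using System;

// Illustrative sketch only: the real parser lives in Cli/CliCommandParser.cs.
// "my-new-command" is a placeholder verb for your new command.
enum CliCommandKind { None, Help, Config, MyNewCommand }

static class CliCommandParser
{
    public static CliCommandKind Parse(string[] args)
    {
        if (args.Length == 0) return CliCommandKind.None; // no args: launch the tray app

        return args[0].ToLowerInvariant() switch
        {
            "--help" or "-h" => CliCommandKind.Help,
            "config"         => CliCommandKind.Config,
            "my-new-command" => CliCommandKind.MyNewCommand, // new verb goes here
            _                => CliCommandKind.None,
        };
    }
}
```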
Handle it in `Cli/CliCommandRunner.cs`:

```csharp
if (command.Kind == CliCommandKind.MyNewCommand)
{
    // Your logic here
    return 0; // success
}
```
Update the help text in `HelpCommand.cs` if needed.

Test it:

```shell
dotnet run --project src/ElBruno.OllamaMonitor/ -- <your-new-command>
```

To add or change what metrics are collected, update:

- `Models/ResourceSnapshot.cs` to add new fields
- `Services/ProcessMetricsService.cs`, or create a new service
- `Ollama/OllamaStatusService.cs`, in the `GetStatusAsync` method
- `MainWindowViewModel.cs` to display the new metric, if needed

Tray icon logic lives in these files:

- `Services/TrayStatusMapper.cs` — maps `OllamaMonitorState` to colors/icons
- `Services/TrayIconService.cs` — manages the tray icon lifecycle and menu
- `Services/TrayMenuBuilder.cs` — constructs the context menu items

To change icon colors or add menu items, modify these files.
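Conceptually, `TrayStatusMapper` is a state-to-appearance table. A minimal sketch, assuming the enum values from `Models/OllamaMonitorState.cs` (the color names and the per-state meanings in the comments are illustrative, not the mapper's actual implementation):

```csharp
// Sketch of the state-to-color idea behind Services/TrayStatusMapper.cs.
// The enum values mirror Models/OllamaMonitorState.cs; the strings and
// the meaning comments are illustrative placeholders.
enum OllamaMonitorState { Gray, Green, Blue, Orange, Red }

static class TrayStatusMapperSketch
{
    public static string ToColorName(OllamaMonitorState state) => state switch
    {
        OllamaMonitorState.Gray   => "gray",   // e.g. Ollama not reachable
        OllamaMonitorState.Green  => "green",
        OllamaMonitorState.Blue   => "blue",
        OllamaMonitorState.Orange => "orange",
        OllamaMonitorState.Red    => "red",
        _                         => "gray",   // unknown states fall back to gray
    };
}
```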
Add the new setting to `Configuration/AppSettings.cs`:

```csharp
public int MyNewSetting { get; init; } = 123;
```
Update the default handling in `Configuration/AppSettingsService.cs` if needed.

Add a CLI command to set it (see “Adding a New CLI Command” above).
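Because `AppSettings` is JSON-serializable, an `init`-only property with a default round-trips with no extra code: absent keys fall back to the default. A minimal sketch with `System.Text.Json` (the `MyNewSetting` name is the placeholder from above; the real `AppSettingsService` also handles file paths and error cases):

```csharp
using System.Text.Json;

// Minimal stand-in for the real AppSettings record; only the new property is shown.
record AppSettings
{
    public int MyNewSetting { get; init; } = 123;
}

static class SettingsDemo
{
    // Serialize then deserialize, as a save/load cycle would.
    public static AppSettings RoundTrip(AppSettings settings)
    {
        string json = JsonSerializer.Serialize(settings);
        return JsonSerializer.Deserialize<AppSettings>(json)!;
    }

    // An empty JSON object (e.g. an old settings file) yields the property default.
    public static AppSettings LoadDefaults() =>
        JsonSerializer.Deserialize<AppSettings>("{}")!;
}
```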
The new setting is persisted automatically as part of `AppSettings`.

Logs are written to:

```
%LOCALAPPDATA%\ElBruno\OllamaMonitor\logs\
```
Use `DiagnosticsLogService` to write logs:

```csharp
_diagnostics.WriteInfo("This is an info message");
_diagnostics.WriteError("An error occurred", exception);
_diagnostics.WriteWarning("A warning");
```
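The write pattern these calls imply can be sketched as a small append-only file logger. This is an assumption-laden illustration, not the service's actual code: the line format, level names, and file handling here are invented for the sketch.

```csharp
using System;
using System.IO;

// Illustrative logger in the spirit of Diagnostics/DiagnosticsLogService.cs:
// each call appends one timestamped, level-tagged line to a log file.
// The exact format and file naming are assumptions for this sketch.
class DiagnosticsLogSketch
{
    private readonly string _path;

    public DiagnosticsLogSketch(string path) => _path = path;

    private void Write(string level, string message) =>
        File.AppendAllText(_path,
            $"{DateTime.Now:yyyy-MM-dd HH:mm:ss} [{level}] {message}{Environment.NewLine}");

    public void WriteInfo(string message) => Write("INFO", message);
    public void WriteWarning(string message) => Write("WARN", message);
    public void WriteError(string message, Exception ex) =>
        Write("ERROR", $"{message}: {ex.Message}");
}
```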
To view logs, open the logs directory with Windows Explorer or your editor.
Open `ElBruno.OllamaMonitor.sln` in your IDE. If the app is already running, stop the `ElBruno.OllamaMonitor` process first.

Before submitting a pull request or release, verify:

```shell
dotnet build
dotnet run --project src/ElBruno.OllamaMonitor/
dotnet run ... -- --help
dotnet run ... -- config
```

Edit `Services/DiagnosticsLogService.cs` or `App.xaml.cs` to write more info logs during startup.
Stop Ollama:

```shell
# On Windows, if running as a service:
sc stop ollama

# Or, if running in a terminal, press Ctrl+C
```

Run the app; it should show a gray tray icon and “Not Reachable” status.
Set the endpoint to a different machine:

```shell
dotnet run --project src/ElBruno.OllamaMonitor/ -- config set endpoint http://192.168.1.100:11434
```
Load a different model in Ollama:

```shell
ollama pull llama2
ollama run llama2
```

The app should update within the refresh interval.
```shell
dotnet build -c Release
```

The `.csproj` is configured with `<PackAsTool>true</PackAsTool>`, so you can pack it:

```shell
dotnet pack -c Release
```

This creates a `.nupkg` file in the `bin/Release/` folder.
(Requires a NuGet API key and publishing rights.)

```shell
dotnet nuget push bin/Release/ElBruno.OllamaMonitor.0.1.0.nupkg --api-key <your-api-key> --source https://api.nuget.org/v3/index.json
```

Once published, users can install it via:

```shell
dotnet tool install --global ElBruno.OllamaMonitor
```
```shell
git checkout -b feature/my-feature
```

For a detailed understanding of how the app is structured, see the Architecture Guide.
Ensure the Windows desktop components are available. On Windows, the WPF targeting packs ship with the .NET SDK; alternatively, install Visual Studio with the “.NET desktop development” workload.
Check:

```
%LOCALAPPDATA%\ElBruno\OllamaMonitor\logs\
```

If you see “Request timeout” messages, check:

- that Ollama is running (`ollama serve` in a terminal)
- the configured endpoint (`ollamamon config`)

Next Steps: