Architecture
This document describes the design of the Telemetry Monitor application. Read it before making non-trivial changes.
Goals
- Smoothly render up to 16 rolling charts at 60 Hz from a 1 kHz packet stream.
- Run identically on Linux, Android, and Web from one codebase.
- Provide CSV export, scrollback, zoom, and pause without coupling these features to each other.
- Stay maintainable: small modules, clear data flow, single sources of truth.
High-level data flow
WebSocket
│ Uint8List frames
▼
TransportLayer ──── ValueListenable<WsConnectionState>
│ Stream<Uint8List>
▼
DecoderLayer (isolate on native, inline on Web)
│ Stream<Envelope>
▼
SessionController
├── PacketBuffer (ring, ~10 min @ 1 kHz)
├── LogBuffer (ring, ~5k entries)
├── ViewState (window, anchor, userPaused)
├── PpsCounter (sliding 1s arrival times)
├── Ticker (vsync) ──┬─► frameTick (ValueNotifier<int>)
│ └─► statusSnapshot (recomputed per frame)
└── logTick (ValueNotifier<int>, fires per new log)
UI layer subscribes to the notifiers above:
- ChartGrid → ChartWidget (CustomPainter, repaint: frameTick)
- StatusBar → statusSnapshot, connectionState
- ErrorLogPanel / FullLogTab → logTick
- Toolbar → ViewState, dialogs
Layers
Transport (lib/transport/)
Abstraction over web_socket_channel so the rest of the app doesn't know
about platform-specific channels.
- WebSocketTransport (abstract): connect, disconnect, frames, state.
- websocket_transport_io.dart: native, uses IOWebSocketChannel.
- websocket_transport_web.dart: Web, uses HtmlWebSocketChannel.
- Conditional import via websocket_transport.dart.
Owns reconnection with exponential backoff. Reconnect parameters configurable in Settings.
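The document does not pin down the backoff schedule beyond "exponential". A minimal Python sketch of one common shape (the app itself is Dart; base, cap, and jitter parameters here are illustrative, not the actual Settings values):

```python
import random

def backoff_delay_ms(attempt: int,
                     base_ms: int = 500,
                     cap_ms: int = 30_000,
                     jitter: float = 0.2) -> float:
    """Delay before reconnect attempt `attempt` (0-based).

    Doubles per attempt, capped at `cap_ms`, with +/- `jitter`
    randomization so clients don't reconnect in lockstep.
    """
    delay = min(cap_ms, base_ms * (2 ** attempt))
    return delay * random.uniform(1 - jitter, 1 + jitter)
```

With these defaults the uncapped schedule is 500 ms, 1 s, 2 s, ... up to the 30 s cap.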
Decoder (lib/decoder/)
Turns raw frames into Envelope messages.
- Decoder (abstract): feed(Uint8List), envelopes stream.
- decoder_isolate.dart: native. Spawns an isolate that decodes and accumulates envelopes, sending them back as a List<Envelope> every ~8 ms (configurable). Reduces SendPort overhead at 1 kHz.
- decoder_inline.dart: Web. Decodes synchronously in feed().
- Conditional import via decoder.dart.
The ~8 ms batch interval is roughly half a frame at 60 Hz: the main isolate receives at least one batch per frame, and a newly decoded packet waits at most about half a frame before it can be rendered.
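The accumulate-and-flush loop inside the decoder isolate can be sketched as follows (Python sketch; the real code lives in decoder_isolate.dart and flushes over a SendPort):

```python
import time

class EnvelopeBatcher:
    """Accumulates decoded envelopes and flushes them as one message
    every `interval` (seconds by default), mirroring the ~8 ms batching
    that keeps cross-isolate traffic at ~125 messages/s instead of
    1000/s at a 1 kHz packet rate."""

    def __init__(self, send, interval: float = 0.008, clock=time.monotonic):
        self._send = send        # stand-in for SendPort.send
        self._interval = interval
        self._clock = clock      # injectable for testing
        self._batch = []
        self._last_flush = clock()

    def feed(self, envelope) -> None:
        self._batch.append(envelope)
        now = self._clock()
        if now - self._last_flush >= self._interval:
            self._send(self._batch)   # one message carries the whole batch
            self._batch = []
            self._last_flush = now
```

A production version would also flush any remainder on stream close; that is omitted here for brevity.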
Session (lib/session/)
Central state owner.
- PacketBuffer: fixed-capacity ring of DataPacket. Supports binary search by timestamp and iteration over a time range.
- LogBuffer: fixed-capacity ring of LogPacket.
- ViewState: window width (Duration), anchor (followLive | absolute(t)), userPaused. Window-clamping logic lives here. Does not hold proto-pause.
- PpsCounter: Queue<DateTime> of arrival times, popped to a 1-second window. PPS = queue length.
- Decimator: per-channel min/max decimation with an LRU cache keyed on (viewStart, viewEnd, pixelWidth). Also computes gap segments classified as "hatched" (≥ pixel width) or "marker" (< pixel width). Marker pixel-x positions are deduplicated.
- SessionController: owns the above, subscribes to the decoder stream, drives the Ticker, exposes notifiers.
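The PacketBuffer combination of a fixed-capacity ring and timestamp binary search can be sketched as below (Python; names and the `key` hook are illustrative — the real buffer stores DataPacket objects, and the search is valid because packets arrive in non-decreasing timestamp order):

```python
class RingBuffer:
    """Fixed-capacity ring with oldest-first logical indexing and a
    binary search by timestamp, in the spirit of PacketBuffer."""

    def __init__(self, capacity: int):
        self._items = []
        self._capacity = capacity
        self._start = 0  # physical index of the oldest item

    def add(self, item) -> None:
        if len(self._items) < self._capacity:
            self._items.append(item)
        else:  # full: overwrite the oldest slot, advance the start
            self._items[self._start] = item
            self._start = (self._start + 1) % self._capacity

    def __len__(self):
        return len(self._items)

    def __getitem__(self, i):  # logical index, 0 = oldest
        return self._items[(self._start + i) % self._capacity]

    def index_at_or_after(self, ts, key=lambda p: p) -> int:
        """First logical index whose timestamp is >= ts."""
        lo, hi = 0, len(self._items)
        while lo < hi:
            mid = (lo + hi) // 2
            if key(self[mid]) < ts:
                lo = mid + 1
            else:
                hi = mid
        return lo

    def time_range(self, t0, t1, key=lambda p: p):
        """Iterate items with t0 <= timestamp < t1."""
        i = self.index_at_or_after(t0, key)
        while i < len(self) and key(self[i]) < t1:
            yield self[i]
            i += 1
```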
Status snapshot computation (per frame)
StatusSnapshot { connection, pps, statusValues[8], protoPaused } is
recomputed each frame by walking backward through the packet buffer up to
settings.statusLookback packets, collecting the most recent value for each
of the 8 status fields and the pause flag. Fields not seen within the
lookback window are reported as null (rendered as "unknown" in the UI).
This preserves a single source of truth: the packet buffer is the only store. The walk is bounded and stops early once all 9 fields are resolved.
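The bounded backward walk can be sketched as follows (Python; packets are modeled as dicts with sparsely present fields, and the field names are illustrative stand-ins for the proto fields):

```python
STATUS_FIELDS = ("pause",) + tuple(f"status{i}" for i in range(1, 9))

def compute_snapshot(packets, lookback=1000, fields=STATUS_FIELDS):
    """Walk backward through the newest `lookback` packets, keeping the
    most recent value seen for each field. Fields never seen within the
    lookback window stay None (rendered as "unknown")."""
    snapshot = {f: None for f in fields}
    unresolved = set(fields)
    for pkt in reversed(packets[-lookback:]):
        for f in list(unresolved):
            if f in pkt:
                snapshot[f] = pkt[f]
                unresolved.discard(f)
        if not unresolved:  # early stop: all 9 fields resolved
            break
    return snapshot
```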
Pause composition
isPaused = userPaused || (statusSnapshot.protoPaused ?? false)
The status bar attributes the pause source by inspecting both flags.
Clear-all
clearAll() empties both buffers, resets the PPS counter, and forces the
view back to live mode. The status snapshot becomes all-null on the next
frame (since the buffer is empty), which is correct.
Layout (lib/layout/)
Display configuration. Persisted to shared_preferences under layout.*.
- ChartConfig: per-chart settings: channel ID, enabled flag, name, Y mode (auto / fixed / userZoomed), yMin, yMax.
- GridConfig: rows, cols, cell-to-channel mapping.
- LayoutController (ChangeNotifier): owns the above plus the 8 status indicator names. Provides loadFromPrefs / saveToPrefs.
Export (lib/export/)
CSV writers behind a platform-split interface.
- CsvExporter (abstract): exportData, exportLog.
- csv_exporter_io.dart: native. Uses path_provider for the save location.
- csv_exporter_web.dart: Web. Builds a Blob and triggers a download via an anchor element.
- Both yield to the event loop between chunks (chunk size ~1000 rows) and report progress via callback.
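The chunk-and-yield pattern both exporters share can be sketched as follows (Python, using asyncio to model yielding to the event loop; `write` and `on_progress` are illustrative stand-ins for the platform sink and the progress callback):

```python
import asyncio

async def export_rows(rows, write, on_progress, chunk_size=1000):
    """Write CSV rows in chunks, yielding between chunks so the UI
    stays responsive, and reporting progress in [0, 1]."""
    total = len(rows)
    for start in range(0, total, chunk_size):
        for row in rows[start:start + chunk_size]:
            write(",".join(map(str, row)) + "\n")
        on_progress(min(1.0, (start + chunk_size) / total))
        await asyncio.sleep(0)  # let queued events run before the next chunk
```

In the Dart implementations the yield point is the analogous await between chunks; the sketch only illustrates the control flow.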
Data CSV format
Columns: timestamp_us, ch1, ch2, ..., ch16, pause, status1, ..., status8.
Disabled channels are omitted from the column set. Missing values within
included columns appear as blank cells. Pause and status values appear as
integers (or blank if absent in that packet).
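The column rules above can be sketched as (Python; `packet` is modeled as a dict, and the helper names are illustrative — the real exporter reads proto fields):

```python
def csv_header(enabled_channels):
    """Header per the format above; disabled channels are omitted."""
    return (["timestamp_us"]
            + [f"ch{c}" for c in enabled_channels]
            + ["pause"]
            + [f"status{i}" for i in range(1, 9)])

def csv_row(packet, enabled_channels):
    """One data row; values missing from an included column become
    blank cells rather than being dropped."""
    def cell(key):
        v = packet.get(key)
        return "" if v is None else str(v)
    return ([str(packet["timestamp_us"])]
            + [cell(f"ch{c}") for c in enabled_channels]
            + [cell("pause")]
            + [cell(f"status{i}") for i in range(1, 9)])
```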
Log CSV format
Columns: timestamp_us, severity, error_number, description. The description
field is taken verbatim from the proto. Severity is the enum name.
UI (lib/ui/)
- app.dart: MaterialApp, providers (via plain InheritedNotifier).
- toolbar.dart: pause, go-live, reset-view, layout, settings, export-data, clear-all, zoom readout.
- tab_scaffold.dart: tabs (Dashboard, Full log).
- chart_grid.dart: builds a ChartWidget per cell from LayoutController.
- chart_widget.dart: RepaintBoundary + CustomPaint (painter listens to frameTick).
- chart_painter.dart: per-frame draw: decimation, gaps, line, axes, labels.
- status_bar.dart: connection pill, PPS, pause indicator with attribution, 8 status pills (subscribes to statusSnapshot).
- mini_log_panel.dart: filtered log view (severity from settings).
- full_log_tab.dart: chip-filterable log view + log export button.
- settings_dialog.dart, layout_dialog.dart: configuration UIs.
Persistence keys
Both stored in shared_preferences:
| Prefix | Owner | Contents |
|---|---|---|
| settings.* | Settings | WS URL, buffer caps, decoder/reconnect params, mini-log severity set, status lookback |
| layout.* | LayoutController | Grid shape, per-chart config, channel names, status names |
Frame budget at 1 kHz / 60 Hz
Per frame (16.6 ms):
- The decoder isolate has already produced ~16 packets, batched into one SendPort message.
- The main isolate writes those packets to PacketBuffer (16 × O(1) writes).
- The Ticker fires. statusSnapshot walks back ≤ 1000 packets × 9 fields, stopping early once all are resolved; typically it completes in 1 step.
- 16 charts repaint:
  - Each looks up its decimation cache.
  - On a cache hit: just draws ~400 line segments.
  - On a cache miss (zoom/pan): re-decimates (one pass over the visible packets per channel).
- The status bar reads statusSnapshot and renders.
Total CPU per frame is dominated by chart drawing, which Skia/Impeller handle well. Cache hit rate is ~100% in live-follow mode (the tail extends, prior columns are unchanged) and drops only during zoom/pan.
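The per-channel min/max decimation in the cache-miss path can be sketched as below (Python; samples are modeled as (t, y) pairs, and only the decimation pass is shown — caching, gap classification, and marker dedup are omitted):

```python
def decimate_minmax(samples, view_start, view_end, pixel_width):
    """Bucket (t, y) samples into pixel columns and keep each column's
    (min, max), so the drawn polyline preserves spikes while emitting
    at most two points per pixel column. Columns with no samples stay
    None (these are where gap segments come from)."""
    span = view_end - view_start
    cols = [None] * pixel_width  # per-column (ymin, ymax) or None
    for t, y in samples:
        if not (view_start <= t < view_end):
            continue
        x = int((t - view_start) * pixel_width / span)
        prev = cols[x]
        cols[x] = (y, y) if prev is None else (min(prev[0], y), max(prev[1], y))
    return cols
```

This is one pass over the visible samples per channel, which is why a cache miss is the dominant cost during zoom/pan and why live-follow (where prior columns are unchanged) stays cheap.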
Threading model
- Native: 2 isolates. Main runs Flutter and everything except protobuf decoding. The decoder isolate parses bytes and emits envelope batches.
- Web: 1 isolate. Decoder runs inline on the main isolate. At 1 kHz with small protobuf messages this is acceptable; if it becomes a bottleneck, consider Web Workers via JS interop (significant work).
Conditional imports pattern
Each platform-split module follows:
// foo.dart (the public import)
export 'foo_io.dart' if (dart.library.html) 'foo_web.dart';
// foo_io.dart — native impl
// foo_web.dart — web impl
This means UI code simply imports 'foo.dart' and gets the right implementation for the platform.
Non-goals
- Multiple simultaneous WebSocket connections.
- Server-side filtering or downsampling.
- Recording to disk continuously (export is on-demand only).
- Per-chart x-axis (X is global).
- Internationalization (English only for now).