Data Trim

Large files can bog down browsers, slow every preprocessing step, and clutter plots with millions of points.
Trimming lets you define a smaller “working window” before you run Fill → Resample → Normalize → Smooth. You can trim from two places: the "Data Trim" section of the "Main Control Panel" tab, or the "Data Trim" section of the "Auto labeler" tab.
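
Under the hood, a trim is nothing more than a slice of the series taken before the rest of the pipeline runs. The sketch below shows the same idea in pandas; the file name, column name, window bounds, and preprocessing parameters are illustrative, not part of the GUI.

```python
import pandas as pd

# Load the full record (illustrative file and column names).
df = pd.read_csv("raw_signal.csv", parse_dates=["timestamp"])

# Define the working window before any preprocessing.
start, stop = "2024-03-01 00:00", "2024-03-02 00:00"
window = df[(df["timestamp"] >= start) & (df["timestamp"] <= stop)]

# Downstream steps (Fill -> Resample -> Normalize -> Smooth) now see only the slice.
window = window.set_index("timestamp")
window = window.interpolate()                      # fill gaps
window = window.resample("1s").mean()              # resample
window = (window - window.mean()) / window.std()   # normalize
window = window.rolling(25, center=True).mean()    # smooth
```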


Why Trim First?

Benefit                  Impact
Speed                    GUI remains responsive; preprocessing finishes sooner.
Focus                    Zoom in on the time span or region you actually need to label.
Memory                   Reduces RAM/VRAM load; important for long signals or high-resolution traces.
Safer experimentation    Apply aggressive filters on a slice before committing to the full set.

Typical Workflow

  1. Open Preview – Load the raw series in the viewer.
  2. Select Range – Drag start/stop handles (or enter timestamps) to bracket the region of interest.
  3. Apply Trim – The GUI hides data outside the window; subsequent preprocessing uses only the trimmed slice.
  4. (Optional) Iterate – Adjust the window as you inspect results, then lock it in before exporting labels.

Tip: Trimming is non-destructive—your original file stays intact.
Simply clear the trim to restore the full record.
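
The same select–apply–clear cycle can be reproduced outside the GUI by keeping the full record and only ever working on a slice of it. A minimal sketch, assuming a pandas DataFrame with a timestamp column; the function and variable names are illustrative, not part of the tool.

```python
import pandas as pd

def apply_trim(full: pd.DataFrame, start, stop) -> pd.DataFrame:
    """Return only the region of interest; the full record is left untouched."""
    mask = (full["timestamp"] >= start) & (full["timestamp"] <= stop)
    return full.loc[mask]

def clear_trim(full: pd.DataFrame) -> pd.DataFrame:
    """Clearing the trim simply means working on the full record again."""
    return full

full = pd.read_csv("raw_signal.csv", parse_dates=["timestamp"])
working = apply_trim(full, "2024-03-01 06:00", "2024-03-01 18:00")  # steps 2-3
working = apply_trim(full, "2024-03-01 05:30", "2024-03-01 18:30")  # step 4: iterate
working = clear_trim(full)                                          # restore full record
```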


Best-Practice Guidelines

  • Trim early and often: Even a rough cut (e.g., one day from a year-long log) can boost interactivity.
  • Leave overlap: If you plan to stitch adjacent segments later, keep a small overlap (e.g., 5%) at each edge (see the sketch after this list).
  • Re-trim after resampling: If down-sampling changes time resolution, you may need to adjust the window.
  • Document your window: Save start/stop indices or timestamps with your preprocessing pipeline for reproducibility.
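
The overlap and documentation guidelines can be captured in a short script. A minimal sketch, assuming equal-length segments, a 5% overlap, and a JSON sidecar file for the window boundaries; the file and function names are hypothetical.

```python
import json
import pandas as pd

def overlapping_windows(index: pd.DatetimeIndex, n_segments: int, overlap_frac: float = 0.05):
    """Yield (start, stop) timestamps for n_segments windows padded by overlap_frac."""
    t0, t1 = index.min(), index.max()
    seg_len = (t1 - t0) / n_segments
    pad = seg_len * overlap_frac
    for i in range(n_segments):
        start = max(t0, t0 + i * seg_len - pad)
        stop = min(t1, t0 + (i + 1) * seg_len + pad)
        yield start, stop

df = pd.read_csv("raw_signal.csv", parse_dates=["timestamp"]).set_index("timestamp")
windows = list(overlapping_windows(df.index, n_segments=4))

# Record the trim window(s) so the preprocessing run can be reproduced exactly.
with open("trim_windows.json", "w") as f:
    json.dump([{"start": str(a), "stop": str(b)} for a, b in windows], f, indent=2)
```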
