Unlocking Data Secrets: Advanced Brush and Link Chart Analysis for LabVIEW & TDMS

Updated: Nov 4


ree

In modern testing environments, whether in intensive Research & Development (R&D) or in high-volume manufacturing, we face a constant paradox: we collect gigabytes (GB) or even terabytes (TB) of data to ensure quality, but we lack the intuitive tools to analyze it effectively.

While LabVIEW is the gold standard for data acquisition, its default visualization capabilities, primarily basic XY graphs, often fall short when dealing with truly large data volumes. Manually zooming through millions of data points to find a single anomaly is inefficient, slow, and error-prone.

This is why advanced test engineering teams often develop or integrate a powerful analysis tool: the Brush and Link Chart.

The Power of the Brush and Link Chart: Engineered for Big Data

The Brush and Link Chart is a custom solution designed specifically to validate and analyze extensive datasets far beyond the capacity of standard LabVIEW graphs. This tool introduces a dynamic, two-level visualization hierarchy that drastically speeds up data investigation.

This type of custom solution is built to tackle extreme demands, such as loading and analyzing weeks of complex manufacturing data:

  • Scale of Data: Successfully loads 11+ GB of TDMS data from 40+ channels.

  • Speed: Selected data loads in a fraction of the time a conventional graph would need (often in milliseconds).

  • Efficiency: Handles high-volume data (collected at 18,000 samples per minute) while maintaining a minimal memory footprint (typically around 300 MB).
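How can an overview chart keep its memory footprint small while still exposing every anomaly? One common technique is min/max decimation: each bucket of raw samples is reduced to its minimum and maximum before plotting, so spikes and dips survive the downsampling. The sketch below is illustrative (function and variable names are hypothetical, not the tool's actual code):

```python
def minmax_decimate(samples, target_buckets):
    """Reduce `samples` to at most 2*target_buckets points (min and max per bucket)."""
    n = len(samples)
    if n <= 2 * target_buckets:
        return list(samples)               # already small enough to plot directly
    bucket_size = -(-n // target_buckets)  # ceiling division
    reduced = []
    for start in range(0, n, bucket_size):
        bucket = samples[start:start + bucket_size]
        reduced.append(min(bucket))        # keep the dip
        reduced.append(max(bucket))        # keep the spike
    return reduced

# 1,080,000 samples (one hour at 18,000 samples per minute) with a single spike:
raw = [0.0] * 1_080_000
raw[500_000] = 9.9                         # the anomaly we must not lose
overview = minmax_decimate(raw, 2000)
assert 9.9 in overview                     # the spike survives decimation
assert len(overview) <= 4000               # a few thousand points instead of a million
```

The key design point is that naive "every Nth sample" decimation can silently drop a one-sample spike, whereas min/max buckets cannot.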

How It Works: Dynamic Correlation

The core value of the tool lies in the instantaneous correlation it provides:

  1. The Brush Chart (Overview): This main graph displays the entirety of your dataset—a macro view of all collected data.

  2. The Link Chart (Detail): This secondary graph is directly linked to the first.

When an engineer uses the "brush" (a simple selection rectangle) to highlight a specific area of interest on the Brush Chart, the Link Chart immediately and automatically zooms in to display only the data within that selected window.

This instant, dynamic filtering allows you to start broad, identify an anomalous region (perhaps a spike or a dip), and immediately drill down for micro-level analysis without cumbersome manual scrolling or reconfiguring plot settings. You can then further zoom and refine your selection on the Link Chart to validate the smallest data irregularities.
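Under the hood, the brush-to-link update reduces to an index lookup: given the brushed time window, find the matching slice of the time-sorted dataset and hand only that slice to the detail chart. With sorted timestamps, a binary search keeps this fast even for millions of points. A minimal sketch, with hypothetical names:

```python
from bisect import bisect_left, bisect_right

def brushed_slice(timestamps, values, t_start, t_end):
    """Return the (time, value) pairs inside the brushed window [t_start, t_end]."""
    lo = bisect_left(timestamps, t_start)   # first sample at or after t_start
    hi = bisect_right(timestamps, t_end)    # first sample strictly after t_end
    return timestamps[lo:hi], values[lo:hi]

# One minute of data at 18,000 samples per minute (300 samples/second):
times = [i / 300.0 for i in range(18_000)]
vals = [0.0] * 18_000
vals[9_000] = 5.0                           # anomaly at t = 30 s
t_win, v_win = brushed_slice(times, vals, 29.9, 30.1)
assert 5.0 in v_win                         # the detail view contains the anomaly
assert len(v_win) < 100                     # a tiny fraction of the raw data
```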


Driving Efficiency in R&D and Manufacturing

This level of detailed, rapid analysis is critical across the product lifecycle:

  • R&D / Validation. Faster Failure Analysis: quickly isolating the exact moment a prototype failed during a long-running test. Engineers can find the precise data point responsible for a failure in seconds, not hours.

  • Manufacturing. Process Drift Detection: instantly visualizing long-term quality trends across a production line. When a Key Performance Indicator (KPI) starts to drift, the tool allows quality teams to pinpoint which specific batch or test step caused the shift.
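The drift-detection case can be sketched as a rolling-mean check: flag the first test index where the moving average of a KPI leaves its allowed band. All names and thresholds below are illustrative, not part of the tool:

```python
from collections import deque

def first_drift(kpi_values, window, low, high):
    """Return the index where the rolling mean first exits [low, high], or None."""
    recent = deque(maxlen=window)
    for i, v in enumerate(kpi_values):
        recent.append(v)
        if len(recent) == window:
            mean = sum(recent) / window
            if not (low <= mean <= high):
                return i                    # first batch/test step outside the band
    return None

# A stable KPI around 1.0 that begins drifting upward at batch 60:
kpi = [1.0] * 60 + [1.0 + 0.01 * k for k in range(40)]
idx = first_drift(kpi, window=10, low=0.95, high=1.05)
assert idx is not None and idx >= 60        # drift pinpointed after batch 60
```

Plotted on a Brush Chart, the same rolling mean makes the drift visible at a glance; the brush then isolates the offending batches for the Link Chart.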

The Data Backbone: Seamless TDMS Integration

A powerful analysis tool is useless without access to the right data. A robust solution must be designed to integrate natively and efficiently with the most important data formats:

  • TDMS (Technical Data Management Streaming): Direct integration with the TDMS file format is essential. This ensures that huge, high-speed datasets—the very type generated by modern PXI and DAQ systems—can be fetched, processed, and visualized immediately. This avoids conversion bottlenecks and maintains data integrity.

  • Universal Compatibility: While performance is optimized for TDMS, the tool must be flexible and extensible enough to pull data from other standard sources, including CSV, ASCII, and SQL databases, ensuring it fits seamlessly into any existing data pipeline.
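A simple way to structure that compatibility is a loader that dispatches on file extension: one entry function, one code path per format. In the sketch below the CSV branch uses only the standard library, while the TDMS branch is left as a hedged comment because it needs the third-party npTDMS package; every name here is illustrative, not the tool's actual API:

```python
import csv
import os
import tempfile
from pathlib import Path

def load_channel(path, column):
    """Load one channel of numeric data from a supported file format."""
    ext = Path(path).suffix.lower()
    if ext == ".csv":
        with open(path, newline="") as f:
            return [float(row[column]) for row in csv.DictReader(f)]
    if ext == ".tdms":
        # With npTDMS this would be roughly:
        #   from nptdms import TdmsFile
        #   return TdmsFile.read(path)[group_name][column][:]
        raise NotImplementedError("TDMS support requires the npTDMS package")
    raise ValueError(f"unsupported format: {ext}")

# Tiny demonstration with a temporary CSV file:
fd, tmp = tempfile.mkstemp(suffix=".csv")
with os.fdopen(fd, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["time", "signal"])
    writer.writeheader()
    writer.writerows([{"time": "0", "signal": "1.5"}, {"time": "1", "signal": "2.5"}])
assert load_channel(tmp, "signal") == [1.5, 2.5]
os.remove(tmp)
```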

Implementation and Validation: The Non-Negotiable Step

The Brush and Link Chart approach is more than just a visualization technique; it requires careful Data Validation. When dealing with massive datasets, the custom tool itself must be validated to ensure the data processing, filtering, and presentation are accurate and consistent.
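One concrete validation check of this kind: whatever filtering or decimation the display pipeline applies, the global extremes of the rendered data should match the raw data, or an anomaly could silently disappear from the overview. A minimal sketch with illustrative names:

```python
def validate_extremes(raw_data, displayed, tol=0.0):
    """True if the displayed series preserves the raw global min and max."""
    return (abs(min(displayed) - min(raw_data)) <= tol and
            abs(max(displayed) - max(raw_data)) <= tol)

# Naive stride-based downsampling fails this check when it skips a spike:
raw_signal = [0.0] * 10_000
raw_signal[1234] = 7.0                  # a single-sample anomaly
strided = raw_signal[::100]             # every 100th sample; index 1234 is skipped
assert not validate_extremes(raw_signal, strided)
assert validate_extremes(raw_signal, raw_signal)
```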
