Custom Automated Test System – Quantifying Energy and Durability Performance for Refrigeration
Automation reduces manual labor while improving traceability
Assessing performance for improved energy ratings and longevity
Client – Zero Zone – Commercial refrigeration systems manufacturer
Zero Zone wanted to improve the capabilities and durability of their new reach-in refrigeration products.
You might think that refrigeration is a mundane product line, but that is just not true! So many innovations are occurring as manufacturers are redesigning their products to improve their environmental footprint through better energy efficiency, coolants, and durability.
Assessment requires an understanding of the performance of the refrigeration units under many conditions. Zero Zone was taking measurements with a datalogger that had too few channels and no synchronization to the other devices feeding into the system. In addition, they had multiple models of reach-in refrigerators that needed to be assessed. Furthermore, simplifying the data collection and analysis would make it easier to validate against ASHRAE standards.
Zero Zone came to Viewpoint with the following high-level desires:
Expand the measurements by adding more channels and channel types (e.g., 4-20 mA, ±10 VDC and digital I/O).
Provide graphs and KPIs to enable faster analysis of the data during the test.
Minimize the chance of data loss during long test runs.
Synchronize data collection and actuation.
Automate storage of measurements per a user-defined period to eliminate manual start/stop of data collection.
Simplify the manual configuration setup.
Enable a way to find relevant data perhaps months or years after the test run.
Viewpoint developed a monitoring and control durability test system that could exercise Zero Zone’s refrigerators through hundreds of operation cycles over multiple conditions to simulate actual usage in, for example, a grocery store.
During initial conversations, we collaborated closely with Zero Zone to brainstorm on some potential approaches. We made some suggestions that could satisfy their desires while also managing their time and cost budgets.
For example, by automatically populating the cells in an Excel template based on their original systems’ Excel spreadsheet, we provided streamlined report generation without having to rewrite all the calculation code embedded in their Excel file in another app. The compromises we jointly endorsed were:
Run an app on a PC to configure and monitor the test.
Use both NI Compact RIO and Compact DAQ to enable robust and synchronized data collection and control with the ability to expand channels by adding modules in both the cRIO and cDAQ chassis.
Store data on a local PC rather than a remote server to minimize the probability of data loss during the test run.
Save configurations into Excel files for recall and cloning of prior setups.
Write measurements automatically into the same Excel file for archive of the test setup and measurements.
Create, in this same Excel file through cell formulas, the summary report from the summary calculations. This approach allowed flexibility for changes to internal and external test standards.
Upload the summary data and test reference info into a SQL database for data management and long-term test statistics.
Digital outputs (DOs) were used to control various aspects of the test, such as door open/close and defrost on/off cycles. For flexibility, the user can specify the sequencing of these DO channels in the Excel file used for the test, with parameters that define the duty cycle, period, number of cycles, and start delay. The timing of these DO state changes was synchronized to the data acquisition by the real-time loop in the cRIO.
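The cycle parameters above amount to a simple expansion into a timestamped on/off schedule. The Python sketch below is illustrative only; the parameter names (start_delay_s, period_s, duty_cycle, num_cycles) are assumptions, and in the real system this timing executes in the cRIO's real-time loop.

```python
# Hypothetical expansion of one DO channel's cycle parameters into a
# list of (time, state) transitions. Names are illustrative, not the
# actual Excel fields used by the deployed system.

def do_schedule(start_delay_s: float, period_s: float,
                duty_cycle: float, num_cycles: int):
    """Return (time_s, state) transitions for one digital-output channel."""
    transitions = []
    t = start_delay_s
    on_time = period_s * duty_cycle
    for _ in range(num_cycles):
        transitions.append((t, True))             # e.g., door opens
        transitions.append((t + on_time, False))  # e.g., door closes
        t += period_s
    return transitions

# Example: open a door for 30 s every 120 s, 3 times, starting after 10 s.
print(do_schedule(10.0, 120.0, 0.25, 3))
```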
This system was deployed to 6 test bays, each one of which might be testing a unit for as little as a few weeks or as much as a few months.
The main goal of this project was to reduce the effort and associated human error in the design and execution of the test run.
Some of the primary benefits for this automated system were:
Reduced Errors: pre-verified template files used for test configuration and data storage lent consistency to test setup and execution.
Less Testing Time and Effort: the automatic execution of the test and storage of measurements enabled running tests for multiple days (and nights) without technician interaction. Technicians could work on setting up other units for test rather than babysit the existing test. Depending on test duration, testing throughput increased by approximately 25% to 40%.
Shorter Reporting Time and Effort: reports were available about 85% faster than the previous manual process. The quicker feedback saved costs through early detection of unit problems and faster teardown at the end.
Some additional major benefits were:
More details on refrigerator operation: “Wow! We never saw that before.”
Database consolidation: statistical analysis takes hours not days and includes all tests run in the lab, not just ASHRAE tests. This central database enables long term retrieval of all test data.
Reuse: techs embraced ability to reuse and modify previous setups.
Consistency: driving the test definition through an Excel file encouraged uniformity.
Traceability: documented and timestamped calibration measurements.
Flexibility: channel counts, acquisition module configuration, calibration, and calculation formulas were straightforward to change for new test setups.
The test automation provided by this system greatly reduced the labor involved in configuring, running, and analyzing the test run. Furthermore, the customer benefited from the consistency that resulted from the software-enforced process.
We developed the application in LabVIEW and LabVIEW RT combined with a cRIO connected to a cDAQ via TSN Ethernet.
The data acquisition modules slotted into the cRIO and cDAQ chassis handled the I/O to the customer's sensors and actuators. Key capabilities included:
Data logging of between 50 and 150 channels and control via digital signals
Interface with Excel files for configuration, data logging, and summary calculations
Custom Automated Test System – Characterization of Heat Transfer System Thermal Performance
R&D testing required flexibility in control schemes and measurement I/O
Client – ATSI, a large-scale System Engineering Provider
Our client, ATSI, Inc., headquartered in Amherst NY, designs and builds complex structures and process systems, from industrial construction projects to mechanical systems for power engineering. A previous, long-standing Viewpoint customer that does research and design of thermal energy systems approached ATSI to engage in the build of a specialized test skid that would be used to assess and characterize a heat transfer system. Our long-standing end-customer requested a data acquisition subsystem based on LabVIEW.
Furthermore, the end-customer requested a subsystem that supported flexibility in the data acquisition by channel count and type, since the R&D nature necessitated adaptability. The overall test system needed to automate progress through a sequence of setpoints and ramps.
ATSI designed the automated control and sequencing with a Modicon PLC. Viewpoint augmented ATSI's engineering resources to provide the data acquisition subsystem and setpoint sequence editing. This sequence was passed to the PLC for automated sequencing through the setpoint list. Because our mutual end-customer did not provide explicit design details, we had flexibility to decide which aspects of the control and data acquisition needs would be automated by the Modicon PLC and which by the PC running LabVIEW.
Since Viewpoint had previously developed a similar application for our end-customer, with some of the required data acquisition needs, we chose to leverage and enhance that software platform for this project. That choice drove some of the other designs and defined the scope of work for Viewpoint and ATSI.
Some overall design decisions were:
The LabVIEW application provided data acquisition, test configuration, and operator screens.
The Modicon-based subsystems provided process control and safety.
A PLC HMI for process system operation and status as well as control loop tuning.
NI Compact DAQ (cDAQ) offered flexible PC-based acquisition channels for high sample rate historical data collection.
A sequence editor on the PC defined the test setpoints, durations, and limits to pass to the PLC for execution.
The test configuration encompassed cDAQ channel configuration, PLC tag configuration, sequence editing, and graphical views on the acquired data. Some channels were acquired at slow rates, e.g., up to about 1 S/s for sensors measuring parameters such as temperature and flow, while others ran at fast rates, e.g., 1 kS/s to tens of kS/s for sensors measuring parameters such as transient pressure and vibration. Handling the datafile storage and display of this wide range of data types and rates was important for the end-customer to compare and correlate the effects of changing operating conditions.
Data logging is configured by the sequence editor to occur on certain conditions such as immediately entering a new step, time delayed after entering a step, and activated by the PLC upon reaching stable setpoint control. This flexibility gave the end-customer management of when data collection occurred to ease the comparison and correlation of readings from selected sensors.
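The logging conditions above reduce to a small decision rule per sequence step. A minimal Python sketch, with assumed names (mode, delay_s, StepState) standing in for the actual sequence-editor fields, might look like:

```python
# Illustrative decision rule for the three logging triggers described above.
# The mode strings and StepState fields are assumptions for this sketch.
from dataclasses import dataclass

@dataclass
class StepState:
    time_in_step_s: float   # elapsed time since the sequence entered this step
    plc_stable: bool        # PLC flag: setpoint control has stabilized

def should_log(mode: str, delay_s: float, state: StepState) -> bool:
    """Decide whether the data logger should be recording for this step."""
    if mode == "on_entry":        # log immediately upon entering the step
        return True
    if mode == "after_delay":     # log once a settling delay has elapsed
        return state.time_in_step_s >= delay_s
    if mode == "on_stable":       # log when the PLC reports stable control
        return state.plc_stable
    raise ValueError(f"unknown logging mode: {mode}")
```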
After the configuration is completed, the sequence is passed to the PLC. The operator starts the test on the PLC HMI and the PLC automates the test run. Data collected during the test run could be displayed in live graphs during testing, used for verification of setup and operation; post-test in stacked graphs and overlaid plots; and exported for specialized analysis, display, and review.
The design of this system was driven largely by the need for flexibility. Sensors and channels could be added, the test sequence could be edited with a variable number of steps with editable execution features, and the data acquisition and storage permitted various rates and logging criteria.
These design choices offered the following advantages:
As a partner of the team, Viewpoint acted as staff augmentation for ATSI by providing experienced engineers with expert LabVIEW and data acquisition capabilities.
Flexibility of test sequences, including setpoints and their stabilization criteria.
Tight integration between the Modicon PLC and the LabVIEW-based PC enabled critical control and safety to execute reliably while remaining adjustable.
A customized, all-welded mechanical skid plugs into the end-customer's test article.
Setup of data logging including configurable sample rates.
Ability to add channels by plugging in supplemental DAQmx-based cDAQ modules.
The LabVIEW application architecture is actor-based for straightforward inclusion of new data sources as needed in the future.
New data sources are registered with the object-based data aggregator.
The system handles multiple days of test execution.
The multi-pronged viewer allows verification checks during setup and operation as well as post-test review.
The custom automated test system supplied to the end-customer was a hybrid, made up of PC-based and PLC-based components coupled with the fluid-handling components on the skid. The hardware listed below includes only the data acquisition, control, and safety items, and only a high-level description of the mechanical aspects.
NI LabVIEW for Windows [Viewpoint]
NI LabVIEW Modbus driver [Viewpoint]
NI DAQmx hardware drivers [Viewpoint]
Actor-based object-oriented LabVIEW application for the PC [Viewpoint]
Modicon Concept software [ATSI]
Blue Open Studio HMI software [ATSI]
Function Block Programming for the PLC [ATSI]
Modicon PLC and modules for pressure, temperature, flow and other process variables
NI Compact DAQ modules, including 4-20 mA, RTD, thermocouple, thermistor
600 VDC Power supplies
Components to flow fluid, including pumps, valves, pressure regulators
Replacing Wire-wrap Boards with Software, FPGAs, and Custom Signal Conditioning
Electronic components of fielded systems were aging out
Reverse engineering effort converted wire-wrap boards to FPGA-based I/O
Client – Amentum – A supplier for Military Range System Support
Amentum (www.amentum.com) supports a decades-old system deployed in the early 1980s. While the mechanical subsystems were still functioning, the wire-wrapped discrete logic and analog circuitry was having intermittent problems.
Systems designed and built decades ago can sometimes have wonderful documentation packets. Nevertheless, we’ve been burned too often when the docs don’t incorporate the latest redlines, last-minute changes, or other updates.
The replacement system needed to be a form-fit-function replacement to land in the same mounting locations as the original equipment with the same behavior and connections. Below is an image of the existing wire-wrap boards and their enclosure. We had to fit the new equipment in this same spot.
Figure 1 – Original wire-wrap boards
Finally, Amentum wanted to work with Viewpoint in a joint development approach. While our joint capabilities looked complementary, we didn't know at the start how well our technical expertise and work cultures would mesh – it turns out we worked extremely well together as a team, and neither of us alone could have easily delivered the solution.
Since the team treated the existing documentation package with suspicion, we adopted a “trust but verify” approach. We would use the documents to give overall direction, but we would need details from the signals to verify operation.
Leveraging Amentum's experience with the fielded systems, the team decided early on to record actual signals to understand the real I/O behavior. We used the system's "test verification" unit to run the system through the checkout procedures normally run prior to system usage. This verification unit enabled us to capture the digital I/O to and from the discrete logic with a logic analyzer, and the analog signals with an oscilloscope and DMM. The available schematics were reviewed to assure that the signals made sense.
With a trustable understanding of system operation, Amentum created a requirements document, and we jointly worked on the design of the new system. It comprised both an "inside" system (in a control shelter) and an "outside" system (in the unit's pedestal).
Some overall tasks were:
Viewpoint recommended an architecture for the inside application running on PXIe LabVIEW RT and FPGA layers.
Amentum created the system control software on a Linux PC.
Viewpoint developed the more intricate parts of the inside application and mentored Amentum on other parts they developed. This work recreated the existing discrete logic and analog I/O using PXIe NI FPGA boards.
Viewpoint designed custom interposer boards, including test-point and backplane boards, to connect harnesses to the NI PXIe equipment.
Amentum designed and developed the cRIO-based outside system application and Viewpoint created a set of custom interposer boards to connect harnesses to the cSeries modules.
The PXIe FPGA boards handled the required 60 MHz clock-derived signals with correct phases, polarity, and so on. Furthermore, the wire-wrap boards were register-based so the PXIe had to decode “bus signals” sent over a Thunderbolt bus to emulate the programming and readouts from the various wire-wrap boards.
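The register-based emulation can be pictured as decoding each bus word into a read or write on a register map. The Python sketch below is purely conceptual; the 16-bit word layout and addresses are invented for illustration, and the actual decode runs in LabVIEW FPGA on the PXIe boards.

```python
# Conceptual register-map decoder, standing in for the FPGA logic that
# emulates the programming and readouts of the original wire-wrap boards.
# Assumed layout: bit 15 = write flag, bits 14..8 = address, bits 7..0 = data.

class RegisterBank:
    def __init__(self, num_regs: int = 128):
        self.regs = [0] * num_regs

    def execute(self, word: int):
        """Decode one bus word; reads return register contents, writes None."""
        write = (word >> 15) & 0x1
        addr = (word >> 8) & 0x7F
        data = word & 0xFF
        if write:
            self.regs[addr] = data
            return None
        return self.regs[addr]

bank = RegisterBank()
bank.execute(0x8A55)                 # write 0x55 to register 0x0A
assert bank.execute(0x0A00) == 0x55  # read it back
```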
Figure 2 – PXIe replacement to wire-wrap boards
Amentum wanted to be able to support the LabVIEW FPGA VIs used to replace the functionality of the discrete logic. So, Viewpoint acted as mentor and code reviewer with Amentum to ramp them up on using LabVIEW FPGA effectively. Neither one of us alone would have been successful coding the applications in the allotted time. Joint knowledge and experience from both Viewpoint and Amentum were required.
Signal conditioning and harnesses needed to be reworked or replaced as well, of course, since the landing points for the wires were different in the new system. Viewpoint suggested a technique, which we've used frequently in past obsolescence upgrade projects, to create PCBs that accept the existing connectors.
For the cRIO, these interposer “connection” PCBs plugged directly into the cRIO cSeries module. For the PXIe, these interposer PCBs accepted the field wiring connectors and converted them to COTS cables that connected to the PXIe modules. These interposer PCBs could have signal conditioning incorporated as needed. This approach significantly reduced the need for custom harnesses. All told, about 200 signals were passed between the PXIe and various other subsystems, and about 100 for the cRIO. This approach saved significant wiring labor and cost.
Figure 3 – cRIO with interposer boards between cSeries and field harnesses
The work to design and build the signal conditioning custom electronics was split between Viewpoint and Amentum. Viewpoint did more design than build and handed over the schematics and Gerber files to Amentum so they could manage the builds while also being able to make modifications to the boards as needed.
Amentum wanted an engineering firm that was willing to work alongside them as a partner. Joint discussions about architecture and design led to a collaborative development effort where Amentum benefited from Viewpoint's extensive expertise and guidance on LabVIEW architectural implementation and FPGA coding style.
The main outcomes were:
As a partner of the team, Viewpoint acted as staff augmentation by providing experienced engineers with technical capabilities that Amentum initially lacked.
This team approach delivered a stronger product to the end-customer more quickly than either of us could do alone.
The combination of Viewpoint’s and Amentum’s experience reduced the amount of reverse engineering needed due to the lack of firm requirements.
Reduction of electronics obsolescence by using software-centric FPGA-based functionality. Recompiled LabVIEW FPGA code can target future board models.
Increased software-based functionality simplifies future updates and modifications.
Decrease in number of parts leading to simpler maintenance.
Lower power consumption eliminated the need for an anticipated HVAC upgrade.
Cybersecurity concerns were reduced by using Linux-based systems and FPGA coding.
Using software to emulate the old hardware was a critical success factor. Since the requirements were not 100% solid at the start of the project, some field-testing was required for final verification and validation. The flexibility of the software approach eased modifications and tweaks as development progressed. A hardware-only solution would have necessitated difficult and costly changes. For example, some of the changes occurred very near the final deployment after the system was finally connected to an actual unit in the field.
Emulate original discrete logic functions via FPGAs
Emulate original analog signal I/O
Overall system control via Linux PC
Maintain the same user experience as existed before
Modern application architecture for simpler maintenance
NI cRIO chassis with various cSeries modules
NI PXIe chassis with FPGA modules to handle all the analog and digital I/O via a combination of multifunction and digital-only cards
Custom PCBs for signal conditioning and connectivity
Enhanced Portable Data Acquisition and Data Storage System
Using a Real-Time Operating System (RTOS) provides a high level of synchronization and determinism for acquired data.
Tier 1 Automotive Design and Manufacturing Supplier
Our client had an existing data acquisition system, used for mechanical product validation testing, that had undergone many updates and patches for over 15 years. These updates and patches, performed by multiple developers, had rendered the software portion of the system somewhat unstable. Furthermore, the system hardware was based on NI SCXI, which was becoming obsolete. These issues prompted our client to migrate to an entirely new system.
New requirements for this upgrade included a PXI controller running NI Linux Real-Time, an RTOS, to execute a LabVIEW RT application. The data acquisition software had to support a variable mix of signal conditioning modules in the PXI chassis. In addition, the data acquired from these signal conditioning modules needed to be synchronized to within microseconds.
Viewpoint leveraged another application, developed for the client a few years prior, to harmonize the user interface and to reduce development effort. Most of the development time focused on support and configuration of the multiple module types and ensuring that the data synchronization functioned as required. The result was an ultra-flexible, portable, high-speed data acquisition software/hardware combination that can be used to acquire time-sensitive, synchronized data across multiple modules in a PXI chassis running a real-time operating system.
The upgraded system offers the following features:
Highly configurable real-time data acquisition hardware/software solution based on LabVIEW RT and PXI hardware. Our client works closely with OEMs to assure compatibility and durability with their products, often going to the OEM’s test cells to collect performance data. The configurability in modules and channels affords the fastest possible setup at the OEM’s site which minimizes time and cost in the test cell.
Configuration files stored in a SQL database. Saving channel and module setups in SQL allows the test engineer to locate previous hardware and data acquisition configurations. The usual alternative is a bulk save of an entire system setup; the database affords a more granular, and hence more flexible, approach.
Immediate test feedback through graphs and analog indicators, used to assure data quality before leaving the test cell.
Data playback features after the data has been acquired, used for in-depth review of data after leaving the test cell.
Data acquisition on the RTOS provides assurance that the acquisition will not be interrupted by network or other OS activities, which was occasionally an issue with the prior Windows-based application.
Synchronization between signal conditioning modules ensures time-critical data taken on separate modules can be compared and analyzed.
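The granular configuration storage mentioned in the features above can be sketched with a small relational table. This Python example uses sqlite3 as a stand-in for the client's database; the table and column names are assumptions for illustration.

```python
# Sketch of per-channel configuration storage in SQL, so a named setup can
# be recalled channel by channel instead of as one opaque bulk blob.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE channel_config (
    config_name TEXT, module_slot INTEGER, channel INTEGER,
    sensor_type TEXT, sample_rate_hz REAL)""")

# Save one named setup (hypothetical values).
rows = [("brake_test_A", 2, 0, "thermocouple_K", 100.0),
        ("brake_test_A", 3, 1, "accelerometer",  51200.0)]
conn.executemany("INSERT INTO channel_config VALUES (?,?,?,?,?)", rows)

# Later, recall just the channels of a previous setup by name.
recalled = conn.execute(
    "SELECT module_slot, channel, sensor_type, sample_rate_hz "
    "FROM channel_config WHERE config_name = ? ORDER BY module_slot",
    ("brake_test_A",)).fetchall()
print(recalled)
```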
The system consisted of custom LabVIEW RT software running on an engineer's laptop and the PXI real-time controller, plus a PXI chassis populated with a flexible assortment of NI signal conditioning modules (provided by the client).
The software used an object-oriented Actor-based architecture, which facilitates adding new signal conditioning modules and provides flexible communications between the host PC and the real-time controller.
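The actor idea can be pictured with a toy queue-driven module in Python; this stands in for the LabVIEW Actor Framework used in the real application, and the message names and data are invented.

```python
# Toy actor: each signal-conditioning module runs independently, receiving
# messages in an inbox and pushing acquired data to a shared aggregator.
import queue
import threading

class ModuleActor(threading.Thread):
    def __init__(self, module_name: str, aggregator: queue.Queue):
        super().__init__(daemon=True)
        self.module_name = module_name
        self.inbox = queue.Queue()
        self.aggregator = aggregator

    def run(self):
        while True:
            msg = self.inbox.get()
            if msg == "stop":
                break
            if msg == "acquire":   # simulate one acquisition burst
                self.aggregator.put((self.module_name, [1.0, 2.0, 3.0]))

agg = queue.Queue()
actor = ModuleActor("thermocouple_module", agg)
actor.start()
actor.inbox.put("acquire")
actor.inbox.put("stop")
actor.join()
print(agg.get())   # prints ('thermocouple_module', [1.0, 2.0, 3.0])
```

Adding a new module type means adding a new actor class; nothing in the aggregator or the other actors changes, which is the flexibility the architecture buys.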
Standardizing on testing technique & reporting
Increasing test throughput with automation
Client – Industrial pump manufacturer
Pump manufacturers typically test their product in the same facility in which the pumps are created. These tests are well defined and based on standards created by organizations such as API and ANSI to name a few. These tests are run to verify the performance of the pump as well as provide a report to the customer demonstrating that the pump they purchased will meet their needs. Sometimes these tests become factory witness tests where the customer sits in on the testing being performed on the pumps they had purchased.
This particular pump manufacturer had a test facility where many different sizes and types of pumps could be tested. The software provided to them by another integrator to run the test stand had the desired user interface screens, but did not function as requested, and was missing certain desired features.
Our client came to us with the following requests:
Evaluate the software written by the previous integrator to assess whether any of the code could be reused in the new application.
Ensure that existing cDAQ hardware would be compatible with changes and improvements to the software.
Deploy the hardware/software solution on the test site to verify it performed as required.
The software interfaced with simple delimited text files and Excel workbook templates to save the configuration information necessary to set up and run the required tests.
The resulting data acquired during the test, both raw data acquired from the sensors and calculated data used to characterize the pump being tested, are saved to delimited text files.
Both high-speed (10 kS/s) and low-speed (10 S/s) data are acquired. High-speed data are graphed in real time in a separate UI that allows the user to set cursors on the graph for visualization and printing to a PDF report.
Low-speed data is written to a delimited text file for archival storage and retrieval if additional analysis is required.
The high-speed data are for vibration measurements. The low-speed data are for pressure, temperature, RPM and flowrate measurements.
Flow rate and vacuum pressure were automatically adjusted using a PID loop and 4-20 mA controlled valves.
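The PID adjustment above can be sketched in a few lines; the gains, sample time, and the 0..1 output scaling onto the 4-20 mA valve range are illustrative, not the deployed tuning.

```python
# Bare-bones PID sketch: the controller output is clamped onto the
# 4-20 mA command range of the control valves. All values are invented.
class PID:
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint: float, measured: float) -> float:
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def to_milliamps(output: float) -> float:
    """Clamp a 0..1 controller output onto the 4-20 mA valve range."""
    return 4.0 + 16.0 * min(max(output, 0.0), 1.0)

pid = PID(kp=0.5, ki=0.1, kd=0.0, dt=0.1)
cmd = to_milliamps(pid.update(setpoint=100.0, measured=80.0))  # flow units
```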
The pump test configuration is performed within the application from a series of drop-down selections populated with sensors found in an Excel workbook. The Excel workbook is updated and maintained outside the application. Once the test sensors and conditions have been selected, those selections are written to an Excel workbook for use in the reports.
The Excel ActiveX interface was used to develop the reports the client provided to their customers for the slow data. The Excel workbook template contained the formatting necessary for the reports. As the software wrote the data into the workbook, the reports were built from the formulas and formatting already configured in the Excel template.
At the end of the test, the software printed the appropriate worksheets containing the required elements of the report. The vibration (high-speed) data is written to a binary file and a PDF.
Reducing the number of pumps in the production queue
Most pump manufacturers have more than one test site at a facility to accommodate different pump types and sizes. The ability to automate these tests and to present a common look for the testing process and reports makes the customer experience more positive for those reading the reports and/or witnessing the testing.
Our client came to us with the following requests:
Specify a hardware and software solution to acquire the signals needed to compute the performance results for standard tests.
Deploy the hardware/software solution on the first test site to verify it performs as required and then deploy to the remaining test sites at their facility.
The Pump Test Utility had the following features:
A Pump Test application that can run one of two different tests: a performance test and a net positive suction head (NPSH) test.
An Access database was used to store the available sensors for that test site. The user selected the appropriate sensors while configuring the test.
The test data was stored in an Excel file where each pump received its own Excel file.
The LabVIEW Report Generation Toolkit was used to populate the Excel file with data as well as create the report for the pump.
Standardization of testing technique – now each pump will be tested with the same procedures and calculations/algorithms used are standardized across all test sites within the facility.
Standardization of report content and presentation – now every customer that purchases a pump from our client will receive a report with identical information presented and that information will have been derived from the same calculations/algorithms.
Reduction of the number of pumps in the production queue (and hence inventory) by as much as roughly half, due to faster data acquisition and especially the automated archiving of test results and generation of the final report and its associated calculations.
We developed the Pump Test Utility application to allow our client’s engineers and operators to:
Run one of two guided tests that visually provides pump performance feedback during the test.
Create a test configuration file based on an Excel template that will be used to store test data as well as generate a report.
The pump test software was developed in LabVIEW and interfaced with an Access database and Excel workbook to acquire the configuration information necessary to set up and run the required tests. The resulting data acquired during the test, both raw data from the sensors and calculated data used to characterize the pump being tested, are saved to the Excel workbook. Both high-speed (10 kS/s) and low-speed (10 S/s) data are acquired and stored into one data file for archival storage and retrieval if additional analysis is required. The high-speed data are for vibration and sound measurements; the low-speed data are for pressure, temperature, RPM, and flowrate measurements.
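As a hypothetical example of the "calculated data" that characterizes a pump, hydraulic power, shaft power, and overall efficiency follow directly from measured flow, head, torque, and speed; the numbers below are invented, not client data.

```python
# Typical pump-performance calculations: hydraulic power from flow and head,
# shaft power from torque and speed, efficiency as their ratio.
import math

RHO = 998.0    # kg/m^3, water at ~20 degrees C
G = 9.81       # m/s^2

def hydraulic_power_w(flow_m3_s: float, head_m: float) -> float:
    """P_hyd = rho * g * Q * H."""
    return RHO * G * flow_m3_s * head_m

def shaft_power_w(torque_nm: float, rpm: float) -> float:
    """P_shaft = torque * angular speed."""
    return torque_nm * 2.0 * math.pi * rpm / 60.0

flow, head = 0.05, 40.0       # 50 L/s at 40 m of head (illustrative)
torque, rpm = 130.0, 1780.0   # illustrative shaft measurements
eff = hydraulic_power_w(flow, head) / shaft_power_w(torque, rpm)
print(f"efficiency = {eff:.1%}")
```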
The pump test configuration is performed within the application from a series of drop-down selections populated with sensors found in an Access database. The database is updated and maintained by the client through a series of user interfaces within the application. Once the test sensors and conditions have been selected, those selections are written to the Excel workbook for use in the reports.
The LabVIEW Report Generation Toolkit software was used to develop the reports the client provided to their customers. The Excel workbook template contained the formatting necessary for the reports. As the software wrote the data into the workbook, the reports were built from the formulas and formatting already configured in the Excel template. At the end of the test, the software printed the appropriate worksheets containing the elements of the report required.
Developing an industrial monitoring system for ultrasound-based sensing in a harsh environment
Client – Energy Research Lab
Our client was experiencing problems making temperature measurements in a hostile, irradiated environment. Traditional temperature sensors don’t last long in this environment, so our client was developing a sensor designed for these conditions.
Special equipment is required to drive this sensor. It’s an active sensor requiring an ultrasound pulser/receiver (P/R) and high-speed digitizer to make it function.
The client's prior attempt, which used an original set of special equipment, had reliability and connectivity issues. This reduced reliability was of critical concern because the sensor had to operate for years without downtime.
In addition, the existing application was incapable of displaying live data and lacked a user-friendly interface. On top of that, data analysis had to be done after the application was run, causing delays.
Our client needed reliable and robust hardware to drive the sensors and an application that would eliminate the challenges associated with the existing system.
Viewpoint accomplished the following:
Evaluated two different ultrasonic P/R sensor driver hardware solutions to select a solution that would provide the connectivity robustness, configurability, and correct sensor driver characteristics required for the given sensors.
Decoupled the digitizer embedded in the original P/R by adding a PXI digitizer with better capability.
Provided backward compatibility with previous measurement hardware to aid in performance comparisons with the new hardware.
Developed a LabVIEW-based application that corrected all the issues with the existing application, adding real-time data analysis, real-time data visibility, and a modern user interface. The new application also provided sensor performance traceability using the sensor's serial number.
The enhanced measurement system offers the following benefits:
Reliable sensor subsystem to ensure uninterrupted data acquisition.
Measurement hardware configurability for sample rate, collection duration, and pulsing repetition rate.
Application configurability for automating the analysis, historical archiving, and results reporting.
Real-time data analysis.
Sensor traceability through serial number and data files.
Engineering mode to take control of the entire measurement system.
Improved data logging to include raw and analyzed data.
Improved application user experience via robust data collection and configurability.
The deployed temperature monitoring system consisted of the following components:
COTS pulser/receiver hardware for driving the sensors.
COTS high-speed DAQ for retrieving ultrasound signals.
A LabVIEW-based software application to provide real time data monitoring, error/alarm notification, data analysis, data logging, part traceability and backward compatibility with the older sensor driver hardware.
Custom FlexRIO Adaptor Module supports HIL Test Upgrade
A custom-COTS approach reduces cost and delivery time.
Client – Major National Research Lab
Our client has a client (the end-user) for which they developed an HIL test system several years prior. Parts were obsolete and the system needed an upgrade. The prior system had many custom-designed electronic components which could not be replaced without a complete redesign.
Consequently, our client wanted to use COTS hardware. However, one device needed 28 VDC digital I/O, including a couple of lines that carried significant current (amps, not milliamps) at switching rates much higher than a COTS solid-state relay could provide.
Viewpoint reviewed the requirements and created a hybrid COTS-custom solution. We combined an NI FlexRIO module with a custom FlexRIO Adapter Module (FAM) for the front end to satisfy the 28 VDC signal levels and required current drive.
COTS FlexRIO integrates into the remainder of our client’s PXI-based test system.
The custom I/O was designed for flexibility: our client can use this FAM for their initial end-user as well as for other programs and clients.
Reduced cost relative to a completely custom solution.
Delivery time reduced by months relative to a custom solution.
The custom FAM interfaced with the NI FlexRIO module, which offered low-level digital I/O (3.3 V logic), to digital signal conditioning hardware that provided the 28 VDC signal levels and required current drive.
Each I/O pin was configurable as input or output (source or sink). Each bank of 4 channels had an adjustable input threshold level set via a DAC output. Some channels were designed for amp-level current drive, while the remainder were rated for 250 mA. All I/O was appropriately fused.
Viewpoint also developed LabVIEW FPGA and VHDL to enable our client and the end-user to:
Configure each I/O line as input or output.
Communicate to the DAC to allow custom input threshold trigger levels.
At maximum throughput, the Aedis systems needed to consume and produce about 800 MB/s per slot.
A large company involved in C4ISR was developing a system for a new high-speed digital sensor device. Viewpoint was contracted to build a test system used in design validation and ultimately endurance testing of the sensor. Since the sensor was a component of a larger system which was being developed at the same time, another test system was created to simulate the sensor by feeding signals into the system.
Both the amount of data and the frequencies of the various digital signals were nearly at the limit of hardware capabilities. At maximum throughput, the systems needed to consume during record, and produce during playback, about 800 MB/s per slot. The FPGA clock on the FlexRIO had to run at up to 300 MHz. The skew between triggers for data transmission needed to be less than 5 ns, even across multiple FlexRIO cards and even when the parallel data paths had inherent skews associated with the sensor. Finally, the systems needed to handle clocks that might be out of phase.
Achieving these requirements required significant engineering design in the face of multiple possible roadblocks, any one of which could have prevented a successful outcome.
Furthermore, as usual, the development timeline was tight: in this case, a very tight 3 months.
To meet the timeline, we had to work in parallel across several fronts:
LabVIEW-based application development for both record and playback
LabVIEW FPGA development for marshalling data between the controller and DRAM
Custom FAM circuit board design and build
FlexRIO FPGA CLIP nodes and code for low-level data handling
This sensor had several parallel data paths of clock and data lines with clock speeds up to 300 MHz on each path requiring exacting design and build of a custom FlexRIO Adapter Module (FAM) and unique custom CLIP nodes for extending the FlexRIO FPGA capabilities. The FAM also had a special connector for interfacing to the customer’s hardware.
Additional NI hardware and software completed the system components.
The choice to base the Aedis emulators on NI hardware and software was critical to completing this project. The open architecture in both hardware (custom FAM) and software (CLIP nodes) enabled us to include some very creative extensions to the base toolset, without which the project would not have succeeded on the pressured schedule and predetermined budget. By combining COTS and custom hardware, we were able to stretch the capabilities of the hardware and software very close to their maximum specifications far more cost-effectively than a purely custom design.
The host application, written in LabVIEW, managed the configuration of the data acquisition and the control of the LabVIEW RT-based FlexRIO systems. The configuration primarily dealt with the number of sensor channels in use, skew settings between digital lines, and other parameters that dealt with the organization of the data passed between the sensor and the FlexRIO.
Two FlexRIO applications were written, one for record and one for playback. Each FlexRIO application was written in LabVIEW and managed the configuration of the FlexRIO cards and the movement of data between the FlexRIO cards and the RAID drives. Note that Windows was required to support the RAID driver. Between 10 and 32 DMA channels were used for streaming, depending on the number of sensor channels in use.
Each FlexRIO application also had an FPGA layer, written in LabVIEW FPGA and enhanced with custom CLIP nodes. For the record application, we developed a custom DRAM FIFO on the FPGA to absorb latencies on the PXIe bus. For the playback application, we were able to stream directly from DRAM.
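The reason a DRAM FIFO is needed on the record path can be shown with a toy simulation: data arrives at a steady rate, but the bus drains it with occasional latency stalls, so a buffer must absorb the backlog. The numbers below are illustrative only, not the actual Aedis rates.

```python
# Toy model of buffering against bus latency: steady producer, a consumer
# that occasionally stalls, and a FIFO whose peak depth we track.

def max_fifo_depth(n_ticks, produce_per_tick, drain_per_tick, stall_ticks):
    depth = max_depth = 0
    for t in range(n_ticks):
        depth += produce_per_tick              # sensor data arriving
        max_depth = max(max_depth, depth)      # instantaneous backlog
        if t not in stall_ticks:               # bus available this tick
            depth = max(0, depth - drain_per_tick)
    return max_depth

# Drain rate exceeds the produce rate on average, yet a burst of bus stalls
# still demands buffering proportional to the stall length.
print(max_fifo_depth(100, 8, 10, stall_ticks=set(range(40, 50))))  # 88
print(max_fifo_depth(100, 8, 10, stall_ticks=set()))               # 8
```

This is why average throughput alone is not enough to size the FIFO; worst-case bus latency dominates.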
The FlexRIO and stock FAMs from NI were initially considered as candidates for this project. Clearly, working with commercial-off-the-shelf (COTS) components would be most effective. Three options available at the project start could accommodate the required clock frequencies, but none offered the required channel counts within the skew/routing limitations. Hence, we had to design a custom FAM. This decision, made before the start of the project, proved wise in hindsight: the parallel development path resulted in some shifts in sensor requirements that could be accommodated with the custom FAM but might have led to a dead end with a COTS FAM.
In LabVIEW FPGA, a CLIP Node is a method to import custom FPGA IP (i.e., code) into a LabVIEW FPGA application. CLIP stands for Component-Level Intellectual Property. We needed to use special Socketed CLIP Nodes (i.e., VHDL that can access FPGA pins) for this project because we could expose additional features of the Xilinx Virtex-5 not exposed in LabVIEW FPGA by accessing Xilinx primitives. Some specific features were:
Faster FPGA clocking
Additional clocking options
Individual clock and skew control
Custom PLL de-jitter nodes
Essentially, the majority of the FPGA code was developed in LabVIEW FPGA, with CLIP nodes used to interface the signals between the FlexRIO and the FAM.
FlexRIO Adapter Module
As mentioned earlier, we had to create a custom FAM because of the need to route high speed signals from customer-specific high density connectors while synchronizing signals across multiple data channels and FPGA modules to within one (300 MHz) clock cycle.
At these high speeds, the FAM needed careful buffering and impedance matching, both on the signals and on internal components of the FAM PCB. At the start of the design, we used Mentor Graphics HyperLynx high-speed DDR signaling simulation software to minimize signal reflections before building actual hardware. This step saved countless hours of physical board re-spins.
We designed the FAM to allow channel routing and access to additional clock and trigger pins on the Xilinx chip and PXIe backplane.
Pump Test Station Used Across Multiple Locations Worldwide
Client – ITT Goulds Pumps
Pumps are used for everything from sump pumps for the consumer home market all the way to large pumps for industry capable of moving thousands of gallons per minute. Just as varied is the fluid being pumped: from water to slurries to hydrocarbon-based fluids.
Pump manufacturers typically build and test in plants across the world, and each of those facilities is responsible for testing every pump manufactured at that site to ensure that the pump will perform as the customer expects. These tests are well defined and based on standards from organizations such as API and ANSI. These standards provide test procedures but do not give details on how to perform the tests. Each site typically tests its products without guidance on how to satisfy the aforementioned standards. As a result, differing hardware and software solutions are usually put in place to test the individual site's products.
Such varied testing systems make it exceedingly difficult to compare test results across testing sites, both within plants and between plants.
We were asked by our client to create a homogeneous test platform with which they could compare data across manufacturing plants and test sites within the plants as well as automate calculations of plant performance metrics and reporting.
This client engaged us to develop and implement a software and information storage solution that could run the prescribed tests on any testing site and make these data available to their engineers. These tests were to be semi-automated and guide the test operator through a test ensuring that the procedure and the resulting data were collected in the same manner on every test site, worldwide.
The Pump Test Globalization application consists of the following sub-applications:
A Pump Test application that can run one of five different tests simultaneously to reduce the amount of time the UUT is under test. A separate data file is generated for each test, and those data files are stored in the database along with the test results.
A Test Configuration application that helps manage the orders and the tests associated with those orders.
A Report Generation application that creates a report for each test run on a pump. Additional performance graphs are generated along with options for graphs depicting vibration and orbital performance of the pump.
Standardization of testing technique – now each pump will be tested with the same procedures and calculations/algorithms used are standardized across all manufacturing test sites.
Standardization of report content and presentation – now every customer that purchases a pump from our client, regardless of origin of manufacture, will receive a report with identical information presented and that information will have been derived from the same calculations/algorithms.
Ability to generate manufacturing performance data – metrics such as first pass yield may be calculated for all manufacturing sites. Data from all manufacturing sites may now be compared.
Abstraction of data acquisition hardware – measurement data can be acquired from a variety of sources including OPC servers and NI DAQ Hardware. With this abstraction, the client’s existing hardware was reused where it made sense and replaced with new hardware as needed.
We developed Pump Test, Pump Test Configurator and Pump Test Report Generator applications to allow our client’s engineers and operators to:
Run one or a series of guided tests that visually provide pump performance feedback during the test.
Configure a test for a specific pump model number and serial number. This configuration is read in by the Pump Test software to set up the test according to the configuration.
Generate a report that would be sent with the pump to their customer showing how the pump performed and that it met the customer’s requirements.
The pump test software was developed in LabVIEW and interfaced with a SQL database to acquire the configuration information necessary to set up and run the required tests. The data acquired during the test, both raw sensor data and calculated data used to characterize the pump under test, are saved to the database. Both high-speed (10 kS/s) and low-speed (10 S/s) data are acquired simultaneously and stored in one data file for archival storage and retrieval in case additional analysis is required. The high-speed data are for vibration and orbital measurements; the low-speed data are for pressure, temperature, RPM, and flow-rate measurements.
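The dual-rate storage idea can be sketched simply: tag each sample of each channel with its acquisition time so that the 10 kS/s and 10 S/s streams can live in one archive and be re-aligned later. The channel names below are illustrative; the real file format is not specified in the description.

```python
# Sketch of storing dual-rate data in one record, using the rates from the
# description (10 kS/s vibration/orbital, 10 S/s process channels).
# Channel names are hypothetical stand-ins for the real file layout.

HI_RATE, LO_RATE = 10_000, 10        # samples per second

def merge_record(t0, hi_samples, lo_samples):
    """Tag every sample with its acquisition time so both rates share a file."""
    return {
        "vibration": [(t0 + i / HI_RATE, v) for i, v in enumerate(hi_samples)],
        "pressure":  [(t0 + i / LO_RATE, v) for i, v in enumerate(lo_samples)],
    }

rec = merge_record(0.0, hi_samples=[0.1] * 20000, lo_samples=[101.3] * 20)
# Both channels span the same 2 seconds of test time despite a 1000x rate gap:
print(rec["vibration"][-1][0], rec["pressure"][-1][0])
```

Because timestamps rather than sample indices relate the channels, post-test analysis can pick out the high-speed window around any low-speed event.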
The pump test configuration software was also developed in LabVIEW and is a separate application that uses a SQL database on the back end. The database is located on a secure server and has been designed to retain the following information:
A list of all the manufacturing and test facilities.
A list of all the motors used to run the pumps.
A list of all the available sensors and hardware for each test station at every manufacturing plant.
The association of each sensor with a hardware channel for acquisition.
Orders, which can be created and edited, containing pump-specific information such as model number and serial number.
Test configuration information for a given order, which can be created and edited.
The report generation software was also developed in LabVIEW and provides the user a means to create standard reports for each of the test types. Addendums to the standard report can be created to include graphs using the high-speed data, such as vibration and orbital information.
Custom Endurance Test System – for a medical device
Increased level of automation allows for multi-day and multi-week test runs
A medical device manufacturer
Our client wanted to improve the endurance testing of an implantable medical device product to help determine the recommended lifetime of the product. An obsolete test system existed, but the client wanted improved performance, UX and configurability. They wanted to just hit the “go” button and let it run for days or weeks. They also needed to be able to have new features added after the first release.
The custom product validation endurance test system utilizes NI cDAQ off-the-shelf hardware combined with custom LabVIEW-based software to provide automated N-up endurance testing of the UUT.
Higher fidelity DAQ
Increased configurability of the system to run tests the way the client wants to
Increased level of automation allows for multi-day and multi-week test runs.
The endurance tester physically stresses the UUT to measure force and eventually breakage events. These events are used to help determine the recommended lifetime of the product. The tester tests multiple UUTs in parallel in order to gather more data faster for statistical validity. The system collects data until all UUTs break or the operator stops the test.
Viewpoint provided the software and advised DAQ hardware selection. The rest of the test system hardware was selected and assembled by the client.
The automated test system applies a varying cyclical force to multiple UUTs while measuring the force applied to each device. The software automates the data acquisition, analysis, load application, and motor control during a test. The system measures all applied forces simultaneously while synchronizing that data to a cycle counter. The data are analyzed to determine the average, maximum, and minimum force applied to the device over a user-configurable number of cycles.
While the test runs, multiple alarm states are monitored. When an alarm occurs, a file can be generated containing a user-configurable duration of force measurements. Other alarms trigger the system to change a digital output state, which sends a text message to the system's operators. The system was designed to test for weeks at a time.
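A common way to implement an alarm-triggered dump of a configurable duration of prior measurements is a ring buffer sized to that duration. The sketch below shows the idea in Python; the sample rate, window length, and names are illustrative assumptions, not the system's actual parameters.

```python
# Sketch: force readings stream into a ring buffer holding a user-configurable
# history window; an alarm flushes that history (in the real system, to a file).
from collections import deque

SAMPLE_RATE_HZ = 100
DUMP_SECONDS = 5                      # user-configurable history window

history = deque(maxlen=SAMPLE_RATE_HZ * DUMP_SECONDS)

def on_sample(force, alarm):
    history.append(force)
    if alarm:                         # e.g. force outside limits
        return list(history)          # real system: write file, set digital
    return None                       # output, notify operators by text

dump = None
for i in range(1000):
    dump = on_sample(float(i), alarm=(i == 999)) or dump

print(len(dump), dump[0], dump[-1])   # 500 500.0 999.0
```

The `deque(maxlen=...)` discards the oldest sample automatically, so memory use stays bounded even over multi-week runs.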
NI FlexRIO enables Device Evaluation & Characterization for high-data-rate sub-system
100s of man-hours saved in capturing the data.
Client – Automotive Manufacturer
New product development drove the need for validation of a new sub-system (a RADAR sensor) for use in a next-gen automotive system. They needed a way to evaluate and characterize the performance of the component under various conditions that were not defined in the UUT's specs. They wanted to use as much COTS hardware as possible for this first-run testing because of the expense and timeline of a custom test solution.
The NI FlexRIO-based product validation system utilizes COTS hardware, along with some Viewpoint-developed custom software to allow for evaluation and characterization of the UUT.
The utilization of COTS test hardware (vs. a custom-built FPGA board).
100s of man-hours saved in capturing the data.
Allowed customer to manipulate captured data within the LabVIEW environment for more efficient testing, making changes on the fly.
Error checking done at the FPGA level guarantees valid transfers.
Packet decode at the FPGA level allows real-time de-packetization so that only payload data is stored.
All data captured to TDMS files for overlaying different scenarios.
Scalable to add additional serial data channels, allowing more than one sensor to be captured with a single FlexRIO card.
NI's FlexRIO with NI's LVDS FAM was used. The NI flying-lead cable was used initially to connect to the UUT. On the software side, custom VHDL was created to handle the 8b/10b serial stream data and clock recovery. The VHDL interfaced to LabVIEW FPGA, which was used to stream the data to disk on the PXI-based system.
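One small piece of 8b/10b handling can be illustrated in software: finding the 10-bit symbol boundary by searching for the comma pattern (the 0011111/1100000 bit run that appears only in K-characters such as K28.5). This is a highly simplified stand-in for what the custom VHDL does in hardware, operating on a bit list rather than a recovered serial clock.

```python
# Simplified comma-detection sketch for 8b/10b word alignment. The real link
# used custom VHDL with clock recovery; this shows only the alignment idea.

COMMAS = ("0011111", "1100000")       # unique runs found only in K-characters

def find_alignment(bits):
    """Return the bit offset of the first comma pattern (symbol boundary)."""
    s = "".join(str(b) for b in bits)
    hits = [s.find(c) for c in COMMAS if s.find(c) != -1]
    return min(hits) if hits else None

# K28.5 (RD-) is 0011111010; embed it mid-stream at an arbitrary bit offset.
stream = [1, 0, 1] + [0, 0, 1, 1, 1, 1, 1, 0, 1, 0] + [0, 1, 1, 0]
print(find_alignment(stream))  # 3
```

Once aligned, the decoder can split the stream into 10-bit symbols and recover the 8-bit payload bytes, which is what enables the FPGA-level de-packetization noted above.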
Custom FAM VHDL and LabVIEW FPGA interface Development
Custom Test System Using NI PXI for Electrical Test
Updating an obsolete tester that maintains functionality
Client – Medical Device Manufacturer
Our client already had a test system in place, but the tester (really two test systems testing two different product variants) was becoming obsolete. The tester was old, hardware was failing, and it was getting harder and harder to keep it reliably running. They wanted a new tester to improve reliability, but maintain the functionality of the existing tester to keep the FDA-mandated verification and validation time to a minimum.
The updated end-of-line manufacturing test system maintains the functionality of the old test systems, but with updated hardware and software. The same software is utilized for both the manual test system update and the automated test system update. Our client deployed 6 manual testers and 1 automated tester.
Improved maintainability and reliability with updated hardware and software
Maintains existing test system functionality to keep certification time down
There were two variants of the new test system. One was for an older product line that utilized manual test, with an operator that connected/disconnected the UUT, and initiated the test. The other was an automated tester, integrated into a manufacturing machine. Both testers utilized custom fixtures (provided by the client), off-the-shelf NI measurement hardware (selected by Viewpoint), and custom test software (developed by Viewpoint). The software is configurable for both the manual test system and the automated test system.
Read UUT limits from config file
Perform tester self-test
Measure UUT output
Perform leak down pressure test
PLC interface (for automated tester) for start, done, pass, fail
Manufacturing Inspection System Uses Machine Vision to verify assembly and labeling
Reducing human error with automated inspection
Client – Automotive Component Manufacturer
Our client already had an end-of-line tester in place. However, preventing incorrect product shipments drove them to add machine vision capabilities to verify that the part being packed is of the correct physical configuration and that the part was labeled correctly. They also wanted a more automated way to track which serial numbers were being shipped.
Viewpoint enhanced the existing end-of-line tester by adding machine vision capabilities to verify correct part assembly and part labeling. This capability also allowed for automated tracking of which parts went into which shipping container.
Automated part assembly verification to reduce human error from manual visual inspection
Automated label verification to reduce the chance of shipping the wrong product
The enhanced system added machine vision-based capabilities to an existing end-of-line manufacturing test system. New hardware (cameras, lighting, fixture) was selected and integrated by the client. Viewpoint developed the image analysis routines using the Cognex In-Sight software. These routines were then downloaded and controlled using LabVIEW software developed by Viewpoint. In addition, the LabVIEW GUI contained the image acquired by the camera and the results of the image analysis. The tester can inspect four different part types.
The software essentially performs the following functions:
Look up the expected characteristics of the part being inspected.
Populate the on-camera In-Sight “spreadsheet” with configuration information used in the image analysis/inspection.
Trigger the image capture and read results from the on-camera spreadsheet.
Use the on-camera image analysis to check a critical angle of the part as the part is set in the nest fixture.
Check the information laser etched on the part and compare the results with what should be on the part (relative to the barcode read in for the lot and the 2D barcode on the part) using the OCR/OCV capabilities of the camera.
Perform other physical part characterization image analyses to verify the part was correctly labeled and assembled.
Look up expected part characteristics
Trigger image capture
Read results of on-camera image analysis
Display image taken by camera and show if test passed or failed
Monitor consecutive part failures and initiate shutdown
Automated Manufacturing Test System for Electronic Medical Devices
Using PXI and LabVIEW for modular testing of over 1,000 different models
Client – a medical device manufacturer and repair depot
Our client manufactures hospital patient pendants used to control bed frame, nurse-calling, and TV functions. The company was also growing after adopting a business model of serving as a repair depot for older pendant designs, both their own and those of other manufacturers. As such, their products are very high mix and medium volume.
The basic functions of all these pendant models are closely related, so the client wanted a means to build a single automated test system that could verify functionality for 1000s of models. And, since the products are medical devices, the testers needed to comply with 21 CFR Part 820 and Part 11.
The testers were designed to support the common measurements needed to test the circuitry of the devices, as well as the complex signals required to drive TVs and entertainment systems. A test sequence editor was created that allowed the client to build as many test sequences as needed: each pendant model's sequence is a list of pre-defined basic measurement steps, each configured for its specific measurement.
For example, each device had a power supply, the voltage of which needed to be tested. To test a specific model, a voltage measurement step was added to the model-specific sequence and configured with the upper and lower measurement limits for the power supply. The complete test sequence was created by adding and configuring other measurements test steps as needed. Each test step could also be configured with switch configurations to connect the measurement equipment, such as a DMM, to the proper pins on the device circuit board.
Using this configuration process, the client was able to support the testing of well over 1000 models without any programming. A separate application was developed to create these test sequences which were saved as XML and fed to the test system for selection and execution.
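The configuration-driven approach described above can be sketched as follows: a sequence saved as XML lists pre-defined step types with per-model limits, and the executor dispatches each step type to a measurement routine. The tag names, attributes, and model identifier below are assumptions for illustration, not the client's actual schema.

```python
# Sketch of an XML-defined test sequence driving pre-defined measurement steps.
# Schema and names are hypothetical; DMM/switch routines are stubbed out.
import xml.etree.ElementTree as ET

SEQUENCE_XML = """
<sequence model="pendant-A42">
  <step type="voltage" name="power supply" low="4.75" high="5.25"/>
  <step type="resistance" name="call button" low="0.0" high="10.0"/>
</sequence>
"""

def run_step(step, measure):
    value = measure[step.get("type")]()            # dispatch to routine
    passed = float(step.get("low")) <= value <= float(step.get("high"))
    return step.get("name"), value, passed

# Stand-ins for the real instrument routines:
measure = {"voltage": lambda: 5.01, "resistance": lambda: 3.2}

root = ET.fromstring(SEQUENCE_XML)
results = [run_step(s, measure) for s in root.iter("step")]
print(all(passed for _, _, passed in results))  # True
```

Supporting a new pendant model then means authoring a new XML file, not writing new test code, which is the point of the "no programming" claim above.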
The test execution was managed by NI TestStand and the pre-defined common test steps were written in LabVIEW. The test sequences and test results were interfaced to the client SQL database which they used in their ERP system. This ERP system used the results produced by the test system to help manage the workflow of production, for example by assuring that all units had passed testing before being shipped. Part 11 compliance was handled through checksums used to check if results had been modified.
Test sequence editor used to develop and maintain tests for 1000s of device models
Enabling our client to create test sequences without programming reduced overall development costs by about 50%.
Test sequences and test results were stored in the client’s ERP SQL-compliant database for integration with manufacturing workflow
Modular and common software developed for the test systems reduced the V&V effort during IQ & OQ by allowing testing of the test execution application separate from the individual test sequences.
The automated test system was able to execute each test sequence in three different modes: engineering, service, and production. Each mode was designed for a particular department on the manufacturing floor. Typically, the manufacturing engineer would verify a sequence by executing it in engineering mode. Once the test sequence passed, it was approved for production testing.
During actual product testing, an approved and digitally-signed test sequence is loaded and executed via the test sequencer, designed for automated production. During execution, test results are displayed to the operator and simultaneously pushed to a database. The automated test system produces a record for each tested device, indicating the disposition of each test step and the overall performance of the device. All result data are digitally signed and protected from tampering.
The architecture of the test system follows a typical client – server model.
All client stations communicate with a central ERP and SQL server and each computer is secured by applying operating system security. The SQL server contains all of the test definitions, device history records and results. Information from it can be queried at any time by quality engineers throughout the organization, assuming they have proper login access. This provides real time status about products ready for shipment. Also, other than the software running on the client stations, no other user has permission to write or modify any information in this database. The client is able to keep the server in a protected area separating it from the manufacturing environment while the client test stations are placed throughout the manufacturing area.
Surprisingly, only twelve test step types were needed; uniquely configured and combined, they created sequences to test well over 2000 unique models. Test steps are capable of measuring basic resistance, current, and voltage parameters, as well as performing sound quality measurements and high-speed digital waveform analysis. Several tests were designed to be subjective, while others are fully automated and test to a specified acceptable tolerance. During configuration, each test step requires the manufacturing engineer to enter expected values and tolerance limits to define pass/fail status. Upon testing, the devices are attached to a generic interface connection box, and the test system makes the appropriate connections and measurements.
Low-level measurement drivers to interface to a DMM, signal generator, switches, and data acquisition cards.
Measurement-based test steps
Test sequence execution
Test sequence management
User access management
Test report creation and management
Verification of test sequence content and ability of user to execute
Verification of the content of the test results
NI PXI chassis and controller
NI PXI acquisition cards for analog measurements
NI PXI acquisition cards for digital input and output
NI PXI DMM for precision voltage and resistance measurements
Automated Manufacturing Test Systems for Medical Diagnostic Equipment
Using NI PXI and LabVIEW as a common architecture for multiple test systems testing several subassemblies
Client: a manufacturer of automated blood analysis machines
Our client was embarking on a complete redesign of their flagship automated in-vitro Class 1 blood diagnostic machine. To meet schedule goals, the design and build of several automated test systems needed to occur in parallel with the overall machine. In a major design paradigm shift, many components of the machine were being manufactured as modular subassemblies, each an electro-mechanical device. Thus, multiple testers were required to test each of the specific subassemblies in the machine. And, since this was a medical device, the testers needed to comply with 21 CFR Part 820 and Part 11.
With a looming deadline, the testers needed a common architecture so that each tester could leverage the development of the others. Since each subassembly could be tested independently of the overall machine prior to final assembly, the design of the testers was based on a common measurement and reporting architecture, written in LabVIEW, that interfaced to the customer's Part 11-compliant database for testing procedures and measurement results. Furthermore, procedures and validation checks for calibration of the testers were part of the overall test architecture.
Modularization of the test system architecture aided development and maintenance
Reduced overall development costs due to standardization of test sequence steps and reporting
Both test sequences and test results were stored in a managed database that satisfied 21 CFR Part 11 requirements
Modular and common software developed for the test systems reduced the V&V effort during IQ & OQ.
Since multiple subassemblies were being tested, with one part-specific test system per part, the automated test systems used as much common hardware as possible to simplify the development effort through common hardware drivers and test steps. Measurements were made with PXI equipment. Test steps and the test executive that executed the test sequence(s) were developed using LabVIEW.
The types of test steps required to verify the proper operation of each subassembly were categorized into basic operations, such as voltage reading, pulse counting, temperature reading, and communication with on-board microcontrollers. The specifics of each measurement could be configured for each of these measurement types so that each test step accommodated the needs of each subassembly. For example, one subassembly might have needed to run the pulse counting for 2 seconds to accumulate enough pulses for an accurate RPM calculation, while another might have needed only 0.5 seconds.
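The pulse-counting example above reduces to simple arithmetic: RPM is derived from the pulses accumulated over the configured counting window. A minimal sketch, with the parameter names and pulses-per-revolution value being illustrative assumptions:

```python
# Sketch of the configurable pulse-counting step: RPM from pulses accumulated
# over a per-subassembly window. Parameter names and values are hypothetical.

def rpm_from_pulses(pulse_count, window_s, pulses_per_rev):
    revs = pulse_count / pulses_per_rev
    return revs / window_s * 60.0

# The same motor speed measured with two configured windows (2 s vs 0.5 s):
print(rpm_from_pulses(pulse_count=4000, window_s=2.0, pulses_per_rev=20))  # 6000.0
print(rpm_from_pulses(pulse_count=1000, window_s=0.5, pulses_per_rev=20))  # 6000.0
```

The window length trades measurement time against resolution: a shorter window finishes sooner but accumulates fewer pulses, so each missed or extra pulse shifts the computed RPM more.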
The configuration of a test step algorithm was accomplished via an XML description. The accumulation of these XML descriptions of each test step defined the test sequence run on that specific subassembly.
Test results were associated with these test sequences by completing the entries initially left blank in the test sequence, so that all results were explicitly bound to the test sequence.
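As a sketch of how such an XML-driven sequence might look, the following Python builds test steps whose result entries are left blank and later binds measured results to them. The tag and parameter names here are hypothetical illustrations, not the actual schema used on the project:

```python
import xml.etree.ElementTree as ET

def make_step(step_type, params):
    """Build one test-step element; the <result> entry is left blank
    and is filled in only when the sequence is executed."""
    step = ET.Element("step", {"type": step_type})
    for name, value in params.items():
        ET.SubElement(step, "param", {"name": name}).text = str(value)
    ET.SubElement(step, "result")  # blank until the test runs
    return step

def record_result(step, value):
    """Bind a measured result to the step that produced it."""
    step.find("result").text = str(value)

# A two-step sequence; pulse-counting windows differ per subassembly.
sequence = ET.Element("sequence", {"subassembly": "pump-A"})
s1 = make_step("pulse_count", {"window_s": 2.0})
s2 = make_step("voltage_read", {"channel": "AI0", "range_v": 10})
sequence.extend([s1, s2])

record_result(s1, 1432)   # counts accumulated in the 2 s window
record_result(s2, 4.98)   # measured volts
print(ET.tostring(sequence, encoding="unicode"))
```

Because the results live inside the same document as the sequence that produced them, each measurement is explicitly bound to its originating step.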
The operator user interface distinguished between released and unreleased test sequences. With unreleased test sequences, engineers could try the most recent subassembly designs without needing to wait for final validation. The released sequences were only available to test operators. This login-driven branching was managed using the Windows login, so that the client employees could use their company badge-driven login process. Once logged in, the user would be able to execute the test sequence in automated mode, where all steps happen automatically, or manual mode, where one step could be operated at a time.
Furthermore, the Windows environment was locked down using built-in user account group policies to designate the level at which a user could access Windows or be locked into accessing only the test application.
During the V&V effort, each test sequence was verified for expected operation against both known good and bad parts. Once verified, the sequence was validated against the requirements and, when assured to be as expected, a checksum was applied to the resulting XML test sequence file and everything was saved in a Part 11-compliant database. Upon retrieval, when ready to run a test, the sequence was checked against this checksum to assure that it had not been tampered with.
Test results, saved as XML in the same file format as the test sequence, were also protected by a checksum to verify that no tampering had occurred.
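A minimal sketch of this kind of tamper check, using SHA-256 (the source does not state which digest algorithm was actually used):

```python
import hashlib

def checksum(data: bytes) -> str:
    """SHA-256 digest of the sequence (or results) file contents."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, stored_digest: str) -> bool:
    """Refuse to run a sequence whose content no longer matches the
    digest recorded when the sequence was validated."""
    return checksum(data) == stored_digest

original = b"<sequence subassembly='pump-A'>...</sequence>"
digest = checksum(original)
print(verify(original, digest))          # True: file untouched
print(verify(original + b" ", digest))   # False: content changed
```

The digest is stored alongside the validated sequence in the database, so any post-validation edit to the file is detected at retrieval time.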
The IQ/OQ efforts were handled in a traditional manner with the client developing the IQ/OQ documentation, with our assistance, and then executing these procedures, again with our assistance.
Low-level measurement drivers
Measurement-based test steps
Test sequence execution
Test sequence management
User access management
Test report creation and management
Verification of test sequence content and ability of user to execute
Verification of the content of the test results
PXI chassis and controller
PXI acquisition cards for analog measurements
PXI acquisition cards for digital input and output
Our client was already doing validation, but it was manual, and the client’s customer started requesting faster turnaround of results. Their customer was also requesting data to be sent with the results. Our client chose to automate the validation process to enhance their productivity.
Logs errors during the test (e.g., for continuous monitoring tests, logging the number of instances when a UUT's LIN (Local Interconnect Network) response deviates from a static response, or its current draw falls outside of limits)
Capable of testing a large variety of product lines
Logs pertinent data to a database for post-test analysis/inclusion into reports
The UUT is an electro-mechanical part that falls under a variety of different product lines. As such, the client had a couple of variants of the tester, based on the communication needs of the UUT. More than a dozen testers were deployed in total. The functionality of the testers evolved over time; in particular, the software was modified to make the tests faster and decrease cycle time.
Extensive diagnostic/manual operation of system for debug of software and electrical connections between the UUT and the test stand/tooling.
Product-specific software components to operate unique products.
Execute mechanical endurance tests.
Execute environmental endurance tests.
Database output containing results from every test cycle (either mechanical cycles or time depending on test being run).
The client already had a test system in place, but it was old and becoming unmaintainable. Increasing demands from the test engineers, combined with an old software architecture that did not lend itself to clean implementation of new features (new sequencer capabilities and ECU CAN communication), drove the need to rewrite the software application.
The updated product validation tester supports product validation of the UUT by automating long tests (sometimes a week or more) and providing the desired setpoint control, allowing the client to demonstrate more convincingly that their part met the stated specification. Viewpoint developed the software and the client selected the hardware.
Automate long duration tests
Improved operator UX by making controls and indicators more intuitive to the user as well as providing additional capability within one application.
Acquire ECU data along with measured UUT data to allow for engineering performance characterization analysis
Playback utility enables the Test Engineer to quickly view collected data to chart out a path forward for further testing.
Automate a Design of Experiments matrix of conditions, through new sequencer capabilities, to more quickly arrive at product characterization parameters.
All collected signals are now housed in one TDMS file instead of multiple files from different applications.
The UUT is a complete engine, with a focus on one of the mechanical subsystems. Data is collected on over 100 channels, measuring temperature, vibration, strain, RPM, position, and pressure. Engine management data (e.g., component location, pressures, engine speed, and status flags) is collected via CAN. The engine speed is set via an analog output, and subsystem setpoints are sent to the ECU via CAN. SCXI is still used on some of the old test stands but is being phased out in favor of cDAQ. The test system software was developed in LabVIEW.
Synchronizing data from multiple data logging instruments
Client – Manufacturer of commercial equipment
The client already had a method in place to log data needed for validation testing. However, this data was acquired from multiple independent data logging applications. They needed something to aggregate the data and align/synchronize the data across multiple instruments.
Viewpoint developed a LabVIEW-based product validation solution that continues to utilize the existing data logging hardware but uses software to aggregate and synchronize the data from multiple sources. This simplifies post-processing.
Synchronized & aggregated data from multiple instruments
Ability to add capability for new instruments later on
Real time graphing of all channels
Channel averaging across multiple instruments
Ability to save data acquisition configurations for future use
Faster channel configuration than current data logging applications
The data logging software unifies the collection of data for a particular validation test. The software configures each instrument, kicks them off, logs the data to a TDMS file, and also graphs data and displays real-time values.
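The alignment step can be pictured as resampling every instrument's samples onto one shared timebase. The linear-interpolation sketch below is purely illustrative of the idea; it is not the actual algorithm or data format used in the delivered system:

```python
def interp(t, ts, vs):
    """Linearly interpolate one instrument's samples (ts, vs) at time t.
    Assumes ts is sorted; clamps outside the recorded range."""
    if t <= ts[0]:
        return vs[0]
    if t >= ts[-1]:
        return vs[-1]
    for i in range(1, len(ts)):
        if t <= ts[i]:
            frac = (t - ts[i - 1]) / (ts[i] - ts[i - 1])
            return vs[i - 1] + frac * (vs[i] - vs[i - 1])

def aggregate(instruments, t0, t1, dt):
    """Resample every instrument onto one shared timebase so that rows
    across instruments line up for logging and graphing."""
    grid = [t0 + k * dt for k in range(int(round((t1 - t0) / dt)) + 1)]
    return [[t] + [interp(t, ts, vs) for ts, vs in instruments]
            for t in grid]

# Two loggers sampling at different rates and time offsets:
logger_a = ([0.0, 1.0, 2.0], [10.0, 12.0, 14.0])
logger_b = ([0.5, 1.5, 2.5], [100.0, 110.0, 120.0])
print(aggregate([logger_a, logger_b], 0.0, 2.0, 1.0))
```

Once all channels share a timebase, channel averaging across instruments and unified TDMS logging become simple row operations.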
An automated system permits faster validation, unattended test, an increase in throughput, and can free up resources for other tasks during the weeks long endurance test.
Client – A manufacturer of aircraft components in the mil-aero industry
New product development drove the need for a new endurance test system for product validation. The old systems were not designed to test the newly designed part (aircraft actuators), and the company didn’t have the time or resources to reconfigure existing systems to perform the testing required.
The new PXI-based endurance test system provides automated electromechanical testing, full data recording, report generation and a diagnostic panel for intelligent debug. Viewpoint selected the NI equipment, while the test consoles, and other components were selected and fabricated by the customer.
An automated system permits faster validation, unattended test, an increase in throughput, and can free up resources for other tasks during the weeks long endurance test.
Full data recording with a data viewer enables post analysis, which provides the ability to review and analyze raw signals captured during execution. Channel examples are actuator LVDT position, load, current, and encoder actuator position.
Summary report capability allows the customer to document the amount of testing completed against the full endurance test schedules.
A manual diagnostic operational panel provides the ability to verify particular DUT functionality or components without running an entire schedule.
Systems can be paused and restarted to allow for “scheduled maintenance” of the DUT such as inspections, lubrication, etc.
The PXI-based endurance test system enables data collection, deterministic PID loop control, emergency shutdown, and a diagnostic panel for manual test and debug operation. The system runs endurance test schedules, each defined as a recipe for test execution. These customer-defined, DUT-specific schedules are designed to simulate, as closely as possible, the actual conditions the DUT would see in a real-world application. LabVIEW Real-Time was used for the deterministic looping for closed-loop control of actuator position and load. LVDT demodulation was performed on a PXI FPGA card programmed with LabVIEW FPGA.
Full Data Collection for Real-Time and Post Analysis
Deterministic PID Loop Control
Diagnostics Panel for Manual Test and Debug
Endurance Test Schedule Execution
Hydraulic Control Panel for Source & Load PSI Control
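A minimal discrete PID loop of the kind run deterministically on the real-time target might look like the sketch below. The gains and the toy first-order plant are invented for illustration and bear no relation to the actual actuator dynamics:

```python
class PID:
    """Minimal discrete PID controller (illustrative gains only)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy first-order plant toward a position setpoint at 100 Hz.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
position = 0.0
for _ in range(2000):                              # 20 s of loop time
    drive = pid.update(setpoint=1.0, measured=position)
    position += (drive - 0.2 * position) * 0.01    # toy plant model
print(round(position, 3))
```

On the real system, loops like this run on LabVIEW Real-Time at a fixed period, which is what makes the control deterministic.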
Ability to run tests unattended and overnight reduces operator labor and compresses test schedules
Client – Major Aerospace Component Supplier / Manufacturer
The client had an older VB & PLC-based test system in place already, but it was obsolete. A new endurance test system needed to be developed to validate prototyped components (in this case, aircraft & aerospace bearings). Many of the prototypes are one-off, so it was important that the test system not destroy the component.
A new endurance test system was developed to validate prototyped components. The test system can be configured for automatic shutdowns so as not to destroy the component under test in the event of unexpected performance of electro-mechanical subsystem components. The updated endurance tester supports product validation by allowing the product to run under various test conditions (e.g. speed, load, oil flow, temperature) and collecting data for analysis.
Viewpoint developed the software and selected the NI hardware (other hardware was selected by the client).
Ability to run tests unattended and overnight eases operator labor and compresses test schedules
Data collection allows for offline engineering analysis
Automatic shutdowns reduce destruction of the prototype component under test
The updated cRIO-based endurance tester incorporates configurable profiles, data logging, and automatic shutdown to allow for safer extended validation testing. LabVIEW FPGA and LabVIEW RT were used together to interface with the test hardware sensors and controls. LabVIEW was used to create the HMI for the test system.
Closed loop control of bearing test oil flow
Axial load control
Driver for Emerson VFD
E-Stop and safety management (shutdowns based on alarm limits)
Data collection – temperature, pressure, flow, vibration, frequency
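The alarm-based shutdown logic can be sketched as a per-scan limit check. The channel names and limit values below are hypothetical, not the actual alarm table:

```python
# Hypothetical channel limits (low, high); values are illustrative only.
LIMITS = {
    "oil_temp_C":    (10.0, 120.0),
    "oil_press_kPa": (50.0, 400.0),
    "vibration_g":   (0.0, 5.0),
}

def check_alarms(sample):
    """Return the list of channels outside their limits for one scan."""
    return [name for name, (lo, hi) in LIMITS.items()
            if not lo <= sample[name] <= hi]

def scan(sample, shutdown):
    """Run one monitoring cycle; trigger shutdown on any violation."""
    violations = check_alarms(sample)
    if violations:
        shutdown(violations)
    return violations

tripped = []
print(scan({"oil_temp_C": 135.0, "oil_press_kPa": 200.0, "vibration_g": 1.2},
           shutdown=tripped.extend))
print(tripped)
```

On the actual tester this check runs continuously so that an out-of-bounds reading halts the test before the one-off prototype is damaged.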
Multiple International Deployments Help Prove Product Meets Spec
Each endurance test can run upwards of 6 months.
Client: Major Automotive Component Supplier
A new endurance test system was developed to give more precision in the control setpoint. This additional precision enabled potential clients to review the product performance in real-life situations. Each endurance test can run upwards of 6 months.
The updated endurance tester supports product validation by providing the desired parameter control method, allowing the client to demonstrate more convincingly that their part met the stated specification.
Viewpoint developed the software and selected the NI hardware for the first unit. The client is now deploying copies of this system to multiple international manufacturing plants.
Able to prove meeting a particular product specification of interest
Closed loop parameter control
Emergency shutdown functionality
The cRIO-based endurance tester provides closed loop control, data collection, and alarming with controlled and emergency shutdown functions. The operator can manually configure a test or load a saved configuration. After a manual operator check to make sure the setup is operating correctly, a successful test will run its full duration and stop on its own.
Creating an N-Up Tester to handle increased production volume demands
Enhanced throughput offers ROI payback period of less than 1 year
Automotive Components Supplier / Manufacturer
The company makes automotive components in very large volume, several part models each at more than 1 million per year.
The client’s primary concern was conserving floor space. They were completely out of spare manufacturing space.
Viewpoint created an N-up NI PXI-based Manufacturing Test System. In this case, N=6 because analysis showed that a 6-up electronic part tester allowed the test operator to cover the test time with the load/unload time.
At the high volumes needed, the client needed to parallelize as much as possible. The cost of 6 sets of test equipment and device sockets was less important than speed. Using the equation:
ProfitPerUnit x NumberAdditionalPartsPerYearAfterParallelizing > CostOfTestEquipment,
being able to completely parallelize made the number of extra units per year large enough that the payback time for completely duplicating the measurement instrumentation for each UUT socket was less than about 1 year.
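The payback inequality above can be evaluated directly; the figures in the sketch below are purely illustrative and are not the client's actual numbers:

```python
def payback_years(profit_per_unit, extra_units_per_year, equipment_cost):
    """Years to recoup the cost of duplicated test equipment from the
    additional throughput gained by parallelizing."""
    return equipment_cost / (profit_per_unit * extra_units_per_year)

# Hypothetical figures for illustration only:
years = payback_years(profit_per_unit=2.0,
                      extra_units_per_year=150_000,
                      equipment_cost=250_000)
print(round(years, 2))   # well under 1 year at these assumed values
```

Whenever the annual profit from the extra units exceeds the equipment cost, the payback period drops below one year, which is the condition the inequality expresses.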
Paid for itself in less than 1 year by the enhanced throughput.
This approach consumed about 20% of the floor space that would have been needed to duplicate the test system 5 more times (for a total of 6 testers).
Viewpoint developed an NI TestStand application that ran 6 instances of the test sequence independently of each other utilizing the duplicated PXI-based test equipment. The common parts of the overall master sequence were:
Startup check for the entire test stand
Shutdown of the entire test stand
Archiving the test results into the database
Part handling was managed by a PLC and robot which delivered the parts from a tray into the UUT sockets. Digital bits were used for signaling the test sequence which parts were present in their sockets and ready to test.
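The structure of one master run, with common startup, N independent sequence instances, and common shutdown/archiving, can be sketched with Python threads. The real system used NI TestStand with PXI hardware; the socket results here are placeholders:

```python
import threading

def run_socket(socket_id, part_present, results, lock):
    """One independent test-sequence instance per UUT socket."""
    if not part_present:          # PLC signals part presence via digital bits
        return
    outcome = f"socket {socket_id}: PASS"   # measurement steps would go here
    with lock:
        results.append(outcome)

def run_test_stand(parts_present):
    """Common startup, N parallel sequences, common shutdown/archiving."""
    # Startup check for the entire test stand would go here.
    results, lock = [], threading.Lock()
    threads = [threading.Thread(target=run_socket,
                                args=(i, present, results, lock))
               for i, present in enumerate(parts_present)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Shutdown and archival of results to the database would go here.
    return sorted(results)

print(run_test_stand([True, True, False, True, True, True]))
```

Because each socket's sequence is independent, a missing or slow part in one socket never blocks testing in the other five.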
Reduced test time across several products by an average of ~25% and reduced time to create paperwork by ~3x
Manufacturer of high-voltage power supplies
The client already had an existing manufacturing test system in place. They wanted Viewpoint to enhance the tester due to an increase in production volume demand. Viewpoint reviewed the existing test system and noted 3 areas for improvement:
Automation available in the measurement instruments – most of the test equipment was automatable via some combination of serial, GPIB, or Ethernet interfaces. Furthermore, some equipment, such as an oscilloscope, could store and recall setup configurations, which the test operators already used to decrease setup time for the next test step. However, most of the equipment setup was not yet automated.
Operator time spent on each test step – the client had been through a Lean assessment and had already done a good job of timing operations. However, we specifically noted that the operator was manually connecting to the test points and manually transcribing to paper the measurement results from instrument displays.
Automating the connections – many types of product models were being tested at this test system. Connecting the test equipment to all sorts of products would require either 1) many types of test harnesses and connectors or 2) a redesign of the products to make test connections simpler and quicker.
The enhanced automated test system included automation of instrumentation interfaces, a test executive to run the test sequences, automated test report generation, and automated test data archiving for the electronic UUT.
Reduced total test time across several products by an average of ~25%.
Time to create paperwork was reduced by ~2/3 due to automated data collection.
The enhanced test system included the following updates:
Test sequence automation
Automated test report generation
Automated test data archiving
Automation of instrumentation interfaces
Configurable automated test steps associated with each type of measurement instrument. The test operators would create a sequence of steps to setup each instrument and record the resulting measurement. The sequence of steps could be saved and recalled for each product to be tested, so the instruments could be used automatically.
New programmable meter – integrated the new DMM meter with a programmable interface to replace the one that was not automatable.
Foot switch integration – Since the connections to the test points were manual, a foot switch allowed the operator to take the measurement and advance to the next step.
The StepWise test executive platform managed the multiple test procedures created for the different products. StepWise also handled creation of HTML reports for every part tested.
It did not provide the ability for unattended operation
The thermal control had to be set manually
They wanted to do less manual review of the data
The client develops mission-critical products, so there’s a desire to reduce manual operations because they have to explain any anomalies, and manual operations are typically more error-prone. They needed repeatable results that they could trust.
Viewpoint developed a new test system that utilized new hardware and software, augmented by existing low-level hardware and firmware. The test system was developed to perform both functional testing for production and environmental testing, and was designed to handle up to 4 DUTs at once. It utilizes the StepWise test executive software with custom test steps, which allowed the client to create their own highly configurable test sequences.

The system was developed in two phases, with the second phase adding support for an FPGA expansion backplane (NI CompactRIO chassis) to provide future capability for bringing some of the microcontroller sequence activity into the NI space. In addition, the previous version had a mix of serial, TTL, and USB instrumentation, which was not as robust as Ethernet-based instrumentation. Phase II involved upgrading to all Ethernet-based instrumentation and did away with the original test system's many manual toggle switches, which could be used instead of the programmable mode through the software.
~40% test time reduction per unit
~25% reduction in anomalies that needed to be justified
Our client produces welding consumables. These products are inspected for continuous improvement of product performance. Our client wanted to standardize their data collection method to improve product quality and utilize SPC (statistical process control) across multiple international manufacturing facilities.
The solution is a relatively straightforward data acquisition system measuring force, vibration and voltage for comparison across multiple international manufacturing facilities to support continuous improvement of product performance.
Standardization of data collection across multiple manufacturing sites
Ability to check product performance tolerances, which could trigger root cause analysis
Ability to analyze data across product runs and across sites for SPC
The system utilizes off-the-shelf data acquisition hardware from National Instruments along with custom LabVIEW code to perform force and vibration measurement and basic calculations such as RMS, minimum, and maximum. Each test generates an MS Word file showing summary data as well as graphs of each attribute over time. In addition, the program creates (and automatically archives) a complete data set of all data recorded during the trial, and finally adds a line with all the summary results and comments to a Master log file. This Master log file can then be sorted by date, wire type, diameter, or any other input for analysis.
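The per-trial summary and Master log line might be sketched as below. The trial ID, wire designation, and sample values are invented for illustration, and the CSV layout is an assumption rather than the actual log format:

```python
import math

def summarize(samples):
    """Basic per-trial statistics like those written to the summary report."""
    rms = math.sqrt(sum(x * x for x in samples) / len(samples))
    return {"rms": rms, "min": min(samples), "max": max(samples)}

def master_log_line(trial_id, wire_type, diameter_mm, stats):
    """One sortable line appended to the Master log file (CSV-style sketch)."""
    return "{},{},{},{:.3f},{:.3f},{:.3f}".format(
        trial_id, wire_type, diameter_mm,
        stats["rms"], stats["min"], stats["max"])

force = [1.0, -2.0, 2.0, -1.0]      # hypothetical force samples
stats = summarize(force)
print(master_log_line("2023-04-01-T07", "ER70S-6", 0.9, stats))
```

Keeping one summary line per trial in a single sortable file is what enables the cross-run, cross-site SPC comparisons described above.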
Our client had an old manufacturing inspection system (really two systems: one inspection system and an assembly/inspection system) that would no longer be supported by IT and was going to be removed from the network. They needed the operating system updated, so they decided to take this as an opportunity to port the old code from VB to C#.NET, as well as update some hardware.
As migration projects often do, this effort began by working with the client to solidify requirements, followed by a reverse engineering effort to understand the old system so that the new system could match it as closely as possible.
The updated manufacturing inspection system (one inspection system and an assembly/inspection system) included a new operating system, ported code, new motion control software, new machine vision software, and a new GUI.
OS Update – Updated operating system that is supported by the IT department and is less of a security risk
Software Porting – Ported software to more maintainable language
Measurement Accuracy – Increased inspection measurement accuracy for sub-set of measurements
New GUI – improved the operator user experience by improving readability, reducing the number of required button clicks, and adding auto-scroll functionality
Report Generation – maintained existing format to interface with customer database
The device under inspection is essentially an image sensor array used for scanning images in high end commercial-grade scanning printers. The inspection system utilizes machine vision and precision motion control to verify the location & orientation of several parts, with measurement accuracy measured in microns.
Sharing Business and Test Data Enables Efficiency Improvements
Reduce Production Costs by Coordinating Business and Test Data
Client: A major manufacturer of aerospace components
Many companies operate in a high-mix, low-volume manufacturing environment. In these situations, production of such parts is often complex, with long assembly and test procedures describing the process to make and verify the part. Discussions of automating any part of these processes are often dismissed because an automated test system is thought to be expensive, especially when each part is thought to need a unique test system.
Our client wanted to improve their capability to manage the assembly procedures and get clarity on the status of any part, whether partially or fully assembled. In the existing process, data was manually entered into a database form, or even handwritten and later transcribed into a database. Often the database was local to the assembly cell. The chance for error was significant, and the lag between data collection and updating the database was often days. When questions arose about the status of a particular unit, many hours could be spent locating and evaluating the associated forms and paperwork.
The steps needed to achieve these goals were clear: automate the collection of data on each part while it was being assembled, so that those results would appear in a business-level database giving a plant-wide view of the status of all parts in progress.
Thus, this project needed to allow read/write access to sections of the Manufacturing Enterprise System (MES) database so that information about a part being assembled could be obtained automatically and results could be submitted to that MES database automatically.
We designed the PXI-based system based on the StepWise test executive platform to automate the assembly and testing. This platform enables two significant changes. These changes were made at each assembly cell by having the operator use a test PC and perhaps some measurement equipment as appropriate for the part(s) being assembled at that cell.
First, we replaced all the printed assembly procedures with electronic records so that any operator could review the latest version of the work instructions on a computer screen. This approach helped with version control, which was especially important since the client had various model revisions that came through the factory for rework, each with slightly different versions of the assembly instructions.
Second, we displayed those electronically documented work procedures as steps in a test executive, allowing the results of each step in the assembly procedure to be captured electronically. When an assembly step was purely manual with no measurements, the fact that step was completed would be recorded, along with information such as the name of the operator performing the step, the duration that the step took, and so on. When a step required a measurement to be made, such as a functionality verification or a calibration result, the measurement would be collected. If the equipment making that measurement could be automated, we would collect that data automatically, and not require the operator to type the result into a computer form.
The outcome of this effort has enabled the client to get a snapshot of the status of parts in assembly, i.e., Works in Progress (WIP), quickly and accurately.
After these changes were made, many additional capabilities are now available with the advent of purpose-built queries into the appropriate MES database tables. The table below shows the overall efficiency gains achieved.
The key is the combination of the electronic test results obtained at the test equipment with information on work orders and manufacturing flow held in the various tables in the business MES database. This improvement happens even with manual or semi-automated test systems, and does not require a completely automated assembly and test system. Thus, the cost of the test system is much less than usually expected and, hence, the benefits are more easily cost-justified.
Designing an Automated Fuel Cell Validation Test Stand
Verifying a New Fuel Cell Design Through Automated Operation
Client: A major automotive manufacturer
Micro Instrument, an automation vendor that builds test and validation stands, has extensive experience with programmable logic controllers (PLCs) and stand-alone controllers for controlling repetitive motion, safeties, and other “environmental” parameters such as pressure and temperature. The company typically uses PLCs to reliably deliver discrete I/O control and standard PID loop control.
However, Micro Instrument’s customer, a major automotive company, was interested in investigating fuel cells as a power source and they needed to run these fuel cells under a wide range of conditions for extended durations, for both design validation testing and durability testing purposes. Furthermore, the client wanted to implement more advanced control algorithms than simple PID.
The customer knew they needed control loops that predicted system response so we could eliminate overshoot and/or achieve a faster approach to a setpoint. But, because the customer did not know in advance exactly what such “smart” controls would entail, it was beneficial to have the full power of LabVIEW to develop such controls. Providing this functionality with a PLC would be cumbersome, if not impossible.
The customer had some Compact FieldPoint hardware which they wanted to use for this project, so we needed to ensure that this equipment would be sufficient to deliver the required control performance and tolerances. Also, the system needed to conduct PID control in two forms – PWM and continuous control. Importantly, this FieldPoint hardware had a real-time controller running LabVIEW Real-Time.
We developed a flexible control environment using NI Compact FieldPoint and LabVIEW Real-Time to meet the customer’s system control demands. For example, to predict system response, we programmed the Compact FieldPoint to run control loops that were aware of imminent system-state changes and changed their control schemes accordingly.
As with most validation test systems, we needed to monitor conditions for safety. New product designs are often operated near the edges of safe operation in order for the designer to understand how the product performs in extreme conditions. For this fuel cell application, destructive over-heating and over-pressure could occur. Both digital and analog signals were watched in real-time to assure operation within reasonable bounds and allow a safe shutdown if the fuel cell ran into out-of-bound conditions.
The application used the following independent parallel loops:
Seven for PWM-based temperature control
Two for continuous pressure monitoring
Four for solenoid and sensor monitoring and control
Fifteen safety loops
Data collected during the validation tests were saved to a local PC for later performance analysis and anomaly detection.
The combination of Compact FieldPoint with LabVIEW Real-Time enabled the customer to run the required custom control algorithms and it surpassed the capabilities offered by standard PLCs.
Client: A major manufacturer of data-critical three-phase uninterruptable power supplies
A major manufacturer of very large three-phase uninterruptible power supplies (UPSs) needed better measurement, analysis, and report generation capabilities. Their clients used these UPSs on mission-critical equipment, such as data warehouse server farms and communications equipment. Existing testing procedures used equipment that did not allow for complete simultaneous coverage of all sections of a UPS unit, from input to output. Our client wanted a better understanding of the signals on each of the three phases at various locations within the UPS, especially when power sources were switched or faults were induced.
Also, in the prior test procedure, factory acceptance reports were manually assembled for our client’s end-customers, delaying the final sign-off. Finally, since the end-customer might want to run a specially configured test or run a series of tests in a different sequence than some other end-customer, our client wanted to be able to rerun certain types of tests or run tests in a customer-specific order. Thus, the test sequencing needed to be flexible and editable, possibly on the fly.
Finally, synchronization between the data collection on all signals was critical to assess functionality, since all 3-phases of the UPS output needed to be in the proper timing relationship.
At a high-level, the majority of testing a UPS relies on knowing the reaction of the UPS to changes on the input side (such as a grid power outage) and changes on the output side (such as an immediate heavy load). Thus, many of the tests performed on a UPS deal with power quality measurements, such as defined by IEEE 519 or IEC 61000 series standards, which cover both continuous and transient operation. The StepWise test execution platform was utilized to allow the customer to develop arbitrary test sequences using the application specific test steps developed for the program.
Our solution used a cRIO to measure both current and voltage from each leg of the 3-phase power (and neutral) by using appropriate cSeries modules connected to various voltage and current test points within the UPS. The cRIO had enough slots to allow a single cRIO to measure a single UPS.
Assessment of continuous operation mainly reviewed the UPS output power quality. Here, it was important to know the amplitude and phase of each leg of the 3-phase power. Synchronous data acquisition between all voltages and current channels was needed for proper timing alignment of collected data points.
Assessment of transient operation was often a review of power ripple and recovery time. For example, in the event of grid power loss, a UPS would switch over to backup power, creating a small transient on the output of the UPS. Again, the voltages and currents needed to be collected synchronously to assure that event timing was aligned.
For increased power capacity, the UPSs could be connected in parallel. When ganged together, the continuous and transient behavior of each UPS needed to be compared to the others, in order to capture the behavior of the entire combined system. Consequently, each cRIO (one per UPS) had to share a clock to enable synchronous data collection across all cRIOs. A timing and synchronization module was placed into each cRIO chassis with one cRIO acting as the master clock source and the others being slaved to that clock.
The overall test system architecture had a master PC communicating with each cRIO. Each cRIO was placed in certain activity states by the master PC, such as “arm for measurement”, “transfer collected data”, and “respond with system health”. This arrangement enabled the number of cRIOs to shrink or grow depending on the number of UPSs being tested in parallel.
The test system connected the timing module in each cRIO in a daisy-chained configuration, leading to a data sampling synchronization error of less than 100 ns between all cRIOs, which translates to about +/-0.001 degree of phase error for 60 Hz power signals. This timing synchronization was more than sufficient to analyze the collected waveform data for power quality and transient structure.
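The relationship between timing skew and phase error is simple arithmetic, and the sketch below reproduces the quoted figure. It assumes the sub-100 ns specification corresponds to roughly +/-50 ns about a shared clock reference; that interpretation is ours, not stated in the original system documentation.

```python
def skew_to_phase_deg(skew_seconds: float, line_freq_hz: float) -> float:
    """Phase error in degrees caused by a sampling-time skew at a given line frequency."""
    return skew_seconds * line_freq_hz * 360.0

# +/-50 ns about a shared reference (i.e. <100 ns between any two chassis) at 60 Hz:
print(round(skew_to_phase_deg(50e-9, 60.0), 5))  # 0.00108 degrees
```

For power quality work, phase errors three orders of magnitude below a degree are negligible compared with typical sensor and channel-to-channel errors.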
LabVIEW was used to create various configurable test steps that could be executed in arbitrary order as well as in an automated sequential manner. Our client was thus able to test a UPS in a predefined manner and also react rapidly to customer requests during a factory run-off test. For example, the customer might ask to re-run the same test several times in a row to validate consistent responses.
Each type of test included automated analysis routines that numerically calculated the relevant parameters against which the UPS was being checked. Not only was this automated calculation faster, but it reduced mistakes and improved reproducibility as compared to the previous post-testing partially manual calculations.
Data from all tests, even repeated ones, on a given UPS were archived for quality control purposes and made a part of the device history for that UPS.
Finally, the report generation capability built into this test system was far superior to the previous methodology: our client could hand their customer a professional report package almost immediately after testing was complete. Customer satisfaction was improved substantially with this state-of-the-art test system.
Client: A major manufacturer of implantable cardiac and neural stimulators
Our client needed several extremely reliable test systems to test the batteries that power their implantable medical devices. These new test systems were needed for two main reasons. First, they needed to replace existing obsolete test equipment based on antiquated hardware and software. Second, new battery designs could not be tested on the old equipment.
A critical aspect of the new test system was the need to detect any excessive charge extracted from a battery, which would render it unsuitable for surgical implantation. The test system therefore needed to monitor the total energy withdrawn from a battery during testing to ensure it never exceeded a set limit, while also offering precise control of the type of pulses being drained from the battery.
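The core of that safety requirement is a running energy integral checked against a limit. A minimal sketch of the idea follows; the class name, limit value, and pulse parameters are invented for illustration and are not the client's firmware.

```python
class EnergyGuard:
    """Accumulates energy drawn from a battery and flags when a budget is exceeded."""

    def __init__(self, limit_joules: float):
        self.limit = limit_joules
        self.energy = 0.0

    def update(self, volts: float, amps: float, dt_seconds: float) -> bool:
        """Add one sample's worth of energy; return True while still within budget."""
        self.energy += volts * amps * dt_seconds
        return self.energy <= self.limit

guard = EnergyGuard(limit_joules=5.0)
# 3.0 V at 0.5 A sampled every 1 ms -> 1.5 mJ per sample, 1.5 J over 1000 samples
safe = all(guard.update(3.0, 0.5, 0.001) for _ in range(1000))
print(safe, round(guard.energy, 3))  # True 1.5
```

In the real system this check ran continuously during pulsing, so a test could be aborted quickly, before the battery became unusable.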
All test results had to be stored in a database in order to maintain device history for each battery manufactured for archiving, quality control, and process improvements.
The updated manufacturing test system was PXI-based, along with a custom microcontroller-based circuit board for some low-level control. Each PXI controller communicated with the microcontroller (uC) on the custom PCB via CAN. The uC controlled the current drain from the battery while monitoring actual current and voltage from the battery at over 1000 samples per second using a precision 6.5-digit PXI DMM. Additionally, each PXI chassis was used to test many hundreds of batteries. Signal connections were handled by several switch multiplexers. Overall control of all the PXI testers was managed by a host PC connected to the PXI controllers.
Reduced test system cost vs. a complete COTS solution by combining LabVIEW RT on PXI with firmware on a microcontroller-based custom circuit board
Enabled tight control of DUT operation on the controller with microsecond-level responsiveness while being supervised by the higher-level PXI RT
Quick-reaction test abort capability
Test results stored to database for archiving, quality control, and process improvements
In a simplified view, the testing proceeded by pulsing the battery with a sequence of pulses of varying duration and amperage. The exact sequence was unique to each DUT model. Measurements were made using a PXI chassis filled with various NI boards such as DMMs, for accuracy, and data acquisition cards, for general-purpose use.
Additionally, the pulsing amperage levels needed to be tightly controlled in order to know that the tests had been performed properly. Thus, a real-time amperage control scheme had to be implemented to maintain the level requested for each pulse. We accomplished this control with an analog control circuit on a custom Viewpoint-designed circuit board, supervised by a Microchip PIC microcontroller. The LabVIEW RT application communicated with the microcontroller to set up the pulsing sequence and coordinate the start and stop of the pulsing and the NI acquisition hardware.
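The actual regulation was done in analog hardware under microcontroller supervision, but the closed-loop idea of holding a commanded amperage can be illustrated with a discrete PI controller. The gains, sample time, and the crude load model below are invented purely for illustration.

```python
def pi_step(setpoint, measured, state, kp=0.5, ki=100.0, dt=1e-4):
    """One discrete PI update; the dict carries the integrator term between calls."""
    error = setpoint - measured
    state["integral"] += error * dt
    return kp * error + ki * state["integral"]

# Drive a simple first-order load model toward a 0.25 A setpoint.
state = {"integral": 0.0}
current = 0.0
for _ in range(2000):
    drive = pi_step(0.25, current, state)
    current += (drive - current) * 0.05  # crude plant response, not a real battery
print(abs(current - 0.25) < 0.01)  # True: current settles at the commanded level
```

The integral term is what removes the steady-state error, which matters here because the measured energy budget depends on the pulse amplitude being exactly what was commanded.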
This custom circuitry also reduced the overall cost of the test system by about 40%.
The engineering time to design this custom circuitry was more than offset by the reduction in material costs because more than 10 test systems were deployed, allowing the non-recurring engineering effort to be shared between many systems.
When no critical issues were detected, the waveforms acquired by the PXI system were stored and then analyzed to determine the viability of the DUT. The pass/fail disposition, the waveforms, the total energy consumed, and other test results were then passed along to a master PC that managed all these results in a database for archiving, quality control, and process improvements, each set of results being tied to the unique unit serial number.
The test systems provided reliable operation for testing the large annual production volumes of the mission-critical DUTs.
LabVIEW RT – for managing the microcontroller functions and overall data collection and safety monitoring
Microcontroller application – to provide precision pulsing of the batteries
Host PC communication – to both receive pulsing instructions and configurations and to return pulse waveforms for each battery tested.
Monitoring of Testing Inside Environmental Chambers
Our customer required a system that would replace manual charting of tests performed inside various environmental chambers.
Viewpoint designed an automated solution which notifies the technician when the test chamber requires attention and reports chamber utilization for planning and scheduling purposes.
This application was designed for a group that provides long-term thermal and environmental testing to a large number of internal customers at its facility. The group is responsible for approximately 80 environmental chambers which are used for a variety of tests for electronic circuit boards and modules.
These tests typically last between 100 and 4000 hours, with the environmental chambers cycling temperatures according to an externally programmed profile. This system was developed to automatically monitor and provide oversight to the various test chambers under the department’s control.
On an individual chamber basis, the system can verify that the chamber is performing to the test expectations, provide an audit mechanism and generate alarms when the chamber is not operating correctly. The software also is flexible enough to add and edit individual chambers and the tests inside them. The data collected is compared to set limits and, where appropriate, alarms are generated and events are logged to keep a history of what occurred during a test. The test system is capable of running many tests simultaneously.
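The per-chamber limit check reduces to comparing each reading against its allowed band and logging an event either way. A toy sketch of that shape follows; the function, limits, and readings are hypothetical, not the deployed LabVIEW code.

```python
def check_reading(temp_c, low, high, log):
    """Compare one chamber reading to its limits; log an event when out of range."""
    if temp_c < low or temp_c > high:
        log.append(("ALARM", temp_c))
        return False
    log.append(("OK", temp_c))
    return True

events = []
readings = [24.8, 25.1, 31.0, 25.0]  # 31.0 exceeds the 20-30 C band
results = [check_reading(r, 20.0, 30.0, events) for r in readings]
print(results)  # [True, True, False, True]
```

In the real system each alarm event also triggered the notification path (email or pager) and became part of the test's permanent history.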
The system is scalable, and more thermal chambers can be added as needed. The operator can view the status of any given test by selecting the test to be viewed and observing the trend. The server software running on the server PC is tolerant of user logins and logoffs because it runs as a Windows service.
The software was written in LabVIEW as a client/server style application. Using LabVIEW and a small stub of “C” code, the server portion of the software was built into a Windows Service. There is no interface to the server other than the client. The client uses the LabVIEW VI Server technology to communicate with the server. This configuration allows the technicians to check the status of any test from their desk or a remote location.
Test configuration allows the operator to be notified when alarm conditions occur or for a regularly scheduled check of the chamber. The system notifies the operator by sending an email and/or by sending a message to their pager.
All test status information is persistent in an MS Access database so if a power failure occurs, or the system goes down, the tests in progress are not lost. When the system is powered up again, the system will restart any tests that were in progress. Two days of history data is kept in memory for each test so trends can be identified.
The system can generate a number of reports, such as job status, journal events, chamber status, completed test results, and chamber utilization. For each type of report, the technicians can pick from a list of criteria to filter the requested information.
For this application, Dresser-Rand needed an extensible system capable of monitoring numerous signals interfaced to a large gas turbine. Well over a thousand signals needed to be collected from an extremely varied set of data acquisition devices and instruments. The configuration of this system and viewing of data needed to be available from any of a number of computers connected to the data acquisition network. Also, data needed to be available for additional processing on other connected networks. Dresser-Rand required that all of the components necessary to run a test, such as the server, database, acquisition, configuration, and viewing, could run on one computer or be distributed over several computers.
This system utilizes a client-server architecture to acquire signals from a variety of devices and logs the data to a central SQL Server database. The data is then processed and viewed on remote terminals. It is modularly designed to facilitate changes in acquisition hardware as well as viewing and processing software. There are three important components to this application: a SQL Server data management system, TCP/IP packet-based messages for configuration and data, and a flexible, application-independent driver model.
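Packet-based messaging of this kind typically amounts to a fixed binary header followed by a typed payload. The sketch below shows that pattern; the field layout (type, channel count, payload length, big-endian float samples) is invented for illustration and is not Dresser-Rand's actual wire format.

```python
import struct

# Hypothetical fixed header: message type, channel count, payload byte count.
HEADER = struct.Struct(">HHI")

def pack_message(msg_type: int, values: list) -> bytes:
    """Serialize a list of float samples behind the fixed header."""
    payload = struct.pack(f">{len(values)}f", *values)
    return HEADER.pack(msg_type, len(values), len(payload)) + payload

def unpack_message(data: bytes):
    """Recover the message type and sample list from a packed message."""
    msg_type, count, _nbytes = HEADER.unpack_from(data)
    values = struct.unpack_from(f">{count}f", data, HEADER.size)
    return msg_type, list(values)

msg = pack_message(0x0001, [1.5, -2.25])
print(unpack_message(msg))  # (1, [1.5, -2.25])
```

A self-describing header like this is what lets configuration and data messages share one TCP stream: the receiver reads the fixed header first, then knows exactly how many payload bytes follow.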
National Instruments’ LabVIEW was used for the bulk of this project. C, Visual Basic, and Fortran were also used to develop analysis routines and interface with various pieces of hardware.
TCP/IP packet based messages for communication of data and commands
100base-T local network with bridge to other company/worldwide networks
Remote configuration and viewing
SQL Server database
High channel count (1000+ signals)
Flexible data acquisition system
Diverse data acquisition devices: DAQ, GPIB, VXI, RS-232, PLC
Common driver model – drop in drivers, self-aware configuration
Common calculation model – drop in calculations, self-aware configuration
Flexible GUIs with drop in screens
Several software technologies used for various aspects of the project: LabVIEW, Microsoft SQL Server, Microsoft PowerStation Fortran, Microsoft Visual Basic, Microsoft C, Microsoft Access
Client – ECR International: A manufacturer of heating and cooling systems.
ECR has significant domain expertise in developing boiler systems. Viewpoint has significant domain expertise in measurement and control systems. To ensure quality control, ECR International utilizes an end-of-line test stand. Each boiler is test fired and adjustments are made to optimize combustion. Results of the testing are recorded along with the boiler’s unique serial number.
The team at ECR needed an upgrade to one of their end-of-line test systems to support an increase in production capacity without sacrificing the testing and quality assurance process.
ECR also wanted to eliminate the need to constantly adjust test limits based on temperature. This manual adjustment process was time consuming.
They took this as an opportunity to update and clean up the code base for supportability.
Viewpoint was asked to upgrade the existing test stand code and add a bit of functionality. Since ECR already had the necessary hardware, Viewpoint worked with the existing hardware set, porting software and adding new features.
The updates improved usability, saved time, and increased accuracy.
The solution was delivered on time and under budget.
Test time reduction and increased accuracy (automated temperature-based test parameter control)
Increased test flexibility (can test at multiple boiler capacities)
Improved operability with updated user interface
Improved development supportability with cleaned up code base
Improved IT supportability with updated code base
Increased stability (EEPROM test stand lock-up resolved)
Simplifying Report Generation for High-Mix, Low-Volume Industrial Servo Valve Tests
Client: A major industrial servo valve manufacturer
A manufacturer of components for both commercial and military aircraft built a large number of different models of servo valves. Some models were made only a few times each year, while other models were made with an order of magnitude higher volume. Each unit underwent rigorous testing during and after assembly.
Our client needed to submit the results of that testing to their customers, but since the production and testing of each unit happened in many locations, possibly even around the world, many hours were spent locating the appropriate datasets and assembling each report.
Furthermore, our client wanted to improve their responsiveness to requests from their customers by having rapid retrieval of the test report for any part after it had been delivered into the field.
Since the test datasets were varied due to the large numbers of different valve models and associated test procedures, a database was created using a platform based on the Resource Description Framework (RDF). An RDF database can accept arbitrary types of data, manage that data through metadata tags, and adjust gracefully to changes in content and shape of the connections between objects in the database.
This adaptability was key to our client being able to leap past some of the issues in standard SQL-based relational databases.
The results from each test run on each part at each (PXI-based) test system were tagged with metadata and pushed into the RDF database. The StepWise test executive platform interfaced with the RDF database by outputting XML content, which a routine created for the RDF database scanned and converted into RDF data and links. The part ID was a critical tag since it allowed searching the RDF database for all results associated with that specific part. This database resided on a server at the client’s headquarters and accepted data from worldwide locations.
Once the data for each part was housed in the database, a report could be generated. To accommodate the variety of data in that report, web technology was used to render the report pages based on the types of data entered into the database, as described by the metadata tags. For example, data identified as waveforms could be plotted or listed in tabular format. Having reports rendered based on the data types made it possible to handle adjustments to the types of data measured by the test system.
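Type-driven rendering is essentially a dispatch table keyed by the metadata tag, with a fallback for unknown types. The sketch below illustrates the idea; the renderer names and output strings are invented, not the client's web templates.

```python
# Hypothetical renderers, one per metadata-tagged data type.
def render_waveform(d):
    return f"<plot points={len(d)}>"

def render_scalar(d):
    return f"<value>{d}</value>"

RENDERERS = {"waveform": render_waveform, "scalar": render_scalar}

def render(entry):
    """Pick a renderer from the entry's type tag, falling back to raw output."""
    kind, data = entry["type"], entry["data"]
    return RENDERERS.get(kind, lambda d: f"<raw>{d}</raw>")(data)

print(render({"type": "waveform", "data": [0.1, 0.2, 0.3]}))  # <plot points=3>
print(render({"type": "scalar", "data": 4.2}))                # <value>4.2</value>
```

Because the dispatch is data-driven, a new measurement type only needs one new renderer entry, which is what made the reports tolerant of changes in the test systems.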
With the ability to render reports quickly, our client could produce detailed reports for their customers indicating the performance of any specific requested servo valve.
Our client was able to trim report creation time from the previous 3-5 days to less than 1 day, with fewer errors.
Data are now organized uniformly, simplifying the location of desired information, as compared with files stored on various test PCs and file servers.
The client has the ability to generate automatic emails to their customers with the required reports already attached and ready to go.
In potential warranty and customer service situations, having the ability to send the customer a report within hours represented great customer service.
All these features are available consistently across worldwide manufacturing facilities, reducing training and maintenance of procedures. And, of course, the reports handle using metric or English units as appropriate for the end customer.
Client: A major manufacturer of aircraft landing systems
A major manufacturer of aircraft landing equipment needed to develop a means of endurance and fatigue testing new designs for aircraft steering. The actuators involved in steering the nose landing gear (NLG) required precise and reliable control through thousands of steering cycles.
Control loops needed to close in under 1 ms.
Prior systems were handled manually without real-time control and monitoring.
Our customer designed and built a test rig to provide the hydraulics and environmental conditions for the endurance testing on the NLG. Viewpoint Systems supplied the electronic data acquisition and control hardware coupled with real-time software to provide the required fast control loops. The configuration and execution of the 1000s of steering cycles were managed by the same data acquisition and control system through a set of configuration screens that allowed specification of turn rates, min/max angles, drive and resistive torque settings, and so on.
The various PID control loop configurations were also configurable along with gain scheduling required under different operating conditions.
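Gain scheduling means selecting a PID gain set based on the current operating condition rather than using one fixed tuning. A minimal sketch of a temperature-keyed schedule follows; the breakpoints and gain values are invented for illustration, not the rig's actual tuning.

```python
# Hypothetical gain schedule: each entry is (temperature breakpoint C, gain set).
SCHEDULE = [
    (-40.0, {"kp": 2.0, "ki": 0.8, "kd": 0.05}),  # cold-soak gains
    (0.0,   {"kp": 1.2, "ki": 0.5, "kd": 0.02}),
    (50.0,  {"kp": 0.9, "ki": 0.4, "kd": 0.01}),  # hot gains
]

def gains_for(temp_c: float) -> dict:
    """Return the gain set for the highest breakpoint at or below temp_c."""
    chosen = SCHEDULE[0][1]
    for breakpoint, gains in SCHEDULE:
        if temp_c >= breakpoint:
            chosen = gains
    return chosen

print(gains_for(-10.0)["kp"])  # 2.0 (cold-soak gains still apply below 0 C)
print(gains_for(72.0)["kp"])   # 0.9 (hot gains)
```

Scheduling like this is what keeps a hydraulic loop well-behaved across the chamber's ramp and soak profile, since actuator dynamics shift substantially with fluid temperature.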
The environmental conditions were supported by controlling a temperature chamber through ramp and soak settings occurring during the steering tests.
Measurements on the steering performance were collected from commanded setpoints, sensor readings, and controller outputs during the entire test run.
Alarm and fault conditions, such as force exceedance, were monitored continuously during operation so that the system could safely run unattended.
The entire system underwent an extremely rigorous acceptance testing procedure to verify proper and safe operation.
Arbitrary Load and Position Profiles
Flight Position Control
Load Position/Force Control
Endurance/Flight Schedule Execution
Deterministic RT for DAQ and PID Control
PXI/SCXI Hybrid RT Chassis
Discrete Pump Skid Interface
Custom Control Panel/Console
Prior to deployment of our system, test setup was much more manual and operators needed to be present to monitor operation.
With our new system, complete endurance testing could be specified and executed with minimal supervision. Furthermore, the tight integration of real-time control and coordinated data collection made report creation much simpler than before.
The rigorous acceptance test gave trustworthiness to the data and allowed the design engineers to validate performance more quickly than the prior semi-automatic and manual methods of operation.
Test setup has been improved over prior operations. The endurance testing itself operated over a huge number of cycles, lasting weeks to months between scheduled lubrication and maintenance.
The deployed system measures performance during the entire testing, even between the scheduled downtime.
Play back digital test patterns for the RF receiver at real-time rates to understand bit-error rates
Understand effects of RF chain prior to digitization
Allow for platform to assist with algorithm development, debug and optimization
We utilized off-the-shelf hardware combined with custom software and had a working system after ~7 man-weeks of effort. The DRAP system records and plays back digital data only, with A/D conversion being handled by the DUT. The system was developed on the National Instruments PXI Express platform. A RAID array of disks is used to continuously record data. Data manipulation is performed on a Xilinx Kintex-7 FPGA that forms the basis of a National Instruments High Speed Serial board. The DRAP system is connected to the RF receiver using standard SFP+ connectors. A UI connects to the system locally or over Ethernet to monitor and control DRAP during record/playback. The customer can also control the system via an API so that it can be integrated into a larger test system.
Allows for repeatable data through the processing chain.
Can re-sample data, inject new headers into data packets, and re-pack new data.
Replacing Obsolete Custom Electronics with cRIOs in High-Power Capacitor Testing
Modular Embedded cRIO Systems Shorten Development and Reduce Risk in Complex PC-based Test System
Client: A major manufacturer of electrical power generation and distribution equipment.
This project involved retrofitting a test system used to verify operation of a high-power capacitor used in electrical power distribution. This system was originally built around 1990. Critical sections of the original test system relied on custom, wire-wrapped analog and digital circuitry to process, analyze, and isolate the high-voltage and high-current signals created by the capacitor. Analog filters, rectifiers, and comparators produced pass/fail status signals. A master PC, other measurement and control equipment, the analog circuits, and a six-position carousel were integrated to create the entire automated test and control system.
For each unit under test (UUT), test specifications are obtained from a Manufacturing Execution System (MES) and cached locally. The subsystems at each carousel position are designed to run independently. This parallel capability allows greater throughput and reduced test time per capacitor unit. In addition, as different capacitor models move through the carousel stations, the test parameters and conditions must match the particular model being tested.
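The local caching of per-model test specifications can be sketched as a memoized lookup that hits the MES only once per model. The field names and the stand-in MES function below are invented for illustration.

```python
# Hypothetical local cache of per-model test specs pulled from the MES.
specs_cache = {}

def spec_for(model, fetch_from_mes):
    """Return the cached test spec for a model, fetching from the MES only once."""
    if model not in specs_cache:
        specs_cache[model] = fetch_from_mes(model)
    return specs_cache[model]

calls = []
def fake_mes(model):
    """Stand-in for the MES query; records how often it is actually hit."""
    calls.append(model)
    return {"test_voltage_kv": 15 if model == "CAP-200" else 25}

a = spec_for("CAP-200", fake_mes)
b = spec_for("CAP-200", fake_mes)  # second lookup served from the cache
print(len(calls), a["test_voltage_kv"], a is b)  # 1 15 True
```

Caching matters here because the carousel stations run independently: each station can test the model in front of it without a round trip to the MES on every cycle.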
Test results for each UUT are pushed back to the MES for record retention and data mining. The existing MES interfaces were retained exactly for the retrofit.
All capacitors require 100% testing prior to shipment, so the test system is critical to facility operation. Two or even three shifts are common depending on production needs, and the facility cannot afford any significant downtime. Thus, one challenge was to design and build a test system that was robust from day one.
Another huge challenge was the lack of documentation on the existing system, requiring a sizable amount of reverse engineering to understand the test system operation before development on the new system could begin.
Furthermore, one of the most important challenges involved replacing substantial amounts of original test equipment before the new test equipment could be installed. Thus, we absolutely had to minimize the time and risk in this upgrade changeover.
A schematic of the overall system architecture is shown in the figure. The major components of the system are:
Master PC for supervisory control and test execution management
NI cRIOs with FPGAs and Ethernet for independent yet PC-supervised operation
Station-specific FPGA code for replacing wire-wrap circuitry functionality
Integration with existing MES, safety equipment, tooling, and measurement hardware
The architecture chosen was made very modular by the capabilities offered by the cRIO. The Master PC interfaced with station-specific measurement instrumentation as needed, such as GPIB controlled equipment, and coordinated control and outcomes from the cRIOs. This additional equipment is not shown in the figure.
The Master PC coordinated all the activities, including interfacing with the existing MES database and printers at the manufacturing facility. In addition, this PC provided the operator interface and, when needed, access to engineering screens on a diagnostic laptop.
The cRIOs were essential to the success of this test system. Each cRIO functioned as the equivalent of a high-speed standalone instrument.
The cRIOs at each carousel test position had to provide the following features:
Digital I/O for machine feedback, safeties, and fault conditions
State machines to coordinate with external commands and signals
Numeric calculations to emulate the old analog circuitry
Control loops for currents associated with voltages needed by different capacitors
Communication support with the master PC
Computation and detection of internal fault and UUT pass/fail conditions
We were able to duplicate the behavior of the wire-wrapped circuitry by converting the schematic diagrams of these circuits into FPGA code and then tweaking that code to mimic the actual signals we measured with data acquisition equipment on the original test hardware.
The outputs of the circuitry were reconstructed on the FPGA with band-pass filtering, calibration compensation, point-to-point RMS, and phase & frequency functions. This functionality was implemented in fixed-point math and the 24-bit inputs on the A/D provided sufficient resolution and bandwidth for a faithful reproduction of the electronic circuitry. These embedded cRIOs provided a very effective solution to what otherwise might have required another set of costly and rigid custom circuits.
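The point-to-point RMS function mentioned above emits an RMS value for every new sample over a sliding window, rather than once per block. The floating-point sketch below shows that behavior; the FPGA implementation used fixed-point math and different internals.

```python
import math

def ptp_rms(samples, window):
    """Emit an RMS value per sample, computed over the most recent `window` samples."""
    out, sumsq, buf = [], 0.0, []
    for x in samples:
        buf.append(x)
        sumsq += x * x
        if len(buf) > window:
            old = buf.pop(0)
            sumsq -= old * old  # slide the window forward
        out.append(math.sqrt(sumsq / len(buf)))
    return out

# A unit-amplitude 60 Hz sine sampled at 3 kHz: one cycle is exactly 50 samples,
# so a 50-sample window should report RMS = 1/sqrt(2) once it fills.
wave = [math.sin(2 * math.pi * 60 * n / 3000) for n in range(200)]
print(round(ptp_rms(wave, 50)[-1], 3))  # 0.707
```

Choosing the window as an integer number of line cycles is what makes the reading stable; a non-integer window would ripple at twice the line frequency.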
Finally, for optimizing the task of replacing the old equipment, we used a set of cRIOs, not shown in Figure 1, to provide Hardware-In-the-Loop (HIL) simulation of the manufacturing and measurement equipment. These cRIOs imitated the rest of the machine by providing inputs to and reacting to outputs from the embedded cRIO controllers, thus supporting comprehensive verification of the new test system before the tear-out of the existing hardware. Furthermore, these HIL cRIOs enabled fault injection for conditions that would have been difficult and possibly dangerous to create on the actual equipment.