Electronic Products & Technology

Synthetic Instruments – Real Measurements


By John Stratton

With the current trend to drive down the total cost of ownership of automatic test systems (ATS), industry-standard open architectures have been seen both as a way of reducing the cost of test and of shrinking ATS platforms.

Since these open architectures have been based on rapidly changing commercial computer standards, a large investment in hardware quickly becomes a support problem, the very problem that an open architecture was supposed to fix. The synthetic instrument (SI) solution is currently being driven by the defense electronics market, where a 20+ year support life is essential. To achieve high reuse of SI hardware assets, a tremendous additional burden is placed on the software architecture: the SI software must be flexible enough to accommodate many different types of hardware transducers while maintaining NIST traceability of the final measurement.

What is a Synthetic Instrument?
A synthetic instrument is a concatenation of hardware and software modules used in combination to emulate a traditional piece of electronic instrumentation. The key objectives of automatic test systems based on synthetic instruments are to allow insertion of new technology when higher-performance measurements are required and to lower cost by promoting competition. These goals must be accomplished while minimizing or eliminating the need to rewrite test application software.

According to the Synthetic Instrument Working Group, there are four major components in the SI architecture, as seen in figure 1. This simplified architectural block diagram can describe most microwave instruments: signal generators, spectrum analyzers, frequency counters, network analyzers, et cetera. However, implementing one of these microwave instruments with SI modules may require multiple signal conditioners, frequency converters, and data converters to emulate the function of the original instrument (for example, a vector network analyzer).

The most important functional block in the synthetic instrument architecture is the data converter: the analog-to-digital converter (ADC) and digital-to-analog converter (DAC). Data converters have two main limitations: bandwidth (sample rate) and distortion-free dynamic range (effective bits). While this may seem obvious, advances in data converter technology continue to bring us closer to an ideal software-defined instrument, in which no frequency converters are required. I believe data converters are the most important module because, if the digitizer (ADC module) or the arbitrary waveform generator (AWG with DAC module) has enough dynamic range and bandwidth to capture or generate the entire signal of interest, one needs only to apply software algorithms to convert the voltage, current, or power data to the desired measurement or signal. In other words, multiple measurements can be performed from a single time-data capture. For some low-frequency, narrow-bandwidth applications (for example, audio), this is achievable today.

Figure 1: Synthetic instrument architecture.
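To make the single-capture idea concrete, here is a minimal Python sketch in which one digitized record is post-processed into three different "instrument" readings. All sample rates, signal levels, and names are illustrative assumptions, not from the article:

    import numpy as np

    fs = 1e9                                         # assumed digitizer sample rate, Hz
    t = np.arange(4096) / fs
    capture = 0.5 * np.sin(2 * np.pi * 100e6 * t)    # one capture: 100 MHz tone, 0.5 V peak

    # One time-data capture, three software-defined measurements:
    window = np.hanning(len(capture))
    spectrum = np.abs(np.fft.rfft(capture * window))
    freqs = np.fft.rfftfreq(len(capture), 1 / fs)

    frequency_hz = freqs[np.argmax(spectrum)]            # "frequency counter"
    vrms = np.sqrt(np.mean(capture ** 2))                # "AC voltmeter"
    power_dbm = 10 * np.log10(vrms ** 2 / 50 / 1e-3)     # "power meter" into 50 ohms

    print(f"{frequency_hz / 1e6:.1f} MHz, {vrms:.3f} Vrms, {power_dbm:.1f} dBm")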


How much is enough when determining bits and bandwidth requirements? When considering this question, I like to think of data converters in the frequency domain (amplitude versus frequency), like a spectrum analyzer. Imagine the data converter as a window onto the available signal. This window has only a finite width (sample-rate-limited bandwidth) and height (dynamic range, or effective bits). If you can view or generate your signal within the confines of the window, no signal conditioning or frequency translation will be required.
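As a rough way to size that window, the short Python sketch below uses the standard ideal-quantizer relation (SNR ~ 6.02 * N + 1.76 dB) and the Nyquist limit. The two example converters are illustrative assumptions, not measured parts:

    def converter_window(sample_rate_hz, effective_bits):
        """Return (width in Hz, height in dB) of an ideal data converter's window."""
        width_hz = sample_rate_hz / 2                 # Nyquist bandwidth: window "width"
        height_db = 6.02 * effective_bits + 1.76      # ideal-quantizer SNR: window "height"
        return width_hz, height_db

    # Hypothetical scope-class versus analyzer-class digitizers:
    for name, fs, enob in [("wideband scope", 20e9, 7), ("signal analyzer", 100e6, 13)]:
        width, height = converter_window(fs, enob)
        print(f"{name}: {width / 1e9:.3f} GHz wide, {height:.1f} dB high")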

Let's now take a simple example of generating a 100 MHz CW tone using an AWG (see figure 2). When analyzing this CW tone with a wide-bandwidth digitizer, such as one used in an oscilloscope (7+ effective bits), one can plainly see the 100 MHz signal, but the digitizer doesn't have enough dynamic range to measure the Fs/4 spurs created by the DAC in the AWG. The logical choice, if such analysis is required, would be a digitizer with higher dynamic range, like that in a spectrum analyzer (11 to 13+ effective bits). However, with the increased dynamic range comes a corresponding decrease in bandwidth, now requiring some frequency conversion along with multiple acquisitions to capture and "stitch" together the entire signal of interest. When covering multiple gigahertz of analysis bandwidth, as when searching for unknown spurious signals, many hundreds or even thousands of acquisitions may be required to cover the frequency span of interest.
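A back-of-the-envelope Python sketch shows where those acquisition counts come from; the span and per-acquisition bandwidth below are illustrative assumptions:

    import math

    span_hz = 26.5e9        # hypothetical spur search across a microwave span
    capture_bw_hz = 25e6    # assumed per-acquisition analysis bandwidth

    acquisitions = math.ceil(span_hz / capture_bw_hz)
    print(f"{acquisitions} acquisitions to stitch the span")   # 1060 for these numbers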
The frequency converter block is a concept that is easy to understand but difficult to implement when low-spurious or low-noise performance is required, as in a microwave signal generator or microwave signal analyzer. This SI module does just what you would think: it converts a signal from one frequency to another. A signal generator may take the I/Q output of the AWG and translate it to 10 GHz, thereby generating a radar signal. Conversely, to perform modulation analysis of a communication signal, one might down-convert it from 43 GHz (an LMDS transmission) to an intermediate frequency (IF) that fits within the bandwidth of the digitizer, then analyze the digitizer's output. Since most frequency converters are based on a superheterodyne architecture, the internal mixers create images and spurs. The challenge in designing frequency converters is to minimize these unwanted signals during the conversion process.
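The image problem the mixers create can be seen with a few lines of Python arithmetic; the LO and IF choices here are illustrative assumptions for the 43 GHz example above:

    rf_hz = 43e9             # LMDS-band input signal
    if_hz = 1e9              # assumed IF that fits the digitizer's bandwidth
    lo_hz = rf_hz - if_hz    # low-side LO at 42 GHz

    # A superheterodyne mixer also responds at the image frequency LO - IF,
    # so energy at 41 GHz would fold onto the same 1 GHz IF unless filtered out.
    image_hz = lo_hz - if_hz
    print(f"LO = {lo_hz / 1e9:.0f} GHz, image response at {image_hz / 1e9:.0f} GHz")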

Common Hardware and Software Interfaces – Critical to Long-Term Support Life
All measurement software and algorithms must be separated from the measurement hardware. Take a simple example like the "peak search" function in a signal analyzer. Most signal analyzers support this function with a single command; however, each manufacturer implements it differently, which may yield different results. If, on the other hand, the algorithm resides in the ATS software (whether purchased or written in-house), the results would be the same no matter which hardware was used (assuming the same hardware performance specifications). This would virtually eliminate the hardware obsolescence problem, highlighting the need for common hardware and software interfaces.
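To illustrate the point, here is a minimal Python sketch of a peak search that lives in the ATS software layer rather than in any instrument. Because it operates only on raw (frequency, amplitude) arrays, any vendor's digitizer that can supply such a trace yields the same answer; the function name and threshold are illustrative:

    def peak_search(freqs_hz, amplitudes_dbm, threshold_dbm=-60.0):
        """Return (freq, amp) pairs for local maxima above threshold, strongest first."""
        peaks = [(freqs_hz[i], amplitudes_dbm[i])
                 for i in range(1, len(amplitudes_dbm) - 1)
                 if amplitudes_dbm[i - 1] < amplitudes_dbm[i] >= amplitudes_dbm[i + 1]
                 and amplitudes_dbm[i] > threshold_dbm]
        return sorted(peaks, key=lambda p: p[1], reverse=True)

Because the algorithm is fixed in the system software, swapping the digitizer underneath changes the data quality but never the definition of a "peak".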

The software to calibrate the measurement signal path becomes more complex than it was previously. Most instrument manufacturers achieve their high performance specifications by employing internal correction algorithms that remove amplitude, phase, and frequency-dependent nonlinearities. These NIST-traceable, high-accuracy measurements are also dependent on the components within the instrument itself.
 For example, a signal analyzer will be disaggregated into a down converter, a digitizer, and measurement software. If, as expected, one or more of these modules could be substituted with another vendor's module, how will traceability be guaranteed? A sophisticated software application will be required to solve this instrument-agnostic problem.
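As a sketch of what such an application might do, the Python fragment below applies a module-level correction table in the system software, so the traceable correction follows the data rather than a particular box. The calibration points and function name are illustrative assumptions:

    import numpy as np

    # Hypothetical down-converter flatness measured during a traceable calibration:
    cal_freqs_hz = np.array([1e9, 2e9, 4e9, 8e9])
    cal_gain_db = np.array([-0.2, -0.5, -1.1, -2.3])

    def correct_power(freq_hz, raw_power_dbm):
        # Interpolate the module's measured gain error and remove it from the reading.
        gain_db = np.interp(freq_hz, cal_freqs_hz, cal_gain_db)
        return raw_power_dbm - gain_db

    print(correct_power(3e9, -10.0))   # corrected reading at 3 GHz: -9.2 dBm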

Moore's Law Affects Test Equipment Market
Instrument manufacturers and systems integrators have found that using external PCs gives the best price-versus-performance profile the market offers, yet most instruments still carry an embedded controller. Customers demanding lower-cost, higher-performance test systems are driving the use of external PCs, and this trend will continue as long as Moore's Law holds true (performance doubling every 18 months).

The partially adopted VME/VXI architecture reduced the cost and increased the reliability of sophisticated ATSs, as customers had requested. Most VXI implementations, however, were redesigns of currently shipping test instruments into the VXI form factor. While VXI did address the size and the cost of new instrument development, it failed to adequately address production cost, product availability, and long-term support.

During the early 1990s Intel proposed a new bus architecture for PCs. Peripheral component interconnect (PCI) promised to provide significant benefits over its predecessors, and it did. This was followed by CompactPCI (cPCI), which in turn led to its instrument counterpart, PXI.

Figure 2: 100 MHz CW tone with Fs/4 spurs.

Technology advances have given the PXI/EuroCard form factor broad support. For products comprised mainly of semiconductor components, such as digital I/O and low-frequency analog, one can fit the functionality of an older C-sized VXI card onto a significantly smaller PXI/cPCI card, with lower manufacturing cost and increased reliability. However, this smaller form factor, while good for the digital designer, imposes limitations on the RF/microwave engineer, whose components don't shrink at the pace of Moore's Law and are physically larger than their low-frequency and digital counterparts. For high-performance microwave measurements, or for devices requiring high-sensitivity measurements, which are common in most aerospace/defense applications, PXI currently doesn't provide the necessary EMI/EMC shielding or quiet power supplies.
Even as PXI enjoys large year-over-year industry growth and acceptance, Intel has decided PCI has run out of life. While PCI has been the dominant bus standard for the past ten years, Moore's Law has once again made it inadequate for newer, more powerful applications and designs. PCI Express is the future, according to Intel.
 So where does PCI/PXI go from here? Since the physical hardware backplanes are not compatible (PCI is parallel; PCI Express is serial), Intel has developed a bridging strategy while the industry converts. A logical question might be: how long will PCI Express be around?

LAN-Based Synthetic Instrument Modules
For the synthetic instrument concept to be optimally successful, with both reusable hardware and open software, the industry must agree upon a standard. While no interface standard lasts forever, LAN has the staying power to meet the 20+ year support life demanded by the aerospace/defense industry.

LAN, as a communication medium, was introduced in 1985 (IEEE 802.3a) operating at 10 Mbits per second, and its bandwidth had grown to 10 Gbits per second by 2002 (IEEE 802.3ae): three orders of magnitude in the span of 17 years. Additionally, LAN is one of the lowest-cost interfaces to implement in computers or instruments, being based on standard computer interfaces. The biggest drawback of LAN, however, is its non-deterministic latency, which complicates the inter-module synchronization required by some test applications.

Most of these issues can be addressed with the new IEEE 1588 standard (Precision Clock Synchronization Protocol for Networked Measurement and Control Systems), introduced in 2002. John Eidson of Agilent Labs originally developed this technique, which eventually became IEEE 1588, for distributed instrumentation and control tasks. These LAN-based modules would be based upon the new LXI (LAN eXtensions for Instrumentation) format.
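At the heart of IEEE 1588 is a simple timestamp exchange. The minimal Python sketch below shows the core offset/delay calculation from one Sync/Delay_Req round trip; the timestamps are illustrative, and real PTP implementations add filtering and clock-servo loops on top of this:

    def ptp_offset_and_delay(t1, t2, t3, t4):
        """t1: Sync sent (master); t2: Sync received (slave);
        t3: Delay_Req sent (slave); t4: Delay_Req received (master)."""
        offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock error versus master
        delay = ((t2 - t1) + (t4 - t3)) / 2    # one-way path delay, assumed symmetric
        return offset, delay

    # Example: a slave clock 150 ns ahead of the master over a 1 us symmetric link.
    print(ptp_offset_and_delay(0.0, 1.15e-6, 5.00e-6, 5.85e-6))

With module clocks aligned this way, LAN-based SI modules can timestamp data and coordinate triggers without a dedicated parallel trigger bus.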

Driving down the total cost of ownership of ATSs will require careful consideration of what role the functional elements should take. Ideally, one would utilize as many commercial industry standards as possible (like IVI-COM) without locking oneself into a standard that is likely to change through technology advancement. With this in mind, LAN-based synthetic instruments offer the best compromise between cost, performance, size, and, most importantly, long life.

John Stratton is product marketing manager for the Systems Product Operation, Defence, Aerospace and Transportation Systems Division at Agilent Technologies Inc. www.agilent.com.
