Most measurement comes down to measuring voltage or current. The simplest way to measure voltage is with a DMM set to DC voltage. A lot of ATE-type instruments still just measure voltage but add higher-level functions and increased capabilities. I'm going to look at some simple measurement issues and then at some more advanced instruments and what they can do.

Measurement Basics

Voltage and current measurements are the basic measurements that the various measurement instruments build on. Voltage is measured in parallel, across two points, and current is measured in series, in the current path. A resistance measurement is made by measuring both the voltage and the current and calculating the resistance using Ohm's law. A useful improvement when measuring resistance is to use a 4-wire, or Kelvin, measurement. Figure 1 shows how a 4-wire measurement is made.

Figure 1. A 4-wire or Kelvin resistance measurement.

In Figure 1, separate wires connect the ammeter and the voltmeter to the resistance you intend to measure. The advantage is that the forced current causes a voltage drop on its own wires but not on the voltmeter's wires. Almost no current flows in the voltmeter's leads, so there is essentially no drop across them, and the measured resistance is more accurate.

Instrument Accuracy

It's important to understand how to calculate accuracy for the instrument you are using. It happens all the time that a measurement seems like it is not working, only to determine that the problem is the accuracy of the selected measurement range. Figure 2 shows the DC voltage accuracy specification for the National Instruments 4071 PXI DMM.

Figure 2. DC voltage accuracy specification for the National Instruments 4071 PXI DMM.

Figure 2 shows that accuracy is calculated as parts per million (ppm) of reading plus ppm of range. The table has several columns related to time since calibration and temperature. For example, within 24 hours of calibration, in the 100 mV range, the accuracy is 5 ppm of reading + 4 ppm of range.
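As a quick sketch, that ppm-of-reading-plus-ppm-of-range calculation is easy to wrap in a helper (Python here; the function name is mine, and the 5 ppm / 4 ppm figures are the 24-hour, 100 mV range numbers from Figure 2):

```python
def accuracy_volts(reading, v_range, ppm_of_reading, ppm_of_range):
    """Worst-case DMM error in volts: ppm of reading plus ppm of range."""
    return reading * ppm_of_reading * 1e-6 + v_range * ppm_of_range * 1e-6

# Reading 50 mV on the 100 mV range with a 5 + 4 ppm specification:
error = accuracy_volts(50e-3, 100e-3, 5, 4)
print(f"{error * 1e9:.0f} nV")  # prints "650 nV"
```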
If you are trying to read 50 mV, the accuracy is calculated as follows:

50 mV × 5 ppm + 100 mV × 4 ppm = 250 nV + 400 nV = 650 nV

To make it a little simpler, you could just calculate the worst-case accuracy for the range you are in and know you will always have better accuracy than that. To do that, add the reading and range ppm values together and apply them to the full range voltage, like this:

100 mV × (5 + 4) ppm = 900 nV

Accuracy isn't always in ppm; sometimes it's given as a percentage, which I think is a little more intuitive, but it's still the same idea. While the instrument itself has accuracy as just discussed, there are many other factors that can affect measurement accuracy, including input loading, leakage resistance and current, shielding, and guarding. Here is another example of calculating accuracy, in this case for resistance. Figure 3 shows a resistor where the voltages are known on either side of the resistor.

Figure 3. Resistor to be measured.

Error

The error is how much a measurement differs from the true value. Usually this is a ratio, where the maximum error can be calculated based on the worst-case observed or calculated measurements. For example, taking the 50 mV reading with 650 nV of measurement error from above, the worst case as a percentage is:

650 nV / 50 mV × 100 = 0.0013%

Resolution
The table in Figure 2 also has a column for resolution. Resolution is the smallest portion of the input signal that the instrument can display. A 5½ digit display can show 200,000 counts (0 to 199,999), which makes the resolution of the display 1/200,000 = 0.0005%.

Sensitivity

This is similar to resolution: it is the smallest change in the signal that can be detected. The instrument has to detect and display a change in a signal, which is calculated from the resolution and the current measurement range. If the range is 200 mV and there are 200,000 counts, then 0.2 V / 200,000 = 1 µV of sensitivity. Sensitivity can also be calculated from the number of bits the instrument uses (the bits on the ADC or DAC); with 10 bits, the sensitivity is the input range divided into 2^10 parts. Depending on where you look, the definition given here for sensitivity may be the definition given for resolution. It's kind of up to the different instrument manufacturers as to how it's defined.

Range

The range is pretty obvious: you want to make sure you are not trying to measure a value that is larger than the selected range. However, just selecting the largest range possible all the time will hurt the accuracy of the measurement. I would say you want to measure in a range where the value you are attempting to measure is less than 90% of the range. If you are above 90%, consider whether you are going to see variation that will be out of range.

Precision

The term "precision" is used a lot, but it is really a more informal term that should not be substituted for accuracy. It is defined more in terms of measurement repeatability and reproducibility. It's easy to find an explanation of accuracy vs. precision, usually involving a dart board.

Instrument Types

There are many different types of instruments; I'm going to try to cover the most general-purpose ones that I have worked with.
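Before getting into the instrument types, the resolution and sensitivity arithmetic above can be collected into one short sketch (Python; the 200,000-count and 10-bit figures are the examples from the text, not a particular instrument's spec):

```python
counts = 200_000              # 5 1/2 digit display: 0 to 199,999
v_range = 0.2                 # 200 mV range

resolution_pct = 100.0 / counts    # 0.0005 % of full scale
sensitivity_v = v_range / counts   # 1e-06 V, i.e. 1 uV

# Bits-based version: a 10-bit converter divides the range into 2**10 parts.
bits = 10
sensitivity_bits_v = v_range / 2**bits   # roughly 195 uV

print(resolution_pct, sensitivity_v, sensitivity_bits_v)
```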
DAQ

A DAQ is kind of its own class of instrument that covers multiple functions in one instrument. If you look at the websites for National Instruments or Keithley, they list DAQs in their own section even though a DAQ duplicates the functions of many other instruments. A DAQ usually contains a counter/timer, digital I/O lines, analog I/O for single values, waveform output, and digitizing. This covers a lot, and you can do a lot with a DAQ. When higher performance is required, that is when you need to go to more specialized instruments, like ARBs or high-speed digitizers.

DMM/LCR

A DMM (digital multimeter) is usually required over a DAQ when increased measurement accuracy is required. The trade-off with a DMM is usually speed: in order to achieve higher accuracies, the DMM internally averages measurements over a longer time. Many DMMs can also digitize waveforms, though only at slower speeds relative to other digitizing options. In addition to voltage, current, and resistance, many DMMs include LCR capabilities, measuring capacitance and inductance.

A DMM is often specified in terms of digits. The NI 4071 from Figure 2 is a 7½ digit DMM. If 01.234567 were the number you were reading from the DMM, the digits 1 through 7 are the seven full digits and the leading 0 is the ½. It is only a ½ digit because it can only be a 0 or a 1. If you look at the ranges in Figure 2, the ½ digit really only becomes a 1 when the range is maxed out.

Programmable Power Supply, SMU

A programmable power supply is just what it sounds like: a power supply that can be controlled through software. An SMU (source measure unit) is a programmable supply that also has measurement capability. An SMU can typically be configured to either force voltage and measure current or force current and measure voltage (FVMI and FIMV for short). With an SMU, you set the voltage you want to output and set a maximum current limit.
Then the instrument will vary the current in order to hold the voltage constant and generate some type of alarm if the current limit is reached. Another nice feature of SMUs and programmable supplies is the ability to sweep the output through a list of settings.

ARB, Function Generator

An ARB, or arbitrary waveform generator, differs from a function generator in that its output can be programmed to a custom waveform. A function generator will typically only generate preset waveforms like sine or square waves. Again, having a DAQ might make these instruments unnecessary; it just depends on the specifications of each. Some features that may be better than a DAQ's are higher speed, higher accuracy, more triggering and ARB options, and improved noise performance and filtering.

Counter Timer

A counter timer is an instrument for counting pulses and measuring frequencies. Applications I've seen include counting the number of times a signal crosses a set threshold in a given time, and generating a tightly controlled timing signal; that is, a signal is output at logic high from the counter timer and used as the acquisition window for a measurement.

Oscilloscope, High Speed Digitizer

High-speed digitizers, and the similar but more flexible oscilloscope, are needed for measuring high-speed signals that are too fast for other options like DMMs or DAQs. Where these instruments are really useful is for what's called visualization, which just means seeing what's going on. When something is not working, it's often important to have a high-speed oscilloscope in order to catch the noise or whatever glitch might be causing the trouble.

Summary

A few basic concepts are important to know when making electrical measurements, like how to determine the accuracy and knowing the effect of resolution and range on measurements. A few points on several different types of instruments were covered.
There is certainly a lot more to know about any of these instruments, but these are the kinds of basics I keep in my head to have a simple working knowledge of what each one does.