QUESTION: Hi, I have some instrumental analysis homework that I am not quite understanding. The question is asking me, if I am fixing a circuit board, what resistance setting I should use on the multimeter to test the circuit board. My guess is the highest resistance because it would give the lowest loading error, but I am confused about whether the answer is as simple as that. It also asks how I would know if the instrument was running to specifications, which I would guess means checking whether the current readout of the instrument is within the error % given in the specifications. I hope I gave enough information to get the guidance I need to answer these questions.
ANSWER: Without knowing the specific circuit and the multimeter (MM) involved, it is impossible to give you a precise testing procedure.
But normally, for checking voltage you would set the MM to its highest voltage range, then step the range switch down until you get a readable deflection on the scale. For example, if the voltage being tested is only 5 volts and you put the MM on the 1000 V range, you would not see the needle move; so switch down range by range until you reach the 10 V range, which would put the needle near the center of the scale. If you are worried about the MM loading down the circuit under test, you could put a known resistor in series with the test probe; using Ohm's law and the meter's input resistance, you can work out the current and from that the actual voltage on the circuit. If you know the input resistance of the MM, you can use a series resistor of equal value; the two then split the voltage in half, so a 5 V point would read 2.5 V on the MM scale. Hope that is not confusing.
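The loading and series-resistor ideas above can be sketched numerically. This is a minimal sketch assuming a simple Thevenin model of the test point (ideal source behind a source resistance); all component values here are my own illustrative assumptions, not from the question:

```python
# Model: the meter's input resistance forms a voltage divider with the
# resistance in series with it (the circuit's source resistance, plus any
# deliberately added series resistor).

def meter_reading(v_source, r_series, r_meter):
    """Voltage the meter displays: divider between r_series and r_meter."""
    return v_source * r_meter / (r_series + r_meter)

v_s = 5.0      # actual voltage at the test point (illustrative)
r_in = 10e6    # assumed meter input resistance, 10 megohms

# With no series resistance, the meter reads the true voltage:
print(meter_reading(v_s, 0.0, r_in))       # 5.0

# A series resistor equal to the input resistance halves the reading:
print(meter_reading(v_s, r_in, r_in))      # 2.5
```

The halving trick works because equal resistances split the voltage evenly, which is the 5 V → 2.5 V example from the answer.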
Best practice for checking whether the instrument is working to spec is to measure its performance against the published specifications using a calibrated meter. If the instrument under test has a measurable output, you can compare that output with what it should be according to the manual or the device's specifications. Since it is a circuit board, there are probably input and output terminals; testing would mean feeding in a known test signal and measuring the output to see if it matches the specs.
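The spec check described above is just a tolerance comparison. A small sketch, with the function name and the example values being my own illustration of a nominal value with a ± percentage tolerance:

```python
def within_spec(measured, nominal, tol_pct):
    """True if measured lies within nominal +/- tol_pct percent."""
    margin = nominal * tol_pct / 100.0
    return (nominal - margin) <= measured <= (nominal + margin)

# A 12 V spec with +/-10% tolerance accepts readings from 10.8 V to 13.2 V:
print(within_spec(11.5, 12.0, 10))   # True
print(within_spec(9.39, 12.0, 10))   # False, below the 10.8 V lower limit
```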
Let me know if you need more.
---------- FOLLOW-UP ----------
QUESTION: I know I am not supposed to post the question straight from my homework, but I don't think I am describing it correctly:
1. You're trying to troubleshoot a broken instrument and you suspect there's a problem on the circuit board. You have looked up the specifications for the voltages at certain points on the circuit board and are going to check them to try to find the problem. Your digital multimeter has several settings for the resistance of the meter (50 ohms, 10000 ohms, and 10 megaohms), which should you use and why?
2. (Using the parameters above...) On what should be a 12 V measurement plus or minus 10% (according to the specifications) you decide to try all three settings and you get 0.45 V, 8.55 V, and 9.39 V. Which measurement do you trust and why? Is the instrument running to specifications at this point?
For 1. I wanted to say 10 megaohms because that would produce the smallest loading error, and for 2. I wanted to say 9.39 V because it was taken at the largest resistance, but the instrument is not up to specifications because the reading is not within the stated error.
Thanks. You are on the right track. Yes, the highest setting would give the least loading on the circuit whilst the DMM is connected, and would therefore swamp the circuit the least, allowing the reading closest to the true voltage.
Depending on the impedance of the circuit components, the high-resistance load of the DMM might not affect the voltage all that much. You can also work backwards: knowing the input resistance of the DMM at each setting and the voltage read at each, you can predict the actual loading by treating the meter as a load across the test point and applying the parallel-resistance (voltage-divider) relationship and Ohm's law.
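The working-backwards idea can be sketched in Python. This treats the test point as a Thevenin source (voltage V_s behind source resistance R_s), an assumption not stated in the thread, and `solve_source` is a hypothetical helper name; each reading then obeys V_read = V_s * R_meter / (R_s + R_meter), so two readings pin down the two unknowns:

```python
def solve_source(r1, v1, r2, v2):
    """Solve for the Thevenin voltage and source resistance of the test
    point from two (meter input resistance, reading) pairs.

    Each reading satisfies v = v_s * r_meter / (r_s + r_meter), so:
      v1 * (r_s + r1) / r1 == v2 * (r_s + r2) / r2 == v_s
    """
    r_s = (v2 - v1) * r1 * r2 / (v1 * r2 - v2 * r1)
    v_s = v1 * (r_s + r1) / r1
    return v_s, r_s

# The three readings from the question: 0.45 V at 50 ohms, 8.55 V at
# 10 kilohms, 9.39 V at 10 megohms.
v_s, r_s = solve_source(50.0, 0.45, 10_000.0, 8.55)
print(round(v_s, 2), round(r_s))   # roughly 9.4 V behind ~994 ohms

# Cross-check: the model predicts the 10 megohm reading, consistent
# with the measured 9.39 V.
print(round(v_s * 10e6 / (r_s + 10e6), 2))
```

The first two readings alone imply an actual voltage of about 9.4 V, which the 10 megohm reading confirms: the point really sits near 9.4 V, well outside the 12 V ±10% window, so the instrument is out of spec just as the student concluded.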
Good work. Keep going.