
Thread: Confusion about reading units on a multimeter

  1. #1 Confusion about reading units on a multimeter 
    Junior Member
    Join Date: Oct 2013
    Hi everyone,

    I am having trouble finding information online to answer this question. I am using a multimeter to measure DC current but am having trouble with the units. The MM is connected correctly in the circuit (in series) with the probes all plugged into the right spots. I've also tried a second MM with the same result.

    What's happening is when the MM is set on 20m the value I read from the circuit is 0.23, but when it's set on 200m it reads 2.3.

    Now the way I understand it (and what my reading online appears to confirm), the 20m setting means the MM can read a maximum of 20 milliamps on that setting, with any reading given in milliamps. Therefore the 0.23 displayed = 0.23 mA. And the 200m setting means a maximum of 200 milliamps, again displayed in mA, so the 2.3 displayed = 2.3 mA.

    Clearly I'm not interpreting the 20m and 200m settings correctly, because by that interpretation I'd be reading 0.23 mA on one setting and 2.3 mA on the other for the same circuit. Is there a conversion factor that applies? Does anyone know how to determine which is the correct reading and what the units are?
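    To show exactly where I'm stuck, here is my interpretation of the two readings as a little Python sketch (the interpretation itself is the part I'm unsure about):

    ```python
    # My understanding: on a digital meter, the selected range names the unit,
    # so the display is already in milliamps on both the 20m and 200m settings,
    # and the range only sets the maximum value and the resolution.
    def display_to_ma(display_value, range_setting):
        milliamp_ranges = {"2m", "20m", "200m"}  # ranges whose display is in mA
        if range_setting not in milliamp_ranges:
            raise ValueError("not a milliamp range: " + range_setting)
        return display_value  # display is already milliamps under my interpretation

    # The two readings I actually got, which should be the same current:
    print(display_to_ma(0.23, "20m"))   # 0.23 mA on the 20m setting
    print(display_to_ma(2.3, "200m"))   # 2.3 mA on the 200m setting
    ```

    If my interpretation were right, those two numbers should agree, and they don't, which is the whole problem.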

    Thanks very much!

  2. #2  
    Senior Member
    Join Date: Jan 2014
    Location: out there
    It sounds like a cheap multimeter with an analog readout. Make sure you are reading off the printed scale that matches the range you have selected; each range uses a different multiplier. Other than that, read the manual.
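    If it is an analog meter, the needle sits on one printed arc and the selected range sets the factor you scale that arc reading by. A minimal sketch of the conversion (the 0-10 arc and the example needle position are assumptions for illustration, not from your meter):

    ```python
    def analog_to_ma(needle_reading, arc_full_scale, range_full_ma):
        """Convert a needle position on a printed arc (0..arc_full_scale)
        to milliamps, where full deflection equals the selected range."""
        return needle_reading * range_full_ma / arc_full_scale

    # Same needle position, interpreted against two different range settings:
    print(analog_to_ma(1.15, 10, 20))    # on the 20 mA range
    print(analog_to_ma(1.15, 10, 200))   # on the 200 mA range: ten times larger
    ```

    That factor of ten between the two results is exactly the mismatch you are seeing, which is why the multiplier (not a unit conversion) is the thing to check.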
