Still coming back here to learn stuff 14 years later. :D. Love ya Dave!
I haven't watched more than a couple of your videos, but you clearly shine through as a kind and patient person and an excellent teacher.
I've been trying to understand how counts come into play with accuracy ratings, this video made it crystal clear. Thanks!
All these years later, as a metrologist of some 20 years: 7:40 onwards is closer to what actually happens than most blithe explanations. Broadly speaking, calibration proves past measurements.
Without having to read all the comments, I'll ask a question, hoping it hasn't been asked a lot before. Accuracy, calibration, resolution: shouldn't we also be talking about precision?
Here is how I understand the terms:
Accuracy: How close the measured value is to the actual value.
Precision: How much the measurement might vary, when you measure again and again.
Resolution: That is explained very well in the video, and I can't find the words to describe it right now.
Great video!
You use precision here for what most engineers would call "repeatability".
I don't like the term precision because it is an imprecise term.
Instead of "precision", use the word "uncertainty"
"A man with a meter knows the voltage; a man with two is never sure." - Unknown
+gordovious Actually, you can eliminate the 50:50 possibility if you have an odd number of multimeters. Strictly you should calculate the arithmetic mean; in practice you could take the middle result (the median) as your answer. The same goes for any measurement device, including clocks and watches: with two, you cannot be sure what time it is.
Both men need to get their meters calibrated.
A man with a meter _thinks_ he knows the voltage; a man with two discovers he doesn't. :D
@@camurgo that's accurate lol
@@rpdigital17 You're definitely not sure with one. I'm more confident using two methods: multimeter + screwdriver tester, or multimeter + multimeter, when it comes to AC mains. But for precision measurements/electronics you pretty much need two or three meters. Like Dave said, the more you track the meter's history, the more confidence you will have in it, and the fewer multimeters you will need.
You say things like "people get it wrong all the time" and "everyone knows", but I think a lot of people DON'T know, and you know that. That's why I think such a post would be very valuable to the community as an end-all, go-to resource for all things multimeter: somewhere to send all the noobs, and the not-so-noobs, to learn something or consolidate training. That would be wonderful!
Thank you for making this easy to understand. Best video out there on the subject! You're an excellent teacher brother!
Yes, that's a normal part of the process too. It doesn't even have to be outside the specs for that to happen, just enough to doubt that it will remain within specification for the next interval. In this case they usually give you the before and after adjustment figures. And the serious user will drop the cal interval until confidence is restored in that meter.
You can also specifically ask for adjustment every time regardless.
If calibration isn't adjustment of the meter, then why is it called calibration? Shouldn't it just be called verification instead?
This is the best and most informative EE blog ever! Thank you very much.
It's not about saving display segments, it's the count range of the ADC. Higher count ranges need a higher resolution ADC, which is more expensive and trickier to implement with good accuracy.
The old ones are still good ones. Needed a refresher on this.
Very helpful, thank you for explaining this. I get it now. I've been trawling the web for an answer and this is the clearest I have found.
Thank you, sir. This is a really helpful video for me, as I am a sales engineer dealing with multimeters.
@jpmcbride The term is used in many contexts, and often refers to the entire traceable path, including the history of the instrument itself.
That clarifies things. It's similar to digital calipers: they display to the fourth decimal place (ten-thousandths) but are still only accurate to plus or minus two thousandths. Obviously it depends on the brand/quality, but even my $350 Brown & Sharpe calipers, accurate as they are, can't be expected to do better than 1 to 2 thousandths, like all calipers.
This tutorial was excellent!!!!!!!!!! Although I haven’t done math in years🤭🤭🤭😄
I was surprised as well that calibration does not mean adjustment the majority of the time, but it's completely logical.
Good stuff as usual.
Why is this logical? There is no path to deduce it.
If you are a calibration tech working for, say, the US Navy, and you are "calibrating" an instrument following the cal procedure, it most certainly DOES mean adjustment or repair if you cannot get it into original manufacturer specs. If I spend the money for calibration, I expect to get the instrument back adjusted to within factory specs...
Thanks for the bit at 6 mins in, it was very helpful!
First, I want to say great video - good information. I want to slightly correct one of your terms, though. The term "traceability", when it comes to calibration, does not refer to the historical record of the item as you said. It actually means there is a traceable path through the measurement standards used to calibrate the item (transfer standards, back to primary standards, etc.).
12 years later still watching!
How does this apply when we get to the lower reading limits of a meter? As an example, the Brymen BM867 is listed as being able to read 0.01 mV and has an accuracy of 0.03% + 2d. To calculate the error at the lower limit, can we multiply 0.01 mV by 0.0003, or do lower readings always have worse relative accuracy? The meter specs sometimes state the accuracy as "reading digits" * accuracy percentage + digits. Are "reading digits" the same as the value shown on the display, or the range's max value?
I'd like to know the answer too. In the old days of analog meters, the accuracy was always based on the entire scale. So if you set it to 200 V max, an accuracy of +/-0.5% always meant +/-1 V regardless of which part of the scale you were on. With DMMs, especially auto-ranging ones, it's not so easy to understand accuracy anymore.
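For what it's worth, the usual reading of a modern spec written as ±(% of reading + counts) is: the percentage applies to the displayed reading, and the counts term is that many units of the least significant digit on the current range. A minimal sketch; the 0.03% + 2d spec and 0.01 mV resolution come from the BM867 question above, while the example readings are made up:

```python
def dmm_error(reading, pct_of_reading, counts, lsd):
    """Worst-case error for a +/-(% of reading + counts) spec.
    reading: displayed value
    pct_of_reading: e.g. 0.03 for a 0.03 % spec
    counts: the '+2d' term
    lsd: value of one least-significant digit on the current range
    """
    return reading * pct_of_reading / 100 + counts * lsd

# 0.03 % + 2d on a range with 0.01 mV resolution, reading 50.00 mV:
err = dmm_error(50.00, 0.03, 2, 0.01)   # 0.015 + 0.02 = ~0.035 mV

# Near the bottom of the range the fixed counts term dominates, so the
# relative error blows up: at a 0.05 mV reading the error is still ~0.02 mV,
# i.e. roughly 40 % of the reading.
err_low = dmm_error(0.05, 0.03, 2, 0.01)
```

This is also why the answer to "do lower readings have worse accuracy" is yes in relative terms: the counts term is constant, so it becomes a larger fraction of a small reading.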
That was really interesting!
With that and the other post about voltage burden, maybe you could have a post where you go through all the other critical characteristics of DMMs?
I have a Fluke 87III and there's so much I don't know about it,
like how to use that REL delta button, or how to test transistors with it.
It would be great to have a video that is a single reference for everything about, and everything you can do with, a multimeter.
I have several types of DMM and USB meter; some USB meters can display voltage with approximately 999,999 counts, for example the AVHzY CT3 and Shizuku... Hope to see an accuracy comparison between a DMM and a USB meter on your gear one day.
Thanks for your interesting videos! Now I need help. I have a Supco DM-25 multimeter that had a leaking battery. Now I need to calibrate it so I can use it with confidence. How do I calibrate it now that it is working again? Give me some ideas. Thanks, Tom Erickson.
Why doesn't the Fluke just stay in high resolution mode all the time? It doesn't make sense. Does it take that much longer to make a measurement?
Yes, it takes ten times as long for each extra digit of resolution. Meters get very expensive past 5 digits to get the resolution required. Pretty much all 5+ digit meters have this feature.
Dual slope Analog to digital converters take time. Successive approximation ADCs are fast but they need a DAC, and they are expensive.
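The tenfold scaling mentioned above follows from how a dual-slope (integrating) converter works: it has to count through the full range during the integrate/de-integrate cycle, so each extra digit multiplies the count, and hence the conversion time, by ten. A toy illustration, ignoring auto-zero phases and other real-world overhead:

```python
def relative_conversion_time(digits):
    """Relative dual-slope conversion time: proportional to the number of
    counts the ADC must resolve, i.e. ~10**digits (illustrative only)."""
    return 10 ** digits

# Going from a 4.5-digit to a 6.5-digit reading costs ~100x the time:
ratio = relative_conversion_time(6.5) / relative_conversion_time(4.5)
```

That hundredfold hit is why bench meters run their highest resolution at a much slower reading rate.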
@ion010101 The word resolution has many meanings in many different industries. Here it's about how high the "count", or full scale, goes; the resolution is the least significant digit.
Is it true that 12,000-count, 20,000-count and 22,000-count multimeters are all called 4 1/2 digit?
What is the most accurate DMM on the market for something under say $300 or $400?
What is the best value in Fluke (used or new)? I know the Fluke 87 V is great, but is it much better than an 87III or 77 or 15B or 17B for basic Volt/Ohm/Current? I am not too worried about temperature readings, but capacitance would be nice. Still, I can use a separate meter for LC.
I knew I could "count" on you for the answer.
People tend to confuse precision and accuracy all the time. An instrument can have really high precision but not be at all accurate. Sounds like "counts" relates to precision.
+rchandraonline Precision deals with how repeatable the measurements are and how clustered the data will be. Accuracy on the other hand deals with how close those measurements are to the true value (usually when all data points get averaged), regardless of how clustered the data is. I think counts are only slightly related to precision. Measurements are easily repeated and the data is relatively more clustered (precise) if the device's resolution only allows for a few digits. Imagine trying to repeat measurements to the 15th decimal place, you'll probably never write down the same number twice!
Counts are related less to precision and more so to the resolution and the displayable values on the screen. In reality, the precision (repeatability) and accuracy (closeness to true value) of the meter are going to come mostly from the careful design of the protective features on the inputs, the internal ADC and any of its conversion errors and resolution, component tolerances, etc.
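The distinction is easy to show numerically. Here is a small sketch, with made-up readings, of a meter that is very precise (its readings cluster tightly) yet inaccurate (the whole cluster sits away from the true value):

```python
import statistics

true_value = 5.000
readings = [5.112, 5.109, 5.111, 5.110, 5.113]  # tight cluster, wrong place

# Precision (repeatability): spread of the readings, ~0.0016 V - very precise.
precision = statistics.stdev(readings)

# Accuracy error: offset of the average from the true value, ~+0.111 V -
# not accurate at all, despite the tiny spread.
accuracy_err = statistics.mean(readings) - true_value
```

A systematic offset like this is exactly what calibration (and, if needed, adjustment) is meant to catch; no amount of averaging repeated readings will remove it.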
Pelnied THANKS
When we talk about calibration we always mention the Fluke 87 V, which makes me feel very bad and unsafe, because every time I need to measure something I feel it is drifting away and I don't have another multimeter. Does the Fluke 87 V need to be calibrated once in a while even if I don't use it that much? I'm thinking of buying another one, a Fluke 179. I've never heard anyone say anything about calibrating that meter; does it need to be calibrated once in a while just like the Fluke 87 V?
Thanks for the invaluable information!
bloody good video mate
bullshit me that is)
and I know this is probably beginner stuff and a really great thing about your blog is that you go in-depth about stuff and don't assume you're talking to noobs and you probably think this is a bit boring to talk about ... but I think it'd be really great
in fact I would love to help you do that !
wow that 500 character limit is really a pain in the ass !!!
About the Fluke: why would anyone need an option for counts? Why can't it be at the highest count by default? Does the number of counts influence the reading rate?
Sweet! You really answered a lot of questions I had with this. Thanks! I was looking into getting another meter for myself, and possibly having my employer get a new one as well. The one we currently have doesn't have enough resolution for some of the things we need. Some of the meter accuracies I've seen just have a % on the sheet and don't say the + counts for it, e.g. the EX330 says 0.5%. If I understand correctly, does this mean that 27.75 VDC could read 27.61 to 27.89?
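Working the numbers in that question through: a bare 0.5% spec with no counts term, applied to a 27.75 VDC reading, gives a band of roughly 27.61 to 27.89 V. A quick check; the EX330 figure is from the comment, and whether its datasheet also hides a counts term is a separate question:

```python
reading = 27.75   # VDC
pct = 0.5         # percent of reading

err = reading * pct / 100           # ~0.139 V
lo, hi = reading - err, reading + err
# band: ~27.611 .. ~27.889 V, plus whatever counts term the datasheet adds
```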
Luv my Southwire Technician Pro Multimeter (14070T) it's a 6,000 count. Awesome Multimeter 💪
Which one is better: a bench meter or a handheld one?
Very interesting. I have a meter just like the middle one, but it says Amprobe at the top; everything else looks the same, the symbols and the numbers at the top.
Thanks! I never understood what this meant until now.
What is the normal price to have a meter (for example, a Fluke 189) calibrated?
Thank you very much for the clear explanation...
Good info, thanks again for another great video!
What would change in the meter to cause it to need to be sent away for calibration?
Aging of components like resistors and the parts in the A/D converter. Wear and tear on the connectors will also increase the contact resistance, dropping more voltage.
Is the Greenlee DM-820A a rebranded Brymen?
Resolution is analogous to precision in this sense, right?
@NMOEG Why are you so worried about calibration? I have a few older 87 meters and they are just as accurate as my 87V and they have never been calibrated.
Great video about accuracy and resolution; I really enjoyed it. I noticed you tried to catch yourself when you used the word "traceability" for an instrument's history... When I was a cal tech, to me traceability was the highest laboratory level you could "trace" the standards used for calibration... It's a "picky" world out there. Thanks for helping to clear up the terminology. Regards, wb7ond...
You're correct, here in the US (North America) lab standards are traceable to NIST.
Is that an old Simpson 260 back there on the shelf?
Thanks very much for the video. You explain things very well
Can the quality of test leads affect the accuracy?
Sure. You don't want to be measuring resistance, especially low resistance, with a set of cables 6 feet long, or capacitance. When measuring frequency, you need good shielded cables. When measuring current, especially higher currents, you need cables large enough in gauge to handle the higher currents.
great video, thanks
What's a 3 1/2 inch meter?
Thanks for the info. Fluke 16 rocks!
Hi Dave, how about a review of the Fluke 77? Lots of these are available cheaply now as ex-military contract gear, and they should be great quality. You're the man when it comes to showing us beginners what's what. Love the vids and appreciate the help!!
Mark
reddog694
(And I think this is true for other tools "everyone knows" how to operate, like the function generator/counter, even the adjustable power supply, and the big one, the oscilloscope. I have two analog ones, and I plan to get a pair of those sweet Rigol ones, but I don't know what half the buttons do on them. I tried asking electronics students and the local electronics shop guy, and they just stare blankly and say "I'm not really sure what that button does", when they don't
Then why are 4 1/2 digit meters more expensive than 3 1/2 digit meters?
Great introduction thanks
Shouldn't Accuracy actually be Precision?
+Dragos Puri , in general, no.
Precision refers to how finely divided the gradations of measurement are. For example, if you had a meter stick marked with 99 equal markings (0 and one meter would not have marks, they'd be the ends of the stick), in other words marked at every centimeter, it would have centimeter precision. Another one could have 999 markings, one every millimeter. The one with 999 markings would be more precise.
Standard measuring procedure is to read off all marked digits, then estimate where the actual length falls between the marks for an additional digit, or of course zero for the additional digit if the length genuinely looks like it is exactly on a mark.
Accuracy refers to adherence to a standard. Taking the meter stick example again: if your stick with 999 markings measures 110.0 cm when matched against other, more accurately calibrated meter sticks, it still has millimeter precision but is not accurate, because each marking (and its entire length) is too big, by 10%.
The combination of these two means all measurements, especially in science, should be stated as the measurement plus or minus some precision (a.k.a. tolerance), optionally also stating the accuracy of the device used to measure the quantity. In our 999-mark meter stick example, we could say a measurement is 53.26 cm +/- 0.01 cm or +/- 0.1 cm (I'm not really sure which is more appropriate). Minds better and more educated than mine know how to combine (basically mathematically) both precision and accuracy.
+rchandraonline Sounds like you are confusing precision with concepts like resolution (smallest increment of change between two measurable points) and significant figures (your standard measuring procedure). Precision is related to how repeatable and clustered the data is. Another way of thinking about precision is the statistical variation and spread in the data. Precise data has a very small standard deviation.
It will be difficult for you to understand precision by using a meter stick example. When humans generally read a meter stick they aren't going to read different values each time they put the meter stick next to the object, they've made their estimation and will stubbornly stick to it. It is different when it comes to a device such as a DMM which can read different values (4.998V, 5.001V, 4.999V) just by removing the probes and attaching them again. This is why precision is important in different measurement tools.
+Pelnied Let's say I measure a value. And it reads 3.7. I measure it again and it reads 3.8, another time 3.9, again 3.6 and so on, ±0.2. I call that ±0.2 PRECISION. Am I wrong?
Now, for accuracy, I think of it more like an offset. So let's say that the actual, true value of the measurement was 4.0; then my device is wrong by about -0.25 on average (the readings center on 3.75, not 4.0). Accuracy can be tuned; precision cannot.
Please correct me if I'm wrong. Thanks.
and 5 4/5 digit ?
Why not make all of the digits go up to 9? It cannot be that hard!
I think the something-and-a-half digit specs are plain wank. Any reasonable person would expect that:
10 ^ 3.5 = 3162 counts (approx 3000)
In fact, I've only seen one meter that actually "meets" its digit spec: the HP 3457A, advertised as 7.5 digits, which has a resolution of 30 million counts.
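The arithmetic behind that complaint: taking "N digits" literally as 10**N counts gives numbers most meters never reach. A quick sketch:

```python
def literal_counts(digits):
    """Counts implied by reading an 'N and a half digit' spec literally
    as 10**digits. Real meters usually fall well short of this."""
    return round(10 ** digits)

n35 = literal_counts(3.5)  # 3162, yet a typical "3.5 digit" meter is 2000 counts
n75 = literal_counts(7.5)  # ~31.6 million; the 3457A's ~30M counts comes close
```

In practice "3.5 digit" is just shorthand for a display whose leading digit only goes to 1 (1999 counts), not a claim of 10**3.5 resolution.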
3:00 I have the one to the right and my day says it costs like $500
Hey Dave... GOOD JOB!!!! Greetings from Trinidad & Tobago... I've looked at a lot of your vids and learnt a lot... I have never owned a multimeter before, but I want to get myself one for automotive and home use. I happen to like the Yokogawa TY530 and TY720; I know neither is the automotive type, but can you do a review on them?? My girlfriend likes the "sex on a stick" one, she said we should try it, lmao...
I suppose 'resolution' is the term electrical engineers use, but it's 'precision'. As in 'accuracy vs. precision'.
👍👍
helpful
Don't need accuracy, I want digits :b ... to a point.
@EEVblog HAHAHA!!! You are right - go to his channel!!!
Just when I thought measurement conventions of Western origin can't be any more bollocks :) Awed again and I bet it's not the last time.
Good video! Thank you.
Well, shit, at first I thought he was, but now... not so much. 'Cause I was laughing so hard..
@AaronLow1*Dad not day
What's with the voice? Is this for kids?
No kidding - It's boomhauer haha.
LOL, say, that's funny!!! And I have no idea how to even play a banjo. And I HATE COUNTRY MUSIC!!!! ROCK AND ROLL FTW!!!!!!
I enjoy the gay voice of this guy 😂
If you refer to resolution as accuracy, at best you'd sound like a dunce... at worst, you might use the wrong units and crash into Mars.