I say “embarrassing problem” because this is about my battle with that most sophisticated and subtle of analog electronics circuits, the voltage divider. There’s more going on here, for sure, but I’ve been tearing my hair out (what little there is of it) over this. I even resorted to building myself a voltage divider on a breadboard so I could convince myself I wasn’t going completely mad.
This is quite long, so only read if you’re feeling charitable…
Here’s the circuit, which is part of my Teensy Load project, a programmable load controlled by an attached Teensy:
This part of the circuit is for limiting the voltage supplied by the device under test attached to the load: The opamp U102C is used as a comparator, and as long as the voltage sensed from the DUT is below a threshold, the opamp output is +3.3V and the diode D301 is reverse-biased. If the DUT voltage rises above the threshold, the comparator output drops to ground, the diode is forward biased and the gate of the FET that controls the current through the load is pulled low, stopping current flow through the load.
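None of that is in dispute, but just to pin down the intended behaviour, here it is as a few lines of Python (a sketch of the logic only; the function and variable names are mine, not anything from the schematic):

```python
# Idealized model of the limiter logic described above -- a sketch, not a
# simulation. Names (v_sense, v_threshold, v_gate_drive) are my own.

def limiter_gate(v_sense: float, v_threshold: float, v_gate_drive: float) -> float:
    """Return the effective FET gate voltage for a given sensed DUT voltage."""
    if v_sense < v_threshold:
        # Comparator output sits at +3.3V, D301 is reverse-biased, and the
        # normal gate drive passes through unaffected.
        return v_gate_drive
    # Comparator output drops to ground, D301 is forward-biased, and the
    # gate is pulled low, stopping current flow through the load.
    return 0.0
```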
That bit’s all OK. The really weird thing that’s going on is to do with the voltage divider for setting the threshold voltage. This is driven from a DAC (U301), and the divider is supposed to convert the output range from the DAC (0 - 3.3V) to a suitable range for the threshold (0 - 2V, because of the way I’ve set things up).
So, when the output from the DAC is 3.3V (i.e. the voltage at point A is 3.3V), what’s the voltage at point B? (All this with no DUT connected at all.)
Naively, I calculate 3.3V × (27kΩ + 33kΩ) / (27kΩ + 33kΩ + 39kΩ) = 2V.
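Spelled out as code, in case I’m fumbling the arithmetic (resistor values straight from the schematic: the 39 kΩ sits between A and B, the 27 kΩ + 33 kΩ series pair between B and ground):

```python
R_TOP = 39e3            # resistor between point A and point B
R_BOTTOM = 27e3 + 33e3  # series pair between point B and ground (60 kOhm)

v_a = 3.3  # DAC output at point A
v_b_expected = v_a * R_BOTTOM / (R_TOP + R_BOTTOM)
print(f"{v_b_expected:.2f} V")  # -> 2.00 V
```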
Actual voltage measured at point B = 1.3V!
I think I have a very slightly better idea what’s going on now, but the first time I saw this, I did have a tiny existential crisis. It’s a voltage divider with a 39 kΩ side and a 60 kΩ side, so for a total voltage of 3.3V you’d expect to drop 2V across the 60 kΩ side and 1.3V across the 39 kΩ side. Instead it’s the other way around, with 2V across the 39 kΩ side and 1.3V across the 60 kΩ side!
If you calculate the currents through the resistors, you find that the current through the 39 kΩ resistor is 51.3 μA, and the current through the 60 kΩ series combination is 21.7 μA. That seems to mean that there is a current of 29.6 μA flowing into the input of the opamp. Can that really be right? The TLV4333 is a CMOS opamp, which I assumed meant that it would have a huge input impedance and draw essentially no current.
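Here’s that bookkeeping in code (just KCL at point B, with the measured voltages plugged in), so you can check my arithmetic:

```python
v_a, v_b = 3.3, 1.3  # measured voltages at points A and B

i_39k = (v_a - v_b) / 39e3   # current through the 39 kOhm resistor
i_60k = v_b / 60e3           # current through the 27k + 33k series pair
i_in = i_39k - i_60k         # whatever is left must flow into the opamp input

print(f"{i_39k*1e6:.1f} uA, {i_60k*1e6:.1f} uA, {i_in*1e6:.1f} uA")
# -> 51.3 uA, 21.7 uA, 29.6 uA
```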
I’m a little suspicious about what happens at point C, the other input of the comparator, but I tried grounding everything with any sort of path to it, and it made no difference.
Does anyone have any idea what could be happening here? As yet, I know far too little about opamps to work out what’s happening. Could there be some feedback path through the rest of the circuit that’s causing this? (But that nearly 30 μA of current flowing into an opamp input makes no sense to me at all!)
I’m really hoping this is something amazingly obvious, and someone can just give a one-line answer that sends me slinking off back to my bench. I’m more than happy to make a fool of myself in public if I learn something.
Additional information that might be useful: here’s a spreadsheet that shows what happens when the DAC voltage varies. The chart on the left shows voltages (the “_expect” values are my naive linear expectations); the chart on the right shows the voltages across the 39 kΩ and 60 kΩ sides of the voltage divider as fractions of the total DAC voltage (which I’d have expected to be constant).
Something weird obviously starts to happen when the DAC output voltage gets up to about 1V. Below that, the voltage fractions are constant and what I’d expect. But then, as the DAC voltage increases, they move away from the values I’d expect until they eventually cross over and change places completely!
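If it helps anyone dig in, here’s the same implied-input-current calculation applied across the sweep. The first two (v_A, v_B) pairs below are just ideal-divider values, which is what I measure below about 1V; the 3.3V point is the measured one from above. Substitute the actual spreadsheet columns for real numbers:

```python
# Back-compute the opamp input current implied by each (v_A, v_B) pair,
# by the same KCL argument as before. The first two pairs are ideal-divider
# stand-ins; only the last is an actual measurement.

def implied_input_current(v_a: float, v_b: float) -> float:
    """Current into point B that the two resistor branches can't account for."""
    return (v_a - v_b) / 39e3 - v_b / 60e3

for v_a, v_b in [(0.5, 0.303), (1.0, 0.606), (3.3, 1.3)]:
    print(f"v_A = {v_a:.2f} V: implied input current = "
          f"{implied_input_current(v_a, v_b)*1e6:5.1f} uA")
# -> roughly 0 uA for the first two, 29.6 uA at the measured 3.3 V point
```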