MOSFET voltage divider gate bias

Started by Unlikekurt, February 14, 2020, 09:50:07 PM

Unlikekurt

Hello!
I'm using a P-channel MOSFET for reverse-polarity protection.  Because the supply voltage is above the MOSFET's Vgs rating, I'm using a voltage divider to bias the gate.  According to the datasheet, if I bias the gate to half the supply voltage, Vgs will be enough for minimum RDS(on) and still below the maximum Vgs rating.  So I figured a voltage divider of two 100k resistors would be perfect: it does the trick and keeps the divider's standing current low.
In application, the divider output (gate voltage) is 0.6 V less than 0.5 × Vin.
I'm wondering why that is.  At first I thought perhaps the body diode of the MOSFET, but that should be bypassed by the incredibly low RDS(on).  Vds, in any case, is less than 0.001 mV, so the circuit is otherwise working as intended and the body diode isn't at play at the source.
Any ideas would be helpful.  Unfortunately I haven't a schematic on hand to attach but I think the circuit is fairly straightforward:
Vin to Drain
Source to load
R1 between S and G
R2 between G and Gnd
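
For reference, the back-of-envelope math I'm going by, as a quick Python sketch (Vin = 30 V is only an assumed example value, not the actual rail):

# Unloaded divider math for the gate bias described above.
# NOTE: Vin = 30.0 is an assumed example value, not the actual supply.
R1 = 100e3   # source-to-gate resistor, ohms
R2 = 100e3   # gate-to-ground resistor, ohms
Vin = 30.0   # assumed supply voltage, volts

v_gate = Vin * R2 / (R1 + R2)   # ideal (unloaded) gate voltage: 15.00 V
vgs = v_gate - Vin              # P-channel Vgs, source sits near Vin: -15.00 V
print(f"gate = {v_gate:.2f} V, Vgs = {vgs:.2f} V")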

Thanks for any help on this

Rob Strand

> I'm wondering why that is?
Could be your meter loading down the divider.
A 100k + 100k divider loaded by a 1M-ohm meter input impedance will read a few percent low (around 0.7 V on a 30 V rail).
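
Putting numbers on it (a quick sketch; the 30 V rail is an assumed example, and the error scales with the supply):

# Loading effect of a 1M meter across the bottom leg of a 100k/100k divider.
# Vin = 30.0 is an assumed example value.
Vin = 30.0
R1, R2, Rmeter = 100e3, 100e3, 1e6

r_bottom = R2 * Rmeter / (R2 + Rmeter)        # 100k || 1M, about 90.9k
v_loaded = Vin * r_bottom / (R1 + r_bottom)   # what the meter reads
v_ideal = Vin * R2 / (R1 + R2)                # what the divider does unloaded

print(f"ideal {v_ideal:.2f} V, loaded {v_loaded:.2f} V, "
      f"error {v_ideal - v_loaded:.2f} V")    # error about 0.71 V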

To check:
If you connect a 1M resistor in series with your meter lead and then measure a 9 V rail, you should read about 4.5 V if your meter has a 1M input impedance.
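
The same check as a quick sketch:

# The 1M series-resistor trick: the external 1M and the meter's input
# impedance form a divider, so the reading reveals the impedance.
V_rail = 9.0
R_series = 1e6
for r_meter in (1e6, 10e6):   # candidate meter input impedances
    reading = V_rail * r_meter / (R_series + r_meter)
    print(f"{r_meter/1e6:.0f}M meter reads {reading:.2f} V")
# a 1M meter reads 4.50 V; a 10M meter reads 8.18 V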

Good meters tend to be around 10M ohm; cheaper ones around 1M ohm.  One weird thing these days is that some meters have different input impedances depending on the voltage range.  It helps to check the manual.


Send:     . .- .-. - .... / - --- / --. --- .-. -
According to the water analogy of electricity, transistor leakage is caused by holes.

PRR

> One weird thing these days is some meters have different input impedances depending on the voltage range.

That used to be "normal". A voltmeter was a mA meter with a selectable 5k/50k/500k resistor. You can still buy these. This one is 0.5 mA; 2 bucks more buys 0.1 mA.
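
The arithmetic, sketched in Python (ignoring the movement's own resistance):

# Ohms-per-volt for the movements mentioned above.
for i_fs in (0.5e-3, 0.1e-3):                   # full-scale current, amps
    print(f"{i_fs*1e3} mA movement: {1/i_fs:,.0f} ohms per volt")
# 0.5 mA -> 2,000 ohms/V; 0.1 mA -> 10,000 ohms/V

# Full-scale ranges from the selectable resistors with a 0.5 mA movement:
for r in (5e3, 50e3, 500e3):
    print(f"{r/1e3:.0f}k resistor -> {0.5e-3*r:.1f} V full scale")
# 2.5 V, 25 V, 250 V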

Rob Strand

> That used to be "normal". A voltmeter was a mA meter with a selectable 5k/50k/500k resistor. You can still buy these. This one is 0.5 mA; 2 bucks more buys 0.1 mA.
Back then at least you knew it was like that and had a (usually constant) ohm/V figure in your head.

Some of the DMMs these days have weird impedance behaviours.  Some ranges will be 10M and some will be 11M.  Others are 10M but drop to 4M on some ranges; IIRC some Flukes did that.  There's no real pattern to it; if you want to account for meter loading you *have* to know what the meter is doing.  (Having said that, a few high-end meters have 1G-ohm inputs on the lower voltage ranges, but at least that's easy to remember.)

In the old days DMMs were either 10M or 1M and pretty much stayed constant across the ranges.  I used to measure the DMM impedance accurately and write it in the manual.
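
Once you know the figure, you can back-correct a reading; a sketch, using the 100k/100k divider from the original post (Thevenin impedance 50k):

# Undo meter loading once the meter's input impedance is known.
# For a simple resistive source: V_true = V_read * (Rs + Rm) / Rm
def correct_reading(v_read, r_source, r_meter):
    return v_read * (r_source + r_meter) / r_meter

r_source = 50e3   # Thevenin impedance of a 100k/100k divider
# 14.29 V is the loaded reading from the assumed 30 V example above.
print(f"{correct_reading(14.29, r_source, 1e6):.2f} V")   # about 15.00 V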
Send:     . .- .-. - .... / - --- / --. --- .-. -
According to the water analogy of electricity, transistor leakage is caused by holes.

Unlikekurt

Rob nailed it!  And thanks for that.  I was using a Craftsman multimeter that happened to be lying out.  Grabbed a couple of nicer meters and all was well.  Checked the manuals and sure enough, 1M vs. 10M.