Author Topic: Minimum Current?  (Read 4334 times)


Minimum Current?
« on: October 05, 2006, 11:31:55 PM »
I've noticed that chip specs always list a minimum voltage rating, but no minimum current in mA. How do you determine the minimum current a circuit needs to function properly?

I want to use 18V from two 9V batteries in series, but the 200mA current from this supply may not be ideal for the amp I have in mind.


Re: Minimum Current?
« Reply #1 on: October 17, 2006, 10:35:10 PM »
You mean the current per battery is only 100mA?


Re: Minimum Current?
« Reply #2 on: October 18, 2006, 04:03:44 AM »
My suggestion is to power up the effect and measure the current it draws. To do this, put a meter in series with the power supply. This is detailed in the FAQ.

Peter Snowberg

Re: Minimum Current?
« Reply #3 on: October 18, 2006, 10:05:22 PM »
(Most) datasheets (at least for chips) list a figure called "quiescent current". This is the minimum current the chip will draw, or better stated, the current it takes to keep the chip "idling". As you pump signals through it, the chip will consume more.

There are other current draws in a circuit (bias generators, single transistor stages, etc.), so as Aron pointed out, your best bet is to put a milliammeter in series with the supply connection and measure the draw of the whole circuit.
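
Once you've measured the whole-circuit draw that way, estimating battery life is simple division. A quick sketch (the 500mAh capacity and 25mA draw below are hypothetical illustration values, not from this thread; real capacity also drops at higher draw rates):

```python
def battery_life_hours(capacity_mah, draw_ma):
    """Rough battery life estimate: capacity divided by measured draw."""
    return capacity_mah / draw_ma

# Assuming a ~500 mAh 9V alkaline and a measured draw of 25 mA:
print(battery_life_hours(500, 25))  # 20.0 hours, optimistically
```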

As you put batteries in series, the voltage increases but the available current does not. To get more current capacity, put the batteries in parallel.
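
The series/parallel rule above, sketched with ideal cells (the ~500mAh-per-9V figure is an assumed typical alkaline number, and real batteries sag under load):

```python
def series(cell_v, cell_mah, n):
    """Cells in series: voltages add, capacity stays the same."""
    return n * cell_v, cell_mah

def parallel(cell_v, cell_mah, n):
    """Cells in parallel: voltage stays the same, capacities add."""
    return cell_v, n * cell_mah

# Two 9V batteries, assumed ~500 mAh each:
print(series(9.0, 500, 2))    # (18.0, 500) -> 18V, still 500 mAh
print(parallel(9.0, 500, 2))  # (9.0, 1000) -> 9V, doubled capacity
```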

For little amps a 9V works, but a set of 6 individual 1.5V cells in series will give you much more current capacity. :D

If you're building an amp, always keep in mind that the efficiency of the speaker is a HUGE factor. A 9V battery powered amp with a big and efficient speaker is more than enough to cause problems with neighbors. The same 9V amp with a small and inefficient speaker may not get much louder than a (loud) whisper.
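
To put numbers on why speaker efficiency matters so much: sensitivity is rated in dB SPL at 1W/1m, and each 10dB of extra sensitivity is worth a 10x increase in amplifier power. A sketch, with hypothetical sensitivity figures:

```python
import math

def spl(sensitivity_db_1w_1m, power_w):
    """Approximate SPL at 1 m for a given input power."""
    return sensitivity_db_1w_1m + 10 * math.log10(power_w)

# A hypothetical half-watt battery amp into two different speakers:
print(spl(95, 0.5))  # efficient speaker
print(spl(85, 0.5))  # inefficient speaker: 10 dB quieter at the same power
```

So the 95dB speaker on half a watt is as loud as the 85dB speaker would be on five watts.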
Eschew paradigm obfuscation