How did you maximize randomness in a S&H?

Started by Mark Hammer, January 27, 2018, 01:18:13 PM


Rob Strand

#40
QuoteDepends on your scoring polynomial for "good".
True but ... The next random number is either (2*last random number + 1) or (2*last random number + 0).
The pattern would hardly be considered random.  The randomness of the single bit stream is a different matter.

QuoteIn a world of $0.50 PICs, there's no good reason for short LFSRs any more.
Very true.

The linear congruential generators are *far* better for generating numbers.  If you choose m = a power of 2 then you can still get a good generator with minimal processing and coding effort.  In fact, if you use multipliers which don't have many 1's in them you can replace the multiply with a few shifts and adds, so the processing effort is comparable to LFSRs.  There are patterns in the linear congruential generators as well, but they are *far* less detrimental than the obvious patterns in the LFSR.

There is a procedure called the Spectral Test which lets you pick out the multipliers with the best-behaved structure.  Normally you would choose a multiplier that gives a maximal-length sequence (which is of the form 8k+3 or 8k+5 when m = a power of 2), then run the candidates through the Spectral Test to pluck out the best of the full-length ones.  This process has been around since the late 60's and is written up in Donald Knuth's famous book (The Art of Computer Programming, Vol. 2).  Modern stuff goes somewhat further, but the congruential method is certainly implementable on small processors.  Some of the modern generators have LFSRs in their machinery, but not in raw form.
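Just to show how cheap it can be, here's a rough sketch (untested; the multiplier was picked only for its sparse bit pattern and has *not* been run through the Spectral Test).  It uses the mixed form with c odd and a of the form 4k+1, which gives the full 2^32 period when m = 2^32:

#include <stdint.h>

/* Minimal 32-bit linear congruential generator (illustrative sketch).
 * m = 2^32 (implicit in uint32_t overflow)
 * a = 65541 = 2^16 + 2^2 + 1  (sparse bits, a mod 4 = 1)
 * c = 1 (odd)
 * so the mixed generator runs the full 2^32 period.  The multiplier
 * has NOT been checked with the Spectral Test - do that before
 * trusting its lattice structure. */
static uint32_t lcg_state = 1u;

uint32_t lcg_next(void)
{
    /* x = a*x + c (mod 2^32), multiply done as shifts and adds */
    lcg_state = (lcg_state << 16) + (lcg_state << 2) + lcg_state + 1u;
    return lcg_state;
}

/* Use the high bits only: with m = a power of 2 the low-order bits
 * have short periods of their own. */
uint16_t lcg_sample12(void)
{
    return (uint16_t)(lcg_next() >> 20);   /* top 12 bits */
}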

Send:     . .- .-. - .... / - --- / --. --- .-. -
According to the water analogy of electricity, transistor leakage is caused by holes.

R.G.

Quote from: Rob Strand on January 30, 2018, 06:26:27 PM
QuoteDepends on your scoring polynomial for "good".
True but ... The next random number is either (2*last random number + 1) or (2*last random number + 0).
The pattern would hardly be considered random.  The randomness of the single bit stream is a different matter.
That is of course why you don't do that.
For an R bit random word, you sample your word, clock it any number of times R or greater, then grab your next word. That is as random as the bit stream itself.
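A minimal sketch of that, assuming a 16-bit Galois LFSR with the well-known x^16 + x^14 + x^13 + x^11 + 1 taps (illustrative, untested):

#include <stdint.h>

/* 16-bit maximal-length Galois LFSR, taps x^16 + x^14 + x^13 + x^11 + 1. */
static uint16_t lfsr = 0xACE1u;        /* any nonzero seed */

static uint16_t lfsr_clock(void)
{
    uint16_t lsb = lfsr & 1u;
    lfsr >>= 1;
    if (lsb)
        lfsr ^= 0xB400u;
    return lfsr;
}

/* Grab an R-bit word, then clock at least R times before the next grab,
 * so successive sampled words share no bits.  R = 16 here. */
uint16_t lfsr_word(void)
{
    uint16_t word = lfsr;
    for (int i = 0; i < 16; i++)
        lfsr_clock();
    return word;
}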
Quote
The linear congruential generators are *far* better for generating numbers.  [...] The processing effort is comparable to LFSRs.  There are patterns in the linear congruential generators as well, but they are *far* less detrimental than the obvious patterns in the LFSR.  There is a procedure called the Spectral Test which lets you pick out the multipliers with the best-behaved structure.  Normally you would choose a multiplier that gives a maximal-length sequence (which is of the form 8k+3 or 8k+5 when m = a power of 2) [...]
There are long arguments over which pseudorandom generator to use in the areas of crypto. The references I looked up pointed out a strong similarity between LFSRs and LinCon generators. None of it is pertinent to using whatever you like to generate randomness for effects or audio.

I rather suspect that the scoring polynomial to estimate "better" in "*far* better" may contain terms not tied too much to the randomness of successive sampled words, or the whiteness of the spectrum they give.
R.G.

In response to the questions in the forum - PCB Layout for Musical Effects is available from The Book Patch. Search "PCB Layout" and it ought to appear.

Rob Strand

#42
QuoteFor an R bit random word, you sample your word, clock it any number of times R or greater, then grab your next word. That is as random as the bit stream itself.
That's a *far* better scheme.

QuoteThere are long arguments over which pseudorandom generator to use in the areas of crypto. The references I looked up pointed out a strong similarity between LFSRs and LinCon generators. None of it is pertinent to using whatever you like to generate randomness for effects or audio.
Very true.  The modern-day stuff is at another level compared to the simple schemes, although you do see papers coming out with better ways to jumble LFSRs.  For a small micro the old methods are certainly good enough.

QuoteI rather suspect that the scoring polynomial to estimate "better" in "*far* better" may contain terms not tied too much to the randomness of successive sampled words, or the whiteness of the spectrum they give.
Agreed, and LFSRs work fine here.  If you are really fussy about flatness you can even put in a filter to fix the small amount of roll-off at high frequencies - a lot of function generators do that "sinc" correction.

[Edit:  I forgot to mention, the good thing about LCGs is you have the (fast) Spectral Test, but I'm not aware of an equivalent for the LFSR.  The Spectral Test in its most basic form would probably take a crazy long time to run.]
Send:     . .- .-. - .... / - --- / --. --- .-. -
According to the water analogy of electricity, transistor leakage is caused by holes.

Mark Hammer

What we seem to be neglecting (or perhaps it was explicitly stated but went over my head) is the role of perceived randomness or unpredictability, and auditory memory.

Let's say we have a device that can randomly spit out one of two possible values: X or Y.  Over tens of thousands of samples, the mathematical probability of X or Y may be dead even.  But if it spits out the X value 117 times in a row before spitting out a Y, I'm unlikely to think of it as "random".  (Think about the last time you were tossing a coin and started to suspect, over multiple tosses, that maybe it was "rigged".)

Let's hike it up a notch to a sequencer.  If I have a humble 4-step sequencer, it will definitely sound like a repeating LFO.  If I have a 32-step sequencer, where I can set the pot for every step individually, the length of the sequence can exceed what I can recall and feel close to random because of that.

So, maybe the ideal is a microcontroller that not only has access to a many-bits source of random values, but also has a "look-back" function in the programming, to check if any random value has been used within recent memory, with the length of the look-back depending on clock/step rate.  As an example, if I was using such a source in a S&H arrangement, at 4 steps per second, I might only need to look back against the last 20 values (5 seconds' worth) for the sequence to be perceived as pretty random.  If the current sampled random value has been recently used, I pull another one and check again.  If the step rate is slower, I might not have to look back quite as many steps to cover the same time interval and assure the perception of non-repeatability.
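Something like this rough sketch of the idea, for the programmers here (hypothetical names and thresholds, untested), where a fresh value is drawn again whenever it lands too close to anything in recent memory:

#include <stdint.h>

/* Rough sketch of the look-back idea (hypothetical names, untested).
 * Keep the last LOOKBACK sampled values; if a new draw lands within
 * TOO_CLOSE of any of them, draw again. */
#define LOOKBACK   20
#define TOO_CLOSE  64          /* "same-ness" window on a 12-bit value */

extern uint16_t random12(void);   /* whatever PRNG is on hand */

static uint16_t history[LOOKBACK];
static int hist_pos = 0;

uint16_t next_step_value(void)
{
    uint16_t v;
    int recent;
    do {
        v = random12();
        recent = 0;
        for (int i = 0; i < LOOKBACK; i++) {
            int d = (int)v - (int)history[i];
            if (d < 0) d = -d;
            if (d < TOO_CLOSE) { recent = 1; break; }
        }
    } while (recent);
    history[hist_pos] = v;
    hist_pos = (hist_pos + 1) % LOOKBACK;
    return v;
}

With a variable clock, LOOKBACK could be recalculated from the step rate so it always covers roughly the same few seconds.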

Within the S&H context, the look-back function would also be driven by the degree of contrast pursued.  So, if I have a 16-bit random source, but I'm only going to make use of values between this one and that one, rather than all possible values, then my look-back function may be different, and perhaps go back even farther.

Rob Strand

#44
QuoteLet's say we have a device that can randomly spit out one of two possible values: X or Y.  Over tens of thousands of samples, the mathematical probability of X or Y may be dead even.  But if it spits out the X value 117 times in a row before spitting out a Y, I'm unlikely to think of it as "random".  (Think about the last time you were tossing a coin and started to suspect, over multiple tosses, that maybe it was "rigged".)

It's an interesting problem.  If it were truly random then it is statistically possible for that to occur, but the long repeated runs have a low probability.  If the filter stayed at the same frequency, with decreasing likelihood for longer repeats, then it could actually add to the randomness - it adds randomness to the change period as well as to the filter frequency.

However, if you want to enforce a requirement that the filter must change at the set rate then that's a different story.  That's like saying the periodic bip, bap, bup is part of the desired effect.  In fact you would probably want to enforce more than just a change; you would want to enforce a perceivable change (which will be on a log scale for the filter frequency).  You could even set the minimum change as a parameter.  It would be interesting to hear what that sounded like (an ideal problem to play around with on a DSP).  I suspect enforcing a perceivable change is sufficient and there is no need to keep histories.

Writing this out made me think about what you really want for the filter.  I have a feeling you want the filter frequencies to be more or less distributed on a log scale.  In addition to that you might want to avoid having extremely low and high values too often.  Then there's the issue you brought up of not having the repeats, or perhaps restating that as not being too close to the last point.
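For the log-scale part it's just an exponential interpolation between the two ends - a quick sketch (untested, and the frequency range is only an example):

#include <math.h>
#include <stdint.h>

/* Map a 12-bit random value onto a filter frequency distributed
 * uniformly on a log scale between F_MIN and F_MAX.
 * Example range only - pick whatever suits the filter. */
#define F_MIN  100.0     /* Hz */
#define F_MAX 3200.0     /* Hz, five octaves above F_MIN */

double random_to_freq(uint16_t r12)
{
    double u = r12 / 4095.0;                 /* 0..1 */
    return F_MIN * pow(F_MAX / F_MIN, u);    /* log-uniform in Hz */
}

On a small micro you would replace the pow() with a small lookup table or an octave/fraction split, but the distribution is the same idea.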

What do you think?

Obviously this is easier to implement with a micro.

Send:     . .- .-. - .... / - --- / --. --- .-. -
According to the water analogy of electricity, transistor leakage is caused by holes.

ElectricDruid

Quote from: Mark Hammer on January 30, 2018, 08:30:17 PM
What we seem to be neglecting (or perhaps it was explicitly stated but went over my head) is the role of perceived randomness or unpredictability, and auditory memory.

Let's say we have a device that can randomly spit out one of two possible values: X or Y.  Over tens of thousands of samples, the mathematical probability of X or Y may be dead even.  But if it spits out the X value 117 times in a row before spitting out a Y, I'm unlikely to think of it as "random".  (Think about the last time you were tossing a coin and started to suspect, over multiple tosses, that maybe it was "rigged".)

Let's hike it up a notch to a sequencer.  If I have a humble 4-step sequencer, it will definitely sound like a repeating LFO.  If I have a 32-step sequencer, where I can set the pot for every step individually, the length of the sequence can exceed what I can recall and feel close to random because of that.

So, maybe the ideal is a microcontroller that not only has access to a many-bits source of random values, but also has a "look-back" function in the programming, to check if any random value has been used within recent memory, with the length of the look-back depending on clock/step rate.  As an example, if I was using such a source in a S&H arrangement, at 4 steps per second, I might only need to look back against the last 20 values (5 seconds' worth) for the sequence to be perceived as pretty random.  If the current sampled random value has been recently used, I pull another one and check again.  If the step rate is slower, I might not have to look back quite as many steps to cover the same time interval and assure the perception of non-repeatability.

Within the S&H context, the look-back function would also be driven by the degree of contrast pursued.  So, if I have a 16-bit random source, but I'm only going to make use of values between this one and that one, rather than all possible values, then my look-back function may be different, and perhaps go back even farther.

I like it. You're right that the raw pseudo random output will sometimes generate fairly long sequences of similar values (in fact, in theory for the LFSR bitstream, it's the same as the coin tosses - you will get some long runs of heads or tails).
It'd be relatively easy to modify the code to keep trying new random values until it got one that was more than X away from the last one, and X could be variable. That wouldn't eliminate the possibility of recurring sequences of values, but that's much less likely anyway.
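Something along these lines, say (sketch only, untested; prng16() stands in for whatever generator is already in there):

#include <stdint.h>

/* Keep drawing until the new value is at least min_jump away from
 * the last one.  min_jump could come straight from a pot reading.
 * Keep it well below half the full range or the loop can never exit. */
extern uint16_t prng16(void);     /* existing PRNG, whatever it is */

static uint16_t last_value = 0;

uint16_t next_value(uint16_t min_jump)
{
    uint16_t v, diff;
    do {
        v = prng16();
        diff = (v > last_value) ? (v - last_value) : (last_value - v);
    } while (diff < min_jump);
    last_value = v;
    return v;
}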

"Rigged S&H"...hohoho. Nice, Mark.

Tom

R.G.

I really think you'd be happy filtering it down to 1/f noise.
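One cheap way to get close to 1/f on a micro, instead of a literal filter, is the Voss-McCartney trick - sum several white values where each one is refreshed half as often as the one before.  A rough, untested sketch:

#include <stdint.h>

/* Voss-McCartney style 1/f approximation (untested sketch).
 * NUM_ROWS white-noise values are summed; row k is refreshed only
 * when the sample counter's lowest set bit is bit k, i.e. half as
 * often as row k-1, which stacks up to a roughly 1/f spectrum. */
#define NUM_ROWS 8

extern uint16_t prng16(void);     /* any white PRNG */

static uint16_t rows[NUM_ROWS];
static uint32_t counter = 0;

uint32_t pink_next(void)
{
    counter++;
    for (int k = 0; k < NUM_ROWS; k++) {
        if (counter & (1ul << k)) {
            rows[k] = prng16();   /* refresh just this one row */
            break;
        }
    }
    uint32_t sum = 0;
    for (int k = 0; k < NUM_ROWS; k++)
        sum += rows[k];
    return sum;                   /* 0 .. NUM_ROWS * 65535 */
}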
R.G.

In response to the questions in the forum - PCB Layout for Musical Effects is available from The Book Patch. Search "PCB Layout" and it ought to appear.

amptramp

A lot of 1/f noise looks like popcorn noise: a random fluctuation, but between two discrete values that recur frequently.  It appears to be caused by charge carriers moving past inclusions in the semiconductor (or in the conductor, if you are using an amplified resistor as a noise source).  As such, it is more correlated than ordinary white noise.

PRR

OPA140 appears to have 1/f corner at 20Hz.  For audio hiss, application is trivial. It costs 2 bucks; more than Zener or TL072, but less than futzing with such devices.

Mark's question is different; and perhaps more "art of perception" than math.

anotherjim

Does anyone remember the fuss over the iPod's "random shuffle" playback not being random, because it seemed to pick the same tunes more often than expected?  Given a limited range - intervals in our musical scales, beats in a bar, tunes in a playlist - randomness can appear repetitive.  Picking the same option twice in succession can happen randomly.  The iPod solution was to bend the rules so the last played tune could not come up again for some number of selections, by barring it from random selection.


Digital Larry

Quote from: ElectricDruid on January 27, 2018, 02:52:28 PM
I've done some on dsPIC that produce white noise at 16-bit/96KHz quality and which don't repeat for ten million years!

Are you SURE about that?   :icon_wink:
Digital Larry
Want to quickly design your own effects patches for the Spin FV-1 DSP chip?
https://github.com/HolyCityAudio/SpinCAD-Designer

amz-fx

Quote from: Digital Larry on February 02, 2018, 09:42:33 AM
Are you SURE about that?   :icon_wink:

Definitely possible if you use a lot of bits in the shift register.  I've done 56 bits in an AVR chip and it has a REALLY long cycle if you make sure the feedback taps are selected to give a maximal-length output.  The output could be 16-bit but the PRNG would use more.

The small micros are probably too limited to hold Mersenne Twister code, but that's one way to go for long cycles without repeats.
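If the Twister won't fit, one of Marsaglia's xorshift generators is another tiny option with a huge period - for example this common 64-bit variant (a sketch, untested on a small micro; it only needs shifts and XORs, no multiplies):

#include <stdint.h>

/* Marsaglia xorshift64: three shift/XOR steps, period 2^64 - 1.
 * The seed must be nonzero.  Take the top bits for output. */
static uint64_t xs_state = 0x123456789ABCDEFull;

uint16_t xorshift64_sample16(void)
{
    uint64_t x = xs_state;
    x ^= x << 13;
    x ^= x >> 7;
    x ^= x << 17;
    xs_state = x;
    return (uint16_t)(x >> 48);
}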

Best regards, Jack

Mark Hammer

Quote from: anotherjim on February 02, 2018, 04:38:08 AM
Does anyone remember the fuss over I-pods "random shuffle" playback selection not being random, because it seemed to pick the same tunes more often than expected? Given a limited range - intervals in our musical scales and beats in a bar, tunes in a playlist - randomness can appear repetitive. Picking the same option twice in succession can happen randomly. The I-pod solution was to bend the rules, so the last played tune could not be repeated for some number of selections, by barring it from random selection.
Prior to retiring, I used to work on a survey of government managers, regarding their hiring practices.  We would dip into the government-wide personnel file and extract a "random" 12% sample of cases where there were indications that the individual had recently been hired into a new job.  We would then contact the manager of that person, and ask them to complete a survey with details of how that hire came about.

The received wisdom among government HR folks is that 20% of the managers do 80% of the hiring, with some managers constantly having to recruit new folks for entry-level positions that incumbents leave on a regular basis.  Although the sample really WAS random, some managers kept coming up again and again, because of what I note above.  Even though the survey requests were sent out maybe once a year, we would often get irritated/paranoid calls and e-mails from those managers, wondering if they were being audited or otherwise targeted.  I had to work hard to convince them that it really WAS a random sample, despite the fact that their name kept coming up.

Human perception of relative probability is a funny thing.  Good thing I have my rabbit's foot, cross my fingers, and don't step on any sidewalk cracks when I go to buy a lottery ticket.  And particularly important that I only use birthdates when selecting numbers.  You know, just to get chance to work in my favour.  :icon_wink:

Digital Larry

Quote from: amz-fx on February 02, 2018, 12:12:36 PM
Quote from: Digital Larry on February 02, 2018, 09:42:33 AM
Are you SURE about that?   :icon_wink:

Definitely possible if you use a lot of bits in the shift register.  I've done 56 bits in an AVR chip and it has a REALLY long cycle if you make sure the feedback taps are selected to give a maximal-length output.  The output could be 16-bit but the PRNG would use more.

The small micros are probably too limited to hold Mersenne Twister code, but that's one way to go for long cycles without repeats.

Best regards, Jack

All I'm saying is that after all this buildup, if I get it and it repeats after just a million years, I'm gonna be pretty steamed.  My audience demands it.   ;D
Digital Larry
Want to quickly design your own effects patches for the Spin FV-1 DSP chip?
https://github.com/HolyCityAudio/SpinCAD-Designer

R.G.

Just clock the uC at several THz for a while and check for repeats.  :icon_lol:
R.G.

In response to the questions in the forum - PCB Layout for Musical Effects is available from The Book Patch. Search "PCB Layout" and it ought to appear.

EBK

#55
At some point in the future something will inevitably go wrong, and your random white noise sound will end like this:
Technical difficulties.  Please stand by.

Digital Larry

#56
One other thing about the perception of randomness... it depends on the control characteristic of whatever you're controlling.  Suppose your filter has a linear V-to-F characteristic, for example: 20 to 2560 Hz (7 octaves) would span the same control-voltage range as 2000 to 4540 Hz (a little over 1 octave).  So it would seem to spend most of its time in the high range.

Somebody probably already said that...
Digital Larry
Want to quickly design your own effects patches for the Spin FV-1 DSP chip?
https://github.com/HolyCityAudio/SpinCAD-Designer

Atodovax

Sorry to bump this thread.  Is it normal to experience hiss when you are in UP mode in the filter section and you are not playing?