Help with adapting this audio delay program...

Started by stfala, November 05, 2015, 06:25:09 AM


stfala

http://blog.vinu.co.in/2012/05/generating-audio-echo-using-atmega32.html

I'd like to try this circuit but with a variable resistor to control delay time instead of push buttons, to get a more 'analogue' feeling. From what I understand (which is limited in coding), the push buttons give 4 different selections for delay time. So if I had a VR connected to, say, ADC0, I'd have to initialize that pin as an ADC input, but how would I tell it to control the delay time? Also, how would I set the maximum and minimum values (and those in between) of delay time for the VR to select between?

Any help/ideas/snippets of code would be helpful!

Cheers guys.

cloudscapes

From what I can tell in the code, they are using a fixed-time sample delay of 90 microseconds, and then just vary the delay buffer length via the pushbuttons, up to a maximum of 1900 samples.

You can't get continuously variable delay time this way. The delay buffer length is what needs to be fixed, and the sample delay needs to be variable (via potentiometer/ADC input) - in other words, the main samplerate needs to be variable. The value that needs to change via pot is this:

_delay_us(90);

Replace 90 with a variable that has a range, controlled by the pot. You can probably do away with most of the push-button stuff, though you'll have to find the part where the buffer length is set each time a button is pressed, and just fix it.
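To make that concrete, here's a host-side sketch of mapping an 8-bit pot reading onto a per-sample delay value (the function name and the min/max parameters are mine, not from the original code):

```c
#include <stdint.h>

/* Hypothetical mapping sketch: turn an 8-bit pot reading (0..255) into
   the per-sample delay in microseconds, so the fixed _delay_us(90)
   becomes a variable delay between min_us and max_us. */
static uint16_t pot_to_us(uint8_t pot, uint16_t min_us, uint16_t max_us)
{
    /* linear interpolation across the pot's travel */
    return (uint16_t)(min_us + ((uint32_t)(max_us - min_us) * pot) / 255u);
}
```

One avr-libc caveat: `_delay_us()` really wants a compile-time constant argument, so in practice you'd call a fixed short delay inside a loop that repeats `pot_to_us(...)` times, or use a hardware timer instead.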

This is how I did my own delay at the start of this year. Fixed buffer lengths, variable samplerate.

I haven't had my morning coffee yet, so I'm a bit confused about why they're looping the sample counter the way they are at the end of main().
~~~~~~~~~~~~~~~~~~~~~~
{DIY blog}
{www.dronecloud.org}

anotherjim

You can vary delay time with fixed sample rate and fixed buffer. The delay is the position of an old sample behind each new sample up to the maximum length of the buffer. So, whatever the record pointer address to the buffer is at any moment needs to have some value subtracted to point to the required delayed sample to be played. The buffer is cyclic so you need to make sure the subtraction "wraps" around zero and stays within the buffer...
If buffer length is 1000 and record position is 100 and delay required is 200, then play position needs to be 900, not -100. As record position increments, play position will be incrementing too and always point to a sample 200 places older than the current record. The delay amount could simply be the value of an ADC read of a pot.
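That wrap-around arithmetic is easy to get wrong with unsigned maths, so here's a small sketch of it (the function name and buffer length are just for illustration):

```c
#include <stdint.h>

#define BUF_LEN 1000u /* example length from the post above */

/* Given the current record index and the desired delay in samples,
   return the index of the delayed sample to play, wrapping around
   zero so the result always stays inside the buffer. */
static uint16_t play_pos(uint16_t rec, uint16_t delay)
{
    return (uint16_t)((rec + BUF_LEN - delay) % BUF_LEN);
}
```

With record position 100 and delay 200, this gives 900 rather than -100, exactly as described above.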

The ADC mux would need to be switched between reading a pot and getting an audio sample. You can push an ATmega ADC to run off up to a 1MHz clock no problem, and that's over 70kHz of conversions. A 250kHz ADC clock still gives over 9kHz. You can make the conversion rate essentially be the sample rate, so you don't need separate timing; just let the code wait for the ADC complete flag. The PWM can be left free-running at a high rate and simply be updated whenever the ADC cycle updates the buffer cycle.
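As a rough sanity check on those figures (an AVR conversion takes 13 ADC clocks; the alternating pot/audio read is my assumption, added to match the "over 9kHz" number):

```c
#include <stdint.h>

/* Back-of-envelope conversion rate: one AVR ADC conversion takes
   13 ADC clock cycles. If pot and audio reads alternate on the mux,
   the effective audio sample rate is half the conversion rate. */
static uint32_t conversion_rate_hz(uint32_t adc_clock_hz)
{
    return adc_clock_hz / 13u;
}
```

At 1MHz that's about 76.9k conversions/s; at 250kHz it's about 19.2k, or roughly 9.6kHz per channel if you alternate.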

I don't really do C, but I can see enough to say it should be possible to mod it. Shame the author doesn't write comments!
There's no need for the code to truncate the ADC reading to 8-bit. Set it to produce a left-justified result (ADLAR) and only read ADCH for the 8 most significant bits.
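What ADLAR buys you can be shown with plain arithmetic (no AVR registers involved, function name is mine): reading ADCH of a left-justified result is the same as shifting the 10-bit value right by two.

```c
#include <stdint.h>

/* Host-side illustration of a left-justified (ADLAR) result: the
   10-bit conversion sits at the top of the 16-bit ADCH:ADCL pair,
   so ADCH alone holds the 8 most significant bits. */
static uint8_t adch_left_adjusted(uint16_t adc10)
{
    uint16_t left = (uint16_t)(adc10 << 6); /* 10 bits shifted to the top */
    return (uint8_t)(left >> 8);            /* ADCH = high byte = adc10 >> 2 */
}
```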

cloudscapes

Quote from: anotherjim on November 05, 2015, 01:33:35 PM
You can vary delay time with fixed sample rate and fixed buffer. The delay is the position of an old sample behind each new sample up to the maximum length of the buffer. So, whatever the record pointer address to the buffer is at any moment needs to have some value subtracted to point to the required delayed sample to be played. The buffer is cyclic so you need to make sure the subtraction "wraps" around zero and stays within the buffer...
If buffer length is 1000 and record position is 100 and delay required is 200, then play position needs to be 900, not -100. As record position increments, play position will be incrementing too and always point to a sample 200 places older than the current record. The delay amount could simply be the value of an ADC read of a pot.

by "more 'analogue' feeling", I took it that the OP wanted the pitch bending you get when twisting a delay time pot, like on all analog delays and most old digital delays. that's because the delay time pot serves as a master clock control. I figured you kind of had to do the same thing with a microcontroller: unless you want to do heavy DSP math and interpolation, you're not going to get the pitch bending from delay-buffer manipulation alone. you have to vary the samplerate along with the sample counter.

that's how I did it on the delay I designed early this year:
http://www.diystompboxes.com/smfforum/index.php?topic=109314.0

or maybe I misunderstood the OP.

stfala

cloudscapes, you understood correctly. Sounds like I'll have to settle for just varying the delay time without the pitch bending 'effect'. That's not a major problem though; the main thing is being able to vary the delay time with the pot.
Thanks for the input guys, I'll try the methods suggested.

cloudscapes

I think you can still have the delay time do the pitch bending effect. I see no reason you can't. try this as a mod: keep most of the code as-is, but have a pot change this part:

_delay_us(90);

have the pot range between 40 and 200 or something. experiment with the values. it should be really quick to implement if you already have a working prototype of the original.

anotherjim

Tape-like speed adjust, then. As Ettiene suggests, vary that delay. What I suggested would simulate the types where the play head can be moved to change the basic delay.
Maybe you can have both ;)

Digital Larry

I can get pitch bending with delay time changes without changing the sample rate on an FV-1, BUT you have to slow down the delay time signal so it changes gradually instead of just jumping to the new value. I'm guessing you could do something similar with either analog or digital low-pass filtering of the delay time voltage.
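The digital version of that smoothing could be a one-pole low-pass on the control value; a minimal sketch (coefficient and names are mine, nothing FV-1 specific):

```c
/* One-pole low-pass (exponential slew) on a control value: call once
   per control update and the current value glides toward the target
   instead of jumping. alpha in (0,1]; smaller = slower glide. */
static float smooth_step(float current, float target, float alpha)
{
    return current + alpha * (target - current);
}
```

Run repeatedly, this is equivalent to an RC filter on the delay-time voltage, just done in code.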
Digital Larry
Want to quickly design your own effects patches for the Spin FV-1 DSP chip?
https://github.com/HolyCityAudio/SpinCAD-Designer

stfala

In reference to the original code I posted, what exactly does this first line of the infinite loop state?

rd = adc_read() +  buf[j]*echo_level;

Is it literally just adding what is read in from the ADC to the sample held in the buffer multiplied by the selected echo level?

Also, if I wanted to change the delay time with a pot like has been suggested already, how would I state that I wanted it to sweep through every value from, say, 50ms to 200ms?
Is there any part of the ADC initialization that would need to be changed?

Excuse what may seem like pretty obvious questions.

anotherjim

It might be better to do wet/dry mixing externally in the analog world, in which case digital control of echo level is pointless and you can remove all that.

To read a pot you'll need to switch the ADC channel in ADMUX. The settings for the pot read would be the same as for the signal read; only the channel changes. At the same time, set the ADLAR bit in ADMUX to get an easy-to-manipulate 8-bit result. Also find the settings to have the ADC clock at 1MHz (if it isn't already).

Recommend you become familiar with the AVR ADC. Read the data sheet for your chip and things like this...
http://www.avrfreaks.net/forum/tut-c-newbies-guide-avr-adc?page=all

Matching your delay time pot value to the sampling rate set by _delay_us(90) might simply be a matter of adding some constant to the pot read so it can't go too fast, but that may not be essential. If it gets too slow, a simple solution is to divide the pot result by 2, making the range of values 0-127.
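As a sketch of that offset-and-halve idea (the constant 40 is just an example, borrowed from the sort of lower bound suggested earlier in the thread; the function name is mine):

```c
#include <stdint.h>

/* Offset the 8-bit pot read so the per-sample period never gets too
   short, and halve the pot's contribution so it never gets too long:
   0..255 in, 40..167 microseconds out. */
static uint16_t pot_to_period_us(uint8_t pot)
{
    return (uint16_t)(40u + (pot >> 1));
}
```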


cloudscapes

_delay_us(90) is what you change to vary the delay time smoothly. how much total delay time you want depends on the number of samples in the buffer/loop. if you have 1900 samples, and your delay *between* samples is 90 microseconds, you have a 171 millisecond delay. so if you want a delay that ranges between 50-200ms, you need a pot that smoothly changes the _delay_us(#) to range between 26 and 105. pretty easy!
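The arithmetic above can be double-checked with a couple of helpers (names are mine, integer division rounds down):

```c
#include <stdint.h>

/* Total delay in milliseconds from buffer length and per-sample
   period in microseconds. */
static uint32_t delay_ms(uint32_t samples, uint32_t period_us)
{
    return samples * period_us / 1000u;
}

/* Inverse: per-sample period in microseconds needed for a target
   delay in milliseconds with a fixed buffer length. */
static uint32_t period_us_for(uint32_t samples, uint32_t target_ms)
{
    return target_ms * 1000u / samples;
}
```

1900 samples at 90us really is 171ms, and 50-200ms over 1900 samples comes out at 26-105us per sample.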

I too prefer feedback to be in the analog domain. even for low fidelity sampling, it offers filter/tone possibilities that might be difficult to emulate digitally. the digital delay I built recently has analog feedback.