This isn't immediately straightforward though, since it uses a variable sample rate (32 kHz to 128 kHz) and my current code runs at a fixed sample rate. The dsPIC could do a variable rate, but the DAC won't output samples faster than 100 kHz, so the modulation depth might not be quite as extreme.
My delay line code (for Bela) sounds pretty good as a variable sample rate emulator. The thing to keep in mind is that this is really just a resampling application: biquad filters emulating the typical BBD in/out filters, plus a linearly interpolated delay line, can do the job arguably as well as a real externally variable sample rate.
The main observation that led me here is that the 3rd-order Butterworth filters before and after a BBD chip limit the audio bandwidth so much that the higher variable sample rate has very little effect on the end audio quality (maybe it gets a little more lo-fi at the lower end). The output filter is functionally a delay line interpolator. At the frequencies of interest, if you run the fixed-rate delay line through a 6th- or 8th-order biquad chain, linear interpolation of the delay line introduces very little error.
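As a rough illustration (this isn't my actual Bela code, and the biquad coefficients are left as placeholders you'd design to match the real BBD in/out filters), the fixed-rate version boils down to a linearly interpolated delay line plus a cascade of biquad low-pass sections:

```cpp
// Sketch of the fixed-rate approach: a linearly interpolated delay line and a
// biquad section you can cascade to stand in for the BBD's in/out filters.
// Coefficient values are deliberately omitted -- design them to match the
// 3rd-order Butterworths around the actual BBD.
#include <cstddef>
#include <vector>

struct Biquad {
    // Direct Form I coefficients (b0,b1,b2 feedforward; a1,a2 feedback).
    float b0 = 1, b1 = 0, b2 = 0, a1 = 0, a2 = 0;
    float x1 = 0, x2 = 0, y1 = 0, y2 = 0;

    float process(float x) {
        float y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2;
        x2 = x1; x1 = x;
        y2 = y1; y1 = y;
        return y;
    }
};

struct FractionalDelay {
    std::vector<float> buf;
    std::size_t writeIdx = 0;

    explicit FractionalDelay(std::size_t maxSamples) : buf(maxSamples, 0.0f) {}

    void write(float x) {
        buf[writeIdx] = x;
        writeIdx = (writeIdx + 1) % buf.size();
    }

    // Read 'delaySamples' behind the write head with linear interpolation.
    float read(float delaySamples) const {
        float readPos = static_cast<float>(writeIdx) - delaySamples;
        while (readPos < 0.0f) readPos += static_cast<float>(buf.size());
        std::size_t i0 = static_cast<std::size_t>(readPos);
        std::size_t i1 = (i0 + 1) % buf.size();
        float frac = readPos - static_cast<float>(i0);
        return buf[i0] + frac * (buf[i1] - buf[i0]);
    }
};
```

The 6th- or 8th-order filter mentioned above is then just three or four of these Biquad sections run in series on the delay line output each sample.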
The main things you actually get from a variable sample rate are these:
a) LFO shape distortion (delay time proportional to 1/Fsample) -- see the sketch after this list
b) In the case of a BBD, some subtle amplitude modulation
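On point (a): a BBD's delay is N/(2*Fclock), so when the LFO modulates the clock, the delay sweep follows a 1/x curve rather than tracking the LFO shape directly. A minimal sketch of how that could be mapped onto a fixed-rate delay line (the stage count, centre clock and depth below are illustrative values, not taken from any particular unit):

```cpp
// Reproduce the LFO shape distortion by pushing the LFO through the same
// 1/Fclock relationship a real BBD has, instead of modulating delay linearly.
constexpr float N_STAGES       = 1024.0f;   // e.g. an MN3007-style BBD
constexpr float F_CLOCK_CENTRE = 60000.0f;  // nominal BBD clock in Hz (assumed)
constexpr float FS_FIXED       = 44100.0f;  // our fixed processing rate

// lfo in [-1, 1], depth in [0, 1): modulate the *clock*, then convert to a
// delay in samples at the fixed rate. Because delay = N / (2 * Fclock), a
// triangle LFO on the clock comes out as a bowed (1/x-shaped) delay sweep.
inline float lfoToDelaySamples(float lfo, float depth) {
    float fClock = F_CLOCK_CENTRE * (1.0f + depth * lfo);
    float delaySeconds = N_STAGES / (2.0f * fClock);
    return delaySeconds * FS_FIXED;
}
```

Feeding that result into the interpolated read() above gives the bowed sweep without ever changing the processing rate.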
Tom, on your site you've already worked out the formula for delay change vs. modulation input. For lower CPU usage, perhaps this could be reasonably approximated with a polynomial series.
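I don't know the exact formula you derived, so the coefficients below are pure placeholders, but the general trick would be to fit the curve offline and then evaluate it per sample with Horner's rule, which only costs a handful of multiply-adds:

```cpp
// Polynomial approximation of the delay-vs-modulation curve, evaluated with
// Horner's rule. The coefficients here are hypothetical -- they'd come from a
// least-squares fit of whatever formula your clock circuit actually follows.
inline float delayFromModPoly(float cv) {
    // Hypothetical 4th-order fit: d(cv) = c0 + c1*cv + ... + c4*cv^4
    const float c[5] = {2.5e-3f, -1.1e-3f, 4.0e-4f, -9.0e-5f, 1.0e-5f};
    float d = c[4];
    for (int i = 3; i >= 0; --i)
        d = d * cv + c[i];
    return d;   // delay time in seconds (per the hypothetical fit)
}
```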
Just food for thought -- I don't think the variable sample rate injects any real "magic"... but then maybe you don't have enough spare cycles to handle this in real time.