Play-through latency comparison between TAAE and libPD
I compared play-through latencies using the simplest possible play-through programs built with TAAE and libPD. I'm not sure the methodology is entirely sound, but it seems reasonable to me. You can find a detailed description with graphs etc. on this page:
but a rough description would be:
I connected two guitar cables to the two audio inputs of a Presonus Firebox sound card. Input 1 carried the "test signal" and input 2 the "ground-truth" signal. To measure the latency introduced by libPD and TAAE, the test signal was routed through an iPhone 6 before reaching the sound card. As a sanity check, I also fed both inputs directly into the sound card (without the iPhone in the path), just to confirm that there is no inherent latency difference between the two inputs themselves.
To compare input 1 (test signal) against input 2 (ground truth), I recorded both signals on two separate channels in Ableton Live. At some point during the recording I touched the two cables together, producing an impulse peak in both channels. I then exported the two signals as separate WAV files and imported them into Matlab to measure the distance between the peaks. I took this distance (in samples or seconds) as the latency introduced by routing through the iPhone.
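For anyone who wants to reproduce the measurement without Matlab, the peak-distance step can be sketched in Python with NumPy. This is my own sketch, not the author's actual script: it assumes each channel contains one dominant impulse, locates it as the largest absolute sample, and reports the offset between the two channels.

```python
import numpy as np

def impulse_latency(test_sig, truth_sig, sample_rate=44100):
    """Locate the impulse peak (largest absolute sample) in each
    channel and return the latency of the test signal relative to
    the ground truth, in samples and in seconds."""
    peak_test = int(np.argmax(np.abs(test_sig)))
    peak_truth = int(np.argmax(np.abs(truth_sig)))
    delta = peak_test - peak_truth
    return delta, delta / sample_rate

# Synthetic example: a one-second recording in which the
# "cable touch" impulse arrives 742 samples later on the
# test channel than on the ground-truth channel.
fs = 44100
truth = np.zeros(fs)
test = np.zeros(fs)
truth[10000] = 1.0
test[10000 + 742] = 1.0

samples, seconds = impulse_latency(test, truth, fs)
print(samples, round(seconds, 4))  # 742 0.0168
```

In a real run you would load the two exported WAV files into arrays (e.g. with `scipy.io.wavfile.read`) instead of synthesizing them. For noisier recordings, cross-correlating the two channels would be more robust than comparing single peak positions.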
libPD latency: 2335 samples — 0.0529 seconds
TAAE latency: 742 samples — 0.0168 seconds
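The sample counts and the times above are consistent with a 44.1 kHz sample rate (an assumption on my part; the rate isn't stated explicitly). The conversion is just samples divided by sample rate:

```python
fs = 44100  # assumed sample rate, consistent with the figures above
for name, samples in [("libPD", 2335), ("TAAE", 742)]:
    print(f"{name}: {samples} samples -> {samples / fs:.4f} s")
# libPD: 2335 samples -> 0.0529 s
# TAAE: 742 samples -> 0.0168 s
```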
I'd be very happy to hear your comments, thoughts, suggestions, etc.