iOS: Sample code for simultaneous record and playback

To avoid latency issues, you will have to work at a lower level than AVFoundation. Check out Apple's aurioTouch sample code, which uses the Remote I/O audio unit.


As suggested by Viraj, here is the answer.

Yes, you can achieve very good results using AVFoundation. First, note that for both the player and the recorder, starting them is a two-step process.

First you prime it.

Then you play it.

So, prime everything. Then play everything.
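A minimal sketch of the prime-then-play pattern with `AVAudioPlayer` and `AVAudioRecorder` (the URLs and recorder settings here are placeholders):

```objc
#import <AVFoundation/AVFoundation.h>

// playbackURL, recordURL and settings are placeholders for your own values.
AVAudioPlayer *player =
    [[AVAudioPlayer alloc] initWithContentsOfURL:playbackURL error:nil];
AVAudioRecorder *recorder =
    [[AVAudioRecorder alloc] initWithURL:recordURL settings:settings error:nil];

// Step 1: prime everything. This preloads buffers so the subsequent
// start calls incur as little delay as possible.
[player prepareToPlay];
[recorder prepareToRecord];

// Step 2: play everything, back to back.
[player play];
[recorder record];
```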

This will get your latency down to about 70ms. I tested by recording a metronome tick, then playing it back through the speakers while holding the iPhone up to the speakers and simultaneously recording.

The second recording had a clear echo, which I found to be ~70ms. I could have analysed the signal in Audacity to get an exact offset.

So in order to line everything up I just `performSelector:x withObject:y afterDelay:70.0/1000.0`
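Concretely, that delayed start might look like the sketch below. The `startPlayback` selector is a hypothetical method of your own; which side you delay, and the exact offset, depend on the latency you measure on your device:

```objc
// Start recording immediately, then start playback ~70 ms later to
// compensate for the measured round-trip latency.
[recorder record];
[self performSelector:@selector(startPlayback)  // hypothetical method
           withObject:nil
           afterDelay:70.0 / 1000.0];           // 70 ms expressed in seconds
```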

There may be hidden snags: the delay may differ from device to device, and it may even vary depending on device activity. The thread could even get interrupted or rescheduled between starting the player and starting the recorder.

But it works, and is a lot tidier than messing around with audio queues / units.


I had this problem and I solved it in my project simply by changing the PreferredHardwareIOBufferDuration parameter of the audio session. I have about 6 ms latency now, which is good enough for my app.
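A sketch of that, using the old AudioSession C API the property name comes from (the session may grant a different duration than you ask for; the 5 ms value is just an example):

```objc
#import <AudioToolbox/AudioToolbox.h>

// Request a small hardware I/O buffer to reduce latency.
Float32 preferredBufferDuration = 0.005; // 5 ms, example value
AudioSessionSetProperty(kAudioSessionProperty_PreferredHardwareIOBufferDuration,
                        sizeof(preferredBufferDuration),
                        &preferredBufferDuration);
```

On newer SDKs the equivalent is `-[AVAudioSession setPreferredIOBufferDuration:error:]`. Either way, read back the actual buffer duration afterwards, since the system treats your value only as a preference.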

Check this answer, which has a good explanation.