ABetterMetronome using TAAE (a source code example)

Comments

  • Thanks zobkiw. As you suggested, I muted the beat instead of terminating the block. The new problem is that when I go back to my parent view, then return and tap Start, the old TAAE thread is still running and a new one gets created, so the beats get messed up. Is there any way to allow only one thread? Any suggestions?
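
    A possible way to keep it to a single channel, as a rough sketch: create the block channel only once and tear it down when leaving the view. This assumes an audioController and blockChannel property like the ones in the code further down the thread, plus TAAE's addChannels:/removeChannels:.

        // Sketch: create the channel once and remove it on the way out, so coming
        // back to this screen doesn't stack a second channel on top of the old one.
        func startMetronome() {
            if blockChannel != nil { return }   // a channel is already running
            blockChannel = AEBlockChannel(block: { (time, frames, audio) in
                // ... render the beats as before ...
            })
            audioController.addChannels([blockChannel])
            do {
                try audioController.start()
            } catch let error as NSError {
                print("Error starting AEAudioController: \(error.localizedDescription)")
            }
        }

        override func viewWillDisappear(animated: Bool) {
            super.viewWillDisappear(animated)
            if blockChannel != nil {
                audioController.removeChannels([blockChannel])
                blockChannel = nil
            }
            audioController.stop()
        }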

  • edited November 2015

    @zobkiw Your code has been a great help. I'm using an audio file for the metronome sound and I load it into an AVAudioPCMBuffer in viewDidLoad as I don't have a clue how to do that with CoreAudio's AudioBuffer. However, I can't figure out how to access this buffer in the AEBlockChannel. Do you see a way that I can make this work? I'm writing this in Swift, which may be part of the problem as I'm fuzzy on working with pointers. Unsurprisingly, Xcode currently gives me the error:

     Cannot assign value of type 'UnsafePointer<UnsafeMutablePointer<Int32>>' to type 'UnsafeMutablePointer<AudioBufferList>'  

    Any insights or suggestions would be fantastic.

    import UIKit
    import AVFoundation
    import TheAmazingAudioEngine
    
    
    class MyMetronomeVC: UIViewController {
    
    //Metronome
    var bpmMin = 1
    var bpmMax = 600
    
    var audioController:AEAudioController!
    var blockChannel:AEBlockChannel!
    var audioBuffer:AVAudioPCMBuffer!
    var audioFormat:AVAudioFormat!
    var beatURL:NSURL!
    var audioFrameCount:UInt32!
    
    
    override func viewDidLoad() {
        super.viewDidLoad()
    
        beatURL = NSBundle.mainBundle().URLForResource("FileName", withExtension: "wav")!
    
        let audioFile = try! AVAudioFile(forReading: beatURL!)
        audioFormat = audioFile.processingFormat
    
        audioFrameCount = UInt32(audioFile.length) // assign the property rather than shadowing it with a local 'let'
    
        audioController = AEAudioController(audioDescription: AEAudioController.nonInterleaved16BitStereoAudioDescription(), inputEnabled: false)
        audioBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat!, frameCapacity: audioFrameCount)
    
        do {
            try audioFile.readIntoBuffer(audioBuffer)
        } catch let error as NSError {
            print(error.localizedDescription)
        }
    
    }
    
    
    func startMetronome() {
        var totalFrames:UInt64 = 0
        var nextBeatFrame:UInt64 = 0
        var makingBeat:Bool = false
    
        do {
            try audioController.start()
        } catch let error as NSError {
            print("Error starting AEAudioController: \(error.localizedDescription)")
        }
    
        self.blockChannel = AEBlockChannel(block: { (time, frames, var audio) in  //AudioTimeStamp *time, UInt32 frames, AudioBufferList *audio
            let bpm = 120
    
            let framesBetweenBeats = UInt64(44100.0 / (Double(bpm) / 60.0)) // frames per beat: 22050 at 120 BPM, 44.1 kHz
    
            for (var i:UInt32 = 0; i < frames; i++) { // frame...by frame...
    
                if nextBeatFrame == totalFrames {
                    makingBeat = true
                    nextBeatFrame += framesBetweenBeats
                }
    
                if makingBeat == true {
    
                    //Attempting to implement this...
                    //((SInt16*)audio->mBuffers[0].mData)[i] = x;
                    //((SInt16*)audio->mBuffers[1].mData)[i] = x;
    
                    // This is where I'm stuck: the next two lines are my attempt, and the
                    // second one produces the error quoted above, since per-channel sample
                    // pointers can't be assigned to the AudioBufferList pointer.
                    // let frameData = self.audioBuffer.int32ChannelData
                    // audio = frameData // ???
    
                    // I'm aware that I need to provide the data for the current frame here, but not sure how to pull that out of frameData, nor how to access mBuffers and mData using Swift.
    
    
                    makingBeat = false
                }
    
                totalFrames++
            }
        })
        audioController.addChannels([blockChannel]) // attach the channel so the controller actually renders it
    }
    
    
    
    }
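
    For reference on the pointer error above: int16ChannelData / int32ChannelData give one sample pointer per channel, not an AudioBufferList, so they can't be assigned to audio directly. Also, AVAudioFile's default processingFormat is deinterleaved Float32, so the 16-bit accessors on a buffer created from it come back empty. A rough sketch of one workable loading setup (an assumption, not the only option) is to re-open the file as deinterleaved 16-bit so the buffer matches the controller's nonInterleaved16BitStereoAudioDescription:

        // Sketch: load the click as deinterleaved Int16 so its samples match the
        // controller's 16-bit non-interleaved output format.
        let audioFile = try! AVAudioFile(forReading: beatURL,
                                         commonFormat: .PCMFormatInt16,
                                         interleaved: false)
        audioBuffer = AVAudioPCMBuffer(PCMFormat: audioFile.processingFormat,
                                       frameCapacity: UInt32(audioFile.length))
        try! audioFile.readIntoBuffer(audioBuffer)
        audioFrameCount = audioBuffer.frameLength

        // int16ChannelData is a pointer to per-channel sample pointers, so a single
        // sample is read as channelData[channel][frame]:
        let channelData = audioBuffer.int16ChannelData
        let firstLeft  = channelData[0][0]   // frame 0, left channel
        let firstRight = channelData[1][0]   // frame 0, right channel (assumes a stereo file)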
    
  • @blwinters, I can't read Swift, but in case it helps: in my Xcode project I created an AEAudioFileLoaderOperation

     AEAudioFileLoaderOperation *soundFileOperation = [[AEAudioFileLoaderOperation alloc] initWithFileURL:[[NSBundle mainBundle] URLForResource:@"sound" withExtension:@"aif"]
                                                                                    targetAudioDescription:audioController.audioDescription];
    

    then I started the operation:

    [soundFileOperation start];
    if (soundFileOperation.error) {
        NSLog(@"load error: %@", soundFileOperation.error);
        return nil;
    }
    

    then I load the audio into a buffer declared as @property (assign, nonatomic) AudioBufferList *soundFileAudioBufferList;

    _soundFileAudioBufferList = soundFileOperation.bufferList;
    UInt32 soundFileLengthInFrames = soundFileOperation.lengthInFrames;
    

    then I need a UInt32 bufferPosition, and here is how I play my sound file:

    if (makingBeat) {
        if (bufferPosition < soundFileLengthInFrames) {
            ((UInt16 *)audio->mBuffers[1].mData)[i] = ((UInt16 *)_soundFileAudioBufferList->mBuffers[1].mData)[bufferPosition];
            ((UInt16 *)audio->mBuffers[0].mData)[i] = ((UInt16 *)_soundFileAudioBufferList->mBuffers[0].mData)[bufferPosition];
        } else {
            makingBeat = NO;
        }
        bufferPosition++;
    }
    

    Then I zero my bufferPosition at the same place I set makingBeat = YES. In your case, maybe:

    if nextBeatFrame == totalFrames {
        makingBeat = true
        bufferPosition = 0
        nextBeatFrame += framesBetweenBeats
    }
    

    Man, I hope this makes some sense. I gathered this code from the forums here and there and copied what I thought was relevant for you, so I'm sorry I can't give proper credit. I'm such a noob that I'm surprised even my metronome works. I'm now working on playing three channels, but I'm having problems syncing them.

    For my project it was key to choose the right variable types: what worked with UInt32 did not work with UInt16 and vice versa. I can't really explain it; it was a trial-and-error thing, monkey coding.
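
    For the Swift side of this, a rough Swift 2 equivalent of the copy loop above might look like the sketch below. It assumes the AVAudioPCMBuffer was loaded as deinterleaved Int16 (as in the earlier sketch), a stereo click file, and a bufferPosition counter declared alongside totalFrames / nextBeatFrame; UnsafeMutableAudioBufferListPointer is the CoreAudio overlay type that makes the output list's mBuffers indexable from Swift. (Note the caveat in the next comment about writing the render block in Swift at all.)

        // Sketch of the AEBlockChannel block body: (time, frames, audio).
        // Assumes: var bufferPosition: UInt32 = 0 declared next to totalFrames/nextBeatFrame.
        let outBuffers = UnsafeMutableAudioBufferListPointer(audio)
        let outLeft    = UnsafeMutablePointer<Int16>(outBuffers[0].mData)
        let outRight   = UnsafeMutablePointer<Int16>(outBuffers[1].mData)
        let clickData  = self.audioBuffer.int16ChannelData

        for (var i: UInt32 = 0; i < frames; i++) {
            if nextBeatFrame == totalFrames {
                makingBeat = true
                bufferPosition = 0                 // restart the click sample on every beat
                nextBeatFrame += framesBetweenBeats
            }
            if makingBeat {
                if bufferPosition < self.audioFrameCount {
                    // Copy one frame of the click into the output, channel by channel.
                    outLeft[Int(i)]  = clickData[0][Int(bufferPosition)]
                    outRight[Int(i)] = clickData[1][Int(bufferPosition)]
                } else {
                    makingBeat = false             // click finished; stay silent until the next beat
                }
                bufferPosition++
            }
            totalFrames++
        }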

  • I'm writing this in Swift

    This is a very bad idea, I'm afraid. http://forum.theamazingaudioengine.com/discussion/comment/2242/#Comment_2242

  • Thanks for your example @zobkiw, it has been extremely helpful. I made just a simple extension of it that uses Pure Data (PD) for making sounds. It was important for me to be able to use an accurate timing mechanism, like the one you proposed, with the "idiot-proof" but powerful PD music/audio generation system - through libPD for iOS.

    Here is a simple example metronome that uses PD within an AEBlockChannel, and also updates the current beat on screen. More info can be found in my blog...

  • Hi, I ran this on an iPhone 7 and in the simulator against a DAW and a physical metronome, and it's spot on, but on older devices like the iPhone 6 it drifts forward after some time. I can't figure out why it would run differently on different devices. Does anyone have a clue?
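
    One thing that may be worth checking, as a guess: the examples above hard-code 44100.0 when computing framesBetweenBeats, but the actual hardware sample rate can differ between devices and audio routes, which would throw the beat spacing off slightly on some hardware. Deriving the spacing from the controller's own audioDescription avoids that mismatch; a small sketch, assuming the same AEAudioController setup as above:

        // Use the controller's actual sample rate rather than a hard-coded 44100.0.
        let sampleRate = audioController.audioDescription.mSampleRate
        let framesBetweenBeats = UInt64(sampleRate / (Double(bpm) / 60.0))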
