Swift walkthrough

Hi, I've found no viable way on the forum to use TAAE (and Audiobus too) with Swift/Xcode 7. I've made a bridging header, but I can't figure out how to proceed... must I implement my AudioController in Objective-C and use it from Swift, or must I rewrite all the code?

Thanks

Comments

  • Hi @mikegazzaruso,

    In general, it is indeed better to use Obj-C: Swift is not a good choice of language for working with audio as its use can cause audio glitches due to priority inversion (it's not designed for realtime use). Consequently, I personally don't encourage its use in audio apps right now.

    If you're not actually doing anything on the realtime audio thread (e.g. you're only using AEAudioFilePlayer), then you can get away with it. You're sorta on your own there though, for now, I'm afraid =)

  • @Michael said:
    Swift is not a good choice of language for working with audio as its use can cause audio glitches due to priority inversion (it's not designed for realtime use). Consequently, I personally don't encourage its use in audio apps right now.

    Just out of curiosity, has anybody done any real world tests on this and published their results?
    I've seen it said by quite a lot of knowledgeable people but I've also seen benchmarks giving Swift comparable speed to C++ (I know these aren't representative of real world use). I think you'd get a bit of a performance hit but don't think it'd be significant.

    What about using Swift with the UnsafeMutablePointer C bridge stuff, which presumably compiles down fairly similarly to C? Or is the problem that there are no guarantees about when memory is going to be allocated?

  • You nailed it at the end there - it's basically that there are no guarantees about when it might take a lock or allocate memory. It's not so much about the quality of the compiled code, but what realtime-unsafe stuff it's doing. For example, unlike in Objective-C, in Swift you can't directly access instance members as a pointer dereference: it'll call the getter. As I understand it (although I confess I don't understand it well - I've not studied the runtime, although I don't think it's open source yet anyway), every method call runs the risk of blocking, because it does things like grab locks.
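
    To make the contrast concrete, this is the kind of access Obj-C allows on the audio thread: a plain pointer dereference of a @public ivar, with no getter, no message send, no lock. The class name here is just for illustration:

    #import <Foundation/Foundation.h>

    @interface ToneChannel : NSObject {
        @public
        float _frequency; // exposed so realtime C code can read it directly
    }
    @end

    @implementation ToneChannel
    @end

    // Something a render callback can safely do: read the ivar as a plain memory access,
    // with no objc_msgSend, no lock, no allocation.
    static inline float currentFrequency(__unsafe_unretained ToneChannel *channel) {
        return channel->_frequency;
    }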

    One workaround is to use a dual approach, writing Obj-C classes for the audio stuff, and Swift classes for the rest, which is probably the best bet if you'd like to use Swift as much as possible. It does introduce the need for additional glue code, though, as you'll need to write code that syncs state between the Swift classes and their Obj-C counterparts.
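
    For example, the Obj-C side might expose a small facade like this, which the Swift classes hold and call into via the bridging header (the class, method and property names are just examples, not anything from TAAE):

    // AudioController.h - the Obj-C half owns everything that touches TAAE / the audio thread
    #import <Foundation/Foundation.h>

    @interface AudioController : NSObject
    - (void)setup;                          // create and start the AEAudioController
    @property (nonatomic) float inputGain;  // example of state the Swift side syncs down
    @end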

    Personally I'd just prefer to keep it simple and stick with Obj-C for now, but that's just me =)

  • Never mind, I figured out how.

    I'm writing it down here in case someone needs starter-kit help:

    1) Create a bridging header between Obj-C and Swift (create a .m file and Xcode will ask; choose Yes)
    2) Create your AudioController Obj-C class (.h / .m) and try to keep it agnostic of the Swift side, e.g.:

    - (void)setup
    {
        // Allocate and start your AEAudioController here
    }
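
    The bridging header itself then just needs to import your AudioController header so Swift can see it; the file name is whatever Xcode generated for your project, for example:

    // MyApp-Bridging-Header.h
    #import "AudioController.h"
    // TAAE itself stays on the Obj-C side, so its headers don't need to be imported here.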

  • Hi Michael, thanks for your reply. Ok, I'll switch back to Objective-C.

  • However, I'm trying to get it to work by creating an AudioController bridge class in Objective-C.

    Right now I am able to get TAAE working without problems.

    Michael are you interested in some feedback?

  • Sorry, I've only just now read your actual answer :) Yes, I think an Obj-C AudioController class with the AppDelegate/ViewController(s) in Swift is the right solution.

    It does work!

  • Sure, feedback always good.

  • @Michael said:
    Sure, feedback always good.

    Well, first of all I created my AudioController class in Objective-C. In this class I created some methods that need to be called from your app's AppDelegate or ViewController, which creates an instance of AudioController when it loads.

    For example, in my delegate's didFinishLaunching method:

    func application(application: UIApplication, didFinishLaunchingWithOptions launchOptions: [NSObject: AnyObject]?) -> Bool {
        self.audioController.setup()
        return true
    }
    

    audioController is an instance of my Obj-C AudioController class, and in its setup method I simply start the engine:

    - (void)setup
    {
        // Create an instance of the audio controller
        AudioStreamBasicDescription audioDescription = [AEAudioController nonInterleaved16BitStereoAudioDescription];
        self.audioController = [[AEAudioController alloc] initWithAudioDescription:audioDescription inputEnabled:YES];

        // Start the audio controller
        NSError *error = NULL;
        BOOL result = [_audioController start:&error];
        if ( !result ) {
            NSLog(@"%@", [error localizedDescription]);
        }

        __block AudioBufferList *inBuf = NULL; // The input receiver stores the latest input buffer here.

        AEBlockAudioReceiver *recv = [AEBlockAudioReceiver audioReceiverWithBlock:^(void *source, const AudioTimeStamp *time, UInt32 frames, AudioBufferList *audio) {
            inBuf = audio;
        }]; // Setting a receiver

        [_audioController addInputReceiver:recv]; // Setting recv as input receiver.

        AEBlockChannel *channel = [AEBlockChannel channelWithBlock:^(const AudioTimeStamp *time, UInt32 frames, AudioBufferList *audio) {
            // Populate the channel with fresh data from inBuf
            if ( inBuf ) {
                for ( int i = 0; i < audio->mNumberBuffers; i++ ) {
                    memcpy(audio->mBuffers[i].mData, inBuf->mBuffers[i].mData, inBuf->mBuffers[i].mDataByteSize);
                }
            }
        }]; // Creating an audio channel that plays back the received input

        channel.audioDescription = [_audioController inputAudioDescription]; // Setting channel audioDescription

        [_audioController addChannels:[NSArray arrayWithObject:channel]]; // Adding channel to Audio Controller

        self.audioControllerFilter = [[DRDynaRageAudioFilter alloc] init]; // Allocating Filter object

        [_audioController addFilter:self.audioControllerFilter toChannel:channel]; // Adding Filter to audioController
    }

    I was also able to send an NSNotification from Swift and catch it in Objective-C without problems (for changing filter-specific parameters).
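
    On the Obj-C side that's just a normal NSNotificationCenter observer; roughly like this (the notification name and userInfo key here are only placeholders, not my real ones):

    // In AudioController.m
    - (void)startObservingFilterChanges
    {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(filterParameterChanged:)
                                                     name:@"FilterParameterChanged"
                                                   object:nil];
    }

    - (void)filterParameterChanged:(NSNotification *)note
    {
        // Read the value posted from Swift and apply it to the filter object
        float value = [note.userInfo[@"value"] floatValue];
        NSLog(@"New filter parameter value: %f", value);
    }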

    Would be nice to make some "glue wrappers" for Swift in the future, maybe... Michael? Keep doin' the good work, guys.

    Cheers,

    Mike

  • Oh, you might want to use AEPlaythroughChannel if you want to pass audio from the mic through to the speaker. I'm not sure the implementation above is going to work in all cases.
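
    Roughly like this (this is the TAAE 1.x module; check AEPlaythroughChannel.h for the exact initializer in the version you're using):

    AEPlaythroughChannel *playthrough = [[AEPlaythroughChannel alloc] initWithAudioController:_audioController];
    [_audioController addInputReceiver:playthrough]; // it receives the incoming mic audio...
    [_audioController addChannels:@[playthrough]];   // ...and plays it straight back out as a channel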

  • You're right... however, it does actually work with Audiobus too (I create the filter port by passing the AudioController's audioUnit rather than a process block; the advantage is that by passing the audioUnit, my AudioController's output muting comes for free!)
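
    In case it helps anyone, the Audiobus side looks roughly like this -- from memory, so check ABFilterPort.h for the exact initializer, and the port name/title here are just placeholders:

    self.audiobusController = [[ABAudiobusController alloc] initWithApiKey:@"YOUR-AUDIOBUS-API-KEY"];

    ABFilterPort *filterPort = [[ABFilterPort alloc] initWithName:@"Main"
                                                            title:@"Main Filter"
                                                        audioUnit:_audioController.audioUnit];
    [self.audiobusController addFilterPort:filterPort];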
