nonInterleavedFloatStereoAudioDescription and AEAudioUnitChannel

mokmok
edited March 2013

TheEngineSample uses nonInterleaved16BitStereoAudioDescription in its setup. If I change this to nonInterleavedFloatStereoAudioDescription, the AEAudioUnitChannel classes fail in the render callback: AudioUnitRender returns -10876 (kAudioUnitErr_NoConnection).

Everything else appears fine, but I haven't dug too far into it yet. Other formats work; only nonInterleavedFloatStereoAudioDescription seems to fail. Coincidentally, nonInterleavedFloatStereoAudioDescription is also the only format I tried that causes the _converterNode to be skipped... which is suspicious, but I couldn't find anything immediately wrong with the non-converted node.
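
For reference, the change itself is minimal. Roughly (a sketch of the setup, not the exact sample code):

    // Swap the 16-bit description for the float one when creating the controller
    self.audioController = [[AEAudioController alloc]
        initWithAudioDescription:[AEAudioController nonInterleavedFloatStereoAudioDescription]];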

Posting in hopes that this might trigger a clue.

Comments

  • Oops - right you are, @mok, thanks for the heads-up! Fixed now; update from the GitHub repo (it needed some AudioUnitInitialize calls).
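
    For the curious, the gist of the fix (a sketch, not the exact commit) is that once the client stream format is applied to the unit, it needs an AudioUnitInitialize call before AudioUnitRender will succeed:

    // Sketch: apply the client format, then initialise the unit. Without
    // AudioUnitInitialize, AudioUnitRender returns
    // kAudioUnitErr_NoConnection (-10876).
    AudioStreamBasicDescription clientFormat = audioController.audioDescription;
    OSStatus result = AudioUnitSetProperty(audioUnit,
                                           kAudioUnitProperty_StreamFormat,
                                           kAudioUnitScope_Output,
                                           0,
                                           &clientFormat,
                                           sizeof(clientFormat));
    if ( result == noErr ) {
        result = AudioUnitInitialize(audioUnit);
    }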

  • Whoohoo! Thanks @Michael for the quick fix! I also grabbed the other updates you made to support floating point in the example app. It's working great now.

  • The floating-point audio description doesn't seem to play nicely with the O-Scope in the example app. At runtime the app crashes when accessing targetBuffer->mNumberBuffers in the float converter class.

  • Update from the git repo, @OTCintelligence - I fixed it on April 1.

  • The release from the Download link I pulled yesterday extracts with the following date:

    Tuesday, April 2, 2013 8:44 PM

    I just downloaded again and it has the same date.

  • Hmm, that's odd. This did push me to discover some problems with a couple of audio formats, now fixed (so update again and see if that helps), but I don't see any problems with the oscilloscope. Is it happening on any particular device or iOS version?

  • edited April 2013

    With the update, the issues with the example application seem to be fixed. However, if I pull TPOscilloscope into my own project, I still get the runtime crash shown in the attached picture, the same one I got with the previous release. I haven't been able to use the o-scope to analyze my waveform output, even in 16-bit format, since before the float updates. Clearly there must be some difference in initialization between the example app and my own, but I can't seem to figure out what it is. Here is how I have been initializing the scope:

    @property (nonatomic, retain) TPOscilloscopeLayer *outputOscilloscope;

    self.outputOscilloscope = [[TPOscilloscopeLayer alloc] init];
    _outputOscilloscope.frame = CGRectMake(38, 60,
                                           self.oscopeImageView.frame.size.width - 69,
                                           self.oscopeImageView.frame.size.height - 130);

    [self.view.layer addSublayer:self.outputOscilloscope];
    [_audioController addOutputReceiver:_outputOscilloscope];
    [self.outputOscilloscope start];

    The crash when accessing targetBuffer->mNumberBuffers leads me to believe that the AudioBufferList was not created properly:

    _conversionBuffer = AEAllocateAndInitAudioBufferList(_floatConverter.floatingPointAudioDescription, kMaxConversionSize);
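
    If it helps the diagnosis, a quick sanity check of that theory (hypothetical code, not something in the repo) right after that allocation would be:

    // Hypothetical check: a non-interleaved float stereo format should
    // yield one buffer per channel in the allocated list.
    AudioStreamBasicDescription floatFormat = _floatConverter.floatingPointAudioDescription;
    NSAssert(_conversionBuffer != NULL &&
             _conversionBuffer->mNumberBuffers == floatFormat.mChannelsPerFrame,
             @"conversion buffer was not allocated as expected");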
    

    With the release from Tuesday, April 2, the results were:

    iOS 5.1: runtime crash
    iOS 6.1: ran, but no output from drums, organ, or oscillator

  • Oh, right, I hadn't realised you'd put it into your own code; you're not using the right initializer there, which should be:

    - (id)initWithAudioController:(AEAudioController*)audioController;
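
    With that initializer the layer can set itself up for the audio controller's stream format, which is presumably what the plain init was missing when _conversionBuffer was created. In your snippet it would be something like (using your existing _audioController ivar):

    self.outputOscilloscope = [[TPOscilloscopeLayer alloc] initWithAudioController:_audioController];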
    
  • Yep, that was it, thanks!
