Difficulty releasing objects

Forgive me if I don't use the proper terminology here, but I think you'll get the gist:

The example app never attempts to release the controller, since it's a single-view app, so I can't use it as a reference for this part of my app. In my app, the controller can be closed and the audio engine destroyed, but I'm being left with orphaned objects, and I'm not sure why. I'm sure that if I can solve one of them and understand the issue, I can fix the rest, so I'll focus on an audio file player.

I have an audio file player working well, but if I close the view, my view does dealloc, yet an AEAudioFilePlayerModule is left over. The AEAudioFilePlayerModule is used in a subrenderer block, and I use an AEManagedValue to handle it.

If I put this in a PAUSE method:

    self.audioFilePlayer.completionBlock = nil;
    self.playerValue.objectValue = nil;
    self.audioFilePlayer = nil;

The AEAudioFilePlayerModule is destroyed (verified in Instruments). However, if I have a button that closes the view, which also calls the PAUSE method, the AEAudioFilePlayerModule is left in existence. Instruments shows that a reference held by an AEManagedValue is the bit that's keeping it around.

Next, I found that if I don't put the subrenderer block in place, the module is destroyed when I PAUSE while closing the view. Here's my block:

    self.subRenderer.block = ^(const AERenderContext * _Nonnull context) {
        __unsafe_unretained AEAudioFilePlayerModule * weakAudioFilePlayer = (__bridge AEAudioFilePlayerModule *)AEManagedValueGetValue(weakPlayerValue);
        if ( weakAudioFilePlayer ) {
            AEModuleProcess(weakAudioFilePlayer, context);
        }
        AERenderContextOutput(context, 1);
    };

So I tried to add:

    self.subRenderer.block = ^(const AERenderContext * _Nonnull context) {};

to my teardown, but it still doesn't work: the AEAudioFilePlayerModule is left hanging around if I close my view.

Here's where I get very confused: sometimes, if I use a breakpoint and pause the app during teardown, it does destroy the AEAudioFilePlayerModule. I spent all day yesterday on this problem. I even rewrote it to tear everything down on pause and recreate it on play (though I'd prefer NOT to do that), which works well... until you use the view's close button to pause it; then it's left hanging. Argh. I've also made the AEAudioFilePlayerModule local to the method that loads it, using the AEManagedValue to keep it alive (like the example app does; very elegant), but that didn't change the outcome.

I'm sure it's something fundamental and relatively simple that I'm doing wrong. Rather, I hope it is. Ha!

Comments

  • edited June 2016

    I just found releaseNotificationBlock. So, obviously, if I set the AEManagedValue's objectValue to nil in my code but NOT during teardown, I get the releaseNotificationBlock call. But if I set the objectValue to nil during teardown, I guess some things are destroyed before the AEManagedValue gets a chance to do its thing, and it's left hanging around; the releaseNotificationBlock never fires.

    So what's the best practice with something like this?

    Edit: I suppose I could daisychain the teardown to procedurally step through each AEManagedValue, setting each to nil and using its releaseNotificationBlock to trigger the next one, but YUCK. Hopefully there is a better way.
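
    A sketch of the daisychain I have in mind, with just two of the values (names are from my code):

        __weak RecordingController * weakSelf = self;

        // When the player has been released on the audio thread, release the next value
        self.playerValue.releaseNotificationBlock = ^{
            weakSelf.recorderValue.objectValue = nil;
        };
        // When the last value has been released, it's finally safe to stop the engine
        self.recorderValue.releaseNotificationBlock = ^{
            [weakSelf.audioOutput stop];
        };

        // Kick off the cascade
        self.playerValue.objectValue = nil;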

  • I've confirmed that I CAN daisychain setting each ManagedValue to nil (I did two of the ten that I have), but it's terrible programming, lol. I couldn't ever look another programmer in the eyes if I used this technique.

  • Oh what's this?

    // Mark old value as pending release - it'll be transferred to the release queue by
    // AEManagedValueGetValue on the audio thread

    So perhaps my problem is that the engine is stopping, so the release request never gets fulfilled...
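
    If that's right, the ordering matters; a sketch of what I mean:

        // Wrong order: the output is stopped first, so the pending release is
        // never transferred to the release queue, and the module leaks.
        [self.audioOutput stop];
        self.playerValue.objectValue = nil;

        // Better: nil the value while the renderer is still running, and stop
        // the output from the releaseNotificationBlock once it has fired.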

  • edited June 2016

    Things are improved using the releaseNotificationBlock method to release subsequent ManagedValues. I've found that items in a subrenderer must be let go after the ones in the main renderer. So I did this, but it's not safe:

        __weak RecordingController * weakSelf = self;

        self.playerValue.releaseNotificationBlock = ^{
            NSLog(@"self.playerValue.releaseNotificationBlock");
            weakSelf.pitchShiftValue.objectValue = nil;
        };
        self.playerValue.objectValue = nil;

        self.recorderValue.releaseNotificationBlock = ^{
            NSLog(@"self.recorderValue.releaseNotificationBlock");
        };
        self.recorderValue.objectValue = nil;

        self.gainValue.releaseNotificationBlock = ^{
            NSLog(@"self.gainValue.releaseNotificationBlock");
        };
        self.gainValue.objectValue = nil;

        self.audioReverbValue.releaseNotificationBlock = ^{
            NSLog(@"self.audioReverbValue.releaseNotificationBlock");
        };
        self.audioReverbValue.objectValue = nil;

        self.reviewPlayerValue.releaseNotificationBlock = ^{
            NSLog(@"self.reviewPlayerValue.releaseNotificationBlock");
        };
        self.reviewPlayerValue.objectValue = nil;

        self.pitchShiftValue.releaseNotificationBlock = ^{
            NSLog(@"self.pitchShiftValue.releaseNotificationBlock");
            [weakSelf.microphoneInput stop];
            [weakSelf.audioOutput stop];
            [[AVAudioSession sharedInstance] setActive:NO error:NULL];
            [weakSelf tearDown];
        };
        return;

  • edited June 2016

    This has reduced my generational memory growth to < 1.5 MB per instance, but something is still off. See anything wrong with my render loop?

        audioRenderer.block = ^(const AERenderContext * context) {
            __unsafe_unretained AEAudioFileRecorderModule * weakAudioFileRecorder = (__bridge AEAudioFileRecorderModule *)AEManagedValueGetValue(weakRecorderValue);
            __unsafe_unretained AEDynamicsProcessorModule * weakAudioGain = (__bridge AEDynamicsProcessorModule *)AEManagedValueGetValue(weakGainValue);
            __unsafe_unretained AENewTimePitchModule * weakPitchShift = (__bridge AENewTimePitchModule *)AEManagedValueGetValue(weakPitchShiftValue);
            __unsafe_unretained AEReverbModule * weakAudioReverb = (__bridge AEReverbModule *)AEManagedValueGetValue(weakAudioReverbValue);
            __unsafe_unretained AEAudioFilePlayerModule * weakReviewAudioFilePlayer = (__bridge AEAudioFilePlayerModule *)AEManagedValueGetValue(weakReviewPlayerValue);

            if ( !weakSelf.isReviewing ) {
                // Put the mic input on the stack
                AEModuleProcess(input, context);
                if ( weakAudioFileRecorder ) {
                    // Record the stack contents to a file
                    AEModuleProcess(weakAudioFileRecorder, context);
                }
                if ( !weakSelf.hearVoiceInHeadphones ) {
                    AEBufferStackPop(context->stack, 1);
                } else {
                    // We do want to hear the voice; process the effects
                    if ( weakAudioGain )   AEModuleProcess(weakAudioGain, context);
                    if ( weakAudioReverb ) AEModuleProcess(weakAudioReverb, context);
                }
            }

            if ( weakSelf.isReviewing && weakReviewAudioFilePlayer ) {
                AEModuleProcess(weakReviewAudioFilePlayer, context);
                if ( weakAudioGain )   AEModuleProcess(weakAudioGain, context);
                if ( weakAudioReverb ) AEModuleProcess(weakAudioReverb, context);
            }

            if ( weakPitchShift ) {
                // The audio file is read, pitch shifted and passed on in this submodule
                AEModuleProcess(weakPitchShift, context);
            }

            // Mix the stack into one (the mic + subrenderer stacks)
            AEBufferStackMix(context->stack, 2);
            // Play the stack
            AERenderContextOutput(context, 0);
        };

  • Ok, pretty much done for the day. I boiled down my code, removing everything I could while still playing a simple file. Here's the setup:

        - (void)setupAudioPlayback
        {
            AERenderer * audioRenderer = [AERenderer new];
            AEAudioUnitOutput * output = [[AEAudioUnitOutput alloc] initWithRenderer:audioRenderer];
            self.audioOutput = output;
            [output start:nil];

            NSString * filePath = [documentDir stringByAppendingPathComponent:[NSString stringWithFormat:@"%@", audioFileName]];
            NSURL * file = [NSURL fileURLWithPath:filePath];
            AEAudioFilePlayerModule * audioFilePlayer = [[AEAudioFilePlayerModule alloc] initWithRenderer:audioRenderer
                                                                                                     URL:file
                                                                                                   error:nil];
            AEManagedValue * playerValue = [AEManagedValue new];
            self.playerValue = playerValue;
            self.playerValue.objectValue = audioFilePlayer;

            __weak RecordingController * weakSelf = self;
            self.playerValue.releaseNotificationBlock = ^{
                NSLog(@"self.playerValue.releaseNotificationBlock");
                [weakSelf.audioOutput stop];
                [weakSelf continueTearDown];
            };

            audioRenderer.block = ^(const AERenderContext * context) {
                __unsafe_unretained AEAudioFilePlayerModule * player = (__bridge AEAudioFilePlayerModule *)AEManagedValueGetValue(playerValue);
                if ( player ) {
                    AEModuleProcess(player, context);
                }
                AERenderContextOutput(context, 0);
            };
        }

    And this starts the teardown cascade:

        - (void)tearDownAudioEngine
        {
            self.playerValue.objectValue = nil;
        }

    During runtime, I'm left with this (Instruments screenshot):

    And this (second screenshot), which is the thing taking up actual memory.

    So... AEBufferStackPoolInit seems not to free up its pool. Or something? I'm exhausted and out of ideas. I hope this helps, and I hope someone can help me track down the problem!

  • edited June 2016

    Since the sample had an error that caused the renderer block to be retained, I adjusted my render loop too:

        __unsafe_unretained RecordingController * weakSelf = self;

    and I use weakSelf->var to get my variables. Same problem, though...

    I realize now this is the renderer block being retained, but I'm at a loss as to what I'm doing wrong.
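
    For reference, the shape I ended up with (a sketch; gainValue here stands in for whichever AEManagedValue property the block reads):

        AEManagedValue * gainValue = self.gainValue;                // capture the value as a local,
        __unsafe_unretained RecordingController * weakSelf = self;  // not via a retained self
        audioRenderer.block = ^(const AERenderContext * _Nonnull context) {
            __unsafe_unretained AEDynamicsProcessorModule * gain =
                (__bridge AEDynamicsProcessorModule *)AEManagedValueGetValue(gainValue);
            if ( gain ) {
                AEModuleProcess(gain, context);
            }
            AERenderContextOutput(context, 0);
        };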

  • I think if I simply do this:

        AERenderer * audioRenderer = [AERenderer new];
        AEAudioUnitOutput * output = [[AEAudioUnitOutput alloc] initWithRenderer:audioRenderer];

    then I get those processes in the Allocation Summary, as above. Setting either to nil doesn't help.

  • Whoops! Right you are, @everlasting1; I had a couple dumb retain cycles in there =/

    Fixed now.
