TAAE2: Modifying Volume @ AEAudioFilePlayerModule

Hi Guys:

I need to mute/unmute certain AEAudioFilePlayerModule instances that I've added to my output in TAAE2.

The previous AEAudioFilePlayer had a property called "channelIsMuted". Is there an equivalent in TAAE2?

Any help would be appreciated.

Thanks :-)

Comments

  • Hi Hernan,

    Check out AEMixerModule. I've never used it myself, but it should have what you need.

    Best,
    Chris

  • If all you need is to mute an AEAudioFilePlayerModule, you could also just set a boolean (or some other primitive) that tells the renderer block whether or not to add it to the output.
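    A minimal sketch of that idea, assuming a hypothetical atomic `bassMuted` property (not in the original code) that a UI button toggles on the main thread. Note that it still processes the player and merely silences its buffer, rather than skipping `AEModuleProcess` entirely, so the loop keeps advancing and stays in sync:

    ```objectivec
    // Hypothetical: @property (atomic) BOOL bassMuted; // set from a UI button

    // Inside renderer.block, on the audio thread:
    AEModuleProcess(_bass, context); // Keep processing so the loop stays in sync
    if ( _bassMuted ) {
        // Zero the volume of the buffer the player just pushed
        AEBufferStackApplyFaders(context->stack, 0.0, NULL, 0.0, NULL);
    }
    ```

    This mutes abruptly; passing a pointer as the `currentVolume` argument instead of NULL lets the faders ramp toward the target, which avoids clicks.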

  • edited September 2016

    @Hernan,

    Here's some code that should help you understand how to modify the volume level and/or stereo panning of each AEAudioFilePlayerModule:

    - (instancetype)init {
        if ( !(self = [super init]) ) return nil;
    
        // Create a renderer
        AERenderer * renderer = [AERenderer new];
    
        // Setup audio loops
    
        NSURL * url = [[NSBundle mainBundle] URLForResource:@"technobeat" withExtension:@"m4a"];
        _technobeat = [[AEAudioFilePlayerModule alloc] initWithRenderer:renderer URL:url error:NULL];
        _technobeat.loop = YES;
        _technobeat.microfadeFrames = 32;
    
        url = [[NSBundle mainBundle] URLForResource:@"bass" withExtension:@"m4a"];
        _bass = [[AEAudioFilePlayerModule alloc] initWithRenderer:renderer URL:url error:NULL];
        _bass.loop = YES;
        _bass.microfadeFrames = 32;
    
        url = [[NSBundle mainBundle] URLForResource:@"housebeat" withExtension:@"m4a"];
        _housebeat = [[AEAudioFilePlayerModule alloc] initWithRenderer:renderer URL:url error:NULL];
        _housebeat.loop = YES;
        _housebeat.microfadeFrames = 32;
    
        url = [[NSBundle mainBundle] URLForResource:@"melody" withExtension:@"m4a"];
        _melody = [[AEAudioFilePlayerModule alloc] initWithRenderer:renderer URL:url error:NULL];
        _melody.loop = YES;
        _melody.microfadeFrames = 32;
    
        // Add effects filters
    
        // Distortion
        _distortion = [[AEDistortionModule alloc] initWithRenderer:renderer];
        _distortion.cubicTerm = 60.0;
        _distortion.softClipGain = 20.0;
        _distortion.rounding = 20.0;
        _distortion.finalMix = 80.0;
    
        // Delay
        _delay = [[AEDelayModule alloc] initWithRenderer:renderer];
        _delay.delayTime = _technobeat.duration/4;
        _delay.feedback = 40.0;
        _delay.lopassCutoff = 15000.0;
        _delay.wetDryMix = 35.0;
    
        // Reverb
        _reverb = [[AEReverbModule alloc] initWithRenderer:renderer];
        _reverb.decayTimeAt0Hz = 2;
        _reverb.decayTimeAtNyquist = 3;
        _reverb.gain = 4;
        _reverb.dryWetMix = 10.0;
    
        // Create an output
        self.output = [[AEAudioUnitOutput alloc] initWithRenderer:renderer];
    
        // Render block
    
        // Setup renderer. This is all performed on the audio thread, so the usual
        // rules apply: No holding locks, no memory allocation, no Objective-C/Swift code.
    
        renderer.block = ^(const AERenderContext * _Nonnull context) {
    
            AEModuleProcess(_melody, context); // Run player (pushes 1)
    
            if ( _melody ) {
                // Apply volume and pan level control to the melody track
                _volumeLevel = 1.0; // Volume level is 0.0 -> 1.0
                _panLevel = 0.0; // Stereo pan level is -1.0 = left side 0.0 = center 1.0 = right side
                AEBufferStackApplyFaders(context->stack, _volumeLevel, NULL, _panLevel, NULL);
            }
    
            // Apply distortion and delay filters ONLY to the 'melody' audio file above.
    
            AEModuleProcess(_distortion, context); // Run filter (edits top buffer)
            AEModuleProcess(_delay, context); // Run filter (edits top buffer)
    
            // Add the remaining 3 audio files. Note: AEBufferStackApplyFaders
            // operates on the top buffer of the stack, so apply each player's
            // faders immediately after processing that player.
            AEModuleProcess(_bass, context); // Run player (pushes 1)
            AEBufferStackApplyFaders(context->stack, 1.0, NULL, 0.0, NULL);
    
            AEModuleProcess(_housebeat, context); // Run player (pushes 1)
            AEBufferStackApplyFaders(context->stack, 1.0, NULL, 0.0, NULL);
    
            AEModuleProcess(_technobeat, context); // Run player (pushes 1)
            AEBufferStackApplyFaders(context->stack, 1.0, NULL, 0.0, NULL);
    
            // Mix all 4 audio file buffers pushed by the players above
            AEBufferStackMix(context->stack, 4);
    
            // Apply reverb filter to entire audio output mix for all tracks
            AEModuleProcess(_reverb, context);
    
            AERenderContextOutput(context, 1); // Put top buffer onto output
    
        };
    
        NSLog(@"Render block installed.");
    
        return self;
    }
    

    I hope this helps!

    Take care,
    Mark
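    One way to turn the constant volume levels above into a runtime mute/unmute, sketched here under the assumption of two hypothetical float ivars (`_melodyTargetVolume` and `_melodyCurrentVolume`, not present in the original code): the UI writes the target volume, and the render block passes a pointer as the `currentVolume` argument so `AEBufferStackApplyFaders` can ramp smoothly toward the target instead of jumping, which avoids clicks:

    ```objectivec
    // Main thread, e.g. in a mute button handler:
    _melodyTargetVolume = muteButton.selected ? 0.0f : 1.0f;

    // Audio thread, inside renderer.block, right after
    // AEModuleProcess(_melody, context):
    AEBufferStackApplyFaders(context->stack,
                             _melodyTargetVolume, &_melodyCurrentVolume,
                             0.0, NULL); // pan stays centered
    ```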

  • edited October 2016

    Hi @markjeschke and @cgmaier:

    Thanks a lot for replying, and sorry it took me a while :-)

    So I think I need a combination of both your answers:

    @cgmaier I think I'll start holding a reference to the AEMixerModule instead of creating it on the spot, in order to access the individual loops / AEAudioFilePlayerModules. Can I do this, or will it mess with the render block?

    @markjeschke how does this work?
    AEModuleProcess(_bass, context);
    Does this create an AEModule that I can set the volume on? Or should I do this individually for each player and then run the AEBufferStackApplyFaders method?

    Thanks again :-)

  • @Hernan,

    Do you simply want to add an interactive button that mutes the currently playing audio track? You could set the volume level to 0 when its mute boolean is true, and keep a variable that stores the previous volume level so it can be restored once the audio track is unmuted.

    I'll try to create a working example with dynamic volume levels for individual audio tracks. Are you using Swift or Objective-C? Xcode 7.3.1 or Xcode 8?

    Thanks,
    Mark
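    The store-and-restore idea above can be sketched language-agnostically. Here is a small C version (hypothetical names, not part of TAAE2) that remembers the pre-mute volume so unmuting restores it:

    ```c
    #include <stdbool.h>

    // Hypothetical per-track mute state: remembers the volume in effect
    // before muting so it can be restored on unmute.
    typedef struct {
        float volume;        // current target volume, 0.0 - 1.0
        float storedVolume;  // volume to restore on unmute
        bool  muted;
    } TrackMuteState;

    void trackSetMuted(TrackMuteState *t, bool muted) {
        if (muted && !t->muted) {
            t->storedVolume = t->volume; // remember the pre-mute level
            t->volume = 0.0f;            // silence the track
        } else if (!muted && t->muted) {
            t->volume = t->storedVolume; // restore the previous level
        }
        t->muted = muted;
    }
    ```

    On the TAAE2 side, the resulting `volume` would then be fed to AEBufferStackApplyFaders as the target volume for that track's buffer.
    
    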

  • edited October 2016

    Hi @markjeschke :smile:

    Wow! You would make an example for me? Thanks, I really appreciate this.

    Yes, the goal is to mute/unmute single channels in the mix without stopping them, so as not to lose their sync in the loop.

    I'm still working in Obj-C, with Xcode version 7.3.1 (7D1014).

    I want to update to Swift, but I guess that will have to wait for Loopacks 2.0 ;-)

    Thanks again <3

  • @Hernan,

    I'm sorry for the delayed response. I attempted to make an Xcode 7.3.1 Objective-C version with TAAE 2, but I can't get it to build successfully. The only projects I have building successfully with TAAE 2 use a bridging header, similar to the Swift sample project that's included in the GitHub repo.

    If you happen to have a working Obj-C Xcode project that builds successfully with TAAE 2, could you please post it to GitHub? From there, I can implement the mute boolean/button code that you're looking for.

    Thanks for your patience.

    Take care,
    Mark

  • Hi @markjeschke:

    Sorry for the super-late response. I'll try to move on to AudioKit, based on @Michael's suggestion.

    It's already a pain to migrate; might as well do it once, for keeps :-)

    Thanks a lot for the Help ;-)
