AEAudioFileOutput: how to use?

Hi,

I'm currently using TAAE2, which is incredibly efficient, but I have a problem with offline rendering.
I haven't managed to use AEAudioFileOutput. Should I use the same renderer as the one I use for playing audio?
Is it possible to have an example of integration?
Thank you very much

Comments

  • I use runForDuration, and in the completionBlock I call finishWriting, but when I look at the exported file, it is 00:00 seconds long

  • You'll need to use a different renderer, @alexisdarnat, or at least stop the output unit first, because otherwise you'll be running the same renderer twice at the same time.

  • @Michael Ah, nice! Thank you, but I still have a problem.
    I have just created a renderer:

        AERenderer *offlineRenderer = [AERenderer new];

    And an AEAudioFilePlayerModule:

        module = [[AEAudioFilePlayerModule alloc] initWithRenderer:offlineRenderer URL:url error:NULL];

        offlineRenderer.block = ^(const AERenderContext * _Nonnull context) {
            AEModuleProcess(module, context);
            AERenderContextOutput(context, 1);
            AEBufferStackPop(context->stack, 1);
        };

    And I simply run:

        AEAudioFileOutput *outputRender = [[AEAudioFileOutput alloc] initWithRenderer:offlineRenderer URL:url type:AEAudioFileTypeM4A sampleRate:offlineRenderer.sampleRate channelCount:offlineRenderer.numberOfOutputChannels error:nil];
        [module playAtTime:0];
        [outputRender runForDuration:10.0 completionBlock:^{
            NSLog(@"File exported");
        }];

    The audio file is exported correctly and has the right duration, but there is no sound.

    I've tried to print out the AudioBufferList data (context->output->mBuffers->mData) in the offlineRenderer block, but it seems to be empty, full of zeros.

  • edited May 2016

    Well, I just solved the problem: I used [module playAtTime:AECurrentTimeInHostTicks()] instead of [module playAtTime:0] :D

  • edited May 2016

    That's very strange! I just tried to reproduce the issue and it's working fine on my end. What happens when you put the following into the sample app somewhere, like within the 'start' function in AEAudioController:

        dispatch_async(dispatch_get_global_queue(QOS_CLASS_BACKGROUND, 0), ^{
            AERenderer * offline = [AERenderer new];
    
            NSURL * url = [[NSBundle mainBundle] URLForResource:@"amen" withExtension:@"m4a"];
            AEAudioFilePlayerModule * loop = [[AEAudioFilePlayerModule alloc] initWithRenderer:offline URL:url error:NULL];
            loop.loop = YES;
    
            offline.block = ^(const AERenderContext * context) {
                AEModuleProcess(loop, context);
                AERenderContextOutput(context, 1);
            };
    
            NSString * documents = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
            NSURL * outputUrl = [NSURL fileURLWithPath:[documents stringByAppendingPathComponent:@"Offline.m4a"]];
            AEAudioFileOutput * output = [[AEAudioFileOutput alloc] initWithRenderer:offline URL:outputUrl type:AEAudioFileTypeM4A sampleRate:44100.0 channelCount:2 error:NULL];
    
            [loop playAtTime:0];
    
            [output runForDuration:10.0 completionBlock:^{
                [output finishWriting];
                NSLog(@"Rendered. Now playing.");
    
                AEAudioFilePlayerModule * player = [[AEAudioFilePlayerModule alloc] initWithRenderer:self.output.renderer URL:outputUrl error:NULL];
                [player playAtTime:0];
                self.playerValue.objectValue = player;
            }];
        });
    
  • edited June 2016

  • @Michael, @alexisdarnat I am using the same code as above, but it is not working in the case where a file needs to start playing a few seconds in.

    For example, if I export 2 audio files into one file, one plays at 0 seconds and the other plays at 3 seconds.

    For that I am using:

        [module1 playAtTime:0]; // to play the file at 0 seconds

        [module2 playAtTime:AECurrentTimeInHostTicks() + AEHostTicksFromSeconds(3)]; // to play the file at 3 seconds

    But it is not working: when I export, only the file at 0 seconds plays; the file at 3 seconds does not.

    Below is my full code.

        dispatch_async(dispatch_get_global_queue(QOS_CLASS_BACKGROUND, 0), ^{
            offlineRender = [AERenderer new];

            AEAggregatorModule *aggregator = [[AEAggregatorModule alloc] initWithRenderer:offlineRender];

            AEAudioFilePlayerModule *module1 = [[AEAudioFilePlayerModule alloc] initWithRenderer:offlineRender URL:_url1 error:NULL];
            [aggregator addModule:module1];

            AEAudioFilePlayerModule *module2 = [[AEAudioFilePlayerModule alloc] initWithRenderer:offlineRender URL:_url error:NULL];
            [aggregator addModule:module2];

            offlineRender.block = ^(const AERenderContext * _Nonnull context) {
                // Run all the players, through the aggregator
                AEModuleProcess(aggregator, context);
                AERenderContextOutput(context, 1);
            };

            NSString * documents = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
            NSURL * outputUrl = [NSURL fileURLWithPath:[documents stringByAppendingPathComponent:@"Offline.m4a"]];

            AEAudioFileOutput * outputRender = [[AEAudioFileOutput alloc] initWithRenderer:offlineRender URL:outputUrl type:AEAudioFileTypeM4A sampleRate:44100.0 channelCount:2 error:NULL];

            [module1 playAtTime:0];
            [module2 playAtTime:AECurrentTimeInHostTicks() + AEHostTicksFromSeconds(3)];

            [outputRender runForDuration:10 completionBlock:^(NSError * _Nullable error) {
                [outputRender finishWriting];
            }];
        });
    
  • I have the same problem: AEAudioFileOutput does not write any sound if AEAudioFilePlayerModule's playAtTime is used with a nonzero time offset, as in [module playAtTime:startTicks]. To fix it, I correct mHostTime within the renderer block:

        //...
        AEAudioFileOutput *fileOutput;
        AEHostTicks _startOfflineOutput; // the time when AEAudioFileOutput started
        //...
        fileOutput = [[AEAudioFileOutput alloc] initWithRenderer:renderer URL:recordedSoundTracksURL type:AEAudioFileTypeM4A sampleRate:renderer.sampleRate channelCount:renderer.numberOfOutputChannels error:&error];
        //...
        _startOfflineOutput = AECurrentTimeInHostTicks();

        [fileOutput runForDuration:totalDuration completionBlock:^(NSError * _Nullable error) {
            [fileOutput finishWriting];
            if (!error) {
                //... use the recorded file
            }
            fileOutput = nil;
        }];

        // renderer block
        renderer.block = ^(const AERenderContext *context) {
            // ...
            if (context->offlineRendering) {
                // timestamp correction to force audio file rendering
                ((AudioTimeStamp *)context->timestamp)->mHostTime = _startOfflineOutput + AEHostTicksFromSeconds(context->timestamp->mSampleTime / context->sampleRate);
            }
            // ...
        };
    

    But unfortunately it helps only once. To get AEAudioFileOutput to work the next time, we have to recreate the renderer.

  • Ah, yes, this is something that probably needs to be addressed in the way timestamps are used. Because it's faster-than-realtime rendering, it doesn't make sense to use host ticks; instead, the time needs to be provided in samples. I'll work on that.

  • Okay, I've just added some changes that make AEAudioFilePlayerModule take AudioTimeStamps for scheduling.

    For example:


        dispatch_async(dispatch_get_global_queue(QOS_CLASS_BACKGROUND, 0), ^{
            AERenderer * offline = [AERenderer new];

            NSURL * url = [[NSBundle mainBundle] URLForResource:@"amen" withExtension:@"m4a"];
            AEAudioFilePlayerModule * loop = [[AEAudioFilePlayerModule alloc] initWithRenderer:offline URL:url error:NULL];
            loop.loop = YES;

            url = [[NSBundle mainBundle] URLForResource:@"bass" withExtension:@"m4a"];
            AEAudioFilePlayerModule * loop2 = [[AEAudioFilePlayerModule alloc] initWithRenderer:offline URL:url error:NULL];
            loop2.loop = YES;

            offline.block = ^(const AERenderContext * context) {
                AEModuleProcess(loop, context);
                AEModuleProcess(loop2, context);
                AERenderContextOutput(context, 2);
            };

            NSString * documents = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
            NSURL * outputUrl = [NSURL fileURLWithPath:[documents stringByAppendingPathComponent:@"Offline.m4a"]];
            AEAudioFileOutput * output = [[AEAudioFileOutput alloc] initWithRenderer:offline URL:outputUrl type:AEAudioFileTypeM4A sampleRate:44100.0 channelCount:2 error:NULL];

            [loop playAtTime:AETimeStampWithSamples(2048)];
            [loop2 playAtTime:AETimeStampWithSamples(2048 + ((loop.duration / 4.0) * offline.sampleRate))];

            [output runForDuration:10.0 completionBlock:^(NSError * error) {
                [output finishWriting];
                NSLog(@"Rendered. Now playing.");

                AEAudioFilePlayerModule * player = [[AEAudioFilePlayerModule alloc] initWithRenderer:self.output.renderer URL:outputUrl error:NULL];
                [player playAtTime:AETimeStampNone];
                self.playerValue.objectValue = player;
            }];
        });
  • Now it works fine. Starting playback at a relative time (specified in samples) is exactly what I wanted! Many thanks!

  • Thanks to all. :smile:
