Appending audio samples from a stream to AVAssetWriter

Date: 2023-01-19 18:58:21

I'm working on a project where I record video from the camera while the audio comes from a stream, so the audio frames are obviously not synchronized with the video frames. If I use AVAssetWriter without video and only record the streamed audio frames, it works fine. But if I append both video and audio frames, I can't hear anything.
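
The writer setup isn't shown in the question; for context, a writer with one video and one audio input typically looks something like the sketch below (outputURL and videoSettings are placeholders, and the AAC audio settings are an assumption based on the 44.1 kHz mono format used later):

#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];

// Video input; videoSettings is whatever codec/dimension dictionary the app uses.
AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
videoInput.expectsMediaDataInRealTime = YES;

// Audio input; 44.1 kHz mono AAC is assumed here.
AudioChannelLayout acl = {0};
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
NSDictionary *audioSettings = @{
    AVFormatIDKey: @(kAudioFormatMPEG4AAC),
    AVSampleRateKey: @44100,
    AVNumberOfChannelsKey: @1,
    AVChannelLayoutKey: [NSData dataWithBytes:&acl length:sizeof(acl)]
};
AVAssetWriterInput *audioInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:audioSettings];
audioInput.expectsMediaDataInRealTime = YES;

if ([writer canAddInput:videoInput]) { [writer addInput:videoInput]; }
if ([writer canAddInput:audioInput]) { [writer addInput:audioInput]; }

Both inputs are marked as real-time because the buffers arrive live from the camera and the stream rather than from a file.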

Here is the method that converts the audio data from the stream into a CMSampleBuffer:

// Build a CMAudioFormatDescription from the stream's ASBD.
AudioStreamBasicDescription monoStreamFormat = [self getAudioDescription];

CMFormatDescriptionRef format = NULL;
OSStatus status = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, &monoStreamFormat, 0, NULL, 0, NULL, NULL, &format);
if (status != noErr) {
    // really shouldn't happen
    return nil;
}

// Each sample lasts 1/44100 s; presentation starts at zero, no decode timestamp.
CMSampleTimingInfo timing = { CMTimeMake(1, 44100), kCMTimeZero, kCMTimeInvalid };


CMSampleBufferRef sampleBuffer = NULL;
status = CMSampleBufferCreate(kCFAllocatorDefault, NULL, false, NULL, NULL, format, numSamples, 1, &timing, 0, NULL, &sampleBuffer);
if (status != noErr) {
    // couldn't create the sample buffer
    NSLog(@"Failed to create sample buffer");
    CFRelease(format);
    return nil;
}

// add the samples to the buffer
status = CMSampleBufferSetDataBufferFromAudioBufferList(sampleBuffer,
                                                        kCFAllocatorDefault,
                                                        kCFAllocatorDefault,
                                                        0,
                                                        samples);
if (status != noErr) {
    NSLog(@"Failed to add samples to sample buffer");
    CFRelease(sampleBuffer);
    CFRelease(format);
    return nil;
}
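
For context, the buffer built by this method is eventually handed to an AVAssetWriterInput. A minimal sketch of that appending step, assuming properties named audioWriterInput and assetWriter (these names are illustrative, not from the original code):

if (self.assetWriter.status == AVAssetWriterStatusWriting &&
    self.audioWriterInput.isReadyForMoreMediaData) {
    // Append the converted audio buffer; on failure the writer's error explains why.
    if (![self.audioWriterInput appendSampleBuffer:sampleBuffer]) {
        NSLog(@"Audio append failed: %@", self.assetWriter.error);
    }
}

If a buffer's presentation time falls before the writer session's start time, the writer trims it away, which could produce exactly the silent result described above.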

I don't know whether this is related to the timing, but I would like the audio frames to be appended from the first second of the video.

Is that possible?

Thanks

1 solution

#1


Finally I did this:

// 'info' comes from mach_timebase_info() (<mach/mach_time.h>); it converts the
// sample's mach host time into nanoseconds.
mach_timebase_info_data_t info;
mach_timebase_info(&info);

uint64_t timeNS = hostTime;
timeNS *= info.numer;
timeNS /= info.denom;

// Stamp the buffer with the time captured when the writer session started, so
// the streamed audio is appended from the first second of the video.
CMTime presentationTime = self.initialiseTime; // alternatively CMTimeMake(timeNS, 1000000000)
CMSampleTimingInfo timing = { CMTimeMake(1, 44100), presentationTime, kCMTimeInvalid };
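
For this timing to line up, self.initialiseTime has to come from the same clock that stamps the video frames and that was used to start the writer session. A minimal sketch of one way it might be captured when recording starts (the assetWriter property and the exact start-up sequence are assumptions, not something the answer shows):

// At the moment recording starts, remember the host-clock time and tell the
// writer that its timeline begins there.
CMTime startTime = CMClockGetTime(CMClockGetHostTimeClock());
self.initialiseTime = startTime;
[self.assetWriter startWriting];
[self.assetWriter startSessionAtSourceTime:startTime];

With the video frames and the streamed audio both stamped against that shared start time, the audio should be written from the first second of the video instead of being trimmed out.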
