Say we have an AVCaptureAudioDataOutput object that has been set up with an AVCaptureSession to get data samples from the system's microphone. captureOutput(_:didOutput:from:) in AVCaptureAudioDataOutputSampleBufferDelegate always gets called with 2048 bytes of samples. Is it possible to set a preferred sample buffer size?
A basic example:
let output = AVCaptureAudioDataOutput()
output.setSampleBufferDelegate(delegate, queue: queue)
// Add the output to the AVCaptureSession and start the capture session once everything is set up
...

// In the delegate
public func captureOutput(_ output: AVCaptureOutput,
                          didOutput sampleBuffer: CMSampleBuffer,
                          from connection: AVCaptureConnection) {
    // sampleBuffer always carries 2048 bytes, but I want less
    let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer)
    print(CMBlockBufferGetDataLength(blockBuffer!)) // 2048
    ...
}
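
For reference, if the delivery size cannot be changed at the source, one fallback would be to re-chunk each buffer inside the delegate. The sketch below only illustrates that idea; the 512-byte chunk size and the processChunk handler are hypothetical placeholders, not part of the original setup.

import AVFoundation
import CoreMedia

// Minimal sketch: split each delivered 2048-byte buffer into smaller chunks.
// `chunkSize` and `processChunk` are hypothetical placeholders.
public func captureOutput(_ output: AVCaptureOutput,
                          didOutput sampleBuffer: CMSampleBuffer,
                          from connection: AVCaptureConnection) {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return }
    let totalLength = CMBlockBufferGetDataLength(blockBuffer) // still 2048
    let chunkSize = 512 // hypothetical target size

    var offset = 0
    while offset < totalLength {
        let length = min(chunkSize, totalLength - offset)
        var chunk = Data(count: length)
        let status = chunk.withUnsafeMutableBytes { rawBuffer -> OSStatus in
            guard let destination = rawBuffer.baseAddress else {
                return kCMBlockBufferBadPointerParameterErr
            }
            // Copy `length` bytes starting at `offset` out of the block buffer
            return CMBlockBufferCopyDataBytes(blockBuffer,
                                              atOffset: offset,
                                              dataLength: length,
                                              destination: destination)
        }
        if status == kCMBlockBufferNoErr {
            processChunk(chunk) // hypothetical per-chunk handler
        }
        offset += length
    }
}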