I've been looking for an answer to this question for about a month, so any help is greatly appreciated!
I'm using AVAudioEngine to record audio. The audio is recorded using a tap:
localInput?.installTap(onBus: 0, bufferSize: 4096, format: localInputFormat) {
It is recorded as an AVAudioPCMBuffer. It needs to be converted to [UInt8], which I do with this method:
func audioBufferToBytes(audioBuffer: AVAudioPCMBuffer) -> [UInt8] {
    let srcLeft = audioBuffer.floatChannelData![0]
    let bytesPerFrame = audioBuffer.format.streamDescription.pointee.mBytesPerFrame
    let numBytes = Int(bytesPerFrame * audioBuffer.frameLength)

    // initialize bytes to 0
    var audioByteArray = [UInt8](repeating: 0, count: numBytes)

    // copy the raw sample bytes out of the buffer
    srcLeft.withMemoryRebound(to: UInt8.self, capacity: numBytes) { srcByteData in
        audioByteArray.withUnsafeMutableBufferPointer {
            $0.baseAddress!.initialize(from: srcByteData, count: numBytes)
        }
    }
    return audioByteArray
}
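For reference, the byte copy itself can be checked without AVFoundation. This is a minimal sketch (assuming both devices share the same native byte order) showing that copying Float32 samples into a [UInt8] and back is lossless, so the raw copy is not where precision would be lost:

```swift
// Sketch: Float32 samples -> [UInt8] -> Float32, byte-for-byte.
let samples: [Float] = [0.0, 0.5, -0.25, 1.0]

// Float -> bytes (4 bytes per Float32 sample)
var bytes = [UInt8](repeating: 0, count: samples.count * MemoryLayout<Float>.size)
samples.withUnsafeBytes { src in
    bytes.withUnsafeMutableBytes { dst in
        dst.copyMemory(from: src)
    }
}

// bytes -> Float
var restored = [Float](repeating: 0, count: samples.count)
bytes.withUnsafeBytes { src in
    restored.withUnsafeMutableBytes { dst in
        dst.copyMemory(from: src)
    }
}
assert(restored == samples)  // the byte copy loses nothing
```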
The audio is then written to an output stream. On the other device, the data needs to be converted back to an AVAudioPCMBuffer so it can be played. I use this method:
func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {
    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: true)
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame

    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)
    audioBuffer.frameLength = frameLength

    let dstLeft = audioBuffer.floatChannelData![0]
    buf.withUnsafeBufferPointer {
        let src = UnsafeRawPointer($0.baseAddress!).bindMemory(to: Float.self, capacity: Int(frameLength))
        dstLeft.initialize(from: src, count: Int(frameLength))
    }
    return audioBuffer
}
However, there must be something wrong with my logic, because on the other device, when I play the audio, I do hear something, but it just sounds like static.
As I said, any help is appreciated; I've been stuck on this for a while now.
EDIT
Thanks for the help so far. I've switched to using Data, so my conversion now looks like this (I found this code online):
func audioBufferToData(audioBuffer: AVAudioPCMBuffer) -> Data {
    let channelCount = 1
    let bufferLength = (audioBuffer.frameCapacity * audioBuffer.format.streamDescription.pointee.mBytesPerFrame)
    let channels = UnsafeBufferPointer(start: audioBuffer.floatChannelData, count: channelCount)
    let data = Data(bytes: channels[0], count: Int(bufferLength))
    return data
}
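As a sanity check on the Data round trip itself (pure Swift, no AVFoundation): packing Float32 samples into a Data and reading them back as Float32 should be byte-for-byte lossless, assuming both sides agree on the sample format:

```swift
import Foundation

// Sketch: Float32 samples -> Data -> Float32.
let samples: [Float] = [0.1, -0.9, 0.25]
let data = samples.withUnsafeBufferPointer { Data(buffer: $0) }

let restored = data.withUnsafeBytes { raw in
    Array(raw.bindMemory(to: Float.self))
}
assert(restored == samples)  // lossless as long as both sides use Float32
```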
The conversion back to an AVAudioPCMBuffer looks like this:
func dataToAudioBuffer(data: Data) -> AVAudioPCMBuffer {
    let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)
    let audioBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(data.count) / 2)
    audioBuffer.frameLength = audioBuffer.frameCapacity
    for i in 0..<data.count / 2 {
        audioBuffer.floatChannelData?.pointee[i] = Float(Int16(data[i * 2 + 1]) << 8 | Int16(data[i * 2])) / Float(INT16_MAX)
    }
    return audioBuffer
}
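One thing worth noting (an observation about the code above, not a confirmed diagnosis of the whole project): audioBufferToData copies raw Float32 bytes, while dataToAudioBuffer reassembles the bytes as little-endian Int16 PCM. A pure-Swift sketch of what happens when Float32 bytes are read back as Int16, which is the kind of mismatch that sounds like static:

```swift
// Sketch: write Float32 sample bytes, then read them back two ways.
let floatSamples: [Float] = [0.5, -0.5]
var raw = [UInt8](repeating: 0, count: floatSamples.count * 4)
floatSamples.withUnsafeBytes { src in
    raw.withUnsafeMutableBytes { $0.copyMemory(from: src) }
}

// Matched read-back: Float32 recovers the original samples.
let asFloats = raw.withUnsafeBytes { Array($0.bindMemory(to: Float.self)) }
assert(asFloats == floatSamples)

// Mismatched read-back: the same 8 bytes become four Int16 "samples"
// whose values are unrelated to the original signal.
let asInt16 = raw.withUnsafeBytes { Array($0.bindMemory(to: Int16.self)) }
let garbled = asInt16.map { Float($0) / Float(Int16.max) }
// garbled.count == 4, twice as many "samples" as were recorded
```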
Unfortunately, the same problem persists...
EDIT 2
I've created a project that reproduces this issue. All it does is record audio, convert it to Data, convert it back to an AVAudioPCMBuffer, and play the audio.
The link is here: https://github.com/Lkember/IntercomTest
EDIT 3
There was a crash when using a device with 2 channels, but I've fixed it.
EDIT 4
The submitted answer fixed the issue in my sample project, but it didn't fix the issue in my main project. I've posted a new question here:
How to send NSData over an OutputStream: https://stackoverflow.com/questions/42936113/how-to-send-nsdata-over-an-outputstream