iPhone Programming Glossary: mAudioData
How do I synthesize sounds with CoreAudio on iPhone/Mac http://stackoverflow.com/questions/1361148/how-do-i-synthesize-sounds-with-coreaudio-on-iphone-mac The output callback looks like: void AudioQueueCallback(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer). Inside it: void *pBuffer = inBuffer->mAudioData; UInt32 bytes = inBuffer->mAudioDataBytesCapacity; write at most bytes bytes of audio to pBuffer; set inBuffer->mAudioDataByteSize = actualNumberOfBytesWritten; then err = AudioQueueEnqueueBuffer(...).
where to start with audio synthesis on iPhone http://stackoverflow.com/questions/2067267/where-to-start-with-audio-synthesis-on-iphone The mAudioData field is an opaque void pointer, so cast it before manipulating it: SInt16 *coreAudioBuffer = (SInt16 *)outBuffer->mAudioData. Specify how many bytes you're providing: outBuffer->mAudioDataByteSize = kBufferSizeInFrames * m_outFormat.mBytesPerFrame. Then generate the sine waves as signed 16-bit stereo interleaved little-endian samples.
Data format from recording using Audio Queue framework http://stackoverflow.com/questions/3963827/data-format-from-recording-using-audio-queue-framework queueFormat = aqr->DataFormat; SoundTouch *soundTouch = aqr->getSoundTouch(); soundTouch->putSamples((const SAMPLETYPE *)inBuffer->mAudioData, inBuffer->mAudioDataByteSize / 2 / queueFormat.NumberChannels); SAMPLETYPE *samples = (SAMPLETYPE *)malloc(sizeof(SAMPLETYPE) * 10000 * queueFormat.NumberChannels); ... char buf[256]; fprintf(stderr, "Error: %s (%s)\n", e.mOperation, e.FormatError(buf)); You can see that I'm passing the data in inBuffer->mAudioData to SoundTouch. In my callback, what exactly are the bytes representing, i.e. how do I extract samples from mAudioData?
AVAssetReader and Audio Queue streaming problem http://stackoverflow.com/questions/4763598/avassetreader-and-audio-queue-streaming-problem [[NSNotificationCenter defaultCenter] postNotificationName:NOTIF_callsample object:nil]; float *inData = (float *)inBuffer->mAudioData; int offsetSample = 0; /* loop until finished reading from the music data */ while (bytesToRead) { /* THIS IS THE PROBLEMATIC LINE */ CMSampleBufferRef ... ; bytesToRead -= audioBuffer.mDataByteSize; offsetSample += audioBuffer.mDataByteSize / 8; } inBuffer->mAudioDataByteSize = offsetSample * 8; AudioQueueEnqueueBuffer(pplayerState->mQueue, inBuffer, 0, 0);
Why might my AudioQueueOutputCallback not be called? http://stackoverflow.com/questions/7575670/why-might-my-audioqueueoutputcallback-not-be-called At this point it is known that the local buffer contains the correct amount of data for copying: memcpy(aqBuffer->mAudioData, localBuffer, kAQBufferSize); aqBuffer->mAudioDataByteSize = kAQBufferSize; OSStatus status = AudioQueueEnqueueBuffer(_audioQueue, aqBuffer, 0, NULL); if (status) { ... } /* This is also not called. */
iOS Stream Audio from one iOS Device to Another http://stackoverflow.com/questions/8357514/ios-stream-audio-from-one-ios-device-to-another ... method on your NSData object into the audio queue buffer inCompleteAQBuffer using a memcpy: memcpy(inCompleteAQBuffer->mAudioData, THIS->mMyAudioBuffer + THIS->mMyPlayBufferPosition, sizeof(float) * numBytesToCopy); You'll also need to set the buffer size: inCompleteAQBuffer->mAudioDataByteSize = numBytesToCopy; numBytesToCopy is always going to be the same unless you're just about to run out of data. For example ...
Audio recorded using Audio Queue Services to data http://stackoverflow.com/questions/8451084/audio-recorded-using-audio-queue-services-to-data ... in the AudioInputCallback. My AudioQueueBufferRef is called inBuffer, and it seems that I want to convert the inBuffer->mAudioData to NSData, then send the NSData to the other device and unpack it. Does anyone know if this would be the way to do it, and how I can convert my inBuffer->mAudioData to NSData? Other approaches are also welcome. This is my callback method, in which I believe I should grab the data and send it: if (!recordState->recording) return; OSStatus status = AudioFileWritePackets(recordState->audioFile, false, inBuffer->mAudioDataByteSize, inPacketDescs, recordState->currentPacket, &inNumberPacketDescriptions, inBuffer->mAudioData); if (status == 0) { recordState->...