
iPhone Programming Glossary: buffers

How do I synthesize sounds with CoreAudio on iPhone/Mac

http://stackoverflow.com/questions/1361148/how-do-i-synthesize-sounds-with-coreaudio-on-iphone-mac

&deviceFormat, AudioQueueCallback, NULL, CFRunLoopGetCurrent(), kCFRunLoopCommonModes, 0, &audioQueue);
// Allocate buffers for the AudioQueue and pre-fill them.
for (int i = 0; i < BUFFER_COUNT; i++) {
    AudioQueueBufferRef mBuffer;
    err = AudioQueueAllocateBuffer(..
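The excerpt centers on AudioQueueNewOutput plus an allocate-and-prime loop. A minimal self-contained sketch of that pattern, assuming a 16-bit mono PCM format and a buffer size picked for illustration:

#include <AudioToolbox/AudioToolbox.h>
#include <string.h>

#define BUFFER_COUNT 3
#define BUFFER_SIZE  8192   // bytes per buffer; an assumed value, tune for latency

// Called whenever the queue needs another buffer of samples.
static void AudioQueueCallback(void *inUserData, AudioQueueRef inAQ,
                               AudioQueueBufferRef inBuffer) {
    memset(inBuffer->mAudioData, 0, inBuffer->mAudioDataBytesCapacity); // silence; synthesize here
    inBuffer->mAudioDataByteSize = inBuffer->mAudioDataBytesCapacity;
    AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}

static AudioQueueRef startQueue(void) {
    AudioStreamBasicDescription deviceFormat = {0};
    deviceFormat.mSampleRate       = 44100;
    deviceFormat.mFormatID         = kAudioFormatLinearPCM;
    deviceFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    deviceFormat.mChannelsPerFrame = 1;
    deviceFormat.mBitsPerChannel   = 16;
    deviceFormat.mBytesPerFrame    = 2;
    deviceFormat.mFramesPerPacket  = 1;
    deviceFormat.mBytesPerPacket   = 2;

    AudioQueueRef audioQueue = NULL;
    AudioQueueNewOutput(&deviceFormat, AudioQueueCallback, NULL,
                        CFRunLoopGetCurrent(), kCFRunLoopCommonModes, 0, &audioQueue);
    // Allocate buffers for the AudioQueue and pre-fill them, as in the excerpt.
    for (int i = 0; i < BUFFER_COUNT; i++) {
        AudioQueueBufferRef mBuffer;
        AudioQueueAllocateBuffer(audioQueue, BUFFER_SIZE, &mBuffer);
        AudioQueueCallback(NULL, audioQueue, mBuffer);
    }
    AudioQueueStart(audioQueue, NULL);
    return audioQueue;
}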

Upload live streaming video from iPhone like Ustream or Qik

http://stackoverflow.com/questions/1960782/upload-live-streaming-video-from-iphone-like-ustream-or-qik

The past writer is in the process of closing the movie file and uploading it. The current writer is receiving the sample buffers from the camera. The future writer is in the process of opening a new movie file and preparing it for data. Every 5 seconds..
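A sketch of the hand-off step in that three-writer rotation; the property names and the helpers prepareFutureWriterAsync and uploadFileAtURL: are my assumptions, not code from the answer:

#import <AVFoundation/AVFoundation.h>

// Promote future -> current, finish the old current writer, upload its chunk.
- (void)rotateWriters {
    AVAssetWriter *finished = self.currentWriter;
    self.currentWriter = self.futureWriter;   // starts receiving camera sample buffers
    [self prepareFutureWriterAsync];          // hypothetical: open the next movie file
    [finished.inputs makeObjectsPerformSelector:@selector(markAsFinished)];
    [finished finishWritingWithCompletionHandler:^{
        [self uploadFileAtURL:finished.outputURL];  // hypothetical upload step
    }];
}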

where to start with audio synthesis on iPhone

http://stackoverflow.com/questions/2067267/where-to-start-with-audio-synthesis-on-iphone

AudioStreamBasicDescription m_outFormat;
AudioQueueRef m_outAQ;
enum { kBufferSizeInFrames = 512, kNumBuffers = 4, kSampleRate = 44100 };
AudioQueueBufferRef m_buffers[kNumBuffers];
bool m_isInitialised;
struct Wave {
    Wave() : volume(1.f), phase(0.f), frequency(0.f), fStep(0.f) {}
    float volume;
    float phase; ..
.. this, NULL, NULL, 0, &m_outAQ);
if (result != 0) { printf("ERROR: %d\n", (int)result); return false; }
// Allocate buffers for the audio
UInt32 bufferSizeBytes = kBufferSizeInFrames * m_outFormat.mBytesPerFrame;
for (int buf = 0; buf < kNumBuffers; buf++) {
    OSStatus result = AudioQueueAllocateBuffer(m_outAQ, bufferSizeBytes, &m_buffers[buf]);
    if (result) { printf("ERROR: %d\n", (int)result); return false; }
    // Prime the buffers
    queueCallback(m_outAQ, m_buffers[buf]);
}
m_isInitialised ..
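The Wave struct above carries the oscillator state. A minimal sketch of how such a struct can fill one queue buffer with a sine tone; the convention fStep = 2π·frequency/sampleRate is an assumption inferred from the field names:

#include <AudioToolbox/AudioToolbox.h>
#include <math.h>

struct Wave { float volume, phase, frequency, fStep; };  // fields as in the excerpt

// Render one buffer of 16-bit mono samples from the oscillator state.
static void fillBuffer(struct Wave *w, AudioQueueBufferRef buf) {
    SInt16 *samples = (SInt16 *)buf->mAudioData;
    UInt32 frames = buf->mAudioDataBytesCapacity / sizeof(SInt16);
    for (UInt32 i = 0; i < frames; i++) {
        samples[i] = (SInt16)(w->volume * sinf(w->phase) * 32767.0f);
        w->phase += w->fStep;                          // assumed: 2*pi*frequency/sampleRate
        if (w->phase > 2.0f * (float)M_PI) w->phase -= 2.0f * (float)M_PI;
    }
    buf->mAudioDataByteSize = frames * sizeof(SInt16);
}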

Cocoa Touch - Comparing Images

http://stackoverflow.com/questions/3400707/cocoa-touch-comparing-images

Then use CGContextDrawImage to draw the images into the bitmap contexts. Now the bytes of the images are in the buffers. You can then loop through them manually, or use memcmp to check for differences. Apple's own detailed explanation and sample code..
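A self-contained sketch of that approach, assuming both images share the same pixel dimensions (the function name is mine, not from the thread):

#import <UIKit/UIKit.h>
#include <string.h>
#include <stdlib.h>

BOOL imagesAreIdentical(UIImage *a, UIImage *b) {
    size_t w = CGImageGetWidth(a.CGImage), h = CGImageGetHeight(a.CGImage);
    size_t bytesPerRow = w * 4;                        // 32-bit RGBA
    void *bufA = calloc(h, bytesPerRow), *bufB = calloc(h, bytesPerRow);
    CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctxA = CGBitmapContextCreate(bufA, w, h, 8, bytesPerRow, cs,
                                              kCGImageAlphaPremultipliedLast);
    CGContextRef ctxB = CGBitmapContextCreate(bufB, w, h, 8, bytesPerRow, cs,
                                              kCGImageAlphaPremultipliedLast);
    // Render the pixels into the buffers, then compare byte-for-byte.
    CGContextDrawImage(ctxA, CGRectMake(0, 0, w, h), a.CGImage);
    CGContextDrawImage(ctxB, CGRectMake(0, 0, w, h), b.CGImage);
    BOOL same = (memcmp(bufA, bufB, h * bytesPerRow) == 0);
    CGContextRelease(ctxA); CGContextRelease(ctxB);
    CGColorSpaceRelease(cs);
    free(bufA); free(bufB);
    return same;
}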

How to properly release an AVCaptureSession

http://stackoverflow.com/questions/3741121/how-to-properly-release-an-avcapturesession

When the dispatch queue quits, we can be sure that there won't be any more action in the second thread where the sample buffers are processed.
static void capture_cleanup(void *p) {
    AugmReality *ar = (AugmReality *)p;  // cast to original context instance
    [ar release];..
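A sketch of the finalizer technique that makes this work, using the AugmReality delegate from the excerpt under manual reference counting; the queue label and method name are assumptions:

#import <AVFoundation/AVFoundation.h>

// Runs only after the queue has fully drained, i.e. when no more sample
// buffers can arrive on the second thread.
static void capture_cleanup(void *p) {
    AugmReality *ar = (AugmReality *)p;   // cast back to the original context instance
    [ar release];                         // balances the retain taken below (MRC)
}

- (void)attachOutput:(AVCaptureVideoDataOutput *)videoDataOutput {
    dispatch_queue_t queue = dispatch_queue_create("com.example.capture", NULL); // label assumed
    dispatch_set_context(queue, [self retain]);        // self: the AugmReality delegate
    dispatch_set_finalizer_f(queue, capture_cleanup);  // release happens only after drain
    [videoDataOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
}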

This code to write video+audio through AVAssetWriter and AVAssetWriterInputs is not working. Why?

http://stackoverflow.com/questions/4149963/this-code-to-write-videoaudio-through-avassetwriter-and-avassetwriterinputs-is

in AVAssetWriterInput for an explanation. The library should calculate the proper timing for interleaving the buffers. You do not need to call endSessionAtSourceTime:. The last timestamp in the sample data will be used after the call to _videoWriter..
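A sketch of the finish sequence that answer implies, using the _videoWriter name from the excerpt; the two input variable names are my assumptions:

// No endSessionAtSourceTime: needed; the session ends at the last
// sample timestamp that was appended.
[_videoWriterInput markAsFinished];   // assumed names for the writer inputs
[_audioWriterInput markAsFinished];
[_videoWriter finishWritingWithCompletionHandler:^{
    NSLog(@"Finished writing %@", _videoWriter.outputURL);
}];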

Record and play audio Simultaneously

http://stackoverflow.com/questions/4215180/record-and-play-audio-simultaneously

the Audio Unit RemoteIO or the Audio Queue API. These are lower-level APIs where you have to handle the buffers of outgoing and incoming PCM samples yourself. See Apple's aurioTouch sample app for example code..
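For orientation, a sketch of how the RemoteIO unit is typically obtained before attaching render callbacks; this is standard Core Audio boilerplate, not code from the answer:

#import <AudioToolbox/AudioToolbox.h>

// Find the RemoteIO component, instantiate it, and enable mic input.
static AudioUnit makeRemoteIOUnit(void) {
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple,
    };
    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioUnit rioUnit = NULL;
    AudioComponentInstanceNew(comp, &rioUnit);
    UInt32 one = 1;  // enable recording on input bus 1 (output bus 0 is on by default)
    AudioUnitSetProperty(rioUnit, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1, &one, sizeof(one));
    AudioUnitInitialize(rioUnit);
    return rioUnit;
}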

Can use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time?

http://stackoverflow.com/questions/4944083/can-use-avcapturevideodataoutput-and-avcapturemoviefileoutput-at-the-same-time

[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings]; I'm going to push pixel buffers to it, so I will need an AVAssetWriterPixelBufferAdaptor to expect the same 32BGRA input as I've asked the AVCaptureVideoDataOutput..
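Reconstructed as a sketch; the outputSettings dictionary is assumed to exist already:

#import <AVFoundation/AVFoundation.h>

// A writer input expecting 32BGRA pixel buffers, wrapped in an adaptor so
// CVPixelBufferRefs from the data output can be appended directly.
AVAssetWriterInput *input =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:outputSettings];
input.expectsMediaDataInRealTime = YES;  // capture is live
NSDictionary *pixelAttrs = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                   sourcePixelBufferAttributes:pixelAttrs];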

Uploading live streaming video from iPhone [duplicate]

http://stackoverflow.com/questions/5062266/uploading-live-streaming-video-from-iphone

The past writer is in the process of closing the movie file and uploading it. The current writer is receiving the sample buffers from the camera. The future writer is in the process of opening a new movie file and preparing it for data. Every 5 seconds..

AVFoundation + AssetWriter: Generate Movie With Images and Audio

http://stackoverflow.com/questions/5640657/avfoundation-assetwriter-generate-movie-with-images-and-audio

..(pxbuffer, 0); return pxbuffer; } So can you help me out regarding how to add the audio files, how to make buffers for them, and the adaptor and input settings, etc.? If this approach might cause a problem, guide me about how to use an AVMutableComposition..
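One way to answer the "input settings" part: add an audio AVAssetWriterInput alongside the video one. A sketch; the AAC settings shown are my assumptions, not values from the question:

#import <AVFoundation/AVFoundation.h>

AudioChannelLayout layout = {0};
layout.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
NSDictionary *audioSettings = @{
    AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
    AVSampleRateKey       : @44100.0,
    AVNumberOfChannelsKey : @1,
    AVEncoderBitRateKey   : @64000,
    AVChannelLayoutKey    : [NSData dataWithBytes:&layout length:sizeof(layout)],
};
AVAssetWriterInput *audioInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:audioSettings];
if ([videoWriter canAddInput:audioInput]) [videoWriter addInput:audioInput];
// Then append audio CMSampleBuffers with [audioInput appendSampleBuffer:buf].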

Write Audio To Disk From IO Unit

http://stackoverflow.com/questions/6930609/write-audio-to-disk-from-io-unit

write an audio file to disk from a remote IO unit. The steps I took were to: open an mp3 file and extract its audio to buffers; set up an ASBD to use with my graph, based on the properties of the graph; set up and run my graph, looping the extracted..
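A sketch of the first step, assuming ExtAudioFile is used for the extraction (the question doesn't name the API, but it is the usual route); the client format below is an assumed 16-bit stereo LPCM ASBD:

#import <AudioToolbox/AudioToolbox.h>
#include <stdlib.h>

// Open an mp3 and read its audio into a PCM buffer.
static AudioBufferList *readAudio(CFURLRef url, UInt32 bufferBytes, UInt32 *ioFrames) {
    ExtAudioFileRef file = NULL;
    ExtAudioFileOpenURL(url, &file);

    AudioStreamBasicDescription clientASBD = {0};
    clientASBD.mSampleRate       = 44100;
    clientASBD.mFormatID         = kAudioFormatLinearPCM;
    clientASBD.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    clientASBD.mChannelsPerFrame = 2;
    clientASBD.mBitsPerChannel   = 16;
    clientASBD.mBytesPerFrame    = 4;
    clientASBD.mFramesPerPacket  = 1;
    clientASBD.mBytesPerPacket   = 4;
    // The file's own format is converted to this client format on read.
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                            sizeof(clientASBD), &clientASBD);

    AudioBufferList *list = malloc(sizeof(AudioBufferList));
    list->mNumberBuffers = 1;
    list->mBuffers[0].mNumberChannels = clientASBD.mChannelsPerFrame;
    list->mBuffers[0].mDataByteSize   = bufferBytes;
    list->mBuffers[0].mData           = malloc(bufferBytes);

    *ioFrames = bufferBytes / clientASBD.mBytesPerFrame;
    ExtAudioFileRead(file, ioFrames, list);  // *ioFrames now holds frames actually read
    ExtAudioFileDispose(file);
    return list;
}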

How to record sound produced by mixer unit output (iOS Core Audio & Audio Graph)

http://stackoverflow.com/questions/7118429/how-to-record-sound-produced-by-mixer-unit-output-ios-core-audio-audio-graph

setupErr = ExtAudioFileWriteAsync(effectState.audioFileRef, 0, NULL);
NSAssert(setupErr == noErr, @"Couldn't initialize write buffers for audio file");
And the recording callback:
static OSStatus recordingCallback(void *inRefCon, AudioUnitRenderActionFlags.. connected even though you've added a callback. Once your callback is running, if you find that there's no data in the buffers (ioData), wrap this code around your callback code:
if (*ioActionFlags & kAudioUnitRenderAction_PostRender) { /* your code */ }
This is needed..
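Putting the fragments together, a sketch of the guarded callback the answer describes; EffectState comes from the excerpt, and audioFileRef is assumed to have been primed with the ExtAudioFileWriteAsync(…, 0, NULL) call shown above:

static OSStatus recordingCallback(void                       *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp       *inTimeStamp,
                                  UInt32                      inBusNumber,
                                  UInt32                      inNumberFrames,
                                  AudioBufferList            *ioData) {
    // Only the post-render pass has the mixer's output in ioData.
    if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
        EffectState *effectState = (EffectState *)inRefCon;
        // Async write: safe on the render thread, file I/O happens elsewhere.
        ExtAudioFileWriteAsync(effectState->audioFileRef, inNumberFrames, ioData);
    }
    return noErr;
}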

Learning OpenGL ES 1.x

http://stackoverflow.com/questions/72288/learning-opengl-es-1-x

deprecated in OpenGL 3.0. GL ES 1.x is for pretty simple devices. What you have is a way to draw geometry (vertex buffers), manage textures, and set up some fixed-function state (lighting, texture combiners). That's pretty much all there is to it...
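A tiny illustration of that fixed-function flow, my own example rather than code from the answer: vertices go in from an array, and no shaders are involved.

#include <OpenGLES/ES1/gl.h>

// One triangle from a client-side vertex array.
static const GLfloat verts[] = { -1.f, -1.f,   1.f, -1.f,   0.f, 1.f };

void drawTriangle(void) {
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, verts);  // 2D positions, tightly packed
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDisableClientState(GL_VERTEX_ARRAY);
}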

Learning OpenGLES 2.0 on iOS

http://stackoverflow.com/questions/8482327/learning-opengles-2-0-on-ios

to start using GLKit (available only on iOS 5.0 and later), which simplifies some of the normal setup chores around your render buffers and simple shader-based effects. Apple's WWDC 2011 videos have some good material on this, but their 2009 and 2010 videos..
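A sketch of the shortcut being described, for a view controller's setup code; the frame is a placeholder:

#import <GLKit/GLKit.h>

// GLKView creates and manages the color/depth render buffers for you.
EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
GLKView *view = [[GLKView alloc] initWithFrame:CGRectMake(0, 0, 320, 480)
                                       context:context];
view.drawableDepthFormat = GLKViewDrawableDepthFormat24;  // depth buffer, no GL boilerplate
GLKBaseEffect *effect = [[GLKBaseEffect alloc] init];     // stands in for simple shaders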

Can anybody help me in recording iPhone output sound through Audio Unit

http://stackoverflow.com/questions/8615358/can-anybody-help-me-in-recording-iphone-output-sound-through-audio-unit

setupErr = ExtAudioFileWriteAsync(effectState.audioFileRef, 0, NULL);
NSAssert(setupErr == noErr, @"Couldn't initialize write buffers for audio file");
The recording callback:
static OSStatus recordingCallback(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags..
if ((*ioActionFlags & kAudioUnitRenderAction_PostRender) && inBusNumber == 0) {
    AudioBufferList bufferList;  // Fill this up with buffers (you will want to malloc it, as it's a dynamic-length list)
    EffectState *effectState = (EffectState *)inRefCon;
    AudioUnit rioUnit.. bufferList);
    if (noErr != status) { NSLog(@"AudioUnitRender error"); return noErr; }
    // Now we have the samples we just read sitting in buffers in bufferList
    ExtAudioFileWriteAsync(effectState->audioFileRef, inNumberFrames, &bufferList);
    return noErr;
Then stop recording..
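A sketch of the malloc'd dynamic-length AudioBufferList the excerpt mentions, as it would sit inside the recording callback: pull the rendered samples out of the RemoteIO unit, then hand them to the async writer. Channel count and variable names are assumptions:

#include <stdlib.h>

UInt32 channels = 2;  // assumed
// The struct ends in a variable-length array of AudioBuffers, hence the malloc.
AudioBufferList *bufferList =
    malloc(sizeof(AudioBufferList) + (channels - 1) * sizeof(AudioBuffer));
bufferList->mNumberBuffers = channels;
for (UInt32 i = 0; i < channels; i++) {
    bufferList->mBuffers[i].mNumberChannels = 1;
    bufferList->mBuffers[i].mDataByteSize   = inNumberFrames * sizeof(SInt16);
    bufferList->mBuffers[i].mData           = malloc(inNumberFrames * sizeof(SInt16));
}
// Fetch this render pass's samples from the unit...
OSStatus status = AudioUnitRender(rioUnit, ioActionFlags, inTimeStamp,
                                  inBusNumber, inNumberFrames, bufferList);
if (noErr != status) { NSLog(@"AudioUnitRender error"); return noErr; }
// ...then queue them for writing, as in the excerpt.
ExtAudioFileWriteAsync(effectState->audioFileRef, inNumberFrames, bufferList);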