
2014/10/15 10:10:11 PM

iPhone Programming Glossary: imageFromSampleBuffer

Convert UIImage to CMSampleBufferRef

http://stackoverflow.com/questions/16475737/convert-uiimage-to-cmsamplebufferref

I am getting the CMSampleBufferRef and converting it to a UIImage using the code below: CGImageRef _cgImage = [self imageFromSampleBuffer:sampleBuffer]; UIImage *_uiImage = [UIImage imageWithCGImage:_cgImage]; CGImageRelease(_cgImage); _uiImage = [_uiImage resizedImageWithSize:..]; croppedBuffer = /* NEED HELP WITH THIS */; [_videoInput appendSampleBuffer:sampleBuffer]; where _videoInput is an AVAssetWriterInput. The imageFromSampleBuffer: method looks like this: CGImageRef imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer { // Create a CGImageRef from sample buffer data CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer..
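The excerpts on this page all reference the same imageFromSampleBuffer: helper from Apple's AVFoundation sample code. A minimal sketch of it, assuming the video data output is configured to deliver kCVPixelFormatType_32BGRA pixel buffers (method and variable names follow the excerpts above; this is an illustration, not a verbatim copy of any one answer):

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

- (CGImageRef)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get the sample buffer's Core Video image buffer.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address so the pixel data can be read from the CPU.
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the BGRA data in a bitmap context and snapshot it as a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace,
        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // The caller owns the result and must call CGImageRelease on it.
    return cgImage;
}
```

Note that because the method returns a Core Foundation object, the caller (as in the question's snippet) is responsible for releasing the CGImageRef after wrapping it in a UIImage.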

UIImage created from CMSampleBufferRef not displayed in UIImageView?

http://stackoverflow.com/questions/3305862/uiimage-created-from-cmsamplebufferref-not-displayed-in-uiimageview

..fromConnection:(AVCaptureConnection *)connection { // Create a UIImage from the sample buffer data UIImage *theImage = [self imageFromSampleBuffer:sampleBuffer]; NSLog(@"Got an image! %f x %f", theImage.size.width, theImage.size.height); NSLog(@"The image view is %@", imageView); .. correctly, and the second call to NSLog reports a non-nil object. At least to a basic extent, the image I get from imageFromSampleBuffer: is fine, since NSLog reports the size to be 360x480, which is the size I expected. The code I'm using is the recently posted.. ..and friends, of which I understand very little, and creates the UIImage object from the Core Video buffers (that's the imageFromSampleBuffer: method). Finally, I can get the application to crash if I try to send drawInRect: to a plain UIView subclass with the UIImage..

Applying Effect to iPhone Camera Preview “Video”

http://stackoverflow.com/questions/4893620/applying-effect-to-iphone-camera-preview-video

..with the performance of processing the preview image (a frame of the preview video). First I get the UIImage result of imageFromSampleBuffer: on the sample buffer from captureOutput:didOutputSampleBuffer:fromConnection:. Then I scale and rotate it for the screen.. ..simply create an autorelease pool in captureOutput:didOutputSampleBuffer:fromConnection:. This makes sense, since imageFromSampleBuffer: returns an autoreleased UIImage object. Plus, it frees up any autoreleased objects created by image-processing code right.. NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init]; // Create a UIImage from the sample buffer data UIImage *image = [self imageFromSampleBuffer:sampleBuffer]; // Add your code here that uses the image [pool release]; My testing has shown that this will run without memory..
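The excerpt's NSAutoreleasePool code is from the pre-ARC era; under ARC the same per-frame drain is written with an @autoreleasepool block. A sketch of the delegate method (the "use the image" line is a placeholder, and the selector names are taken from the excerpt):

```objc
#import <AVFoundation/AVFoundation.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Drain autoreleased objects (the UIImage and anything created by
    // per-frame image processing) at the end of every frame, instead of
    // letting them pile up until the thread's outer pool drains.
    @autoreleasepool {
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
        // ... use the image here ...
    }
}
```

The design point is the same as in the answer quoted above: the capture callback fires many times per second on its own queue, so without a per-frame pool the autoreleased images accumulate and memory climbs until the app is killed.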

iPhone Watermark on recorded Video.

http://stackoverflow.com/questions/7205820/iphone-watermark-on-recorded-video

AVCaptureDataOutput will return images as CMSampleBufferRefs. Convert them to CGImageRefs using this code: CGImageRef imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer { // Create a CGImageRef from sample buffer data CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer..

kCVPixelFormatType_420YpCbCr8BiPlanarFullRange frame to UIImage conversion

http://stackoverflow.com/questions/8838481/kcvpixelformattype-420ypcbcr8biplanarfullrange-frame-to-uiimage-conversion

..format. Can you give some hints to tune this method provided by Apple? // Create a UIImage from sample buffer data - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer { // Get a CMSampleBuffer's Core Video image buffer for the media data CVImageBufferRef imageBuffer..
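Apple's bitmap-context method assumes a BGRA buffer, which is why it breaks on kCVPixelFormatType_420YpCbCr8BiPlanarFullRange frames. One common alternative (not from the excerpt above; assumes iOS 5 or later) is to let Core Image handle the YUV-to-RGB conversion:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Core Image understands the biplanar YpCbCr pixel formats directly,
    // so no hand-written YUV->RGB math is needed.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // Creating a CIContext is expensive; in real code, cache one and reuse it
    // across frames instead of allocating it here.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage
                                       fromRect:[ciImage extent]];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}
```

The other common fix is simpler still: configure the AVCaptureVideoDataOutput to deliver kCVPixelFormatType_32BGRA in the first place, so Apple's original method works unchanged.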

ios capturing image using AVFramework

http://stackoverflow.com/questions/8924299/ios-capturing-image-using-avframework

..captureOutput:didOutputSampleBuffer:fromConnection: // Create a UIImage from the sample buffer data UIImage *image = [self imageFromSampleBuffer:sampleBuffer]; // Add your code here that uses the image [self.imageView setImage:image]; [self.view setNeedsDisplay]; .. // Create a UIImage from sample buffer data - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer { NSLog(@"imageFromSampleBuffer: called"); // Get a CMSampleBuffer's Core Video image buffer for the media data CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer..

AVCaptureSession specify resolution and quality of captured images obj-c iphone app

http://stackoverflow.com/questions/9312832/avcapturesession-specify-resolution-and-quality-of-captured-images-obj-c-iphone

captureOutput:didOutputSampleBuffer:fromConnection: // Create a UIImage from the sample buffer data self.currentImage = [self imageFromSampleBuffer:sampleBuffer]; // Add your code here that uses the image .. // Create a UIImage from sample buffer data - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer { NSLog(@"imageFromSampleBuffer: called"); // Get a CMSampleBuffer's Core Video image buffer for the media data CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer..
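Resolution on an AVCaptureSession is chosen through its sessionPreset, not per frame in the delegate. A short sketch (the 640x480 preset is an example choice, not taken from the excerpt):

```objc
#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];
// Pick the capture resolution via a session preset, checking support first;
// not every device supports every preset.
if ([session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    session.sessionPreset = AVCaptureSessionPreset640x480;
}
```

The sample buffers delivered to captureOutput:didOutputSampleBuffer:fromConnection: then arrive at the preset's resolution, which is what imageFromSampleBuffer: ends up reporting as the UIImage size.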