
2014/10/15 10:16:07 PM

iPhone Programming Glossary: YUV

iOS: Get pixel-by-pixel data from camera

http://stackoverflow.com/questions/10865100/ios-get-pixel-by-pixel-data-from-camera

…data representing your camera frame. This can be in a few different formats, but the most common are BGRA and planar YUV. I have an example application that uses this here, but I'd recommend that you also take a look at my open source framework…
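
As a rough sketch of the BGRA path (my own illustration, not code from the linked answer; it assumes the AVCaptureVideoDataOutput was configured with kCVPixelFormatType_32BGRA):

    #import <AVFoundation/AVFoundation.h>

    // Capture delegate callback: lock the pixel buffer, then walk its BGRA bytes.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

        uint8_t *base = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        size_t width  = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);

        // Pixel (x, y) starts at base + y * bytesPerRow + x * 4, in B, G, R, A order.
        for (size_t y = 0; y < height; y++) {
            uint8_t *row = base + y * bytesPerRow;
            for (size_t x = 0; x < width; x++) {
                uint8_t b = row[x * 4 + 0];
                uint8_t g = row[x * 4 + 1];
                uint8_t r = row[x * 4 + 2];
                // ... use r, g, b here ...
            }
        }

        CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    }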

Fragment Shader - Average Luminosity

http://stackoverflow.com/questions/12168072/fragment-shader-average-luminosity

Does anybody know how to find the average luminosity of a texture in a fragment shader? I have access to both RGB and YUV textures; the Y component in YUV is an array, and I want to get an average value from that array.
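
One common approach (a sketch of the general technique, not taken from the answers): compute per-pixel luma in the fragment shader, render it into a texture, and let glGenerateMipmap reduce it, so the 1x1 top mip level holds the average. The shader itself, embedded as an Objective-C string, might look like this (Rec. 601 weights; inputTexture and textureCoordinate are assumed names):

    // Hypothetical GLES 2.0 luma shader, stored as an NSString constant.
    static NSString *const kLumaFragmentShader =
        @"varying highp vec2 textureCoordinate;\n"
         "uniform sampler2D inputTexture;\n"
         "void main()\n"
         "{\n"
         "    lowp vec3 rgb = texture2D(inputTexture, textureCoordinate).rgb;\n"
         "    lowp float luma = dot(rgb, vec3(0.299, 0.587, 0.114));\n"
         "    gl_FragColor = vec4(vec3(luma), 1.0);\n"
         "}\n";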

Encoding images to video with ffmpeg

http://stackoverflow.com/questions/3334939/encoding-images-to-video-with-ffmpeg

…from api-example.c. It works, but it gives me weird green colors in the video. I know I need to convert my RGB images to YUV. I found a solution, but it doesn't work: the colors are no longer green, just very strange. Here is the code:

    av_register_all();           /* register all formats and codecs */
    /* ... */
    c->time_base = (AVRational){1, 25};
    c->gop_size = 10;            /* emit one intra frame every ten frames */
    c->max_b_frames = 1;
    c->pix_fmt = PIX_FMT_YUV420P;

    /* open it */
    if (avcodec_open(c, codec) < 0) {
        fprintf(stderr, "could not open codec\n");
        exit(1);
    }
    f = fopen(filename, "wb");
    if (!f) {
        fprintf(stderr, …
    /* ... */

    AVFrame *outpic = avcodec_alloc_frame();
    int nbytes = avpicture_get_size(PIX_FMT_YUV420P, c->width, c->height);
    /* create buffer for the output image */
    uint8_t *outbuffer = (uint8_t *)av_malloc(nbytes);
    /* ... */
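
The usual cure for those green/strange colors is to let libswscale do the RGB-to-YUV420P conversion rather than converting by hand. A hedged sketch (rgb_data and the packed-RGB24 source format are my assumptions; c and outpic are the variables from the excerpt above):

    #include <libswscale/swscale.h>

    /* One-off conversion context: RGB24 in, YUV420P out, same dimensions. */
    struct SwsContext *sws = sws_getContext(c->width, c->height, PIX_FMT_RGB24,
                                            c->width, c->height, PIX_FMT_YUV420P,
                                            SWS_BICUBIC, NULL, NULL, NULL);

    uint8_t *src_planes[1]  = { rgb_data };     /* packed RGB: a single plane */
    int      src_strides[1] = { 3 * c->width }; /* 3 bytes per RGB24 pixel    */

    /* Fills the Y, U and V planes of outpic (set up with avpicture_fill). */
    sws_scale(sws, (const uint8_t * const *)src_planes, src_strides,
              0, c->height, outpic->data, outpic->linesize);
    sws_freeContext(sws);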

How to get the Y component from a CMSampleBuffer resulting from the AVCaptureSession?

http://stackoverflow.com/questions/4085474/how-to-get-the-y-component-from-cmsamplebuffer-resulted-from-the-avcapturesessio

…using AVCaptureSession. I follow the guide provided by Apple (linked here). The raw data from the sample buffer is in YUV format. Am I correct here about the raw video frame format, and how can I directly obtain the data for the Y component out of the raw… …color components, but you do sacrifice a little performance by needing to make the conversion from the camera-native YUV colorspace. Other supported colorspaces are kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange and kCVPixelFormatType_420YpCbCr8BiPlanarFullRange… …or full 0-255 for each component. I believe the default colorspace used by an AVCaptureVideoDataOutput instance is the YUV 4:2:0 planar colorspace (except on the iPhone 3G, where it's YUV 4:2:2 interleaved). This means that there are two planes…
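
For the bi-planar formats mentioned above, plane 0 is the Y (luma) plane. A minimal sketch of pulling it out of the sample buffer (my illustration, not code from the answer):

    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    // Plane 0 holds Y; plane 1 holds interleaved CbCr at half resolution.
    uint8_t *yPlane = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    size_t yBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    size_t yWidth  = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
    size_t yHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);

    // Luma for row y, column x: yPlane[y * yBytesPerRow + x]

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);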

How to grab YUV-formatted video from the camera, display it and process it

http://stackoverflow.com/questions/4205191/how-to-grab-yuv-formatted-video-from-the-camera-display-it-and-process-it

…to grab YUV-formatted video from the camera, display it, and process it. I am writing an iPhone (iOS 4) program that captures live video… …what I had, which was to grab YUV output, display it, and process it, although it is not exactly the answer to the question. To grab YUV output from the camera:

    AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
    [videoOut setAlwaysDiscardsLateVideoFrames:YES];
    …

…does not require much code (you can see the FindMyiCon sample in the WWDC samples pack, for example). To process the YUV Y channel (bi-planar in this case, so it's all in a single chunk), you can also use memcpy instead of looping:

    void processPixelBuffer…
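
Putting the excerpt's pieces together, a sketch of the output configuration plus the single-memcpy copy of the Y channel (yBuffer is a hypothetical, pre-allocated destination):

    // Ask the capture output for bi-planar YUV frames.
    AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
    [videoOut setAlwaysDiscardsLateVideoFrames:YES];
    [videoOut setVideoSettings:
        [NSDictionary dictionaryWithObject:
            [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey]];

    // Later, in the delegate callback: the Y plane is one contiguous chunk.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    size_t ySize = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0) *
                   CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
    memcpy(yBuffer, CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0), ySize);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);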

Determining Image Luminance/Brightness

http://stackoverflow.com/questions/4876315/determining-image-luminance-brightness

…thanks in advance. … Just convert your image to YUV format and calculate the average of the luma channel. Color conversion is a typical operation, and any decent image-processing library…
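
With the C API of that era's OpenCV, the whole operation is only a few lines (a sketch; assumes an 8-bit, 3-channel BGR input):

    #include <opencv/cv.h>

    /* Convert to YCrCb (OpenCV's YUV-style space) and average channel 0 (luma). */
    double averageLuma(IplImage *bgrImage)
    {
        IplImage *ycrcb = cvCreateImage(cvGetSize(bgrImage), IPL_DEPTH_8U, 3);
        cvCvtColor(bgrImage, ycrcb, CV_BGR2YCrCb);
        CvScalar mean = cvAvg(ycrcb, NULL);   /* mean.val[0] is the Y average */
        cvReleaseImage(&ycrcb);
        return mean.val[0];
    }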