
2014/10/15 10:14:04 PM

iPhone Programming Glossary: shaders

Capture 60fps in iPhone app

http://stackoverflow.com/questions/10344637/capture-60fps-in-iphone-app

How can you track motion using the iPhone's camera?

http://stackoverflow.com/questions/3933716/how-can-you-track-motion-using-the-iphones-camera

a talk at SecondConf where I demonstrated the use of the iPhone's camera to track a colored object using OpenGL ES 2.0 shaders. The post accompanying that talk, including my slides and sample code for all demos, can be found here. The sample application.. 2007. That example is described in Chapter 27 of the GPU Gems 3 book. The basic idea is that you can use custom GLSL shaders to process images from the iPhone camera in realtime, determining which pixels match a target color within a given threshold... moves across the view of the camera. While this doesn't address the case of tracking a more complex object like a foot, shaders like this should be able to be written that could pick out such a moving object. As an update to the above, in the two years..
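The per-pixel test the answer describes can be sketched on the CPU in a few lines of Python. This is a hypothetical illustration of the idea (function names are mine, not the GLSL from the talk): each pixel's RGB distance from a target color is compared against a threshold, exactly what the fragment shader does for every fragment in parallel.

```python
def matches_target(pixel, target, threshold):
    """Return True if an RGB pixel (components as 0.0-1.0 floats)
    lies within `threshold` Euclidean distance of the target color,
    mirroring the per-pixel test a color-tracking fragment shader
    would perform."""
    dist_sq = sum((p - t) ** 2 for p, t in zip(pixel, target))
    return dist_sq <= threshold ** 2

# A reddish camera pixel against a pure-red target, loose threshold:
print(matches_target((0.9, 0.1, 0.05), (1.0, 0.0, 0.0), 0.2))  # True
```

On the GPU this test runs once per fragment, which is why the shader version stays realtime while an equivalent CPU loop over every pixel does not.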

How do I convert the live video feed from the iPhone camera to grayscale?

http://stackoverflow.com/questions/4381513/how-do-i-convert-the-live-video-feed-from-the-iphone-camera-to-grayscale

this effect. I show how to process live iPhone camera data through various filters in this sample application using shaders, with a writeup on how that works here. In my benchmarks, the iPhone 4 can do this processing at 60 FPS with programmable shaders, but you only get about 4 FPS if you rely on CPU-bound code to do this. Since I wrote the above, I've now created an open..
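The grayscale conversion itself is just a per-pixel weighted sum of the color channels. A minimal CPU-side sketch, assuming the commonly used Rec. 601 luminance weights (the linked shader may use slightly different constants):

```python
def to_grayscale(rgb):
    """Convert an RGB triple (0.0-1.0 floats) to a single luminance
    value using Rec. 601 weights, the same dot product a grayscale
    fragment shader applies to every fragment."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

print(to_grayscale((1.0, 1.0, 1.0)))  # 1.0 (white stays full brightness)
```

The 60 FPS vs. 4 FPS gap quoted above comes from where this sum runs: as a shader it executes for all pixels in parallel on the GPU, rather than serially on the CPU.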

Choose OpenGL ES 1.1 or OpenGL ES 2.0?

http://stackoverflow.com/questions/4784137/choose-opengl-es-1-1-or-opengl-es-2-0

handle the rest for you. OpenGL ES 2.0 is based around a programmable pipeline, where you supply vertex and fragment shaders to handle the specifics of how your content is rendered to the screen. Because you have to write your own code to replace.. on iTunes U, and I created two sample applications here and here. When it comes to cross-platform compatibility, shaders have been available on desktop OpenGL for a little while now, so anything you build using either OpenGL ES 1.1 or 2.0 should..

Gaussian filter with OpenGL Shaders

http://stackoverflow.com/questions/4804732/gaussian-filter-with-opengl-shaders

be done with one single image buffer, or whether input pixels will change as the filter is performed. How can I do this with shaders? Also, should I handle the borders myself, or is there a built-in function or something that checks for invalid pixel access like..
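Both concerns in the question have standard answers: a Gaussian blur is usually done as a separable two-pass filter rendered into a second buffer (so input pixels are never overwritten mid-filter), and borders are typically handled by the texture's clamp-to-edge sampling mode rather than by hand. A CPU sketch of one 1-D pass under those assumptions (illustrative only, not code from the question):

```python
import math

def gaussian_kernel(radius, sigma):
    """Normalized 1-D Gaussian weights for one pass of a
    separable blur."""
    weights = [math.exp(-(i * i) / (2.0 * sigma * sigma))
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def blur_1d(row, kernel):
    """One horizontal blur pass. Results go into a fresh output
    buffer, so the input row is never modified while it is still
    being sampled; out-of-range taps are clamped to the nearest
    edge pixel (GL_CLAMP_TO_EDGE behavior)."""
    radius = len(kernel) // 2
    out = []
    for x in range(len(row)):
        acc = 0.0
        for k, weight in enumerate(kernel):
            tap = min(max(x + k - radius, 0), len(row) - 1)
            acc += row[tap] * weight
        out.append(acc)
    return out

kernel = gaussian_kernel(2, 1.0)
print(blur_1d([0.0, 0.0, 1.0, 0.0, 0.0], kernel))  # impulse spreads out
```

In the GPU version the "fresh output buffer" is a second framebuffer/texture that you ping-pong with the source between the horizontal and vertical passes.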

How to apply “filters” to AVCaptureVideoPreviewLayer

http://stackoverflow.com/questions/5156872/how-to-apply-filters-to-avcapturevideopreviewlayer

another view or layer. I have a sample application here where I grab frames from the camera and apply OpenGL ES 2.0 shaders to process the video in realtime for display. In this application, explained in detail here, I was using color-based filtering.. I highly encourage looking at the use of OpenGL ES 2.0 for this, because you can pull off many more kinds of effects using shaders than you can with the fixed-function OpenGL ES 1.1 pipeline. Edit 2/13/2012: As an update on the above, I've now created an..

How can I improve the performance of my custom OpenGL ES 2.0 depth texture generation?

http://stackoverflow.com/questions/6051237/how-can-i-improve-the-performance-of-my-custom-opengl-es-2-0-depth-texture-gener

my custom OpenGL ES 2.0 depth texture generation? I have an open source iOS application that uses custom OpenGL ES 2.0 shaders to display 3-D representations of molecular structures. It does this by using procedurally generated sphere and cylinder..
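The procedural spheres mentioned here are typically impostors: a screen-aligned quad whose fragment shader computes, per fragment, where a ray would hit the sphere surface and writes that point's depth. A hypothetical CPU sketch of the core math (names and simplifications are mine, not the linked app's shader):

```python
import math

def sphere_surface_z(x, y, radius=1.0):
    """Given impostor-local coordinates (x, y) on a screen-aligned
    quad centered on the sphere, return the z height of the front
    sphere surface at that fragment, or None if the fragment falls
    outside the sphere's circular silhouette (the real shader
    discards such fragments)."""
    d_sq = x * x + y * y
    if d_sq > radius * radius:
        return None
    return math.sqrt(radius * radius - d_sq)

print(sphere_surface_z(0.0, 0.0))  # 1.0 at the sphere's center
print(sphere_surface_z(2.0, 0.0))  # None outside the silhouette
```

The depth texture is then built by writing this per-fragment z instead of the flat quad's depth, which is what makes two impostors intersect correctly.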

iOS based OpenGL ES programming

http://stackoverflow.com/questions/6074688/ios-based-opengl-es-programming

simple water surface effects which do exactly what you want. One implementation uses OpenGL ES 1.1, the other 2.0-style shaders. Pick a way that you want to go; my personal recommendation would be to learn shaders now and try to make a crude functional application while working through the above videos and reading material.

Are the Core Image filters in iOS 5.0 fast enough for realtime video processing?

http://stackoverflow.com/questions/6625888/are-the-core-image-filters-in-ios-5-0-fast-enough-for-realtime-video-processing

on iOS and Mac OS X. Core Image mostly uses the GPU for image processing, so you could look at how fast OpenGL ES 2.0 shaders handle image processing on existing devices. I did some work in this area recently and found that the iPhone 4 could do..

Most efficient way to draw part of an image in iOS

http://stackoverflow.com/questions/8035673/most-efficient-way-to-draw-part-of-an-image-in-ios

GPU acceleration. Anyway, if you need extreme optimization for hundreds of animated sprites with finely tuned pixel shaders, like in a game app, you should use OpenGL directly, because CALayer lacks many options for optimization at lower levels. Anyway..

Learning OpenGLES 2.0 on iOS

http://stackoverflow.com/questions/8482327/learning-opengles-2-0-on-ios

still one of the primary resources you can refer to. While written for desktop OpenGL, most of the shading language and shaders presented there translate directly across to OpenGL ES 2.0, with only a little modification required. The books ShaderX6..

How can you apply distortions to a UIImage using OpenGL ES?

http://stackoverflow.com/questions/9886843/how-can-you-apply-distortions-to-a-uiimage-using-opengl-es

The most performant way of doing this kind of image processing would be to use OpenGL ES 2.0 shaders. Once again, if I might point you to my GPUImage framework, it can do many of the distortion operations you describe. For those that are missing, you can write your own fragment shaders. The effects I have in there are a convex bulge distortion using a GPUImageBulgeDistortionFilter, a concave distortion using.. the GPUImageSwirlFilter, and finally a pinch distortion using the GPUImagePinchDistortionFilter. If you look at the shaders used for each of the filters, you'll find that the math is very similar between them. You should be able to tweak that to..
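The "very similar math" shared by the bulge and pinch filters is a radial remapping of texture coordinates around a center point. A sketch of that remapping, modeled on the general approach rather than GPUImage's exact GLSL (parameter names and the exact falloff are assumptions):

```python
import math

def radial_distort(u, v, center=(0.5, 0.5), radius=0.25, scale=0.5):
    """Remap a texture coordinate radially around `center`.
    Positive `scale` pushes samples outward near the center
    (bulge); negative pulls them inward (pinch). Coordinates at
    or beyond `radius` pass through unchanged."""
    dx, dy = u - center[0], v - center[1]
    dist = math.hypot(dx, dy)
    if dist >= radius or dist == 0.0:
        return u, v
    pct = 1.0 - (dist / radius)       # 1.0 at center, 0.0 at the rim
    factor = 1.0 + scale * pct        # >1 bulges, <1 pinches
    return center[0] + dx * factor, center[1] + dy * factor

print(radial_distort(0.6, 0.5))            # moved outward (bulge)
print(radial_distort(0.6, 0.5, scale=-0.5))  # moved inward (pinch)
```

Flipping the sign of the per-pixel scale is essentially the tweak the answer suggests: the same coordinate remapping yields a bulge, a pinch, or (with an angle offset instead of a radial scale) a swirl.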