2014/10/15 10:14:04 PM

iPhone Programming Glossary: shader

Is there a tutorial on loading a 3D model in OpenGL ES on the iPhone?

http://stackoverflow.com/questions/3914632/is-there-an-tutorial-on-loading-an-3d-model-in-opengl-es-on-the-iphone

the camera viewport. How hard is that to do? Freakin' awesome math skills needed? Does OpenGL ES support some kind of shader tree model, with gradients and effects that can be applied to the model or material? Would be so happy if someone can point..

How can you track motion using the iPhone's camera?

http://stackoverflow.com/questions/3933716/how-can-you-track-motion-using-the-iphones-camera

a talk at SecondConf where I demonstrated the use of the iPhone's camera to track a colored object using OpenGL ES 2.0 shaders. The post accompanying that talk, including my slides and sample code for all demos, can be found here. The sample application.. 2007. That example is described in Chapter 27 of the GPU Gems 3 book. The basic idea is that you can use custom GLSL shaders to process images from the iPhone camera in real time, determining which pixels match a target color within a given threshold... moves across the view of the camera. While this doesn't address the case of tracking a more complex object like a foot, shaders like this should be able to be written that could pick out such a moving object. As an update to the above, in the two years..
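
The basic technique is straightforward to express as a fragment shader. The following is a minimal sketch of that kind of color-thresholding filter, not the talk's actual sample code; the uniform names (videoFrame, inputColor, threshold) and the distance test are illustrative assumptions:

// Emit white for pixels within 'threshold' of the target color, black otherwise.
varying highp vec2 textureCoordinate;
uniform sampler2D videoFrame;     // current camera frame (assumed name)
uniform highp vec3 inputColor;    // target color to track (assumed name)
uniform highp float threshold;    // allowed color-space distance (assumed name)

void main()
{
    highp vec4 pixelColor = texture2D(videoFrame, textureCoordinate);
    highp float colorDistance = distance(pixelColor.rgb, inputColor);
    gl_FragColor = (colorDistance < threshold) ? vec4(1.0) : vec4(0.0, 0.0, 0.0, 1.0);
}

Averaging the positions of the matching (white) pixels then gives the tracked object's centroid, which can be followed from frame to frame as it moves across the camera's view.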

How do I convert the live video feed from the iPhone camera to grayscale?

http://stackoverflow.com/questions/4381513/how-do-i-convert-the-live-video-feed-from-the-iphone-camera-to-grayscale

within it and running your live video frames through that. For OpenGL ES 2.0, you might want to use a programmable shader to achieve this effect. I show how to process live iPhone camera data through various filters in this sample application using shaders, with a writeup on how that works here. In my benchmarks, the iPhone 4 can do this processing at 60 FPS with programmable shaders, but you only get about 4 FPS if you rely on CPU-bound code to do this. Since I wrote the above, I've now created an open..
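
The conversion itself is a one-line weighted sum per pixel. A minimal sketch of such a grayscale fragment shader, assuming a 'videoFrame' sampler fed with the camera texture (the weights are the standard Rec. 601 luminance coefficients):

varying highp vec2 textureCoordinate;
uniform sampler2D videoFrame;   // live camera frame (assumed name)

void main()
{
    lowp vec4 color = texture2D(videoFrame, textureCoordinate);
    // Weighted sum of R, G, B approximating perceived brightness.
    lowp float luminance = dot(color.rgb, vec3(0.299, 0.587, 0.114));
    gl_FragColor = vec4(vec3(luminance), color.a);
}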

Gaussian filter with OpenGL Shaders

http://stackoverflow.com/questions/4804732/gaussian-filter-with-opengl-shaders

be done with one single image buffer, or input pixels will change as the filter is performed. How can I do this with shaders? Also, should I handle the borders myself, or is there a built-in function or something that checks invalid pixel access, like..
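
A common way to answer both questions: you cannot safely read and write the same image buffer in one pass, so render ping-pong style between two framebuffer-attached textures, doing one horizontal and one vertical pass of the separable kernel; for the borders, setting the texture wrap mode to GL_CLAMP_TO_EDGE clamps out-of-range reads instead of requiring checks in the shader. A sketch of the horizontal pass (uniform names assumed):

// Horizontal pass of a separable 5-tap Gaussian; the vertical pass
// uses the same weights with offsets along y instead of x.
varying highp vec2 textureCoordinate;
uniform sampler2D inputTexture;    // source image (assumed name)
uniform highp float texelWidth;    // 1.0 / texture width (assumed name)

void main()
{
    highp vec4 sum = vec4(0.0);
    // Normalized Gaussian weights (sum to 1.0).
    sum += texture2D(inputTexture, textureCoordinate + vec2(-2.0 * texelWidth, 0.0)) * 0.0545;
    sum += texture2D(inputTexture, textureCoordinate + vec2(-texelWidth, 0.0)) * 0.2442;
    sum += texture2D(inputTexture, textureCoordinate) * 0.4026;
    sum += texture2D(inputTexture, textureCoordinate + vec2(texelWidth, 0.0)) * 0.2442;
    sum += texture2D(inputTexture, textureCoordinate + vec2(2.0 * texelWidth, 0.0)) * 0.0545;
    gl_FragColor = sum;
}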

How can I improve the performance of my custom OpenGL ES 2.0 depth texture generation?

http://stackoverflow.com/questions/6051237/how-can-i-improve-the-performance-of-my-custom-opengl-es-2-0-depth-texture-gener

my custom OpenGL ES 2.0 depth texture generation? I have an open source iOS application that uses custom OpenGL ES 2.0 shaders to display 3-D representations of molecular structures. It does this by using procedurally generated sphere and cylinder.. approach is that the depth values for each fragment of these impostor objects need to be calculated in a fragment shader, to be used when objects overlap. Unfortunately, OpenGL ES 2.0 does not let you write to gl_FragDepth, so I've needed to output.. renderer like the PowerVR series in iOS devices, but I can't think of a better way to do this. My depth fragment shader for spheres, the most common display element, looks to be at the heart of this bottleneck (Renderer Utilization in Instruments..
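
The workaround being described is to compute the impostor's per-fragment depth and pack it into gl_FragColor in a dedicated depth pass, since gl_FragDepth cannot be written in ES 2.0. A simplified sketch of that idea (the varying names and the single-channel packing are assumptions; a real implementation would likely split the depth across color channels for more precision):

// Depth pass for a sphere impostor rendered as a billboard quad.
varying highp vec2 impostorSpaceCoordinate;   // quad mapped to [-1, 1] (assumed name)
varying highp float normalizedDepth;          // depth of sphere center, 0..1 (assumed name)
varying highp float depthRadius;              // sphere radius in depth units (assumed name)

void main()
{
    highp float distanceFromCenter = length(impostorSpaceCoordinate);
    if (distanceFromCenter > 1.0)
    {
        discard;   // fragment lies outside the sphere's silhouette
    }
    // Raise the flat quad into a hemisphere to get a true per-fragment depth.
    highp float depthOffset = depthRadius * sqrt(1.0 - distanceFromCenter * distanceFromCenter);
    // Pack the resulting depth into the color output.
    gl_FragColor = vec4(vec3(normalizedDepth - depthOffset), 1.0);
}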

Are the Core Image filters in iOS 5.0 fast enough for realtime video processing?

http://stackoverflow.com/questions/6625888/are-the-core-image-filters-in-ios-5-0-fast-enough-for-realtime-video-processing

on iOS and Mac OS X. Core Image mostly uses the GPU for image processing, so you could look at how fast OpenGL ES 2.0 shaders handle image processing on existing devices. I did some work in this area recently and found that the iPhone 4 could do 60 FPS processing using a simple shader on realtime video being fed in at 480 x 320. You could download my sample application there and attempt to customize the shader and/or video input size to determine if your particular device could handle this processing at a decent framerate. Core..

Learning OpenGLES 2.0 on iOS

http://stackoverflow.com/questions/8482327/learning-opengles-2-0-on-ios

still one of the primary resources you can refer to. While written for desktop OpenGL, most of the shading language and shaders presented there translate directly across to OpenGL ES 2.0, with only a little modification required. The books ShaderX6.. GLKit (available only on iOS 5.0), which simplifies some of the normal setup chores around your render buffers and simple shader-based effects. Apple's WWDC 2011 videos have some good material on this, but their 2009 and 2010 videos, if you can find them..

Overlay Color Blend in OpenGL ES / iOS / Cocos2d

http://stackoverflow.com/questions/8771413/overlay-color-blend-in-opengl-es-ios-cocos2d

GL_ONE_MINUS_SRC_ALPHA. The effects are not created by setting the blend function or mode, but by texture environment or shader. The Overlay (actually multiply) effect corresponds to the GL_MODULATE texture environment mode, or, in terms of a shader, gl_FragColor = texture2D(...) * color. Lighten is min(texture2D(...), color). Multiply (actually overlay) is gl_FragColor = (1. - (1. - texture2D(...)) * (1. - color))..
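
Collected into one fragment shader, those formulas might look like the following sketch (the uniform names and the integer mode switch are illustrative assumptions, not the answer's code):

varying highp vec2 textureCoordinate;
uniform sampler2D inputTexture;   // source texture (assumed name)
uniform lowp vec4 color;          // blend color (assumed name)
uniform int blendMode;            // 0 = multiply, 1 = lighten, 2 = screen (assumed)

void main()
{
    lowp vec4 texel = texture2D(inputTexture, textureCoordinate);
    if (blendMode == 0) {
        gl_FragColor = texel * color;        // GL_MODULATE equivalent
    } else if (blendMode == 1) {
        gl_FragColor = min(texel, color);    // lighten
    } else {
        // Screen: inverse multiply of the inverses.
        gl_FragColor = vec4(1.0) - (vec4(1.0) - texel) * (vec4(1.0) - color);
    }
}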

Fragment Shader - Average Luminosity

http://stackoverflow.com/questions/12168072/fragment-shader-average-luminosity

Shader Average Luminosity: Does anybody know how to find the average luminosity for a texture in a fragment shader? I have access to..
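
One common approach (a sketch of the general technique, not from the question itself): first render the texture's luminance, then repeatedly render into a half-size framebuffer with a 2x2 box-filter reduction shader until the result is 1x1; that final pixel holds the average. The reduction pass might look like this, with assumed uniform names:

// Averages a 2x2 block of the input per output pixel. An initial pass is
// assumed to have stored luminance, e.g. dot(color.rgb, vec3(0.299, 0.587, 0.114)).
varying highp vec2 textureCoordinate;
uniform sampler2D inputTexture;   // previous reduction level (assumed name)
uniform highp vec2 texelSize;     // 1.0 / input texture dimensions (assumed name)

void main()
{
    highp float sum =
        texture2D(inputTexture, textureCoordinate + texelSize * vec2(-0.5, -0.5)).r +
        texture2D(inputTexture, textureCoordinate + texelSize * vec2( 0.5, -0.5)).r +
        texture2D(inputTexture, textureCoordinate + texelSize * vec2(-0.5,  0.5)).r +
        texture2D(inputTexture, textureCoordinate + texelSize * vec2( 0.5,  0.5)).r;
    gl_FragColor = vec4(vec3(sum * 0.25), 1.0);
}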

How do I blend two textures with different co-ordinates in OpenGL ES 2.0 on iPhone?

http://stackoverflow.com/questions/12242443/how-do-i-blend-two-textures-with-different-co-ordinates-in-opengl-es-2-0-on-ipho

shader. Add to Objective-C: glVertexAttribPointer(inputTextureCoordinate2, 2, GL_FLOAT, 0, 0, textureCoordinates2); Vertex shader:
attribute vec4 position;
attribute vec4 inputTextureCoordinate;
attribute vec4 inputTextureCoordinate2;
varying vec2 textureCoordinate;
..
gl_Position = position;
textureCoordinate = inputTextureCoordinate.xy;
textureCoordinate2 = inputTextureCoordinate2.xy;
Fragment shader:
varying highp vec2 textureCoordinate;
varying highp vec2 textureCoordinate2;
uniform sampler2D inputTextureTop;
uniform sampler2D ..
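
The fragment shader is cut off above; a completed sketch might look like the following, where the second sampler's name (inputTextureBottom) and the alpha-based composite are assumptions filled in for illustration:

varying highp vec2 textureCoordinate;
varying highp vec2 textureCoordinate2;
uniform sampler2D inputTextureTop;
uniform sampler2D inputTextureBottom;   // assumed name for the truncated sampler

void main()
{
    lowp vec4 top = texture2D(inputTextureTop, textureCoordinate);
    lowp vec4 bottom = texture2D(inputTextureBottom, textureCoordinate2);
    // Composite the top texture over the bottom using the top's alpha.
    gl_FragColor = mix(bottom, top, top.a);
}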

PowerVR SGX535 Shader Performance (OpenGL ES 2.0)

http://stackoverflow.com/questions/3320661/powervr-sgx535-shader-performance-opengl-es-2-0

SGX535 Shader Performance (OpenGL ES 2.0): I'm currently working on a couple of shaders for an iPad game, and it seems as if Apple's GLSL..

GLSL: Built-in attributes not accessible for iPhone Apps?

http://stackoverflow.com/questions/8205501/glsl-built-in-attributes-not-accessible-for-iphone-apps

really desperate here. I'm working with Xcode, trying to implement some OpenGL stuff on the iPhone. I have to write a shader for Phong lighting. I got as far as declaring my geometry (vertices, indices), calculating, etc., and passing the respective arguments..
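
The catch on iOS is that OpenGL ES 2.0 has none of the desktop built-ins (gl_Vertex, gl_Normal, gl_ModelViewMatrix, and so on); you must declare your own attributes and uniforms and bind them from your app. A minimal vertex-shader sketch for the per-vertex diffuse term of Phong lighting, with assumed names:

// ES 2.0: user-declared replacements for the desktop built-in attributes.
attribute vec4 position;                  // replaces gl_Vertex
attribute vec3 normal;                    // replaces gl_Normal
uniform mat4 modelViewProjectionMatrix;   // replaces gl_ModelViewProjectionMatrix
uniform mat3 normalMatrix;                // replaces gl_NormalMatrix
uniform vec3 lightDirection;              // normalized, eye space (assumed name)
varying lowp float diffuseIntensity;

void main()
{
    gl_Position = modelViewProjectionMatrix * position;
    // Diffuse term only; ambient and specular are omitted for brevity.
    vec3 eyeNormal = normalize(normalMatrix * normal);
    diffuseIntensity = max(dot(eyeNormal, lightDirection), 0.0);
}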

Draw a straight line using OpenGL ES in iPhone?

http://stackoverflow.com/questions/9736887/draw-a-straight-line-using-opengl-es-in-iphone

provides. Among other things, the Apple template code will include creation of a GLKBaseEffect, which provides some shader functionality that seems to be required in order to be able to draw with OpenGL ES 2.0. Without the GLKBaseEffect, you would need to use GLSL. The template provides an example both with and without explicit GLSL shader code. The template creates a setupGL function, which I modified to look like this: (void)setupGL { [EAGLContext setCurrentContext:..
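
For the "without GLKBaseEffect" route, only a trivial GLSL program is needed to draw a flat-colored line. A sketch of such a shader pair (compiled as two separate sources; the names are assumptions):

// Vertex shader: transform each line vertex by an assumed MVP uniform.
attribute vec4 position;
uniform mat4 modelViewProjectionMatrix;

void main()
{
    gl_Position = modelViewProjectionMatrix * position;
}

// Fragment shader: paint every fragment of the line one uniform color.
uniform lowp vec4 lineColor;

void main()
{
    gl_FragColor = lineColor;
}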