

2014/10/15 10:13:13 PM

iPhone Programming Glossary: reduction

Optimized Image Loading in a UIScrollView

http://stackoverflow.com/questions/1098234/optimized-image-loading-in-a-uiscrollview

Go even further and make sure you've converted them to PNG16 without an alpha channel. This will give at least a 4:1 reduction in size with hopefully no detectable change in the visual image. Also consider using PVRTC format images, which will take the size down even further (an 8:1 reduction). This will greatly reduce the time it takes to read the images from disk. I apologize if any of this doesn't make sense...
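Alongside the format advice above, a common companion technique (not from the answer itself) is to keep the disk read and decode off the main thread so scrolling stays smooth. A minimal sketch, assuming a hypothetical path and image view:

#import <UIKit/UIKit.h>

// Sketch (not from the answer): read and decode a page's image on a background
// queue so disk I/O never blocks UIScrollView scrolling.
// `path` and `imageView` are assumed to be provided by the caller.
- (void)loadPageImageAtPath:(NSString *)path into:(UIImageView *)imageView {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        UIImage *image = [UIImage imageWithContentsOfFile:path];
        dispatch_async(dispatch_get_main_queue(), ^{
            imageView.image = image;   // UIKit work stays on the main thread
        });
    });
}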

Fragment Shader - Average Luminosity

http://stackoverflow.com/questions/12168072/fragment-shader-average-luminosity

of two textures and you can't generate mipmaps for NPOT textures in OpenGL ES 2.0 on iOS. Instead I did a multistage reduction similar to mipmap generation, but with some slight tweaks. Each step down reduced the size of the image by a factor of four.. interpolation to average the four sets of four pixels, then I just had to average those four pixels to yield a 16X reduction in pixels in a single step. I converted the image to luminance at the very first stage using a dot product of the RGB values.. the RGB values with a vec3 of (0.2125, 0.7154, 0.0721). This allowed me to just read the red channel for each subsequent reduction stage, which really helps on iOS hardware. Note that you don't need this if you are starting with a Y-channel luminance texture..
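A sketch of what the first-stage fragment shader might look like, using the luminance weights quoted above; the varying and uniform names are assumptions, not taken from the answer. The GLSL source is kept as an Objective-C string, as is common in iOS GL code.

// First reduction stage: convert RGB to luminance with the Rec. 709 weights
// so later passes only need to sample the red channel.
// The varying/uniform names are assumptions.
static NSString *const kLuminanceFragmentShader = @""
    "varying highp vec2 textureCoordinate;\n"
    "uniform sampler2D inputImageTexture;\n"
    "void main()\n"
    "{\n"
    "    lowp vec3 rgb = texture2D(inputImageTexture, textureCoordinate).rgb;\n"
    "    lowp float luminance = dot(rgb, vec3(0.2125, 0.7154, 0.0721));\n"
    "    gl_FragColor = vec4(vec3(luminance), 1.0);\n"
    "}";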

Communication between view controllers

http://stackoverflow.com/questions/1880033/communication-between-view-controllers

be a significant memory impact for doing things this way (i.e. no lazy loading). I assume the advantage of the array is a reduction of loading times for the second-level controllers. Thank you for any responses, as I am trying to develop things properly..
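A minimal sketch of the lazy-loading alternative being weighed above: build a second-level controller the first time it is asked for instead of holding them all in an array up front. The class name, ivar, and nib name are hypothetical.

// Lazy accessor: the controller is only created on first use, trading a small
// one-time load delay for a lower steady-state memory footprint.
- (DetailViewController *)detailViewController {
    if (_detailViewController == nil) {
        _detailViewController = [[DetailViewController alloc]
                                 initWithNibName:@"DetailViewController" bundle:nil];
    }
    return _detailViewController;
}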

Pixel-Position of Cursor in UITextView

http://stackoverflow.com/questions/2633379/pixel-position-of-cursor-in-uitextview

beginning of the first word before startOfLine. 2. Check if drawing the substring of head up to startOfLine causes a reduction in height compared to initialSize. 3. If so, then you've identified the start of the line containing the cursor; otherwise..
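A rough sketch of steps 2-3 using the iOS 6-era sizeWithFont: APIs; the text view, cursor index, 8pt content inset, and word-boundary handling are all assumptions rather than the answer's actual code.

// Rough sketch of the height-comparison check described above.
// textView and cursorIndex are assumed; constraint width approximates the
// UITextView's usable text width.
NSString *head = [textView.text substringToIndex:cursorIndex];
CGSize constraint = CGSizeMake(textView.frame.size.width - 16.0f, CGFLOAT_MAX);
CGSize initialSize = [head sizeWithFont:textView.font
                      constrainedToSize:constraint
                          lineBreakMode:NSLineBreakByWordWrapping];

NSUInteger startOfLine = [head length];
while (startOfLine > 0) {
    // Step back to the previous word boundary.
    NSRange delimiter = [head rangeOfString:@" "
                                    options:NSBackwardsSearch
                                      range:NSMakeRange(0, startOfLine)];
    if (delimiter.location == NSNotFound) { startOfLine = 0; break; }

    // Step 2: does dropping everything after this boundary reduce the height?
    NSString *tempHead = [head substringToIndex:delimiter.location];
    CGSize tempSize = [tempHead sizeWithFont:textView.font
                           constrainedToSize:constraint
                               lineBreakMode:NSLineBreakByWordWrapping];
    if (tempSize.height < initialSize.height) {
        // Step 3: the cursor's line starts just after this delimiter.
        startOfLine = delimiter.location + 1;
        break;
    }
    startOfLine = delimiter.location;
}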

Converting plist to binary plist

http://stackoverflow.com/questions/264440/converting-plist-to-binary-plist

arrays and dictionaries. If you have a lot of identical arrays or dicts in your content, you may see a significant size reduction by uniquing them. You can enable that by hacking up _flattenPlist in CFBinaryPlist.c. If you do that, make sure to test..
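For the basic conversion (without the _flattenPlist uniquing hack mentioned above), a minimal sketch using Foundation's property-list serialization; inputPath and outputPath are assumed.

// Sketch: re-serialize an existing XML plist as a binary plist.
NSError *error = nil;
NSData *xmlData = [NSData dataWithContentsOfFile:inputPath];
id plist = [NSPropertyListSerialization propertyListWithData:xmlData
                                                      options:NSPropertyListImmutable
                                                       format:NULL
                                                        error:&error];
NSData *binaryData = [NSPropertyListSerialization dataWithPropertyList:plist
                                                                 format:NSPropertyListBinaryFormat_v1_0
                                                                options:0
                                                                  error:&error];
[binaryData writeToFile:outputPath atomically:YES];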

How to find a pixel-positon of a cursor in UITextView?

http://stackoverflow.com/questions/3920944/how-to-find-a-pixel-positon-of-a-cursor-in-uitextview

startsOfLine: startOfLine = delimiter.location; 2. Check if drawing the substring of head up to startOfLine causes a reduction in height compared to initialSize. NSString *tempHead = [head substringToIndex:startOfLine]; // Gets the size of this temp head CGSize..
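A different, simpler route than the measuring approach above (offered here as an alternative sketch, not the answer's method): UITextView adopts the UITextInput protocol, so the caret rectangle can be asked for directly. textView is assumed to be the text view in question.

// Ask the text view for its caret rectangle instead of measuring substrings.
UITextPosition *caretPosition = textView.selectedTextRange.start;
CGRect caretRect = [textView caretRectForPosition:caretPosition];
// caretRect is in the text view's own coordinate system, so account for
// contentOffset if you need window coordinates.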

Highlighting a UIControl subclass

http://stackoverflow.com/questions/4428437/highlighting-a-uicontrol-subclass

the correct way to highlight it? I would like to highlight the control when tapped in any way, even a simple momentary reduction of alpha. With beginTrackingWithTouch and endTrackingWithTouch I can't recognize the only UIControlEventTouchUpInside event...
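A minimal sketch of the momentary alpha reduction mentioned above: UIControl flips its highlighted state automatically while a touch is being tracked, so the subclass can override the setter. The 0.5 alpha value is an arbitrary choice.

// Dim the control while it is highlighted (i.e. while a touch is tracked).
- (void)setHighlighted:(BOOL)highlighted {
    [super setHighlighted:highlighted];
    self.alpha = highlighted ? 0.5f : 1.0f;
}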

Alternatives to creating an openGL texture from a captured video frame to overlay an openGL view over video? (iPhone)

http://stackoverflow.com/questions/4473894/alternatives-to-creating-an-opengl-texture-from-a-captured-video-frame-to-overla

How can I optimize the rendering of a large model in OpenGL ES 1.1?

http://stackoverflow.com/questions/5718846/how-can-i-optimize-the-rendering-of-a-large-model-in-opengl-es-1-1

of the geometry being sent to the GPU. Whatever you can do to shrink the geometry size can lead to an almost linear reduction in rendering time, in my experience. These tuning steps have worked for me in the past. If you're not already, you could look.. will be vastly outweighed by the speedup you get from not having to send all that color information. I saw a ~18% reduction in rendering time by binning the colors in one of my larger models. You're already using VBOs, so you've taken advantage..
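One geometry-shrinking step in the spirit of the answer, as a sketch: store per-vertex colors as four unsigned bytes instead of four floats inside an interleaved VBO (OpenGL ES 1.1 fixed-function pipeline). The Vertex struct layout and the already-bound vertexBuffer name are assumptions.

#import <OpenGLES/ES1/gl.h>

// Interleaved vertex with byte colors: 16 bytes per vertex instead of 28.
typedef struct {
    GLfloat position[3];   // 12 bytes
    GLubyte color[4];      // 4 bytes, versus 16 for GLfloat RGBA
} Vertex;

// Point the fixed-function pipeline at the interleaved attributes.
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
glVertexPointer(3, GL_FLOAT, sizeof(Vertex), (const GLvoid *)offsetof(Vertex, position));
glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(Vertex), (const GLvoid *)offsetof(Vertex, color));
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);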

rendering a waveform on an iphone

http://stackoverflow.com/questions/896194/rendering-a-waveform-on-an-iphone

this question, since it applies here as well: When displaying an audio waveform, you will want to do some sort of data reduction on the original data, because there is usually more data available in an audio file than pixels on the screen. Most audio..
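A minimal sketch of that data reduction, assuming 16-bit samples already in memory: collapse the raw samples into one min/max pair per horizontal pixel so each screen column can be drawn as a single vertical line. The function name and buffer handling are assumptions.

#import <Foundation/Foundation.h>

// Bin raw samples into per-pixel min/max pairs for waveform drawing.
// Caller owns minOut/maxOut, each at least pixelWidth entries long.
static void ReduceWaveform(const SInt16 *samples, NSUInteger sampleCount,
                           NSUInteger pixelWidth, SInt16 *minOut, SInt16 *maxOut) {
    NSUInteger samplesPerPixel = MAX(sampleCount / pixelWidth, (NSUInteger)1);
    for (NSUInteger x = 0; x < pixelWidth; x++) {
        SInt16 lo = 32767, hi = -32768;
        NSUInteger start = x * samplesPerPixel;
        NSUInteger end = MIN(start + samplesPerPixel, sampleCount);
        for (NSUInteger i = start; i < end; i++) {
            if (samples[i] < lo) lo = samples[i];
            if (samples[i] > hi) hi = samples[i];
        }
        minOut[x] = lo;
        maxOut[x] = hi;
    }
}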