Shaders and the Graphics Pipeline - Beginning OpenGL ES and GLKit - raywenderlich.com

Published: 22 May 2017
Channel: Kodeco
13,648 views
242 likes

This video teaches you what shaders are and how they fit into the graphics pipeline.

----
About www.raywenderlich.com:

raywenderlich.com is a website focused on developing high-quality programming tutorials. Our goal is to take the coolest and most challenging topics and make them easy for everyone to learn – so we can all make amazing apps.

We are also focused on developing a strong community. Our goal is to help each other reach our dreams through friendship and cooperation. As you can see below, a bunch of us have joined forces to make this happen: authors, editors, subject matter experts, app reviewers, and most importantly our amazing readers!

----

About Shaders from Wikipedia
https://en.wikipedia.org/wiki/Shader

In the field of computer graphics, a shader is a computer program that is used to do shading: the production of appropriate levels of light, darkness, and color within an image, or, in the modern era, also to produce special effects or do video post-processing.

Shaders calculate rendering effects on graphics hardware with a high degree of flexibility. Most shaders are coded for a graphics processing unit (GPU), though this is not a strict requirement. Shading languages are usually used to program the programmable GPU rendering pipeline, which has mostly superseded the fixed-function pipeline that allowed only common geometry transformation and pixel-shading functions; with shaders, customized effects can be used. The position, hue, saturation, brightness, and contrast of all pixels, vertices, or textures used to construct a final image can be altered on the fly, using algorithms defined in the shader, and can be modified by external variables or textures introduced by the program calling the shader.

Shaders are used widely in cinema postprocessing, computer-generated imagery, and video games to produce a seemingly infinite range of effects. Beyond just simple lighting models, more complex uses include altering the hue, saturation, brightness or contrast of an image, producing blur, light bloom, volumetric lighting, normal mapping for depth effects, bokeh, cel shading, posterization, bump mapping, distortion, chroma keying (so-called "bluescreen/greenscreen" effects), edge detection and motion detection, psychedelic effects, and a wide range of others.
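To make the idea of "algorithms defined in the shader" concrete, here is a minimal sketch of a vertex/fragment shader pair written in the OpenGL ES 2.0 shading language (GLSL ES) and embedded as Swift string literals, the way an iOS app might store them. The attribute and uniform names are illustrative placeholders, not taken from the video.

// A minimal OpenGL ES 2.0 shader pair, embedded as Swift string literals.
// The attribute/uniform names (a_Position, u_ModelViewProjection, u_TintColor)
// are illustrative placeholders, not taken from the video.

// Vertex shader: runs once per vertex, transforming positions into clip space.
let vertexShaderSource = """
attribute vec4 a_Position;              // per-vertex position supplied by the app
uniform mat4 u_ModelViewProjection;     // external variable set by the calling program

void main() {
    gl_Position = u_ModelViewProjection * a_Position;
}
"""

// Fragment shader: runs once per rasterized fragment, deciding its final color.
let fragmentShaderSource = """
precision mediump float;
uniform vec4 u_TintColor;               // color can be altered on the fly by the app

void main() {
    gl_FragColor = u_TintColor;
}
"""

At draw time the app compiles these sources with glCompileShader, links them into a program with glLinkProgram, and updates u_ModelViewProjection and u_TintColor through glUniform calls — the "external variables introduced by the program calling the shader" mentioned above.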

About OpenGL

https://developer.apple.com/library/c...

The Open Graphics Library (OpenGL) is used for visualizing 2D and 3D data. It is a multipurpose open-standard graphics library that supports applications for 2D and 3D digital content creation, mechanical and architectural design, virtual prototyping, flight simulation, video games, and more. You use OpenGL to configure a 3D graphics pipeline and submit data to it. Vertices are transformed and lit, assembled into primitives, and rasterized to create a 2D image. OpenGL is designed to translate function calls into graphics commands that can be sent to underlying graphics hardware. Because this underlying hardware is dedicated to processing graphics commands, OpenGL drawing is typically very fast.

OpenGL for Embedded Systems (OpenGL ES) is a simplified version of OpenGL that eliminates redundant functionality to provide a library that is both easier to learn and easier to implement in mobile graphics hardware.
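As a rough sketch of what "submitting data to the pipeline" looks like in practice, the snippet below feeds one triangle to OpenGL ES 2.0 from Swift. It assumes a current EAGLContext and an already compiled and linked shader program; the buffer and attribute names are illustrative.

import OpenGLES

// Assumed attribute location of the position input, bound before linking.
let positionAttribute: GLuint = 0

// One triangle in normalized device coordinates (x, y, z per vertex).
var vertices: [GLfloat] = [
     0.0,  0.5, 0.0,
    -0.5, -0.5, 0.0,
     0.5, -0.5, 0.0,
]

// Upload the vertex data into a GPU buffer object.
var vertexBuffer: GLuint = 0
glGenBuffers(1, &vertexBuffer)
glBindBuffer(GLenum(GL_ARRAY_BUFFER), vertexBuffer)
glBufferData(GLenum(GL_ARRAY_BUFFER),
             vertices.count * MemoryLayout<GLfloat>.size,
             vertices, GLenum(GL_STATIC_DRAW))

// Describe the data layout and issue the draw call. The driver translates
// these calls into GPU commands: vertices are transformed by the vertex
// shader, assembled into a triangle, and rasterized into fragments.
glEnableVertexAttribArray(positionAttribute)
glVertexAttribPointer(positionAttribute, 3, GLenum(GL_FLOAT),
                      GLboolean(GL_FALSE), 0, nil)
glDrawArrays(GLenum(GL_TRIANGLES), 0, 3)

In a real app these calls would live inside the per-frame draw method rather than at top level; the sketch only shows the order of operations.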

OpenGL ES allows an app to harness the power of the underlying graphics processor. The GPU on iOS devices can perform sophisticated 2D and 3D drawing, as well as complex shading calculations on every pixel in the final image. You should use OpenGL ES if the design requirements of your app call for the most direct and comprehensive access possible to GPU hardware. Typical clients for OpenGL ES include video games and simulations that present 3D graphics.
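On iOS, GLKit provides the plumbing that connects a view to that GPU access. Below is a minimal sketch of the usual GLKViewController setup, assuming OpenGL ES 2.0; the class name is a placeholder, the rest is standard GLKit API.

import GLKit
import OpenGLES

// A minimal sketch of the usual GLKit setup for OpenGL ES 2.0 on iOS.
// The class name is a placeholder; everything else is standard GLKit API.
class RenderViewController: GLKViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // Create an OpenGL ES 2.0 context and attach it to the GLKView
        // that GLKViewController manages.
        guard let context = EAGLContext(api: .openGLES2),
              let glkView = view as? GLKView else { return }
        glkView.context = context
        EAGLContext.setCurrent(context)
    }

    // Called by GLKViewController once per frame; all drawing happens here.
    override func glkView(_ view: GLKView, drawIn rect: CGRect) {
        glClearColor(0.0, 0.0, 0.2, 1.0)
        glClear(GLbitfield(GL_COLOR_BUFFER_BIT))
        // Shader binding and draw calls (as in the sketch above) would go here.
    }
}

GLKViewController also drives the animation loop via its preferredFramesPerSecond property, so the app only needs to fill in the draw method.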

OpenGL ES is a low-level, hardware-focused API. Though it provides the most powerful and flexible graphics processing tools, it also has a steep learning curve and a significant effect on the overall design of your app. For apps that require high-performance graphics for more specialized uses, iOS provides several higher-level frameworks:

The Sprite Kit framework provides a hardware-accelerated animation system optimized for creating 2D games. (See Sprite Kit Programming Guide.)

The Core Image framework provides real-time filtering and analysis for still and video images. (See Core Image Programming Guide.)

Core Animation provides the hardware-accelerated graphics rendering and animation infrastructure for all iOS apps, as well as a simple declarative programming model that makes it easy to implement sophisticated user interface animations. (See Core Animation Programming Guide.)

