I had some trouble learning about how to make my own post-processing shaders in Unity3D, so here is a simple guide to help people get started.
Note: Post-processing shaders rely on render textures (the RenderTexture class), which are only available in Unity Pro. Even shader features like "GrabPass" (which lets you use the previous render pass as a texture) require you to have Unity Pro.
Learning the Basics
Cg Tutorial: http://http.developer.nvidia.com/CgTutorial/cg_tutorial_chapter01.html
Unity Shader Reference: http://docs.unity3d.com/Documentation/Components/SL-Reference.html
Note: The Cg tutorial contains a lot of basic computer graphics knowledge that is good for review. However, you don't need to read about "compiling under different profiles" because Unity handles that internally. For Unity Shader Reference, the most important topics are ShaderLab Syntax, Writing Surface Shaders, and Writing Vertex and Fragment Shaders.
Writing Your First Shader
But first, how does Unity call post-processing shaders?
In Unity, post-processing shaders are different from regular shaders because there is no model to stick a material on. Of course, you could create a plane and stick your post-processing shader on that, but there is a better way to do this.
It turns out that Unity provides a function dedicated to post-processing, called OnRenderImage, which is called on any script attached to a camera. Unity will call OnRenderImage automatically, so you just have to fill it out like you do with Update or Start.
In that function, you should call the Graphics.Blit function with a material. Graphics.Blit will render the source texture using your material (a material is just a shader plus the values passed in to it) and write the result to your destination texture.
So there should be a script on your camera that does something like this:
// Called by the camera to apply the image effect
void OnRenderImage (RenderTexture source, RenderTexture destination) {
    // mat is the material containing your shader
    Graphics.Blit(source, destination, mat);
}

Note that in this code, we never explicitly tell the material (and hence the shader) to use the texture in the "source" variable (which contains the rendered image of the scene) as input. This is because Graphics.Blit will automatically copy the "source" texture to the material's main texture (or _MainTex in the shader code).
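For context, here is a minimal sketch of what the full script on the camera might look like. The class name, the public material field, and the ExecuteInEditMode attribute are my own choices for illustration, not requirements:

```csharp
using UnityEngine;

// Attach this script to the camera. ExecuteInEditMode is optional;
// it lets you preview the effect without entering Play mode.
[ExecuteInEditMode]
[RequireComponent(typeof(Camera))]
public class GrayScaleEffect : MonoBehaviour {

    // Assign a material using your post-processing shader in the Inspector
    public Material mat;

    // Called by the camera to apply the image effect
    void OnRenderImage (RenderTexture source, RenderTexture destination) {
        if (mat != null)
            Graphics.Blit(source, destination, mat);
        else
            Graphics.Blit(source, destination); // pass the image through unchanged
    }
}
```

The null check is just a safety net so the camera still renders normally if you forget to assign the material.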
After that, we need code for the actual shader. Below is a simple grayscale post-processing shader. The vertex shader simply transforms the vertex position and texture coordinates and passes them along. The fragment shader uses the texture coordinates to get the color of the current render (which is stored in _MainTex) and computes the grayscale color.
Shader "Custom/GrayScale" {
	Properties {
		_MainTex ("", 2D) = "white" {}
	}
	SubShader {
		ZTest Always Cull Off ZWrite Off Fog { Mode Off } //Rendering settings

		Pass {
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag
			#include "UnityCG.cginc" //we include "UnityCG.cginc" to use the appdata_img struct

			struct v2f {
				float4 pos : POSITION;
				half2 uv : TEXCOORD0;
			};

			//Our Vertex Shader
			v2f vert (appdata_img v) {
				v2f o;
				o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
				o.uv = MultiplyUV(UNITY_MATRIX_TEXTURE0, v.texcoord.xy);
				return o;
			}

			sampler2D _MainTex; //Declaring this inside the Pass is necessary to let us use this variable in shaders

			//Our Fragment Shader
			fixed4 frag (v2f i) : COLOR {
				fixed4 orgCol = tex2D(_MainTex, i.uv); //Get the original rendered color

				//Make changes to the color
				float avg = (orgCol.r + orgCol.g + orgCol.b) / 3.0;
				fixed4 col = fixed4(avg, avg, avg, 1);

				return col;
			}
			ENDCG
		}
	}
	FallBack "Diffuse"
}
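Since a material carries values as well as a shader, you can also drive the effect from script. As a sketch (the _Intensity property is my own addition, not part of the shader above), you could add a property and blend between the original and grayscale colors:

```
Properties {
    _MainTex ("", 2D) = "white" {}
    _Intensity ("Effect Intensity", Range(0, 1)) = 1
}

// ...inside the Pass, declared next to _MainTex:
float _Intensity;

// ...and in the fragment shader, replace the col line with:
fixed4 col = lerp(orgCol, fixed4(avg, avg, avg, 1), _Intensity);
```

Then a call like mat.SetFloat("_Intensity", 0.5f) from your camera script would give you a half-strength grayscale effect, which you could animate over time.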