Most efficient way to save a shader result as a texture and feed it back in the next frame (ping-pong buffer)

I’ve done some fun work with shaders in openFrameworks by taking the output of a shader as a texture and feeding it back into the shader on the next frame as a sampler2D. This allows for interesting effects built on techniques like cellular automata.

Now I am trying to replicate this effect in Unity, but I wanted to check whether my current method is sound or whether there is some glaring efficiency flaw that people more experienced in graphics can spot. I’d also like to post this code since I had a hard time finding this exact problem addressed anywhere. I am doing this on the camera specifically for screen effects, and I took the foundations from Zucconi and Lammers’s book.

Specifically, after every render I read the pixels of the currently active render texture (which appears to be destTexture, though I am not certain) into a member Texture2D, then feed that texture back into the shader on the next frame.

Texture2D samp;

void OnRenderImage(RenderTexture sourceTexture, RenderTexture destTexture) {
	if (curShader != null) {
		// First frame only: create the feedback texture and seed it with the source image.
		if (samp == null) {
			samp = new Texture2D(sourceTexture.width, sourceTexture.height);
			RenderTexture.active = sourceTexture;
			samp.ReadPixels(new Rect(0, 0, sourceTexture.width, sourceTexture.height), 0, 0);
			samp.Apply();
		}

		// Feed last frame's result into the shader and render this frame's effect.
		material.SetTexture("_CurTex", samp);
		Graphics.Blit(sourceTexture, destTexture, material);

		// Graphics.Blit leaves destTexture as the active render texture, so this
		// reads this frame's output back into samp for use on the next frame.
		samp.ReadPixels(new Rect(0, 0, sourceTexture.width, sourceTexture.height), 0, 0);
		samp.Apply();
	} else {
		Graphics.Blit(sourceTexture, destTexture);
	}
}

Thanks in advance for any help.

Reading about Texture2D.Apply, the documentation suggests that if you only need the texture on the GPU and not on the CPU, Graphics.CopyTexture would be faster.
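Along those lines, if the previous frame's result is only ever sampled by the shader, the ReadPixels/Apply round trip (GPU to CPU and back) could be skipped entirely by keeping two RenderTextures and swapping them each frame; Graphics.CopyTexture would also stay on the GPU, but swapping references avoids even the copy. Here is a minimal sketch of what I think that would look like, assuming the same _CurTex shader property; the names PingPongEffect, previous, and current are just placeholders I made up for the example.

using UnityEngine;

[RequireComponent(typeof(Camera))]
public class PingPongEffect : MonoBehaviour {
	public Material material;   // material using the feedback shader
	RenderTexture previous;     // last frame's result
	RenderTexture current;      // render target for this frame

	void OnRenderImage(RenderTexture sourceTexture, RenderTexture destTexture) {
		if (material == null) {
			Graphics.Blit(sourceTexture, destTexture);
			return;
		}

		// Lazily allocate the two GPU-side buffers at screen resolution.
		if (previous == null) {
			previous = new RenderTexture(sourceTexture.width, sourceTexture.height, 0, sourceTexture.format);
			current = new RenderTexture(sourceTexture.width, sourceTexture.height, 0, sourceTexture.format);
		}

		// Feed last frame's result into the shader and render into 'current'.
		material.SetTexture("_CurTex", previous);
		Graphics.Blit(sourceTexture, current, material);

		// Copy the result to the screen (or the next effect in the chain).
		Graphics.Blit(current, destTexture);

		// Swap the buffers; no pixel data ever leaves the GPU.
		RenderTexture tmp = previous;
		previous = current;
		current = tmp;
	}

	void OnDisable() {
		if (previous != null) previous.Release();
		if (current != null) current.Release();
	}
}

The cost is one extra blit per frame, but everything stays in video memory, which should be much cheaper than having ReadPixels stall the pipeline to copy data back to the CPU every frame.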