Is it possible to track how many pixels have changed in a rendered frame?

In Unity, I would like to simulate what is known as an ‘event camera’. An event camera is a vision sensor in which changes in pixel intensity, when they exceed a certain threshold, are recorded as events (+ for an increase, − for a decrease). So this is a bit like subtracting the previous frame from the latest one and thresholding the result (in reality it is not quite that, because event cameras are asynchronous as opposed to frame-based).

I was wondering if it’s possible to perform such an operation in Unity directly within the render pipeline, as opposed to rendering an image, reading the pixels back, and then doing a pixel-wise subtraction on the CPU. Is there any way to perform an image subtraction as a post-processing effect, perhaps?

I’d say the simplest way to do it would be the same way a lot of anti-aliasing post-processes are done: send the current and previous frames into a shader which returns the subtraction. You can use OnRenderImage to get the current screen render texture, pass it into the shader, and then store a copy of it. That way the stored texture is always the previous frame, which you can also feed to the shader for your subtraction.
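Something like this, roughly (a sketch, not tested; the `subtractMaterial`, `_PrevTex`, and `EventCameraEffect` names are just placeholders, and it assumes the built-in render pipeline where OnRenderImage is available):

```csharp
using UnityEngine;

// Sketch: keeps a copy of the previous frame and blits the current frame
// through a frame-difference material every frame.
[RequireComponent(typeof(Camera))]
public class EventCameraEffect : MonoBehaviour
{
    public Material subtractMaterial; // material using a frame-difference shader
    private RenderTexture previousFrame;

    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        // (Re)allocate the history texture if the screen size changed.
        if (previousFrame == null ||
            previousFrame.width != src.width || previousFrame.height != src.height)
        {
            if (previousFrame != null) previousFrame.Release();
            previousFrame = new RenderTexture(src.width, src.height, 0, src.format);
        }

        // Give the shader the previous frame and output the difference.
        subtractMaterial.SetTexture("_PrevTex", previousFrame);
        Graphics.Blit(src, dest, subtractMaterial);

        // Store the current frame for next time.
        Graphics.Blit(src, previousFrame);
    }

    void OnDestroy()
    {
        if (previousFrame != null) previousFrame.Release();
    }
}
```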
Is that what you are trying to do, or am I misreading your question?
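For the shader side, a rough sketch of the thresholding could look like this (again untested; `_PrevTex` and `_Threshold` are names I made up, and it compares per-pixel luminance, outputting positive events in the red channel and negative ones in the blue channel):

```shaderlab
Shader "Hidden/FrameDifference"
{
    Properties
    {
        _MainTex ("Current Frame", 2D) = "white" {}
        _PrevTex ("Previous Frame", 2D) = "black" {}
        _Threshold ("Event Threshold", Float) = 0.1
    }
    SubShader
    {
        Cull Off ZWrite Off ZTest Always
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            sampler2D _PrevTex;
            float _Threshold;

            fixed4 frag (v2f_img i) : SV_Target
            {
                // Luminance of the current and previous frames at this pixel.
                float cur  = Luminance(tex2D(_MainTex, i.uv).rgb);
                float prev = Luminance(tex2D(_PrevTex, i.uv).rgb);
                float diff = cur - prev;

                // + events -> red, - events -> blue, below threshold -> black.
                float pos = step(_Threshold, diff);
                float neg = step(_Threshold, -diff);
                return fixed4(pos, 0, neg, 1);
            }
            ENDCG
        }
    }
}
```

If you need the events on the CPU afterwards you’d still have to read the result back, but at least the subtraction and thresholding happen on the GPU.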