Do debayering
We don't want every application developer to reinvent the wheel. Better to push debayering into libcamera as a built-in capability.
Upstream developers have unofficially confirmed that this is in scope and of interest.
Considerations:
- the fastest method depends on the hardware (fragment shader, compute shader, OpenCL, Vulkan, OpenGL, CPU, or IPU)
- a consumer might want to receive the raw stream for some frames but not others, or alongside the debayered stream
- debayering can be significantly simpler and faster if the output is downscaled
- only applies to some hardware (devices that deliver raw Bayer frames without ISP debayering)
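To illustrate the downscaling point above, here is a minimal CPU debayering sketch (a hypothetical illustration, not libcamera code or API): a nearest-neighbour demosaic of an RGGB mosaic that maps each 2x2 Bayer block to one RGB pixel, halving the resolution. Because every sample is used exactly once and no interpolation across blocks is needed, this is far cheaper than full-resolution demosaicing.

```python
# Hypothetical CPU debayer sketch: each RGGB block
#   R G
#   G B
# becomes a single RGB pixel, downscaling the image 2x.

def debayer_rggb_half(mosaic):
    """mosaic: list of rows of raw sensor values (even width and height).
    Returns an image at half resolution, as rows of (r, g, b) tuples."""
    out = []
    for y in range(0, len(mosaic), 2):
        row = []
        for x in range(0, len(mosaic[y]), 2):
            r = mosaic[y][x]                              # top-left: red
            g = (mosaic[y][x + 1] + mosaic[y + 1][x]) / 2 # average both greens
            b = mosaic[y + 1][x + 1]                      # bottom-right: blue
            row.append((r, g, b))
        out.append(row)
    return out

# Example: a single RGGB block from a uniform grey scene
raw = [[100, 100],
       [100, 100]]
print(debayer_rggb_half(raw))  # [[(100, 100.0, 100)]]
```

A full-resolution variant would additionally interpolate the missing two channels at every pixel (e.g. bilinearly from neighbouring samples), which is exactly the extra work a GPU shader or IPU offload would absorb.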