On most camera sensors, each pixel collects information about a single color (red, green, or blue) thanks to the use of a color filter array.
The most common pattern is the Bayer filter (cf. https://en.wikipedia.org/wiki/Bayer_filter), which is used in the vast majority of cameras.
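To make the layout concrete, here is a minimal sketch (not tied to any particular camera) of how an RGGB Bayer pattern assigns one color filter to each pixel based on its position. The function name and the RGGB variant are illustrative assumptions; other Bayer variants (GRBG, GBRG, BGGR) just shift the same pattern.

```python
# Minimal sketch of an RGGB Bayer layout: each sensor pixel records only
# one color, determined by the parity of its (row, col) position.
def bayer_color(row, col):
    """Return the color filter at (row, col) for an RGGB layout."""
    if row % 2 == 0:
        # Even rows alternate red and green ("Gr" = green on a red row).
        return "R" if col % 2 == 0 else "Gr"
    else:
        # Odd rows alternate green and blue ("Gb" = green on a blue row).
        return "Gb" if col % 2 == 0 else "B"
```

Note that half of all pixels are green, split into two populations: Gr pixels sit between red pixels on their row, and Gb pixels sit between blue pixels. This distinction is what makes the imbalance described below possible.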
Unfortunately, in some cases (which can involve one or several of these characteristics: small pixels, lens flare, or a sensor placed very close to the lens), light that should be collected by one pixel arrives at such an angle that it is collected by a neighboring pixel instead.
Now consider a sensor area that corresponds to a homogeneous area of the photographed scene. A green pixel between two red pixels will receive some amount of light that should have been collected by those red pixels, and the same occurs for green pixels between blue pixels. But since red and blue light are generally received in different amounts, green pixels between two red pixels will end up with a different value from green pixels between two blue pixels, whereas all these green pixels should have the same value (the scene area is homogeneous). This phenomenon, called crosstalk (or Gr-Gb imbalance), is generally amplified by demosaicing algorithms and leads to a maze/labyrinth-like pattern.
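The mechanism above can be illustrated with a toy numerical model. The signal levels and the leakage fraction below are made-up values for illustration only; the point is that even though the scene is uniform, the two green populations diverge once each green pixel absorbs a fraction of its horizontal neighbors' light.

```python
# Toy crosstalk model on a homogeneous patch (all numbers are hypothetical).
green, red, blue = 100.0, 80.0, 20.0  # true per-channel signal for the scene
leak = 0.02                           # assumed 2% light leakage per neighbor

# A Gr pixel is flanked by two red pixels; a Gb pixel by two blue pixels.
gr = green + 2 * leak * red   # 100 + 2 * 0.02 * 80 = 103.2
gb = green + 2 * leak * blue  # 100 + 2 * 0.02 * 20 = 100.8

# Both populations should read 100.0, but crosstalk splits them apart:
print(gr, gb)  # prints 103.2 100.8
```

A demosaicing algorithm interpolating across these unequal green values then produces the alternating maze-like artifact shown below.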
Below is a sample of the phenomenon:
Picture shot with a Sony RX-100 camera
This crop is an area extracted from the center of the picture above
The PhotoLab slider aims at removing this Gr-Gb imbalance before the demosaicing step.
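One simple way such a correction can work (this is a generic sketch, not PhotoLab's actual algorithm) is to rescale the two green populations so that their averages match before demosaicing runs:

```python
# Generic Gr-Gb equalization sketch (hypothetical, not PhotoLab's method):
# scale each green population toward the common mean so both agree.
def equalize_gr_gb(gr_values, gb_values):
    mean_gr = sum(gr_values) / len(gr_values)
    mean_gb = sum(gb_values) / len(gb_values)
    target = (mean_gr + mean_gb) / 2.0  # common target level
    gr_fixed = [v * target / mean_gr for v in gr_values]
    gb_fixed = [v * target / mean_gb for v in gb_values]
    return gr_fixed, gb_fixed
```

Because the correction is applied to the raw green values before interpolation, the demosaicing algorithm no longer sees two distinct green levels and the maze pattern does not form.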