A hand‑painted sky can look more convincing than a high‑resolution cloud photograph because the visual system is not a camera. Instead of logging every droplet and edge, the brain extracts statistical regularities: soft luminance gradients, typical cloud contours and the way brightness decays toward the horizon. When a painter exaggerates those signatures and strips away stray detail, the picture aligns more tightly with the brain’s internal model of what a sky should be.
Research on the visual cortex shows that neurons in early areas act as spatial frequency filters, while higher areas compress scenes into familiar schemas through pattern recognition. That hierarchy behaves like a lossy encoder, guided by principles closer to information theory and entropy reduction than to faithful pixel storage. Many ultra‑sharp photos preserve micro‑textures and lighting noise that do not match the brain’s compressed template, so they can feel oddly flat. A stylized sky, by contrast, boosts the dominant cues the perceptual system uses to infer depth, volume and weather, triggering a stronger sense of reality precisely because it is less literal.
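As a loose analogy only, not a model of cortical processing, the "lossy encoder" idea can be sketched in a few lines: keep the lowest spatial frequencies of an image (the soft luminance gradients a painter exaggerates) and discard the rest (micro‑texture and noise). The synthetic "sky", the `keep_fraction` knob, and the error comparison below are all illustrative assumptions, not data from perception research.

```python
import numpy as np

def lossy_encode(image, keep_fraction=0.1):
    """Keep only the lowest spatial frequencies of `image`.

    Illustrative analogy for lossy perceptual encoding: low
    frequencies carry the broad gradients, high frequencies carry
    micro-texture and noise. `keep_fraction` is a made-up knob,
    not a biological parameter.
    """
    f = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    cy, cx = h // 2, w // 2
    ky, kx = int(h * keep_fraction / 2), int(w * keep_fraction / 2)
    mask = np.zeros_like(f)
    mask[cy - ky:cy + ky + 1, cx - kx:cx + kx + 1] = 1
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

# A toy "sky": one smooth luminance wave plus fine-grained noise.
rng = np.random.default_rng(0)
h, w = 64, 64
y = np.arange(h)[:, None]
base = 0.5 + 0.3 * np.cos(2 * np.pi * y / h) * np.ones((1, w))
sky = base + 0.05 * rng.standard_normal((h, w))

reconstructed = lossy_encode(sky, keep_fraction=0.1)

# The smooth gradient survives the encoding; most of the noise
# does not, so the reconstruction is closer to the clean base
# than the "photographic" noisy original is.
err_encoded = np.abs(reconstructed - base).mean()
err_raw = np.abs(sky - base).mean()
```

The point of the toy is the comparison at the end: after discarding roughly 99% of the frequency coefficients, the reconstruction sits closer to the clean luminance gradient than the noisy original does, which is one way to read why a simplified sky can look "more right" than a literal one.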