Images captured under adverse weather conditions, such as haze or fog, typically exhibit low contrast and faded colors, which can severely limit scene visibility. Recovering the image structure beneath the haze layer and restoring vivid colors from a single image remains a challenging task, since the degradation is depth-dependent and conventional, spatially-uniform enhancement methods cannot handle it.
We propose to extend a well-known perception-inspired variational framework to the task of single image dehazing. The main modification consists in replacing the value this framework uses for the grey-world hypothesis with an estimate of the mean of the clean image. This allows us to devise a variational method that requires no estimate of the depth structure of the scene, performing a spatially-variant contrast enhancement that effectively removes haze from faraway regions. Experimental results show that our method competes well with other state-of-the-art methods on typical benchmark images, while outperforming current image dehazing methods in more challenging scenarios.
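To make the core idea concrete, the following is a minimal sketch, not the paper's actual method: a toy gradient-style iteration on a single-channel image in which the grey-world constant 1/2 of the perception-inspired framework is replaced by a hypothetical estimate `mu_hat` of the clean image's mean. The function name, all parameters, and the simplified sign-based contrast term (using a 4-neighbour local mean in place of the model's full pairwise term) are illustrative assumptions.

```python
import numpy as np

def dehaze_sketch(I, mu_hat, alpha=0.5, beta=0.5, gamma=0.2, dt=0.1, iters=50):
    """Toy spatially-variant contrast enhancement (illustrative only).

    I      : hazy image, float array in [0, 1], one channel.
    mu_hat : assumed estimate of the clean image's mean, standing in for
             the grey-world constant 1/2 of the original framework.
    """
    I0 = I.astype(float)
    J = I0.copy()
    for _ in range(iters):
        # 4-neighbour local mean, a crude stand-in for the model's
        # pairwise contrast interactions.
        local = (np.roll(J, 1, 0) + np.roll(J, -1, 0) +
                 np.roll(J, 1, 1) + np.roll(J, -1, 1)) / 4.0
        # Push each pixel away from its local mean (contrast gain) while
        # attaching it to mu_hat and to the observed hazy image I0.
        R = np.sign(J - local)
        J = (J + dt * (alpha * mu_hat + beta * I0 + 0.5 * gamma * R)) \
            / (1.0 + dt * (alpha + beta))
        J = np.clip(J, 0.0, 1.0)
    return J
```

When `mu_hat` is set below the hazy image's mean, the attachment term darkens washed-out regions while the contrast term amplifies local differences, a rough analogue of the spatially-variant enhancement described above.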