I am studying the image denoising problem. The paper I am studying uses Gaussian noise to model the noise appearing in the image. E.g., if $u_g$ is a clean image, then we model the noisy image $u_0$ as $u_0 = u_g + n$, where $n$ has a Gaussian distribution with mean $0$ and variance $c$.
Q1: I am wondering why we use a Gaussian distribution to model the noise. Is there any theory behind it?
Q2: Can we use stochastic processes to model the noise? If so, is there any existing literature?
Answer
Theory behind it: additive Gaussian noise can be related to (Johnson–Nyquist) thermal noise in standard imaging sensors. For the denoising of more general multivariate images, chapter 2.2.2 provides some references on why the global noise present in acquired data can be realistically modelled as additive, zero-mean, spatially white Gaussian noise.
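As a minimal sketch of this model (assuming a grayscale image stored as a float array with values in $[0,1]$; the function name `add_gaussian_noise` is my own, not from the paper), simulating $u_0 = u_g + n$ looks like this:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def add_gaussian_noise(u_g: np.ndarray, variance: float) -> np.ndarray:
    """Simulate u_0 = u_g + n, with n ~ N(0, variance) i.i.d. per pixel."""
    n = rng.normal(loc=0.0, scale=np.sqrt(variance), size=u_g.shape)
    return u_g + n

# Example: a flat gray image (values in [0, 1]) corrupted with variance c = 0.01.
u_g = np.full((64, 64), 0.5)
u_0 = add_gaussian_noise(u_g, variance=0.01)
```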
Shot noise can be modeled as a Poisson process which, for large photon counts, behaves approximately like Gaussian noise.
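A quick numerical check of that Gaussian limit (just a sketch; the mean count $\lambda = 1000$ is an arbitrary choice for illustration): for a Poisson variable with large mean $\lambda$, the standardized values $(X - \lambda)/\sqrt{\lambda}$ are close to $\mathcal{N}(0,1)$.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

lam = 1000.0                       # mean photon count per pixel (large)
x = rng.poisson(lam, size=100_000)
z = (x - lam) / np.sqrt(lam)       # standardized shot noise

# For large lam, the sample mean and variance approach those of N(0, 1).
print(z.mean(), z.var())           # close to 0.0 and 1.0, respectively
```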
Other types of noise are observable in images: salt-and-pepper, quantization (closer to uniform), film grain. In some imaging modalities the noise can be non-stationary or multiplicative. In general, these can all be modeled by stochastic processes.
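For illustration, here is a hedged sketch of two of those models on an image with values in $[0,1]$ (the function names and the parameter choices are hypothetical, not from any particular reference):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def salt_and_pepper(u: np.ndarray, p: float) -> np.ndarray:
    """Flip a fraction p of pixels to one of the extremes 0 or 1."""
    out = u.copy()
    mask = rng.random(u.shape) < p
    out[mask] = rng.integers(0, 2, size=mask.sum()).astype(float)
    return out

def multiplicative(u: np.ndarray, sigma: float) -> np.ndarray:
    """Speckle-like model u_0 = u * (1 + n), with n ~ N(0, sigma^2)."""
    return u * (1.0 + rng.normal(0.0, sigma, size=u.shape))
```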
Gaussian noise is also widely used for a very practical reason: it enjoys a relatively tractable theory, related to least-squares estimation.
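Concretely (a standard derivation, not specific to the paper in question): with i.i.d. Gaussian noise the negative log-likelihood is quadratic, so maximum a posteriori denoising reduces to a least-squares data term plus whatever regularizer $R(u)$ the method uses, with weight $\lambda$:

$$-\log p(u_0 \mid u) = \frac{1}{2c}\,\|u_0 - u\|_2^2 + \text{const}, \qquad \hat{u} = \arg\min_u \left\{ \frac{1}{2c}\,\|u_0 - u\|_2^2 + \lambda R(u) \right\}.$$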
Note, however, that most images have 8-bit channels, with integer values in $\{0,\ldots,255\}$. Hence, adding Gaussian noise yields non-integer values, negative values, or values above 255, which is not fully realistic: naturally sampled noise is likely to be clipped and quantized to the admissible values.
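A more realistic simulation for 8-bit data therefore clips and rounds after adding the noise (again a sketch; the function name and the standard deviation of 10 gray levels are illustrative):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def add_quantized_gaussian_noise(u_g: np.ndarray, std: float) -> np.ndarray:
    """Add Gaussian noise to an 8-bit image, then clip to [0, 255] and round."""
    noisy = u_g.astype(np.float64) + rng.normal(0.0, std, size=u_g.shape)
    return np.clip(np.rint(noisy), 0, 255).astype(np.uint8)

# Example: noise with standard deviation 10 gray levels on a mid-gray image.
u_g = np.full((64, 64), 128, dtype=np.uint8)
u_0 = add_quantized_gaussian_noise(u_g, std=10.0)
```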