
2D Fused LASSO using Gradient Descent for grayscale image restoration 🎈


Problem

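As a reference point, a standard way to write the 2D fused LASSO denoising objective (the exact form and constants used in the notebook may differ; Y denotes the noisy image and W the restored one):

```latex
\min_{W}\;\; \frac{1}{2}\sum_{i,j}\bigl(W_{i,j}-Y_{i,j}\bigr)^{2}
\;+\;\lambda\sum_{i,j}\Bigl(\bigl|W_{i+1,j}-W_{i,j}\bigr|+\bigl|W_{i,j+1}-W_{i,j}\bigr|\Bigr)
```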

Fused LASSO is a variation of MSE + L1 regularization: instead of penalizing each weight's magnitude, we penalize a weight (which corresponds to a pixel's brightness value) for being different from its neighboring pixels. As is widely known, the L1 norm is not differentiable, but it is convex, so a subgradient can be computed; for an absolute-value term it corresponds to sign(w). Apart from this, the training process is not much different from regular Gradient Descent. To be fancy, three optimization algorithms are implemented: vanilla Gradient Descent, GD with Momentum, and Nesterov-accelerated gradient (NAG).
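As a rough illustration (a sketch, not the repository's exact code), a single subgradient step on the objective above can be written as follows; the names `lam`, `lr`, and the loss scaling are assumptions:

```python
import numpy as np

def fused_lasso_subgradient(W, Y, lam):
    """Subgradient of 0.5*||W - Y||^2 + lam * (|vertical diffs| + |horizontal diffs|)."""
    g = W - Y                              # gradient of the quadratic (MSE) term
    dv = np.sign(W[1:, :] - W[:-1, :])     # subgradient of vertical absolute differences
    dh = np.sign(W[:, 1:] - W[:, :-1])     # subgradient of horizontal absolute differences
    g[1:, :]  += lam * dv
    g[:-1, :] -= lam * dv
    g[:, 1:]  += lam * dh
    g[:, :-1] -= lam * dh
    return g

def denoise_gd(Y, lam=10.0, lr=0.1, n_iters=500):
    """Vanilla (sub)gradient descent, initialized at the noisy image itself."""
    W = Y.astype(float).copy()
    for _ in range(n_iters):
        W -= lr * fused_lasso_subgradient(W, Y, lam)
    return W
```

Momentum and NAG reuse the same subgradient; they only change how the update is accumulated before it is applied.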

Examples

Gaussian noise N(0, 20) is added to each of the examples, then the model is trained and the result is compared with both the original and the noisy image. The metric used for evaluation is a modification of the R^2 coefficient, in which the variance is replaced by an estimate of the Gaussian noise standard deviation. It can be interpreted as the amount of variation of the original image "preserved" in the denoised version.
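For concreteness, a hedged sketch of such a score; the exact formula in the notebook may differ, and `sigma_hat` stands for whatever noise-standard-deviation estimate the notebook computes:

```python
import numpy as np

def denoising_score(original, denoised, sigma_hat):
    """R^2-style score with the variance term replaced by the estimated noise variance."""
    mse = np.mean((original.astype(float) - denoised.astype(float)) ** 2)
    return 1.0 - mse / sigma_hat ** 2   # 1.0 corresponds to perfect restoration
```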


Interface

Along with the research notebook, a fused_lasso.py file is provided, containing a model class with an sklearn-like API.
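A hypothetical usage example is sketched below; the class name `FusedLasso` and the parameters `lam`, `lr`, `n_iters`, and `optimizer` are assumptions for illustration, so check fused_lasso.py for the actual API:

```python
import numpy as np
from fused_lasso import FusedLasso      # class name assumed; see fused_lasso.py

rng = np.random.default_rng(0)
clean = np.zeros((28, 28))
clean[8:20, 8:20] = 255.0               # toy "image" with a bright square
noisy = clean + rng.normal(0, 20, clean.shape)   # N(0, 20) noise as in the examples

model = FusedLasso(lam=10.0, lr=0.1, n_iters=500, optimizer="nesterov")
restored = model.fit_transform(noisy)   # denoised image, same shape as the input
```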

Acknowledgements

  • Ryan Tibshirani's course on convex optimization
  • Sebastian Ruder's blog post on Gradient Descent optimization algorithms
  • Yann LeCun's MNIST dataset