How to implement the augmented Lagrangian method?

Hi, I have a cost function like this:
x = ||g - u||^2 + lambda*R
where g and u are matrices of the same size, R is a regularization term, and lambda is a constant. Now I want to minimize this function using the augmented Lagrangian method. Any help?
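The augmented Lagrangian method handles a constrained reformulation: introduce an auxiliary variable v with the constraint v = u, so the regularizer acts on v and the data term on u. Below is a minimal MATLAB sketch of that splitting (ADMM-style), assuming for illustration that R is an l1 penalty, R(u) = sum(abs(u(:))); the function name `al_denoise` and the parameters `mu` (penalty weight) and `nIter` are my own choices, and the v-update must be changed if your R is different.

```matlab
% Augmented Lagrangian / ADMM sketch for min ||g-u||^2 + lambda*R(u),
% assuming R(u) = sum(abs(u(:))) (l1 penalty). Adapt the v-update otherwise.
function u = al_denoise(g, lambda, mu, nIter)
    u = g;                 % primal variable
    v = zeros(size(g));    % auxiliary variable with constraint v = u
    b = zeros(size(g));    % scaled dual variable (multiplier / mu)
    for k = 1:nIter
        % u-update: minimize ||g-u||^2 + (mu/2)*||u - v + b||^2 (closed form)
        u = (2*g + mu*(v - b)) / (2 + mu);
        % v-update: soft thresholding solves the l1 subproblem exactly
        v = sign(u + b) .* max(abs(u + b) - lambda/mu, 0);
        % dual ascent on the constraint u = v
        b = b + u - v;
    end
end
```

Typical usage would be something like `u = al_denoise(g, 0.1, 1, 100);`. The u-update comes from setting the gradient of the two quadratic terms to zero; the soft-thresholding step is the proximal operator of the l1 norm, which is why an l1 regularizer is assumed here.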

Answers (0)

Asked: 23 Apr 2013
