function [x,R] = fb(x, ProxF, GradG, be, options)

% fb - Forward-Backward algorithm
%
%   [x,R] = fb(x, ProxF, GradG, be, options);
%
%   Solves min_x G(x) + F(x)
%   where G is a differentiable convex function whose gradient is 1/be-Lipschitz,
%   and F is a proper, closed, convex and simple function.
%
%   Uses one of several first-order schemes depending on options.method:
%       options.method = 'fb'       : classical Forward-Backward.
%       options.method = 'fista'    : FISTA method of Beck and Teboulle.
%       options.method = 'nesterov' : Nesterov scheme.
%
%   INPUTS:
%   ProxF(y,mu) computes Prox_{mu*F}(y).
%   GradG(x) computes \nabla G(x).
%   1/be is the Lipschitz constant of the gradient; if G is C^2, then
%       1/be = max_x norm( Hg(x) )
%   where Hg(x) is the Hessian of G at x.
%   For instance, if G(x)=1/2*|A*x-y|^2 then 1/be = norm(A)^2.
%   options.niter is the number of iterations.
%   options.verb is for the display of iterations.
%   options.mu is the FB descent step size (for FB, mu \in ]0,2*be[; for FISTA, mu \in ]0,be]).
%   options.tau is the relaxation parameter, tau \in ]0,(4-mu/be)/2[.
%   options.report(x) is a function to fill in R.
%
%   OUTPUTS:
%   x is the final solution.
%   R(i) = options.report(x) at iteration i.
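%
%   Example (a hedged usage sketch, not part of the original file; A, y and
%   lambda below are illustrative values chosen for the example): solving
%   the lasso problem
%       min_x 1/2*|A*x-y|^2 + lambda*|x|_1
%   with FISTA, taking ProxF as soft thresholding, the proximal operator
%   of the (scaled) l^1 norm:
%
%       A = randn(20,50); y = randn(20,1); lambda = 0.1;
%       GradG = @(x) A'*(A*x - y);                          % gradient of G
%       ProxF = @(x,mu) max( abs(x)-lambda*mu, 0 ).*sign(x); % Prox_{mu*F}
%       be = 1/norm(A)^2;               % gradient of G is 1/be-Lipschitz
%       opts.method = 'fista'; opts.niter = 200;
%       opts.report = @(x) 1/2*norm(A*x-y)^2 + lambda*norm(x,1);
%       [xs, R] = fb( zeros(50,1), ProxF, GradG, be, opts );
%       plot(R);                        % objective decay across iterations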
%

method = getoptions(options, 'method', 'fb');
mu     = getoptions(options, 'mu', 1.8*be);
tau    = getoptions(options, 'tau', 1);
report = getoptions(options, 'report', @(x)0);
niter  = getoptions(options, 'niter', 100);
verb   = getoptions(options, 'verb', 1);

R = [];
t = 1;                      % fista & nesterov
tt = 2*be; gg = 0; A = 0;   % nesterov
y = x; x0 = x;

for i=1:niter
    if verb
        progressbar(i,niter);
    end
    xold = x;
    switch method
        case 'fb'
            % relaxed forward (gradient) / backward (proximal) step
            x = x + tau*( ProxF( x - mu*GradG(x), mu ) - x );
        case 'fista'
            % proximal-gradient step at the extrapolated point y
            xnew = ProxF( y - be*GradG(y), be );
            tnew = ( 1 + sqrt(1+4*t^2) )/2;
            y = xnew + (t-1)/tnew * (xnew-x);
            x = xnew; t = tnew;
        case 'nesterov'
            % gg accumulates the weighted gradients, A the weights a
            a = ( tt + sqrt(tt^2 + 4*tt*A) )/2;
            v = ProxF( x0-gg, A );
            z = ( A*x + a*v )/( A+a );
            x = ProxF( z - be*GradG(z), be );
            gg = gg + a*GradG(x);
            A = A + a;
        otherwise
            error('Unknown method');
    end
    R(i) = report(x);
end