function [x,R] = fb_dual(x, K, KS, GradFS, ProxGS, be, options)

% fb_dual - Forward-Backward algorithm applied to the Fenchel-Rockafellar dual (**) for strongly convex problems
%
%   [x,R] = fb_dual(x, K, KS, GradFS, ProxGS, be, options);
%
%   Minimization of
%       min_x F(x) + G(K*x)     (*)
%   where F is strongly convex and G is a proper, closed, convex and simple function.
%
%   Under appropriate domain qualification conditions, strong duality holds and (*) satisfies
%       min_x F(x) + G(K*x) = - min_u ( F^*(-K^* u) + G^*(u) )     (**)
%   and the Fenchel extremality condition allows one to recover the primal solution
%   from a dual one, i.e.
%       x = grad(F^*)(-K^* u)
%
%   INPUTS:
%   GradFS(x) is grad(F^*)(x) (which is Lipschitz since F is strongly convex).
%   ProxGS(u,mu) is prox_{mu*G^*}(u).
%   K is a handle to the linear operator.
%   KS is a handle to the adjoint linear operator.
%   be is the Lipschitz constant of K*GradFS*KS (the gradient of the dual functional).
%   options.niter is the number of iterations.
%   options.verb toggles the display of iteration progress.
%   options.mu is the FB descent step size.
%   options.report(x) is a function used to fill in R.
%
%   OUTPUTS:
%   x = grad(F^*)(-K^* u) is the final solution.
%   R(i) = options.report(x) at iteration i.

% accept matrices as well as function handles for the operator and its adjoint
if isnumeric(K)
    K = @(x)K*x;
end
if isnumeric(KS)
    KS = @(x)KS*x;
end

% report on the recovered primal iterate rather than on the dual one
report_old = getoptions(options, 'report', @(x)0);
options.report = @(u)report_old(GradFS(-KS(u)));

% gradient of the dual functional u -> F^*(-K^* u)
NewGrad = @(u)-K(GradFS(-KS(u)));

% run Forward-Backward on the dual problem, initialized at u0 = K(x)
u0 = K(x);
[u, R] = fb(u0, ProxGS, NewGrad, be, options);

% recover the primal solution from the dual one
x = GradFS(-KS(u));
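% The scheme above can be sketched in NumPy for a concrete instance. This is a
% minimal, hypothetical port (the `fb_dual` re-implementation below inlines a
% plain FB loop instead of calling the `fb` helper), applied to 1-D total
% variation denoising: F(x) = 1/2||x - y||^2, G = lam*||.||_1, K = D a
% finite-difference matrix. Then grad(F^*)(z) = z + y and prox_{mu*G^*} is a
% clip onto [-lam, lam]. All names here are illustrative, not part of the
% original toolbox.

```python
import numpy as np

def fb_dual(x0, K, KS, GradFS, ProxGS, mu, niter):
    """Forward-Backward on the Fenchel-Rockafellar dual (illustrative sketch)."""
    u = K(x0)                                  # dual initialization u0 = K(x)
    for _ in range(niter):
        # explicit step on u -> F^*(-K^* u), then prox of mu*G^*
        u = ProxGS(u - mu * (-K(GradFS(-KS(u)))), mu)
    return GradFS(-KS(u))                      # primal recovery x = grad(F^*)(-K^* u)

# Example: min_x 1/2||x - y||^2 + lam*||D x||_1  (1-D total variation)
rng = np.random.default_rng(0)
n, lam = 50, 0.5
y = np.cumsum(rng.standard_normal(n))          # noisy random-walk signal
D = np.diff(np.eye(n), axis=0)                 # finite differences, (Dx)_i = x_{i+1}-x_i
K  = lambda x: D @ x
KS = lambda u: D.T @ u
GradFS = lambda z: z + y                       # grad F^* for F = 1/2||.-y||^2
ProxGS = lambda u, mu: np.clip(u, -lam, lam)   # prox_{mu*G^*} for G = lam*||.||_1
L = np.linalg.norm(D, 2) ** 2                  # Lipschitz constant of the dual gradient
x = fb_dual(y, K, KS, GradFS, ProxGS, 1.0 / L, 2000)
```

% Since grad(F^*) is 1-Lipschitz here, the dual gradient has Lipschitz constant
% ||D||^2, and the step size 1/L guarantees convergence of the FB iterates.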