Many researchers have studied descent methods for multiobjective
optimization problems in recent years. For example, Fliege and Svaiter
proposed the steepest descent method for differentiable multiobjective
optimization problems. Afterward, a proximal gradient method, which
can also handle non-differentiable problems, was considered. However,
accelerated versions of these methods have not been sufficiently studied.
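For context, a minimal sketch of the smooth case: with objectives $f_1, \dots, f_m$, the steepest descent direction of Fliege and Svaiter at a point $x$ solves
\[
\min_{d \in \mathbb{R}^n} \; \max_{1 \le i \le m} \nabla f_i(x)^\top d + \frac{1}{2} \lVert d \rVert^2,
\]
which reduces to the classical steepest descent direction $d = -\nabla f_1(x)$ when $m = 1$.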
Recently, El Moudden and El Mouatasim proposed a natural extension of
Nesterov's accelerated method for multiobjective optimization problems.
They proved the global convergence rate of the algorithm ($O(1/k^2)$)
under the assumption that the sequence of Lagrange multipliers of the
subproblems is eventually fixed. However, this assumption is restrictive
because it implies that the method is essentially Nesterov's method
applied to a single, fixed weighting problem. In this paper, we propose
a new accelerated algorithm
whose subproblems involve terms that appear only in the multiobjective
case. We also prove the proposed method's global convergence rate of
$O(1/k^2)$ under a more natural assumption, using a merit function to
measure the complexity.
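For instance, a natural merit function in this setting (sketched here as an illustration, with objectives again written $f_1, \dots, f_m$) is
\[
u_0(x) := \sup_{z \in \mathbb{R}^n} \min_{1 \le i \le m} \bigl[ f_i(x) - f_i(z) \bigr],
\]
which is nonnegative and equals zero if and only if $x$ is weakly Pareto optimal; a rate on such a merit function thus plays the role that the objective gap $f(x_k) - f^*$ plays in single-objective analyses.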
Moreover, we present an efficient way to solve the subproblem in the
proposed method via its dual, and we confirm the effectiveness of the
proposed method through numerical experiments.
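To illustrate the dual approach on the simpler smooth subproblem sketched above (a minimal sketch only; the subproblem of the proposed method carries additional multiobjective-specific terms), note that its dual is a quadratic program over the unit simplex: minimize $\frac{1}{2} \lVert \sum_i \lambda_i \nabla f_i(x) \rVert^2$ subject to $\lambda \ge 0$, $\sum_i \lambda_i = 1$, with the primal direction recovered as $d = -\sum_i \lambda_i \nabla f_i(x)$. The function name and problem data below are illustrative.
\begin{verbatim}
import numpy as np
from scipy.optimize import minimize

def descent_direction_via_dual(grads):
    # Solve min_d max_i g_i^T d + 0.5*||d||^2 through its dual:
    #   min_{lam in simplex} 0.5 * || sum_i lam_i g_i ||^2,
    # then recover the primal direction d = -sum_i lam_i g_i.
    # grads: (m, n) array whose rows are the gradients at x.
    m = grads.shape[0]
    G = grads @ grads.T  # Gram matrix of the gradients

    res = minimize(
        lambda lam: 0.5 * lam @ G @ lam,   # dual objective
        np.full(m, 1.0 / m),               # start from uniform weights
        jac=lambda lam: G @ lam,
        method="SLSQP",
        bounds=[(0.0, 1.0)] * m,
        constraints=[{"type": "eq", "fun": lambda lam: lam.sum() - 1.0}],
    )
    return -grads.T @ res.x  # primal direction from the multipliers

# Toy usage: f1(x) = ||x - a||^2, f2(x) = ||x - b||^2.
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
x = np.array([2.0, 2.0])
d = descent_direction_via_dual(np.vstack([2 * (x - a), 2 * (x - b)]))
print(d)  # a common descent direction for both objectives at x
\end{verbatim}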