

Poster

CoDe: An Explicit Content Decoupling Framework for Image Restoration

Enxuan Gu · Hongwei Ge · Yong Guo

Arch 4A-E Poster #269
Wed 19 Jun 10:30 a.m. PDT — noon PDT

Abstract: The performance of image restoration (IR) depends heavily on how well contents of varying complexity are reconstructed. However, most IR approaches model the mapping between input and output contents of differing complexity through a repeated feature-calculation propagation mechanism in a single unified pipeline, which leads to unsatisfactory results. To address this issue, we propose an explicit $\textbf{Co}$ntent $\textbf{De}$coupling framework for IR, dubbed $\textbf{CoDe}$, which models the restoration process end-to-end by operating on decoupled content components in a divide-and-conquer-like architecture. Specifically, a Content Decoupling Module is first designed to decouple the content components of inputs and outputs according to frequency spectra adaptively generated from the transform domain. In addition, to apply the divide-and-conquer strategy to the reconstruction of the decoupled content components, we propose an IR Network Container. It contains an optimized version of an arbitrary IR network, streamlined into cascaded modulated subnets and a Reconstruction Layers Pool. Finally, a Content Consistency Loss is designed from the transform-domain perspective to supervise the restoration of each content component and to further guide the feature fusion process. Extensive experiments on several IR tasks, such as image super-resolution, image denoising, and image deblurring, covering both real and synthetic settings, demonstrate that the proposed paradigm can effectively lift the performance of the original network to a new state-of-the-art level on multiple benchmark datasets (e.g., $\textbf{0.34}$ dB on Set5 $\times4$ over DAT).
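To make the two transform-domain ideas in the abstract concrete, below is a minimal sketch (not the authors' code) assuming PyTorch: content is decoupled into frequency bands with a plain 2D FFT and fixed radial masks, and a spectral L1 loss supervises each decoupled component. The function names, the fixed cutoffs, and the L1 form of the loss are illustrative assumptions; the paper generates the frequency spectra adaptively and restores each component with its own modulated subnet, which this sketch does not reproduce.

import torch
import torch.fft as fft


def radial_masks(h, w, cutoffs, device="cpu"):
    """Binary ring masks over the centered 2D spectrum.

    `cutoffs` are normalized radii in (0, 1]; e.g. (0.15, 1.0) yields a
    low-frequency band and the complementary high-frequency band.
    """
    yy, xx = torch.meshgrid(
        torch.linspace(-1, 1, h, device=device),
        torch.linspace(-1, 1, w, device=device),
        indexing="ij",
    )
    radius = torch.sqrt(yy ** 2 + xx ** 2) / (2 ** 0.5)  # normalized to [0, 1]
    masks, prev = [], 0.0
    for c in cutoffs:
        masks.append(((radius >= prev) & (radius < c)).float())
        prev = c
    masks[-1] += (radius >= cutoffs[-1]).float()  # fold boundary into last band
    return torch.stack(masks)  # (num_bands, h, w)


def decouple_content(x, cutoffs=(0.15, 1.0)):
    """Split an image batch (B, C, H, W) into frequency-band components."""
    spec = fft.fftshift(fft.fft2(x, norm="ortho"), dim=(-2, -1))
    masks = radial_masks(x.shape[-2], x.shape[-1], cutoffs, x.device)
    bands = []
    for m in masks:
        band_spec = fft.ifftshift(spec * m, dim=(-2, -1))
        bands.append(fft.ifft2(band_spec, norm="ortho").real)
    return bands  # one tensor per band; the bands sum back to x


def content_consistency_loss(restored_bands, target_bands):
    """L1 distance between the spectra of matching content components."""
    loss = 0.0
    for r, t in zip(restored_bands, target_bands):
        diff = fft.fft2(r, norm="ortho") - fft.fft2(t, norm="ortho")
        loss = loss + diff.abs().mean()
    return loss / len(restored_bands)


if __name__ == "__main__":
    degraded = torch.rand(2, 3, 64, 64)      # stand-in degraded input
    ground_truth = torch.rand(2, 3, 64, 64)  # stand-in clean target
    inp_bands = decouple_content(degraded)
    gt_bands = decouple_content(ground_truth)
    # In CoDe each band would first be restored by a dedicated modulated
    # subnet; here the input bands are reused only to show the loss interface.
    print(content_consistency_loss(inp_bands, gt_bands).item())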
