MLIC++: Linear Complexity Multi-Reference Entropy Modeling for Learned Image Compression
CoRR (2023)
Abstract
Recently, learned image compression has achieved impressive performance. The
entropy model, which estimates the distribution of the latent representation,
plays a crucial role in enhancing rate-distortion performance. However,
existing global context modules rely on computationally intensive quadratic
complexity computations to capture global correlations. This quadratic
complexity imposes limitations on the potential of high-resolution image
coding. Moreover, effectively capturing local, global, and channel-wise
contexts with acceptable, even linear, complexity within a single entropy
model remains a challenge. To address these limitations, we propose the Linear
Complexity Multi-Reference Entropy Model (MEM++). MEM++ effectively captures
the diverse range of correlations inherent in the latent representation.
Specifically, the latent representation is first divided into multiple slices.
When compressing a particular slice, the previously compressed slices serve as
its channel-wise contexts. To capture local contexts without sacrificing
performance, we introduce a novel checkerboard attention module. Additionally,
to capture global contexts, we propose linear-complexity attention-based
global correlation capturing, which leverages the decomposition of the softmax
operation.
operation. The attention map of the previously decoded slice is implicitly
computed and employed to predict global correlations in the current slice.
Based on MEM++, we propose the image compression model MLIC++. Extensive
experimental evaluations demonstrate that our MLIC++ achieves state-of-the-art
performance, reducing BD-rate by 13.39% compared to VTM-17.0 in PSNR.
Furthermore, MLIC++ exhibits linear GPU memory consumption
with resolution, making it highly suitable for high-resolution image coding.
Code and pre-trained models are available at
https://github.com/JiangWeibeta/MLIC.
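
The key idea behind the linear-complexity global context is that softmax attention can be decomposed so the key-value product is computed first, turning the O(n²) cost in sequence length n into O(n·d²) in feature dimension d. The sketch below is a minimal NumPy illustration of this decomposition (applying softmax separately over the query feature axis and the key sequence axis); it is not the authors' implementation, and all function names are illustrative.

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def quadratic_attention(Q, K, V):
    # Standard attention: materializes an n x n map, O(n^2) in n.
    return softmax(Q @ K.T / np.sqrt(Q.shape[-1]), axis=-1) @ V

def linear_attention(Q, K, V):
    # Decomposed form: softmax Q over the feature axis and K over the
    # sequence axis, then compute (K^T V) first -- cost O(n * d^2).
    Qn = softmax(Q, axis=-1)   # row-wise over features
    Kn = softmax(K, axis=0)    # column-wise over sequence positions
    return Qn @ (Kn.T @ V)     # (n,d) @ ((d,n) @ (n,d)) -> (n,d)

rng = np.random.default_rng(0)
n, d = 1024, 32                # e.g. flattened spatial positions, channels
Q, K, V = rng.normal(size=(3, n, d))
out = linear_attention(Q, K, V)
print(out.shape)  # (1024, 32)
```

Because no n×n attention map is ever materialized, memory grows linearly with the number of spatial positions, which matches the linear GPU memory behavior claimed for high-resolution coding.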
Keywords
entropy, complexity, multi-reference