The autoregressive entropy model achieves high compression efficiency by capturing intricate dependencies, but suffers from slow decoding due to its serial context dependencies. To address this, we propose ParaPCAC, a lossy Parallel Point Cloud Attribute Compression scheme designed to optimize the efficiency of the autoregressive entropy model. Our approach consists of two main components: a parallel decoding strategy and a multi-stage context-based entropy model. In the parallel decoding strategy, we partition the voxels of the quantized latent features into non-overlapping groups for independent context entropy modeling, enabling parallel processing. The multi-stage context-based entropy model decodes neighboring features concurrently, utilizing previously decoded features at each stage. A global hyperprior is incorporated after the first stage to improve attribute probability estimation. Through these two techniques, ParaPCAC achieves significant decoding speedups, with an acceleration of up to 160× and a 24.15% BD-Rate reduction compared to serial autoregressive entropy models. Furthermore, experimental results demonstrate that ParaPCAC outperforms existing learning-based methods in both rate-distortion performance and decoding latency.
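The grouped parallel decoding described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the parity-based grouping rule, the `assign_stages` and `staged_decode` names, and the `decode_group` callback are all assumptions introduced here to show how non-overlapping voxel groups permit stage-by-stage decoding with parallelism inside each stage.

```python
import numpy as np

def assign_stages(coords, num_stages=4):
    """Partition voxel coordinates into non-overlapping groups.

    Hypothetical grouping rule (assumption, not the paper's exact
    scheme): the stage index is the sum of the coordinate parities
    modulo num_stages, so voxels in the same group are never
    face-adjacent and can be entropy-decoded in parallel.
    coords: (N, 3) integer voxel coordinates.
    Returns an (N,) array of stage indices in [0, num_stages).
    """
    parity = coords % 2                      # (N, 3) entries in {0, 1}
    return parity.sum(axis=1) % num_stages   # (N,) stage index per voxel

def staged_decode(coords, decode_group, num_stages=4):
    """Decode groups sequentially; voxels within a group in parallel.

    decode_group is a placeholder callback standing in for the
    context-based entropy model: it receives one group's voxel
    indices plus the indices decoded in earlier stages, on which
    the group's probability estimates may be conditioned.
    """
    stages = assign_stages(coords, num_stages)
    decoded = []                             # indices decoded so far
    for s in range(num_stages):
        group = np.flatnonzero(stages == s)
        decode_group(group, decoded)         # parallel within the group
        decoded.extend(group.tolist())
    return decoded
```

Only the loop over stages is serial; each `decode_group` call can process all of its voxels concurrently, which is where the decoding speedup comes from.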
APSIPA Transactions on Signal and Information Processing Special Issue - Three-dimensional Point Cloud Data Modeling, Processing, and Analysis