Open Access

| | |
|---|---|
| Issue | JNWPU, Volume 40, Number 6, December 2022 |
| Page(s) | 1414 - 1421 |
| DOI | https://doi.org/10.1051/jnwpu/20224061414 |
| Published online | 10 February 2023 |