Analysis of Deep Complex-Valued Convolutional Neural Networks for MRI Reconstruction and Phase-Focused Applications

Title: Analysis of Deep Complex-Valued Convolutional Neural Networks for MRI Reconstruction and Phase-Focused Applications

Link: https://www.researchgate.net/publication/340475482_Analysis_of_Deep_Complex-Valued_Convolutional_Neural_Networks_for_MRI_Reconstruction

Parts useful to me:

1. Compares the performance of different complex-valued activation functions

2. Evaluates how much the U-Net architecture contributes

3. Pays attention to tasks with a high demand for phase information, such as water-fat separation

4. The code is open source

5. Still points out the problem that DL-generated images look overly smooth

6. Argues that complex-valued network architectures also apply to ultrasound

7. Tests the effect of network depth and width

Network architecture:

FIGURE 1 A, One iteration of the unrolled network based on the iterative shrinkage-thresholding algorithm [7,8]. This consists of an update block, which uses the MRI model to enforce data consistency with the physically measured k-space samples. Then, a residual structure block is used to denoise the input image to produce the output image y_{m+1}. Each convolutional layer except for the last is followed by a ReLU and a complex-valued activation function (see Methods section B). B, The second reconstruction network architecture, which is based on the original U-Net for segmentation [15]. Every orange box depicts a multi-channel feature map. The number of channels is denoted on top of each feature map representation. Each arrow denotes a different operation, as depicted by the right-hand legend. This network uses contracting and expanding paths to capture information. Note: ReLU, rectified linear unit
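The update-plus-denoise structure in panel A maps onto a few lines of code. Below is a minimal NumPy sketch (my own illustration, not the authors' released code) of one unrolled iteration: a gradient step that enforces consistency with the measured k-space samples, followed by a residual denoising block. The names `mask`, `k0`, `t`, and the placeholder `cnn` are assumptions for the sketch.

```python
import numpy as np

def data_consistency_update(y, k0, mask, t=1.0):
    """Gradient step on ||M F y - k0||^2: pulls the current image estimate
    back toward the physically measured k-space samples."""
    k = np.fft.fft2(y, norm="ortho")       # forward model (single coil, Cartesian)
    residual = mask * (k - k0)             # error only where samples were acquired
    return y - t * np.fft.ifft2(residual, norm="ortho")

def residual_denoise(z, cnn):
    """Residual structure block: the network predicts a correction that is
    added back to its input, so it only has to learn the residual."""
    return z + cnn(z)

def unrolled_iteration(y_m, k0, mask, cnn, t=1.0):
    """One iteration in the spirit of Figure 1A: update block, then denoiser."""
    z = data_consistency_update(y_m, k0, mask, t)
    return residual_denoise(z, cnn)

# Toy usage with an identity placeholder in place of the trained
# complex-valued CNN denoiser.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
    mask = rng.random((64, 64)) < 0.3                 # 30% random sampling
    k0 = mask * np.fft.fft2(x, norm="ortho")          # measured k-space
    y = np.fft.ifft2(k0, norm="ortho")                # zero-filled start
    for _ in range(5):
        y = unrolled_iteration(y, k0, mask, cnn=lambda z: np.zeros_like(z))
```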

Results

1. Activation functions: CReLU performs best.
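CReLU simply applies a real ReLU to the real and imaginary channels separately. For reference, here is a NumPy sketch of CReLU together with other complex activations commonly compared in this literature (modReLU, zReLU, cardioid); the exact set and parameterization used in the paper may differ.

```python
import numpy as np

def crelu(z):
    """CReLU: apply ReLU to the real and imaginary parts independently."""
    return np.maximum(z.real, 0) + 1j * np.maximum(z.imag, 0)

def zrelu(z):
    """zReLU: pass z through only when its phase lies in the first quadrant."""
    keep = (z.real > 0) & (z.imag > 0)
    return np.where(keep, z, 0)

def modrelu(z, b=-0.1, eps=1e-9):
    """modReLU: threshold the magnitude with a (learnable) bias b, keep the phase."""
    mag = np.abs(z)
    return np.maximum(mag + b, 0) * z / (mag + eps)

def cardioid(z):
    """Cardioid: phase-dependent scaling, a complex generalization of ReLU."""
    return 0.5 * (1 + np.cos(np.angle(z))) * z
```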

2. Effect of network depth and width on the results

For both the real- and complex-valued networks, performance improves as width and depth increase.

Supporting Information Figure S2. Here, we replotted Figure 3a by making the x-axis the number of total feature maps instead of the number of total parameters. We applied a linear trendline to each series. Here, we can see that the performance of the real-valued model actually never surpasses the complex-valued model, despite having the same or greater number of feature maps. Additionally, the feature map size, feature map properties, and number of unrolled iterations were equivalent for these experiments.

FIGURE 3B, Performance of the unrolled network as a function of network depth on a test dataset. Here, the number of feature maps is kept constant at 128 and 90 for the complex and real networks, respectively, whereas the number of iterations is varied for each network. The number of iterations in compressed sensing does not change; however, its performance is plotted for reference. 
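A detail worth keeping in mind when reading the width comparison: in the common implementation, a complex-valued convolution uses two real kernels (one for the real part and one for the imaginary part of the weights), so at the same number of feature maps it holds roughly twice as many real parameters as a real-valued convolution. The hypothetical helper below (my own, not from the paper) makes the difference between the two x-axes (total parameters vs. total feature maps) concrete.

```python
def conv_params(c_in, c_out, k=3, complex_valued=False, bias=True):
    """Rough real-parameter count of one 2D conv layer.

    Assumes the common implementation of a complex convolution as two real
    kernels W_r and W_i (~2x the weights of a real conv with the same number
    of feature maps); exact counts depend on the framework.
    """
    weights = k * k * c_in * c_out
    factor = 2 if complex_valued else 1
    b = (2 * c_out if complex_valued else c_out) if bias else 0
    return factor * weights + b

# Same number of feature maps, very different parameter budgets:
print(conv_params(128, 128, complex_valued=True))   # complex layer
print(conv_params(128, 128, complex_valued=False))  # real layer
```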

3. U-Net (differences in performance between the complex- and real-valued versions)

| Metric | Complex-valued CNN (± SD) | Real-valued CNN (± SD) | Comparison |
| --- | --- | --- | --- |
| PSNR (dB) | 35.28 ± 2.34 | 35.00 ± 1.95 | Complex-valued CNN better |
| NRMSE | 0.16 ± 0.03 | 0.17 ± 0.02 | Complex-valued CNN better |
| SSIM | 0.90 ± 0.05 | 0.90 ± 0.02 | Comparable |
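For reference, the three metrics in the table can be computed on magnitude images with scikit-image. This is a minimal sketch; the paper's exact evaluation settings (normalization, data range, ROI) may differ.

```python
import numpy as np
from skimage.metrics import (peak_signal_noise_ratio,
                             normalized_root_mse,
                             structural_similarity)

def evaluate(reference, reconstruction):
    """Compute PSNR (dB), NRMSE, and SSIM between two complex-valued images,
    compared on their magnitudes."""
    ref = np.abs(reference)
    rec = np.abs(reconstruction)
    data_range = ref.max() - ref.min()
    return {
        "PSNR (dB)": peak_signal_noise_ratio(ref, rec, data_range=data_range),
        "NRMSE": normalized_root_mse(ref, rec),
        "SSIM": structural_similarity(ref, rec, data_range=data_range),
    }
```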

"Complex-Valued Neural Networks: Theories and Applications" is an e-book that introduces the theory and applications of complex-valued neural networks. It first presents the basic concepts and mathematical foundations, including the representation of complex numbers, their arithmetic rules, and the construction of complex-valued neurons. It then elaborates on applications in signal processing, image recognition, and natural language processing, and compares the performance of complex-valued and real-valued networks.

On the theory side, the book explains the advantages of complex-valued networks over real-valued ones, such as their ability to handle non-stationary signals and their sensitivity to phase information. It also covers their role in frequency-domain feature extraction, phase encoding and demodulation, and their stability and robustness in challenging environments.

On the application side, it covers modulation and demodulation, adaptive filtering, and channel equalization in communication systems; phase extraction, transforms, and compression in image processing; and word-vector representation, semantic analysis, and sentiment recognition in natural language processing.

In short, "Complex-Valued Neural Networks: Theories and Applications" is a systematic and comprehensive introduction to the theory and applications of complex-valued neural networks, and a useful reference for researchers, engineers, and students who want to understand the underlying principles and apply them in practice.