1. Convergence within a fixed number of iterations
Convergence of the loss and of the validation symbol error rate of the trained models was obtained after fewer than 100,000 iterations, which we used as a fixed stopping criterion.
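A minimal, self-contained sketch of training with such a fixed iteration budget is given below. The toy task (detecting 4-ary symbols in Gaussian noise), the network size, the learning rate and the logging interval are illustrative assumptions, not values from the original text.

    import torch
    import torch.nn as nn

    MAX_ITERATIONS = 100_000      # fixed stopping criterion
    LOG_EVERY = 10_000
    M = 4                         # toy symbol alphabet size (assumption)

    model = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, M))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    def batch(n=64, sigma=0.3):
        sym = torch.randint(0, M, (n,))                            # transmitted symbols
        x = sym.float().unsqueeze(1) + sigma * torch.randn(n, 1)   # noisy observations
        return x, sym

    for it in range(MAX_ITERATIONS):
        x, y = batch()
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        if (it + 1) % LOG_EVERY == 0:
            # monitor loss and validation symbol error rate; training simply
            # stops after MAX_ITERATIONS rather than on an adaptive criterion
            with torch.no_grad():
                xv, yv = batch(10_000)
                ser = (model(xv).argmax(dim=1) != yv).float().mean().item()
            print(f"iter {it + 1}: loss={loss.item():.4f}  validation SER={ser:.3e}")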
2. Choice of network hyper-parameters
When designing ANNs, the choice of hyper-parameters such as the number of layers, the number of nodes per hidden layer, the activation functions, the mini-batch size, and the learning rate is important. The optimization of the hyper-parameters was beyond the scope of our investigation; in this work they were chosen with the goal of keeping the networks relatively small and hence the training effort manageable. Better results, in terms of both performance and its trade-off with complexity, could be obtained with well-designed sets of hyper-parameters (see the sketch below).
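The following sketch simply gathers the hyper-parameters listed above in one place and builds a small MLP from them. The particular values (two hidden layers of 64 nodes, ReLU, mini-batch size 32, learning rate 1e-3) and the input/output sizes are assumptions chosen only to illustrate a deliberately small network, not values from the original text.

    import torch.nn as nn

    hparams = {
        "hidden_layers": 2,        # number of hidden layers
        "hidden_nodes": 64,        # nodes per hidden layer
        "activation": nn.ReLU,     # activation function
        "mini_batch_size": 32,     # fed to the data loader (not used below)
        "learning_rate": 1e-3,     # fed to the optimizer (not used below)
    }

    def build_mlp(n_in, n_out, hp=hparams):
        layers, width = [], n_in
        for _ in range(hp["hidden_layers"]):
            layers += [nn.Linear(width, hp["hidden_nodes"]), hp["activation"]()]
            width = hp["hidden_nodes"]
        layers.append(nn.Linear(width, n_out))
        return nn.Sequential(*layers)

    net = build_mlp(n_in=8, n_out=4)   # example input/output sizes (assumptions)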
3. Band limiting
The bandwidth of the signal is restricted by a 32 GHz low-pass filter to account for the significantly lower bandwidth of today’s hardware.
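Below is a rough sketch of such band limiting with a 32 GHz low-pass filter. The 88 GSa/s sampling rate, the FIR design, the filter order and the toy input signal are assumptions made only to give a concrete, runnable example; the original text does not specify how the filter is implemented.

    import numpy as np
    from scipy.signal import firwin, lfilter

    fs = 88e9          # sampling rate (assumption)
    cutoff = 32e9      # low-pass cutoff from the text
    taps = firwin(numtaps=101, cutoff=cutoff, fs=fs)   # linear-phase FIR low-pass

    rng = np.random.default_rng(0)
    signal = rng.choice([-1.0, 1.0], size=4096)        # toy wideband signal (assumption)
    band_limited = lfilter(taps, 1.0, signal)          # bandwidth restricted to 32 GHz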
4. Negative transfer
Negative transfer happens when knowledge transfer has a negative impact on learning in the target task.
An important issue is to recognize the limits of the power of transfer learning. In [68], Mahmud and Ray analyzed transfer learning using Kolmogorov complexity and proved some theoretical bounds. In particular, the authors used conditional Kolmogorov complexity to measure relatedness between tasks and to transfer the "right" amount of information in a sequential transfer learning task under a Bayesian framework. More recently, Eaton et al. [69] proposed a novel graph-based method for knowledge transfer, in which the relationships between source tasks are modeled by embedding the set of learned source models in a graph using transferability as the metric. Transferring to a new task proceeds by mapping the problem into the graph and then learning a function on this graph that automatically determines the parameters to transfer to the new learning task.
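The toy sketch below only illustrates the intuition of such graph-based transfer: source models are nodes, transferability scores act as edge weights to the target, and the new task is initialised from a transferability-weighted combination of the source parameters. All numbers and the weighted-average rule are made-up assumptions for illustration; this is not a reproduction of the actual method in [69].

    import numpy as np

    # hypothetical source models (parameter vectors) and their estimated
    # transferability to the new target task
    source_params = {
        "src_A": np.array([0.9, 0.1, 0.4]),
        "src_B": np.array([0.2, 0.8, 0.5]),
        "src_C": np.array([0.5, 0.5, 0.1]),
    }
    transferability_to_target = {"src_A": 0.7, "src_B": 0.2, "src_C": 0.1}

    def init_target_params(params, weights):
        # weighted average of source parameters, weights given by transferability
        w = np.array([weights[k] for k in params])
        w = w / w.sum()
        stacked = np.stack([params[k] for k in params])
        return (w[:, None] * stacked).sum(axis=0)

    theta0 = init_target_params(source_params, transferability_to_target)
    print(theta0)   # starting point for learning the new task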