Successive Approximation Training Algorithm with Increasing Hidden Units (HU) for Feedforward Neural Networks
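A minimal sketch of this constructive idea, assuming a one-hidden-layer network retrained with one more hidden unit until the training error meets a tolerance. The function names (fit_mlp, train_with_increasing_hu), the retrain-from-scratch strategy, and all hyperparameters are illustrative assumptions, not the paper's exact successive-approximation update.

import numpy as np

def fit_mlp(X, y, n_hidden, epochs=2000, lr=0.05, seed=0):
    """Train a one-hidden-layer tanh network by plain gradient descent (illustrative)."""
    rng = np.random.default_rng(seed)
    y = y.reshape(-1, 1)
    n_in = X.shape[1]
    W1 = rng.normal(0.0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, (n_hidden, 1));    b2 = np.zeros(1)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)              # hidden-layer activations
        err = H @ W2 + b2 - y                 # output residual
        gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
        dH = (err @ W2.T) * (1.0 - H ** 2)    # backpropagate through tanh
        gW1 = X.T @ dH / len(X);  gb1 = dH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
    return (W1, b1, W2, b2), mse

def train_with_increasing_hu(X, y, tol=1e-3, max_units=20):
    """Add hidden units one at a time until the training MSE falls below tol."""
    for n_hidden in range(1, max_units + 1):
        params, mse = fit_mlp(X, y, n_hidden)
        if mse <= tol:
            break
    return params, n_hidden, mse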
Merging and deletion of hidden units are also introduced into the process of adjusting the data centers and spread constants.
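An illustrative sketch of the merge and delete operations on an RBF hidden layer, assuming that near-coincident data centers are averaged and that units with a negligible output weight are removed; the thresholds merge_dist and prune_w are assumptions, not values from the cited work.

import numpy as np

def merge_and_prune(centers, spreads, weights, merge_dist=0.1, prune_w=1e-3):
    """centers: (H, d) data centers, spreads: (H,) spread constants, weights: (H,) output weights."""
    H = len(centers)
    keep = np.ones(H, dtype=bool)
    # merge pass: combine hidden units whose data centers nearly coincide
    for i in range(H):
        if not keep[i]:
            continue
        for j in range(i + 1, H):
            if keep[j] and np.linalg.norm(centers[i] - centers[j]) < merge_dist:
                centers[i] = (centers[i] + centers[j]) / 2   # averaged center
                spreads[i] = max(spreads[i], spreads[j])      # keep the wider spread
                weights[i] += weights[j]                       # sum the output contributions
                keep[j] = False                                # drop unit j
    # delete pass: remove units that barely contribute to the network output
    keep &= np.abs(weights) > prune_w
    return centers[keep], spreads[keep], weights[keep]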
By alternately pruning redundant input features and hidden units, the network architecture is kept relatively well balanced throughout the feature-selection process.
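A hedged sketch of the alternating pruning idea, using the magnitude of outgoing weights as a crude redundancy score for input features and hidden units; in practice the network would be retrained between pruning passes, and the threshold eps and helper name alternate_prune are purely illustrative.

import numpy as np

def alternate_prune(W1, W2, eps=1e-2, rounds=5):
    """W1: (n_in, n_hid) input-to-hidden weights, W2: (n_hid, n_out) hidden-to-output weights.
    Returns boolean masks of the input features and hidden units that are kept."""
    in_keep = np.ones(W1.shape[0], dtype=bool)
    hid_keep = np.ones(W1.shape[1], dtype=bool)
    for _ in range(rounds):
        # prune input features whose outgoing weights to surviving hidden units are tiny
        in_score = np.abs(W1[:, hid_keep]).sum(axis=1)
        in_keep &= in_score > eps
        # then prune hidden units that are both weakly driven and weakly used
        hid_score = np.abs(W1[in_keep]).sum(axis=0) * np.abs(W2).sum(axis=1)
        hid_keep &= hid_score > eps
    return in_keep, hid_keep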
The inputs and outputs of the three waves are connected through only a small number of hidden units, and the boundaries between neighbouring waves are linked through the hidden units of each wave.
Finally, the hidden-layer quality factor, the hidden-layer efficiency coefficient, the hidden-unit redundancy, and the hidden-layer evaluation factor were defined, and the rationality and validity of the evaluation method were verified by examining several typical feedforward neural networks.