1) Multi-level transfer function (多层激励函数)
1. An approach to facial expression recognition based on multi-level transfer function quantum neural networks (QNN) combined with multi-layer classifiers is presented in order to improve the recognition rate and reliability.
2. To address pattern-recognition problems in which data from different classes overlap, a recognition algorithm based on the multi-level transfer function quantum neural network (QNN) is presented.
3. An approach to handwritten digit recognition based on multi-level transfer function quantum neural networks (QNN) and multi-layer classifiers is presented; the MNIST database is used for training and testing.
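The multi-level transfer function in the entries above is commonly formed as a superposition of sigmoids shifted by quantum-level thresholds, so a hidden neuron responds in graded steps rather than with a single 0/1 transition. A minimal sketch under that assumption (the thresholds and steepness value are illustrative, not taken from the cited papers):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def multilevel_transfer(x, thetas, beta=5.0):
    """Multi-level (quantum) transfer function: the average of several
    sigmoids, each shifted by a quantum-level threshold theta_s, giving
    a graded staircase response rather than a single 0/1 step."""
    return sum(sigmoid(beta * (x - th)) for th in thetas) / len(thetas)

# Three illustrative quantum levels:
thetas = [-1.0, 0.0, 1.0]
print(round(multilevel_transfer(-5.0, thetas), 3))  # near 0
print(round(multilevel_transfer(0.0, thetas), 3))   # middle level, 0.5
print(round(multilevel_transfer(5.0, thetas), 3))   # near 1
```

With `ns` thresholds the output plateaus at roughly `k/ns` levels, which is what lets such neurons separate overlapping class data more finely than an ordinary sigmoid.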
2) multi-wavelet incentive function (多层小波激励函数)
1. The hidden-layer quantum neurons of the model use a linear superposition of wavelet basis functions as the incentive function, called a multi-wavelet incentive function; such hidden-layer neurons can not only represent more states and magnitudes but also improve the convergence precision and speed of the network.
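A linear superposition of wavelet basis functions, as described above, can be sketched with the Morlet mother wavelet as one common choice; the shifts, scales, and mixing weights below are illustrative stand-ins for what the network would actually train:

```python
import math

def morlet(x):
    """Morlet mother wavelet, a common basis choice in wavelet networks."""
    return math.cos(1.75 * x) * math.exp(-x * x / 2.0)

def multi_wavelet_activation(x, shifts, scales, weights):
    """Linear superposition of dilated/translated wavelets:
    sum_s w_s * psi((x - b_s) / a_s).  The shifts b_s, scales a_s,
    and weights w_s are illustrative trainable parameters."""
    return sum(w * morlet((x - b) / a)
               for b, a, w in zip(shifts, scales, weights))

# One hidden neuron with three superposed wavelet terms:
val = multi_wavelet_activation(0.5,
                               shifts=[-1.0, 0.0, 1.0],
                               scales=[1.0, 0.5, 1.0],
                               weights=[0.3, 0.4, 0.3])
```

Because each wavelet term is localized in both position and scale, one such neuron can encode several distinct response regions, which is the sense in which it "represents more states and magnitudes" than a single sigmoid.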
3) activation function (激励函数)
1. Studies on the activation functions of the BP algorithm.
2. The paper proposes a new kind of nonlinear autoregressive network with exogenous inputs (NARX) whose hidden-layer activation functions are trainable, whereas conventional NARX networks train only the weights; a learning algorithm for the activation-function parameters is derived, making the network closer to biological neural networks.
3. Previous learning algorithms, to the best of our knowledge, failed to take advantage of all three factors that influence their efficiency; the proposed algorithm jointly considers the three main factors affecting neural-network performance: weights, activation functions, and topological structure.
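A trainable activation function of the kind described above can be illustrated with a sigmoid whose slope and bias are themselves updated by gradient descent alongside the weights; this is a generic sketch, not the specific learning rule of the cited papers:

```python
import math

def sigmoid(x, a, b):
    """Sigmoid with trainable slope a and bias b (illustrative form)."""
    return 1.0 / (1.0 + math.exp(-(a * x + b)))

def grad_step(x, target, a, b, lr=0.1):
    """One gradient step on (a, b) for squared error E = 0.5*(y - t)^2,
    the kind of update a trainable-activation learning rule performs."""
    y = sigmoid(x, a, b)
    d = (y - target) * y * (1.0 - y)   # dE/d(net)
    return a - lr * d * x, b - lr * d

a, b = 1.0, 0.0
for _ in range(200):
    a, b = grad_step(2.0, 0.9, a, b)   # fit output 0.9 at input 2.0
```

Adapting the activation parameters gives the network extra degrees of freedom beyond the weights, which is the motivation both entries share.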
4) active function (激励函数)
1. By analyzing the conventional BP algorithm, an algorithm with an improved active function is proposed, and a different learning rate is used in each layer to increase training speed; the improved BP algorithm is used to train a multilayer feedforward network that models the inverse kinematics of a manipulator, and simulation results show the algorithm to be effective and to give much higher inverse-kinematics accuracy than the conventional BP algorithm.
2. A circuit implementation of the sigmoid active function is introduced; built mainly from differential devices, it allows the input-voltage range and the gain of the active function to be tuned by adjusting the corresponding parameters, and its validity is verified by simulation in an EDA environment.
5) excitation function (激励函数)
1. A minimization technique for the J and K excitation functions based on flip-flop behavior.
2. With this method, the next-state equations obtained from the state table are compared with the standard next-state equation of the flip-flop used in the circuit, and the excitation functions of the flip-flop are then obtained directly.
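Comparing a next-state equation with the JK characteristic equation Q⁺ = J·Q̄ + K̄·Q, as the entry above describes, yields the standard JK excitation table. A small sketch of that table as a lookup (the tabulated values follow directly from the characteristic equation):

```python
def jk_excitation(q, q_next):
    """Return the (J, K) inputs that drive a JK flip-flop from present
    state q to next state q_next; 'x' denotes a don't-care.
    Derived from the characteristic equation Q+ = J*~Q + ~K*Q."""
    table = {
        (0, 0): (0, 'x'),  # hold 0: J=0, K=don't care
        (0, 1): (1, 'x'),  # set:    J=1, K=don't care
        (1, 0): ('x', 1),  # reset:  J=don't care, K=1
        (1, 1): ('x', 0),  # hold 1: J=don't care, K=0
    }
    return table[(q, q_next)]

print(jk_excitation(0, 1))  # (1, 'x')
```

The don't-care entries are what the minimization technique in example 1 exploits when simplifying the J and K excitation functions.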
6) stimulation function (激励函数)
1. The theory of fuzzy optimum selection is combined with neural-network theory to determine the topological structure of the network: the number of hidden layers, the number of nodes in each hidden layer, and a reasonable form for the node stimulation function.
2. To speed up the convergence of the BP algorithm, a new stimulation function with a clear physical meaning, reflecting the optimum-seeking nature of the human brain, is proposed; by dynamically adjusting its parameters and combining it with existing improved BP algorithms, satisfactory results are obtained in stock-price prediction.
3. Combining the fuzzy optimum selection theory established by the author with neural-network theory, a reasonable pattern is proposed for determining the topological structure of the network: the number of hidden layers, the number of hidden-layer nodes, and the node stimulation function.
Supplementary material: multi-layer deposit (多层沉积层)
Molecular formula:
CAS No.:
Properties: a deposit consisting of two or more metals deposited in succession. The layers may be the same metal with different properties or different metals. For example, duplex or triplex nickel deposits are sometimes used to improve the protective qualities of nickel coatings, and copper/nickel/chromium triple-layer coatings are used for protective-decorative purposes.