1) Neuron activation function (Chinese: 神经激励函数)
2) activation function (Chinese: 激励函数)
1. Studies on the activation functions of the BP algorithm.
2. The paper proposes a new kind of NARX network that has trainable activation functions in addition to the trainable weights of conventional NARX networks: a nonlinear autoregressive neural network with exogenous inputs whose hidden-neuron activation functions are adjustable. While the weights are updated, the parameters of each hidden neuron's activation function are adaptively tuned, and a learning algorithm for these parameters is derived, making the NARX network a closer match to biological neural networks (see the sketch after this entry).
3. Previous learning algorithms, to the best of our knowledge, failed to take advantage of all three factors that influence the performance of such networks; the proposed algorithm jointly considers the three main factors affecting neural network performance: the weights, the activation functions, and the topological structure.
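The NARX abstract above (item 2) describes tuning the parameters of each hidden neuron's activation function together with the network weights. The following is a minimal NumPy sketch of that general idea, not the paper's NARX algorithm: each hidden neuron of a one-hidden-layer regression network gets a trainable sigmoid slope that is updated by gradient descent alongside the weights. The toy target function, layer sizes, and learning rate are assumptions made for the example.

# Minimal sketch (assumed example, not the paper's NARX algorithm): each hidden
# neuron's sigmoid has a trainable slope a[j] updated together with the weights.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy regression data (assumed): learn y = sin(x) on [-2, 2].
X = rng.uniform(-2, 2, size=(200, 1))
y = np.sin(X)

n_hidden = 8
W1 = rng.normal(scale=0.5, size=(1, n_hidden))
b1 = np.zeros(n_hidden)
a = np.ones(n_hidden)            # trainable activation slopes, one per hidden neuron
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
b2 = np.zeros(1)

lr = 0.1
for epoch in range(2000):
    z = X @ W1 + b1              # pre-activations, shape (200, n_hidden)
    h = sigmoid(a * z)           # hidden outputs with per-neuron slope a
    out = h @ W2 + b2
    err = out - y                # per-sample gradient of 0.5 * squared error

    dW2 = h.T @ err / len(X)
    db2 = err.mean(axis=0)
    dh = err @ W2.T
    ds = dh * h * (1 - h)        # back through the sigmoid
    da = (ds * z).mean(axis=0)   # gradient w.r.t. the slope parameters
    dz = ds * a
    dW1 = X.T @ dz / len(X)
    db1 = dz.mean(axis=0)

    # Weights and activation-function parameters are updated in the same step.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
    a -= lr * da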
3) active function (Chinese: 激励函数)
1. By analyzing the conventional BP algorithm, this paper proposes a learning method with an improved activation function and uses a different learning rate in each layer of the network to increase training speed. The improved BP algorithm is used to train a multilayer feed-forward neural network that models the inverse kinematics of a manipulator; simulation results demonstrate its effectiveness, and it greatly improves the accuracy of the manipulator inverse kinematics compared with the conventional BP algorithm.
2. A circuit implementation of the sigmoid activation function is introduced. The circuit is built mainly from differential devices; by adjusting the corresponding parameters, the input-voltage range can be tuned and the gain of the activation function changed. The circuit's effectiveness is verified by simulation in an EDA environment (a software sketch of the adjustable gain follows this entry).
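The circuit in item 2 adjusts the input-voltage range and the gain of a sigmoid activation by tuning component parameters. The snippet below is only a software analogue of that behaviour, not a model of the differential circuit itself: a logistic function with assumed gain and in_scale parameters shows how the slope and the usable input range change.

# Software analogue (assumed names, not the differential circuit itself):
# a logistic activation whose gain and input scale are adjustable.
import numpy as np

def sigmoid(x, gain=1.0, in_scale=1.0):
    # Slope at x = 0 is gain / (4 * in_scale); a larger in_scale widens the
    # usable input range, a larger gain sharpens the transition.
    return 1.0 / (1.0 + np.exp(-gain * x / in_scale))

x = np.linspace(-5.0, 5.0, 5)
print(sigmoid(x))                   # standard logistic
print(sigmoid(x, gain=4.0))         # higher gain: steeper transition
print(sigmoid(x, in_scale=5.0))     # wider input range: gentler slope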
4) excitation function (Chinese: 激励函数)
1. A minimization technique for J and K excitation functions based on flip-flop triggering behavior.
2. In this method, the next-state equations obtained from the state table are compared with the standard next-state (characteristic) equation of the flip-flop used in the circuit, and the flip-flop excitation functions are then obtained directly (the JK case is sketched after this entry).
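For the JK flip-flop named in these abstracts, comparing next-state equations with the characteristic equation Q+ = JQ' + K'Q amounts to reading off the standard JK excitation table. The sketch below encodes that table and checks it against the characteristic equation; it illustrates the textbook relationship only, not the minimization technique of the cited papers.

# Standard JK excitation table and a consistency check against the
# JK characteristic equation Q+ = J*Q' + K'*Q; 'd' marks a don't-care.
EXCITATION = {          # (Q, Q+) -> (J, K)
    (0, 0): (0, 'd'),
    (0, 1): (1, 'd'),
    (1, 0): ('d', 1),
    (1, 1): ('d', 0),
}

def next_state(q, j, k):
    # JK characteristic (next-state) equation: Q+ = J*Q' + K'*Q
    return (j & (q ^ 1)) | ((k ^ 1) & q)

for (q, q_plus), (j, k) in EXCITATION.items():
    for jj in ([0, 1] if j == 'd' else [j]):       # expand don't-cares
        for kk in ([0, 1] if k == 'd' else [k]):
            assert next_state(q, jj, kk) == q_plus
print("excitation table is consistent with Q+ = J*Q' + K'*Q")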
5) stimulation function (Chinese: 激励函数)
1. Fuzzy optimum selection theory is combined with neural network theory to determine the topological structure of the network: a reasonable scheme for the number of hidden layers, the number of nodes in each hidden layer, and the node activation function.
2. To speed up the convergence of the BP algorithm, this paper puts forward a new activation function with a clear physical meaning that reflects the optimum-selection nature of the human brain; by dynamically adjusting its parameters and combining it with existing improved BP algorithms, the method is applied to stock-price prediction with satisfactory results.
3. Combining the fuzzy optimum selection theory established by the author with neural network theory, the paper provides a rational scheme for determining the topological structure of the network: the number of hidden layers, the number of nodes in each hidden layer, and the node activation function (see the configuration sketch after this entry).
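Items 1 and 3 treat the network topology as three explicit choices: the number of hidden layers, the number of nodes per hidden layer, and the node activation function. The sketch below simply exposes those three choices as configuration of a small feed-forward network; the fuzzy optimum selection procedure that picks the values is not reproduced, and all names and sizes are assumptions.

# Sketch (assumed helper, not the fuzzy optimum selection procedure): the three
# topology choices are passed in explicitly and define a small feed-forward net.
import numpy as np

ACTIVATIONS = {
    "sigmoid": lambda z: 1.0 / (1.0 + np.exp(-z)),
    "tanh": np.tanh,
}

def build_mlp(n_inputs, hidden_layers, activation, n_outputs=1, seed=0):
    # hidden_layers: list of node counts, one entry per hidden layer.
    rng = np.random.default_rng(seed)
    sizes = [n_inputs] + list(hidden_layers) + [n_outputs]
    weights = [rng.normal(scale=0.5, size=(m, n)) for m, n in zip(sizes, sizes[1:])]
    act = ACTIVATIONS[activation]

    def forward(x):
        for W in weights[:-1]:
            x = act(x @ W)          # hidden layers use the chosen node activation
        return x @ weights[-1]      # linear output layer
    return forward

# One candidate topology: 2 hidden layers of 6 and 4 nodes with tanh activations.
net = build_mlp(n_inputs=3, hidden_layers=[6, 4], activation="tanh")
print(net(np.ones((1, 3))).shape)   # (1, 1)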
6) neuronal activation function (Chinese: 神经元激活函数)
1. Optimizing neuronal activation function types based on GP in constructive FNN design.
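The cited paper optimizes the types of neuron activation functions with genetic programming (GP) during constructive feed-forward network design. As a rough illustration of what searching over activation-function types means, the sketch below uses plain random search over per-neuron types on a toy regression task with a least-squares fit of the output weights; it is not the paper's GP method, and the candidate set, fitness proxy, and sizes are assumptions.

# Toy illustration only (random search, not the paper's GP method): search over
# per-neuron activation types, scoring each assignment with a cheap least-squares fit.
import numpy as np

rng = np.random.default_rng(1)
CANDIDATES = {
    "sigmoid": lambda z: 1.0 / (1.0 + np.exp(-z)),
    "tanh": np.tanh,
    "gauss": lambda z: np.exp(-z * z),
}

# Assumed toy regression task.
X = rng.uniform(-2, 2, size=(100, 1))
y = np.sin(3 * X)

def fitness(act_types, n_trials=3):
    # Mean error of small random-hidden-layer networks whose neurons use the
    # given activation types; only output weights are fitted (lower is better).
    errs = []
    for _ in range(n_trials):
        W1 = rng.normal(size=(1, len(act_types)))
        H = np.column_stack(
            [CANDIDATES[t]((X @ W1)[:, j]) for j, t in enumerate(act_types)]
        )
        w, *_ = np.linalg.lstsq(H, y, rcond=None)
        errs.append(np.mean((H @ w - y) ** 2))
    return float(np.mean(errs))

best_types, best_err = None, np.inf
for _ in range(30):
    types = [str(t) for t in rng.choice(list(CANDIDATES), size=5)]
    err = fitness(types)
    if err < best_err:
        best_types, best_err = types, err
print(best_types, best_err)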