1)  function of the self-excitation
自激函数 (self-excitation function)
2)  Excitation function
激发函数 (excitation function)
1. The excitation functions of the dissipative reaction products in the heavy-ion collision 27Al + 27Al were measured over a large continuous angular range, with incident beam energies from 114 MeV to 127 MeV in energy steps of 200 keV.
2. The correlation function method is applied to analyze the fluctuations of the excitation function in dissipative nuclear reactions.
3. A theoretical calculation based on the statistical theory including preequilibrium emission has been carried out for the excitation functions of α + natural Ta, and the measured results have been compared with the calculated ones. The excitation functions of 181Ta(α,n)184Re and 181Ta(α,2n)183Re in the 4 MeV energy region were measured; the experimental results were also compared with those of other authors.
3)  excitation functions
激发函数 (excitation functions)
1. Using a detection system combining the time-of-flight (TOF) method with the ΔE-E technique, the excitation functions, the dependence of the energy coherence width Γ on the product mass A, and the dependence of Γ on the neutron-excess degree of freedom N/Z were measured for some elements and their isotope products in the dissipative process of the 19F + 51V reaction at energies from 102.50 MeV (in energy steps of 250 keV). The excitation functions of isotope products in dissipative collisions were given for the first time, and the dependence of the energy coherence width Γ of the dinuclear system on A and on N/Z was discussed.
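The fluctuation analyses referred to above rest on the normalized energy autocorrelation of the measured cross sections, whose decay width gives the energy coherence width Γ. As a hedged illustration (not code from the cited experiments; a synthetic cross-section series stands in for real data), an Ericson-type correlation function can be computed as:

```python
import numpy as np

def correlation_function(sigma):
    """Normalized energy autocorrelation C(eps) of a cross-section
    series sigma(E) sampled on an evenly spaced energy grid.
    C(0) equals the normalized variance of the fluctuations."""
    mean = sigma.mean()
    n = len(sigma)
    c = [np.mean(sigma[: n - k] * sigma[k:]) / mean**2 - 1.0
         for k in range(n // 2)]
    return np.array(c)

# Synthetic excitation function: a constant cross section plus
# random fluctuations (a stand-in for measured data).
rng = np.random.default_rng(0)
sigma = 1.0 + 0.1 * rng.standard_normal(400)

c = correlation_function(sigma)
# The energy increment at which c falls to half of c[0] estimates
# the coherence width Gamma of the excitation function.
```

The array index plays the role of the energy increment ε in units of the measurement's energy step.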
4)  activating function
激活函数 (activating function)
1. A fast method and formula for computing the activating function were provided; the Hodgkin-Huxley model of nerve fibers under magnetic stimulation was derived, and the relationship between neural excitation and the circuit parameters of the magnetic stimulator was established.
2. A modified cable equation and an activating function are obtained to describe the response of the neuraxon under a magnetic field. On the basis of the traditional cable equation, the effect of the radial electric field is added, and simulation results verify the correctness of the model.
3. The selection of the learning method, the numbers of hidden layers and hidden-layer units, the learning step length, the learning samples, and the activating function, as well as methods for avoiding local minima, are discussed in detail.
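For the magnetic-stimulation entries above, a common formulation (due to Rattay) takes the activating function as the second spatial difference of the extracellular potential along the fiber. The following sketch uses a hypothetical Gaussian potential profile and is an illustration under that assumption, not the cited papers' method:

```python
import numpy as np

def activating_function(v_e, dx):
    """Second spatial difference of the extracellular potential v_e
    sampled at spacing dx along a fiber (Rattay-style formulation,
    up to a fiber-dependent scale factor)."""
    return (v_e[:-2] - 2.0 * v_e[1:-1] + v_e[2:]) / dx**2

# Hypothetical extracellular potential along the axon (arbitrary units).
x = np.linspace(-1.0, 1.0, 201)
v_e = np.exp(-x**2)

f = activating_function(v_e, x[1] - x[0])
# Regions where f > 0 are candidate excitation (depolarization) sites;
# directly under the potential peak the activating function is negative.
```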
5)  activation function
激励函数 (activation function)
1. Studies on the activation functions of the BP algorithm.
2. The paper proposes a new kind of NARX network, which has trainable activation functions in addition to the trainable weights of conventional NARX networks: the parameters of each hidden neuron's activation function are adapted while the weights are adjusted, and a learning algorithm for the activation-function parameters is derived, making the NARX network closer to biological neural networks.
3. Previous learning algorithms, to the best of our knowledge, failed to take advantage of all three main factors that influence the performance of a neural network; the proposed algorithm considers all three: the weights, the activation functions, and the topological structure.
6)  activation function
激活函数 (activation function)
1. This paper presents improvements to the convergence criterion and the activation function of the traditional BP (back-propagation) neural network algorithm, together with measures to prevent oscillation, accelerate convergence, and avoid falling into local minima.
2. Perfect artificial neural network (ANN) learning should include the optimization of the types of neural activation functions; optimizing only the network weights, as is traditional in ANN learning, is not consistent with biology. Taking the design of a typical feedforward network as an example, the importance of optimizing the neurons' activation-function types during network learning is further discussed.
3. This paper studies the influence of different activation functions on the convergence speed of the BP algorithm and concludes that a combined activation function can improve the convergence speed.
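The convergence observations above come down to the derivative of the activation function, which scales the backpropagated error. A minimal sketch (illustrative only, not the cited papers' code) comparing the logistic sigmoid and tanh gradients:

```python
import numpy as np

# Two classic BP-network activation functions and their derivatives.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_grad(x):
    return 1.0 - np.tanh(x) ** 2

# The derivative scales the error signal during backpropagation:
# a steeper slope near the operating point generally speeds convergence,
# one reason tanh (maximum slope 1.0 at 0) often trains faster than
# the logistic sigmoid (maximum slope 0.25 at 0).
print(sigmoid_grad(0.0))  # 0.25
print(tanh_grad(0.0))     # 1.0
```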