[外文翻譯]反向傳播/backpropagation.rar
Content introduction
[Foreign Literature Translation] 反向傳播 / BACKPROPAGATION
This package contains both the Chinese translation and the original English text; the content is complete, and downloading it for reading is recommended.
① Chinese pages: 23
Chinese character count: 9,903
② English pages: 31
English word count: 36,000
③ Abstract
The perceptron learning rule of Frank Rosenblatt and the LMS algorithm of Bernard Widrow and Marcian Hoff were designed to train single-layer, perceptron-like networks. These single-layer networks suffer from the disadvantage that they are only able to solve linearly separable classification problems. Both Rosenblatt and Widrow were aware of these limitations and proposed multilayer networks that could overcome them, but they were not able to generalize their algorithms to train these more powerful networks.
Apparently the first description of an algorithm to train multilayer networks was contained in the 1974 thesis of Paul Werbos. The thesis presented the algorithm in the context of general networks, with neural networks as a special case, and it was not disseminated within the neural network research community. It was not until the mid-1980s that the backpropagation algorithm was rediscovered and widely publicized; it was rediscovered independently by David Rumelhart, Geoffrey Hinton and Ronald Williams, by David Parker, and by Yann Le Cun. The algorithm was popularized by its inclusion in the book Parallel Distributed Processing, which described the work of the Parallel Distributed Processing group led by psychologists David Rumelhart and James McClelland. The publication of this book set off a wave of neural network research. Today the multilayer perceptron trained with the backpropagation algorithm is the most widely used neural network (a brief illustrative sketch follows the keyword entry below).
④ Keywords: 多層感知機 / Multilayer Perceptron