<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="style/detail_T.xsl"?>
<bibitem type="J">   <ARLID>0090278</ARLID> <utime>20240903170617.2</utime><mtime>20071127235959.9</mtime>         <title language="eng" primary="1">Neuromorphic features of probabilistic neural networks</title>  <specification> <page_count>16 s.</page_count> </specification>   <serial><ARLID>cav_un_epca*0297163</ARLID><ISSN>0023-5954</ISSN><title>Kybernetika</title><part_num/><part_title/><volume_id>43</volume_id><volume>5 (2007)</volume><page_num>697-712</page_num><publisher><place/><name>Ústav teorie informace a automatizace AV ČR, v. v. i.</name><year/></publisher></serial>   <title language="cze" primary="0">Neuromorfní vlastnosti pravděpodobnostních neuronových sítí</title>    <keyword>probabilistic neural networks</keyword>   <keyword>distribution mixtures</keyword>   <keyword>sequential EM algorithm</keyword>   <keyword>pattern recognition</keyword>    <author primary="1"> <ARLID>cav_un_auth*0101091</ARLID> <name1>Grim</name1> <name2>Jiří</name2> <institution>UTIA-B</institution> <full_dept>Department of Pattern Recognition</full_dept>  <fullinstit>Ústav teorie informace a automatizace AV ČR, v. v. i.</fullinstit> </author>        <cas_special> <project> <project_id>507752</project_id> <country>XE</country>   <agency>EC</agency> <ARLID>cav_un_auth*0200689</ARLID> </project> <project> <project_id>GA102/07/1594</project_id> <agency>GA ČR</agency> <ARLID>cav_un_auth*0228611</ARLID> </project> <project> <project_id>1M0572</project_id> <agency>GA MŠk</agency> <ARLID>cav_un_auth*0001814</ARLID> </project> <project> <project_id>2C06019</project_id> <agency>GA MŠk</agency> <country>CZ</country> <ARLID>cav_un_auth*0216518</ARLID> </project> <research> <research_id>CEZ:AV0Z10750506</research_id> </research>  <abstract language="eng" primary="1">We summarize the main results on probabilistic neural networks recently published in a series of papers. 
Considering the framework of statistical pattern recognition, we assume approximation of the class-conditional distributions by finite mixtures of product components. The probabilistic neurons correspond to the mixture components and can be interpreted in neurophysiological terms. In this way we can identify a possible theoretical background of the functional properties of neurons. For example, the general formula for synaptic weights provides a statistical justification of the well-known Hebbian principle of learning. Similarly, the mean effect of lateral inhibition can be expressed by means of a formula proposed by Perez as a measure of the dependence tightness of the involved variables.</abstract> <abstract language="cze" primary="0">Souhrnná práce o pravděpodobnostních neuronových sítích, které nabízejí alternativní řešení problému výběru příznaků (podprostorový přístup) a jsou široce použitelné pro řešení mnohorozměrných úloh klasifikace s omezenými datovými soubory.</abstract> <reportyear>2008</reportyear> <RIV>IN</RIV> <unknown tag="mrcbC52"> 4 O 4o 20231122133652.2 </unknown> <permalink>http://hdl.handle.net/11104/0151218</permalink> <unknown tag="mrcbT16-f">0.464</unknown> <unknown tag="mrcbT16-g">0.044</unknown> <unknown tag="mrcbT16-h">8.9</unknown> <unknown tag="mrcbT16-i">0.0021</unknown> <unknown tag="mrcbT16-j">0.359</unknown> <unknown tag="mrcbT16-k">329</unknown> <unknown tag="mrcbT16-l">68</unknown> <unknown tag="mrcbT16-q">21</unknown> <unknown tag="mrcbT16-s">1.071</unknown> <unknown tag="mrcbT16-y">14.83</unknown> <unknown tag="mrcbT16-x">0.67</unknown> <arlyear>2007</arlyear> <unknown tag="mrcbTft"> Soubory v repozitáři: 0090278.pdf </unknown> <unknown tag="mrcbU63"> cav_un_epca*0297163 Kybernetika 0023-5954 Roč. 43 č. 5 2007 697 712 Ústav teorie informace a automatizace AV ČR, v. v. i. </unknown> </cas_special> </bibitem>