<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="style/detail_T.xsl"?>
<bibitem type="J">   <ARLID>0449029</ARLID> <utime>20240103210937.7</utime><mtime>20151023235959.9</mtime>   <WOS>000358811700008</WOS> <SCOPUS>84938521589</SCOPUS>  <DOI>10.1239/jap/1437658607</DOI>           <title language="eng" primary="1">Sample-Path Optimal Stationary Policies in Stable Markov Decision Chains with Average Reward Criterion</title>  <specification> <page_count>20 s.</page_count> <media_type>P</media_type> </specification>   <serial><ARLID>cav_un_epca*0256875</ARLID><ISSN>0021-9002</ISSN><title>Journal of Applied Probability</title><part_num/><part_title/><volume_id>52</volume_id><volume>2 (2015)</volume><page_num>419-440</page_num><publisher><place/><name>Cambridge University Press</name><year/></publisher></serial>    <keyword>Dominated Convergence theorem for the expected average criterion</keyword>   <keyword>Discrepancy function</keyword>   <keyword>Kolmogorov inequality</keyword>   <keyword>Innovations</keyword>   <keyword>Strong sample-path optimality</keyword>    <author primary="1"> <ARLID>cav_un_auth*0307645</ARLID> <name1>Cavazos-Cadena</name1> <name2>R.</name2> <country>MX</country>  <share>50</share> </author> <author primary="0"> <ARLID>cav_un_auth*0238984</ARLID> <name1>Montes-de-Oca</name1> <name2>R.</name2> <country>MX</country>  <share>25</share> </author> <author primary="0"> <ARLID>cav_un_auth*0101196</ARLID> <name1>Sladký</name1> <name2>Karel</name2> <full_dept language="cz">Ekonometrie</full_dept> <full_dept>Department of Econometrics</full_dept> <department language="cz">E</department> <department>E</department> <institution>UTIA-B</institution>  <share>25</share> <fullinstit>Ústav teorie informace a automatizace AV ČR, v. v. i.</fullinstit> </author>
<source> <url>http://library.utia.cas.cz/separaty/2015/E/sladky-0449029.pdf</url> </source>        <cas_special> <project> <project_id>171396</project_id> <agency>GA AV ČR</agency> <country>CZ</country> <ARLID>cav_un_auth*0307567</ARLID> </project>  <abstract language="eng" primary="1">This work concerns discrete-time Markov decision chains with denumerable state and compact action sets. Besides standard continuity requirements, the main assumption on the model is that it admits a Lyapunov function m. In this context the average reward criterion is analyzed from the sample-path point of view. The main conclusion is that, if the expected average reward associated with m^2 is finite under any policy, then a stationary policy obtained from the optimality equation in the standard way is sample-path average optimal in a strong sense.</abstract>     <reportyear>2016</reportyear>  <RIV>BC</RIV>      <num_of_auth>3</num_of_auth>  <inst_support> RVO:67985556 </inst_support>  <permalink>http://hdl.handle.net/11104/0250631</permalink>  <cooperation> <ARLID>cav_un_auth*0320900</ARLID> <name>Universidad Autónoma Metropolitana, Campus Iztapalapa, Avenida San Rafael Atlixco #186, Colonia Vicentina, México 09340</name> <country>MX</country> </cooperation>  <confidential>S</confidential>          <unknown tag="mrcbT16-e">STATISTICS&amp;PROBABILITY</unknown> <unknown tag="mrcbT16-f">0.864</unknown> <unknown tag="mrcbT16-g">0.131</unknown> <unknown tag="mrcbT16-h">999.9</unknown> <unknown tag="mrcbT16-i">0.00534</unknown> <unknown tag="mrcbT16-j">0.784</unknown> <unknown tag="mrcbT16-k">2677</unknown> <unknown tag="mrcbT16-s">0.724</unknown> <unknown tag="mrcbT16-4">Q2</unknown> <unknown tag="mrcbT16-5">0.618</unknown> <unknown tag="mrcbT16-6">84</unknown> <unknown tag="mrcbT16-7">Q3</unknown> <unknown tag="mrcbT16-B">50.157</unknown> <unknown tag="mrcbT16-C">34.6</unknown> <unknown tag="mrcbT16-D">Q2</unknown> <unknown tag="mrcbT16-E">Q2</unknown> 
<unknown tag="mrcbT16-P">34.553</unknown> <arlyear>2015</arlyear>       <unknown tag="mrcbU14"> 84938521589 SCOPUS </unknown> <unknown tag="mrcbU34"> 000358811700008 WOS </unknown> <unknown tag="mrcbU63"> cav_un_epca*0256875 Journal of Applied Probability 0021-9002 1475-6072 Roč. 52 č. 2 2015 419 440 Cambridge University Press </unknown> </cas_special> </bibitem>