Journal articles on the topic 'Markov Decision Process Planning'
Generate an accurate reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 journal articles for your research on the topic 'Markov Decision Process Planning.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.
Wang, Lidong, Reed Mosher, Patti Duett, and Terril Falls. "Predictive Modelling of a Honeypot System Based on a Markov Decision Process and a Partially Observable Markov Decision Process." Applied Cybersecurity & Internet Governance 2, no. 2 (2023): 1–5. http://dx.doi.org/10.5604/01.3001.0016.2027.
Pinder, Jonathan P. "An Approximation of a Markov Decision Process for Resource Planning." Journal of the Operational Research Society 46, no. 7 (1995): 819–30. http://dx.doi.org/10.1057/jors.1995.115.
Mouaddib, Abdel-Illah. "Vector-Value Markov Decision Process for multi-objective stochastic path planning." International Journal of Hybrid Intelligent Systems 9, no. 1 (2012): 45–60. http://dx.doi.org/10.3233/his-2012-0146.
Sarsur, Daniel, Lucas V. R. Alves, and Patrícia N. Pena. "Using Markov Decision Process over Local Modular Supervisors for Planning Problems." IFAC-PapersOnLine 58, no. 1 (2024): 126–31. http://dx.doi.org/10.1016/j.ifacol.2024.07.022.
Naguleswaran, Sanjeev, and Langford B. White. "Planning without state space explosion: Petri net to Markov decision process." International Transactions in Operational Research 16, no. 2 (2009): 243–55. http://dx.doi.org/10.1111/j.1475-3995.2009.00674.x.
Schell, Greggory J., Wesley J. Marrero, Mariel S. Lavieri, Jeremy B. Sussman, and Rodney A. Hayward. "Data-Driven Markov Decision Process Approximations for Personalized Hypertension Treatment Planning." MDM Policy & Practice 1, no. 1 (2016): 238146831667421. http://dx.doi.org/10.1177/2381468316674214.
Ding, Yi, and Hongyang Zhu. "Risk-Sensitive Markov Decision Processes of USV Trajectory Planning with Time-Limited Budget." Sensors 23, no. 18 (2023): 7846. http://dx.doi.org/10.3390/s23187846.
Nguyen, Truong-Huy, David Hsu, Wee-Sun Lee, et al. "CAPIR: Collaborative Action Planning with Intention Recognition." Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment 7, no. 1 (2011): 61–66. http://dx.doi.org/10.1609/aiide.v7i1.12425.
Hai-Feng, Jiu, Chen Yu, Deng Wei, and Pang Shuo. "Underwater chemical plume tracing based on partially observable Markov decision process." International Journal of Advanced Robotic Systems 16, no. 2 (2019): 172988141983187. http://dx.doi.org/10.1177/1729881419831874.
Hamasha, Mohammad M., and George Rumbe. "Determining optimal policy for emergency department using Markov decision process." World Journal of Engineering 14, no. 5 (2017): 467–72. http://dx.doi.org/10.1108/wje-12-2016-0148.
Yordanova, Veronika, Hugh Griffiths, and Stephen Hailes. "Rendezvous planning for multiple autonomous underwater vehicles using a Markov decision process." IET Radar, Sonar & Navigation 11, no. 12 (2017): 1762–69. http://dx.doi.org/10.1049/iet-rsn.2017.0098.
Lin, Yong, Xingjia Lu, and Fillia Makedon. "Approximate Planning in POMDPs with Weighted Graph Models." International Journal on Artificial Intelligence Tools 24, no. 04 (2015): 1550014. http://dx.doi.org/10.1142/s0218213015500141.
Ragi, Shankarachary, and Edwin K. P. Chong. "UAV Path Planning in a Dynamic Environment via Partially Observable Markov Decision Process." IEEE Transactions on Aerospace and Electronic Systems 49, no. 4 (2013): 2397–412. http://dx.doi.org/10.1109/taes.2013.6621824.
Gedik, Ridvan, Shengfan Zhang, and Chase Rainwater. "Strategic level proton therapy patient admission planning: a Markov decision process modeling approach." Health Care Management Science 20, no. 2 (2016): 286–302. http://dx.doi.org/10.1007/s10729-016-9354-6.
Mikhalov, Oleksandr Illich, Oleksandr Afrykanovych Stenin, Viktor Petrovych Pasko, Oleksandr Serhiiovych Stenin, and Yurii Opanasovych Tymoshyn. "Situational planning and operational adjustment of the route of the Autonomous robotic underwater vehicle." System technologies 3, no. 122 (2019): 3–11. http://dx.doi.org/10.34185/1562-9945-3-122-2019-01.
Bai, Yun, Saeed Babanajad, and Zheyong Bian. "Transportation infrastructure asset management modeling using Markov decision process under epistemic uncertainties." Smart and Resilient Transport 3, no. 3 (2021): 249–65. http://dx.doi.org/10.1108/srt-11-2020-0026.
Rigter, Marc, Bruno Lacerda, and Nick Hawes. "Minimax Regret Optimisation for Robust Planning in Uncertain Markov Decision Processes." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 13 (2021): 11930–38. http://dx.doi.org/10.1609/aaai.v35i13.17417.
Ogryczak, Wlodzimierz, Patrice Perny, and Paul Weng. "A Compromise Programming Approach to Multiobjective Markov Decision Processes." International Journal of Information Technology & Decision Making 12, no. 05 (2013): 1021–53. http://dx.doi.org/10.1142/s0219622013400075.
Adjei, Patrick, Norman Tasfi, Santiago Gomez-Rosero, and Miriam A. M. Capretz. "Safe Reinforcement Learning for Arm Manipulation with Constrained Markov Decision Process." Robotics 13, no. 4 (2024): 63. http://dx.doi.org/10.3390/robotics13040063.
Cheng, Minghui, and Dan M. Frangopol. "Optimal load rating-based inspection planning of corroded steel girders using Markov decision process." Probabilistic Engineering Mechanics 66 (October 2021): 103160. http://dx.doi.org/10.1016/j.probengmech.2021.103160.
Bubnov, Yakov. "DNS Data Exfiltration Detection Using Online Planning for POMDP." European Journal of Engineering Research and Science 4, no. 9 (2019): 22–25. http://dx.doi.org/10.24018/ejers.2019.4.9.1500.
Lefebvre, Randy, and Audrey Durand. "On Shallow Planning Under Partial Observability." Proceedings of the AAAI Conference on Artificial Intelligence 39, no. 25 (2025): 26587–95. https://doi.org/10.1609/aaai.v39i25.34860.
Li, Xinchen, Levent Guvenc, and Bilin Aksun-Guvenc. "Autonomous Vehicle Decision-Making with Policy Prediction for Handling a Round Intersection." Electronics 12, no. 22 (2023): 4670. http://dx.doi.org/10.3390/electronics12224670.
Xu, Jiuyun, Kun Chen, and Stephan Reiff-Marganiec. "Using Markov Decision Process Model with Logic Scoring of Preference Model to Optimize HTN Web Services Composition." International Journal of Web Services Research 8, no. 2 (2011): 53–73. http://dx.doi.org/10.4018/jwsr.2011040103.
de Saporta, Benoîte, Aymar Thierry d’Argenlieu, Régis Sabbadin, and Alice Cleynen. "A Monte-Carlo planning strategy for medical follow-up optimization: Illustration on multiple myeloma data." PLOS ONE 19, no. 12 (2024): e0315661. https://doi.org/10.1371/journal.pone.0315661.
Kim, Hongseok, and Do-Nyun Kim. "Maintenance decision-making model for gas turbine engine components." PHM Society European Conference 8, no. 1 (2024): 7. http://dx.doi.org/10.36001/phme.2024.v8i1.4043.
Shu, Mingrui, Xiuyu Zheng, Fengguo Li, Kaiyong Wang, and Qiang Li. "Numerical Simulation of Time-Optimal Path Planning for Autonomous Underwater Vehicles Using a Markov Decision Process Method." Applied Sciences 12, no. 6 (2022): 3064. http://dx.doi.org/10.3390/app12063064.
Monteiro, Neemias Silva, Vinicius Mariano Goncalves, and Carlos Andrey Maia. "Motion Planning of Mobile Robots in Indoor Topological Environments using Partially Observable Markov Decision Process." IEEE Latin America Transactions 19, no. 8 (2021): 1315–24. http://dx.doi.org/10.1109/tla.2021.9475862.
AlDurgam, Mohammad M. "An Integrated Inventory and Workforce Planning Markov Decision Process Model with a Variable Production Rate." IFAC-PapersOnLine 52, no. 13 (2019): 2792–97. http://dx.doi.org/10.1016/j.ifacol.2019.11.631.
Kim, M., A. Ghate, and M. H. Phillips. "A Markov decision process approach to temporal modulation of dose fractions in radiation therapy planning." Physics in Medicine and Biology 54, no. 14 (2009): 4455–76. http://dx.doi.org/10.1088/0031-9155/54/14/007.
Zhang, Zhen, Jianfeng Wu, Yan Zhao, and Ruining Luo. "Research on Distributed Multi-Sensor Cooperative Scheduling Model Based on Partially Observable Markov Decision Process." Sensors 22, no. 8 (2022): 3001. http://dx.doi.org/10.3390/s22083001.
Yuan, Minsen, Zhenshan Shi, and Zhouyi Shen. "Mission Planning in Time-Invariant Domains with MDPs and Gaussian Distribution." Journal of Physics: Conference Series 2386, no. 1 (2022): 012022. http://dx.doi.org/10.1088/1742-6596/2386/1/012022.
Garbatov, Yordan, and Petar Georgiev. "Markovian Maintenance Planning of Ship Propulsion System Accounting for CII and System Degradation." Energies 17, no. 16 (2024): 4123. http://dx.doi.org/10.3390/en17164123.
Wang, Kui, Xitao Wu, Shaoyang Shi, et al. "A Novel Integrated Path Planning and Mode Decision Algorithm for Wheel–Leg Vehicles in Unstructured Environment." Sensors 25, no. 9 (2025): 2888. https://doi.org/10.3390/s25092888.
Mubiru, Kizito Paul. "Joint Replenishment Problem in Drug Inventory Management of Pharmacies under Stochastic Demand." Brazilian Journal of Operations & Production Management 15, no. 2 (2018): 302–10. http://dx.doi.org/10.14488/bjopm.2018.v15.n2.a12.
Bäuerle, Nicole, and Alexander Glauner. "Minimizing spectral risk measures applied to Markov decision processes." Mathematical Methods of Operations Research 94, no. 1 (2021): 35–69. http://dx.doi.org/10.1007/s00186-021-00746-w.
Morley, C. D., and J. B. Thornes. "A Markov Decision Model for Network Flows." Geographical Analysis 4, no. 2 (1972): 180–93. http://dx.doi.org/10.1111/j.1538-4632.1972.tb00468.x.
Isradi, Muhammad, Andri I. Rifai, Joewono Prasetijo, Reni K. Kinasih, and Muhammad I. Setiawan. "Development of Pavement Deterioration Models Using Markov Chain Process." Civil Engineering Journal 10, no. 9 (2024): 2954–65. http://dx.doi.org/10.28991/cej-2024-010-09-012.
Larach, Abdelhadi, Cherki Daoui, and Mohamed Baslam. "A Markov Decision Model for Area Coverage in Autonomous Demining Robot." International Journal of Informatics and Communication Technology (IJ-ICT) 6, no. 2 (2017): 105. http://dx.doi.org/10.11591/ijict.v6i2.pp105-116.
Zhang, Hanrui, Yu Cheng, and Vincent Conitzer. "Planning with Participation Constraints." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 5 (2022): 5260–67. http://dx.doi.org/10.1609/aaai.v36i5.20462.
Zhang, Jian, Mahjoub Dridi, and Abdellah El Moudni. "A Markov decision model with dead ends for operating room planning considering dynamic patient priority." RAIRO - Operations Research 53, no. 5 (2019): 1819–41. http://dx.doi.org/10.1051/ro/2018110.
Winder, John, Stephanie Milani, Matthew Landen, et al. "Planning with Abstract Learned Models While Learning Transferable Subtasks." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 06 (2020): 9992–10000. http://dx.doi.org/10.1609/aaai.v34i06.6555.
Bouton, Maxime, Jana Tumova, and Mykel J. Kochenderfer. "Point-Based Methods for Model Checking in Partially Observable Markov Decision Processes." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 06 (2020): 10061–68. http://dx.doi.org/10.1609/aaai.v34i06.6563.
Kareem, B., and H. A. Owolabi. "Optimizing Maintenance Planning in the Production Industry Using the Markovian Approach." Journal of Engineering Research [TJER] 9, no. 2 (2012): 46–63. http://dx.doi.org/10.24200/tjer.vol9iss2pp46-63.
Walker, Violet, Fernando Vanegas, and Felipe Gonzalez. "Multi-UAV Mapping and Target Finding in Large, Complex, Partially Observable Environments." Remote Sensing 15, no. 15 (2023): 3802. http://dx.doi.org/10.3390/rs15153802.
Yang, Qiming, Jiancheng Xu, Haibao Tian, and Yong Wu. "Decision Modeling of UAV On-Line Path Planning Based on IMM." Xibei Gongye Daxue Xuebao/Journal of Northwestern Polytechnical University 36, no. 2 (2018): 323–31. http://dx.doi.org/10.1051/jnwpu/20183620323.
Soltan, Sajad, and Maryam Ashrafi. "Application of reinforcement learning for integrating project risk analysis and risk response planning: A case study on construction projects." Journal of Project Management 10, no. 1 (2025): 71–86. https://doi.org/10.5267/j.jpm.2024.11.001.