Academic literature on the topic 'Markov Decision Process Planning'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Markov Decision Process Planning.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Markov Decision Process Planning"

1

Wang, Lidong, Reed Mosher, Patti Duett, and Terril Falls. "Predictive Modelling of a Honeypot System Based on a Markov Decision Process and a Partially Observable Markov Decision Process." Applied Cybersecurity & Internet Governance 2, no. 2 (2023): 1–5. http://dx.doi.org/10.5604/01.3001.0016.2027.

Full text
Abstract:
A honeypot is used to attract and monitor attacker activities and capture valuable information that can be used to help practice good cybersecurity. Predictive modelling of a honeypot system based on a Markov decision process (MDP) and a partially observable Markov decision process (POMDP) is performed in this paper. Analyses over a finite planning horizon and an infinite planning horizon for a discounted MDP are conducted, respectively. Four methods, including value iteration (VI), policy iteration (PI), linear programming (LP), and Q-learning, are used in analyses over an infinite planning h
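The solution methods the abstract names (value iteration, policy iteration, linear programming, Q-learning) all target the same Bellman optimality equation. As a minimal illustration, here is a value-iteration sketch for a discounted infinite-horizon MDP; the toy transition and reward numbers are hypothetical, not taken from the paper:

```python
import numpy as np

def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """Solve a discounted infinite-horizon MDP by value iteration.

    P: transitions, shape (A, S, S); P[a, s, s'] = Pr(s' | s, a)
    R: rewards,     shape (A, S);    R[a, s]     = expected reward
    Returns the optimal value function V and a greedy policy.
    """
    A, S, _ = P.shape
    V = np.zeros(S)
    while True:
        # Bellman optimality backup: Q[a, s] = R[a, s] + gamma * sum_s' P * V
        Q = R + gamma * (P @ V)          # shape (A, S)
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new

# A toy 2-state, 2-action MDP (hypothetical numbers).
P = np.array([[[0.9, 0.1], [0.2, 0.8]],    # action 0
              [[0.5, 0.5], [0.0, 1.0]]])   # action 1
R = np.array([[1.0, 0.0],                  # action 0
              [2.0, 0.5]])                 # action 1
V, policy = value_iteration(P, R)
```

Value iteration stops once the sup-norm change in V falls below `tol`; for a discount factor below one, this contraction is guaranteed to converge.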
2

Pinder, Jonathan P. "An Approximation of a Markov Decision Process for Resource Planning." Journal of the Operational Research Society 46, no. 7 (1995): 819. http://dx.doi.org/10.2307/2583966.

Full text
3

Pinder, Jonathan P. "An Approximation of a Markov Decision Process for Resource Planning." Journal of the Operational Research Society 46, no. 7 (1995): 819–30. http://dx.doi.org/10.1057/jors.1995.115.

Full text
4

Mouaddib, Abdel-Illah. "Vector-Value Markov Decision Process for multi-objective stochastic path planning." International Journal of Hybrid Intelligent Systems 9, no. 1 (2012): 45–60. http://dx.doi.org/10.3233/his-2012-0146.

Full text
5

Sarsur, Daniel, Lucas V. R. Alves, and Patrícia N. Pena. "Using Markov Decision Process over Local Modular Supervisors for Planning Problems." IFAC-PapersOnLine 58, no. 1 (2024): 126–31. http://dx.doi.org/10.1016/j.ifacol.2024.07.022.

Full text
6

Naguleswaran, Sanjeev, and Langford B. White. "Planning without state space explosion: Petri net to Markov decision process." International Transactions in Operational Research 16, no. 2 (2009): 243–55. http://dx.doi.org/10.1111/j.1475-3995.2009.00674.x.

Full text
7

Schell, Greggory J., Wesley J. Marrero, Mariel S. Lavieri, Jeremy B. Sussman, and Rodney A. Hayward. "Data-Driven Markov Decision Process Approximations for Personalized Hypertension Treatment Planning." MDM Policy & Practice 1, no. 1 (2016): 238146831667421. http://dx.doi.org/10.1177/2381468316674214.

Full text
8

Ding, Yi, and Hongyang Zhu. "Risk-Sensitive Markov Decision Processes of USV Trajectory Planning with Time-Limited Budget." Sensors 23, no. 18 (2023): 7846. http://dx.doi.org/10.3390/s23187846.

Full text
Abstract:
Trajectory planning plays a crucial role in ensuring the safe navigation of ships, as it involves complex decision making influenced by various factors. This paper presents a heuristic algorithm, named the Markov decision process Heuristic Algorithm (MHA), for time-optimized avoidance of Unmanned Surface Vehicles (USVs) based on a Risk-Sensitive Markov decision process model. The proposed method utilizes the Risk-Sensitive Markov decision process model to generate a set of states within the USV collision avoidance search space. These states are determined based on the reachable locations and d
9

Nguyen, Truong-Huy, David Hsu, Wee-Sun Lee, et al. "CAPIR: Collaborative Action Planning with Intention Recognition." Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment 7, no. 1 (2011): 61–66. http://dx.doi.org/10.1609/aiide.v7i1.12425.

Full text
Abstract:
We apply decision theoretic techniques to construct non-player characters that are able to assist a human player in collaborative games. The method is based on solving Markov decision processes, which can be difficult when the game state is described by many variables. To scale to more complex games, the method allows decomposition of a game task into subtasks, each of which can be modelled by a Markov decision process. Intention recognition is used to infer the subtask that the human is currently performing, allowing the helper to assist the human in performing the correct task. Experiments s
10

Hai-Feng, Jiu, Chen Yu, Deng Wei, and Pang Shuo. "Underwater chemical plume tracing based on partially observable Markov decision process." International Journal of Advanced Robotic Systems 16, no. 2 (2019): 172988141983187. http://dx.doi.org/10.1177/1729881419831874.

Full text
Abstract:
Chemical plume tracing based on autonomous underwater vehicle uses chemical as a guidance to navigate and search in the unknown environments. To solve the key issue of tracing and locating the source, this article proposes a path-planning strategy based on partially observable Markov decision process algorithm and artificial potential field algorithm. The partially observable Markov decision process algorithm is used to construct a source likelihood map and update it in real time with environmental information from the sensors on autonomous underwater vehicle in search area. The artificial pot

Dissertations / Theses on the topic "Markov Decision Process Planning"

1

Geng, Na. "Combinatorial optimization and Markov decision process for planning MRI examinations." PhD thesis, Saint-Etienne, EMSE, 2010. http://tel.archives-ouvertes.fr/tel-00566257.

Full text
Abstract:
This research is motivated by our collaborations with a large French university teaching hospital in order to reduce the Length of Stay (LoS) of stroke patients treated in the neurovascular department. Quick diagnosis is critical for stroke patients but relies on expensive and heavily used imaging facilities such as MRI (Magnetic Resonance Imaging) scanners. Therefore, it is very important for the neurovascular department to reduce the patient LoS by reducing their waiting time of imaging examinations. From the neurovascular department perspective, this thesis proposes a new MRI examinations r
2

Dai, Peng. "Faster Dynamic Programming for Markov Decision Processes." UKnowledge, 2007. http://uknowledge.uky.edu/gradschool_theses/428.

Full text
Abstract:
Markov decision processes (MDPs) are a general framework used by Artificial Intelligence (AI) researchers to model decision theoretic planning problems. Solving real world MDPs has been a major and challenging research topic in the AI literature. This paper discusses two main groups of approaches in solving MDPs. The first group of approaches combines the strategies of heuristic search and dynamic programming to expedite the convergence process. The second makes use of graphical structures in MDPs to decrease the effort of classic dynamic programming algorithms. Two new algorithms proposed by
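For context on the classic dynamic-programming algorithms this thesis builds on, here is a minimal policy-iteration sketch, alternating exact policy evaluation with greedy improvement; the two-state MDP below is an illustrative assumption, not drawn from the thesis:

```python
import numpy as np

def policy_iteration(P, R, gamma=0.9):
    """Classic policy iteration: alternate exact policy evaluation
    (solving a linear system) with greedy policy improvement.

    P: transitions, shape (A, S, S); R: rewards, shape (A, S).
    """
    A, S, _ = P.shape
    policy = np.zeros(S, dtype=int)
    while True:
        # Policy evaluation: solve (I - gamma * P_pi) V = R_pi exactly.
        P_pi = P[policy, np.arange(S), :]        # shape (S, S)
        R_pi = R[policy, np.arange(S)]           # shape (S,)
        V = np.linalg.solve(np.eye(S) - gamma * P_pi, R_pi)
        # Policy improvement: act greedily with respect to V.
        new_policy = (R + gamma * (P @ V)).argmax(axis=0)
        if np.array_equal(new_policy, policy):
            return V, policy
        policy = new_policy

# A hypothetical 2-state, 2-action MDP.
P = np.array([[[0.8, 0.2], [0.3, 0.7]],
              [[0.1, 0.9], [0.6, 0.4]]])
R = np.array([[0.0, 1.0],
              [1.0, 0.0]])
V, policy = policy_iteration(P, R)
```

Unlike value iteration, policy iteration terminates after finitely many sweeps on a finite MDP, since each improvement step yields a strictly better policy until the optimum is reached.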
3

Alizadeh, Pegah. "Elicitation and planning in Markov decision processes with unknown rewards." Thesis, Sorbonne Paris Cité, 2016. http://www.theses.fr/2016USPCD011/document.

Full text
Abstract:
Markov decision processes (MDPs) model sequential decision problems in which a user interacts with the environment and adapts their behaviour by taking into account the numerical reward signals received. Solving an MDP amounts to formulating the user's behaviour in the environment as a policy function that specifies which action to choose in each situation. In many real-world decision problems, users have different preferences, so the payoffs of their actions on the states differ and
4

Ernsberger, Timothy S. "Integrating Deterministic Planning and Reinforcement Learning for Complex Sequential Decision Making." Case Western Reserve University School of Graduate Studies / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=case1354813154.

Full text
5

Al Sabban, Wesam H. "Autonomous vehicle path planning for persistence monitoring under uncertainty using Gaussian based Markov decision process." Thesis, Queensland University of Technology, 2015. https://eprints.qut.edu.au/82297/1/Wesam%20H_Al%20Sabban_Thesis.pdf.

Full text
Abstract:
One of the main challenges facing online and offline path planners is the uncertainty in the magnitude and direction of environmental energy, because it is dynamic, changes with time, and is hard to forecast. This thesis develops an artificial intelligence method that lets a mobile robot learn from historical or forecasted data on the environmental energy available in the area of interest, supporting persistent monitoring under uncertainty with the developed algorithm.
6

Poulin, Nolan. "Proactive Planning through Active Policy Inference in Stochastic Environments." Digital WPI, 2018. https://digitalcommons.wpi.edu/etd-theses/1267.

Full text
Abstract:
In multi-agent Markov Decision Processes, a controllable agent must perform optimal planning in a dynamic and uncertain environment that includes another unknown and uncontrollable agent. Given a task specification for the controllable agent, its ability to complete the task can be impeded by an inaccurate model of the intent and behaviors of other agents. In this work, we introduce an active policy inference algorithm that allows a controllable agent to infer a policy of the environmental agent through interaction. Active policy inference is data-efficient and is particularly useful when data
7

Pokharel, Gaurab. "Increasing the Value of Information During Planning in Uncertain Environments." Oberlin College Honors Theses / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=oberlin1624976272271825.

Full text
8

Stárek, Ivo. "Plánování cesty robota pomocí dynamického programování." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2009. http://www.nusl.cz/ntk/nusl-228655.

Full text
Abstract:
This work is dedicated to robot path planning using the principles of dynamic programming in a discrete state space. The theoretical part covers the current state of the art in this field and the principle of applying a Markov decision process to path planning. The practical part covers the implementation of two algorithms based on MDP principles.
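The approach the abstract describes, dynamic programming over a discrete state space, can be sketched on a toy grid; the map and unit step costs below are illustrative assumptions, not taken from the thesis:

```python
# Deterministic shortest-path planning on a small grid via
# dynamic-programming (value iteration) backups; '#' cells are obstacles.
GRID = ["....",
        ".##.",
        "...G"]          # 'G' marks the goal cell
ROWS, COLS = len(GRID), len(GRID[0])
MOVES = [(-1, 0), (1, 0), (0, -1), (0, 1)]

INF = float("inf")
# Cost-to-go table: 0 at the goal, unknown (infinite) elsewhere.
cost = {(r, c): (0.0 if GRID[r][c] == "G" else INF)
        for r in range(ROWS) for c in range(COLS) if GRID[r][c] != "#"}

# Sweep Bellman backups until the table stops changing.
changed = True
while changed:
    changed = False
    for (r, c), v in list(cost.items()):
        if GRID[r][c] == "G":
            continue
        best = min((cost[(r + dr, c + dc)] + 1.0
                    for dr, dc in MOVES if (r + dr, c + dc) in cost),
                   default=INF)
        if best < v:
            cost[(r, c)] = best
            changed = True

print(cost[(0, 0)])   # steps from the top-left corner to the goal -> 5.0
```

Reading off the optimal path then amounts to stepping greedily from any start cell toward the neighbour with the lowest cost-to-go.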
9

Junyent Barbany, Miquel. "Width-Based Planning and Learning." Doctoral thesis, Universitat Pompeu Fabra, 2021. http://hdl.handle.net/10803/672779.

Full text
Abstract:
Optimal sequential decision making is a fundamental problem to many diverse fields. In recent years, Reinforcement Learning (RL) methods have experienced unprecedented success, largely enabled by the use of deep learning models, reaching human-level performance in several domains, such as the Atari video games or the ancient game of Go. In contrast to the RL approach in which the agent learns a policy from environment interaction samples, ignoring the structure of the problem, the planning approach for decision making assumes known models for the agent's goals and domain dynamics, and fo
10

Pinheiro, Paulo Gurgel. "Localização multirrobo cooperativa com planejamento." Master's thesis, Universidade Estadual de Campinas, 2009. http://repositorio.unicamp.br/jspui/handle/REPOSIP/276155.

Full text
Abstract:
Advisor: Jacques Wainer. Master's thesis, Universidade Estadual de Campinas, Instituto de Computação, 2009. Abstract: In a cooperative multi-robot localization problem, a group of robots operates in a given environment, and the exact location of each robot is unknown. In this scenario, a probability distribution indicates the chances of a robot being at a given

Books on the topic "Markov Decision Process Planning"

1

Mausam, and Andrey Kolobov. Planning with Markov Decision Processes. Springer International Publishing, 2012. http://dx.doi.org/10.1007/978-3-031-01559-5.

Full text
2

Xiao, Zhiqing. Reinforcement Learning: Discrete-Time Markov Decision Process. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-3740-0.

Full text
3

Auckland Regional Council, Resource Management Division, Regional Planning Dept., Strategic Planning Section. The Regional strategic decision-making process: report. Auckland Regional Council, 1990.

Find full text
4

United States Marine Corps. Marine Corps planning process. Headquarters, U.S. Marine Corps, 2010.

Find full text
5

United States Marine Corps, ed. Marine Corps planning process. Headquarters, U.S. Marine Corps, 2000.

Find full text
6

Doll, Ronald C. Curriculum improvement: Decision making and process. 9th ed. Allyn and Bacon, 1996.

Find full text
7

Aonuma, Tatsuo. A facet-following coordination for linear bilevel planning process. Institute of Economic Research, Kobe University of Commerce, 1985.

Find full text
8

Smith, S. A. Guidebook for transportation corridor studies: A process for effective decision-making. National Academy Press, 1999.

Find full text
9

Zhuang, Taozhi. Urban renewal decision-making in China: Stakeholders, process, and system improvement. BK Books, 2020.

Find full text
10

Schwarz, Mirela. The development of shared beliefs and its role during the strategic decision process. University of Southampton, School of Management, 2000.

Find full text

Book chapters on the topic "Markov Decision Process Planning"

1

Thiébaux, Sylvie, and Olivier Buffet. "Operations Planning." In Markov Decision Processes in Artificial Intelligence. John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118557426.ch15.

Full text
2

Mausam, and Andrey Kolobov. "Fundamental Algorithms." In Planning with Markov Decision Processes. Springer International Publishing, 2012. http://dx.doi.org/10.1007/978-3-031-01559-5_3.

Full text
3

Mausam, and Andrey Kolobov. "MDPs." In Planning with Markov Decision Processes. Springer International Publishing, 2012. http://dx.doi.org/10.1007/978-3-031-01559-5_2.

Full text
4

Mausam, and Andrey Kolobov. "Advanced Notes." In Planning with Markov Decision Processes. Springer International Publishing, 2012. http://dx.doi.org/10.1007/978-3-031-01559-5_7.

Full text
5

Mausam, and Andrey Kolobov. "Symbolic Algorithms." In Planning with Markov Decision Processes. Springer International Publishing, 2012. http://dx.doi.org/10.1007/978-3-031-01559-5_5.

Full text
6

Mausam, and Andrey Kolobov. "Heuristic Search Algorithms." In Planning with Markov Decision Processes. Springer International Publishing, 2012. http://dx.doi.org/10.1007/978-3-031-01559-5_4.

Full text
7

Naveed, Munir, Andrew Crampton, Diane Kitchin, and Lee McCluskey. "Real-Time Path Planning using a Simulation-Based Markov Decision Process." In Research and Development in Intelligent Systems XXVIII. Springer London, 2011. http://dx.doi.org/10.1007/978-1-4471-2318-7_3.

Full text
8

Veeramani, Satheeshkumar, Sreekumar Muthuswamy, Keerthi Sagar, and Matteo Zoppi. "Multi-Head Path Planning of SwarmItFIX Agents: A Markov Decision Process Approach." In Advances in Mechanism and Machine Science. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-20131-9_221.

Full text
9

Dawid, R., D. McMillan, and M. Revie. "Time series semi-Markov decision process with variable costs for maintenance planning." In Risk, Reliability and Safety: Innovating Theory and Practice. CRC Press, 2016. http://dx.doi.org/10.1201/9781315374987-172.

Full text
10

Souidi, Mohammed El Habib, Toufik Messaoud Maarouk, and Abdeldjalil Ledmi. "Multi-agent Ludo Game Collaborative Path Planning based on Markov Decision Process." In Inventive Systems and Control. Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-1395-1_4.

Full text

Conference papers on the topic "Markov Decision Process Planning"

1

Qiu, Kang, Sigmund Eggen Holm, Julian Straus, and Simon Roussanaly. "Optimal Clustered, Multi-modal CO2 Transport Considering Non-linear Costs – a Path-planning Approach." In Foundations of Computer-Aided Process Design. PSE Press, 2024. http://dx.doi.org/10.69997/sct.150076.

Full text
Abstract:
An important measure to achieve global reduction in CO2 emissions is CO2 capture, transport, and storage. The deployment of CO2 capture requires the development of a shared CO2 transport infrastructure, where CO2 can be transported with different transport modes. Furthermore, the cost of CO2 transport can be subject to significant economies of scale effects with respect to the amount of CO2 transported, also mentioned as clustering effects. Therefore, optimizing the shared infrastructure of multiple CO2 sources can lead to significant reductions in infrastructure costs. This paper presents a n
2

Xu, Dan, Yunxiao Guo, Han Long, and Chang Wang. "A Novel Variable Step-size Path Planning Framework with Step-Consistent Markov Decision Process For Large-Scale UAV Swarm." In 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2024. https://doi.org/10.1109/iros58592.2024.10802011.

Full text
3

Sarkis, Miriam, Nilay Shah, and Maria M. Papathanasiou. "Integrating process and demand uncertainty in capacity planning for next-generation pharmaceutical supply chains." In The 35th European Symposium on Computer Aided Process Engineering. PSE Press, 2025. https://doi.org/10.69997/sct.162819.

Full text
Abstract:
Emerging sectors within the biopharmaceutical industry are undergoing rapid scale-up due to the market boom of gene therapies and vaccine platform technologies. Manufacturers are pressured to orchestrate resources and plan investments under future demand uncertainty and, critically, an early-stage process uncertainty for platforms still under development. In this work, a multi-product multi-stage stochastic optimization problem integrating demand uncertainty is presented and augmented with a worst-case optimization approach with respect to process uncertainty. Results focus on a comparison bet
4

Kalagarla, Krishna C., Matthew Low, Rahul Jain, Ashutosh Nayyar, and Pierluigi Nuzzo. "Compositional Planning for Logically Constrained Multi-Agent Markov Decision Processes." In 2024 IEEE 63rd Conference on Decision and Control (CDC). IEEE, 2024. https://doi.org/10.1109/cdc56724.2024.10885812.

Full text
5

Kuhl, William, Jun Wang, Duncan Eddy, and Mykel J. Kochenderfer. "Markov Decision Processes for Satellite Maneuver Planning and Collision Avoidance." In 2025 IEEE Aerospace Conference. IEEE, 2025. https://doi.org/10.1109/aero63441.2025.11068672.

Full text
6

Al Khafaf, Nameer, Mahdi Jalili, and Peter Sokolowski. "Demand Response Planning Tool using Markov Decision Process." In 2018 IEEE 16th International Conference on Industrial Informatics (INDIN). IEEE, 2018. http://dx.doi.org/10.1109/indin.2018.8472098.

Full text
7

Leal Gomes Leite, Joao Marcelo, Edilson F. Arruda, Laura Bahiense, and Lino G. Marujo. "Mine-to-client planning with Markov Decision Process." In 2020 European Control Conference (ECC). IEEE, 2020. http://dx.doi.org/10.23919/ecc51009.2020.9143651.

Full text
8

Ting, Lei, Zhu Cheng, and Zhang Weiming. "Planning for target system striking based on Markov decision process." In 2013 IEEE International Conference on Service Operations and Logistics, and Informatics (SOLI). IEEE, 2013. http://dx.doi.org/10.1109/soli.2013.6611401.

Full text
9

Lane, Terran, and Leslie Pack Kaelbling. "Approaches to macro decompositions of large Markov decision process planning problems." In Intelligent Systems and Advanced Manufacturing, edited by Douglas W. Gage and Howie M. Choset. SPIE, 2002. http://dx.doi.org/10.1117/12.457435.

Full text
10

Trotti, Francesco, Alessandro Farinelli, and Riccardo Muradore. "A Markov Decision Process Approach for Decentralized UAV Formation Path Planning." In 2024 European Control Conference (ECC). IEEE, 2024. http://dx.doi.org/10.23919/ecc64448.2024.10591307.

Full text

Reports on the topic "Markov Decision Process Planning"

1

LaRaine Ingram, Keisha. Applied Sales Predictive Analytics for Business Development. Vilnius Business College, 2024. http://dx.doi.org/10.57005/ab.2024.1.2.

Full text
Abstract:
In the dynamic business environment, leveraging predictive analytics for sales optimization and business development has become crucial for achieving sustained growth. As the e-commerce landscape continues to evolve, many e-businesses must harness the power of predictive analytics to anticipate sales trends and optimize business development strategies. This paper explores the application of sales predictive analytics, focusing on its role in forecasting sales, optimizing resource allocation, and enhancing customer relationship management. The application of predictive analytics in sales foreca
2

Davis, Charles N., Jr. A Decision Support Process for Planning Air Operations. Defense Technical Information Center, 1991. http://dx.doi.org/10.21236/ada236580.

Full text
3

Kanters, Jouri, and Martin Thebault. Opportunities for Improved Workflows and Development Needs of Solar Planning Tools. IEA SHC Task 63, 2024. http://dx.doi.org/10.18777/ieashc-task63-2024-0006.

Full text
Abstract:
There has been a significant development of simulation tools capable of providing support in decision-making regarding solar neighborhoods. This report highlights opportunities for maximizing the use of tools for solar neighborhood planning by analyzing the current use of tools in the design process, the mapping of the solar potential and installed capacity, and by describing opportunities for an increase in use of tools.
4

Tabakovic, Momir, Stefan Savic, Andreas Türk, et al. Analysis of the Technological Innovation System for BIPV in Austria. Edited by Michiel Van Noord. International Energy Agency Photovoltaic Power Systems Programme, 2024. http://dx.doi.org/10.69766/aocp4683.

Full text
Abstract:
This report analyses the Technological Innovation System (TIS) of Building Integrated Photovoltaics (BIPV) in Austria. The study’s scope is consistent with the IEA PVPS Task 15 report [1].The analysis aims to facilitate and support the innovation, development, and implementation of industrial solutions of BIPV technologies. In Austria, the use of BIPV is still a niche application and covers under 2% of all implemented PV systems [1]. BIPV technology in Austria has historically developed with the support of different public financial incentives, national and European. The history of BIPV is som
5

Bigl, Matthew, Caitlin Callaghan, Brandon Booker, et al. Energy Atlas—mapping energy-related data for DoD lands in Alaska : Phase 2—data expansion and portal development. Engineer Research and Development Center (U.S.), 2022. http://dx.doi.org/10.21079/11681/43062.

Full text
Abstract:
As the largest Department of Defense (DoD) land user in Alaska, the U.S. Army oversees over 600,000 hectares of land, including remote areas accessible only by air, water, and winter ice roads. Spatial information related to the energy resources and infrastructure that exist on and adjacent to DoD installations can help inform decision makers when it comes to installation planning. The Energy Atlas−Alaska portal provides a secure value-added resource to support the decision-making process for energy management, investments in installation infrastructure, and improvements to energy resiliency a
6

Bigl, Matthew, Caitlin Callaghan, Jacqueline Willan, Paulina Lintsai, and Jamie Potter. Energy Atlas—mapping energy-related data for DoD lands : Phase 3—data and portal expansion : Northeast CONUS. Engineer Research and Development Center (U.S.), 2023. http://dx.doi.org/10.21079/11681/47823.

Full text
Abstract:
The DoD is a significant land user in northeast United States overseeing approximately 375 k acres of land with a total value of $113 B. The Department of Energy has found that major impacts from climate change will threaten energy infrastructure in the northeast US moving into the future. Current spatial information related to the energy resources and infrastructure on and adjacent to DoD installations can play a vital role in decision-making for sustainable and resilient installation planning in the region. The Energy Atlas (EA) portal provides a secure value-added resource to inform the dec
7

Ruisinger, Ulrich, and Heike Sonntag. Internal insulation: two condensed guidelines for beginners. Department of the Built Environment, 2023. http://dx.doi.org/10.54337/aau541623517.

Full text
Abstract:
On the way to a reliable and large-scale application of internal insulation, clear and simple guidelines for building practitioners are needed, across all phases of refurbishment planning. Closing this gap was one objective of the “IN2EuroBuild”-project, completed in 2022. As a support in the planning process and for decision-making within the framework of the project, two comprehensive guides for the planning of internal insulation measures were developed. They guide users from the as-is analysis of the building (part one) to the renovation planning of the façade, the selection of suitable in
8

Heggen, Hans Olav. PR186-215102-R01 Subsea Pipeline Risk-Based Inspection Benchmarking. Pipeline Research Council International, Inc. (PRCI), 2022. http://dx.doi.org/10.55274/r0012237.

Full text
Abstract:
There are many standards, recommended practices (RP) and guidelines available for subsea pipe-line integrity management (PIM). In all of these, inspection is a key part of managing the integrity of subsea pipelines. However, detailed coverage of risk-based inspection (RBI) and detailed inspection planning is limited in these standards, RPs and guidelines; optimization of inspection and inspection frequency is generally determined at operator's discretion unless strictly regulated by authorities. Thus, there is a need to identify operator best practices and attempt to define a unified decision-
9

Salazar, Alessandro Toledo, Matthew Medrano, Mathias Duque Medina, Julio Roa, and Jorge E. Pesantez. Enhancing Evacuation Warning Responsiveness: Exploring the Impact of Social Interactions through an Agent-Based Model Approach. Mineta Transportation Institute, 2024. http://dx.doi.org/10.31979/mti.2024.2356.

Full text
Abstract:
Evacuations are the preferred response to human- or natural-caused disasters. The process often involves people deciding when and how to evacuate based on messages from local authorities. However, diverse opinions of the affected people may influence their decision to evacuate or to stay and see how the situation unfolds. This project applies an opinion dynamics concept to model the opinion and decision-making of people threatened by wildfire. To demonstrate how individual opinions evolve with time, the model applies an agent-based approach that includes the interaction between an agency sendi
10

Lemanski, Ursula, and Donna Vogler. Stakeholder Analysis. American Museum of Natural History, 2010. http://dx.doi.org/10.5531/cbc.ncep.0029.

Full text
Abstract:
Stakeholders are defined as the people and organizations who are involved in or affected by an action or policy and can be directly or indirectly included in the decision making process. In environmental and conservation planning, stakeholders typically include government representatives, businesses, scientists, landowners, and local users of natural resources. These groups of stakeholders often have very different positions and values that may be difficult to reconcile with each other and the planned project. The synthesis provides a brief overview of why it is important to incorporate differ