Open Access. Powered by Scholars. Published by Universities.®

Operations Research, Systems Engineering and Industrial Engineering Commons


Air Force Institute of Technology

Approximate dynamic programming

Articles 1 - 5 of 5

Full-Text Articles in Operations Research, Systems Engineering and Industrial Engineering

Multiagent Routing Problem With Dynamic Target Arrivals Solved Via Approximate Dynamic Programming, Andrew E. Mogan Mar 2022


Theses and Dissertations

This research formulates and solves the multiagent routing problem with dynamic target arrivals (MRP-DTA), a stochastic system wherein a team of autonomous unmanned aerial vehicles (AUAVs) executes a strike coordination and reconnaissance (SCAR) mission against a notional adversary. Dynamic target arrivals that occur during the mission present the team of AUAVs with a sequential decision-making process which we model via a Markov Decision Process (MDP). To combat the curse of dimensionality, we construct and implement a hybrid approximate dynamic programming (ADP) algorithmic framework that employs a parametric cost function approximation (CFA) which augments a direct lookahead (DLA) model via a …
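The parametric cost function approximation idea above can be illustrated with a toy routing example. This is a minimal sketch under assumed details — the assignment rule, the arrival-region penalty, and the tuning parameter `theta` are illustrative, not the thesis's actual CFA:

```python
# Hypothetical CFA sketch: a one-step lookahead assignment whose myopic
# travel cost is augmented by a theta-weighted penalty term; theta is a
# tunable parameter, normally calibrated offline by simulation.
import itertools

def assign_agents(agent_pos, target_pos, center, theta):
    """Assign each agent to a distinct target, minimizing travel cost plus
    a theta-weighted penalty for serving targets far from `center`, an
    assumed region where future targets are expected to arrive."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    best, best_cost = None, float("inf")
    for perm in itertools.permutations(range(len(target_pos)), len(agent_pos)):
        cost = sum(dist(agent_pos[i], target_pos[j])
                   + theta * dist(target_pos[j], center)
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    return list(best)

# With theta = 0 this reduces to a purely myopic assignment.
plan = assign_agents([(0, 0), (10, 0)], [(1, 0), (9, 0)], (0, 0), 0.0)
```

The CFA viewpoint is that `theta` is tuned so the otherwise myopic assignment hedges against dynamic target arrivals.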


Analyzing The Impact Of Blood Transfusion Kits And Triage Misclassification Errors For Military Medical Evacuation Dispatching Policies Via Approximate Dynamic Programming, Channel A. Rodriguez Mar 2022


Theses and Dissertations

Members of the armed forces rely heavily on an effective and efficient medical evacuation (MEDEVAC) process for evacuating casualties from the battlefield to medical treatment facilities (MTFs) during combat operations. This thesis examines the MEDEVAC dispatching problem and seeks to determine an optimal policy for dispatching a MEDEVAC unit, if any, when a 9-line MEDEVAC request arrives, taking into account triage classification errors and the possibility of having blood transfusion kits on board select MEDEVAC units. A discounted, infinite-horizon continuous-time Markov decision process (MDP) model is formulated to examine this problem and to compare generated dispatching policies to the myopic …
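As background on this model class, a discounted continuous-time MDP is typically solved after uniformization, which converts it into an equivalent discrete-time MDP by sampling at a uniform rate at least as large as every state's total exit rate. A minimal sketch with an assumed two-state generator (not the thesis's model):

```python
# Uniformization sketch: given a CTMC generator Q (rows sum to zero,
# off-diagonal entries are transition rates) and a uniformization rate
# Lam >= max total exit rate, P = I + Q / Lam is the transition matrix
# of the equivalent discrete-time chain.
import numpy as np

def uniformize(Q, Lam):
    """Return the discrete-time transition matrix of the uniformized chain."""
    return np.eye(Q.shape[0]) + Q / Lam

# Assumed illustrative generator: state 0 leaves at rate 2, state 1 at rate 1.
Q = np.array([[-2.0,  2.0],
              [ 1.0, -1.0]])
P = uniformize(Q, Lam=2.0)   # Lam = 2.0 matches the largest exit rate
# Each row of P sums to 1, so P is a valid stochastic matrix.
```

Standard discrete-time methods (value iteration, policy iteration, or ADP) then apply directly to `P` with a rescaled discount factor.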


Examining How Standby Assets Impact Optimal Dispatching Decisions Within A Military Medical Evacuation System Via A Markov Decision Process Model, Kylie Wooten Mar 2021


Theses and Dissertations

The Army medical evacuation (MEDEVAC) system ensures proper medical treatment is readily available to wounded soldiers on the battlefield. The objective of this research is to determine which MEDEVAC unit to task to an incoming 9-line MEDEVAC request and where to station a single standby unit to maximize patient survivability. A discounted, infinite-horizon continuous-time Markov decision process model is formulated to examine this problem. We design, develop, and test an approximate dynamic programming (ADP) technique that leverages a least squares policy evaluation value function approximation scheme within an approximate policy iteration algorithmic framework to solve practical-sized problem instances. A computational …
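The scheme described above — a least squares fit of a linear value function approximation inside a policy improvement loop — can be sketched on a toy chain MDP. Everything here (the MDP, the features, the rewards) is an illustrative assumption, not the MEDEVAC model:

```python
# Approximate policy iteration sketch: evaluate the current policy by a
# least squares fit of theta so that phi(s) @ theta ~ r + gamma * phi(s') @ theta,
# then improve greedily against the fitted value function.
import numpy as np

rng = np.random.default_rng(0)
N, GAMMA = 6, 0.9                      # assumed chain length and discount

def step(s, a):                        # a=0 moves left, a=1 moves right
    s2 = max(0, s - 1) if a == 0 else min(N - 1, s + 1)
    return s2, 1.0 if s2 == N - 1 else 0.0   # reward only at the right end

def phi(s):                            # simple polynomial features
    x = s / (N - 1)
    return np.array([1.0, x, x * x])

def evaluate(policy, n_samples=2000):
    """Least squares temporal-difference fit of the policy's value weights."""
    A, b = np.zeros((3, 3)), np.zeros(3)
    for _ in range(n_samples):
        s = int(rng.integers(N))
        s2, r = step(s, policy[s])
        f = phi(s)
        A += np.outer(f, f - GAMMA * phi(s2))
        b += r * f
    return np.linalg.solve(A, b)

def improve(theta):
    """One-step greedy improvement against the fitted value function."""
    return [int(np.argmax([r + GAMMA * phi(s2) @ theta
                           for s2, r in (step(s, a) for a in (0, 1))]))
            for s in range(N)]

policy = [0] * N                       # start by always moving left
for _ in range(5):                     # approximate policy iteration loop
    theta = evaluate(policy)
    policy = improve(theta)
# The loop drives the policy toward moving right, into the rewarding state.
```

The thesis's algorithm operates on a far larger state space, which is exactly why the value function is approximated rather than tabulated.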


The Military Inventory Routing Problem: Utilizing Heuristics Within A Least Squares Temporal Differences Algorithm To Solve A Multiclass Stochastic Inventory Routing Problem With Vehicle Loss, Ethan L. Salgado Sep 2018


Theses and Dissertations

Military commanders currently resupply forward operating bases (FOBs) from a central location within an area of operations, mainly via convoy operations, in a way that closely resembles vendor-managed inventory practices. Commanders must decide when and how much inventory to distribute throughout their area of operations while minimizing soldier risk. Current technology makes cargo unmanned aerial vehicles (CUAVs) an attractive resupply alternative, given the dangers of convoy operations. Enemy actions in wartime environments pose a significant risk to a CUAV's ability to safely deliver supplies to a FOB. We develop a Markov decision process …
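The vehicle-loss dynamic described above can be illustrated with a toy one-period transition function. All parameters here (loss probability, vehicle capacity, demand) are assumed for illustration and are not taken from the thesis:

```python
# Toy transition sketch for an inventory routing MDP with vehicle loss:
# a resupply flight succeeds with probability 1 - p_loss; a failed flight
# loses the CUAV and delivers nothing.
import random

def transition(inventory, fleet, action, demand, p_loss, capacity, rng):
    """One period: optionally resupply FOB `action` by CUAV, then consume
    demand at every FOB. Returns the next (inventory, fleet)."""
    inv = list(inventory)
    if action is not None and fleet > 0:
        if rng.random() < p_loss:
            fleet -= 1                   # CUAV lost en route, no delivery
        else:
            inv[action] += capacity      # successful resupply
    inv = [max(0, x - d) for x, d in zip(inv, demand)]  # demand consumption
    return inv, fleet

# Example: resupply FOB 0 with an assumed 30% loss probability.
state = transition([2, 2], 3, 0, [1, 1], 0.3, 5, random.Random(0))
```

A policy for this MDP must weigh the inventory benefit of each flight against the risk of permanently shrinking the CUAV fleet — the core trade-off the thesis's LSTD algorithm is built to capture.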


Determination Of Fire Control Policies Via Approximate Dynamic Programming, Michael T. Davis Mar 2016


Theses and Dissertations

Given the ubiquitous nature of both offensive and defensive missile systems, the catastrophe-causing potential they represent, and the limited resources available to countries for missile defense, optimizing the defensive response to a missile attack is a necessary endeavor. For a single salvo of offensive missiles launched at a set of targets, a missile defense system protecting those targets must decide how many interceptors to fire at each incoming missile. Since such missile engagements often involve the firing of more than one attack salvo, we develop a Markov decision process (MDP) model to examine the optimal fire control policy for the …
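A core subproblem behind such models — how many interceptors to fire at each missile in a single salvo — can be sketched as a small dynamic program. The single-shot kill probability `p_k` is an assumed parameter, not taken from the thesis:

```python
# Static interceptor-allocation sketch: with m interceptors and n incoming
# missiles, firing k interceptors at one missile destroys it with
# probability 1 - (1 - p_k)**k. The DP maximizes the expected number of
# missiles destroyed across the salvo.
from functools import lru_cache

def best_allocation(n_missiles, m_interceptors, p_k):
    @lru_cache(maxsize=None)
    def dp(i, m):
        if i == n_missiles:
            return 0.0
        # Try firing k = 0..m interceptors at missile i, recurse on the rest.
        return max((1 - (1 - p_k) ** k) + dp(i + 1, m - k)
                   for k in range(m + 1))
    return dp(0, m_interceptors)

# Example: 2 missiles, 2 interceptors, assumed p_k = 0.5 — spreading one
# interceptor per missile (expected 1.0 kill) beats doubling up (0.75).
value = best_allocation(2, 2, 0.5)
```

Because the kill probability is concave in the number of interceptors fired, spreading shots tends to dominate concentrating them; the multi-salvo MDP in the thesis adds the further question of how many interceptors to hold in reserve.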