A Markov Decision Process Approach for Managing Medical Drone Deliveries
Abstract
We consider the problem of optimizing distribution operations at a drone hub that dispatches drones to geographic locations with stochastic demand for medical supplies. Drone delivery is an innovative method with many benefits, such as low-contact delivery, which can reduce the spread of pandemic and vaccine-preventable diseases. While we focus on medical supply delivery in this work, drones are also suitable for delivering many other items, including food, postal parcels, and e-commerce goods. Our goal in this paper is to address drone delivery challenges arising from the stochastic demands of different geographic locations. We consider classes of demand, tied to geographic locations, that require different flight ranges and hence different amounts of charge in a drone's battery. We classify the stochastic demands by their distance from the drone hub, model the problem as a Markov decision process, and perform computational tests on realistic data representing a prominent drone delivery company. We solve the problem with a reinforcement learning method and show that it performs close to the exact solution obtained by dynamic programming. Finally, we analyze the results and provide insights for managing drone hub operations.
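The abstract's pipeline pairs an exact dynamic-programming solution with a reinforcement learning approximation for an MDP whose state reflects how much battery charge a drone needs for each demand class. As a minimal sketch of the exact-solution side, the toy model below (entirely hypothetical: the battery levels, demand classes, costs, and rewards are illustrative assumptions, not the paper's model, and demand stochasticity is omitted for brevity) solves a three-state battery MDP by value iteration:

```python
# Hypothetical toy MDP, not the paper's model: state = drone battery level
# in {0, 1, 2}; actions: 0 = recharge one unit, 1 = serve a near demand
# (costs 1 unit of charge), 2 = serve a far demand (costs 2 units).
# Infeasible actions (not enough charge) earn nothing and change nothing.
GAMMA = 0.95
N_STATES = 3
ACTIONS = (0, 1, 2)
COST = {1: 1, 2: 2}        # charge consumed per demand class
REWARD = {1: 1.0, 2: 3.0}  # payoff for serving a near / far demand

def step(s, a):
    """Deterministic toy dynamics: next battery level and reward."""
    if a == 0:                           # recharge one unit of battery
        return min(s + 1, N_STATES - 1), 0.0
    if s >= COST[a]:                     # enough charge to serve this class
        return s - COST[a], REWARD[a]
    return s, 0.0                        # infeasible action: no effect

def value_iteration(tol=1e-9):
    """Exact DP solution: iterate the Bellman optimality operator."""
    V = [0.0] * N_STATES
    while True:
        Q = [[0.0] * len(ACTIONS) for _ in range(N_STATES)]
        for s in range(N_STATES):
            for a in ACTIONS:
                s2, r = step(s, a)
                Q[s][a] = r + GAMMA * V[s2]
        V_new = [max(row) for row in Q]
        if max(abs(v2 - v1) for v1, v2 in zip(V, V_new)) < tol:
            policy = [max(ACTIONS, key=lambda a: Q[s][a])
                      for s in range(N_STATES)]
            return V_new, policy
        V = V_new

V, policy = value_iteration()
print(policy)  # greedy action for each battery level
```

In this toy instance the optimal policy recharges when the battery is empty and serves the far (higher-reward) demand class only at full charge, illustrating how flight range couples demand classes to battery state. The paper's reinforcement learning method would replace the exact Bellman sweep with sampled updates when the transition model is too large to enumerate.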
- Publication: arXiv e-prints
- Pub Date: June 2021
- DOI: 10.48550/arXiv.2106.04729
- arXiv: arXiv:2106.04729
- Bibcode: 2021arXiv210604729A
- Keywords: Mathematics - Optimization and Control; Computer Science - Artificial Intelligence; Computer Science - Machine Learning