We address the problem of ambulance dispatching: deciding in real time which ambulance to send to an incident. In practice, the 'closest idle ambulance' rule is commonly believed to be near-optimal, and it is used throughout most of the literature. In this paper, we present alternatives to this classical rule. Most ambulance providers, as well as researchers, focus on minimizing the fraction of arrivals later than a certain threshold time, and we show that significant improvements can be obtained with our alternative policies. The first alternative is based on a Markov decision process (MDP) that models more than just the number of idle vehicles, while remaining computationally tractable for reasonably sized ambulance fleets. Second, we propose a dispatching heuristic that can handle regions with large numbers of ambulances. Our main focus is on minimizing the fraction of arrivals later than a certain threshold time, but we show that, with a small adaptation, our MDP can also be used to minimize the average response time. We evaluate our policies by simulating a large emergency medical services region in the Netherlands. For this region, we show that our heuristic reduces the fraction of late arrivals by 18% compared to the 'closest idle' benchmark policy. A drawback is that the heuristic increases the average response time (by 37% for this problem instance). Therefore, we do not claim that our heuristic is practically preferable to the closest-idle method. However, our results shed new light on the popular belief that the closest-idle dispatch policy is near-optimal when minimizing the fraction of late arrivals.
Keywords: Ambulances; Dispatching; Emergency medical services; Markov decision processes; OR in health services.