Calvia, A. (2018). Optimal Control of Continuous-Time Markov Chains with Noise-Free Observation. SIAM Journal on Control and Optimization, 56(3), 2000-2035. https://doi.org/10.1137/17M1139989

Optimal Control of Continuous-Time Markov Chains with Noise-Free Observation

Calvia, Alessandro
2018

Abstract

We consider an infinite horizon optimal control problem for a continuous-time Markov chain $X$ in a finite set $I$ with noise-free partial observation. The observation process is defined as $Y_t = h(X_t)$, $t \geq 0$, where $h$ is a given map defined on $I$. The observation is noise-free in the sense that the only source of randomness is the process $X$ itself. The aim is to minimize a discounted cost functional and to study the associated value function $V$. After transforming the control problem with partial observation into one with complete observation (the separated problem) via filtering equations, we provide a link between the value function $v$ associated with the latter problem and the original value function $V$. We then present two different characterizations of $v$ (and indirectly of $V$): on the one hand as the unique fixed point of a suitably defined contraction mapping, and on the other hand as the unique constrained viscosity solution (in the sense of Soner) of an HJB integro-differential equation. Under suitable assumptions, we finally prove the existence of an optimal control.
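The abstract does not spell out the cost functional or the filter; as a rough illustrative sketch (the discount rate $\beta$, running cost $f$, control set $\mathcal{U}$, and filter notation $\pi_t$ below are assumed for illustration, not the paper's own symbols), the objects involved can be written as:

% Illustrative notation only: beta (discount rate), f (running cost),
% U (controls adapted to the observation filtration) and the filter pi_t
% are assumptions, not taken from the paper.
\[
  J(x,u) \;=\; \mathbb{E}\!\left[\int_0^\infty e^{-\beta t}\, f(X_t,u_t)\,\mathrm{d}t\right],
  \qquad
  V(x) \;=\; \inf_{u\in\mathcal{U}} J(x,u),
\]
\[
  \pi_t(i) \;=\; \mathbb{P}\bigl(X_t = i \mid \mathcal{F}^Y_t\bigr), \qquad i \in I,
\]
where $\mathcal{F}^Y_t$ denotes the filtration generated by the observation $Y_t = h(X_t)$. In this sketch, the separated problem is the completely observed control problem for the measure-valued process $(\pi_t)_{t \geq 0}$, living on the simplex of probability measures on $I$, and its value function is the $v$ mentioned in the abstract.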
Type: Journal article - Scientific article
Keywords: partial observation control problem, continuous-time Markov chains, piecewise deterministic Markov processes, Bellman equation, viscosity solutions
Language: English
Year: 2018
Volume: 56
Issue: 3
Pages: 2000-2035

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/198728
Citations
  • Scopus: 7
  • ISI (Web of Science): 7