Optimization
A Journal of Mathematical Programming and Operations Research
Volume 73, 2024 - Issue 4
Research Article

Some advances on constrained Markov decision processes in Borel spaces with random state-dependent discount factors

Pages 925-951 | Received 15 Feb 2022, Accepted 23 Sep 2022, Published online: 12 Oct 2022
 

ABSTRACT

This paper addresses a class of discrete-time Markov decision processes in Borel spaces with a finite number of cost constraints. The constrained control model considers costs of discounted type with state-dependent discount factors that are subject to external disturbances. Our objective is to prove the existence of optimal control policies and to characterize them according to certain optimality criteria. Specifically, by appropriately rewriting the original constrained problem as an equivalent one on a space of occupation measures, we apply the direct method to show solvability. Next, the problem is formulated as a convex program, and we prove that the existence of a saddle point of the associated Lagrangian is equivalent to the existence of an optimal control policy for the constrained problem. Finally, we turn our attention to multi-objective optimization problems, where the existence of Pareto optimal policies can be obtained from the existence of saddle points of the aforementioned Lagrangian or, equivalently, from the existence of optimal control policies for constrained problems.
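To fix ideas, here is a minimal sketch of the type of problem described above, written in our own notation rather than the paper's (the costs c_i, discount function α, disturbances ξ_k and constraint constants k_i are illustrative symbols, not taken from the article). A control policy π is evaluated through discounted cost criteria of the form

\[
V_i(\pi,x) \;=\; \mathbb{E}_x^{\pi}\!\left[\sum_{t=0}^{\infty}\Bigl(\prod_{k=0}^{t-1}\alpha(x_k,\xi_k)\Bigr) c_i(x_t,a_t)\right],
\qquad i=0,1,\ldots,q,
\]

where the discount factor \(\alpha\) depends on the current state \(x_k\) and an external random disturbance \(\xi_k\), and the empty product at \(t=0\) equals 1. The constrained problem is to minimize \(V_0(\pi,x)\) over all admissible policies subject to \(V_i(\pi,x)\le k_i\) for \(i=1,\ldots,q\). Its Lagrangian,

\[
L(\pi,\lambda) \;=\; V_0(\pi,x) \;+\; \sum_{i=1}^{q}\lambda_i\bigl(V_i(\pi,x)-k_i\bigr),
\qquad \lambda=(\lambda_1,\ldots,\lambda_q),\ \lambda_i\ge 0,
\]

has a saddle point \((\pi^*,\lambda^*)\) if

\[
L(\pi^*,\lambda) \;\le\; L(\pi^*,\lambda^*) \;\le\; L(\pi,\lambda^*)
\]

for every admissible policy \(\pi\) and every \(\lambda\ge 0\). According to the abstract, the existence of such a saddle point is equivalent to the existence of an optimal policy for the constrained problem; the precise assumptions (Borel state and action spaces, measurability conditions, and the disturbance model) are given in the paper itself.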

Disclosure statement

The authors declare that they have no conflict of interest.

Additional information

Funding

Work partially supported by Consejo Nacional de Ciencia y Tecnología (CONACYT) – México [grant number Ciencia Frontera 2019-87787] and [grant number PRODEP-2021 no. CA-38].
