Optimal control processes associated with a class of stochastic sequential dynamical systems based on a parameter

dc.contributor.authorCandan, Muhammet
dc.date.accessioned2025-03-26T13:53:00Z
dc.date.available2025-03-26T13:53:00Z
dc.date.issued2021
dc.departmentİstanbul Esenyurt Üniversitesi
dc.description.abstractThis paper examines optimal control processes represented by stochastic sequential dynamical systems involving a parameter obtained from unique-solution conditions with respect to constant input values. The principle of optimality is then proven for the considered process, and the Bellman equation is constructed by applying the dynamic programming method. Moreover, a particular set, defined as an accessible set, is established to show the existence of a solution to the optimal control problem. Finally, the need for further research is discussed.
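For orientation only, the abstract's reference to constructing the Bellman equation via dynamic programming can be read against the generic discrete-time stochastic recursion sketched below; the state transition $f$, stage cost $g$, disturbance $\xi_k$, parameter $\lambda$, and terminal cost $\Phi$ are illustrative assumptions, not the paper's own notation.

$$
V_k(x) \;=\; \min_{u \in U} \; \mathbb{E}\!\left[\, g(x, u, \xi_k) + V_{k+1}\big(f(x, u, \xi_k; \lambda)\big) \right], \qquad V_N(x) = \Phi(x),
$$

where $V_k$ denotes the optimal value function at stage $k$ and the expectation is taken over the random disturbance $\xi_k$.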
dc.identifier.endpage48
dc.identifier.issn1304-7981
dc.identifier.issue2
dc.identifier.startpage40
dc.identifier.urihttps://hdl.handle.net/20.500.14704/403
dc.identifier.volume10
dc.institutionauthorCandan, Muhammet
dc.language.isoen
dc.publisherTokat Gaziosmanpasa University
dc.relation.ispartofJournal of New Results in Science
dc.relation.publicationcategoryArticle - National Peer-Reviewed Journal - Institutional Faculty Member
dc.rightsinfo:eu-repo/semantics/openAccess
dc.snmzKA_DergiPark_20250326
dc.subjectOptimal control process
dc.subjectBellman’s equation
dc.subjectDynamic programming
dc.subjectStochastic sequential dynamical systems
dc.titleOptimal control processes associated with a class of stochastic sequential dynamical systems based on a parameter
dc.typeArticle
