Candan, Muhammet
2025-03-26
2021
ISSN: 1304-7981
Handle: https://hdl.handle.net/20.500.14704/403

Abstract: This paper examines optimal control processes represented by stochastic sequential dynamic systems involving a parameter, obtained under unique-solution conditions with respect to constant input values. The principle of optimality is then proven for the considered process. Afterwards, the Bellman equation is constructed by applying the dynamic programming method. Moreover, a particular set, defined as an accessible set, is introduced to show the existence of an optimal control. Finally, the need for further research is discussed.

Language: en
Access: info:eu-repo/semantics/openAccess
Keywords: Optimal control process; Bellman's equation; Dynamic programming; Stochastic sequential dynamical systems
Title: Optimal control processes associated with a class of stochastic sequential dynamical systems based on a parameter
Type: Review Article
4824010
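The dynamic-programming construction of a Bellman equation mentioned in the abstract can be illustrated on a small example. The sketch below is not the paper's model: it is a generic finite-horizon stochastic sequential system with assumed toy dynamics `f`, stage cost `c`, terminal cost `g`, and i.i.d. two-point noise, solved by the standard backward Bellman recursion V_t(x) = min_u E_w[c(x,u) + V_{t+1}(f(x,u,w))].

```python
# Illustrative sketch (toy model, NOT the paper's system): backward dynamic
# programming for a finite-horizon stochastic sequential system
#   x_{t+1} = f(x_t, u_t, w_t),  w_t i.i.d. noise,
# yielding the Bellman recursion
#   V_t(x) = min over u of E_w[ c(x, u) + V_{t+1}(f(x, u, w)) ].

STATES = [0, 1, 2]
CONTROLS = [0, 1]
NOISE = [(0, 0.5), (1, 0.5)]   # (noise value, probability)
T = 2                          # planning horizon

def f(x, u, w):
    """Assumed toy dynamics: drift by control plus noise, clipped to the state set."""
    return min(x + u + w, 2)

def c(x, u):
    """Assumed toy stage cost."""
    return x + u

def g(x):
    """Assumed toy terminal cost."""
    return x

def solve():
    """Return value functions V[t][x] and an optimal feedback policy pi[t][x]."""
    V = {T: {x: g(x) for x in STATES}}   # terminal condition V_T = g
    pi = {}
    for t in range(T - 1, -1, -1):       # sweep backward in time
        V[t], pi[t] = {}, {}
        for x in STATES:
            # Bellman equation: minimize expected cost-to-go over controls.
            best_u, best_q = None, float("inf")
            for u in CONTROLS:
                q = c(x, u) + sum(p * V[t + 1][f(x, u, w)] for w, p in NOISE)
                if q < best_q:
                    best_u, best_q = u, q
            V[t][x], pi[t][x] = best_q, best_u
    return V, pi
```

Because each V_t is computed only from V_{t+1}, the recursion embodies the principle of optimality: the tail of an optimal policy is optimal for the tail subproblem. For instance, `solve()` gives V[0][0] = 1.5 with optimal first control pi[0][0] = 0 under the toy data above.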