
Markov Decision Processes (MDP) Toolbox > mdp_check

mdp_check

Checks the validity of an MDP.

Calling Sequence

error_msg = mdp_check (P, R)

Description

mdp_check checks whether the MDP defined by the transition probability array (P) and the reward array (R) is valid. If P and R are correct, the function returns an empty error message.

Otherwise, the function returns an error message describing the problem.
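For illustration, the kind of checks an mdp_check-style function performs can be sketched in Python with NumPy. This is an assumption-laden sketch, not the toolbox source: P is taken here as a list of A dense (SxS) matrices and R as an (SxA) array.

```python
import numpy as np

def mdp_check(P, R):
    """Return '' if (P, R) describe a valid MDP, else an error message.

    Illustrative sketch only: P is a list of A (S x S) transition
    matrices, R an (S x A) reward array.
    """
    S = P[0].shape[0]
    for a, Pa in enumerate(P):
        if Pa.shape != (S, S):
            return f"P(:,:,{a + 1}) is not a square (S x S) matrix"
        if np.any(Pa < 0):
            return f"P(:,:,{a + 1}) contains negative probabilities"
        if not np.allclose(Pa.sum(axis=1), 1.0):
            return f"some rows of P(:,:,{a + 1}) do not sum to 1"
    if R.shape != (S, len(P)):
        return "R must be an (S x A) array consistent with P"
    return ""
```

A valid pair returns the empty string; the first violated condition determines the message otherwise.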

Arguments

P

transition probability array.

P can be a 3-dimensional array (SxSxA) or a list (1xA) in which each element is a sparse matrix (SxS).

R

reward array.

R can be a 3-dimensional array (SxSxA), a list (1xA) in which each element is a sparse matrix (SxS), or a 2D array (SxA), possibly sparse.
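When R is given in the (SxSxA) form, it assigns a reward to every transition (s, s') under action a; the 2D (SxA) form stores the expected reward of taking action a in state s. The relationship between the two can be sketched in Python (hypothetical helper name, assuming NumPy; not a toolbox function), using r(s, a) = sum over s' of P(s, s', a) * R(s, s', a):

```python
import numpy as np

def expected_reward(P, R3):
    """Reduce transition rewards to expected per-state rewards.

    P and R3 are lists of A matrices of shape (S, S); the result is an
    (S, A) array of expected rewards.  Hypothetical helper for
    illustration only.
    """
    return np.column_stack([(Pa * Ra).sum(axis=1) for Pa, Ra in zip(P, R3)])
```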

Evaluation

error_msg

error message.

error_msg is a character string, empty if the MDP is valid. Otherwise, it describes the detected problem.

Examples

--> P = list()
--> P(1) = [ 0.5 0.5;   0.8 0.2 ];
--> P(2) = [ 0 1;   0.1 0.9 ];
--> R = [ 5 10;   -1 2 ];
--> error_msg = mdp_check (P, R)
error_msg =

In the above example, P can instead be a list containing sparse matrices:
--> P(1) = sparse([ 0.5 0.5;  0.8 0.2 ]);
--> P(2) = sparse([ 0 1;  0.1 0.9 ]);
The function call is unchanged.
