An Approach to Customer Behavior Modeling using Markov Decision Process

  • Ondrej Grunt
  • Jan Plucar
  • Marketa Stakova
  • Tomas Janecko
  • Ivan Zelinka
Keywords: Markov decision process, marketing, models, probability


This paper presents an application of the Markov Decision Process (MDP) method to the modeling of selected marketing processes. An MDP model is constructed from available real-world data. Customer behavior is represented by a set of model states with assigned rewards corresponding to the expected return value. Outgoing arcs then represent the actions available to the customer in the current state. The favourable-outcome rate of the available actions is then analyzed, with emphasis on the suitability of the model for future predictions of customer behavior.
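The abstract's setup (customer states with rewards, outgoing arcs as actions) can be illustrated with a minimal value-iteration sketch. The states, actions, transition probabilities, and rewards below are purely hypothetical assumptions for illustration, not the model constructed in the paper:

```python
# A minimal sketch (not the paper's actual model): value iteration on a
# hypothetical customer-behavior MDP. All states, actions, probabilities
# and rewards are illustrative assumptions only.

# hypothetical states of the customer lifecycle
STATES = ["prospect", "active", "lapsed"]

# hypothetical actions available in each state
ACTIONS = ["mail_offer", "do_nothing"]

# P[s][a] -> list of (next_state, probability); each row sums to 1.0
P = {
    "prospect": {
        "mail_offer": [("active", 0.4), ("prospect", 0.6)],
        "do_nothing": [("active", 0.1), ("prospect", 0.9)],
    },
    "active": {
        "mail_offer": [("active", 0.8), ("lapsed", 0.2)],
        "do_nothing": [("active", 0.6), ("lapsed", 0.4)],
    },
    "lapsed": {
        "mail_offer": [("active", 0.2), ("lapsed", 0.8)],
        "do_nothing": [("lapsed", 1.0)],
    },
}

# immediate reward R[s][a]: expected margin minus campaign cost (illustrative)
R = {
    "prospect": {"mail_offer": -1.0, "do_nothing": 0.0},
    "active":   {"mail_offer": 9.0,  "do_nothing": 10.0},
    "lapsed":   {"mail_offer": -1.0, "do_nothing": 0.0},
}

def value_iteration(gamma=0.9, tol=1e-8):
    """Return optimal state values and the corresponding greedy policy."""
    V = {s: 0.0 for s in STATES}
    while True:
        # Bellman optimality backup for every state
        V_new = {
            s: max(
                R[s][a] + gamma * sum(p * V[t] for t, p in P[s][a])
                for a in ACTIONS
            )
            for s in STATES
        }
        if max(abs(V_new[s] - V[s]) for s in STATES) < tol:
            return V_new, {
                s: max(
                    ACTIONS,
                    key=lambda a: R[s][a]
                    + gamma * sum(p * V_new[t] for t, p in P[s][a]),
                )
                for s in STATES
            }
        V = V_new

if __name__ == "__main__":
    V, policy = value_iteration()
    for s in STATES:
        print(f"{s}: value={V[s]:.2f}, best action={policy[s]}")
```

With these toy numbers, the active state carries the highest long-run value and mailing an offer turns out optimal everywhere; the same backup structure scales directly to the richer state sets described in the paper.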


Wessling, H.: Aktivní vztah k zákazníkům pomocí CRM: strategie, praktické příklady a scénáře [Active Customer Relationships Using CRM: Strategies, Practical Examples and Scenarios]. Praha: Grada, 2003. Manažer series. ISBN 8024705699.

DeMaagd, N.: The Application of Stochastic Processes in the Sciences: Using Markov Chains to Estimate Optimal Resource Allocation, Population Distribution, and Gene Inheritance. Honors Projects. Paper 290 (2014).

Hamilton, J.D.: A New Approach to the Economic Analysis of Nonstationary Time Series and the Business Cycle. Econometrica 57(2), 357–384 (1989).

Calvet, L.E.: How to Forecast Long-Run Volatility: Regime Switching and the Estimation of Multifractal Processes. Journal of Financial Econometrics 2(1), 49–83 (2004).

Pfeifer, P.E., Carraway, R.: Modeling Customer Relationships as Markov Chains. Journal of Interactive Marketing 14(2), 43–55 (2000).

Labbi, A., Berrospi, C.: Optimizing marketing planning and budgeting using Markov decision processes: an airline case study. IBM Journal of Research and Development - Business Optimization 51(3), 421–431 (2007).

Bäuerle, N., Rieder, U.: MDP algorithms for portfolio optimization problems in pure jump markets. Finance and Stochastics 13(4), 591–611 (2009).

Perez, I., Hodge, D., Le, H.: Markov decision process algorithms for wealth allocation problems with defaultable bonds. Advances in Applied Probability 48(2), 392–405 (2016).

Rösch, A., Schmidbauer, H.: Action Selection in Customer Value Optimization: An Approach Based on Covariate-Dependent Markov Decision Processes. Proceedings of the 2009 International Conference on Data Mining (DMIN 09), Las Vegas. ISBN 1-60132-099-X.

Zaharia, M.: Apache Spark. Apache Software Foundation, UC Berkeley AMPLab, Databricks (2017). [Online; accessed 3-May-2017]

Bellman, R.E.: A Markovian Decision Process. Journal of Mathematics and Mechanics 6(5), 679–684 (1957).

Howard, R.A.: Dynamic Programming and Markov Processes. Cambridge, Massachusetts: The M.I.T. Press, 1960.

Storn, R., Price, K.: Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization 11(4), 341–359 (1997).

Davendra, D., Zelinka, I.: Self-Organizing Migrating Algorithm: Methodology and Implementation. Springer, 2016. ISBN 978-3-319-28161-2.

Clerc, M.: Particle Swarm Optimization. Vol. 93. John Wiley & Sons, 2010.

Kennedy, J.: Particle swarm optimization. In: Encyclopedia of Machine Learning, pp. 760-766. Springer US, 2011.

Koza, J. R.: Genetic programming: on the programming of computers by means of natural selection. Vol. 1. MIT press, 1992.

O'Neill, M., Ryan, C.: Grammatical Evolution. In: Grammatical Evolution, pp. 33-47. Springer US, 2003.

Zelinka, I., Davendra, D., Senkerik, R., Jasek, R., Oplatkova, Z.: Analytical Programming – a Novel Approach for Evolutionary Synthesis of Symbolic Structures. In: Evolutionary Algorithms, Eisuke Kita (Ed.), InTech, 2011. DOI: 10.5772/16166.

How to Cite
Grunt, O., Plucar, J., Stakova, M., Janecko, T. and Zelinka, I. 2017. An Approach to Customer Behavior Modeling using Markov Decision Process. MENDEL. 23, 1 (Jun. 2017), 141-148. DOI: