Martingale optimal transport: an application to robust option pricing
- Authors:
- Corredor Montenegro, David
- Resource type:
- Undergraduate thesis
- Publication date:
- 2023
- Institution:
- Universidad de los Andes
- Repository:
- Séneca: repositorio Uniandes
- Language:
- eng
- OAI Identifier:
- oai:repositorio.uniandes.edu.co:1992/74196
- Online access:
- https://hdl.handle.net/1992/74196
- Keywords:
- Robust option pricing
- Martingale optimal transport
- Deep learning
- Mathematics
- Rights
- openAccess
- License
- Attribution-ShareAlike 4.0 International
Summary: Financial markets are inherently fraught with uncertainty, which translates directly into various forms of risk. Among these, model risk (the risk of making poor decisions based on inadequate models) stands out for its profound implications for financial decision-making. This thesis addresses model risk by proposing a robust approach to the pricing and hedging of financial derivatives, aimed at minimizing exposure to model inaccuracies. Traditional valuation methods rely heavily on a single fixed probability measure, so the same derivative can receive disparate values under different models. Our approach instead establishes bounds within which the true value of a derivative must fall, by considering all models that are consistent with observed market prices and preclude arbitrage opportunities; this amounts to optimizing over all martingale measures whose marginals match those observed in the market. This motivation sets the stage for the Martingale Optimal Transport (MOT) problem, the core focus of this thesis. We present its primal and dual formulations in a general setting and give a financial interpretation of the dual in terms of super-hedging (super-replication) strategies. Following the work of Eckstein and Kupper [15], we then approximate the problem along two dimensions: (i) the complicating super-replication constraints are penalized in the objective function, and (ii) the solution space is restricted to functions that can be specified with finitely many parameters (a specific class of neural networks). The relaxed problem is shown to converge to the optimal value as the approximation quality increases, and it can be solved numerically because it is an unconstrained smooth optimization problem amenable to gradient descent type algorithms. We implement this algorithm and test it both in simple scenarios, to validate its behavior, and in more general settings, to evaluate its performance and efficiency; in both cases the results are consistent with the theory. Despite the convergence guarantees and the deep learning tools used for the unconstrained problem, we cannot escape the curse of dimensionality: the solution time and problem sizes required remain significant.
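For context, the primal and dual problems described in the summary correspond, in the simplest two-marginal case, to the following standard formulation; the notation here is assumed for illustration and is not taken from the thesis.

```latex
% Primal: maximize the expected payoff c(S_1, S_2) over all martingale
% measures Q whose marginals are the market-implied distributions mu_1, mu_2.
P(c) = \sup_{Q \in \mathcal{M}(\mu_1,\mu_2)} \mathbb{E}_Q\big[c(S_1,S_2)\big],
\qquad
\mathcal{M}(\mu_1,\mu_2) = \big\{Q : S_1 \sim \mu_1,\ S_2 \sim \mu_2,\ \mathbb{E}_Q[S_2 \mid S_1] = S_1\big\}.

% Dual: the cheapest super-hedge, consisting of static vanilla positions
% u_1, u_2 at the two maturities plus a self-financing dynamic position
% Delta in the underlying, required to dominate the payoff pathwise.
D(c) = \inf\Big\{\int u_1\,d\mu_1 + \int u_2\,d\mu_2 \ :\
u_1(s_1) + u_2(s_2) + \Delta(s_1)(s_2 - s_1) \geq c(s_1,s_2)\ \ \forall\, s_1, s_2\Big\}.
```

The dual is the super-hedging interpretation the summary refers to: any feasible triple (u_1, u_2, \Delta) is a portfolio whose market cost bounds the derivative's value from above.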
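Below is a minimal sketch of the penalization-plus-neural-networks scheme the summary attributes to Eckstein and Kupper [15], applied to the dual above. Everything concrete in it (the network architecture, the log-normal samplers standing in for mu_1, mu_2 and the reference measure, the quadratic penalty, the payoff, and the hyperparameters) is an illustrative assumption, not the thesis's actual configuration.

```python
import torch
import torch.nn as nn

# Small feed-forward networks for the static positions u1, u2 and the dynamic
# hedge Delta (hypothetical architecture; the thesis specifies its own class).
def mlp(width: int = 64) -> nn.Sequential:
    return nn.Sequential(
        nn.Linear(1, width), nn.ReLU(),
        nn.Linear(width, width), nn.ReLU(),
        nn.Linear(width, 1),
    )

u1, u2, delta = mlp(), mlp(), mlp()
params = [*u1.parameters(), *u2.parameters(), *delta.parameters()]
opt = torch.optim.Adam(params, lr=1e-3)

def payoff(s1: torch.Tensor, s2: torch.Tensor) -> torch.Tensor:
    # Illustrative claim: a forward-start call on the price increment.
    return torch.clamp(s2 - s1, min=0.0)

gamma = 500.0  # penalty strength; larger gamma enforces the constraint harder

for step in range(10_000):
    # Samples from the two marginals (for pricing the static legs) and from a
    # reference measure on the product space (for checking super-replication).
    s1 = torch.exp(0.2 * torch.randn(4096, 1))
    s2 = torch.exp(0.3 * torch.randn(4096, 1))
    t1 = torch.exp(0.2 * torch.randn(4096, 1))
    t2 = torch.exp(0.3 * torch.randn(4096, 1))

    # Shortfall of the hedging portfolio below the payoff, penalized
    # quadratically: this replaces the hard super-replication constraint.
    shortfall = payoff(t1, t2) - u1(t1) - u2(t2) - delta(t1) * (t2 - t1)
    penalty = gamma * torch.clamp(shortfall, min=0.0).pow(2).mean()

    # Unconstrained smooth objective: cost of the static legs plus the penalty.
    loss = u1(s1).mean() + u2(s2).mean() + penalty
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the constraint is penalized rather than enforced, the objective is smooth and unconstrained, which is what makes plain gradient descent applicable; as gamma grows and the networks widen, the minimum approaches the true super-hedging bound, matching the convergence behavior described in the summary.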