Deep backward stochastic differential equation method

'''Deep BSDE''' (Deep Backward Stochastic Differential Equation) is a numerical method that combines [[deep learning]] with [[backward stochastic differential equation]]s (BSDEs). The method is particularly useful for solving high-dimensional problems in [[financial derivatives]] pricing and [[risk management]]. By leveraging the function approximation capabilities of [[deep neural networks]], deep BSDE addresses the computational challenges faced by traditional numerical methods in high-dimensional settings<ref name="Han2018">{{cite journal | last1=Han | first1=J. | last2=Jentzen | first2=A. | last3=E | first3=W. | title=Solving high-dimensional partial differential equations using deep learning | journal=Proceedings of the National Academy of Sciences | volume=115 | issue=34 | pages=8505–8510 | year=2018 }}</ref>.
==History==
Backward stochastic differential equations were first introduced by [[Jean-Michel Bismut]] in 1973 in the linear case<ref name="Bismut1973">{{cite journal | last1=Bismut | first1=Jean-Michel | year=1973 | title=Conjugate convex functions in optimal stochastic control | journal=Journal of Mathematical Analysis and Applications | volume=44 | issue=2 | pages=384–404 | doi=10.1016/0022-247X(73)90066-8 }}</ref> and have since become essential tools in [[stochastic control]] and [[financial mathematics]]. In the 1990s, [[Étienne Pardoux]] and [[Shige Peng]] established the existence and uniqueness theory for nonlinear BSDE solutions, applying BSDEs to financial mathematics and control theory<ref name="Pardoux1990">{{cite journal | last1=Pardoux | first1=E. | last2=Peng | first2=S. | title=Adapted solution of a backward stochastic differential equation | journal=Systems & Control Letters | volume=14 | issue=1 | pages=55–61 | year=1990 }}</ref>. For instance, BSDEs have been widely used in option pricing, risk measurement, and dynamic hedging.
 
[[Deep learning]] is a [[machine learning]] method based on multilayer [[neural networks]]. Its core concept traces back to the neural computing models of the 1940s. In the 1980s, the [[backpropagation]] algorithm made the training of multilayer neural networks practical. In 2006, the [[Deep Belief Networks]] proposed by [[Geoffrey Hinton]] and collaborators rekindled interest in deep learning. Since then, deep learning has made groundbreaking advances in [[image processing]], [[speech recognition]], [[natural language processing]], and other fields.
==Model==
===Mathematical Method===
Backward stochastic differential equations (BSDEs) are a powerful mathematical tool applied extensively in fields such as [[stochastic control]] and [[financial mathematics]]. Unlike traditional [[stochastic differential equation]]s (SDEs), which are solved forward in time, BSDEs are solved backward, starting from a terminal condition at a future time and moving back to the present. This characteristic makes BSDEs particularly suitable for problems involving terminal conditions and uncertainties<ref name="Pardoux1990">{{cite journal | last1=Pardoux | first1=E. | last2=Peng | first2=S. | title=Adapted solution of a backward stochastic differential equation | journal=Systems & Control Letters | volume=14 | issue=1 | pages=55–61 | year=1990 }}</ref>.
 
Fix a terminal time <math>T>0</math> and a [[probability space]] <math>(\Omega,\mathcal{F},\mathbb{P})</math>. Let <math>(B_t)_{t\in [0,T]}</math> be a [[Brownian motion]] with natural filtration <math>(\mathcal{F}_t)_{t\in [0,T]}</math>. A backward stochastic differential equation is an integral equation of the type

{{NumBlk|:|<math>Y_t = \xi + \int_t^T f(s,Y_s,Z_s) \mathrm{d}s - \int_t^T Z_s \mathrm{d}B_s,\quad t\in[0,T],</math>|{{EquationRef|1}}}}

In this equation:
* <math>f:[0,T]\times\mathbb{R}\times\mathbb{R}\to\mathbb{R}</math> is called the generator of the BSDE,
* <math>\xi</math> is an <math>\mathcal{F}_T</math>-measurable random variable, the terminal condition specified at time <math>T</math>,
* the solution <math>(Y_t,Z_t)_{t\in[0,T]}</math> consists of stochastic processes <math>(Y_t)_{t\in[0,T]}</math> and <math>(Z_t)_{t\in[0,T]}</math> that are adapted to the filtration <math>(\mathcal{F}_t)_{t\in [0,T]}</math>,
* <math>B_s</math> is a standard [[Brownian motion]].
 
The goal is to find adapted processes <math> Y_t </math> and <math> Z_t </math> that satisfy this equation. Traditional numerical methods struggle with BSDEs due to the curse of dimensionality, which makes computations in high-dimensional spaces extremely challenging.
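The time-discretized version of equation ({{EquationNote|1}}) can be simulated forward with an Euler scheme: starting from a guess for <math>Y_0</math> and a candidate control <math>Z</math>, one measures how badly the terminal condition <math>\xi</math> is missed. Minimizing this terminal mismatch over the guesses is the quantity the deep BSDE approach optimizes. The following is a minimal NumPy sketch of that discretization; the function name and the toy choice <math>f=0</math>, <math>\xi=B_T</math> (whose exact solution is <math>Y_t=B_t</math>, <math>Z_t=1</math>) are illustrative assumptions, not code from the cited references.

```python
import numpy as np

def simulate_terminal_gap(y0, z_fn, f, xi_fn, T=1.0, N=50, M=10000, seed=0):
    """Forward Euler discretization of the BSDE:
        Y_{n+1} = Y_n - f(t_n, Y_n, Z_n) * dt + Z_n * dB_n.
    Returns the Monte Carlo estimate of E[(Y_T - xi)^2], the terminal
    mismatch that the deep BSDE method minimizes over (y0, Z)."""
    rng = np.random.default_rng(seed)
    dt = T / N
    dB = rng.normal(0.0, np.sqrt(dt), size=(M, N))  # Brownian increments
    B = np.zeros(M)        # Brownian path value B_{t_n}, per sample
    Y = np.full(M, y0)     # candidate solution process Y_{t_n}
    for n in range(N):
        t = n * dt
        Z = z_fn(t, B)     # candidate control process Z_{t_n}
        Y = Y - f(t, Y, Z) * dt + Z * dB[:, n]
        B = B + dB[:, n]
    return np.mean((Y - xi_fn(B)) ** 2)

# Toy BSDE: generator f = 0, terminal condition xi = B_T.
# The exact solution Y_t = B_t, Z_t = 1 means the guess
# y0 = 0, Z = 1 drives the terminal mismatch to (numerically) zero.
gap = simulate_terminal_gap(
    y0=0.0,
    z_fn=lambda t, b: np.ones_like(b),
    f=lambda t, y, z: 0.0,
    xi_fn=lambda b: b,
)
```

In the deep BSDE method proper, the scalar guess <math>Y_0</math> and the map <math>Z</math> at each time step are replaced by trainable parameters and neural networks, and the terminal mismatch above serves as the training loss.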
 
===Neural Network Architecture===
[[File:Deep BSDE Method.png|thumb|Neural Network Framework of Deep BSDE Method]]