==Introduction==
'''Deep BSDE''' (Deep Backward Stochastic Differential Equation) is a numerical method that combines [[deep learning]] with [[backward stochastic differential equation]]s (BSDEs). This method is particularly useful for solving high-dimensional problems in [[financial derivatives]] pricing and [[risk management]]. By leveraging the powerful function approximation capabilities of [[deep neural networks]], deep BSDE addresses the computational challenges faced by traditional numerical methods in high-dimensional settings<ref name="Han2018">{{cite journal | last1=Han | first1=J. | last2=Jentzen | first2=A. | last3=E | first3=W. | title=Solving high-dimensional partial differential equations using deep learning | journal=Proceedings of the National Academy of Sciences | volume=115 | issue=34 | pages=8505-8510 | year=2018 }}</ref>.
==History==
BSDEs were first introduced by [[Étienne Pardoux]] and [[Shige Peng]] in 1990 <ref name="Pardoux1990">{{cite journal | last1=Pardoux | first1=E. | last2=Peng | first2=S. | title=Adapted solution of a backward stochastic differential equation | journal=Systems & Control Letters | volume=14 | issue=1 | pages=55-61 | year=1990 }}</ref>, who established the existence and uniqueness theory for their solutions and applied them to financial mathematics and control theory. BSDEs have since become essential tools in [[stochastic control]] and [[financial mathematics]]; for instance, they are widely used in option pricing, risk measurement, and dynamic hedging.

Deep neural network methods are a class of [[machine learning]] techniques, often referred to as [[deep learning]], distinguished by a hierarchical architecture of multiple layers of interconnected nodes, or neurons. This architecture allows deep neural networks to learn abstract representations of data automatically, making them particularly effective in tasks such as [[image recognition]], [[natural language processing]], and [[financial modeling]]<ref>{{cite journal | last1=LeCun | first1=Y. | last2=Bengio | first2=Y. | last3=Hinton | first3=G. | title=Deep learning | journal=Nature | volume=521 | issue=7553 | pages=436-444 | year=2015 }}</ref>.
Backward Stochastic Differential Equations (BSDEs) represent a powerful mathematical tool extensively applied in fields such as [[stochastic control]], [[financial mathematics]], and beyond. Unlike traditional [[stochastic differential equation]]s (SDEs), which are solved forward in time, BSDEs are solved backward, starting from a future time and moving back to the present. This characteristic makes BSDEs particularly suitable for problems involving terminal conditions and uncertainties<ref name="Pardoux1990">{{cite journal | last1=Pardoux | first1=E. | last2=Peng | first2=S. | title=Adapted solution of a backward stochastic differential equation | journal=Systems & Control Letters | volume=14 | issue=1 | pages=55-61 | year=1990 }}</ref>.
[[Deep Learning]] is a [[machine learning]] method based on multilayer [[neural networks]]. Its core concept can be traced back to the neural computing models of the 1940s. In the 1980s, the proposal of the [[backpropagation]] algorithm made the training of multilayer neural networks possible. In 2006, the [[Deep Belief Networks]] proposed by [[Geoffrey Hinton]] and others rekindled interest in deep learning. Since then, deep learning has made groundbreaking advancements in [[image processing]], [[speech recognition]], [[natural language processing]], and other fields.
The combination of deep learning with BSDEs, known as deep BSDE, was proposed by Han, Jentzen, and E in 2018 as a solution to the high-dimensional challenges faced by traditional numerical methods<ref name="Han2018" />. The approach exploits the nonlinear approximation power of deep neural networks: the unknown solution of the BSDE at each time step is represented as the output of a neural network, and the network parameters are trained so that the simulated dynamics satisfy the terminal condition.
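The training idea can be illustrated with a deliberately simplified sketch. All concrete choices below are assumptions for illustration: the driver is taken as <math>f = 0</math>, the terminal condition as <math>g(x) = \textstyle\sum_i x_i</math>, the forward process as a Brownian motion (exact solution <math>Y_0 = 0</math>, <math>Z_t = (1, \ldots, 1)</math>), and the per-step neural networks for <math>Z</math> are replaced by trainable constant vectors so the gradients of the terminal loss can be written by hand; the actual method uses feed-forward networks and automatic differentiation.

```python
import numpy as np

# Simplified deep BSDE training loop (illustrative assumptions, see text):
# driver f = 0, terminal condition g(x) = sum(x), forward process X_t = W_t,
# and each "network" for Z_{t_n} replaced by a trainable constant vector.
rng = np.random.default_rng(0)
d, N, T = 5, 20, 1.0            # dimension, time steps, horizon
batch, lr, iters = 256, 0.1, 500
dt = T / N

y0 = np.array(1.0)              # trainable estimate of Y_0
z = np.zeros((N, d))            # trainable stand-ins for Z_{t_n}

for _ in range(iters):
    dW = rng.normal(0.0, np.sqrt(dt), size=(batch, N, d))
    # Forward simulation: Y_{t_{n+1}} = Y_{t_n} - f dt + Z_{t_n} . dW_n  (f = 0)
    Y = np.full(batch, y0)
    for n in range(N):
        Y = Y + dW[:, n, :] @ z[n]
    err = Y - dW.sum(axis=(1, 2))            # Y_T minus g(X_T) = sum_i W_T^i
    # Gradient descent on the mean-squared terminal loss E[err^2],
    # with the gradients written out analytically for this linear model
    y0 = y0 - lr * 2.0 * err.mean()
    for n in range(N):
        z[n] = z[n] - lr * 2.0 * (err[:, None] * dW[:, n, :]).mean(axis=0)
# After training, y0 approaches the exact value 0 and each z[n]
# approaches the exact gradient (1, ..., 1).
```

In the full method, the analytic gradient steps are replaced by backpropagation through the per-step networks, but the structure — simulate forward, penalize the terminal mismatch, update the parameters — is the same.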
==Model==
===Mathematical Method===
A backward stochastic differential equation can be formulated as:
<math> Y_t = \xi + \int_t^T f(s, Y_s, Z_s) \, ds - \int_t^T Z_s \, dW_s, \quad t \in [0, T] </math>
where <math>\xi</math> is the prescribed terminal condition, <math>f</math> is the driver (or generator), <math>W</math> is a standard [[Brownian motion]], and the adapted pair <math>(Y_t, Z_t)</math> is the unknown solution.
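For numerical treatment, the equation is discretized forward in time; a standard Euler–Maruyama scheme on a grid <math>0 = t_0 < t_1 < \cdots < t_N = T</math> reads
<math> Y_{t_{n+1}} \approx Y_{t_n} - f(t_n, Y_{t_n}, Z_{t_n}) \, \Delta t_n + Z_{t_n} \, \Delta W_n, \qquad \Delta t_n = t_{n+1} - t_n, \quad \Delta W_n = W_{t_{n+1}} - W_{t_n}, </math>
which follows from the differential form <math> dY_t = -f(t, Y_t, Z_t) \, dt + Z_t \, dW_t </math> of the integral equation above.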
===Neural Network Architecture===
[[File:Deep BSDE Method.png|thumb|Neural Network Framework of Deep BSDE Method]]
The core of this method lies in designing an appropriate neural network structure (such as [[fully connected network|fully connected networks]] or [[recurrent neural networks]]) and selecting effective optimization algorithms.
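A minimal sketch of one such per-time-step fully connected subnetwork, mapping the state <math>X_{t_n}</math> to an approximation of <math>Z_{t_n}</math>, is given below. The layer sizes, ReLU activation, and initialization are illustrative assumptions, not values prescribed by the method.

```python
import numpy as np

def init_subnet(rng, d_in, d_hidden, d_out):
    # One small fully connected subnetwork with a single hidden layer;
    # sizes and scaled-Gaussian initialization are illustrative choices.
    return {
        "W1": rng.normal(0.0, 1.0 / np.sqrt(d_in), (d_in, d_hidden)),
        "b1": np.zeros(d_hidden),
        "W2": rng.normal(0.0, 1.0 / np.sqrt(d_hidden), (d_hidden, d_out)),
        "b2": np.zeros(d_out),
    }

def subnet_forward(p, x):
    # Forward pass: hidden layer with ReLU, then a linear output layer,
    # mapping a batch of states X_{t_n} to approximations of Z_{t_n}.
    h = np.maximum(x @ p["W1"] + p["b1"], 0.0)
    return h @ p["W2"] + p["b2"]

rng = np.random.default_rng(0)
d, N = 10, 20                                  # state dimension, time steps
nets = [init_subnet(rng, d, 16, d) for _ in range(N)]   # one subnet per t_n
x = rng.normal(size=(32, d))                   # a batch of sampled states
z0 = subnet_forward(nets[0], x)                # approximate Z_{t_0}; shape (32, 10)
```

Using one independent subnetwork per time step, as sketched here, is one common design; the parameters of all subnetworks, together with the initial value <math>Y_0</math>, are then trained jointly against the terminal condition.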