Abstract
Monte Carlo simulation is an indispensable tool for calculating high-dimensional integrals. Although Monte Carlo integration is notorious for its slow convergence, its performance can be improved by various variance reduction techniques. This paper applies orthogonal projections to study the amount of variance reduction, and also proposes a novel projection estimator associated with a group of symmetries of the probability measure. For a given space of functions, the average variance reduction can be derived; for a specific function, its variance reduction is also analyzed. The well-known antithetic estimator is a special case of the projection estimator, and new results on its variance reduction and efficiency are provided. Various illustrations, including the pricing of financial Asian options, confirm our claims.
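The antithetic estimator mentioned in the abstract can be illustrated with a minimal Monte Carlo sketch. The integrand \(e^{u}\) and the pairing \(u \mapsto 1-u\) are illustrative choices, not taken from the paper:

```python
import math
import random
import statistics

def antithetic_estimate(f, n, seed=0):
    """Estimate E[f(U)], U ~ Uniform(0, 1), from n antithetic pairs.

    The map u -> 1 - u is a symmetry of the uniform measure, so the
    pair average (f(u) + f(1 - u)) / 2 is an unbiased estimator in
    which the "odd" part of f about u = 1/2 has been projected out.
    """
    rng = random.Random(seed)
    pairs = [0.5 * (f(u) + f(1.0 - u)) for u in (rng.random() for _ in range(n))]
    return statistics.mean(pairs), statistics.variance(pairs)

# For a monotone integrand such as exp, the two halves of each pair are
# negatively correlated, so the per-pair variance is far below var(exp(U)).
mean_anti, var_anti = antithetic_estimate(math.exp, 10_000)
```

Here the pair average plays the role of the projected integrand: it integrates to the same value as \(f\) but with a strictly smaller variance whenever \(f\) has a nonzero odd part.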
*(Figures 1–7 appear in the published article.)*
Data Availability
The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.
Ethics declarations
Conflict of Interest
The authors declare that they have no conflict of interest.
Acknowledgements
We benefited from the very helpful comments of the Editor and two anonymous referees. The first author was supported by the Ministry of Science and Technology of Taiwan, ROC, under Grant 108-2118-M-009-001-MY2, and the second author was supported under Grant 108-2115-M-009-007-MY2.
Appendices
Appendix A: Proof of Lemma 1
Proof
To prove that \(P_{\mathbb {E}}\) is a projection, we need to show that it satisfies conditions C1 and C2. For condition C1, note that for any \(f\in \mathcal {F}\),
Hence, we have \(P^{2}_{\mathbb {E}}=P_{\mathbb {E}}\). For condition C2, for any \(f,g\in \mathcal {F}\), we have
As a result, \(P_{\mathbb {E}}\) is an orthogonal projection. □
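The two displays stripped from this proof can be sketched as follows, assuming (as the notation suggests) that \(P_{\mathbb {E}}\) maps \(f\) to the constant function \(\mathbb {E}(f(X))\):

```latex
% Sketch under the assumption P_E(f) = E(f(X)), a constant function.
% C1 (idempotence): for any f in F,
P_{\mathbb{E}}^{2}(f)
  = P_{\mathbb{E}}\big(\mathbb{E}(f(X))\big)
  = \mathbb{E}\big(\mathbb{E}(f(X))\big)
  = \mathbb{E}(f(X)) = P_{\mathbb{E}}(f).
% C2 (self-adjointness): for any f, g in F,
\langle P_{\mathbb{E}}(f), g \rangle
  = \mathbb{E}(f(X))\,\mathbb{E}(g(X))
  = \langle f, P_{\mathbb{E}}(g) \rangle.
```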
Appendix B: Proof of Theorem 1
Proof
Let V0 be the space of constant functions. Since \(P(\mathcal {F}) \supset V_{0}\), we have \( P(\mathcal {F})^{\perp } \subset V_{0}^{\perp }\). For all \(f \in \mathcal {F}\), because \( (f- P(f)) \in P(\mathcal {F})^{\perp }\), it is clear that
As a result, the expectation of f(X) equals
In addition, we obtain
where the last equality holds by Eq. 2. Therefore, we obtain
□
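The displays stripped from this proof can be reconstructed in hedged form: since \(f-P(f)\) is orthogonal to the constants \(V_{0}\), one expects

```latex
% Sketch: f - P(f) is orthogonal to the constant function 1, so
\mathbb{E}\big((f - P(f))(X)\big) = \langle f - P(f), 1 \rangle = 0,
% hence the estimator P(f)(X) is unbiased:
\mathbb{E}\big(f(X)\big) = \mathbb{E}\big(P(f)(X)\big).
% Orthogonality of P(f) and f - P(f) (Eq. 2) then gives the decomposition
\mathrm{var}\big(f(X)\big)
  = \mathrm{var}\big(P(f)(X)\big) + \mathrm{var}\big((f - P(f))(X)\big),
% so var(P(f)(X)) <= var(f(X)).
```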
Appendix C: Proof of Lemma 2
Proof
Since g is a symmetry of μX, we have
Hence, fg(X) is an unbiased estimator. By the same token, we also have \(\mathbb {E}\big (f_{g}(X)^{2}\big ) = \mathbb {E}\big (f(X)^{2}\big )\). Now we have
Because both \(\mathbb {E}\big (f_{g}(X)\big )\) and var(fg(X)) are well-defined, fg remains in \(\mathcal {F}\). □
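The change-of-variables step behind this lemma can be sketched, assuming \(f_{g}(x)=f(gx)\) and that the symmetry \(g\) preserves \(\mu _{X}\):

```latex
% Sketch: g a symmetry of mu_X means d mu_X(g x) = d mu_X(x), hence
\mathbb{E}\big(f_{g}(X)\big)
  = \int f(gx)\, d\mu_{X}(x)
  = \int f(y)\, d\mu_{X}(y)
  = \mathbb{E}\big(f(X)\big).
```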
Appendix D: Proof of Theorem 2
Proof
First, we show that PG is a linear transformation. For \(f_{1}, f_{2}\in \mathcal {F}\), and \(\alpha \in \mathbb {R}\), it is clear that
Therefore, we conclude that PG is a linear transformation on \(\mathcal {F}\).
Next, let us show that \(P_{G}={P_{G}^{2}}\). For all \(f \in \mathcal {F}\) and all g ∈ G,
Here we use the fact that left multiplication by g merely permutes the elements of G and therefore does not change the summation. Now we have
Finally, let us show that for \(f_{1}, f_{2} \in \mathcal {F}\), \(\langle P_{G}(f_{1}), f_{2} - P_{G}(f_{2})\rangle = 0\), or equivalently \(\langle P_{G}(f_{1}), P_{G}(f_{2})\rangle = \langle P_{G}(f_{1}), f_{2}\rangle \). Now we have
Now let us change variables from x to \(y=g^{\prime }x\). Together with the property that \(d\mu (gx) = d\mu (x)\) and the fact that right multiplication by \(g^{\prime -1}\) is a permutation of G, we can rewrite the above equation as
We conclude that PG is an orthogonal projection. For the last part of the theorem, it is clear that PG(1) = 1, which implies that \(P_{G}(\mathcal {F}) \supset P_{G}(\mathbb {R}) = \mathbb {R}\). □
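A numerical sketch of the projection estimator may help here. Assuming, as the proof suggests, that \(P_{G}(f)(x) = |G|^{-1}{\sum }_{g\in G} f(gx)\), the following snippet averages f over a finite symmetry group of the sampling law; the sign-flip group and the integrand are illustrative assumptions, not taken from the paper:

```python
import random
import statistics

def projection_estimate(f, group, sampler, n, seed=0):
    """Monte Carlo estimate of E[f(X)] via the group average P_G(f).

    Each g in `group` must preserve the law of X; then
    P_G(f)(x) = (1/|G|) * sum_{g in G} f(g(x)) has the same mean as f(X)
    and never a larger variance (it is an orthogonal projection of f).
    """
    rng = random.Random(seed)
    vals = []
    for _ in range(n):
        x = sampler(rng)
        vals.append(sum(f(g(x)) for g in group) / len(group))
    return statistics.mean(vals), statistics.variance(vals)

# The sign flip x -> -x preserves the standard normal law; averaging over
# G = {id, flip} annihilates the odd part of f(x) = x + x**2, leaving x**2.
f = lambda x: x + x * x
group = [lambda x: x, lambda x: -x]
mean_pg, var_pg = projection_estimate(f, group, lambda r: r.gauss(0.0, 1.0), 10_000)
```

With G = {id, flip}, this reduces exactly to the antithetic estimator for a symmetric law, which is the special case highlighted in the abstract.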
Appendix E: Proof of Proposition 1
Proof
Let \(a_{g}=\langle I_{gD_{0}}, f \rangle \) for short. (1) Consider the following equations.
(2) Consider the following equations.
On the other hand, we have
From the above result, we have
which implies the following
(3) By definition, we have
Note that \(hx \in gD_{0}\) if and only if \(x \in h^{-1}gD_{0}\). We can rewrite the above equation as
Here we use the fact that \(\{h^{-1}g : h \in G\}\) equals G as a set. Next, consider
Combining the above two results, we have shown that
which is a constant function except on a set of measure zero; this implies that \(\text {var}\big (P_{G}(f)_{a}(X)\big ) = 0\).
(4) Applying (3), we have
Together with Corollary 1, we have
□
About this article
Cite this article
Teng, HW., Kang, MH. On Accelerating Monte Carlo Integration Using Orthogonal Projections. Methodol Comput Appl Probab 24, 1143–1168 (2022). https://doi.org/10.1007/s11009-021-09893-3