On long term investment optimality (1609.00587v6)

Published 2 Sep 2016 in math.PR

Abstract: We study the problem of optimal long term portfolio selection with a view to beat a benchmark. Two kinds of objectives are considered. One concerns the probability of outperforming the benchmark and seeks either to minimise the decay rate of the probability that the portfolio exceeds the benchmark or to maximise the decay rate of the probability that the portfolio falls short. The other criterion concerns the growth rate of the risk-sensitive utility of wealth, which has to be either minimised, for the risk-averse investor, or maximised, for the risk-seeking investor. It is assumed that the mean returns and volatilities of the securities are affected by an economic factor, possibly in a nonlinear fashion. The economic factor and the benchmark are modelled with general Itô differential equations. The results identify optimal portfolios and produce decay, or growth, rates. The portfolios have the form of time-homogeneous functions of the economic factor. Furthermore, a uniform treatment is given to the out- and under-performance probability optimisation as well as to the risk-averse and risk-seeking optimisation. In particular, it is shown that there exists a portfolio that optimises the decay rates of both the outperformance probability and the underperformance probability. Whereas earlier research on the subject has relied on the techniques of stochastic optimal control and dynamic programming, in this contribution the quantities of interest are studied directly by employing the methods of large deviation theory. The key to the analysis is to recognise the setup in question as a case of coupled diffusions with time scale separation, with the economic factor representing "the fast process".
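To make the two families of objectives concrete, the following is a schematic sketch in standard notation; the symbols $V_T$ (portfolio wealth at horizon $T$), $B_T$ (benchmark level) and $\theta$ (risk-sensitivity parameter) are illustrative conventions chosen here, not notation quoted from the paper.

\[
\text{outperformance criterion:}\quad \liminf_{T\to\infty}\frac{1}{T}\log \mathbb{P}\!\left(\frac{V_T}{B_T}\ge 1\right),
\qquad
\text{underperformance criterion:}\quad \limsup_{T\to\infty}\frac{1}{T}\log \mathbb{P}\!\left(\frac{V_T}{B_T}<1\right),
\]
\[
\text{risk-sensitive growth rate:}\quad \limsup_{T\to\infty}\frac{1}{\theta T}\log \mathbb{E}\!\left[\left(\frac{V_T}{B_T}\right)^{\theta}\right].
\]

In line with the abstract, the outperformance probability should decay as slowly as possible and the underperformance probability as quickly as possible, while the risk-sensitive rate is minimised or maximised depending on whether the investor is risk-averse or risk-seeking.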

