Date: 2017-08-07
Portfolio optimization and scenario mapping
As long as one understands its pros and cons, portfolio optimization via CAPM can help an investor stay disciplined in portfolio construction by providing a sound starting point and a structure for rebalancing.
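As a concrete, if simplified, illustration of that starting point, the classic mean-variance tangency portfolio can be computed in closed form. The expected returns and covariance below are purely hypothetical placeholders for three asset classes:

```python
import numpy as np

def tangency_weights(mu, cov, rf=0.0):
    """Closed-form tangency-portfolio weights under mean-variance
    assumptions: w is proportional to inv(Sigma) @ (mu - rf),
    normalized so the weights sum to 1."""
    excess = np.asarray(mu, dtype=float) - rf
    raw = np.linalg.solve(np.asarray(cov, dtype=float), excess)
    return raw / raw.sum()

# Hypothetical annual expected returns and covariance matrix
mu = [0.06, 0.04, 0.02]
cov = [[0.04, 0.01, 0.00],
       [0.01, 0.02, 0.00],
       [0.00, 0.00, 0.01]]
w = tangency_weights(mu, cov)
print(np.round(w, 3))  # weights sum to 1
```

The point is not the specific weights but the discipline: any rebalancing discussion can start from a reproducible baseline rather than from scratch.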
While I very much enjoy creative and relatively ad hoc investments, some degree of consistency in portfolio management seems helpful for the sake of sustainability. Especially for someone who has made portfolio management a career, it makes sense to set up a framework that can function as an anchor, because good inspirations come and go while investing is continuous.
Deciding whether certain past trends will persist into the future is a highly subjective exercise. Faced with an overwhelming number of variables and a sheer volume of data, analysts often end up arguing that whatever happened over the longest available time frame reflects the natural order for the asset's past and future performance. In this way, forecasters essentially extrapolate past trends into the future. While the outcome of this may not necessarily be disappointing, a less hubristic approach, one that focuses on the range of possibilities rather than on the likelihood of a specific situation occurring, is likely to yield a better result. For this reason, I view the preparation of well-thought-through data points as the most challenging step of the portfolio optimization process.
When forecasting a scenario-specific outcome, the relative performance of different assets (the quantity most extensively used in scenario mapping in the traditional portfolio optimization setting) is in fact extremely difficult to forecast. Prudent portfolio construction should involve attention to risk asymmetry, and assessing risk asymmetry requires the forecaster to consider both the direction and the magnitude of price movements. Therefore, even in a simulation with only two assets, and even if we treat causal relations cross-sectionally and assume no reverse causation, at least five scenarios need to be considered in the most simplified version: (1) negligible correlation, (2) positive correlation with low beta, (3) positive correlation with high beta, (4) negative correlation with low beta, and (5) negative correlation with high beta. The number of scenarios grows substantially when just one more asset is added, and the correlation table becomes too complicated once interaction terms are added, even though interaction terms increasingly make intuitive sense given the improved interconnectedness among market segments and the rapidly growing ETF market.
It makes sense to distinguish between two different uses of the optimization technique: (1) assessing desired exposures and (2) estimating future performance. For the former, the chosen assets need to represent the investment universe comprehensively; comprehensiveness matters because the future performance of an individual asset can deviate significantly both from its historical norm and from the rest of the asset classes. For the latter, deciding which assets to include in the portfolio is a relatively straightforward bottom-up process, and I am of the view that it makes sense to eliminate assets with unfavorable risk profiles altogether in the screening process.
I view the following as the key variables to assess:
For expected return: valuation, secular trend, sector view, and, for the currency market, the real effective exchange rate (REER) and the degree of pegging.
For volatility: market microstructure, meaning the roles played by newly introduced or popularized financial products (such as MBS and CDS in the periods before the subprime mortgage and Lehman crises, and ETFs in the current period) and techniques (such as high-frequency trading); whether momentum strategies are gaining popularity; the magnitude of deviation from the theoretical equilibrium valuation (to estimate how long valuations should not be relied on as a basis for market recovery in the event of a meltdown); and the level of leverage.
Additionally, it increasingly makes sense to keep the risk-free rate at 0% for prudence's sake. This way, the optimization model may better capture the impact of a negative interest rate environment. It is important to pair this with comprehensiveness of the assets used in the optimization for assessing desired exposures; otherwise the model would be flawed.
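One place where the 0% assumption enters directly is the Sharpe ratio, since the risk-free rate is subtracted from returns before scaling by volatility. A small sketch with hypothetical monthly returns:

```python
import numpy as np

def sharpe(returns, rf=0.0):
    """Per-period Sharpe ratio of a return series; rf is the assumed
    risk-free rate per period. Setting rf=0 is the conservative
    default discussed above."""
    excess = np.asarray(returns, dtype=float) - rf
    return excess.mean() / excess.std(ddof=1)

# Hypothetical monthly returns
r = [0.010, -0.004, 0.007, 0.012, -0.002, 0.006]
print(sharpe(r, rf=0.0))
print(sharpe(r, rf=0.002))  # a positive rf lowers the measured ratio
```

Because subtracting a constant leaves the volatility unchanged, a positive risk-free rate can only shrink a positive Sharpe ratio; holding it at zero keeps the comparison across assets on a single consistent footing.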
Another interesting, and somewhat ironic, thought: it makes sense to keep the time frame narrow and seek to understand many minutiae, rather than one long continuous time series. This adds a different sort of explanatory power, and I view it as a more responsible approach in that one has a better chance of understanding each specific case well.
There are still many things that puzzle me about the use of the Black-Litterman model. Does it make sense to express additional views without changing the raw data? (I am particularly troubled because the Black-Litterman model seems to assume that relative assessment of assets' expected returns is more easily done than estimation of absolute returns.) And what can Black-Litterman do that traditional CAPM cannot, if one is diligent enough to keep micromanaging the data sets?
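For concreteness, here is a minimal sketch of the Black-Litterman update with entirely hypothetical inputs (two assets, one relative view). It illustrates the relative-view mechanics in question: a view tilts the equilibrium-implied returns without touching the raw covariance data:

```python
import numpy as np

# Hypothetical prior: covariance, market-cap weights, risk aversion,
# and the scalar tau governing uncertainty in the equilibrium prior
cov   = np.array([[0.04, 0.01],
                  [0.01, 0.02]])
w_mkt = np.array([0.6, 0.4])
delta, tau = 2.5, 0.05

pi = delta * cov @ w_mkt              # implied equilibrium returns

P = np.array([[1.0, -1.0]])           # relative view: asset 0 minus asset 1 ...
Q = np.array([0.02])                  # ... equals +2%
omega = np.array([[0.001]])           # uncertainty (confidence) in the view

# Posterior expected returns: precision-weighted blend of the
# equilibrium prior and the stated view
A = np.linalg.inv(tau * cov)
M = np.linalg.inv(A + P.T @ np.linalg.inv(omega) @ P)
mu_bl = M @ (A @ pi + P.T @ np.linalg.inv(omega) @ Q)
print(np.round(mu_bl, 4))
```

Here the equilibrium implies asset 0 outperforming by 3.5%, the view says 2%, and the posterior spread lands between the two; whether that blending machinery buys anything over diligently micromanaged CAPM inputs is exactly the open question above.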