Foreword
Author
As we approach the end of the first half of 2020, it is difficult to find commentaries or official statements that do not refer to ‘unprecedented times’. However, it is hardly the first time that we have seen extreme financial market volatility. So is this time really different? What implications does answering this question have for investors during the second half of 2020?
With financial markets showing extreme behaviour, it is not unexpected to see market participants’ opinions diverge and become more binary. This is also true for views on the role that quantitative analysis can play in guiding investor decision-making. On the one hand, it can be argued that quantitative models which have been calibrated over the recent past are unlikely to provide helpful insights as to the more immediate future given current extremes. On the other hand, patterns of past behaviour often repeat themselves. Without the help of a quantitative model, one that is free of the behavioural biases financial market participants are prone to exhibit in extreme environments, forecasting can quickly become merely a reflection of opinion.
The articles in this month’s Global Outlook all deal with quantitative models of some type. In each case, we acknowledge the important role that fundamental views and qualitative judgements can play in designing and interpreting the output.
In the spotlight article, Carolina Martinez and Karolina Noculak reflect on the impact of the Covid-19 crisis on US corporate earnings. Given the lag, or even absence, of bottom-up earnings estimates, top-down quantitative models are used that link earnings growth to the economic and financial context. However, given the uncertainty around the length and severity of the crisis, three alternative economic scenarios reflecting different economic assumptions are constructed.
Picking up from the theme of macro drivers, Maximilien Macmillan discusses dynamic factor allocation using the notion of risk types or groups of exposures with a strong and stable link amongst each other. Once risk types are identified, it is argued that actively managing the directional exposures to different risk types helps navigate environments of changing correlation structures across traditional asset classes and helps target upside performance in the envisioned macro outlook.
In their article ‘Capital-aware active management’, David Roseburgh, Investment Director, and Gavin Donnelly, Investment Manager, discuss how active managers can add value through fundamental analysis, while still managing the capital requirements imposed on insurance companies by global regulators.
In his article, Robert Minter, Investment Strategist, Commodities, outlines a rigorous framework for commodity analysis that is informed by behavioural best practice. Its methodical application to chaotic energy markets results in robust and consistent forecasts. He concludes that the current price is not supported by traditional demand/supply metrics while it is heavily influenced by the deep economic impact of the Covid-19 crisis.
In the final article, Simon Whiteley makes the case for combining two equity risk premia strategies – Value and Quality – into a hybrid strategy that reaps the benefits of diversification to increase risk-adjusted returns.
Both strategies have a strong fundamental investment rationale and can empirically generate superior risk-adjusted returns over the long term, typically several decades. However, it is the combination of these two strategies that is most robust in a variety of market environments, including the current Covid-19 crisis.
We believe that financial markets are adaptive systems; navigating them successfully requires a deep understanding of sophisticated quantitative techniques, as well as qualitative judgement from experienced investment professionals. This blend of ‘qualitative and quantitative’ elements speaks to the benefits of both active management and portfolio diversification.
The impact of the Covid-19 crisis on US corporate earnings
Chapter 1
Authors
The impact of the Covid-19 crisis on US corporate earnings has been significant. Given the lag or even absence of bottom-up earnings estimates, top-down quantitative models are being used to link earnings growth to the current economic and financial context. Given the uncertainty around the path for growth, alternative economic scenarios are worth considering.
We have all seen how the coronavirus outbreak and containment measures, such as retail closures and social distancing, are having extraordinary repercussions in financial markets and economies. Economic activity collapsed in the first quarter, but with many more countries entering lockdown during April and May, the second quarter will be the worst quarter for global growth in 2020.
In the past, shares in companies that have withheld guidance have underperformed the wider market
At this point, we do not know how, or when, normal life will resume in full. Until there is widespread availability of a vaccine for the virus, any lifting of social distancing measures could risk a second wave of infections and the re-introduction of lockdowns. We have seen this in China, with some cities having to return to lockdown to control new outbreaks of infection. In this highly uncertain environment, companies initially found it very difficult to quantify the impact of this crisis and for the first-quarter US earnings season many withdrew forward guidance for revenues and profits. This makes it challenging for investors to assess the outlook for corporate profits, one of the key aspects in our investment decision-making process.
The messages from the earnings season
While the first-quarter earnings season gave us some insight into the initial effect of the virus, we are mindful that much of the information is backward-looking and uncertainty about the economic outlook remains elevated. It has not helped that many companies have refused to provide forward guidance. Under normal circumstances, only about half of the companies in the S&P 500 Index tend to do so. This time, however, even fewer businesses have made projections. Companies that have decided not to give future guidance include IBM, Caterpillar and the industrial conglomerate 3M. This was not well received by investors. In the past, shares in companies that have withheld guidance have underperformed the wider market.
The lack of corporate guidance presents a challenge for analysts who cover these companies. Added to this is the uncertain economic environment. While many analysts have begun to lower their expectation for earnings, their aggregate estimates, or consensus forecasts, tend to lag major economic data releases. This makes it likely that there will be further negative changes to analysts’ expectations in the months to come.
Analysts’ current consensus projections show earnings contracting sharply this year, but recovering just as quickly in 2021. Analysts, like many others, are prone to optimism, and investors have also been showing their enthusiasm. April was a stellar month for US stocks, with the S&P 500 Index climbing almost 13%. We, on the other hand, think there are reasons to believe that profits could dip more substantially and then be slower to recover. Strong company profit growth is often a feature of stock markets’ recovery. But it takes time – roughly ten quarters, if previous recessions are a guide – to repair all of the damage to profits.
A quantitative approach to assessing the impact to corporate earnings
We have been looking at ways to help counter the uncertainty about the earnings outlook for companies. To do so, we incorporated into our tactical asset allocation deliberations the findings from quantitative models that link earnings growth to the economic and financial context.
Our models use a top-down approach to gauge earnings growth potential as defined by economic and financial variables, such as the global manufacturing cycle and fluctuations in exchange rates. More specifically, this framework allows us to identify the sensitivity and time dynamics of earnings with respect to this set of key factors.
Moreover, as the economic cycle is so relevant for earnings, constructing alternative economic scenarios is a valuable exercise. In doing so, we have recognised that the length and severity of this crisis are as yet unknown. We have therefore tested how earnings behave under a range of different economic assumptions, which led us to three plausible alternative scenarios.
The GDP growth profile is the key factor for the corporate earnings outlook
Our baseline economic scenario follows our Research Institute’s view: as seen in Chart 1, the economic rebound takes place during the second half of 2020 and is followed by a recovery in 2021. By contrast, it is conceivable that the recovery could begin in the second half of this year and be followed by a strong rebound afterwards. This is our potential upside scenario. It would likely be a result of either the aggressive fiscal or monetary stimulus measures taking place in the US and/or the availability of a successful vaccine allowing a quick and safe return to normality. Finally, a more severe recession and a recovery that takes longer than anticipated define our plausible downside scenario. Factors that could signal such an outcome would be an increase in infection levels as US cities reopen for business or long-lasting damage to consumer behaviour. Another factor could be an escalation in the already tense political relationship between the US and China.
Chart 1: Alternative roadmaps for US growth
Source: Aberdeen Standard Investments (as at May 2020).
1 Note: US GDP.
Key drivers of corporate earnings growth
Our framework suggests that the industrial cycle – both domestic and global – is the most significant driver of US corporate earnings. As shown in Chart 2, during the global financial crisis (GFC), for example, growth in rolling 12-month trailing earnings per share (EPS) closely shadowed the collapse in industrial production. Afterwards, they improved in tandem. Last year provided further evidence of this relationship, as EPS growth slowed in line with a deceleration in the global manufacturing cycle.
Given the nature of this crisis, the recession is likely to be more severe for services than for the industrial sector. This means that in the short-term, growth in industrial production will mirror the sharp economic contraction more closely.
The currency implications
The behaviour of the US dollar is also very important. Usually, a strong dollar equals weaker company profits. It gives US consumers more purchasing power to buy goods overseas and hurts international sales for US companies. The dollar was in a strong position when the crisis began. If it remains strong, as investors prefer the safety of holding dollars, it could act as another drag on corporate earnings over the coming quarters.
Our models use three different predictions for the trade-weighted dollar. Our baseline scenario assumes that the dollar will drop 6% against its peers in 2020. On the other hand, in our downside prediction, we assume that the dollar will appreciate a further 12% this year. This gain is slightly less than the one that occurred during the GFC (around 16%), reflecting the already-strong stance of the currency. Finally, our upside model supposes that the dollar could weaken by 15%. This is similar to what happened during the recovery from the GFC.
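The three dollar assumptions above can be summarised in a small lookup, together with the direction of their earnings impact (a sketch; the names and helper function are our own, not part of the underlying models):

```python
# Trade-weighted dollar assumptions for 2020, as stated in the text.
USD_SCENARIOS = {
    "baseline": -0.06,  # dollar drops 6% against its peers
    "downside": +0.12,  # appreciates 12% (vs. ~16% during the GFC)
    "upside":   -0.15,  # weakens 15%, similar to the GFC recovery
}

def usd_earnings_impact(move: float) -> str:
    """A stronger dollar hurts international sales for US companies,
    so appreciation acts as a drag on earnings; depreciation helps."""
    return "drag" if move > 0 else "tailwind"

for name, move in USD_SCENARIOS.items():
    print(f"{name}: {move:+.0%} -> {usd_earnings_impact(move)}")
```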
Chart 2: We expect the collapse in global manufacturing growth to be the main driver of the earnings contraction
Source: Federal Reserve Bank of Dallas (as of May 2020).
Labour productivity
Labour costs are another important driver of earnings. In April, the US unemployment rate reached a record high of 14.7%. Since mid-March about 38 million unemployment insurance claims have been filed, suggesting that the unemployment rate could reach up to 20%. It seems likely that labour costs will fall in response, providing one element of relief for company profits. This relief is marginal, however. It is true that higher labour availability makes labour cheaper, but high unemployment and weak consumption are likely to have more damaging implications for companies’ revenues. Given the characteristics of the labour market in the US, we assumed across all scenarios that US unit labour costs will contract at a 4% pace, the same rate as during the GFC. This suggests that, given the speed at which job losses have taken place, employment is unlikely to recover rapidly even in the most benign scenario.
Our earnings growth projections for this year and next
We believe that the most likely outcome is that earnings will contract by about 35% this year, and recover at a similar pace in 2021. This reaffirms the point we made earlier: yes, analysts’ consensus earnings estimates have moved lower, but they still need to catch up with economic reality.
This result also indicates that, by the end of next year, we expect the level of earnings to remain below its pre-crisis level.
If a prolonged and more severe recession materialised, we could see an unprecedented collapse in earnings of 60% this year; the recovery in 2021 would be stunted at just 25%.
Meanwhile, our more optimistic model only really distinguishes itself in 2021. It posits that earnings will increase by 50% next year, having fallen around 30% in 2020.
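Compounding the stated annual growth figures shows where each scenario leaves the level of earnings by the end of 2021, relative to a pre-crisis level of 1.0 (a back-of-the-envelope sketch using only the numbers quoted above):

```python
# Annual EPS growth assumptions per scenario: (2020, 2021).
EPS_GROWTH = {
    "baseline": (-0.35, +0.35),
    "downside": (-0.60, +0.25),
    "upside":   (-0.30, +0.50),
}

def eps_level_end_2021(g2020: float, g2021: float, start: float = 1.0) -> float:
    """End-2021 EPS level relative to the pre-crisis level (= 1.0)."""
    return start * (1 + g2020) * (1 + g2021)

for name, (g20, g21) in EPS_GROWTH.items():
    print(f"{name}: {eps_level_end_2021(g20, g21):.3f}")
```

Only the upside path regains the pre-crisis level; the baseline ends roughly 12% below it, consistent with the point that the level of earnings remains below its pre-crisis mark by end-2021.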
As always, making forecasts is a complex business, but it is especially so during a crisis like this one. Our projections are all built on quantitative findings related to sensitivity analysis and time dynamics. These findings are complemented by our qualitative judgement of the set of circumstances that are plausible and the likely indicators of such circumstances.
As mutually exclusive scenarios, only one can come to fruition. Nevertheless, determining an array of plausible outcomes has proved to be a very useful exercise in this environment of uncertainty.
Chart 3: The path for EPS growth in the US
Source: Aberdeen Standard Investments (as of May 2020).
Dynamic Factor Allocation
Chapter 2
Author
Actively managing the exposures to different risk types can help investors navigate varying correlations between traditional asset classes to provide a more diversified portfolio.
Asset allocation is often touted as being the primary driver of investment performance. Yet, few investors manage their risk profiles decisively enough to break the link between fund returns and those of the equity market.
Generating uncorrelated returns requires active management of the balance across risk types
Why is this? Despite actively managing risk, why are outcomes so often tied to the performance of risk assets?
The answer is twofold. Firstly, and paradoxically, trying to achieve a high degree of diversification often leads to portfolios that resemble the broader market, as the accumulation of strategies tends to cancel out idiosyncratic risk. What is then left is mostly equity-type exposure.
Secondly, in so far as active asset allocation is used, it is often deployed between asset classes that differ in name, but whose performance remains highly correlated under different macro environments.
Generating uncorrelated returns requires active management of the balance across risk types. But investors often misjudge how many risk types there are, identifying either too many or too few.
Some investors think that every asset class with a different name is a disconnected risk type. In fact, they are sometimes just different degrees of exposure to the same risk type (or different combinations of the same underlying risk type). Investment grade and high-yield credit are good examples of this, displaying different ratios of duration and credit risk.
On the other hand, some investors think there are just two risk types: duration risk and risk assets.
We think the truth lies in between. There are a small number of truly disconnected risk types that, if properly identified and managed, give investors the best chance of crafting risk profiles that can benefit from any macro environment.
While it may be the case that holding a multitude of strategies helps dispel idiosyncratic risk, it is only by actively managing exposure to the different risk types that systemic risk can be mitigated. And it can even be used to one’s advantage to achieve uncorrelated returns.
The dynamic factor allocation approach – the active management of a fund’s risk profile – helps to build risk profiles that diverge strongly from the market and are tailored to the expected macro environment.
Here is how.
Risk Types
Amid the multitude of asset classes, we find there are groups that exhibit a strong and stable link with each other. Not only is their correlation high, but it is stable through time. (For the purposes of this exercise, we only include risk types that are sufficiently liquid to be traded in and out dynamically).
- Developed market corporate risk (equity returns and credit excess returns)
- Developed market interest-rate risk (developed market government bonds)
- Dollar risk (US dollar index, leaving emerging market foreign exchange to the emerging market risk type)
- Gold
- Emerging market risk (hard currency spread, emerging market equities and emerging markets foreign exchange)
Importantly, while asset classes can be grouped together, a single asset class must sometimes be split before it can be reorganised. It is unnecessarily complex to speak of emerging market hard currency debt, for instance. Much better to split out the US Treasury component from the emerging market default risk component, or the spread, so as to manage each component separately. The same applies, of course, to corporate credit.
Once we have defined the risk types, we can measure the empirical sensitivity that a fund has to each. That is, we can map out the empirical risk profile.
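One standard way to measure such an empirical risk profile – not necessarily the approach used here, but a common one – is to regress the fund’s returns on the returns of each risk type and read the sensitivities off the least-squares betas. The function and synthetic data below are purely illustrative:

```python
import numpy as np

def risk_profile(fund_returns: np.ndarray, factor_returns: np.ndarray) -> np.ndarray:
    """Estimate the fund's sensitivity (beta) to each risk type via
    ordinary least squares. factor_returns has one column per risk
    type (e.g. corporate, rates, dollar, gold, emerging markets)."""
    X = np.column_stack([np.ones(len(fund_returns)), factor_returns])
    betas, *_ = np.linalg.lstsq(X, fund_returns, rcond=None)
    return betas[1:]  # drop the intercept

# Synthetic check: a fund that is 0.6x corporate risk and -0.3x rates.
rng = np.random.default_rng(0)
factors = rng.normal(size=(500, 2))
fund = 0.6 * factors[:, 0] - 0.3 * factors[:, 1] + 0.01 * rng.normal(size=500)
print(np.round(risk_profile(fund, factors), 2))  # approximately [0.6, -0.3]
```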
Risk profiles and the macro environment
Groups of assets behave in the same way, in a reliable and stable manner, because they are driven by the same macro factors. To illustrate, when risk aversion leads international investors to withdraw from emerging markets, foreign exchange and hard currency spreads both suffer.
The outflows lead to currency selling, which weakens currencies and endangers the sustainability of dollar debt.
In general, the macro environment can be defined by the set of macro drivers that characterise it. Whether it is changes in cyclical growth, monetary policy, oil prices, geopolitical shocks or risk sentiment, we can analyse historical environments and observe that the correlation structure across risk types varies radically.
Developed market interest-rate risk, for instance, can be positively correlated to developed market corporate risk under one environment, but it will be negatively correlated under another. The variation in correlation offers the opportunity to build better risk profiles.
Hedging risk is no panacea if it comes at the cost of performance. A stable negative correlation is of little use, as it will cost as much on the upside as it will protect on the downside. Knowing the different correlation structures that will prevail under different macro regimes means that we can hedge portfolios for alternative outcomes, while limiting the reduction in expected returns. As such, we can add extra dimensions to portfolio construction.
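A toy calculation illustrates how the same pair of risk types can flip correlation across regimes (the regime labels and synthetic data below are illustrative only, not estimates of any actual period):

```python
import numpy as np

def regime_correlation(rates: np.ndarray, corporate: np.ndarray) -> float:
    """Correlation between interest-rate risk and corporate risk
    returns within a single macro regime."""
    return float(np.corrcoef(rates, corporate)[0, 1])

rng = np.random.default_rng(1)
# "Risk-off" regime: government bonds rally when risk assets fall.
shock = rng.normal(size=250)
risk_off = (-0.8 * shock + 0.2 * rng.normal(size=250),   # rates
            shock + 0.2 * rng.normal(size=250))          # corporate
# "Inflation-scare" regime: bonds and risk assets sell off together.
shock = rng.normal(size=250)
inflation = (0.8 * shock + 0.2 * rng.normal(size=250),
             shock + 0.2 * rng.normal(size=250))

print(regime_correlation(*risk_off) < 0, regime_correlation(*inflation) > 0)
```

The same two risk types hedge each other in one regime and compound each other in the other, which is precisely why static hedges are of limited use.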
Relative performance of risk types under several macro environments
Chart 1: Taper Tantrum (2013)
Source: Aberdeen Standard Investments, Bloomberg (as of May 2020).
Chart 2: China Fear (Mid 2015)
Source: Aberdeen Standard Investments, Bloomberg (as of May 2020).
Chart 3: Dove Powell (H1 2019)
Source: Aberdeen Standard Investments, Bloomberg (as of May 2020).
Chart 4: Hawk Powell (Q1 2018)
Source: Aberdeen Standard Investments, Bloomberg (as of May 2020).
Chart 5: Cyclical Recovery (2017)
Source: Aberdeen Standard Investments, Bloomberg (as of May 2020).
Chart 6: European Crisis (H2 2011)
Source: Aberdeen Standard Investments, Bloomberg (as of May 2020).
Capital-aware active management
Chapter 3
Authors
Solvency regulations are becoming increasingly sophisticated. Traditional investment-management processes struggle to deliver optimal solutions for insurance portfolios. This has produced capital-inefficient portfolios.
Insurers have therefore turned to quantitative models that seek to create optimal insurance asset portfolios in terms of solvency capital requirements. However, these portfolios are not necessarily attractive from an investment perspective. They may be capital-efficient, but they may not deliver the best returns for a specific level of risk.
Investors need a more holistic approach – one that combines active asset management and solvency capital modelling. In this way, investors can benefit from an investment process that is informed, but not determined, by capital rules. We call this approach ‘capital-aware active management’.
The regulatory environment
Regulatory solvency regimes determine the amount of capital that insurance companies must hold to support the risks within their asset portfolios. In recent years, these regulations have become increasingly sophisticated, detailed and risk-sensitive. Here is a summary of the evolution of solvency regimes:
- 2006: The Swiss Solvency Test
- 2013: Australia’s LAGIC solvency capital approach
- 2016: Solvency II in the European Union (EU)
- 2016: Bermudan Solvency Capital Requirement; enhanced in 2019
Today, a number of Asian countries are developing risk-sensitive, probabilistic approaches to solvency risk capital. A similar process is underway in Canada.
The International Association of Insurance Supervisors is consulting on an International Capital Standard in a bid to establish common ground. This would aim to deliver a detailed, risk-sensitive capital assessment method for all insurers.
All of these assessments – as prescribed by regulators – are formulaic, using calculations typically based on historical data. Insurance investors can incorporate these assessments into algorithmic optimisation methods. These methods calculate the efficient frontier from a risk-capital perspective – the highest achievable expected return for each level of a portfolio’s incurred capital charge. This is particularly effective for fixed-income portfolio construction, where measures of risk, expected return and capital requirements of individual securities are straightforward to calculate.
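As a sketch of what such an optimisation does, the continuous-knapsack routine below traces one point on that frontier: the highest expected return achievable within a given capital-charge budget, assuming divisible holdings. The bond figures are hypothetical, and this is an illustration rather than the regulatory calculation itself:

```python
def capital_frontier_point(bonds, capital_budget: float) -> float:
    """One point on the risk-capital efficient frontier: the highest
    expected return achievable within a capital-charge budget, with
    divisible holdings and a per-bond weight cap of 1.0.
    bonds: list of (expected_return, capital_charge) per unit weight."""
    best_return, budget = 0.0, capital_budget
    # Allocate to the bonds with the best return on capital first.
    for ret, charge in sorted(bonds, key=lambda b: b[0] / b[1], reverse=True):
        weight = min(1.0, budget / charge)
        best_return += weight * ret
        budget -= weight * charge
        if budget <= 0:
            break
    return best_return

# Hypothetical bonds: (expected return, capital charge) per unit held.
bonds = [(0.04, 0.10), (0.06, 0.20), (0.02, 0.03)]
print(capital_frontier_point(bonds, capital_budget=0.15))
```

Sweeping the budget from zero upwards traces the whole frontier; in practice the formulaic capital charges would come from the applicable solvency regime rather than the made-up figures used here.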
Quantitative models: positives and pitfalls
These quantitative approaches appeal to both asset owners and asset managers. For insurance companies, they offer the ability to optimise portfolios to meet their specific needs. For asset managers, these models allow them to demonstrate the value of their proposed investment solution.
However, this regulatory capital-optimisation approach may not deliver a genuinely optimal investment solution for insurers. Its focus on maximising or minimising capital-driven metrics places a heavy reliance on regulatory capital calculations.
While the capital formula may be more sophisticated than before, it cannot provide a perfect, or even a reasonable, representation of all forms of risk. It seeks out solutions that work, for now, according to the capital formula. However, these solutions are not necessarily attractive from an investment perspective.
In addition, this quantitative approach does not provide scope for the asset manager to add value using fundamental analysis. Active managers construct portfolios that reflect their active investment views, as agreed with clients in investment mandates. They use forward-looking judgements alongside analysis of historical data.
Yet how can they add value when the investment process is an algorithmic function of the regulatory capital formula?
In this paper, we seek to answer this question. We illustrate how capital-sensitive insurance companies can apply active asset management, even those that operate under a regulatory solvency capital regime that is sophisticated, detailed and risk-sensitive.
Balancing investors’ objectives
Investors who manage insurance assets in a regulated environment face two independent objectives. First, they must seek the optimal balance between risk and return for their investments. Second, they must seek the optimal balance between expected return and the portfolio’s incurred capital charge.
As insurance regulations have become more sophisticated, detailed and risk-sensitive, insurers have focused on the second of these objectives. These regulations are quantitative in nature and involve calculations based on historical data. This has led insurers to turn to algorithmic optimisation methods to provide the ‘optimum’ portfolio strategy.
These quant models are backward-looking, however. In particular, they rely on rating agencies’ assessments of risk for corporate bonds. But these assessments offer a ‘stale’ measure of risk. Credit-rating agencies do not update their ratings in real time to capture ever-changing fundamentals. Therefore, these purely quantitative approaches do not provide scope for an active manager to add value.
Applying active management
By contrast, an active approach to fixed income can use fundamental analysis to anticipate the outlook for the credit rating of the bond issuer. The investor can incorporate this forward-looking judgement into his or her investment decision-making process.
However, most active investment processes do not consider the regulatory capital treatment of bond holdings. This means that a purely fundamental approach can result in an inefficient capital allocation for the insurance firm.
An active investment-management process identifies the assets that the investor believes are fundamentally attractive, overweighting these positions. These preferred positions can provide the starting point for constructing a capital-aware active portfolio. There is no strong connection between the active process and the regulatory capital efficiency of the bonds. These two perspectives are independent of one another.
A third way: capital-aware active management
A capital-aware active management approach should incorporate both the active management strategy and the regulatory capital efficiency of the bonds. What could this process look like?
The insurance investor could, for example, apply the algorithmic return-on-capital optimisation process to the bonds selected by the active approach. For each asset, the investor replaces the expected returns derived from Solvency II fundamental spread assumptions with the fund manager’s expected return. This combines the active views with the capital optimisation into a single capital-aware investment strategy.
The robustness of this approach would depend on the risk sophistication of the capital formula. The investor can constrain the optimisation to mitigate these concerns. But this does not address the basic issue.
The fact remains that an optimisation algorithm can only solve for the relatively simple capital formula. It does not incorporate other forms of risk. As a result, it may still deliver a portfolio that is not attractive or sensible from a more holistic risk perspective. The process is still being driven to ‘game’ the regulatory formula as much as possible. Is this the ‘optimal’ approach?
Refining the capital-aware active management process
An alternative starting point is to screen out the most capital-inefficient assets first. The active manager can then use this narrower universe to construct the portfolio in their usual way. This is a less demanding use of the regulatory formula.
The portfolio construction process does not depend on the formula providing an accurate risk measure for all assets. However, it does allow for the fact that regulatory capital is costly. Some assets, including ones that are attractive from an investment perspective, may be capital-inefficient. This approach screens out assets that are best avoided by a capital-sensitive insurance asset-owner.
We can illustrate this revised capital-aware active management strategy for a portfolio measured against the BAML Sterling Corp. Index used above. This index included 1,004 different bonds from 391 issuers, as at 30 September 2019. Of those 391 issuers, the active fixed income process identified 195 as active ‘buys’, with a total of 530 bonds outstanding between them. Screening out 25% of the bonds that were least capital-efficient left 397 bonds.
As noted above, an equal-weighted allocation to all bonds in the index resulted in a matching-adjustment benefit of 7.7% and an SCR of 11.2%. An equal-weighted portfolio of these 397 capital-efficient bonds increased the matching-adjustment benefit to 8.8%, with a reduced SCR of 10.1%.
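The effect of the screen on return per unit of risk capital can be checked directly from the figures quoted above (a back-of-the-envelope ratio, not the underlying Solvency II computation):

```python
def return_on_capital(matching_adjustment: float, scr: float) -> float:
    """Matching-adjustment benefit per unit of solvency capital requirement."""
    return matching_adjustment / scr

# Figures from the text: full index vs. the screened 397-bond portfolio.
full_index = return_on_capital(0.077, 0.112)  # 7.7% MA benefit, 11.2% SCR
screened   = return_on_capital(0.088, 0.101)  # 8.8% MA benefit, 10.1% SCR

# Screening out the least capital-efficient 25% of the 530 'buy' bonds:
survivors = int(530 * 0.75)

print(survivors, round(screened / full_index, 2))
```

On these numbers the screen lifts the return-on-capital ratio by roughly a quarter, which is the material improvement discussed in the text.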
So, capital-aware active management has materially improved the ratio of return to risk capital. Of course, a purely quantitative approach can also offer this level of improvement or more, at least in the short term. But this would not incorporate the skill of the active investor in assessing the fundamental risk and return characteristics of investments. It would not include any forward-looking assessment of bond ratings.
In practice, an insurer using a quant approach must hope that the regulatory capital process is good at identifying cheap assets. Yet we know this is not the case.
Incorporating judgement
In this article, we have made the case for incorporating an active approach based on a judgement of the fundamentals of investment. We believe that investors can add value by incorporating forward-looking assessments of business models, industry dynamics and a company’s management.
However, there are quantitative approaches to active management too, not just to risk-capital management. Could insurance investors use a purely quantitative approach that combines active investment views with risk-capital management, avoiding the need for judgement?
In practice, even quantitative approaches involve judgement. Investors must judge which quantitative models to employ. In addition, asset returns are not normally distributed: extreme negative outcomes occur more often than a normal distribution would imply. This is particularly true for the credit assets that dominate insurance portfolios.
These statistical properties, including higher moments such as skew and kurtosis, are hard to capture in an optimisation process. When they are included, in our experience they can result in portfolios that are not intuitively sensible.
Investors also require judgement when placing constraints on the optimisation process, such as setting limits on individual issuers. Today, integrating ESG analysis into the investment process is all but mandatory. Yet there are no widely agreed sets of measures for ESG factors. Here too, investors must use their judgement.
We developed the example above in the specific context of Solvency II and credit risk. But the ideas, logic and principles behind it are more general. Insurance investors can adopt this approach to any case where our two basic premises apply – an active investment process and a sophisticated, detailed, risk-sensitive regulatory solvency capital regime.
Could the underlying principles set out here be applied outside the world of regulatory constraints? Investors could apply the same thinking to build a carbon-aware active management process. All they would require is some measure of carbon footprint. The principles that we have set out here allow investors to apply the same thinking to their underlying investments to address our changing climate in a disciplined way that avoids compromising expected returns.
Chart 1: UK Corporate Bonds: Screened for Active Bets and Capital Efficiency
Capitalised expected return versus Solvency II Standard Formula Solvency Capital Requirement for BAML UK Corporate Index (5)
Source: Bloomberg, Aberdeen Standard Investments, 30 September 2019.
Conclusion
To be clear, there is a place for a robust quantitative framework in the investment process. These frameworks allow both portfolio managers and insurers to understand the art of the possible: the highest achievable expected return for each level of a portfolio’s incurred capital charge. They provide a valid solution where taking an active view is not possible or relevant. They can highlight the scope to enhance the risk-capital management of an existing, well-constructed asset portfolio.
However, these quantitative tools, and the metrics they embed, are not sufficient to deliver a genuinely optimal portfolio for an insurance investor. To provide the greatest value, they need to be combined with fundamental analysis based on expert judgement – a capital-aware active approach.
Energy market framework is not on autopilot
Chapter 4
Author
Every investment framework requires subjective adjustments. The largest energy market disruption in history was no exception. Prices below $45 are needed to deplete swollen inventory.
Intelligent design
Our design of an appropriate framework for analysing commodity prices makes extensive use of behavioural analysis.
We have utilised best-practice guidance on forecasting from the Good Judgment Project, co-created by Philip Tetlock and Barbara Mellers, and the subsequent book by Tetlock and Dan Gardner, Superforecasting: The Art and Science of Prediction. Their approach is informed by Daniel Kahneman's Nobel Prize-winning work on behavioural economics and his identification of cognitive biases that impede effective decision-making.
Our framework incorporates many of the practices suggested by this preeminent forecasting analysis.
Through its use over the past seven years we have aimed to reduce cognitive biases, increase analytical rigour and allow for more persistent results.
The framework scores six major factors that influence a commodity's price. These factors are weighted roughly equally between more numeric drivers (valuation, supply and demand) and more subjective drivers (price signals, externalities and risks). The process produces a numeric score for each factor between minus three and plus three, with positive numbers indicating bullishness and negative numbers bearishness over a three-month horizon.
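The scoring scheme above can be sketched in a few lines of code. The factor names and the minus-three-to-plus-three scale come from the text; the simple averaging into a composite, and the example scores, are assumptions for illustration only.

```python
# Illustrative sketch of the six-factor scoring scheme.
FACTORS = ["valuation", "demand", "supply",            # numeric drivers
           "price_signals", "externalities", "risks"]  # subjective drivers

def composite_score(scores: dict) -> float:
    """Average the six factor scores, each constrained to [-3, +3].

    A positive composite indicates bullishness, a negative one
    bearishness, over a three-month horizon.
    """
    for name in FACTORS:
        s = scores[name]
        if not -3 <= s <= 3:
            raise ValueError(f"{name} score {s} outside [-3, +3]")
    return sum(scores[name] for name in FACTORS) / len(FACTORS)

# Hypothetical example scores (not actual framework output).
example = {"valuation": 2, "demand": -1, "supply": -2,
           "price_signals": 1, "externalities": 0, "risks": -1}
print(composite_score(example))  # -1/6: a mildly bearish composite
```

Equal weighting keeps the subjective drivers from dominating the numeric ones; the sense-checking described below is what would catch a misleading individual score before it reaches the composite.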
Scoring and sense checking
The valuation factor represents the current price's relationship to the current macroeconomic landscape, using the trade-weighted dollar, the estimated global output gap and consumer price indices as guideposts. The demand factor scores the strength of the demand outlook using economic surprise indicators, purchasing manager indices and other coincident and forward-looking indicators. The supply factor scores the strength of supply using production and inventory data and trends. These three quantitative factors make up half the weight of the overall score. The three remaining factors are less fundamental. The price signals score looks at market internals because of the relatively poor quality of data available in the commodities market. Industry data is frequently restated and is rarely timely. In fact, a study of IEA supply/demand accuracy by UBS found that the average error in the agency's data was 1 million barrels per day (bpd). The IEA data, once revised, correlates closely with the oil price, but is of little use prior to final revision (Chart 1).
The externalities factor scores the extent to which the market will be influenced by external factors. These include supply disruptions, international trade, sanctions, conflicts and cartel actions. Through the isolation of these issues we achieve more transparency on temporary or political actions. The final factor looks at risks to the assumptions used in our scoring which can include moves in the US dollar and probabilities of a large supply or demand shock.
Chart 1: Hindsight is 20/20
Source: IEA, Bloomberg, Aberdeen Standard Investments, July 2018.
Scoring objectively is key to the process and is improved by first ignoring prior data. After the clean-slate scoring is complete, each of the six factor areas is sense-checked: given the totality of the data, does the score accurately represent what our indicators are telling us, or do we need to revisit it? This qualitative judgement overlay is crucial, as the scores in isolation could be misleading. An example of this veto action occurred in 2015, when the most widely followed oil market indicators signalled a bullish turn. The 'discount to OSP (official selling price)' represents the pricing power that the largest supplier, Saudi Aramco, has over the largest consumer, Asia. This indicator is followed within the oil market with the same fanaticism as US Federal Reserve meetings are by economists and bond markets. Throughout 2015, the 'discount to OSP' was improving, which would normally signal healthy demand for oil. After additional research, it was determined that the price 'improvement' was caused by the filling of newly completed strategic oil reserve tanks in China, hardly a source of ongoing demand. We overruled the bullish indicator and adjusted the score to negative.
This qualitative element comes from a group of experienced researchers and fund managers. Our team is diverse in age, gender, nationality and expertise, and we add to that diversity by frequently interacting with internal clients. We enjoy healthy debate with our internal fixed income and equity experts who cover related assets in the stock and bond worlds, incorporating their views where appropriate. The strict division of duties and honest, constructive discussions have built the trust necessary for robust decision-making.
Of particular interest is how the qualitative analysis is useful in a regime change. A fitting example occurred in 2014, when oil dropped from $115 in mid-June to $97 by the end of September. This represented the bottom of a three-year trading range and, assuming an oscillating price environment, argued for going long. However, the objectivity of the process required deeper research to explore a wider range of outcomes, given the poor quality of most commodity data. Our research showed a regime change underway: US shale production was exploding and inventories were rising as a consequence, yet the IEA had not yet picked up the velocity of the change. Thus a classic value trap was avoided (Chart 2).
Chart 2: When a regime ends
Source: Bloomberg, Aberdeen Standard Investments, July 2018.
So what does our framework show now?
A framework that is effective brings order and structure to an analysis, while one that is too rigid lacks the flexibility to handle extraordinary situations.
The economic effects of the coronavirus have been unprecedented. We have been tracking the virus's progression since mid-January. It has been of special interest to our commodity group, both because it originated in China, the largest commodity-consuming country, and because the early mitigation measures centred on locking down economic activity and mobility, which directly curtails energy consumption. When the February estimates of Chinese crude oil demand destruction came to light, it was clear we could not wait for negative values to flow through all the indicators in our framework. We needed another tool to supplement the framework, one combining art and science to gauge the magnitude of the supply-demand dynamics. In essence, we asked: "How big a hole is the oil market digging, and how long will it take to get out?" We started with the Chinese demand-destruction estimates and rolled those through to other regions at roughly 35%, 15% and 7% demand destruction for months one to three respectively.
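The roll-through above reduces to simple arithmetic, sketched below. Only the 35%/15%/7% monthly destruction path comes from the text; the regional baseline demand figures and the 30-day month are hypothetical placeholders, so the total is illustrative rather than an actual estimate.

```python
# Back-of-the-envelope sketch of the demand-destruction roll-through.
DESTRUCTION_PATH = [0.35, 0.15, 0.07]  # share of demand lost, months 1-3

def barrels_lost(baseline_mbpd: float, days_per_month: int = 30) -> float:
    """Cumulative demand destruction, in millions of barrels, over three
    months for a region consuming `baseline_mbpd` million barrels/day."""
    return sum(baseline_mbpd * pct * days_per_month
               for pct in DESTRUCTION_PATH)

# Hypothetical regional baselines in million bpd (illustrative only).
regions = {"China": 14.0, "Europe": 15.0, "US": 20.0}
total = sum(barrels_lost(demand) for demand in regions.values())
print(f"roughly {total:.0f} million barrels lost over three months")
```

A real application would stagger each region's three-month path by its lockdown start date; the sketch simply sums the full paths.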
The peak year-to-date inventory build should have occurred in May, at roughly 1.5 billion barrels. Supply cuts of 10 million bpd for the rest of the year could relieve stress on inventories and whittle the build down to 560 million barrels, roughly a third of May's peak.
The two largest caveats are that a second wave of the virus does not become widespread in any region, and that the oil price does not rise to a level that removes voluntary supply cuts. In our estimation, the price that reverses supply cuts is $40-45 a barrel. Any sustained period of prices at that level before 2021 risks building inventories again.
Robust conclusion
Following a 20% cut in global oil demand and a 12% cut in global supply, inventories of crude and crude products remain very high. It would not take much of a change in assumptions to move prices higher or lower in short order. Brent prices of $32-42 and WTI prices of $30-40 keep production lower, repairing the damage done by Covid-19.
Combining value and quality
Chapter 5
Author
The combination of two risk premia, value and quality, into a multi-factor strategy reaps the benefits of diversification to increase risk-adjusted returns.
Quants often combine individual risk premia to build multi-factor strategies
Quantitative investors use risk premia, rather than individual stocks, as the building blocks for their strategies. Risk premia are factors that research has shown generate superior risk-adjusted returns over the long term, typically several decades.
Examples of risk premia are value, quality, momentum, small size and low volatility, each with its own performance drivers and risk profile. Furthermore, each risk premium has a strong investment rationale to guard against data mining, i.e. the discovery of spurious patterns in data that have no ability to predict future returns.
Quants often combine individual risk premia to build multi-factor strategies with the aim of increasing returns while simultaneously reducing the frequency and magnitude of drawdowns.
We focus on two risk premia, value and quality. We calculate their performance individually and in combination over the last 30 years, including the current Covid-19 economic crisis.
The definition of value used here is free cash flow (FCF) yield where:
FCF Yield = (Cash Flow from Operations – CAPEX – Dividends)/Market Capitalisation
Free cash flow is the cash a company has left over after maintaining and growing its business and paying out dividends. FCF yield is a value factor frequently used by quant investors since it tends to experience smaller drawdowns than other value factors, such as book yield.
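The FCF yield formula above translates directly into code. This is a minimal sketch; the example figures are made up purely for illustration.

```python
# A minimal sketch of the FCF-yield value factor defined above.
def fcf_yield(cash_from_ops: float, capex: float,
              dividends: float, market_cap: float) -> float:
    """FCF Yield = (Cash Flow from Operations - CAPEX - Dividends)
                   / Market Capitalisation."""
    free_cash_flow = cash_from_ops - capex - dividends
    return free_cash_flow / market_cap

# Hypothetical company, figures in millions.
print(fcf_yield(cash_from_ops=500, capex=200, dividends=100,
                market_cap=4000))  # 0.05, i.e. a 5% free-cash-flow yield
```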
Value stocks can be thought of as ‘cheap,’ due to, for example, questions over the sustainability of earnings, or issues with the company’s business model. There is a wealth of empirical evidence showing that, over the long term, value stocks tend to outperform the broader market. Possible explanations for this outperformance include investors overreacting to the negative newsflow associated with value stocks causing them to subsequently revert to their long-term mean. Or, alternatively, it may because of investors overpaying for ‘non-value’ growth stocks.
In contrast to value, quality is a difficult factor for quants to define. By quality do we mean high levels of profitability, low leverage or stability of earnings? Here, we choose to define quality in terms of profitability:
Profitability = Gross Profit/Total Assets
where Gross Profit = Revenues – Cost of Goods Sold
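The profitability factor is equally mechanical to compute. A one-function sketch follows; the example figures are hypothetical.

```python
# A minimal sketch of the profitability (quality) factor defined above.
def profitability(revenues: float, cogs: float, total_assets: float) -> float:
    """Gross Profit / Total Assets, where Gross Profit = Revenues - COGS."""
    gross_profit = revenues - cogs
    return gross_profit / total_assets

# Hypothetical company, figures in millions.
print(profitability(revenues=1000, cogs=600, total_assets=2000))  # 0.2
```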
The investment rationale for this factor was motivated by the seminal academic paper by Robert Novy-Marx2 who argued that higher profitability is predictive of higher future returns.
As quants we take great pride in being systematic, rules-based investors. This means we backtest our investment ideas to understand how they behaved through past market cycles.
A backtest is constructed in the following straightforward way.
First, we specify an investment universe, defined here as the largest 500 stocks by market capitalisation in the FTSE World ex Financials. Financials are excluded because both of the above factors have little meaning for financial stocks; e.g., what is the significance of capital expenditure or cost of goods sold for, say, a bank?
Then, every month, we calculate the value of each factor for every stock in the universe. Next, we split the universe into sectors and take long positions in the top-scoring third of stocks in each sector. We fund these by shorting the bottom third in each sector. Running the backtest within each sector effectively neutralises any sector exposures, which quants do not usually consider to be a rewarded source of risk.
We also backtest a combination of the two factors by converting the factor values for each stock into a percentile relative to the investment universe. The combination score is then simply the average of the two percentiles. By combining factors in this way, we are taking long positions in those stocks within each sector that score ‘fairly well’ on both factors. At the same time, we are shorting those stocks that tend to score ‘fairly poorly’ according to both factors.
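The construction described in the last two paragraphs can be sketched for a single month. The percentile averaging and sector-neutral tercile split come from the text; the toy data frame, sector names and figures are fabricated, and a real backtest would repeat this monthly over the full 500-stock universe.

```python
# A stylised, single-month sketch of the sector-neutral long/short
# construction described above.
import pandas as pd

def combined_positions(df: pd.DataFrame) -> pd.Series:
    """Return +1 (long) / -1 (short) / 0 positions from two factors.

    Each factor is converted to a percentile rank across the universe and
    the two percentiles are averaged. Within each sector, the top third of
    combined scores goes long, funded by shorting the bottom third.
    """
    combo = (df["value"].rank(pct=True) + df["quality"].rank(pct=True)) / 2

    def terciles(s: pd.Series) -> pd.Series:
        lo, hi = s.quantile(1 / 3), s.quantile(2 / 3)
        return s.apply(lambda x: 1 if x > hi else (-1 if x < lo else 0))

    # Ranking into terciles sector by sector neutralises sector exposure.
    return combo.groupby(df["sector"]).transform(terciles)

# Fabricated six-stock universe across two sectors (illustrative only).
toy = pd.DataFrame({
    "sector": ["Tech", "Tech", "Tech", "Energy", "Energy", "Energy"],
    "value": [0.10, 0.05, 0.02, 0.08, 0.04, 0.01],
    "quality": [0.30, 0.20, 0.10, 0.25, 0.15, 0.05],
})
print(combined_positions(toy).tolist())  # [1, 0, -1, 1, 0, -1]
```

Because the longs and shorts are matched within each sector, the net position per sector is zero, which is precisely the sector-neutrality the text describes.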
The results of the backtests over the last 30 years are shown in Chart 1.
Chart 1: Combining Value & Quality - period: 31 December 1989 to 30 April 2020
Source: Aberdeen Standard Investments, FactSet (as of May 2020).
Back-tested performance is not indicative of future results. The above analysis is provided strictly for illustrative purposes, to demonstrate the investment principles mentioned. The methodology informing the back-test is included in the text. Additional assumptions, as well as a more detailed description of the model/methodology, will be provided at your request.
The back-tested performance above is gross of management fees and expenses. It should be noted that annual management fees, transaction fees and/or other expenses typically associated with an account (i.e., custodial fees, administrative charges, etc.) were not included in this illustration. Had these fees and expenses been incorporated the returns shown would have been lower. While the construction of the models included in this document is aimed at replicating how an allocation would be managed in a live environment, there may be unexpected material differences. Back-tested results are calculated by the retroactive application of a model constructed on the basis of historical data and based on assumptions integral to the model, which are subject to losses. General assumptions include: liquidity would have permitted all trading; the impact of certain market factors, such as fast market conditions, are not applicable; economic and political global events would not have changed the investment decisions, the model does not change materially over the time period presented, and the ability or inability to withstand losses did not adversely affect actual results. Changes in these assumptions may have a material impact on the back-tested returns presented.
These assumptions have been made for modeling purposes and are unlikely to be realized. No representations or warranties are made as to the reasonableness of the assumptions. It should also be noted that the back-tested performance is not necessarily representative of any Aberdeen portfolio and does not represent returns that any investor actually attained. No client of the adviser has achieved the results provided herein, nor should it be inferred that any portfolio will perform in a manner as outlined in this document. When interpreting the results, the investor should always take into consideration the limitations of the model applied. One of the limitations of hypothetical performance results is that they are prepared with the benefit of hindsight. In addition, hypothetical trading does not involve financial risk, and no hypothetical analysis can completely account for the impact of financial risk in actual trading. Further, back-testing allows for the security selection methodology to be adjusted until past returns are maximized. The hypothetical portfolio is not a strategy that is currently available for investment.
The results clearly show the low correlation between the value and quality factors. This makes intuitive sense since value stocks are by definition ‘cheap’ whereas high-scoring quality stocks tend to be expensive because investors are prepared to pay a premium for high levels of profitability. Furthermore, combining the two factors significantly increases the return over the backtest period. Note also that the strategy has outperformed in the current Covid-19 crisis.
We can summarise a backtest performance using an information ratio (IR), which is a measure of risk-adjusted return.3 The combination strategy has an attractive IR of 0.90.
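The IR calculation defined in footnote 3 is a short computation. The sketch below assumes a monthly return series and standard annualisation conventions (mean times 12, volatility times the square root of 12); the return series itself is fabricated for illustration.

```python
# A minimal sketch of the information ratio from footnote 3: mean
# annual performance divided by the standard deviation of returns.
import statistics

def information_ratio(monthly_returns) -> float:
    """Annualised mean return divided by annualised volatility."""
    mean_annual = statistics.mean(monthly_returns) * 12
    vol_annual = statistics.stdev(monthly_returns) * 12 ** 0.5
    return mean_annual / vol_annual

# Fabricated monthly long/short returns (illustrative only).
returns = [0.010, -0.005, 0.012, 0.003, -0.002, 0.008,
           0.004, -0.001, 0.009, 0.002, 0.006, -0.003]
print(round(information_ratio(returns), 2))
```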
Conclusion
In this short article, we took two lowly correlated factors that individually outperformed over the long term. We then combined them to build a hybrid strategy that reaped the benefits of diversification to generate higher risk-adjusted returns. This new multi-factor strategy outperformed over the last 30 years in a variety of market environments including the current Covid-19 crisis.
Quants place a lot of emphasis on backtesting, not because we think history repeats itself but because, as Mark Twain reputedly remarked, it does rhyme.
In addition to value and quality, there exist other pairs of lowly correlated risk premia for quants to exploit, such as value and momentum… but that’s a topic for another article.
1 Company selected for illustrative purposes only and not as an investment recommendation or indication of future performance
2 Novy-Marx, Robert. The other side of value: Good growth and the gross profitability premium. No. w15940. National Bureau of Economic Research, 2010.
3 An Information Ratio is defined as the mean annual performance divided by the standard deviation of returns. The rule of thumb is that an IR of 0.5 is ‘interesting’ while an IR of 1 is ‘very good.’