Risk Management

Equity Bond Exposure Management

I did a post last October (here) looking at varying the allocation between stocks and bonds, and at the end I hinted at a tactical overlay between the two asset classes. Six months later, I have finally found a decent overlay that I feel may hold value.

In a paper called “Principal Components as a Measure of Systemic Risk” (SSRN), Kritzman et al. presented a method for identifying “fragile” market states. To do this, they constructed the Absorption Ratio. Here is the equation:
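The equation image from the original post is reconstructed below in LaTeX from the description that follows, where E_i denotes the i-th eigenvector and A_j the j-th asset:

$$AR = \frac{\sum_{i=1}^{n} \sigma^2_{E_i}}{\sum_{j=1}^{N} \sigma^2_{A_j}}$$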

The sigma in the numerator represents the variance of the i-th eigenvector, while the one in the denominator is the variance of the j-th asset. In the paper, n is set to roughly 1/5 of the total number of assets (N). The interpretation is simple: the higher the ratio, the more “fragile” the market state. The intuition is that when the ratio is high, risk is very concentrated; when it is low, risk is dispersed and spread out. Think weak versus strong. Below is the raw AR through time for the DJ 30 components.

[Figure: raw Absorption Ratio of the DJ 30 components through time]

As you can see, the ratio spikes during the tech bubble and the recent financial crisis. How does it look when used as a filter? Below are two pictures comparing the signals generated by the 200-day SMA and by the standardized AR.

[Figures: timing signals from the 200-day SMA versus the standardized Absorption Ratio]

Pretty good timing, in my opinion. In line with the paper, I reconstructed the strategy that switches between stocks (DIA) and bonds (VBMFX). When the standardized AR is between -1 and 1, we split 50/50. When it is above 1, we are in love with bonds, and when it is below -1, we are in love with stocks. Simple. Results:

[Figure and table: equity curve and performance metrics of the AR switching strategy]

And here is the code (I know it's messy, didn't have a lot of time! :)

Note: there is survivorship bias, as I used the current-day DJ 30 components.
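Since the original script is not embedded here, below is a minimal sketch (not the original code) of how the rolling Absorption Ratio and the switching rule could be computed in R. It assumes rets is an xts matrix of daily returns for the DJ 30 components and that dia.ret and bond.ret are daily return series for DIA and VBMFX; the 500-day covariance window, the 250-day standardization window, and the equally weighted covariance estimate are simplifying assumptions.

library(quantmod)

# Absorption Ratio: fraction of total variance explained by the top n eigenvectors
absorption.ratio <- function(x, n) {
  ev <- eigen(cov(x), symmetric = TRUE, only.values = TRUE)$values
  sum(ev[1:n]) / sum(ev)
}

window <- 500                      # lookback for the covariance estimate (assumed)
n      <- ceiling(ncol(rets) / 5)  # n = N/5 as in the paper
ar <- rollapply(rets, window, function(x) absorption.ratio(x, n),
                by.column = FALSE, align = "right")

# standardize the AR, then switch: z > 1 -> bonds, z < -1 -> stocks, else 50/50
z       <- (ar - runMean(ar, 250)) / runSD(ar, 250)
w.bond  <- ifelse(z > 1, 1, ifelse(z < -1, 0, 0.5))
w.stock <- 1 - w.bond
strat   <- lag(w.stock) * dia.ret + lag(w.bond) * bond.ret  # act on the next day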

Thanks for reading


Diversification through Equity Blending

In a sound asset allocation framework, it is never a good idea to overweight the risky portion of the portfolio. One example is the traditional 60/40 portfolio, whereby an investor allocates 60% to equities and 40% to bonds. Such an allocation may intuitively make sense, as you “feel” diversified, but when extraordinary events happen you will be less protected. Below is the performance of the 60/40 allocation rebalanced monthly since 2003. Note that I used SPY and IEF for the mix.

In this post, I would like to show some ideas that can reduce risk and increase return by bringing in a different type of return stream. Traditional asset allocation focuses mainly on optimal diversification across assets; here I will focus on allocation across strategies. From my own research, there are only so many asset classes the individual can choose from to form portfolios, not to mention the less-than-reliable cross correlations between asset classes in market turmoil (2008). To bring stability to the core portfolio, I will incorporate Harry Browne’s Permanent Portfolio. This return stream is an equal-weight allocation to equities, gold, bonds, and liquid cash. For the more aggressive part, I will use daily equity mean reversion (RSI2). Note that a basic strategy overlay on an asset can produce a return stream with diversification benefits of its own. Below are three equity curves: black, green, and red represent mean reversion, a 60/40 blend of the two strategies, and the Permanent Portfolio, respectively.

To summarize, I have taken two return streams derived from strategies traded over assets and combined them to form a portfolio. The allocation is 40% to the risky component (mean reversion, MR) and 60% to the conservative component (Permanent Portfolio, PP). And here are the performance metrics.

“Traditional” represents the traditional 60/40 allocation to equities and bonds, while “B-H” represents buy and hold on the S&P 500. This superficial analysis is only meant to illustrate the powerful idea of blending assets and trading strategies. When the traditional search for diversification becomes fruitless, incorporating different strategies can have a huge impact on your underlying performance.

I will come back later with the R code, as it's pretty late and I have class tomorrow!
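In the meantime, here is a minimal sketch of the 40/60 blend (not the code promised above), assuming mr and pp are xts daily return series for the mean reversion strategy and the Permanent Portfolio; monthly rebalancing of the blend is an assumption.

library(PerformanceAnalytics)

# mr, pp: placeholder xts return series for the two strategies
# 40/60 MR/PP blend, rebalanced back to target weights every month
blend <- Return.portfolio(merge(mr, pp), weights = c(0.4, 0.6),
                          rebalance_on = "months")
charts.PerformanceSummary(merge(mr, pp, blend),
                          main = "MR / PP / 40-60 blend")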

Parameter Insensitive Models

In my opinion there are two enemies of successful system development. One is the “exploitability” of the anomaly you are trying to extract profit from. The other is the parameters you choose to exploit the anomaly with. The “exploitability” aspect is something you cannot control much, as the profitability of any anomaly is in constant flux. One example is the profitability of trend following in general: when markets are choppy, it is tough for any trend follower to extract sizeable profits.

The area you do have absolute control over is the parameters you choose to trade with. The more varied the parameter selection, the more robust you are, since the added diversification reduces the probability of loss if any single parameter set suffers a lack of performance. Parameters here can literally be the lookback lengths you choose for an MA crossover strategy, or the idea can extend to similar models like breakouts.

In the following experiment, I will test the performance of 5 different models, all mean reversion in nature (a rough sketch of the signal rules follows the list).

Model 1 (rsi1): RSI(2) 50/50

Model 2 (rsi2): RSI(2) Buy: <30, Short: >70

Model 3 (rsi3): RSI(2) Buy: <30, Sell: >50, Short: >70, Cover: <50

Model 4 (no.reb): equal weight across rsi1-rsi3, no rebalancing

Model 5 (reb): equal weight across rsi1-rsi3, rebalanced weekly
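As a rough sketch (not the code used for the results below), the rsi1-style signals might be generated as follows, using SPY from Yahoo via quantmod purely as an example instrument; the long-below-50 / short-above-50 reading of “RSI(2) 50/50” is my assumption.

library(quantmod)

# Hypothetical illustration of the rsi1-style rules: RSI(2) with a 50/50
# threshold, long when RSI(2) < 50, short when RSI(2) > 50, executed next bar.
getSymbols("SPY", from = "2000-01-01")
rsi    <- RSI(Cl(SPY), n = 2)
signal <- lag(ifelse(rsi < 50, 1, -1))   # act on the following day
ret    <- signal * ROC(Cl(SPY))
ret[is.na(ret)] <- 0
rsi2.50.50 <- cumprod(1 + ret)           # one of the equity curves fed into the CSV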

Parameter-insensitive models rest on the idea that no one knows what the future holds or how each parameter will perform. Instead of relying on past data to select something that “was” consistent, parameter-insensitive models try to avoid putting all eggs in one basket. The following are the equity curves of the strategies.

The focus should be on the bold equity curve, which rebalances weekly. From the graph, it is highly correlated with the other equity curves, but it is smoother than the individual strategies’ curves. What I am trying to convey is that the return of any strategy is attributable to the underlying health of the anomaly (something you cannot control) plus the efficiency of the parameters used to extract profit (something you can control). The next chart is the drawdown.

If we had unfortunately chosen to trade rsi2 (blue) alone, our drawdowns would be markedly different. Next is a stacked horizon plot of rolling 252-day returns.

The first three panels are the models rsi1, rsi2, and rsi3, and the fourth and fifth are no rebalance and rebalance. As you can see, the overall performance is reduced, but in times when certain individual models underperform, the aggregate rebalancing model is able to mitigate it quite successfully. And finally, the numbers…

One little experiment cannot prove anything. I am still trying the idea out in many different ways and hope that through further research, I will arrive at some more concrete conclusions.

# RSI Parameter Insensitive Model
# test out rebalancing to equal weight versus holding a constant weight
# correctly set the working directory and import your own equity curves
# for blending, I passed in three equity curves

library(quantmod)             # zoo/xts handling and ROC (via TTR)
library(PerformanceAnalytics) # Drawdowns, maxDrawdown
library(latticeExtra)         # horizonplot
library(ggplot2)              # equity and drawdown plots

data<-read.csv("rsi.csv") #set your own input file

#conversion to zoo object
data$date<-as.Date(data$date,"%Y-%m-%d")
rsi2.50.50<-zoo(data$rsi2.50,data$date)
rsi2.extreme<-zoo(data$rsi2.extreme,data$date)
rsi2.semi<-zoo(data$rsi2.semi,data$date)
data<-merge(rsi2.50.50,rsi2.extreme)
data<-merge(data,rsi2.semi)

names(data)<-c("rsi1",'rsi2','rsi3')
ret<-ROC(data)
ret[is.na(ret)]<-0

#normalize equity curves
ret$rsi1.equity<-cumprod(1+ret$rsi1) #simulated equity
ret$rsi2.equity<-cumprod(1+ret$rsi2)
ret$rsi3.equity<-cumprod(1+ret$rsi3)
ret$equity<-ret$rsi1.equity+ret$rsi2.equity+ret$rsi3.equity #add them together

ret$equity<-ROC(ret$equity)
ret$equity[is.na(ret$equity)]<-0
ret$equity<-cumprod(1+ret$equity)

rsi.equity1<-ret[,-(1:3)] #same allocation through time
rsi.equity2<-as.xts(rsi.equity1[,-4])

###############################
#Rebalancing of equity
###############################
# Load Systematic Investor Toolbox (SIT)
setInternet2(TRUE) # Windows-only helper for https downloads; skip on other platforms
con = gzcon(url('https://github.com/systematicinvestor/SIT/raw/master/sit.gz', 'rb'))
source(con)
close(con)

#*****************************************************************
# prep input
#******************************************************************
data <- list(prices=rsi.equity2,
 rsi1=rsi.equity2$rsi1,
 rsi2=rsi.equity2$rsi2,
 rsi3=rsi.equity2$rsi3) #need to create new list to store stuff
#weight with n column as input data
data$weight<-rsi.equity2
data$weight[!is.na(data$weight)]<- NA
#execution price
data$execution.price<-rsi.equity2
data$execution.price[!is.na(data$execution.price)]<-NA
#dates
data$dates<-index(rsi.equity2)

#*****************************************************************
# Rebalancing Algo
#******************************************************************
prices = data$prices
nperiods = nrow(prices)
target.allocation = matrix(c(0.33,0.33,0.33), nrow=1)

# Rebalance periodically
models = list()

period<-'weeks' #change to whatever rebalancing period you want
data$weight[] = NA
data$weight[1,] = target.allocation

period.ends = endpoints(prices, period)
period.ends = period.ends[period.ends > 0]
data$weight[period.ends,] = repmat(target.allocation, len(period.ends), 1)

capital = 100000
data$weight[] = (capital / prices) * data$weight

#this works only when your input prices are an xts object rather than zoo
models[[period]] = bt.run(data, type='share', capital=capital)

#*****************************************************************
# Create Report
#******************************************************************
#all 5 equity curves
equity1<-merge(rsi.equity1,models$weeks$equity)
names(equity1)<-c('rsi1','rsi2','rsi3','no.reb','reb')
equity1<-as.xts(equity1)

#print out each strategy's performance and collect its drawdown series
dd.list <- list()
for(i in 1:ncol(equity1))
{
 ret<-ROC(equity1[,i])
 ret[is.na(ret)]<-0
 ret<-as.xts(ret)
 dd.list[[i]]<-Drawdowns(ret,geometric=T) #drawdown series, used for the plot below
 print(compute.cagr(equity1[,i]))
 print(maxDrawdown(ret))
 print(compute.cagr(equity1[,i])/maxDrawdown(ret)) #MAR ratio
 print(compute.sharpe(ret))
}
dd.all<-do.call(merge,dd.list)
names(dd.all)<-names(equity1)

ret<-ROC(equity1,252)
horizonplot(ret,origin=0,scales = list(y = list(relation = "same")),colorkey=T)

df<-equity1
df <- data.frame( time=index(df),rsi1=df$rsi1,rsi2=df$rsi2,rsi3=df$rsi3,no.reb=df$no.reb,reb=df$reb)
#plot Equity

ggplot(df,aes(df$time)) +
 geom_line(aes(y=rsi1,colour="rsi1")) +
 geom_line(aes(y=rsi2,colour="rsi2")) +
 geom_line(aes(y=rsi3,colour="rsi3")) +
 geom_line(aes(y=no.reb,colour="no.reb")) +
 geom_line(aes(y=reb,colour="reb"),size=1.2)
#plot drawdown
df1<-dd.all #drawdown series collected in the loop above
df1<-data.frame( time=index(df1),rsi1=df1$rsi1,rsi2=df1$rsi2,rsi3=df1$rsi3,no.reb=df1$no.reb,reb=df1$reb)

ggplot(df1,aes(df1$time)) +
 geom_line(aes(y=rsi1,colour="rsi1")) +
 geom_line(aes(y=rsi2,colour="rsi2")) +
 geom_line(aes(y=rsi3,colour="rsi3")) +
 geom_line(aes(y=no.reb,colour="no.reb")) +
 geom_line(aes(y=reb,colour="reb"),size=1.2)

Value Investing with Risk Management

When I started researching investing at the age of 16, I only believed in value investing. The idea of buying things cheaply intuitively made sense. You’d find me constantly researching and reading value investing books night after night. As each company I researched had different business models and economic moats, a lot of my earlier methods of analyzing stocks weren’t systematic and therefore couldn’t really be tested. Besides, where could a teenager get his hands on data back in 2006?

I stopped all my efforts in discretionary value investing in early 2010, but two years later I can finally put my earlier ideas to the test. Let me define some of my ideas for researching companies. There are a lot of ways to value a company, from using earnings projections to discount future cash flows to deriving value from the assets on the balance sheet, and most value investors can only agree that value depends on how one defines it. For myself, I didn’t like outright projections of earnings due to the myriad factors and assumptions I had to make about the future. Instead I placed heavy emphasis on assets on the balance sheet, which filtered out companies that weren’t heavily asset-backed. Within the balance sheet, I simply looked at how healthy things were. The ratio of liquid assets versus long-term and short-term debt should be healthy. The current price should be trading near book value per share minus liquid cash. Profit margins should be high, indicating market share. One of my favourite ratios compares trailing 12-month net income to total debt: the higher it is, the more assured I am that the company can pay its creditors back. These are just some of the analyses I do, but what comes after is entirely subjective and hence hard to attribute to luck or hard work.

To test my earlier methods of analyzing stocks, I have created a stock screen that buys stocks based on the following 2 conditions (a rough sketch follows the list).

1. Closing price of stock must be less than Book Value per share (quarterly)

2. Annual Free Cash flow > Total Annual Debt
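As a rough sketch of what such a screen might look like (not the actual backtest code), assuming a data frame named fundamentals with one row per stock per quarter and columns close, book.value.per.share, free.cash.flow, and total.debt (all names are assumptions):

# keep only the stocks that satisfy both screen conditions for a given quarter
screen <- function(fundamentals) {
  subset(fundamentals,
         close < book.value.per.share &  # condition 1: price below book value per share
         free.cash.flow > total.debt)    # condition 2: annual FCF exceeds total annual debt
}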

Test dates are from 2001 to 2012. Rebalancing is quarterly. The stock universe is based on the Russell 1000, adjusted for survivorship bias and splits. My benchmark is the S&P 500. The results, assuming $100 invested:

The simple strategy multiplied the initial capital by more than 8 times, compared to the benchmark, which barely moved over the past decade. It is also true that no matter how nice a business or its fundamentals are, the market’s tide moves all boats: simply running this stock screen, the investor would face drawdowns of 50%, a bit more than holding the S&P 500, which endured a 49% maximum drawdown. What risk management measures can be implemented to improve the performance?

In one of my earlier posts I mentioned trading the equity curve. The modified strategy takes signals as normal until the total equity value falls below its own 4-quarter SMA, at which point I favour cash rather than being invested in equities. The results:

From the above image, one can see that there is indeed a reduction in risk in the modified strategy. Although the Sharpe ratio improved by only 2.8%, MAR improved by 74%. Further research that I think may improve results is ranking each quarter’s signals by some measure of price, volume, or volatility.

This was really an eye opener, as I would never have dreamt of actually testing my earlier methods in such a manner. I always thought that value investing was more of an art compared to systematic trading. In the future I hope to start combining my research in value and momentum to show that blending uncorrelated edges improves performance.


Volatility Parity

When I first started out in system research a year ago, I was told that in this business, if you can achieve return with lower volatility, you will definitely attract a lot of people’s attention. Since then, I’ve found myself leaning towards strategies with lower volatility, usually achieved through proper volatility management.

In this post, I’d like to take a look at portfolio volatility using some tools from portfolio theory. I’d like to show that by digging into volatility, one can better manage a portfolio.

There exists a fine line between academic finance and practitioners of finance. The opposing ideas are whether the markets are efficient or not. I am not going to dive into that debate, but I stand by the view that there are no fixed rules or equations for the markets. They are ever changing; therefore, I believe one should treat every concept as a tool.

A bit of math: portfolio variance is defined by the following equation. I am only going to use a two-asset example to avoid bringing in the full covariance matrix.

$$\sigma_p^2 = w_1^2\sigma_1^2 + w_2^2\sigma_2^2 + 2\,w_1 w_2\,\sigma_1\sigma_2\,\rho_{1,2}$$

The variance contribution of each asset is thus:

$$VC_1 = \frac{w_1^2\sigma_1^2 + w_1 w_2\,\sigma_1\sigma_2\,\rho_{1,2}}{\sigma_p^2}, \qquad VC_2 = \frac{w_2^2\sigma_2^2 + w_1 w_2\,\sigma_1\sigma_2\,\rho_{1,2}}{\sigma_p^2}$$

The two contributions sum to one.

In my opinion, the above equations capture a lot of information that can be used to manage volatility. At any given time, a multi-market strategy will hold more than one position. If you size each position so that every one contributes equally to overall portfolio volatility, you will have a much smoother, more balanced, and more diversified portfolio.

In the following graphs, I calculated, according to the above equations, how bonds and stocks contribute to aggregate portfolio variance. For stocks I used SPY and for bonds I used IEF, both exchange traded funds. This is a rolling 252-day graph with the traditional 60/40 allocation.
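A minimal sketch of how such a rolling contribution series could be computed (not the original code), assuming adjusted prices for SPY and IEF from Yahoo via quantmod and a static 60/40 weight vector:

library(quantmod)

getSymbols(c("SPY", "IEF"), from = "2003-01-01")
rets <- na.omit(merge(ROC(Ad(SPY)), ROC(Ad(IEF))))
w    <- c(0.6, 0.4)  # 60% stocks, 40% bonds

# rolling 252-day variance contribution of each asset
contrib <- rollapply(as.zoo(rets), 252, by.column = FALSE, align = "right",
  FUN = function(x) {
    S    <- cov(x)                        # 2x2 covariance matrix over the window
    pvar <- as.numeric(t(w) %*% S %*% w)  # portfolio variance
    as.numeric(w * (S %*% w)) / pvar      # each asset's share of portfolio variance
  })
names(contrib) <- c("SPY", "IEF")
plot(contrib, main = "Rolling 252-day variance contribution")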

From the above graph one can infer that the volatility contribution is far from equal; at times stocks contribute more than 100% while bonds contribute negatively.

The above graph also gives a pretty good market timing signal. When bonds contribute negatively, the market tends to be in turmoil, and those are the same periods when stocks contribute more than 100%.

I hope that through this, the reader will better understand volatility and just how it affects a portfolio.

Equity Curve Trading

In 2006, Cambria Investment Management published a paper that gained widespread popularity, proposing that trading asset classes with a long-term 10-month SMA would effectively cut drawdowns while preserving equity-like returns. Such a simple trend-following method, paired with easy access to exchange traded funds for exposure, has helped make the idea very popular.

I propose a similar idea. Instead of applying an SMA to the asset classes themselves, one would calculate an X-month SMA of a buy-and-hold strategy’s equity curve. If the equity curve ever dips below its SMA, we go to cash. In effect we are forming a hedge against our own equity curve.
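A minimal sketch of the rule, assuming spx.ret is an xts series of monthly total returns for the S&P 500 and n is the SMA lookback in months (both placeholders):

library(quantmod)

equity.curve.filter <- function(spx.ret, n = 10) {
  # buy-and-hold equity curve, kept as xts so lag() behaves as expected
  equity <- xts(cumprod(1 + as.numeric(spx.ret)), order.by = index(spx.ret))
  filt   <- SMA(equity, n)                     # X-month SMA of the curve itself
  signal <- lag(ifelse(equity > filt, 1, 0))   # invested while above the SMA,
  signal[is.na(signal)] <- 0                   # otherwise in cash; act next period
  cumprod(1 + signal * spx.ret)                # filtered equity curve
}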

Below are the results for trading the S&P 500 Index. ETF data are limited, so I used the underlying total return index as a proxy. Data from 1988 to 2011 are total return, while 1970 to 1988 are plain composite data, so the numbers are probably a bit off and should be taken as instructive only. The test runs from 1970 to May 2012. No commission or slippage is accounted for. All signals are taken at the following day’s open.

The equity curve in the above image shows that the strategy avoided the last two major bear markets, preserving returns by staying on the sidelines.

The drawdown chart for the entire test period confirms that the strategy is able to weather storms! Below is the rolling 10-month return of the strategy. You can see that it replicates the upside of buy and hold very well, and it is pretty consistent across time.

I believe that there can be a lot of ideas that can be paired with this method. One such extensions I dreamt about was using traditional methods of portfolio optimization to form portfolio of assets that meet criteria like Minimum variance or risk parity. Overlay this method to only participate in the upside and staying in cash or investing in bonds when equity curve is below its SMA. You are bounded by your own imagination.