MA Cross Over Example in Usage Guide

I've been attempting to build various position sizing rules for an EWMA crossover trend strategy, but my backtests haven't been producing any results. To track down the cause, I decided to work through a process of elimination, starting with my sizing rules.

To that end, I copied the Moving Average Crossover strategy from the QuantRocket Usage Guide but swapped the SMA for an Exponential Weighted Moving Average. It's identical except for the change to EWMA, and I'm using a StaticUniverse that I use regularly without issue. I also made sure to include a decay_rate for the EWMAs, using the example decay_rate from the API guide, but the backtest produces no returns. As a result, I'm starting to think the issue may not be with the various sizing rules I've been building.

Any assistance would be appreciated. Below is the EWMA version of the crossover strategy I copied from the Usage Guide, which produces an error indicating no returns were generated:

import zipline.api as algo
from zipline.pipeline import Pipeline, EquityPricing
from zipline.pipeline.factors import ExponentialWeightedMovingAverage
from zipline.pipeline.filters import StaticUniverse

BUNDLE = "usstock-1min"

def initialize(context: algo.Context):
    """
    Create a pipeline containing the moving averages and
    schedule the rebalance function to run each trading
    day 30 minutes after the open.
    """
    context.target_value = 50000

    pipe = Pipeline(
        columns={
            "long_mavg": ExponentialWeightedMovingAverage(
                inputs=[EquityPricing.close],
                window_length=128, decay_rate=.08),
            "short_mavg": ExponentialWeightedMovingAverage(
                inputs=[EquityPricing.close],
                window_length=32, decay_rate=.08)
        },
        initial_universe=StaticUniverse("global-macro1"))

    algo.attach_pipeline(pipe, "mavgs")

    algo.schedule_function(
        rebalance,
        algo.date_rules.every_day(),
        algo.time_rules.market_open(minutes=30))

def before_trading_start(context: algo.Context, data: algo.BarData):
    """
    Gather today's pipeline output.
    """
    context.mavgs = algo.pipeline_output("mavgs")

def rebalance(context: algo.Context, data: algo.BarData):
    """
    Buy the assets when their short moving average is above the
    long moving average.
    """

    for asset in context.mavgs.index:

        short_mavg = context.mavgs.short_mavg.loc[asset]
        long_mavg = context.mavgs.long_mavg.loc[asset]

        if short_mavg > long_mavg:
            algo.order_target_value(asset, context.target_value)
        elif short_mavg < long_mavg:
            algo.order_target_value(asset, 0)

EWMA is more complicated than a simple MA and is not necessarily a drop-in replacement for it. In this case, the decay rate is so aggressive that past observations quickly receive weights of essentially zero, so the long and short moving averages end up identical, and no trades are placed because the rebalance logic expects one MA to be higher than the other. Try a higher decay rate such as 0.9 (= slower decay). The 0.08 decay rate in the API example is not the best example for this reason and should probably be updated.
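To see why, here's a rough back-of-the-envelope sketch in plain NumPy (not the actual Zipline implementation; Zipline's exact exponents may differ by a constant offset, which cancels out once the weights are normalized) comparing how much of the total weight lands on the most recent bar at the two decay rates:

import numpy as np

def approx_zipline_weights(window_length, decay_rate):
    # Per the docstring, rows are weighted by powers of decay_rate, with the
    # oldest row getting the highest power and the most recent row the lowest.
    # Normalize so the weights sum to 1 for easy comparison.
    powers = np.arange(window_length, 0, -1)   # window_length, ..., 2, 1
    weights = decay_rate ** powers
    return weights / weights.sum()

for rate in (0.08, 0.9):
    w = approx_zipline_weights(32, rate)
    print(f"decay_rate={rate}: weight on most recent bar = {w[-1]:.3f}")

# decay_rate=0.08 puts roughly 92% of the total weight on the latest close, so
# a 32-bar and a 128-bar EWMA are effectively the same number and never cross.
# decay_rate=0.9 spreads the weight across many bars, so the two averages can
# actually diverge.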

For what it's worth, I did some research on the sensitivity because I suspected this might be the issue, but it's counterintuitive to me that you would want a decay rate as aggressive as 0.90. Also, in another test I was able to get a 0.10 decay rate to work with the EWMA. ChatGPT provided the explanation below, which would indicate that 0.08 is well within the normal range for a decay rate. Is it possible Zipline has the equation flipped somehow?

Below is what ChatGPT had to say about the equation and the normal decay rate for EWMAs (I know ChatGPT is not always right :))

"The decay rate in an Exponential Weighted Moving Average (EWMA) is often chosen based on the desired sensitivity of the average to new data. However, a common choice for the decay rate (or smoothing factor) in financial applications, including trend analysis in equities, is typically in the range of 0.05 to 0.3. This range allows the EWMA to respond to recent price changes while still smoothing out noise.

To give you a more concrete example, the decay rate is sometimes expressed in terms of the number of periods (N) it effectively covers. For a common decay rate used in finance, such as for a 20-day EWMA, the decay rate is often calculated as:

α = 2 / (N + 1)

For N = 20 days: α = 2 / (20 + 1) = 2 / 21 ≈ 0.095

Thus, a decay rate of around 0.1 is often used for a 20-day EWMA. For shorter periods, a higher decay rate might be used (e.g., 0.3 for a 5-day EWMA), while for longer periods, a lower decay rate might be applied (e.g., 0.05 for a 50-day EWMA).

In summary, the normal decay rate applied in financial applications for EWMA typically falls within the range of 0.05 to 0.3, depending on the specific needs and the period length of the moving average being used."

I think I just got my answer about Zipline. I dug deeper and ChatGPT said the following about how Zipline calculates their decay rate.

"In Zipline, the Exponential Weighted Moving Average (EWMA) is calculated using a decay rate that is set by the user or defaults to a typical value. The decay rate in Zipline is defined as λ\lambdaλ, which determines the rate at which the weights of the older data points decrease.

The formula for the decay rate λ\lambdaλ in Zipline is:

λ=1−2N+1\lambda = 1 - \frac{2}{N + 1}λ=1−N+12​

where NNN is the window length or the number of periods over which the average is calculated. This formula means that a higher value of NNN will result in a smaller decay rate, giving more weight to older data points, whereas a smaller NNN will give more weight to recent data points."
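If that formula is right, a quick sanity check on the numbers: for N = 20, λ = 1 − 2 / (20 + 1) = 1 − 2/21 ≈ 0.905, which lines up with the ~0.9 decay rate suggested above, so Zipline's decay_rate looks like the complement of the α ChatGPT described (λ = 1 − α).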

@Brian, after more trial and error I continue to think there may be an issue with Zipline's EWMA. I ran the strategy pasted below, testing a single asset (SPY) on an EWMA crossover with a decay_rate of 0.9. The backtest now produces results, but the Zipline data browser indicates it stayed long from 2014 to 2024, with the short EWMA (32) never crossing below the long EWMA (128), which shouldn't be possible unless this decay rate is dramatically altering how the EWMAs are calculated. Further, the Zipline results file indicates the strategy went long on 1/2/14 and never sold, yet the results don't match the benchmark, which was also SPY. So a few things just aren't adding up. Any guidance would be appreciated. Thanks.

%%writefile EWMA_Trend_Strategy_SingleAsset_Long_Only.py

import zipline.api as algo
import numpy as np
from zipline.finance import commission, slippage
from zipline.pipeline import Pipeline
from zipline.pipeline.data import EquityPricing
from zipline.pipeline.factors import ExponentialWeightedMovingAverage
from zipline.pipeline.filters import StaticAssets

from zipline.api import (
    attach_pipeline,
    date_rules,
    time_rules,
    get_datetime,
    schedule_function,
    pipeline_output,
    record,
    symbol,
    order_target_value,
    set_commission,
    set_slippage,
)

SHORT_EWMA = 32
LONG_EWMA = 128

def make_pipeline():
    price = EquityPricing.close.latest
    universe_price = price.all_present(252)
    universe_symbols = StaticAssets([symbol("SPY")])
    
    short_ewma = ExponentialWeightedMovingAverage(
        inputs=[EquityPricing.close], 
        window_length=SHORT_EWMA, 
        decay_rate=0.9, 
        mask=universe_symbols
    )
    long_ewma = ExponentialWeightedMovingAverage(
        inputs=[EquityPricing.close], 
        window_length=LONG_EWMA, 
        decay_rate=0.9, 
        mask=universe_symbols
    )
    
    ma_cross = short_ewma > long_ewma
    universe = universe_price & universe_symbols & ma_cross

    pipe = Pipeline(
        columns={
            "price": price,
            "short_ewma": short_ewma,
            "long_ewma": long_ewma,
        },
        screen=universe
    )
    return pipe

def initialize(context: algo.Context):
    """
    Create a pipeline containing the moving averages and
    schedule the rebalance function to run each trading
    day.
    """
    algo.set_benchmark(symbol("SPY"))
    
    pipe = make_pipeline()
    algo.attach_pipeline(pipe, "ewmas")

    algo.schedule_function(
        rebalance,
        algo.date_rules.every_day(),
        algo.time_rules.market_close()
    )
    
    context.target_value = None  # Initialize the target_value attribute

def before_trading_start(context: algo.Context, data: algo.BarData):
    """
    Gather today's pipeline output.
    """
    context.ewmas = algo.pipeline_output("ewmas")

def rebalance(context: algo.Context, data: algo.BarData):
    """
    Buy the assets when their short moving average is above the
    long moving average.
    """
    
    for asset in context.ewmas.index:

        short_ewma = context.ewmas.short_ewma.loc[asset]
        long_ewma = context.ewmas.long_ewma.loc[asset]
        context.target_value = context.portfolio.portfolio_value  # Set target_value

        if short_ewma > long_ewma:
            algo.order_target_value(asset, context.target_value)
        elif short_ewma < long_ewma:
            algo.order_target_value(asset, 0)

You've set your universe filter such that SPY is only included in the pipeline output when the short EWMA is above the long EWMA, so naturally you will only ever buy the security. When the cross flips the other way, SPY drops out of the pipeline output entirely, so the rebalance logic never sees it and never places the exit order.
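One possible fix, as a quick untested sketch reusing the imports and constants from your script, is to keep the crossover out of the screen and leave the buy/sell decision entirely to rebalance():

def make_pipeline():
    price = EquityPricing.close.latest
    universe_price = price.all_present(252)
    universe_symbols = StaticAssets([symbol("SPY")])

    short_ewma = ExponentialWeightedMovingAverage(
        inputs=[EquityPricing.close],
        window_length=SHORT_EWMA,
        decay_rate=0.9,
        mask=universe_symbols
    )
    long_ewma = ExponentialWeightedMovingAverage(
        inputs=[EquityPricing.close],
        window_length=LONG_EWMA,
        decay_rate=0.9,
        mask=universe_symbols
    )

    # Screen only on data availability and universe membership; leaving the
    # crossover out of the screen keeps SPY in the pipeline output on both
    # sides of the cross, so rebalance() can also place the exit order.
    universe = universe_price & universe_symbols

    return Pipeline(
        columns={
            "price": price,
            "short_ewma": short_ewma,
            "long_ewma": long_ewma,
        },
        screen=universe
    )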

The Zipline test suite validates the EWMA factor against the pandas .ewm() method, so you can trust its accuracy. The docstring spells out the math Zipline is using:

    When calculating historical averages, rows are multiplied by the
    sequence::

        decay_rate, decay_rate ** 2, decay_rate ** 3, ...

That is why a lower decay_rate yields faster decay.
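So if you want Zipline's EWMA to behave like a pandas-style EWMA with smoothing factor α (the convention ChatGPT was describing), a reasonable sketch is to pass the complement as decay_rate. The window_length of 3 × span below is just an illustrative choice to give the weights room to decay, not a Zipline requirement:

from zipline.pipeline.data import EquityPricing
from zipline.pipeline.factors import ExponentialWeightedMovingAverage

span = 20                              # pandas-style "20-period" EWMA
alpha = 2.0 / (span + 1)               # pandas smoothing factor, ~0.095
decay_rate = 1.0 - alpha               # Zipline's convention, ~0.905

ewma_20 = ExponentialWeightedMovingAverage(
    inputs=[EquityPricing.close],
    window_length=3 * span,            # long enough for the weights to decay
    decay_rate=decay_rate,
)

# Zipline also ships convenience constructors (e.g. from_span) that perform
# this conversion; check the docstring of your installed version for the
# exact signature:
# ewma_20 = ExponentialWeightedMovingAverage.from_span(
#     inputs=[EquityPricing.close], window_length=3 * span, span=span)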


That makes complete sense. Sorry for the runaround on this. The API example of decay_rate had me mixed up and thinking something else (besides my code) was the problem. Thank you @Brian.