Zipline, a Pythonic Algorithmic Trading Library

Overview
Zipline


Zipline is a Pythonic algorithmic trading library. It is an event-driven system for backtesting. Zipline is currently used in production as the backtesting and live-trading engine powering Quantopian -- a free, community-centered, hosted platform for building and executing trading strategies. Quantopian also offers a fully managed service for professionals that includes Zipline, Alphalens, Pyfolio, FactSet data, and more.

Features

  • Ease of Use: Zipline tries to get out of your way so that you can focus on algorithm development. See below for a code example.
  • "Batteries Included": many common statistics like moving average and linear regression can be readily accessed from within a user-written algorithm.
  • PyData Integration: Input of historical data and output of performance statistics are based on Pandas DataFrames to integrate nicely into the existing PyData ecosystem.
  • Statistics and Machine Learning Libraries: You can use libraries like matplotlib, scipy, statsmodels, and sklearn to support development, analysis, and visualization of state-of-the-art trading systems.

Installation

Zipline currently supports Python 2.7, 3.5, and 3.6, and may be installed via either pip or conda.

Note: Installing Zipline is slightly more involved than the average Python package. See the full Zipline Install Documentation for detailed instructions.

For a development installation (used to develop Zipline itself), create and activate a virtualenv, then run the etc/dev-install script.

Quickstart

See our getting started tutorial.

The following code implements a simple dual moving average algorithm.

from zipline.api import order_target, record, symbol

def initialize(context):
    context.i = 0
    context.asset = symbol('AAPL')


def handle_data(context, data):
    # Skip first 300 days to get full windows
    context.i += 1
    if context.i < 300:
        return

    # Compute averages
    # data.history() returns a pandas Series of trailing prices
    # for the requested window (bar_count bars at the given frequency).
    short_mavg = data.history(context.asset, 'price', bar_count=100, frequency="1d").mean()
    long_mavg = data.history(context.asset, 'price', bar_count=300, frequency="1d").mean()

    # Trading logic
    if short_mavg > long_mavg:
        # order_target orders as many shares as needed to
        # achieve the desired number of shares.
        order_target(context.asset, 100)
    elif short_mavg < long_mavg:
        order_target(context.asset, 0)

    # Save values for later inspection
    record(AAPL=data.current(context.asset, 'price'),
           short_mavg=short_mavg,
           long_mavg=long_mavg)
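Optionally, the same file can define an analyze(context, perf) hook, which Zipline calls with the resulting performance DataFrame once the run finishes. Here is a minimal sketch (assuming matplotlib is installed; the plotted columns come from the record() call above):

import matplotlib.pyplot as plt

def analyze(context, perf):
    # Called by Zipline after the backtest completes, with the full
    # performance DataFrame produced by the run.
    ax1 = plt.subplot(211)
    perf.portfolio_value.plot(ax=ax1)
    ax1.set_ylabel('portfolio value')

    ax2 = plt.subplot(212, sharex=ax1)
    perf[['AAPL', 'short_mavg', 'long_mavg']].plot(ax=ax2)
    ax2.set_ylabel('price')
    plt.show()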

You can then run this algorithm using the Zipline CLI. First, you must download some sample pricing and asset data:

$ zipline ingest
$ zipline run -f dual_moving_average.py --start 2014-1-1 --end 2018-1-1 -o dma.pickle --no-benchmark

This will download asset pricing data sourced from Quandl and stream it through the algorithm over the specified time range. The resulting performance DataFrame is saved as dma.pickle, which you can load and analyze from within Python.
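For example, a minimal sketch of inspecting the saved results (assuming pandas is available and dma.pickle is in the working directory):

import pandas as pd

# Load the performance DataFrame written by `zipline run -o dma.pickle`.
perf = pd.read_pickle('dma.pickle')

# One row per trading day; values recorded via record() (AAPL, short_mavg,
# long_mavg) appear as columns alongside standard fields such as
# portfolio_value and returns.
print(perf[['portfolio_value', 'AAPL', 'short_mavg', 'long_mavg']].tail())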

You can find other examples in the zipline/examples directory.

Questions?

If you find a bug, feel free to open an issue and fill out the issue template.

Contributing

All contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome. Details on how to set up a development environment can be found in our development guidelines.

If you are looking to start working with the Zipline codebase, navigate to the GitHub issues tab and start looking through interesting issues. Sometimes there are issues labeled as Beginner Friendly or Help Wanted.

Feel free to ask questions on the mailing list or on Gitter.

Note

Please note that Zipline is not a community-led project. Zipline is maintained by the Quantopian engineering team, and we are quite small and often busy.

Because of this, we want to warn you that we may not attend to your pull request, issue, or direct mention in months, or even years. We hope you understand, and we hope that this note might help reduce any frustration or wasted time.

Comments
  • Getting benchmark data via IEX API does not work anymore

    Getting benchmark data via IEX API does not work anymore

    Zipline uses IEX API to get benchmark data in benchmarks.py:

    def get_benchmark_returns(symbol):
        """
        Get a Series of benchmark returns from IEX associated with `symbol`.
        Default is `SPY`.
    
        Parameters
        ----------
        symbol : str
            Benchmark symbol for which we're getting the returns.
    
        The data is provided by IEX (https://iextrading.com/), and we can
        get up to 5 years worth of data.
        """
        r = requests.get(
            'https://api.iextrading.com/1.0/stock/{}/chart/5y'.format(symbol)
        )
        data = r.json()
    
        df = pd.DataFrame(data)
    
        df.index = pd.DatetimeIndex(df['date'])
        df = df['close']
    
        return df.sort_index().tz_localize('UTC').pct_change(1).iloc[1:]
    

    However, according to the IEX FAQ page, the chart API was removed on June 15, 2019. Using this API to try to download any stock data such as SPY now returns nothing but an HTTP 403 error. The functionality of the deprecated APIs has been moved to their new API, IEX Cloud, which requires a unique per-user token in every request. Any idea how to fix this issue in the long run?
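    One possible direction, shown here only as a hedged sketch (the endpoint path and token parameter are assumptions based on IEX Cloud's documented conventions, and the token value is a placeholder), is to port get_benchmark_returns to the IEX Cloud chart endpoint, which requires the per-user token:

    import pandas as pd
    import requests

    IEX_CLOUD_TOKEN = 'YOUR_TOKEN_HERE'  # placeholder; IEX Cloud requires a per-user token

    def get_benchmark_returns(symbol='SPY'):
        # Assumed IEX Cloud counterpart of the removed api.iextrading.com chart endpoint.
        r = requests.get(
            'https://cloud.iexapis.com/stable/stock/{}/chart/5y'.format(symbol),
            params={'token': IEX_CLOUD_TOKEN},
        )
        r.raise_for_status()
        df = pd.DataFrame(r.json())
        df.index = pd.DatetimeIndex(df['date'])
        df = df['close']
        return df.sort_index().tz_localize('UTC').pct_change(1).iloc[1:]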

    Benchmark 
    opened by MarikoKujo 55
  • Can't connect to Yahoo - Errors: Loader: failed to cache the new benchmark returns

    Can't connect to Yahoo - Errors: Loader: failed to cache the new benchmark returns

    Dear Zipline Maintainers,

    Before I tell you about my issue, let me describe my environment:

    Environment

    • Operating System: Linux hostname 2.6.18-238.12.1.el5 #1 SMP Tue May 31 13:22:04 EDT 2011 x86_64 x86_64 x86_64 GNU/Linux
    • Python Version: Python 3.5.1
    • Python Bitness: 32
    • How did you install Zipline: pip
    • Python packages:

    alembic==0.9.1 appdirs==1.4.3 bcolz==0.12.1 Bottleneck==1.3.0.dev0 click==6.7 contextlib2==0.5.5 cyordereddict==1.0.0 Cython==0.25.2 decorator==4.0.11 empyrical==0.2.2 intervaltree==2.1.0 Logbook==1.0.0 lru-dict==1.1.6 Mako==1.0.6 MarkupSafe==1.0 multipledispatch==0.4.9 networkx==1.11 numexpr==2.6.2 numpy==1.12.1 packaging==16.8 pandas==0.18.1 pandas-datareader==0.3.0.post0 patsy==0.4.1 pyparsing==2.2.0 python-dateutil==2.6.0 python-editor==1.0.3 pytz==2017.2 requests==2.13.0 requests-file==1.4.2 requests-ftp==0.3.1 scipy==0.19.0 setuptools-scm==1.15.5 six==1.10.0 sortedcontainers==1.5.7 SQLAlchemy==1.1.9 statsmodels==0.8.0 tables==3.4.2 toolz==0.8.2 zipline==1.1.0

    Now that you know a little about me, let me tell you about the issue I am having:

    Description of Issue

    I just ran the buyapple.py example described at http://www.zipline.io/beginner-tutorial.html, but got some errors. Can you help me check?

    /home/kevinyuan/dev/zipline> bin/zipline ingest
    Downloading Bundle: quantopian-quandl [####################################] 100%
    Writing data to /home/kevinyuan/.zipline/data/quantopian-quandl/2017-05-01T16;36;49.048566.

    /home/kevinyuan/dev/zipline> bin/zipline run -f examples/buyapple.py -s 2000-1-1 -e 2001-12-31 [2017-05-01 16:33:28.023652] INFO: Loader: Cache at /home/kevinyuan/.zipline/data/^GSPC_benchmark.csv does not have data from 1990-01-02 00:00:00+00:00 to 2017-04-27 00:00:00+00:00. Downloading benchmark data for '^GSPC'. [2017-05-01 16:33:28.076307] ERROR: Loader: failed to cache the new benchmark returns Traceback (most recent call last): File "/share/dev/tools/lib/python3.5/urllib/request.py", line 1240, in do_open h.request(req.get_method(), req.selector, req.data, headers) File "/share/dev/tools/lib/python3.5/http/client.py", line 1083, in request self.send_request(method, url, body, headers) File "/share/dev/tools/lib/python3.5/http/client.py", line 1128, in send_request self.endheaders(body) File "/share/dev/tools/lib/python3.5/http/client.py", line 1079, in endheaders self._send_output(message_body) File "/share/dev/tools/lib/python3.5/http/client.py", line 911, in _sendoutput self.send(msg) File "/share/dev/tools/lib/python3.5/http/client.py", line 854, in send self.connect() File "/share/dev/tools/lib/python3.5/http/client.py", line 1237, in connect serverhostname=server_hostname) File "/share/dev/tools/lib/python3.5/ssl.py", line 376, in wrap_socket context=self) File "/share/dev/tools/lib/python3.5/ssl.py", line 747, in _init self.do_handshake() File "/share/dev/tools/lib/python3.5/ssl.py", line 983, in do_handshake self.sslobj.do_handshake() File "/share/dev/tools/lib/python3.5/ssl.py", line 628, in do_handshake self._sslobj.dohandshake() ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last): File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/zipline/data/loader.py", line 247, in ensure_benchmark_data last_date, File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/zipline/data/benchmarks.py", line 59, in get_benchmark_returns squeeze=True, # squeeze tells pandas to make this a Series File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/pandas/io/parsers.py", line 562, in parser_f return read(filepath_or_buffer, kwds) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/pandas/io/parsers.py", line 301, in read compression=kwds.get('compression', None)) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/pandas/io/common.py", line 308, in get_filepath_or_buffer req = urlopen(str(filepath_or_buffer)) File "/share/dev/tools/lib/python3.5/urllib/request.py", line 162, in urlopen return opener.open(url, data, timeout) File "/share/dev/tools/lib/python3.5/urllib/request.py", line 465, in open response = self.open(req, data) File "/share/dev/tools/lib/python3.5/urllib/request.py", line 483, in _open '_open', req) File "/share/dev/tools/lib/python3.5/urllib/request.py", line 443, in _callchain result = func(*args) File "/share/dev/tools/lib/python3.5/urllib/request.py", line 1283, in https_open context=self.context, check_hostname=self.check_hostname) File "/share/dev/tools/lib/python3.5/urllib/request.py", line 1242, in doopen raise URLError(err) urllib.error.URLError: <urlopen error [SSL: CERTIFICATEVERIFY_FAILED] certificate verify failed (_ssl.c:645)> Traceback (most recent call last): File "/share/dev/tools/lib/python3.5/urllib/request.py", line 1240, in doopen h.request(req.get_method(), req.selector, req.data, headers) File "/share/dev/tools/lib/python3.5/http/client.py", line 1083, in request self.send_request(method, url, body, headers) File "/share/dev/tools/lib/python3.5/http/client.py", line 1128, in send_request self.endheaders(body) File "/share/dev/tools/lib/python3.5/http/client.py", line 1079, in endheaders self._send_output(message_body) File "/share/dev/tools/lib/python3.5/http/client.py", line 911, in _sendoutput self.send(msg) File "/share/dev/tools/lib/python3.5/http/client.py", line 854, in send self.connect() File "/share/dev/tools/lib/python3.5/http/client.py", line 1237, in connect serverhostname=server_hostname) File "/share/dev/tools/lib/python3.5/ssl.py", line 376, in wrap_socket context=self) File "/share/dev/tools/lib/python3.5/ssl.py", line 747, in _init self.do_handshake() File "/share/dev/tools/lib/python3.5/ssl.py", line 983, in do_handshake self.sslobj.do_handshake() File "/share/dev/tools/lib/python3.5/ssl.py", line 628, in do_handshake self._sslobj.dohandshake() ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last): File "bin/zipline", line 11, in sys.exit(main()) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/click/core.py", line 722, in call return self.main(args, **kwargs) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/click/core.py", line 697, in main rv = self.invoke(ctx) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/click/core.py", line 1066, in invoke return process_result(sub_ctx.command.invoke(sub_ctx)) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/click/core.py", line 895, in invoke return ctx.invoke(self.callback, *ctx.params) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/click/core.py", line 535, in invoke return callback(args, **kwargs) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/zipline/main.py", line 97, in _ return f(args, **kwargs) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/click/decorators.py", line 17, in new_func return f(get_current_context(), *args, *kwargs) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/zipline/main.py", line 240, in run environ=os.environ, File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/zipline/utils/run_algo.py", line 132, in run env = TradingEnvironment(asset_db_path=connstr) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/zipline/finance/trading.py", line 101, in init self.bm_symbol, File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/zipline/data/loader.py", line 164, in load_market_data trading_day, File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/zipline/data/loader.py", line 247, in ensure_benchmark_data last_date, File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/zipline/data/benchmarks.py", line 59, in get_benchmark_returns squeeze=True, # squeeze tells pandas to make this a Series File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/pandas/io/parsers.py", line 562, in parser_f return read(filepath_or_buffer, kwds) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/pandas/io/parsers.py", line 301, in _read compression=kwds.get('compression', None)) File "/home-tahoe-n2/kevinyuan/dev/zipline/lib/python3.5/site-packages/pandas/io/common.py", line 308, in get_filepath_or_buffer req = _urlopen(str(filepath_or_buffer)) File "/share/dev/tools/lib/python3.5/urllib/request.py", line 162, in urlopen return opener.open(url, data, timeout) File "/share/dev/tools/lib/python3.5/urllib/request.py", line 465, in open response = self.open(req, data) File "/share/dev/tools/lib/python3.5/urllib/request.py", line 483, in open '_open', req) File "/share/dev/tools/lib/python3.5/urllib/request.py", line 443, in _callchain result = func(args) File "/share/dev/tools/lib/python3.5/urllib/request.py", line 1283, in https_open context=self.context, check_hostname=self.check_hostname) File "/share/dev/tools/lib/python3.5/urllib/request.py", line 1242, in doopen raise URLError(err) urllib.error.URLError: <urlopen error [SSL: CERTIFICATEVERIFY_FAILED] certificate verify failed (_ssl.c:645)>

    opened by kevinyuan 55
  • Tutorial buyapple.py fails in python 2.7.10

    Tutorial buyapple.py fails in python 2.7.10

    running buyapple.py fails in Python 2.7.10 but runs in Python 3.4.3 for zipline 0.8.0rc. The part of the code that fails is data[symbol('AAPL')], found in

    record(AAPL=data[symbol('AAPL')].price)
    

    the error is

    Traceback (most recent call last):
      File "C:\python2710_64\Scripts\run_algo.py", line 4, in <module>
        __import__('pkg_resources').run_script('zipline==0.8.0rc1', 'run_algo.py')
      File "C:\Python2710_64\lib\site-packages\pkg_resources\__init__.py", line 729, in run_script
        self.require(requires)[0].run_script(script_name, ns)
      File "C:\Python2710_64\lib\site-packages\pkg_resources\__init__.py", line 1642, in run_script
        exec(code, namespace, namespace)
      File "c:\python2710_64\lib\site-packages\zipline-0.8.0rc1-py2.7-win-amd64.egg\EGG-INFO\scripts\run_algo.py", line 23, in <module>
        run_pipeline(print_algo=True, **parsed)
      File "C:\Python2710_64\lib\site-packages\zipline-0.8.0rc1-py2.7-win-amd64.egg\zipline\utils\cli.py", line 246, in run_pipeline
        perf = algo.run(source, overwrite_sim_params=overwrite_sim_params)
      File "C:\Python2710_64\lib\site-packages\zipline-0.8.0rc1-py2.7-win-amd64.egg\zipline\algorithm.py", line 529, in run
        for perf in self.gen:
      File "C:\Python2710_64\lib\site-packages\zipline-0.8.0rc1-py2.7-win-amd64.egg\zipline\gens\tradesimulation.py", line 120, in transform
        self.algo.instant_fill,
      File "C:\Python2710_64\lib\site-packages\zipline-0.8.0rc1-py2.7-win-amd64.egg\zipline\gens\tradesimulation.py", line 314, in _process_snapshot
        new_orders = self._call_handle_data()
      File "C:\Python2710_64\lib\site-packages\zipline-0.8.0rc1-py2.7-win-amd64.egg\zipline\gens\tradesimulation.py", line 343, in _call_handle_data
        self.simulation_dt,
      File "C:\Python2710_64\lib\site-packages\zipline-0.8.0rc1-py2.7-win-amd64.egg\zipline\utils\events.py", line 194, in handle_data
        event.handle_data(context, data, dt)
      File "C:\Python2710_64\lib\site-packages\zipline-0.8.0rc1-py2.7-win-amd64.egg\zipline\utils\events.py", line 212, in handle_data
        self.callback(context, data)
      File "C:\Python2710_64\lib\site-packages\zipline-0.8.0rc1-py2.7-win-amd64.egg\zipline\algorithm.py", line 306, in handle_data
        self._handle_data(self, data)
      File "buyapple.py", line 26, in handle_data
        record(AAPL=data[symbol('AAPL')].price)
      File "C:\Python2710_64\lib\site-packages\zipline-0.8.0rc1-py2.7-win-amd64.egg\zipline\protocol.py", line 518, in __getitem__
        return self._data[name]
    KeyError: Equity(0, symbol='AAPL', asset_name='', exchange='', start_date=None, end_date=None, first_traded=None)
    

    Environment: Windows 7 64-bit, Python 3.4, Python 2.7, zipline 0.8.0rc. The problem appears to involve zipline.protocol.BarData and zipline.assets._assets.Equity.

    Kind regards

    David Bieber

    Windows 32-bit 
    opened by beebeed 45
  • Add estimates quarter loader to pipeline

    Add estimates quarter loader to pipeline

    This PR adds an estimates quarter loader to be used with Pipeline.

    Some open questions are:

    • What other tests do we need to add?
    • How do we ensure that the num_quarters attribute has been added to the dynamically generated dataset by the time load_adjusted_array is called?
    opened by mtydykov 44
  • Benchmark downloading is broken

    Benchmark downloading is broken

    Fix benchmark downloading from Google with pandas-datareader. This issue was originally brought up here.

    We now get benchmark data from Google instead of Yahoo, as seen here.

    However, it appears that as of only a week or two ago, Google changed the URL from which they serve their financial data, causing pandas-datareader to break. This is also preventing us from rebuilding the test_examples data. (For more info, see the original post above.)

    Data Bundle Benchmark Close on Next Release 
    opened by dmichalowicz 36
  • Create in-memory restricted list

    Create in-memory restricted list

    • Create a restricted list manager that takes in information about restricted sids and stores in memory upon instantiation
    • Create a restrictions controller that aggregates and provides restrictions information from multiple restricted list managers
    • Make this restrictions controller available to BarData so that can_trade can take into account these restrictions
    • Register each restricted list manager as a trading control using set_do_not_order_list so that we may log/fail when we try to order a restricted sid
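    For context, a hedged sketch of how an algorithm touches these pieces through the public API (set_do_not_order_list registers the trading control, and data.can_trade consults the resulting restrictions); the symbols are arbitrary examples:

    from zipline.api import order_target, set_do_not_order_list, symbol

    def initialize(context):
        # Register a static restricted list as a trading control; ordering a
        # restricted sid will then fail (or log, depending on configuration).
        context.restricted = [symbol('AAPL')]
        set_do_not_order_list(context.restricted)

    def handle_data(context, data):
        asset = symbol('MSFT')
        # can_trade takes the registered restrictions into account.
        if data.can_trade(asset):
            order_target(asset, 10)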
    Ready to Merge Waiting on Downstream 
    opened by lianga888 35
  • Error retrieving H15 interest rates - ValueError: 'Time Period' is not in list

    Error retrieving H15 interest rates - ValueError: 'Time Period' is not in list

    Dear Zipline Maintainers,

    Environment

    • Operating System: Linux 3d62a3c1924c 4.4.0-66-generic #87-Ubuntu SMP Fri Mar 3 15:29:05 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
    • Python Version: Python 3.5.2 :: Anaconda 4.3.0 (64-bit)
    • Python Bitness: 64
    • How did you install Zipline: pip
    • Python packages:
    alabaster==0.7.9
    alembic==0.9.2
    anaconda-client==1.6.0
    anaconda-navigator==1.4.3
    appdirs==1.4.3
    APScheduler==3.3.1
    arctic==1.38.0
    argcomplete==1.0.0
    astroid==1.4.9
    astropy==1.3
    Babel==2.3.4
    backports.shutil-get-terminal-size==1.0.0
    bayesian-optimization==0.4.0
    bcolz==0.12.1
    beautifulsoup4==4.5.3
    bitarray==0.8.1
    blaze==0.10.1
    bokeh==0.12.4
    boto==2.45.0
    Bottleneck==1.2.1
    butils==0.2.13
    cachetools==2.0.0
    cffi==1.9.1
    chardet==2.3.0
    chest==0.2.3
    click==6.7
    cloudpickle==0.2.2
    clyent==1.2.2
    colorama==0.3.7
    conda==4.3.9
    conda-build==2.1.3
    conda-verify==2.0.0
    configobj==5.0.6
    contextlib2==0.5.5
    coverage==4.3.4
    coveralls==1.1
    cryptography==1.7.1
    cycler==0.10.0
    cyordereddict==1.0.0
    Cython==0.25.2
    cytoolz==0.8.2
    dask==0.13.0
    data==0.3.7
    datashape==0.5.4
    deap==1.1.0
    decorator==4.0.11
    dill==0.2.5
    docopt==0.6.2
    docutils==0.13.1
    dynd==0.7.3.dev1
    empyrical==0.2.2
    enum34==1.1.6
    et-xmlfile==1.0.1
    fastcache==1.0.2
    filelock==2.0.7
    Flask==0.12
    Flask-Cors==3.0.2
    future==0.16.0
    gevent==1.2.1
    greenlet==0.4.11
    h5py==2.6.0
    HeapDict==1.0.0
    hmmlearn==0.2.0
    idna==2.2
    imagesize==0.7.1
    inspyred==1.0.1
    intervaltree==2.1.0
    ipykernel==4.5.2
    ipython==5.1.0
    ipython-genutils==0.1.0
    ipywidgets==5.2.2
    isort==4.2.5
    itsdangerous==0.24
    jdcal==1.3
    jedi==0.9.0
    Jinja2==2.9.4
    jsonschema==2.5.1
    jupyter==1.0.0
    jupyter-client==4.4.0
    jupyter-console==5.0.0
    jupyter-core==4.2.1
    jupyterhub==0.7.2
    lazy-object-proxy==1.2.2
    learning==0.4.19
    llvmlite==0.15.0
    locket==0.2.0
    Logbook==1.0.0
    lru-dict==1.1.6
    lxml==3.7.2
    lz4==0.8.2
    Mako==1.0.6
    MarkupSafe==1.0
    matplotlib==2.0.0
    minepy==1.2.0
    mistune==0.7.3
    mockextras==1.0.2
    mpmath==0.19
    multipledispatch==0.4.9
    nb-anacondacloud==1.2.0
    nb-conda==2.0.0
    nb-conda-kernels==2.0.0
    nbconvert==4.2.0
    nbformat==4.2.0
    nbpresent==3.0.2
    networkx==1.11
    nltk==3.2.2
    nose==1.3.7
    notebook==4.3.1
    numba==0.30.1
    numexpr==2.6.2
    numpy==1.12.1
    numpydoc==0.6.0
    odo==0.5.0
    openpyxl==2.4.1
    packaging==16.8
    pamela==0.3.0
    pandas==0.18.1
    pandas-datareader==0.4.0
    partd==0.3.7
    pathlib2==2.2.0
    patsy==0.4.1
    pep8==1.7.0
    pexpect==4.2.1
    pickleshare==0.7.4
    Pillow==4.0.0
    pkginfo==1.4.1
    ply==3.9
    prompt-toolkit==1.0.9
    psutil==5.0.1
    ptyprocess==0.5.1
    py==1.4.32
    pyasn1==0.1.9
    PyBrain==0.3.3
    pycosat==0.6.1
    pycparser==2.17
    pycrypto==2.6.1
    pycurl==7.43.0
    pyflakes==1.5.0
    pyfolio==0.7.0
    Pygments==2.1.3
    pylint==1.6.4
    pymongo==3.4.0
    pyOpenSSL==16.2.0
    pypandoc==1.3.3
    pyparsing==2.2.0
    pytest==3.0.5
    python-dateutil==2.6.0
    python-editor==1.0.3
    pytz==2017.2
    PyWavelets==0.5.2
    PyYAML==3.12
    pyyawt==0.1.1
    pyzmq==16.0.2
    QtAwesome==0.4.3
    qtconsole==4.2.1
    QtPy==1.2.1
    redis==2.10.5
    requests==2.14.2
    requests-file==1.4.2
    requests-ftp==0.3.1
    rope-py3k==0.9.4.post1
    rpy2==2.8.5
    scikit-image==0.12.3
    scikit-learn==0.18.1
    scipy==0.19.0
    seaborn==0.7.1
    simplegeneric==0.8.1
    singledispatch==3.4.0.3
    six==1.10.0
    snowballstemmer==1.2.1
    sockjs-tornado==1.0.3
    sortedcontainers==1.5.7
    Sphinx==1.5.1
    sphinx-rtd-theme==0.1.9
    spyder==3.1.2
    SQLAlchemy==1.1.10
    statsmodels==0.8.0
    sympy==1.0
    TA-Lib==0.4.10
    tables==3.4.2
    terminado==0.6
    toolz==0.8.2
    tornado==4.4.2
    TPOT==0.6.8
    tqdm==4.11.2
    traitlets==4.3.1
    tsfresh==0.5.0
    tzlocal==1.3
    unicodecsv==0.14.1
    update-checker==0.16
    wcwidth==0.1.7
    Werkzeug==0.11.15
    widgetsnbextension==1.2.6
    wrapt==1.10.8
    xgboost==0.6a2
    xlrd==1.0.0
    XlsxWriter==0.9.6
    xlwt==1.2.0
    zipline==1.1.0
    

    Error retrieving H15 interest rates - ValueError: 'Time Period' is not in list

    • If I try to instantiate a TradingEnvironment, I get a ValueError. The code worked until yesterday and stopped working some hours ago. I'm not sure whether it was on Thursday or Friday (CEST).
    ---------------------------------------------------------------------------
    ValueError                                Traceback (most recent call last)
    
    ...
    env = TradingEnvironment(bm_symbol=self.benchmark, exchange_tz=self.exchange_tz,
    --> 539                                  trading_calendar=cal)
    ...
    
    /opt/conda/envs/develop/lib/python3.5/site-packages/zipline/finance/trading.py in __init__(self, load, bm_symbol, exchange_tz, trading_calendar, asset_db_path)
         94             trading_calendar.day,
         95             trading_calendar.schedule.index,
    ---> 96             self.bm_symbol,
         97         )
         98 
    
    /opt/conda/envs/develop/lib/python3.5/site-packages/zipline/data/loader.py in load_market_data(trading_day, trading_days, bm_symbol)
        169         first_date,
        170         last_date,
    --> 171         now,
        172     )
        173     benchmark_returns = br[br.index.slice_indexer(first_date, last_date)]
    
    /opt/conda/envs/develop/lib/python3.5/site-packages/zipline/data/loader.py in ensure_treasury_data(bm_symbol, first_date, last_date, now)
        317 
        318     try:
    --> 319         data = loader_module.get_treasury_data(first_date, last_date)
        320         data.to_csv(path)
        321     except (OSError, IOError, HTTPError):
    
    /opt/conda/envs/develop/lib/python3.5/site-packages/zipline/data/treasuries.py in get_treasury_data(start_date, end_date)
         74         parse_dates=['Time Period'],
         75         na_values=['ND'],  # Presumably this stands for "No Data".
    ---> 76         index_col=0,
         77     ).loc[
         78         start_date:end_date
    
    /opt/conda/envs/develop/lib/python3.5/site-packages/pandas/io/parsers.py in parser_f(filepath_or_buffer, sep, delimiter, header, names, index_col, usecols, squeeze, prefix, mangle_dupe_cols, dtype, engine, converters, true_values, false_values, skipinitialspace, skiprows, nrows, na_values, keep_default_na, na_filter, verbose, skip_blank_lines, parse_dates, infer_datetime_format, keep_date_col, date_parser, dayfirst, iterator, chunksize, compression, thousands, decimal, lineterminator, quotechar, quoting, escapechar, comment, encoding, dialect, tupleize_cols, error_bad_lines, warn_bad_lines, skipfooter, skip_footer, doublequote, delim_whitespace, as_recarray, compact_ints, use_unsigned, low_memory, buffer_lines, memory_map, float_precision)
        644                     skip_blank_lines=skip_blank_lines)
        645 
    --> 646         return _read(filepath_or_buffer, kwds)
        647 
        648     parser_f.__name__ = name
    
    /opt/conda/envs/develop/lib/python3.5/site-packages/pandas/io/parsers.py in _read(filepath_or_buffer, kwds)
        387 
        388     # Create the parser.
    --> 389     parser = TextFileReader(filepath_or_buffer, **kwds)
        390 
        391     if (nrows is not None) and (chunksize is not None):
    
    /opt/conda/envs/develop/lib/python3.5/site-packages/pandas/io/parsers.py in __init__(self, f, engine, **kwds)
        728             self.options['has_index_names'] = kwds['has_index_names']
        729 
    --> 730         self._make_engine(self.engine)
        731 
        732     def close(self):
    
    /opt/conda/envs/develop/lib/python3.5/site-packages/pandas/io/parsers.py in _make_engine(self, engine)
        921     def _make_engine(self, engine='c'):
        922         if engine == 'c':
    --> 923             self._engine = CParserWrapper(self.f, **self.options)
        924         else:
        925             if engine == 'python':
    
    /opt/conda/envs/develop/lib/python3.5/site-packages/pandas/io/parsers.py in __init__(self, src, **kwds)
       1434                 raise ValueError("Usecols do not match names.")
       1435 
    -> 1436         self._set_noconvert_columns()
       1437 
       1438         self.orig_names = self.names
    
    /opt/conda/envs/develop/lib/python3.5/site-packages/pandas/io/parsers.py in _set_noconvert_columns(self)
       1484                         _set(k)
       1485                 else:
    -> 1486                     _set(val)
       1487 
       1488         elif isinstance(self.parse_dates, dict):
    
    /opt/conda/envs/develop/lib/python3.5/site-packages/pandas/io/parsers.py in _set(x)
       1474 
       1475             if not is_integer(x):
    -> 1476                 x = names.index(x)
       1477 
       1478             self._reader.set_noconvert(x)
    
    ValueError: 'Time Period' is not in list
    

    What steps have you taken to resolve this already?

    • If I run the corresponding code manually, I get the same error. https://github.com/quantopian/zipline/blob/master/zipline/data/treasuries.py#L61
    df = pd.read_csv(
            "http://www.federalreserve.gov/datadownload/Output.aspx"
            "?rel=H15"
            "&series=bf17364827e38702b42a58cf8eaa3f78"
            "&lastObs="
            "&from="  # An unbounded query is ~2x faster than specifying dates.
            "&to="
            "&filetype=csv"
            "&label=omit"
            "&layout=seriescolumn"
            "&type=package",
            skiprows=1,  # First row is a useless header.
            parse_dates=['Time Period'],
            na_values=['ND'],  # Presumably this stands for "No Data".
            index_col=0,
    )
    
    • The request to the hard encoded link (to obtain the H15 interest rates) fails: "Unable to find the output file. Please contact administrator for assistance." http://www.federalreserve.gov/datadownload/Output.aspx?rel=H15&series=bf17364827e38702b42a58cf8eaa3f78&lastObs=&from=&to=&filetype=csv&label=omit&layout=seriescolumn&type=package

    • I played around with the Data Download Program and compared the parameters of the generated request strings. https://www.federalreserve.gov/releases/h15/

    • It seems as if label=omit is not accepted anymore. If I omit the label parameter or set it to include, it seems to work. http://www.federalreserve.gov/datadownload/Output.aspx?rel=H15&series=bf17364827e38702b42a58cf8eaa3f78&lastObs=&from=&to=&filetype=csv&label=include&layout=seriescolumn&type=package

    • I am curious if I am the only one having this problem. If not, how can we fix this problem permanently?

    Sincerely,

    Rudi

    opened by rudolf-bauer 30
  • Futures being read as Equities from env

    Futures being read as Equities from env

    code snippet:

    class TestAssetFinder(TestCase):
        @with_environment()
        def setUp(self, env=None):
            self.identifier = 'f'
            metadata = {0: {'asset_type': 'future',
                            'symbol': self.identifier}}
            self.asset_finder = AssetFinder(metadata=metadata)
            env.update_asset_finder(identifiers=[self.identifier])
    
        @with_environment()
        def test_future_metadata(self, env=None):
            asset = env.asset_finder.lookup_generic(self.identifier,
                                                       datetime.now())
            self.assertIsInstance(asset, Future)
    

    output

    AssertionError: (Equity(0, symbol='f', asset_name='', exchange='', start_date=0, end_date=Timestamp('2015-08-20 00:00:00+0000', tz='UTC'), first_traded=None), []) is not an instance of <class 'zipline.assets._assets.Future'>
    

    cc @jfkirk @StewartDouglas @yankees714 @dhexus

    opened by warren-oneill 30
  • most examples don't work outside quantopian env

    most examples don't work outside quantopian env

    Hi guys,

    Please update http://www.zipline.io/ because dual_moving_avg.py doesn't exist; it's dual_moving_average.py. This strategy works perfectly in the Quantopian environment but not with the Zipline CLI. I suspect the ichart.finance.yahoo URL is broken (see details at the end).

    The only example that works (or doesn't crash) is buyapple. The others fail as follows:

    • dual_ema_talib.py -> ImportError: No module named talib
    • pairtrade.py -> ValueError: You must define a handle_data function.
    • olmar.py -> AttributeError: 'TradingAlgorithm' object has no attribute 'symbol'
    • buyapple_analyze.py -> ValueError: You must define a handle_data function.

    Thanks,

    Fraka6

    zipline.__version__ Out[39]: '0.7.0'

    python scripts/run_algo.py -f zipline/examples/dual_moving_average.py --symbols AAPL --start 2011-1-1 --end 2012-1-1 -o dma.pickle
    ...
    [2015-07-30 04:14:04.415502] WARNING: Loader: No benchmark data found for date range. start_date=2015-07-30 00:00:00+00:00, end_date=2015-07-30 04:14:04.295081, url=http://ichart.finance.yahoo.com/table.csv?a=6&c=2015&b=30&e=30&d=6&g=d&f=2015&s=%5EGSPC
    [2015-07-30 04:14:24.025207] INFO: Performance: Simulated 252 trading days out of 252.
    [2015-07-30 04:14:24.025510] INFO: Performance: first open: 2011-01-03 14:31:00+00:00
    [2015-07-30 04:14:24.025611] INFO: Performance: last close: 2011-12-30 21:00:00+00:00
    Traceback (most recent call last):
      ...
      File "scripts/run_algo.py", line 25, in <module>
      File "pandas/hashtable.pyx", line 694, in pandas.hashtable.PyObjectHashTable.get_item (pandas/hashtable.c:12231)
    KeyError: 'AAPL'

    Bug Documentation 
    opened by fraka6 29
  • Re-implemented the calendar API.

    Re-implemented the calendar API.

    Instead of having separate ExchangeCalendar and TradingSchedule objects, we now just have TradingCalendar. The TradingCalendar keeps track of each session (defined as a contiguous set of minutes between an open and a close). It's also responsible for handling the grouping logic of any given minute to its containing session, or the next/previous session if it's not a market minute for the given calendar.

    (Unfortunately, the branch name is a misnomer; there is no 24/5 support yet.)

    opened by jbredeche 27
  • Remove Implicit Dependency on Benchmarks and Treasury Returns

    Remove Implicit Dependency on Benchmarks and Treasury Returns

    Background

    Zipline currently requires two special data inputs for simulations: "Benchmark Returns", which are used to calculate the "Alpha" and "Beta" metrics, among other things, and "Treasury Curves", which were at one time used as the "Risk Free Rate", which was part of the Sharpe Ratio calculation.

    Since these inputs are required by all simulations, we implicitly fetch them from third party API sources if they're not provided by users. We get treasury data from the US Federal Reserve's API, and we get benchmarks from IEX.

    Problems

    Implicitly fetching benchmarks and treasuries causes many problems:

    • Implicitly fetching means that running simulations requires an internet connection. We try to make this less painful by caching downloaded results and re-using them when possible, but this is only a partial fix, and it means that many users don't notice the implicit download until it starts causing mysterious problems.

    • The APIs we fetch from sometimes fail, leading to confusing behavior for users and spurious bug reports for Zipline maintainers.

    • The APIs we fetch from sometimes change in incompatible ways, which breaks older versions of Zipline. This is currently the case for the IEX API we use to fetch benchmarks, resulting in issues like:

      • https://github.com/quantopian/zipline/issues/2480
      • https://github.com/quantopian/zipline/issues/2607
      • https://github.com/quantopian/zipline/issues/2445
    • Our default benchmark is US-centric. We default to using SPY as the benchmark, which only makes sense in the US (and even then, only makes sense if you also have historical dividends for SPY, which many users don't have).

    Proposed Solution

    I think we should remove these implicit dependencies from Zipline. Treasuries we should just remove, since they're not actually used anymore. Figuring out what to do with benchmarks is a bit trickier.

    Treasuries

    Removing treasuries is relatively straightforward because we no longer actually use them. A quick scan of our GitHub issues turns up these issues that should be fixed by the removal:

    • https://github.com/quantopian/zipline/issues/324
    • https://github.com/quantopian/zipline/issues/119
    • https://github.com/quantopian/zipline/issues/144
    • https://github.com/quantopian/zipline/issues/2422

    I've opened a PR at https://github.com/quantopian/zipline/pull/2626 to finally remove all traces of the treasury subsystem.

    Benchmarks

    Benchmarks are a bit trickier. The benchmark is used in the calculation of the "alpha" and "beta" metrics, and many users are generally interested in comparing the returns of their strategy against a particular benchmark (often an ETF or index of some kind). We also don't currently have a way to specify a benchmark from the command line, or to define a benchmark asset for a particular bundle.

    I think there are a few things we could do to improve the situation here:

    1. We could add the ability to define a benchmark explicitly when running Zipline via the CLI. We already have the ability to do this internally, but there's no supported way to control the benchmark via the CLI or via an extension. I think this is necessary pretty much no matter what.

      • Optionally, we could also make it required that the user tell us what their benchmark is. This would remove the need for implicit fetching of the benchmark. Users who don't care could pass a dummy benchmark (e.g., of all zero returns); see the sketch after this list.
    2. Make the benchmark optional. Making the benchmark optional would result in alpha, beta, and any other benchmark-dependent risk metrics not being populated in Zipline's output. The tricky thing here is to do this in a way that doesn't result in performance degradation when running with a benchmark. I think we either should do this or make the benchmark asset required.

    3. (Short Term) We can fix our IEX API calls for benchmark data to use the updated APIs. This doesn't fix the systemic maintenance issues associated with the benchmark, but it would at least fix Zipline being straight-up broken for many people, which is its current status. I think the main challenge here is that IEX now requires an API token to work at all, which means we need to provide some mechanism for the user to pass in their API token.
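    To make option (1) concrete at the API level, here is a hedged sketch of supplying an explicit, dummy (all-zero) benchmark series directly to run_algorithm, assuming a Zipline version whose run_algorithm accepts a benchmark_returns argument:

    import pandas as pd
    from zipline import run_algorithm
    from zipline.api import order_target, symbol

    def initialize(context):
        context.asset = symbol('AAPL')

    def handle_data(context, data):
        order_target(context.asset, 100)

    start = pd.Timestamp('2014-01-02', tz='utc')
    end = pd.Timestamp('2018-01-02', tz='utc')

    # Dummy benchmark of all-zero daily returns: no network fetch is needed,
    # at the cost of meaningless alpha/beta in the output.
    zero_benchmark = pd.Series(0.0, index=pd.date_range(start, end, freq='B'))

    perf = run_algorithm(
        start=start,
        end=end,
        initialize=initialize,
        handle_data=handle_data,
        capital_base=100000,
        benchmark_returns=zero_benchmark,
        bundle='quantopian-quandl',
    )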

    Benchmark 
    opened by ssanderson 26
  • SyntaxError: future feature annotations is not defined

    SyntaxError: future feature annotations is not defined

    Dear Zipline Maintainers,

    Before I tell you about my issue, let me describe my environment:

    Environment

    • Operating System: (macOS 12.6 Monterey)
    • Python Version: 3.6.9
    • Python Bitness: 64
    • How did you install Zipline: (conda)
    • Python packages: conda list:

    packages in environment at /Users/avant/opt/anaconda3/envs/env_zipline:

    Name Version Build Channel

    alembic 1.8.1 pyhd8ed1ab_0 conda-forge appnope 0.1.3 pypi_0 pypi backcall 0.2.0 pypi_0 pypi bcolz 1.2.1 py36h4f17bb1_1001 conda-forge beautifulsoup4 4.11.1 pypi_0 pypi blosc 1.21.1 hd0a9f43_0 conda-forge bokeh 2.3.3 py36h79c6626_0 conda-forge bottleneck 1.3.2 py36h5f094cf_4 conda-forge brotlipy 0.7.0 py36hfa26744_1001 conda-forge bzip2 1.0.8 h0d85af4_4 conda-forge ca-certificates 2022.10.11 hecd8cb5_0
    certifi 2021.5.30 py36hecd8cb5_0
    cffi 1.14.0 py36hb5b8e2f_0
    charset-normalizer 2.1.1 pyhd8ed1ab_0 conda-forge click 7.1.2 pyh9f0ad1d_0 conda-forge cloudpickle 2.2.0 pyhd8ed1ab_0 conda-forge contextvars 2.4 py_0 conda-forge cryptography 35.0.0 py36ha6a00b0_0 conda-forge cycler 0.11.0 pypi_0 pypi cytoolz 0.11.0 py36hfa26744_3 conda-forge dask 2.10.1 py_0 conda-forge dask-core 2.10.1 py_0 conda-forge dataclasses 0.8 pypi_0 pypi decorator 5.1.1 pyhd8ed1ab_0 conda-forge distributed 2.30.1 py36h79c6626_0 conda-forge empyrical 0.5.5 pyh9f0ad1d_0 conda-forge freetype 2.10.4 h4cff582_1 conda-forge fsspec 2022.11.0 pyhd8ed1ab_0 conda-forge greenlet 1.1.2 py36hefe7e0e_0 conda-forge h5py 2.10.0 nompi_py36h106b333_102 conda-forge hdf5 1.10.5 nompi_h500d6d3_1114 conda-forge heapdict 1.0.1 py_0 conda-forge idna 3.4 pyhd8ed1ab_0 conda-forge immutables 0.16 py36hfa26744_0 conda-forge importlib-metadata 4.8.1 py36h79c6626_0 conda-forge importlib_resources 5.10.0 pyhd8ed1ab_0 conda-forge intervaltree 3.0.2 py_0 conda-forge ipython 7.16.3 pypi_0 pypi ipython-genutils 0.2.0 pypi_0 pypi iso3166 2.1.1 pyhd8ed1ab_0 conda-forge iso4217 1.9.20220401 pyhd8ed1ab_0 conda-forge jedi 0.17.2 pypi_0 pypi jinja2 3.0.3 pyhd8ed1ab_0 conda-forge joblib 1.1.1 pypi_0 pypi jpeg 9e hac89ed1_2 conda-forge kiwisolver 1.3.1 pypi_0 pypi lcms2 2.12 h577c468_0 conda-forge libblas 3.9.0 16_osx64_openblas conda-forge libcblas 3.9.0 16_osx64_openblas conda-forge libcxx 14.0.6 h9765a3e_0
    libedit 3.1.20210910 hca72f7f_0
    libffi 3.2.1 h0a44026_1007
    libgfortran 5.0.0 9_5_0_h97931a8_26 conda-forge libgfortran5 11.3.0 h082f757_26 conda-forge liblapack 3.9.0 16_osx64_openblas conda-forge libopenblas 0.3.21 openmp_h429af6e_3 conda-forge libpng 1.6.37 h7cec526_2 conda-forge libtiff 4.2.0 h1167814_3 conda-forge libwebp-base 1.2.4 h775f41a_0 conda-forge llvm-openmp 15.0.4 h61d9ccf_0 conda-forge locket 1.0.0 pyhd8ed1ab_0 conda-forge logbook 1.5.3 py36h20b66c6_4 conda-forge lru-dict 1.1.7 py36hfa26744_0 conda-forge lxml 4.9.1 pypi_0 pypi lz4-c 1.9.3 he49afe7_1 conda-forge mako 1.2.3 pyhd8ed1ab_0 conda-forge markupsafe 2.0.1 py36hfa26744_0 conda-forge matplotlib 3.3.4 pypi_0 pypi mock 4.0.3 py36h79c6626_1 conda-forge msgpack-python 1.0.2 py36hc61eee1_1 conda-forge multipledispatch 0.6.0 py_0 conda-forge ncurses 6.3 hca72f7f_3
    networkx 1.11 py36_0 conda-forge numexpr 2.7.3 py36he43235d_0 conda-forge numpy 1.19.5 py36h08b5fde_2 conda-forge olefile 0.46 pyh9f0ad1d_1 conda-forge openjpeg 2.4.0 h6e7aa92_1 conda-forge openssl 1.1.1s hca72f7f_0
    packaging 21.3 pyhd8ed1ab_0 conda-forge pandas 0.22.0 pypi_0 pypi pandas-datareader 0.6.0 py36_0 conda-forge parso 0.7.1 pypi_0 pypi partd 1.2.0 pyhd8ed1ab_0 conda-forge patsy 0.5.3 pyhd8ed1ab_0 conda-forge pexpect 4.8.0 pypi_0 pypi pickleshare 0.7.5 pypi_0 pypi pillow 8.2.0 py36h154fef6_1 conda-forge pip 21.2.2 py36hecd8cb5_0
    prompt-toolkit 3.0.32 pypi_0 pypi psutil 5.8.0 py36hfa26744_1 conda-forge ptyprocess 0.7.0 pypi_0 pypi pycparser 2.21 pyhd8ed1ab_0 conda-forge pyfolio 0.9.2 pypi_0 pypi pygments 2.13.0 pypi_0 pypi pyopenssl 22.0.0 pyhd8ed1ab_1 conda-forge pyparsing 3.0.9 pyhd8ed1ab_0 conda-forge pysocks 1.7.1 py36h79c6626_3 conda-forge pytables 3.6.1 py36h6f8395a_1 conda-forge python 3.6.9 h359304d_0
    python-dateutil 2.8.2 pyhd8ed1ab_0 conda-forge python-interface 1.6.0 py_0 conda-forge python_abi 3.6 2_cp36m conda-forge pytz 2022.6 pyhd8ed1ab_0 conda-forge pyyaml 5.4.1 py36hfa26744_1 conda-forge readline 7.0 h1de35cc_5
    requests 2.28.1 pyhd8ed1ab_0 conda-forge requests-file 1.5.1 pyh9f0ad1d_0 conda-forge requests-ftp 0.3.1 py_1 conda-forge scikit-learn 0.24.2 pypi_0 pypi scipy 1.5.3 py36h4f136de_1 conda-forge seaborn 0.10.1 pypi_0 pypi setuptools 58.0.4 py36hecd8cb5_0
    six 1.16.0 pyh6c4a22f_0 conda-forge snappy 1.1.9 h225ccf5_2 conda-forge sortedcontainers 2.4.0 pyhd8ed1ab_0 conda-forge soupsieve 2.3.2.post1 pypi_0 pypi sqlalchemy 1.4.25 py36hfa26744_0 conda-forge sqlite 3.33.0 hffcf06c_0
    statsmodels 0.11.1 py36h37b9a7d_2 conda-forge tblib 1.7.0 pyhd8ed1ab_0 conda-forge threadpoolctl 3.1.0 pypi_0 pypi tk 8.6.12 h5d9f67b_0
    toolz 0.12.0 pyhd8ed1ab_0 conda-forge torch 1.10.2 pypi_0 pypi tornado 6.1 py36hfa26744_1 conda-forge trading-calendars 2.1.1 pyhd3deb0d_0 conda-forge traitlets 4.3.3 pypi_0 pypi typing-extensions 4.1.1 hd8ed1ab_0 conda-forge typing_extensions 4.1.1 pyha770c72_0 conda-forge urllib3 1.26.11 pyhd8ed1ab_0 conda-forge wcwidth 0.2.5 pypi_0 pypi wheel 0.37.1 pyhd3eb1b0_0
    wrapt 1.14.1 pypi_0 pypi xz 5.2.6 hca72f7f_0
    yahoofinancials 1.6 pypi_0 pypi yaml 0.2.5 h0d85af4_2 conda-forge zict 2.0.0 py_0 conda-forge zipline 1.4.1 py36haf1e3a3_0 conda-forge zipp 3.6.0 pyhd8ed1ab_0 conda-forge zlib 1.2.13 h4dc903c_0
    zstd 1.5.0 h582d3a0_0 conda-forge

    Now that you know a little about me, let me tell you about the issue I am having:

    I am trying to set up the Zipline environment; it's been incredibly time-consuming and stressful, as I am a new programmer. Now when I try to ingest or run a program I get the following syntax error:

    (env_zipline) [email protected] VSC % /Users/avant/opt/anaconda3/envs/env_zipline/bin/python /Users/avant/Desktop/VSC/agcap/ma.py
    Traceback (most recent call last):
      File "/Users/avant/Desktop/VSC/agcap/ma.py", line 1, in <module>
        from zipline.api import order_target, record, symbol
      File "/Users/avant/opt/anaconda3/envs/env_zipline/lib/python3.6/site-packages/zipline/__init__.py", line 29, in <module>
        from .utils.run_algo import run_algorithm
      File "/Users/avant/opt/anaconda3/envs/env_zipline/lib/python3.6/site-packages/zipline/utils/run_algo.py", line 20, in <module>
        from zipline.data import bundles
      File "/Users/avant/opt/anaconda3/envs/env_zipline/lib/python3.6/site-packages/zipline/data/bundles/__init__.py", line 2, in <module>
        from . import quandl  # noqa
      File "/Users/avant/opt/anaconda3/envs/env_zipline/lib/python3.6/site-packages/zipline/data/bundles/quandl.py", line 16, in <module>
        from . import core as bundles
      File "/Users/avant/opt/anaconda3/envs/env_zipline/lib/python3.6/site-packages/zipline/data/bundles/core.py", line 20, in <module>
        from zipline.assets.asset_db_migrations import downgrade
      File "/Users/avant/opt/anaconda3/envs/env_zipline/lib/python3.6/site-packages/zipline/assets/asset_db_migrations.py", line 1, in <module>
        from alembic.migration import MigrationContext
      File "/Users/avant/opt/anaconda3/envs/env_zipline/lib/python3.6/site-packages/alembic/__init__.py", line 3, in <module>
        from . import context
      File "/Users/avant/opt/anaconda3/envs/env_zipline/lib/python3.6/site-packages/alembic/context.py", line 1, in <module>
        from .runtime.environment import EnvironmentContext
      File "/Users/avant/opt/anaconda3/envs/env_zipline/lib/python3.6/site-packages/alembic/runtime/environment.py", line 1
        from __future__ import annotations
        ^
    SyntaxError: future feature annotations is not defined

    Description of Issue

    • What did you expect to happen?
    • What happened instead?

    Here is how you can reproduce this issue on your machine:

    Reproduction Steps

    1. Run any program or try to ingest 2. 3. ...

    What steps have you taken to resolve this already?

    Many hours of trying different versions of Python/pandas and the rest of the libraries; many GitHub/Stack Overflow articles.

    ...

    Anything else?

    ...

    Sincerely, $ whoami

    I also attached a screenshot.

    opened by agcap1 0
  • CVE-2007-4559 Patch

    CVE-2007-4559 Patch

    Patching CVE-2007-4559

    Hi, we are security researchers from the Advanced Research Center at Trellix. We have begun a campaign to patch a widespread bug named CVE-2007-4559. CVE-2007-4559 is a 15-year-old bug in the Python tarfile package. By using extract() or extractall() on a tarfile object without sanitizing input, a maliciously crafted .tar file could perform a directory path traversal attack. We found at least one unsanitized extractall() in your codebase and are providing a patch for you via pull request. The patch essentially checks that all tarfile members will be extracted safely and throws an exception otherwise. We encourage you to use this patch or your own solution to secure against CVE-2007-4559. Further technical information about the vulnerability can be found in this blog.
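    As an illustration of the kind of check described above (a sketch, not the exact patch), each member's resolved path can be verified to stay inside the destination directory before extractall() is called:

    import os
    import tarfile

    def safe_extractall(tar_path, dest='.'):
        """Extract a tar archive, refusing members that would escape `dest`."""
        dest_root = os.path.realpath(dest)
        with tarfile.open(tar_path) as tar:
            for member in tar.getmembers():
                member_path = os.path.realpath(os.path.join(dest, member.name))
                if os.path.commonpath([dest_root, member_path]) != dest_root:
                    raise Exception("Attempted path traversal in tar file: %s" % member.name)
            tar.extractall(dest)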

    If you have further questions you may contact us through this project's lead researcher, Kasimir Schulz.

    opened by TrellixVulnTeam 0
  • Links broken

    Links broken

    Dear Zipline Maintainers, go to your https://pypi.org/project/zipline/#description page and try the links, specifically the 'Zipline Install Documentation' link. That link, as well as many others, goes to a useless advert for Zipline.

    not a confidence builder

    opened by DanManila 1
  • docs: Fix a few typos

    docs: Fix a few typos

    There are small typos in:

    • tests/test_assets.py
    • zipline/data/minute_bars.py
    • zipline/errors.py
    • zipline/finance/controls.py

    Fixes:

    • Should read constraint rather than contraint.
    • Should read slippage rather than slipage.
    • Should read identifier rather than identifer.
    • Should read considered rather than consdered.

    Semi-automated pull request generated by https://github.com/timgates42/meticulous/blob/master/docs/NOTE.md

    opened by timgates42 2
  • Quantopian/zipline is NO LONGER MAINTAINED - See other forks

    Quantopian/zipline is NO LONGER MAINTAINED - See other forks

    Any issues raised in this repository are unlikely to be resolved, since Quantopian has folded and the staff who moved to Robinhood have not been involved for a few years, so it's currently unlikely that anything will happen here.

    However, thanks to the foresight of the Quantopian founders and devs who made this open source, other contributors have taken the banner and have improved this package from where Quantopian left it.

    @stefan-jansen maintains a forked, updated version of this package here: https://github.com/stefan-jansen/zipline-reloaded

    opened by RichardDale 0
  • ~

    ~

    Dear Zipline Maintainers,

    Before I tell you about my issue, let me describe my environment:

    Environment

    • Operating System: (Windows Version or $ uname --all)
    • Python Version: $ python --version
    • Python Bitness: $ python -c 'import math, sys;print(int(math.log(sys.maxsize + 1, 2) + 1))'
    • How did you install Zipline: (pip, conda, or other (please explain))
    • Python packages: $ pip freeze or $ conda list

    Now that you know a little about me, let me tell you about the issue I am having:

    Description of Issue

    • What did you expect to happen?
    • What happened instead?

    Here is how you can reproduce this issue on your machine:

    Reproduction Steps

    ...

    What steps have you taken to resolve this already?

    ...

    Anything else?

    ...

    Sincerely, $ whoami

    opened by scubamut 0
Releases(1.4.1)
  • 1.4.1(Oct 5, 2020)

    This release includes a small number of bug fixes, documentation improvements, and build/dependency enhancements.

    Conda packages for zipline and its dependencies are now available for python 3.6 on the ‘conda-forge’ Anaconda channel. They’re also available on the ‘Quantopian’ channel, but we’ll stop updating those eventually.

  • 1.4.0(Jul 23, 2020)

  • v1.3.0(Jul 17, 2018)

    Release 1.3.0

    This release includes several enhancements and performance improvements along with a small number of bug fixes. We recommend that all users upgrade to this version.

    NOTE: This will likely be the last minor release in the Zipline 1.x series. The next release will be Zipline 2.0, which will include a number of small breaking changes required to support international equities.

    Highlights

    Support for Newer Numpy/Pandas Versions

    Zipline has historically been very conservative when updating versions of numpy, pandas, and other "PyData" ecosystem packages. This conservatism is primarily due to the fact that Zipline is used as the backtesting engine for Quantopian, which means that updating package versions risks breaking a large installed codebase. Of course, many Zipline users don't have the backwards compatibility requirements that Quantopian has, and they'd like to be able to use the latest and greatest package versions.

    As part of this release, we're now building and testing Zipline with two package configurations:

    • "Stable", using numpy version 1.11 and pandas version 0.18.1.
    • "Latest", using numpy version 1.14 and pandas version 0.22.0.

    Other combinations of numpy and pandas may work, but these package sets will be built and tested during our normal development cycle.

    Moving forward, our goal is to continue to maintain support for two sets of packages at any given time. The "stable" package set will change relatively infrequently, and will contain the versions of numpy and pandas supported on Quantopian. The "latest" package set will change regularly, and will contain recently-released versions of numpy and pandas.

    Our hope with these changes is to strike a balance between stability and novelty without taking on too great a maintenance burden by supporting every possible combination of packages.

    Standalone trading_calendars Module

    One of the most popular features of Zipline is its collection of trading calendars, which provide information about holidays and trading hours of various markets. As part of this release, Zipline's calendar-related functionality has been moved to a separate trading-calendars package, allowing users that only need access to the calendars to use them without taking on the rest of Zipline's dependencies.

    For backwards compatibility, Zipline will continue to re-export calendar-related functions. For example, zipline.get_calendar still exists, but is now an alias for trading_calendars.get_calendar. Users that depend on this functionality are encouraged to update their imports to the new locations in trading_calendars.
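    For example (a small sketch assuming the trading-calendars package is installed):

    # New home of the calendars: the standalone trading-calendars package.
    from trading_calendars import get_calendar

    nyse = get_calendar('NYSE')
    print(nyse.name, nyse.tz)

    # Zipline still re-exports the same function for backwards compatibility.
    from zipline import get_calendar as zipline_get_calendar
    assert zipline_get_calendar('NYSE').name == nyse.name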

    Custom Blotters

    This release adds experimental support for running Zipline with user-defined subclasses of zipline.finance.blotter.blotter.Blotter. The primary motivation for this change is to make it easier to run live algorithms from the Zipline CLI.

    There are two primary ways to configure a custom blotter:

    1. You can pass an instance of zipline.finance.blotter.blotter.Blotter as the blotter parameter to zipline.run_algorithm. (This functionality had existed previously, but wasn't well-documented.) A sketch of this approach follows below.
    2. You can register a named factory for a blotter in your extension.py and pass the name on the command line via the --blotter flag.
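    A hedged sketch of approach (1), passing a Blotter instance directly to run_algorithm (the parameter name follows the description in item 1 above):

    import pandas as pd
    from zipline import run_algorithm
    from zipline.finance.blotter import SimulationBlotter
    from zipline.finance.cancel_policy import EODCancel

    def initialize(context):
        pass

    perf = run_algorithm(
        start=pd.Timestamp('2016-01-04', tz='utc'),
        end=pd.Timestamp('2016-12-30', tz='utc'),
        initialize=initialize,
        capital_base=100000,
        # Approach (1): hand the blotter instance straight to run_algorithm.
        blotter=SimulationBlotter(cancel_policy=EODCancel()),
        bundle='quantopian-quandl',
    )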

    An example usage of (2) might look like this:

    from zipline.extensions import register
    from zipline.finance.blotter import Blotter, SimulationBlotter
    from zipline.finance.cancel_policy import EODCancel
    
    @register(Blotter, 'my-blotter')
    def my_blotter():
        """Create a SimulationBlotter with a non-default cancel policy.
        """
        return SimulationBlotter(cancel_policy=EODCancel())
    

    To use this factory when running zipline from the command line, we would invoke zipline like this:

    $ zipline run --blotter my-blotter <...other-args...>
    

    As part of this change, the zipline.finance.blotter.blotter.Blotter class has been converted to an abstract base class. The default blotter used in simulations is now named zipline.finance.blotter.SimulationBlotter.

    Custom Command-Line Arguments

    This release adds support for passing custom arguments to the zipline command-line interface. Custom command-line arguments are passed via the -x flag followed by a key=value pair. Arguments passed this way can be accessed from Python code (e.g., an algorithm or an extension) via attributes of zipline.extension_args. For example, if zipline is invoked like this:

    $ zipline -x argle=bargle run ...
    

    then the result of zipline.extension_args.argle would be the string "bargle".

    Custom arguments can be grouped into namespaces by including . characters in keys. For example, if zipline is invoked like this:

    $ zipline -x argle.bargle=foo
    

    then zipline.extension_args.argle will contain an object with a bargle attribute containing the string "foo". Keys can contain multiple dots to create nested namespaces.
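    Inside an algorithm or extension these values are then read as plain attributes; a small sketch (assuming zipline was invoked with -x argle=bargle as above):

    import zipline
    from zipline.api import record

    def initialize(context):
        # Value supplied on the command line via `zipline -x argle=bargle run ...`
        context.argle = zipline.extension_args.argle  # -> "bargle"

    def handle_data(context, data):
        record(argle_length=len(context.argle))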

    Enhancements

    • Added support for pandas 0.22 and numpy 1.14. See above for details.
    • Moved zipline.utils.calendars into a separately-installable trading-calendars package.
    • Added support for specifying custom string arguments with the -x flag. See above for details.

    Experimental Features

    • Added support for registering a custom subclass of zipline.finance.blotter.Blotter. See above for details.

    Bug Fixes

    • Fixed a bug in zipline.pipeline.Factor.winsorize where NaN values were incorrectly included in value counts when determining cutoff thresholds for winsorization.

    • Fixed a crash in zipline.pipeline.Factor.top with a count of 1 and no groupby.

    • Fixed a bug where calling data.history with a negative lookback would fetch prices from the future.

    • Fixed a bug where StopOrder, LimitOrder, and StopLimitOrder prices were being rounded to the nearest penny regardless of asset tick size. Prices are now rounded based on the tick_size attribute of the asset being ordered.

    Performance

    • Improved performance when fetching minutely prices for assets that trade regularly.
    • Improved performance when fetching minutely prices for many assets by tuning cache sizes.

    Maintenance and Refactorings

    • Refactored large parts of the Zipline test suite to make it easier to change the signature of TradingAlgorithm.

    Build

    • Added support for running travis builds with pandas 0.18 and 0.22.
    • Added OSX builds to the travis build matrix.
  • 1.0.2(Sep 7, 2016)

Owner
Quantopian, Inc.
Quantopian builds software tools and libraries for quantitative finance.