Releases#
Release 2.0.0rc#
- Release:
2.0.0rc1
- Date:
April 5, 2021
Highlights#
This release updates Zipline to be compatible with Python >= 3.7 as well as the current versions of relevant PyData libraries like Pandas, scikit-learn, and others.
Conda packages for Zipline and key dependencies bcolz and TA-Lib are now available for Python 3.7-3.9 on the ‘ml4t’ Anaconda channel. Binary wheels are available on PyPI for Linux (Python 3.7-3.9) and macOS (3.7 and 3.8).
As part of the update, the BlazeLoader functionality was removed. It was built on the Blaze ecosystem. Unfortunately, the three relevant projects (Blaze, Odo, and datashape) have received very limited support over the last several years.
Other updates include:
A new release of Bcolz, which has been marked unmaintained by its author since September 2020. The new release updates the underlying c-blosc library from version 1.14 to the latest 1.21.0. There are also conda packages for Bcolz (see links above).
Zipline now uses the better-performing NetworkX version 2.0.
Conda packages for TA-Lib 0.4.19.
This new release also makes it easier to load custom data sources into a Pipeline (such as the predictions of an ML model) when backtesting. See the relevant examples in the GitHub repo of the book Machine Learning for Trading, such as these.
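As a rough sketch of what this enables (the dataset, column, and prediction frame below are hypothetical, and the exact custom_loader wiring is assumed to follow the linked examples):

import pandas as pd
from zipline.pipeline.data import Column, DataSet
from zipline.pipeline.loaders.frame import DataFrameLoader

# Hypothetical dataset holding one column of model predictions.
class SignalData(DataSet):
    predictions = Column(float)

# Assumed to be a DataFrame indexed by date with one column per sid,
# e.g. the output of an ML model.
predictions_frame = pd.DataFrame()

# Map the Pipeline column to a loader; a mapping like this can then be passed
# to run_algorithm(..., custom_loader=signal_loader) so that a Pipeline using
# SignalData.predictions.latest can be computed during the backtest.
signal_loader = {
    SignalData.predictions: DataFrameLoader(SignalData.predictions, predictions_frame),
}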
Enhancements#
custom_loader() for custom Pipeline data
Compatibility with the latest versions of Pandas, scikit-learn, and other relevant PyData libraries.
Bug Fixes#
Numerous test updates to accommodate recent Python and dependency versions.
Performance#
Latest blosc library may improve compression and I/O performance
Maintenance and Refactorings#
Removed Python 2 support
Build#
All builds consolidated on GitHub Actions CI
Documentation#
Expanded with additional information on Pipeline and related DataLoaders
Release 1.4.1#
- Release:
1.4.1
- Date:
October 5, 2020
This release includes a small number of bug fixes, documentation improvements, and build/dependency enhancements.
Conda packages for Zipline and its dependencies are now available for Python 3.6 on the ‘conda-forge’ Anaconda channel. They’re also available on the ‘Quantopian’ channel, but we’ll stop updating those eventually.
Bug Fixes#
Fix for calling run_algorithm without benchmark_returns (2762)
Maintenance and Refactorings#
Build#
Documentation#
Release 1.4.0#
- Release:
1.4.0
- Date:
July 22, 2020
Highlights#
Removed Implicit Dependency on Benchmarks and Treasury Returns#
Previously, Zipline implicitly fetched these required inputs from third party API sources if they were not provided by users: treasury data from the US Federal Reserve’s API, and benchmarks from IEX. This meant that simulations required an internet connection and stable APIs for these data sources, neither of which were guaranteed for many users.
We removed the dependency on treasury curves, since they weren’t actually being used anymore. And we replaced the implicit downloading of benchmark returns with explicit options:
--benchmark-file The csv file that contains the benchmark
returns (date, returns columns)
--benchmark-symbol The instrument's symbol to be used as
a benchmark.
(should exist in the ingested bundle)
--benchmark-sid The sid of the instrument to be used as a
benchmark.
(should exist in the ingested bundle)
--no-benchmark This flag is used to set the benchmark to
zero. Alpha, beta and benchmark metrics
are not calculated
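For example, a run that disables benchmark-relative metrics entirely might look like this (the algorithm file name is just a placeholder):

$ zipline run -f my_algo.py --start 2016-1-1 --end 2018-1-1 --no-benchmark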
New Built In Factors#
PercentChange: Calculates the percent change over the given window_length. Note: Percent change is calculated as (new - old) / abs(old). (2506)
PeerCount: Gives the number of occurrences of each distinct category in a classifier. (2509)
ConstantMixin: A mixin for creating a Pipeline term with a constant value. (2697)
if_else(): Allows users to create expressions that conditionally draw from the outputs of one of two terms. (2697)
fillna(): Allows users to fill missing data with either a constant value or values from another term. (2697)
clip(): Allows users to constrain a factor’s values to a given range. (2708)
mean(), stddev(), max(), min(), median(), sum(), notnull_count(): Summarize data across the entire domain into a scalar factor. (2697)
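As an illustrative sketch (column names are arbitrary), several of these additions can be combined in a single Pipeline:

from zipline.pipeline import Pipeline
from zipline.pipeline.data import EquityPricing
from zipline.pipeline.factors import PercentChange

# Five-day percent change in close price, with missing values filled
# and extreme values clipped to +/-20%.
change = PercentChange(inputs=[EquityPricing.close], window_length=5)
clipped = change.fillna(0.0).clip(min_bound=-0.2, max_bound=0.2)

pipe = Pipeline(
    columns={
        'pct_change': clipped,
        # One of the new daily summary methods: the mean across the whole domain.
        'domain_mean': clipped.mean(),
    },
)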
Enhancements#
Added International Pipelines (2262)
Added DataSetFamily (née MultiDimensionalDataSet) - a shorthand for creating a collection of regular DataSets that share the same columns. (2402)
Added get_column() for looking up columns by name (2210)
Added CheckWindowsClassifier that allows us to test lookback windows of categorical and string columns using Pipeline. (2458)
Added PipelineHooks which is now used to display Pipeline progress bars (2467)
BoundColumn comparisons will now result in an error. This prevents writing EquityPricing.volume > 1000 (silently returning bad data) instead of EquityPricing.volume.latest > 1000. (2537)
Added currency conversion support to Pipeline. (2586)
Added --benchmark-file and --benchmark-symbol command line arguments to make it easier to provide benchmark data. (2642)
Added support for Python 3.6 (2643)
Added mask argument to Factor.peer_count. (2676)
Added if_else() and fillna() for allowing conditional logic in Pipelines. (2691)
Added daily summary methods to Factor for collecting summary statistics for the entire universe. (2697)
Added clip() method for clipping values to a range. (2708)
Added support for Pipeline term arithmetic with more than 32 terms. (2727)
Bug Fixes#
Fixed support for non unique sid->exchange mappings. (2289)
Fixed crash on dividend warning. (2323)
Fixed week_start when Monday precedes the New Year. (2394)
Ensured correct dtypes when unpacking empty dataframes. (2444)
Fixed a bug where a Pipeline term with window_length=0 would not copy the input before calling compute(), which could cause incorrect results if the input was reused in the Pipeline. (2723)
Performance#
Added HDF5DailyBarWriter, which writes daily pricing in a new format as an HDF5 file. Each OHLCV field is stored as a 2D array in a chunked HDF5 dataset, with a row per sid and a column per day. The file also supports multiple countries. Added HDF5DailyBarReader, which implements the BarReader interface and can read files written by HDF5DailyBarWriter. (2295)
Vectorized dividend ratio calculation (2298)
Improved performance of the RollingPearson and RollingPearsonOfReturns pipeline factors. (2071)
Maintenance and Refactorings#
Miscellaneous#
International Pipelines#
Pipeline now supports international data.
Pipeline is a tool that allows you to define computations over a universe of assets and a period of time. In the past, you could only run pipelines on the US equity market. Now, you can specify a domain over which a pipeline should be computed. The name “domain” refers to the mathematical concept of the “domain of a function”, which is the set of potential inputs to a function. In the context of Pipeline, the domain specifies the set of assets and a corresponding trading calendar over which the expressions of a pipeline should be computed.
For example, the following pipeline returns the latest close price and volume for all Canadian equities, every day.
pipe = Pipeline(
    columns={
        'price': EquityPricing.close.latest,
        'volume': EquityPricing.volume.latest,
        'mcap': factset.Fundamentals.mkt_val.latest,
    },
    domain=CA_EQUITIES,
)
Another challenge related to currencies is the fact that some exchanges don’t require stocks to be listed in local currency. For example, the London Stock Exchange only has about 75% of its listings denominated in GBP. The other 25% are primarily listed in EUR or USD. This can make it hard to make cross sectional comparisons.
To solve this problem, most people rely on currency conversions to bring price-based fields into the same currency.
Pipeline columns now support an fx
method for specifying what currency the data should be viewed as.
This method is only available on terms which are “currency-aware”, for example open or close, but not on terms that do not care about currency like volume.
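A brief sketch of what this might look like on a domain with mixed listing currencies (assuming fx takes an ISO currency code):

from zipline.pipeline import Pipeline
from zipline.pipeline.data import EquityPricing
from zipline.pipeline.domain import GB_EQUITIES

pipe = Pipeline(
    columns={
        # Close price as listed (mixed currencies on the LSE).
        'close_local': EquityPricing.close.latest,
        # Close price viewed in GBP, making cross-sectional comparisons sensible.
        'close_gbp': EquityPricing.close.fx('GBP').latest,
    },
    domain=GB_EQUITIES,
)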
Currently, there is no way to load international data into a bundle. We are working on ways to make it easy to get international data into Zipline.
The domains that Zipline currently supports for running pipelines (using the latest trading-calendars package) are the following:
Argentina
Australia
Austria
Belgium
Brazil
Canada
Chile
China
Czech Republic
Colombia
Czechia
Finland
France
Germany
Greece
Hong Kong
Hungary
India
Indonesia
Ireland
Italy
Japan
Malaysia
Mexico
Netherlands
New Zealand
Norway
Pakistan
Peru
Philippines
Poland
Portugal
Russia
Singapore
Spain
Sweden
Taiwan
Thailand
Turkey
United Kingdom
United States
South Africa
South Korea
Switzerland
DataSetFamily#
Dataset families are used to represent data where the unique identifier for a row requires more than just asset and date coordinates. A DataSetFamily can also be thought of as a collection of DataSet objects, each of which has the same columns, domain, and ndim.
DataSetFamily objects are defined with one or more Column objects, plus one additional field: extra_dims.
The extra_dims field defines coordinates other than asset and date that must be fixed to produce a logical timeseries. The column objects determine columns that will be shared by slices of the family.
extra_dims are represented as an ordered dictionary where the keys are the dimension name, and the values are a set of unique values along that dimension.
To work with a DataSetFamily in a pipeline expression, one must choose a specific value for each of the extra dimensions using the slice() method. For example, given a DataSetFamily:

class SomeDataSet(DataSetFamily):
    extra_dims = [
        ('dimension_0', {'a', 'b', 'c'}),
        ('dimension_1', {'d', 'e', 'f'}),
    ]

    column_0 = Column(float)
    column_1 = Column(bool)

This dataset might represent a table with the following columns:

sid :: int64
asof_date :: datetime64[ns]
timestamp :: datetime64[ns]
dimension_0 :: str
dimension_1 :: str
column_0 :: float64
column_1 :: bool

Here we see the implicit sid, asof_date, and timestamp columns as well as the extra dimensions columns.
This DataSetFamily can be converted to a regular DataSet with:

DataSetSlice = SomeDataSet.slice(dimension_0='a', dimension_1='e')

This sliced dataset represents the rows from the higher dimensional dataset where (dimension_0 == 'a') & (dimension_1 == 'e').
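The resulting slice behaves like any other DataSet, so (continuing the SomeDataSet example above) its columns can be used directly in a pipeline:

pipe = Pipeline(
    columns={
        'col_0': DataSetSlice.column_0.latest,
        'col_1': DataSetSlice.column_1.latest,
    },
)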
Release 1.3.0#
- Release:
1.3.0
- Date:
July 16, 2018
This release includes several enhancements and performance improvements along with a small number of bug fixes. We recommend that all users upgrade to this version.
Note
This will likely be the last minor release in the Zipline 1.x series. The next release will be Zipline 2.0, which will include a number of small breaking changes required to support international equities.
Highlights#
Support for Newer Numpy/Pandas Versions#
Zipline has historically been very conservative when updating versions of numpy, pandas, and other “PyData” ecosystem packages. This conservatism is primarily due to the fact that Zipline is used as the backtesting engine for Quantopian, which means that updating package versions risks breaking a large installed codebase. Of course, many Zipline users don’t have the backwards compatibility requirements that Quantopian has, and they’d like to be able to use the latest and greatest package versions.
As part of this release, we’re now building and testing Zipline with two package configurations:
“Stable”, using numpy version 1.11 and pandas version 0.18.1.
“Latest”, using numpy version 1.14 and pandas version 0.22.0.
Other combinations of numpy and pandas may work, but these package sets will be built and tested during our normal development cycle.
Moving forward, our goal is to continue to maintain support for two sets of packages at any given time. The “stable” package set will change relatively infrequently, and will contain the versions of numpy and pandas supported on Quantopian. The “latest” package set will change regularly, and will contain recently-released versions of numpy and pandas.
Our hope with these changes is to strike a balance between stability and novelty without taking on too great a maintenance burden by supporting every possible combination of packages. (2194)
Standalone trading_calendars Module#
One of the most popular features of Zipline is its collection of trading calendars, which provide information about holidays and trading hours of various markets. As part of this release, Zipline’s calendar-related functionality has been moved to a separate trading-calendars package, allowing users that only needed access to the calendars to use them without taking on the rest of Zipline’s dependencies.
For backwards compatibility, Zipline will continue to re-export calendar-related
functions. For example, zipline.get_calendar()
still exists, but is now
an alias for trading_calendars.get_calendar
. Users that depend on this
functionality are encouraged to update their imports to the new locations in
trading_calendars
. (2219)
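For example, both of the following should resolve to the same calendar (XNYS is the NYSE calendar code in trading-calendars; the old name remains available as an alias):

from trading_calendars import get_calendar

nyse = get_calendar('XNYS')   # new, preferred import location

# The re-exported alias still works for existing code:
from zipline import get_calendar as zipline_get_calendar
same_calendar = zipline_get_calendar('NYSE')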
Custom Blotters#
This release adds experimental support for running Zipline with user-defined
subclasses of Blotter
. The primary
motivation for this change is to make it easier to run live algorithms from the
Zipline CLI.
There are two primary ways to configure a custom blotter:
You can pass an instance of Blotter as the blotter parameter to zipline.run_algorithm(). (This functionality had existed previously, but wasn’t well-documented.)
You can register a named factory for a blotter in your extension.py and pass the name on the command line via the --blotter flag.
An example usage of (2) might look like this:
from zipline.extensions import register
from zipline.finance.blotter import Blotter, SimulationBlotter
from zipline.finance.cancel_policy import EODCancel
@register(Blotter, 'my-blotter')
def my_blotter():
    """Create a SimulationBlotter with a non-default cancel policy."""
    return SimulationBlotter(cancel_policy=EODCancel())
To use this factory when running zipline from the command line, we would invoke zipline like this:
$ zipline run --blotter my-blotter <...other-args...>
As part of this change, the Blotter
class has been converted to an abstract base class. The default blotter used in
simulations is now named zipline.finance.blotter.SimulationBlotter
.
Custom Command-Line Arguments#
This release adds support for passing custom arguments to the zipline
command-line interface. Custom command-line arguments are passed via the -x
flag followed by a key=value
pair. Arguments passed this way can be
accessed from Python code (e.g., an algorithm or an extension) via attributes
of zipline.extension_args
. For example, if zipline is invoked like this:
$ zipline -x argle=bargle run ...
then the result of zipline.extension_args.argle
would be the string
"bargle"
.
Custom arguments can be grouped into namespaces by including .
characters
in keys. For example, if zipline is invoked like this:
$ zipline -x argle.bargle=foo
then zipline.extension_args.argle
will contain an object with a bargle
attribute containing the string "foo"
. Keys can contain multiple dots to
create nested namespaces. (2210)
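As a small sketch, an extension.py (or algorithm) could read these values like so, reusing the argle example above:

import zipline

# With `zipline -x argle=bargle run ...` this evaluates to "bargle".
# Namespaced keys such as argle.bargle appear as nested attributes.
custom_value = getattr(zipline.extension_args, 'argle', None)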
Enhancements#
Added support for pandas 0.22 and numpy 1.14. See above for details. (2194)
Moved zipline.utils.calendars into a separately-installable trading-calendars package. (2219)
Added support for specifying custom string arguments with the -x flag. See above for details. (2210)
Experimental Features#
Warning
Experimental features are subject to change.
Bug Fixes#
Fixed a bug in zipline.pipeline.Factor.winsorize() where NaN values were incorrectly included in value counts when determining cutoff thresholds for winsorization. (2138)
Fixed a crash in zipline.pipeline.Factor.top() with a count of 1 and no groupby. (2218)
Fixed a bug where calling data.history with a negative lookback would fetch prices from the future. (2164)
Fixed a bug where StopOrder, zipline.finance.execution.LimitOrder, and zipline.finance.execution.StopLimitOrder prices were being rounded to the nearest penny regardless of asset tick size. Prices are now rounded based on the tick_size attribute of the asset being ordered. (2211)
Performance#
Maintenance and Refactorings#
Build#
Release 1.2.0#
- Release:
1.2.0
- Date:
April 4, 2018
Highlights#
Extensible Risk and Performance Metrics (2081)#
The risk and performance metrics are summary values calculated by Zipline when running a simulation, for example returns or the Sharpe ratio. This release introduces a new API for registering custom risk and performance metrics defined by the user. We have also made it possible to run a backtest without computing any metrics to improve the feedback cycle when debugging an algorithm.
For more information, see Metrics.
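A hedged sketch of the registration API (assuming the register decorator in zipline.finance.metrics, with a metrics set that computes nothing in order to shorten debugging runs):

from zipline.finance import metrics

@metrics.register('fast-debug')
def fast_debug_metrics():
    # An empty set of metrics: the simulation runs without computing any
    # risk or performance metrics, which speeds up the feedback loop.
    return set()

# Select it with `zipline run --metrics-set fast-debug ...` or
# run_algorithm(..., metrics_set='fast-debug').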
Docs, Trading Calendars, and Benchmarks#
Zipline now defaults to using the quandl bundle, which requires an API key; you can find more information in the Data Bundles documentation.
We’ve added many tutorial and documentation updates, including information on how to create your own TradingCalendar, how to pass it to your algorithm via the Zipline CLI, and how to use custom CSV data with the csvdir bundle.
Zipline is no longer being tested and packaged for Python 3.4.
Zipline now requests data for SPY, the default benchmark used for Zipline backtests, using the IEX Trading API, and no longer uses pandas-datareader
. You can run a backtest up to 5 years from the current day using this data.
Enhancements#
Grow minute file cache to 1550 by default (1906)
Change default commission to .001 (1946)
Enable the ability to compute multiple pipelines (1974)
Allow users to switch between calendars (1800)
New filter NoMissingValues (1969)
Fail better on AssetFinder(nonexistent_path) (2000)
Implement csvdir bundle (1860)
Update quandl_bundle to use Quandl API v3 (1990)
Add FixedBasisPointsSlippage slippage model (2047)
Create MinLeverage control (2064)
Experimental Features#
Warning
Experimental features are subject to change.
None
Bug Fixes#
history calls with a frequency of 1d now work when using a Panel as the minute data source. (1920)
Check contract exists when using futures daily bar reader (1892)
NoDataBeforeDate edge cases (1894)
Fix frame column validation in Python 2.7.5 (1954)
Fix daily history for minute panel data backtest (1920)
get_last_traded_dt expects a trading day (2087)
Daily Adjustment perspective fix (2089)
Performance#
Maintenance and Refactorings#
Add CachedObject.expired() (1881)
Set RollingLinearRegressionOfReturns factor to be window_safe (1902)
Set RSI factor to be window_safe (1904)
Updates for better docs generation (1890)
Remove and zero out unused treasury curves (1910)
Networkx 2 changes the behavior of out_degree (1996)
Pass calendars to DataPortal (2026)
Remove old Yahoo code (2032)
Sync and fill benchmarks through latest trading day (2044)
Provides better error message when QUANDL_API_KEY is missing (2078)
Improve the error message for misaligned dates in Pipeline engine (2131)
Build#
Documentation#
Include MACDSignal in zipline.io documentation (1828)
Remove mentions of Yahoo from the Beginner Tutorial (1845)
Add contributing & questions section to the README (1889)
Add info about using a conda envs for installs (1922)
Fix Beginner Tutorial link (1932)
Add clean docs (1943)
Add distinct warnings for benchmark and treasury fetchers (1971)
Add CONTRIBUTING.rst (2033)
Add tutorial on creating a custom TradingCalendar (2035)
Docs & tutorial updates for ingesting, beginners, and csvdir (2073)
Documented the new risk and performance metrics API (2081).
Fixed a typo in the description of --bundle-timestamp (2123)
Miscellaneous#
None
Release 1.1.1#
- Release:
1.1.1
- Date:
July 5, 2017
Highlights#
Zipline now has broad support for futures, in addition to equities. It’s also being tested and packaged for Python 3.5.
We also saw breaking changes occur from Yahoo changing their API endpoint, thus preventing users from downloading benchmark data needed for backtests. Since that change, we have swapped out Yahoo-related benchmarking code with references to Google Finance and have removed all deprecated Yahoo code, including the usage of custom Yahoo bundles.
Enhancements#
Adds a property for BarData to know about current session’s minutes (1713)
Adds a better error message for non-existent root symbols (1715)
Adds StaticSids Pipeline Filter (1717)
Allows zipline.data.data_portal.DataPortal.get_spot_value to accept multiple assets (1719)
Adds ContinuousFuture to lookup_generic (1718)
Adds CFE Adhoc Holidays to exchange_calendar_cfe (1698)
Allows overriding of order amount rounding (1722)
Makes continuous future adjustment style an argument (1726)
Adds preliminary support for Futures slippage and commission models (1738)
Fix a bug in cost basis calculation and change all mentions of sid to asset (1757)
Add slippage and commission models for futures (1748)
Use Python 3.5 in our Dockerfile (1806)
Allow pipelines to be run in chunks (1811)
Adds get_range to BenchmarkSource (1815)
Adds support for relabeling classifiers in Pipeline (1833)
Experimental Features#
Warning
Experimental features are subject to change.
None
Bug Fixes#
Fixes a floating point division issue in zipline.data.minute_bars by using integer division instead (1683)
Sorts data in zipline.pipeline.loaders.blaze.core on asof_date to resolve timestamp conflicts (1710)
Swapped out Yahoo for Google Finance benchmark data (1812)
Gold and silver futures contracts only traded during certain months (1779)
Fixes bug in TradingCalendar initialization when we use tzaware datetimes (1802)
Fixes precision issues on futures prices when rounding (1788)
Performance#
Avoid repeated recursive calls when getting forward-filled close price (1735)
Maintenance and Refactorings#
Adds linter recommendations to adjustments module (1712)
Clears up naming and logic in resample close (1728)
Use March quarterly cycle for several continuous futures (1762)
Use better repr for Transaction objects (1746)
Shorten repr for Asset objects (1786)
Removes usage of empyrical’s information ratio (1854)
Build#
Documentation#
Miscellaneous#
Use csv market data with run_algorithm so we don’t try to download data for tests (1793)
Updates Dockerfile to use Python 3.5
Release 1.1.0#
- Release:
1.1.0
- Date:
March 10, 2017
This release is meant to provide zipline support for pandas 0.18, as well as several bug fixes, API changes, and many performance changes.
Enhancements#
Makes the minute bar read catch NoDataOnDate exceptions if dates are not in the calendar. Before, the minute bar reader was forward filling, but now it returns nan for OHLC and 0 for V. (1488)
Adds truncate method to BcolzMinuteBarWriter (1499)
Bumps up to pandas 0.18.1 and numpy 1.11.1 (1339)
Adds an earnings estimates quarter loader for Pipeline (1396)
Creates a restricted list manager that takes in information about restricted sids and stores in memory upon instantiation (1487)
Adds last_available{session, minute} args to DataPortal (1528)
Adds SpecificAssets filter (1530)
Adds the ability for an algorithm to request the current contract for a future chain (1529)
Adds chain field to current and supporting methods in DataPortal and OrderedContracts (1538)
Adds history for continuous futures (1539)
Adds adjusted history for continuous future (1548)
Adds roll style which takes the volume of a future contract into account, specifically for continuous futures (1556)
Adds better error message when calling Zipline API functions outside of a running simulation (1593)
Adds MACDSignal(), MovingAverageConvergenceDivergenceSignal(), and AnnualizedVolatility() as built-in factors. (1588)
Allows running pipelines with custom date chunks in attach_pipeline (1617)
Adds order_batch to the trade blotter (1596)
Adds vectorized lookup_symbol (1627)
Solidifies equality comparisons for SlippageModel classes (1657)
Adds a factor for winsorized results (1696)
Bug Fixes#
Changes str to string_types to avoid errors when type checking unicode and not str type. (1315)
Algorithms default to quantopian-quandl bundle when no data source is specified (1479) (1374)
Catches all missing data exceptions when computing dividend ratios (1507)
Creates adjustments based on ordered assets instead of a set. Before, adjustments were created for estimates based on where assets happened to fall in a set rather than using ordered assets (1547)
Fixes blaze pipeline queries for when users query for the asof_date column (1608)
Datetimes should be converted in UTC. DataFrames being returned were creating US/Eastern timestamps out of the ints, potentially changing the date returned to be the date before (1635)
Fixes default inputs for IchimokuKinkoHyo factor (1638)
Performance#
Removes invocations of get_calendar('NYSE') which cuts down zipline import time and makes the CLI more responsive and use less memory. (1471)
Refcounts and releases pipeline terms when they are no longer needed (1484)
Saves up to 75% of calls to minute_to_session_label (1492)
Speeds up counting of number of minutes across contiguous sessions (1497)
Removes/defers calls to get_loc on large indices (1504) (1503)
Replaces get_loc calls in calc_dividend_ratios with get_indexer (1510)
Speeds up minute to session sampling (1549)
Adds some micro optimizations in data.current (1561)
Adds optimization for initial workspace for pipelines (1521)
More memory savings (1599)
Maintenance and Refactorings#
Adds additional fields to __getitem__ for Order class (1483)
Adds BarReader base class for minute and session readers (1486)
Removes future_chain API method, to be replaced by data.current_chain (1502)
Puts zipline back on blaze master (1505)
Adds Tini and sets version range for numpy, pandas, and scipy in Dockerfile (1514)
Deprecates set_do_not_order_list (1487)
Uses Timedelta instead of DateOffset (1487)
Update and pin more dev requirements (1642)
Build#
Documentation#
Updated example notebook for latest zipline cell magic
Adds ANACONDA_TOKEN directions (1589)
Miscellaneous#
Changed the short-opt for --before in the zipline clean entrypoint. The new argument is -e. The old argument, -b, conflicted with the --bundle short-opt (1625).
Release 1.0.2#
- Release:
1.0.2
- Date:
September 8, 2016
Enhancements#
Adds forward fill checkpoint tables for the blaze core loader. This allows the loader to more efficiently forward fill the data by capping the lower date it must search for when querying data. The checkpoints should have novel deltas applied (1276).
Updated VagrantFile to include all dev requirements and use a newer image (1310).
Allow correlations and regressions to be computed between two 2D factors by doing computations asset-wise (1307).
Filters have been made window_safe by default. Now they can be passed in as arguments to other Filters, Factors and Classifiers (1338).
Added an optional groupby parameter to rank(), top(), and bottom(). (1349).
Added new pipeline filters, All and Any, which take another filter and return True if an asset produced a True for any/all days in the previous window_length days (1358).
Added new pipeline filter AtLeastN, which takes another filter and an int N and returns True if an asset produced a True on N or more days in the previous window_length days (1367).
Use external library empyrical for risk calculations. Empyrical unifies risk metric calculations between pyfolio and zipline. Empyrical adds custom annualization options for returns of custom frequencies. (855)
Add Aroon factor. (1258)
Add fast stochastic oscillator factor. (1255)
Add a Dockerfile. (1254)
New trading calendar which supports sessions which span across midnights, e.g. 24 hour 6:01PM-6:00PM sessions for futures trading. zipline.utils.tradingcalendar is now deprecated. (1138) (1312)
Allow slicing a single column out of a Factor/Filter/Classifier. (1267)
Provide Ichimoku Cloud factor (1263)
Allow default parameters on Pipeline terms. (1263)
Provide rate of change percentage factor. (1324)
Provide linear weighted moving average factor. (1325)
Add NotNullFilter. (1345)
Allow capital changes to be defined by a target value. (1337)
Add TrueRange factor. (1348)
Add point in time lookups to assets.db. (1361)
Make can_trade aware of the asset’s exchange. (1346)
Add downsample method to all computable terms. (1394)
Add QuantopianUSFuturesCalendar. (1414)
Enable publishing of old assets.db versions. (1430)
Enable schedule_function for Futures trading calendar. (1442)
Disallow regressions of length 1. (1466)
Experimental#
Bug Fixes#
Changes AverageDollarVolume built-in factor to treat missing close or volume values as 0. Previously, NaNs were simply discarded before averaging, giving the remaining values too much weight (1309).
Remove risk-free rate from Sharpe ratio calculation. The ratio is now the average of risk-adjusted returns over volatility of adjusted returns. (853)
Sortino ratio will return a calculated value instead of np.nan when required returns are equal to zero. The ratio now returns the average of risk-adjusted returns over downside risk. Fixed mislabeled API by converting mar to downside_risk. (747)
Downside risk now returns the square root of the mean of downside difference squares. (747)
Information ratio updated to return mean of risk adjusted returns over standard deviation of risk adjusted returns. (1322)
Alpha and sharpe ratio are now annualized. (1322)
Fix units during reading and writing of daily bar first_trading_day attribute. (1245)
Optional dispatch modules, when missing, no longer cause a NameError. (1246)
Treat schedule_function argument as a time rule when a time rule, but no date rule, is supplied. (1221)
Protect against boundary conditions at the beginning and end of the trading day in schedule function. (1226)
Apply adjustments to previous day when using history with a frequency of 1d. (1256)
Fail fast on invalid pipeline columns, instead of attempting to access the nonexistent column. (1280)
Fix AverageDollarVolume NaN handling. (1309)
Performance#
Maintenance and Refactorings#
Removed remaining mentions of add_history. (1287)
Documentation#
Testing#
Add test fixture which sources daily pricing data from minute pricing data fixtures. (1243)
Data Format Changes#
Release 1.0.1#
- Release:
1.0.1
- Date:
May 27, 2016
This is a minor bug-fix release from 1.0.0 and includes a small number of bug fixes and documentation improvements.
Enhancements#
Added support for user-defined commission models. See the zipline.finance.commission.CommissionModel class for more details on implementing a commission model. (1213)
Added support for non-float columns to Blaze-backed Pipeline datasets (1201).
Added zipline.pipeline.slice.Slice, a new pipeline term designed to extract a single column from another term. Slices can be created by indexing into a term, keyed by asset. (1267)
Bug Fixes#
Fixed a bug where Pipeline loaders were not properly initialized by zipline.run_algorithm(). This also affected invocations of zipline run from the CLI.
Fixed a bug that caused the %%zipline IPython cell magic to fail (533233fae43c7ff74abfb0044f046978817cb4e4).
Fixed a bug in the PerTrade commission model where commissions were incorrectly applied to each partial-fill of an order rather than on the order itself, resulting in algorithms being charged too much in commissions when placing large orders. PerTrade now correctly applies commissions on a per-order basis (1213).
Attribute accesses on CustomFactors defining multiple outputs will now correctly return an output slice when the output is also the name of a Factor method (1214).
Replaced deprecated usage of pandas.io.data with pandas_datareader (1218).
Fixed an issue where .pyi stub files for zipline.api were accidentally excluded from the PyPI source distribution. Conda users should be unaffected (1230).
Documentation#
Added a new example, zipline.examples.momentum_pipeline, which exercises the Pipeline API (1230).
Release 1.0.0#
- Release:
1.0.0
- Date:
May 19, 2016
Highlights#
Zipline 1.0 Rewrite (1105)#
We have rewritten a lot of Zipline and its basic concepts in order to improve runtime performance. At the same time, we’ve introduced several new APIs.
At a high level, earlier versions of Zipline simulations pulled from a multiplexed stream of data sources, which were merged via heapq. This stream was fed to the main simulation loop, driving the clock forward. This strong dependency on reading all the data made it difficult to optimize simulation performance because there was no connection between the amount of data we fetched and the amount of data actually used by the algorithm.
Now, we only fetch data when the algorithm needs it. A new class,
DataPortal
, dispatches data requests to
various data sources and returns the requested values. This makes the runtime of
a simulation scale much more closely with the complexity of the algorithm,
rather than with the number of assets provided by the data sources.
Instead of the data stream driving the clock, now simulations iterate through a
pre-calculated set of day or minute timestamps. The timestamps are emitted by
MinuteSimulationClock
and
DailySimulationClock
, and consumed by the main
loop in transform()
.
We’ve retired the data[sid(N)]
and history
APIs, replacing them with
several methods on the BarData
object:
current()
,
history()
,
can_trade()
, and
is_stale()
. Old APIs will continue to work for
now, but will issue deprecation warnings.
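For illustration, inside handle_data the new methods are used roughly like this (the asset, field, and window choices are arbitrary):

from zipline.api import order_target_percent, symbol

def handle_data(context, data):
    asset = symbol('AAPL')
    if data.can_trade(asset) and not data.is_stale(asset):
        price = data.current(asset, 'price')
        closes = data.history(asset, 'close', bar_count=20, frequency='1d')
        if price > closes.mean():
            order_target_percent(asset, 0.05)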
You can now pass in an adjustments source to the
DataPortal
, and we will apply adjustments to
the pricing data when looking backwards at data. Prices and volumes for execution, and those presented to the algorithm in data.current, are the as-traded values of the asset.
New Entry Points (1173 and 1178)#
In order to make it easier to use zipline we have updated the entry points for a backtest. The three supported ways to run a backtest are now:
zipline.run_algorithm()
$ zipline run
%zipline
(IPython magic)
Data Bundles (1173 and 1178)#
1.0.0 introduces data bundles. Data bundles are groups of data that should be preloaded and used to run backtests later. This means users no longer need to specify which tickers they are interested in each time they run an algorithm, and it allows us to cache the data between runs.
By default, the quantopian-quandl
bundle will be used which pulls data from
Quantopian’s mirror of the quandl WIKI dataset. New bundles may be registered with
zipline.data.bundles.register()
like:
@zipline.data.bundles.register('my-new-bundle')
def my_new_bundle_ingest(environ,
                         asset_db_writer,
                         minute_bar_writer,
                         daily_bar_writer,
                         adjustment_writer,
                         calendar,
                         cache,
                         show_progress):
    ...
This function should retrieve the data it needs and then use the writers that have been passed to write that data to disk in a location that zipline can find later.
This data can be used in backtests by passing the name as the -b / --bundle
argument to $ zipline run
or as the bundle
argument to
zipline.run_algorithm()
.
See the Data section for more information.
String Support in Pipeline (1174)#
Added support for string data in Pipeline.
zipline.pipeline.data.Column
now accepts object
as a dtype, which
signifies that loaders for that column should emit windowed iterators over the
experimental new LabelArray
class.
Several new Classifier
methods have also been added
for constructing Filter
instances based on string
operations. The new methods are:
element_of()
startswith()
endswith()
has_substring()
matches()
element_of is defined for all classifiers. The remaining methods are only defined for string-dtype classifiers.
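A sketch of how a string column and these methods fit together (the dataset and sector codes are hypothetical):

from zipline.pipeline import Pipeline
from zipline.pipeline.data import Column, DataSet

class HypotheticalFundamentals(DataSet):
    # object dtype marks a string column backed by LabelArray windows.
    sector_code = Column(object)

sector = HypotheticalFundamentals.sector_code.latest     # string-dtype Classifier
is_tech_or_health = sector.element_of(['TECH', 'HLTH'])  # a Filter

pipe = Pipeline(columns={'sector': sector}, screen=is_tech_or_health)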
Enhancements#
Made the data loading classes have more consistent interfaces. This includes the equity bar writers, adjustment writer, and asset db writer. The new interface is that the resource to be written to is passed at construction time and the data to write is provided later to the write method as dataframes or some iterator of dataframes. This model allows us to pass these writer objects around as a resource for other classes and functions to consume (1109 and 1149).
Added masking to zipline.pipeline.CustomFactor. Custom factors can now be passed a Filter upon instantiation. This tells the factor to only compute over stocks for which the filter returns True, rather than always computing over the entire universe of stocks. (1095)
Added zipline.utils.cache.ExpiringCache. A cache which wraps entries in a zipline.utils.cache.CachedObject, which manages expiration of entries based on the dt supplied to the get method. (1130)
Implemented zipline.pipeline.factors.RecarrayField, a new pipeline term designed to be the output type of a CustomFactor with multiple outputs. (1119)
Added optional outputs parameter to zipline.pipeline.CustomFactor. Custom factors are now capable of computing and returning multiple outputs, each of which are themselves a Factor. (1119)
Added support for string-dtype pipeline columns. Loaders for these columns should produce instances of zipline.lib.labelarray.LabelArray when traversed. latest() on string columns produces a string-dtype zipline.pipeline.Classifier. (1174)
Added several methods for converting Classifiers into Filters. The new methods are element_of(), startswith(), endswith(), has_substring(), and matches(). element_of is defined for all classifiers. The remaining methods are only defined for strings. (1174)
Added BollingerBands factor. This factor implements the Bollinger Bands technical indicator: https://en.wikipedia.org/wiki/Bollinger_Bands (1199).
Fetcher has been moved from Quantopian internal code into Zipline (1105).
Added new built-in factors, RollingPearsonOfReturns, RollingSpearmanOfReturns and RollingLinearRegressionOfReturns (1154)
Experimental Features#
Warning
Experimental features are subject to change.
Added a new zipline.lib.labelarray.LabelArray class for efficiently representing and computing on string data with numpy. This class is conceptually similar to pandas.Categorical, in that it represents string arrays as arrays of indices into a (smaller) array of unique string values. (1174)
Bug Fixes#
None
Performance#
None
Maintenance and Refactorings#
None
Build#
None
Documentation#
Miscellaneous#
Release 0.9.0#
- Release:
0.9.0
- Date:
March 29, 2016
Highlights#
Added classifiers and normalization methods to pipeline, along with new datasets and factors.
Added support for Windows with continuous integration on AppVeyor.
Enhancements#
Added new datasets CashBuybackAuthorizations and ShareBuybackAuthorizations for use in the Pipeline API. These datasets provide an abstract interface for adding cash and share buyback authorizations data, respectively, to a new algorithm. pandas-based reference implementations for these datasets can be found in zipline.pipeline.loaders.buyback_auth, and experimental blaze-based implementations can be found in zipline.pipeline.loaders.blaze.buyback_auth. (1022).
Added new datasets DividendsByExDate, DividendsByPayDate, and DividendsByAnnouncementDate for use in the Pipeline API. These datasets provide an abstract interface for adding dividends data organized by ex date, pay date, and announcement date, respectively, to a new algorithm. pandas-based reference implementations for these datasets can be found in zipline.pipeline.loaders.dividends, and experimental blaze-based implementations can be found in zipline.pipeline.loaders.blaze.dividends. (1093).
Added new built-in factors, zipline.pipeline.factors.BusinessDaysSinceCashBuybackAuth and zipline.pipeline.factors.BusinessDaysSinceShareBuybackAuth. These factors use the new CashBuybackAuthorizations and ShareBuybackAuthorizations datasets, respectively. (1022).
Added new built-in factors, zipline.pipeline.factors.BusinessDaysSinceDividendAnnouncement, zipline.pipeline.factors.BusinessDaysUntilNextExDate, and zipline.pipeline.factors.BusinessDaysSincePreviousExDate. These factors use the new DividendsByAnnouncementDate and DividendsByExDate datasets, respectively. (1093).
Implemented zipline.pipeline.Classifier, a new core pipeline API term representing grouping keys. Classifiers are primarily used by passing them as the groupby parameter to factor normalization methods. (1046)
Added factor normalization methods: zipline.pipeline.Factor.demean() and zipline.pipeline.Factor.zscore(). (1046)
Added zipline.pipeline.Factor.quantiles(), a method for computing a Classifier from a Factor by partitioning into equally-sized buckets. Also added helpers for common quantile sizes (zipline.pipeline.Factor.quartiles(), zipline.pipeline.Factor.quintiles(), and zipline.pipeline.Factor.deciles()) (1075).
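For example, returns might be z-scored within liquidity buckets built from one of the new quantile helpers (the particular grouping is illustrative):

from zipline.pipeline import Pipeline
from zipline.pipeline.factors import AverageDollarVolume, Returns

returns = Returns(window_length=10)
# Group assets into deciles of 30-day average dollar volume, then z-score
# returns within each decile.
liquidity_decile = AverageDollarVolume(window_length=30).deciles()
normalized = returns.zscore(groupby=liquidity_decile)

pipe = Pipeline(columns={'normalized_returns': normalized})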
Experimental Features#
Warning
Experimental features are subject to change.
None
Bug Fixes#
Fixed a bug where merging two numerical expressions failed given too many inputs. This caused running a pipeline to fail when combining more than ten factors or filters. (1072)
Performance#
None
Maintenance and Refactorings#
None
Build#
Added AppVeyor for continuous integration on Windows. Added conda build of zipline and its dependencies to AppVeyor and Travis builds, which upload their results to anaconda.org labeled with “ci”. (981)
Documentation#
None
Miscellaneous#
Adds ZiplineTestCase which provides hooks to consume test fixtures. Fixtures are things like: WithAssetFinder which will make self.asset_finder available to your test with some mock data (1042).
Release 0.8.4#
- Release:
0.8.4
- Date:
February 24, 2016
Highlights#
Added a new EarningsCalendar dataset for use in the Pipeline API. (905).
AssetFinder speedups (830 and 817).
Improved support for non-float dtypes in Pipeline. Most notably, we now support datetime64 and int64 dtypes for Factor, and BoundColumn.latest now returns a proper Filter object when the column is of dtype bool.
Zipline now supports numpy 1.10, pandas 0.17, and scipy 0.16 (969).
Batch transforms have been deprecated and will be removed in a future release. Using history is recommended as an alternative.
Enhancements#
Adds a way for users to provide a context manager to use when executing the scheduled functions (including handle_data). This context manager will be passed the BarData object for the bar and will be used for the duration of all of the functions scheduled to run. This can be passed to TradingAlgorithm by the keyword argument create_event_context (828).
Added support for zipline.pipeline.factors.Factor instances with datetime64[ns] dtypes. (905)
Added a new EarningsCalendar dataset for use in the Pipeline API. This dataset provides an abstract interface for adding earnings announcement data to a new algorithm. A pandas-based reference implementation for this dataset can be found in zipline.pipeline.loaders.earnings, and an experimental blaze-based implementation can be found in zipline.pipeline.loaders.blaze.earnings. (905).
Added new built-in factors, zipline.pipeline.factors.BusinessDaysUntilNextEarnings and zipline.pipeline.factors.BusinessDaysSincePreviousEarnings. These factors use the new EarningsCalendar dataset. (905).
Added isnan(), notnan() and isfinite() methods to zipline.pipeline.factors.Factor (861).
Added zipline.pipeline.factors.Returns, a built-in factor which calculates the percent change in close price over the given window_length. (884).
Added a new built-in factor: AverageDollarVolume. (927).
Added ExponentialWeightedMovingAverage and ExponentialWeightedMovingStdDev factors. (910).
Allow DataSet classes to be subclassed where subclasses inherit all of the columns from the parent. These columns will be new sentinels so you can register a custom loader for them (924).
Added coerce() to coerce inputs from one type into another before passing them to the function (948).
Added optionally() to wrap other preprocessor functions to explicitly allow None (947).
Added ensure_timezone() to allow string arguments to get converted into datetime.tzinfo objects. This also allows tzinfo objects to be passed directly (947).
Added two optional arguments, data_query_time and data_query_tz, to BlazeLoader and BlazeEarningsCalendarLoader. These arguments allow the user to specify some cutoff time for data when loading from the resource. For example, if I want to simulate executing my before_trading_start function at 8:45 US/Eastern then I could pass datetime.time(8, 45) and 'US/Eastern' to the loader. This means that data that is timestamped on or after 8:45 will not be seen on that day in the simulation. The data will be made available on the next day (947).
BoundColumn.latest now returns a Filter for columns of dtype bool (962).
Added support for Factor instances with int64 dtype. Column now requires a missing_value when dtype is integral. (962)
It is also now possible to specify custom missing_value values for float, datetime, and bool Pipeline terms. (962)
Added auto-close support for equities. Any positions held in an equity that reaches its auto_close_date will be liquidated for cash according to the equity’s last sale price. Furthermore, any open orders for that equity will be canceled. Both futures and equities are now auto-closed on the morning of their auto_close_date, immediately prior to before_trading_start. (982)
Experimental Features#
Warning
Experimental features are subject to change.
Added support for parameterized Factor subclasses. Factors may specify params as a class-level attribute containing a tuple of parameter names. These values are then accepted by the constructor and forwarded by name to the factor’s compute function. This API is experimental, and may change in future releases.
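A minimal sketch of the params hook (the factor and its parameter are made up for illustration):

from numpy import arange, average
from zipline.pipeline import CustomFactor
from zipline.pipeline.data import USEquityPricing

class DecayWeightedClose(CustomFactor):
    """Hypothetical factor: weighted average close with a tunable exponent."""
    inputs = [USEquityPricing.close]
    params = ('decay',)

    def compute(self, today, assets, out, close, decay):
        # More recent rows get larger weights as `decay` grows.
        weights = arange(1.0, len(close) + 1.0) ** decay
        out[:] = average(close, axis=0, weights=weights)

# The parameter is accepted by the constructor and forwarded to compute by name.
factor = DecayWeightedClose(window_length=20, decay=2.0)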
Bug Fixes#
Fixes an issue that would cause the daily/minutely method caching to change the len of a SIDData object. This would cause us to think that the object was not empty even when it was (826).
Fixes an error raised in calculating beta when benchmark data were sparse. Instead numpy.nan is returned (859).
Fixed an issue pickling sentinel() objects (872).
Fixed spurious warnings on first download of treasury data (922).
Corrected the error messages for set_commission() and set_slippage() when used outside of the initialize function. These errors called the functions override_* instead of set_*. This also renamed the exception types raised from OverrideSlippagePostInit and OverrideCommissionPostInit to SetSlippagePostInit and SetCommissionPostInit (923).
Fixed an issue in the CLI that would cause assets to be added twice. This would map the same symbol to two different sids (942).
Fixed an issue where the PerformancePeriod incorrectly reported the total_positions_value when creating an Account (950).
Fixed issues around KeyErrors coming from history and BarData on 32-bit python, where Assets did not compare properly with int64s (959).
Fixed a bug where boolean operators were not properly implemented on Filter (991).
Installation of zipline no longer downgrades numpy to 1.9.2 silently and unconditionally (969).
Performance#
Maintenance and Refactorings#
Build#
Makes zipline install requirements more flexible (825).
Use versioneer to manage the project __version__ and setup.py version (829).
Fixed coveralls integration on travis build (840).
Fixed conda build, which now uses git source as its source and reads requirements using setup.py, instead of copying them and letting them get out of sync (937).
Require setuptools > 18.0 (951).
Documentation#
Document the release process for developers (835).
Added reference docs for the Pipeline API. (864).
Added reference docs for Asset Metadata APIs. (864).
Generated documentation now includes links to source code for many classes and functions. (864).
Added platform-specific documentation describing how to find binary dependencies. (883).
Miscellaneous#
Added a show_graph() method to render a Pipeline as an image (836).
Adds subtest() decorator for creating subtests without nose_parameterized.expand() which bloats the test output (833).
Limits timer report in test output to 15 longest tests (838).
Treasury and benchmark downloads will now wait up to an hour to download again if data returned from a remote source does not extend to the date expected. (841).
Added a tool to downgrade the assets db to previous versions (941).
Release 0.8.3#
- Release:
0.8.3
- Date:
November 6, 2015
Note
We advanced the version to 0.8.3 to fix a source distribution issue with PyPI. There are no code changes in this version.
Release 0.8.0#
- Release:
0.8.0
- Date:
November 6, 2015
Highlights#
New documentation system with a new website at zipline.io
Major performance enhancements.
Dynamic history.
New user defined method: before_trading_start.
New api function: schedule_function().
New api function: get_environment().
New api function: set_max_leverage().
New api function: set_do_not_order_list().
Pipeline API.
Support for trading futures.
Enhancements#
Account object: Adds an account object to context to track information about the trading account. Example: context.account.settled_cash returns the settled cash value that is stored on the account object. This value is updated accordingly as the algorithm is run (396).
HistoryContainer can now grow dynamically. Calls to history() will now be able to increase the size or change the shape of the history container to be able to service the call. add_history() now acts as a performance hint to pre-allocate sufficient space in the container. This change is backwards compatible with history; all existing algorithms should continue to work as intended (412).
SIDData
now has methods for:stddev
mavg
vwap
returns
These methods, except for
returns
, accept a number of days. If you are running with minute data, then this will calculate the number of minutes in those days, accounting for early closes and the current time and apply the transform over the set of minutes.returns
takes no parameters and will return the daily returns of the given asset. Example:data[security].stddev(3)
(429).
New fields in Performance Period. Performance Period has new fields accessible in the return value of to_dict: gross leverage, net leverage, short exposure, long exposure, shorts count, and longs count (464).
Allow order_percent() to work with various market values (by Jeremiah Lowin).
Currently, order_percent() and order_target_percent() both operate as a percentage of self.portfolio.portfolio_value. This PR lets them operate as percentages of other important MVs. Also adds context.get_market_value(), which enables this functionality. For example:

# this is how it works today (and this still works)
# put 50% of my portfolio in AAPL
order_percent('AAPL', 0.5)
# note that if this were a fully invested portfolio, it would become 150% levered.

# take half of my available cash and buy AAPL
order_percent('AAPL', 0.5, percent_of='cash')

# rebalance my short position, as a percentage of my current short book
order_target_percent('MSFT', 0.1, percent_of='shorts')

# rebalance within a custom group of stocks
tech_stocks = ('AAPL', 'MSFT', 'GOOGL')
tech_filter = lambda p: p.sid in tech_stocks
for stock in tech_stocks:
    order_target_percent(stock, 1/3, percent_of_fn=tech_filter)

(477).
Command line option for printing the algo to stdout (by Andrea D’Amore) (545).
New user defined function before_trading_start. This function can be overridden by the user to be called once before the market opens every day (389).
New api function schedule_function(). This function allows the user to schedule a function to be called based on more complicated rules about the date and time. For example, call the function 15 minutes before market close respecting early closes (411).
New api function set_do_not_order_list(). This function accepts a list of assets and adds a trading guard that prevents the algorithm from trading them. Adds a point in time list of leveraged ETFs that people may want to mark as ‘do not trade’ (478).
Adds a class for representing securities. order() and other order functions now require an instance of Security instead of an int or string (520).
Generalize the Security class to Asset. This is in preparation for adding support for other asset types (535).
New api function get_environment(). This function by default returns the string 'zipline'. This is used so that algorithms can have different behavior on Quantopian and local zipline (384).
Extends get_environment() to expose more of the environment to the algorithm. The function now accepts an argument that is the field to return. By default, this is 'platform', which returns the old value of 'zipline', but the following new fields can be requested:
'arena': Is this live trading or backtesting?
'data_frequency': Is this minute mode or daily mode?
'start': Simulation start date.
'end': Simulation end date.
'capital_base': The starting capital for the simulation.
'platform': The platform that the algorithm is running on.
'*': A dictionary containing all of these fields.
(449).
New api function set_max_leverage(). This method adds a trading guard that prevents your algorithm from over leveraging itself (552).
Experimental Features#
Warning
Experimental features are subject to change.
Adds new Pipeline API. The pipeline API is a high-level declarative API for representing trailing window computations on large datasets (630).
Adds support for futures trading (637).
Adds Pipeline loader for blaze expressions. This allows users to pull data from any format blaze understands and use it in the Pipeline API. (775).
Bug Fixes#
Fix a bug where the reported returns could sharply dip for random periods of time (378).
Fix a bug that prevented debuggers from resolving the algorithm file (431).
Properly forward arguments to user defined initialize function (687).
Fix a bug that would cause treasury data to be redownloaded every backtest between midnight EST and the time when the treasury data was available (793).
Fix a bug that would cause the user defined analyze function to not be called if it was passed as a keyword argument to TradingAlgorithm (819).
Performance#
Major performance enhancements to history (by Dale Jung) (488).
Maintenance and Refactorings#
Remove simple transform code. These are available as methods of SIDData (550).
Build#
None
Documentation#
Switched to sphinx for the documentation (816).
Release 0.7.0#
- Release:
0.7.0
- Date:
July 25, 2014
Highlights#
Command line interface to run algorithms directly.
IPython Magic %%zipline that runs an algorithm defined in an IPython notebook cell.
API methods for building safeguards against runaway ordering and undesired short positions.
New history() function to get a moving DataFrame of past market data (replaces BatchTransform).
A new beginner tutorial.
Enhancements#
CLI: Adds a CLI and IPython magic for zipline. Example:
python run_algo.py -f dual_moving_avg.py --symbols AAPL --start 2011-1-1 --end 2012-1-1 -o dma.pickle
Grabs the data from yahoo finance, runs the file dual_moving_avg.py (and looks for dual_moving_avg_analyze.py which, if found, will be executed after the algorithm has been run), and outputs the perf DataFrame to dma.pickle (325).
IPython magic command (at the top of an IPython notebook cell). Example:
%%zipline --symbols AAPL --start 2011-1-1 --end 2012-1-1 -o perf
Does the same as above, except that instead of executing a file it looks for the algorithm in the cell, and instead of outputting the perf df to a file it creates a variable in the namespace called perf (325).
Adds Trading Controls to the algorithm API.
The following functions are now available on TradingAlgorithm and for algo scripts:

set_max_order_size(self, sid=None, max_shares=None, max_notional=None)
Set a limit on the absolute magnitude, in shares and/or total dollar value, of any single order placed by this algorithm for a given sid. If sid is None, then the rule is applied to any order placed by the algorithm. Example:

def initialize(context):
    # Algorithm will raise an exception if we attempt to place an
    # order which would cause us to hold more than 10 shares
    # or 1000 dollars worth of sid(24).
    set_max_order_size(sid(24), max_shares=10, max_notional=1000.0)

set_max_position_size(self, sid=None, max_shares=None, max_notional=None)
Set a limit on the absolute magnitude, in either shares or dollar value, of any position held by the algorithm for a given sid. If sid is None, then the rule is applied to any position held by the algorithm. Example:

def initialize(context):
    # Algorithm will raise an exception if we attempt to order more than
    # 10 shares or 1000 dollars worth of sid(24) in a single order.
    set_max_order_size(sid(24), max_shares=10, max_notional=1000.0)

set_max_order_count(self, max_count)
Set a limit on the number of orders that can be placed by the algorithm in a single trading day. Example:

def initialize(context):
    # Algorithm will raise an exception if more than 50 orders are placed in a day.
    set_max_order_count(50)

set_long_only(self)
Set a rule specifying that the algorithm may not hold short positions. Example:

def initialize(context):
    # Algorithm will raise an exception if it attempts to place
    # an order that would cause it to hold a short position.
    set_long_only()

(329).
Adds an all_api_methods classmethod on TradingAlgorithm that returns a list of all TradingAlgorithm API methods (333).
Expanded record() functionality for dynamic naming. The record() function can now take positional args before the kwargs. All original usage and functionality is the same, but now these extra usages will work:

name = 'Dynamically_Generated_String'
record(name, value, ...)
record(name, value1, 'name2', value2, name3=value3, name4=value4)

The only requirement is that the positional args occur before the kwargs (355).
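As a concrete illustration, here is a minimal sketch of the new positional-name form inside handle_data; the counter and the 'bars_seen'/'static_metric' names are hypothetical, only record() itself comes from the notes above:

from zipline.api import record

def initialize(context):
    context.counter = 0

def handle_data(context, data):
    context.counter += 1
    # Hypothetical dynamically generated series name; the positional
    # name/value pair is recorded just like a keyword argument would be.
    name = 'bars_seen'
    record(name, context.counter, static_metric=1.0)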
history() has been ported from Quantopian to Zipline and provides a moving window of market data. history() replaces BatchTransform. It is faster, works for minute-level data, and has a superior interface. To use it, call add_history() inside of initialize() and then receive a pandas DataFrame by calling history() from inside handle_data(). Check out the tutorial and an example (345 and 357).
history() now supports 1m window lengths (345).
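A minimal sketch of the add_history()/history() workflow described above; the ten-bar window and the recorded mean are illustrative choices, not taken from the release notes:

from zipline.api import add_history, history, record

def initialize(context):
    # Register the window up front so the moving DataFrame is maintained.
    add_history(bar_count=10, frequency='1d', field='price')

def handle_data(context, data):
    # Returns a pandas DataFrame of the last 10 daily prices, one column per sid.
    prices = history(bar_count=10, frequency='1d', field='price')
    record(mean_price=prices.mean().mean())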
Bug Fixes#
Performance#
None
Maintenance and Refactorings#
Build#
The following dependencies have been updated (zipline might work with other versions too):
-pytz==2013.9
+pytz==2014.4
+numpy==1.8.1
-numpy==1.8.0
+scipy==0.12.0
+patsy==0.2.1
+statsmodels==0.5.0
-six==1.5.2
+six==1.6.1
-Cython==0.20
+Cython==0.20.1
-TA-Lib==0.4.8
+--allow-external TA-Lib --allow-unverified TA-Lib TA-Lib==0.4.8
-requests==2.2.0
+requests==2.3.0
-nose==1.3.0
+nose==1.3.3
-xlrd==0.9.2
+xlrd==0.9.3
-pep8==1.4.6
+pep8==1.5.7
-pyflakes==0.7.3
-pip-tools==0.3.4
+pyflakes==0.8.1
-scipy==0.13.2
-tornado==3.2
-pyparsing==2.0.1
-patsy==0.2.1
-statsmodels==0.4.3
+tornado==3.2.1
+pyparsing==2.0.2
-Markdown==2.3.1
+Markdown==2.4.1
Contributors#
The following people have contributed to this release, ordered by number of commits:
38 Scott Sanderson
29 Thomas Wiecki
26 Eddie Hebert
6 Delaney Granizo-Mackenzie
3 David Edwards
3 Richard Frank
2 Jonathan Kamens
1 Pankaj Garg
1 Tony Lambiris
1 fawce
Release 0.6.1#
- Release:
0.6.1
- Date:
April 23, 2014
Highlights#
Major fixes to risk calculations, see Bug Fixes section.
Port of history() function, see Enhancements section.
Start of support for Quantopian algorithm script-syntax, see ENH section.
conda package manager support, see Build section.
Enhancements#
Always process new orders, i.e. on bars where handle_data isn’t called but there is ‘clock’ data (e.g. a consistent benchmark), process orders.
Empty positions are now filtered from the portfolio container, to help prevent algorithms from operating on positions that are not in the existing universe of stocks. Formerly, iterating over positions would return positions for stocks which had zero shares held. (Previously an explicit check in algorithm code for pos.amount != 0 was needed to avoid using a non-existent position.)
Add trading calendar for BMF&Bovespa.
Add beginning of algo script support.
Starts on the path of parity with the script syntax in Quantopian’s IDE on https://quantopian.com. Example:

from datetime import datetime
import pytz

from zipline import TradingAlgorithm
from zipline.utils.factory import load_from_yahoo
from zipline.api import order

def initialize(context):
    context.test = 10

def handle_date(context, data):
    order('AAPL', 10)
    print(context.test)

if __name__ == '__main__':
    import pylab as pl

    start = datetime(2008, 1, 1, 0, 0, 0, 0, pytz.utc)
    end = datetime(2010, 1, 1, 0, 0, 0, 0, pytz.utc)
    data = load_from_yahoo(stocks=['AAPL'], indexes={}, start=start, end=end)
    data = data.dropna()
    algo = TradingAlgorithm(initialize=initialize, handle_data=handle_date)
    results = algo.run(data)
    results.portfolio_value.plot()
    pl.show()
Add HDF5 and CSV sources.
Limit handle_data to times with market data. To prevent cases where custom data types had unaligned timestamps, only call handle_data when market data passes through. Custom data that comes before market data will still update the data bar, but the handling of that data will only be done when there is actionable market data.
Extended the PerShare commission method to allow a minimum cost per trade.
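A minimal sketch of a per-share commission with a minimum cost per trade, assuming set_commission is callable from initialize() and that the minimum is passed as a min_trade_cost keyword (both assumptions; the exact 0.6.1 signature may differ):

from zipline.api import set_commission
from zipline.finance import commission

def initialize(context):
    # Assumed keyword name and example values: $0.005 per share,
    # but never less than $1.00 for any single trade.
    set_commission(commission.PerShare(cost=0.005, min_trade_cost=1.00))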
Add symbol() API function. A symbol() lookup feature was added to Quantopian. By adding the same API function to Zipline we can make copy-and-pasting of a Zipline algo to Quantopian easier (a short usage sketch follows after the next item).
Add simulated random trade source. Added a new data source that emits events with a certain user-specified frequency (minute or daily). This allows users to backtest and debug an algorithm in minute mode to provide a cleaner path towards Quantopian.
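For illustration, a minimal sketch of the symbol() lookup mentioned two items above; the AAPL ticker and the order size are arbitrary examples:

from zipline.api import symbol, order

def initialize(context):
    # Resolve the ticker string to an asset once, up front.
    context.asset = symbol('AAPL')

def handle_data(context, data):
    order(context.asset, 10)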
Remove dependency on benchmark for trading day calendar. Instead of the benchmarks’ index, the trading calendar is now used to populate the environment’s trading days. Remove extra_date field, since unlike the benchmarks list, the trading calendar can generate future dates, so dates for current day trading do not need to be appended. Motivations:
The source for the open and close/early close calendar and the trading day calendar is now the same, which should help prevent potential issues due to misalignment.
Allows configurations where the benchmark is provided as a generator-based data source, without needing to supply a second benchmark list just to populate dates.
Port history() API method from Quantopian. Opens the core of the history() function that was previously only available on the Quantopian platform.
The history method is analogous to the batch_transform function/decorator, but with a hopefully more precise specification of the frequency and period of the previous bar data that is captured. Example usage:

from zipline.api import history, add_history

def initialize(context):
    add_history(bar_count=2, frequency='1d', field='price')

def handle_data(context, data):
    prices = history(bar_count=2, frequency='1d', field='price')
    context.last_prices = prices

N.B. this version of history lacks the backfilling capability that allows returning a full DataFrame on the first bar.
Bug Fixes#
Adjust benchmark events to match market hours (241). Previously benchmark events were emitted at 0:00 on the day the benchmark related to: in ‘minute’ emission mode this meant that the benchmarks were emitted before any intra-day trades were processed.
Ensure perf stats are generated for all days. When running with minutely emissions the simulator would report to the user that it simulated ‘n - 1’ days (where n is the number of days specified in the simulation params). Now the correct number of trading days is reported as being simulated.
Fix repr for cumulative risk metrics. The __repr__ for RiskMetricsCumulative was referring to an older structure of the class, causing an exception when printed. Also, now prints the last values in the metrics DataFrame.
Prevent minute emission from crashing at end of available data. The next day calculation was causing an error when a minute emission algorithm reached the end of available data. Instead of a generic exception when the end of available data is reached, raise and catch a named exception so that the tradesimulation loop can skip over it, since the next market close is not needed at the end.
Fix pandas indexing in trading calendar. This could alternatively be filed under Performance. Index using loc instead of the inefficient indexing of day, then time.
Prevent crash in vwap transform due to non-existent member. The WrongDataForTransform was referencing a self.fields member, which did not exist. Add a self.fields member set to price and volume and use it to iterate over during the check.
Fix max drawdown calculation. The input into max drawdown was incorrect, causing bad results, i.e. the compounded_log_returns were not values representative of the algorithm’s total return at a given time, though calculate_max_drawdown was treating the values as if they were. Instead, the algorithm_period_returns series is now used, which does provide the total return.
Fix cost basis calculation. Cost basis calculation now takes the direction of the transaction into account. Closing a long position or covering a short shouldn’t affect the cost basis.
Fix floating point error in order(). Order amounts that were near an integer could accidentally be floored or ceilinged (depending on being positive or negative) to the wrong integer, e.g. an amount stored internally as -27.99999 was converted to -27 instead of -28.
Update perf period state when positions are changed by splits. Otherwise, self._position_amounts will be out of sync with position.amount, etc.
Fix misalignment of downside series calc when using exact dates. An oddity that was exposed while working on making the return series passed to the risk module more exact: the series comparison between the returns and mean returns was unbalanced, because the mean returns were not masked down to the downside data points; however, in most, if not all, cases this was papered over by the call to .valid() which was removed in this change set.
Check that self.logger exists before using it. self.logger is initialized as None and there is no guarantee that users have set it, so check that it exists before trying to pass messages to it.
Prevent out of sync market closes in performance tracker. In situations where the performance tracker has been reset or patched to handle state juggling with warming up live data, the market_close member of the performance tracker could end up out of sync with the current algo time as determined by the performance tracker. The symptom was dividends never triggering, because the end-of-day checks would not match the current time. Fix by having the tradesimulation loop be responsible, in minute/minute mode, for advancing the market close and passing that value to the performance tracker, instead of having the market close advanced by the performance tracker as well.
Fix numerous cumulative and period risk calculations. The calculations that are expected to change are:
cumulative.beta
cumulative.alpha
cumulative.information
cumulative.sharpe
period.sortino
How Risk Calculations Are Changing
Risk Fixes for Both Period and Cumulative
Downside Risk
Use sample instead of population for standard deviation.
Add a rounding factor, so that if the two values are close for a given dt, they do not count as a downside value, which would throw off the denominator of the standard deviation of the downside diffs.
Standard Deviation Type
Across the board the standard deviation has been standardized to using a ‘sample’ calculation, whereas before cumulative risk was mostly using ‘population’. Using ddof=1 with np.std calculates as if the values are a sample (see the short numpy sketch at the end of this section).
Cumulative Risk Fixes
Beta
Use the daily algorithm returns and benchmarks instead of annualized mean returns.
Volatility
Use sample instead of population with standard deviation.
The volatility is an input to other calculations, so this change affects the Sharpe and Information Ratio calculations.
Information Ratio
The benchmark returns input is changed from annualized benchmark returns to the annualized mean returns.
Alpha
The benchmark returns input is changed from annualized benchmark returns to the annualized mean returns.
Period Risk Fixes
Sortino
Now uses the downside risk of the daily return vs. the mean algorithm returns for the minimum acceptable return instead of the treasury return.
The above required adding the calculation of the mean algorithm returns for period risk.
Also, uses algorithm_period_returns and treasury_period_return as the cumulative Sortino does, instead of using algorithm returns for both inputs into the Sortino calculation.
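A short numpy sketch of the sample-vs-population distinction referenced in the standard deviation notes above (the toy return values are arbitrary):

import numpy as np

returns = np.array([0.01, -0.02, 0.015, 0.003])

population_std = np.std(returns)       # ddof=0: divide by n
sample_std = np.std(returns, ddof=1)   # ddof=1: divide by n - 1, as the risk metrics now do

assert sample_std > population_std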
Performance#
Removed alias_dt transform in favor of a property on SIDData. Adding a copy of the Event’s dt field as datetime via the alias_dt generator, so that the API was forgiving and allowed both datetime and dt on a SIDData object, was creating noticeable overhead, even on a no-op algorithm. Instead of incurring the cost of copying the datetime value and assigning it to the Event object on every event that is passed through the system, add a property to SIDData which acts as an alias of datetime to dt. Eventually support for data['foo'].datetime may be removed, and could be considered deprecated.
Remove the drop of ‘null return’ from cumulative returns. The check for the existence of the null return key, and the drop of said return on every single bar, was adding unneeded CPU time when an algorithm was run with minute emissions. Instead, add the 0.0 return with an index of the trading day before the start date. The removal of the null return was mainly in place so that the period calculation was not crashing on a non-date index value; with the index as a date, the period return can also approximate volatility (even though that volatility has a high noise-to-signal ratio because it uses only two values as an input).
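A minimal sketch of the alias-property idea, not Zipline's actual SIDData implementation: exposing datetime as a read-only property avoids copying the value onto every event.

class SIDDataSketch(object):
    """Toy stand-in for SIDData, showing the datetime -> dt alias."""

    def __init__(self, dt):
        self.dt = dt

    @property
    def datetime(self):
        # No per-event copy; the alias is computed only when accessed.
        return self.dt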
Maintenance and Refactorings#
Allow sim_params to provide the data frequency for the algorithm. In the case that the algorithm’s data_frequency is None, allow the sim_params to provide the data_frequency. Also, defer to the algorithm’s data frequency, if provided.
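A minimal sketch of the precedence described above (illustrative only; the helper name is hypothetical): the algorithm's own data_frequency wins when set, otherwise the sim_params value is used.

def resolve_data_frequency(algo_data_frequency, sim_params_data_frequency):
    # Hypothetical helper illustrating the fallback order.
    if algo_data_frequency is not None:
        return algo_data_frequency
    return sim_params_data_frequency

assert resolve_data_frequency('daily', 'minute') == 'daily'
assert resolve_data_frequency(None, 'minute') == 'minute'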
Build#
Added support for building and releasing via conda, for those who prefer building with https://docs.conda.io/en/latest/ to compiling locally with pip. The following should install Zipline on many systems:
conda install -c quantopian zipline
Contributors#
The following people have contributed to this release, ordered by number of commits:
49 Eddie Hebert
28 Thomas Wiecki
11 Richard Frank
2 Jamie Kirkpatrick
2 Jeremiah Lowin
1 Colin Alexander
1 Michael Schatzow
1 Moises Trovo
1 Suminda Dharmasena