April 06, 2022

VowpalWabbit 9.1.0 Release Notes

Jack Gerrits

It’s only been a couple of months since version 9, but this release includes some exciting improvements to Vowpal Wabbit. Highlights include the removal of the Boost Program Options dependency, a new experimental way to output feature names and weights, and a new loss function.

Removal of checked-in Visual Studio project files

To be very clear: we are not removing support for Windows. Ever since we introduced the CMake build system in version 8.7.0, there have been two parallel build systems for Windows: the CMake-based one and a checked-in solution with project files. We’re now confident that the CMake-based build system can fully replace the old Visual Studio project files. In fact, the CMake-based Windows build has been significantly more flexible, more robust, easier to use, and faster to build for quite some time now.

Therefore, version 9.1.0 will be the last release that contains the old checked-in Visual Studio solution and project files. We strongly recommend that anyone who was using these migrate to the CMake-based build system. If that is not possible in the short term, we have created v141 and v142 NuGet packages as a migration path.
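For reference, a typical CMake-based build from a source checkout looks something like the following. This is a minimal sketch: the exact generator, options, and targets you need may differ, and on Windows CMake will pick up an installed Visual Studio toolchain automatically.

```shell
# Configure the build tree, then compile.
cmake -S . -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build --config Release
```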

Looking forward, depending solely on the CMake build system will greatly improve our ability to modularize VW and will make VW easier to consume as a library.


Removal of Boost Program Options dependency

For a long time we have depended on Boost Program Options for command line option parsing. In this release, we have replaced that dependency with our own implementation of command line parsing. Apart from one place where we depend on Boost Math in standalone mode, this means that VW core and the command line tool are now free of Boost dependencies, hopefully making the code a bit easier to build and package.

Experimental: Expectile loss

Expectile is a new loss function based on asymmetric squared loss: residuals above and below the prediction can be penalized differently. It is still being experimented with and is currently used for risk-averse contextual bandits. It has been added to the loss functions wiki page.
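To illustrate the idea, here is a minimal sketch of an asymmetric squared loss in Python. The parameter name `q` and the exact weighting convention are assumptions for illustration only, not VW's implementation.

```python
def expectile_loss(prediction, label, q=0.5):
    """Asymmetric squared loss sketch: over-predictions are weighted
    by q and under-predictions by (1 - q), so q = 0.5 recovers plain
    squared loss up to a constant factor. (Hypothetical illustration,
    not VW's actual implementation.)"""
    residual = prediction - label
    weight = q if residual >= 0 else 1.0 - q
    return weight * residual * residual
```

With `q` above 0.5, over-predictions are penalized more heavily than under-predictions, which is the kind of asymmetry useful for risk-averse objectives.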

Experimental: Export model weights and readable feature names as JSON

The --invert_hash output is often used to understand which features correspond to which weights in the trained model. This output was never designed for machine reading, but it has seen considerable use in that regard.

We’ve added a new experimental option to export information similar to the weights section of --invert_hash, but in JSON format. This is subject to change as we better understand what information is helpful. It is currently supported only for the default gradient descent base learner. Early usage shows it is much easier to consume and provides richer, more stable output.

More information, including the available options and the output format, can be found here.
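As a usage sketch, the new export can be invoked from the command line alongside (or instead of) the older text format. The flag name --dump_json_weights_experimental and the file names below are assumptions based on the experimental feature described above; check the linked documentation for the exact spelling and options.

```shell
# Older, human-oriented text output of weights and feature names.
vw -d train.dat --invert_hash model.readable

# Hypothetical sketch of the new experimental JSON export
# (flag name assumed; verify against the documentation).
vw -d train.dat --dump_json_weights_experimental weights.json
```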

v141 and v142 native NuGet packages

To assist anyone who is unable to consume the CMake-based build system, we have implemented CI to generate native NuGet packages for the core VW library and the command line executable. These packages are 64-bit and come bundled with the necessary compiled dependencies to build VW. They can be found as artifacts in the CI jobs and are available on

These packages are a temporary measure to ease migration to CMake-based consumption and will eventually be removed.

Thank you

A huge thank you and welcome to all of the new contributors since the last release:

And of course thank you to existing contributors: