notears

Python package implementing "DAGs with NO TEARS: Smooth Optimization for Structure Learning", Xun Zheng, Bryon Aragam, Pradeep Ravikumar, and Eric P. Xing (March 2018, arXiv:1803.01422).

This package implements the NOTEARS learning algorithm, and supplies a few useful utilities (e.g. for generating random graphs, simulating data from linear Gaussian models, measuring performance, and thresholding edge matrices returned by NOTEARS to ensure acyclicity).

Optimization is ultimately performed by the SciPy implementation of L-BFGS-B.
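For context, the general shape of an L-BFGS-B call through SciPy is sketched below. This is only an illustration of the dependency, not the package's internal code; the objective f, gradient f_grad, and starting point w0 are placeholders.

import numpy as np
from scipy.optimize import minimize

# Placeholder objective and gradient, standing in for the (augmented) NOTEARS objective.
def f(w):
    return np.sum(w ** 2)

def f_grad(w):
    return 2 * w

w0 = np.zeros(4)
result = minimize(f, w0, jac=f_grad, method='L-BFGS-B')  # SciPy's L-BFGS-B solver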

Dependencies

numpy
scipy
networkx

Usage

See example_usage.ipynb for a simple Jupyter notebook demonstrating usage.

In general, using this package looks like:

import notears

# data: an (n x d) array of observations, one row per sample and one column per variable
output_dict = notears.run(notears.notears_standard, data, notears.loss.least_squares_loss, notears.loss.least_squares_loss_grad)

# threshold the learned weight matrix W to obtain an acyclic edge matrix
thresholded_output = notears.utils.threshold_output(output_dict['W'])
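As a slightly fuller sketch, data can be any NumPy array with one row per sample and one column per variable. The toy three-variable chain below is purely illustrative (the coefficients and variable names are assumptions, not part of the package); the package's own simulation utilities can be used instead.

import numpy as np
import notears

# Toy linear-Gaussian data for a chain X0 -> X1 -> X2 (illustrative only).
np.random.seed(0)
n = 1000
X0 = np.random.normal(size=n)
X1 = 2.0 * X0 + np.random.normal(size=n)
X2 = -1.5 * X1 + np.random.normal(size=n)
data = np.column_stack([X0, X1, X2])

output_dict = notears.run(notears.notears_standard, data,
                          notears.loss.least_squares_loss,
                          notears.loss.least_squares_loss_grad)
W_est = notears.utils.threshold_output(output_dict['W'])
print(W_est)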

Parameters

notears.run

notears.run(variant, data, loss, loss_grad, c=0.25, r=10.0, e=1e-8, rnd_W_init=False, output_all_progress=False, verbose=False)

Calling this function returns a dictionary of the form {'h': h(W), 'loss': loss(W, data), 'W': W}, unless output_all_progress is True, in which case it returns a list of such dictionaries.
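For example, with output_all_progress=True the intermediate results can be inspected as below. This is a sketch reusing data and the calls from the usage example above, and assumes each entry carries the same 'h', 'loss', and 'W' keys.

progress = notears.run(notears.notears_standard, data,
                       notears.loss.least_squares_loss,
                       notears.loss.least_squares_loss_grad,
                       output_all_progress=True)

# Inspect how the acyclicity measure h(W) and the loss evolve.
for i, step in enumerate(progress):
    print(i, step['h'], step['loss'])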

Utilities

Some useful utilities are provided in notears/utils.py, and can be accessed from notears.utils.
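For instance, the thresholded matrix can be checked for acyclicity with networkx (already a dependency). This is a sketch assuming thresholded_output is a (d x d) NumPy array of edge weights and a networkx version of 2.0 or later.

import networkx as nx

# Build a directed graph from the thresholded weight matrix and confirm it is a DAG.
G = nx.from_numpy_array(thresholded_output, create_using=nx.DiGraph)
assert nx.is_directed_acyclic_graph(G)
print(list(G.edges()))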