fancyimpute

A variety of matrix completion and imputation algorithms implemented in Python 3.6.

To install:

pip install fancyimpute

If you run into TensorFlow problems and use Anaconda, you can try to fix them with conda install cudatoolkit.

Important Caveats

(1) This project is in "bare maintenance" mode. That means we are not planning on adding more imputation algorithms or features (but might if we get inspired). Please do report bugs, and we'll try to fix them. Also, we are happy to take pull requests for more algorithms and/or features.

(2) IterativeImputer started its life as a fancyimpute original, but was then merged into scikit-learn and we deleted it from fancyimpute in favor of the better-tested sklearn version. As a convenience, you can still from fancyimpute import IterativeImputer, but under the hood it's just doing from sklearn.impute import IterativeImputer. That means if you update scikit-learn in the future, you may also change the behavior of IterativeImputer.
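
For example, the following behaves the same whether IterativeImputer comes from fancyimpute or directly from scikit-learn. This is a minimal sketch; the toy matrix is made up purely for illustration.

import numpy as np
from fancyimpute import IterativeImputer  # under the hood: sklearn.impute.IterativeImputer

# toy matrix with a few missing entries marked as NaN (illustrative values only)
X_incomplete = np.array([
    [1.0, 2.0, np.nan],
    [3.0, 4.0, 5.0],
    [6.0, np.nan, 9.0],
    [8.0, 8.0, 12.0],
])

# fit_transform returns a copy with the NaNs replaced by model-based estimates
X_filled = IterativeImputer().fit_transform(X_incomplete)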

Usage

from fancyimpute import KNN, NuclearNormMinimization, SoftImpute, BiScaler

# X is the complete data matrix
# X_incomplete has the same values as X except that a subset has been replaced with NaN

# Use the 3 nearest rows that have a value for each feature to fill in each row's missing features
X_filled_knn = KNN(k=3).fit_transform(X_incomplete)

# Matrix completion using convex optimization to find a low-rank solution
# that still matches the observed values. Slow!
X_filled_nnm = NuclearNormMinimization().fit_transform(X_incomplete)

# Instead of solving the nuclear norm objective directly, induce
# sparsity using singular value thresholding
X_incomplete_normalized = BiScaler().fit_transform(X_incomplete)
X_filled_softimpute = SoftImpute().fit_transform(X_incomplete_normalized)

# Print the mean squared error for each of the imputation methods above
nnm_mse = ((X_filled_nnm[missing_mask] - X[missing_mask]) ** 2).mean()
print("Nuclear norm minimization MSE: %f" % nnm_mse)

softImpute_mse = ((X_filled_softimpute[missing_mask] - X[missing_mask]) ** 2).mean()
print("SoftImpute MSE: %f" % softImpute_mse)

knn_mse = ((X_filled_knn[missing_mask] - X[missing_mask]) ** 2).mean()
print("knnImpute MSE: %f" % knn_mse)

Algorithms

The methods used in the examples above include:

KNN: fill in each row's missing features using values from its nearest rows (k=3 above).
NuclearNormMinimization: matrix completion via convex optimization, finding a low-rank solution that still matches the observed values; slow.
SoftImpute: induce sparsity via singular value thresholding instead of solving the nuclear norm objective directly; run above on a BiScaler-normalized matrix.
BiScaler: normalizes an incomplete matrix before imputation; used above together with SoftImpute.
IterativeImputer: now imported from scikit-learn; see the caveat above.
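
Because these imputers share the same scikit-learn-style fit_transform interface, they are easy to compare side by side. A minimal sketch, assuming X, X_incomplete, and missing_mask are set up as in the Usage section above:

from fancyimpute import KNN, NuclearNormMinimization, SoftImpute

# Run several imputers over the same incomplete matrix and report their MSE
# on the hidden entries. Note: SoftImpute is applied here to the raw matrix,
# whereas the Usage example normalizes with BiScaler first.
for imputer in [KNN(k=3), NuclearNormMinimization(), SoftImpute()]:
    X_filled = imputer.fit_transform(X_incomplete)
    mse = ((X_filled[missing_mask] - X[missing_mask]) ** 2).mean()
    print("%s MSE: %f" % (type(imputer).__name__, mse))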

Citation

If you use fancyimpute in your academic publication, please cite it as follows:

@software{fancyimpute,
  author = {Alex Rubinsteyn and Sergey Feldman},
  title = {fancyimpute: An Imputation Library for Python},
  url = {https://github.com/iskandr/fancyimpute},
  version = {0.7.0},
  date = {2016},
}