DeepSphere: a spherical convolutional neural network

Nathanaël Perraudin, Michaël Defferrard, Tomasz Kacprzak, Raphael Sgier

The code in this repository implements a generalization of Convolutional Neural Networks (CNNs) to the sphere. We model the discretised sphere as a graph of connected pixels. The resulting convolution is more efficient (especially when the data do not span the whole sphere) and mostly equivariant to rotation (small distortions arise because no regular sampling of the sphere exists). The pooling strategy exploits a hierarchical pixelisation of the sphere (HEALPix) to analyse the data at multiple scales. The graph neural network model is based on ChebNet and its TensorFlow implementation. The performance of DeepSphere is demonstrated on a discrimination problem: the classification of convergence maps into two cosmological model classes.
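
To make the pixel-graph idea concrete, here is a minimal sketch, not the code of this repository (the actual graph additionally weights the edges): it builds an unweighted pixel graph from the HEALPix neighbour structure with healpy and SciPy, and pools a map to the next coarser resolution by averaging the four children of each coarse pixel in NESTED ordering. The resolution nside = 16 is an arbitrary choice for illustration.

    # Minimal sketch (not the repository's code): build a pixel graph from the
    # HEALPix neighbour structure and pool a map to a coarser resolution.
    import healpy as hp
    import numpy as np
    from scipy import sparse

    nside = 16                                   # HEALPix resolution parameter
    npix = hp.nside2npix(nside)                  # 12 * nside**2 pixels

    # Connect each pixel to its (up to) 8 HEALPix neighbours.
    neighbours = hp.get_all_neighbours(nside, np.arange(npix), nest=True)  # shape (8, npix)
    rows = np.repeat(np.arange(npix), 8)
    cols = neighbours.T.ravel()
    keep = cols >= 0                             # -1 marks a missing neighbour
    adjacency = sparse.coo_matrix(
        (np.ones(keep.sum()), (rows[keep], cols[keep])), shape=(npix, npix)).tocsr()

    # Hierarchical pooling: in NESTED ordering the 4 children of a coarser pixel
    # are contiguous, so pooling is a reshape followed by a reduction.
    signal = np.random.randn(npix)               # a spherical map, e.g. a convergence map
    pooled = signal.reshape(-1, 4).mean(axis=1)  # the same map at nside // 2

The hierarchical structure of HEALPix is what makes this pooling by factors of 4 essentially free.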

Resources

Code:

- TensorFlow 1 implementation (this repository): https://github.com/deepsphere/deepsphere-cosmo-tf1

Papers:

- DeepSphere: Efficient spherical Convolutional Neural Network with HEALPix sampling for cosmological applications. https://arxiv.org/abs/1810.12186
- DeepSphere: towards an equivariant graph-based spherical CNN. https://arxiv.org/abs/1904.05146
- DeepSphere: a graph-based spherical CNN. https://openreview.net/forum?id=B1e3OlStPB

Installation

Click the Binder badge to play with the notebooks from your browser without installing anything.

For a local installation, follow the instructions below.

  1. Clone this repository.

    git clone https://github.com/deepsphere/deepsphere-cosmo-tf1.git
    cd deepsphere-cosmo-tf1
    
  2. Install the dependencies.

    pip install -r requirements.txt
    

    Note: if you will be working with a GPU, comment the tensorflow==1.6.0 line in requirements.txt and uncomment the tensorflow-gpu==1.6.0 line.

    Note: the code has been developed and tested with Python 3.5 and 3.6. It should work on Python 2.7 with requirements_py27.txt. Please send a PR if you encounter an issue.

  3. Play with the Jupyter notebooks.

    jupyter notebook
    

Notebooks

The notebooks below contain examples and experiments to play with the model. Look at the first three if you want to use the model with your own data.

  1. Classification of data on the whole sphere. The easiest example to run if you want to play with the model.
  2. Regression from multiple spherical maps. Shows how to regress (multiple) parameters from (multiple) input channels.
  3. Classification of data from part of the sphere with noise. This is the main experiment carried out in the paper (with one configuration of resolution and noise). It requires private data; see below.
  4. Spherical convolution using a graph to represent the sphere. Learn what convolution on a graph is and how it works on the sphere. Useful for understanding how the model works (a minimal sketch of the idea follows this list).
  5. Comparison of the spherical harmonics with the eigenvectors of the graph Laplacian. Get a taste of the difference between the graph representation and the analytic sphere.
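
As a taste of notebook 4, here is a minimal NumPy/SciPy sketch of the ChebNet-style convolution that DeepSphere builds on: a map is filtered with a Chebyshev polynomial of the graph Laplacian of the pixel graph. This is illustrative only, not the repository's TensorFlow implementation; the helper names healpix_laplacian and chebyshev_filter, the unweighted graph, and the example coefficients are our simplifications.

    # Minimal sketch of a ChebNet-style convolution on the sphere graph
    # (NumPy/SciPy only, not the repository's TensorFlow implementation).
    import healpy as hp
    import numpy as np
    from scipy import sparse

    def healpix_laplacian(nside):
        """Normalized graph Laplacian of the (unweighted) HEALPix pixel graph."""
        npix = hp.nside2npix(nside)
        neighbours = hp.get_all_neighbours(nside, np.arange(npix), nest=True)
        rows = np.repeat(np.arange(npix), 8)
        cols = neighbours.T.ravel()
        keep = cols >= 0                       # drop missing neighbours (-1)
        adjacency = sparse.coo_matrix(
            (np.ones(keep.sum()), (rows[keep], cols[keep])), shape=(npix, npix)).tocsr()
        degrees = np.asarray(adjacency.sum(axis=1)).squeeze()
        d_inv_sqrt = sparse.diags(1 / np.sqrt(degrees))
        return sparse.identity(npix) - d_inv_sqrt @ adjacency @ d_inv_sqrt

    def chebyshev_filter(laplacian, signal, coefficients, lmax=2.0):
        """Filter a graph signal with a Chebyshev polynomial of the Laplacian."""
        L = laplacian * (2 / lmax) - sparse.identity(laplacian.shape[0])  # spectrum in ~[-1, 1]
        x0, x1 = signal, L @ signal            # T_0(L) x and T_1(L) x
        result = coefficients[0] * x0 + coefficients[1] * x1
        for c in coefficients[2:]:             # recurrence T_k = 2 L T_{k-1} - T_{k-2}
            x0, x1 = x1, 2 * (L @ x1) - x0
            result = result + c * x1
        return result

    laplacian = healpix_laplacian(nside=16)
    filtered = chebyshev_filter(laplacian, np.random.randn(laplacian.shape[0]),
                                coefficients=[0.5, 0.3, 0.2])

Because the filter is a polynomial of order K in the Laplacian, each output pixel depends only on pixels at most K hops away, which keeps the operation local and cheap (sparse matrix-vector products).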

Reproducing the results of the paper

Follow the steps below to reproduce the paper's results. While the instructions are simple, the experiments will take a while.

  1. Get the main dataset. You need to ask the ETHZ cosmology research group for a copy of the data. The simplest option is to request access on Zenodo. You will have to write a description of the project for which the dataset is intended to be used.

  2. Preprocess the dataset.

    python data_preprocess.py
    
  3. Run the experiments. The first script uses the fully convolutional variant of DeepSphere, the second the classic CNN variant, and the third a standard 2D CNN (the spherical maps are projected onto the plane). The last two are the baselines: SVMs that classify histograms and power spectral densities, respectively.

    python experiments_deepsphere.py FCN
    python experiments_deepsphere.py CNN
    python experiments_2dcnn.py
    python experiments_histogram.py
    python experiments_psd.py
    

The results will be saved in the results folder. Note that results may vary from one run to another. You may want to check the summaries with TensorBoard to verify that training converges. For some experiments, the network needs a large number of epochs to stabilize.

The experiments_deepsphere.py, experiments_2dcnn.py, and experiments_psd.py scripts can be executed in parallel in an HPC setting. You can adapt the launch_cscs.py, launch_cscs_2dcnn.py, and launch_euler.py scripts to your particular setting.

Once the results are computed (or using those stored in the repository), you can reproduce the paper's figures with the figure* notebooks. The results will be saved in the figures folder. You can also look at the original figures stored in the outputs branch.

Experimental

The experimental folder contains unfinished, untested, and buggy code. We leave it as is for our own future reference, and for the extra curious. :wink:

License & citation

The content of this repository is released under the terms of the MIT license.
Please consider citing our papers if you find them useful.

@article{deepsphere_cosmo,
  title = {{DeepSphere}: Efficient spherical Convolutional Neural Network with {HEALPix} sampling for cosmological applications},
  author = {Perraudin, Nathana\"el and Defferrard, Micha\"el and Kacprzak, Tomasz and Sgier, Raphael},
  journal = {Astronomy and Computing},
  volume = {27},
  pages = {130-146},
  year = {2019},
  month = apr,
  publisher = {Elsevier BV},
  issn = {2213-1337},
  doi = {10.1016/j.ascom.2019.03.004},
  archiveprefix = {arXiv},
  eprint = {1810.12186},
  url = {https://arxiv.org/abs/1810.12186},
}
@inproceedings{deepsphere_rlgm,
  title = {{DeepSphere}: towards an equivariant graph-based spherical {CNN}},
  author = {Defferrard, Micha\"el and Perraudin, Nathana\"el and Kacprzak, Tomasz and Sgier, Raphael},
  booktitle = {ICLR Workshop on Representation Learning on Graphs and Manifolds},
  year = {2019},
  archiveprefix = {arXiv},
  eprint = {1904.05146},
  url = {https://arxiv.org/abs/1904.05146},
}
@inproceedings{deepsphere_iclr,
  title = {{DeepSphere}: a graph-based spherical {CNN}},
  author = {Defferrard, Micha\"el and Milani, Martino and Gusset, Fr\'ed\'erick and Perraudin, Nathana\"el},
  booktitle = {International Conference on Learning Representations (ICLR)},
  year = {2020},
  url = {https://openreview.net/forum?id=B1e3OlStPB},
}