Home

Awesome

neural-fortran

A parallel framework for deep learning. Read the paper here.

Features

Available layers

| Layer type | Constructor name | Supported input layers | Rank of output array | Forward pass | Backward pass |
|------------|------------------|------------------------|----------------------|--------------|---------------|
| Input | `input` | n/a | 1, 3 | n/a | n/a |
| Dense (fully-connected) | `dense` | `input1d`, `flatten` | 1 | ✅ | ✅ |
| Convolutional (2-d) | `conv2d` | `input3d`, `conv2d`, `maxpool2d`, `reshape` | 3 | ✅ | ✅(*) |
| Max-pooling (2-d) | `maxpool2d` | `input3d`, `conv2d`, `maxpool2d`, `reshape` | 3 | ✅ | ✅ |
| Flatten | `flatten` | `input3d`, `conv2d`, `maxpool2d`, `reshape` | 1 | ✅ | ✅ |
| Reshape (1-d to 3-d) | `reshape` | `input1d`, `dense`, `flatten` | 3 | ✅ | ✅ |

(*) See Issue #145 regarding non-converging CNN training on the MNIST dataset.
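As a sketch of how these layer constructors compose, a small dense network can be declared from the `nf` module (layer sizes here are illustrative, not prescriptive):

```fortran
program dense_sketch
  use nf, only: network, input, dense
  implicit none
  type(network) :: net

  ! A dense layer accepts input1d or flatten output, per the table above.
  net = network([ &
    input(784), &  ! 1-d input layer (rank-1 output)
    dense(64), &   ! hidden fully-connected layer
    dense(10) &    ! output layer
  ])
end program dense_sketch
```

The same pattern extends to convolutional networks by chaining `input`, `conv2d`, `maxpool2d`, and `flatten` according to the supported-input column of the table.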

Getting started

Get the code:

git clone https://github.com/modern-fortran/neural-fortran
cd neural-fortran

Dependencies

Required dependencies are:

- A Fortran compiler
- fpm or CMake to build the library, tests, and examples

Optional dependencies are:

- OpenCoarrays (for parallel execution with GFortran)
- BLAS or MKL (for external, optimized matmul in dense layers)
- curl (to download the MNIST dataset used by the examples)
- FORD (to build the API documentation)

Compilers tested include:

- gfortran
- ifort (Intel Fortran)

Building with fpm

Building in serial mode

With gfortran, the following will create an optimized build of neural-fortran:

fpm build --profile release

Building in parallel mode

If you use GFortran and want to run neural-fortran in parallel, you must first install OpenCoarrays. Once installed, use the compiler wrappers caf and cafrun to build and execute in parallel, respectively:

fpm build --compiler caf --profile release --flag "-cpp -DPARALLEL"

Testing with fpm

fpm test --profile release

For the time being, you need to pass the same compiler flags to fpm test as you did to fpm build, so that fpm knows it should use the same build profile.

See the Fortran Package Manager for more info on fpm.
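If your own application builds with fpm, neural-fortran can also be pulled in as an fpm dependency; a typical fpm.toml entry (using the same repository URL as the clone command above) might look like:

```toml
[dependencies]
neural-fortran = { git = "https://github.com/modern-fortran/neural-fortran" }
```

fpm will then fetch and build neural-fortran automatically when you build your project.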

Building with CMake

Building in serial mode

mkdir build
cd build
cmake ..
make

Tests and examples will be built in the bin/ directory.

Building in parallel mode

If you use GFortran and want to run neural-fortran in parallel, you must first install OpenCoarrays. Once installed, use the compiler wrappers caf and cafrun to build and execute in parallel, respectively:

FC=caf cmake .. -DPARALLEL
make
cafrun -n 4 bin/mnist # run MNIST example on 4 cores

Building with a different compiler

If you want to build with a different compiler, such as Intel Fortran, specify FC when issuing cmake:

FC=ifort cmake .. -DPARALLEL

for a parallel build of neural-fortran, or

FC=ifort cmake ..

for a serial build.

Building with BLAS or MKL

To use an external BLAS or MKL library for matmul calls, run cmake like this:

cmake .. -DBLAS=-lblas

where the value of -DBLAS should point to the desired BLAS implementation, which has to be available in the linking path. This option is currently available only with gfortran.

Building in debug mode

To build with debugging flags enabled, type:

cmake .. -DCMAKE_BUILD_TYPE=debug

Running tests with CMake

Type:

ctest

to run the tests.

Using neural-fortran in your project

You can use the CMake module available here to find or fetch an installation of this project when configuring your own project. This module ensures that the neural-fortran::neural-fortran target is always generated, regardless of how neural-fortran is included in the project.

First, either copy Findneural-fortran.cmake into your project's cmake directory and include it in your CMakeLists.txt file:

list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake")

or use the CMAKE_MODULE_PATH variable to point to the directory where it is installed.

Next you need to set neural-fortran_ROOT_DIR to the directory where neural-fortran is installed such that neural-fortran_ROOT_DIR/lib/libneural-fortran.a exists.

Then add the following to your project's CMakeLists.txt file:

if(NOT TARGET neural-fortran::neural-fortran)
  find_package(neural-fortran REQUIRED)
endif()

and then to use the target in your project:

target_link_libraries(your_target PRIVATE neural-fortran::neural-fortran)
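Putting the pieces above together, a minimal consuming CMakeLists.txt might look like this (the project and target names are placeholders):

```cmake
cmake_minimum_required(VERSION 3.14)
project(my_app LANGUAGES Fortran)

# Make Findneural-fortran.cmake visible, as described above
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake")

if(NOT TARGET neural-fortran::neural-fortran)
  find_package(neural-fortran REQUIRED)
endif()

add_executable(my_app main.f90)
target_link_libraries(my_app PRIVATE neural-fortran::neural-fortran)
```

Configure it with neural-fortran_ROOT_DIR pointing at your installation, e.g. `cmake -Dneural-fortran_ROOT_DIR=/path/to/install ..`.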

Examples

The easiest way to get a sense of how to use neural-fortran is to look at examples, in increasing level of complexity:

  1. simple: Approximating a simple, constant data relationship
  2. sine: Approximating a sine function
  3. dense_mnist: Hand-written digit recognition (MNIST dataset) using a dense (fully-connected) network
  4. cnn_mnist: Training a CNN on the MNIST dataset
  5. get_set_network_params: Getting and setting network parameters (weights and biases).

The examples also show you the extent of the public API that's meant to be used in applications, i.e. anything from the nf module.

The MNIST examples (3 and 4) rely on curl to download the needed datasets, so make sure you have it installed on your system. Most Linux OSs have it out of the box. The dataset will be downloaded only the first time you run the example in any given directory.

If you're using Windows OS or don't have curl for any other reason, download mnist.tar.gz directly and unpack in the directory in which you will run the example program.

API documentation

API documentation can be generated with FORD. Assuming you have FORD installed on your system, run

ford ford.md

from the neural-fortran top-level directory to generate the API documentation in doc/html. Point your browser to doc/html/index.html to read it.

Contributing

This Contributing guide briefly describes the code organization. It may be useful to read if you want to contribute a new feature to neural-fortran.

Acknowledgement

Thanks to all open-source contributors to neural-fortran: awvwgk, ggoyman, ivan-pi, jacobwilliams, jvdp1, jvo203, milancurcic, pirpyn, rouson, rweed, Spnetic-5, and scivision.

Development of convolutional networks and Keras HDF5 adapters in neural-fortran was funded by a contract from NASA Goddard Space Flight Center to the University of Miami. Development of optimizers is supported by the Google Summer of Code 2023 project awarded to Fortran-lang.

<img src="assets/nasa.png" alt="NASA logo"> <img src="assets/gsoc.png" alt="GSoC logo">

Related projects

Impact

Neural-fortran has been used successfully in over a dozen published studies. See all papers that cite it here.