Hasktorch is a library for tensors and neural networks in Haskell. It is an independent open-source community project that leverages the core C++ libraries shared by PyTorch.
This project is under active development, so expect changes to the library API as it evolves. We invite new users to join our Hasktorch Slack space for questions and discussions. Contributions and pull requests are encouraged.
We are currently developing the second major release of Hasktorch (0.2). Note that the first release, Hasktorch 0.1, on Hackage is outdated and should not be used.
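To give a flavor of the user-facing API, here is a minimal sketch of working with tensors. It assumes the asTensor and sumAll functions and the Num instance for Tensor re-exported by the Torch module in Hasktorch 0.2; see the tutorials below for current idioms.
import Torch

main :: IO ()
main = do
  let x = asTensor ([1, 2, 3] :: [Float])
      y = asTensor ([4, 5, 6] :: [Float])
  print (x + y)          -- element-wise addition via the Num instance for Tensor
  print (sumAll (x * y)) -- dot product: element-wise product, then sum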
Documentation and tutorials are available, including:
- High-level MuniHac talk by @austinvhuang
- Hands-on live-coding demo by @tscholak
- Low-level FFI talk by @junjihashimoto
The following steps will get you started. They assume the hasktorch repository has just been cloned; pick the setup below that matches your platform and toolchain. After setup is done, read the online tutorials and API documents.
- linux+cabal+cpu
- linux+cabal+cuda11
- macos+cabal+cpu
- linux+stack+cpu
- macos+stack+cpu
- nixos+cabal+cpu
- nixos+cabal+cuda11
- docker+jupyterlab+cuda11
linux+cabal+cpu: Starting from the top-level directory of the project, run:
$ pushd deps # Change to the deps directory and save the current directory.
$ ./get-deps.sh # Run the shell script to retrieve the libtorch dependencies.
$ popd # Go back to the root directory of the project.
$ source setenv # Set the shell environment to reference the shared library locations.
$ ./setup-cabal.sh # Create a cabal project file.
To build and test the Hasktorch library, run:
$ cabal build hasktorch # Build the Hasktorch library.
$ cabal test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ cabal build examples # Build the Hasktorch examples.
$ cabal test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ export DEVICE=cpu # Set device to CPU for the MNIST CNN example.
$ cabal run static-mnist-cnn -- ./mnist/ # Run the MNIST CNN example.
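The example reads the DEVICE environment variable to decide where tensors live. Below is a minimal sketch of such device selection; the getDevice helper is hypothetical, and it assumes Device and DeviceType are re-exported by the Torch module.
import System.Environment (lookupEnv)
import Torch (Device (..), DeviceType (..))

-- Hypothetical helper: translate the DEVICE environment variable
-- ("cpu" or "cuda:0") into a Hasktorch Device value.
getDevice :: IO Device
getDevice = do
  mdev <- lookupEnv "DEVICE"
  pure $ case mdev of
    Just "cuda:0" -> Device CUDA 0
    _             -> Device CPU 0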
linux+cabal+cuda11: Starting from the top-level directory of the project, run:
$ pushd deps # Change to the deps directory and save the current directory.
$ ./get-deps.sh -a cu118 # Retrieve the libtorch dependencies built for CUDA 11.8.
$ popd # Go back to the root directory of the project.
$ source setenv # Set the shell environment to reference the shared library locations.
$ ./setup-cabal.sh # Create a cabal project file.
To build and test the Hasktorch library, run:
$ cabal build hasktorch # Build the Hasktorch library.
$ cabal test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ cabal build examples # Build the Hasktorch examples.
$ cabal test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ export DEVICE="cuda:0" # Set device to CUDA for the MNIST CNN example.
$ cabal run static-mnist-cnn -- ./mnist/ # Run the MNIST CNN example.
macos+cabal+cpu: Starting from the top-level directory of the project, run:
$ pushd deps # Change to the deps directory and save the current directory.
$ ./get-deps.sh # Run the shell script to retrieve the libtorch dependencies.
$ popd # Go back to the root directory of the project.
$ source setenv # Set the shell environment to reference the shared library locations.
$ ./setup-cabal.sh # Create a cabal project file.
To build and test the Hasktorch library, run:
$ cabal build hasktorch # Build the Hasktorch library.
$ cabal test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ cabal build examples # Build the Hasktorch examples.
$ cabal test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ export DEVICE=cpu # Set device to CPU for the MNIST CNN example.
$ cabal run static-mnist-cnn -- ./mnist/ # Run the MNIST CNN example.
linux+stack+cpu: Install the Haskell Tool Stack if you haven't already, following the instructions here.
Starting from the top-level directory of the project, run:
$ pushd deps # Change to the deps directory and save the current directory.
$ ./get-deps.sh # Run the shell script to retrieve the libtorch dependencies.
$ popd # Go back to the root directory of the project.
$ source setenv # Set the shell environment to reference the shared library locations.
To build and test the Hasktorch library, run:
$ stack build hasktorch # Build the Hasktorch library.
$ stack test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ stack build examples # Build the Hasktorch examples.
$ stack test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ export DEVICE=cpu # Set device to CPU for the MNIST CNN example.
$ stack run static-mnist-cnn -- ./mnist/ # Run the MNIST CNN example.
macos+stack+cpu: Install the Haskell Tool Stack if you haven't already, following the instructions here.
Starting from the top-level directory of the project, run:
$ pushd deps # Change to the deps directory and save the current directory.
$ ./get-deps.sh # Run the shell script to retrieve the libtorch dependencies.
$ popd # Go back to the root directory of the project.
$ source setenv # Set the shell environment to reference the shared library locations.
To build and test the Hasktorch library, run:
$ stack build hasktorch # Build the Hasktorch library.
$ stack test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ stack build examples # Build the Hasktorch examples.
$ stack test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ export DEVICE=cpu # Set device to CPU for the MNIST CNN example.
$ stack run static-mnist-cnn -- ./mnist/ # Run the MNIST CNN example.
nixos+cabal+cpu: (Optional) Install and set up Cachix:
$ nix-env -iA cachix -f https://cachix.org/api/v1/install # (Optional) Install Cachix.
# (Optional) Use IOHK's cache. See https://input-output-hk.github.io/haskell.nix/tutorials/getting-started/#setting-up-the-binary-cache
$ cachix use hasktorch # (Optional) Use hasktorch's cache.
Starting from the top-level directory of the project, run:
$ nix develop # Enter the nix shell environment for Hasktorch.
To build and test the Hasktorch library, run:
$ cabal build hasktorch # Build the Hasktorch library.
$ cabal test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ cabal build examples # Build the Hasktorch examples.
$ cabal test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ export DEVICE=cpu # Set device to CPU for the MNIST CNN example.
$ cabal run static-mnist-cnn -- ./mnist/ # Run the MNIST CNN example.
nixos+cabal+cuda11: (Optional) Install and set up Cachix:
$ nix-env -iA cachix -f https://cachix.org/api/v1/install # (Optional) Install Cachix.
# (Optional) Use IOHK's cache. See https://input-output-hk.github.io/haskell.nix/tutorials/getting-started/#setting-up-the-binary-cache
$ cachix use hasktorch # (Optional) Use hasktorch's cache.
Starting from the top-level directory of the project, write nix/dev-config.nix (for example with a here-document) and enter the development shell:
$ cat > nix/dev-config.nix << EOF # Enable profiling and CUDA 11 support.
{
  profiling = true;
  cudaSupport = true;
  cudaMajorVersion = "11";
}
EOF
$ nix develop # Enter the nix shell environment for Hasktorch.
To build and test the Hasktorch library, run:
$ cabal build hasktorch # Build the Hasktorch library.
$ cabal test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ cabal build examples # Build the Hasktorch examples.
$ cabal test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ export DEVICE="cuda:0" # Set device to CUDA for the MNIST CNN example.
$ cabal run static-mnist-cnn -- ./mnist/ # Run the MNIST CNN example.
docker+jupyterlab+cuda11: This Docker Hub repository provides a JupyterLab Docker image. It supports CUDA 11, CUDA 10, and CPU-only configurations. To use JupyterLab with Hasktorch, run one of the following commands, then open the URL printed in the console.
$ docker run --gpus all -it --rm -p 8888:8888 htorch/hasktorch-jupyter
or
$ docker run --gpus all -it --rm -p 8888:8888 htorch/hasktorch-jupyter:latest-cu11
In rare cases, you may see errors like
cannot move tensor to "CUDA:0"
although you have CUDA-capable hardware in your machine and have followed the getting-started instructions for CUDA support.
If that happens, check whether /run/opengl-driver/lib exists. If not, make sure your CUDA drivers are installed correctly.
If you have run cabal in a CPU-only Hasktorch Nix shell before, you may need to:
- Clean the dist-newstyle folder using cabal clean.
- Delete the .ghc.environment* file in the Hasktorch root folder.
Otherwise, at best, you will not be able to move tensors to CUDA, and, at worst, you will see weird linker errors like
gcc: error: hasktorch/dist-newstyle/build/x86_64-linux/ghc-8.8.3/libtorch-ffi-1.5.0.0/build/Torch/Internal/Unmanaged/Autograd.dyn_o: No such file or directory
`cc' failed in phase `Linker'. (Exit code: 1)
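To check quickly whether your environment can create CUDA tensors at all, a small sketch like the following can help; it assumes ones', toDevice, and Device are re-exported by the Torch module, as in recent Hasktorch releases.
import Torch

-- Try to move a small tensor to the first CUDA device; if the CUDA setup is
-- broken, this reproduces the "cannot move tensor" error described above.
main :: IO ()
main = print (toDevice (Device CUDA 0) (ones' [2, 2]))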
We welcome new contributors.
Contact us for access to the Hasktorch Slack channel: send an email to hasktorch@gmail.com or reach us on Twitter as @austinvhuang, @SamStites, @tscholak, or @junjihashimoto3.
See the wiki for developer notes.
Basic functionality:
- deps/ - submodules and downloads for build dependencies (libtorch, mklml, pytorch) -- you can ignore this if you are on Nix
- examples/ - high-level example models (xor mlp, typed cnn, etc.)
- experimental/ - experimental projects or tips
- hasktorch/ - higher-level user-facing library; calls into ffi/ and is used by examples/
Internals (for contributing developers):
- codegen/ - code generation; parses the Declarations.yaml spec from pytorch and produces ffi/ contents
- inline-c/ - submodule to inline-cpp fork used for C++ FFI
- libtorch-ffi/ - low-level FFI bindings to libtorch
- spec/ - specification files used for codegen/