Tutorials

A series of Jupyter notebooks has been written to demonstrate the capabilities of MatGL. The following is a recommended sequence; the pages are nicely formatted versions of the notebooks, autogenerated via nbconvert for easier reading.

Brief note on running with GPUs

All example notebooks are written to run on the CPU so that they can be run and tested easily on most machines. If GPU use is desired, e.g., for faster training, please refer to PyTorch's documentation on CUDA semantics. Briefly, set the default device using either the torch.device("cuda") context manager or torch.set_default_device("cuda"):

import torch

with torch.device("cuda"):
    ...  # Rest of your code

or

import torch

torch.set_default_device("cuda")
# Rest of your code

Basic Usage

This series of notebooks demonstrates how to load and use the pretrained models for property predictions.

  1. Property Predictions using MEGNet or M3GNet Models
  2. Relaxations and Simulations using the M3GNet Universal Potential
  3. Combining the M3GNet Universal Potential with Property Prediction Models
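As a taste of what the first notebook covers, here is a minimal sketch of loading a pretrained model and predicting a property for a simple structure. The model name "MEGNet-MP-2018.6.1-Eform" is assumed to be one of the pretrained models bundled with your MatGL release; the exact names available may differ between versions.

import matgl
from pymatgen.core import Lattice, Structure

# Build a simple CsCl structure with pymatgen (a ≈ 4.14 Å).
struct = Structure(Lattice.cubic(4.14), ["Cs", "Cl"], [[0, 0, 0], [0.5, 0.5, 0.5]])

# Load a pretrained MEGNet formation energy model and predict E_form in eV/atom.
# "MEGNet-MP-2018.6.1-Eform" is an assumed model name; check your release for the
# list of available pretrained models.
model = matgl.load_model("MEGNet-MP-2018.6.1-Eform")
eform = model.predict_structure(struct)
print(f"Predicted formation energy: {float(eform):.3f} eV/atom")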

Large-scale benchmarking

  1. Benchmarking M3GNet Predictions of Cubic Lattice Parameters
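The benchmarking notebook relaxes a large set of cubic crystals with the M3GNet universal potential and compares the relaxed lattice parameters against reference values. The sketch below shows the core step for a single structure; the model name "M3GNet-MP-2021.2.8-PES" and the Relaxer defaults (which relax the cell as well as the atomic positions) are assumptions based on the MatGL release current at the time of writing and may differ in yours.

import matgl
from matgl.ext.ase import Relaxer
from pymatgen.core import Lattice, Structure

# CsCl with a deliberately stretched cubic lattice parameter.
struct = Structure(Lattice.cubic(4.5), ["Cs", "Cl"], [[0, 0, 0], [0.5, 0.5, 0.5]])

# Load the M3GNet universal potential and relax the cell and atomic positions.
# "M3GNet-MP-2021.2.8-PES" is an assumed model name; check your release.
pot = matgl.load_model("M3GNet-MP-2021.2.8-PES")
relaxer = Relaxer(potential=pot)
result = relaxer.relax(struct, fmax=0.01)

final = result["final_structure"]
print(f"Relaxed cubic lattice parameter: {final.lattice.a:.3f} Å")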

Training MatGL models

  1. Training a MEGNet Formation Energy Model
  2. Training an M3GNet Potential
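Both training notebooks are built on PyTorch Lightning. As a rough sketch of the overall pattern only: the data pipeline, which turns structures and labels into graph dataloaders, is where the notebooks spend most of their time and is omitted here, and class names such as ModelLightningModule reflect the MatGL release current at the time of writing and should be checked against your installed version.

import pytorch_lightning as pl

from matgl.models import MEGNet
from matgl.utils.training import ModelLightningModule

# A MEGNet model with its default hyperparameters; the notebook shows how to
# customize the architecture for formation energy prediction.
model = MEGNet()

# Lightning wrapper around the model (loss, optimizer, logging).
lit_module = ModelLightningModule(model=model)

# The trainer is then fitted with MatGL graph dataloaders built from your
# structures and labels, as demonstrated step by step in the notebooks.
trainer = pl.Trainer(max_epochs=20, accelerator="cpu")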
