
presto

Parameter Refinement Engine for Smirnoff Training / Optimisation



Train bespoke SMIRNOFF force fields quickly using a machine learning potential (MLP). All valence parameters (bonds, angles, proper torsions, and improper torsions) are trained to MLP energies sampled using molecular dynamics. Please see the documentation.

Warning: This code is experimental and under active development. It is not guaranteed to provide correct results, the documentation and testing are incomplete, and the API may change without notice.

Please note that the MACE-OFF models are released under the Academic Software License, which does not permit commercial use. The default AceFF-2.0 model, however, does permit commercial use, as do Egret-1 and AIMNet-2.

Installation#

After ensuring that you have pixi installed, clone the repository and start a shell with the default environment:

git clone https://github.com/cole-group/presto.git
cd presto
pixi shell
By default, this will create an environment with CUDA 12.9. If your version is older, but >= 12.6 (check with nvidia-smi), then run
pixi shell -e gpu-py313-cuda126

For more information on activating pixi environments, see the documentation.

Usage#

Run with command line arguments:

presto train --parameterisation-settings.smiles "CCC(CC)C(=O)Nc2cc(NC(=O)c1c(Cl)cccc1Cl)ccn2"
then see the bespoke force field at training_iteration_2/bespoke_ff.offxml.
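A SMIRNOFF .offxml file is plain XML, so the trained parameters can be inspected with standard tooling. The sketch below parses a hypothetical minimal OFFXML fragment with Python's stdlib ElementTree; the real bespoke_ff.offxml produced by presto will contain many more parameter sections, and the exact parameter values shown here are made up for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal SMIRNOFF fragment (illustrative values only);
# substitute the contents of training_iteration_2/bespoke_ff.offxml.
OFFXML = """\
<SMIRNOFF version="0.3" aromaticity_model="OEAroModel_MDL">
  <Bonds version="0.4" potential="harmonic">
    <Bond smirks="[#6:1]-[#6:2]" length="1.52 * angstrom"
          k="500.0 * kilocalorie_per_mole / angstrom**2"/>
  </Bonds>
</SMIRNOFF>
"""

root = ET.fromstring(OFFXML)
# Walk every Bond parameter and report its SMIRKS pattern and length.
for bond in root.iter("Bond"):
    print(bond.get("smirks"), bond.get("length"))
# prints: [#6:1]-[#6:2] 1.52 * angstrom
```

The same pattern works for Angles, ProperTorsions, and ImproperTorsions sections, which presto also trains.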

Sensible defaults have been set, but all available options can be viewed with:

presto train --help

Run from a yaml file:

presto write-default-yaml default.yaml
# Modify the yaml to set the desired smiles
presto train-from-yaml default.yaml
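For orientation, an edited yaml might contain a section like the following. The key names here are an assumption inferred from the CLI flag --parameterisation-settings.smiles; check the generated default.yaml for the authoritative layout and the full set of options.

```yaml
# Hypothetical excerpt — key names inferred from the CLI flag
# --parameterisation-settings.smiles; consult default.yaml for
# the actual structure written by presto write-default-yaml.
parameterisation_settings:
  smiles: "CCC(CC)C(=O)Nc2cc(NC(=O)c1c(Cl)cccc1Cl)ccn2"
```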

For more details on the theory and implementation, please see the documentation.

MACE-Model Use#

To use models with the MACE architecture, run

pixi shell -e gpu-py313-cuda129-mace
(or the equivalent CUDA 12.6 version)

Copyright (c) 2025-2026, Finlay Clark, Newcastle University, UK

Copyright (c) 2025-2026, Thomas James Pope, Newcastle University, UK

This package includes models from other projects under the MIT license. See presto/models/LICENSES.md for details.

Acknowledgements#

Early development was completed by Thomas James Pope. Many ideas were taken from Simon Boothroyd's super helpful python-template.