
# presto

Parameter Refinement Engine for Smirnoff Training / Optimisation



Warning: ⚠️ There are currently issues with AceFF 2.0, so we recommend using AIMNet2 (`aimnet2` in the settings TOML) as the default MLP for the moment.

Train bespoke SMIRNOFF force fields quickly using a machine learning potential (MLP). All valence parameters (bonds, angles, proper torsions, and improper torsions) are trained to MLP energies sampled using molecular dynamics. Please see the documentation.

Warning: This code is experimental and under active development. It is not guaranteed to provide correct results, the documentation and testing are incomplete, and the API may change without notice.

Please note that the MACE-OFF models are released under the Academic Software License, which does not permit commercial use. However, the default AceFF-2.0 model (as well as Egret-1 and AIMNet-2) does permit commercial use.

## Installation

With pixi installed, clone the repository and start a shell in the project environment:

```shell
git clone https://github.com/cole-group/presto.git
cd presto
pixi shell
```
This will create an environment with CUDA 12.9. Your system must support CUDA >= 12.9 (check with `nvidia-smi`) to use presto; older versions are not usable because presto requires OpenMM 8.5 for the PythonForce class, and OpenMM 8.5 requires CUDA 12.9.

For more information on activating pixi environments, see the documentation.

## Usage

Run with command line arguments:

```shell
presto train --parameterisation-settings.smiles "CCC(CC)C(=O)Nc2cc(NC(=O)c1c(Cl)cccc1Cl)ccn2"
```

Then see the bespoke force field at `training_iteration_2/bespoke_ff.offxml`.

Sensible defaults have been set, but all available options can be viewed with:

```shell
presto train --help
```

Run from a YAML file:

```shell
presto write-default-yaml default.yaml
# Modify the yaml to set the desired SMILES
presto train-from-yaml default.yaml
```
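The "modify the yaml" step above can also be scripted. A minimal sketch, assuming the generated file contains a `smiles:` key (the key name is inferred from the `--parameterisation-settings.smiles` CLI flag and may differ in the real schema):

```python
# Hypothetical sketch: set the SMILES value in a presto default YAML file
# before running `presto train-from-yaml`. Uses a naive regex substitution
# rather than a YAML parser; assumes "smiles:" appears once in the file.
import re


def set_smiles(yaml_text: str, smiles: str) -> str:
    """Replace the value of the first/only `smiles:` key, preserving indentation."""
    return re.sub(r"(?m)^(\s*smiles:).*$", rf"\1 {smiles}", yaml_text)


# Example: a fragment shaped like the assumed default output.
default = "parameterisation_settings:\n  smiles: null\n"
print(set_smiles(default, '"CCO"'))
```

For anything beyond a one-off edit, loading the file with a proper YAML library and writing it back would be more robust than a regex.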

For more details on the theory and implementation, please see the documentation.

Copyright (c) 2025-2026, Finlay Clark, Newcastle University, UK

Copyright (c) 2025-2026, Thomas James Pope, Newcastle University, UK

This package includes models from other projects under the MIT license. See presto/models/LICENSES.md for details.

## Acknowledgements

Early development was completed by Thomas James Pope. Many ideas were taken from Simon Boothroyd's super helpful python-template.