gradient_descent_step

torch_sim.optimizers.gradient_descent.gradient_descent_step(state, model, *, pos_lr=0.01, cell_lr=0.1)

Perform one gradient descent optimization step.

Updates atomic positions and, when the state carries a cell filter (a CellOptimState), the cell parameters as well.

Parameters:
  • state (OptimState | CellOptimState) – Current optimization state

  • model (ModelInterface) – Model that computes energies, forces, and optionally stress

  • pos_lr (float | Tensor) – Learning rate(s) for atomic positions

  • cell_lr (float | Tensor) – Learning rate(s) for cell optimization (ignored if the state has no cell filter)

Returns:

Updated state after one optimization step (same type as the input state)

Return type:

OptimState | CellOptimState
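
Example:

A minimal usage sketch. Only gradient_descent_step and the parameters documented above come from this page; the model object and the initial state are hypothetical placeholders for whatever ModelInterface implementation and OptimState (or CellOptimState) constructor your project provides.

    from torch_sim.optimizers.gradient_descent import gradient_descent_step

    model = my_model          # hypothetical ModelInterface implementation
    state = my_initial_state  # hypothetical OptimState or CellOptimState

    for _ in range(100):
        # One gradient descent step: positions move along the forces scaled by
        # pos_lr; cell degrees of freedom (present only when the state carries
        # a cell filter) are updated using cell_lr.
        state = gradient_descent_step(state, model, pos_lr=0.01, cell_lr=0.1)

Because pos_lr and cell_lr also accept Tensors, per-atom or per-component learning rates can be passed instead of scalars when the two sets of degrees of freedom need different step sizes.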