fix spacing in checkpointing docs (#690)
Summary:
Pull Request resolved: #690

The example for loading the best checkpoint wasn't displayed because the code block directive was missing the blank line before its content.

Reviewed By: gunchu, williamhufb

Differential Revision: D53008623

fbshipit-source-id: 3a88d67767d292883f28c1f87d0a860aa1f08cc7
JKSenthil authored and facebook-github-bot committed Jan 25, 2024
1 parent 67bcc82 commit 466a0cd
Showing 1 changed file with 1 addition and 0 deletions.
docs/source/checkpointing.rst
@@ -119,6 +119,7 @@ By specifying the monitored metric to be "train_loss", the checkpointer will exp
 Later on, the best checkpoint can be loaded via
 
 .. code-block:: python
+
     TorchSnapshotSaver.restore_from_best(your_dirpath_here, unit, metric_name="train_loss", mode="min")
 
 If you'd like to monitor a validation metric (say validation loss after each eval epoch during :py:func:`~torchtnt.framework.fit.fit`), you can use the `save_every_n_eval_epochs` flag instead, like so
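
(For context: the docs' own example following "like so" is truncated in this diff view, so it is not reproduced here. Below is a separate, hedged sketch of how the two pieces referenced above fit together. The import path, the positional `dirpath` argument, and the callback wiring are assumptions about the torchtnt API; only `restore_from_best(...)` and the `save_every_n_eval_epochs` flag come from the diff itself.)

.. code-block:: python

    # Sketch only: import path and constructor arguments are assumptions,
    # not taken from this diff.
    from torchtnt.framework.callbacks import TorchSnapshotSaver

    saver = TorchSnapshotSaver(
        "your_dirpath_here",           # same directory later passed to restore_from_best
        save_every_n_eval_epochs=1,    # flag mentioned in the docs text above
    )

    # ... run fit(unit, ...) with `saver` in the callbacks list, then later:
    TorchSnapshotSaver.restore_from_best(
        "your_dirpath_here", unit, metric_name="train_loss", mode="min"
    )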
