[Bug]: How to set max epochs parameter ? #1962
Answered by blaz-r on Apr 9, 2024 · asked by shanmugamani1023 in Q&A
Replies: 1 comment 3 replies
You could set it when you instantiate the model and engine:

```python
model = EfficientAd()
engine = Engine(task="classification", max_epochs=1000)
engine.train(datamodule=datamodule, model=model)
```
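For context, the `max_epochs` keyword passed to `Engine` is forwarded to the underlying trainer. A minimal standalone sketch of that forwarding pattern (the `Engine` and `Trainer` classes below are simplified stand-ins for illustration, not anomalib's actual internals):

```python
# Simplified stand-ins illustrating how keyword arguments such as
# max_epochs flow from an Engine to the trainer it wraps.
class Trainer:
    def __init__(self, max_epochs=None, **kwargs):
        self.max_epochs = max_epochs


class Engine:
    def __init__(self, task="classification", **trainer_kwargs):
        self.task = task
        # Keyword arguments not consumed by the Engine are passed
        # through to the Trainer.
        self.trainer = Trainer(**trainer_kwargs)


engine = Engine(task="classification", max_epochs=1000)
print(engine.trainer.max_epochs)  # 1000
```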
But I don't know how to set a custom epoch value: setting that `max_epochs` parameter in `anomalib\configs\model\efficient_ad.yaml` didn't work. How do I change that?
3 replies
Patchcore isn't trained using backpropagation, so it runs only a single epoch in all cases. This is expected. If you want to learn more, check the Patchcore readme.
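Mechanically, this single-epoch behavior can be understood as the model's own trainer arguments taking precedence over whatever the user passes to the engine. A hedged sketch of that pattern (the `trainer_arguments` attribute and merging logic below are illustrative, not necessarily anomalib's exact API):

```python
# Illustrative sketch: a model's own trainer arguments override the
# user-supplied engine arguments, pinning Patchcore to one epoch.
class Patchcore:
    # Memory-bank models fit in a single pass over the data; there is
    # no backpropagation, so extra epochs would change nothing.
    trainer_arguments = {"max_epochs": 1}


class Engine:
    def __init__(self, **trainer_kwargs):
        self.trainer_kwargs = trainer_kwargs

    def effective_trainer_args(self, model):
        # Model-specific arguments win over user-provided ones.
        return {**self.trainer_kwargs, **model.trainer_arguments}


engine = Engine(max_epochs=1000)
print(engine.effective_trainer_args(Patchcore()))  # {'max_epochs': 1}
```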