Apply data shuffle in training for mini-batch gradient update #744

Open
junshiguo opened this issue Jan 6, 2021 · 0 comments
For mini-batch gradient updates, we need to shuffle the training data so that the actual training inputs for each iteration are different and random. This should improve model performance given the same number of training iterations.

Applicable models include NN, LR and WDL.
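A minimal sketch of the idea, assuming a NumPy-style feature matrix and label vector (the helper name `minibatch_iter` and its parameters are illustrative, not part of any existing API): reshuffle the index order once per epoch, then slice mini-batches from the permuted indices, so every epoch feeds the model a different random sequence of batches.

```python
import numpy as np

def minibatch_iter(X, y, batch_size, seed=None):
    """Yield shuffled mini-batches for one epoch.

    Permuting the indices afresh each epoch makes the inputs for each
    gradient update different and random, as described above.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))  # fresh random order for this epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]

# Usage: one epoch of mini-batch gradient updates
X = np.arange(20, dtype=float).reshape(10, 2)
y = np.arange(10)
batches = list(minibatch_iter(X, y, batch_size=4, seed=0))
```

Calling the generator again at the start of each epoch (with a different or absent seed) produces a new shuffle, which is the behavior the issue asks for.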
