
Default parameters for model training should be extracted from Persephone #95

Open
shuttle1987 opened this issue Oct 2, 2018 · 3 comments
Labels
refactor Refactoring related

Comments

@shuttle1987
Member

Consider this situation:

def post(modelInfo):
    """Create a new transcription model"""
    current_corpus = DBcorpus.query.get_or_404(modelInfo['corpusID'])

    min_epochs = modelInfo.get('minimumEpochs', 0)
    max_epochs = modelInfo.get('maximumEpochs', None)
    if max_epochs and min_epochs > max_epochs:
        return "minimum number of epochs must be smaller than maximum", 400

    early_stopping_steps = modelInfo.get('earlyStoppingSteps', None)
    num_layers = modelInfo.get('numberLayers', 3)
    hidden_size = modelInfo.get('hiddenSize', 250)
    beam_width = modelInfo.get('beamWidth', 100)
    decoding_merge_repeated = modelInfo.get('decodingMergeRepeated', True)

As you can see, default parameters are provided here when the request doesn't contain them. These currently match the defaults used in Persephone, but the two could easily get out of sync.
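One way to avoid the duplication would be to read the defaults off the Persephone call signatures at runtime rather than restating them. A minimal sketch, assuming the relevant Persephone training callable declares its defaults as keyword arguments (the import path and parameter names in the comments are illustrative, not confirmed):

import inspect

def persephone_defaults(func):
    """Return the keyword default values declared on a callable."""
    return {
        name: param.default
        for name, param in inspect.signature(func).parameters.items()
        if param.default is not inspect.Parameter.empty
    }

# Illustrative only -- the exact Persephone entry point would need checking:
# from persephone.model import Model
# defaults = persephone_defaults(Model.train)
# num_layers = modelInfo.get('numberLayers', defaults.get('num_layers'))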

@shuttle1987 added the refactor (Refactoring related) label on Oct 2, 2018
@shuttle1987
Member Author

PR #94 addresses this somewhat for the model.train calls.

@shuttle1987
Member Author

I can see a case for loading these from a config file too; the hardcoded values here are merely an interim hack to keep other work unblocked.
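For the config-file route, something along these lines could work; the file name and keys below are purely illustrative:

import json

def load_model_defaults(path="model_defaults.json"):
    """Load fallback training parameters from a JSON config file."""
    with open(path) as f:
        return json.load(f)

# model_defaults.json might contain, e.g.:
# {"numberLayers": 3, "hiddenSize": 250, "beamWidth": 100}
#
# defaults = load_model_defaults()
# hidden_size = modelInfo.get('hiddenSize', defaults['hiddenSize'])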

