[FIX] sgd: Change deprecated n_iter to max_iter #2920

Merged (6 commits) on Feb 23, 2018

Changes from 1 commit

Orange/classification/sgd.py (4 additions, 4 deletions)

@@ -14,9 +14,9 @@ class SGDClassificationLearner(SklLearner):
     preprocessors = SklLearner.preprocessors + [Normalize()]

     def __init__(self, loss='hinge', penalty='l2', alpha=0.0001,
-                 l1_ratio=0.15,fit_intercept=True, n_iter=5, shuffle=True,
-                 epsilon=0.1, random_state=None, learning_rate='invscaling',
-                 eta0=0.01, power_t=0.25, warm_start=False, average=False,
-                 preprocessors=None):
+                 l1_ratio=0.15,fit_intercept=True, max_iter=5,
+                 tol=None, shuffle=True, epsilon=0.1, random_state=None,
+                 learning_rate='invscaling', eta0=0.01, power_t=0.25,
+                 warm_start=False, average=False, preprocessors=None):
         super().__init__(preprocessors=preprocessors)
         self.params = vars()
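
For orientation, a minimal usage sketch of the updated learner signature; the dataset name and the tol=None choice are illustrative and not part of this PR:

    from Orange.data import Table
    from Orange.classification.sgd import SGDClassificationLearner

    data = Table("iris")  # any classification dataset works here
    # max_iter and tol replace the deprecated n_iter; tol=None keeps the old
    # behaviour of running exactly max_iter passes over the data.
    learner = SGDClassificationLearner(max_iter=5, tol=None)
    model = learner(data)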

Orange/widgets/model/owsgd.py (4 additions, 4 deletions)

@@ -76,7 +76,7 @@ class Outputs(OWBaseLearner.Outputs):
     learning_rate_index = Setting(0)
     eta0 = Setting(.01)
     power_t = Setting(.25)
-    n_iter = Setting(5)
+    max_iter = Setting(10)

Contributor:
Default in sklearn is 5. Is there a particular reason to have it at 10?

Contributor (author):
No particular reason. Sklearn will change it to 1000 starting with 0.21. I set it to 10 only to check that I had changed everything correctly (testing the widget in Orange).
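
For context, the corresponding scikit-learn call looks like the sketch below; max_iter and tol were introduced (and n_iter deprecated) in scikit-learn 0.19, with the defaults changing in 0.21 as noted above:

    from sklearn.linear_model import SGDClassifier

    # n_iter is deprecated; max_iter plus tol replace it. tol=None keeps the
    # pre-0.21 behaviour of running exactly max_iter epochs.
    clf = SGDClassifier(loss='hinge', max_iter=5, tol=None)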

Collaborator:
Please implement a migrate_settings method. Without it, old workflows won't carry their old n_iter value over to max_iter, even though the two settings have essentially the same semantics.

Contributor (author):
I don't know how to do it.
Is this correct (added to the OWSGD class)?

    @classmethod
    def migrate_settings(cls, settings_, version):
        if version < 2:
            settings_["max_iter"] = settings_.get("n_iter", 5)

Collaborator (@pavlin-policar, Feb 21, 2018):
That should do the trick. Perhaps also delete the n_iter setting to clean up the old settings? You can do this in one line: `settings_["max_iter"] = settings_.pop("n_iter", 5)`.
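
Putting the two suggestions together, here is a sketch of how the migration could sit in OWSGD; the settings_version bump to 2 and the None check for unversioned workflows are assumptions, not lines taken from the merged PR:

    class OWSGD(OWBaseLearner):
        # ... settings as in the diff above ...

        # Assumed: bump the settings schema version so the migration runs
        # when an older workflow is loaded.
        settings_version = 2

        @classmethod
        def migrate_settings(cls, settings_, version):
            # Workflows saved before versioning may report version=None.
            if version is None or version < 2:
                # Reuse the stored n_iter value as max_iter and drop the stale key.
                settings_["max_iter"] = settings_.pop("n_iter", 5)

Applied to an old workflow's settings, {"n_iter": 7} becomes {"max_iter": 7} with no leftover n_iter entry.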


     def add_main_layout(self):
         self._add_algorithm_to_layout()

@@ -150,8 +150,8 @@ def _add_learning_params_to_layout(self):
             callback=self.settings_changed)
         gui.separator(box, height=12)

-        self.n_iter_spin = gui.spin(
-            box, self, 'n_iter', 1, MAXINT - 1, label='Number of iterations: ',
+        self.max_iter_spin = gui.spin(
+            box, self, 'max_iter', 1, MAXINT - 1, label='Number of iterations: ',
             controlWidth=80, alignment=Qt.AlignRight,
             callback=self.settings_changed)
         # Wrap shuffle_cbx inside another hbox to align it with the random_seed

@@ -246,7 +246,7 @@ def create_learner(self):
             learning_rate=self.learning_rates[self.learning_rate_index][1],
             eta0=self.eta0,
             power_t=self.power_t,
-            n_iter=self.n_iter,
+            max_iter=self.max_iter,
             preprocessors=self.preprocessors,
             **params)
