diff --git a/homeworks/homework_2/homework_2.tex b/homeworks/homework_2/homework_2.tex
index 18e6efb..2be0540 100644
--- a/homeworks/homework_2/homework_2.tex
+++ b/homeworks/homework_2/homework_2.tex
@@ -11,7 +11,7 @@
 Homework 2\hfill Draft version due: Friday, April 26, 2024, 8:00pm\\
 \hfill
- Final version due: Wednesday, May 1, 2024, 5:00pm\\
+ Final version due: Wednesday, May 1, 2024, 8:00pm\\
 }
 }
 }
@@ -22,10 +22,10 @@
 \section*{Policies}
 \begin{itemize}
 \item You are free to collaborate on all of the problems, subject to the collaboration policy stated in the syllabus.
- \item Please submit your report as a single .pdf file to Gradescope under ``Homework 1" or ``Homework 1 Corrections".
+ \item Please submit your report as a single .pdf file to Gradescope under ``Homework 2" or ``Homework 2 Corrections".
 \textbf{In the report, include any images generated by your code along with your answers to the questions.}
 For instructions specifically pertaining to the Gradescope submission process, see \url{https://www.gradescope.com/get_started#student-submission}.
- \item Please submit your code as a .zip archive to Gradescope under ``Homework 1 Code'' or ``Homework 1 Code Corrections".
+ \item Please submit your code as a .zip archive to Gradescope under ``Homework 2 Code'' or ``Homework 2 Code Corrections".
 The .zip file should contain your code files. Submit your code either as Jupyter notebook .ipynb files or .py files.
 \end{itemize}
@@ -152,7 +152,7 @@ \section{Stochastic Gradient Descent [36 Points]}
 \begin{problem}[2]
-The closed form solution for linear regression with least squares is \[\mathbf{w} = \left(\sum_{i=1}^N \mathbf{x_i}\mathbf{x_i}^\intercal\right)^{-1}\left(\sum_{i=1}^N \mathbf{x_i}y_i\right).\]
+The closed-form solution for linear regression with least squares is \[\mathbf{w} = \left(\sum_{i=1}^N \mathbf{x_i}\mathbf{x_i}^\intercal\right)^{-1}\left(\sum_{i=1}^N \mathbf{x_i}y_i\right).\]
 Compute this analytical solution. Does the result match up with what you got from SGD?
 \end{problem}
@@ -163,7 +163,7 @@ \section{Stochastic Gradient Descent [36 Points]}
 Answer the remaining questions in 1--2 short sentences.
 \begin{problem}[2]
-Is there any reason to use SGD when a closed form solution exists?
+Is there any reason to use SGD when a closed-form solution exists?
 \end{problem}
 \begin{solution}
@@ -186,7 +186,7 @@ \section{Neural networks vs. boosted decision trees [45 Points]}
 % \materials{lectures 4--6}
-In this problem, you will compare the performance of neural networks and boosted decision trees for binary classfication on a tabular dataset, namely the MiniBooNE dataset: \url{https://archive.ics.uci.edu/ml/datasets/MiniBooNE+particle+identification}.
+In this problem, you will compare the performance of neural networks and boosted decision trees for binary classification on a tabular dataset, namely the MiniBooNE dataset: \url{https://archive.ics.uci.edu/ml/datasets/MiniBooNE+particle+identification}.
 This dataset is taken from the MiniBooNE experiment and is used to distinguish electron neutrinos (signal) from muon neutrinos (background)
 The dataset contains 130,065 samples with 50 features and a single binary label.
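
For reference, the closed-form expression corrected in the third hunk, w = (sum_i x_i x_i^T)^{-1} (sum_i x_i y_i), can be sanity-checked numerically against SGD. Below is a minimal Python sketch on synthetic data; the variable names, learning rate, and epoch count are illustrative assumptions, not taken from the homework's starter code.

import numpy as np

rng = np.random.default_rng(0)
N, d = 1000, 5
X = rng.normal(size=(N, d))              # rows are the x_i
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=N)

# Closed form: solve (sum_i x_i x_i^T) w = sum_i x_i y_i
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# Plain SGD on the squared loss, one sample per step
w_sgd = np.zeros(d)
lr = 0.01
for epoch in range(20):
    for i in rng.permutation(N):
        # gradient of (x_i^T w - y_i)^2 / 2 with respect to w
        grad = (X[i] @ w_sgd - y[i]) * X[i]
        w_sgd -= lr * grad

# The two estimates should roughly agree (SGD retains some noise)
print("closed form:", w_closed)
print("SGD:        ", w_sgd)
print("max abs diff:", np.max(np.abs(w_closed - w_sgd)))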
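Similarly, the neural-network vs. boosted-tree comparison described in the final hunk can be prototyped with scikit-learn. The sketch below uses random placeholder data standing in for the 130,065 x 50 MiniBooNE matrix; the model choices (MLPClassifier, HistGradientBoostingClassifier, scikit-learn >= 1.0) and hyperparameters are assumptions for illustration, not prescribed by the homework.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.metrics import accuracy_score

# Placeholder data; in the homework, X and y would instead be
# loaded from the UCI MiniBooNE files.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 50))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Neural network: input scaling matters for MLP training.
scaler = StandardScaler().fit(X_tr)
mlp = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=200, random_state=0)
mlp.fit(scaler.transform(X_tr), y_tr)

# Boosted decision trees: tree ensembles are insensitive to feature scale,
# so no scaling is needed here.
gbdt = HistGradientBoostingClassifier(random_state=0)
gbdt.fit(X_tr, y_tr)

print("MLP accuracy: ", accuracy_score(y_te, mlp.predict(scaler.transform(X_te))))
print("GBDT accuracy:", accuracy_score(y_te, gbdt.predict(X_te)))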