Commit 7bd6ee5

fix
jmduarte committed Apr 18, 2024
1 parent 1dd796f commit 7bd6ee5
Showing 1 changed file with 6 additions and 6 deletions.
12 changes: 6 additions & 6 deletions homeworks/homework_2/homework_2.tex
@@ -11,7 +11,7 @@
Homework 2\hfill
Draft version due: Friday, April 26, 2024, 8:00pm\\
\hfill
-Final version due: Wednesday, May 1, 2024, 5:00pm\\
+Final version due: Wednesday, May 1, 2024, 8:00pm\\
}
}
}
@@ -22,10 +22,10 @@
\section*{Policies}
\begin{itemize}
\item You are free to collaborate on all of the problems, subject to the collaboration policy stated in the syllabus.
-\item Please submit your report as a single .pdf file to Gradescope under ``Homework 1" or ``Homework 1 Corrections".
+\item Please submit your report as a single .pdf file to Gradescope under ``Homework 2" or ``Homework 2 Corrections".
\textbf{In the report, include any images generated by your code along with your answers to the questions.}
For instructions specifically pertaining to the Gradescope submission process, see \url{https://www.gradescope.com/get_started#student-submission}.
-\item Please submit your code as a .zip archive to Gradescope under ``Homework 1 Code'' or ``Homework 1 Code Corrections".
+\item Please submit your code as a .zip archive to Gradescope under ``Homework 2 Code'' or ``Homework 2 Code Corrections".
The .zip file should contain your code files.
Submit your code either as Jupyter notebook .ipynb files or .py files.
\end{itemize}
@@ -152,7 +152,7 @@ \section{Stochastic Gradient Descent [36 Points]}


\begin{problem}[2]
-The closed form solution for linear regression with least squares is \[\mathbf{w} = \left(\sum_{i=1}^N \mathbf{x_i}\mathbf{x_i}^\intercal\right)^{-1}\left(\sum_{i=1}^N \mathbf{x_i}y_i\right).\]
+The closed-form solution for linear regression with least squares is \[\mathbf{w} = \left(\sum_{i=1}^N \mathbf{x_i}\mathbf{x_i}^\intercal\right)^{-1}\left(\sum_{i=1}^N \mathbf{x_i}y_i\right).\]
Compute this analytical solution.
Does the result match up with what you got from SGD?
\end{problem}
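(Editorial aside, not part of the committed file: a minimal NumPy sketch of the two approaches this problem compares. The synthetic data, step size, and epoch count are illustrative assumptions, not values from the homework.)

```python
import numpy as np

# Synthetic stand-in for the homework dataset: N samples, d features.
rng = np.random.default_rng(0)
N, d = 1000, 5
X = rng.normal(size=(N, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=N)

# Closed form: w = (sum_i x_i x_i^T)^{-1} (sum_i x_i y_i) = (X^T X)^{-1} X^T y.
# Solving the linear system is cheaper and safer than forming the inverse.
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# Plain SGD on the squared loss (1/2)(x_i^T w - y_i)^2.
w_sgd = np.zeros(d)
eta = 0.01                                  # illustrative step size
for epoch in range(20):                     # illustrative epoch count
    for i in rng.permutation(N):
        grad = (X[i] @ w_sgd - y[i]) * X[i]  # gradient for one sample
        w_sgd -= eta * grad

print(np.max(np.abs(w_closed - w_sgd)))     # small if SGD has roughly converged
```

With a small enough step size and enough passes over the data, the SGD iterate should land close to the closed-form solution, up to noise of order the step size.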
@@ -163,7 +163,7 @@ \section{Stochastic Gradient Descent [36 Points]}
Answer the remaining questions in 1--2 short sentences.

\begin{problem}[2]
-Is there any reason to use SGD when a closed form solution exists?
+Is there any reason to use SGD when a closed-form solution exists?
\end{problem}
\begin{solution}

@@ -186,7 +186,7 @@ \section{Stochastic Gradient Descent [36 Points]}
\section{Neural networks vs. boosted decision trees [45 Points]}
% \materials{lectures 4--6}

-In this problem, you will compare the performance of neural networks and boosted decision trees for binary classfication on a tabular dataset, namely the MiniBooNE dataset: \url{https://archive.ics.uci.edu/ml/datasets/MiniBooNE+particle+identification}.
+In this problem, you will compare the performance of neural networks and boosted decision trees for binary classification on a tabular dataset, namely the MiniBooNE dataset: \url{https://archive.ics.uci.edu/ml/datasets/MiniBooNE+particle+identification}.

This dataset is taken from the MiniBooNE experiment and is used to distinguish electron neutrinos (signal) from muon neutrinos (background).
The dataset contains 130,065 samples with 50 features and a single binary label.
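(Editorial aside, not part of the committed file: one way to start this comparison with scikit-learn. The file name `MiniBooNE_PID.txt` and its first-line header of signal and background counts, with signal rows first, follow the UCI distribution but should be verified against the downloaded file; the hyperparameters are illustrative.)

```python
import numpy as np
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Assumed file layout (verify against your download): the first line holds
# the signal and background event counts; signal rows precede background rows.
with open("MiniBooNE_PID.txt") as f:
    n_sig, n_bkg = map(int, f.readline().split())
    X = np.loadtxt(f)                               # 50 features per row
y = np.concatenate([np.ones(n_sig), np.zeros(n_bkg)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "neural network": make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=50, random_state=0),
    ),
    "boosted trees": HistGradientBoostingClassifier(random_state=0),
}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"{name}: test AUC = {auc:.3f}")
```

Standardizing the inputs typically matters for the network but not for the tree ensemble; held-out AUC is one reasonable basis for the comparison.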
