Merge pull request #63 from UCLAIS/website_update
Website update
levon-d authored Dec 21, 2024
2 parents 768a782 + 2dbaa2b commit 3c42d7c
Showing 6 changed files with 32 additions and 4 deletions.
2 changes: 1 addition & 1 deletion our-initiatives/tutorials/2024-2025/_category_.json
@@ -1,6 +1,6 @@
{
"label": "2024-2025",
-  "position": 1,
+  "position": 2,
"link": {
"type": "doc",
"id": "tutorials/2024-2025/index"
14 changes: 14 additions & 0 deletions our-initiatives/tutorials/2024-2025/intro_to_transformers.md
@@ -0,0 +1,14 @@
---
sidebar_position: 11
---

# 9: Introduction to Transformers

**Date: 11th December 2024**

💡 **Transformers** were initially introduced for **machine translation**, but are now the state-of-the-art architecture for virtually all deep learning tasks. Unlike traditional neural networks, Transformers rely on a mechanism called **attention**, which allows them to focus on the relevant parts of the input sequence, and unlike RNNs they process sequential input in parallel rather than one step at a time.

Central to this model are the **encoder-decoder blocks**: input data undergoes **tokenization** and is embedded into vectors with **positional encodings** to capture word order. This week, we will explore the **attention mechanism**, including **multi-headed attention**, the structure of **encoder and decoder blocks**, and the processes involved in **training Transformers**, such as **tokenization**, **masking strategies**, and managing **computational costs**. 💡

You can access our **slides** here: 💻 [**Tutorial 9 Slides**](https://www.canva.com/design/DAGYOwRh8u8/xn2OqkUHgTGClSoYOhSxYQ/view?utm_content=DAGYOwRh8u8&utm_campaign=designshare&utm_medium=link2&utm_source=uniquelinks&utlId=ha097b37913)
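
The scaled dot-product attention at the heart of this mechanism can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the tutorial notebook or slides; the function and variable names are our own:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weight each value vector by how well its key matches each query."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V, weights                      # each output mixes all values

# Toy example: 3 tokens, embedding dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, weights = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one attention output per token
```

Because every query attends to every key in a single matrix product, all tokens are processed at once, which is exactly the parallelism that distinguishes Transformers from RNNs.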
2 changes: 1 addition & 1 deletion our-initiatives/tutorials/2024-2025/neural-networks.md
@@ -2,7 +2,7 @@
sidebar_position: 7
---

-# 4: Neural Networks
+# 5: Neural Networks

**Date: 13th November 2024**

14 changes: 14 additions & 0 deletions our-initiatives/tutorials/2024-2025/rnns.md
@@ -0,0 +1,14 @@
---
sidebar_position: 10
---

# 8: Recurrent Neural Networks

**Date: 4th December 2024**

💡 **Recurrent Neural Networks (RNNs)** are a class of models designed to handle sequential data, such as **time series** or **language**, by using **feedback loops** to maintain **context** over time. This week, we will explore the fundamentals of RNNs, the challenges of training them (especially **backpropagation through time**), and variants such as **Long Short-Term Memory (LSTM)** networks that better capture **long-term dependencies**. We will also briefly contrast these approaches with **transformers**, which have largely replaced RNNs and LSTMs in state-of-the-art applications by using self-attention to model sequence elements in parallel, giving a broader perspective on modern sequence modeling techniques. 💡

You can access our **demonstration notebook** here: 📘 [**Tutorial 8 Notebook**](https://github.com/UCLAIS/ml-tutorials-season-5/blob/main/week-8/rnn.ipynb)

You can access our **slides** here: 💻 [**Tutorial 8 Slides**](https://www.canva.com/design/DAGSEPaNv_I/RpD2FqJCqnRyZxwa_cvsGQ/view?utm_content=DAGSEPaNv_I&utm_campaign=designshare&utm_medium=link2&utm_source=uniquelinks&utlId=h053c9bd49f)
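
The feedback loop that lets an RNN carry context forward can be sketched as a single recurrence in NumPy. This is a simplified illustration under our own naming, not code from the tutorial notebook:

```python
import numpy as np

def rnn_forward(xs, h0, Wxh, Whh, bh):
    """Run a vanilla RNN over a sequence, one timestep at a time."""
    h = h0
    states = []
    for x in xs:                                # sequential, not parallel
        h = np.tanh(x @ Wxh + h @ Whh + bh)     # feedback: h depends on previous h
        states.append(h)
    return np.stack(states)

# Toy example: sequence of 5 steps, input dim 3, hidden dim 4
rng = np.random.default_rng(1)
T, d_in, d_h = 5, 3, 4
xs = rng.normal(size=(T, d_in))
states = rnn_forward(xs,
                     np.zeros(d_h),
                     rng.normal(size=(d_in, d_h)),
                     rng.normal(size=(d_h, d_h)) * 0.5,
                     np.zeros(d_h))
print(states.shape)  # (5, 4): one hidden state per timestep
```

The explicit loop makes the contrast with transformers concrete: each hidden state must wait for the previous one, which is also why gradients flow back through every timestep during backpropagation through time.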

2 changes: 1 addition & 1 deletion our-initiatives/tutorials/2024-2025/visual-computing-1.md
@@ -2,7 +2,7 @@
sidebar_position: 8
---

-# 5: Visual Computing I
+# 6: Visual Computing I

**Date: 20th November 2024**

2 changes: 1 addition & 1 deletion our-initiatives/tutorials/_category_.json
@@ -1,6 +1,6 @@
{
"label": "💻 ML Tutorial Series",
-  "position": 2,
+  "position": 1,
"link": {
"type": "doc",
"id": "tutorials/2024-2025/index"
