Commit
Merge pull request #13 from entelecheia/main
entelecheia authored Aug 27, 2024
2 parents 15ead47 + e1eb537 commit 30ccbcb
Showing 4 changed files with 14 additions and 14 deletions.
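Every hunk below makes the same one-line change: an opening `` ```mermaid `` code fence becomes a MyST `` ```{mermaid} `` directive, presumably so Jupyter Book renders the diagrams (e.g. via the sphinxcontrib-mermaid extension) instead of showing them as literal code. A repetitive edit like this can be scripted; a minimal sketch, assuming the opening fences are exactly three backticks followed by `mermaid`:

```python
import re

def convert_mermaid_fences(text: str) -> str:
    # Rewrite opening fences of the form "```mermaid" into the MyST
    # directive form "```{mermaid}". Closing fences are bare "```"
    # and do not match the pattern, so they are left untouched.
    return re.sub(r"^```mermaid[ \t]*$", "```{mermaid}", text, flags=re.MULTILINE)

# Hypothetical usage on one of the changed files:
# path = "book/en/week01/session1.md"
# updated = convert_mermaid_fences(open(path, encoding="utf-8").read())
```
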
8 changes: 4 additions & 4 deletions book/en/week01/session1.md
@@ -6,7 +6,7 @@

Natural Language Processing (NLP) is an interdisciplinary field that combines linguistics, computer science, and artificial intelligence to enable computers to understand, interpret, and generate human language. The primary goal of NLP is to bridge the gap between human communication and computer understanding.

-```mermaid
+```{mermaid}
graph TD
A[Natural Language Processing] --> B[Linguistics]
A --> C[Computer Science]
@@ -88,7 +88,7 @@ Example: A researcher studying political discourse could use NLP techniques to a

## 2. Historical Perspective of NLP

-```mermaid
+```{mermaid}
timeline
title Evolution of NLP
1950s : Rule-based systems
@@ -199,7 +199,7 @@ This era also saw the emergence of corpus linguistics, which emphasized the stud

The current era of NLP is characterized by the dominance of deep learning approaches, particularly transformer-based models.

-```mermaid
+```{mermaid}
graph TD
A[Modern NLP] --> B[Word Embeddings]
A --> C[Deep Neural Networks]
@@ -254,7 +254,7 @@ for text in texts:

The traditional NLP pipeline typically consists of several stages:

-```mermaid
+```{mermaid}
graph LR
A[Text Input] --> B[Text Preprocessing]
B --> C[Feature Extraction]
6 changes: 3 additions & 3 deletions book/en/week01/session2.md
@@ -57,7 +57,7 @@ The adoption of deep learning techniques in NLP has led to significant improveme
2. Long Short-Term Memory networks (LSTMs)
3. Convolutional Neural Networks (CNNs) for text

-```mermaid
+```{mermaid}
graph TD
A[Deep Learning in NLP] --> B[RNNs]
A --> C[LSTMs]
@@ -111,7 +111,7 @@ Key components of Transformer architecture:
3. Positional encoding
4. Feed-forward neural networks

-```mermaid
+```{mermaid}
graph TD
A[Transformer] --> B[Encoder]
A --> C[Decoder]
@@ -182,7 +182,7 @@ Capabilities of LLMs include:
5. Code generation
6. Few-shot and zero-shot learning

-```mermaid
+```{mermaid}
graph TD
A[Large Language Models] --> B[Few-shot Learning]
A --> C[Zero-shot Learning]
8 changes: 4 additions & 4 deletions book/ko/week01/session1.md
@@ -6,7 +6,7 @@

Natural Language Processing (NLP) is an interdisciplinary field that combines linguistics, computer science, and artificial intelligence to enable computers to understand, interpret, and generate human language. The primary goal of NLP is to bridge the gap between human communication and computer understanding.

-```mermaid
+```{mermaid}
graph TD
A[Natural Language Processing] --> B[Linguistics]
A --> C[Computer Science]
@@ -88,7 +88,7 @@ NLP is becoming increasingly important in social science research for the following reasons

## 2. Historical Perspective of NLP

-```mermaid
+```{mermaid}
timeline
title Evolution of NLP
1950s : Rule-based systems
@@ -199,7 +199,7 @@ print(classification_report(y_test, y_pred, target_names=['부정', '긍정', '

The current era of NLP is characterized by the dominance of deep learning approaches, particularly transformer-based models.

-```mermaid
+```{mermaid}
graph TD
A[Modern NLP] --> B[Word Embeddings]
A --> C[Deep Neural Networks]
@@ -254,7 +254,7 @@ for text in texts:

The traditional NLP pipeline typically consists of several stages:

-```mermaid
+```{mermaid}
graph LR
A[Text Input] --> B[Text Preprocessing]
B --> C[Feature Extraction]
6 changes: 3 additions & 3 deletions book/ko/week01/session2.md
@@ -57,7 +57,7 @@ The adoption of deep learning techniques in NLP has greatly improved performance in various tasks
2. Long Short-Term Memory networks (LSTMs)
3. Convolutional Neural Networks (CNNs) for text

-```mermaid
+```{mermaid}
graph TD
A[Deep Learning in NLP] --> B[RNNs]
A --> C[LSTMs]
@@ -111,7 +111,7 @@ Advantages of deep learning models in NLP:
3. Positional encoding
4. Feed-forward neural networks

-```mermaid
+```{mermaid}
graph TD
A[Transformer] --> B[Encoder]
A --> C[Decoder]
@@ -182,7 +182,7 @@ Key capabilities of LLMs:
5. Code generation
6. Few-shot and zero-shot learning

-```mermaid
+```{mermaid}
graph TD
A[Large Language Models] --> B[Few-shot Learning]
A --> C[Zero-shot Learning]
