diff --git a/book/en/week01/session1.md b/book/en/week01/session1.md
index d71f856..e91d7bc 100644
--- a/book/en/week01/session1.md
+++ b/book/en/week01/session1.md
@@ -6,7 +6,7 @@
 
 Natural Language Processing (NLP) is an interdisciplinary field that combines linguistics, computer science, and artificial intelligence to enable computers to understand, interpret, and generate human language. The primary goal of NLP is to bridge the gap between human communication and computer understanding.
 
-```mermaid
+```{mermaid}
 graph TD
 A[Natural Language Processing] --> B[Linguistics]
 A --> C[Computer Science]
@@ -88,7 +88,7 @@ Example: A researcher studying political discourse could use NLP techniques to a
 
 ## 2. Historical Perspective of NLP
 
-```mermaid
+```{mermaid}
 timeline
 title Evolution of NLP
 1950s : Rule-based systems
@@ -199,7 +199,7 @@ This era also saw the emergence of corpus linguistics, which emphasized the stud
 
 The current era of NLP is characterized by the dominance of deep learning approaches, particularly transformer-based models.
 
-```mermaid
+```{mermaid}
 graph TD
 A[Modern NLP] --> B[Word Embeddings]
 A --> C[Deep Neural Networks]
@@ -254,7 +254,7 @@ for text in texts:
 
 The traditional NLP pipeline typically consists of several stages:
 
-```mermaid
+```{mermaid}
 graph LR
 A[Text Input] --> B[Text Preprocessing]
 B --> C[Feature Extraction]
diff --git a/book/en/week01/session2.md b/book/en/week01/session2.md
index 45da003..bcbb3f5 100644
--- a/book/en/week01/session2.md
+++ b/book/en/week01/session2.md
@@ -57,7 +57,7 @@ The adoption of deep learning techniques in NLP has led to significant improveme
 2. Long Short-Term Memory networks (LSTMs)
 3. Convolutional Neural Networks (CNNs) for text
 
-```mermaid
+```{mermaid}
 graph TD
 A[Deep Learning in NLP] --> B[RNNs]
 A --> C[LSTMs]
@@ -111,7 +111,7 @@ Key components of Transformer architecture:
 3. Positional encoding
 4. Feed-forward neural networks
 
-```mermaid
+```{mermaid}
 graph TD
 A[Transformer] --> B[Encoder]
 A --> C[Decoder]
@@ -182,7 +182,7 @@ Capabilities of LLMs include:
 5. Code generation
 6. Few-shot and zero-shot learning
 
-```mermaid
+```{mermaid}
 graph TD
 A[Large Language Models] --> B[Few-shot Learning]
 A --> C[Zero-shot Learning]
diff --git a/book/ko/week01/session1.md b/book/ko/week01/session1.md
index f1ea2a7..dea72f1 100644
--- a/book/ko/week01/session1.md
+++ b/book/ko/week01/session1.md
@@ -6,7 +6,7 @@
 
 자연어처리(NLP)는 언어학, 컴퓨터 과학, 인공지능을 결합한 학제간 분야로, 컴퓨터가 인간의 언어를 이해, 해석, 생성할 수 있게 합니다. NLP의 주요 목표는 인간 의사소통과 컴퓨터 이해 사이의 간극을 좁히는 것입니다.
 
-```mermaid
+```{mermaid}
 graph TD
 A[자연어처리] --> B[언어학]
 A --> C[컴퓨터 과학]
@@ -88,7 +88,7 @@ NLP는 다음과 같은 이유로 사회과학 연구에서 점점 더 중요해
 
 ## 2. NLP의 역사적 관점
 
-```mermaid
+```{mermaid}
 timeline
 title NLP의 진화
 1950년대 : 규칙 기반 시스템
@@ -199,7 +199,7 @@ print(classification_report(y_test, y_pred, target_names=['부정', '긍정', '
 
 현재 NLP 시대는 특히 트랜스포머 기반 모델을 중심으로 한 딥러닝 접근법의 우세로 특징지어집니다.
 
-```mermaid
+```{mermaid}
 graph TD
 A[현대 NLP] --> B[단어 임베딩]
 A --> C[심층 신경망]
@@ -254,7 +254,7 @@ for text in texts:
 
 전통적인 NLP 파이프라인은 일반적으로 여러 단계로 구성됩니다:
 
-```mermaid
+```{mermaid}
 graph LR
 A[텍스트 입력] --> B[텍스트 전처리]
 B --> C[특징 추출]
diff --git a/book/ko/week01/session2.md b/book/ko/week01/session2.md
index 35f7f82..2c9e6d2 100644
--- a/book/ko/week01/session2.md
+++ b/book/ko/week01/session2.md
@@ -57,7 +57,7 @@ NLP에서 딥 러닝 기술의 채택은 다양한 작업에서 성능을 크게
 2. 장단기 메모리 네트워크 (LSTMs)
 3. 텍스트를 위한 합성곱 신경망 (CNNs)
 
-```mermaid
+```{mermaid}
 graph TD
 A[NLP에서의 딥 러닝] --> B[RNNs]
 A --> C[LSTMs]
@@ -111,7 +111,7 @@ NLP에서 딥 러닝 모델의 장점:
 3. 위치 인코딩
 4. 피드포워드 신경망
 
-```mermaid
+```{mermaid}
 graph TD
 A[트랜스포머] --> B[인코더]
 A --> C[디코더]
@@ -182,7 +182,7 @@ LLM의 주요 능력:
 5. 코드 생성
 6. 퓨샷 및 제로샷 학습
 
-```mermaid
+```{mermaid}
 graph TD
 A[대규모 언어 모델] --> B[퓨샷 학습]
 A --> C[제로샷 학습]