
# Text Classification with Transformers

This project explores text classification using transformer-based models such as BERT. The primary goal is to compare two approaches:

  1. Using transformers as frozen feature extractors (see the first sketch below).
  2. Fine-tuning transformer models end to end for a specific text classification task (see the second sketch below).
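The snippet below is a minimal sketch of the first approach, not necessarily what the notebook does: a frozen transformer body (here assumed to be `distilbert-base-uncased`) supplies per-text embeddings, and a lightweight scikit-learn classifier is trained on top of them. The checkpoint, pooling choice, and downstream classifier are illustrative assumptions.

```python
# Sketch of approach 1: frozen transformer as a feature extractor,
# with a simple logistic-regression head trained on the embeddings.
# Checkpoint and classifier are placeholder choices.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

checkpoint = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModel.from_pretrained(checkpoint)
model.eval()

def embed(texts):
    """Return one embedding per text, taken from the first ([CLS]) token."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state  # (batch, seq_len, dim)
    return hidden[:, 0].numpy()                    # first-token embeddings

texts = ["I loved this movie", "Terrible service, never again"]
labels = [1, 0]

clf = LogisticRegression(max_iter=1000).fit(embed(texts), labels)
print(clf.predict(embed(["An absolute delight"])))
```

Because the transformer's weights stay fixed, this variant is cheap to train but leaves the representations unadapted to the target task.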

Additionally, this project implements key concepts and techniques from the book *Natural Language Processing with Transformers*.
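As a rough illustration of the second approach, here is a hedged sketch of fine-tuning with the Hugging Face `Trainer` API. The dataset (`emotion`), checkpoint, and hyperparameters are placeholder assumptions; the notebook's actual setup may differ.

```python
# Sketch of approach 2: fine-tune the full model with a classification head.
# Dataset, checkpoint, and hyperparameters are illustrative placeholders.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

dataset = load_dataset("emotion")  # example dataset; swap in your own
checkpoint = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

def tokenize(batch):
    return tokenizer(batch["text"], padding=True, truncation=True)

dataset = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=6)  # emotion has 6 label classes

args = TrainingArguments(
    output_dir="checkpoints",
    num_train_epochs=2,
    per_device_train_batch_size=16,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    tokenizer=tokenizer,  # ensures batches are padded consistently
)
trainer.train()
```

Fine-tuning updates every weight in the model, which typically improves accuracy over the feature-extraction variant at the cost of more compute.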

## What's Inside

All code, experiments, and detailed explanations are in the provided notebook. Follow along to see the step-by-step process, from data preparation to model evaluation.