Quiz: Data Preparation and Tokenization
1. Which of the following best describes the purpose of tokenization in transformer models?
2. What is the role of an attention mask in transformer-based models?
3. Why is it important to split your dataset into training, validation, and test sets when preparing data for fine-tuning?
4. When using a tokenizer from a pre-trained transformer model, what is a common output besides input IDs?
5. Which statement about padding is correct when batching sequences for transformers?
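Questions 1, 2, 4, and 5 all touch on what a tokenizer actually returns. The sketch below is a deliberately tiny word-level tokenizer, not a real subword tokenizer such as BPE or WordPiece, and the vocabulary and function names are invented for illustration. It only shows the shape of the output a library like Hugging Face Transformers produces when batching: `input_ids` padded to a common length, plus an `attention_mask` marking which positions are real tokens (1) and which are padding (0).

```python
# Toy word-level tokenizer: illustrates input IDs, padding, and attention
# masks. The vocabulary and helper names here are made up for the example.
VOCAB = {"[PAD]": 0, "[UNK]": 1, "the": 2, "cat": 3, "sat": 4, "mat": 5, "on": 6}

def encode(text):
    # Map each whitespace-separated word to its vocabulary ID,
    # falling back to the unknown-token ID for out-of-vocabulary words.
    return [VOCAB.get(tok, VOCAB["[UNK]"]) for tok in text.lower().split()]

def batch_encode(texts):
    # Pad every sequence to the longest one in the batch, and build an
    # attention mask so the model can ignore the padded positions.
    ids = [encode(t) for t in texts]
    max_len = max(len(seq) for seq in ids)
    input_ids, attention_mask = [], []
    for seq in ids:
        pad = max_len - len(seq)
        input_ids.append(seq + [VOCAB["[PAD]"]] * pad)      # pad with ID 0
        attention_mask.append([1] * len(seq) + [0] * pad)   # 1 = real, 0 = pad
    return {"input_ids": input_ids, "attention_mask": attention_mask}

batch = batch_encode(["the cat sat", "the cat sat on the mat"])
# The shorter sequence is padded to length 6; its mask ends in zeros,
# telling the model's attention layers to skip those positions.
```

Real tokenizers also return additional fields (for example, token type IDs for sentence-pair tasks), but the `input_ids` / `attention_mask` pair shown here is the core batching contract the quiz questions refer to.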
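For question 3, the standard practice is to hold out validation data for model selection and test data for a final, unbiased evaluation. A minimal sketch of such a split, assuming a shuffle-then-slice approach with a fixed seed for reproducibility (the function name and fraction defaults are invented for this example):

```python
import random

def train_val_test_split(data, val_frac=0.1, test_frac=0.1, seed=42):
    # Shuffle a copy with a fixed seed so the split is reproducible,
    # then slice into disjoint test / validation / training portions.
    items = list(data)
    random.Random(seed).shuffle(items)
    n = len(items)
    n_test = int(n * test_frac)
    n_val = int(n * val_frac)
    test = items[:n_test]
    val = items[n_test:n_test + n_val]
    train = items[n_test + n_val:]
    return train, val, test

train, val, test = train_val_test_split(range(100))
# 80 training examples, 10 validation, 10 test; no example appears twice.
```

In practice a library utility (for example scikit-learn's `train_test_split`, applied twice) does the same job; the point is that the three sets are disjoint, so validation guides hyperparameter choices while the test set stays untouched until the end.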
Section 2, Chapter 5