Quiz: Transformers and Transfer Learning
1. Which of the following best describes the self-attention mechanism in Transformer models?
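Before answering, it can help to recall what self-attention actually computes: every position in a sequence builds queries, keys, and values from the *same* input, scores each position against every other, and returns a weighted mix of the values. A minimal NumPy sketch (the weight matrices and dimensions here are illustrative assumptions, not any specific model's parameters):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # The same input X is projected into queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Each position scores itself against every position in the sequence,
    # scaled by sqrt(d_k) to keep the dot products in a stable range.
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    # Output: a per-position weighted average of the value vectors.
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                       # 5 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)
```

The key point for the question: the attention weights are computed from the input sequence itself, so each output position can draw on information from any other position in one step.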
2. What is the primary purpose of pre-training a Transformer model on a large corpus before fine-tuning?
3. When selecting a pre-trained Transformer model for a new NLP task, which factor is most important to consider?
4. Which statement about transfer learning with Transformers is correct?
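Questions 2 and 4 both turn on the same idea: transfer learning keeps the general-purpose representations learned during pre-training and trains only a small task-specific part on the new data. A toy sketch of that pattern, with a frozen random projection standing in for a hypothetical pre-trained encoder (the dimensions, learning rate, and synthetic task are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a pre-trained encoder: its weights are FROZEN during fine-tuning.
W_frozen = rng.normal(size=(16, 4))

def encode(X):
    return np.tanh(X @ W_frozen)

# Small labeled dataset for the downstream task.
X = rng.normal(size=(64, 16))
y = (X[:, 0] > 0).astype(float)

# Only the new task head (w, b) is trained; the encoder stays fixed.
w, b = np.zeros(4), 0.0
H = encode(X)  # features come from the frozen encoder
for _ in range(500):
    p = 1 / (1 + np.exp(-(H @ w + b)))  # sigmoid head
    w -= 0.5 * (H.T @ (p - y) / len(y))
    b -= 0.5 * (p - y).mean()

acc = (((1 / (1 + np.exp(-(H @ w + b)))) > 0.5) == y).mean()
```

In practice the encoder may also be unfrozen and updated with a small learning rate (full fine-tuning), but the division of labor is the same: pre-training supplies broadly useful features, fine-tuning adapts them to the target task with far less labeled data.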
Section 1. Chapter 4