Preparing Data and Tokenization
Fine-Tuning Transformers

Quiz: Data Preparation and Tokenization


1. Which of the following best describes the purpose of tokenization in transformer models?

2. What is the role of an attention mask in transformer-based models?

3. Why is it important to split your dataset into training, validation, and test sets when preparing data for fine-tuning?

4. When using a tokenizer from a pre-trained transformer model, what is a common output besides input IDs?

5. Which statement about padding is correct when batching sequences for transformers?
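Several of these questions (tokenization, attention masks, and padding) can be illustrated together. The following is a minimal toy sketch, not the real Hugging Face tokenizer API: a whitespace tokenizer that maps words to invented IDs, pads a batch to a common length, and builds an attention mask in which real tokens get 1 and padding gets 0 so the model can ignore padded positions. The vocabulary, the `<pad>`/`<unk>` tokens, and the pad ID of 0 are all assumptions made for this example.

```python
def build_vocab(texts, pad_id=0, unk_id=1):
    # Assign an integer ID to every word seen in the corpus.
    # <pad> and <unk> get reserved IDs (assumed convention for this toy example).
    vocab = {"<pad>": pad_id, "<unk>": unk_id}
    for text in texts:
        for word in text.split():
            vocab.setdefault(word, len(vocab))
    return vocab


def encode_batch(texts, vocab, pad_id=0):
    # Tokenization: convert each text into a sequence of token IDs.
    batch_ids = [[vocab.get(w, vocab["<unk>"]) for w in t.split()] for t in texts]
    max_len = max(len(ids) for ids in batch_ids)
    input_ids, attention_mask = [], []
    for ids in batch_ids:
        pad_len = max_len - len(ids)
        # Attention mask: 1 for real tokens, 0 for padding positions.
        attention_mask.append([1] * len(ids) + [0] * pad_len)
        # Padding: extend shorter sequences so every row has the same length.
        input_ids.append(ids + [pad_id] * pad_len)
    return {"input_ids": input_ids, "attention_mask": attention_mask}


texts = ["fine tuning transformers", "tokenize this"]
vocab = build_vocab(texts)
batch = encode_batch(texts, vocab)
```

Like a real pre-trained tokenizer, `encode_batch` returns both `input_ids` and an `attention_mask`; the padded second sequence ends in a pad ID with a matching 0 in the mask.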
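For the question on dataset splitting, the idea can be sketched as a simple shuffled three-way split. The function name and the 80/10/10 fractions below are illustrative choices, not a prescribed API; the seed makes the shuffle reproducible.

```python
import random


def split_dataset(examples, val_frac=0.1, test_frac=0.1, seed=42):
    # Shuffle a copy so the split is random but reproducible via the seed.
    items = list(examples)
    random.Random(seed).shuffle(items)
    n = len(items)
    n_test = int(n * test_frac)
    n_val = int(n * val_frac)
    # Test and validation sets are held out; the remainder is used for training.
    test = items[:n_test]
    val = items[n_test:n_test + n_val]
    train = items[n_test + n_val:]
    return train, val, test


data = list(range(100))
train, val, test = split_dataset(data)
```

The held-out validation set guides choices such as early stopping and hyperparameters during fine-tuning, while the test set is touched only once, for the final unbiased evaluation.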



Section 2. 5
