Saving and Loading Fine-Tuned Models
When you have finished fine-tuning a Transformer model, you will often want to save it so you can reuse it later for inference or further training. Saving both the model weights and the tokenizer is crucial for reproducing your results and deploying your model reliably: the weights capture the learned parameters, while the tokenizer guarantees that new text is encoded and decoded exactly as it was during training. Saving means writing the model weights, the tokenizer configuration, and the vocabulary to disk; reloading restores both components so that new text can be processed and predictions made just as before.
Always save both your fine-tuned model and its tokenizer together. This guarantees that your inference pipeline will work exactly as intended and that text is processed consistently from training to deployment.
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Suppose you have a fine-tuned model and tokenizer
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

# Save model and tokenizer to a directory
save_directory = "./my_finetuned_model"
model.save_pretrained(save_directory)
tokenizer.save_pretrained(save_directory)

# Later, reload for inference
loaded_model = AutoModelForSequenceClassification.from_pretrained(save_directory)
loaded_tokenizer = AutoTokenizer.from_pretrained(save_directory)

# Use loaded model and tokenizer for inference
inputs = loaded_tokenizer("Your input text here.", return_tensors="pt")
outputs = loaded_model(**inputs)
print(outputs.logits)
```
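Because the model and tokenizer were saved to the same directory, the higher-level `pipeline` API can restore both in one call. The sketch below assumes the same `./my_finetuned_model` directory as above (it recreates it from the base checkpoint purely so the example is self-contained; in practice you would point at your actual fine-tuned directory):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Recreate the saved directory so this snippet runs on its own.
# In real use, this directory would already contain your fine-tuned model.
save_directory = "./my_finetuned_model"
AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased").save_pretrained(save_directory)
AutoTokenizer.from_pretrained("distilbert-base-uncased").save_pretrained(save_directory)

# pipeline() loads both the model and the tokenizer from the same directory,
# so preprocessing and prediction stay consistent automatically.
classifier = pipeline("text-classification", model=save_directory)
print(classifier("Your input text here."))
```

This is convenient for deployment, since a single directory path is all the inference code needs to know about.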
If you forget to save the tokenizer with your model, you might not be able to correctly preprocess new input text, leading to errors or inconsistent results during inference.
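To see why a mismatched tokenizer is so damaging, here is a toy illustration in plain Python (not the transformers API; the vocabularies and `encode` helper are hypothetical). The same text encoded with a different vocabulary produces different token ids, so the model would receive inputs it was never trained on:

```python
def encode(text, vocab, unk_id=0):
    """Map whitespace-split tokens to ids, falling back to unk_id for unknown words."""
    return [vocab.get(token, unk_id) for token in text.lower().split()]

train_vocab = {"the": 1, "model": 2, "works": 3}  # vocabulary used during fine-tuning
other_vocab = {"model": 1, "works": 2, "the": 3}  # a different vocabulary

text = "The model works"
print(encode(text, train_vocab))  # [1, 2, 3]
print(encode(text, other_vocab))  # [3, 1, 2], same text but different ids
```

Saving the tokenizer alongside the model is what ensures the training-time vocabulary, and not some "other" one, is used at inference.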