If you are fine-tuning Llama 3 with Unsloth and hit this error:
```
ValueError: Unsloth: Untrained tokens found, but embed_tokens & lm_head not trainable, causing NaNs. Restart then add `embed_tokens` & `lm_head` to `FastLanguageModel.get_peft_model(target_modules = [..., "embed_tokens", "lm_head",]). `Are you using the `base` model? Instead, use the `instruct` version to silence this warning.
```
fix it by adding `embed_tokens` and `lm_head` to the `target_modules` list you pass to `FastLanguageModel.get_peft_model`:

```python
target_modules = ["q_proj", "k_proj", "v_proj", "o_proj",
                  "gate_proj", "up_proj", "down_proj",
                  "lm_head", "embed_tokens",],
```
Reference: https://github.com/unslothai/unsloth/wiki#finetuning-the-lm_head-and-embed_tokens-matrices
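For context, here is a minimal sketch of where that argument goes in an Unsloth fine-tuning script. The model name and the `max_seq_length`, `r`, and `lora_alpha` values are illustrative assumptions, not part of the original note; the `target_modules` list is the actual fix.

```python
from unsloth import FastLanguageModel

# Minimal sketch; model name and hyperparameters below are placeholders.
# Substitute your own values.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name     = "unsloth/llama-3-8b-Instruct",  # instruct variant, per the error message
    max_seq_length = 2048,
    load_in_4bit   = True,
)

model = FastLanguageModel.get_peft_model(
    model,
    r              = 16,
    lora_alpha     = 16,
    target_modules = ["q_proj", "k_proj", "v_proj", "o_proj",
                      "gate_proj", "up_proj", "down_proj",
                      "lm_head", "embed_tokens",],  # the two extra entries fix the error
)
```

Note that making `embed_tokens` and `lm_head` trainable increases memory use noticeably, since they are among the largest matrices in the model; see the wiki page linked above for details on finetuning them.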