A LoRA is tied to a specific model architecture: a LoRA trained on Llama 3 8B won't work on Mistral 7B. Train on the exact model you plan to use. You should also use "Copy parameters from" to restore ...
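To see why an adapter is locked to its base model, recall that LoRA stores low-rank factors whose shapes are fixed by the base model's layer dimensions at training time. The sketch below illustrates this with NumPy; the dimensions and the `merge_lora` helper are illustrative, not taken from any real checkpoint.

```python
import numpy as np

# LoRA factors an update to a weight matrix W (d_out x d_in) as B @ A,
# with A: (r x d_in) and B: (d_out x r). These shapes are baked in when
# the adapter is trained, so it only merges into a base model whose
# layers have identical dimensions.
def merge_lora(W, A, B, alpha=1.0):
    if A.shape[1] != W.shape[1] or B.shape[0] != W.shape[0]:
        raise ValueError("adapter shape does not match base layer")
    return W + alpha * (B @ A)

r, d = 8, 4096                       # illustrative rank and hidden size
W_base = np.zeros((d, d))
A = np.random.randn(r, d)
B = np.random.randn(d, r)
merged = merge_lora(W_base, A, B)    # shapes line up: merge succeeds

W_other = np.zeros((5120, 5120))     # a base model with a different hidden size
try:
    merge_lora(W_other, A, B)
except ValueError as e:
    print(e)                         # adapter shape does not match base layer
```

Even when two 7B-class models happen to share a hidden size, layer names, attention layout, and tokenizer vocabularies still differ, so the match must be exact, not just dimensional.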
_DEEP_RESEARCH_AGENT_DEFAULT_SYS_PROMPT = "You're a helpful assistant." ...