Update: Fixed my LoRA training by switching optimizers last Tuesday
Been fighting with a model that kept overfitting on a 500-image dataset of vintage cars. Switched from AdamW to Prodigy after a tip on GitHub and the loss curve finally flattened out. Anyone else had a dumb optimizer swap fix a week of headaches?
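For anyone who wants to try it, the swap itself is tiny. Here's a rough sketch assuming the prodigyopt package (pip install prodigyopt); the toy model and training step are placeholders, not my actual pipeline:

```python
import torch
from torch import nn
from prodigyopt import Prodigy  # pip install prodigyopt

# Placeholder network standing in for the LoRA-wrapped model.
model = nn.Linear(128, 10)

# Before: AdamW with a hand-tuned learning rate.
# optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4, weight_decay=1e-2)

# After: Prodigy estimates the step size on its own, so lr stays at 1.0
# (the default the library recommends).
optimizer = Prodigy(model.parameters(), lr=1.0, weight_decay=1e-2)

# The training step itself doesn't change.
x, y = torch.randn(32, 128), torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

The nice part is that nothing else in the loop has to change; Prodigy is a drop-in torch.optim-style optimizer.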
2 comments
eva_ward88 · 1d ago
Have you ever been that person who's all "oh it's definitely the data quality" and then it's not? Because I was totally that person until last month, when I switched from AdamW to Lion on a 300-image set of 90s computers. I had spent two weeks messing with learning rates and batch sizes, convinced my dataset was just too small or noisy. Then someone on a Discord server said to try Lion and boom, the model actually started learning meaningful features instead of just memorizing patterns. It's humbling how something so simple can fix a problem you thought was way more complicated.
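In case anyone wants the one-line version of that swap, here's a sketch assuming the lion-pytorch package (pip install lion-pytorch); the tiny model is just a placeholder:

```python
import torch
from torch import nn
from lion_pytorch import Lion  # pip install lion-pytorch

# Placeholder for the actual network.
model = nn.Linear(128, 10)

# Lion generally wants a learning rate roughly 3-10x smaller than AdamW's,
# often paired with a larger weight decay.
optimizer = Lion(model.parameters(), lr=3e-5, weight_decay=1e-1)

# One training step to show the interface matches torch.optim.
x, y = torch.randn(32, 128), torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```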
noahjenkins · 1d ago
Spent three days blaming my dataset before realizing I just needed to restart the optimizer.