Overfitting #24
If your data scale cannot be enlarged, maybe you can try:

(2) Reduce model complexity: choose a smaller model, e.g., DFormer-B instead of DFormer-L.

(3) You mentioned monitoring and early stopping: our framework monitors the mIoU on the validation set and saves the best checkpoint, which has the same effect as early stopping.

The current framework does not support the following (4)-(6); you would need to implement them yourself.

(4) More data augmentation: see https://github.com/VCIP-RGBD/RGBD-Pretrain/blob/main/data/auto_augment.py for additional augmentations for RGB-D data.

(5) Cross-validation: use cross-validation to ensure the model's performance is consistent across different subsets of the data.

(6) Ensemble methods: combine predictions from multiple models to reduce variance and improve generalization, though this sacrifices efficiency.
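To illustrate point (3), here is a minimal sketch of early stopping keyed on validation mIoU. This is an illustrative example, not the actual DFormer code (which saves the best checkpoint rather than stopping); the class name, `patience`, and `min_delta` are assumed, hypothetical names.

```python
class EarlyStopping:
    """Signal a stop when validation mIoU has not improved for `patience` epochs.

    Hypothetical helper for illustration; not part of the DFormer framework.
    """

    def __init__(self, patience=10, min_delta=1e-4):
        self.patience = patience      # epochs to wait without improvement
        self.min_delta = min_delta    # minimum change that counts as improvement
        self.best_miou = float("-inf")
        self.bad_epochs = 0

    def step(self, val_miou):
        """Record this epoch's validation mIoU; return True if training should stop."""
        if val_miou > self.best_miou + self.min_delta:
            self.best_miou = val_miou  # improvement: remember it, reset counter
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1       # no meaningful improvement this epoch
        return self.bad_epochs >= self.patience
```

In a training loop you would call `stopper.step(val_miou)` after each validation pass and break out of the loop when it returns True, keeping the checkpoint saved at `best_miou`.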
Okay, thank you!
While setting up your neural network to train it on my own data, I was wondering whether your code uses any tactics to prevent overfitting. Is there some kind of monitoring and early stopping?