Replies: 1 comment
Deep Potential is unlikely to overfit a system if the forces are used as labels. Force is the negative gradient of the energy, so if the energy is overfitting (imagine a plot where the model energy oscillates around the ground truth), the forces are generally not correct. Thus a model trained against both energy and force is usually free of overfitting. In the water example, the sizes of the embedding net and the fitting net can be made much smaller without losing accuracy. In fact, the sizes are so redundant that we haven't seen any case where the embedding or fitting nets had to be enlarged. In practice, you may want to reduce the network sizes for efficiency.
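As a minimal sketch of what "reduce the network sizes" looks like in practice, assuming the se_e2_a-style input.json layout used by the water example (where the sizes live under model/descriptor/neuron and model/fitting_net/neuron), one could shrink both nets like this; the file names and the specific neuron values below are illustrative assumptions, not a recommendation:

```python
# Sketch: shrink the embedding and fitting nets in a DeePMD-kit input.json.
# Assumes the se_e2_a layout from the water example; adjust paths/keys to your setup.
import json

with open("input.json") as f:          # path is an assumption
    config = json.load(f)

# Smaller nets than the water defaults ([25, 50, 100] / [240, 240, 240]);
# the exact values are only for illustration.
config["model"]["descriptor"]["neuron"] = [10, 20, 40]
config["model"]["fitting_net"]["neuron"] = [60, 60, 60]

with open("input_small.json", "w") as f:
    json.dump(config, f, indent=4)
```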
I'm using DeepMD to train a potential for a simple crystalline system. I'm not worried about other phases, just simple vibrations about equilibrium for this single crystal. I figured a few thousand frames might be okay for this scenario.
Starting with the water example, I'm guessing the neural net for water is unnecessarily complex for my case, since liquid water samples many more atomic environments, and the paper also says that network was trained on different phases such as ice. I therefore probably don't need a [240, 240, 240] network structure.
So, does anyone have any insight into choosing a network structure that prevents overfitting? I assume it's something like the total number of node connections being less than the number of frames you're training on.
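As a rough illustration of that parameter-count heuristic, here is a hypothetical back-of-the-envelope sketch; the fitting-net input width is an assumption (adjust it to your descriptor settings), and this counts a plain fully connected stack, ignoring ResNet-style connections:

```python
# Back-of-the-envelope count of trainable weights in a dense stack,
# to compare against the number of training frames.

def dense_param_count(layer_widths):
    """Count weights + biases of a plain fully connected stack."""
    total = 0
    for n_in, n_out in zip(layer_widths[:-1], layer_widths[1:]):
        total += n_in * n_out + n_out
    return total

fitting_input_dim = 1600          # assumed descriptor output width, for illustration
fitting_net = [240, 240, 240]     # the water-example fitting net mentioned above

n_params = dense_param_count([fitting_input_dim] + fitting_net + [1])
print(f"fitting net parameters: {n_params}")   # compare against your frame count
```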