Weight Initialization

The strategy for setting initial values of neural network parameters before training. Proper initialization (Xavier/Glorot, Kaiming/He) ensures that activations and gradients maintain appropriate magnitude across layers at the start of training. Poor initialization can cause vanishing or exploding gradients. Pre-trained initialization (from foundation models) is a form of transfer learning.
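To make the two named schemes concrete, here is a minimal NumPy sketch (function names and the demo dimensions are illustrative, not from the glossary): Xavier/Glorot draws weights with variance 2 / (fan_in + fan_out), while Kaiming/He uses 2 / fan_in, which compensates for ReLU zeroing half its inputs. The demo pushes activations through a stack of He-initialized ReLU layers and checks that their magnitude neither vanishes nor explodes.

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng=None):
    """Xavier/Glorot normal init: variance 2 / (fan_in + fan_out).
    Suited to symmetric activations such as tanh."""
    rng = rng or np.random.default_rng(0)
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

def he_init(fan_in, fan_out, rng=None):
    """Kaiming/He normal init: variance 2 / fan_in. Suited to ReLU."""
    rng = rng or np.random.default_rng(0)
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

# Feed unit-variance inputs through 10 He-initialized ReLU layers.
# With proper scaling the activation magnitude stays in a sane range
# instead of shrinking toward zero or blowing up layer by layer.
rng = np.random.default_rng(42)
x = rng.normal(size=(256, 512))
for _ in range(10):
    W = he_init(512, 512, rng)
    x = np.maximum(0.0, x @ W)
print(f"activation std after 10 layers: {x.std():.3f}")
```

Replacing `he_init` with `xavier_init` in the same ReLU stack would show the activations shrinking with depth, since Xavier's smaller variance does not account for ReLU discarding half of each pre-activation.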
