BERT

Bidirectional Encoder Representations from Transformers — a pre-trained Transformer language model that learns contextual representations of text by masking tokens and predicting them from the surrounding context. In robotics, BERT-style models encode natural-language instructions for language-conditioned policies, and SentenceBERT embeddings are used for task retrieval and instruction-similarity matching.
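Instruction-similarity matching can be sketched as nearest-neighbor retrieval over sentence embeddings. The toy 4-d vectors and task names below are illustrative stand-ins; real SentenceBERT embeddings (e.g. from the `sentence_transformers` library) are typically several hundred dimensions.

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical task library: instruction -> stored embedding
# (stand-ins for SentenceBERT sentence vectors).
task_library = {
    "pick up the red block": [0.9, 0.1, 0.0, 0.2],
    "open the drawer":       [0.1, 0.8, 0.3, 0.0],
    "stack the blocks":      [0.7, 0.2, 0.1, 0.4],
}

def retrieve(query_embedding, library):
    # Return the stored instruction most similar to the query embedding.
    return max(library, key=lambda t: cosine(query_embedding, library[t]))

# Hypothetical embedding of a new instruction, e.g. "grab the red cube".
query = [0.85, 0.15, 0.05, 0.25]
print(retrieve(query, task_library))  # → pick up the red block
```

A language-conditioned policy could use the retrieved task's demonstrations or skill ID as conditioning, rather than acting on the raw text directly.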

ML · Vision-Language
