MIT is offering a range of free AI learning courses, catering to everyone from beginners to advanced learners and researchers.
MIT introduces Self-Distillation Fine-Tuning to reduce catastrophic forgetting; the method trains on student-teacher demonstrations and requires roughly 2.5x the compute.
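The headline above names a student-teacher distillation setup for mitigating forgetting. As a rough illustration only (not MIT's actual method; all logit values below are invented), the core idea of self-distillation is to penalize the fine-tuned "student" for drifting away from the frozen pre-fine-tuning "teacher" distribution:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def kl(p, q):
    """KL divergence KL(p || q) between two discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

# Hypothetical logits: "teacher" = frozen copy of the model before
# fine-tuning; "student" = model being updated on new demonstrations.
teacher_logits = np.array([2.0, 1.0, 0.1])
student_logits = np.array([0.5, 2.0, 0.3])

p_teacher = softmax(teacher_logits)
p_student = softmax(student_logits)

# Distillation penalty: adding this term to the task loss keeps the
# student close to the teacher's predictions, which is the generic
# mechanism for limiting catastrophic forgetting.
penalty = kl(p_teacher, p_student)
```

The penalty is zero only when the student matches the teacher exactly, so minimizing (task loss + penalty) trades new-task fit against retention of prior behavior.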
First author Canyu Chen led a multi-institution research team in developing a scalable approach to training AI agents without sacrificing users’ data privacy.