Distillation with Reasoning: Can DeepSeek R1 Teach Better Than Humans?


Inclusion of reasoning "chains of idea" (CoT) in the model output significantly enhances its quality, however it increases reasoning cost. - Distillation transfers thinking understanding from a costly instructor design to a more cost-efficient trainee, minimizing total reasoning cost.