Distillation with Reasoning: Can DeepSeek R1 Teach Better Than Humans?


- Including a reasoning "chain of thought" (CoT) in the model's output significantly improves answer quality, but it also increases inference cost.
- Distillation transfers reasoning ability from an expensive teacher model to a cheaper student model, reducing overall inference cost.
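
To make the idea concrete, here is a minimal sketch of sequence-level CoT distillation, assuming a Hugging Face `transformers` setup: the teacher generates reasoning traces, and the student is fine-tuned on them with an ordinary language-modeling loss. The checkpoint names, the toy prompt, and the hyperparameters are illustrative placeholders, not details from this article.

```python
# Sketch of CoT distillation: teacher writes reasoning traces,
# student learns to imitate them via supervised fine-tuning.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

TEACHER = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"  # placeholder teacher checkpoint
STUDENT = "Qwen/Qwen2.5-0.5B"                        # placeholder small student

teacher_tok = AutoTokenizer.from_pretrained(TEACHER)
teacher = AutoModelForCausalLM.from_pretrained(TEACHER, torch_dtype=torch.bfloat16)

# 1) Collect teacher outputs that include the chain of thought.
prompts = ["What is 17 * 24? Think step by step."]  # toy prompt set
traces = []
for p in prompts:
    inputs = teacher_tok(p, return_tensors="pt")
    out = teacher.generate(**inputs, max_new_tokens=256)
    traces.append(teacher_tok.decode(out[0], skip_special_tokens=True))

# 2) Fine-tune the student on (prompt + reasoning trace) with cross-entropy,
#    i.e. standard next-token prediction on the teacher's text.
student_tok = AutoTokenizer.from_pretrained(STUDENT)
student = AutoModelForCausalLM.from_pretrained(STUDENT)
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-5)

student.train()
for text in traces:
    batch = student_tok(text, return_tensors="pt", truncation=True, max_length=1024)
    loss = student(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

After training, only the small student is needed at inference time, which is where the cost saving comes from; the expensive teacher is used once, offline, to produce the training traces.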