Model distillation uses one model to generate training data for a second model.
Curious what happens if models are distilled from already-distilled models, repeated indefinitely... is there a point at which the model is no longer reasonable?
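One way to build intuition for the question above is a toy sketch, assuming we stand in for a neural model with something trivially "trainable": a Gaussian fit by mean and standard deviation. Each generation trains only on samples drawn from the previous generation's fitted model, mimicking distillation on distilled outputs. All names here (`fit`, `sample`, the generation count) are illustrative, not any real distillation pipeline.

```python
import random
import statistics

def fit(samples):
    # "Train" the toy model: estimate mean and stddev from the data.
    return statistics.mean(samples), statistics.stdev(samples)

def sample(model, n, rng):
    # "Generate" from the toy model: draw n points from its Gaussian.
    mu, sigma = model
    return [rng.gauss(mu, sigma) for _ in range(n)]

rng = random.Random(0)
# Generation 0 trains on "real" data: 200 draws from N(0, 1).
model = fit([rng.gauss(0.0, 1.0) for _ in range(200)])

for gen in range(1, 21):
    # Every later generation sees only the previous model's outputs.
    model = fit(sample(model, 200, rng))
    print(f"gen {gen:2d}: mean={model[0]:+.3f} stdev={model[1]:.3f}")
```

With a finite sample at every step, the estimated parameters drift as a random walk and never recover information lost earlier, so over many generations the chain tends to wander away from the original distribution rather than converge back to it; this is the intuition behind the degradation the question asks about.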