Embedded and Edge AutoML
Knowledge Distillation
A model-compression technique, closely related to transfer learning, in which a compact student model is trained to mimic the outputs of a large teacher model, retaining much of the teacher's performance in an architecture small enough for edge devices.
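The standard way to realize this (following Hinton et al.'s original formulation, which the text does not spell out) is to train the student on a weighted sum of the usual hard-label cross-entropy and a KL-divergence term between temperature-softened teacher and student distributions. The sketch below, in plain NumPy, is illustrative only; the function names, the temperature `T=2.0`, and the mixing weight `alpha=0.5` are assumptions, not values from this text.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T produces a softer
    # distribution, exposing the teacher's "dark knowledge".
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, label, T=2.0, alpha=0.5):
    # Hard-label term: standard cross-entropy against the true class.
    p_student = softmax(student_logits)
    ce = -np.log(p_student[label])
    # Soft-label term: KL divergence between temperature-softened
    # teacher and student distributions, scaled by T^2 so its gradient
    # magnitude stays comparable to the hard-label term.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t) - np.log(p_s)))
    return alpha * ce + (1 - alpha) * T**2 * kl

# The loss shrinks as the student's logits approach the teacher's.
loss_far = distillation_loss([0.1, 0.2, 3.0], [2.0, 0.5, 0.1], label=0)
loss_near = distillation_loss([2.0, 0.5, 0.1], [2.0, 0.5, 0.1], label=0)
```

In an AutoML setting, this loss simply replaces the plain cross-entropy when training candidate student architectures, so the search can optimize small models without losing the teacher's accuracy.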