NTK-Guided Implicit Neural Teaching
CVPR 2026

The University of Hong Kong. * denotes equal contribution, † denotes corresponding author.

Abstract

Implicit Neural Representations (INRs) parameterize continuous signals via multilayer perceptrons (MLPs), enabling compact, resolution-independent modeling for tasks like image, audio, and 3D reconstruction. However, fitting high-resolution signals demands optimizing over millions of coordinates, incurring prohibitive computational costs. To address this, we propose NTK-Guided Implicit Neural Teaching (NINT), which accelerates training by dynamically selecting the coordinates that maximize the global functional update. Leveraging the Neural Tangent Kernel (NTK), NINT scores examples by the norm of their NTK-augmented loss gradients, capturing both fitting errors and heterogeneous leverage (self-influence and cross-coordinate coupling). This dual consideration enables faster convergence than existing methods. Through extensive experiments, we demonstrate that NINT reduces training time by nearly half while maintaining or improving representation quality, establishing state-of-the-art acceleration among recent sampling-based strategies.
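As a hedged sketch of the selection rule described above (notation is ours, not taken verbatim from the paper): writing the empirical NTK at parameters $\theta$ as an inner product of per-coordinate parameter gradients, a score of this form weights each coordinate's loss gradient by its coupling to all other coordinates,

```latex
K_\theta(x_i, x_j) \;=\; \big\langle \nabla_\theta f_\theta(x_i),\; \nabla_\theta f_\theta(x_j) \big\rangle,
\qquad
s(x_i) \;=\; \Bigg|\sum_{j} K_\theta(x_i, x_j)\,
\frac{\partial \mathcal{L}}{\partial f_\theta(x_j)}\Bigg|,
```

after which training proceeds on the top-$k$ coordinates ranked by $s$. The diagonal term $K_\theta(x_i, x_i)$ captures self-influence, while the off-diagonal terms capture the cross-coordinate coupling mentioned in the abstract.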

Implementations

We provide a plug-and-play package that accelerates INR training across common signal types.
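To make the idea concrete, below is a minimal, self-contained sketch (not the released package; all names and the toy MLP are our own illustrative assumptions) of NTK-guided coordinate scoring for a one-hidden-layer network fitting a 1D signal. It builds the empirical NTK from per-example parameter Jacobians and ranks coordinates by the magnitude of the NTK-augmented loss gradient:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D signal and coordinates (hypothetical stand-ins for image pixels).
N, H = 64, 16                        # N coordinates, H hidden units
x = np.linspace(-1.0, 1.0, N)        # input coordinates
y = np.sin(3.0 * np.pi * x)          # target signal values

# One-hidden-layer MLP: f(x) = w2 . tanh(w1 * x + b1) + b2 (scalar in/out).
w1 = rng.normal(0.0, 1.0, H)
b1 = rng.normal(0.0, 0.5, H)
w2 = rng.normal(0.0, 1.0 / np.sqrt(H), H)
b2 = 0.0

def forward(x):
    h = np.tanh(np.outer(x, w1) + b1)    # hidden activations, (N, H)
    return h @ w2 + b2

def param_jacobian(x):
    """Gradient of f(x_i) w.r.t. all parameters, one row per coordinate."""
    h = np.tanh(np.outer(x, w1) + b1)            # (N, H)
    dh = 1.0 - h ** 2                            # tanh derivative
    J_w2 = h                                     # df/dw2
    J_b2 = np.ones((x.size, 1))                  # df/db2
    J_w1 = (w2 * dh) * x[:, None]                # df/dw1
    J_b1 = w2 * dh                               # df/db1
    return np.hstack([J_w2, J_b2, J_w1, J_b1])   # (N, P)

residual = forward(x) - y                        # dL/df for 0.5 * MSE

J = param_jacobian(x)
K = J @ J.T                                      # empirical NTK, (N, N)

# NINT-style score: magnitude of the NTK-augmented loss gradient per
# coordinate, combining local fitting error with cross-coordinate
# coupling carried by K.
scores = np.abs(K @ residual)

k = 8
selected = np.argsort(scores)[-k:]               # top-k coordinates to train on
print("selected coordinates:", np.sort(x[selected]).round(2))
```

A full training loop would recompute the scores periodically (the kernel and residuals drift as parameters update) and run gradient steps only on the selected coordinates; the released package wraps this selection behind a sampler interface.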

Poster

Related links

Related works (for developing a deeper understanding of NINT) are:

[ICLR 2026] Nonparametric Teaching of Attention Learners,

[ICML 2025 spotlight] Nonparametric Teaching for Graph Property Learners,

[ICML 2024] Nonparametric Teaching of Implicit Neural Representations,

[NeurIPS 2023] Nonparametric Teaching for Multiple Learners,

[ICML 2023] Nonparametric Iterative Machine Teaching.

Citation

Acknowledgments

We thank all anonymous reviewers for their constructive feedback, which helped improve our paper.
This work was supported in part by the Theme-based Research Scheme (TRS) project T45-701/22-R of the Research Grants Council of Hong Kong, and in part by the AVNET-HKU Emerging Microelectronics and Ubiquitous Systems (EMUS) Lab.
The website template was borrowed from Michaël Gharbi.

Send feedback and questions to Chen Zhang