Policy-Guided Neural Thinning: Dynamic Parameter Removal During Inference

Authors

  • Lara Krasovec, Faculty of Mathematics, Natural Sciences and Information Technologies, University of Primorska, Slovenia
  • Marko Cernetic, Faculty of Mathematics, Natural Sciences and Information Technologies, University of Primorska, Slovenia
  • Ivana Jerman, Faculty of Mathematics, Natural Sciences and Information Technologies, University of Primorska, Slovenia
  • Timotej Belak, Faculty of Mathematics, Natural Sciences and Information Technologies, University of Primorska, Slovenia

DOI:

https://doi.org/10.5281/zenodo.17792846

Keywords:

Dynamic inference, policy-guided thinning, adaptive neural models, selective activation, reinforcement-driven optimization, efficient computation, lightweight analytics

Abstract

This work presents a dynamic inference framework in which neural models selectively deactivate internal parameters based on a policy learned through reinforcement signals. The method, termed policy-guided neural thinning, enables a network to adjust its computational footprint at run time, allowing inference to scale with the difficulty of the input or the constraints of the device. Instead of relying on fixed pruning decisions, the system evaluates structural importance on a per-input basis and activates only the components that contribute meaningfully to prediction quality. Experiments demonstrate that this adaptive approach reduces computation and energy consumption while preserving stable predictive behavior across varying workloads. The results show that neural thinning, when controlled by decision policies, forms a viable pathway toward efficient and responsive analytics on constrained platforms.
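To make the idea of per-input deactivation concrete, the following is a minimal sketch, not the authors' implementation: it assumes a two-layer network whose hidden units are grouped into blocks, with a hypothetical lightweight policy (here a single linear layer with sigmoid scores) that decides per input which blocks stay active before the forward pass runs. In the full method such a policy would be trained from reinforcement signals trading off accuracy against compute; that training loop is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper): an input of size 8,
# a hidden layer of 4 blocks x 16 units, and a 3-way output.
D_IN, N_BLOCKS, BLOCK, D_OUT = 8, 4, 16, 3
W1 = rng.normal(size=(D_IN, N_BLOCKS * BLOCK))
W2 = rng.normal(size=(N_BLOCKS * BLOCK, D_OUT))
W_policy = rng.normal(size=(D_IN, N_BLOCKS))  # hypothetical gating policy

def thinned_forward(x, threshold=0.5):
    """Forward pass that activates only the blocks the policy scores as useful."""
    scores = 1.0 / (1.0 + np.exp(-(x @ W_policy)))  # sigmoid score per block
    keep = scores > threshold                       # per-input keep/drop decision
    mask = np.repeat(keep, BLOCK)                   # expand block mask to units
    h = np.maximum(x @ W1, 0.0) * mask              # ReLU hidden layer, thinned
    return h @ W2, int(keep.sum())

x = rng.normal(size=D_IN)
y, n_active = thinned_forward(x)
print(f"active blocks: {n_active}/{N_BLOCKS}, output shape: {y.shape}")
```

Because the mask depends on the input through `W_policy`, easy inputs can end up running with fewer active blocks than hard ones, which is the run-time scaling behavior the abstract describes; skipping whole blocks (rather than individual weights) is what makes the saved computation realizable in practice.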

Published

2020-06-10