Rstg - STG: Feature Selection using STochastic Gates
'STG' is a method for feature selection in neural network
estimation problems. The procedure is based on a probabilistic
relaxation of the l0 norm of the features, i.e., the count of
the number of selected features. The framework simultaneously
learns a nonlinear regression or classification function while
selecting a small subset of features. Read more: Yamada et al. (2020)
<https://proceedings.mlr.press/v119/yamada20a.html>.
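
The core idea can be sketched in a few lines of base R. The snippet
below is an illustrative sketch of the Gaussian-based stochastic gate
relaxation described in Yamada et al. (2020), not the 'Rstg' API:
each feature receives a gate z = clip(mu + eps, 0, 1) with
eps ~ N(0, sigma^2), and the relaxed l0 penalty is the expected
number of open gates, sum(pnorm(mu / sigma)). All function and
variable names here (stg_gate, expected_l0, mu, sigma) are
hypothetical, chosen only for illustration.

    ## Illustrative sketch of the stochastic gate relaxation
    ## (Yamada et al., 2020); not the 'Rstg' package API.

    ## Hard-thresholded Gaussian gate: z_d = clip(mu_d + eps_d, 0, 1),
    ## with eps_d ~ N(0, sigma^2) injected only during training.
    stg_gate <- function(mu, sigma = 0.5, training = TRUE) {
      eps <- if (training) rnorm(length(mu), mean = 0, sd = sigma) else 0
      pmin(pmax(mu + eps, 0), 1)
    }

    ## Relaxed l0 penalty: expected number of active gates,
    ## E[|z|_0] = sum_d Phi(mu_d / sigma), Phi = Gaussian CDF.
    expected_l0 <- function(mu, sigma = 0.5) {
      sum(pnorm(mu / sigma))
    }

    ## Example: gates applied to a feature matrix X (features in columns).
    set.seed(1)
    X  <- matrix(rnorm(20 * 5), nrow = 20, ncol = 5)
    mu <- rep(0.5, ncol(X))          # gate means, learned in practice
    z  <- stg_gate(mu)               # sampled gate values in [0, 1]
    X_gated <- sweep(X, 2, z, `*`)   # element-wise feature gating
    expected_l0(mu)                  # relaxed count of selected features

In the full method, mu is learned jointly with the network weights by
minimizing the prediction loss plus lambda times this expected l0 term,
so that gates for uninformative features are driven toward zero.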