Feature Selection

Feature Selection with Annealing (FSA): a generic method for joint feature selection and model learning that outperforms boosting and methods based on sparsity-inducing penalties such as L1, SCAD, and MCP.

Publications:
  • Y. She, J. Shen, A. Barbu. Slow Kill for Big Data Learning. IEEE Trans. Information Theory, 2023. (IEEE)
  • M. Wang, A. Barbu. Online Feature Screening for Data Streams with Concept Drift. IEEE Trans. on Knowledge and Data Engineering, 2023. (arXiv, IEEE)
  • M. Wang, A. Barbu. Are screening methods useful in feature selection? An empirical study. PLoS One 14, No. 9 (2019). (arXiv, GitHub)
  • A. Barbu, Y. She, L. Ding, G. Gramajo. Feature Selection with Annealing for Computer Vision and Big Data Learning. IEEE PAMI 39, No. 2, 272–286, 2017. (arXiv, link, Matlab code, GitHub)
  • F. Bunea, A. Tsybakov, M. Wegkamp, A. Barbu. SPADES and mixture models. Annals of Statistics 38, No. 4, 2525–2558, 2010. (pdf, code)
  • F. Bunea, A. Barbu. Dimension reduction and variable selection in case control studies via regularized likelihood optimization. Electronic Journal of Statistics, 3, 2009. (pdf)
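The core FSA idea (from the PAMI 2017 paper above) is to alternate gradient updates on a linear model with an annealing step that progressively discards the features of smallest coefficient magnitude until a target count remains. The sketch below illustrates this loop in Python; the squared-loss model, the learning rate, and the particular keep-schedule are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def fsa_sketch(X, y, k, epochs=200, lr=0.1, mu=10.0):
    """Illustrative sketch of Feature Selection with Annealing (FSA):
    gradient descent on a linear model, interleaved with pruning the
    smallest-|w| features until only k survive. The schedule below is
    an assumption chosen for illustration, not the paper's exact one."""
    n, p = X.shape
    w = np.zeros(p)
    keep = np.arange(p)  # indices of currently active features
    for e in range(1, epochs + 1):
        # gradient step for squared loss, restricted to active features
        r = X[:, keep] @ w[keep] - y
        w[keep] -= lr * (X[:, keep].T @ r) / n
        # annealing schedule: number of features to retain at epoch e
        # (decays from p toward k; mu controls how fast it decays)
        m = k + int((p - k) * max(0.0, (epochs - 2 * e) / (2 * e * mu + epochs)))
        if len(keep) > m:
            top = np.argsort(-np.abs(w[keep]))[:m]  # keep the largest |w|
            keep = keep[np.sort(top)]
    # zero out the pruned coefficients
    mask = np.zeros(p, dtype=bool)
    mask[keep] = True
    w[~mask] = 0.0
    return w, keep
```

Because features are removed gradually rather than all at once, early noisy gradient estimates do not irrevocably discard useful variables, which is the intuition behind the annealing schedule.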
