doi: 10.17586/2226-1494-2023-23-1-112-120
Software framework for hyperparameters optimization of models with additive regularization
Article in Russian
For citation:
Khodorchenko M.A., Butakov N.A., Nasonov D.A., Firulik M.Yu. Software framework for hyperparameters optimization of models with additive regularization. Scientific and Technical Journal of Information Technologies, Mechanics and Optics, 2023, vol. 23, no. 1, pp. 112–120 (in Russian). doi: 10.17586/2226-1494-2023-23-1-112-120
Abstract
Processing unstructured data, such as natural-language texts, is one of the pressing tasks in the development of intelligent products. Topic modeling, as a method for working with unlabeled and partially labeled text data, is a natural choice for analyzing document collections and building vector representations. It is therefore especially important to train high-quality topic models quickly, which the proposed framework makes possible. The framework implements an evolutionary approach to optimizing the hyperparameters of models with additive regularization and achieves strong results on quality metrics (coherence, NPMI). To reduce computation time, a surrogate-model mode is provided that accelerates the search by up to 1.8 times without loss of quality. The effectiveness of the framework is demonstrated on three datasets with different statistical characteristics; on two of the three, the results exceed comparable solutions by an average of 20 % in coherence and 5 % in classification quality. A distributed version of the framework has been developed for conducting experimental studies of topic models. Thanks to the default data processing pipeline, the framework can be used without specialized knowledge of topic modeling, and researchers can build on the results to analyze topic models and extend the functionality.
Keywords: AutoML framework, topic modeling, unstructured data, additive regularization, evolutionary approach, surrogate models
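The abstract describes the optimization scheme only at a high level. Purely as an illustration of the stated idea (an evolutionary search over regularization hyperparameters, with a surrogate model pre-screening candidates so that fewer expensive topic-model trainings are needed), here is a minimal, self-contained Python sketch; the search space, function names, and placeholder fitness are assumptions made for illustration and do not reflect the framework's actual interface.

```python
# Illustrative sketch only: evolutionary search over ARTM-style regularizer
# coefficients with an optional surrogate model that pre-screens candidates.
# All names, ranges, and the placeholder fitness are assumptions.
import random
import numpy as np
from sklearn.ensemble import RandomForestRegressor

SEARCH_SPACE = {                      # hypothetical hyperparameter ranges
    "smooth_sparse_theta": (-1.0, 1.0),
    "smooth_sparse_phi": (-1.0, 1.0),
    "decorrelator_phi": (0.0, 1e5),
}

def random_individual():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in SEARCH_SPACE.items()}

def mutate(ind, rate=0.3):
    return {k: (random.uniform(*SEARCH_SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def crossover(a, b):
    return {k: random.choice((a[k], b[k])) for k in SEARCH_SPACE}

def vec(ind):
    return [ind[k] for k in SEARCH_SPACE]

def expensive_fitness(ind):
    # Placeholder: in the real framework this step would train an additively
    # regularized topic model (e.g. with BigARTM) and return its coherence.
    x = np.array(vec(ind))
    return float(-np.sum((x / (np.abs(x).max() + 1e-9)) ** 2))

def evolve(generations=10, pop_size=20, use_surrogate=True):
    # Initial population, scored with the expensive fitness.
    population = [(expensive_fitness(ind), ind)
                  for ind in (random_individual() for _ in range(pop_size))]
    xs = [vec(ind) for _, ind in population]
    ys = [score for score, _ in population]
    surrogate = RandomForestRegressor(n_estimators=50, random_state=0)
    for _ in range(generations):
        # Produce more offspring than we can afford to evaluate exactly.
        offspring = [mutate(crossover(random.choice(population)[1],
                                      random.choice(population)[1]))
                     for _ in range(pop_size * 3)]
        if use_surrogate:
            # A surrogate trained on past runs filters the offspring, cutting
            # the number of expensive model trainings per generation.
            surrogate.fit(xs, ys)
            predicted = surrogate.predict([vec(c) for c in offspring])
            keep = np.argsort(predicted)[::-1][:pop_size]
            offspring = [offspring[i] for i in keep]
        else:
            offspring = offspring[:pop_size]
        scored = [(expensive_fitness(c), c) for c in offspring]
        xs += [vec(c) for _, c in scored]
        ys += [s for s, _ in scored]
        # Elitist selection over parents and offspring.
        population = sorted(population + scored, key=lambda t: t[0],
                            reverse=True)[:pop_size]
    return population[0]

if __name__ == "__main__":
    best_score, best_params = evolve()
    print(best_score, best_params)
```

In such a scheme, only the offspring ranked highest by the surrogate are trained for real, which is the kind of saving that could yield the reported speed-up of up to 1.8 times.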
Acknowledgements. This research is financially supported by the Russian Science Foundation, Agreement No. 20-11-20270.
References
- Khanthaapha P., Pipanmaekaporn L., Kamonsantiroj S. Topic-based user profile model for POI recommendations. Proc. of the 2nd International Conference on Intelligent Systems, Metaheuristics & Swarm Intelligence, 2018, pp. 143–147. https://doi.org/10.1145/3206185.3206203
- Peña F.J., O'Reilly-Morgan D., Tragos E.Z., Hurley N., Duriakova E., Smyth B., Lawlor A. Combining rating and review data by initializing latent factor models with topic models for top-n recommendation. Proc. of the 14th ACM Conference on Recommender Systems, 2020, pp. 438–443. https://doi.org/10.1145/3383313.3412207
- Sokhin T., Butakov N. Semi-automatic sentiment analysis based on topic modeling. Procedia Computer Science, 2018, vol. 136, pp. 284–292. https://doi.org/10.1016/j.procs.2018.08.286
- Nevezhin E., Butakov N., Khodorchenko M., Petrov M., Nasonov D. Topic-driven ensemble for online advertising generation. Proc. of the 28th International Conference on Computational Linguistics, 2020, pp. 2273–2283. https://doi.org/10.18653/v1/2020.coling-main.206
- Zamiralov A., Khodorchenko M., Nasonov D. Detection of housing and utility problems in districts through social media texts. Procedia Computer Science, 2020, vol. 178, pp. 213–223. https://doi.org/10.1016/j.procs.2020.11.023
- Shi T., Kang K., Choo J., Reddy C.K. Short-text topic modeling via non-negative matrix factorization enriched with local word-context correlations. Proc. of the World Wide Web Conference (WWW 2018), 2018, pp. 1105–1114. https://doi.org/10.1145/3178876.3186009
- Hofmann T. Probabilistic latent semantic indexing. Proc. of the 22nd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '99), 1999, pp. 50–57. https://doi.org/10.1145/312624.312649
- Blei D.M., Ng A.Y., Jordan M.I. Latent Dirichlet allocation. Journal of Machine Learning Research, 2003, vol. 3, pp. 993–1022.
- Vorontsov K., Potapenko A., Plavin A. Additive regularization of topic models for topic selection and sparse factorization. Lecture Notes in Computer Science, 2015, vol. 9047, pp. 193–202. https://doi.org/10.1007/978-3-319-17091-6_14
- Card D., Tan C., Smith N.A. Neural models for documents with metadata. Proc. of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2018, pp. 2031–2040. https://doi.org/10.18653/v1/p18-1189
- Cao Z., Li S., Liu Y., Li W., Ji H. A novel neural topic model and its supervised extension. Proceedings of the AAAI Conference on Artificial Intelligence, 2015, vol. 29, no. 1, pp. 2210–2216. https://doi.org/10.1609/aaai.v29i1.9499
- Bianchi F., Terragni S., Hovy D. Pre-training is a hot topic: Contextualized document embeddings improve topic coherence. Proc. of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), 2021, pp. 759–766. https://doi.org/10.18653/v1/2021.acl-short.96
- Ye J., Jing X., Li J. Sentiment analysis using modified LDA. Lecture Notes in Electrical Engineering, 2018, vol. 473, pp. 205–212. https://doi.org/10.1007/978-981-10-7521-6_25
- Bodrunova S., Koltsov S., Koltsova O., Nikolenko S., Shimorina A. Interval semi-supervised LDA: Classifying needles in a haystack. Lecture Notes in Computer Science, 2013, vol. 8265, pp. 265–274. https://doi.org/10.1007/978-3-642-45114-0_21
- Řehůřek R., Sojka P. Software framework for topic modelling with large corpora. Proc. of the LREC 2010 Workshop on New Challenges for NLP Frameworks, 2010, pp. 45–50.
- Terragni S., Fersini E., Galuzzi B.G., Tropeano P., Candelieri A. OCTIS: Comparing and optimizing topic models is simple! Proc. of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations, 2021, pp. 263–270. https://doi.org/10.18653/v1/2021.eacl-demos.31
- Khodorchenko M., Butakov N. Developing an approach for lifestyle identification based on explicit and implicit features from social media. Procedia Computer Science, 2018, vol. 136, pp. 236–245. https://doi.org/10.1016/j.procs.2018.08.262
- Vorontsov K., Frei O., Apishev M., Romov P., Dudarenko M. BigARTM: Open source library for regularized multimodal topic modeling of large collections. Communications in Computer and Information Science, 2015, vol. 542, pp. 370–381. https://doi.org/10.1007/978-3-319-26123-2_36
- Khodorchenko M., Teryoshkin S., Sokhin T., Butakov N. Optimization of learning strategies for ARTM-based topic models. Lecture Notes in Computer Science, 2020, vol. 12344, pp. 284–296. https://doi.org/10.1007/978-3-030-61705-9_24
- Khodorchenko M., Butakov N., Sokhin T., Teryoshkin S. Surrogate-based optimization of learning strategies for additively regularized topic models. Logic Journal of the IGPL, 2022. https://doi.org/10.1093/jigpal/jzac019
- Röder M., Both A., Hinneburg A. Exploring the space of topic coherence measures. Proc. of the Eighth ACM International Conference on Web Search and Data Mining (WSDM’15), 2015, pp. 399–408. https://doi.org/10.1145/2684822.2685324
- Newman D., Noh Y., Talley E., Karimi S., Baldwin T. Evaluating topic models for digital libraries. Proc. of the 10th Annual Joint Conference on Digital Libraries (JCDL’10), 2010, pp. 215–224. https://doi.org/10.1145/1816123.1816156