Binary classification models of the semantic coloring of texts


  • Nataliya Boyko, Lviv Polytechnic National University


Keywords:

Long short-term memory, Convolutional neural network, Gated recurrent unit, Binary text classification


Introduction: The purpose of this research is to compare recurrent neural network architectures, namely the long short-term memory (LSTM) and gated recurrent unit (GRU) architectures, with the convolutional neural network (CNN), and to explore their performance on the example of binary text classification. Material and Methods: To achieve this, the research evaluates the performance of these popular deep-learning approaches on a dataset of film reviews labeled as expressing positive or negative opinions. This real-world dataset was used to train neural network models via software implementations. Results and Discussion: The research focuses on the implementation of a recurrent neural network for the binary classification of the dataset and explores different architectures, approaches, and hyperparameters to determine the model that achieves optimal performance. The software implementation allowed various quality metrics to be evaluated, enabling a comparison of the proposed approaches. In addition, the research examines hyperparameters such as the learning rate, batch size, and regularization methods to determine their impact on model performance. Conclusion: Overall, the research provides valuable insights into the performance of neural networks in binary text classification and highlights the importance of careful architecture selection and hyperparameter tuning for achieving optimal performance.
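The gated recurrent unit architecture compared in the abstract can be sketched as follows. This is a minimal NumPy illustration of a single GRU layer with a logistic read-out for binary (positive/negative) classification, not the study's actual implementation; the class name, dimensions, and random initialization are assumptions made for the example.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class MinimalGRU:
    """One-layer GRU that reads a sequence of feature vectors and emits
    P(positive) via a logistic read-out on the final hidden state."""

    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)

        def mat(rows, cols):
            return rng.normal(0.0, 0.1, size=(rows, cols))

        # Parameters for the update gate z, reset gate r, and candidate state.
        self.Wz, self.Uz, self.bz = mat(hidden_dim, input_dim), mat(hidden_dim, hidden_dim), np.zeros(hidden_dim)
        self.Wr, self.Ur, self.br = mat(hidden_dim, input_dim), mat(hidden_dim, hidden_dim), np.zeros(hidden_dim)
        self.Wh, self.Uh, self.bh = mat(hidden_dim, input_dim), mat(hidden_dim, hidden_dim), np.zeros(hidden_dim)
        # Logistic classifier applied to the last hidden state.
        self.w_out = mat(1, hidden_dim)[0]
        self.b_out = 0.0

    def forward(self, xs):
        h = np.zeros(self.bz.shape[0])
        for x in xs:  # one step per token embedding in the review
            z = sigmoid(self.Wz @ x + self.Uz @ h + self.bz)  # update gate
            r = sigmoid(self.Wr @ x + self.Ur @ h + self.br)  # reset gate
            h_cand = np.tanh(self.Wh @ x + self.Uh @ (r * h) + self.bh)
            h = (1.0 - z) * h + z * h_cand  # interpolate old and candidate state
        return sigmoid(self.w_out @ h + self.b_out)  # probability the review is positive
```

In practice a framework implementation (e.g. a recurrent layer followed by a dense sigmoid output) would be trained with cross-entropy loss; the gates above are what distinguish the GRU from a plain recurrent cell and from the LSTM's separate input/forget/output gates.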






How to cite

Boyko, N. (2023). Modelos de clasificación binaria de la coloración semántica de textos. Innovaciencia, 11(1).



Scientific and technological research article


