SUBMISSION DETAIL

Ayşegül Ceren KOÇ, Şahin BATMAZ, Hüseyin DAĞ

ARTIFICIAL INTELLIGENCE CHATBOT BASED ON THE TRANSFORMER MODEL
 
Because security systems are packaged services built from many components, security companies experience recurring problems in their internal communication processes, leading to operational bottlenecks. Intelligent assistants can address these problems. In this work, an artificial-intelligence-based virtual assistant is developed specifically for the security industry; it increases customer satisfaction by providing faster, round-the-clock service. To this end, Transformer-based deep learning models are developed on PyTorch/TensorFlow infrastructures. Industry-specific datasets, together with content from our own website, were prepared and used as training data. The architecture is built on the Transformer, a state-of-the-art model whose core is the attention mechanism; this layer allows the training set to be modelled contextually, yielding more successful results. The tests conducted also demonstrate the suitability of the Transformer model for Turkish, an agglutinative language. The goal is to perform intent classification and named entity recognition with high accuracy using natural language understanding (NLU). A model with a high confidence index rests on appropriately constructed inputs and training data. In the Transformer-based encoder block, position vectors are added to the word vectors so that word-order information is also interpreted. The word vectors produced by the Transformer then capture contextual relationships among the words in a sentence, enabling objective intent classification and high-accuracy entity extraction. Analysis of the model results showed that Turkish spelling errors negatively affect model accuracy.
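The two encoder ingredients described above (position vectors added to word vectors, followed by self-attention) can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's implementation: the sinusoidal positional-encoding formula and the scaled dot-product attention are the standard Transformer definitions, and the toy dimensions (4 tokens, model width 8) are arbitrary choices for the demo.

```python
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal position vectors, as in the original Transformer."""
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    i = np.arange(d_model)[None, :]            # (1, d_model)
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])       # even dims: sine
    pe[:, 1::2] = np.cos(angle[:, 1::2])       # odd dims: cosine
    return pe

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)            # (seq, seq) similarity
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v, weights

# Toy example: 4 word vectors of width 8, position info added, then
# self-attention (queries, keys, and values all come from the same input).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8)) + positional_encoding(4, 8)
out, attn = scaled_dot_product_attention(x, x, x)
```

Each row of `attn` sums to 1 and describes how much every other token contributes to that token's new contextual representation, which is what lets the model relate words across the sentence.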
A Transformer-based preprocessing model is therefore also trained to correct these typos. As a result of this study, the NLP model trains in under 20 minutes (with 5,000 training examples and 100 iterations), and both the Turkish typo-correction model and the system model score above 70% on the test data.
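Downstream of the encoder, an intent-classification head typically pools the token vectors and projects them to intent probabilities; the predicted probability serves as the confidence index mentioned above. The sketch below assumes this common head design rather than the authors' exact one, and the intent labels and random weights are hypothetical stand-ins for a trained security-domain model.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def classify_intent(encoder_out, w, b, intents):
    """Mean-pool encoder token vectors, project to intent logits, softmax."""
    pooled = encoder_out.mean(axis=0)        # (d_model,)
    probs = softmax(pooled @ w + b)          # (n_intents,)
    idx = int(np.argmax(probs))
    return intents[idx], float(probs[idx])   # (label, confidence)

# Hypothetical security-domain intents and untrained toy weights:
rng = np.random.default_rng(1)
intents = ["alarm_status", "camera_request", "billing_question"]
w, b = rng.normal(size=(8, 3)), np.zeros(3)
enc = rng.normal(size=(5, 8))                # 5 token vectors from the encoder
label, conf = classify_intent(enc, w, b, intents)
```

Because an uncorrected typo changes the token vectors feeding this head, spelling errors directly depress the confidence score, which motivates the typo-correction preprocessing step.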

Keywords: Natural Language Processing, Deep Learning, Transformer, Natural Language Understanding, Question Answering