ANALYSIS OF PARAPHRASE GENERATION ALGORITHMS (ON THE EXAMPLE OF UZBEK, ENGLISH, AND RUSSIAN LANGUAGES)

Keywords: paraphrase generation, natural language processing, Transformer, BERT, Uzbek language, stylistic rephrasing, multilingual models

Authors

  • Zarnigor XAYATOVA, doctoral student at Tashkent State University of Uzbek Language and Literature (Toshkent davlat o‘zbek tili va adabiyoti universiteti), Doctor of Philological Sciences (f.f.d.), Uzbekistan

This article presents a comparative analysis of paraphrase generation algorithms, using Uzbek, English, and Russian as case studies. It focuses on sequence-to-sequence (seq2seq) approaches, Transformer-based models (BERT, T5), and multilingual models (mBART, XLM-R), and evaluates how each language's morphological and syntactic structure affects paraphrase quality. The study highlights challenges posed by the agglutinative morphology of Uzbek and proposes transfer learning as a way to address them.
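To make the agglutination challenge concrete, the sketch below segments a single Uzbek surface form into its stacked morphemes. The suffix inventory and greedy splitter are illustrative assumptions for this example only, not the segmentation method analyzed in the article:

```python
# A single Uzbek word can pack morphemes that English or Russian would
# express as separate words, which complicates subword tokenization and
# paraphrase alignment. Suffix list below is a tiny illustrative sample.
UZBEK_SUFFIXES = ["dan", "imiz", "lar"]  # ablative "from", possessive "our", plural

def segment(word: str) -> list[str]:
    """Greedily peel known suffixes off the end of a word (toy example)."""
    morphemes: list[str] = []
    changed = True
    while changed:
        changed = False
        for suffix in UZBEK_SUFFIXES:
            if word.endswith(suffix) and len(word) > len(suffix):
                morphemes.insert(0, suffix)
                word = word[: -len(suffix)]
                changed = True
                break
    return [word] + morphemes

# "kitoblarimizdan" = kitob (book) + lar (plural) + imiz (our) + dan (from),
# i.e. English "from our books" -- four English tokens in one Uzbek word.
print(segment("kitoblarimizdan"))  # → ['kitob', 'lar', 'imiz', 'dan']
```

A paraphraser trained on space-delimited tokens sees this as one opaque unit, which is one reason the article turns to subword-aware multilingual models and transfer learning.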