[1] Feras Al-Hawari and Hala Barham. "A machine learning based help desk system for IT service management". In: Journal of King Saud University – Computer and Information Sciences (2019).
[2] Gilmar Barreto, Paulo D. Battaglin, and Sergio Varga. "Ensuring Efficient IT Service Management to Increase Information Systems Availability". In: Journal of Information Systems Engineering & Management 4.4 (2019), em0108.
[3] G. Chin, Y. Benslimane, and Z. Yang. "Examining the application of standards for information technology service management practice: An empirical study". In: 2017 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM). IEEE. 2017, pp. 841–845.
[4] Gopinath Krishnan and Vinod Ravindran. "IT service management automation and its impact to IT industry". In: 2017 International Conference on Computational Intelligence in Data Science (ICCIDS). IEEE. 2017, pp. 1–4.
[5] Volodymyr Lyubinets, Taras Boiko, and Deon Nicholas. "Automated labeling of bugs and tickets using attention-based mechanisms in recurrent neural networks". In: 2018 IEEE Second International Conference on Data Stream Mining & Processing (DSMP). IEEE. 2018, pp. 271–275.
[6] Ea-Ee Jan, Kuan-Yu Chen, and Tsuyoshi Idé. "Probabilistic text analytics framework for information technology service desk tickets". In: 2015 IFIP/IEEE International Symposium on Integrated Network Management (IM). IEEE. 2015, pp. 870–873.
[7] Jason W. Wei and Kai Zou. "EDA: Easy data augmentation techniques for boosting performance on text classification tasks". In: arXiv preprint arXiv:1901.11196 (2019).
[8] Rico Sennrich, Barry Haddow, and Alexandra Birch. "Edinburgh neural machine translation systems for WMT 16". In: arXiv preprint arXiv:1606.02891 (2016).
[9] Sergey Edunov et al. "Understanding back-translation at scale". In: arXiv preprint arXiv:1808.09381 (2018).
[10] Zihang Dai et al. "Transformer-XL: Attentive language models beyond a fixed-length context". In: arXiv preprint arXiv:1901.02860 (2019).
[11] Jeremy Howard and Sebastian Ruder. "Universal language model fine-tuning for text classification". In: arXiv preprint arXiv:1801.06146 (2018).
[12] Jacob Devlin et al. "BERT: Pre-training of deep bidirectional transformers for language understanding". In: arXiv preprint arXiv:1810.04805 (2018).
[13] Ashish Vaswani et al. "Attention is all you need". In: Advances in Neural Information Processing Systems. 2017, pp. 5998–6008.
[14] Marylou Gabrié, Eric W. Tramel, and Florent Krzakala. "Training Restricted Boltzmann Machines via the Thouless-Anderson-Palmer Free Energy". In: Advances in Neural Information Processing Systems. 2015.
[15] Matthew E. Peters et al. "Deep contextualized word representations". In: arXiv preprint arXiv:1802.05365 (2018).
[16] Zhilin Yang et al. "XLNet: Generalized autoregressive pretraining for language understanding". In: Advances in Neural Information Processing Systems. 2019, pp. 5754–5764.
[17] Alec Radford et al. "Improving language understanding by generative pre-training". 2018. URL: https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf.
[18] Xu Liang. What is XLNet and why it outperforms BERT. URL: https://towardsdatascience.com/what-is-xlnet-and-why-it-outperforms-bert-8d8fce710335 (accessed: 03.04.2020).
[19] ESSEN.AI. What is XLNet and why it outperforms BERT. URL: https://blog.essen.ai/what-is-xlnet-and-how-does-it-work/ (accessed: 03.04.2020).
[20] Chris McCormick. BERT Fine-Tuning Tutorial with PyTorch. URL: http://mccormickml.com/2019/07/22/BERT-fine-tuning/ (accessed: 09.09.2019).
[21] Chris McCormick. XLNet Fine-Tuning Tutorial with PyTorch. URL: http://mccormickml.com/2019/09/19/XLNet-fine-tuning/ (accessed: 01.02.2020).
[22] Sam Shleifer. "Low resource text classification with ULMFit and backtranslation". In: arXiv preprint arXiv:1903.09244 (2019).
[23] Cyril Goutte and Eric Gaussier. "A Probabilistic Interpretation of Precision, Recall and F-Score, with Implication for Evaluation". In: Advances in Information Retrieval (ECIR 2005). Vol. 3408. Lecture Notes in Computer Science. Springer. Apr. 2005, pp. 345–359. DOI: 10.1007/978-3-540-31865-1_25.
[24] Yizhe Zhang et al. "Adversarial feature matching for text generation". In: Proceedings of the 34th International Conference on Machine Learning – Volume 70. JMLR.org. 2017, pp. 4006–4015.
[25] Xing Wu et al. "Conditional BERT contextual augmentation". In: International Conference on Computational Science. Springer. 2019, pp. 84–95.