Self-Supervised Text Style Transfer with Rationale Prediction and Pretrained Transformers

Sinclair, Neil and Buys, Jan (2022) Self-Supervised Text Style Transfer with Rationale Prediction and Pretrained Transformers, Proceedings of Third Southern African Conference for AI Research (SACAIR 2022), December 2022, Stellenbosch, South Africa, Artificial Intelligence Research, Communications in Computer and Information Science, 1734, 291-305, Springer, Cham.

Self_supervised_Style_Transfer__SACAIR_8990.pdf - Accepted Version (766kB)

Abstract

Sentiment transfer involves changing the sentiment of a sentence, such as from positive to negative, while maintaining its informational content. Given the dearth of parallel corpora in this domain, sentiment transfer and other text rewriting tasks have been posed as unsupervised learning problems. In this paper we propose a self-supervised approach to sentiment (text style) transfer. First, sentiment words are identified through an interpretable text classifier based on the method of rationales. Second, a pretrained BART model is fine-tuned as a denoising autoencoder to autoregressively reconstruct sentences in which sentiment words are masked. Third, the model is used to generate a parallel corpus, filtered using a sentiment classifier, which is used to fine-tune the model further in a self-supervised manner. Human and automatic evaluations show that on the Yelp sentiment transfer dataset the performance of our self-supervised approach is close to the state of the art, while the BART model performs substantially better than a sequence-to-sequence baseline. On a second dataset of Amazon reviews, our approach scores high on fluency but struggles more to modify sentiment while maintaining sentence content. Rationale-based sentiment word identification obtains similar performance to the saliency-based sentiment word identification baseline on Yelp but underperforms it on Amazon. Our main contribution is to demonstrate the advantages of self-supervised learning for unsupervised text rewriting.
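To make the second step of the pipeline concrete, the input preparation for the denoising autoencoder can be sketched as below. This is an illustrative sketch only, not the paper's implementation: the helper `mask_sentiment_words` and the example sentiment-word set are hypothetical, standing in for the words the rationale-based classifier would identify; the masked sentence would then be fed to BART to reconstruct.

```python
def mask_sentiment_words(sentence, sentiment_words, mask_token="<mask>"):
    """Replace each identified sentiment word with the mask token.

    `sentiment_words` stands in for the output of the rationale-based
    classifier; `<mask>` is BART's mask token, so the masked sentence
    can serve as denoising input for autoregressive reconstruction.
    """
    return " ".join(
        mask_token if tok.lower() in sentiment_words else tok
        for tok in sentence.split()
    )

# Illustrative example: words the rationale model might flag
rationale = {"terrible", "rude"}
masked = mask_sentiment_words("The food was terrible and the staff rude", rationale)
print(masked)  # The food was <mask> and the staff <mask>
```

Fine-tuning BART to reconstruct the original sentence from this masked input teaches it to fill sentiment slots; at transfer time, conditioning generation on the opposite sentiment produces the rewritten sentence.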

Item Type: Conference paper
Additional Information: The Version of Record is available online at: http://dx.doi.org/10.1007/978-3-031-22321-1_20
Subjects: Computing methodologies > Artificial intelligence > Natural language processing
Date Deposited: 19 Nov 2023 05:20
Last Modified: 19 Nov 2023 05:20
URI: https://pubs.cs.uct.ac.za/id/eprint/1632
