Grammatical error correction and text style transfer can both be framed as monolingual seq2seq problems.
Both tasks are difficult largely because parallel corpora are scarce.
In this paper, the authors train a single model on data from the two different tasks at the same time.
When the trained model is put through a thorough evaluation on both tasks across three languages, it shows meaningful results on a number of error types and style transfer aspects.
Abstract (Summary) 🕵🏻‍♂️
Both grammatical error correction and text style transfer can be viewed as monolingual sequence-to-sequence transformation tasks, but the scarcity of directly annotated data for either task makes them unfeasible for most languages. We present an approach that does both tasks within the same trained model, and only uses regular language parallel data, without requiring error-corrected or style-adapted texts. We apply our model to three languages and present a thorough evaluation on both tasks, showing that the model is reliable for a number of error types and style transfer aspects.
What can we learn from reading this paper? 🤔
Since the method is not described in the abstract, it is hard to judge how strong the paper is.
Still, there is insight to be gained from the way data from two different domains are used together.
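The abstract does not say how the two tasks are combined in one model, but a common way to train a single seq2seq model on multiple tasks is to prefix each source sentence with a task tag. A minimal data-preparation sketch (the tags `<gec>` and `<style>` are hypothetical, not from the paper):

```python
# Sketch: merging training data from two monolingual seq2seq tasks
# into one stream by prefixing each source sentence with a task tag,
# so a single encoder-decoder model can learn both tasks.

def tag_examples(pairs, tag):
    """Prefix each (source, target) pair's source with a task tag."""
    return [(f"{tag} {src}", tgt) for src, tgt in pairs]

# Toy examples for each task (invented for illustration).
gec_pairs = [("She go to school", "She goes to school")]
style_pairs = [("hey, send it asap", "Hello, please send it as soon as possible.")]

# One mixed training set; the model sees both tasks during training.
mixed = tag_examples(gec_pairs, "<gec>") + tag_examples(style_pairs, "<style>")

for src, tgt in mixed:
    print(src, "->", tgt)
```

At inference time, the same trained model would then be steered toward correction or style transfer simply by choosing which tag to prepend.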
Please share the reference URL! 🔗
https://arxiv.org/abs/1903.11283