zhang-yice

6 models

spt-absa-bert-400k

We continue to pre-train BERT-base via sentiment-enhanced pre-training (SPT).

- Title: An Empirical Study of Sentiment-Enhanced Pre-Training for Aspect-Based Sentiment Analysis
- Authors: Yice Zhang, Yifan Yang, Bin Liang, Shiwei Chen, Bing Qin, and Ruifeng Xu
- Conference: Findings of ACL 2023 (Long Paper)
- GitHub repository: https://github.com/HITSZ-HLT/SPT-ABSA

Aspect-Based Sentiment Analysis (ABSA) is an important problem in sentiment analysis. Its goal is to recognize the opinions and sentiments expressed toward specific aspects in user-generated content. Many research efforts leverage pre-training techniques to learn sentiment-aware representations and achieve significant gains on various ABSA tasks. We conduct an empirical study of SPT-ABSA to systematically investigate and analyze the effectiveness of existing approaches. We mainly concentrate on the following questions:

- (a) What impact do different types of sentiment knowledge have on downstream ABSA tasks?
- (b) Which knowledge-integration method is most effective?
- (c) Does injecting non-sentiment-specific linguistic knowledge (e.g., part-of-speech tags and syntactic relations) into pre-training have a positive impact?

Based on the experimental investigation of these questions, we eventually obtain a powerful sentiment-enhanced pre-trained model. It comes in two versions, zhang-yice/spt-absa-bert-400k and zhang-yice/spt-absa-bert-10k, which integrate three types of knowledge:

- aspect words: masking aspects' context and predicting them;
- the review's rating score: rating prediction;
- syntax knowledge: part-of-speech tags, dependency direction, and dependency distance.
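Since these checkpoints are continued pre-trainings of BERT-base, they should load with the standard Hugging Face `transformers` API. A minimal sketch (the model ID comes from the card above; the example sentence and the downstream fine-tuning setup are illustrative assumptions, not part of the card):

```python
from transformers import AutoTokenizer, AutoModel

# Load the sentiment-enhanced BERT checkpoint (continued pre-training of BERT-base).
tokenizer = AutoTokenizer.from_pretrained("zhang-yice/spt-absa-bert-400k")
model = AutoModel.from_pretrained("zhang-yice/spt-absa-bert-400k")

# Encode a review sentence; the resulting token representations would then be
# fine-tuned on a downstream ABSA task (e.g., aspect sentiment classification).
inputs = tokenizer(
    "The battery life is great, but the screen is dim.",  # hypothetical example
    return_tensors="pt",
)
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```

The 400k/10k suffixes distinguish the two released versions named in the card; swap in `zhang-yice/spt-absa-bert-10k` to load the smaller-scale variant.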

license:cc-by-4.0

t5-sentiment-base

license:cc-by-4.0

spt-absa-bert-10k

license:cc-by-4.0

llama-3-3B-sentiment-distillation-v1

llama

llama-3-1B-sentiment-distillation-v1

llama

t5-sentiment-large

license:cc-by-4.0