Sentiment analysis using pre-trained language model with no fine-tuning and less resource
Sentiment analysis gained popularity once Natural Language Processing algorithms proved able to process complex sentences with good accuracy. Recently, pre-trained language models such as BERT and mBERT have been shown to be effective at improving language tasks. Most of the work in im...
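The abstract describes sentiment classification with a pre-trained language model whose weights are not fine-tuned. As a rough illustration of that general approach, and not the authors' exact pipeline, the sketch below assumes the Hugging Face transformers library, a frozen bert-base-uncased encoder, mean pooling, and a scikit-learn logistic-regression head; all model and library choices are assumptions.

```python
# Minimal sketch: sentiment classification with a frozen pre-trained BERT
# (no fine-tuning). Model name, pooling, and classifier are illustrative
# assumptions, not the method reported in the paper.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")
encoder.eval()  # encoder weights stay frozen; only the small classifier is trained

def embed(texts):
    """Return one fixed-size vector per sentence from the frozen encoder."""
    with torch.no_grad():
        batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
        hidden = encoder(**batch).last_hidden_state        # (batch, tokens, 768)
        mask = batch["attention_mask"].unsqueeze(-1)       # mask out padding tokens
        pooled = (hidden * mask).sum(1) / mask.sum(1)      # mean pooling
    return pooled.numpy()

# Toy labelled data; in practice this would be a sentiment corpus.
train_texts = ["I loved this movie.", "Terrible service, never again."]
train_labels = [1, 0]

clf = LogisticRegression(max_iter=1000).fit(embed(train_texts), train_labels)
print(clf.predict(embed(["What a pleasant surprise!"])))
```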
| Main Authors: | Kit, Yuheng; Mohd. Mokji, Musa |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Institute of Electrical and Electronics Engineers Inc., 2022 |
| Online Access: | http://eprints.utm.my/104421/1/MusaMohdMokji2022_SentimentAnalysisUsingPreTrained.pdf http://eprints.utm.my/104421/ http://dx.doi.org/10.1109/ACCESS.2022.3212367 |
Similar Items
- Pre-trained language model with feature reduction and no fine-tuning
  by: Kit, Y. H., et al.
  Published: (2022)
- Patterned ground shield for inductance fine-tuning
  by: Yusof, Nur S., et al.
  Published: (2022)
- Modelling and PSO fine-tuned PID control of quadrotor UAV
  by: Noordin, A., et al.
  Published: (2017)
- The Parametric Study and Fine-Tuning of Bow-Tie Slot Antenna with Loaded Stub
  by: Shafiei, M.M., et al.
  Published: (2017)