Shu (守): use an existing model (https://huggingface.co/rinna/japanese-gpt-1b)

import torch
from transformers import T5Tokenizer, AutoModelForCausalLM

# Load the publicly released rinna/japanese-gpt-1b tokenizer and model.
tokenizer = T5Tokenizer.from_pretrained("rinna/japanese-gpt-1b")
model = AutoModelForCausalLM.from_pretrained("rinna/japanese-gpt-1b")

# Move the model to GPU when one is available.
if torch.cuda.is_available():
    model = model.to("cuda")
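To show how the loaded model is actually used, here is a minimal generation sketch; the prompt string and the sampling parameters are illustrative assumptions, not part of the original, and the tokenizer/model objects come from the snippet above:

# Minimal sketch: encode an illustrative Japanese prompt and sample a continuation.
text = "こんにちは、"  # illustrative prompt (assumption)
token_ids = tokenizer.encode(text, add_special_tokens=False, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(
        token_ids.to(model.device),
        max_length=50,   # illustrative length limit (assumption)
        do_sample=True,
        top_p=0.95,      # illustrative nucleus-sampling value (assumption)
        pad_token_id=tokenizer.pad_token_id,
    )
print(tokenizer.decode(output_ids[0].tolist()))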
Yuta Matsuda, and Norihiko Sawa (2021). Editors-in-the-loop News Article Summarization Framework with Sentence Selection and Compression. In Proceedings of the 5th IEEE Workshop on Human-in-the-Loop Methods and Future of Work in BigData, pp. 3522-3524.
Semantic Shift Stability: Efficient Way to Detect Performance Degradation of Word Embeddings and Pre-trained Language Models. In Proceedings of AACL-IJCNLP 2022. (to appear) https://github.com/Nikkei/semantic-shift-stability