Literature Review: Attention is not Explanation
Yumeto Inaoka
March 19, 2019
Transcript
Attention is not Explanation. Literature review, 2019/03/19. Nagaoka University of Technology, Natural Language Processing Laboratory. Yumeto Inaoka.
Literature: Title: Attention is not Explanation; Authors: Sarthak Jain, Byron C. Wallace; Conference: NAACL-HLT 2019; Paper: https://arxiv.org/abs/1902.10186 (slide 2)
Abstract: Attention assigns a weight to each input unit, and the resulting distribution is often treated as indicating the importance of the inputs. However, the relationship between the weights and the model's output has not been established. The paper shows that standard attention does not provide meaningful explanations and should not be treated as if it did. (slide 3)
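Since the argument hinges on what an attention distribution is, a minimal sketch may help. This is a generic dot-product attention layer in NumPy, purely illustrative; the paper's models (BiLSTM encoders with attention, among others) are more involved, and all names below are invented for the example.

```python
# Minimal sketch of dot-product attention, for illustration only.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(hidden_states, query):
    """hidden_states: (T, d) encoder states; query: (d,) vector.
    Returns attention weights over the T inputs and the context vector."""
    scores = hidden_states @ query   # (T,) similarity scores
    alpha = softmax(scores)          # weights sum to 1: one per input unit
    context = alpha @ hidden_states  # (d,) weighted sum of states
    return alpha, context

rng = np.random.default_rng(0)
H, q = rng.normal(size=(5, 4)), rng.normal(size=4)
alpha, _ = attend(H, q)
print(alpha, alpha.sum())  # a distribution over the 5 input tokens
```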
Methodology: 1. Do attention weights correlate with feature-importance measures? 2. Do counterfactual weight configurations change the prediction? A sketch of the first check follows. (slide 4)
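For question 1, a hedged sketch of one such correlation check: rank-correlating attention weights against a gradient-based importance score with Kendall's tau (the paper uses rank correlations of this kind; the toy numbers and variable names below are invented for illustration).

```python
# Rank-correlate attention weights with a gradient-based importance
# score; a tau near 0 means the two rankings disagree.
import numpy as np
from scipy.stats import kendalltau

alpha = np.array([0.05, 0.60, 0.10, 0.20, 0.05])        # attention over 5 tokens
grad_importance = np.array([0.2, 0.1, 0.4, 0.2, 0.1])   # |dy/dx| per token (toy)

tau, p = kendalltau(alpha, grad_importance)
print(f"Kendall tau = {tau:.2f}")  # weak agreement => attention != importance
```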
Tasks: Binary Text Classification, Question Answering (QA), Natural Language Inference (NLI) (slide 5)
Datasets (slide 6)
Results (slide 7)
Definitions: the distance used to compare model outputs (Total Variation Distance) and the distance used to compare attention distributions (Jensen-Shannon Divergence) (slide 8)
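Concretely, the two distances take only a few lines; this follows the standard definitions (TVD as half the L1 distance between output distributions, JSD as the mean KL divergence to the midpoint distribution), with toy inputs.

```python
# Total Variation Distance between output distributions and
# Jensen-Shannon Divergence between attention distributions.
import numpy as np

def tvd(p, q):
    return 0.5 * np.abs(p - q).sum()

def kl(p, q, eps=1e-12):
    return (p * np.log((p + eps) / (q + eps))).sum()

def jsd(a, b):
    m = 0.5 * (a + b)
    return 0.5 * kl(a, m) + 0.5 * kl(b, m)

y1, y2 = np.array([0.9, 0.1]), np.array([0.8, 0.2])
a1, a2 = np.array([0.7, 0.2, 0.1]), np.array([0.1, 0.2, 0.7])
print(tvd(y1, y2))  # small => outputs barely moved
print(jsd(a1, a2))  # large => attention changed a lot
```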
Results: even large changes to the attention weights leave the prediction largely unchanged, i.e. the weights have little influence on the output (slide 9)
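The result on slide 9 comes from a permutation experiment: shuffle the attention weights and measure how far the prediction moves. A minimal sketch with a toy stand-in classifier (the paper runs this with trained models on real datasets; everything named here is invented for illustration):

```python
# Permutation test sketch: shuffle attention and measure the median
# TVD between the original and perturbed predictions.
import numpy as np

rng = np.random.default_rng(0)

def predict(alpha, H, w):
    """Toy classifier: attention-weighted sum of states -> sigmoid."""
    z = (alpha @ H) @ w
    p = 1.0 / (1.0 + np.exp(-z))
    return np.array([p, 1.0 - p])

H, w = rng.normal(size=(6, 8)), rng.normal(size=8)
alpha = rng.dirichlet(np.ones(6))   # observed attention over 6 tokens
base = predict(alpha, H, w)

tvds = [0.5 * np.abs(predict(rng.permutation(alpha), H, w) - base).sum()
        for _ in range(100)]
print(f"median TVD after shuffling attention: {np.median(tvds):.3f}")
```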
Results: on the Diabetes dataset the effect is large for the positive class, because certain tokens indicate diabetes with high precision (slide 10)
Adversarial Attention: perturb the attention as much as possible while keeping the output (nearly) unchanged, to check whether very different attention distributions can still produce the same prediction, i.e. to probe how attention actually behaves (slide 12)
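As a sketch of the adversarial search (from my reading of the paper, so treat the exact formulation as approximate): find an adversarial attention distribution on the simplex over the T input positions that is maximally far from the observed one in JSD, while the induced output stays within a small tolerance of the original in TVD.

```latex
\begin{aligned}
\max_{\tilde{\alpha}\,\in\,\Delta^{T-1}}\quad & \mathrm{JSD}\!\left(\tilde{\alpha},\,\alpha\right) \\
\text{subject to}\quad & \mathrm{TVD}\!\left(\tilde{y},\,\hat{y}\right) \le \epsilon
\end{aligned}
```

Here \(\alpha\) is the observed attention, \(\tilde{\alpha}\) the adversarial one, \(\hat{y}\) and \(\tilde{y}\) the corresponding outputs, and \(\epsilon\) a small tolerance.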
Results: attention can be changed substantially while the output changes only slightly (slide 13)
Conclusions: the correlation between importance measures and attention weights is weak; counterfactual attention weights do not necessarily change the output; extending the analysis to seq2seq models is left as future work (slide 14)