Shinichi Takayanagi
May 30, 2016
Collaborative Topic Modeling for Recommending Scientific Articles
論文"Collaborative Topic Modeling for Recommending Scientific Articles"を読んだ際に使用したスライド
Transcript
RCO Paper Reading Group (2016/05/27): "Collaborative Topic Modeling for Recommending Scientific Articles" (KDD 2011) by Chong Wang and David M. Blei. Presenter: Shinichi Takayanagi.
(C)Recruit Communications Co., Ltd.
ABSTRACT
1. INTRODUCTION
2. BACKGROUND
2.1 Recommendation Tasks
2.2 Recommendation by Matrix Factorization
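The slides themselves carry no formulas, but the matrix-factorization setup the paper builds on scores a user–article pair as the inner product of latent vectors, r_ij ≈ u_i^T v_j, learned by regularized least squares. A minimal sketch with alternating least squares on a toy like-matrix (all sizes, the regularization weights, and the variable names are illustrative, not from the deck):

```python
import numpy as np

# Toy "like" matrix R: rows = users, columns = articles (1 = in user's library).
R = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 0, 1, 1]], dtype=float)

n_users, n_items = R.shape
K = 2                      # latent dimension
lam_u = lam_v = 0.1        # ridge penalties on u_i and v_j

rng = np.random.default_rng(0)
U = rng.normal(scale=0.1, size=(n_users, K))   # user latent vectors u_i
V = rng.normal(scale=0.1, size=(n_items, K))   # item latent vectors v_j

# Alternating least squares: each update is a small ridge regression.
for _ in range(50):
    for i in range(n_users):
        A = V.T @ V + lam_u * np.eye(K)
        U[i] = np.linalg.solve(A, V.T @ R[i])
    for j in range(n_items):
        A = U.T @ U + lam_v * np.eye(K)
        V[j] = np.linalg.solve(A, U.T @ R[:, j])

# Predicted preference: r_ij ≈ u_i^T v_j
R_hat = U @ V.T
```

After a few sweeps, entries the user actually liked should score higher on average than the unobserved ones.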
2.3 Probabilistic Topic Models
Generative process of LDA
Characteristics of LDA
3. COLLABORATIVE TOPIC REGRESSION
Generative process of CTR
Why the CTR model is called a "regression"
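The equations behind this slide are in the paper: CTR writes each article's latent vector as its LDA topic proportions plus a Gaussian offset, v_j = θ_j + ε_j with ε_j ~ N(0, λ_v⁻¹ I), so the predicted preference u_i^T(θ_j + ε_j) is a regression of ratings onto topic content, corrected by collaborative information. A one-article sketch in numpy (dimensions and λ_v are toy values):

```python
import numpy as np

rng = np.random.default_rng(2)
K = 3
lam_v = 10.0

theta_j = rng.dirichlet(np.ones(K))              # LDA topic proportions of article j
eps_j = rng.normal(scale=lam_v ** -0.5, size=K)  # offset eps_j ~ N(0, (1/lam_v) I)
v_j = theta_j + eps_j                            # CTR item vector: v_j = theta_j + eps_j

u_i = rng.normal(size=K)                         # user latent vector
r_hat = u_i @ v_j                                # predicted preference u_i^T (theta_j + eps_j)
```

A large ε_j means collaborative filtering overrides the article's content; ε_j ≈ 0 means the topic proportions alone explain who likes it, which is exactly the distinction the results slides below draw.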
How the model is trained
A simple proof (handwritten on iPad)
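The training the slides walk through is, in the paper, coordinate ascent with confidence weights c_ij (high for observed likes, low otherwise): each u_i and v_j update is a weighted ridge regression, and the λ_v θ_j term pulls v_j back toward the article's topic proportions. A toy sketch of those updates with θ_j held fixed (sizes, a, b, and the penalties are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n_users, n_items, K = 4, 5, 2
a, b = 1.0, 0.01                 # confidence c_ij = a if r_ij = 1, else b
lam_u, lam_v = 0.1, 10.0

R = (rng.random((n_users, n_items)) < 0.4).astype(float)
C = np.where(R == 1, a, b)                        # confidence matrix
Theta = rng.dirichlet(np.ones(K), size=n_items)   # fixed LDA topic proportions theta_j
U = rng.normal(scale=0.1, size=(n_users, K))
V = Theta.copy()                                  # initialize items at their topics

for _ in range(20):
    for i in range(n_users):
        Ci = np.diag(C[i])
        A = V.T @ Ci @ V + lam_u * np.eye(K)
        U[i] = np.linalg.solve(A, V.T @ Ci @ R[i])
    for j in range(n_items):
        Cj = np.diag(C[:, j])
        A = U.T @ Cj @ U + lam_v * np.eye(K)
        # lam_v * Theta[j] regularizes v_j toward the article's topic proportions
        V[j] = np.linalg.solve(A, U.T @ Cj @ R[:, j] + lam_v * Theta[j])
```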
Prediction
4. EMPIRICAL STUDY
Scale of the dataset
Evaluation
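The evaluation metric in the paper is recall@M: rank all articles for a user and measure what fraction of the articles actually in the user's library appear in the top M. A minimal helper (function name and toy scores are my own):

```python
import numpy as np

def recall_at_m(scores, liked, m):
    """recall@M: fraction of a user's liked articles found in the top-M ranking."""
    top_m = np.argsort(scores)[::-1][:m]
    liked = set(liked)
    return len(liked & set(top_m)) / len(liked)

scores = np.array([0.9, 0.1, 0.8, 0.4, 0.2])
# Top-2 ranking is articles 0 and 2; the user liked 0 and 3, so recall@2 = 0.5.
print(recall_at_m(scores, liked=[0, 3], m=2))
```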
Results
Results: dependence on the number of articles in a user's library (Fig 5) and on the number of users who liked an article (Fig 6). As library size grows, recall drops (the system starts surfacing less well-known articles); as the number of likes grows, recall rises (CF works well on articles everyone reads).
Results: favorite topics of two example users, extracted by ranking the weights of the latent topic vectors.
Results: the ten articles with the largest offsets, i.e., cases where CF matters more than content.
Results: an EM paper that is also heavily referenced by the Bayesian statistics community (again a case where CF matters more than content).
Results: a counterexample where the topics do not spread (a case where content dominates).
5. CONCLUSIONS AND FUTURE WORK