Collaborative Topic Modeling for Recommending Scientific Articles
Shinichi Takayanagi
May 30, 2016
Slides used for a reading of the paper "Collaborative Topic Modeling for Recommending Scientific Articles".
Transcript
RCO Paper Reading Group (2016/05/27): "Collaborative Topic Modeling for Recommending Scientific Articles" (KDD 2011), by Chong Wang and David M. Blei. Presented by Shinichi Takayanagi.
(C)Recruit Communications Co., Ltd.
ABSTRACT
1. INTRODUCTION
2. BACKGROUND
2.1 Recommendation Tasks
2.2 Recommendation by Matrix Factorization
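The background slides review matrix factorization: each user i and article j gets a K-dimensional latent vector (u_i and v_j), and the predicted preference is their inner product u_i·v_j. A minimal sketch of this idea using alternating least squares on a toy binary "in library" matrix (variable names, sizes, and hyperparameters here are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, K = 4, 5, 2
lam = 0.1                            # L2 regularization weight
U = rng.normal(size=(n_users, K))    # user latent vectors u_i
V = rng.normal(size=(n_items, K))    # item latent vectors v_j

# Observed binary matrix: 1 if the user saved the article
R = rng.integers(0, 2, size=(n_users, n_items)).astype(float)

# Alternating least squares: solve for U with V fixed, then vice versa
for _ in range(20):
    A = V.T @ V + lam * np.eye(K)
    U = np.linalg.solve(A, V.T @ R.T).T
    B = U.T @ U + lam * np.eye(K)
    V = np.linalg.solve(B, U.T @ R).T

pred = U @ V.T                       # predicted preference r_ij ≈ u_i . v_j
```

The paper's CTR model keeps this inner-product form but changes where v_j comes from.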
2.3 Probabilistic Topic Models
The LDA generative process
Characteristics of LDA
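The LDA slides describe the standard generative process: for each article draw topic proportions θ_j ~ Dirichlet(α); for each word draw a topic assignment z ~ Mult(θ_j), then a word w ~ Mult(β_z). A sketch with toy sizes (all constants here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
K, V_words, doc_len = 3, 10, 8        # topics, vocabulary size, words per doc
alpha = np.full(K, 0.5)               # Dirichlet prior on topic proportions
# topic-word distributions beta_k, one Dirichlet draw per topic
beta = rng.dirichlet(np.full(V_words, 0.1), size=K)

def generate_doc():
    theta = rng.dirichlet(alpha)          # theta_j ~ Dir(alpha)
    words = []
    for _ in range(doc_len):
        z = rng.choice(K, p=theta)        # z_jn ~ Mult(theta_j)
        w = rng.choice(V_words, p=beta[z])  # w_jn ~ Mult(beta_z)
        words.append(w)
    return theta, words

theta, doc = generate_doc()
```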
3. COLLABORATIVE TOPIC REGRESSION
The CTR generative process
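In CTR the article latent vector is the LDA topic proportion plus a Gaussian offset: v_j = θ_j + ε_j with ε_j ~ N(0, λ_v⁻¹ I), user vectors u_i ~ N(0, λ_u⁻¹ I), and ratings r_ij ~ N(u_iᵀv_j, c_ij⁻¹). A sketch of drawing from this process (sizes and precision values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
K, n_users, n_items = 3, 4, 5
lam_u, lam_v = 0.01, 100.0     # precisions for user vectors / item offsets

# LDA topic proportions theta_j for each article
theta = rng.dirichlet(np.full(K, 0.5), size=n_items)

# u_i ~ N(0, lam_u^{-1} I)
U = rng.normal(scale=lam_u ** -0.5, size=(n_users, K))

# v_j = theta_j + eps_j, eps_j ~ N(0, lam_v^{-1} I)
eps = rng.normal(scale=lam_v ** -0.5, size=(n_items, K))
V = theta + eps

# r_ij ~ N(u_i^T v_j, c_ij^{-1}); here we compute just the mean
R_mean = U @ V.T
```

The offset ε_j is what lets collaborative information pull an article's representation away from its pure content-based topics.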
Why the CTR model is called a "regression"
How the model is trained
A short proof (handwritten on an iPad)
How the model is trained (continued)
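The training slides cover the paper's coordinate-ascent scheme: with confidence weights c_ij (high for observed pairs, low for missing ones), u_i and v_j have closed-form updates, u_i = (VᵀC_iV + λ_uI)⁻¹VᵀC_iR_i and v_j = (UᵀC_jU + λ_vI)⁻¹(UᵀC_jR_j + λ_vθ_j). A sketch of these two updates with θ held fixed (the full algorithm also updates θ_j; toy sizes and weights here are mine):

```python
import numpy as np

rng = np.random.default_rng(3)
K, n_users, n_items = 2, 4, 5
lam_u, lam_v = 0.1, 10.0
a, b = 1.0, 0.01                          # confidence: observed vs missing

R = rng.integers(0, 2, size=(n_users, n_items)).astype(float)
C = np.where(R > 0, a, b)                 # c_ij
theta = rng.dirichlet(np.full(K, 1.0), size=n_items)

U = rng.normal(size=(n_users, K))
V = theta.copy()

for _ in range(10):
    for i in range(n_users):              # closed-form update for each u_i
        Ci = np.diag(C[i])
        A = V.T @ Ci @ V + lam_u * np.eye(K)
        U[i] = np.linalg.solve(A, V.T @ Ci @ R[i])
    for j in range(n_items):              # update for each v_j, pulled toward theta_j
        Cj = np.diag(C[:, j])
        A = U.T @ Cj @ U + lam_v * np.eye(K)
        V[j] = np.linalg.solve(A, U.T @ Cj @ R[:, j] + lam_v * theta[j])
```

Note how λ_v controls the pull of v_j toward its topic proportions θ_j: a large λ_v makes CTR behave like pure content-based LDA, a small one like pure matrix factorization.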
Prediction
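Prediction in CTR distinguishes two cases: for an in-matrix article the learned vector (topics plus offset) is used, r*_ij ≈ u_iᵀv_j, while for a brand-new article with no ratings the offset has zero expectation, so r*_ij ≈ u_iᵀθ_j. A sketch (the vectors here are random stand-ins for learned quantities):

```python
import numpy as np

rng = np.random.default_rng(4)
K = 3
u_i = rng.normal(size=K)                    # learned user vector
v_old = rng.normal(size=K)                  # learned vector of a rated article
theta_new = rng.dirichlet(np.full(K, 0.5))  # topics of an unseen article

in_matrix_score = u_i @ v_old        # r*_ij ≈ u_i^T v_j
out_matrix_score = u_i @ theta_new   # cold start: r*_ij ≈ u_i^T theta_j
```

This is how the model handles cold-start recommendation: a new article can be ranked from its text alone.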
4. EMPIRICAL STUDY
Scale of the data
Evaluation
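The paper evaluates with recall@M: the fraction of a user's held-out articles that appear among the top-M recommendations. A minimal sketch (function name and toy scores are mine):

```python
import numpy as np

def recall_at_m(scores, held_out, m):
    """scores: predicted score per article; held_out: set of true article ids."""
    top_m = set(np.argsort(scores)[::-1][:m])   # indices of the m highest scores
    return len(top_m & held_out) / len(held_out)

scores = np.array([0.9, 0.1, 0.8, 0.3, 0.7])
print(recall_at_m(scores, held_out={0, 3}, m=3))  # → 0.5 (only article 0 ranks top-3)
```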
Results
Results: dependence on the number of articles in a user's library (Fig. 5) and on the number of users who liked an article (Fig. 6). As library size grows, recall drops (the system surfaces less well-known papers); as the number of likes grows, recall rises (CF works well on papers everyone reads).
Results: topics preferred by two example users, extracted by ranking the weights of the latent topic vectors.
Results: top 10 articles with the largest offsets (cases where CF matters more than content).
Results: an EM paper frequently referenced by the Bayesian statistics community (another case where CF matters more than content).
Results: a counterexample where topics do not spread (a content-dominated case).
5. CONCLUSIONS AND FUTURE WORK