Collaborative Topic Modeling for Recommending Scientific Articles
Shinichi Takayanagi
May 30, 2016
Research
Slides used when reading the paper "Collaborative Topic Modeling for Recommending Scientific Articles".
Transcript
RCO Paper Reading Group (2016/05/27): "Collaborative Topic Modeling for Recommending Scientific Articles" (KDD 2011) by Chong Wang and David M. Blei. Presented by Shinichi Takayanagi.
ABSTRACT (slide 1)
1. INTRODUCTION (slides 2-4)
2. BACKGROUND & 2.1 Recommendation Tasks (slide 5)
2.1 Recommendation Tasks (slides 6-7)
2.2 Recommendation by Matrix Factorization (slides 8-11)
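For context, a sketch of the confidence-weighted matrix-factorization objective the paper builds on (notation follows the paper: u_i is user i's latent vector, v_j item j's, and c_ij a confidence that is larger for observed likes than for unobserved entries):

```latex
\min_{U,V}\;\sum_{i,j}\frac{c_{ij}}{2}\left(r_{ij}-u_i^{\top}v_j\right)^{2}
+\frac{\lambda_u}{2}\sum_i \lVert u_i\rVert^{2}
+\frac{\lambda_v}{2}\sum_j \lVert v_j\rVert^{2},
\qquad
c_{ij}=\begin{cases}a & r_{ij}=1\\ b & r_{ij}=0\end{cases},\quad a>b>0
```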
2.3 Probabilistic Topic Models (slide 12)
Generative process of LDA (slide 13)
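As a reminder, the standard LDA generative process for an article w_j (topics β_{1:K} shared across the corpus):

```latex
\begin{aligned}
&\theta_j \sim \mathrm{Dirichlet}(\alpha)\\
&\text{for each word } n:\quad z_{jn}\sim \mathrm{Mult}(\theta_j),\qquad w_{jn}\sim \mathrm{Mult}(\beta_{z_{jn}})
\end{aligned}
```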
Characteristics of LDA (slide 14)
3. COLLABORATIVE TOPIC REGRESSION (slides 15-16)
Generative process of CTR (slide 17)
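A sketch of the CTR generative process (following the paper's notation): an item's latent vector is its topic proportions plus a Gaussian offset, and ratings are drawn around the inner product with a per-pair confidence.

```latex
\begin{aligned}
&\text{1. For each user } i:\; u_i \sim \mathcal{N}(0,\lambda_u^{-1}I_K)\\
&\text{2. For each item } j:\; \theta_j \sim \mathrm{Dirichlet}(\alpha),\quad
  \epsilon_j \sim \mathcal{N}(0,\lambda_v^{-1}I_K),\quad v_j=\epsilon_j+\theta_j,\\
&\qquad\text{with each word } w_{jn} \text{ drawn as in LDA}\\
&\text{3. For each pair } (i,j):\; r_{ij}\sim \mathcal{N}\!\left(u_i^{\top}v_j,\,c_{ij}^{-1}\right)
\end{aligned}
```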
3. COLLABORATIVE TOPIC REGRESSION (slide 18)
Why the CTR model counts as a "regression" (slide 19)
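Spelling out the reason behind the name: the expected rating is a linear regression of r_ij on the article's topic proportions, with user-specific coefficients u_i plus an item-specific offset term.

```latex
\mathbb{E}\left[r_{ij}\mid u_i,\theta_j,\epsilon_j\right]
= u_i^{\top}v_j
= u_i^{\top}\theta_j + u_i^{\top}\epsilon_j
```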
How the model is trained (slides 20-21)
Simple proof, handwritten on iPad (slide 22)
How the model is trained (slide 23)
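A minimal NumPy sketch (not the authors' code) of the coordinate-ascent updates u_i ← (V C_i Vᵀ + λ_u I_K)⁻¹ V C_i R_i and v_j ← (U C_j Uᵀ + λ_v I_K)⁻¹ (U C_j R_j + λ_v θ_j), with the topic proportions θ_j held fixed. The function name and default hyperparameters are illustrative only; the full algorithm also updates θ_j and the topics β, which is omitted here.

```python
import numpy as np

def ctr_coordinate_ascent(R, Theta, lambda_u=0.01, lambda_v=100.0,
                          a=1.0, b=0.01, n_iters=10, seed=0):
    """Illustrative u_i / v_j updates with LDA topic proportions held fixed.

    R:     (n_users, n_items) binary feedback matrix (1 = liked).
    Theta: (n_items, K) topic proportions from a fitted LDA model.
    """
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    K = Theta.shape[1]
    U = rng.normal(scale=0.1, size=(n_users, K))          # user vectors u_i
    V = Theta + rng.normal(scale=0.1, size=(n_items, K))  # item vectors v_j = theta_j + eps_j
    C = np.where(R > 0, a, b)                             # confidence c_ij

    for _ in range(n_iters):
        # u_i <- (V^T C_i V + lambda_u I)^{-1} V^T C_i R_i
        for i in range(n_users):
            CiV = V * C[i][:, None]
            A = CiV.T @ V + lambda_u * np.eye(K)
            U[i] = np.linalg.solve(A, CiV.T @ R[i])
        # v_j <- (U^T C_j U + lambda_v I)^{-1} (U^T C_j R_j + lambda_v theta_j)
        for j in range(n_items):
            CjU = U * C[:, j][:, None]
            A = CjU.T @ U + lambda_v * np.eye(K)
            V[j] = np.linalg.solve(A, CjU.T @ R[:, j] + lambda_v * Theta[j])
    return U, V
```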
Prediction (slide 24)
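Prediction distinguishes the in-matrix case (articles that already have some ratings) from the out-of-matrix, cold-start case (brand-new articles, where only the text is available and the offset is taken at its prior mean of zero):

```latex
\hat{r}_{ij}\approx u_i^{\top}v_j = u_i^{\top}(\theta_j+\epsilon_j)\quad\text{(in-matrix)},
\qquad
\hat{r}_{ij}\approx u_i^{\top}\theta_j\quad\text{(out-of-matrix)}
```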
4. EMPIRICAL STUDY (slide 25)
Scale of the dataset (slide 26)
Evaluation (slide 27)
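Because the data record only positive feedback (a zero entry may mean either "disliked" or "never seen"), precision is uninformative, and the paper evaluates with per-user recall at M, averaged over users:

```latex
\text{recall@}M
= \frac{\text{number of liked articles that appear in the user's top } M \text{ recommendations}}
       {\text{total number of articles the user likes}}
```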
Results (slide 28)
Results: dependence on the number of articles in a user's library (Fig. 5) and on the number of users who liked an article (Fig. 6) (slide 29)
As the library grows, recall drops, because less widely known articles have to be surfaced; as the number of likes grows, recall rises, because CF works well on articles that everyone reads.
Results: topics favored by two example users, extracted by ranking the weights of the latent topic vectors (slide 30)
Results: top 10 articles with the largest offsets (slide 31). These are cases where CF matters more than the content.
Results: an example where the EM paper is also heavily referenced by the Bayesian statistics community (slide 32). Again a case where CF matters more than the content.
Results: conversely, an example where the topics do not spread (slide 33). A case where the content is dominant.
5. CONCLUSIONS AND FUTURE WORK (slide 34)