Redis Hacks
Python Nordeste 2014 - Lightning Talk
David Cramer
May 03, 2014
Transcript
David Cramer (twitter.com/zeeg)
Redis Hacks (or “How Sentry Scales”)
Buffering Writes
r = Redis()

def incr(type, id):
    key = 'pending:{}'.format(type)
    r.zincrby(key, id, 1)
r = Redis()

def flush(type):
    key = 'pending:{}'.format(type)
    result = r.zrange(key, 0, -1, withscores=True)

    for id, count in result:
        prms = {'type': type, 'count': count, 'id': id}

        sql("""
            update %(type)s
            set count = count + %(count)d
            where id = %(id)s
        """, prms)
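One thing the snippet above glosses over: flush reads the pending scores but never removes them, so the same counts would be applied to SQL again on the next run. A minimal way to drain the buffer first, sketched here with a RENAME to a scratch key (my illustration, not from the talk; it assumes the same r and sql helpers as above):

from redis.exceptions import ResponseError

def flush(type):
    key = 'pending:{}'.format(type)
    scratch = '{}:flushing'.format(key)

    # RENAME is atomic: increments arriving during the flush land in a
    # fresh 'pending:<type>' key and are picked up by the next run
    try:
        r.rename(key, scratch)
    except ResponseError:
        return  # nothing buffered

    for id, count in r.zrange(scratch, 0, -1, withscores=True):
        prms = {'type': type, 'count': count, 'id': id}
        sql("""
            update %(type)s
            set count = count + %(count)d
            where id = %(id)s
        """, prms)

    r.delete(scratch)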
Rate Limiting
r = Redis()

def process_hit(project_id):
    # current one-minute bucket
    epoch = int(time() / 60)
    key = '{}:{}'.format(project_id, epoch)

    pipe = r.pipeline()
    pipe.incr(key)
    pipe.expire(key, 60)
    result = pipe.execute()

    # return current value
    return int(result[0])
def request(project_id):
    result = process_hit(project_id)
    if result > 20:
        return Response(status=429)
    return Response(status=200)
Time Series Data
def count_hits_today(project_id):
    # walk the 10-second buckets covering the last day
    end = int(time() / 10) * 10
    start = end - DAY_SECONDS

    pipe = r.pipeline()
    for epoch in xrange(start, end, 10):
        key = '{}:{}'.format(project_id, epoch)
        pipe.get(key)
    results = pipe.execute()

    # drop empty buckets
    results = filter(bool, results)
    # coerce the remainder to ints
    results = map(int, results)
    # return the sum of the buckets
    return sum(results)
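The deck only shows the read side of the time series. For those GETs to return anything, the hit path has to increment the 10-second bucket for the current timestamp; here is a sketch of that write side under the same key scheme (record_hit and BUCKET_SECONDS are my names, and r, time() and DAY_SECONDS are assumed from the snippets above):

BUCKET_SECONDS = 10

def record_hit(project_id):
    # round the timestamp down to its 10-second bucket so reads and
    # writes agree on the key
    epoch = int(time() / BUCKET_SECONDS) * BUCKET_SECONDS
    key = '{}:{}'.format(project_id, epoch)

    pipe = r.pipeline()
    pipe.incr(key)
    # keep each bucket only as long as the one-day read window needs it
    pipe.expire(key, DAY_SECONDS)
    pipe.execute()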
Good-enough Locks
from contextlib import contextmanager

r = Redis()

@contextmanager
def lock(key, nowait=True):
    while not r.setnx(key, '1'):
        if nowait:
            raise Locked('try again soon!')
        sleep(0.01)

    # limit lock time to 10 seconds
    r.expire(key, 10)

    # do something crazy
    yield

    # explicitly unlock
    r.delete(key)
def do_something_crazy():
    with lock('crazy'):
        print 'Hello World!'
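Why these locks are only "good-enough": SETNX and EXPIRE are two separate commands, so a crash between them leaves a lock with no TTL, and the unconditional DELETE at the end can release a lock that a slower holder no longer owns. If that matters, Redis's atomic SET with nx/ex plus an ownership token tightens things up; a sketch (safer_lock and the token check are my additions, not part of the talk):

import uuid
from contextlib import contextmanager

@contextmanager
def safer_lock(key, timeout=10):
    # a unique token lets us check that the lock is still ours on release
    token = uuid.uuid4().hex

    # acquire the lock and set its TTL in a single atomic command
    if not r.set(key, token, nx=True, ex=timeout):
        raise Locked('try again soon!')

    try:
        yield
    finally:
        # best-effort ownership check; a fully correct release would do
        # the compare-and-delete in a Lua script
        if r.get(key) == token:
            r.delete(key)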
Basic Sharding via Nydus
from nydus.db import create_cluster

redis = create_cluster({
    'backend': 'nydus.db.backends.redis.Redis',
    'hosts': {
        0: {'db': 0},
        1: {'db': 1},
    },
    'router': 'nydus.db.routers.keyvalue.PartitionRouter',
})
def count_hits_today(project_id):
    # same bucket walk as before, but batched across shards via Nydus
    end = int(time() / 10) * 10
    start = end - DAY_SECONDS

    keys = []
    for epoch in xrange(start, end, 10):
        key = '{}:{}'.format(project_id, epoch)
        keys.append(key)

    # map() pipelines the GETs and routes each key to its shard
    with redis.map() as conn:
        results = map(conn.get, keys)

    # drop empty buckets
    results = filter(bool, results)
    # coerce the remainder to ints
    results = map(int, results)
    # return the sum of the buckets
    return sum(results)
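The write side can go through the same cluster object: Nydus proxies the usual Redis commands and the PartitionRouter picks a shard per key, so the record_hit sketch from earlier only needs to swap r for redis (a sketch assuming the cluster proxies incr and expire the same way it proxies get above):

def record_hit(project_id):
    epoch = int(time() / BUCKET_SECONDS) * BUCKET_SECONDS
    key = '{}:{}'.format(project_id, epoch)

    # each command is routed to the shard that owns this key
    redis.incr(key)
    redis.expire(key, DAY_SECONDS)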