Redis Hacks
Python Nordeste 2014 - Lightning Talk
David Cramer
May 03, 2014
Transcript
Redis Hacks (or "How Sentry Scales")
David Cramer - twitter.com/zeeg
Buffering Writes
from redis import Redis

r = Redis()

def incr(type, id):
    key = 'pending:{}'.format(type)
    r.zincrby(key, id, 1)
def flush(type):
    key = 'pending:{}'.format(type)
    result = r.zrange(key, 0, -1, withscores=True)

    for id, count in result:
        prms = {'type': type, 'count': count, 'id': id}

        sql("""
            update %(type)s
            set count = count + %(count)d
            where id = %(id)s
        """, prms)
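Not part of the original slides: a minimal sketch of how the two halves might be wired together, assuming the hot path only touches Redis and a periodic job (the names on_event and flush_pending are made up here) drains the sorted set into SQL.

def on_event(group_id):
    # one cheap Redis write per event instead of one SQL UPDATE
    incr('sentry_groupedmessage', group_id)   # hypothetical table name

def flush_pending():
    # run from cron/celery every few seconds; performs the batched
    # SQL updates accumulated in the sorted set
    flush('sentry_groupedmessage')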
Rate Limiting
from time import time

def process_hit(project_id):
    # fixed 60-second window keyed by project and minute
    epoch = int(time() / 60)
    key = '{}:{}'.format(project_id, epoch)

    pipe = r.pipeline()
    pipe.incr(key)
    pipe.expire(key, 60)
    result = pipe.execute()

    # return current value
    return int(result[0])
def request(project_id):
    result = process_hit(project_id)
    if result > 20:
        return Response(status=429)
    return Response(status=200)
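A quick way to see the window behaviour (not from the slides): hits 1-20 inside a given minute pass, and everything after that gets a 429 until the minute rolls over and the key expires. This assumes the Response object exposes the status it was built with.

# hypothetical smoke test for the limiter above
for n in xrange(1, 26):
    resp = request('42')
    print n, resp.status  # hits 1-20 -> 200, the rest -> 429 until the window resets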
Time Series Data
DAY_SECONDS = 60 * 60 * 24

def count_hits_today(project_id):
    start = int(time())
    end = start + DAY_SECONDS

    pipe = r.pipeline()
    for epoch in xrange(start, end, 10):
        key = '{}:{}'.format(project_id, epoch)
        pipe.get(key)
    results = pipe.execute()

    # drop empty buckets
    results = filter(bool, results)
    # coerce the remainder to ints
    results = map(int, results)
    # return the sum of the buckets
    return sum(results)
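The slides only show the read side. A sketch of what the matching write path could look like, assuming hits are bucketed into the same 10-second keys the reader scans (the helper name record_hit is made up here):

def record_hit(project_id):
    # round down to the start of the current 10-second bucket
    epoch = int(time()) / 10 * 10
    key = '{}:{}'.format(project_id, epoch)

    pipe = r.pipeline()
    pipe.incr(key)
    # keep buckets around slightly longer than the window we query
    pipe.expire(key, DAY_SECONDS + 60)
    pipe.execute()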
Good-enough Locks
from contextlib import contextmanager
from time import sleep

@contextmanager
def lock(key, nowait=True):
    # Locked is assumed to be an application-defined exception
    while not r.setnx(key, '1'):
        if nowait:
            raise Locked('try again soon!')
        sleep(0.01)

    # limit lock time to 10 seconds
    r.expire(key, 10)

    # do something crazy
    yield

    # explicitly unlock
    r.delete(key)
def do_something_crazy():
    with lock('crazy'):
        print 'Hello World!'
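As the title says, this is "good enough" rather than correct: if the process dies between setnx and expire the key never gets a TTL, and the final delete can release a lock that has already expired and been re-acquired by someone else. A hedged variant, assuming a redis-py version whose set() supports the nx/ex flags, collapses acquire-and-expire into a single atomic command:

@contextmanager
def lock(key, nowait=True, ttl=10):
    # SET key 1 NX EX ttl: acquire and set the expiry atomically,
    # so a crash can no longer leave the key without a TTL
    while not r.set(key, '1', nx=True, ex=ttl):
        if nowait:
            raise Locked('try again soon!')
        sleep(0.01)
    try:
        yield
    finally:
        # still only "good enough": this can delete a lock that
        # expired and was re-acquired by another process
        r.delete(key)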
Basic Sharding via Nydus
from nydus.db import create_cluster

redis = create_cluster({
    'backend': 'nydus.db.backends.redis.Redis',
    'hosts': {
        0: {'db': 0},
        1: {'db': 1},
    },
    'router': 'nydus.db.routers.keyvalue.PartitionRouter',
})
def count_hits_today(project_id):
    start = int(time())
    end = start + DAY_SECONDS

    keys = []
    for epoch in xrange(start, end, 10):
        key = '{}:{}'.format(project_id, epoch)
        keys.append(key)

    # fan the GETs out across the shards in one batch
    with redis.map() as conn:
        results = map(conn.get, keys)

    # drop empty buckets
    results = filter(bool, results)
    # coerce the remainder to ints
    results = map(int, results)
    # return the sum of the buckets
    return sum(results)
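Not in the slides, but roughly how the cluster behaves in use: ordinary commands are routed to a single host based on the key, while redis.map() batches commands and fans them out to whichever shards own each key, resolving the results when the block exits. The key names below are illustrative only.

# hypothetical usage of the cluster defined above
redis.set('counter:1', '5')      # routed to one shard by the partition router
print redis.get('counter:1')

with redis.map() as conn:
    a = conn.get('counter:1')    # queued, executed per shard when the block exits
    b = conn.get('counter:2')
print a, b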