
Migrating Redis Data from IDC to AWS

mingrammer
February 28, 2019


Redis data migration write-up: https://mingrammer.com/redis-migration/


Transcript

  1. Name: MinJae Kwon
     Nickname: @mingrammer
     Email: [email protected]
     Who: Game Server Engineer @ SundayToz
     Blog: https://mingrammer.com
     Facebook: https://facebook.com/mingrammer
     Github: https://github.com/mingrammer
     Eng Blog: https://medium.com/@mingrammer
  2. • Redis servers: 37 (a mix of Redis 2.6 / 2.8)
     • Total pure data size, excluding overhead: ~20GB
     • Total key count: ~33M (33,000,000)
     • All the scattered keys must be consolidated into one place
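Some back-of-the-envelope figures for the dataset above. The totals are from the slide; the per-key average is my own illustration, not a number stated in the deck:

```python
# Totals from the slide; the per-key average is an illustration only.
total_bytes = 20 * 10**9   # ~20GB of pure data
total_keys = 33_000_000    # ~33M keys

avg_bytes_per_key = total_bytes / total_keys
print(round(avg_bytes_per_key))  # ~606 bytes per key on average
```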
  3. • Public Outbound: X
     • Public Inbound: X
     • Multi Master: X
     Key migration / Merge redis dbs / Proxy? Merge?
  4. Merge redis dbs
     • Merge 37 rdb files into a single rdb file
     • Dump & restore for all keys
  5. Merge redis dbs
     • Merge 37 rdb files into a single rdb file
     • Dump & restore for all keys
     There are no official / 3rd-party rdb merge tools
  7. Data Migration Architecture
     • 37 redis servers in IDC (192.x.x.x)
     • 1 iMac with 24GB RAM, running local replicas on ports :6400 … :6436
     • 8 parallel replicas at a time (Active / Waiting)
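With at most 8 replication jobs running at once, the 37 IDC servers drain in batches. The numbers are from the slide; the "waves" framing is mine:

```python
import math

servers = 37       # IDC Redis servers to drain (from the slide)
parallel_jobs = 8  # replication slots on the iMac (from the slide)

# ceil(37 / 8) batches: 8 + 8 + 8 + 8 + 5 servers
waves = math.ceil(servers / parallel_jobs)
print(waves)  # 5
```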
  8. Data Migration Process
     • Combine the replication fetch and the migration to EC2 into a single job
     • Spawn 8 processes to run these jobs
     • Prepare 37 local ports; each job is assigned a port and launches a local Redis server on it (localhost:64xx)
     • Up to 8 local Redis servers each replicate from an IDC Redis server (slaveof 192.x.x.x 6379)
     • Once data sync via replication completes, break the replication link and start migrating the data to EC2
     • Once the migration to EC2 completes, shut down the local Redis server, delete its dump file, and return the process to the pool
  14. Data Migration Process

      def run(port, server):
          local_server = 'localhost:{}'.format(port)
          fetch(server, local_server)
          migrate_all(local_server, 'ec2 server ip')
          shutdown(local_server)
          clear_dump(port)

      def main(start, end):
          ...
          pool = Pool(processes=8)
          pool.starmap(run, zip(local_ports, legacy_servers))
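The driver above fans each per-server job out over a fixed-size process pool. A minimal self-contained sketch of the same `Pool.starmap` pattern, with a dummy job standing in for the real `fetch`/`migrate_all` work (the ports and server addresses below are made up for illustration):

```python
from multiprocessing import Pool

def run(port, server):
    # stand-in for fetch / migrate_all / shutdown / clear_dump
    return 'localhost:{} <- {}'.format(port, server)

if __name__ == '__main__':
    local_ports = [6400, 6401, 6402]
    legacy_servers = ['192.0.2.1:6379', '192.0.2.2:6379', '192.0.2.3:6379']
    # starmap unpacks each (port, server) pair into run's two arguments
    with Pool(processes=8) as pool:
        results = pool.starmap(run, zip(local_ports, legacy_servers))
    print(results[0])  # localhost:6400 <- 192.0.2.1:6379
```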
  15. Data Migration Process: the working directory during a run (same driver code as the previous slide)

      $ ls -l
      ... dump6400.rdb   # synced local server ports; each rdb is deleted once its migration to EC2 completes
      ... dump6409.rdb
      ... dump6405.rdb
      ... dump6406.rdb
      ... temp_6401.rdb  # local server ports whose sync is still in progress
      ... temp_6408.rdb
      ... temp_6403.rdb
      ... temp_6402.rdb
  16. Data Migration Process (Synchronization with Replication)
      Start redis server after ping test

      # src: IDC server
      # dst: local server
      def fetch(src, dst):
          ...
          r = redis.StrictRedis(host=dsthost, port=dstport, charset='utf8')
          try:
              r.ping()
          except redis.exceptions.ConnectionError:
              subprocess.call([
                  'redis-server',
                  '--port', dstport,
                  '--dbfilename', 'dump{}.rdb'.format(dstport),
                  '--daemonize', 'yes',
              ], stdout=subprocess.DEVNULL)
          print('[{}] [{}|{}] Slave status: {}'.format(now(), src, dst, r.slaveof(srchost, srcport)))
          ...
  17. Data Migration Process (Synchronization with Replication)
      Sync and check done

      def fetch(src, dst):
          ...
          while True:
              master_link_status = r.info('replication')['master_link_status']
              master_sync_in_progress = r.info('replication')['master_sync_in_progress']
              if master_link_status == 'up' and master_sync_in_progress == 0:
                  r.slaveof()  # no args: SLAVEOF NO ONE, break the replication link
                  break
          print('[{}] [{}|{}] All keys are fetched.'.format(now(), src, dst))
  18. Data Migration Process (Dump & Restore)
      Parallelism by keyspace

      def migrate_all(src, dst):
          ...
          r = redis.StrictRedis(host=srchost, port=srcport, charset='utf8')
          keyspace = r.info('keyspace')
          print('[{}] [{}|{}] Started migrating.'.format(now(), src, dst))
          jobs = [gevent.spawn(migrate, src, dst, int(k[2:])) for k in keyspace.keys()]
          gevent.joinall(jobs)
          print('[{}] [{}|{}] Migration was done.'.format(now(), src, dst))
  19. Data Migration Process (Dump & Restore)
      Migrate all keys with pipelining

      def migrate(src, dst, db):
          count = 2500
          cursor = 0
          ...
          while True:
              # scan 2,500 keys at a time
              cursor, keys = srcr.scan(cursor, count=count)
              # dump
              pipeline = srcr.pipeline(transaction=False)
              for key in keys:
                  pipeline.pttl(key)
                  pipeline.dump(key)
              result = pipeline.execute()
              # restore
              pipeline = dstr.pipeline(transaction=False)
              for key, ttl, data in zip(keys, result[::2], result[1::2]):
                  if data is not None:
                      pipeline.restore(key, ttl + 10800000 if ttl > 0 else 0, data)
              pipeline.execute(False)
              ...
              if cursor == 0:
                  break
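The `zip(keys, result[::2], result[1::2])` line relies on the pipeline returning replies in queue order: each key contributes a PTTL reply followed by a DUMP reply, so the flat result list alternates ttl/data. A toy stand-in with plain values (no Redis needed) shows the split and the TTL adjustment:

```python
keys = ['k0', 'k1', 'k2']
# pipeline replies in queue order: pttl, dump, pttl, dump, ...
result = [1000, b'd0', -1, b'd1', 2000, b'd2']

# even indices are TTLs, odd indices are dumped payloads
pairs = list(zip(keys, result[::2], result[1::2]))
print(pairs[0])  # ('k0', 1000, b'd0')

# as in the slide: keep a positive TTL plus a 3-hour buffer (10800000 ms),
# or 0 (no expiry) for keys that had no TTL
ttls = [ttl + 10800000 if ttl > 0 else 0 for _, ttl, _ in pairs]
print(ttls)  # [10801000, 0, 10802000]
```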
  21. Conclusion
      ✓ Desktop → EC2: 23 min (~24,000 keys/s)
      ✓ EC2 → ElastiCache: 14 min (~40,000 keys/s)
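The quoted rates are consistent with the ~33M total keys from the earlier slide; a quick sanity check:

```python
total_keys = 33_000_000  # total key count from the earlier slide

desktop_to_ec2 = total_keys / (23 * 60)      # 23 minutes
ec2_to_elasticache = total_keys / (14 * 60)  # 14 minutes

print(round(desktop_to_ec2))      # 23913 -> matches "~24,000 keys/s"
print(round(ec2_to_elasticache))  # 39286 -> matches "~40,000 keys/s"
```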