Surviving Data in Large Doses
NoSQL Search Roadshow London 2013
Tareq Abedrabbo
November 20, 2013
Transcript
Surviving Data in Large Doses
Tareq Abedrabbo
NoSQL Search Roadshow London 2013
About me
• CTO at OpenCredo
• Delivering large-scale data projects in a number of domains
• Co-author of Neo4j in Action (Manning)
What this talk is about…
Supermarkets
Meanwhile, in DevLand
Bob is an application developer
Bob wants to build an application. Bob knows that a relational database is definitely not the right choice for his application.
Bob chooses a NoSQL database because he likes it (he secretly thinks it's good for his CV too).
Bob goes for a three-tier architecture. It's separation of concerns. It's best practice.
Bob builds an object model first. It's Domain Driven Design. It's best practice.
Bob uses an object mapping framework. Databases should be hidden behind layers of abstraction. It's best practice.
Bob hopes for the best!
What challenges is Bob facing?
Suitability of the data model
Suitability of the architecture and the implementation
Ability to meet new requirements
Being able to use the selected technology to its full potential
Performance
A number of applications built on top of NoSQL technologies end up unfit for purpose
How did we get ourselves into such a mess?
• Technical evangelism
• Evolution in requirements
• Unthinking decisions
• Ill-informed opinions
Common problem: the focus is on technology and implementation, not on real value
So what’s the alternative?
Separation of concerns based on data flow
Data flow
• Lifecycle
• Structure
• Size
• Velocity
• Purpose
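To make these dimensions concrete, here is a small illustrative sketch in Java (the names and enums are my own, not from the deck): each flow is described explicitly, so a model and store can be chosen per flow rather than once for the whole system.

```java
// Hypothetical sketch: describe each data flow along the dimensions above,
// so it can be matched to a suitable model and store.
enum Lifecycle { EPHEMERAL, OPERATIONAL, ARCHIVAL }
enum Structure { FLAT, NESTED, GRAPH }
enum Velocity { BATCH, STEADY, BURSTY }

record DataFlow(String name,
                Lifecycle lifecycle,
                Structure structure,
                long typicalRecordBytes,
                Velocity velocity,
                String purpose) { }

class ExampleFlows {
    // Raw click events and the customer profiles they eventually feed are very
    // different flows, even though both are "about the customer".
    static final DataFlow CLICKSTREAM = new DataFlow(
        "clickstream", Lifecycle.EPHEMERAL, Structure.FLAT,
        200, Velocity.BURSTY, "feed real-time dashboards");

    static final DataFlow CUSTOMER_PROFILE = new DataFlow(
        "customer-profile", Lifecycle.OPERATIONAL, Structure.NESTED,
        4_000, Velocity.STEADY, "serve account pages");
}
```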
How?
Identify the concerns: what do I care about?
Identify the locality of these concerns: where are the natural boundaries?
Build focused specialised models
Compose the models into a complete system
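A minimal sketch of what focused, specialised models can look like (my own illustration in Java, with hypothetical names): the same order is modelled once for the transactional flow and once for the search flow, and a small composition point derives one from the other.

```java
import java.time.Instant;
import java.util.List;

// Write-side model: the order as it is captured, shaped for the transactional flow.
record Order(String orderId, String customerId, List<String> skus,
             long totalPence, Instant placedAt) { }

// Search-side model: a flat, denormalised document shaped for querying.
record OrderSearchDocument(String orderId, String customerText,
                           String skuText, long totalPence) { }

// The composition point: derive the search model from the transactional one.
// In a real system this would sit behind an event feed or a data pipeline.
final class OrderModels {
    static OrderSearchDocument toSearchDocument(Order order) {
        return new OrderSearchDocument(
            order.orderId(),
            order.customerId(),
            String.join(" ", order.skus()),
            order.totalPence());
    }
}
```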
Computing is data structures + algorithms
If we accept that separation of concerns should be applied to algorithms, it is appropriate to apply the same thinking to data
The real value of this form of separation of concerns is true decoupling
What’s out there
CQRS
Polyglot Persistence
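The deck only names CQRS, but as a rough sketch of the idea (hypothetical types, in-memory, no framework): commands go through a write side that records events, and queries hit a separate read model folded from those events.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Command side: accept a deposit and record it as an event.
record DepositMade(String accountId, long amountPence) { }

final class CommandHandler {
    private final List<DepositMade> eventLog = new ArrayList<>();

    void handleDeposit(String accountId, long amountPence) {
        eventLog.add(new DepositMade(accountId, amountPence));
    }

    List<DepositMade> events() { return List.copyOf(eventLog); }
}

// Query side: a denormalised read model rebuilt by applying the events.
final class BalanceProjection {
    private final Map<String, Long> balances = new HashMap<>();

    void apply(DepositMade event) {
        balances.merge(event.accountId(), event.amountPence(), Long::sum);
    }

    long balanceOf(String accountId) {
        return balances.getOrDefault(accountId, 0L);
    }
}
```

Each side gets a model shaped for its own data flow, and the two are coupled only through the events.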
How do I apply it?
It depends on the data flow :)
For general-purpose data platforms, microservices work well
Build microservices that are closer to the natural underlying model
Other strategies are possible; for example, if the data is highly volatile, consider in-memory grids
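As an illustrative sketch of services built around the natural underlying model (interfaces and names are hypothetical): a durable, read-mostly catalogue and a highly volatile, in-memory basket each get their own narrow service, rather than sharing one generic persistence layer.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;

// A durable, read-mostly flow: product data served from whatever store suits it.
interface CatalogueService {
    List<String> findSkusByKeyword(String keyword);
}

// A highly volatile flow: baskets live in memory (or an in-memory grid) and are
// never forced through the same persistence layer as the catalogue.
final class BasketService {
    private final Map<String, List<String>> basketsByCustomer = new ConcurrentHashMap<>();

    void addToBasket(String customerId, String sku) {
        basketsByCustomer
            .computeIfAbsent(customerId, id -> new CopyOnWriteArrayList<>())
            .add(sku);
    }

    List<String> basketFor(String customerId) {
        return basketsByCustomer.getOrDefault(customerId, List.of());
    }
}
```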
There are practical considerations - obviously
Don't start with 10 different databases because you think you might eventually need all of them
How would that impact support and operations?
There is potential for simplification based on clearly targeted usage
Links
• Twitter: @tareq_abedrabbo
• Blog: http://www.terminalstate.net
• OpenCredo: http://www.opencredo.com
Thank you!