boards.
Does normalization work better in this case, or can Cassandra handle this
kind of write load?
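The write-load question above can be made concrete with a toy calculation (the numbers and function names below are invented for illustration, not from the thread): with denormalization, updating a pin's details means rewriting them into every board that holds a copy of the pin.

```python
# Toy arithmetic: how many writes does updating one pin's like count
# cost under each design? (Hypothetical sketch, not driver code.)

def writes_normalized(boards_containing_pin):
    # Normalized: the pin lives in one table; a single write updates it.
    return 1

def writes_denormalized(boards_containing_pin):
    # Denormalized: every board holding a copy of the pin must be rewritten.
    return boards_containing_pin

# A pin repinned to 50 boards: 1 write vs. 50 writes.
print(writes_normalized(50), writes_denormalized(50))
```

Cassandra's write path is cheap, so this fan-out is often acceptable, but it grows with a pin's popularity rather than staying constant.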
--
View this message in context:
http://cassandra-user-incubator-apache-org.3065146.n2.nabble.com/Data-modeling-for-Pinterest-like-application-tp7594481p7594517.html
Sent from the cassandra-u...@incubator.apache.org mailing list archive at
Nabble.com.
--
View this message in context:
http://cassandra-user-incubator-apache-org.3065146.n2.nabble.com/Data-modeling-for-Pinterest-like-application-tp7594481p7594539.html
Hello,
I'm working on data modeling for a Pinterest-like project. There are
basically two main concepts, Pin and Board, just as on Pinterest: a pin is
an item containing an image, a description, and other information such as a
like count, and each board holds a sorted list of pins.
The question is whether I should denormalize the pin details into the board
table, or just retrieve a page of pin ids (page size can be 10~20) and then
multi-get by pin_ids to obtain the details.
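For illustration, the two options might look roughly like this in CQL (a hypothetical sketch; all table and column names are invented, not from the thread):

```sql
-- Option 1: denormalized -- pin details are copied into each board,
-- so one partition read returns a fully renderable page.
CREATE TABLE board_pins_denorm (
    board_id    uuid,
    pin_order   timeuuid,   -- clustering column keeps pins sorted per board
    pin_id      uuid,
    image_url   text,
    description text,
    like_count  int,
    PRIMARY KEY (board_id, pin_order)
);

-- Option 2: normalized -- the board stores only pin ids;
-- details live in a separate table and are fetched by multi-get.
CREATE TABLE board_pins (
    board_id  uuid,
    pin_order timeuuid,
    pin_id    uuid,
    PRIMARY KEY (board_id, pin_order)
);

CREATE TABLE pins (
    pin_id      uuid PRIMARY KEY,
    image_url   text,
    description text,
    like_count  int
);
```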
-- Denormalizing is the best way to go in your case. Otherwise, for one
board read, you'll have 10-20 subsequent reads.
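The read amplification mentioned in that reply can be sketched in plain Python (a toy in-memory model, not driver code; the function names are invented for illustration):

```python
# Toy model: partition reads needed to render one page of a board.

def reads_normalized(page_size):
    # 1 read for the page of pin ids, plus one detail read per pin.
    return 1 + page_size

def reads_denormalized(page_size):
    # Pin details live inside the board partition: one read serves the page.
    return 1

for page_size in (10, 20):
    print(page_size, reads_normalized(page_size), reads_denormalized(page_size))
```

With a page size of 10~20, the normalized layout turns each board view into 11-21 reads (which the driver can issue concurrently), while the denormalized layout keeps it at one.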