@DuyHai
Yes, that's another case: the "entity" model used in RDBMSes. But I need the
values together in rows to work with them (indexing, etc.).

@sfespace
The map is needed when you have a dynamic schema. I don't have a dynamic
schema (I may at some point, and will use a map if I do). I just have
thousands of schemas: one user needs 10 integers, another user needs 20
booleans, another needs 30 integers, or some combination of them all.
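DuyHai's objection below is that Cassandra maps are strongly typed, so a single `map<text, blob>` column only works if you tag each value so it can be decoded later. A minimal sketch of that idea (the `encode`/`decode` helpers and the one-byte type tags are my own illustration, not anything from this thread):

```python
import struct

# Prefix each value with a one-byte type tag so mixed types can share
# one blob-valued map (e.g. a map<text, blob> column in Cassandra).
def encode(value):
    if isinstance(value, bool):  # check bool before int: bool subclasses int
        return b"b" + (b"\x01" if value else b"\x00")
    if isinstance(value, int):
        return b"i" + struct.pack(">q", value)  # 64-bit big-endian integer
    if isinstance(value, str):
        return b"s" + value.encode("utf-8")
    raise TypeError(f"unsupported type: {type(value)!r}")

def decode(blob):
    tag, payload = blob[:1], blob[1:]
    if tag == b"b":
        return payload == b"\x01"
    if tag == b"i":
        return struct.unpack(">q", payload)[0]
    if tag == b"s":
        return payload.decode("utf-8")
    raise ValueError(f"unknown tag: {tag!r}")

# One user's integers and another user's booleans can then live in the
# same map column, keyed by attribute name.
row = {"score": encode(42), "active": encode(True), "name": encode("dorian")}
decoded = {k: decode(v) for k, v in row.items()}
```

The obvious trade-off versus real typed columns is that Cassandra can no longer validate or index these values; they are opaque blobs to the server.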

On Thu, Sep 15, 2016 at 7:46 PM, DuyHai Doan <doanduy...@gmail.com> wrote:

> "Another possible alternative is to use a single map column"
>
> --> how do you manage the different types then? Because maps in Cassandra
> are strongly typed
>
> Unless you set the map's value type to blob, in which case you might as
> well store the whole object as a single blob column
>
> On Thu, Sep 15, 2016 at 6:13 PM, sfesc...@gmail.com <sfesc...@gmail.com>
> wrote:
>
>> Another possible alternative is to use a single map column.
>>
>>
>> On Thu, Sep 15, 2016 at 7:19 AM Dorian Hoxha <dorian.ho...@gmail.com>
>> wrote:
>>
>>> Since I will only have one table with that many columns, the other
>>> tables will be "normal" tables with at most 30 columns, and the metadata
>>> for 2K columns won't take much memory, so I'm guessing I'll be fine.
>>>
>>> The data model is too dynamic; the alternative would be to create a
>>> table per user, which would have even more overhead since the number of
>>> users is in the thousands to millions.
>>>
>>>
>>> On Thu, Sep 15, 2016 at 3:04 PM, DuyHai Doan <doanduy...@gmail.com>
>>> wrote:
>>>
>>>> There is no hard limit on the number of columns in a table. I would
>>>> say the main impact of having a lot of columns is the amount of metadata
>>>> C* needs to keep in memory for encoding/decoding each row.
>>>>
>>>> Now, if you have a table with 1000+ columns, the problem is probably
>>>> your data model...
>>>>
>>>> On Thu, Sep 15, 2016 at 2:59 PM, Dorian Hoxha <dorian.ho...@gmail.com>
>>>> wrote:
>>>>
>>>>> Is there a lot of overhead in having a large number of columns in a
>>>>> table? Not unbounded, but, say, would 2000 be a problem (I think that's
>>>>> the maximum I'll need)?
>>>>>
>>>>> Thank You
>>>>>
>>>>
>>>>
>>>
>