I have a column daycount of type list<bigint>. The column stores a count.
Every few seconds a new count is appended, and the total count for the day is
the sum of all items in the list.
My application logs indicate I wrote about 110,000 items to the column for a
particular row (the row key is day_timestamp).
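For reference, a minimal sketch of the schema described above. The table name and the literal values are assumptions for illustration; only the daycount column and row key are from the question:

```sql
-- Hypothetical table matching the description: one row per day,
-- with every count appended to a single list<bigint> column.
CREATE TABLE counts_by_day (
    day_timestamp timestamp PRIMARY KEY,
    daycount list<bigint>
);

-- Every few seconds, append the latest count to the day's list.
UPDATE counts_by_day
SET daycount = daycount + [42]
WHERE day_timestamp = '2014-01-22';
```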
But when I read the column back, I get a list with only 43,000 items. I
checked with both the Java driver and cqlsh. There are no errors or
exceptions anywhere.
There is this statement in the wiki: "Collection values may not be larger
than 64K". I assume this refers to a single item in a collection.
Has anyone else seen an issue like this?
asked Jan 22 2014 at 18:17 in Cassandra-User by Manoj Khangaonkar

2 Answers

Thanks. I guess I can work around this by maintaining hour_counts (which
will have fewer items per list) and adding the hour counts together to get
the day count.
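The hour-bucket workaround could look something like this sketch (the table name, column names, and literal values here are assumptions, not the poster's actual schema):

```sql
-- One row per day, one list per hour: each hourly list stays far
-- smaller than a single day-long list would.
CREATE TABLE counts_by_hour (
    day_timestamp timestamp,
    hour int,
    hour_counts list<bigint>,
    PRIMARY KEY (day_timestamp, hour)
);

-- Append to the current hour's list instead of one giant day list.
UPDATE counts_by_hour
SET hour_counts = hour_counts + [42]
WHERE day_timestamp = '2014-01-22' AND hour = 19;
```

The day total is then the sum over the 24 hourly lists rather than over one oversized collection.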
answered Jan 22 2014 at 19:28 by Manoj Khangaonkar
Alternatively, you can use clustering columns to store very big collections.
Beware of making a row too wide, though (use bucketing).
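A sketch of the clustering-column approach suggested above, with an hour bucket in the partition key to cap partition width. All names and values here are illustrative assumptions:

```sql
-- Store each count as its own clustered row instead of a list item;
-- bucketing by hour keeps any one partition from growing too wide.
CREATE TABLE counts (
    day_timestamp timestamp,
    bucket int,            -- e.g. hour of day
    seq timeuuid,          -- orders the counts within a bucket
    count bigint,
    PRIMARY KEY ((day_timestamp, bucket), seq)
);

-- Each new count is a plain insert, so there is no collection
-- size limit to hit.
INSERT INTO counts (day_timestamp, bucket, seq, count)
VALUES ('2014-01-22', 19, now(), 42);
```

Unlike a collection, clustered rows are not read back all at once by default, so the per-day data can grow well beyond what a list column can hold.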
answered Jan 23 2014 at 11:29 by DuyHai Doan