How could I solve it if, let's say, it would be limited to the second?

On Sun, Oct 5, 2014 at 10:07 AM, Stanley Iriele <[email protected]>
wrote:

> Or can it be down to, say, the hour... or minute... something like that?
> On Oct 4, 2014 11:53 PM, "Boaz Citrin" <[email protected]> wrote:
>
> > I need to filter by date and group by group...
> >
> > For example having these docs:
> > {
> >   "group" : "a",
> >   "associated" : "2014-10-04T21:58:59.377Z",
> >  ...
> > }
> > {
> >   "group" : "a",
> >   "associated" : "2014-10-03T21:58:59.377Z",
> >  ...
> > }
> > {
> >   "group" : "b",
> >   "associated" : "2014-10-04T21:58:59.377Z",
> >  ...
> > }
> > {
> >   "group" : "b",
> >   "associated" : "2014-10-01T21:58:59.377Z",
> >  ...
> > }
> >
> > I want to support a query with from-date = "2014-10-02T21:58:59.377Z"
> > and to-date = "2014-10-04T22:58:59.377Z".
> >
> > And the result will be:
> >
> > a, 2
> > b, 1
> >
> > On Sun, Oct 5, 2014 at 9:36 AM, Stanley Iriele <[email protected]>
> > wrote:
> >
> > > Right... are you saying that you need grouping down to the day? In
> > > which case you can just emit [date, group], the date being the
> > > milliseconds truncated to the day, and just have a reduce that's
> > > _count. By the way, the date can also be broken down to [year,
> > > month, day, group].
> > >
> > > You would need to change the group_level to 4 instead of 2, but the
> > > view would be human readable.
> > >
> > > The trick here is that you query with start_key, end_key to get the
> > > days, re-reduce those values again for that arbitrary date range in
> > > a reduce function, and you should be good.
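[Editor's sketch: the approach Stanley describes above, run outside CouchDB against the sample docs from this thread. emit() is stubbed so the map function runs in plain Node; in CouchDB only the map function plus reduce "_count" would go in a design document, and the day-range filter plus per-group re-reduce below stand in for ?startkey=[2014,10,2]&endkey=[2014,10,4,{}]&group_level=4 followed by a client-side re-reduce.]

```javascript
// Stub of CouchDB's emit() so the map function runs standalone.
const rows = [];
function emit(key, value) { rows.push({ key, value }); }

// Map: [year, month, day, group] as the key; no value needed with _count.
function map(doc) {
  if (doc.group && doc.associated) {
    const d = new Date(doc.associated);
    emit([d.getUTCFullYear(), d.getUTCMonth() + 1, d.getUTCDate(), doc.group], null);
  }
}

// The four sample docs from the thread.
const docs = [
  { group: "a", associated: "2014-10-04T21:58:59.377Z" },
  { group: "a", associated: "2014-10-03T21:58:59.377Z" },
  { group: "b", associated: "2014-10-04T21:58:59.377Z" },
  { group: "b", associated: "2014-10-01T21:58:59.377Z" },
];
docs.forEach(map);

// Element-by-element array comparison, mirroring how CouchDB collates
// array keys for startkey/endkey ranges.
function cmp(a, b) {
  for (let i = 0; i < Math.min(a.length, b.length); i++) {
    if (a[i] !== b[i]) return a[i] < b[i] ? -1 : 1;
  }
  return a.length - b.length;
}

// Keep rows whose [year, month, day] prefix falls in the requested day
// range, then re-reduce the per-day counts by group.
const from = [2014, 10, 2], to = [2014, 10, 4];
const perGroup = {};
for (const { key } of rows) {
  const day = key.slice(0, 3);
  if (cmp(day, from) >= 0 && cmp(day, to) <= 0) {
    perGroup[key[3]] = (perGroup[key[3]] || 0) + 1;
  }
}
console.log(perGroup); // { a: 2, b: 1 }
```

Note this gives day granularity only; finer ranges (the "to the second" question above) would need more key elements or the timestamp itself in the key.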
> > > On Oct 4, 2014 11:23 PM, "Boaz Citrin" <[email protected]> wrote:
> > >
> > > > I need to get the count of documents that were associated to a
> > > > group between two given dates. Thanks!
> > > >
> > > > On Sun, Oct 5, 2014 at 8:13 AM, Stanley Iriele <[email protected]>
> > > > wrote:
> > > >
> > > > > Hey, what date granularity are you looking to filter to? Day / month / year?
> > > > > On Oct 4, 2014 10:11 PM, "Boaz Citrin" <[email protected]> wrote:
> > > > >
> > > > > > Thanks Giovanni,
> > > > > > You say I can get all the groups at the same time,
> > > > > > but how can I achieve this and also filter by date?
> > > > > >
> > > > > > On Sun, Oct 5, 2014 at 4:04 AM, Giovanni P <[email protected]>
> > > > > > wrote:
> > > > > >
> > > > > > > You can use the second with group_level=1 and get all the
> > > > > > > groups at the same time.
> > > > > > > And you can use _count instead of _sum, so you don't even
> > > > > > > need to emit any value, just the key.
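[Editor's sketch: what Giovanni's group_level=1 suggestion returns for the second option's key shape, [group, associated]. emit() is stubbed so this runs in plain Node; in CouchDB the reduce would simply be the built-in "_count".]

```javascript
// Stub of CouchDB's emit() so the map function runs standalone.
const rows = [];
function emit(key, value) { rows.push({ key, value }); }

// Second option's map: key [group, associated]; no value needed with _count.
function map(doc) {
  if (doc.group && doc.associated) emit([doc.group, doc.associated], null);
}

// The four sample docs from the thread.
const docs = [
  { group: "a", associated: "2014-10-04T21:58:59.377Z" },
  { group: "a", associated: "2014-10-03T21:58:59.377Z" },
  { group: "b", associated: "2014-10-04T21:58:59.377Z" },
  { group: "b", associated: "2014-10-01T21:58:59.377Z" },
];
docs.forEach(map);

// ?group=true&group_level=1 truncates each key to its first element and
// counts the rows that collapse together: one row per group, one request.
const perGroup = {};
for (const { key } of rows) {
  perGroup[key[0]] = (perGroup[key[0]] || 0) + 1;
}
console.log(perGroup); // { a: 2, b: 2 } — every doc counted
```

This is one request for all groups, but with no date filter: combining a startkey/endkey range with group_level is exactly the difficulty discussed in this thread, since the range applies to the whole key, not per group.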
> > > > > > >
> > > > > > > On Sat, Oct 4, 2014 at 8:24 PM, Boaz Citrin <[email protected]>
> > > > > > > wrote:
> > > > > > >
> > > > > > > > Hello,
> > > > > > > >
> > > > > > > > My documents contain two fields to maintain group
> > > > > > > > associations: say "group" holds the group document id, and
> > > > > > > > "associated" holds the date this document was added to the
> > > > > > > > group.
> > > > > > > > Now I want to be able to know how many documents were
> > > > > > > > added to a given group[s] between two given dates.
> > > > > > > > The challenge is that to be able to filter by dates, I
> > > > > > > > need to have the date as the key's first part.
> > > > > > > > But I also need the group as the first key part in order
> > > > > > > > to aggregate the number of group associations.
> > > > > > > >
> > > > > > > > So I see two options here:
> > > > > > > >
> > > > > > > > 1.
> > > > > > > > Map: associated, {"group": group}
> > > > > > > > Reduce: a function that aggregates all values by group,
> > > > > > > > which I assume is fine as I know the number of groups is
> > > > > > > > relatively small.
> > > > > > > > (plus configuring reduce_limit=false ...)
> > > > > > > >
> > > > > > > > 2.
> > > > > > > > Map: [group, associated], 1
> > > > > > > > Reduce: sum(values)
> > > > > > > > Here I cannot retrieve multiple groups at once, so I use a
> > > > > > > > request per desired group.
> > > > > > > >
> > > > > > > > I tried the two approaches; the first one gives a faster
> > > > > > > > response. Which leads me to two questions:
> > > > > > > > 1. Is there any risk in a reduce function that produces a
> > > > > > > > potentially long string?
> > > > > > > > 2. Is there a better way to achieve what I do here?
> > > > > > > >
> > > > > > > > Thanks!
> > > > > > > >
> > > > > > > > Boaz
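[Editor's sketch: option 2 from the message above, run end-to-end outside CouchDB. emit() is stubbed so the map runs in plain Node; in CouchDB the reduce would be the built-in _sum, and each countForGroup call below corresponds to one request with startkey=[group, from] and endkey=[group, to].]

```javascript
// Stub of CouchDB's emit() so the map function runs standalone.
const rows = [];
function emit(key, value) { rows.push({ key, value }); }

// Option 2's map: key [group, associated], value 1, reduce sum(values).
function map(doc) {
  if (doc.group && doc.associated) emit([doc.group, doc.associated], 1);
}

// The four sample docs from the thread.
const docs = [
  { group: "a", associated: "2014-10-04T21:58:59.377Z" },
  { group: "a", associated: "2014-10-03T21:58:59.377Z" },
  { group: "b", associated: "2014-10-04T21:58:59.377Z" },
  { group: "b", associated: "2014-10-01T21:58:59.377Z" },
];
docs.forEach(map);

// One request per group. ISO-8601 timestamps sort lexicographically, so a
// plain string comparison matches the key-range behaviour here, and the
// range has full second (indeed millisecond) precision.
function countForGroup(group, from, to) {
  return rows
    .filter(r => r.key[0] === group && r.key[1] >= from && r.key[1] <= to)
    .reduce((sum, r) => sum + r.value, 0);
}

const from = "2014-10-02T21:58:59.377Z";
const to = "2014-10-04T22:58:59.377Z";
console.log(countForGroup("a", from, to), countForGroup("b", from, to)); // 2 1
```

This is the trade-off the thread circles around: option 2 gives arbitrary-precision date ranges but needs one query per group, while a date-first key (or day-truncated compound key) allows one query for all groups at only day granularity.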
