Thanks Bob.
I am currently running DHIS2 2.22 (build 22086) and have upgraded Postgres from 
version 9.3 to version 9.5. Analytics ran successfully on the previous setup 
with Postgres 9.3 and DHIS2 2.21.
Strangely enough, the tracker instance has no issue running analytics: it 
starts fine and ends successfully, without any errors. This failure in 
analytics is happening on just one instance (dhims), which holds both 
aggregated values and individual events.

Regards Tony

 
On Jul 28, 2016, at 10:50 AM, Bob Jolliffe <bobjolli...@gmail.com> wrote:
> 
> Hi
> 
> The folk in Ghana are hitting a problem running analytics where an
> invalid table name is being generated.  Specifically, the SQL which
> fails is:
> 
> create table analytics_event_temp_-1_r8cbfnorkzf ( ...
> 
> instead of something like ...
> 
> create table analytics_event_temp_2015_r8cbfnorkzf ( ....
> 
> Does anybody have an idea how that "-1" is getting generated, and where
> they might start to look to track down the problem?  I can see from
> the source code that the string is created with
> 
> name += PartitionUtils.SEP + PeriodType.getCalendar().fromIso(
> period.getStartDate() ).getYear();
> 
> For some reason that getYear() is returning -1, which means they must
> have some dodgy period as input.  Jason has pointed out that for
> analytics, the period might not be coming from the period table, but
> rather calculated from incidentdate or enrollmentdate or the like.
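> 
> One way to hunt for those dodgy dates (this is just a sketch, and I am
> assuming the 2.22 schema still has programinstance.enrollmentdate /
> incidentdate and programstageinstance.executiondate -- adjust the names
> if they differ on your instance) would be something like:
> 
> -- enrollments with missing or wildly out-of-range dates
> select programinstanceid, enrollmentdate, incidentdate
> from programinstance
> where enrollmentdate is null
>    or extract(year from enrollmentdate) not between 1900 and 2100
>    or extract(year from incidentdate) not between 1900 and 2100;
> 
> -- events with out-of-range execution dates
> select programstageinstanceid, executiondate, duedate
> from programstageinstance
> where executiondate is not null
>   and extract(year from executiondate) not between 1900 and 2100;
> 
> Any row that turns up there might be a candidate for producing a year
> of -1 in the partition name.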
> 
> If anyone has some helpful advice to give Tony in tracking this down,
> it would be appreciated.  As background, I understand this happened
> after an upgrade to Postgres 9.5, but this doesn't look like a database
> version problem to me.
> 
> I am not too sure of the dhis2 version but perhaps Tony can elaborate.
> 
> Bob

