Re: [Dhis2-devs] Fwd: [Dhis-dev] Export process from 1.4 to 2.0
Hi there. I need to do a fresh import of the Zambia DHIS 1.4 database after a long round of changes in the 1.4 database, but get stuck with the following stack trace. Hopefully, this will be the last one before we start doing incremental XML updates. Data integrity checks in 1.4 version 116 are OK. Also, I checked the data in the DataElementCalculated table and it looks OK. Any idea what this might be caused by?

Best regards,
Jason

* ERROR 08:26:23,406 The process threw exception (ProcessExecutor.java [Thread-14])
java.lang.RuntimeException: Query with RowHandler failed
    at org.hisp.dhis.importexport.dhis14.file.query.IbatisQueryManager.queryWithRowhandler(IbatisQueryManager.java:144)
    at org.hisp.dhis.importexport.dhis14.file.query.IbatisQueryManager.queryWithRowhandler(IbatisQueryManager.java:126)
    at sun.reflect.GeneratedMethodAccessor212.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
    at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:106)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
    at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
    at $Proxy45.queryWithRowhandler(Unknown Source)
    at org.hisp.dhis.importexport.dhis14.file.importer.DefaultDhis14FileImportService.importRoutineDataValues(DefaultDhis14FileImportService.java:766)
    at org.hisp.dhis.importexport.dhis14.file.importer.DefaultDhis14FileImportService.importData(DefaultDhis14FileImportService.java:296)
    at org.hisp.dhis.importexport.ImportInternalProcess.executeStatements(ImportInternalProcess.java:97)
    at org.hisp.dhis.system.process.AbstractStatementInternalProcess.execute(AbstractStatementInternalProcess.java:90)
    at org.hisp.dhis.system.process.AbstractStatementInternalProcess.execute(AbstractStatementInternalProcess.java:37)
    at org.amplecode.cave.process.ProcessExecutor.run(ProcessExecutor.java:125)
    at java.lang.Thread.run(Unknown Source)
Caused by: com.ibatis.common.jdbc.exception.NestedSQLException:
--- The error occurred in sqlmap/routineDataValue.sqlmap.xml.
--- The error occurred while applying a result map.
--- Check the routineDataValue.routineDataValueResultMap.
--- The error happened while setting a property on the result object.
--- Cause: java.lang.NullPointerException
Caused by: java.lang.NullPointerException
    at com.ibatis.sqlmap.engine.mapping.statement.GeneralStatement.executeQueryWithCallback(GeneralStatement.java:188)
    at com.ibatis.sqlmap.engine.mapping.statement.GeneralStatement.executeQueryWithRowHandler(GeneralStatement.java:133)
    at com.ibatis.sqlmap.engine.impl.SqlMapExecutorDelegate.queryWithRowHandler(SqlMapExecutorDelegate.java:644)
    at com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl.queryWithRowHandler(SqlMapSessionImpl.java:121)
    at com.ibatis.sqlmap.engine.impl.SqlMapClientImpl.queryWithRowHandler(SqlMapClientImpl.java:98)
    at org.hisp.dhis.importexport.dhis14.file.query.IbatisQueryManager.queryWithRowhandler(IbatisQueryManager.java:138)
    ... 18 more
Caused by: java.lang.NullPointerException
    at org.hisp.dhis.importexport.converter.AbstractDataValueConverter.getMatching(AbstractDataValueConverter.java:79)
    at org.hisp.dhis.importexport.converter.AbstractDataValueConverter.getMatching(AbstractDataValueConverter.java:39)
    at org.hisp.dhis.importexport.converter.AbstractConverter.read(AbstractConverter.java:77)
    at org.hisp.dhis.importexport.dhis14.file.rowhandler.RoutineDataValueRowHandler.handleRow(RoutineDataValueRowHandler.java:116)
    at com.ibatis.sqlmap.engine.mapping.statement.RowHandlerCallback.handleResultObject(RowHandlerCallback.java:76)
    at com.ibatis.sqlmap.engine.execution.SqlExecutor.handleResults(SqlExecutor.java:395)
    at com.ibatis.sqlmap.engine.execution.SqlExecutor.executeQuery(SqlExecutor.java:185)
    at com.ibatis.sqlmap.engine.mapping.statement.GeneralStatement.sqlExecuteQuery(GeneralStatement.java:205)
    at com.ibatis.sqlmap.engine.mapping.statement.GeneralStatement.executeQueryWithCallback(GeneralStatement.java:173)
    ... 23 more

___
Mailing list: https://launchpad.net/~dhis2-devs
Post to : dhis2-devs@lists.launchpad.net
Unsubscribe : https://launchpad.net/~dhis2-devs
More help : https://help.launchpad.net/ListHelp
Re: [Dhis2-devs] Fwd: [Dhis-dev] Export process from 1.4 to 2.0
Jason,

o I have now imported the database I downloaded last night without problems in 32 minutes, giving 4.9 million data values.
  - The code now manages missing calculated data element identifiers. This will be part of the next DHIS 2 release. If you remove them manually it will work anyway.
  - I had to remove the data values larger than 2^31, which is basically the budget data, from the database manually.

o To speed things up you can do the following (you have probably done a few of them):
  - Set the environment variable JAVA_OPTS; mine is -Xms128m -Xmx512m -XX:PermSize=128m -XX:MaxPermSize=1024m
  - Give Postgres more memory in pginstalldir/data/postgresql.conf by setting the shared_buffers property; mine is 256MB
  - For initial imports (!), in the DHIS 2 import screen - Show advanced options - set "Skip check for matching data values" to yes (empty database)

o Regarding db maintenance I did a few things under Data administration - Maintenance:
  - I ran "Clear zero values", which reduced the total number of data values from 4.9 to 3.3 million.
  - I ran "Prune periods", which reduced the total number of periods from 99 to 68. All of this will speed up the datamart process.
  - I entered the orgunit level names manually under Organisation Unit - Organisation Unit Levels.

o What you need to do is the following:
  - Have a look at the indicator and calculated data element identifiers. I am now replacing missing identifiers with 0, which might imply unwanted results or division-by-zero.
  - Change the aggregation operator from SUM to AVERAGE for data elements where you want to average when aggregating, e.g. Age and Population related elements. The sustainable solution to this is to change it in DHIS 1.4, which simply always sets this property to SUM.

o I am not sure why you didn't get it to work. This sounds a little silly, but are you sure you waited long enough? My import on a high-end machine took 32 min.
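For readers following along, the two tuning steps above can be applied roughly like this (paths are examples; adjust values to your own hardware, and on Windows set JAVA_OPTS as a system environment variable instead):

```shell
# JVM heap and PermGen settings for the process running DHIS 2
# (values taken from the mail above).
export JAVA_OPTS="-Xms128m -Xmx512m -XX:PermSize=128m -XX:MaxPermSize=1024m"

# Give Postgres more shared memory: edit <pg-install-dir>/data/postgresql.conf
# and set:
#   shared_buffers = 256MB
# then restart the Postgres service for the change to take effect.
```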
Is the GUI message saying "Import process completed", or is it still saying "Importing routine data values"? This is the end of my log. Yours stops at Imported Periods:

* INFO 12:58:25,726 Imported Periods (DefaultDhis14FileImportService.java [Thread-9])
* INFO 13:27:31,520 Imported RoutineDataValues (DefaultDhis14FileImportService.java [Thread-9])
* INFO 13:27:34,427 Imported OnChangePeriods (DefaultDhis14FileImportService.java [Thread-9])
* INFO 13:27:58,880 Imported SemiPermanentDataValues (DefaultDhis14FileImportService.java [Thread-9])
* INFO 13:27:58,958 Import process completed: 0:32:01.611 (ImportInternalProcess.java [Thread-9])

You can download the pg backup file here (created in windows/pgadmin):
http://folk.uio.no/larshelg/files/dhis2zambiaMarch2009.backup

best regards,
Lars
Re: [Dhis2-devs] Fwd: [Dhis-dev] Export process from 1.4 to 2.0
> We will need to change this in DHIS 1.4 somehow. Would it be possible for DHIS to implement a check to ensure that the value is not out of the allowable range? If it is, then perhaps it could simply ignore these values, and inform the user.

That would have been very nice, but this happens when the JDBC driver accesses the database and doesn't give the DHIS 2 application a chance to do something about it.

> - For initial imports (!), in the DHIS 2 import screen - Show advanced options - set "Skip check for matching data values" to yes (empty database)
>
> Could you explain this option? I would assume that it will check for existing data values and not overwrite them. If I choose no, I assume it will attempt to insert everything?

Exactly. No is the default and will do a check for matching data values for every row, which is redundant on empty databases.

> o I am not sure why you didn't get it to work. This sounds a little silly but are you sure you waited long enough? My import on a high-end machine took 32 min. Is the GUI message saying "Import process complete", or is it still saying "Importing routine data values"? This is the end of my log. Yours stops at Imported Periods
>
> Yes, I waited. Well, in fact, I did not have to wait. It just executed like lightning and gave me a null response in the browser window. I did not get anything like "Import process complete".

Strange, normally something would have emerged in the logs.

> http://folk.uio.no/larshelg/files/dhis2zambiaMarch2009.backup
>
> Big help Lars. This will at least enable me to start having a look at the development of the reports (which they need ASAP) and then we can continue to clean up the other issues. Thanks, Jason

No worries.
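For anyone wondering what the "Skip check for matching data values" option amounts to, a rough sketch (names are hypothetical, not the actual DHIS 2 code): with the check enabled, every incoming row triggers a lookup against existing values before deciding whether to insert or update, and on an empty database that lookup never hits, so it is pure overhead.

```python
def import_data_values(rows, existing_keys, skip_match_check=False):
    """Hypothetical sketch of the import loop, not the actual DHIS 2 code.

    rows: iterable of (key, value) tuples to import.
    existing_keys: set of keys already present in the target database.
    Returns a (inserted, updated) count pair.
    """
    inserted, updated = 0, 0
    for key, value in rows:
        if not skip_match_check and key in existing_keys:
            updated += 1   # a matching value exists: overwrite it
        else:
            inserted += 1  # no lookup done (or no match): plain insert
    return inserted, updated

# On an empty database the check is redundant: everything is an insert anyway.
print(import_data_values([("a", 1), ("b", 2)], set(), skip_match_check=True))   # (2, 0)
print(import_data_values([("a", 1), ("b", 2)], {"a"}, skip_match_check=False))  # (1, 1)
```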
Re: [Dhis2-devs] Fwd: [Dhis-dev] Export process from 1.4 to 2.0
Hi Everyone,

I wanted to pick up this thread again, after trying everything that I can to get the import process from 1.4 to 2.0 to work, but am still not succeeding. I have checked a database provided to me by Ola several months ago. This database contained roughly 3 million records. Since then, there have been several new imports into the master 1.4 database here in Zambia. According to DHIS 2, the database that was imported by Ola has the following characteristics. According to DHIS 1.4 the database contains roughly 3 million records, with only 100 or so being archived.

Data Statistics

Type                  Number
Data elements         3114
Data element groups   128
Indicator types       6
Indicators            390
Indicator groups      128
Data sets             19
Data dictionaries     0
Organisation units    12729
Validation rules      0
Periods               85
Data values           1646537

The database that I have recently imported (with over 4.5 million records, of which 2.2 million are archived according to DHIS 1.4) has the following attributes.

Type                  Number
Data elements         3124
Data element groups   128
Indicator types       6
Indicators            396
Indicator groups      128
Data sets             22
Data dictionaries     0
Organisation units    11484
Validation rules      0
Periods               38
Data values           543599

As you can see, there are significantly fewer data values that have been imported, even though DHIS 1.4 is reporting more data values. I have been importing directly from DHIS on an XP machine that has both systems. I have included the log of an import process towards the end of this email.

> Then try importing again. Should take around 15 minutes.

When Lars tried a few weeks back, he indicated the process would take about 15 minutes. In my case, the process is almost instantaneous, which seems impossible, given the amount of data in the original 1.4 database. I do have a new laptop, but it cannot be that fast, or can it!! So, I am asking for any pointers here about how to get all the data from 1.4 into 2.0.
I can of course provide all files to you if you can assist me, but I have been struggling for several days now, and really cannot figure out a way forward. It is crucial here in Zambia that we establish a clear, reliable workflow for the regular importation of data from 1.4 into 2.0. Any help would be appreciated.

Best regards,
Jason

Below is the log from the import process...

* INFO 16:24:32,484 Imported DataElements (DefaultDhis14FileImportService.java [Thread-23])
* INFO 16:24:36,078 Imported CalculatedDataElements (DefaultDhis14FileImportService.java [Thread-23])
* INFO 16:24:36,375 Imported IndicatorTypes (DefaultDhis14FileImportService.java [Thread-23])
* WARN 16:24:38,875 Value is null for key: '0' (LoggingHashMap.java [Thread-23])
* WARN 16:24:38,890 Value is null for key: '0' (LoggingHashMap.java [Thread-23])
* ERROR 16:24:38,890 'Inpatient referral rate' contains a non-existing data element identifier: 0 (Dhis14ExpressionConverter.java [Thread-23])
* WARN 16:24:38,890 Key is null (LoggingHashMap.java [Thread-23])
* WARN 16:24:38,890 Key is null (LoggingHashMap.java [Thread-23])
* WARN 16:24:38,906 Value is null for key: '0' (LoggingHashMap.java [Thread-23])
* WARN 16:24:38,906 Value is null for key: '0' (LoggingHashMap.java [Thread-23])
* WARN 16:24:38,906 Value is null for key: '0' (LoggingHashMap.java [Thread-23])
* ERROR 16:24:38,906 'Facility mortality under 5 years rate' contains a non-existing data element identifier: 0 (Dhis14ExpressionConverter.java [Thread-23])
* WARN 16:24:38,906 Value is null for key: '0' (LoggingHashMap.java [Thread-23])
* ERROR 16:24:38,906 'Facility mortality under 5 years rate' contains a non-existing data element identifier: 0 (Dhis14ExpressionConverter.java [Thread-23])
* WARN 16:24:38,906 Value is null for key: '0' (LoggingHashMap.java [Thread-23])
* ERROR 16:24:38,906 'Facility mortality under 5 years rate' contains a non-existing data element identifier: 0 (Dhis14ExpressionConverter.java [Thread-23])
* WARN 16:24:38,906 Value is null for key: '0' (LoggingHashMap.java [Thread-23])
* ERROR 16:24:38,906 'Facility mortality under 5 years rate' contains a non-existing data element identifier: 0 (Dhis14ExpressionConverter.java [Thread-23])
* WARN 16:24:38,906 Value is null for key: '0' (LoggingHashMap.java [Thread-23])
* ERROR 16:24:38,906 'Weighing rate under 5 years' contains a non-existing data element identifier: 0 (Dhis14ExpressionConverter.java [Thread-23])
* WARN 16:24:38,906 Value is null for key: '0' (LoggingHashMap.java [Thread-23])
* ERROR 16:24:38,906 'Weighing rate under 5 years' contains a non-existing data element identifier: 0 (Dhis14ExpressionConverter.java [Thread-23])
* WARN 16:24:38,921 Value is null for key: '0' (LoggingHashMap.java [Thread-23])
* WARN 16:24:38,921 Value is null for key: '0' (LoggingHashMap.java [Thread-23])
* ERROR 16:24:38,921 'Children with severe malnutrition' contains a
Re: [Dhis2-devs] Fwd: [Dhis-dev] Export process from 1.4 to 2.0
On Tue, Mar 31, 2009 at 6:16 PM, Jason Pickering jason.p.picker...@gmail.com wrote:

> Hi Lars,
> Thanks for the offer. I am preparing the DB now for upload and will upload it. It will likely take some time as the internet is quite slow on my side. There is another version, that is very similar except for a few changes to the organizational hierarchy, that I uploaded to our internet server a few days back. I will send you the link in another email, as the data is not public yet.
> That was (what I thought) the entire stack trace of the import process, as far as I can tell. This is another stack trace from yesterday, using a slightly different database (I corrected some issues with the organizational hierarchy), and that was the stack trace I included in the last mail. It seems a bit more complete, so I am not sure what happened with the other one. That was all there was as far as the import went. (Attached at the bottom.) Let me know if you come up with anything.

OK, if this is the whole thing, the system does not even attempt to import data values. This is weird; I will have a look at the data file.

Regarding import workflow, I would recommend doing one initial import from a data file with the legacy data and everything you have by now. Then use the XML-based import for the data you receive on a regular basis. That is DHIS 1.4 XML Import; remember that the DHIS 1.4 installation that produced the XML file must be build 112 or later.
Re: [Dhis2-devs] Fwd: [Dhis-dev] Export process from 1.4 to 2.0
On Tue, Mar 31, 2009 at 6:33 PM, Calle Hedberg chedb...@telkomsa.net wrote:

> Lars,
> If the missing data element ids in some of the indicator formulas result in the import process into 2.0 bombing out, then something is not right - random errors in e.g. formulas do not need to result in the whole process aborting. My suggestion would be to simply disregard any such indicators (but log a user-friendly message) and then continue the import. 1.4 does the same for indicator processing - any erroneously defined indicator is reported to the user and then disregarded.

Yes, you are right. We are working on making it more robust.
Re: [Dhis2-devs] Fwd: [Dhis-dev] Export process from 1.4 to 2.0
> I would assume this would be caused by the fact I am using a completely new database. So essentially, I would like to perform a completely clean installation from 1.4 to 2.0. We will need to do this on a regular basis, as 2.0 (at least for the time being in Zambia) will only be used for reporting and analysis. No data will be entered or modified. Each quarter, a new import from the master 1.4 database will need to take place. Thanks in advance for your help. I will attempt the second procedure now, and see what happens. Best regards, Jason

I had a look in the DHIS 1.4 Zambia database and found a few inconsistencies.

1) The table DataElementCalculated represents the formula for calculated data elements, and there are a few entries that refer to data elements that don't exist. These can be revealed by opening the table, sorting the DataElementCalculatedID column from A-Z, and removing the entries that show only identifiers/numbers. Existing data elements will be mapped and appear as names. Then do the same for the DataElementID column and remove entries with numbers/identifiers. Ideally this should be included in the data integrity checks in 1.4.

2) There are a few EntryNumber entries in the RoutineData table that exceed what the JDBC driver is able to read. Open the RoutineData table, sort big-small on EntryNumber, and remove the 3 highest numbers.

Then try importing again. Should take around 15 minutes.

Lars
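The orphaned-reference check described in 1) boils down to a set-membership test: flag any DataElementCalculated row whose identifiers do not resolve to an existing data element. A minimal sketch (the row data and identifiers here are invented for illustration, not taken from the Zambia database):

```python
def find_orphaned_references(calculated_rows, data_element_ids):
    """Return rows from DataElementCalculated whose DataElementCalculatedID
    or DataElementID does not exist among the known data element ids.

    calculated_rows: iterable of (DataElementCalculatedID, DataElementID) pairs.
    data_element_ids: set of valid data element identifiers.
    """
    return [
        (calc_id, elem_id)
        for calc_id, elem_id in calculated_rows
        if calc_id not in data_element_ids or elem_id not in data_element_ids
    ]

# Made-up example: identifier 99 does not exist, so two rows are orphaned.
valid_ids = {1, 2, 3}
rows = [(1, 2), (3, 99), (99, 1)]
print(find_orphaned_references(rows, valid_ids))  # [(3, 99), (99, 1)]
```

In the actual workflow this is what the manual sort-and-inspect procedure in Access achieves by eye; the same check would be a natural addition to the 1.4 data integrity report, as suggested above.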
Re: [Dhis2-devs] Fwd: [Dhis-dev] Export process from 1.4 to 2.0
Hi there,

I had seen these integrity check messages from 1.4 actually, but did not fix them. I see now that I need to do this. I just wanted to ensure that a complete clean import from 1.4 to 2.0 was actually possible, and it seems that it is.

> 2) There are a few EntryNumber entries in the RoutineData table that exceed what the JDBC driver is able to read. Open the RoutineData table, sort big-small on EntryNumber and remove the 3 highest numbers.

What is the maximum then? I would assume that we would need to do something to prevent this from happening in the first place. We have a new version of the DHIS 1.4 Zambia database that includes some more historical data (as of last Friday), but I am not sure if that is the version that you are using. This is part of the problem here, but that deserves another thread.

Thanks again, and I will revert to the list again if I encounter more issues.

Best regards,
jason

2009/3/18 Lars Helge Øverland larshe...@gmail.com:
> I would assume this would be caused by the fact I am using a completely new database. So essentially, I would like to perform a completely clean installation from 1.4 to 2.0. We will need to do this on a regular basis, as 2.0 (at least for the time being in Zambia) will only be used for reporting and analysis. No data will be entered or modified. Each quarter, a new import from the master 1.4 database will need to take place. Thanks in advance for your help. I will attempt the second procedure now, and see what happens. Best regards, Jason
>
> I had a look in the DHIS 1.4 Zambia database and found a few inconsistencies.
>
> 1) The table DataElementCalculated represents the formula for calculated data elements, and there are a few entries that refer to data elements that don't exist. These can be revealed by opening the table, sorting the DataElementCalculatedID column from A-Z and removing the entries with only identifiers/numbers. Existing data elements will be mapped and appear as names. Then do the same for the DataElementID column and remove entries with numbers/identifiers. Ideally this should be included in the data integrity checks in 1.4.
>
> 2) There are a few EntryNumber entries in the RoutineData table that exceed what the JDBC driver is able to read. Open the RoutineData table, sort big-small on EntryNumber and remove the 3 highest numbers.
>
> Then try importing again. Should take around 15 minutes.
>
> Lars
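On the "what is the maximum" question: assuming the driver maps the EntryNumber column to a Java int (which the earlier remark in the thread about removing values larger than 2^31 suggests), the limit is that of a signed 32-bit integer, 2,147,483,647. A quick illustration:

```python
# A Java int is a signed 32-bit value, so the largest EntryNumber a JDBC
# driver reading the column as int can represent is 2**31 - 1.
# (That the driver uses int here is an assumption based on the thread.)
JDBC_INT_MAX = 2**31 - 1

def fits_in_java_int(value):
    """Check whether a value is representable as a signed 32-bit Java int."""
    return -2**31 <= value <= JDBC_INT_MAX

print(JDBC_INT_MAX)                     # 2147483647
print(fits_in_java_int(2_000_000_000))  # True
print(fits_in_java_int(3_000_000_000))  # False: such a row breaks the import
```

Preventing the problem at the source would mean constraining EntryNumber in 1.4 to stay within this range before export.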