Greetings,

I am evaluating ledger for a use case where a large accounting dataset 
contains tens of thousands of transactions every month, over several years. 
Transactions are mostly currency conversions, and my initial search 
specifically led me to ledger due to its flexible commodity handling. A few 
questions related to the large size of the journal:

(1) Would it be practical to use the program with such a large transaction 
dataset at all? Has anyone tried it with 10^5 to 10^6 transactions?

(2) I'd have to script a conversion from my current JSON data to the ledger 
journal format. Are there recommended alternative input formats (e.g. the 
XML format mentioned in some places) that would speed up transaction 
parsing? Or would the human-readable journal text format be just as fast?
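To make question (2) concrete, here is a minimal sketch of the conversion I have in mind. The JSON field names (date, payee, postings, account, amount, commodity) are invented for illustration; my actual schema differs, but the shape of the problem is the same:

```python
import json

def to_journal(txn):
    # Render one transaction (hypothetical JSON schema) as a ledger entry:
    # a "DATE PAYEE" header line, then one indented posting per line.
    lines = [f"{txn['date']} {txn['payee']}"]
    for p in txn["postings"]:
        # Pad the account name so the amount is separated by at least
        # two spaces, as the journal format requires.
        lines.append(f"    {p['account']:<40}{p['amount']} {p['commodity']}")
    return "\n".join(lines)

sample = json.loads("""
{"date": "2018/01/15", "payee": "FX conversion",
 "postings": [
   {"account": "Assets:EUR", "amount": "100.00", "commodity": "EUR"},
   {"account": "Assets:USD", "amount": "-123.45", "commodity": "USD"}]}
""")

print(to_journal(sample))
```

The question is essentially whether generating (and ledger then parsing) hundreds of thousands of such text entries stays fast enough, or whether a different input format would help.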

(3) If the journal is updated, say, monthly, is there any way to cache the 
previous months' calculations, or would the entire updated ledger have to 
be parsed and recalculated each time?

Any recommendations would be appreciated; I like the cleanness of the 
program and hope to make it work for this use case. Thanks.


--- 
You received this message because you are subscribed to the Google Groups 
"Ledger" group.
