Hi all,

I recently switched over from hledger to beancount and am really 
enjoying the thoughtful and extensive integrations (fava, importers, etc.). 
A while back I wrote a tool 
<https://github.com/allancalix/clerk/tree/c11a032bec41806c5bb26486d35c5aa61d93859e> 
to generate ledger entries from Plaid API data, and I'm now using it 
to generate beancount entries.

I'm removing the half-baked ledger entry printer and have a WIP importer 
<https://github.com/allancalix/clerk-importer> to generate beancount 
entries instead. So far everything works as expected, but parts of the 
importer have a clunky interface:

   1. Importers are mapped to whether or not they can handle specific 
   files. With this importer, I tend to pull date ranges each time I 
   update my beancount file, which produces a single stream of transactions 
   for all accounts linked to Plaid (this leads to the second point). I 
   currently work around this by using a predefined JSON file 
   
<https://github.com/allancalix/clerk-importer/blob/14ba8280e7ae128db62faaae20aa5dd7ff13b6c6/clerk_importer/clerk.json#L1>
 
   with the date range as inputs.
   2. There is no natural account name to map to (i.e. `file_account`). 
   Because the stream of transactions relates to multiple beancount 
   accounts, filing doesn't make much sense in this context. This is minor, 
   as the file could be deleted after import, but it makes for a bizarre 
   flow when used in fava.
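To make the workaround concrete, here's a rough sketch of what I mean, in the shape of the beancount importer protocol (the config filename `clerk.json` and its `start_date`/`end_date` keys are illustrative here, not necessarily the real clerk-importer schema, and the `Assets:Plaid` filing account is a made-up catch-all):

```python
import json
from pathlib import Path


class PlaidRangeImporter:
    """Sketch mirroring the beancount ImporterProtocol shape
    (identify/file_account) without importing beancount itself."""

    # Hypothetical predefined config file holding the date range to pull.
    CONFIG_NAME = "clerk.json"

    def identify(self, filepath: str) -> bool:
        # "Identify" the predefined JSON config rather than an actual
        # statement file -- this is the clunky part: the importer claims
        # a config, not data downloaded from any one account.
        p = Path(filepath)
        if p.name != self.CONFIG_NAME:
            return False
        try:
            cfg = json.loads(p.read_text())
        except (OSError, json.JSONDecodeError):
            return False
        return "start_date" in cfg and "end_date" in cfg

    def file_account(self, filepath: str) -> str:
        # No single natural account: the extracted stream spans every
        # Plaid-linked account, so all we can do is return a generic
        # filing bucket, which is what makes the fava flow feel odd.
        return "Assets:Plaid"
```
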

Overall these hiccups are minor, but I'd really like to know whether there 
are better patterns for handling extractors that are associated with 
multiple accounts and don't have a natural related file to import.

Thanks for any feedback or input you can provide.

Best,
Allan

-- 
You received this message because you are subscribed to the Google Groups 
"Beancount" group.