Send each element as a message via the producer template.
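For example, a minimal sketch of that idea (the `direct:batch` endpoint name and the `JsonBatchFeeder`/`processJsonFile` names are placeholders of mine, not from your code):

```java
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonObject;
import com.google.gson.stream.JsonReader;

import org.apache.camel.CamelContext;
import org.apache.camel.ProducerTemplate;

public class JsonBatchFeeder {

    // Streams the JSON array with Gson and hands each full batch to a Camel route.
    public static void processJsonFile(CamelContext context, String fileName, int batchSize)
            throws Exception {
        ProducerTemplate template = context.createProducerTemplate();
        List<JsonObject> batchList = new ArrayList<>();
        Gson gson = new GsonBuilder().create();

        try (JsonReader reader = new JsonReader(new FileReader(fileName))) {
            reader.beginArray();
            while (reader.hasNext()) {
                batchList.add(gson.fromJson(reader, JsonObject.class));
                if (batchList.size() == batchSize) {
                    // Send a copy of the batch into the pipeline, then reuse the list.
                    // "direct:batch" is an assumed endpoint name.
                    template.sendBody("direct:batch", new ArrayList<>(batchList));
                    batchList.clear();
                }
            }
            reader.endArray();
        }
        if (!batchList.isEmpty()) {
            // Flush the final partial batch.
            template.sendBody("direct:batch", batchList);
        }
    }
}
```

Any route consuming that endpoint (e.g. `from("direct:batch").to("jdbc:...")`) then receives one List per batch, so only one batch is ever held in memory at a time.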

On Fri, Oct 23, 2020, 18:44 Site Register <site.regis...@ymail.com.invalid>
wrote:

>  I am thinking of having a processor that reads the JSON file as a stream
> and produces batches to the outbound endpoint. However, I am not sure how to
> batch messages out during a streaming read.
>
> try (JsonReader reader = new JsonReader(new FileReader(fileName))) {
>     reader.beginArray();
>     Gson gson = new GsonBuilder().create();
>     int numberOfRecords = 0;
>     while (reader.hasNext()) {
>         JsonObject document = gson.fromJson(reader, JsonObject.class);
>         batchList.add(document);
>         numberOfRecords++;
>         if (numberOfRecords % batchSize == 0) {
>             /*
>                 How can I process batchList to the pipeline?
>             */
>         }
>     }
> }
>
>
>     On Friday, October 23, 2020, 10:39:28 AM EDT, Romain Manni-Bucau <
> rmannibu...@gmail.com> wrote:
>
>  Hi,
>
> From my experience (I hit the same issue with XML years ago), the simplest
> approach is to write a custom component (like jsonstreaming:....) and handle
> the split inside that component.
> It will likely require a JSON Pointer to locate the array and emit its items
> one by one to the next processor.
> In the end the route could look something like:
>
> from("jsonstreaming:/path/file.json?pointer=/items")
>   .to("bean:processUser");
>
> Romain Manni-Bucau
> @rmannibucau <https://twitter.com/rmannibucau> |  Blog
> <https://rmannibucau.metawerx.net/> | Old Blog
> <http://rmannibucau.wordpress.com> | Github <
> https://github.com/rmannibucau> |
> LinkedIn <https://www.linkedin.com/in/rmannibucau> | Book
> <
> https://www.packtpub.com/application-development/java-ee-8-high-performance
> >
>
>
> Le ven. 23 oct. 2020 à 16:16, Mantas Gridinas <mgridi...@gmail.com> a
> écrit :
>
> > From personal experience, any activity involving the file component tends
> > to load the entire file into memory. You could perhaps work around that by
> > converting it to an input stream, but then you run into the problem of
> > making sure you don't read the whole file into memory before actually
> > converting it.
> >
> > I'd suggest avoiding Camel here entirely, for the sake of retaining
> > fine-grained control over the streaming process. At most you could wrap it
> > in a processor and be done with it. Otherwise you'll end up programming
> > with routes, which is something you'd want to avoid.
> >
> >
> > On Fri, Oct 23, 2020, 17:05 Site Register <site.regis...@ymail.com
> > .invalid>
> > wrote:
> >
> > > Hi Camel Users,
> > > I have a 4 GB JSON array file that needs to be loaded into a database.
> > > How can I leverage Camel to stream the file and split it into JSON
> > > objects?
> > > I tried to use "stream:file", but it reads line by line rather than
> > > splitting into JSON.
> > > I leveraged Gson streaming to read and insert into the database in a
> > > plain Java application, which took about 3 minutes. However, I would
> > > like to check whether there is a way to leverage the Camel pipeline for
> > > the same purpose.
> > > Thank you,
> >
