Dear Werner,
 
> let me know whether you'd need any help with creating an EventProducer
> that in my humble opinion provides a better approach going about your
> restriction business than dealing with this issue as part of an addBook
> method.
 
OK, I will let you know about any solution we find.
 
I don't agree with you that the EventProducer approach is appropriate for solving my problem. An EventProducer would be really appropriate for FILTERING the nodes we want to parse, but that is not my problem; at least it is not an efficient solution to it.
 
What I would like is to unmarshal a given set of identically defined nodes without stopping the unmarshalling process, then go on to the next set of nodes within the same unmarshaller run!
 
With an EventProducer you can filter the nodes to unmarshal, but only over the whole file, from the beginning to the end of the parsing process. So you can filter the books of a particular publisher, for example, but that is not my problem. I have no criterion other than fixing the maximum number of nodes to unmarshal at a time, to avoid an overflow in the database or in my application.
 
I think there is no way to tell the EventProducer:
 
"Pick the next set of books, then pause the process, let me store the books in the database, and then continue" (unless we mix the design of the unmarshalling process with the DAO objects).
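Just to make that pause-and-store idea concrete, here is a sketch of it in plain SAX. This is only an illustration of the behaviour I am after; the handler, the element names and the batch logic are all hypothetical, not part of the Castor API:

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

import javax.xml.parsers.SAXParserFactory;

import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

/** Collects <book> element text and "stores" it in batches of maxBatch. */
public class BatchingBookHandler extends DefaultHandler {

    private final int maxBatch;
    private final List<String> batch = new ArrayList<>();
    // stands in for the database; each entry is one stored batch
    public final List<List<String>> flushed = new ArrayList<>();
    private StringBuilder text;

    public BatchingBookHandler(int maxBatch) {
        this.maxBatch = maxBatch;
    }

    @Override
    public void startElement(String uri, String local, String qName, Attributes atts) {
        if ("book".equals(qName)) {
            text = new StringBuilder();
        }
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        if (text != null) {
            text.append(ch, start, length);
        }
    }

    @Override
    public void endElement(String uri, String local, String qName) {
        if ("book".equals(qName)) {
            batch.add(text.toString().trim());
            text = null;
            if (batch.size() >= maxBatch) {
                flush();  // "pause" here and store the current batch
            }
        }
    }

    @Override
    public void endDocument() {
        if (!batch.isEmpty()) {
            flush();  // store whatever is left over
        }
    }

    // in a real application this would call the DAO instead of keeping lists
    private void flush() {
        flushed.add(new ArrayList<>(batch));
        batch.clear();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<bookstore><book>a</book><book>b</book><book>c</book></bookstore>";
        BatchingBookHandler handler = new BatchingBookHandler(2);
        SAXParserFactory.newInstance().newSAXParser()
                .parse(new InputSource(new StringReader(xml)), handler);
        System.out.println(handler.flushed);  // [[a, b], [c]]
    }
}
```

The point is that the storing happens inside the parse callbacks, which is exactly the mixing of concerns we have been discussing.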
 
With the EventProducer you can only say, as far as I understand it (other gurus may have an opinion on that):
 
"Filter the books by a certain criterion; the unmarshaller will then parse the entire file applying that filter; you then manipulate the unmarshalled objects (for example, store them in the database) and start the unmarshalling process again from the beginning with the next filter criterion."
 
For the moment the better solution is the addBook method, which is not a pretty solution, as we have already discussed.
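For reference, that addBook workaround amounts to something like the following sketch. BookDao and the batch size are hypothetical names I am making up here; the only Castor-specific part is that addBook would be invoked once per book element via the mapping's set-method:

```java
import java.util.ArrayList;
import java.util.List;

/** Hypothetical DAO; stands in for whatever persistence layer is used. */
interface BookDao {
    void save(List<String> books);
}

/**
 * Bookstore whose addBook method batches books and saves them through the
 * DAO every maxBatch calls -- mixing unmarshalling with persistence, which
 * is exactly the design smell discussed above.
 */
public class Bookstore {

    private final BookDao dao;
    private final int maxBatch;
    private final List<String> pending = new ArrayList<>();

    public Bookstore(BookDao dao, int maxBatch) {
        this.dao = dao;
        this.maxBatch = maxBatch;
    }

    // Castor would call this once per parsed book (set-method="addBook")
    public void addBook(String book) {
        pending.add(book);
        if (pending.size() >= maxBatch) {
            dao.save(new ArrayList<>(pending));
            pending.clear();
        }
    }

    // must be called after unmarshalling to persist the last partial batch
    public void finish() {
        if (!pending.isEmpty()) {
            dao.save(new ArrayList<>(pending));
            pending.clear();
        }
    }
}
```

Note the extra finish() call needed for the last partial batch, and the DAO dependency baked into the mapped class: that is the ugliness I keep referring to.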

> The remainder of the email is me just thinking out aloud. Yes, I agree
> that it would be great to be able to control some of the criteria
> mentioned, e.g. how many books to unmarshal, etc. But providing such
> features at the API level does *not* really appeal to me, especially
> since we are talking about dealing with XML documents already. I think
> it should not be too hard - and I hope I am not proven to be completely
> wrong .. ;-) - to come up with an EventProducer that filtered an XML
> document based upon e.g. an XPath statement, where one would specify
> that you are only interested in unmarshaling a specific subset of the
> original document.
 
See my previous comment about this.
 
Thanks again for your time and interest. I would also like to hear Keith Visco's opinion about this suggestion of adding such a service to the unmarshaller API. Do you think it would be appropriate to request it on JIRA?

David


Werner Guttmann <[EMAIL PROTECTED]> wrote:
David,

let me know whether you'd need any help with creating an EventProducer
that in my humble opinion provides a better approach going about your
restriction business than dealing with this issue as part of an addBook
method. You are right that mixing concerns is not a good design
practice; but at the moment, it is the only working solution for you ..
;-). Iow, a little bit ugly from the design perspective (think
separation of concerns), but it works.

The remainder of the email is me just thinking out aloud. Yes, I agree
that it would be great to be able to control some of the criteria
mentioned, e.g. how many books to unmarshal, etc. But providing such
features at the API level does *not* really appeal to me, especially
since we are talking about dealing with XML documents already. I think
it should not be too hard - and I hope I am not proven to be completely
wrong .. ;-) - to come up with an EventProducer that filtered an XML
document based upon e.g. an XPath statement, where one would specify
that you are only interested in unmarshaling a specific subset of the
original document.

Just my 0.02 cents ...
Werner

David wrote:
> Stephen,
>
> Thanks again for your interest,
>
> Concerning your solution, what Keith suggests seems to work.
>
> Just one design question about mixing the unmarshalling process with the
> business stuff: you have to embed storing the books into the database
> within the unmarshalling process. So the unmarshalling process internally
> contains the service for storing the domain objects in the database,
> which is not such good design practice, is it?
>
> Anyway, I agree with you that it could work. It is just worth thinking
> about a possible improvement of this unmarshalling process, in order to
> provide a better solution for treating large files. I would like to
> provide an example of how to use an EventProducer for this case; I am
> not a SAX expert, but if I find a way to do it I will send you the
> sample source code.
>
> With the current Castor API you have limited control over the parsing
> process. If you want to parse a file from the beginning to the end it is
> easy; otherwise you have to pay a certain price for that.
>
> Something like the following would be nice for such typical files with a
> big sequence of identical node elements, like the bookstore example:
> unmarshal the file while setting a maximum number of elements per
> unmarshal step, something like this:
>
> int size = ...; // maximum number of records to unmarshal per step
>
> while (thereAreMoreRecords()) {
>     List books = unmarshall(Book.class, size);
>     booksDao.save(books);
> }
>
> With the BookStore-class solution we need a dependency on bookDao in
> order to save the books when the max counter is reached. That sounds a
> little ugly to me, doesn't it?
>
> Thanks in advance,
>
> David
>
> Stephen Bash <[EMAIL PROTECTED]> wrote:
>
> David-
>
> In order to save Keith some time, let me jump in. Keith's point about
> the addBook() method is that you don't have to store the contents
> element. You can simply put whatever processing you want done
> to each element within the addBook() method. Using your
> example, you could internally maintain a List of books, and when that
> list reached 1000 objects, perform whatever processing you wish to
> perform (assuming it happens on the same thread, Castor will wait),
> and then empty the list and start over again.
>
> Does that make more sense? Let me know if you have more questions.
>
> Stephen
>
> On 2/21/06, David wrote:
> > Keith,
> >
> > Thanks for your interest in my problem. The solution with the
> > Bookstore class is what I implemented first, but if the file is very
> > big, after parsing and unmarshalling the file you get a big array, and
> > that is what I don't want.
> >
> > I would like to unmarshal book by book, or in lists of books smaller
> > than the original file; for example, if my file has 1,000,000 books,
> > unmarshal in packages of, say, 1000 books each time.
> >
> > Thanks in advance,
> >
> > David Leal
> >
> > Keith Visco wrote:
> > David,
> >
> > Another solution would be to create a Bookstore class with an addBook
> > method...such as:
> >
> > public class Bookstore {
> >
> >     public void addBook(Book book) {
> >         // add the book to the database
> >     }
> >
> >     public Book[] getBooks() {
> >         // return an array of Book instances
> >         return null;
> >     }
> > }
> >
> >
> > You can write a very simple mapping file:
> >
> > <mapping>
> >     <class name="Bookstore">
> >         <field name="books" type="Book" collection="array"
> >                get-method="getBooks" set-method="addBook">
> >             <bind-xml name="book" node="element"/>
> >         </field>
> >     </class>
> > </mapping>
> >
> > Each time Castor encounters a <book> element it will call the addBook
> > method. You can then do whatever you want with the book and
> > discard it.
> >
> > If you're only going to use this for unmarshalling you don't need the
> > getBooks method.
> >
> > --Keith
> >
> > David wrote:
> > > Dear Members,
> > >
> > > I have a typical file like this:
> > >
> > > <bookstore>
> > >     <book>...</book>
> > >     <book>...</book>
> > >     ...
> > > </bookstore>
> > > If the file is big enough I can't load the whole bookstore at once;
> > > my idea is to take each book's information one by one (or a fixed
> > > number of book elements, to avoid overflow), then for example store
> > > it in the database and take the next one. So I want to parse a book
> > > event, process it, and then go on to the next one. How can we do
> > > that with Castor?
> > >
> > > If I bind the whole bookstore element into a BookStore class with
> > > millions of books, I get an overflow for sure.
> > >
> > > There should be a solution where the unmarshalling process gets one
> > > book at a time, you process it (for example store it in the
> > > database) and go on to the next book, without creating a new object
> > > each time, or doing it in packages of a certain number of books each
> > > time.
> > >
> > > Thanks in advance,
> > >
> > > David Leal
> > > [EMAIL PROTECTED]
> > >
> > >
> >
> >
> >
> > -------------------------------------------------
> > If you wish to unsubscribe from this list, please
> > send an empty message to the following address:
> >
> > [EMAIL PROTECTED]
> > -------------------------------------------------
> >
> >
> >
> >
>
>




