More info:

Since all loading and processing was handled by one method in a
session bean, we assumed it all ran in a single transaction, and that the
problems came from this.
So we tried changing the transaction type for this method to NONE (or
NEVER, we can't remember which), with the same result.
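For reference, the kind of ejb-jar.xml change we made looks roughly like this (a sketch; the standard attribute values are NotSupported and Never, and the bean/method names are taken from the stack trace below):

```xml
<!-- assembly-descriptor fragment in ejb-jar.xml (sketch, not our exact file);
     ejb-name is an assumption based on the stack trace below -->
<container-transaction>
  <method>
    <ejb-name>SanistaalManager</ejb-name>
    <method-name>loadItemsFromFile</method-name>
  </method>
  <trans-attribute>NotSupported</trans-attribute>
</container-transaction>
```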

Then we tried moving the file-traversing code into a jsp-bean that
reads one line at a time and passes each line to the session bean,
which processes it.

This works. Files of 30.000+ lines are imported with no problems at
all (apart from somewhat slow loading times).
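The approach can be sketched like this in plain Java (class and method names are hypothetical; the processor interface stands in for the session-bean call, so each line becomes a separate invocation instead of one transaction spanning the whole file):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;

// Sketch of the line-at-a-time approach described above.
// All names here are hypothetical, not our actual classes.
public class LineFeeder {

    // Stands in for the session bean: one call per line.
    public interface LineProcessor {
        void processLine(String line);
    }

    // Reads one line at a time and hands each line to the processor.
    // Returns the number of lines fed.
    public static int feed(Reader source, LineProcessor processor) throws IOException {
        BufferedReader in = new BufferedReader(source);
        int count = 0;
        String line;
        while ((line = in.readLine()) != null) {
            // in the real code this would be sessionBean.processLine(line)
            processor.processLine(line);
            count++;
        }
        return count;
    }
}
```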

Our conclusion is that the transaction still plays a role when we do
all the processing in the session bean. Is that correct?
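For reference, limiting the entity cache as Sacha suggested would be a fragment in jboss.xml/standardjboss.xml roughly like this (a sketch; the container-configuration name and capacity values are assumptions, with 500 being the value we tried):

```xml
<!-- jboss.xml / standardjboss.xml sketch (JBoss 3.x); the
     container-configuration name and values are assumptions -->
<container-configuration>
  <container-name>Standard CMP EntityBean</container-name>
  <container-cache-conf>
    <cache-policy>org.jboss.ejb.plugins.LRUEnterpriseContextCachePolicy</cache-policy>
    <cache-policy-conf>
      <min-capacity>50</min-capacity>
      <max-capacity>500</max-capacity> <!-- default is 1000000 -->
    </cache-policy-conf>
  </container-cache-conf>
</container-configuration>
```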

Thanks for the comments
 - René

> -----Original Message-----
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED]]On Behalf Of Mailman
> Sent: 25. september 2002 14:54
> To: [EMAIL PROTECTED]
> Subject: RE: [JBoss-user] Large dataimport through entities
>
>
> We tried setting it to 500 and then started loading the data again.
> It reads 7626 lines and then throws a
> HTTP ERROR: 500 removing bean lock and it has tx set!;
> CausedByException is:
> removing bean lock and it has tx set!
>
> This is an extract from the log:
> 2002-09-25 14:47:34,543 ERROR [org.jboss.ejb.plugins.LogInterceptor]
> TransactionRolledbackLocalException, causedBy:
> java.lang.IllegalStateException: removing bean lock and it has tx set!
>         at org.jboss.ejb.plugins.lock.QueuedPessimisticEJBLock.removeRef(QueuedPessimisticEJBLock.java:473)
>         at org.jboss.ejb.BeanLockManager.removeLockRef(BeanLockManager.java:78)
>         at org.jboss.ejb.plugins.EntityLockInterceptor.invoke(EntityLockInterceptor.java:124)
>         at org.jboss.ejb.plugins.EntityCreationInterceptor.invoke(EntityCreationInterceptor.java:69)
>         at org.jboss.ejb.plugins.AbstractTxInterceptor.invokeNext(AbstractTxInterceptor.java:107)
>         at org.jboss.ejb.plugins.TxInterceptorCMT.runWithTransactions(TxInterceptorCMT.java:178)
>         at org.jboss.ejb.plugins.TxInterceptorCMT.invoke(TxInterceptorCMT.java:60)
>         at org.jboss.ejb.plugins.SecurityInterceptor.invoke(SecurityInterceptor.java:130)
>         at org.jboss.ejb.plugins.LogInterceptor.invoke(LogInterceptor.java:203)
>         at org.jboss.ejb.EntityContainer.invoke(EntityContainer.java:493)
>         at org.jboss.ejb.plugins.local.BaseLocalContainerInvoker.invoke(BaseLocalContainerInvoker.java:301)
>         at org.jboss.ejb.plugins.local.EntityProxy.invoke(EntityProxy.java:38)
>         at $Proxy279.setInternalId(Unknown Source)
>         at com.netmill.dmmsupplier.ejb.session.SanistaalManagerBean.loadItemsFromFile(SanistaalManagerBean.java:287)
>
>
> > -----Original Message-----
> > From: [EMAIL PROTECTED]
> > [mailto:[EMAIL PROTECTED]]On Behalf Of Sacha
> > Labourey
> > Sent: 25. september 2002 11:40
> > To: [EMAIL PROTECTED]
> > Subject: RE: [JBoss-user] Large dataimport through entities
> >
> >
> > can you try to:
> >     - limit the cache size for this entity (by default: 1 million)
> >     - split your import into sub-transactions (may not be the
> >       best solution)
> >
> > > -----Original Message-----
> > > From: [EMAIL PROTECTED]
> > > [mailto:[EMAIL PROTECTED]]On Behalf Of René
> > > Rolander Nygaard
> > > Sent: Wednesday, 25 September 2002 11:28
> > > To: Jboss User
> > > Subject: [JBoss-user] Large dataimport through entities
> > >
> > >
> > > Hello all
> > >
> > > We are trying to do a data-import/update using entities.
> > > It's fairly easy to read data lines, do a findByPrimaryKey, create
> > > the entity on FinderException, and then call the setters with the
> > > data you acquire.
> > >
> > > Fairly simple, but our server runs into OutOfMemory problems.
> > > My own bet is that the entities are not released as quickly as I
> > > load them.
> > >
> > > If we split the file into smaller sections, it reads more lines
> > > than with one big file.
> > >
> > > The "just add memory" option is not possible, because we cannot
> > > control the data sizes or the number of concurrent uploads.
> > > So, is there any way of controlling the entities in memory, or is
> > > there another way to make these uploads possible?
> > >
> > > Thanks in advance
> > >  - René
> > >
> > >
> > >
> > >
> >
> >
> >
> >
> >
>
>
>
>



-------------------------------------------------------
This sf.net email is sponsored by:ThinkGeek
Welcome to geek heaven.
http://thinkgeek.com/sf
_______________________________________________
JBoss-user mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/jboss-user
