What's the standard lifetime of an EntityBean?
For stateful SessionBeans there are configuration options for passivation
(<max-bean-age>) and removal (<max-bean-life>) in the
container-configuration. But for EntityBeans I can only find the passivation
time to configure. Are they removed after a fixed time?
Is there a way to get them out of memory? (JBoss 2.4.x)
We have a lot of EntityBeans that are created and used only once.

Annegret
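
For reference, the entity cache that the question is about is tuned per container in JBoss 2.4.x via standardjboss.xml (or a jboss.xml override). Below is a sketch of the relevant block; the element names follow the stock standardjboss.xml, but treat the names and the numbers as assumptions to check against your own configuration:

```xml
<container-cache-conf>
  <cache-policy>org.jboss.ejb.plugins.LRUEnterpriseContextCachePolicy</cache-policy>
  <cache-policy-conf>
    <min-capacity>50</min-capacity>
    <!-- the default max-capacity is very large (1000000); lowering it
         bounds how many entity contexts can stay in memory at once -->
    <max-capacity>5000</max-capacity>
    <!-- how often (in seconds) the overager looks for beans to passivate -->
    <overager-period>300</overager-period>
    <!-- beans idle longer than this (in seconds) get passivated -->
    <max-bean-age>600</max-bean-age>
  </cache-policy-conf>
</container-cache-conf>
```

There is no removal timeout analogous to <max-bean-life> for entities; passivation (plus a bounded max-capacity) is the lever for getting them out of memory.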

-----Original Message-----
From: Cor Hofman [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, 25 September 2002 12:41
To: [EMAIL PROTECTED]
Subject: RE: [JBoss-user] Large dataimport through entities


Hi René,

Not that I am the expert, but I had a similar problem. I was running 2.4.3
or 2.4.4 (I am not sure which). When I switched to 2.4.6 my problems were gone.
The 2.4.3 (2.4.4?) release contains a known bug that makes entity beans
live forever, hence your "out of memory" problem. In short, switch to at
least 2.4.6 (that is, if you are running a lower version ;-)

Regards,

   Sales Companion :)
   Cor Hofman

-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]]On Behalf Of Sacha
Labourey
Sent: Wednesday, September 25, 2002 11:40
To: [EMAIL PROTECTED]
Subject: RE: [JBoss-user] Large dataimport through entities


Can you try to:
        - limit the cache size for this entity (by default: 1 million)
        - split your import into sub-transactions (may not be the best
solution)
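
The second suggestion - committing the import in chunks so the container can release the entity contexts between batches - can be sketched in plain Java. The UserTransaction boundaries are shown only as comments, and BATCH_SIZE and the class name are invented for the illustration:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchImport {
    // Size of one sub-transaction; in practice this would be tuned
    // against the entity cache's max-capacity.
    static final int BATCH_SIZE = 500;

    // How many sub-batches a given number of input lines produces.
    static int countBatches(int totalLines) {
        return (totalLines + BATCH_SIZE - 1) / BATCH_SIZE;
    }

    // Processes the lines in independent batches; returns the batch count.
    // In a JBoss client, each batch would be wrapped in its own
    // javax.transaction.UserTransaction so the container can passivate
    // the beans touched by one batch before the next one starts.
    static int importInBatches(List<String> lines) {
        int batches = 0;
        for (int start = 0; start < lines.size(); start += BATCH_SIZE) {
            int end = Math.min(start + BATCH_SIZE, lines.size());
            List<String> batch = lines.subList(start, end);
            // tx.begin();
            for (String line : batch) {
                // findByPrimaryKey / create-on-FinderException / setters
            }
            // tx.commit();  // beans from this batch become passivatable
            batches++;
        }
        return batches;
    }

    public static void main(String[] args) {
        List<String> lines = new ArrayList<>();
        for (int i = 0; i < 1250; i++) lines.add("row-" + i);
        System.out.println(importInBatches(lines)); // 3 batches of <= 500
    }
}
```

The point of the per-batch commit is that entity contexts enlisted in an open transaction cannot be passivated, so one giant transaction pins the whole import in memory.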

> -----Original Message-----
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED]] On Behalf Of René
> Rolander Nygaard
> Sent: Wednesday, 25 September 2002 11:28
> To: Jboss User
> Subject: [JBoss-user] Large dataimport through entities
>
>
> Hello all
>
> We are trying to do a data import/update using entities.
> It's fairly easy: read the data lines, do a findByPrimaryKey, and if that
> throws a FinderException, create the entity; then call the setters with
> the data you acquire.
>
> Fairly simple, but our server runs into out-of-memory problems.
> My own bet is that the entities are not released as quickly as I load them.
>
> If we split the file into smaller sections, it will read more lines
> than with one big file.
>
> The "just add memory" option is not possible, because we cannot control
> the data sizes or the number of concurrent uploads.
> So - is there any way of controlling the entities in memory, or is there
> another way to make these uploads possible?
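
The find-or-create loop described above can be sketched as follows. This is a plain-Java stand-in: the Map plays the role of the entity home, and the class and field names are invented for the illustration:

```java
import java.util.HashMap;
import java.util.Map;

public class FindOrCreateDemo {
    // Stand-in for one entity bean's state.
    static class Record {
        String value;
        Record(String value) { this.value = value; }
    }

    // Stand-in for the entity home / persistent store.
    static final Map<String, Record> store = new HashMap<>();

    // The pattern from the post: findByPrimaryKey, and if the finder
    // fails (FinderException in EJB), create the entity; then call the
    // setters with the imported data either way.
    static Record findOrCreate(String pk, String value) {
        Record r = store.get(pk);   // home.findByPrimaryKey(pk)
        if (r == null) {            // catch (FinderException e)
            r = new Record(value);  //   home.create(pk, ...)
            store.put(pk, r);
        }
        r.value = value;            // setters for the imported data
        return r;
    }

    public static void main(String[] args) {
        findOrCreate("42", "first");
        findOrCreate("42", "second");          // updates, does not duplicate
        System.out.println(store.size());      // 1
        System.out.println(store.get("42").value); // second
    }
}
```

In the real EJB version, every entity touched this way stays in the container's cache until it is passivated, which is exactly why a long single-transaction import grows without bound.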
>
> thanks in advance
>  - René
>
>
>
> -------------------------------------------------------
> This sf.net email is sponsored by:ThinkGeek
> Welcome to geek heaven.
> http://thinkgeek.com/sf
> _______________________________________________
> JBoss-user mailing list
> [EMAIL PROTECTED]
> https://lists.sourceforge.net/lists/listinfo/jboss-user
>


