This message is automatically generated by JIRA. If you think it was sent incorrectly, please contact your JIRA administrators. For more information on JIRA, see: http://www.atlassian.com/software/jira
- [JIRA] (JENKINS-15619) Huge Files sync eat all ... [email protected] (JIRA)

added limit* for number of files per changelist in order to avoid OOM on large changes (issue #13109)
We applied this change to our plugin version 1.1.13, but it still has the same problem.
Our job always uses the -f option.
The P4 plugin keeps the sync log shown below in the heap while a job is syncing sources:
p4 sync -f //aaa/bbbb/cccc/dddd/eeee/ffff/gggg.java#head
...............
Is this correct?
We run 100+ builds simultaneously, and every job starts syncing a large number of files.
If our heap fills up before syncing completes, would it be possible to discard the log immediately instead of keeping it in memory?
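To illustrate the behavior being requested, here is a minimal, hypothetical sketch (not the P4 plugin's actual code or API) of the difference between buffering every sync line in memory and streaming each line straight to the build log as it arrives. The class and method names are invented for illustration; the point is that a streamed line is written and forgotten, so heap use stays bounded no matter how many files are synced.

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStreamReader;
import java.io.PrintStream;
import java.nio.charset.StandardCharsets;

public class StreamingSyncLog {

    // Hypothetical helper: forward each line of sync output directly to the
    // destination stream. Nothing is retained in memory after it is written,
    // unlike an approach that appends every line to an in-memory buffer.
    static void forwardLines(BufferedReader from, PrintStream to) throws Exception {
        String line;
        while ((line = from.readLine()) != null) {
            to.println(line); // written out immediately, then eligible for GC
        }
    }

    public static void main(String[] args) throws Exception {
        // Simulated p4 sync output; in a real plugin this would be the
        // sync process's stdout.
        String fake = "p4 sync -f //depot/a.java#head\n"
                    + "p4 sync -f //depot/b.java#head\n";
        BufferedReader in = new BufferedReader(new InputStreamReader(
                new ByteArrayInputStream(fake.getBytes(StandardCharsets.UTF_8)),
                StandardCharsets.UTF_8));
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        forwardLines(in, new PrintStream(sink, true, "UTF-8"));
        System.out.print(sink.toString("UTF-8"));
    }
}
```

With 100+ concurrent builds, the difference matters: a buffering approach holds one full sync log per running job on the master's heap at once, while a streaming approach holds at most one line per job.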