Unfortunately, the logs aren't giving me much more info.

The agent log simply reports the same message as in my original message, with a stack trace:

java.lang.RuntimeException: Artifact upload for file #### (Size: 38003857) was denied by the server. This usually happens when server runs out of disk space..  HTTP return code is 413
	at com.thoughtworks.go.util.ExceptionUtils.bomb(ExceptionUtils.java:29)
	at com.thoughtworks.go.publishers.GoArtifactsManipulator.publish(GoArtifactsManipulator.java:102)
	at com.thoughtworks.go.work.DefaultGoPublisher.upload(DefaultGoPublisher.java:68)
	at com.thoughtworks.go.domain.ArtifactPlan.uploadArtifactFile(ArtifactPlan.java:169)
	at com.thoughtworks.go.domain.ArtifactPlan.publishBuildArtifact(ArtifactPlan.java:143)
	at com.thoughtworks.go.domain.ArtifactPlan.publishBuiltInArtifacts(ArtifactPlan.java:131)
	at com.thoughtworks.go.remote.work.artifact.ArtifactsPublisher.publishArtifacts(ArtifactsPublisher.java:75)
	at com.thoughtworks.go.remote.work.BuildWork.completeJob(BuildWork.java:197)
	at com.thoughtworks.go.remote.work.BuildWork.build(BuildWork.java:129)
	at com.thoughtworks.go.remote.work.BuildWork.doWork(BuildWork.java:79)
	at com.thoughtworks.go.agent.JobRunner.run(JobRunner.java:53)
	at com.thoughtworks.go.agent.AgentHTTPClientController.retrieveWork(AgentHTTPClientController.java:149)
	at com.thoughtworks.go.agent.AgentHTTPClientController.work(AgentHTTPClientController.java:121)
	at com.thoughtworks.go.agent.AgentController.loop(AgentController.java:86)
	at jdk.internal.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.springframework.scheduling.support.ScheduledMethodRunnable.run(ScheduledMethodRunnable.java:65)
	at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
	at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:834)
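For what it's worth, HTTP 413 is the standard "Request Entity Too Large" status: the server (or something sitting in front of it, like a reverse proxy) refused the request body outright, which isn't necessarily the same thing as the disk actually being full. The JDK even ships the constant; a trivial illustrative check (the class name here is just an example):

```java
import java.net.HttpURLConnection;

public class StatusCheck {
    public static void main(String[] args) {
        // 413 means the request *body* was rejected as too large by the
        // server or an intermediary before it was stored, which is why
        // the agent-side "out of disk space" wording can be misleading.
        int code = HttpURLConnection.HTTP_ENTITY_TOO_LARGE;
        System.out.println(code + " = Request Entity Too Large");
    }
}
```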


The server log doesn't report anything of significance:

2020-08-10 15:46:20,983 INFO  [qtp536519735-422] BuildRepositoryRemoteImpl:99 - [Agent [XXX, XXX, 63e8d22e-b360-4b84-bcc5-847ec9fb55f4, 4b178349-8d56-43cb-8d25-dc048a4d84fa]] is reporting result [Passed] for [Build [XXX]]
2020-08-10 15:46:39,061 INFO  [qtp536519735-422] BuildRepositoryRemoteImpl:99 - [Agent [XXX, XXX, 63e8d22e-b360-4b84-bcc5-847ec9fb55f4, 4b178349-8d56-43cb-8d25-dc048a4d84fa]] is reporting status and result [Completed, Failed] for [Build [XXX]]
2020-08-10 15:46:39,061 INFO  [qtp536519735-422] Stage:234 - Stage is being completed by transition id: 209109


There are no quotas on the file system, and I'm not even sure how one would set that up in Windows.
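(For anyone else following along: NTFS does support per-user disk quotas, and `fsutil quota query C:` should list them if any are configured. From the Java side, a quick sanity check is to compare `getFreeSpace()`, the raw free space, with `getUsableSpace()`, which also accounts for OS-level restrictions on the current user; run it under the agent's service account. A rough sketch, with the path argument purely illustrative:)

```java
import java.io.File;

public class QuotaCheck {
    public static void main(String[] args) {
        // Use the path given on the command line, else the first file-system root.
        File root = new File(args.length > 0 ? args[0] : File.listRoots()[0].getPath());
        // getFreeSpace() reports raw free space on the partition;
        // getUsableSpace() additionally reflects OS restrictions (such as
        // quotas) on the current user. A large gap between the two would
        // point at a quota rather than a genuinely full disk.
        System.out.printf("total : %,d bytes%n", root.getTotalSpace());
        System.out.printf("free  : %,d bytes%n", root.getFreeSpace());
        System.out.printf("usable: %,d bytes%n", root.getUsableSpace());
    }
}
```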

The Windows Event logs on both the GoCD Server and GoCD Agent machines show nothing at that time either.

I assumed some undocumented setting changed in a recent GoCD version that now limits artifact upload size.



On Tuesday, August 11, 2020 at 11:24:04 AM UTC-4, Jason Smyth wrote:
>
> Hi Chris,
>
> Are there any quotas in place on the file system that GoCD might be 
> hitting? Are there any error messages in the server logs when the artifact 
> upload fails? Are there any errors in Event Log when the artifact upload 
> fails?
>
> Unless there is a quota in place, it sounds like you're right about there 
> being plenty of disk space so maybe the logs will have more information 
> that can point you in the right direction.
>
> Regards,
> Jason
>
> On Tuesday, 11 August 2020 11:14:50 UTC-4, Chris Payne wrote:
>>
>> Well we're on a Windows server. We have over 160GB free space on a 1TB 
>> drive. Everything is stored on the single C drive. As far as I can tell, 
>> everything looks fine on the server itself. But maybe there's something 
>> hidden I'm missing.
>>
>> On Tuesday, August 11, 2020 at 11:10:39 AM UTC-4, Jason Smyth wrote:
>>>
>>> Hi Chris,
>>>
>>> The last time someone brought up a space issue like this, it turned out 
>>> that the server had plenty of space but it was allocated elsewhere on the 
>>> file system.
>>>
>>> Have you checked the server and confirmed that there are no volumes that 
>>> are running out of space? I suggest looking specifically at the mount 
>>> points that hold /tmp and GoCD's server directory.
>>>
>>> Regards,
>>> Jason
>>>
>>> On Tuesday, 11 August 2020 11:05:54 UTC-4, Chris Payne wrote:
>>>>
>>>> We upgraded to 20.6.0 from 19.12 over the weekend and we're now getting 
>>>> problems uploading large-ish artifacts (35MB+). This didn't happen 
>>>> before... seeing these errors in our logs:
>>>>
>>>> [go] Artifact upload for file ####### (Size: 38003857) was denied by 
>>>> the server. This usually happens when server runs out of disk space.
>>>> [go] Failed to upload #####
>>>>
>>>> Is there some new limit imposed? Is there any way to configure this? My 
>>>> searching on this turns up the problem reported with the Linux tmp 
>>>> directory, but I'm not sure how that applies here at all. I don't see any 
>>>> other settings in the config files or in the Tanuki wrapper that could 
>>>> tweak this. The OS didn't change, and we have plenty of storage space on 
>>>> the server (over 160GB). 
>>>>
>>>> Thanks!
>>>>
>>>>

-- 
You received this message because you are subscribed to the Google Groups 
"go-cd" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/go-cd/c1d1c7d2-2c2a-430f-ba07-339fefc3af1fo%40googlegroups.com.
