Hi, I'm using Spark on a Google Compute Engine cluster with the Google Cloud Storage connector (instead of HDFS, as recommended here <https://cloud.google.com/hadoop/google-cloud-storage-connector#benefits>), and I'm getting a lot of "rate limit" errors, as shown below. The errors relate to temp files (in a folder called _temporary), not to the actual input/output of my program.
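For reference, here's a minimal sketch of the kind of write that reproduces the errors for me (the bucket and path are placeholders, not my real ones):

------------------------
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("gcs-write-test"))

// Placeholder data and output path -- any job that saves to gs://
// seems to reproduce the errors for me.
sc.parallelize(1 to 1000000)
  .map(_.toString)
  .saveAsTextFile("gs://my-bucket/output")

// The Hadoop output committer first writes the task files under
// gs://my-bucket/output/_temporary/... and then renames them into
// place, which is where the 429 errors show up.
------------------------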
Is there a way to control the read/write rate of Spark? Is there a way to increase the rate limit for my Google project? Is there a way to use the local hard disk for temp files that don't have to be shared with other slaves? (I sketched what I have in mind at the end of this message.) Or does anyone know of, or can think of, any other solution for this?

Thanks,
Oren

------------------------
java.io.IOException: Error inserting: bucket: *****, object: *****
        at com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.wrapException(GoogleCloudStorageImpl.java:1600)
        at com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl$3.run(GoogleCloudStorageImpl.java:475)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 429 Too Many Requests
{
  "code" : 429,
  "errors" : [ {
    "domain" : "usageLimits",
    "message" : "The total number of changes to the object ***** exceeds the rate limit. Please reduce the rate of create, update, and delete requests.",
    "reason" : "rateLimitExceeded"
  } ],
  "message" : "The total number of changes to the object ***** exceeds the rate limit. Please reduce the rate of create, update, and delete requests."
}
        at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:145)
        at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
        at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
        at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
        at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
        at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
        at com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl$3.run(GoogleCloudStorageImpl.java:472)
        ... 3 more
------------------------
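To clarify what I mean by using local disk for temp files: the only relevant setting I found is spark.local.dir, and I'm wondering whether something like the following would help. I'm not sure it affects the _temporary directory at all, since that seems to be created by the Hadoop output committer under the output path itself (the local path below is just an example):

------------------------
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("gcs-rate-limit-test")
  // Point Spark's scratch space at local disk (example path). I'm not
  // sure this affects the _temporary dir, which seems to be created by
  // the Hadoop output committer next to the destination gs:// path.
  .set("spark.local.dir", "/mnt/spark-local")

val sc = new SparkContext(conf)
------------------------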