[GitHub] [airflow] mik-laj commented on issue #6359: [AIRFLOW-XXX] Change instances "Google cloud storage" to "Google Cloud Storage".

2019-12-12 Thread GitBox
mik-laj commented on issue #6359: [AIRFLOW-XXX] Change instances "Google cloud storage" to "Google Cloud Storage".
URL: https://github.com/apache/airflow/pull/6359#issuecomment-565124089
 
 
   I did the rebase.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #6359: [AIRFLOW-XXX] Change instances "Google cloud storage" to "Google Cloud Storage".

2019-12-02 Thread GitBox
mik-laj commented on issue #6359: [AIRFLOW-XXX] Change instances "Google cloud storage" to "Google Cloud Storage".
URL: https://github.com/apache/airflow/pull/6359#issuecomment-560289355
 
 
   @dgorelik Can you rebase? Travis is sad because the master branch was broken.




[GitHub] [airflow] mik-laj commented on issue #6359: [AIRFLOW-XXX] Change instances "Google cloud storage" to "Google Cloud Storage".

2019-10-17 Thread GitBox
mik-laj commented on issue #6359: [AIRFLOW-XXX] Change instances "Google cloud storage" to "Google Cloud Storage".
URL: https://github.com/apache/airflow/pull/6359#issuecomment-543397354
 
 
   This problem also occurs elsewhere. Can you fix it in the whole project?
   ```
   ./airflow/operators/gcs_to_bq.py:Loads files from Google cloud storage into BigQuery.
   ./airflow/operators/gcs_to_bq.py:point the operator to a Google cloud storage object name. The object in
   ./airflow/operators/gcs_to_bq.py:Google cloud storage must be a JSON file with the schema fields in it.
   ./airflow/operators/gcs_to_bq.py::param source_objects: List of Google cloud storage URIs to load from. (templated)
   ./airflow/operators/cassandra_to_gcs.py:data from Cassandra to Google cloud storage in JSON format.
   ./airflow/operators/cassandra_to_gcs.py:Copy data from Cassandra to Google cloud storage in JSON format
   ./airflow/operators/cassandra_to_gcs.py:to Google cloud storage. A {} should be specified in the filename
   ./airflow/operators/local_to_gcs.py:Uploads the file to Google cloud storage
   ./airflow/operators/mysql_to_gcs.py:"""Copy data from MySQL to Google cloud storage in JSON or CSV format.
   ./airflow/operators/mysql_to_gcs.py:JSON/Google cloud storage/BigQuery. Dates are converted to UTC seconds.
   ./airflow/operators/gcs_to_gcs.py::param source_bucket: The source Google cloud storage bucket where the
   ./airflow/operators/gcs_to_gcs.py::param destination_bucket: The destination Google cloud storage bucket
   ./airflow/operators/gcs_to_gcs.py:destination Google cloud storage bucket. (templated)
   ./airflow/operators/sql_to_gcs.py:Google cloud storage.
   ./airflow/gcp/sensors/gcs.py::param bucket: The Google cloud storage bucket where the object is.
   ./airflow/gcp/sensors/gcs.py:connecting to Google cloud storage.
   ./airflow/gcp/sensors/gcs.py::param bucket: The Google cloud storage bucket where the object is.
   ./airflow/gcp/sensors/gcs.py:connecting to Google cloud storage.
   ./airflow/gcp/sensors/gcs.py::param bucket: The Google cloud storage bucket where the object is.
   ./airflow/gcp/sensors/gcs.py:connecting to Google cloud storage.
   ./airflow/gcp/sensors/gcs.py::param bucket: The Google cloud storage bucket where the objects are.
   ./airflow/gcp/sensors/gcs.py:to Google cloud storage.
   ./airflow/gcp/operators/cloud_storage_transfer_service.py::param source_bucket: The source Google cloud storage bucket where the
   ./airflow/gcp/operators/cloud_storage_transfer_service.py::param destination_bucket: The destination Google cloud storage bucket
   ./airflow/gcp/operators/gcs.py::param bucket: The Google cloud storage bucket to find the objects. (templated)
   ./airflow/gcp/operators/gcs.py::param bucket: The Google cloud storage bucket where the object is.
   ./airflow/gcp/operators/bigquery.py:point the operator to a Google cloud storage object name. The object in
   ./airflow/gcp/operators/bigquery.py:Google cloud storage must be a JSON file with the schema fields in it.
   ./airflow/gcp/operators/bigquery.py:point the operator to a Google cloud storage object name. The object in
   ./airflow/gcp/operators/bigquery.py:Google cloud storage must be a JSON file with the schema fields in it.
   ./airflow/gcp/operators/bigquery.py::param source_objects: List of Google cloud storage URIs to point
   ./airflow/gcp/hooks/gcs.py::param bucket_name: The Google cloud storage bucket where the object is.
   ./airflow/gcp/hooks/gcs.py::param bucket_name: The Google cloud storage bucket where the object is.
   ./airflow/gcp/hooks/gcs.py::param bucket_name: The Google cloud storage bucket where the blob_name is.
   ./airflow/gcp/hooks/gcs.py::param bucket_name: The Google cloud storage bucket where the blob_name is.
   ./airflow/gcp/hooks/gcs.py::param bucket_name: The Google cloud storage bucket where the blob_name is.
   grep: ./logs/scheduler/latest: No such file or directory
   grep: ./files/airflow-home/logs/scheduler/latest: No such file or directory
   ./CHANGELOG.txt:- Don't return error when writing files to Google cloud storage.
   ```
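
   A project-wide fix over hits like the ones above could be done in one shot; a minimal sketch (assumes GNU sed, where `-i` edits in place; on BSD/macOS sed the flag would need `-i ''`). The demo below runs on a temporary file standing in for the real sources:
   ```
   # Sketch of the project-wide rename (hypothetical invocation, not from the PR):
   #   grep -rl 'Google cloud storage' --include='*.py' . \
   #     | xargs sed -i 's/Google cloud storage/Google Cloud Storage/g'
   # Self-contained demo on a temp file:
   tmp=$(mktemp)
   printf 'Loads files from Google cloud storage into BigQuery.\n' > "$tmp"
   sed -i 's/Google cloud storage/Google Cloud Storage/g' "$tmp"
   cat "$tmp"   # prints: Loads files from Google Cloud Storage into BigQuery.
   rm -f "$tmp"
   ```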

