I made a mistake writing bulkloader.yaml: encoding belongs on its own line, nested under connector_options.
- kind: AppendixBlobData
  connector: csv
  connector_options:
    encoding: iso-8859-9
  property_map:
    - property: __key__
      external_name: key
      export_transform: transform.key_id_or_name_as_string
    - property: ObjectID
      external_name: ObjectID
    - property: FileData
      external_name: FileData
      export_transform: transform.blob_to_file('Filename', 'AttachmentBlobs')
On Thursday, September 5, 2013 10:31:37 AM UTC+3, Sercan Altundas wrote:
>
> hello again!
>
> I have some tables with blob data in them, such as PDF and DOC files. I
> download the tables from MS SQL Server and convert them into CSV files, and
> at this step the blob data gets mangled.
>
> Here is my bulkloader.yaml, in case you suggest using the export_transform
> property. I probably used it wrong (there is a lack of GAE examples), as it
> does nothing.
>
> - kind: AppendixBlobData
> connector: csv
> connector_options: encoding: iso-8859-9
> property_map:
> - property: __key__
> external_name: key
> export_transform: transform.key_id_or_name_as_string
>
> - property: ObjectID
> external_name: ObjectID
>
> - property: FileData
> external_name: FileData
> export_transform: transform.blob_to_file('Filename',
> 'AttachmentBlobs')
>
> I want to fetch that file from the CSV file and then be able to upload it to
> the datastore. And yes, when I try to upload the CSV with blob data I get
> this error:
>
> 10:24 AM Uploading data records.
> [INFO ] Logging to bulkloader-log-20130905.102454
> [INFO ] Throttling transfers:
> [INFO ] Bandwidth: 250000 bytes/second
> [INFO ] HTTP connections: 8/second
> [INFO ] Entities inserted/fetched/modified: 20/second
> [INFO ] Batch Size: 10
> [INFO ] Opening database: bulkloader-progress-20130905.102454.sql3
> [INFO ] Connecting to banamsgbirak.appspot.com/_ah/remote_api
> [INFO ] Starting import; maximum 10 entities per post
> [ERROR ] [Thread-12] DataSourceThread:
> Traceback (most recent call last):
> File "C:\Program Files
> (x86)\Google\google_appengine\google\appengine\tools\bulkloader.py", line
> 1601, in run self.PerformWork()
> File "C:\Program Files
> (x86)\Google\google_appengine\google\appengine\tools\bulkloader.py", line
> 1720, in PerformWork for item in content_gen.Batches():
> File "C:\Program Files
> (x86)\Google\google_appengine\google\appengine\tools\bulkloader.py", line
> 556, in Batches self._ReadRows(key_start, key_end)
> File "C:\Program Files
> (x86)\Google\google_appengine\google\appengine\tools\bulkloader.py", line
> 466, in _ReadRows row = self.reader.next()
> File "C:\Program Files
> (x86)\Google\google_appengine\google\appengine\ext\bulkload\csv_connector.py",
>
> line 218, in generate_import_record for input_dict in self.dict_generator:
> File "C:\Python27\lib\csv.py", line 104, in next row = self.reader.next()
> Error: line contains NULL byte
> [INFO ] An error occurred. Shutting down...
>
> [ERROR ] Error in data source thread: line contains NULL byte
> [INFO ] 0 entities total, 0 previously transferred
> [INFO ] 0 entities (2590 bytes) transferred in 2.1 seconds
> [INFO ] Some entities not successfully transferred
> Continue...
>
> What can I do about this NULL byte error and about exporting blob data? I
> would buy you a beer if you'd help me with this! Thanks.
>
>