Please forgive the raw shell output:

$ scrapy crawl bomnegocio -o data.csv -t csv
2014-12-13 23:50:16-0200 [scrapy] INFO: Scrapy 0.24.4 started (bot: 
bomnegocio)
2014-12-13 23:50:16-0200 [scrapy] INFO: Optional features available: ssl, 
http11
2014-12-13 23:50:16-0200 [scrapy] INFO: Overridden settings: 
{'NEWSPIDER_MODULE': 'bomnegocio.spiders', 'FEED_FORMAT': 'csv', 
'SPIDER_MODULES': ['bomnegocio.spiders'], 'FEED_URI': 'data.csv', 
'BOT_NAME': 'bomnegocio'}
2014-12-13 23:50:17-0200 [scrapy] INFO: Enabled extensions: FeedExporter, 
LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
2014-12-13 23:50:17-0200 [scrapy] INFO: Enabled downloader middlewares: 
HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, 
RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, 
HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, 
ChunkedTransferMiddleware, DownloaderStats
2014-12-13 23:50:17-0200 [scrapy] INFO: Enabled spider middlewares: 
HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, 
UrlLengthMiddleware, DepthMiddleware
2014-12-13 23:50:17-0200 [scrapy] INFO: Enabled item pipelines: 
2014-12-13 23:50:17-0200 [bomnegocio] INFO: Spider opened
2014-12-13 23:50:17-0200 [bomnegocio] INFO: Crawled 0 pages (at 0 
pages/min), scraped 0 items (at 0 items/min)
2014-12-13 23:50:17-0200 [scrapy] DEBUG: Telnet console listening on 
127.0.0.1:6023
2014-12-13 23:50:17-0200 [scrapy] DEBUG: Web service listening on 
127.0.0.1:6080
2014-12-13 23:50:17-0200 [bomnegocio] DEBUG: Crawled (200) <GET 
http://sp.bomnegocio.com/regiao-de-bauru-e-marilia/eletrodomesticos/fogao-industrial-itajobi-4-bocas-c-forno-54183713> 
(referer: None)
=====> Start data extract ....
2014-12-13 23:50:17-0200 [bomnegocio] DEBUG: Scraped from <200 
http://sp.bomnegocio.com/regiao-de-bauru-e-marilia/eletrodomesticos/fogao-industrial-itajobi-4-bocas-c-forno-54183713>
{'title': u'\n\t\t\t\n\t\t\t\tFog\xe3o industrial itajobi 4 bocas c/ forno 
\n\t\t\t\t\t\n\t\t\t\t\n\t\t\t\n\t\t\t\n\t\t\t\t- '}
=====> Finish data extract.
2014-12-13 23:50:17-0200 [bomnegocio] INFO: Closing spider (finished)
2014-12-13 23:50:17-0200 [bomnegocio] INFO: Stored csv feed (1 items) in: 
data.csv
2014-12-13 23:50:17-0200 [bomnegocio] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 308,
 'downloader/request_count': 1,
 'downloader/request_method_count/GET': 1,
 'downloader/response_bytes': 8501,
 'downloader/response_count': 1,
 'downloader/response_status_count/200': 1,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2014, 12, 14, 1, 50, 17, 186619),
 'item_scraped_count': 1,
 'log_count/DEBUG': 4,
 'log_count/INFO': 8,
 'response_received_count': 1,
 'scheduler/dequeued': 1,
 'scheduler/dequeued/memory': 1,
 'scheduler/enqueued': 1,
 'scheduler/enqueued/memory': 1,
 'start_time': datetime.datetime(2014, 12, 14, 1, 50, 17, 97079)}
2014-12-13 23:50:17-0200 [bomnegocio] INFO: Spider closed (finished)
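The item is scraped now, but the title in the log still carries the page template's newlines and tabs. A minimal sketch (plain Python, no Scrapy-specific API assumed) of one way to normalize that whitespace before the CSV is written — this could live in the spider's parse callback or in an item pipeline:

```python
# -*- coding: utf-8 -*-
# Sketch: collapse the newline/tab padding seen in the scraped title.
# str.split() with no argument splits on any run of whitespace, so
# joining the pieces with single spaces normalizes it in one pass.
raw_title = u'\n\t\t\t\n\t\t\t\tFog\xe3o industrial itajobi 4 bocas c/ forno \n\t\t\t\t\t\n\t\t\t\t\n\t\t\t\n\t\t\t\n\t\t\t\t- '

# rstrip(u' -') also drops the trailing "- " separator left by the template.
clean_title = u' '.join(raw_title.split()).rstrip(u' -')

print(clean_title)  # Fogão industrial itajobi 4 bocas c/ forno
```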


On Friday, December 12, 2014 at 2:12:09 PM UTC-2, Pedro Castro wrote:
>
> Hi, everybody.
>
> My question is the following: Scrapy exports an empty CSV.
>
> I tried to post my code here, but it came out garbled.
>
> My question on Stack Overflow:
>
> http://stackoverflow.com/questions/27447399/scrapy-export-empty-csv
>
>
> Thank you for your attention; I look forward to hearing your views.
>
>
