Hi Svyatoslav, I have inserted the data into the database.

Thank you so much for helping me.
I hope you can help me again in the future.

Thank you.


On Thu, Apr 17, 2014 at 1:37 PM, Svyatoslav Sydorenko <
svyatos...@sydorenko.org.ua> wrote:

> Did you add it to the ITEM_PIPELINES dict first?
>
> http://doc.scrapy.org/en/latest/topics/item-pipeline.html#activating-an-item-pipeline-component
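>
> For reference, enabling it looks roughly like this in settings.py (the
> project name "pagitest" is a guess based on your pipeline class):
>
>     # settings.py -- register the pipeline so Scrapy actually runs it
>     ITEM_PIPELINES = {
>         'pagitest.pipelines.PagitestPipeline': 300,  # 0-1000; lower runs first
>     }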
>
> Thursday, 17 April 2014, 07:35:44 UTC+3, user masroor javed wrote:
>>
>> Hi Svyatoslav, I tried the code from the pipeline you linked, but the data
>> still could not be inserted.
>> I am also confused about whether there is some other command needed to
>> insert the data. So far I have run the spider with "scrapy crawl
>> spidername".
>> Does that command also run the pipeline code? I used it, but the data was
>> not inserted and there was no error.
>> Please let me know what I should do.
>>
>>
>> On Thu, Apr 17, 2014 at 3:16 AM, Svyatoslav Sydorenko <
>> svyat...@sydorenko.org.ua> wrote:
>>
>>> Try using this pipeline:
>>> http://snipplr.com/view/66986/mysql-pipeline/
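>>>
>>> In case that link goes away, here is a minimal sketch of the same idea:
>>> an asynchronous MySQL pipeline built on Twisted's adbapi. The table and
>>> column names are taken from your code; the rest is an assumption, not
>>> the exact snippet behind the link.
>>>
>>>     from twisted.enterprise import adbapi
>>>     from twisted.python import log
>>>
>>>     class MySQLStorePipeline(object):
>>>         def __init__(self):
>>>             # a thread pool of MySQL connections, so blocking
>>>             # INSERTs do not stall the crawl
>>>             self.dbpool = adbapi.ConnectionPool('MySQLdb',
>>>                 host='localhost', user='root', passwd='', db='test',
>>>                 charset='utf8', use_unicode=True)
>>>
>>>         def process_item(self, item, spider):
>>>             d = self.dbpool.runOperation(
>>>                 "INSERT INTO infosec (titlename, standname)"
>>>                 " VALUES (%s, %s)",
>>>                 (item['titlename'], item['standname']))
>>>             d.addErrback(log.err)  # log DB errors instead of losing them
>>>             return item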
>>>
>>> Wednesday, 16 April 2014, 13:35:34 UTC+3, user masroor javed wrote:
>>>
>>>> Hi all, I tried to insert data into a MySQL database, but the data is
>>>> not inserted and there is no error while running the crawl.
>>>> My pipeline code is below; please suggest how to make the insert work.
>>>> Is there a command for scrapy crawl that inserts into MySQL the way
>>>> "scrapy crawl -o datafile.csv -t csv" exports to CSV, or do I simply run
>>>> the spider with "scrapy crawl spidername"?
>>>> Please help me, I am new to Scrapy.
>>>>
>>>> import MySQLdb
>>>>
>>>> class PagitestPipeline(object):
>>>>     def __init__(self):
>>>>         # local test database, root user, empty password
>>>>         self.conn = MySQLdb.connect("localhost", "root", "", "test",
>>>>                                     charset="utf8", use_unicode=True)
>>>>         self.cursor = self.conn.cursor()
>>>>
>>>>     def process_item(self, item, spider):
>>>>         try:
>>>>             # no quotes around %s -- MySQLdb quotes and escapes the
>>>>             # parameters itself; with use_unicode=True the unicode
>>>>             # values can be passed straight through
>>>>             self.cursor.execute(
>>>>                 "INSERT INTO infosec (titlename, standname)"
>>>>                 " VALUES (%s, %s)",
>>>>                 (item['titlename'], item['standname']))
>>>>             self.conn.commit()
>>>>         except MySQLdb.Error, e:
>>>>             print "Error %d: %s" % (e.args[0], e.args[1])
>>>>         return item
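>>>>
>>>> A quick way to check whether any rows actually landed (same connection
>>>> details as in the pipeline above; just a debugging sketch):
>>>>
>>>>     import MySQLdb
>>>>
>>>>     conn = MySQLdb.connect("localhost", "root", "", "test", charset="utf8")
>>>>     cur = conn.cursor()
>>>>     cur.execute("SELECT COUNT(*) FROM infosec")
>>>>     print cur.fetchone()[0]   # 0 means the pipeline never ran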
>>>>
