Hi Masroor,

I want to insert my data into MySQL.
Please find my code below.

MY SPIDER

import scrapy

from craigslist_sample.items import AmazonDepartmentItem


class AmazonAllDepartmentSpider(scrapy.Spider):

    name = "amazon"
    allowed_domains = ["amazon.com"]
    start_urls = [
        "http://www.amazon.com/gp/site-directory/ref=nav_sad/187-3757581-3331414",
    ]

    def parse(self, response):
        for sel in response.xpath('//ul/li'):
            item = AmazonDepartmentItem()
            item['title'] = sel.xpath('a/text()').extract()
            item['link'] = sel.xpath('a/@href').extract()
            item['desc'] = sel.xpath('text()').extract()
            # yield inside the loop so every matched department reaches the
            # pipeline; returning once after the loop only emits the last item
            yield item
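
For reference, the spider assumes that AmazonDepartmentItem in craigslist_sample/items.py declares the three fields used above. A minimal sketch of such an item class (assumed contents, adjust to the real items.py):

# craigslist_sample/items.py (sketch -- field names taken from the spider)
import scrapy

class AmazonDepartmentItem(scrapy.Item):
    title = scrapy.Field()
    link = scrapy.Field()
    desc = scrapy.Field()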


MY PIPELINE


import MySQLdb


class MySQLStorePipeline(object):

    host = 'derr.com'
    user = 'amazon'
    password = 'mertl123'
    db = 'amazon_project'

    def __init__(self):
        self.connection = MySQLdb.connect(self.host, self.user,
                                          self.password, self.db,
                                          charset='utf8', use_unicode=True)
        self.cursor = self.connection.cursor()

    def process_item(self, item, spider):
        try:
            # extract() returns a list, so take the first match before encoding
            link = item['link'][0].encode('utf-8') if item['link'] else ''
            self.cursor.execute(
                """INSERT INTO amazon_project.ProductDepartment
                   (ProductDepartmentLilnk) VALUES (%s)""",
                (link,))
            # the attribute set in __init__ is self.connection, not self.conn
            self.connection.commit()
        except MySQLdb.Error, e:
            print "Error %d: %s" % (e.args[0], e.args[1])
        return item
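
The INSERT also assumes the ProductDepartment table already exists. A one-off setup sketch, assuming a single link column (the column name is copied from the pipeline; the type and size are guesses):

# one-time table setup (sketch -- assumed schema)
import MySQLdb

conn = MySQLdb.connect('derr.com', 'amazon', 'mertl123', 'amazon_project')
cur = conn.cursor()
cur.execute("""CREATE TABLE IF NOT EXISTS ProductDepartment (
                   ProductDepartmentLilnk VARCHAR(1024)
               )""")
conn.commit()
conn.close()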

and I am running this command:
scrapy crawl amazon
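
For the pipeline to run at all, it also has to be enabled in settings.py (this is what Svyatoslav asks about further down the thread). A minimal sketch, assuming the class sits in craigslist_sample/pipelines.py (the package name is taken from the items import; adjust the dotted path if the file is named differently):

# settings.py (sketch -- assumed module path)
# Scrapy only calls process_item() for pipelines listed here.
ITEM_PIPELINES = {
    'craigslist_sample.pipelines.MySQLStorePipeline': 300,
}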


thanks



On Thursday, May 15, 2014 9:46:25 AM UTC+5:30, masroor javed wrote:
>
> Yes, sure. Could you share your code here?
>
>
> On Wed, May 14, 2014 at 11:33 AM, Mukesh Salaria <mukeshs...@gmail.com> wrote:
>
>> Hey Masroor,
>>
>> I am also a newbie in Scrapy, like you. Since you have managed to insert 
>> data into the database, could you please let me know what steps I have to 
>> follow to do the same?
>>
>> Regards,
>> Mukesh
>>
>>
>> On Thursday, 17 April 2014 15:55:13 UTC+5:30, masroor javed wrote:
>>
>>> Hi Svyatoslav, I have inserted the data into the database.
>>>
>>> Thank you so much for helping me.
>>> I hope you can help me again in the future.
>>>
>>> Thank You.
>>>  
>>>
>>> On Thu, Apr 17, 2014 at 1:37 PM, Svyatoslav Sydorenko <
>>> svyat...@sydorenko.org.ua> wrote:
>>>
>>>> Did you add it to the ITEM_PIPELINES dict first?
>>>> http://doc.scrapy.org/en/latest/topics/item-pipeline.html#activating-an-item-pipeline-component
>>>>
>>>> On Thursday, April 17, 2014 at 07:35:44 UTC+3, masroor javed wrote:
>>>>>
>>>>> Hi Svyatoslav, I tried to use the code from the given pipeline, but the 
>>>>> data was not inserted.
>>>>> I am also confused about whether there is a separate command for 
>>>>> inserting the data.
>>>>> So far I have been running the spider with "scrapy crawl 
>>>>> spidername".
>>>>> Does that command also run the pipeline code? I used it, but no data was 
>>>>> inserted and there was no error.
>>>>> So please let me know what I should do.
>>>>>
>>>>>
>>>>> On Thu, Apr 17, 2014 at 3:16 AM, Svyatoslav Sydorenko <
>>>>> svyat...@sydorenko.org.ua> wrote:
>>>>>
>>>>>> Try using this pipeline:
>>>>>> http://snipplr.com/view/66986/mysql-pipeline/
>>>>>>
>>>>>> On Wednesday, April 16, 2014 at 13:35:34 UTC+3, masroor javed wrote:
>>>>>>
>>>>>>> Hi all, I tried to insert data into a MySQL database, but the data is 
>>>>>>> not inserted and there is no error while running the crawl.
>>>>>>> My pipeline code is below; please suggest how I should do the insert.
>>>>>>> Is there a scrapy crawl option for loading into MySQL, similar to the 
>>>>>>> CSV export "scrapy crawl -o datafile.csv -t csv", or do I simply run 
>>>>>>> the spider with "scrapy crawl spidername"?
>>>>>>> Please help me, I am new to Scrapy.
>>>>>>>
>>>>>>> import MySQLdb
>>>>>>>
>>>>>>> class PagitestPipeline(object):
>>>>>>>     def __init__(self):
>>>>>>>         self.conn = MySQLdb.connect("localhost", "root", "",
>>>>>>>                                     "test", charset="utf8",
>>>>>>>                                     use_unicode=True)
>>>>>>>         self.cursor = self.conn.cursor()
>>>>>>>
>>>>>>>     def process_item(self, item, spider):
>>>>>>>         try:
>>>>>>>             # no quotes around %s -- the driver does the quoting itself
>>>>>>>             self.cursor.execute("INSERT INTO infosec (titlename, standname) "
>>>>>>>                                 "VALUES (%s, %s)",
>>>>>>>                                 (item['titlename'].encode('utf-8'),
>>>>>>>                                  item['standname'].encode('utf-8')))
>>>>>>>             self.conn.commit()
>>>>>>>         except MySQLdb.Error, e:
>>>>>>>             print "Error %d: %s" % (e.args[0], e.args[1])
>>>>>>>         return item
>>>>>>>
>>>>>
>>>>
>>>
>>
>
>
