Sorry Mukesh, I have not used SQL Server. If you want to work with MySQL, you have to install MySQL server on your Windows machine, and then create the database and table manually before running the crawl.
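A minimal sketch of that manual setup, assuming the database name 'xyz' and the 'infosec' table with the 'titlename' and 'standname' columns used in the pipeline code quoted below (the 'id' column and the VARCHAR sizes are placeholders; adjust to your own schema):

    CREATE DATABASE xyz CHARACTER SET utf8;
    USE xyz;
    CREATE TABLE infosec (
        id INT AUTO_INCREMENT PRIMARY KEY,
        titlename VARCHAR(255),
        standname VARCHAR(255)
    );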
On Thu, May 15, 2014 at 12:21 AM, Mukesh Salaria <mukeshsalari...@gmail.com> wrote:
> Hey, I am not very familiar with the MySQL database, since it is mostly used with PHP; I am a .NET developer and use SQL Server for backend storage. Can I use SQL Server to store the data, or only MySQL? If only MySQL, could you please help me set it up?
>
> On Thu, May 15, 2014 at 12:38 PM, masroor javed <masroor....@gmail.com> wrote:
>> Hi,
>> Well, I am storing the data on localhost, not on a server.
>> Before running the crawl you have to create the database with its table.
>>
>> On Thu, May 15, 2014 at 12:07 AM, Mukesh Salaria <mukeshsalari...@gmail.com> wrote:
>>> Hey, one question:
>>>
>>> Are you storing the data in your local MySQL database or on a server?
>>>
>>> I have to put the data into the WordPress MySQL database. Do I have to create the table before inserting the data, or will it be created automatically?
>>>
>>> Regards
>>>
>>> On Thu, May 15, 2014 at 12:34 PM, masroor javed <masroor....@gmail.com> wrote:
>>>> You're welcome, Mukesh.
>>>> No issue.
>>>>
>>>> On Thu, May 15, 2014 at 12:03 AM, Mukesh Salaria <mukeshsalari...@gmail.com> wrote:
>>>>> Hey,
>>>>>
>>>>> Thanks for the quick reply; let me try this. If I face any problem I will let you know. :)
>>>>>
>>>>> Regards,
>>>>> Mukesh
>>>>>
>>>>> On Thu, May 15, 2014 at 9:54 AM, masroor javed <masroor....@gmail.com> wrote:
>>>>>> Hi Mukesh,
>>>>>> As far as I know, you should add this to your spider settings, i.e. in settings.py:
>>>>>>
>>>>>> ITEM_PIPELINES = {
>>>>>>     # 'botname.pipelines.PipelineClassName': priority
>>>>>>     'pagitest.pipelines.PagitestPipeline': 300,
>>>>>> }
>>>>>>
>>>>>> After that, write the code in pipelines.py:
>>>>>>
>>>>>> from twisted.enterprise import adbapi
>>>>>> from scrapy import log
>>>>>> import MySQLdb.cursors
>>>>>>
>>>>>> class PagitestPipeline(object):
>>>>>>
>>>>>>     def __init__(self):
>>>>>>         # connection pool so queries run outside the reactor thread
>>>>>>         self.dbpool = adbapi.ConnectionPool('MySQLdb', db='xyz',
>>>>>>                 user='abc', passwd='bcd',
>>>>>>                 cursorclass=MySQLdb.cursors.DictCursor,
>>>>>>                 charset='utf8', use_unicode=True)
>>>>>>
>>>>>>     def process_item(self, item, spider):
>>>>>>         # run the db query in the thread pool
>>>>>>         query = self.dbpool.runInteraction(self._conditional_insert, item)
>>>>>>         query.addErrback(self.handle_error)
>>>>>>         return item
>>>>>>
>>>>>>     def _conditional_insert(self, tx, item):
>>>>>>         # create the record only if it doesn't already exist;
>>>>>>         # this whole block runs in its own thread
>>>>>>         tx.execute("select * from infosec where titlename = %s",
>>>>>>                    (item['titlename'][0], ))
>>>>>>         result = tx.fetchone()
>>>>>>         if result:
>>>>>>             log.msg("Item already stored in db: %s" % item, level=log.DEBUG)
>>>>>>         else:
>>>>>>             tx.execute(
>>>>>>                 "insert into infosec (titlename, standname) "
>>>>>>                 "values (%s, %s)",
>>>>>>                 (item['titlename'][0], item['standname'][0], )
>>>>>>             )
>>>>>>             log.msg("Item stored in db: %s" % item, level=log.DEBUG)
>>>>>>
>>>>>>     def handle_error(self, e):
>>>>>>         log.err(e)
>>>>>>
>>>>>> On Wed, May 14, 2014 at 11:37 AM, Mukesh Salaria <mukeshsalari...@gmail.com> wrote:
>>>>>>> Hey guys,
>>>>>>>
>>>>>>> Does anyone know how to extract images using Scrapy, with an example? I am a newbie.
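For the original image question: Scrapy ships a built-in ImagesPipeline that downloads any URLs placed in an item's image_urls field and records the results in an images field (it needs PIL/Pillow installed). A minimal sketch, assuming the pre-1.0 contrib module path used at the time of this thread, a writable IMAGES_STORE directory, and a hypothetical PagitestItem class; this is not code from the thread itself, and the path placeholder should be replaced:

    # settings.py
    ITEM_PIPELINES = {
        'scrapy.contrib.pipeline.images.ImagesPipeline': 1,   # download images first
        'pagitest.pipelines.PagitestPipeline': 300,           # then store the item in MySQL
    }
    IMAGES_STORE = '/path/to/images'   # directory where downloaded images are written

    # items.py
    from scrapy.item import Item, Field

    class PagitestItem(Item):
        titlename = Field()
        standname = Field()
        image_urls = Field()   # URLs the ImagesPipeline should download
        images = Field()       # filled in by the ImagesPipeline after download

In the spider, populate item['image_urls'] with the image URLs extracted from the page (for example via an XPath or CSS selector on img/@src), and the pipeline handles the downloading.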