Hi there,

I am new to Scrapy and am trying to test it on different product grids, but I
can't get more than a few (6 to 8) items per page.

For example, 

import scrapy


class NordstromSpider(scrapy.Spider):
    name = "nordstrom"
    start_urls = [
        'http://shop.nordstrom.com/c/womens-dresses-new?origin=leftnav&cm_sp=Top%20Navigation-_-New%20Arrivals'
    ]


    def parse(self, response):
        for dress in response.css('article.npr-product-module'):
            yield {
                'src': dress.css('img.product-photo').xpath('@src').extract_first(),
                'url': dress.css('a.product-photo-href').xpath('@href').extract_first()
            }


    # Debugging helper (renamed from parse so Scrapy doesn't use it as a
    # callback): saves the raw response to disk for inspection.
    def noparse(self, response):
        page = response.url.split("/")[-2]
        filename = 'nordstrom-%s.html' % page
        with open(filename, 'wb') as f:
            f.write(response.body)
        self.log('Saved file %s' % filename)



This gave only 6 items, so I tried another site:

import scrapy


class QuotesSpider(scrapy.Spider):
    name = "rtr"
    start_urls = [
        'https://www.renttherunway.com/products/dress'
    ]


    def parse(self, response):
        for dress in response.css('div.cycle-image-0'):
            yield {
                'image-url': dress.xpath('.//img/@src').extract_first(),
            }



This only gave 12 items even though the page has a lot more.
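One rough check I thought of, to see whether the missing items are in the page
source at all or get loaded later by JavaScript, is counting the matching
markup in the raw HTML directly. A minimal sketch with the standard library,
using a stand-in string for the real page source (in the spider this would be
`response.text`):

```python
import re

# Stand-in for the raw page source; in practice this would be response.text
# inside the spider, or the output of "view source" in the browser.
html = """
<div class="cycle-image-0"><img src="/img/dress1.jpg"></div>
<div class="cycle-image-0"><img src="/img/dress2.jpg"></div>
<div class="product-grid-placeholder"></div>
"""

# Count how many item containers the server actually sent in the HTML.
count = len(re.findall(r'class="cycle-image-0"', html))
print(count)  # prints 2 for this stand-in snippet
```

If that count matches the low number of scraped items, the rest of the grid is
presumably filled in by JavaScript after the page loads, which plain Scrapy
requests won't see.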
I am guessing that I'm missing a setting somewhere. Any pointers are appreciated.

Thanks,

-- 
You received this message because you are subscribed to the Google Groups 
"scrapy-users" group.