Hey Raf. Paste a chunk of the actual URL if you have it...
I have no clue how the actual Scrapy process works, but it's easy enough to pseudo-code what you want with XPath and a bit of logic to point you in the right direction. Are the next pages you're looking at on subsequent pages, or is it essentially one chunk of HTML containing all the pages you'd want to iterate over?

On Sat, Aug 13, 2016 at 6:49 PM, Raf Roger <raf.n...@gmail.com> wrote:
> Hi,
>
> I'm new to Scrapy and I'm looking for a way to retrieve all links
> (matching ul li a) on each page. There is pagination, and the first
> page URL looks like:
> telephone-horaires-metier/Restaurant
>
> Page 2's URL is:
> telephone-horaires-metier/Restaurant?p=2
>
> Page 3's URL is:
> telephone-horaires-metier/Restaurant?p=3
>
> etc.
>
> The "next" URL is always the current page + 1, so if I'm on page 2 the
> "next" URL is telephone-horaires-metier/Restaurant?p=3.
>
> How can I collect all links on each page?
>
> Thanks
>
> --
> You received this message because you are subscribed to the Google Groups "scrapy-users" group.
> To unsubscribe from this group and stop receiving emails from it, send an email to scrapy-users+unsubscr...@googlegroups.com.
> To post to this group, send email to scrapy-users@googlegroups.com.
> Visit this group at https://groups.google.com/group/scrapy-users.
> For more options, visit https://groups.google.com/d/optout.
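To make the pseudo-code suggestion concrete, here is a stdlib-only sketch of the two pieces the question needs: building the `?p=N` page URLs and pulling hrefs out of `ul li a`. The names `page_url` and `LinkCollector` are invented for illustration, and this is deliberately not a Scrapy spider; the same logic would live in a spider's `parse()` callback, where link extraction becomes `response.xpath('//ul/li/a/@href').extract()` and the next page is fetched by yielding a `scrapy.Request` for the p+1 URL.

```python
# Stdlib-only sketch of the pagination pattern from the question.
# page_url and LinkCollector are made-up illustrative names, not Scrapy API.
from html.parser import HTMLParser

BASE = "telephone-horaires-metier/Restaurant"  # path from the original mail


def page_url(base, p):
    """Page 1 is the bare URL; page N appends ?p=N, per the example URLs."""
    return base if p == 1 else "%s?p=%d" % (base, p)


class LinkCollector(HTMLParser):
    """Collects href attributes of <a> tags nested under <ul><li>."""

    def __init__(self):
        super().__init__()
        self.stack = []   # open tags seen so far
        self.links = []   # collected hrefs

    def handle_starttag(self, tag, attrs):
        # Only keep links that sit inside both a <ul> and an <li>.
        if tag == "a" and "ul" in self.stack and "li" in self.stack:
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
        self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Pop up to and including the matching open tag.
            while self.stack and self.stack.pop() != tag:
                pass


def extract_links(html):
    parser = LinkCollector()
    parser.feed(html)
    return parser.links
```

Usage: `page_url(BASE, 3)` gives `telephone-horaires-metier/Restaurant?p=3`, and `extract_links('<ul><li><a href="/a">A</a></li></ul>')` gives `['/a']`. A spider would loop: fetch page p, extract the links, and keep requesting p+1 until a page comes back with no matching links.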