Max Cuban <edze...@gmail.com> writes:

> I am putting together a project using Python 2.7 and Django 1.5 on
> Windows 7. I believe this should be on the Django group, but I haven't
> had help from there, so I figured I would try the Python list.
>
> I have the following view:
>
> views.py:
[snip]

> Right now, as my code stands, any time I run it, it scrapes all the
> links on the front page of the selected sites and presents them,
> paginated, all afresh. However, I don't think it's a good idea for the
> script to read/write links that have already been extracted, so I
> would like to check for and append only new links. I would like to
> save the previously scraped links so that, over the course of say a
> week, all the links that have appeared on the front pages of these
> sites will be available on my site as older pages.
>
> It's my first programming project and I don't know how to incorporate
> this logic into my code.
>
> Any help/pointers/references will be greatly appreciated.
>
> regards, Max

I don't know anything about Django, but I don't think this is a Django
question. I think the best way would be to put the URLs in a database
together with the time they were retrieved. The next time, you could
look each scraped link up in the database; links that are already
present can be sorted by retrieval time and put at the end of the list.
If you want to do this on a per-user basis, you should also store user
information with each link (and then it would partly be a Django
problem, because you would get the user id from Django).
-- 
Piet van Oostrum <p...@vanoostrum.org>
WWW: http://pietvanoostrum.com/
PGP key: [8DAE142BE17999C4]
-- 
https://mail.python.org/mailman/listinfo/python-list
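For what it's worth, the idea above can be sketched with the standard
library's sqlite3 module (in a Django project the idiomatic equivalent
would be a model with a unique URL field). The table name `links` and
its columns are assumptions for illustration, not anything from Max's
actual code:

```python
# Sketch: keep scraped URLs in a database, insert only new ones, and
# list them newest-first so older links end up at the end.
# The schema (links / url / retrieved_at) is made up for this example.
import sqlite3
from datetime import datetime, timezone

def open_db(path=":memory:"):
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS links (
        url          TEXT PRIMARY KEY,   -- uniqueness gives us dedup for free
        retrieved_at TEXT NOT NULL       -- ISO-8601 timestamp
    )""")
    return conn

def store_new_links(conn, urls):
    """Insert links not already stored; return only the new ones."""
    now = datetime.now(timezone.utc).isoformat()
    new = []
    for url in urls:
        cur = conn.execute(
            "INSERT OR IGNORE INTO links (url, retrieved_at) VALUES (?, ?)",
            (url, now),
        )
        if cur.rowcount:  # 1 if inserted, 0 if the URL was already there
            new.append(url)
    conn.commit()
    return new

def all_links(conn):
    """Every stored link, most recently retrieved first."""
    rows = conn.execute(
        "SELECT url FROM links ORDER BY retrieved_at DESC, url")
    return [row[0] for row in rows]
```

A second scraping run then only reports the links it hasn't seen
before, e.g. `store_new_links(conn, ["http://b", "http://c"])` returns
just `["http://c"]` if `http://b` was stored earlier, while
`all_links(conn)` still lists everything for the "older pages" view.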