Thank you, Camilo. To be more detailed: suppose the website has 80,000 users, each page takes 200 ms to render, and you get a thousand hits per second, so we want to reduce the rendering time. Is there any way to reduce it?
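One common way to cut that repeated rendering cost is to cache the rendered page for a short TTL, so only the first request per interval pays the 200 ms. A minimal sketch of the idea in Python (the names and the 200 ms stand-in are illustrative; the same pattern applies in PHP with APC/memcached as the store):

```python
import time

# Sketch: cache whole rendered pages for `ttl` seconds so the expensive
# render runs at most once per interval instead of once per hit.
_cache = {}  # url -> (expires_at, html)

def render_page(url):
    # Stand-in for the real ~200 ms template rendering.
    time.sleep(0.2)
    return "<html>content for %s</html>" % url

def get_page(url, ttl=60):
    now = time.time()
    hit = _cache.get(url)
    if hit and hit[0] > now:
        return hit[1]            # served from cache, no render
    html = render_page(url)
    _cache[url] = (now + ttl, html)
    return html
```

With a thousand hits per second on the same page, this turns roughly a thousand renders per second into one render per TTL window; the trade-off is that users may see content up to `ttl` seconds stale.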
Another thing: suppose users want to upload files simultaneously, and the videos are hosted on the website itself rather than on another server such as YouTube, so the streams really consume bandwidth. Backups are also troublesome: when backing up a bulk of data you run into locking problems.

Sincerely
Negin Nickparsa

On Wed, Sep 18, 2013 at 12:50 PM, Camilo Sperberg <unrea...@gmail.com> wrote:
>
> On Sep 18, 2013, at 09:38, Negin Nickparsa <nickpa...@gmail.com> wrote:
>
> > Thank you, Sebastian. Actually, I will already have one if I qualify for
> > the job. Yes, and I may fail to handle it; that's why I asked for guidance.
> > I wanted some tidbits to start with. I have searched through YSlow,
> > HTTrack and others.
> > I searched through the PHP list in my email too before asking this
> > question. It is beneficial for everyone and has not been asked directly.
> >
> > Sincerely
> > Negin Nickparsa
> >
> > On Wed, Sep 18, 2013 at 10:45 AM, Sebastian Krebs <krebs....@gmail.com> wrote:
> >
> >> 2013/9/18 Negin Nickparsa <nickpa...@gmail.com>
> >>
> >>> In general, what are the best ways to handle high-traffic websites?
> >>>
> >>> VPS (clouds)?
> >>> web analyzers?
> >>> dedicated servers?
> >>> distributed memory cache?
> >>
> >> Yes :)
> >>
> >> But seriously: that is a topic most of us spend much time getting into.
> >> You can explain it with a bunch of buzzwords. Additionally, how do you
> >> define "high-traffic websites"? Do you already _have_ such a site? Or do
> >> you _want_ one? It's important, because I've seen it far too often that
> >> projects spent too much effort on their "high-traffic infrastructure"
> >> and in the end it wasn't that high-traffic ;) I won't say that you
> >> cannot be successful, but you should start with an effort you can handle.
> >>
> >> Regards,
> >> Sebastian
> >>
> >>> Sincerely
> >>> Negin Nickparsa
> >>
> >> --
> >> github.com/KingCrunch
>
> Your question is way too vague to be answered properly... My best guess
> would be that it depends severely on the type of website you have and how
> the current implementation is being, well... implemented.
>
> Simply said: what works for Facebook may/will not work for LinkedIn,
> Twitter or Google, mainly because the type of search differs A LOT:
> Facebook is about relations between people, Twitter is about small pieces
> of data not mainly interconnected with each other, while Google is all
> about links and every type of content: from little pieces of information
> through the whole of Wikipedia.
>
> You could start by studying how Varnish and redis/memcached work, and you
> could study how proxies (nginx et al), CDNs and that kind of stuff work,
> but if you want more specific answers, you had better ask a specific
> question.
>
> In the PHP area, an opcode cache does the job very well and can accelerate
> page load by several orders of magnitude; I recommend OPcache, which is
> already included in PHP 5.5.
>
> Greetings.