> >> Why is a static site considered a good idea in this time and age? If your
> >> concern is that a CMS-managed site can be hacked while a static site can't
> >> be - well, yes, it is more difficult to attack a pure static site.
> I believe static pages in *our case* would be a good idea because we can
> get around the problems of one person being the gatekeeper for all the
> changes/updates that need to be done.

What exactly is the workflow you have in mind for how the static website
is going to be updated? Whether or not you use an SCM, someone is going
to have to take the final call on a "release" to the production server.

Consider the steps required for the SCM repository idea to work.
Naturally, I am assuming that you are not giving commit privileges to
everyone.

Random user:

Step 1 - Installs subversion on his desktop and checks out the html
files and the other content files from the repository. This means
checking out the *entire* content of the branch.
Step 2 - Sets up the files on his local machine. This involves
downloading and installing a version of Apache.
Step 3 - Makes changes to the html files: adds files, removes files,
changes stylesheets (which are common across all pages).
Step 4 - Sends a patch to the webmaster, carefully listing the new files
he has added so that nothing breaks. (NOTE: I know we are speaking of
techies here, but really - how many of even these techies will know how
to prepare a patch properly, and be diligent enough to include any
additional files and so on?)

Webmaster:

Step 5 - Receives the patch and determines whether it can be applied.
Step 6 - Actually applies the patch to see if it breaks anything. Since
stylesheets may have changed, he will need to review a lot of files.
Step 7 - Makes changes if the patch breaks anything.
Step 8 - If there are multiple patches, repeats steps 5 to 7, and also
handles the case where one patch conflicts with another.
Step 9 - Applies the patch to a branch.
Step 10 - Asks someone else (a review board) to check out that version,
review it, and decide whether it can go online.
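Before going on: the patch round-trip in steps 3 to 6 is where most of
the errors will creep in. A minimal sketch of what it involves, using
plain diff and patch on made-up stand-in files (svn diff produces the
same unified format; names and content here are hypothetical):

```shell
# Stand-ins for two checkouts of the same page (hypothetical content)
mkdir -p webmaster-copy user-copy
printf '<h1>ILUGC</h1>\n<p>Old text.</p>\n' > webmaster-copy/index.html
cp webmaster-copy/index.html user-copy/index.html

# Step 3: the random user edits his working copy
sed -i 's/Old text\./New text./' user-copy/index.html

# Step 4: he produces a unified diff to mail to the webmaster
# (diff exits non-zero when the files differ, hence the "|| true")
diff -u webmaster-copy/index.html user-copy/index.html > changes.patch || true

# Steps 5-6: the webmaster applies the patch to his own tree and reviews it
patch webmaster-copy/index.html < changes.patch
grep 'New text' webmaster-copy/index.html
```

Note that none of this catches a binary file the user forgot to add, or
a stylesheet change that breaks an unrelated page - those still need the
manual review in the later steps.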
Step 11 - Waits around, and repeats steps 8 and 9 if there are multiple
people on the board.
Step 12 - Applies the patches/changes sent in by the people on the
review board.
Step 13 - Merges all the changes to the head (optional).
Step 14 - Exports the changes to the production server. This may involve
deleting the files from the production server first. You could do a
checkout instead, if you prefer having .svn files all over the webserver
directory.

> Also, it's also a case of not complicating things unless we really,
> really need it - It's not just security I'm talking about - Drupal
> will need a database, and somebody has to backup all that data
> separately. It has to be transferred off-server so that a disk failure
> won't take out all the old data. Ergo, considerable amount of admin
> time. As SK mentioned, it becomes a chore for any one person when he
> has a day job.

Consider the steps given above. One poor soul will be entrusted with all
of this - there is plenty of room for error.

> Static pages are simple and get the job done - all the data is in the
> svn repo. And everybody who has the latest checkout will have all the
> data of the site. Any one of us can restore it. Also, there is also
> the issue of flexibility - using simple html pages and bunch of css
> files is so much easier than managing 'themes' and 'modules' and
> somesuch.

What exactly is so difficult about these "themes" and "modules"? There
are kids setting up their own websites using these CMSs - a theme is
mostly html with some additional markup and some css.

Have you tried putting a large binary file into subversion? Consider
something like 20 video files of 50 MB each, and try checking them out.
The users who only want to modify a few html files and send them to the
webmaster will have to check all of this out onto their desktops over a
pathetic 1 Mbps DSL connection.

Consider the workflow for a CMS driven site:

Step 1.
Web master obtains requests for changes over email or over ilugc.
Another approach would be to allow a moderated comment system on each
page, so that users can submit their suggestions directly.
Step 2. Web master logs in from anywhere in the world (no access to
subversion etc. needed).
Step 3. Web master makes the change in the content, uploads files etc.
On some CMSs this will generate a new version of the content while the
older version remains published.
Step 4. If the software supports a workflow, those responsible for
reviewing the changes are automatically notified over email.
Step 5. Reviewers place their comments or add/remove changes.
Step 6. Web master incorporates the changes.
Step 7. Web master publishes the changes.

The CMS will itself handle auto-archiving and auto-expiry of content (if
set up appropriately) without the web master doing any additional work.
It won't require the users to know anything about subversion, apache
etc. They won't need to download and install these on their computers,
and they won't need to download the entire repository, which will become
quite big over time. The same applies to the reviewers. It may also be
possible to set things up so that each section of the site is
administered by a different person, so that the web master does not have
to maintain every section himself.

Backups will be simple as well: do a database export, zip up the content
directories, and transfer these over ftp/sftp to a remote location. All
of this can be automated with a cron job.

In any case, this is just my opinion on the subject. The people who are
actually going to be managing the site should take the call - and if
they are happier managing it with an SCM, so be it.

Regards,
Prem
_______________________________________________
To unsubscribe, email [email protected] with
"unsubscribe <password> <address>" in the subject or body of the message.
http://www.ae.iitm.ac.in/mailman/listinfo/ilugc
