I am in the process of getting multiple desktops to run Linux for Research
and Development. At the moment I am comparing the following Intel chipsets:
H81, Q87, Q77, H61... What is the difference, and which would give the best
performance?
___
CentOS mailing list
Can I order them as H61, Q77, Q87, H81?
On Mon, Nov 25, 2013 at 9:24 PM, John R Pierce pie...@hogranch.com wrote:
On 11/25/2013 9:54 AM, madu...@gmail.com wrote:
I am in the process of getting multiple desktops to run Linux for Research
and Development, at the moment I am comparing the following
Done!
Attention: shrinking or extending a filesystem may damage the data on it,
so please back up your data first.
1. Decrease /dev/mapper/vg_web-lv_home
# umount /dev/mapper/vg_web-lv_home
# e2fsck -f /dev/mapper/vg_web-lv_home
# resize2fs /dev/mapper/vg_web-lv_home <new size>
# lvresize -L <new size> /dev/mapper/vg_web-lv_home
I need to shrink /home (755G) to 150G and add the freed space to the
existing / (50G).
#df -kh
Filesystem Size Used Avail Use% Mounted on
/dev/mapper/vg_web-lv_root 50G 7.8G 40G 17% /
tmpfs 7.8G 384K 7.8G 1% /dev/shm
/dev/sda2 485M 79M 381M 18%
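Putting the pieces together for this layout (vg_web, ext4 filesystems): a
possible offline sequence, assuming a fresh backup exists. The order matters:
when shrinking, resize the filesystem before the LV; when growing, extend the
LV before the filesystem.

```shell
# All of this with /home unmounted (e.g. from single-user or rescue mode).
umount /home
e2fsck -f /dev/mapper/vg_web-lv_home          # required before an ext4 shrink
resize2fs /dev/mapper/vg_web-lv_home 150G     # 1) shrink the filesystem
lvreduce -L 150G /dev/mapper/vg_web-lv_home   # 2) then shrink the LV to match
lvextend -l +100%FREE /dev/mapper/vg_web-lv_root   # 3) grow root LV into freed space
resize2fs /dev/mapper/vg_web-lv_root          # 4) grow root fs (growing can be done online)
mount /home
```

This cannot be tried safely on a scratch machine without LVM, so treat it as a
checklist against the man pages rather than a script to paste in.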
Hi
I would like a bash script that searches file and subdirectory names in
the directory /var/ww/html/web for a specific string and, when it finds
the search string, replaces the old string with a new one: old1 with
new1, old2 with new2, ... oldn with newn.
replace_string.sh
Should I use find -type f ?
The string could be part of a file name or a subdirectory name.
Thanks
pons
On Fri, Sep 23, 2011 at 8:51 PM, m.r...@5-cent.us wrote:
madu...@gmail.com wrote:
I would like to use a bash script that searches files and
subdirectories name in a directory /var/ww/html/web
for a specific
Yes, files and directories too.
pons
On Fri, Sep 23, 2011 at 9:08 PM, m.r...@5-cent.us wrote:
madu...@gmail.com wrote:
Should I use find -type f ?
The string could be part of a file name or a subdirectory name.
I hate webmail. After I hit send, and while it was thinking about going,
I realized another question.
I am planning to do this in 2 stages: first -type f, then -type d.
pons
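A sketch of that two-stage plan (stage 1: replace inside file contents with
-type f; stage 2: rename files and directories). The tree under $ROOT and the
old1/new1 pair are made-up examples; point ROOT at /var/ww/html/web for real
use, and note sed -i here is the GNU form shipped with CentOS.

```shell
#!/bin/sh
set -e
ROOT=$(mktemp -d)   # stand-in for /var/ww/html/web, with example data:
mkdir -p "$ROOT/old1dir"
printf 'some old1 content\n' > "$ROOT/old1dir/old1page.html"

# Stage 1 (-type f): replace the string inside file contents.
find "$ROOT" -type f -exec sed -i 's/old1/new1/g' {} +

# Stage 2: rename files and directories whose names contain the string.
# -depth processes children before their parent directories, so paths
# stay valid while renaming.
find "$ROOT" -depth -name '*old1*' | while IFS= read -r p; do
    mv "$p" "$(dirname "$p")/$(basename "$p" | sed 's/old1/new1/g')"
done

ls -R "$ROOT"
```

After a run, the example tree is $ROOT/new1dir/new1page.html with the content
rewritten to "some new1 content".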
On Fri, Sep 23, 2011 at 11:15 PM, m.r...@5-cent.us wrote:
Les Mikesell wrote:
On Fri, Sep 23, 2011 at 1:21 PM, m.r...@5-cent.us wrote:
I realized another question: are you trying to rename files?
Yes, files and directories too.
I am running squid + sarg. How can I change the IP addresses in the
generated report into usernames? The users are free to surf the web
anonymously; they do not provide a login or any authentication to the
proxy.
Thanks
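Without authentication, squid only ever logs the client IP, so the mapping has
to happen on the reporting side. If memory serves, sarg can translate
addresses through a usertab file; please verify the exact option name in your
sarg.conf and man page, and treat the entries below as hypothetical examples.

```
# in sarg.conf (option name as I recall it -- please verify):
usertab /etc/sarg/usertab

# /etc/sarg/usertab: one "address  display-name" pair per line (made-up):
10.6.50.123 jsmith
10.6.50.124 akumar
```

This only works while each user keeps a fixed IP (static leases or
reservations), since nothing ties the traffic to a person otherwise.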
Recall..
I now run the following task every day:
tar -cvzf /rescue/website-$(date +%u).tgz /var/www/htdocs/*
I want to move these files from the local server to a remote server via
FTP. Any help is appreciated.
Thanks
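One way to do the upload is curl, which ships with CentOS and speaks FTP
(curl -T uploads a file). This sketch uses temporary directories in place of
/var/www/htdocs and /rescue so it can be tried safely; FTP_URL (user,
password, host, path) is a placeholder you would fill in.

```shell
#!/bin/sh
set -e
WEBROOT=$(mktemp -d)    # stand-in for /var/www/htdocs
RESCUE=$(mktemp -d)     # stand-in for /rescue
printf '<html>hello</html>\n' > "$WEBROOT/index.html"

# Same rotating weekday name as the existing cron job.
ARCHIVE="$RESCUE/website-$(date +%u).tgz"
tar -czf "$ARCHIVE" -C "$WEBROOT" .

# Upload step: set FTP_URL to something like
#   ftp://user:password@backup.example.com/backups/
# before running for real; it is unset here, so the upload is skipped.
if [ -n "${FTP_URL:-}" ]; then
    curl -sS -T "$ARCHIVE" "$FTP_URL"
fi
```

If the remote side offers ssh, scp or rsync over ssh would avoid sending the
FTP password in the clear.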
On Fri, Jan 28, 2011 at 5:33 PM, cpol...@surewest.net wrote:
madu...@gmail.com wrote
Should I add the following option to my tar command?
-p, --preserve-permissions
extract all protection information
i.e. tar -cvpzf ... (-f has to come last, just before the archive name)
Thanks
On Tue, Jan 25, 2011 at 7:10 PM, John Doe jd...@yahoo.com wrote:
From: madu...@gmail.com madu...@gmail.com
I want to create bash script to have
I have reallocated it to /home
thx
home folder for backup /backup
On Fri, Jan 28, 2011 at 7:49 PM, m.r...@5-cent.us wrote:
madu...@gmail.com wrote:
I have reallocated it to /home
thx
Please stop top posting.
Relocated it to /home, as in /home/backup? Don't clutter your base
directories, that's very bad practice
...@gmail.com wrote:
You could create a script with a variable like date --date='5 days ago',
match it against your tar file names, and combine that with an if test:
if it matches, then rm the file.
HTH
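An alternative to parsing dates out of file names: let find select archives
older than 5 days by modification time. BACKUP_DIR is a temporary stand-in
here, with two dummy archives so the effect is visible.

```shell
#!/bin/sh
set -e
BACKUP_DIR=$(mktemp -d)   # stand-in for the directory holding the .tgz files
touch "$BACKUP_DIR/website-new.tgz"
touch -d '10 days ago' "$BACKUP_DIR/website-old.tgz"   # GNU touch, backdated

# Delete snapshots whose mtime is more than 5 days in the past.
find "$BACKUP_DIR" -name 'website-*.tgz' -mtime +5 -delete
ls "$BACKUP_DIR"
```

Run daily from cron after the tar job, this keeps a rolling 5-day window
without any name parsing.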
On Tue, Jan 25, 2011 at 3:31 PM, madu...@gmail.com madu...@gmail.com
wrote:
I want to create bash script to have
I want to create a bash script to keep a zipped copy of a website running
on Linux (/var/www/htdocs/*) locally on the same box, in a different
directory.
I am thinking of doing a local backup using crontab (a snapshot of my web):
tar -cvzf /tmp/website-$(date +%Y%m%d-%H%M).tgz /var/www/htdocs/*
This command will create a timestamped archive of the web root.
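As a crontab entry that command could look like the line below; note that %
is special in crontab and has to be escaped as \% (the schedule shown is just
an example).

```
# run at 02:30 every night (illustrative time)
30 2 * * * tar -czf /tmp/website-$(date +\%Y\%m\%d-\%H\%M).tgz /var/www/htdocs/
```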
I need to know the meaning of this line in my squid access log
1295267166.311 1069 10.6.50.123 TCP_MISS/200 16623 GET
http://www.mycom.com/sendNews.php? - DIRECT/71.6.196.18 text/html
Thx
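The line is squid's native access.log format. Splitting the sample line field
by field (meanings per squid's default native logformat):

```shell
#!/bin/sh
set -e
LINE='1295267166.311 1069 10.6.50.123 TCP_MISS/200 16623 GET http://www.mycom.com/sendNews.php? - DIRECT/71.6.196.18 text/html'
echo "$LINE" | awk '{
    print "unix timestamp.ms :", $1   # seconds.ms since the epoch (here: 2011-01-17)
    print "duration (ms)     :", $2   # time squid spent serving the request
    print "client IP         :", $3
    print "result/HTTP status:", $4   # cache result code / HTTP status code
    print "reply size (bytes):", $5
    print "method            :", $6
    print "URL               :", $7
    print "ident user        :", $8   # "-" = no user (no authentication)
    print "hierarchy/peer    :", $9   # how the request was forwarded, and to where
    print "content type      :", $10
}'
```

So: a cache miss (TCP_MISS) answered with HTTP 200, 16623 bytes, fetched
directly from the origin server 71.6.196.18 in 1069 ms.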