Emile van Sebille wrote:
> On 11/21/2016 11:27 AM, subhabangal...@gmail.com wrote:
>> I have a python script where I am trying to read from a list of files
>> in a folder and trying to process something. As I try to take out the
>> output I am presently appending to a list.
>>
>> But I am trying
On 11/21/2016 11:27 AM, subhabangal...@gmail.com wrote:
I have a python script where I am trying to read from a list of files in a
folder and trying to process something.
As I try to take out the output I am presently appending to a list.
But I am trying to write the result of individual files in individual list or
files.
The script is as follows:
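The script itself did not survive in this snippet; as a rough sketch of the task described, writing each file's result to its own output file rather than appending everything to one list (folder names and the `process` step are hypothetical):

```python
import glob
import os

def process(text):
    # stand-in for the real per-file processing (hypothetical)
    return text.upper()

def process_folder(folder, out_folder):
    """Process every .txt file and write each result to its own file."""
    os.makedirs(out_folder, exist_ok=True)
    results = {}
    for path in glob.glob(os.path.join(folder, "*.txt")):
        with open(path, encoding="utf-8") as f:
            result = process(f.read())
        name = os.path.basename(path)
        with open(os.path.join(out_folder, name), "w", encoding="utf-8") as out:
            out.write(result)
        results[name] = result
    return results
```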
> What I meant was that you would have a dict of dicts, where the key was the
> country:
Thanks MRAB, I could not see that solution. That saves me a lot of lines of
code. Certainly my previous solution also managed to do that, but yours is
cleaner, code-wise.
On 2016-06-14 21:06, Joaquin Alzola wrote:
>> The dictionary that I am using in the classes:
>> {'Country':'Empty','Service':'Empty','TimeStamp':'Empty','Ocg':'see3',
>> 'DiameterCodes':{'2001':0,'4010':0,'4012':0,'4998':0,'4999':0,'5007':0
>> ,'5012':0}}
>>
>> Wanted help from your side on how to focus this just because I want to read
>>
On 2016-06-14 17:53, Joaquin Alzola wrote:
Hi Guys
I am doing program that reads into a directory for the files that were created
the last 5 mins. (working)
Inside those files there are 242 fields in each line separated by | (pipe).
Each file has about 5k records and there are about 5 files per 5 mins.
I will look for field 29 and
Mark Summerfield <l...@qtrac.plus.com> writes:
> Sometimes I want to spread a class over multiple files.
When I run into such a use case, I use (multiple) inheritance --
with "mixin class"es.
Each "mixin class" handles some important aspect and is only loosely
cou
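The mixin approach described above can be sketched like this, each mixin handling one loosely coupled aspect (class names are made up for illustration):

```python
import json

class JsonExportMixin:
    """One aspect: serialization of the instance's attributes."""
    def to_json(self):
        return json.dumps(self.__dict__)

class ReprMixin:
    """Another aspect: a generic debug representation."""
    def __repr__(self):
        return "%s(%r)" % (type(self).__name__, self.__dict__)

# The concrete class just combines the aspects via multiple inheritance.
class Point(JsonExportMixin, ReprMixin):
    def __init__(self, x, y):
        self.x = x
        self.y = y
```

Each mixin could live in its own file, with the combining class importing them.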
On 6/5/2016 2:55 AM, Mark Summerfield wrote:
Sometimes I want to spread a class over multiple files.
My experience with trying to work with two do-all classes in idlelib has
engendered a dislike for such. It is hard to find things in a
kitchen-sink class. To switch IDLE from being a multi
itance, mixins and traits. See links here:
https://mail.python.org/pipermail/python-list/2016-June/709808.html
To my mind, if you have to split a class over multiple files, it probably
does too much. The "God Class" that Peter referred to is an anti-pattern:
https://en.wikipedia.org
You're quite right! For some reason I have a blind-spot about mixins, but they
are the perfect solution. Thanks:-)
--
https://mail.python.org/mailman/listinfo/python-list
Mark Summerfield wrote:
> Sometimes I want to spread a class over multiple files.
>
> My primary use case is when I create a "Model" class to reflect an entire
> SQL database. I want a model instance to provide a single point of access
> to
> the database, but the d
On 06/04/2016 11:55 PM, Mark Summerfield wrote:
Sometimes I want to spread a class over multiple files.
There's an easy way to do this in Python using what's called a Mixin
class and (multiple) inheritance:
(See https://en.wikipedia.org/wiki/Mixin for more information.)
In one file, say
Sometimes I want to spread a class over multiple files.
My primary use case is when I create a "Model" class to reflect an entire SQL
database. I want a model instance to provide a single point of access to
the database, but the database has many tables each requiring its own meth
Sometimes I want to spread a class over multiple files.
My primary use case is when I create a "Model" class to reflect an entire SQL
database. I want a model instance to provide a single point of access to the
database, but the database has many tables each requiring its own met
On Monday, April 20, 2015 at 5:30:15 PM UTC+5:30, subhabrat...@gmail.com wrote:
Dear Group,
I am trying to open multiple files at one time.
I am trying to do it as,
for item in ['one', 'two', 'three']:
    f = open(item + 'world.txt', 'w')
    f.close()
This is fine. But I
subhabrata.bane...@gmail.com writes:
Dear Group,
I am trying to open multiple files at one time.
I am trying to do it as,
for item in ['one', 'two', 'three']:
    f = open(item + 'world.txt', 'w')
    f.close()
This is fine. But I was looking if I do not know the number of
text files
On 04/21/2015 03:56 AM, subhabrata.bane...@gmail.com wrote:
Yes. They do not. They are opening one by one.
I have some big chunk of data I am getting by crawling etc.
now as I run the code it is fetching data.
I am trying to fetch the data from various sites.
The contents of the file are
On Tuesday, April 21, 2015 at 4:20:16 AM UTC+5:30, Dave Angel wrote:
On 04/20/2015 07:59 AM, wrote:
Dear Group,
I am trying to open multiple files at one time.
I am trying to do it as,
for item in ['one', 'two', 'three']:
    f = open(item + 'world.txt', 'w')
f.close
On 04/20/2015 07:59 AM, subhabrata.bane...@gmail.com wrote:
Dear Group,
I am trying to open multiple files at one time.
I am trying to do it as,
for item in ['one', 'two', 'three']:
    f = open(item + 'world.txt', 'w')
    f.close()
This is fine.
But it does not open multiple files
Dear Group,
I am trying to open multiple files at one time.
I am trying to do it as,
for item in ['one', 'two', 'three']:
    f = open(item + 'world.txt', 'w')
    f.close()
This is fine. But I was looking if I do not know the number of
text files I would create beforehand, so not trying
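When the number of output files isn't known beforehand, a dict of open handles keyed by name works, opening each file on first use (a sketch along the lines of the original loop; the `world.txt` suffix is kept from it):

```python
def write_groups(records):
    """Write each (name, line) record to its own file, opening files lazily."""
    handles = {}
    try:
        for name, line in records:
            if name not in handles:
                # open on first use; suffix mirrors the original post
                handles[name] = open(name + "world.txt", "w")
            handles[name].write(line + "\n")
    finally:
        for fh in handles.values():
            fh.close()
```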
On Monday, April 20, 2015 at 5:00:15 AM UTC-7, subhabrat...@gmail.com wrote:
Dear Group,
I am trying to open multiple files at one time.
I am trying to do it as,
for item in ['one', 'two', 'three']:
    f = open(item + 'world.txt', 'w')
    f.close()
This is fine. But I was looking
On Sunday, June 29, 2014 4:19:27 PM UTC+5:30, subhaba...@gmail.com wrote:
Dear Group,
I am trying to crawl multiple URLs. As they are coming I want to write them
as string, as they are coming, preferably in a queue.
If any one of the esteemed members of the group may kindly help.
On Mon, 30 Jun 2014 12:23:08 -0700, subhabangalore wrote:
Thank you for your kind suggestion. But I am not able to sort out,
fp = open("scraped/body{:05d}.htm".format(n), "w")
please suggest.
look up the python manual for string.format() and open() functions.
The line indicated opens
Dear Group,
I am trying to crawl multiple URLs. As they are coming I want to write them as
string, as they are coming, preferably in a queue.
If any one of the esteemed members of the group may kindly help.
Regards,
Subhabrata Banerjee.
--
On 29/06/2014 11:49, subhabangal...@gmail.com wrote:
Dear Group,
I am trying to crawl multiple URLs. As they are coming I want to write them as
string, as they are coming, preferably in a queue.
If any one of the esteemed members of the group may kindly help.
Regards,
Subhabrata Banerjee.
you want to keep multiple files open, and
write to each in an arbitrary order. That's no problem, up to the operating
system limits. Define a class that holds the URL information and for each
instance, add an attribute for an output file handle.
Don't forget to close each file when you're
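The suggestion above, a class holding the URL information plus an output-file handle, might look like this (attribute and parameter names are illustrative):

```python
class UrlJob:
    """Holds one URL's info plus the file handle its content is written to."""

    def __init__(self, url, out_path):
        self.url = url
        self.out = open(out_path, "w", encoding="utf-8")

    def write(self, chunk):
        # chunks may arrive in any order across jobs
        self.out.write(chunk)

    def close(self):
        self.out.close()
```

One instance per URL, kept in a dict keyed by URL; close each job's file when its download finishes.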
If any one of the esteemed members of the group may kindly help.
From your subject line, it appears you want to keep multiple files open,
and write to each in an arbitrary order. That's no problem, up to the
operating system limits. Define a class that holds the URL information
. As they are coming I want to write
them
as string, as they are coming, preferably in a queue.
If any one of the esteemed members of the group may kindly help.
From your subject line, it appears you want to keep multiple files open,
and write to each in an arbitrary order
On Sun, 29 Jun 2014 10:32:00 -0700, subhabangalore wrote:
I am opening multiple URLs with urllib.open, now one Url has huge html
source files, like that each one has. As these files are read I am
trying to concatenate them and put in one txt file as string.
From this big txt file I am trying
Roundup Robot added the comment:
New changeset 74faca1ac59c by Ned Deily in branch '2.7':
Issue #6676: Ensure a meaningful exception is raised when attempting
http://hg.python.org/cpython/rev/74faca1ac59c
New changeset 9e3fc66ee0b8 by Ned Deily in branch '3.4':
Issue #6676: Ensure a meaningful
Ned Deily added the comment:
Applied for release in 3.5.0, 3.4.1 and 2.7.7. Thanks, everyone!
--
resolution: - fixed
stage: patch review - committed/rejected
status: open - closed
versions: +Python 3.5 -Python 3.3
___
Python tracker
Ned Deily added the comment:
Thanks for the reminder, David. Here are patches for 3.x and 2.7 that include
updated versions of the proposed pyexpat.c and test_pyexpat.py patches along
with a doc update along the lines suggested by David.
--
stage: - patch review
versions: -Python
Changes by Ned Deily n...@acm.org:
Added file: http://bugs.python.org/file34241/issue6676_27.patch
___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue6676
___
David H. Gutteridge added the comment:
Updating to reflect the Python 3.4 documentation is now also relevant to this
discussion. Perhaps someone could commit a change something like my suggestion
in msg143295?
--
versions: +Python 3.4
___
Python
On Thursday, December 12, 2013 5:20:59 PM UTC-5, Chris Angelico wrote:
import urllib
import csv
# You actually could get away with not using a with
# block here, but may as well keep it for best practice
with open('clients.csv') as f:
for client in csv.reader(f):
I have a CSV file containing a bunch of URLs I have to download a file from for
clients (Column 7) and the clients names (Column 0) I tried making a script to
go down the .csv file and just download each file from column 7, and save the
file as [clientname].csv
I am relatively new to python,
On 12/12/2013 21:43, Matt Graves wrote:
I have a CSV file containing a bunch of URLs I have to download a file from for
clients (Column 7) and the clients names (Column 0) I tried making a script to
go down the .csv file and just download each file from column 7, and save the
file as
On Fri, Dec 13, 2013 at 8:43 AM, Matt Graves tunacu...@gmail.com wrote:
###This SHOULD plug in the URL for F, and the client name for G.
def downloadFile(urls, clientname):
urllib.urlretrieve(f, %g.csv) % clientname
downloadFile(f,g)
When I run it, I get : AttributeError: 'file' object
In 88346903-2af8-48cd-9829-37cedb717...@googlegroups.com Matt Graves
tunacu...@gmail.com writes:
import urllib
import csv
urls = []
clientname = []
###This will set column 7 to be a list of urls
with open('clients.csv', 'r') as f:
reader = csv.reader(f)
for column in reader:
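A corrected sketch of what the script seems to be after, assuming column 0 is the client name and column 7 the URL, as stated in the post (the `fetch` parameter is an addition here so the download step can be swapped out or tested):

```python
import csv
import urllib.request

def download_all(csv_path, fetch=urllib.request.urlretrieve):
    """For every row, download column 7's URL to '<column 0>.csv'."""
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            name, url = row[0], row[7]
            fetch(url, "%s.csv" % name)
```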
I have a dilemma: I can't figure out how to send multiple files as an attachment
to my email using this script. I can only send a single file attachment.
Help!!! Here is my script.
All filenames are txt files.
fo = with open(filename,'rb')
fo1 = open(filename2,'rb')
fo2= open(filename3, 'rb
On 08/08/2013 12:05 PM, wachk...@gmail.com wrote:
I have a dilemma I cant figure out how to send multiple files as an attachment
to my email using this script. I can only send a single file attachment .
Help!!! Here is my script.
All filename's are txt files.
There is a standard Python
On Thu, Aug 8, 2013 at 1:05 PM, wachk...@gmail.com wrote:
I have a dilemma I cant figure out how to send multiple files as an
attachment to my email using this script. I can only send a single file
attachment . Help!!! Here is my script.
You just need to repeat part3 for each attachment
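Repeating the attach step per file, sketched here with the modern `email.message.EmailMessage` API rather than the MIME part classes the original script used (addresses and the subject line are placeholders):

```python
from email.message import EmailMessage

def build_mail(filenames, sender, recipient):
    """Attach every file in `filenames`: one add_attachment call per file."""
    msg = EmailMessage()
    msg["From"], msg["To"], msg["Subject"] = sender, recipient, "files"
    msg.set_content("See attached files.")
    for name in filenames:
        with open(name, "rb") as f:
            msg.add_attachment(f.read(), maintype="text",
                               subtype="plain", filename=name)
    return msg
```

Sending is then a single `smtplib.SMTP(...).send_message(msg)` call.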
On Thu, Aug 8, 2013 at 3:19 PM, Gary Herron
gary.her...@islandtraining.com wrote:
On 08/08/2013 12:05 PM, wachk...@gmail.com wrote:
I have a dilemma I cant figure out how to send multiple files as an
attachment to my email using this script. I can only send a single file
attachment . Help
I'm trying to delete all text files from an ftp directory. Is there a way to
delete multiple files of the same extension?
I came up with the following code below which works but I have to append the
string because ftp.nlst returns:
-rwx-- 1 user group 0 Feb 04 15:57 New Text Document.txt
On 02/05/2013 12:29 PM, chris.an...@gmail.com wrote:
im trying to delete all text files from an ftp directory. is there a way to
delete multiple files of the same extension?
I came up with the following code below which works but I have to append the
string because ftp.nlst returns:
-rwx
On 2013-02-05 17:29, chris.an...@gmail.com wrote:
im trying to delete all text files from an ftp directory. is there a way to
delete multiple files of the same extension?
I came up with the following code below which works but I have to append the
string because ftp.nlst returns:
-rwx
im trying to delete all text files from an ftp directory. is there a way to
delete multiple files of the same extension?
I came up with the following code below which works but I have to append
the string because ftp.nlst returns:
-rwx-- 1 user group 0 Feb 04 15:57
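A sketch of the pattern-matching delete, split so the matching is testable on its own; the `ftp` object is passed in already connected, and `nlst()` is assumed to return bare names (some servers return long listings instead, as the original post found):

```python
import fnmatch

def txt_names(listing):
    """Pick the *.txt entries out of an nlst()-style name list."""
    return [n for n in listing if fnmatch.fnmatch(n, "*.txt")]

def delete_txt(ftp, directory):
    """Delete every .txt file in one directory of an open FTP connection."""
    ftp.cwd(directory)
    for name in txt_names(ftp.nlst()):
        ftp.delete(name)
```

With ftplib: `ftp = FTP(host); ftp.login(user, password); delete_txt(ftp, '/incoming')`.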
want to split the file and get multiple files like
A1980JE3937.txt and A1980KK18700010.txt, where each file will
contain column2, 3 and 4.
Sorry for being completely off-topic here, but awk has a very convenient
feature to deal with this. Simply use:
awk '{ print $2,$3,$4 > $1".txt"; }' /path
On Tue, 23 Oct 2012 20:01:03 -0700, satyam wrote:
I have a text file like this
A1980JE3937 2732 4195 12.527000
[...]
I want to split the file and get multiple files like A1980JE3937.txt
and A1980KK18700010.txt, where each file will contain column2, 3 and 4.
Are you just excited
On Wed, Oct 24, 2012 at 3:52 AM, Steven D'Aprano
steve+comp.lang.pyt...@pearwood.info wrote:
On Tue, 23 Oct 2012 20:01:03 -0700, satyam wrote:
I have a text file like this
A1980JE3937 2732 4195 12.527000
[...]
I want to split the file and get multiple files like A1980JE3937.txt
186 1.285000
A1980KK18700010 30 185 4.395000
A1980KK18700010 185 186 9.00
A1980KK18700010 25 30 3.493000
I want to split the file and get multiple files like A1980JE3937.txt
and A1980KK18700010.txt, where each file will contain column2, 3 and 4.
Thanks Satyam
import os
from
185 4.395000
A1980KK18700010 185 186 9.00
A1980KK18700010 25 30 3.493000
I want to split the file and get multiple files like A1980JE3937.txt and
A1980KK18700010.txt, where each file will contain column2, 3 and 4.
Thanks
Satyam
and get multiple files like A1980JE3937.txt and
A1980KK18700010.txt, where each file will contain column2, 3 and 4.
The sample data above shows the data grouped by file name. Will this
be true generally?
30 186 1.285000
A1980KK18700010 30 185 4.395000
A1980KK18700010 185 186 9.00
A1980KK18700010 25 30 3.493000
I want to split the file and get multiple files like A1980JE3937.txt and
A1980KK18700010.txt, where each file will contain column2, 3 and 4.
Unless your source file is very
and get multiple files like A1980JE3937.txt
and A1980KK18700010.txt, where each file will contain column2, 3 and 4.
Unless your source file is very large this should be sufficient:
$ cat source
A1980JE3937 2732 4195 12.527000
A1980JE3937 3465 9720 22.00
A1980JE3937 1853
A1980KK18700010 186 3366 4.78
A1980KK18700010 30 186 1.285000
A1980KK18700010 30 185 4.395000
A1980KK18700010 185 186 9.00
A1980KK18700010 25 30 3.493000
I want to split the file and get multiple files like A1980JE3937.txt and
A1980KK18700010.txt, where each file will contain
On 2012-10-23, at 10:24 PM, David Hutto dwightdhu...@gmail.com wrote:
count = 0
Don't use count.
for file_data in turn_text_to_txt:
Use enumerate:
for count, file_data in enumerate(turn_text_to_txt):
f = open('/home/david/files/%s_%s.txt' % (file_data.split(' ')[0], count),
'w')
Use
satyam dirac@gmail.com writes:
I have a text file like this
A1980JE3937 2732 4195 12.527000
A1980JE3937 3465 9720 22.00
A1980JE3937 2732 9720 18.00
A1980KK18700010 130 303 4.985000
A1980KK18700010 7 4915 0.435000
[...]
I want to split the file and get multiple
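A Python version of the split, along the lines of the awk one-liner quoted earlier: group lines by their first column and write columns 2-4 of each group to `<first-column>.txt` in the current directory:

```python
def split_by_first_column(src_path):
    """Write columns 2-4 of each line to a file named after column 1."""
    outputs = {}
    try:
        with open(src_path) as src:
            for line in src:
                fields = line.split()
                if not fields:
                    continue
                key = fields[0]
                if key not in outputs:
                    outputs[key] = open(key + ".txt", "w")
                outputs[key].write(" ".join(fields[1:4]) + "\n")
    finally:
        for fh in outputs.values():
            fh.close()
```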
On Tue, 10 Jul 2012 10:46:08 -0700, Subhabrata wrote:
Dear Group,
I kept a good number of files in a folder. Now I want to read all of
them. They are in different formats and different encoding. Using
listdir/glob.glob I am able to find the list but how to open/read or
process them for
On Tuesday, July 10, 2012 11:16:08 PM UTC+5:30, Subhabrata wrote:
Dear Group,
I kept a good number of files in a folder. Now I want to read all of
them. They are in different formats and different encoding. Using
listdir/glob.glob I am able to find the list but how to open/read or
process
On 11 July 2012 19:15, subhabangal...@gmail.com wrote:
On Tuesday, July 10, 2012 11:16:08 PM UTC+5:30, Subhabrata wrote:
Dear Group,
I kept a good number of files in a folder. Now I want to read all of
them. They are in different formats and different encoding. Using
listdir/glob.glob
On Wed, 11 Jul 2012 11:15:02 -0700, subhabangalore wrote:
On Tuesday, July 10, 2012 11:16:08 PM UTC+5:30, Subhabrata wrote:
Dear Group,
I kept a good number of files in a folder. Now I want to read all of
them. They are in different formats and different encoding. Using
listdir/glob.glob I
Dear Group,
I kept a good number of files in a folder. Now I want to read all of
them. They are in different formats and different encoding. Using
listdir/glob.glob I am able to find the list but how to open/read or
process them for different encodings?
If any one can help me out.I am using
On 10/07/2012 18:46, Subhabrata wrote:
Dear Group,
I kept a good number of files in a folder. Now I want to read all of
them. They are in different formats and different encoding. Using
listdir/glob.glob I am able to find the list but how to open/read or
process them for different encodings?
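One hedged approach to the mixed-encodings question: try a list of candidate encodings per file and use the first that decodes. The candidate list here is an assumption to adjust; note `latin-1` accepts any byte sequence, so it acts as a last resort:

```python
CANDIDATES = ["utf-8", "utf-16", "latin-1"]  # assumption: adjust as needed

def read_any(path):
    """Return (text, encoding) using the first candidate that decodes."""
    for enc in CANDIDATES:
        try:
            with open(path, encoding=enc) as f:
                return f.read(), enc
        except (UnicodeDecodeError, UnicodeError):
            continue
    raise ValueError("no candidate encoding worked for %s" % path)
```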
On Thu, Jan 26, 2012 at 2:19 AM, lh lhughe...@gmail.com wrote:
Third, length. Well, 5000 lines, eh... I'm nowhere near that; guess I can
stick with one file.
Of all the source files I have at work, the largest is about that, 5K
lines. It gets a little annoying at times (rapid deployment requires
In article mailman.5080.1327510460.27778.python-l...@python.org,
Dennis Lee Bieber wlfr...@ix.netcom.com wrote:
The old convention I'd learned was to keep functions down to a
(printer) page (classical 6 lines per inch, 11 high, tractor feed -- so
about 60 lines per function -- possibly
On Fri, Jan 27, 2012 at 1:11 AM, Roy Smith r...@panix.com wrote:
So, I'd say the driving principle should be that a function should do
one thing. Every function should have an elevator talk. You should be
able to get on an elevator with a function and when you ask it, "So,
what do you do?", it
in
test2.py).
In short I would like to distribute code for one class across multiple
files so a given file doesn't get ridiculously long.
I take the point of the other responders that it is not a normal thing to
do, but I had a few long but rarely used methods which I wanted to move out
which contains a set of methods that are
methods of class Foo defined in test.py.
Technically, yes, this is possible, but you shouldn't need to do it.
Needing to split a single class across multiple files is a sign of bad
design. If the class is that huge, then it probably does too many things
like to distribute code for one class across multiple
files so a given file doesn't get ridiculously long.
Thank you,
Luke
If the file is ridiculously long, could be that the class has a
ridiculous number of methods. If you spread your class into multiple
files, you will have a ridiculous
First, thanks for all the thoughtful replies. I am grateful.
Second, I figured I'd get a lot of judgement about how I really
shouldn't be doing this. Should have pre-empted it :-) oh well. There
is a place IMHO for filename as another structuring element to help
humans in search. Also it can be
On 2012-01-25, lh lhughe...@gmail.com wrote:
First, thanks for all the thoughtful replies. I am grateful.
Second, I figured I'd get a lot of judgement about how I really
shouldn't be doing this. Should have pre-empted it :-) oh well.
There is a place IMHO for filename as another structuring
to distribute code for one class across multiple
files so a given file doesn't get ridiculously long.
Thank you,
Luke
would just indent
them def's under the class but the class isn't textually in
test2.py).
In short I would like to distribute code for one class across multiple
files so a given file doesn't get ridiculously long.
The student asks the master, "How long is a file?"
The master replies, "Just long
but the class isn't textually in
| test2.py).
|
| In short I would like to distribute code for one class across multiple
| files so a given file doesn't get ridiculously long.
You may need to define "ridiculously long". What's your objection to
a long file? What specific difficulties does it cause? I'm
David H. Gutteridge dhgutteri...@sympatico.ca added the comment:
Ned: My proposed wording is: Note that only one document can be parsed by a
given instance; it is not possible to reuse an instance to parse multiple
files. To provide more detail, one could also add something like: The
isfinal
Ned Deily n...@acm.org added the comment:
I agree that, at a minimum, the documentation should be updated to include a
warning about not reusing a parser instance. Whether it's worth trying to plug
all the holes in the expat library is another issue (see, for instance,
issue12829). David,
Ned Deily n...@acm.org added the comment:
Also, note issue1208730 proposes a feature to expose a binding for
XML_ParserReset and has the start of a patch.
--
___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue6676
David H. Gutteridge dhgutteri...@sympatico.ca added the comment:
The documentation should definitely be updated to clarify that a parser
instance is not reusable with more than one file. I had a look at the
equivalent documentation for Perl and TCL, and Perl's implementation explicitly
does
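The fix implied throughout this issue — never reuse an expat parser across documents — can be sketched by creating a fresh parser per file (the element-counting handler is illustrative):

```python
from xml.parsers.expat import ParserCreate

def count_elements(documents):
    """Count start tags across documents, one fresh parser per document."""
    total = 0
    for data in documents:
        count = 0

        def handler(name, attrs):
            nonlocal count
            count += 1

        parser = ParserCreate()          # new parser: never reuse one
        parser.StartElementHandler = handler
        parser.Parse(data, True)         # True = this is the final chunk
        total += count
    return total
```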
On Sun, May 22, 2011 at 8:48 PM, Shunichi Wakabayashi
shunichi_wakabaya...@yahoo.co.jp wrote:
One idea is using contextlib.nested(),
from contextlib import nested
with nested(*[open('list_%d.txt' % i, 'w') for i in range(LIST_LEN)]) as
fobjlist:
for i in range(1000):
To write onto multiple files on the same time (a number of files are variable),
I'd like to code as follows, for example, IF I can do,
LIST_LEN = 4
with [ open('list_%d.txt' % i, 'w') for i in range(LIST_LEN) ] as fobjlist:
for i in range(1000):
fobjlist[random.randrange(LIST_LEN)].write
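In current Python, `contextlib.ExitStack` (which superseded `contextlib.nested`) manages a variable number of context managers; a sketch of the same loop (file names kept from the post, the line count is a parameter for illustration):

```python
import random
from contextlib import ExitStack

LIST_LEN = 4

def scatter_lines(n_lines=1000):
    """Write n_lines lines, each to a randomly chosen open file."""
    with ExitStack() as stack:
        # enter_context registers each file for closing on exit
        fobjlist = [stack.enter_context(open('list_%d.txt' % i, 'w'))
                    for i in range(LIST_LEN)]
        for i in range(n_lines):
            fobjlist[random.randrange(LIST_LEN)].write('%d\n' % i)
```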
Éric Araujo mer...@netwok.org added the comment:
+1, this was agreed upon last summer:
https://bitbucket.org/Merwok/sample-distutils2-project/src/tip/new-config-file.rst#cl-255
(see also fellowship archive). Fixed by Gaël in c8dfb3c63894.
--
resolution: - fixed
stage: -
priority: normal
severity: normal
status: open
title: Allow multiple files in the description-file metadata
type: feature request
___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue11036
Guido's time machine strikes again! It's already in Python
3; your
example would be spelled:
with open('scores.csv') as f, open('grades.csv', 'wt') as g:
g.write(f.read())
Indeed! Thanks, Chris and James.
Yingjie
Hi,
Suppose I am working with two files simultaneously,
it might make sense to do this:
with open('scores.csv'), open('grades.csv', 'wt') as f,g:
g.write(f.read())
sure, you can do this with nested with-blocks,
but the one above does not seem too complicated,
it is like having a multiple
On Sun, Oct 31, 2010 at 10:03 PM, Yingjie Lan lany...@yahoo.com wrote:
Hi,
Suppose I am working with two files simultaneously,
it might make sense to do this:
with open('scores.csv'), open('grades.csv', 'wt') as f,g:
g.write(f.read())
sure, you can do this with nested with-blocks,
but
On Mon, Nov 1, 2010 at 3:03 PM, Yingjie Lan lany...@yahoo.com wrote:
with open('scores.csv'), open('grades.csv', 'wt') as f,g:
g.write(f.read())
One could write their own ContextManager here...
cheers
James
--
-- James Mills
--
-- Problems are solved by method
--
Thanks,
The issue with the times is now sorted, however I'm running into a problem
towards the end of the script:
File "sortoutsynop2.py", line 131, in <module>
newline =
message_type+c+str(station_id)+c+newtime+c+lat+c+lon+c+c+"-"+c+"002"+c+"-"+c+"-"+c+str(pressure)+c
TypeError: cannot
On 10/15/2010 6:59 AM, Christopher Steele wrote:
Thanks,
The issue with the times is now sorted, however I'm running into a
problem towards the end of the script:
File "sortoutsynop2.py", line 131, in <module>
newline =
message_type+c+str(station_id)+c+newtime+c+lat+c+lon+c+c+"-"+c+
"002"
Hi
I've been trying to decode a series of observations from multiple files
(each file is a different time) and put each type of observation into their
own separate file. The script runs successfully for one file but whenever I
try it for more they just overwrite each other. I'm new to python
multiple files
(each file is a different time) and put each type of observation into
their own separate file. The script runs successfully for one file but
whenever I try it for more they just overwrite each other.
fileinput.input() iterates over *lines* not entire *files*. So take a look
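A sketch of using `fileinput` over several files while still tracking file boundaries, via `filename()` and `isfirstline()` (here just collecting each file's first line):

```python
import fileinput

def first_lines(paths):
    """Map each file's path to its first line, iterating all files in turn."""
    firsts = {}
    with fileinput.input(files=paths) as f:
        for line in f:
            if f.isfirstline():       # True at each file boundary
                firsts[f.filename()] = line.rstrip("\n")
    return firsts
```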
On 10/14/2010 6:08 AM, Christopher Steele wrote:
Hi
I've been trying to decode a series of observations from multiple files
(each file is a different time) and put each type of observation into
their own separate file. The script runs successfully for one file but
whenever I try it for more
On 10/14/2010 10:44 AM, Christopher Steele wrote:
The issue is that I need to be able to both, split the names of the
files so that I can extract the relevant times, and open each
individual file and process each line individually. Once I have
achieved this I need to append the sorted files
I have 3 files which are constantly being updated therefore I use tail
-f /var/log/file1, tail -f /var/log/file2, and tail -f /var/log/file3
For 1 file I am able to manage by
tail -f /var/log/file1 | python prog.py
prog.py looks like this:
f=sys.stdin
for line in f:
print line
But how can I
Mag Gam magaw...@gmail.com writes:
I have 3 files which are constantly being updated therefore I use tail
-f /var/log/file1, tail -f /var/log/file2, and tail -f /var/log/file3
For 1 file I am able to manage by
tail -f /var/log/file1 | python prog.py
prog.py looks like this:
f=sys.stdin
Thanks for your response.
I was going by this thread,
http://mail.python.org/pipermail/tutor/2009-January/066101.html which makes
you wonder if it's even possible.
I will try your first solution by doing mkfifo on the files.
On Thu, Sep 9, 2010 at 9:19 PM, Alain Ketterlin
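A pure-Python stand-in for the three `tail -f` pipes: poll each file for new lines and yield them as they appear (paths and the poll interval are illustrative; `from_start` exists only so the sketch can be exercised on a finite file):

```python
import time

def follow(paths, poll_interval=0.5, from_start=False):
    """Yield (path, line) as new lines appear in any of the files."""
    handles = {p: open(p) for p in paths}
    if not from_start:
        for fh in handles.values():
            fh.seek(0, 2)              # jump to end of file, like tail -f
    while True:
        got_line = False
        for path, fh in handles.items():
            line = fh.readline()       # '' when no new data yet
            if line:
                got_line = True
                yield path, line.rstrip("\n")
        if not got_line:
            time.sleep(poll_interval)  # nothing new anywhere: back off
```

Usage: `for path, line in follow(['/var/log/file1', '/var/log/file2']): ...`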
Will Grainger willgrain...@gmail.com added the comment:
I don't think this is a python specific problem. I have just seen
the same error when working with the expat library from C, and the cause
is using the same parser to read multiple files.
--
nosy: +willgrainger
Ezio Melotti ezio.melo...@gmail.com added the comment:
This is a duplicate of #7372.
--
resolution: - duplicate
stage: test needed - committed/rejected
status: open - closed
superseder: - Regression in pstats
___
Python tracker