Re: Script suddenly stops
Dear All, thanks a lot for your replies. I've found my mistake: the script output had stopped, but the script was still filling the MySQL table. When I resized the GNOME terminal window, the output continued. -- Christian -- https://mail.python.org/mailman/listinfo/python-list
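For anyone hitting the same symptom: a process that writes to a blocked terminal stalls as soon as the output buffer fills, even though the process itself is healthy. A minimal sketch of a workaround, assuming the loader is started from a shell (the file names are placeholders):

```shell
# Sketch: write progress to a log file instead of the terminal, so
# terminal flow control (an accidental Ctrl-S, or a terminal that stops
# repainting) can never block the script on a write to stdout.
python3 -c 'print("processed 1000 rows")' > load.log 2>&1
cat load.log   # or follow progress from another shell with: tail -f load.log
```

Started this way (optionally with `nohup ... &`), the script keeps running regardless of what the terminal does.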
Script suddenly stops
Dear All, I'm trying to read ten 200 MB text files into a MySQL MyISAM database (Linux, ext4). The script's output suddenly stops while the Python process is still running (or should I say sleeping?). It is no longer visible in top, but it is in ps. Why is it stopping? Is there a way to make it continue without calling kill -9, deleting the already-processed lines, and starting it again? Thank you in advance. [1] http://pastebin.com/CxHCA9eB -- Christian
Re: Script suddenly stops
Chris ch2...@arcor.de writes:

> I'm trying to read ten 200 MB textfiles into a MySQL MyISAM database (Linux, ext4). The script output is suddenly stopping, while the Python process is still running (or should I say sleeping?). It's not in top, but in ps visible. Why is it stopping? Is there a way to make it continue, without calling kill -9, deleting the processed lines and starting it again?

This is difficult to say (from a distance). I would approach an analysis in the following way:

* use a Python with debug symbols (OS-provided Python installations usually lack debug symbols; a manually built Python usually has them)
* run your script under gdb control (the C-level debugger)
* when you see that your script starts sleeping, hit CTRL-C in the gdb session
* use gdb commands - maybe combined with the special Python commands for gdb - to learn where the sleeping happens. These special Python commands allow you to use the C-level debugger gdb to get information about the Python level.
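The steps above might look like this in practice. This is only a sketch: the PID is a placeholder, and the py-bt command is available only when gdb has loaded CPython's python-gdb.py extensions from the source tree:

```shell
# Placeholder PID; attach gdb to the already-running (sleeping) process.
gdb -p 12345
# Then, inside the gdb session:
#   (gdb) bt        # C-level backtrace: which call is it blocked in?
#   (gdb) py-bt     # Python-level backtrace (needs python-gdb.py loaded)
#   (gdb) detach    # let the process continue undisturbed
#   (gdb) quit
```

Attaching with `-p` rather than starting the script under gdb lets you inspect the hang exactly when it occurs, then detach without killing the process.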
Re: Script suddenly stops
Chris wrote:

> Dear All, I'm trying to read ten 200 MB textfiles into a MySQL MyISAM database (Linux, ext4). The script output is suddenly stopping, while the Python process is still running (or should I say sleeping?). It's not in top, but in ps visible. Why is it stopping? Is there a way to make it continue, without calling kill -9, deleting the processed lines and starting it again? Thank you in advance. [1] http://pastebin.com/CxHCA9eB

    #!/usr/bin/python
    import MySQLdb, pprint, re

    db = None
    daten = "/home/chris/temp/data/data/"
    host = "localhost"
    user = "data"
    passwd = "data"
    database = "data"
    table = "data"

    def connect_mysql():
        global db, host, user, passwd, database
        db = MySQLdb.connect(host, user, passwd, database)
        return(db)

    def read_file(srcfile):
        lines = []
        f = open(srcfile, 'r')
        while True:
            line = f.readline()
            #print line
            lines.append(line)
            if len(line) == 0:
                break
        return(lines)

The read_file() function looks suspicious. It uses a round-about way to read the whole file into memory. Maybe your system is just swapping? Throw read_file() away and instead iterate over the file directly (see below).

    def write_db(anonid, query, querytime, itemrank, clickurl):
        global db, table
        print "write_db aufgerufen."
        cur = db.cursor()
        try:
            cur.execute("INSERT INTO data"
                        " (anonid,query,querytime,itemrank,clickurl)"
                        " VALUES (%s,%s,%s,%s,%s)",
                        (anonid, query, querytime, itemrank, clickurl))
            db.commit()
        except:
            db.rollback()

    def split_line(line):
        print "split_line called."
        print "line is:", line
        searchObj = re.split(r'(\d*)\t(.*)\t([0-9: -]+)\t(\d*)\t([A-Za-z0-9._:/ -]*)',
                             line, flags=re.I|re.U)
        return(searchObj)

    db = connect_mysql()
    pprint.pprint(db)

    with open(daten + "test-07b.txt") as lines:
        for line in lines:
            result = split_line(line)
            write_db(result[1], result[2], result[3], result[4], result[5])
    db.close()

Random remarks:

- A bare except is evil. You lose valuable information.
- A 'global' statement is only needed to rebind a module-global variable, not to access such a variable. At first glance all your 'global' declarations seem superfluous.
- You could change the signature of write_db() to accept result[1:6].
- Do you really need a new cursor for every write? Keep one around as a global.
- You might try cur.executemany() to speed things up a bit.
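The executemany() suggestion can be sketched like this. sqlite3 is used here only so the example is self-contained; with MySQLdb the cursor API has the same DB-API 2.0 shape, and only the connect() call and the paramstyle (%s instead of ?) differ:

```python
import sqlite3

# Stand-in for the MySQL table from the thread (column names taken from
# the INSERT statement above).
db = sqlite3.connect(":memory:")
cur = db.cursor()  # one cursor, reused, instead of one per row
cur.execute("CREATE TABLE data (anonid TEXT, query TEXT, querytime TEXT,"
            " itemrank TEXT, clickurl TEXT)")

# A batch of already-split lines; in the real script these would come
# from iterating over the input file.
rows = [
    ("1", "python", "2014-05-29 19:47:00", "1", "python.org"),
    ("2", "mysql", "2014-05-29 19:48:00", "2", "mysql.com"),
]

# One executemany() call per batch instead of one execute() per line.
cur.executemany("INSERT INTO data VALUES (?, ?, ?, ?, ?)", rows)
db.commit()

cur.execute("SELECT COUNT(*) FROM data")
print(cur.fetchone()[0])  # → 2
```

The win comes from amortizing the per-statement round-trip cost over a whole batch of rows.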
Re: Script suddenly stops
On 5/29/14, 7:47 PM, Chris wrote:

> I'm trying to read ten 200 MB textfiles into a MySQL MyISAM database (Linux, ext4). The script output is suddenly stopping, while the Python process is still running (or should I say sleeping?). It's not in top, but in ps visible.

Does it stop in the same place every time? How long are you waiting before giving up? Is it at all possible that it is the MySQL side that is blocking?

> Why is it stopping? Is there a way to make it continue, without calling kill -9, deleting the processed lines and starting it again?

One thing to try (maybe, depending on whether it still fits your requirements for a database transaction) is to increase the number of rows inserted before each commit.

> [1] http://pastebin.com/CxHCA9eB

It won't have any bearing on the stall, but those globals aren't necessary... Paul
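The batched-commit idea can be sketched as follows. sqlite3 is used so the sketch is self-contained; the pattern is identical with MySQLdb, and the batch size is an assumed tuning knob, not a value from the thread:

```python
import sqlite3

BATCH = 1000  # rows per transaction; tune against your durability needs

db = sqlite3.connect(":memory:")
cur = db.cursor()
cur.execute("CREATE TABLE data (n INTEGER)")

pending = 0
for n in range(2500):          # stand-in for lines read from the file
    cur.execute("INSERT INTO data VALUES (?)", (n,))
    pending += 1
    if pending >= BATCH:       # commit once per batch, not once per row
        db.commit()
        pending = 0
db.commit()                    # flush the final partial batch

cur.execute("SELECT COUNT(*) FROM data")
print(cur.fetchone()[0])  # → 2500
```

The trade-off: a crash mid-batch loses at most BATCH uncommitted rows, in exchange for far fewer fsync-heavy commits.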