On 2020-05-14 23:36, Stephane Tougard wrote:
> Hello,
>
> A multithreaded piece of software written in Python is connected to a
> Postgres database. To avoid concurrent-access issues with the database,
> it starts a thread that receives all SQL requests via queue.put and
> queue.get (it only does inserts, so there is no issue with the return
> value of the SQL request).
>
> As long as it runs with 10 threads, there are no issues. At 100
> threads, the software is blocked by what I think is a locking issue. I
> guess Python's multithreading and queues work well enough to handle
> 100 threads without trouble (prove me wrong here), so I guess the
> problem is in my code.
>
> The function (thread) that handles the SQL requests:
>
>     def execute_sql(q):
>         print("Start SQL Thread")
>         while True:
>             try:
>                 data = q.get(True, 5)
>             except:
>                 print("No data")
>                 continue
>             print("RECEIVED SQL ORDER")
>             print(data)
>             print("END")
>             if data == "EXIT":
>                 return
>             try:
>                 request = data['request']
>                 arg = data['arg']
>                 ref.execute(request, arg)
>             except:
>                 print("Can not execute SQL request")
>                 print(data)
>
> The code that sends the SQL request:
>
>     sql = dict()
>     sql['request'] = "update b2_user set credit = credit -%s where id = %s"
>     sql['arg'] = (i, username,)
>     try:
>         q.put(sql, True, 5)
>     except:
>         print("Can not insert data")
>
> The launch of the SQL thread (nothing fancy here):
>
>     q = qu.Queue()
>     t = th.Thread(target=execute_sql, args=(q,))
>     t.start()
>
> Any idea?

Are there 100 threads running execute_sql? Do you put 100 "EXIT"
messages into the queue, one for each thread?
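
If you do run several consumer threads on one queue, the usual pattern
is one sentinel per consumer, since each "EXIT" is consumed by exactly
one thread. A minimal, untested sketch; the worker count and the print
standing in for the real SQL work are made up for illustration:

    import queue
    import threading

    NUM_WORKERS = 100   # hypothetical count, purely for illustration

    def execute_sql(q):
        while True:
            data = q.get()
            if data == "EXIT":             # sentinel: stops this worker only
                return
            print("would execute:", data)  # stand-in for the real work

    q = queue.Queue()
    workers = [threading.Thread(target=execute_sql, args=(q,))
               for _ in range(NUM_WORKERS)]
    for t in workers:
        t.start()

    # ... producers q.put() their requests here ...

    for _ in workers:   # one "EXIT" per worker; fewer, and some block forever
        q.put("EXIT")
    for t in workers:
        t.join()

Alternatively, the consumer can call q.task_done() after each item and
the main thread can wait on q.join() until the queue drains before
putting the sentinels.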
The "bare excepts" are a bad idea because they catch _all_ exceptions, even the ones that might occur due to bugs, such as NameError (misspelled variable or function).
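
Here q.get(True, 5) raises queue.Empty on a timeout, so that is the one
exception worth catching at that point. For the database call, even
catching Exception and printing it is a big improvement over a bare
except, because the real error becomes visible. A sketch of just the
loop, with the cursor passed in as cur (standing in for your ref
object):

    import queue

    def execute_sql(q, cur):
        while True:
            try:
                data = q.get(True, 5)
            except queue.Empty:    # only the timeout, not bugs like NameError
                continue
            if data == "EXIT":
                return
            try:
                cur.execute(data['request'], data['arg'])
            except KeyError:       # malformed message, missing keys
                print("Malformed message:", data)
            except Exception as e: # still broad, but the failure is logged
                print("Can not execute SQL request:", e)
                print(data)

If you are using psycopg2, catching psycopg2.Error instead of Exception
would be more precise.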
