I reviewed my code and the slowness I was talking about came from the
"if a in myList" test. I switched to the dict and I haven't noticed a
performance hit from keeping a pointer to a list.
I ended up switching to Postgres COPY for the importing. It's a lot
faster.
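For reference, the COPY route can be sketched without tying it to the real schema. The table and column names below are made up, and psycopg2's copy_expert (shown in the comment) is just one common way to stream the file:

```python
def copy_sql(table, columns):
    """Build a COPY ... FROM STDIN statement for streaming a CSV into Postgres."""
    return "COPY {} ({}) FROM STDIN WITH (FORMAT csv, HEADER)".format(
        table, ", ".join(columns))

# With psycopg2 (one common driver) the statement is executed roughly as:
#   with conn.cursor() as cur, open("data.csv") as f:
#       cur.copy_expert(copy_sql("datapoint", ["taken_datetime", "value"]), f)
#   conn.commit()
# COPY does one bulk load server-side instead of one INSERT per row,
# which is why it is so much faster for imports.
```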
Brian
> Den 13/08/2015 kl. 04.09 skrev yakkades...@gmail.com:
>
> I'll run a test with the dict vs list+position counter. I know I saw a speed
> improvement but I can't remember if that was the only thing I changed.
>
> I'd have to change a lot of code if I change the DB scheme so I'm not wanting
This Python wiki
(https://wiki.python.org/moin/PythonSpeed/PerformanceTips#Choose_the_Right_Data_Structure)
suggests:
* Membership testing with sets and dictionaries is much faster, O(1), than
searching sequences, O(n). When testing "a in b", b should be a set or
dictionary instead of a list.
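That O(1) vs O(n) difference is easy to see directly with timeit; the sizes and the worst-case target below are arbitrary:

```python
import timeit

my_list = list(range(100_000))
my_set = set(my_list)
target = 99_999  # worst case for the list: the scan walks every element

# Each list test scans up to 100k items; each set test is one hash lookup.
list_time = timeit.timeit(lambda: target in my_list, number=100)
set_time = timeit.timeit(lambda: target in my_set, number=100)

print("list: {:.4f}s  set: {:.6f}s".format(list_time, set_time))
```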
I'll run a test with the dict vs list+position counter. I know I saw a
speed improvement but I can't remember if that was the only thing I
changed.
I'd have to change a lot of code if I change the DB scheme so I'm not
wanting to create an intermediate table. I'm going to go down the SQL
> Den 12/08/2015 kl. 20.00 skrev yakkades...@gmail.com:
>
> In the actual code I create and preload all the DataPoints and Sensors
> outside the loop. I found a dict was too slow for DataPoints.
That's suspicious. Compared to loading data from the database, Python dicts are
not slow, for
Hi Erik,
In the actual code I create and preload all the DataPoints and Sensors
outside the loop. I found a dict was too slow for DataPoints. I ended up
sorting the DataPoints query by date and using the fact that they were in
the same order as the CSV to speed things up.
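The sorted-order trick described above reads like a two-pointer merge: both sequences are sorted by date, so a single position counter replaces a per-row lookup. A minimal sketch, with hypothetical field names:

```python
def match_rows(csv_rows, datapoints):
    """Yield (row, datapoint) pairs; both inputs must be sorted by date ascending."""
    i = 0
    for row in csv_rows:
        # Advance the pointer until the dates line up; it never moves backwards,
        # so the whole pass is O(n + m) instead of O(n * m).
        while i < len(datapoints) and datapoints[i]["date"] < row["date"]:
            i += 1
        if i < len(datapoints) and datapoints[i]["date"] == row["date"]:
            yield row, datapoints[i]

rows = [{"date": 1}, {"date": 3}]
dps = [{"date": 1}, {"date": 2}, {"date": 3}]
pairs = list(match_rows(rows, dps))
```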
Looping
> Den 12/08/2015 kl. 04.47 skrev yakkades...@gmail.com:
>
> for row in rows:
>     dp = DataPoint.objects.get(Taken_datetime=row['date'])
>
>     sensorToAdd = []
>     for sensor in sensors:
>         s = Sensor.objects.get(Name=sensor.name, Value=sensor.value)
>         sensorToAdd.append(s)
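The per-row .get() calls in that loop are one query each, which is the cost the thread is circling around. A sketch of the preloading fix, using plain dicts as stand-ins for the ORM objects (field names assumed from the snippet; in the real code the data would come from DataPoint.objects.all() and Sensor.objects.all()):

```python
# Stand-in query results; hypothetical fields matching the snippet above.
datapoints = [{"Taken_datetime": "2015-08-01", "id": 1},
              {"Taken_datetime": "2015-08-02", "id": 2}]
sensors_db = [{"Name": "temp", "Value": 21, "id": 10}]

# Build the lookup tables once, before the loop.
dp_by_date = {d["Taken_datetime"]: d for d in datapoints}
sensor_by_key = {(s["Name"], s["Value"]): s for s in sensors_db}

rows = [{"date": "2015-08-02"}]
for row in rows:
    dp = dp_by_date[row["date"]]          # O(1) dict hit instead of a query
    s = sensor_by_key[("temp", 21)]       # same for the sensor lookup
```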