Hey folks,

I am writing a program to perform some tasks I would like to do. I am
running 3.8 on a 64-bit box, for now.

While playing around with int and long types, I could see a performance
improvement when using the long type over the same method using the int
type. What is the theory behind this increase in performance?
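To make the comparison concrete, below is a minimal sketch of the kind of
micro-benchmark I mean, using only the standard library and assuming this is
CPython 3.8. The array typecodes 'i' (C int) and 'l' (C long) are just my
stand-ins for "int" and "long", since plain Python 3 ints are a single
arbitrary-precision type; the names and sizes are illustrative, not my
actual code.

    # Sketch: time summing the same values stored as C int vs C long elements.
    from array import array
    import timeit

    N = 1_000_000
    ints  = array('i', range(N))   # 'i' = C signed int (typically 4 bytes)
    longs = array('l', range(N))   # 'l' = C signed long (typically 8 bytes on a 64-bit Unix build)

    t_int  = timeit.timeit(lambda: sum(ints),  number=20)
    t_long = timeit.timeit(lambda: sum(longs), number=20)

    print(f"int  typecode: {t_int:.3f}s")
    print(f"long typecode: {t_long:.3f}s")

Timings like these are what made me wonder about the underlying reason.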

If the theory validates the practice, could I infer that I should always
use long instead of int on my 64-bit machines?

Thanks a lot for your time and cooperation.

Best regards.
