Re: OT: This Swift thing

2014-06-27 Thread Albert van der Horst
In article <mailman.11028.1402548495.18130.python-l...@python.org>,
Chris Angelico <ros...@gmail.com> wrote:
On Thu, Jun 12, 2014 at 12:08 PM, Steven D'Aprano
<steve+comp.lang.pyt...@pearwood.info> wrote:
 I'm just pointing out that our computational technology uses
 over a million times more energy than the theoretical minimum, and
 therefore there is a lot of room for efficiency gains without sacrificing
 computer power. I never imagined that such viewpoint would turn out to be
 so controversial.

The way I understand it, you're citing an extremely theoretical
minimum, in the same way that one can point out that we're a long way
from maximum entropy in a flash memory chip, so it ought to be
possible to pack a lot more data onto a USB stick. The laws of physics
tend to put boundaries that are ridiculously far from where we
actually work - I think most roads have speed limits that run a fairly
long way short of c.

As a physicist I'm well aware that houses need no heating.
With suitable insulation and top-notch heat exchangers in the
ventilation system, our bodies generate enough heat to keep our houses
at a comfortable 21 degrees. (Well, and there is the dishwasher.)

In the same vein, cars need very little fuel; we just have to accept that
cars move slightly slower than we could walk.


ChrisA
-- 
Albert van der Horst, UTRECHT,THE NETHERLANDS
Economic growth -- being exponential -- ultimately falters.
albert@spearc.xs4all.nl =n http://home.hccnet.nl/a.w.m.van.der.horst

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-18 Thread Bob Martin
in 723903 20140617 121638 alister <alister.nospam.w...@ntlworld.com> wrote:
On Tue, 17 Jun 2014 08:34:13 +1000, Chris Angelico wrote:


 Partly that. But also, people want to know how long that will *really*
 last. For instance, 10 hours of battery life... doing what? Can I really
 hop on a plane for ten hours and write code the whole way without
 external power? Or will each minute spent recompiling Python (with the
 CPU pegged) cost 2-3 minutes out of those ten hours? What if I watch
 videos (on headphones, probably, given how noisy airliners are!)?
 That'll surely take more power than the manufacturers estimate.
 And what happens six months from now? Will battery life decay to the
 point where it's no longer interesting? (Obviously it'll decay some. But
 how much?)


I bought a 12-cell battery for my Acer One netbook & did exactly that
(LHR to LAX), listening to music, playing SuperTuxKart & reading ebooks
for most of the flight.

It was a life saver, as the on-board entertainment from American Airlines
was terrible; next time I will happily pay the extra £100 for a Virgin
flight LWG to LAS instead.

c/LWG/LGW/  ;-)


Re: OT: This Swift thing

2014-06-17 Thread alister
On Tue, 17 Jun 2014 08:34:13 +1000, Chris Angelico wrote:

 
 Partly that. But also, people want to know how long that will *really*
 last. For instance, 10 hours of battery life... doing what? Can I really
 hop on a plane for ten hours and write code the whole way without
 external power? Or will each minute spent recompiling Python (with the
 CPU pegged) cost 2-3 minutes out of those ten hours? What if I watch
 videos (on headphones, probably, given how noisy airliners are!)?
 That'll surely take more power than the manufacturers estimate.
 And what happens six months from now? Will battery life decay to the
 point where it's no longer interesting? (Obviously it'll decay some. But
 how much?)
 

I bought a 12-cell battery for my Acer One netbook & did exactly that
(LHR to LAX), listening to music, playing SuperTuxKart & reading ebooks
for most of the flight.

It was a life saver, as the on-board entertainment from American Airlines
was terrible; next time I will happily pay the extra £100 for a Virgin
flight LWG to LAS instead.




-- 
Distance doesn't make you any smaller, but it does make you part of a
larger picture.


Re: OT: This Swift thing

2014-06-16 Thread Anssi Saari
Gregory Ewing <greg.ew...@canterbury.ac.nz> writes:

 Current draw of CMOS circuitry is pretty much zero when
 nothing is changing, so if you didn't care how slow it ran,
 you probably could run a server off a watch battery today.

That was before 90 nm, when leakage current started dominating over
switching current. But has low power or battery life ever been in anyone's
interest? Or rather, is battery life interesting enough that
marketing would notice? Or maybe it's that all a marketing guy or a
manager needs is one hour for his presentation, so anything over
that is extra?

A few years ago, jumbo-sized but cheapish CULV laptops suddenly had
10-plus hours of battery life, but did anyone notice or care? Today,
expensive Haswell ULT laptops get the same while being relatively thin
and light, but again, where's the interest? Apple didn't even bother
trying to make improved battery life a selling point for the 2013
MacBook Air. I was seriously considering one, but I prefer matte
displays and cellular connectivity built in.



Re: OT: This Swift thing

2014-06-16 Thread Gregory Ewing

Anssi Saari wrote:

That was before 90 nm when leakage current started dominating over
switching current.


Well, if you don't care about speed, you probably don't
need to make it that small. There's plenty of time for
signals to propagate, so you can afford to spread the
circuitry out more.

The point is that optimising for power consumption on
its own doesn't really make sense, because there's no
optimum point -- you can more or less make the power
consumption as low as you want if you *really* don't
care about speed in the slightest.

In practice, people *do* care about speed, so it
becomes a tradeoff between low power consumption and
something fast enough that people will want to use
it.


A few years ago jumbo sized but cheapish CULV laptops suddenly had 10
hours plus battery but did anyone notice or care?


I think people do care, it's just that going from
something like 6 hours to 10 hours is not a big
enough change to warrant much hype. If it were
100 hours, without losing too much else, I'm
pretty sure it *would* be made a marketing point!

--
Greg


Re: OT: This Swift thing

2014-06-16 Thread Chris Angelico
On Tue, Jun 17, 2014 at 8:12 AM, Gregory Ewing
<greg.ew...@canterbury.ac.nz> wrote:
 A few years ago jumbo sized but cheapish CULV laptops suddenly had 10
 hours plus battery but did anyone notice or care?


 I think people do care, it's just that going from
 something like 6 hours to 10 hours is not a big
 enough change to warrant much hype. If it were
 100 hours, without losing too much else, I'm
 pretty sure it *would* be made a marketing point!

Partly that. But also, people want to know how long that will *really*
last. For instance, 10 hours of battery life... doing what? Can I
really hop on a plane for ten hours and write code the whole way
without external power? Or will each minute spent recompiling Python
(with the CPU pegged) cost 2-3 minutes out of those ten hours? What if
I watch videos (on headphones, probably, given how noisy airliners
are!)? That'll surely take more power than the manufacturers estimate.
And what happens six months from now? Will battery life decay to the
point where it's no longer interesting? (Obviously it'll decay some.
But how much?)

These are unanswerable questions. (Unless you count "It depends" as an
answer.) If I have two laptop models I'm looking at, one with a
boasted 10 hour battery and the other with just 8 hours, all those
other considerations will be much more important than the two hours of
rated difference. Now, if you had that 100 hour battery, well, then
I'd be interested! Because even after six months of usage, that'll
still be giving several times what a 10-hour battery would be.
Obviously still read the fine print as regards usage patterns, but
even if you get the full hundred hours *only* with the screen on
minimum brightness *and* the absolute lightest usage, it's probably
going to be possible to use that usefully.

That's why purported battery life isn't such an advertisable point.
And it's why the business I worked for recently, where we sold
second-hand laptops, was very clear about our battery testing
methodology - it was approximately equivalent to light usage, keeping
the screen, CPU, and disk all intermittently active. If it lasted two
hours in our test, we expected that it would last two hours of, say, text
editing. (And yes, the scale is very different. Our idea of a good,
saleable battery was one that lasted one hour; below that was
considered a fault to be discounted for. Two-hour batteries were
excellent. Anything more than that was "wow, this in a second-hand
laptop?!?". I doubt we would *ever* see a ten-hour battery.)

ChrisA


Re: OT: This Swift thing

2014-06-15 Thread Johannes Bauer
On 07.06.2014 11:54, Alain Ketterlin wrote:

 No. Cost is the issue (development, maintenance, operation,
 liability...). Want an example? Here is one:
 
 http://tech.slashdot.org/story/14/06/06/1443218/gm-names-and-fires-engineers-involved-in-faulty-ignition-switch

Yeah, this is totally believable. One rogue engineer who clearly did it
all by himself. He just wanted to save the company a few dollars out of
pure love for it. Clearly it's his and only his fault, with no boundary
conditions that could have influenced his decision in any meaningful
way. In fact, there's even a GM company memo that states "Hey Ray, just
do what is sensible engineering-wise and don't worry about cost. It's
kewl." But no, Ray just had to go rogue. Just had to do it his way. Man.
Typical Ray thing.

Cheers,
Johannes

-- 
 "Where EXACTLY had you predicted the quake again?"
 "At least not publicly!"
Ah, the newest and to date most ingenious trick of our great
cosmologists: the secret prediction.
 - Karl Kaos about Rüdiger Thomas in dsa hidbv3$om2$1...@speranza.aioe.org


Re: OT: This Swift thing

2014-06-15 Thread Johannes Bauer
On 05.06.2014 23:53, Marko Rauhamaa wrote:

 or:
 
def make_street_address_map(info_list):
    return dict((info.get_street_address(), info.get_zip_code())
                for info in info_list)
 

or, what I think is even clearer than your last one:

def make_street_address_map(info_list):
    return {info.get_street_address(): info.get_zip_code()
            for info in info_list}

Regards,
Johannes



Re: OT: This Swift thing

2014-06-15 Thread CHIN Dihedral
On Wednesday, June 4, 2014 10:53:13 PM UTC+8, Mark H. Harris wrote:
On 6/4/14 9:24 AM, Skip Montanaro wrote:

 Surely your local colleagues realize that Python has been around for
 20-odd years now, that indentation-based block structure has been
 there since Day One, and that it's not going to change, right?

Yup. It's the primary argument on the side for indentation. ... and
don't call me Surely :-)

 {snip}

 Why are you people even having this discussion?

The topic came up because the C/C++ coders were being encouraged to
try Python3 as the language of choice for a new project, and someone
said they would never consider Python for a project primary language
because of indentation block delimiting. The whole debate, as in most
flames, was stupid. The primary paradigm on this topic locally is that
indents are bad because malformed or mangled code cannot be reformatted
easily (if at all).

From my own perspective, if you tell me I need to use END, ok. If
I need to use {}, well ok, and if the BDFL says we use indents here,
well that's cool too.

Getting back to Swift, did they choose {} braces because JS uses
them, or did they choose {} braces because 'they' think most people will
want that style?

marcus

Well, I think a toolkit such as the Pascal-to-C translator can solve
the problem.

Since Python is OOP and functional with dynamic typing, a Python-to-Go
or Python-to-Swift translator could be written in Python.

Of course, some features of Python might be restricted.

We need to churn out fashionable jobs on different commercial
platforms to draw attention from non-programmers, whose role is to pay
for fashionable HW/SW products.








Re: OT: This Swift thing

2014-06-15 Thread Marko Rauhamaa
Johannes Bauer <dfnsonfsdu...@gmx.de>:

 def make_street_address_map(info_list):
     return {info.get_street_address(): info.get_zip_code()
             for info in info_list}

Live and learn. Have been on the lookout for dict comprehensions, but
didn't notice they were already included.


Marko


Re: OT: This Swift thing

2014-06-14 Thread Joshua Landau
On 12 June 2014 03:08, Steven D'Aprano
<steve+comp.lang.pyt...@pearwood.info> wrote:
 We know *much more* about generating energy from E = mc^2 than we know
 about optimally flipping bits: our nuclear reactions convert something of
 the order of 0.1% of their fuel to energy, that is, to get a certain
 yield, we merely have to supply about a thousand times more fuel than
 we theoretically needed. That's about a thousand times better than the
 efficiency of current bit-flipping technology.

You're comparing a one-use device to a trillion-use device. I think
that's unfair.

Tell me when you find an atom splitter that works a trillion times.
Then tell me what its efficiency is, because it's not nearly 0.1%.


Re: OT: This Swift thing

2014-06-14 Thread Steven D'Aprano
On Sun, 15 Jun 2014 02:51:49 +0100, Joshua Landau wrote:

 On 12 June 2014 03:08, Steven D'Aprano
 <steve+comp.lang.pyt...@pearwood.info> wrote:
 We know *much more* about generating energy from E = mc^2 than we know
 about optimally flipping bits: our nuclear reactions convert something
 of the order of 0.1% of their fuel to energy, that is, to get a certain
 yield, we merely have to supply about a thousand times more fuel than
 we theoretically needed. That's about a thousand times better than the
 efficiency of current bit-flipping technology.
 
 You're comparing a one-use device to a trillion-use device. I think
 that's unfair.
 
 Tell me when you find an atom splitter that works a trillion times.
 Then tell me what its efficiency is, because it's not nearly 0.1%.

Nuclear bombs may only get used once, but nuclear reactors get used 
continuously for years or decades, and like I already said, their 
efficiency is around 0.1% (mass converted to energy). There are also 
various types of atomic batteries, such as radioisotope thermoelectric 
generators, which convert the radiation given off by radioactive 
substances to electricity. They are typically expected to have an 
effective working life of a decade or more.
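For scale, the ~0.1% figure is easy to sanity-check with E = mc²; the kilogram of fuel below is an illustrative assumption, not a number from the thread:

```python
# Back-of-the-envelope: energy released if ~0.1% of the fuel's rest
# mass is converted to energy (the fraction quoted above). The 1 kg
# of fuel is an illustrative assumption.
c = 2.998e8          # speed of light, m/s
m_fuel = 1.0         # kg of fuel (illustrative)
fraction = 0.001     # ~0.1% of mass converted

energy = fraction * m_fuel * c**2
print(f"{energy:.2e} J")  # ~9.0e13 J, on the order of 25 GWh
```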




-- 
Steven D'Aprano
http://import-that.dreamwidth.org/


Re: OT: This Swift thing

2014-06-13 Thread Roy Smith
In article <5399019e$0$29988$c3e8da3$54964...@news.astraweb.com>,
 Steven D'Aprano <steve+comp.lang.pyt...@pearwood.info> wrote:

 On Wed, 11 Jun 2014 08:48:36 -0400, Roy Smith wrote:
 
  In article <53984cd2$0$29988$c3e8da3$54964...@news.astraweb.com>,
   Steven D'Aprano <steve+comp.lang.pyt...@pearwood.info> wrote:
  
  Yes, technically water-cooled engines are cooled by air too. The engine
  heats a coolant (despite the name, usually not water these days) which
  then heats the air.
  
  Not water???  I'm not aware of any water-cooled engines which use
  anything other than water.  Well, OK, it's really a solution of ethylene
  or propylene glycol in water, but the water is what does most of the
  heat transfer.  The glycol is just there to provide freezing point
  depression and boiling point elevation.
 
 Would you consider it fair to say that, say, vinegar is not water? 
 Depending on the type of vinegar, it is typically around 5-10% acetic 
 acid, and the rest water. Spirit vinegar can be as much as 20% acetic 
 acid, which still leaves 80% water.

In a car, the water is the important part (even if it's only a 50% 
component).  The primary job of the circulating coolant is to absorb 
heat in one place and transport it to another place.  That requires a 
liquid with a high heat capacity, which is the water.  The other stuff 
is just there to help the water do its job (i.e. not freeze in the 
winter, or boil over in the summer, and some anti-corrosive action 
thrown into the mix).

When you said "usually not water these days", that's a misleading 
statement.  Certainly, it's not pure water, or even just water.  But 
"not water" is a bit of a stretch.

With vinegar, the acetic acid is the important component.  The water is 
just there to dilute it to a useful working concentration and act as a 
carrier.  People are 90% water too, but I wouldn't call a person 
"water".  I would, however, as a first-order description, call the stuff 
circulating through the cooling system in my car "water".

 Back in the day, car radiators were *literally* water-cooled in the sense 
 that the radiator was filled with 100% water. You filled it from the tap 
 with drinking water. In an emergency, say broken down in the desert, you 
 could drink the stuff from the radiator to survive. If you tried that 
 with many modern cars, you would die a horrible death.

But I could do that right now with my car (well, not the drinking 
part).  In an emergency, I could fill my cooling system with pure 
water, and it would work well enough to get me someplace not too far 
away where I could get repairs done.


Re: OT: This Swift thing

2014-06-12 Thread Steven D'Aprano
On Thu, 12 Jun 2014 12:16:08 +1000, Chris Angelico wrote:

 On Thu, Jun 12, 2014 at 12:08 PM, Steven D'Aprano
 <steve+comp.lang.pyt...@pearwood.info> wrote:
 I'm just pointing out that our computational technology uses over a
 million times more energy than the theoretical minimum, and therefore
 there is a lot of room for efficiency gains without sacrificing
 computer power. I never imagined that such viewpoint would turn out to
 be so controversial.
 
 The way I understand it, you're citing an extremely theoretical minimum,
 in the same way that one can point out that we're a long way from
 maximum entropy in a flash memory chip, so it ought to be possible to
 pack a lot more data onto a USB stick. 

Um, yes? 

Hands up anyone who thinks that today's generation of USB sticks will be 
the highest capacity ever, that all progress in packing more memory into 
a thumb drive (or the same memory into a smaller drive) will cease 
effective immediately?

Anyone?


 The laws of physics tend to put
 boundaries that are ridiculously far from where we actually work - I
 think most roads have speed limits that run a fairly long way short of
 c.

"186,000 miles per second: not just a good idea, it's the law"


There's no *law of physics* that says cars can only travel at the speeds 
they do. Compare how fast a typical racing car goes with the typical 
60kph speed limit in suburban Melbourne. Now compare how fast the 
Hennessey Venom GT goes to that speed limit.

http://www.autosaur.com/fastest-car-in-the-world/?PageSpeed=noscript


Speed limits for human-piloted ground-based transport (cars) are more 
based on social and biological factors than engineering ones. Similarly, 
there are biological factors that force keyboards to be a minimum size. 
We probably could build a keyboard where the keys were 0.1mm square, but 
what would be the point? Who could use it? Those social and biological 
factors don't apply to computing efficiency, so it's only *engineering* 
factors that prevent us from being able to run your server off a watch 
battery, not the laws of physics.

It is my contention that, had Intel and AMD spent the last few decades 
optimizing for power consumption rather than speed, we probably could run 
a server off, well, perhaps not a watch battery, but surely a factor of 
100 improvement in efficiency isn't unreasonable given that we're just 
moving a picogram of electrons around?
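For anyone wanting the actual floor being argued about: Landauer's principle puts it at kT·ln 2 of energy per irreversible bit operation. A quick back-of-the-envelope sketch (room temperature is my assumption; nothing here comes from measured hardware):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

# Landauer's principle: the minimum energy dissipated per irreversible
# bit operation (e.g. erasing one bit) is k_B * T * ln(2).
e_bit = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {e_bit:.3e} J per bit")
```

At 300 K this works out to roughly 2.9e-21 J per bit, which is the yardstick the "million times more energy" claim is measured against.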


-- 
Steven


Re: OT: This Swift thing

2014-06-12 Thread alister
On Thu, 12 Jun 2014 09:06:50 +, Steven D'Aprano wrote:

 On Thu, 12 Jun 2014 12:16:08 +1000, Chris Angelico wrote:
 
 On Thu, Jun 12, 2014 at 12:08 PM, Steven D'Aprano
 <steve+comp.lang.pyt...@pearwood.info> wrote:
 I'm just pointing out that our computational technology uses over a
 million times more energy than the theoretical minimum, and therefore
 there is a lot of room for efficiency gains without sacrificing
 computer power. I never imagined that such viewpoint would turn out to
 be so controversial.
 
 The way I understand it, you're citing an extremely theoretical
 minimum,
 in the same way that one can point out that we're a long way from
 maximum entropy in a flash memory chip, so it ought to be possible to
 pack a lot more data onto a USB stick.
 
 Um, yes?
 
 Hands up anyone who thinks that today's generation of USB sticks will be
 the highest capacity ever, that all progress in packing more memory into
 a thumb drive (or the same memory into a smaller drive) will cease
 effective immediately?
 
 Anyone?
 
 
 The laws of physics tend to put boundaries that are ridiculously far
 from where we actually work - I think most roads have speed limits that
 run a fairly long way short of c.
 
 "186,000 miles per second: not just a good idea, it's the law"
 
 
 There's no *law of physics* that says cars can only travel at the speeds
 they do. Compare how fast a typical racing car goes with the typical
 60kph speed limit in suburban Melbourne. Now compare how fast the
 Hennessey Venom GT goes to that speed limit.
 
 http://www.autosaur.com/fastest-car-in-the-world/?PageSpeed=noscript
 
 
 Speed limits for human-piloted ground-based transport (cars) are more
 based on social and biological factors than engineering ones. Similarly,
 there are biological factors that force keyboards to be a minimum size.
 We probably could build a keyboard where the keys were 0.1mm square, but
 what would be the point? Who could use it? Those social and biological
 factors don't apply to computing efficiency, so it's only *engineering*
 factors that prevent us from being able to run your server off a watch
 battery, not the laws of physics.
 
 It is my contention that, had Intel and AMD spent the last few decades
 optimizing for power consumption rather than speed, we probably could
 run a server off, well, perhaps not a watch battery, but surely a factor
 of 100 improvement in efficiency isn't unreasonable given that we're
 just moving a picogram of electrons around?

but a 20 year old server would probably take a week to do what a current 
one does in an hour (random figures chosen for effect not accuracy).

How does the power consumption compare on those time-scales, not to 
mention the cost of the wasted time?

I would agree that for the average desktop user, modern processor 
performance exceeds that required by a considerable margin, so perhaps 
optimising for power consumption is now possible. Wait a minute, aren't 
Intel & AMD now developing lower-powered processors?



-- 
Breeding rabbits is a hare raising experience.


Re: OT: This Swift thing

2014-06-12 Thread Gregory Ewing

Steven D'Aprano wrote:
It is my contention that, had Intel and AMD spent the last few decades 
optimizing for power consumption rather than speed, we probably could run 
a server off, well, perhaps not a watch battery,


Current draw of CMOS circuitry is pretty much zero when
nothing is changing, so if you didn't care how slow it ran,
you probably could run a server off a watch battery today.
Users wouldn't like waiting a week for their web pages to
load, though...

--
Greg


Re: OT: This Swift thing

2014-06-12 Thread Rustom Mody
I am bewildered by this argument...

[Heck, I've recently learnt that using ellipses is an easy way to 
produce literature... So there...]

On Thursday, June 12, 2014 2:36:50 PM UTC+5:30, Steven D'Aprano wrote:

 It is my contention that, had Intel and AMD spent the last few decades 
 optimizing for power consumption rather than speed, we probably could run 
 a server off, well, perhaps not a watch battery, but surely a factor of 
 100 improvement in efficiency isn't unreasonable given that we're just 
 moving a picogram of electrons around?

This is fine and right.
I personally would pay more if my PCs/laptops etc. were quieter/efficient-er.
So we agree... up to here!


 On Thu, 12 Jun 2014 12:16:08 +1000, Chris Angelico wrote:

  On Thu, Jun 12, 2014 at 12:08 PM, Steven D'Aprano wrote:
  I'm just pointing out that our computational technology uses over a
  million times more energy than the theoretical minimum, and therefore
  there is a lot of room for efficiency gains without sacrificing
  computer power. I never imagined that such viewpoint would turn out to
  be so controversial.
  The way I understand it, you're citing an extremely theoretical minimum,
  in the same way that one can point out that we're a long way from
  maximum entropy in a flash memory chip, so it ought to be possible to
  pack a lot more data onto a USB stick. 

 Um, yes? 

 Hands up anyone who thinks that today's generation of USB sticks will be 
 the highest capacity ever, that all progress in packing more memory into 
 a thumb drive (or the same memory into a smaller drive) will cease 
 effective immediately?

 Anyone?

  The laws of physics tend to put
  boundaries that are ridiculously far from where we actually work - I
  think most roads have speed limits that run a fairly long way short of
  c.

 "186,000 miles per second: not just a good idea, it's the law"

 There's no *law of physics* that says cars can only travel at the speeds 
 they do. Compare how fast a typical racing car goes with the typical 
 60kph speed limit in suburban Melbourne. Now compare how fast the 
 Hennessey Venom GT goes to that speed limit.

 http://www.autosaur.com/fastest-car-in-the-world/?PageSpeed=noscript

Now you (or I) are getting completely confused.

If you are saying that the Hennessey Venom (HV) is better than some
standard vanilla Ford/Toyota (FT) based on the above, that's ok.

In equations:
maxspeed(HV) = 250 mph
maxspeed(FT) = 150 mph
so HV is better than FT.

Ok...

But from your earlier statements you seem to be saying it's better
because:
250 mph is closer to 186,000 mps (= 670 million mph) than 150 mph

Factually this is a correct statement.

Pragmatically this is as nonsensical as comparing a mile and a
kilogram.


 Speed limits for human-piloted ground-based transport (cars) are more 
 based on social and biological factors than engineering ones. Similarly, 
 there are biological factors that force keyboards to be a minimum size. 
 We probably could build a keyboard where the keys were 0.1mm square, but 
 what would be the point? Who could use it? Those social and biological 
 factors don't apply to computing efficiency, so it's only *engineering* 
 factors that prevent us from being able to run your server off a watch 
 battery, not the laws of physics.

As best as I can see you are confused about the difference between
science and engineering.

Saying one car is better engineered than another on direct comparison
(150 mph < 250 mph) is ok.

Saying one car is better than another because of its relation to physics
limits (c - 150 > c - 250) is confusing science and engineering.

Likewise, saying AMD and Intel should have done more due diligence for
their clients (and the planet) by considering energy efficiency is right,
and I (strongly) agree.

But comparing their products' realized efficiency with theoretical limits
like Landauer's is a type-wrong statement.


Re: OT: This Swift thing

2014-06-12 Thread Steven D'Aprano
On Thu, 12 Jun 2014 05:54:47 -0700, Rustom Mody wrote:

 On Thursday, June 12, 2014 2:36:50 PM UTC+5:30, Steven D'Aprano wrote:
[...]
  The laws of physics tend to put
  boundaries that are ridiculously far from where we actually work - I
  think most roads have speed limits that run a fairly long way short
  of c.
 
 "186,000 miles per second: not just a good idea, it's the law"
 
 There's no *law of physics* that says cars can only travel at the
 speeds they do. Compare how fast a typical racing car goes with the
 typical 60kph speed limit in suburban Melbourne. Now compare how fast
 the Hennessey Venom GT goes to that speed limit.
 
 http://www.autosaur.com/fastest-car-in-the-world/?PageSpeed=noscript
 
 Now you (or I) are getting completely confused.
 
 If you are saying that the Hennessey Venom (HV) is better than some
 standard vanilla Ford/Toyota (FT) based on the above, thats ok.

I'm not making any value judgements ("better" or "worse") about cars 
based on their speed. I'm just pointing out that the speed limits on our 
roads have very little to do with the speeds cars are capable of 
reaching, and *nothing* to do with ultimate limits due to the laws of 
physics.

Chris made the argument that *the laws of physics* put limits on what we 
can attain, which is fair enough, but then made the poor example of speed 
limits on roads falling short of the speed of light. Yes, speed limits on 
roads fall considerably short of the speed of light, but not because of 
laws of physics. The speed limit in my street is 50 kilometres per hour, 
not because that limit is a law of physics, or because cars are incapable 
of exceeding 50kph, but because the government where I live has decided 
that 50kph is the maximum safe speed for a car to travel in my street, 
rounded to the nearest multiple of 10kph.

In other words, Chris' example is a poor one to relate to the energy 
efficiency of computing.

A more directly relevant example would have been the efficiency of heat 
engines, where there is a fundamental physical limit of 100% efficiency. 
Perhaps Chris didn't mention that one because our technology can build 
heat engines with 60% efficiency, which is probably coming close to the 
practical upper limit of attainable efficiency -- we might, by virtue of 
clever engineering and exotic materials, reach 70% or 80% efficiency, but 
probably not 99.9% efficiency. That's a good example.

Bringing it back to computing technology, the analogy is that our current 
computing technology is like a heat engine with an efficiency of 
0.01%. Even an efficiency of 1% would be a marvelous improvement. In 
this analogy, there's an ultimate limit of 100% imposed by physics 
(Landauer's Law), and a practical limit of (let's say) 80%, but current 
computing technology is so far from those limits that those limits might 
as well not exist.
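For reference, the engineering-relevant ceiling for a heat engine is the Carnot efficiency, 1 - Tc/Th. A minimal sketch (the temperatures are illustrative assumptions, not figures from this discussion):

```python
# Carnot efficiency: the thermodynamic upper bound for a heat engine
# running between a hot and a cold reservoir. Both temperatures are
# illustrative assumptions.
T_hot = 1500.0    # K, e.g. a gas-turbine combustor
T_cold = 300.0    # K, ambient

eta_carnot = 1.0 - T_cold / T_hot
print(f"Carnot limit: {eta_carnot:.0%}")  # 80%
```

This is why the "practical limit of (let's say) 80%" above is a reasonable ballpark for real engines, even though the abstract limit is 100%.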


 In equations:
 maxspeed(HV) = 250 mph
 maxspeed(FT) = 150 mph
 so HV is better than FT.

"Better" is your word, not mine.

I don't actually care about fast cars, but if I did, and if I valued 
speed above everything else (cost, safety, fuel efficiency, noise, 
environmental impact, comfort, etc) then yes, I would say 250 mph is 
better than 150 mph, because 250 mph is larger.


 Ok...
 
 But from your earlier statements you seem to be saying its better
 because:
 250 mph is closer to 186,000 mps (= 670 million mph) than 150 mph

 Factually this is a correct statement.

And yet you're going to disagree with it, even though you agree it is 
correct?


 Pragmatically this is as nonsensical as comparing a mile and a kilogram.

This makes no sense at all.

Your two statements about speeds are logically and mathematically 
equivalent. You cannot have one without the other.

Take three numbers, speeds in this case, s1, s2 and c, with c a strict 
upper-bound. We can take:

s1 < s2 < c

without loss of generality. So in this case, we say that s2 is greater 
than s1:

s2 > s1

Adding the constant c to both sides does not change the inequality:

c + s2 > c + s1

Subtracting s1 + s2 from each side:

c + s2 - (s1 + s2) > c + s1 - (s1 + s2)
c - s1 > c - s2

In other words, if 250mph is larger than 150mph (a fact, as you accept), 
then it is equally a fact that 250mph is closer to the speed of light 
than 150mph. You cannot possibly have one and not the other. So why do 
you believe that the first form is acceptable, but the second form is 
nonsense?
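The algebra above can be sanity-checked numerically, using the thread's example speeds and an approximate value for c in mph:

```python
# Numeric check of the inequality argument: if s2 > s1, then s2 is
# closer to the upper bound c than s1 is. Values from the thread.
s1, s2 = 150.0, 250.0           # mph
c = 670_616_629.0               # speed of light in mph (approx.)

assert s2 > s1                  # 250 mph is larger than 150 mph
assert c - s2 < c - s1          # ...so 250 mph is closer to c
print("both statements hold together")
```

You cannot make one assertion pass and the other fail: they are the same fact.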


 Speed limits for human-piloted ground-based transport (cars) are more
 based on social and biological factors than engineering ones.
 Similarly, there are biological factors that force keyboards to be a
 minimum size. We probably could build a keyboard where the keys were
 0.1mm square, but what would be the point? Who could use it? Those
 social and biological factors don't apply to computing efficiency, so
 it's only *engineering* factors that prevent us from being able to run
 your server off a watch battery, not the laws of physics.
 
 As best as I can see you are 

Re: OT: This Swift thing

2014-06-12 Thread Chris Angelico
On Fri, Jun 13, 2014 at 3:04 AM, Steven D'Aprano
steve+comp.lang.pyt...@pearwood.info wrote:
 Chris made the argument that *the laws of physics* put limits on what we
 can attain, which is fair enough, but then made the poor example of speed
 limits on roads falling short of the speed of light. Yes, speed limits on
 roads fall considerably short of the speed of light, but not because of
 laws of physics. The speed limit in my street is 50 kilometres per hour,
 not because that limit is a law of physics, or because cars are incapable
 of exceeding 50kph, but because the government where I live has decided
 that 50kph is the maximum safe speed for a car to travel in my street,
 rounded to the nearest multiple of 10kph.

 In other words, Chris' example is a poor one to relate to the energy
 efficiency of computing.

The point isn't so much the legal or safe limit as that that's the
speed of most driving. That is to say: Around here, most cars will
travel at roughly 50 kph, which is a far cry from c. There are other
reasons than physics for choosing a speed.

 Take three numbers, speeds in this case, s1, s2 and c, with c a strict
 upper-bound. We can take:

 s1 < s2 < c

 without loss of generality. So in this case, we say that s2 is greater
 than s1:

 s2 > s1

 Adding the constant c to both sides does not change the inequality:

 c + s2 > c + s1

As long as we accept that this is purely in a mathematical sense.
Let's not get into the realm of actual speeds greater than c.

 Subtracting s1 + s2 from each side:

 c + s2 - (s1 + s2) > c + s1 - (s1 + s2)
 c - s1 > c - s2

 In other words, if 250mph is larger than 150mph (a fact, as you accept),
 then it is equally a fact that 250mph is closer to the speed of light
 than 150mph. You cannot possibly have one and not the other. So why do
 you believe that the first form is acceptable, but the second form is
 nonsense?

And at this point the calculation becomes safe again, and obvious
common sense. (Or alternatively, substitute Mach 1 for c; it's not a
hard limit, but there are good reasons for staying below it in
practical application - most airliners cruise a smidge below the speed
of sound for efficiency.)

 If I were arguing that there are no engineering limits prohibiting CPUs
 reaching Landauer's limit, then you could criticise me for that, but I'm
 not making that argument.

 I'm saying that, whatever the practical engineering limits turn out to
 be, we're unlikely to be close to them, and therefore there are very
 likely to be many and massive efficiency gains to be made in computing.

And this I totally agree with. The limits of physics are so incredibly
far from where we now are that we can utterly ignore them; the limits
we face are generally engineering (with the exception of stuff
designed for humans to use, eg minimum useful key size is defined by
fingers and not by what we can build).

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-12 Thread Gene Heskett
On Thursday 12 June 2014 13:18:00 Chris Angelico did opine
And Gene did reply:
 On Fri, Jun 13, 2014 at 3:04 AM, Steven D'Aprano
 
  I'm saying that, whatever the practical engineering limits turn out
  to be, we're unlikely to be close to them, and therefore there are
  very likely to be many and massive efficiency gains to be made in
  computing.
 
 And this I totally agree with. The limits of physics are so incredibly
 far from where we now are that we can utterly ignore them; the limits
 we face are generally engineering (with the exception of stuff
 designed for humans to use, eg minimum useful key size is defined by
 fingers and not by what we can build).
 
 ChrisA

That's a bit too blanket a statement; we do see it in the real world.  
Some of the electronics stuff we've been using for nearly 50 years 
actually runs into E=mc^2 (relativistic) effects, and it affects 
performance in pretty deleterious ways.

A broadcast power klystron, like a 4KM100LA, is an electron beam device 
that does its amplifying by modulating the velocity of an electron beam 
accelerated by a nominally 20,000 volt beam supply.  But at the beam 
speed that high a voltage produces, relativistic effects on the mass of 
the electrons come into play: an equal amount of energy applied to speed 
the beam up does not produce the same increase in velocity as that same 
energy applied to slow it down produces a decrease.  This has the net 
effect of making the transit time greater under high-power drive 
conditions, such as the sync pulses of the now out-of-style NTSC signal.  
The net result is a group delay characteristic that is uncorrectable 
when you try to correct it at baseband video.  In a few words, the shape 
of the sync signal is damaged.  Badly.

Because most transmitters of that day used separate amplifiers for the 
audio, and receivers have used the 4.5 MHz difference signal to recover 
the audio for the last 63+ years, this Incidental Carrier Phase 
Modulation noise is impressed into the detected audio.  And I am sure 
there are many here who can recall, back a decade, that the UHF stations 
in your area all had what was often called "chroma buzz" in the audio, 
only about 50 dB down.  Ear-fatiguing at best.  Market-share-affecting 
too.  And that translates directly into minus signs on station income.

It was fixable, but at an additional cost in efficiency of about 20%.  
Consider what that 20% costs when a station with a 30 kW rated 
transmitter actually pulls around 225 kWh from the powerline for every 
hour it is on the air.  Bean counters have heart attacks over such 
figures.

Cheers, Gene Heskett
-- 
There are four boxes to be used in defense of liberty:
 soap, ballot, jury, and ammo. Please use in that order.
-Ed Howdershelt (Author)
Genes Web page http://geneslinuxbox.net:6309/gene
US V Castleman, SCOTUS, Mar 2014 is grounds for Impeaching SCOTUS
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-12 Thread Rustom Mody
On Thursday, June 12, 2014 10:48:00 PM UTC+5:30, Chris Angelico wrote:
 On Fri, Jun 13, 2014 at 3:04 AM, Steven D'Aprano
  Take three numbers, speeds in this case, s1, s2 and c, with c a strict
  upper-bound. We can take:
  s1 < s2 < c
  without loss of generality. So in this case, we say that s2 is greater
  than s1:
  s2 > s1
  Adding the constant c to both sides does not change the inequality:
  c + s2 > c + s1

 As long as we accept that this is purely in a mathematical sense.
 Let's not get into the realm of actual speeds greater than c.

You've got a keen eye, Chris -- I didn't notice that!  And it captures 
my point better than my long-winded attempts.

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-12 Thread Steven D'Aprano
On Fri, 13 Jun 2014 03:18:00 +1000, Chris Angelico wrote:

 On Fri, Jun 13, 2014 at 3:04 AM, Steven D'Aprano
 steve+comp.lang.pyt...@pearwood.info wrote:
[...]
 Take three numbers, speeds in this case, s1, s2 and c, with c a strict
 upper-bound. We can take:

 s1 < s2 < c

 without loss of generality. So in this case, we say that s2 is greater
 than s1:

 s2 > s1

 Adding the constant c to both sides does not change the inequality:

 c + s2 > c + s1
 
 As long as we accept that this is purely in a mathematical sense. Let's
 not get into the realm of actual speeds greater than c.

Well, yes, it is in the mathematical sense, and it doesn't require any 
actual physical thing to travel at faster than light speed. There is no 
implication here that there is something travelling at (c + s1). It's 
just a number.

But note that even in *real* (as opposed to science fiction, or 
hypothetical) physics, you can have superluminal speeds. Both the phase 
velocity and group velocity of a wave may exceed c; the closing velocity 
of two objects approaching each other is limited to 2c. Distant galaxies 
are receding from us at greater than c. There are other situations where 
some measurable effect can travel faster than c, e.g. the superluminal 
spotlight effect.

https://en.wikipedia.org/wiki/Faster-than-light




-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-11 Thread Gregory Ewing

Rustom Mody wrote:

JFTR: Information processing and (physics) energy are about as convertible
as say: Is a kilogram smaller/greater than a mile?


Actually, that's not true. There is a fundamental
thermodynamic limit on the minimum energy needed to
flip a bit from one state to the other, so in that
sense there's a relationship between watts and
bits per second.

We're nowhere near reaching that limit with
current technology, though. In principle, our
CPUs could be a lot more energy-efficient.

(That doesn't mean they would convert a smaller
proportion of their energy input into heat. It
means they would need less energy input in the
first place).

--
Greg
--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-11 Thread Gregory Ewing

Steven D'Aprano wrote:
Everything *eventually* gets converted to heat, but not immediately. 
There's a big difference between a car that gets 100 miles to the gallon, 
and one that gets 1 mile to the gallon.


With a car, the engine converts some of its energy to
kinetic energy, which is subsequently dissipated as heat,
so it makes sense to talk about the ratio of kinetic
energy produced to energy wasted directly as heat.

But when you flip a bit, there's no intermediate form
of energy -- the bit changes state, and heat is produced.
So all of the heat is waste heat.

--
Greg
--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-11 Thread Gregory Ewing

Chris Angelico wrote:

So, let me get this straight. A CPU has to have a fan, but a car
engine doesn't, because the car's moving at a hundred kays an hour. I
have a suspicion the CPU fan moves air a bit slower than that.


If the car were *always* moving at 100km/h, it probably
wouldn't need a fan.

In practice, all cars do have fans (even the ones that
aren't air-cooled), for the occasions when they're not
moving that fast.

(BTW, so-called water-cooled engines are really air-cooled
too, just not by air flowing directly over the engine
block. (Although marine engines may be an exception.))

--
Greg
--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-11 Thread Steven D'Aprano
On Wed, 11 Jun 2014 19:50:20 +1200, Gregory Ewing wrote:

 Chris Angelico wrote:
 So, let me get this straight. A CPU has to have a fan, but a car engine
 doesn't, because the car's moving at a hundred kays an hour. I have a
 suspicion the CPU fan moves air a bit slower than that.

I'm not sure where Chris' message comes from, I can't see the original, 
so I'm guessing the context.

Air-cooled cars don't just cool the engine when they are travelling at 
100 km/h.  Some air-cooled engines used a fan to blow extra air over the 
cooling fins, but many did not: normal air flow is sufficient to keep 
them at a safe operating temperature, as the hot engine warms the air, 
which flows away and is replaced by cooler air.

It's possible to design CPUs to work the same way. My wife is using a PC 
right now with a 1.66GHz Atom CPU and no CPU fan. Even though the power 
supply fan died, the machine is still running perfectly, with two laptop 
HDDs, and no overheating. 1.66GHz is plenty fast enough for web browsing, 
word processing, email, etc.

Go back 30 years, and I don't think that the average PC needed a CPU fan. 
Possibly not even a case fan. Just the normal air flow over a small heat 
sink was enough. And of course, your mobile phone has no room for a heat 
sink, unless it's tiny, and no fan. And people expect it to keep working 
even when shoved in their pocket.


 If the car were *always* moving at 100km/h, it probably wouldn't need a
 fan.

 In practice, all cars do have fans (even the ones that aren't
 air-cooled), for the occasions when they're not moving that fast.

That may be true of water-cooled engines *now*, but it's not a law of 
engineering. Many air-cooled engines do not (did not) require a fan, or 
only needed the extra cooling when stuck idling for long periods in hot 
weather. E.g. Beetles didn't use a fan. (A great idea for Germany, not so 
much for hot and dusty Southern California, as my wife can tell you.)


 (BTW, so-called water-cooled engines are really air-cooled too, just not
 by air flowing directly over the engine block. (Although marine engines
 may be an exception.))

Yes, technically water-cooled engines are cooled by air too. The engine 
heats a coolant (despite the name, usually not water these days) which 
then heats the air.


-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-11 Thread Roy Smith
In article 53984cd2$0$29988$c3e8da3$54964...@news.astraweb.com,
 Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote:

 Yes, technically water-cooled engines are cooled by air too. The engine 
 heats a coolant (despite the name, usually not water these days) which 
 then heats the air.

Not water???  I'm not aware of any water-cooled engines which use 
anything other than water.  Well, OK, it's really a solution of ethylene 
or propylene glycol in water, but the water is what does most of the 
heat transfer.  The glycol is just there to provide freezing point 
depression and boiling point elevation.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-11 Thread Steven D'Aprano
On Wed, 11 Jun 2014 19:41:12 +1200, Gregory Ewing wrote:

 Steven D'Aprano wrote:
 Everything *eventually* gets converted to heat, but not immediately.
 There's a big difference between a car that gets 100 miles to the
 gallon, and one that gets 1 mile to the gallon.
 
 With a car, the engine converts some of its energy to kinetic energy,
 which is subsequently dissipated as heat, so it makes sense to talk
 about the ratio of kinetic energy produced to energy wasted directly as
 heat.
 
 But when you flip a bit, there's no intermediate form of energy -- the
 bit changes state, and heat is produced. So all of the heat is waste
 heat.

Not the point. There's a minimum amount of energy required to flip a bit. 
Everything beyond that is, in a sense, just wasted. You mentioned this 
yourself in your previous post. It's a *really* tiny amount of energy: 
about 17 meV at room temperature. That's 17 milli electron-volt, or 
2.7×10^-21 joules. In comparison, Intel CMOS transistors have a gate 
charging energy of about 62500 eV (1×10^-14 J), around 3.7 million times 
greater.

Broadly speaking, if the fundamental thermodynamic minimum amount of 
energy needed to flip a bit takes the equivalent of a single grain of 
white rice, then our current computing technology uses the equivalent of 
175 Big Macs.

(There are approximately 50 grains of rice in a gram, and a gram of rice 
is about 1.3 Calories. A Big Mac is about 550 Calories. You do the maths.)


-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-11 Thread Rustom Mody
On Wednesday, June 11, 2014 1:11:12 PM UTC+5:30, Gregory Ewing wrote:
 Steven D'Aprano wrote:
  Everything *eventually* gets converted to heat, but not immediately. 
  There's a big difference between a car that gets 100 miles to the gallon, 
  and one that gets 1 mile to the gallon.

 With a car, the engine converts some of its energy to
 kinetic energy, which is subsequently dissipated as heat,
 so it makes sense to talk about the ratio of kinetic
 energy produced to energy wasted directly as heat.

 But when you flip a bit, there's no intermediate form
 of energy -- the bit changes state, and heat is produced.
 So all of the heat is waste heat.

Actually the car-drive and the bit-flip are much more alike than 
different.  It's just that the time-scales are minutes/hours in one case 
and nanoseconds or less in the other, so our powers of visualization are 
a bit taxed.

In more detail:

One drives a car from A to B for an hour (assume no change in 
height above sea level so no potential difference).
All the energy that was there as petrol has been dissipated as heat.

A bit flips from zero to one. Pictorially
(this needs to be fixed-pitch font!):

   +------
   |
   |
   |
---+

However in reality that 'square' wave is always actually sloped:


       +------
      /
     /
    /
---+

Now for, say, CMOS technology, one may assume no current in both the 
zero and one states (that's what the C -- complementary -- in CMOS buys 
you).  However, when the signal is neither zero nor one (the sloping 
part) there will be current, and therefore heat.

So just as the car burns energy in going from A to B, the flip-flop 
burns it in going from 0 to 1.
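That switching-loss picture is usually summarized as E = ½CV² dissipated per transition of a capacitive node. A minimal sketch, with capacitance and voltage chosen purely as illustrative assumptions so the result lands near the ~1×10^-14 J gate energy quoted earlier in this thread:

```python
# Dynamic energy dissipated charging a capacitive node in CMOS:
# E = 1/2 * C * V^2 per transition. Values below are assumed for
# illustration, not measured figures for any particular process.
C = 20e-15   # farads: switched capacitance (assumed)
V = 1.0      # volts: supply voltage (assumed)

e_per_flip = 0.5 * C * V**2
print(f"{e_per_flip:.1e} J per transition")   # ~1e-14 J
```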


Steven D'Aprano wrote:
 Not the point. There's a minimum amount of energy required to flip a bit.
 Everything beyond that is, in a sense, just wasted. You mentioned this
 yourself in your previous post. It's a *really* tiny amount of energy:
 about 17 meV at room temperature. That's 17 milli electron-volt, or
 2.7×10^-21 joules. In comparison, Intel CMOS transistors have a gate
 charging energy of about 62500 eV (1×10^-14 J), around 3.7 million times
 greater.
  
 Broadly speaking, if the fundamental thermodynamic minimum amount of
 energy needed to flip a bit takes the equivalent of a single grain of
 white rice, then our current computing technology uses the equivalent of
 175 Big Macs. 

Well, that's in the same realm as saying that, by E=mc², a one-gram 
stone can yield 21 billion Calories of energy.

[I've forgotten how the units stack up, so as usual relying on Google 
instead of first principles:

http://en.wikipedia.org/wiki/Mass%E2%80%93energy_equivalence#Practical_examples
:-)
]

I.e., from a pragmatic/engineering point of view, we know about as much 
about how to use Einstein's mass-energy equivalence to generate energy 
as we know about how to use Landauer's principle to optimally flip bits.
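The one-gram figure checks out, reading "calories" as food Calories (kcal):

```python
# E = m * c^2 for a one-gram stone, expressed in food Calories (kcal).
m = 1e-3          # kg
c = 2.998e8       # m/s, speed of light
E_joules = m * c**2              # ~9e13 J
E_kcal = E_joules / 4184.0       # 1 food Calorie = 4184 J
print(f"{E_kcal:.2e} Calories")  # ~2.1e10, i.e. about 21 billion
```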
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-11 Thread Steven D'Aprano
On Wed, 11 Jun 2014 08:48:36 -0400, Roy Smith wrote:

 In article 53984cd2$0$29988$c3e8da3$54964...@news.astraweb.com,
  Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote:
 
 Yes, technically water-cooled engines are cooled by air too. The engine
 heats a coolant (despite the name, usually not water these days) which
 then heats the air.
 
 Not water???  I'm not aware of any water-cooled engines which use
 anything other than water.  Well, OK, it's really a solution of ethylene
 or propylene glycol in water, but the water is what does most of the
 heat transfer.  The glycol is just there to provide freezing point
 depression and boiling point elevation.

Would you consider it fair to say that, say, vinegar is not water? 
Depending on the type of vinegar, it is typically around 5-10% acetic 
acid, and the rest water. Spirit vinegar can be as much as 20% acetic 
acid, which still leaves 80% water.

How about brandy, which is typically 35%-60% alcohol, with most of the 
rest being water? Or household bleach, which is typically a 3-6% solution 
of sodium hypochlorite? Or milk (85-90% water)? I think it is fair to 
describe those as not water. You shouldn't try to put out a fire by 
pouring a bottle of brandy on it.

Automotive cooling fluid in modern sealed radiators is typically a 
mixture of 50% anti-freeze and 50% water.

Back in the day, car radiators were *literally* water-cooled in the sense 
that the radiator was filled with 100% water. You filled it from the tap 
with drinking water. In an emergency, say broken down in the desert, you 
could drink the stuff from the radiator to survive. If you tried that 
with many modern cars, you would die a horrible death.



-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-11 Thread Steven D'Aprano
On Wed, 11 Jun 2014 08:28:43 -0700, Rustom Mody wrote:

 Steven D'Aprano wrote:

 Not the point. There's a minimum amount of energy required to flip a
 bit. Everything beyond that is, in a sense, just wasted. You mentioned
 this yourself in your previous post. It's a *really* tiny amount of
 energy: about 17 meV at room temperature. That's 17 milli
 electron-volt, or 2.7×10^-21 joules. In comparison, Intel CMOS
 transistors have a gate charging energy of about 62500 eV (1×10^-14 J),
 around 3.7 million times greater.
  
 Broadly speaking, if the fundamental thermodynamic minimum amount of
 energy needed to flip a bit takes the equivalent of a single grain of
 white rice, then our current computing technology uses the equivalent
 of 175 Big Macs.
 
 Well thats in the same realm as saying that by E=mc² a one gram stone
 can yield 21 billion calories energy.
[...]
 ie. from a a pragmatic/engineering pov we know as much how to use
 Einstein's energy-mass-equivalence to generate energy as we know how to
 use Landauer's principle to optimally flip bits.

You know, I think that the people of Hiroshima and Nagasaki and Chernobyl 
and Fukushima (to mention only a few places) might disagree.

We know *much more* about generating energy from E = mc^2 than we know 
about optimally flipping bits: our nuclear reactions convert something of 
the order of 0.1% of their fuel to energy; that is, to get a certain 
yield, we merely have to supply about a thousand times more fuel than we 
theoretically need. That's about a thousand times better than the 
efficiency of current bit-flipping technology.

We build great big clanking mechanical devices out of lumps of steel that 
reach 25% - 50% of the theoretical maximum efficiency:

https://en.wikipedia.org/wiki/Thermal_efficiency

while our computational technology is something of the order of 0.1% 
efficient. I'm just pointing out that our computational technology uses 
over a million times more energy than the theoretical minimum, and 
therefore there is a lot of room for efficiency gains without sacrificing 
computer power. I never imagined that such viewpoint would turn out to be 
so controversial.




-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-11 Thread Gregory Ewing

Steven D'Aprano wrote:
Automotive cooling fluid in modern sealed radiators is typically a 
mixture of 50% anti-freeze and 50% water.


Sometimes it's even more than 50%, at which point
you really have an antifreeze-cooled engine. :-)

--
Greg
--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-11 Thread Chris Angelico
On Thu, Jun 12, 2014 at 12:08 PM, Steven D'Aprano
steve+comp.lang.pyt...@pearwood.info wrote:
 I'm just pointing out that our computational technology uses
 over a million times more energy than the theoretical minimum, and
 therefore there is a lot of room for efficiency gains without sacrificing
 computer power. I never imagined that such viewpoint would turn out to be
 so controversial.

The way I understand it, you're citing an extremely theoretical
minimum, in the same way that one can point out that we're a long way
from maximum entropy in a flash memory chip, so it ought to be
possible to pack a lot more data onto a USB stick. The laws of physics
tend to put boundaries that are ridiculously far from where we
actually work - I think most roads have speed limits that run a fairly
long way short of c.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-11 Thread Gene Heskett
On Wednesday 11 June 2014 22:11:53 Gregory Ewing did opine
And Gene did reply:
 Steven D'Aprano wrote:
  Automotive cooling fluid in modern sealed radiators is typically a
  mixture of 50% anti-freeze and 50% water.
 
 Sometimes it's even more than 50%, at which point
 you really have an antifreeze-cooled engine. :-)

There have been cases where that 50% may have been exceeded actually 
driving on the streets.

At least 3 decades back, not too long before caddy came out with the 
northstar engine, which was rigged to get you home at a reasonable speed 
even if the radiator had been holed  the coolant lost.  They used a wee 
bit of the knowledge gained from keeping Smokey Yunick is experimenting 
cash.  He had an old VW Rabbit that was both a parts car, and the test 
bed. Two cylinder motor, I suspect built on a Harley 78cid crankcase, no 
radiator, no air cooling.  Ceramic cylinders and pistons, it ran at a 
quite high internal temperature because the cylinders were insulated from 
losing heat by fiberglass blankets.  It displaced 78 cid, made about 150 
HP, and got well over 120 mpg running around in Daytona Beach.  The one 
magazine article said it hadn't lost a stoplight grand prix ever but 
Smokey stopped that by making whoever was driving it, 100% responsible for 
any tickets it collected.

It would have been gawdawful expensive to put into production, since 
those 2 cylinders and pistons cost more than the complete V8 Northstar 
engine.

I thought it was one radically cool idea at the time.  And I am amazed 
that something like it has not invaded the automotive world, what with 
all the emphasis on both high mileage and decent horsepower caused by 
high petro prices.  Today I'd imagine a new cat converter might need to 
be built, because at those temperatures and compression ratios I can see 
a hugely illegal amount of the various nitrogen oxides the EPA wouldn't 
tolerate.

Cheers, Gene Heskett
-- 
There are four boxes to be used in defense of liberty:
 soap, ballot, jury, and ammo. Please use in that order.
-Ed Howdershelt (Author)
Genes Web page http://geneslinuxbox.net:6309/gene
US V Castleman, SCOTUS, Mar 2014 is grounds for Impeaching SCOTUS
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-09 Thread Rustom Mody
On Monday, June 9, 2014 9:50:38 AM UTC+5:30, Steven D'Aprano wrote:
 On Sun, 08 Jun 2014 19:24:52 -0700, Rustom Mody wrote:
 
 
  On Monday, June 9, 2014 7:14:24 AM UTC+5:30, Steven D'Aprano wrote:
  CPU technology is the triumph of brute force over finesse.
  
  If you are arguing that computers should not use millions/billions of
  transistors, I wont argue, since I dont know the technology.
 
 No. I'm arguing that they shouldn't convert 90% of their energy input 
 into heat.
 

Strange statement.
What should they convert it into then?

JFTR: Information processing and (physics) energy are about as convertible
as say: Is a kilogram smaller/greater than a mile?
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-09 Thread Steven D'Aprano
On Sun, 08 Jun 2014 23:32:33 -0700, Rustom Mody wrote:

 On Monday, June 9, 2014 9:50:38 AM UTC+5:30, Steven D'Aprano wrote:
 On Sun, 08 Jun 2014 19:24:52 -0700, Rustom Mody wrote:
 
 
  On Monday, June 9, 2014 7:14:24 AM UTC+5:30, Steven D'Aprano wrote:
  CPU technology is the triumph of brute force over finesse.
  
  If you are arguing that computers should not use millions/billions of
  transistors, I wont argue, since I dont know the technology.
 
 No. I'm arguing that they shouldn't convert 90% of their energy input
 into heat.
 
 
 Strange statement.
 What should they convert it into then?

Useful work, duh.

Everything *eventually* gets converted to heat, but not immediately. 
There's a big difference between a car that gets 100 miles to the gallon, 
and one that gets 1 mile to the gallon. Likewise CPUs should get more 
processing units (however you measure them) per watt of electricity 
consumed.

See, for example:

http://www.tomshardware.com/reviews/fx-power-consumption-efficiency,3060.html

http://en.wikipedia.org/wiki/Performance_per_watt

Quote:

Theoretically, room‑temperature computer memory operating 
at the Landauer limit could be changed at a rate of one 
billion bits per second with only 2.85 trillionths of a 
watt of power being expended in the memory media. Modern 
computers use millions of times as much energy.

http://en.wikipedia.org/wiki/Landauer's_principle
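The quoted "2.85 trillionths of a watt" figure follows directly from the Landauer limit; a quick check, assuming room temperature of about 298 K:

```python
import math

# Power needed to flip 1e9 bits/s at the Landauer limit, room temp.
k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 298.0               # K, roughly room temperature (assumed)
bits_per_second = 1e9

power = bits_per_second * k_B * T * math.log(2)
print(f"{power:.2e} W")   # ~2.85e-12 W, matching the quote
```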


Much to my surprise, Wikipedia says that efficiency gains have actually 
been *faster* than Moore's Law. This surprises me, but it makes sense: if 
a CPU uses ten times more power to perform one hundred times more 
computations, it has become much more efficient but still needs a much 
bigger heat sink.

http://en.wikipedia.org/wiki/Koomey's_law


 JFTR: Information processing and (physics) energy are about as
 convertible as say: Is a kilogram smaller/greater than a mile?

(1) I'm not comparing incompatible units. And (2) there is a fundamental 
link between energy and entropy, and entropy is the reverse of 
information. See Landauer's Principle, linked above. So information 
processing and energy are as intimately linked as (say) current and 
voltage, or mass and energy, or momentum and position.



-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-09 Thread Rustom Mody
On Monday, June 9, 2014 2:57:26 PM UTC+5:30, Steven D'Aprano wrote:

http://en.wikipedia.org/wiki/Landauer's_principle

Hey thanks for that!
Always thought something like this should exist but did not know what/where/how!

 On Sun, 08 Jun 2014 23:32:33 -0700, Rustom Mody wrote:

  On Monday, June 9, 2014 9:50:38 AM UTC+5:30, Steven D'Aprano wrote:
  On Sun, 08 Jun 2014 19:24:52 -0700, Rustom Mody wrote:
   On Monday, June 9, 2014 7:14:24 AM UTC+5:30, Steven D'Aprano wrote:
   CPU technology is the triumph of brute force over finesse.
   If you are arguing that computers should not use millions/billions of
   transistors, I wont argue, since I dont know the technology.
  No. I'm arguing that they shouldn't convert 90% of their energy input
  into heat.
  Strange statement.
  What should they convert it into then?

 Useful work, duh.

 Everything *eventually* gets converted to heat, but not immediately. 
 There's a big difference between a car that gets 100 miles to the gallon, 
 and one that gets 1 mile to the gallon. Likewise CPUs should get more 
 processing units (however you measure them) per watt of electricity 
 consumed.

 See, for example:

 http://www.tomshardware.com/reviews/fx-power-consumption-efficiency,3060.html

 http://en.wikipedia.org/wiki/Performance_per_watt

 Quote:

 Theoretically, room-temperature computer memory operating 
 at the Landauer limit could be changed at a rate of one 
 billion bits per second with only 2.85 trillionths of a 
 watt of power being expended in the memory media. Modern 
 computers use millions of times as much energy.

Right, so we are still very much in the theoretical zone.
As the next paragraph there says:

| If no information is erased, computation may in principle be achieved
| which is thermodynamically reversible, and require no release of
| heat. This has led to considerable interest in the study of reversible
| computing.

Particularly interesting as no-information-erasure corresponds to functional
(or maybe relational) programming. Of course still all theoretical.
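The quoted figure can be reproduced directly from Landauer's bound of kT ln 2 joules per erased bit (assuming room temperature of about 298 K, which the quote doesn't state explicitly):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 298.15           # assumed "room temperature", K

e_bit = k_B * T * math.log(2)   # Landauer limit: joules per erased bit
power = e_bit * 1e9             # watts at one billion bits per second

print(f"minimum energy per bit erased: {e_bit:.2e} J")  # ~2.85e-21 J
print(f"power at 1e9 bits/s:           {power:.2e} W")  # ~2.85e-12 W
```

That lands exactly on the "2.85 trillionths of a watt" in the Wikipedia quote.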

 Much to my surprise, Wikipedia says that efficiency gains have actually 
 been *faster* than Moore's Law. This surprises me, but it makes sense: if 
 a CPU uses ten times more power to perform one hundred times more 
 computations, it has become much more efficient but still needs a much 
 bigger heat sink.

That was essentially my point


-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-09 Thread Roy Smith
In article 53953616$0$29988$c3e8da3$54964...@news.astraweb.com,
 Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote:

 Moore's Law observes that processing power has doubled about every two 
 years. Over the last decade, processing power has increased by a factor 
 of 32. If *efficiency* had increased at the same rate, that 500W power 
 supply in your PC would now be a 15W power supply.

I think you're using a strange definition of efficiency.  I would define 
it as processing_power_out / electric_power_in.  If processing power has 
gone up by a factor of 32, and electric power used has stayed more or 
less the same (which it has), then efficiency has gone up.

 Your mobile phone would last a month between recharges, not a day. 
 Your laptop could use a battery half the size and still last two 
 weeks on a full charge.

One of the real industrial problems facing today's society is storage of 
electrical energy in batteries.  The lead-acid batteries in our cars are 
not terribly different from the ones in our grandparents' cars (or even 
our great-grandparents', if they had cars).  The storage capacity has 
gone up a little, mostly because the plastic shells we use now are 
thinner than the bakelite shells they used to use, so there's more 
internal volume for the same external size container.

And, yes, we now have other chemistries (lithium ion, metal hydride, 
etc) which are better in various ways, but the energy density (joules / 
kg) really hasn't changed much in 100 years.

 No. I'm arguing that they shouldn't convert 90% of their energy input 
 into heat.

Actually, they convert 100% of their energy input into heat.  The trick 
is having them do something useful along the way.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-09 Thread Gene Heskett
On Monday 09 June 2014 02:32:33 Rustom Mody did opine
And Gene did reply:
 On Monday, June 9, 2014 9:50:38 AM UTC+5:30, Steven D'Aprano wrote:
  On Sun, 08 Jun 2014 19:24:52 -0700, Rustom Mody wrote:
   On Monday, June 9, 2014 7:14:24 AM UTC+5:30, Steven D'Aprano wrote:
   CPU technology is the triumph of brute force over finesse.
   
   If you are arguing that computers should not use millions/billions
   of transistors, I wont argue, since I dont know the technology.
  
  No. I'm arguing that they shouldn't convert 90% of their energy input
  into heat.

Looking at the whole system, about the only energy input that is not 
converted to heat would be the milliwatt or three of sound from the speaker 
when it beeps at you, and the additional energy to spin the fans.  That is 
all calculable if you have experience in air moving, as in HVAC.
 
 Strange statement.
 What should they convert it into then?
 
 JFTR: Information processing and (physics) energy are about as
 convertible as say: Is a kilogram smaller/greater than a mile?

;-)

Cheers, Gene Heskett
-- 
There are four boxes to be used in defense of liberty:
 soap, ballot, jury, and ammo. Please use in that order.
-Ed Howdershelt (Author)
Genes Web page http://geneslinuxbox.net:6309/gene
US V Castleman, SCOTUS, Mar 2014 is grounds for Impeaching SCOTUS
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-09 Thread Michael Torrie
On 06/08/2014 10:20 PM, Steven D'Aprano wrote:
 A typical desktop computer uses less than 500 watts for *everything* 
 except the screen. Hard drives. DVD burner. Keyboard, mouse, USB devices, 
 network card, sound card, graphics card, etc. (Actually, 350W is more 
 typical.)
 
 Moore's Law observes that processing power has doubled about every two 
 years. Over the last decade, processing power has increased by a factor 
 of 32. If *efficiency* had increased at the same rate, that 500W power 
 supply in your PC would now be a 15W power supply. Your mobile phone 
 would last a month between recharges, not a day. Your laptop could use a 
 battery half the size and still last two weeks on a full charge.

Actually that's not what Moore's law is about.  Moore's law states that
the number of transistors on the die doubles every 18 months.  Any
doubling of something else is entirely coincidental.

 snip

 No. I'm arguing that they shouldn't convert 90% of their energy input 
 into heat.

All electronic circuits that don't create a motive force that performs
work convert 100% of their electrical energy into heat (using "work"
in the physics sense).  CPUs take in electricity and dissipate 100%
of it as heat, and do so immediately.  This conversion to heat does
happen to do something useful along the way (flipping states on
transistors that represent information).  We used to tell people that
computers make very efficient space heaters.  Because in fact they do.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-09 Thread Marko Rauhamaa
Gene Heskett ghesk...@wdtv.com:

 Looking at the whole system, about the only energy input that is not 
 converted to heat, would be the milliwatt or 3 of sound from the speaker 
 when it beeps at you, and the additional energy to spin the fans.

That all becomes heat as well.

The dust particles that stick to the ceiling would be an example of
energy not wasted as heat (gravitational potential energy).


Marko
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-09 Thread Marko Rauhamaa
Michael Torrie torr...@gmail.com:

 We used to tell people that computers make very efficient space
 heaters. Because in fact they do.

And that's no joke. Our home in Finland is heated with electric
radiators. They are on 8-9 months a year. During those months, the use
of all electrical appliances is free (apart from wear and tear).


Marko
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-09 Thread alex23

On 6/06/2014 9:11 PM, Alain Ketterlin wrote:

The nice thing with optional type annotations and an hypothetical Python
compiler would be that you could, e.g., continue using the interpreter
during development and then compile for production use.


s/annotations/decorators/ and you effectively have Cython's pure 
Python mode.

--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread MRAB

On 2014-06-07 17:18, Marko Rauhamaa wrote:

Roy Smith r...@panix.com:


The original MacOS was written in Pascal (both applications and
kernel). Being able to touch memory locations or registers requires no
more than a few short glue routines written in assembler.


Pascal is essentially equivalent to C, except Pascal has a cleaner
syntax. I like the fact that the semicolon is a separator. Also, the
variable declaration syntax is done more smartly in Pascal. And the
pointer/array confusion in C is silly.


I also like the fact that the semicolon is a separator, but, in
practice, misplaced semicolons can cause problems, so languages
descended from Pascal prefer to have explicit terminators.
--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Roy Smith
In article 5393dd6a$0$29988$c3e8da3$54964...@news.astraweb.com,
 Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote:

 On Sat, 07 Jun 2014 20:09:37 -0400, Roy Smith wrote:
 
  We've also got machines that are so fast, it's no longer critical that
  we squeeze out every last iota of performance.  Oh, but wait, now we're
  trying to do absurd things like play full-motion video games on phones,
  where efficiency equates to battery life.  Sigh.
 
 That's where there needs to be a concerted push to develop more efficient 
 CPUs and memory, in the engineering sense of efficiency (i.e. better 
 power consumption, not speed). In desktop and server class machines, 
 increasing speed has generated more and more waste heat, to the point 
 where Google likes to build its server farms next to rivers to reduce 
 their air conditioning costs. You can't afford to do that on a battery.
 
 Even for desktops and servers, I'd prefer to give up, say, 80% of future 
 speed gains for a 50% reduction in my electricity bill.

For desktops, I'm more concerned about physical size.  On my desk at 
work, I have a Mac Mini.  It's about 8 inches square, by an inch and a 
half high.  It sits in a corner of my desk and doesn't take up much 
room.  The guy that sits next to me has a Dell running Linux.  It's 
about 8 inches wide, 15 inches deep, and 24 inches high.  In terms of 
CPU, memory, disk, video, networking, etc, they have virtually identical 
specs.

I've never compared the power consumption, but I assume his eats many 
times the electricity mine does (not to mention makes more noise).
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Gene Heskett
On Sunday 08 June 2014 10:51:24 Roy Smith did opine
And Gene did reply:
 In article 5393dd6a$0$29988$c3e8da3$54964...@news.astraweb.com,
 
  Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote:
  On Sat, 07 Jun 2014 20:09:37 -0400, Roy Smith wrote:
    We've also got machines that are so fast, it's no longer critical
   that we squeeze out every last iota of performance.  Oh, but wait,
   now we're trying to do absurd things like play full-motion video
   games on phones, where efficiency equates to battery life.  Sigh.
  
  That's where there needs to be a concerted push to develop more
  efficient CPUs and memory, in the engineering sense of efficiency
  (i.e. better power consumption, not speed). In desktop and server
  class machines, increasing speed has generated more and more waste
  heat, to the point where Google likes to build its server farms next
  to rivers to reduce their air conditioning costs. You can't afford
  to do that on a battery.
  
  Even for desktops and servers, I'd prefer to give up, say, 80% of
  future speed gains for a 50% reduction in my electricity bill.
 
 For desktops, I'm more concerned about physical size.  On my desk at
 work, I have a Mac Mini.  It's about 8 inches square, by an inch and a
 half high.  It sits in a corner of my desk and doesn't take up much
 room.  The guy that sits next to me has a Dell running Linux.  It's
 about 8 inches wide, 15 inches deep, and 24 inches high.  In terms of
 CPU, memory, disk, video, networking, etc, they have virtually
 identical specs.
 
 I've never compared the power consumption, but I assume his eats many
 times the electricity mine does (not to mention makes more noise).

You may want to reconsider that statement after the first fan failure in 
your mini.  We've had quite a few Macs in the tv station, as video 
servers, graphics composers, etc.  The airflow for cooling in them is 
controlled by baffles to get the maximum air flow past the hot spots, but 
a fan failure usually cooks the whole thing.  And at that time, the Mac 
warranty did not cover collateral damage from a fan failure. Cooked CPU? 
Too bad, so sad.

Cheers, Gene Heskett
-- 
There are four boxes to be used in defense of liberty:
 soap, ballot, jury, and ammo. Please use in that order.
-Ed Howdershelt (Author)
Genes Web page http://geneslinuxbox.net:6309/gene
US V Castleman, SCOTUS, Mar 2014 is grounds for Impeaching SCOTUS
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Roy Smith
In article mailman.10878.1402242019.18130.python-l...@python.org,
 Gene Heskett ghesk...@wdtv.com wrote:

 You may want to reconsider that statement after the first fan failure in 
 your mini.  We've had quite a few Mac's in the tv station, as video 
 servers, graphics composers, etc.  The airflow for cooling in them is 
 controlled by baffles to get the maximum air flow past the hot spots, but 
 a fan failure usually cooks the whole thing.  And at that time, Macs 
 warranty did not cover collateral damage from a fan failure. Cooked cpu? 
 Too bad, so sad.

The CPU (or maybe I'm thinking of the video card?) in the Dell has some 
huge heat sink, a bunch of funky ductwork, and a dedicated fan.  I 
suspect if that fan were to fail, the chip it's cooling would fry itself 
pretty quickly too.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Gene Heskett
On Sunday 08 June 2014 12:09:41 Roy Smith did opine
And Gene did reply:
 In article mailman.10878.1402242019.18130.python-l...@python.org,
 
  Gene Heskett ghesk...@wdtv.com wrote:
  You may want to reconsider that statement after the first fan failure
  in your mini.  We've had quite a few Mac's in the tv station, as
  video servers, graphics composers, etc.  The airflow for cooling in
  them is controlled by baffles to get the maximum air flow past the
  hot spots, but a fan failure usually cooks the whole thing.  And at
  that time, Macs warranty did not cover collateral damage from a fan
  failure. Cooked cpu? Too bad, so sad.
 
 The CPU (or maybe I'm thinking of the video card?) in the Dell has some
 huge heat sink, a bunch of funky ductwork, and a dedicated fan.  I
 suspect if that fan were to fail, the chip it's cooling would fry
 itself pretty quickly too.

Probably.  I have lost several nvidia video cards over the years from fan 
failures.  My Phenom in this box has a 75C shutdown that has not been 
tested.  Best fan & sink assembly I could buy at the time.  And I have 
gotten into the habit of replacing the 45 cent fans on the video card with 
bigger, ball bearing fans at the first hint of a squeal.  A lot of this 
stuff has more engineering time in assuring it will die 2 weeks out of 
warranty than in giving top performance.  And that goes double for stuff 
wearing an Antec label.  I'm on the 4th PSU in this box; it's a $12.65-in-
10-packs 350 watter, Chinese of course, running 4 terabyte drives and a 
USB tree that looks like a weeping willow, plus the original 2.1GHz Phenom, 
165 watts IIRC.  I run gkrellm and watch its voltages.  Now about 3 years 
old, the 5 volt line is still 5.08 volts.  What's not to like?  The 2 
Antecs I was dumb enough to try had 5 volt lines down to 4.75 volts and 
were doing random resets at the end of the 1 year warranty.  That's not an 
excusable failure in my book.

Cheers, Gene Heskett
-- 
There are four boxes to be used in defense of liberty:
 soap, ballot, jury, and ammo. Please use in that order.
-Ed Howdershelt (Author)
Genes Web page http://geneslinuxbox.net:6309/gene
US V Castleman, SCOTUS, Mar 2014 is grounds for Impeaching SCOTUS
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Chris Angelico
On Mon, Jun 9, 2014 at 3:14 AM, Gene Heskett ghesk...@wdtv.com wrote:
 I have lost several nvidia video cards over the years from fan
 failures.

From a discussion on one of Threshold RPG's out-of-character channels:

Kurdt: I wouldn't disturb the fan controller.
Kurdt: Ever seen an AMD without a fan? ;)
Leshrak: heh, yeah
Leshrak: actually.  it's not a pretty smell
Kurdt: Especially when it's overclocked. It goes FT in under two seconds.

I think that's about right.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Sturla Molden
Chris Angelico ros...@gmail.com wrote:

 Kurdt: I wouldn't disturb the fan controller.
 Kurdt: Ever seen an AMD without a fan? ;)
 Leshrak: heh, yeah
 Leshrak: actually.  it's not a pretty smell
 Kurdt: Especially when it's overclocked. It goes FT in under two seconds.
 
 I think that's about right.

One would think that in 2014, a device called a thermostat would shut
down the power before expensive equipment goes up in a ball of smoke.


Sturla

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Chris Angelico
On Mon, Jun 9, 2014 at 4:09 AM, Sturla Molden sturla.mol...@gmail.com wrote:
 Chris Angelico ros...@gmail.com wrote:

 Kurdt: I wouldn't disturb the fan controller.
 Kurdt: Ever seen an AMD without a fan? ;)
 Leshrak: heh, yeah
 Leshrak: actually.  it's not a pretty smell
 Kurdt: Especially when it's overclocked. It goes FT in under two seconds.

 I think that's about right.

 One would think that in 2014, a device called a thermostat would shut
  down the power before expensive equipment goes up in a ball of smoke.

That exchange actually happened back in 2005 (wow! ages ago now), but
same difference. However, I think there are very few thermostats that
can cut the power quickly enough for an overclocked chip that loses
its heat sink. MAYBE if the heat sink is still on and the fan isn't,
but not if the hs falls off. Under two seconds might become the
blink of an eye.
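A back-of-envelope estimate suggests the "under two seconds" figure is plausible. The numbers below are illustrative assumptions, not measurements: roughly 100 W dumped into a bare half-gram silicon die with no heat sink attached:

```python
# Illustrative assumptions -- not measured values:
power_w = 100.0    # heat dissipated by an overclocked CPU, W
die_g = 0.5        # mass of the bare silicon die, g
c_si = 0.7         # specific heat of silicon, J/(g*K)

rate = power_w / (die_g * c_si)  # heating rate with no heat sink, K/s
delta_t = 75.0                   # assumed rise to a thermal-trip point, K

print(f"heating rate: about {rate:.0f} K/s")
print(f"time to +{delta_t:.0f} K: about {delta_t / rate:.2f} s")
```

With these assumptions the die heats at a few hundred kelvin per second, so it blows past any trip point in a fraction of a second; a thermostat sampling even a few times per second may simply be too slow.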

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Carlos Anselmo Dias


On 06/08/2014 06:14 PM, Gene Heskett wrote:

On Sunday 08 June 2014 12:09:41 Roy Smith did opine
And Gene did reply:

In article mailman.10878.1402242019.18130.python-l...@python.org,

  Gene Heskett ghesk...@wdtv.com wrote:

You may want to reconsider that statement after the first fan failure
in your mini.  We've had quite a few Mac's in the tv station, as
video servers, graphics composers, etc.  The airflow for cooling in
them is controlled by baffles to get the maximum air flow past the
hot spots, but a fan failure usually cooks the whole thing.  And at
that time, Macs warranty did not cover collateral damage from a fan
failure. Cooked cpu? Too bad, so sad.

The CPU (or maybe I'm thinking of the video card?) in the Dell has some
huge heat sink, a bunch of funky ductwork, and a dedicated fan.  I
suspect if that fan were to fail, the chip it's cooling would fry
itself pretty quickly too.

Probably.  I have lost several nvidia video cards over the years from fan
failures.  My phenom in this box has a 75C shutdown that has not been
tested.  Best fan  sink assembly I could buy at the time.  And I have
gotten into the habit of replacing the 45 cent fans on the video card with
bigger, ball bearing fans at the first hint of a squall.  A lot of this
stuff has more engineering time in assuring it will die 2 weeks out of
warranty, than in giving top performance.  And that goes double for stuff
wearing an Antec label.  I'm on the 4th psu in this box, its a $12.65 in
10 packs 350 watter, Chinese of course, running 4 terrabyte drives and a
USB tree that looks like a weeping willow plus the original 2.1Mhz Phenom.
165 watts IIRC.  I run gkrellm and watch its voltages.  Now about 3 years
old, the 5 volt line is still 5.08 volts.  Whats not to like?  The 2
Antecs I was dumb enough to try, had 5 volt lines down to 4.75 volts and
doing random resets at the end of the 1 year warranty.  Thats not an
excusable failure in my book.

Cheers, Gene Heskett

Reading this reminds me of the hypothetical dilemma of (...)

If a solution based on n dependencies (client APIs) needed to 
optimize its system (its dependencies too) to face the massive hits of 
search engines indexing n millions of pages with tracking 
integrated at several levels(...) ... how would it be solved? ... It 
would run at n volts(...) and it would need to decrease the voltage(...)


This is somehow integrated in what I wrote in the post with the subject 
'python team(...)'


Getting this working and optimized is really fascinating (...)

Regards,
Carlos



--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Sturla Molden
Chris Angelico ros...@gmail.com wrote:

 Kurdt: I wouldn't disturb the fan controller.
 Kurdt: Ever seen an AMD without a fan? ;)
 Leshrak: heh, yeah
 Leshrak: actually.  it's not a pretty smell
 Kurdt: Especially when it's overclocked. It goes FT in under two 
 seconds.
 
 I think that's about right.
 
 One would think that in 2014, a device called a thermostat would shut
  down the power before expensive equipment goes up in a ball of smoke.
 
 That exchange actually happened back in 2005 (wow! ages ago now), but
 same difference. However, I think there are very few thermostats that
 can cut the power quickly enough for an overclocked chip that loses
 its heat sink. MAYBE if the heat sink is still on and the fan isn't,
 but not if the hs falls off. Under two seconds might become the
 blink of an eye.

If the heat sink falls off, yes, that is really bad news... But if the fan
fails, the warm-up shouldn't be that rapid. I thought we were talking about
fan failure, not a detached heat sink.

Sturla

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Steven D'Aprano
On Mon, 09 Jun 2014 04:16:24 +1000, Chris Angelico wrote:

 On Mon, Jun 9, 2014 at 4:09 AM, Sturla Molden sturla.mol...@gmail.com
 wrote:
 Chris Angelico ros...@gmail.com wrote:

 Kurdt: I wouldn't disturb the fan controller. Kurdt: Ever seen an AMD
 without a fan? ;) Leshrak: heh, yeah
 Leshrak: actually.  it's not a pretty smell Kurdt: Especially when
 it's overclocked. It goes FT in under two seconds.

 I think that's about right.

 One would think that in 2014, a device called a thermostat would shut
  down the power before expensive equipment goes up in a ball of smoke.
 
 That exchange actually happened back in 2005 (wow! ages ago now), but
 same difference. However, I think there are very few thermostats that
 can cut the power quickly enough for an overclocked chip that loses its
 heat sink. MAYBE if the heat sink is still on and the fan isn't, but not
 if the hs falls off. Under two seconds might become the blink of an
 eye.

The fact that CPUs need anything more than a passive heat sink is 
*exactly* the problem. A car engine has to move anything up to a tonne of 
steel around at 100kph or more, and depending on the design, they can get 
away with air-cooling. In comparison, a CPU just moves around a trickle 
of electric current.

(No currently designed car with an internal combustion engine uses air-
cooling. The last mass market car that used it, the Citroën GS, ceased 
production in 1986. The air-cooled Porsche 911 ceased production in 1998, 
making it, I think, the last air-cooled vehicle apart from custom 
machines. With the rise of all-electric vehicles, perhaps we will see a 
return to air-cooling?)

CPU technology is the triumph of brute force over finesse.



-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Rustom Mody
On Monday, June 9, 2014 7:14:24 AM UTC+5:30, Steven D'Aprano wrote:
 On Mon, 09 Jun 2014 04:16:24 +1000, Chris Angelico wrote:

  wrote:
  Chris Angelico wrote:
  Kurdt: I wouldn't disturb the fan controller. Kurdt: Ever seen an AMD
  without a fan? ;) Leshrak: heh, yeah
  Leshrak: actually.  it's not a pretty smell Kurdt: Especially when
  it's overclocked. It goes FT in under two seconds.
  I think that's about right.
  One would think that in 2014, a device called a thermostat would shut
   down the power before expensive equipment goes up in a ball of smoke.
  That exchange actually happened back in 2005 (wow! ages ago now), but
  same difference. However, I think there are very few thermostats that
  can cut the power quickly enough for an overclocked chip that loses its
  heat sink. MAYBE if the heat sink is still on and the fan isn't, but not
  if the hs falls off. Under two seconds might become the blink of an
  eye.

 The fact that CPUs need anything more than a passive heat sink is 
 *exactly* the problem. A car engine has to move anything up to a tonne of 
 steel around at 100kph or more, and depending on the design, they can get 
 away with air-cooling. In comparison, a CPU just moves around a trickle 
 of electric current.

Trickle?
OK... only it's multiplied by a billion:
http://en.wikipedia.org/wiki/Transistor_count

 (No currently designed car with an internal combustion engine uses air-
 cooling. The last mass market car that used it, the Citroën GS, ceased 
 production in 1986. The Porsche 911 ceased production in 1998, making it, 
 I think, the last air-cooled vehicle apart from custom machines. With the 
 rise of all-electric vehicles, perhaps we will see a return to air-
 cooling?)

 CPU technology is the triumph of brute force over finesse.

If you are arguing that computers should not use millions/billions of
transistors, I won't argue, since I don't know the technology.

Only pointing out that billion is a large number in pragmatic terms
- So is million for that matter
- Actually not so sure even on that count
  [Never counted beyond hundred!]
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Rustom Mody
On Monday, June 9, 2014 5:04:05 AM UTC+5:30, Sturla Molden wrote:
 Chris Angelico wrote:

  Kurdt: I wouldn't disturb the fan controller.
  Kurdt: Ever seen an AMD without a fan? ;)
  Leshrak: heh, yeah
  Leshrak: actually.  it's not a pretty smell
  Kurdt: Especially when it's overclocked. It goes FT in under two 
  seconds.
  I think that's about right.
  One would think that in 2014, a device called a thermostat would shut
   down the power before expensive equipment goes up in a ball of smoke.
  That exchange actually happened back in 2005 (wow! ages ago now), but
  same difference. However, I think there are very few thermostats that
  can cut the power quickly enough for an overclocked chip that loses
  its heat sink. MAYBE if the heat sink is still on and the fan isn't,
  but not if the hs falls off. Under two seconds might become the
  blink of an eye.

  If the heat sink falls off, yes, that is really bad news... But if the fan
  fails, the warm-up shouldn't be that rapid. I thought we were talking about
  fan failure, not a detached heat sink.

Don't know about 'fall off'.
However, one day I tried to 'clean' my 'dirty' computer
- which included removing the CPU fan, dusting it and fitting it back
- I didn't know about thermal paste

The machine shut down in a minute (if I remember right)
with a message about overheating.

When the (new!) thermal paste was applied it started again.
I vaguely remember that the BIOS remembered the untoward event and some
resetting was required, though I don't remember what.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Rustom Mody
On Monday, June 9, 2014 7:14:24 AM UTC+5:30, Steven D'Aprano wrote:
 On Mon, 09 Jun 2014 04:16:24 +1000, Chris Angelico wrote:
  wrote:
 The fact that CPUs need anything more than a passive heat sink is 
 *exactly* the problem. A car engine has to move anything up to a tonne of 
 steel around at 100kph or more, and depending on the design, they can get 
 away with air-cooling. In comparison, a CPU just moves around a trickle 
 of electric current.

 (No currently designed car with an internal combustion engine uses air-
 cooling. The last mass market car that used it, the Citroën GS, ceased 
 production in 1986. The Porsche 911 ceased production in 1998, making it, 
 I think, the last air-cooled vehicle apart from custom machines. With the 
 rise of all-electric vehicles, perhaps we will see a return to air-
 cooling?)

 CPU technology is the triumph of brute force over finesse.

BTW people are going this way:
http://www.silentpcreview.com/
http://www.endpcnoise.com/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Steven D'Aprano
On Sun, 08 Jun 2014 19:24:52 -0700, Rustom Mody wrote:

 On Monday, June 9, 2014 7:14:24 AM UTC+5:30, Steven D'Aprano wrote:

 The fact that CPUs need anything more than a passive heat sink is
 *exactly* the problem. A car engine has to move anything up to a tonne
 of steel around at 100kph or more, and depending on the design, they
 can get away with air-cooling. In comparison, a CPU just moves around a
 trickle of electric current.
 
 Trickle?
 OK... only it's multiplied by a billion:
 http://en.wikipedia.org/wiki/Transistor_count

A typical desktop computer uses less than 500 watts for *everything* 
except the screen. Hard drives. DVD burner. Keyboard, mouse, USB devices, 
network card, sound card, graphics card, etc. (Actually, 350W is more 
typical.)

Moore's Law observes that processing power has doubled about every two 
years. Over the last decade, processing power has increased by a factor 
of 32. If *efficiency* had increased at the same rate, that 500W power 
supply in your PC would now be a 15W power supply. Your mobile phone 
would last a month between recharges, not a day. Your laptop could use a 
battery half the size and still last two weeks on a full charge.

In practice, hard drives are not likely to get more efficient, since you 
have to spin up a lump of metal. (Solid state drives tend to be either 
slow and unreliable, or blindingly fast and even more unreliable. Let me 
know how they are in another ten years.) Network cards etc. are 
relatively low-power. It's only the CPU and some of the bigger graphics 
cards that really eat electrons. Moore's Law for power efficiency is 
probably asking too much, but is it too much to ask that CPUs should 
double their efficiency every five years? I don't think so.
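The arithmetic behind those numbers, as a small sketch (the 500 W figure and the doubling periods come from the paragraphs above; the function name is just for illustration):

```python
def watts_needed(start_w, years, efficiency_doubling_years):
    # Power for the same work after efficiency doubles every d years.
    return start_w / 2 ** (years / efficiency_doubling_years)

print(watts_needed(500, 10, 2))  # Moore-rate efficiency: 15.625 W
print(watts_needed(500, 10, 5))  # doubling every 5 years: 125.0 W
```

So the more modest five-year doubling would still cut a 500 W supply to 125 W over a decade.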


 CPU technology is the triumph of brute force over finesse.
 
 If you are arguing that computers should not use millions/billions of
  transistors, I won't argue, since I don't know the technology.

No. I'm arguing that they shouldn't convert 90% of their energy input 
into heat.


 Only pointing out that billion is a large number in pragmatic terms - So
 is million for that matter
 - Actually not so sure even on that count
   [Never counted beyond hundred!]

Not really. A single grain of salt contains billions of billions of 
atoms. A billion transistors is still a drop in the ocean. Wait until we 
get the equivalent of an iPhone's processing power in a speck of dust 
that can float in the air.

http://www.technovelgy.com/ct/content.asp?Bnum=245



-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-08 Thread Chris Angelico
On Mon, Jun 9, 2014 at 11:44 AM, Steven D'Aprano
steve+comp.lang.pyt...@pearwood.info wrote:
 The fact that CPUs need anything more than a passive heat sink is
 *exactly* the problem. A car engine has to move anything up to a tonne of
 steel around at 100kph or more, and depending on the design, they can get
 away with air-cooling. In comparison, a CPU just moves around a trickle
 of electric current.

So, let me get this straight. A CPU has to have a fan, but a car
engine doesn't, because the car's moving at a hundred kays an hour. I
have a suspicion the CPU fan moves air a bit slower than that.

*dives for cover*

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Christian Gollwitzer

Am 06.06.14 13:20, schrieb Alain Ketterlin:

Chris Angelico ros...@gmail.com writes:

It's impossible to accidentally call a base class's method when you
ought to have called the overriding method in the subclass, which is a
risk in C++ [2].


I don't see how this can happen in C++, unless you actually have an instance
of the base class. Anyway, I didn't mention C++.


A special, but important case of this is inside the constructor. Until 
you exit the constructor, C++ treats the object as not fully 
constructed, and if you call a virtual method there, it calls the method 
of the base class.


http://www.parashift.com/c++-faq/calling-virtuals-from-ctors.html

The answer is, of course, to create a *separate* init function in 
addition to the constructor and to require the user of the class to call 
it after the constructor, or to hide the real constructor away and 
require the user to call a factory function instead.
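By way of contrast, here is a sketch (class names invented) of the behavior Python gives you instead: method lookup stays dynamic even during construction, so a call from `__init__` dispatches to the subclass override, the opposite of C++'s in-constructor rule.

```python
class Base:
    def __init__(self):
        # In Python, lookup is dynamic even during construction,
        # so this calls the *subclass* override -- unlike C++,
        # where the base-class version would run here.
        self.setup()

    def setup(self):
        self.kind = "base"

class Derived(Base):
    def setup(self):
        self.kind = "derived"

print(Derived().kind)   # derived
```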


I love C++.
(seriously, but not /that/ part)

Christian
--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Alain Ketterlin
Sturla Molden sturla.mol...@gmail.com writes:

 Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:
 Sturla Molden sturla.mol...@gmail.com writes:
 
 Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:
 
 Many of these students suggest Python as the
 development language (they learned it and liked it), and the suggestion
 is (almost) always rejected, in favor of Java or C# or C/C++.
 
 And it was almost always the wrong decision...
 
 I think they know better than you and me.

 Now it's my turn to say oh, come on. Those who make these decisions have
 likely never written a line of code in their life.

This totally contradicts my experience. I've heard horror stories like
everybody else, but I just have been lucky enough to work with people
that very seriously evaluate their engineering decisions.

-- Alain.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Mark Lawrence

On 07/06/2014 09:20, Alain Ketterlin wrote:

Sturla Molden sturla.mol...@gmail.com writes:


Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:

Sturla Molden sturla.mol...@gmail.com writes:


Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:


Many of these students suggest Python as the
development language (they learned it and liked it), and the suggestion
is (almost) always rejected, in favor of Java or C# or C/C++.


And it was almost always the wrong decision...


I think they know better than you and me.


Now it's my turn to say oh, come on. Those who make these decisions have
likely never written a line of code in their life.


This totally contradicts my experience. I've heard horror stories like
everybody else, but I just have been lucky enough to work with people
that very seriously evaluate their engineering decisions.

-- Alain.



Clearly manpower isn't an issue.

--
My fellow Pythonistas, ask not what our language can do for you, ask 
what you can do for our language.


Mark Lawrence

---
This email is free from viruses and malware because avast! Antivirus protection 
is active.
http://www.avast.com


--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Alain Ketterlin
Mark Lawrence breamore...@yahoo.co.uk writes:

 On 07/06/2014 09:20, Alain Ketterlin wrote:
 Sturla Molden sturla.mol...@gmail.com writes:

 Many of these students suggest Python as the
 development language (they learned it and liked it), and the suggestion
 is (almost) always rejected, in favor of Java or C# or C/C++.

 And it was almost always the wrong decision...

 I think they know better than you and me.

 Now it's my turn to say oh, come on. Those who make these decisions have
 likely never written a line of code in their life.

 This totally contradicts my experience. I've heard horror stories like
 everybody else, but I just have been lucky enough to work with people
 that very seriously evaluate their engineering decisions.

 Clearly manpower isn't an issue.

No. Cost is the issue (development, maintenance, operation,
liability...). Want an example? Here is one:

http://tech.slashdot.org/story/14/06/06/1443218/gm-names-and-fires-engineers-involved-in-faulty-ignition-switch

-- Alain.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Roy Smith
In article 87zjhpm8q7@dpt-info.u-strasbg.fr,
 Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:

 Sturla Molden sturla.mol...@gmail.com writes:
 
  Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:
  Sturla Molden sturla.mol...@gmail.com writes:
  
  Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:
  
  Many of these students suggest Python as the
  development language (they learned it and liked it), and the suggestion
  is (almost) always rejected, in favor of Java or C# or C/C++.
  
  And it was almost always the wrong decision...
  
  I think they know better than you and me.
 
  Now it's my turn to say oh, come on. Those who make these decisions have
  likely never written a line of code in their life.
 
 This totally contradicts my experience. I've heard horror stories like
 everybody else, but I just have been lucky enough to work with people
 that very seriously evaluate their engineering decisions.

You are lucky indeed.  Trust me, in big companies, technical decisions 
are often made by people who are not using the technology.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Roy Smith
In article mailman.10852.1402154644.18130.python-l...@python.org,
 Dennis Lee Bieber wlfr...@ix.netcom.com wrote:

 On 07 Jun 2014 04:57:19 GMT, Steven D'Aprano
 steve+comp.lang.pyt...@pearwood.info declaimed the following:
 
 
 Swift is intended as a new generation *systems language*. The old 
 generation of systems languages are things like C, Objective-C, C#, C++, 
 Java, Pascal, Algol, and so forth. The new generation are intended to 
 fulfil the same niches, but to have syntax and usability closer to that 
 of scripting languages. Languages like Go, Rust, Ceylon, and now Swift.
 
   Pascal as a systems language? We must have major differences in what
 constitutes a systems language then...

The original MacOS was written in Pascal (both applications and kernel).  
Being able to touch memory locations or registers requires no more than 
a few short glue routines written in assembler.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Roy Smith
In article mailman.10851.1402154030.18130.python-l...@python.org,
 Dennis Lee Bieber wlfr...@ix.netcom.com wrote:

 On Sat, 07 Jun 2014 08:52:36 -0400, Roy Smith r...@panix.com declaimed the
 following:
 
 
 You are lucky indeed.  Trust me, in big companies, technical decisions 
 are often made by people who are not using the technology.
 
   Or influenced by someone familiar with some tech and having a big
 ego...
 
   Many years ago, in a company to remain nameless, I was in a department
 with ~130 programmers distributed among 3-4 main subsystems (batch analysis
 [aka, post-processing of the daily tapes], planning [generating the
 schedule for the next day], and real-time [operations using the schedule]).
 The real-time group was 15-30 people using Macro-11 (PDP-11s if that dates
 things). The rest of the department was pretty much all skilled VAX
 FORTRAN-77.
 
   The time came to port real-time from PDP-11 to a VAX system. A small
 study was performed to determine what language would be used. Very small
 study -- I think it was restricted to the 30 RT folks; I only learned of
 the result after a choice had been made.
 
   The candidates: VAX-11 Assembly, F77, C, Pascal.

What was wrong with just running the original pdp-11 binaries on the VAX 
in compatibility mode? :-)
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Michael Torrie
On 06/07/2014 09:23 AM, Dennis Lee Bieber wrote:
 On 07 Jun 2014 04:57:19 GMT, Steven D'Aprano
 steve+comp.lang.pyt...@pearwood.info declaimed the following:
 

 Swift is intended as a new generation *systems language*. The old 
 generation of systems languages are things like C, Objective-C, C#, C++, 
 Java, Pascal, Algol, and so forth. The new generation are intended to 
 fulfil the same niches, but to have syntax and usability closer to that 
 of scripting languages. Languages like Go, Rust, Ceylon, and now Swift.

   Pascal as a systems language? We must have major differences in what
 constitutes a systems language then...
 
   Native Pascal had no features to support hitting the hardware or
 arbitrary memory addresses/registers. It was a candidate for an
 applications language (though even that always felt a stretch to me; as a
 teaching language for structured programming it was ideal, though). Try
 writing a serial port driver for a memory mapped I/O system using pure
 Pascal.

Technically C doesn't either, except via subroutines in libc, though C
does have pointers which would be used to access memory.  In the old MS
DOS days, C would embed assembly to call interrupts and set up interrupt
tables, etc.

As someone else mentioned recently, Pascal was used as the system
language on Mac computers for many years.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Roy Smith
In article mailman.10853.1402162690.18130.python-l...@python.org,
 Michael Torrie torr...@gmail.com wrote:

 Technically C doesn't [have features to support hitting the hardware] 
 either, except via subroutines in libc, though C does have pointers 
 which would be used to access memory.

Several language constructs in C are there specifically to diddle bits 
in hardware.  Bit fields were in the earliest implementations of the 
language to allow you to address individual bit control and status bits 
in memory-mapped device controllers.  The volatile keyword is there to 
deal with bits which change value on their own (as hardware status 
registers do).

And, why do you need a library routine to touch a memory location, when 
you can just dereference an integer? :-)
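Python can pull the same trick through the stdlib `ctypes` module; a sketch, kept safe only because the address belongs to a buffer this process owns:

```python
import ctypes

# Allocate a C integer, take its address as a plain Python int, then
# "dereference the integer" by casting it back to a typed pointer --
# the same trick described above, done from Python.
value = ctypes.c_uint32(0xDEADBEEF)
address = ctypes.addressof(value)        # just an int
pointer = ctypes.cast(address, ctypes.POINTER(ctypes.c_uint32))

pointer.contents.value = 42              # write through the pointer
print(value.value)                       # 42
```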
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Michael Torrie
On 06/07/2014 12:11 PM, Roy Smith wrote:
 Several language constructs in C are there specifically to diddle bits 
 in hardware.  Bit fields were in the earliest implementations of the 
 language to allow you to address individual bit control and status bits 
 in memory-mapped device controllers.  The volatile keyword is there to 
 deal with bits which change value on their own (as hardware status 
 registers do).
 
 And, why do you need a library routine to touch a memory location, when 
 you can just dereference an integer? :-)

Which of course, technically, Pascal has too.

But memory addressing is only half the story.  You still need interrupts
and ioctl access, both of which happen via assembly instructions that
libc exposes via a standard C subroutine interface.

Really any language can access hardware this way.  Whether it's
MicroPython on an embedded system, or BASIC on a pic.  The lines are
being blurred.

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Roy Smith
In article mailman.10857.1402167635.18130.python-l...@python.org,
 Michael Torrie torr...@gmail.com wrote:

 On 06/07/2014 12:11 PM, Roy Smith wrote:
  Several language constructs in C are there specifically to diddle bits 
  in hardware.  Bit fields were in the earliest implementations of the 
  language to allow you to address individual bit control and status bits 
  in memory-mapped device controllers.  The volatile keyword is there to 
  deal with bits which change value on their own (as hardware status 
  registers do).
  
  And, why do you need a library routine to touch a memory location, when 
  you can just dereference an integer? :-)
 
 Which of course, technically, Pascal has too.
 
 But memory addressing is only half the story.  You still need interrupts
 and ioctl access, both of which happen via assembly instructions that
 libc exposes via a standard C subroutine interface.

Well, on a machine where all I/O is memory mapped, it's really 3/4 of 
the story, but I get your point.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Steven D'Aprano
On Sat, 07 Jun 2014 11:13:42 -0400, Dennis Lee Bieber wrote:

 About a decade later, said manager retired and confessed that the choice
 of Pascal was a mistake

There's Pascal and there's Pascal. Standard Pascal, I admit, is woefully 
unsuitable for real world work. But Pascal with suitable extensions was 
good enough for the first 6 generations of the Macintosh operating system 
and key applications, at a time when *nobody* was even coming close to 
doing what the Mac was capable of. (Admittedly, a certain number of the 
core OS libraries, most famously Quickdraw, were handwritten in assembly 
by a real genius.) By the mid-80s, Apple's SANE (Standard Apple Numeric 
Environment) was quite possibly the best environment for doing IEEE-754 
numeric work anywhere. But of course, Macintoshes were toys, right, and 
got no respect, even when the Mac G4 was the first PC powerful enough to 
be classified by US export laws as a supercomputer.

[Disclaimer: Pascal on the Mac might have been far ahead of the pack when 
it came to supporting IEEE-754, but it didn't have the vast number of 
(variable-quality) Fortran libraries available on other systems. And 
while it is true that the G4 was classified as a supercomputer, that was 
only for four months until the Clinton administration changed the laws. 
Apple, of course, played that for every cent of advertising it could.]



-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Steven D'Aprano
On Sat, 07 Jun 2014 14:11:27 -0400, Roy Smith wrote:

 And, why do you need a library routine to touch a memory location, when
 you can just dereference an integer? :-)

And in one sentence we have an explanation for 90% of security 
vulnerabilities before PHP and SQL injection attacks...

C is not a safe language, and code written in C is not safe. Using C for 
application development is like shaving with a cavalry sabre -- harder 
than it need be, and you're likely to remove your head by accident.




-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Roy Smith
In article 5393a264$0$29988$c3e8da3$54964...@news.astraweb.com,
 Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote:

 On Sat, 07 Jun 2014 14:11:27 -0400, Roy Smith wrote:
 
  And, why do you need a library routine to touch a memory location, when
  you can just dereference an integer? :-)
 
 And in one sentence we have an explanation for 90% of security 
 vulnerabilities before PHP and SQL injection attacks...
 
 C is not a safe language, and code written in C is not safe. Using C for 
 application development is like shaving with a cavalry sabre -- harder 
 than it need be, and you're likely to remove your head by accident.

I never claimed C was a safe language.  I assume you've seen the classic 
essay, http://www-users.cs.york.ac.uk/susan/joke/foot.htm ?

And, no, I don't think C is a good application language (any more).  
When it first came out, it was revolutionary.  A lot of really amazing 
application software was written in it, partly because the people 
writing in it were some of the smartest guys around.  But, that was 40 
years ago.  We've learned a lot about software engineering since then.  

We've also got machines that are so fast, it's no longer critical that 
we squeeze out every last iota of performance.  Oh, but wait, now we're 
trying to do absurd things like play full-motion video games on phones, 
where efficiency equates to battery life.  Sigh.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Chris Angelico
On Sun, Jun 8, 2014 at 10:09 AM, Roy Smith r...@panix.com wrote:
 We've also got machines that are so fast, it's no longer critical that
 we squeeze out every last iota of performance.  Oh, but wait, now we're
 trying to do absurd things like play full-motion video games on phones,
 where efficiency equates to battery life.  Sigh.

Efficiency will never stop being important. Efficiency will also never
be the one most important thing. No matter how much computing power
changes, those statements are unlikely to be falsified...

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Gregory Ewing

Michael Torrie wrote:

Technically C doesn't either, except via subroutines in libc, though C
does have pointers which would be used to access memory.


The Pascal that Apple used had a way of casting an
int to a pointer, so you could do all the tricks
you can do with pointers in C.

--
Greg
--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Gregory Ewing

Dennis Lee Bieber wrote:

Not standard Pascal... It had pointer types, but no means to stuff
an integer into the pointer variable in order to dereference it as a memory
address...


Although most implementations would let you get the same
effect by abusing variant records (the equivalent of a
C union).


What is an interrupt --
typically a handler (function) address stored in a fixed location used by
the CPU when an external hardware signal goes high... Nothing prevents one
from writing that handler in C and using C's various casting operations to
stuff it into the vector memory.


Most CPU architectures require you to use a special
return from interrupt instruction to return from
a hardware interrupt handler. So you need at least
a small assembly language stub to call a handler
written in C, or a C compiler with a non-standard
extension to generate that instruction.

--
Greg
--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Steven D'Aprano
On Sat, 07 Jun 2014 20:09:37 -0400, Roy Smith wrote:

 We've also got machines that are so fast, it's no longer critical that
 we squeeze out every last iota of performance.  Oh, but wait, now we're
 trying to do absurd things like play full-motion video games on phones,
 where efficiency equates to battery life.  Sigh.

That's where there needs to be a concerted push to develop more efficient 
CPUs and memory, in the engineering sense of efficiency (i.e. better 
power consumption, not speed). In desktop and server class machines, 
increasing speed has generated more and more waste heat, to the point 
where Google likes to build its server farms next to rivers to reduce 
their air conditioning costs. You can't afford to do that on a battery.

Even for desktops and servers, I'd prefer to give up, say, 80% of future 
speed gains for a 50% reduction in my electricity bill.



-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-07 Thread Marko Rauhamaa
Roy Smith r...@panix.com:

 The original MacOS was written in Pascal (both applications and
 kernel). Being able to touch memory locations or registers requires no
 more than a few short glue routines written in assembler.

Pascal is essentially equivalent to C, except Pascal has a cleaner
syntax. I like the fact that the semicolon is a separator. Also, the
variable declaration syntax is done more smartly in Pascal. And the
pointer/array confusion in C is silly.


Marko
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-06 Thread Travis Griggs


 On Jun 5, 2014, at 1:14, Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:
 
 Swift's memory management is similar to python's (ref. counting). Which
 makes me think that a subset of python with the same type safety would
 be an instant success.

Except that while you don't need to regularly worry about cycles in Python, you 
do in Swift. Which means you get to think constantly about direct and indirect 
cycles, figure out where to put weak stuff, when to use a local to keep a weak 
property alive until it finds its strong home, etc.
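For comparison, CPython backs its reference counting with a cycle detector, which is why plain reference cycles are rarely a worry in Python; a minimal sketch:

```python
import gc

class Node:
    def __init__(self):
        self.other = None

# Build a two-object reference cycle, then drop all external references.
a, b = Node(), Node()
a.other, b.other = b, a
del a, b

# Pure reference counting alone would leak these objects; CPython's
# cycle detector finds and reclaims them.
collected = gc.collect()
print(collected >= 2)   # True
```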
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-06 Thread Alain Ketterlin
Travis Griggs travisgri...@gmail.com writes:

 On Jun 5, 2014, at 1:14, Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:
 
 Swift's memory management is similar to python's (ref. counting). Which
 makes me think that a subset of python with the same type safety would
 be an instant success.

 Except that while you don't need to regularly worry about cycles in
 Python, you do in Swift.

Right. You can't just ignore cycles in Swift.

 Which means you get to think constantly about direct and indirect
 cycles, figure out where to put weak stuff, when to use a local to
 keep a weak property alive until it finds its strong home, etc.

Well, I don't consider this a bad trade-off. Deciding which refs are
weak and which are strong, or even designing an explicit deallocation
strategy, are design decisions.
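In Python the analogous design decision is expressed with the stdlib `weakref` module; a small sketch (class names invented) of a weak back-reference that avoids a parent/child cycle:

```python
import weakref

class Parent:
    def __init__(self):
        self.child = Child(self)

class Child:
    def __init__(self, parent):
        # Weak back-reference: the child does not keep its parent
        # alive, so no strong reference cycle forms between the two.
        self.parent = weakref.ref(parent)

p = Parent()
child = p.child
print(child.parent() is p)   # True while the parent is alive

del p                        # drop the only strong reference
print(child.parent())        # None: the parent was reclaimed at once
```

In CPython the second `print` shows `None` immediately, because reference counting frees the parent as soon as its last strong reference goes away.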

-- Alain.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-06 Thread Alain Ketterlin
Terry Reedy tjre...@udel.edu writes:

 On 6/5/2014 4:07 PM, Alain Ketterlin wrote:

 When I compile Cython modules I use LLVM on this computer.

 Cython is not Python, it is another language, with an incompatible
 syntax.

 Cython compiles Python with optional extensions that allow additional
 speed ups over compiling Python as is. In other word, the Cython
 language is a Python superset.

You're right. What I question is whether anybody uses Cython
without the additional syntax. There is little chance that a pure
Python program will see any significant speedup when compiled with
Cython (or, if it does, it means that the canonical Python interpreter
has some sub-optimal behavior that will, eventually, be corrected).

The nice thing with optional type annotations and a hypothetical Python
compiler would be that you could, e.g., continue using the interpreter
during development and then compile for production use. Or whatever mix
you need.
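Python already carries the syntax such a compiler could exploit: PEP 3107 function annotations, which the interpreter records but never enforces. A small sketch:

```python
def scale(vec: list, factor: float) -> list:
    """Annotated function: CPython stores the hints but never checks them."""
    return [x * factor for x in vec]

# The annotations are available for tools (or a hypothetical compiler)
# via __annotations__, while the interpreter ignores them at call time.
print(scale.__annotations__['factor'])   # <class 'float'>
print(scale([1, 2, 3], 2.0))             # [2.0, 4.0, 6.0]
```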

-- Alain.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-06 Thread Alain Ketterlin
Sturla Molden sturla.mol...@gmail.com writes:

 Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:

 Many of these students suggest Python as the
 development language (they learned it and liked it), and the suggestion
 is (almost) always rejected, in favor of Java or C# or C/C++.

 And it was almost always the wrong decision...

I think they know better than you and me.

-- Alain.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-06 Thread Alain Ketterlin
Chris Angelico ros...@gmail.com writes:

 On Fri, Jun 6, 2014 at 7:23 AM, Mark Lawrence breamore...@yahoo.co.uk wrote:
 On 05/06/2014 21:07, Alain Ketterlin wrote:

 Sturla Molden sturla.mol...@gmail.com writes:

 On 05/06/14 10:14, Alain Ketterlin wrote:

 Type safety.

 Perhaps. Python has strong type safety.

 Come on.

 I don't understand that comment, please explain.

 Type safety means many different things to different people. What
 Python has is untyped variables, and hierarchically typed objects.
 It's impossible to accidentally treat an integer as a float, and have
 junk data [1].

It's impossible in Swift as well.

 It's impossible to accidentally call a base class's method when you
 ought to have called the overriding method in the subclass, which is a
 risk in C++ [2].

I don't see how this can happen in C++, unless you actually have an instance
of the base class. Anyway, I didn't mention C++.

[I agree with the rest of your explanation.]

-- Alain.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-06 Thread Alain Ketterlin
Sturla Molden sturla.mol...@gmail.com writes:

 On 05/06/14 22:27, Alain Ketterlin wrote:
 I have seen dozens of projects where Python was dismissed because of the
 lack of static typing, and the lack of static analysis tools.

[...]
 When is static analysis actually needed and for what purpose?

For example WCET analysis (where predictability is more important than
performance). Or code with strong security constraint. Or overflow
detection tools. Or race condition analyzers. And there are many others.
And I don't even mention engineering tools for dependence analysis,
packaging, etc. (or even IDEs).

 [...] But still they avoid Ada [...]

Sorry, I forgot Ada in my list.

-- Alain.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-06 Thread Chris Angelico
On Fri, Jun 6, 2014 at 9:20 PM, Alain Ketterlin
al...@dpt-info.u-strasbg.fr wrote:
 It's impossible to accidentally call a base class's method when you
 ought to have called the overriding method in the subclass, which is a
 risk in C++ [2].

 I don't see how this can happen in C++, unless you actually have an instance
 of the base class. Anyway, I didn't mention C++.

Mostly if you forget to declare the method 'virtual'; there are other
ways to muck things up, but that's the main one.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-06 Thread Terry Reedy

On 6/6/2014 7:11 AM, Alain Ketterlin wrote:

Terry Reedy tjre...@udel.edu writes:


On 6/5/2014 4:07 PM, Alain Ketterlin wrote:


When I compile Cython modules I use LLVM on this computer.


Cython is not Python, it is another language, with an incompatible
syntax.


Cython compiles Python with optional extensions that allow additional
speed ups over compiling Python as is. In other word, the Cython
language is a Python superset.


I am assuming here that the claim to have reached this goal is correct.


You're right. What I question is whether anybody uses Cython
without the additional syntax. There is little chance that a pure
Python program will see any significant speedup when compiled with


I believe the Cython author has claimed a 2x-5x speedup for stdlib 
modules when compiled 'as is'.



Cython (or, if it does, it means that the canonical Python interpreter
has some sub-optimal behavior that will, eventually, be corrected).


I believe that there is some inherent overhead that Cython bypasses.

--
Terry Jan Reedy

--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-06 Thread Sturla Molden
Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:

 When is static analysis actually needed and for what purpose?
 
 For example WCET analysis (where predictability is more important than
 performance). Or code with strong security constraint. Or overflow
 detection tools. Or race condition analyzers. And there are many others.
 And I don't even mention engineering tools for dependence analysis,
 packaging, etc. (or even IDEs).

You don't have to answer a rhetorical question.


Sturla

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-06 Thread Sturla Molden
Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:
 Sturla Molden sturla.mol...@gmail.com writes:
 
 Alain Ketterlin al...@dpt-info.u-strasbg.fr wrote:
 
 Many of these students suggest Python as the
 development language (they learned it and liked it), and the suggestion
 is (almost) always rejected, in favor of Java or C# or C/C++.
 
 And it was almost always the wrong decision...
 
 I think they know better than you and me.

Now it's my turn to say oh, come on. Those who make these decisions have
likely never written a line of code in their life.

Sturla

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-06 Thread Michael Torrie
On 06/06/2014 12:28 AM, Travis Griggs wrote:
 
 
 On Jun 5, 2014, at 1:14, Alain Ketterlin
 al...@dpt-info.u-strasbg.fr wrote:
 
 Swift's memory management is similar to python's (ref. counting).
 Which makes me think that a subset of python with the same type
 safety would be an instant success.
 
 Except that while you don't need to regularly worry about cycles in
 Python, you do in Swift. Which means you get to think constantly
 about direct and indirect cycles, figure out where to put weak stuff,
 when to use a local to keep a weak property alive until it finds its
 strong home, etc.

Swift's reference counting seems to be fairly close to Objective C's,
which makes sense since the classes can be used directly in Swift.
Seems to me that Swift is just Objective C with some syntactic sugar and
a nicer syntax. That's why I said it was a little odd to be comparing
Swift to Python, or at least to be claiming Apple should have made
Python its core language.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-06 Thread Steven D'Aprano
On Fri, 06 Jun 2014 20:41:09 -0600, Michael Torrie wrote:

 On 06/06/2014 12:28 AM, Travis Griggs wrote:
 
 
 On Jun 5, 2014, at 1:14, Alain Ketterlin al...@dpt-info.u-strasbg.fr
 wrote:
 
 Swift's memory management is similar to python's (ref. counting).
 Which makes me think that a subset of python with the same type safety
 would be an instant success.
 
 Except that while you don't need to regularly worry about cycles in
 Python, you do in Swift. Which means you get to think constantly about
 direct and indirect cycles, figure out where to put weak stuff, when to
 use a local to keep a weak property alive until it finds its strong
 home, etc.
 
 Swift's reference counting seems to be fairly close to Objective C's,
 which makes sense since the classes can be used directly in Swift. Seems
 to me that Swift is just Objective C with some syntactic sugar and a
 nicer syntax. That's why I said it was a little odd to be comparing
 Swift to Python, or at least to be claiming Apple should have made
 Python its core language.

A little odd? It's utterly astonishing!

Swift is not in the same family of languages as Python, Perl, Javascript, 
Ruby, Applescript, etc. I'll call them scripting languages, but I don't 
mean that as a put-down. I just mean that they are intended for rapid 
development, they are dynamically typed, lightweight, and typically 
aren't expected to compile directly to machine code.

Swift is intended as a new generation *systems language*. The old 
generation of systems languages are things like C, Objective-C, C#, C++, 
Java, Pascal, Algol, and so forth. The new generation are intended to 
fulfil the same niches, but to have syntax and usability closer to that 
of scripting languages. Languages like Go, Rust, Ceylon, and now Swift.

We're starting to see the distinction between systems and scripting 
languages blurred even more than it used to be. These smart, lightweight 
but powerful systems languages are likely to be a big challenge to 
scripting languages like Python and Ruby in the coming decades. If you 
had a language as easy to use and as safe as Python, but as efficient as 
C, why wouldn't you use it?

It is naive to expect Apple to have made Python its core language. 
Apple's core language is Objective-C, and if they were going to pick a 
core scripting language (other than Applescript, which exists in a 
different ecological niche) it would have been Ruby, not Python. Ruby's 
fundamental model is very similar to Objective-C, Python's is not. Apple 
already developed a version of Ruby, MacRuby, which was designed to 
call directly into the Objective-C APIs without an intermediate interface 
layer, but they have abandoned that to focus on Swift.

(Besides, Apple is unlikely to commit to a core language being something 
they don't control.)



-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-05 Thread Alain Ketterlin
Sturla Molden sturla.mol...@gmail.com writes:

 Dear Apple,

 Why should I be excited about an illegitimate child of Python, Go and
 JavaScript?
[...]

Type safety. (And with it comes better performance ---read battery
life--- and better static analysis tools, etc.) LLVM (an Apple-managed
project) for the middle- and back-end, and a brand new front-end
incorporating a decent type system (including optional types, for
instance).

Swift's memory management is similar to Python's (ref. counting), which
makes me think that a subset of Python with the same type safety would
be an instant success.
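CPython's flavour of this is easy to observe directly: sys.getrefcount (a
real CPython function) reports an object's reference count, so a small
sketch shows counts rising and falling as names are bound and dropped.
(Node is just a made-up stand-in class.)

```python
import sys

class Node:
    """Trivial stand-in object for watching CPython's reference counts."""
    pass

obj = Node()
# getrefcount reports one extra reference: the temporary one created
# by passing obj as an argument to the call itself.
base = sys.getrefcount(obj)

alias = obj                           # binding a second name adds a reference
assert sys.getrefcount(obj) == base + 1

del alias                             # dropping the name removes it again
assert sys.getrefcount(obj) == base
```

When the last reference disappears, CPython reclaims the object
immediately, which is also the behaviour Swift's ARC aims for (minus
CPython's extra cycle collector).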

-- Alain.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-05 Thread Chris Angelico
On Thu, Jun 5, 2014 at 6:14 PM, Alain Ketterlin
al...@dpt-info.u-strasbg.fr wrote:
 Swift's memory management is similar to python's (ref. counting). Which
 makes me think that a subset of python with the same type safety would
 be an instant success.

In the same way that function annotations to give type information
were an instant success?

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-05 Thread Steven D'Aprano
On Wed, 04 Jun 2014 22:43:05 -0400, Terry Reedy wrote:

 Many mail readers treat \t as a null char since it actually has no
 standard translation into screen space.

I challenge that assertion. There are two standard translations into 
screen space: jump to the next multiple of 8 spaces, or 1 space.

Treating \t as a single space would be pathetic but standard. Treating it 
as (up to) 8 spaces would be more useful, and standard. Rendering it as a 
picture of a banana dancing on the ceiling would be silly and non-
standard. Not rendering it at all is even more stupid and less justified.
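Both standard renderings are one call away in Python: str.expandtabs pads
each tab out to the next multiple of the tab size (8 by default), and a
tab size of 1 degenerates into the single-space rendering:

```python
line = "a\tbc\tdef"

# Default tab size 8: "a" ends at column 1, so the first tab pads with
# 7 spaces to reach column 8; "bc" ends at column 10, so the second
# tab pads with 6 spaces to reach column 16.
assert line.expandtabs() == "a" + " " * 7 + "bc" + " " * 6 + "def"

# Tab size 1: every tab collapses to exactly one space.
assert line.expandtabs(1) == "a bc def"
```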


-- 
Steven
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-05 Thread Chris Angelico
On Thu, Jun 5, 2014 at 6:39 PM, Steven D'Aprano st...@pearwood.info wrote:
 Treating \t as a single space would be pathetic but standard. Treating it
 as (up to) 8 spaces would be more useful, and standard. Rendering it as a
 picture of a banana dancing on the ceiling would be silly and non-
 standard. Not rendering it at all is even more stupid and less justified.

While I don't generally cite a show that I don't actively watch, it
seems appropriate here: http://youtu.be/F_1zoX5Ax9U?t=1m

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-05 Thread Alain Ketterlin
Chris Angelico ros...@gmail.com writes:

 On Thu, Jun 5, 2014 at 6:14 PM, Alain Ketterlin
 al...@dpt-info.u-strasbg.fr wrote:
 Swift's memory management is similar to python's (ref. counting). Which
 makes me think that a subset of python with the same type safety would
 be an instant success.

 In the same way that function annotations to give type information
 were an instant success?

If they were useful, they would be used more. I have made several uses
of (a variant of)

http://code.activestate.com/recipes/578528-type-checking-using-python-3x-annotations/
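A minimal sketch of the idea behind that recipe (not the recipe's actual
code; typechecked and greet are made-up names): decorate a function and
compare call arguments against its Python 3 annotations at runtime.

```python
import functools
import inspect

def typechecked(func):
    """Reject calls whose arguments do not match the annotations.
    A hypothetical sketch of the linked recipe's approach."""
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            expected = func.__annotations__.get(name)
            # Only enforce annotations that are plain classes.
            if isinstance(expected, type) and not isinstance(value, expected):
                raise TypeError(f"{name} must be {expected.__name__}, "
                                f"got {type(value).__name__}")
        return func(*args, **kwargs)
    return wrapper

@typechecked
def greet(name: str, times: int) -> str:
    return " ".join(["Hello, " + name] * times)
```

greet("world", 2) runs normally, while greet("world", "2") raises
TypeError before the function body executes.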

-- Alain.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-05 Thread Chris Angelico
On Thu, Jun 5, 2014 at 7:42 PM, Alain Ketterlin
al...@dpt-info.u-strasbg.fr wrote:
 Chris Angelico ros...@gmail.com writes:

 On Thu, Jun 5, 2014 at 6:14 PM, Alain Ketterlin
 al...@dpt-info.u-strasbg.fr wrote:
 Swift's memory management is similar to python's (ref. counting). Which
 makes me think that a subset of python with the same type safety would
 be an instant success.

 In the same way that function annotations to give type information
 were an instant success?

 If they were useful, they would be used more. I have made several uses
 of (a variant of)

 http://code.activestate.com/recipes/578528-type-checking-using-python-3x-annotations/

Precisely. I don't see that there's a huge body of coders out there
just itching to use "Python, but with some type information", or we'd
be seeing huge amounts of code, well, written in Python with type
information. Annotations have been seen as an interesting curiosity,
perhaps, but not as "hey look, finally Python's massive problem is
solved". So I don't think there's much call for a *new language* on the
basis that it's Python plus type information.

There's more call for Python with C-like syntax, given the number of
times people complain about indentation. (There already is such a
language, but it's somewhat obscure, so it's quite likely Apple aren't
aware of its merits.) There might be call for Python that can be
compiled efficiently to the such-and-such backend. But not Python
with declared-type variables, not as a feature all of its own.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-05 Thread Rustom Mody
On Thursday, June 5, 2014 2:09:34 PM UTC+5:30, Steven D'Aprano wrote:
 On Wed, 04 Jun 2014 22:43:05 -0400, Terry Reedy wrote:

  Many mail readers treat \t as a null char since it actually has no
  standard translation into screen space.

 I challenge that assertion. There are two standard translations into 
 screen space: jump to the next multiple of 8 spaces, or 1 space.

 Treating \t as a single space would be pathetic but standard. Treating it 
 as (up to) 8 spaces would be more useful, and standard. Rendering it as a 
 picture of a banana dancing on the ceiling would be silly and non-
 standard. Not rendering it at all is even more stupid and less justified.

A random thread (I guess one can find more):

https://mail.python.org/pipermail/python-list/2012-March/621993.html
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-05 Thread Sturla Molden

On 05/06/14 10:14, Alain Ketterlin wrote:

 Type safety.

Perhaps. Python has strong type safety. It is easier to spoof a type in 
C or C++ than in Python.


Python 3 also has type annotations that can be used to ensure the types 
are correct when we run tests. In a world of consenting adults, I am not 
sure we really need static types compared to duck typing.
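As a sketch of that test-time use (distance is a made-up example;
__annotations__ is the real attribute annotations live in): a test can
read the annotations straight off the function object and check a sample
call against them, no compiler involved.

```python
def distance(x: float, y: float) -> float:
    """Made-up example function with annotated parameters and return."""
    return abs(x - y)

# Annotations are ordinary data hanging off the function object...
hints = distance.__annotations__
assert hints["x"] is float and hints["return"] is float

# ...so a test can verify a sample call against the declared return type.
result = distance(3.0, 1.5)
assert isinstance(result, hints["return"])
assert result == 1.5
```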



(And with it comes better performance ---read battery
 life--- and better static analysis tools, etc.)

Perhaps, perhaps not. My experience is that only a small percentage of 
the CPU time is spent in the Python interpreter.


- The GPU does not care if my OpenGL shaders are submitted from Python 
or C. Nor does any other library or framework. If I use OpenCV to capture 
live video, it could not care less whether I use Python or C. A Cocoa app 
using PyObjC will not use Python to prepare each pixel on the screen. 
Even if the screen is frequently updated, the battery is spent somewhere 
other than in the Python interpreter.


- A GUI program that is mostly idle spends more battery on lighting the 
screen than executing code.


- If I use a 3g connection on my iPad, most of the battery will be spent 
transmitting and receiving data on the mobile network.


- Where is the battery spent if I stream live video? In the Python 
interpreter that executes a few LOC for each frame? I will make the bold 
statement that an equivalent C program would exhaust the battery equally 
fast.


- If a web app seems slow, it is hardly ever due to Python on the 
server side.


- If the response time in a GUI is below the limits of human perception, 
can the user tell my Python program is slower than a C program?



For the rare case where I actually have to run algorithmic code in 
Python, there is always Numba (an LLVM-based JIT compiler) or Cython, 
which can be used to speed things up to C performance once the Python 
prototype works. I rarely need to do this, though.
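The Numba route is a one-decorator change. In this sketch, njit is
Numba's real nopython-mode decorator; the try/except fallback is only
there so the snippet still runs as plain Python when Numba isn't
installed. (harmonic is a made-up example function.)

```python
try:
    from numba import njit          # Numba's nopython-mode JIT decorator
except ImportError:                 # fall back to plain Python if absent
    def njit(func):
        return func

@njit
def harmonic(n):
    """Sum of 1/k for k = 1..n: a tight numeric loop Numba compiles well."""
    total = 0.0
    for k in range(1, n + 1):
        total += 1.0 / k
    return total
```

With Numba present, the first call compiles the loop to machine code via
LLVM; later calls run at roughly C speed without changing the source.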




 LLVM (an Apple-managed
 project) for the middle- and back-end, and a brand new front-end
 incorporating a decent type system (including optional types for
 instance).

Numba uses LLVM.

When I compile Cython modules I use LLVM on this computer.


 Swift's memory management is similar to python's (ref. counting). Which
 makes me think that a subset of python with the same type safety would
 be an instant success.

A Python with static typing would effectively be Cython :)

It is the tool of choice in many scientific Python projects today. Most 
projects affiliated with NumPy and SciPy prefer Cython to C or Fortran 
for new code.



Sturla


--
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-05 Thread Michael Torrie
On 06/05/2014 08:10 AM, Sturla Molden wrote:
 Perhaps, perhaps not. My experience is that only a small percentage of 
 the CPU time is spent in the Python interpreter.

Depends greatly on the type of application.  While it's true that most
apps that aren't CPU bound are idle most of the time, there's more to
the story than that.  A handy utility for analyzing power usage by
applications is Intel's powertop.  It measures things like how many
wakeups a program caused, and which sleep states a CPU is spending time
in.  It's more complicated and nuanced than simply adding up CPU time.

In any case I'm a bit surprised by people comparing Python to Swift at
all, implying that Python would have worked just as well and Apple
should have chosen it to replace Objective C.  Why are we comparing an
interpreter with a compiled language?  Apple's goal is to produce a
language that they can transition from Objective C to, and use to build
apps as well as core system frameworks.  Swift provides a cleaner system
for developers to work in than Obj C did (which, by the way, has
reference counting), but carries on the same object model that
developers are used to (and existing frameworks use).

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: OT: This Swift thing

2014-06-05 Thread Steven D'Aprano
On Thu, 05 Jun 2014 05:56:07 -0700, Rustom Mody wrote:

 On Thursday, June 5, 2014 2:09:34 PM UTC+5:30, Steven D'Aprano wrote:
 On Wed, 04 Jun 2014 22:43:05 -0400, Terry Reedy wrote:
 
  Many mail readers treat \t as a null char since it actually has no
  standard translation into screen space.
 
 I challenge that assertion. There are two standard translations into
 screen space: jump to the next multiple of 8 spaces, or 1 space.
 
 Treating \t as a single space would be pathetic but standard. Treating
 it as (up to) 8 spaces would be more useful, and standard. Rendering it
 as a picture of a banana dancing on the ceiling would be silly and non-
 standard. Not rendering it at all is even more stupid and less
 justified.
 
 A random thread (I guess one can find more):
 
 https://mail.python.org/pipermail/python-list/2012-March/621993.html


I don't understand why you posted this link. I wasn't questioning Terry's 
description of the problem, that his mail client eats tabs. Thunderbird 
doesn't eat tabs for me, but I believe Terry when he says it eats them 
for him. I was questioning his assertion that there is no standard way to 
render a tab character.



-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
-- 
https://mail.python.org/mailman/listinfo/python-list

