Re: [Haskell-cafe] Poll plea: State of GUI graphics libraries in Haskell

2013-10-05 Thread Sven Panne
2013/9/27 Heinrich Apfelmus apfel...@quantentunnel.de:
 Actually, I'm reading about WebGL right now, and it appears to me that it
 should be very easy to support in Threepenny. [...]

I am not sure if WebGL is enough: WebGL is basically OpenGL ES 2.0,
which is again basically OpenGL 2.0 plus some extensions. OpenGL
itself is currently at 4.4, and the situation regarding the supported
shading language versions is even worse. In a nutshell: WebGL =
ancient OpenGL. If it's enough for your purposes, fine, but otherwise
I guess a lot of people want something more recent.


Re: [Haskell-cafe] Poll plea: State of GUI graphics libraries in Haskell

2013-10-05 Thread Sven Panne
2013/9/27 Conal Elliott co...@conal.net:
 [...] Am I mistaken about the current status? I.e., is there a solution for
 Haskell GUI  graphics programming that satisfies the properties I'm looking
 for (cross-platform, easily buildable, GHCi-friendly, and
 OpenGL-compatible)? [...]

Time warp! ;-) Point your browser at the g...@haskell.org archives a
decade ago... I think the consensus at that time was a bit
disappointing: Either one could have something portable but hard to
install and alien-looking, or something non-portable but easy to
install and native-looking. The fundamental UI concepts on the various
platforms differed so much that there was no hope for a grand unified
pretty UI library, so those GUI efforts basically ended. I think the
reasoning behind this hasn't changed recently, but I would love being
proven wrong.

Cheers,
   S.


Re: [Haskell-cafe] Mystery of an Eq instance

2013-09-24 Thread Sven Panne
2013/9/22 Mike Meyer m...@mired.org:
 On Sat, Sep 21, 2013 at 5:28 PM, Bardur Arantsson s...@scientician.net
 wrote:
 Trying to make something whose name is Not A Number act like a
 number sounds broken from the start.

The point here is that IEEE floats are actually more something like a
Maybe Float, with various Nothings, i.e. the infinities and NaNs,
which all propagate in a well-defined way. Basically a monad built
into your CPU's FP unit. ;-)
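
For the record, a quick GHCi session shows that propagation (this is standard
IEEE 754 behaviour, nothing Haskell-specific):

   ghci> let nan = 0/0 :: Double
   ghci> (nan + 1, nan * 0, nan == nan)
   (NaN,NaN,False)
   ghci> let inf = 1/0 :: Double
   ghci> (inf + 1, inf - inf)
   (Infinity,NaN)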

 I just went back through the thread, and the only examples I could
 find where that happened (as opposed to where floating point
 calculations or literals resulted in unexpected values) was with
 NaNs. Just out of curiosity, do you know of any that don't involve
 NaNs?

Well, with IEEE arithmetic almost nothing you learned in school about
math holds anymore. Apart from rounding errors, NaNs and infinities,
-0 is another fun part:

   x * (-1)

is not the same as

   0 - x

(Hint: Try with x == 0 and use recip on the result.)
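
Spelled out (easily checked in GHCi):

   let x = 0 :: Double
   in (recip (x * (-1)), recip (0 - x))   -- (-Infinity, Infinity): -0.0 vs 0.0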

 Float violates the expected behavior of instances of - well, pretty
 much everything it's an instance of. Even if you restrict yourself to
 working with integer values that can be represented as floats.  If
 we're going to start removing it as an instance for violating instance
 expectations, we might as well take it out of the numeric stack (or
 the language) completely.

Exactly, and I am sure 99.999% of all people wouldn't like that
removal. Learn IEEE arithmetic, hate it, and deal with it. Or use
something different, which is probably several orders of magnitude slower. :-/


[Haskell-cafe] ANNOUNCE: New OpenGL packages

2013-09-15 Thread Sven Panne
New versions of the OpenGL packages are available on Hackage:

   * OpenGLRaw 1.4.0.0
   * GLURaw 1.4.0.0
   * OpenGL 2.9.0.0
   * GLUT 2.5.0.0

The mid-term goal is to make all these packages conform to the latest
OpenGL 4.4 specification, and while we're not yet there, this release
is nevertheless an important stepping stone towards that goal. The
packages contain a few non-backwards compatible changes, something
which is never nice for a public API, but it has been necessary:
OpenGL has come a long way from its initial fixed function pipeline to
its current flexible form centered around shaders and buffers.
Because of this, a few design decisions on the Haskell side were not a
good fit anymore and simply had to change. Nevertheless, almost all
changes needed in the applications and libraries using the OpenGL
packages should be mechanical and straightforward. If not:
hope...@haskell.org is the place to get help if needed.

Hopefully the new packages will make it into the next Haskell Platform
release (2013.4.0.0), at least if I find out how to make it through
the proposal process... ;-)

Cheers,
   S.

P.S.: Here is a list of the changes for each package:

==
Changes in the OpenGLRaw package
==

* Added support for the following extensions:

 GL_ARB_occlusion_query2
 GL_ARB_timer_query
 GL_ARB_draw_indirect
 GL_ARB_gpu_shader5
 GL_ARB_tessellation_shader
 GL_ARB_transform_feedback3
 GL_ARB_ES2_compatibility
 GL_ARB_get_program_binary
 GL_ARB_separate_shader_objects
 GL_ARB_shader_atomic_counters
 GL_ARB_compute_shader
 GL_ARB_ES3_compatibility
 GL_ARB_framebuffer_no_attachments
 GL_ARB_shader_storage_buffer_object
 GL_ARB_query_buffer_object

* Added GLfixed type from OpenGL 4.1.

* Moved GLhandle type to
Graphics.Rendering.OpenGL.Raw.ARB.ShaderObjects where it belongs and
fixed its representation on Mac OS X.

* Added new Graphics.Rendering.OpenGL.Raw.Type module which exports
all GL types. Core31 and Core32 export only their respective subset
now.

* Correctly typed bitfield tokens as, well, GLbitfield instead of GLenum.

* Consistently use ‘Ptr a’ for ‘void*’ which are not opaque.

* Use ccall instead of stdcall on x86_64-windows.

* Use the OpenGLES framework on iOS.

==
Changes in the GLURaw package
==

* Use ccall instead of stdcall on x86_64-windows.

* Use the OpenGLES framework on iOS.

==
Changes in the OpenGL package
==

* Added sync object support.

* Added full support for OpenGL 4.4 query objects, extending and
changing the previous query object API a bit.

* Split ObjectName class into ObjectName + GeneratableObjectName
classes. Added single-name variants deleteObjectName and
genObjectName, as they are a very common use case (see the usage sketch
after this change list).

* Made BufferObject and TextureObject abstract. Now all GL objects
names are abstract and have to be explicitly generated. The only
exception is DisplayList, which is required to be non-abstract by the
spec.

* Shader is not a class anymore, but a data type with an ObjectName
instance and a creation action. Added ShaderType and shaderType.

* Added partial support for tessellation/geometry/compute shaders.

* Program is not a GeneratableObjectName, but has a createProgram
action now. Added attachShader and detachShader for incremental
changes.

* Deprecated shaderSource and added shaderSourceBS instead. Using
ByteString is more efficient and forces the caller to think about
encodings, e.g. via Data.Text.Encoding.

* Added support for shader/program binaries and shader precision queries.

* Revamped texture targets, adding partial support for texture buffer
objects and texture arrays.

* OpenGL 3.1 deprecated separate polygon draw modes, so use
GL_FRONT_AND_BACK internally in polygonMode whenever possible.

* Added missing Eq/Ord/Show instances for lots of data types.

* Simplified TransformFeedbackVaryings API, making it a bit more
similar to the one for activeAttribs and activeUniforms.

* Exported ContextProfile.

* Renamed BlitFramebufferMask (to BlitBuffer) and its constructors.

* Renamed BufferRangeAccessBit (to MapBufferUsage) and its constructors

* Removed currentMatrix, genLists, deleteLists and isList, they have
been deprecated for ages.

* Full internal support for UTF-8.

* Do not expose internal #hidden modules.

* Lots of Haddock fixes and improvements.

* Renamed IndexedTransformFeedbackBuffer to IndexedTransformFeedbackBuffer.

* Fixed clip plane query.
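
To give a rough idea of how some of the items above (the ObjectName split, the
new shader creation action and shaderSourceBS) fit together, here is an
untested usage sketch; treat the exact signatures as assumptions based on this
announcement:

   import Graphics.Rendering.OpenGL
   import qualified Data.ByteString.Char8 as BS

   -- genObjectName is the new single-name variant of genObjectNames.
   makeBuffer :: IO BufferObject
   makeBuffer = genObjectName

   -- Shaders are created via an explicit creation action now and receive
   -- their source as a ByteString via shaderSourceBS.
   makeTrivialVertexShader :: IO Shader
   makeTrivialVertexShader = do
      sh <- createShader VertexShader
      shaderSourceBS sh $= BS.pack "void main() { gl_Position = vec4(0.0); }"
      compileShader sh
      return sh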

==
Changes in the GLUT package

Re: [Haskell-cafe] Renumbered mailing list posts

2013-08-12 Thread Sven Panne
2013/8/12 Joachim Breitner m...@joachim-breitner.de:
 happens with mailman/pipermail occasionally.

o_O That's news to me... Why/how does this happen? This sounds like a
serious bug to me, the URLs should really, really be stable to be of
any use.

 It is more reliable to link to message ids, e.g. via gmane: [...]

That hint doesn't really help to unbreak the Haskell wiki, the various
trac instances, my mailbox, etc. :-(



[Haskell-cafe] Re: [Haskell] ANNOUNCE: GLUT 2.2.1.0

2009-08-17 Thread Sven Panne
Am Sonntag, 16. August 2009 22:10:23 schrieb Rafael Gustavo da Cunha Pereira 
Pinto:
 BTW, as an enhancement for 2.2.2.0, you could treat unnamed mouse buttons.
 Mice with more axes and more buttons are becoming increasingly common,
 and unmarshalMouseButton is not prepared to accept them!!

 Here are exceptions I caught, playing with my Genius Traveler 515 mouse:

 unmarshalMouseButton: illegal value 5
 unmarshalMouseButton: illegal value 6
 unmarshalMouseButton: illegal value 7
 unmarshalMouseButton: illegal value 8

Good point, I had similar reports already, but I simply forgot to handle this 
in yesterday's release. The right way to handle this would probably be 
extending the MouseButton data type with an 'AdditionalButton Int' constructor 
and simply pass the unknown button numbers via this case. I am not so sure 
about a nice name for this constructor: AdditionalButton? GenericButton? Or 
simply MouseButton, just like the type itself?

Cheers,
   S.



Re: [Haskell-cafe] Data.Binary and little endian encoding

2009-05-18 Thread Sven Panne
Am Sonntag, 17. Mai 2009 15:08:29 schrieb Don Stewart:
 Sven.Panne:
  [...]
  I think most problems can be fixed in a rather pragmatic way by adding a
  few functions to the binary package:
 [...]
 Patches are welcome.

Attached. A few remarks:

 * This is only a quick and mildly tested implementation of the IEEE 
functions; in particular, NaNs, infinities and denormalized numbers are untested. 
These problems could be avoided entirely if we could coerce representations 
directly, changing only their interpretation.

 * The *host functions assume an IEEE platform, but this can easily be changed 
(see comments).

 * Perhaps one can use unsafeCoerce for word32ToFloat and friends, but I 
haven't checked this.

 * I've seen a few {- INLINE -} comments. Is this really wanted or only a 
typo?

 * A comment about using peek/poke for the *le/*be functions is wrong, because 
this would introduce alignment constraints on some platforms.

I think the main point is to provide a nice and efficient API, hiding all the 
dirty stuff in the implementation.

  One final remark: I think the low level functions of the binary package
  should really keep the notions of endianness and alignment constraints
  separate, something which isn't done currently: The *host functions have
  alignment restrictions, the *be/*le functions don't. There is no good
  reason for this non-orthogonality.

 That seems reasonable.

There are various ways to achieve this, but the most obvious way leads to a 
combinatorial explosion of functions:

   no. of types * 3 (LE/BE/host) * 2 (aligned/unaligned)

Furthermore, it would be good to first split the binary package into the 2 
layers already discussed; then it is perhaps a bit clearer what a nice API 
would look like. I think it would be best to shift this API design discussion 
to the libraries list.

Cheers,
   S.

Only in binary-0.5.0.1: dist
diff -r -u binary-0.5.0.1.orig/src/Data/Binary/Builder.hs binary-0.5.0.1/src/Data/Binary/Builder.hs
--- binary-0.5.0.1.orig/src/Data/Binary/Builder.hs	Sat Mar  7 23:59:44 2009
+++ binary-0.5.0.1/src/Data/Binary/Builder.hs	Mon May 18 17:36:22 2009
@@ -41,20 +41,27 @@
 , putWord16be   -- :: Word16 -> Builder
 , putWord32be   -- :: Word32 -> Builder
 , putWord64be   -- :: Word64 -> Builder
+, putFloatIEEEbe-- :: Float -> Builder
+, putDoubleIEEEbe   -- :: Double -> Builder
 
 -- ** Little-endian writes
 , putWord16le   -- :: Word16 -> Builder
 , putWord32le   -- :: Word32 -> Builder
 , putWord64le   -- :: Word64 -> Builder
+, putFloatIEEEle-- :: Float -> Builder
+, putDoubleIEEEle   -- :: Double -> Builder
 
 -- ** Host-endian, unaligned writes
 , putWordhost   -- :: Word -> Builder
 , putWord16host -- :: Word16 -> Builder
 , putWord32host -- :: Word32 -> Builder
 , putWord64host -- :: Word64 -> Builder
+, putFloatIEEEhost  -- :: Float -> Builder
+, putDoubleIEEEhost -- :: Double -> Builder
 
   ) where
 
+import Prelude hiding (significand, exponent)
 import Foreign
 import Data.Monoid
 import Data.Word
@@ -360,6 +367,60 @@
 -- on a little endian machine:
 -- putWord64le w64 = writeN 8 (\p -> poke (castPtr p) w64)
 
+-- | Write a Float in IEEE big endian format
+putFloatIEEEbe :: Float -> Builder
+putFloatIEEEbe = putWord32be . floatToWord32
+{-# INLINE putFloatIEEEbe #-}
+
+-- | Write a Double in IEEE big endian format
+putDoubleIEEEbe :: Double -> Builder
+putDoubleIEEEbe = putWord64be . doubleToWord64
+{-# INLINE putDoubleIEEEbe #-}
+
+-- | Write a Float in IEEE little endian format
+putFloatIEEEle :: Float -> Builder
+putFloatIEEEle = putWord32le . floatToWord32
+{-# INLINE putFloatIEEEle #-}
+
+-- | Write a Double in IEEE little endian format
+putDoubleIEEEle :: Double -> Builder
+putDoubleIEEEle = putWord64le . doubleToWord64
+{-# INLINE putDoubleIEEEle #-}
+
+floatToWord32 :: Float -> Word32
+-- floatToWord32 = unsafeReinterpret
+floatToWord32 = encodeIEEE 8 23
+
+doubleToWord64 :: Double -> Word64
+-- doubleToWord64 = unsafeReinterpret
+doubleToWord64 = encodeIEEE 11 52
+
+-- TODO: Check if this works for denormalized numbers, NaNs and infinities.
+encodeIEEE :: (RealFloat a, Bits b, Integral b) => Int -> Int -> a -> b
+encodeIEEE exponentBits significandBits f =
+  (signBit `shiftL` (exponentBits + significandBits)) .|.
+  (exponentField `shiftL` significandBits) .|.
+  significandField
+   where (significand, exponent) = decodeFloat f
+
+         signBit | significand < 0 = 1
+                 | otherwise = 0
+         exponentField | significand == 0 && exponent == 0 = 0
+                       | otherwise = fromIntegral exponent + exponentBias + fromIntegral significandBits
+         significandField = fromIntegral (abs significand) .&. significandMask
+
+         exponentBias = bit (exponentBits - 1) - 1
+         significandMask = bit significandBits - 1
+
+{-
+-- Evil! Poor man's version 

Re: [Haskell-cafe] Linkage errors in scenegraph

2009-05-17 Thread Sven Panne
Am Sonntag, 17. Mai 2009 01:07:55 schrieb Gregory D. Weber:
 I'd like to get the scenegraph package
 (http://hackage.haskell.org/cgi-bin/hackage-scripts/package/scenegraph)
 to work, but am encountering linkage errors.
 [...]
 Also, I notice that in the cabal file for scenegraph, the
 list of exposed modules

 Exposed-Modules: Graphics.SceneGraph,
   Graphics.SceneGraph.Basic,
 Graphics.SceneGraph.Vector,
 Graphics.SceneGraph.Render,
 Graphics.SceneGraph.SimpleViewport,
   Graphics.SceneGraph.GraphViz,
 Graphics.SceneGraph.Library,
 Graphics.SceneGraph.Dump,
 Graphics.SceneGraph.Textures

 does not include Graphics.SceneGraph.Matrix, but that should only mean
 that I can't call functions of that module directly -- not that the
 other SceneGraph modules can't call them -- right? [...]

That basically means that the scenegraph package is broken. ;-) Internal 
modules have to be listed in other-modules:, a section the Cabal file 
doesn't contain. As a quick fix, you can add all missing modules in this 
section, but this should of course be fixed in the official package, too.

http://www.haskell.org/cabal/release/cabal-latest/doc/users-guide/authors.html#buildinfo

Cheers,
   S.



Re: [Haskell-cafe] Data.Binary and little endian encoding

2009-05-17 Thread Sven Panne
Am Freitag, 15. Mai 2009 06:37:22 schrieb Don Stewart:
 timd:
  On a related matter, I am using Data.Binary to serialise data from
  haskell for use from other languages. [...]
 [...]
 Yep, it's possible, just not portably so. Google for Data.Binary IEEE
 discussions.

I think this topic pops up over and over again, and the proposed solutions 
are no solutions at all, neither from a performance point of view, nor from an 
ease of use point of view. Proposing insane bit fiddling by hand when all one 
technically needs is often a peek or poke amounts to simply ignoring an 
API problem. ;-)

I think most problems can be fixed in a rather pragmatic way by adding a few 
functions to the binary package:

Add to Data.Binary.Builder:

   putFloatIEEEbe :: Float -> Builder
   putDoubleIEEEbe :: Double -> Builder
   putFloatIEEEle :: Float -> Builder
   putDoubleIEEEle :: Double -> Builder
   putFloatIEEEhost :: Float -> Builder
   putDoubleIEEEhost :: Double -> Builder

Add to Data.Binary.Get:

   getFloatIEEEbe :: Get Float
   getDoubleIEEEbe :: Get Double
   getFloatIEEEle :: Get Float
   getDoubleIEEEle :: Get Double
   getFloatIEEEhost :: Get Float
   getDoubleIEEEhost :: Get Double

Add to Data.Binary.Put:

   putFloatIEEEbe :: Float -> Put
   putDoubleIEEEbe :: Double -> Put
   putFloatIEEEle :: Float -> Put
   putDoubleIEEEle :: Double -> Put
   putFloatIEEEhost :: Float -> Put
   putDoubleIEEEhost :: Double -> Put

The *host functions are basically peek/poke for most platforms. The *le/*be 
functions can use peek/poke if the endianness matches (compile time decision) 
*and* the alignment is OK for the given platform (runtime decision). Non-IEEE 
platforms always have to do the bit fiddling internally, but all this is 
hidden behind the above API.
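
To make the intended usage concrete, here is a rough sketch in terms of the 
proposed functions (they do not exist yet, this is exactly what the proposal 
would add on top of the existing runPut/runGet):

   import Data.Binary.Put (runPut)
   import Data.Binary.Get (runGet)
   import qualified Data.ByteString.Lazy as L

   -- 12 bytes: an IEEE single in little endian followed by an IEEE double
   -- in big endian, independent of the host's endianness and alignment.
   encodeSample :: L.ByteString
   encodeSample = runPut $ do
      putFloatIEEEle 3.14
      putDoubleIEEEbe 2.718281828

   decodeSample :: L.ByteString -> (Float, Double)
   decodeSample = runGet $ do
      f <- getFloatIEEEle
      d <- getDoubleIEEEbe
      return (f, d)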

IIRC I have proposed something similar 1-2 years ago, but I can't remember any 
reason why this hasn't been implemented. Any comments on the above functions?

One final remark: I think the low level functions of the binary package 
should really keep the notions of endianness and alignment constraints 
separate, something which isn't done currently: The *host functions have 
alignment restrictions, the *be/*le functions don't. There is no good reason 
for this non-orthogonality.

Cheers,
   S.



Re: [Haskell-cafe] Decoupling OpenAL/ALUT packages from OpenGL

2009-05-16 Thread Sven Panne
Am Montag, 11. Mai 2009 12:04:07 schrieb Neil Brown:
 [...] So possible additions to your type-class list are Foldable and maybe
 Traversable (no harm, although I'd have to reach further for an example
 for this).  I guess the tricky decision might be whether to provide a
 Num instance  (again, probably more suitable for Vector2)? [...]

OK, I've added a bunch of instances for all vertex attribute types in the 
OpenGL 2.2.3.0 package. Let me know if there are standard classes for which 
you would like to see instances, too.

I've deliberately omitted instances for Num, because they are not correct from 
a mathematical point of view: You can't e.g. add two points (only a point and 
a vector), the difference between two points is not a point (it's a vector), 
etc.

Cheers,
   S.



Re: [Haskell-cafe] Decoupling OpenAL/ALUT packages from OpenGL

2009-05-10 Thread Sven Panne
Am Montag, 4. Mai 2009 13:33:33 schrieb David Duke:
 Decoupling basic primitives for geometric modelling from OpenGL would be
 useful. [...]
 Even just data constructors and instances of these within Functor and
 Applicative are a useful starting point. [...]

I've taken a closer look at the available packages for vector math/linear 
algebra. They differ in a lot of respects, starting from their representations 
of vectors and matrices, use of the type system and its extensions, 
strictness, structure of their type classes, etc.

This leads me to the conclusion that I should only lift the data types for 
vectors and matrices out of the OpenGL package, including only instances for 
standard type classes like Eq, Ord, Functor, etc. This means that the new 
package will *not* include type classes for things like scalars, vector 
spaces, etc. These can be defined by the other packages in their own type 
class language. I really fail to see a common ground in this respect, even 
for basic things: Keeping things H98-compliant is a must for me, so putting 
things like fundeps or associated types in this new package is a no-go for me. 
Nevertheless, having a common set of (strict) data types for vector math will 
probably be very useful, even if it won't fulfill everybody's needs.

What standard instances should be defined for those vectors and matrices? 
Things coming to mind are Eq, Ord, Show, Storable, Typeable1, Functor and 
Applicative. Have I missed some type classes?

Regarding Functor/Applicative: The obvious instances for e.g. a 2-dimensional 
vertex are:

   data Vertex2 a = Vertex2 a a

   instance Functor Vertex2 where
      fmap f (Vertex2 x y) = Vertex2 (f x) (f y)

   instance Applicative Vertex2 where
      pure a = Vertex2 a a
      Vertex2 f g <*> Vertex2 x y = Vertex2 (f x) (g y)

They fulfill all required laws, but are these the only possible instances? If 
not, are they at least the most canonical ones in a given sense? And 
finally: Does somebody have a real-world example where the Applicative 
instance is useful? Usages of the Functor instance are much more obvious for 
me.
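
For illustration, the Applicative instance at least buys componentwise lifting 
for free, which is handy enough in practice (a sketch reusing the definitions 
above):

   import Control.Applicative (liftA2)

   addV :: Num a => Vertex2 a -> Vertex2 a -> Vertex2 a
   addV = liftA2 (+)

   scaleV :: Num a => a -> Vertex2 a -> Vertex2 a
   scaleV s = fmap (s *)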

Cheers,
   S.




Re: [Haskell-cafe] Decoupling OpenAL/ALUT packages from OpenGL

2009-05-03 Thread Sven Panne
Am Sonntag, 3. Mai 2009 00:56:00 schrieb Tillmann Vogt:
 Sven Panne schrieb:
 * a tiny ObjectName package, consisting only of OpenGL's ObjectName
  class (In Data.ObjectName? I'm not very sure about a good place in the
  hierarchy here.)

 How about Data.GraphicsObjects ? [...]

Thanks for all the suggestions so far, a few remarks from my side (I just 
picked the last mail for the reply, no strong reason for this...):

Data.GraphicsObjects is a bit misleading, because OpenAL's Buffers and 
Sources are instances, too, and they have nothing to do with graphics. 
Instances of ObjectName are just opaque resources from an external API, which 
you have to allocate and deallocate explicitly.

 I think it would be nice to have data types and functions for dot
 product, scalar product, norms, ...
 together with HOpenGL types.

I fear that this might open a can of worms and could lead to even longer 
discussions than the ones about a collection framework. The design space for a 
vector math package is quite large, and I fear that e.g. a mathematician 
trying to implement some linear algebra package has vastly different 
requirements than somebody trying to implement the n-th incarnation of the 
Quake engine. Some points to consider:

   * Should the components of vectors etc. be strict? In OpenGL they are, and 
Data.Complex is similar in this respect. In my experience non-strict 
components lead to space leaks too easily, and I guess this is the rationale 
behind Complex, too. But non-strict components have some benefits, too, of 
course, especially if you do symbolic computation.

   * Should we abstract over the number of dimension of vectors, etc.? If yes, 
how strong can our compile-time type checking be?

   * If we really make this a more general vector math package, things like 
colors etc. should probably stay in the OpenGL package. But there are a few 
other packages needing color data types, too...

   * If the package is not general enough, it might be a bad idea to steal 
names/hierarchy locations which *are* general.

Nevertheless, I'd be happy to see some proposals for a sensible, compact 
vector math package. Probably we can fulfill most of the common use cases with 
something simple.

And one word about lumping the 3 packages together: Any function, module, and 
package should have *one* clearly defined task, this is crucial for every SW 
design. I would have a hard time explaining what this super package is all 
about, even if we throw only 2 of the 3 packages together. Personally I feel 
that this is a strong argument for 3 separate packages.

 (I know that glu has tessellation). [...]

But GLU is basically dead with OpenGL 3.x. :-)

Cheers,
   S.



Re: [Haskell-cafe] ANN: Silkworm game

2009-05-03 Thread Sven Panne
Nice work! Two minor suggestions, apart from the paths issue already 
discussed here:

   * Either include a license file in the source distribution or remove the 
corresponding line in the .cabal file. Cabal won't work if it is specified and 
missing.

   * List all your build dependencies directly, so users can simply do a 
cabal install, pulling all missing dependencies automatically. No need for 
Makefiles or a long description in the README anymore...

Patch for those items attached.

As a side note, I get a very bad feeling when Hipmunk gets compiled on my 
x86_64 box:

chipmunk/cpCollision.c: In function ‘findVerts’:

chipmunk/cpCollision.c:174:0:
 warning: cast from pointer to integer of different size

chipmunk/cpCollision.c:180:0:
 warning: cast from pointer to integer of different size
chipmunk/cpCollision.c: In function ‘findPointsBehindSeg’:

chipmunk/cpCollision.c:233:0:
 warning: cast from pointer to integer of different size
chipmunk/cpCollision.c: In function ‘seg2poly’:

chipmunk/cpCollision.c:274:0:
 warning: cast from pointer to integer of different size

chipmunk/cpCollision.c:276:0:
 warning: cast from pointer to integer of different size
chipmunk/cpSpace.c: In function ‘queryFunc’:

chipmunk/cpSpace.c:411:0:
 warning: cast from pointer to integer of different size

chipmunk/cpSpace.c:411:0:
 warning: cast from pointer to integer of different size

This can't be correct, but I'll probably have to take a look at that. Or is it 
a known bug that Hipmunk is not 64-bit clean?

Cheers,
   S.



--- Silkworm.cabal.orig	2009-04-14 00:33:14.0 +0200
+++ Silkworm.cabal	2009-05-03 13:48:05.0 +0200
@@ -2,7 +2,6 @@
 Version: 0.2
 Description: 2D game based on the Hipmunk physics engine bindings.
 License: LGPL
-License-file:LICENSE
 Author:  Duane Johnson
 Maintainer:  duane.john...@gmail.com
 Build-Type:  Simple
@@ -10,5 +9,5 @@
 
 Executable haq
   Main-is:   main.hs
-  Build-Depends: base
+  Build-Depends: base, array, containers, directory, random, pngload, Hipmunk, GLFW, OpenGL
 


[Haskell-cafe] Decoupling OpenAL/ALUT packages from OpenGL

2009-05-02 Thread Sven Panne
I'd like to get some feedback from the Haskell community about some packaging 
issues, so here is my problem: As a medium-term goal, I'd like to decouple the 
OpenAL/ALUT packages from the OpenGL package, because there are very sensible 
use cases where you might need some sound, but not OpenGL. The current 
coupling has mostly historic reasons.

The OpenAL package depends on the OpenGL package in 3 areas:

   * OpenAL uses OpenGL's notion of StateVars all over the place.

   * OpenAL's Buffer and Source are instances of OpenGL's ObjectName class.

   * OpenAL's sources and listeners have some properties like velocity, 
orientation, direction and position which are modeled by OpenGL's Vector3 and 
Vertex3 data types.

The ALUT package depends on the OpenGL package because of GettableStateVars.

The packages are supposed to fit nicely together, so using the same types is a 
must, but the actual packaging is not nice. So the obvious idea is to 
introduce 3 new packages which lift out functionality from the OpenGL package:

   * a small StateVar package, consisting only of OpenGL's StateVar module 
(in a better place in the name hierarchy, of course, perhaps Data.StateVar?)

   * a tiny ObjectName package, consisting only of OpenGL's ObjectName class 
(In Data.ObjectName? I'm not very sure about a good place in the hierarchy 
here.)

   * a package containing most of the data types/newtypes in OpenGL's 
VertexSpec module (Vertex{2,3,4}, Color{3,4}, etc.) plus instances for the 
base classes like Eq, Ord, Show, etc. I really don't know a good name for this 
package and what a good place in the hierarchy would be (probably something 
like Data.Foo, but what is Foo?)

The point is: The first two package would be very small. Would this be OK? 
Does anybody see other alternatives? What would be good names for those 
packages and where in the naming hierarchy should they really go?

Cheers,
   S.


Re: [Haskell-cafe] How to install HOpenGL to Windows?

2009-04-30 Thread Sven Panne
Am Mittwoch, 29. April 2009 11:25:31 schrieb Duncan Coutts:
 On Mon, 2009-04-27 at 19:03 +0200, Sven Panne wrote:
  [...]
  As usual, things are always a bit trickier than they appear initially: On
  non- Windows systems it is not always sufficient to link against libGL
  alone, sometimes you'll have to link against several X11 libs, too. I am
  not sure if this is still a widespread problem, but in the past it was.

 Right. It's still possible to use custom code in Setup.hs to test these
 kinds of things. It's a bit less easy however.

That's why the autoconf macros are so tricky. Re-inventing the wheel in 
Haskell is not something I'd like to do. Note: I see autoconf as a necessary 
evil, not as a glorious tool. The predefined autoconf macros contain man years 
(if not man decades) of sweat and tears from people trying to make their SW 
portable. If you only know 1 or 2 platforms, everything looks easy, but this 
is not the case at all. Good luck for everybody trying to ignore that 
accumulated knowledge...

 I didn't know that there was any working GHC for Cygwin. Or do you mean
 building a non-cygwin lib but under the cygwin shell? [...]

I don't know if GHC currently builds under Cygwin, but in former times it was 
*the* way (read: one and only way) to build it on Windows.

  On *nices you look into /usr/include and /usr/local/include, and
  that's it, unless the user tells you something different. And Apple is
  always a very creative company, so they decided to put *their* OpenGL
  headers in a completely different path where no -I flag can help...

 But you have some way of finding them right? Even if it's platform
 dependent. We can do the same in the .cabal file or the Setup.hs.
 There's also a Cabal flag users can pass to tell us about extra lib and
 include dirs.

I even have a standard, well-documented, platform-independent way: Tell 
configure about it via the environment variable CPPFLAGS for headers and 
LDFLAGS for libraries. Cabal is buggy in this respect, IIRC, it passes no or 
incorrectly named variables to configure.

 Absolutely, finding headers is important. Cabal now checks at configure
 time that all header files and libs listed in the .cabal file can
 actually be found.

I don't know what Cabal does internally, but simply checking for the existence 
of a header file or a library file is *far* too naive, look into the autoconf 
docs and/or mailing lists to see why. You actually have to compile for header 
checks and link for library checks, everything else is plainly wrong, because 
you'll probably miss dependencies, pick up the wrong stuff, etc.

 One suggestion I've seen is just to improve the ffi pre-processors. The
 c2hs tool is in a position to discover if the calling convention is
 stdcall or ccall so it could generate the foreign imports correctly. [...]

This wouldn't help much: The foreign imports in OpenGL are written by hand, 
and when the OpenGL ARB ever manages to release some consistent, machine-
usable and complete API description (the current .spec files fail in almost 
all these aspects), they will be generated from that, not from C headers, 
because, not surprisingly, the latter lack a lot of information.

 In practise I've not found that most configure scripts actually do
 feature based tests. There's some, but half of it degenerates into if
 we've not got the OSX framework for this then do that. I mean they just
 end up doing platform-specific conditionals. [...]

This is often correct for hobby projects of inexperienced people, but not 
necessarily for larger or more mature projects. And in a (very) few cases, 
platform-specific conditionals are even the right thing to do, but not in the 
vast majority of cases, of course.

And this has nothing to say about autoconf itself: Seeing e.g. ugly Haskell 
code is not a reason to condemn Haskell itself, either.

 Anyway, so that's why I'd like us to look in detail at what features we
 need in Cabal to let us switch most packages from using ./configure
 scripts to using Setup.hs scripts.

As I said in another email: I'll happily review and accept patches for this, 
but I won't spend any effort on re-inventing the autoconf wheel in Haskell.

Cheers,
   S.


Re: [Haskell-cafe] How to install HOpenGL to Windows?

2009-04-27 Thread Sven Panne
Am Montag, 27. April 2009 00:11:20 schrieb Duncan Coutts:
 On Sun, 2009-04-26 at 19:03 +0200, Sven Panne wrote:
 [...]
 * How to link programs using OpenGL

 This is because the GL libs are called different names on different
 platforms right? But they're consistent within each platform, it's just
 Windows vs everyone else isn't it?

 How about:

 if os(windows)
   extra-libraries: gl32
 else
   extra-libraries: GL

As usual, things are always a bit trickier than they appear initially: On non-
Windows systems it is not always sufficient to link against libGL alone, 
sometimes you'll have to link against several X11 libs, too. I am not sure if 
this is still a widespread problem, but in the past it was. Hopefully most 
*nices get their dynamic library dependencies right nowadays... :-P Windows 
is, as always, a special beast, especially when you take Cygwin into account: 
On Cygwin you can either build against the native OpenGL or against Cygwin's 
X11 OpenGL. This can be configured via --without-x. How can we do this in 
.cabal files? And MacOS had some peculiarities which I can't fully remember 
anymore, too.

 * The Haskell types corresponding to the OpenGL types

 Does hsc2hs #type not help us with this? [...]

I am not sure, because I haven't had a look at hsc2hs for a long time, and at 
the beginning of the OpenGL binding there were no such tools, not even the FFI 
in its current form. Perhaps I'll have a look at this, but to make this work, 
I am sure that we'll have to solve the next item:


 * To do the former: How to find the OpenGL C headers

 What's needed here? Are they not in standard locations? Cabal has
 support for users to specify non-standard locations.

What is a standard location for headers on Windows? There is no such 
concept. On *nices you look into /usr/include and /usr/local/include, and 
that's it, unless the user tells you something different. And Apple is always 
a very creative company, so they decided to put *their* OpenGL headers in a 
completely different path where no -I flag can help...

Having access to the OpenGL headers is crucial for finding out which C types 
are behind OpenGL types like GLint, GLenum, ... The OpenGL spec only specifies 
minimum requirements for these types and *not* their C mapping.

 * The library calling convention

 This is stdcall on Windows and ccall everywhere else right?

 How about:

 if os(windows)
   cpp-options: -DCALLCONV=stdcall
 else
   cpp-options: -DCALLCONV=ccall

This should be fine, at least when we solve the Cygwin problem discussed 
above: The X11 OpenGL libraries on Windows do *not* use stdcall, only the 
native OpenGL libraries. (The whole calling convention story on Windows 
really, really sucks, IMHO...) Using CPP for this simple task doesn't look 
right, but with the current FFI I don't see a way around this, which is a 
shame. Any ideas/proposals for an FFI change?
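
For illustration, this is roughly how such a CALLCONV macro ends up being used 
in a binding (a sketch; the concrete entry point is just an example):

   {-# LANGUAGE CPP, ForeignFunctionInterface #-}
   -- CALLCONV is supplied via cpp-options as sketched above, e.g.
   -- -DCALLCONV=stdcall on Windows and -DCALLCONV=ccall elsewhere.
   foreign import CALLCONV unsafe "glFlush"
      glFlush :: IO ()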

 * How to load OpenGL extensions

 I don't know enough of the details here to comment.

You'll have to know if wglGetProcAddress, NSAddressOfSymbol (+ a bit more 
code) or some variant of glXGetProcAddress has to be used, plus their 
necessary headers. This is probably doable via some platform switches in 
Cabal, too.

A few general remarks:

 * Ignoring the (usual) Windows trouble, I like autoconf's approach of testing 
features instead of relying on extremely fragile platform conditionals. The 
latter are a constant source of troubles and from a SW engineering point of 
view, they are a clear step backwards. The "I know my platform" approach which 
you basically propose reminds me of the xmkmf hell from the last millennium: 
If X11 didn't know your platform, you had a *lot* of fun getting the platform 
description right.

 * We can go this autoconf-free route, but this is a part of my bindings which 
I definitely won't maintain in full time. I'll be happy to review and accept 
patches, but making things work on Windows, Cygwin, Linux, *BSD, MacOS X, ... 
is a lot of work, which is a fact I have painfully learned in the past few 
years. The autoconf code may be ugly, but the tons of platform differences are 
ugly, too. I want to work on the binding itself mainly, not on the build 
system, which currently works. To repeat myself: Patches are happily accepted, 
but I propose incremental changes and heavy testing on various platforms.

 * Most of the tasks my autoconf scripts perform are not specific to OpenGL at 
all. I guess that most bindings for non-trivial C libraries face the same 
challenges (Where are my libs? Where are my headers? What are my types?) 
Having good Cabal support for these would be very nice.

Cheers,
   S.



Re: [Haskell-cafe] How to install HOpenGL to Windows?

2009-04-26 Thread Sven Panne
Am Donnerstag, 9. April 2009 00:28:35 schrieb Peter Verswyvelen:
 Yes I totally agree that it is overkill. Ideally I would like every package
 to install on Windows without requiring MinGW. But I was just explaining
 the situation as it is right now.

Well, I don't like using autoconf, either, but currently I don't see an 
alternative. If you look at the configure.ac of the OpenGL package, you can 
see that autoconf has to figure out the following:

   * How to link programs using OpenGL

   * The Haskell types corresponding to the OpenGL types

   * To do the former: How to find the OpenGL C headers

   * The library calling convention

   * How to load OpenGL extensions

The build process in itself is purely Cabal-based, it is only the 
configuration above which is done via autoconf. So in theory you could write the 
few output files of the configure run by hand and then use Cabal, without any 
MinGW/MSYS or cygwin.

I don't see how Cabal could help here even further, the tests above are quite 
OpenGL-specific, but I am open for suggestions.

Cheers,
  S.


Re: [Haskell-cafe] Linker Errors For OpenGL / GLUT 'Hello World' Program.

2008-10-09 Thread Sven Panne
On Saturday 20 September 2008 19:13:43 Donnie Jones wrote:
 [...]
  checking GL/gl.h usability... yes
  checking GL/gl.h presence... yes
  checking for GL/gl.h... yes
  checking OpenGL/gl.h usability... no
  checking OpenGL/gl.h presence... no
  checking for OpenGL/gl.h... no
  checking GL/glu.h usability... yes
  checking GL/glu.h presence... yes
  checking for GL/glu.h... yes
  checking OpenGL/glu.h usability... no
  checking OpenGL/glu.h presence... no
  checking for OpenGL/glu.h... no
 
  That looks like to me that the gl.h and glu.h header files were found and
  are usable (in some cases).  I am able to build

That's correct, but that's a transcript of configure in the OpenGL package, 
not the GLUT package. The OpenGL package contains the bindings for the GL and 
GLU libraries, the GLUT package contains the binding for the, well, GLUT 
library.

 [...]
 ### Relevant lines that include -lGL or -lGLU in config.log ###
 [...]

This test is again OK, it first checks if we have to link against glu32 (for 
Windows) or GLU (for the rest of the world) to use GLU features. But again, 
this excerpt is from the config.log of the OpenGL package, not the GLUT 
package.

 It seems like the OpenGL and GLUT libraries are found (after -lglu32 fails,
 I am using Debian Linux).  I am not sure what to try now.

What you have posted so far indicates that the GL and GLU headers and 
libraries have been found, but for GLUT only the header has been found. Could 
you please post the relevant part of the config.log in the GLUT package, i.e. 
the part for "checking for GLUT library... no"? Where is the GLUT library on 
your system located?

Does anybody else have a similar problem?

Cheers,
   S.



Re: [Haskell-cafe] Win32 Open GL / Glut Applications

2007-09-23 Thread Sven Panne
On Friday 21 September 2007 20:19, Ronald Guida wrote:
 John Wicket wrote:
   yea, that is probably what I need.  Can you post in a step-by-step way.

 Here is a set of instructions for what I had to do to get FreeGLUT
 working with GHCi [...].

Oh dear, a long and sad story... :-(

 [...]  Although I just don't understand why freeglut, the
 Haskell GLUT library, and GHCi won't work together in the first place.

That statement is not correct, they *do* work together. The problem you are 
experiencing is that the GLUT version used to build the GHC installer/binary 
distro is obviously not freeglut, but classic GLUT. As long as you only 
use classic GLUT features, this is OK. Things get really hairy when you 
want to use freeglut-only features and still have a GHC installer/binary 
distro which is guaranteed to run with classic GLUT as well (with 
restricted features in the latter case, of course). To do this properly, the 
GLUT package has to resolve freeglut-only API entries dynamically, but 
glutGetProcAddress is not contained in lots of GLUT DLLs out in the wild 
(it's in GLUT API version 5 plus freeglut). This is really a pity and a big 
design flaw in GLUT IMHO, but there is not much one can do about that. The 
only thing left is to load the GLUT/freeglut dynamic library, 
well, dynamically and resolve the freeglut API entries by hand. Doing this 
is not hard, but a little bit tricky to get right portably: Use dlopen/dlsym 
on most *nices, LoadLibrary/GetProcAddress on Windoze, something else on Mac 
OS, take care of possible leading underscores, etc. etc. I really wanted to 
avoid doing this, but it looks like there is no way around it. Given the 
current time frame for the GHC 6.8.1 release, I don't think that it is 
feasible to get this into that release, because I would need feedback from 
lots of platforms to be sure things work.

 [...] darcs-1.0.9
   http://darcs.net/darcs-1.0.9.tar.gz [...]

There are darcs binaries for Windows, so there is no need to build it and the 
libraries it needs:

http://wiki.darcs.net/DarcsWiki/CategoryBinaries#head-c7910dd98302946c671cf63cb62712589b392074

Furthermore, darcs itself is not needed for what you want to do.

 [...] Freeglut-2.4.0
   http://freeglut.sourceforge.net/index.php#download [...]

The freeglut project currently doesn't provide prebuilt binaries, so this is 
hardly the GLUT package's fault. ;-) Furthermore, the official way to build 
the project on Windows is via MSVC, and there are project files for this. 
Building a DLL via MinGW/MSYS would be nice, too, so perhaps you could post 
your patches in the freeglut-developer mailing list. I think that there will 
be a new freeglut release soon, perhaps I can push people to make at least a 
simple ZIP file with the binaries for Windows available on the project pages.

 GLUT-2.1.1
   You need to use darcs to download GLUT-2.1.1.
 [...]
Locate the line that starts with build-depends: and remove
the dependencies array and containers

Now you enter the great world of Cabal versionitis and the Big Library Splitup 
(tm). ;-) If you want to use a bleeding edge version of GLUT, you need a 
bleeding edge version of GHC and the libraries coming with it. A released 
version is available via hackage.haskell.org.

   [...] 6. Modify GLUT-2.1.1/Graphics/UI/GLUT/Extensions.hs as follows:

Look at the last two lines:

 foreign import ccall unsafe "hs_GLUT_getProcAddress" hs_GLUT_getProcAddress
    :: CString -> IO (FunPtr a)

Change hs_GLUT_getProcAddress to glutGetProcAddress

   7. Modify GLUT-2.1.1/cbits/HsGLUT.c as follows:

Look for void* hs_GLUT_getProcAddress(char *procName) and
remove the whole function.

Huh? If you *really* compile against the freeglut header, these steps are 
unnecessary. What is the reason for this change?

 [...]
   11. In GHC's directory, there is a file named package.conf.  This
   file contains one extremely long line.  You need to find an
   editor that will let you insert some text into this line without
   introducing any line breaks.  Emacs can do this.

   You need to locate the text  pkgName = "GLUT"  and then you
   need to locate  hsLibraries = ["HSGLUT-2.1.1"]  to the right
   of there.  The very next thing to the right of hsLibraries
   should be  extraLibraries = [] .  You need to change it to
   extraLibraries = ["freeglut"] .

This is unnecessary if you install the freeglut DLL correctly. What is the 
output of ghc-pkg describe GLUT before this modification?

   13. If you want to /compile/ with GHCi, then you'll need to copy the
   freeglut.dll file into the same directory as your newly-compiled
   program.exe file.  I haven't tried static linking yet; that
   would require recompiling freeglut as a static library instead
   of a DLL.

The traditional way is to put the DLL as glut32.dll into your 
WINDOWS/system32, just next to opengl32.dll. :-P I don't know what the 
Vista-approved 

Re: [Haskell-cafe] Library Process (was Building production stable software in Haskell)

2007-09-23 Thread Sven Panne
On Thursday 20 September 2007 16:33, David Menendez wrote:
 Does RPM, etc., deal with the fact that Haskell library installations
 are specific to a particular platform?

It depends what you mean with deal: If it is only making sure that a given 
binary library RPM matches the installed Haskell system, yes.

 If I install GHC and some library, and later install Hugs or a later
 GHC, how easy is it to make sure the library is available to the new
 installations?

This is asking for much more, and the general answer for all mature packaging 
systems I know is: This will never be the case for *binary* packages. The 
simple reason is that for rebuilding packages (this is basically what you are 
asking for), you have different (and typically *many* more) dependencies. 
This is why e.g. RPM differentiates between Requires (i.e. normal runtime 
dependencies) and Build-Requires (dependencies for building an RPM).

Although this situation is far from perfect, it is a direct consequence of the 
fact that there is no such thing as a standard Haskell ABI which is shared 
between all implementations and versions of them. The situation in C is much 
better (because it is much easier there), but even C++ suffered from ABI 
problems for several years on most platforms.

Cheers,
   S.


Re: [Haskell-cafe] Library Process (was Building production stable software in Haskell)

2007-09-18 Thread Sven Panne
On Tuesday 18 September 2007 09:44, Dominic Steinitz wrote:
 This discussion has sparked a question in my mind:

 What is the process for the inclusion of modules / packages in ghc, hugs
 and other compilers & interpreters?

Personal interest of the people working on GHC et. al. ;-)

 I thought the master plan was that less would come with the compiler /
 interpreter and the user would install packages using cabal. [...]

Although this statement might be a bit heretical on this list, I'll have to 
repeat myself again that Cabal, cabal-install, cabal-whatever will *never* be 
the right tool for the end user to install Haskell packages on platforms with 
their own packaging systems like RPM (the same holds for other systems, I 
just use RPM as an example here). This packaging system, and nothing else, 
will write into my /usr, otherwise chaos and versioning hell will threaten 
the system integrity. Cabal is a very fine tool to be used from RPM .spec 
files and to develop below one's own home directory, but it is not the tool 
of choice for the final installation. There are various reasons for this:

   * Imagine you want to find out to which package in which version a given 
file belongs. Impossible, if RPM is bypassed.

   * RPM offers functionality to verify the integrity of the installed SW, it 
can tell me which files are documentation, which ones are configuration 
files, etc. All this meta information has to be in a single system.

   * YaST, yum, etc. already have the notion of repositories, trust (via 
standard cryptographic methods) and resolving transitive dependencies, so we 
would re-implement things which are already there, well maintained and 
well-known to end users.

   * Imagine every language would force their end users to use specific tools 
for installation, this would be ridiculous. Personally I don't care at all 
about the details how Perl modules, some PHP/Python/... libraries etc. are 
installed on my system. This knowledge belongs to the packager who builds a 
nice RPM, mentioning all dependencies, so my favourite RPM tool can 
transitively resolve, download and install everything, offering a nice GUI if 
I like. No need to remember how to do this for Perl/PHP/Python/etc.

Regarding the pros and cons of small, separate packages: In general I agree 
that this is the right direction, and this is what other languages do as 
well. There are e.g. tons of Perl/PHP/Python/Ruby RPMs available for my 
system, each offering a specific library, while the RPMs containing the 
interpreters/compilers are rather small. But: IMHO we are not there yet, 
because we still have to remove quite a few rough edges until we can smoothly 
offer e.g. an RPM repository with lots of small library RPMs (Cabal 
versionitis, updating the Haddock documentation index, etc.). Therefore, I'll 
continue to offer only Sumo-style RPMs for GHC + boot-libs + extra-libs for 
now, but I hope I can change this in the future.

Another point: Although this is hard to believe nowadays, ;-) people are not 
always online, so simply installing what is missing might not always be an 
option. Great, I'd really need the foobar-2.1 package now, but I'm currently 
1 feet above the Atlantic ocean... The right way to tackle this problem 
is using meta packages, basically referencing lots of bundled small packages. 
RPM offers such a feature, and I guess other systems, too. On a laptop, such 
a meta package leading to the installation of tons of libraries is the right 
approach, on a fixed workstation the story might be different.

Cheers,
   S.


Re: [Haskell-cafe] Data.Binary Endianness

2007-09-15 Thread Sven Panne
On Tuesday 11 September 2007 09:17, Don Stewart wrote:
 Just in case people didn't see, the `binary' package lives on

   http://darcs.haskell.org/binary/

 However, Lennart Kolmodin, Duncan and I are actively maintaining and
 reviewing patches, so send them to one (or all) of us for review.

Is there any deep reason why binary is not below packages like almost all 
other packages? The toplevel directory is already a complete mess, so a 
little bit more structure would be good. So I propose the following: 
Move binary to packages/binary, update any references to the old URL and 
use a symlink on darcs.haskell.org for backwards compatibility (this should 
be nuked in a few months or so).

Cheers,
   S.


Re: [Haskell-cafe] Data.Binary Endianness

2007-09-15 Thread Sven Panne
On Saturday 15 September 2007 20:09, Stefan O'Rear wrote:
 packages is only for those libraries that are shipped with GHC.

First of all, this fact would be new to me, furthermore this would be a highly 
volatile categorization. Should URLs change when a package suddenly gets into 
or was thrown out of boot-/extra-packages? I don't think so. The fact what is 
shipped and what not is explicit in the ghc/libraries/{boot,extra}-libraries, 
not in the structure of darcs.haskell.org. Packages which are hosted there 
should be below, well, packages. The toplevel directory is currently a bit 
crowded and unstructured...

Cheers,
   S.


Re: [Haskell-cafe] Re: Data.Binary Endianness

2007-09-11 Thread Sven Panne
On Monday 10 September 2007 21:02, apfelmus wrote:
 [...]
class Put a endian where
  put :: endian -> a -> Put
 [...]
 Oh, and the 8,16,32 and 64 are good candidates for phantom
 type/associated data types, too.

I think that using any non-H98 feature like MPTC or associated data types for 
such a generally useful and basic package would be a *big* mistake.  Let's 
follow the KISS principle here. Everybody is free to make a nicer, but 
non-portable wrapper...

Cheers,
   S.


Re: [Haskell-cafe] Data.Binary Endianness

2007-09-11 Thread Sven Panne
On Monday 10 September 2007 19:50, Thomas Schilling wrote:
 [...]
 instance Binary MP3 where
   get = MP3 <$> getHeader <*> getData -- [*]
     where getHeader = do magic <- getWord32le
                          case magic of
                            ...

Of course this works in the sense that it compiles, but Binary is 
conceptually the wrong class to use.

 to read a (IEEE) double you use

   do x <- (get :: Double); ...

Where is IEEE mentioned in the docs? Does it use LE/BE/host order? Plain 
get/put on Float/Double are useless for reading IEEE floating numbers.
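
A quick way to convince yourself of that (a sketch; the exact byte count 
depends on the binary version, but it is not the 8-byte IEEE layout):

   import Data.Binary (encode)
   import qualified Data.ByteString.Lazy as L

   main :: IO ()
   main = print (L.length (encode (1.0 :: Double)))  -- not 8 with the default instance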

Cheers,
   S.


Re: [Haskell-cafe] Data.Binary Endianness

2007-09-11 Thread Sven Panne
On Tuesday 11 September 2007 08:14, Don Stewart wrote:
 sven.panne:
  On Monday 10 September 2007 19:50, Thomas Schilling wrote:
   [...]
   instance Binary MP3 where
     get = MP3 <$> getHeader <*> getData -- [*]
       where getHeader = do magic <- getWord32le
                            case magic of
                              ...
 
  Of course this works in the sense that it compiles, but Binary is
  conceptually the wrong class to use.

 I wouldn't go as far as saying `wrong', for protocol-specific data types it
 seems reasonable for the Haskell serialisation to use an agreed-upon
 external format, via a Binary instance.

The question is: What is suggested to the *human reader* when he sees a signature 
like foo :: Binary a => ... -> a -> ...? This should probably mean foo is 
using some portable (de-)serialization, but doesn't care about the actual 
representation, at least this is how I understand Binary's contract from the 
Haddock docs. The example above means something completely different, so I 
propose to add another class (e.g. ExternalBinary, better name needed) to 
the binary package for such uses. This class wouldn't have any instances in 
the package itself, but at least it would set a standard, so things are 
crystal clear when one sees a signature like blah :: ExternalBinary a 
=> ... -> a -> ...
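
A minimal sketch of what such a class could look like (the name ExternalBinary 
and the shape of the methods are just the proposal above, nothing that exists 
in the binary package):

   import Data.Binary.Get (Get)
   import Data.Binary.Put (Put)

   class ExternalBinary a where
      getExternal :: Get a      -- decode from the externally fixed wire format
      putExternal :: a -> Put   -- encode to that format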

This situation is very similar to the OO-world, where class inheritance is 
often overused, just because it looks so handy when written initially and 
delegation/aggregation/... would be the conceptually right way (i.e. 
implementation hierarchy vs. interface hierarchy).

Cheers,
   S.


Re: [Haskell-cafe] Data.Binary Endianness

2007-09-11 Thread Sven Panne
On Monday 10 September 2007 19:26, Don Stewart wrote:
 Yep, just send a patch. Or suggest what needs to happen.

OK, I'll see what I can do next weekend, currently I'm busy with 
packaging/fixing GHC. I have similar code lying around in various places, and 
it would be nice if there was a more officially sanctioned place to put this.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Tiny documentation request

2007-09-10 Thread Sven Panne
On Sunday 09 September 2007 18:41, Andrew Coppin wrote:
 [...]
 Well, if I could collapse it with a single click, it would be much
 easier to scroll past it and get to the thing I'm looking for. I didn't
 say remove it, just give me the option to hide it. ;-)

OK, that shouldn't be too hard to implement.

 Oh goodie... So it's there to keep the machines happy?

No, it's there to keep *me* happy when I'm looking for a module. ;-)

 It's just tedious that every single time I load up this page, I have to
 spend 30 seconds manually collapsing everything so I can get to the
 module I actually want to look at. (The alternative is to manually
 scroll the 13-page list my hand. Not very funny...)

I still fail to understand why you have to scroll or collapse manually; every 
browser I know of has a search facility. And there is the index page, where 
you have an incremental search facility even when your poor browser (guess 
which one I mean? :-) doesn't have one, at least when the index has been 
generated by a recent Haddock.

 OK, so... can we add a pair of expand all/collapse all buttons then?

Again, this should be rather easy to add.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Data.Binary Endianness

2007-09-10 Thread Sven Panne
On Monday 10 September 2007 17:17, Jules Bean wrote:
 On the documentation page:

 http://www.cse.unsw.edu.au/~dons/binary/Data-Binary.html
 [...]

Just a small hint: That page seems to be out of date compared to:

   http://hackage.haskell.org/cgi-bin/hackage-scripts/package/binary-0.3

The library looks quite nice, but I'm missing support for reading/writing 
Int{8,16,32,64} and Float/Double (in IEEE format, currently *the* binary 
representation in most formats I know) in LE/BE/host byte order. Have I 
overlooked something? Using unsafeCoerce# and friends there are probably 
workarounds for this, but having it in the official API would be quite 
helpful for I/O of real-world formats.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Data.Binary Endianness

2007-09-10 Thread Sven Panne
On Monday 10 September 2007 18:21, Thomas Schilling wrote:
 On Mon, 2007-09-10 at 18:11 +0200, Sven Panne wrote:
 [...]
  The library looks quite nice, but I'm missing support for reading/writing
  Int{8,16,32,64}

 maybe this?

 http://hackage.haskell.org/packages/archive/binary/0.3/doc/html/Data-Binary
-Get.html#v%3AgetWord8

Of course I can *implement* everything on top of this, but this is not the 
point. The binary library should have builtin support for more data types, 
and this is probably not hard to implement.

 Also note that many Haskell standard types are instances of the Binary
 class.  I might have misunderstood what you're asking for, though...

Again a confusion of the 2 things the binary package offers (I was confused 
initially as well): The Binary class is totally useless for reading/writing 
existing formats, simply because that's not its task. To read/write an 
existing format (BMP, MP3, WAV, Quake BSP, etc.) you have to use the 
getFoo/readFoo functions. So what I was asking for is:

   getInt32be, putIEEEFloatLe, getIEEEDoubleHost, ...

Type classes might be used to get a slightly smaller API, but I am unsure 
about the performance impact and how much this would really buy us in terms 
of the ease of use of the API.
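
To make the request concrete, here is a tiny sketch of one such primitive,
built on top of the word getters that are already there (the IEEE
Float/Double variants are the harder part and are not shown):

   import Data.Binary.Get (Get, getWord32be)
   import Data.Int (Int32)

   -- Should of course live in the binary package itself, not in user code.
   getInt32be :: Get Int32
   getInt32be = do w <- getWord32be
                   return (fromIntegral w)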

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Tiny documentation request

2007-09-09 Thread Sven Panne
On Sunday 09 September 2007 16:40, Andrew Coppin wrote:
 I have the following page bookmarked:

   http://haskell.org/ghc/docs/latest/html/libraries/

 I'd like to ask 2 things.

 1. Would it be possible to make the *huge* list of package names at the
 top collapsable? (That way I don't have to scroll through several pages
 of uninteresting text to get to the bit I actually want.)

What do you mean exactly with the *huge* list of package names? The 
description list with the short textual descriptions of each package? I'd say 
that this list is highly interesting to people unfamiliar with the package 
structure, so it is good that it is there.

 2. Could we make is so all items are collapsed initially? (Currently
 they're all expended initially - which makes it take rather a long time
 to find anything.)

Again this depends on the use case: I'd vote strongly against collapsing the 
list initially, because that way the incremental search in Firefox won't work 
without un-collapsing everything.

When the index is generated with a more recent Haddock, you get a search 
field, which does an incremental search, so this might perhaps be more what 
you are looking for.

A more aesthetical note: We should really get rid of the ugly table/CSS layout 
mixture, the lower part of the page renders a bit ugly and varies between 
browsers. Switching to pure CSS should be safe in 2007, I guess.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Hackage and GHC 6.8

2007-09-08 Thread Sven Panne
On Friday 07 September 2007 09:57, Neil Davies wrote:
 Given that GHC 6.8 is just around the corner and, given how it has
 re-organised the libraries so that the dependencies in many (most/all)
 the packages in the hackage DB are now not correct.

 Is there a plan of how to get hackage DB up to speed with GHC 6.8 ?

Given all those changes in the library structure and Cabal, naming the next 
GHC release 7.x (probably 7.0.1, if I understand the current naming policy 
correctly) would be better IMHO, giving everybody a clear hint about 
portability issues (even if those issues are only import changes, 
*.cabal/Setup.hs changes and various minor library API changes). But it's too 
late for such discussions, 6.8.1 has already been branched...

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Installation of GLUT package

2007-09-04 Thread Sven Panne
On Tuesday 04 September 2007 15:37, Paul L wrote:
 The detection of freeglut or glut is at compile time by checking if
 some function exists. Otherwise it's not able to link. So you'll have
 to re-compile the Haskell GLUT package.

Show me the code where the alleged tests are made, please... :-) The only 
things which are determined at build time are the linker options for linking 
OpenGL/GLUT applications and the calling convention on the platform in 
question. If you change your GLUT DLL to a freeglut DLL, everything should 
work, including freeglut extensions. If not, I consider this as a bug and 
I'll try to fix it. But to see what's going on, some logs, commandlines, etc. 
are needed to reproduce what other people have done.

For a more detailed discussion, perhaps the hopengl list might be more 
appropriate.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] RE: Definition of the Haskell standard library

2007-09-02 Thread Sven Panne
On Sunday 02 September 2007 03:29, Hugh Perkins wrote:
 A really simple way to track the quality of a package is to display
 the number of downloads.

 A posteriorae, this works great in other download sites.

 We can easily hypothesize about why a download count gives a decent
 indication of some measure of quality: [...]

... and even more easily hypothesize why this is not always a good indication: 
High-quality standard libraries which are packaged with GHC/Hugs/... will 
probably almost never be downloaded separately.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Ideas

2007-09-02 Thread Sven Panne
On Saturday 25 August 2007 20:49, Andrew Coppin wrote:
 [...] Would be nice if I could build something in Haskell that overcomes
 these. OTOH, does Haskell have any way to talk to the audio hardware?

Depending on what you are exactly trying to do, the OpenAL/ALUT packages might 
be of interest. Slighty dated online docs are here:

   http://haskell.org/HOpenGL/newAPI/OpenAL/Sound-OpenAL.html
   http://haskell.org/HOpenGL/newAPI/ALUT/Sound-ALUT.html

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] RE: Definition of the Haskell standard library

2007-09-01 Thread Sven Panne
On Tuesday 31 July 2007 19:39, Duncan Coutts wrote:
 [...]
 The docs for those packages would be available for packages installed
 via cabal (assuming the user did the optional haddock step) and would
 link to each other.

Well, on a normal Linux distro a user should *never* have to call cabal (or 
any of its cousins) directly; the distro's package manager should be used 
instead. On an e.g. RPM system, the .spec file would use Cabal to e.g. 
(un-)register a package, because RPM has to know what is installed, which 
other packages are prerequisites, how to cleanly uninstall, etc. IMHO Cabal 
should not try to mirror a full-fledged package system, simply because on 
every (non-Windows ;-) platform there are tons of native tools for this 
purpose, and Cabal is not in the driver's seat when it comes to SW 
installation.

 What is missing from the local docs is a single integrated index page
 that lists all the modules and then links off to the various packages's
 docs like we have on the ghc website.

 The problem with generating one of those is what manages it? What
 package would it belong to etc.

Of course we are not the first project to face this kind of problem: Texinfo 
offers a central contents page as well. To maintain this page, it comes with 
a tool install-info, which updates the index page after (de-)installation. 
On RPM systems, the .spec file calls install-info after (de-)installation of 
a package with info pages.

http://www.gnu.org/software/texinfo/manual/texinfo/html_node/Installing-an-Info-File.html

 On some systems (windows, gnome) there are dedicated help viewers that
 can help with this contents/index issue. haddock supports both (mshelp,
 devhelp). I'm not sure everyone would find that a sufficient solution
 however.

An install-haddock tool would be the solution IMHO.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Converting CTime - Int

2007-08-26 Thread Sven Panne
[ Sorry for the *extremely* slow response, but I'm currently working through 
my backlog of 6000 mails... :-P ]

On Wednesday 16 May 2007 09:35, Tomasz Zielonka wrote:
 I wonder why CTime is not Integral - maybe there is no such guarantee
 for time_t? If so, then you shouldn't rely on Enum. The safest bet seems
 to be toRational - CTime is Real.

The Single Unix Specification has the answer:

http://www.opengroup.org/onlinepubs/95399/basedefs/sys/types.h.html#tag_13_67

   time_t and clock_t shall be integer or real-floating types.

CTime's Enum instance is as debatable as the ones for Float and Double, but 
for consistency reasons we included it in the FFI spec.
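
So if you actually need the numeric value on the Haskell side, going through
the Real instance is the portable route, e.g. (just a sketch):

   import Foreign.C.Types (CTime)

   -- Works whether time_t is an integer or a real-floating type.
   cTimeToDouble :: CTime -> Double
   cTimeToDouble = realToFrac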

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] renderString problems

2007-08-26 Thread Sven Panne
On Wednesday 01 August 2007 18:30, Dave Tapley wrote:
 I'm having a lot of trouble using renderString from Graphics.UI.GLUT.Fonts.
 All my attempts to render a StrokeFont have so far failed.
 Using a BitmapFont I can get strings to appear but they demonstrate
 the odd behaviour of translating themselves a distance equal to their
 length every time my displayCallback function is evaluated.

This is actually not a bug, but a feature. :-) From the Haddock docs for 
renderString:


Render the string in the named font, without using any display lists. 
Rendering a nonexistent character has no effect.

If the font is a bitmap font, renderString automatically sets the OpenGL 
unpack pixel storage modes it needs appropriately and saves and restores the 
previous modes before returning. The generated call to bitmap will adjust the 
current raster position based on the width of the string. If the font is a 
stroke font, translate is used to translate the current model view matrix to 
advance the width of the string. 


The rationale behind this is that you set a position once, and subsequent 
renderString calls will render the individual strings one after the 
other, just like printf appends strings to the output. If this is not clear 
from the documentation, do you have any suggestions on how to improve the docs?

And another hint: On usual consumer graphic cards, stroke fonts are normally 
faster than bitmapped fonts. The fastest, most flexible (scaling, 
filtering, ...) and visually nicest option would be textured quads, but this 
is not supported by GLUT.

 My requests are:
   * Does anyone know how to keep the position fixed?

For bitmapped fonts, explicitly set the currentRasterPosition. For stroke 
fonts, you can use the normal modelview machinery 
(loadIdentity/preservingMatrix/...).
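
A small (untested, coordinates made up) sketch of both variants:

   import Graphics.UI.GLUT

   drawLabels :: IO ()
   drawLabels = do
      -- bitmap font: pin the raster position explicitly before each draw
      currentRasterPosition $= Vertex4 (-0.9) 0.9 0 1
      renderString Fixed8By13 "bitmap label"
      -- stroke font: undo the implicit translate via preservingMatrix
      preservingMatrix $ do
         translate (Vector3 (-0.9) 0 (0 :: GLfloat))
         scale 0.001 0.001 (0.001 :: GLfloat)
         renderString Roman "stroke label"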

   * Are there any good examples of (working) GLUT code available on
 the web, I'm finding it very hard to make any progress at the moment.
 Certainly not at Haskell speed :(

Depending on the distribution you use, you probably have the example already 
somewhere on your disk or you can directly have a look at the repository:

   http://darcs.haskell.org/packages/GLUT/examples/

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] (newbie) instance Enum MyType where, smarter way?

2007-03-27 Thread Sven Panne
On Tuesday 27 March 2007 17:15, Adrian Neumann wrote:
 [...]
 Which doesn't work because succ and pred are not (properly?) defined. Is
 there a way to let deriving Enum do *some* of work (i.e. defining succ
 and pred) while manually defining the other functions?

Hmmm, this seems to be a confusing usage of the Enum class, e.g. 'fromEnum . 
toEnum' changes some values (allowed by the report, but confusing 
nevertheless). Using Data.Bits.Bits somehow could be a better option, but one 
has to know more about your use case.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Link error in ALUT Hello, World

2007-03-25 Thread Sven Panne
On Sunday 25 March 2007 04:38, Nobuhito Mori wrote:
 [...] Though there are clearly link errors, I can not understand why it
 happens. By option -package ALUT,  libalut.a (which made by pexports and
 dlltool because I do not know original alut.lib can be used by mingw) and
 other necessary libraries are automatically linked, I think. What is @8
 of [EMAIL PROTECTED]? [...]

Just a little bit of background: Because Microsoft has decided that *nothing* 
will be easy or straightforward on their platform, they introduced tons of 
different calling conventions:

   http://msdn2.microsoft.com/en-us/library/984x0h58.aspx

When a function is called, arguments are pushed onto the stack. The question 
is: Who cleans up the stack then, the caller or the callee? For functions 
with variable argument lists like printf in C the usual answer is: The 
caller, because it is the only one who knows what was pushed (__cdecl 
convention in MS speak). OTOH, having the cleanup code in the callee leads to 
slightly smaller code, assuming that a function is called more than once 
(__stdcall in MS speak). To catch a mismatch in calling conventions at link 
time (and because of a few other reasons), functions expecting to be called 
the __stdcall way get a @BytesOnTheStackToBeCleanedUp suffix.

Historically, DLLs on Windows use the __stdcall convention, but there is no 
deep reason why. The OpenAL DLL uses __cdecl, but the reasons for this have 
probably vanished alongside Loki Software itself, the initial designers of 
OpenAL. ALUT is used with OpenAL, so it made sense to me to choose the same 
calling convention as the OpenAL DLL, i.e. __cdecl, for the ALUT DLL.

Looking at the OpenAL and ALUT Haskell packages, I think that there are some 
cut-and-paste bugs from the OpenGL/GLUT packages (where the DLLs use 
__stdcall). I'll have a look at it, but in the meantime could you please add 
a detailed description to the corresponding ticket of how you installed the 
OpenAL/ALUT packages and which OpenAL/ALUT DLLs/libs you used, how you tried 
to convert them etc. (command lines, outputs, ...)? I'd just like to make 
sure that I'll reproduce exactly what you did.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL and GLUT in GHC

2007-03-24 Thread Sven Panne
[ Small note: Library-related questions should better be directed to 
[EMAIL PROTECTED], and for mails regardind the OpenGL/GLUT packages there 
is the [EMAIL PROTECTED] mailing list. ]

On Saturday 24 March 2007 13:37, Ruben Zilibowitz wrote:
 [...] I've encountered a strange bug which I'm having trouble with. If I
 render a sphere and a plane, then the plane is facing the wrong way
 and is shaded on the wrong side. If however I only render the plane
 then it appears to be facing the right way and is shaded on the
 correct side. [...]

I guess the problem is a misunderstanding of what 'autoNormal $= Enabled' 
does. It enables the automatic generation of analytic normals when 2D 
evaluators are used. It doesn't affect the rendering of normal primitives 
like quads. You don't provide any normals for your plane, so the current 
normal is used for all four vertices. The value of the current normal is 
(Vector 0 0 1) initially, so this seems to work if you render the plane 
alone, *but* the GLUT object rendering functions provide normals for all 
their vertices. So the net effect is that the normals for the vertices of 
your plane are set to whichever normal GLUT has specified last. Simple fix: 
Provide normals for your quad, i.e. use

   normal (Normal3 0 0 (1 :: GLfloat))

before specifying any vertex of your quad. In general when lighting is used, 
make sure to provide the correct normals for all vertices. Unit normals 
should be preferred, otherwise you have to tell OpenGL about that and this 
leads to more work (= rescaling/normalization of all normals within OpenGL).
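
Just to spell it out, the quad could look like this (coordinates invented,
only a sketch of the fix):

   import Graphics.Rendering.OpenGL

   plane :: IO ()
   plane = renderPrimitive Quads $ do
      normal (Normal3 0 0 (1 :: GLfloat))
      vertex (Vertex3 (-1) (-1) (0 :: GLfloat))
      vertex (Vertex3   1  (-1) (0 :: GLfloat))
      vertex (Vertex3   1    1  (0 :: GLfloat))
      vertex (Vertex3 (-1)   1  (0 :: GLfloat))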

A few more notes:

   * There is no need in your example to use 'normalize $= Enabled' when you 
provide unit normals. GLUT does this, BTW.

   * Setting the material properties could be done only once in the 
initialization function.

   * Use postRedisplay only when something has really changed, e.g. at the end 
of 'motion'. Otherwise you get 100% CPU/GPU load for no good reason.

   * IORefs are StateVars, so you can use get and ($=) instead of readIORef 
and writeIORef; this is more consistent with the OpenGL/GLUT API (a tiny 
sketch follows below).
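
A tiny sketch of that last point:

   import Data.IORef (newIORef)
   import Graphics.Rendering.OpenGL (get, ($=))

   spinDemo :: IO ()
   spinDemo = do
      angle <- newIORef (0 :: Double)
      angle $= 90          -- writeIORef in OpenGL clothing
      a <- get angle       -- readIORef likewise
      print a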

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Why the Prelude must die

2007-03-24 Thread Sven Panne
On Saturday 24 March 2007 03:48, Stefan O'Rear wrote:
 1. Namespace pollution

 The Prelude uses many simple and obvious names.  Most programs don't
 use the whole Prelude, so names that aren't needed take up namespace
 with no benefit. [...]

Even though I think that the current Prelude is far from perfect, one should 
not forget that it is a very solid foundation of a common language: If one 
sees e.g. '(.)' or 'map', it is immediately clear to everybody what this 
means, without having to scan through (perhaps long) import lists. Of course 
one could hide some parts of the Prelude etc., but I think in the long run 
this only leads to confusion. Redefining common things, heavy use of tons of 
self-defined operators etc. all make maintenance much harder.

Try reading Lisp code with heavy use of macros or C++ code with tons of 
overloadings. This is more like Sudoku solving than anything else, because 
there is no common language between the author and the reader anymore.

And taking away the prelude is a little bit like taking 
away 'int', 'double', 'for', 'while' etc. from the C programmer...

 11. Committeeism

 Because the Prelude has such a wide audience, a strong committee
 effect exists on any change to it.  This is the worst kind of
 committeeism, and impedes real progress while polluting the Prelude
 with little-used features such as fail in Monad (as opposed to
 MonadZero) and until.

Depending on your viewpoint, you can see this as a plus. Everybody agrees that 
finalizers are evil, but propose the removal of that method from 
java.lang.Object to the Java people. :-)

My proposal would be to incrementally improve the Prelude, modularize it a bit 
more, fix the Num hierarchy, but basically leave it as it is.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Strange memory consumption problems in something that should be tail-recursive

2007-02-16 Thread Sven Panne
On Tuesday 13 February 2007 22:32, Bernie Pope wrote:
 Creighton Hogg wrote:
 [...]
  So for example in the case of,
  facTail 1 n' = n'
  facTail n n' = facTail (n-1) (n*n')

 The problem with this example is that it will build up an expression of
 the form:

(n1 * n2 * n3 * ...)
 [...]

This is not true if one takes strictness analysis into account: facTail is 
strict in both arguments, and any decent strictness analyser will detect 
this, e.g. GHC with -O. Strict arguments can be evaluated before the function 
call, so in the example above no stack space will be consumed and the above 
will basically be a good old loop.

For the brave: Use ghc -v4 -O on the example above to see what really 
happens. GHC is even clever enough to factor out (un-)boxing (at least for 
Int and friends), so there is a rather tight loop with unboxed Ints.
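
And if one prefers not to rely on the strictness analyser (e.g. when
compiling without -O), forcing the accumulator by hand gives the same
constant-space loop; just a sketch:

   facTail :: Integer -> Integer -> Integer
   facTail 1 acc = acc
   facTail n acc = let acc' = n * acc
                   in acc' `seq` facTail (n - 1) acc'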

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] FFI basics

2007-02-12 Thread Sven Panne
On Monday 12 February 2007 09:54, Yitzchak Gale wrote:
 Bulat Ziganshin wrote:
  examples of lifting C functions into Haskell world:
 
  mysin :: Double -> Double
  mysin = realToFrac . c_mysin . realToFrac
 
  -- c_mysin :: CDouble -> CDouble
 
  rnd :: Int -> IO Int
  rnd x = do r <- c_rnd (fromIntegral x)
 return (fromIntegral r)
 
  -- c_rnd :: CInt -> IO CInt

 OK, got it. I'll put that in.

Just a small note here: GHC and the base library are both very careful to 
completely eliminate things like realToFrac or fromIntegral in code similar 
to the one above, if the representations of the Haskell type and the C type 
are identical. Therefore there is no need to sacrifice portability for speed 
by leaving these conversion functions out and making invalid assumptions. If 
actual conversion code is generated without a good reason, I would consider 
that a bug.
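
For example (a sketch using the C library's sin as a stand-in for c_mysin),
the portable lifting costs nothing with GHC when CDouble and Double share a
representation:

   {-# LANGUAGE ForeignFunctionInterface #-}
   import Foreign.C.Types (CDouble)

   foreign import ccall unsafe "math.h sin" c_sin :: CDouble -> CDouble

   mysin :: Double -> Double
   mysin = realToFrac . c_sin . realToFrac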

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] FFI basics

2007-02-10 Thread Sven Panne
Am Samstag, 10. Februar 2007 09:21 schrieb Donald Bruce Stewart:
 bulat.ziganshin:
  Hello Yitzchak,
 
  Friday, February 9, 2007, 3:23:53 PM, you wrote:
   I would like to use FFI for the first time. Can someone
   give me a really, really simple complete example?
 
  nothing can be easier
 
  main = print (c_mysin 1.0)
 
   foreign import ccall "mysin.h mysin"
 c_mysin :: Double -> Double

 Shouldn't that be CDouble? At least for Int/CInt you can hit troubles on
 64 bit machines...

Yes, the code above is wrong in the sense that it makes assumptions which are 
not guaranteed at all in the FFI spec. The rules to follow are extremely 
simple, so there is no reason to ignore them:

   * If you want to use a C type foo in Haskell, use CFoo in the FFI 
(imported from Foreign.C.Types).

   * If you want to use a Haskell type Bar in C, use HsBar in C code 
(#included from HsFFI.h).

It depends on the application/library in question which alternative is easier, 
but never use a mix.
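
The first rule in action (a small sketch; the C prototype is int abs(int)):

   {-# LANGUAGE ForeignFunctionInterface #-}
   import Foreign.C.Types (CInt)

   foreign import ccall unsafe "stdlib.h abs" c_abs :: CInt -> CInt

   -- Convert at the Haskell/C boundary only, never mix Int and CInt blindly.
   absViaC :: Int -> Int
   absViaC = fromIntegral . c_abs . fromIntegral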

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] HaskellForge?

2007-01-08 Thread Sven Panne
Am Montag, 8. Januar 2007 17:15 schrieb Justin Bailey:
 [...]
 For example, if I want to install Rails (ruby web-app framework), I just
 type:

   gem install rails

 It's pretty slick.

How does this work with the native packaging mechanism on your platform 
(RPM, ...)? Does it work behind its back (which would be bad)? Let's 
assume that rails needs foo and bar as well, which are not yet on your 
box. Does gem install transitively get/update all dependecies of rails?

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] IFF reader and writer

2006-12-01 Thread Sven Panne
Am Freitag, 1. Dezember 2006 16:30 schrieb Henning Thielemann:
 On AmigaOS there is a library called iffparse.library, which is used for
 processing the Interchange File Format, which is a binary container format
 developed by Electronic Arts for any kind of data.
   http://en.wikipedia.org/wiki/Interchange_File_Format
 The best known instances of this format are certainly the AIFF sampled
 sound format and WAV (which is RIFF, that is little endian IFF).
  Short question: Is there some Haskell library for parsing and
 constructing files of this format?

I don't have any Haskell lib for (R)IFF, but as one of the freealut authors I 
have the pleasure to maintain a WAV reader, among other things. IMHO WAV is 
one of the most idiotic, redundant and underspecified format in the world, 
and most existing WAV files are broken in some respect. PNGs are a bit 
better, but all those chunked formats are a bit problematic in practice, 
because new chunk types are constantly being invented, contradict other 
chunks, etc. etc.

Quite a few concrete (R)IFF instances can contain (R)IFF within chunks 
themselves, furthermore you have always be prepared to handle an unknown 
chunk type. So a general (R)IFF type can't be much more than a tree with a 
tagged bunch of bytes at each node, which is not really of much help IMHO. 
Separate libraries for handling WAV, TIFF, PNG, AVI, etc. might make more 
sense, as they can reflect the underlying structure much better.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] A restricted subset of CPP included in a revision of Haskell 98

2006-11-05 Thread Sven Panne
[ I'm just working through a large backlog of mails, so the original message 
is a bit old... :-) ]

Am Sonntag, 20. August 2006 22:37 schrieb Henning Thielemann:
 On Thu, 17 Aug 2006, Brian Smith wrote:
 [...]
 I think there should be more effort to avoid CPP completely. My
 experiences with Modula-3 are, that you can nicely separate
 special-purpose stuff into modules which are included depending on some
 conditions. Say you want the same module both for Windows and Unix, you
 provide directories WIN32 and POSIX containing implementations with the
 same interface and then the make system can choose the appropriate
 directory. [...]

That's a nice theory, but this doesn't work in practice, at least not for me. 
The problem in my OpenGL/GLUT/... bindings is that the calling convention to 
the native libraries is different on Windows, and there is no Haskell way 
to parametrize this. Therefore using a preprocessor is the only sane way I 
see here. Having to duplicate e.g. 567 foreign imports just to avoid CPP in 
the OpenGL package is a rather bad tradeoff IMHO. Almost everything is better 
than redundancy, even CPP...
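
For the curious, a condensed sketch of that CPP use (macro and module names
invented here, but this is roughly the pattern I mean):

   {-# OPTIONS_GHC -cpp -fffi #-}
   module CallConvSketch where

   #if defined(mingw32_HOST_OS)
   #define CALLCONV stdcall
   #else
   #define CALLCONV ccall
   #endif

   -- Without CPP, every such import would have to be written twice.
   foreign import CALLCONV unsafe "glEnd" glEnd :: IO ()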

Another use of CPP in the OpenGL package is to access OpenGL extension entry 
points. Here CPP is used to generate a 'foreign import dynamic' and two 
Haskell functions per extension entry. Perhaps this could be done via TH, but 
this would limit the portability, again a bad tradeoff.

I would be glad if there were other ways to achieve these things, but I fail 
to see them.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] CDouble type coercion

2006-05-14 Thread Sven Panne
Am Sonntag, 14. Mai 2006 09:30 schrieb SevenThunders:
 I am new to Haskell and found myself in a bind concerning the use of
 the C types, CDouble in particular.  I extract a CDouble via it's pointer
 from a StorableArray.  Since the array must interface with C the elements
 of the array must be CDouble.  Now I'd like to use Text.Printf to format
 print statements of elements of the array, but Text.Printf requires Doubles
 as inputs and so far I have not found an obvious way to coerce CDoubles
 into Doubles. [...]

You can use the Prelude function realToFrac to convert between the various 
floating-point types:


[EMAIL PROTECTED]:~ ghci -v0
Prelude> :t realToFrac
realToFrac :: (Fractional t1, Real t) => t -> t1
Prelude> (realToFrac :: Foreign.C.Types.CDouble -> Double) 1234.5
1234.5


As you can see from its type, realToFrac is not exactly about floating-point 
conversions, but for almost all practical use cases it is. :-)

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] #if and #endif

2006-05-06 Thread Sven Panne
Am Freitag, 14. April 2006 02:34 schrieb ihope:
 On 4/13/06, Jason Dagit [EMAIL PROTECTED] wrote:
  Try using passing -cpp to ghc when you compile.
 
  Jason

 Thanks. Will do.

A small note: I worked on the tools recently, so Alex/Haddock/Happy should be 
fully cabalized now. Consequently there should be no need to fiddle around 
with compiler options, the usual Cabal stuff should work:

   runhaskell Setup.lhs configure
   runhaskell Setup.lhs build
   runhaskell Setup.lhs copy

If not, please submit a bug report.

SimonM: Do you think that a new release of those tools is appropriate?

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] need help please [HOpenGL]

2006-05-06 Thread Sven Panne
Am Sonntag, 23. April 2006 04:49 schrieb Brian Hulley:
 Brian Hulley wrote:
[...]
 Sorry I shouldn't have replied when I hadn't even tried it myself ;-)
 I don't think it is nearly so easy to display a bitmap from an image on
 file. If you look at the online version of the OpenGL redbook here
 http://www.rush3d.com/reference/opengl-redbook-1.1/chapter08.html the
 example C code for glBitmap is: [...]

The GLUT package contains all examples from the Red Book, see:

   http://darcs.haskell.org/packages/GLUT/examples/RedBook/

The coding style is probably not the best one can imagine, but the intention 
is to give a very close 1:1 mapping of these well-known examples in Haskell. 
If you have read the Red Book (what everybody trying to do some OpenGL 
programming should have done IMHO), you should have no trouble understanding 
the Haskell examples.

As already mentioned in another post, loading image data is by design out of 
the scope of OpenGL, but the examples contain a loader for a trivial raw RGB 
format:

   http://darcs.haskell.org/packages/GLUT/examples/RedBook/ReadImage.hs

For further questions, [EMAIL PROTECTED] is probably a better mailing list.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] library sort

2006-03-08 Thread Sven Panne
Am Dienstag, 7. März 2006 14:24 schrieb Neil Mitchell:
 I would also imagine that Joe Programmer is more likely to use
 wxHaskell or Gtk2Hs than those [...]

Just a (hopefully final) remark about this, because the above statement seems 
to imply something that is not completely true: 3 of the 4 packages I've 
mentioned, i.e. OpenGL (rendering) and OpenAL/ALUT (sound) do not compete in 
any way with the GUI packages mentioned above, they can be happily used with 
those. And regarding the 4th package (GLUT): It very much depends on which 
book you read first, lots of OpenGL books use GLUT as their GUI toolkit and 
do this for a very good reason (reproducibility, widespread availibility, 
ease of use for simple up to medium-sized programs etc.). For a larger 
application other GUI toolkits are probably a better choice, and all of the 
serious ones offer an OpenGL canvas to render on, anyway.

I just had to reply because lots of people seem to confuse GUI issues with 
rendering issues, which are two completely different beasts, and this might 
lead to various preconceptions.

 The data generation is now bundled with Haddock, and as far as I know,
 will be in the next release. [...]

That's good to hear. I really have to take a closer look at the current state 
of the former fptools projects, but my job and the switch to darcs got in the 
way...

Thanks for a really nice tool,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] library sort

2006-03-08 Thread Sven Panne
Am Mittwoch, 8. März 2006 15:11 schrieb Neil Mitchell:
 I never claimed it was a good reason, merely that it was a reason :) [...]

:-)

 Anyway, my current plan is:
 * lots of smallish packages, and one big base package which is the
 default search
 * OpenGL, wxHaskell, Gtk2Hs, Darcs, GHC API, GHC (the code base), Yhc,
 Parsec will all be options to search for. At some point in the future
 I will send out an offer to everyone if they want their package
 included. [...]

In the meantime it would be great if Hoogle could be made consistent with the 
documentation on http://haskell.org/ghc/docs/latest/html/libraries/ (this is 
the big base package IMHO). Currently the differences might be quite 
confusing for new people.

 Maybe (depending on how efficient I can make Hoogle), it can give
 google style hints - there were also 3 results in OpenGL, would you
 like to add OpenGL to your search options.

That would be a great feature IMHO.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] library sort

2006-03-07 Thread Sven Panne
Am Samstag, 4. März 2006 21:30 schrieb Neil Mitchell:
  And a related question is: Which packages are searchable by Hoogle?

 The best answer to that is some. I intentionally excluded OpenGL and
 other graphics ones because they have a large interface and yet are
 not used by most people using Haskell. [...]

Well, this is a bold assumption IMHO, and I'm not particularly happy with that, 
as you can probably imagine. For my part, I would assume that Joe Programmer 
is much more likely to use some multimedia packages than TH or Data.Graph.* 
etc., but this is a bold assumption on *my* side...

 I have recently patched Haddock so it will directly generate Hoogle
 information, and am in the process of modifying  hoogle so that you
 can pick which libraries or applications to search. Hopefully once
 this is all done, the standard web hoogle interface will allow
 searching OpenGL libraries, if the user selects that as an option -
 and also allow searching other libaries/applications such as
 GHC/Yhc/Darcs/Gtk2Hs as selected by the user. [...]

Integrating with Haddock makes much sense, and perhaps we can bundle Hoogle 
somehow with Haddock, so everybody can use Hoogle locally on the whole set of 
packages which are installed. The current Hoogle website would then just be 
an instance of that bundle. Are there any plans in this direction?

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] library sort

2006-03-04 Thread Sven Panne
Am Montag, 20. Februar 2006 12:46 schrieb Simon Peyton-Jones:
 Strangely, Hoogle isn't easy to find at haskell.org.  I'm not sure where
 the best place to add a link would be: perhaps near the top of the
 libraries-and-tools page?  It's all wikified now, so would someone like
 to add it somewhere appropriate? [...]

And a related question is: Which packages are searchable by Hoogle? I can't 
find anything contained in my OpenGL/GLUT/OpenAL/ALUT packages. :-(

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenAL bindings / compiling ghc 6.5

2005-12-28 Thread Sven Panne
Am Mittwoch, 28. Dezember 2005 16:15 schrieb Michael Benfield:
 I see here:
 http://www.haskell.org/HOpenGL/newAPI/

 OpenAL bindings listed as part of the Hierachical Libraries. And when I
 download the source to a development snapshot of GHC, there they are.
 Is there a way to install this on GHC 6.4?

Although I haven't tried it, you should be able to simply use the OpenAL and 
ALUT directories from the HEAD in the 6.4 branch. The only problem that might 
arise is that the Cabal stuff might have changed a bit, but I'm not sure 
about that. Anyway, this should be fairly easy to fix...

 Alternatively... I can't get GHC 6.5 to compile. I do ./configure 
 make and it gets to this step:
 
 ==fptools== make all -r;
   in /Users/mike/source/ghc-6.5.20051221/ghc/compiler
 
 and sits forever [...]

I haven't heard about that problem. :-(

Cheers,
S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenAL bindings / compiling ghc 6.5

2005-12-28 Thread Sven Panne
Am Mittwoch, 28. Dezember 2005 16:24 schrieb Joel Reymont:
 I think you should post to cvs-ghc. I was able to get things to
 compile (almost) on 10.4.3 but had to configure with --disable-alut --
 disable-openal, etc.

Why were those --disable-foo options necessary? In theory everything should be 
autodetected; otherwise it's a bug. Detailed information about the platform, 
compilers, a configuration/compilation log plus config.log will help to 
diagnose the problem.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] [Newbie] Why or why not haskell ?

2005-12-11 Thread Sven Panne
Am Sonntag, 11. Dezember 2005 09:58 schrieb Tomasz Zielonka:
 [...] I would like to see some support in tools for enforcing such a coding
 policy. It could look like this - a function written using only safe
 components would be marked as safe. Every unsafe feature like FFI,
 unsafePerformIO, etc. would taint a module/function, marking it
 unsafe. [...]

... in effect making things like putStrLn, getContents etc. tainted, 
resulting in probably > 95% of the hierarchical libraries in the fptools 
repository being tainted, including lots of stuff from the H98 report. :-) 
Nice idea, but not very practical IMHO.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: wxHaskell and do statements

2005-12-02 Thread Sven Panne
Am Dienstag, 29. November 2005 16:16 schrieb Sebastian Sylvan:
 IIRC Haskell assumes a tab is 8 spaces.

Correct, it is explicitly specified in the Haskell spec, see:

   http://haskell.org/onlinereport/syntax-iso.html#layout

 IMO that's way too much. Haskell tends to take up quite a bit of
 horizontal real-estate so I usually go with 2 spaces.

 At any rate, I set my editor to convert them to spaces.

I think this is in general the best idea for all projects, even if the language 
in question has no layout rule. Tabs are simply evil...

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Records

2005-11-22 Thread Sven Panne
I think this discussion has reached a point where it is of utmost importance 
to re-read Wadler's Law of Language Design, a law so fundamental to 
computer science that it can only be compared to quantum dynamics in physics:

   http://www.informatik.uni-kiel.de/~mh/curry/listarchive/0017.html

:-)

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Records

2005-11-22 Thread Sven Panne
Am Dienstag, 22. November 2005 19:30 schrieb Greg Woodhouse:
 To be honest, I haven't followed the entire records thread (at least
 not yet), but I don't know that it's fair to say that we've been
 focusing entirely (or nearly so) on lexical issues. I'll grant you that
 there's an awful lot of that going on, but unless I'm missin something
 obvious, support for a record data type isn't even a purely syntactic
 issue. [...]

I definitely didn't want to offend anybody, and I'm sure that there have been 
quite a few good (non-syntactical) proposals, but to be honest: They vanished 
in a sea of syntactic discussions, at least for me, and I couldn't follow the 
whole thread closely due to a lack of time. Hopefully somebody writes up the 
relevant points and proposals in a condensed form...

As an aside, such heated syntactical discussions come up at least once a year 
on the Haskell lists for almost a decade now, and I think it is a good time 
to remind people about the law then... :-)

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Project postmortem

2005-11-18 Thread Sven Panne
Am Freitag, 18. November 2005 17:16 schrieb Jason Dagit:
 [...]
 I was playing with one of the Haskell OpenGL libraries (actually it's
 a refined FFI) over the summer and some things about it rubbed me the
 wrong way.  I wanted to try fixing them but I really couldn't figure
 out how to get ahold of the code and start hacking.  I found some
 candidates, but it seemed like old cvs repositories or something.  I
 was confused, ran out of time and moved on.  Why do I bring it up?
 If it had been obvious where to get an official copy of the library I
 could have tried sending in some patches to make things work the way
 I wanted.  I'm a huge fan of darcs repositories, BTW.

Hmmm, as the OpenGL/GLUT/OpenAL/ALUT guy I have to admit that I should really, 
really update the web pages about those packages. But anyway: Asking on any 
Haskell mailing list (there is even one especially for the OpenGL/GLUT 
packages) normally gives you fast response times. Without even knowing that 
there is a problem, there is nothing I can fix. :-) And don't hesitate to ask 
questions about the usage of those packages, because this is valuable 
feedback, too. Regarding the repository: The normal fptools repository is the 
official one for those packages. But IIRC, most GHC binary packages include 
OpenGL/GLUT support, so there is normally no urgent need for a home-made 
version. All packages are already cabalized, but I have to admit that I have 
never tried to build them on their own.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: [Haskell] Making Haskell more open

2005-11-13 Thread Sven Panne
Am Sonntag, 13. November 2005 22:05 schrieb Gour:
 Wolfgang Jeltsch ([EMAIL PROTECTED]) wrote:
[...]
  The question is if HTML is sufficient.  In addition, HTML is at some
  points not well thought-out.

 True, but considering the present situation, it is all what is required.

Well, that's a wrong assumption: People on Windows will expect HTML Help, and 
the current build system can easily generate this from DocBook XML. I think 
the current installers/ZIP files for GHC and the other tools don't include 
HTML Help, but I consider this a packaging bug.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: [Haskell] Making Haskell more open

2005-11-13 Thread Sven Panne
Am Sonntag, 13. November 2005 22:22 schrieb Gour:
[...]
 Besides that, 'txt2tags-like technology' is already in use for some time
 - e.g AFT (http://www.maplefish.com/todd/aft.html) dating back in '99
 and XMLmind XML Editor has plugin which supports (similar) markup called
 APT (http://www.xmlmind.com/xmleditor/_distrib/doc/apt/apt_format.html)
 [...]

Great! If you have already an XML editor, start writing DocBook now! :-)

More seriously: This is again a useless tools discussion, we *are* using 
DocBook currently and it works fine. The real problem is not the XML format 
and any XML toolchain, it is the lack of people willing to write 
documentation. There are enough people in the various fptools projects 
(including me) who will happily and quickly accept documentation patches, be 
it in plain text or DocBook. And if we are honest: Whoever contributes a 
non-trivial amount of text to the GHC/Happy/... documentation has very 
probably suffered through the build process anyway, and getting the XML 
tools up and running will have been the least of their problems...

Cheers,
   S.

P.S.: In a Google search, DocBook XML dominated txt2tags by a factor of 29, 
and an amazon.de search showed 7:0 books... :-)
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] FFI and modifying haskell memory

2005-10-29 Thread Sven Panne
Am Montag, 24. Oktober 2005 17:20 schrieb Joel Reymont:
 Is with better than allocaBytes?

with is just a utility function around alloca and poke, where alloca 
is another utility function around allocaBytes. Here is the code from the 
repository:

   with val f =
     alloca $ \ptr -> do
       poke ptr val
       res <- f ptr
       return res

(Hmmm, why not simplify the last two lines to just f ptr?) GHC does some 
tricky things you probably don't want to know about :-) to do something 
better for alloca than an exception-protected malloc/free pair.

In a nutshell: with is not better than allocaBytes, it is something 
different. with can be used to pass a Storable Haskell value in a temporary 
memory buffer to a function, while allocaBytes only does plain temporary 
memory allocation.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] FFI and modifying haskell memory

2005-10-29 Thread Sven Panne
Am Samstag, 29. Oktober 2005 14:27 schrieb Joel Reymont:
 So both with and allocaBytes allocate bytes on the stack then, correct?

It depends on what you mean by stack. :-) From a conceptual point of view, 
both pass a pointer to a temporary memory region to a given action *which is 
only valid during the execution of the action*, so it would be incorrect 
if the pointer somehow escaped the action.

How this temporary memory is actually allocated depends on the implementation: 
Hugs and NHC use malloc/free, ensuring that free is even called in case of an 
exception, so that there will be no space leaks. GHC does something more 
efficient by allocating the memory from the normal Haskell runtime heap, but 
ensuring that the memory is never moved by garbage collection. This is called 
pinned memory sometimes, see e.g. Java's JNI.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] FFI and modifying haskell memory

2005-10-22 Thread Sven Panne
Am Samstag, 22. Oktober 2005 01:42 schrieb John Meacham:
 On Fri, Oct 21, 2005 at 03:19:36PM +0100, Joel Reymont wrote:
  Is there a particular reason why StablePtr cannot provide a fixed
  memory address? Then 4 bytes of memory won't need to be allocated so
  that C could write to them and C could just modify the Haskell variable.

 because haskell values don't have the same representation as C values.
 haskell values are pointers to updatable thunks. in any case 'with'
 just allocates 4 bytes on the stack (the same as a auto C declaration)
 so is quite speedy compared to what it would take to make a haskell
 value look like a C one. not to mention haskell values can't be
 modified.

2 tiny remarks:

 * In Simon's example the first parameter to 'with' (the initial buffer size) 
was missing, but I guess you've figured this out already. In general: If the 
pointer you're passing to C has "in" semantics, 'with' (or one of its 
variations) is your friend. For "in/out" semantics, use 'with' + 'peek', and 
for "out" semantics use 'alloca' + 'peek' (see the small sketch after these notes).

 * As already mentioned, StablePtr can even refer to something which has no 
direct C counterpart, like Haskell functions. Needing them in the C world 
seems strange at first, because there is nothing thrilling you can do with 
them but pass them back to Haskell. But there are actually some good uses for 
that, like interfacing to callback-based C APIs where some additional 
bookkeeping is needed.
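
A minimal sketch of those three idioms (the C functions are hypothetical and
exist only to show the parameter directions):

   {-# LANGUAGE ForeignFunctionInterface #-}
   import Foreign (Ptr, alloca, peek, with)
   import Foreign.C.Types (CInt)

   foreign import ccall unsafe "takes_int"   c_in    :: Ptr CInt -> IO ()
   foreign import ccall unsafe "updates_int" c_inout :: Ptr CInt -> IO ()
   foreign import ccall unsafe "yields_int"  c_out   :: Ptr CInt -> IO ()

   inOnly :: CInt -> IO ()
   inOnly x = with x c_in

   inOut :: CInt -> IO CInt
   inOut x = with x $ \p -> c_inout p >> peek p

   outOnly :: IO CInt
   outOnly = alloca $ \p -> c_out p >> peek p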

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Haskell, SDL, OpenGL

2005-07-20 Thread Sven Panne
Am Montag, 18. Juli 2005 18:46 schrieb yin:
 [...]
 ld-options: -L/usr/lib -Wl -rpath /usr/lib -lSDL

This looks a bit suspicious: The syntax for ld options is -rpath DIR, so the 
option for gcc should be -Wl,-rpath,DIR. Ugly, but I didn't invent 
that. :-) Furthermore, I've never seen a Linux/*nix system where the 
(dynamic) linker doesn't look into /usr/lib, so probably the best way is to 
simply use:

   ld-options: -lSDL

In addition, the -rpath option can be a bit surprising for the user of the 
executable later, depending on OS/platform peculiarities.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] class Ref...

2005-06-12 Thread Sven Panne

[EMAIL PROTECTED] wrote:

Quoting Gracjan Polak [EMAIL PROTECTED]:
[...]

Is there any reason why isn't it included?



Nobody could agree on the details.  For example, MVars are perfectly
respectable Refs on the IO monad.  So would it make sense to add an
instance for that?  If so, the functional dependency should go, which
introduces its own problems.


A few more design problems:

 * Due to the functional dependency, that class is not Haskell98, which
   is a *very* good reason IMHO not to standardize it, at least in that
   way. Remember: There are not only GHC and Hugs out there...

 * The 3 operations should not be packed together in a single class,
   because there might be e.g. references which you can't create (e.g.
   OpenGL's state variables), references which are read-only and even
   references which are write-only (a rough sketch of such a split
   follows below the list).

 * What about strictness of e.g. the setter? There is no right version,
   this depends on the intended usage.

 * Are the references located in the monad (like in the suggested class)
   or are they within objects, which have to be given as additional
   arguments (e.g. like wxHaskell's widgets/Attr/Prop).

 * Atomic operations might be needed, too.
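
A rough sketch of the kind of split mentioned above (all names invented, and
specialised to IO for the sake of Haskell 98):

   import Data.IORef (IORef, readIORef, writeIORef)

   class HasGetter r where
      getRef :: r a -> IO a

   class HasSetter r where
      setRef :: r a -> a -> IO ()

   -- IORefs support both capabilities...
   instance HasGetter IORef where getRef = readIORef
   instance HasSetter IORef where setRef = writeIORef
   -- ...while a read-only or write-only reference type would simply omit
   -- one of the two instances.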

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] CGI module almost useless

2005-06-11 Thread Sven Panne

[ Moving this thread slowly to the libraries list... ]

Bjorn Bringert wrote:

John Goerzen wrote:


My apologies if this sounds like a bit of a rant; I know people put good
effort into this, but

The Network.CGI module in fptools (and GHC) is not very useful.  I think
that it should be removed or re-tooled.  Here are the main problems with
it:

1. It does not permit custom generation of output headers.  Thus the CGI
script cannot do things like set cookies, manage HTTP auth, etc.

2. It does not permit generation of anything other than text/html
documents.  Many CGI scripts are used to manage other types of
documents.  Notably this makes it incompatible with serving up even
basic things like stylesheets and JPEGs.

3. It does not permit the use of any custom design to serve up HTML,
forcing *everything* to go through Text.Html.  This makes it impossible
to do things like serving up HTML files from disk.

4. There is documentation in the code, but it is as comments only, and
doesn't show up in the Haddock-generated GHC library reference.  (Should
be an easy fix)

5. It does not appear to support file uploads in any sane fashion

Is there a better CGI module out there somewhere that I'm missing, or
should I just set about writing my own?



I wrote this module (based on the Network.CGI code) a while ago:

http://www.dtek.chalmers.se/~d00bring/darcs/blob/lib/Network/SimpleCGI.hs

I don't remember what it does really, but I think it solves issues 1,2,3 
and some of 4.


Although (among other people) I did some hacking in this module in the remote
past, I don't have the time and energy to maintain and/or extend this module
anymore. It would be really great if somebody more actively working in this
area could take the spec lead here and push the development via discussions
here on this library list. John? Björn? A few general design thoughts:

 * To keep people's mind sane, backwards compatibility with the existing
   Network.CGI would be a very worthy goal.

 * Don't use any Haskell language extension available. :-) Currently the
   module can be used e.g. by Hugs in H98 mode, and keeping it that way
   would again be something very desirable.

Cheers,
   S.

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Must be a FAQ - Can't make install Hugs98-Nov2003 on MacOSX 10.3.8

2005-02-25 Thread Sven Panne
Arthur Baars wrote:
See the hugs-bugs archive:
http://www.mail-archive.com/hugs-bugs@haskell.org/msg02815.html
Malcolm Wallace wrote:
The configure script is (wrongly) determining that the MacOS X C
compiler does not support Floats/Doubles.  Ideally, the autoconf magic
which determined this setting should be fixed, [...]
Hmmm, I'm not sure if the autoconf magic has been fixed. Does it work with 
Hugs from
CVS HEAD? If not, could somebody please send a patch for it or at least a log + 
all
involved config.logs? I don't have access to a Mac...
Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: File path programme

2005-01-31 Thread Sven Panne
Peter Simons wrote:
[...]
There also is a function which changes a path specification
into its canonic form, meaning that all redundant segments
are stripped. So although two paths which designate the same
target may not be equal, they can be tested for equivalence.
Hmmm, I'm not really sure what equivalence for file paths should
mean in the presence of hard/symbolic links, (NFS-)mounted file
systems, etc.  Haskell's stateless (==) function doesn't really
make sense IMHO, but perhaps I've missed something in this epic
discussion... :-]
Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: File path programme

2005-01-31 Thread Sven Panne
Peter Simons wrote:
Sven Panne writes:
  Hmmm, I'm not really sure what equivalence for file
  paths should mean in the presence of hard/symbolic links,
  (NFS-)mounted file systems, etc.
Well, there is a sort-of canonic version for every path; on
most Unix systems the function realpath(3) will find it.
OK, but even paths which realpath normalizes to different things might
be the same (hard links!). This might be OK for some uses, but not for
all.
My interpretation is that two paths are equivalent iff they
point to the same target. [...]
This would mean that they are equal iff stat(2) returns the same 
device/inode
pair for them. But this leaves other questions open:
 * Do we have something stat-like on every platform?
 * What does this mean for network file systems, e.g. in the presence of
   the same files/directories exported under different NFS mounts? I don't
   have enough books/manual pages at hand to answer this currently...
 * What does this mean if the file path doesn't refer to an existing
   file/directory?
IMHO we can provide something like realpath in the IO monad, but shouldn't
define any equality via it.
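For what it's worth, a minimal sketch of such an IO-level check (the name
sameFile and the use of System.Posix.Files from the unix package are my own
choices here, with all the caveats above: POSIX only, existing files only):

   import System.Posix.Files (getFileStatus, deviceID, fileID)

   -- Two paths are treated as equivalent iff stat(2) reports the same
   -- device/inode pair for both of them.
   sameFile :: FilePath -> FilePath -> IO Bool
   sameFile p q = do
      sp <- getFileStatus p
      sq <- getFileStatus q
      return (deviceID sp == deviceID sq && fileID sp == fileID sq)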
Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] List manipulation

2005-01-26 Thread Sven Panne
Jules Bean wrote:
[...] You rather want 'zipWith'.  Documentation at:
http://www.haskell.org/ghc/docs/latest/html/libraries/base/GHC.List.html
...along with lots of other funky list processing stuff.
Just a small hint: Everything below GHC in the hierarchical libraries
is, well, GHC-specific, meant for internal use only and may change without
further notice, see Stability: internal in the page you mentioned. Just
use http://haskell.org/ghc/docs/latest/html/libraries/base/Data.List.html
instead.
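And for the record, zipWith itself in action at the GHCi prompt:

   Prelude> zipWith (+) [1,2,3] [10,20,30]
   [11,22,33]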
Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: Non-technical Haskell question

2004-12-06 Thread Sven Panne
Philippa Cowderoy wrote:
On Mon, 6 Dec 2004, Jules Bean wrote:
[...] If that is the case, then it already is 'smart linking' and I stand
corrected. Unless the granularity of the .o files is too large, of
course...
It is - you get one .o per module.
That's not true with -split-objs, e.g. the base package consists of almost
12000 *.o files and one module of my OpenGL package is compiled into 3000
*.o files alone. One can't really get much smaller...
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: Non-technical Haskell question

2004-12-06 Thread Sven Panne
Robert Dockins wrote:
[...] In particular, if we 
could segment closely related code with many interdependencies into 
discrete units with well defined external interfaces (sound like 
packages to anyone else?), then my intuition tells me that the cost of 
setting up an inlining barrier should be fairly low.  Module inlining 
_within_ a package would still occur, just not _between_ packages.
This reasoning might be valid for traditional languages, but not for
languages like Haskell promoting the use of higher order, typically
small functions. Not inlining most monads would probably be catastrophic,
as would be not doing so for our beloved map, foldr, etc. And IMHO C++
has given up this kind of binary compatibility completely long ago when
code in headers was introduced, so Haskell is in good company with one of
the most widely used languages. Not really a good excuse, but a fact...
And just a remark: We don't need a new technique for a "no inline" barrier:
Just compile the library optimized and use a facade which re-exports your
public API compiled without optimizations.
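To sketch that facade idea (module names, the function and its type are made
up, and the per-module OPTIONS_GHC pragma is just one way to switch off
optimisations for a single file; the facade defines a thin wrapper binding
rather than a literal re-export so that no unfolding is exposed):

   {-# OPTIONS_GHC -O0 #-}
   module MyLib (frobnicate) where

   import qualified MyLib.Internal as I   -- compiled with -O2 as usual

   -- Compiled without optimisations, this binding carries no unfolding in
   -- the interface file, so callers outside the package always go through
   -- a plain call instead of getting the internals inlined.
   frobnicate :: Int -> Int
   frobnicate = I.frobnicate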
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: Non-technical Haskell question

2004-12-06 Thread Sven Panne
Jules Bean wrote:
I don't think it does, actually. You can imagine a compiler which has 
access to not *only* the .so files, but also the haskell source. 
Therefore it can still unroll (from the source), but it can choose to 
link to an exported symbol if unrolling isn't worth it.
But that's not dynamic linking... Imagine a bug in version X of your lib,
simply using version X+1 with your already compiled program won't fix that
bug. Again, this is just like C++.
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Non-technical Haskell question

2004-12-06 Thread Sven Panne
[EMAIL PROTECTED] wrote:
The original observation was that the compiler seems archaic. When
asked, I gave some general comments. What I should have just said was
that it was too much like a C compiler. Which, no matter how neat you
think it is, is archaic.
Hmmm, using the number of files generated from a source program as a measure
of the coolness of a programming language or its compiler is extremely
strange. There's nothing I could care less about if the language itself
fulfills my needs. Do you care about the strange intermediate files
VisualStudio produces? The contents of your CVS or .svn subdirectories?
I'm quite happy being able to ignore these things...
 When I use javac every file that is created is necessary for the
application to run. This can't be said of the ghc compiler. Having an
excuse that this is the way the C compiler does it or that this is the way
it's always been done is too poor a reason to even argue against. If a
file isn't needed then it shouldn't be left there. 
Using Java class files as a good example is strange again: Java *does*
inline code, namely primitive constants, without leaving any traces of that
fact in the class file. That's part of the reason why every recompilation
checker for Java can only do an approximate job without actually *doing*
the compilation. GHC handles this much better.
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: [Haskell] Real life examples

2004-11-26 Thread Sven Panne
Keean Schupke wrote:
[...] I don't even need to recompile your module, simply providing
the alternate Storable module at link time is sufficient. [...]
[ Completely off-topic for this thread ] But this *won't* work in the
presence of cross-module inlining, e.g. when you are using GHC with
-O or -O2. And IMHO this aggressive inlining is a very good thing.
Haskell is not C. :-)
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Are handles garbage-collected?

2004-10-24 Thread Sven Panne
Conal Elliott wrote:
I'm puzzled why explicit bracketing is seen as an acceptable solution.
It seems to me that bracketing has the same drawbacks as explicit memory
management, namely that it sometimes retains the resource (e.g., memory
or file descriptor) longer than necessary (resource leak) and sometimes
not long enough (potentially disastrous programmer error).  Whether the
resource is system RAM, file descriptors, video memory, fonts, brushes,
bitmaps, graphics contexts, 3D polygon meshes, or whatever, I'd like GC
to track the resource use and free unused resources correctly and
efficiently.
The lifetime problems are worse without explicit bracketing IMHO: One has
to wait for the next GC to free them (resource leaks) and even without
bracketing there's nothing to stop you from using a Handle after closing.
Furthermore, the main difference between simple RAM and more external
things like Handles is that the effect of keeping them occupied is
observable from outside our happy Haskell world, e.g. a server socket
might stay open, taking away a port number, an external device with
exclusive access is unusable until the next GC, my valuable texture memory
on the graphics card contains unused textures until the GC runs, etc. etc.
Finalizers are not a solution for these problems, a fact that e.g. a lot
of novice Java programmers have to learn when they do their first real
world programs...
IMHO it would be best to use explicit bracketing where possible, and hope
for the RTS/GC to try its best when one runs out of a given resource.
Admittedly the current Haskell implementations could be improved a little
bit in the last respect.
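As a concrete illustration of the explicit-bracketing style via
Control.Exception.bracket (the helper name is made up):

   import Control.Exception (bracket)
   import System.IO (IOMode(ReadMode), openFile, hClose, hGetLine)

   -- The handle is released as soon as the action finishes (or throws an
   -- exception), not at some unknown future garbage collection.
   firstLine :: FilePath -> IO String
   firstLine path = bracket (openFile path ReadMode) hClose hGetLine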
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Writing binary files?

2004-09-12 Thread Sven Panne
Glynn Clements wrote:
[...]
main :: IO ()
main = do
h <- openBinaryFile "out.dat" WriteMode
hPutStr h $ map (octetToChar . bitsToOctet) bits
hClose h
Hmmm, using string I/O when one really wants to do binary I/O gives me a bad
feeling. Haskell characters are defined to be Unicode characters, so the
above only works because current Haskell implementations usually get this wrong
(either no Unicode support at all and/or ignoring any encodings and doing I/O
only with the lower 8 bits of the characters)... hGetBuf/hPutBuf plus their
non-blocking variants are the only way to *really* do binary I/O currently.
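For comparison, a small sketch of byte-exact output via hPutBuf (the helper
name writeBytes is made up, error handling omitted), which mirrors the
open/write/close structure above without going through Chars at all:

   import Data.Word (Word8)
   import Foreign.Marshal.Array (withArray)
   import System.IO (IOMode(WriteMode), openBinaryFile, hClose, hPutBuf)

   -- Writes the given bytes verbatim; no character encoding is involved.
   writeBytes :: FilePath -> [Word8] -> IO ()
   writeBytes path bytes = do
      h <- openBinaryFile path WriteMode
      withArray bytes $ \ptr -> hPutBuf h ptr (length bytes)
      hClose h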
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Writing binary files?

2004-09-12 Thread Sven Panne
Glynn Clements wrote:
The problem with this approach is that the entire array has to be held
in memory, which could be an issue if the amount of data involved is
large.
Simple reasoning: If the amount of data is large, you don't want the overhead
of lists because it kills performance. If the amount of data is small, you
can easily use similar code to read/write a single byte. :-)
Of course things are a bit different when you are in the blissful position
where lazy I/O is what you want. This implies that you expect a stream of
data of a single type. How often is this really the case? And I'm not sure
if this the correct way of doing things even when the data involved wouldn't
fit into memory all at once. I'd prefer something mmap-like then...
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Writing binary files?

2004-09-11 Thread Sven Panne
Hal Daume III wrote:
There's a Binary module that comes with GHC that you can get somewhere (I 
believe Simon M wrote it).  I have hacked it up a bit and added support 
for bit-based writing, to bring it more in line with the NHC module.  
Mine, with various information, etc., is available at:

  http://www.isi.edu/~hdaume/haskell/NewBinary/
Hmmm, I'm not sure if that is what Ron asked for. What I guess is needed is
support for things like:
   read the next 4 bytes as a little-endian unsigned integer
   read the next 8 bytes as a big-endian IEEE 754 double
   write the Int16 as a little-endian signed integer
   write the (StorableArray Int Int32) as big-endian signed integers
   ...
plus perhaps some String I/O with a few encodings. Alas, we do *not* have
something in our standard libs, although there were a few discussions about
it. I know that one can debate for ages about byte orders, external representation
of arrays and floats, etc. Nevertheless, I guess covering only little-/big-endian
systems, IEEE 754 floats, and arrays as a simple 0-based sequence of its
elements (with an explicit length stated somehow) would make at least 90% of all
users happy and would be sufficient for most real world file formats. Currently
one is bound to hGetBuf/hPutBuf, which is not really a comfortable way of doing
binary I/O.
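Just to make the first wish-list item above concrete, a hand-rolled sketch on
top of hGetBuf (the function name and the error handling are ad hoc, and a
proper library would of course be nicer):

   import Data.Bits (shiftL, (.|.))
   import Data.Word (Word8, Word32)
   import Foreign.Marshal.Array (allocaArray, peekArray)
   import System.IO (Handle, hGetBuf)

   -- Read the next 4 bytes as a little-endian unsigned 32-bit integer.
   hGetWord32le :: Handle -> IO Word32
   hGetWord32le h = allocaArray 4 $ \buf -> do
      n <- hGetBuf h buf 4
      if n < 4
         then ioError (userError "hGetWord32le: short read")
         else do
            bytes <- peekArray 4 buf :: IO [Word8]
            -- the first byte is the least significant one
            return (foldr (\b acc -> (acc `shiftL` 8) .|. fromIntegral b) 0 bytes)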
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Working debuggers?

2004-07-05 Thread Sven Panne
Jim Apple wrote:
I downloaded and compiled buddha, but it apparently does not support 
Control.Monad.State. I downloaded Hat, but it requires hmake. Hmake 
fails to build (GHC 6.2.1).
You can checkout hmake/nhc98 from CVS
   cvs -d :pserver:[EMAIL PROTECTED]:/cvs co nhc98
and give it a try. It compiles with hbc, ghc (HEAD) and of course itself,
so I guess ghc 6.2.1 will be no problem.
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Using Product Algebraic types

2004-07-05 Thread Sven Panne
David Menendez wrote:
Neat! Are you getting this from -ddump-simpl?
Yep, that's the most readable intermediate form to look at IMHO.
[...]
helper2 = \ds eta ->
    case ds of
      (n,a) -> ( case eta of
                   (x,y) -> x `plusInt` y
               , case eta of
                   (x,y) -> y
               )
[...] I don't know what to make of that. Semantically, helper2 is identical to
helper, but I'm not brave enough to look at the C output to see if they
produce different results.
Well, they are almost equivalent: If you keep the second element of the result of
helper2 alive without having it evaluated, you keep the first element of the
second argument of helper2 alive, too. Depending on the usage of helper2, this
might make a difference in the space behaviour, but in our current example it
doesn't matter very much. If in doubt, use e.g. GHC's profiling facility, which
can produce nice, colourful graphs, useful for impressing your friends... :-)
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] A binary tree

2004-07-04 Thread Sven Panne
paolo veronelli wrote:
I want to build a binary tree where each leaf is a string of L and R 
defining their position from the top

This should be done without context isn't it?
I'm not sure what you mean with context here, but anyway...
data Tree a = Fork (Tree a) (Tree a) | Leaf a deriving Show
t = Leaf ""
treeGrower :: Tree a -> Tree a
treeGrower (Leaf a) = treeGrower (Fork (Leaf (a++"1")) (Leaf (a++"2")))
treeGrower (Fork l r)  = Fork (treeGrower l) (treeGrower r)
ghci says:
Cannot unify the type-signature variable `a' with the type `[a1]'
Expected type: [a1]
Inferred type: a
In the first argument of `(++)', namely `a'
In the first argument of `Leaf', namely `(a ++ "1")'
I don't get it.
That means that the type for treeGrower is wrong: Because (++) is used with
the elements contained in the leaves, these elements must be of a list type
(i.e. [a1]). But the type signature pretends that treeGrower could be applied
to trees with any kind of leaf elements (i.e. a). But even
   treeGrower :: Tree [a1] -> Tree a
would be too general, as GHC would have found out. Strings are appended to the
leaf elements, so the correct type would be
   treeGrower :: Tree String -> Tree a
Even without looking at any code, this should give you an uneasy feeling.
It essentially says: You have to give me a tree, and I will return a tree
with leaf elements of any kind you want!? This is a strong hint that treeGrower
will not have a finite value, try it for yourselves.
To achieve what you want, construct the (reverse) path while descending the
tree and plumb it into any leaf encountered:
   labelTree :: Tree a -> Tree String
   labelTree tree = label tree ""
  where label (Leaf _)   path = Leaf (reverse path)
label (Fork l r) path = Fork (label l ('l':path)) (label r ('r':path))
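A quick check in GHCi, assuming the Tree type quoted above with its
deriving Show (the example tree itself is of course made up):

   *Main> labelTree (Fork (Fork (Leaf ()) (Leaf ())) (Leaf ()))
   Fork (Fork (Leaf "ll") (Leaf "lr")) (Leaf "r")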
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Using Product Algebraic types

2004-07-04 Thread Sven Panne
David Menendez wrote:
[...] If that turned out to be a performance bottleneck, you could factor out
pair and write f directly: [...]
... or directly complain to your compiler supplier if the compiler in question
does not do this simple transformation for you. :-)
<sigh>
   I always get a bad feeling when people start to think about efficiency right
   from the beginning: First get your program correct and readable, then measure,
   and only then optimize (if at all). Programmers are notoriously bad when
   guessing about efficiency, which is even more true for lazy functional programs.
</sigh>
Let's e.g. have a look at the code generated by GHC for the inefficient version:
-
helper :: (Fitness, a) -> (Fitness, a) -> (Fitness, a)
helper = \ds eta ->
   case ds of
      (f, a) -> case eta of
                   (g, b) -> (g `plusInteger` f, b)

rw :: Population a -> Population a
rw = \ds ->
   case ds of
      Population xs ->
         Population (case xs of
                        (x:xs1) -> scanl helper x xs1
                        []      -> [])
-
Or in a more readable, but equivalent, form:
-
helper :: (Fitness, a) -> (Fitness, a) -> (Fitness, a)
helper (f, a) (g, b) = (g `plusInteger` f, b)

rw :: Population a -> Population a
rw (Population xs) = Population (case xs of
                                    (x:xs1) -> scanl helper x xs1
                                    []      -> [])
-
What has happened? `pair', `id', and `scanl1' were completely inlined and instead
of the overloaded (+), an Integer-specific addition was chosen (`plusInteger',
it's not the real name, just something for presentation purposes). Although these
are all only O(1) optimizations (nothing really clever), one can hardly do better
by hand... So keep the program in a form which is most comprehensible, even if
this seems to imply some superfluous things at first. This enables you to have
more time for more interesting things which could really have some effect on
performance, like clever algorithms etc.
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Running Total or Incremental Sum as a higher order type ?

2004-07-03 Thread Sven Panne
Crypt Master wrote:
[...] Surely this is a pattern which has been abstracted ? I feel I 
have missed the obvious here.
There is a prelude function scanl1 for this kind of pattern, so you could write:
   incrementalSum = scanl1 (+)
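A quick check at the GHCi prompt:

   Prelude> scanl1 (+) [1,2,3,4]
   [1,3,6,10]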
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] c2hs and gtk+hs problem

2004-05-18 Thread Sven Panne
scott west wrote:
Hello all,
   I've recently attempted to get the gtk+hs bindings operational, with 
evidently no success. They both compile fine, but when trying to make 
all the examples in the gtk+hs tree, it gives up with:

/usr/local/lib/c2hs-0.12.0/ghc6/libc2hs.a(C2HS.o)(.text+0x32): In 
function `__stginit_C2HS_':  [...]
This suspiciously looks like a missing -package haskell98, try adding this
to the linking step.
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] GHC and libc

2004-05-16 Thread Sven Panne
Per Larsson wrote:
[...] P.S Now everything seems to work, except that I get the compiler message:
 
/usr/local/lib/ghc-6.2/libHSunix.a(User__17.o)(.text+0x160): In function
'SystemziPosixziUser_getUserEntryForName_entry':
: Using 'getpwnam_r' in statically linked applications requires at runtime the 
shared libraries from the glibc version used for linking

This seems to indicate that there are a few functions (probably in the Posix 
package) which can't be can't be statically linked?
Yes, this seems to be the case for some functions in glibc, see e.g.:
   http://www.busybox.net/lists/busybox/2004-May/011474.html
and the reply:
   http://www.busybox.net/lists/busybox/2004-May/011475.html
No idea what this means in practice, though...
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] GHC and libc

2004-05-15 Thread Sven Panne
Per Larsson wrote:
[...] Is there a GHC switch that I have missed that enables you to statically link 
the parts of libc that is used by the haskell program? [...]
Passing -static to the GNU linker results in a, well, statically linked program. :-)
Using -optl -static with GHC does what you want, see:
   
http://haskell.org/ghc/docs/latest/html/users_guide/options-phases.html#FORCING-OPTIONS-THROUGH
And here the proof:
   [EMAIL PROTECTED]: ghc --make -optl -static Main.hs
   Chasing modules from: Main.hs
   Compiling Main ( Main.hs, Main.o )
   Linking ...
   [EMAIL PROTECTED]: file a.out
   a.out: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), for GNU/Linux 
2.2.5, statically linked, not stripped
   [EMAIL PROTECTED]: ldd a.out
not a dynamic executable
For non-GNU linkers consult the man pages for ld, differing linking techniques
are one of the most annoying incompatibilities between different *nices IMHO.
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] handling read exceptions

2004-04-13 Thread Sven Panne
S. Alexander Jacobson wrote:
My point is that I am reading in name/value pairs
and once I know the name, I know the type of the
value, but I don't want to have to pass that
information programatically to the point in the
code where I am doing the read.
OK, I see... I don't know the exact syntax you are using (e.g. how
are the strings terminated?), but reads is still useful:
   readIS :: ReadS (Either Integer String)
   readIS s = take 1 $
   [ (Left  x, t) | (x, t) <- reads s ] ++
   [ (Right x, t) | (x, t) <- lex   s ]
Then we have:

   Main> readIS "123blah"
   [(Left 123,"blah")]
   Main> readIS "blah123"
   [(Right "blah123","")]
   Main> readIS ""
   [(Right "","")]
   Main> readIS "foo bar"
   [(Right "foo"," bar")]
If you have only simple parsing tasks and are not looking for extreme
performance, the Read class is a good choice. Otherwise you should
probably have a look at the Parsec package which comes with Hugs and GHC:
   http://www.haskell.org/ghc/docs/latest/html/libraries/parsec/Text.ParserCombinators.Parsec.html

or Happy:

   http://haskell.org/happy/

Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] A useful trick for inclusion and exclusion of commented code

2004-04-12 Thread Sven Panne
Wolfgang Jeltsch wrote:
does this trick also work with GHC?  I think that GHC needs a space after --
if these two dashes shall introduce a one-line comment.
It works with GHC, too, and this conforms to the H98 report, section 9.2
(Lexical Syntax):
   http://haskell.org/onlinereport/syntax-iso.html#sect9.2

The main reason for this to work is that the braces are classified as special.
If they were classified as symbol, '--}' would be a varsym, not the start
of a comment.
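For completeness, one variant of such a toggle (not necessarily the exact one
from the original posting) exploits exactly this classification; the module
and binding names below are made up:

   module ToggleDemo where

   {--}
   debugCode :: IO ()
   debugCode = putStrLn "debugging enabled"
   --}

   -- With "{--}" (an empty block comment) in front, the code is active and
   -- the final "--}" is just a one-line comment, precisely because '}' is
   -- special and not a symbol.  Dropping the '}' (leaving "{-") comments the
   -- whole block out: the block comment then runs up to the "-}" hidden
   -- inside the last line.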
Cheers,
   S.


___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] handling read exceptions

2004-04-12 Thread Sven Panne
S. Alexander Jacobson wrote:
I want to read strings that look like "2" or
"hello" into values of type Integer or String.
The problem is that read requires that strings be
read as "\"hello\"".  Is there a way either to
convince read to not require wrapping quotation
marks or, alternatively, to catch a read
exception, and do something sane?
reads is probably what you are looking for:

   Prelude> (reads :: ReadS Integer) ""
   []
   Prelude> (reads :: ReadS Integer) "a"
   []
   Prelude> (reads :: ReadS Integer) "2"
   [(2,"")]
   Prelude> (reads :: ReadS Integer) "123blah"
   [(123,"blah")]
And reading a string the way you want is best done by id. :-)

Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] matching constructors

2004-03-08 Thread Sven Panne
Vadim Zaliva wrote:
[...] It slightly bothers me that this solution seems to be using non-standard 
GHC extensions.
Hmmm, using generics seems like massive overkill for option handling. Could you
describe what you are exactly trying to achieve?
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: Perspectives on learning and using Haskell

2004-01-05 Thread Sven Panne
Duncan Coutts wrote:
On Sun, 2004-01-04 at 10:20, Graham Klyne wrote:
[...]  I would expect that when using GHC to compile a 
stand-alone Haskell program, any expressions that are not referenced are 
not included in the final object program, so leaving these test cases 
uncommented would be harmless:  is this so?
If your test functions are not exported, I would expect that this is the
case. [...]
Yes, unused functions which are not exported are nuked during compilation,
even without using the -O flag. But don't guess, just ask GHC itself via its
-ddump-occur-anal flag. (DISCLAIMER: I'm not responsible for the, well,
slightly obscure name of this flag! :-)   There are a lot more flags of this
kind, see:
   http://haskell.org/ghc/docs/latest/html/users_guide/options-debugging.html#DUMPING-OUTPUT

When you are *really* curious, use -v5.

Simon^2: The -ddump-all and -ddump-most flags mentioned on the page above are
not working anymore, -v5 / -v4 seem to do their job now. Should the documentation
be fixed or GHC?
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: Preventing/handling space leaks

2003-12-08 Thread Sven Panne
[ Just one more mail and I promise to shut up on this topic... :-) ]

Fergus Henderson wrote:
[...] C++ does suffer from many of the same problems as C.  But in C++, it is
much easier to automate techniques like reference counting, which can
be done manually in C but are much more cumbersome and error-prone when
done manually.
Granted, C++'s (copy) constructors, destructors and assignment operators make some
things relatively easy compared to C, but the complexity of handling exceptions
*correctly* makes things worse again: There is a famous article (I can't remember the
title or the author) where a well-known C++ grandmaster explains a stack class,
but another later article by someone else describes the numerous bugs in that class
when exceptions are taken into account.
And one final remark on Haskell and Java: In large projects in those languages you
usually get the genuine space leaks in those languages plus all those nice little
leaks from the ubiquitous native functions/methods. So you have to debug in at least
two languages at the same time and I haven't seen combined tool support for this yet...
:-P
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: Preventing/handling space leaks

2003-12-06 Thread Sven Panne
Henk-Jan.van.Tuyl wrote:
[...] it looks to me, that the problem of space leaks is a very good reason
to not use Haskell for commercial applications. Java, for example, does not 
have this problem.
I just can't resist when I read PR statements like this (SUN's marketing department
has *really* done a good job): Granted, Haskell has problems with space leaks
from time to time, and it is especially easy for beginners to stumble over them,
see e.g. the recurring foldl (+) 0 [1..100]-discussions popping up regularly.
But for large realistic programs most programming languages converge and you basically
have the choice of what kind of space leak you want:
   * C: Missing calls to free(), etc.

   * C++: All of C's leaks + lots of hard to find space leaks due to incorrectly
 handled exceptions + ...
   * Haskell: Functions which are not strict enough, thunks which are never evaluated
 but hold large data structures, etc.
   * Java: Listeners which are not de-registered, containers which are not nulled
 after removal of an element, badly written cache-like data structures, etc.
Note that I earn my living with JVMs and Java programs (some 1 million LOC), but every
project I worked for had problems with space leaks at some point. The advantage of
Haskell IMHO is that you normally get them very early...   :-]
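To make the foldl example mentioned above concrete (a tiny sketch, scaled up
to a million elements so that the effect is actually visible; the binding
names are made up):

   import Data.List (foldl')

   -- foldl builds a long chain of unevaluated (+)-thunks before anything is
   -- demanded; the strict foldl' forces the accumulator at every step and
   -- runs in constant space.
   leaky, strict :: Integer
   leaky  = foldl  (+) 0 [1 .. 1000000]
   strict = foldl' (+) 0 [1 .. 1000000]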
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: Type tree traversals [Re: Modeling multiple inheritance]

2003-11-15 Thread Sven Panne
Ralf Laemmel wrote:
[...]
find . -name configure.ac -print


to find all dirs that need autoreconf (not autoconf anymore)

autoreconf
(cd ghc; autoreconf)
(cd libraries; autoreconf)
FYI: Just issue autoreconf at the toplevel, and you're done. It will
descend into all necessary subdirectories, just like configure itself.
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: Instance checking and phantom types

2003-09-15 Thread Sven Panne
Nick Name wrote:
Hi all, I have an example wich I don't understand:
First of all, let's rename the constructors and types a bit to make
things clearer, add the instance in question, and remove the type
signatures:

module Main where
class C t
data T = MkT
instance C T
instance C ()
data C t => T1 t = MkT1

f1 = MkT1

data C t => T2 t = MkT2 t

f2 = MkT2 ()

Then we can easily ask GHC:


[EMAIL PROTECTED]:~ ghci -v0 Main.hs
*Main> :i T1 MkT1 f1 T2 MkT2 f2
-- T1 is a type constructor, defined at Main.hs:8
data (C t) => T1 t = MkT1
-- MkT1 is a data constructor, defined at Main.hs:8
MkT1 :: forall t. T1 t
-- f1 is a variable, defined at Main.hs:10
f1 :: forall t. T1 t
-- T2 is a type constructor, defined at Main.hs:12
data (C t) => T2 t = MkT2 t
-- MkT2 is a data constructor, defined at Main.hs:12
MkT2 :: forall t. (C t) => t -> T2 t
-- f2 is a variable, defined at Main.hs:14
f2 :: T2 ()

The first function, f1, is accepted both by hugs and ghc, unlike the 
second wich is rejected.

Why does this happen? Shouldn't f1 be rejected with "no instance C ()"?
The reason is buried in

   http://haskell.org/onlinereport/decls.html#sect4.2.1

In a nutshell: The context in datatype declarations has only an effect for
the *data* constructors of that type which use the type variables mentioned
in the context. Contexts have no effect for the *type* constructor. IIRC the
reason for this design decision was that contexts in type signatures should
always be explicit.
Cheers,
   S.
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe

