Re: Profiling and Data.HashTable

2005-10-17 Thread Ketil Malde
Jan-Willem Maessen [EMAIL PROTECTED] writes:

 The practical upshot is that, for a hash table with (say) 24
 entries, the GC must scan an additional 1000 pointers and discover
 that each one is [].

Would a smaller default size help?  In my case, I just wanted HTs for
very sparse tables.

 [Curious: what (if anything) is being used to test Data.HashTable?
 I'd be willing to undertake very small amounts of fiddling if I could
 be sure I wasn't slowing down something which mattered.]

I'd be happy to test it (or provide my test code).  My program isn't
too beautiful at the moment, but is tunable to distribute the word
counts over an arbitrary number of hash tables.

BTW, could one cheat by introducing a write barrier manually in some
way?  Perhaps by (unsafe?) thaw'ing and freeze'ing the arrays when they
are modified?

-k
-- 
If I haven't seen further, it is by standing in the footprints of giants

___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: Dynamic Map

2005-10-17 Thread Bjorn Bringert

Lajos Nagy wrote:

Would it be possible to implement a Map in Haskell that, when
asked for a key it doesn't have, would return a 'fresh'
(meaning: not already in the Map) value, and afterwards would
consistently return the same value for that key?  In other
words, it would behave like a dynamic map which silently
extends itself when someone asks for a non-existent key.  If
the user of this map only depends on the returned value being
different from the rest, then it's easy to see that this
dynamic map cannot break equational reasoning, so one would
expect to be able to assign it a non-monadic type:

lookup :: k -> DMap a -> a
insert :: k -> a -> DMap a -> DMap a

Of course, DMap cannot support certain operations because
that would break equational reasoning.  For example:

size :: DMap a - Int

would depend on the order of lookups.  However, if
we restrict the operations to insert and lookup then
ER is restored.  (And those two operations are all I
need.)

I tried several ways of implementing it but those
monadic types just kept cropping up in the map interface.

I'd appreciate any ideas or pointers.
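[A purely functional approximation of the idea, as a sketch: it threads the extended map back out of lookup, so the interface differs from the non-monadic signatures proposed above. Names here (DMap, lookupD, emptyDMap) are illustrative, not from the thread.]

```haskell
import qualified Data.Map as Map

-- Sketch only: fresh values are Ints handed out from a counter.
-- Unlike the proposed 'lookup :: k -> DMap a -> a', this pure version
-- must also return the (possibly extended) map.
data DMap k = DMap Int (Map.Map k Int)

emptyDMap :: DMap k
emptyDMap = DMap 0 Map.empty

lookupD :: Ord k => k -> DMap k -> (Int, DMap k)
lookupD k dm@(DMap next m) =
    case Map.lookup k m of
        Just v  -> (v, dm)
        Nothing -> (next, DMap (next + 1) (Map.insert k next m))
```

[Hiding the map-threading (for instance behind unsafePerformIO) is exactly where the monadic types start cropping up again.]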


You could use unsafePerformIO, if it doesn't make you feel dirty.

Here's what I do to achieve sharing of strings when parsing large files:

import Data.HashTable as H
import System.IO.Unsafe (unsafePerformIO)

{-# NOINLINE stringPool #-}
stringPool :: HashTable String String
stringPool = unsafePerformIO $ new (==) hashString

{-# NOINLINE shareString #-}
shareString :: String -> String
shareString s = unsafePerformIO $ do
    mv <- H.lookup stringPool s
    case mv of
        Just s' -> return s'
        Nothing -> do
            H.insert stringPool s s
            return s

It seems to work, but if any GHC gurus notice any problems, please let
me know.  OT: this one feels pretty fast; does anyone have a more
efficient implementation?


/Björn


RE: GHC-6.4.1 on FreeBSD-amd64 still not ported

2005-10-17 Thread Simon Marlow
On 15 October 2005 23:10, Wilhelm B. Kloke wrote:

 ghc/includes
 touch ghc/includes/{ghcautoconf.h,DerivedConstants.h,GHCConstants.h,mkDerivedConstants.c}
 touch ghc/includes/{mkDerivedConstantsHdr,mkDerivedConstants.o,mkGHCConstants,mkGHCConstants.o}
 touch ghc/includes/{ghcautoconf.h,DerivedConstants.h,GHCConstants.h}
 chflags uchg ghc/includes/{ghcautoconf.h,DerivedConstants.h,GHCConstants.h}
 (cd glafp-utils && gmake boot && gmake)
 (cd ghc && gmake boot && gmake)
 (cd libraries && gmake boot && gmake)
 (cd ghc/compiler && gmake boot stage=2 && gmake stage=2)
 (cd ghc/lib/compat && gmake clean; rm .depend; gmake boot UseStage1=YES EXTRA_HC_OPTS='-O -fvia-C -keep-hc-files'; gmake -k UseStage1=YES EXTRA_HC_OPTS='-O -fvia-C -keep-hc-files')
 (cd ghc/rts && gmake -k UseStage1=YES EXTRA_HC_OPTS='-O -fvia-C -keep-hc-files')
 (cd ghc/utils && gmake clean; gmake -k UseStage1=YES EXTRA_HC_OPTS='-O -fvia-C -keep-hc-files')
 gmake hc-file-bundle Project=Ghc

Thanks very much for this, I'll update the docs.

 Don't forget to delete Linker.c (for ghci). The stage on the host
 system where the process fails just now is
 $MAKE -C libraries boot all
 because
 Fake happy is not happy!

You mean on the target system?  Can you give more details?
 
 But ghc-inplace seems to work pretty good now on amd64.

Great!

Cheers,
Simon


RE: Profiling and Data.HashTable

2005-10-17 Thread Simon Marlow
On 14 October 2005 20:31, Jan-Willem Maessen wrote:

 That 5K number made me immediately suspicious, so I took a look at
 the source code to Data.HashTable.  Sure enough, it's allocating a
 number of large IOArrays, which are filled with pointers.  The
 practical upshot is that, for a hash table with (say) 24 entries, the
 GC must scan an additional 1000 pointers and discover that each one
 is [].
 
 I've seen other implementations of this two-level technique which use
 a smallish sEGMENT_SIZE in order to avoid excessive GC overhead for
 less-than-gigantic hash tables.  This might be worth doing in the
 Data.HashTable implementation.
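
[For reference, the index split in such a two-level scheme is just a divMod; a sketch with a hypothetical constant (Data.HashTable's actual segment size may differ):

```haskell
-- Hypothetical segment size; the real constant in Data.HashTable may differ.
sEGMENT_SIZE :: Int
sEGMENT_SIZE = 1024

-- A bucket index is split into (segment, offset), so only the segments
-- actually touched need to be allocated and scanned by the GC.
splitIndex :: Int -> (Int, Int)
splitIndex i = i `divMod` sEGMENT_SIZE
```

Shrinking sEGMENT_SIZE trades a longer segment directory for less dead space, and less GC scanning, in small tables.]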

 [Curious: what (if anything) is being used to test Data.HashTable?
 I'd be willing to undertake very small amounts of fiddling if I could
 be sure I wasn't slowing down something which mattered.]

The few benchmarks I've tried seem to indicate that Data.HashTable
performance isn't overwhelming, and can often be bettered by Data.Map.
I've never investigated too deeply.  GC overhead seems a very plausible
cause, though.
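
[As a concrete baseline for such benchmarks, a word-count workload against Data.Map fits in a few lines; this is a sketch, not the benchmark actually used by anyone in the thread:

```haskell
import Data.List (foldl')
import qualified Data.Map as Map

-- Pure Data.Map version of a word-count workload, usable as a
-- baseline when benchmarking Data.HashTable.
countWords :: [String] -> Map.Map String Int
countWords = foldl' (\m w -> Map.insertWith (+) w 1 m) Map.empty
```
]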

Please by all means tweak the implementation and try some benchmarks,
I'll incorporate any performance fixes that arise.

Cheers,
Simon


RE: Profiling and Data.HashTable

2005-10-17 Thread Simon Marlow
On 17 October 2005 08:07, Ketil Malde wrote:

 BTW, could one cheat by introducing a write barrier manually in some
 way?  Perhaps by (unsafe?) thaw'ing and freeze'ing the arrays when
 they are modified?

Might be worthwhile: freezing is very quick (basically a single write),
thawing is slightly slower (an out-of-line call, but a short one).  The
effect of freezing will only be felt after a couple of GCs, when
everything the array points to is pulled into the old generation and the
GC can take it off the mutable list.
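
[The manual freeze/thaw idea might be sketched like this (untested; unsafeFreeze and unsafeThaw are exported from Data.Array.Base, and note that unsafeThaw is only safe if the original immutable array is never used again afterwards):

```haskell
import Data.Array (Array)
import Data.Array.IO (IOArray, writeArray)
import Data.Array.Base (unsafeFreeze, unsafeThaw)

-- Keep the table as an immutable Array between updates, thawing only
-- for the write and freezing again immediately, so after a couple of
-- GCs the array's contents are promoted and it can leave the mutable
-- list.  Safe only if 'arr' is not referenced again by the caller.
writeSlot :: Array Int Int -> Int -> Int -> IO (Array Int Int)
writeSlot arr i x = do
    marr <- unsafeThaw arr :: IO (IOArray Int Int)
    writeArray marr i x
    unsafeFreeze marr
```
]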

Cheers,
Simon


Re[2]: Dynamic Map

2005-10-17 Thread Bulat Ziganshin
Hello Bjorn,

Monday, October 17, 2005, 11:48:10 AM, you wrote:

BB You could use unsafePerformIO, if it doesn't make you feel dirty.

BB Here's what I do to achieve sharing of strings when parsing large files:

BB It seems to work, but if any GHC gurus notice any problems, please let
BB me know. OT: This one feels pretty fast, does anyone have a more 
BB efficient implementation?

something like this is done in the ghc compiler itself :))) see FastString.lhs



-- 
Best regards,
 Bulat  mailto:[EMAIL PROTECTED]





GHC 6.4.1 barfs building haskell-src-exts on MingW/WinXP

2005-10-17 Thread Bayley, Alistair
Hello all,

I'm trying to build haskell-src-exts-0.2 with GHC 6.4.1 under MingW on
WinXP. It segfaults on the runhaskell Setup.hs build command (in the
src/haskell-src-exts subdir). Does anyone else get this, or is it just me?

sh-2.04$ runhaskell Setup.hs build -v5
Preprocessing library haskell-src-exts-0.2...
c:\happy\happy-1.13\bin\happy.exe -agc -oLanguage\Haskell\Hsx\Parser.hs
Language\Haskell\Hsx\Parser.ly
shift/reduce conflicts:  5
Building haskell-src-exts-0.2...
[segfault: instruction at 0 ref'd memory at 0]
sh-2.04$


How do I modify the Cabal setup to get GHC -v output (so I can see which
module it fails on)?

Alistair.


Re: GHC 6.4.1 barfs building haskell-src-exts on MingW/WinXP

2005-10-17 Thread Babo Attila

Bayley, Alistair wrote:
 I'm trying to build haskell-src-exts-0.2 with GHC 6.4.1 under MingW on
 WinXP. It segfaults on the runhaskell Setup.hs build command (in the
 src/haskell-src-exts subdir). Does anyone else get this, or is it 
just me?


Hi,

I have the same problem with haskell-src-exts and with hs-plugins as
well.  I tried with a default installation of ghc-6.4.1 and later with
the updated Cabal-1.1.4, but the result is the same.  Using the Makefile
instead of Cabal there is no problem, and it also works with vshaskell
or on a Linux box with the same software versions.


Here is the error message:

$ runhaskell Setup.hs build
Preprocessing library plugins-0.9.10...
Building plugins-0.9.10..

The instruction at 0x referenced memory at 0x000. The 
memory could not be read.


Regards:

Attila


GHC-6.4.1 on FreeBSD-amd64 port progress

2005-10-17 Thread Wilhelm B. Kloke
Simon Marlow [EMAIL PROTECTED] schrieb:
 On 15 October 2005 23:10, Wilhelm B. Kloke wrote:

 Don't forget to delete Linker.c (for ghci). The stage on the host
 system where the process fails just now is
 $MAKE -C libraries boot all
 because
 Fake happy is not happy!

 You mean on the target system?  Can you give more details?

Yes. Sorry for any confusion. The happy error message was an easy one
to fix, because I have a working 32-bit happy on the system.

 But ghc-inplace seems to work pretty good now on amd64.

I reached the end of hc-build successfully. Now there is a new problem.
I tried (as root) make install at this point. This fails with error messages
related to missing stage2 subdirectories. So I tried

gmake stage=2 boot

This fails with messages
...
cmm/CmmLex.hs:28: unterminated #if
cmm/CmmLex.hs:20: unterminated #if
ghc: 55123864 bytes, 6 GCs, 160600/160600 avg/max bytes residency (1 
samples), 16M in use, 0.00 INIT (0.00 elapsed), 0.12 MUT (0.41 elapsed), 0.01 
GC (0.05 elapsed) :ghc
gmake: *** [depend] Fehler 1

This error is due to indented preprocessor lines
#else
#endif

I removed the indentation of these 2 lines. Then this file compiled,
but there are more of them, the next being parser/Lexer.hs

Is there a recommended way to handle this?
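
[Absent a recommended fix, one mechanical workaround is a small sed pass over each offending generated file before rerunning the depend stage; this helper is hypothetical, not a documented part of the build:

```shell
# De-indent CPP directives (#if/#else/#endif) that were emitted with
# leading whitespace, which GHC's -cpp pass rejects.  Rewrites the
# file in place via a temporary copy.  Usage: deindent_cpp File.hs
deindent_cpp () {
    sed -e 's/^[[:space:]]\{1,\}#/#/' "$1" > "$1.tmp" && mv "$1.tmp" "$1"
}
```

For example, `deindent_cpp cmm/CmmLex.hs` and `deindent_cpp parser/Lexer.hs` would cover the two files mentioned above. Note it de-indents every line whose first non-blank character is `#`, so check the diff before committing to it.]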
-- 
Dipl.-Math. Wilhelm Bernhard Kloke
Institut fuer Arbeitsphysiologie an der Universitaet Dortmund
Ardeystrasse 67, D-44139 Dortmund, Tel. 0231-1084-257



Re: GHC 6.4.1 barfs building haskell-src-exts on MingW/WinXP

2005-10-17 Thread Mark Wassell
- Original Message - 
From: Bayley, Alistair [EMAIL PROTECTED]



Hello all,

I'm trying to build haskell-src-exts-0.2 with GHC 6.4.1 under MingW on
WinXP. It segfaults on the runhaskell Setup.hs build command (in the
src/haskell-src-exts subdir). Does anyone else get this, or is it just me?



Hi,

I get the same when building the HSQL part of hsql-1.6 (on WinXP). I added a 
ghc-options: -v line to the cabal file hoping it would print something 
before crashing. Nothing was printed which possibly indicates that the 
problem lies in the build code rather than the code being compiled?


Cheers

Mark 




Re: GHC 6.4.1 barfs building haskell-src-exts on MingW/WinXP

2005-10-17 Thread Donald Bruce Stewart
attila.babo:
 Bayley, Alistair wrote:
  I'm trying to build haskell-src-exts-0.2 with GHC 6.4.1 under MingW on
  WinXP. It segfaults on the runhaskell Setup.hs build command (in the
  src/haskell-src-exts subdir). Does anyone else get this, or is it 
 just me?
 
 Hi,
 
 I have the same problem with haskell-src-exts and with hs-plugins as 

Note that the latest version of hs-plugins (from the darcs repo) only
optionally uses haskell-src-exts.

 well. I tried with a default installation of ghc-6.4.1 and later with 
 the updated Cabal-1.1.4, but it remains the same. Using the Makefile 
 instead of Cabal there is no problem, it also works with vshaskell or on 
 a Linux box with the same software versions.
 
 Here is the error message:
 
 $ runhaskell Setup.hs build
 Preprocessing library plugins-0.9.10...
 Building plugins-0.9.10..
 
 The instruction at 0x referenced memory at 0x000. The 
 memory could not be read.
 
 Regards:
 
 Attila