Re: [Haskell-cafe] What's the deal with Clean?
On Fri, Nov 6, 2009 at 8:45 AM, Ketil Malde ke...@malde.org wrote: enough about. I'd be happy to hear any suggestions. This is more a question than a suggestion, but would the iteratees package fit your needs ? David. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
Martin DeMello wrote: On Wed, Nov 4, 2009 at 10:34 AM, Richard O'Keefe o...@cs.otago.ac.nz wrote: (4) It comes with its own IDE. I don't think it can do anything much that Haskell tools can't do, but if you don't like looking for things, it's a help. And a well-integrated GUI toolkit. If it weren't for the Windows bias I'd have definitely taken the time to learn the language. I'm disappointed that Haskell doesn't have *more* of a Windows bias. It _is_ the platform used by 90% of the desktop computers, after all. (As unfortunate as that undeniably is...) In particular, I really wish we could make it so that stuff from Hackage actually compiles on Windows. (Disclaimer: Stuff written in Haskell compiles just fine. It's FFI bindings that unanimously refuse to compile.) It's also somewhat irritating that the I/O libraries have a few quirks apparently related to mingw32. But hey, that one at least should be fixable...
Re: [Haskell-cafe] What's the deal with Clean?
Andrew Coppin wrote: I'm disappointed that Haskell doesn't have *more* of a Windows bias. It _is_ the platform used by 90% of the desktop computers, after all. (As unfortunate as that undeniably is...) That is not true in my home and it's not true where I work. In addition, saying 90% of all desktop computers is misleading; instead we should be talking about the computers of software developers, and there the figure is almost certainly well below 90%. In particular, I really wish we could make it so that stuff from Hackage actually compiles on Windows. (Disclaimer: Stuff written in Haskell compiles just fine. It's FFI bindings that unanimously refuse to compile.) It's also somewhat irritating that the I/O libraries have a few quirks apparently related to mingw32. But hey, that one at least should be fixable... The problem here is that Windows is the odd one out. Stuff written for userspace Linux will usually compile with little more than minor alterations on OSX and all the other Unix-like systems. Making that same code build on Windows can be a significant amount of work, and that work should not be the responsibility of the people who write code on Linux and Mac. Erik -- -- Erik de Castro Lopo http://www.mega-nerd.com/
Re: [Haskell-cafe] What's the deal with Clean?
Erik de Castro Lopo wrote: Andrew Coppin wrote: I'm disappointed that Haskell doesn't have *more* of a Windows bias. It _is_ the platform used by 90% of the desktop computers, after all. (As unfortunate as that undeniably is...) That is not true in my home and it's not true where I work. In addition, saying 90% of all desktop computers is misleading; instead we should be talking about the computers of software developers, and there the figure is almost certainly well below 90%. Depends what you develop. I know of plenty of developers who use MS Visual Studio for everything, for example. You can pretend that Windows isn't popular and thus there's no need to support it, but to me that seems like a fairly unrealistic point of view. The problem here is that Windows is the odd one out. This, it seems, is why there are programs in this world that are designed for Windows but (sometimes) also run on Unix, and other programs which are designed for Unix but (sometimes) also run on Windows. I would like to add that GHC itself is really rather well-behaved under Windows. Many Unix programs simply get recompiled under Cygwin or something, resulting in a program that *runs* on Windows but doesn't follow any Windows-related conventions and so forth. GHC actually behaves very well under Windows. And we have some quite nice library support for writing Haskell programs which compile unmodified under Windows and Linux. (E.g., filepath, ansi-terminal, etc.) I'm just saying, we could still do better...
Re: [Haskell-cafe] What's the deal with Clean?
2009/11/5 Erik de Castro Lopo mle...@mega-nerd.com: Andrew Coppin wrote: I'm disappointed that Haskell doesn't have *more* of a Windows bias. It _is_ the platform used by 90% of the desktop computers, after all. (As unfortunate as that undeniably is...) That is not true in my home and it's not true where I work. In addition, saying 90% of all desktop computers is misleading; instead we should be talking about the computers of software developers, and there the figure is almost certainly well below 90%. Why? After all, software is always (in one way or another) written for users, not other software developers. -- Deniz Dogan
Re: [Haskell-cafe] What's the deal with Clean?
On Thursday, 5 November 2009, 23:02:30, Erik de Castro Lopo wrote: Andrew Coppin wrote: I'm disappointed that Haskell doesn't have *more* of a Windows bias. It _is_ the platform used by 90% of the desktop computers, after all. (As unfortunate as that undeniably is...) That is not true in my home and it's not true where I work. Neither is it true in the group of people I know. However, the number of computers in that group which had Windows installed when they were bought may be close to 90% - it's close to impossible to buy a completely assembled, ready-to-go computer without it here - which is easily remedied by inserting an openSuse or Ubuntu disk as soon as it's connected to power :) In addition, saying 90% of all desktop computers is misleading; instead we should be talking about the computers of software developers, and there the figure is almost certainly well below 90%. In particular, I really wish we could make it so that stuff from Hackage actually compiles on Windows. (Disclaimer: Stuff written in Haskell compiles just fine. It's FFI bindings that unanimously refuse to compile.) It's also somewhat irritating that the I/O libraries have a few quirks apparently related to mingw32. But hey, that one at least should be fixable... The problem here is that Windows is the odd one out. Still, it would be nice if things were easily installable on Windows, so maybe some Windows user should write a tool which makes installing C libraries on Windows feasible. Stuff written for userspace Linux will usually compile with little more than minor alterations on OSX and all the other Unix-like systems. Making that same code build on Windows can be a significant amount of work, and that work should not be the responsibility of the people who write code on Linux and Mac. Erik
Re: [Haskell-cafe] What's the deal with Clean?
Deniz Dogan wrote: 2009/11/5 Erik de Castro Lopo mle...@mega-nerd.com: In addition, saying 90% of all desktop computers is misleading; instead we should be talking about the computers of software developers, and there the figure is almost certainly well below 90%. Why? After all, software is always (in one way or another) written for users, not other software developers. We're talking about Haskell libraries. The *only* people who are interested in Haskell libraries are Haskell developers (ie not even developers in general). Desktop users who use nothing more than Word, Excel and IE are not interested in Haskell libraries. Erik -- -- Erik de Castro Lopo http://www.mega-nerd.com/
Re: [Haskell-cafe] What's the deal with Clean?
A comparison of the evolution in speed of Haskell from version to version, rather than with other languages, could have been very informative about the progress in Haskell speed. I think that the progression has been astonishing. I thought that while seeing this language shootout on Windows http://dada.perl.it/shootout/ where GHC performance is rather mean - until I realized that the version used was GHC 5.04.2. http://www.haskell.org/ 2009/11/4 Don Stewart d...@galois.com bulat.ziganshin: oh, can we stop talking about the shootout? if you want to see the speed of pure haskell code, look at papers about fast arrays/strings - their authors have measured that lazy lists are hundreds of times slower than idiomatic C code. is use of lazy lists counted as a mistake too, and did the paper authors have too little haskell experience? Comparing apples against oranges is a mistake, yes. -- Don
Re: [Haskell-cafe] What's the deal with Clean?
Andrew Coppin wrote: Depends what you develop. I know of plenty of developers who use MS Visual Studio for everything, for example. And those developers do not care whether Haskell libraries compile on Windows or not. You can pretend that Windows isn't popular and thus there's no need to support it, but to me that seems like a fairly unrealistic point of view. Windows is popular amongst general computer users, but less popular amongst developers and less popular still amongst Haskell developers. The problem here is that Windows is the odd one out. This, it seems, is why there are programs in this world that are designed for Windows but (sometimes) also run on Unix, This is a very small proportion of all Windows applications, and many/most of them run via Wine, a code base of 1.8 million lines with a worth according to Ohloh of $29Mil (which I consider very conservative): http://www.ohloh.net/p/wine and other programs which are designed for Unix but (sometimes) also run on Windows. A far larger proportion of Unix programs run on Windows, due to two factors: - Unix people write command line apps and libraries, which are always easier to port to Windows. - Unix people doing things like Windows backends for GTK+ and Qt to make GUI applications portable to Windows. I'm just saying, we could still do better... My point is that it's up to people who care about Windows to fix things for Windows. I am a user of Debian and Ubuntu. About a year ago I became sick and tired of the poor state of Haskell in Debian. I didn't complain; I joined the debian-haskell-maintainers group and started packaging Haskell stuff for Debian. Now, the situation has improved vastly. Erik -- -- Erik de Castro Lopo http://www.mega-nerd.com/
Re: [Haskell-cafe] What's the deal with Clean?
Roman Leshchinskiy wrote: wren ng thornton wrote: Roman Leshchinskiy wrote: On 04/11/2009, at 13:23, Daniel Peebles wrote: In the presence of fusion (as is the case in uvector), it's hard to give meaningful time complexities for operations as they depend on what operations they are paired with. We need to think of a better way to express this behavior in the documentation though. I have to disagree here. Fusion never makes the complexity of operations worse. If it does, it's a bug. I think the point was more that the relevant complexity bound can change in the presence of fusion. For a poor example: the first map over a list is O(n) but all subsequent ones in a chain of maps are O(1) with fusion. I'm sure there are better examples than that, but you get the idea. Some people may care to know about that latter complexity rather than just the independent complexity. I think asymptotic complexity is the wrong tool for what you're trying to do. [...] Executing the two maps, be it one after another or interlocked, is linear simply because O(n) + O(n) = O(n), not because of fusion. As I said, it was a bad example. Off-hand I can't think of any examples where fusion actually does affect asymptotic complexity rather than just reducing the constant factor. But I think such examples (if they exist) are what Daniel was concerned with, rather than any bugs where fusion makes the complexity (or constant factors) worse. -- Live well, ~wren
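To make the fusion discussion concrete, here is a minimal Haskell sketch of the map/map case the thread keeps returning to (the function names are invented for illustration). GHC's rewrite rules turn the two-pass form into the one-pass form; the result and the O(n) bound are unchanged, and only the constant factor (one traversal instead of two, no intermediate list) improves:

```haskell
-- Two maps in a row: without fusion this builds an intermediate list.
twoPasses :: [Int] -> [Int]
twoPasses = map (+ 1) . map (* 2)

-- The fused form GHC's rewrite rules aim for: one traversal,
-- no intermediate list, same O(n) complexity.
onePass :: [Int] -> [Int]
onePass = map ((+ 1) . (* 2))

main :: IO ()
main = print (twoPasses [1 .. 5] == onePass [1 .. 5])
```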
Re: [Haskell-cafe] What's the deal with Clean?
On Nov 5, 2009, at 8:26 AM, Jason Dagit wrote: I can't really think of how laziness and polymorphism are related. For me the big win with laziness is composability. Laziness allows us to express things in ways that are more natural. The prelude function 'take' is a perfect example. It allows you to use finite portions of infinite lists. You could then express an infinite series very naturally and then decouple from that the logic to process finite parts. The implication here is that laziness allows you to use data structures for control flow. This all works together to enable separation of concerns. Which is generally a very good thing if you want to reason about your source code. My bad, I meant polymorphism as the answer as to why things are boxed. Laziness can also be thought of as a transformation on the time complexity of algorithms. Sure, the worst-case complexity still remains but often you can get a better average case by only computing as much as you need. I hope that helps, It does.
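A tiny sketch of the 'take' example mentioned above: the series is defined with no reference to how much of it will be consumed, and the consumer alone decides how much work happens.

```haskell
-- An infinite series, defined independently of any consumer.
squares :: [Integer]
squares = map (^ 2) [1 ..]

main :: IO ()
main = print (take 5 squares)  -- only five elements are ever computed
```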
Re: [Haskell-cafe] What's the deal with Clean?
On Nov 5, 2009, at 8:26 AM, Jason Dagit wrote: Haskell knows when I have a list of Doubles, you know, because it's strongly typed. Then it proceeds to box them. Huh ? Imagine a computation which will yield a Double if evaluated, but has not yet been evaluated. How do you store that in the list? So laziness is causing the boxing to be necessary ?
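A small sketch of the point being made: each cons cell stores a pointer to a possibly-unevaluated computation, so even an element that would crash if forced can sit in a list of Doubles unharmed (the list here is made up for illustration).

```haskell
-- The middle element is a thunk that would crash if evaluated, but
-- the list can be built and partially consumed regardless: each
-- cell holds a boxed pointer, not a raw Double.
xs :: [Double]
xs = [1.0, error "never forced", 3.0]

main :: IO ()
main = print (head xs + xs !! 2)  -- the error thunk is never demanded
```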
Re: [Haskell-cafe] What's the deal with Clean?
On Thu, Nov 5, 2009 at 6:15 PM, brian bri...@aracnet.com wrote: On Nov 5, 2009, at 8:26 AM, Jason Dagit wrote: Haskell knows when I have a list of Doubles, you know, because it's strongly typed. Then it proceeds to box them. Huh ? Imagine a computation which will yield a Double if evaluated, but has not yet been evaluated. How do you store that in the list? So laziness is causing the boxing to be necessary ? Necessary is a strong word within formal/mathematical communities. If you mean it in that sense, then I'm not sure it's necessary. My (incomplete) understanding is that no one has a better way than boxing that has as wide applicability as boxing. Perhaps there are techniques that work better. My guess is that they are either 1) special cases; or 2) have yet to be discovered. I wonder if perhaps supercompilation or perhaps whole program optimizations will eventually be able to eliminate much of the boxing we have today. Strictness analysis has done a lot to remove boxing but it is not perfect due to the halting problem. Jason
Re: [Haskell-cafe] What's the deal with Clean?
And this is confusing to those of us who are not compiler experts. Haskell knows when I have a list of Doubles, you know, because it's strongly typed. Then it proceeds to box them. Huh ? The laziness thing has many examples of _reducing_ efficiency, but there seems to be a real lack of examples where it helps. In fact it seems to _always_ hurt. People sure seem excited about it. Me, not so excited. I have a program that involves a somewhat involved compilation pipeline. As a result of laziness, it's all done incrementally and I can start getting results right away, which is essential to my application. Without laziness I'd have to do something awkward and complicated like break it into a lot of message-passing threads or process in chunks (and it's impossible to know how much context each chunk will need without further hackery). I can abort the computation cleanly by simply stopping the consumer, and everything gets GCed. And I get all sorts of convenient idioms like 'zip [0..]' and calculating stuff in 'where' that I may not need. And it's just fun. So I'm still excited about it :)
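The incremental behaviour described above can be sketched in a few lines; the stage names are invented for illustration. Each element flows through the whole pipeline as the consumer demands it, so output starts before the input is exhausted:

```haskell
-- A lazily-built pipeline: numbering via zip [0..], then formatting.
-- Nothing is computed until mapM_ demands each element in turn.
pipeline :: String -> [String]
pipeline = map show . zip [0 :: Int ..]

main :: IO ()
main = mapM_ putStrLn (pipeline "abc")
```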
Re: [Haskell-cafe] What's the deal with Clean?
Duncan Coutts duncan.cou...@googlemail.com writes: The build operation part often ends up a bit gross, but I have a plan for that which I hope to come back to later on. Yes, they're not good for construction atm. The Builder monoid from the I've used that a bit, but that's not what I'm talking about. Serializing and deserializing to/from bytestrings is sometimes complicated, but usually straightforward enough. The operation in between is what is giving me headaches. Basically, I have a bunch of command line options - filter on this, modify on that, produce some kind of output to some file - that must be interpreted in order to produce a combined filter/modifier processing. The pipeline looks something like this: readFoo :: IO [Foo] process :: [Foo] -> IO [Foo] writeFoo :: [Foo] -> IO () The hard part is how to elegantly construct the process part. If it were just filtering or modifications, it could be a pure function. The complexity comes from sometimes needing to split off some output to some file. Currently, I'm opening handles in advance, and processing one Foo at a time, writing it to the correct handles, and finally closing handles when done. This is a rather pedestrian approach. I'm now considering defining branch :: ([Foo] -> IO ()) -> [Foo] -> IO [Foo] branch iop fs = do forkIO (iop fs) return fs Which, if I understand correctly, would allow me to write process = filterM this >=> mapM that >=> branch (writeBar . map foo2bar) >=> filterM theother So - is this a good way to approach it? I feel slightly queasy about using concurrency for this, although I think it'll work well in practice. It is very important that this is lazy - the list of Foos can be larger than available memory, so one risk is that one thread might run off with the list of Foos with the other trailing far behind, leading to increased memory use. Previous experience seems to indicate that the 'head' thread will be slowed by disk/memory and allow the trailing threads to keep up.
I do have a nagging feeling that this could be solved more elegantly with arrows or a lazy writer monad, or something else that I don't know enough about. I'd be happy to hear any suggestions. -k PS: I probably need to catch the ThreadId, and wait for all threads to finish as well. This is left as an exercise for the writer. :-) -- If I haven't seen further, it is by standing in the footprints of giants
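A self-contained sketch of the branch idea from the post, with Foo stood in by Int and an MVar added so the main thread can wait for the forked consumer; both are assumptions made only to get a runnable example.

```haskell
import Control.Concurrent (forkIO, newEmptyMVar, putMVar, takeMVar)

type Foo = Int  -- hypothetical stand-in for the poster's Foo

-- Fork a consumer for a side output and pass the (lazy) list along
-- unchanged, as in the post.
branch :: ([Foo] -> IO ()) -> [Foo] -> IO [Foo]
branch iop fs = do
  _ <- forkIO (iop fs)
  return fs

main :: IO ()
main = do
  done <- newEmptyMVar
  fs <- branch (putMVar done . sum) [1 .. 10]
  total <- takeMVar done        -- wait for the forked consumer
  print (total, length fs)
```

The MVar also addresses the PS: collecting one such MVar per branch and taking them all at the end waits for every forked thread to finish.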
Re: [Haskell-cafe] What's the deal with Clean?
On Wed, Nov 4, 2009 at 10:34 AM, Richard O'Keefe o...@cs.otago.ac.nz wrote: (4) It comes with its own IDE. I don't think it can do anything much that Haskell tools can't do, but if you don't like looking for things, it's a help. And a well-integrated GUI toolkit. If it weren't for the Windows bias I'd have definitely taken the time to learn the language. martin
Fwd: [Haskell-cafe] What's the deal with Clean?
The code executed with uniqueness types is somewhat similar to the internal code executed in a state monad (or in the case of IO, the IO monad). The main difference is that the pairs of results (state, value) are explicitly written in Clean by the programmer and the type system assures that the order of executions makes sense at compile time, whereas in the case of the state monad the sequence of instructions is lazily assembled at runtime in the first step and executed in a second step. So there is a little more overhead in Haskell but the code is higher level. Am I right? 2009/11/4 wren ng thornton w...@freegeek.org Stephen Tetley wrote: 2009/11/3 Andrew Coppin andrewcop...@btinternet.com: As far as I can tell, Clean is to Haskell as C is to Pascal. I.e., Clean is notionally very similar to Haskell, but with lots of added clutter, complexity and general ugliness - but it's probably somehow more machine-efficient as a result. (All of which makes the name Clean rather ironic, IMHO.) Ouch - you really could put it the other way round. Part of this really comes down to how one feels about the monads vs uniqueness types argument. It's a silly argument to have since the ideas are orthogonal and only really intersect at IO, but there's history there which led to the current state of things. Sometimes in Haskell I've thought about how uniqueness typing would make something faster, but in general all the plumbing associated with it in Clean makes me feel like I'm writing systems-level code (i.e. C, asm) instead of using a high-level language. The extra plumbing really makes it feel dirtier to work with. That doesn't mean Clean is bad, but I think it does contribute to the cluttered feeling Haskellers get. But as I said, it's a silly argument and folks should use whichever gives them warm fuzzies. I also have a vague unnameable distaste whenever working with Python, and rather enjoy working with Perl.
Nobody's perfect :) -- Live well, ~wren
Re: [Haskell-cafe] What's the deal with Clean?
On Tue, 2009-11-03 at 18:12 -0800, brian wrote: Really, arrays in Haskell are the most @#!$! confusing thing in the world. There's a bunch of different array structures. I can't tell which one works best, and all I want to do is x[i] = value. I thought uvector was the answer, you know, fast unboxed ARRAYs. Rather than confusing yourself with new packages like uvector I suggest you just use the arrays from the standard 'array' package that comes with GHC. It provides mutable and immutable, boxed and unboxed arrays. The mutable ones have to be used in a monad (ST or IO). The boxed ones can be used with any element type (eg an array of records) while unboxed ones work with simple primitive types like ints, floats etc. The difference is about memory layout and therefore performance: unboxed ones are simple flat C-like arrays while the boxed ones are arrays of pointers to heap objects. Duncan
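A short sketch of both flavours Duncan describes, using only the standard 'array' package; the `x[i] = value` the original poster wants is the `writeArray` call (the helper name `bump` is invented):

```haskell
import Data.Array    (Array, listArray, (!), (//))
import Data.Array.IO (IOUArray, newArray, readArray, writeArray)

-- Immutable update: (//) returns a new array with index 3 replaced.
bump :: Array Int Int -> Array Int Int
bump a = a // [(3, 42)]

main :: IO ()
main = do
  -- Mutable unboxed array in IO: effectively x[3] = 42.
  arr <- newArray (0, 9) 0 :: IO (IOUArray Int Int)
  writeArray arr 3 42
  print =<< readArray arr 3
  -- Pure counterpart with an immutable boxed array.
  print (bump (listArray (0, 9) (replicate 10 0)) ! 3)
```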
Re: [Haskell-cafe] What's the deal with Clean?
Duncan Coutts duncan.cou...@googlemail.com writes: The boxed [array types] can be used with any element type (eg an array of records) while unboxed ones work with simple primitive types like ints, floats etc. The difference is about memory layout and therefore performance ...and of strictness. A boxed array can contain pointers to unevaluated thunks (including references to other cells in the array); an unboxed array only contains evaluated values. But yes, it'd be nice to tidy up the set of available array libraries, and perhaps related functionality (bytestring, text), to provide a unified, non-redundant whole. Platform prime, anyone? -k -- If I haven't seen further, it is by standing in the footprints of giants
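The strictness difference is easy to demonstrate: a boxed array's cells can be thunks that refer to other cells of the same array, which an unboxed array cannot hold. A classic sketch:

```haskell
import Data.Array (Array, listArray, (!))

-- A boxed array whose cells are thunks referring back into the
-- array itself; laziness ties the knot. An unboxed array could not
-- store these unevaluated references.
fibs :: Array Int Integer
fibs = a
  where
    a = listArray (0, 10) (map fib [0 .. 10])
    fib 0 = 0
    fib 1 = 1
    fib n = a ! (n - 1) + a ! (n - 2)

main :: IO ()
main = print (fibs ! 10)  -- 55
```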
Re: [Haskell-cafe] What's the deal with Clean?
Hello, 2009/11/4 Alberto G. Corona agocor...@gmail.com: The code executed with uniqueness types is somewhat similar to the internal code executed in a state monad (or in the case of IO, the IO monad). The main difference is that the pairs of results (state, value) are explicitly written in Clean by the programmer and the type system assures that the order of executions makes sense at compile time, whereas in the case of the state monad the sequence of instructions is lazily assembled at runtime in the first step and executed in a second step. So there is a little more overhead in Haskell but the code is higher level. Am I right? I would rather say: code with uniqueness types allows for safe destructive updates. In Clean, a variable of unique type is ensured to have only one reference to it, at any time (that's why it's called uniqueness typing). So you can't write code like this f(x) + f(x) where f : *a -> int (x is of unique type), because x is clearly referenced two times here. What to do? Let f yield another reference to x! That also means that the old reference is not usable any more, since you have a new one. f becomes: f : *a -> (int, *a) and the code looks very familiar: let (a, x') = f(x) (b, x'') = f(x') in a + b The function f can use destructive updates under the hood though it doesn't violate referential transparency. I bet you can see why. I'd say that call-by-need is orthogonal to uniqueness typing. Cheers, Artyom Shalkhakov.
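For comparison, the same single-reference threading can be written in Haskell by passing the value explicitly; the concrete `f` below (a list standing in for the unique resource) is invented just to make the shape runnable:

```haskell
-- A hypothetical f in the shape of the Clean example: it returns a
-- result together with the "new reference" to the resource.
f :: [Int] -> (Int, [Int])
f xs = (length xs, 0 : xs)

-- The let-chain from the post, transliterated; each version of the
-- resource is used exactly once.
demo :: Int
demo = let (a, x')  = f x
           (b, _x'') = f x'
       in a + b
  where
    x = [1, 2, 3]

main :: IO ()
main = print demo
```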
Re[2]: [Haskell-cafe] What's the deal with Clean?
Hello Don, http://shootout.alioth.debian.org/u64q/benchmark.php?test=all&lang=ghc&lang2=clean&box=1 http://shootout.alioth.debian.org/u64q/benchmark.php?test=all&lang=ghc&lang2=ocaml&box=1 The Haskell compiler isn't the bottleneck. Use it when performance matters. I do. Don, shootout times may be used to measure how many people have contributed solutions for each language, but nothing more. these tests depend mainly on libraries bundled with each compiler and, secondarily, on enthusiasts writing low-level code -- Best regards, Bulat mailto:bulat.zigans...@gmail.com
Re: [Haskell-cafe] What's the deal with Clean?
Bulat Ziganshin bulat.zigans...@gmail.com writes: http://shootout.alioth.debian.org/u64q/benchmark.php?test=all&lang=ghc&lang2=clean&box=1 http://shootout.alioth.debian.org/u64q/benchmark.php?test=all&lang=ghc&lang2=ocaml&box=1 The Haskell compiler isn't the bottleneck. Use it when performance matters. I do. Don, shootout times may be used to measure how many people have contributed solutions for each language, but nothing more. Well, it clearly demonstrates that it is possible to write fast code in Haskell. Last time I looked, much of the shootout code was overly complicated (i.e. enthusiasts writing low-level code). And I can't help but notice that Clean beats Haskell on code compactness. It'd be interesting to see how well more naïve/idiomatic code fares. While it's nice to be able to write fast programs, the main reason to use Haskell is to write succinct and correct programs. (Is it possible to have an alternative Haskell track in the shootouts?) Since this was done, there have been great strides in available libraries and GHC optimizations, and it'd also be interesting to see whether we now are able to optimize ourselves away from much of the overhead. -k -- If I haven't seen further, it is by standing in the footprints of giants
Fwd: [Haskell-cafe] What's the deal with Clean?
Artyom. I know what uniqueness means. What I meant is that the context in which uniqueness is used, for imperative sequences: (y, s') = proc1 s x (z, s'') = proc2 s' y . is essentially the same sequence as if we rewrite a state monad to make the state explicit. When the state is the world state, then it is similar to the IO monad. A state monad forces a single use of the implicit state variable too (unless you pass it through the next step without changes; that can be done in Clean too). 2009/11/4 Artyom Shalkhakov artyom.shalkha...@gmail.com Hello, 2009/11/4 Alberto G. Corona agocor...@gmail.com: The code executed with uniqueness types is somewhat similar to the internal code executed in a state monad (or in the case of IO, the IO monad). The main difference is that the pairs of results (state, value) are explicitly written in Clean by the programmer and the type system assures that the order of executions makes sense at compile time, whereas in the case of the state monad the sequence of instructions is lazily assembled at runtime in the first step and executed in a second step. So there is a little more overhead in Haskell but the code is higher level. Am I right? I would rather say: code with uniqueness types allows for safe destructive updates. In Clean, a variable of unique type is ensured to have only one reference to it, at any time (that's why it's called uniqueness typing). So you can't write code like this f(x) + f(x) where f : *a -> int (x is of unique type), because x is clearly referenced two times here. What to do? Let f yield another reference to x! That also means that the old reference is not usable any more, since you have a new one. f becomes: f : *a -> (int, *a) and the code looks very familiar: let (a, x') = f(x) (b, x'') = f(x') in a + b The function f can use destructive updates under the hood though it doesn't violate referential transparency. I bet you can see why. I'd say that call-by-need is orthogonal to uniqueness typing.
Cheers, Artyom Shalkhakov.
Re[2]: [Haskell-cafe] What's the deal with Clean?
Hello Ketil, Wednesday, November 4, 2009, 4:31:20 PM, you wrote: Well, it clearly demonstrates that it is possible to write fast code in Haskell. my measurements say that by spending 3x more time than for C you can optimize Haskell code to be only 3x slower than the C version succinct and correct programs. (Is it possible to have an alternative Haskell track in the shootouts?) even w/o enthusiasts the Shootout mainly measures the speed of libraries Since this was done, there have been great strides in available libraries and GHC optimizations, and it'd also be interesting to see whether we now are able to optimize ourselves away from much of the overhead. eh, if it was possible, we would have seen it, both on the shootout and here when people are crying that their code isn't as fast as those ads say. haskell compilation can't yet automatically avoid laziness and convert pure high-level code into the equivalent of C code. libraries don't change anything - they provide low-level optimized solutions for particular tasks but can't optimize your own code once you start to write it -- Best regards, Bulat mailto:bulat.zigans...@gmail.com
Re: Re[2]: [Haskell-cafe] What's the deal with Clean?
I personally don't care about raw performance. Haskell is at the top of the list of language performance. It has all the ingredients for improving performance in the coming years: a core language, a clear execution strategy, analysis and parsing, transformations based on math rules. So my code will improve with each new compiler version at the same or better pace than any other language. Moreover, I could not care less about how fast C is, when I simply can not program many things I need in C or C++ or Java, and in general any of the languages in the performance list that are above... or below, because they lack the necessary type safety, expressiveness, abstraction, etc. Not to mention time. Not to mention the growing community etc. Regards. 2009/11/4 Bulat Ziganshin bulat.zigans...@gmail.com Hello Ketil, Wednesday, November 4, 2009, 4:31:20 PM, you wrote: Well, it clearly demonstrates that it is possible to write fast code in Haskell. my measurements say that by spending 3x more time than for C you can optimize Haskell code to be only 3x slower than the C version succinct and correct programs. (Is it possible to have an alternative Haskell track in the shootouts?) even w/o enthusiasts the Shootout mainly measures the speed of libraries Since this was done, there have been great strides in available libraries and GHC optimizations, and it'd also be interesting to see whether we now are able to optimize ourselves away from much of the overhead. eh, if it was possible, we would have seen it, both on the shootout and here when people are crying that their code isn't as fast as those ads say. haskell compilation can't yet automatically avoid laziness and convert pure high-level code into the equivalent of C code.
libraries don't change anything - they provide low-level optimized solutions for particular tasks but can't optimize your own code once you start to write it -- Best regards, Bulat mailto:bulat.zigans...@gmail.com ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
On 4 Nov 2009, at 13:36, Alberto G. Corona wrote: Artyom. I know what uniqueness means. What I meant is that the context in which uniqueness is used, for imperative sequences: (y, s') = proc1 s x (z, s'') = proc2 s' y ... is essentially the same sequence as if we rewrite a state monad to make the state explicit. When the state is the world state, then it is similar to the IO monad. Yes, as long as there is a single thing that is being updated there's little difference between the state monad and a unique type. But uniqueness typing is more general. For instance, a function which updates two arrays f (arr1, arr2) = (update arr1 0 'x', update arr2 0 'y') is easily written in functional style in Clean, whereas in Haskell we need to sequentialize the two updates: f (arr1, arr2) = do { writeArray arr1 0 'x'; writeArray arr2 0 'y' } You can find a more detailed comparison in my thesis (https://www.cs.tcd.ie/Edsko.de.Vries/pub/MakingUniquenessTypingLessUnique-screen.pdf , Section 2.8.7). -Edsko ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
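Edsko's Haskell side of the example can be made concrete. Below is a minimal runnable sketch (the `newCharArray` helper and `f`'s `String` types are my own additions, not from the thread) showing that the two destructive updates are necessarily sequenced by the ST monad, whereas Clean's uniqueness typing lets them be written as two independent function applications:

```haskell
import Control.Monad.ST (ST, runST)
import Data.Array.ST (STUArray, newListArray, writeArray, freeze)
import Data.Array.Unboxed (UArray, elems)

-- Monomorphic helper so the element type is pinned down without
-- needing scoped type variables.
newCharArray :: String -> ST s (STUArray s Int Char)
newCharArray cs = newListArray (0, length cs - 1) cs

-- Haskell analogue of the Clean example: both writes happen
-- destructively, but the monad forces an order on them.
f :: String -> String -> (String, String)
f xs ys = runST $ do
  arr1 <- newCharArray xs
  arr2 <- newCharArray ys
  writeArray arr1 0 'x'
  writeArray arr2 0 'y'
  frozen1 <- freeze arr1
  frozen2 <- freeze arr2
  return ( elems (frozen1 :: UArray Int Char)
         , elems (frozen2 :: UArray Int Char) )
```

For example, `f "abc" "def"` yields `("xbc", "yef")`; the point is not the result but that `writeArray` calls cannot be expressed as order-independent applications the way Clean's `update` can.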
Re: [Haskell-cafe] What's the deal with Clean?
On Tue, Nov 3, 2009 at 10:44 PM, Ben Lippmeier ben.lippme...@anu.edu.au wrote: David Leimbach wrote: I have to admit, the first time I hit the wiki page for DDC I said to myself "Self, this sounds crazy complicated." Then I read part of the PDF (your thesis I believe) about Region Types on the bus ride to work and thought, "Gee, I think I scared myself off too quickly." Uniqueness typing is quite interesting in Clean, but to control aliasing, like really *control* aliasing, that's just far out man. So I still have to wrap my head around why this isn't going to get completely out of control and see why it's all safer than just writing C code, but I must say the attention I will be paying to DDC has just gone up quite a bit. :) A correct C program is just as safe as a correct Haskell/Disciple program. Well, of course, the question is in what sort of guarantees a language or compiler provides, I guess. If you're using destructive update then aliasing, side effects and mutability all start to matter. It might look complicated when you reflect all these things in the type system, but you're really just getting a handle on the inherent complications of the underlying program. So it's really just more notation to let you know which tools are being used when you use them? Does Disciple completely avoid the need for such things as unsafePerformIO? (a perhaps overly paranoid comment but...) I realize we're probably not supposed to worry about the existence of unsafePerformIO, and that library authors know what they're doing. But doesn't it automatically mean that there's a bit of implicit trust whenever I see a function of type (a -> a) that there *isn't* IO going on in there? :-) If Disciple can guarantee that no one is allowed to cheat, is that not a better approach? I suppose the trick is to be able to ignore said complications when you just don't care, or they're not relevant for your particular problem... 
Yes, the Disciple documentation says that this stuff can be inferred, but I don't even let Haskell infer my types for *any* functions I write in any code. I like to restrict what can go in and out of the function even if it's more general. Perhaps this is the knee-jerk reaction of an angry Erlang programmer who really wanted some types to rein in the overly-dynamic evaluations that are allowed in that environment, but that's how I roll, baby! I will admit that on occasion I will write an expression that I think does what I want, and look at it in ghci, having it tell me the type because sometimes I've not had enough coffee to do it in my head, but then I either look at the inferred type and realize that it's what I originally wanted, or add further restrictions. Dave Ben. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
On Wed, Nov 4, 2009 at 7:11 AM, Edsko de Vries edskodevr...@gmail.com wrote: On 4 Nov 2009, at 13:36, Alberto G. Corona wrote: Artyom. I know what uniqueness means. What I meant is that the context in which uniqueness is used, for imperative sequences: (y, s') = proc1 s x (z, s'') = proc2 s' y ... is essentially the same sequence as if we rewrite a state monad to make the state explicit. When the state is the world state, then it is similar to the IO monad. Yes, as long as there is a single thing that is being updated there's little difference between the state monad and a unique type. But uniqueness typing is more general. For instance, a function which updates two arrays f (arr1, arr2) = (update arr1 0 'x', update arr2 0 'y') is easily written in functional style in Clean, whereas in Haskell we need to sequentialize the two updates: f (arr1, arr2) = do { writeArray arr1 0 'x'; writeArray arr2 0 'y' } Those sequential updates can be run concurrently on both, just with different syntax though, right? You can find a more detailed comparison in my thesis ( https://www.cs.tcd.ie/Edsko.de.Vries/pub/MakingUniquenessTypingLessUnique-screen.pdf, Section 2.8.7). -Edsko ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re[4]: [Haskell-cafe] What's the deal with Clean?
Hello Alberto, Wednesday, November 4, 2009, 5:58:31 PM, you wrote:

> I personally don't care about raw performance.

me too. actually, i write time-critical parts of my app in c++

> Haskell is in the top of the list of language performance.

this list is meaningless, as i said before

> It has all the ingredients for improving performance in the coming years:

that's a different question. i think that in the end lazy languages will become as efficient as assembler, it just may not happen in my lifespan :)

> I couldn't care less about how fast C is, when I simply cannot program many of the things I need in C or C++ or Java, or in general in any of the languages of the performance list that are above...

if you don't know how to implement things in C, you cannot do it efficiently in Haskell either. when i write efficient code in Haskell, i actually use Haskell as an obscure assembler (the same holds for C)

> or below, because they lack the necessary type safety, expressiveness, abstraction, etc.

they also can't make you coffee, but that's a different story. i use Haskell too. i just know that it is slow and use other languages when i really need speed. that's why my archiver is the world's fastest one :)

-- Best regards, Bulat mailto:bulat.zigans...@gmail.com
___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
Bulat Ziganshin bulat.zigans...@gmail.com writes:

>> Well, it clearly demonstrates that it is possible to write fast code in Haskell.
> my measurements say that by spending 3x more time than for C you can optimize Haskell code to be only 3x slower than the C one

Right¹, the interesting thing is not how fast I can get with N times the effort, but if I can get fast enough with 1/N.

> when people are crying that their code isn't as fast as those ads say. haskell compilation can't yet automatically avoid laziness and convert pure high-level code into equivalent of C one.

Many of those people are making fairly simple mistakes. I think a somewhat seasoned programmer using good libraries can write declarative, concise, and readable code that still is reasonably fast. -k

¹) At least for some approximation of the word. Only one benchmark on the shootout has C at a 3x advantage.

-- If I haven't seen further, it is by standing in the footprints of giants
___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re[2]: [Haskell-cafe] What's the deal with Clean?
Hello Ketil, Wednesday, November 4, 2009, 7:43:38 PM, you wrote:

> Right¹, the interesting thing is not how fast I can get with N times the effort, but if I can get fast enough with 1/N.

it depends entirely on how fast you need. so it's again changing the topic - while i say that haskell is slow compared to other languages, i don't say that it is slow for you or that you need speed at all. why is it repeated again and again? why didn't you write to Don that you don't need speed when he wrote that Haskell is fast, but wrote this to me? :(

>> when people are crying that their code isn't as fast as those ads say. haskell compilation can't yet automatically avoid laziness and convert pure high-level code into equivalent of C one.
> Many of those people are making fairly simple mistakes. I think a somewhat seasoned programmer using good libraries can write declarative, concise, and readable code that still is reasonably fast.

i don't think that omitting strictness declarations is a mistake :)

> ¹) At least for some approximation of the word. Only one benchmark on the shootout has C at a 3x advantage.

oh, can we stop saying about shootout? if you want to see speed of pure haskell code, look at papers about fast arrays/strings - their authors have measured that lazy lists are hundreds times slower than idiomatic C code. is use of lazy lists counted as mistake too and paper authors had too small haskell experience?

-- Best regards, Bulat mailto:bulat.zigans...@gmail.com
___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
I'm not sure I follow you? The compiler can't reorder the two updates or do them in parallel (IO is not a commutative monad). You might tell the compiler this explicitly, but then you are writing lower and lower level code, further removed from the functional paradigm. Edsko On 4 Nov 2009, at 15:27, David Leimbach wrote: On Wed, Nov 4, 2009 at 7:11 AM, Edsko de Vries edskodevr...@gmail.com wrote: On 4 Nov 2009, at 13:36, Alberto G. Corona wrote: Artyom. I know what uniqueness means. What I meant is that the context in which uniqueness is used, for imperative sequences: (y, s') = proc1 s x (z, s'') = proc2 s' y ... is essentially the same sequence as if we rewrite a state monad to make the state explicit. When the state is the world state, then it is similar to the IO monad. Yes, as long as there is a single thing that is being updated there's little difference between the state monad and a unique type. But uniqueness typing is more general. For instance, a function which updates two arrays f (arr1, arr2) = (update arr1 0 'x', update arr2 0 'y') is easily written in functional style in Clean, whereas in Haskell we need to sequentialize the two updates: f (arr1, arr2) = do { writeArray arr1 0 'x'; writeArray arr2 0 'y' } Those sequential updates can be run concurrently on both, just with different syntax though, right? You can find a more detailed comparison in my thesis (https://www.cs.tcd.ie/Edsko.de.Vries/pub/MakingUniquenessTypingLessUnique-screen.pdf , Section 2.8.7). -Edsko ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
bulat.ziganshin: oh, can we stop saying about shootout? if you want to see speed of pure haskell code, look at papers about fast arrays/strings - their authors have measured that lazy lists are hundreds times slower than idiomatic C code. is use of lazy lists counted as mistake too and paper authors had too small haskell experience? Comparing apples against oranges is a mistake, yes. -- Don ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
On Nov 4, 2009, at 9:56 PM, Martin DeMello wrote [about Clean]: And a well-integrated GUI toolkit. If it weren't for the Windows bias I'd have definitely taken the time to learn the language. The GUI toolkit was originally available on the Mac. But now, ah, now! The Object I/O Library 1.2 is currently only available for the Wintel platform. The last time I tried Clean on a SPARC, using even the old GUI toolkit required you to locate and install a graphics library (XView) that Sun abandoned a _long_ time ago. They have some _really_ interesting ideas about building web sites using generics. I paid for the Intel C compiler. I'd pay for Clean, if only I _could_ use it. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: Re[2]: [Haskell-cafe] What's the deal with Clean?
On 05/11/2009, at 04:01, Bulat Ziganshin wrote: oh, can we stop saying about shootout? if you want to see speed of pure haskell code, look at papers about fast arrays/strings - their authors have measured that lazy lists are hundreds times slower than idiomatic C code. is use of lazy lists counted as mistake too and paper authors had too small haskell experience? In the papers I coauthored, I don't think we measured any such thing. What we measured was that in algorithms that are best implemented with (unboxed) arrays, using boxed lists is going to cost you. That's not a very surprising conclusion and it's by no means specific to Haskell. The problem was/is the lack of nice purely declarative array libraries, but that is changing, albeit slowly. It's a question of using the right data structure for the algorithm, not a C vs. Haskell thing. Roman ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
2009/11/3 Andrew Coppin andrewcop...@btinternet.com: As far as I can tell, Clean is to Haskell as C is to Pascal. I.e., Clean is notionally very similar to Haskell, but with lots of added clutter, complexity and general ugliness - but it's probably somehow more machine-efficient as a result. (All of which makes the name Clean rather ironic, IMHO.) Ouch - you really could put it the other way round. Clean has very little clutter, though I suppose some people might take offence to it having simple macros (:==), but wait so does GHC via -XCPP... I think Clean had generics before Haskell had Data.Generics, otherwise Haskell generally has more innovation, more people work on Haskell, Haskell's motivation was language research... Clean has far fewer libraries, more people use Haskell... Clean used to be considered faster than Haskell, though I don't know what the situation is now: http://www.haskell.org/pipermail/haskell-cafe/2007-October/033854.html Clean is a very fine language, if I wasn't using Haskell I know what my alternative choice would be. Best wishes Stephen ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
Roman Leshchinskiy wrote: On 04/11/2009, at 13:23, Daniel Peebles wrote: In the presence of fusion (as is the case in uvector), it's hard to give meaningful time complexities for operations as they depend on what operations they are paired with. We need to think of a better way to express this behavior in the documentation though. I have to disagree here. Fusion never makes the complexity of operations worse. If it does, it's a bug. I think the point was more that the relevant complexity bound can change in the presence of fusion. For a poor example: the first map over a list is O(n) but all subsequent ones in a chain of maps are O(1) with fusion. I'm sure there are better examples than that, but you get the idea. Some people may care to know about that latter complexity rather than just the independent complexity. While this comes up with fusion, it's not a new problem. The same sort of thing is gotten at by distinguishing worst-case vs average-case complexity, or amortized worst-case vs non-amortized wost-case, etc. -- Live well, ~wren ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
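The fusion behavior wren describes can be illustrated with GHC's rewrite-rule mechanism. This is a toy sketch, not the actual rules from uvector or DPH: `myMap` and the rule name are my own, and the rule only fires under optimization, but the semantics are the same either way:

```haskell
-- A fusible map written via foldr; NOINLINE keeps GHC from
-- inlining it away before the rewrite rule can fire.
myMap :: (a -> b) -> [a] -> [b]
myMap f = foldr (\x acc -> f x : acc) []
{-# NOINLINE myMap #-}

-- The classic map/map fusion rule: two traversals collapse into
-- one pass, which is why the complexity of a second map in a
-- fused chain differs from its complexity in isolation.
{-# RULES "myMap/myMap" forall f g xs.
      myMap f (myMap g xs) = myMap (f . g) xs #-}

fused :: [Int]
fused = myMap (+ 1) (myMap (* 2) [1, 2, 3])
```

Whether or not the rule fires, `fused` evaluates to `[3,5,7]`; fusion changes only the cost, never the result, which is exactly why it is safe for it to change the effective complexity documented for an operation.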
Re: [Haskell-cafe] What's the deal with Clean?
briand: Really, arrays in Haskell are the most @#!$! confusing thing in the world. There's a bunch of different array structures. I can't tell which one works best, and all I want to do is x[i] = value. I thought uvector was the answer, you know, fast unboxed ARRAYs. Imagine my surprise when I saw this indexU :: UA e => UArr e -> Int -> e O(n). indexU extracts an element out of an immutable unboxed array. Umm That's a typo in the docs. Thanks. -- Don ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
David Leimbach wrote: Disciplined Disciple might be interesting to look at here too, but i'm not sure I'd deploy anything with DDC just yet :-) Indeed. What DDC needs most at the moment is more people working on it. I've fixed a couple of bugs and I'm working on some others, but there are a large chunk of them in the bug tracker which are simply too deep for me with my current level of knowledge. Erik -- -- Erik de Castro Lopo http://www.mega-nerd.com/ ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
So what's the deal with Clean? Why is it preferable to Haskell? Why is it not? Purely from glancing through the language reference, two things that it looks like clean has that I would love to have in haskell are better records and better arrays. The records don't implement any of the fancy subtyping stuff that the various haskell proposals do, but they have the benefits of a nicer syntax and actually being implemented. Haskell arrays (by which I mean the IArray and MArray interfaces) are, to me at least, really hard to use. From little things like using closed ranges where the rest of the world uses half-open ones and opaque documentation and no consistency between IArray and MArray, to bigger things like how do you insert or delete something. My conclusion, after wrestling with ixmap for 15 minutes, was to convert to a list, concatMap across [(i, a)], convert back to an array, which has the bonus of runtime crashes if you forget an 'i'. Sorry if this turned into a rant about arrays, but it's bothered me for a while :) I think the clean guys got it right when they decided to make good array support an explicit goal. I suppose haskell has since gone a different way with the various array fusion libraries with listlike interfaces, and maybe that's better in the long term. Maybe type families can make a nice interface someday. Meanwhile it's a mess though. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
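The list round-trip the poster describes can be written down concretely. A small sketch, with the `deleteAt` name chosen here for illustration; it shows the technique and why rebuilding the bounds by hand is where the off-by-one crashes come from:

```haskell
import Data.Array (Array, listArray, assocs, elems)

-- Delete the element at index i by flattening to an association
-- list and rebuilding -- the workaround described above, since
-- the IArray interface has no native insert or delete.
deleteAt :: Int -> Array Int e -> Array Int e
deleteAt i arr =
  let xs = [x | (j, x) <- assocs arr, j /= i]
  in  listArray (0, length xs - 1) xs
```

For instance, `elems (deleteAt 1 (listArray (0, 3) "abcd"))` gives `"acd"`. Getting the new upper bound wrong in `listArray` produces exactly the runtime crash the poster complains about.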
Re: [Haskell-cafe] What's the deal with Clean?
On Tue, Nov 3, 2009 at 3:14 PM, Ben Lippmeier ben.lippme...@anu.edu.au wrote: David Leimbach wrote: Disciplined Disciple might be interesting to look at here too, but i'm not sure I'd deploy anything with DDC just yet :-) :) Nor would I (and I wrote most of it). I think the approach is right, but the compiler itself is still in the research prototype stage. Ben. I have to admit, the first time I hit the wiki page for DDC I said to myself "Self, this sounds crazy complicated." Then I read part of the PDF (your thesis I believe) about Region Types on the bus ride to work and thought, "Gee, I think I scared myself off too quickly." Uniqueness typing is quite interesting in Clean, but to control aliasing, like really *control* aliasing, that's just far out man. So I still have to wrap my head around why this isn't going to get completely out of control and see why it's all safer than just writing C code, but I must say the attention I will be paying to DDC has just gone up quite a bit. Dave ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
On Tue, Nov 3, 2009 at 2:16 PM, Tracy Wadleigh tracy.wadle...@gmail.com wrote: I had to implement a ring buffer, and I wanted the code using it to be written in Haskell. I ended up implementing the buffer in C, and wrapping it in FFI from Haskell because implementing a destructive array in Haskell is kind of unwieldy to someone of my experience level. In Clean, it looks like the uniqueness typing allows for destructive updates in a very controlled manner. The ST monad provides this functionality. The never-instantiated-in-a-visible-way state parameter of the ST monad provides the uniqueness required for doing destructive updates in a pure way. Someone suggested that to me on IRC once I'd already cranked out a C implementation with FFI bindings. It's just too easy to use the FFI in Haskell :-) If we raise the barrier of FFI, more people will use ST! Dave ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Fwd: [Haskell-cafe] What's the deal with Clean?
I had to implement a ring buffer, and I wanted the code using it to be written in Haskell. I ended up implementing the buffer in C, and wrapping it in FFI from Haskell because implementing a destructive array in Haskell is kind of unwieldy to someone of my experience level. In Clean, it looks like the uniqueness typing allows for destructive updates in a very controlled manner. The ST monad provides this functionality. The never-instantiated-in-a-visible-way state parameter of the ST monad provides the uniqueness required for doing destructive updates in a pure way. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
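For what it's worth, the ring buffer in question can be sketched directly in ST without dropping to C. This is a hypothetical minimal version (all names and the fixed `Int` element type are assumed, not the poster's actual code):

```haskell
import Control.Monad.ST (ST, runST)
import Data.Array.ST (STUArray, newArray, writeArray, getElems)
import Data.STRef (STRef, newSTRef, readSTRef, writeSTRef)

-- A tiny fixed-capacity ring buffer living inside ST.
data Ring s = Ring
  { ringBuf  :: STUArray s Int Int  -- fixed-size storage
  , ringPos  :: STRef s Int         -- next write position
  , ringCap  :: Int                 -- capacity
  }

newRing :: Int -> ST s (Ring s)
newRing n = do
  buf <- newArray (0, n - 1) 0
  pos <- newSTRef 0
  return (Ring buf pos n)

-- Destructive update, safely encapsulated by ST's state parameter.
push :: Ring s -> Int -> ST s ()
push (Ring buf pos n) x = do
  i <- readSTRef pos
  writeArray buf (i `mod` n) x
  writeSTRef pos (i + 1)

contents :: Ring s -> ST s [Int]
contents r = getElems (ringBuf r)

demo :: [Int]
demo = runST $ do
  r <- newRing 3
  mapM_ (push r) [1, 2, 3, 4]  -- the 4 wraps around and overwrites slot 0
  contents r
```

Here `demo` is `[4, 2, 3]`: four pushes into a capacity-3 buffer wrap around, yet `runST` still exposes only a pure value.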
Re: [Haskell-cafe] What's the deal with Clean?
Deniz Dogan wrote: Recently there has been a lot of discussion on this list about the programming language Clean and converting Clean programs to Haskell. Reading the Wikipedia article on the language, I can't really see any major difference between that and Haskell, except for the monads vs. uniqueness types. So what's the deal with Clean? Why is it preferable to Haskell? Why is it not? As far as I can tell, Clean is to Haskell as C is to Pascal. I.e., Clean is notionally very similar to Haskell, but with lots of added clutter, complexity and general ugliness - but it's probably somehow more machine-efficient as a result. (All of which makes the name Clean rather ironic, IMHO.) Of course, this is merely the opinion I formed after performing a cursory scan of some of the introductory documentation. I haven't actually seen any code written with it or anything, so my opinion probably doesn't mean a lot... ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
Stephen Tetley wrote: 2009/11/3 Andrew Coppin andrewcop...@btinternet.com: As far as I can tell, Clean is to Haskell as C is to Pascal. I.e., Clean is notionally very similar to Haskell, but with lots of added clutter, complexity and general ugliness - but it's probably somehow more machine-efficient as a result. (All of which makes the name Clean rather ironic, IMHO.) Ouch - you really could put it the other way round. Part of this really comes down to how one feels about the monads vs uniqueness types argument. It's a silly argument to have since the ideas are orthogonal and only really intersect at IO, but there's history there which lead to the current state of things. Sometimes in Haskell I've thought about how uniqueness typing would make something faster, but in general all the plumbing associated with it in Clean makes me feel like I'm writing systems-level code (i.e. C, asm) instead of using a high-level language. The extra plumbing really makes it feel dirtier to work with. That doesn't mean Clean is bad, but I think it does contribute to the cluttered feeling Haskellers get. But as I said, it's a silly argument and folks should use whichever gives them warm fuzzies. I also have a vague unnameable distaste whenever working with Python, and rather enjoy working with Perl. Nobody's perfect :) -- Live well, ~wren ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
Actually, it's not a typo. If you look at the source, what you'll see is

indexU arr n = indexS (streamU arr) n

and then tracking down indexS, you'll see

indexS (Stream next s0 _) n0
  | n0 < 0    = error "Data.Array.Vector.Stream.indexS: negative index"
  | otherwise = loop_index n0 s0
  where
    loop_index n s = case next s of
      Yield x s' | n == 0    -> x
                 | otherwise -> s' `seq` loop_index (n-1) s'
      Skip s' -> s' `seq` loop_index n s'
      Done    -> error "Data.Array.Vector.Stream.indexS: index too large"

So in other words, indexU really does have O(n) complexity since it first converts the array into a stream and then walks down the stream in order to find the desired element. Cheers, Greg On Nov 3, 2009, at 6:25 PM, Roman Leshchinskiy wrote: On 04/11/2009, at 13:12, brian wrote: indexU :: UA e => UArr e -> Int -> e O(n). indexU extracts an element out of an immutable unboxed array. This is a typo (unless Don inserted a nop loop into the original DPH code). Roman ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
stephen.tetley: 2009/11/3 Andrew Coppin andrewcop...@btinternet.com: As far as I can tell, Clean is to Haskell as C is to Pascal. I.e., Clean is notionally very similar to Haskell, but with lots of added clutter, complexity and general ugliness - but it's probably somehow more machine-efficient as a result. (All of which makes the name Clean rather ironic, IMHO.) Clean used to be considered faster than Haskell, though I don't know what the situation is now: http://www.haskell.org/pipermail/haskell-cafe/2007-October/033854.html We've come a long way in 5 years. Haskell is almost always faster on the shootout now. And parallelism goes a long way to helping there: http://shootout.alioth.debian.org/u64q/benchmark.php?test=all&lang=ghc&lang2=clean&box=1 Though this is also true on a single core: http://shootout.alioth.debian.org/u64/benchmark.php?test=all&lang=ghc&lang2=clean&box=1 It's just a lot closer. Clean continues to have a very small memory footprint. -- Don ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
Really, arrays in Haskell are the most @#!$! confusing thing in the world. There's a bunch of different array structures. I can't tell which one works best, and all I want to do is x[i] = value. I thought uvector was the answer, you know, fast unboxed ARRAYs. Imagine my surprise when I saw this indexU :: UA e => UArr e -> Int -> e O(n). indexU extracts an element out of an immutable unboxed array. An array implementation with an order N lookup. huh ?? That's not an array, that's a list. I was looking for an array. However, I then found in the same hackage: readMU :: MUArr e s -> Int -> ST s e O(1). readMU reads the element at the specified index of a mutable unboxed array. So O(1) for mutable, but O(n) for immutable? See, confusing... I'm sure there's a really good, lofty type safety, something or other reason for that, that I'm sure I don't care about ;-) There's also ST. So why is there a uvector, when there's ST ?? etc, etc, etc... and then there's monads... other than that, having fun with haskell :-) Brian On Nov 3, 2009, at 3:42 PM, David Leimbach wrote: On Tue, Nov 3, 2009 at 2:16 PM, Tracy Wadleigh tracy.wadle...@gmail.com wrote: I had to implement a ring buffer, and I wanted the code using it to be written in Haskell. I ended up implementing the buffer in C, and wrapping it in FFI from Haskell because implementing a destructive array in Haskell is kind of unwieldy to someone of my experience level. In Clean, it looks like the uniqueness typing allows for destructive updates in a very controlled manner. The ST monad provides this functionality. The never-instantiated-in-a-visible-way state parameter of the ST monad provides the uniqueness required for doing destructive updates in a pure way. Someone suggested that to me on IRC once I'd already cranked out a C implementation with FFI bindings. It's just too easy to use the FFI in Haskell :-) If we raise the barrier of FFI, more people will use ST! 
Dave ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
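For the plain "x[i] = value" itch without reaching for monads at all, the immutable Data.Array interface already covers the simple case: O(1) reads with (!) and functional bulk updates with (//). A small sketch (the bindings are named here purely for illustration):

```haskell
import Data.Array (Array, listArray, (!), (//), elems)

xs :: Array Int Int
xs = listArray (0, 4) [10, 20, 30, 40, 50]

-- "x[2] = 99": (//) returns a NEW array with the update applied.
-- Each (//) call copies the array, so it is O(n) per update --
-- that is the price of purity here, and why MArray/ST exists for
-- update-heavy code.
updated :: Array Int Int
updated = xs // [(2, 99)]

at2 :: Int
at2 = updated ! 2  -- O(1) indexing
```

Here `at2` is `99` while `xs ! 2` is still `30`; the original array is untouched, which is exactly the immutable/mutable split that trips people up in the IArray vs. MArray documentation.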
Re: [Haskell-cafe] What's the deal with Clean?
G'day all. Quoting wren ng thornton w...@freegeek.org: Sometimes in Haskell I've thought about how uniqueness typing would make something faster, but in general all the plumbing associated with it in Clean makes me feel like I'm writing systems-level code (i.e. C, asm) instead of using a high-level language. The extra plumbing really makes it feel dirtier to work with. That doesn't mean Clean is bad, but I think it does contribute to the cluttered feeling Haskellers get. I think you're right here. Haskell has developed something of an aversion to naming things that aren't important enough to have a name, such as variables whose only reason to exist is plumbing. We'd far rather spend effort on more higher-order functions, monads, combinators and points-freeness than name something that's unimportant. And the funny thing about this is that it usually works, because in Haskell, abstraction is cheap. I believe that this is the main reason why Haskell programmers haven't embraced arrows, despite their theoretical advantages: Every notation that has been implemented so far requires names for things that shouldn't need names. But as I said, it's a silly argument and folks should use whichever gives them warm fuzzies. I'd like to think that professional developers are a bit more scientific than this. Cheers, Andrew Bromage ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
On 04/11/2009, at 13:23, Daniel Peebles wrote: In the presence of fusion (as is the case in uvector), it's hard to give meaningful time complexities for operations as they depend on what operations they are paired with. We need to think of a better way to express this behavior in the documentation though. I have to disagree here. Fusion never makes the complexity of operations worse. If it does, it's a bug. Roman ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
In the presence of fusion (as is the case in uvector), it's hard to give meaningful time complexities for operations as they depend on what operations they are paired with. We need to think of a better way to express this behavior in the documentation though. On Tue, Nov 3, 2009 at 9:12 PM, brian bri...@aracnet.com wrote: Really, arrays in Haskell are the most @#!$! confusing thing in the world. There's a bunch of different array structures. I can't tell which one works best, and all I want to do is x[i] = value. I thought uvector was the answer, you know, fast unboxed ARRAYs. Imagine my surprise when I saw this indexU :: UA e => UArr e -> Int -> e O(n). indexU extracts an element out of an immutable unboxed array. An array implementation with an order N lookup. huh ?? That's not an array, that's a list. I was looking for an array. However, I then found in the same hackage: readMU :: MUArr e s -> Int -> ST s e O(1). readMU reads the element at the specified index of a mutable unboxed array. So O(1) for mutable, but O(n) for immutable? See, confusing... I'm sure there's a really good, lofty type safety, something or other reason for that, that I'm sure I don't care about ;-) There's also ST. So why is there a uvector, when there's ST ?? etc, etc, etc... and then there's monads... other than that, having fun with haskell :-) Brian On Nov 3, 2009, at 3:42 PM, David Leimbach wrote: On Tue, Nov 3, 2009 at 2:16 PM, Tracy Wadleigh tracy.wadle...@gmail.com wrote: I had to implement a ring buffer, and I wanted the code using it to be written in Haskell. I ended up implementing the buffer in C, and wrapping it in FFI from Haskell because implementing a destructive array in Haskell is kind of unwieldy to someone of my experience level. In Clean, it looks like the uniqueness typing allows for destructive updates in a very controlled manner. The ST monad provides this functionality. 
The never-instantiated-in-a-visible-way state parameter of the ST monad provides the uniqueness required for doing destructive updates in a pure way. Someone suggested that to me on IRC once I'd already cranked out a C implementation with FFI bindings. It's just too easy to use the FFI in Haskell :-) If we raise the barrier of FFI, more people will use ST! Dave ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
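The ST approach described above can be made concrete. Here is a minimal sketch (the function name `squares` is invented for illustration, not from the thread): an unboxed array is filled by in-place writes inside ST, and `runSTUArray` freezes it into an ordinary immutable `UArray`, so callers see a pure function.

```haskell
import Data.Array.ST (newArray, writeArray, runSTUArray)
import Data.Array.Unboxed (UArray, elems)

-- Fill an unboxed array destructively inside ST; runSTUArray freezes
-- it into an immutable UArray, so no mutation is visible from outside.
squares :: Int -> UArray Int Int
squares n = runSTUArray $ do
  arr <- newArray (0, n - 1) 0
  mapM_ (\i -> writeArray arr i (i * i)) [0 .. n - 1]
  return arr
```

For example, `elems (squares 5)` is `[0,1,4,9,16]`: the destructive writes happen once, but the result is an ordinary pure value.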
Re: [Haskell-cafe] What's the deal with Clean?
On 04/11/2009, at 13:12, brian wrote: indexU :: UA e => UArr e -> Int -> e O(n). indexU extracts an element out of an immutable unboxed array. This is a typo (unless Don inserted a nop loop into the original DPH code). Roman ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] What's the deal with Clean?
Recently there has been a lot of discussion on this list about the programming language Clean and converting Clean programs to Haskell. Reading the Wikipedia article on the language, I can't really see any major difference between that and Haskell, except for the monads vs. uniqueness types. So what's the deal with Clean? Why is it preferable to Haskell? Why is it not? -- Deniz Dogan ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
David Leimbach wrote: Disciplined Disciple might be interesting to look at here too, but i'm not sure I'd deploy anything with DDC just yet :-) :) Nor would I (and I wrote most of it). I think the approach is right, but the compiler itself is still in the research prototype stage. Ben. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
On Tue, Nov 3, 2009 at 1:37 PM, Andrew Coppin andrewcop...@btinternet.com wrote: Deniz Dogan wrote: Recently there has been a lot of discussion on this list about the programming language Clean and converting Clean programs to Haskell. Reading the Wikipedia article on the language, I can't really see any major difference between that and Haskell, except for the monads vs. uniqueness types. So what's the deal with Clean? Why is it preferable to Haskell? Why is it not? As far as I can tell, Clean is to Haskell as C is to Pascal. I.e., Clean is notionally very similar to Haskell, but with lots of added clutter, complexity and general ugliness - but it's probably somehow more machine-efficient as a result. (All of which makes the name Clean rather ironic, IMHO.) Of course, this is merely the opinion I formed after performing a cursory scan of some of the introductory documentation. I haven't actually seen any code written with it or anything, so my opinion probably doesn't mean a lot... It's preferable to Haskell in situations where Haskell isn't the best choice. The criteria for that decision are different from problem to problem. Example: I had to implement a ring buffer, and I wanted the code using it to be written in Haskell. I ended up implementing the buffer in C, and wrapping it in FFI from Haskell because implementing a destructive array in Haskell is kind of unwieldy to someone of my experience level. In Clean, it looks like the uniqueness typing allows for destructive updates in a very controlled manner. Disciplined Disciple might be interesting to look at here too, but i'm not sure I'd deploy anything with DDC just yet :-) Dave ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
On 04/11/2009, at 13:35, wren ng thornton wrote: Roman Leshchinskiy wrote: On 04/11/2009, at 13:23, Daniel Peebles wrote: In the presence of fusion (as is the case in uvector), it's hard to give meaningful time complexities for operations as they depend on what operations they are paired with. We need to think of a better way to express this behavior in the documentation though. I have to disagree here. Fusion never makes the complexity of operations worse. If it does, it's a bug. I think the point was more that the relevant complexity bound can change in the presence of fusion. For a poor example: the first map over a list is O(n) but all subsequent ones in a chain of maps are O(1) with fusion. I'm sure there are better examples than that, but you get the idea. Some people may care to know about that latter complexity rather than just the independent complexity. I think asymptotic complexity is the wrong tool for what you're trying to do. You implement your algorithm using operations with known complexities. This allows you to compute the complexity of the entire algorithm. That's all you can use operation complexities for. The compiler is then free to optimise the algorithm as it sees fit but is supposed to preserve (or improve) its complexity. It is not guaranteed or even supposed to preserve the original operations. To stay with your example, each of the two maps is linear regardless of whether fusion happens. Executing the two maps, be it one after another or interlocked, is linear simply because O(n) + O(n) = O(n), not because of fusion. Essentially, you're trying to use complexity to describe an optimisation which doesn't actually affect the complexity. Roman ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
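Roman's point that fusion preserves complexity can be illustrated with plain lists (a hedged sketch; `doubleThenIncr` is an invented name, and the rewrite rule shown is the classic map/map rule in the style GHC's RULES mechanism uses, not uvector's actual fusion framework):

```haskell
-- Without fusion this traverses the list twice; with fusion, once.
-- Either way the total work is O(n) + O(n) = O(n): fusion improves
-- the constant factor, not the asymptotic complexity.
doubleThenIncr :: [Int] -> [Int]
doubleThenIncr = map (+ 1) . map (* 2)

-- A fusion rule of the kind GHC applies at -O (base achieves the
-- same effect via build/foldr fusion); it rewrites two traversals
-- into one without changing the result.
{-# RULES "map/map" forall f g xs. map f (map g xs) = map (f . g) xs #-}
```

Whether or not the rule fires, `doubleThenIncr [1,2,3]` is `[3,5,7]` and the cost stays linear, which is why documenting a different complexity "with fusion" would be misleading.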
Re: [Haskell-cafe] What's the deal with Clean?
On 04/11/2009, at 14:07, Gregory Crosswhite wrote: Actually, it's not a typo. If you look at the source, what you'll see is indexU arr n = indexS (streamU arr) n I suspect it gets rewritten back to the O(1) version somewhere after it has had a chance to fuse. If not, then it's a bug. In the vector package, I do this instead, though: indexU arr n = O(1) implementation {-# RULES "indexU/unstreamU" forall s n. indexU (unstreamU s) n = indexS s n #-} Roman ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
On Nov 4, 2009, at 9:30 AM, Deniz Dogan wrote: So what's the deal with Clean? Why is it preferable to Haskell? Why is it not? (1) Speed. (2) If you are a Windows developer, the fact that Windows is the primary platform and others (even Mac OS, which is historically ironic) are second- (or in the case of Solaris) third-class citizens. (3) Did I mention speed? (4) It comes with its own IDE. I don't think it can do anything much that Haskell tools can't do, but if you don't like looking for things, it's a help. (5) Plus of course there's speed. (6) They're working on a Haskell front end, so you won't actually have to choose. (Anyone doing a Clean front end for Haskell?) (7) Haskell now has bang-patterns so you can specify (a bound on) intended strictness when you declare a function. But that's not in Haskell 98. (8) As a result of this, speed is a bit more declarative than adding $! in strange places. (9) There's a theorem prover for Clean, called Sparkle. Sadly, it's Windows-only, but we all know what most computers on the planet run, don't we? (It's probably Symbian, actually.) (10) And finally, of course, there's speed. Did I mention that? ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
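Points (7) and (8) above can be shown side by side. This is a hedged sketch (`sumTo` and `sumTo98` are invented examples, not from the thread): a bang pattern declares the accumulator strict at the definition site, whereas the Haskell 98 version has to force it with `$!` at the call site.

```haskell
{-# LANGUAGE BangPatterns #-}

-- Strict accumulator via a bang pattern; without the bang, the
-- pending additions accumulate as thunks on large inputs.
sumTo :: Int -> Int
sumTo n = go 0 1
  where
    go !acc i
      | i > n     = acc
      | otherwise = go (acc + i) (i + 1)

-- The Haskell 98 spelling of the same strictness, using ($!)
-- to force the accumulator before the recursive call.
sumTo98 :: Int -> Int
sumTo98 n = go 0 1
  where
    go acc i
      | i > n     = acc
      | otherwise = (go $! acc + i) (i + 1)
```

Both compute the same result (`sumTo 10` is `55`), but the bang-pattern version states the strictness where the function is declared, which is what makes it "a bit more declarative than adding $! in strange places".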
Re: [Haskell-cafe] What's the deal with Clean?
Well, it depends on which indexU the OP means. The one linked in the docs is the O(1) UA type class version. -- Don gcross: Actually, it's not a typo. If you look at the source, what you'll see is indexU arr n = indexS (streamU arr) n and then tracking down indexS, you'll see
  indexS (Stream next s0 _) n0
    | n0 < 0    = error "Data.Array.Vector.Stream.indexS: negative index"
    | otherwise = loop_index n0 s0
    where
      loop_index n s = case next s of
        Yield x s' | n == 0    -> x
                   | otherwise -> s' `seq` loop_index (n-1) s'
        Skip s'    -> s' `seq` loop_index n s'
        Done       -> error "Data.Array.Vector.Stream.indexS: index too large"
So in other words, indexU really does have O(n) complexity since it first converts the array into a stream and then walks down the stream in order to find the desired element. Cheers, Greg On Nov 3, 2009, at 6:25 PM, Roman Leshchinskiy wrote: On 04/11/2009, at 13:12, brian wrote: indexU :: UA e => UArr e -> Int -> e O(n). indexU extracts an element out of an immutable unboxed array. This is a typo (unless Don inserted a nop loop into the original DPH code). Roman ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
So I take it you are saying that it really *cleans* Haskell's clock when it comes to speed? ;-) - Greg On Nov 3, 2009, at 9:04 PM, Richard O'Keefe wrote: On Nov 4, 2009, at 9:30 AM, Deniz Dogan wrote: So what's the deal with Clean? Why is it preferable to Haskell? Why is it not? (1) Speed. (2) If you are a Windows developer, the fact that Windows is the primary platform and others (even Mac OS, which is historically ironic) are second- (or in the case of Solaris) third-class citizens. (3) Did I mention speed? (4) It comes with its own IDE. I don't think it can do anything much that Haskell tools can't do, but if you don't like looking for things, it's a help. (5) Plus of course there's speed. (6) They're working on a Haskell front end, so you won't actually have to choose. (Anyone doing a Clean front end for Haskell?) (7) Haskell now has bang-patterns so you can specify (a bound on) intended strictness when you declare a function. But that's not in Haskell 98. (8) As a result of this, speed is a bit more declarative than adding $! in strange places. (9) There's a theorem prover for Clean, called Sparkle. Sadly, it's Windows-only, but we all know what most computers on the planet run, don't we? (It's probably Symbian, actually.) (10) And finally, of course, there's speed. Did I mention that? ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
Given the Shootout results, the difference is a matter of a few seconds. If Clean Programmers need those few extra seconds, they're welcome to them. We're Lazy around here. :) /Joe On Nov 4, 2009, at 12:16 AM, Gregory Crosswhite wrote: So I take it you are saying that it really *cleans* Haskell's clock when it comes to speed? ;-) - Greg On Nov 3, 2009, at 9:04 PM, Richard O'Keefe wrote: On Nov 4, 2009, at 9:30 AM, Deniz Dogan wrote: So what's the deal with Clean? Why is it preferable to Haskell? Why is it not? (1) Speed. (2) If you are a Windows developer, the fact that Windows is the primary platform and others (even Mac OS, which is historically ironic) are second- (or in the case of Solaris) third-class citizens. (3) Did I mention speed? (4) It comes with its own IDE. I don't think it can do anything much that Haskell tools can't do, but if you don't like looking for things, it's a help. (5) Plus of course there's speed. (6) They're working on a Haskell front end, so you won't actually have to choose. (Anyone doing a Clean front end for Haskell?) (7) Haskell now has bang-patterns so you can specify (a bound on) intended strictness when you declare a function. But that's not in Haskell 98. (8) As a result of this, speed is a bit more declarative than adding $! in strange places. (9) There's a theorem prover for Clean, called Sparkle. Sadly, it's Windows-only, but we all know what most computers on the planet run, don't we? (It's probably Symbian, actually.) (10) And finally, of course, there's speed. Did I mention that? ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
Oh, that's strange... the type class UA is defined twice, once in Data.Array.Vector and once in Data.Array.Vector.UArr; in the first module indexU is a separate function with the sources I exhibited, in the second module it is a method of the UA type-class which seems to have O(1) access for most of the defined instances. That's incredibly confusing... - Greg On Nov 3, 2009, at 9:15 PM, Don Stewart wrote: Well, it depends on which indexU the OP means. The one linked in the docs is the O(1) UA type class version. -- Don gcross: Actually, it's not a typo. If you look at the source, what you'll see is indexU arr n = indexS (streamU arr) n and then tracking down indexS, you'll see
  indexS (Stream next s0 _) n0
    | n0 < 0    = error "Data.Array.Vector.Stream.indexS: negative index"
    | otherwise = loop_index n0 s0
    where
      loop_index n s = case next s of
        Yield x s' | n == 0    -> x
                   | otherwise -> s' `seq` loop_index (n-1) s'
        Skip s'    -> s' `seq` loop_index n s'
        Done       -> error "Data.Array.Vector.Stream.indexS: index too large"
So in other words, indexU really does have O(n) complexity since it first converts the array into a stream and then walks down the stream in order to find the desired element. Cheers, Greg On Nov 3, 2009, at 6:25 PM, Roman Leshchinskiy wrote: On 04/11/2009, at 13:12, brian wrote: indexU :: UA e => UArr e -> Int -> e O(n). indexU extracts an element out of an immutable unboxed array. This is a typo (unless Don inserted a nop loop into the original DPH code). Roman ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
gcross: Oh, that's strange... the type class UA is defined twice, once in Data.Array.Vector and once in Data.Array.Vector.UArr; in the first No, it's exported from the former. module indexU is a separate function with the sources I exhibited, in the second module it is a method of the UA type-class which seems to have O(1) access for most of the defined instances. That's incredibly confusing... There's direct and stream-based versions. You can choose which version you need. If you use the stream-based implementations, the compiler will apply the stream fusion optimization to your loops. If you use the direct versions, that won't apply. I'd be happy to talk more about the design of the library, if you like. -- Don ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
Don, There is more than one indexU? In Data.Array.Vector there is only 1 indexU that I can find. Brian On Nov 3, 2009, at 9:15 PM, Don Stewart wrote: Well, it depends on which indexU the OP means. The one linked in the docs is the O(1) UA type class version. -- Don gcross: Actually, it's not a typo. If you look at the source, what you'll see is indexU arr n = indexS (streamU arr) n and then tracking down indexS, you'll see
  indexS (Stream next s0 _) n0
    | n0 < 0    = error "Data.Array.Vector.Stream.indexS: negative index"
    | otherwise = loop_index n0 s0
    where
      loop_index n s = case next s of
        Yield x s' | n == 0    -> x
                   | otherwise -> s' `seq` loop_index (n-1) s'
        Skip s'    -> s' `seq` loop_index n s'
        Done       -> error "Data.Array.Vector.Stream.indexS: index too large"
So in other words, indexU really does have O(n) complexity since it first converts the array into a stream and then walks down the stream in order to find the desired element. Cheers, Greg On Nov 3, 2009, at 6:25 PM, Roman Leshchinskiy wrote: On 04/11/2009, at 13:12, brian wrote: indexU :: UA e => UArr e -> Int -> e O(n). indexU extracts an element out of an immutable unboxed array. This is a typo (unless Don inserted a nop loop into the original DPH code). Roman ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
UArr operations subject to stream fusion: http://code.haskell.org/~dons/code/uvector/Data/Array/Vector/Strict/ Direct-style operations, not subject to the optimization: http://code.haskell.org/~dons/code/uvector/Data/Array/Vector/UArr.hs /me needs to write a tutorial on this. -- Don briand: Don, There is more than one indexU? In Data.Array.Vector there is only 1 indexU that I can find. Brian On Nov 3, 2009, at 9:15 PM, Don Stewart wrote: Well, it depends on which indexU the OP means. The one linked in the docs is the O(1) UA type class version. -- Don gcross: Actually, it's not a typo. If you look at the source, what you'll see is indexU arr n = indexS (streamU arr) n and then tracking down indexS, you'll see
  indexS (Stream next s0 _) n0
    | n0 < 0    = error "Data.Array.Vector.Stream.indexS: negative index"
    | otherwise = loop_index n0 s0
    where
      loop_index n s = case next s of
        Yield x s' | n == 0    -> x
                   | otherwise -> s' `seq` loop_index (n-1) s'
        Skip s'    -> s' `seq` loop_index n s'
        Done       -> error "Data.Array.Vector.Stream.indexS: index too large"
So in other words, indexU really does have O(n) complexity since it first converts the array into a stream and then walks down the stream in order to find the desired element. Cheers, Greg On Nov 3, 2009, at 6:25 PM, Roman Leshchinskiy wrote: On 04/11/2009, at 13:12, brian wrote: indexU :: UA e => UArr e -> Int -> e O(n). indexU extracts an element out of an immutable unboxed array. This is a typo (unless Don inserted a nop loop into the original DPH code). Roman ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What's the deal with Clean?
David Leimbach wrote: I have to admit, the first time I hit the wiki page for DDC I said to myself Self, this sounds crazy complicated. Then I read part of the PDF (your thesis I believe) about Region Types on the bus ride to work and thought. Gee I think I scared myself off too quickly. Uniqueness typing is quite interesting in Clean, but to control aliasing, like really *control* aliasing, that's just far out man. So I still have to wrap my head around why this isn't going to get completely out of control and see why it's all safer than just writing C code but I must say the attention I will be paying to DDC has just gone quite a bit up. :) A correct C program is just as safe as a correct Haskell/Disciple program. If you're using destructive update then aliasing, side effects and mutability all start to matter. It might look complicated when you reflect all these things in the type system, but you're really just getting a handle on the inherent complications of the underlying program. I suppose the trick is to be able to ignore said complications when you just don't care, or they're not relevant for your particular problem... Ben. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe