I was expecting the attached Haskell and C programs to produce the same
results. However, as shown below, the C program prints identical values
for the two pointers, whereas the Haskell program prints different ones.

$ ./c
0x0804f050
0x0804f050
0
$ ./h
0x08084970
0x0807eba8
0

Does anyone know what is going on here?
Am I using the FFI incorrectly?
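
One guess: since the Haskell program imports &stdscr, perhaps what I get back
is the address of the C variable rather than its contents, so I would need to
peek it after initscr has run. Below is a rough sketch of what I mean (written
against the hierarchical Foreign.* modules rather than Ptr/CTypes, and
stdscrPtr is just a name I made up), though I don't know whether this is the
intended way to use the FFI:

\begin{code}
module Main (main) where

import Foreign.Ptr      (Ptr)
import Foreign.Storable (peek)
import Foreign.C.Types  (CInt)

data Window = Window
type PWindow = Ptr Window

foreign import ccall unsafe "static ncurses.h initscr" initscr :: IO PWindow
-- Importing "&stdscr" gives the address of the C global, so the Haskell
-- binding has type Ptr PWindow and must be peeked to read its value.
foreign import ccall unsafe "static ncurses.h &stdscr" stdscrPtr :: Ptr PWindow
foreign import ccall unsafe "static ncurses.h endwin" endwin :: IO CInt

main :: IO ()
main = do w <- initscr
          r <- endwin
          s <- peek stdscrPtr   -- read the current value of stdscr
          print w
          print s
          print r
\end{code}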


Thanks
Ian


#include <stdio.h>
#include <ncurses.h>

int main(void) {
    WINDOW *w;
    int r;
    w = initscr();
    r = endwin();
    /* print the window returned by initscr, the global stdscr,
       and endwin's return value */
    printf("%p\n%p\n%d\n", (void *) w, (void *) stdscr, r);
    return 0;
}

\begin{code}
module Main (main) where

import Ptr
import CTypes

data Window = Window
type PWindow = Ptr Window

foreign import ccall unsafe "static ncurses.h initscr" initscr :: IO PWindow
foreign import ccall unsafe "static ncurses.h &stdscr" stdscr :: PWindow
foreign import ccall unsafe "static ncurses.h endwin" endwin :: IO CInt

main :: IO ()
main = do w <- initscr
          r <- endwin
          putStrLn (show w)
          putStrLn (show stdscr)
          putStrLn (show r)
\end{code}


all:
        gcc foo.c -o c -lcurses
        ghc -Wall --make foo -o h -fglasgow-exts -package lang -lcurses

clean:
        rm -f c h *.o *.hi
