Re: Nim's popularity

2020-06-17 Thread moerm
"Vala" \- I think that tells us next to nothing. Simple reason: unlike Nim Vala 
was linked to Gnome, i.e. a major (well, kind of) and well known name. When 
Mozilla (Rust) burps the planet trembles, when Gnome (Vala) burps, the world 
notices it ... and when Nim burps (or sings a nice song for that matter) next 
to nobody notices it. This, however, can also turn _against_ a project because 
a big name creates big expectations.

"Nim popularity" \- I couldn't care less. About popularity, well noted. Gaining 
traction is another (and important) matter.

People come to programming languages largely either out of need or through 
some form of herd phenomenon (like "our company uses xyz" or "we learned and 
worked with xyz in uni"). The latter can only be addressed once a language has 
a certain "weight". So the former is the relevant one for us, at least for 
quite some time.

Learning and reaching a level of _mastering_ a language is a considerable 
effort - hence only very few undertake it for the fun of it. For most it's 
probably something between an itch and a real need (or pain) that makes 
them _seriously_ look at alternatives and learn them. Examples are code 
quality, multi-threading, events/async, typing (strong, weak, static, etc.), 
tool support, FFI & libraries.

I, for example, made the effort to occasionally scan the state of the PL world 
and to have a closer look at some alternatives that looked promising, because in my 
field reliable and correct software as well as speed are critical, and because 
Ada is somewhat cumbersome and poor on libraries, especially async/event-based 
IO. None of that is a question of lila-lala taste/preference but of hard facts. 
If I can get 30% more done in xyz than in Ada then learning xyz is worth it.

Another factor that I consider important - and often overlooked - is the fact 
that a relatively new language (as in "not yet widely used and established") 
invariably has some sharp corners and a noticeable lack of comfort. This usually 
translates into a strong tendency to be looked at and picked up by experienced 
developers but not by newbies or language hoppers. This again translates into 
such a language not _yet_ being used a lot by the large crowds out there.

Funnily(?), really old PLs like for example Pascal, which are perfectly fine 
languages, are considered unattractive just like new PLs are. It's sad, but I'm 
afraid that "coolness" is an important factor too for the large crowd.

On the other hand, greybeards' opinions carry quite some weight, and _if_ they 
recommend a PL then that recommendation has value. Moreover, code quality tends 
to be better with greybeards, which often reflects back on their favourite PL 
(or the one the project has been implemented in).

Finally, there is the 800 pound gorilla, Rust, which gets aggressively 
pushed/marketed by a very major (and rich) organisation as well as by a large 
crowd.

Which, sorry if that hurts, also means that Nim has only one way to go - and 
that is not noise/marketing/bla-bla fandom. It is _quality_ and an attractive 
feature set along with good tool support. (At this point I mention again that 
we will be punished for largely ignoring, or treating as tier 3, anything except 
Visual Studio Code (the ugly monstrosity).) Simple reason: for many programmers 
their favourite editor/IDE is closer to their heart (and fingers) than a - new 
- language.

TL;DR Let us not care about Tiobe and other "the crowd's favourite xyz" lists, but 
let's continue to enhance Nim.


Re: New entry on Nim blog...

2020-06-17 Thread moerm
Dafny is a quite nice and actually usable tool but unfortunately .Net only.


Re: New entry on Nim blog...

2020-06-17 Thread moerm
You are all welcome

@cantanima

I disagree. Reasons, mainly:

a) Ada is intrinsically harder to work with. Likely reasons are its age 
(decades older than Nim) and some almost anal typing problems, probably also 
due to its age; in a way J. Ichbiah had to try to emulate SA in the language 
itself back then.

b) one can make SA syntax (and its workings) hard and ugly, or one can make it 
reasonably human-friendly. I have reason to assume that Nim's SA will be quite 
"friendly" (well, as friendly as is feasible considering that "mathematically 
rigorous" and "easy to grasp and use" aren't easy to marry).

Moreover, I'm pleased to see that @Araq goes at it in his usual way, i.e. 
profoundly reflecting and thinking it through, as well as being interested in 
mortal-human usability. From what I know I'm very confident that our (Nim's) SA will 
be both rigorous (which is highly desirable in SA) and "easy" to use. One must 
keep the context in mind, i.e. compare our SA to what others offer. Real world: 
(almost all) SAs are either unsatisfactory (superficial, picking out only the 
easy stuff) -or- hard to very hard to use. Just look at Frama-C, which is a PITA 
to install and use and which needs _very_ elaborate annotations to work at all.

As far as I'm concerned (Ada's) Spark is by far the best compromise and about 
the only SA that's actually used even without pressing need. From what I know 
Nim's SA will feel similar to Spark.

Summary: Sorry, SA just isn't easy as in "hacking some javascript or PHP junk", 
but I have reason to believe that Nim + Nim's SA will be actually usable for 
mere mortal developers after some (not too steep) learning and getting used to 
it.

Btw, if there are questions re. my article feel free to shoot them at me.


Re: if error

2020-04-10 Thread moerm
NaN is floating point. So `something = if (condition): foo else: bar` can't work 
if `foo` is of a different type than `bar`.

You can use `if (condition): Cell(x: j+1, y: i) else: Cell(x: -1, y: -1)` or another 
"special Cell" value to indicate what you wanted to indicate using NaN.


Re: Nim 1.2 is here

2020-04-03 Thread moerm
Great, thanks so much Araq and Nim team!

I particularly love the SA/H3 work and the `.localPassC` pragma.

Quick question related to that: Assume that I have a Nim file which needs a C 
file compiled but with different C compiler parameters than the Nim file. Is 
there a way to tell which compiler parameters shall be used _only_ for the C 
file, possibly along with .compile?
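For what it's worth, later Nim versions added a per-file variant of the `compile` pragma that takes its own flags (the manual lists the two-argument form as available since Nim 1.4, so it would not help on 1.2 itself). A sketch, where `impl.c` and `cCore` are hypothetical names:

```nim
# Hypothetical: compile the external C file with its own flags,
# independent of the flags used for this module's generated C code.
# Requires Nim 1.4+; on 1.2 only the plain {.compile: "impl.c".}
# form is available, combined with .localPassC for the flags of
# this Nim module's own generated C file.
{.compile("impl.c", "-O3 -fno-strict-aliasing").}

proc cCore(x: cint): cint {.importc.}
```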


Re: Announcement: The Nim compiler is rewritten in Python with some modules optimized in C

2020-04-01 Thread moerm
I'm strongly opposed to that plan! While I fully agree that Nim (currently) 
does a lot of uncool stuff (e.g. static typing) I do not consider Python and C 
good choices. Instead of Python we should be bold and walk directly into the 
future: AI-based image interpretation instead of Python, which, while being 
widely liked, still implements, pardon me, an old-school boring paradigm. 
We should not allow ourselves to be blocked by frankly meaningless dictums like 
a PL having to be "text based"! As for C I _almost_ agree, were it not for 
the stubborn newer standards introducing anti-freedom stuff like "restrict" 
and, as I just learned, even GCC inventing "safety" devices like "access". 
Reason: All of that costs performance. Sorry, some might not like to hear it, 
but _every_ check, every "safety feature", and in fact even superficial type 
checking _costs performance_, either at compile time or at run time.

I do however fully support our BDFL's plan to use void pointers and would 
in fact strongly urge the Nim team to _only_ support void pointers. No more nice 
and dandy - but performance-hungry and attention-demanding - integers, floats, 
and what not (let alone signed and unsigned!) but only `ptr void`. If something 
can't be expressed by void pointers it's not worth expressing anyway.


Re: help information sécurité

2020-03-30 Thread moerm
Just a sidenote: NO! The true news is that GCC finally gets at least some very 
limited static analysis capabilities. Clang/llvm is light years ahead. Please 
note that my statement is only addressing static analysis and not GCC in 
general!


Re: help information sécurité

2020-03-30 Thread moerm
YAY! That's extremely good news. Thanks a lot!

I particularly love seeing _full_ H3 - incl. 'invariant'. Are there also plans 
to provide quantifiers ('forall', 'exists' (plus negation))? Even hotter: Are 
there plans to provide a _compile time_ interface to Z3? I'm asking because 
that would be the "sweet spot" in terms of the least work for the Nim team 
while providing full formal capabilities for those (probably rather few) Nim 
developers who need full formal verification, some kind of transfer to/from 
model checkers, etc.

Last but not least, having some capabilities and/or an interface to formal 
analysis would (IMO) pretty much close the gap between Nim and Ada+Spark; plus 
it would open the door to (step by step) having Nim's stdlib and some major Nim 
libs fully verified. That, plus the fact that Nim, while (still) lacking some of 
Ada's strengths, also has some strengths that Ada does not have (and is 
extremely unlikely to ever get), could make Nim _preferable_ over Ada.
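For readers who haven't seen the announcement: the new contract pragmas can be sketched roughly like this (pragma names as in the DrNim documentation of that era; treat the whole block as illustrative, not as a tested verification example):

```nim
# Illustrative only: `requires` (precondition) and `ensures`
# (postcondition) are intended to be checked by DrNim/Z3,
# not by the plain compiler.
proc mySqrtFloor(x: int): int {.requires: x >= 0,
                                ensures: result * result <= x.} =
  result = 0
  while (result + 1) * (result + 1) <= x:
    inc result
```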

I'm really enchanted by the news. Thanks so much @Araq and Nim team! [please 
imagine a "heart" icon here, which I'm too clueless to make/insert]


Re: Strange (maybe memory related behaviour)

2020-03-29 Thread moerm
The `echo` statement might trigger a call to `getFrameWidth()`.

Try an "old style" echo like `echo "Width: " & $width` directly after calling 
`getFrameWidth()` and assigning its result (`let width = getFrameWidth(...)`).


Re: help information sécurité

2020-03-27 Thread moerm
Yes, that's the list I meant. Thanks.

Note however that that list is extremely "generous"; most entries are just 
more or less reasonable _linters_ rather than static analyzers/verifiers.


Re: help information sécurité

2020-03-22 Thread moerm
I'll provide more info once I'm finished doing some verification of Nim C 
output.

"constant time stuff" \- no, I don't know such a tool. Static verifiers are 
only looking for code correctness, e.g. proper mem. boundaries, reachability, 
loop invariants, etc. If you want to verify things like constant time operation 
I'd suggest a modelling tool to check your model ("algorithm"). Note that some 
MVs can also produce code.

"non-verified compiler" \- a) there are some (very few) verified compilers, 
e.g. CompCert.

b) _THE_ source of errors are still us humans, so verifying source code 
produced by humans (or not properly tested generators) will lead to catching 
and avoiding 99.x% of bugs/vulnerabilities.

The real problem is IMO that static as well as dynamic analyzers are relatively 
complicated beasts (to use and to understand), and to use them properly one must 
understand quite a bit of formal methods and hence math (e.g. FOL, H3, 
separation logic) - hence most developers do not use those tools.

One hint I can provide (in terms of _easy to use_ verifiers) is Facebook's 
"infer" tool. Be aware though that infer is rather limited, but it's very easy 
to use and it seems to catch at least the more common (albeit simple) errors.


Re: help information sécurité

2020-03-22 Thread moerm
**NO! NOT less secure than C** - but less secure than _formally verified_ C. 
That is a _very significant_ difference.

"Normal" C code, i.e. C code that has not been formally designed and verified 
is considerably less secure than normal Nim code.

The point is that for C - unlike for most languages - there _are_ static 
analysis tools available. But those same tools can also be used to verify C 
code produced by Nim.


Re: help information sécurité

2020-03-22 Thread moerm
I'm working in the field and Nim is my "everyday job language". Translation: 
Nim is _considerably less qualified_ than Ada (which is, however, much harder to 
master), Eiffel (unattractive for some other reasons), and even OCaml (less 
attractive for multiple reasons), let alone F* (which however is considerably more 
demanding and complex, plus poorly documented). But Nim is _way better 
qualified_ than most languages in common use (except a few, which are however 
either JVM or .Net based). Some reasons for that are strong static typing, 
(albeit still very modest) contracts, good readability, and a creator/team 
leader with a healthy mindset and knowledge of and respect for languages like 
Modula [2|3] and Pascal.

One point I'd like to add, although not yet final (more testing required), is 
that Nim (inter alia and probably most used) translates to C which can be 
statically and dynamically tested.

For hardcore security jobs I still use Ada, for (rare) GUI jobs I still use 
FreePascal, for pure crypto jobs like porting or optimizing crypto algos I 
still use C along with formal modelling, proto verif. and (usually static) 
analysis.

I'm currently looking closely at the C code Nim generates. I'm not exactly 
happy, because it _looks_ very ugly and seems to occasionally produce, let me 
word it nicely, strange code and errors in static analysis (which might be false 
positives); that's why I still keep Ada at hand. But if my tests show 
that Nim generates reliably error-free code (or if the team reacts 
constructively to any criticism) I'll use Nim even for at least some 
critical jobs.

Note: At least Nim (well, its output) _can_ be statically checked, most 
languages can't (because there are no tools available), so what I wrote here is 
_not negative_ as compared to other languages. In fact my impression so far is 
that Nim is in the top 10% of commonly used languages in terms of safety. So, 
my testing is not about tearing into Nim but rather about _verifying_ the 
(positive) impression I've got so far.

Short version/TL;DR: unless it's hardcore jobs (e.g. crypto implementations) 
Nim is probably the best language in my toolset and the one I like most as well 
as the most efficient one (in terms of coding time and efforts).

If you want a (preliminary) one-sentence verdict -> Nim is at least as 
safe/secure as FreePascal, and the C and C++ code it produces is safer/more 
secure/less buggy than what 98% of developers produce by hand.


Re: Nim problems. 1 internal, 1 mine

2020-02-07 Thread moerm
Thanks, but I don't assume it's a Nim bug. It rather looks like a mistake I made.


Re: Nim problems. 1 internal, 1 mine

2020-02-07 Thread moerm
Thanks. In fact I doubt the proposition that warnings are only for humans. For 
example, stating "noreturn" but actually returning _should_ be considered at 
least questionable. Sure, one can do a lot of trickery, but I have learned (the 
hard way) that good code is clean code, and I would never deliver code that 
doesn't pass `-Wall -Werror -pedantic`. In fact I often even use good-quality 
static analyzers too. But again, I'm not looking for trouble and won't insist. 
It's just that I had high hopes re. Nim, and seeing Ada mentioned as a major 
influence was one of the factors that drove me to Nim. And don't get me wrong, 
I still think that Nim is a great language; it just seems to be less safety 
obsessed than I expected. On the other hand I understand that that's maybe 
something for later and that Araq first wanted to get it feature complete and 
to 1.0.

Thank you also for calming my initial worries that this community has gotten 
less friendly and helpful than I remember it. Have a nice weekend!


Re: Nim problems. 1 internal, 1 mine

2020-02-07 Thread moerm
Sorry, I meant the prng itself. Also, while I understand that one might read it 
as an optimization question, it actually was a question about where I made an 
error that led to "anti-optimization".

Well, whatever, I went the cumbersome path and found my suspicion confirmed. 
`cast[PResult](s.x)` had Nim _copy_ the content (32 bytes) rather than simply 
reinterpreting a uint32 pointer as a uint8 pointer. Meanwhile I have changed my 
wrapper and achieve about 99% of the speed of the C version.
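For later readers, the difference can be sketched like this (type names follow this thread; the example is standalone and does not call the actual C prng):

```nim
type
  PState = ref object
    x: array[8, uint32]
    y: array[4, uint32]
  PResult = array[32, uint8]

var s = new(PState)
s.x[0] = 0x04030201'u32

# Value cast: copies all 32 bytes into a fresh PResult.
let copied: PResult = cast[PResult](s.x)

# Pointer cast: just reinterprets the same memory in place, no copy.
let view: ptr PResult = cast[ptr PResult](addr s.x)

echo copied[0]    # first byte of the state (1 on little-endian machines)
echo view[][0]    # same byte, read through the original memory
```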

Btw, all the relevant information _was_ provided but it was of course easier to 
ask me for my source and to tell me that I should solve my problem myself.

regards


Re: Nim problems. 1 internal, 1 mine

2020-02-07 Thread moerm
Hmm, frankly, I'm not satisfied by the "it's just warnings" answer, as we are 
not talking about "variable declared but unused" bla bla. But I'll leave it at 
that because my point isn't about creating a fuss. I _know_ for a fact that 
some warnings should be taken seriously and that it's no matter of human- vs. 
machine-created code. But if the Nim team tells me that I need not worry, I accept that.

As for my problem: thanks but I need no optimization help. The C code (which is 
the core) has already been optimized. My problem, I'm pretty sure of that, is 
some mistake I made. It seems highly likely that Nim copies either the state or 
the result instead of just an address.

I would be grateful if someone with Nim - C interface experience could provide 
a meaningful hint _where_ I made a mistake.

Thanks


Nim problems. 1 internal, 1 mine

2020-02-07 Thread moerm
Hello

I noticed that compiling fails (Nim 1.0.6) with `{.passC: "-Werror -pedantic".}`, 
i.e. when I follow my C habit of being stringent with code and ask gcc to treat 
warnings as errors also in the Nim-generated code. It seems the culprit is 
stdlib_system.nim.c.

Error 1 `error: ISO C forbids conversion of object pointer to function pointer 
type [-Werror=pedantic] ((Finalizer) ((*t).finalizer))(T5_);`

Error 2 `In function ‘sysFatal__... error: ‘noreturn’ function does return`

Error 3 same as 2 but another `sysFatal__...` where '...' stands for what looks 
like a hash

The other problem (mine) is about wrapping some C code.

Situation: I've written a Nim wrapper for (yet another) fast but good-quality 
(non-CS) random generator. When running the C test version I achieve 
(-march=x86-64 -mtune=nehalem on a Zen1 Ryzen) a bit above 3 GB of random bytes 
per second. When running the Nim test (using my wrapper) it's about 1 GB/s slower. 
Losing about 30% performance is of course not something that makes me happy.

My suspicion is that it's my mistake and that Nim copies the state object 
and/or the result array on each call.

Explanation: I have to pass in a state object which is but two uint32 arrays. 


type PState* = ref object
  x: array[8, uint32]
  y: array[4, uint32]

in my prng_wrap.nim, along with `type PResult* = array[32, uint8]` and `proc 
prand(s: PState): ptr byte {.importc.} ## The C core proc`. Note: My wrapper 
calling this proc `prand` casts the result to PResult (`return 
cast[PResult](s.x)`).

In test.nim I create `var ps = new(PState)` and `var pr: PResult` and 
then call `pr = prand(ps)` 32 million times (exactly as I do in the C version); 
it works fine but is about 30% slower.

Now, a) I'm confident that Nim's overhead is far less than 30%, and b) I have 
been forced to work in C and Ada for some months and certainly forgot some of 
the finer points of dealing with and passing pointers (and refs), so I guess 
it's _my_ fault. Any idea _where_ my mistake is and how to bring the Nim 
version to something like 95+% of the C version's speed?

Thanks in advance (oh, and congrats and thanks a ton and another ton for Nim 
1.x! I'm immensely pleased by that) 


Re: Begginer's question - pointers and refs

2019-03-19 Thread moerm
I disagree, at least regarding the question in the title.

A ref is always preferable unless one _needs_ a pointer. And the heap vs. stack 
distinction is not Nim specific.


Re: Strange bug in (open)array handling?

2019-03-18 Thread moerm
Solved ... kind of.

The line `c = bufx[k]` above was followed by `key[j] = c`

(which already was a split for debugging and was originally `key[j] = 
bufx[start + j]`), and `key` was a `newStringOfCap(64)` var.

@Araq, whom I want to thank and laud for his friendly help in our chat, provided 
useful hints and tips that drove me to furiously testing/looking at my 
`bufx` array and to also trying to heap-allocate it.

What this led to was finding the real culprit, namely `key` - which makes 
this whole thing even weirder, because the compiler was unhappy about the wrong 
array (bufx instead of key).

The solution (well, rather a workaround) was to _stack_ allocate the string 
`key` (don't ask me for the reason; that's something probably only true Nim 
wizards could find out). Once I had done that everything worked fine, even with 
index checking turned on.

However, to be clear: I don't blame the Nim compiler. The whole stuff I talked 
about here got fed into _heavily inlined_ and complex code, and I've seen way 
more mature (e.g. C and Pascal) compilers choke on way less demanding code. Nim 
is still only at v. 19.x and I found it all in all to be quite reliable and smart.

While it's not urgent the Nim devs might sometimes want to have a closer look 
at the array/openarray/string stuff (alloc and checks). 


Strange bug in (open)array handling?

2019-03-18 Thread moerm
I have a strange problem with Nim (both the 19.2 release and the 19.9 [most 
current] nightly).


const MEG64 = 1024 * 1024 * 64
var somebuf: array[MEG64, char]  # some large buffer
# ...
proc char_arr_stuff(bufx: openArray[char], start: int) =
  var
    j = 0
    c: char
  while j < 64:
    let k = start + j
    debugEcho "j, k: " & $j & ", " & $k  # first output: "0, 0" (as expected)
    c = bufx[k]  # BANG!
    # ...
# ...
# open a file with (a bit more than) 64 MB of data (all in a..z, A..Z, 0..9)
let bytesRead = myfile.readData(addr(somebuf[0]), MEG64)  # Note: This works.
# ...

with `char_arr_stuff` getting passed in `somebuf`.

Nim compiles OK, but when running it, it raises "index 0 not in 0 .. -1 
[IndexError]" through `raiseIndexError2`.

Well noted: the file reading works, and `somebuf` (as well as `bufx`, which is 
just `somebuf` passed into the proc) is valid and does contain (the correct) 
data. So 0 is not within `0 .. -1`?? Side note: To be really, really sure I 
verified that `bufx.low() == 0`.

Btw @dom96

It drives me nuts that I always have to indent by hand the code I insert here. 
Probably it's just me being forum-stupid. Would you please tell me how to do it 
better?


Re: Passing a pointer to an array to a c function!

2019-03-16 Thread moerm
Update: The results I talked about yesterday were obtained with Nim compiling 
simply with -d:release, but with quite some optimization for the C reference code.

Today I cleaned up some minor loose ends and did some polishing (for both C and 
Nim) and set Nim to compile with --opt:speed plus some checks disabled (which 
is a) unnecessary in this case, and b) fair, because C has none of those checks at all).

And - I hope you are seated properly - Bang, the algorithm implemented in Nim 
is on average 2% to 3% **faster than the C version!**

And no that's not due to an error. I cross checked over 100K test vectors. The 
Nim implementation is correct.

Kudos to @Araq and the Nim team!


Re: Passing a pointer to an array to a c function!

2019-03-15 Thread moerm
I don't know about graphics stuff, but

  1. I found c2nim very helpful. Preprocessing the C source beforehand is a very 
small price to pay.
  2. (and not directly related) **CONGRATS** to the Nim developers! I just 
finished porting a (very time-sensitive) modern fast hashing algorithm to Nim 
and speed is within 1% of C (4.904 GB/s vs 4.935 GB/s when hashing 64 byte 
strings to 64 bit hashes). That's insignificant and well within the margin of 
measurement error.




Re: Newbie question about reading asynchronous text file line by line

2019-02-12 Thread moerm
The problem is that there seems to be no check for EOF.

This here works:


import asyncdispatch, asyncfile, os

proc main() {.async.} =
  var filename = "test.txt"
  var file = openAsync(filename, fmRead)
  let fileSize = file.getFileSize()
  while file.getFilePos() < fileSize:
    let line = await file.readLine()
    if line.len == 0: break
    echo line
  file.close()

waitFor main()


Re: len [0, 1, 2] fails

2019-02-11 Thread moerm
cblake, tim_st et al

Yes. I never liked Nims easy going in some points like the one that shows its 
ugly head here. Simple reason: ambiguity - as in "not _evidently and 
strikingly_ clear" \- is known to be one of the major infectors in software, 
comparable to rats in a city.

My personal rule, i.e. the way I handle it, ignoring some of Nims "generosity", 
is: a proc call has the form **foo(...)** \- always with only one exception: 
UFC with only one arg.

`[1,2,3].len` is OK, but `len [1,2,3]` is not in _my_ personal rule book.
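Concretely, the call styles under discussion, side by side (a trivial illustration; whether `len [1,2,3]` itself parses depends on the Nim version, per this thread):

```nim
let xs = [1, 2, 3]

echo len(xs)    # classic call: always unambiguous
echo xs.len     # method-call syntax; the one-arg exception in my rule
let n = len xs  # command syntax: legal, but this is where ambiguity lurks
echo n
```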

I _do_ understand why Araq calls it "fixed" (rather than "allowing for 
ambiguity"): there is no variable `len`, hence it must be a proc call.

But what do we gain? Saving two parentheses - but at a price, namely confusion.

I know, I know, I'm quite picky and somewhat like Cassandra, but decades of 
software development - and bugs! - should have taught us the lesson that 
readability should always trump saving a keystroke. This world doesn't need 
more "cool" "anything goes" languages; it needs languages that allow for the 
creation of _safe_ code (incl. maintainability ~ readability!) in as 
comfortable a way as possible. Which Nim, to a large degree - and way more and 
better than anyone else - gladly does deliver indeed, modulo some well-meant 
hiccups (like this one).


Re: how to pass a C array to C function?

2019-02-11 Thread moerm
I'm a bit careful and general in my reply because I don't know the library you 
use. First, in Nim an array has a known and fixed size. If you need a dynamic 
array have a look at seq.

Basically you should differentiate between two cases:

  1. The C func has a size parameter directly following the array/pointer 
(typically of type size_t)
  2. The C func does not have a size parameter directly following



Examples for both would be


// type 1, with size
int foo1(char *bar, int bsize); // 'bsize' is the number of chars in 'bar'

// type 2, no size
int foo2(char *bar);


The proper declaration and calling of those funcs in/from Nim:


proc foo1(arr: openArray[char]): cint  {.importc.}
proc foo2(arr: ptr char): cint  {.importc.}
# ...
var myArr: array[42, char]
let res1 = foo1(myArr)
let res2 = foo2(addr(myArr[0]))

Note that Nim automagically passes both a pointer to the array data and the size 
of the array to a C func declared to have an `openArray` parameter.
In your case - and assuming `imgdata` has a fixed size, e.g. 10 - something 
like this


type
  Image = object
    # fields
  Ihandle = ptr Image

proc iupImageRGBA(p1, p2: cint; imgArr: ptr byte): Ihandle  {.importc.}

var imgdata = [0'u8, 0, 0, 0, 0, 0, 0, 0, 0, 0]  # fixed size! array
var image: Ihandle = iupImageRGBA(32, 32, addr(imgdata[0]))

should do what you want.

Warning: my example is based on certain assumptions (e.g. the first 2 
parameters being of type cint). I strongly suggest making the effort of 
providing all needed info when asking questions, to avoid misunderstandings.


Re: Debugging - again

2019-02-09 Thread moerm
> Geany is a (more or less dead) zombie, PRs are not getting merged.

Ts, their last release was in January. It seems though that the Nim support 
project did indeed end up somewhere between the taiga and a black hole.

For your other point: OK, OK, I see it and you are right. When I said "all" I 
didn't think of Haiku or VxWorks. So I'll limit my criterion to "Win, Mac, 
Linux, BSD", i.e. all Nim-supported systems with more than 1% (0.5%? 0.1%?) of 
"market" share.

As for your "we had a survey" argument, pardon me, and I know I won't get love 
for this, but: Nim isn't the result of a survey - and that's GREAT. Nim is the 
result of one knowledgeable professional who stubbornly followed _his_ 
understanding of how a great language should be. The fact, that you also listen 
to Nim's users is an added plus. YOU are the reason that Nim isn't just yet 
another Java, js, ML, a-better-C.

Nim is the result of "A. Rumpf thinks a good language must be like ...".

I don't say that to sugar-coat things. I say it because (I expect lightning to 
strike me for heresy) democracy is a quite worthless mechanism in the world of 
technology and engineering. _That's_ why I don't care the least about "the top 
10 languages/editors/IDEs/OSs, ...". And please kindly try to follow my 
argument; that's particularly true for Nim, which tries to do things _the right 
way_, an element of which is _not_ "Windows and be done" or "Linux and be 
done".

I still think Geany is a good example of what I mean. Actually I myself only 
rarely use it and I do _not_ consider it a great editor or IDE - but 
that's not what this is about. This is about another set of criteria. One might 
describe it as "what's a reasonable minimum to work efficiently, preferably 
with a GUI?" - plus enough flexibility to allow for supporting Nim.

VSC may be a great editor for many. But VSC is also the antithesis of what we 
really need as a _basis_.


Re: Debugging - again

2019-02-08 Thread moerm
I know vim - and I also know that many strongly dislike it (and others like 
it). For the kind of basic tool set for _everybody_ that I have in mind any 
editor with a "religious" follower or a "religious" hater group seems to be a 
bad choice.

That also means that it's not about "the best editor" and, in fact, not even 
about a particularly good editor, but about one that a) is acceptable (at least 
as a good compromise) and preferably even liked by many, b) is available on 
_all_ Nim-supported desktop and server platforms, c) has modest dependencies 
and demands (disk space, memory, processor), and d) is (relatively) easily 
configurable and versatile and allows for the (relatively) easy creation of an 
interface to the Nim compiler back end.

I think Geany demonstrates well what I mean. Some even love it, most can easily 
work with it and have no bad feelings re. Geany, and only very few hate it. That 
IMO important requirement is not met by either vim or Emacs (although many value 
them highly). That said, I personally would prefer Textadept because it meets 
all criteria and also works in the terminal.

Re: Debugging - again

2019-02-08 Thread moerm
1) Thanks for the hint! Now, with the "trick", setting a breakpoint at a 
proc does work. Great. Unfortunately though, the gdb problem is still there. 
Maybe it's a version thing; I'll try a more current Gcc/Gdb later and will let 
you know in case that works. Btw. the Nim version on FreeBSD is 19.2 - kudos to 
the port maintainer!

2) Nope. I want/need that tool set for _every_ Nim-supported OS. Not working on 
_any_ of those should be a reason for exclusion. And btw. on servers the BSDs 
_are_ important players.


Re: Debugging - again

2019-02-08 Thread moerm
First, thanks for your constructive and friendly reaction.

I have a module, let's call it F.nim with, say, proc "fsysctl" that calls into 
FreeBSDs sysctl. Module F is imported into and called by, let's call it 
prog.nim. Within prog.nim I have a "tmain" proc that sets up some variables and 
then calls fsysctl from F.

(After compiling with --lineDir:on and --debugger:native ... and `gdb prog` (in 
the source directory)) when trying `b tmain` gdb says "function tmain not 
defined in prog.nim". Hmm, OK, let's be modest and just break at the first line 
within tmain, line 5 (something like `var foo: int = 42`), which _seems_ to 
work ... but then when running ("r" in gdb) I get "warning: could not trace the 
inferior process", then "warning: ptrace: operation not permitted", and finally 
"During startup program exited with code 127".

Re "editor" and "gdb is ugly":

You are absolutely right, gdb of course can be used without a nice GUI. And in 
fact I wouldn't have complained if it worked - but it doesn't, at least not on 
FreeBSD (using gcc6-aux, which is just gcc6 plus Ada support). BTW, using clang 
throws up an error within Nim with my (quite tough and tight) flags (something 
about "gcc ranges").

But we should also keep in mind that _very many_ developers aren't used to 
"naked" gdb but rather to a nice GUI. And it more often than not really makes 
debugging a lot more efficient.

As for the editor: I - and certainly not just me - want, need, must have a 
"basic tool set" that works on all supported systems, and the editor, a critical 
component of that tool set, should be relatively small and have modest 
dependencies. VSC (and some other bloat monsters) just don't cut it and are not 
available everywhere. Geany and Textadept are both reasonably small with modest 
dependencies (that are available!) and are flexible (or programmable) enough to 
build Nim support on (for Textadept some support already exists).

I'm absolutely convinced that if we could say - and show - that Nim has a 
reasonable tool set - incl. a not too shabby and widely acceptable editor - and 
at least some kind of basic GUI debugging, that would be a very major factor in 
attracting more Nim users. 


Debugging - again

2019-02-08 Thread moerm
I have wasted 2 days now hunting down a problem on FreeBSD and in particular 
trying to get some acceptable way to debug my Nim code in the first place.

Result so far: about the only way to debug Nim code known to me is the bloated 
perverse Visual Studio Code monstrosity - which isn't available on FreeBSD 
anyway.

Simply using gdb - which isn't exactly user friendly - doesn't work either. All 
I see (gdb -tui) is "system.nim"; setting a breakpoint on a function doesn't 
work at all, and setting one on my main function's line number is accepted by 
gdb but doesn't work.

All I get is "warning: could not trace the inferior process", then "warning: 
ptrace: operation not permitted", and finally "During startup program exited 
with code 127".

So, what am I to take away? Am I back to writing dummy C functions to check 
whether Nim passes the parameters the way it should? Plus a whole load of 
debugEchos? And no, I can _not_ develop on Linux (where VSC works) and debug on 
FreeBSD; for one, my code is FreeBSD specific, plus gdb seems to not properly 
support Nim on FreeBSD.

I'm seriously pissed off. I have commented, discussed, begged and preached for 
weeks how important it is to have a usable basis working reliably only to get 
dressed down. To add insult to injury I'm stuck now and bleeding time due to 
exactly the damn problems I complained about. VSC not available on a - Nim 
supported! - platform and gdb not working (with Nim code). Thanks so much.

Bloody give us a reliable reasonable minimum before adding more gadgets, 
please, pretty please!

Here's my thinking:

  * VSC - utterly unacceptable perversion: fat, with quite many (and fat) 
dependencies, etc.
  * vim - a six legged creepy animal to some, the greatest thing since sliced 
bread for others. Probably not a good choice for a basis (those who don't love 
it tend to fear it).
  * emacs - an OS with a mediocre editor. Probably not a good choice either.
  * sublime - I love it, but it's not open source plus it's not available on 
all platforms (e.g. BSD)
  * textadept - not as cool as sublime I guess but a nice and highly 
configurable editor with lots of flexibility based on Lua.
  * geany - loved by many and certainly acceptable as a reliable basis. Also 
works on BSD.
  * a bunch of editors that are Electron- or XUL-based. Obviously not acceptable.



My favourite would be textadept. It's open source, relatively small, and works 
both as a gtk based graphical as well as a cli (terminal) editor. Available on 
Windows, Linux, Mac, BSD - plus there is some Nim support module available that 
works (but could profit from some love). Should lend itself very well to 
becoming a Nim "basic IDE" (or luxurious editor). GDB interface should be quite 
feasible. My second choice would be Geany.

Risking not making friends, but it seems important: One of Ada's major 
advantages (practically speaking) is GNAT and its IDE, which _is_ available 
pretty much everywhere and has a _working_ debugger. Hell, I'd be better off 
(in this specific case) even with C, due to a choice of well supported editors 
and/incl. debuggers.

This is _not_ a luxury issue or one of personal taste. It's a thrive or starve 
issue that makes Nim a _practically_ usable everyday language - or not. It 
_MUST_ be possible to have a reasonable and reasonably small editor with Nim 
support as well as _good_ gdb support on _all_ supported platforms!


Re: questions on binding C DLL

2019-01-18 Thread moerm
Re question 1: Where does that `void Fl_Widget_TrackerDelete(Fl_Widget_Tracker* 
& wt);` declaration come from? I doubt that it's coming from FLTK and I'm not 
surprised that c2nim doesn't digest it. Are you sure that the `&` is really 
there before `wt`?


Re: questions on binding C DLL

2019-01-17 Thread moerm
We are not in your head, nor do we have the C (or Basic?) source code in front 
of us. So if you throw some code line at us without any context it'll be hard 
for us to help you. Also note that the intersection of Nim and Basic developers 
is probably very small (as opposed to Nim and C), so asking questions along the 
line of "In Basic it's done like that. How in Nim?" risks severely limiting the 
set of people willing and capable of helping you.

As for your question 2: Nim knows that `proc` is obviously about an address. 


Re: the ignoring of _ make some translation need more work

2019-01-16 Thread moerm
The following _does_ work:


// file ccall.c
int some_weirdly_named_func(int p)
{
   return(p += 10);
}

Run


# file ccaller.nim
{.compile:"ccall.c".}

proc weirdCcall(p: cint): cint {.importc: "some_weirdly_named_func".}

let r = weirdCcall(3)
echo $r

Run

Note that Nim _can_ import `some_weirdly_named_func()`, as one is free to name 
it in Nim as one wishes (in the example, `weirdCcall`). One might as well stick 
with `some_weirdly_named_func`, but then Nim will treat the Nim-side name as if 
one had called it `someweirdlynamedfunc` - yet still call the correct C 
function.
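
The identifier rule described above can be demonstrated in pure Nim, without 
any C involved. A minimal sketch (my own illustration, not from the thread): 
except for the first character, Nim compares identifiers case- and 
underscore-insensitively, so both spellings below refer to the same proc.

```nim
# Partial "style insensitivity": after the first character, case and
# underscores are ignored when comparing identifiers.
proc some_weirdly_named_func(p: int): int =
  p + 10

# Same proc, different spelling:
doAssert someWeirdlyNamedFunc(3) == 13
```

This is exactly why the imported C name survives unchanged while the Nim-side 
spelling is flexible.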


Re: I do not perceive the advantages of Nim over C #

2019-01-15 Thread moerm
Although this thread is rather old I'll answer anyway because certain attitudes 
and questions come up again and again. It might be useful to have something to 
link to instead of saying it again and again.

Front-up: My first gut reaction to the OP was "Oh well, if he likes C# so much, 
why doesn't he just stick to it and be done". In other words: We are not 
salesmen, we love Nim and we are certainly happy if new users join but we are 
not salesmen. If you feel that language XYZ is so much better than Nim then 
simply stick to it. Simple as that. In fact, you might be perfectly right in 
thinking that XYZ is better than Nim - _for you_.

There simply is _no_ best language. A language may be the right thing for many 
users and a broad spectrum of problems but _no_ language is generally the best.

In my case (brutally compressed: reliable and safe software) I used Ada for 
quite some years and was _almost_ quite happy. In fact I've said a lot of good 
things about Ada (and still do). But there were those "buts". Nim addresses 
those. Well noted, Nim is _not_ "better than Ada". It just happens to offer a 
better trade-off for me. At least as of today, Ada is (still) better in certain 
aspects, but those are largely about "paranoid safety levels" (like real formal 
verifiability) and some things like much, much better docu (having existed for 
decades, that's not astonishing).

I usually do not need those "paranoid safety levels". Software is almost never 
unsafe or insecure because AES-128 was used instead of AES-256 or because it 
breaks after having ca. 2 billion users (due to signed int32), etc. Nope, stuff 
breaks due to factors like creepily lousy code, utterly lacking design (and 
care), pressure by management demanding features, features, features, and inept 
languages (often (ab)used by inept developers).

Nim addresses those issues to a large degree. And - that's very important - it 
makes it _easy, natural, and comfortable_ for the developer to write 
_considerably_ fewer bugs. And by "considerably" I mean something in the 
range of 80% to 95% fewer bugs. In other words: Creating something like
Heartbleed (SSL) was almost inevitable with the C code base. Creating 
Heartbleed using Nim would need a _serious_ level of carelessness or ill will.
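
To make that claim a bit more concrete, here is a minimal illustration (my own 
sketch, not anyone's real code): Nim's default runtime range checking turns 
the kind of out-of-bounds read behind Heartbleed-style bugs into a catchable 
error instead of silent data disclosure.

```nim
# An over-read is caught by Nim's default bounds checking at runtime.
var buf = [1, 2, 3]
var i = 3
i += 2                  # runtime value 5, outside the valid range 0..2

var caught = false
try:
  discard buf[i]        # out-of-bounds read
except IndexDefect:     # named IndexError in pre-1.4 Nim
  caught = true
doAssert caught
```

One would have to compile with checks explicitly disabled to get the C-like 
silent over-read.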

Two other points that are relevant for quite many are:

Nim runs on quite many architectures and OSs. Unlike C#, whose non-Windows 
support was an ugly sibling and an afterthought, Nim wasn't designed for any 
particular architecture or OS.

Nim offers the full feature set that is needed for many applications today. 
Support for both multithreading and async/await, as well as for immutable 
variables, comes to mind, as do many (seemingly or factually) smaller things 
like defer, a really, really nice C interface, or the fact that one can target 
JavaScript (and end up with something that is not crappy).


Re: "Nim needs better documentation" - share your thoughts

2019-01-14 Thread moerm
I'm somewhat torn between the two camps. My compromise: the Nim way for _short_ 
proc docu (1 - 2 liners) plus, if more than a couple of lines are needed I add 
something like `More Info: `Details `_`.

This allows me to stay Nim style and still have a clear link to more elaborate 
proc docu.


Re: New "Learn Nim" page

2019-01-10 Thread moerm
I think that's largely a question of your needs and goals. If you want 
programming as a hobby, Python might be a better choice. If on the other 
hand you plan to do programming more seriously, learning Nim is probably a 
better investment.

Reason for my opinion: Nim is a quite easy to learn language _for a serious 
system programming language_, plus it's very versatile; Nim covers "quick and 
dirty jobs" (typically done in Python or Ruby) and it also allows for, 
say, creating full servers (e.g. a web server). Python on the other hand is 
even easier to learn and has many more libraries, docu (incl. hundreds of 
websites about Python) and better tool support (like IDEs, editors and 
whatnot). It should be noted though that Nim being (not hard at all but) a bit 
harder to learn than Python is mainly due to what Nim is capable of and that 
Nim cares a lot about creating _good_ and _reliable_ software (which Python 
doesn't care about much).

My personal opinion/advice is that you should stick with Nim at least for a bit. If 
it turns out that you find it to be too hard or if you feel that you would need 
much more and better docu, or ... you can still switch to Python and lose 
nothing (and much of what you learn with Nim will stay valid and be helpful). 
It shouldn't take more than a few weeks (with an hour or so per day) playing 
with Nim to find out whether you and Nim get along well.


Re: New "Learn Nim" page

2019-01-10 Thread moerm
I'll add something to juancarlospaco's helpful explanation:


var
  k: array[10, int]

for i in k.low .. k.high:
  k[i] = (i + 1) * 10
  if k[i] mod 2 == 0:
    echo k[i]

Run

Note the `.low` and `.high` in the loop. That's a good way to avoid index 
errors and also allows you to later change your array to, say, start at 1 
instead of 0 and the loop will still work fine.
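
To illustrate that last point, here is a small sketch (my own example) of the 
same loop over an array whose indices start at 1: because the loop is written 
with `.low` and `.high`, nothing else needs to change.

```nim
# An array indexed 1..10 instead of the default 0..9.
var k1: array[1..10, int]

# The .low/.high loop adapts to the index range automatically.
for i in k1.low .. k1.high:
  k1[i] = i * 10

doAssert k1.low == 1
doAssert k1[10] == 100
```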


Re: Nim vs D

2019-01-10 Thread moerm
I agree with some points but that

> The NIM project founder is sort of a one person show in development and 
> promotion.

is plain wrong.

Not having a large organization like Mozilla behind it, Nim obviously cannot 
compete with some "competitors" in terms of how many full-time developers it 
has - but it does have some, plus more devs who are not in the core team but 
who contribute significantly to Nim. 


Re: New "Learn Nim" page

2019-01-10 Thread moerm
Nice! Thanks, Nim team!


Re: "Nim needs better documentation" - share your thoughts

2019-01-10 Thread moerm
I agree with Araq's statement, and in some cases talking bad about language XYZ 
indeed is trying to lift up one's own or preferred language (or at least 
strongly looks like it). But there's a big fat "but": Better languages, at 
least to some significant degree, are often created _because_ of bad things in 
other languages and/or the desire to avoid those problems.

On a somewhat deeper level, a language is a much more complex thing than just 
some technical points. Many factors, some of them often not seen, like 
psychology and philosophy, play important roles. Also, the very reason for quite 
some languages having been created - incl. Rust - lies in the dark spots of what 
was available. In the case of Rust, for instance, it was (to word it neutrally) 
the memory handling related problems of C and many of its children and 
derivatives.

I see in Nim a language that really and properly (and often elegantly, which 
isn't just pleasant but a well established albeit not easily tangible symptom 
of engineering quality) solved many pressing problems by recognizing and 
analyzing them and then creating a much better solution.

In other words: Nim is great because Araq _did_ see and experience the poor 
choices and smelling points of other languages - and then - successfully - 
tried to do it much better.

@JD

I also didn't like your statement. Not because you are not right; you probably 
are to a large degree. But because I was missing the connection to Nim and the 
constructive motivation.

I personally detest Rust. But it's neither necessary nor worthwhile to tell 
that - unless the perspective is constructive: "how can or did we do it better? 
What should be learned from Rust's poor choices and approach? Why is it bad in 
the first place and how could it be done better?". That is what I missed in 
your post.

That said one should also be fair enough to say that Rust does have some good 
points where it's much better than C or C++. In particular Rust at least 
addresses the ownership question (wrt. memory). I don't think that their 
approach is the right one but I _do_ recognize that they did some analysis and 
seriously tried to do better.


Re: Nim Advocacy & Promotion Strategies

2019-01-07 Thread moerm
> And why do you think that such crowds would take "use Aporia" as a satisfying 
> answer?

Because afaik it is a real IDE (unlike e.g. VSC).

> I think you're (too easily) dismissing ...

I don't think so. I _know_ that many don't want an IDE at all and that many 
want to use their favourite editor.

But I also know that the "is there an IDE for it?" question is one very many do 
ask. Go, Zig, ... you name it; just look at stackoverflow, HN, quora, etc. - 
that question is very prominent. (And again: I personally don't care that much 
for an IDE. But the discussion here is not about what moerm likes; it's about 
how to promote Nim - what many like and what typical demands come up very 
frequently.)


Re: Nim Advocacy & Promotion Strategies

2019-01-07 Thread moerm
Besides the fact that almost always the large IDEs also join in once a language 
already has a solid base and uptake I fully agree with you. Again: My point was 
_not_ that the Nim team should create our _own_ IDE. It just so happened to be 
the case that two IDEs were made (and went quite far) by the Nim team.

Also again: The IDE issue is not even particularly _my_ point. I can (and do) 
live with Nimlime and gdb and occasionally VSC. It's just that the IDE question 
invariably comes up whenever a new (or not widespread) language is talked 
about.

In other words: I can live quite OK with the current situation. But when we try 
to promote Nim the IDE question will come up and frequently so. And many won't 
take "Use VSC" as a satisfying answer. One example is the vim crowd. Another 
one is the Sublime Text crowd (plus probably Windows devs, but I don't know 
that world well enough to make a tenable statement).


Re: Nim Advocacy & Promotion Strategies

2019-01-07 Thread moerm
> There is not much professional behaviour in your posts here, some would call 
> you a troll, consider this to be my first and only warning.

My engagement is honest and my motivation is constructive. That's why I also 
tried to help with concrete problems of users (like just today in the dll 
callback thread). But you dressed me down (in another thread) and now you 
threaten me. Maybe _you_ should think about "acting professional" yourself. Not 
every critical voice is an enemy.

As you obviously didn't get it: I love what you created, I highly value it and 
I also highly value the work of you and the Nim team. It's _because_ I highly 
value Nim that I'm also speaking about its less shiny points (hoping that they 
will be solved).

What's your intention when you threaten to ban me? If I really were a troll I 
could find better (more visible) places to trash Nim and you couldn't stop me 
anyway. (pure theory. I like and value Nim way too much to trash it and I even 
profoundly respect you (modulo your intolerance and being quite touchy)).

In case I was wrong on some point and really was unfair to you, I apologize. 
Honestly. Simply correct me and I'll even be pleased (seeing Nim getting better 
and more easily usable means much more to me than being right).


Re: Nim Advocacy & Promotion Strategies

2019-01-07 Thread moerm
My point wasn't that Nim must have its _own_ IDE. My point was that having an 
IDE is a major point wrt uptake and that Nim just happened to have some 
considerable effort going into 2 IDEs but left them unfinished.

Well noted, my line isn't what _I_ personally like or want but what is 
generally and widely considered as strongly desirable (see for example plenty 
of questions on stackoverflow and elsewhere along the line "Is there a good IDE 
for [any language]?").

As a side note I also find that point interesting because one does not need to 
be a Nim guru to create e.g. syntax highlighting for some of the top 10 editors, 
and afaik even far more is within mere mortals' reach (e.g. auto-completion or 
a gdb interface). That means that the Nim core team could be (at least largely) 
kept free from that burden, yet the language could gain attractiveness. Of 
course the precondition is enough users being convinced enough that their 
investment makes sense (which again is probably an analog of the general 
perception and uptake of Nim). 


Re: still no success binding DLL's callback function

2019-01-07 Thread moerm
Here is your problem:


Fl_Callback* = proc(widget: ptr Fl_Widget, pData: ptr any)

# ...
proc ButtonClick(button: ptr FL_WIDGET, arg: ptr any): cint {.cdecl.} =

Run

In other words, `Fl_Callback` is not cdecl and its return type is void while 
`ButtonClick` is cdecl and its return type is cint.
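
A self-contained sketch of the fix (the type names here are stand-ins I made 
up, not the real FLTK binding, and I use `pointer` in place of `ptr any`): the 
callback type and the handler must agree in both calling convention and return 
type, and then the assignment compiles.

```nim
type
  Fl_Widget = object                  # placeholder for the wrapped C type
  # Callback type: cdecl, returns void - matching what the C side expects.
  Fl_Callback = proc (widget: ptr Fl_Widget, pData: pointer) {.cdecl.}

# The handler uses the same convention and return type as Fl_Callback.
proc buttonClick(widget: ptr Fl_Widget, pData: pointer) {.cdecl.} =
  echo "clicked"

let cb: Fl_Callback = buttonClick     # compiles because the signatures match
cb(nil, nil)                          # prints "clicked"
```

Had `buttonClick` returned `cint` or lacked `{.cdecl.}`, the assignment to 
`cb` would be a type error - which is the mismatch described above.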


Re: Nim Advocacy & Promotion Strategies

2019-01-07 Thread moerm
So? We do not at all disagree on that. You basically just worded differently 
what I meant (modulo some minor points like "imperative" vs. "imperative with 
some functional").

I'm _not_ for relaxing those points. Quite the contrary. I was addressing 
another question, namely whether fully implementing that basic feature set 
should be delayed because always a new gadget here, some niche stuff there 
comes up.

My basic line is

  * clearly and _bindingly_ define what Nim is
  * implement that. Consistently. Fully (incl. good docu and explanatory 
material).
  * make sure there are sound basic tools (e.g. debugging, C conversion, ...) 
and reasonable support for a choice of tools (e.g. editors).



In other words: Build a solid and full Nim 1.0

Then - and only then - _additionally_ work on gadgets, additions for niches, 
etc.

Example: (afaik; I don't check twice a week) we still do not have proper full 
uint64 support. Hell, it's 2019, with 64-bit processors being the norm, and 
instead of getting something as basic as uint64.high, i.e. consistent integer 
support, Nim cares about what-not niche stuff.

Again: I'm in no position to demand anything. The Nim team can do whatever they 
feel like and we have to gratefully eat what's served (IF we want to use Nim). 
But this discussion is about promoting Nim and hence I ask what exactly we are 
to promote? Plus: How seriously can we ask for a language to be taken (and 
hence be promoted) if its very developers seem to be often distracted and seem 
to not fervently strive for whatever-is-Nim-is-defined-to-be v. 1.0 incl. the 
practical basics?

What, for instance, will I respond to someone to whom I try to promote Nim when 
he asks me why two - not one but two - Nim IDE projects have been left to rot, 
while one of the typical first questions re. languages is "Is there a good 
IDE?"? And what should I respond when that person follows up asking whether 
that fact (two disbanded IDE projects) suggests a tendency in Nim's team?

I'm a professional. I earn my living with software development. I simply can't 
afford to make a bad choice. I - and certainly not only just I - need Nim to 
have clear contours, to reach 1.0, and to deliver.

Similarly we know from concrete experience that developers _need_ a good and 
complete tool set and good docu helping them to get up to speed and to have a 
solid reference.

I learned a lot from Haxe, another somewhat outside-the-mainstream language 
with very high potential. Although it _is_ a good language (in its field) and 
_does_ have a very attractive feature set (and even was promoted and even hyped 
for some time), all in all it failed. Why? (imo) (very) poor editor and IDE 
support, no debugger (outside of Adobe Flash), too one-OS (Windows) centric. 
And I'm honestly sorry to say that Nim might have quite similar problems.

We should make sure that Nim isn't one of those projects with ever new ideas 
and experiments that fails to finish and win the race because it got lost in 
too many distractions.


Re: Nim Advocacy & Promotion Strategies

2019-01-06 Thread moerm
It can be a problem though, because in today's "language rich" environment with 
hundreds of languages, a language catering to the needs of 10 niche groups not 
only is highly likely to fail to gain significant traction but also risks 
losing its main target group.

Re js: That's something I don't see as a problem as long as it's just a back 
end and not something that triggers changes in Nim itself. In fact additional 
back ends are a good example for _additional_ features. My only worry there 
would be that not having an (additional) back end is better than having a half 
cooked and not long term maintained one (which however seems to not be likely 
with Nim).


Re: Nim Advocacy & Promotion Strategies

2019-01-06 Thread moerm
I'm a lousy marketing or sales guy but one thing even I understood and keep in 
mind is this: the result of communication is _not_ what has been said but what 
has been heard. As promoting a language (and arguably even building one meant 
not only for oneself) is a "use case" of communication it seems logical to me 
that the relevant part (in the context of promotion or even just introduction) 
is _not us_ but rather the target audience.

I agree with your definitions - but: the relevant question is -> what do the 
people think and want and understand to whom one tries to promote Nim? Plus: 
will they, looking closer, find their perception and our "promise" confirmed?

One point that keeps me thinking is this -> versatility != jack of all trades 
without clear contour.

That's why I talked of "the core". What is Nim meant to be at its core? If it 
serves well in other fields/beyond its core _in addition_, that's great. Right 
now I feel that Nim's contours are somewhat vague; at times it almost seems 
that Nim is defined by whatever some of its developers feel to be important at 
any point in time.

Another point I feel to be of importance is the well established rule that most 
people don't care a lot _how_ something is achieved. They care about _what_ is 
achieved. Similarly they don't care a lot about mechanisms but about results. 
At least at first and most.

Python, for instance, was/is promoted as (and indeed is) "easy, everywhere, 
versatile, lots of batteries" and Guido van Rossum evidently wasn't just a 
benevolent dictator but also one who made sure that Python always had a clear 
core definition and contours and delivered on that and its "promise".

What is Nim's core and "promise"? To promote Nim we need an answer to that and 
one that is clear cut, understandable, binding, and consistently observable.

I'll end with a personal view: I looked at many languages and tried quite some. 
I was drawn to Nim by my impression (largely based on material on Nim's web 
site) that Nim covers what C covers but with a sharp eye on safety and 
consistency as well as with a more modern feature set (e.g. UFCS). In the next 
step I verified that Nim _really_ cares for safety and really covers the field 
at least to a large degree. Only then did I actually download and try it.

Frankly, if my need (for certain features) were a bit less urgent I would have 
left quite soon due to some shortcomings - "funnily" largely not in Nim but 
around it (editor, debugging, docs). Another point that makes me unhappy is 
that it sometimes seems (apologies in advance if I'm simply mistaken) that some 
core developers seem to not consistently pull through and do so clearly focused 
on a well established definition of what Nim is meant to be but rather "jump 
around" with an ever changing interest in this or that new feature. Inter alia 
two Nim IDEs seem to be symptoms of that. And 1.0 still seems to move and move 
and move into the future.

So, what are we to promote?


Re: Nim nightly builds

2019-01-06 Thread moerm
Thanks.


Re: Nim nightly builds

2019-01-05 Thread moerm
Great (although I personally prefer halfway stable releases). Thank you.

Out of curiosity: how come the jump from 19.2 -> 19.9?


Re: Nim Advocacy & Promotion Strategies

2019-01-05 Thread moerm
Front-up disclaimer: I fell into the being-excited trap due to what Nim has to 
offer, too.

Re the OP issue: To promote a language one should first know what it actually 
is one wishes to promote. So: what is Nim at its core? What is _the_ 
definition/goal of Nim?

Is it a system language, or is it one targeting the web (js)? Or one that wants 
to address many niches? Whatever it is desired and decided to be should be a 
clear and binding frame, at least for some time (well beyond v. 1.0).

It seems to me that to promote Nim we should have a clear, binding, and 
effective understanding of what it is meant to be because those to whom we 
promote it will check whether it actually delivers and whether they are 
interested. Side note to avoid misunderstandings: I do not care (anymore) what 
Nim is meant to be. But whatever the Nim masters decide it to be should be 
targeted _consistently_.

It's no shame for a not yet mature language to not offer everything and the 
kitchen sink. It _is_ a problem, however, if what it _actually_ is (and 
develops into) seems to be more a question of what just happens to be of 
interest to its developers at any point in time than of growing and extending a 
well defined core.

Moreover, before promoting something one should be sure to have in place what's 
needed for newly won users to actually use it. Re. tooling I feel that Nim is 
well equipped (modulo debugging and, but that's maybe just me being picky, 
editor support); nimble, c2nim, etc. as well as nimdoc are useful and working 
(some polishing is needed but that is not critical I think). As for 
documentation I will refrain from commenting as my view seems to be different 
from the whole universe.

To put it bluntly I think that the question should not yet be how to promote 
Nim. The question should be _what_ we are to promote (a system language? 
A do-a-bit-of-everything language? The cool new web thingy ...?) and how a status can
be achieved in which promoting Nim makes sense and seems promising. In other 
words, a Nim beyond, say, 0.7 that has clear contours, is _consistently_ 
evolving within those contours instead of being somewhat of a chameleon, and 
that targets actual and _practical_ usability for the very developers (read: 
not just the Nim team and insiders) it is promoted to.

> But yes, either @Araq will make some tough choices, bite the bullet and 
> remove niche features or we won't ever get 1.0.

I'm surprised to read that here (and Araq not angrily dressing you down) but: 
_100% ACK_. I also feel that without certain changes Nim will be at 0.3 in 
2025 and have many more features and gadgets but not many more users.


Re: "Nim needs better documentation" - share your thoughts

2019-01-03 Thread moerm
Thanks for that clear statement. I'm learning and drawing my conclusions.


Re: trouble during wrapping a windows DLL

2019-01-03 Thread moerm
I see 2 major factors and the only reason I'm mentioning them is my optimistic 
hope that someone concerned will read and think about it:

  * Nim's docu is poor in many places, let's be honest.
  * people should understand that there is a strong relation between the 
quality of a question and the quality (as well as the chance) of answers.



I have the good will and the readiness to help and even some experience with 
Nim interfacing C. What I do not have, however, is the willingness to read 
through more than a couple of (relevant!) lines of code, to wrap my head around 
whole projects of users needing help and in particular I am absolutely 
positively not ready to invest more time and effort than the one asking a 
question.

I'm sorry to word it that rudely, but it's not really news. It has been written 
a thousand times in diverse IT forums of diverse kinds: Do _not_ just 
mindlessly plunk a rather unspecific question - in particular one like "It 
doesn't work!" - into a forum and expect help!

First rule: make it easy to help you. Help others to help you! Second rule: 
think at the very minimum enough about your problem so as to be able to word it 
clearly and concisely! Btw, the reward for that effort will surprisingly often 
be that you actually don't need help anymore (or way less help) because trying 
to present a problem well often leads to seeing a solution.


Re: "Nim needs better documentation" - share your thoughts

2019-01-03 Thread moerm
As a somewhat rude demonstration: From the help for module system


proc GC_ref[T](x: ref T) {...} # empty
proc GC_ref[T](x: seq[T]) {...} # empty
proc GC_ref(x: string) {...}
 
 marks the object x as referenced, so that it will not be freed until it is 
unmarked via GC_unref. If called n-times for the same object x, n calls to 
GC_unref are needed to unmark x.   Source Edit

proc GC_unref[T](x: ref T) {...} # empty
proc GC_unref[T](x: seq[T]) {...} # empty
proc GC_unref(x: string) {...} # "see the documentation of GC_ref."   i.e. 
basically empty, too

Run

This, ladies and gentlemen, is what I would call a major clusterf_ck, and it 
doubtlessly results in _many_ unnecessary problems, bugs, and trouble, as well 
as in people seeing that, turning around, and walking away from Nim.

It's also an example of what I meant recently: it's next to worthless toying 
when ever new gadgets are worked on and introduced while significant parts of 
the bloody vital core of Nim are between unusable and a lottery due to poor, 
ridiculously minimal, or even absent docu.

Unless, for instance, the C FFI is _sufficiently_ (interpretation on the 
generous side, please) and _properly_ documented (e.g. -> GC_[ref|unref]()) we 
have no right to truthfully assert that Nim has a good C interface. (Hint: Araq 
recently said that the manual on GC_ref/unref is basically wrong).

Simple rule: if something isn't _well_ documented it should be regarded as 
non-existing.
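
For contrast, here is the kind of minimal usage sketch such docu could contain 
(my own illustration; the `Data` type and values are made up): `GC_ref` keeps 
an object alive while only foreign code holds a raw pointer to it, and a 
balanced `GC_unref` releases it again.

```nim
type Data = ref object
  value: int

proc makeHandle(): pointer =
  # Hand a raw pointer to C: GC_ref keeps the object alive even though
  # no Nim reference to it survives this proc.
  let d = Data(value: 42)
  GC_ref(d)
  cast[pointer](d)

proc useAndFree(p: pointer) =
  # Later, when the C side is done with the handle:
  let d = cast[Data](p)
  doAssert d.value == 42
  GC_unref(d)          # balances the GC_ref; d may now be collected

useAndFree(makeHandle())
```

Two sentences and ten lines of example - that is roughly the gap between the 
current "empty" entries and docu one can actually work with.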

Finally, allow me to bring up another aspect in that context. "We don't do XYZ 
yet" is not nice but at least honest. "We do XYZ and we even do it well!" - but 
then failing the user who gives Nim a chance - is an invitation to wasted time, 
disappointment, and sooner or later a seriously tainted reputation. Recently 
there was a discussion here re. how to attract new users. Hint: Not pissing 
them off by making them waste their time and having to walk all over the place 
(official docu, forum, other web sites, source, etc.) to find bits of 
(hopefully correct) information would seem to be a good start.


Re: "Nim needs better documentation" - share your thoughts

2019-01-03 Thread moerm
  * real documentation - as opposed to just proc comments
  * Examples, examples, examples
  * Explanations for more difficult matters like async, threading, etc. 
Something like a page or two only "generally" explaining how those work in Nim 
(from there the user can go to the module's normal docu and learn practical 
details).



The basic structure is this: first give the user an "overview" of how e.g. Nim 
threading works. Then give him examples showing how it's done in Nim. Finally, 
have good docu for the language (which I find not great but acceptable) and the 
libraries (which I often find poorly docu'd).


Re: [help needed] nim version of: COMPARING PYTHAGOREAN TRIPLES IN C++, D, AND RUST

2019-01-02 Thread moerm
We are in agreement if I understand you correctly.

I don't care whether Nim code runs 0.5% or 3% slower than C code. In fact, I 
think that whole benchmarking is irrelevant except for a rough overview ("Nim 
is within x% of C's speed").

Reason (and C/C++/D developers might want to read this carefully): Looking at 
the code created by - even very good - C compilers everyone with some not 
insignificant knowledge of ASM will concur that developing in ASM is the way to 
get the fastest code. Unfortunately though, ASM (next to some other issues like 
portability problems) also has the major disadvantage of being ridiculously far 
away from the problem domain. So, if we were serious about it, we would need to 
make the axis longer to fit ASM in quite a bit beyond C, at the extreme end of 
runtime speed but also at an extreme distance from the problem domain.

In other words: if we _really_ were obsessed with runtime speed we would choose 
ASM over C - but we don't, i.a. because we would pay with a very significant 
increase in development time, more and harder-to-spot bugs, etc. So the reality 
is that C developers already made a compromise, trading development speed for 
runtime speed. Nim developers do the same - but with a much, much better deal; 
we get much lower dev. time, fewer bugs, etc. for what in the end is a 
ridiculously low runtime price, even if it happened to be 5% less speed than C.

And btw - albeit largely theoretically right now (I assume) - we even _could_ 
compensate and reach C's RT speed, because Nim's compiler has much more 
information than a C compiler (almost always) has, due to factors like Nim's 
strong static typing; that allows the Nim compiler to generate C code which in 
turn allows the C compiler to create faster code. 


Re: Use GC_ref on a pointer

2019-01-02 Thread moerm
Yes and no. Yes, you are right; I had a wrong (and since then corrected) 
statement in my post. And yes, it does work, but it's almost certainly not what 
he wanted anyway (and a weird way to do it). But yes, I got confused myself.


Re: Use GC_ref on a pointer

2019-01-02 Thread moerm
Yes and no.

Yes insofar as you can of course allocate whatever you please and pass those 
pointers around and/or use them. No insofar as you must be careful to not 
wildly mix up Nim-allocated objects and pointers. In your example


proc newMyObj(): MyObj =
   result = MyObj(x: 42)

proc allocMyObj(): ptr MyObj =
   result = cast[ptr MyObj](alloc(sizeof(ptr MyObj)))
   result[] = newMyObj()

Run

there are multiple problems. For one, `alloc(sizeof(ptr MyObj))` only 
allocates room for a _pointer_, not for a `MyObj`. More gravely though, 
`result[] = newMyObj()` copies what `newMyObj()` returns into that too-small 
block, which is not what you want here.

What you actually wanted is probably this:


proc allocMyObj(): ptr MyObj =
  result = cast[ptr MyObj](alloc(sizeof(MyObj)))
  result[].x = 42

Run

Alternatively you could also use your `newMyObj()` proc _instead_ of 
`allocMyObj()` but then your object were a Nim (allocated) object and taking 
its address to have a pointer to pass around (to C code) would risk to end in a 
weird situation because from Nims point of view _it_ is in charge of that 
object while from C's point of view it could do with that pointer whatever it 
pleases and assume its - not guaranteed - existence etc.

Maybe the following code can help you to see what I mean


type
  MyObj = object
    x: int

proc newMyObj(v: int): MyObj =
  result = MyObj(x: v)

proc allocMyObj(): ptr MyObj =
  result = cast[ptr MyObj](alloc(sizeof(MyObj)))
  var tmp = newMyObj(42) # under the hood a Nim ref
  # show it from Nim's point of view
  echo "tmp: " & tmp.repr() & "tmp address: " & repr(tmp.addr())
  echo "allocated object: " & result.repr() # now show the alloc'd object
  # but return the other one bc. result has been overwritten
  result = tmp.addr()

# -- main --
let mo = allocMyObj()
# the allocated object is lost. We have no way to access or free it
echo "After allocMyObj(): " & mo.repr()
mo.dealloc() # deallocates which one? Obviously *not* the alloc'd one
let mo2 = newMyObj(43) # try it the other way
echo "After newMyObj(): " & mo2.repr()

Run


Re: Use GC_ref on a pointer

2019-01-02 Thread moerm
The way you asked your question (hint: vague, general, not clear what you 
really want), Araq responded perfectly correctly.

Now you ask us to read a 150-line file, and we are supposed to find out what 
you want and what your problem is by reading through and making sense of your 
code. Chances for that to happen are slim, I guess.

Maybe helpful: sometimes C stuff wants/needs some structs and stuff allocated 
by a caller. In such cases it's often better to do those allocations in C and 
to pass them to Nim in a way where the Nim code doesn't need to know anything 
about them (other than that they are pointers of some kind). As long as the Nim 
code doesn't need to do anything with those pointers (other than passing them 
around) you need _not_ be precise; you can, for instance, tell Nim that some 
pointer to some struct is simply a char pointer.
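A tiny, self-contained sketch of that "opaque pointer" idea. The C side is emulated here with alloc/dealloc and an int sentinel; in real code those would be importc'd functions, and all names (`Opaque`, `cSideNew`, `cSideFree`) are made up:

```nim
# Hedged sketch: Nim treats the handle as opaque and only passes it
# around; the "C side" is emulated with alloc/dealloc for the demo.
type Opaque = distinct pointer

proc cSideNew(): Opaque =
  # stands in for a C function returning a pointer to some struct
  let p = alloc(8)
  cast[ptr int](p)[] = 123   # the "C side" fills its struct
  Opaque(p)

proc cSideFree(h: Opaque) =
  dealloc(pointer(h))

let h = cSideNew()
# Nim passes h around without knowing its layout; we peek only to
# demonstrate that the memory is real:
let peek = cast[ptr int](pointer(h))[]
echo peek
cSideFree(h)
```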

Another maybe helpful hint is to "allocate" _simple stuff_ (like, say, a C char 
array) in Nim (by simply having a var) and to pass the pointer to it (more 
precisely usually a pointer to its data) via myvar.addr() to a C function. Do 
**not** allocate a bit here (Nim) and a bit there (C) but try to be consistent.

For a more specific answer you need to provide a more specific problem 
description. 


Re: [help needed] nim version of: COMPARING PYTHAGOREAN TRIPLES IN C++, D, AND RUST

2019-01-02 Thread moerm
I fully agree on Nim indeed _being_ a good language. My point though wasn't "I 
can do faster code than ...".

My point was that one should a) _think_ about optimization, starting from 
"what's actually the point and what's the bottleneck or the most promising 
approach?" (in this case it was "use a better algorithm") and "how much 
optimization do I need and what is it worth (e.g. in dev time)?", b) avoid 
_obvious_ problems (like nesting loops without need), and c) _help_ the 
compiler by providing clear hints.

I also copied your code verbatim (modulo the now(); I prefer my own routine 
because I know it's directly getting the monotonic clock), compiled it with the 
exact switches used by timothee, and your code above took around 210 ms on my 
system (debian, Ryzen (in a VM), gcc 8.2.0).

And I'm not surprised. While you are right and Nim has excellent iterators, the 
basic problem is still 3 loops and an if in the innermost loop (and a bad 
algo). Maybe my Ryzen is a bit more or a bit less sensitive than your CPU in 
that regard, but any kind of branching (loops, ifs) risks thrashing the L1 
cache and often enough the L2 too.

And btw, Nim's iterators, as great as they are, are not zero cost. One single 
change in your code, namely replacing `for z in toInfinity(1):` with `for z in 
1 ..< 0x7FF:` made the code run almost 10% faster.

But I had another interesting point in my first post: Why use _any_ language 
XYZ? Why not use, for instance, Python? What's the point, the difference? 
(Looking from the perspective I'm interested in here) the answer is: Python 
means "way less dev. time than C but way slower code (runtime)". Where is Nim 
on that axis? _That_ (imo) is an immensely important point and one where Nim 
really shines: You get a dev. time not far from Python -and also- a run time 
very close to C.

That's why I do not even _expect_ or desire Nim to ever reach 100% of C code 
(runtime) speed. What I want is a language that makes it easy to think about 
and focus on my task and the algorithm and still get near-C speed. From what I 
see, nobody and nothing gets even close to Nim on that crucial point, which is 
_directly related_ both to productivity and to code quality - and I still 
_can_ profile and optimize real hot spots in C. 


Re: [help needed] nim version of: COMPARING PYTHAGOREAN TRIPLES IN C++, D, AND RUST

2019-01-02 Thread moerm
For what it's worth: I c2nim'd the simple.cpp and slightly adapted it to have a 
`limit` parameter to (using `i`) limit the number of computed triples. Compile 
time on my Ryzen box, using gcc as the backend, was around 1.6 s the first time 
and about 0.25 s for subsequent runs (said Nim). Execution time of the 
release-compiled code was about 220 ms.

My main motivation was to follow a hunch, namely that this isn't about 
generators, lambdas or whatever but about a) a very poor algorithm (no surprise 
there; after all it's even called "simple" (as in "naive" I presume)) and b) 
careless loop nesting.

One of the major rules wrt performance is to be careful with loops. Another 
rule is to help the compiler by good hinting.

So, using a better algo plus having only 2 loops (and using a microsecond timer 
rather than the crude `time` for measurement) I arrived at the following, which 
(release mode) does the job - incl. printing! (which amazed me; kudos to the 
Nim team!) - in about 2 ms.


const plimit = 1000 # how many triplets to compute (1000 like in the blog code)

proc pyth3s(limit: int) =
  var found = 1
  var m: int = 2
  while true:
    var n: int = 1
    while n < m:
      let m1: int = m * m
      let n1: int = n * n
      let c: int = m1 + n1
      let a: int = m1 - n1
      let b: int = (m * n) shl 1
      echo $a & ", " & $b & ", " & $c
      if found >= limit:
        return
      found.inc
      n.inc
    m.inc

let t0 = getHRtime() # microsecond timer
pyth3s(plimit)
let t1 = getHRtime()
echo "Done. First " & $plimit & " pyth. triplets computed in " & $(t1 - t0) & " musec"
Run

Thoughts:

Obviously, reducing the loops to 2 is a _very major_ performance improvement. 
But I think (I might be wrong) that explicitly introducing `m1` and `n1` is 
helpful as a hint to the compiler. Moreover, limiting the use of vars to the 
reasonable minimum and preferring `let` plays to one of Nim's strengths. Yes, 
the code _looks_ longer, but wrt performance the decisive factor isn't how it 
looks to a human but what the compiler can make out of it.


Re: Convincing my friend about Nim

2019-01-01 Thread moerm
Pardon me but it seems that you should change friends rather than language.

What your friend said about Nim being incomprehensible for people who don't 
know Nim is simply ridiculous BS.

If I were in your place - and, for whatever weird reason, wanted to keep that 
friend - I'd simply tell him "You are right. Thanks, I've seen the light. But 
as I'm a weird guy I'll play a bit more with that language Nim although c# is 
so so much better". That should help, as it would address and soothe both major 
factors, namely your friend's cluelessness and his urgent desire to be "right". 
(But then, neither am I in your place nor would I hesitate a split second to 
tell such a moron Adios...).


Re: Convincing my friend about Nim

2019-01-01 Thread moerm
> First of all, he states that not having such an indentation-based syntax 
> allows for more freedom ...

He is right in that but only in one regard: Having explicit block markers (like 
{ and } or `begin` and `end`) allows for (visually largely unstructured) 
"streams" of code. If that were really, really needed, one could however 
introduce a "3 spaces" symbol, say '°'.

It might be noteworthy that quite a few studies, typically made in the context 
of some entity seriously needing that question answered, have clearly shown 
that readability _is_ important, in fact _very_ important. While I know of no 
study addressing indentation-based blocks vs marker-based blocks, it seems 
reasonable to assume that both are clearly on the good side (remember: the 
issue is (only) readability). The studies that were made all consistently 
demonstrated that "curly blocks" are clearly the worst approach (~ worst 
readability), introduce significantly more errors, and are significantly worse 
to maintain.

> First of all, he really dislikes the lack of brackets, and the "misuse" of 
> them

I would (mildly) agree. It _does_ make sense to have `[]` always mean 
"something array-like" (although I personally and subjectively - and 
unreasonably - prefer [] over <>). More importantly, however, I reject your 
friend's criticism because it's basically just based on "it's not c# like and 
hence it's bad!". That's, pardon me, just stupid.

> Next up, an "invalid" argument. He says macros are dumb, and you could just 
> write code normally without them.

Very wrong. Obviously your friend, like so many in the curly-braces camp, fails 
to understand the concept of readability, which is closely linked to 
maintainability and the probability of errors. Probably your friend has become 
a victim of the (utterly stupid) "fun" paradigm (typical statements are "It's 
fun to develop in XYZ" or "I'm a developer because it's fun") - that camp has 
been found to be responsible to a very large degree for the major clusterf_ck 
we're in today (e.g. the funny bug of the month in SSL).

They are, pardon my French, braindead morons. Here is the truth and the only 
one in that regard: software development is an _engineering_ discipline - and a 
very complex one at that. If the "fun" camp guys were also into bridge 
building, railroads, air traffic etc., humanity would probably be largely 
extinguished by now (the survivors living deep in the woods, far away from 
killing "civilization" with tech).

Just like with bridges or railroads the relevant factor is absolutely _not_ 
whether those who designed and built it felt it was "fun" to do but whether the 
result was reliable, meeting the needs, etc.

Macros not only allow for comfort but more importantly they allow for 
"centralized code fragments". And that is directly linked to probability of 
error as well as to maintainability. Just have a look at crypto code; you'll 
almost invariably find (long) macros. Often the very core of an algorithm is 
implemented in one or more macros. And I think even your friend would accept 
that the people who gave us AES or Blake2 certainly aren't stupid.
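To illustrate that "centralized code fragments" point with a toy (invented by me, not taken from any real cipher): a rotation primitive defined once as a template and reused everywhere, the way crypto cores centralize their round operations:

```nim
# Hedged toy example of centralizing a code fragment: the rotation
# is written exactly once; every call site expands to the same,
# reviewable code. A bug fix here fixes every use at once.
template rotl32(x, n: uint32): uint32 =
  (x shl n) or (x shr (32'u32 - n))

var a = 0x12345678'u32
a = rotl32(a, 8)       # expands in place, no call overhead
echo a
```

Real crypto macros are of course far longer, but the maintainability argument is the same: one definition, many expansions.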

Whenever some halfway responsible entity _really, really_ needed reliable and 
bug free software they turned to languages like Ada, Eiffel, Ocaml, etc. (along 
with _formal_ spec., modelling, and verification) as soon as and when such 
options were available. Hell, even Pascal was used because it might not have 
been a "nice" language but it offered many of the features high reliability 
scenarios required.

So, to summarize it somewhat rudely: why would you even care about the opinion 
of someone who doesn't have even the slightest clue what he's talking about? 
You are like some engineer debating with an idiot whose arguments basically are 
"my (makeshift wannabe) crap is cool and fun while your (properly working and 
reliable) bridges and railroad systems are boring and uncool!".

Btw, I myself am not at all against .Net (any more). For one, it (incl. Mono 
nowadays) offers reasonably good and wide Linux support too and isn't a pure 
Windows-world thingie any more. More importantly though, Microsoft (funnily 
unknown to many) has invested tons and tons of money in safety, reliability, 
and security research and has really done a lot of work and created much 
progress in that field. Just think of Z3 ( _the_ solver), Boogie (an "easy" 
interface language for Z3), code contracts (i.a. for c#!), Dafny (a safety 
targeting language that actually works) and more. With their async/await work 
they have tried to make .Net also useable for servers, but although they seem 
to have done it well I personally and subjectively limit my .Net use to 
business applications.

Well noted this comes from someone whose first (and then beloved) language was 
C (and, I have to admit it, then I also was a "coolness" obsessed idiot) and 
who taught C. In fact I still sometimes implement certain tricky things in C 
because Nim does 

Re: How can we define a function that returns a type like a Union type?

2018-12-30 Thread moerm
Well, "the above code" from alehander42 is Nim's usual way to go about object 
variants and the code you don't like is the usual way to go about "methods" of 
object variants. You might want to write `result = ...` instead of `return ...`.


proc addTwo(self: C): int =
   case self.kind
   of int: result = self.num + 2
   of string: result = 10_000_000
   else: {.fatal: "Meh, illegal case in addProc" .}

Run
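Since the thread never showed the actual `kind`/field definitions, here is a hedged, self-contained reconstruction of the object-variant pattern (all names - `CKind`, `ckInt`, `ckStr`, `num`, `s` - are invented for the sketch):

```nim
# A minimal object variant plus a proc that dispatches on the kind.
type
  CKind = enum
    ckInt, ckStr          # hypothetical kind enum
  C = object
    case kind: CKind
    of ckInt:
      num: int
    of ckStr:
      s: string

proc addTwo(self: C): int =
  case self.kind
  of ckInt: result = self.num + 2
  of ckStr: result = 10_000_000

echo addTwo(C(kind: ckInt, num: 40))
echo addTwo(C(kind: ckStr, s: "x"))
```

Because the `case` covers every enum value, no `else` branch is needed; the compiler checks exhaustiveness for us.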


Re: How can we define a function that returns a type like a Union type?

2018-12-30 Thread moerm
Sorry, but "the above code" can be a lot of things and we are not in your head 
and can't clearly know what you want. Please do not mistake this for rudeness, 
but I've found again and again that there is a relation between the quality of 
a problem description and the quality of the responses. I'd certainly be 
willing to help you (if I can) - if only I could clearly understand your 
problem.


Re: How to achieve "mechanical substitution" like C macro

2018-12-29 Thread moerm
You are welcome.

Plus another side note: Nim templates _are_ (what C calls) macros, albeit 
pimped up and cleaner. One important point to remember is that templates 
(unlike Nim macros) are basically just smart text substitution. This also means 
that any variables one refers to must either be defined in the _calling_ 
context or be provided as parameters. And it means that a template "returning" 
something (like `let x = mytemplate(...)`) can be looked at (in the _caller_) 
as basically just "do everything but the last template expression ... and then 
assign the last expression to x".

So in my code above, if push and pop are "called" in, say, proc foo, the 
following happens:


template push(val: int) =
   stack[p] = val
   p.inc()

template pop(): int =
   p.dec()
   stack[p]

push(bar)
# ... some other stuff
let r = pop()

Run

comes down to


proc foo(...) =
   var stack: array[10, int]
   var p = 0
   var bar = 5
   # push(bar)
   stack[p] = bar
   p.inc()
   # ... some other stuff
   # let r = pop()
   p.dec()
   let r = stack[p]

Run


Re: How to achieve "mechanical substitution" like C macro

2018-12-28 Thread moerm
This should do what you want:


template push(val: int) =
   stack[p] = val
   p.inc()

template pop(): int =
   p.dec()
   stack[p]

Run

The relevant point isn't the stack logic (p++ vs p--) but that a template that 
is supposed to "return" something must have that something - in this case 
`stack[p]` \- as its _last_ expression. So your pop() version would "return" 
the decremented `p` (if it could).

You could make it work however if you changed it to:


template POP(): int =
  let v = stack[p]
  dec p
  v

Run

Side note: in Nim, identifiers that start with a capital letter are supposed to 
be types, so "PUSH" and "POP", while being typical for C, are a poor choice in 
Nim.


Re: openArray[(cstring,cstring)] parameter

2018-12-28 Thread moerm
type
  CsTuple = tuple[first: cstring, second: cstring]
  CstSeq = seq[CsTuple]

proc test(p: openarray[CsTuple]) =
  for t in p:
    echo "first: " & $t[0] & ", second: " & $t[1]

let ts = @[("a0".cstring, "a1".cstring), ("b0".cstring, "b1".cstring), ("c0".cstring, "c1".cstring)]
test(ts)


Re: My experience with nim so far

2018-12-25 Thread moerm
> Latest Nim is not that dumb for literals: _(example)_ Works fine, even the x 
> / 2!

Excellent news, thank you.

> The fact that Nim does not do that many automatic type conversions is good 
> for safety and performance ... And as Dom said, lenientOps exists now, but I 
> think I will generally not need it.

I agree, but I talked about a particularly annoying - and safe - case, 
something like `var x: uint16 = 5.uint16`. But again, generally I'm also 
leaning towards safety even if there is a price to pay (in extra keyboard 
typing). And it's great news that _obviously_ clear cases are now properly 
handled by Nim. Lovely.


Re: How can we define a function that returns a type like a Union type?

2018-12-25 Thread moerm
@mrsekut

First, your code has a problem: it passes a small 'a' but `parseStatement` 
expects a capital 'A' or 'B'.

I'm a bit confused by your approach but I'll try to help anyway.


type
  A = ref object of RootObj
    name: string
    value: string
  
  B = ref object of RootObj
    name: string
  
  C = ref object of RootObj
    flg: string
  
  None = ref object of RootObj

proc parseStatement(self: C): ref RootObj =
  case self.flg
  of "A":
    result = A(name: "a", value: "val")
  of "B":
    result = B(name: "b")
  else:
    result = cast[None](0)

# now, test it
let c = C(flg: "A")
let resc = c.parseStatement()
echo resc.repr
let d = C(flg: "?")  # unknown/invalid
let resd = d.parseStatement()
echo resd.repr
let e = C(flg: "B")
let rese = e.parseStatement()
echo rese.repr

Run

works and is probably what you wanted.

Side note re. small vs capital 'a' etc.: you might find it interesting to know 
that Nim's `case` accepts multiple options or ranges, too. So the following 
might be closer to what you had in mind:


proc parseStatement(self: C): ref RootObj =
  case self.flg
  of "A", "a":
    result = A(name: "a", value: "val")
  of "B", "b":
    result = B(name: "b")
  else:
    result = cast[None](0)

Run


Re: My experience with nim so far

2018-12-24 Thread moerm
> 1) When you work on very platform dependent code, you have to manually add a 
> bunch of constants and so on.

My current solution: a single -d:[platform] switch and an include file with 
some `when` logic. Probably by far not the smartest way, but simple and 
working.
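A hedged sketch of that approach (the define name `myEmbedded` and the constants are invented; the real include file would of course hold more):

```nim
# Selected at compile time via e.g.:  nim c -d:myEmbedded app.nim
when defined(myEmbedded):
  const bufSize = 256       # tight on the small platform
else:
  const bufSize = 4096      # roomy default elsewhere

echo bufSize
```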

> 2) Too picky with variable types. I ended up casting so much in my code (or 
> appending my values with ".intx") it is not even funny anymore. You type a lot.

Yes, sometimes Nim is (still) a bit dumb. When I, for example, assign 5 to a 
uint16 var, it feels a bit strange to still have to write `5.uint16`. That 
said, I strongly prefer to (keyboard) type somewhat more over guessing whether 
the compiler produces correct code. All in all I love Nim's correctness 
obsession and am willing to pay the price, particularly considering that it's 
getting smarter as it evolves.

> 3) Working with raw memory: ... a situation where I could not modify the raw 
> memory of my pointer.

Example code?

> 4) Compiled program. For the exact same program (line by line translation), 
> nim is asking for 110k bytes (-d:release). Directly written in C: 14k (-O3).

  1. runtime lib overhead?
  2. you _can_ tell Nim to compile the resulting C code with pretty much any 
switches you like.




Re: Future of Nim ?

2018-12-23 Thread moerm
Maybe I was just insanely lucky or maybe in your field there are particularly 
dangerous spots - but I can't confirm your impression. My experience with Nim 
is that it's surprisingly well working.

Probably the biggest problem I see is **documentation**. It _seems_ to cover 
most areas but it doesn't really, or only superficially. I end up looking at 
the source and experimenting way more often than I like.

> Freezing the language for some time and fixing compiler bugs, refactoring, 
> maturing I think would be good for Nim.

Fully agreed. Nim urgently needs to stop being ever moving and to become stable 
and reliable (also as in "works as expected").

I also agree with your statement that _keeping_ (happy) users is more important 
and decisive than winning them over.


Re: Future of Nim ?

2018-12-21 Thread moerm
Well noted, I'm not at all opposed to what you wish. It's just that I don't 
agree with it being the big turbo for Nim's uptake.

For a start, very many if not most professionals have little choice anyway; 
projects are done in whatever language management decides. The next major user 
group are the hobbyists, and experience shows that their top priorities in 
choosing a language are ease of use and coolness; most of them will stick with 
Python, javascript, PHP and will consider many of Nim's advantages burdensome 
and not worth the effort. The third group (coming from all corners, be it 
professionals or hobbyists) is what I call the "maximum hunters", who for 
whatever reason happened to choose one attribute, often speed, as the only 
really important one.

Moreover (imo) developers are a lot less "political" (and a lot more socially 
driven) than lots of noise from diverse groups might make it look like. Most of 
those who do get to choose their language choose what their peers and/or social 
environment consider great and for what lots of tools are available, incl. a 
fat IDE. Don't get me wrong but just look at what most Nim developers chose: 
The fattest and worst piece of editor-monstrosity of all, VSC; why? Probably 
because it offered the easiest way to get support for Nim done.

Also don't underestimate the immense damage the GNU activists (now in the 
second wave followed by social/diversity/hot chocolate activists) have created. 
I've seen enough clients whose first and very mistrusting question is "is it 
GPL poisoned? Is there _any_ piece of GPL infested code in the software? If yes 
just leave, good bye". Add to that the (not at all) funny zoo of licences that 
bewilders and confuses people incl. many developers.

Yes, Nim has a good and really free license. But I think you got it wrong. 
People don't choose a language for its license; they do _not_ choose it if it 
has the wrong one. Small but decisive difference. Nim's license _was_ indeed 
one (of many) points when I chose Nim, but it was just a "non-poisonous 
license? Check". The real and major reasons for me choosing Nim are probably 
known by now.

No matter who is right here, I do not hope (because size of user base is not 
important to me) but I do think that Nim will continue to attract - and keep - 
developers. It has no major negative issue (like e.g. a poisonous license) and 
it has a convincing set of positive features and factors to offer. And, that's 
important, being small and not in the big spot light also keeps a lot of 
trouble away and keeps us really free. Let's just continue to grow at our own 
pace.


Re: Using var object in a proc that is the object's property

2018-12-21 Thread moerm
P.S. @Araq and team

If it's not too much trouble, it might be a good thing to differentiate between 
illegal memory access and _calling_ a nil proc. It would probably be helpful, 
and errors like the above would be easier to see/understand for new Nim users.


Re: Future of Nim ?

2018-12-21 Thread moerm
> I hope you scored all the brownie points with Araq

My respect for Araq is well earned and deserved. Thinking that it's about 
brownie points tells more about you than about me.

As for the matter:

You were (and are) free to clearly show how Zig's importing C stuff is not 
insignificantly better than Nim's -and- how Zig is otherwise equal to Nim in 
all other regards -and- how a possibly slightly easier or slightly more 
efficient C FFI is a _significant_ factor.

As for the not caring: it seems to me that we both do not care a lot about each 
other's points. I, however, have well-accepted studies on my side that clearly 
demonstrate the importance of readability (where Zig looks poor next to Nim), 
and also the obvious fact that software quality is the single most important 
factor behind many _major_ problems (like safety and security nightmares).

To name just one example: I recently ported a CAESAR (crypto competition, AEAD 
sym. crypto) finalist reference code (written in C) to Nim and discovered a 
(typical C/C++) off by one buffer overflow error that does allow to crash any 
server using it. That's _important_ and it shows that after decades of 
experience we _still_ produce bad code even in core software (those finalists 
have the potential to soon run on more than a billion systems and in very 
sensitive areas). Well noted, that code wasn't simply bad; it was carefully and 
generally well designed and well implemented ... but it's just too easy in 
languages with bad readability to miss a detail and/or make a small but in that 
case tragically dangerous error. But you praise C++. So it's not ill will from 
my side but it just seems that we really live in different worlds. I care about 
systems having at least a reasonable level of reliability while you seem to 
care about minute insignificant (from my point of views) advantages of a 
language over Nim.


Re: Future of Nim ?

2018-12-21 Thread moerm
a) I don't see the big difference or how using C stuff in Nim is more labour or 
more complicated than in Zig.

b) Again: even _if_ you were right I wouldn't care, because that would be a 
discussion about a Ferrari being oh so much slower than a Lamborghini; looking 
from the perspective of a Renault minivan driver, both are top. And well noted, 
I have often used plenty of C libraries from Nim and have also ported some C 
libraries. I simply fail to see how Zig is oh so much easier and better in that 
regard.

Also again: I'm not getting paid to use C functions 0.7% faster or easier. I'm 
getting paid to produce good-quality, maintainable, reliable, and safe 
software, and while I'm in no way against Zig I fail to see how it would serve 
my purpose better than Nim, which gave me a _major_ boost compared to the 
_actually useable_ languages I used before. Moreover there are also quite a few 
points where Zig can't match Nim.

> RAII, Parallelism, Plugins - these are things that most C++ devs who are 
> building game engines or complex RT apps, are going to want.

As for complex RT apps (which I do) I disagree. As for games, I don't care. 
Reason: we are not in a situation where tens or even hundreds of millions of 
passwords are stolen almost every week, and where the complete stack - 
operating systems, core libraries (openssl anyone?), libraries, applications - 
is utterly rotten, because game developers are unhappy. We are in that 
situation mainly because C and C++ failed miserably. As long as we have real 
problems I couldn't care less about game developer whims, and I'm definitely 
not listening to C++ fans who still didn't get the message. I do listen 
however, and carefully, to someone like Araq, who created a language that 
actually helps us to address a bunch of major problems in software development 
where it counts. Have a nice day.


Re: Using var object in a proc that is the object's property

2018-12-21 Thread moerm
No, it doesn't. It just seems it does, but as soon as a call like 
`s.events.onload("some event")` is made it segfaults with `Illegal storage 
access. (Attempt to read from nil?)`.

Please note that the approach of @Araq also fails because in the OP 
`wrapper.onload` was left nil, which leads to the segfault.

Adding something like `wrapper.onload = proc (name: string) = echo "wrapper 
loaded " & name` (plus onfail()) solves the problem for both the ptr and the 
ref approach. 
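A minimal reconstruction of that pitfall (type and field names invented; the OP's real code differs):

```nim
# Proc-type fields default to nil; calling one before assigning it
# segfaults with "Illegal storage access", as described above.
type Events = object
  onload: proc (name: string)

var ev: Events                    # ev.onload is nil here
# ev.onload("boot")               # would crash: calling a nil proc

ev.onload = proc (name: string) = echo "loaded: " & name
ev.onload("boot")                 # now fine
```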


Re: Future of Nim ?

2018-12-21 Thread moerm
> Zig and Kit have arguably better interop capabilities with C than Nim does atm

One might see it like that - I, however, do not. One reason is c2nim and the 
other is that I'm interested in an easy, comfortable way. In a way we talk 
about different things; I talk about it being easy while you talk about "the 
(presumably) best". For me, say, a Ferrari (no matter the model) is _very 
fast_ (as compared to 95% of all cars) while you argue that car XYZ is even 
faster. Maybe, but I don't care.

> I can't do things with Nim that I can in C++

Probably. If you say so. But then that's true for pretty much every language.

Maybe my wording was clumsy or misleading, sorry. I do _not at all_ care about 
superlatives - I care about how easily and efficiently a language allows me to 
address 95+% of any dev. job I typically encounter and to create _maintainable_ 
and _safe_ code; versatility (e.g. many platforms) is a major plus. btw I also 
do _not_ care (at all) about how popular a language is but I know that 
popularity tends to bring good tool support along. If Nim's docs and tool 
support get better I will very happily use Nim even if it happened to not be 
popular because everybody and his dog are running for the newest totally-hip 
trend.

> Loyalty doesn't replace practicality.

Yes, I agree. But then I didn't mean loyalty as in "whatever my hero X does is 
great!" but as in "X _earned_ my loyalty by being great" (and I won't jump ship 
just because Y is a bit better in some respect).


Re: Future of Nim ?

2018-12-20 Thread moerm
I'll stay away from game related stuff as I know next to nothing about game 
development. One thing, however, I can say (certainly not adding to my being 
liked around here): I personally and subjectively consider Nim's success in 
some game development to be a warning and a grave danger.

> I do think Nim has a bright future - but I'm not sure where it lies. Right 
> now I view Nim in a similar light to Go - except that Nim is better suited to 
> real time app development.

My _personal_ view: Nim _will_ have a good future (but possibly might never 
become a _major_ language in terms of user numbers).

Two factors I see as very important are

a) The Nim BDFL Araq (I'm certain) and the team around him (I presume) try hard 
to make Nim safe - safe as in verifiably safe.

b) Nim offers the "full package", i.e. threads, async, channels together with 
good system dev. capabilities and usability in a wide range of applications 
from system to games and whatnot. _My personal_ impression is that Nim is 
almost never the best choice for any field but always among the best 3 choices.

I know, I know, in our world full of "gurus", "evangelists" and every other 
week (it feels like it) a new "wisdom" and "disruptive paradigm" it's terribly 
uncool to _really_ know only one (or a few) language(s) really well and we're 
told to (almost) use a different language for each project - but I think that's 
BS. Reason: It takes years to achieve mastery in a language. I have thought 
about the problems in development for more than a decade and I _did_ look at, 
try, use quite some languages from Pascal, Modula3, Oberon and Ada to more 
exotic ones like albatross or Felix and specialized ones for design, modelling, 
and verification. One very clear result of that is that one should have one 
language one really masters that covers most scenarios one encounters in a 
professional life. That's worth much more than halfway knowing a dozen 
languages (which most of us do anyway I presume).

Another major and important strength of Nim is the fact that Araq _wisely_ 
chose to stay away from braces and to instead favour readability which is - 
proven to be - one of the major factors leading to quality code. (Side note and 
another compliment to Araq: He didn't fall for the "Let's save on (keyboard) 
typing" trap or in other words, Araq was smart enough to not confuse the jobs 
of a language and an editor).

What I said above ("Nim is almost never the best choice for any field...") 
unfortunately(?) also has an analog in "Nim is not the [superlative]". Nim 
probably will never reach 100% of C's speed, nor 100% of Python's ease and speed 
of getting something done, nor 100% of Pony's IO speed, ... hence Nim probably 
will never be _the_ language for XYZ.

BUT: Coding in C not only creates fast code; it also creates bugs. Coding in 
Python not only creates quick solutions; it also creates slow solutions. Coding 
in Pony not only creates the (an educated guess) fastest servers; it also 
drives away many developers due to its complexity. Nim, however, plays in a 
very good position in all those fields, plus (important in many fields) it 
allows one to create code for pretty much any relevant architecture (and even 
JavaScript).

Finally, yes, Nim still has many sharp edges, immature spots, and a not really 
complete set of "batteries" (libs), plus the tool situation is, pardon me, 
miserable (and IMO one of the major barriers for better uptake). But then, Nim 
is a) a small project without millions of $, and b) not yet 1.0.

My personal summary is that Nim (probably) will never be a Top 5 (or even Top 
10) language, will continue to grow slowly (in terms of number of users), but 
will a) develop a very loyal user base, b) stay relevant and be used for a long 
time, and c) will attract developers who don't care about being "cool" but 
about having an excellent tool for serious work.

P.S. While it is (like many other things) only relatively poorly documented, I 
can't commend @Araq enough for the ridiculously good, easy, and comfortable C 
interface. As I became more proficient with Nim I used it less and less and 
often preferred to just re-create libraries in Nim right away (based on a quite 
useable c2nim output. Kudos for that, too) but I consider Nim's interface to C 
as very important anyway because it means that the immense number of C 
libraries are easily available.

If I were asked for a (not trivial) suggestion for Nim's future I'd suggest to 
create an option for formal verification via Z3.


Re: Future of Nim ?

2018-12-20 Thread moerm
> Nim needs a "killer app". This doesn't necessarily mean a specific app, like 
> Rails was for Ruby - it can be a general pattern, some clear specific thing 
> that Nim does better than its competitors.

I'm afraid neither will do. I'm afraid that "being the most libertarian 
programming language: no restrictive licenses / patents" also has quite little 
influence; for most, "it's free (as in beer & freedom)" is good enough.

I'm afraid the only way to take off in a fast way would be a big organization 
or company using Nim in a _highly visible_ way. Regarding Python I'd be careful 
with comparisons because when Python came up (I remember it well) it did 
scratch a _real and major_ itch; that's something Nim hardly can do because 
nowadays there are way, way more languages than itches.

_The major_ factor I observed is boring: it's basically social. It's 
"everybody learned it at university" (e.g. java), "(almost) everybody is using 
it", "it came for free", "my peer group uses it", etc.

Another major factor is (imo) that relatively few seriously study diverse 
languages and (can and) do value Nim for what it offers. It's weird, I know, 
but from what I see, Nim will continue to grow only slowly _because_ of it 
being about good principles and design rather than about this or that "sells 
well" argument, plus Nims lack of a big organization with lots of money and 
marketing behind it - which in my book is _positive_ because it means that Araq 
and team (and to a degree, we, Nims users) make decisions.

Btw, Nim isn't that poorly placed. Just think about Ada which had and still has 
some very large users but still is all but significant (even "nobodies" like Go 
overtook it). So what? Neither do I need a tool to brag nor one to use together 
with many, many others. What I need is a _good_ tool - and Nim _is_ a good 
tool. And in some respects it's an advantage to be less well known.


Re: Nim vs D

2018-12-14 Thread moerm
Let me help you out. I'll emphasize the relevant word to make it easier for you.

> I can let other opinions stand and have no need to "facepalm" or similar to 
> otherwise belittle or attack **anyone** here

Maybe, just maybe, one might take my opinion re. D as "attack" or belittling IFF 
I had written it in the _D_ forum. Which, however, is not the case.

More generally: Why are you all here, why are you using (and liking I hope) 
Nim? Because you think that at least for certain jobs it's better than other 
languages. Maybe even because you think (like myself) that Nim is generally a 
better language. Some here might be well advised to think about that before 
getting excited.

I did _not_ call Mr. Bright or any D developer stupid or anything negative. I 
merely stated my personal opinion about D the language - not about any people 
involved with D.


Re: Nim vs D

2018-12-14 Thread moerm
So? You have your opinion and I have mine - and as you saw (with my no further 
discussing with someone else) I _can_ let other opinions stand and have no need 
to "facepalm" or similar to otherwise belittle or attack anyone here. In fact I 
assume the differences in our views largely stem from different points of 
view, needs, and priorities, and not from ill will. But well, that attitude seems 
to not be in reach for everyone.


Re: high(int) works, high(uint) does not

2018-12-11 Thread moerm
Thanks. I'll wait (and of course just like you have an interim work around).

Nice work btw. (NimCrypto).


Re: Unknown cause of AssertionError

2018-12-11 Thread moerm
I don't think the problem is with Nim. It's highly likely with floating point. 
Unless you want to play with arbitrary precision FP you'll have to live with 
the fact that while FP can represent mildly large integers it doesn't offer 
precision with fractions.

Btw, that you use only up to 2 decimal places in your code suggests that your 
project might be about finance/trading or similar, i.e. about currency values. 
Should that be the case I strongly advise you to use integer arithmetic and to 
work with cents (or, should it be required as in some financial fields, with 
tenths of cents).


Re: high(int) works, high(uint) does not

2018-12-11 Thread moerm
Thanks. I mentioned it (uint.high) mostly for general interest anyway because I 
almost always use specific intXX and uintXX types.

What I said does hold true, however, also for and specifically for `uint64`.

`echo "max uint32: " & uint32.high` (and smaller sizes) does compile/work

but

`echo "max uint64: " & uint64.high` does _not_ compile.

Note that `int64.high` also compiles and works. It seems hence that the uint64 
case is simply a forgotten one that can and should be easily fixed.

As for int/uint I'm bewildered by the argument (not an ordinal) because one 
would assume that Nim (like most languages) has those defined to be whatever 
the bit size of a system happens to be, i.e. int is either int32 or int64 and 
the same for uint.
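For readers hitting this today: the classic workaround from when `high` was not defined for the unsigned types is the all-ones bit pattern, sketched below. Note that recent Nim versions do define `high` for `uint`/`uint64`, so the asserts at the end are for current compilers.

```nim
# Workaround for missing `high(uint)`/`high(uint64)`: the maximum of an
# unsigned type is simply every bit set, i.e. the complement of zero.
let maxU64 = not 0'u64
let maxU   = not 0'u    # platform-sized uint

echo "max uint64: ", maxU64   # echo stringifies each argument itself

# On current compilers `high` works for unsigned types too:
doAssert uint64.high == maxU64
doAssert uint.high == maxU
```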


high(int) works, high(uint) does not

2018-12-10 Thread moerm
`let h = high(int)` works

but

`let h = high(uint)` does not work

Suggestions?


Re: Nim vs D

2018-12-09 Thread moerm
Hmm, I see. A classical political-correctness and superego argument ...

Well, no. I clearly said "personal summary" and I have no obligation to meet 
any arbitrary conditions like e.g. "examples!" Well noted, I _could_ provide 
more detail (and I did for Nim which is _worth it_ ) but I don't for D.

But I will provide one hint as a token of good will: readability. Unlike Nim D 
stayed stuck in the braces and "let's save some characters for efficiency" 
paradigm - which has been demonstrated to be a _major_ source of errors. 
Summary: D is, just as I said, just yet another "let's make a better C/C++" 
(with some thrown-in/glued-on modern stuff).

With all due respect I'm not interested in your list. Those things are 
technicalities. A good language, however, needs deeper insights (I'd even say 
wisdom). Araq demonstrably has that insight while Mr. Bright actually comes 
from a decades long C compiler background. Nothing against Mr. Bright, he is 
probably a nice and smart man, but he even said himself (!) that D came into 
existence to be a better C/C++. Classical premise problem.


Re: Nim vs D

2018-12-08 Thread moerm
FWIW I had a look at D multiple times and learned to fervently dislike it. My 
personal summary is that D is but yet another better-C/C++ attempt with funny 
gadgets added and an utter lack of consistency in concept and design.

To even put Nim and D next to each other is ridiculous.


Re: Should we get rid of style insensitivity?

2018-12-07 Thread moerm
Yes and no. Yes if the libraries are written in Nim but no, when they are not. 
But: If they are written in Nim they can be both, snake_case or camelCase 
_because_ Nim makes no difference.

Also note that Nim can import C libraries and use any identifier style you 
prefer anyway.

The trade-off is that you get a tiny bit more freedom in the case where you 
don't like a Nim library's style, but you get uncertainty and confusion because 
the compiler does not differentiate.

Maybe the difference in our points of view is due to you preferring more 
liberty (no matter whether it's actually useful) while I prefer safety and a 
compiler not playing tricks behind my back.


Re: Should we get rid of style insensitivity?

2018-12-07 Thread moerm
Why should they? Nim _has_ a good - and minimally restricting - rule. Types 
start with a capital letter, vars and procs don't. There's nothing in your way 
to, e.g., have all types start with 'T' (like Tperson or TPerson).


Re: Should we get rid of style insensitivity?

2018-12-06 Thread moerm
No, style insensitivity does _not_ allow you to do what you want. The reason 
being that the language behind the scenes changes identifiers. The correct and 
honest way to say it is that Nim gives you less freedom (in that regard).

Look: e.g. in C (and many other languages) you can have two identifiers 
"foo_bar" and "fooBar" and they are two different things. In Nim, however, they 
are just two names for the _same_ thing. 
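A minimal sketch of that behaviour (identifier names invented for illustration):

```nim
# Nim's "style insensitivity": except for the first character, case and
# underscores are ignored, so these all resolve to the SAME symbol.
proc fooBar(): int = 42

doAssert foo_bar() == 42   # same proc as fooBar
doAssert fooBAR() == 42    # likewise

# Only the first character's case is significant, which is why a proc
# `fooBar` and a type `FooBar` can coexist as distinct identifiers.
type FooBar = object
```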


Re: Should we get rid of style insensitivity?

2018-12-05 Thread moerm
I'm one of those. I still sometimes find myself starting types with 'T'. But I 
don't feel that to be a problem. And anyway, a rule that says that types must 
start with a capital letter makes sense and is easily within what a language 
can reasonably demand.

Style insensitivity, however, always looked willy nilly to me, just an 
arbitrary rule made by the creator(s) of Nim.


Re: How to lookup the IPV6 addr of a domain name?

2018-12-03 Thread moerm
The getAddrInfo call doesn't return the full struct (only to 0xfff1). Moreover 
the first couple of bytes are _not_ part of the IPv6 address and the first 
address byte is the 5th byte after 0x50 (0x20, 0x1, ...).

Frankly, I don't think it's worth putting more work into the current solution. 
If you are interested, just have a good C solution with 2 functions, one for 
IPv6 and one for IPv4, and then a Nim wrapper with an additional convenience 
function/proc that calls both and returns a seq[IpAddress] (from which, btw., 
the C memory can be freed too once the C helpers are done and Nim has built up 
its result seq).


Re: Deprecation of

2018-12-02 Thread moerm
Also note the conceptual difference:

`for x in someIterable: foo(x)` deals with the _elements_ of someIterable, e.g. 
with the letters of a string.

`for i in 0 ..< something.len: foo(i)` is conceptually quite different albeit 
well known from most older languages and deals with the _indices_.

More often than not the former is what is actually desired but the latter is 
used because it's what we grew up with in C.

Also note that Nim has the very valuable option of low and high for the 
"indexed loop variant" which frees you from worrying about and using `..` vs. 
`..<`.

For example `for i in someArray.low .. someArray.high: foo(i)` safely walks 
from the first element of someArray (whatever that may be) to the last element 
or more precisely yields all indices from the first to the last one.
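To make the two styles concrete, a small sketch:

```nim
let xs = @["a", "b", "c"]

var collected = ""
# Element iteration - usually what you actually want:
for x in xs:
  collected.add x

# Index iteration via low/high - no `..` vs `..<` pitfall:
for i in xs.low .. xs.high:
  echo i, ": ", xs[i]

# And when you need both, `pairs` yields (index, element):
for i, x in xs.pairs:
  echo i, ": ", x
```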


Re: How to lookup the IPV6 addr of a domain name?

2018-12-02 Thread moerm
@satoru

Note that `getHostByName` is deprecated at least on linux.

@Libman

Nim (nativesockets) _does_ provide `getAddrInfo` but as you probably saw in 
your research it's all but worthless because it's a mess. Which btw. is hardly 
Nim's fault but rather a consequence of IPv6 being a makeshift insanity and 
mess. So I can perfectly well understand that the Nim developers basically just 
threw something very close to a blank importc at us. A proper clean Nim version 
would be quite some work due to both IPv6's insanity and plenty of OS 
implementation details. And as only a few proponents really use IPv6 while 
pretty much all servers still use IPv4 it's not exactly an attractive and 
urgent looking goal to do that work.

My personal approach - and suggestion for those who absolutely want that 
functionality - would be to create a reasonably sane (well, as sane as anything 
IPv6 related can be) implementation in C, say, one simply returning a list of 
objects holding IPs, maybe with an extra field indicating IPv4 or IPv6, and to 
then create a Nim binding returning a sequence of those simple objects for 
that. 
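A sketch of that convenience-proc idea, done purely in Nim on top of `nativesockets` (no C helper). I'm assuming here that `getAddrInfo`, `getAddrString` and `freeAddrInfo` are available under these names in your stdlib version, so verify against your Nim release:

```nim
import nativesockets

proc lookupAll(host: string): seq[string] =
  ## Collect textual IPv4 and IPv6 addresses for `host`,
  ## walking the linked list returned by getaddrinfo(3).
  for domain in [AF_INET, AF_INET6]:
    var info: ptr AddrInfo
    try:
      info = getAddrInfo(host, Port(0), domain)
    except OSError:
      continue                     # no records for this address family
    var cur = info
    while cur != nil:
      result.add getAddrString(cur.ai_addr)
      cur = cur.ai_next
    freeAddrInfo(info)             # release the C-allocated list

echo lookupAll("localhost")
```

Duplicates are possible when a host returns several records per family; dedup or tag each entry with its family if that matters to you.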


Re: IpAddress to/from disk problem

2018-12-01 Thread moerm
  1. I _did_ describe my solution
  2. I don't react as desired to paternalizing and trying to paint me as a bad 
guy without a social conscience. Quite the contrary.
  3. I try to avoid putting code here because I would always have to look up 
the forum syntax for how to do that.


Re: IpAddress to/from disk problem

2018-12-01 Thread moerm
Yes that's one way but not what I want. I mean, come on, it's not something 
exotic to binary read/write from/to disk.

Anyway, I have found a working solution now after a lot of research. Thank you 
all for sharing your thoughts.


Re: IpAddress to/from disk problem

2018-11-29 Thread moerm
I'm a human. _Of course_ I use higher level functionality when available and 
adequate. I'm no less lazy than others - g

For configs for example I use high level modules (like json). But for some 
things the ugly old way of using low level binary writing is needed.

Btw: I'm less afraid than many because crypto is a major part of my work and 
that _is_ low level by its very nature. Also of importance (well, to me): 
that's one of Nim's strengths; it can serve at a level almost as low as C as 
well as at a level not far from Python. If I had to choose, the latter is very 
nice and great for productivity but the former is a conditio sine qua non.

IMHO the real solution to that problem field would be to have something like 
"toBinary" and "fromBinary" for all types, returning a byte array or seq. But 
I'm not complaining; I guess with time come (stable) solutions.
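Until such helpers exist, the `streams` module's generic `write`/`read` already cover flat value types. A sketch (the record type is invented for illustration; note that raw struct dumps are not portable across endianness or compiler padding, so this is only safe for files read back by the same build):

```nim
import streams

type
  Rec = object            # flat value type: no ref/seq/string fields
    ip: array[4, uint8]
    port: uint16

let wanted = Rec(ip: [127'u8, 0, 0, 1], port: 8080)

# Write the object's raw bytes to disk ...
var outS = newFileStream("rec.bin", fmWrite)
outS.write wanted
outS.close()

# ... and read them straight back into a variable of the same type.
var r: Rec
var inS = newFileStream("rec.bin", fmRead)
inS.read r
inS.close()

doAssert r.port == 8080
```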


Re: IpAddress to/from disk problem

2018-11-29 Thread moerm
Yes, that's in part what this (and, I assume, the problem) is about. I _did_ 
study the doc (and the source) carefully.

More docs on the binary read/write magic for types, as used e.g. in 
FileStream's read and write, would IMHO be urgently needed.

