Re: [PDCurses] PDCurses 3.9 pkg-config for THE-3.3RC8

2022-04-12 Thread LM
On Mon, Apr 11, 2022 at 6:09 PM Richard Narron  wrote:

>
> The PDCurses 3.9 package on SlackBuilds.org now creates a pkg-config file
> for use in building the Hessling editor 3.3RC8.
>

Nice addition.

>
>
> Ideally this file would be named the same as the library:
>
> /usr/lib64/pkgconfig/LibXCurses.pc
>
> And also it would have Libs: for dynamic linking and Libs.private:
> for static linking.
>
> Libs: -lXCurses
> Libs.private: -l:libXCurses.a  -lXaw -lXmu -lXt -lX11 -lXpm -lSM -lICE
> -lXext
>
> More information on creating these files is here:
>
> https://www.freedesktop.org/wiki/Software/pkg-config/
>
>
>
I use pkgconf ( https://github.com/pkgconf/pkgconf ) with almost all of my
builds and I have scripts similar to SlackBuilds (for a variety of
operating systems) that generate a .pc file as well.

Here's part of a script for MinGW on Windows:

#!/bin/bash
...
cat > pdcurses.pc << EOF
prefix=/opt
exec_prefix=\${prefix}
libdir=\${exec_prefix}/lib
includedir=\${prefix}/include
Name: pdcurses
Description: Public Domain Curses screen library
Version: 3.4
Requires:
Requires.private:
Libs: -L\${libdir}  -lpdcurses
Libs.private:
Cflags: -I\${includedir}
EOF
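
As a quick sanity check of a generated .pc file, a minimal curses
program can be built against it.  This is just a sketch; demo.c is a
made-up name and the compile line assumes pkgconf is on the PATH:

/* demo.c - build with something like:
   gcc demo.c $(pkgconf --cflags --libs pdcurses) */
#include <curses.h>

int main(void)
{
    initscr();                 /* start curses mode */
    addstr("pkg-config flags worked; press any key");
    refresh();
    getch();                   /* wait for a keypress */
    endwin();                  /* restore the display */
    return 0;
}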

It gets tricky tracking the different dependent libraries on different
operating systems.  None of the X libraries are typically used on Windows
(unless you're developing for a target like Cygwin).

I investigated naming the pc file something generic like curses.pc so I
could switch out curses libraries and use either pdcurses or ncurses as
desired.  Didn't work out that well so I just use pdcurses.pc.

Sincerely,
Laura
http://www.distasis.com/cpp


[PDCurses] box drawing with fonts

2018-12-06 Thread LM
I'm trying to convert some old programs I wrote a long time ago from
box drawing with OEM fonts to using PDCurses with SDL2 and UTF-8 box
drawing and block element characters (
https://www.utf8-chartable.de/unicode-utf8-table.pl ).  Was just
wondering what's been investigated for improving how font characters
line up.  I noticed the _grprint routine in pdcdisp.c draws some box
drawing characters with FillRect.  I've been experimenting with a few
modifications that make UbuntuMono-R.ttf look good with box drawing.
It doesn't look quite as good with DejaVuSansMono.ttf, but I'm not
sure if the algorithm's still missing an adjustment or if it's an
issue with the font.  I did update to a later version of the
DejaVuSansMono.ttf font and it made some improvements in the display.
Was wondering if anyone else has looked into this or is looking into
it.  I'd like to compare notes with others on this subject and avoid
reinvestigating areas that someone's already thoroughly looked into.
Thanks.
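
For anyone experimenting along the same lines, this is the kind of
minimal test I've found useful (a sketch assuming a PDCurses build
with WIDE=Y, compiled with -DPDC_WIDE, and a UTF-8 locale):

#include <locale.h>
#include <curses.h>

int main(void)
{
    setlocale(LC_ALL, "");
    initscr();
    /* line drawing via the portable ACS_* characters */
    box(stdscr, 0, 0);
    /* the same shapes as literal Unicode box-drawing characters,
       so any gaps between character cells are easy to spot */
    mvaddwstr(2, 2, L"\u250c\u2500\u2500\u2510");
    mvaddwstr(3, 2, L"\u2502  \u2502");
    mvaddwstr(4, 2, L"\u2514\u2500\u2500\u2518");
    /* block elements: full, dark, medium, light shade */
    mvaddwstr(6, 2, L"\u2588\u2593\u2592\u2591");
    refresh();
    getch();
    endwin();
    return 0;
}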

Sincerely,
Laura Michaels
http://www.distasis.com/cpp


Re: [PDCurses] Custom font with PDCurses 3.6 / SDL2

2018-03-21 Thread LM
On Tue, Mar 20, 2018 at 2:24 PM, Karl Garrison  wrote:
> Does the SDL2 version of PDCurses still support using a custom font?  If so,
> are there any special considerations for this version?  Having pdcfont.bmp
> in the current directory does not work for me, nor does explicitly setting
> the PDC_FONT environment variable work.

Don't know if the code has changed in the current version of PDCurses
from what I added when I patched it to create the SDL2 port, but I
have the following documentation on how I handled fonts when working
with SDL2_TTF:

The default font for SDL_TTF is assumed to be DejaVuSansMono.ttf from the
Open Source DejaVu fonts.  The code looks for this font to already be
installed in a standard directory (based on the Filesystem Hierarchy
Standard), /usr/local/share/fonts/truetype/dejavu/.  This can be modified
by setting a define at compile time
( -DPDC_FONT_PATH=/usr/share/fonts/truetype/dejavu/ ).  It can also be
overridden by initializing pdc_ttffont in your own code.  Similar to the
PDC_FONT environment variable for SDL bitmap fonts, one can also override
TTF fonts using the PDC_FONT and PDC_FONT_POINT_SIZE environment
variables.

The SDL2 port adds:
  SDL_Window *pdc_window;
  SDL_Renderer *pdc_render;
  SDL_Texture *pdc_texture;
Using SDL_ttf (with SDL2 or SDL) adds:
  TTF_Font *pdc_ttffont;
  SDL_Color *pdc_ttffont_foregroundcolor;
  SDL_Color *pdc_ttffont_backgroundcolor;
  int pdc_ttffont_spointsz;
  int pdc_ttffont_hint;
These can be used or modified in your own code.  Like the pdc_screen
variable used by the SDL port, you can initialize pdc_window, pdc_render
and pdc_texture in your own code.  If they're not initialized, PDCurses
will do it for you.  See the sdltest demo for an example.  If PDCurses is
built with the PDC_WIDE flag, it will clean up/deallocate these variables
on exit when needed.
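
For what it's worth, here's a sketch of what overriding the TTF font in
your own code might look like with the variables listed above (a WIDE=Y
SDL_ttf build is assumed; the font path and point size are just
examples, not defaults):

#include <curses.h>
#include <SDL_ttf.h>

/* provided by the SDL + SDL_ttf port, per the list above */
extern TTF_Font *pdc_ttffont;

int main(void)
{
    TTF_Init();
    /* open your own font before initscr() so PDCurses uses it */
    pdc_ttffont = TTF_OpenFont("fonts/DejaVuSansMono.ttf", 18);
    if (!pdc_ttffont)
        return 1;
    initscr();
    addstr("custom TTF font in use");
    refresh();
    getch();
    endwin();
    return 0;
}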

As mentioned, use the WIDE=Y flag with the makefile if you want
SDL2_TTF support.

I use the SDL2_TTF library with both SDL 1.x and SDL 2.x.  SDL2_TTF
supports the UCS-2 character set (16 bits).  If you want to work with
the UTF-32 character set (32 bits), I have a patched version I
modified to handle it.  You can find a link to the code at:
 http://www.distasis.com/cpp/lmports.htm

Last I checked, PDCurses only supported the UCS-2 character set as
well.  (On Windows systems, wchar_t is only 16 bits.)  I was going to
investigate support for 32 bits with PDCurses at some point in the
future.

Sincerely,
Laura


Re: [PDCurses] Need help with Autotools compilation on X11

2016-01-19 Thread LM
On Sun, Jan 17, 2016 at 4:11 AM, anatoly techtonik  wrote:
> And actually I don't understand what "a tool to help you configure
> projects" really means, because I am not familiar with autoconf
> (that's why I asked the question in the first place). So I see that C
> files already have #define mechanism inside, but it looks like C
> compilers cannot agree on what #define's to set and what they mean.
> That's why users need to create #defines themselves and define the
> meaning for them. Am I right?
>
> Is there any attempt to define a standard set of those #define's and
> their meaning across compilers?

If you're just starting out, it's not unusual to set compiler- and
platform-specific functionality using defines built into the compiler.
However, most experienced C/C++ developers consider it bad practice to
do so.  Instead, they create their own defines to help specify which
platforms should use subsets of code that may not be portable.  Why go
to all this trouble and create defines independent of the compiler and
platform?  Take a look at some real world examples.  What happens when
you try to build a library using X11 on Windows, but it hard-codes
Win32 header files using the compiler's flag to check if you're on a
Windows platform?  You have to go through and fix every define,
because you only want the X11 headers, not the Win32 headers.  Cygwin
went so far as to remove the Windows define from their compiler so
they wouldn't have the issue.  However, if you're using X11 on a
Windows compiler other than Cygwin and the library or application uses
a built-in define, you're stuck fixing every instance of it.  I often
find myself going in and fixing defines because someone assumed
functionality was or wasn't available based on a compiler flag and my
compiler doesn't behave the way they assumed.  That's typically why
using built-in compiler defines to determine what code to use is not
considered best practice.

What about when a compiler doesn't support a feature in one version,
but does in a later version?  Then not only do you need to check the
compiler flags for what operating system and what compiler, but also
which versions work and which don't with a particular feature.  What
happens when you need to know the size of a particular type on a
particular system?  Do you assume you know what it is based on
compiler and platform?  How do you know if you're on a 32 or 64 bit
system?  Do you look for yet more compiler-specific defines?  What if
there aren't any?  C libraries like musl went to great lengths not to
have any compiler-specific defines that let you know you're working
with musl in place of something like glibc or uclibc.  The developer
wanted to be able to add new features in later versions and discourage
developers from hard-coding what works based on compiler-specific
defines.  Things also get complicated if you're building with a
cross-compiler.  If you check what platform you're on according to the
compiler, it might tell you Linux, but you may be building for Windows
or an embedded system with DOS or some other platform.
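
To make the contrast concrete, here's the difference in miniature
(MYAPP_HAVE_X11 is a made-up project-level define that a build system
would set after checking the target):

/* brittle: trusts the compiler's built-in platform define, which
   breaks when you want to build X11 code on a Windows compiler */
#ifdef _WIN32
#include <windows.h>
#endif

/* more portable: a define the project controls, set by the build
   system based on what's actually available on the target */
#ifdef MYAPP_HAVE_X11
#include <X11/Xlib.h>
#endif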

If you're familiar with a language like JavaScript, you'll find
JavaScript developers solve the issue of platform-specific code by
using a runtime check to see whether functionality is available.  Some
developers have taken a similar approach with C/C++.  AT&T came up
with one configure/build system that does runtime checks for
functionality.  Typically, however, the most widely adopted methods
for checking functionality in C/C++ are systems such as autoconf and
cmake.  They create small programs to run to see if a feature is
available or if it will fail, to check the size of types, to check
what flags work, to check if a library is available, etc.  So
basically it's doing a runtime check.  The information gathered from
these runtime checks can be used to create defines or select what code
is used when the program is built.  So build systems for most C/C++
programs check for runtime functionality during the build instead of
having the programs check when they're run.  It's usually much faster
for a program to do the check ahead of time and use the information in
the compile than to try to check everything dynamically at runtime.
It's also easier for the developer if the build tool simply checks
whether the code works on a platform, rather than the developer trying
to code in every possible combination of compiler-specific flags that
might be available, and in what combination, to determine what to do
with code that might not be portable.
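
For example, the sort of probe a configure step compiles and runs looks
something like this (a sketch of the general idea, not the actual code
autoconf or cmake generates):

/* if this compiles, links and exits 0, the build system records the
   printed answer (say, in a generated config.h) for the real build */
#include <stdio.h>

int main(void)
{
    printf("%u\n", (unsigned)sizeof(long));  /* size of a type */
    return 0;
}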

It would be much simpler if developers stuck with code that was
portable to all systems, but in practice, this usually only happens
for very small sample programs.  Even command line tools like many of
the common GNU command line utilities (find, ls, etc.) use a lot of
platform-specific code that's not part of the official C language
specifications.  There are several extensions to the C language, such
as the POSIX APIs.

Re: [PDCurses] Need help with Autotools compilation on X11

2016-01-12 Thread LM
On Mon, Jan 11, 2016 at 10:15 PM, Mark Hessling wrote:
> That would be me. Remember, at the time I did the X11 port, autotools
> were quite primitive. Maybe it is time to replace a lot of the custom
> autotools stuff with more standard procedures.

Relating my own experiences, I've been switching applications and
libraries to use CDetect, make and pkgconf instead of the standard GNU
autotools or cmake.  I find that much easier to work with than the
many other build options I've been researching and trying.

There is some basic documentation at:
https://github.com/IngwiePhoenix/cDetect/wiki/Documentation:-The-general

I have several patches to CDetect to work with pkg-config and provide
other features I needed.  They may eventually get integrated into this
version ( https://github.com/IngwiePhoenix/cDetect ) of CDetect as
well.

You can find a link to my patches for CDetect at
http://www.distasis.com/cpp/lmbld.htm.  Just click on the archive link
and look for the files with cdetect in their name.  The original
source I'm using is in the file:
cdetect-git-8245c9591104ce4541e6dc6d2f4be041d9970a8b.zip.  My build
files and patches are in cdetectbld.zip.  A Windows (mingw and msys)
build script (shell script) generated from my build files is in
cdetectbldmingw.zip.  It creates a tarball (
cdetect-20100612-i686-1w32.tgz ) with files ready to install (using a
package manager or general extraction tools like tar or 7zip) on my
system.

Once the Win32 updates get added into the latest version of pdcurses,
I'll look into updating my SDL2 patches to work with the latest
version of pdcurses and will try to submit those patches.  I've been
using them on and off for a while now and they seem stable.

Also, I've been doing some testing of libmenu and libform.  They may
still need patching in some areas, and there are some differences
between applications using ncurses with built-in libmenu/libform
support and applications using pdcurses with libmenu and libform.  So
you typically can't take an application using libmenu, libform and
ncurses and port it to pdcurses without a few modifications.  Then
again, not all ncurses applications work with pdcurses without a few
modifications either.  I hope to make the code/patches for them
available as well.

The PDCurses SDL2 and SDL 1.2.x ports currently use just a makefile
with switches.  I was considering whether or not to introduce CDetect
into the build process.  I didn't want to make too much of a change to
how things worked though.  I could probably create one CDetect program
that would work for any of the builds: SDL, SDL2, X11, Win32, Win32
console.  Some programs/libraries offer multiple ways to build them
using tools such as cmake and GNU autotools.  So, if having another
build option for pdcurses would be useful, I could write a
CDetect/make/pkgconfig option for pdcurses.  If this would be useful
to anyone besides myself, please let me know.  To date, I've been
attempting to diverge as little as possible from the official pdcurses
code and makefiles.

Sincerely,
Laura


[PDCurses] Re: form and menu libraries

2015-10-09 Thread LM
I've been experimenting with the BSD versions of libform and libmenu
further.  They need some patching, but they do work with pdcurses.  Is
anyone interested in using these libraries with pdcurses besides me?
It seems easier to port certain curses applications to pdcurses if you
have them.

Sincerely,
Laura


Re: [PDCurses] Alt + Shift + n (for example) on Windows

2002-07-12 Thread LM

At 7/7/2002 6:10:00 PM, Jeremy Smith wrote:
I have run into a bit of a problem. I use the Curses function getch() in
my program for keys such as 'n' or Alt+'n', but how do I detect these
combinations:

Ctrl+n
Alt+Shift+n
Ctrl+Alt+n
Ctrl+Alt+Shift+n

I imagine there must be some kind of flag to return Shift+Alt+2 as an
actual keycode (which is different from Shift, Alt or 2), but I don't
know how. :-(

I checked the curses.h header file and it appears to list all the
standard scan codes you can get back from a keyboard, including some ALT
combinations and a few of the Control combinations.

Then I checked the Windows implementation for getting a character.  In
pdckbd.c, the routine win32_kbhit calls ReadConsoleInput.  The shift,
alt, control and other status keys are returned from this function in
dwControlKeyState.  The PDC_get_bios_key routine reports what key is
pressed using the two kptab tables (kptab and ext_kptab), so if the
combination is not in the tables, you probably aren't seeing it.  You
may be able to add some definitions to the tables if something is
missing.  There's also some code in the same routine that sets the
states (BUTTON_SHIFT, BUTTON_CONTROL, BUTTON_ALT), but this appears to
be for mouse input, so you can check these keys in the MOUSE_STATUS
structure.  This only pertains to the Windows implementation; every
platform has a unique way of figuring out which keys are pressed.
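
Here's a sketch of watching that modifier state directly with the same
Win32 call pdckbd.c uses (standalone, outside of curses; press Ctrl+C
to quit):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE hin = GetStdHandle(STD_INPUT_HANDLE);
    INPUT_RECORD rec;
    DWORD n;

    /* echo each keypress along with its shift/ctrl/alt state */
    while (ReadConsoleInput(hin, &rec, 1, &n))
    {
        if (rec.EventType == KEY_EVENT && rec.Event.KeyEvent.bKeyDown)
        {
            DWORD st = rec.Event.KeyEvent.dwControlKeyState;
            printf("vk=%u shift=%d ctrl=%d alt=%d\n",
                   (unsigned)rec.Event.KeyEvent.wVirtualKeyCode,
                   (st & SHIFT_PRESSED) != 0,
                   (st & (LEFT_CTRL_PRESSED | RIGHT_CTRL_PRESSED)) != 0,
                   (st & (LEFT_ALT_PRESSED | RIGHT_ALT_PRESSED)) != 0);
        }
    }
    return 0;
}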

In 16-bit DOS, I used to change the state of shift, caps lock and num
lock by changing the settings at the hex 417 address.  This only works
for real mode programs.  Many of the newer Windows compilers don't even
allow you to access the BIOS or memory directly with functions any more.
The byte at hex 417 indicates that the right shift key has been pressed
by setting its lowest bit, the left shift by setting the next bit up,
the ctrl key by setting the bit after that, and the alt key by the bit
after that.  The byte at hex 418 holds the status of whether or not the
left ctrl and left alt keys were pressed.

It appears as though you may have to make some modifications to your
particular version of curses to get exactly what you want.  Curses is a
very portable user interface library, so it often supports only the
lowest common denominator of features across platforms.

By the way, I remember reading that there was work underway for a new
curses standard that included international character support.  Has anyone
heard anything further on this effort?

Best wishes.

Laura Michaels
http://www.distasis.com