Re: redirecting a file descriptor to an array variable? Possible? How? RFE?

2015-11-18 Thread Linda Walsh



Greg Wooledge wrote:

On Sun, Nov 15, 2015 at 05:26:26AM -0800, Linda Walsh wrote:
This can be used to serialize an array into a file
and to read it back:

printf '%s\0' "${array[@]}" > file


So just like you can have [<>|]&[-0-9][0-0] as redirection ops that work
with files or FD's, why not have something like:

out=()
printf '%s\0' "${array[@]}" >&{out[@]} (**)

**: or ">@out" or ">&@out"



(I'm using 4.3.)  But as for this:

  > mapfile -td '' array < file

That's fine for reading files, but I want this to be doable
in the absence of a file system.  So again, instead of the '>' used in the
printf above, use '<' -- but have the object it reads from be
an array -- so you could have your 'file' in an array with the
NULs (or NLs) trimmed off.
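(For what it's worth, the read half can already be done today without a
regular file, via process substitution -- a minimal sketch, assuming
bash 4.4+ for mapfile's -d option; 'produce' is just a stand-in
generator, and the data still passes through /dev/fd or a FIFO rather
than staying purely in variables:)

  produce() { printf '%s\0' "alpha" "beta gamma" "delta"; }
  mapfile -t -d '' restored < <(produce)
  declare -p restored    # ([0]="alpha" [1]="beta gamma" [2]="delta")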

As for not wanting to use files... they are an unnecessary
dependency in most cases, and file opens are expensive on some
OS's -- more so than on linux (ex. cygwin).

If I have gigabytes of memory, why should I have to use a
file system to store transient, tmp information (unless it exceeds
all of memory)?  That said, it has often happened that "/tmp" hasn't been
large enough -- with memory being cheap, note the situation on my system
as it's been for a couple of years now...


df /tmp  /dev/mem

Filesystem  Size  Used Avail Use% Mounted on
/dev/sdc2   7.8G  3.3G  4.6G  42% /tmp
devtmpfs 48G   44K   48G   1% /dev


So if you are writing to a tmp file, which will die of
ENOMEM first?

If I thought adding the 4.6G to the '48G' would matter (statistically)
enough of the time, I'd certainly arrange for a spill to disk.

But at run time, wouldn't it be smart to check resources --
like disk space in /tmp or free memory?  If free memory is
10x the disk space, maybe using memory should be given
priority in automatic mode.  Though for corner cases,
'tmp' should allow "auto_largest", "mem", "/tmp" or "~/tmp", or
"auto_besteffort" (do what one can -- but initially start w/auto_largest,
then add a spill-to-disk "/tmp" if it is less than 90% full)...
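(A rough sketch of that kind of runtime check -- purely illustrative;
the 10x threshold and the use of /dev/shm as the memory-backed choice
are assumptions:)

  pick_tmpdir() {
      local tmp_avail shm_avail
      tmp_avail=$(df -Pk /tmp     | awk 'NR==2 {print $4}')   # KB free on /tmp
      shm_avail=$(df -Pk /dev/shm | awk 'NR==2 {print $4}')   # KB free, memory-backed fs
      if (( shm_avail > 10 * tmp_avail )); then
          printf '%s\n' /dev/shm
      else
          printf '%s\n' /tmp
      fi
  }
  TMPDIR=$(pick_tmpdir)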




But contrary to the manpage under printf:
" The -v option causes the  output  to  be
 assigned  to  the  variable var rather than being printed to the
 standard output. "

printf does not print the output of the above into a var instead
of stdout.  


Seems like that would be a bug?


The command you used above does not have '-v var' in it.

---
1) printf '1\000\n'  | hexdump -C
  31 00 0a  |1..|

2) printf '1%c\n' '\x00' | hexdump -C  
  31 5c 0a  |1\.|


with "-v":
1) printf -v v '1\000\n'

declare -p v

declare -- v="1"  <<-- note that $v does not have '1\000\n' in it.

Ex2:

1)  printf '"%c"\n' '' | hexdump -C  
  22 00 22 0a 
1b) printf '"%c"\n' $'\x00' | hexdump -C  
  22 00 22 0a  
# Showing, indeed, '' == $'\x00' in this case.  However, if we
try printing the same thing using "-v v" and look at the result:

1st, assignment:

unset v; printf -v v '"%c"\n'  $'\x00' ;declare -p v

declare -- v="\""    <<-- again note that "$v" doesn't contain $'"\x00"'.
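(The truncation is easy to confirm with a length check -- on the bash
versions shown in this thread, everything at and after the first NUL is
dropped when the value is assigned:)

  printf -v v 'a\000b'
  echo "${#v}"    # 1 -- only "a" survived the assignment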




If I am in a locale using UTF-16, there will be lots of 'NUL'
chars if I view that output as binary -- so there has to be one
or more ways to encode NUL bytes in a var such that they don't
terminate the string.  
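(One workaround, sketched with made-up names and assuming xxd is
installed: keep the binary data encoded while it sits in the variable
and decode it only on the way out:)

  blob=$(printf 'a\000b' | xxd -p)               # "610062" -- the NUL survives as hex
  printf '%s' "$blob" | xxd -r -p | od -An -c    # a \0 b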


I do not have access to a UTF-16 environment.  Does bash even WORK
in such an environment?  At all?


Actually, on this one, I'll have to find a set of 'locale'
files to describe the charset.  Several charsets that work with the
"LC_" files end up working due to the way they are specified.
For example, the first go-around for the Chinese to encode their own
characters plus everything else (GB18030) ended up with a charmap
considerably larger than UTF-8's:

/usr/share/i18n/charmaps> ll --sort size -r | tail -2
-rw-rw-r-- 1 1951778 Aug 11 11:55 UTF-8
-rw-rw-r-- 1 4181789 Nov 17 15:12 GB18030


GB18030 _does_ provide a 1:1 mapping of Unicode codepoints there and
back again (same as UTF-8 and UTF-16).
But since UTF-16 & UTF-8 cover the same set of codepoints, their
charmaps would likely be similar in size.



I don't see the relation between these two issues.  "Being able to
store NUL bytes in a string" has nothing to do with "being able to
capture stdout and stderr separately in shell variables" as far as
I can see.

---
As I pointed out above, you cannot safely create, save, or restore
printf output containing NULs when it is assigned to a variable with -v,
which contradicts the manpage specification saying:

The -v option causes the  output  to  be
assigned  to  the  variable var rather than being printed to the
standard output.


Chet would have to introduce a new feature for the latter to be possible.


	I'm sure it's not trivial to handle all the different 
surprise cases.










Re: redirecting a file descriptor to an array variable? Possible? How? RFE?

2015-11-18 Thread Linda Walsh



konsolebox wrote:

On Fri, Nov 13, 2015 at 8:45 AM, Linda Walsh  wrote:

I'd like to be able to record stdout and stderr
without using any temp files into bash array variables, AND
record the status of the command executed.



You can use coproc processes that would act as buffers.
Obvious Note: Avoid it if you're conservative about some undocumented
areas that would look too hacky to you.


More often, people think my code looks hacky because
I haven't used a lot of integrated GUI+devel platforms...


Re using coprocs: I hadn't even thought of those
(probably since I haven't had good luck with them working the way
I expected).

Maybe I'll work w/that, or maybe I'll try a refactor
of my current code to combine the output streams in one sub, which
might make my return of the 'exit code' ($?) easier.
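(A minimal sketch of the coproc-as-buffer idea, for whatever it's
worth -- the names are invented, and it only works while the output
fits in the kernel's pipe buffers, since nothing drains the coproc
until the command has finished:)

  coproc BUF { cat; }                  # do-nothing pass-through buffer
  some_command >&"${BUF[1]}"           # 'some_command' is a stand-in
  rc=$?
  eval "exec ${BUF[1]}>&-"             # close the write end so cat sees EOF
  mapfile -t out_lines <&"${BUF[0]}"   # slurp the buffered output into an array
  wait "$BUF_PID"
  declare -p rc out_lines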



# TODO: Not sure how a proper coproc cleanup is done.  Closing
FDs does funny things.  Perhaps it's automatic, but not sure how
disown affects it.  We could examine the source code but not sure if
that would be reliable enough for all versions of bash including the
upcoming ones.


One thing I found useful is where you have
2 procs talking over a pipe, and the reader needs time to finish the pipe
read before the parent closes it.  So instead of just closing a pipe, I
open a pair of pipes w/1 going in the reverse direction.  Then the parent can
send various control commands to the child while the child is generating
output, so the child won't end prematurely.
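(Roughly what that looks like if the pipe pair comes from a coproc --
purely a sketch, with an invented toy child that waits for control
commands instead of exiting on its own:)

  coproc CHILD {
      while read -r ctl; do
          case $ctl in
              emit) echo "some output" ;;   # child -> parent data
              quit) break ;;
          esac
      done
  }
  echo emit >&"${CHILD[1]}"      # parent -> child control command
  read -r line <&"${CHILD[0]}"   # parent reads the child's reply
  echo quit >&"${CHILD[1]}"      # tell the child it may finish
  eval "exec ${CHILD[1]}>&-"
  wait "$CHILD_PID"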

As mentioned, I was close but it wasn't feeling right.
But some things simplify my life: I'm running
commands that talk to the terminal, and none of them _should_ be putting
out NULs... so going off on red-herring rat-holes isn't the best
way to get something that works.


One thing that seems to be another problem: Greg called
my attention to another printf format bug:

 %q causes  printf  to output the corresponding argument in a
format that can be reused as shell input.

Which it doesn't when NULs are involved.



Re: redirecting a file descriptor to an array variable? Possible? How? RFE?

2015-11-18 Thread Greg Wooledge
On Wed, Nov 18, 2015 at 10:16:46AM -0800, Linda Walsh wrote:
>   So just like you can have [<>|]&[-0-9][0-0] as redirection ops that 
>   work
> with files or FD's, why not have something like:
> 
> out=()
> printf '%s\0' "${array[@]}" >&{out[@]} (**)
> 
> **: or ">@out" or ">&@out"

You seem to be a bit unfocused here.  If you seriously want to request
a new shell feature of this level of complexity, you're going to have
to come up with an actual syntax and specification.  Then you and Chet
may have to discuss implementation details that I can't even guess at.

I would steer away from >&punc where "punc" is some third punctuation
character.  We already have >&word for FD duplication and closing,
where "word" can be an integer FD to duplicate, or - to mean close.
Find some syntax that won't conflict with existing features.

Also note that you still wouldn't be able to store \0 (NUL) bytes in
bash variables.

Also note that if you just want to copy the content of an array (but
not the indices) to another array, you can write:

  out=("${in[@]}")

You should stop using irrelevant examples, and return to your core
question.  I believe your initial goal was to run a command, and capture
the command's stdout in one variable, and the command's stderr in another
variable, without using any temp files.
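(For reference, one known -- if convoluted -- way to get all three today
without temp files; 'some_command' is a stand-in, the trick relies on
eval'ing declare -p output, and NUL bytes are still lost because they
cannot be stored in variables:)

  eval "$(
    { err=$( { out=$(some_command); rc=$?; } 2>&1
             declare -p out rc >&2 )
      declare -p err
    } 2>&1
  )"
  declare -p out err rc    # stdout, stderr, and exit status, no temp files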

Come up with some syntax and specification to achieve that, and present
it to Chet.



Re: redirecting a file descriptor to an array variable? Possible? How? RFE?

2015-11-18 Thread Greg Wooledge
On Wed, Nov 18, 2015 at 10:46:57AM -0800, Linda Walsh wrote:
>   One thing that seems to be another problem.  Greg called
> my attention to another printf format bug:
> 
>  %q causes  printf  to output the corresponding argument in a
>  format that can be reused as shell input.
> 
>   Which it doesn't when nuls are involved.

An argument cannot contain a NUL byte.  So it's moot.
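(Easy to confirm: the byte is gone before printf even runs, because the
shell cannot build an argument containing it:)

  printf '%s\n' $'a\0b' | od -An -c    # no 00 byte in the output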




OLDPWD unset when bash starts

2015-11-18 Thread John Wiersba
Why does bash clear OLDPWD when a child script is started?

OLDPWD is exported and passed to any children, but bash apparently clears 
OLDPWD whenever a child script is started:

$ cd /etc
$ cd
$ perl -e 'print "<$ENV{OLDPWD}>\n"'
</etc>
$ ksh  -c 'echo "<$OLDPWD>"'
</etc>
$ bash -c 'echo "<$OLDPWD>"'
<>
$ uname -a
Linux myserver 2.6.32-504.8.1.el6.x86_64 #1 SMP Fri Dec 19 12:09:25 EST 2014 
$ cat /etc/redhat-release
Red Hat Enterprise Linux Server release 6.5 (Santiago)
$ bash --version | head -1
GNU bash, version 4.1.2(1)-release (x86_64-redhat-linux-gnu)

Can bash be fixed to preserve the value of any OLDPWD in its initial 
environment, like it does with PWD?
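(In the meantime the value can be smuggled past bash under a different
name; SAVED_OLDPWD below is arbitrary:)

$ SAVED_OLDPWD=$OLDPWD bash -c 'echo "<$SAVED_OLDPWD>"'
</etc>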

Thanks!
-- John Wiersba



Re: redirecting a file descriptor to an array variable? Possible? How? RFE?

2015-11-18 Thread Linda Walsh



Greg Wooledge wrote:

You should stop using irrelevant examples, and return to your core
question.  I believe your initial goal was to run a command, and capture
the command's stdout in one variable, and the command's stderr in another
variable, without using any temp files.


Actually, I wanted them already line-split into 2 arrays.

That's why I didn't care too much whether the lines were separated
by NULs instead of NLs.


Come up with some syntax 


The syntax isn't the hard part -- really, I think there are several
formats that could be used that wouldn't conflict w/current uses.

I'm more interested in some of the features... like the ability
to read NUL-separated input datums with readarray/mapfile, which seems
perfect for this type of thing.

The opposite side -- printing things out NUL-separated into a variable --
is missing, but WOULDN'T be missing if bash's implementation followed
the man page.

I.e. if "$v" could really store '"\x00"\n', then another huge lump would
"go away".


I would steer away from >&punc where "punc" is some third punctuation
character.  We already have >&word for FD duplication and closing,
where "word" can be an integer FD to duplicate, or - to mean close.


---
Except that would be the wrong thing to do.

You have:
1) an optional 'fd' to work with -- that currently has to be blank or
evaluate to an integer, so (after quote removal) it has to start with [$0-9].

2) a "direction": in from something (<); out to something (>);
a not-so-useful way to open a file with RW access (<>) (but with no way to
seek, skip, truncate, or use mapfile/readarray because
of the embedded-NUL problem); or "through" something (|).

3) then comes the 2nd operand, which the manpage calls "word" -- but where is
'word' defined?  Is that any varname or funcname or alias name?
Words normally don't start with "$", though varnames do.
Words also don't normally start with '(', but a process
substitution does.


Since you are wanting to retain field '1' and field '2', it seems
a specification for '3' indicating array usage would be a perfect
fit for the language...  Since '@' is already used as a subscript meaning
all of an array, using '@' before a 'word' would store multiple,
separated 'lines' as members in the array named by the 'word' after the
'@'.
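E.g., purely hypothetically (nothing like this exists today), next to
the closest thing that does work:

  # proposed (hypothetical) array redirection:
  some_command >@out_lines 2>@err_lines

  # closest existing single-stream equivalent:
  mapfile -t out_lines < <(some_command)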

Something that parallels the existing syntax would be a better solution,
I believe, than coming up with something that looks entirely unrelated,
no?








Re: redirecting a file descriptor to an array variable? Possible? How? RFE?

2015-11-18 Thread Linda Walsh



Greg Wooledge wrote:

On Wed, Nov 18, 2015 at 10:46:57AM -0800, Linda Walsh wrote:

One thing that seems to be another problem.  Greg called
my attention to another printf format bug:

 %q causes  printf  to output the corresponding argument in a
 format that can be reused as shell input.

Which it doesn't when nuls are involved.


An argument cannot contain a NUL byte.  So it's moot.

---
As in:

printf '"%c"\n'  $'\x00'|hexdump -C

  22 00 22 0a   |".".|

	I see 2 arguments being passed to printf. That puts the 
NUL byte between 2 double quotes and terminates the line w/a newline.


What I'm pointing out is that a NUL byte can be used
and processed as an argument in some cases.  The fact that it doesn't work
in most places is, I will agree, a shortcoming.  However, one cannot
categorically say that a NUL byte can't be used as an argument.  Solving
the other places where it doesn't work might make it so that it *would* work...

Maybe a "-b" (binary) option could be added to 'declare'?




Re: redirecting a file descriptor to an array variable? Possible? How? RFE?

2015-11-18 Thread Greg Wooledge
On Wed, Nov 18, 2015 at 12:29:15PM -0800, Linda Walsh wrote:
> Greg Wooledge wrote:
> >An argument cannot contain a NUL byte.  So it's moot.
> ---
>   As in:
> >printf '"%c"\n'  $'\x00'|hexdump -C
>   22 00 22 0a   |".".|

imadev:~$ printf '"%c"\n' '' | od -t x1
000   22   0  22   a
004

$'\x00' and $'\0' and '' and "" are all empty strings.

Still don't believe me?

imadev:~$ x=$'\x00'; echo "${#x}"
0

It has length 0.



Re: Proposed Prompt Escapes

2015-11-18 Thread Ángel González
Dennis Williamson wrote:
> Do you mean something like:
> 
> PS1='$SECONDS '

Not exactly. $SECONDS is the number of seconds since the shell was
started. I am interested in the seconds elapsed by the last command.

It is a hacky solution, but it could be done with:
> PS1=''; PROMPT_COMMAND='printf "%s " $SECONDS; SECONDS=0'



> (along with shopt -o promptvars)

For anyone trying to follow this, the above line should have been 
shopt -s promptvars (promptvars option is set by default)

Best regards



Re: Proposed Prompt Escapes

2015-11-18 Thread isabella parakiss
On 11/19/15, Ángel González  wrote:
> Dennis Williamson wrote:
>> Do you mean something like:
>>
>> PS1='$SECONDS '
>
> Not exactly. $SECONDS is the number of seconds since the shell was
> started. I am interested in the seconds elapsed by the last command.
>
> It is a hacky solution, but it could be done with:
>> PS1=''; PROMPT_COMMAND='printf "%s " $SECONDS; SECONDS=0'
>
>

You need to reset it right before the command is executed; PROMPT_COMMAND
is evaluated when the prompt is drawn, before you type your command.
A DEBUG trap is better suited, but you have to make sure that it's only
executed once for a complex line.

Something along these lines should work:

PROMPT_COMMAND='debug_trap=0'
trap '((!debug_trap++)) && SECONDS=0' DEBUG
PS1='$SECONDS \$ '


---
xoxo iza



return builtin ignores explicit exit value when used in DEBUG trap

2015-11-18 Thread Grisha Levit
Not sure if this is a documentation bug or a functional one, but the man page 
does not seem to agree with the observed behavior for return when invoked in 
DEBUG traps.

return [n]
   Causes a function to stop executing and return the value specified
   by n to its caller.  If n is omitted, the return status is that of 
   the last command executed in the function body.
   ...
   If return is executed by a trap handler, the last command used to 
   determine the status is the last command executed before the trap 
   handler.
   ...
   If  return  is executed during a DEBUG trap, the last command used 
   to determine the status is the last command executed by the trap 
   handler before return was invoked.



It seems that when return is used in a DEBUG trap, any explicit return value 
used is ignored.

For example:

fun1() { trap "trap - DEBUG; (exit 1); return"   DEBUG; :; }
fun1; echo $?  # 1, as expected

fun2() { trap "trap - DEBUG; (exit 1); return 2" DEBUG; :; }
fun2; echo $?  # also 1!


The man page doesn't mention anything about the RETURN trap, but it seems to 
function the same way as the DEBUG trap does in this regard.

fun3() { trap 'trap - RETURN; (exit 1); return 2' RETURN; (exit 3); return 4; }
fun3; echo $?  # also 1!



There's also a minor inconsistency between the help topic and the man page 
regarding which traps are inherited when functrace is on:

 -T  If  set,  any traps on DEBUG and RETURN are inherited by
  shell functions,  command  substitutions,  and  commands
  executed  in  a  subshell  environment.   The  DEBUG and
  RETURN traps are normally not inherited in such cases.

 -T  If set, the DEBUG trap is inherited by shell functions.