Re: Getting file extensions [linux fs]

2019-04-05 Thread Pablo Lucena
Have you looked into eBPF? It has mature Python bindings, and it makes
interacting with the kernel about as efficient as possible - you can run it
in production at high resolution without putting things at risk. One of the
benefits: any averaging / aggregation / histograms / etc. can be done by
the kernel within the eBPF runtime, then passed back to Python using eBPF
"maps" - the link between your userspace program and the eBPF kernel code.
From Python, such a "map" is just a dict.

You'll need to be running at least kernel version 4.4 to get the basic
functionality, but ideally 4.9 or higher for all the newer, stable
features. There are no dependencies; it's baked into the kernel. You will
need clang, though, if you want to compile eBPF modules of your own.


*Pablo Lucena*


On Sat, Mar 30, 2019 at 8:30 PM Paulo da Silva <
p_s_d_a_s_i_l_v_a...@netcabo.pt> wrote:

> Às 22:18 de 28/03/19, Cameron Simpson escreveu:
> > On 28Mar2019 01:12, Paulo da Silva 
> wrote:
> >> Às 23:09 de 27/03/19, Cameron Simpson escreveu:
> ...
>
> >
> > Oh, just tangential to this.
> >
> > If you were doing this ad hoc, yes calling the filefrag executable is
> > very expensive. But if you are always doing a large batch of filenames
> > invoking:
> >
> >  filefrag lots of filenames here ...
> >
> > and reading from its output can be very effective, because the expense
> > of the executable is amortized over all the files - the per file cost is
> > much reduced. And it saves you working out how to use the ioctls from
> > Python :-)
> That's not the case.
> I need to do it on a per-file basis, for files I don't know in advance.
> Using the ioctl, I don't need to parse or unpack the output - only
> compare the output arrays. Besides, I need to store many of the outputs,
> and doing that from filefrag's text output would be impractical. I'd
> need, at least, to compress the data. Then again, maybe I'll have to
> compress the ioctl arrays too ... let's see how big the storage is on
> average.
>
> I have to go with ioctl. I have to open the files anyway, so there is no
> overhead for that when calling the ioctl.
>
> Anyway, thank you for the suggestion.
>
> Regards.
> Paulo
> --
> https://mail.python.org/mailman/listinfo/python-list
>
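For the ioctl route discussed above, a minimal sketch of calling Linux's
FIEMAP ioctl from pure Python (no filefrag parsing) might look like the
following. The struct layouts follow linux/fiemap.h; `fiemap_extents` is a
name introduced here for illustration, the ioctl number 0xC020660B assumes
the standard 32-byte struct fiemap, and the cap of 32 extents per call is
arbitrary:

```python
import fcntl
import os
import struct

FS_IOC_FIEMAP = 0xC020660B   # _IOWR('f', 11, struct fiemap) on Linux
FIEMAP_FLAG_SYNC = 0x00000001
EXTENT_SIZE = 56             # sizeof(struct fiemap_extent)

def fiemap_extents(path, max_extents=32):
    """Return a list of (logical, physical, length, flags) extent tuples."""
    # struct fiemap header: fm_start, fm_length (u64), then fm_flags,
    # fm_mapped_extents, fm_extent_count, fm_reserved (all u32)
    hdr = struct.pack('=QQLLLL', 0, 0xFFFFFFFFFFFFFFFF,
                      FIEMAP_FLAG_SYNC, 0, max_extents, 0)
    buf = bytearray(hdr + b'\0' * (EXTENT_SIZE * max_extents))
    fd = os.open(path, os.O_RDONLY)
    try:
        fcntl.ioctl(fd, FS_IOC_FIEMAP, buf)   # kernel fills buf in place
    finally:
        os.close(fd)
    n_extents = struct.unpack_from('=L', buf, 20)[0]  # fm_mapped_extents
    extents = []
    for i in range(n_extents):
        off = 32 + i * EXTENT_SIZE
        logical, physical, length = struct.unpack_from('=QQQ', buf, off)
        flags = struct.unpack_from('=L', buf, off + 40)[0]
        extents.append((logical, physical, length, flags))
    return extents
```

The returned tuples can be compared or stored directly, which is exactly
the "only compare the output arrays" workflow described above. Note that
not every filesystem supports FIEMAP; the ioctl raises OSError where it
doesn't.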
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: distribute python interpreter and dependencies

2018-11-13 Thread Pablo Lucena
After changing the PATH variable, are you ensuring that the failing hosts
are actually using the new PATH rather than the old one? I've seen
different Windows versions behave differently in what it takes to
"refresh" the environment. System-defined vs. user-defined environment
variables may also affect this, depending on the user invoking your
program (Windows versions differ here too, in my experience).

It may also depend on 32- vs. 64-bit, but it sounds like that isn't the
issue. I've also seen problems when client machines don't have the same
Visual C++ runtime files as were used to compile the version of Python
you're distributing.


Just a few thoughts.
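One stdlib-only way to check what a given host actually picks up is to run
a small report through the same launcher script on both a working and a
failing machine and diff the output (a sketch; `interpreter_report` is just
a name chosen here):

```python
import os
import sys

def interpreter_report():
    """Collect what this interpreter actually sees at runtime."""
    return {
        "executable": sys.executable,
        "version": sys.version,
        "path_entries": os.environ.get("PATH", "").split(os.pathsep),
    }

if __name__ == "__main__":
    # run via the launcher on a good host and a bad host, then diff
    for key, value in interpreter_report().items():
        print(key, ":", value)
```

If the failing hosts show a stale PATH or a different executable path, the
environment refresh is the culprit rather than the copied installation.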

On Mon, Nov 12, 2018 at 4:11 PM Thomas Jollans  wrote:

> On 12/11/2018 17:40, Juan Cristóbal Quesada wrote:
> > Hello,
> > this is my first mail. I resorted to the list after some prior
> struggling.
>
> Welcome!
>
> > I'm facing the need to distribute a Python installation folder and
> > interpreter on a shared network drive.
> >
> > I'm also distributing the application's source code, which would also
> > lie on another network drive.
> >
> > For this, my attempts have gone towards replicating the Python
> > installation folder created after installing Python 2.7 on one machine
> > and copying all the files and directories to the network drive. After
> > that, I copied python27.dll from C:/Windows/System32 and added
> > Python27/lib;Python27/DLLs;Python27/Scripts... to the PATH environment
> > variable through a launcher script.
>
> I assume you have a good reason to want to use an old version of Python...
>
> >
> > This works on my machine and a couple othersBUT, not in some other
> > machines running as well windows 10 pro.
>
> In what way does it not work? Is there an error message?
>
> > So I investigated a bit and discovered that if I install Python 2.7
> > (2.7.11, the same version) on one of those failing machines... the
> > "ctypes.pyd" module's file size is different. So I replaced the
> > original Python27 folders with those of the newly installed Python,
> > and now it works on those machines... I haven't tried yet whether it
> > still works on the first ones...
> >
> > Why this behaviour? I'm guessing the python27 installer generates some
> > DLLs "on the fly" that are tied to the Windows operating system...
> >
> > I don't want to create a Windows executable via py2exe or
> > pyinstaller... What are the best steps to make a Python interpreter
> > available to all the different Windows machines? Am I missing
> > something else? What steps does the Python Windows installer perform,
> > in order?
>
> I have no idea what the Python.org installer is doing here, but you
> could try one of the other Python distributions (e.g. miniconda)...
> MAYBE you'll have more luck with that (Or ActivePython, or WinPython, or
> whatever).
>
>
> -- Thomas
> --
> https://mail.python.org/mailman/listinfo/python-list
>
-- 
*Pablo Lucena*
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Search a sequence for its minimum and stop as soon as the lowest possible value is found

2017-01-07 Thread Pablo Lucena
How about using the second usage of builtin iter()?

In [92]: iter?
Docstring:
iter(iterable) -> iterator
iter(callable, sentinel) -> iterator

Get an iterator from an object.  In the first form, the argument must
supply its own iterator, or be a sequence.
*In the second form, the callable is called until it returns the
sentinel.*  <-- this one
Type:  builtin_function_or_method


In [88]: numbers
Out[88]: [1, 9, 8, 11, 22, 4, 0, 3, 5, 6]

# create iterator over the numbers to make callable simple
# you may pre-sort or do w/e as needed of course
In [89]: numbers_it = iter(numbers)

# callable passed into iter - you may customize this
# using functools.partial if need to add function arguments
In [90]: def grab_until():
...: return next(numbers_it)
...:

# here 0 is the 'sentinel' (int() would work as well); the iterator
# produced by iter() stops as soon as the sentinel value is encountered
In [91]: list(iter(grab_until, 0))
Out[91]: [1, 9, 8, 11, 22, 4]
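The transcript above can be condensed into a self-contained snippet. For
comparison, itertools.takewhile gives the same cut without a shared
iterator - note that two-arg iter() compares each return value against the
sentinel with ==, while takewhile accepts an arbitrary predicate:

```python
from itertools import takewhile

numbers = [1, 9, 8, 11, 22, 4, 0, 3, 5, 6]

# two-arg iter(): call the lambda until it returns the sentinel 0
it = iter(numbers)
until_zero = list(iter(lambda: next(it), 0))
print(until_zero)   # [1, 9, 8, 11, 22, 4]

# the same cut with takewhile over a fresh pass of the data
same_cut = list(takewhile(lambda x: x != 0, numbers))
print(same_cut)     # [1, 9, 8, 11, 22, 4]
```

Both stop before yielding the 0 itself; the two-arg iter() form consumes
the sentinel from the underlying iterator, takewhile consumes the first
failing element, so continuing to iterate afterwards differs between them.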



Hope this helps

Pablo

On Sat, Jan 7, 2017 at 8:38 AM, Jussi Piitulainen <
jussi.piitulai...@helsinki.fi> wrote:

> Rustom Mody writes:
> > On a Saturday, Jussi Piitulainen wrote:
>
> [snip]
>
> >> You switched to a simpler operator. Would Haskell notice that
> >>
> >>def minabs(x, y): return min(x, y, key = abs)
> >>
> >> has a meaningful zero? Surely it has its limits somewhere and then
> >> the programmer needs to supply the information.
> >
> > Over ℕ multiply has 1 identity and 0 absorbent
> > min has ∞ as identity and 0 as absorbent
> > If you allow for ∞ they are quite the same
>
> There is nothing like ∞ in Python ints. Floats would have one, but we
> can leave empty minimum undefined instead. No worries.
>
> > Below I am pretending that 100 = ∞
>
> Quite silly but fortunately not really relevant.
>
> > Here are two lazy functions:
> > mul.0.y = 0  -- Lazy in y ie y not evaluated
> > mul.x.y = x*y
> >
> > minm.0.y = 0  -- likewise lazy in y
> > minm.x.y = min.x.y
>
> Now I don't see any reason to avoid the actual function that's been the
> example in this thread:
>
> minabs.0.y = 0
> minabs.x.y = x if abs.x <= abs.y else y
>
> And now I see where the desired behaviour comes from in Haskell. The
> absorbing clause is redundant, apart from providing the specific
> stopping condition explicitly.
>
> > Now at the interpreter:
> > ? foldr.minm . 100.[1,2,3,4]
> > 1 : Int
> > ? foldr.minm . 100.[1,2,3,4,0]
> > 0 : Int
> > ? foldr.minm . 100.([1,2,3,4,0]++[1...])
> > 0 : Int
> >
> > The last expression appended [1,2,3,4,0] to the infinite list of numbers.
> >
> > More succinctly:
> > ? foldr.minm . 100.([1,2,3,4,0]++undefined)
> > 0 : Int
> >
> > Both these are extremal examples of what Peter is asking for — avoiding
> an
> > expensive computation
>
> Ok. Thanks.
> --
> https://mail.python.org/mailman/listinfo/python-list
>
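Back in Python terms, the early exit that the lazy foldr examples above get
from the absorbing element can be written imperatively; `minabs_stop` is a
name introduced here for the sketch:

```python
from itertools import chain, count

def minabs_stop(xs):
    """Minimum by abs() that stops scanning as soon as the absorbing
    value 0 appears, mirroring the lazy foldr behaviour."""
    best = None
    for x in xs:
        if best is None or abs(x) < abs(best):
            best = x
        if best == 0:
            break          # 0 can never be beaten; skip the rest
    return best

# early exit even on an unbounded input, like the [1,2,3,4,0] ++ [1...]
# example above
print(minabs_stop(chain([1, 2, 3, 4, 0], count())))   # 0
```

Without the break, the chain() call would never terminate - which is
exactly the expensive-computation avoidance being discussed.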



-- 
*Pablo Lucena*
-- 
https://mail.python.org/mailman/listinfo/python-list


Cycling through iterables diagonally

2016-02-25 Thread Pablo Lucena
Hello,

I am trying to accomplish the following:

Say I have a group of 4 lists as follows:

l1 = ['a1', 'a2', 'a3', 'a4']
l2 = ['b1', 'b2', 'b3', 'b4']
l3 = ['c1', 'c2', 'c3', 'c4']
l4 = ['d1', 'd2', 'd3', 'd4']

I would like to cycle through these lists "diagonally" in groups of
len(list) (in this example, each list has 4 items).

cycle1: a1, b2, c3, d4
cycle2: a2, b3, c4, d1
cycle3: a3, b4, c1, d2
cycle4: a4, b1, c2, d3

The way I thought about doing this is as follows:

from collections import deque
from itertools import cycle

l1 = deque(['a1', 'a2', 'a3', 'a4'])
l2 = deque(['b1', 'b2', 'b3', 'b4'])
l3 = deque(['c1', 'c2', 'c3', 'c4'])
l4 = deque(['d1', 'd2', 'd3', 'd4'])

l1.rotate(-0)
l2.rotate(-1)
l3.rotate(-2)
l4.rotate(-3)

groups = cycle([l1, l2, l3, l4])

In [115]: for group in groups:
     ...:     if not group:
     ...:         break
     ...:     print(group.popleft())
     ...:
a1
b2
c3
d4
a2
b3
c4
d1
a3
b4
c1
d2
a4
b1
c2
d3

Prior to this I was mucking around with index counting while looping, and
popping lists out of a deque, popping an item out of the list, and
appending the list back into the deque during each iteration.

Is there a better/cleaner way to do this? I was hoping for some cool
itertools logic =)

Thanks!
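For what it's worth, the same diagonals fall out of plain index arithmetic,
with no mutation, rotation, or deques: diagonal d takes element
(d + i) % n from row i. A sketch over the example lists:

```python
l1 = ['a1', 'a2', 'a3', 'a4']
l2 = ['b1', 'b2', 'b3', 'b4']
l3 = ['c1', 'c2', 'c3', 'c4']
l4 = ['d1', 'd2', 'd3', 'd4']

rows = [l1, l2, l3, l4]
n = len(rows)

# diagonal d takes element (d + i) % n from row i
diagonals = [[rows[i][(d + i) % n] for i in range(n)] for d in range(n)]

for diag in diagonals:
    print(diag)
```

This assumes the lists are equal-length and indexable; the deque/rotate
version above has the advantage of also working when you want to consume
the data destructively.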


-- 
*Pabl​o​*
-- 
https://mail.python.org/mailman/listinfo/python-list


https://www.python.org/downloads/ offline

2015-11-29 Thread Pablo Lucena
Is anyone else getting 503 errors when accessing the downloads page of
python.org?


-- 
*Pablo Lucena*
-- 
https://mail.python.org/mailman/listinfo/python-list


Keeping context-manager object alive through function calls

2015-11-10 Thread Pablo Lucena
I am running into a bit of an issue with keeping a context manager open
through function calls. Here is what I mean:

There is a context-manager defined in a module which I use to open SSH
connections to network devices. The "setup" code handles opening the SSH
sessions and handling any issues, and the teardown code deals with
gracefully closing the SSH session. I normally use it as follows:

from manager import manager

def do_stuff(device):
    with manager(device) as conn:
        output = conn.send_command("show ip route")
        # process output...
        return processed_output

In order to keep the SSH session open and not have to re-establish it
across function calls, I would like to do add an argument to "do_stuff"
which can optionally return the SSH session along with the data returned
from the SSH session, as follows:

def do_stuff(device, return_handle=False):
    with manager(device) as conn:
        output = conn.send_command("show ip route")
        # process output...
        if return_handle:
            return (processed_output, conn)
        else:
            return processed_output


I would like to be able to call this function "do_stuff" from another
function, as follows, such that it signals to "do_stuff" that the SSH
handle should be returned along with the output.

def do_more_stuff(device):
    data, conn = do_stuff(device, return_handle=True)
    output = conn.send_command("show users")
    # process output...
    return processed_output

However the issue that I am running into is that the SSH session is closed,
due to the do_stuff function "returning" and triggering the teardown code
in the context-manager (which gracefully closes the SSH session).

I have tried converting "do_stuff" into a generator, such that its state is
suspended and perhaps causing the context-manager to stay open:

def do_stuff(device, return_handle=False):
    with manager(device) as conn:
        output = conn.send_command("show ip route")
        # process output...
        if return_handle:
            yield (processed_output, conn)
        else:
            yield processed_output

And calling it as such:

def do_more_stuff(device):
    gen = do_stuff(device, return_handle=True)
    data, conn = next(gen)
    output = conn.send_command("show users")
    # process output...
    return processed_output

However this approach does not seem to be working in my case, as the
context-manager gets closed, and I get back a closed socket.

Is there a better way to approach this problem? Maybe my generator needs
some more work...I think using a generator to hold state is the most
"obvious" way that comes to mind, but overall should I be looking into
another way of keeping the session open across function calls?
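For what it's worth, a generator can keep a context manager suspended: the
catch is that the caller must hold a reference to the generator object,
because teardown runs as soon as the generator is closed or
garbage-collected. A self-contained sketch, with a dummy manager standing
in for the SSH one:

```python
from contextlib import contextmanager

events = []

@contextmanager
def manager(device):
    events.append(("open", device))
    try:
        yield f"conn-to-{device}"        # stand-in for the real SSH handle
    finally:
        events.append(("close", device))

def do_stuff(device):
    with manager(device) as conn:
        yield ("processed-output", conn)  # suspend inside the with-block

gen = do_stuff("r1")                 # keep this reference alive!
data, conn = next(gen)
assert events == [("open", "r1")]    # session still open while suspended
gen.close()                          # GeneratorExit runs the teardown
assert events == [("open", "r1"), ("close", "r1")]
```

If `gen` goes out of scope (for example, because only the yielded tuple is
kept), CPython finalizes the generator and the teardown fires - which would
explain getting back a closed socket.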

Thanks


-- 
*Pablo Lucena*
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: help in pexpect multiprocessing

2015-11-10 Thread Pablo Lucena
I think the problem is that you cannot pass an open socket around via
pickle. What's the error message you are getting?

Try adding the session-establishment code to the stop function, so that
each new process opens its own session to the server and executes the
command:

def stop(ds):
    # each process opens its own session, so no socket is pickled
    s = pxssh.pxssh()
    s.login('host', 'username', 'password')
    s.sendline('ps -ef | grep java')
    s.prompt(timeout=1)
    s.sendline('cd /usr')
    if condition:
        # note: a threading.Lock created here is local to this process
        # and synchronizes nothing across processes; with one session
        # per process it is no longer needed at all
        s.expect('Enter username:')
        s.sendline('user')
        s.expect('Enter password:*')
        s.sendline('pass')
        s.prompt(timeout=200)
    print('stopped ds...')


if condition == 'met':
    np = len(list1)
    p = multiprocessing.Pool(np)
    p.map(stop, list1)
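If the prompt-sensitive section really does need serializing across
processes, a multiprocessing.Lock created in the parent and handed to the
workers via a Pool initializer is the usual pattern (a threading.Lock made
inside a worker cannot do this). A sketch with a dummy stop(), no SSH
involved:

```python
import multiprocessing

def init_worker(shared_lock):
    # each worker process stores the shared lock in a module global
    global lock
    lock = shared_lock

def stop(ds):
    with lock:                    # serializes the prompt-sensitive section
        return f"stopped {ds}"

def run(items):
    shared = multiprocessing.Lock()
    with multiprocessing.Pool(2, initializer=init_worker,
                              initargs=(shared,)) as pool:
        return pool.map(stop, items)

if __name__ == '__main__':
    print(run(['ds1', 'ds2', 'ds3']))
```

Locks cannot be pickled as ordinary map() arguments, which is why they go
through `initargs` (or as explicit Process args) instead.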

On Mon, Nov 9, 2015 at 7:37 AM, <harirammanohar...@gmail.com> wrote:

> Hi,
>
> I am using multiprocessing with pexpect. The issue is that whenever I
> call a function which has expect(), it throws an error - which is
> genuine, as multiple processes are hitting the same I/O prompt at the
> same time - so I want to use a lock around that section alone to avoid
> it, but I'm still failing to implement it... can you help me?
>
> username = input('Enter your username: ')
> password = getpass.getpass()
>
> s = pxssh.pxssh()
> s.login ('host','username','password')
> s.sendline ('ps -ef|grep java')
> s.prompt(timeout=1)
>
> Try 1:
>
> if condition == 'met':
>     np = len(list1)
>     p = multiprocessing.Pool(np)
>     p.map(stop, [(ds) for ds in list1])
>
> def stop(ds):
>     s.sendline('cd /usr')
>     if condition:
>         lock = threading.Lock()
>         lock.acquire()
>         s.expect('Enter username:')
>         s.sendline('user')
>         s.expect('Enter password:*')
>         s.sendline('pass')
>         lock.release()
>         s.prompt(timeout=200)
>     print('stopped ds...')
>
> Try 2:
>
> if condition == 'met':
>     lock = Lock()
>     for ds in list1:
>         Process(target=stop, args=(ds, lock)).start()
>
> def stop(ds, l):
>     s.sendline('cd /usr')
>     if condition:
>         l.acquire()
>         s.expect('Enter username:')
>         s.sendline('user')
>         s.expect('Enter password:*')
>         s.sendline('pass')
>         l.release()
>         s.prompt(timeout=200)
>     print('stopped ds...')
>
> Both are giving me same trace..
>
> pexpect.ExceptionPexpect: isalive() encountered condition where
> "terminated" is 0, but there was no child process. Did someone else call
> waitpid() on our process?
>
> Thanks in Advance
> --
> https://mail.python.org/mailman/listinfo/python-list
>



-- 
*Pablo Lucena*
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: convert output to list(and nested dictionary)

2015-07-21 Thread Pablo Lucena

 },{

 'cidr': '0.0.0.0/0',

 'proto': 'tcp',

 'port': 80

 }]



 --
 https://mail.python.org/mailman/listinfo/python-list




-- 
*Pablo Lucena*
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python write to spreadsheet?

2015-05-30 Thread Pablo Lucena
Try openpyxl - I've found this to be a really nice library for interacting
with MS Excel.
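Note that openpyxl doesn't drive a running Excel instance - it reads and
writes .xlsx files directly, which is usually all you need; actually
starting MS Excel itself is COM-automation territory (e.g. pywin32). A
minimal sketch filling A1..A10 from a Python list (the data and file name
here are arbitrary):

```python
from openpyxl import Workbook

data = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]   # stand-in for your array

wb = Workbook()
ws = wb.active
for row, value in enumerate(data, start=1):
    ws.cell(row=row, column=1, value=value)         # fills A1 .. A10
wb.save("output.xlsx")
```

Opening output.xlsx in Excel afterwards shows the ten values in column A.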

On Sat, May 30, 2015 at 5:30 AM, Justin Thyme justinth...@nowhere.com
wrote:

 Is it possible to write a Python program that will start MS Excel, create
 a spreadsheet and fill cells A1 to A10 (say) with the data in a Python
 array?  The answer is surely yes, but is there an outline of how to do it
 somewhere?
 --
 Shall we only threaten and be angry for an hour?
   When the storm is ended shall we find
 How softly but how swiftly they have sidled back to power
   By the favour and contrivance of their kind?

 From /Mesopotamia/ by Rudyard Kipling
 --
 https://mail.python.org/mailman/listinfo/python-list




-- 
*Pablo Lucena*
-- 
https://mail.python.org/mailman/listinfo/python-list