Nicola,
I tried, but it doesn't install (with pip); I'm attaching the error screen.
haven't tried the whole thing, but to get past that particular error:
pypy -m pip install cmake
Beyond that, if the package requires cmake, it's going to build something
native, so it's going to need
Yaroslav,
ROOT data analysis framework (https://root.cern) is written in C++
(with cppyy Python bindings).
not exactly: ROOT uses a fork of cppyy with several local modifications (and
it's also very much behind cppyy master). One of them is precisely this:
ImportError: Failed to import
Matti,
Although implementation_version looks promising, it is defined as
"sys.implementation.version" which is an alias to the python version.
"implementation" is also not helpful.
does pypy even have a sys.implementation? (My installation doesn't?)
For now, I'm simply breaking the build
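For the record, a self-contained way to tell the interpreters apart (this is a sketch of the checks discussed above, not the final build fix; only the printed names are interpreter-dependent):

```python
import sys
import platform

# sys.implementation.version mirrors the *language* version that the
# interpreter implements, so it can't distinguish PyPy from CPython.
# The implementation *name*, however, can:
print(sys.implementation.name)           # e.g. "cpython" or "pypy"
print(platform.python_implementation())  # e.g. "CPython" or "PyPy"

# An older, PyPy-specific check that predates sys.implementation:
print(hasattr(sys, "pypy_version_info"))
```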
Hi,
pip 21 is (likely) going to drop the fallback of running `setup.py install`
if wheels can not be built. I've been using that fallback b/c wheels are
not installed immediately on completion and do not respect the dependency
order for build-time, whereas `setup.py install` does.
With a
David,
I would like to ask you if there is some chance to help with the PyPy
project. I am studying for an MS in Computer Science at CTU FEE, Prague.
there is this:
https://doc.pypy.org/en/latest/contributing.html
and in particular the "Your first contribution" section.
Myself, I have only
Hi,
On Mon, 22 Oct 2018, Barry wrote:
On 21 Oct 2018, at 19:04, Armin Rigo wrote:
On Sun, 21 Oct 2018 at 16:47, Barry Scott wrote:
How odd. sys.maxint on macOS and Fedora is (2**63)-1 not (2**31)-1.
That's because MacOS and Fedora are not Windows.
Do you know why Windows is unique in this
Matti,
thanks for the pull request. :)
I'm still bogged down getting the CPython side of things running on Windows.
Will have more cycles available once I get that out of my hair.
Cheers,
Wim
--
wlavrij...@lbl.gov--+1 (510) 486 6411--www.lavrijsen.net
Matti,
They claim 1500 fewer lines of code and a 2x speedup (on CPython).
I certainly believe the former, but am surprised by the latter. From some
simple benchmarking, I've found PyBind11 to be very slow (as in 3x and up).
Perhaps the problem is in the C wrappers more so than in cffi?
On the
Matti,
thanks for your patience.
Fix is in default, so next time default is merged into the py3.5 branch, the
error should go away.
The fix is to completely defer the loading of the backend shared library to
the point of first use. Thus importing _cppyy (and checking its __doc__ or
running help() on
Carl,
thanks for your help!
How are you running these CPython tests?
Using the pypy/test_all.py script.
The best way to run them is using a translated pypy-c and then something
like
pypy-c test_pydoc.py
That sounds like an expensive debugging cycle? At least on my machine,
translation
Matti,
Translates and runs. There are two new failing tests,
http://buildbot.pypy.org/summary?branch=py3.5=linux64
not following the 'two'; I see three new failing tests:
test_pydoc
test_site
test_app
Only one prints something about _cppyy, but I don't understand why.
I can't
Matti,
... and it fails to translate. Anyone want to fix?
thanks for trying; I have the fixes: an indentation error (which does not
show up in py2.7 as there is only one branch; a bytes <-> text thingy), and
the removal of __cmp__ in py3.
Where should I put them? They're safe to go into default,
Hi,
in the release-pypy3.5-v6.0.0 branch (and hence in the 3.5 release), module
_cppyy was disabled:
"_csv", "_pypyjson", "_posixsubprocess", # "_cppyy", "micronumpy"
The commit history doesn't say? Anything I should fix?
Thanks!
Best regards,
Wim
Shaheed,
yes, or rather:
+add_pkg = ['cppyy']
Anyway, is in, thanks!
I was off on a tangent now that PyPy 5.9 is released. For that matter,
the cppyy-backend package has been updated, too. PyPy 5.7-5.9 will have the
0.2.x branch; later versions are on 0.3.x. The change is minor, but needed for
Shaheed,
Oh wait, I think I see that cling-config is installed by the cppyy
package. (Seems a tad confusing, ho-hum).
no, it's in cppyy-cling, which was freshly pulled in when starting from
cppyy, as all has been updated to take that new split into account. (I'm
not sure how to force such
Parker,
thanks! Both (encoding and tracker) are now fixed.
Best regards,
Wim
___
pypy-dev mailing list
pypy-dev@python.org
Shaheed,
One issue I am struggling to find a good solution for is to generate an
accurate list of the objects (classes, functions, variables etc) in a given
header file in order to populate the selection .XML.
that option exists, but apparently no-one has ever used it, as it is clearly
David,
Wow, this is a 7-year old issue with boost;
https://svn.boost.org/trac10/ticket/4125 (
well, I don't think boost.python has had any serious development effort
behind it since 2004. (Although, the reference manual says (c) 2002-2005, so
maybe there's been a bit.)
To answer your
David,
since I didn't see a reply, I'm having a stab at it ...
There is patch file for wrapper.cpp:
pypy/module/cpyext/patches/boost.patch
which looks to be what you want, as it modifies the 'offending' code. I've
not tested it.
And out of curiosity, why boost? cpyext needs to do a mapping
Armin,
people often fail to realize
count me among those. :) Yes, that is surprising.
For the cases where the call is "simple enough", you might think that the
API mode adds the marginal cost of a wrapper
To be sure, if the code is really simple, it's typically inlined on the C++
end and
Armin,
Ouch, I suppose :-) Explicit C++ mangling. The alternative would be
to use ffi.set_source() instead of ffi.dlopen(), and use a C++
compiler again to produce the cffi binding.
sure, but the mangled name saves a redirection and has less code to generate,
was my point. Besides, who
Hi,
[replying to self]
And yes, it is technically possible to write a bindings generator that only
depends on LLVM during offline bindings generation, not at run-time. But
then you'd just have SWIG (albeit with a standards-compliant parser).
it hit me yesterday that instead of generating
Armin,
If the newly-named '_cppyy' module is more minimal than '_cffi_backend'
'more minimal' is a hard-to-define term, but a 'wc -l' shows that
_cffi_backend has 8498 lines of python and _cppyy has 4265. Of course, the
latter uses the former, saving lots of code for it. :)
Further, _cppyy
Manuel,
How do you make sure that the pure Python part remains compatible with the
backend?
I'm thinking of selecting the versions explicitly in the front cppyy package:
it already has different dependencies selected based on interpreter: PyPy or
CPython. Also, the backend "API" is small and
Maciej,
yes, I know. :) I used and use CFFI for inspiration.
Now done. The package structure has become:
         _cppyy (for PyPy)
          /           \
  cppyy (shared)   cppyy-backend (shared)
          \           /
Hi,
any objections to renaming cppyy into _cppyy?
I want to be able to do a straight 'pip install cppyy' and then use it
w/o further gymnastics (this works today for CPython), but then I can't
have 'cppyy' be a built-in module.
(You can pip install PyPy-cppyy-backend, but then you'd still have
Omer,
replying to this older mail, just so that it is seen on the list as well:
version '6.10.0.0' (is latest as of a week or so) has been uploaded to PyPI.
Best regards,
Wim
Matti,
FWIW, when I download the latest cppyy backend from PyPI, there seems to
be a different version of clingwrapper.cxx:
yes. The one in the PyPy repository is the newer and has Ryan's fixes in it.
I haven't had time to update the PyPI package.
Best regards,
Wim
Omer,
in response to your last mail, I started a script to strip ROOT and build
the package. Will get to that and that should refresh my memory enough to
see what is going on (I don't remember having to make the changes that you
posted in the diff, but that may be just my (lack of) memory).
In
Omer,
I tried to find the source code repository for the package in order to try
to share my contributions to upgrade cling to 0.3 but I could not find it.
If it doesn't exist can you please create one, preferably under the pypy
project in Bitbucket?
the current version still has some
Omer,
Omar,
sorry, did it again. :P Should stick a note to my laptop ...
Best regards,
Wim
Omar,
See build log
https://gist.github.com/thedrow/5eb31d5880b8ededc45466f8bedf6a76
I don't understand this one, as these dependencies are part of the package.
Are there any further details in any logs?
Best regards,
Wim
Anto,
On Thursday 2017-01-19 10:52, Antonio Cuni wrote:
But on top of that you need to put a layer which exposes a pythonic
interface (for example, offering list-like classes with an __iter__ and a
__getitem__). So I have no idea of how the final speed of the thing will
be, until someone tries
Hi Tobias,
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: object has no attribute 'gInterpreter'
(details: "function 'cppyy_get_template' not found in library
'libcppyy_backend.so'")
okay, problem understood: the version of pypy-c that you have does not
support
Hi Tobias,
unknown type code: P
1
unknown type code: ???
-178968160
77
that falls in the category "should never happen." Arguments are passed as
a block of memory, with a type code indicating how the individual arguments
should be treated. The only way this can be wrong is if the offset is
Hi Anto,
On Wednesday 2017-01-18 10:02, Antonio Cuni wrote:
The original goal of capnpy was to be as fast as possible on PyPy. However,
if you find that C++ + cppyy is faster, I'd be very interested to know :).
since you wrote the initial data access part yourself way back when, I'd
expect
Hi Ryan, Tobias,
On Tuesday 2017-01-17 21:39, Ryan Gonzalez wrote:
Whoops, missed that one! Change line 715 to:
std::cerr << "Warning: " << msg.str() << '\n';
indeed, thanks! Now fixed in the repository. Hopefully up soon in the pip
(takes more testing, hence time) ...
Best regards,
Hi Tobias,
https://gist.github.com/oberstet/d260ee15c81954bea8298b7400d04870
those symbols (ZN4ROOT17TGenericClassInfoD1Ev and friends) live in libCore.so
which is installed under site packages and you can link with that. However,
the backend should add all those symbols and make them
Hi Ryan,
On Tuesday 2017-01-17 18:28, Ryan Gonzalez wrote:
I think you misunderstood me. Your build log shows:
no, I didn't. :) It's not my build log, it's Tobias's.
I'm saying to _remove_ the `-stdlib=libc++` that's currently there.
Yes and I expect it to work. But I wrote the pip, and
Hi,
On Tuesday 2017-01-17 18:03, Ryan Gonzalez wrote:
What happens if you remove the `-stdlib=libc++`? Seems like that's where the
Clang build error is coming from.
that will likely be a solution, b/c that wrapper file is put in place if the
libcxx from clang is not used. Seems a
Hi Armin,
good idea; done. Documentation and release notes are also updated (mostly
removal of caveats that no longer are :) ) and the PyPI package has also
been uploaded (PyPy-cppyy-backend).
Now that I know how to do PyPI packages (sortof, anyway), I should find the
time to do the CPython one
Hi Armin,
Could you (1) document the branch inside pypy/doc/whatsnew-head.rst
yes; I'm still working on documentation and the pypi package upload. Almost
there ...
, and (2) look at the failure on
32-bit? It seems that running py.test module/cppyy crashes with exit
code 3, according to
Shubha,
I used pre-built binaries from here:
https://slproweb.com/products/Win32OpenSSL.html
the 2nd link on the wiki (at which I arrive from the openssl pages):
https://indy.fulgan.com/SSL/
has zips with both libeay32.dll and ssleay32.dll. Also, the libs are named
explicitly in
Shubha,
On Thursday 2016-12-01 18:33, Shubha Ramani via pypy-dev wrote:
I've not been able to build 32-bit OpenSSL, which is required for the PyPy
Windows build from source. Any suggestions?
you should be able to get away with removing _ssl from the list of modules
to use. See:
Shubha,
The documentation doesn't say much about compiling from source using
Visual Studio and NO Cygwin.
probably mostly due to lack of access to Windows boxes.
But 'compiling' is an ambiguous term in building PyPy. I'll presume that
you're talking about the final step after translation
Armin,
http://root.cern.ch/drupal/content/reflex
http://root.cern.ch/drupal/content/generating-reflex-dictionaries
They are 404 now. What should they be fixed to?
https://root.cern.ch/how/how-use-reflex
for both.
Aside, Google approved a GSoC student to make the Cling backend happen.
Hi Laura,
On Tuesday 2015-06-23 13:53, Laura Creighton wrote:
Note: I am still a little hazy on what Exascale means, and therefore
if we do it. :)
technically, I work on both: Cori (28 petaflops) and cppyy for PyPy.
Intermittently anyway; time is always limited ...
But unless Python is the
Ajit,
Hi Wim! I tried your suggestions (CC=icc and including -mmic in CFLAGS)
did you edit the make file (under $TMP), or only add the above as env vars?
If the former, you can check the full linker command. I'd expect problems
with the latter, as the translation (of full pypy-c anyway) runs
Omar,
The documentation doesn't specify where should I place the rootmap file.
yes it does. :)
By convention, the rootmap files should be located next to the reflection
info libraries, so that they can be found through the normal shared library
search path.
i.e. they are found through the
Omer,
My name is Omer which is the same as Omar (which is in Arabic) only in
Hebrew. It's a common mistake. Don't worry.
whoops ... :} Sorry! Time for new glasses (or a larger font size).
Is there a good guide for compiling C++ code when running setup.py? I
think we should link to it in the
Hi,
I tried to pass a bytearray and that's also not currently supported.
no, it really expects an std::string object to be passed through an
std::string*, as Amaury advised. Any other type would require the creation
of a temporary. (A C++ string is not a byte array. Typically, it carries a
Omar,
So cppyy isn't production ready yet?
that will always be in the eye of the beholder. :)
If C++ exceptions can cause the process to crash that's very dangerous in
production systems.
Yes, C++ exceptions do that in C++ as well. :) Which is why we forbid them.
Can we warn about this
Toby,
I've actually put out enquiries to the CERN people about a very similar
idea, just relating to PyCling -- which is a more general cousin of
cppyy, from what I can tell -- and so perhaps I could combine your
expertise with theirs.
I'll just quickly answer here first, get into more detail
Jean-Paul,
to follow-up on what Toby already said ...
For us it's not so much differences between CPython and PyPy (there is cpyext,
after all). Even with CPython one has to deal with different Python versions,
and Reflex lets one get away from that.
The larger point is scaling. The idea
Jean-Paul,
Type "help", "copyright", "credits" or "license" for more information.
And now for something completely different: ``Is it a cactus bug or problem
with my war?''
import cppyy
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: missing reflection library
Hi Armin,
Isn't it `ctypes.addressof(flag)`?
good idea; didn't think of that as addressof is not part of the public
C interface, but I can get hold of the python callable in the normal way.
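For the archives, a minimal sketch of what addressof buys here (the 'flag' variable is illustrative):

```python
import ctypes

flag = ctypes.c_int(0)
addr = ctypes.addressof(flag)  # raw address of the C-level storage

# Writes through the raw address are visible in the original object:
ctypes.c_int.from_address(addr).value = 1
print(flag.value)  # 1
```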
Thanks,
Wim
Hi,
A C++ reference type like int& is implemented like a C++ pointer
type int* by most C++ compilers, as far as I know. My guess is
that you simply have to pretend that int& is just int*.
wish that were true. Currently, I have no good int&, although int* will
work through an array of size
Hi Alexander,
My application has own memory managing policy (references counting).
are the reference counts applied to the objects directly or through smart
pointers? And if the latter, are the smart pointers what you expose through
cppyy?
That's why I need to *catch* moment when pypy is
Hi Alex,
On Fri, 24 Jan 2014, Alex Stewart wrote:
Speaking as somebody who is currently using cppyy to build a Python binding
for a large already-existing third-party C++ library, I have to say I think
that moving cppyy in that direction would be a substantial step *backwards*,
making cppyy
Hi Armin,
More generally, wouldn't it make some sense to try to bring cppyy closer
to cffi?
yes, it does. Is high on my wish list (and not just the programmer-facing
side, also the internals although there are several things that are close
but don't quite fit). At the top is cling, though,
Hi Armin,
I meant specifically the way the interface is to be used by the end
programmer, ignoring its implementation for now. This would mean
moving to more cffi-like idioms: removing the implicit
ownership-tracking logic, not guessing too hard about which overloaded
function is meant to be
Hi Armin,
So you're basically answering no :-)
no, I'm not saying no. (Okay, now I did twice. :) ) I did say that it was
high on my wish list, after all.
I know that there are many people who like to work the explicit way, which
is why such an interface needs to be provided. And it can be
Hi Alex,
(I'm a little unclear what it means to represent void* as an array (array
of what?),
exactly. A PyCObject is useless by design (but lacking in PyPy AFAICT, other
than in cpyext, whose use would be slow), as is an array of void. It is the
best equivalent I could come up with ...
I
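A sketch of the "opaque address" view of void* in ctypes terms (the buffer is illustrative); the pointer only becomes usable again after casting back to a typed pointer:

```python
import ctypes

# void* carries no element type; ctypes models it as an opaque address.
buf = ctypes.create_string_buffer(b"data")
opaque = ctypes.cast(buf, ctypes.c_void_p)  # the "void*" view

# Nothing can be done with 'opaque' itself except carry it around;
# to dereference, cast back to a typed pointer first:
typed = ctypes.cast(opaque, ctypes.POINTER(ctypes.c_char))
print(typed[0])  # b'd'
```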
Hi Alex,
That would be awesome, if it's not too much trouble..
well, void* is trouble by definition, but it has to be dealt with. Doing
something with it, is actually not that hard, but getting it consistent is.
Some code is in, on the reflex-support branch (to be merged once it is
Hi Jean-Francois,
I'd be personally interested in knowing what's been happening with numpypy,
PyPyROOT, and PyGame/PySDL, since those are the libs I use.
PyPyROOT is not part of PyPy; it's PyPy plus cppyy builtin with the CINT
backend and a ROOT.py tailored to it on top. That's how I
Hi Alex,
Out of curiosity, how much work do you expect the cppyy part of things to
be? (I'd offer to help, but I suspect it's all way over my head..)
the work itself is mostly transliterating code. Cleaning it up and removing
dependencies is a different matter.
This does also beg another
Hi Alex,
sorry for not responding earlier; had a bit of a rough week at work.
So I actually worked around this problem by not using void * at all and
passing around intptr_t values instead,
Yes, I was going to suggest that. :)
But I'll first start implementing void* now.
Later,
Wim
Hi Alex,
I'd looked around a bit but could only find vague references to CINT, and
it wasn't even clear to me whether a full CINT backend really existed or it
was just a hack/experiment.
it's quite alive; in high energy physics, Reflex is only used by mapping
Reflex information into CINT,
Amaury,
On Wed, 30 Oct 2013, Amaury Forgeot d'Arc wrote:
Also, the process should perform 1000 iterations before you start the
timings.
The JIT needs a lot of iterations to warm up correctly.
so, each 'iteration' that I had in the table contains an inner loop that
is itself JIT-ed (not
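The warm-up point can be made concrete with a small timing harness along these lines (the workload is just a stand-in):

```python
import time

def bench(fn, warmup=1000, reps=10000):
    # Run warm-up iterations first so a tracing JIT (as in PyPy) has
    # compiled the hot loop; then time only the steady state.
    for _ in range(warmup):
        fn()
    t0 = time.perf_counter()
    for _ in range(reps):
        fn()
    return time.perf_counter() - t0

print(bench(lambda: sum(range(100))) >= 0.0)  # True
```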
Davide,
Thanks for posting your numbers. I think they are interesting and the 11x
speedup for 16 threads is not bad, however the overhead of STM is still too
high compared to PyPy.
well, yes and no: richards.py runs 30x faster on PyPy than on CPython. The
more typical speedup of PyPy is 5x,
Davide,
I don't know. But I do know that processor/thread binding (if that is what
you mean by pin)
is what I meant. :) But a qd implementation does not seem to make much
difference other than for 8 and 16 threads, where it helps a bit.
Running some more, I noticed that there are plenty of
Hi Armin,
This is a mis-installed PyPy. To fix it, run PyPy as root and type:
import syslog
You may have to also import a few other modules as needed. (syslog
appears in the traceback above.)
thanks for the recipe!
Note also that cppyy is now included in PyPy by default (on
Skip,
SuSE has a somewhat different packaging of curses than other installations do.
It'd be ideal if pypy-c were immune to that, but lacking that, I did:
1) symlink tinfo to ncurses:
/usr/lib64/libtinfo.so.5 -> libncurses.so
2) symlink panel.h to ncurses/panel.h
/usr/include/panel.h ->
Hi,
On Thu, 6 Jun 2013, Riccardo Rossi wrote:
I will keep monitoring your project (I really believe you are doing very
interesting work) in the hope that the link to C/C++ will get better (or
that boost_python will be ported based on cffi)
for general C++ (i.e. if the boost::python portion does not
Hi,
Sorry, I did not find any mistakes except that AttributeError. So I
think maybe there are some problems in my OS.
there may still be another problem lurking around that the AttributeError
is hiding (and which I'd consider a bug :} ).
Could you try the various cases using ctypes.CDLL(),
Hi,
On Fri, May 24, 2013 at 8:45 AM, Xia Xin xiax...@gmail.com wrote:
I believe that you need to say ./libMyClassDict.so. Otherwise it's
searching for the .so in the system's standard places, which do not
include '.'.
yes, or add '.' to LD_LIBRARY_PATH. The call is basically just a dlopen:
Hi,
1.
$ echo $LD_LIBRARY_PATH
/home/GeV/work/hello/v1/src:/opt/root/lib/root
$ ls /home/GeV/work/hello/v1/src
libMyClassDict.rootmap libMyClassDict.so MyClass.h MyClass_rflx.cpp
import cppyy
myinst = cppyy.gbl.MyClass(42)
Traceback (most recent call last):
  File "<stdin>", line 1,
Hi,
All needed libraries exist. Errors occur only when I try to use
automatic class loader.
puzzling ... Only thing I can think of is to use LD_DEBUG=files (or with
LD_DEBUG=symbols) and see whether either gives an indication as to what is
wrong (a library or directory not being considered
Phyo,
On Fri, 10 May 2013, Phyo Arkar wrote:
./bin/pypy: error while loading shared libraries: libtinfo.so.5: cannot
open shared object file: No such file or directory
not every distro splits up tinfo and ncurses. On SuSE, what I did was to
provide a symlink libtinfo.so.5 -> libncurses.so and
Phyo,
On Fri, 10 May 2013, Phyo Arkar wrote:
just did that. got ELF header error @
/usr/lib64/libtinfo.so.5
what ELF header error? (Point being: pypy is linked with libncurses.so as
well, so that library has to be correct to begin with.)
Best regards,
Wim
Hi,
what are my chances of getting cppyy enabled by default in the next (beta)
release of pypy?
The code has been cleaned so that no external libraries are needed when
building pypy-c, and none will be needed at run-time until import cppyy.
There is, however, an increase of 1.5MB in size of the
Hi,
say I have something like this during translation:
[rtyper:WARNING] SomeInstance(can_be_None=True,
classdef=pypy.interpreter.baseobjspace.W_Root) can be null, but forcing
non-null in dict key
[rtyper:WARNING] SomeInstance(can_be_None=True,
classdef=pypy.interpreter.baseobjspace.W_Root)
Hi Armin,
Uh, that's strange. The docstrings in interpreter/module.py say
specifically the opposite. But the truth looks a bit more complicated
indeed, e.g. it depends if getbuiltinmodule('cppyy') was already
called during translation or not...
yes, that is called, and it is being called by
Hi Armin,
Indeed, this runs ahead of time --- maybe already at translation time,
actually.
yes, but that's fine: the code has no side effects. If it does not run at
translation time, the value of gbl seems to be frozen at its initial one,
so that's not an option.
You can move this logic
Hi Armin,
Fwiw the patch I have in mind would look like
this: http://bpaste.net/show/87755/
this line:
-'gbl': 'pythonify.gbl',
can not be removed, as w/o it 'gbl' will not show up at the module level.
and I have no clue about which
part of the code contains
Hi,
when starting pypy-c, I get some extension module code executed before that
module is imported.
The code in question is in pypy/module/cppyy/pythonify.py:
gbl = make_cppnamespace(None, "::", None, False)  # global C++ namespace
gbl.__doc__ = "Global C++ namespace."
sys.modules['cppyy.gbl'] =
Hi,
any place where I can find docs describing _immutable_ and _immutable_fields_
in some detail? As it happens, they don't quite do what I expected, which led
to bugs. I'd like to know what their scope is and what the wildcards do.
Right now I just removed them all to be safe (and allow some
Hi Maciej,
A very short answer is: never use _immutable_; it's very confusing.
that I found out the hard way already. :)
_immutable_fields_ is better (for subclassing). Wildcards mean the array on
the instance is immutable, so 'a[*]' means x.a[3] will be constant-folded,
if x is a constant.
I've
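For the archives, the hints are ordinary class attributes; a sketch of the convention (class and field names made up here, and the attribute is meaningful only to the RPython translator, not to plain Python):

```python
class OverloadSet(object):
    # 'scope' promises the field itself never changes after __init__;
    # 'functions[*]' additionally promises each *element* of
    # self.functions is immutable, so x.functions[3] can be
    # constant-folded when x itself is a constant.
    _immutable_fields_ = ['scope', 'functions[*]']

    def __init__(self, scope, functions):
        self.scope = scope
        self.functions = functions

ovl = OverloadSet('std', ['f1', 'f2'])
print(ovl._immutable_fields_)
```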
Hi Armin,
Fixed by introducing a cleaner interface in 28ae0f0e0b79.
thanks! Works for me.
Best regards,
Wim
Hi Maciej,
Can you give me the exact branch and revision so I can try? It depends
a bit on the full traceback.
current reflex-support branch. Although as said, it comes and goes: I'll
re-run a few times and post the traceback when it occurs again.
Thanks,
Wim
Hi Maciej,
Can you give me the exact branch and revision so I can try? It depends
a bit on the full traceback.
full stack trace below. One way out seems to be to use
@jit.elidable_promote(), which is fine, although it seems superfluous (the
functions[] list is already declared to be immutable).
Hi Uwe,
the rules in translator/platform/posix.py place $(LDFLAGSEXTRA) before $(OBJECTS):
('$(TARGET)', '$(OBJECTS)', '$(CC_LINK) $(LDFLAGSEXTRA) -o $@ $(OBJECTS)
$(LIBDIRS) $(LIBS) $(LINKFILES) $(LDFLAGS)'),
ah, but that then just means that I should not use the 'link_extra' keyword in
the
Hi Uwe,
1) During build/translation there were several errors
like undefined reference to `Reflex::Member::~Member().
This was caused by not linking against libReflex.
With some modifications (some insertions of
(lib,Reflex) e.g. ) in pypy/translator/tool/cbuild.py
i got a
Hi all,
On Tue, 10 Jul 2012, Maciej Fijalkowski wrote:
Amaury - not true, we can JIT functions from the start. However, they would
still accept wrapped arguments. I can think about a simple way to enable
jitting from the start without the need though.
if the JITted function still requires
Hi Stefan,
There's no reason it would require wrapped arguments.
that completely depends on how it is generated, of course, and in the context
of calls within Python (pypy-c), it makes sense to have the entry point of
the function expect wrapped arguments, and have the exit point re-wrap.
Hi Stefan,
Ok, then in the case of a callback, the runtime could simply start with
totally stupid code that packs the low-level arguments into Python
arguments, attaches the low-level type information to them and then passes
them into the function. The JIT should then see in its trace what
Hi Amaury,
I don't think it's a good idea to use pointers with a moving GC.
then make it (the moral equivalent of) a PyPyObject**. Like in Smalltalk.
Best regards,
Wim
Hi Alex,
cppyy is still so unfinished
can you provide a prioritized list of what's still missing for you? I'm following a
more or less random walk otherwise, with most work going into the CINT backend
at the moment.
and requires rebuild of pypy with root libraries and all the other tedious
things,
Hi Maciej,
Or CFFI
did I miss an announcement and has it been added to PyPy? I can't find it ...
Thanks,
Wim
Hi Maciej,
it's in the process. look on ffi-backend branch
looking good!
Any chance of getting a W_CTypeFunc.call() taking args instead of args_w? I'd
like to pull in the function pointer from app-level, so that loading in the
.so's is deferred to run-time, but then do the call from