Python bindings HOWTO proof reader request
Ben McGinnes
ben at adversary.org
Fri Mar 23 23:01:51 CET 2018
Sorry for the delayed reply; there have been a few power outages here
... and bush fires.
On Wed, Mar 21, 2018 at 10:49:03AM +0100, Tobias Mueller wrote:
> hi,
>
> On Mon, 2018-03-19 at 14:35 +1100, Ben McGinnes wrote:
>> On Sat, Mar 17, 2018 at 11:24:47PM +0100, Tobias Mueller wrote:
>>> I understand that the system's gpgme.h is needed for generating the
>>> python bindings. But is that not very similar to other python
>>> bindings to (system wide) libraries on PyPI? What's making gpgme
>>> more special than those?
>>
>> Essentially it boils down to the dynamic generation of that header
>> file when GPGME is compiled
> But that's before someone installed the python bindings, right?
Yes.
>> in conjunction with the dynamic generation
>> of the SWIG bindings. If gpgme.h shipped statically with the source
>> then there'd be no problem, but that isn't the case here.
>
> But in case of source distribution, gpgme.h is on the target system,
> no? And for a binary distribution, gpgme.h doesn't matter.
Except that you're not going to be able to build a binary distribution
without building it from source in the first place, and that build
step is going to need gpgme.h.
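To make that concrete, here is a minimal sketch of how a build step
might locate the header (assuming gpgme-config is on the PATH, which a
source install of GPGME provides; the exact lookup is illustrative,
not the bindings' actual setup code):

    import os
    import subprocess

    # gpgme-config ships with GPGME itself, so if it's missing then
    # GPGME isn't installed and there is nothing to build against.
    prefix = subprocess.check_output(
        ["gpgme-config", "--prefix"]).decode().strip()
    header = os.path.join(prefix, "include", "gpgme.h")
    if not os.path.exists(header):
        raise RuntimeError("gpgme.h not found; install GPGME first")
    print("building against", header)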
>>> You don't really have to ship a gpgme.h, do you?
>>
>> Yeah, actually we do since that's what GPGME itself depends upon.
>
> wait. what? Of course, gpgme itself does not depend on an existing
> gpgme.h. The python bindings do.
And so does everything that uses GPGME.
Take Mutt, for instance; it accesses the C code in GPGME, and every
component that does so does it by including gpgme.h.
> The source distribution of the Python bindings can probably pick the
> gpgme.h up that is already present on the system.
Right.
Here's the thing, though: if there is already a gpgme.h on the system,
then GPGME has already been installed, and since it is only made
available (from gnupg.org) as source, it was compiled for that system
at some point. This means it was either compiled from source on that
system or precompiled (e.g. by a Linux distribution for a target
architecture) in a way which will "just work" on that system.
Since the bindings ship with GPGME, why is that not enough to cover
the installation of those bindings at the same time? Why use PyPI
instead of a version built against the gpgme.h which matches the .so,
.dll or .dylib files on the system?
For the Python bindings not to be there in the first place would
actually require manual intervention to disable them; PyPI
installation only becomes necessary when someone has intervened to
disable the default build parameters. I don't really see the value in
that, though I am aware that some distributions and package managers
do precisely that in order to separate the bindings from the library.
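As a rough sanity check on that kind of version matching, one can
compare what the runtime library reports with what the development
files on the system report (a sketch, assuming both the gpg module and
gpgme-config are installed and that the check_version() helper returns
the runtime GPGME version string):

    import subprocess
    import gpg

    # Version of the GPGME shared library the bindings load at runtime.
    runtime = gpg.core.check_version()
    # Version reported by the development files on the system.
    devel = subprocess.check_output(
        ["gpgme-config", "--version"]).decode().strip()
    if runtime != devel:
        print("warning: runtime GPGME %s != development files %s"
              % (runtime, devel))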
> One problem with that approach: When the python bindings make use of
> features not present in the system's GPGME and thus try to access
> fields or functions that do not exist.
Also right, but that's at the core of the current recommendation for
installation. When GPGME is built from source, the resulting gpgme.h
file only contains functions that are in that GPGME source code. So
when the Python bindings are installed during the same process, they
have the same features.
In theory that means the scenario you describe shouldn't occur. In
practice there are a couple of obscure edge cases in the underlying
code that might or might not do anything following a bugfix to make it
play nicely on i386 systems. Justus knows a bit more about that,
though, since he fixed it. ;)
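In the meantime, code that has to run against a GPGME of unknown
vintage can feature-test defensively; a minimal sketch (the symbol
name gpgme_hypothetical_op is a placeholder, not a real function, and
it assumes the low-level SWIG module is importable as gpg.gpgme):

    import gpg

    # Probe for a symbol that would only exist in newer GPGME builds.
    # "gpgme_hypothetical_op" is a placeholder, not a real function.
    if hasattr(gpg.gpgme, "gpgme_hypothetical_op"):
        pass  # the newer feature is available; use it here
    else:
        pass  # fall back to behaviour the older library supports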
> But even then you could argue that having the source up on PyPI helps
> people finding and installing the library.
Yes, this too is a valid point, and as it happens, when I last checked
there was a version of 1.10.0 on PyPI. I'm just not certain how
effective that is compared to a source install.
> A binary distribution would neatly solve this problem, assuming
> gpgme itself being able to deal with older version of gnupg (which
> it generally does).
I'm not certain. The bugfix I mentioned above is fairly significant
for 32-bit systems and that bugfix was made after GPGME 1.10.0 was
released.
>>> Neither for source distribution (you pick the system's header) nor
>>> for binary distribution (the header is not needed at runtime, is
>>> it?). That's at least how the packages that I know do it.
>>
>> You've just hit the nail on the head; these bindings don't behave
>> the same way as the ones you're used to.
>>
> I still don't see why it wouldn't.
This might be one of those cases where it would become more apparent
by looking under the hood. ;)
>> No, but we are already aware of the range of architectures which
>> actively use GnuPG components
>
> Right. I assume that people on most of these architectures are happy
> to compile their own stuff. That's unlike people on consumer
> hardware.
This is true, but in these cases it would probably be better to simply
ship the entire thing as part of the consumer system's native
ecosystem (i.e. shipping the GPGME library with all the supported
language bindings: the Python bindings as well as the C++/Qt
bindings).
>> Although there is another aspect to that, of course, and that is
>> that with crypto projects there is a general preference to not ship
>> binaries at all.
>
> Sure. Notice how I started my last email with "discussing the
> technical bits".
Of course.
> Whether you (the general you, not you, personally) actually want to
> distribute binaries via PyPI is a whole other set of
> questions. (hint: I'm in favour of doing that)
I figured, but I'm not convinced that with cryptographic software
that's something we really want to encourage. From an ease-of-use
standpoint for developers, I do see the advantages, of course (I've
had to install GTK a few times over the years, after all), but there
are inherent trust issues (in a general sense) with cryptographic
software which we do need to keep in mind.
>>> That is to say, PyPI is quite happy to accept packages without
>>> having binaries compiled for HURD on s/390 or BSD on a toaster. So
>>> there is no technical limitation in providing only, say, Linux
>>> binaries for x86 and x64.
>>
>> As Jakob already pointed out, it's more accurate to say that those are
>> the only options available for Linux binaries on PyPI.
>
> You see what I did there.
:D
>> So the question then becomes, is that usage enough to justify a
>> special binary built just for Linux x86-64 users or not?
>
> I don't think it needs to be special. Otherwise, I think your
> assessment is correct.
Well, in this context any pre-built binary would be special since
currently we're not providing any GPGME binaries.
To provide a functional Python binary would effectively mean providing
GPGME binaries too. If we were to do that, though, then there might
be a better way than the sort of wheel you're thinking of, but that
gets into some of the options available via CFFI.
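For the curious, the CFFI route looks roughly like this: rather than
SWIG parsing gpgme.h at build time, the binding declares only the
symbols it needs and opens the shared library at runtime (a sketch;
the library name passed to dlopen may need adjusting per platform):

    from cffi import FFI

    ffi = FFI()
    # Declare just the one symbol we need; no gpgme.h required.
    ffi.cdef("const char *gpgme_check_version(const char *req);")
    # Finds libgpgme.so, libgpgme.dylib, etc., as the platform allows.
    lib = ffi.dlopen("gpgme")
    # Passing NULL simply returns the installed GPGME version string.
    print(ffi.string(lib.gpgme_check_version(ffi.NULL)).decode())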
> My naïve thinking is that those users benefit the most from a
> binary distribution on PyPI. Many others are comfortable with a
> binary distribution via other channels (I'm thinking Windows here),
Yes, but Windows is a whole different issue, since Windows
consistently has problems with SWIG in its entirety; that's one of the
many reasons for considering a move from SWIG to CFFI.
> source distribution via PyPI (probably the BSD and other folks who
> are happy to compile their own stuff), or source distribution via
> other channels (MacOS probably).
MacOS definitely, that's what my current laptop is. ;)
> And the hassle should be really small. For a release, you probably
> do a sdist and a bdist anyway to check whether everything builds
> correctly. Publishing these is a "twine upload" away.
Perhaps ... alternatively a straight "configure && make && make check
&& sudo [-H] make install" with GPGME will install the library and all
the available language bindings, including the Python ones.
>> And yet GnuPG has an extraordinarily wide customer base. Recent
>> posts to gnupg-devel and gnupg-users alone indicate there are
>> still plenty of people using Solaris and GPG.
>
> I appreciate that.
> As by my last argument, these people are happy to provide the
> necessary infrastructure for building from source.
This is true.
> The casual developer for Ubuntu apps, let alone their users, are
> probably not.
This is also true, but the question then becomes: why would the
bindings be deliberately left out of their distribution? Well, aside
from certain distributions remaining far behind current library
versions, of course (insert pointed look at both the Ubuntu and RHEL
ecosystems here, regardless of whether or not security fixes are
backported).
>>> PEP-513 is quite clear about what it expects and while producing
>>> such a manylinux wheel is a bit of an effort, the number of
>>> potential consumers might make up for it.
>>
>> Yes, it might very well do so at that. We'll certainly consider it.
>
> nice.
Yeah, I'm definitely not ruling it out, particularly considering some
of the other things under consideration, but it will still depend on
the degree of additional effort required and whether or not it
introduces too much in the way of either additional points of failure
or potential security problems. Not least of which is that if we
release it then we *must* support it.
Also, I do view this type of binary release as essentially extra work
made necessary only by the deliberate actions of third parties to
break functionality. That is, it is only necessary because
distribution managers have manually removed the bindings from the
library in the first place, or have simply not updated the versions of
the libraries they bundle with their distribution. In some cases both
apply, and in some cases the latter is caused by a lack of resources
at their end to perform the same manual intervention on the more
recent releases.
Now, policies like that are fine; their distribution, their choice. A
point must come, however, where "fixing" their policy choices through
ad-hoc binary installs via third-party systems (in this case PyPI) may
not serve the original project best. There must be a point at which it
is not the responsibility of a software project to fix the, for lack
of a better term, sabotage of a distribution which has modified that
software.
Regards,
Ben