Re: Changing how we compute release hashes

2018-03-23 Thread Nicholas Chammas
To close the loop here: SPARK-23716
<https://issues.apache.org/jira/browse/SPARK-23716>


Re: Changing how we compute release hashes

2018-03-16 Thread Nicholas Chammas
OK, will do.


Re: Changing how we compute release hashes

2018-03-16 Thread Sean Owen
I think you can file a JIRA and open a PR. All of the bits that use "gpg
... SHA512 file ..." can use shasum instead.
I would not change any existing release artifacts though.
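
For reference, a rough sketch of the swap, assuming the script's current invocation is
essentially "gpg --print-md SHA512" (the exact lines and variables in
dev/create-release/release-build.sh will differ):

# Today (approximately): gpg writes a wrapped, colon-prefixed digest block
$ gpg --print-md SHA512 spark-2.3.0-bin-hadoop2.7.tgz > spark-2.3.0-bin-hadoop2.7.tgz.sha512

# Proposed: shasum writes "<hex>  <filename>", which shasum --check understands
$ shasum -a 512 spark-2.3.0-bin-hadoop2.7.tgz > spark-2.3.0-bin-hadoop2.7.tgz.sha512
$ shasum --check spark-2.3.0-bin-hadoop2.7.tgz.sha512
spark-2.3.0-bin-hadoop2.7.tgz: OK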


Re: Changing how we compute release hashes

2018-03-16 Thread Nicholas Chammas
I have sha512sum on my Mac via Homebrew, but yeah as long as the format is
the same I suppose it doesn’t matter if we use shasum -a or sha512sum.

So shall I file a JIRA + PR for this? Or should I leave the PR to a
maintainer? And are we OK with updating all the existing release hashes to
use the new format, or do we only want to do this for new releases?
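
As a quick sanity check that the two tools emit the same line format, the following
should print nothing from diff (assumes a machine with both tools and bash; the file
name is just an example):

$ diff <(shasum -a 512 spark-2.3.0-bin-hadoop2.7.tgz) \
       <(sha512sum spark-2.3.0-bin-hadoop2.7.tgz) && echo "same format"
same format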


Re: Changing how we compute release hashes

2018-03-16 Thread Felix Cheung
+1 there


Re: Changing how we compute release hashes

2018-03-16 Thread Sean Owen
I think the issue with that is that OS X doesn't have "sha512sum". Both it
and Linux have "shasum -a 512" though.


Re: Changing how we compute release hashes

2018-03-16 Thread Felix Cheung
Instead of using gpg to create the sha512 hash file, we could just change to
using sha512sum? That would output the right format, which is in turn verifiable.
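
A minimal sketch of that flow with GNU coreutils sha512sum (file name illustrative):

$ sha512sum spark-2.3.0-bin-hadoop2.7.tgz > spark-2.3.0-bin-hadoop2.7.tgz.sha512
$ sha512sum --check spark-2.3.0-bin-hadoop2.7.tgz.sha512
spark-2.3.0-bin-hadoop2.7.tgz: OK
$ shasum --check spark-2.3.0-bin-hadoop2.7.tgz.sha512    # shasum accepts the same format
spark-2.3.0-bin-hadoop2.7.tgz: OK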



Re: Changing how we compute release hashes

2018-03-16 Thread Ryan Blue
+1 It's possible to produce the same file with gpg, but the sha*sum
utilities are a bit easier to remember the syntax for.

-- 
Ryan Blue
Software Engineer
Netflix


Changing how we compute release hashes

2018-03-15 Thread Nicholas Chammas
To verify that I’ve downloaded a Hadoop release correctly, I can just do
this:

$ shasum --check hadoop-2.7.5.tar.gz.sha256
hadoop-2.7.5.tar.gz: OK

However, since we generate Spark release hashes with GPG
<https://github.com/apache/spark/blob/c2632edebd978716dbfa7874a2fc0a8f5a4a9951/dev/create-release/release-build.sh#L167-L168>,
the resulting hash is in a format that doesn’t play well with any tools:

$ shasum --check spark-2.3.0-bin-hadoop2.7.tgz.sha512
shasum: spark-2.3.0-bin-hadoop2.7.tgz.sha512: no properly formatted
SHA1 checksum lines found

GPG doesn’t seem to offer a way to verify a file from a hash.

I know I can always manipulate the SHA512 hash into a different format or
just manually inspect it, but as a “quality of life” improvement can we
change how we generate the SHA512 hash so that it plays nicely with shasum?
If it’s too disruptive to change the format of the SHA512 hash, can we add
a SHA256 hash to our releases in this format?
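
Side by side, the difference looks roughly like this (digests truncated and purely
illustrative; the first format is what "gpg --print-md SHA512" produces, which appears
to be what the release script uses):

$ cat spark-2.3.0-bin-hadoop2.7.tgz.sha512
spark-2.3.0-bin-hadoop2.7.tgz: 0123 4567 89AB CDEF 0123 4567 89AB CDEF 0123
                               4567 89AB CDEF ...

$ cat hadoop-2.7.5.tar.gz.sha256
0123456789abcdef...  hadoop-2.7.5.tar.gz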

I suppose if it’s not easy to update or add hashes to our existing
releases, it may be too difficult to change anything here. But I’m not
sure, so I thought I’d ask.

Nick