I reproduced the curl command with Mojo::UserAgent successfully, and fixed my 
confusion with the headers.  Now I just have to stream the results to the 
browser.  Here is the heart of my initial attempt, where $globalC points to 
the controller argument that should send data to the browser.  This reads 
688 MB of data, without apparently writing any of it to the browser, and then 
fails with an "Out of memory!" error.

$tx->res->content->unsubscribe('read')->on(read => sub {
  my ($content, $bytes) = @_;

  our $globalC;          # $globalC should receive the streamed file
  our $digester;
  our $transferLength;
  our $contentLength;

  # This is setting the headers from the S3 presigned URL.
  if (!$content->headers->header('headersWritten')) {
    foreach my $name (@{$content->headers->names}) {
      my $value = $content->headers->header($name);
      $globalC->res->headers->header($name => $value);
      if ($name =~ /[cC]ontent-[lL]ength/) {
        $contentLength = $value;
      }
    }
    $content->headers->header('headersWritten' => 1);
    $globalC->write;
  }

  $digester->add($bytes);
  $transferLength += length($bytes);
  $globalC->write($bytes);
  print "$transferLength bytes written...\n";
});
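
For what it is worth, here is a minimal sketch (untested) of how I imagine the 
read event could be tied to the browser-side drain callback so that chunks are 
not buffered without bound.  I am assuming the UserAgent request is started 
non-blocking (so it shares the server's event loop), that $tx->connection and 
Mojo::IOLoop->stream behave as documented, and that the headers have already 
been copied as above:

use Mojo::IOLoop;

our $globalC;    # same controller global as above

$tx->res->content->unsubscribe('read')->on(read => sub {
  my ($content, $bytes) = @_;

  # Pause the upstream S3 stream until the browser has drained this chunk
  my $stream = Mojo::IOLoop->stream($tx->connection);
  $stream->stop if $stream;

  # Resume reading from S3 once the chunk has been written to the browser
  $globalC->write($bytes => sub { $stream->start if $stream });
});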

I also attempted to use the $c->proxy->get_p helper in the following manner:

  $c->proxy->get_p($url => $headers)->catch(sub {
    my $err = shift;
    print "Proxy error is $err\n";
    $c->render(text => "Error: $err\n");
  });



This appears to restart three or four times before failing with "Connection 
refused".  The headers are the same ones that work correctly for the download 
with Mojo::UserAgent.
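
For context, this is roughly how the proxy call sits in the app; the URL and 
header values below are placeholders standing in for the real presigned URL 
and SSE-C secrets, which are the same ones that work with Mojo::UserAgent:

any '/pickup' => sub {
  my $c = shift;

  # Placeholders only - the real presigned URL and SSE-C values are elided
  my $url     = 'https://mybucket.s3.amazonaws.com/obscureFileLocation?...';
  my $headers = {
    'x-amz-server-side-encryption-customer-algorithm' => 'AES256',
    'x-amz-server-side-encryption-customer-key'       => 'secretKey=',
    'x-amz-server-side-encryption-customer-key-MD5'   => 'HashOfASecret==',
  };

  $c->proxy->get_p($url => $headers)->catch(sub {
    my $err = shift;
    print "Proxy error is $err\n";
    $c->render(text => "Error: $err\n");
  });
};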

Any guidance?  I assure you I have read the references you have mentioned 
heretofore assiduously.  

Regards,

Joe Fridy



On Monday, May 4, 2020 at 3:05:11 AM UTC-4, Joseph Fridy wrote:
>
> I am attempting to stream the output of a curl command with Mojolicious.  
> The curl command looks (roughly) like this:
>
> curl -H "x-amz-server-side-encryption-customer-algorithm:AES256" \
>      -H "x-amz-server-side-encryption-customer-key:secretKey=" \
>      -H "x-amz-server-side-encryption-customer-key-MD5:HashOfASecret==" \
>      "https://mybucket.s3.amazonaws.com/obscureFileLocation?AWSAccessKeyId=secretStuff&Expires=1588568911&Signature=moreSecrets" \
>      --dump-header header.461 --silent
>
>
> The curl command references a pre-signed URL for a file stored in AWS with 
> Server-Side Encryption with Customer-Provided Keys (SSE-C), and supplies the 
> necessary key information via HTTP headers (the -H options).  The curl 
> command works, but I don't want my users to need a system with curl on it 
> to access their files.  The plan is to open the curl command as input to a 
> pipe, and stream its output to the user's browser with Mojolicious.  The 
> curl command also dumps the HTTP headers from Amazon (--dump-header), so 
> they can be used by Mojolicious.  They look like this:
>
>
> x-amz-id-2: sgMzHD2FJEGJrcbvzQwdhZK6mxUW+ePd6xdghTfgSlV45lMhliIw4prfk4cZMTHbS4fJN8N7xio=
> x-amz-request-id: 99B9CA56083DD9ED
> Date: Mon, 04 May 2020 04:57:22 GMT
> Last-Modified: Sat, 02 May 2020 03:47:35 GMT
> ETag: "b3a11409be2705e4581119fa59af79d3-1025"
> x-amz-server-side-encryption-customer-algorithm: AES256
> x-amz-server-side-encryption-customer-key-MD5: HashOfSecretKey==
> Content-Disposition: attachment; filename = "fiveGigFile"
> Accept-Ranges: bytes
> Content-Type: application/octet-stream; charset=UTF-8
> Content-Length: 5368709125
> Server: AmazonS3
>
>
> Note that the file is 5Gig.
>
>
>
> This is my stab at streaming with Mojolicious:
>
>
> use strict;
> use Mojolicious::Lite;
> use FileHandle;
> use Digest::MD5;
>
> any '/' => sub {
>   my $c = shift;
>   $c->render(template => "test");
> };
>
> any 'pickup' => sub {
>   my $c = shift;
>   my $nGigs = 0;
>   my $nMegs = 0;
>   $| = 1;
>   open(CURLCMD,"curlCmd");
>   my $curlCmd = <CURLCMD>;
>   if ($curlCmd =~ /dump-header\s*(\S+)\s+/) {
>     my $headerFile = $1;
>     open(my $curl,"$curlCmd |");
>     binmode $curl;
>     my $initialized = 0;
>     my $digester = Digest::MD5->new;
>     my $transferLength = 0;
>     my $drain;
>
>     $drain = sub {
>       my $c = shift;
>       my $chunk;
>       sysread($curl,$chunk,1024*1024);
>
>       if (!$initialized) {
>         # read the headers, and set up the transfer...
>         open(HEADERS,$headerFile);
>         while (my $line = <HEADERS>) {
>           $c->res->headers->parse($line);
>         }
>         close(HEADERS);
>         $initialized = 1;
>         print "header initialization completed for the following headers\n";
>         print join("\n",@{$c->res->headers->names}),"\n";
>       }
>
>       if ($initialized) {
>         while (length($chunk)) {
>           $digester->add($chunk);
>           $transferLength += length($chunk);
>           $c->write($chunk,$drain);
>           my $currentMegs = int($transferLength/(1024*1024));
>           if (($currentMegs > $nMegs) && ($currentMegs < 1024)) {
>             print "TransferLength: $transferLength\n";
>             $nMegs = $currentMegs;
>           }
>           my $currentGigs = int($transferLength/(1024*1024*1024));
>           if ($currentGigs > $nGigs) {
>             print "TransferLength: $transferLength\n";
>             $nGigs = $currentGigs;
>           }
>         }
>
>         if (length($chunk) <= 1) {
>           if ($chunk == 0) {
>             print "End of file found on curl pipe.";
>             print "$transferLength bytes transmitted\n";
>             print "with an MD5 hash of ",$digester->hexdigest,"\n";
>             $drain = undef;
>           }
>           if (!defined $chunk) {
>             print "Transfer error encountered on curl pipe.\n";
>             print "Error:",$!,"\n";
>             $drain = undef;
>           }
>         }
>       }
>     };
>
>     $c->$drain;
>   }
> };
>
> app->start;
>
> __DATA__
>
>
> @@ test.html.ep
> <!DOCTYPE html>
> <html>
> <body>
> <a href="/pickup" >Test of curl streaming... </a>
> </body>
> </html>
>
>
>
> When I ran this the first time, it read about 606MB of data, and the 
> server crashed with an "Out of memory!".  Subsequent runs failed at about 
> 139MB, with a server crash and no "Out of memory!" message.
>
>
> Obviously, I am an idiot.  Some guidance in the precise way I am being an 
> idiot would be greatly appreciated.
>
>
> Regards,
>
>
> Joe Fridy
>
