Kendall,

That looks like what I was looking for. I'm not sure how I didn't stumble on that 
aspect of the curl command; I had expected there to be something like that but 
hadn't found it.

I will pursue that approach.

I also realized that the databases I'm loading to need to have UPDINDEX turned 
on in order for the various indexes to be updated automatically when new files 
are loaded (although with a multi-file POST endpoint, the endpoint could run 
the DB optimization once after loading all the files, which I think will be 
more efficient).
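For anyone who lands on this thread later, here is a minimal sketch of those two options in BaseX command syntax (the database name "reports" is a hypothetical placeholder, not the actual database from this project):

```
# Option 1: create the database with incremental index updates enabled
SET UPDINDEX true
CREATE DB reports

# Option 2: leave UPDINDEX off and rebuild the indexes once after a bulk load
OPEN reports
OPTIMIZE
```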

Thanks,

E.

_____________________________________________
Eliot Kimber
Sr Staff Content Engineer
O: 512 554 9368
M: 512 554 9368
servicenow.com<https://www.servicenow.com>
LinkedIn<https://www.linkedin.com/company/servicenow> | 
Twitter<https://twitter.com/servicenow> | 
YouTube<https://www.youtube.com/user/servicenowinc> | 
Facebook<https://www.facebook.com/servicenow>

From: BaseX-Talk <[email protected]> on behalf of 
Kendall Shaw <[email protected]>
Date: Friday, August 18, 2023 at 7:59 AM
To: [email protected] <[email protected]>
Subject: Re: [basex-talk] Use Bash to construct multi-file form submission

Hi Eliot,

In case you are still working on this, there is --config/-K, which lets you 
specify curl arguments in a file. I think that would avoid the bash parsing 
problems.
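To make that concrete, here is a sketch of the --config/-K idea (the file names and URL are hypothetical placeholders): write one option per line into a curl config file, then hand the file to curl, so bash never has to re-parse a quoted command string.

```shell
# Sketch only: generate a curl config file instead of building a command string.
# "report1.xml"/"report2.xml" and the URL are hypothetical placeholders.
cfg=$(mktemp)
printf 'url = "http://localhost:9984/now/rest/validation/reports?debug=false"\n' > "$cfg"
for file in report1.xml report2.xml; do
  printf 'form = "files=@%s"\n' "$file" >> "$cfg"
done
cat "$cfg"                      # inspect the generated options
# curl --location -K "$cfg"     # the actual call
```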

Best,
Kendall
On 8/15/2023 8:00 AM, Eliot Kimber wrote:
This is not strictly a BaseX question, but it is in the context of communicating 
with a BaseX-implemented REST service, and I'm hoping someone in this community 
can answer my question, as I've not found a solution on the interwebs.

I’ve implemented a POST method that expects a multi-part form request with 
multiple files that then loads those files into a database. The files are all 
related such that I want something close to an atomic commit for the set of 
files. Thus the multi-file form approach.


The REST method works and I can call it from the command line using curl, e.g.:

curl --location \
  'http://localhost:9984/now/rest/validation/reports?debug=false' \
  --form 'files=@"/Users/eliot.kimber/git-basex/product-content-analytics/validation-reports/vancouver/bundle-strategic-portfolio-mgmt_errors_vancouver_2023-08-14T22~11~28Z_1ae692f8_1692054936.xml"' \
  --form 'files=@"/Users/eliot.kimber/git-basex/product-content-analytics/validation-reports/vancouver/bundle-tech-technology-industry_errors_vancouver_2023-08-14T22~11~28Z_1ae692f8_1692054971.xml"'

What I haven’t been able to figure out is how to construct the equivalent curl 
command in a bash script, where the script gets a list of files to load and 
then constructs the --form parameters.

Everything I’ve tried results in a “Warning: Trailing data after quoted form 
parameter” message or fails outright.

It must be some subtlety of how to escape the quotes. My googling turns up lots 
of Stack Overflow discussions of how to manage the escapes, but nothing I’ve 
tried has worked so far.

My bash approach is simply:

curlcmd="curl --location …"
for file in "${files[@]}"; do
  curlcmd+=" --form 'files=@\"${file}\"'"
done

And then execute the resulting string:

bash -c "$curlcmd"
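A quoting-safe alternative (a sketch, not the actual script; the file names and URL are hypothetical placeholders) is to accumulate the arguments in a bash array and pass the array to curl directly, so there is no command-string re-parsing at all:

```shell
# Sketch only: build curl's argument list as a bash array.
# Each array element reaches curl as exactly one argument, so no
# escaping of nested quotes is needed.
files=(report1.xml report2.xml)
args=(--location 'http://localhost:9984/now/rest/validation/reports?debug=false')
for file in "${files[@]}"; do
  args+=(--form "files=@${file}")
done
printf '%s\n' "${args[@]}"   # inspect the argument list
# curl "${args[@]}"          # the actual call
```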

I have a separate script that successfully constructs a curl call to a 
companion end point that takes one file:



curl \
  "--location" \
  "${serverUrl}:${serverPort}/now/rest/validation/report/${fileName}?debug=${debug}" \
  "--header" "Content-Type: text/xml" \
  "--data-binary" "@${filePath}"

And I’m currently just calling this for each file, which is OK.

But I’d really like to be able to send a set of files as a single REST call.

Thanks,

Eliot
