Sorry, I wasn't very clear (in part because it is early stages and a bit
muddled in my head). We have batch jobs that are driven by desktop
software, which produces the inputs and then wants to collect the outputs
afterwards. The data can be largish (a few gigabytes of binary output). So
we need somewhere to put the input data for the compute pods to read, and
then a place to put the output for the desktop client to collect.
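
In case a sketch helps make this concrete: roughly, the client stages the
input somewhere, a Job reads it, computes, and writes the output back for the
client to collect. Something like the following is what I have in mind (the
image names, bucket, and paths are all made up, and the upload step is just
one way to do it):

```yaml
# Hypothetical Job: an init container fetches the input from object storage,
# the main container runs the computation and uploads the result.
# All names and paths here are placeholders.
apiVersion: batch/v1
kind: Job
metadata:
  name: example-compute-job
spec:
  template:
    spec:
      restartPolicy: Never
      volumes:
        - name: scratch
          emptyDir: {}   # shared scratch space; goes away with the pod
      initContainers:
        - name: fetch-input
          image: amazon/aws-cli   # placeholder
          command: ["aws", "s3", "cp",
                    "s3://my-bucket/jobs/123/input.bin", "/data/input.bin"]
          volumeMounts:
            - name: scratch
              mountPath: /data
      containers:
        - name: compute
          image: my-registry/compute:latest   # placeholder
          command: ["/bin/sh", "-c",
                    "compute --in /data/input.bin --out /data/output.bin && \
                     aws s3 cp /data/output.bin s3://my-bucket/jobs/123/output.bin"]
          volumeMounts:
            - name: scratch
              mountPath: /data
```

The emptyDir scratch space disappears with the pod, so the only thing left
to clean up afterwards is the objects in the bucket.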

We had thought it would be nice to have the storage live within the
Kubernetes cluster, so that there is only one set of things to clean up once
the data has been extracted. For now, though, we are starting by sending the
data through S3 and cleaning it up separately.
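
For the cleanup side, we may just lean on S3's own lifecycle expiration
rather than deleting objects ourselves. A rule like the following (the
prefix and retention period are guesses at what we'd use) would auto-expire
the job artifacts:

```json
{
  "Rules": [
    {
      "ID": "expire-job-artifacts",
      "Filter": { "Prefix": "jobs/" },
      "Status": "Enabled",
      "Expiration": { "Days": 7 }
    }
  ]
}
```

applied once per bucket with `aws s3api put-bucket-lifecycle-configuration`.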

As you can probably tell, we are just getting started, trying to figure
out the space, and mostly looking for standard patterns to adopt.
Thanks.
          --Daniel

On Fri, Aug 3, 2018 at 5:14 PM Rodrigo Campos <rodr...@sdfg.com.ar> wrote:

> I'm not sure what would work for you. A ConfigMap created just for that
> job? Or a Secret, which can hold binary data?
>
> Or just a URL for the input, and log the output to stdout? Or use an S3
> bucket for input/output?
>
> Not sure I understand the no-cleanup part. If you want the output, then
> you want the files in some place, right? So they can't be deleted until
> you've processed them, at least. Sorry, but not sure I follow :)
>
> On Fri, Aug 03, 2018 at 10:29:26AM -0700, Daniel Russel wrote:
> > Hi-- Any good solutions for job input and output? I'm running into the
> > same problem and was looking at spinning up a temp minio
> > (https://www.minio.io/) instance (without persistent store) or something
> > along those lines, but simpler solutions would be nicer. Mostly the goal
> > is to avoid having to separately clean up data/persistent storage
> > afterwards.
> >
> > On Monday, May 21, 2018 at 8:55:17 PM UTC-7, bimar...@google.com wrote:
> > >
> > > I am setting up a Go application that will create jobs. For each job
> > > execution, I need to pass a blob of parameter data into the job
> > > container, and get a blob back out. I'd like to avoid a requirement
> > > for database access in the job.
> > >
> > > Volumes seem awfully complicated, and, anyway, I don't see a
> > > straightforward way to get files on and off of a GCE PD volume. I
> > > suppose I'm missing something simple, or that there's some other
> > > volume type that would be more appropriate.
> > >
> >
> > --
> > You received this message because you are subscribed to the Google
> Groups "Kubernetes user discussion and Q&A" group.
> > To unsubscribe from this group and stop receiving emails from it, send
> an email to kubernetes-users+unsubscr...@googlegroups.com.
> > To post to this group, send email to kubernetes-users@googlegroups.com.
> > Visit this group at https://groups.google.com/group/kubernetes-users.
> > For more options, visit https://groups.google.com/d/optout.
>
