+1 for at-at.
I've used it recently and it works well. It
uses ScheduledThreadPoolExecutor etc under the hood, and it's only 350
lines of readable code if you want to dig in.
I have found its interface easy to use. As well as scheduling at specific
times (the at function), it supports running tasks at regular intervals
(the every function).
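For anyone reaching for it, the basic calls look roughly like this. A minimal sketch, assuming the at-at dependency (overtone/at-at) is on the classpath; the pool name and intervals are illustrative:

```clojure
;; Sketch of at-at usage; assumes overtone/at-at is a project dependency.
(require '[overtone.at-at :as at-at])

;; All scheduling in at-at happens against a thread pool.
(def pool (at-at/mk-pool))

;; Run a task every 15 minutes (the period is in milliseconds).
(def job
  (at-at/every (* 15 60 1000)
               #(println "checking for new documents...")
               pool))

;; Run a one-off task 5 seconds from now.
(at-at/at (+ (at-at/now) 5000)
          #(println "one-off task")
          pool)

;; Cancel the recurring job when you're done with it.
(at-at/stop job)
```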
Hi,
On Mon, Dec 17, 2018 at 3:46 PM wrote:
> Laurens Van Houtven, good ideas, but then I'd also have to write some code
> to catch documents that got lost when a process died while trying to fetch
> a document from S3. If I simply check every 15 minutes, and grab everything
> that has not already been stored in the database …
IIRC at-at uses ScheduledThreadPoolExecutor and is a very simple and
stable library which only does one thing, does it well, and has no
outstanding feature requests. I wrote schejulure, which does a very similar
job but with cron-style time specifications rather than periodic ones, and
I haven't touched it for years.
Laurens Van Houtven, good ideas, but then I'd also have to write some code
to catch documents that got lost when a process died while trying to fetch
a document from S3. If I simply check every 15 minutes, and grab everything
that has not already been stored in the database, then I automatically
catch anything a dying process dropped.
James Reeves, that does sound like the right way to go. I'll do that.
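The 15-minute sweep described above could be sketched like this. The three function arguments (list-s3-keys, stored-keys, fetch-and-store!) are hypothetical stand-ins for the real S3 and database calls, which aren't shown in this thread:

```clojure
;; Sketch of the periodic recovery sweep: fetch every S3 document
;; that has not already been stored in the database.
;; list-s3-keys, stored-keys, and fetch-and-store! are hypothetical
;; stand-ins for the real S3 and database operations.
(defn sweep-missing
  [list-s3-keys stored-keys fetch-and-store!]
  (let [in-db (set (stored-keys))]
    (doseq [k (list-s3-keys)
            :when (not (in-db k))]
      (fetch-and-store! k))))
```

Because the sweep only looks at what is missing from the database, re-running it after a crash is harmless.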
On Monday, December 17, 2018 at 3:31:01 PM UTC-5, James Reeves wrote:
>
>
> I'd use an executor:
>
> (ns example.main
> (:import [java.util.concurrent Executors TimeUnit]))
>
> (def scheduler
> (Executors/newScheduledThreadPool 32))
I'd use an executor:
(ns example.main
  (:import [java.util.concurrent Executors TimeUnit]))

(def scheduler
  (Executors/newScheduledThreadPool 32))

(defn fetch-files []
  (println "Fetching files..."))

(defn -main []
  (.scheduleAtFixedRate scheduler ^Runnable fetch-files 15 15 TimeUnit/MINUTES))
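One thing the snippet above leaves out is shutdown: a non-daemon scheduler pool will keep the JVM alive. A possible companion helper, with the 30-second grace period being an arbitrary choice:

```clojure
;; Sketch of a clean-shutdown helper for the scheduler above.
(import '[java.util.concurrent Executors TimeUnit ScheduledExecutorService])

(defn stop-scheduler!
  "Stop accepting new tasks, give running ones up to 30s to finish,
  then force-cancel anything still running."
  [^ScheduledExecutorService scheduler]
  (.shutdown scheduler)
  (when-not (.awaitTermination scheduler 30 TimeUnit/SECONDS)
    (.shutdownNow scheduler)))
```

It could be wired up as a JVM shutdown hook, e.g. `(.addShutdownHook (Runtime/getRuntime) (Thread. #(stop-scheduler! scheduler)))`.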
On Mon, Dec 17, 2018 at 2:54 PM wrote:
> But at-at has not been updated in 6 years, so I assume it is abandoned. I
> have two questions about this:
>
A common bit of wisdom here in the Clojure community is that time since
last update is not always (or even often) a sign of abandonment, but
instead often a sign that a library is stable and complete.
Honestly I'd use CloudWatch Timed Events to kick off a Lambda or ECS
Fargate job (which of course you can write in Clojure) assuming you're
using AWS yourself anyway. If you don't care about batching maybe even just
attach a Lambda to the write-to-S3 bucket itself instead of checking every
15m?
If
I'm coming back to Clojure development after a year away. This is a fast
moving community and it is hard to keep up when one is not working on it
full time. I'm dusting off some code I wrote 2 years ago, and trying to
bring all the dependencies up to their current versions.
I have a function t