This can be done at the Sling level, yes. But then any code that uses
the JCR API directly would not be able to access the binary. One way to
implement it at the Oak level would be to introduce some sort of
'ExternalBinary' and open up an extension point in the BlobStore
implementation to delegate the binary lookup call to some provider. It
just needs to honor the contracts of the Binary and Blob APIs.
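A minimal sketch of what such an extension point might look like. All names here (ExternalBinaryProvider, the "ext:" id prefix, DelegatingBlobLookup) are hypothetical illustrations, not existing Oak API; the in-memory map stands in for the real BlobStore backend:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.util.Map;

// Hypothetical provider SPI: resolves externally stored binaries by id.
interface ExternalBinaryProvider {
    boolean canResolve(String blobId);
    InputStream getStream(String blobId);
}

// Hypothetical BlobStore-like facade: ids carrying an "ext:" prefix are
// delegated to the registered provider; everything else is served from
// the regular (in-repository) store.
class DelegatingBlobLookup {
    private final ExternalBinaryProvider provider;
    private final Map<String, byte[]> internalStore; // stand-in for the real BlobStore

    DelegatingBlobLookup(ExternalBinaryProvider provider,
                         Map<String, byte[]> internalStore) {
        this.provider = provider;
        this.internalStore = internalStore;
    }

    InputStream getInputStream(String blobId) {
        if (blobId.startsWith("ext:") && provider.canResolve(blobId)) {
            return provider.getStream(blobId);
        }
        byte[] data = internalStore.get(blobId);
        if (data == null) {
            throw new IllegalArgumentException("Unknown blob id: " + blobId);
        }
        return new ByteArrayInputStream(data);
    }
}
```

The point of the prefix check is that existing JCR code keeps calling the same Binary/Blob API and never notices where the bytes actually live.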

That part is easy.

The harder problem is on the management side, where you need to decide
on GC. Oak would probably need to expose an API that provides a list
(iterator) of all the external binaries it refers to, so that the
external system can manage the GC itself.
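Roughly, the contract could look like this on the external-system side (hypothetical names again; Oak exposes no such iterator today): periodically sweep external storage against the set of ids the repository still references.

```java
import java.util.HashSet;
import java.util.Iterator;
import java.util.Set;

// Hypothetical GC helper on the external-system side: given an iterator
// of all external binary ids the repository still references (as Oak
// would expose it), remove every stored binary no longer referenced.
class ExternalBinaryGc {
    // Returns the ids that were swept (deleted) from external storage.
    static Set<String> sweep(Iterator<String> referencedByOak,
                             Set<String> storedExternally) {
        // Mark phase: materialize the set of ids Oak still references.
        Set<String> referenced = new HashSet<>();
        referencedByOak.forEachRemaining(referenced::add);

        // Sweep phase: anything stored but unreferenced is garbage.
        Set<String> swept = new HashSet<>();
        for (String id : storedExternally) {
            if (!referenced.contains(id)) {
                swept.add(id);
            }
        }
        storedExternally.removeAll(swept);
        return swept;
    }
}
```

As with the in-repository blob GC, the external system (not Oak) decides when to run the sweep and how to handle binaries that are about to be referenced by in-flight writes.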
Chetan Mehrotra


On Wed, Aug 10, 2016 at 3:26 PM, Ian Boston <[email protected]> wrote:
> Hi,
>
> On 10 August 2016 at 10:29, Bertrand Delacretaz <[email protected]>
> wrote:
>
>> Hi,
>>
>> On Tue, Jul 26, 2016 at 4:36 PM, Bertrand Delacretaz
>> <[email protected]> wrote:
>> > ...I've thought about adding an "adopt-a-binary" feature to Sling
>> > recently, to allow it to serve existing (disk or cloud) binaries along
>> > with those stored in Oak....
>>
>> I just noticed that the Git Large File Storage project uses a similar
>> approach, it "replaces large files such as audio samples, videos,
>> datasets, and graphics with text pointers inside Git, while storing
>> the file contents on a remote server". Maybe there are ideas to
>> steal^H^H^H^H^H borrow from there.
>>
>
> Would that be something to do at the Sling level on upload of a large file?
>
> I am working on a patch to use the Commons File Upload streaming API in
> Sling servlets/post as an Operation impl.
> I know this is oak-dev, so the question might not be appropriate here.
>
> Best Regards
> Ian
>
>
>>
>> -Bertrand
>>
>> [1] https://git-lfs.github.com/
>>
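For reference, the pointer files Git LFS [1] checks into the repository are small plain-text stubs along these lines (the oid and size values below are made-up examples):

```
version https://git-lfs.github.com/spec/v1
oid sha256:4d7a214614ab2935c943f9e0ff69d22eadbb8f32b1258daaa5e2ca24d17e2393
size 12345
```

The analogous idea here would be a small node/property in the repository standing in for the real binary, with the content-addressed id resolving to the external store.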
