Good morning,

Just a follow-up question on this. Say, for instance, you have to restore
an entire database and the assetstore: do you lose all thumbnails, and
will filter-media have to rebuild them from scratch?

I have been running filter-media for three weeks now and it has not yet
completed.

Looking forward to your response.

Kind regards,
Daan


On Tue, Jun 11, 2024 at 6:28 AM SAI KUMAR S <[email protected]>
wrote:

> Hi Tim
>
> Thank you for the information.
>
> The issue is that when we run the command line *./dspace filter-media*,
> files that already have generated thumbnails are still read, even though
> they are then skipped. This means the process scans the files from the
> beginning on each run, which takes more time as the number of files grows.
>
> Is there any other method, such as executing a script, for generating
> thumbnails more efficiently?
> Regards
> Sai Kumar S
>
> On Tuesday 11 June 2024 at 02:37:15 UTC+5:30 DSpace Community wrote:
>
>> Hi Sai,
>>
>> If you run "filter-media" **without** the "-f" flag, then it should
>> automatically skip all Items that already have generated thumbnails. For
>> example:
>>
>> ./dspace filter-media
>>
>> When you run it **with** the "-f" flag, that tells the filter-media
>> script to **regenerate all thumbnails**.
>>
>> For more information see the documentation on this script
>> <https://wiki.lyrasis.org/display/DSDOC7x/Mediafilters+for+Transforming+DSpace+Content#MediafiltersforTransformingDSpaceContent-Executing(viaCommandLine)>
>> .
>>
>> (The "skip list" is only needed if you have files which are consistently
>> throwing errors and you want to *skip them from all future runs* of the
>> "filter-media" script.  But, it shouldn't be necessary in your use case.)
>>
>> Tim
>>
>> On Monday, June 10, 2024 at 5:09:33 AM UTC-5 [email protected]
>> wrote:
>>
>>> Hi All,
>>>
>>> I have a query regarding filter-media. I have uploaded around 1000 books
>>> to a collection and generated thumbnails for the PDF files using the
>>> command line *dspace filter-media -f*.
>>>
>>> However, when I upload another 1000 files to the same collection, I need
>>> to generate thumbnails only for the newly uploaded files. I tried using the
>>> skip mode by creating a *skip-list.txt*, but I am not getting the
>>> desired result.
>>>
>>> Could any of you provide me with an example of how to correctly use the
>>> skip-list.txt method to generate thumbnails?
>>>
>>> Alternatively, is there any other method, such as using a script (e.g.,
>>> Python), to generate the thumbnails for only the newly uploaded files?
>>>
>>> Please help me resolve this query.
>>>
>>> Thanks & Regards
>>> Sai Kumar S
>>>
>
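For reference, the incremental workflow Tim describes above might be sketched as follows. This assumes a DSpace 7.x install run from the `[dspace]/bin` directory; the collection handle is a placeholder, not a real one:

```shell
# One-time (or forced) run: regenerates thumbnails for ALL items
./dspace filter-media -f

# Routine incremental run: items that already have thumbnails are skipped,
# so only newly uploaded files get new thumbnails
./dspace filter-media

# Optionally restrict a run to a single community/collection/item by handle
# (123456789/2 is a hypothetical handle; substitute your own)
./dspace filter-media -i 123456789/2
```

Scoping a run with `-i` to the collection that received the new uploads can shorten each run considerably, since filter-media then does not need to re-scan the whole repository to find unprocessed items.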

-- 
All messages to this mailing list should adhere to the Code of Conduct: 
https://www.lyrasis.org/about/Pages/Code-of-Conduct.aspx
--- 
You received this message because you are subscribed to the Google Groups 
"DSpace Community" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/dspace-community/CAJmvTcC1JXa3Bu6VLsyFsGMQa57BvP0teLvm4FND6RML3%2BYVgQ%40mail.gmail.com.
