Paul – The recall policy looks correct; it is in line with 
/opt/ibm/MCStore/samples/recallFromCloud.template.

  1.  At the end of the policy run, what do the stats indicate, such as “X files 
migrated, Y skipped/error”? I assume the path you have specified contains files 
in the ‘non-resident’ state, which you are trying to recall using the policy. 
Correct? I ask because you mentioned around 650G of data to be recalled, but the 
policy seems to have chosen only ~2.5G across 4846 files. So if this path is 
actively used, it is likely that many of the files have already been 
transparently recalled on access and are in the co-resident state, hence no data 
movement is visible. A quick way to check file state and the TCT log is sketched 
after this list.
  2.  In /var/MCStore/ras/logs/mcstore.log, do you see any recall-specific 
errors when you run the policy?
  3.  On our test setup, we plan to run a similar policy to see whether the 
error is reproducible. I will share what we find.
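
For reference, both checks sketched (the file name here is hypothetical, and the 
exact output of ‘mmcloudgateway files list’ may vary by release):

  # Show the TCT state (Resident / Co-resident / Non-resident) of a sample file
  mmcloudgateway files list /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/Chem_Labs/somefile.dat

  # Scan the TCT log for recall errors around the policy run
  grep -i recall /var/MCStore/ras/logs/mcstore.log | tail -50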

AMEY GOKHALE
Senior Software Engineer – Spectrum Scale
Phone: 91-988 100 8675
E-mail: [email protected]

From: Huzefa H Pancha <[email protected]> On Behalf Of [email protected]
Sent: 19 July 2022 23:42
To: gpfsug main discussion list <[email protected]>; Amey P Gokhale 
<[email protected]>
Cc: gpfsug-discuss <[email protected]>
Subject: Re: [EXTERNAL] Re: [gpfsug-discuss] mass recall from on-prem COS using 
a policy


Hi Amey,

Can you provide them guidance from the TCT angle?

Regards, The Spectrum Scale (GPFS) team

------------------------------------------------------------------------------------------------------------------
If you feel that your question can benefit other users of Spectrum Scale 
(GPFS), then please post it to the public IBM developerWorks Forum at 
https://www.ibm.com/developerworks/community/forums/html/forum?id=11111111-0000-0000-0000-000000000479.

If your query concerns a potential software error in Spectrum Scale (GPFS) and 
you have an IBM software maintenance contract, please contact 1-800-237-5511 in 
the United States or your local IBM Service Center in other countries.

The forum is informally monitored as time permits and should not be used for 
priority messages to the Spectrum Scale (GPFS) team.


From: "Paul Ward" <[email protected]<mailto:[email protected]>>
To: "gpfsug main discussion list" 
<[email protected]<mailto:[email protected]>>
Date: 19-07-2022 09.34 PM
Subject: [EXTERNAL] Re: [gpfsug-discuss] mass recall from on-prem COS using a 
policy
Sent by: "gpfsug-discuss" 
<[email protected]<mailto:[email protected]>>

________________________________



Thank you.

Ill-placed on ESS or COS?

I understood restriping was for NSDs, so that would be on our ESS not COS?
The direction I want to move the files is from COS to ESS.

We do not have AFM enabled, we are using TCT.

From: gpfsug-discuss <[email protected]> On Behalf Of IBM Spectrum Scale
Sent: 18 July 2022 20:35
To: gpfsug main discussion list <[email protected]>; Venkateswara R 
Puvvada <[email protected]>
Subject: Re: [gpfsug-discuss] mass recall from on-prem COS using a policy


 "KB_Ill" shows how much data are ill placed or ill replicated.  They can be 
resolved by mmrestripefs or mmrestripefile.
Copying to AFM team regarding recall in AFM-COS environment.
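
For example, a sketch of how ill-placed data can be found and repaired (the 
file name is hypothetical, and ‘nhmfsa’ is assumed to be the filesystem device 
behind the paths quoted below):

  # "illplaced" appears in the flags field of affected files
  mmlsattr -L /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/Chem_Labs/somefile.dat

  # Repair placement for a single file, or for the whole filesystem
  mmrestripefile -p /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/Chem_Labs/somefile.dat
  mmrestripefs nhmfsa -p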

Regards, The Spectrum Scale (GPFS) team



From: "Paul Ward" <[email protected]<mailto:[email protected]>>
To: "[email protected]<mailto:[email protected]>" 
<[email protected]<mailto:[email protected]>>
Date: 07/12/2022 11:40 AM
Subject: [EXTERNAL] [gpfsug-discuss] mass recall from on-prem COS using a policy
Sent by: "gpfsug-discuss" 
<[email protected]<mailto:[email protected]>>

________________________________




Hi all,

I need to recall from on-prem COS a folder with subfolders and files, 
approximately 4600 files making up 656G.
We have a policy that runs every 30 mins, and I added this line to it:

RULE 'migrate_iac-workspace_Chem-labs' MIGRATE FROM POOL 'migrate' TO POOL 
'data' WHERE PATH_NAME LIKE 
'/gpfs/nhmfsa/bulk/share/workspace/iac-workspace/Chem_Labs/%'

'migrate' is an external pool:
RULE EXTERNAL POOL 'migrate' EXEC '/usr/lpp/mmfs/bin/mmcloudgateway files' OPTS 
'-F' ESCAPE '% -'

And 'data' is the default placement pool:
RULE 'Placement' SET POOL 'data'
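
For reference, the rules can be dry-run with a test pass before any data moves; 
a sketch, assuming the filesystem device is ‘nhmfsa’ and the rules are saved in 
recall.pol:

  # Evaluate the policy without executing it; prints the Hit_Cnt/KB_Chosen summary
  mmapplypolicy nhmfsa -P recall.pol -I test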

When it runs it identifies matching files:
Rule#  Hit_Cnt  KB_Hit   Chosen  KB_Chosen  KB_Ill  Rule
13     4846     2491152  4846    2491152    7056    RULE 'migrate_iac-workspace_Chem-labs' MIGRATE FROM POOL 'migrate' TO POOL 'data' WHERE(.)

I can’t find what ‘KB_Ill’ means, but nothing is migrating.

Migrations in other policies work, but this is a ‘recall’ policy.
From this document: 
https://www.ibm.com/docs/en/spectrum-scale/5.1.3?topic=pools-migrate-recall-external
It suggests it should be invoked as a recall.
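
If the policy route keeps failing, a manual bulk recall may be an option via 
the TCT CLI; a rough sketch, assuming this release’s recall subcommand accepts 
multiple file arguments:

  # Recall every file under the folder, in batches of 32
  find /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/Chem_Labs -type f -print0 |
    xargs -0 -n 32 mmcloudgateway files recall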

Any ideas?

Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: [email protected]<mailto:[email protected]>

_______________________________________________
gpfsug-discuss mailing list
gpfsug-discuss at gpfsug.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss_gpfsug.org
