Hi, Any further ideas on this issue?
Kindest regards,
Paul

From: Paul Ward
Sent: 20 July 2022 15:47
To: gpfsug main discussion list <[email protected]>
Subject: RE: [gpfsug-discuss] mass recall from on-prem COS using a policy

I used the template policy, substituting where applicable:

# more /gpfs/nhmfsa/custom/valentine-migrations/policy/iac-workspace_axio-imager_2021.pol
/*******************************************************************************
* Licensed Materials - Property of IBM
* OCO Source Materials
* (C) Copyright IBM Corp. 2016-2017 All Rights Reserved
* The source code for this program is not published or other-
* wise divested of its trade secrets, irrespective of what has
* been deposited with the U.S. Copyright Office.
*******************************************************************************/

define(
  exclude_list,
  (
    FALSE
    OR PATH_NAME LIKE '%/.mcstore/%'
  )
)

RULE EXTERNAL POOL 'migrate'
  EXEC '/usr/lpp/mmfs/bin/mmcloudgateway files'
  OPTS '-F'
  ESCAPE '% -'

/*
RULE EXTERNAL POOL 'mcstore'
  EXEC '/usr/lpp/mmfs/bin/mmcloudgateway files'
  OPTS '-F'
  ESCAPE '% -'

RULE 'dmrecall1' MIGRATE FROM POOL 'mcstore' TO POOL 'system'
  WHERE
  (
    (PATH_NAME LIKE '/gpfs/test_dir/%')
    AND NOT (exclude_list)
  )
*/

RULE 'migrate_bulk_workspace_axioimager_2021' MIGRATE FROM POOL 'migrate' TO POOL 'data'
  WHERE
  (
    (PATH_NAME LIKE '/gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/%')
    AND NOT (exclude_list)
  )

-----------------------------------

I changed to the folder listed in the path and ran this command, with the defer option:

# pwd
/gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021
# mmapplypolicy $(pwd) -f $(pwd)/policy-lists -I defer -L 3 -P /gpfs/nhmfsa/custom/valentine-migrations/policy/iac-workspace_axio-imager_2021.pol

The result of the policy was:

[I] Summary of Rule Applicability and File Choices:
 Rule#  Hit_Cnt  KB_Hit  Chosen  KB_Chosen   KB_Ill      Rule
 0      994      0       994     1566373796  1566373796  RULE 'migrate_bulk_workspace_axioimager_2021' MIGRATE FROM POOL 'migrate' TO POOL 'data' WHERE(.)

[I] Filesystem objects with no applicable rules: 53.

[I] GPFS Policy Decisions and File Choice Totals:
 Chose to migrate 1566373796KB: 994 of 994 candidates;
 1566373796KB of chosen data is illplaced or illreplicated;
Predicted Data Pool Utilization in KB and %:
Pool_Name    KB_Occupied      KB_Total         Percent_Occupied
data         175459129252     245111980032     71.583253185%
system       0                0                0.000000000% (no user data)
[I] Because some data is illplaced or illreplicated, predicted pool utilization may be negative and/or misleading!
[I] 2022-07-20@14:22:23.943 Policy execution. 0 files dispatched.
[I] A total of 0 files have been migrated, deleted or processed by an EXTERNAL EXEC/script;
    0 'skipped' files and/or errors.
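(As a quick cross-check of the ill-placed figures above, a minimal sketch using one file picked from the candidate list further down: mmlsattr -L reports the storage pool a file's data is assigned to, and its flags line shows "illplaced" when the data is not where the placement rules say it should be.)

# Inspect the pool assignment and illplaced/illreplicated flags for one selected file
mmlsattr -L "/gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/Matt Loader 2021/ML210215_OT-113_TL.czi"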
---------------------------------------------------------------------

It's selected about the right number, as there are:

# find -type f | wc -l
996

--------------------------------------------------------

The head of the deferred output file list:

15:27:41 [root@scale-sk-pn-1 2021]# head policy-lists.recall.migrate
9085980 2146238895 0 -- %2Fgpfs%2Fnhmfsa%2Fbulk%2Fshare%2Fworkspace%2Fiac-workspace%2Flight_microscopy%2Fzeiss_axioimager_m2%2F2021%2FRobin Armstrong 2021%2FThumbs.db
30157518 903677244 0 -- %2Fgpfs%2Fnhmfsa%2Fbulk%2Fshare%2Fworkspace%2Fiac-workspace%2Flight_microscopy%2Fzeiss_axioimager_m2%2F2021%2FMatt Loader 2021%2FML210215_OT-113_RL_s0c0x0-8849y0-8889.tif
30157519 321389117 0 -- %2Fgpfs%2Fnhmfsa%2Fbulk%2Fshare%2Fworkspace%2Fiac-workspace%2Flight_microscopy%2Fzeiss_axioimager_m2%2F2021%2FMatt Loader 2021%2FML210215_OT-113_TL.czi
30157520 1970925505 0 -- %2Fgpfs%2Fnhmfsa%2Fbulk%2Fshare%2Fworkspace%2Fiac-workspace%2Flight_microscopy%2Fzeiss_axioimager_m2%2F2021%2FMatt Loader 2021%2FML210215_OT-113_TL_s0c0x0-8846y0-8892.tif
30157521 1773348368 0 -- %2Fgpfs%2Fnhmfsa%2Fbulk%2Fshare%2Fworkspace%2Fiac-workspace%2Flight_microscopy%2Fzeiss_axioimager_m2%2F2021%2FMatt Loader 2021%2FML210215_OT-113_TL_s0c0x0-8849y0-8889.tif
30157522 2126423502 0 -- %2Fgpfs%2Fnhmfsa%2Fbulk%2Fshare%2Fworkspace%2Fiac-workspace%2Flight_microscopy%2Fzeiss_axioimager_m2%2F2021%2FMatt Loader 2021%2FML210215_OT-29_RL.czi
30157523 1701851598 0 -- %2Fgpfs%2Fnhmfsa%2Fbulk%2Fshare%2Fworkspace%2Fiac-workspace%2Flight_microscopy%2Fzeiss_axioimager_m2%2F2021%2FMatt Loader 2021%2FML210215_OT-29_RL_s0c0x0-6359y0-6656.tif
30157524 1844809347 0 -- %2Fgpfs%2Fnhmfsa%2Fbulk%2Fshare%2Fworkspace%2Fiac-workspace%2Flight_microscopy%2Fzeiss_axioimager_m2%2F2021%2FMatt Loader 2021%2FML210215_OT-29_TL.czi
30157525 912638442 0 -- %2Fgpfs%2Fnhmfsa%2Fbulk%2Fshare%2Fworkspace%2Fiac-workspace%2Flight_microscopy%2Fzeiss_axioimager_m2%2F2021%2FMatt Loader 2021%2FML210215_OT-29_TL_s0c0x0-6359y0-6656.tif
30157526 1280698844 0 -- %2Fgpfs%2Fnhmfsa%2Fbulk%2Fshare%2Fworkspace%2Fiac-workspace%2Flight_microscopy%2Fzeiss_axioimager_m2%2F2021%2FMatt Loader 2021%2FML210215_OT-29_TL_s0c0x0-6359y0-7609.tif
....

--------------------------

If I try to recall a file using one of these paths:

# mmcloudgateway files recall "%2Fgpfs%2Fnhmfsa%2Fbulk%2Fshare%2Fworkspace%2Fiac-workspace%2Flight_microscopy%2Fzeiss_axioimager_m2%2F2021%2FMatt Loader 2021%2FML210215_OT-29_TL_s0c0x0-6359y0-7609.tif"
mmcloudgateway: Internal cloud services returned an error: file system object /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/%2Fgpfs%2Fnhmfsa%2Fbulk%2Fshare%2Fworkspace%2Fiac-workspace%2Flight_microscopy%2Fzeiss_axioimager_m2%2F2021%2FMatt Loader 2021%2FML210215_OT-29_TL_s0c0x0-6359y0-7609.tif does not exist
mmcloudgateway: Command failed. Examine previous error messages to determine cause.

The first part of the object name in that error (everything before the %-escaped portion) is just my current directory: if I run the command from another location, it picks up whatever pwd I am in.

-----------------------

If I give it the correctly formed POSIX path, it works:

# mmcloudgateway files recall "/gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/Matt Loader 2021/ML210215_OT-29_TL_s0c0x0-6359y0-7609.tif"
mmcloudgateway: Command completed.
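(In case it helps anyone hitting the same thing, a rough workaround sketch, assuming the deferred-list format shown above of "inode gen snapid -- escaped-path": strip the leading fields, undo the %-escaping produced by the ESCAPE '% -' clause, and recall each file by its decoded POSIX path. This is only a stopgap, not how mmapplypolicy is meant to hand the list to mmcloudgateway.)

# Decode each escaped path in the deferred list and recall it individually
while IFS= read -r line; do
    escaped=${line#*-- }      # keep everything after the "-- " separator
    path=$(python3 -c 'import sys, urllib.parse; print(urllib.parse.unquote(sys.argv[1]))' "$escaped")
    mmcloudgateway files recall "$path"
done < policy-lists.recall.migrate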
------------------------------------------

I also ran a policy just to identify migrated files:

define(ismigrated, (MISC_ATTRIBUTES LIKE '%V%'))

Rule 'migrated' LIST 'migrated_list' WHERE ismigrated
Rule 'resident' LIST 'resident_list' WHERE not (ismigrated)

----------------------------------------------------------------

The head of that list correctly forms the POSIX paths:

15:35:38 [root@scale-sk-pn-1 2021]# head list.migrated_list
9085980 2146238895 0 -- /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/Robin Armstrong 2021/Thumbs.db
30157518 903677244 0 -- /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/Matt Loader 2021/ML210215_OT-113_RL_s0c0x0-8849y0-8889.tif
30157519 321389117 0 -- /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/Matt Loader 2021/ML210215_OT-113_TL.czi
30157520 1970925505 0 -- /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/Matt Loader 2021/ML210215_OT-113_TL_s0c0x0-8846y0-8892.tif
30157521 1773348368 0 -- /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/Matt Loader 2021/ML210215_OT-113_TL_s0c0x0-8849y0-8889.tif
30157522 2126423502 0 -- /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/Matt Loader 2021/ML210215_OT-29_RL.czi
30157523 1701851598 0 -- /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/Matt Loader 2021/ML210215_OT-29_RL_s0c0x0-6359y0-6656.tif
30157524 1844809347 0 -- /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/Matt Loader 2021/ML210215_OT-29_TL.czi
30157525 912638442 0 -- /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/Matt Loader 2021/ML210215_OT-29_TL_s0c0x0-6359y0-6656.tif
30157526 1280698844 0 -- /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/light_microscopy/zeiss_axioimager_m2/2021/Matt Loader 2021/ML210215_OT-29_TL_s0c0x0-6359y0-7609.tif

What's going on?

What I have done in the past is take the output from my 'ismigrated' policy, put a recall command before each path, divide the file into 4 parts, and run each part on one of our 4 protocol nodes. Manually doing what mmapplypolicy should do! (Roughly as in the sketch below.)
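(A sketch of that manual workaround, assuming the list.migrated_list format above with the POSIX path after the "-- " separator; the protocol node names are placeholders.)

# Pull the POSIX paths out of the ismigrated list, split them into four parts
# without breaking lines, then let each protocol node work through one part.
sed 's/^.*-- //' list.migrated_list > recall-paths.txt
split -d -n l/4 recall-paths.txt recall-part.

i=0
for node in proto-node-1 proto-node-2 proto-node-3 proto-node-4; do
    part=$(printf 'recall-part.%02d' "$i")
    ssh "$node" 'while IFS= read -r f; do
        /usr/lpp/mmfs/bin/mmcloudgateway files recall "$f"
    done' < "$part" &
    i=$((i + 1))
done
wait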
From: gpfsug-discuss <[email protected]> On Behalf Of Amey P Gokhale
Sent: 20 July 2022 07:55
To: [email protected]; gpfsug main discussion list <[email protected]>
Cc: gpfsug-discuss <[email protected]>
Subject: Re: [gpfsug-discuss] mass recall from on-prem COS using a policy

Paul – Recall policy looks to be correct. I see that it is in line with /opt/ibm/MCStore/samples/recallFromCloud.template.

1. At the end of the policy run, what do the stats indicate, such as "X files migrated, Y skipped/error" etc.? I assume the path you have specified has the files in 'non-resident' state, which you are trying to recall using the policy. Correct? I ask because you mentioned around 650G of data to be recalled, but the policy seems to have chosen ~2.5G of data across 4812 files. So if this path is actively used by users, it is likely that the files have been transparently recalled on access and are in co-resident state already, hence no data movement visible.
2. In /var/MCStore/ras/logs/mcstore.log, do you see any recall-specific errors when you run the policy?
3. On our test setup, we are planning to run a similar policy once, to see if the error is reproducible. I will share what we find.

AMEY GOKHALE
Senior Software Engineer – Spectrum Scale
Phone: 91-988 100 8675
E-mail: [email protected]

From: Huzefa H Pancha <[email protected]> On Behalf Of [email protected]
Sent: 19 July 2022 23:42
To: gpfsug main discussion list <[email protected]>; Amey P Gokhale <[email protected]>
Cc: gpfsug-discuss <[email protected]>
Subject: Re: [EXTERNAL] Re: [gpfsug-discuss] mass recall from on-prem COS using a policy

Hi Amey,

Can you provide them guidance from the TCT angle?

Regards, The Spectrum Scale (GPFS) team

------------------------------------------------------------------------------------------------------------------
If you feel that your question can benefit other users of Spectrum Scale (GPFS), then please post it to the public IBM developerWorks Forum at https://www.ibm.com/developerworks/community/forums/html/forum?id=11111111-0000-0000-0000-000000000479.

If your query concerns a potential software error in Spectrum Scale (GPFS) and you have an IBM software maintenance contract, please contact 1-800-237-5511 in the United States or your local IBM Service Center in other countries. The forum is informally monitored as time permits and should not be used for priority messages to the Spectrum Scale (GPFS) team.

From: "Paul Ward" <[email protected]>
To: "gpfsug main discussion list" <[email protected]>
Date: 19-07-2022 09.34 PM
Subject: [EXTERNAL] Re: [gpfsug-discuss] mass recall from on-prem COS using a policy
Sent by: "gpfsug-discuss" <[email protected]>

Thank you.

Ill-placed on ESS or COS? I understood restriping was for NSDs, so that would be on our ESS, not COS? The direction I want to move the files is from COS to ESS.

We do not have AFM enabled; we are using TCT.

From: gpfsug-discuss <[email protected]> On Behalf Of IBM Spectrum Scale
Sent: 18 July 2022 20:35
To: gpfsug main discussion list <[email protected]>; Venkateswara R Puvvada <[email protected]>
Subject: Re: [gpfsug-discuss] mass recall from on-prem COS using a policy

"KB_Ill" shows how much data are ill placed or ill replicated. They can be resolved by mmrestripefs or mmrestripefile.

Copying to AFM team regarding recall in AFM-COS environment.

Regards, The Spectrum Scale (GPFS) team
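(A hedged sketch of acting on that mmrestripe suggestion: the filesystem device name "nhmfsa" is only a guess from the mount path, and it is assumed here that mmrestripefile accepts the same -p "repair placement" option as mmrestripefs. Note that this only re-homes ill-placed blocks between the ESS pools; it does not recall anything from the cloud tier.)

# Whole-filesystem repair of ill-placed files (device name is a placeholder):
mmrestripefs nhmfsa -p

# Or per file, looping over a plain list of POSIX paths
# (for example the recall-paths.txt built in the earlier sketch):
while IFS= read -r f; do
    mmrestripefile -p "$f"
done < recall-paths.txt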
------------------------------------------------------------------------------------------------------------------

From: "Paul Ward" <[email protected]>
To: "[email protected]" <[email protected]>
Date: 07/12/2022 11:40 AM
Subject: [EXTERNAL] [gpfsug-discuss] mass recall from on-prem COS using a policy
Sent by: "gpfsug-discuss" <[email protected]>

Hi all,

I need to recall from on-prem COS a folder with subfolders and files, approximately 4600 files making up 656G. We have a policy that runs every 30 mins, and I added this line to it:

RULE 'migrate_iac-workspace_Chem-labs' MIGRATE FROM POOL 'migrate' TO POOL 'data'
  WHERE PATH_NAME LIKE '/gpfs/nhmfsa/bulk/share/workspace/iac-workspace/Chem_Labs/%'

'migrate' is an external pool:

RULE EXTERNAL POOL 'migrate' EXEC '/usr/lpp/mmfs/bin/mmcloudgateway files' OPTS '-F' ESCAPE '% -'

And 'data' is the default placement pool:

RULE 'Placement' SET POOL 'data'

When it runs it identifies matching files:

 Rule#  Hit_Cnt  KB_Hit   Chosen  KB_Chosen  KB_Ill  Rule
 13     4846     2491152  4846    2491152    7056    RULE 'migrate_iac-workspace_Chem-labs' MIGRATE FROM POOL 'migrate' TO POOL 'data' WHERE(.)

I can't find what 'KB_Ill' means, but nothing is migrating. Migrations in other policies work, but this is a 'recall' policy.

This document suggests it should be invoked as a recall:
https://www.ibm.com/docs/en/spectrum-scale/5.1.3?topic=pools-migrate-recall-external

Any ideas?
Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: [email protected]
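(One hedged way to sanity-check whether the files under that path are actually still in the migrated, non-resident state is to reuse the ismigrated/resident LIST rules quoted earlier in the thread, restricted to the Chem_Labs path; the policy file name and list prefix below are only placeholders.)

# /tmp/ismigrated-chemlabs.pol
define(ismigrated, (MISC_ATTRIBUTES LIKE '%V%'))
RULE 'migrated' LIST 'migrated_list' WHERE ismigrated AND PATH_NAME LIKE '/gpfs/nhmfsa/bulk/share/workspace/iac-workspace/Chem_Labs/%'
RULE 'resident' LIST 'resident_list' WHERE NOT (ismigrated) AND PATH_NAME LIKE '/gpfs/nhmfsa/bulk/share/workspace/iac-workspace/Chem_Labs/%'

# List-only run: the generated list files show how many files are still
# migrated versus already resident/co-resident, without moving any data.
mmapplypolicy /gpfs/nhmfsa/bulk/share/workspace/iac-workspace/Chem_Labs \
    -P /tmp/ismigrated-chemlabs.pol -f /tmp/chemlabs -I defer -L 1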
_______________________________________________
gpfsug-discuss mailing list
gpfsug-discuss at gpfsug.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss_gpfsug.org
