plainolneesh commented on pull request #338:
URL: https://github.com/apache/fluo-muchos/pull/338#issuecomment-655522990


   @arvindshmicrosoft while troubleshooting my issues, I ran into a problem with the cibuild script: it fails with an ansible-lint error. 
   
   ```
   /git/fluo_testing/fluo-muchos(master) » ./scripts/cibuild
   stark_enterprise@pop-os
   Running nose tests...
   ........
   ----------------------------------------------------------------------
   Ran 8 tests in 0.029s
   
   OK
   SUCCESS: All Nose tests completed.
   Running Ansible-lint...
   Couldn't parse task at 
/home/stark_enterprise/git/fluo_testing/fluo-muchos/ansible/roles/azure/tasks/log_analytics_ws_common.yml:40
 (no action detected in task. This often indicates a misspelled module name, or 
incorrect module path.
   
   The error appears to be in '<unicode string>': line 40, column 3, but may
   be elsewhere in the file depending on the exact syntax problem.
   
   (could not open file to display line))
   { 'azure_rm_resource_info': { '__file__': 
'/home/stark_enterprise/git/fluo_testing/fluo-muchos/ansible/roles/azure/tasks/log_analytics_ws_common.yml',
                                 '__line__': 42,
                                 'resource_group': '{{ resource_group }}',
                                 'resource_type': 'resources'},
     'name': 'Query all the resources in the resource group',
     'register': 'rgfacts',
     'skipped_rules': []}
   ```
   
   This happens both with my own changes and with a fresh clone of the current version of 
fluo-muchos. Can you please assist me? This is now a blocker.
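
   For reference, the task that ansible-lint flags can be reconstructed from the parsed dump above; the module name, parameters, and register variable come from that output, while the indentation and surrounding file context are assumptions on my part:

   ```yaml
   # Reconstructed sketch of the task around log_analytics_ws_common.yml:40
   # (fields taken from the ansible-lint dump; exact layout is assumed)
   - name: Query all the resources in the resource group
     azure_rm_resource_info:
       resource_group: "{{ resource_group }}"
       resource_type: resources
     register: rgfacts
   ```

   Since the lint message says "no action detected in task", one possible cause (just a guess) is that the environment running ansible-lint cannot resolve the `azure_rm_resource_info` module, e.g. if the `azure.azcollection` collection is not installed there or the fully-qualified name is expected.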


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
