[ https://issues.apache.org/jira/browse/BEAM-5626?focusedWorklogId=153365&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153365 ]
ASF GitHub Bot logged work on BEAM-5626:
----------------------------------------

                Author: ASF GitHub Bot
            Created on: 11/Oct/18 02:14
            Start Date: 11/Oct/18 02:14
    Worklog Time Spent: 10m
      Work Description: tvalentyn commented on issue #6628: [BEAM-5626] Run more tests in Python 3.
URL: https://github.com/apache/beam/pull/6628#issuecomment-428794469

   BEAM-5626 is solved; that's exactly why I would like to add hadoopfilesystem_test to the test suite. This PR does not skip other tests. I'll update the description.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
-------------------

    Worklog Id: (was: 153365)
    Time Spent: 4h 40m  (was: 4.5h)

> Several IO tests fail in Python 3 with RuntimeError('dictionary changed size during iteration',)
> -------------------------------------------------------------------------------------------------
>
>                 Key: BEAM-5626
>                 URL: https://issues.apache.org/jira/browse/BEAM-5626
>             Project: Beam
>          Issue Type: Sub-task
>          Components: sdk-py-core
>            Reporter: Valentyn Tymofieiev
>            Assignee: Ruoyun Huang
>            Priority: Major
>             Fix For: 2.8.0
>
>          Time Spent: 4h 40m
>  Remaining Estimate: 0h
>
> ERROR: test_delete_dir (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/apache_beam/io/hadoopfilesystem_test.py", line 506, in test_delete_dir
>     self.fs.delete([url_t1])
>   File "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/apache_beam/io/hadoopfilesystem.py", line 370, in delete
>     raise BeamIOError("Delete operation failed", exceptions)
> apache_beam.io.filesystem.BeamIOError: Delete operation failed with exceptions {'hdfs://test_dir/new_dir1': RuntimeError('dictionary changed size during iteration', )}

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
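For context, the `RuntimeError('dictionary changed size during iteration')` in the traceback is a generic Python 3 pitfall: `dict.keys()` returns a live view, so mutating the dict while looping over it raises at the next iteration step. The sketch below is not Beam's actual `delete` code, just a minimal illustration of the failure mode and the usual fix (snapshotting the keys with `list()` before mutating):

```python
def delete_children_buggy(files, prefix):
    # In Python 3 this raises RuntimeError('dictionary changed size
    # during iteration') once an entry is deleted mid-loop, because
    # files.keys() is a live view over the dict.
    for path in files.keys():
        if path.startswith(prefix):
            del files[path]


def delete_children_fixed(files, prefix):
    # list() takes a snapshot of the keys, so deleting entries while
    # looping is safe.
    for path in list(files.keys()):
        if path.startswith(prefix):
            del files[path]


fs = {'hdfs://test_dir/new_dir1/f1': b'', 'hdfs://test_dir/new_dir1/f2': b'',
      'hdfs://other/f3': b''}
delete_children_fixed(fs, 'hdfs://test_dir/new_dir1')
print(sorted(fs))  # ['hdfs://other/f3']
```

The same loop works by accident in Python 2 (where `dict.keys()` already returns a fresh list), which is why bugs like this only surface when a test suite first runs under Python 3.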