[ https://issues.apache.org/jira/browse/PIG-737?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12695980#action_12695980 ]

Daniel Dai commented on PIG-737:
--------------------------------

I compared the unit test times on Unix and Cygwin. This is not a performance 
comparison between Unix and Cygwin as such, because I used different machines 
for the Unix and Cygwin runs; rather, I was trying to find out whether any 
particular unit test is significantly slower. All unit tests which use 
"MiniCluster" (marked in red in the table below) are consistently slower under 
Cygwin in my test. For the long unit tests (>100s), the slowdown factor ranges 
from about 1.29 to 2.23. Here is the list:

||Test Case||Test Time in Unix (s)||Test Time in Cygwin (s)||
|TestAdd|0.116|0.25|
|{color:red}TestAlgebraicEval{color}|350.899|451.75| 
|TestAlgebraicEvalLocal|23.428|15.531|
|{color:red}TestBZip{color}|35.987|46.391|
|{color:red}TestBestFitCast{color}|211.681|409|
|TestBinaryStorage|1.135|4.062|
|TestBoolean|0.212|0.141|
|TestBuiltin|1.111|0.672|
|TestCmdLineParser|0.064|0.047|
|{color:red}TestCombiner{color}|144.184|222.625|
|{color:red}TestCompressedFiles{color}|24.744|50.359|
|TestConstExpr|0.115|0.063|
|TestConversions|0.385|0.25|
|{color:red}TestCustomSlicer{color}|953.088|211.719|
|TestDataBag|1.329|1.625|
|{color:red}TestDataBagAccess{color}|138.525|261.281|
|TestDataModel|0.264|0.281|
|TestDeleteOnFail|0.072|0.172|
|TestDivide|0.093|0.031|
|TestEqualTo|0.261|0.141|
|{color:red}TestEvalPipeline{color}|608.986|1,089.17|
|{color:red}TestEvalPipeline2{color}|64.247|127.532|
|TestEvalPipelineLocal|5.351|4.422|
|{color:red}TestExampleGenerator{color}|3.169|5.578|
|{color:red}TestFRJoin{color}|413.982|667.406|
|TestFilter|0.323|0.156|
|{color:red}TestFilterOpNumeric{color}|66.538|145.922|
|{color:red}TestFilterOpString{color}|54.282|105.844|
|{color:red}TestFilterUDF{color}|17.943|40.14|
|TestFinish|16.55|25.015|
|TestForEach|0.28|0.172|
|TestForEachNestedPlan|20.252|23.312|
|TestForEachNestedPlanLocal|0.591|0.562|
|TestFuncSpec|0.225|0.141|
|TestGTOrEqual|0.264|0.156|
|TestGreaterThan|0.264|0.188|
|TestGrunt|3.083|2.343|
|TestImplicitSplit|1.427|3.453|
|{color:red}TestInfixArithmetic{color}|76.284|146.781|
|{color:red}TestInputOutputFileValidator{color}|0.717|1.984|
|TestInstantiateFunc|0.055|0.032|
|{color:red}TestJobSubmission{color}|6.646|4.641|
|TestKeyTypeDiscoveryVisitor|55.592|88.703|
|TestLTOrEqual|0.264|0.171|
|TestLessThan|0.263|0.171|
|TestLoad|0.286|0.219|
|TestLocal|1.698|2.172|
|TestLocal2|0.713|0.609|
|TestLocalJobSubmission|7.925|8.438|
|TestLocalPOSplit|0.72|1.5|
|TestLocalRearrange|0.27|0.156|
|TestLogToPhyCompiler|1.13|1.719|
|TestLogicalOptimizer|1.519|2.093|
|TestLogicalPlanBuilder|1.804|2.203|
|TestMRCompiler|0.79|0.625|
|{color:red}TestMapReduce{color}|238.474|405.843|
|TestMapReduce2|40.298|46.078|
|TestMod|0.062|0.047|
|TestMultiply|0.062|0.031|
|TestNotEqualTo|0.254|0.188|
|TestNull|0.274|0.172|
|{color:red}TestNullConstant{color}|85.932|172.828|
|TestOperatorPlan|0.236|0.187|
|TestPOBinCond|0.114|0.078|
|TestPOCast|0.302|0.235|
|TestPOCogroup|0.251|0.156|
|TestPOCross|0.227|0.125|
|TestPODistinct|0.079|0.046|
|TestPOGenerate|0.093|0.046|
|TestPOMapLookUp|0.204|0.141|
|{color:red}TestPONegative{color}|9.339|22.531|
|TestPOSort|0.312|0.235|
|TestPOUserFunc|0.291|0.265|
|TestPackage|107.555|86.578|
|TestParamSubPreproc|0.586|2.734|
|{color:red}TestParser{color}|56.527|18.078|
|TestPhyOp|0.261|0.141|
|TestPigContext|1.265|1.922|
|TestPigScriptParser|0.411|0.391|
|{color:red}TestPigServer{color}|3.56|1.391|
|TestPigSplit|0.988|1.672|
|TestProject|0.283|0.188|
|TestRegexp|0.189|0.11|
|TestSchema|0.219|0.156|
|TestSchemaParser|0.323|0.219|
|{color:red}TestSplitStore{color}|363.749|562.797|
|TestStore|0.459|0.313|
|{color:red}TestStoreOld{color}|115.725|212.735|
|TestStreaming|2.017|14.344|
|TestStreamingLocal|1.964|12.875|
|TestSubtract|0.062|0.031|
|TestTypeChecking|1.264|1.781|
|TestTypeCheckingValidator|6.584|5.625|
|TestTypeCheckingValidatorNoSchema|0.408|0.25|
|{color:red}TestUnion{color}|43.671|75.234|

Here is a decomposed breakdown for one of the test cases, TestDataBagAccess:

||Test||Test Time in Unix (s)||Test Time in Cygwin (s)||
|testSingleTupleBagAcess|0.261|0.188|
|testNonSpillableDataBag|0.14|0.093|
|testBagConstantAccess|17.773|23.375|
|testBagConstantAccessFailure|0.194|0.438|
|testBagConstantFlatten1|15.794|17.343|
|testBagConstantFlatten2|25.736|35.594|
|testBagStoreLoad|157.403|239.734|

Based on these data, no single long unit test is disproportionately slow. The 
slowdown factor is relatively stable considering the diversity of the code 
being tested. So I think what we need to address is a general performance 
problem under Cygwin rather than any particular unit test. Does anyone see any 
exceptions on other machines?

> A few unit tests take a long time to run on Windows
> ----------------------------------------------------
>
>                 Key: PIG-737
>                 URL: https://issues.apache.org/jira/browse/PIG-737
>             Project: Pig
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 0.2.0
>         Environment: Windows
>            Reporter: Santhosh Srinivasan
>             Fix For: 0.3.0
>
>
> A few unit tests take a long time to run on Windows. This problem has to be 
> diagnosed.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
