[ https://issues.apache.org/jira/browse/SPARK-37972?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Maciej Szymkiewicz reassigned SPARK-37972:
------------------------------------------

    Assignee: Maciej Szymkiewicz

> Typing incompatibilities with numpy==1.22.x
> -------------------------------------------
>
>                 Key: SPARK-37972
>                 URL: https://issues.apache.org/jira/browse/SPARK-37972
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib, PySpark
>    Affects Versions: 3.3.0
>            Reporter: Maciej Szymkiewicz
>            Assignee: Maciej Szymkiewicz
>            Priority: Minor
>
> When type-checked against {{numpy==1.22}}, mypy detects the following issues:
> {code:python}
> python/pyspark/mllib/linalg/__init__.py:412: error: Argument 2 to "norm" has incompatible type "Union[float, str]"; expected "Union[None, float, Literal['fro'], Literal['nuc']]"  [arg-type]
> python/pyspark/mllib/linalg/__init__.py:457: error: No overload variant of "dot" matches argument types "ndarray[Any, Any]", "Iterable[float]"  [call-overload]
> python/pyspark/mllib/linalg/__init__.py:457: note: Possible overload variant:
> python/pyspark/mllib/linalg/__init__.py:457: note:     def dot(a: Union[_SupportsArray[dtype[Any]], _NestedSequence[_SupportsArray[dtype[Any]]], bool, int, float, complex, str, bytes, _NestedSequence[Union[bool, int, float, complex, str, bytes]]], b: Union[_SupportsArray[dtype[Any]], _NestedSequence[_SupportsArray[dtype[Any]]], bool, int, float, complex, str, bytes, _NestedSequence[Union[bool, int, float, complex, str, bytes]]], out: None = ...) -> Any
> python/pyspark/mllib/linalg/__init__.py:457: note: <1 more non-matching overload not shown>
> python/pyspark/mllib/linalg/__init__.py:707: error: Argument 2 to "norm" has incompatible type "Union[float, str]"; expected "Union[None, float, Literal['fro'], Literal['nuc']]"  [arg-type]
> {code}

--
This message was sent by Atlassian Jira
(v8.20.1#820001)