Github user rxin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5238#discussion_r27350987
  
    --- Diff: python/pyspark/sql/context.py ---
    @@ -173,31 +173,8 @@ def _inferSchema(self, rdd, samplingRatio=None):
             return schema
     
         def inferSchema(self, rdd, samplingRatio=None):
    -        """Infer and apply a schema to an RDD of L{Row}.
    -
    -        ::note:
    -            Deprecated in 1.3, use :func:`createDataFrame` instead
    -
    -        When samplingRatio is specified, the schema is inferred by looking
    -        at the types of each row in the sampled dataset. Otherwise, the
    -        first 100 rows of the RDD are inspected. Nested collections are
    -        supported, which can include array, dict, list, Row, tuple,
    -        namedtuple, or object.
    -
    -        Each row could be L{pyspark.sql.Row} object or namedtuple or objects.
    -        Using top level dicts is deprecated, as dict is used to represent Maps.
    -
    -        If a single column has multiple distinct inferred types, it may cause
    -        runtime exceptions.
    -
    -        >>> rdd = sc.parallelize(
    -        ...     [Row(field1=1, field2="row1"),
    -        ...      Row(field1=2, field2="row2"),
    -        ...      Row(field1=3, field2="row3")])
    -        >>> df = sqlCtx.inferSchema(rdd)
    -        >>> df.collect()[0]
    -        Row(field1=1, field2=u'row1')
    -        """
    +        """DEPRECATED: use :func:`createDataFrame` instead"""
    +        warnings.warn("Use createDataFrame instead of inferSchema.", DeprecationWarning)
    --- End diff ---
    
    I updated it.
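
    For anyone landing here from the new warning, a minimal migration sketch, reusing the doctest this diff removes (as there, it assumes a live SparkContext sc and SQLContext sqlCtx):

        from pyspark.sql import Row

        rdd = sc.parallelize(
            [Row(field1=1, field2="row1"),
             Row(field1=2, field2="row2"),
             Row(field1=3, field2="row3")])

        # Deprecated in 1.3:
        # df = sqlCtx.inferSchema(rdd)

        # Replacement:
        df = sqlCtx.createDataFrame(rdd)
        df.collect()[0]  # Row(field1=1, field2=u'row1')

    One caveat: CPython ignores DeprecationWarning by default (since 2.7), so callers may not see the new message unless they opt in, e.g. with warnings.simplefilter("default", DeprecationWarning).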

