Github user ksonj commented on the pull request:

    https://github.com/apache/spark/pull/6057#issuecomment-101166533
  
    The particular bug from above lies [here](https://github.com/apache/spark/blob/master/python/pyspark/sql/context.py#L304).
    
    Another solution I can think of is to replace
    
        rdd = rdd.map(tuple)
    
    with 
    
        import operator  # if not already imported in context.py

        getter = operator.attrgetter(*[field.name for field in schema.fields])
        rdd = rdd.map(getter)
    
    This seems to return the date fields correctly.
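
    For illustration, here is a minimal standalone sketch of what the attrgetter version does, using a plain namedtuple as a stand-in for a pyspark Row and a hand-written field-name list in place of schema.fields:

        import datetime
        import operator
        from collections import namedtuple

        # Stand-in for a pyspark Row with a date column; the real code would
        # take the names from schema.fields on the StructType.
        Row = namedtuple("Row", ["name", "born"])
        field_names = ["name", "born"]

        rows = [Row(name="Alice", born=datetime.date(1987, 5, 11))]

        # attrgetter looks the fields up by name, in schema order, and returns
        # them as a plain tuple, so the datetime.date value comes through as-is.
        getter = operator.attrgetter(*field_names)
        print([getter(r) for r in rows])
        # [('Alice', datetime.date(1987, 5, 11))]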
    
    Both my suggestions feel a bit dirty to me, though, and maybe this is 
something that should be fixed in Pyrolite itself. There's an open issue about 
this here: https://github.com/irmen/Pyrolite/issues/28

