bowenli86 opened a new pull request #8813: [FLINK-12891][hive] remove 
hadoop/hive writable from boundaries of Hive functions and Flink
URL: https://github.com/apache/flink/pull/8813
 
 
   ## What is the purpose of the change
   
   This PR removes hadoop/hive writables from the boundary between Hive functions and Flink, because Flink only deals with Java objects rather than hadoop/hive writables. Data passed from Flink to Hive functions, and from Hive functions back to Flink, will always be simple Java objects.
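   
   As a rough illustration of the Flink-to-Hive direction, here is a minimal, hypothetical sketch (not the actual Flink code; the helper name and the restriction to Java primitive inspectors are assumptions) of a `getConversion`-style conversion that hands plain Java objects to a Hive function without wrapping them into writables:
   
   ```java
   import java.util.function.Function;
   
   import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
   import org.apache.hadoop.hive.serde2.objectinspector.primitive.AbstractPrimitiveJavaObjectInspector;
   
   public final class GetConversionSketch {
   
       // Roughly in the spirit of HiveInspector#getConversion (hypothetical):
       // decide how a Flink-side value is handed to a Hive function for a
       // given ObjectInspector.
       static Function<Object, Object> getConversion(ObjectInspector inspector) {
           if (inspector instanceof AbstractPrimitiveJavaObjectInspector) {
               // Java-flavored inspectors take the object as-is; previously a
               // Writable (e.g. Text, IntWritable) might have been created here.
               return Function.identity();
           }
           // Complex and writable-flavored inspectors would need their own
           // handling; omitted to keep the sketch short.
           throw new UnsupportedOperationException(
                   "sketch only covers Java primitive inspectors: " + inspector.getTypeName());
       }
   
       private GetConversionSketch() {}
   }
   ```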
   
   ## Brief change log
   
   - remove hadoop/hive writable conversions from `getConversion()` and `toFlinkObject()` in `HiveInspector` (see the sketch below for the Hive-to-Flink direction)
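   
   To illustrate the Hive-to-Flink direction, a hedged sketch (again hypothetical, not the actual implementation) of a `toFlinkObject`-style helper; Hive's `PrimitiveObjectInspector#getPrimitiveJavaObject` already yields plain Java values even when a UDF produces writables internally:
   
   ```java
   import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
   import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
   
   public final class ToFlinkObjectSketch {
   
       // Roughly in the spirit of HiveInspector#toFlinkObject (hypothetical):
       // unwrap whatever a Hive UDF returned into a plain Java object before
       // it reaches Flink.
       static Object toFlinkObject(ObjectInspector inspector, Object hiveResult) {
           if (hiveResult == null) {
               return null;
           }
           if (inspector instanceof PrimitiveObjectInspector) {
               // getPrimitiveJavaObject returns e.g. String/Integer/Long even if
               // the UDF produced a Text/IntWritable/LongWritable under the hood.
               return ((PrimitiveObjectInspector) inspector).getPrimitiveJavaObject(hiveResult);
           }
           // Struct/list/map results would be unwrapped recursively in the same
           // spirit; omitted to keep the sketch short.
           throw new UnsupportedOperationException(
                   "sketch only covers primitive results: " + inspector.getTypeName());
       }
   
       private ToFlinkObjectSketch() {}
   }
   ```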
   
   ## Verifying this change
   
   This change is already covered by existing tests, such as `HiveSimpleUDFTest`, `HiveGenericUDFTest`, and `HiveGenericUDTFTest`.
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): (no)
     - The public API, i.e., is any changed class annotated with 
`@Public(Evolving)`: (no)
     - The serializers: (no)
     - The runtime per-record code paths (performance sensitive): (no)
     - Anything that affects deployment or recovery: JobManager (and its 
components), Checkpointing, Yarn/Mesos, ZooKeeper: (no)
     - The S3 file system connector: (no)
   
   ## Documentation
   
     - Does this pull request introduce a new feature? (no)
     - If yes, how is the feature documented? (not applicable)
   
