[GitHub] flink issue #3406: [flink-5568] [Table API & SQL]Introduce interface for cat...

2017-03-18 Thread KurtYoung
Github user KurtYoung commented on the issue:

https://github.com/apache/flink/pull/3406
  
Seems good to merge now, merging...




[GitHub] flink issue #3406: [flink-5568] [Table API & SQL]Introduce interface for cat...

2017-03-16 Thread beyond1920
Github user beyond1920 commented on the issue:

https://github.com/apache/flink/pull/3406
  
@twalthr, thanks for the review. I have updated the PR based on your suggestions. I will add documentation later in PR #3409. Regarding the name of ExternalCatalog: there are currently three kinds of catalogs in Flink, namely the Calcite catalog, the function catalog, and the external catalog, and I added the 'external' prefix to distinguish them.
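For illustration, a minimal Scala sketch of what an external catalog interface in this spirit could look like; the method names, types, and the `ExternalCatalogTable` holder here are assumptions, not the exact signatures of this PR:

```scala
// Sketch only: an external catalog exposes table metadata from an external
// system (e.g. Hive or Kafka) to the Table API / SQL layer, which is what
// distinguishes it from the Calcite catalog and the function catalog.
trait ExternalCatalog {
  // metadata of a single table registered in this catalog
  def getTable(tableName: String): ExternalCatalogTable

  // names of all tables known to this catalog
  def listTables(): Seq[String]
}

// Hypothetical metadata holder: the table type (e.g. "csv", "kafka"),
// a simple schema, and connector-specific properties.
case class ExternalCatalogTable(
    tableType: String,
    schema: Map[String, String],
    properties: Map[String, String])
```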




[GitHub] flink issue #3406: [flink-5568] [Table API & SQL]Introduce interface for cat...

2017-03-16 Thread fhueske
Github user fhueske commented on the issue:

https://github.com/apache/flink/pull/3406
  
Good point about the documentation. I think this should be added with PR 
#3409 when registering external catalogs is exposed via the `TableEnvironment`.
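To make the intended usage concrete, a hedged sketch of how registering an external catalog via the `TableEnvironment` could look once PR #3409 lands; the package paths and the `registerExternalCatalog` method name are assumptions here:

```scala
import org.apache.flink.api.scala.ExecutionEnvironment
import org.apache.flink.table.api.TableEnvironment
import org.apache.flink.table.catalog.ExternalCatalog

object RegisterCatalogExample {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val tableEnv = TableEnvironment.getTableEnvironment(env)

    // placeholder for any ExternalCatalog implementation,
    // e.g. a Hive-backed catalog or an in-memory catalog for tests
    val catalog: ExternalCatalog = ???

    // assumed registration method exposed by PR #3409
    tableEnv.registerExternalCatalog("hive", catalog)

    // tables could then be referenced with a catalog-qualified name,
    // e.g. tableEnv.sql("SELECT * FROM hive.db1.orders")
  }
}
```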




[GitHub] flink issue #3406: [flink-5568] [Table API & SQL]Introduce interface for cat...

2017-03-16 Thread twalthr
Github user twalthr commented on the issue:

https://github.com/apache/flink/pull/3406
  
Not to forget: we also need some good documentation for the website.




[GitHub] flink issue #3406: [flink-5568] [Table API & SQL]Introduce interface for cat...

2017-03-14 Thread fhueske
Github user fhueske commented on the issue:

https://github.com/apache/flink/pull/3406
  
Thanks for the update @beyond1920.
The PR looks good to me. 
@twalthr do you also want to have a look at this PR?




[GitHub] flink issue #3406: [flink-5568] [Table API & SQL]Introduce interface for cat...

2017-03-13 Thread beyond1920
Github user beyond1920 commented on the issue:

https://github.com/apache/flink/pull/3406
  
@fhueske, thanks for your review. I have already updated the PR based on your comments.




[GitHub] flink issue #3406: [flink-5568] [Table API & SQL]Introduce interface for cat...

2017-03-11 Thread beyond1920
Github user beyond1920 commented on the issue:

https://github.com/apache/flink/pull/3406
  
@fhueske, thanks for your review. I updated the PR based on your comments.
Your suggestion about moving the annotation from TableSource to TableSourceConverter is a good one; I changed the PR accordingly.
Regarding not using scanning at all and instead specifying the converter classes explicitly: that could work, too. However, if somebody adds a new TableSourceConverter, such as a ParquetTableSourceConverter, users would need to change code or a config file to register the newly added converter, right? The scanning approach finds all converters automatically, as sketched below.
Let me know what your opinion is, thanks.
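To make the trade-off concrete, a hedged sketch of the two styles; the annotation, trait, and registry names below are illustrative assumptions, not the code in this PR:

```scala
// Annotation-driven discovery: the converter (not the TableSource) carries a
// table-type annotation. In practice the annotation would need to be a Java
// annotation with RUNTIME retention so reflection-based classpath scanning
// can see it, e.g.:
//
//   @TableType("csv")
//   class CsvTableSourceConverter extends TableSourceConverter[CsvTableSource]
//
// A scanner then discovers every annotated converter automatically, so a new
// (hypothetical) ParquetTableSourceConverter shipped by a connector module
// needs no further registration step.

// Explicit registration: the alternative is a registry that has to be updated
// by hand (in code or a config file) whenever a converter is added.
class ConverterRegistry {
  private var converters = Map.empty[String, Class[_]]

  def register(tableType: String, converterClass: Class[_]): Unit =
    converters += tableType -> converterClass

  def lookup(tableType: String): Option[Class[_]] = converters.get(tableType)
}
```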





[GitHub] flink issue #3406: [flink-5568] [Table API & SQL]Introduce interface for cat...

2017-03-10 Thread fhueske
Github user fhueske commented on the issue:

https://github.com/apache/flink/pull/3406
  
Hi @KurtYoung, you are right. Only `requiredProperties()` would be needed to verify properties.

I thought the other two methods would be a good way to define the parameters of the converter. They could be used to print a usage message or details when the properties do not match. We can also leave them out if you think the implementation overhead does not justify the gains.
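A small hedged sketch of how `requiredProperties()` could drive both validation and a usage-style message; the trait shape and helper below are assumptions for illustration:

```scala
// Sketch only: check a converter's required properties against the properties
// stored for an external catalog table, and report what is missing.
trait PropertyAware {
  def requiredProperties: Array[String]
}

def validateProperties(converter: PropertyAware, tableProps: Map[String, String]): Unit = {
  val missing = converter.requiredProperties.filterNot(tableProps.contains)
  if (missing.nonEmpty) {
    // the kind of "usage message or details" mentioned above
    throw new IllegalArgumentException(
      s"Missing properties: ${missing.mkString(", ")}. " +
        s"Required properties: ${converter.requiredProperties.mkString(", ")}")
  }
}
```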




[GitHub] flink issue #3406: [flink-5568] [Table API & SQL]Introduce interface for cat...

2017-03-09 Thread KurtYoung
Github user KurtYoung commented on the issue:

https://github.com/apache/flink/pull/3406
  
Hi @fhueske, I like your proposal of moving the annotation from `TableSource` to `TableSourceConverter`. Let's do it this way.
BTW, I noticed that you suggested three possible methods for the `TableSourceConverter`; for now I can only see `def requiredProperties: Array[String]` being necessary. It can help validate the converter and decide which converter we should use when multiple converters have the same `TableType`.
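For illustration, choosing among several converters that declare the same table type could look roughly like this; the case class and field names are assumptions:

```scala
// Sketch only: when several converters declare the same table type, pick the
// one whose required properties are all present in the table's properties.
case class ConverterCandidate(tableType: String, requiredProperties: Array[String])

def chooseConverter(
    candidates: Seq[ConverterCandidate],
    tableType: String,
    tableProps: Map[String, String]): Option[ConverterCandidate] =
  candidates
    .filter(_.tableType == tableType)
    .find(_.requiredProperties.forall(tableProps.contains))
```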





[GitHub] flink issue #3406: [flink-5568] [Table API & SQL]Introduce interface for cat...

2017-03-06 Thread beyond1920
Github user beyond1920 commented on the issue:

https://github.com/apache/flink/pull/3406
  
@fhueske, thanks for your review. I changed the PR based on your suggestions, except for one point.
Regarding adding a version field to ExternalCatalogCompatible: could we instead define tableType as an identifier that includes the version information? For example, kafka0.8 / kafka0.9 / kafka1.0, or hive1.0 / hive2.0.
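As a concrete illustration of the versioned-identifier idea (the parsing convention below is an assumption, not part of this PR):

```scala
// Sketch only: encode the version in the tableType identifier instead of a
// separate version field, e.g. "kafka0.8", "kafka0.9", "hive2.0".
def parseTableType(tableType: String): (String, Option[String]) = {
  val idx = tableType.indexWhere(_.isDigit)
  if (idx < 0) (tableType, None)
  else (tableType.substring(0, idx), Some(tableType.substring(idx)))
}

// parseTableType("kafka0.8") == ("kafka", Some("0.8"))
// parseTableType("hive2.0")  == ("hive",  Some("2.0"))
// parseTableType("csv")      == ("csv",   None)
```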

