## Functions vs Suppliers ##

I am a big IDE user myself (currently IntelliJ) and I rely on 
hints/autocompletion all the time, so I certainly see the value in that. However, 
in this particular case I would probably prefer functions over suppliers: in the 
type hints I see a parameter name and its type, and if I see something like this 
for Scala:

```
transform[K1, V1](transformerSupplier: () => Transformer[K, V, (K1, V1)], ....)
```
I immediately understand that I need to provide a function that returns an 
actual transformer.

On the other hand if I see a Java-ish type hint:

```
transform[K1, V1](transformerSupplier: TransformerSupplier[K, V, KeyValue[K1, V1]], ......)
```
It's not immediately clear what I am supposed to put there. I know the standard 
Java Supplier<T> interface, but this is not it, so I would have to Ctrl-click on 
the call to go to the definition of .transform(), Ctrl-click again to go to the 
definition of TransformerSupplier, figure out that I can use a function in its 
place, and only then write that function.

I honestly don't have the habit of typing new TransformerSupplier() and seeing 
where it leads me. I tried it, and it does work; in the end the IDE even offers 
to replace the `new TransformerSupplier()...` with a lambda expression, so it 
works out, but you still have to wade through quite a bit of code to get there.
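
For reference, here is roughly what that route looks like against a transform 
that takes a supplier (a sketch, not code from the PR: `JavaWordCountTransformer` 
is a hypothetical KeyValue-returning transformer, and `stream`/`storeName` are 
placeholders):

```scala
import org.apache.kafka.streams.KeyValue
import org.apache.kafka.streams.kstream.{Transformer, TransformerSupplier}

// What I end up typing first: an anonymous TransformerSupplier whose single
// method get() builds the transformer.
stream.transform(
  new TransformerSupplier[String, String, KeyValue[String, Int]] {
    override def get(): Transformer[String, String, KeyValue[String, Int]] =
      new JavaWordCountTransformer() // hypothetical KeyValue-returning transformer
  },
  storeName)

// What the IDE later offers to collapse it into, relying on Scala 2.12 SAM
// conversion (the explicit expected type keeps the conversion unambiguous):
val supplier: TransformerSupplier[String, String, KeyValue[String, Int]] =
  () => new JavaWordCountTransformer()
stream.transform(supplier, storeName)
```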

## Anecdotal observations about KeyValue vs tuples ##

The Scala developers that I work with or know personally mostly hate Java, and 
having to use the KeyValue class would cause a huge amount of rolled eyes. I 
can't speak for everyone, of course; these are just my observations.
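
Just to make concrete what the eye-rolling is about, the per-record difference 
is roughly this (a trivial sketch, not code from the PR):

```scala
import org.apache.kafka.streams.KeyValue

// What a Java-flavoured transform returns for each record:
val javaStyle: KeyValue[String, Int] = KeyValue.pair("word", 1)

// What the Scala-flavoured transform in this pull request returns instead:
val scalaStyle: (String, Int) = ("word", 1)
```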

## Implicit conversions for TransformerSupplier in practice ##

Sorry, I forgot to mention something important in my previous message. If you 
use implicit conversions like the ones I showed there, my IDE (IntelliJ) can't 
figure out all the necessary type information and gets confused.
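
For context, the conversions I mean are roughly of this shape (a rough sketch of 
the idea from my previous message, not the exact code; the name is illustrative):

```scala
import scala.language.implicitConversions

import org.apache.kafka.streams.KeyValue
import org.apache.kafka.streams.kstream.{Transformer, TransformerSupplier}
import org.apache.kafka.streams.processor.ProcessorContext

// Convert a Scala function returning a tuple-based Transformer into the
// KeyValue-based TransformerSupplier that the Java API expects.
implicit def functionToTransformerSupplier[K, V, K1, V1](
    f: () => Transformer[K, V, (K1, V1)]): TransformerSupplier[K, V, KeyValue[K1, V1]] =
  new TransformerSupplier[K, V, KeyValue[K1, V1]] {
    override def get(): Transformer[K, V, KeyValue[K1, V1]] = {
      val inner = f()
      new Transformer[K, V, KeyValue[K1, V1]] {
        override def init(context: ProcessorContext): Unit = inner.init(context)
        override def transform(key: K, value: V): KeyValue[K1, V1] = {
          val (k1, v1) = inner.transform(key, value)
          KeyValue.pair(k1, v1)
        }
        override def close(): Unit = inner.close()
      }
    }
  }
```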

That is, with the implicit conversions in scope, the following code compiles just fine:

```scala
builder
      .stream[String, String]("words-part")
      .transform(() => new WordCountTransformer(), storeName)
      .mapValues(v => v.toString)
      .to("word-count")
```
But IntelliJ can't figure the types out and shows an error. To make it accept 
the code (which IDE users really do want), you have to add explicit type 
information:

```scala
builder
      .stream[String, String]("words-part")
      .transform[String, Int](() => new WordCountTransformer(), storeName)
      .mapValues(v => v.toString)
      .to("word-count")
```

Where WordCountTransformer is defined like this:

```scala
import org.apache.kafka.streams.kstream.Transformer
import org.apache.kafka.streams.processor.ProcessorContext
import org.apache.kafka.streams.state.KeyValueStore

class WordCountTransformer extends Transformer[String, String, (String, Int)] {

  var context: ProcessorContext = _
  var wordCountStore: KeyValueStore[String, Int] = _

  override def init(context: ProcessorContext): Unit = {
    this.context = context
    // Look up the state store that was registered under "word-count-store".
    wordCountStore = context
      .getStateStore("word-count-store")
      .asInstanceOf[KeyValueStore[String, Int]]
  }

  override def transform(dummyKey: String, word: String): (String, Int) = {
    // Increment the running count for this word and emit it as a tuple.
    val oldValue = Option(wordCountStore.get(word)).getOrElse(0)
    val newValue = oldValue + 1
    wordCountStore.put(word, newValue)
    (word, newValue)
  }

  override def close(): Unit = {}
}
```

I tried messing with this code, but I can't figure out a way to change my 
implicit conversions so that the code both compiles and keeps IntelliJ happy 
(I am on IntelliJ 2018.2). Some people (who use simple editors and don't rely 
on their IDEs) probably don't care about this point, but I would certainly be 
very sad if I had to add extra, unnecessary type declarations just to make my 
editor happy.

## What I would do about this ##
When I change an existing library, I try to write the code in a way that fits 
with what is already there, and that is what I tried to do this time. I do 
believe that this pull request matches the style/philosophy of the existing 
kafka-streams-scala (and of the previous Lightbend version as well). I 
understand that you may want to take this library in a different direction; in 
that case this pull request wouldn't make much sense. On the other hand, giving 
users a uniform feel in that direction would require a big rewrite touching 
many existing functions, because having some methods take plain functions and 
other methods take suppliers feels very inconsistent.

I see your point as well, and I would be fine with those changes, i.e. the API 
accepting suppliers, with implicits converting Scala functions to suppliers and 
tuples to KeyValue (even though it would take a big rewrite to give the library 
a consistent feel). However, it seems that it would make life more difficult 
for IDE users (showing errors where there are none).

Personally, I feel that this pull request stays consistent with the existing 
API without being a big rewrite and without compromising usability for IDE 
users.

If you want, I can put a simple demo project on GitHub where you can play with 
these implicit conversions and see how the IDE reacts.

## Next steps ## 
I guess it's your decision, as the author, which direction you want to take 
this library. Please take my observations, and usability for other IDE users, 
into consideration.
