Request for permissions to contribute to Apache Kafka

2021-08-13 Thread Christo NUKK
Hello,
Wiki ID: christo_lolov
Jira ID: christo_lolov
Could you please grant me the necessary permissions to contribute to Apache Kafka? I don't know whether the above are the correct identifiers, but they were the only ones I could find.
Best wishes,
Christo

Build failed in Jenkins: Kafka » Kafka Branch Builder » trunk #407

2021-08-13 Thread Apache Jenkins Server
See 


Changes:


--
[...truncated 487049 lines...]
[2021-08-14T00:34:14.009Z] > Task :streams:copyDependantLibs UP-TO-DATE
[2021-08-14T00:34:14.009Z] > Task :streams:jar UP-TO-DATE
[2021-08-14T00:34:14.009Z] > Task :streams:test-utils:compileJava UP-TO-DATE
[2021-08-14T00:34:14.009Z] > Task 
:streams:generateMetadataFileForMavenJavaPublication
[2021-08-14T00:34:14.009Z] > Task :core:compileTestScala UP-TO-DATE
[2021-08-14T00:34:14.009Z] > Task :core:testClasses UP-TO-DATE
[2021-08-14T00:34:18.850Z] > Task :connect:api:javadoc
[2021-08-14T00:34:18.850Z] > Task :connect:api:copyDependantLibs UP-TO-DATE
[2021-08-14T00:34:18.850Z] > Task :connect:api:jar UP-TO-DATE
[2021-08-14T00:34:18.850Z] > Task 
:connect:api:generateMetadataFileForMavenJavaPublication
[2021-08-14T00:34:18.850Z] > Task :connect:json:copyDependantLibs UP-TO-DATE
[2021-08-14T00:34:18.850Z] > Task :connect:json:jar UP-TO-DATE
[2021-08-14T00:34:18.850Z] > Task 
:connect:json:generateMetadataFileForMavenJavaPublication
[2021-08-14T00:34:18.850Z] > Task 
:connect:json:publishMavenJavaPublicationToMavenLocal
[2021-08-14T00:34:18.850Z] > Task :connect:json:publishToMavenLocal
[2021-08-14T00:34:18.850Z] > Task :connect:api:javadocJar
[2021-08-14T00:34:18.850Z] > Task :connect:api:compileTestJava UP-TO-DATE
[2021-08-14T00:34:18.850Z] > Task :connect:api:testClasses UP-TO-DATE
[2021-08-14T00:34:18.850Z] > Task :connect:api:testJar
[2021-08-14T00:34:18.850Z] > Task :connect:api:testSrcJar
[2021-08-14T00:34:18.850Z] > Task 
:connect:api:publishMavenJavaPublicationToMavenLocal
[2021-08-14T00:34:18.850Z] > Task :connect:api:publishToMavenLocal
[2021-08-14T00:34:19.381Z] [Checks API] No suitable checks publisher found.
[Pipeline] echo
[2021-08-14T00:34:19.382Z] Skipping Kafka Streams archetype test for Java 11
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // timestamps
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[2021-08-14T00:34:22.084Z] > Task :streams:javadoc
[2021-08-14T00:34:22.084Z] > Task :streams:javadocJar
[2021-08-14T00:34:23.282Z] > Task :streams:compileTestJava UP-TO-DATE
[2021-08-14T00:34:23.282Z] > Task :streams:testClasses UP-TO-DATE
[2021-08-14T00:34:23.282Z] > Task :streams:testJar
[2021-08-14T00:34:24.414Z] > Task :streams:testSrcJar
[2021-08-14T00:34:24.414Z] > Task 
:streams:publishMavenJavaPublicationToMavenLocal
[2021-08-14T00:34:24.414Z] > Task :streams:publishToMavenLocal
[2021-08-14T00:34:25.345Z] > Task :clients:javadoc
[2021-08-14T00:34:26.362Z] > Task :clients:javadocJar
[2021-08-14T00:34:27.596Z] 
[2021-08-14T00:34:27.596Z] > Task :clients:srcJar
[2021-08-14T00:34:27.596Z] Execution optimizations have been disabled for task 
':clients:srcJar' to ensure correctness due to the following reasons:
[2021-08-14T00:34:27.596Z]   - Gradle detected a problem with the following 
location: 
'/home/jenkins/jenkins-agent/workspace/Kafka_kafka_trunk/clients/src/generated/java'.
 Reason: Task ':clients:srcJar' uses this output of task 
':clients:processMessages' without declaring an explicit or implicit 
dependency. This can lead to incorrect results being produced, depending on 
what order the tasks are executed. Please refer to 
https://docs.gradle.org/7.1.1/userguide/validation_problems.html#implicit_dependency
 for more details about this problem.
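The warning above is Gradle's implicit-dependency validation: one task consumes another task's output without an explicit ordering. A typical remedy (a sketch only; the task names come from the log above, but the actual wiring in the Kafka build scripts may differ) is to declare the producing task as an explicit dependency, e.g. in the Gradle Kotlin DSL:

```kotlin
tasks.named("srcJar") {
    // ':clients:srcJar' packages sources generated by ':clients:processMessages',
    // so declare that ordering explicitly to re-enable execution optimizations.
    dependsOn("processMessages")
}
```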
[2021-08-14T00:34:28.528Z] 
[2021-08-14T00:34:28.528Z] > Task :clients:testJar
[2021-08-14T00:34:28.528Z] > Task :clients:testSrcJar
[2021-08-14T00:34:29.715Z] > Task 
:clients:publishMavenJavaPublicationToMavenLocal
[2021-08-14T00:34:29.715Z] > Task :clients:publishToMavenLocal
[2021-08-14T00:34:29.715Z] 
[2021-08-14T00:34:29.715Z] Deprecated Gradle features were used in this build, 
making it incompatible with Gradle 8.0.
[2021-08-14T00:34:29.715Z] 
[2021-08-14T00:34:29.715Z] You can use '--warning-mode all' to show the 
individual deprecation warnings and determine if they come from your own 
scripts or plugins.
[2021-08-14T00:34:29.715Z] 
[2021-08-14T00:34:29.715Z] See 
https://docs.gradle.org/7.1.1/userguide/command_line_interface.html#sec:command_line_warnings
[2021-08-14T00:34:29.715Z] 
[2021-08-14T00:34:29.715Z] Execution optimizations have been disabled for 3 
invalid unit(s) of work during this build to ensure correctness.
[2021-08-14T00:34:29.715Z] Please consult deprecation warnings for more details.
[2021-08-14T00:34:29.715Z] 
[2021-08-14T00:34:29.715Z] BUILD SUCCESSFUL in 40s
[2021-08-14T00:34:29.715Z] 77 actionable tasks: 34 executed, 43 up-to-date
[Pipeline] sh
[2021-08-14T00:34:33.132Z] + grep ^version= gradle.properties
[2021-08-14T00:34:33.132Z] + cut -d= -f 2
[Pipeline] dir
[2021-08-14T00:34:34.152Z] Running in 
/home/jenkins/jenkins-agent/workspace/Kafka_kafka_trunk/streams/quickstart
[Pipeline] {

Jenkins build is unstable: Kafka » Kafka Branch Builder » 3.0 #93

2021-08-13 Thread Apache Jenkins Server
See 




[jira] [Resolved] (KAFKA-13194) LogCleaner may clean past highwatermark

2021-08-13 Thread Jun Rao (Jira)


 [ https://issues.apache.org/jira/browse/KAFKA-13194?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jun Rao resolved KAFKA-13194.
-----------------------------
Fix Version/s: 3.1.0
     Assignee: Lucas Bradstreet
   Resolution: Fixed

Merged the PR to trunk.

> LogCleaner may clean past highwatermark
> ---
>
> Key: KAFKA-13194
> URL: https://issues.apache.org/jira/browse/KAFKA-13194
> Project: Kafka
>  Issue Type: Bug
>Reporter: Lucas Bradstreet
>Assignee: Lucas Bradstreet
>Priority: Minor
> Fix For: 3.1.0
>
>
> Here the cleaning point is bounded by the active segment base offset and the
> first unstable offset, which makes sense:
>  
> {code:java}
>     // find first segment that cannot be cleaned
>     // neither the active segment, nor segments with any messages closer to the head of the log
>     // than the minimum compaction lag time may be cleaned
>     val firstUncleanableDirtyOffset: Long = Seq(
>       // we do not clean beyond the first unstable offset
>       log.firstUnstableOffset,
>       // the active segment is always uncleanable
>       Option(log.activeSegment.baseOffset),
>       // the first segment whose largest message timestamp is within a minimum time lag from now
>       if (minCompactionLagMs > 0) {
>         // dirty log segments
>         val dirtyNonActiveSegments = log.localNonActiveLogSegmentsFrom(firstDirtyOffset)
>         dirtyNonActiveSegments.find { s =>
>           val isUncleanable = s.largestTimestamp > now - minCompactionLagMs
>           debug(s"Checking if log segment may be cleaned: log='${log.name}' segment.baseOffset=${s.baseOffset} " +
>             s"segment.largestTimestamp=${s.largestTimestamp}; now - compactionLag=${now - minCompactionLagMs}; " +
>             s"is uncleanable=$isUncleanable")
>           isUncleanable
>         }.map(_.baseOffset)
>       } else None
>     ).flatten.min
> {code}
>  
> But LSO starts out as None.
> {code:java}
> @volatile private var firstUnstableOffsetMetadata: Option[LogOffsetMetadata] = None
>
> private[log] def firstUnstableOffset: Option[Long] = firstUnstableOffsetMetadata.map(_.messageOffset)
> {code}
> For most code depending on the LSO, fetchLastStableOffsetMetadata is used to 
> default it to the hwm if it's not set.
>  
> {code:java}
>   private def fetchLastStableOffsetMetadata: LogOffsetMetadata = {
>     checkIfMemoryMappedBufferClosed()
>
>     // cache the current high watermark to avoid a concurrent update invalidating the range check
>     val highWatermarkMetadata = fetchHighWatermarkMetadata
>
>     firstUnstableOffsetMetadata match {
>       case Some(offsetMetadata) if offsetMetadata.messageOffset < highWatermarkMetadata.messageOffset =>
>         if (offsetMetadata.messageOffsetOnly) {
>           lock synchronized {
>             val fullOffset = convertToOffsetMetadataOrThrow(offsetMetadata.messageOffset)
>             if (firstUnstableOffsetMetadata.contains(offsetMetadata))
>               firstUnstableOffsetMetadata = Some(fullOffset)
>             fullOffset
>           }
>         } else {
>           offsetMetadata
>         }
>       case _ => highWatermarkMetadata
>     }
>   }
> {code}
>  
>  
> This means that in the case where the hwm is prior to the active segment 
> base, the log cleaner may clean past the hwm. This is most likely to occur 
> after a broker restart when the log cleaner may start cleaning prior to 
> replication becoming active.
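The interaction can be condensed into a small sketch (illustrative only; the method and variable names here are simplified stand-ins, not Kafka's actual LogCleanerManager code): when the first unstable offset is empty, the candidate list degenerates to the active segment base offset, so additionally bounding the result by the high watermark prevents cleaning past it.

```java
import java.util.Optional;
import java.util.stream.Stream;

public class CleanerBound {
    // Simplified model of firstUncleanableDirtyOffset: take the minimum of the
    // offsets that must not be cleaned past. When firstUnstableOffset is empty
    // (as it is right after a broker restart), only the active segment base
    // offset bounds cleaning -- which can lie beyond the high watermark.
    public static long firstUncleanable(Optional<Long> firstUnstableOffset,
                                        long activeSegmentBaseOffset,
                                        long highWatermark) {
        long unbounded = Stream.of(firstUnstableOffset, Optional.of(activeSegmentBaseOffset))
            .flatMap(Optional::stream)
            .mapToLong(Long::longValue)
            .min()
            .getAsLong();
        // The extra bound the bug report motivates: never clean past the hwm.
        return Math.min(unbounded, highWatermark);
    }
}
```

With an empty first unstable offset, an active segment base of 100 and a high watermark of 50, the unbounded computation would allow cleaning up to offset 100; the high-watermark bound caps it at 50.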



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[DISCUSS] KIP-768: Extend SASL/OAUTHBEARER with Support for OIDC

2021-08-13 Thread Kirk True
Hi all!

I have created a KIP for a new OAuth/OIDC-related authentication feature.

This task is to provide a concrete implementation of the interfaces defined in 
KIP-255 to allow Kafka to connect to an OAuth / OIDC identity provider for 
authentication and token retrieval. While KIP-255 provides an unsecured JWT 
example for development purposes, this will fill in the gap and provide a 
production-grade implementation.

Here's the KIP:

https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=186877575

Thanks!
Kirk

Re: Requesting permission to contribute to Apache Kafka

2021-08-13 Thread Kirk True
Thanks!

On Thu, Aug 12, 2021, at 6:42 PM, Bill Bejeck wrote:
> Hi Kirk,
> 
> You're all set now.  Thanks for your interest in Apache Kafka.
> 
> -Bill
> 
> On Thu, Aug 12, 2021 at 1:40 PM Kirk True  wrote:
> 
> > Hi all,
> >
> > I'd like to contribute to Apache Kafka.
> >
> > My Confluence and Jira IDs are both kirktrue.
> >
> > Thanks,
> > Kirk
> 

[jira] [Created] (KAFKA-13202) KIP-768: Extend SASL/OAUTHBEARER with Support for OIDC

2021-08-13 Thread Kirk True (Jira)
Kirk True created KAFKA-13202:
-

 Summary: KIP-768: Extend SASL/OAUTHBEARER with Support for OIDC
 Key: KAFKA-13202
 URL: https://issues.apache.org/jira/browse/KAFKA-13202
 Project: Kafka
  Issue Type: New Feature
  Components: clients, security
Reporter: Kirk True
Assignee: Kirk True


This task is to provide a concrete implementation of the interfaces defined in 
[KIP-255|https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=75968876]
 to allow Kafka to connect to an [OAuth|https://en.wikipedia.org/wiki/OAuth] / 
[OIDC|https://en.wikipedia.org/wiki/OpenID#OpenID_Connect_(OIDC)] identity 
provider for authentication and token retrieval. While KIP-255 provides an 
unsecured JWT example for development, this will fill in the gap and provide a 
production-grade implementation.

The OAuth/OIDC work will allow out-of-the-box configuration by any Apache Kafka user to connect to an external identity provider service (e.g. Okta, Auth0, Azure, etc.). The code will implement the standard OAuth {{client_credentials}} grant type.

The proposed change is largely composed of a pair of 
{{AuthenticateCallbackHandler}} implementations: one to login on the client and 
one to validate on the broker.
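As background, the client side of the client_credentials grant boils down to POSTing a form-encoded body to the provider's token endpoint. A minimal, self-contained sketch of building that body (parameter names per RFC 6749; this is an illustration, not the KIP's actual implementation):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class ClientCredentialsRequest {
    // Build the application/x-www-form-urlencoded body for a
    // client_credentials token request; scope is optional.
    public static String formBody(String clientId, String clientSecret, String scope) {
        StringBuilder sb = new StringBuilder("grant_type=client_credentials");
        sb.append("&client_id=").append(URLEncoder.encode(clientId, StandardCharsets.UTF_8));
        sb.append("&client_secret=").append(URLEncoder.encode(clientSecret, StandardCharsets.UTF_8));
        if (scope != null && !scope.isEmpty()) {
            sb.append("&scope=").append(URLEncoder.encode(scope, StandardCharsets.UTF_8));
        }
        return sb.toString();
    }
}
```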

See [KIP-768: Extend SASL/OAUTHBEARER with Support for 
OIDC|https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=186877575]
 for more detail.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (KAFKA-13201) Convert KTable suppress to new PAPI

2021-08-13 Thread Jorge Esteban Quilcate Otoya (Jira)
Jorge Esteban Quilcate Otoya created KAFKA-13201:


 Summary: Convert KTable suppress to new PAPI
 Key: KAFKA-13201
 URL: https://issues.apache.org/jira/browse/KAFKA-13201
 Project: Kafka
  Issue Type: Sub-task
  Components: streams
Reporter: Jorge Esteban Quilcate Otoya






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (KAFKA-13200) Fix version of MirrorMaker2 connectors

2021-08-13 Thread Mickael Maison (Jira)
Mickael Maison created KAFKA-13200:
--

 Summary: Fix version of MirrorMaker2 connectors
 Key: KAFKA-13200
 URL: https://issues.apache.org/jira/browse/KAFKA-13200
 Project: Kafka
  Issue Type: Improvement
  Components: mirrormaker
Reporter: Mickael Maison
Assignee: Mickael Maison


MirrorMaker2 connectors have their version hardcoded to 1. Instead, they should report the Kafka version, e.g. 2.8.0.
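One plausible shape for the fix (a sketch only; the resource name and fallback value here are assumptions, not the actual patch) is for a connector's version() to read the Kafka build version from a bundled properties resource rather than returning a literal:

```java
import java.io.InputStream;
import java.util.Properties;

public class ConnectorVersion {
    // Read the build version from a bundled resource; "unknown" if absent.
    // The resource path is a hypothetical stand-in for wherever the build
    // writes its version string.
    public static String version() {
        Properties props = new Properties();
        try (InputStream in = ConnectorVersion.class
                .getResourceAsStream("/kafka/kafka-version.properties")) {
            if (in != null) props.load(in);
        } catch (Exception e) {
            // fall through to the default below
        }
        return props.getProperty("version", "unknown");
    }
}
```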



--
This message was sent by Atlassian Jira
(v8.3.4#803005)