RyanSkraba commented on a change in pull request #18871:
URL: https://github.com/apache/flink/pull/18871#discussion_r811752561
##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/AvroOutputFormatTest.java
##########
@@ -117,21 +116,22 @@ private void serializeAndDeserialize(final AvroOutputFormat.Codec codec, final S
new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
// then
Object o = ois.readObject();
- assertTrue(o instanceof AvroOutputFormat);
+ assertThat(o instanceof AvroOutputFormat).isTrue();
@SuppressWarnings("unchecked")
final AvroOutputFormat<User> restored = (AvroOutputFormat<User>) o;
final AvroOutputFormat.Codec restoredCodec =
(AvroOutputFormat.Codec) Whitebox.getInternalState(restored, "codec");
final Schema restoredSchema =
(Schema) Whitebox.getInternalState(restored, "userDefinedSchema");
- assertTrue(codec != null ? restoredCodec == codec : restoredCodec == null);
- assertTrue(schema != null ? restoredSchema.equals(schema) : restoredSchema == null);
+ assertThat(codec != null ? restoredCodec == codec : restoredCodec == null).isTrue();
+ assertThat(schema != null ? restoredSchema.equals(schema) : restoredSchema == null)
+ .isTrue();
Review comment:
```suggestion
assertThat(codec).isSameAs(restoredCodec);
assertThat(schema).isEqualTo(restoredSchema);
```
AssertJ handles nulls correctly for these comparisons.
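As a quick illustration of the null handling (a sketch, not part of the diff; the types are the ones already used in this test):
```java
// Sketch only: how the suggested assertions behave when one or both sides are null.
AvroOutputFormat.Codec codec = null;
AvroOutputFormat.Codec restoredCodec = null;
assertThat(codec).isSameAs(restoredCodec); // passes: both references are null

Schema schema = null;
Schema restoredSchema = Schema.create(Schema.Type.STRING);
assertThat(schema).isEqualTo(restoredSchema); // fails with a clear comparison failure, not an NPE
```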
##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/AvroOutputFormatITCase.java
##########
@@ -46,6 +45,8 @@
import java.util.List;
import java.util.Objects;
+import static org.assertj.core.api.Assertions.assertThat;
+
/** IT cases for the {@link AvroOutputFormat}. */
@SuppressWarnings("serial")
public class AvroOutputFormatITCase extends JavaProgramTestBase {
Review comment:
This is an example of a class that moved to AssertJ assertions while the base class it extends still runs under JUnit 4. The `public` modifiers are therefore still necessary, but fortunately there's nothing else JUnit 4-specific to change here.
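Roughly, the mixed state looks like this (a simplified sketch, not the actual test body; `outputPath1` is illustrative and the imports are the ones already in the file): JUnit 4 still drives the run through `JavaProgramTestBase`, while the assertions inside are AssertJ.
```java
/** Simplified sketch: JUnit 4 base class, AssertJ assertions. */
@SuppressWarnings("serial")
public class AvroOutputFormatITCase extends JavaProgramTestBase {

    // Stays public/protected as long as the JUnit 4 runner from the base class is in charge.
    @Override
    protected void postSubmit() throws Exception {
        File[] output = new File(outputPath1).listFiles(); // outputPath1 is illustrative
        for (File avroOutput : Objects.requireNonNull(output)) {
            // AssertJ assertions work fine under the JUnit 4 runner.
            assertThat(avroOutput.toString()).endsWith(".avro");
        }
    }
}
```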
##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/typeutils/AvroTypeExtractionTest.java
##########
@@ -27,55 +27,79 @@
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.core.fs.Path;
+import org.apache.flink.core.testutils.AllCallbackWrapper;
import org.apache.flink.formats.avro.AvroInputFormat;
import org.apache.flink.formats.avro.AvroRecordInputFormatTest;
import org.apache.flink.formats.avro.generated.Fixed16;
import org.apache.flink.formats.avro.generated.User;
-import org.apache.flink.test.util.MultipleProgramsTestBase;
-
-import org.junit.After;
-import org.junit.Assert;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.rules.TemporaryFolder;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
+import org.apache.flink.runtime.testutils.MiniClusterExtension;
+import org.apache.flink.runtime.testutils.MiniClusterResourceConfiguration;
+import org.apache.flink.test.util.CollectionTestEnvironment;
+import org.apache.flink.test.util.TestBaseUtils;
+import org.apache.flink.test.util.TestEnvironment;
+
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.extension.RegisterExtension;
+import org.junit.jupiter.api.io.TempDir;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.Arguments;
+import org.junit.jupiter.params.provider.MethodSource;
import java.io.File;
-import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;
+import java.util.stream.Stream;
-/** Tests for the {@link AvroInputFormat} reading Pojos. */
-@RunWith(Parameterized.class)
-public class AvroTypeExtractionTest extends MultipleProgramsTestBase {
+import static org.assertj.core.api.Assertions.assertThat;
+import static org.assertj.core.api.Assertions.fail;
- public AvroTypeExtractionTest(TestExecutionMode mode) {
- super(mode);
+/** Tests for the {@link AvroInputFormat} reading Pojos. */
+class AvroTypeExtractionTest {
Review comment:
This was the most "complicated" test in the module (of the ones I attempted):
- Removed the base class: the two `ExecutionEnvironment` instances are now provided through a JUnit 5 parameterization, and a `MiniClusterExtension` is registered directly in the class (sketched below).
- The call to
[teardown](https://github.com/apache/flink/blob/c6997c97c575d334679915c328792b8a3067cfb5/flink-test-utils-parent/flink-test-utils/src/main/java/org/apache/flink/test/util/MultipleProgramsTestBase.java#L102)
in the base class no longer occurs.
- It looks like there's no impact: since the `ExecutionEnvironment` isn't set as the context environment, it doesn't need to be unset?
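A rough sketch of the shape this takes, using the classes imported in the hunk above (simplified; the `MiniClusterExtension`/`AllCallbackWrapper` wiring and the environment construction are assumptions here, not copied from the diff):
```java
/** Sketch: JUnit 5 parameterization replacing MultipleProgramsTestBase's execution modes. */
class AvroTypeExtractionTest {

    // Registered directly on the class instead of being inherited; the constructor
    // arguments are illustrative.
    @RegisterExtension
    static final AllCallbackWrapper<MiniClusterExtension> MINI_CLUSTER =
            new AllCallbackWrapper<>(
                    new MiniClusterExtension(
                            () -> new MiniClusterResourceConfiguration.Builder().build()));

    // Supplies the two ExecutionEnvironments that the CLUSTER / COLLECTION modes of the
    // base class used to provide; the cluster-backed TestEnvironment would be created
    // from the mini cluster above and is elided here.
    private static Stream<Arguments> executionEnvironments() {
        return Stream.of(
                Arguments.of(new CollectionTestEnvironment())
                /* , Arguments.of(clusterBackedTestEnvironment) */);
    }

    @ParameterizedTest
    @MethodSource("executionEnvironments")
    void testSimpleAvroRead(ExecutionEnvironment env) throws Exception {
        // ... run the Avro job on the provided environment and assert on the result ...
    }
}
```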
##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/AvroOutputFormatITCase.java
##########
@@ -99,8 +100,7 @@ protected void postSubmit() throws Exception {
output1 = file1.listFiles();
// check for avro ext in dir.
for (File avroOutput : Objects.requireNonNull(output1)) {
- Assert.assertTrue(
- "Expect extension '.avro'", avroOutput.toString().endsWith(".avro"));
+ assertThat(avroOutput.toString()).endsWith(".avro");
Review comment:
AssertJ produces nice enough failure descriptions on its own, so I removed a few custom messages throughout the code. This one, for example, produces:
```
Expecting actual:
"/tmp/junit5763721249197813393/junit1392834020197511051/avro_output1/4.avrox"
to end with:
".avro"
```
##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/AvroOutputFormatTest.java
##########
@@ -117,21 +116,22 @@ private void serializeAndDeserialize(final AvroOutputFormat.Codec codec, final S
new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
// then
Object o = ois.readObject();
- assertTrue(o instanceof AvroOutputFormat);
+ assertThat(o instanceof AvroOutputFormat).isTrue();
Review comment:
```suggestion
assertThat(o).isInstanceOf(AvroOutputFormat.class);
```
##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/AvroOutputFormatTest.java
##########
@@ -216,9 +216,9 @@ public void testGenericRecord() throws IOException {
while (dataFileReader.hasNext()) {
GenericRecord record = dataFileReader.next();
- assertEquals(record.get("user_name").toString(), "testUser");
- assertEquals(record.get("favorite_number"), 1);
- assertEquals(record.get("favorite_color").toString(), "blue");
+ assertThat("testUser").isEqualTo(record.get("user_name").toString());
+ assertThat(1).isEqualTo(record.get("favorite_number"));
+ assertThat("blue").isEqualTo(record.get("favorite_color").toString());
Review comment:
```suggestion
assertThat(record.get("user_name").toString()).isEqualTo("testUser");
assertThat(record.get("favorite_number")).isEqualTo(1);
assertThat(record.get("favorite_color").toString()).isEqualTo("blue");
```
For consistency.
##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/typeutils/AvroSerializerSnapshotTest.java
##########
@@ -44,11 +44,11 @@
import static org.apache.flink.api.common.typeutils.TypeSerializerMatchers.isCompatibleAfterMigration;
import static org.apache.flink.api.common.typeutils.TypeSerializerMatchers.isCompatibleAsIs;
import static org.apache.flink.api.common.typeutils.TypeSerializerMatchers.isIncompatible;
-import static org.hamcrest.CoreMatchers.is;
-import static org.hamcrest.MatcherAssert.assertThat;
+import static org.assertj.core.api.Assertions.assertThat;
+import static org.assertj.core.api.HamcrestCondition.matching;
Review comment:
I'm not sure whether this is the right change at this time, while the
`org.apache.flink.api.common.typeutils.TypeSerializerMatchers` are still being
used. This permits AssertJ to use the old style matchers for now, but the
class will need to be refactored if they are turned into AssertJ `Conditions`.
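Concretely, the bridge looks something like this (a sketch; `restoredSnapshot` and `reconfiguredSerializer` are placeholders for values the test already has in hand):
```java
// The existing Hamcrest matcher is wrapped into an AssertJ Condition via HamcrestCondition.matching.
TypeSerializerSchemaCompatibility<GenericRecord> compatibility =
        restoredSnapshot.resolveSchemaCompatibility(reconfiguredSerializer);
assertThat(compatibility).is(matching(isCompatibleAsIs()));
```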
##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/AvroRowDataDeSerializationSchemaTest.java
##########
@@ -206,34 +202,33 @@ public void testSpecificType() throws Exception {
RowData rowData = deserializationSchema.deserialize(input);
byte[] output = serializationSchema.serialize(rowData);
RowData rowData2 = deserializationSchema.deserialize(output);
- Assert.assertEquals(rowData, rowData2);
- Assert.assertEquals(timestamp, rowData.getTimestamp(0, 3).toInstant());
- Assert.assertEquals(
- "2014-03-01",
- DataFormatConverters.LocalDateConverter.INSTANCE
- .toExternal(rowData.getInt(1))
- .toString());
- Assert.assertEquals(
- "12:12:12",
- DataFormatConverters.LocalTimeConverter.INSTANCE
- .toExternal(rowData.getInt(2))
- .toString());
+ assertThat(rowData2).isEqualTo(rowData);
+ assertThat(rowData.getTimestamp(0, 3).toInstant()).isEqualTo(timestamp);
+
+ assertThat(
+ DataFormatConverters.LocalDateConverter.INSTANCE
+ .toExternal(rowData.getInt(1))
+ .toString())
+ .isEqualTo("2014-03-01");
+ assertThat(
+ DataFormatConverters.LocalTimeConverter.INSTANCE
+ .toExternal(rowData.getInt(2))
+ .toString())
+ .isEqualTo("12:12:12");
}
@Test
- public void testSerializationWithTypesMismatch() throws Exception {
+ void testSerializationWithTypesMismatch() throws Exception {
AvroRowDataSerializationSchema serializationSchema =
createSerializationSchema(ROW(FIELD("f0", INT()), FIELD("f1", STRING())).notNull());
GenericRowData rowData = new GenericRowData(2);
rowData.setField(0, 1);
- rowData.setField(0, 2);
- String errorMessage = "Fail to serialize at field: f1.";
- try {
- serializationSchema.serialize(rowData);
- fail("expecting exception message: " + errorMessage);
- } catch (Throwable t) {
- assertThat(t, FlinkMatchers.containsMessage(errorMessage));
Review comment:
Ooooh, found a bug here: `rowData` is actually serialized correctly, but the `fail` call afterwards throws an assertion error that (by construction) contains the expected error message, so the check in the `catch` block always passes. I suspect `setField(1, 2)` is what was intended, actually testing a type mismatch.
The `withStackTraceContaining` looks like a nice AssertJ replacement for finding messages in the cause exceptions.
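For example, the fixed test could read roughly like this (a sketch: it assumes `setField(1, 2)` is indeed the intended mismatch, and the exact wording of the serializer's error message would still need to be verified):
```java
// Hypothetical fix: put an int into f1, which the schema declares as STRING.
GenericRowData rowData = new GenericRowData(2);
rowData.setField(0, 1);
rowData.setField(1, 2);

// hasStackTraceContaining (or assertThatExceptionOfType(...).isThrownBy(...).withStackTraceContaining(...))
// also searches the cause chain, replacing the try/fail/catch + FlinkMatchers.containsMessage pattern.
assertThatThrownBy(() -> serializationSchema.serialize(rowData))
        .hasStackTraceContaining("Fail to serialize at field: f1.");
```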
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]