stevedlawrence commented on a change in pull request #643:
URL: https://github.com/apache/daffodil/pull/643#discussion_r715832996



##########
File path: 
daffodil-runtime1/src/main/scala/org/apache/daffodil/layers/LayerTransformer.scala
##########
@@ -101,15 +102,28 @@ object LayerTransformerFactory {
   register(GZIPTransformerFactory)
   register(IMFLineFoldedTransformerFactory)
   register(ICalendarLineFoldedTransformerFactory)
+
+  /**
+   * Arguably the above could be bundled with Daffodil.
+   *
+   * The transformers below really should be plugins defined
+   * outside Daffodil, per DAFFODIL-1927.
+   */
   register(AISPayloadArmoringTransformerFactory)
   register(FourByteSwapTransformerFactory)
+  register(IPv4Checksum.Factories.Part1)
+  register(IPv4Checksum.Factories.Part2)
+
 }
 
 /**
  * Shared functionality of all LayerTransformers.
+ *
+ * A layer transformer is serialized as part of the processor definition;
+ * hence, they must be stateless as we allow the processor definition to be
+ * shared across multiple threads.

Review comment:
       Is this comment regarding state correct? Don't layer transformers 
require internal state, e.g. current checksum value, which will change as bytes 
are read?
   
   Is it really that the `LayerTransformerFactory` is serialized, is shared 
among threads, and must be stateless? And `LayerTransformer`s are uniquely 
allocated at parse/unparse time, for each parse/unparse, and those do not need 
to be thread-safe/stateless? You removed Serializable, so I assume that's the 
case?

##########
File path: 
daffodil-runtime1/src/main/scala/org/apache/daffodil/layers/LayerTransformer.scala
##########
@@ -200,6 +214,54 @@ abstract class LayerTransformer()
   //    println("DIS hex dump = " + str)
   //    ???
   //  }
+
+  /**
+   * Enables the layer transformer to take necessary state-manipulating actions before
+   * the parsing of a layer starts.

Review comment:
       This is going to be interesting when we add pluggable layers. I don't 
think we make PState publicly visible anywhere. Allowing layers to modify 
PState could lead to problems if a user isn't careful; at the very least, 
backwards compatibility issues will crop up if we start changing PState, which 
isn't uncommon. I wonder if, as part of pluggability, we'll want to create an 
external view of PState that is limited and checked to ensure pluggable layers 
don't do bad things?

##########
File path: 
daffodil-test/src/test/scala/org/apache/daffodil/layers/CheckDigit.scala
##########
@@ -0,0 +1,205 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.daffodil.layers
+
+import org.apache.commons.io.IOUtils
+import org.apache.daffodil.schema.annotation.props.gen.LayerLengthKind
+import org.apache.daffodil.schema.annotation.props.gen.LayerLengthUnits
+import org.apache.daffodil.util.Maybe
+import org.apache.daffodil.processors.LayerLengthEv
+import org.apache.daffodil.processors.LayerBoundaryMarkEv
+import org.apache.daffodil.processors.LayerCharsetEv
+import org.apache.daffodil.processors.parsers.PState
+import org.apache.daffodil.processors.unparsers.UState
+
+import java.io._
+import org.apache.daffodil.dpath.NodeInfo.PrimType
+import org.apache.daffodil.exceptions.Assert
+import org.apache.daffodil.io.DataInputStream
+import org.apache.daffodil.processors.CharsetEvBase
+import org.apache.daffodil.processors.SequenceRuntimeData
+import org.apache.daffodil.processors.SuspendableOperation
+import org.apache.daffodil.processors.charset.BitsCharsetJava
+import org.apache.daffodil.util.ByteBufferOutputStream
+import org.apache.daffodil.xml.NS
+import org.apache.daffodil.xml.RefQName
+
+import java.nio.ByteBuffer
+import java.nio.charset.Charset
+
+
+final class CheckDigitExplicit(
+  srd: SequenceRuntimeData,
+  charset: Charset,
+  layerLengthEv: LayerLengthEv)
+extends LayerTransformer() {
+  Assert.invariant(charset.newDecoder().maxCharsPerByte() == 1) // is a SBCS 
charset
+
+  private val checkDigitVRD = {
+    val varNamespace = NS("urn:org.apache.daffodil.layers.checkDigit")
+    val varQName = RefQName(Some("cd"), "checkDigit", 
varNamespace).toGlobalQName
+    val vrd = srd.variableMap.getVariableRuntimeData(varQName).getOrElse {
+      srd.SDE("Variable '%s' is not defined.", varQName.toExtendedSyntax)
+    }
+    srd.schemaDefinitionUnless(vrd.primType == PrimType.UnsignedInt,
+      "Variable '%s' is not of type 'xs:unsignedInt'.", 
varQName.toExtendedSyntax)
+    vrd
+  }
+
+  private var limitingOutputStream : ByteBufferOutputStream = _
+  private var originalOutputStream: OutputStream = _
+
+  protected def wrapLayerDecoder(jis: InputStream) = jis
+
+  override def wrapLimitingStream(jis: java.io.InputStream, state: PState) = {
+    val layerLengthInBytes: Int = layerLengthEv.evaluate(state).toInt
+    val ba = new Array[Byte](layerLengthInBytes)
+    try
+      IOUtils.readFully(jis, ba)
+    catch {
+      case eof: EOFException =>
+        throw DataInputStream.NotEnoughDataException(ba.length * 8)
+    }
+    // this stream will be used by the parse when it parses the
+    // schema contents of the layer. We have already consumed the bytes
+    // from the original input stream, and saved those in the byte array
+    // to enable the computation of the checkDigit.
+    val layerStream = new ByteArrayInputStream(ba)
+    val str = new String(ba, charset) // always does replacement on decode 
errors.
+    val checkDigit = computeCheckDigit(str)
+    state.setVariable(checkDigitVRD, checkDigit, srd) // assign to result 
variable
+    layerStream

Review comment:
       Is this standard for layers, to read the entire data into memory, do 
something with it, and then return it as a new stream that parse then reads? Or 
is that just done for simplicity because this is a test implementation, so not 
a big deal? Or do we expect the size of these layers to be small enough not to 
be a memory issue?
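For what it's worth, a streaming alternative seems possible here, since the check digit is just the last digit of a running digit sum. A rough sketch (hypothetical class, not part of this PR), assuming an SBCS charset so each byte is one character:

```scala
import java.io.{ FilterInputStream, InputStream }

// Hypothetical sketch: accumulate the digit sum as bytes flow through,
// instead of buffering the whole layer into a byte array first.
final class DigitSummingInputStream(in: InputStream)
  extends FilterInputStream(in) {

  private var digitSum: Long = 0L

  // Last decimal digit of the running total; same result as
  // num.toString.last.asDigit for a non-negative total.
  def checkDigit: Long = digitSum % 10

  override def read(): Int = {
    val b = in.read()
    if (b >= '0' && b <= '9') digitSum += (b - '0')
    b
  }

  override def read(buf: Array[Byte], off: Int, len: Int): Int = {
    val n = in.read(buf, off, len)
    var i = 0
    while (i < n) {
      val c = buf(off + i).toChar
      if (c.isDigit) digitSum += c.asDigit
      i += 1
    }
    n
  }
}
```

The catch is that the check digit is only correct once the layer has been fully consumed, so the variable assignment would still have to be deferred to the end of the layer rather than done up front as this implementation does.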

##########
File path: 
daffodil-test/src/test/scala/org/apache/daffodil/layers/CheckDigit.scala
##########
@@ -0,0 +1,205 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.daffodil.layers
+
+import org.apache.commons.io.IOUtils
+import org.apache.daffodil.schema.annotation.props.gen.LayerLengthKind
+import org.apache.daffodil.schema.annotation.props.gen.LayerLengthUnits
+import org.apache.daffodil.util.Maybe
+import org.apache.daffodil.processors.LayerLengthEv
+import org.apache.daffodil.processors.LayerBoundaryMarkEv
+import org.apache.daffodil.processors.LayerCharsetEv
+import org.apache.daffodil.processors.parsers.PState
+import org.apache.daffodil.processors.unparsers.UState
+
+import java.io._
+import org.apache.daffodil.dpath.NodeInfo.PrimType
+import org.apache.daffodil.exceptions.Assert
+import org.apache.daffodil.io.DataInputStream
+import org.apache.daffodil.processors.CharsetEvBase
+import org.apache.daffodil.processors.SequenceRuntimeData
+import org.apache.daffodil.processors.SuspendableOperation
+import org.apache.daffodil.processors.charset.BitsCharsetJava
+import org.apache.daffodil.util.ByteBufferOutputStream
+import org.apache.daffodil.xml.NS
+import org.apache.daffodil.xml.RefQName
+
+import java.nio.ByteBuffer
+import java.nio.charset.Charset
+
+
+final class CheckDigitExplicit(
+  srd: SequenceRuntimeData,
+  charset: Charset,
+  layerLengthEv: LayerLengthEv)
+extends LayerTransformer() {
+  Assert.invariant(charset.newDecoder().maxCharsPerByte() == 1) // is a SBCS 
charset
+
+  private val checkDigitVRD = {
+    val varNamespace = NS("urn:org.apache.daffodil.layers.checkDigit")
+    val varQName = RefQName(Some("cd"), "checkDigit", 
varNamespace).toGlobalQName
+    val vrd = srd.variableMap.getVariableRuntimeData(varQName).getOrElse {
+      srd.SDE("Variable '%s' is not defined.", varQName.toExtendedSyntax)
+    }
+    srd.schemaDefinitionUnless(vrd.primType == PrimType.UnsignedInt,
+      "Variable '%s' is not of type 'xs:unsignedInt'.", 
varQName.toExtendedSyntax)
+    vrd
+  }
+
+  private var limitingOutputStream : ByteBufferOutputStream = _
+  private var originalOutputStream: OutputStream = _
+
+  protected def wrapLayerDecoder(jis: InputStream) = jis
+
+  override def wrapLimitingStream(jis: java.io.InputStream, state: PState) = {
+    val layerLengthInBytes: Int = layerLengthEv.evaluate(state).toInt
+    val ba = new Array[Byte](layerLengthInBytes)
+    try
+      IOUtils.readFully(jis, ba)
+    catch {
+      case eof: EOFException =>
+        throw DataInputStream.NotEnoughDataException(ba.length * 8)
+    }
+    // this stream will be used by the parse when it parses the
+    // schema contents of the layer. We have already consumed the bytes
+    // from the original input stream, and saved those in the byte array
+    // to enable the computation of the checkDigit.
+    val layerStream = new ByteArrayInputStream(ba)
+    val str = new String(ba, charset) // always does replacement on decode 
errors.
+    val checkDigit = computeCheckDigit(str)
+    state.setVariable(checkDigitVRD, checkDigit, srd) // assign to result 
variable
+    layerStream
+  }
+
+  protected def wrapLayerEncoder(jos: OutputStream) = jos
+
+  protected def wrapLimitingStream(jos: OutputStream, state: UState) = {
+    originalOutputStream = jos
+    val layerLengthInBytes: Int = layerLengthEv.evaluate(state).toInt
+    val ba = new Array[Byte](layerLengthInBytes)
+    val byteBuf = ByteBuffer.wrap(ba) // bigEndian byte order by default
+    limitingOutputStream = new ByteBufferOutputStream(byteBuf)
+    limitingOutputStream
+  }
+
+  /**
+   * Shared by both parsing and unparsing.
+   *
+   * Ignores any non-digit character in the argument.
+   *
+   * The checkDigit is the last digit of the total of all the digits.
+   */
+  private def computeCheckDigit(s: String): Long = {
+    val digits: Seq[Int] = s.filter{ _.isDigit }.map{ _.asDigit }
+    val num = digits.sum
+    val checkDigit = num.toString.last.asDigit.toLong
+    checkDigit
+  }
+
+  final class SuspendableCheckDigitLayerOperation()
+    extends SuspendableOperation {
+    override def rd = srd
+
+    /**
+     * Test succeeds if all the data required has been written to the layer
+     */
+    protected def test(ustate: UState) = {
+      limitingOutputStream.size() == limitingOutputStream.byteBuffer.capacity()
+    }
+
+    /**
+     * Computes the checkDigit, assigns the output variable, then writes the layer data out.
+     */
+    protected def continuation(ustate: UState): Unit = {
+      val ba = limitingOutputStream.byteBuffer.array()
+      val str = new String(ba, charset)
+      val checkDigit = computeCheckDigit(str)
+      ustate.setVariable(checkDigitVRD, checkDigit, srd) // assign to the 
result variable.
+      //
+      // write out the layer data
+      originalOutputStream.write(ba)
+      originalOutputStream.close()
+    }
+  }
+
+  private lazy val suspendableOperation = new 
SuspendableCheckDigitLayerOperation()
+
+  override def endLayerForUnparse(s: UState): Unit = {
+    suspendableOperation.run(s)
+  }
+
+}
+
+object CheckDigit
+  extends LayerTransformerFactory("checkDigit") {
+
+  override def newInstance(
+    maybeLayerCharsetEv: Maybe[LayerCharsetEv], // if not supplied, taken from 
the srd.
+    maybeLayerLengthKind: Maybe[LayerLengthKind], // explicit or implicit
+    maybeLayerLengthEv: Maybe[LayerLengthEv], // ignored (with warning ideally)
+    maybeLayerLengthUnits: Maybe[LayerLengthUnits], // ignored (with warning 
ideally)
+    maybeLayerBoundaryMarkEv: Maybe[LayerBoundaryMarkEv], // ignored (for now, 
this is a future possibility)
+    srd: SequenceRuntimeData) = {
+
+
+    val layerLengthKind = {
+      if (maybeLayerLengthKind.isEmpty) {
+        srd.SDE("The 'dfdlx:layerLengthKind' property must be defined.")
+      } else {
+        val llk = maybeLayerLengthKind.get
+        llk match {
+          case LayerLengthKind.Explicit => {
+            srd.schemaDefinitionUnless(maybeLayerLengthEv.isDefined, "Property 
dfdlx:layerLength must be defined.")
+            maybeLayerLengthUnits.toScalaOption match {
+              case Some(LayerLengthUnits.Bytes) => // ok
+              case None => // ok
+              case _ => srd.SDE("Property dfdlx:layerLengthUnits must be 
'bytes'.")
+            }
+          }
+          case other => srd.SDE("The dfdlx:layerLengthKind '%s' is not 
supported.", other)
+        }
+        llk
+      }
+    }
+
+    // To simplify, insist the charset is a constant, 8-bit aligned, SBCS 
encoding, e.g., ascii or 8859-1.
+
+    val layerCharsetEv: CharsetEvBase =
+      if (maybeLayerCharsetEv.isDefined) maybeLayerCharsetEv.get
+      else srd.encodingInfo.charsetEv
+
+    val bitsCharset = layerCharsetEv.optConstant.getOrElse{
+      srd.SDE("Must have a character set encoding defined.")
+    }
+    val javaCharset: Charset = bitsCharset match {
+      case jbc: BitsCharsetJava if jbc.bitWidthOfACodeUnit == 8 => 
jbc.javaCharset
+      case _ => srd.SDE(
+        "Charset encoding must be for a byte-aligned single-byte character 
set, but was encoding '%s', which has width %s bits, and alignment %s bits.",
+        bitsCharset.name, bitsCharset.bitWidthOfACodeUnit, 
bitsCharset.mandatoryBitAlignment)
+    }
+
+    //
+    // ToDo: We can't issue SDW because we don't have a context object
+    // for that sort of compilation time warning. We *should* have that.
+    // That requires a layer factory method called at schema compile time with 
the
+    // Sequence term (not just term runtime data) as an argument.
+    //
+    val xformer = new CheckDigitExplicit(srd, javaCharset, 
maybeLayerLengthEv.get)
+    xformer
+  }
+}

Review comment:
       I'm interested to see how this all changes with pluggability. There are 
a lot of things being used here that I'd consider internal and wouldn't 
necessarily want a user to directly access. Not an issue for this PR, though.

##########
File path: 
daffodil-runtime1/src/main/scala/org/apache/daffodil/layers/LayerTransformer.scala
##########
@@ -173,37 +187,33 @@ abstract class LayerTransformer()
     //
   }
 
-  // These were very useful for debugging. Note that they stop with a ???
-  // error. That's because dumping streams changes their state.
-  //
-  // Keeping these around commented out, for now. While this feature is still
-  // new and may need further debugging.
-  //
-  //  def dumpLayer(is: InputStream) {
-  //    val str = Stream.continually(is.read).takeWhile(_ != 
-1).map(_.toChar).mkString
-  //    println("dump length " + str.length + " = " + str)
-  //    ???
-  //  }
-  //
-  //  def hexDumpLayer(is: InputStream) {
-  //    val str = hexify(is)
-  //    println("hex dump = " + str)
-  //    ???
-  //  }
-  //
-  //  def hexify(is: InputStream): String =
-  //    Stream.continually(is.read).takeWhile(_ != -1).map(x => 
"%02x".format(x.toInt)).mkString(" ")
-  //
-  //  def hexDumpLayer(is: InputSourceDataInputStream, finfo: FormatInfo) {
-  //    val jis = new JavaIOInputStream(is, finfo)
-  //    val str = hexify(jis)
-  //    println("DIS hex dump = " + str)
-  //    ???
-  //  }
+  /**
+   * Enables the layer transformer to take necessary state-manipulating actions before
+   * the parsing of a layer starts.
+   *
+   * Examples might be setting variable values.
+   *
+   * At the time this is called, the state contains the layered I/O stream 
already.
+   * @param s The parser state, which may be modified.
+   */
+  def startLayerForParse(s: PState):Unit = () // nothing by default

Review comment:
       Why do we only need a `start` function for parse and an `end` function 
for unparse? In the IPv4 case, doesn't parse need to set variables, so it would 
need to know when the layer has ended?

##########
File path: 
daffodil-runtime1/src/main/scala/org/apache/daffodil/processors/parsers/PState.scala
##########
@@ -327,11 +328,17 @@ final class PState private (
     this.infoset = newParent
   }
 
-  def setVariable(vrd: VariableRuntimeData, newValue: DataValuePrimitive, 
referringContext: VariableRuntimeData): Unit = {
+  def setVariable(vrd: VariableRuntimeData, newValue: DataValuePrimitive, 
referringContext: ThrowsSDE): Unit = {
     variableMap.setVariable(vrd, newValue, referringContext, this)
     changedVariablesStack.top += vrd.globalQName
   }
 
+  def getVariable(vrd: VariableRuntimeData, referringContext: ThrowsSDE): 
DataValuePrimitive = {
+    val res = variableMap.readVariable(vrd, referringContext, this)
+    changedVariablesStack.top += vrd.globalQName

Review comment:
       This has no coverage. Should our variable code be using this getVariable 
so that the stack is adjusted correctly, similar to setVariable? It seems 
inconsistent, and maybe a bug, if setVariable and getVariable are accessed in 
different ways.

##########
File path: 
daffodil-runtime1/src/main/scala/org/apache/daffodil/layers/GZipTransformer.scala
##########
@@ -57,15 +57,9 @@ class GZIPTransformer(layerLengthInBytesEv: 
LayerLengthInBytesEv)
 object GZIPTransformerFactory
   extends LayerTransformerFactory("gzip") {
 
-  override def newInstance(
-    maybeLayerCharsetEv: Maybe[LayerCharsetEv],
-    maybeLayerLengthKind: Maybe[LayerLengthKind],
-    maybeLayerLengthInBytesEv: Maybe[LayerLengthInBytesEv],
-    maybeLayerLengthUnits: Maybe[LayerLengthUnits],
-    maybeLayerBoundaryMarkEv: Maybe[LayerBoundaryMarkEv],
-    tci: DPathCompileInfo): LayerTransformer = {
+  override def newInstance(maybeLayerCharsetEv: Maybe[LayerCharsetEv], 
maybeLayerLengthKind: Maybe[LayerLengthKind], maybeLayerLengthEv: 
Maybe[LayerLengthEv], maybeLayerLengthUnits: Maybe[LayerLengthUnits], 
maybeLayerBoundaryMarkEv: Maybe[LayerBoundaryMarkEv], srd: SequenceRuntimeData) 
= {

Review comment:
       Line too long, wrap to what it used to be.

##########
File path: 
daffodil-runtime1/src/main/scala/org/apache/daffodil/processors/EvLayering.scala
##########
@@ -89,13 +100,7 @@ final class LayerTransformerEv(
   override def compute(state: State): LayerTransformer = {
     val layerTransform = layerTransformEv.evaluate(state)
     val factory = LayerTransformerFactory.find(layerTransform, state)
-    val xformer = factory.newInstance(
-      maybeLayerCharsetEv,
-      maybeLayerLengthKind,
-      maybeLayerLengthInBytesEv,
-      maybeLayerLengthUnits,
-      maybeLayerBoundaryMarkEv,
-      ci)
+    val xformer = factory.newInstance(maybeLayerCharsetEv, 
maybeLayerLengthKind, maybeLayerLengthEv, maybeLayerLengthUnits, 
maybeLayerBoundaryMarkEv, srd)

Review comment:
       Line too long; rewrap to match the previous formatting.

##########
File path: 
daffodil-runtime1/src/main/scala/org/apache/daffodil/processors/unparsers/UState.scala
##########
@@ -74,6 +76,21 @@ abstract class UState(
   extends ParseOrUnparseState(vbox, diagnosticsArg, dataProcArg, tunable)
   with Cursor[InfosetAccessor] with ThrowsSDE with SavesErrorsAndWarnings {
 
+  final def setVariable(vrd: VariableRuntimeData, newValue: 
DataValuePrimitive, referringContext: ThrowsSDE) =
+    vbox.vmap.setVariable(vrd, newValue, referringContext, this)
+
+  /**
+   * For unparsing, this throws a RetryableException in the case where the 
variable cannot (yet) be read.
+   *
+   * @param vrd Identifies the variable to read.
+   * @param referringContext Where to place blame if there is an error.
+   * @return The data value of the variable, or throws exceptions if there is 
no value.
+   */
+  final def getVariable(vrd: VariableRuntimeData, referringContext: 
ThrowsSDE): DataValuePrimitive =

Review comment:
       Same comment as above: should we be using these set/getVariable 
functions internally? I assume right now we are just directly accessing the 
variable map?

##########
File path: 
daffodil-runtime1/src/main/scala/org/apache/daffodil/processors/EvLayering.scala
##########
@@ -70,16 +70,27 @@ final class LayerTransformerEv(
   layerTransformEv: LayerTransformEv,
   maybeLayerCharsetEv: Maybe[LayerCharsetEv],
   maybeLayerLengthKind: Maybe[LayerLengthKind],
-  maybeLayerLengthInBytesEv: Maybe[LayerLengthInBytesEv],
+  maybeLayerLengthEv: Maybe[LayerLengthEv],
   maybeLayerLengthUnits: Maybe[LayerLengthUnits],
   maybeLayerBoundaryMarkEv: Maybe[LayerBoundaryMarkEv],
-  ci: DPathCompileInfo)
-  extends Evaluatable[LayerTransformer](ci)
+  srd: SequenceRuntimeData)
+  extends Evaluatable[LayerTransformer](srd.dpathCompileInfo)
   with NoCacheEvaluatable[LayerTransformer] {
 
+  /**
+   * A layer transformer is never a constant. It always must be evaluated to 
get
+   * a new one.
+   *
+   * This prevents it being cached/stored as part of the state of the data 
processor
+   * where it would be shared across threads.
+   *
+   * Otherwise layer transformers couldn't be stateful. But they often need to 
be stateful.
+   */
+  override protected def isNeverConstant = true

Review comment:
       Could/should this logic be part of `NoCacheEvaluatable`? Or are these 
different concepts? It seems like if something shouldn't be cached, then it 
should also never be considered constant. It would be nice if mixing in the 
right things gave the right behavior without having to set other flags.
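If they are meant to go together, one option (just a sketch against the names in this diff, not a tested change) is for the mixin to supply the override itself, so that mixing in `NoCacheEvaluatable` is enough:

```scala
// Sketch only: NoCacheEvaluatable supplies the override, so every
// no-cache evaluatable is automatically treated as non-constant.
trait NoCacheEvaluatable[T] { self: Evaluatable[T] =>
  override protected def isNeverConstant: Boolean = true
  // ... existing no-cache machinery unchanged ...
}
```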

##########
File path: 
daffodil-runtime1/src/main/scala/org/apache/daffodil/processors/EvLayering.scala
##########
@@ -89,13 +100,7 @@ final class LayerTransformerEv(
   override def compute(state: State): LayerTransformer = {
     val layerTransform = layerTransformEv.evaluate(state)
     val factory = LayerTransformerFactory.find(layerTransform, state)
-    val xformer = factory.newInstance(
-      maybeLayerCharsetEv,
-      maybeLayerLengthKind,
-      maybeLayerLengthInBytesEv,
-      maybeLayerLengthUnits,
-      maybeLayerBoundaryMarkEv,
-      ci)
+    val xformer = factory.newInstance(maybeLayerCharsetEv, 
maybeLayerLengthKind, maybeLayerLengthInBytesEv, maybeLayerLengthUnits, 
maybeLayerBoundaryMarkEv, srd)

Review comment:
       Too long, rewrap.

##########
File path: 
daffodil-runtime1/src/main/scala/org/apache/daffodil/layers/LayerTransformer.scala
##########
@@ -200,6 +214,54 @@ abstract class LayerTransformer()
   //    println("DIS hex dump = " + str)
   //    ???
   //  }
+
+  /**
+   * Enables the layer transformer to take necessary state-manipulating actions before
+   * the parsing of a layer starts.
+   *
+   * Examples might be setting variable values.
+   *
+   * At the time this is called, the state contains the layered I/O stream 
already.
+   * @param s The parser state, which may be modified.
+   */
+  def startLayerForParse(s: PState):Unit
+
+  /**
+   * Enables the layer transformer to take necessary state-manipulating actions after
+   * the parsing of a layer ends.
+   *
+   * Examples might be setting variable values.
+   *
+   * At the time this is called, the state still contains the layered I/O 
stream.
+   * @param s The parser state, which may be modified.
+   */
+  def endLayerForParse(s: PState): Unit
+
+  /**
+   * Enables the layer transformer to take necessary state-manipulating actions before
+   * the unparsing of a layer starts.
+   *
+   * Examples might be setting variable values.
+   *
+   * Note that if variables without values are read by this method, then this 
must create
+   * suspended calculations that clone the necessary state, which are 
evaluated later.
+   *
+   * At the time this is called, the state still contains the layered I/O 
stream.
+   * @param s The unparser state, which may be modified.
+   */
+  def startLayerForUnparse(s: UState): Unit
+
+  /**
+   * Enables the layer transformer to take necessary state-manipulating actions after
+   * the unparsing of a layer ends.
+   *
+   * Note that if variables without values are read by this method, then this 
must create
+   * suspended calculations that clone the necessary state, which are 
evaluated later.

Review comment:
       Similar to above, letting plug-ins create suspensions sounds very 
dangerous given how fragile suspensions can be. It probably can't be avoided, 
but DAFFODIL-1927 might need to consider how to make this safer.

##########
File path: 
daffodil-propgen/src/main/resources/org/apache/daffodil/xsd/DFDL_part2_attributes.xsd
##########
@@ -1069,18 +1037,5 @@
     <xsd:attributeGroup ref="dfdlx:SimpleTypeValueCalcAGQualified" />
     <xsd:attributeGroup ref="dfdlx:ObjectKindAGQualified" />
   </xsd:attributeGroup>
-
-  <xsd:attributeGroup name="LayeringAGQualified">
-    <xsd:attribute form="qualified" name="layerTransform"
-      type="dfdl:LayerTransformType_Or_DFDLExpression" />
-    <xsd:attribute form="qualified" name="layerEncoding"
-      type="dfdl:EncodingEnum_Or_DFDLExpression" />
-    <xsd:attribute form="qualified" name="layerLengthKind" 
type="dfdl:LayerLengthKindEnum" />
-    <xsd:attribute form="qualified" name="layerLength"
-      type="dfdl:DFDLNonNegativeInteger_Or_DFDLExpression" />
-    <xsd:attribute form="qualified" name="layerLengthUnits" 
type="dfdl:LayerLengthUnitsEnum" />
-    <xsd:attribute form="qualified" name="layerBoundaryMark"
-      type="dfdl:ListOfDFDLStringLiteral_Or_DFDLExpression" />
-  </xsd:attributeGroup>

Review comment:
       We'll want to make sure we mention the removal of these properties in 
the release notes as a backwards compatibility issue.
   
   But we don't have a good way of tracking that right now. Maybe the standard 
should be to include a line in the commit message like `Backwards 
Compatibility: ...`, and then the release manager can just search for commits 
in this release with that tag to figure out what to include in the release 
notes?

##########
File path: 
daffodil-runtime1/src/main/scala/org/apache/daffodil/layers/LayerTransformer.scala
##########
@@ -200,6 +214,54 @@ abstract class LayerTransformer()
   //    println("DIS hex dump = " + str)
   //    ???
   //  }
+
+  /**
+   * Enables the layer transformer to take necessary state-manipulating actions before
+   * the parsing of a layer starts.
+   *
+   * Examples might be setting variable values.
+   *
+   * At the time this is called, the state contains the layered I/O stream 
already.
+   * @param s The parser state, which may be modified.
+   */
+  def startLayerForParse(s: PState):Unit
+
+  /**
+   * Enables the layer transformer to take necessary state-manipulating actions after
+   * the parsing of a layer ends.
+   *
+   * Examples might be setting variable values.
+   *
+   * At the time this is called, the state still contains the layered I/O 
stream.
+   * @param s The parser state, which may be modified.
+   */
+  def endLayerForParse(s: PState): Unit
+
+  /**
+   * Enables the layer transformer to take necessary state-manipulating actions before
+   * the unparsing of a layer starts.
+   *
+   * Examples might be setting variable values.
+   *
+   * Note that if variables without values are read by this method, then this 
must create
+   * suspended calculations that clone the necessary state, which are 
evaluated later.
+   *
+   * At the time this is called, the state still contains the layered I/O 
stream.
+   * @param s The unparser state, which may be modified.
+   */
+  def startLayerForUnparse(s: UState): Unit
+
+  /**
+   * Enables the layer transformer to take necessary state-manipulating actions after
+   * the unparsing of a layer ends.
+   *
+   * Note that if variables without values are read by this method, then this 
must create
+   * suspended calculations that clone the necessary state, which are 
evaluated later.
+   *
+   * Examples might be setting variable values.
+   * @param s The unparser state, which may be modified.
+   */
+  def endLayerForUnparse(s: UState): Unit

Review comment:
       Do we want to add default implementations of these functions that don't 
do anything? That way, layer transformers that don't need this capability don't 
need extra boilerplate.
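E.g., no-op defaults in `LayerTransformer` itself, matching what `startLayerForParse` already does in the other hunk (sketch only):

```scala
// Sketch: default no-op hooks; only transformers that need
// start/end behavior override these.
def startLayerForParse(s: PState): Unit = ()   // nothing by default
def endLayerForParse(s: PState): Unit = ()
def startLayerForUnparse(s: UState): Unit = ()
def endLayerForUnparse(s: UState): Unit = ()
```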




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

