stevedlawrence commented on code in PR #1187:
URL: https://github.com/apache/daffodil/pull/1187#discussion_r1528903195
##########
daffodil-core/src/main/scala/org/apache/daffodil/core/runtime1/VariableRuntime1Mixin.scala:
##########
@@ -98,7 +100,10 @@ trait DFDLNewVariableInstanceRuntime1Mixin { self:
DFDLNewVariableInstance =>
defv.namedQName.asInstanceOf[GlobalQName],
defv.primType,
this.tunable.unqualifiedPathStepPolicy,
- this.schemaSet.allDefinedVariables.indexOf(defv),
+ // This is a really important invariant. The index of the NVI's VRD
+ // must be identical to that of the global VRD for this variable,
+ // So we take it from there.
+ vmapIndex = globalVRD.vmapIndex,
Review Comment:
Is this a bug fix unrelated to layers, one that the layer implementation and the more
complex variable tests have just revealed? If this PR is pushed to a later
release, we might want to cherry-pick this fix out so it gets into the 3.7.0
release.
That said, I'm not sure I understand why
`schemaSet.allDefinedVariables.indexOf(defv)` changes. The way you have it is
more obviously correct, so I like it better anyway; it's just not clear to me
why the old code is broken. It seems like the order of allDefinedVariables shouldn't change,
so I don't know why the `indexOf(defv)` would be different from the
`globalVRD.vmapIndex`, since it used the exact same logic.
Another thought: I wonder whether newVariableInstance should create a new VRD
at all. It seems like we only do it because NVI can have a different
defaultValueExpression; I think everything else is exactly the same as the
global VRD. Maybe that expression shouldn't be part of the VRD at all? For
example, maybe NewVariableInstanceParser wants to be something like:
```scala
class NewVariableInstanceStartParser(
  vrd: VariableRuntimeData,
  defaultExpr: Maybe[Expression]
) {
  def parse(state: PState): Unit = {
    val nvi = state.newVariableInstance(vrd)
    if (defaultExpr.isDefined) {
      val defaultValue = defaultExpr.get.evaluate(state)
      nvi.setDefaultValue(defaultValue)
    }
  }
}
```
So the default expression is a member of the parser instead of the VRD?
I guess a complication is that we need somewhere to store the
defaultValueExpressions for the global VRDs, but the VariableMap seems like a
reasonable place for that? Could just be something like a
`Seq[(VariableRuntimeData, Expression)]`. Then there is only ever a single VRD
per defined variable instead of per instance. Seems less error prone, since a
VRD then just means a single variable, and not a specific instance of that
variable?
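To illustrate the shape I mean, here's a minimal self-contained sketch. The stub types below (`VariableRuntimeData`, `Expression`, `VariableMap`, `defaultExprFor`) are illustrative stand-ins, not Daffodil's actual classes or API:

```scala
// Illustrative stubs only -- NOT the real Daffodil types.
final case class VariableRuntimeData(globalQName: String, vmapIndex: Int)

trait Expression {
  def evaluate(): AnyRef
}

// The VariableMap owns the one-VRD-per-defined-variable pairing with its
// optional default-value expression, so an NVI never needs its own VRD:
// it just looks up the default expression for the shared global VRD.
final class VariableMap(
  defaults: Seq[(VariableRuntimeData, Option[Expression])]
) {
  def defaultExprFor(vrd: VariableRuntimeData): Option[Expression] =
    defaults
      .collectFirst { case (v, e) if v.vmapIndex == vrd.vmapIndex => e }
      .flatten
}
```

The point being that a VRD would then identify a defined variable, and the per-instance data (the default expression) lives alongside it rather than inside it.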
Not sure if that's worth doing for 3.7.0 though, or at all. Maybe there are
other reasons why instances need a separate VRD.