vacu9708 opened a new pull request, #18119: URL: https://github.com/apache/tvm/pull/18119
# Summary

This PR fixes https://github.com/apache/tvm/issues/17964, which occurs through the following process:

1. The Compress node produces a symbolic dimension `num_nonzero`.
2. The `relax.op.add()` in the BiasGelu node adds `shape=[2, num_nonzero]` and `shape=[3]`.
3. The current shape inference function does not take the static vs. symbolic dimension cases into account and produces a `None` shape, which leads to the error.

# Changes

- Add comments on each case
- Add support for comparing static and symbolic dimensions in shape inference

# Notes

The current shape inference takes the following cases into account:

| Case | Error? | Example | Expected output dim |
|------|--------|---------|---------------------|
| static dim(1) | No | (2, 3) + (1, 3) | (2, 3) |
| equal static dims | No | (2, 3) + (2, 3) | (2, 3) |
| equal symbolic dims | No | (n, m) + (n, m) | (n, m) |

However, it does not take the following cases into account:

| Case | Error? | Example | Expected output dim |
|------|--------|---------|---------------------|
| static dim vs symbolic dim | Yes | (2, 3) + (2, n) | (2, 3), because the 2nd dim must be 3 regardless of the symbolic dim (n) |
| different symbolic dims | Yes | (2, n) + (2, m) | (2, n) or (2, m), because the output dim cannot be determined at compile time |

I was going to add support for the "different symbolic dims" case by introducing a new symbolic dim representing the output dim, as follows:

```cpp
// Different symbolic dims whose output dim cannot be determined at compile time
static int _bcast_counter = 0;
std::string name = "bcast_dim_" + std::to_string(_bcast_counter++);
PrimExpr new_dim = tir::Var(name, DataType::Int(64));
output_shape.push_back(new_dim);
```

However, the current test code assumes that output dimensions are always derived from input dimensions, so I did not add support for that case at this time.
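For reference, the per-dimension rule described by the tables above can be sketched as a standalone C++ model. This is only an illustration of the rule, not the actual TVM C++ implementation: the `Dim` representation and the `BroadcastDim` helper are made up for this sketch.

```cpp
// Standalone sketch of the broadcast rule; a dimension is modeled as either a
// known static extent or a named symbolic variable such as "num_nonzero".
#include <cstdint>
#include <iostream>
#include <stdexcept>
#include <string>
#include <variant>

using Dim = std::variant<int64_t, std::string>;

bool IsStatic(const Dim& d) { return std::holds_alternative<int64_t>(d); }

// Resolve one output dimension of a broadcast between lhs and rhs.
Dim BroadcastDim(const Dim& lhs, const Dim& rhs) {
  if (IsStatic(lhs) && IsStatic(rhs)) {
    int64_t l = std::get<int64_t>(lhs), r = std::get<int64_t>(rhs);
    if (l == 1) return rhs;  // static dim(1): (2, 3) + (1, 3) -> (2, 3)
    if (r == 1) return lhs;
    if (l == r) return lhs;  // equal static dims: (2, 3) + (2, 3) -> (2, 3)
    throw std::runtime_error("incompatible static dims");
  }
  if (IsStatic(lhs) != IsStatic(rhs)) {
    // static dim vs symbolic dim: the static extent wins, e.g. (2, 3) + (2, n) -> (2, 3),
    // except that a static 1 broadcasts to the symbolic extent.
    const Dim& st = IsStatic(lhs) ? lhs : rhs;
    const Dim& sym = IsStatic(lhs) ? rhs : lhs;
    return std::get<int64_t>(st) == 1 ? sym : st;
  }
  // both symbolic
  if (std::get<std::string>(lhs) == std::get<std::string>(rhs)) {
    return lhs;  // equal symbolic dims: (n, m) + (n, m) -> (n, m)
  }
  // different symbolic dims: not resolvable at compile time (left unsupported in this PR)
  throw std::runtime_error("cannot determine output dim at compile time");
}

int main() {
  // The dimension pair from the issue: num_nonzero vs 3 resolves to the static extent 3.
  Dim out = BroadcastDim(Dim(std::string("num_nonzero")), Dim(int64_t{3}));
  std::cout << std::get<int64_t>(out) << "\n";  // prints 3
  return 0;
}
```

With this rule, the trailing dimensions of `[2, num_nonzero]` and `[3]` resolve to the static extent `3`, which is the only valid output extent regardless of the runtime value of `num_nonzero`.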
