Github user liancheng commented on a diff in the pull request:
https://github.com/apache/spark/pull/3066#discussion_r19784873
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Row.scala ---
@@ -42,6 +44,31 @@ object Row {
    * This method can be used to construct a [[Row]] from a [[Seq]] of values.
    */
   def fromSeq(values: Seq[Any]): Row = new GenericRow(values.toArray)
+
+  /**
+   * This method can be used to construct a [[Row]] from a [[Seq]] of Strings,
+   * converting each item to the type specified in a [[StructType]] schema.
+   * Only primitive types can be used.
+   */
+  def fromStringsBySchema(strings: Seq[String], schema: StructType): Row = {
+    val values = for {
+      (field, str) <- schema.fields zip strings
+      item = field.dataType match {
+        case IntegerType => str.toInt
+        case LongType => str.toLong
+        case DoubleType => str.toDouble
+        case FloatType => str.toFloat
+        case ByteType => str.toByte
+        case ShortType => str.toShort
+        case StringType => str
+        case BooleanType => (str != "")
+        case DateType => Date.valueOf(str)
+        case TimestampType => Timestamp.valueOf(str)
+        case DecimalType() => new BigDecimal(str)
+      }
+    } yield item
+    new GenericRow(values.toArray)
+  }
--- End diff --
OK, I see your point. However, `Row` is used extremely frequently on many
critical paths, so in general we choose to detect type errors during query
planning rather than introduce a per-row type checking cost here. It's somewhat
similar to statically typed programming languages like C++: once your program
compiles, types are guaranteed to match (unless you intentionally add nasty
type casting tricks), and you don't need to check types everywhere manually.
The query planner is just the compiler here.
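
To make the analogy concrete, here is a minimal, self-contained Scala sketch
(hypothetical names only, not Spark's API) of hoisting the type dispatch out of
the per-row path: the match over field types runs once up front, analogous to
the planner, so the per-row conversion does no further type checking and an
unsupported type fails before a single row is converted.

```scala
// Hypothetical sketch: resolve one converter per field once (the "planning"
// step), then reuse the converters for every row on the hot path.
sealed trait ColType
case object IntCol    extends ColType
case object DoubleCol extends ColType
case object StrCol    extends ColType

// Planning step: any unsupported or mismatched type fails here, once.
def planConverters(schema: Seq[ColType]): Seq[String => Any] =
  schema.map {
    case IntCol    => (s: String) => s.toInt
    case DoubleCol => (s: String) => s.toDouble
    case StrCol    => (s: String) => s
  }

// Hot path: apply the pre-resolved converters, no per-row type dispatch.
def toRowValues(strings: Seq[String], converters: Seq[String => Any]): Array[Any] =
  converters.zip(strings).map { case (convert, s) => convert(s) }.toArray

// Usage: "plan" once, then convert many rows with the same converters.
val converters = planConverters(Seq(IntCol, StrCol, DoubleCol))
val row = toRowValues(Seq("1", "spark", "2.5"), converters)
```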