Re: The equivalent of Scala mapping in Pyspark

2020-10-15 Thread Mich Talebzadeh
Hi all, I managed to sort this one out, though it took some digging, as the available PySpark materials are not as comprehensive as the Scala ones. Frankly, sorting this out was a bit of a struggle for me. However, I managed to make it work. In a nutshell, the script generates rows in Spark and saves t

The equivalent of Scala mapping in Pyspark

2020-10-13 Thread Mich Talebzadeh
Hi, I generate an array of random data and create a DataFrame in Spark Scala as follows:

val end = start + numRows - 1
println(" starting at ID = " + start + " , ending on = " + end)
val usedFunctions = new UsedFunctions
val text = (start to end).map(i =>
  (
    i.toSt
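For reference, the Scala pattern above, (start to end).map(i => (...)), maps fairly directly to a Python list comprehension over range(start, end + 1). Below is a minimal sketch; `start` and `numRows` are taken from the original post, but the per-row fields (id, label, random value) are illustrative placeholders, not the author's actual columns:

```python
import random

# Assumed sample inputs; in the original post these come from elsewhere.
start = 1
numRows = 5
end = start + numRows - 1  # mirrors: val end = start + numRows - 1

print(" starting at ID = " + str(start) + " , ending on = " + str(end))

# Scala's (start to end).map(i => (...)) becomes a list comprehension.
# Each element is a tuple, i.e. one future DataFrame row.
rows = [(i, "id-" + str(i), random.random()) for i in range(start, end + 1)]
```

In PySpark, that list of tuples can then be turned into a DataFrame with `spark.createDataFrame(rows, ["id", "label", "value"])`, which plays the role of the Scala DF creation in the original question.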