How do I convert this Scala code to PySpark?


Hello, I need help with some code I am working on. I need to define a Row object in PySpark, and I have Scala code that does this, but I must do it in Python.

import org.apache.spark.sql.Row

def getRow(x: String): Row = {
  // Fixed-width parsing: each column is a substring slice of the input line.
  // The array holds 95 columns; slots not assigned below remain null.
  val columnArray = new Array[String](95)
  columnArray(0) = x.substring(0, 10)
  columnArray(1) = x.substring(11, 14)
  columnArray(2) = x.substring(15, 17)
  columnArray(3) = x.substring(18, 20)
  columnArray(4) = x.substring(21, 23)
  columnArray(5) = x.substring(24, 26)
  columnArray(6) = x.substring(27, 29)
  columnArray(7) = x.substring(30, 44)
  columnArray(8) = x.substring(45, 58)
  columnArray(9) = x.substring(59, 60)
  columnArray(10) = x.substring(61, 62)
  columnArray(11) = x.substring(63, 64)
  columnArray(12) = x.substring(65, 66)
  columnArray(13) = x.substring(67, 68)
  columnArray(14) = x.substring(69, 70)
  columnArray(15) = x.substring(71, 72)
  columnArray(16) = x.substring(73, 74)
  columnArray(17) = x.substring(75, 76)
  columnArray(18) = x.substring(77, 78)
  columnArray(19) = x.substring(79, 80)
  Row.fromSeq(columnArray)
}

How can I transform this into PySpark? I look forward to your comments.
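Here is a rough sketch of what I think the PySpark equivalent might be, assuming pyspark.sql.Row and the same fixed-width offsets as the Scala code (untested):

from pyspark.sql import Row

# (start, end) offsets for each field, copied from the Scala substrings.
# Python's x[start:end] behaves like Scala's x.substring(start, end):
# the end index is exclusive in both.
FIELD_OFFSETS = [
    (0, 10), (11, 14), (15, 17), (18, 20), (21, 23), (24, 26), (27, 29),
    (30, 44), (45, 58), (59, 60), (61, 62), (63, 64), (65, 66), (67, 68),
    (69, 70), (71, 72), (73, 74), (75, 76), (77, 78), (79, 80),
]

def get_row(x):
    # Only the first 20 of the 95 columns appear in the Scala snippet;
    # the remaining slots stay None, like the unassigned nulls in the array.
    columns = [None] * 95
    for i, (start, end) in enumerate(FIELD_OFFSETS):
        columns[i] = x[start:end]
    return Row(*columns)

I suppose it would then be applied to an RDD of fixed-width lines, e.g. rows = lines_rdd.map(get_row) (lines_rdd is a hypothetical name here), and a DataFrame built from the result. Is this on the right track?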

Regards

asked by Felipe Avalos on 15.05.2018 at 18:19

0 answers