Using Spark Scala, is there an efficient way to take all the columns of a DataFrame and add N more columns?


I am working with a script that joins two DataFrames, each with many columns; the first is aliased "A" and the second "B", and the columns of each are referenced through those aliases. Additional columns for the join result are filled conditionally and built inside a single select, to avoid repeated withColumn calls and make the process more efficient.

I've tried something like this:

    val correctionDF = clientsDF.alias("A").
      join(dataDF.alias("B"), condition, "full").
      select("*",
        when(col("B.entity").isNull, col("A.entity")).
          otherwise(col("B.entity")).alias("correctionEntity"))

But I get:

Name: Syntax Error. Message: StackTrace:
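
For context, here is a sketch of how the same select-based approach could be written so that it compiles. This assumes the failure comes from mixing the String and Column overloads of select: select("*", ...) resolves to the String overload, which cannot accept a when(...) Column, so passing col("*") keeps every argument a Column. The names clientsDF, dataDF, and condition are taken from the question; the toCorrect column list in the N-column generalization is hypothetical.

    import org.apache.spark.sql.functions.{col, when}

    // Single conditional column: col("*") makes select resolve to
    // its Column* overload instead of the String one.
    val correctionDF = clientsDF.alias("A")
      .join(dataDF.alias("B"), condition, "full")
      .select(
        col("*"),
        when(col("B.entity").isNull, col("A.entity"))
          .otherwise(col("B.entity"))
          .alias("correctionEntity"))

    // Generalizing to N conditional columns in the same single select
    // (the column names here are illustrative, not from the question):
    val toCorrect = Seq("entity", "region")
    val corrections = toCorrect.map { c =>
      when(col(s"B.$c").isNull, col(s"A.$c"))
        .otherwise(col(s"B.$c"))
        .alias(s"correction${c.capitalize}")
    }
    val correctionNDF = clientsDF.alias("A")
      .join(dataDF.alias("B"), condition, "full")
      .select((col("*") +: corrections): _*)

Since the when/otherwise pair only falls back to the "A" column when the "B" column is null, coalesce(col(s"B.$c"), col(s"A.$c")) would express the same fallback more compactly.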

asked by SunoFer 06.11.2018 at 19:39

0 answers