Error due to insufficient memory when running an algorithm in Apache Spark

The error is:

System memory 259522560 must be at least 471859200. Increase the heap size by using the --driver-memory or spark.driver.memory option in the Spark configuration.

This is my code; the error occurs at the spark variable as soon as the run starts.

import org.apache.spark.sql.{Encoders, SparkSession}

object miObjetc {

    val spark: SparkSession = 
      SparkSession.builder()
        .appName("My Name")
        .master("local[*]")
        .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
        .getOrCreate()

    spark.sparkContext.setLogLevel("off")

    val tipoDistancia = new DistanciaEuclidean

    import spark.implicits._

    val es = Encoders.kryo[Tupla]

    //val tordd = df.repartition(2)


    def main(args: Array[String]): Unit = {

        // my code .....

    }

}

I already tried it this way:

val spark: SparkSession =
  SparkSession
    .builder()
    .appName("My Name")
    .master("local[*]")
    .config("spark.driver.memory", "6g")
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .getOrCreate()

and the error continues.

The machine it runs on is an i7 with 6 GB of RAM.

Thanks for your help.
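From the error message it seems --driver-memory has to be passed before the JVM starts, since in local mode the driver JVM is already running by the time SparkSession.builder() executes, so spark.driver.memory set in code arrives too late. This is roughly how I understand the flag would be passed with spark-submit (the jar path is just an example):

```shell
# Set the driver heap before the JVM starts; in local mode,
# spark.driver.memory configured inside the program is applied too late.
spark-submit \
  --master "local[*]" \
  --driver-memory 4g \
  --class miObjetc \
  target/scala-2.12/my-app_2.12-0.1.jar  # example jar path
```

When running from an IDE instead of spark-submit, the equivalent would be adding `-Xmx4g` to the VM options of the run configuration. With only 6 GB of physical RAM, a value below 6g leaves memory for the operating system.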

asked by 19lenyar94 on 20.03.2018 at 18:57