Cassandra Spark Dataframe, CSV: "query in the SELECT clause of the INSERT INTO/OVERWRITE statement generates the same number of columns as its schema"


I have a CSV file with around 100 columns and want to load it into a table with 101 columns (actually 102 columns).

The problem is that I get the following message: org.apache.spark.sql.cassandra.CassandraSourceRelation requires that the query in the SELECT clause of the INSERT INTO/OVERWRITE statement generates the same number of columns as its schema.

How can I overcome this problem?

Here is the code:

    df = sqlContext.read()
            .format("csv")
            .option("delimiter", ";")
            .option("header", "true")
            .load("file:///" + namefile);

and then:

    df.repartition(8).select("col1", "col2", ..., "col100")
      .write().mode(SaveMode.Append).saveAsTable("mytable");
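The error suggests the SELECT produces fewer columns than the target table's schema (100 selected vs. 102 in the table), so one common fix is to pad the projection with NULL literals for the missing columns before writing. A minimal plain-Java sketch of that padding idea (the helper name and column names are illustrative, not from the original post; in Spark the resulting expressions could be passed to df.selectExpr(...)):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class PadColumns {
    // Build SELECT expressions covering every column in the table schema:
    // columns present in the DataFrame are selected as-is, and columns the
    // DataFrame lacks become "NULL AS <name>", so the projection width
    // matches the table schema's column count.
    static List<String> padColumns(List<String> dfCols, List<String> tableCols) {
        Set<String> present = new HashSet<>(dfCols);
        List<String> exprs = new ArrayList<>();
        for (String c : tableCols) {
            exprs.add(present.contains(c) ? c : "NULL AS " + c);
        }
        return exprs;
    }

    public static void main(String[] args) {
        List<String> dfCols = List.of("col1", "col2");
        List<String> tableCols = List.of("col1", "col2", "col3");
        System.out.println(padColumns(dfCols, tableCols));
        // prints [col1, col2, NULL AS col3]
    }
}
```

With expressions like these, a call such as `df.selectExpr(exprs.toArray(new String[0]))` would yield a DataFrame whose column count matches the table, which is what the connector's error message asks for.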

