April 2016 - Talend Expert
contact@talendexpert.com US 1-877-283-2691 UK 020 7692 4827

HBase Basics Quiz

To read and download all our premium content with unlimited access, you pay only $6.

When you proceed, you may be asked to sign in or register before being sent to the payment page.

Read More >

How to use Talend with the Dimelo API?

How to use Talend with the Dimelo API? TalendExpert.com can help you ingest your data from Dimelo using Talend ETL. Our teams have successfully developed and delivered this integration, and we have built many reusable template jobs that are ready to use for Dimelo integration. We also offer ready-to-use components that extract and update data through the Dimelo API via simple Talend connectors; no coding skills are needed to integrate your data using the APIs. These components are not available on the Exchange community and are provided for on-demand use only….

Read More >

Tables saved with the Spark SQL DataFrame.saveAsTable method are not compatible with Hive

Writing a DataFrame directly to a Hive table creates a table that is not compatible with Hive; the metadata stored in the metastore can only be correctly interpreted by Spark. For example:

    val hsc = new HiveContext(sc)
    import hsc.implicits._
    val df = sc.parallelize(data).toDF()
    df.write.format("parquet").saveAsTable(tableName)

creates a table with this metadata:

    inputFormat: org.apache.hadoop.mapred.SequenceFileInputFormat
    outputFormat: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat

This also occurs when using an explicit schema, such as:

    val schema = StructType(Seq(…))
    val data = sc.parallelize(Seq(Row(…), …))
    val df = hsc.createDataFrame(data, schema)
    df.write.format("parquet").saveAsTable(tableName)

Workaround:…
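The article's own workaround is truncated above, so as a hedged sketch only: one commonly used approach for this class of problem is to create the table first through Hive DDL (so the metastore records Parquet formats Hive understands) and then append rows with insertInto, which writes into the existing definition rather than letting Spark register its own metadata. The table name "events" and its columns here are hypothetical, and a running Spark 1.x cluster with a Hive metastore is assumed.

```scala
// Sketch under assumptions: "sc" is an existing SparkContext and a
// Hive metastore is configured; "events" and its columns are examples.
import org.apache.spark.sql.hive.HiveContext

val hsc = new HiveContext(sc)
import hsc.implicits._

// Create the table via Hive DDL so the metastore stores
// Hive-compatible Parquet input/output formats.
hsc.sql(
  """CREATE TABLE IF NOT EXISTS events (id INT, name STRING)
    |STORED AS PARQUET""".stripMargin)

val df = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "name")

// insertInto appends into the existing Hive definition instead of
// saveAsTable, which would write Spark-only metadata.
df.write.insertInto("events")
```

With this pattern, both Hive and Spark can read the table, because the metastore entry was created by Hive DDL rather than by Spark's saveAsTable.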

Read More >