Difference between Standard jobs and Big Data jobs in Talend
- A Standard job is a Java process.
- A Big Data job is also defined in Java, but it gets translated into code for the target execution engine (MapReduce or Spark).
- Standard – This is just a Java process. It can access many data sources via JDBC, HDFS, etc., but the main process executes in a single JVM, typically on the Talend JobServer.
- Big Data Streaming – This is for real-time/micro-batch processing; the jobs are submitted to Spark or Storm for execution.
- Big Data Batch – Uses either MR1 (NameNode, etc.) or YARN for MapReduce execution, or Spark (with or without YARN, depending on the platform).
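To make the distinction concrete, here is a minimal sketch (not Talend-generated code, just an illustration) of the kind of map/reduce-style transformation involved. In a Standard job, logic like this runs inside a single JVM; in a Big Data Batch job, the same logical transformation would be handed off to MapReduce or Spark to run distributed across a cluster. The class and method names are hypothetical.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Illustration only: a word count expressed as a "map" phase (split lines
// into words) followed by a "reduce" phase (count occurrences per word).
// In a Standard job this executes in one JVM; a Big Data Batch job would
// express the same transformation for a distributed engine instead.
public class StandardVsBigData {

    static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                // map: break each line into individual words
                .flatMap(line -> Arrays.stream(line.split("\\s+")))
                // reduce: group identical words and count them
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts =
                wordCount(List.of("spark spark yarn", "yarn"));
        System.out.println(counts.get("spark") + " " + counts.get("yarn"));
    }
}
```

The point of the sketch: the business logic is identical in both job types; what changes is where it executes, a single JVM versus a cluster engine.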
Do you need more help?