Please find details about the training/consulting requirement below.
Contents:
Read Kafka data and load it into HDFS using Spark Streaming and Scala
Read MySQL data and load it into HDFS using Spark and Scala
Hadoop Production Resource Allocation
Oozie scheduler and Java API/framework integration with the Hadoop cluster
1. Is it real-time data processing or batch processing? There is no real-time data processing here; there are two kinds of jobs:
• Spark Streaming job -> reads data from Kafka and loads it into HDFS using Spark & Scala
• Spark batch job -> reads data from MySQL and loads it into HDFS using Spark & Scala
• Oozie -> used for scheduling both kinds of jobs.
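The streaming job above could look roughly like the following Structured Streaming sketch. The broker address, topic name, and HDFS paths are placeholders, not values from the requirement:

```scala
import org.apache.spark.sql.SparkSession

object KafkaToHdfs {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-to-hdfs")
      .getOrCreate()

    // Subscribe to a Kafka topic (broker and topic names are assumptions)
    val kafkaStream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")
      .option("subscribe", "events")
      .load()

    // Kafka keys/values arrive as bytes; cast them to strings before writing
    val records = kafkaStream.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

    // Continuously append the stream to HDFS as Parquet files;
    // the checkpoint location lets the job recover after a restart
    val query = records.writeStream
      .format("parquet")
      .option("path", "hdfs:///data/raw/events")
      .option("checkpointLocation", "hdfs:///checkpoints/events")
      .start()

    query.awaitTermination()
  }
}
```

This is submitted to the cluster with `spark-submit`; the same shape also works with the older DStream-based Spark Streaming API if that is what the project uses.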
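The batch job is even simpler: Spark's JDBC data source reads the MySQL table and a Parquet write lands it in HDFS. The JDBC URL, table name, credentials, and output path below are illustrative assumptions:

```scala
import org.apache.spark.sql.SparkSession

object MysqlToHdfs {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("mysql-to-hdfs")
      .getOrCreate()

    // Pull a snapshot of a MySQL table over JDBC
    // (URL, table, and credentials are placeholders)
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://mysql-host:3306/appdb")
      .option("dbtable", "orders")
      .option("user", "etl_user")
      .option("password", sys.env("MYSQL_PASSWORD"))
      .load()

    // Write the snapshot to HDFS as Parquet
    df.write
      .mode("overwrite")
      .parquet("hdfs:///data/raw/orders")

    spark.stop()
  }
}
```

The MySQL JDBC driver jar must be on the Spark classpath (e.g. via `--jars` on `spark-submit`).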
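For scheduling, an Oozie coordinator can trigger the batch workflow on a fixed frequency. A minimal sketch, assuming a daily run and a workflow application already deployed to HDFS (the name, dates, and path are placeholders):

```xml
<coordinator-app name="mysql-to-hdfs-daily" frequency="${coord:days(1)}"
                 start="2018-01-01T00:00Z" end="2019-01-01T00:00Z" timezone="UTC"
                 xmlns="uri:oozie:coordinator:0.4">
  <action>
    <workflow>
      <!-- Path to the workflow.xml that launches the Spark batch job -->
      <app-path>hdfs:///apps/oozie/mysql-to-hdfs</app-path>
    </workflow>
  </action>
</coordinator-app>
```

The long-running streaming job is typically started once (Oozie can launch it via a workflow with a Spark action) rather than rescheduled on every tick.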
2. More detail on the data ingestion part - after the above jobs load data into HDFS, a Druid server performs the indexing.
3. Audience experience in Big Data - none of the audience has experience in Big Data, apart from a couple of team members with a Java programming background.
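The Druid indexing step is usually driven by an ingestion spec submitted to the Druid overlord. An abbreviated, hedged sketch of a Hadoop-based batch ingestion spec pointing at the HDFS output above (datasource name and path are assumptions; a real spec also needs a parser/timestampSpec and tuningConfig):

```json
{
  "type": "index_hadoop",
  "spec": {
    "dataSchema": {
      "dataSource": "events",
      "granularitySpec": {
        "segmentGranularity": "day",
        "queryGranularity": "none"
      }
    },
    "ioConfig": {
      "type": "hadoop",
      "inputSpec": {
        "type": "static",
        "paths": "hdfs:///data/raw/events"
      }
    }
  }
}
```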
17 freelancers are bidding on average $176 for this job
I have 6 years of experience in Big Data with Hadoop and Spark, and have delivered solid projects on Spark, Kafka, NiFi, and end-to-end data warehouse implementation, covering both administration and development.
Hi, I have 6 years of experience in Spark, Scala, Hive, Pig, Sqoop, Kafka, etc. I have worked on a Hadoop migration project, which will help me implement your solution. Thank you.
I have worked on Big Data applications for some time. My last application read input from Kafka, analyzed the data using Spark and Hive, and finally stored the results in MySQL.
I am a Big Data professional with 11 years of experience developing Big Data solutions using Hadoop, Spark, Scala, and Kafka with Java programming. I can complete your task as per your requirements. Regards, B. K. Sankhla