Hi,
I am a Cloudera-certified Spark and Hadoop developer with 3+ years of experience.
I am working on a similar kind of project and can work with multiple types of files.
Please start a chat to initiate the discussion.
Hi,
With 5 years of experience as a big data and Hadoop developer, extracting data from CSV, JSON, etc., transforming it, and loading it into HDFS is my routine work.
I have good experience importing CSV files and data from MySQL, Elasticsearch, and MongoDB into HDFS.
My approach for your task would be:
1) Transfer the CSV from the local machine to the Hadoop sandbox environment using WinSCP.
2) Load the data into HDFS using the hdfs "put" command.
3) Create a Hive table on top of the HDFS file.
4) Run queries in Hive and Pig.
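To make steps 1-4 concrete, here is a minimal Python sketch of a typical transform pass before the HDFS load (the `clean_csv` helper and all file paths are hypothetical, chosen for illustration); the actual transfer and load steps are shown as comments, since they need a running sandbox:

```python
import csv

def clean_csv(src_path, dst_path):
    """Normalize a CSV before loading it into HDFS: strip whitespace
    from every field and drop completely empty rows."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        for row in reader:
            cleaned = [field.strip() for field in row]
            if any(cleaned):  # skip blank rows
                writer.writerow(cleaned)

# On the sandbox, the cleaned file would then be loaded with, e.g.:
#   hdfs dfs -mkdir -p /user/data/input
#   hdfs dfs -put clean.csv /user/data/input/
# after which a Hive table can be created on top of that directory
# and queried from Hive or Pig.
```

The cleaning step is deliberately simple; in practice it would match whatever the Hive table's row format expects (delimiter, quoting, header handling).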
Apart from this experience, I am also a MapR-certified Hadoop, Hive, and HBase developer.
You can message me anytime.
I am new to freelancing, but my work experience will be the assurance of the quality of my work.
Thanks
I am a developer with 5 years of experience in Hadoop, Java, HBase, Hive, Pig, MapReduce, Sqoop, Oozie, Falcon, and Drill.
Along with Hadoop, I have good experience in Spring, Hibernate, Spring Boot, and AWS.
I have good experience in migration projects, where we copy data from local storage to HDFS and use Hive external and managed tables to query the data. So I think I am suitable for your requirement.
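For illustration of the external vs. managed distinction mentioned above: an external table only points Hive at existing HDFS files (DROP TABLE removes the metadata but leaves the data), while a managed table's data lives in Hive's warehouse directory and is deleted along with the table. A small sketch that generates the corresponding HiveQL (the `build_table_ddl` helper, table name, and paths are hypothetical):

```python
def build_table_ddl(name, columns, location=None):
    """Generate HiveQL for a CSV-backed table.

    If `location` is given, emit an EXTERNAL table pointing at existing
    HDFS data; otherwise emit a managed table stored in Hive's
    warehouse directory.
    """
    external = "EXTERNAL " if location else ""
    cols = ",\n  ".join(f"{col} {typ}" for col, typ in columns)
    ddl = (
        f"CREATE {external}TABLE {name} (\n  {cols}\n)\n"
        "ROW FORMAT DELIMITED FIELDS TERMINATED BY ','\n"
        "STORED AS TEXTFILE"
    )
    if location:
        ddl += f"\nLOCATION '{location}'"
    return ddl + ";"

# External table over data already migrated to HDFS:
print(build_table_ddl("sales", [("id", "INT"), ("amount", "DOUBLE")],
                      location="/user/data/input"))
```

Omitting `location` yields a plain `CREATE TABLE`, i.e. a managed table.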
Relevant Skills and Experience
I have 2+ years of solid experience, have worked on multiple migration projects, and have good knowledge of HDFS, Hive, and Scala.