Directive to install Apache Spark, Airflow and Hadoop

Closed · Posted 2 years ago · Paid on delivery

I need a directive to install Apache Spark, Airflow, and Hadoop on a Linux server running Debian. If it helps, you can use appropriate containerization such as Kubernetes. Suggestions, questions, and comments are welcome.
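
For a sense of the deliverable, here is a minimal sketch of how the installed pieces could be verified together, assuming Airflow is pip-installed with the apache-airflow-providers-apache-spark package and Spark is unpacked under /opt/spark (both assumptions, not details from the brief): an Airflow DAG that submits Spark's bundled Pi example.

```python
# Hypothetical smoke-test DAG: submits Spark's bundled Pi example through
# Airflow once both are installed. Names, paths, and dates are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="spark_install_smoke_test",   # illustrative name
    start_date=datetime(2021, 10, 1),
    schedule_interval=None,              # trigger manually to verify the stack
    catchup=False,
) as dag:
    # "spark_default" must point at the local Spark install; the example
    # script path depends on where Spark was unpacked on the Debian host.
    submit_pi = SparkSubmitOperator(
        task_id="submit_spark_pi",
        conn_id="spark_default",
        application="/opt/spark/examples/src/main/python/pi.py",
        verbose=True,
    )
```

Triggering this DAG once from the Airflow UI is a quick way to confirm that Airflow and Spark are wired together after installation.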

Linux Hadoop System Admin Apache Spark Kubernetes

Project ID: #31822689

About the project

8 proposals · Remote project · Active 2 years ago

8 freelancers are bidding an average of €36/hour for this job

aesthetichunzai

Greetings, I can configure Apache Spark, Airflow, and Hadoop for you on a Linux server with Debian OS (Ubuntu 18/20). Let me know if you would be interested in my services. Further, are there any specific versions for Apache Spark…

€36 EUR / hour
(13 Reviews)
5.1
moezbouzayani

Hello, I am based in Germany. I am very interested in your project. Please send me a message so we can discuss more details in private and I can help you with your project. Thank you.

€36 EUR / hour
(5 Reviews)
3.1
jframirezr22

Hope you are doing well. I'm an MLOps & DevOps engineer with proven production experience. I have experience deploying that kind of data platform on AWS. Kubernetes was used instead of Hadoop YARN, and the benefit was th…
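
As a rough illustration of the Kubernetes-instead-of-YARN setup this bid describes, here is a sketch of a PySpark session whose master is a Kubernetes API server rather than yarn; the API server URL, namespace, and container image are placeholders, not details from the bid.

```python
# Hypothetical PySpark session running executors on Kubernetes instead of
# Hadoop YARN. All cluster-specific values below are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("k8s-smoke-test")
    # "k8s://<api-server>" replaces the usual "yarn" master URL.
    .master("k8s://https://kubernetes.example.com:6443")
    .config("spark.kubernetes.namespace", "spark-jobs")
    .config("spark.kubernetes.container.image", "example/spark-py:3.1.2")
    .config("spark.executor.instances", "2")
    .getOrCreate()
)

# Sanity check: distribute a trivial computation across the executors.
print(spark.sparkContext.parallelize(range(1000)).sum())
spark.stop()
```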

€36 EUR / hour
(1 Review)
3.2
sanket73

Hi, I just read your job posting and I think I am a fit for this job, as I have 4 years of experience building applications for small startups to big organizations like VMware, Western Union, and Send Safely, and deploy…

€36 EUR / hour
(1 Review)
0.0
AbacusSolution

We have expertise in Apache, LAMP, Azure, and AWS. Azure & GCP Certified Professional Cloud Architect with multi-tasking skills and more than 10 years of extensive experience in AWS, Azure & Google Cloud architecture…

€36 EUR / hour
(0 Reviews)
0.0
horvathjozsef22

I am the best person for the job. I have worked with servers for many years. I configure, maintain and develop servers.

€36 EUR / hour
(0 Reviews)
0.0