Directive to install Apache Spark, Airflow and Hadoop
min €36 EUR / hour
I need a directive to install Apache Spark, Airflow and Hadoop on a Linux server running Debian. If it helps, you can use appropriate containerization such as Kubernetes. Suggestions, questions and comments are welcome.
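For anyone scoping this request, the steps below are a minimal single-node sketch of what such a directive might look like on Debian. All version numbers, install paths, and the download URLs built from them are assumptions for illustration, not requirements from the posting; the Airflow constraints-file mechanism is the one its official docs describe.

```shell
#!/usr/bin/env bash
# Hypothetical single-node provisioning sketch for Debian.
# Versions and /opt paths are example choices, not mandated by the project.
set -euo pipefail

# Java is a prerequisite for both Hadoop and Spark; Airflow needs Python.
sudo apt-get update
sudo apt-get install -y openjdk-11-jdk python3-pip curl

# Hadoop (example version; older releases move to archive.apache.org)
HADOOP_VERSION=3.3.6
curl -fsSL "https://downloads.apache.org/hadoop/common/hadoop-${HADOOP_VERSION}/hadoop-${HADOOP_VERSION}.tar.gz" \
  | sudo tar -xz -C /opt
sudo ln -sfn "/opt/hadoop-${HADOOP_VERSION}" /opt/hadoop

# Spark, using a build that bundles the matching Hadoop 3 client libraries
SPARK_VERSION=3.5.1
curl -fsSL "https://downloads.apache.org/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop3.tgz" \
  | sudo tar -xz -C /opt
sudo ln -sfn "/opt/spark-${SPARK_VERSION}-bin-hadoop3" /opt/spark

# Airflow: pip install pinned by the release's constraints file,
# which is how the Airflow documentation recommends installing it
AIRFLOW_VERSION=2.9.2
PYTHON_VERSION="$(python3 -c 'import sys; print(f"{sys.version_info.major}.{sys.version_info.minor}")')"
pip install "apache-airflow==${AIRFLOW_VERSION}" \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"

# Wire up environment variables for all login shells
cat <<'EOF' | sudo tee /etc/profile.d/bigdata.sh
export HADOOP_HOME=/opt/hadoop
export SPARK_HOME=/opt/spark
export PATH="$PATH:$HADOOP_HOME/bin:$SPARK_HOME/bin"
EOF
```

A Kubernetes-based alternative (Spark on the Kubernetes scheduler, Airflow via its official Helm chart) would replace most of this with manifests, which is presumably what the posting's containerization remark is getting at.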
Project ID: #31822689
About the project
8 freelancers are bidding on average €36/hour for this job
Greetings, I can configure Apache Spark, Airflow and Hadoop for you on a Linux server with Debian OS (Ubuntu 18/20). Let me know if you would be interested in my services; further, any specific versions for Apache Spar…
Hello, I am based in Germany. I am very interested in your project. Please send me a message to discuss more details in private so I can help you with your project. Thank you.
Hope you are doing well. I'm an MLOps and DevOps engineer with production-proven experience. I have experience deploying this kind of data platform on AWS. Kubernetes was used instead of Hadoop YARN, and the benefit was th…
We have expertise in Apache, LAMP, Azure and AWS. Azure & GCP Certified Professional Cloud Architect with multi-tasking skills and over 10 years of extensive experience in AWS, Azure & Google Cloud architectur…
I am the best person for the job. I have worked with servers for many years. I configure, maintain and develop servers.