Job Description: Expertise with big data technologies such as Hadoop, Spark, Java, AWS, and Hive. Strong experience with data platforms such as AWS EC2 and HDFS. Design, implement, and deploy ETL pipelines to load data into the big data platform. Five years of experience with Agile methodology. Work with and support Data Scientists and Machine Learning teams.