Big Data & Cloud Engineer

Location: Bengaluru, Karnataka, India
Posted: Mar 11, 2025
Valid Through: Mar 11, 2026
Work Mode: Hybrid
Employment Type: Full-Time
Experience: 2 Years
Job Description

We are seeking a skilled Big Data & Cloud Engineer to join our dynamic team. In this role, you will be responsible for designing, implementing, and managing big data platforms on AWS and GCP cloud environments. You will work with a variety of technologies, including Kafka, ZooKeeper, Hadoop, HBase, Spark, Hive, and Elasticsearch, to keep our data infrastructure performant and scalable. If you have experience with Terraform, Java, and shell scripting, along with strong Linux administration skills, we would love to hear from you!

Key Responsibilities

  • Design, set up, and manage Big Data platforms on AWS and GCP.
  • Develop and maintain Terraform scripts for infrastructure automation.
  • Implement and manage Kafka, ZooKeeper, Hadoop, HBase, Spark, and Hive to optimize data processing and analytics (see the illustrative sketch after this list).
  • Work with Elasticsearch for efficient data retrieval and indexing.
  • Create and maintain shell scripts for automation and system monitoring.
  • Support and optimize Java-based applications within the cloud infrastructure.
  • Ensure high availability, performance, and security of big data solutions.
  • Troubleshoot and resolve system performance issues in a cloud environment.
  • Collaborate with cross-functional teams, including DevOps, Data Engineering, and Software Development teams.
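
To give a flavor of the hands-on work these responsibilities involve, here is a minimal sketch of a Java Kafka producer of the kind you might build or support in this role. The class name, broker address, topic, key, and payload are hypothetical assumptions for illustration, not details from this posting.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class EventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Hypothetical broker address; a real deployment would point at the cluster.
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            // acks=all trades a little latency for stronger durability guarantees.
            props.put("acks", "all");

            // try-with-resources closes the producer and releases its network resources.
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Hypothetical topic, key, and JSON payload, purely illustrative.
                producer.send(new ProducerRecord<>("events", "user-42", "{\"action\":\"login\"}"));
                producer.flush();
            }
        }
    }

In practice, choices such as serialization format, partitioning keys, and acknowledgment settings would be tuned to the workload and availability requirements.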

Mandatory Skills & Experience

  • Strong expertise in Linux administration.
  • Hands-on experience with AWS and GCP cloud platforms.
  • Proficiency in Terraform for infrastructure automation.
  • Experience in Java and shell scripting.
  • In-depth knowledge of Kafka, ZooKeeper, Hadoop, HBase, Spark, and Hive.
  • Strong understanding of Elasticsearch.
  • Experience in setting up and managing Big Data platforms on AWS or GCP.
  • Ability to troubleshoot and resolve complex infrastructure issues.

Nice-to-Have Skills

  • Knowledge of Aerospike for NoSQL data storage.
  • Experience with Druid, Airflow, and Tableau.
  • Familiarity with DevOps practices and tools.
  • Understanding of Big Data architecture best practices.

About GlobalLogic

GlobalLogic is a leading digital engineering company, helping global brands design and develop innovative products, platforms, and digital experiences. We integrate experience design, complex engineering, and data expertise to help our clients drive digital transformation.

As a Hitachi Group Company, we leverage cutting-edge technologies to create sustainable solutions that enhance quality of life, business efficiency, and customer experiences. Our expertise spans industries including automotive, finance, healthcare, manufacturing, media, and technology.

At GlobalLogic, we don’t just build software; we create the future of digital innovation. Join us and be part of something extraordinary!