
Continental

IT Engineer, Data & Analytics DevOps

Posted Yesterday
In-Office
Bengaluru, Bengaluru Urban, Karnataka
Senior level
The role involves operating and optimizing Azure-based Data & Analytics platforms, managing CI/CD pipelines, and ensuring pipeline reliability and performance.
Company Description

Continental develops pioneering technologies and services for sustainable and connected mobility of people and their goods. Founded in 1871, the technology company offers safe, efficient, intelligent, and affordable solutions for vehicles, machines, traffic and transportation. In 2023, Continental generated sales of €41.4 billion and currently employs around 200,000 people in 56 countries and markets.

Guided by the vision of being the customer's first choice for material-driven solutions, the ContiTech group sector focuses on development competence and material expertise for products and systems made of rubber, plastics, metal, and fabrics. These can also be equipped with electronic components to add functionality for individual services. ContiTech's industrial growth areas lie primarily in energy, agriculture, construction, and surfaces. In addition, ContiTech serves the automotive and transportation industries as well as rail transport.

The IT Digital and Data Services Competence Center of ContiTech serves all Business Areas in ContiTech and is responsible, among other things, for Data & Analytics, Web and Mobile Software Development, and AI.

The team for Data services specializes in all platforms, business applications and products in the domain of data and analytics, covering the entire spectrum including AI, machine learning, data science, data analysis, reporting and dashboarding.

Job Description

  • Ensure stable, scalable, and secure operation of the Azure-based Data & Analytics platform, including Databricks, Azure-native components, Power BI, and CI/CD infrastructure
  • Offload operational workload from platform architects by taking ownership of infrastructure, deployment automation, and pipeline reliability
  • Enable smooth execution and troubleshooting of data pipelines written in Scala and PySpark, including hybrid integration scenarios such as Power BI with gateway infrastructure
  • Reports to: Head of Data & Analytics IT Competence Center
  • Collaborates with: Platform Architects, Data Engineers, ML Engineers, Power BI Developers
  • Geography: Global (stakeholders in Germany, India, Manila)
  • Operational Scope: Azure services, Databricks workspaces, CI/CD toolchains, Power BI service (incl. gateways), and Spark-based data pipelines

 

Main Tasks

- Operate and optimize Azure resources (ADF, Key Vault, Monitor, Event Hub)
- Administer Databricks workspace access and cluster configs
- Apply Infrastructure-as-Code (Terraform/Bicep)

- Manage CI/CD pipelines for Scala- and PySpark-based data applications
- Integrate build steps (e.g., Maven/SBT, Python wheels) into automated deployments
- Enforce DevSecOps and IaC standards
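Integrating heterogeneous build steps (Maven/SBT for Scala, wheels for PySpark) typically starts with detecting the project type. A minimal Python sketch of that decision; the file-name conventions and commands are common defaults, not Continental's actual pipeline definitions:

```python
from pathlib import Path

def build_command(project_dir):
    """Pick the build step a CI job would run, based on project layout.

    Illustrative only: a real setup would encode this in CI pipeline YAML.
    The marker files checked are common conventions, not a standard.
    """
    root = Path(project_dir)
    if (root / "pom.xml").exists():
        return ["mvn", "-B", "package"]               # Scala/JVM via Maven
    if (root / "build.sbt").exists():
        return ["sbt", "package"]                     # Scala via SBT
    if (root / "pyproject.toml").exists():
        return ["python", "-m", "build", "--wheel"]   # PySpark wheel
    raise ValueError(f"no recognised build file in {project_dir}")
```

The returned argument list could then be handed to the CI runner's shell step.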

- Monitor Spark job execution, analyze failures and stage-level issues using Spark UI and logs
- Configure alerts, metrics, and dashboards for pipelines and infrastructure
- Lead post-incident reviews and reliability improvements
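The alerting work above boils down to evaluating recent run records against thresholds. A hedged sketch, assuming a hypothetical PipelineRun record and made-up thresholds rather than real Azure Monitor or Databricks jobs data:

```python
from dataclasses import dataclass

@dataclass
class PipelineRun:
    pipeline: str
    succeeded: bool
    duration_s: float

def should_alert(runs, max_failure_rate=0.2, max_p95_s=3600.0):
    """Return True when recent runs breach a failure-rate or latency threshold.

    The record shape and thresholds are illustrative assumptions; a real
    check would query the platform's monitoring APIs instead.
    """
    if not runs:
        return False
    failures = sum(1 for r in runs if not r.succeeded)
    if failures / len(runs) > max_failure_rate:
        return True
    durations = sorted(r.duration_s for r in runs)
    p95 = durations[min(len(durations) - 1, int(0.95 * len(durations)))]
    return p95 > max_p95_s

runs = [PipelineRun("sales_daily", True, 1200.0),
        PipelineRun("sales_daily", False, 90.0),
        PipelineRun("sales_daily", True, 1500.0)]
print(should_alert(runs))  # → True: one failure in three runs exceeds 20%
```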

- Administer Power BI tenant configuration, workspace access, and usage monitoring
- Operate and monitor on-premises or VM-hosted enterprise gateways
- Troubleshoot dataset refreshes and hybrid data integration

- Support runtime execution of production pipelines and ensure SLA adherence
- Collaborate with engineers to resolve Spark performance issues or deployment errors
- Participate in schema evolution and environment transitions
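Keeping production runs within SLA usually involves retrying transient failures with backoff. A small illustrative helper; the callable it wraps (e.g. something that triggers a job run or a dataset refresh over a REST API) is an assumption and not shown:

```python
import time

def run_with_retries(action, attempts=3, base_delay_s=1.0, sleep=time.sleep):
    """Run `action` and retry transient failures with exponential backoff.

    `action` is any zero-argument callable. Treating RuntimeError as the
    transient error class is a simplification for this sketch.
    """
    last_error = None
    for attempt in range(attempts):
        try:
            return action()
        except RuntimeError as exc:
            last_error = exc
            if attempt < attempts - 1:
                sleep(base_delay_s * (2 ** attempt))  # 1s, 2s, 4s, ...
    raise last_error
```

Injecting `sleep` keeps the helper testable without real delays.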

- Enforce platform policies (tagging, RBAC, audit logging)
- Maintain credential and secrets security using Key Vault and managed identity
- Conduct audits across Azure, Databricks, and Power BI environments
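A tagging audit like the one listed above can be sketched as a compliance check over resource metadata. The required tag set and the resource shape below are illustrative assumptions; querying Azure itself is out of scope for the sketch:

```python
REQUIRED_TAGS = {"owner", "cost-center", "environment"}  # assumed policy

def missing_tags(resources, required=REQUIRED_TAGS):
    """Return {resource_id: missing tag names} for non-compliant resources.

    `resources` loosely mirrors an Azure resource listing (id + tags dict).
    """
    report = {}
    for res in resources:
        gaps = required - set(res.get("tags") or {})
        if gaps:
            report[res["id"]] = sorted(gaps)
    return report

resources = [
    {"id": "/subscriptions/s1/vaults/kv1",
     "tags": {"owner": "data-team", "cost-center": "42", "environment": "prod"}},
    {"id": "/subscriptions/s1/adf/df1", "tags": {"owner": "data-team"}},
]
print(missing_tags(resources))
# → {'/subscriptions/s1/adf/df1': ['cost-center', 'environment']}
```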

Qualifications

  • Education / Certification:
    Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or related field.
    Preferred: Azure DevOps Engineer Expert, Power BI Admin, or Databricks Admin certifications
  • Professional Experience:
    Minimum 5 years in cloud platform engineering, DevOps, or SRE roles within data or analytics platforms
    Hands-on experience with Spark (Databricks), PySpark, and CI/CD for JVM-based data applications
  • Project or Process Experience:
    Proven ability to deploy and operate complex data pipeline ecosystems using Scala and PySpark
    Experience in managing Power BI service in enterprise setups, including hybrid gateway environments
  • Leadership Experience:
    No formal people leadership required; expected to lead through technical authority and cross-team collaboration
  • Intercultural / International Experience:
    Experience working in distributed teams across time zones and cultures; strong communication skills and resilience

Additional Information

The well-being of our employees is important to us. That's why we offer exciting career prospects and support you in achieving a good work-life balance with additional benefits such as:

  • Training opportunities
  • Mobile and flexible working models
  • Sabbaticals

and much more...


Diversity, Inclusion & Belonging are important to us and make our company strong and successful. We offer equal opportunities to everyone - regardless of age, gender, nationality, cultural background, disability, religion, ideology or sexual orientation.

Ready to drive with Continental? Take the first step and fill in the online application.

Top Skills

ADF
Azure
Bicep
CI/CD
Databricks
Event Hub
Key Vault
Power BI
PySpark
Scala
Terraform
