Job Description

As a Data Engineer, you will be responsible for:

  • Delivering data-driven product features by working closely with developers and stakeholders.
  • Managing backlog items and designing, developing, delivering, and supporting data changes.
  • Ensuring data structures and database designs meet application, scalability, and architectural standards.
  • Building and maintaining ETL pipelines for OLAP and OLTP systems.
  • Driving build and release activities for data solutions and supporting related architecture artefacts.
  • Continuously improving and optimizing data models and warehouse structures.
  • Collaborating with product owners, architects, BAs, and scrum teams to deliver change on schedule.

 

Knowledge/Skills

  • Strong hands-on SQL and Python skills.
  • Strong hands-on experience with relational databases, particularly PostgreSQL.
  • Strong hands-on experience with streaming platforms and event-driven architectures.
  • Strong hands-on experience with AWS cloud services (EKS, EC2, S3, RDS, Lambda, API Gateway, ECS, VPC, IAM, CloudWatch, etc.) and cloud architecture best practices.
  • Solid expertise in physical data modelling, database design, change data capture (CDC), and performance tuning.
  • Experience creating REST APIs.
  • Experience designing, developing, and deploying applications using OLTP and OLAP systems.
  • Experience with ETL tools such as Boomi; Spark or PySpark skills are desirable.
  • Hands-on experience with CI/CD tools.
  • Exposure to financial services, compliance, or regulated data domains.
  • Familiarity with data governance, lineage, and auditability concepts.
  • Experience working across onsite/offshore teams.
  • Strong team player with a willingness to learn new technologies.

Experience

  • 10+ years of experience with databases and ETL tools.
  • Must have end-to-end ETL experience spanning documentation, development, testing, and deployment to production.
  • Strong relational database background and SQL skills.
  • Proficiency in automation and continuous delivery methods.
  • Proficiency in all aspects of the Software Development Life Cycle in an Agile environment.

 

Job Description for Integration Engineer

As an Integration Engineer, you will be responsible for:

  • Designing, building, and owning end-to-end enterprise integrations supporting critical business workflows.
  • Developing and maintaining Boomi-based integrations, including process design, connector usage, data mapping, error handling, retries, and environment promotions.
  • Building event-driven integrations using Kafka producers and consumers with proper offset management.
  • Implementing asynchronous and API-led integration patterns to decouple systems.
  • Managing backlog items and delivering integration changes using Agile practices.
  • Driving build, deployment, and release activities for integration solutions across environments.
  • Ensuring integrations meet scalability, security, auditability, and reliability standards.
  • Monitoring and supporting production integrations, including incident analysis, message replay, and recovery.
  • Collaborating with product owners, architects, and application teams to deliver change on schedule.

 

Knowledge/Skills

  • Strong hands-on experience with Boomi (process design, connectors, mappings, Atom/Molecule).
  • Strong hands-on experience with Kafka including topics, partitions, offsets, and retries.
  • Proven experience designing event-driven and message-based architectures.
  • Strong understanding of resiliency patterns (idempotency, retries, DLQs, replay).
  • Experience building and consuming REST APIs (JSON/XML, authentication, HTTP semantics).
  • Ability to troubleshoot and resolve production integration issues, including failed messages and consumer lag.
  • Familiarity with logging, monitoring, and operational observability.
  • Experience working across onsite/offshore teams.
  • Strong ownership mindset and willingness to learn evolving integration patterns.

 

Experience

  • 6+ years of experience in enterprise integration and middleware development.
  • Proven end-to-end integration ownership from design through production support.
  • Mandatory hands-on experience delivering integrations using Boomi and Kafka.
  • Experience supporting business-critical, high-volume integrations in production.
  • Strong understanding of integration patterns, data transformation, and error-handling strategies.
  • Proficiency in automation, CI/CD, and Agile delivery practices.