The position will be responsible for developing ETL components, providing user access to data via reports and data extracts, utilizing analysis tools such as OLAP, and coding stored procedures. The role requires a strong understanding of database concepts, including data warehouses, operational data stores, and data marts. It also requires in-depth knowledge of ETL concepts and hands-on experience implementing data integrations across multiple database platforms using custom development, scripting languages such as Unix shell, and ETL tools such as Informatica. The candidate must have strong Python and SQL skills and experience developing data extracts and user reports. To be successful, the individual must understand the banking technology landscape and be able to step into existing projects in a hands-on development role. The candidate must be familiar with Agile principles and actively contribute to design and architecture discussions, daily stand-ups, and Agile Sprint planning sessions.
Job Functions/Duties and Responsibilities:
- Perform all database-related work.
- Design and develop enterprise-level data transformations using Python.
- Analyze business user stories and translate them into meaningful tasks.
- Participate in design discussions and contribute to the architecture process.
- Identify potential improvements to the current design/processes.
- Design stable, scalable application databases and data warehouses.
- Code and develop the functionality as per the proposed design and requirements.
- Plan and coordinate the data/process migration across databases.
- Work as part of a banking Agile Squad / Fleet.
- Participate in all aspects of the SDLC (analysis, design, coding, testing, and implementation).
- Actively contribute and participate in design and architecture discussions, daily stand-ups, and Agile Sprint planning sessions.
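As a minimal illustration of the kind of Python data transformation this role involves (a sketch only; the field names "account_id", "opened", and "balance" are hypothetical, not an actual project schema):

```python
from datetime import datetime

def transform(rows):
    """Clean raw string records into typed dicts, dropping malformed ones."""
    out = []
    for row in rows:
        try:
            out.append({
                "account_id": row["account_id"].strip().upper(),
                "opened": datetime.strptime(row["opened"].strip(), "%Y-%m-%d").date(),
                "balance": round(float(row["balance"]), 2),
            })
        except (KeyError, ValueError):
            # Skip malformed records; a production pipeline would log or
            # quarantine them rather than silently discard them.
            continue
    return out

raw = [
    {"account_id": " ab-101 ", "opened": "2023-05-01", "balance": "1500.456"},
    {"account_id": "ab-102", "opened": "not-a-date", "balance": "10"},
]
clean = transform(raw)  # second record is dropped (invalid date)
```

In a real engagement this cleanse-and-type step would typically sit inside an Informatica mapping or a shell-orchestrated batch job rather than standalone Python.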
Required Skills and Experience:
- 5 to 7 years of hands-on experience with relational database systems (e.g., Sybase, DB2, Teradata, or Oracle).
- Proficiency with ETL development (Informatica, Composite).
- In-depth knowledge of Unix/Linux programming (shell and/or Perl).
- Strong SQL skills and database programming skills including creating views, stored procedures, triggers, implementing referential integrity, as well as designing and coding for performance.
- Understanding of the requirements of large enterprise applications (security, entitlements, etc.).
- Experience working in an Agile squad/chapter setup and familiarity with Agile software and tools (e.g., JIRA).
- Experience with DevOps processes and tools (e.g., Jenkins, TeamCity).
- Good communication and leadership skills.
- Organized, disciplined, detail-oriented, self-motivated, and focused on delivery.
- Big Data experience using the Hadoop ecosystem for data storage and processing (e.g., HDFS, Pig, Hive).
- Prior experience in the financial industry.
- Experience with Scala/Spark and cloud databases (e.g., Azure, AWS).
- Minimum BS degree in Computer Science, Engineering, or a related field.