Description

Responsibilities

  • Strong work ethic and high self-motivation to work in a challenging, integrated team environment
  • Understand the business requirements
  • Build the workflow configurations
  • Execute and validate Data Sanitization for non-RPSWT LOBs (GMOT, GBT, GMT, EET, ERFT)
  • Provide regular progress updates to the leadership team


Primary

  • 6+ years of IT experience
  • Strong UNIX programming/scripting skills
  • Hadoop architecture/ecosystem experience
  • Experience with Hive and HDFS (Avro, Parquet, ORC formats)
  • Experience with MapReduce/Spark concepts
  • Experience with the IEDPS Data Sanitization workflow
  • Knowledge of file extraction/data access processes, including performing file splits at row and column levels
  • Ability to monitor Hadoop clusters
  • Database management experience with any RDBMS (e.g., DB2, SQL Server, Oracle, Netezza, Sybase, Siebel, Teradata, MySQL)
  • Some ETL (extract, transform, load) experience with any tool (e.g., DataStage, Informatica, Teradata with UNIX)
  • Scheduling/automation exposure using AutoSys
  • Exposure to data connection methods such as JDBC and ODBC
  • Strong communication skills


Secondary

  • Database management experience with any RDBMS
  • Data discovery knowledge of confidential data
  • Bank experience: development role for AITs under GBT, GMOT, GMT, EET, ERFT
  • Kafka knowledge



Education

Any Graduate