● Around 4-7 years of experience working with integration platforms, data warehouses, data lakes, and ETL/ELT loads.
● Must be strong in coding in Java, Scala, or Python, with experience integrating different source and target systems such as Salesforce, BB CRM, and RDBMS (Oracle, Postgres, MySQL, SQL Server).
● Extract data from source systems via APIs, web services, or binlog files using AWS Glue with PySpark or Spark with Scala.
● Must have experience retrieving data from REST and SOAP APIs.
● Experience with AWS Redshift.
● Develop ETL jobs after gathering and translating end-user requirements.
● Write Python and shell scripts to parse data and invoke APIs and other scripts.
● Work with ETL and ELT workflow management tools.
● Experience working with batch and real-time streaming data.
● Experience with ETL tools like Pentaho, Informatica, Talend, or similar.
● Experience with data ingestion, profiling, cleansing, transformation, and loading into dimensional models.
● Proficient in writing efficient SQL queries.
Nice to Have
● Data migration from CRM tools such as BB CRM or Luminate Online to Salesforce CRM/NPSP.
● Experience with visualization tools like Power BI, Tableau, Qlik, or Datorama.
● Experience with data warehouse modeling.
● Experience with Salesforce NPSP and Marketing Cloud.