Snowflake Data Warehouse Developer in San Diego, CA

Please share your updated resume so that I can better assist you in the future.

Role: “Snowflake Data Warehouse Developer”
Location: San Diego, CA
Duration: Permanent Position (Full-time)
Job Description:
Technical / Functional Skills
1. Ability to develop ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake’s SnowSQL (a minimal sketch follows this list)
2. Ability to write SQL queries against Snowflake.
3. Ability to develop scripts (Unix shell, Python, etc.) to extract, load, and transform data
4. Ability to provide production support for Data Warehouse issues such as data load problems, transformation/translation problems, etc.
5. Ability to integrate on-premises infrastructure with public cloud (AWS, Azure) infrastructure
6. Ability to translate BI and reporting requirements into database and report designs
7. Ability to understand data transformation and translation requirements and select the right tools to meet them
8. Ability to understand data pipelines and modern ways of automating data pipelines using cloud-based and on-premises technologies
9. Actively test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions.
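
For illustration, here is a minimal sketch of item 1, assuming the snowflake-connector-python package: it stages a local CSV file into a table stage with PUT and bulk-loads it with COPY INTO. The account, credentials, table name, and file path are hypothetical placeholders, not details of this role.

# Minimal ETL load sketch using snowflake-connector-python.
# All credentials, account, and table names below are hypothetical,
# and the target table (ORDERS_RAW) is assumed to already exist.
import snowflake.connector

def load_csv(csv_path: str) -> None:
    conn = snowflake.connector.connect(
        user="ETL_USER",               # hypothetical service account
        password="********",
        account="xy12345.us-west-2",   # hypothetical account locator
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # PUT uploads the local file to the table's internal stage (@%TABLE).
        cur.execute(f"PUT file://{csv_path} @%ORDERS_RAW")
        # COPY INTO bulk-loads the staged file; with no FROM clause it
        # reads from the table stage by default.
        cur.execute(
            "COPY INTO ORDERS_RAW "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()

if __name__ == "__main__":
    load_csv("/tmp/orders.csv")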
Experience Requirements
1. 6 months to 1 year of experience with the Snowflake cloud-based data warehouse
2. 6 months to 1 year of experience with Snowflake SnowSQL and writing user-defined functions
3. Hands-on experience with at least one Snowflake implementation
4. 3-5 years’ experience developing ETL, ELT, and data warehousing solutions
5. 3-5 years’ experience with AWS, Azure, or Google Cloud
6. 3-5 years’ experience in data modeling using ERwin
7. 3-5 years’ experience developing Python-based code that reads/writes data to and from databases
8. 3-5 years’ experience developing SQL scripts and stored procedures that process data in databases
9. 3-5 years’ experience loading source system data extracts into a data warehouse
10. 3-5 years’ experience with batch job scheduling and identifying data/job dependencies
11. 3-5 years’ experience automating DevOps builds using Bitbucket/Jenkins/Maven
12. Strong Linux experience with shell and Python scripting
13. 3-5 years’ experience with REST API development and consumption (a consumption sketch follows this list)
14. Strong understanding of common data formats such as CSV, XML, and JSON
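
As a hedged sketch of items 7, 13, and 14: the snippet below consumes JSON from a hypothetical REST endpoint and bulk-inserts the records into a database table through the same Python connector. The endpoint URL, table name, and connection details are placeholders, not systems named in this posting.

# Hedged sketch: consume a REST API and write the JSON records into a
# database table. URL, table, and credentials are hypothetical.
import requests
import snowflake.connector

def sync_users() -> int:
    # Pull JSON records from a (hypothetical) REST endpoint.
    resp = requests.get("https://api.example.com/v1/users", timeout=30)
    resp.raise_for_status()
    users = resp.json()  # expected shape: list of {"id": ..., "email": ...}

    conn = snowflake.connector.connect(
        user="ETL_USER",               # hypothetical credentials
        password="********",
        account="xy12345.us-west-2",
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # executemany issues a bulk parameterized INSERT (pyformat style).
        cur.executemany(
            "INSERT INTO USERS_RAW (ID, EMAIL) VALUES (%s, %s)",
            [(u["id"], u["email"]) for u in users],
        )
        return len(users)
    finally:
        conn.close()

if __name__ == "__main__":
    print(f"loaded {sync_users()} rows")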

Roles & Responsibilities
1. This position works closely with Scrum Masters, Data Architects, QA, and DevOps, as well as multiple organizations within the company
2. Partner with Data Architects, Product Managers, and Scrum Masters to deliver the data integrations and BI solutions required for various projects
3. Enable Continuous Delivery (CD) to production for all data warehousing and BI builds
4. Collaborate with DevOps team to align with CI/CD requirements for assigned projects
5. Ability to understand end-to-end data integration requirements and response-time SLAs to build data-driven solutions that provide a best-in-class customer experience
Generic Managerial Skills
1. Strong understanding of incident management and change management processes to support day-to-day production issues
2. Experience working directly with technical and business teams.

E-Mail: anshul.k@idctechnologies.com