(555) 432-1000 | resumesample@example.com

Professional Summary
Over 8 years of IT experience in data warehousing and business intelligence, with an emphasis on project planning and management, business requirements analysis, application design, development, testing, implementation, and maintenance of client/server data warehouses.
Seeking a challenging career in data warehousing and business intelligence with growth potential in technical as well as functional domains, working on critical, time-bound projects where I can apply technological skills and knowledge in the best possible way.
Responsible for various DBA activities such as setting up access rights and space rights for the Teradata environment.
Analyzed Test Track tickets and created JIRA stories.
Created Snowpipe for continuous data load.
Designed and developed ETL mappings according to business requirements.
Used ETL to extract files for the external vendors and coordinated that effort.
In-depth knowledge of Data Sharing in Snowflake, and of row-level and column-level security.
Designed the mapping document, which serves as a guideline for ETL coding.
Handled performance issues by creating indexes and aggregate tables, and by monitoring NQSQuery and tuning reports.
Developed the repository model for the different work streams with the necessary logic, which involved creating the Physical, BMM, and Presentation layers.
Developed and sustained an innovative, resilient, developer-focused AWS ecosystem (platform and tooling).
Used Rational Manager and Rational ClearQuest for writing test cases and logging defects.
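The Snowpipe work mentioned above centers on a pipe that auto-ingests files from a stage into a target table. A minimal sketch of the statement involved, assuming hypothetical pipe, table, stage, and file-format names (none of these identifiers come from the resume):

```python
# Hypothetical sketch: build the CREATE PIPE statement used for Snowpipe
# continuous loading. AUTO_INGEST = TRUE makes the pipe load files as soon
# as the stage's cloud storage notifies Snowflake of their arrival.
# All object names below are illustrative.

def create_pipe_sql(pipe: str, table: str, stage: str, file_format: str) -> str:
    """Return a CREATE PIPE statement wrapping a COPY INTO from a stage."""
    return (
        f"CREATE OR REPLACE PIPE {pipe} AUTO_INGEST = TRUE AS "
        f"COPY INTO {table} FROM @{stage} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}')"
    )

sql = create_pipe_sql("raw.orders_pipe", "raw.orders", "raw.orders_stage", "csv_fmt")
print(sql)
```

Generating the DDL from parameters like this keeps pipe definitions consistent when the same pattern is repeated across many tables.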
Developed and optimized complex SQL queries and stored procedures to extract insights from large datasets.
Produced and reviewed data mapping documents.
Collaborated with cross-functional teams to deliver projects on time and within budget.
Built and maintained data warehousing solutions using Snowflake, allowing for faster data access and improved reporting capabilities.
Good knowledge of the Agile and Waterfall methodologies in the Software Development Life Cycle.
Redesigned views in Snowflake to increase performance.
Implemented a Snowflake data warehouse for a client, resulting in a 30% increase in query performance.
Migrated on-premise data to Snowflake, reducing query time by 50%.
Designed and developed a real-time data pipeline using Snowpipe to load data from Kafka with 99.99% reliability.
Built and optimized ETL processes to load data into Snowflake, reducing load time by 40%.
Designed and implemented data pipelines using Apache NiFi and Airflow, processing over 2TB of data daily.
Developed custom connectors for Apache NiFi to integrate with various data sources, increasing data acquisition speed by 50%.
Collaborated with the BI team to design and implement data models in Snowflake for reporting purposes.
Reduced ETL job failures by 90% through code optimizations and error-handling improvements.
Reduced data processing time by 50% by optimizing Snowflake performance and implementing parallel processing.
Built automated data quality checks using Snowflake streams and notifications, resulting in a 25% reduction in data errors.
Implemented Snowflake resource monitors to proactively identify and resolve resource contention issues, leading to a 30% reduction in query failures.
Designed and implemented a Snowflake-based data warehousing solution that improved data accessibility and reduced report generation time by 40%.
Collaborated with cross-functional teams to design and implement a data governance framework, resulting in improved data security and compliance.
Implemented a Snowflake-based data lake architecture that reduced data processing costs by 30%.
Developed and maintained data quality checks and data validation processes, reducing data errors by 20%.
Designed and implemented a real-time data processing pipeline using Apache Spark and Snowflake, resulting in faster data insights and improved decision-making.
Collaborated with business analysts and data scientists to design and implement scalable data models using Snowflake, resulting in improved data accuracy and analysis.
Implemented a data catalog using Snowflake metadata tables, resulting in improved data discovery and accessibility.
Performed functional, regression, system, integration, and end-to-end testing.
Analyzed the source, the requirements, and the existing OLTP system, and identified the required dimensions and facts from the database.
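The automated data-quality checks described above read newly loaded rows and flag records that fail validation. A self-contained sketch of that row-level logic, with plain dicts standing in for rows read from a Snowflake stream (field names are illustrative, not from the resume):

```python
# Illustrative data-quality check: split incoming rows into valid rows and
# error rows based on a not-null check over required fields. In Snowflake
# the input would come from a stream of newly loaded records.

def check_rows(rows, required_fields):
    """Return (valid_rows, errors); errors pair each bad row with its missing fields."""
    valid, errors = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            errors.append((row, missing))
        else:
            valid.append(row)
    return valid, errors

rows = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]
good, bad = check_rows(rows, ["id", "amount"])
```

Routing the error rows to a quarantine table rather than failing the whole load is what lets a check like this reduce downstream data errors without blocking ingestion.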
Developed ETL programs using Informatica to implement the business requirements.
Communicated with business customers to discuss issues and requirements.
Created shell scripts to fine-tune the ETL flow of the Informatica workflows.
Used Informatica file watch events to poll the FTP sites for the external mainframe files.
Provided production support to resolve ongoing issues and troubleshoot problems.
Provided performance support at the functional level and map level.
Used relational SQL wherever possible to minimize data transfer over the network.
Effectively used Informatica parameter files for defining mapping variables, FTP connections, and relational connections.
Involved in enhancement and maintenance activities of the data warehouse, including tuning and modifying stored procedures for code enhancements.
Effectively worked in an Informatica version-based environment and used deployment groups to migrate objects.
Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
Effectively worked in an onsite/offshore work model.
Used pre- and post-assignment variables to pass variable values from one session to another.
Designed workflows with many sessions using decision, assignment, event-wait, and event-raise tasks, and used the Informatica scheduler to schedule jobs.
Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.
Performed unit testing at various levels of ETL and was actively involved in team code reviews.
Identified problems in existing production and developed one-time scripts to correct them.
Identified key dimensions and measures for business performance.
Developed the metadata repository (.rpd) using the Oracle BI Admin Tool.
Created and used reusable transformations to improve maintainability of the mappings.
Fixed the SQL/PLSQL loads whenever scheduled jobs failed.
Extracted business logic and identified entities and measures/dimensions from the existing data using the Business Requirement Document.
Conducted ad-hoc analysis and provided insights to stakeholders.
Designed new reports in Jasper using tables, charts and graphs, crosstabs, grouping, and sorting.
Installed MongoDB and configured a three-node replica set including one arbiter.
Worked in a team of 14 and system-tested the DMCS 2 application.
Used temporary and transient tables on different datasets.
Well versed with Snowflake features like clustering, Time Travel, cloning, logical data warehouse, caching, etc.
Experience in using Snowflake Clone and Time Travel.
Extensive work experience in bulk loading using the COPY command.
Wrote Unix shell scripts to automate manual tasks.
Architected an OBIEE solution to analyze client reporting needs.
Experience in real-time streaming frameworks like Apache Storm.
Customized all dashboards and reports to look and feel as per the business requirements, using different analytical views.
Strong experience with ETL technologies and SQL.
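The Snowflake Clone and Time Travel experience noted above boils down to two statement shapes: querying a table as of a past point and making a zero-copy clone. A hedged sketch, with illustrative table names and offset:

```python
# Sketch of Snowflake Time Travel and zero-copy cloning statements.
# AT (OFFSET => -n) reads a table as it was n seconds ago; CLONE creates
# a zero-copy duplicate. Table names here are hypothetical examples.

def time_travel_sql(table: str, seconds_ago: int) -> str:
    """Query a table's state as of seconds_ago in the past."""
    return f"SELECT * FROM {table} AT (OFFSET => -{seconds_ago})"

def clone_sql(source: str, target: str) -> str:
    """Create a zero-copy clone of an existing table."""
    return f"CREATE TABLE {target} CLONE {source}"

print(time_travel_sql("sales.orders", 3600))   # table state one hour ago
print(clone_sql("sales.orders", "sales.orders_backup"))
```

Cloning before a risky load, then using Time Travel to diff against the pre-load state, is a common pairing of these two features.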
Sr. Snowflake Developer
SUMMARY: Over 13 years of experience in the IT industry, with experience in Snowflake, ODI, INFORMATICA, OBIEE, OBIA, and Power BI.
Developed around 50 Matillion jobs to load data from S3 to Snowflake tables.
Loaded data from Azure Data Factory to Snowflake.
Wrote scripts and an indexing strategy for a migration to Confidential Redshift from SQL Server and MySQL databases.
Good understanding of Teradata SQL, the Explain command, statistics, locks, and creation of views.
Created views and alias tables in the Physical Layer.
Expertise in designing and developing reports using Hyperion Essbase cubes.
Used Spark SQL to create schema RDDs, loaded them into Hive tables, and handled structured data with Spark SQL.
Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns.
Adapt quickly to the latest technology, applying analytical, logical, and innovative knowledge to provide excellent software solutions.
Participated in the development, improvement, and maintenance of Snowflake database applications.
Strong experience with Snowflake design and development.
Worked on loading data into Snowflake DB in the cloud from various sources.
Used COPY, LIST, PUT, and GET commands for validating the internal stage files.
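The FLATTEN usage above turns each element of an array stored in a VARIANT column into its own output row, as in SELECT t.id, f.value FROM t, LATERAL FLATTEN(input => t.items) f. A self-contained sketch of that row-explosion behavior in plain Python (the sample data and column names are illustrative):

```python
# Mimic LATERAL FLATTEN over a VARIANT column holding an array: emit one
# output row per array element, carrying the other columns alongside.

import json

def lateral_flatten(rows, variant_col):
    """Yield one row per element of the array stored in variant_col."""
    for row in rows:
        raw = row[variant_col]
        items = json.loads(raw) if isinstance(raw, str) else raw
        for value in items:
            out = dict(row)          # keep the non-variant columns
            out[variant_col] = value  # replace the array with one element
            yield out

rows = [{"id": 1, "items": ["a", "b"]}, {"id": 2, "items": ["c"]}]
flat = list(lateral_flatten(rows, "items"))  # three rows: a, b, c
```

This is the same shape change FLATTEN performs server-side, which is why it pairs naturally with semi-structured JSON loaded into VARIANT columns.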
Worked on the Snowflake Shared Technology Environment, providing stable infrastructure, a secured environment, reusable generic frameworks, robust design architecture, technology expertise, best practices, and automated SCBD (Secured Database Connections, Code Review, Build Process, Deployment Process) utilities.
Worked on Snowflake schemas and data warehousing.
Experience in working with HP Quality Center (HP QC) for finding defects and fixing issues.
Snowflake Developer, ABC Corp, 01/2019 - Present
Developed a real-time data processing system, reducing the time to process and analyze data by 50%.
Excellent experience transforming the data in Snowflake into different models using dbt.
JPMorgan Chase & Co. - Alhambra, CA
Spark; Hive (LLAP, Beeline); HDFS; MapReduce; Pig; Sqoop; HBase; Oozie; Flume. Hadoop distributions: Cloudera, Hortonworks.
Implemented the Change Data Capture (CDC) feature of ODI to refresh the data in the Enterprise Data Warehouse (EDW).
Created complex views for Power BI reports.
Built business logic in stored procedures to extract data in XML format to be fed to Murex systems.
Designed database objects including stored procedures, triggers, views, constraints, etc.
Designed, deployed, and maintained complex canned reports using SQL Server 2008 Reporting Services (SSRS).
Performed post-production validations: code validation and data validation after completion of the first cycle run.
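The XML extraction described above serializes queried rows into an XML document for a downstream consumer. A minimal sketch of that serialization step, assuming hypothetical element names (the real Murex schema is not given in the resume):

```python
# Illustrative sketch: serialize result rows into an XML document of the
# kind handed to a downstream system. Element names are hypothetical.

import xml.etree.ElementTree as ET

def rows_to_xml(rows, root_tag="trades", row_tag="trade"):
    """Wrap each row dict in a <trade> element under a <trades> root."""
    root = ET.Element(root_tag)
    for row in rows:
        el = ET.SubElement(root, row_tag)
        for key, val in row.items():
            ET.SubElement(el, key).text = str(val)
    return ET.tostring(root, encoding="unicode")

xml_doc = rows_to_xml([{"id": 101, "notional": 5000}])
```

In the resume's setup this logic lives in a stored procedure; the sketch only shows the row-to-XML mapping itself.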
Work Experience
Data Engineer
Worked on Cloudera and Hortonworks distributions.
Strong experience in working with ETL Informatica (10.4/10.9/8.6/7.13), including the components Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor, Informatica Server, and Repository Manager.
Senior Software Engineer - Snowflake Developer
Worked on various kinds of transformations like Expression, Aggregator, Stored Procedure, Java, Lookup, Filter, Joiner, Rank, Router, and Update Strategy.
Ensured the correctness and integrity of data via control files and other validation methods.