Senior Data Engineer - DBT - NYC, NY (Remote)
Company: Lorven Technologies
Job description:

Role: Senior Data Engineer - DBT
Location: NYC, NY (Remote)
Duration: Long-Term

Must have:
- 10+ years of experience
- Great communicator / client facing
- Individual contributor
- 100% hands-on in the skills listed below

- DBT Proficiency:

- Model development:
 - Experience in creating complex DBT models, including incremental models, snapshots, and documentation
 - Ability to write and maintain DBT macros for reusable code
- Testing and documentation:
 - Proficiency in implementing DBT tests for data validation and quality checks
 - Familiarity with generating and maintaining documentation using DBT's built-in features
- Version control:
 - Experience in managing DBT projects using Git, including implementing a CI/CD process from scratch (see the sketch after this list)
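The posting contains no code; the following is only a minimal, illustrative sketch of the kind of dbt build-and-test step the CI/CD requirement describes. It assumes dbt-core 1.5+ (which provides the programmatic dbtRunner interface); the project and profiles paths are placeholders.

```python
# Hypothetical CI step that builds and tests a dbt project programmatically.
# Assumes dbt-core >= 1.5, which exposes dbtRunner; paths below are placeholders.
from dbt.cli.main import dbtRunner, dbtRunnerResult

def build_and_test(project_dir: str, profiles_dir: str) -> None:
    runner = dbtRunner()
    common = ["--project-dir", project_dir, "--profiles-dir", profiles_dir]

    # Build models first; incremental models only process new/changed rows on reruns.
    run_result: dbtRunnerResult = runner.invoke(["run", *common])
    if not run_result.success:
        raise RuntimeError("dbt run failed")

    # Capture snapshots, then run schema/data tests as the quality gate.
    for args in (["snapshot", *common], ["test", *common]):
        result = runner.invoke(args)
        if not result.success:
            raise RuntimeError(f"dbt {args[0]} failed")

if __name__ == "__main__":
    build_and_test(project_dir=".", profiles_dir=".")
```

Wired into a Git-based CI/CD pipeline, a step like this could run on every pull request before changes are merged.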
- AWS Expertise:

- Data Storage Solutions:
 - In-depth understanding of AWS S3 for data storage, including best practices for organization and security
 - Experience with AWS Redshift for data warehousing and performance optimization (see the load sketch after this list)
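As an illustration of the S3 and Redshift items above, a minimal sketch assuming boto3 and psycopg2 (Redshift accepts PostgreSQL-protocol connections); the bucket, table, IAM role ARN, and connection details are placeholders.

```python
# Hypothetical load path: stage a file in S3, then COPY it into Redshift.
import boto3
import psycopg2

def stage_and_copy(local_path: str) -> None:
    # Stage the file under a dated prefix; prefixes keep the bucket organized
    # and make lifecycle and permission rules easier to scope.
    s3 = boto3.client("s3")
    key = "raw/orders/2024-01-01/orders.csv"
    s3.upload_file(local_path, "example-data-lake", key)

    # COPY is Redshift's bulk-load path; an IAM role attached to the cluster
    # grants read access to the bucket instead of embedding access keys.
    copy_sql = f"""
        COPY raw.orders
        FROM 's3://example-data-lake/{key}'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
        FORMAT AS CSV IGNOREHEADER 1;
    """
    with psycopg2.connect(host="example-cluster.abc.us-east-1.redshift.amazonaws.com",
                          port=5439, dbname="analytics",
                          user="loader", password="***") as conn:  # placeholder credentials
        with conn.cursor() as cur:
            cur.execute(copy_sql)
```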
- Data Integration:
 - Familiarity with AWS Glue for ETL processes and orchestration (nice to have)
 - Experience with AWS Lambda for serverless data processing tasks (see the handler sketch after this list)
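A minimal sketch of the Lambda item above: a handler for an S3 ObjectCreated event notification. The bucket and key come from the event payload; the processing logic is a placeholder.

```python
# Hypothetical AWS Lambda handler for serverless processing of new S3 objects.
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Small objects can be processed inline; larger ones are better handed
        # off to Glue/EMR or streamed rather than read fully into Lambda memory.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = body.decode("utf-8").splitlines()
        print(json.dumps({"bucket": bucket, "key": key, "rows": len(rows)}))

    return {"statusCode": 200}
```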
- Workflow Orchestration:
 - Proficiency in using Apache Airflow on AWS to design, schedule, and monitor complex data flows
 - Ability to integrate Airflow with AWS services and DBT models, such as triggering a DBT model or an EMR job, or reading from S3 and writing to Redshift (see the DAG sketch after this list)
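A sketch of the Airflow integration described above, assuming Airflow 2.4+ with the Amazon provider package installed; the DAG name, connections, bucket, dbt selector, and profiles path are placeholders.

```python
# Hypothetical Airflow DAG: load a daily file from S3 into Redshift, then build dbt models.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator

with DAG(
    dag_id="daily_sales_pipeline",          # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",                   # run daily at 06:00
    catchup=False,
) as dag:
    load_to_redshift = S3ToRedshiftOperator(
        task_id="load_raw_sales",
        schema="raw",
        table="sales",
        s3_bucket="example-data-lake",       # placeholder bucket
        s3_key="sales/{{ ds }}/sales.csv",   # templated with the execution date
        copy_options=["CSV", "IGNOREHEADER 1"],
        redshift_conn_id="redshift_default",
        aws_conn_id="aws_default",
    )

    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --select marts.sales --profiles-dir /opt/dbt",
    )

    load_to_redshift >> run_dbt
```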
- Data Lakes and Data Warehousing:
 - Understanding of the architecture of data lakes vs. data warehouses and when to use each
 - Experience with Amazon Athena for querying data directly in S3 using SQL (see the Athena sketch after this list)
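A sketch of querying S3-resident data with Athena via boto3; the database, table, and results bucket are placeholders.

```python
# Hypothetical Athena query against data stored in S3.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

def run_athena_query(sql: str, database: str, output_s3: str) -> list:
    """Start a query, poll until it finishes, and return the raw result rows."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]

    # Poll the query status; production code would add a timeout and backoff.
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]

rows = run_athena_query(
    sql="SELECT region, COUNT(*) AS orders FROM sales GROUP BY region",
    database="data_lake",                      # placeholder Glue/Athena database
    output_s3="s3://example-athena-results/",  # placeholder results bucket
)
```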
- Monitoring and Logging:
 - Familiarity with AWS CloudWatch for monitoring pipelines and setting up alerts for workflow failures (see the alerting sketch after this list)
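A sketch of one common alerting pattern, assuming boto3: publish a custom CloudWatch metric from a task failure callback so that a CloudWatch alarm on that metric can notify the team. The namespace and metric name are placeholders.

```python
# Hypothetical failure hook: record a pipeline task failure as a CloudWatch metric.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

def notify_failure(context: dict) -> None:
    """Intended for use as an Airflow task on_failure_callback."""
    task_id = context["task_instance"].task_id
    cloudwatch.put_metric_data(
        Namespace="DataPipelines",            # placeholder namespace
        MetricData=[{
            "MetricName": "TaskFailure",
            "Dimensions": [{"Name": "TaskId", "Value": task_id}],
            "Value": 1.0,
            "Unit": "Count",
        }],
    )
```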
- Cloud Security:
 - Knowledge of AWS security best practices, including IAM roles, encryption, and DBT profile access configurations (see the credentials sketch after this list)
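A sketch of one way the IAM and DBT-profile items can fit together, assuming boto3: assume a narrowly scoped IAM role and hand the temporary credentials to dbt through environment variables read by env_var() in profiles.yml. The role ARN is a placeholder.

```python
# Hypothetical pattern: assume a scoped IAM role and expose the temporary credentials
# as environment variables for downstream tools (dbt, boto3) to pick up.
import os
import boto3

sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/dbt-redshift-role",  # placeholder role
    RoleSessionName="dbt-run",
)["Credentials"]

# dbt's profiles.yml can reference these with {{ env_var('AWS_ACCESS_KEY_ID') }} etc.,
# avoiding long-lived keys checked into configuration.
os.environ["AWS_ACCESS_KEY_ID"] = creds["AccessKeyId"]
os.environ["AWS_SECRET_ACCESS_KEY"] = creds["SecretAccessKey"]
os.environ["AWS_SESSION_TOKEN"] = creds["SessionToken"]
```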
 
- Python:
 - Proficiency in Pandas and NumPy for data analysis and manipulation (see the transform sketch after this list)
 - Ability to write scripts for automating ETL processes and scheduling jobs using Airflow
 - Experience in creating custom DBT macros using Jinja and Python, allowing for reusable components within DBT models
 - Knowledge of how to implement conditional logic in DBT through Python
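A sketch of a Pandas/NumPy transform step inside an ETL script; the file names and columns are placeholders, and writing Parquet assumes an engine such as pyarrow is installed.

```python
# Hypothetical transform step: clean a raw extract with Pandas/NumPy before loading it.
import numpy as np
import pandas as pd

def transform(raw_path: str) -> pd.DataFrame:
    df = pd.read_csv(raw_path, parse_dates=["order_date"])

    # Basic cleaning: drop duplicates and rows missing a key.
    df = df.drop_duplicates(subset=["order_id"]).dropna(subset=["customer_id"])

    # Derive columns with vectorized NumPy logic instead of row-by-row loops.
    df["revenue"] = df["quantity"] * df["unit_price"]
    df["order_size"] = np.where(df["revenue"] >= 1000, "large", "small")
    return df

if __name__ == "__main__":
    cleaned = transform("raw_orders.csv")
    cleaned.to_parquet("clean_orders.parquet", index=False)  # ready to stage to S3
```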
- SQL:
 - Advanced SQL skills, including complex joins, window functions, CTEs, and subqueries (see the window-function sketch after this list)
 - Experience in optimizing SQL queries for performance
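A sketch combining a CTE with a window function (latest order per customer), run here through psycopg2 against a Redshift or PostgreSQL endpoint; the schema, table, and connection details are placeholders.

```python
# Hypothetical query illustrating a CTE plus a window function.
import psycopg2

LATEST_ORDER_SQL = """
WITH ranked_orders AS (
    SELECT
        customer_id,
        order_id,
        order_date,
        ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) AS rn
    FROM analytics.orders
)
SELECT customer_id, order_id, order_date
FROM ranked_orders
WHERE rn = 1;
"""

with psycopg2.connect(host="example-host", port=5439, dbname="analytics",
                      user="analyst", password="***") as conn:  # placeholder credentials
    with conn.cursor() as cur:
        cur.execute(LATEST_ORDER_SQL)
        for customer_id, order_id, order_date in cur.fetchall():
            print(customer_id, order_id, order_date)
```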
 
Expected salary:
Location: USA
Apply for the job now!