
Turing
A U.S.-based company that is reshaping the way organizations think about and drive their business is looking for a Data Engineer. The engineer will be responsible for creating functional solutions and approaches to problems while ensuring proper implementation. The company provides financial and business advisory services covering a broad range of consulting solutions. This is an excellent opportunity for developers to make a name for themselves while driving innovative projects.
Job Responsibilities:
- Provide end-to-end functional software landscape and solution designs to support the company’s business processes
- Evaluate requirements and translate them into business processes
- Provide direction to the implementation teams and act as an intermediary between local and onshore teams
- Standardize and implement code branching strategy
- Enable developers with automation of deployments and testing frameworks
- Document, implement and audit security and access permissions for the data platform
- Follow and implement CI/CD and data pipelines for cloud data platform
- Participate in driving engineering best practices around software development processes
Job Requirements:
- Bachelor’s/Master’s degree in Engineering, Computer Science (or equivalent experience)
- 5+ years of experience creating data pipelines in Python
- 2+ years of experience creating data pipelines using PySpark in Databricks
- 2+ years of experience in SQL
- 3+ years of experience designing, building and maintaining ETL/ELT data pipelines
- 3+ years of experience designing, building, and leveraging dimensional data models
- Experience creating and supporting a complete ELT/ETL pipeline
- Knowledge of data movement tools, data stores, and cloud-based environments
- Demonstrable experience building and supporting a modern, next-generation data warehouse platform
- Solid understanding of data modeling and architecture
- Exposure to and support of downstream data uses such as visualization and data science
- Ability to explain complex technical challenges to technical and business audiences
- Extensive knowledge of release/change management processes
- Working experience with Scrum and Agile methodologies with geographically distributed teams
- DevOps experience is a plus
- Knowledge of AWS IAM, security policy management, service configuration, and compliance management
- Experience managing and scaling a cloud data warehouse (Snowflake)
- Experience with the AWS ecosystem, including S3, Amazon RDS, and Amazon ML
- Experience with GitHub integration and CI/CD pipeline development
- Experience using orchestration tools like Apache Airflow
- Experience working with large datasets and big data technologies, preferably cloud-based, such as Snowflake, Databricks, or similar
- Excellent English communication and interpersonal skills