This company is a data science, machine learning, and data visualization firm. It works with businesses to implement data-driven analytic techniques derived from statistical modeling and data science. Its team is made up of passionate data engineers, PhDs, data scientists, and visualization experts who share the same mission: to help clients identify and apply advanced analytics to improve the quality of decision-making.
This is a remote position, but applicants must be located within +/- 2 hours of Eastern Standard Time (NYC). Applicants must have strong written and oral communication skills in English.

We are looking for a Senior Data Engineer to join our talented multi-disciplinary team. We build modern data analytics platforms for our clients that incorporate advanced analytics and machine learning to solve business problems. Because this company works across multiple industries, this role provides an exciting set of experiences across a wide range of domains. Your primary focus as a Senior Data Engineer will be collaborating on the design and development of a modern data platform in AWS built on top of Snowflake and dbt. Blue Orange engineers take end-to-end ownership of their code and platforms, so the ideal candidate for this position has a mixture of experience in cloud engineering and data engineering. This team never stops learning and growing; new team members should be passionate about data technology and be life-long learners.

Core Responsibilities & Skills:
- Design, develop, test, and deliver a maintainable, scalable data platform built in AWS with Snowflake and dbt at its core.
- Implement Snowflake’s best practices for loading and transforming data.
- Work closely with business and technical stakeholders to refine pipeline requirements and design data models that support business objectives.
- Help establish and follow a DataOps culture within the team.
- Work closely in an agile team setting to deliver high-quality systems and value to our clients.

Qualifications:
- BA/BS degree in Computer Science or a related technical field, or equivalent practical experience.
- Minimum of 5 years of experience building ETL pipelines and working with different cloud data warehouses (Redshift, Snowflake, Azure SQL Data Warehouse).
- Minimum of 2 years of experience working with Snowflake.
- Advanced experience with Python and SQL.
- Advanced understanding of computer science fundamentals, complex data structures, data processing, data quality, the data lifecycle, and algorithms.
- Experience using Infrastructure as Code tools (Terraform, CloudFormation).
- AWS certification at the associate level (Solutions Architect or Developer) or specialty level (Big Data), or progress toward one, is a strong advantage.
- Enjoys collaborating with other engineers on architecture and sharing designs with the team.
- Excellent verbal and written English communication.
- Interacts with others using sound judgment, good humor, and consistent fairness in a fast-paced environment.
Only candidates from Argentina, Brazil, Chile, Colombia, Costa Rica, Ecuador, El Salvador, Mexico, Panama, Peru, and Uruguay will be considered.
Intermediate or advanced spoken English is required for ALL opportunities. If you can't speak English yet, please keep practicing and apply in the future.