DUTIES: Design and maintain data pipelines to extract, transform, and load (ETL) data. Design and maintain data models that facilitate efficient data storage, organization, and retrieval. Integrate data from different databases, APIs, and external data feeds. Transform data into a usable format for analysis and reporting purposes. Optimize data infrastructure for speed, scalability, and cost-efficiency. Implement data quality checks and validation processes to ensure the accuracy, completeness, and reliability of data across systems. Establish and maintain data security measures, including access controls and data encryption, to protect sensitive information. *Telecommuting permitted: work may be performed in any location in the U.S.
JOB REQUIREMENTS: Bachelor's degree (U.S. or foreign equivalent) in Computer Science, Statistics, or a related field and three (3) years of experience in data engineering or a related role. Must have three (3) years of experience with: the programming language Python; designing and implementing data models; extract, transform, and load (ETL) tools and the data orchestration tool Airflow; distributed computing frameworks; data warehousing technologies and Snowflake; the database PostgreSQL; the cloud platform AWS and its services S3, Lambda, EC2, RDS, and SQS; and the version control system Git.
SALARY: $143,499 to $215,249 / year