General

Job Listing

Number of Positions: 1
Job ID: 14084505
 
Job Location: 100% telecommute from anywhere in the U.S. (reporting to Eden Prairie, MN)
 

Public Transportation is Available.

Telecommute: Yes

 
How to Apply:
Mail a resume with Job Ref. #VK-0211 to the address given below.
 
Job Title: Senior Data Engineer (#VK-0211)
Work Type: Regular, Work Days: Weekdays, Work Vary: No, Shift: First (Day), Hours Per Week: 40
 
Salary Offered: Minimum $129,375.00 Yearly
Benefits:401(k) or other retirement, Dental Insurance, Health Insurance, Sick Leave or PTO, Vacation or PTO, Vision Plan
 
Physical Required: No
Drug Testing Required: No
Education Required: Master's Degree
Experience Required: 12 Months
 
Required Skills:
Master's degree or equivalent in Computer Science, Management Information Systems, or a related field, plus 1 year of experience in data analysis, data development, data management, data science, business analytics, business intelligence, or data engineering, including all relevant job titles within the occupation. Employer will accept pre- and post-Master's degree experience; all experience must be post-Bachelor's degree. In lieu of a Master's degree and 1 year of experience, employer will accept a Bachelor's degree or equivalent in Computer Science, Management Information Systems, or a related field and 3 years of experience in the areas listed above.

Work experience must include:
1. Data pipeline and data integration development, including creating new and supporting existing extract, transform, load (ETL) pipelines, and building data pipelines from data sourced via secure file transfer protocol (SFTP), application programming interfaces (APIs), JSON in S3, and MySQL, Oracle, MS SQL Server, and Redshift databases.
2. Creating new data connections and processes using Spark, SQL, Python, and AWS services including Glue and Lambda.
3. Building data pipelines with Apache Airflow on AWS EC2 (see the sketch below this list).
4. Scripting for automation with Python.
5. Managing at least one of the following AWS resources and processes: Lambda, Glue, Airflow on EC2, S3, Rekognition, or Redshift Spectrum.
6. Working with business teams to define the metrics and KPIs used to measure business success against operational and financial goals.
7. Building and publishing reports and dashboards with Power BI.
8. Developing and maintaining event-driven architectures with AWS Lambda and Glue.
9. Completing proof-of-concept data pipelines with one of the following: AWS, Kafka, or Kinesis.
10. Maintaining code using Bitbucket for version control and Bamboo for code deployment.
11. Delivering projects and initiatives within an Agile, iterative framework, using JIRA for task, story, and issue creation and management.

40 hrs./wk. Salary: $129,375 per year. Travel: Position requires 10% domestic travel; expenses paid by employer. Address of employment: 100% telecommute position from anywhere in the U.S., reporting to Shutterfly, LLC, 11000 Viking Drive, Eden Prairie, Minnesota 55344.
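For illustration only, and not part of the posting: a minimal sketch of the kind of Airflow pipeline items 2-3 describe, in which a daily DAG starts a hypothetical AWS Glue ETL job through boto3. The job name, S3 path, region, and schedule are assumptions, not details taken from the listing.

from datetime import datetime, timedelta

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def start_glue_job(**_):
    """Start a hypothetical Glue ETL job and return its run id."""
    glue = boto3.client("glue", region_name="us-east-1")  # region is an assumption
    response = glue.start_job_run(
        JobName="orders_etl",  # hypothetical Glue job name
        Arguments={"--source_path": "s3://example-bucket/raw/orders/"},  # assumed path
    )
    return response["JobRunId"]


with DAG(
    dag_id="orders_etl_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # assumed cadence
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    PythonOperator(task_id="start_glue_job", python_callable=start_glue_job)

The design assumed here keeps the transformation logic inside the Glue job itself; the DAG only schedules the run and retries it on failure.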

Individuals interested in applying for the position must mail a resume with Job Ref. #VK-0211 to: Shutterfly, LLC, Attn: Human Resources, Job #VK-0211, 11000 Viking Drive, Eden Prairie, MN 55344.
Preferred Skills:
 
Driver's License: None Required
Job Description:
Collaborate with business teams and technology teams on designs and solutions for new data pipelines and extensions of existing ones. Collaborate with the data teams to evaluate emerging technologies and data integration approaches. Develop frameworks and applications for large-scale data processing. Partner with other business and technology teams to extract, transform, and load data from a wide variety of data sources using Amazon Web Services (AWS) and other technologies; this includes gathering and processing raw, structured, semi-structured, and unstructured data from internal and external data providers via databases, APIs, FTPs, and messaging services, using batch and real-time data processing frameworks (an event-driven sketch follows this section). Build technical knowledge of applicable emerging technologies and trends in data integration.

Design, develop, and implement an end-to-end analytical platform that is flexible, componentized, scalable, and secure, and that meets business needs. Design and develop data pipelines for business intelligence reporting and analytics, including data needed by data scientists and analysts. Build data pipelines that support the use and expansion of good data practices and data governance (e.g., metadata management, data quality, data modeling, master data management, data administration, security, privacy). Safeguard all customer, employee, and company proprietary and personal information, keeping customer and employee data confidential at all times.
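For illustration only, and not part of the posting: a minimal sketch of the event-driven, real-time ingestion pattern described above, assuming an AWS Lambda function triggered by S3 ObjectCreated events that forwards JSON records to a Kinesis stream. The bucket layout, object format, and stream name are hypothetical.

import json

import boto3

s3 = boto3.client("s3")
kinesis = boto3.client("kinesis")


def handler(event, context):
    """Entry point invoked by an S3 ObjectCreated event notification."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Fetch the newly created object and parse it as a JSON array (assumed format).
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        for row in json.loads(body):
            kinesis.put_record(
                StreamName="orders-events",  # hypothetical stream name
                Data=json.dumps(row).encode("utf-8"),
                PartitionKey=key,  # keep records from one object on one shard
            )

Wiring the S3 notification to the function, and the Kinesis consumers downstream, would be configured separately (e.g., in the bucket's event settings).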

Refer to Job ID 14084505 when applying.