Data Engineer (Python/AWS/Spotfire)

Nielsen Media would not function without our Technology teams! We are catalysts for delivering quality, on-time, reliable measurements to clients, and we are cultivators, growing our employees through education, skill building, and experience. Around the globe, our Technology teams are relentless in the pursuit of superior analytics, technology, processes, and support.

As a Data Engineer, you will be part of the Big Data Platform team, a group of software developers and analysts who develop, configure, maintain, and deploy the big data application platforms that allow the team to manage multiple large data sources. This development work enables Nielsen to collect, store, process, and analyze complex data sets for market trends and insights through its digital products and solutions.

You will be responsible for designing, developing, implementing, deploying, and maintaining data warehouse solutions for processing and analyzing data (e.g., using programming and software engineering tools to create data pipelines and to implement platform frameworks and automated application workflows that extract, transform, and load (ETL) data).

As part of the Big Data Platform team, you will develop highly complex, business-critical big data platform applications and automation solutions for data ingestion, processing, cleansing, and transformation, using programming, application/platform development, and software, systems, and data engineering tools such as Python, the Apache Spark framework on AWS, PySpark (the Spark Python API), AWS Lambda, Athena, Airflow, and Presto.
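
To make the day-to-day concrete, below is a minimal sketch of the kind of PySpark ETL job this role involves; the bucket, paths, and column names are hypothetical and purely illustrative.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    # Extract: read raw events from S3 (hypothetical path).
    raw = spark.read.json("s3://example-bucket/raw/events/")

    # Transform: cleanse and aggregate into daily counts per event type.
    daily = (
        raw.filter(F.col("event_type").isNotNull())
           .withColumn("event_date", F.to_date("event_ts"))
           .groupBy("event_date", "event_type")
           .count()
    )

    # Load: write partitioned Parquet that query engines such as Athena
    # or Presto can read directly (hypothetical output path).
    daily.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3://example-bucket/curated/daily_event_counts/"
    )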

Using software and platform engineering tools, you will monitor and maintain automation services, jobs, and server health, and tune the performance of client-facing business intelligence reports and data analytics dashboards.
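
As one hedged illustration of the monitoring side of the role, the sketch below publishes a custom job-health metric to Amazon CloudWatch with boto3; the namespace, metric name, and function name are hypothetical.

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    def report_job_health(job_name: str, succeeded: bool) -> None:
        # Emit 1 for success and 0 for failure; a CloudWatch alarm on this
        # metric can notify the team (e.g., via SNS) when a pipeline fails.
        cloudwatch.put_metric_data(
            Namespace="ExamplePlatform/Jobs",
            MetricData=[{
                "MetricName": "JobSucceeded",
                "Dimensions": [{"Name": "JobName", "Value": job_name}],
                "Value": 1.0 if succeeded else 0.0,
                "Unit": "Count",
            }],
        )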

Job Responsibilities:

    • Work as a hands-on developer: model data, write code, and test iteratively on technical solutions for monitoring and analytics requirements
    • Implement ETL processes, UIs, and dashboards that surface relevant trends and alerts in BI tools (Spotfire, Superset, Tableau, etc.) for monitoring (see the orchestration sketch after this list)
    • Implement and monitor automated deployment solutions for cloud hosted monitoring applications
    • Work as part of the DevOps team to create CI/CD pipelines and automation services
    • Create appropriate documentation (process flows and technical specs)
    • Assist in monitoring-platform support and troubleshooting of production issues
    • Analyze data contained in the existing data lake to define monitoring solutions
    • Work from your home office! #remote
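
As referenced above, here is a minimal sketch of an Airflow DAG orchestrating a daily ETL run followed by a dashboard refresh; the DAG id, schedule, and task callables are hypothetical, and the schedule argument assumes Airflow 2.4 or later.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_and_transform():
        # Placeholder: submit the PySpark job sketched earlier.
        ...

    def refresh_dashboard():
        # Placeholder: trigger a BI-tool data refresh (e.g., Spotfire).
        ...

    with DAG(
        dag_id="daily_monitoring_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ):
        etl = PythonOperator(task_id="etl",
                             python_callable=extract_and_transform)
        refresh = PythonOperator(task_id="refresh_dashboard",
                                 python_callable=refresh_dashboard)
        etl >> refresh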

Desired Qualifications:

    • BS degree in Computer Science, or equivalent experience
    • Architecture, design, implementation, and operational support expertise with Amazon Web Services, including IAM, EMR, EC2, S3, Lambda, Relational Database Service (RDS), and Simple Notification Service (SNS); AWS certifications desired
    • Strong "problem-solver" skills, experience with continuous integration and deployment automation tools such as Terraform or Cloudformation
    • Expertise in Apache Airflow
    • Strong database platform background and knowledge, with at least 5 years of experience with large-scale PostgreSQL/MySQL/Hive production environments
    • Experience with Python, PySpark, and shell scripting
    • Experience with business intelligence tools such as TIBCO Spotfire, Superset, and Tableau
    • Team player with strong verbal and written communication skills

Required experience:

    • High Performance Computing Solutions: 4 years
    • AWS Cloud Architecture: 2 years
    • Business Intelligence development: 2 years
Remote Consideration - OK