Disney Data Engineering teams leverage cutting-edge technology, immersive storytelling and imaginative thinking to build innovative, one-of-a-kind experiences for Guests and consumers around the globe. As part of the Data Engineering team within Disney Parks, Experiences and Products, you will be using some of the most unique and robust data in the world! All of this incredible data is a key enabler behind:
- Disney’s world-class, personalized Guest experience
- Interactive and immersive experiences
- Optimizing Parks & Resorts operations
- Vacation planning
- Disney Cruise Line – pricing, booking and experience
- Keeping our Guests safe
- Keeping our attractions running
- Product development and pricing
- Marketing and communication
- Consumer Product ecommerce
As a Senior Data Engineer, you will participate in discovery processes with stakeholders to identify business requirements and expected outcomes. You will also provide operational support and maintenance of products and services, and mentor junior team members.
Responsibilities:
- Design and engineer high-performance, large-scale data engineering projects, producing maintainable and secure code with automated testing in a continuous integration environment
- Develop production-grade, consumable data views
- Ensure solutions support all functional and non-functional requirements
- Participate in operational support and maintenance of products and services
- Participate in discovery processes with stakeholders to identify business requirements and expected outcomes
- Mentor junior team members
- Coordinate and collaborate with offshore teams
Basic qualifications:
- Relevant cloud data engineering work experience
- Ability to perform across multiple phases of development for multiple complex projects, including technical design, build, and end-to-end testing
- Passion for delivering data engineering projects and features in a team environment
- Demonstrated ability to quickly learn new technologies
- Strong troubleshooting skills: able to determine impacts, resolve complex issues, and exercise sound judgment and initiative in stressful situations
- Strong oral and written communication and interpersonal skills
Must-have technical qualifications:
- Fundamentals of data pipelining, ELT/ETL, data architecture, and the overall data lifecycle
- Cross-platform development languages: Python preferred (Java specialty OK)
- Experience with the Snowflake data warehouse
- SQL and scripting proficiency
- Experience with relational and NoSQL databases (e.g., MongoDB, DynamoDB, Redis, HBase, Cassandra)
- Cloud technologies including AWS and Google Cloud Platform (GCP)
- Queuing technologies – Kafka, RabbitMQ, Redis, SQS, Kinesis Streams, Kinesis Firehose
- Data processing – EMR, Spark, Glue, Spark Streaming/Flink
- Containers - Docker, Docker Swarm, Docker Applications
- CI/CD – Jenkins, CodeBuild, GitLab
- Security - IAM roles, wire encryption, KMS, Kerberos, Authz, AD
- Infrastructure as Code – Terraform, CloudFormation, CDK
- Bachelor's degree in Computer Science, Engineering, Information Technology, or related field