Macy’s is proudly America’s Department Store. For more than 160 years, Macy’s has served generations at every stage of their lives. Customers come to us for fashion, value and celebration. Now is an exciting time to join Macy’s, Inc. The face of retail is changing, and change requires innovation.
Macy’s Tech provides modern tools, platforms, and services to all parts of the business. Our team supports millions of customers in connected commerce across the technology hub at Macy’s. Join our team to help shape the future of e-commerce and set the pace in retail technology. Whether focused on store technology, supply chain tech, application security, merchandising systems, or the mobile app – you’ll have opportunities to grow your career while finding meaningful ways to make a difference.
The Lead Data Engineer will develop, test and maintain architecture, including databases and processing systems, to support a robust analytical pipeline that facilitates priority analytics use cases. The Lead Data Engineer will ensure these data processing capabilities meet business requirements, use case and user needs while providing reliable, efficient and quality data.
- Adhere to processes to ensure data pulled from various sources meets quality standards, is curated and enhanced for analytical use, and serves as a "single source of truth"
- Work with counterparts in Tech to build frameworks that integrate data pipelines and machine learning models for use by data scientists on priority use cases; the Enterprise Data and Analytics team focuses on "last mile" transformations of select data required for those use cases
- Build scalable data solutions following coding best practices
- Maintain database structure and standard definitions for business users across Macy's
- Work with data architects to build the foundational extract / load / transform process and regularly review the architecture and recommend effectiveness improvements
- Collaborate with Technology to future-proof data & analytics software, tools and code to reduce risk and support pipeline owners
- Work with Legal and Privacy teams to adhere to data privacy and security standards
- Work with Data Architect to implement the data models, standards and quality rules
- Work with the Data Science team to understand data formatting and sourcing needs to enable them to build out use cases as efficiently as possible
- Build community to foster a data and analytics culture and to support data-driven thinking and discussions across priority analytics use cases
- Be entrepreneurial, agile, and results-oriented
- Embody data-driven culture at Macy's
- Bolster awareness, knowledge, and conviction around data-driven practices and behaviors, increasing the digital IQ of business units
- Proactively raise data issues and take action to remediate them; support business users in identifying the correct data sets and provide easy-to-use tools to pull data
Key Performance Indicators
- Quantity, type, and quality of databases and pipelines, in partnership with Technology
- Compliance with relevant laws and regulations, in partnership with Legal/Privacy
- Automation of data cleansing and harmonization processes in refined/trusted zones
Qualifications and Competencies
- Bachelor’s degree in Computer Science (or related technical field).
- At least 8 years of overall experience building ETL/ELT, data warehousing, and big data solutions.
- At least 5 years of experience in building data models and data pipelines to process different types of large datasets.
- At least 3 years of experience with Python, Spark, Hive, Hadoop, Kinesis, and Kafka.
- Proven expertise in relational and dimensional data modeling.
- Understanding of PII standards, processes, and security protocols.
- Experience building data warehouses using cloud technologies such as AWS or GCP services, and cloud data warehouses, preferably Google BigQuery.
- Able to confidently express the benefits and constraints of technology solutions to technology partners, stakeholders, team members, and senior levels of management.
- Familiarity with coding best practices; ability to develop and manage code in a modular, scalable way.
- Experience implementing and supporting operational data stores, data warehouses, data marts, and data integration applications.
- In-depth knowledge of Big Data solutions and the Hadoop ecosystem.
- Ability to effectively share technical information and communicate technical issues and solutions to all levels of the business.
- Able to juggle multiple projects: identify primary and secondary objectives, prioritize time, and communicate timelines to team members.
- Passionate about designing and developing elegant ETL/ELT pipelines and frameworks.
- Ability and desire to take product/project ownership.
- Ability to think creatively, strategically and technically.
- Ability to work a flexible schedule based on department and Company needs.
- Cloud Architect (AWS or GCP or Azure) Certification is a plus.