Data Engineer

This job is no longer open
Our Data Engineers are experienced technologists with technical depth and breadth, along with strong interpersonal skills. In this role, you will work directly with customers and our team to enable innovation through continuous, hands-on deployment across technology stacks. You will build data pipelines and be involved in complete end-to-end Data Engineering efforts, including code development, integration, troubleshooting, and testing.
 
If you get a thrill working with cutting-edge technology and love to help solve customers’ problems, we’d love to hear from you. It’s time to rethink the possible. Are you ready?

What You’ll Be Doing:

    • Build complex ETL code 
    • Work on Data and Analytics Tools in the Cloud 
    • Develop code using Python, Scala, R languages 
    • Work with technologies such as Spark, Hadoop, Kafka, etc. 
    • Build complex Data Engineering workflows 
    • Create complex data solutions and build data pipelines 
    • Establish credibility and build impactful relationships with our customers to enable them to be cloud advocates 
    • Capture and share industry best practices amongst the community 
    • Attend and present valuable information at industry events 
    • Drive customer engagements from the architectural pillar, from design through delivery, including creating runbooks 

Qualifications & Experience:

    • 15+ Years of Data-warehouse and Analytic system development and deployment experience 
    • 10+ years of experience in database architectures and data pipeline development 
    • 8+ years of experience with modern data warehousing platforms using cloud-native technologies 
    • 5+ years of experience in delivering Azure/GCP/AWS Data Solutions. 
    • Demonstrated knowledge of software development tools and methodologies 
    • Presentation skills with a high degree of comfort speaking with executives, IT management, and developers 
    • Excellent communication skills with an ability to right-level conversations for different audiences 
    • Demonstrated ability to adapt to new technologies and learn quickly 
    • Experience with Google Cloud services such as BigQuery, Bigtable, Data Studio, Dataprep, Pub/Sub, Cloud Storage, Dataflow (streaming and batch), and Dataproc 
    • Knowledge and proven use of contemporary data mining, cloud computing, and data management tools, including but not limited to Microsoft Azure, AWS, Google Cloud, Hadoop, HDFS, MapR, and Spark 
    • Design and configuration of data movement, streaming, and transformation (ETL) technologies such as Azure Data Factory, Informatica, HDF, NiFi, Kafka, Storm, Sqoop, SSIS, Logic Apps, Signiant, Aspera, Alteryx, Pentaho, Alooma, and Airflow 
    • Creation of descriptive, predictive, and prescriptive analytics solutions using Azure Stream Analytics, Azure Analysis Services, Data Lake Analytics, HDInsight, HDP, Spark, Databricks, MapReduce, Pig, Hive, Tez, and SSAS 
    • Large scale design, implementation and operations of OLTP, OLAP, DW and NoSQL data storage technologies such as SQL Server, Azure SQL, Azure SQL DW, PostgreSQL, CosmosDB, RedisCache, Azure Data Lake Store, Hadoop, Hive, MySQL, Neo4j, Cassandra, HBase 
    • Experience working within an agile development process (Scrum, Kanban, etc) 
    • Expertise in data estate workloads such as HDInsight, Hadoop, Cloudera, and Spark, using Python 
    • Familiarity with CI/CD concepts 
    • Strong verbal and written communications skills are a must, as well as the ability to work effectively across internal and external organizations and virtual teams. 
    • Knowledge or hands-on experience with data visualization and/or data sciences. 

Must-haves:

    • Hands on experience with Azure/GCP projects. 
    • Cloud certifications such as GCP Professional Data Engineer or Microsoft Data / AI certifications. 
    • Technical degree required; Computer Science or Math background desired 

Location:

    • This is a virtual role
    • The candidate needs to be based in the US or Canada

Travel:

    • This role requires 25-30% travel

About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.
 
 
More on Rackspace Technology
Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.

Outer Join is the premier job board for remote jobs in data science, analytics, and engineering.