Job Description:
What You’ll Be Doing:
* Work on cloud Data and Analytics services.
* Work on GCP Data services.
* Develop and monitor data pipelines on GCP.
* Develop automation using Python, Spark, and Scala.
* Work with technologies such as Spark, Hadoop, and Kafka.
* Build complex data engineering workflows.
* Build monitoring dashboards and alerts with Stackdriver.
* Establish credibility and build impactful relationships with our customers to enable them to be cloud advocates.
* Capture and share industry best practices amongst the community.
* Attend and present valuable information at Industry Events.
* Drive customer engagements from the architectural side, from design through delivery, including creating runbooks.
* Work both in a team and as an individual contributor, covering shifts on a 24/7/365 pattern as needed by the team and customer requirements.
Qualifications & Experience:
* 3+ years of data warehouse and analytics system development and deployment experience.
* 3+ years of experience with modern data warehousing platforms using cloud-native technologies.
* 3+ years of experience in delivering Azure/GCP/AWS Data Solutions.
* Excellent communication skills, with the ability to pitch conversations at the right level for the audience.
* Technical degree required; Computer Science or Math background desired.
* Hands-on experience with databases, data warehouses, SQL, etc.
* Hands-on experience with GCP projects.
* Experience with Google Cloud services such as streaming and batch processing, BigQuery, Pub/Sub, Cloud Storage, Cloud Dataflow, Dataproc, Data Fusion, Stackdriver, etc.
* Knowledge and proven use of contemporary data mining, cloud computing, and data management tools, including but not limited to Microsoft Azure, AWS, Google Cloud, Hadoop, HDFS, MapR, and Spark.
* Design and configuration of data movement, streaming, and transformation (ETL) technologies such as Kafka, Sqoop, and Airflow.
* Exposure to building analytics solutions with Hadoop, Spark, Oozie, MapReduce, Pig, Hive, and Tez.
* Experience working within an agile development process (Scrum, Kanban, etc.).
* Experience with automation, logging, debugging, and monitoring using cloud services.
* Excellent understanding of CI/CD.
* Expertise in data estate workloads such as Hadoop, Cloudera, Spark, and Python.
* Strong verbal and written communication skills are a must, as well as the ability to work effectively across internal and external organizations and virtual teams.
* Knowledge of or hands-on experience with data visualization.
* Cloud certifications such as GCP Professional Data Engineer, or Hadoop or Spark certification.
About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.
More on Rackspace Technology
Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.