Senior DevOps Engineer

Ref#: 32157
CBS Business Unit: CBS Interactive
Job Type: Full-Time Staff
Job Location: San Francisco, CA, US

About Us:

CBS Interactive is the premier online content network for information and online operations of CBS Corporation, as well as some of the top native digital brands in the entertainment industry. Our brands dive deep into the things people care about across entertainment, technology, news, games, business and sports. With over 1 billion users visiting our properties every quarter, we are a global top 10 web property and one of the largest premium content networks online.

Check us out on The Muse, Instagram and YouTube for an inside look into 'Life At CBSi' through employee testimonials, office photos and company updates.

Division Overview:
The Data Technology team is an agile and dynamic group that is passionate about developing a reliable data platform to help internal groups process data more efficiently. Our team collaborates across a number of business units to design and implement reusable frameworks for different data teams. We use Google Cloud Platform to process large volumes of data from multiple vendors, enabling our Business Intelligence, Yield, Finance and Marketing teams to focus on data-driven insights.

Role Details:
You’ll be an integral part of the Data Technology team, designing and building new infrastructure frameworks and automation workflows for data acquisition and processing in Google Cloud Platform. You will collaborate with both internal and external teams to finalize requirements, design and develop applications, and mentor co-workers. The ideal candidate will be very familiar with current technologies in modern cloud environments.

Your Day-to-Day:
  • Collaborate with other Engineers to design and develop new data infrastructure frameworks
  • Design and develop applications to reduce operational overhead
  • Automate deployment processes in Google Cloud Platform
  • Monitor and resolve operational issues related to infrastructure and data pipelines
  • Develop monitoring systems
  • Optimize performance and cost of cloud resources
  • Develop and document reusable processes for use by other data teams
  • Mentor co-workers

Key Projects:
  • Automate deployment processes for data management teams
  • Design and develop data acquisition templates in Apache NiFi
  • Design and develop data processing frameworks in Google Dataflow

What you bring to the team:
You have:

  • Experience with Google Cloud Platform and/or AWS
  • Experience in designing and delivering modern applications and services with the DevOps model
  • Experience with any "infrastructure as code" system, such as Terraform
  • Experience in deploying applications using Docker containers
  • Experience working with Linux servers and troubleshooting
  • Experience in Python
  • Bachelor's Degree in Computer Science or equivalent
You might also have:
  • Experience in Kubernetes and Jenkins
  • Experience in Google Dataflow/Apache Beam
  • Experience in Apache NiFi
EEO Statement:

Equal Opportunity Employer Minorities/Women/Veterans/Disabled
