To strengthen our growing US team, C&F is pleased to offer the following position:
Big Data DevOps Engineer
Location: San Diego County, CA, USA
Role:
Principal Duties and Responsibilities
As a senior software engineer, you will work as an integral part of our Cloud DevOps team for a pharmaceutical client. The ideal candidate has extensive DevOps knowledge and experience and has previously been part of a DevOps team running an analytical ecosystem on modern cloud-based big data systems in a fast-paced, agile environment.
The specific focus for the DevOps Engineer is on processing engine development, as well as the design, development, automation, and optimization of its usage in the client's analytic ecosystem:
- Collaborate within a geographically dispersed team, working on-site with our client in La Jolla, San Diego
- Participate in a continuous delivery pipeline to fully automate deployment of the highly available cloud platform that supports multiple teams/projects
- Build tools for deployment, monitoring and operations. Troubleshoot and resolve issues in our development, test and production environments
- Work with platform architects on software and system optimizations, helping to identify and remove potential performance bottlenecks
- Stay up-to-date on relevant technologies, plug into user groups, understand trends and opportunities to ensure we are using the best possible techniques and tools
- Understand, implement, and automate security controls, governance processes, and compliance validation
- Design, manage, and maintain tools to automate operational processes
Qualifications:
Education:
- Bachelor’s degree in Computer Science, or equivalent
Experience:
- 2 or more years of experience in DevOps or software development
- 1 or more years of experience provisioning, operating, and managing AWS environments
- Strong background in Linux/Unix administration and scripting
- Strong background in Hadoop, Spark, Scala, Java
- Extensive experience with a public cloud provider, ideally Amazon Web Services
- Strong understanding of Continuous Integration and Continuous Delivery principles and practices
- Ability to use a wide variety of open source technologies and cloud services
- Research and investigative skills
- Strong experience with SQL and NoSQL data stores
- Software process automation with popular scripting languages (Python, Node.js, and/or Ruby)
- Experience developing code in at least one high-level programming language
- Experience in automation and testing via scripting/programming
- Understanding of Agile and other development processes and methodologies
- Source, build/release, and configuration management in a continuous integration & delivery environment
- Application performance analysis and monitoring
- Knowledge of best-practice security and networking techniques for an Internet-facing system
Highly desired, but not required, skills include:
- Experience with automation and configuration management using Ansible or an equivalent tool
- Java development experience
- Technology vendor management
- Amazon Web Services certification
What we offer:
- Full-time position with the option of flexible hours for students
- Internal training on the tools we use
- Opportunity to gain experience through participation in international projects for the world's largest corporations
- Work on interesting projects in a creative team
- Remuneration commensurate with experience and the scope of duties
- Opportunity for professional development
- Friendly atmosphere
How to apply:
For more information and to apply, please email your questions and/or resume to careers@candf.com with "Big Data DevOps Engineer" in the subject line.