Big Data/ DevOps Engineer (100% Remote)

DevOps Engineer
Permanent
Remote

On behalf of our client, a leading UK-based global provider of private transportation services, we are currently looking for a self-driven Big Data/DevOps Engineer to join its software development team.

A day in the life of a Big Data/ DevOps Engineer

You will be an expert in creating, improving, deploying, and supporting systems. You will also gain experience in monitoring servers and applications and in automating server provisioning. You will work with a broad set of technologies and assist the development and infrastructure teams in implementing and optimizing their technical workflows and deliverables.

Your day-to-day

  • Building and setting up new development tools and microservice infrastructure
  • Developing and maintaining solutions for operational administration, system/data backup, disaster recovery and security
  • Troubleshooting and fixing bugs in microservices and CI/CD pipelines
  • Participating in the continuous integration and delivery pipeline to maximize efficiency
  • Designing and implementing secure automation solutions for development, testing and production environments
  • Continuously evaluating existing systems against industry standards and making recommendations for improvement
  • Developing interface simulators and designing automated module deployments
  • Designing, developing and maintaining data infrastructure, including data pipelines, data storage, and data integration solutions
  • Developing and maintaining data quality standards and processes to ensure data accuracy and consistency
  • Developing and implementing data security policies and procedures to safeguard sensitive data

What you will need

  • 2+ years of experience as a DevOps Engineer or in a similar role in a technology environment
  • Working experience and good knowledge of Docker
  • Solid background in Linux environments in general
  • Experience with monitoring tools (e.g., Grafana)
  • Good knowledge of and experience with configuration management tools
  • Experience in designing & maintaining CI/CD pipelines
  • Strong knowledge of SQL and database management systems (e.g., SQL Server, PostgreSQL)
  • Experience with Data Lake infrastructure, such as Amazon S3, Hadoop
  • Experience with Looker and Power BI for data visualization and reporting
  • Experience with data integration tools (e.g., Apache Kafka, Apache NiFi, etc.)

We love your personality if you

  • love brutal honesty
  • act like an owner
  • constantly raise the bar
  • enjoy being a team player

What’s in it for you

A very competitive package, depending on your level of experience. You will have the opportunity to work in a challenging, multicultural environment with full flexibility in remote working and working hours.

Aggelli Gkrintzou

Associate Consultant

REFERENCE: job000023646