Dataswift is the technology infrastructure company powering Data Economy 2.0 – the ethical data economy. Using our products, individuals, enterprises, and developers give, take and use personal data responsibly. Individuals get full ownership and control of their data. Developers and organisations get the APIs and tools they need to build scalable, data-rich applications with privacy and compliance built-in.
What you will be doing as our DevOps Engineer...
We’re looking for a DevOps engineer to own the development, maintenance, and enhancements of Dataswift’s AWS infrastructure.
You will be responsible for the management and testing of our infrastructure, researching the cost-efficiency of alternative implementations, and leading the platform's evolution toward cross-cloud compatibility. Additionally, you will be tasked with creating and maintaining CI/CD pipelines for our services. You'll be involved in a variety of exciting projects alongside your day-to-day responsibilities.
You can expect to:
- Create, deploy, and maintain a testable and secure AWS infrastructure
- Architect and build CI/CD pipelines
- Manage code deployments, fixes, updates, and related processes
- Document the pipelines, processes, and automated tasks
- Work with developers, technology, and product teams to build robust and scalable long-term solutions
- Define operations metrics and instrument the application to monitor the overall health status, resource usage, performance, and operational efficiency
- Schedule regular security tests and vulnerability scanning on deployed applications, monitoring for open ports, endpoints, and security holes
- Coordinate support for the company's platform
- Innovate to replace manual processes with automation that enables engineers to operate safely at high speed and wide scale
You will definitely need:
- Experience in the same or a related role managing large-scale AWS cloud infrastructure
- Strong familiarity with AWS CloudFormation and version control tools (e.g. Git)
- In-depth knowledge of container technologies such as Docker and orchestration platforms such as Kubernetes
- Sound knowledge of network engineering and security principles, e.g. protocols, routing, switching, filtering, firewall rules
- Understanding of static and dynamic security testing tools, good understanding of security and systems best practices
- Experience with project management and workflow tools (e.g., Jira)
- Good familiarity with database technologies, primarily PostgreSQL
- Essential: Linux administration and scripting ability
- BONUS: Experience with other cloud providers (GCP and/or Azure)
- BONUS: Experience with other infrastructure-as-code tools (e.g. Terraform)