  • Location: Krakow Metropolitan Area
  • Salary: Competitive salary
  • Job Type: Contract
  • Reference: RM001DE
  • Date posted: 9-Mar-23

Role – DevOps Engineer with Data

 

Our client is one of the largest financial institutions and financial services organizations in Europe, with operations in 50 countries and territories.

 

We are looking for a GCP DevOps Engineer with Data / Data Engineer with experience in data processing technologies and techniques to join our Krakow-based POD and lead the building of the client's target-state Data Warehouse solution and data models/products. While the core skills are listed below, we are mainly looking for passionate people who want to continually improve and who challenge themselves to work in a highly disciplined, verifiable manner.

 

The person will also develop, test, deploy, and optimize data pipelines (databases and files, in both batch and stream), data models, data products, and reporting views for a variety of data sources (other cloud providers and on-premises), along with their supporting CI/CD pipelines.
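
To give a flavour of the batch-ingestion side of this work, below is a minimal sketch that loads a file from Cloud Storage into BigQuery using the google-cloud-bigquery Python client. The project, bucket, dataset, and table names are hypothetical placeholders, not details of the client's actual solution.

```python
# A minimal sketch, assuming the google-cloud-bigquery client library.
# All project, bucket, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,          # skip the header row
    autodetect=True,              # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/landing/trades.csv",   # hypothetical source file
    "example-project.staging.raw_trades",       # hypothetical target table
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(f"Loaded {load_job.output_rows} rows.")
```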

 

Employment Type – B2B/Perm

 

Responsibilities

  • Review, refine, interpret, and implement business and technical requirements.
  • Take part in ongoing productivity and prioritization using user stories, Jira, backlogs, etc.
  • Deliver requirements to scope, quality, and time commitments, following Agile practices.
  • Onboard new data sources; design, build, test, and deploy cloud data ingestion, pipelines, warehouse, and data models/products.
  • Build and operate optimal data pipelines/models/products with SQL, stored procedures, indexes, clusters, partitions, triggers, etc. (a minimal partitioning sketch follows this list).
  • Create, own, enhance, and operate CI/CD pipelines using Git, Jenkins, Groovy, etc.
  • Deliver a data warehouse and pipelines that follow API, abstraction, and 'database refactoring' best practices in order to support evolutionary development and continual change.
  • Develop procedures and scripts for data migration, back-population, and feed-to-warehouse initialization.
  • Extend the solution with a Data Catalogue.
  • Protect the solution with Data Governance, Security, Sovereignty, Masking and Lineage capabilities.
  • Deliver non-functional requirements, IT standards, and developer and support tools to ensure our applications are secure, compliant, scalable, reliable, and cost-effective.
  • Ensure a consistent approach to logging, monitoring, error handling, and automated recovery, as per HSBC standards.
  • Fix defects and deliver enhancements.
  • Maintain a good-quality, up-to-date knowledge base, wiki, and admin pages for the solution.
  • Peer-review colleagues' changes.
  • Speak up and help shape how we do things better.
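
As referenced in the list above, here is a minimal sketch of creating a date-partitioned, clustered BigQuery table with the google-cloud-bigquery Python client. The table name, schema, and fields are hypothetical illustrations, not details taken from this role.

```python
# A minimal sketch, assuming the google-cloud-bigquery client library.
# Table name, schema, and fields are hypothetical illustrations.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

table = bigquery.Table(
    "example-project.warehouse.transactions",
    schema=[
        bigquery.SchemaField("txn_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("account_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("amount", "NUMERIC"),
        bigquery.SchemaField("txn_date", "DATE", mode="REQUIRED"),
    ],
)
# Partition by day on the date column and cluster by account so that
# typical per-account, per-date queries scan less data.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="txn_date",
)
table.clustering_fields = ["account_id"]

client.create_table(table)  # fails if the table already exists
```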

 

Qualifications / Requirements:

  • Expert in Administration and development of Traditional and Cloud Databases
  • Excellent understanding of GCP Core and Data Products, Architecting and solution design
  • At least one year of working experience in Google Cloud Platform development, especially on data/ETL-related projects
  • Data preparation, wrangling, and refactoring skills, for example as part of a data science pipeline
  • IT methodology/practices knowledge and solid experience in Agile/Scrum
  • Experience in building and operating the CI/CD life cycle with Git, Jenkins, Groovy, Checkmarx, Nexus, Sonar IQ, etc.
  • Experience with collaboration tools such as Jira, Confluence, and various board types
  • BS/MS degree in Computer/Data Science, Engineering or a related subject
  • Excellent communication and interpersonal skills in English; proficiency in spoken, listening, and written English is crucial.
  • Enthusiastic willingness to rapidly and independently learn and develop technical and soft skills as needs require.
  • Strong organizational and multi-tasking skills.
  • Good team player who embraces teamwork and mutual support.
  • Interested in working in a fast-paced environment

 

Nice to have:

  • Experience of deploying and operating Data Fusion/CDAP-based solutions.
  • Experience of the DevOps model for GCP-based big data / ETL solutions.
  • Expertise in Java, Python, and Dataflow (an illustrative Beam sketch follows this list).
  • Broad experience with IT development and collaboration tools.
  • An understanding of IT Security and Application Development best practice.
  • Understanding of, and interest in, various investment products, their life cycles, and the nature of the investment banking business.
  • Experience of working with infrastructure teams to deliver the best architecture for applications.
  • Experience of working in a global team with different cultures.
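
As referenced in the list above, Dataflow is Google Cloud's managed runner for Apache Beam. Below is a minimal, hypothetical Beam sketch in Python; the bucket paths are placeholders, and on GCP this would run with the DataflowRunner rather than the local default.

```python
# A minimal sketch, assuming the apache-beam Python SDK.
# Paths are hypothetical; pass runner/project options for Dataflow.
import apache_beam as beam

with beam.Pipeline() as pipeline:  # DirectRunner locally by default
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/events.csv")
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "DropMalformed" >> beam.Filter(lambda fields: len(fields) == 3)
        | "Reformat" >> beam.Map(lambda fields: "|".join(fields))
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/events")
    )
```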

 

Benefits:

  • Private medical care and life insurance.
  • Multisport card

 

Have we sparked your interest?

 

Get in touch! We are looking forward to speaking to you. :)

 

Reach out to me at r.mishra@vertex-solutions.com or apply for this job to find out more.
