
Big Data Engineer (Remote)

Remote, Permanent

Backed by one of the most well-respected private equity firms in the world, this company has pioneered brand-establishment and research technologies for IP and brand professionals. Being first to market, together with a continuous focus on research, innovation and collaboration, has solidified their stronghold on the industry as they plan for another stage of growth following a large acquisition.

Serving more than 5000 clients from around the globe, the goal is to empower them with the ability to easily monitor and protect brand assets from anywhere in the world. Now, they’re looking for an ambitious Big Data Engineer to join their diverse team, based anywhere in the world.

Industry:

Internet

What to expect:

You’ll be working with an advanced software solution that allows analysts to easily detect infringements that could have a detrimental effect on their clients’ businesses. Working with data on the tera-scale, the goal is to spot those infringements and coordinate investigation activities with advanced data-streaming and data-classification techniques. You’ll have the opportunity to work with machine learning, image recognition, risk analysis and fraud detection algorithms in parallel with the grand goal of tidying up the internet (no biggie).

Perks:

  • Full-time remote working environment
  • Laptop provided
  • The opportunity to work on challenging projects with a supportive and equally talented team
  • 25 vacation days per year

Requirements:

  • 5+ years of experience in Big Data Engineering
  • Experience with Java or Python and solid experience in SQL and related RDBMS (MySQL, PostgreSQL)
  • Hands-on experience with the big data ecosystem: Hadoop, Spark, Kafka, Flink or Airflow
  • Familiar with NoSQL Databases (ElasticSearch, MongoDB, DynamoDB)
  • Experience building large-scale Data Warehouses or Data Lakes in production environments
  • You have built several ETL pipelines and are familiar with ETL optimization techniques
  • Familiar with Amazon Web Services and related big data cloud services
  • Experience in developing enterprise multi-tenant SaaS
  • Experience with Linux/Unix a plus
  • Fluent communication skills in English

Sounds good?

Apply now
For more information, connect with our specialised team member Poppy Ashmore on LinkedIn.