Agoda is hiring a

Hadoop DevOps engineer

Khwaeng Pathum Wan, Thailand

Our Bangkok team is looking for top-quality, passionate Hadoop Operations engineers to test, deploy and manage our next-generation Hadoop platform.

 

Our systems scale across multiple data centers, totaling a few million writes per second and managing petabytes of data. We deal with problems across real-time data ingestion, replication, enrichment, storage and analytics. We are not just using Big Data technologies; we are pushing them to the edge.

 

Agoda is the largest and fastest-growing online hotel booking platform in Asia, and as a Priceline Group company, we are part of the largest online travel company in the world. We have the dynamism and short chain of command of a startup, and the capital to make things happen. In the competitive world of online travel agencies, finding even the slightest advantage in the data can make or break a company. That is why data systems are among our top priorities.

 

While we are proud of what we have built so far, there is still a long way to go to fulfill our vision for data. We are looking for people who are as excited about data technology as we are to join the fight. You can be part of designing, building, deploying (and probably debugging) products across all aspects of our core data platform.

 

Responsibilities:

  • Manage, administer, troubleshoot and grow multiple Hadoop clusters
  • Build automated tools to solve operational issues
  • Run effective POCs on new platform products that can grow the list of services we offer

 

Qualifications:

Requirements

  • Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field
  • At least 3 years’ experience working with modern systems languages
  • Experience debugging and reasoning about production issues
  • A good understanding of data architecture principles

Bonuses

  • Any experience with ‘Big Data’ technologies or tools
  • Strong systems administration skills in Linux
  • Strong experience in JVM tuning
  • Python/shell scripting skills
  • Experience working with open source products
  • Experience working in an agile environment using test-driven methodologies

A few of the technologies we use:

Spray, Hadoop, Kibana, ElasticSearch, Yarn, Akka, Mesos, Kafka, Sensu, Redis, Scala, Python, Cassandra, Postgres, Spark, OpenStack, Logstash, Couchbase, Vertica, Grafana, Go.

 

We welcome both local and international applications for this role. Full visa sponsorship and relocation assistance available for eligible candidates.