Pivotal Software is hiring a

Customer Engineer (HAWQ Support Engineer)

Palo Alto, United States


At Pivotal, our mission is to enable customers to build a new class of applications, leveraging big and fast data, and to do all of this with the power of cloud independence. Pivotal’s offering includes the Big Data Suite, the most complete approach to enterprise data lakes and advanced analytics; Pivotal Cloud Foundry, the industry-leading Platform as a Service product; and world-leading ultra-agile application development through Pivotal Labs. Open source is an important part of our strategy. Many of our products are already open source; those that are not will be soon.

The Big Data Suite includes HAWQ, our SQL on Hadoop solution; Greenplum Database (GPDB), our massively parallel data warehouse; GemFire, our distributed in-memory key-value store; and MADlib, our machine learning solution. With a rich and compliant Structured Query Language (SQL) dialect, Pivotal HAWQ® supports application portability and a large ecosystem of data analysis and data visualization tools such as SAS, Tableau, and more. Analytic applications written over HAWQ are easily portable to other SQL-compliant data engines, and vice versa. This prevents vendor lock-in for the enterprise and fosters innovation while containing business risk.

Pivotal HAWQ natively supports various Hadoop file formats and is managed like a native Hadoop service within Apache Ambari. Pivotal HAWQ also runs on Hortonworks Data Platform in addition to Pivotal Hadoop Distribution. This gives the enterprise more choice while minimizing the integration and lifecycle management costs of the complete Hadoop solution.

We are looking for candidates with experience in schema design, SQL tuning, strategies for database maintenance and monitoring, backup and restore, disaster recovery, database mirroring, debugging with gdb and strace, operating system and networking optimization for the database platform, scripting and automation to manage the database, and deep insight into the inner workings of MPP databases. You may have worked on Hadoop-based solutions, relational databases, or NoSQL systems. Strong Linux experience is also required.

Please note: This role supports US government clients that require US citizenship or Permanent Residency. As a result, US citizenship or Permanent Residency is required to apply.


Responsibilities and Desired Skills

  • Apply advanced and in-depth knowledge to analyze, diagnose, replicate, troubleshoot, and resolve standard to highly complex customer-reported technical issues around Pivotal HAWQ.
  • Contribute to the internal and external knowledge base.
  • Assist and mentor junior staff on the team to resolve complex issues.
  • Escalate unresolved issues that require more in-depth knowledge to engineering in a timely manner.
  • Database fundamentals: database concepts, SQL, queries, plans, joins.
  • Experience with Hadoop-based solutions, preferably Pivotal Hadoop or Hortonworks Data Platform.
  • Strong analytical, troubleshooting, and problem-solving skills.
  • OS fundamentals: Linux navigation, memory, swap, file systems, and an understanding of tools such as ps, top, vmstat, and iostat.
  • Database troubleshooting: client/server architecture, query execution sequence, execution plans, log files.
  • Experience with in-memory processing and/or Hadoop desirable.
  • Strong communication skills and the ability to work as part of a team.



Added Advantages

  • Working exposure to PostgreSQL/Teradata/Netezza/Hadoop.
  • Well versed in database architecture, with an understanding of how it works internally.
  • Troubleshooting skills (PostgreSQL preferred).
  • SQL tuning.
  • DBA experience in a 24x7 production environment.
  • Database patching and upgrade skills in a production environment (PostgreSQL preferred).
  • Strong knowledge of data normalization and data relationships.
  • Experience with Unix/Linux.
  • Customer communication skills.


Nice To Have Skills


  • Ability to develop and enhance scripts to automate root cause analysis.
  • Experience with data warehousing technologies (PostgreSQL/Teradata/Netezza).
  • Knowledge of Python, C++/Java, and Bash.