La Haus is hiring a

Data Engineer

Remote
La Haus has the goal of making financial and geographic freedom accessible to millions of households in Latin America. We are transforming the real estate industry with world-class technology, data, and customer service. Technology provides efficiency and fluidity in our processes, data achieves transparency and precision, and personalized customer service brings the human touch to the most important financial decision for most people: Buying a home or starting an investment project.

We are looking for a Hauser who:

  • Is guided by Human Sense, always with respect and empathy for others.
  • Is an example of Transparency.
  • Has a mindset of Innovation, always looking for the best way to do things.
  • Works with Autonomy and is the owner of their future.
  • Is Humble and can see the opportunity to learn from any person and situation.
  • Always goes the extra mile, working with commitment and dedication.
  • Keeps it Simple, remembering that less is more.

The role

  • This role is focused on building a robust, cloud-based data infrastructure using the techniques and technological resources available to the company. The data sources to be handled are both external and internal. For internal sources, an intermediate or advanced level of SQL is desirable, along with other languages for manipulating and analyzing data (Python, JavaScript, R).
  • Some of the technologies to consider are: Postgres, Snowflake, Redshift, AWS Glue, AWS Lake Formation, AWS Kinesis, AWS Athena, AWS S3, Heap Analytics, and third-party APIs. A minimal ingestion sketch using a few of these follows below.
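
To give a concrete picture of the kind of work involved, here is a minimal sketch of an extract-and-land step from an internal Postgres source into an S3-backed lake, written in Python with Pandas. The connection string, table, column, and bucket names are hypothetical placeholders, not details from this posting.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string; in practice, credentials come from a secrets manager.
engine = create_engine("postgresql+psycopg2://user:password@host:5432/lahaus")

# Extract an internal source (placeholder table and column names).
listings = pd.read_sql("SELECT id, city, price, created_at FROM listings", engine)

# Land the raw extract in the lake as Parquet (requires pyarrow and s3fs installed).
listings.to_parquet("s3://example-lakehouse/raw/listings/listings.parquet", index=False)
```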

Tasks and Responsibilities:

  • Create and support data ingestion processes in our LakeHouse.
  • Support data governance for the DWH and LakeHouse.
  • Build different data models for our organization.
  • Support teams that need to create or implement data models and extractions from different types of internal and external data sources.
  • Model and build data structures according to business requirements.
  • Create, automate and monitor the data pipelines that feed the company's transactional and analytical databases.
  • Run exploratory analysis of the data sets consumed by the company to establish initial data quality metrics (a minimal sketch follows this list).
  • Support the product team in designing and optimizing the databases that serve the transactional and analytical workloads of all countries.
  • Keep metadata up to date.
  • Communicate constantly with data analysts to implement or improve data extractions and models.
  • Design and implement strategies to integrate the company's different data sources.
  • Gather requirements from the different business areas.
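
As an illustration of the exploratory data-quality task above, here is a small Python sketch that computes per-column completeness, cardinality, and a duplicate-row count for a data set already loaded into a Pandas DataFrame; the function name and the example data set are hypothetical.

```python
import pandas as pd

def initial_quality_metrics(df: pd.DataFrame) -> pd.DataFrame:
    """Per-column completeness, cardinality, and dtype, plus a duplicate-row count."""
    metrics = pd.DataFrame({
        "null_pct": df.isna().mean().round(4),   # share of missing values per column
        "distinct_values": df.nunique(),         # cardinality per column
        "dtype": df.dtypes.astype(str),          # inferred type per column
    })
    metrics.attrs["duplicate_rows"] = int(df.duplicated().sum())
    return metrics

# Usage on a hypothetical extract:
# listings = pd.read_parquet("s3://example-lakehouse/raw/listings/listings.parquet")
# print(initial_quality_metrics(listings))
```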

Competences:

  • High level of programming in Python, including Pandas.
  • High level of development with PostgreSQL/Snowflake databases.
  • High SQL level (for exploratory queries, ETL, and management).
  • Highly quantitative, logical and accurate thinking.
  • High level in ETL development.
  • Mid-level experience with document-oriented NoSQL databases (MongoDB).
  • High level of knowledge in dimensional modeling (Kimball); a minimal star-schema sketch follows this list.
  • Familiarity with concepts of data lakes, OOP, software engineering.
  • Comfort with and a taste for university-level mathematics, both discrete and continuous.
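
For the dimensional modeling competence, here is a minimal sketch of a Kimball-style star schema created from Python via SQLAlchemy against Postgres. The fact and dimension tables (fact_sales, dim_property) and their columns are hypothetical examples, not the company's actual model.

```python
from sqlalchemy import create_engine, text

# Hypothetical analytical database; credentials would come from configuration.
engine = create_engine("postgresql+psycopg2://user:password@host:5432/analytics")

ddl = """
CREATE TABLE IF NOT EXISTS dim_property (
    property_key  SERIAL PRIMARY KEY,
    property_id   TEXT NOT NULL,      -- natural key from the source system
    city          TEXT,
    property_type TEXT
);
CREATE TABLE IF NOT EXISTS fact_sales (
    sale_key      SERIAL PRIMARY KEY,
    property_key  INT REFERENCES dim_property (property_key),
    date_key      INT,                -- would reference a conformed date dimension
    sale_amount   NUMERIC(14, 2)
)
"""

with engine.begin() as conn:
    # Execute each statement separately; the statements here contain no embedded semicolons.
    for statement in ddl.split(";"):
        if statement.strip():
            conn.execute(text(statement))
```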

Benefits of working at La Haus:

  • A dynamic work environment, with an entrepreneurial spirit in which we love to work as a team.
  • Competitive compensation package.
  • Medical support.
  • 15 vacation days + 5 personal days.
  • Professional and psychological coaching.
  • Subsidies for education, fitness and wellness programs.
  • Working fully remote with an international team.

Learn more about La Haus
 
#BeHauser
