Who we are:
Loadsmart aims to move more with less. We combine great people and innovative technology to more efficiently move freight throughout North America. Our focus is on designing and building the best tools for our team and our customers, using machine learning models to connect freight with trucks. We automate with algorithms and scale with integrations to better match supply and demand. In doing this we reduce wasted fuel and lost time, cutting out empty miles for motor carriers and providing cost savings and instant booking for shippers.
Where we are:
Loadsmart was founded in New York and is currently headquartered in Chicago, IL. Our teams operate remotely from different parts of the United States as well as in several locations across Latin America.
The role:
You will join the Data Engineering squad, working remotely from LATAM. You will help create and maintain the business intelligence stack, with a particular focus on building a scalable backend data architecture and operation. Working closely with Business Analysts and the Product team, you will extract and transform data from our products and own the analytics layer of the company's data environment, ensuring the data available to the business is fresh and accurate. You will also help build self-serve tools that give our users faster insights and support data-informed decisions.
What you will do:
- Perform data modeling (Data Warehouse, OLAP)
- Write ad-hoc SQL queries
- Streamline data transformation with robust pipeline and ETL tools
- Build and maintain integrations
- Maintain data integrity
- Design and implement data automation (exports, pipelines)
- Design and implement scalable processes
What you will need:
- Fluency in English (both written and spoken) and comfort talking to native English speakers on a daily basis.
- Ability to lead internal stakeholders and implement initiatives with little or no supervision.
- A habit of staying up to date on new and upcoming data architecture technologies.
- A minimum of 5 years of experience with data warehousing, ETL, and advanced SQL.
- Advanced skills in data processing and pipeline automation using Python.
- Experience with dimensional modeling, data governance, master data management, and security.
- Experience with installation, administration, and performance tuning of databases (OLTP, OLAP, or NoSQL).
- Experience modeling and developing OLAP cubes, metrics, etc. in Power BI or similar tools.
- Experience with a pipeline management tool.
- Experience with version control tools like Git and GitHub.
- Experience with the AWS ecosystem (Redshift, S3, EC2, Data Pipelines).
- Experience with Linux and cron jobs.
Preferred but not required:
- Experience with Tableau, Power BI or Analysis Services
What you will find here:
- Generous Stock Option Plan
- Competitive Compensation
- The Experience of Building a Rapidly Growing Tech Company
- International Environment / Career
- Ability to Work with Cutting-Edge Technology
- Access to an Online Learning Platform
We are an international company, so we only accept resumes in English.
At Loadsmart, we believe our biggest asset is our people. We are proud to be an equal opportunity employer, hiring and developing individuals from diverse backgrounds and experiences to add to our collaborative culture. Loadsmart treats all candidates and employees with respect and does not discriminate in our recruiting, hiring, and promoting processes, including on the basis of race, color, religion, sex, age, sexual orientation, gender identity and/or expression, national origin, veteran status, or disability.