Caterpillar is investing in our digital future, and we’re looking for a talented and motivated Data Engineer to build a world class platform to host a wide range of applications.
Our iconic products have evolved from mechanical workhorses into highly sophisticated, electronically controlled worksite solutions. This transformation, along with our smart factories and integrated dealer network, has generated a wealth of data ready to be leveraged by our customers and dealers. We’re generating innovative solutions from millions of data points and hundreds of thousands of IoT-connected assets.
We are seeking an experienced Data Engineer to work alongside a senior-level Data Architect to develop Telematics-specific data models, dictionaries, and documentation. Our Data Engineers also have plenty of opportunity to independently execute proofs of concept and proofs of technology, helping us continually evolve our Telematics data management.
JOB DUTIES: As a Data Engineer, you will contribute to the design, development, and deployment of Caterpillar’s state-of-the-art digital platform.
- Competent to perform all programming, project management, and development assignments without close supervision; normally assigned the more complex aspects of systems work.
- Works directly on complex application/technical problem identification and resolution.
- Works independently on complex systems or infrastructure components that may be used by one or more applications or systems.
- Contributes to development focused on delivering features that provide business value.
- Maintains high standards of quality within the team by establishing and adhering to good practices and habits.
- Identifies and encourages areas for growth and improvement within the team.
- Communicates with internal customers to help in the development, debugging, and testing of software for accuracy, integrity, interoperability, and completeness.
- Performs integrated testing of components that requires careful planning and execution to ensure timely, quality results.
BASIC QUALIFICATIONS:
- A four-year degree from an accredited college or university.
- 5+ years developing big data models
- 5+ years developing highly normalized data models
- 3+ years working with Erwin
- 3+ years developing in Python
- 3+ years of NoSQL experience
Top candidates will also have:
- Proven experience in some of the following:
  - Designing, developing, deploying, and maintaining software at scale.
  - Developing software applications using relational and NoSQL databases.
  - Deploying software using CI/CD tools such as Jenkins, GoCD, or Azure DevOps.
  - Deploying and maintaining software on public clouds such as AWS or Azure.
  - Working within an Agile framework (ideally Scrum).
  - Developing advanced SQL.
  - The Linux OS.
  - API consumption models.
  - Analytics models.
  - In-memory data stores.
  - Designing and developing conceptual, logical, and/or physical data models.
- Strong understanding of and/or experience in some of the following:
  - Designing well-defined RESTful APIs.
  - Writing API proxies on platforms such as Apigee Edge, AWS API Gateway, or Azure API Management.
  - Hands-on experience with API tools such as Swagger, Postman, and Assertible.
  - Test-driven development and behavior-driven development.
  - Hands-on experience with testing tools such as Selenium and Cucumber, and their integration into CI/CD pipelines.
  - Datastores such as MongoDB, Cassandra, Redis, Elasticsearch, MySQL, and Oracle.
- Solid knowledge of computer science fundamentals, such as data structures and algorithms.
- The ability to work under pressure and within time constraints.
- A passion for technology and an eagerness to contribute to a team-oriented environment.