Caterpillar is investing in our digital future, and we’re looking for a talented and motivated DevOps Engineer to build a world-class platform to host a wide range of applications.
Our iconic products have evolved from mechanical workhorses to highly sophisticated, electronically controlled worksite solutions. This transformation, along with our smart factories and our integrated dealer network, has produced a wealth of data ready to be leveraged by our customers and our dealers. We’re generating innovative solutions from millions of data points and hundreds of thousands of IoT-connected assets.
- Build infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Design, develop, and maintain performant and scalable applications
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability
- Perform debugging, troubleshooting, modifications and unit testing of integration solutions
- Operationalize the developed jobs and processes.
- Create solutions and methods to monitor systems and solutions
- Automate code testing and pipelines
- Engage directly with business partners to participate in design and development of data integration/transformation solutions per functional requirements.
- Work in a scaled Agile environment, accountable for delivering results in sprints.
- Engage and actively seek industry perspectives through external engagements such as hackathons, peer groups, etc.
- Employee is also responsible for performing other job duties as assigned by Caterpillar management from time to time.
- BS in Computer Science, Computer Engineering, or Engineering related field
- Understanding of data structures, algorithms, profiling & optimization.
- Understanding of SQL, ETL design, and data modeling techniques
- Thrive in a fast-paced environment that delivers results and has fun
- AWS Certified Developer & Cloud Practitioner
- Experience managing continuous integration systems (Jenkins, etc.)
- Strong background working with revision control systems (Git, etc.).
- Experience with build automation tools (Maven, etc.).
- Advanced level of experience with object-oriented programming, data structures and algorithms.
- Passion for acquiring, analyzing, and transforming data to generate insights.
- Strong analytical ability, judgment and problem analysis techniques.
- Working knowledge of Agile Software development methodology.
- Solid understanding of concepts of cloud computing.
- Strong verbal and written communication skills for effective cross-functional collaboration.
- Interpersonal skills with the ability to work effectively in a cross-functional team.
- Knowledge of enterprise data sources and uses
- Strong working relationships with data owners/stewards
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with API managers: Apigee, Azure, Catana, etc.