Contract Type: Full Time
Hadoop Developer - HOP04144

Working at Cargill is an opportunity to thrive: a place to develop your career to the fullest while engaging in meaningful work that makes a positive impact around the globe. You will be proud to work for a company with a strong history of ethics and a purpose of nourishing people. We offer a diverse, supportive environment where you will grow personally and professionally as you learn from some of the most talented people in your field. With 150 years of experience, Cargill provides food, agriculture, financial and industrial products and services to the world. We have 150,000 employees in 70 countries who are committed to feeding the world in a responsible way, reducing environmental impact and improving the communities where we live and work. Learn more at www.cargill.com.
As a Senior Data Engineer, you will work on a product team using the Agile Scrum methodology to design, develop, deploy, and support solutions that leverage the Cargill big data platform. The Senior Data Engineer will work with Enterprise Architecture, D&BI Solution Architects, and Business Analysts to understand business unit requirements and to build solutions that meet their needs and objectives. This role requires the ability to interpret and apply the data ingestion/storage/usage patterns developed by the architecture team in order to build solutions. The Senior Data Engineer is expected to leverage common solutions and services and to follow Cargill development standards and principles. The Senior Data Engineer will also be responsible for troubleshooting complex incidents related to development.
60% - Development, Testing, and Quality:
- Perform integration development to move data from production systems to database/data warehouses using ETL tools.
- Develop technical solutions.
- Support testing by fixing defects and making necessary back end design changes.
- Ensure adherence to development and architecture standards and best practices.
- Provide necessary technical support through all phases of testing and incident handling after deployment.
- Provide support for product solutions as part of a DevOps model.
30% - Solution Analysis:
- Work with businesses, process owners, and product team members to design solutions for Cargill's big data and Advanced Analytics solutions.
- Perform data modeling and prepare data in databases for reporting through various analytics tools.
- Create or modify design documentation as defined by team development standards, processes, and tools.
- Ensure the solution designed and built is supportable as part of a DevOps model.
10% - Business Partnering and Relationship/Service Management:
- Provide technical expertise in collaborating with project and other IT teams, internal and external to Cargill.
- Play a technical specialist role in championing data as a corporate asset.
- Take on technical leadership roles within the team, including project leadership, design leadership, and mentoring.
- Bachelor's degree, equivalent education, or 7 years of experience in Computer Science, Information Systems, Engineering, or a quantitative field
- 5 years of application development or data warehouse experience, including: analysis, technical design, coding, testing, deployment, and transition to support.
- 2 years of end-to-end software development experience
- 2 years of experience deploying solutions through formal change control processes
- Curious about data and passionate about the business value of big data and advanced analytics
- Agile, quick learner of new technologies
- Ability to work well in cross-functional teams and foster team commitment to meeting objectives
- Proactive, creative problem solving skills in ambiguous and changing environments
- Strong data modeling and query-writing skills
- Experience building tables/views or data warehouses in Oracle or MS SQL Server environments
- Experience with database programming on Oracle or MS SQL Server
- Understanding of structured data, dimensional models or cubes, and various forms of ETL for reporting
- Experience preparing data for use in analytics and reporting
- Familiarity with unstructured data
- Familiarity with object oriented programming
- Business fluency in English
- 3 years of experience developing in SAP HANA, SAP BW, Oracle, or SQL Server
- Experience building Big Data Solutions using a secure Hadoop environment and NoSQL technology
- Agile/scrum experience
- Experience with data modeling using ETL/ingestion tools such as SLT, StreamSets, Business Objects Data Services (BODS), Sqoop, and Flume
- Advanced in at least one functional programming language
- Experience with scripting languages (SQL, Scala, Pig, Bash/Python) to manipulate data
- Experience in a JVM language
- Experience working with front-end visualization tools like Power BI, Tableau, and Business Objects
- Experience with NoSQL data stores (especially Cassandra)
- Experience with Spark, Hive, Impala
- Experience with Kafka
- Comfortable scripting in *NIX environment (ssh and standard commands)
- Understanding of object-oriented and functional programming paradigms
- Contributions to large open-source projects
- Version control, particularly Git
Job: Information Technology
Primary Location: US-MN-Hopkins
Job Type: Standard
Shift: Day Job