Senior Data Platform Engineer

Employer
Location
Hopkins, MN
Posted
Dec 20, 2016
Closes
Apr 11, 2017
Contract Type
Full Time

Senior Data Platform Engineer - HOP03922

Working at Cargill is an opportunity to thrive: a place to develop your career to the fullest while engaging in meaningful work that makes a positive impact around the globe. You will be proud to work for a company with a strong history of ethics and a purpose of nourishing people. We offer a diverse, supportive environment where you will grow personally and professionally as you learn from some of the most talented people in your field.

With 150 years of experience, Cargill provides food, agriculture, financial and industrial products and services to the world. We have 150,000 employees in 70 countries who are committed to feeding the world in a responsible way, reducing environmental impact and improving the communities where we live and work. Learn more at www.cargill.com.

Description

As a Cargill Data Engineer you will work as part of a small team responsible for designing, building and maintaining a comprehensive big data platform for Cargill. This includes helping to define the architecture, developing ingestion/storage patterns, developing exposure and analysis patterns, and defining a service around the platform. The Data Engineer works closely with Solution Architects, Solution Analysts and Process Designers to understand and evaluate business requirements that require use of the big data platform.

The Cargill Data Engineer has a comprehensive understanding of both technical and functional design and will translate that knowledge into a big data solution that meets the needs of our businesses. The Cargill Data Engineer will ensure that solutions are aligned with D&BI architecture standards and principles, leverage common solutions and services, and meet financial targets.


Principal Accountabilities


75% - Data Engineering

  • Support the design, build, test and maintenance of Cargill's Big Data and Advanced Analytics platform
  • Design, build and test APIs that enable simple use of complex datasets
  • Build data models that are performant and reusable
  • Develop interactive dashboards and reports, and perform analysis
  • Execute strategies that inform data design and architecture in partnership with enterprise-wide standards
10% - Business Partnering, Relationship Management and Consulting
  • Regularly interfaces with architects, analysts, process designers, and BU/Function subject matter experts to understand and evaluate business requirements.
  • Works with the business to determine functional requirements and then translates those into platform-specific designs (including ingestion and storage patterns)
10% - Governance and Project Consulting
  • Demonstrates subject matter proficiency in the design, implementation and deployment of new software versions, infrastructure, and processes for a large portfolio of services, spanning a significant subset of the organization.
  • Plays a technical specialist role in championing data as a corporate asset.
  • Utilizing substantial knowledge of data practices and procedures, conducts quality assurance evaluations on new and existing technology.
  • Provides expertise while collaborating with other partner IT teams, internal and external to Cargill
5% - Run Operations
  • Collaborate with the AMS and Architecture organizations to maintain awareness of the overall health of the Data Platform and monitor scorecard metrics.
  • Provide support in the transition of big data solutions from the build organization to the operations organization.

Qualifications

Required Qualifications
  • Bachelor's degree or equivalent experience.
  • 10 years' experience in the Information Technology field
  • 7 years' experience developing software applications, including analysis, design, coding, testing, deployment and support
  • 2 years' experience with end-to-end software development processes and practices (agile/scrum experience preferred)
  • Proficient in application/software architecture (definition, business process modeling, etc.)
  • Understanding of Big Data technology; current on new ideas and tools
  • Good understanding of the Hadoop ecosystem and its low-level constructs
  • Broad understanding of object-oriented and functional programming paradigms
  • Experience in *nix environment (e.g., ssh and standard commands)
  • Demonstrated collaboration skills, able to engage in interactive discussions with technical and business-oriented teams
  • Experience with SQL
  • Experience with at least one scripting language (e.g., bash/python)
Preferred Qualifications
  • BS degree in Computer Science, Applied Mathematics, Physics, Statistics or area of study related to data sciences and data mining
  • Experience building Big Data solutions using Hadoop and/or NoSQL technology
  • Experience developing complex MapReduce programs against structured and unstructured data
  • Experience with loading data to Hive and writing software accessing Hive data
  • Experience loading external data to Hadoop environments using tools like MapReduce, Sqoop, and Flume
  • Experience using scripting languages like Pig to manipulate data
  • Experience working with very large data sets and building programs that leverage the parallel capabilities of Hadoop and MPP platforms
  • Experience interfacing with data-science products and creating tools for easier deployment of data-science tools
  • Experience in extending open-source Hadoop components
  • Experience with Scala and/or Spark
  • Experience with front-end BI tools (Tableau, Power BI, Business Objects)

Equal Opportunity Employer, including Disability/Vet.

Job Information Technology

Primary Location US-MN-Hopkins

Schedule Full-time

Job Type Standard

Shift Day Job