Sustaining Engineer Data Operations job at Oportun, Guanajuato, Mexico

Sustaining Engineer Data Operations

SUMMARY


Do you want to be part of a BIG data transformation journey? Do you love exploring new avenues and pioneer things in the technology space? Do you love designing and implementing business-critical data management & engineering solutions using emerging technologies? Do you enjoy solving complex business problems in a fast-paced, collaborative, and iterative delivery environment? If this excites you, then keep reading!


We're seeking a hands-on Sustaining Engineer, Data Operations, who can design, code, and provide architecture solutions for the team. The right candidate for this role is passionate about technology, can interact with product owners, analysts, and technical stakeholders, thrives under pressure, and is hyper-focused on delivering exceptional results through strong teamwork.


Essential Functions:

  • Contribute to the design of scalable Big Data solutions across the entire data supply chain with a focus on ensuring the functionality delivered can be monitored for health and the design is extensible
  • Monitor the health of existing processes to ensure they deliver the expected output, and proactively raise issues when they do not
  • Modify existing data integration and data warehouse processes to account for requirement changes
  • Participate in 2nd-level production support
  • Verify the accuracy of data and testing methods, and maintain and support the Analytics Data Platform
  • Collaborate with management, business partners, analysts, developers, architects, and engineers to support all data quality efforts
  • Create and review technical and user-focused documentation for data solutions (data models, data dictionaries, business glossaries, process and data flows, architecture diagrams, etc.)
  • Operational functions
  • You don't just learn how things work, you learn why. Understanding how systems work at a fundamental level is a passion of yours
  • Be open and willing to learn new skills!

WHAT SKILLS ARE WE LOOKING FOR IN AN IDEAL CANDIDATE?

  • In data management and data access (Big Data, traditional data marts, and data warehousing)
  • In advanced programming (Python, shell scripting, and Java)
  • In interactive and batch processing using Spark SQL and Spark scripting
  • In applied data technologies:
    • Hadoop
    • Spark
    • Kafka, Spark Streaming
    • Pig
    • Hive
    • MongoDB
    • Oozie
    • EMR
    • Lambda
    • SQL
  • In current data warehousing concepts and technologies such as Redshift, Spark, Hadoop, and web services to support business-driven decision-making
  • In data architecture and data assembly
  • In Data Governance and Data Security

Experience (requires little direction) with:

  • Functional requirements, detailed technical specifications, and test cases for new or modified projects
  • Understanding of data sources (e.g., 3rd-party RDBMS such as MS Access, SQL Server, Oracle, and MySQL)
  • Data integration tools (Talend preferred)
  • Data manipulation scripting languages
  • Business Intelligence, MDM, XML, SOA/WebServices
  • Executing deliverables using Agile

REQUIREMENTS:

  • Proficiency in verbal and written English (90%)
  • Excellent organizational and project management skills
  • Bachelor’s degree in computer science, data processing, or an equivalent field
  • 5+ years of experience in Data Warehousing or similar analytic data experience
  • 5+ years of experience with Java programming and developing frameworks
  • 2+ years’ experience with Hadoop and Spark
  • 2+ years’ experience with Amazon EMR/EC2 (or equivalent)
  • 2+ years’ experience with Python
  • Experience with Postgres and MySQL
  • A solid understanding of basic core computer science concepts
  • Familiarity with Linux
  • Experience with Bitbucket and a solid understanding of core concepts with Git a plus
  • Familiarity with Jenkins and CI/CD
  • Experience with AWS technologies such as Aurora, Athena, EMR, Redshift, S3
  • Experience with Scala a plus
  • Experience with Talend Data Integration (Big Data) platform a plus