Data Engineer II
Job Number: JR-008319
Location: Home Office
If you're passionate about building a better future for individuals, communities, and our country, and you're committed to working hard to play your part in building that future, consider WGU as the next step in your career.
Driven by a mission to expand access to higher education through online, competency-based degree programs, WGU is also committed to being a great place to work for a diverse workforce of student-focused professionals. The university has pioneered a new way to learn in the 21st century, one that has received praise from academic, industry, government, and media leaders. Whatever your role, working for WGU gives you a part to play in helping students graduate, creating a better tomorrow for themselves and their families.
This position will close internally on 9/10/21.
The principal function of the Data Engineer is to build robust ETL/ELT pipelines that make data available to Data Analysts, Data Scientists, and other internal business units. The Data Engineer will be responsible for coding and working through the development cycle of ETL/ELT jobs and BI reporting. They will understand business requirements and design optimal ETL/ELT processes for data acquisition, transformation, and publication for ease of analysis.
The Data Engineer will follow agile development methodology for timely delivery of accurate data. They must have the coding proficiency to write unit tests for pipeline functionality as well as data quality, including job monitoring, alerting, and code versioning and deployment. This position is also expected to develop, modify, and deploy formal and ad hoc reports.
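The kind of data-quality unit test described above can be sketched minimally as follows. All function names, fields, and rules here are hypothetical illustrations, not WGU's actual code:

```python
def filter_valid_rows(rows, required_fields):
    """Keep only rows where every required field is present and non-null."""
    return [
        row for row in rows
        if all(row.get(field) is not None for field in required_fields)
    ]

def test_filter_valid_rows():
    rows = [
        {"student_id": 1, "email": "a@example.com"},  # complete row, kept
        {"student_id": 2, "email": None},             # null email, dropped
        {"email": "c@example.com"},                   # missing student_id, dropped
    ]
    assert filter_valid_rows(rows, ["student_id", "email"]) == [
        {"student_id": 1, "email": "a@example.com"}
    ]

test_filter_valid_rows()
```

In practice a test like this would run under a framework such as pytest as part of the pipeline's CI checks.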
Essential Functions and Responsibilities:
- Develop and build ETL/ELT data pipelines for use in data analysis.
- Create and maintain optimal data pipeline architecture.
- Keep our data separated and secure across multiple cloud environments.
- Assemble large, complex data sets that meet functional/non-functional business requirements.
- Deliver ad hoc and analytical reports to internal users and teams.
- Monitor and maintain ETL/ELT jobs and troubleshoot load issues.
- Manage change requests/ticket queues for analytical reports and ETL/ELT jobs.
- Perform data/technology discovery from new sources and third-party applications for data ingestion.
- Create complex reports and dashboards in Cognos and Tableau.
- Ingest and transform structured, semi-structured, and unstructured data from sources including relational databases, NoSQL, external APIs, JSON, XML, delimited files, and more.
- Work in an agile methodology on new development projects, delivering efficient and effective solutions on time.
- Analyze and understand data sources and design data models for data capture and ETL/ELT.
- Identify bugs, apply fixes, and check data quality via process/pipeline audits.
- Work with team members, as well as across teams, for product delivery.
- Work in an agile environment with timely delivery of ETL/ELT pipelines and reports.
Knowledge, Skills, and Abilities:
- Cloudera or Hortonworks
Organizational or Student Impact:
- Works on assignments of medium to high complexity.
- Structures project plans and manages cost-effective execution of tasks.
- Limits errors to prevent impact to client operations, costs, or schedules.
- Follows established processes and protocols.
Problem Solving & Decision Making:
- Meets department and personal goals with some direction/supervision.
- Serves as an important contributor on large technical projects and programs.
- Uses discretion to help design and implement solutions to somewhat complex problems.
Communication & Influence:
- Communicates with contacts both within and outside of the function on matters that require explanation, interpretation, and advising; typically has responsibility for communicating to parties outside of the organization.
- Works to influence parties within the function at an operational level regarding policies, practices, and procedures.
Leadership & Talent Management:
- May be responsible for providing guidance, coaching, and training to other employees within the technical area.
- May manage technical projects at this level, requiring responsibility for the delegation of work and reviewing others' work products.
Minimum Qualifications:
- Bachelor's degree in Business, Information Systems, Computer Science, or a related field, or an equivalent combination of experience and training.
- Three or more years of experience in data engineering, data integration, big data, business intelligence, or software engineering.
- Experience working with SQL, Python, and Databricks.
- An advanced degree (Master's or PhD) may be required for certain disciplines and may reduce the experience requirement by 2-4 years of industry-relevant experience.
- Exposure to the higher education domain.
- Exposure to analytical reporting tools, preferably Cognos and Tableau.
- Use of industry best practices for code development, testing, implementation, and documentation.
Physical Requirements:
- Prolonged periods sitting at a desk and working on a computer.
- Must be able to lift up to 15 pounds at times.
As an equal opportunity employer, WGU recognizes that our strength lies in our people. We are committed to diversity.