UNHCR
Job title:
Associate Cloud Data Engineer (Maternity Cover)
Company:
UNHCR
Job description
Hardship Level (not applicable for home-based): H (no hardship)
Family Type (not applicable for home-based): Family
Staff Member / Affiliate Type: UNOPS LICA9
Target Start Date: 2024-11-01
Deadline for Applications: October 15, 2024

Terms of Reference

1. General Background
This maternity cover position is with the Data and Analytics function in UNHCR's Supporter Engagement Section (SES) within the Private Sector Partnerships Service (PSP, private fundraising) and serves departments within the Division of External Relations (DER), PSP and the wider UNHCR organization.
The Data and Analytics function supports UNHCR fundraising activities and works in the field of digital marketing and CRM tools. We operate systems for collecting marketing data and storing it in Google Cloud BigQuery. For bringing in new data and orchestrating it we use tools such as Cloud Data Fusion, Stitch, Skyvia and Apache Airflow (an illustrative pipeline sketch appears at the end of this description).
As Associate Cloud Data Engineer you will work with key stakeholders at UNHCR HQ and in the regional offices to advise on, develop and implement a best-practice analysis and marketing data warehouse in Google Cloud. You will be UNHCR's all-round expert in Google Cloud Platform products. Specific attention will be given to ensuring continued value from the UNHCR Data Lake for fundraising data in GCP BigQuery. This will include:
– Continued onboarding of more regional teams
– Adding more data sources and pipelines
– Designing and implementing data marts for data analysis and reporting officers.
– Introductory training in BigQuery
– Management and monitoring of pipelines
– Governance: managing user access and writing user guidelines
– Data management
– Working with UNHCR ICT security team

2. Purpose and Scope of Assignment
Under the overall supervision of the PSP Data and Analytics Officer, the individual contractor will assist in the following:
– Designing and implementing data models and data pipelines for digital analytics, digital ads optimization, donations and CRM engagement
– Optimizing and operating a commercial data warehouse based on the architecture of Google Cloud BigQuery (Managed serverless data warehouse) and related GCP products.
– Define solutions, requirements and processes to make data universally available to the analyst and reporting teams who deliver dashboards in Microsoft Power BI
– Operate and design data models for data in raw, curated and normalized formats to meet the needs of data consumers (data analysis and reporting officers)
– Assist other teams and stakeholders in identifying the best use cases for the data to make their operations faster
– Keep an eye out for changes in source-system APIs and identify potential improvements by staying up to date with these vendors' release notes
– Monitor the data pipelines and identify opportunities for refactoring and improvement
– Coordinate a global network of internal clients across fundraising, advocacy and other programs, ensuring standardized data conventions
– Work closely with the different teams to bring value from data and help them turn data and insights into concrete tasks
– Help promote a data-first strategy and identify business opportunities from analyzing data
– Coordinate UNHCR's community of BigQuery data warehouse users
– Be an advocate for the generally available dashboards in Power BI and work closely with the team that creates and manages these dashboards
– Data management: document data models and maintain data dictionaries
– Train colleagues in BigQuery and in understanding the data models
– Working with the UNHCR ICT security team to ensure security standards are met

3. Monitoring and Progress Controls
– Delivery roadmap developed for meeting business requirements by adding new data pipelines, improving existing data pipelines and data models, and onboarding more teams to BigQuery
– Maintain and follow the plan with agreed deliverables.
– New teams onboarded and trained in BigQuery
– New pipelines and data models made available for data consumers.
– Together with the Associate Data Quality Officer, set up data quality assurance, including GCP alerting and monitoring of flows, storage and CPU usage, and billing

4. Qualifications and Experience
a. Education
– University degree in Computer Science, Engineering, Mathematics, Statistics, Business Administration or Marketing, or a related field; computer science and digital marketing qualifications preferred
– Google Cloud Data Engineer Certification is preferred, but not a must.
– A completed training course in Google Cloud BigQuery or another cloud data warehouse or NoSQL database is preferred, but not a must.
– A completed SQL training course is preferred

b. Work Experience
– 3 years of relevant experience with a bachelor's degree, or 2 years with a master's or an equivalent or higher degree
– Experience with designing, building and maintaining data warehousing solutions using GCP technologies such as BigQuery, Cloud Storage, Dataform, Cloud Data Fusion, Apache Airflow, Cloud Composer and Pub/Sub, or equivalent AWS or Azure products
– Experience with SQL
– Python development experience with API and ETL integration tasks
– Experience with ETL processes and tools (e.g. Cloud Data Fusion, Stitch, Skyvia or similar tools)
– Experience in setting up data pipelines and automating data workflows
– Familiarity with digital marketing datasets such as Google Analytics, Google Ads and Meta Ads
– Familiarity with CRM systems such as Salesforce, and with reverse ETL to Salesforce, marketing automation tools and ad platforms to make data actionable
– Familiarity with cloud security best practices, including access control and monitoring, and with data privacy regulations such as GDPR and CCPA
– Experience with data visualization tools such as Microsoft Power BI or Looker is desirable
– Worked with GitHub or a similar tool

c. Key Competencies
– Strong understanding of data modeling, ETL processes, and data warehousing concepts
– A structured and analytical mindset for explaining fairly complex data models to other teams
– Staying up to date with changes in marketing data sources and with potential new data sources (Salesforce Sales Cloud, Google Analytics, Google Ads, Meta Ads)
– Project management skills
– Good communication skills for explaining technical topics clearly
– Strong interpersonal skills and ability to establish and maintain effective working relationships with people in a multicultural, multi-ethnic environment with sensitivity and respect for diversity.
– Fluency in spoken and written English is required
– Knowledge of another UN language is a plus (Arabic, Chinese, French, Russian, Spanish)

5. Location and Conditions
This is a remote-working position. The successful candidate will be home-based. It is a full-time role from 8.30am to 5pm, Monday to Friday (40 hours per week). The remuneration level and the applicable entitlements and benefits may differ based on the residence of the most suitable selected candidate. Please note that only shortlisted candidates will be notified.

Other information: This position doesn't require a functional clearance.
Home-Based: Yes
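For context, here is a minimal sketch of the kind of pipeline work described above: a small Python job that pulls records from a marketing-platform API and appends them to a raw-layer BigQuery table. The API endpoint, response shape, project, dataset and table names are hypothetical placeholders for illustration only; they do not describe UNHCR's actual systems or pipelines.

"""Illustrative sketch only: pull records from a (hypothetical) marketing API
and append them to a raw-layer BigQuery table. All names below are placeholders,
not UNHCR systems. Requires the requests and google-cloud-bigquery packages."""

import requests
from google.cloud import bigquery

API_URL = "https://api.example-ads-platform.com/v1/campaign_stats"  # hypothetical endpoint
TABLE_ID = "my-gcp-project.raw_marketing.campaign_stats"            # hypothetical table


def fetch_campaign_stats(report_date: str) -> list[dict]:
    """Fetch one day of campaign statistics from the (hypothetical) source API."""
    response = requests.get(API_URL, params={"date": report_date}, timeout=30)
    response.raise_for_status()
    return response.json()["rows"]  # assumed response shape: {"rows": [...]}


def load_to_bigquery(rows: list[dict]) -> None:
    """Append the fetched rows to the raw BigQuery table."""
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        autodetect=True,  # let BigQuery infer the schema for this raw layer
    )
    load_job = client.load_table_from_json(rows, TABLE_ID, job_config=job_config)
    load_job.result()  # wait for completion; raises on load errors


if __name__ == "__main__":
    rows = fetch_campaign_stats("2024-10-01")
    if rows:
        load_to_bigquery(rows)
        print(f"Loaded {len(rows)} rows into {TABLE_ID}")

In practice a job like this would normally be scheduled and monitored by an orchestrator such as Apache Airflow or Cloud Composer, as mentioned in the posting, rather than run by hand.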
Location:
København
Job date:
Thu, 26 Sep 2024 06:12:49 GMT
To help us track our recruitment effort, please indicate in your email/cover letter where (vacanciesineu.com) you saw this job posting.