82 Data Integration jobs in Canada
Data Integration Specialist
Posted 1 day ago
Job Description
FirstPrinciples is seeking a skilled and detail-oriented Data Integration Specialist to play a crucial role in our data pipeline development. In this position, you will lead projects to design and implement data extraction processes from various structured and unstructured sources, create robust parsing mechanisms, and develop sophisticated logic to extract meaningful features from raw data. Working in an agile environment, you will iteratively refine extraction methods based on ongoing feedback.
Responsibilities
- Investigate and evaluate new data sources.
- Create comprehensive extraction plans and strategies for each data source.
- Lead the full lifecycle of data extraction projects from planning to implementation.
- Work closely with peers and managers to iterate quickly and refine various approaches.
- Progressively scale extraction processes from small test batches to full implementation.
Data Source Integration
- Develop and maintain parsers for diverse data sources including APIs, databases, web content, PDFs, and scientific literature.
- Create reliable ETL processes to ensure data quality and consistency, including LLM-based extraction pipelines.
- Design and refine prompts for LLMs to extract structured information from unstructured data sources, including text, images, and other multimodal inputs.
- Implement error handling and logging systems to maintain data pipeline reliability.
- Identify and extract valuable features from complex raw data sets.
- Develop logic and algorithms to transform unstructured information into structured, analyzable formats.
- Create reproducible processes for data normalization and standardization.
- Optimize parsing procedures for performance and accuracy.
- Document data lineage and transformation processes for transparency.
- Work closely with cross-functional teams to understand feature requirements.
- Coordinate with the engineering team to integrate data pipelines into broader systems.
- Communicate technical concepts clearly to non-technical stakeholders.
- Engage directly with third-party data vendors to obtain technical specifications and integration details.
- Demonstrate the ability to work effectively both as part of a collaborative team and independently on self-directed tasks.
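Several of these duties (prompt design for structured output, error handling, logging) combine in a single LLM extraction step. As a rough illustration only, not the team's actual stack, such a step might look like the following; `call_llm` and the record fields are hypothetical stand-ins:

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("extraction")

PROMPT_TEMPLATE = (
    "Extract the author, title, and publication year from the text below.\n"
    "Respond with one JSON object with keys 'author', 'title', 'year'.\n\n{text}"
)
REQUIRED_KEYS = {"author", "title", "year"}

def extract_record(text, call_llm):
    # call_llm is a hypothetical callable (prompt -> str) standing in for
    # whichever model client is actually used.
    prompt = PROMPT_TEMPLATE.format(text=text)
    try:
        record = json.loads(call_llm(prompt))
    except json.JSONDecodeError:
        log.warning("model returned non-JSON output; skipping record")
        return None
    if not isinstance(record, dict) or not REQUIRED_KEYS <= record.keys():
        log.warning("extraction result missing required keys; skipping record")
        return None
    return record
```

Validating the model's output and logging rejected records, rather than trusting the response, is what keeps an LLM-based pipeline reliable at scale.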
Qualifications
- Educational Background: Bachelor's degree in computer science, data science, information systems, or a related field.
- Experience: 1-3 years of experience working with data transformation, ETL processes, or similar roles.

Project Management Skills
- Experience managing small to medium-sized data projects from conception to completion.
- Demonstrated ability to create technical plans and roadmaps for data extraction.
- Experience working in agile environments with iterative development cycles.

Technical Skills
- Proficiency in Python and/or similar languages for data processing.
- Experience with data parsing libraries and frameworks.
- Knowledge of data storage systems and formats (SQL, JSON, etc.).
- Familiarity with regular expressions and text processing techniques.
- Experience with prompt engineering for LLMs and AI-assisted data extraction.
- Analytical Skills: Strong problem-solving abilities and attention to detail.
- Communication: Ability to document processes clearly and communicate technical concepts.

Bonus Skills
- Experience with natural language processing.
- Knowledge of scientific literature and research data structures.
- Familiarity with cloud-based data processing.
Application Process
Interested candidates are invited to submit their resume, a cover letter detailing their qualifications and vision for the role, and references. Please include "Data Integration Specialist" in the cover letter.
Data Integration Developer
Posted today
Job Description
Overview
At KPMG, you'll join a team of diverse and dedicated problem solvers, connected by a common cause: turning insight into opportunity for clients and communities around the world.
The role of a Data Integration Developer (Operation Team) is to deliver services and solutions for KPMG Canada internal clients through the Enterprise Analytics CoE using the suite of Microsoft Azure Data Services and data integrations.
What You Will Do
- Develop and implement data integration solutions using Azure services like Azure Data Factory, Logic Apps, and Azure Synapse Analytics.
- Migrate on-premises data to Azure cloud environments, ensuring data integrity and consistency.
- Design, develop, and maintain ETL (Extract, Transform, Load) processes to extract data from various sources, transform it according to business requirements, and load it into data warehouses or data lakes.
- Build and manage data pipelines, ensuring their smooth operation, troubleshooting issues, and optimizing performance.
- Create and maintain data models for structured and unstructured data, ensuring data is accurately represented and accessible.
- Work closely with business analysts, data engineers, and other stakeholders to understand data requirements and deliver appropriate solutions.
- Ensure that all data integration processes comply with organizational security policies and regulatory requirements.
- Automate data integration tasks to improve efficiency and reduce manual intervention.
- Document data integration processes, data flow diagrams, and architecture for future reference and maintenance.
- Continuously learn and apply new features and best practices for Azure data integration services.
- Identify and resolve issues within data integration processes, ensuring high availability and reliability of data flows.
- Optimize data integration workflows for better performance and scalability.
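The ETL duties above reduce to a small extract → transform → load loop. A minimal, self-contained sketch follows; the CSV source and the business rule are invented for illustration, and a real pipeline would read from and write to Azure services rather than in-memory objects:

```python
import csv
import io

def transform(row):
    # Illustrative business rule: normalize customer names and
    # cast dollar amounts to integer cents.
    return {
        "customer": row["customer"].strip().title(),
        "amount_cents": round(float(row["amount"]) * 100),
    }

def run_pipeline(source_csv, sink):
    # Extract rows from a CSV source, transform each one, and load
    # the results into `sink` (a list stands in for a warehouse table).
    for row in csv.DictReader(io.StringIO(source_csv)):
        sink.append(transform(row))
    return sink

raw = "customer,amount\n ada lovelace ,19.99\nalan turing,5.00\n"
table = run_pipeline(raw, [])
# table[0] == {"customer": "Ada Lovelace", "amount_cents": 1999}
```

Keeping the transform a pure function of one row makes the step easy to unit-test before wiring it into an orchestrator such as Azure Data Factory.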
What You Bring To The Role
- Proficiency in designing, developing, and managing data pipelines within Azure Data Factory.
- Familiarity with CI/CD pipelines, version control, and deployment practices using Azure DevOps.
- Skills in monitoring data integration pipelines and optimizing performance to handle large datasets efficiently.
- Experience with ETL tools and processes, including data extraction, transformation, and loading into data warehouses or lakes.
- Understanding of data security practices, including encryption, role-based access control, and compliance with regulatory requirements.
- Ability to create and manage workflows using Azure Logic Apps and develop serverless solutions using Azure Functions.
- Ability to troubleshoot and resolve issues in data integration processes efficiently.
- Experience with data visualization and analytics tools like Power BI or Azure Analysis Services.
- Proficiency in scripting languages like Python, PowerShell, or Scala for automation and data processing tasks.
- Experience with integrating data from APIs and other web services into Azure-based solutions.
- Understanding of Azure storage options like Blob Storage, Data Lake Storage, and Azure Files.
- Strong knowledge of SQL, including complex queries, stored procedures, and database management with platforms like Azure SQL Database and SQL Server.
- Current Microsoft Azure certifications are a plus.
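The SQL expectations above (aggregation, parameterized queries) can be sketched with the standard-library `sqlite3` module; the schema here is invented, but the placeholder-binding pattern carries over unchanged to Azure SQL Database via pyodbc or similar drivers:

```python
import sqlite3

# sqlite3 stands in for Azure SQL Database in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "West", 120.0), (2, "East", 80.0), (3, "West", 40.0)],
)

# Parameters are bound with placeholders, never string concatenation.
rows = conn.execute(
    "SELECT region, SUM(total) FROM orders"
    " WHERE total >= ? GROUP BY region ORDER BY region",
    (50.0,),
).fetchall()
# rows == [('East', 80.0), ('West', 120.0)]
```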
Providing you with the support you need to be at your best
Our Values, The KPMG Way
Integrity, we do what is right | Excellence, we never stop learning and improving | Courage, we think and act boldly | Together, we respect each other and draw strength from our differences | For Better, we do what matters
KPMG in Canada is a proud equal opportunities employer and we are committed to creating a respectful, inclusive and barrier-free workplace that allows all of our people to reach their full potential. A diverse workforce is key to our success and we believe in bringing your whole self to work. We welcome all qualified candidates to apply and hope you will choose KPMG in Canada as your employer of choice.
Adjustments and accommodations throughout the recruitment process
At KPMG, we are committed to fostering an inclusive recruitment process where all candidates can be themselves and excel. We aim to provide a positive experience and are prepared to offer adjustments or accommodations to help you perform at your best. Adjustments (informal requests), such as extra preparation time or the option for micro breaks during interviews, and accommodations (formal requests), such as accessible communication supports or technology aids, are tailored to individual needs and role requirements. You will have an opportunity to request an adjustment or accommodation at any point throughout the recruitment process. If you require support, please contact KPMG's Employee Relations Service team by calling
Data Integration Developer
Posted today
Job Description
Position Details
Position Title: Data Integration Developer
Employment Type: Full-time, Permanent (Vacant – New Role)
Reports to: Director, Business Intelligence
Direct Reports: 0
Salary Range: $70,500 to $85,500 per year with benefits and 3 weeks paid vacation
Work Location: Toronto
Work Environment: UNICEF Canada currently operates under a Flexible-Hybrid model that requires team members to attend the office at least 2 days/week.
The Opportunity
We are seeking a skilled Data Integration Developer to join our team and play a crucial role in designing, developing, and maintaining data integration solutions. The ideal candidate will have hands-on experience with data extraction, configuration, automation, and workflow orchestration.
As a key member of the Business Intelligence team, the Data Integration Developer is responsible for all supporter data created, modified, or moved through automation and integration. The successful candidate will have the opportunity to be part of a great team, to solve interesting and challenging problems, and help shape the ongoing development and implementation of our data strategy.
Key Accountabilities
New Automation and Integration Solutions (45%)
- Design and build ETL/ELT pipelines to process structured and unstructured data, automation, and integration solutions in support of our overall data strategy
- Work independently to understand business needs, meet with stakeholders, test, verify, and document new processes
- Contribute to the ongoing development and evolution of our overall Data Strategy by identifying opportunities for improving efficiency, accuracy, and usefulness of data through automation
Ongoing Management of Automation and Integration Solutions (40%)
- Oversee, maintain, and troubleshoot all existing data integration solutions and pipelines to ensure automated processes and business processes preserve data integrity and meet business goals. These processes include:
- Daily automated BI data processes
- Automated 2-way data integration between Raiser's Edge and our Engaging Networks online fundraising platform
- Daily fundraising data imports into Raiser's Edge through ImportOmatic
- Import and export processes to integrate data with fundraising vendors
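A recurring concern in daily imports like these is idempotency: re-running the same feed must not duplicate supporters. A generic upsert sketch, with invented field names rather than the actual Raiser's Edge or Engaging Networks schema:

```python
def merge_daily_import(existing, incoming, key="supporter_id"):
    # Upsert incoming records into the existing set, keyed on a supporter
    # ID, so re-running a daily file never creates duplicate rows.
    # Field names are illustrative, not any vendor's real schema.
    by_id = {rec[key]: dict(rec) for rec in existing}
    for rec in incoming:
        by_id.setdefault(rec[key], {}).update(rec)
    return list(by_id.values())
```

Keying every merge on a stable supporter identifier is what lets two-way integrations between systems stay consistent when either side re-sends data.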
Vendor Management (10%)
- Oversee ongoing development and maintenance of data