Posted on 7/20/2025

Data, AI and Information Architect

Zurich 56 Company Ltd

Oregon, IL

Full-time


Full Description

Location: Oregon

Zurich North America is currently hiring an AVP - Data, AI and Information Architect to work remotely for our Zurich North America Headquarters office.

The Data & Analytics department is looking for an expert to drive broader data architecture and solution blueprints to modernize the data analytics ecosystem. The Data, AI and Information Architect will be the key contributor to the execution of the CDO Data & AI strategy for the Data Science, Data Governance, Business Intelligence and Data Engineering teams, and will build the architecture, systems and technologies required for a best-in-class Data & AI ecosystem.

This role will work on modernizing the current data landscape to next-generation data & AI architecture and processes that support Business Units and Shared Service Units. The ideal candidate will have experience building complex cloud-native data analytics solutions, leading large-scale data warehouse modernization, and architecting and designing solutions for complex AI projects.

Responsibilities:

• Design and build scalable solutions to ingest, store and process very large amounts of data (structured, semi-structured and unstructured), including streaming and real-time data, into the cloud-based Lakehouse and cloud data warehouse.

• Take ownership of the operationalization of agentic AI flows while designing solutions to optimize data pipelines, model deployment, and model monitoring.

• Architect blueprints for production-grade, scalable, end-to-end Data & AI solutions.

• Lead the integration of LLMs, LangChain, LangGraph, and agent orchestrators into business processes.

• Build enterprise-grade, highly reusable data assets for analytics consumption.

• Partner with analytics experts, data scientists, and Business Unit analysts to build self-service experiences in our data democratization platform.

• Develop innovative ways of combining data sets from various sources to augment our analytical capabilities.

• Conduct POCs and pilots of reusable, scalable data processing, curation, and consumption frameworks to eliminate point solutions.

• Evaluate, analyze, and optimize system resources related to large-scale data processing, image processing, computer vision and deep learning.

• Work closely with the IT and cloud infrastructure teams to deliver technical solutions, including end-to-end agent orchestration, data layering, data/AI governance, and metadata.

Qualifications:

Required

• Bachelor’s Degree in Mathematics/Statistics and 7 or more years of experience in the Predictive Analytics area

OR

• Zurich Certified Insurance Apprentice including an Associate Degree in Mathematics/Statistics and 7 or more years of experience in the Predictive Analytics area

OR

• High School Diploma or Equivalent and 9 or more years of experience in the Predictive Analytics area

• Ability to work in a fast-paced environment as a technical lead

• Strong software standards and engineering practices working within a shared codebase

Preferred

• Experience in building enterprise-grade Data & AI solutions within the commercial insurance or financial services industry.

• Experience with ELT tools (Data Factory, dbt, Fivetran, etc.) and languages such as Python, R, Scala, Java, and SQL.

• Experience with various Microsoft Azure services (including Data Lake, Databricks, Data Factory, Cosmos DB, Power Platform, Copilot Studio, Azure ML, Power Automate, Power BI, etc.).

• Experience with vector stores like Elastic, Pinecone, MongoDB Atlas, or Redis.

• Experience with cloud data warehouse solutions like Snowflake, Fabric, or Redshift

• Exposure to LLM deployments, including end-to-end unstructured data pipeline builds, vector store design, embeddings, and prompt engineering.

• Strong communication skills with the ability to effectively explain complex material.

At Zurich, compensation for roles is influenced by a variety of factors, including but not limited to the specific office location, role, skill set, and level of experience. In compliance with local laws, Zurich commits to providing a fair and reasonable compensation range for each role. For more information about our Total…
