Posted Mar 16, 2026

Data Architect with Data Modelling

Remote, Illinois 60007 · Posted March 11, 2026 · Job Type: Full Time · Job Category: IT

Job Description

Role: Data Architect with Data Modelling
Location: Remote (Full Time only)

Must Have Technical/Functional Skills

• Excellent data modelling skills; dimensional and 3NF modelling required.
• Must have created a conceptual data model showing core entities from CRM, e-commerce, and finance systems, including relationships and the grain of each table.
• Catalog federation or query federation experience.
• Must have implemented streaming pipelines (Auto Loader/CloudFiles, Spark Streaming, Event Hubs/Kafka).
• Strong experience with Delta tables in Databricks.
• Extensive experience translating business requirements into data design requirements.
• Extensive data flow documentation experience.
• Azure experience: ADLS, Azure SQL.
• Snowflake or Databricks design experience.
• Excellent communication skills for both technical and business audiences.
• Excellent documentation skills.
• Eight years' experience as a data architect.
• Must have implemented PCI controls: tokenization, data masking, and row-, column-, and object-level security.
• Two years' experience working in Databricks; Databricks certification a plus.
• Two years' experience in Snowflake; Snowflake certifications a plus.
• Two years' experience building enterprise data warehouses in Azure.
• Azure Solution Architect certification preferred.
• Azure database certifications preferred.
• Azure AI certifications preferred.

Roles & Responsibilities

The responsibilities mirror the required skills above: data modelling (dimensional and 3NF), conceptual modelling across CRM, e-commerce, and finance systems, catalog/query federation, streaming implementation, Delta tables in Databricks, translating business requirements into data designs, data flow documentation, and PCI/tokenization/data masking with row-, column-, and object-level security.

Required Skills: Performance Architect