Data Platform Architecture Jobs in the USA

33,402 positions found

System Administrator - Microsoft Purview (Data Catalog & Governance)
Salary not disclosed
Raleigh, NC 2 days ago
Role: System Administrator - Microsoft Purview (Data Catalog & Governance)

Location: 100% Remote

Duration: 12+ Months

Overview:

We are seeking an experienced administrator to operate and support the enterprise implementation of Microsoft Purview Data Catalog across a complex, multi-platform data environment. The administrator will be responsible for the day-to-day configuration, monitoring, and maintenance of Purview capabilities, ensuring reliable metadata ingestion, catalog quality, lineage visibility, and compliance alignment across governed data domains.

This role focuses on platform operations and governance execution, working within established architecture and enterprise governance standards.

Key Responsibilities

Platform Administration & Operations:


  • Administer and operate Microsoft Purview Data Map and Data Catalog environments.
  • Monitor platform health, scan execution, metadata ingestion, and lineage availability.
  • Troubleshoot and resolve catalog, scan, and connectivity issues.
  • Perform routine maintenance, configuration updates, and service optimizations.
  • Coordinate incident resolution with internal engineering teams and Microsoft support as required.

Data Source Management & Scanning:


  • Register, configure, and maintain data sources across Azure, M365, on-prem, and approved third-party platforms.
  • Configure and schedule metadata scans for supported sources.
  • Manage authentication for scans using managed identities, service principals, and Key Vault secrets.
  • Monitor scan performance, failures, and coverage; take corrective action as needed.
  • Optimize scan frequency and scope to balance cost, performance, and governance coverage.
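A policy like the scan-frequency bullet above can be sketched as a small scoring function. This is an illustrative heuristic only, not a Purview feature; the criticality tiers, churn thresholds, and intervals are assumptions:

```python
from dataclasses import dataclass

@dataclass
class DataSource:
    name: str
    criticality: str       # assumed tiers: "high" | "medium" | "low"
    changes_per_week: int  # observed metadata churn

def recommend_scan_interval_days(source: DataSource) -> int:
    """Suggest a scan interval that balances governance coverage against scan cost."""
    if source.criticality == "high" or source.changes_per_week >= 7:
        return 1   # daily scans for volatile or regulated sources
    if source.criticality == "medium" or source.changes_per_week >= 1:
        return 7   # weekly is usually enough for moderate churn
    return 30      # monthly sweep for static, low-risk sources

# Example: a busy regulated warehouse vs. a rarely-touched archive
print(recommend_scan_interval_days(DataSource("edw", "high", 10)))    # 1
print(recommend_scan_interval_days(DataSource("archive", "low", 0)))  # 30
```

In practice the interval would then be applied through Purview's scan scheduling rather than computed ad hoc.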

Catalog Configuration & Metadata Management:


  • Maintain and enforce enterprise metadata standards within the Purview Catalog.
  • Manage business metadata, classifications, glossary terms, and custom attributes.
  • Ensure metadata accuracy, completeness, and consistency across data assets.
  • Support curation activities including asset certification and publishing.
  • Resolve duplicate, incomplete, or stale catalog entries.
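Resolving duplicate or stale entries, as in the last bullet, boils down to grouping assets by qualified name and flagging old scans. A minimal sketch, assuming a hypothetical list of catalog records (real entries would come from the Purview Data Map API):

```python
from datetime import datetime, timedelta

# Hypothetical catalog records; names and dates are invented for illustration.
entries = [
    {"qualified_name": "mssql://srv/db/dbo.customers", "last_scanned": datetime(2024, 1, 5)},
    {"qualified_name": "mssql://srv/db/dbo.customers", "last_scanned": datetime(2024, 6, 1)},
    {"qualified_name": "mssql://srv/db/dbo.orders",    "last_scanned": datetime(2023, 2, 1)},
]

def curate(entries, now, stale_after=timedelta(days=365)):
    """Keep the most recently scanned entry per qualified name; flag stale ones."""
    latest = {}
    for e in entries:
        key = e["qualified_name"]
        if key not in latest or e["last_scanned"] > latest[key]["last_scanned"]:
            latest[key] = e
    keep = list(latest.values())
    stale = [e for e in keep if now - e["last_scanned"] > stale_after]
    return keep, stale

keep, stale = curate(entries, now=datetime(2024, 7, 1))
print(len(keep), len(stale))  # 2 unique assets, 1 of them stale
```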

Lineage & Discovery Enablement:


  • Enable and validate data lineage ingestion from supported data platforms.
  • Monitor lineage completeness and visibility for critical data assets.
  • Assist data consumers and stewards with lineage-based impact analysis.
  • Escalate lineage gaps or tool limitations requiring architectural or engineering remediation.

Security, Access & Governance Controls:


  • Configure and manage Purview role-based access control (RBAC) within collections.
  • Provision and maintain access for administrators, data curators, and data stewards.
  • Enforce domain-based access controls and separation of duties.
  • Integrate Purview access with Microsoft Entra ID.
  • Support sensitivity labels and classification alignment with Microsoft Information Protection.
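Collection-scoped RBAC like the above can be pictured as a lookup from (collection, role) to principals. A toy sketch with invented users and collections; Purview itself manages these assignments through collection role assignments such as Data Curator and Data Reader:

```python
# Hypothetical role assignments, keyed by (collection, role).
ROLE_ASSIGNMENTS = {
    ("finance", "data_curator"): {"alice"},
    ("finance", "data_reader"):  {"bob", "carol"},
}

def has_role(user: str, collection: str, role: str) -> bool:
    """Check whether a user holds a given role on a given collection."""
    return user in ROLE_ASSIGNMENTS.get((collection, role), set())

print(has_role("alice", "finance", "data_curator"))  # True
print(has_role("bob", "finance", "data_curator"))    # False
```

Separation of duties then becomes a constraint check over the same structure, e.g. asserting no user appears in conflicting roles.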

Compliance & Risk Support:


  • Support automated discovery of sensitive data (PII, PCI, PHI).
  • Assist risk, audit, and compliance teams with catalog evidence and reporting.
  • Validate scan coverage for regulated data domains.
  • Support regulatory and audit initiatives (SOX, GLBA, NYDFS, GDPR, etc.).
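Automated discovery of sensitive data typically rests on pattern-based classifiers. Purview ships built-in classifiers for this; the sketch below only illustrates the underlying idea with simplified, assumed regular expressions:

```python
import re

# Illustrative rules only; real classifiers are far more robust
# (checksums, context keywords, confidence scoring).
CLASSIFIERS = {
    "US_SSN":        re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL_ADDRESS": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "CREDIT_CARD":   re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def classify(sample_values):
    """Return the set of labels whose pattern matches any sampled value."""
    hits = set()
    for value in sample_values:
        for label, pattern in CLASSIFIERS.items():
            if pattern.search(value):
                hits.add(label)
    return hits

print(sorted(classify(["jane@example.com", "123-45-6789"])))  # ['EMAIL_ADDRESS', 'US_SSN']
```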

User Support & Enablement:


  • Provide operational support to data producers, consumers, and data stewards.
  • Respond to access requests, catalog issues, and usage questions.
  • Maintain operational documentation, runbooks, and standard operating procedures.
  • Support onboarding of new data domains following established governance patterns.
  • Assist with training and adoption initiatives led by governance or architecture teams.


Required Qualifications:


  • 5+ years' experience supporting enterprise data platforms or governance tools, and 4+ years of hands-on Microsoft Purview experience at enterprise scale.
  • Hands-on experience administering Microsoft Purview Data Catalog.
  • Strong understanding of metadata management, data classification, and lineage concepts.
  • Working knowledge of Azure data services and enterprise data ecosystems.
  • Experience managing access controls and identities using Microsoft Entra ID.
  • Familiarity with regulated data environments and compliance requirements.
  • Strong troubleshooting, operational support, and documentation skills.


Preferred Qualifications:


  • Experience supporting Purview integrations with Synapse, Fabric, Databricks, Snowflake, or SQL Server.
  • Exposure to financial services or other regulated industries.
  • Experience with PowerShell, REST APIs, or basic automation for operational tasks.
  • Prior experience supporting enterprise data governance or stewardship programs.
Marketplace Platform Lead
🏢 Spectraforce Technologies
Salary not disclosed
Newark, NJ 1 day ago
Job role: Marketplace Platform Lead

Location: Remote

Duration: 8+ months

Marketplace Platform Lead

Job Overview


The Marketplace Platform Lead is responsible for driving the end-to-end technical architecture and implementation of the enterprise Data Marketplace platform. This role spans stakeholder engagement, architectural definition, integration design, and hands-on leadership throughout implementation. The ideal candidate is a seasoned technical leader with deep experience designing integration patterns, building scalable platforms, and guiding engineering teams through complex cross-system solutions.

Key Responsibilities

Lead stakeholder meetings to gather business requirements, align on platform objectives, and clarify workflows and user journeys.

Conduct tool evaluations, build scoring frameworks, and make recommendations on platforms, vendors, and integration technologies.

Define end-to-end Marketplace architecture, including data flows, APIs, domain models, integration strategies, and platform components.

Design and lead the implementation of integration patterns, including API-based integrations, event-driven patterns, workflow orchestration, and cross-system interoperability.
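The event-driven integration pattern named above can be illustrated with a minimal in-process publish/subscribe bus. This is a sketch only; the topic and payload names are invented, and a production Marketplace would use a broker such as Kafka or Azure Service Bus:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process pub/sub bus illustrating event-driven integration."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Decoupled: the publisher knows the topic, not the consumers.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []
bus.subscribe("data_product.published", received.append)  # hypothetical topic name
bus.publish("data_product.published", {"name": "customer_360", "version": "1.2.0"})
print(received)  # [{'name': 'customer_360', 'version': '1.2.0'}]
```

The design point is the same at broker scale: producers and consumers evolve independently as long as the event contract holds.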

Develop technical designs, architectural documents, and standards for Marketplace workflows, user flows, and extensibility patterns.

Provide hands-on architectural guidance to engineering teams throughout solution design, development, and delivery.

Oversee technical quality, scalability, performance, and security across Marketplace components and integrations.

Collaborate with product, engineering, data, and security teams to ensure compliance with enterprise data governance, privacy, and reliability standards.

Lead technical reviews, drive design decisions, and ensure alignment across cross-functional stakeholders.

Required Skills & Qualifications

8+ years of experience in software engineering, platform development, or technical architecture roles.

Strong expertise in designing and implementing integration architectures, including REST/GraphQL APIs, event-driven patterns, synchronous/asynchronous messaging, and workflow engines.

Deep understanding of distributed systems, microservices, and cloud-native solutions (Azure, AWS, or GCP).

Proficiency with API design, messaging systems, and enterprise integration frameworks.

Experience defining technical architecture, data flows, and workflow designs for complex platforms.

Ability to translate business requirements into technical designs, user flows, and actionable engineering plans.

Demonstrated leadership in guiding engineering teams through architectural decisions and implementation.

Strong communication skills with the ability to influence technical and non-technical partners.

Experience evaluating and scoring platforms, tools, or vendor solutions.

Solid knowledge of DevOps practices, CI/CD, infrastructure-as-code, observability, and security best practices.

Preferred Qualifications

Experience building or leading a Data Marketplace platform.

Familiarity with workflow orchestration platforms, rules engines, BPM tools, or catalog management systems.

Experience with enterprise identity systems (OAuth, SAML, SSO), access governance, and data privacy frameworks.

Background working with enterprise data platforms, data governance, or cross-domain integration patterns.

Prior experience leading architectural governance or serving as a platform architect in an enterprise environment.

Enterprise Architecture AI Architect
Salary not disclosed
Hartford, CT 1 day ago
AI Architect Mastery, Enterprise Architecture, ETX
(Full-Time, Location: Boston, Springfield, New York)

The AI Architect will evaluate and benchmark AI capabilities, define implementation patterns for AI services, and enable MassMutual to make impactful, long-term decisions around AI-enabled business capabilities. This includes developing strategies, reference architectures, roadmaps, and patterns to drive responsible AI adoption and research & development. As an AI Architect, you will be part of MassMutual's team focused on aligning AI technology strategy with business strategy and goals. You will guide the planning and design of AI capabilities to maximize their value in a secure and responsible manner, using architecture strategies, blueprints, and roadmaps to standardize technology stacks and enhance AI engineering speed and agility.

The MassMutual Enterprise Architecture team in the Enterprise Technology and Experience organization is composed of Business, Application, Infrastructure, Data, and Security architecture domains. The AI Architect will join this team to drive digital innovation and create a competitive advantage for MassMutual.  

The AI Architect will play a critical role in designing AI solutions that enhance MassMutual's digital capabilities and customer engagement. This role involves working closely with various domains of Enterprise Architecture (EA), Research & Development, business stakeholders, IT teams, Cyber, Governance, and Privacy partners to design and implement AI solutions that align with business outcomes and architectural requirements.

  • Build enterprise AI architecture strategy and roadmaps.
  • Develop AI reference architectures and guidance.
  • Maintain and publish AI standards and patterns covering AI technology, development, security, privacy, and observability.
  • Build AI platform architecture and integration patterns; stay abreast of emerging AI technologies and integrate them into the AI architecture as needed.
  • Consult on AI capabilities for business and technology platforms, and ensure alignment between AI architecture frameworks and standards and the overall business strategy.
  • Evaluate and lead AI architecture deliverables, perform capability assessments, and support technical evaluations for closing gaps.
  • Actively publish deliverables and use multimedia to educate and engage with federated solution architecture community members.
  • Partner with solution architects to document design decisions and solution architecture.
  • Partner with technology leaders, business, and governance partners to identify AI risks and process issues, then provide enterprise patterns to resolve them.
  • Provide recommendations on problem solving, solution options, risks, cost/benefit analysis, and impact on cross-domain systems, business strategy, goals, and processes.
  • Partner with extended enterprise architecture, enterprise cyber security, compliance, governance, privacy, business, and IT support teams to communicate and collaborate on architecture strategies, standards, and guidance.
  • Partner with the AI Governance, Security, and Privacy teams to support secure, ethical, and responsible use of AI.
  • Stay abreast of current and emerging AI threats and design the AI architecture to mitigate them; design capabilities for AI governance and observability.
  • Achieve AI architecture compliance with requirements including, but not limited to, the Colorado AI Act, global data privacy requirements, and state and federal regulations.

  • Bachelor's degree in computer science, IT, systems analysis, or a related technical field.
  • 8+ years of IT design and implementation experience with a minimum of two of the following (or similar) technical disciplines: AI/ML frameworks, AI/ML development and coding environments, cloud platforms, and big data technologies.
  • 5+ years of involvement with solution architecture development and delivery.
  • 5+ years of experience developing and interpreting business architecture.
  • 5+ years of experience building solutions on complex cloud-native products, applications, and platforms.
  • 5+ years of experience in a technology advisory role, working directly with AI, machine learning technologies, statistical systems, or big data platforms.

  • Experience working with AI governance and privacy requirements such as the Colorado Artificial Intelligence Act (CAIA).
  • Experience supporting agile teams by providing guidance on design, opportunities, impact, and risks, taking account of technical and architectural debt.
  • Preferred: experience designing GenAI solutions, with exposure to MLOps, LLMs, Amazon Bedrock, Amazon Q, OpenAI, Copilot Studio, LangChain, Data Foundry, and Data Fabric.
  • Trusted and respected as a thought leader who can influence and persuade business and IT leaders and IT development teams.
  • Technology-neutral: remains unbiased toward any specific technology or vendor choice and is more interested in results than personal preferences.
  • Ability to balance the long-term (big picture) and short-term implications of individual decisions.
If you need an accommodation to complete the application process, please contact us and share the specifics of the assistance you need.
Job type: Permanent
Sr Platform Architect
Salary not disclosed
Dunwoody, GA 1 day ago

Senior Platform Architect

Reports To: Director of Engineering

Department: Engineering

Location: Hybrid - Atlanta, GA


What makes MTech different:


Purpose-Driven Work – Build technology that solves real problems for the world

Casual & Collaborative – No corporate bureaucracy, direct access to senior leadership

Innovation-Focused – Healthy innovation pipeline expanding into new segments and technologies

Transparent & Data-Driven – Clear metrics, objectives, and visibility into company performance

Modern Development – Robust development tools, training programs, and technical excellence

Flexibility & Balance – Flexible work environment that values results over presenteeism



Job Summary

The Senior Platform Architect will lead the technical architecture, design, and modernization of large-scale, multi-tenant enterprise SaaS platforms built on Azure and the .NET stack. This role requires mastery of distributed systems, cloud-native design, and advanced engineering practices to deliver highly available, performant, and secure solutions for global consumer-facing SaaS and Agentic AI products.


Responsibilities and Duties


Architectural Design & Transformation

  • Lead migration from monolithic systems to modular monolith and microservices architectures using domain-driven design, bounded contexts, and decomposition strategies.
  • Design multi-tenant SaaS platforms with advanced tenant isolation, resource partitioning, and elastic scaling using Azure services.
  • Define and enforce architectural standards for .NET (C#), TypeScript, Angular, SQL Server, and Azure, including dependency injection, SOLID principles, asynchronous programming, and reactive patterns.
  • Design and implement distributed systems: service orchestration, API gateway management, IoT, edge computing, distributed transactions, eventual consistency, CQRS, and event sourcing.
  • Architect for cloud-native resiliency: circuit breakers, bulkheads, retries, failover, geo-redundancy, and disaster recovery using Azure App Services, Azure Functions, Service Bus, Cosmos DB, and Azure SQL.
  • Develop and maintain architecture documentation, reference models, and decision records using industry frameworks (TOGAF, Zachman, C4 Model).
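The circuit-breaker pattern from the resiliency bullet above can be sketched in a few lines. This is illustrative only; in a .NET stack one would typically reach for a resilience library such as Polly, and the thresholds here are arbitrary:

```python
import time

class CircuitBreaker:
    """After `max_failures` consecutive failures the circuit opens and calls
    fail fast until `reset_after` seconds pass (then one trial call is allowed)."""
    def __init__(self, max_failures=3, reset_after=30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, fn, *args):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None  # half-open: allow one trial call
        try:
            result = fn(*args)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0
        return result

# Simulate a downstream dependency that keeps failing.
breaker = CircuitBreaker(max_failures=2, reset_after=60.0)

def flaky():
    raise ConnectionError("downstream timeout")

for _ in range(2):
    try:
        breaker.call(flaky)
    except ConnectionError:
        pass

try:
    breaker.call(flaky)
except RuntimeError as exc:
    print(exc)  # circuit open: failing fast
```

Failing fast protects callers from piling up timeouts while the dependency recovers, which is the point of the pattern.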


Performance Engineering & Observability

  • Establish and monitor platform SLOs (latency, throughput, error rates, availability) mapped to customer SLAs.
  • Architect and implement advanced caching strategies, indexing, and query optimization for SQL Server and NoSQL stores in coordination with Senior Data Architect, Data Engineers, and Database Admins.
  • Design and implement telemetry pipelines: distributed tracing (OpenTelemetry), structured logging, metrics collection, and real-time dashboards for system health and diagnostics.
  • Conduct performance profiling, load testing, and capacity planning for backend services and frontend applications.
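Mapping an availability SLO to an error budget, as in the first bullet above, is simple arithmetic: a 99.9% SLO allows 0.1% of requests to fail. A small sketch with assumed numbers:

```python
def error_budget_remaining(slo_availability, total_requests, failed_requests):
    """Fraction of the error budget implied by an availability SLO that is left."""
    allowed_failures = (1.0 - slo_availability) * total_requests
    if allowed_failures == 0:
        return 0.0
    return 1.0 - failed_requests / allowed_failures

# A 99.9% SLO over 1,000,000 requests allows ~1,000 failures;
# 250 observed failures leaves 75% of the budget.
print(round(error_budget_remaining(0.999, 1_000_000, 250), 6))  # 0.75
```

Dashboards built on the telemetry pipeline would track this remaining budget per service to decide when to slow releases.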


Automation, Quality, and DevOps

  • Architect and implement CI/CD pipelines with automated build, test, security scanning, and deployment workflows.
  • Integrate static code analysis, code coverage, and quality gates into the development lifecycle.
  • Design and enforce automated testing strategies: unit, integration, contract, and end-to-end tests for backend and frontend components.
  • Develop infrastructure as code (IaC) solutions for repeatable, scalable cloud provisioning.
  • Create incident response playbooks for rollback, failover, and recovery; drive down MTTR and automate remediation where possible.


Security, Compliance, and Governance

  • Architect for multi-tenant security: authentication/authorization (OAuth2, OpenID Connect), encryption at rest and in transit, secrets management, and compliance with SOC 1, SOC 2, GDPR, and other regulatory standards.
  • Implement secure software development lifecycle (SSDLC) practices, threat modeling, and vulnerability management, including zero data retention (ZDR), data loss prevention (DLP), and no-model-training policies for AI models.
  • Ensure architectural governance and alignment with enterprise frameworks (TOGAF, Zachman), maintain architecture decision records, and participate in architecture review boards.


Technical Leadership & Collaboration

  • Mentor engineering teams in advanced architectural concepts, distributed systems, cloud-native development, and best practices.
  • Collaborate with Data Architect, DevOps, IT Services, Engineering and Product Management teams to ensure platform extensibility, integration, and support for complex business requirements.
  • Evaluate and integrate AI/ML services, advanced analytics, and developer productivity tools to enhance platform capabilities.
  • Champion a culture of technical excellence, continuous improvement, and innovation.


Required Experience & Skills

  • 10+ years in software/platform engineering, with at least 8 years in platform architecture for enterprise SaaS on the Azure and .NET tech stack.
  • Proven experience architecting and delivering large-scale, multi-tenant SaaS platforms for global consumer-facing products.
  • Deep expertise in .NET (C#), Azure cloud services (App Services, Functions, Service Bus, Cosmos DB, SQL Server), Azure OpenAI, Microsoft Agent Framework, TypeScript, Angular, CI/CD, automated testing, and observability.
  • Mastery of distributed systems, cloud-native patterns, event-driven architectures, and microservices.
  • Demonstrated success in technical debt reduction, performance engineering, and architectural modernization.
  • Experience with architectural frameworks (TOGAF, Zachman, C4 Model), architectural governance, and compliance.
  • Strong understanding of platform security, regulatory compliance, and multi-tenant SaaS challenges.


Success Metrics (First 12 Months)

  • Reduction in platform-related incidents/support tickets.
  • Improvement in deployment speed and release velocity.
  • Reduction in MTTR for platform incidents.
  • Achievement of modularization milestones (monolith decomposition, service rollout, platform observability in production).
  • Increase in automated test coverage, code quality, and system performance metrics.


Preferred Skills & Certifications

  • TOGAF, Zachman, or similar architecture certification.
  • Advanced knowledge of event sourcing, CQRS, service mesh, and cloud-native security.
  • Familiarity with semantic technologies, knowledge graphs, and AI/ML integration.
  • Hands-on experience with infrastructure as code, automated testing tools, and modern DevOps practices.
  • Strong background in platform security, compliance, and multi-tenant SaaS challenges.


EEO Statement

Integrated into our shared values is MTech's commitment to diversity and equal employment opportunity. All qualified applicants will receive consideration for employment without regard to sex, age, race, color, creed, religion, national origin, disability, sexual orientation, gender identity, veteran status, military service, genetic information, or any other characteristic or conduct protected by law. MTech aims to maintain a global inclusive workplace where every person is regarded fairly, appreciated for their uniqueness, advanced according to their accomplishments, and encouraged to fulfill their highest potential.

Data Product Engineer
🏢 Spectraforce Technologies
Salary not disclosed
Newark, NJ 2 days ago
Job Title: Marketplace Data Product Engineer

Duration: 6+ months

Location: 100% Remote

Job Overview

The Marketplace Data Product Engineer serves as the primary technical facilitator and adoption champion for the Marketplace platform. This role bridges engineering, product, and business domains - leading workshops, demos, onboarding sessions, and cross-domain engagements to accelerate Marketplace adoption. You will configure demo environments, support development, translate complex technical concepts for business audiences, gather product feedback, and partner closely with product and engineering teams to shape the Marketplace roadmap. This role will guide domains through the process of understanding, showcasing, and maturing their data products within the ecosystem.

Key Responsibilities


  • Facilitate workshops, demos, onboarding sessions, and cross-domain engagements to drive Marketplace adoption.
  • Serve as the primary technical presenter of the Marketplace for domain teams and stakeholders.
  • Engage with domain owners to understand their data products, help refine their articulation, and showcase how they integrate into the Marketplace ecosystem.
  • Configure and maintain demo environments for Marketplace capabilities, data products, and new features.
  • Support light development, proof-of-concept configurations, and sample integrations to demonstrate platform capabilities.
  • Translate technical Marketplace concepts into clear, business-friendly language for non-technical audiences.
  • Collect structured feedback from domain teams, synthesize insights, and partner with product and engineering to influence the roadmap.
  • Develop and refine training materials, demos, playbooks, and onboarding assets to support continuous adoption.
  • Act as an advocate for domains, ensuring their data product needs and challenges are well represented in Marketplace planning.
  • Support ongoing adoption initiatives, including community sessions, office hours, and cross-domain knowledge sharing.


Required Skills & Qualifications


  • 4-7+ years of experience in data engineering, platform engineering, solution engineering, technical consulting, or similar roles.
  • Strong understanding of data products, data modeling concepts, data APIs, enterprise integrations, and metadata-driven architectures.
  • Ability to configure and demonstrate platform features, build light proofs-of-concept, and support technical onboarding.
  • Excellent communication and presentation skills, with experience translating technical concepts for business partners.
  • Experience facilitating workshops, leading demos, or driving customer/product adoption initiatives.
  • Ability to engage domain teams, understand their data product needs, and help articulate value within a larger ecosystem.
  • Strong collaboration and stakeholder management skills across engineering, product, and business teams.
  • Comfortable working in fast-moving environments and driving clarity through ambiguity.


Preferred Qualifications


  • Experience with data product and governance frameworks, data marketplaces, data mesh concepts, or platform adoption roles.
  • Hands-on experience with cloud data platforms (Azure, AWS, or GCP), data pipelines, or integration tooling.
  • Familiarity with REST/GraphQL APIs, event-driven patterns, and data ingestion workflows.
  • Background in solution architecture, customer engineering, or sales engineering.
  • Experience developing demo environments, sample apps, or repeatable platform enablement assets.
  • Strong storytelling ability when explaining data product value, domain capabilities, and Marketplace patterns.


Data Architect - Power & Utilities - Senior Manager- Consulting - Location OPEN
$250 +
San Francisco, CA 2 days ago

Location: Anywhere in Country


At EY, we’re all in to shape your future with confidence.


We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.


AI & Data - Data Architecture – Senior Manager – Power & Utilities Sector

EY is seeking a motivated professional with solid experience in the utilities sector to serve as a Senior Manager. The ideal candidate has a robust background in data architecture, data modernization, end-to-end data capabilities, AI, Gen AI, and Agentic AI, preferably with a power systems / electrical engineering background and a track record of delivering business use cases in Transmission, Distribution, Generation, or Customer. The ideal candidate will also have a history of working for consulting companies and be well-versed in the fast-paced culture of consulting work. This role is dedicated to the utilities sector, where the successful candidate will craft, deploy, and maintain large-scale, AI-ready data architectures.


The opportunity

You will help our clients enable better business outcomes while working in the rapidly growing Power & Utilities sector. You will have the opportunity to lead and develop your skill set to keep up with the ever-growing demands of the modern data platform. During implementation you will solve complex analytical problems to bring data to insights and enable the use of ML and AI at scale for your clients. This is a high growth area and a high visibility role with plenty of opportunities to enhance your skillset and build your career.


As a Senior Manager in Data Architecture, you will have the opportunity to lead transformative technology projects and programs that align with our organizational strategy to achieve impactful outcomes. You will provide assurance to leadership by managing timelines, costs, and quality, and lead both technical and non-technical project teams in the development and implementation of cutting-edge technology solutions and infrastructure. You will have the opportunity to be face to face with external clients and build new and existing relationships in the sector. Your specialized knowledge in project and program delivery methods, including Agile and Waterfall, will be instrumental in coaching others and proposing solutions to technical constraints.


Your key responsibilities

In this pivotal role, you will be responsible for the effective management and delivery of one or more processes, solutions, and projects, with a focus on quality and effective risk management. You will drive continuous process improvement and identify innovative solutions through research, analysis, and best practices. Managing professional employees or supervising team members to deliver complex technical initiatives, you will apply your depth of expertise to guide others and interpret internal/external issues to recommend quality solutions. Your responsibilities will include:


As Data Architect – Senior Manager, you will have an expert understanding of data architecture and data engineering and will be focused on problem-solving to design, architect, and present findings and solutions, leading more junior team members, and working with a wide variety of clients to sell and lead delivery of technology consulting services. You will be the go-to resource for understanding our clients’ problems and responding with appropriate methodologies and solutions anchored around data architectures, platforms, and technologies. You are responsible for helping to win new business for EY. You are a trusted advisor with a broad understanding of digital transformation initiatives, the analytic technology landscape, industry trends and client motivations. You are also a charismatic communicator and thought leader, capable of going toe-to-toe with the C-level in our clients and prospects and willing and able to constructively challenge them.


Skills and attributes for success

To thrive in this role, you will need a combination of technical and business skills that will make a significant impact. Your skills will include:



Technical skills:

  • Applications Integration
  • Cloud Computing and Cloud Computing Architecture
  • Data Architecture Design and Modelling
  • Data Integration and Data Quality
  • AI/Agentic AI driven data operations
  • Experience delivering business use cases in Transmission / Distribution / Generation / Customer.
  • Strong relationship management and business development skills.
  • Become a trusted advisor to your clients’ senior decision makers and internal EY teams by establishing credibility and expertise in both data strategy in general and in the use of analytic technology solutions to solve business problems.
  • Engage with senior business leaders to understand and shape their goals and objectives and their corresponding information needs and analytic requirements.
  • Collaborate with cross-functional teams (Data Scientists, Business Analysts, and IT teams) to define data requirements, design solutions, and implement data strategies that align with our clients’ objectives.
  • Organize and lead workshops and design sessions with stakeholders, including clients, team members, and cross-functional partners, to capture requirements, understand use cases, personas, key business processes, brainstorm solutions, and align on data architecture strategies and projects.
  • Lead the design and implementation of modern data architectures, supporting transactional, operational, analytical, and AI solutions.
  • Direct and mentor global data architecture and engineering teams, fostering a culture of innovation, collaboration, and continuous improvement.
  • Establish data governance policies and practices, including data security, quality, and lifecycle management.
  • Stay abreast of industry trends and emerging technologies in data architecture and management, recommending innovations and improvements to enhance our capabilities.

To qualify for the role, you must have

  • A Bachelor's degree in a STEM field.
  • 12+ years professional consulting experience in industry or in technology consulting.
  • 12+ years hands-on experience in architecting, designing, delivering or optimizing data lake solutions.
  • 5+ years’ experience with native cloud products and services such as Azure or GCP.
  • 8+ years of experience mentoring and leading teams of data architects and data engineers, fostering a culture of innovation and professional development.
  • In-depth knowledge of data architecture principles and best practices, including data modeling, data warehousing, data lakes, and data integration.
  • Demonstrated experience leading large data engineering teams to design and build platforms with complex architectures and diverse features, including various data flow patterns, relational and NoSQL databases, production-grade performance, and delivery to downstream use cases and applications.
  • Hands-on experience designing end-to-end architectures and pipelines that collect, process, and deliver data to its destination efficiently and reliably.
  • Proficiency in data modeling techniques and the ability to choose appropriate architectural design patterns, including Data Fabrics, Data Mesh, Lakehouses, or Delta Lakes.
  • Experience managing complex data analysis, migration, and integration of enterprise solutions to modern platforms, including code efficiency and performance optimization.
  • Previous hands‑on coding skills in languages commonly used in data engineering, such as Python, Java, or Scala.
  • Ability to design data solutions that can scale horizontally and vertically while optimizing performance.
  • Experience with containerization technologies like Docker and container orchestration platforms like Kubernetes for managing data workloads.
  • Experience in version control systems (e.g. Git) and knowledge of DevOps practices for automating data engineering workflows (DataOps).
  • Practical understanding of data encryption, access control, and security best practices to protect sensitive data.
  • Experience leading Infrastructure and Security engineers and architects in overall platform build.
  • Excellent leadership, communication, and project management skills.
  • Data Security and Database Management
  • Enterprise Data Management and Metadata Management
  • Ontology Design and Systems Design

Ideally, you’ll also have

  • A Master’s degree in Electrical/Power Systems Engineering, Computer Science, Statistics, Applied Mathematics, Data Science, or Machine Learning, or commensurate professional experience.
  • Experience working at a Big 4 firm or a major utility.
  • Experience with cloud data platforms like Databricks.
  • Experience in leading and influencing teams, with a focus on mentorship and professional development.
  • A passion for innovation and the strategic application of emerging technologies to solve real-world challenges.
  • The ability to foster an inclusive environment that values diverse perspectives and empowers team members.
  • Building and Managing Relationships
  • Client Trust and Value and Commercial Astuteness
  • Communicating With Impact and Digital Fluency

What we look for

We are looking for top performers who demonstrate a blend of technical expertise and business acumen, with the ability to build strong client relationships and lead teams through change. Emotional agility and hybrid collaboration skills are key to success in this dynamic role.


FY26NATAID


What we offer you

At EY, we’ll develop you with future-focused skills and equip you with world-class experiences. We’ll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.



  • We offer a comprehensive compensation and benefits package where you’ll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $144,000 to $329,100. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $172,800 to $374,000. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
  • Join us in our team‑led and leader‑enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
  • Under our flexible vacation policy, you’ll decide how much vacation time you need based on your own personal circumstances. You’ll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well‑being.

Are you ready to shape your future with confidence? Apply today.

EY accepts applications for this position on an on‑going basis.


For those living in California, please click here for additional information.


EY focuses on high‑ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.


EY | Building a better working world

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.


Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.


EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.


EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.


EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY’s Talent Shared Services Team (TSS) or email the TSS at .


#J-18808-Ljbffr
Not Specified
Databricks Architect/ Senior Data Engineer
✦ New
🏢 OZ
Salary not disclosed
Boca Raton, FL 1 day ago

OZ – Databricks Architect/ Senior Data Engineer


Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.


We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!


What We're Looking For:

We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.


This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.


Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.


Position Overview:

The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.


This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.


Key Responsibilities:

  • Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
  • Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
  • DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
  • Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
  • Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
  • GenAI Applications Development: Experience in GenAI application development is a strong plus.
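The Medallion Architecture named in the overview (Bronze raw, Silver cleansed, Gold curated) can be sketched framework-agnostically. The following is a minimal pure-Python illustration under invented assumptions (field names, cleansing rules, and sample data are not from the posting); a real implementation would use PySpark against Delta tables:

```python
# Framework-agnostic sketch of the Bronze -> Silver -> Gold flow.
# Field names ("order_id", "amount") and rules are illustrative assumptions.

def to_silver(bronze_rows):
    """Silver layer: cleanse and deduplicate raw (Bronze) records."""
    seen, silver = set(), []
    for row in bronze_rows:
        key = row.get("order_id")
        if key is None or key in seen:
            continue  # drop malformed or duplicate records
        seen.add(key)
        silver.append({"order_id": key, "amount": float(row["amount"])})
    return silver

def to_gold(silver_rows):
    """Gold layer: a curated aggregate for analytics and reporting."""
    return {"order_count": len(silver_rows),
            "total_amount": sum(r["amount"] for r in silver_rows)}

bronze = [
    {"order_id": 1, "amount": "19.99"},
    {"order_id": 1, "amount": "19.99"},   # duplicate ingest
    {"order_id": 2, "amount": "5.00"},
    {"amount": "3.50"},                   # malformed: missing key
]
gold = to_gold(to_silver(bronze))
```

In Databricks terms, each function body would become a transformation between Delta tables, with the same contract: Bronze preserves everything as ingested, Silver enforces quality, Gold serves consumers.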


Requirements:

  • 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
  • Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
  • Strong programming skills in Python and SQL; experience with PySpark required.
  • Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
  • Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
  • Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
  • Strong understanding of data architecture, data modeling, and performance optimization.
  • Experience working with cross-functional teams to deliver enterprise data solutions.
  • Proven ability to tackle complex data challenges while ensuring data quality and reliable delivery.


Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • Experience designing enterprise-scale data platforms and modern data architectures.
  • Experience with data integration tools such as Azure Data Factory or similar platforms.
  • Familiarity with cloud data warehouses such as Databricks, Snowflake, or Microsoft Fabric.
  • Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
  • Databricks, Azure, or cloud certifications are preferred.
  • Strong problem-solving, communication, and technical leadership skills.


Technical Proficiency in:

  • Databricks, Apache Spark, PySpark, Delta Lake
  • Python, SQL, Scala (preferred)
  • Cloud platforms: Azure (preferred), AWS, or GCP
  • Azure Data Factory, Kafka, and modern data integration tools
  • Data warehousing: Databricks, Snowflake, or Microsoft Fabric
  • DevOps tools: Git, Azure DevOps, CI/CD pipelines
  • Data architecture, ETL/ELT design, and performance optimization


What You’re Looking For:

Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.


About Us:

OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.


OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.

Not Specified
Databricks - Lead Data Engineer
✦ New
Salary not disclosed
Atlanta, Georgia 10 hours ago

We Are Hiring: Databricks Lead Data Engineer – Director Equivalent Role

Location: Atlanta, USA

Work Model: Hybrid – 3 to 4 days in office per week (mandatory)

Eligibility: US Citizens and Green Card (GC) holders only

How to Apply

If you are interested in this position and have the required skills, please send your resume to:

; ;

Paves Technologies is seeking a highly experienced Databricks Lead Data Engineer – Lead Level (Director Equivalent Role) to drive enterprise-scale data architecture, governance, and advanced analytics initiatives on Azure Cloud. This is a senior leadership role requiring deep Databricks expertise, strong data modeling capabilities, and hands-on architectural ownership across PySpark-based distributed systems.

Role Overview

The ideal candidate will bring 10–12+ years of overall data engineering experience, including strong hands-on expertise with Azure Databricks, PySpark, Python, and Azure Cloud data services. You will define architecture standards, lead modernization initiatives, and implement a scalable Medallion Architecture (Bronze, Silver, Gold layers) to support enterprise analytics and business intelligence.

Key Responsibilities

  • Lead end-to-end architecture and implementation of enterprise-scale data platforms using Azure Databricks on Azure Cloud.
  • Design and implement Medallion Architecture (Bronze, Silver, Gold layers) using Delta Lake best practices.
  • Build scalable PySpark-based ETL/ELT pipelines across ingestion (Bronze), transformation (Silver), and curated analytics (Gold) layers.
  • Develop advanced data transformations using Python, PySpark, Spark SQL, and advanced SQL constructs.
  • Architect robust data models (dimensional, star schema, normalized models) aligned to analytics and reporting needs.
  • Drive adoption of advanced Databricks capabilities including Unity Catalog, Declarative Pipelines, Delta Lake optimization, and governance frameworks.
  • Establish best practices for partitioning strategies, file compaction, Z-ordering, caching, broadcast joins, and query optimization.
  • Define and standardize reusable Azure Cloud data platform tools, templates, CI/CD frameworks, and infrastructure automation.
  • Work across Azure ecosystem components such as Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure DevOps, networking, and security services.
  • Ensure high standards for data quality, RBAC, lineage tracking, governance, and production stability.
  • Provide architectural leadership and mentorship to data engineering teams.
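The partitioning-strategy bullet above can be illustrated without Spark: Delta and Spark writers commonly lay files out in Hive-style `column=value` directories, so filters on the partition column can prune whole directories before any data is read. A minimal sketch (the `ingest_date` partition column and paths are invented for illustration):

```python
from datetime import date

def partition_path(table_root: str, ingest_date: date) -> str:
    """Hive-style partition directory, e.g. .../ingest_date=2024-01-02."""
    return f"{table_root}/ingest_date={ingest_date.isoformat()}"

def prune(paths, wanted_date: date):
    """Partition pruning: keep only directories matching the filter value."""
    suffix = f"ingest_date={wanted_date.isoformat()}"
    return [p for p in paths if p.endswith(suffix)]

# Three daily partitions under one Silver table, then a single-day query.
paths = [partition_path("/lake/silver/orders", date(2024, 1, d)) for d in (1, 2, 3)]
hit = prune(paths, date(2024, 1, 2))
```

The same idea underlies the posting's file compaction and Z-ordering bullets: organize data physically so the engine can skip as much of it as possible.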

Required Experience & Skills

  • 10–12+ years of overall experience in Data Engineering.
  • Minimum 3+ years of strong hands-on Databricks experience.
  • Mandatory Certifications:
      • Databricks Certified Data Engineer Associate
      • Databricks Certified Data Engineer Professional
  • Deep hands-on expertise in PySpark, Python programming, and distributed Spark processing.
  • Strong experience designing and implementing Medallion Architecture (Bronze/Silver/Gold layers).
  • Advanced knowledge of Data Modeling, Data Analysis, and complex SQL (window functions, CTEs, execution plan tuning).
  • Strong understanding of Delta Lake architecture, schema evolution, partition strategies, performance optimization, and data governance.
  • Well-versed in enterprise Azure Cloud data platforms, reusable accelerators, CI/CD templates, and governance standards.
  • Proven experience architecting scalable, secure, cloud-native data solutions.
  • Strong leadership, stakeholder management, and executive communication skills.
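The "window functions, CTEs" skill line can be made concrete with a small self-contained query. This sketch uses SQLite through Python's standard library (the table and data are invented, and SQLite 3.25+ is assumed for window-function support); the same SQL shape applies in Spark SQL:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES ('east', 100), ('east', 300), ('west', 200);
""")

rows = con.execute("""
    WITH regional AS (                  -- a CTE wrapping the windowed query
        SELECT region, amount,
               SUM(amount) OVER (PARTITION BY region) AS region_total  -- window fn
        FROM sales
    )
    SELECT region, amount, region_total
    FROM regional
    ORDER BY region, amount
""").fetchall()
# Each row carries its own amount alongside the per-region total,
# without collapsing rows the way GROUP BY would.
```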


Not Specified
Data Architect - Consumer Platform
✦ New
Salary not disclosed

The pay range for this role is $150,000 - $200,000/yr USD.


WHO WE ARE:


Headquartered in Southern California, Skechers—the Comfort Technology Company®—has spent over 30 years helping men, women, and kids everywhere look and feel good. Comfort innovation is at the core of everything we do, driving the development of stylish, high-quality products at a great value. From our diverse footwear collections to our expanding range of apparel and accessories, Skechers is a complete lifestyle brand.


ABOUT THE ROLE:


The Skechers Digital Team is seeking a Digital Data Architect reporting to the Director, Digital Architecture, Consumer Domain. This role is responsible for designing and governing Skechers’ Consumer Data 360 ecosystem, enabling identity resolution, high-quality data foundations, personalization, loyalty intelligence, and machine learning capabilities across digital and retail channels.


The ideal candidate will be a strong technical leader with hands-on, full-stack technical knowledge of the enterprise technologies in Skechers’ consumer domain and the ability to work in a fast-paced agile environment. You should understand consumer programs from an architecture and industry perspective, and you should have strong hands-on experience designing solutions on the Salesforce Core Platform (including configuration, integration, and data model best practices).


You will work cross-functionally with Digital Engineering, Data Engineering, Data Science, Loyalty, and Marketing teams to architect scalable, secure, and high-performance data platforms that support advanced personalization and recommender systems.


WHAT YOU’LL DO:


  • Responsible for the full technical life cycle of consumer platform capabilities, which includes:
      • Capability roadmap and technical architecture in alignment with the consumer experience
      • Technical planning, design, and execution
      • Operations, analytics/reporting, and adoption
  • Define and evolve Skechers’ Consumer Data 360 architecture, including identity resolution (deterministic and probabilistic matching) and unified customer profiles.
  • Architect scalable data models and pipelines across CDP, CRM, e-commerce, marketing automation, data lake, and warehouse platforms.
  • Establish enterprise data quality frameworks including validation, deduplication, anomaly detection, and observability.
  • Optimize SQL workloads and large-scale distributed queries through performance tuning, partitioning, indexing, and workload management strategies.
  • Design and oversee ML pipelines supporting personalization, churn modeling, and recommender systems.
  • Partner with Data Science teams to productionize models using distributed platforms such as Databricks (Spark, Delta Lake, MLflow preferred).
  • Ensure secure data governance, access control (RBAC/ABAC), and compliance with GDPR, CCPA, and related privacy regulations.
  • Provide architectural oversight ensuring performance, scalability, resilience, and maintainability.
  • Collaborate with stakeholders to translate business objectives (LTV growth, personalization lift, engagement) into scalable data solutions.


REQUIREMENTS:


  • Computer Science, Data Engineering, or related degree or equivalent experience.
  • 12+ years of experience architecting enterprise data platforms in cloud environments.
  • 9+ years of experience in data engineering with a focus on consumer data.
  • 6+ years of experience working with Salesforce platforms, including data models and enterprise integrations.
  • Strong experience with Data 360 and identity resolution architectures.
  • Proven expertise in SQL performance tuning and large-scale data modeling.
  • Hands-on experience implementing ML pipelines and recommender systems in production environments.
  • Experience with cloud technologies (AWS, GCP, or Azure).
  • Experience with integration patterns (API, ETL, event streaming).
  • Experience providing technical leadership and guidance across multiple projects and development teams.
  • Experience translating business requirements into detailed technical specifications and working with development teams through implementation, including issue resolution and stakeholder communication.
  • Strong project management skills including scope assessment, estimation, and clear technical communication with both business users and technical teams.
  • Must hold at least one of the following Salesforce certifications: Platform App Builder, Platform Developer I, or JavaScript Developer I.
  • Experience with Databricks or similar distributed data/ML platforms preferred.
Not Specified
Data Reporting Analyst
🏢 Deploy
Salary not disclosed
Birmingham, AL 2 days ago

DEPLOY has been retained to find a Reporting & Data Architect Lead who combines advanced reporting development with enterprise-level data governance and architectural leadership. In this role, you will own our client's enterprise reporting platform: designing robust Power BI solutions, managing shared data models, and ensuring the reporting environment remains secure, scalable, and high-performing.

You will also own our client's enterprise reporting standards and governance framework, ensuring reporting across all departments is consistent, trusted, and aligned with best practices. This includes defining reporting conventions, reviewing changes, onboarding departmental report creators, and stewarding enterprise reporting assets such as certified datasets and endorsed reports.

At the enterprise level, you will architect our client's data framework—defining how data is structured, named, documented, and shared across ERP, operational, manufacturing, and corporate systems. You will own the enterprise data dictionary, the centralized semantic model, and key architectural decisions around Microsoft Fabric and other data tooling. This role interacts frequently with executives to align data strategy with organizational growth and reporting needs.

Key Responsibilities

Enterprise Reporting (Hands-On Development)

  • Build, optimize, and maintain enterprise-grade Power BI reports, dashboards, datasets, and data models.
  • Develop and govern shared semantic models and reusable datasets that power enterprise-wide reporting.
  • Use Microsoft Fabric, Dataverse, and related ETL/data management tools to shape and integrate reporting data sources.
  • Manage dataset refresh schedules, performance tuning, workspace organization, gateway configuration, and reporting system reliability.
  • Implement row-level security (RLS), workspace access patterns, and enterprise reporting permissions (you are Responsible, with the Director of Technology Accountable).
  • Manage reporting governance artifacts including certified datasets, endorsed reports, and enterprise workspace standards.
  • Support reporting scalability as our client grows (new factories, new business units, new product lines).
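The RLS item above boils down to mapping each role to a row predicate: a user's queries only see rows the predicate admits. In Power BI this is expressed as a DAX filter per role; the idea can be sketched in plain Python (role names and data are invented for illustration):

```python
# Each role maps to a predicate deciding which rows that role may see.
RLS_RULES = {
    "plant_a_manager": lambda row: row["factory"] == "A",
    "executive": lambda row: True,  # unrestricted access
}

def query_with_rls(rows, role):
    """Apply the role's row-level filter before returning results."""
    predicate = RLS_RULES[role]
    return [r for r in rows if predicate(r)]

data = [{"factory": "A", "units": 10}, {"factory": "B", "units": 7}]
a_view = query_with_rls(data, "plant_a_manager")   # sees only factory A
exec_view = query_with_rls(data, "executive")      # sees everything
```

Keeping these rules in the shared semantic model, rather than in individual reports, is what makes the security posture consistent as new reports are built on the same datasets.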

Enterprise Reporting Standards & Governance

  • Own our client's enterprise reporting standards framework, covering naming conventions, modeling patterns, documentation practices, lifecycle management, visual design standards, and change control.
  • Govern reporting development and deployment across the organization to ensure consistency and prevent duplicate or conflicting models.
  • Review and approve reporting change requests, data model modifications, and access requests.
  • Lead documentation and enablement for departmental report creators through training, guidance, and structured onboarding.
  • Provide strategic direction around reporting maturity, sustainability, and enterprise alignment.

Enterprise Data Architecture

  • Design and maintain our client's enterprise data architecture framework across ERP, operational, manufacturing, and corporate systems.
  • Own the enterprise data dictionary, defining canonical field names, table structures, business definitions, and version control practices.
  • Build and govern the centralized semantic model that powers reporting across the company.
  • Advise and strongly influence enterprise-level decisions around Microsoft Fabric, data modeling strategy, and long-term architectural direction—and own the work that follows those decisions.
  • Collaborate with engineering and system owners to coordinate schema changes, data integrations, and cross-system alignment.

Leadership & Collaboration

  • Partner with C-suite and senior leaders to define reporting roadmaps, enterprise priorities, and data strategy.
  • Communicate complex architectural concepts in clear, business-friendly terms.
  • Lead cross-functional initiatives that require unified data structures or scalable reporting.
  • Apply automation (Power Automate, Fabric pipelines) and AI tools to improve reporting efficiency, data quality, and governance workflows.

Ideal Candidate Profile

  • Deep hands-on expertise with Power BI, Microsoft Fabric, data modeling, and cloud data platforms.
  • Track record of establishing and enforcing enterprise reporting standards and governance.
  • Strong architectural intuition: semantic modeling, master data definition, cross-system alignment, and scalable design.
  • Able to operate as both an individual contributor and a strategic leader.
  • Experience managing reporting governance artifacts (certified datasets, endorsed reports, workspace strategy).
  • Comfortable influencing architectural decisions and guiding technical execution.
  • Strong command of foundational tools and languages such as:
      • DAX
      • Power Query / M
      • SQL
      • Fabric pipelines / ETL tooling
  • Experience with automation and AI-assisted analytics workflows.
Not Specified