Cloudera Data Platform Reference Architecture Jobs in USA

24,135 positions found — Page 2

Associate Partner, Data and Technology Transformation
✦ New
$250 +
Chicago, IL 1 day ago
Introduction
Your role and responsibilities
About the Opportunity

IBM Consulting is seeking an accomplished Data & Analytics Associate Partner to accelerate our growth within the Industrial & Communications sectors. This executive role is responsible for shaping client vision, cultivating senior executive relationships, and developing data-driven solutions that enable clients to successfully navigate complex transformation programs.


You will bring together deep industry expertise and IBM’s portfolio of data, analytics, and AI capabilities to help organizations modernize their data ecosystems—migrating from legacy platforms to modern hybrid cloud architectures—while adopting next-generation analytics, GenAI, and agentic AI to strengthen decision-making and deliver measurable business and financial outcomes.


This role is ideal for a seasoned leader who integrates industry depth, consulting excellence, and technical thought leadership, has a strong understanding of competitive market dynamics, and consistently delivers high-impact transformation at scale.


Key Responsibilities
Market Leadership & Growth

  • Expand IBM’s Data & Analytics presence by identifying new market opportunities, developing differentiated solutions, and building a strong pipeline.


  • Engage senior client executives to understand strategic priorities and shape data transformation roadmaps aligned to their business and financial goals.


  • Lead end-to-end sales cycles, including solution definition, proposal leadership, financial structuring, and contract negotiation.



Strategic Advisory & Transformation Delivery

  • Advise C-suite leaders on strategies for modernizing their data estate and applying advanced analytics, GenAI, and agentic AI to drive business performance.


  • Architect integrated solutions that include:


  • Migration from legacy data platforms to modern cloud-based architectures


  • Data engineering and information governance


  • Business intelligence and advanced analytics


  • GenAI-powered and agentic AI-driven automation and decisioning


  • Lead complex transformation programs from discovery through delivery, ensuring measurable outcomes and client satisfaction.



Engagement Excellence & Financial Stewardship

  • Oversee multi-disciplinary delivery teams to ensure high-quality, consistent execution across all program phases.


  • Manage engagement financials, including forecasting, margin performance, and overall portfolio profitability.


  • Align the right client technologies, industry expertise, and global delivery capabilities to maximize client value.



Practice Building & Talent Development

  • Recruit, mentor, and grow top-tier consultants, architects, and data specialists.


  • Build and scale capabilities in data modernization, cloud data engineering, analytics, GenAI, and emerging agentic AI techniques.


  • Contribute to practice strategy, offering development, and capability growth across the global Data & Analytics team.



Thought Leadership & Market Presence

  • Stay ahead of sector and technology trends, including cloud modernization, GenAI, agentic system design, regulatory changes, and evolving competitive dynamics.


  • Represent IBM at industry conferences, client events, webinars, and executive roundtables.


  • Create original thought leadership—articles, perspectives, points of view—that positions IBM as a leading advisor in data and AI-driven transformation.



This position can be performed anywhere in the US.


"Leaders are expected to spend time with their teams and clients and therefore are generally expected to be in the workplace a minimum of three days a week, subject to business needs."


Required technical and professional expertise
Qualifications

  • 12+ years of experience in consulting, data strategy, analytics, or digital transformation, with strong exposure to the Industrial or Communications sectors.


  • Hands-on experience modernizing data ecosystems, including migrating from legacy on-premise platforms to modern cloud-native or hybrid cloud architectures.


  • Deep expertise with major cloud platforms and their data/analytics stacks, including implementation experience with:


  • AWS (e.g., Redshift, S3, Glue, EMR, Athena, Lake Formation, Bedrock, SageMaker)


  • Microsoft Azure (e.g., Azure Data Lake, Synapse, Data Factory, Databricks on Azure, Fabric, Cognitive Services)


  • Google Cloud Platform (e.g., BigQuery, Cloud Storage, Dataflow, Dataproc, Vertex AI)


  • Experience designing and implementing end-to-end data pipelines, governance frameworks, and analytics solutions on one or more of these platforms.


  • Strong understanding of GenAI architectures, LLM integration patterns, vector databases, retrieval-augmented generation (RAG), and emerging agentic AI frameworks.


  • Proven track record of selling, structuring, and delivering large-scale data and AI transformation programs.


  • Robust technical and functional expertise in data engineering, cloud data platforms, analytics, AI/ML, information management, and governance.


  • Executive-level communication and presence, with demonstrated ability to influence senior stakeholders and convey complex topics through compelling narratives.


  • Financial management experience, including engagement economics, forecasting, margin optimization, and portfolio profitability.


  • Demonstrated leadership in building, scaling, and developing high-performing consulting and technical teams.



Preferred technical and professional experience

IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.


Not Specified
Data Reporting Analyst
🏢 Deploy
Salary not disclosed
Birmingham, AL 2 days ago

DEPLOY has been retained to find a Reporting & Data Architect Lead who combines advanced reporting development with enterprise-level data governance and architectural leadership. In this role, you will own our client's enterprise reporting platform—designing robust Power BI solutions, managing shared data models, and ensuring the reporting environment remains secure, scalable, and high-performing.

You will also own our client's enterprise reporting standards and governance framework, ensuring reporting across all departments is consistent, trusted, and aligned with best practices. This includes defining reporting conventions, reviewing changes, onboarding departmental report creators, and stewarding enterprise reporting assets such as certified datasets and endorsed reports.

At the enterprise level, you will architect our client's data framework—defining how data is structured, named, documented, and shared across ERP, operational, manufacturing, and corporate systems. You will own the enterprise data dictionary, the centralized semantic model, and key architectural decisions around Microsoft Fabric and other data tooling. This role interacts frequently with executives to align data strategy with organizational growth and reporting needs.

Key Responsibilities

Enterprise Reporting (Hands-On Development)

  • Build, optimize, and maintain enterprise-grade Power BI reports, dashboards, datasets, and data models.
  • Develop and govern shared semantic models and reusable datasets that power enterprise-wide reporting.
  • Use Microsoft Fabric, Dataverse, and related ETL/data management tools to shape and integrate reporting data sources.
  • Manage dataset refresh schedules, performance tuning, workspace organization, gateway configuration, and reporting system reliability.
  • Implement row-level security (RLS), workspace access patterns, and enterprise reporting permissions (Responsible, with the Director of Technology Accountable).
  • Manage reporting governance artifacts including certified datasets, endorsed reports, and enterprise workspace standards.
  • Support reporting scalability as our client grows (new factories, new business units, new product lines).
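The row-level security responsibility above can be illustrated with a minimal sketch. In Power BI, RLS is actually defined as DAX filter expressions attached to roles; the Python below only models the underlying idea (restricting which rows a user can see based on their role), and all role, factory, and field names are hypothetical.

```python
# Minimal model of row-level security: each role maps to a predicate
# deciding which rows a user may see. Names are hypothetical; real
# Power BI RLS is defined as DAX filters on workspace roles.
ROLE_FILTERS = {
    "plant_manager": lambda row, user: row["factory"] == user["factory"],
    "executive": lambda row, user: True,  # executives see every row
}

def visible_rows(rows, user):
    """Return only the rows the user's role permits (default: none)."""
    predicate = ROLE_FILTERS.get(user["role"], lambda row, user: False)
    return [row for row in rows if predicate(row, user)]

rows = [
    {"factory": "Birmingham", "units": 120},
    {"factory": "Huntsville", "units": 80},
]
manager = {"role": "plant_manager", "factory": "Birmingham"}
print(len(visible_rows(rows, manager)))  # -> 1
```

The same shape generalizes to the governance concern in the bullet: the filter definitions live centrally (like certified datasets), while access decisions happen per user at query time.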

Enterprise Reporting Standards & Governance

  • Own our client's enterprise reporting standards framework, covering naming conventions, modeling patterns, documentation practices, lifecycle management, visual design standards, and change control.
  • Govern reporting development and deployment across the organization to ensure consistency and prevent duplicate or conflicting models.
  • Review and approve reporting change requests, data model modifications, and access requests.
  • Lead documentation and enablement for departmental report creators through training, guidance, and structured onboarding.
  • Provide strategic direction around reporting maturity, sustainability, and enterprise alignment.

Enterprise Data Architecture

  • Design and maintain our client's enterprise data architecture framework across ERP, operational, manufacturing, and corporate systems.
  • Own the enterprise data dictionary, defining canonical field names, table structures, business definitions, and version control practices.
  • Build and govern the centralized semantic model that powers reporting across the company.
  • Advise and strongly influence enterprise-level decisions around Microsoft Fabric, data modeling strategy, and long-term architectural direction—and own the work that follows those decisions.
  • Collaborate with engineering and system owners to coordinate schema changes, data integrations, and cross-system alignment.

Leadership & Collaboration

  • Partner with C-suite and senior leaders to define reporting roadmaps, enterprise priorities, and data strategy.
  • Communicate complex architectural concepts in clear, business-friendly terms.
  • Lead cross-functional initiatives that require unified data structures or scalable reporting.
  • Apply automation (Power Automate, Fabric pipelines) and AI tools to improve reporting efficiency, data quality, and governance workflows.

Ideal Candidate Profile

  • Deep hands-on expertise with Power BI, Microsoft Fabric, data modeling, and cloud data platforms.
  • Track record of establishing and enforcing enterprise reporting standards and governance.
  • Strong architectural intuition: semantic modeling, master data definition, cross-system alignment, and scalable design.
  • Able to operate as both an individual contributor and a strategic leader.
  • Experience managing reporting governance artifacts (certified datasets, endorsed reports, workspace strategy).
  • Comfortable influencing architectural decisions and guiding technical execution.
  • Strong command of foundational tools and languages such as:
  • DAX
  • Power Query / M
  • SQL
  • Fabric pipelines / ETL tooling
  • Experience with automation and AI-assisted analytics workflows.
Not Specified
System Administrator - Microsoft Purview (Data Catalog & Governance)
Salary not disclosed
Raleigh, NC 2 days ago
Role: System Administrator - Microsoft Purview (Data Catalog & Governance)

Location: 100% Remote

Duration: 12+ Months

Overview:

An experienced Administrator to operate and support the enterprise implementation of Microsoft Purview Data Catalog across a complex, multi-platform data environment. The administrator will be responsible for the day-to-day configuration, monitoring, and maintenance of Purview capabilities, ensuring reliable metadata ingestion, catalog quality, lineage visibility, and compliance alignment across governed data domains.

This role focuses on platform operations and governance execution, working within established architecture and enterprise governance standards.

Key Responsibilities

Platform Administration & Operations:


  • Administer and operate Microsoft Purview Data Map and Data Catalog environments.
  • Monitor platform health, scan execution, metadata ingestion, and lineage availability.
  • Troubleshoot and resolve catalog, scan, and connectivity issues.
  • Perform routine maintenance, configuration updates, and service optimizations.
  • Coordinate incident resolution with internal engineering teams and Microsoft support as required.

Data Source Management & Scanning:


  • Register, configure, and maintain data sources across Azure, M365, on-prem, and approved third-party platforms.
  • Configure and schedule metadata scans for supported sources.
  • Manage authentication for scans using managed identities, service principals, and Key Vault secrets.
  • Monitor scan performance, failures, and coverage; take corrective action as needed.
  • Optimize scan frequency and scope to balance cost, performance, and governance coverage.
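The trade-off in the last bullet (scan frequency and scope vs. cost and coverage) can be sketched as a small scheduling rule. This is not a Purview API call—the tiers, intervals, and source names below are all hypothetical—but it captures the shape of the decision: scan critical sources often, stable ones rarely.

```python
# Hypothetical tiering rule for scan scheduling: frequent scans for
# critical/fast-changing sources, infrequent scans for archival ones,
# balancing cost against governance coverage.
SCAN_INTERVAL_DAYS = {"critical": 1, "standard": 7, "archival": 30}

def next_scan_due(last_scan_day, tier, today):
    """Days until the next scan is due (0 or negative means overdue)."""
    return last_scan_day + SCAN_INTERVAL_DAYS[tier] - today

sources = [
    ("sales_db", "critical", 10),   # (name, tier, day of last scan)
    ("hr_archive", "archival", 0),
]
today = 11
overdue = [name for name, tier, last in sources
           if next_scan_due(last, tier, today) <= 0]
print(overdue)  # -> ['sales_db']
```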

Catalog Configuration & Metadata Management:


  • Maintain and enforce enterprise metadata standards within the Purview Catalog.
  • Manage business metadata, classifications, glossary terms, and custom attributes.
  • Ensure metadata accuracy, completeness, and consistency across data assets.
  • Support curation activities including asset certification and publishing.
  • Resolve duplicate, incomplete, or stale catalog entries.

Lineage & Discovery Enablement:


  • Enable and validate data lineage ingestion from supported data platforms.
  • Monitor lineage completeness and visibility for critical data assets.
  • Assist data consumers and stewards with lineage-based impact analysis.
  • Escalate lineage gaps or tool limitations requiring architectural or engineering remediation.

Security, Access & Governance Controls:


  • Configure and manage Purview role-based access control (RBAC) within collections.
  • Provision and maintain access for administrators, data curators, and data stewards.
  • Enforce domain-based access controls and separation of duties.
  • Integrate Purview access with Microsoft Entra ID.
  • Support sensitivity labels and classification alignment with Microsoft Information Protection.

Compliance & Risk Support:


  • Support automated discovery of sensitive data (PII, PCI, PHI).
  • Assist risk, audit, and compliance teams with catalog evidence and reporting.
  • Validate scan coverage for regulated data domains.
  • Support regulatory and audit initiatives (SOX, GLBA, NYDFS, GDPR, etc.).

User Support & Enablement:


  • Provide operational support to data producers, consumers, and data stewards.
  • Respond to access requests, catalog issues, and usage questions.
  • Maintain operational documentation, runbooks, and standard operating procedures.
  • Support onboarding of new data domains following established governance patterns.
  • Assist with training and adoption initiatives led by governance or architecture teams.


Required Qualifications:


  • 5+ years of experience supporting enterprise data platforms or governance tools and 4+ years of hands-on MS Purview experience at enterprise scale.
  • Hands-on experience administering Microsoft Purview Data Catalog.
  • Strong understanding of metadata management, data classification, and lineage concepts.
  • Working knowledge of Azure data services and enterprise data ecosystems.
  • Experience managing access controls and identities using Microsoft Entra ID.
  • Familiarity with regulated data environments and compliance requirements.
  • Strong troubleshooting, operational support, and documentation skills.


Preferred Qualifications:


  • Experience supporting Purview integrations with Synapse, Fabric, Databricks, Snowflake, or SQL Server.
  • Exposure to financial services or other regulated industries.
  • Experience with PowerShell, REST APIs, or basic automation for operational tasks.
  • Prior experience supporting enterprise data governance or stewardship programs.
Not Specified
Engineering Analytics Analyst, Engineering Data Analytics Tool Team
🏢 Boeing
$93,090 - 105,280
Arlington, WA 3 days ago

Job Description

At Boeing, we innovate and collaborate to make the world a better place. We’re committed to fostering an environment for every teammate that’s welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.


The Boeing Commercial Airplane (BCA) Engineering Data Analytics Tool Team (BEDAT) is looking for an Engineering Analytics Analyst to assist in transforming the BCA Engineering digital footprint in Everett, WA.

Primary Responsibilities:

  • Collect, analyze, and implement technical requirements for key performance indicators and metrics in a Cognos-based dashboard serving a community of 1,500 users

  • Design and support the backend data source using MS SQL Server/Cognos by extracting and staging data from 40 upstream databases, creating a single authoritative source for all BCA engineering-related metrics and analytics

  • Develop ad-hoc queries, reports, and analyses using SQL, R, and Tableau in collaboration with business partners to analyze emerging opportunities

  • Work closely with all levels of BCA Engineering leadership to understand the business and technical requirements

  • Build familiarity with Google Cloud Platform

  • Leads cross-functional teams across multiple business processes

  • Ensures accurate deliverables, maintains results, and communicates status to all participants

  • Collects, analyzes, documents, and integrates requirements from multiple process owners

  • Applies and makes recommendations for the process, data, and applications/systems architecture

  • May benchmark, or assist in benchmarking, best practices and industry standards; presents best practices at internal events

  • Learns to balance competing strategic initiatives

  • Conducts business requirements review, coordinates testing schedules, and assists in the preparation of test scripts

  • Communicates with information technology organizations to represent customers and functional users on project requirements, activities, and status

  • Serves as liaison to resolve business requirement issues between customer and information technology representatives

  • Demonstrates basic knowledge and use of Project Management and/or Program Management best-practice tools necessary to assist clients working through the life cycle of an improvement project, including facilitating plan development

  • Seeks opportunities for company-wide synergy with practitioners of methods and tools from other skills or organizations

  • Assists with integration of remaining aspects of enterprise architecture (e.g. information, data, and applications architecture)

  • Ensures solution has architectural compliance and strategic alignment with business objectives

  • Leads, participates, or works together to reach agreement on the development of business architecture design, phased implementation, and use

Basic Qualifications (Required Skills/ Experience):

  • 1 or more years’ experience collecting, organizing, synthesizing, and analyzing data from multiple sources; summarizing findings; and developing conclusions and recommendations from appropriate data sources.

  • 1 or more years’ experience utilizing and developing analytical tools and code (e.g., SQL, Tableau, Cognos, Teradata, cloud platforms).

  • Bachelor’s degree OR equivalent experience.

Preferred Qualifications (Desired Skills/Experience):

  • 1 or more years' experience supporting multiple managers/leaders in developing monthly, quarterly, and yearly strategic plans.

  • 1 or more years' experience working directly with executives or senior leaders

Drug Free Workplace:

Boeing is a Drug Free Workplace where post-offer applicants and employees are subject to testing for marijuana, cocaine, opioids, amphetamines, PCP, and alcohol when criteria are met as outlined in our policies.

Pay & Benefits:

At Boeing, we strive to deliver a Total Rewards package that will attract, engage and retain the top talent. Elements of the Total Rewards package include competitive base pay and variable compensation opportunities.

The Boeing Company also provides eligible employees with an opportunity to enroll in a variety of benefit programs, generally including health insurance, flexible spending accounts, health savings accounts, retirement savings plans, life and disability insurance programs, and a number of programs that provide for both paid and unpaid time away from work.

The specific programs and options available to any given employee may vary depending on eligibility factors such as geographic location, date of hire, and the applicability of collective bargaining agreements.

Pay is based upon candidate experience and qualifications, as well as market and business considerations.

Summary Pay Range:

Level 3 - $93,090 - $105,280

Applications for this position will be accepted until Mar. 23, 2026

Export Control Requirements:

This is not an Export Control position.

Education

Bachelor's Degree or Equivalent Required

Relocation

Relocation assistance is not a negotiable benefit for this position.

Visa Sponsorship

Employer will not sponsor applicants for employment visa status.

Shift

This position is for 1st shift

Equal Opportunity Employer:

Boeing is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military/veteran status or other characteristics protected by law.

permanent
Solution Architect - Microsoft Purview (Data Catalog & Governance)
🏢 Spectraforce Technologies
Salary not disclosed
Newark, NJ 3 days ago
Title: Solution Architect - Microsoft Purview (Data Catalog & Governance)

Duration: 10+ Months

Location: Remote

Overview

An experienced Solution Architect to lead the enterprise rollout of Microsoft Purview across a complex global, multi-cloud environment. The consultant will define architecture, implement domain-based governance, and drive adoption of Purview capabilities including cataloging, lineage, classification, access governance, and compliance controls.

Key Responsibilities


  • Architecture & Implementation
  • Define target-state architecture for Microsoft Purview across Azure, AWS, M365, on-prem, and third-party platforms.
  • Develop and drive the implementation roadmap across U.S. Businesses, PGIM, Corporate Technology, and international units.
  • Establish Purview reference architecture, integration patterns, and guardrails.
  • Domain-Based Governance
  • Design collections, hierarchies, and RBAC aligned to domain structures and legal entity boundaries.
  • Enable domain-owned stewardship while enforcing enterprise taxonomies and governance standards.
  • Platform Configuration
  • Configure Data Map, Catalog, Scans, Classifications, Sensitivity Labels, and Lineage.
  • Optimize scan strategy (frequency, cost, performance) and extend classifiers and metadata models.
  • Security & Compliance
  • Integrate Purview with M365 Information Protection, Entra ID, and security baselines.
  • Support PII/PCI/PHI detection, access governance, and regulatory compliance (SOX, GLBA, NYDFS, GDPR).
  • Engineering & Integration
  • Integrate with Synapse, Fabric, Databricks (including Unity Catalog), Snowflake, SQL Server, AWS sources, and SAP/Oracle.
  • Implement IaC (Bicep/Terraform), CI/CD for Purview artifacts, and automation via APIs.
  • Adoption & Stakeholder Management
  • Deliver training, onboarding playbooks, and steward enablement.
  • Lead workshops for new data domains and products.
  • Provide executive level reporting on progress, risks, and KPIs.


Required Qualifications


  • 10+ years in data architecture/governance; 2+ years of hands-on Purview experience at enterprise scale.
  • Strong expertise in metadata management, lineage, classification, scan optimization, glossary management, and domain-based operating models.
  • Solid Azure ecosystem knowledge (Storage, Key Vault, Synapse, Fabric, Databricks), M365 Information Protection, and Entra ID.
  • Experience with IaC (Bicep/Terraform), APIs/Atlas, and scripting (PowerShell/Python).
  • Financial services or regulated-industry exposure.
  • Excellent communication, stakeholder leadership, and cross-domain facilitation skills.


Not Specified
Sr AI Platform Engineer(W2 Contract)
✦ New
🏢 Ampstek
Salary not disclosed
Charlotte, NC 5 hours ago

Job Title: Sr AI Platform Engineer (Guardrails, Observability & Evaluation Infrastructure)

Location: Charlotte, NC, USA (3 days onsite)



Role Overview



An AI Platform Engineer to design and build the foundational components that power enterprise-scale GenAI applications. This includes data guardrails, model safety tooling, observability pipelines, evaluation harnesses, and standardized logging/monitoring frameworks. This role is critical for enabling safe, reliable, and compliant AI development across multiple use cases, teams, and business units. The goal is to create the common platform services that AI teams will build upon.


Key Responsibilities


1. Guardrails, Safety & Governance


• Design and implement data guardrail frameworks (pre-processing, redaction, PII/PHI filtering, DLP integration, prompt defenses).

• Build “Model Armor” components such as:

o Input validation & sanitization

o Prompt injection defenses

o Harmful content detection & policy enforcement

o Output filtering, fact-checking, grounding checks

• Integrate safety tooling (policy engines, classifiers, DLP APIs, safety models).

• Collaborate with Security, Compliance, and Data Privacy teams to ensure frameworks meet enterprise governance requirements.
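The guardrail components above can be sketched in miniature. Production systems use DLP APIs, safety classifiers, and policy engines; the regexes and marker phrases below are purely illustrative stand-ins for PII redaction and a naive prompt-injection check.

```python
import re

# Toy guardrail: redact simple PII patterns and flag obvious prompt-
# injection phrases before text reaches a model. Real deployments use
# DLP APIs and classifiers; these patterns are only illustrative.
PII_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}
INJECTION_MARKERS = ("ignore previous instructions",
                     "disregard your system prompt")

def apply_guardrails(text):
    """Return (redacted_text, injection_flag) for one input."""
    flagged = any(m in text.lower() for m in INJECTION_MARKERS)
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text, flagged

clean, flagged = apply_guardrails("Contact jane@example.com, SSN 123-45-6789.")
print(clean)    # -> Contact [EMAIL], SSN [SSN].
print(flagged)  # -> False
```

In a platform setting this function would sit in a shared middleware layer so every team's LLM calls pass through the same pre-processing step.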


2. Observability Frameworks


• Build and maintain observability pipelines using tools like Arize AI (tracing, quality metrics, dataset drift/hallucination tracking, embedding monitoring).

• Define and enforce platform-wide standards for:

o Tracing LLM calls

o Token usage and cost monitoring

o Latency and reliability metrics

o Prompt/model version tracking

• Provide reusable SDKs or middleware for engineering teams to adopt observability with minimal friction.


3. Logging, Monitoring & Telemetry


• Design standardized LLM-specific logging schemas, including:

o Inputs/outputs

o Model metadata

o Retrieval metadata

o Safety flags

o User context and attribution

• Build monitoring dashboards for performance, cost, anomalies, errors, and safety events.

• Implement alerting and SLOs/SLIs for LLM inference systems.
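A standardized LLM logging schema of the kind described above can be sketched as a typed record. The field names here are illustrative (there is no single established standard), but they mirror the schema bullets: inputs/outputs, model metadata, retrieval metadata, safety flags, and user attribution.

```python
from dataclasses import dataclass, field, asdict
import json
import time

@dataclass
class LLMCallLog:
    """One log record per LLM call; fields mirror the schema above."""
    model: str
    prompt_version: str
    user_id: str
    input_text: str
    output_text: str
    retrieval_doc_ids: list = field(default_factory=list)  # retrieval metadata
    safety_flags: list = field(default_factory=list)       # guardrail outcomes
    latency_ms: float = 0.0
    total_tokens: int = 0                                  # cost monitoring
    timestamp: float = field(default_factory=time.time)

record = LLMCallLog(
    model="example-model-v1", prompt_version="summarize@3",
    user_id="u-42", input_text="...", output_text="...",
    total_tokens=512, latency_ms=840.0,
)
# A fixed schema means every team's logs can feed one dashboard.
print(json.dumps(asdict(record))[:60])
```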


4. Evaluation Infrastructure


• Architect and maintain evaluation harnesses for GenAI systems, including:

o RAG evaluation (faithfulness, relevance, hallucination risk)

o Summarization/QA evaluation

o Human-in-the-loop review workflows

o Automated eval pipelines integrated into CI/CD


• Support frameworks such as RAGAS, G-Eval, rubric scoring, pairwise comparisons, and test-case generation.

• Build reusable tooling for teams to write, run, and track model evaluations.
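One metric from the evaluation harness above, RAG faithfulness, can be sketched with a cheap proxy: the fraction of answer tokens grounded in the retrieved context. Real harnesses such as RAGAS use LLM-based judges rather than token overlap; this is only an illustration of the harness's shape.

```python
# Toy faithfulness score for RAG evaluation: share of answer tokens
# that appear in the retrieved context. Frameworks like RAGAS use
# LLM judges instead; this overlap proxy just illustrates the idea.
def faithfulness(answer: str, context: str) -> float:
    answer_tokens = answer.lower().split()
    context_tokens = set(context.lower().split())
    if not answer_tokens:
        return 1.0
    grounded = sum(t in context_tokens for t in answer_tokens)
    return grounded / len(answer_tokens)

context = "the invoice total was 42 dollars due on friday"
good = faithfulness("invoice total was 42 dollars", context)
bad = faithfulness("invoice total was 99 euros", context)
print(round(good, 2), round(bad, 2))  # -> 1.0 0.6
```

An eval harness would run scores like this over a fixed test set inside CI/CD and fail the build when aggregate faithfulness drops below a threshold.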


5. Platform Engineering & Reusable Components


• Develop shared libraries, APIs, and services for:


o Prompt management/versioning

o Embedding pipelines and model wrappers

o Retrieval adapters

o Common data loaders and document preprocessing

o Tool/function schemas


• Drive consistency across teams with standards, reference architectures, and best practices.

• Review system designs across use cases to ensure alignment to platform patterns.


6. Collaboration & Enablement


• Partner with AI engineers, product teams, and data scientists to understand cross-cutting needs and convert them into reusable platform features.

• Create documentation, onboarding guides, examples, and developer tooling.

• Provide internal training (brown bags, workshops) on guardrails, observability, and evaluation frameworks.


Required Qualifications


Technical Skills


• 5–10+ years of software engineering or ML infrastructure experience.

• Strong Python engineering fundamentals (FastAPI, async, typing/Pydantic, testing).

• Experience with model safety/guardrails approaches (prompt injection defense, PII redaction, toxicity filters, policy enforcement).

• Hands-on experience with Arize AI, LangSmith, or similar LLM observability platforms.

• Experience creating evaluation frameworks using RAGAS, G-Eval, or custom rubric systems.

• Strong familiarity with vector databases (Pinecone, Weaviate, Milvus), embeddings, and retrieval pipelines.

• Solid understanding of LLM architectures, tokenization, embeddings, context limits, and RAG patterns.

• Experience in cloud (GCP preferred), Kubernetes/GKE, containers, and CI/CD.

• Strong understanding of security, governance, DLP, data privacy, RBAC, and enterprise compliance requirements.


Soft Skills


• Strong documentation and communication skills.

• Ability to influence engineering teams and standardize best practices.

• Comfortable working across multiple stakeholders—platform, security, ML engineering, product.


Nice to Have


• Experience with LangChain/LangGraph or LlamaIndex orchestrations.

• Experience with Rebuff, Protect AI, or similar LLM security tooling.

• Experience with GCP Vertex AI pipelines, Model Monitoring, and Vector Search.

• Familiarity with knowledge graphs, grounding models, fact checking models.

• Building SDKs or developer frameworks adopted across multiple teams.

• On-prem or hybrid AI deployment experience.

contract
Senior Data Engineer
✦ New
Salary not disclosed
Boston, MA 5 hours ago

About the Company

Our client is a well-established global insurance and financial services organization known for its strong reputation, financial stability, and long-term commitment to innovation. The company is investing heavily in modern technology and data platforms to transform how insurance products are delivered to professionals and small businesses.


Within this organization, a rapidly growing digital product team is focused on simplifying insurance through modern technology, data-driven decision making, and scalable cloud infrastructure. The team operates in a collaborative, fast-paced environment and places a strong emphasis on engineering excellence, ownership, and continuous improvement.


About the Role

Our client is seeking a Senior Data Engineer to join their Platform Engineering team in Boston. This role will play a key part in designing, building, and scaling a modern cloud-native data platform used to support analytics, business intelligence, and data-driven decision making across the organization.


This individual will work closely with engineering leaders, data teams, and business stakeholders to develop scalable data integration pipelines and build the foundation for enterprise analytics. The role is highly hands-on and ideal for someone who enjoys building data platforms, working with cloud technologies, and enabling self-service analytics.


The team operates in an Agile environment and values collaboration, ownership, and continuous improvement.


Responsibilities

  • Design, develop, and maintain scalable data pipelines using Azure Data Factory, Databricks, and SQL Server
  • Build configuration-driven ingestion processes to support batch and near real-time data pipelines
  • Integrate and transform data from multiple sources including APIs, JSON, CSV, XLS, and other structured and semi-structured formats
  • Design and document data models, data flows, and data dictionaries to support enterprise analytics
  • Implement Lakehouse architecture using Medallion data modeling patterns
  • Develop datasets and reporting pipelines supporting Power BI and enterprise analytics
  • Maintain the health, security, and scalability of the Azure-based data platform
  • Implement best practices for data governance, metadata management, lineage, and access control
  • Build and maintain CI/CD pipelines for data infrastructure using tools such as GitHub Actions or CircleCI
  • Implement monitoring, alerting, and observability for data pipelines and platform performance
  • Optimize data pipelines for performance, scalability, and cost efficiency
  • Work cross-functionally with engineering, product, and business teams to define and deliver data products
  • Participate in Agile ceremonies including sprint planning, backlog refinement, and delivery cycles
  • Mentor junior engineers and contribute to the ongoing development of the data engineering team
  • Stay current on emerging technologies and best practices in data engineering
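The configuration-driven ingestion responsibility above can be sketched in miniature. A minimal pure-Python illustration (the source names and config shape are hypothetical — in practice such configs would live in a metadata store and drive Azure Data Factory or Databricks jobs):

```python
import csv
import io
import json

# Hypothetical source configs: in a real platform these would come from a
# metadata table or YAML file rather than being hard-coded per source.
SOURCES = [
    {"name": "claims", "format": "json"},
    {"name": "policies", "format": "csv"},
]

def parse_payload(fmt: str, raw: str) -> list:
    """Dispatch on the configured format instead of writing one pipeline per source."""
    if fmt == "json":
        records = json.loads(raw)
        return records if isinstance(records, list) else [records]
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(raw)))
    raise ValueError(f"unsupported format: {fmt}")

def ingest(config: dict, raw: str) -> list:
    # Tag each record with its source so downstream (silver/gold) layers
    # can trace lineage back to the landing config.
    return [{**rec, "_source": config["name"]} for rec in parse_payload(config["format"], raw)]
```

Adding a new source then becomes a config entry rather than new pipeline code, which is the point of the configuration-driven pattern.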


Qualifications

  • 7+ years of professional experience in data engineering or data platform development
  • Strong experience working in Azure environments, particularly: Azure Data Factory, Databricks, SQL Server / T-SQL
  • Experience building scalable ETL / ELT data pipelines
  • Experience integrating data from multiple sources including APIs, JSON, XML, CSV, and other formats
  • Strong understanding of data warehousing, dimensional modeling, and Lakehouse architecture
  • Experience with Power BI, SSRS, or other reporting platforms
  • Hands-on experience with CI/CD pipelines and DataOps practices
  • Familiarity with data governance, data quality, metadata management, and MDM
  • Experience working in Agile development environments
  • Strong communication and collaboration skills with the ability to work across technical and business teams
  • Experience mentoring or guiding other engineers is a plus
Azure Data Engineer
✦ New
Salary not disclosed
Queens 1 day ago
Job Description

We are seeking a hands-on Consultant with strong Azure ETL experience and advanced Power BI development skills. Candidates must have experience modernizing legacy Microsoft BI environments (including SSIS), though this is not an SSIS-only role.

The consultant will design, modernize, and enhance enterprise data and analytics solutions supporting Cyber Security, Physical Security, Electronic Security, and Police operations. The role includes evolving legacy SQL Server/SSIS-based processes into modern Azure data architectures while designing scalable new ETL/ELT pipelines and delivering executive-level analytics solutions. The consultant will work directly with stakeholders to deliver production-grade reporting and analytics capabilities across multiple enterprise systems, which requires both architectural thinking and hands-on technical execution.

Core Responsibilities

Candidates must have direct experience building enterprise-grade ETL pipelines and executive Power BI dashboards.

  • Design and implement modern ETL/ELT pipelines in Azure
  • Assess and refactor existing SSIS packages as part of broader modernization efforts
  • Architect Lakehouse / Medallion data models
  • Develop optimized dimensional data models (star schema)
  • Integrate data from SQL Server, Oracle, APIs, and security platforms
  • Design and deploy enterprise Power BI dashboards
  • Build paginated reports using Power BI Report Builder
  • Optimize DAX and dataset performance
  • Implement Row-Level Security (RLS)
  • Support CI/CD and DevOps deployment processes
  • Produce technical documentation and data lineage artifacts
  • Engage directly with executive stakeholders

Required Technical Skills (Must-Have)

Data Engineering & Architecture:
  • Strong ETL/ELT design and optimization experience
  • Advanced SQL (expert-level required)
  • Python / PySpark
  • Dimensional data modeling (star schema required)
  • REST API integrations

Azure Data Stack:
  • Azure Data Factory
  • Azure Databricks
  • Azure Synapse Analytics
  • Azure Data Lake Storage

Microsoft Data Platform:
  • Experience with SQL Server data warehouse environments
  • Working knowledge of SSIS and experience modernizing or migrating SSIS workflows to Azure-based solutions

Power BI:
  • Power BI Desktop (expert-level)
  • Advanced DAX
  • Executive dashboard development
  • Paginated reports (Power BI Report Builder)
  • Data Gateway configuration
  • Incremental refresh
  • Row-Level Security (RLS)

Nice to Have:
  • Microsoft Purview
  • Terraform (Infrastructure-as-Code)
  • Orchestration tools (Airflow or equivalent)
  • Security systems data integration experience
  • Experience with C# / .NET web application development (for integration with internal systems or APIs)

Experience Requirements:
  • 7+ years of hands-on data engineering / analytics delivery
  • Demonstrated experience building production data pipelines in Azure
  • Proven experience delivering executive-facing Power BI solutions
  • Experience working in complex enterprise environments

Software Skills:
  • 4–6 years of experience in Azure for building, deploying, and managing cloud-based data and application services

Technical Skills:
  • 2–4 years of experience in .NET code development for developing and maintaining enterprise applications and data processing components
  • 6+ years of experience in Data Modeling, including designing logical and physical data models for enterprise data warehouses and analytics systems
  • 6+ years of experience in Python scripting for data processing, automation, ETL development, and data transformation tasks
  • 6+ years of experience in Structured Query Language (SQL) for writing complex queries, stored procedures, performance tuning, and data manipulation
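The Lakehouse / Medallion modeling this role calls for is usually described as bronze (raw), silver (cleansed/deduplicated), and gold (business aggregate) layers. A toy pure-Python sketch of that flow (record shapes and keys are hypothetical; a real implementation would use PySpark and Delta tables in Databricks):

```python
from collections import defaultdict

# Bronze: raw records exactly as landed, duplicates and all (illustrative data).
bronze = [
    {"order_id": 1, "region": "NE", "amount": 120.0},
    {"order_id": 1, "region": "NE", "amount": 120.0},  # duplicate landing
    {"order_id": 2, "region": "SW", "amount": 80.0},
]

def to_silver(records):
    """Silver: cleansed layer, deduplicated on the business key (last write wins)."""
    seen = {}
    for rec in records:
        seen[rec["order_id"]] = rec
    return list(seen.values())

def to_gold(records):
    """Gold: business-level aggregate, e.g. revenue by region for reporting."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["region"]] += rec["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)  # {"NE": 120.0, "SW": 80.0}
```

Each layer only reads from the one before it, which is what keeps lineage documentation (another requirement above) tractable.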
Staff Software Engineer, AI Platform (Python/React)
Salary not disclosed
Purchase, NY 2 days ago

Join the team leading the next evolution of virtual care.

At Teladoc Health, you are empowered to bring your true self to work while helping millions of people live their healthiest lives.

Here you will be part of a high-performance culture where colleagues embrace challenges, drive transformative solutions, and create opportunities for growth. Together, we're transforming how better health happens.

Summary of Position

As a Staff Software Engineer, you are a senior individual contributor who leads the design and delivery of significant platform features and raises the bar for engineering quality across the team. You'll work hands-on in code: designing APIs and data flows, building services in Python/FastAPI and React frontends, and guiding solutions from idea to production. You'll mentor engineers, influence architecture and standards within and adjacent to your team, and partner closely with product and design to achieve clear, measurable outcomes. This role blends deep implementation work with pragmatic technical leadership by example.

Essential Duties and Responsibilities

  • Lead technical design for platform features and services, breaking ambiguous requirements into clear, incremental designs and stories for your team and adjacent partners.

  • Implement backend services in Python/FastAPI and React frontends end-to-end, owning a continuous stream of stories from idea to production.

  • Define and use clear API contracts and data flows between services and UIs, creating patterns and templates others can follow.

  • Champion high-quality engineering practices, including code reviews, documentation, and maintainable, testable designs.

  • Develop and improve automated testing (unit, integration, end-to-end) and integrate these into everyday development and CI.

  • Improve CI/CD pipelines and release workflows so the team can ship small, safe changes frequently and confidently.

  • Own the operational lifecycle of the features and services you build, including monitoring, observability, on-call participation, and incident follow-up.

  • Design and implement secure-by-default solutions, including robust authentication/authorization, input validation, and safe handling of sensitive data.

  • Identify and address reliability and performance risks early, proposing concrete technical improvements and sequencing them into the roadmap.

  • Mentor and unblock engineers through pairing, design discussions, and clear feedback; influence without formal authority.

  • Partner with product/design to shape requirements into incremental deliverables; escalate tradeoff decisions; propose sequencing that optimizes value and risk.

The time spent on each responsibility reflects an estimate and is subject to change dependent on business needs.
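The API-contract responsibility above lends itself to a small sketch. A minimal pure-Python illustration (dataclasses standing in for the Pydantic models a FastAPI route would validate; all names here are hypothetical, not Teladoc's actual API):

```python
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class VisitRequest:
    """Request contract, defined once and shared with UI client typings."""
    patient_id: str
    reason: str

@dataclass(frozen=True)
class VisitResponse:
    """Response contract returned to the frontend."""
    visit_id: int
    patient_id: str
    status: str

def create_visit(req: VisitRequest, next_id: int = 1) -> VisitResponse:
    # Handler logic kept pure and framework-free so it is trivially unit-testable;
    # a FastAPI route would only validate input and delegate here.
    if not req.patient_id:
        raise ValueError("patient_id is required")
    return VisitResponse(visit_id=next_id, patient_id=req.patient_id, status="scheduled")
```

Keeping handlers pure like this is one way to make the "maintainable, testable designs" bullet concrete: the contract types become the single source of truth for both the service and its frontend.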

Supervisory Responsibilities

No

Required Qualifications

  • Bachelor's degree in Computer Science, Engineering, or related field; equivalent work experience is acceptable.

  • 7+ years of experience in software engineering.

  • Strong proficiency with Python and modern web backends (FastAPI, Flask, Django, or similar) and solid understanding of HTTP, API design, and data modeling.

  • Significant experience with React (or a comparable SPA framework) and building production frontends that talk to backend APIs.

  • Demonstrated ability to own features end-to-end in a small team: from shaping requirements through design, implementation, testing, deployment, and support.

  • Experience designing and working with distributed systems or multi-service architectures (e.g., service boundaries, async jobs, integration patterns).

  • Solid understanding of observability and operations for production systems (metrics, logs, traces, dashboards, alerting, incident response).

  • Strong understanding of security fundamentals (authentication, authorization, secure data handling) and how they apply to web services and UIs.

  • Deep familiarity with automated testing and CI/CD, and a track record of improving engineering workflows and quality.

  • Excellent communication and collaboration skills; comfortable working closely with product, design, and other stakeholders.

  • Proven ability to provide technical leadership in a hands-on way: unblocking others, making clear decisions, and raising the bar through code and reviews.

Bonus Qualifications

  • Experience in early-stage or small platform teams where engineers wear multiple hats and balance shipping with building foundations.

  • Experience with Azure and containerized deployments (or similar cloud-native environments).

  • Experience building platforms (developer platforms, data platforms, or similar) that serve multiple product teams.

  • Exposure to AI/ML or data-intensive applications (e.g., integrating with model inference APIs, data pipelines, or analytical data stores).

The base salary range for this position is $180,000 - $200,000. In addition to a base salary, this position is eligible for a performance bonus and benefits (subject to eligibility requirements) listed here: Teladoc Health Benefits 2026. Total compensation is based on several factors including, but not limited to, type of position, location, education level, work experience, and certifications. This information is applicable for all full-time positions.

#LI-SS2 #LI-Remote

We follow a Flexible Vacation Policy, intended for rest, relaxation, and personal time. All time off must be approved by your manager prior to use. You will also receive 80 hours of Paid Sick, Safe, and Caregiver Leave annually. This applies to full-time positions only. If you are applying for a part-time role, your recruiter can provide additional details.

As part of our hiring process, we verify identity and credentials, conduct interviews (live or video), and screen for fraud or misrepresentation. Applicants who falsify information will be disqualified.

Teladoc Health will not sponsor or transfer employment work visas for this position. Applicants must be currently authorized to work in the United States without the need for visa sponsorship now or in the future.

Why join Teladoc Health?

  • Teladoc Health is transforming how better health happens. Learn how when you join us in pursuit of our impactful mission.

  • Chart your career path with meaningful opportunities that empower you to grow, lead, and make a difference.

  • Join a multi-faceted community that celebrates each colleague's unique perspective and is focused on continually improving, each and every day.

  • Contribute to an innovative culture where fresh ideas are valued as we increase access to care in new ways.

  • Enjoy an inclusive benefits program centered around you and your family, with tailored programs that address your unique needs.

  • Explore candidate resources with tips and tricks from Teladoc Health recruiters and learn more about our company culture by exploring #TeamTeladocHealth on LinkedIn.

As an Equal Opportunity Employer, we never have and never will discriminate against any job candidate or employee due to age, race, religion, color, ethnicity, national origin, gender, gender identity/expression, sexual orientation, membership in an employee organization, medical condition, family history, genetic information, veteran status, marital status, parental status, or pregnancy. In our innovative and inclusive workplace, we prohibit discrimination and harassment of any kind.

Teladoc Health respects your privacy and is committed to maintaining the confidentiality and security of your personal information. In furtherance of your employment relationship with Teladoc Health, we collect personal information responsibly and in accordance with applicable data privacy laws, including but not limited to, the California Consumer Privacy Act (CCPA). Personal information is defined as: Any information or set of information relating to you, including (a) all information that identifies you or could reasonably be used to identify you, and (b) all information that any applicable law treats as personal information. Teladoc Health's Notice of Privacy Practices for U.S. Employees' Personal information is available at this link.

Data Product Engineer
🏢 Spectraforce Technologies
Salary not disclosed
Newark, NJ 2 days ago
Job Title: Marketplace Data Product Engineer

Duration: 6+ months

Location: 100% Remote

Job Overview

The Marketplace Data Product Engineer serves as the primary technical facilitator and adoption champion for the Marketplace platform. This role bridges engineering, product, and business domains, leading workshops, demos, onboarding sessions, and cross-domain engagements to accelerate Marketplace adoption. You will configure demo environments, support development, translate complex technical concepts for business audiences, gather product feedback, and partner closely with product and engineering teams to shape the Marketplace roadmap. You will also guide domains through the process of understanding, showcasing, and maturing their data products within the ecosystem.

Key Responsibilities


  • Facilitate workshops, demos, onboarding sessions, and cross-domain engagements to drive Marketplace adoption.
  • Serve as the primary technical presenter of the Marketplace for domain teams and stakeholders.
  • Engage with domain owners to understand their data products, help refine their articulation, and showcase how they integrate into the Marketplace ecosystem.
  • Configure and maintain demo environments for Marketplace capabilities, data products, and new features.
  • Support light development, proof-of-concept configurations, and sample integrations to demonstrate platform capabilities.
  • Translate technical Marketplace concepts into clear, business-friendly language for non-technical audiences.
  • Collect structured feedback from domain teams, synthesize insights, and partner with product and engineering to influence the roadmap.
  • Develop and refine training materials, demos, playbooks, and onboarding assets to support continuous adoption.
  • Act as an advocate for domains, ensuring their data product needs and challenges are well represented in Marketplace planning.
  • Support ongoing adoption initiatives, including community sessions, office hours, and cross-domain knowledge sharing.


Required Skills & Qualifications


  • 4-7+ years of experience in data engineering, platform engineering, solution engineering, technical consulting, or similar roles.
  • Strong understanding of data products, data modeling concepts, data APIs, enterprise integrations, and metadata-driven architectures.
  • Ability to configure and demonstrate platform features, build light proofs-of-concept, and support technical onboarding.
  • Excellent communication and presentation skills, with experience translating technical concepts for business partners.
  • Experience facilitating workshops, leading demos, or driving customer/product adoption initiatives.
  • Ability to engage domain teams, understand their data product needs, and help articulate value within a larger ecosystem.
  • Strong collaboration and stakeholder management skills across engineering, product, and business teams.
  • Comfortable working in fast-moving environments and driving clarity through ambiguity.


Preferred Qualifications


  • Experience with data product and governance frameworks, data marketplaces, data mesh concepts, or platform adoption roles.
  • Hands-on experience with cloud data platforms (Azure, AWS, or GCP), data pipelines, or integration tooling.
  • Familiarity with REST/GraphQL APIs, event-driven patterns, and data ingestion workflows.
  • Background in solution architecture, customer engineering, or sales engineering.
  • Experience developing demo environments, sample apps, or repeatable platform enablement assets.
  • Strong storytelling ability when explaining data product value, domain capabilities, and Marketplace patterns.

