Join the team leading the next evolution of virtual care.
At Teladoc Health, you are empowered to bring your true self to work while helping millions of people live their healthiest lives.
Here you will be part of a high-performance culture where colleagues embrace challenges, drive transformative solutions, and create opportunities for growth. Together, we're transforming how better health happens.
Summary of Position
As a Staff Software Engineer, you are a senior individual contributor who leads the design and delivery of significant platform features and raises the bar for engineering quality across the team. You'll work hands-on in code: designing APIs and data flows, building services in Python/FastAPI and React frontends, and guiding solutions from idea to production. You'll mentor engineers, influence architecture and standards within and adjacent to your team, and partner closely with product and design to achieve clear, measurable outcomes. This role blends deep implementation work with pragmatic technical leadership by example.
Essential Duties and Responsibilities
Lead technical design for platform features and services, breaking ambiguous requirements into clear, incremental designs and stories for your team and adjacent partners.
Implement backend services in Python/FastAPI and React frontends end-to-end, owning a continuous stream of stories from idea to production.
Define and use clear API contracts and data flows between services and UIs, creating patterns and templates others can follow.
Champion high-quality engineering practices, including code reviews, documentation, and maintainable, testable designs.
Develop and improve automated testing (unit, integration, end-to-end) and integrate these into everyday development and CI.
Improve CI/CD pipelines and release workflows so your team can ship small, safe changes frequently and confidently.
Own the operational lifecycle of the features and services you build, including monitoring, observability, on-call participation, and incident follow-up.
Design and implement secure-by-default solutions, including robust authentication/authorization, input validation, and safe handling of sensitive data.
Identify and address reliability and performance risks early, proposing concrete technical improvements and sequencing them into the roadmap.
Mentor and unblock engineers through pairing, design discussions, and clear feedback; influence without formal authority.
Partner with product/design to shape requirements into incremental deliverables; escalate tradeoff decisions; propose sequencing that optimizes value and risk.
The time spent on each responsibility reflects an estimate and is subject to change dependent on business needs.
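The responsibilities above center on Python/FastAPI services exposing clear API contracts between backends and UIs. As a rough, stdlib-only illustration of what a validated request/response contract looks like, here is a minimal Python sketch; the names (`CreateVisitRequest`, `handle_create_visit`) are hypothetical, and a real FastAPI service would express the same contract with Pydantic models and route decorators.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CreateVisitRequest:
    """Inbound contract for a hypothetical 'create visit' endpoint."""
    patient_id: str
    reason: str

@dataclass(frozen=True)
class CreateVisitResponse:
    visit_id: str
    status: str

def validate_request(req: CreateVisitRequest) -> list[str]:
    """Return a list of validation errors; empty means the payload is acceptable."""
    errors = []
    if not req.patient_id.strip():
        errors.append("patient_id must be non-empty")
    if len(req.reason) > 500:
        errors.append("reason must be at most 500 characters")
    return errors

def handle_create_visit(req: CreateVisitRequest) -> CreateVisitResponse:
    """Validate, then act; a real service would persist the visit and return 201."""
    errors = validate_request(req)
    if errors:
        raise ValueError("; ".join(errors))
    return CreateVisitResponse(visit_id=f"visit-{req.patient_id}", status="created")
```

Defining the contract as explicit, immutable types is the "patterns and templates others can follow" idea in miniature: every adjacent service and UI codes against the same shape.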
Supervisory Responsibilities
No
Required Qualifications
Bachelor's degree in Computer Science, Engineering, or related field; equivalent work experience is acceptable.
7+ years of experience in software engineering.
Strong proficiency with Python and modern web backends (FastAPI, Flask, Django, or similar) and solid understanding of HTTP, API design, and data modeling.
Significant experience with React (or a comparable SPA framework) and building production frontends that talk to backend APIs.
Demonstrated ability to own features end-to-end in a small team: from shaping requirements through design, implementation, testing, deployment, and support.
Experience designing and working with distributed systems or multi-service architectures (e.g., service boundaries, async jobs, integration patterns).
Solid understanding of observability and operations for production systems (metrics, logs, traces, dashboards, alerting, incident response).
Strong understanding of security fundamentals (authentication, authorization, secure data handling) and how they apply to web services and UIs.
Deep familiarity with automated testing and CI/CD, and a track record of improving engineering workflows and quality.
Excellent communication and collaboration skills; comfortable working closely with product, design, and other stakeholders.
Proven ability to provide technical leadership in a hands-on way: unblocking others, making clear decisions, and raising the bar through code and reviews.
Bonus Qualifications
Experience in early-stage or small platform teams where engineers wear multiple hats and balance shipping with building foundations.
Experience with Azure and containerized deployments (or similar cloud-native environments).
Experience building platforms (developer platforms, data platforms, or similar) that serve multiple product teams.
Exposure to AI/ML or data-intensive applications (e.g., integrating with model inference APIs, data pipelines, or analytical data stores).
The base salary range for this position is $180,000 - $200,000. In addition to a base salary, this position is eligible for a performance bonus and benefits (subject to eligibility requirements) listed here: Teladoc Health Benefits 2026. Total compensation is based on several factors including, but not limited to, type of position, location, education level, work experience, and certifications. This information is applicable for all full-time positions.
We follow a Flexible Vacation Policy, intended for rest, relaxation, and personal time. All time off must be approved by your manager prior to use. You will also receive 80 hours of Paid Sick, Safe, and Caregiver Leave annually. This applies to full-time positions only. If you are applying for a part-time role, your recruiter can provide additional details.
As part of our hiring process, we verify identity and credentials, conduct interviews (live or video), and screen for fraud or misrepresentation. Applicants who falsify information will be disqualified.
Teladoc Health will not sponsor or transfer employment work visas for this position. Applicants must be currently authorized to work in the United States without the need for visa sponsorship now or in the future.
Why join Teladoc Health?
Teladoc Health is transforming how better health happens. Learn how when you join us in pursuit of our impactful mission.
Chart your career path with meaningful opportunities that empower you to grow, lead, and make a difference.
Join a multi-faceted community that celebrates each colleague's unique perspective and is focused on continually improving, each and every day.
Contribute to an innovative culture where fresh ideas are valued as we increase access to care in new ways.
Enjoy an inclusive benefits program centered around you and your family, with tailored programs that address your unique needs.
Explore candidate resources with tips and tricks from Teladoc Health recruiters and learn more about our company culture by exploring #TeamTeladocHealth on LinkedIn.
As an Equal Opportunity Employer, we never have and never will discriminate against any job candidate or employee due to age, race, religion, color, ethnicity, national origin, gender, gender identity/expression, sexual orientation, membership in an employee organization, medical condition, family history, genetic information, veteran status, marital status, parental status, or pregnancy. In our innovative and inclusive workplace, we prohibit discrimination and harassment of any kind.
Teladoc Health respects your privacy and is committed to maintaining the confidentiality and security of your personal information. In furtherance of your employment relationship with Teladoc Health, we collect personal information responsibly and in accordance with applicable data privacy laws, including but not limited to, the California Consumer Privacy Act (CCPA). Personal information is defined as: Any information or set of information relating to you, including (a) all information that identifies you or could reasonably be used to identify you, and (b) all information that any applicable law treats as personal information. Teladoc Health's Notice of Privacy Practices for U.S. Employees' Personal information is available at this link.
Company Description
PG Forsta is the leading experience measurement, data analytics, and insights provider for complex industries, a status we earned over decades of deep partnership with clients to help them understand and meet the needs of their key stakeholders. Our earliest roots are in U.S. healthcare, perhaps the most complex of all industries. Today we serve clients around the globe in every industry to help them improve the Human Experiences at the heart of their business. We serve our clients through an unparalleled offering that combines technology, data, and expertise to enable them to pinpoint and prioritize opportunities, accelerate improvement efforts, and build lifetime loyalty among their customers and employees.
Like all great companies, our success is a function of our people and our culture. Our employees have world-class talent, a collaborative work ethic, and a passion for the work that have earned us trusted advisor status among the world's most recognized brands. As a member of the team, you will help us create value for our clients, you will make us better through your contribution to the work and your voice in the process. Ours is a path of learning and continuous improvement; team efforts chart the course for corporate success.
Our Mission:
We empower organizations to deliver the best experiences. With industry expertise and technology, we turn data into insights that drive innovation and action.
Our Values:
To put Human Experience at the heart of organizations so every person can be seen and understood.
- Energize the customer relationship: Our clients are our partners. We make their goals our own, working side by side to turn challenges into solutions.
- Success starts with me: Personal ownership fuels collective success. We each play our part and empower our teammates to do the same.
- Commit to learning: Every win is a springboard. Every hurdle is a lesson. We use each experience as an opportunity to grow.
- Dare to innovate: We challenge the status quo with creativity and innovation as our true north.
- Better together: We check our egos at the door. We work together, so we win together.
Duties & Responsibilities
Design and implement processes, systems and automation to streamline the development and deployment of AI solutions.
Architect robust, reliable solutions for specific AI applications using appropriate cloud-based and open source technologies.
Design and automate data pipelines to deliver complex data products to power training and online inference of AI systems.
Deploy ML models, LLMs and GenAI systems into production, ensuring reliability, efficiency, and scalability across cloud or hybrid environments.
Build and maintain robust CI/CD pipelines tailored to ML model lifecycle management, ensuring a streamlined and agile deployment process.
Monitor model performance, identify potential improvements, and integrate feedback loops for continuous learning and adaptation.
Integrate models with chat interfaces and conversational platforms to create responsive, user-centric applications.
Investigate and implement agent-based architectures that support conversational intelligence and interaction modeling.
Collaborate with cross-functional teams to design AI-driven features that enhance user experience and interaction within chat interfaces.
Work closely with data scientists, product managers, and engineers to ensure alignment on project goals, data requirements, and system constraints.
Mentor junior engineers and provide guidance on best practices in ML model development, deployment, and maintenance.
Create and maintain comprehensive documentation for model architectures, code implementations, data workflows, and deployment procedures to ensure reproducibility, transparency, and ease of collaboration.
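Several of the duties above involve monitoring model performance and integrating feedback loops for continuous adaptation. The class below is a hedged, stdlib-only toy: it tracks prediction accuracy over a sliding window and flags degradation, where a production system would publish these metrics to a monitoring stack and drive retraining from the alert.

```python
from collections import deque

class RollingAccuracyMonitor:
    """Track prediction accuracy over a sliding window and flag degradation.

    An illustrative sketch of the 'monitor model performance' duty; the
    window size and threshold here are arbitrary example values.
    """
    def __init__(self, window: int = 100, threshold: float = 0.8):
        self.outcomes = deque(maxlen=window)  # True/False per prediction
        self.threshold = threshold

    def record(self, predicted, actual) -> None:
        """Record one labeled outcome as it arrives from the feedback loop."""
        self.outcomes.append(predicted == actual)

    def accuracy(self) -> float:
        if not self.outcomes:
            return 1.0
        return sum(self.outcomes) / len(self.outcomes)

    def degraded(self) -> bool:
        # Only alert once the window is full, so sparse early data can't fire it.
        return len(self.outcomes) == self.outcomes.maxlen and self.accuracy() < self.threshold
```

The design choice worth noting is the full-window guard: drift alerts on a handful of samples generate noise, not signal.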
Technical Skills
Experience with large-scale deployment tools and environments, including Docker, Kubernetes, and cloud platforms like AWS, Azure, or GCP.
Experience deploying and managing a variety of database technologies.
Experience deploying ML models at scale and optimizing models for low-latency, high-availability environments.
Strong programming skills in Python and proficiency in libraries such as NumPy, Pandas, and Scikit-learn.
Experience with data pipelines, ETL processes, and experience with distributed data frameworks like Apache Spark or Dask.
Familiarity with machine learning frameworks such as TensorFlow, PyTorch, and Hugging Face Transformers.
Knowledge of conversational AI, agent-based systems, and chat interface development.
Proven track record in deploying and maintaining ML and AI solutions in a production setting.
Experience with version control (e.g., Git) and CI/CD tools tailored to ML workflows.
Experience with MLOps.
Experience with Databricks is a plus.
Qualifications
Minimum Qualifications
5+ years of experience in platform engineering with a focus on data and ML systems.
Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.
Don't meet every single requirement? Studies have shown that women and people of color are less likely to apply to jobs unless they meet every single qualification. At Press Ganey we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for this or other roles.
Additional Information for US based jobs:
Press Ganey Associates LLC is an Equal Employment Opportunity/Affirmative Action employer and is committed to a diverse workforce. We do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, veteran status, disability, or any other federal, state, or local protected class.
Pay Transparency Non-Discrimination Notice - Press Ganey will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.
The expected base salary for this position ranges from $100,000 to $140,000. It is not typical for offers to be made at or near the top of the range. Salary offers are based on a wide range of factors including relevant skills, training, experience, education, and, where applicable, licensure or certifications obtained. Market and organizational factors are also considered. In addition to base salary and a competitive benefits package, successful candidates are eligible to receive a discretionary bonus or commission tied to achieved results.
All your information will be kept confidential according to EEO guidelines.
Our privacy policy can be found here: legal-privacy/
We are seeking an experienced and forward-thinking Solution Architect - Data Engineering to lead the design and implementation of scalable, secure, and high-performance data solutions. The ideal candidate will have deep expertise with Python and SQL, experience with data warehouses (Snowflake or something similar), a strong command of engineering best practices (including linters and code formatters, project organization, and managing environments), and practical experience building CI/CD pipelines to ensure robust, automated delivery of data pipelines and services.
Responsibilities
- Architect Scalable Data Solutions
Design and implement end-to-end data engineering architectures that are scalable, maintainable, and performant across batch and real-time processing systems.
- Engineering Leadership
Lead by example with high-quality Python code, utilizing linters and formatters (e.g., pylint, flake8, black) and enforcing code cleanliness, readability, and best practices across teams.
- CI/CD Pipeline Development
Build, manage, and optimize CI/CD pipelines using tools such as GitHub Actions, GitLab CI, CircleCI, or Jenkins to automate testing, code quality checks, and deployment of data engineering components.
- Data Governance & Quality
Establish data validation, logging, and monitoring strategies to ensure data integrity and reliability at scale.
- Collaborate Cross-Functionally
Work closely with data scientists, software engineers, DevOps, and business stakeholders to translate requirements into technical solutions and ensure alignment with overall enterprise architecture.
- Mentorship & Code Reviews
Provide guidance to junior developers, lead technical reviews, and enforce clean coding standards throughout the data engineering team.
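As a concrete illustration of the Data Governance & Quality responsibility, here is a minimal, hypothetical Python check for required fields in a batch of records. Real pipelines would typically lean on a framework (e.g., dbt tests or Great Expectations) and wire failures into logging and alerting rather than returning a plain list.

```python
def validate_rows(rows, required_fields):
    """Run a basic data-quality check over a batch of records.

    Returns a list of (row_index, field, problem) tuples; an empty list
    means the batch passed. This is an illustrative stand-in for the
    validation/monitoring strategies the role describes.
    """
    failures = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                failures.append((i, field, "missing"))
    return failures
```

Surfacing failures as structured tuples, rather than raising on the first bad row, lets a pipeline quarantine offending records and keep the batch moving.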
Required Skills & Experience
- 7+ years of experience in software or data engineering, with 3+ years in an architectural or technical leadership role.
- Expert-level proficiency in Python and SQL, with a deep understanding of best practices, performance tuning, and maintainable code patterns.
- Proven experience with linters, formatters, and other static analysis tools to ensure code quality and compliance.
- Hands-on experience designing and implementing CI/CD pipelines for data pipelines, APIs, and other backend services.
- Solid knowledge of modern data platforms and technologies (e.g., Spark, Airflow, dbt, Kafka, Snowflake, BigQuery, etc.).
- Strong understanding of software engineering practices such as version control, testing, and continuous integration.
Desired Skills & Experience
- Experience working in cloud environments (AWS, GCP, or Azure).
- Familiarity with Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
- Understanding of security, compliance, and governance in data pipelines.
- Excellent communication and documentation skills.
- Strong leadership presence with the ability to mentor and influence teams.
- Problem-solver with a focus on delivering value and simplicity through technology.
Wage and Benefits
We offer a Total Rewards package that includes medical and dental coverage, 401(k) plans, flex spending, life insurance, disability, employee discount program, employee stock purchase program and paid family benefits to support you and your family. The salary range for this position is posted below. Where an employee or prospective employee is paid within this range will depend on, among other factors, actual ranges for current/former employees in the subject position, market considerations, budgetary considerations, tenure and standing with the Company (applicable to current employees), as well as the employee's/applicant's skill set, level of experience, and qualifications.
Employment Transparency
It is the policy of our company to provide equal employment opportunities to all employees and applicants for employment without regard to race, color, ethnicity, gender, age, religion, creed, national origin, sexual orientation, gender identity, marital status, citizenship, genetic information, veteran status, disability, or any other basis prohibited by applicable federal, state, or local law.
Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities that are required of the employee for this job. Duties, responsibilities, and activities may change at any time with or without notice.
The employer will make reasonable accommodations in compliance with the Americans with Disabilities Act of 1990. The job description will be reviewed periodically as duties and responsibilities change with business necessity. Essential and other job functions are subject to modification. Reasonable accommodations may be provided to enable individuals with disabilities to perform the essential functions.
For applicants to jobs in the United States: In compliance with the current Americans with Disabilities Act and state and local laws, if you have a disability and would like to request an accommodation to apply for a position with our company, please email .
Salary Range: $200,000 - $220,000 USD

“Let goodness, fairness, and most importantly, love prevails in business; profits will inevitably follow.” – NK Chaudhary, founder
What we do for our team members:
- Comprehensive Benefits: Company Paid Holidays, PTO, Parental Involvement Leave, Maternity/Paternity Leave, EAP, No Cost Employee Medical Plan, Vision, Dental, and Company Paid Life Insurance. We also include a match on retirement (401K/Roth).
- Career Development: We're committed to providing growth for career development within the company, supporting our team members' aspirations with a well-defined succession plan that includes a variety of training and development opportunities.
- Pet-Friendly Workplace: We welcome your furry friends! Our 'Bring Your Dogs to Work' policy creates a pet-friendly atmosphere, allowing our team members to enjoy the companionship of their dogs during the workday.
- Wellness Support: Not only do we support an active lifestyle with our on-site basketball court and yoga studio, but we host quarterly mental health events to assist in creating a well-rounded work-life harmony for our team members.
- Sustainability Efforts: Reuse, Renew, and Refresh by joining our Green Team! Responsible for harvesting from the organic community garden, donating goods to local pet shelters and schools, creating educational workshops, leading nature walks, and much more, they promote well-being through sustainable practices.
Our Values
Empowerment • Inclusiveness • Responsibility • Progressive
Learn more about our company story here: Jaipur Rugs Foundation
Since 2004, the Jaipur Rugs Foundation has worked to improve the lives of rug-weaving artisans in India. This is done through training, skills development, and social interventions. By focusing on the ideas and solutions that create social value, the Foundation supports the dignity and heritage of these traditional artisans, believing that healthy and sustainable communities are key to the survival of traditional rug weaving. Jaipur Living has made ethical and socially conscious global citizenship the foundation of its business. Through social initiatives and the Jaipur Rugs Foundation, the company supports a supplier ecosystem without a middleman of more than 40,000 artisans in 700 villages across India by providing them with a livable wage, access to health care, leadership education, and opportunities for personal growth and development. Combining time-honored techniques and of-the-moment trends, every Jaipur Living product is as ethically and responsibly made as it is beautiful.
Learn more about the Jaipur Rugs Foundation here:

We are a fast-growing, design-led B2B home décor and textiles brand with big ambitions. Over the last 12 months, we have revolutionized our technical foundation, investing in Microsoft Dynamics 365 (F&O) and a Microsoft Fabric ecosystem. We are now looking for a seasoned leader to refine our existing infrastructure, optimize our end-to-end data workflows, and bridge the gap between "raw data" and "reliable business intelligence."
This role demands a strong balance of technical depth and operational management. While you must possess expert-level proficiency in data engineering, specifically within the Microsoft Fabric ecosystem and modern data platforms, we also need a leader who is experienced in analytics, data visualization, BI, and translating business needs into analytical solutions. You will be responsible for defining and executing an outcome-based Data & Analytics strategy, building and developing a global team of data engineers, BI developers, and data analysts, and ensuring the company has trusted, scalable, and decision-ready data at every level of the organization. The ideal candidate is a Fabric-certified or Fabric-trained leader, an exceptional communicator, and a proven people manager who can balance hands-on technical depth with strategic leadership.
Key Responsibilities:
Strategic Management & Outcome-Based Delivery
- Tactical Roadmap: Develop and execute a multi-year roadmap that aligns data engineering, BI, and advanced insights with business priorities (e.g., inventory efficiency, margin protection, and growth).
- Process Standardization: Define what “good” looks like for data reliability, documentation, insight quality, and business impact
- Baseline Maturity: Shift the organization from ad-hoc reporting to repeatable, trusted, decision-ready data products
- Advance Automation: Assess the current-state landscape and define a clear path from foundational reporting to automated, predictive analytics.
- Executive Communication: Serve as the single point of accountability for all data and analytics capabilities, translating technical progress into business-relevant implications across the organization
Infrastructure Optimization & Fabric Engineering
- Systemic Optimization: Lead the audit and refinement of the existing Fabric environment (Lakehouse, Pipelines, Notebooks) to improve overall performance, stability, and refresh reliability
- Engineering Standards: Set the "gold standard" for architecture, data modeling, testing, and deployment (CI/CD), ensuring the stack is hardened for enterprise-scale growth
- Reduce Manual Effort: Minimize operational risk by standardizing pipelines, refresh processes, and metric calculations
- Automation & Reliability: Systematically identify and eliminate manual reporting and spreadsheet-based workflows through robust automation in PySpark and Fabric
- Proactive Governance: Establish monitoring, alerting, and exception-handling processes to manage data quality and refresh failures before they impact the business
Analytics & Decision Enablement
- High-Quality BI Delivery: Oversee the design and delivery of visually appealing Power BI dashboards that simplify complexity and adhere to our design-led brand standards
- Metric Governance: Ensure KPI definitions and reporting logic are consistent across the company, acting as the arbiter of "the truth" for business metrics
- Advanced Analytics: Identify and operationalize high-value use cases for predictive analytics (e.g., demand forecasting, product lifecycle analysis) as platform maturity increases
- Business Translation: Partner with business leaders to translate business requirements into scalable, intuitive, impactful analytics solutions
- Business Evolution: Lead the transition from descriptive and diagnostic reporting to forward-looking insights that support planning and decision-making
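The Metric Governance bullet above, acting as the arbiter of "the truth" for business metrics, is commonly implemented as a single registry of KPI definitions that every report draws from. A hedged Python sketch (the metric names and formulas here are invented for the example):

```python
# A hypothetical central metric registry: one governed definition per KPI,
# so every dashboard computes "gross margin" the same way.
METRICS = {
    "gross_margin_pct": lambda r: 100.0 * (r["revenue"] - r["cogs"]) / r["revenue"],
    "sell_through_pct": lambda r: 100.0 * r["units_sold"] / (r["units_sold"] + r["units_on_hand"]),
}

def compute_metric(name: str, record: dict) -> float:
    """Look up the governed definition and apply it to one record."""
    try:
        formula = METRICS[name]
    except KeyError:
        raise KeyError(f"Unknown metric '{name}'; add it to the registry first")
    return formula(record)
```

In practice the same idea lives in a Power BI semantic model or dbt metrics layer; the point is that the definition exists exactly once.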
Global Team Leadership & Talent Development
- People Leadership: Directly lead and develop a 3–5 person global team (primarily based in India), establishing clear roles, accountability, and a high-performance culture
- Skill Development: Create career paths and skill-development plans for engineers and analysts to ensure consistent, high-quality delivery
- Operating Model: Build a scalable offshore capability that delivers at speed while maintaining rigorous standards for code quality and documentation
Skills & Minimum Qualifications:
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of knowledge, skill, and/or ability required. Reasonable accommodation may be made to enable individuals with disabilities to perform essential functions.
- 10+ years of experience in data engineering, analytics, or BI, with director-level scope or equivalent ownership
- Deep hands-on experience with Microsoft Fabric (Lakehouse, Pipelines, Notebooks, semantic models)
- Fabric certification or formal Fabric training strongly preferred
- Strong experience with PySpark and Spark-based transformations
- Strong understanding of Azure data services and modern data architectures
- Exceptional dashboard-development skills using Power BI; portfolio-quality experience preferred
- Strong understanding of data storytelling, executive-ready visualization, and intuitive UI/UX design
- Experience gathering business requirements and translating them into analytical products
- Proven experience leading and developing global / offshore teams
- Strong communicator with the ability to influence at senior levels
- Experience supporting ERP-driven environments; Dynamics 365 preferred
- Ability to juggle strategy, execution, and stakeholder communication simultaneously
Success Measures (First 12–18 Months)
- Strategy Execution: An outcome-based Data & Analytics strategy that is fully operational and tied to business outcomes
- Optimized Infrastructure: A trusted, scalable Fabric platform with significantly reduced manual reporting and 99%+ data availability
- Dashboard Adoption: A suite of high-quality dashboards used daily and weekly by business leaders to drive decision-making
- Team Growth: A high-performing global team with a track record of delivering complex analytics products with speed and precision
Physical Requirements:
- Remaining in a seated position for long periods of time
- Standing is to remain on one’s feet in an upright position without moving about
- The ability to alternate between sitting and standing is present when a worker has the flexibility to choose between sitting or standing as needed, when this need cannot be accommodated by scheduled breaks and/or a lunch period
- Lifting and transporting items that could weigh up to 25 pounds
- Entering text or data into a computer by means of a traditional keyboard
- Expressing or exchanging ideas by means of the spoken word to impart oral information to clients and talent and convey detailed spoken instructions to other workers accurately and quickly
- The ability to hear, understand, and distinguish speech and/or other sounds such as in person and telephone
- Clarity of vision to see computer screens and workspace
Senior Data Modeler
Hybrid 3-4 days onsite
Location: Phoenix, Arizona
Salary: $130,000 - $150,000 base
A large, operationally complex organization is undergoing a major modernization of its data platform and is building a new, cloud-native analytics foundation from the ground up. This is a greenfield opportunity for a senior-level data modeler to establish best practices, influence architecture, and help shape how data is organized and used across the business.
This role sits at the center of a multi-year transformation focused on modern analytics, scalable data products, and strong collaboration between data and business teams.
What You’ll Be Working On
- Designing and implementing enterprise data models across conceptual, logical, and physical layers
- Establishing Medallion architecture patterns and reusable modeling assets
- Building dimensional and semantic models that support analytics and reporting
- Partnering closely with domain experts and functional leaders to translate business needs into data structures
- Collaborating with data engineers to align models with ELT pipelines and analytics frameworks
- Helping define modeling standards and upskilling senior engineers in modern data modeling practices
- Contributing hands-on to data engineering work where needed (SQL, transformations, optimization)
- Proactively identifying analytics opportunities and recommending data structures to support them
This role is roughly 40% data modeling, 30% hands-on engineering, and 30% cross-functional collaboration.
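Since Medallion architecture is central to the role, a brief sketch of the bronze → silver → gold refinement may help. This pure-Python toy only illustrates the shape of each layer; the actual work runs on Spark/dbt over a cloud lakehouse, and the field names below are invented for the example.

```python
# Bronze: raw ingested records, warts and all (strings, padding, bad values).
bronze = [
    {"order_id": "A1", "amount": "100.50", "region": "west "},
    {"order_id": "A2", "amount": "bad",    "region": "east"},
    {"order_id": "A3", "amount": "49.50",  "region": "west"},
]

def to_silver(rows):
    """Cleanse and conform: cast types, trim strings, drop unparseable rows."""
    silver = []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine this row, not silently drop it
        silver.append({"order_id": r["order_id"], "amount": amount,
                       "region": r["region"].strip()})
    return silver

def to_gold(rows):
    """Aggregate into a business-ready fact: revenue by region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals
```

Each layer has one job: bronze preserves the source faithfully, silver enforces types and conformance, and gold serves the dimensional/semantic models the analytics teams consume.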
Must-Have Experience
- Strong, hands-on experience with data modeling (dimensional, canonical, semantic)
- Deep understanding of Medallion architecture
- Advanced SQL and experience working with a modern cloud data warehouse
- Experience with dbt for transformations and modeling
- Hands-on experience in cloud-native data environments (AWS preferred)
- Ability to work directly with business stakeholders and explain technical concepts clearly
- Experience collaborating closely with data engineers on execution
Nice to Have
- Python experience
- Familiarity with Informatica or reverse-engineering legacy data models
- Exposure to streaming or near-real-time data pipelines
- Experience with visualization tools (tool choice is flexible)
Who Will Thrive in This Role
- A senior individual contributor who enjoys building from scratch
- Someone who can act as a modeling expert and mentor in an organization formalizing this practice
- Comfortable working in ambiguity and taking initiative
- Strong communicator who enjoys partnering with both technical and non-technical teams
- Equally comfortable discussing business concepts and physical data models
Why This Role Is Unique
- Greenfield data modeling initiative with real influence
- Opportunity to define standards that will be used across the organization
- Work on large-scale, real-world operational and analytical data
- High visibility within a growing data organization
- Flexible work setup for individual contributors
If you’re excited about shaping a modern data foundation and want to be the person who defines how data is modeled, understood, and used, this is a rare opportunity to make a lasting impact.
We’re proud to be where the pets go and where the pet people go. If you want to make a real difference, create an exciting career path, feel welcome to be your whole self and nurture your wellbeing, Petco is the place for you.
Our core values capture that spirit as we work to improve lives by doing what’s right for pets and people.
- Pet First – Protect & Empower. All pets should Live their Best Life. We put the needs of pets and pet parents at the center of everything we do.
- Foster the Fun – Connect & Bond. Our Passion for pets brings us together! We celebrate the journey of pet parenthood through distinct experiences, products, and services.
- Let’s Go! – Own & Commit. We are stronger as One Petco team. We bring our unique superpowers and champion authenticity in everyone to drive success.
We’re proud to be "where the pets go" to find everything they need to live their best lives for more than 60 years — from their favorite meals and toys, to trusted supplies and expert support from people who get it, because we live it. We believe in the universal truths of pet parenthood — the boundless boops, missing slippers, late night zoomies and everything in between. And we’re here for it. Every tail wag, every vet visit, every step of the way. We are 29,000+ strong and together we nurture the pet-human bond in more than 1,500 Petco stores across the U.S., Mexico and Puerto Rico, 250+ Vetco Total Care hospitals, hundreds of preventive care clinics and eight distribution centers. In 1999, we founded Petco Love. Together, we support thousands of local animal welfare groups nationwide and have helped find homes for approximately 7 million animals through in-store adoption events.
Membership, Customer Data & Loyalty
Position Overview
The Senior Digital Product Manager will lead digital product initiatives supporting Membership, Customer Data, and Loyalty programs for a $6B specialty retail organization, and will own the end-to-end product strategy and roadmap for customer identity, data platforms, and loyalty experiences across digital and in-store channels.
The ideal candidate brings deep expertise in customer data platforms (CDPs), identity resolution, loyalty ecosystems, personalization, and privacy governance, combined with strong business acumen and cross-functional leadership skills.
Key Responsibilities
Product Strategy & Vision
- Define and execute the multi-year product strategy for Membership, Customer Data, and Loyalty platforms.
- Develop and maintain a prioritized product roadmap aligned with enterprise growth, retention, and customer lifetime value (CLV) objectives.
- Identify opportunities to leverage customer data to drive personalization, engagement, and revenue growth.
- Lead development and optimization of customer data capabilities, including:
- Identity resolution and profile unification
- Data governance and compliance (GDPR, CCPA, etc.)
- Segmentation and audience management
- Real-time personalization enablement
- Partner with Engineering and Data teams to evolve CDP, CRM, and marketing technology stacks.
- Ensure scalable architecture to support omnichannel retail environments.
- Own digital product capabilities supporting loyalty enrollment, rewards management, tiering, promotions, and engagement campaigns.
- Optimize customer lifecycle journeys from acquisition through retention.
- Develop features that enhance member value proposition and drive repeat purchase behavior.
- Measure and improve loyalty program ROI, retention rate, and lifetime value.
- Lead agile product teams and collaborate closely with:
- Engineering
- Data Science & Analytics
- Marketing & CRM
- eCommerce
- Store Operations
- Finance & Legal
- Serve as the voice of the customer and translate business objectives into clear product requirements.
- Align stakeholders around KPIs and measurable outcomes.
- Define success metrics and KPIs (CLV, retention, engagement, incremental revenue, NPS).
- Use data and experimentation (A/B testing, cohort analysis) to drive product decisions.
- Build executive-level reporting and business cases for investment prioritization.
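As a toy sketch (not from the posting) of the experimentation work listed above, the readout behind an A/B product decision often reduces to a two-proportion z-test on conversion rates; the counts below are invented:

```python
# Hypothetical A/B readout: does variant B convert better than A?
from math import sqrt, erf

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Return (absolute lift, two-sided p-value) for B vs. A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Invented counts: 120/2400 conversions for A, 156/2400 for B.
lift, p = ab_z_test(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(round(lift, 4), round(p, 4))
```

Cohort analysis and CLV modeling build on the same principle: quantify the uncertainty before acting on an observed difference.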
Qualifications
- 5+ years of product management experience, with 3+ years in digital product leadership.
- Deep expertise in customer data management, CDPs, CRM systems, and loyalty platforms.
- Experience in retail, specialty retail, consumer brands, or omnichannel environments.
- Proven track record of delivering data-driven personalization initiatives.
- Strong understanding of privacy regulations and data governance frameworks.
- Experience leading agile product teams and influencing cross-functional stakeholders.
- Demonstrated ability to manage complex platform integrations and enterprise-scale systems.
- Experience working in a multi-billion-dollar retail organization.
- Background in subscription or membership-based business models.
- Familiarity with leading CDP and CRM ecosystems (e.g., Salesforce, Adobe, Tealium).
- MBA or advanced degree in business, technology, or related field.
- Strategic thinker with strong commercial acumen
- Data-driven decision maker
- Influential communicator with executive presence
- Customer-obsessed mindset
- Bias for action and measurable impact
- Ability to operate in fast-paced, matrixed organizations
This role directly influences customer retention, personalization maturity, and revenue growth by shaping how the organization leverages its customer data assets. The Senior Digital Product Manager will play a critical role in strengthening membership value, loyalty engagement, and long-term customer relationships.
#CORP
Qualified applications with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act.
The pay ranges outlined below are presented in accordance with state-specific regulations. These ranges may differ in other areas and could be subject to variation based on regulatory minimum wage requirements. Actual pay rates will depend on factors such as position, location, level of experience, and applicable state or local minimum wage laws. If the regulatory minimum wage exceeds the minimum indicated in the pay range below, the regulatory minimum wage will be the minimum rate applied.
Salary Range: $103,800.00 - $155,700.00
Hourly or Salary Range will be reflected above. For a more detailed overview of Petco Total Rewards, including health and financial benefits, 401K, incentives, and PTO, see the Petco Total Rewards overview. Petco Animal Supplies, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, age, protected veteran status, or any other protected classification.
As a Data Steward Senior Analyst, you are part of a team responsible for enabling and supporting compliance with data-related enterprise policies within their domains/business units. You and your team are responsible for identifying critical data and associated risks, maintaining data definitions, classifying data, supporting data sourcing and usage requests, measuring Data Risk Controls, and confirming Data Issues are remediated. You have the opportunity to partner across various business units, technology teams, and product/platform teams to define and implement the data governance strategy, supervise and lead data quality efforts, resolve data/platform issues, and drive consistency, usability, and governance of specific product data across the enterprise.
In addition, this role will play a key part in effectively communicating new and updated data-related policies to the teams responsible for compliance. The individual must be skilled in preparing clear, engaging presentations that translate formal policy language into practical, easy-to-understand guidance and “tell the story” behind the policy requirements. The role will also support the delivery of training sessions, facilitate policy office hours, and serve as a go-to resource for questions related to data governance and retention compliance.
Your Primary Responsibilities may include:
• Assist in identifying data-related risks and associated controls for key business processes. Risks relate to Record Retention (primary), Data Quality, Data Movement, Data Stewardship, Data Protection, Data Sharing, among others.
• Develop training materials and educate organization on Record Retention and Deletion processes and procedures.
• Develop deep understanding of key enterprise data-related policies and serve as the policy expert for the business unit, providing education to teams regarding policy implications for business.
• Collaborate with and influence product managers to ensure all new use cases are managed according to policies.
• Influence and contribute to strategic improvements to data assessment processes and analytical tools.
• Support current regulatory reporting needs via existing platforms, working with upstream data providers, downstream business partners, as well as technology teams.
• Provide subject matter expertise on multiple platforms.
• Partner with the Data Steward Manager in developing and managing the data compliance roadmap.
Qualifications include:
• 5+ years of experience in a similar role involved with ensuring compliance with Record Retention and Deletion policies.
• Strong communication skills and ability to influence and engage at multiple levels and cross functionally.
• Intermediate understanding of Data Management and Data Governance concepts (metadata, lineage, data quality, etc.) and prior experience.
• 5+ years of Data Quality Management experience.
• Strong familiarity with data architecture and/or data modeling concepts
• 5+ years of experience with Agile or SAFe project methodologies
• Bachelor’s degree in Finance, Engineering, Mathematics, Statistics, Computer Science or other similar fields.
• Preferred: Experience in Travel Industry.
• Preferred: Knowledge of RCSA (Risk Control Self-Assessment) methodology
Leadership Skills may include:
• Makes Decisions Quickly and Effectively: Drives effective outcomes through decision-making authority. Displays judgment and discretion to ensure deliverables comply with American Express policy and overall compliance requirements.
• Drives Innovation & Change: Provides systematic and rational analysis to identify the root cause of problems. Is prepared to challenge the status quo and drive innovation. Makes informed judgments, recommends tailored solutions.
• Leverages Team - Collaboration: Coordinates efforts within and across teams to deliver goals, accountable to bring in ideas, information, suggestions, and expertise from others outside & inside the immediate team.
• Communication: Influences and holds others accountable and has ability to convince others. Identifies the specific data governance requirements and is able to communicate clearly and in a compelling way.
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure accuracy and consistency.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
- Communicates and promotes the code of ethics and business conduct.
- Ensures completion of required company compliance training programs.
- Is trained – either through formal education or through experience – in software / hardware technologies and development methodologies.
- Stays current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
- Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- Ability to handle stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
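To make the RAG responsibilities above concrete, here is a deliberately tiny, library-free sketch of the retrieval step: documents are indexed as embedding vectors and queried by cosine similarity. In practice an embedding model and a vector database (e.g. Pinecone or Vertex AI Vector Search) replace the hand-made vectors and list below, which are invented for illustration:

```python
# Toy retrieval step of a RAG pipeline. Vectors and doc IDs are made up.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# "Indexed" knowledge base: (doc_id, embedding) pairs.
index = [
    ("returns-policy", [0.9, 0.1, 0.0]),
    ("store-hours",    [0.1, 0.8, 0.2]),
    ("loyalty-faq",    [0.2, 0.1, 0.9]),
]

def retrieve(query_vec, k=2):
    """Top-k documents by similarity; their text would then be added
    to the LLM prompt as grounding context."""
    scored = sorted(index, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

print(retrieve([0.85, 0.15, 0.05]))  # most similar documents first
```

Curation of the knowledge base (chunking, deduplication, metadata) is usually where most of the engineering effort in this role would go.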
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.
Data Engineer
Our client is seeking a Data Engineer to take ownership of end-to-end data processes within a growing, values-driven organization. This individual will play a key role in ensuring data is accurate, reliable, and actionable across the business. The ideal candidate is hands-on, detail-oriented, and comfortable working across the full data lifecycle—from ingestion and transformation to reporting and stakeholder enablement.
This role is a hybrid model in Portland, Oregon or Lakewood, Washington.
Data Engineer Responsibilities
- Own data quality across systems by identifying, troubleshooting, and resolving inconsistencies and inaccuracies.
- Design, build, and maintain scalable ETL/ELT pipelines to transform raw data into clean, structured datasets.
- Manage data ingestion processes from multiple internal and external sources, including APIs and databases.
- Develop and optimize SQL queries, data models, and schemas to support analytics and reporting needs.
- Create and maintain dashboards and reports in Power BI, ensuring they are accurate, user-friendly, and actionable.
- Partner with business stakeholders to translate requirements into data solutions and meaningful insights.
- Monitor pipeline performance and reliability, proactively addressing failures and inefficiencies.
- Contribute to data architecture design, including data lake structure and best practices.
- Document data sources, transformations, and workflows to support transparency and scalability.
- Collaborate cross-functionally with engineering and business teams to support data-driven decision making.
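The ingestion-to-reporting lifecycle described above can be sketched as a minimal ETL step, shown here with Python's stdlib only; the JSON payload, field names, and table are invented for illustration:

```python
# Hypothetical minimal ETL step: ingest semi-structured JSON (as an
# API might return it), clean it, and load it into a queryable table.
import json
import sqlite3

# Extract: raw records, with the messiness typical of source systems.
raw = json.loads("""[
  {"order_id": "A1", "amount": "19.99", "region": "pdx"},
  {"order_id": "A2", "amount": null,    "region": "PDX"},
  {"order_id": "A3", "amount": "5.00",  "region": "sea"}
]""")

# Transform: drop rows missing required fields, normalize types/casing.
clean = [
    (r["order_id"], float(r["amount"]), r["region"].upper())
    for r in raw
    if r.get("amount") is not None
]

# Load: write to a target table and verify with a reporting query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?,?,?)", clean)
total = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(total)  # [('PDX', 19.99), ('SEA', 5.0)]
```

In an ELT variant, the raw JSON would be loaded first and the cleanup expressed as SQL (e.g. dbt models) inside the warehouse instead.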
Data Engineer Qualifications
- 3+ years of experience in a data engineering, analytics engineering, or similar role with ownership of data pipelines and reporting.
- Strong proficiency in SQL, including complex queries, joins, and performance optimization.
- Hands-on experience with Python for data transformation, scripting, and automation.
- Proven experience building and maintaining Power BI dashboards, including data modeling and DAX.
- Experience designing and managing ETL/ELT processes and understanding when to apply each approach.
- Familiarity with cloud-based data platforms, preferably within a Microsoft ecosystem (e.g., Azure Data Factory, Synapse, or similar tools).
- Experience working with data lakes and modern data architecture concepts.
- Ability to work with APIs and semi-structured data formats such as JSON.
- Strong communication skills with the ability to explain data concepts to non-technical stakeholders.
- Detail-oriented with a strong sense of ownership and accountability for data accuracy.
Preferred:
- Experience with ERP or CRM systems as data sources (e.g., Microsoft Dynamics environments).
- Familiarity with transformation frameworks such as dbt.
- Experience working in smaller, collaborative teams with broad responsibilities.
- Background supporting financial or operational data where accuracy is critical.
About Us
Perform Properties is a Blackstone Real Estate portfolio company focused on high-performing retail and office properties with People-Appeal - vibrant spaces where people actively choose to work, shop, and gather. With expertise in transactions, development, leasing, and management, the company oversees over 33 million square feet of retail and office properties across the U.S.
Role Summary
Our VP, Data & Analytics unlocks data-driven growth at the speed of natural language through AI-enabled execution across Perform Properties. An innovative architect with deep business literacy, the VP, Data & Analytics will lead our efforts to put data at the center of our Technology capabilities with a modern, performant and AI-ready data & analytics platform. This critical capability will serve a wide range of business functions, enabling Investments, Portfolio, Operations and Finance people to put AI, BI, Analytics and other emerging technologies to work for them every day – not just talk about the potential & possibilities.
This role reports to the Chief Technology Officer and is based in the office, 5 days a week.
Essential Job Functions
- Drive Data Architecture & Engineering excellence that actively reduces our Coordination Tax
- Build Data Modeling & Analytics capabilities to reduce our Time-to-Productivity
- Champion Artificial and Business Intelligence (AI / BI) capabilities through compelling next generation interactions (Visualization, Natural Language & Agentic) that reduce our Time-to-Insight
- Cultivate Data governance & stakeholder engagement that creates real shared ownership of our platform
- Model the successful use of AI as a capabilities & resource extension, not just a gimmick
- Develop individuals & teams of technologists in the Data & Analytics space as their leader
Qualifications and Technical Competencies
- 10+ years leading Data Science, Data Engineering, Analytics and/or AI / ML-focused teams
- 5-7 years managing agile projects (Scrum, Kanban, SAFe)
- 3-5 years managing people (direct reports, manager of managers)
- Demonstrable success working with modern data platforms (Databricks, Snowflake, BigQuery, RedShift, Synapse)
- Demonstrable success delivering AI / ML initiatives (Natural Language Processing, Predictive Modeling, Statistical Modeling)
- Advanced proficiency in common data engineering tools (R, Python, DBT, SQL, Azure Data Factory)
- Advanced proficiency in common visualization tools (Tableau, PowerBI)
- Bachelor’s Degree in Computer Science, Mathematics or relevant tertiary education
Benefits & Compensation
Benefits: The Company provides a variety of benefits to employees, including health insurance coverage, retirement savings plan, paid holidays and paid time off (PTO).
Base Salary Range: $225,000 - $265,000. This represents the presently anticipated low and high end of the Company’s base salary range for this position. Actual base salary may vary based on various factors, including but not limited to location and experience.
The additional total direct compensation and benefits described above are subject to the terms and conditions of any governing plans, policies, practices, agreements, or other materials or documents as in effect from time to time, including but not limited to terms and conditions regarding eligibility.
Closing
EEO Statement
Our company is proud to be an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Our employment decisions are based on individual qualifications, job requirements, and business needs without regard to race, color, marital status, sex, sexual orientation, gender identity and/or expression, age, religion, disability, citizenship status, national origin, pregnancy, veteran status, and/or any other legally protected characteristics. We are committed to providing reasonable accommodations. If you need an accommodation to complete the application process, please email
#LI-Onsite