Cloudera Data Platform (CDP) Jobs in USA
12,118 positions found — Page 3
About Pinterest:
Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.
Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences and embrace the flexibility to do your best work. Creating a career you love? It's Possible.
At Pinterest, AI isn't just a feature, it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.
Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.
Team & Mission
The Privacy & Conversion Data team is responsible for how the company safely and compliantly uses conversion data to power monetization. We build and operate the core privacy infrastructure behind ads reporting and optimization, including controlled data environments, fine-grained access controls, centralized privacy rules enforcement, and de-identification pipelines for conversion data. Our mission is to make conversion data privacy-preserving by default: centralized, de-identified, auditable, and easy for teams to use, while maintaining high utility for advertisers and staying ahead of an evolving global regulatory landscape.
Role Summary
We're seeking a Staff Engineer to lead the architecture and technical direction for the conversion data privacy platform, spanning both core Conversion Data systems and de-identification for ads reporting. You'll own the end-to-end design and evolution of privacy-critical pipelines and services, partner closely with Product, Data Science, Legal, and infrastructure teams, and set the technical bar for how we use conversion data safely at scale.
What you'll do:
- Lead the technical strategy and architecture for conversion data privacy across access controls, de-identification, deletion, and privacy rules enforcement, driving toward a centralized, de-identified-by-default, automated privacy platform for monetization.
- Design and evolve core privacy infrastructure including controlled environments for sensitive data, fine-grained authorization and policy enforcement, and a central policy repository that consistently governs access across major data platforms and query engines.
- Own de-identification pipelines for ads reporting end-to-end: from separating sensitive and non-sensitive data, applying de-identification techniques and transformations, and generating privacy-preserving datasets, to validating data utility and feeding reporting and analytics surfaces.
- Build and improve privacy frameworks and tooling (for both online and offline workflows) that make safe, compliant conversion data usage simple and self-service for downstream teams, reducing onboarding friction for new datasets, restrictions, and use cases.
- Drive operational excellence and compliance by defining SLAs, building robust monitoring and alerting (e.g., de-identification quality, opt-out metrics, data leakages), leading incident response, and developing performant deletion and leakage-handling workflows that meet regulatory and audit requirements.
- Partner cross-functionally with ads, data, product, legal, and infrastructure stakeholders to translate legal/privacy requirements into technical designs, make clear trade-offs between privacy and utility, and drive alignment on roadmaps, launches, and policy changes that impact advertisers and users.
- Mentor and uplevel engineers across multiple teams, lead critical design and code reviews in privacy-sensitive areas, and establish best practices and documentation for privacy-by-design, de-identification, and large-scale data systems.
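The de-identification work described above often begins with keyed pseudonymization, so raw identifiers never reach downstream reporting tables. A minimal sketch in Python, assuming a hypothetical record layout and a placeholder secret (a real pipeline would pull the key from a secrets manager and likely run on a distributed engine):

```python
import hmac
import hashlib

# Placeholder secret for illustration only; production systems would rotate
# this key and store it in a secrets manager, never in source code.
SECRET_KEY = b"placeholder-rotate-me"

def pseudonymize(user_id: str, key: bytes = SECRET_KEY) -> str:
    """Return a stable, non-reversible token for a sensitive identifier."""
    return hmac.new(key, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

def split_record(record: dict, sensitive_fields: set) -> tuple:
    """Separate non-sensitive fields from sensitive ones, tokenizing the latter."""
    safe = {k: v for k, v in record.items() if k not in sensitive_fields}
    tokens = {k: pseudonymize(str(record[k])) for k in sensitive_fields if k in record}
    return safe, tokens

# Hypothetical conversion record: only user_id is treated as sensitive.
record = {"user_id": "u-12345", "campaign": "spring_sale", "conversions": 3}
safe, tokens = split_record(record, {"user_id"})
# safe keeps campaign/conversions; tokens["user_id"] is a 64-char hex digest
```

Because HMAC is keyed, the same input always yields the same token (preserving join utility) while the raw identifier cannot be recovered without the key.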
What we're looking for:
- BS+ in Computer Science (or related field) or equivalent practical experience.
- 8+ years of professional software engineering experience, with a focus on large-scale data systems or distributed systems.
- Strong proficiency building and operating data pipelines and services using Java/Scala/Kotlin or Python, plus SQL; experience with modern big data ecosystems is a plus.
- Experience designing secure, reliable systems and APIs, with solid grounding in data modeling, access control, and performance optimization.
- Meaningful experience in at least one of: privacy-preserving data systems (e.g., de-identification, k-anonymity), ads measurement/attribution, or large-scale analytics/experimentation platforms.
- Proven ability to drive cross-team technical initiatives from design through rollout, working closely with product, data science, and non-engineering partners (e.g., Legal, Compliance).
- Strong communication and leadership skills, with a track record of mentoring engineers, raising engineering standards, and making sound decisions in ambiguous, high-impact problem spaces.
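As a concrete reference for the k-anonymity item above: a dataset is k-anonymous when every combination of quasi-identifier values is shared by at least k records. A minimal check in Python (field names are illustrative):

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    """True when every quasi-identifier combination occurs at least k times."""
    counts = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return all(c >= k for c in counts.values())

# Hypothetical de-identified conversion rows; age_band and zip3 are the
# quasi-identifiers that could be linked with outside data.
rows = [
    {"age_band": "30-39", "zip3": "941", "converted": 1},
    {"age_band": "30-39", "zip3": "941", "converted": 0},
    {"age_band": "40-49", "zip3": "100", "converted": 1},
]
# k=2 fails here: the (40-49, 100) group contains only a single record.
```

Real enforcement pipelines generalize or suppress values until the check passes, but the counting logic is the same.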
In-Office Requirement Statement:
- We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.
Relocation Statement:
- This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.
At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, or particular skills and expertise.
Information regarding the culture at Pinterest and benefits available for this position can be found here.
US based applicants only: $177,185–$364,795 USD
Our Commitment to Inclusion:
Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.
As a Data Steward Senior Analyst, you are part of a team responsible for enabling and supporting compliance with data-related enterprise policies within their domains/business units. You and your team are responsible for identifying critical data and associated risks, maintaining data definitions, classifying data, supporting data sourcing / usage requests, measuring Data Risk Controls, and confirming Data Issues are remediated. You have the opportunity to partner across various business units, technology teams, and product/platform teams to define and implement the data governance strategy, supervising and leading data quality, resolving data/platform issues, and driving consistency, usability, and governance of specific product data across the enterprise.
In addition, this role will play a key part in effectively communicating new and updated data-related policies to the teams responsible for compliance. The individual must be skilled in preparing clear, engaging presentations that translate formal policy language into practical, easy-to-understand guidance and “tell the story” behind the policy requirements. The role will also support the delivery of training sessions, facilitate policy office hours, and serve as a go-to resource for questions related to data governance and retention compliance.
Your Primary Responsibilities may include:
• Assist in identifying data-related risks and associated controls for key business processes. Risks relate to Record Retention (primary), Data Quality, Data Movement, Data Stewardship, Data Protection, Data Sharing, among others.
• Develop training materials and educate organization on Record Retention and Deletion processes and procedures.
• Develop deep understanding of key enterprise data-related policies and serve as the policy expert for the business unit, providing education to teams regarding policy implications for business.
• Collaborate with and influence product managers to ensure all new use cases are managed according to policies.
• Influence and contribute to strategic improvements to data assessment processes and analytical tools.
• Support current regulatory reporting needs via existing platforms, working with upstream data providers, downstream business partners, as well as technology teams.
• Serve as a subject matter expert on multiple platforms.
• Partner with the Data Steward Manager to develop and manage the data compliance roadmap.
Qualifications include:
• 5+ years of experience in a similar role ensuring compliance with Record Retention and Deletion policies.
• Strong communication skills and ability to influence and engage at multiple levels and cross functionally.
• Intermediate understanding of Data Management and Data Governance concepts (metadata, lineage, data quality, etc.), with prior hands-on experience.
• 5+ years of Data Quality Management experience.
• Strong familiarity with data architecture and/or data modeling concepts.
• 5+ years of experience with Agile or SAFe project methodologies.
• Bachelor’s degree in Finance, Engineering, Mathematics, Statistics, Computer Science or other similar fields.
• Preferred: Experience in Travel Industry.
• Preferred: Knowledge of RCSA (Risk Control Self-Assessment) methodology
Leadership Skills may include:
• Makes Decisions Quickly and Effectively: Drives effective outcomes through decision-making authority. Displays judgment and discretion to ensure deliverables meet American Express policy and overall compliance requirements.
• Drives Innovation & Change: Provides systematic and rational analysis to identify the root cause of problems. Is prepared to challenge the status quo and drive innovation. Makes informed judgments, recommends tailored solutions.
• Leverages Team - Collaboration: Coordinates efforts within and across teams to deliver goals, accountable to bring in ideas, information, suggestions, and expertise from others outside & inside the immediate team.
• Communication: Influences and holds others accountable and has ability to convince others. Identifies the specific data governance requirements and is able to communicate clearly and in a compelling way.
Company/Role Overview:
CliftonLarsonAllen (CLA) Search has been retained by Midwestern Higher Education Compact to identify a Data Manager to serve their team. The Midwestern Higher Education Compact (MHEC) brings together leaders from 12 Midwestern states to strengthen postsecondary education, advance student success, and promote regional economic vitality.
MHEC programs and initiatives save member states and students millions of dollars annually through time- and cost-savings opportunities. MHEC research supports workforce readiness and improves the quality, accessibility, and affordability of postsecondary education. MHEC convenings bring together leaders and subject experts to share knowledge, generate ideas, and develop collaborative solutions.
What You’ll Do:
- Administer and maintain Microsoft Fabric, OneLake, and Azure environments.
- Design and deliver sophisticated data solutions that are innovative and sustainable.
- Ensure data infrastructure is secure, reliable, and scalable.
- Manage and improve how data is brought into the organization from multiple sources.
- Maintain accurate, well-structured, consistent, and complete data that ensures high quality and usability for internal staff.
- Develop and oversee standards on how data is collected, stored, and protected across departments.
- Manage MHEC’s customer relationship management (CRM) system, ensuring data integrity, integration with other platforms, and alignment with organizational needs.
- Partner with teams across the organization to monitor processes and make recommendations.
- Partner with research staff to understand data access patterns and develop storage strategies that accelerate research and analytics.
- Develop and maintain Power BI dashboards and reports to deliver clear insights to senior leaders and decision-makers.
- Ensure staff have access to timely, clear, and meaningful data visualizations.
- Train staff to use reports and dashboards effectively.
- Support departments in using data to guide decision-making.
- Document data pipelines, integrations, and system processes.
- Recommend tools and practices that help MHEC grow its data capacity.
- Monitor developments in Microsoft’s data platforms and assess future needs.
What You’ll Need:
- Bachelor's degree or equivalent experience preferred.
- 5+ years’ experience, preferably with Microsoft data platforms including Power BI, Azure, and/or Fabric.
- Experience designing and maintaining data systems and dashboards.
- Experience in higher education or nonprofit sectors preferred.
- Strong technical understanding of Microsoft Fabric, OneLake, and Azure.
- Demonstrated proficiency in Python, R, SAS, SQL, or other statistical/data management software.
- Experience with data visualization platforms (Tableau, Power BI, or similar).
- Experience with Microsoft Dynamics and Power Automate is a plus but not required.
- Ability to plan, optimize, build, and maintain data pipelines and dashboards.
Overview
We are seeking a seasoned Analytics leader to build and lead our enterprise Analytics and Data Governance function in a modern group purchasing / procurement environment. This leader will turn our rich ecosystem of member, supplier, contract, and transaction data into a strategic asset that drives savings, compliance, growth, and differentiated insight for our members and suppliers.
This leader will also own the data governance operating model, enterprise metrics, and analytics roadmap that power member-facing insights, internal performance management, and AI use cases across the technology platform (Website, B2B eCommerce, supplier portal, sourcing tools, and partner integrations).
Key responsibilities
Data governance and policy
- Define and run the enterprise data governance framework covering member, supplier, contract, item, and transaction data domains.
- Establish data ownership and stewardship across functions (Category Management, Supplier Management, Finance, Sales, Marketing, Digital) driving clear accountabilities for data quality and definitions.
- Implement policies for responsible use of data in supplier programs, member reporting, and AI/ML models, ensuring compliance with contractual, regulatory, and privacy requirements.
- Drive data quality management (profiling, remediation, SLAs) for critical assets such as contract price files, item catalogs, rebate/accrual data, and member hierarchies.
- Oversee metadata, business glossary, and data lineage so teams can confidently understand “one source of truth” for core GPO metrics (e.g., committed vs. actual spend, penetration, compliance, savings delivered).
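The core GPO metrics named above can be pinned down with simple ratio definitions, though exact formulas vary by organization. A hedged Python sketch, assuming penetration means actual versus committed spend and compliance means on-contract versus total spend:

```python
def penetration(actual_spend: float, committed_spend: float) -> float:
    """Share of committed spend actually realized through GPO contracts.

    Definition is illustrative; organizations define this metric differently.
    """
    return actual_spend / committed_spend if committed_spend else 0.0

def compliance_rate(on_contract_spend: float, total_spend: float) -> float:
    """Share of total spend that flowed through governed contracts."""
    return on_contract_spend / total_spend if total_spend else 0.0

# e.g., a member committed $100k and transacted $80k through contract: 80%
penetration(80_000, 100_000)
```

Governing these as named, documented functions (or semantic-layer measures) is exactly what the "one source of truth" goal above asks for: every dashboard computes the ratio the same way.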
Analytics strategy and delivery
- Define the enterprise analytics vision and roadmap aligned to procurement value levers: spend visibility, category performance, contract compliance, leakage detection, rebate optimization, and supplier performance.
- Lead the design and delivery of standardized KPI suites and dashboards for executives, category teams, supplier partners, and member account teams (e.g., savings scorecards, compliance heatmaps, portfolio optimization).
- Partner with Product and Engineering to ensure the data platform (warehouse, semantic layer, BI tools) can support self-service analytics, embedded insights in member/supplier portals, and AI-driven use cases.
- Champion enterprise metrics and advanced analytics capabilities such as forecasting, benchmarking, opportunity sizing, and integrity analytics, ensuring models are traceable, governed, and auditable.
- Translate business needs into clear data products (curated data sets, subject-area marts, APIs) that serve both internal teams and external-facing solutions.
Stakeholder leadership and collaboration
- Serve as the enterprise “single point of accountability” for data and analytics, aligning priorities across Technology, Category Management, Supplier Relations, Sales, Finance, and Operations.
- Partner with Supplier and Member-facing teams to co-create analytics offerings that differentiate the GPO (e.g., supplier growth playbooks, member CFO dashboards, public-sector transparency packs).
- Educate executives and business leaders on data literacy, standard metrics, and how to use insights in planning, negotiations, and supplier programs.
- Collaborate closely with Security, Legal, and Compliance to ensure that member and supplier data is used ethically and in line with contracts and regulations.
Team building and operations
- Build and lead a high-performing team of data analysts, analytics engineers, data governance managers, and data stewards.
- Define operating rhythms (data council, data domain forums, metric review cadences) that keep governance and analytics tightly connected to business outcomes.
- Establish and track KPIs for the data function itself (data quality scores, adoption of governed datasets, BI usage, time-to-insight).
- Select and manage key tools and vendors in the analytics and governance ecosystem (warehouse, BI, catalog/governance, quality monitoring).
Qualifications
- Bachelor’s or Master’s degree in Data/Computer Science, Information Systems, Analytics, Statistics, Business, or related field.
- 10+ years of experience in analytics, data governance, or enterprise data management, including 3–5+ years leading teams.
- Proven experience in a procurement, supply chain, GPO, distribution, or B2B marketplace environment strongly preferred.
- Demonstrated success implementing data governance frameworks and delivering analytics that directly influenced commercial or procurement outcomes (e.g., savings, compliance, supplier growth).
- Hands-on familiarity with modern data platforms (e.g., Snowflake/BigQuery/Redshift, dbt, Power BI/Tableau/Looker, and one or more data catalog/governance tools).
- Strong grasp of regulatory / contractual considerations relevant to member and supplier data (data sharing agreements, use of benchmarking, privacy/security standards).
- Excellent leadership, storytelling, and stakeholder management skills; able to influence at C-suite and board levels.
Attributes for success
- Business-first mindset: instinctively ties data work to member value, supplier value, and financial impact.
- Pragmatic operator: balances governance rigor with speed, enabling innovation rather than blocking it.
- Skilled translator: can convert complex data and AI topics into clear narratives for executives, sales, and category leaders.
- Culture builder: passionate about creating a data-driven culture that values standard definitions, trusted data, and measurable outcomes.
Compensation:
$150,000 to $200,000 per year annual salary.
Exact compensation may vary based on several factors, including skills, experience, and education.
Benefit packages for this role may include healthcare insurance offerings and paid leave as provided by applicable law.
At MVP Health Care, we're on a mission to create a healthier future for everyone. That means embracing innovation, championing equity, and continuously improving how we serve our communities. Our team is powered by people who are curious, humble, and committed to making a difference in every interaction, every day. We've been putting people first for over 40 years, offering high-quality health plans across New York and Vermont and partnering with forward-thinking organizations to deliver more personalized, equitable, and accessible care. As a not-for-profit, we invest in what matters most: our customers, our communities, and our team.
What's in it for you:
- Growth opportunities to uplevel your career
- A people-centric culture embracing and celebrating diverse perspectives, backgrounds, and experiences within our team
- Competitive compensation and comprehensive benefits focused on well-being
- An opportunity to shape the future of health care by joining a team recognized as a Best Place to Work For in the NY Capital District, one of the Best Companies to Work For in New York, and an Inclusive Workplace.
You'll contribute to our humble pursuit of excellence by bringing curiosity to spark innovation, humility to collaborate as a team, and a deep commitment to being the difference for our customers. Your role will reflect our shared goal of enhancing health care delivery and building healthier, more vibrant communities.
Qualifications you'll bring:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or equivalent experience.
- Availability to work full-time on a hybrid schedule
- 5+ years in data engineering, platform engineering, or cloud data operations.
- 3+ years hands-on Azure Databricks experience in production.
- 3+ years cloud (Azure) data platform engineering experience.
- Curiosity to foster innovation and pave the way for growth
- Humility to play as a team
- Commitment to being the difference for our customers in every interaction
Your key responsibilities:
- Workspace Administration (Dev/Test/Prod): Administer, standardize, and optimize Azure Databricks workspaces across Dev/Test/Prod, including clusters, jobs, libraries, pools, policies, and secrets.
- Platform Standards & Root-Cause Analysis: Define, implement, and enforce platform standards and guardrails; lead advanced troubleshooting and root-cause analysis in partnership with data, cloud, and security teams.
- Spark Performance & Cost Optimization: Tune Spark workloads and cluster configurations to maximize performance and minimize cost across batch and streaming pipelines; develop visibility into compute consumption and efficiency trends.
- Governance (Unity Catalog): Operate and enhance Unity Catalog governance, including secure access, lineage, auditing, and compliance alignment with enterprise and regulatory requirements.
- Security & Identity Integration: Partner with Security and Cloud teams to implement Azure networking patterns, Azure Active Directory (AAD) integration, and workspace isolation controls.
- DevOps (CI/CD & Git): Design, build, and maintain CI/CD pipelines and Git-based workflows for Databricks notebooks, jobs, Delta Live Tables, and Unity Catalog artifacts.
- Infrastructure as Code (Terraform): Provision and manage Databricks infrastructure using Terraform (IaC) to enable consistent SDLC promotion, repeatability, and scalable operations.
- Lakehouse Enablement & Reusable Patterns: Develop and promote lakehouse platform patterns, reusable frameworks, and standardized orchestration for batch and streaming data pipelines.
- Observability & Reliability Engineering: Implement observability and reliability practices, including monitoring, dashboards, alerting, SLIs/SLOs, and incident response to improve platform stability and recoverability.
- Technical Leadership & Continuous Improvement: Mentor engineers and document standards while staying current with Databricks and Azure capabilities; drive continuous improvement through pragmatic enhancements and best practices.
- Contribute to our humble pursuit of excellence by performing various responsibilities that may arise, reflecting our collective goal of enhancing healthcare delivery and being the difference for the customer.
Where you'll be:
Hybrid in Schenectady, NY or Rochester, NY
Pay Transparency
MVP Health Care is committed to providing competitive employee compensation and benefits packages. The base pay range provided for this role reflects our good faith compensation estimate at the time of posting. MVP adheres to pay transparency nondiscrimination principles. Specific employment offers and associated compensation will be extended individually based on several factors, including but not limited to geographic location; relevant experience, education, and training; and the nature of and demand for the role.
We do not request current or historical salary information from candidates.
$121,767.00-$161,949.75
MVP's Inclusion Statement
At MVP Health Care, we believe creating healthier communities begins with nurturing a healthy workplace. As an organization, we strive to create space for individuals from diverse backgrounds and all walks of life to have a voice and thrive. Our shared curiosity and connectedness make us stronger, and our unique perspectives are catalysts for creativity and collaboration.
MVP is an equal opportunity employer and recruits, employs, trains, compensates, and promotes without discrimination based on race, color, creed, national origin, citizenship, ethnicity, ancestry, sex, gender identity, gender expression, religion, age, marital status, personal appearance, sexual orientation, family responsibilities, familial status, physical or mental disability, handicapping condition, medical condition, pregnancy status, predisposing genetic characteristics or information, domestic violence victim status, political affiliation, military or veteran status, Vietnam-era or special disabled Veteran or other legally protected classifications.
To support a safe, drug-free workplace, pre-employment criminal background checks and drug testing are part of our hiring process. If you require accommodations during the application process due to a disability, please contact our Talent team at .
Role: Data & Analytics Technical Program Manager (TPM)
LOCATION: PHOENIX, AZ (HYBRID)
Full-Time/Direct Hire
Overview
We are looking for a Data & Analytics TPM to lead delivery of enterprise data initiatives and help scale the company's analytics platform. This role will coordinate data engineering, analytics, and business teams to deliver high-impact data products and insights.
Ideally, we need someone who comes from a core hands-on technical Data & Analytics background and transitioned into Program Management.
Responsibilities
- Lead delivery of data platform and analytics programs.
- Manage initiatives across Snowflake, data pipelines, and BI analytics.
- Coordinate data engineering, analytics, and business stakeholders.
- Track roadmap, milestones, and execution for data initiatives.
- Drive adoption of dashboards, data products, and analytics capabilities.
Qualifications
- 5–8+ years in Technical Program Management or Data/Analytics programs
- Must have technical experience in Snowflake, ETL, BI analytics, and AWS Cloud.
- Experience with modern data platforms (Snowflake, ETL pipelines, cloud)
- Strong stakeholder and program management skills
- Experience working with data engineering and analytics teams
We Are Hiring: Databricks Lead Data Engineer – Director Equivalent Role
Location: Atlanta, USA
Work Model: Hybrid – 3 to 4 days in office per week (mandatory)
Eligibility: US Citizens and Green Card (GC) holders only
How to Apply
If you are interested in this position and have the required skills, please send your resume to:
; ;
Paves Technologies is seeking a highly experienced Databricks Lead Data Engineer – Lead Level (Director Equivalent Role) to drive enterprise-scale data architecture, governance, and advanced analytics initiatives on Azure Cloud. This is a senior leadership role requiring deep Databricks expertise, strong data modeling capabilities, and hands-on architectural ownership across PySpark-based distributed systems.
Role Overview
The ideal candidate will bring 10–12+ years of overall data engineering experience, including strong hands-on expertise with Azure Databricks, PySpark, Python, and Azure Cloud data services. You will define architecture standards, lead modernization initiatives, and implement scalable Medallion Architecture (Bronze, Silver, Gold layers) to support enterprise analytics and business intelligence.
Key Responsibilities
- Lead end-to-end architecture and implementation of enterprise-scale data platforms using Azure Databricks on Azure Cloud.
- Design and implement Medallion Architecture (Bronze, Silver, Gold layers) using Delta Lake best practices.
- Build scalable PySpark-based ETL/ELT pipelines across ingestion (Bronze), transformation (Silver), and curated analytics (Gold) layers.
- Develop advanced data transformations using Python, PySpark, Spark SQL, and advanced SQL constructs.
- Architect robust data models (dimensional, star schema, normalized models) aligned to analytics and reporting needs.
- Drive adoption of advanced Databricks capabilities including Unity Catalog, Declarative Pipelines, Delta Lake optimization, and governance frameworks.
- Establish best practices for partitioning strategies, file compaction, Z-ordering, caching, broadcast joins, and query optimization.
- Define and standardize reusable Azure Cloud data platform tools, templates, CI/CD frameworks, and infrastructure automation.
- Work across Azure ecosystem components such as Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure DevOps, networking, and security services.
- Ensure high standards for data quality, RBAC, lineage tracking, governance, and production stability.
- Provide architectural leadership and mentorship to data engineering teams.
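The Medallion layering in these responsibilities can be summarized as: Bronze holds raw ingested records, Silver holds cleaned and de-duplicated records, and Gold holds curated aggregates. A toy sketch in plain Python to illustrate the flow (a production build would use PySpark DataFrames and Delta Lake tables; field names are invented):

```python
def to_silver(bronze_rows):
    """Clean and de-duplicate raw (Bronze) records on the business key."""
    seen, silver = set(), []
    for row in bronze_rows:
        if row.get("order_id") is None:
            continue                      # drop malformed records
        if row["order_id"] in seen:
            continue                      # de-duplicate on order_id
        seen.add(row["order_id"])
        silver.append({**row, "amount": float(row["amount"])})
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned (Silver) records into a curated (Gold) summary."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

# Raw landing-zone data: one duplicate and one malformed record.
bronze = [
    {"order_id": 1, "region": "east", "amount": "10.5"},
    {"order_id": 1, "region": "east", "amount": "10.5"},
    {"order_id": None, "region": "west", "amount": "3.0"},
    {"order_id": 2, "region": "west", "amount": "4.0"},
]
silver = to_silver(bronze)
gold = to_gold(silver)  # {"east": 10.5, "west": 4.0}
```

The key architectural point is one-way flow: Bronze is never edited in place, so Silver and Gold can always be rebuilt from it, which is what makes the layering auditable.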
Required Experience & Skills
- 10–12+ years of overall experience in Data Engineering.
- Minimum 3+ years of strong hands-on Databricks experience.
- Mandatory Certifications:
- Databricks Certified Data Engineer Associate
- Databricks Certified Data Engineer Professional
- Deep hands-on expertise in PySpark, Python programming, and distributed Spark processing.
- Strong experience designing and implementing Medallion Architecture (Bronze/Silver/Gold layers).
- Advanced knowledge of Data Modeling, Data Analysis, and complex SQL (window functions, CTEs, execution plan tuning).
- Strong understanding of Delta Lake architecture, schema evolution, partition strategies, performance optimization, and data governance.
- Well-versed in enterprise Azure Cloud data platforms, reusable accelerators, CI/CD templates, and governance standards.
- Proven experience architecting scalable, secure, cloud-native data solutions.
- Strong leadership, stakeholder management, and executive communication skills.
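As an illustration of the "window functions, CTEs" requirement above, here is a running-total query of the kind such a role involves, run from Python against an in-memory SQLite database (the table and data are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, day INTEGER, amount REAL);
    INSERT INTO sales VALUES
        ('east', 1, 100), ('east', 2, 50), ('west', 1, 80), ('west', 2, 20);
""")

# A CTE feeding a window function: per-region running total ordered by day.
query = """
    WITH daily AS (
        SELECT region, day, amount FROM sales
    )
    SELECT region, day,
           SUM(amount) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM daily
    ORDER BY region, day;
"""
rows = conn.execute(query).fetchall()
# rows: ('east', 1, 100.0), ('east', 2, 150.0), ('west', 1, 80.0), ('west', 2, 100.0)
```

The same PARTITION BY / ORDER BY pattern carries over directly to Spark SQL on Databricks, which is where this role would actually apply it.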
Company Description
Press Ganey is the leading experience measurement, data analytics, and insights provider for complex industries, a status we earned over decades of deep partnership with clients to help them understand and meet the needs of their key stakeholders. Our earliest roots are in U.S. healthcare, perhaps the most complex of all industries. Today we serve clients around the globe in every industry to help them improve the Human Experiences at the heart of their business. We serve our clients through an unparalleled offering that combines technology, data, and expertise to enable them to pinpoint and prioritize opportunities, accelerate improvement efforts and build lifetime loyalty among their customers and employees.
Like all great companies, our success is a function of our people and our culture. Our employees have world-class talent, a collaborative work ethic, and a passion for the work that have earned us trusted advisor status among the world's most recognized brands. As a member of the team, you will help us create value for our clients and make us better through your contribution to the work and your voice in the process. Ours is a path of learning and continuous improvement; team efforts chart the course for corporate success.
Our Mission:
We empower organizations to deliver the best experiences. With industry expertise and technology, we turn data into insights that drive innovation and action.
Our Values:
To put Human Experience at the heart of organizations so every person can be seen and understood.
Energize the customer relationship: Our clients are our partners. We make their goals our own, working side by side to turn challenges into solutions.
Success starts with me: Personal ownership fuels collective success. We each play our part and empower our teammates to do the same.
Commit to learning: Every win is a springboard. Every hurdle is a lesson. We use each experience as an opportunity to grow.
Dare to innovate: We challenge the status quo with creativity and innovation as our true north.
Better together: We check our egos at the door. We work together, so we win together.
We are seeking an experienced Staff Data Engineer to join our Unified Data Platform team. The ideal candidate will design, develop, and maintain enterprise-scale data infrastructure leveraging Azure and Databricks technologies. This role involves building robust data pipelines, optimizing data workflows, and ensuring data quality and governance across the platform. You will collaborate closely with analytics, data science, and business teams to enable data-driven decision-making.
Duties & Responsibilities:
- Design, build, and optimize data pipelines and workflows in Azure and Databricks, including Data Lake and SQL Database integrations.
- Implement scalable ETL/ELT frameworks using Azure Data Factory, Databricks, and Spark.
- Optimize data structures and queries for performance, reliability, and cost efficiency.
- Drive data quality and governance initiatives, including metadata management and validation frameworks.
- Collaborate with cross-functional teams to define and implement data models aligned with business and analytical requirements.
- Maintain clear documentation and enforce engineering best practices for reproducibility and maintainability.
- Ensure adherence to security, compliance, and data privacy standards.
- Mentor junior engineers and contribute to establishing engineering best practices.
- Support CI/CD pipeline development for data workflows using GitLab or Azure DevOps.
- Partner with data consumers to publish curated datasets into reporting tools such as Power BI.
- Stay current with advancements in Azure, Databricks, Delta Lake, and data architecture trends.
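The layered ETL/ELT pattern these duties describe (often called a Medallion Architecture: raw Bronze, validated Silver, aggregated Gold) can be sketched schematically. In production these stages would be Spark DataFrames backed by Delta tables; plain Python structures keep the example self-contained, and all field names are invented:

```python
# Illustrative Bronze -> Silver -> Gold flow. Names and data are invented.

bronze = [  # raw ingested records, warts and all
    {"order_id": "1", "amount": "10.50", "country": "us"},
    {"order_id": "2", "amount": "bad",   "country": "US"},
    {"order_id": "3", "amount": "4.25",  "country": "us"},
]

def to_silver(rows):
    """Validate and standardize: drop unparseable amounts, normalize country."""
    out = []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # bad-record quarantine/dead-letter handling would go here
        out.append({"order_id": r["order_id"], "amount": amount,
                    "country": r["country"].upper()})
    return out

def to_gold(rows):
    """Aggregate to a reporting-ready metric: revenue per country."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'US': 14.75}
```

The design point is that each layer has one responsibility (ingest, clean, serve), so data quality checks and governance rules attach naturally at the Bronze-to-Silver boundary.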
Technical Skills:
- Advanced proficiency in Azure (5+ years), including Data Lake, ADF, and SQL.
- Strong expertise in Databricks (5+ years), Apache Spark (5+ years), and Delta Lake (5+ years).
- Proficient in SQL (10+ years) and Python (5+ years); familiarity with Scala is a plus.
- Strong understanding of data modeling, data governance, and metadata management.
- Knowledge of source control (Git), CI/CD, and modern DevOps practices.
- Familiarity with the Power BI visualization tool.
Minimum Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or related field.
- 7+ years of experience in data engineering, with significant hands-on work in cloud-based data platforms (Azure).
- Experience building real-time data pipelines and streaming frameworks.
- Strong analytical and problem-solving skills.
- Proven ability to lead projects and mentor engineers.
- Excellent communication and collaboration skills.
Preferred Qualifications:
- Master's degree in Computer Science, Engineering, or a related field.
- Exposure to machine learning integration within data engineering pipelines.
Don't meet every single requirement? Studies have shown that women and people of color are less likely to apply to jobs unless they meet every single qualification. At Press Ganey we are dedicated to building a diverse, inclusive, and authentic workplace, so if you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for this or other roles.
Additional Information for US based jobs:
Press Ganey Associates LLC is an Equal Employment Opportunity/Affirmative Action employer and is committed to a diverse workforce. We do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, veteran status, disability, or any other federal, state, or local protected class.
Pay Transparency Non-Discrimination Notice - Press Ganey will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.
The expected base salary for this position ranges from $110,000 to $170,000. It is not typical for offers to be made at or near the top of the range. Salary offers are based on a wide range of factors including relevant skills, training, experience, education, and, where applicable, licensure or certifications obtained. Market and organizational factors are also considered. In addition to base salary and a competitive benefits package, successful candidates are eligible to receive a discretionary bonus or commission tied to achieved results.
All your information will be kept confidential according to EEO guidelines.
Our privacy policy can be found here: legal-privacy/
About Us
Perform Properties is a Blackstone Real Estate portfolio company focused on high-performing retail and office properties with People-Appeal - vibrant spaces where people actively choose to work, shop, and gather. With expertise in transactions, development, leasing, and management, the company oversees over 33 million square feet of retail and office properties across the U.S. Learn more: .
Role Summary
Our VP, Data & Analytics unlocks data-driven growth at the speed of natural language through AI-enabled execution across Perform Properties. An innovative architect with deep business literacy, the VP, Data & Analytics will lead our efforts to put data at the center of our Technology capabilities with a modern, performant and AI-ready data & analytics platform. This critical capability will serve a wide range of business functions, enabling Investments, Portfolio, Operations and Finance people to put AI, BI, Analytics and other emerging technologies to work for them every day – not just talk about the potential & possibilities.
This role reports to the Chief Technology Officer and is based in the office, 5 days a week.
Essential Job Functions
- Drive Data Architecture & Engineering excellence that actively reduces our Coordination Tax
- Build Data Modelling & Analytics capabilities to reduce our Time-to-Productivity
- Champion Artificial and Business Intelligence (AI / BI) capabilities through compelling next generation interactions (Visualization, Natural Language & Agentic) that reduce our Time-to-Insight
- Cultivate Data governance & stakeholder engagement that creates real shared ownership of our platform
- Model the successful use of AI as a capabilities & resource extension, not just a gimmick
- Develop individuals & teams of technologists in the Data & Analytics space as their leader
Qualifications and Technical Competencies
- 10+ years leading Data Science, Data Engineering, Analytics and/or AI / ML-focused teams
- 5-7 years managing agile projects (Scrum, Kanban, SAFe)
- 3-5 years managing people (direct reports, manager of managers)
- Demonstrable success working with modern data platforms (Databricks, Snowflake, BigQuery, RedShift, Synapse)
- Demonstrable success delivering AI / ML initiatives (Natural Language Processing, Predictive Modeling, Statistical Modeling)
- Advanced proficiency in common data engineering tools (R, Python, DBT, SQL, Azure Data Factory)
- Advanced proficiency in common visualization tools (Tableau, PowerBI)
- Bachelor’s Degree in Computer Science, Mathematics or relevant tertiary education
Benefits & Compensation
Benefits: The Company provides a variety of benefits to employees, including health insurance coverage, retirement savings plan, paid holidays and paid time off (PTO).
Base Salary Range: $225,000-$265,000. This represents the presently-anticipated low and high end of the Company’s base salary range for this position. Actual base salary range may vary based on various factors, including but not limited to location and experience.
The additional total direct compensation and benefits described above are subject to the terms and conditions of any governing plans, policies, practices, agreements, or other materials or documents as in effect from time to time, including but not limited to terms and conditions regarding eligibility.
Closing
EEO Statement
Our company is proud to be an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Our employment decisions are based on individual qualifications, job requirements, and business needs without regard to race, color, marital status, sex, sexual orientation, gender identity and/or expression, age, religion, disability, citizenship status, national origin, pregnancy, veteran status, or any other legally protected characteristic. We are committed to providing reasonable accommodations; if you need an accommodation to complete the application process, please email
#LI-Onsite
The University of Maryland (UMD) seeks a Manager of Data Analytics Enablement to lead the adoption and modernization of enterprise analytics capabilities that enable trusted, data-informed decision-making across campus.
This is an exciting time to join UMD as we advance enterprise data and analytics through a period of innovative growth and modernization.
This role will play a key part in shaping the future of enterprise business intelligence, advancing Microsoft Power BI and Fabric capabilities, and embedding sustainable data quality and stewardship practices into analytics workflows.
Reporting to the Director of Enterprise Data Services, this position partners with institutional leaders, IT teams, and enterprise stakeholders to deliver reliable data products, consistent metrics, and actionable insights.
The manager will lead a team of data professionals and advance practical, operational governance practices that support trusted analytics and long-term institutional impact.
Key Responsibilities: Lead the strategy, development, and continuous improvement of the university’s enterprise business intelligence environment, including Microsoft Power BI and Microsoft Fabric.
Establish standards, best practices, and architectural patterns for semantic models, dashboards, and analytics delivery.
Guide migration and modernization efforts to ensure scalable, secure, and high-performing analytics solutions.
Develop and manage an analytics intake, prioritization, and delivery framework aligned with institutional priorities.
Define and implement data quality monitoring practices to ensure reliability, accuracy, and consistency of enterprise data assets.
Partner with technical teams to embed validation, monitoring, and observability into data pipelines and lakehouse environments.
Promote consistent metric definitions and collaborate with campus stakeholders to clarify data ownership and stewardship roles.
Support adoption of metadata management, data catalog, and lineage capabilities.
Ensure analytics solutions align with university standards for security, privacy, and responsible data use.
Manage, mentor, and develop a team of analytics and data professionals, fostering a culture of quality, collaboration, and service.
Communicate analytics priorities, progress, and impact to leadership and campus partners.
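The data quality monitoring described in the responsibilities above amounts to running declarative checks against datasets and reporting violations. A minimal sketch, with invented dataset and rule names (real deployments would typically use a framework such as Great Expectations, or Fabric/Power BI data quality features, over warehouse tables):

```python
# Minimal data-quality check sketch: each rule returns (name, passed, violations).
# Records and field names are invented for illustration.

records = [
    {"student_id": "A1", "credits": 12},
    {"student_id": "A2", "credits": None},
    {"student_id": "A2", "credits": 9},
]

def check_not_null(rows, field):
    """Fail if any row has a NULL/None value in the given field."""
    bad = sum(1 for r in rows if r[field] is None)
    return ("not_null:" + field, bad == 0, bad)

def check_unique(rows, field):
    """Fail if the field contains duplicate values."""
    values = [r[field] for r in rows]
    dupes = len(values) - len(set(values))
    return ("unique:" + field, dupes == 0, dupes)

report = [check_not_null(records, "credits"),
          check_unique(records, "student_id")]
for name, passed, violations in report:
    print(f"{name}: {'PASS' if passed else 'FAIL'} ({violations} violations)")
```

Embedding such checks in pipelines (the "validation, monitoring, and observability" point above) means running them on each load and surfacing failures before downstream dashboards consume the data.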
**This position is considered essential and may be required to work at the normal work location or an alternative location during a major catastrophic event, weather emergency, or other operational emergency to help maintain the continuity of University services.**
**May be required to work evenings, nights, weekends, or different shifts for extended periods.**
KNOWLEDGE, SKILLS, & ABILITIES: Knowledge of data privacy and security principles and practices necessary to protect systems and data from threats.
Knowledge in areas of subject matter expertise such as databases, data modeling, ETL, reporting, data governance practices, metadata management, data stewardship, and/or regulatory compliance.
Skill in SQL or programming/scripting languages (e.g., Python) used for integrations, data pipelines, report development, and data management.
Skill in adapting communication style to different audiences, including technical, business, and executive stakeholders.
Skill in the use of office productivity software such as Office 365 or Google Workspace.
Ability to lead presentations and training for large groups.
Ability to manage communications and relationships with technical and business stakeholders.
Ability to collaborate effectively with other Managers, Assistant Directors, and Directors to identify and solve problems, make improvements, and address ongoing issues.
Ability to provide a team with effective direction and support in implementations using standards and techniques that lead to a repeatable and reliable solution.
Ability to ensure documentation standards and procedures are implemented for all team responsibilities.
Ability to define deadlines and manage the quality of the work delivered.
Ability to comprehend and handle interpersonal dynamics, demonstrate empathy towards team members, and effectively manage conflicts or challenging circumstances.
Ability to coach and mentor team members in order to enhance their performance, provide constructive feedback, and support skill development.
Physical Demands: Sedentary work.
Exerting up to 10 pounds of force occasionally and/or negligible amount of force frequently or constantly to lift, carry, push, pull or otherwise move objects.
Repetitive motion.
Substantial movements (motions) of the wrists, hands, and/or fingers.
The worker is required to have close visual acuity to perform an activity such as: preparing and analyzing data and figures; transcribing; viewing a computer terminal; extensive reading.
Minimum Qualifications Education: Bachelor’s degree from an accredited college or university.
Experience: Three (3) years of professional experience supporting the operations, maintenance, and administration of data systems, analytics platforms, or data management programs.
One (1) year leading or supervising professional staff.
Other: Additional work experience as defined above may be substituted on a year for year basis for up to four (4) years of the required education.
Preferences: Demonstrated experience leading business intelligence or enterprise analytics initiatives.
Experience managing or mentoring data professionals in a collaborative team environment.
Strong experience with Power BI and modern data platforms such as Microsoft Fabric, Databricks, or similar cloud-based analytics ecosystems.
Proficiency with SQL and/or Python in support of analytics, data modeling, or data quality initiatives.
Experience implementing or advancing data quality practices, including validation, monitoring, or metric standardization.
Experience supporting practical data governance activities such as establishing shared definitions, coordinating data stewardship, or implementing metadata/catalog tools.
Demonstrated ability to collaborate across diverse stakeholders and translate business needs into scalable analytics solutions.
Strong communication skills with the ability to engage both technical and non-technical audiences.
Experience using Jira or similar tools for work intake, project tracking, and prioritization.
Additional Information: Please note that all positions within the Division of Information Technology (DIT) have an in person component with expected time in our College Park, MD location per week.
Telework is not a guaranteed work arrangement.
Visa Sponsorship Information: DIT will not sponsor the successful candidate for work authorization in the United States now or in the future.
F1 STEM OPT support is not available for this position.
Required Application Materials: Resume, Cover Letter, List of three References
Best Consideration Date: March 26, 2026
Open Until Filled: Yes
Salary Range: $149,120.00 - $178,944.00
Please apply at:
Job Risks: Not Applicable to This Position
Financial Disclosure Required: No
For more information on Financial Disclosure, please visit Maryland's State Ethics Commission website.
Department: DIT-EE-Enterprise Data Services
Worker Sub-Type: Staff Regular
Benefits Summary: For more information on Regular Exempt benefits, select this link.
Background Checks: Offers of employment are contingent on completion of a background check.
Information reported by the background check will not automatically disqualify anyone from employment.
Before any adverse decision, the finalist will have an opportunity to provide information to the University regarding disclosable background check information.
The University reserves the right to rescind the offer of employment or otherwise decline or terminate employment if the information reported by the background check is deemed incompatible with the position, regardless of when the background check is completed.
Employment Eligibility: The successful candidate must complete employment eligibility verification (on Form I-9) by presenting documents that establish identity and work authorization within the timeframe required by federal immigration law, and where applicable, to demonstrate renewed employment authorization.
Failure to complete employment eligibility verification or reverification within the timeframe set forth by law may result in suspension or termination of employment.
EEO Statement : The University of Maryland, College Park is an Equal Opportunity Employer.
All qualified applicants will receive equal consideration for employment.
Please read the University’s Equal Employment Opportunity Statement of Policy.
Title IX Non-Discrimination Notice