Cloudera Data Platform Jobs in USA
10,548 positions found
Unleash Your Potential at Quad – Don't Miss Out!
Ready to supercharge your career and make a lasting impact? At Quad, we're excited to welcome ambitious individuals who are driven to excel. Are you mechanically inclined or maintenance-savvy? Ready to take on a new challenge? Look no further! We're looking for motivated, detail-oriented individuals to join our vibrant team in Spartanburg, SC. Your adventure to success begins now – grab this opportunity!
Our 82,000 sq. ft. facility in Spartanburg, SC, is a state-of-the-art packaging plant that serves a diverse range of clients, including medical, pharmaceutical, and well-known liquor and tobacco brands. We offer sheetfed offset and narrow-web flexo printing, along with die cutting and custom folding/gluing, all supported by advanced inline quality control systems.
The facility is well-lit with both natural and artificial lighting, climate-controlled, and impeccably clean. We take great pride in fostering a friendly, team-oriented atmosphere where everyone collaborates to achieve our goals!
Headquartered in Wisconsin, Quad is a marketing experience company that helps brands make direct consumer connections, from household to in-store to online. The company does this through its MX Solutions Suite, a comprehensive range of marketing and print services that seamlessly integrate creative, production, and media solutions across online and offline channels. Supported by state-of-the-art technology and data-driven intelligence, Quad simplifies the complexities of marketing by removing friction wherever it occurs along the marketing journey. With approximately 11,000 employees in 11 countries, we serve around 2,100 clients, including industry-leading blue-chip companies that serve both businesses and consumers in multiple industry verticals, with a particular focus on commerce, including retail, consumer packaged goods, and direct-to-consumer; financial services; and health. Quad is ranked among the largest agency companies in the U.S. by Ad Age, buoyed by its full-service media agency, Rise, and creative agency, Betty. Quad is also one of the largest commercial printers in North America, according to Printing Impressions.
Quad is seeking a Flexo Press Operator for our Spartanburg, SC location. We are looking for operators who are flexible and can work the night shift:
4 pm – 2 am (Mon-Thurs)
Wages start between $19.00 and $25.00/hour, or more based on relevant work experience and a strong employment history.
Essential Functions of this position include:
- Prepare for Operation - Access job ticket information and set up a flexographic printing press to produce labels and other products to customer specifications. Ensure the machine is adequately stocked with the correct raw materials for each job.
- Operate Flexographic Press - Operate assigned equipment in accordance with company safety standards and departmental SOPs to produce printed products according to customer specifications. Continually monitor supply levels of inks, paper, and other required materials to add as needed. Make routine adjustments as needed to maintain print quality and correct any issues as soon as possible.
- Perform Troubleshooting & Maintenance - Observe and monitor machine operations to determine whether adjustments are needed. Perform basic maintenance and advanced troubleshooting of assigned equipment during shift.
- Perform Quality Checks - Complete quality checklist(s) and other required documentation. Perform visual quality checks of the product throughout the printing process to ensure customer satisfaction. Flag bad product for removal from job run. Cut samples from each job and compare them to product standards to ensure compliance with customer specifications.
- Perform Line Clearance - Clean assigned area by removing all products from the line, trash, boxes, and other supplies associated with a completed order.
Required Knowledge, Skills, and Abilities include:
- Knowledge of the setup and operation of a flexographic printing press; Mark Andy P5 experience preferred, but we will consider experience on other models as well.
- Mechanical aptitude and skills to perform troubleshooting and maintenance.
- Attention to detail and accuracy.
- Excellent communication skills.
- Ability to analyze problems for root causes and determine solutions.
- Ability to match and detect differences in similar color shades and hues.
- Ability to understand, remember, and apply/follow written and verbal instructions.
- Ability to understand, remember, and communicate routine, factual information.
- Ability to complete routine, existing forms.
- Ability to organize one's schedule and tasks for efficient workflow and production.
- Ability to perform tasks with room for personal interpretation; problem-solving involves a supervisor when needed.
- Ability to count accurately.
- Ability to add, subtract, multiply, and divide numerical data.
- Ability to use measuring equipment to determine substrate sizes, etc.
Working Conditions include:
- Requires work with moving mechanical parts.
- Requires work in a noisy, fast-paced environment where forklifts and other machinery are used.
- Requires work at risk of electrical shock.
Additional Information
The actual rate of pay offered will vary based upon, but not limited to: education, skills, experience, proficiency, performance, shift, and location. In addition to base salary, depending on the role, the total compensation package may also include overtime and shift differentials. Quad offers benefits including medical, dental, and vision coverage, paid time off, disability insurance, annual discretionary match to 401(k) based on company performance, life insurance, and other voluntary supplemental insurance coverages, plus childbirth short-term disability insurance, paid parental leave, adoption & surrogacy benefits, pet insurance, and more!
If you're ready to take the next step in your career with Quad, apply today and become part of a team that values growth, innovation, and your potential to excel.
Location: Anywhere in Country
At EY, we’re all in to shape your future with confidence.
We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
AI & Data - Data Architecture – Senior Manager – Power & Utilities Sector
EY is seeking a motivated professional with solid experience in the utilities sector to serve as a Senior Manager. The role calls for a robust background in data architecture, data modernization, end-to-end data capabilities, and AI (including generative and agentic AI), preferably with a power systems / electrical engineering background and a record of delivering business use cases across Transmission, Distribution, Generation, or Customer. The ideal candidate will have a history of working for consulting companies and be well-versed in the fast-paced culture of consulting work. This role is dedicated to the utilities sector, where the successful candidate will craft, deploy, and maintain large-scale, AI-ready data architectures.
The opportunity
You will help our clients enable better business outcomes while working in the rapidly growing Power & Utilities sector. You will have the opportunity to lead and develop your skill set to keep up with the ever-growing demands of the modern data platform. During implementation you will solve complex analytical problems to bring data to insights and enable the use of ML and AI at scale for your clients. This is a high growth area and a high visibility role with plenty of opportunities to enhance your skillset and build your career.
As a Senior Manager in Data Architecture, you will have the opportunity to lead transformative technology projects and programs that align with our organizational strategy to achieve impactful outcomes. You will provide assurance to leadership by managing timelines, costs, and quality, and lead both technical and non-technical project teams in the development and implementation of cutting-edge technology solutions and infrastructure. You will have the opportunity to be face to face with external clients and build new and existing relationships in the sector. Your specialized knowledge in project and program delivery methods, including Agile and Waterfall, will be instrumental in coaching others and proposing solutions to technical constraints.
Your key responsibilities
In this pivotal role, you will be responsible for the effective management and delivery of one or more processes, solutions, and projects, with a focus on quality and effective risk management. You will drive continuous process improvement and identify innovative solutions through research, analysis, and best practices. Managing professional employees or supervising team members to deliver complex technical initiatives, you will apply your depth of expertise to guide others and interpret internal/external issues to recommend quality solutions. Your responsibilities will include:
As Data Architect – Senior Manager, you will have an expert understanding of data architecture and data engineering and will be focused on problem-solving to design, architect, and present findings and solutions, leading more junior team members, and working with a wide variety of clients to sell and lead delivery of technology consulting services. You will be the go-to resource for understanding our clients’ problems and responding with appropriate methodologies and solutions anchored around data architectures, platforms, and technologies. You are responsible for helping to win new business for EY. You are a trusted advisor with a broad understanding of digital transformation initiatives, the analytic technology landscape, industry trends and client motivations. You are also a charismatic communicator and thought leader, capable of going toe-to-toe with the C-level in our clients and prospects and willing and able to constructively challenge them.
Skills and attributes for success
To thrive in this role, you will need a combination of technical and business skills that will make a significant impact. Your skills will include:
- Technical skills: Applications Integration
- Cloud Computing and Cloud Computing Architecture
- Data Architecture Design and Modelling
- Data Integration and Data Quality
- AI/Agentic AI driven data operations
- Experience delivering business use cases in Transmission / Distribution / Generation / Customer.
- Strong relationship management and business development skills.
- Become a trusted advisor to your clients’ senior decision makers and internal EY teams by establishing credibility and expertise in both data strategy in general and in the use of analytic technology solutions to solve business problems.
- Engage with senior business leaders to understand and shape their goals and objectives and their corresponding information needs and analytic requirements.
- Collaborate with cross-functional teams (Data Scientists, Business Analysts, and IT teams) to define data requirements, design solutions, and implement data strategies that align with our clients’ objectives.
- Organize and lead workshops and design sessions with stakeholders, including clients, team members, and cross-functional partners, to capture requirements, understand use cases, personas, key business processes, brainstorm solutions, and align on data architecture strategies and projects.
- Lead the design and implementation of modern data architectures, supporting transactional, operational, analytical, and AI solutions.
- Direct and mentor global data architecture and engineering teams, fostering a culture of innovation, collaboration, and continuous improvement.
- Establish data governance policies and practices, including data security, quality, and lifecycle management.
- Stay abreast of industry trends and emerging technologies in data architecture and management, recommending innovations and improvements to enhance our capabilities.
To qualify for the role, you must have
- A Bachelor’s degree in a STEM field.
- 12+ years professional consulting experience in industry or in technology consulting.
- 12+ years hands-on experience in architecting, designing, delivering or optimizing data lake solutions.
- 5+ years’ experience with native cloud products and services such as Azure or GCP.
- 8+ years of experience mentoring and leading teams of data architects and data engineers, fostering a culture of innovation and professional development.
- In-depth knowledge of data architecture principles and best practices, including data modelling, data warehousing, data lakes, and data integration.
- Demonstrated experience in leading large data engineering teams to design and build platforms with complex architectures and diverse features including various data flow patterns, relational and no-SQL databases, production-grade performance, and delivery to downstream use cases and applications.
- Hands-on experience in designing end-to-end architectures and pipelines that collect, process, and deliver data to its destination efficiently and reliably.
- Proficiency in data modelling techniques and the ability to choose appropriate architectural design patterns, including Data Fabrics, Data Mesh, Lake Houses, or Delta Lakes.
- Experience managing complex data analysis, migration, and integration of enterprise solutions to modern platforms, including code efficiency and performance optimizations.
- Previous hands‑on coding skills in languages commonly used in data engineering, such as Python, Java, or Scala.
- Ability to design data solutions that can scale horizontally and vertically while optimizing performance.
- Experience with containerization technologies like Docker and container orchestration platforms like Kubernetes for managing data workloads.
- Experience in version control systems (e.g. Git) and knowledge of DevOps practices for automating data engineering workflows (DataOps).
- Practical understanding of data encryption, access control, and security best practices to protect sensitive data.
- Experience leading Infrastructure and Security engineers and architects in overall platform build.
- Excellent leadership, communication, and project management skills.
- Data Security and Database Management
- Enterprise Data Management and Metadata Management
- Ontology Design and Systems Design
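The end-to-end pipeline skills listed above (collect, process, deliver) can be illustrated with a minimal sketch. The record type, field names, and cleaning rules below are illustrative assumptions, not an EY methodology or any client's actual architecture:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MeterReading:
    """Hypothetical utility-sector record for illustration only."""
    meter_id: str
    kwh: float

def extract(raw_rows):
    """Collect: parse raw CSV-like rows into typed records, skipping junk."""
    readings = []
    for row in raw_rows:
        try:
            meter_id, kwh = row.split(",")
            readings.append(MeterReading(meter_id.strip(), float(kwh)))
        except ValueError:
            continue  # a real pipeline would quarantine malformed rows
    return readings

def transform(readings):
    """Process: drop invalid (negative) readings, aggregate kWh per meter."""
    totals = {}
    for r in readings:
        if r.kwh < 0:
            continue
        totals[r.meter_id] = totals.get(r.meter_id, 0.0) + r.kwh
    return totals

def load(totals, sink):
    """Deliver: write curated totals to a downstream sink (here, a dict)."""
    sink.update(totals)
    return sink

raw = ["M1, 10.5", "M1, 2.5", "M2, -1.0", "garbage", "M2, 4.0"]
warehouse = load(transform(extract(raw)), {})
```

In production the same three stages would target a lakehouse (e.g., Delta Lake tables) rather than in-memory dicts, but the separation of ingestion, transformation, and delivery is the pattern the role description is pointing at.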
Ideally, you’ll also have
- Master’s degree in Electrical / Power Systems Engineering, Computer science, Statistics, Applied Mathematics, Data Science, Machine Learning or commensurate professional experience.
- Experience working at a Big 4 firm or a major utility.
- Experience with cloud data platforms like Databricks.
- Experience in leading and influencing teams, with a focus on mentorship and professional development.
- A passion for innovation and the strategic application of emerging technologies to solve real-world challenges.
- The ability to foster an inclusive environment that values diverse perspectives and empowers team members.
- Building and Managing Relationships
- Client Trust and Value and Commercial Astuteness
- Communicating With Impact and Digital Fluency
What we look for
We are looking for top performers who demonstrate a blend of technical expertise and business acumen, with the ability to build strong client relationships and lead teams through change. Emotional agility and hybrid collaboration skills are key to success in this dynamic role.
FY26NATAID
What we offer you
At EY, we’ll develop you with future-focused skills and equip you with world-class experiences. We’ll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.
- We offer a comprehensive compensation and benefits package where you’ll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $144,000 to $329,100. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $172,800 to $374,000. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
- Join us in our team‑led and leader‑enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
- Under our flexible vacation policy, you’ll decide how much vacation time you need based on your own personal circumstances. You’ll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well‑being.
Are you ready to shape your future with confidence? Apply today.
EY accepts applications for this position on an on‑going basis.
For those living in California, please click here for additional information.
EY focuses on high‑ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY’s Talent Shared Services Team (TSS) or email the TSS at .
8116 - Midtown Office - 2220 W. Broad Street, Richmond, Virginia, 23220
CarMax, the way your career should be!
About this job
As a BI Platform Engineer, you will be responsible for the administration, optimization, and support of enterprise business intelligence platforms including Power BI and Tableau. You will work closely with multiple analyst and Technology Infrastructure teams to ensure high availability, performance, and scalability of BI environments. Your expertise in Data & Analytics platform engineering, automation, cloud technologies, and user enablement will help drive data democratization and empower business users with reliable, secure, and performant analytics tools.
In addition, you will leverage Artificial Intelligence (AI) capabilities to enhance platform operations, automate routine tasks, and improve user experience. Your ability to integrate intelligent automation and predictive analytics into BI workflows will help drive innovation and efficiency across the organization.
What you will do – Essential Responsibilities
- Administer, monitor, and optimize Power BI and Tableau platforms across cloud and on-prem environments.
- Implement and manage user access, security roles, and governance policies to ensure data protection and compliance.
- Manage Power BI Fabric capacities, gateways, workspaces, and licensing.
- Collaborate with cross-functional teams to support dashboard development, data source integration, and performance tuning.
- Automate platform maintenance tasks including upgrades, patching, backups, access provisioning and license management.
- Develop and maintain CI/CD pipelines for BI content deployment and version control.
- Integrate AI tools to automate platform monitoring, anomaly detection, and performance optimization.
- Provide technical support and troubleshooting for BI platform issues and user inquiries.
- Drive adoption of BI tools through training, documentation, and enablement initiatives.
- Monitor platform usage and performance metrics to identify opportunities for optimization and cost savings.
- Stay current with BI platform updates, features, and industry best practices.
- Partner with data governance and security teams to ensure compliance with enterprise standards.
- Participate in major incident response and root cause analysis for BI-related outages or performance issues.
- Mentor junior team members and promote best practices in BI platform administration and engineering.
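Platform-monitoring automation of the kind listed above is often a small script. The sketch below flags failed or unusually slow dataset refreshes from a refresh-history payload; the record shape (`dataset`, `status`, `duration_s`) is a simplified assumption, as the actual Power BI REST API returns richer refresh objects:

```python
def flag_refreshes(history, slow_threshold_s=600):
    """Return (failed, slow) dataset names from a refresh-history payload.

    `history` is a hypothetical simplified shape, not the raw API response.
    """
    failed = [r["dataset"] for r in history if r["status"] == "Failed"]
    slow = [
        r["dataset"]
        for r in history
        if r["status"] == "Completed" and r["duration_s"] > slow_threshold_s
    ]
    return failed, slow

# Sample payload an admin script might assemble from the refresh-history API.
history = [
    {"dataset": "sales", "status": "Completed", "duration_s": 120},
    {"dataset": "inventory", "status": "Failed", "duration_s": 45},
    {"dataset": "finance", "status": "Completed", "duration_s": 900},
]
failed, slow = flag_refreshes(history)
```

A scheduled job like this is one way the "anomaly detection and performance optimization" responsibility gets automated, feeding alerts into whatever incident tooling the team uses.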
Purpose of the role
The BI Admin/Platform Engineer plays a critical role in ensuring the reliability, scalability, and usability of enterprise BI platforms, enabling data-driven decision-making across the organization. By integrating AI capabilities, this role also contributes to smarter, more efficient platform operations and user experiences.
Qualifications and Requirements
Basic Qualifications
- 5+ years of experience administering Power BI.
- Experience with Azure services including Azure SQL, Azure Data Factory, and Azure Active Directory.
- Strong understanding of BI architecture, data modeling, and dashboard performance optimization.
- Proficiency in scripting languages such as PowerShell, Python, or Bash for automation.
- Experience with CI/CD tools such as Azure DevOps or GitHub Actions.
- Familiarity with enterprise data lake/warehouse environments (EDL/EDW).
- Strong troubleshooting skills and experience with platform monitoring tools.
- Strong documentation, communication, and presentation skills.
- Experience working in Agile/Scrum environments.
- Experience in cloud cost-savings plans, reviews, and reserved instances.
- Ability to positively influence team norms, culture, and technical vision.
- Excellent communication skills with the ability to adapt to the audience.
- Experience in a fast-paced, highly collaborative agile team within a Product-oriented organization.
- Effective problem-solving, analytical thinking, and a cloud-native and DevOps mindset.
Preferred Qualifications
- Experience with Power BI Fabric and migration from Power BI Report Server (PBIRS) to Fabric.
- Bachelor’s/Master’s degree in Computer Science, Information Systems, or related field
- Power BI and Tableau certifications
- Snowflake SnowPro, Azure, Databricks certifications
- Experience with cloud services such as Snowflake, Databricks, Azure Data Factory, Event Hub, Functions, Batch, Key Vault, and Log Analytics
- Strong experience with popular database programming languages such as SQL, PL/SQL, Stored Procedures
- Experience with Snowflake, Databricks, and other modern data platforms.
- Knowledge of REST APIs and scripting for platform automation.
- Familiarity with data governance, metadata management, and self-service BI enablement.
Work Location and Arrangement: This role will be based out of the CarMax Midtown office, Richmond VA or CarMax Technology Hub, Plano TX and have a Hybrid work arrangement.
- Associates based in Richmond work onsite 5 days per week.
- Associates based in Plano work onsite 2 days per week.
Work Authorization: Applicants must be currently authorized to work in the United States on a full-time basis. Sponsorship will not be considered for this specific role.
The pay range for this role is $150,000 - $200,000/yr USD.
WHO WE ARE:
Headquartered in Southern California, Skechers—the Comfort Technology Company®—has spent over 30 years helping men, women, and kids everywhere look and feel good. Comfort innovation is at the core of everything we do, driving the development of stylish, high-quality products at a great value. From our diverse footwear collections to our expanding range of apparel and accessories, Skechers is a complete lifestyle brand.
ABOUT THE ROLE:
Skechers Digital Team is seeking a Digital Data Architect reporting to the Director, Digital Architecture, Consumer Domain. This role is responsible for designing and governing Skechers’ Consumer Data 360 ecosystem, enabling identity resolution, high-quality data foundations, personalization, loyalty intelligence, and machine learning capabilities across digital and retail channels.
The ideal candidate will be a strong technical leader, have hands-on full-stack technical knowledge in enterprise technologies related to Skechers' consumer domain, and have the ability to work in a fast-paced agile environment. You should have knowledge of consumer programs from an architecture/industry perspective, and strong hands-on experience designing solutions on the Salesforce Core Platform (including configuration, integration, and data model best practices).
You will work cross-functionally with Digital Engineering, Data Engineering, Data Science, Loyalty, and Marketing teams to architect scalable, secure, and high-performance data platforms that support advanced personalization and recommender systems.
WHAT YOU’LL DO:
- Responsible for the full technical life cycle of consumer platform capabilities which includes:
- Capability roadmap and technical architecture in alignment to consumer experience
- Technical planning, design, and execution
- Operations, analytics/reporting, and adoption
- Define and evolve Skechers’ Consumer Data 360 architecture, including identity resolution (deterministic and probabilistic matching) and unified customer profiles.
- Architect scalable data models and pipelines across CDP, CRM, e-commerce, marketing automation, data lake, and warehouse platforms.
- Establish enterprise data quality frameworks including validation, deduplication, anomaly detection, and observability.
- Optimize SQL workloads and large-scale distributed queries through performance tuning, partitioning, indexing, and workload management strategies.
- Design and oversee ML pipelines supporting personalization, churn modeling, and recommender systems.
- Partner with Data Science teams to productionize models using distributed platforms such as Databricks (Spark, Delta Lake, MLflow preferred).
- Ensure secure data governance, access control (RBAC/ABAC), and compliance with GDPR, CCPA, and related privacy regulations.
- Provide architectural oversight ensuring performance, scalability, resilience, and maintainability.
- Collaborate with stakeholders to translate business objectives (LTV growth, personalization lift, engagement) into scalable data solutions.
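Deterministic identity resolution of the kind described above can be sketched as a union-find over normalized match keys: records sharing a normalized email or phone collapse into one profile. The field names, normalization rules, and sample records below are illustrative assumptions, not Skechers' actual Consumer Data 360 design:

```python
def normalize_email(email):
    """Lowercase and trim; return None when there is no usable email."""
    email = (email or "").strip().lower()
    return email if "@" in email else None

def resolve(records):
    """Union-find over match keys; returns record-id -> profile-id."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    key_owner = {}  # first record id seen for each match key
    for rec in records:
        for key in filter(None, (normalize_email(rec.get("email")),
                                 rec.get("phone"))):
            if key in key_owner:
                union(rec["id"], key_owner[key])
            else:
                key_owner[key] = rec["id"]
    return {rec["id"]: find(rec["id"]) for rec in records}

# c1/c2 share an email, c2/c3 share a phone, so c1-c3 form one profile.
records = [
    {"id": "c1", "email": "Ann@Shop.com"},
    {"id": "c2", "email": "ann@shop.com", "phone": "555-0101"},
    {"id": "c3", "phone": "555-0101"},
    {"id": "c4", "email": "bob@shop.com"},
]
profiles = resolve(records)
```

Probabilistic matching (fuzzy names, addresses) layers scoring on top of this deterministic core; the union-find step is what makes transitive matches (c1 to c3 via c2) fall out naturally.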
REQUIREMENTS:
- Computer Science, Data Engineering, or related degree or equivalent experience.
- 12+ years of experience architecting enterprise data platforms in cloud environments.
- 9+ years of data engineering experience with a focus on consumer data.
- 6+ years of experience working with Salesforce platforms, including data models and enterprise integrations.
- Strong experience with Data 360 and identity resolution architectures.
- Proven expertise in SQL performance tuning and large-scale data modeling.
- Hands-on experience implementing ML pipelines and recommender systems in production environments.
- Experience with cloud technologies (AWS, GCP, or Azure).
- Experience with integration patterns (API, ETL, event streaming).
- Experience providing technical leadership and guidance across multiple projects and development teams.
- Experience translating business requirements into detailed technical specifications and working with development teams through implementation, including issue resolution and stakeholder communication.
- Strong project management skills including scope assessment, estimation, and clear technical communication with both business users and technical teams.
- Must hold at least one of the following Salesforce Certifications (Platform App Builder, Platform Developer 1, JavaScript Developer 1).
- Experience with Databricks or similar distributed data/ML platforms preferred.
Sr. Data Engineer (Hybrid)
Chicago, IL
The American Medical Association (AMA) is the nation's largest professional Association of physicians and a non-profit organization. We are a unifying voice and powerful ally for America's physicians, the patients they care for, and the promise of a healthier nation. To be part of the AMA is to be part of our Mission to promote the art and science of medicine and the betterment of public health.
At AMA, our mission to improve the health of the nation starts with our people. We foster an inclusive, people-first culture where every employee is empowered to perform at their best. Together, we advance meaningful change in health care and the communities we serve.
We encourage and support professional development for our employees, and we are dedicated to social responsibility. We invite you to learn more about us and we look forward to getting to know you.
We have an opportunity at our corporate offices in Chicago for a Sr. Data Engineer (Hybrid) on our Information Technology team. This is a hybrid position reporting into our Chicago, IL office, requiring 3 days a week in the office.
As a Sr. Data Engineer, you will play a key role in implementing and maintaining AMA's enterprise data platform to support analytics, interoperability, and responsible AI adoption. This role partners closely with platform engineering, data governance, data science, IT security, and business stakeholders to deliver high-quality, reliable, and secure data products. This role contributes to AMA's modern lakehouse architecture, optimizing data operations and embedding governance and quality standards into engineering workflows. This role serves as a senior technical contributor within the team, providing mentorship to junior engineers and implementing engineering best practices within the data platform function, in alignment with architectural direction set by leadership.
RESPONSIBILITIES:
Data Engineering & AI Enablement
- Build and maintain scalable data pipelines and ETL/ELT workflows supporting analytics, operational reporting, and AI/ML use cases.
- Implement best-practice patterns for ingestion, transformation, modeling, and orchestration within a modern lakehouse environment (e.g., Databricks, Delta Lake, Azure Data Lake).
- Develop high-performance data models and curated datasets with strong attention to quality, usability, and interoperability; create reusable engineering components and automation.
- Collaborate with the Architecture Team, the Data Platform Lead, and federated IT teams to optimize storage, compute, and architectural patterns for performance and cost-efficiency.
- Build model-ready data sets and feature pipelines to support AI/ML use cases; serve as a technical coordination point supporting business units' AI-related infrastructure needs.
- Collaborate with data scientists and the AI Working Group to operationalize models responsibly and maintain ongoing monitoring signals.
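Embedding quality controls directly in ingestion, as these responsibilities describe, can look like the following minimal sketch, where rows failing checks are quarantined with a reason rather than loaded. The column names and validation rules are illustrative assumptions, not AMA's actual standards:

```python
# Hypothetical required columns for an illustrative membership feed.
REQUIRED = {"member_id", "specialty"}

def validate(rows):
    """Split rows into (clean, quarantined) with a reason per reject."""
    clean, quarantined = [], []
    for row in rows:
        # A field counts as present only if it is non-null and non-empty.
        present = {k for k, v in row.items() if v not in (None, "")}
        missing = REQUIRED - present
        if missing:
            quarantined.append((row, f"missing: {sorted(missing)}"))
        else:
            clean.append(row)
    return clean, quarantined

rows = [
    {"member_id": "m1", "specialty": "cardiology"},
    {"member_id": "m2", "specialty": ""},       # empty required field
    {"specialty": "oncology"},                   # missing required field
]
clean, quarantined = validate(rows)
```

Keeping the rejects (with reasons) rather than silently dropping them is what makes the quality gate auditable, which matters for the documentation and compliance duties below.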
Governance, Quality & Compliance
- Embed data governance, metadata standards, lineage tracking, and quality controls directly into engineering workflows, ensuring consistent technical implementation.
- Work with the Data Governance Lead and business stakeholders to operationalize stewardship, classification, validation, retention, and access standards.
- Implement privacy-by-design and security-by-design principles, ensuring compliance with internal policies and regulatory obligations.
- Maintain documentation for pipelines, datasets, and transformations to support transparency and audit requirements.
Platform Reliability, Observability & Optimization
- Monitor and troubleshoot pipeline failures, performance bottlenecks, data anomalies, and platform-level issues.
- Implement observability tooling, alerts, logging, and dashboards to ensure end-to-end reliability.
- Support cost governance by optimizing compute resources, refining job schedules, and advising on efficient architecture.
- Collaborate with the Data Platform Lead on scaling, configuration management, CI/CD pipelines, and environment management.
- Collaborate with business units to understand data needs, translate them into engineering requirements, and deliver fit-for-purpose data solutions; share and apply best practices and emerging technologies within assigned initiatives.
- Work with IT Security and Legal/Compliance to ensure platform and datasets meet risk and regulatory standards.
Staff Management
- Lead, mentor, and provide management oversight for staff.
- Set objectives, evaluate employee performance, and foster a collaborative team environment.
- Develop staff knowledge and skills to support career development.
May include other responsibilities as assigned
REQUIREMENTS:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field preferred; otherwise, equivalent work experience and a HS diploma/equivalent education required.
- 5+ years of experience in data engineering within cloud environments
- Experience in people management preferred.
- Demonstrated hands-on experience with modern data platforms (Databricks preferred).
- Proficiency in Python, SQL, and data transformation frameworks.
- Experience designing and operationalizing ETL/ELT pipelines, orchestration workflows (Airflow, Databricks Workflows), and CI/CD processes.
- Solid understanding of data modeling, structured/unstructured data patterns, and schema design.
- Experience implementing governance and quality controls: metadata, lineage, validation, stewardship workflows.
- Working knowledge of cloud architecture, IAM, networking, and security best practices.
- Demonstrated ability to collaborate across technical and business teams.
- Exposure to AI/ML engineering concepts, feature stores, model monitoring, or MLOps patterns.
- Experience with infrastructure-as-code (Terraform, CloudFormation) or DevOps tooling.
The American Medical Association is located at 330 N. Wabash Avenue, Chicago, IL 60611 and is convenient to all public transportation in Chicago.
This role is an exempt position, and the salary range for this position is $115,523.42-$150,972.44. This is the lowest to highest salary we believe we would pay for this role at the time of this posting. An employee's pay within the salary range will be determined by a variety of factors including but not limited to business consideration and geographical location, as well as candidate qualifications, such as skills, education, and experience. Employees are also eligible to participate in an incentive plan. To learn more about the American Medical Association's benefits offerings, please click here.
We are an equal opportunity employer, committed to diversity in our workforce. All qualified applicants will receive consideration for employment. As an EOE/AA employer, the American Medical Association will not discriminate in its employment practices due to an applicant's race, color, religion, sex, age, national origin, sexual orientation, gender identity and veteran or disability status.
THE AMA IS COMMITTED TO IMPROVING THE HEALTH OF THE NATION
Remote working/work-at-home options are available for this role.
Your role and responsibilities
About the Opportunity
IBM Consulting is seeking an accomplished Data & Analytics Associate Partner to accelerate our growth within the Industrial & Communications sectors. This executive role is responsible for shaping client vision, cultivating senior executive relationships, and developing data-driven solutions that enable clients to successfully navigate complex transformation programs.
You will bring together deep industry expertise and IBM’s portfolio of data, analytics, and AI capabilities to help organizations modernize their data ecosystems—migrating from legacy platforms to modern hybrid cloud architectures—while adopting next-generation analytics, GenAI, and agentic AI to strengthen decision-making and deliver measurable business and financial outcomes.
This role is ideal for a seasoned leader who integrates industry depth, consulting excellence, and technical thought leadership, has a strong understanding of competitive market dynamics, and consistently delivers high-impact transformation at scale.
Key Responsibilities
Market Leadership & Growth
Expand IBM’s Data & Analytics presence by identifying new market opportunities, developing differentiated solutions, and building a strong pipeline.
Engage senior client executives to understand strategic priorities and shape data transformation roadmaps aligned to their business and financial goals.
Lead end-to-end sales cycles, including solution definition, proposal leadership, financial structuring, and contract negotiation.
Strategic Advisory & Transformation Delivery
Advise C-suite leaders on strategies for data estate modernization, advanced analytics, GenAI, and agentic AI to drive business performance.
Architect integrated solutions that include:
Migration from legacy data platforms to modern cloud-based architectures
Data engineering and Information governance
Business intelligence and advanced analytics
GenAI-powered and agentic AI-driven automation and decisioning
Lead complex transformation programs from discovery through delivery, ensuring measurable outcomes and client satisfaction.
Engagement Excellence & Financial Stewardship
Oversee multi-disciplinary delivery teams to ensure high-quality, consistent execution across all program phases.
Manage engagement financials, including forecasting, margin performance, and overall portfolio profitability.
Align the right client technologies, industry expertise, and global delivery capabilities to maximize client value.
Practice Building & Talent Development
Recruit, mentor, and grow top-tier consultants, architects, and data specialists.
Build and scale capabilities in data modernization, cloud data engineering, analytics, GenAI, and emerging agentic AI techniques.
Contribute to practice strategy, offering development, and capability growth across the global Data & Analytics team.
Thought Leadership & Market Presence
Stay ahead of sector and technology trends, including cloud modernization, GenAI, agentic system design, regulatory changes, and evolving competitive dynamics.
Represent IBM at industry conferences, client events, webinars, and executive roundtables.
Create original thought leadership—articles, perspectives, points of view—that positions IBM as a leading advisor in data and AI-driven transformation.
This position can be performed anywhere in the US.
"Leaders are expected to spend time with their teams and clients and therefore are generally expected to be in the workplace a minimum of three days a week, subject to business needs."
Required technical and professional expertise
Qualifications
12+ years of experience in consulting, data strategy, analytics, or digital transformation, with strong exposure to the Industrial or Communications sectors.
Hands-on experience modernizing data ecosystems, including migrating from legacy on-premise platforms to modern cloud-native or hybrid cloud architectures.
Deep expertise with major cloud platforms and their data/analytics stacks, including implementation experience with:
AWS (e.g., Redshift, S3, Glue, EMR, Athena, Lake Formation, Bedrock, SageMaker)
Microsoft Azure (e.g., Azure Data Lake, Synapse, Data Factory, Databricks on Azure, Fabric, Cognitive Services)
Google Cloud Platform (e.g., BigQuery, Cloud Storage, Dataflow, Dataproc, Vertex AI)
Experience designing and implementing end-to-end data pipelines, governance frameworks, and analytics solutions on one or more of these platforms.
Strong understanding of GenAI architectures, LLM integration patterns, vector databases, retrieval-augmented generation (RAG), and emerging agentic AI frameworks.
Proven track record of selling, structuring, and delivering large-scale data and AI transformation programs.
Robust technical and functional expertise in data engineering, cloud data platforms, analytics, AI/ML, information management, and governance.
Executive-level communication and presence, with demonstrated ability to influence senior stakeholders and convey complex topics through compelling narratives.
Financial management experience, including engagement economics, forecasting, margin optimization, and portfolio profitability.
Demonstrated leadership in building, scaling, and developing high-performing consulting and technical teams.
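The qualifications above call out retrieval-augmented generation (RAG). As a hedged, toy-scale sketch of the retrieval step only: rank documents by cosine similarity of embedding vectors, then assemble a grounded prompt. All names are illustrative; in practice the embeddings come from a model and live in a vector database, and the prompt goes to an LLM.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_vec, corpus, k=2):
    """corpus: list of (text, vector) pairs. Return the top-k texts by similarity."""
    ranked = sorted(corpus, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question, contexts):
    """Assemble a prompt that instructs the model to answer from context only."""
    joined = "\n".join(f"- {c}" for c in contexts)
    return f"Answer using only this context:\n{joined}\nQuestion: {question}"
```

The point of the pattern is that the model's answer is grounded in retrieved enterprise data rather than its training data alone.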
Preferred technical and professional experience
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Job Title: Senior Manager, Data Architecture (Ref: 195759)
Location: Charlotte, North Carolina – In-Office (5 Days Per Week)
Salary: Up to $175,000 + Bonus
Contact:
We’re looking for an experienced and forward-thinking Senior Manager, Data Architecture to define and lead the enterprise data architecture strategy within a large-scale, data-driven organization. This is a high-impact leadership role where you’ll shape the long-term data roadmap, modernize architecture standards, and guide the evolution of a cloud-based data platform.
In this role, you’ll lead a team of data architects and modelers while partnering closely with Data Engineering, Analytics, BI, Platform, and business stakeholders. You’ll ensure scalable, secure, and high-performing data solutions that enable advanced analytics, operational reporting, and strategic decision-making across the enterprise.
What You’ll Do
- Define and maintain the enterprise data architecture vision aligned to business and technology strategy
- Lead, mentor, and grow a team of data architects and modelers, establishing best practices and standards
- Design and govern scalable data platforms leveraging Azure, Snowflake, and Databricks
- Establish enterprise standards for data modeling (Dimensional, 3NF, Data Vault), integration, and storage
- Define architecture patterns for ingestion, transformation, and cross-domain data integration
- Drive architectural consistency across analytics, BI, and operational data products
- Partner with Data Governance teams to enforce data quality, lineage, metadata, and compliance standards
- Ensure solutions meet security, privacy, and regulatory requirements
- Collaborate with Engineering and Platform teams on cloud architecture and long-term technical roadmap
- Communicate complex architectural designs clearly to both technical and executive stakeholders
What You’ll Bring
- 7+ years of experience in data architecture or advanced data engineering roles
- 5+ years in a dedicated Data Architect or equivalent leadership capacity
- Deep experience designing enterprise-scale data platforms in cloud environments
- Strong expertise in Microsoft Azure data services
- Expert-level knowledge of Snowflake and Databricks
- Extensive experience with enterprise data modeling methodologies (Dimensional, 3NF, Data Vault)
- Experience with data modeling tools such as Erwin (preferred)
- Proven experience leading or mentoring architects or senior technical professionals
- Strong understanding of governance, security, and regulatory considerations in enterprise data environments
- Exceptional communication skills with the ability to influence senior stakeholders
Qualifications
- Bachelor’s degree in Computer Science, Engineering, Information Systems, or related field (or equivalent experience)
- 10+ years of progressive experience in data architecture, engineering, or enterprise data platform design
Location: 100% Remote
Duration: 12+ Months
Overview:
We are seeking an experienced Administrator to operate and support the enterprise implementation of Microsoft Purview Data Catalog across a complex, multi-platform data environment. The administrator will be responsible for the day-to-day configuration, monitoring, and maintenance of Purview capabilities, ensuring reliable metadata ingestion, catalog quality, lineage visibility, and compliance alignment across governed data domains.
This role focuses on platform operations and governance execution, working within established architecture and enterprise governance standards.
Key Responsibilities
Platform Administration & Operations:
- Administer and operate Microsoft Purview Data Map and Data Catalog environments.
- Monitor platform health, scan execution, metadata ingestion, and lineage availability.
- Troubleshoot and resolve catalog, scan, and connectivity issues.
- Perform routine maintenance, configuration updates, and service optimizations.
- Coordinate incident resolution with internal engineering teams and Microsoft support as required.
Data Source Management & Scanning:
- Register, configure, and maintain data sources across Azure, M365, on-prem, and approved third-party platforms.
- Configure and schedule metadata scans for supported sources.
- Manage authentication for scans using managed identities, service principals, and Key Vault secrets.
- Monitor scan performance, failures, and coverage; take corrective action as needed.
- Optimize scan frequency and scope to balance cost, performance, and governance coverage.
Catalog Configuration & Metadata Management:
- Maintain and enforce enterprise metadata standards within the Purview Catalog.
- Manage business metadata, classifications, glossary terms, and custom attributes.
- Ensure metadata accuracy, completeness, and consistency across data assets.
- Support curation activities including asset certification and publishing.
- Resolve duplicate, incomplete, or stale catalog entries.
Lineage & Discovery Enablement:
- Enable and validate data lineage ingestion from supported data platforms.
- Monitor lineage completeness and visibility for critical data assets.
- Assist data consumers and stewards with lineage-based impact analysis.
- Escalate lineage gaps or tool limitations requiring architectural or engineering remediation.
Security, Access & Governance Controls:
- Configure and manage Purview role-based access control (RBAC) within collections.
- Provision and maintain access for administrators, data curators, and data stewards.
- Enforce domain-based access controls and separation of duties.
- Integrate Purview access with Microsoft Entra ID.
- Support sensitivity labels and classification alignment with Microsoft Information Protection.
Compliance & Risk Support:
- Support automated discovery of sensitive data (PII, PCI, PHI).
- Assist risk, audit, and compliance teams with catalog evidence and reporting.
- Validate scan coverage for regulated data domains.
- Support regulatory and audit initiatives (SOX, GLBA, NYDFS, GDPR, etc.).
User Support & Enablement:
- Provide operational support to data producers, consumers, and data stewards.
- Respond to access requests, catalog issues, and usage questions.
- Maintain operational documentation, runbooks, and standard operating procedures.
- Support onboarding of new data domains following established governance patterns.
- Assist with training and adoption initiatives led by governance or architecture teams.
Required Qualifications:
- 5+ years of experience supporting enterprise data platforms or governance tools, and 4+ years of hands-on MS Purview experience at enterprise scale.
- Hands-on experience administering Microsoft Purview Data Catalog.
- Strong understanding of metadata management, data classification, and lineage concepts.
- Working knowledge of Azure data services and enterprise data ecosystems.
- Experience managing access controls and identities using Microsoft Entra ID.
- Familiarity with regulated data environments and compliance requirements.
- Strong troubleshooting, operational support, and documentation skills.
Preferred Qualifications:
- Experience supporting Purview integrations with Synapse, Fabric, Databricks, Snowflake, or SQL Server.
- Exposure to financial services or other regulated industries.
- Experience with PowerShell, REST APIs, or basic automation for operational tasks.
- Prior experience supporting enterprise data governance or stewardship programs.
Duration: 6+ months
Location: 100% Remote
Job Overview
The Marketplace Data Product Engineer serves as the primary technical facilitator and adoption champion for the Marketplace platform. This role bridges engineering, product, and business domains, leading workshops, demos, onboarding sessions, and cross-domain engagements to accelerate Marketplace adoption. You will configure demo environments, support development, translate complex technical concepts for business audiences, gather product feedback, and partner closely with product and engineering teams to shape the Marketplace roadmap. This role will guide domains through the process of understanding, showcasing, and maturing their data products within the ecosystem.
Key Responsibilities
- Facilitate workshops, demos, onboarding sessions, and cross-domain engagements to drive Marketplace adoption.
- Serve as the primary technical presenter of the Marketplace for domain teams and stakeholders.
- Engage with domain owners to understand their data products, help refine their articulation, and showcase how they integrate into the Marketplace ecosystem.
- Configure and maintain demo environments for Marketplace capabilities, data products, and new features.
- Support light development, proof-of-concept configurations, and sample integrations to demonstrate platform capabilities.
- Translate technical Marketplace concepts into clear, business-friendly language for non-technical audiences.
- Collect structured feedback from domain teams, synthesize insights, and partner with product and engineering to influence the roadmap.
- Develop and refine training materials, demos, playbooks, and onboarding assets to support continuous adoption.
- Act as an advocate for domains, ensuring their data product needs and challenges are well represented in Marketplace planning.
- Support ongoing adoption initiatives, including community sessions, office hours, and cross-domain knowledge sharing.
Required Skills & Qualifications
- 4-7+ years of experience in data engineering, platform engineering, solution engineering, technical consulting, or similar roles.
- Strong understanding of data products, data modeling concepts, data APIs, enterprise integrations and metadata?driven architectures.
- Ability to configure and demonstrate platform features, build light proofs-of-concept, and support technical onboarding.
- Excellent communication and presentation skills, with experience translating technical concepts for business partners.
- Experience facilitating workshops, leading demos, or driving customer/product adoption initiatives.
- Ability to engage domain teams, understand their data product needs, and help articulate value within a larger ecosystem.
- Strong collaboration and stakeholder management skills across engineering, product, and business teams.
- Comfortable working in fast-moving environments and driving clarity through ambiguity.
Preferred Qualifications
- Experience with data product and governance frameworks, data marketplaces, data mesh concepts, or platform adoption roles.
- Hands?on experience with cloud data platforms (Azure, AWS, or GCP), data pipelines, or integration tooling.
- Familiarity with REST/GraphQL APIs, event-driven patterns, and data ingestion workflows.
- Background in solution architecture, customer engineering, or sales engineering.
- Experience developing demo environments, sample apps, or repeatable platform enablement assets.
- Strong storytelling ability when explaining data product value, domain capabilities, and Marketplace patterns.
Location: Remote
Duration: 8+ months
Marketplace Platform Lead
Job Overview
The Marketplace Platform Lead is responsible for driving the end-to-end technical architecture and implementation of the enterprise Data Marketplace platform. This role spans stakeholder engagement, architectural definition, integration design, and hands-on leadership throughout implementation. The ideal candidate is a seasoned technical leader with deep experience designing integration patterns, building scalable platforms, and guiding engineering teams through complex cross-system solutions.
Key Responsibilities
Lead stakeholder meetings to gather business requirements, align on platform objectives, and clarify workflows and user journeys.
Conduct tool evaluations, build scoring frameworks, and make recommendations on platforms, vendors, and integration technologies.
Define end-to-end Marketplace architecture, including data flows, APIs, domain models, integration strategies, and platform components.
Design and lead the implementation of integration patterns, including API-based integrations, event-driven patterns, workflow orchestration, and cross-system interoperability.
Develop technical designs, architectural documents, and standards for Marketplace workflows, user flows, and extensibility patterns.
Provide hands-on architectural guidance to engineering teams throughout solution design, development, and delivery.
Oversee technical quality, scalability, performance, and security across Marketplace components and integrations.
Collaborate with product, engineering, data, and security teams to ensure compliance with enterprise data governance, privacy, and reliability standards.
Lead technical reviews, drive design decisions, and ensure alignment across cross-functional stakeholders.
Required Skills & Qualifications
8+ years of experience in software engineering, platform development, or technical architecture roles.
Strong expertise in designing and implementing integration architectures, including REST/GraphQL APIs, event-driven patterns, synchronous/asynchronous messaging, and workflow engines.
Deep understanding of distributed systems, microservices, and cloud-native solutions (Azure, AWS, or GCP).
Proficiency with API design, messaging systems, and enterprise integration frameworks.
Experience defining technical architecture, data flows, and workflow designs for complex platforms.
Ability to translate business requirements into technical designs, user flows, and actionable engineering plans.
Demonstrated leadership in guiding engineering teams through architectural decisions and implementation.
Strong communication skills with the ability to influence technical and non-technical partners.
Experience evaluating and scoring platforms, tools, or vendor solutions.
Solid knowledge of DevOps practices, CI/CD, infrastructure-as-code, observability, and security best practices.
Preferred Qualifications
Experience building or leading a Data Marketplace platform.
Familiarity with workflow orchestration platforms, rules engines, BPM tools, or catalog management systems.
Experience with enterprise identity systems (OAuth, SAML, SSO), access governance, and data privacy frameworks.
Background working with enterprise data platforms, data governance, or cross-domain integration patterns.
Prior experience leading architectural governance or serving as a platform architect in an enterprise environment.
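The event-driven integration pattern this role emphasizes can be sketched at its smallest as an in-memory publish/subscribe bus: producers emit events to topics, and subscribed handlers react without the producer knowing who consumes. A minimal Python illustration follows; a production Marketplace would use a broker (Kafka, Azure Event Grid, etc.), and the topic name here is hypothetical.

```python
from collections import defaultdict

class EventBus:
    """Toy in-memory pub/sub bus illustrating producer/consumer decoupling."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of handlers

    def subscribe(self, topic, handler):
        """Register a callable to be invoked for every event on `topic`."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        """Fan the event out to all subscribers of the topic."""
        for handler in self._subscribers[topic]:
            handler(event)
```

The design choice the pattern buys you: new consumers (catalog indexing, notifications, audit) can be added by subscribing, with no change to the publishing system.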
Job Title – Lead Data Engineer
Please note: this role is not able to offer visa transfer or sponsorship now or in the future.
About the role
As a Lead Data Engineer, you will make an impact by designing, building, and operating scalable, cloud‑native data platforms supporting batch and streaming use cases, with strong focus on governance, performance, and reliability. You will be a valued member of the Data Engineering team and work collaboratively with cross‑functional engineering, cloud, and architecture stakeholders.
In this role, you will:
- Design, build, and operate scalable cloud‑native data platforms supporting batch and streaming workloads with strong governance, performance, and reliability.
- Develop and operate data systems on AWS, Azure, and GCP, designing cloud‑native, scalable, and cost‑efficient data solutions.
- Build modern data architectures including data lakes, data lakehouses, and data hubs, with strong understanding of ingestion patterns, data governance, data modeling, observability, and platform best practices.
- Develop data ingestion and collection pipelines using Kafka and AWS Glue; work with modern storage formats such as Apache Iceberg and Parquet.
- Design and develop real‑time streaming pipelines using Kafka, Flink, or similar streaming frameworks, with understanding of event‑driven architectures and low‑latency data processing.
- Perform data transformation and modeling using SQL‑based frameworks and orchestration tools such as dbt, AWS Glue, and Airflow, including Slowly Changing Dimensions (SCD) and schema evolution.
- Use Apache Spark extensively for large‑scale data transformations across batch and streaming workloads.
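The Slowly Changing Dimension (SCD) Type 2 handling mentioned above is usually expressed as a MERGE in Spark SQL or a dbt snapshot; as a hedged, language-neutral sketch of the logic only, here is a plain-Python version. All field names are illustrative.

```python
from datetime import date

def scd2_apply(dim_rows, incoming, today=None):
    """SCD Type 2 sketch: expire changed current rows, insert new versions.

    dim_rows: list of dicts with key, attrs, valid_from, valid_to, is_current.
    incoming: mapping of key -> latest attrs from the source system.
    """
    today = today or date.today().isoformat()
    out, seen = [], set()
    for row in dim_rows:
        key = row["key"]
        if row["is_current"] and key in incoming and incoming[key] != row["attrs"]:
            # Attributes changed: close out the old version, open a new one.
            out.append({**row, "valid_to": today, "is_current": False})
            out.append({"key": key, "attrs": incoming[key],
                        "valid_from": today, "valid_to": None, "is_current": True})
        else:
            out.append(row)
        seen.add(key)
    for key, attrs in incoming.items():
        if key not in seen:  # brand-new member of the dimension
            out.append({"key": key, "attrs": attrs,
                        "valid_from": today, "valid_to": None, "is_current": True})
    return out
```

Keeping the expired rows (rather than overwriting them) is what preserves history for point-in-time reporting.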
Work model
We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role’s business requirements, this is a hybrid position requiring 4 days a week in a client or Cognizant office in Atlanta, GA. Regardless of your working arrangement, we are here to support a healthy work-life balance though our various wellbeing programs.
The working arrangements for this role are accurate as of the date of posting. This may change based on the project you’re engaged in, as well as business and client requirements. Rest assured; we will always be clear about role expectations.
What you need to have to be considered
- Hands‑on experience developing and operating data systems on AWS, Azure, and GCP.
- Proven ability to design cloud‑native, scalable, and cost‑efficient data solutions.
- Experience building data lakes, data lakehouses, and data hubs with strong understanding of ingestion patterns, governance, modeling, observability, and platform best practices.
- Expertise in data ingestion and collection using Kafka and AWS Glue, with experience in Apache Iceberg and Parquet.
- Strong experience designing and developing real‑time streaming pipelines using Kafka, Flink, or similar streaming frameworks.
- Deep expertise in data transformation and modeling using SQL‑based frameworks and orchestration tools including dbt, AWS Glue, and Airflow, with knowledge of SCD and schema evolution.
- Extensive experience using Apache Spark for large‑scale batch and streaming data transformations.
These will help you stand out
- Experience with event‑driven architectures and low‑latency data processing.
- Strong understanding of schema evolution, SCD modeling, and modern data modeling concepts.
- Experience with Apache Iceberg, Parquet, and modern ingestion/storage patterns.
- Strong knowledge of observability, governance, and platform best practices.
- Ability to partner effectively with cloud, architecture, and engineering teams.
Salary and Other Compensation:
Applications will be accepted until March 17, 2025.
The annual salary for this position is between $81,000 and $135,000, depending on experience and other qualifications of the successful candidate.
This position is also eligible for Cognizant’s discretionary annual incentive program, based on performance and subject to the terms of Cognizant’s applicable plans.
Benefits: Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
- Medical/Dental/Vision/Life Insurance
- Paid holidays plus Paid Time Off
- 401(k) plan and contributions
- Long‑term/Short‑term Disability
- Paid Parental Leave
- Employee Stock Purchase Plan
Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.
About Pinterest:
Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.
Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences and embrace the flexibility to do your best work. Creating a career you love? It's Possible.
At Pinterest, AI isn't just a feature, it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.
Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.
Team & Mission
The Privacy & Conversion Data team is responsible for how the company safely and compliantly uses conversion data to power monetization. We build and operate the core privacy infrastructure behind ads reporting and optimization, including controlled data environments, fine-grained access controls, centralized privacy rules enforcement, and de-identification pipelines for conversion data. Our mission is to make conversion data privacy-preserving by default: centralized, de-identified, auditable, and easy for teams to use, while maintaining high utility for advertisers and staying ahead of an evolving global regulatory landscape.
Role Summary
We're seeking a Staff Engineer to lead the architecture and technical direction for the conversion data privacy platform, spanning both core Conversion Data systems and de-identification for ads reporting. You'll own the end-to-end design and evolution of privacy-critical pipelines and services, partner closely with Product, Data Science, Legal, and infrastructure teams, and set the technical bar for how we use conversion data safely at scale.
What you'll do:
- Lead the technical strategy and architecture for conversion data privacy across access controls, de-identification, deletion, and privacy rules enforcement, driving toward a centralized, de-identified-by-default, automated privacy platform for monetization.
- Design and evolve core privacy infrastructure including controlled environments for sensitive data, fine-grained authorization and policy enforcement, and a central policy repository that consistently governs access across major data platforms and query engines.
- Own de-identification pipelines for ads reporting end-to-end: from separating sensitive and non-sensitive data, applying de-identification techniques and transformations, and generating privacy-preserving datasets, to validating data utility and feeding reporting and analytics surfaces.
- Build and improve privacy frameworks and tooling (for both online and offline workflows) that make safe, compliant conversion data usage simple and self-service for downstream teams, reducing onboarding friction for new datasets, restrictions, and use cases.
- Drive operational excellence and compliance by defining SLAs, building robust monitoring and alerting (e.g., de-identification quality, opt-out metrics, data leakages), leading incident response, and developing performant deletion and leakage-handling workflows that meet regulatory and audit requirements.
- Partner cross-functionally with ads, data, product, legal, and infrastructure stakeholders to translate legal/privacy requirements into technical designs, make clear trade-offs between privacy and utility, and drive alignment on roadmaps, launches, and policy changes that impact advertisers and users.
- Mentor and uplevel engineers across multiple teams, lead critical design and code reviews in privacy-sensitive areas, and establish best practices and documentation for privacy-by-design, de-identification, and large-scale data systems.
What we're looking for:
- BS+ in Computer Science (or related field) or equivalent practical experience.
- 8+ years of professional software engineering experience, with a focus on large-scale data systems or distributed systems.
- Strong proficiency building and operating data pipelines and services using Java/Scala/Kotlin or Python, plus SQL; experience with modern big data ecosystems is a plus.
- Experience designing secure, reliable systems and APIs, with solid grounding in data modeling, access control, and performance optimization.
- Meaningful experience in at least one of: privacy-preserving data systems (e.g., de-identification, k-anonymity), ads measurement/attribution, or large-scale analytics/experimentation platforms.
- Proven ability to drive cross-team technical initiatives from design through rollout, working closely with product, data science, and non-engineering partners (e.g., Legal, Compliance).
- Strong communication and leadership skills, with a track record of mentoring engineers, raising engineering standards, and making sound decisions in ambiguous, highimpact problem spaces.
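The k-anonymity requirement mentioned above can be illustrated with a minimal sketch: a dataset satisfies k-anonymity when every combination of quasi-identifier values appears in at least k records. The field names, records, and threshold below are illustrative assumptions, not the team's actual tooling.

```python
# Minimal k-anonymity check over a list of dict records (illustrative only).
from collections import Counter

def satisfies_k_anonymity(records, quasi_identifiers, k):
    """Return True if every quasi-identifier combination occurs >= k times."""
    counts = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return all(c >= k for c in counts.values())

# Hypothetical conversion records with coarse quasi-identifiers.
records = [
    {"age_band": "30-39", "region": "US-West", "clicks": 4},
    {"age_band": "30-39", "region": "US-West", "clicks": 7},
    {"age_band": "30-39", "region": "US-West", "clicks": 1},
    {"age_band": "40-49", "region": "US-East", "clicks": 2},
]

print(satisfies_k_anonymity(records, ["age_band", "region"], 3))  # False: the US-East group has only 1 record
```

A real pipeline would generalize or suppress the offending rows until the check passes, trading utility for privacy.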
In-Office Requirement Statement:
- We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.
Relocation Statement:
- This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.
#LI-REMOTE
#LI-KK6
At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, or particular skills and expertise.
Information regarding the culture at Pinterest and benefits available for this position can be found here.
US-based applicants only. $177,185–$364,795 USD
Our Commitment to Inclusion:
Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.
Company/Role Overview:
CliftonLarsonAllen (CLA) Search has been retained by Midwestern Higher Education Compact to identify a Data Manager to serve their team. The Midwestern Higher Education Compact (MHEC) brings together leaders from 12 Midwestern states to strengthen postsecondary education, advance student success, and promote regional economic vitality.
MHEC programs and initiatives save member states and students millions of dollars annually through time- and cost-savings opportunities. MHEC research supports workforce readiness and improves the quality, accessibility, and affordability of postsecondary education. MHEC convenings bring together leaders and subject experts to share knowledge, generate ideas, and develop collaborative solutions.
To learn more, click here:
What You’ll Do:
- Administer and maintain Microsoft Fabric, OneLake, and Azure environments.
- Design and deliver sophisticated data solutions that are innovative and sustainable.
- Ensure data infrastructure is secure, reliable, and scalable.
- Manage and improve how data is brought into the organization from multiple sources.
- Maintain accurate, well-structured, consistent, and complete data that ensures high quality and usability for internal staff.
- Develop and oversee standards on how data is collected, stored, and protected across departments.
- Manage MHEC’s customer relationship management (CRM) system, ensuring data integrity, integration with other platforms, and alignment with organizational needs.
- Partner with teams across the organization to monitor processes and make recommendations.
- Partner with research staff to understand data access patterns and develop storage strategies that accelerate research and analytics.
- Develop and maintain Power BI dashboards and reports to deliver clear insights to senior leaders and decision-makers.
- Ensure staff have access to timely, clear, and meaningful data visualizations.
- Train staff to use reports and dashboards effectively.
- Support departments in using data to guide decision-making.
- Document data pipelines, integrations, and system processes.
- Recommend tools and practices that help MHEC grow its data capacity.
- Monitor developments in Microsoft’s data platforms and assess future needs.
What You’ll Need:
- Bachelor's degree or equivalent experience preferred.
- 5+ years’ experience, preferably with Microsoft data platforms including Power BI, Azure, and/or Fabric.
- Experience designing and maintaining data systems and dashboards.
- Experience in higher education or nonprofit sectors preferred.
- Strong technical understanding of Microsoft Fabric, OneLake, and Azure.
- Demonstrated proficiency in Python, R, SAS, SQL, or other statistical/data management software.
- Experience with data visualization platforms (Tableau, Power BI, or similar).
- Experience with Microsoft Dynamics and Power Automate is a plus but not required.
- Ability to plan, optimize, build, and maintain data pipelines and dashboards.
Join the team leading the next evolution of virtual care.
At Teladoc Health, you are empowered to bring your true self to work while helping millions of people live their healthiest lives.
Here you will be part of a high-performance culture where colleagues embrace challenges, drive transformative solutions, and create opportunities for growth. Together, we're transforming how better health happens.
Summary of Position
As a Staff Software Engineer, you are a senior individual contributor who leads the design and delivery of significant platform features and raises the bar for engineering quality across the team. You'll work hands-on in code: designing APIs and data flows, building services in Python/FastAPI and React frontends, and guiding solutions from idea to production. You'll mentor engineers, influence architecture and standards within and adjacent to your team, and partner closely with product and design to achieve clear, measurable outcomes. This role blends deep implementation work with pragmatic technical leadership by example.
Essential Duties and Responsibilities
Lead technical design for platform features and services, breaking ambiguous requirements into clear, incremental designs and stories for your team and adjacent partners.
Implement backend services in Python/FastAPI and React frontends end-to-end, owning a continuous stream of stories from idea to production.
Define and use clear API contracts and data flows between services and UIs, creating patterns and templates others can follow.
Champion high-quality engineering practices, including code reviews, documentation, and maintainable, testable designs.
Develop and improve automated testing (unit, integration, end-to-end) and integrate these into everyday development and CI.
Improve CI/CD pipelines and release workflows for your team so the team can ship small, safe changes frequently and confidently.
Own the operational lifecycle of the features and services you build, including monitoring, observability, on-call participation, and incident follow-up.
Design and implement secure-by-default solutions, including robust authentication/authorization, input validation, and safe handling of sensitive data.
Identify and address reliability and performance risks early, proposing concrete technical improvements and sequencing them into the roadmap.
Mentor and unblock engineers through pairing, design discussions, and clear feedback; influence without formal authority.
Partner with product/design to shape requirements into incremental deliverables; escalate tradeoff decisions; propose sequencing that optimizes value/risk.
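The "clear API contracts" responsibility above can be sketched without any web framework: a typed request/response pair and a pure handler function. In a FastAPI service these would typically be Pydantic models attached to a route; the names below are hypothetical, not Teladoc's actual API.

```python
# Dependency-free sketch of a typed API contract (illustrative names only).
from dataclasses import dataclass

@dataclass(frozen=True)
class CreateVisitRequest:
    patient_id: str
    reason: str

@dataclass(frozen=True)
class CreateVisitResponse:
    visit_id: str
    status: str

def create_visit(req: CreateVisitRequest) -> CreateVisitResponse:
    """Validate the input and return a typed response the frontend can rely on."""
    if not req.patient_id:
        raise ValueError("patient_id is required")
    return CreateVisitResponse(visit_id=f"visit-{req.patient_id}", status="scheduled")

resp = create_visit(CreateVisitRequest(patient_id="p123", reason="follow-up"))
print(resp.status)  # scheduled
```

Keeping the handler pure makes it easy to unit-test the contract independently of HTTP plumbing, which supports the testing and CI practices listed above.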
The time spent on each responsibility reflects an estimate and is subject to change dependent on business needs.
Supervisory Responsibilities
No
Required Qualifications
Bachelor's degree in Computer Science, Engineering, or related field; equivalent work experience is acceptable.
7+ years of experience in software engineering.
Strong proficiency with Python and modern web backends (FastAPI, Flask, Django, or similar) and solid understanding of HTTP, API design, and data modeling.
Significant experience with React (or a comparable SPA framework) and building production frontends that talk to backend APIs.
Demonstrated ability to own features end-to-end in a small team: from shaping requirements through design, implementation, testing, deployment, and support.
Experience designing and working with distributed systems or multi-service architectures (e.g., service boundaries, async jobs, integration patterns).
Solid understanding of observability and operations for production systems (metrics, logs, traces, dashboards, alerting, incident response).
Strong understanding of security fundamentals (authentication, authorization, secure data handling) and how they apply to web services and UIs.
Deep familiarity with automated testing and CI/CD, and a track record of improving engineering workflows and quality.
Excellent communication and collaboration skills; comfortable working closely with product, design, and other stakeholders.
Proven ability to provide technical leadership in a hands-on way: unblocking others, making clear decisions, and raising the bar through code and reviews.
Bonus Qualifications
Experience in early-stage or small platform teams where engineers wear multiple hats and balance shipping with building foundations.
Experience with Azure and containerized deployments (or similar cloud-native environments).
Experience building platforms (developer platforms, data platforms, or similar) that serve multiple product teams.
Exposure to AI/ML or data-intensive applications (e.g., integrating with model inference APIs, data pipelines, or analytical data stores).
The base salary range for this position is $180,000–$200,000. In addition to a base salary, this position is eligible for a performance bonus and benefits (subject to eligibility requirements) listed here: Teladoc Health Benefits 2026. Total compensation is based on several factors including, but not limited to, type of position, location, education level, work experience, and certifications. This information is applicable for all full-time positions.
#LI-SS2 #LI-Remote
We follow a Flexible Vacation Policy, intended for rest, relaxation, and personal time. All time off must be approved by your manager prior to use. You will also receive 80 hours of Paid Sick, Safe, and Caregiver Leave annually. This applies to full-time positions only. If you are applying for a part-time role, your recruiter can provide additional details.
As part of our hiring process, we verify identity and credentials, conduct interviews (live or video), and screen for fraud or misrepresentation. Applicants who falsify information will be disqualified.
Teladoc Health will not sponsor or transfer employment work visas for this position. Applicants must be currently authorized to work in the United States without the need for visa sponsorship now or in the future.
Why join Teladoc Health?
Teladoc Health is transforming how better health happens. Learn how when you join us in pursuit of our impactful mission.
Chart your career path with meaningful opportunities that empower you to grow, lead, and make a difference.
Join a multi-faceted community that celebrates each colleague's unique perspective and is focused on continually improving, each and every day.
Contribute to an innovative culture where fresh ideas are valued as we increase access to care in new ways.
Enjoy an inclusive benefits program centered around you and your family, with tailored programs that address your unique needs.
Explore candidate resources with tips and tricks from Teladoc Health recruiters and learn more about our company culture by exploring #TeamTeladocHealth on LinkedIn.
As an Equal Opportunity Employer, we never have and never will discriminate against any job candidate or employee due to age, race, religion, color, ethnicity, national origin, gender, gender identity/expression, sexual orientation, membership in an employee organization, medical condition, family history, genetic information, veteran status, marital status, parental status, or pregnancy. In our innovative and inclusive workplace, we prohibit discrimination and harassment of any kind.
Teladoc Health respects your privacy and is committed to maintaining the confidentiality and security of your personal information. In furtherance of your employment relationship with Teladoc Health, we collect personal information responsibly and in accordance with applicable data privacy laws, including but not limited to, the California Consumer Privacy Act (CCPA). Personal information is defined as: Any information or set of information relating to you, including (a) all information that identifies you or could reasonably be used to identify you, and (b) all information that any applicable law treats as personal information. Teladoc Health's Notice of Privacy Practices for U.S. Employees' Personal information is available at this link.
Company Description
PG Forsta is the leading experience measurement, data analytics, and insights provider for complex industries, a status we earned over decades of deep partnership with clients to help them understand and meet the needs of their key stakeholders. Our earliest roots are in U.S. healthcare, perhaps the most complex of all industries. Today we serve clients around the globe in every industry to help them improve the Human Experiences at the heart of their business. We serve our clients through an unparalleled offering that combines technology, data, and expertise to enable them to pinpoint and prioritize opportunities, accelerate improvement efforts and build lifetime loyalty among their customers and employees.
Like all great companies, our success is a function of our people and our culture. Our employees have world-class talent, a collaborative work ethic, and a passion for the work that have earned us trusted advisor status among the world's most recognized brands. As a member of the team, you will help us create value for our clients, you will make us better through your contribution to the work and your voice in the process. Ours is a path of learning and continuous improvement; team efforts chart the course for corporate success.
Our Mission:
We empower organizations to deliver the best experiences. With industry expertise and technology, we turn data into insights that drive innovation and action.
Our Values:
To put Human Experience at the heart of organizations so every person can be seen and understood.
- Energize the customer relationship: Our clients are our partners. We make their goals our own, working side by side to turn challenges into solutions.
- Success starts with me: Personal ownership fuels collective success. We each play our part and empower our teammates to do the same.
- Commit to learning: Every win is a springboard. Every hurdle is a lesson. We use each experience as an opportunity to grow.
- Dare to innovate: We challenge the status quo with creativity and innovation as our true north.
- Better together: We check our egos at the door. We work together, so we win together.
Duties & Responsibilities
Design and implement processes, systems and automation to streamline the development and deployment of AI solutions.
Architect robust, reliable solutions for specific AI applications using appropriate cloud-based and open source technologies.
Design and automate data pipelines to deliver complex data products to power training and online inference of AI systems.
Deploy ML models, LLMs and GenAI systems into production, ensuring reliability, efficiency, and scalability across cloud or hybrid environments.
Build and maintain robust CI/CD pipelines tailored to ML model lifecycle management, ensuring a streamlined and agile deployment process.
Monitor model performance, identify potential improvements, and integrate feedback loops for continuous learning and adaptation.
Integrate models with chat interfaces and conversational platforms to create responsive, user-centric applications.
Investigate and implement agent-based architectures that support conversational intelligence and interaction modeling.
Collaborate with cross-functional teams to design AI-driven features that enhance user experience and interaction within chat interfaces.
Work closely with data scientists, product managers, and engineers to ensure alignment on project goals, data requirements, and system constraints.
Mentor junior engineers and provide guidance on best practices in ML model development, deployment, and maintenance.
Create and maintain comprehensive documentation for model architectures, code implementations, data workflows, and deployment procedures to ensure reproducibility, transparency, and ease of collaboration.
Technical Skills
Experience with large-scale deployment tools and environments, including Docker, Kubernetes, and cloud platforms like AWS, Azure, or GCP.
Experience deploying and managing a variety of database technologies.
Experience deploying ML models at scale and optimizing models for low-latency, high-availability environments.
Strong programming skills in Python and proficiency in libraries such as NumPy, Pandas, and Scikit-learn.
Experience with data pipelines and ETL processes, and with distributed data frameworks like Apache Spark or Dask.
Familiarity with machine learning frameworks such as TensorFlow, PyTorch, and Hugging Face Transformers.
Knowledge of conversational AI, agent-based systems, and chat interface development.
Proven track record in deploying and maintaining ML and AI solutions in a production setting.
Experience with version control (e.g., Git) and CI/CD tools tailored to ML workflows.
Experience with MLOps.
Experience with Databricks is a plus.
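The model-monitoring and feedback-loop duties listed above can be illustrated with a toy drift check: compare recent prediction scores against a training-time baseline and flag deviations. The scores, window, and tolerance are illustrative assumptions, not production values.

```python
# Toy prediction-drift check using only the standard library (illustrative).
from statistics import mean

def drift_alert(baseline_scores, recent_scores, tolerance=0.1):
    """Return True if the mean of recent scores moved more than
    `tolerance` away from the baseline mean."""
    return abs(mean(recent_scores) - mean(baseline_scores)) > tolerance

baseline = [0.52, 0.48, 0.50, 0.51]          # scores observed at training time
recent_ok = [0.50, 0.49, 0.53]               # stable production scores
recent_drifted = [0.80, 0.75, 0.78]          # shifted distribution

print(drift_alert(baseline, recent_ok))       # False
print(drift_alert(baseline, recent_drifted))  # True
```

Production systems typically use richer distributional tests and wire alerts into the CI/CD and observability stack, but the feedback-loop shape is the same.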
Qualifications
Minimum Qualifications
5+ years of experience in platform engineering with a focus on data and ML systems.
Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.
Don't meet every single requirement? Studies have shown that women and people of color are less likely to apply to jobs unless they meet every single qualification. At Press Ganey we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for this or other roles.
Additional Information for US based jobs:
Press Ganey Associates LLC is an Equal Employment Opportunity/Affirmative Action employer committed to a diverse workforce. We do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, veteran status, disability, or any other federal, state, or local protected class.
Pay Transparency Non-Discrimination Notice - Press Ganey will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.
The expected base salary for this position ranges from $100,000 to $140,000. It is not typical for offers to be made at or near the top of the range. Salary offers are based on a wide range of factors including relevant skills, training, experience, education, and, where applicable, licensure or certifications obtained. Market and organizational factors are also considered. In addition to base salary and a competitive benefits package, successful candidates are eligible to receive a discretionary bonus or commission tied to achieved results.
All your information will be kept confidential according to EEO guidelines.
Our privacy policy can be found here: legal-privacy/
As a Data Steward Senior Analyst, you are part of a team responsible for enabling and supporting compliance with data-related enterprise policies within their domains/business units. You and your team are responsible for identifying critical data and associated risks, maintaining data definitions, classifying data, supporting data sourcing / usage requests, measuring Data Risk Controls, and confirming Data Issues are remediated. You have the opportunity to partner across various business units, technology teams, and product/platform teams to define and implement the data governance strategy, supervising and leading data quality, resolving data/platform issues, and driving consistency, usability, and governance of specific product data across the enterprise.
In addition, this role will play a key part in effectively communicating new and updated data-related policies to the teams responsible for compliance. The individual must be skilled in preparing clear, engaging presentations that translate formal policy language into practical, easy-to-understand guidance and “tell the story” behind the policy requirements. The role will also support the delivery of training sessions, facilitate policy office hours, and serve as a go-to resource for questions related to data governance and retention compliance.
Your Primary Responsibilities may include:
• Assist in identifying data-related risks and associated controls for key business processes. Risks relate to Record Retention (primary), Data Quality, Data Movement, Data Stewardship, Data Protection, Data Sharing, among others.
• Develop training materials and educate organization on Record Retention and Deletion processes and procedures.
• Develop deep understanding of key enterprise data-related policies and serve as the policy expert for the business unit, providing education to teams regarding policy implications for business.
• Collaborate with and influence product managers to ensure all new use cases are managed according to policies.
• Influence and contribute to strategic improvements to data assessment processes and analytical tools.
• Support current regulatory reporting needs via existing platforms, working with upstream data providers, downstream business partners, as well as technology teams.
• Subject matter expertise on multiple platforms.
• Responsible to partner with the Data Steward Manager in developing and managing the data compliance roadmap.
Qualifications include:
• 5+ years of experience in a similar role involved with ensuring compliance with Record Retention and Deletion policies.
• Strong communication skills and ability to influence and engage at multiple levels and cross functionally.
• Intermediate understanding of, and prior experience with, Data Management and Data Governance concepts (metadata, lineage, data quality, etc.).
• 5+ years of Data Quality Management experience.
• Strong familiarity with data architecture and/or data modeling concepts.
• 5+ years of experience with Agile or SAFe project methodologies
• Bachelor’s degree in Finance, Engineering, Mathematics, Statistics, Computer Science or other similar fields.
• Preferred: Experience in Travel Industry.
• Preferred: Knowledge of RCSA (Risk Control Self-Assessment) methodology
Leadership Skills may include:
• Makes Decisions Quickly and Effectively: Drives effective outcome through decision making authority. Displays judgement and discretion in order to ensure deliverables are sufficient to the American Express policy and overall compliance.
• Drives Innovation & Change: Provides systematic and rational analysis to identify the root cause of problems. Is prepared to challenge the status quo and drive innovation. Makes informed judgments, recommends tailored solutions.
• Leverages Team - Collaboration: Coordinates efforts within and across teams to deliver goals, accountable to bring in ideas, information, suggestions, and expertise from others outside & inside the immediate team.
• Communication: Influences and holds others accountable and has ability to convince others. Identifies the specific data governance requirements and is able to communicate clearly and in a compelling way.
8116 - Midtown Office - 2220 W. Broad Street, Richmond, Virginia, 23220
Job Description
What you will do – Essential Responsibilities
- Given long term strategic goals, can lay out a path across many versions.
- Participates in and supports initiatives outside of main area of responsibility.
- High degree of influence of data product direction and has ownership over large components.
- Thinks both strategically and tactically, keeping in mind both technical goals and company goals.
- Provides technical leadership for projects including 3–4 senior level individuals.
- The data engineer will be considered a blend of data and analytics “guru,” promoting the available data and analytics capabilities and expertise to business unit leaders and educating them in leveraging these capabilities to achieve their business goals.
- Work with data governance team members and information stewards and participate in vetting and promoting content created in the business and by data scientists to the curated data catalog for governed reuse.
- May be required to present at conferences to demonstrate company’s technical prowess.
Purpose of the role
Senior Principal Engineers partner with Engineers and Solution Architects to develop solutions and implement standards that ensure an unrivaled data experience. You are an expert in your craft and seen as a platform and implementation owner. You are an active contributor in the industry and have a passion for continuous learning.
Senior Principal Engineers practice hands-on development, have oversight of the technical tasks of others, and are the owners of the standards and best practices. Our Senior Principal Engineers act as technical mentors to others and are experts in supporting multiple areas of the business.
Qualifications and Requirements
Basic Qualifications
- Bachelor’s Degree in Computer Science, Decision Science, Engineering, Statistics, or a related field, or equivalent alternative education, skills, and/or practical experience is required and 8+ years of work experience required in data management disciplines including [data integration, modeling, optimization and data quality], and/or other areas directly relevant to data engineering responsibilities and tasks; multiple certifications preferred or
- Master’s Degree in Computer Science, Decision Science, Engineering, Statistics, or a related field, or equivalent alternative education, skills, and/or practical experience is required and 6+ years of work experience required in data management disciplines including [data integration, modeling, optimization and data quality], and/or other areas directly relevant to data engineering responsibilities and tasks; multiple certifications preferred.
Preferred Qualifications
- Expert experience working with large, heterogeneous datasets in building and optimizing data pipelines, pipeline architectures and integrated datasets using traditional data integration technologies. These should include [ETL/ELT, data replication/CDC, message-oriented data movement]
- Strong/expert experience with multiple advanced analytics tools languages such as [R, Python, Java, C++, Scala, others].
- Strong/expert experience with popular database programming languages including [SQL, PL/SQL, others] on both relational and non-relational databases.
- Strong experience with cloud data platforms such as Databricks, Snowflake
- Expert experience with data discovery, analytics, and data quality controls
- Expert experience in data modeling and ontologies
- Strong experience with microservices to serve data
- Strong experience in cloud platforms such as Azure, AWS, GCP
Work Location and Arrangement: This role will be based out of the CarMax Midtown office, Richmond VA or CarMax Technology Hub, Plano TX and have a Hybrid work arrangement.
- Associates based in Richmond work onsite 5 days per week.
- Associates based in Plano work onsite 2 days per week.
Work Authorization: Applicants must be currently authorized to work in the United States on a full-time basis. Sponsorship will not be considered for this specific role.
About CarMax
CarMax disrupted the auto industry by delivering the honest, transparent and high-integrity experience customers want and deserve. This innovative thinking around the way cars are bought and sold has helped us become the nation’s largest retailer of used cars, with over 250 locations nationwide.
Our amazing team of more than 25,000 associates work together to deliver iconic customer experiences. Along the way, we help every associate grow their career and achieve their best, at work and in their community. We are recognized for our commitment to training and diversity and are one of the FORTUNE 100 Best Companies to Work For®.
Our Commitment to Diversity and Inclusion:
CarMax is committed to bringing together people from different backgrounds and perspectives, providing employees with a safe, welcoming, and inclusive work environment.
CarMax is an equal opportunity employer, and all qualified candidates will receive consideration for employment without regard to age, race, color, religion, sex, sexual orientation, gender identity, genetic information, national origin, protected veteran status, disability status, or any other characteristic protected by law.
Company Description
Press Ganey is the leading experience measurement, data analytics, and insights provider for complex industries, a status we earned over decades of deep partnership with clients to help them understand and meet the needs of their key stakeholders. Our earliest roots are in U.S. healthcare, perhaps the most complex of all industries. Today we serve clients around the globe in every industry to help them improve the Human Experiences at the heart of their business. We serve our clients through an unparalleled offering that combines technology, data, and expertise to enable them to pinpoint and prioritize opportunities, accelerate improvement efforts and build lifetime loyalty among their customers and employees.
Like all great companies, our success is a function of our people and our culture. Our employees have world-class talent, a collaborative work ethic, and a passion for the work that has earned us trusted advisor status among the world's most recognized brands. As a member of the team, you will help us create value for our clients, and you will make us better through your contribution to the work and your voice in the process. Ours is a path of learning and continuous improvement; team efforts chart the course for corporate success.
Our Mission:
We empower organizations to deliver the best experiences. With industry expertise and technology, we turn data into insights that drive innovation and action.
Our Values:
To put Human Experience at the heart of organizations so every person can be seen and understood.
Energize the customer relationship: Our clients are our partners. We make their goals our own, working side by side to turn challenges into solutions.
Success starts with me: Personal ownership fuels collective success. We each play our part and empower our teammates to do the same.
Commit to learning: Every win is a springboard. Every hurdle is a lesson. We use each experience as an opportunity to grow.
Dare to innovate: We challenge the status quo with creativity and innovation as our true north.
Better together: We check our egos at the door. We work together, so we win together.
We are seeking an experienced Staff Data Engineer to join our Unified Data Platform team. The ideal candidate will design, develop, and maintain enterprise-scale data infrastructure leveraging Azure and Databricks technologies. This role involves building robust data pipelines, optimizing data workflows, and ensuring data quality and governance across the platform. You will collaborate closely with analytics, data science, and business teams to enable data-driven decision-making.
Duties & Responsibilities:
- Design, build, and optimize data pipelines and workflows in Azure and Databricks, including Data Lake and SQL Database integrations.
- Implement scalable ETL/ELT frameworks using Azure Data Factory, Databricks, and Spark.
- Optimize data structures and queries for performance, reliability, and cost efficiency.
- Drive data quality and governance initiatives, including metadata management and validation frameworks.
- Collaborate with cross-functional teams to define and implement data models aligned with business and analytical requirements.
- Maintain clear documentation and enforce engineering best practices for reproducibility and maintainability.
- Ensure adherence to security, compliance, and data privacy standards.
- Mentor junior engineers and contribute to establishing engineering best practices.
- Support CI/CD pipeline development for data workflows using GitLab or Azure DevOps.
- Partner with data consumers to publish curated datasets into reporting tools such as Power BI.
- Stay current with advancements in Azure, Databricks, Delta Lake, and data architecture trends.
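The first two responsibilities above pair pipeline construction with data quality validation. A minimal sketch, in plain Python rather than the Azure Data Factory / Databricks stack this role actually uses, of an ETL step with an embedded quality gate (all names such as `validate_rows` and `REQUIRED_FIELDS` are illustrative, not a real API):

```python
# Illustrative ETL step: validate raw rows against simple quality rules,
# then transform only the rows that pass. Rejected rows are kept for
# quarantine/observability rather than silently dropped.

REQUIRED_FIELDS = {"id", "amount"}

def validate_rows(rows):
    """Split raw rows into (valid, rejected) based on simple quality rules."""
    valid, rejected = [], []
    for row in rows:
        if REQUIRED_FIELDS <= row.keys() and isinstance(row["amount"], (int, float)):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

def transform(rows):
    """Example transformation: derive an integer cents column."""
    return [{**r, "amount_cents": int(round(r["amount"] * 100))} for r in rows]

raw = [
    {"id": 1, "amount": 10.5},
    {"id": 2},                    # missing 'amount' -> rejected
    {"id": 3, "amount": "oops"},  # wrong type -> rejected
]
valid, rejected = validate_rows(raw)
curated = transform(valid)
print(curated)        # one curated row with amount_cents
print(len(rejected))  # 2
```

In a Databricks setting the same validate-then-transform shape would typically run over Spark DataFrames with the rejects written to a quarantine Delta table.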
Technical Skills:
- Advanced proficiency in Azure (5+ years), including Data Lake, ADF, and SQL.
- Strong expertise in Databricks (5+ years), Apache Spark (5+ years), and Delta Lake (5+ years).
- Proficient in SQL (10+ years) and Python (5+ years); familiarity with Scala is a plus.
- Strong understanding of data modeling, data governance, and metadata management.
- Knowledge of source control (Git), CI/CD, and modern DevOps practices.
- Familiarity with the Power BI visualization tool.
Minimum Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or related field.
- 7+ years of experience in data engineering, with significant hands-on work in cloud-based data platforms (Azure).
- Experience building real-time data pipelines and streaming frameworks.
- Strong analytical and problem-solving skills.
- Proven ability to lead projects and mentor engineers.
- Excellent communication and collaboration skills.
Preferred Qualifications:
- Master's degree in Computer Science, Engineering, or a related field.
- Exposure to machine learning integration within data engineering pipelines.
Don't meet every single requirement? Studies have shown that women and people of color are less likely to apply to jobs unless they meet every single qualification. At Press Ganey we are dedicated to building a diverse, inclusive, and authentic workplace, so if you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for this or other roles.
Additional Information for US based jobs:
Press Ganey Associates LLC is an Equal Employment Opportunity/Affirmative Action employer committed to a diverse workforce. We do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, veteran status, disability, or any other federal, state, or local protected class.
Pay Transparency Non-Discrimination Notice - Press Ganey will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.
The expected base salary for this position ranges from $110,000 to $170,000. It is not typical for offers to be made at or near the top of the range. Salary offers are based on a wide range of factors including relevant skills, training, experience, education, and, where applicable, licensure or certifications obtained. Market and organizational factors are also considered. In addition to base salary and a competitive benefits package, successful candidates are eligible to receive a discretionary bonus or commission tied to achieved results.
All your information will be kept confidential according to EEO guidelines.
Our privacy policy can be found here: legal-privacy/
About the Role
Our Decision Intelligence (DI) team is seeking a Senior / Lead Data Architect to drive enterprise data strategy and accelerate AI‑enabled transformation across McKesson. DI plays a critical role in enabling data‑driven change and delivering measurable business value through high‑quality data, advanced analytics, and intelligent automation.
This role will define and evolve the enterprise‑wide data and semantic architecture required to support AI‑driven insights, agentic automation, and next‑generation data products. The ideal candidate is a strategic thought partner, a hands‑on architect, and a leader capable of translating business outcomes into scalable technical solutions.
Responsibilities
Data Architecture Leadership
- Architect canonical data domains across customer, product, pricing, supply chain, contracting, and financial performance.
- Design semantic layers, business ontologies, subject‑area models, and metric definition frameworks to power enterprise AI agents and decisioning systems.
- Define architectural principles for data interoperability, lineage, access control, security, and multi‑cloud integration.
- Align data platform and architecture decisions with the USPD AI Roadmap and enterprise AI strategy.
Establish standards and patterns for:
- RAG pipelines
- Vector search
- Metadata-driven orchestration
- Multi-modal ingestion (text, events, real-time signals)
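The RAG and vector search patterns listed above both rest on nearest-neighbor retrieval over embeddings. A minimal sketch of the retrieval step as a brute-force cosine-similarity scan (the document names and embedding vectors are invented for illustration; a production pattern would use an approximate-nearest-neighbor index or vector database rather than a linear scan):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy embedding store: document id -> embedding vector (hypothetical values).
docs = {
    "pricing_faq":  [0.9, 0.1, 0.0],
    "contract_tos": [0.1, 0.9, 0.2],
    "returns":      [0.0, 0.2, 0.9],
}

def top_k(query_vec, k=2):
    """Return the k document ids most similar to the query embedding."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(top_k([1.0, 0.0, 0.1]))  # 'pricing_faq' ranks first
```

In a full RAG pipeline, the returned documents would be stuffed into the LLM prompt as grounding context; the metadata-driven orchestration pattern above would govern which stores a given agent is allowed to query.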
Provide architectural oversight and strategic guidance across enterprise data products including:
- Finance, Pricing, and Supply Chain Data Products
- FIA
- ContractIQ
- Specialty Leakage Agents
- Design a robust, scalable, and interoperable data environment that supports AI-ready, governed, high-quality enterprise data.
- Influence programs and project teams on best practices related to data quality, architecture, modeling, observability, and governance.
- Leverage data architecture frameworks to translate complex relational entities into business cases, use cases, and AI-enablement requirements.
- Partner with product, engineering, and analytics leaders to accelerate data product creation and improve enterprise decision intelligence maturity.
Advanced Data System Design
- Architect complex distributed data systems that ensure scalability, performance, reliability, and real-time integration across business-critical operations.
- Design and govern enterprise-wide data models, data flows, reference architectures, and integration patterns.
Produce high-quality data design deliverables including:
- Data models
- Entity relationship diagrams (ERDs)
- Data flow diagrams
- System interface schemas
- Comprehensive data dictionaries and metadata documentation
- Ensure optimal functioning of AI/ML pipelines, including data quality controls, observability patterns, and architecture for low-latency analytics.
- Guide engineering teams on reusable patterns for ingestion, transformation, curation, semantic enrichment, and operationalization.
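Among the deliverables above, a data dictionary can often be derived mechanically from the physical schema. A minimal sketch, using an in-memory SQLite catalog as a stand-in (real platforms would read Unity Catalog, information_schema, or similar; the `orders` table is hypothetical):

```python
import sqlite3

# Derive a minimal data dictionary (column, type, nullable) from catalog
# metadata rather than maintaining it by hand.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE orders (
  order_id   INTEGER PRIMARY KEY,  -- surrogate key
  customer   TEXT NOT NULL,
  amount     REAL
)
""")

def data_dictionary(conn, table):
    """Return [(column, declared_type, nullable)] from the SQLite catalog."""
    return [(name, ctype, not notnull)
            for _cid, name, ctype, notnull, _dflt, _pk
            in conn.execute(f"PRAGMA table_info({table})")]

print(data_dictionary(conn, "orders"))
```

Keeping the dictionary generated from the catalog, rather than hand-written, is one way to keep the "comprehensive data dictionaries and metadata documentation" deliverable from drifting out of sync with the physical model.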
Minimum Qualifications
- 7+ years of experience in data engineering, data architecture, or enterprise data platform development.
- Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.
Required Skills
- 7+ years designing enterprise data architecture across large, complex organizations.
- Demonstrated experience with enterprise data modeling, semantic layers, and canonical domains.
- Large-scale integration across heterogeneous systems.
- Hands-on experience with Databricks, Snowflake, MDM platforms, SAP, and Salesforce/Conga.
- Designing intuitive architectural patterns to simplify complex data landscapes.
- Strong understanding of data quality frameworks, governance, lineage, metadata, and regulatory compliance.
Leadership Skills
- Ownership-driven leader with a track record of guiding engineering teams through delivery.
- Acts as a change champion, elevating architecture maturity and influencing cross-functional adoption of best practices.
Strategic Thinking
- Strong analytical capability and the ability to develop long-term data strategies aligned to enterprise objectives and future-state AI readiness.
Problem Solving
- Creative, innovative problem solver capable of architecting solutions for highly complex data and AI challenges.
Job Title: Enterprise AI-Ready Data Architect / Senior Data Engineer
Duration : 6 Months
Location: East Hanover (Hybrid: 3 days onsite, 2 days remote per week)
Job Description:
The Enterprise AI-Ready Data Architect / Senior Data Engineer is a hybrid role with a focus on enterprise data architecture, AI integration, and hands-on data engineering. You will design and implement AI-ready, analytics-ready data products and semantic layers (including ontologies) that enable scalable enterprise analytics and integration with AI agents and GenAI use cases. You will embed governance-by-design (quality, lineage, contracts, observability) and partner closely with business and technology stakeholders in pharmaceutical domains.
Key Responsibilities
1) Enterprise Data Architecture (AI-Ready by Design)
• Define and deliver strategic enterprise data architectures that scale and support AI-ready outcomes.
• Design data workflows capturing as-is and to-be states for enterprise modernization.
• Establish architecture patterns for:
• Semantic Context Layer
• Data Warehouses, Data Lakehouses
• Data Catalogs and Data Marketplaces
• Event-driven and metadata-driven architectures
• Distributed data management (Data Mesh, Data Fabric, Domain-Driven Design)
• Streaming data management
2) Data Products, Semantic Products, and Master Data
• Design data products that are AI-ready and reusable across domains and use cases.
• Build and govern semantic models, metrics-first modeling, and ontologies (knowledge graph concepts).
• Deliver Master Data Management (MDM) capabilities and align master/reference data with business needs.
• Support structured and unstructured data management to enable broader AI and analytics capabilities.
3) AI Integration and GenAI Enablement
• Enable contextual intelligence and data enrichment using:
• Contextual retrieval, entity linking, enrichment using LLMs and embeddings
• Vector search, RAG pipelines, and LLM-based enrichment
• Implement graph-based approaches:
• RDF, OWL, and SPARQL querying
• Property graph / knowledge graph modeling for relationships and reasoning
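The property-graph modeling mentioned above boils down to typed nodes connected by typed relationships. A minimal sketch in plain Python (node ids, labels, and relationship types are invented pharmaceutical-flavored examples; a real implementation would use Neo4j with Cypher, or an RDF store with SPARQL, as the skills list notes):

```python
# Toy property graph: nodes carry a label and properties; edges are
# (source, relationship_type, target) triples.
nodes = {
    "drugA": {"label": "Product", "name": "Drug A"},
    "siteX": {"label": "Site",    "name": "Plant X"},
    "lot42": {"label": "Lot",     "name": "Lot 42"},
}
edges = [
    ("lot42", "MANUFACTURED_AT", "siteX"),
    ("lot42", "INSTANCE_OF",     "drugA"),
]

def neighbors(node_id, rel_type=None):
    """One-hop traversal: targets reachable from node_id, optionally filtered by edge type."""
    return [dst for src, rel, dst in edges
            if src == node_id and (rel_type is None or rel == rel_type)]

print(neighbors("lot42", "MANUFACTURED_AT"))  # the manufacturing site of lot 42
```

The same one-hop query in Cypher would read roughly `MATCH (l:Lot {name:'Lot 42'})-[:MANUFACTURED_AT]->(s) RETURN s`; the value of the graph model is that multi-hop reasoning (lot to site to product) stays a traversal rather than a chain of SQL joins.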
4) Data Engineering Delivery
• Design and implement robust ETL/ELT pipelines and orchestration frameworks.
• Develop high-quality transformations and data modeling using:
• Advanced SQL
• Tools such as dbt, Airflow, Dataiku
• Ensure production-grade engineering practices for performance, reliability, and maintainability across pipelines.
5) Governance and Standards (Embedded)
• Implement open-source data standards across:
• Data contracts
• Data quality
• Data lineage
• Lead metadata-driven governance through metadata management, observability, and policy-aligned design.
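A data contract, as listed above, is at minimum a producer-declared schema checked at write time. A minimal sketch of that check in plain Python (the contract format and field names here are invented for illustration, not a specific open standard such as those the EDM Council publishes):

```python
# Hypothetical embedded data contract: expected fields and their types,
# checked per record before data is accepted into a curated dataset.
contract = {
    "dataset": "curated.customers",
    "fields": {"customer_id": int, "email": str, "ltv": float},
}

def check_contract(record, contract):
    """Return a list of human-readable violations for one record."""
    violations = []
    for field, expected in contract["fields"].items():
        if field not in record:
            violations.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            violations.append(f"bad type for {field}: {type(record[field]).__name__}")
    return violations

good = {"customer_id": 7, "email": "a@b.com", "ltv": 123.4}
bad = {"customer_id": "7", "email": "a@b.com"}     # wrong type, missing field
print(check_contract(good, contract))  # []
print(check_contract(bad, contract))
```

Lineage and observability then attach to the same contract: every rejected record is an event that can be traced back to the producing pipeline rather than discovered downstream.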
Skills and Qualifications
Core Technical Skills
• Advanced SQL proficiency
• Data platforms and governance tooling experience (one or more): Snowflake, Databricks, Collibra, Salesforce
• ELT/ETL and orchestration: dbt, Airflow, Dataiku
• BI and reporting: Power BI
• Cloud platforms: AWS, Azure, GCP
• Modern architecture and data management: Data Mesh, Data Fabric, streaming, metadata-driven architecture
• Graph and semantic technologies: knowledge graphs, property graphs (Neo4j), RDF/OWL, SPARQL, graph query languages
Domain and Modeling Expertise
• Experience with data modeling techniques:
• Conceptual, logical, and physical modeling, preferably for the pharmaceutical industry
• Semantic modeling, ontology design, and reusable metric layers
• MDM concepts and implementation approaches
AI and GenAI Enablement Skills
• Familiarity with GenAI technologies for enhancing analysis/reporting and data enrichment
• Experience with embeddings, vector search, RAG patterns, and entity resolution/linking concepts
Nice to Have
• Experience with Palantir platform
Recommended Certifications
• CDMP (DAMA)
• TOGAF
• EDM Council frameworks:
• DCAM, CDMC, Open Knowledge Graph, Data Ethics and Responsible AI
Qualifications
• 10+ years of experience in data architecture, process automation, implementation, and large-scale data engineering, ideally in the pharmaceutical industry
• Advanced technical engineering and hands-on experience in data modeling for OLAP, workflow automation, and AI/ML integration
• ETL pipeline design and development
• Bachelor’s degree in computer science, information technology, engineering, or data science
• Strong problem-solving skills and attention to detail.
• Excellent communication skills, with the ability to work with senior stakeholders to translate business requirements into technical data requirements